WorldWideScience

Sample records for probable future capabilities

  1. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability

  2. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.

  3. Capabilities of Future Training Support Packages

    National Research Council Canada - National Science Library

    Burnside, Billy

    2004-01-01

    .... This report identifies and analyzes five key capabilities needed in future TSPs: rapid tailoring or modification, reach, simulated operating environment, performance measurement, and pretests/selection criteria...

  4. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2 The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolving test issues. DOEPOD demands utilization of observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both Hit-Miss or signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications is included.

  5. Key Future Engineering Capabilities for Human Capital Retention

    Science.gov (United States)

    Sivich, Lorrie

    Projected record retirements of Baby Boomer generation engineers have been predicted to result in significant losses of mission-critical knowledge in space, national security, and future scientific ventures vital to high-technology corporations. No comprehensive review or analysis of engineering capabilities has been performed to identify threats related to the specific loss of mission-critical knowledge posed by the increasing retirement of tenured engineers. Archival data from a single diversified Fortune 500 aerospace manufacturing engineering company's engineering career database were analyzed to ascertain whether relationships linking future engineering capabilities, engineering disciplines, and years of engineering experience could be identified to define critical knowledge transfer models. Chi square, logistic, and linear regression analyses were used to map patterns of discipline-specific, mission-critical knowledge using archival data of engineers' perceptions of engineering capabilities, key developmental experiences, and knowledge learned from their engineering careers. The results from the study were used to document key engineering future capabilities. The results were then used to develop a proposed human capital retention plan to address specific key knowledge gaps of younger engineers as veteran engineers retire. The potential for social change from this study involves informing leaders of aerospace engineering corporations on how to build better quality mentoring or succession plans to fill the void of lost knowledge from retiring engineers. This plan can secure mission-critical knowledge for younger engineers for current and future product development and increased global competitiveness in the technology market.

  6. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, change in fire probabilities (CFPs) range from − 51 to + 240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is illumination of climate changes where fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  7. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    Science.gov (United States)

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  8. The Future of Deterrent Capability for Medium-Sized Western Powers in the New Environment

    International Nuclear Information System (INIS)

    Quinlan, Michael

    2001-01-01

    What should be the longer-term future for the nuclear-weapons capabilities of France and the United Kingdom? I plan to tackle the subject in concrete terms. My presentation will be divided into three parts, and, though they are distinct rather than separate, they interact extensively. The first and largest part will relate to strategic context and concept: what aims, justifications and limitations should guide the future, or the absence of a future, for our capabilities? The second part, a good deal briefer, will be the practical content and character of the capabilities: what questions for decision will arise, and in what timescale, about the preservation, improvement or adjustment of the present capabilities? And the third part, still more briefly, will concern the political and institutional framework into which their future should or might be fitted. (author)

  9. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    Science.gov (United States)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.

  10. Nuclear Research and Development Capabilities Needed to Support Future Growth

    Energy Technology Data Exchange (ETDEWEB)

    Wham, Robert M. [ORNL, P.O. Box 2008, Oak Ridge, TN 37831-6154 (United States); Kearns, Paul [Battelle Memorial Institute (United States); Marston, Ted [Marston Consulting (United States)

    2009-06-15

    The energy crisis looming before the United States can be resolved only by an approach that integrates a 'portfolio' of options. Nuclear energy, already an important element in the portfolio, should play an even more significant role in the future as the U.S. strives to attain energy independence and reduce carbon emissions. The DOE Office of Nuclear Energy asked Battelle Memorial Institute to obtain input from the commercial power generation industry on industry's vision for nuclear energy over the next 30-50 years. With this input, Battelle was asked to generate a set of research and development capabilities necessary for DOE to support the anticipated growth in nuclear power generation. This presentation, based on the report generated for the Office of Nuclear Energy, identifies the current and future nuclear research and development capabilities required to make this happen. The capabilities support: (1) continued, safe operation of the current fleet of nuclear plants; (2) the availability of a well qualified and trained workforce; (3) demonstration of the next generation nuclear plants; (4) development of a sustainable fuel cycle; (5) advanced technologies for maximizing resource utilization and minimization of waste and (6) advanced modeling and simulation for rapid and reliable development and deployment of new nuclear technologies. In order to assure these capabilities are made available, a Strategic Nuclear Energy Capability Initiative is proposed to provide the required resources during this critical period of time. (authors)

  11. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss or signal amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold Directed DOEPOD uses a nonparametric approach for the analysis or inspection data that does require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.

  12. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    Science.gov (United States)

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  13. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  14. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis.

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-06-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.

  15. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.

  16. Rationalization and future planning for AECL's research reactor capability

    International Nuclear Information System (INIS)

    Slater, J.B.

    1990-01-01

    AECL's research reactor capability has played a crucial role in the development of Canada's nuclear program. All essential concepts for the CANDU reactors were developed and tested in the NRX and NRU reactors, and in parallel, important contributions to basic physics were made. The technical feasibility of advanced fuel cycles and of the organic-cooled option for CANDU reactors were also demonstrated in the two reactors and the WR-1 reactor. In addition, an important and growing radio-isotope production industry was established and marketed on a world-wide basis. In 1984, however, it was recognized that a review and rationalization of the research reactor capability was required. The commercial success of the CANDU reactor system had reduced the scope and size of the required development program. Limited research and development funding and competition from other research facilities and programs, required that the scope be reduced to a support basis essential to maintain strategic capability. Currently, AECL, is part-way through this rationalization program and completion should be attained during 1992/93 when the MAPLE reactor is operational and decisions on NRX decommissioning will be made. A companion paper describes some of the unique operational and maintenance problems which have resulted from this program and the solutions which have been developed. Future planning must recognize the age of the NRU reactor (currently 32 years) and the need to plan for eventual replacement. Strategy is being developed and supporting studies include a full technical assessment of the NRU reactor and the required age-related upgrading program, evaluation of the performance characteristics and costs of potential future replacement reactors, particularly the advanced MAPLE concept, and opportunities for international co-operation in developing mutually supportive research programs

  17. Electricity distribution within the future residence

    Energy Technology Data Exchange (ETDEWEB)

    Breeze, J.E.

    1981-11-01

    This study examined present residential wiring systems and identified their shortcomings. A list of the desirable attributes for future wiring systems is proposed. The outlook for the application to wiring systems of solid-state electronic devices is assessed. As further background for a proposed new wiring concept, the residential use of energy today and probable future trends are reviewed. Lastly, the concept of a distributed bus is proposed and developed on a conceptual basis for the residential wiring system of the future. The distributed bus concept can lead to the development of a residential wiring system to meet the following requirements: adaptable to meet probable future energy requirements for residences including alternative energy sources and energy storage; flexibility for servicing loads both in respect to location in the residence and to the size of the load; improved economy in the use of materials; capability for development as a designed or engineered system with factory assembled components and wiring harness; capability for expansion through the attachment of legs or auxillary rings; adaptable to any probable architectural residential development; capability for development to meet the requirements for ease of use and maintenance and with recognition of the growing importance of do-it-yourself repairs and alterations; and adaptable to the full range of solid-state electronics and micro-computer devices and controls including the concept of load control and management through the use of a central control module. 66 refs., 15 figs., 1 tab.

  18. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  19. Determination of stability of epimetamorphic rock slope using Minimax Probability Machine

    Directory of Open Access Journals (Sweden)

    Manoj Kumar

    2016-01-01

    Full Text Available The article employs Minimax Probability Machine (MPM for the prediction of the stability status of epimetamorphic rock slope. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d, height (H, inclination (β, cohesion (c and internal friction angle (φ have been used as input of the MPM. This study uses the MPM as a classification technique. Two models {Linear Minimax Probability Machine (LMPM and Kernelized Minimax Probability Machine (KMPM} have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slope.

  20. Acceptance Probability (P a) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s lo and s hi values along with the worst case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with C pk > 1.33 as performing well within statistical control and those with C pk  1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.

  1. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    Science.gov (United States)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of the LANCE and outlines the modifications made to achieve the 3-hour latency requirement with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or approximation of navigational data (depending on platform). Level 0 data are processed in to higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules to relax the requirements for ancillary data so reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on

  2. Capability and dependency in the Newcastle 85+ cohort study. Projections of future care needs.

    Science.gov (United States)

    Jagger, Carol; Collerton, Joanna C; Davies, Karen; Kingston, Andrew; Robinson, Louise A; Eccles, Martin P; von Zglinicki, Thomas; Martin-Ruiz, Carmen; James, Oliver F W; Kirkwood, Tom B L; Bond, John

    2011-05-04

    Little is known of the capabilities of the oldest old, the fastest growing age group in the population. We aimed to estimate capability and dependency in a cohort of 85 year olds and to project future demand for care. Structured interviews at age 85 with 841 people born in 1921 and living in Newcastle and North Tyneside, UK who were permanently registered with participating general practices. Measures of capability included were self-reported activities of daily living (ADL), timed up and go test (TUG), standardised mini-mental state examination (SMMSE), and assessment of urinary continence in order to classify interval-need dependency. To project future demand for care the proportion needing 24-hour care was applied to the 2008 England and Wales population projections of those aged 80 years and over by gender. Of participants, 62% (522/841) were women, 77% (651/841) lived in standard housing, 13% (106/841) in sheltered housing and 10% (84/841) in a care home. Overall, 20% (165/841) reported no difficulty with any of the ADLs. Men were more capable in performing ADLs and more independent than women. TUG validated self-reported ADLs. When classified by 'interval of need' 41% (332/810) were independent, 39% (317/810) required help less often than daily, 12% (94/810) required help at regular times of the day and 8% (67/810) required 24-hour care. Of care-home residents, 94% (77/82) required daily help or 24-hour care. Future need for 24-hour care for people aged 80 years or over in England and Wales is projected to increase by 82% from 2010 to 2030 with a demand for 630,000 care-home places by 2030. This analysis highlights the diversity of capability and levels of dependency in this cohort. A remarkably high proportion remain independent, particularly men. However a significant proportion of this population require 24-hour care at home or in care homes. Projections for the next 20 years suggest substantial increases in the number requiring 24-hour care due to

  3. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  4. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair of future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem are collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools are lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  5. SuperMAG: Present and Future Capabilities

    Science.gov (United States)

    Hsieh, S. W.; Gjerloev, J. W.; Barnes, R. J.

    2009-12-01

    SuperMAG is a global collaboration that provides ground magnetic field perturbations from a long list of stations in the same coordinate system, identical time resolution and with a common baseline removal approach. This unique high quality dataset provides a continuous and nearly global monitoring of the ground magnetic field perturbation. Currently, only archived data are available on the website and hence it targets basic research without any operational capabilities. The existing SuperMAG software can be easily adapted to ingest real-time or near real-time data and provide a now-casting capability. The SuperDARN program has a long history of providing near real-time maps of the northern hemisphere electrostatic potential and as both SuperMAG and SuperDARN share common software it is relatively easy to adapt these maps for global magnetic perturbations. Magnetometer measurements would be assimilated by the SuperMAG server using a variety of techniques, either by downloading data at regular intervals from remote servers or by real-time streaming connections. The existing SuperMAG analysis software would then process these measurements to provide the final calibrated data set using the SuperMAG coordinate system. The existing plotting software would then be used to produce regularly updated global plots. The talk will focus on current SuperMAG capabilities illustrating the potential for now-casting and eventually forecasting.

  6. Safety related requirements on future nuclear power plants

    International Nuclear Information System (INIS)

    Niehaus, F.

    1991-01-01

    Nuclear power has the potential to significantly contribute to the future energy supply. However, this requires continuous improvements in nuclear safety. Technological advancements and implementation of safety culture will achieve a safety level for future reactors of the present generation of a probability of core-melt of less than 10 -5 per year, and less than 10 -6 per year for large releases of radioactive materials. There are older reactors which do not comply with present safety thinking. The paper reviews findings of a recent design review of WWER 440/230 plants. Advanced evolutionary designs might be capable of reducing the probability of significant off-site releases to less than 10 -7 per year. For such reactors there are inherent limitations to increase safety further due to the human element, complexity of design and capability of the containment function. Therefore, revolutionary designs are being explored with the aim of eliminating the potential for off-site releases. In this context it seems to be advisable to explore concepts where the ultimate safety barrier is the fuel itself. (orig.) [de

  7. An Overview of Current and Future Stratospheric Balloon Mission Capabilities

    Science.gov (United States)

    Smith, Michael

    The modern stratospheric balloon has been used for a variety of missions since the late 1940's. Capabilities of these vehicles to carry larger payloads, fly to higher altitudes, and fly for longer periods of time have increased dramatically over this time. In addition to these basic performance metrics, reliability statistics for balloons have reached unprecedented levels in recent years. Balloon technology developed in the United States in the last decade has the potential to open a new era in economical space science using balloons. As always, the advantage of the balloon platform is the fact that missions can be carried out at a fraction of the cost and schedule of orbital missions. A secondary advantage is the fact that instruments can be re-flown numerous times while upgrading sensor and data processing technologies from year to year. New mission capabilities now have the potential for enabling ground breaking observations using balloons as the primary platform as opposed to a stepping stone to eventual orbital observatories. The limit of very high altitude balloon missions will be explored with respect to the current state of the art of balloon materials and fabrication. The same technological enablers will also be applied to possibilities for long duration missions at mid latitudes with payloads of several tons. The balloon types and their corresponding mission profiles will be presented in a performance matrix that will be useful for potential scientific users in planning future research programs.

  8. Quantifying the Global Fresh Water Budget: Capabilities from Current and Future Satellite Sensors

    Science.gov (United States)

    Hildebrand, Peter; Zaitchik, Benjamin

    2007-01-01

    The global water cycle is complex and its components are difficult to measure, particularly at the global scales and with the precision needed for assessing climate impacts. Recent advances in satellite observational capabilities, however, are greatly improving our knowledge of the key terms in the fresh water flux budget. Many components of the of the global water budget, e.g. precipitation, atmospheric moisture profiles, soil moisture, snow cover, sea ice are now routinely measured globally using instruments on satellites such as TRMM, AQUA, TERRA, GRACE, and ICESat, as well as on operational satellites. New techniques, many using data assimilation approaches, are providing pathways toward measuring snow water equivalent, evapotranspiration, ground water, ice mass, as well as improving the measurement quality for other components of the global water budget. This paper evaluates these current and developing satellite capabilities to observe the global fresh water budget, then looks forward to evaluate the potential for improvements that may result from future space missions as detailed by the US Decadal Survey, and operational plans. Based on these analyses, and on the goal of improved knowledge of the global fresh water budget under the effects of climate change, we suggest some priorities for the future, based on new approaches that may provide the improved measurements and the analyses needed to understand and observe the potential speed-up of the global water cycle under the effects of climate change.

  9. Department of Defense Energy and Logistics: Implications of Historic and Future Cost, Risk, and Capability Analysis

    Science.gov (United States)

    Tisa, Paul C.

    Every year the DoD spends billions satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and dependent on uncertain security environment and technology futures. These components and their relationships are less understood. Without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been through a metric known as the Fully Burdened Cost of Energy (FBCE). FBCE is defined as the commodity price for fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated. Current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework to do so. It is broken into five parts. The first describes the motivation behind FBCE, the limits of current implementation, and outlines a new framework that aids decisions. Respectively, the second, third, and fourth present a historic analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and pull the historic analysis into a revised framework. The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and

  10. A business analytics capability framework

    Directory of Open Access Journals (Sweden)

    Ranko Cosic

    2015-09-01

    Full Text Available Business analytics (BA capabilities can potentially provide value and lead to better organisational performance. This paper develops a holistic, theoretically-grounded and practically relevant business analytics capability framework (BACF that specifies, defines and ranks the capabilities that constitute an organisational BA initiative. The BACF was developed in two phases. First, an a priori conceptual framework was developed based on the Resource-Based View theory of the firm and a thematic content analysis of the BA literature. Second, the conceptual framework was further developed and refined using a three round Delphi study involving 16 BA experts. Changes from the Delphi study resulted in a refined and confirmed framework including detailed capability definitions, together with a ranking of the capabilities based on importance. The BACF will help academic researchers and industry practitioners to better understand the capabilities that constitute an organisational BA initiative and their relative importance. In future work, the capabilities in the BACF will be operationalised to measure their as-is status, thus enabling organisations to identify key areas of strength and weakness and prioritise future capability improvement efforts.

  11. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  12. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    Roth, M.J.

    1985-04-01

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules are described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)

  13. Legitimacy, capability, effectiveness and the future of the NPT

    International Nuclear Information System (INIS)

    Keeley, J.F.

    1987-01-01

    This chapter looks at the relationship between legitimacy and capability in conceptually and politically contestable regions. This issue was highlighted by India's nuclear test of May 1974 and the Osiraq raid of 1981. These illustrated the general problem of the threat to the coherence and legitimacy of the non-proliferation regime. This threat arose from the spread of nuclear technological capabilities. Two developments in the non-proliferation regime that have helped produce the more specific problems of that regime are discussed. These are the spread of nuclear technological capabilities and the development of complex co-operation networks. The prospects for the modification of the NPT in response to these challenges are considered finally. (U.K.)

  14. The United States should forego a damage-limitation capability against China

    Science.gov (United States)

    Glaser, Charles L.

    2017-11-01

    Bottom Lines • THE KEY STRATEGIC NUCLEAR CHOICE. Whether to attempt to preserve its damage-limitation capability against China is the key strategic nuclear choice facing the United States. The answer is much less clear-cut than when the United States faced the Soviet Union during the Cold War. • FEASIBILITY OF DAMAGE LIMITATION. Although technology has advanced significantly over the past three decades, future military competition between the U.S. and Chinese forces will favor large-scale nuclear retaliation over significant damage limitation. • BENEFITS AND RISKS OF A DAMAGE-LIMITATION CAPABILITY. The benefits provided by a modest damage-limitation capability would be small, because the United States can meet its most important regional deterrent requirements without one. In comparison, the risks, which include an increased probability of accidental and unauthorized Chinese attacks, as well as strained U.S.—China relations, would be large. • FOREGO DAMAGE LIMITATION. These twin findings—the poor prospects for prevailing in the military competition, and the small benefits and likely overall decrease in U.S. security—call for a U.S. policy that foregoes efforts to preserve or enhance its damage-limitation capability.

  15. Survey Probability and Factors affecting Farmers Participation in Future and Option Markets Case Study: Cotton product in Gonbad kavos city

    Directory of Open Access Journals (Sweden)

    F. sakhi

    2016-03-01

    .5 respectively. Multinomial Logit model estimation results for the probability of participation in the future and option markets showed that variables of the level of education, farm ownership, cotton acreage, and non-farm income, work experience in agriculture, the index of willing to use new technologies, the index of risk perception cotton market and risk aversion index are statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture, showed negative effects and the other variables showed positive effects on the probability of participation in these markets. The results are in line with previous studies. Conclusion: The purpose of the current study was to look at the possibility of farmers participations in the future and option markets that presented as a means to reduce the cotton prices volatility. The dependent variable for this purpose, have four categories: participation in both market, and future market, participation in option market and participation in both future and option markets. Multinomial Legit Regression Model was used for data analysis. Results indicated that during the period of 2014 -2015 and the sample under study 35% of cotton growers unwilling to participate in the future and option markets. Farmers willingness to participate in the future and option market was 19% and %21.5, respectively. Multinomial Legit model estimation results for the probability of participation in the future and option markets showed that the variables of the level of education, farm ownership, cotton acreage, and non-farm income, work experience in agriculture, the index of willing to use new technologies, the index of risk perception cotton market and risk aversion index were statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture, showed negative effects and the other variables positive effects on the probability of participation in these markets. The results are in line

  16. A Comparative Assessment of the Navy’s Future Naval Capabilities (FNC) Process and Joint Staff Capability Gap Assessment Process as Related to Pacific Commands (PACOM) Integrated Priority List Submission

    Science.gov (United States)

    2013-04-01

    based on personal interviews with Kit Carlan and Ken Bruner of PACOM, and PowerPoint slides dated March 23, 2011, and prepared by Kit Carlan. 8 The...Integration Branch, Joint Capability Division, J-8, Joint Staff; Mr. Ken Bruner , Science and Technology Advisor, PACOM; Mr. Kit Carlan, Future

  17. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-() (01e)=1e,w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.

  18. Thermal disadvantage factor calculation by the multiregion collision probability method

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2004-01-01

    A multi-region collision probability formulation that is capable of applying white boundary condition directly is presented and applied to thermal neutron transport problems. The disadvantage factors computed are compared with their counterparts calculated by S N methods with both direct and indirect application of white boundary condition. The results of the ABH and collision probability method with indirect application of white boundary condition are also considered and comparisons with benchmark Monte Carlo results are carried out. The studies show that the proposed formulation is capable of calculating thermal disadvantage factor with sufficient accuracy without resorting to the fictitious scattering outer shell approximation associated with the indirect application of the white boundary condition in collision probability solutions

  19. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving...

  20. The Renovation and Future Capabilities of the Thacher Observatory

    Science.gov (United States)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware) are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5 min timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.

  1. A Comparative Assessment of the Navy’s Future Naval Capabilities (FNC) Process and Joint Staff Capability Gap Assessment Process as Related to Pacific Command’s Integrated Priority List Submission

    Science.gov (United States)

    2012-12-13

    is based on personal interviews with Kit Carlan and Ken Bruner of PACOM, and PowerPoint slides dated March 23, 2011 that were prepared by Kit Carlan...Ken Bruner , Science and Technology Advisor, PACOM; Mr. Kit Carlan, Future Capabilities Analyst, J-82, PACOM for key information and helpful

  2. The ESA River & Lake System: Current Capabilities and Future Potential

    DEFF Research Database (Denmark)

    Smith, Richard G.; Salloway, Mark; Berry, Philippa A. M.

    Measuring the earth's river and lake resources using satellite radar altimetry offers a unique global monitoring capability, which complements the detailed measurements made by the steadily decreasing number of in-situ gauges. To exploit this unique remote monitoring capability, a global pilot...

  3. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Science.gov (United States)

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to

  4. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability. The change is also expected to differ depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that can calculate core damage probability in a short time, while varying individual component failure probabilities between 0 and 1 and applying either Japanese or American initiating event frequency data. The analysis showed that: (1) the frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high-pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the core damage probability increases strongly when their base failure probability is increased; (2) core damage probability is insensitive to surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little even when their failure probability changes by about an order of magnitude; (3) when Japanese failure probability data are applied to the emergency diesel generators, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value, whereas with American failure probability data the core damage probability increases substantially as the failure probability increases. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
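
    The sensitivity study described above can be mimicked with a toy, single-sequence model in which core damage requires the initiating event plus failure of two mitigation trains; the sketch below simply sweeps one component's failure probability toward 1. It is a minimal stand-in for, not a reproduction of, the NRC accident sequence precursor model, and all numbers are illustrative.

```python
def core_damage_frequency(ie_freq, p_pump_train, p_valve_train):
    """Toy single-sequence model: core damage occurs only if, after the initiating
    event, both the pump train and the valve train fail."""
    return ie_freq * p_pump_train * p_valve_train

ie_freq = 1.0e-2        # initiating events per reactor-year (illustrative)
p_valve_train = 5.0e-3  # base failure probability of the second train (illustrative)

# Sweep the pump-train failure probability from its base value toward 1.0.
for p_pump in (1.0e-3, 1.0e-2, 1.0e-1, 1.0):
    cdf = core_damage_frequency(ie_freq, p_pump, p_valve_train)
    print(f"pump-train failure probability {p_pump:.0e} -> core damage frequency {cdf:.1e} /yr")
```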

  5. NJOY 99/2001: new capabilities in data processing

    International Nuclear Information System (INIS)

    MacFarlane, Robert E.

    2002-01-01

    The NJOY Nuclear Data Processing System is used all over the world to process evaluated nuclear data in the ENDF format into libraries for applications. Over the last few years, a number of new capabilities have been added to the system to provide advanced features for MCNP, MCNPX, and other applications codes. These include probability tables for unresolved range self shielding, capabilities optimized for high-energy libraries (typically to 150 MeV for accelerator applications), options for detailed treatments of incident and outgoing charged particles, and a capability to handle photonuclear reactions. These new features and recent experience using NJOY99 for library production will be discussed, along with possible future work, such as delayed-neutron processing and capabilities to handle the new generation of photo-atomic, electro-atomic, and atomic-relaxation evaluations now becoming available in ENDF format. The latest version of the code, NJOY 2001, uses modern Fortran90 style, modularization, and memory allocation methods. The Evaluated Nuclear Data Files (ENDF) format has become the standard for representing nuclear data throughout the world, being used in the US ENDF/B libraries, the European JEF libraries, the Japanese JENDL libraries, and many others. At the same time, the NJOY Nuclear Data Processing System, which is used to convert evaluated nuclear data in the ENDF format into data libraries for nuclear applications, has become the method of choice throughout the world. The combination of these modern libraries of evaluated nuclear data and NJOY processing has proved very capable for classical applications in reactor analysis, fusion work, shielding, and criticality safety. However, over the last few years, new applications have appeared that require extended evaluated data and new processing techniques. A good example of this is the interest in accelerator-boosted applications, which has led to the need for data to higher energies, such as 150 Me

  6. Rainfall and net infiltration probabilities for future climate conditions at Yucca Mountain

    International Nuclear Information System (INIS)

    Long, A.; Childs, S.W.

    1993-01-01

    Performance assessment of repository integrity is a task rendered difficult because it requires predicting the future. This challenge has occupied many scientists who realize that the best assessments are required to maximize the probability of successful repository siting and design. As part of a performance assessment effort directed by EPRI, the authors have used probabilistic methods to assess the magnitude and timing of net infiltration at Yucca Mountain. A previously published mathematical model for net infiltration incorporated a probabilistic treatment of climate and surface hydrologic processes together with a mathematical model of the infiltration process. In this paper, we present the details of the climatological analysis. The precipitation model is event-based, simulating characteristics of modern rainfall near Yucca Mountain and then extending the model to the most likely values for different degrees of pluvial climates. Next, the precipitation event model is fed into a process-based infiltration model that considers spatial variability in parameters relevant to net infiltration at Yucca Mountain. The model predicts that average annual net infiltration at Yucca Mountain will range from a mean of about 1 mm under present climatic conditions to a mean of at least 2.4 mm under full glacial (pluvial) conditions. Considerable variations about these means are expected to occur from year to year

  7. Identifying 21st Century Capabilities

    Science.gov (United States)

    Stevens, Robert

    2012-01-01

    What are the capabilities necessary to meet 21st century challenges? Much of the literature on 21st century skills focuses on skills necessary to meet those challenges associated with future work in a globalised world. The result is a limited characterisation of those capabilities necessary to address 21st century social, health and particularly…

  8. Building Airport Surface HITL Simulation Capability

    Science.gov (United States)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will address the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  9. Campus Capability Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arsenlis, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bailey, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergman, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brase, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brenner, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Camara, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carlton, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cheng, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chrzanowski, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Colson, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); East, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Farrell, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferranti, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gursahani, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hansen, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Helms, L. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hernandez, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jeffries, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Larson, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lu, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNabb, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mercer, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Skeate, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sueksdorf, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zucca, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Le, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ancria, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scott, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leininger, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gagliardi, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gash, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronson, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chung, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hobson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meeker, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanchez, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zagar, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Quivey, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sommer, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Atherton, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-06

    Lawrence Livermore National Laboratory Campus Capability Plan for 2018-2028. Lawrence Livermore National Laboratory (LLNL) is one of three national laboratories that are part of the National Nuclear Security Administration. LLNL provides critical expertise to strengthen U.S. security through development and application of world-class science and technology that: Ensures the safety, reliability, and performance of the U.S. nuclear weapons stockpile; Promotes international nuclear safety and nonproliferation; Reduces global danger from weapons of mass destruction; Supports U.S. leadership in science and technology. Essential to the execution and continued advancement of these mission areas are responsive infrastructure capabilities. This report showcases each LLNL capability area and describes the mission, science, and technology efforts enabled by LLNL infrastructure, as well as future infrastructure plans.

  10. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  11. Vulnerability assessment: Determining probabilities of neutralization of adversaries

    International Nuclear Information System (INIS)

    Graves, B.R.

    1987-01-01

    The Security Manager charged with the responsibility of designing Safeguards and Security Systems at Department of Energy facilities must take many factors into consideration. There must be a clear understanding, supported by documented guidance, of the level of threat to be addressed, the nature of the facility to be protected, and the funds available to design, implement, and maintain the Safeguards and Security System. Armed with these prerequisites, the Security Manager may then determine the characteristics of the Safeguards measures and security forces necessary to protect the facility. Security forces selection and training programs may then be established based on realistic facility needs. The next step is to attempt to determine the probability of security forces winning in a confrontation with adversaries. To determine the probability of success the Security Manager must consider the characteristics of the facility and surrounding area; the characteristics of the security forces and safeguards system at the facility; the response time and capabilities of the augmentation forces; and the characteristics and capabilities of the adversary threat level to be addressed. Obviously, the Safeguards and Security Systems must initially address 'worst case' scenarios consistent with stated guidelines. Validation of the assessment of the Safeguards and Security Systems must then be determined by simulation testing of the capabilities of the response forces against the capabilities of the adversary

  12. Convergence of Transition Probability Matrix in CLV-Markov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behavior, which follows from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n goes to infinity. Mathematically, establishing convergence means finding the limit of the transition matrix raised to the power n as n goes to infinity. This limit is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of the transition probability matrix is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is obtained using a simple concept from linear algebra, namely diagonalization of the matrix. This approach has a higher level of complexity, because the matrix must be diagonalized, but it has the advantage of yielding a general closed form for the n-th power of the transition probability matrix, which is useful for examining the transition matrix before it becomes stationary. Example cases are taken from a CLV model formulated as an MCM, called the CLV-Markov model, and several transition probability matrices are examined for their convergence. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
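
    A minimal sketch of the diagonalization route described above, using an illustrative 3-state transition matrix rather than the paper's CLV-Markov model: the eigendecomposition gives a closed form for P^n, and for large n the rows approach the stationary distribution.

```python
import numpy as np

# Illustrative 3-state transition probability matrix (rows sum to 1).
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

# Diagonalize: P = V diag(w) V^-1, hence P^n = V diag(w^n) V^-1 in closed form.
w, V = np.linalg.eig(P)
V_inv = np.linalg.inv(V)

def nth_step_matrix(n):
    """n-step transition matrix obtained from the eigendecomposition."""
    return np.real_if_close((V * w ** n) @ V_inv)

print(nth_step_matrix(50))             # rows converge to the stationary distribution
print(np.linalg.matrix_power(P, 50))   # brute-force check of the same limit
```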

  13. Graphical Visualization of Human Exploration Capabilities

    Science.gov (United States)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description

  14. Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios

    Directory of Open Access Journals (Sweden)

    Okjeong Lee

    2016-01-01

    Full Text Available According to the IPCC Fifth Assessment Report, future air temperature and humidity are expected to increase gradually relative to the present. In this study, future PMPs are estimated using future dew point temperature projections obtained from RCM data provided by the Korea Meteorological Administration. First, the bias in the daily future dew point temperature projections is corrected through a quantile-mapping method. Next, using a scale-invariance technique, the 12-hour-duration, 100-year-return-period dew point temperatures that are essential input for PMP estimation are estimated from the bias-corrected future dew point temperature data. After estimating future PMPs, it is shown that PMPs under all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5) are very likely to increase.
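
    A minimal sketch of empirical quantile mapping, the bias-correction step mentioned above, using synthetic dew point temperatures instead of the KMA RCM data: each future model value is mapped to the observed value at the same quantile of the historical model distribution.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: look up each future model value's quantile in the
    historical model distribution, then return the observed value at that quantile."""
    q = np.interp(model_future, np.sort(model_hist), np.linspace(0.0, 1.0, model_hist.size))
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(1)
obs_hist = rng.normal(12.0, 3.0, 5000)       # observed daily dew point temperature (deg C)
model_hist = rng.normal(10.0, 2.5, 5000)     # RCM output over the same period (biased)
model_future = rng.normal(11.5, 2.5, 5000)   # RCM projection for the future period

corrected = quantile_map(model_hist, obs_hist, model_future)
print(f"raw future mean {model_future.mean():.2f} deg C, corrected mean {corrected.mean():.2f} deg C")
```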

  15. Advanced simulation capability for environmental management - current status and future applications

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, Mark; Scheibe, Timothy [Pacific Northwest National Laboratory, Richland, Washington (United States); Robinson, Bruce; Moulton, J. David; Dixon, Paul [Los Alamos National Laboratory, Los Alamos, New Mexico (United States); Marble, Justin; Gerdes, Kurt [U.S. Department of Energy, Office of Environmental Management, Washington DC (United States); Stockton, Tom [Neptune and Company, Inc, Los Alamos, New Mexico (United States); Seitz, Roger [Savannah River National Laboratory, Aiken, South Carolina (United States); Black, Paul [Neptune and Company, Inc, Lakewood, Colorado (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater (EM-12), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach that is currently aimed at understanding and predicting contaminant fate and transport in natural and engineered systems. ASCEM is a modular and open-source high-performance computing tool. It will be used to facilitate integrated approaches to modeling and site characterization, and provide robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of capabilities, with current emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and the High-Performance Computing (HPC) multi-process simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and verification and model confidence testing. The integration of the Platform and HPC capabilities was tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities in 2012. The current maturity of the ASCEM computational and analysis capabilities has afforded the opportunity for collaborative efforts to develop decision analysis tools to support and optimize radioactive waste disposal. Recent advances in computerized decision analysis frameworks provide the perfect opportunity to bring this capability into ASCEM. This will allow radioactive waste

  16. U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study: xLPR framework model user's guide

    International Nuclear Information System (INIS)

    Kalinich, Donald A.; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-01-01

    For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR Code. Version 1.0 of the xLPR code is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of this framework will expand the capabilities to other cracking mechanisms and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study project is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. This framework and demonstration problem will be used to evaluate the commercial software's capabilities and applicability for use in creating the final version of the xLPR framework. This report details the design, system requirements, and the steps necessary to use the commercial-code based xLPR framework developed by SNL.

  17. NASA Capabilities That Could Impact Terrestrial Smart Grids of the Future

    Science.gov (United States)

    Beach, Raymond F.

    2015-01-01

    Incremental steps to steadily build, test, refine, and qualify capabilities that lead to affordable flight elements and a deep space capability. Potential Deep Space Vehicle power system characteristics: power 10 kilowatts average; two independent power channels with multi-level cross-strapping; solar array power 24-plus kilowatts; multi-junction arrays; lithium-ion battery storage 200-plus ampere-hours; sized for deep space or low lunar orbit operation; distribution 120 volts secondary (SAE AS 5698); 2-kilowatt power transfer between vehicles.

  18. Array capabilities and future arrays

    International Nuclear Information System (INIS)

    Radford, D.

    1993-01-01

    Early results from the new third-generation instruments GAMMASPHERE and EUROGAM are confirming the expectation that such arrays will have a revolutionary effect on the field of high-spin nuclear structure. When completed, GAMMASPHERE will have a resolving power an order of magnitude greater than that of the best second-generation arrays. When combined with other instruments such as particle-detector arrays and fragment mass analysers, the capabilities of the arrays for the study of more exotic nuclei will be further enhanced. In order to better understand the limitations of these instruments, and to design improved future detector systems, it is important to have an intelligible and reliable calculation for the relative resolving power of different instrument designs. The derivation of such a figure of merit will be briefly presented, and the relative sensitivities of arrays currently proposed or under construction presented. The design of TRIGAM, a new third-generation array proposed for Chalk River, will also be discussed. It is instructive to consider how far arrays of Compton-suppressed Ge detectors could be taken. For example, it will be shown that an idealised 'perfect' third-generation array of 1000 detectors has a sensitivity an order of magnitude higher again than that of GAMMASPHERE. Less conventional options for new arrays will also be explored

  19. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  20. Well-being, life satisfaction and capabilities of flood disaster victims

    International Nuclear Information System (INIS)

    Van Ootegem, Luc; Verhofstadt, Elsy

    2016-01-01

    The individual well-being of flood disaster victims is examined making use of two concepts: life satisfaction and perceived capabilities in life. These concepts are compared in two samples: a representative sample of Flemish respondents and a specific sample of people that have been the victim of a pluvial flood. Well-being as life satisfaction is found not to be related to past or expected future flooding, whereas well-being as capabilities in life is negatively related to both past and expected future flooding. - Highlights: • Well-being as life satisfaction is not related to past or expected future flooding. • Well-being as capabilities in life is negatively related to flooding. • A disaster can scare people for the future because of the scars that it provokes. • Assess the impact of a disaster not only by monetary damage and life satisfaction.

  1. Well-being, life satisfaction and capabilities of flood disaster victims

    Energy Technology Data Exchange (ETDEWEB)

    Van Ootegem, Luc, E-mail: Luc.VanOotegem@UGent.be [HIVA–University of Louvain (Belgium); SHERPPA–Ghent University (Belgium); Verhofstadt, Elsy [SHERPPA–Ghent University (Belgium)

    2016-02-15

    The individual well-being of flood disaster victims is examined making use of two concepts: life satisfaction and perceived capabilities in life. These concepts are compared in two samples: a representative sample of Flemish respondents and a specific sample of people that have been the victim of a pluvial flood. Well-being as life satisfaction is found not to be related to past or expected future flooding, whereas well-being as capabilities in life is negatively related to both past and expected future flooding. - Highlights: • Well-being as life satisfaction is not related to past or expected future flooding. • Well-being as capabilities in life is negatively related to flooding. • A disaster can scare people for the future because of the scars that it provokes. • Assess the impact of a disaster not only by monetary damage and life satisfaction.

  2. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  3. Using Porterian Activity Analysis to Understand Organizational Capabilities

    DEFF Research Database (Denmark)

    Sheehan, Norman T.; Foss, Nicolai Juul

    2017-01-01

    conceptualized by Porter’s writings on the activity-based view. Porterian activity analysis is becoming more accepted in the strategy literature, but no strategy scholar has explicitly used Porter’s activities, and particularly his concept of drivers, to understand and analyze organizational capabilities. ... Introducing Porterian activities into the discussion of capabilities improves strategy scholars’ understanding of the bases of capability heterogeneity, offers academics future directions for research, and provides managers with guidance to enhance their organizations’ capabilities...

  4. LOGISTIC REGRESSION AS A TOOL FOR DETERMINATION OF THE PROBABILITY OF DEFAULT FOR ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Erika SPUCHLAKOVA

    2017-12-01

    Full Text Available In a rapidly changing world it is necessary to adapt to new conditions, and approaches can vary from day to day. For the proper management of a company it is essential to know its financial situation. The financial health of a company can be assessed through financial analysis, which provides a number of methods for evaluating it. Analysis indicators are often included in company assessments and in obtaining bank loans and other financial resources to ensure the functioning of the company. Because a company focuses on the future and its planning, it is essential to forecast its future financial situation. Based on the prediction of its financial health, the company decides on the extension or limitation of its business. It depends mainly on the capabilities of the company's management how the information obtained from financial analysis is used in practice. The first findings on logistic regression methods were published in the 1960s as an alternative to the least squares method. The essence of logistic regression is to determine the relationship between the explained (dependent) variable and the explanatory (independent) variables. The basic principle of this statistical method is based on regression analysis but, unlike linear regression, it can predict the probability that a phenomenon has or has not occurred. The aim of this paper is to determine the probability of default of enterprises.
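
    A minimal sketch of estimating a probability of default with logistic regression on synthetic firm data; the ratio names, coefficients, and data are illustrative placeholders, not the indicators used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
liquidity = rng.normal(1.5, 0.5, n)         # current ratio
leverage = rng.normal(0.6, 0.2, n)          # debt / assets
profitability = rng.normal(0.05, 0.08, n)   # return on assets

# Hidden default process used only to generate labels for this sketch.
logit = -2.0 - 1.5 * liquidity + 3.0 * leverage - 8.0 * profitability
default = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([liquidity, leverage, profitability])
model = LogisticRegression().fit(X, default)

new_firm = np.array([[1.1, 0.85, -0.02]])   # a hypothetical weak firm
print(f"Estimated probability of default: {model.predict_proba(new_firm)[0, 1]:.2%}")
```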

  5. The NASA MSFC Electrostatic Levitation (ESL) Laboratory: Summary of Capabilities, Recent Upgrades, and Future Work

    Science.gov (United States)

    SanSoucie, Michael P.; Vermilion, David J.; Rogers, Jan R.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) electrostatic levitation (ESL) laboratory has a long history of providing materials research and thermophysical property data. A summary of the lab's capabilities, recent upgrades, and ongoing and future work is provided. The laboratory has recently added two new capabilities to its main levitation chamber: a rapid quench system and an oxygen control system. The rapid quench system allows samples to be dropped into a quench vessel that can be filled with a low-melting-point material, such as a gallium or indium alloy, thereby allowing rapid quenching of undercooled liquid metals. The oxygen control system consists of an oxygen sensor, an oxygen pump, and a control unit. The sensor is a potentiometric device that determines the difference in oxygen activity between two gas compartments separated by an electrolyte of yttria-stabilized zirconia. The pump utilizes coulometric titration to either add or remove oxygen. The system is controlled by a desktop control unit, which can also be accessed via a computer. This system allows the oxygen partial pressure within the vacuum chamber to be measured and controlled, theoretically in the range from 10^-36 to 10^0 bar. The ESL laboratory also has an emissometer, called the High-Temperature Emissivity Measurement System (HiTEMS), which measures the spectral emissivity of materials from 600°C to 3,000°C. The system consists of a vacuum chamber, a black-body source, and a Fourier Transform Infrared Spectrometer (FTIR), and it utilizes optics to swap the signal between the sample and the black body. The system was originally designed to measure the hemispherical spectral emissivity of levitated samples, which are typically 2.5 mm spheres. Levitation allows emissivity measurements of molten samples, but more work is required to develop this capability. The system is currently set up to measure the near-normal spectral emissivity of stationary samples, which has been used

  6. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (≳5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes ≳5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
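
    Under the exponential (memoryless) model used above, the annual eruption probability follows directly from the mean inter-event time; a minimal sketch with an illustrative chronology (not the actual Lassen data):

```python
import numpy as np

# Illustrative inter-event times between eruptions, in thousands of years.
intervals_kyr = np.array([12.0, 3.5, 8.0, 1.2, 20.0, 6.3, 4.1])

# Exponential model: constant rate = 1 / mean interval, independent of time since the last event.
rate_per_yr = 1.0 / (intervals_kyr.mean() * 1000.0)
p_1yr = 1.0 - np.exp(-rate_per_yr)
p_30yr = 1.0 - np.exp(-rate_per_yr * 30.0)
print(f"annual eruption probability ~ {p_1yr:.1e}; 30-year probability ~ {p_30yr:.1e}")
```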

  7. Technological Innovation Capabilities and Firm Performance

    OpenAIRE

    Richard C.M. Yam; William Lo; Esther P.Y. Tang; Antonio; K.W. Lau

    2010-01-01

    Technological innovation capability (TIC) is defined as a comprehensive set of characteristics of a firm that facilitates and supports its technological innovation strategies. An audit to evaluate the TICs of a firm may trigger improvement in its future practices. Such an audit can be used by the firm for self-assessment or for third-party independent assessment to identify problems in its capability status. This paper attempts to develop such an auditing framework that can

  8. U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study : xLPR framework model user's guide.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinich, Donald A.; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-12-01

    For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR Code. Version 1.0 of the xLPR code is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of this framework will expand the capabilities to other cracking mechanisms and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study project is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. This framework and demonstration problem will be used to evaluate the commercial software's capabilities and applicability for use in creating the final version of the xLPR framework. This report details the design, system requirements, and the steps necessary to use the commercial-code based xLPR framework developed by SNL.

  9. Exploration Medical Capability (ExMC) Projects

    Science.gov (United States)

    Wu, Jimmy; Watkins, Sharmila; Baumann, David

    2010-01-01

    During missions to the Moon or Mars, the crew will need medical capabilities to diagnose and treat disease as well as to maintain their health. The Exploration Medical Capability Element develops medical technologies, medical informatics, and clinical capabilities for different levels of care during space missions. The work done by team members in this Element is leading-edge technology, procedure, and pharmacological development. They develop data systems that protect patients' private medical information, aid in the diagnosis of medical conditions, and act as a repository of relevant NASA life sciences experimental studies. To minimize the medical risks to crew health, the physicians and scientists in this Element develop models to quantify the probability of medical events occurring during a mission. They define procedures to treat an ill or injured crew member who does not have access to an emergency room and who must be cared for in a microgravity environment where both liquids and solids behave differently than on Earth. To support the development of these medical capabilities, the Element manages the development of medical technologies that prevent, monitor, diagnose, and treat an ill or injured crewmember. The Exploration Medical Capability Element collaborates with the National Space Biomedical Research Institute (NSBRI), the Department of Defense, other Government-funded agencies, academic institutions, and industry.

  10. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

    We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations to synthesize the superheavy element Z=124. Hence, we have identified the most probable projectile-target combination to synthesize the superheavy element Z = 124. To synthesize the superheavy element Z=124, the most probable projectile-target combinations are Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may be a guide for the future experiments in the synthesis of superheavy nuclei Z = 124. (orig.)

  11. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  14. Evaluating late detection capability against diverse insider adversaries

    International Nuclear Information System (INIS)

    Sicherman, A.

    1987-01-01

    This paper describes a model for evaluating the late (after-the-fact) detection capability of material control and accountability (MC&A) systems against insider theft or diversion of special nuclear material. Potential insider cover-up strategies to defeat activities providing detection (e.g., inventories) are addressed by the model in a tractable manner. For each potential adversary and detection activity, two probabilities are assessed and used to fit the model. The model then computes the probability of detection for activities occurring periodically over time. The model provides insight into MC&A effectiveness and helps identify areas for safeguards improvement. 4 refs., 4 tabs
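
    As a much-simplified illustration of the idea (not the model's actual two-probability fit), the chance that at least one of a series of periodic detection activities catches the loss can be combined as shown below, with a cover-up factor degrading each activity's effectiveness; all numbers are hypothetical.

```python
def cumulative_detection(p_detect, p_coverup_fails, n_activities):
    """Probability that at least one of n periodic activities detects the loss.
    p_detect: chance an activity detects the loss when it is not covered up;
    p_coverup_fails: chance the insider's cover-up of that activity fails."""
    p_per_activity = p_detect * p_coverup_fails
    return 1.0 - (1.0 - p_per_activity) ** n_activities

# Hypothetical case: monthly inventories over two years against insiders of varying skill.
for p_coverup_fails in (1.0, 0.5, 0.2):
    p = cumulative_detection(p_detect=0.6, p_coverup_fails=p_coverup_fails, n_activities=24)
    print(f"cover-up fails with prob {p_coverup_fails}: detection within 2 yr = {p:.3f}")
```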

  15. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  16. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  17. Demonstration of Hybrid DSMC-CFD Capability for Nonequilibrium Reacting Flow

    Science.gov (United States)

    2018-02-09

    AFRL-RV-PS-TR-2018-0056: Demonstration of Hybrid DSMC-CFD Capability for Nonequilibrium Reacting Flow; Thomas E...; grant FA9453-17-1... simulation codes. The models are based on new ab initio rate data obtained using state-of-the-art potential energy surfaces for air species. A probability

  18. Assessing the present and future probability of Hurricane Harvey’s rainfall

    OpenAIRE

    Emanuel, Kerry

    2017-01-01

    Significance Natural disasters such as the recent Hurricanes Harvey, Irma, and Maria highlight the need for quantitative estimates of the risk of such disasters. Statistically based risk assessment suffers from short records of often poor quality, and in the case of meteorological hazards, from the fact that the underlying climate is changing. This study shows how a recently developed physics-based risk assessment method can be applied to assessing the probabilities of extreme hurricane rainf...

  19. Arctic Observing Network Data Management: Current Capabilities and Their Promise for the Future

    Science.gov (United States)

    Collins, J.; Fetterer, F.; Moore, J. A.

    2008-12-01

    CADIS (the Cooperative Arctic Data and Information Service) serves as the data management, discovery and delivery component of the Arctic Observing Network (AON). As an International Polar Year (IPY) initiative, AON comprises 34 land, atmosphere and ocean observation sites, and will acquire much of the data coming from the interagency Study of Environmental Arctic Change (SEARCH). CADIS is tasked with ensuring that these observational data are managed for long term use by members of the entire Earth System Science community. Portions of CADIS are either in use by the community or available for testing. We now have an opportunity to evaluate the feedback received from our users, to identify any design shortcomings, and to identify those elements which serve their purpose well and will support future development. This presentation will focus on the nuts-and-bolts of the CADIS development to date, with an eye towards presenting lessons learned and best practices based on our experiences so far. The topics include: - How did we assess our users' needs, and how are those contributions reflected in the end product and its capabilities? - Why did we develop a CADIS metadata profile, and how does it allow CADIS to support preservation and scientific interoperability? - How can we shield the user from metadata complexities (especially those associated with various standards) while still obtaining the metadata needed to support an effective data management system? - How can we bridge the gap between the data storage formats considered convenient by researchers in the field, and those which are necessary to provide data interoperability? - What challenges have been encountered in our efforts to provide access to federated data (data stored outside of the CADIS system)? - What are the data browsing and visualization needs of the AON community, and which tools and technologies are most promising in terms of supporting those needs? A live demonstration of the current

  20. Alternatives for Future U.S. Space-Launch Capabilities

    Science.gov (United States)

    2006-10-01

    directive issued on January 14, 2004—called the new Vision for Space Exploration (VSE)—set out goals for future exploration of the solar system using manned spacecraft. Among those goals was a proposal to return humans to the moon no later than 2020. The ultimate goal... U.S. launch capacity exclude the Sea Launch system operated by Boeing in partnership with RSC-Energia (based in Moscow), Kvaerner ASA (based in Oslo

  1. Reliability Analysis and Overload Capability Assessment of Oil-Immersed Power Transformers

    Directory of Open Access Journals (Sweden)

    Chen Wang

    2016-01-01

    Full Text Available In recent years, smart grids have been constructed to guarantee the security and stability of the power grid. Power transformers are among the most vital components in the complicated smart grid network. Any transformer failure can damage the whole power system, and failures caused by overloading cannot be ignored. This research gives a new insight into overload capability assessment of transformers. The hot-spot temperature of the winding is the most critical factor in measuring the overload capacity of power transformers, so the hot-spot temperature is calculated to obtain the duration of running time of the power transformer under overloading conditions. Then the overloading probability is fitted with the mature and widely accepted Weibull probability density function. To guarantee the accuracy of this fitting, a new objective function is proposed to obtain the desired parameters of the Weibull distribution. In addition, ten different mutation scenarios are adopted in the differential evolution algorithm to optimize the parameters of the Weibull distribution. The final comprehensive overload capability of the power transformer is assessed by the duration of running time as well as the overloading probability. Compared with previous studies that take no account of the overloading probability, the assessment results obtained in this research are much more reliable.
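
    A minimal sketch of the fitting step: differential evolution searching Weibull shape and scale parameters over synthetic overload-duration data. The paper proposes its own objective function; a plain negative log-likelihood is used here as a stand-in, and the data are simulated rather than measured.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import weibull_min

# Simulated permissible overload-duration data in hours (illustrative, not measured data).
durations = weibull_min.rvs(c=2.2, scale=6.0, size=400, random_state=3)

def neg_log_likelihood(params):
    shape, scale = params
    return -np.sum(weibull_min.logpdf(durations, c=shape, scale=scale))

result = differential_evolution(neg_log_likelihood, bounds=[(0.1, 10.0), (0.1, 50.0)], seed=3)
shape, scale = result.x
print(f"fitted Weibull shape = {shape:.2f}, scale = {scale:.2f} h")
print(f"probability of sustaining a 4 h overload: {weibull_min.sf(4.0, c=shape, scale=scale):.2f}")
```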

  2. Capability Building and Learning: An Emergent Behavior Approach

    Directory of Open Access Journals (Sweden)

    Andreu Rafael

    2014-12-01

    Full Text Available Economics-based models of firms typically overlook management acts and capability development. We propose a model that analyzes the aggregate behavior of a population of firms resulting from both specific management decisions and learning processes that induce changes in companies’ capabilities. Decisions are made under imperfect information and bounded rationality, and managers may sacrifice short-term performance in exchange for qualitative outcomes that affect their firm’s future potential. The proposed model provides a structured setting in which these issues, often discussed only informally, can be systematically analyzed through simulation, producing a variety of hard-to-anticipate emergent behaviors. Economic performance is quite sensitive to managers’ estimates of their firms’ capabilities, and companies willing to sacrifice short-run results for future potential appear to be more stable than the rest. Also, bounded rationality can produce chaotic dynamics reminiscent of real-life situations.

  3. The probability and the management of human error

    International Nuclear Information System (INIS)

    Dufey, R.B.; Saull, J.W.

    2004-01-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. Its consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best-estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error-state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε), and stochastic occurrences with a finite minimum rate: λ = 5×10⁻⁵ + ((1/ε) − 5×10⁻⁵) exp(−3ε). The future failure rate is entirely determined by the experience: thus the past defines the future
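
    The quoted relationship is easy to evaluate directly; the sketch below tabulates λ(ε) for a few values of accumulated experience (the units of ε follow whatever accumulated-experience measure the authors adopt, e.g., accumulated operating experience):

```python
import numpy as np

def human_error_rate(experience):
    """lambda(eps) = 5e-5 + (1/eps - 5e-5) * exp(-3*eps), as quoted in the abstract above."""
    return 5e-5 + (1.0 / experience - 5e-5) * np.exp(-3.0 * experience)

for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"experience {eps:>4}: error rate {human_error_rate(eps):.3e}")
```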

  4. Enabling unmanned capabilities in the tactical wheeled vehicle fleet of the future

    Science.gov (United States)

    Zych, Noah

    2012-06-01

    From transporting troops and weapons systems to supplying beans, bullets, and Band-Aids to front-line warfighters, tactical wheeled vehicles serve as the materiel backbone anywhere there are boots on the ground. Drawing from the U.S. Army's Tactical Wheeled Vehicle Strategy and the Marine Corps Vision & Strategy 2025 reports, one may conclude that the services have modest expectations for the introduction of large unmanned ground systems into operational roles in the next 15 years. However, the Department of Defense has already invested considerably in the research and development of full-size UGVs, and commanders deployed in both Iraq and Afghanistan have advocated the urgent fielding of early incarnations of this technology, believing it could make a difference on their battlefields today. For military UGVs to evolve from mere tactical advantages into strategic assets with developed doctrine, they must become as trustworthy as a well-trained warfighter in performing their assigned task. Starting with the Marine Corps' ongoing Cargo Unmanned Ground Vehicle program as a baseline, and informed by feedback from previously deployed subject matter experts, this paper examines the gaps which presently exist in UGVs from a mission-capable perspective. It then considers viable near-term technical solutions to meet today's functional requirements, as well as long-term development strategies to enable truly robust performance. With future conflicts expected to be characterized by increasingly complex operational environments and a broad spectrum of rapidly adapting threats, one of the largest challenges for unmanned ground systems will be the ability to exhibit agility in unpredictable circumstances.

  5. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  6. Recent Investments by NASA's National Force Measurement Technology Capability

    Science.gov (United States)

    Commo, Sean A.; Ponder, Jonathan D.

    2016-01-01

    The National Force Measurement Technology Capability (NFMTC) is a nationwide partnership established in 2008 and sponsored by NASA's Aeronautics Evaluation and Test Capabilities (AETC) project to maintain and further develop force measurement capabilities. The NFMTC focuses on force measurement in wind tunnels and provides operational support in addition to conducting balance research. Based on force measurement capability challenges, strategic investments in research tasks are designed to meet the experimental requirements of current and future aerospace research programs and projects. This paper highlights recent force measurement investments in several areas, including recapitalizing the strain-gage balance inventory, developing balance best practices, improving calibration and facility capabilities, and researching potential technologies to advance balance capabilities.

  7. Personality Assessment: A Competency-Capability Perspective.

    Science.gov (United States)

    Kaslow, Nadine J; Finklea, J Tyler; Chan, Ginny

    2018-01-01

    This article begins by reviewing the proficiency of personality assessment in the context of the competencies movement, which has dominated health service psychology in recent years. It examines the value of including a capability framework for advancing this proficiency and enhancing the quality of personality assessments, including approaches such as Therapeutic Assessment (Finn & Tonsager, 1997) that incorporate a personality assessment component. This hybrid competency-capability framework is used to set the stage for the conduct of personality assessments in a variety of contexts and for the optimal training of personality assessment. Future directions are offered in terms of ways psychologists can strengthen their social contract with the public and offer a broader array of personality assessments in more diverse contexts and by individuals who are both competent and capable.

  8. Trends in Human-Computer Interaction to Support Future Intelligence Analysis Capabilities

    Science.gov (United States)

    2011-06-01

    Only text fragments are available for this record: they reference Oblong Industries Inc.'s camera-based gesture interaction system (Oblong, 2011), which also offers a management capability (Figure 4), and list display technologies surveyed in the report, including EyeTap, Lumus Eyewear (LOE), FogScreen, the HP LiM PC, Microvision PEK and SHOWWX pico projectors, head-mounted displays, and holographic screens.

  9. Conceptualizing innovation capabilities: A contingency perspective

    Directory of Open Access Journals (Sweden)

    Tor Helge Aas

    2017-01-01

    Full Text Available Empirical research has confirmed that a positive relationship exists between the implementation of innovation activities and the future performance of organizations. Firms utilize resources and capabilities to develop innovations in the form of new products, services or processes. Some firms prove to be better at reproducing innovation success than others, and the capacity to do so is referred to as innovation capability. However, the term innovation capability is ambiguously treated in extant literature. There are several different definitions of the concept and the distinction between innovation capabilities and other types of capabilities, such as dynamic capabilities, is neither explicitly stated, nor is the relationship between the concept and other resource- and capability-based concepts within strategy theory established. Although innovation is increasingly identified as crucial for a firm’s sustainable competitiveness in contemporary volatile and complex markets, the strategy-innovation link is underdeveloped in extant research. To overcome this challenge, this paper raises the following research question: What type of innovation capabilities are required to innovate successfully? Due to the status of the extant research, we chose a conceptual research design to answer our research question and the paper contributes with a conceptual framework to discuss what innovation capabilities firms need to reproduce innovation success. Based on careful examination of current literature on innovation capability specifically, and the strategy-innovation link in general, we suggest that innovation capability must be viewed along two dimensions – innovation novelty and market characteristics. This framework enables the identification of four different contexts for innovation capabilities in a two-by-two matrix. We discuss the types of innovation capabilities necessary within the four different contexts. This novel framework contributes to the

  10. Technological capability at the Brazilian official pharmaceutical laboratories

    Directory of Open Access Journals (Sweden)

    José Vitor Bomtempo Martins

    2008-10-01

    Full Text Available This paper studies technological capability in the Brazilian Official Pharmaceutical Laboratories [OPL]. Analysis of technological capability could contribute to organizational strategies and governmental actions aimed at improving the OPL's basic tasks as well as incorporating new ones, particularly concerning innovation management. Inspired by Figueiredo (2000, 2003a, 2003b) and Figueiredo and Ariffin (2003), a framework was drawn up and adapted to the characteristics of the pharmaceutical industry and to current sanitary and health legislation. The framework allows different dimensions of technological capability to be mapped (installations, processes, products, equipment, organizational capability and knowledge management), together with the level attained by the OPL (ordinary or innovating capability). The OPL show good development of ordinary capabilities, particularly in products and processes. Concerning the other dimensions, the OPL are quite diverse. In general, innovating capabilities are not much developed. In the short term, a dispersion of capability-building efforts was identified. Considering their present level and their absorption efforts, good perspectives can be found in installations, processes and organizational capability. A lower level of effort in products and knowledge management could undermine these capabilities in the future.

  11. A Summary of Actinide Enrichment Technologies and Capability Gaps

    Energy Technology Data Exchange (ETDEWEB)

    Patton, Bradley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robinson, Sharon M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    The evaluation performed in this study indicates that a new program is needed to efficiently provide a national actinide radioisotope enrichment capability to produce milligram-to-gram quantities of unique materials for user communities. This program should leverage past actinide enrichment, the recent advances in stable isotope enrichment, and assessments of the future requirements to cost effectively develop this capability while establishing an experience base for a new generation of researchers in this vital area. Preliminary evaluations indicate that an electromagnetic isotope separation (EMIS) device would have the capability to meet the future needs of the user community for enriched actinides. The EMIS technology could be potentially coupled with other enrichment technologies, such as irradiation, as pre-enrichment and/or post-enrichment systems to increase the throughput, reduce losses of material, and/or reduce operational costs of the base EMIS system. Past actinide enrichment experience and advances in the EMIS technology applied in stable isotope separations should be leveraged with this new evaluation information to assist in the establishment of a domestic actinide radioisotope enrichment capability.

  12. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  13. Evaluation of artillery equipment maintenance support capability based on grey clustering

    Science.gov (United States)

    Zhai, Mei-jie; Gao, Peng

    2017-12-01

    In this paper, the theory and methods of evaluating equipment maintenance support capability in China and abroad are studied from the point of view of the combat tasks of artillery troops and their strategic importance in future military struggle. The paper establishes a framework for the evaluation index system of the equipment maintenance support capability of artillery units, applies the grey clustering method to that evaluation, and finally evaluates the equipment maintenance and support capability of an artillery brigade as an example and analyzes the evaluation results. The paper identifies the outstanding problems in the maintenance and support of military equipment and puts forward some constructive suggestions, in order to improve the status of military equipment maintenance and support and raise the level of future equipment maintenance.
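
    A hedged sketch of a grey fixed-weight clustering evaluation of the general kind the paper applies is given below: indicator scores pass through triangular whitenization functions for three grey classes, are weighted, and summed, and the unit is assigned to the class with the largest clustering coefficient. The scores, weights, and class centres are illustrative assumptions, not the paper's index system or data.

```python
# Hedged sketch of grey fixed-weight clustering for a capability evaluation.
import numpy as np

def triangular(x, left, centre, right):
    """Triangular whitenization weight function peaking at `centre` on (left, right)."""
    if x <= left or x >= right:
        return 0.0
    if x <= centre:
        return (x - left) / (centre - left)
    return (right - x) / (right - centre)

classes = {                                   # grey classes on a 0-100 scoring scale (assumed)
    "weak":   lambda x: triangular(x, -1, 30, 60),
    "medium": lambda x: triangular(x, 30, 60, 85),
    "strong": lambda x: triangular(x, 60, 85, 101),
}

scores = np.array([72.0, 58.0, 81.0, 66.0])   # indicator scores for one unit (made up)
weights = np.array([0.3, 0.2, 0.3, 0.2])      # indicator weights (made up, sum to 1)

sigma = {}
for name, f in classes.items():
    memberships = np.array([f(x) for x in scores])
    sigma[name] = float(np.sum(memberships * weights))   # grey clustering coefficient

best = max(sigma, key=sigma.get)
print({k: round(v, 3) for k, v in sigma.items()}, "->", best)
```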

  14. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  15. Capable design or designing capabilities? An exploration of service design as an emerging organizational capability in Telenor – Martinkenaite

    Directory of Open Access Journals (Sweden)

    Ieva Martinkenaite

    2017-01-01

    Full Text Available This empirical paper examines a process, starting with the managerial decision to make service design an organizational capability, and follows it as it unfolds over time within one organization. Service design has become an established business practice of how firms create new products and services to promote differentiation in an increasingly uncertain business landscape. Implicit in the literature on service design are assumptions about strategic implications of adopting the prescribed innovation methods and tools. However, little is known about how service design evolves into an organizational capability enabling firms to transform their existing businesses and sustain competitiveness. Through a longitudinal, exploratory case study of service design practices in one of the world’s largest telecommunications companies, we explicate mechanisms through which service design evolves into an organizational capability by exploring the research question: what are the mechanisms through which service design develops into an organizational capability? Our study reveals the effect of an initial introduction of service design tools, identification of boundary-spanning actors and co-alignment of dedicated resources between internal functions, as well as through co-creation with customers. Over time, these activities lead to the adoption of service design practices, and subsequently these practices spark incremental learning throughout the organization, alter managerial decisions and influence multiple paths for the development of new capabilities. Reporting on this process, we are able to describe how service design practices were disseminated and institutionalized within the organization we observed. This study thus contributes by informing how service design can evolve into an organizational capability, as well as by bridging the emerging literature on service design and design thinking with established strategy theory. Further research will have to

  16. Defining Baconian Probability for Use in Assurance Argumentation

    Science.gov (United States)

    Graydon, Patrick J.

    2016-01-01

    The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated, which they call Baconian probability, and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.

  17. The Everett-Wheeler interpretation and the open future

    International Nuclear Information System (INIS)

    Sudbery, Anthony

    2011-01-01

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  18. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the class with the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the pollutant prediction capacity of the developed probability-based DRASTIC model, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, which indicates anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and of characterizing parameter uncertainty via the probability estimation processes.
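
    A hedged sketch of the expected-value style of rating used in a probability-based vulnerability index follows: each parameter's rating is uncertain, so the rating entering the weighted sum is its expectation over the estimated class probabilities (e.g., from indicator kriging). The classic seven-parameter DRASTIC weights are used only for illustration, and the ratings and probabilities are assumptions, not values from the study.

```python
# Hedged sketch: expected-value rating in a DRASTIC-style weighted vulnerability index.
import numpy as np

weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}   # classic DRASTIC weights

# For each parameter at one grid cell: candidate class ratings and their probabilities (made up).
ratings = {
    "D": ([10, 7, 5], [0.6, 0.3, 0.1]),
    "R": ([8, 6],     [0.7, 0.3]),
    "A": ([8, 6, 4],  [0.2, 0.5, 0.3]),
    "S": ([9, 6],     [0.4, 0.6]),
    "T": ([10, 9],    [0.9, 0.1]),
    "I": ([8, 5],     [0.5, 0.5]),
    "C": ([6, 4, 2],  [0.3, 0.4, 0.3]),
}

expected_index = sum(w * np.dot(*ratings[p]) for p, w in weights.items())
print(f"expected vulnerability index at this cell = {expected_index:.1f}")
```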

  19. Investments by NASA to build planetary protection capability

    Science.gov (United States)

    Buxbaum, Karen; Conley, Catharine; Lin, Ying; Hayati, Samad

    NASA continues to invest in capabilities that will enable or enhance planetary protection planning and implementation for future missions. These investments are critical to the Mars Exploration Program and will be increasingly important as missions are planned for exploration of the outer planets and their icy moons. Since the last COSPAR Congress, there has been an opportunity to respond to the advice of NRC-PREVCOM and the analysis of the MEPAG Special Regions Science Analysis Group. This stimulated research into such things as expanded bioburden reduction options, modern molecular assays and genetic inventory capability, and approaches to understand or avoid recontamination of spacecraft parts and samples. Within NASA, a portfolio of PP research efforts has been supported through the NASA Office of Planetary Protection, the Mars Technology Program, and the Mars Program Office. The investment strategy focuses on technology investments designed to enable future missions and reduce their costs. In this presentation we will provide an update on research and development supported by NASA to enhance planetary protection capability. Copyright 2008 California Institute of Technology. Government sponsorship acknowledged.

  20. Accelerator and Electrodynamics Capability Review

    International Nuclear Information System (INIS)

    Jones, Kevin W.

    2010-01-01

    Los Alamos National Laboratory (LANL) uses capability reviews to assess the science, technology and engineering (STE) quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). Laboratory Management will use this report for STE assessment and planning. LANL has defined fifteen STE capabilities. Electrodynamics and Accelerators is one of the seven STE capabilities that LANL Management (Director, PADSTE, technical Associate Directors) has identified for review in Fiscal Year (FY) 2010. Accelerators and electrodynamics at LANL comprise a blend of large-scale facilities and innovative small-scale research with a growing focus on national security applications. This review is organized into five topical areas: (1) Free Electron Lasers; (2) Linear Accelerator Science and Technology; (3) Advanced Electromagnetics; (4) Next Generation Accelerator Concepts; and (5) National Security Accelerator Applications. The focus is on innovative technology with an emphasis on applications relevant to Laboratory mission. The role of Laboratory Directed Research and Development (LDRD) in support of accelerators/electrodynamics will be discussed. The review provides an opportunity for interaction with early career staff. Program sponsors and customers will provide their input on the value of the accelerator and electrodynamics capability to the Laboratory mission.

  1. Accelerator and electrodynamics capability review

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Kevin W [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory (LANL) uses capability reviews to assess the science, technology and engineering (STE) quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). Laboratory Management will use this report for STE assessment and planning. LANL has defined fifteen STE capabilities. Electrodynamics and Accelerators is one of the seven STE capabilities that LANL Management (Director, PADSTE, technical Associate Directors) has identified for review in Fiscal Year (FY) 2010. Accelerators and electrodynamics at LANL comprise a blend of large-scale facilities and innovative small-scale research with a growing focus on national security applications. This review is organized into five topical areas: (1) Free Electron Lasers; (2) Linear Accelerator Science and Technology; (3) Advanced Electromagnetics; (4) Next Generation Accelerator Concepts; and (5) National Security Accelerator Applications. The focus is on innovative technology with an emphasis on applications relevant to Laboratory mission. The role of Laboratory Directed Research and Development (LDRD) in support of accelerators/electrodynamics will be discussed. The review provides an opportunity for interaction with early career staff. Program sponsors and customers will provide their input on the value of the accelerator and electrodynamics capability to the Laboratory mission.

  2. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  3. First Passage Probability Estimation of Wind Turbines by Markov Chain Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2013-01-01

    Markov Chain Monte Carlo simulation has received considerable attention within the past decade as reportedly one of the most powerful techniques for the first passage probability estimation of dynamic systems. A very popular method in this direction capable of estimating probability of rare events...... of the method by modifying the conditional sampler. In this paper, applicability of the original SS is compared to the recently introduced modifications of the method on a wind turbine model. The model incorporates a PID pitch controller which aims at keeping the rotational speed of the wind turbine rotor equal...... to its nominal value. Finally Monte Carlo simulations are performed which allow assessment of the accuracy of the first passage probability estimation by the SS methods....
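
    For orientation, the sketch below estimates a first-passage probability for a toy single-degree-of-freedom oscillator under white-noise excitation by crude Monte Carlo; it only illustrates the quantity that Subset Simulation estimates far more efficiently. The system, excitation level, barrier, and sample size are illustrative assumptions, not the wind turbine model of the paper.

```python
# Crude Monte Carlo first-passage probability for a toy linear oscillator (not Subset Simulation).
import numpy as np

rng = np.random.default_rng(1)
omega, zeta = 2.0 * np.pi, 0.05            # natural frequency [rad/s] and damping ratio
dt, duration = 0.01, 60.0                  # time step and record length [s]
sigma_f, barrier = 1.0, 0.5                # forcing intensity and displacement threshold
n_samples = 2000
steps = int(duration / dt)

x = np.zeros(n_samples)
v = np.zeros(n_samples)
crossed = np.zeros(n_samples, dtype=bool)
for _ in range(steps):
    f = rng.normal(0.0, sigma_f / np.sqrt(dt), n_samples)   # discretized white-noise forcing
    a = f - 2.0 * zeta * omega * v - omega**2 * x
    v = v + a * dt                                          # semi-implicit Euler step
    x = x + v * dt
    crossed |= np.abs(x) > barrier

print(f"first-passage probability ~ {crossed.mean():.3f} (crude MC, {n_samples} samples)")
```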

  4. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    % in those with low probability. The prevalence of PE in patients with intermediate clinical probability was 41%. These results underscore the importance of incorporating the standardized reading of the electrocardiogram and of the chest radiograph into the clinical evaluation of patients with suspected PE. The interpretation of these laboratory data, however, requires experience. Future research is needed to develop standardized models, of varying degree of complexity, which may find application in different clinical settings to predict the probability of PE

  5. Materials capability review Los Alamos National Laboratory, May 3-6, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Antoinette [Los Alamos National Laboratory

    2010-01-01

    The 2010 'Capability Review' process at LANL significantly differs from the Division reviews of prior years. The Capabilities being reviewed (some 4-8 per year) are deliberately chosen to be crosscutting over the Laboratory, and therefore will include not only several experimental, theoretical and simulation disciplines, but also contributions from multiple line organizations. This approach is consistent with the new Laboratory organizational structure, focusing on agile and integrated capabilities applied to present national security missions, and also nurtured to be available for rapid application to future missions. The overall intent is that the Committee assess the quality of the science, engineering, and technology identified in the agenda, and advise the LANS Board of Governors and Laboratory management. Specifically, the Committees will: (1) Assess the quality of science, technology and engineering within the Capability in the areas defined in the agenda. Identify issues to develop or enhance the core competencies within this capability. (2) Evaluate the integration of this capability across the Laboratory organizations that are listed in the agenda in terms of joint programs, projects, proposals, and/or publications. Describe the integration of this capability in the wider scientific community using the recognition as a leader within the community, ability to set research agendas, and attraction and retention of staff. (3) Assess the quality and relevance of this capability's science, technology and engineering contributions to current and emerging Laboratory programs, including Nuclear Weapons, Threat Reduction/Homeland Security, and Energy Security. (4) Advise the Laboratory Director/Principal Associate Director for Science, Technology and Engineering on the health of the Capability including the current and future (5 year) science, technology and engineering staff needs, mix of research and development activities, program opportunities

  6. Capabilities, performance, and future possibilities of high frequency polyphase resonant converters

    International Nuclear Information System (INIS)

    Reass, W.A.; Baca, D.M.; Bradley, J.T. III; Hardek, T.W.; Kwon, S.I.; Lynch, M.T.; Rees, D.E.

    2004-01-01

    High Frequency Polyphase Resonant Power Conditioning (PRPC) techniques developed at Los Alamos National Laboratory (LANL) are now being utilized for the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source (SNS) accelerator klystron RF amplifier power systems. Three different styles of polyphase resonant converter modulators were developed for the SNS application. The various systems operate up to 140 kV, or 11 MW pulses, or up to 1.1 MW average power, all from a DC input of +/- 1.2 kV. Component improvements realized with the SNS effort coupled with new applied engineering techniques have resulted in dramatic changes in RF power conditioning topology. As an example, the high-voltage transformers are over 100 times smaller and lighter than equivalent 60 Hz versions. With resonant conversion techniques, load protective networks are not required. A shorted load de-tunes the resonance and little power transfer can occur. This provides for power conditioning systems that are inherently self-protective, with automatic fault 'ride-through' capabilities. By altering the Los Alamos design, higher power and CW power conditioning systems can be realized without further demands of the individual component voltage or current capabilities. This has led to designs that can accommodate 30 MW long pulse applications and megawatt class CW systems with high efficiencies. The same PRPC techniques can also be utilized for lower average power systems (∼250 kW). This permits the use of significantly higher frequency conversion techniques that result in extremely compact systems with short pulse (10 to 100 us) capabilities. These lower power PRPC systems may be suitable for medical Linacs and mobile RF systems. This paper will briefly review the performance achieved for the SNS accelerator and examine designs for high efficiency megawatt class CW systems and 30 MW peak power applications. The devices and designs for compact higher frequency converters utilized for short pulse

  7. Linker-dependent Junction Formation Probability in Single-Molecule Junctions

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Pil Sun; Kim, Taekyeong [Hankuk University of Foreign Studies, Yongin (Korea, Republic of)

    2015-01-15

    We compare the junction formation probabilities of single-molecule junctions with different linker molecules by using a scanning tunneling microscope-based break-junction technique. Through quantitative statistical analysis, we found that the junction formation probability varies as SH > SMe > NH2 for the benzene backbone molecule with different types of anchoring groups. These results are attributed to the different bonding forces that the linker groups form with Au atoms in the electrodes, which is consistent with previous works. Our work allows a better understanding of the contact chemistry in the metal-molecule junction for future molecular electronic devices.

  8. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    Science.gov (United States)

    Frigm, Ryan C.; Hejduk, Matthew D.; Johnson, Lauren C.; Plakalovic, Dragan

    2015-01-01

    On-orbit collision risk is becoming an increasing mission risk to all operational satellites in Earth orbit. Managing this risk can be disruptive to mission and operations, present challenges for decision-makers, and is time-consuming for all parties involved. With the planned capability improvements to detecting and tracking smaller orbital debris and capacity improvements to routinely predict on-orbit conjunctions, this mission risk will continue to grow in terms of likelihood and effort. It is a very real possibility that the future space environment will not allow collision risk management and mission operations to be conducted in the same manner as they are today. This paper presents the concept of a finite conjunction assessment, one where each discrete conjunction is not treated separately but, rather, as part of a continuous event that must be managed concurrently. The paper also introduces the Total Probability of Collision as an analogous metric for finite conjunction assessment operations and provides several options for its usage in a Concept of Operations.
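
    A minimal sketch of the aggregation idea behind a total probability of collision follows: if the individual conjunction probabilities in an assessment window can be treated as independent, the probability of at least one collision is the complement of missing every event. Independence and the per-event probabilities are assumptions for illustration, not the paper's formulation.

```python
# Minimal sketch: aggregate independent per-event collision probabilities over a window.
import math

def total_pc(event_probabilities):
    """Probability of at least one collision across a set of conjunction events."""
    return 1.0 - math.prod(1.0 - p for p in event_probabilities)

events = [1.2e-5, 4.0e-6, 7.5e-5, 2.3e-4]   # per-event collision probabilities (made up)
print(f"total probability of collision over the window = {total_pc(events):.3e}")
```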

  9. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    Science.gov (United States)

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and the seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  10. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  11. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
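
    A hedged sketch of the two numbered steps in the abstract is given below: (i) Gaussian kernel density maps built from different vent-location data sets, and (ii) a weighted linear combination into a single vent-opening probability map. The synthetic vent coordinates, default bandwidths, and weights are illustrative assumptions, not the study's elicited values.

```python
# Hedged sketch: weighted combination of Gaussian kernel density maps of vent locations.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
vents_a = rng.normal(loc=[0.0, 0.0], scale=0.8, size=(40, 2)).T   # data set A (x, y in km), made up
vents_b = rng.normal(loc=[1.0, -0.5], scale=1.2, size=(25, 2)).T  # data set B (x, y in km), made up

kde_a, kde_b = gaussian_kde(vents_a), gaussian_kde(vents_b)
w_a, w_b = 0.6, 0.4                                               # combination weights (assumed)

xx, yy = np.meshgrid(np.linspace(-3, 4, 120), np.linspace(-4, 3, 120))
grid = np.vstack([xx.ravel(), yy.ravel()])
density = (w_a * kde_a(grid) + w_b * kde_b(grid)).reshape(xx.shape)   # weights sum to 1, so still a pdf

imax = np.unravel_index(density.argmax(), density.shape)
print(f"peak of the combined map near x = {xx[imax]:.2f} km, y = {yy[imax]:.2f} km")
```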

  12. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would be able to provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface. In this presentation we also describe the stepwise process the interface uses by means of an example.

  13. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would be able to provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface. In this presentation we also describe the stepwise process the interface uses by means of an example.
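
    A hedged sketch of the kind of roll-up a risk-assessment tool performs downstream of component-level probabilities is shown below: a small fault tree of independent basic events with an AND gate for a redundant pair and an OR gate at the top. The tree layout and the basic-event probabilities are illustrative assumptions, not the interface's actual inputs.

```python
# Hedged sketch: combining independent basic-event probabilities through a small fault tree.
def p_or(*probs):
    """Probability that at least one of several independent events occurs."""
    miss = 1.0
    for p in probs:
        miss *= (1.0 - p)
    return 1.0 - miss

def p_and(*probs):
    """Probability that all of several independent events occur."""
    hit = 1.0
    for p in probs:
        hit *= p
    return hit

# Basic-event probabilities, e.g. produced component-by-component by a physics-based code (made up).
seal_leak, pump_fail_a, pump_fail_b, sensor_fail = 2e-4, 1e-3, 1e-3, 5e-4

pumps_lost = p_and(pump_fail_a, pump_fail_b)            # redundant pumps: both must fail
top_event = p_or(seal_leak, pumps_lost, sensor_fail)    # any branch fails the subsystem
print(f"top-event (subsystem failure) probability = {top_event:.3e}")
```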

  14. Production capability: ERDA methods and results

    International Nuclear Information System (INIS)

    Klemenic, J.

    1977-01-01

    Production centers are categorized into four classes, according to the relative certainty of future production. A "forward cost" basis is used to establish both the resource base and to define the acceptable production centers. The first phase of the work is called the "Could" capability. Resources are assigned to existing production centers, or new production centers are postulated based on adequate resources to support a mill for a reasonable economic life. A production schedule is developed for each center. The last step in the "Could" study is to aggregate the capital and operating costs. The final step in the Production Capability study is the rescheduling of the production from the "Could" to produce only sufficient U concentrate to meet the feed requirements of enrichment facilities operated at the announced transaction tails assay plans. The optimized production schedules are called the "Need" production capability. A separate study was also performed of industry production plans. 4 tables, 7 figs

  15. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to generalized the theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach with the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  16. The new MCNP6 depletion capability

    International Nuclear Information System (INIS)

    Fensin, M. L.; James, M. R.; Hendricks, J. S.; Goorley, J. T.

    2012-01-01

    The first MCNP based in-line Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology. (authors)

  17. The New MCNP6 Depletion Capability

    International Nuclear Information System (INIS)

    Fensin, Michael Lorne; James, Michael R.; Hendricks, John S.; Goorley, John T.

    2012-01-01

    The first MCNP based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology.

  18. Some uses of predictive probability of success in clinical drug development

    Directory of Open Access Journals (Sweden)

    Mauro Gasparini

    2013-03-01

    Full Text Available Predictive probability of success is a (subjective) Bayesian evaluation of the probability of a future successful event in a given state of information. In the context of pharmaceutical clinical drug development, successful events relate to the accrual of positive evidence on the therapy which is being developed, like demonstration of superior efficacy or ascertainment of safety. Positive evidence will usually be obtained via standard frequentist tools, according to the regulations imposed in the world of pharmaceutical development. Within a single trial, predictive probability of success can be identified with expected power, i.e. the evaluation of the success probability of the trial. Success means, for example, obtaining a significant result of a standard superiority test. Across trials, predictive probability of success can be the probability of a successful completion of an entire part of clinical development, for example a successful phase III development in the presence of phase II data. Calculations of predictive probability of success in the presence of normal data with known variance will be illustrated, both for within-trial and across-trial predictions.
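
    A hedged sketch of a within-trial predictive probability of success (expected power) for normal data with known variance follows: the frequentist success criterion is a one-sided z-test, and the treatment effect carries a normal prior summarizing earlier evidence. The prior, outcome standard deviation, sample size, and alpha level are illustrative assumptions, not the paper's worked examples.

```python
# Hedged sketch: expected power averaged over a normal prior on the treatment effect.
from scipy.stats import norm

def predictive_pos(prior_mean, prior_sd, sigma, n, alpha=0.025):
    """P(one-sided z-test at level alpha is significant), averaging power over the prior."""
    se = sigma / n ** 0.5                           # standard error of the trial estimate
    critical = norm.ppf(1.0 - alpha) * se           # estimate needed to reach significance
    marginal_sd = (prior_sd ** 2 + se ** 2) ** 0.5  # prior-predictive SD of the estimate
    return norm.sf((critical - prior_mean) / marginal_sd)

# Example: earlier data suggest an effect of 0.3 (uncertainty SD 0.15); outcome SD 1.0; n = 200.
print(f"predictive probability of success = {predictive_pos(0.3, 0.15, 1.0, 200):.2f}")
```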

  19. Agriculture and Energy 2030. How will farming adapt to future energy challenges?

    International Nuclear Information System (INIS)

    Portet, Fabienne; Herault, Bruno

    2010-04-01

    Energy is a major element in the competitiveness and sustainability of the French farming sector. It stands at the heart of a new model for productive and ecologically responsible agriculture. In this regard, it has been a central focus for various programmes and action plans conducted by the Ministry of Food, Agriculture and Fisheries: among others, the Energy Performance Plan (PPE) launched in 2009. The Agriculture and Energy 2030 exercise is part of this process and is directed at highlighting opportunities and risks for the agricultural sector where energy is concerned over the next twenty years. The present note describes the main links between agricultural activities and energy-related issues, in addition to the approach to strategic foresight that has been adopted. Strategic foresight is neither totally scientific nor pure imagination; it starts out from past and present facts in order to anticipate probable futures and prepare the way for decisions capable of facilitating or preventing the advent of those futures. (authors)

  20. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
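
    A hedged sketch of the task the program addresses is given below, done here by plain Monte Carlo rather than COVAL's numerical transformation of distributions: propagate the distributions of a load and a resistance through a limit-state function and read off a failure probability. The distributions and the limit state are illustrative assumptions.

```python
# Hedged sketch: Monte Carlo propagation of probability distributions through a function.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
load = rng.lognormal(mean=2.0, sigma=0.25, size=n)      # random applied load (made up)
resistance = rng.normal(loc=15.0, scale=2.0, size=n)    # random structural resistance (made up)
margin = resistance - load                              # function of the random variables

print(f"P(margin < 0) ~ {np.mean(margin < 0.0):.4f}")
print("5th / 50th / 95th percentiles of margin:", np.percentile(margin, [5, 50, 95]).round(2))
```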

  1. Stochastic and simulation models of maritime intercept operations capabilities

    OpenAIRE

    Sato, Hiroyuki

    2005-01-01

    The research formulates and exercises stochastic and simulation models to assess the Maritime Intercept Operations (MIO) capabilities. The models focus on the surveillance operations of the Maritime Patrol Aircraft (MPA). The analysis using the models estimates the probability with which a terrorist vessel (Red) is detected, correctly classified, and escorted for intensive investigation and neutralization before it leaves an area of interest (AOI). The difficulty of obtaining adequate int...

  2. MTR fuel plate qualification capabilities at SCK-CEN

    International Nuclear Information System (INIS)

    Koonen, E.; Jacquet, P.

    2002-01-01

    In order to enhance the capabilities of BR2 in the field of MTR fuel plate testing, a dedicated irradiation device has been designed. In its basic version this device allows the irradiation of 3 fuel plates. The central fuel plate may be replaced by a dummy plate or a plate carrying dosimeters. A first FUTURE device has been built. A benchmark irradiation has been executed with standard BR2 fuel plates in order to qualify this device. Detailed neutronic calculations were performed and the results compared to the results of the post-irradiation examinations of the plates. These comparisons demonstrate the capability to conduct a fuel plate irradiation program under requested and well-known irradiation conditions. Further improvements are presently being designed in order to extend the ranges of heat flux and surface temperature of the fuel plates that can be handled with the FUTURE device. (author)

  3. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  4. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods in considering this type of uncertainty are usually computationally intensive and not practical to solve the problem for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user defined structures. A case study illustrates an application in test case prioritization

  5. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

    Explores computer-intensive probability and statistics for ecosystem management decision making. Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises, making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...

  6. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    Science.gov (United States)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems, enabling elastic bandwidth transmission, and it represents a likely direction for the future development of optical networking. In OFDM-based optical networks, analysis of the blocking probability is of great significance for network assessment. Current research on WDM networks is largely based on fixed bandwidth; in order to accommodate future services and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Building on existing theory and algorithms, we apply mathematical analysis and theoretical derivation to study the blocking probability of variable-bandwidth optical networks and then build a model for the blocking probability.
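
    For reference, the sketch below computes the classical fixed-bandwidth blocking probability (the Erlang B formula), which is the usual starting point for such analyses; the variable-bandwidth OFDM case studied in the paper requires a more general model. The offered load and channel count are illustrative assumptions.

```python
# Hedged sketch: Erlang B blocking probability via the numerically stable recursion.
def erlang_b(offered_load, channels):
    """Blocking probability for Poisson arrivals; `offered_load` in Erlangs, integer `channels`."""
    b = 1.0
    for m in range(1, channels + 1):
        b = offered_load * b / (m + offered_load * b)
    return b

print(f"blocking probability = {erlang_b(offered_load=40.0, channels=50):.4f}")
```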

  7. Changes in the probability of co-occurring extreme climate events

    Science.gov (United States)

    Diffenbaugh, N. S.

    2017-12-01

    Extreme climate events such as floods, droughts, heatwaves, and severe storms exert acute stresses on natural and human systems. When multiple extreme events co-occur, either in space or time, the impacts can be substantially compounded. A diverse set of human interests - including supply chains, agricultural commodities markets, reinsurance, and deployment of humanitarian aid - have historically relied on the rarity of extreme events to provide a geographic hedge against the compounded impacts of co-occuring extremes. However, changes in the frequency of extreme events in recent decades imply that the probability of co-occuring extremes is also changing, and is likely to continue to change in the future in response to additional global warming. This presentation will review the evidence for historical changes in extreme climate events and the response of extreme events to continued global warming, and will provide some perspective on methods for quantifying changes in the probability of co-occurring extremes in the past and future.

  8. Elastic K-means using posterior probability.

    Science.gov (United States)

    Zheng, Aihua; Jiang, Bo; Li, Yan; Zhang, Xuehan; Ding, Chris

    2017-01-01

    The widely used K-means clustering is a hard clustering algorithm. Here we propose an Elastic K-means clustering model (EKM) using posterior probability with a soft capability, in which each data point can belong to multiple clusters fractionally, and we show the benefit of the proposed Elastic K-means. Furthermore, in many applications, besides vector attribute information, pairwise relations (graph information) are also available. We therefore integrate EKM with Normalized Cut graph clustering into a single clustering formulation. Finally, we provide several useful matrix inequalities which are helpful for matrix formulations of learning models. Based on these results, we prove the correctness and convergence of the EKM algorithms. Experimental results on six benchmark datasets demonstrate the effectiveness of the proposed EKM and its integrated model.
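
    A hedged sketch of fractional (soft) cluster assignment via posterior-style responsibilities, the general idea behind relaxing hard K-means, is given below. This is a generic soft K-means step with a stiffness parameter beta, not the paper's exact EKM objective, and the data are made up.

```python
# Hedged sketch: generic soft K-means with posterior-like fractional memberships.
import numpy as np

def soft_kmeans(X, k, beta=2.0, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # squared distances (n, k)
        resp = np.exp(-beta * d2)
        resp /= resp.sum(axis=1, keepdims=True)                     # posterior-like responsibilities
        centers = (resp.T @ X) / resp.sum(axis=0)[:, None]          # responsibility-weighted means
    return centers, resp

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in ((0, 0), (2, 2), (0, 3))])
centers, resp = soft_kmeans(X, k=3)
print("centers:\n", centers.round(2))
print("fractional memberships of the first point:", resp[0].round(3))
```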

  9. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H 1 ,H 2 ), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H 1 ),P(H 2 ), to the subspaces H 1 , H 2 . As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  10. PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  11. More than words: Adults learn probabilities over categories and relationships between them.

    Science.gov (United States)

    Hudson Kam, Carla L

    2009-04-01

    This study examines whether human learners can acquire statistics over abstract categories and their relationships to each other. Adult learners were exposed to miniature artificial languages containing variation in the ordering of the Subject, Object, and Verb constituents. Different orders (e.g. SOV, VSO) occurred in the input with different frequencies, but the occurrence of one order versus another was not predictable. Importantly, the language was constructed such that participants could only match the overall input probabilities if they were tracking statistics over abstract categories, not over individual words. At test, participants reproduced the probabilities present in the input with a high degree of accuracy. Closer examination revealed that learners were matching the probabilities associated with individual verbs rather than the category as a whole. However, individual nouns had no impact on word orders produced. Thus, participants learned the probabilities of a particular ordering of the abstract grammatical categories Subject and Object associated with each verb. Results suggest that statistical learning mechanisms are capable of tracking relationships between abstract linguistic categories in addition to individual items.

  12. Financial Capability:New Evidence for Ireland

    OpenAIRE

    Keeney, Mary J.; O’Donnell, Nuala

    2009-01-01

    Recent increases in financial innovation, particularly in the Anglo-Saxon banking culture, have seen considerable growth in the number of financial products available to the general public. Simultaneously, many workers are increasingly assuming responsibility for planning for their future pensions. This, allied to increased life expectancy, necessitates a greater degree of financial capability amongst the general public. This study has empirically examined this issue for the first time in an ...

  13. Physics, Physicists and Revolutionary Capabilities for the Intelligence Community

    Science.gov (United States)

    Porter, Lisa

    2009-05-01

    Over the past several decades, physicists have made seminal contributions to technological capabilities that have enabled the U.S. intelligence community to provide unexpected and unparalleled information to our nation's decision makers and help dispel the cloud of uncertainty they face in dealing with crises and challenges around the world. As we look to the future, we recognize that the ever-quickening pace of changes in the world and the threats we must confront demand continued innovation and improvement in the capabilities needed to provide the information on which our leaders depend. This talk will focus on some of the major technological challenges that the intelligence community faces in the coming years, and the many ways that physicists can help to overcome those challenges. The potential impact of physicists on the future capabilities of the US intelligence community is huge. In addition to the more obvious and direct impact through research in areas ranging from novel sensors to quantum information science, the unique approach physicists bring to a problem can also have an indirect but important effect by influencing how challenges in areas ranging from cybersecurity to advanced analytics are approached and solved. Several examples will be given.

  14. Probability distribution of long-run indiscriminate felling of trees in ...

    African Journals Online (AJOL)

    The study was undertaken to determine the probability distribution of Long-run indiscriminate felling of trees in northern senatorial district of Adamawa State. Specifically, the study focused on examining the future direction of indiscriminate felling of trees as well as its equilibrium distribution. A multi-stage and simple random ...

  15. Future fire probability modeling with climate change data and physical chemistry

    Science.gov (United States)

    Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey

    2014-01-01

    Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...

  16. A Stochastic Model for the Landing Dispersion of Hazard Detection and Avoidance Capable Flight Systems

    Science.gov (United States)

    Witte, L.

    2014-06-01

    To support landing site assessments for HDA-capable flight systems and to facilitate trade studies between the potential HDA architectures versus the yielded probability of safe landing, a stochastic landing dispersion model has been developed.

  17. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  18. Monitoring and Evaluation of Cultivated Land Irrigation Guarantee Capability with Remote Sensing

    Science.gov (United States)

    Zhang, C., Sr.; Huang, J.; Li, L.; Wang, H.; Zhu, D.

    2015-12-01

    Abstract: Cultivated land quality grade monitoring and evaluation is an important way to improve land production capability and ensure national food safety. Irrigation guarantee capability is one important aspect of cultivated land quality monitoring and evaluation. In current field-survey-based monitoring, determining the irrigation rate requires a long investigation process and substantial human resources. This study chose the Beijing-Tianjin-Hebei region as the study area, taking 1 km × 1 km grid units of cultivated land under a winter wheat-summer maize double cropping system as the study object. A new irrigation capacity evaluation index, based on the ratio of the annual irrigation requirement retrieved from MODIS data to the actual quantity of irrigation, is proposed. Using several years of monitoring results, the irrigation guarantee capability of the study area was evaluated comprehensively. The trend of the irrigation guarantee capability index (IGCI) was generally consistent with the agricultural drought disaster area reported in the rural statistical yearbooks of the Beijing-Tianjin-Hebei area. The average IGCI value, the probability of an irrigation-guaranteed year, and a weighted average controlled by the irrigation demand index were used and compared in this paper. The experimental results indicate that the classification obtained with the present method is close to that derived from irrigation probability in the 2012 gradation of agricultural land quality, with 73% of units overlapping. The monitoring and evaluation method for cultivated land IGCI proposed in this paper has potential for cultivated land quality level monitoring and evaluation in China. Key words: remote sensing, evapotranspiration, MODIS, cultivated land quality, irrigation guarantee capability

  19. Under the veil of neoliberalism: inequality, health, and capabilities

    OpenAIRE

    Kemp, Eagan

    2008-01-01

    The relationship between income inequality and health has received substantial attention in the fields of medical sociology and public health and continues to be debated. In Chile, previous findings indicate that there is an income inequality effect; respondents who live in areas with high inequality experience a greater probability of poor self-reported health. This study examines the Wilkinson income inequality hypothesis in a new way by using it in conjunction with Sen’s capability approac...

  20. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the time when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lays an extensive time domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
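
    The full method rests on a detailed time-domain micro-simulation driven by AIS data; the toy Monte Carlo sketch below only illustrates the underlying idea of counting collision candidates from randomly sampled traffic on two crossing routes. All traffic intensities, sailing times and the encounter window are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
N_DAYS = 1000                      # simulated days (hypothetical)
LAMBDA_A, LAMBDA_B = 12, 8         # vessels per day on two crossing routes
CROSS_A, CROSS_B = 2.0, 3.0        # sailing hours to reach the crossing point
WINDOW = 0.05                      # hours during which two ships share the crossing area

candidates = 0
for _ in range(N_DAYS):
    # sample departure times (hours) for each route over one day
    t_a = rng.uniform(0, 24, rng.poisson(LAMBDA_A)) + CROSS_A
    t_b = rng.uniform(0, 24, rng.poisson(LAMBDA_B)) + CROSS_B
    # a collision candidate: two vessels occupy the crossing within WINDOW hours
    for ta in t_a:
        candidates += np.sum(np.abs(t_b - ta) < WINDOW)

print(f"expected collision candidates per day: {candidates / N_DAYS:.3f}")
```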

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  2. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    Science.gov (United States)

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
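
    A minimal sketch of the general idea, under strong simplifying assumptions (a Gaussian density whose mean drifts linearly in time, fitted by maximum likelihood and then extrapolated); the paper's actual parametric model and data are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# synthetic series whose mean drifts linearly in time (stand-in data)
t = np.linspace(0.0, 1.0, 200)
x = 2.0 - 1.5 * t + 0.3 * np.random.randn(t.size)

def neg_log_lik(params):
    a, b, log_sigma = params
    mu = a + b * t                       # nonstationary (time-dependent) mean
    return -norm.logpdf(x, loc=mu, scale=np.exp(log_sigma)).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0])
a, b, log_sigma = fit.x
t_future = 1.5                           # extrapolate beyond the observed window
print("forecast mean:", a + b * t_future, "sigma:", np.exp(log_sigma))
```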

  3. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  4. Structural Capability of an Organization toward Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    The scholars in the field of strategic management have developed two major approaches for attaining competitive advantage: one based on environmental opportunities, and another based on the internal capabilities of an organization. Some investigations over the last two decades have indicated that advantages relying on the internal capabilities of organizations may determine the competitive position of organizations better than environmental opportunities do. Characteristics of firms show that one of the internal capabilities that leads organizations to the strongest competitive advantage is the innovation capability. The innovation capability is associated with other organizational capabilities, and many organizations have focused on the need to identify innovation capabilities. This research focuses on recognition of the structural aspect...

  5. Nuclear Futures Analysis and Scenario Building

    International Nuclear Information System (INIS)

    Arthur, E.D.; Beller, D.; Canavan, G.H.; Krakowski, R.A.; Peterson, P.; Wagner, R.L.

    1999-01-01

    This LDRD project created and used advanced analysis capabilities to postulate scenarios and identify issues, externalities, and technologies associated with future "things nuclear". "Things nuclear" include areas pertaining to nuclear weapons, nuclear materials, and nuclear energy, examined in the context of future domestic and international environments. Analysis tools development included adaptation and expansion of energy, environmental, and economics (E3) models to incorporate a robust description of the nuclear fuel cycle (both current and future technology pathways), creation of a beginning proliferation risk model (coupled to the E3 model), and extension of traditional first strike stability models to conditions expected to exist in the future (smaller force sizes, multipolar engagement environments, inclusion of actual and latent nuclear weapons capability). Accomplishments include scenario development for regional and global nuclear energy, the creation of a beginning nuclear architecture designed to improve the proliferation resistance and environmental performance of the nuclear fuel cycle, and numerous results for future nuclear weapons scenarios.

  6. Multiregion, multigroup collision probability method with white boundary condition for light water reactor thermalization calculations

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2005-01-01

    A multiregion, multigroup collision probability method with white boundary condition is developed for thermalization calculations of light water moderated reactors. Hydrogen scatterings are treated by Nelkin's kernel while scatterings from other nuclei are assumed to obey the free-gas scattering kernel. The isotropic return (white) boundary condition is applied directly by using the appropriate collision probabilities. Comparisons with alternate numerical methods show the validity of the present formulation. Comparisons with some experimental results indicate that the present formulation is capable of calculating disadvantage factors which are closer to the experimental results than alternative methods

  7. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  8. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    useful in defining a roadmap for what future capability needs to look like.

  9. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  10. Human Capability, Mild Perfectionism and Thickened Educational Praxis

    Science.gov (United States)

    Walker, Melanie

    2008-01-01

    This paper argues for a mild perfectionism in applying Amartya Sen's capability approach for an education transformative of student agency and well-being. Key to the paper is the significance of education as a process of being and becoming in the future, and education's fundamental objective of a positively changed human being. The capability…

  11. Managing Capabilities for Supply Chain Resilience Through it Integration

    Directory of Open Access Journals (Sweden)

    Gružauskas Valentas

    2017-08-01

    Full Text Available The trend toward e-commerce, a population estimated to reach 11 billion by 2050, and an increase in urbanization to 70 % require a re-thinking of the current supply chain. These trends have changed the distribution process: delivery distances are decreasing, product variety is increasing, and more products are being sold in smaller quantities. Therefore, the concept of supply chain resilience has gained more recognition in recent years. The scientific literature analysis conducted by the authors indicates several capabilities that influence supply chain resilience. Collaboration, flexibility, redundancy and integration are the capabilities most influential for supply chain resilience. However, the authors identify that the combination of these capabilities for supply chain resilience is under-researched. The authors indicate that by combining these capabilities with the upcoming technologies of Industry 4.0, supply chain resilience can be achieved. In the future, the authors plan to conduct further research to identify the influence of these capabilities on supply chain resilience, to quantify supply chain resilience, and to provide further practices for using the Industry 4.0 concept for supply chain resilience.

  12. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  13. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  14. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
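
    The paper points to existing R packages; purely as an illustration of the same idea in Python, the sketch below uses scikit-learn's random forest and its predict_proba output as individual probability estimates on a public data set (the data set choice and hyperparameters are assumptions, not those of the paper).

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# averaged tree votes approximate P(y = 1 | x) for each individual
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X_tr, y_tr)

p_hat = rf.predict_proba(X_te)[:, 1]   # individual probability estimates
print(p_hat[:5])
```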

  15. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  16. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management
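
    Observation (i) can be seen in a two-line argument. Assuming independent demands with per-demand failure probability p (pfd), the survival probability over n demands is (1-p)^n, and convexity gives the bound sketched below.

```latex
% Sketch of observation (i), assuming independent demands:
% survival over n demands with per-demand failure probability p is R_n(p) = (1-p)^n.
% Since (1-p)^n is convex in p, Jensen's inequality gives
\[
  \mathbb{E}\bigl[(1-p)^{n}\bigr] \;\ge\; \bigl(1-\mathbb{E}[p]\bigr)^{n},
\]
% so a prediction computed from the expected pfd never overstates reliability:
% it is a pessimistic (safe-side) bound on the true predictive survival probability.
```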

  17. NASA DOE POD NDE Capabilities Data Book

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
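
    As a small numeric aside on the 90/95 criterion mentioned above (this is the classical fixed-sample "29 of 29" argument, not the DOEPOD sequential procedure itself):

```python
# If the true POD were only 0.90, the chance of n straight detections is 0.9**n.
# The smallest n with 0.9**n <= 0.05 gives the classic "29 of 29" demonstration:
# n consecutive hits with no miss supports POD > 0.90 at 95% confidence.
n = 1
while 0.9 ** n > 0.05:
    n += 1
print(n, 0.9 ** n)   # -> 29, ~0.047
```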

  18. Framing of decision problem in short and long term and probability perception

    Directory of Open Access Journals (Sweden)

    Anna Wielicka-Regulska

    2010-01-01

    Full Text Available Consumer preferences depend on problem framing and time perspective. For the experiment's participants, avoiding losses was judged less probable in a distant time perspective than in the near term. Conversely, achieving gains in the near future was judged less probable than in the remote future. One may therefore expect different reactions when a problem is presented in terms of gains rather than in terms of losses. This can be exploited in the promotion of highly desired social behaviours such as saving for retirement, keeping a good diet, investing in learning, and other advantageous activities that consumers usually put off.

  19. Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.

    Science.gov (United States)

    Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay

    2018-04-17

    In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possible ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. Those methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be most efficient, and data from on-going subjects can be utilized to improve efficiency. On the other hand, leveraging information from on-going subjects could allow an interim analysis to be potentially conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including Bayesian predictive probability, predictive power, and conditional power and also give closed-form solutions for predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than that using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
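
    For a flavour of the Bayesian predictive probability being discussed, the sketch below computes it for the simpler case of a binary endpoint with a beta-binomial posterior predictive distribution; the longitudinal closed forms derived in the paper are not reproduced, and all trial numbers are hypothetical.

```python
from scipy.stats import betabinom

# Interim data (hypothetical): 12 responders in 20 patients; 40 still to enrol.
a0, b0 = 1, 1          # Beta(1, 1) prior on the response rate
x, n = 12, 20          # interim responders / patients observed so far
m = 40                 # remaining patients
need_total = 33        # trial "succeeds" if >= 33 responders out of 60 (assumed)

# Posterior is Beta(a0 + x, b0 + n - x); future responders are beta-binomial.
a_post, b_post = a0 + x, b0 + (n - x)
pp = 1.0 - betabinom.cdf(need_total - x - 1, m, a_post, b_post)
print(f"Bayesian predictive probability of success: {pp:.3f}")
```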

  20. Materials Capability Review Los Alamos National Laboratory April 29-May 2, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Antoinette J [Los Alamos National Laboratory

    2012-04-20

    Los Alamos National Laboratory (LANL) uses Capability Reviews to assess the quality and institutional integration of science, technology and engineering (STE) and to advise Laboratory Management on the current and future health of LANL STE. The capabilities are deliberately chosen to be crosscutting across the Laboratory and therefore include experimental, theoretical and simulation disciplines from multiple line organizations. Capability Reviews are designed to provide a more holistic view of STE quality, integration to achieve mission requirements, and mission relevance. The scope of these capabilities necessitates significant overlap in the technical areas covered by capability reviews (e.g., materials research and weapons science and engineering). In addition, LANL staff may be reviewed in different capability reviews because of their varied assignments and expertise. The principal product of the Capability Review is the report, which includes the review committee's assessments and recommendations for STE.

  1. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  2. ORGANISATIONAL CAPABILITIES, COMPETITIVE ADVANTAGE AND PERFORMANCE IN SUPPORTING INDUSTRIES IN VIETNAM

    Directory of Open Access Journals (Sweden)

    Nham Phong Tuan

    2010-01-01

    Full Text Available This paper focuses on applying the resource-based view (RBV) of firms to explain performance in supporting industries in Vietnam. Specifically, we based our research on the comprehensive framework of the RBV and reviewed previous empirical studies before deciding to adopt a dynamic capabilities approach to test the relationships among organisational capabilities, competitive advantage and performance. A multivariate analysis of survey responses from 102 firms belonging to supporting industries in Vietnam indicates that organisational capabilities are related to competitive advantage, that competitive advantage is related to performance, and that competitive advantage mediates the relationship between organisational capabilities and performance. These findings have considerable implications for academics as well as practitioners. Finally, this study also provides directions for future research.

  3. Transition probabilities of health states for workers in Malaysia using a Markov chain model

    Science.gov (United States)

    Samsuddin, Shamshimah; Ismail, Noriszura

    2017-04-01

    The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia, using a Markov chain model. Our study uses four states of health (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
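
    A minimal sketch of how such a transition matrix is used to project future health states; the matrix entries below are illustrative placeholders, not the estimates reported in the study.

```python
import numpy as np

# Hypothetical one-year transition matrix over the four states used in the
# study: active, temporary disability, permanent disability, death.
P = np.array([
    [0.96, 0.02, 0.01, 0.01],   # active
    [0.50, 0.40, 0.07, 0.03],   # temporary disability
    [0.00, 0.00, 0.97, 0.03],   # permanent disability
    [0.00, 0.00, 0.00, 1.00],   # death (absorbing state)
])

state0 = np.array([1.0, 0.0, 0.0, 0.0])        # a currently active worker
print(state0 @ np.linalg.matrix_power(P, 5))   # state distribution after 5 years
```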

  4. Culture and Probability Judgment Accuracy: The Influence of Holistic Reasoning.

    Science.gov (United States)

    Lechuga, Julia; Wiebe, John S

    2011-08-01

    A well-established phenomenon in the judgment and decision-making tradition is the overconfidence one places in the amount of knowledge that one possesses. Overconfidence or probability judgment accuracy varies not only individually but also across cultures. However, research efforts to explain cross-cultural variations in the overconfidence phenomenon have seldom been made. In Study 1, the authors compared the probability judgment accuracy of U.S. Americans (N = 108) and Mexican participants (N = 100). In Study 2, they experimentally primed culture by randomly assigning English/Spanish bilingual Mexican Americans (N = 195) to response language. Results of both studies replicated the cross-cultural variation of probability judgment accuracy previously observed in other cultural groups. U.S. Americans displayed less overconfidence when compared to Mexicans. These results were then replicated in bilingual participants, when culture was experimentally manipulated with language priming. Holistic reasoning did not account for the cross-cultural variation of overconfidence. Suggestions for future studies are discussed.

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  6. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    Science.gov (United States)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for the ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor to determine the time to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We expand this model to a multiple system. This analysis is conducted on a personal computer to provide the portability. The model is also flexible and can be easily implemented under different situations.
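
    A toy numeric illustration of the POI definition as the product of the probability of detection and the probability of coincidence; the detection probability and the crude scan-overlap model below are invented for illustration and are not the paper's model.

```python
# Minimal illustration of POI = P(detection) * P(coincidence),
# with made-up values for the scanning-receiver scenario in the abstract.
p_detection = 0.85          # receiver detects the signal when tuned and pointed
dwell, revisit = 0.2, 5.0   # seconds on the target frequency per scan cycle
signal_duration = 1.5       # seconds the emitter is actually transmitting

# crude coincidence model: chance the scan window overlaps the transmission
p_coincidence = min(1.0, (dwell + signal_duration) / revisit)
poi = p_detection * p_coincidence
print(f"P(coincidence) = {p_coincidence:.2f}, POI = {poi:.2f}")
```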

  7. The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise

    Science.gov (United States)

    Plag, H.; Bye, B.

    2011-12-01

    would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.

  8. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  9. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

    The assessment of the probable spatial distribution of new eruptions is useful to manage and reduce volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by infrequent activity, as is the case for El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea" El Hierro had been the least studied volcanic island of the Canaries, with historically more attention devoted to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate, weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. Future eruptive activity on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts and in the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.

  10. Are Differences Between Partners Always Detrimental? The Moderating Role of Future Connectedness

    Directory of Open Access Journals (Sweden)

    Simon Andrew Moss

    2014-12-01

    Full Text Available Whether perceived differences between romantic partners compromises or enhances relationships may depend on the characteristics of individuals. This study explores the possibility that differences in capabilities but not motives enhance relationship satisfaction—but only when the individuals feel connected to their future identity. In particular, when individuals feel connected to their future identity, their primary motivation is to accrue capabilities and resources that could be useful in subsequent decades. They will thus seek partners with capabilities they have yet to acquire because, consistent with self-expansion theory, they tend to perceive these abilities as part of their own self-concept. To test this premise, 152 individuals rated the motives and capabilities of both themselves and their partners and also answered questions that gauge their relationship satisfaction and connectedness to their future identity. Perceived differences in motives and capabilities were inversely associated with relationship satisfaction. However, when participants felt connected to their future identity, the inverse association between differences in capabilities and relationship satisfaction diminished. Accordingly, if individuals perceive their lives as stable, they can embrace some differences between themselves and their partner.

  11. Development and evaluation of an ultra-fast ASIC for future PET scanners using TOF-capable MPPC array detectors

    International Nuclear Information System (INIS)

    Ambe, T.; Ikeda, H.; Kataoka, J.; Matsuda, H.; Kato, T.

    2015-01-01

    We developed a front-end ASIC for future PET scanners with Time-Of-Flight (TOF) capability to be coupled with 4×4 Multi-Pixel Photon Counter (MPPC) arrays. The ASIC is designed based on the open-IP project proposed by JAXA and realized in TSMC 0.35 μm CMOS technology. The circuit comprises 16-channel, low impedance current conveyors for effectively acquiring fast MPPC signals. For precise measurement of the coincidence timing of 511-keV gamma rays, the leading-edge method was used to discriminate the signals. We first tested the time response of the ASIC by illuminating each channel of a MPPC array device 3×3 mm 2 in size with a Pico-second Light Pulsar with a light emission peak of 655 nm and pulse duration of 54 ps (FWHM). We obtained 105 ps (FWHM) on average for each channel in time jitter measurements. Moreover, we compensated for the time lag of each channel with inner delay circuits and succeeded in suppressing about a 700-ps lag to only 15 ps. This paper reports TOF measurements using back-to-back 511-keV signals, and suggests that the ASIC can be a promising device for future TOF-PET scanners based on the MPPC array. - Highlights: • We developed a newly designed large-area monolithic MPPC array. • We obtained fine gain uniformity, and good energy and time resolutions when coupled to the LYSO scintillator. • We fabricated gamma-ray camera consisting of the MPPC array and the submillimeter pixelized LYSO and GGAG scintillators. • In the flood images, each crystal of scintillator matrices was clearly resolved. • Good energy resolutions for 662 keV gamma-rays for each LYSO and GGAG scintillator matrices were obtained

  12. Lightweight Tactical Client: A Capability-Based Approach to Command Post Computing

    Science.gov (United States)

    2015-12-01

    bundles these capabilities together is proposed: a lightweight tactical client. In order to avoid miscommunication in the future, it is... solutions and almost definitely rules out most terminal-based thin clients. UNCLASSIFIED Approved for public release

  13. Survey of industrial coal conversion equipment capabilities: valves

    Energy Technology Data Exchange (ETDEWEB)

    Bush, W. A.; Slade, E. C.

    1978-06-01

    A survey of the industrial capabilities of the valve and valve-actuator industry to supply large, high-pressure stop valves for the future coal conversion industry is presented in this report. Also discussed are development and testing capabilities of valve and valve-actuator manufacturers and anticipated lead times required to manufacture advanced design valves for the most stringent service applications. Results indicate that the valve and valve-actuator industry is capable of manufacturing in quantity equipment of the size and for the pressure and temperature ranges which would be required in the coal conversion industry. Valve manufacturers do not, however, have sufficient product application experience to predict the continuing functional ability of valves used for lock-hopper feeders, slurry feeders, and slag-char letdown service. Developmental and testing efforts to modify existing valve designs or to develop new valve concepts for these applications were estimated to range from 1 to 6 years. A testing facility to simulate actuation of critical valves under service conditions would be beneficial.

  14. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  15. Security and VO management capabilities in a large-scale Grid operating system

    OpenAIRE

    Aziz, Benjamin; Sporea, Ioana

    2014-01-01

    This paper presents a number of security and VO management capabilities in a large-scale distributed Grid operating system. The capabilities formed the basis of the design and implementation of a number of security and VO management services in the system. The main aim of the paper is to provide some idea of the various functionality cases that need to be considered when designing similar large-scale systems in the future.

  16. Impact of Personnel Capabilities on Organizational Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    in this rapidly changing world. This research focuses on the definition of the personnel aspect of innovation capability, and proposes a conceptual model based on scientific articles from the academic literature on organisations' innovation capability. The paper includes an expert-based validation in three rounds of the Delphi method. For a better appreciation of the relationships dominating the factors of the model, a questionnaire was distributed to Iranian companies in the food industry. This research proposes a direct relationship between Innovation Capability and the Personnel Capability...

  17. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, several influencing factors are examined using statistical and econometric models. The main approach concerns applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the size of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
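
    A minimal sketch of the kind of binary logit model described, using statsmodels on synthetic records; the field names, coefficients and generated outcome are assumptions made so the example runs end to end, not the article's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical loan records with fields of the kind discussed in the article.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "sum_given":   rng.uniform(1_000, 50_000, n),
    "remoteness":  rng.uniform(0, 300, n),        # distance from the branch, km
    "birth_month": rng.integers(1, 13, n),
})
# synthetic outcome so the example is self-contained
logit_true = (-1.0 + 0.00005 * df.sum_given
              + 0.004 * df.remoteness - 0.05 * df.birth_month)
df["returned"] = rng.random(n) < 1 / (1 + np.exp(-logit_true))

X = sm.add_constant(df[["sum_given", "remoteness", "birth_month"]])
model = sm.Logit(df["returned"].astype(int), X).fit(disp=0)
print(model.params)               # sign and size of each effect
print(model.predict(X.iloc[:3]))  # predicted probability of returning the loan
```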

  18. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  19. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data. Not only because data is knowledge, but it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  20. Heavy Lift Launch Capability with a New Hydrocarbon Engine

    Science.gov (United States)

    Threet, Grady E., Jr.; Holt, James B.; Philips, Alan D.; Garcia, Jessica A.

    2011-01-01

    The Advanced Concepts Office at NASA's George C. Marshall Space Flight Center was tasked to define the thrust requirement of a new liquid oxygen rich staged combustion cycle hydrocarbon engine that could be utilized in a launch vehicle to meet NASA's future heavy lift needs. Launch vehicle concepts were sized using this engine for different heavy lift payload classes. Engine out capabilities for one of the heavy lift configurations were also analyzed for increased reliability that may be desired for high value payloads or crewed missions. The applicability of this engine in vehicle concepts to meet military and commercial class payloads comparable to current ELV capability was also evaluated.

  1. Capability Paternalism

    NARCIS (Netherlands)

    Claassen, R.J.G.|info:eu-repo/dai/nl/269266224

    A capability approach prescribes paternalist government actions to the extent that it requires the promotion of specific functionings, instead of the corresponding capabilities. Capability theorists have argued that their theories do not have many of these paternalist implications, since promoting

  2. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
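
    As a concrete illustration of the kind of capability calculation described above, the following sketch computes Cp, Cpk, and an approximate sigma level for a tablet quality attribute; the assay values and specification limits are hypothetical, the process is assumed normally distributed and statistically stable, and this is not the study's own data or code.

        # Capability indices and an approximate sigma level (illustrative sketch).
        import numpy as np
        from scipy import stats

        data = np.array([98.2, 101.5, 99.0, 102.1, 100.3, 97.8, 100.9, 99.5, 101.8, 98.6])  # % label claim (hypothetical)
        lsl, usl = 95.0, 105.0                               # specification limits (assumed)

        mu, sigma = data.mean(), data.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)                       # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)          # actual capability
        p_oos = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)  # expected out-of-spec fraction
        sigma_level = stats.norm.isf(p_oos) + 1.5            # with the conventional 1.5-sigma long-term shift
        print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  P(OOS)={p_oos:.2e}  sigma level ~ {sigma_level:.1f}")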

  3. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework for a doctoral dissertation research in progress in the field of Mathematics Education, in particular, in the subfield of teaching and learning basic concepts of Probability Theory at the College level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of these basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like the Parcae and the Moirai, often personified in the goddess Tyche—Fortuna for the Romans—, as regarded in Werner Jaeger’s “Paideia”. The second aspect treats the idea of hazard from two different approaches: the first approach deals with hazard, denoted by Plato with the already demythologized term ‘tyche’ from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called “phenomenological”, from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term ‘causal’ was opposed both to ‘casual’ and to ‘spontaneous’ (as used in the expression “spontaneous generation”), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  4. Continuation of probability density functions using a generalized Lyapunov approach

    Energy Technology Data Exchange (ETDEWEB)

    Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)

    2017-05-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. The key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e., the occurrence of multiple steady states of the Atlantic Ocean circulation.
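
    The small-noise idea behind the method can be illustrated in a low-dimensional setting: near a stable fixed point of dx = A x dt + B dW, the stationary density is approximately Gaussian with covariance C solving the Lyapunov equation A C + C A^T + B B^T = 0. The sketch below uses hypothetical matrices and SciPy's dense solver rather than the paper's low-rank iterative method for the generalized equation.

        # Gaussian small-noise approximation of a stationary PDF near a fixed point (sketch).
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        A = np.array([[-1.0, 0.5],
                      [0.0, -2.0]])      # Jacobian at the fixed point (assumed stable)
        B = np.array([[0.1], [0.2]])     # noise matrix (assumed)

        C = solve_continuous_lyapunov(A, -B @ B.T)   # solves A C + C A^T = -B B^T

        def stationary_pdf(x):
            """Approximate stationary density at x (Gaussian, centered on the fixed point)."""
            k = len(x)
            norm_const = np.sqrt((2 * np.pi) ** k * np.linalg.det(C))
            return np.exp(-0.5 * x @ np.linalg.solve(C, x)) / norm_const

        print("covariance:\n", C)
        print("pdf at the fixed point:", stationary_pdf(np.zeros(2)))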

  5. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  6. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

    We study a complex action theory (CAT) whose path runs over not only the past but also the future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation......, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T - t and large t - T_A corresponds to that of a future-not-included theory with a proper inner product for large t - T_A. Hence, the CAT...

  7. The future petroleum geologist

    International Nuclear Information System (INIS)

    Berg, R.R.

    1992-01-01

    This paper reports that in July 1985, AAPG President William L. Fisher appointed a select committee to determine the capabilities that will be required of petroleum geologists in the future. His charge to the committee was based on the profound changes and uncertainties that were beginning to be felt in the industry and would surely affect the employment of geologists and their professional practice. These changes are well known: the supply of oil had exceeded demand, the price of oil was unstable, many companies were threatened by debt and buy-outs, and corporate restructuring was underway to meet changing economic conditions. All contributed to great uncertainty about the need and requirements of geological employment and practice. Specifically, President Fisher charged the committee to distinguish those elements of recent times that are cyclic and those that are long-term in their effects; to characterize the state of the industry for the next 25 years; to predict the capabilities that the future petroleum geologist should possess to meet the challenges of the future; and, most importantly, to define the role of AAPG and its commitments to the membership under these changing conditions.

  8. Neutron sources: Present practice and future potential

    International Nuclear Information System (INIS)

    Cierjacks, S.; Smith, A.B.

    1988-01-01

    The present capability and future potential of accelerator-based monoenergetic and white neutron sources are outlined in the context of fundamental and applied neutron-nuclear research. The neutron energy range extends from thermal to 500 MeV, and the time domain from steady-state to pico-second pulsed sources. Accelerator technology is summarized, including the production of intense light-ion, heavy-ion and electron beams. Target capabilities are discussed with attention to neutron-producing efficiency and power-handling capabilities. The status of underlying neutron-producing reactions is summarized. The present and future uses of neutron sources in fundamental neutron-nuclear research, nuclear data acquisition, materials damage studies, engineering tests, and biomedical applications are discussed. Emphasis is given to current status, near-term advances well within current technology, and long-range projections. 90 refs., 4 figs

  9. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate in February 1980. Equipment, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Recommendations for future testing procedures are made. (GBn)

  10. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    Science.gov (United States)

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause increases or decreases in species ranges, or even the extirpation of species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century, while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.
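
    A minimal sketch of such a classification tree (hypothetical predictors, training data and warming offset, not the fitted Muskegon models) shows how a probability of presence can be re-evaluated under a temperature-change scenario.

        # Classification tree giving a probability of game fish presence (illustrative sketch).
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical reach-level predictors: [mean July water temperature (deg C), % urban land cover]
        X = np.array([[14, 5], [15, 8], [17, 12], [19, 20], [21, 35],
                      [23, 40], [16, 10], [24, 55], [13, 3], [22, 30]])
        y = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])   # 1 = coldwater species present (assumed)

        tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

        current = np.array([[16, 10]])                 # one stream reach today
        future = current + np.array([[3, 0]])          # the same reach under a +3 deg C scenario
        print("P(present), current:", tree.predict_proba(current)[0, 1])
        print("P(present), +3 deg C:", tree.predict_proba(future)[0, 1])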

  11. Capability and Interface Assessment of Gaming Technologies for Future Multi-Unmanned Air Vehicle Systems

    Science.gov (United States)

    2011-08-01

    [Abstract garbled in the source record; recoverable fragments:] The assessment considered gaming interface styles ranging from Massively Multiplayer Online Role Playing Games (MMORPGs), which necessitate the management of multiple independent entities with sophisticated capabilities, to arcade-style games, and it notes the popularity of MMORPGs as a widely tested platform for the simultaneous control of multiple entities. Keywords: MMORPG, SA.

  12. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  13. Dynamic capabilities, Marketing Capability and Organizational Performance

    Directory of Open Access Journals (Sweden)

    Adriana Roseli Wünsch Takahashi

    2017-01-01

    Full Text Available The goal of the study is to investigate the influence of dynamic capabilities on organizational performance and the role of marketing capabilities as a mediator in this relationship in the context of private HEIs in Brazil. As a research method, we carried out a survey with 316 HEIs, and the data analysis was operationalized with the technique of structural equation modeling. The results indicate that dynamic capabilities influence organizational performance only when mediated by marketing capability. Marketing capability has an important role in the survival, growth and renewal of educational service offerings for private-sector HEIs, and consequently in organizational performance. It is also demonstrated that the mediated relationship is more intense for HEIs with up to 3,000 students, and that other organizational profile variables, such as the number of courses, the constitution, the type of institution and the type of education, do not significantly alter the results.

  14. Does the Size Matter for Dynamics Capabilities? A Study on Absorptive Capacity

    Directory of Open Access Journals (Sweden)

    Marlon Fernandes Rodrigues Alves

    2016-10-01

    Full Text Available The objective of this study is to understand how organizational size influences dynamic capabilities in Brazil. To arrive at this understanding, structural equation modeling analysis was performed using the Brazilian Innovation Survey (PINTEC) database to test for differences between SMEs and large companies with respect to the relationship between absorptive capacity (AC) dimensions and innovation performance. The results show that in large companies, Potential AC and Realized AC impact innovation performance, whereas in small and medium-sized enterprises (SMEs), only Realized AC has an influence. In addition, SMEs are, in fact, better at converting Realized AC into innovation performance than large companies, probably due to their flexibility and agility. These findings reveal that organizational size influences the impact of dynamic capabilities on performance.

  15. Ensuring US National Aeronautics Test Capabilities

    Science.gov (United States)

    Marshall, Timothy J.

    2010-01-01

    process; and the reductions in wind tunnel testing requirements within the largest consumer of ATP wind tunnel test time, the Aeronautics Research Mission Directorate (ARMD). Retirement of the Space Shuttle Program and recent perturbations of NASA's Constellation Program will exacerbate this downward trend. Therefore it is crucial that ATP periodically revisit and determine which of its test capabilities are strategically important, which qualify as low-risk redundancies that could be put in an inactive status or closed, and address the challenges associated with both sustainment and improvements to the test capabilities that must remain active. This presentation will provide an overview of the ATP vision, mission, and goals as well as the challenges and opportunities the program is facing both today and in the future. We will discuss the strategy ATP is taking over the next five years to address the National aeronautics test capability challenges and what the program will do to capitalize on its opportunities to ensure a ready, robust and relevant portfolio of National aeronautics test capabilities.

  16. Assessing changes in failure probability of dams in a changing climate

    Science.gov (United States)

    Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.

    2017-12-01

    Dams are crucial infrastructure and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced a series of flooding events that terminated a 5-year drought and led to incidents such as the structural failure of Oroville Dam's spillway. Because of the large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important as the current procedures for design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationary assumption. Yet, changes in climate are anticipated to result in changes in the statistics of river flow (e.g., more extreme floods) and possibly to increase the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) in order to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not show a significant change, extreme floods are projected to increase in the future. Using extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods in the current and future climates. Finally, we use the historical and future return periods to quantify changes in the failure probability of dams in a warming climate.
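
    The return-period calculation can be sketched as follows, using synthetic annual-maximum series rather than the routed GCM streamflow of the study: fit a GEV distribution to historical annual maxima, take the 100-year return level, and ask how often that same flow is exceeded under a fit to a hypothetical future series.

        # Change in the return period of the historical 100-year flood (illustrative sketch).
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)
        hist_annual_max = rng.gumbel(loc=800, scale=150, size=60)   # m^3/s, synthetic "historical" maxima
        fut_annual_max = rng.gumbel(loc=900, scale=200, size=60)    # synthetic "future" maxima with heavier extremes

        c_h, loc_h, scale_h = genextreme.fit(hist_annual_max)
        q100_hist = genextreme.isf(1 / 100, c_h, loc_h, scale_h)    # flow exceeded with probability 1/100 per year

        c_f, loc_f, scale_f = genextreme.fit(fut_annual_max)
        p_exceed_future = genextreme.sf(q100_hist, c_f, loc_f, scale_f)
        print(f"historical 100-year flood: {q100_hist:.0f} m^3/s")
        print(f"return period of that flood in the future fit: {1 / p_exceed_future:.0f} years")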

  17. Concluding Remarks: The Current Status and Future Prospects for GRB Astronomy

    Science.gov (United States)

    Gehrels, Neil

    2009-01-01

    We are in a remarkable period of discovery in GRB astronomy. The current satellites, including Swift, Fermi, AGILE and INTEGRAL, are detecting and observing bursts of all varieties. Increasing capabilities for follow-up observations on the ground and in space are leading to rapid and deep coverage across the electromagnetic spectrum. The future will see continued operation of the current experiments, along with future missions like SVOM plus possible missions like JANUS and EXIST. An exciting expansion of capabilities is occurring in the areas of gravitational waves and neutrinos that could open new windows on the GRB phenomenon. Increased IR capabilities on the ground and with missions like JWST will enable further exploration of high-redshift bursts. The future is bright.

  18. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored data martingales. The text includes measure theoretic...

  19. Development of an environmental radiation analysis research capability in the UAE

    International Nuclear Information System (INIS)

    Kim, Sung-yeop; Kim, Chankyu; Lee, Kun Jai; Chang, Soon Heung; Elmasri, Hasna; Beeley, Philip A.

    2013-01-01

    The UAE has started a nuclear energy program with the aim of having its first four units on-line between 2017 and 2020 and it is important that the country has an environmental radiation analysis capability to support this program. Khalifa University is therefore implementing a research laboratory to support both experimental analysis and radionuclide transport modeling in the aquatic and terrestrial environment. This paper outlines the development of this capability as well as the work in progress and planned for the future. - Highlights: • New university environmental radiation laboratory established in UAE. • Facilities included for alpha, beta and gamma radiometrics. • Transport modeling capability is being established. • Laboratory also used for education and training. • Robotic methods for sampling and analysis are under development

  20. Sandia Laboratories technical capabilities. Auxiliary capabilities: environmental health information science

    International Nuclear Information System (INIS)

    1975-09-01

    Sandia Laboratories is an engineering laboratory in which research, development, testing, and evaluation capabilities are integrated by program management for the generation of advanced designs. In fulfilling its primary responsibility to ERDA, Sandia Laboratories has acquired extensive research and development capabilities. The purpose of this series of documents is to catalog the many technical capabilities of the Laboratories. After the listing of capabilities, supporting information is provided in the form of highlights, which show applications. This document deals with auxiliary capabilities, in particular, environmental health and information science. (11 figures, 1 table) (RWR)

  1. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
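
    For example, probabilities of occurrence for a continuous variable can be read off the normal distribution's cumulative distribution function; the sketch below uses hypothetical parameters and is not taken from the paper.

        # Probability that a normally distributed variable falls in an interval (illustrative sketch).
        from scipy.stats import norm

        mu, sd = 120.0, 15.0   # hypothetical mean and standard deviation (e.g., systolic blood pressure, mmHg)
        p = norm.cdf(140, mu, sd) - norm.cdf(110, mu, sd)
        print(f"P(110 <= X <= 140) = {p:.3f}")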

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
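
    A standard special case, given here only as a routine illustration of the CPGF idea rather than a result from the paper, is the multinomial logit model: its CPGF is the logsum, and the gradient recovers the familiar logit choice probabilities.

        G(y_1,\dots,y_J) = \log \sum_{j=1}^{J} e^{y_j},
        \qquad
        P_i = \frac{\partial G}{\partial y_i} = \frac{e^{y_i}}{\sum_{j=1}^{J} e^{y_j}}.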

  3. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  4. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  5. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  6. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
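
    A classroom-style sketch of the idea (illustrative only, not from the article): simulate many die rolls and compare the experimental relative frequency of a six with the theoretical value of 1/6.

        # Experimental versus theoretical probability of rolling a six (illustrative sketch).
        import random

        random.seed(1)
        trials = 600
        sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
        print(f"experimental: {sixes / trials:.3f}   theoretical: {1 / 6:.3f}")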

  7. Probability of causation tables and their possible implications for the practice of diagnostic radiology

    International Nuclear Information System (INIS)

    Gur, D.; Wald, N.

    1986-01-01

    In compliance with requirements in the Orphan Drug Act (97-414) of 1983, tables were recently constructed by an ad hoc committee of the National Institutes of Health (NIH) in which the probabilities that certain specific cancers are caused by previous radiation exposure are estimated. The reports of the NIH committee and a National Academy of Science oversight committee may have broad implications for the future practice of diagnostic radiology. The basis on which the probability of causation tables were established and some of the possible implications for diagnostic radiology are discussed

  8. The techno-entrepreneur of the future : Perspectives and Practices

    NARCIS (Netherlands)

    Ravesteijn, W.; Sjoer, E.

    2010-01-01

    Present sustainability problems require a new type of techno-entrepreneurship, in which traditional entrepreneurial qualities are combined with new capabilities related to the role, mission and responsibilities of future engineers. There are two sources of these new capabilities: Innovation Systems

  9. Structure life prediction at high temperature: present and future capabilities

    International Nuclear Information System (INIS)

    Chaboche, J.L.

    1987-01-01

    The life prediction techniques for high-temperature conditions include several aspects, which are considered successively in this article. Crack initiation criteria themselves, defined for the isolated volume element (the tension-compression specimen, for example) and including parametric relationships and continuous damage approaches, are examined, as is the calculation of local stress and strain fields in the structure and their evolution under cyclic plasticity, which poses several difficult problems in obtaining stabilized cyclic solutions. The use of crack initiation criteria or damage rules on the results of the cyclic inelastic analysis and the prediction of crack growth in the structure are then considered. Different levels are considered for the predictive tools: the classical approach, future methods presently under development, and intermediate rules, which are already in use. Several examples are given on materials and components used either in the nuclear industry or in gas turbine engines. (author)

  10. The method of modular characteristic direction probabilities in MPACT

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Z. [School of Nuclear Science and Technology, Xi' an Jiaotong University, No. 28 Xianning west road, Xi' an, Shaanxi 710049 (China); Kochunas, B.; Collins, B.; Downar, T. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2200 Bonisteel, Ann Arbor, MI 48109 (United States); Wu, H. [School of Nuclear Science and Technology, Xi' an Jiaotong University, No. 28 Xianning west road, Xi' an, Shaanxi 710049 (China)

    2013-07-01

    The method of characteristic direction probabilities (CDP) is based on a modular ray tracing technique which combines the benefits of the collision probability method (CPM) and the method of characteristics (MOC). This past year CDP was implemented in the transport code MPACT for 2-D and 3-D transport calculations. Because only the fine-mesh regions crossed by the characteristic rays in a particular direction are coupled, the probability matrix is much smaller than in the CPM. At the same time, the CDP has the same capacity as the MOC for dealing with complicated geometries, because the same modular ray tracing techniques are used. Results from the C5G7 benchmark problems are given for different cases to show the accuracy and efficiency of the CDP compared to the MOC. For the cases examined, the CDP and MOC methods were seen to differ in k_eff by about 1-20 pcm, and the computational efficiency of the CDP appears to be better than the MOC for some problems. However, in other problems, particularly when the CDP matrices have to be recomputed from changing cross sections, the CDP does not perform as well. This indicates an area of future work. (authors)

  11. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  13. GaN-on-Silicon - Present capabilities and future directions

    Science.gov (United States)

    Boles, Timothy

    2018-02-01

    Gallium Nitride, in the form of epitaxial HEMT transistors on various substrate materials, is the newest and most promising semiconductor technology for high performance devices in the RF, microwave, and mmW arenas. This is particularly true for GaN-on-Silicon based devices and MMIC's which enable both state-of-the-art high frequency functionality and the ability to scale production into large wafer diameter CMOS foundries. The design and development of GaN-on-Silicon structures and devices will be presented beginning with the basic material parameters, growth of the required epitaxial construction, and leading to the fundamental operational theory of high frequency, high power HEMTs. In this discussion comparisons will be made with alternative substrate materials with emphasis on contrasting the inherent advantages of a silicon based system. Theory of operation of microwave and mmW high power HEMT devices will be presented with special emphasis on fundamental limitations of device performance including inherent frequency limiting transit time analysis, required impedance transformations, internal and external parasitic reactance, thermal impedance optimization, and challenges improved by full integration into monolithic MMICs. Lastly, future directions for implementing GaN-on-Silicon into mainstream CMOS silicon semiconductor technologies will be discussed.

  14. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    decision making versus advance science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future, even probabilistically, are not communicated clearly.

  15. Land use planning and wildfire: development policies influence future probability of housing loss

    Science.gov (United States)

    Syphard, Alexandra D.; Massada, Avi Bar; Butsic, Van; Keeley, Jon E.

    2013-01-01

    Increasing numbers of homes are being destroyed by wildfire in the wildland-urban interface. With projections of climate change and housing growth potentially exacerbating the threat of wildfire to homes and property, effective fire-risk reduction alternatives are needed as part of a comprehensive fire management plan. Land use planning represents a shift in traditional thinking from trying to eliminate wildfires, or even increasing resilience to them, toward avoiding exposure to them through the informed placement of new residential structures. For land use planning to be effective, it needs to be based on solid understanding of where and how to locate and arrange new homes. We simulated three scenarios of future residential development and projected landscape-level wildfire risk to residential structures in a rapidly urbanizing, fire-prone region in southern California. We based all future development on an econometric subdivision model, but we varied the emphasis of subdivision decision-making based on three broad and common growth types: infill, expansion, and leapfrog. Simulation results showed that decision-making based on these growth types, when applied locally for subdivision of individual parcels, produced substantial landscape-level differences in pattern, location, and extent of development. These differences in development, in turn, affected the area and proportion of structures at risk from burning in wildfires. Scenarios with lower housing density and larger numbers of small, isolated clusters of development, i.e., resulting from leapfrog development, were generally predicted to have the highest predicted fire risk to the largest proportion of structures in the study area, and infill development was predicted to have the lowest risk. These results suggest that land use planning should be considered an important component to fire risk management and that consistently applied policies based on residential pattern may provide substantial benefits for

  16. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  17. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
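
    A minimal sketch in the spirit of such a model (not the authors' implementation): leaky counts of the observed transitions in a binary sequence yield, at every step, estimates of the time-varying transition probabilities, with the leak (forgetting) factor playing the role of the single free parameter.

        # Leaky estimation of transition probabilities in a binary stimulus sequence (sketch).
        import numpy as np

        def transition_estimates(seq, leak=0.95):
            """Return P(next stimulus = 1 | previous stimulus) after each observation."""
            counts = np.ones((2, 2))          # uniform Beta(1,1)-style priors on each transition row
            estimates = []
            for prev, curr in zip(seq[:-1], seq[1:]):
                counts *= leak                # forget old evidence: the world may be time-varying
                counts[prev, curr] += 1
                estimates.append(counts[:, 1] / counts.sum(axis=1))
            return np.array(estimates)

        rng = np.random.default_rng(0)
        sequence = rng.integers(0, 2, size=200)           # fully unpredictable stimuli
        print(transition_estimates(sequence)[-1])         # final estimates of P(1|0) and P(1|1)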

  18. Assessing Graduate Sustainability Capability Post-Degree Completion: Why Is It Important and What Are the Challenges?

    Science.gov (United States)

    Sandri, Orana; Holdsworth, Sarah; Thomas, Ian

    2018-01-01

    Purpose: The purpose of this paper is to highlight both the need for measurement of graduate capabilities post-degree completion and the challenges posed by such a task. Higher education institutions provide an important site of learning that can equip future professionals with capabilities to manage and respond to complex sustainability…

  19. Probability of detection as a function of multiple influencing parameters

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato

    2014-10-15

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables POD to be calculated and expressed as a function of several influencing parameters. The model was tested on the data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data called the volume POD is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.
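
    A standard hit/miss POD model of this general kind can be sketched as a logistic regression of the detection outcome on log flaw size plus one additional influencing parameter; the data and the second parameter below are synthetic assumptions, not the dissertation's model or measurements.

        # Multi-parameter hit/miss POD via logistic regression (illustrative sketch).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        size = rng.uniform(0.5, 5.0, 300)               # flaw size, mm (synthetic)
        path = rng.uniform(20, 200, 300)                # sound path length, mm (assumed second parameter)
        logit = -4 + 3 * np.log(size) - 0.01 * path     # hypothetical true detection model
        hit = rng.random(300) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([np.log(size), path])
        pod_model = LogisticRegression().fit(X, hit)

        # POD of a 2 mm flaw at a short and a long sound path.
        query = np.array([[np.log(2.0), 50.0], [np.log(2.0), 180.0]])
        print(pod_model.predict_proba(query)[:, 1])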

  20. Probability of detection as a function of multiple influencing parameters

    International Nuclear Information System (INIS)

    Pavlovic, Mato

    2014-01-01

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables POD to be calculated and expressed as a function of several influencing parameters. The model was tested on the data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data called the volume POD is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.

  1. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
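
    A minimal illustration of the problem (synthetic lognormal example, not from the dissertation): crude Monte Carlo for the tail probability P(S_n > x) of a sum of heavy-tailed variables, compared with the classical "single big jump" approximation n * P(X_1 > x).

        # Tail probability of a heavy-tailed sum: crude Monte Carlo vs. asymptotic approximation (sketch).
        import numpy as np
        from scipy.stats import lognorm

        rng = np.random.default_rng(0)
        n, x, s = 10, 200.0, 1.5
        samples = rng.lognormal(mean=0.0, sigma=s, size=(200_000, n))
        crude_mc = (samples.sum(axis=1) > x).mean()
        asymptotic = n * lognorm.sf(x, s=s)
        print(f"crude Monte Carlo: {crude_mc:.2e}   n * P(X > x): {asymptotic:.2e}")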

  2. Coming of Age on a Shoestring Budget: Financial Capability and Financial Behaviors of Lower-Income Millennials.

    Science.gov (United States)

    West, Stacia; Friedline, Terri

    2016-10-01

    Lower-income millennials make important financial decisions that may affect their future financial well-being. With limited resources, this population is at risk for acquiring too much debt or being unprepared for a financial emergency that can send them further into poverty and constrain their ability to leverage resources for future economic mobility. A financial capability approach, an intervention that combines financial education with financial inclusion through the use of a savings account, may correlate with millennials’ healthy financial behaviors. This study used data from the 2012 National Financial Capability Study to examine the relationship between financial capability and the financial behaviors of lower-income millennials between the ages of 18 and 34 years (N = 2,578). Compared with those lower-income millennials who were financially excluded, those who were financially capable were also 171 percent more likely to afford an unexpected expense, 182 percent more likely to save for emergencies, and 34 percent less likely to carry too much debt, relating to their greater overall financial satisfaction. The findings of this study indicate that interventions that develop lower-income millennials’ financial capability may be effective for promoting healthy financial behaviors.

  3. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    Science.gov (United States)

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate performance of this new method by comparing with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from true stabilized PDF that resulted from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) The breathing signal is decomposed into individual breathing cycles, characterized by amplitude, and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined as a main breathing pattern group and is represented by the average of individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility of improving target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom. Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy as measured
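
    The grouping step (step 2) can be sketched as follows, with assumed amplitude and period bin widths; this is illustrative only, and the study's actual grouping criteria may differ beyond the stated 10% threshold.

        # Identify main breathing patterns from per-cycle amplitude and period (illustrative sketch).
        import numpy as np

        def main_breathing_patterns(amplitudes, periods, amp_bin=2.0, per_bin=0.5):
            amplitudes, periods = np.asarray(amplitudes), np.asarray(periods)
            keys = np.column_stack([np.floor(amplitudes / amp_bin),
                                    np.floor(periods / per_bin)])
            patterns = []
            for key in np.unique(keys, axis=0):
                in_group = np.all(keys == key, axis=1)
                weight = in_group.mean()                 # fraction of all cycles in this group
                if weight > 0.10:                        # main breathing pattern group
                    patterns.append((amplitudes[in_group].mean(),
                                     periods[in_group].mean(), weight))
            return patterns                              # (mean amplitude, mean period, weighting)

        amps = [10.2, 10.8, 15.5, 10.5, 16.1, 10.1, 10.9, 15.8, 10.4, 10.6]   # mm (assumed)
        pers = [3.9, 4.1, 4.8, 4.0, 5.1, 3.8, 4.2, 5.0, 4.1, 3.9]             # s (assumed)
        print(main_breathing_patterns(amps, pers))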

  4. Achieving a Launch on Demand Capability

    Science.gov (United States)

    Greenberg, Joel S.

    2002-01-01

    The ability to place payloads [satellites] into orbit as and when required, often referred to as launch on demand, continues to be an elusive and yet largely unfulfilled goal. But what is the value of achieving launch on demand [LOD], and what metrics are appropriate? Achievement of a desired level of LOD capability must consider transportation system thruput, alternative transportation systems that comprise the transportation architecture, transportation demand, reliability and failure recovery characteristics of the alternatives, schedule guarantees, launch delays, payload integration schedules, procurement policies, and other factors. Measures of LOD capability should relate to the objective of the transportation architecture: the placement of payloads into orbit as and when required. Launch on demand capability must be defined in probabilistic terms such as the probability of not incurring a delay in excess of T when it is determined that it is necessary to place a payload into orbit. Three specific aspects of launch on demand are considered: [1] the ability to recover from adversity [i.e., a launch failure] and to keep up with the steady-state demand for placing satellites into orbit [this has been referred to as operability and resiliency], [2] the ability to respond to the requirement to launch a satellite when the need arises unexpectedly either because of an unexpected [random] on-orbit satellite failure that requires replacement or because of the sudden recognition of an unanticipated requirement, and [3] the ability to recover from adversity [i.e., a launch failure] during the placement of a constellation into orbit. The objective of this paper is to outline a formal approach for analyzing alternative transportation architectures in terms of their ability to provide a LOD capability. The economic aspect of LOD is developed by establishing a relationship between scheduling and the elimination of on-orbit spares while achieving the desired level of on

  5. Probability of success for phase III after exploratory biomarker analysis in phase II.

    Science.gov (United States)

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows to investigate selection probabilities and allows to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend to perform similar simulations in practice to get the necessary information about the risk and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
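
    The weighting of power by a treatment-effect distribution can be sketched as a standard assurance calculation; the z-test setting and the phase II numbers below are hypothetical and do not reproduce the authors' simulation of biomarker subgroup selection.

        # Probability of success (assurance) for a two-arm phase III trial (illustrative sketch).
        import numpy as np
        from scipy.stats import norm

        def probability_of_success(delta_hat, se_phase2, n_per_arm, sd, alpha=0.025):
            """Average power of a one-sided z-test over a N(delta_hat, se_phase2^2) effect prior."""
            se_phase3 = sd * np.sqrt(2 / n_per_arm)
            z_alpha = norm.isf(alpha)
            deltas = norm.rvs(delta_hat, se_phase2, size=100_000, random_state=0)
            power = norm.sf(z_alpha - deltas / se_phase3)
            return power.mean()

        # Hypothetical phase II result: effect estimate 0.3 (in SD units) with standard error 0.15.
        print(f"PoS = {probability_of_success(0.3, 0.15, n_per_arm=200, sd=1.0):.2f}")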

  6. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.

  7. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  8. Probability of criminal acts of violence: a test of jury predictive accuracy.

    Science.gov (United States)

    Reidy, Thomas J; Sorensen, Jon R; Cunningham, Mark D

    2013-01-01

    The ability of capital juries to accurately predict future prison violence at the sentencing phase of aggravated murder trials was examined through retrospective review of the disciplinary records of 115 male inmates sentenced to either life (n = 65) or death (n = 50) in Oregon from 1985 through 2008, with a mean post-conviction time at risk of 15.3 years. Violent prison behavior was completely unrelated to predictions made by capital jurors, with bidirectional accuracy simply reflecting the base rate of assaultive misconduct in the group. Rejection of the special issue predicting future violence enjoyed 90% accuracy. Conversely, predictions that future violence was probable had 90% error rates. More than 90% of the assaultive rule violations committed by these offenders resulted in no harm or only minor injuries. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Dynamic capabilities and innovation capabilities: The case of the ‘Innovation Clinic’

    Directory of Open Access Journals (Sweden)

    Fred Strønen

    2017-01-01

    Full Text Available In this explorative study, we investigate the relationship between dynamic capabilities and innovation capabilities. Dynamic capabilities are at the core of strategic management in terms of how firms can ensure adaptation to changing environments over time. Our paper follows two paths of argumentation. First, we review and discuss some major contributions to the theories on ordinary capabilities, dynamic capabilities, and innovation capabilities. We seek to identify different understandings of the concepts in question, in order to clarify the distinctions and relationships between dynamic capabilities and innovation capabilities. Second, we present a case study of the ’Innovation Clinic’ at a major university hospital, including four innovation projects. We use this case study to explore and discuss how dynamic capabilities can be extended, as well as to what extent innovation capabilities can be said to be dynamic. In our conclusion, we discuss the conditions for nurturing ‘dynamic innovation capabilities’ in organizations.

  10. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  11. Capability ethics

    OpenAIRE

    Robeyns, Ingrid

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological theories, virtue ethics, or pragmatism. As I will argue in this chapter, at present the core of the capability approach is an account of value, which together with some other (more minor) normative comm...

  12. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
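
    A minimal illustration (an assumption of this note, not an example from the paper): in the multinomial-logit special case, taking the CPGF, as a function of the utilities, to be the log-sum-exp function, its gradient reproduces the familiar softmax choice probabilities.

        # Illustrative sketch: for the logit case, G(u) = log(sum_j exp(u_j)) acts as a
        # choice-probability generating function whose gradient gives the choice
        # probabilities. The utilities below are made-up numbers.
        import numpy as np

        def cpgf_logit(u):
            return np.log(np.sum(np.exp(u)))

        def choice_probs_softmax(u):
            e = np.exp(u - u.max())          # numerically stabilised softmax
            return e / e.sum()

        u = np.array([1.0, 0.2, -0.5])       # hypothetical systematic utilities
        eps = 1e-6
        grad = np.array([(cpgf_logit(u + eps * np.eye(3)[j]) -
                          cpgf_logit(u - eps * np.eye(3)[j])) / (2 * eps)
                         for j in range(3)])

        print("gradient of CPGF:", np.round(grad, 4))
        print("softmax formula :", np.round(choice_probs_softmax(u), 4))
        # Both vectors agree, illustrating "gradient of the CPGF = choice probabilities".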

  13. Satellite-based Tropical Cyclone Monitoring Capabilities

    Science.gov (United States)

    Hawkins, J.; Richardson, K.; Surratt, M.; Yang, S.; Lee, T. F.; Sampson, C. R.; Solbrig, J.; Kuciauskas, A. P.; Miller, S. D.; Kent, J.

    2012-12-01

    Satellite remote sensing capabilities to monitor tropical cyclone (TC) location, structure, and intensity have evolved by utilizing a combination of operational and research and development (R&D) sensors. The microwave imagers from the operational Defense Meteorological Satellite Program [Special Sensor Microwave/Imager (SSM/I) and the Special Sensor Microwave Imager Sounder (SSMIS)] form the "base" for structure observations due to their ability to view through upper-level clouds, modest-size swaths and ability to capture most storm structure features. The NASA TRMM microwave imager and precipitation radar continue their 15+ year-long missions in serving the TC warning and research communities. The cessation of NASA's QuikSCAT satellite after more than a decade of service is sorely missed, but India's OceanSat-2 scatterometer is now providing crucial ocean surface wind vectors in addition to the Navy's WindSat ocean surface wind vector retrievals. Another Advanced Scatterometer (ASCAT) onboard EUMETSAT's MetOp-2 satellite is slated for launch soon. Passive microwave imagery has received a much-needed boost with the launch of the French/Indian Megha-Tropiques imager in September 2011, greatly supplementing the very successful NASA TRMM pathfinder with a larger swath and more frequent temporal sampling. While initial data issues have delayed data utilization, current news indicates these data will be available in 2013. Future NASA Global Precipitation Mission (GPM) sensors starting in 2014 will provide enhanced capabilities. Also, the inclusion of the new microwave sounder data from the NPP ATMS (Oct 2011) will assist in mapping TC convective structures. The National Polar-orbiting Partnership (NPP) program's VIIRS sensor includes a day/night band (DNB) with the capability to view TC cloud structure at night when sufficient lunar illumination exists. Examples highlighting this new capability will be discussed in concert with additional data fusion efforts.

  14. Potential Future Igneous Activity at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Cline, M.; Perry, F.; Valentine, G.; Smistad, E.

    2005-01-01

    Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgment, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into and eruption through a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository. The probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7 x 10^-8 per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of new information and to estimate the probability of one or more volcanic conduits located in the proposed repository along a dike that intersects the proposed repository. US Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive event (igneous event) with annual probability greater than 1 x 10^-8 be evaluated. Two consequence scenarios are considered: (1) the igneous intrusion-groundwater transport case and (2) the volcanic eruptive case. These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of

  15. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    The comparative decision-making process is widely used to identify which option (system, product, service, etc.) has smaller environmental footprints and for providing recommendations that help stakeholders take future decisions. However, the uncertainty problem complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply the reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders and also reduces the computational time.
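
    A minimal sketch of the Monte Carlo side of the comparison (the two lognormal impact distributions are made-up placeholders, not the authors' case study): the decision confidence probability is estimated as the fraction of paired samples in which option A has a lower impact than option B.

        # Monte Carlo estimate of the decision confidence probability P(impact_A < impact_B).
        # The two lognormal impact distributions are hypothetical placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        impact_a = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)   # option A
        impact_b = rng.lognormal(mean=np.log(12.0), sigma=0.5, size=n)   # option B

        p_confidence = np.mean(impact_a < impact_b)
        se = np.sqrt(p_confidence * (1 - p_confidence) / n)              # MC standard error

        print(f"P(A has smaller impact than B) ~ {p_confidence:.3f} +/- {se:.3f}")
        # A FORM-type method would approximate the same probability from the limit state
        # g = impact_b - impact_a with far fewer model evaluations, at the cost of a
        # response-surface approximation.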

  16. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models and applied to the population demography for each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55%). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  17. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
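
    A small numerical sketch (the trial counts are arbitrary examples, not data from the article): the traditional MLE k/n collapses to zero when no events are observed, whereas the approximation 1/(2.5n) quoted in the abstract gives a non-zero, risk-averse estimate.

        # Compare the MLE with the 1/(2.5 n) approximation for zero-event data.
        # Trial counts are arbitrary examples.
        def mle(k, n):
            return k / n

        def minimax_zero_event_approx(n):
            return 1.0 / (2.5 * n)          # approximation quoted for k = 0

        for n in (10, 50, 200, 1000):
            print(f"n = {n:>4}: MLE = {mle(0, n):.4f}, "
                  f"approx. minimax estimate = {minimax_zero_event_approx(n):.4f}")
        # For non-zero counts the minimax procedure closely tracks the MLE k/n,
        # so the difference matters mainly in the zero-failure case.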

  18. Optimism as a Prior Belief about the Probability of Future Reward

    Science.gov (United States)

    Kalra, Aditi; Seriès, Peggy

    2014-01-01

    Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e. its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly. PMID:24853098

  19. Optimism as a prior belief about the probability of future reward.

    Directory of Open Access Journals (Sweden)

    Aistis Stankevicius

    2014-05-01

    Full Text Available Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e. its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly.

  20. Gossiping Capabilities

    DEFF Research Database (Denmark)

    Mogensen, Martin; Frey, Davide; Guerraoui, Rachid

    Gossip-based protocols are now acknowledged as a sound basis to implement collaborative high-bandwidth content dissemination: content location is disseminated through gossip, the actual contents being subsequently pulled. In this paper, we present HEAP, HEterogeneity Aware gossip Protocol, where nodes dynamically adjust their contribution to gossip dissemination according to their capabilities. Using a continuous, itself gossip-based, approximation of relative capabilities, HEAP dynamically leverages the most capable nodes by (a) increasing their fanouts (while decreasing, by the same proportion, those of less capable nodes) ... Nodes might declare a high capability in order to augment their perceived quality without contributing accordingly. We evaluate HEAP in the context of a video streaming application on a 236-node PlanetLab testbed. Our results show that HEAP improves the quality of the streaming by 25% over a standard gossip ...

  1. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  2. Diagnostics of enterprise bankruptcy occurrence probability in an anti-crisis management: modern approaches and classification of models

    Directory of Open Access Journals (Sweden)

    I.V. Zhalinska

    2015-09-01

    Full Text Available Diagnostics of enterprise bankruptcy occurrence probability is an important tool for ensuring the viability of an organization under conditions of an unpredictable, dynamic environment. The paper aims to define the basic features of models for diagnosing bankruptcy occurrence probability and to classify them. The article argues that the probability of crisis is objectively increasing in modern enterprises, and that this increase calls for more efficient anti-crisis activities. The system of anti-crisis management is based on the subsystem of diagnostics of bankruptcy occurrence probability; this subsystem underpins further measures to prevent and overcome the crisis. A classification of existing models of enterprise bankruptcy occurrence probability has been suggested, based on the methodical and methodological principles of the models. The following main groups of models are determined: models using financial ratios, aggregates and scores; discriminant-analysis models; methods of strategic analysis; informal models; artificial intelligence systems; and combinations of these models. The classification made it possible to identify the analytical capabilities of each of the suggested groups of models.

  3. Application of Probability Calculations to the Study of the Permissible Step and Touch Potentials to Ensure Personnel Safety

    International Nuclear Information System (INIS)

    Eisawy, E.A.

    2011-01-01

    The aim of this paper is to develop a practical method to evaluate the actual step and touch potential distributions in order to determine the risk of failure of the grounding system. The failure probability, indicating the safety level of the grounding system, is related to both applied (stress) and withstand (strength) step or touch potentials. The probability distributions of the applied step and touch potentials, as well as the corresponding withstand step and touch potentials, which represent the capability of the human body to resist stress potentials, are presented. These two distributions are used to evaluate the failure probability of the grounding system, which denotes the probability that the applied potential exceeds the withstand potential. The method treats the resistance of the human body, the foot contact resistance and the fault clearing time as independent random variables, rather than as the fixed values used in previous analyses when determining the safety requirements for a given grounding system.
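
    A minimal stress-strength sketch under assumed distributions (all numbers and distribution choices are placeholders, not the paper's data): the failure probability of the grounding system is estimated as the probability that the applied touch potential exceeds the withstand potential when body resistance, foot contact resistance and clearing time are treated as random variables.

        # Stress-strength Monte Carlo: P(applied touch potential > withstand potential).
        # All distributions and parameters are hypothetical placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000

        applied_v = rng.normal(600.0, 150.0, n)            # stress: applied touch potential (V)

        body_r = rng.lognormal(np.log(1000.0), 0.25, n)    # body resistance (ohm)
        foot_r = rng.lognormal(np.log(1500.0), 0.35, n)    # foot contact resistance (ohm)
        t_clear = rng.uniform(0.2, 0.8, n)                 # fault clearing time (s)

        # Withstand potential from a Dalziel-type tolerable body current I = 0.116 / sqrt(t).
        i_limit = 0.116 / np.sqrt(t_clear)                 # tolerable body current (A)
        withstand_v = i_limit * (body_r + foot_r / 2.0)    # strength: tolerable voltage (V)

        p_fail = np.mean(applied_v > withstand_v)
        print(f"estimated failure probability: {p_fail:.4f}")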

  4. The Royal Naval Medical Services: delivering medical operational capability. the 'black art' of Medical Operational Planning.

    Science.gov (United States)

    Faye, M

    2013-01-01

    This article looks to dispel the mysteries of the 'black art' of Medical Operational Planning whilst giving an overview of activity within the Medical Operational Capability area of Medical Division (Med Div) within Navy Command Headquarters (NCHQ) during a period when the Royal Naval Medical Services (RNMS) have been preparing and reconfiguring medical capability for the future contingent battle spaces. The rolling exercise program has been used to illustrate the ongoing preparations taken by the Medical Operational Capability (Med Op Cap) and the Medical Force Elements to deliver medical capability in the littoral and maritime environments.

  5. Probability of defect detection of Posiva's electron beam weld

    International Nuclear Information System (INIS)

    Kanzler, D.; Mueller, C.; Pitkaenen, J.

    2013-12-01

    The report 'Probability of Defect Detection of Posiva's electron beam weld' describes POD curves for four NDT methods: radiographic testing, ultrasonic testing, eddy current testing and visual testing. The POD curves are based on artificial defects in reference blocks. The results are devoted to demonstrating the suitability of the methods for EB weld testing. The report describes the methodology and procedure applied by BAM. It creates a link from the assessment of reliability and inspection performance to the risk assessment process of the canister final disposal project, and it confirms the basic quality of the NDT methods and their capability to describe the quality of the EB weld. The probability of detection curves are determined based on the MIL-1823 standard and its reliability guidelines; the MIL-1823 standard was developed for determining the integrity of gas turbine engines for the US military. In the POD process, the key parameter for defect detectability is the a90/95 magnitude, i.e. the defect size a at which the lower 95% confidence band crosses the 90% POD level. In this way it is confirmed that defects of size a90/95 will be detected with 90% probability; if the experiment were repeated, 5% of cases might fall outside this confidence limit. (orig.)
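
    A minimal hit/miss POD sketch with synthetic data (not Posiva's inspection data and not the full MIL-1823 confidence-band procedure): a logistic POD curve in log flaw size is fitted by maximum likelihood, a90 is read off the fitted curve, and a simple bootstrap stands in for the 95% lower confidence band that defines a90/95.

        # Hit/miss POD: fit POD(a) = 1/(1+exp(-(b0 + b1*log a))) by maximum likelihood,
        # then estimate a90 and a bootstrap stand-in for a90/95. Data are synthetic.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        rng = np.random.default_rng(2)
        a = rng.uniform(0.5, 8.0, 200)                      # flaw sizes (mm), synthetic
        true_pod = expit(-4.0 + 4.0 * np.log(a))            # assumed "true" POD curve
        hit = (rng.random(200) < true_pod).astype(float)    # 1 = detected, 0 = missed

        def neg_loglik(beta, a, hit):
            p = np.clip(expit(beta[0] + beta[1] * np.log(a)), 1e-9, 1 - 1e-9)
            return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

        def fit_a90(a, hit):
            res = minimize(neg_loglik, x0=[0.0, 1.0], args=(a, hit), method="Nelder-Mead")
            b0, b1 = res.x
            return np.exp((np.log(0.9 / 0.1) - b0) / b1)    # size where fitted POD = 0.9

        a90 = fit_a90(a, hit)
        boot = []
        for _ in range(300):                                # crude bootstrap over inspections
            idx = rng.integers(0, len(a), len(a))
            boot.append(fit_a90(a[idx], hit[idx]))
        a90_95 = np.percentile(boot, 95)                    # conservative (larger) size bound

        print(f"a90 (maximum-likelihood fit) ~ {a90:.2f} mm")
        print(f"a90/95 (bootstrap stand-in)  ~ {a90_95:.2f} mm")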

  6. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  7. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  10. Actual growth and probable future of the worldwide nuclear industry

    International Nuclear Information System (INIS)

    Bupp, I.C.

    1981-01-01

    Worldwide nuclear-power-reactor manufacturing capacity will exceed worldwide demand by a factor of two or more during the 1980s. Only in France and the Soviet bloc countries is it likely that the ambitious nuclear-power programs formulated in the mid-1970s will be implemented. In all other developed countries and in most developing countries, further delays and cancellations of previously announced programs are all but certain. The stalemate over the future of nuclear power is particularly deep in America. Administrative and personnel problems in the Nuclear Regulatory Commission, slow progress on radioactive waste disposal by the Department of Energy, severe financial problems for most electric utilities, and drastic reductions in the rate of electricity demand growth combine to make continuation of the five-year-old moratorium on reactor orders inevitable. Many of the ninety plants under construction may never operate, and some of the seventy in operation may shut down before the end of their economic life. Contrary to widespread belief, further oil price increases may not speed up world-wide reactor sales. It is possible that the world is heading for a worst of all possible outcomes: a large number of small nuclear power programs that do little to meet real energy needs but substantially complicate the problem of nuclear weapons proliferation. 24 references, 4 tables

  11. Probability Density Estimation Using Neural Networks in Monte Carlo Calculations

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Cho, Jin Young; Song, Jae Seung; Kim, Chang Hyo

    2008-01-01

    The Monte Carlo neutronics analysis requires the capability to estimate tally distributions, such as an axial power distribution or a flux gradient in a fuel rod. This problem can be regarded as a probability density function estimation from an observation set. We apply the neural-network-based density estimation method to an observation and sampling weight set produced by the Monte Carlo calculations. The neural network method is compared with the histogram and the functional expansion tally method for estimating a non-smooth density, a fission source distribution, and an absorption rate's gradient in a burnable absorber rod. The application results show that the neural network method can approximate a tally distribution quite well. (authors)
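
    A small stand-in sketch (a weighted kernel density estimate rather than the authors' neural-network estimator, with made-up samples and weights): it shows the general task of recovering a smooth tally density from Monte Carlo observations and sampling weights, compared against a coarse histogram.

        # Density estimation from weighted Monte Carlo observations.
        # A weighted Gaussian KDE is used here as a simple stand-in for the
        # neural-network estimator described in the record; data are synthetic.
        import numpy as np

        rng = np.random.default_rng(3)
        x = np.concatenate([rng.normal(0.3, 0.08, 4000), rng.normal(0.7, 0.05, 2000)])
        w = rng.uniform(0.5, 1.5, x.size)                 # per-history sampling weights
        w /= w.sum()

        def weighted_kde(grid, x, w, bandwidth=0.03):
            # Weighted Gaussian kernel density estimate evaluated on 'grid'.
            z = (grid[:, None] - x[None, :]) / bandwidth
            k = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
            return (k * w[None, :]).sum(axis=1) / bandwidth

        grid = np.linspace(0.0, 1.0, 101)
        dens = weighted_kde(grid, x, w)
        hist, _ = np.histogram(x, bins=20, range=(0.0, 1.0), weights=w, density=True)

        print("coarse histogram (first bins):", np.round(hist[:5], 2))
        print("smooth estimate peaks near   :", grid[np.argmax(dens)])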

  12. Physics at Future Colliders

    CERN Document Server

    Ellis, John R.

    1999-01-01

    After a brief review of the Big Issues in particle physics, we discuss the contributions that various planned and proposed future colliders could make to resolving them. These include future runs of LEP and the Fermilab Tevatron collider, B factories, RHIC, the LHC, a linear electron-positron collider, an electron-proton collider in the LEP/LHC tunnel, a muon collider and a future larger hadron collider (FLHC). The Higgs boson and supersymmetry are used as benchmarks for assessing their capabilities. The LHC has great capacities for precision measurements as well as exploration, but also shortcomings where the complementary strengths of a linear electron-positron collider would be invaluable. It is not too soon to study seriously possible subsequent colliders.

  13. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  14. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. Rights, goals, and capabilities

    NARCIS (Netherlands)

    van Hees, M.V.B.P.M

    This article analyses the relationship between rights and capabilities in order to get a better grasp of the kind of consequentialism that the capability theory represents. Capability rights have been defined as rights that have a capability as their object (rights to capabilities). Such a

  16. Aircraft Capability Management

    Science.gov (United States)

    Mumaw, Randy; Feary, Mike

    2018-01-01

    This presentation presents an overview of work performed at NASA Ames Research Center in 2017. The work concerns the analysis of current aircraft system management displays, and the initial development of an interface for providing information about aircraft system status. The new interface proposes a shift away from current aircraft system alerting interfaces that report the status of physical components, and towards displaying the implications of degradations on mission capability. The proposed interface describes these component failures in terms of operational consequences of aircraft system degradations. The research activity was an effort to examine the utility of different representations of complex systems and operating environments to support real-time decision making of off-nominal situations. A specific focus was to develop representations that provide better integrated information to allow pilots to more easily reason about the operational consequences of the off-nominal situations. The work is also seen as a pathway to autonomy, as information is integrated and understood in a form that automated responses could be developed for the off-nominal situations in the future.

  17. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite-state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite-state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  18. Design Mechanism as Territorial Strategic Capability

    Directory of Open Access Journals (Sweden)

    Gianita BLEOJU

    2009-01-01

    Full Text Available The current exigencies that a territory must face in order to position itself optimally in future regional competition require the ability to design an appropriate mechanism that better valorizes the territory's capabilities. Such a construct is vital for sustainable territorial development and supposes the creation of a specific body of knowledge from the exploitation of distinctive local resources and from unique value creation and allocation. Territorial mechanism design is a typical management decision about the identification, ownership and control of specific strategic capabilities and their combination in a distinctive territorial portfolio. The most difficult responsibility is to allocate the territorial value added, which is a source of conflict among territorial components. Our current research covers the basics of two complementary territorial pillars - rural and tourism potential - and shows the lack of specific design mechanisms, which explains the current diminishing value of the Galati-Braila region. The proposed management system, relying upon a territorial control mechanism, will ensure a knowledge-sharing process via collaborative learning, with the final role of producing appropriate territorial attractivity signals and reinforcing identity as a key factor of territorial attractability. Our paper is fully documented on three years of data analysis from the territorial area of interest, which provides the necessary empirical contrast for our proposed solution.

  19. QBism the future of quantum physics

    CERN Document Server

    von Baeyer, Hans Christian

    2016-01-01

    Measured by the accuracy of its predictions and the scope of its technological applications, quantum mechanics is one of the most successful theories in science--as well as one of the most misunderstood. The deeper meaning of quantum mechanics remains controversial almost a century after its invention. Providing a way past quantum theory's paradoxes and puzzles, QBism offers a strikingly new interpretation that opens up for the nonspecialist reader the profound implications of quantum mechanics for how we understand and interact with the world. Short for Quantum Bayesianism, QBism adapts many of the conventional features of quantum mechanics in light of a revised understanding of probability. Bayesian probability, unlike the standard "frequentist probability," is defined as a numerical measure of the degree of an observer's belief that a future event will occur or that a particular proposition is true. Bayesianism's advantages over frequentist probability are that it is applicable to singular events, its pro...

  20. Enhancement of loss detection capability using a combination of the Kalman Filter/Linear Smoother and controllable unit accounting approach

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.

    1979-01-01

    An approach to loss detection is presented which combines the optimal loss detection capability of state estimation techniques with a controllable unit accounting approach. The state estimation theory makes use of a linear system model which is capable of modeling the interaction of various controllable unit areas within a given facility. An example is presented which illustrates the increase in loss detection probability which is realizable with state estimation techniques. Comparisons are made with a Shewhart Control Chart and the CUSUM statistic
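
    A minimal sketch of the kind of comparison the record describes, using a one-sided CUSUM chart on simulated material-balance data (the loss size, noise level and decision threshold are placeholders; the Kalman filter/linear smoother formulation of the paper is not reproduced here).

        # Loss detection on simulated material balances with a one-sided CUSUM statistic.
        # Measurement noise, loss magnitude and decision threshold are illustrative only.
        import numpy as np

        rng = np.random.default_rng(4)
        n_periods, sigma, loss, onset = 60, 1.0, 1.5, 30

        # Material balance each period: zero-mean noise, plus a protracted loss after 'onset'.
        balances = rng.normal(0.0, sigma, n_periods)
        balances[onset:] += loss

        k = 0.5 * sigma          # reference value (allowance)
        h = 5.0 * sigma          # decision threshold
        cusum, alarm = 0.0, None
        for t, x in enumerate(balances):
            cusum = max(0.0, cusum + x - k)
            if cusum > h and alarm is None:
                alarm = t

        print(f"loss begins at period {onset}, CUSUM alarm at period {alarm}")
        # A Kalman filter / linear smoother formulation additionally models the flow of
        # material between controllable unit areas, which is what the record combines
        # with this kind of sequential test to raise the loss-detection probability.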

  1. How Long Before NATO Aircraft Carrier Force Projection Capabilities Are Successfully Countered? Some effects of the fiscal crises

    Directory of Open Access Journals (Sweden)

    Lučev Josip

    2014-10-01

    Full Text Available Growth and fiscal policy conducive to economic development have been severely jeopardized in most NATO member countries since 2008. In sharp contrast, China has experienced only a relatively slower GDP growth, which it has mitigated with a fiscally expansionary outlook. Under these conditions, when can we expect the politico-military position of NATO to be challenged? This paper surveys amphibious force projection capabilities in six countries: the USA, the UK, France, Russia, India and the People's Republic of China (PRC). An assessment of the current capability for aircraft carrier building and a survey of carrier-related ambitions are undertaken to offer projections of probable aircraft carrier fleets by 2030. The three non-NATO countries are far better positioned to build aircraft carriers than the three NATO members, with China in the lead. Nevertheless, there is a high probability of the continued military dominance of the USA and NATO, but also of a military build-up focusing on the Indian Ocean.

  2. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.
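
    For reference (a standard formulation added by this note, not an equation reproduced from the paper), the experience-curve relation usually meant by the "experience curve hypothesis", with C_t the unit cost, Z_t cumulative production, b the experience exponent and eta_t a noise term:

        % Standard experience-curve (Wright's law) relation; notation is generic.
        \begin{aligned}
          C_t      &= C_0 \, Z_t^{-b} \, e^{\eta_t}, \\
          \log C_t &= \log C_0 - b \log Z_t + \eta_t .
        \end{aligned}

    The volatility studied in the record concerns the stochastic term eta_t and how its distribution propagates into the forecast of C_t.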

  3. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Table of contents (excerpt): Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  4. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  5. Incidents in nuclear research reactor examined by deterministic probability and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Lopes, Valdir Maciel

    2010-01-01

    This study aims to evaluate the potential risks posed by incidents in nuclear research reactors. For its development, two databases of the International Atomic Energy Agency (IAEA) were used: the Incident Report System for Research Reactor and the Research Reactor Data Base. The assessment used Probabilistic Safety Analysis (PSA), with a confidence level of 90%, together with Deterministic Probability Analysis (DPA). The probability calculations for the PSA followed the theory and equations of IAEA TECDOC-636 and were carried out with Scilab version 5.1.1, a freely available program executable on Windows and Linux platforms; a specific routine was developed within Scilab 5.1.1 to obtain the probabilities for two distributions, Fisher and chi-square, both at the 90% confidence level. Using the Sordi equations and the Origin 6.0 program, the maximum admissible doses required to satisfy the risk limits established by the International Commission on Radiological Protection (ICRP) were obtained, and these maximum doses were also presented graphically (figure 1) as probabilities versus maximum admissible doses. It was found that the reliability of the probability results is related to the operational experience (reactor-years and fractions thereof): the larger it is, the greater the confidence in the outcome. Finally, a list of suggested future work to complement this study is provided. (author)

  6. Social Capital, IT Capability, and the Success of Knowledge Management Systems

    Directory of Open Access Journals (Sweden)

    Irene Y.L. Chen

    2009-03-01

    Full Text Available Many organizations have implemented knowledge management systems to support knowledge management. However, many such systems have failed due to the lack of relationship networks and IT capability within organizations. Motivated by such concerns, this paper examines the factors that may facilitate the success of knowledge management systems. The ten constructs derived from social capital theory, the resource-based view and the IS success model are integrated into the current research model. Twenty-one hypotheses derived from the research model are empirically validated using a field survey of KMS users. The results suggest that social capital and organizational IT capability are important preconditions of the success of knowledge management systems. Among the posited relationships, trust, social interaction ties, and IT capability do not significantly impact service quality, system quality and IT capability, respectively. Against prior expectation, service quality and knowledge quality do not significantly influence perceived KMS benefits and user satisfaction, respectively. A discussion of the results and conclusions is provided. This study then provides insights for future research avenues.

  7. Expected Signal Observability at Future Experiments

    CERN Document Server

    Bartsch, Valeria

    2005-01-01

    Several methods to quantify the ''significance'' of an expected signal at future experiments have been used or suggested in the literature. In this note, comparisons are presented with a method based on the likelihood ratio of the ''background hypothesis'' and the ''signal-plus-background hypothesis''. A large number of Monte Carlo experiments are performed to investigate the properties of the various methods and to check whether the probability of a background fluctuation having produced the claimed significance of the discovery is properly described. In addition, the best possible separation between the two hypotheses should be provided; in other words, the discovery potential of a future experiment should be maximal. Finally, a practical method to apply a likelihood-based definition of the significance is suggested in this note. Signal and background contributions are determined from a likelihood fit based on shapes only, and the probability density distributions of the significance thus determined are found to be o...
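
    A minimal counting-experiment sketch (the expected signal and background counts are arbitrary assumptions, not values from the note): the significance is computed from the Poisson likelihood ratio of the background-only and signal-plus-background hypotheses, alongside the naive s/sqrt(b) figure often quoted.

        # Likelihood-ratio significance for a single-bin counting experiment.
        # s and b are made-up expected counts; n_obs is taken at its s+b expectation.
        import math

        def significance_lr(n_obs, b):
            # Z = sqrt(2 [ n ln(n / b) - (n - b) ]) from the Poisson likelihood ratio
            # against the background-only hypothesis (valid for n_obs > b).
            return math.sqrt(2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b)))

        s, b = 10.0, 30.0
        n_obs = s + b                      # "Asimov-like" choice: observe the expectation

        print(f"naive s/sqrt(b)   : {s / math.sqrt(b):.2f}")
        print(f"likelihood-ratio Z: {significance_lr(n_obs, b):.2f}")
        # The likelihood-ratio figure is the one whose tail probability corresponds to the
        # chance of the background alone fluctuating up to the claimed signal.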

  8. Probability-of-Superiority SEM (PS-SEM)—Detecting Probability-Based Multivariate Relationships in Behavioral Research

    Directory of Open Access Journals (Sweden)

    Johnson Ching-Hong Li

    2018-06-01

    Full Text Available In behavioral research, exploring bivariate relationships between variables X and Y based on the concept of probability-of-superiority (PS) has received increasing attention. Unlike the conventional, linear-based bivariate relationship (e.g., Pearson's correlation), PS holds that X and Y can be related based on their likelihood—e.g., a student who is above the mean in SAT has a 63% likelihood of achieving an above-mean college GPA. Despite its increasing attention, the concept of PS is restricted to a simple bivariate scenario (an X-Y pair), which hinders the development and application of PS in popular multivariate modeling such as structural equation modeling (SEM). Therefore, this study presents an empirically based simulation study that explores the potential of detecting PS-based relationships in SEM, called PS-SEM. The simulation results showed that the proposed PS-SEM method can detect and identify PS-based relationships when data follow them, thereby providing a useful method for researchers to explore PS-based SEM in their studies. Conclusions, implications, and future directions based on the findings are also discussed.
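
    A small simulation sketch of the PS idea behind the SAT/GPA example (the correlation of 0.4 is an assumed value, chosen only because it reproduces a roughly 63% figure for bivariate-normal data; it is not taken from the article).

        # Probability-of-superiority style summary for bivariate-normal data:
        # P(above-mean Y | above-mean X). The correlation 0.4 is an assumption.
        import numpy as np

        rng = np.random.default_rng(5)
        rho, n = 0.4, 1_000_000

        x = rng.standard_normal(n)
        y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

        p = np.mean(y[x > 0] > 0)          # P(Y above its mean, given X above its mean)
        print(f"P(above-mean Y | above-mean X) ~ {p:.3f}")    # about 0.63 for rho = 0.4
        # Analytically this equals 1/2 + arcsin(rho)/pi for standard bivariate normals.
        print(f"closed form                    : {0.5 + np.arcsin(rho) / np.pi:.3f}")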

  9. Are Differences Between Partners Always Detrimental? The Moderating Role of Future Connectedness

    OpenAIRE

    Simon Andrew Moss; Jasmine Dolan

    2014-01-01

    Whether perceived differences between romantic partners compromises or enhances relationships may depend on the characteristics of individuals. This study explores the possibility that differences in capabilities but not motives enhance relationship satisfaction—but only when the individuals feel connected to their future identity. In particular, when individuals feel connected to their future identity, their primary motivation is to accrue capabilities and resources that could be useful in s...

  10. Capability ethics

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological

  11. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  12. Representation of Probability Density Functions from Orbit Determination using the Particle Filter

    Science.gov (United States)

    Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell

    2012-01-01

    Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using the Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining higher order statistical information obtained using the PF. Methods such as the Principal Component Analysis (PCA) are based on utilizing up to second order statistics, hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios that involve a highly eccentric orbit with a lower a priori uncertainty covariance and a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.
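
    A minimal bootstrap particle filter sketch on a toy one-dimensional system (the dynamics, noise levels and particle count are placeholders, not the orbit-determination setting): the weighted, resampled particle set is the PDF representation whose fidelity grows with the number of particles, and which PCA or ICA would then be asked to compress.

        # Bootstrap particle filter on a toy nonlinear 1-D state-space model.
        # Model, noise levels and particle count are illustrative placeholders.
        import numpy as np

        rng = np.random.default_rng(6)
        n_steps, n_particles = 50, 2000
        q, r = 0.3, 0.5                                    # process / measurement noise std

        # Simulate a "truth" trajectory and noisy measurements of x**2 (non-Gaussian posterior).
        x_true = np.zeros(n_steps)
        y = np.zeros(n_steps)
        for t in range(1, n_steps):
            x_true[t] = 0.9 * x_true[t - 1] + 1.0 + q * rng.standard_normal()
            y[t] = x_true[t] ** 2 / 20.0 + r * rng.standard_normal()

        particles = rng.normal(0.0, 1.0, n_particles)
        for t in range(1, n_steps):
            particles = 0.9 * particles + 1.0 + q * rng.standard_normal(n_particles)  # propagate
            w = np.exp(-0.5 * ((y[t] - particles ** 2 / 20.0) / r) ** 2) + 1e-12      # weight
            w /= w.sum()
            particles = particles[rng.choice(n_particles, n_particles, p=w)]          # resample

        print(f"true final state     : {x_true[-1]:.2f}")
        print(f"particle mean +/- std: {particles.mean():.2f} +/- {particles.std():.2f}")
        # The resampled particle cloud is an empirical PDF of the state; its higher-order
        # structure is what an ICA-based compression would try to preserve better than PCA.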

  13. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Full Text Available Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education for future teachers. We have conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot into the daily practice of instruction at school. Centered on the following four questions, the status of statistics and probability was examined. Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  14. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    Full Text Available This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area’s hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions, and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information, leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of uncertainty affecting hydrological parameter estimation. This paper’s conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and maybe an obligatory procedure in the future. Its potential lies in treating scarce information and represents a robust modelling strategy for non-seasonal stochastic modelling conditions
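
    The abstract's combination of a multinomial model, confidence intervals, and probability boxes can be sketched as follows, under assumptions of my own (daily rainfall binned into classes, Clopper-Pearson beta intervals on each bin probability, and a crude p-box obtained by accumulating the bounds); all counts and class edges are illustrative, not the Manizales data:

      import numpy as np
      from scipy.stats import beta

      # Illustrative multinomial counts of daily rainfall in four classes (mm).
      bins = ["0-10", "10-30", "30-70", ">70"]
      counts = np.array([220, 95, 30, 5])
      n = counts.sum()
      alpha = 0.05                                  # 95% confidence per bin

      # Clopper-Pearson (beta) bounds on each bin probability.
      lower = np.nan_to_num(beta.ppf(alpha / 2, counts, n - counts + 1))
      upper = np.where(counts == n, 1.0, beta.ppf(1 - alpha / 2, counts + 1, n - counts))

      # A crude probability box on the CDF: accumulate the bounds and clip to [0, 1].
      cdf_lo = np.clip(np.cumsum(lower), 0, 1)
      cdf_hi = np.clip(np.cumsum(upper), 0, 1)
      for b, lo, hi in zip(bins, cdf_lo, cdf_hi):
          print(f"P(daily rainfall <= top of class {b}): between {lo:.3f} and {hi:.3f}")

      # Imprecise probability of exceeding the 70 mm threshold discussed in the paper.
      print(f"P(> 70 mm) lies in [{1 - cdf_hi[-2]:.3f}, {1 - cdf_lo[-2]:.3f}]")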

  15. Dynamic Capabilities

    DEFF Research Database (Denmark)

    Grünbaum, Niels Nolsøe; Stenger, Marianne

    2013-01-01

    The findings reveal a positive relationship between dynamic capabilities and innovation performance in the case enterprises, as we would expect. It was, however, not possible to establish a positive relationship between innovation performance and profitability. Nor was there any positive...... relationship between dynamic capabilities and profitability....

  16. The Armored Brigade Combat Team (ABCT) in the Future: An Assessment of Capabilities Against the Hybrid Threat in the Future Operational Environment

    Science.gov (United States)

    2013-06-13

    requirements, followed by a tactical case study assessment, and a strengths, weaknesses, opportunities, and threats (SWOT) analysis of the BCTs against a ... strategic and operational deployment. This information further developed the case study SWOT analysis of the BCTs against a hybrid threat. The SWOT ... a SWOT analysis. The next section addressed is the analysis. The analysis is comprised of the strategic capabilities assessment and the tactical

  17. Analysis of the performance capability of an infrared interior intrusion detector

    International Nuclear Information System (INIS)

    Dunn, D.R.

    1977-01-01

    Component performances are required by the LLL assessment procedure for material control and accounting (MC and A) systems. Monitors are an example of an MC and A component whose functions are to process measurements or observations for purposes of detecting abnormalities. This report develops a methodology for characterizing the performance of a class of infrared (IR) interior intrusion monitors or detectors. The methodology is developed around a specific commercial IR detector, the InfrAlarm, manufactured by Barnes Engineering Company (Models 19-124 and 19-115A). Statistical detection models for computing probabilities of detection and false alarms were derived, and the performance capability of the InfrAlarm IR detector was shown using these measures. The results obtained in the performance analysis show that the detection capability of the InfrAlarm is excellent (approx. 1), with very low false alarm rates, for a wide range in target characteristics. These results should be representative and particularly for non-hostile environments
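
    The report's detector-specific models are not reproduced in the abstract; the sketch below only illustrates, under an assumed Gaussian signal-plus-noise model, how a threshold detector's probability of detection and probability of false alarm are computed and traded off (all parameter values are invented for illustration):

      import numpy as np
      from scipy.stats import norm

      sigma = 1.0          # noise standard deviation at the detector output
      mu_s = 4.0           # assumed mean signal level produced by an intruder

      def p_false_alarm(threshold):
          # Probability that the noise-only output exceeds the alarm threshold.
          return norm.sf(threshold, loc=0.0, scale=sigma)

      def p_detection(threshold):
          # Probability that the signal-plus-noise output exceeds the alarm threshold.
          return norm.sf(threshold, loc=mu_s, scale=sigma)

      # Choose the threshold that meets a required false-alarm rate, then report P_D.
      for pfa_req in (1e-2, 1e-4, 1e-6):
          thr = norm.isf(pfa_req, loc=0.0, scale=sigma)
          print(f"P_FA = {p_false_alarm(thr):.0e}  threshold = {thr:.2f}  P_D = {p_detection(thr):.4f}")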

  18. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  19. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
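
    A small Monte Carlo illustration of the paper's central point, under assumptions of my own (normal log-losses, a threshold set from maximum-likelihood estimates on a finite sample): the realized failure frequency averages above the nominal level once parameter error is present.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      mu_true, sigma_true = 0.0, 1.0        # true parameters, unknown to the decision maker
      n, eps_nominal, trials = 50, 0.01, 20_000

      failures = 0
      for _ in range(trials):
          sample = rng.normal(mu_true, sigma_true, size=n)      # log of observed losses
          mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)
          # Threshold chosen so the *estimated* exceedance probability equals eps_nominal.
          threshold = mu_hat + sigma_hat * norm.ppf(1 - eps_nominal)
          # Does the next (true) realization exceed the control threshold?
          failures += rng.normal(mu_true, sigma_true) > threshold

      print("nominal failure probability:", eps_nominal)
      print("realized failure frequency :", failures / trials)   # typically above nominal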

  20. What is the role of community capabilities for maternal health? An exploration of community capabilities as determinants to institutional deliveries in Bangladesh, India, and Uganda

    Directory of Open Access Journals (Sweden)

    Ligia Paina

    2016-11-01

    Full Text Available Abstract Background While community capabilities are recognized as important factors in developing resilient health systems and communities, appropriate metrics for these have not yet been developed. Furthermore, the role of community capabilities on access to maternal health services has been underexplored. In this paper, we summarize the development of a community capability score based on the Future Health System (FHS) project’s experience in Bangladesh, India, and Uganda, and examine the role of community capabilities as determinants of institutional delivery in these three contexts. Methods We developed a community capability score using a pooled dataset containing cross-sectional household survey data from Bangladesh, India, and Uganda. Our main outcome of interest was whether the woman delivered in an institution. Our predictor variables included the community capability score, as well as a series of previously identified determinants of maternal health. We calculate both population-averaged effects (using GEE logistic regression), as well as sub-national level effects (using a mixed effects model). Results Our final sample for analysis included 2775 women, of which 1238 were from Bangladesh, 1199 from India, and 338 from Uganda. We found that individual-level determinants of institutional deliveries, such as maternal education, parity, and ante-natal care access were significant in our analysis and had a strong impact on a woman’s odds of delivering in an institution. We also found that, in addition to individual-level determinants, greater community capability was significantly associated with higher odds of institutional delivery. For every additional capability, the odds of institutional delivery would increase by up to almost 6%. Conclusion Individual-level characteristics are strong determinants of whether a woman delivered in an institution. However, we found that community capability also plays an important role, and should be
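
    The survey data themselves are not available here, but the population-averaged model described in the Methods can be sketched with statsmodels: a GEE logistic regression of institutional delivery on the community capability score and individual-level covariates, with women clustered by community. All variable and file names below are hypothetical.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # One row per woman; hypothetical file standing in for the pooled FHS dataset.
      df = pd.read_csv("pooled_fhs_survey.csv")

      model = smf.gee(
          "institutional_delivery ~ capability_score + mother_education + parity"
          " + anc_visits + C(country)",
          groups="community_id",                   # clustering unit for the GEE
          data=df,
          family=sm.families.Binomial(),           # logistic link for the binary outcome
          cov_struct=sm.cov_struct.Exchangeable(), # working correlation within communities
      )
      result = model.fit()
      print(result.summary())

      # Odds ratio for one additional community capability.
      print("OR per capability:", np.exp(result.params["capability_score"]))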

  1. Capabilities and Incapabilities of the Capabilities Approach to Health Justice.

    Science.gov (United States)

    Selgelid, Michael J

    2016-01-01

    The first part of this article critiques Sridhar Venkatapuram's conception of health as a capability. It argues that Venkatapuram relies on the problematic concept of dignity, implies that those who are unhealthy lack lives worthy of dignity (which seems politically incorrect), sets a low bar for health, appeals to metaphysically problematic thresholds, fails to draw clear connections between appealed-to capabilities and health, and downplays the importance/relevance of health functioning. It concludes by questioning whether justice entitlements should pertain to the capability for health versus health achievements, challenging Venkatapuram's claims about the strength of health entitlements, and demonstrating that the capabilities approach is unnecessary to address social determinants of health. © 2016 John Wiley & Sons Ltd.

  2. Uprated OMS engine status and future applications

    Science.gov (United States)

    Boyd, W. C.; Brasher, W. L.

    1986-01-01

    The baseline Orbital Maneuvering Engine (OME) of the Space Shuttle has the potential for significant performance uprating, leading to increased Shuttle performance capability. The approach to uprating that is being pursued at the NASA Lyndon B. Johnson Space Center is the use of a gas generator-driven turbopump to increase OME operating pressure. A higher pressure engine can have a greater nozzle expansion ratio in the same envelope and at the same thrust level, giving increased engine Isp. The results of trade studies and analyses that have led to the preferred uprated OME configuration are described. The significant accomplishments of a pre-development component demonstration program are also presented, including descriptions of test hardware and discussion of test results. It is shown that testing to date confirms the capability of the preferred uprated OME configuration to meet or exceed performance and life requirements. Potential future activities leading up to a full-scale development program are described, and the capability for the uprated OME to be used in future storable propellant upper stages is discussed.

  3. Automatic capability to store and retrieve component data and to calculate structural integrity of these components

    International Nuclear Information System (INIS)

    McKinnis, C.J.; Toor, P.M.

    1985-01-01

    In structural analysis, assimilation of material, geometry, and service history input parameters is very cumbersome. Quite often with changing service history and revised material properties and geometry, an analysis has to be repeated. To overcome the above mentioned difficulties, a computer program was developed to provide the capability to establish a computerized library of all material, geometry, and service history parameters for components. The program also has the capability to calculate the structural integrity based on the Arrhenius type equations, including the probability calculations. This unique combination of computerized input information storage and automated analysis procedure assures consistency, efficiency, and accuracy when the hardware integrity has to be reassessed
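
    The report's actual equations are not given in the abstract; the sketch below only illustrates the general Arrhenius-type pattern it refers to, accumulating temperature-dependent damage over a stored service history and turning scatter in the damage capacity into a failure probability. Every constant, distribution, and field name here is an assumption.

      import numpy as np
      from scipy.stats import lognorm

      R = 8.314                     # gas constant, J/(mol K)
      A, Ea = 1.0e7, 1.2e5          # illustrative pre-factor (1/h) and activation energy (J/mol)

      # Stored service history for one component: (temperature in K, hours at temperature).
      service_history = [(550.0, 8000.0), (600.0, 2000.0), (575.0, 4000.0)]

      # Arrhenius rate k(T) = A * exp(-Ea / (R T)); damage accumulates as sum of k(T_i) * t_i.
      damage = sum(A * np.exp(-Ea / (R * T)) * hours for T, hours in service_history)

      # Assume the damage capacity at failure is log-normally distributed (median 1.0,
      # 30% logarithmic scatter); failure probability = P(capacity < accumulated damage).
      capacity = lognorm(s=0.3, scale=1.0)
      print(f"accumulated damage index: {damage:.2f}")
      print(f"failure probability     : {capacity.cdf(damage):.3f}")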

  4. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  5. Future buildings Forum-2025: Toward a methodology for future buildings research

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, R.S.

    1990-10-01

    The purpose of this paper is to explore methods that could be used in studying buildings of the future. The methodology that the forum will develop will have a number of likely applications, among them: the development of research agendas for new building energy technologies; the development of information and analytical capabilities usable by other IEA annexes to address their technology assessment needs; and the generation of information that can serve as input to global energy models designed to inform energy policy decisions. This paper is divided into two major sections. The first is an overview of existing methods of futures research. Terms and concepts are explained, providing the basis for the second section. The second section proposes a framework and general methodology for studying future buildings. This preliminary, or strawman, methodology is intended to provoke early thinking and discussions on how the research should be approached. 24 refs., 8 figs.

  6. Core Capabilities and Technical Enhancement -- FY-98 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David Lynn

    1999-04-01

    The Core Capability and Technical Enhancement (CC&TE) Program, a part of the Verification, Validation, and Engineering Assessment Program, was implemented to enhance and augment the technical capabilities of the Idaho National Engineering and Environmental Laboratory (INEEL). The purpose for strengthening the technical capabilities of the INEEL is to provide the technical base to serve effectively as the Environmental Management Laboratory for the Office of Environmental Management (EM). An analysis of EM's science and technology needs as well as the technology investments currently being made by EM across the complex was used to formulate a portfolio of research activities designed to address EM's needs without overlapping work being done elsewhere. An additional purpose is to enhance and maintain the technical capabilities and research infrastructure at the INEEL. This is a progress report for fiscal year 1998 for the five CC&TE research investment areas: (a) transport aspects of selective mass transport agents, (b) chemistry of environmental surfaces, (c) materials dynamics, (d) characterization science, and (e) computational simulation of mechanical and chemical systems. In addition to the five purely technical research areas, this report deals with the science and technology foundations element of the CC&TE from the standpoint of program management and complex-wide issues. This report also provides details of ongoing and future work in all six areas.

  7. Core capabilities and technical enhancement, FY-98 annual report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, D.L.

    1999-04-01

    The Core Capability and Technical Enhancement (CCTE) Program, a part of the Verification, Validation, and Engineering Assessment Program, was implemented to enhance and augment the technical capabilities of the Idaho National Engineering and Environmental Laboratory (INEEL). The purpose for strengthening the technical capabilities of the INEEL is to provide the technical base to serve effectively as the Environmental Management Laboratory for the Department of Energy's Office of Environmental Management (EM). An analysis of EM's science and technology needs as well as the technology investments currently being made by EM across the complex was used to formulate a portfolio of research activities designed to address EM's needs without overlapping work being done elsewhere. An additional purpose is to enhance and maintain the technical capabilities and research infrastructure at the INEEL. This is a progress report for fiscal year 1998 for the five CCTE research investment areas: (a) transport aspects of selective mass transport agents, (b) chemistry of environmental surfaces, (c) materials dynamics, (d) characterization science, and (e) computational simulation of mechanical and chemical systems. In addition to the five purely technical research areas, this report deals with the science and technology foundations element of the CCTE from the standpoint of program management and complex-wide issues. This report also provides details of ongoing and future work in all six areas.

  8. REDUCTIONS WITHOUT REGRET: DEFINING THE NEEDED CAPABILITIES

    Energy Technology Data Exchange (ETDEWEB)

    Swegle, J.; Tincher, D.

    2013-09-10

    This is the second of three papers (in addition to an introductory summary) aimed at providing a framework for evaluating future reductions or modifications of the U.S. nuclear force, first by considering previous instances in which nuclear-force capabilities were eliminated; second by looking forward into at least the foreseeable future at the features of global and regional deterrence (recognizing that new weapon systems currently projected will have expected lifetimes stretching beyond our ability to predict the future); and third by providing examples of past or possible undesirable outcomes in the shaping of the future nuclear force, as well as some closing thoughts for the future. This paper begins with a discussion of the current nuclear force and the plans and procurement programs for the modernization of that force. Current weapon systems and warheads were conceived and built decades ago, and procurement programs have begun for the modernization or replacement of major elements of the nuclear force: the heavy bomber, the air-launched cruise missile, the ICBMs, and the ballistic-missile submarines. In addition, the Nuclear Weapons Council has approved a new framework for nuclear-warhead life extension, not yet fully fleshed out, that aims to reduce the current number of nuclear explosives from seven to five, the so-called 3+2 vision. This vision includes three interoperable warheads for both ICBMs and SLBMs (thus eliminating one backup weapon) and two warheads for aircraft delivery (one gravity bomb and one cruise-missile, eliminating a second backup gravity bomb). This paper also includes a discussion of the current and near-term nuclear-deterrence mission, both global and regional, and offers some observations on the future of the strategic deterrence mission and the challenges of regional and extended nuclear deterrence.

  9. Capabilities for Strategic Adaptation

    DEFF Research Database (Denmark)

    Distel, Andreas Philipp

    This dissertation explores capabilities that enable firms to strategically adapt to environmental changes and preserve competitiveness over time – often referred to as dynamic capabilities. While dynamic capabilities are a popular research domain, too little is known about what these capabiliti...

  10. Changing Knowledge, Changing Technology: Implications for Teacher Education Futures

    Science.gov (United States)

    Burden, Kevin; Aubusson, Peter; Brindley, Sue; Schuck, Sandy

    2016-01-01

    Recent research in teacher education futures has identified two themes that require further study: the changing nature of knowledge and the changing capabilities of technologies. This article examines the intersection of these two themes and their implications for teacher education. The research employed futures methodologies based on scenario…

  11. Addition of liquid waste incineration capability to the INEL's low-level waste incinerator

    International Nuclear Information System (INIS)

    Steverson, E.M.; Clark, D.P.; McFee, J.N.

    1986-01-01

    A liquid waste system has recently been installed in the Waste Experimental Reduction Facility (WERF) incinerator at the Idaho National Engineering Laboratory (INEL). In this paper, aspects of the incineration system such as the components, operations, capabilities, capital cost, EPA permit requirements, and future plans are discussed. The principal objective of the liquid incineration system is to provide the capability to process hazardous, radioactively contaminated, non-halogenated liquid wastes. The system consists primarily of a waste feed system, instrumentation and controls, and a liquid burner, which were procured at a capital cost of $115,000

  12. The Sentry Autonomous Underwater Vehicle: Field Trial Results and Future Capabilities

    Science.gov (United States)

    Yoerger, D. R.; Bradley, A. M.; Martin, S. C.; Whitcomb, L. L.

    2006-12-01

    The Sentry autonomous underwater vehicle combines an efficient long range survey capability with the ability to maneuver at low speeds. These attributes will permit Sentry to perform a variety of conventional and unconventional surveys including long range sonar surveys, hydrothermal plume surveys and near-bottom photo surveys. Sentry's streamlined body and fore and aft tilting planes, each possessing an independently controlled thruster, enable efficient operation in both near-bottom and cruising operations. Sentry is capable of being configured in two modes: hover mode, which commands Sentry's control surfaces to be aligned vertically, and forward flight mode, which allows Sentry's control surfaces to actuate between plus or minus 45 degrees. Sentry is equipped for full 6-degree-of-freedom position measurement. Vehicle heading, roll, and pitch are instrumented with a TCM2 PNI heading and attitude sensor. A Systron Donner yaw rate sensor instruments heading rate. Depth is instrumented by a Paroscientific depth sensor. A 300kHz RD Instruments Doppler Sonar provides altitude and XYZ velocity measurements. In April 2006, we conducted our first deep water field trials of Sentry in Bermuda. These trials enabled us to examine a variety of issues, including the control software, vehicle safety systems, launch and recovery procedures, operation at depth, heading and depth controllers over a range of speeds, and power consumption. Sentry employs a control system based upon the Jason 2 control system for low-level control, which has proven effective and reliable over several hundred deep-water dives. The Jason 2 control system, developed jointly at Johns Hopkins University and Woods Hole Oceanographic Institution, was augmented to manage Sentry-specific devices (sensors, actuators, and power storage) and to employ a high-level mission controller that supported autonomous mission scripting and error detection and response. This control suite will also support the Nereus

  13. Probability and uncertainty in nuclear safety decisions

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1986-01-01

    In this paper, we examine some problems posed by the use of probabilities in Nuclear Safety decisions. We discuss some of the theoretical difficulties due to the collective nature of regulatory decisions, and, in particular, the calibration and the aggregation of risk information (e.g., experts opinions). We argue that, if one chooses numerical safety goals as a regulatory basis, one can reduce the constraints to an individual safety goal and a cost-benefit criterion. We show the relevance of risk uncertainties in this kind of regulatory framework. We conclude that, whereas expected values of future failure frequencies are adequate to show compliance with economic constraints, the use of a fractile (e.g., 95%) to be specified by the regulatory agency is justified to treat hazard uncertainties for the individual safety goal. (orig.)

  14. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  15. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  16. Hyperbolic Discounting of the Far-Distant Future

    OpenAIRE

    Anchugina, Nina; Ryan, Matthew; Slinko, Arkadii

    2017-01-01

    We prove an analogue of Weitzman's (1998) famous result that an exponential discounter who is uncertain of the appropriate exponential discount rate should discount the far-distant future using the lowest (i.e., most patient) of the possible discount rates. Our analogous result applies to a hyperbolic discounter who is uncertain about the appropriate hyperbolic discount rate. In this case, the far-distant future should be discounted using the probability-weighted harmonic mean of the possible...
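
    A short sketch (my own, not the paper's proof) of why the probability-weighted harmonic mean appears: with hyperbolic discount functions D_i(t) = 1/(1 + r_i t) held with probabilities p_i, the expected discount factor for large t behaves like a single hyperbolic factor whose rate is the weighted harmonic mean of the candidate rates.

      \[
      \mathbb{E}[D(t)] \;=\; \sum_i \frac{p_i}{1 + r_i t}
      \;\sim\; \frac{1}{t}\sum_i \frac{p_i}{r_i}
      \;=\; \frac{1}{r^{*} t}
      \;\approx\; \frac{1}{1 + r^{*} t},
      \qquad
      r^{*} \;=\; \Bigl(\sum_i \frac{p_i}{r_i}\Bigr)^{-1}.
      \]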

  17. Physics at Future Hadron Colliders

    CERN Document Server

    Baur, U.; Parsons, J.; Albrow, M.; Denisov, D.; Han, T.; Kotwal, A.; Olness, F.; Qian, J.; Belyaev, S.; Bosman, M.; Brooijmans, G.; Gaines, I.; Godfrey, S.; Hansen, J.B.; Hauser, J.; Heintz, U.; Hinchliffe, I.; Kao, C.; Landsberg, G.; Maltoni, F.; Oleari, C.; Pagliarone, C.; Paige, F.; Plehn, T.; Rainwater, D.; Reina, L.; Rizzo, T.; Su, S.; Tait, T.; Wackeroth, D.; Vataga, E.; Zeppenfeld, D.

    2001-01-01

    We discuss the physics opportunities and detector challenges at future hadron colliders. As guidelines for energies and luminosities we use the proposed luminosity and/or energy upgrade of the LHC (SLHC), and the Fermilab design of a Very Large Hadron Collider (VLHC). We illustrate the physics capabilities of future hadron colliders for a variety of new physics scenarios (supersymmetry, strong electroweak symmetry breaking, new gauge bosons, compositeness and extra dimensions). We also investigate the prospects of doing precision Higgs physics studies at such a machine, and list selected Standard Model physics rates.

  18. Organizational knowledge and capabilities in healthcare: Deconstructing and integrating diverse perspectives

    Science.gov (United States)

    Evans, Jenna M; Brown, Adalsteinn; Baker, G Ross

    2017-01-01

    Diverse concepts and bodies of work exist in the academic literature to guide research and practice on organizational knowledge and capabilities. However, these concepts have largely developed in parallel with minimal cross-fertilization, particularly in the healthcare domain. This contributes to confusion regarding conceptual boundaries and relationships, and to a lack of application of potentially useful evidence. The aim of this article is to assess three concepts associated with organizational knowledge content—intellectual capital, organizational core competencies, and dynamic capabilities—and to propose an agenda for future research. We conducted a literature review to identify and synthesize papers that apply the concepts of intellectual capital, organizational core competencies, and dynamic capabilities in healthcare settings. We explore the meaning of these concepts, summarize and critique associated healthcare research, and propose a high-level framework for conceptualizing how the concepts are related to each other. To support application of the concepts in practice, we conducted a case study of a healthcare organization. Through document review and interviews with current and former leaders, we identify and describe the organization’s intellectual capital, organizational core competencies, and dynamic capabilities. The review demonstrates that efforts to identify, understand, and improve organizational knowledge have been limited in health services research. In the literature on healthcare, we identified 38 papers on intellectual capital, 4 on core competencies, and 5 on dynamic capabilities. We link these disparate fields of inquiry by conceptualizing the three concepts as distinct, but overlapping concepts influenced by broader organizational learning and knowledge management processes. To aid healthcare researchers in studying and applying a knowledge-based view of organizational performance, we propose an agenda for future research involving

  19. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  20. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric......-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing...... for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
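
    A one-line illustration of the link the paper examines, using standard credit-pricing algebra rather than anything taken from the article: for a one-period bond with loss-given-default LGD, the yield spread s reveals the risk-neutral default probability q, which can then be set against the actual (rating-based) probability p.

      \[
      s \;\approx\; q \cdot \mathrm{LGD}
      \;\;\Longrightarrow\;\;
      q \;\approx\; \frac{s}{\mathrm{LGD}},
      \qquad
      \text{e.g. } s = 200\ \text{bp},\ \mathrm{LGD} = 0.6
      \;\Rightarrow\; q \approx 3.3\%,
      \]

    so a bond whose historical default probability is p = 0.5% trades at a ratio q/p of roughly 6.7; it is the large cross-sectional variation in this ratio that the paper connects to differing sensitivity to news.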

  4. Future climate

    International Nuclear Information System (INIS)

    La Croce, A.

    1991-01-01

    According to George Woodwell, founder of the Woods Hole Research Center, due to the combustion of fossil fuels, deforestation and accelerated respiration, the net annual increase of carbon, in the form of carbon dioxide, to the 750 billion tonnes already present in the earth's atmosphere, is in the order of 3 to 5 billion tonnes. Around the world, scientists, investigating the probable effects of this increase on the earth's future climate, are now formulating coupled air and ocean current models which take account of water temperature and salinity dependent carbon dioxide exchange mechanisms acting between the atmosphere and deep layers of ocean waters

  5. Materials Capability Review Los Alamos National Laboratory May 4-7, 2009

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Antoniette J [Los Alamos National Laboratory

    2009-01-01

    Los Alamos National Laboratory (LANL) uses external peer review to measure and continuously improve the quality of its science, technology and engineering (STE). LANL uses capability reviews to assess the STE quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. STE capabilities are defined to cut across directorates, providing a more holistic view of the STE quality, integration to achieve mission requirements, and mission relevance. The scope of these capabilities necessitates that there will be significant overlap in technical areas covered by capability reviews (e.g., materials research and weapons science and engineering). In addition, LANL staff may be reviewed in different capability reviews because of their varied assignments and expertise. LANL plans to perform a complete review of the Laboratory's STE capabilities (hence staff) in a three-year cycle. The principal product of an external review is a report that includes the review committee's assessments, commendations, and recommendations for STE. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). This report will be used by Laboratory Management for STE assessment and planning. The report is also provided to the Department of Energy (DOE) as part of LANL's Annual Performance Plan and to the Los Alamos National Security (LANS) LLC's Science and Technology Committee (STC) as part of its responsibilities to the LANS Board of Governors. LANL has defined fourteen

  6. Implementation of the Land, Atmosphere Near Real-Time Capability for EOS (LANCE)

    Science.gov (United States)

    Michael, Karen; Murphy, Kevin; Lowe, Dawn; Masuoka, Edward; Vollmer, Bruce; Tilmes, Curt; Teague, Michael; Ye, Gang; Maiden, Martha; Goodman, H. Michael

    2010-01-01

    The past decade has seen a rapid increase in availability and usage of near real-time data from satellite sensors. Applications have demonstrated the utility of timely data in a number of areas ranging from numerical weather prediction and forecasting, to monitoring of natural hazards, disaster relief, agriculture and homeland security. As applications mature, the need to transition from prototypes to operational capabilities presents an opportunity to improve current near real-time systems and inform future capabilities. This paper presents NASA's effort to implement a near real-time capability for land and atmosphere data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS), Atmospheric Infrared Sounder (AIRS), Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E), Microwave Limb Sounder (MLS) and Ozone Monitoring Instrument (OMI) instruments on the Terra, Aqua, and Aura satellites. Index Terms: Real time systems, Satellite applications

  7. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
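
    The spreadsheet exercise described above can be mimicked in a few lines. The sketch below uses recurrence numbers that are purely illustrative, not the article's data, and shows how the 50-year probability swings depending on whether the mean recurrence interval comes from the full record or from the recent cluster, and on whether a time-independent (Poisson) or a time-dependent (renewal) model is assumed.

      import numpy as np
      from scipy.stats import lognorm

      window = 50.0        # forecast window in years
      elapsed = 320.0      # approximate years since the 1700 AD earthquake

      # Illustrative mean recurrence intervals (years).
      scenarios = {"full paleoseismic record": 500.0, "recent cluster only": 330.0}

      for name, mean_interval in scenarios.items():
          # Time-independent (Poisson) model: probability of at least one event.
          p_poisson = 1.0 - np.exp(-window / mean_interval)

          # Time-dependent renewal model: lognormal recurrence with 0.5 log-scatter,
          # conditioned on having gone 'elapsed' years without an event.
          # (scale chosen so the distribution mean equals mean_interval)
          dist = lognorm(s=0.5, scale=mean_interval * np.exp(-0.5 * 0.5**2))
          p_renewal = (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

          print(f"{name:26s}  Poisson P = {p_poisson:.2f}   renewal P = {p_renewal:.2f}")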

  8. Advanced Ceramic Materials for Future Aerospace Applications

    Science.gov (United States)

    Misra, Ajay

    2015-01-01

    With a growing trend toward higher temperature capabilities, lightweight design, and multifunctionality, significant advances in ceramic matrix composites (CMCs) will be required for future aerospace applications. The presentation will provide an overview of material requirements for future aerospace missions, and the role of ceramics and CMCs in meeting those requirements. Aerospace applications will include gas turbine engines, aircraft structure, hypersonic and access to space vehicles, space power and propulsion, and space communication.

  9. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints and the appropriate ensemble is elucidated.
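
    A compact reminder of how the maximum entropy assignment works (standard algebra, not the paper's own derivation): maximizing the entropy subject to normalization and a mean-value constraint yields an exponential-family distribution, which is how a Maxwell-Boltzmann form arises from a mean-energy constraint; different choices of propositions and constraints lead to the Fermi-Dirac and Bose-Einstein cases mentioned in the abstract.

      \[
      \max_{p}\; -\sum_i p_i \ln p_i
      \quad \text{s.t.} \quad
      \sum_i p_i = 1, \;\; \sum_i p_i E_i = \langle E \rangle
      \;\;\Longrightarrow\;\;
      p_i \;=\; \frac{e^{-\beta E_i}}{Z},
      \qquad
      Z = \sum_i e^{-\beta E_i},
      \]

    with the multiplier \(\beta\) fixed by the energy constraint.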

  10. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    Science.gov (United States)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

    This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "To identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) advanced Uncertainty Models, (4) Virtual Testing Models, and (5) space-based Robotics Manufacture and Servicing Models.

  11. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  12. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probabilistic calculations, often used for aftershock hazard assessment, and is based on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates that the slow decay was notably significant. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
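
    A schematic version of the forecasting recipe described above, under parameter values that are assumed rather than taken from the study: integrate an Omori-Utsu rate for events above the catalogue threshold, scale it to M >= 6 with the Gutenberg-Richter b-value, and convert the expected count into a probability with a Poisson assumption.

      import numpy as np

      # Assumed Omori-Utsu parameters for M >= Mc activity after the mainshock:
      # rate(t) = K / (t + c)**p  [events per day], t in days since 2011 March 11.
      K, c, p = 2.0, 0.1, 0.5        # illustrative values (the study estimates p ~ 0.5)
      b = 1.0                        # Gutenberg-Richter b-value (the study finds b ~ 1)
      Mc, M_target = 4.0, 6.0

      def expected_count(t1, t2):
          """Expected number of M >= M_target events between t1 and t2 days (p != 1)."""
          integral = K * ((t2 + c)**(1 - p) - (t1 + c)**(1 - p)) / (1 - p)
          # Gutenberg-Richter scaling from the catalogue threshold Mc to M_target.
          return integral * 10 ** (-b * (M_target - Mc))

      # Probability of at least one M >= 6 event in the next 1 and 3 years, starting
      # from 2012 May 30 (roughly 446 days after the mainshock).
      t0 = 446.0
      for years in (1, 3):
          n = expected_count(t0, t0 + 365.25 * years)
          print(f"next {years} yr: expected M>=6 count = {n:.2f}, P(>=1) = {1 - np.exp(-n):.2f}")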

  13. Stochastic Economic Dispatch with Wind using Versatile Probability Distribution and L-BFGS-B Based Dual Decomposition

    DEFF Research Database (Denmark)

    Huang, Shaojun; Sun, Yuanzhang; Wu, Qiuwei

    2018-01-01

    This paper focuses on economic dispatch (ED) in power systems with intermittent wind power, which is a very critical issue in future power systems. A stochastic ED problem is formed based on the recently proposed versatile probability distribution (VPD) of wind power. The problem is then analyzed...
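
    Only the outline of the method is visible in the truncated abstract, so the sketch below shows the generic pattern the title refers to, with assumptions of my own: a deterministic two-generator dispatch decomposed by relaxing the power-balance constraint, with the dual (price) variable updated by SciPy's L-BFGS-B.

      import numpy as np
      from scipy.optimize import minimize

      # Quadratic generator costs C_i(p) = a_i p^2 + b_i p with box limits, and demand D.
      a = np.array([0.02, 0.05])
      b_lin = np.array([10.0, 8.0])
      p_min = np.array([0.0, 0.0])
      p_max = np.array([150.0, 100.0])
      D = 180.0

      def subproblem(lam):
          """Each generator minimizes C_i(p) - lam * p on its own (closed form, clipped)."""
          return np.clip((lam - b_lin) / (2 * a), p_min, p_max)

      def neg_dual(lam_arr):
          lam = lam_arr[0]
          p = subproblem(lam)
          dual = np.sum(a * p**2 + b_lin * p - lam * p) + lam * D
          return -dual                      # L-BFGS-B minimizes, so negate the concave dual

      res = minimize(neg_dual, x0=[10.0], method="L-BFGS-B", bounds=[(0.0, 100.0)])
      price = res.x[0]
      dispatch = subproblem(price)
      print(f"price = {price:.2f}, dispatch = {dispatch}, total = {dispatch.sum():.1f} MW")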

  14. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  15. Study for Safeguards Challenges to the Most Probably First Indonesian Future Power Plant of the Pebble Bed Modular Reactor

    International Nuclear Information System (INIS)

    Susilowati, E.

    2015-01-01

    In the near future, Indonesia, the fourth most populous country, plans to build a small nuclear power plant, most probably a Pebble Bed Modular Reactor (PBMR). This first nuclear power plant (NPP) is intended to give the public a clear picture of the performance and safety of nuclear power plant operation. Selection of the PBMR is based on several factors, including the combination of a small reactor size and a fuel type that allows the use of passive safety systems, resulting in essential advantages in plant design and less dependence on plant operators for safety. From a safeguards perspective, this reactor type also differs considerably from previous light water reactor (LWR) designs. The presence of a large number of small fuel elements produced without individual serial numbers, combined with on-line refueling similar to that of the CANDU reactor, poses a new challenge to the safeguards approach for this reactor type. This paper discusses the safeguards measures the facility operator has to prepare to successfully support international nuclear material and facility verification, including elements of design relevant to safeguards that need to be settled in consultation with the regulatory body, the supplier or designer, and the Agency/IAEA, such as nuclear material balance areas and key measurement points; possible diversion scenarios and safeguards strategy; and design features relevant to the IAEA equipment that has to be installed at the reactor facility. It is expected that the results of this discussion will support the Agency in developing the safeguards measures that may be applied to the construction and operation of Indonesia's first PBMR power plant. (author)

  16. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
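
    The specific kernel approximations are in the paper itself; the relation below is only the standard starting point for such integral-equation formulations, written in notation of my own: the first-passage probability density f(t) equals the unconditional out-crossing rate minus the contribution of realizations that already crossed earlier.

      \[
      f(t) \;=\; \nu^{+}(t) \;-\; \int_{0}^{t} K(t \mid \tau)\, f(\tau)\, d\tau,
      \qquad
      P_f(t) \;=\; \int_{0}^{t} f(\tau)\, d\tau,
      \]

    where \(\nu^{+}(t)\) is the out-crossing rate of the barrier and \(K(t \mid \tau)\) is the rate of out-crossings at time t conditional on a first passage at time \(\tau\); approximating the kernel then yields computable estimates of the kind the abstract describes.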

  17. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  18. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
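
    Two of the surveyed techniques can be illustrated in a few lines of Python, using invented data rather than the Yucca Mountain values: maximum-likelihood fitting of a continuous distribution with SciPy, followed by a simple grid-based Bayesian update of a prior when new observations become available.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      # (a) Fit a log-normal by maximum likelihood and check the fit.
      data = rng.lognormal(mean=1.0, sigma=0.4, size=200)     # stand-in for measured values
      shape, loc, scale = stats.lognorm.fit(data, floc=0)     # MLE with location fixed at 0
      ks = stats.kstest(data, "lognorm", args=(shape, loc, scale))
      print(f"MLE: sigma = {shape:.3f}, median = {scale:.3f}, KS p-value = {ks.pvalue:.2f}")

      # (b) Bayesian updating on a grid: prior belief about the median, revised with new data.
      median_grid = np.linspace(1.0, 6.0, 501)
      prior = stats.norm.pdf(median_grid, loc=3.0, scale=1.0)  # informal expert-judgment prior
      prior /= prior.sum()

      new_obs = rng.lognormal(mean=1.0, sigma=0.4, size=10)    # newly available information
      log_like = np.array([stats.lognorm.logpdf(new_obs, shape, 0, m).sum() for m in median_grid])
      posterior = prior * np.exp(log_like - log_like.max())
      posterior /= posterior.sum()
      print("posterior mean of the median:", (median_grid * posterior).sum())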

  19. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  20. Mobile Test Capabilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Electrical Power Mobile Test capabilities are utilized to conduct electrical power quality testing on aircraft and helicopters. This capability allows that the...

  1. Radiation processing. Current status and future possibilities

    International Nuclear Information System (INIS)

    Woods, R.J.

    2000-01-01

    Radiation processing developed following the Second World War and employs gamma- or electron-irradiation to process polymers, cure alkene-based inks and coatings, sterilize medical supplies, irradiate food, and manage wastes. The current status of these applications is described with the probable direction of future developments. (author)

  2. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  3. Implementing NASA's Capability-Driven Approach: Insight into NASA's Processes for Maturing Exploration Systems

    Science.gov (United States)

    Williams-Byrd, Julie; Arney, Dale; Rodgers, Erica; Antol, Jeff; Simon, Matthew; Hay, Jason; Larman, Kevin

    2015-01-01

    , developing maturation plans and roadmaps for the identified performance gaps, specifying the interfaces between the various capabilities, and ensuring that the capabilities mature and integrate to enable future pioneering missions. By managing system development through the SMTs instead of traditional NASA programs and projects, the Agency is shifting from mission-driven development to a more flexible, capability-driven development. The process NASA uses to establish, integrate, prioritize, and manage the SMTs and associated capabilities is iterative. NASA relies on the Human Exploration and Operation Mission Directorate's SMT Integration Team within Advanced Exploration Systems to coordinate and facilitate the SMT process. The SMT Integration team conducts regular reviews and coordination meetings among the SMTs and has developed a number of tools to help the Agency implement capability driven processes. The SMT Integration team is uniquely positioned to help the Agency coordinate the SMTs and other processes that are making the capability-driven approach a reality. This paper will introduce the SMTs and the 12 key capabilities they represent. The role of the SMTs will be discussed with respect to Agency-wide processes to shift from mission-focused exploration to a capability-driven pioneering approach. Specific examples will be given to highlight systems development and testing within the SMTs. These examples will also show how NASA is using current investments in the International Space Station and future investments to develop and demonstrate capabilities. The paper will conclude by describing next steps and a process for soliciting feedback from the space exploration community to refine NASA's process for developing common exploration capabilities.

  4. Global mega forces: Implications for the future of natural resources

    Science.gov (United States)

    George H. Kubik

    2012-01-01

    The purpose of this paper is to provide an overview of leading global mega forces and their importance to the future of natural resource decisionmaking, policy development, and operation. Global mega forces are defined as a combination of major trends, preferences, and probabilities that come together to produce the potential for future high-impact outcomes. These...

  5. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. ... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models...

  7. Building Service Provider Capabilities

    DEFF Research Database (Denmark)

    Brandl, Kristin; Jaura, Manya; Ørberg Jensen, Peter D.

    2015-01-01

    In this paper we study whether and how the interaction between clients and the service providers contributes to the development of capabilities in service provider firms. In situations where such a contribution occurs, we analyze how different types of activities in the production process of the services, such as sequential or reciprocal task activities, influence the development of different types of capabilities. We study five cases of offshore-outsourced knowledge-intensive business services that are distinguished according to their reciprocal or sequential task activities in their production process. We find that clients influence the development of human capital capabilities and management capabilities in reciprocally produced services, while in sequentially produced services clients influence the development of organizational capital capabilities and management capital capabilities.

  8. Path Loss, Shadow Fading, and Line-Of-Sight Probability Models for 5G Urban Macro-Cellular Scenarios

    DEFF Research Database (Denmark)

    Sun, Shu; Thomas, Timothy; Rappaport, Theodore S.

    2015-01-01

    This paper presents key parameters including the line-of-sight (LOS) probability, large-scale path loss, and shadow fading models for the design of future fifth generation (5G) wireless communication systems in urban macro-cellular (UMa) scenarios, using the data obtained from propagation measurements in Austin, US, and Aalborg, Denmark, at 2, 10, 18, and 38 GHz. A comparison of different LOS probability models is performed for the Aalborg environment. Both single-slope and dual-slope omnidirectional path loss models are investigated to analyze and contrast their root-mean-square (RMS) errors...
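
    For context, the sketch below evaluates a generic single-slope close-in (CI) reference-distance path loss model of the kind investigated in such measurement campaigns; the path loss exponent and shadow-fading level are placeholder values, not the parameters fitted to the Austin or Aalborg data.

      # Illustrative single-slope close-in (CI) reference distance path loss model.
      # The exponent n and shadow-fading sigma below are placeholders for demonstration.
      import numpy as np

      def ci_path_loss_db(d_m, f_ghz, n, sigma_sf_db=0.0, rng=None):
          """Path loss in dB at distance d_m (meters) for carrier frequency f_ghz (GHz)."""
          c = 3e8
          d0 = 1.0                                                      # 1 m close-in reference distance
          fspl_d0 = 20 * np.log10(4 * np.pi * d0 * f_ghz * 1e9 / c)     # free-space loss at d0
          shadow = rng.normal(0.0, sigma_sf_db) if rng is not None else 0.0
          return fspl_d0 + 10 * n * np.log10(d_m / d0) + shadow

      # Example: 38 GHz link at 100 m with an assumed path loss exponent n = 3.0
      print(round(ci_path_loss_db(100.0, 38.0, n=3.0), 1), "dB")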

  9. 5G, an approach towards future telemedicine

    DEFF Research Database (Denmark)

    Anwar, Sadia; Prasad, Ramjee; Kumar, Ambuj

    The use of smartphones has been increasing rapidly and it is expected that in the future most people will have a smartphone capable of a high speed Internet connection. The capability of smartphones, with high definition displays, computation power and a multitude of sensors, has made them an excellent candidate for telemedicine applications. Telemedicine's applications and high-data medical information generally require high definition visuals and a lower latency connection, in addition to mobility and reliability. The next generation of wireless communication standard, known as 5G, will provide data speeds in Gigabits per second (Gb/s) with lower latency and a higher reliability connection, and can be a better approach for future telemedicine. In this paper we survey the current state of telemedicine along with examining the characteristics of 5G technology. We also present research challenges concerning 5G and telemedicine.

  10. Effect of quality of family environment on the child's adaptation capabilities

    Directory of Open Access Journals (Sweden)

    Ivana Kreft

    2008-08-01

    Full Text Available The aim of the paper was to investigate how the quality of family environment is related to the child's adaptation capabilities. The child's adaptation was evaluated with a special assessment called SPP-3 (Systematic Psychological Assessment of a 3-year Old Child), which screens the population of 3-year olds to look for inadequate adaptation patterns. I assumed that in families where parents have higher education and where the environment is more stimulating children will show more effective and adaptive behaviour. Seventy-five children and parents who attended the psychological assessment in their regional hospitals first concluded the psychological examination (SPP-3) and then filled in two questionnaires: The Family Environment Questionnaire (Zupančič, Podlesek, & Kavčič, 2004) and The Home Literacy Environment Questionnaire (Marjanovič Umek, Podlesek, & Fekonja, 2005). The results showed that the quality of family environment does affect the child's adaptive capabilities and is associated with parental level of education. Of special importance for the child's socialization is the parents' ability to use effective control (to have consistent and clear demands). The hypothesis that the level of parental education affects the child's adaptation capabilities was not confirmed. Perhaps the parents' relations with the child are of greater importance, and these are probably not related to parents' education. The results show that the child's adaptation capabilities are associated with parenting methods, so preventive psychological counselling may also be used to help parents choose more effective methods in order to allow the child to develop effective adaptive behaviour.

  11. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  12. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  13. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define a (compound) free Poisson process explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  14. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using their best-fit model and respective model parameters. The first conditional probability is the probability of the seismic energy (e × 10^20 ergs), which is expected to be released in the future earthquake, exceeding a certain level of seismic energy (E × 10^20 ergs). The second conditional probability is the probability of the seismic energy (a × 10^20 ergs/year), which is expected to be released per year, exceeding a certain level of seismic energy per year (A × 10^20 ergs/year). The logarithm likelihood functions (ln L) were also estimated for all four probability distribution models. A higher value of ln L suggests a better model and a lower value shows a worse model. The time of the future earthquake is forecasted by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recently occurred 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zone Z.12, zone Z.16 and zone Z.15, respectively, which are identified seismic source zones in the study area; this shows that the proposed techniques and models yield good forecasting accuracy.
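
    The conditional exceedance-probability step can be illustrated with a minimal sketch, assuming a Weibull model and synthetic per-event energies; the study itself fits four candidate models to the 1737-2015 catalog and compares them by log-likelihood.

      # Hypothetical sketch of the exceedance-probability step: fit a Weibull model to
      # per-event released seismic energies and evaluate P(energy > E). The synthetic
      # energies and the Weibull choice are illustrative, not the study's catalog.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      energies = rng.weibull(1.3, size=60) * 5.0     # synthetic energies, units of 1e20 ergs

      c, loc, scale = stats.weibull_min.fit(energies, floc=0)                 # MLE fit
      log_lik = np.sum(stats.weibull_min.logpdf(energies, c, loc, scale))     # ln L for model comparison

      threshold = 8.0                                # some energy level E (1e20 ergs)
      p_exceed = stats.weibull_min.sf(threshold, c, loc, scale)
      print(f"ln L = {log_lik:.1f}, P(energy > {threshold}) = {p_exceed:.3f}")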

  16. Advances in U.S. Land Imaging Capabilities

    Science.gov (United States)

    Stryker, T. S.

    2017-12-01

    Advancements in Earth observations, cloud computing, and data science are improving everyday life. Information from land-imaging satellites, such as the U.S. Landsat system, helps us to better understand the changing landscapes where we live, work, and play. This understanding builds capacity for improved decision-making about our lands, waters, and resources, driving economic growth, protecting lives and property, and safeguarding the environment. The USGS is fostering the use of land remote sensing technology to meet local, national, and global challenges. A key dimension to meeting these challenges is the full, free, and open provision of land remote sensing observations for both public and private sector applications. To achieve maximum impact, these data must also be easily discoverable, accessible, and usable. The presenter will describe the USGS Land Remote Sensing Program's current capabilities and future plans to collect and deliver land remote sensing information for societal benefit. He will discuss these capabilities in the context of national plans and policies, domestic partnerships, and international collaboration. The presenter will conclude with examples of how Landsat data is being used on a daily basis to improve lives and livelihoods.

  17. Systems Modeling to Implement Integrated System Health Management Capability

    Science.gov (United States)

    Figueroa, Jorge F.; Walker, Mark; Morris, Jonathan; Smith, Harvey; Schmalzel, John

    2007-01-01

    ISHM capability includes: detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, and user interfaces that enable integrated awareness (past, present, and future) by users. This is achieved by focused management of data, information and knowledge (DIaK) that will likely be distributed across networks. Management of DIaK implies storage, sharing (timely availability), maintaining, evolving, and processing. Processing of DIaK encapsulates strategies, methodologies, algorithms, etc. focused on achieving high ISHM Functional Capability Level (FCL). High FCL means a high degree of success in detecting anomalies, diagnosing causes, predicting future anomalies, and enabling health integrated awareness by the user. A model that enables ISHM capability, and hence, DIaK management, is denominated the ISHM Model of the System (IMS). We describe aspects of the IMS that focus on processing of DIaK. Strategies, methodologies, and algorithms require proper context. We describe an approach to define and use contexts, implementation in an object-oriented software environment (G2), and validation using actual test data from a methane thruster test program at NASA SSC. Context is linked to existence of relationships among elements of a system. For example, the context to use a strategy to detect leak is to identify closed subsystems (e.g. bounded by closed valves and by tanks) that include pressure sensors, and check if the pressure is changing. We call these subsystems Pressurizable Subsystems. If pressure changes are detected, then all members of the closed subsystem become suspect of leakage. In this case, the context is defined by identifying a subsystem that is suitable for applying a strategy. Contexts are defined in many ways. Often, a context is defined by relationships of function (e.g. liquid flow, maintaining pressure, etc.), form (e.g. part of the same component, connected to other components, etc.), or space (e.g. physically close

  18. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  19. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, store information for ≥ 30 frames, and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time

  20. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  1. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  2. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions and so on. Toda (2013) pointed out a difference in the conditional probability between strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture based on the following procedures. Fault geometry was determined from a randomly generated magnitude based on The Headquarters for Earthquake Research Promotion (2017) method. If the defined fault plane was not saturated within the assumed width of the seismogenic layer, the fault plane depth was randomly assigned within the seismogenic layer. A logistic analysis was applied to two data sets: the surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, and this result coincides with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, and this trend is similar to the conditional probability of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show a low probability. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, there is a possibility that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, shows a low probability and indicates a probability similar to that of strike faults (i.e. Takao et al., 2013). In the future, numerical simulation by considering failure condition of surface by the source
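
    A minimal sketch of the logistic-analysis step is given below, assuming synthetic fault-top depths and rupture outcomes and a scikit-learn workflow; it illustrates the form of the estimate only and does not reproduce the study's data.

      # Illustrative logistic-regression step: estimate P(surface rupture) from the
      # depth of the top of the source fault. Data and coefficients are invented.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      top_depth_km = rng.uniform(0.0, 15.0, size=300)                 # depth of the fault top
      true_p = 1.0 / (1.0 + np.exp(1.2 * (top_depth_km - 4.0)))       # shallower faults rupture more often
      ruptured = rng.random(300) < true_p                             # True = surface rupture observed

      model = LogisticRegression().fit(top_depth_km.reshape(-1, 1), ruptured.astype(int))
      print("P(rupture | top depth = 2 km) ≈",
            round(model.predict_proba([[2.0]])[0, 1], 2))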

  3. Game interrupted: The rationality of considering the future

    Directory of Open Access Journals (Sweden)

    Brandon Almy

    2013-09-01

    Full Text Available The "problem of points", introduced by Paccioli in 1494 and solved by Pascal and Fermat 160 years later, inspired the modern concept of probability. Incidentally, the problem also shows that rational decision-making requires the consideration of future events. We show that naive responses to the problem of points are more future oriented and thus more rational in this sense when the problem itself is presented in a future frame instead of the canonical past frame. A simple nudge is sufficient to make decisions more rational. We consider the implications of this finding for hypothesis testing and predictions of replicability.
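
    For readers unfamiliar with the problem, the following sketch computes the fair division of stakes in the interrupted game: each player's share equals the probability of winning if the remaining rounds were played out. The recursive formulation is a standard textbook treatment, added here only for illustration.

      # Problem of points: player A needs `a` more wins, player B needs `b` more wins,
      # and each round is a fair coin flip. A's fair share of the stake equals the
      # probability that A reaches the target first.
      from functools import lru_cache

      @lru_cache(maxsize=None)
      def p_a_wins(a: int, b: int) -> float:
          if a == 0:
              return 1.0
          if b == 0:
              return 0.0
          return 0.5 * p_a_wins(a - 1, b) + 0.5 * p_a_wins(a, b - 1)

      # Classic case: A needs 1 more win, B needs 3 -> A should get 7/8 of the stake.
      print(p_a_wins(1, 3))   # 0.875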

  4. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions...

  6. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Full Text Available Introduction. Translation practice has a heuristic nature and involves the cognitive structures of consciousness of any interpreter. When preparing translators, special attention is paid to the development of their skill of probable forecasting. The aim of the present publication is to understand the process of anticipation from the position of the cognitive model of translation and to develop exercises aimed at building the prognostic abilities of students and interpreters when working with newspaper articles containing metaphorical headlines. Methodology and research methods. The study is based on the competence approach to the training of student translators and a complex of interrelated scientific methods, the main of which is the psycholinguistic experiment. With the use of quantitative data, the features of the perception of newspaper texts based on their metaphorical titles are characterized. Results and scientific novelty. On the basis of the conducted experiment to predict the content of newspaper articles with metaphorical headlines, it is concluded that the main condition of predictability is expectation. Probable forecasting as a professional competence of a future translator is formed in the process of training activities by integrating the efforts of various departments of a language university. Specific exercises for the development of anticipation in students while studying the course of translation and interpretation are offered. Practical significance. The results of the study can be used by foreign language teachers of both language and non-language universities in teaching students of different specialties to translate foreign texts.

  7. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
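
    The core idea of generating new realizations consistent with a dataset can be illustrated, in much simplified form, with a plain kernel-density estimate; the paper's full method adds diffusion maps, a reduced-order representation, and an MCMC sampler, none of which are reproduced in this sketch.

      # Minimal sketch: new realizations statistically consistent with a dataset via
      # multidimensional kernel-density estimation (a much simplified stand-in for
      # the diffusion-maps/MCMC methodology of the paper).
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(3)
      # Synthetic dataset concentrated near a curve (a simple "manifold") in 2-D
      t = rng.uniform(0, 2 * np.pi, 300)
      data = np.vstack([np.cos(t), np.sin(t)]) + 0.05 * rng.standard_normal((2, 300))

      kde = gaussian_kde(data)            # multidimensional kernel-density estimate
      new_samples = kde.resample(1000)    # additional realizations consistent with the data
      print(new_samples.shape)            # (2, 1000)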

  8. National Aeronautics and Space Administration (NASA) Environmental Control and Life Support (ECLS) Capability Roadmap Development for Exploration

    Science.gov (United States)

    Bagdigian, Robert M.; Carrasquillo, Robyn L.; Metcalf, Jordan; Peterson, Laurie

    2012-01-01

    NASA is considering a number of future human space exploration mission concepts. Although detailed requirements and vehicle architectures remain mostly undefined, near-term technology investment decisions need to be guided by the anticipated capabilities needed to enable or enhance the mission concepts. This paper describes a roadmap that NASA has formulated to guide the development of Environmental Control and Life Support Systems (ECLSS) capabilities required to enhance the long-term operation of the International Space Station (ISS) and enable beyond-Low Earth Orbit (LEO) human exploration missions. Three generic mission types were defined to serve as a basis for developing a prioritized list of needed capabilities and technologies. Those are 1) a short duration microgravity mission; 2) a long duration transit microgravity mission; and 3) a long duration surface exploration mission. To organize the effort, ECLSS was categorized into three major functional groups (atmosphere, water, and solid waste management) with each broken down into sub-functions. The ability of existing, flight-proven state-of-the-art (SOA) technologies to meet the functional needs of each of the three mission types was then assessed. When SOA capabilities fell short of meeting the needs, those "gaps" were prioritized in terms of whether or not the corresponding capabilities enable or enhance each of the mission types. The resulting list of enabling and enhancing capability gaps can be used to guide future ECLSS development. A strategy to fulfill those needs over time was then developed in the form of a roadmap. Through execution of this roadmap, the hardware and technologies needed to enable and enhance exploration may be developed in a manner that synergistically benefits the ISS operational capability, supports Multi-Purpose Crew Vehicle (MPCV) development, and sustains long-term technology investments for longer duration missions. This paper summarizes NASA's ECLSS capability roadmap

  9. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates with a fixed width and count the number of triggered sampling gates, which makes it possible to obtain the photon counting probability and estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of the triggered time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method can improve the estimation accuracy of the echo signal intensity due to the acquisition of more detected information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high accuracy intensity image is acquired under low-light-level environments. (paper)
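
    A toy version of intensity estimation from counting statistics is sketched below: for Poisson arrivals, the probability that a gate is triggered is p = 1 - exp(-mu), so mu can be recovered from the observed trigger fraction. This simplified estimator is a stand-in for, not a reproduction of, the paper's MVUE scheme.

      # Toy illustration of intensity estimation from photon-counting statistics.
      # The true mean photon number per gate and the gate count are invented.
      import numpy as np

      rng = np.random.default_rng(7)
      mu_true = 0.8                           # mean photon number per gate (echo intensity)
      gates = 10_000
      counts = rng.poisson(mu_true, size=gates)
      triggered_fraction = np.mean(counts > 0)

      mu_hat = -np.log(1.0 - triggered_fraction)   # invert p = 1 - exp(-mu)
      print(f"true mu = {mu_true}, estimated mu = {mu_hat:.3f}")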

  10. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  11. FMEF/experimental capabilities

    International Nuclear Information System (INIS)

    Burgess, C.A.; Dronen, V.R.

    1981-01-01

    The Fuels and Materials Examination Facility (FMEF), under construction at the Hanford site north of Richland, Washington, will be one of the most modern facilities offering irradiated fuels and materials examination capabilities and fuel fabrication development technologies. Scheduled for completion in 1984, the FMEF will provide examination capability for fuel assemblies, fuel pins and test pins irradiated in the FFTF. Various functions of the FMEF are described, with emphasis on experimental data-gathering capabilities in the facility's Nondestructive and Destructive examination cell complex

  12. Evaluating detection and estimation capabilities of magnetometer-based vehicle sensors

    Science.gov (United States)

    Slater, David M.; Jacyna, Garry M.

    2013-05-01

    In an effort to secure the northern and southern United States borders, MITRE has been tasked with developing Modeling and Simulation (M&S) tools that accurately capture the mapping between algorithm-level Measures of Performance (MOP) and system-level Measures of Effectiveness (MOE) for current/future surveillance systems deployed by the Customs and Border Protection Office of Technology Innovations and Acquisitions (OTIA). This analysis is part of a larger M&S undertaking. The focus is on two MOPs for magnetometer-based Unattended Ground Sensors (UGS). UGS are placed near roads to detect passing vehicles and estimate properties of the vehicle's trajectory such as bearing and speed. The first MOP considered is the probability of detection. We derive probabilities of detection for a network of sensors over an arbitrary number of observation periods and explore how the probability of detection changes when multiple sensors are employed. The performance of UGS is also evaluated based on the level of variance in the estimation of trajectory parameters. We derive the Cramer-Rao bounds for the variances of the estimated parameters in two cases: when no a priori information is known and when the parameters are assumed to be Gaussian with known variances. Sample results show that UGS perform significantly better in the latter case.
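
    As a simple illustration of the first MOP, the sketch below combines per-sensor detection probabilities into a network-level probability of detection under an independence assumption; the probability values are placeholders, not modeled MOPs from the study.

      # Network probability of detection under independence: if sensor i detects a
      # passing vehicle with probability p_i, at least one detection occurs with
      # probability 1 - prod(1 - p_i). Values below are placeholders.
      import numpy as np

      p_single = np.array([0.70, 0.55, 0.60])         # per-sensor detection probabilities
      p_network = 1.0 - np.prod(1.0 - p_single)
      print(f"network P(detect) = {p_network:.3f}")   # ~0.946

      # One sensor observed over k independent observation periods:
      p, k = 0.70, 3
      print(f"P(detect within {k} passes) = {1 - (1 - p)**k:.3f}")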

  13. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
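
    The distinction the study draws can be illustrated with a toy computation of first-order transitional probabilities P(next tone | current tone) from a tone sequence, as opposed to simple tone or pattern frequencies; the sequence below is invented for illustration.

      # Toy example: estimate first-order transitional probabilities from a tone
      # sequence (H = high tone, L = low tone). The sequence is made up.
      from collections import Counter, defaultdict

      sequence = list("HLHHLHLHLHLLHLH")

      pair_counts = defaultdict(Counter)
      for cur, nxt in zip(sequence, sequence[1:]):
          pair_counts[cur][nxt] += 1

      for cur, counter in pair_counts.items():
          total = sum(counter.values())
          for nxt, n in counter.items():
              print(f"P({nxt} | {cur}) = {n / total:.2f}")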

  14. Probable damage to tundra biota through sulphur dioxide destruction of lichens

    Energy Technology Data Exchange (ETDEWEB)

    Schofield, E; Hamilton, W L

    1970-01-01

    Lichens, which are important components of many Arctic ecosystems, are extremely sensitive to SO/sub 2/ pollution. Recent oilfield development in Arctic North America seems likely to eliminate lichens from large areas because of a unique combination of biological and meteorological factors. Probable future oilfield development in Greenland and the Soviet Union indicates that SO/sub 2/ pollution will become an increasingly serious threat to Arctic ecosystems. Therefore, uncontrolled burning of crude oil, fuel oil, and natural gas, should be avoided, and adequate sulphur-extraction facilities should be installed.

  15. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    Science.gov (United States)

    Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus

    2017-01-01

    Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.
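
    The effect described above can be sketched with a small Monte Carlo comparison: a return level computed from a single (mean) SLR value versus one that integrates over SLR uncertainty. The GEV surge parameters and the SLR distribution below are invented for illustration and are not the values used for San Francisco Bay.

      # Hedged sketch: 100-yr flood height with and without sea-level-rise uncertainty.
      # All distribution parameters are illustrative placeholders.
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(11)
      surge = genextreme(c=-0.1, loc=1.0, scale=0.25)     # annual-maximum surge height (m), illustrative

      slr_mean, slr_sd = 0.9, 0.35                        # year-2100 SLR (m), illustrative
      n = 200_000
      annual_max = surge.rvs(size=n, random_state=rng) + rng.normal(slr_mean, slr_sd, n)

      # 100-yr return level = height exceeded with 1% annual probability
      rl_mean_slr = surge.ppf(0.99) + slr_mean            # ignoring SLR uncertainty
      rl_full = np.quantile(annual_max, 0.99)             # integrating over SLR uncertainty
      print(f"return level, mean SLR only: {rl_mean_slr:.2f} m; with SLR uncertainty: {rl_full:.2f} m")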

  16. Impacts of representing sea-level rise uncertainty on future flood risks: An example from San Francisco Bay.

    Directory of Open Access Journals (Sweden)

    Kelsey L Ruckert

    Full Text Available Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.

  17. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  18. Verification of FA2D Prediction Capability Using Fuel Assembly Benchmark

    International Nuclear Information System (INIS)

    Jecmenica, R.; Pevec, D.; Grgic, D.; Konjarek, D.

    2008-01-01

    FA2D is a 2D transport collision probability code developed at the Faculty of Electrical Engineering and Computing, University of Zagreb. It is used for the calculation of cross section data at the fuel assembly level. The main objective of its development was the capability to generate cross section data to be used for fuel management and safety analyses of PWR reactors. Until now, formal verification of the code's prediction capability has not been performed at the fuel assembly level, but results of fuel management calculations obtained using FA2D-generated cross sections for NPP Krsko and the IRIS reactor have been compared against Westinghouse calculations. Cross section data were used within NRC's PARCS code and satisfactory preliminary results were obtained. This paper presents the results of calculations performed for the Nuclear Fuel Industries, Ltd., benchmark using FA2D and the SCALE5 TRITON calculation sequence (based on the discrete ordinates code NEWT). Nuclear Fuel Industries, Ltd., Japan, released the LWR Next Generation Fuels Benchmark with the aim of verifying prediction capability in nuclear design for extended burnup regions. We performed calculations for two different benchmark problem geometries: a UO2 pin cell and a UO2 PWR fuel assembly. The results obtained with the two mentioned 2D spectral codes are presented for the burnup dependence of the infinite multiplication factor, the isotopic concentrations of important materials, and the local peaking factor vs. burnup (in the case of the fuel assembly calculation). (author)

  19. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
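
    The two scenarios at issue can be made explicit by enumeration, under the usual simplifying assumptions of equally likely sexes and independence between children; the sketch below illustrates the underlying scenarios and is not a summary of the article's argument.

      # Enumeration of the classic "two children" puzzle, showing how the answer
      # depends on the scenario and its assumptions.
      from itertools import product

      families = list(product("BG", repeat=2))            # (older, younger)

      # Scenario 1: we learn "at least one child is a girl"
      cond1 = [f for f in families if "G" in f]
      p1 = sum(f == ("G", "G") for f in cond1) / len(cond1)

      # Scenario 2: we learn "the older child is a girl"
      cond2 = [f for f in families if f[0] == "G"]
      p2 = sum(f == ("G", "G") for f in cond2) / len(cond2)

      print(p1, p2)   # 1/3 vs. 1/2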

  20. Institutional Strain and Precarious Values in Meeting Future Nuclear Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bruce Matthews; Todd R. LaPorte

    1998-11-01

    This paper explores the implications of moderately expanding plutonium "pit" production capability within the strongly R&D-oriented culture of Los Alamos National Laboratory, especially in terms of the lab's current capacity or "fitness for the future" in which institutional stewardship of the nation's nuclear deterrent capability becomes a primary objective. The institutional properties needed to assure "future fitness" include the organizational requisites of highly reliable operations and sustained institutional constancy in a manner that evokes deep public trust and confidence. Estimates are made of the degree to which the key Division and most relevant Program office in this evolution already exhibit them.

  1. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
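
    For context, a minimal Wald sequential probability ratio test for a Bernoulli parameter is sketched below, showing where the two error probabilities alpha and beta enter through the decision thresholds; the hypotheses and data stream are invented, and the paper's modification for controlling their sum is not implemented here.

      # Minimal Wald SPRT for a Bernoulli parameter; hypotheses and data are illustrative.
      import math
      import random

      def sprt_bernoulli(stream, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
          upper = math.log((1 - beta) / alpha)       # accept H1 when crossed
          lower = math.log(beta / (1 - alpha))       # accept H0 when crossed
          llr, n = 0.0, 0
          for x in stream:
              n += 1
              llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
          return "undecided", n

      random.seed(4)
      data = (random.random() < 0.7 for _ in range(1000))   # true parameter 0.7
      print(sprt_bernoulli(data))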

  2. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  3. Empirical investigation on using wind speed volatility to estimate the operation probability and power output of wind turbines

    International Nuclear Information System (INIS)

    Liu, Heping; Shi, Jing; Qu, Xiuli

    2013-01-01

    Highlights: ► Ten-minute wind speed and power generation data of an offshore wind turbine are used. ► An ARMA–GARCH-M model is built to simultaneously forecast wind speed mean and volatility. ► The operation probability and expected power output of the wind turbine are predicted. ► The integrated approach produces more accurate wind power forecasting than other conventional methods. - Abstract: In this paper, we introduce a quantitative methodology that performs interval estimation of wind speed, calculates the operation probability of a wind turbine, and forecasts the wind power output. The technological advantage of this methodology stems from its capability to forecast both the mean and the volatility of wind speed. Based on real wind speed and corresponding wind power output data from an offshore wind turbine, this methodology is applied to build an ARMA–GARCH-M model for wind speed forecasting, and then to compute the operation probability and the expected power output of the wind turbine. The results show that the developed methodology is effective, the obtained interval estimation of wind speed is reliable, and the forecasted operation probability and expected wind power output of the wind turbine are accurate.
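
    The following hedged sketch illustrates the kind of pipeline the abstract describes: fit a mean-and-volatility model to wind speeds and turn the one-step forecast distribution into an operation probability and an expected power output. It uses the arch package's AR-GARCH specification (the in-mean and MA terms of the paper's ARMA-GARCH-M model are omitted), and the cut-in/cut-out speeds and simplified power curve are illustrative assumptions, not the study's calibration.

        import numpy as np
        from scipy.stats import norm
        from arch import arch_model

        def forecast_operation(wind_speed, cut_in=3.0, cut_out=25.0, rated=12.0, p_rated=2000.0):
            # AR(2) mean with GARCH(1,1) volatility; a simplification of ARMA-GARCH-M
            res = arch_model(wind_speed, mean="AR", lags=2, vol="GARCH", p=1, q=1).fit(disp="off")
            fcast = res.forecast(horizon=1)
            mu = fcast.mean.iloc[-1, 0]                    # one-step forecast mean
            sigma = np.sqrt(fcast.variance.iloc[-1, 0])    # one-step forecast volatility
            # Operation probability: forecast speed inside the cut-in/cut-out window
            p_op = norm.cdf(cut_out, mu, sigma) - norm.cdf(cut_in, mu, sigma)
            # Expected power under a crude piecewise power curve (Monte Carlo)
            v = np.random.default_rng(0).normal(mu, sigma, 20000)
            power = np.where((v < cut_in) | (v > cut_out), 0.0,
                             p_rated * np.clip((v / rated) ** 3, 0.0, 1.0))
            return p_op, power.mean()

        # Toy usage with synthetic 10-minute wind speeds
        speeds = np.abs(np.random.default_rng(1).normal(8.0, 2.0, 1000))
        print(forecast_operation(speeds))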

  4. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  5. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

    To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity or predicting future VF changes in patients with classically defined preperimetric glaucoma (PPG). Forty-three eyes of 43 PPG patients that were followed up every 6 months for at least 2 years were analyzed in this longitudinal study. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlapped with SAP VF test points could be generated. We evaluated the vulnerable VF points on the SS-OCT probability maps as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were found in superior and inferior arcuate patterns near central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in the baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes on SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  6. Smoothing and projecting age-specific probabilities of death by TOPALS

    Directory of Open Access Journals (Sweden)

    Joop de Beer

    2012-10-01

    Full Text Available BACKGROUND TOPALS is a new relational model for smoothing and projecting age schedules. The model is operationally simple, flexible, and transparent. OBJECTIVE This article demonstrates how TOPALS can be used for both smoothing and projecting age-specific mortality for 26 European countries and compares the results of TOPALS with those of other smoothing and projection methods. METHODS TOPALS uses a linear spline to describe the ratios between the age-specific death probabilities of a given country and a standard age schedule. For smoothing purposes I use the average of death probabilities over 15 Western European countries as standard, whereas for projection purposes I use an age schedule of 'best practice' mortality. A partial adjustment model projects how quickly the death probabilities move in the direction of the best-practice level of mortality. RESULTS On average, TOPALS performs better than the Heligman-Pollard model and the Brass relational method in smoothing mortality age schedules. TOPALS can produce projections that are similar to those of the Lee-Carter method, but can easily be used to produce alternative scenarios as well. This article presents three projections of life expectancy at birth for the year 2060 for 26 European countries. The Baseline scenario assumes a continuation of the past trend in each country, the Convergence scenario assumes that there is a common trend across European countries, and the Acceleration scenario assumes that the future decline of death probabilities will exceed that in the past. The Baseline scenario projects that average European life expectancy at birth will increase to 80 years for men and 87 years for women in 2060, whereas the Acceleration scenario projects an increase to 90 and 93 years respectively. CONCLUSIONS TOPALS is a useful new tool for demographers for both smoothing age schedules and making scenarios.
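
    A minimal sketch of the TOPALS relational idea described above, with invented numbers: the country's death probabilities are the standard schedule multiplied by a piecewise-linear ratio defined at a few knot ages, and a partial-adjustment step moves the schedule toward a 'best practice' level. The knots, ratio values, toy standard schedule, and adjustment speed are illustrative assumptions, not the article's estimates.

        import numpy as np

        ages = np.arange(0, 101)
        standard_qx = 0.0001 * np.exp(0.09 * ages)             # toy standard schedule
        knot_ages = np.array([0, 25, 50, 75, 100])             # spline knots
        knot_ratios = np.array([1.3, 1.1, 0.9, 0.95, 1.05])    # estimated in practice

        ratio = np.interp(ages, knot_ages, knot_ratios)        # linear spline of ratios
        country_qx = np.clip(ratio * standard_qx, 0.0, 1.0)    # smoothed country schedule

        # Partial-adjustment projection step toward a 'best practice' schedule
        best_practice_qx = 0.7 * standard_qx                   # assumed frontier
        speed = 0.05                                           # assumed adjustment speed
        projected_qx = country_qx + speed * (best_practice_qx - country_qx)
        print(projected_qx[[0, 50, 100]])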

  7. FFTF reload core nuclear design for increased experimental capability

    International Nuclear Information System (INIS)

    Rothrock, R.B.; Nelson, J.V.; Dobbin, K.D.; Bennett, R.A.

    1976-01-01

    In anticipation of continued growth in the FTR experimental irradiations program, the enrichments for the next batches of reload driver fuel to be manufactured have been increased to provide a substantially enlarged experimental reactivity allowance. The enrichments for these fuel assemblies, termed "Cores 3 and 4," were selected to meet the following objectives and constraints: (1) maintain a reactor power capability of 400 MW (based on an evaluation of driver fuel centerline melting probability at 15 percent overpower); (2) provide a peak neutron flux of nominally 7 × 10^15 n/cm^2-sec, with a minimum acceptable value of 95 percent of this (i.e., 6.65 × 10^15 n/cm^2-sec); and (3) provide the maximum experimental reactivity allowance that is consistent with the above constraints

  8. Tribology needs for future space and aeronautical systems

    Science.gov (United States)

    Fusaro, Robert L.

    1991-01-01

    Future aeronautical and space missions will push tribology technology beyond its current capability. The objective is to discuss the current state of the art of tribology as it is applied to advanced aircraft and spacecraft. Areas of discussion include materials lubrication mechanisms, factors affecting lubrication, current and future tribological problem areas, potential new lubrication techniques, and perceived technology requirements that need to be met in order to solve these tribology problems.

  9. Developing Alliance Capabilities

    DEFF Research Database (Denmark)

    Heimeriks, Koen H.; Duysters, Geert; Vanhaverbeke, Wim

    This paper assesses the differential performance effects of learning mechanisms on the development of alliance capabilities. Prior research has suggested that different capability levels could be identified in which specific intra-firm learning mechanisms are used to enhance a firm's alliance...

  10. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  11. Activity in inferior parietal and medial prefrontal cortex signals the accumulation of evidence in a probability learning task.

    Directory of Open Access Journals (Sweden)

    Mathieu d'Acremont

    Full Text Available In an uncertain environment, probabilities are key to predicting future events and making adaptive choices. However, little is known about how humans learn such probabilities and where and how they are encoded in the brain, especially when they concern more than two outcomes. During functional magnetic resonance imaging (fMRI), young adults learned the probabilities of uncertain stimuli through repetitive sampling. Stimuli represented payoffs and participants had to predict their occurrence to maximize their earnings. Choices indicated loss and risk aversion but unbiased estimation of probabilities. BOLD response in medial prefrontal cortex and angular gyri increased linearly with the probability of the currently observed stimulus, untainted by its value. Connectivity analyses during rest and task revealed that these regions belonged to the default mode network. The activation of past outcomes in memory is evoked as a possible mechanism to explain the engagement of the default mode network in probability learning. A BOLD response relating to value was detected only at decision time, mainly in striatum. It is concluded that activity in inferior parietal and medial prefrontal cortex reflects the amount of evidence accumulated in favor of competing and uncertain outcomes.

  12. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.

    2007-02-19

    On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the developer's team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  13. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
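
    One simple way to realize the augmented plot described above is to build a simulation envelope around the ordered, standardized sample. The sketch below uses a Bonferroni-style split of alpha across the order statistics, which gives conservative joint coverage; it is one possible construction, not necessarily the authors' exact intervals, and the example data are synthetic.

        import numpy as np
        from scipy import stats

        def qq_with_envelope(x, alpha=0.05, n_sim=5000, seed=0):
            rng = np.random.default_rng(seed)
            n = len(x)
            z = stats.norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))   # plotting positions
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)          # simulated order statistics
            # Bonferroni-style split of alpha over the n points (conservative joint coverage)
            lo, hi = np.quantile(sims, [alpha / (2 * n), 1 - alpha / (2 * n)], axis=0)
            xs = np.sort((x - x.mean()) / x.std(ddof=1))                     # standardized sample
            inside = bool(np.all((xs >= lo) & (xs <= hi)))
            return z, xs, lo, hi, inside      # plot xs and the band (lo, hi) against z

        x = np.random.default_rng(1).normal(10.0, 2.0, 50)
        print("all points inside the band:", qq_with_envelope(x)[-1])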

  14. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  15. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  16. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  17. Development of vendor independent safety analysis capability for nuclear power plants in Taiwan

    International Nuclear Information System (INIS)

    Tang, J.-R.

    2001-01-01

    The Institute of Nuclear Energy Research (INER) and the Taiwan Power Company (TPC) have a long-term cooperation to develop vendor-independent safety analysis capability to support nuclear power plants in Taiwan in many aspects. This paper presents some applications of this analysis capability, introduces the analysis methodology, and discusses the significance of vendor-independent analysis capability now and in the future. The applications include a safety analysis of the core shroud crack for Chinshan BWR/4 Unit 2, a parallel reload safety analysis of the first 18-month extended fuel cycle for Kuosheng BWR/6 Unit 2 Cycle 13, an analysis to support a Technical Specification change for the Maanshan three-loop PWR, and a design analysis to support the review of the Preliminary Safety Analysis Report of the Lungmen ABWR. In addition, some recent applications, such as an analysis to support the review of the BWR fuel bid for Chinshan and Kuosheng, demonstrate the need for further development of the analysis capability to support nuclear power plants in the 21st century. (authors)

  18. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the Logistic map, to generate pseudo-random numbers that are mapped to the design variables for global optimization. Many existing studies have indicated that a COA can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs, from the new perspective of both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of the chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of a COA. To achieve high efficiency with a COA, it is recommended to adopt an appropriate chaotic map that generates the desired chaotic sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
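
    The hybrid chaos-BFGS idea can be sketched in a few lines: a Logistic-map chaotic sequence explores the design space globally and the best point found seeds a local BFGS refinement. The map parameter, iteration count, starting values, and test function below are illustrative choices only, not the paper's settings.

        import numpy as np
        from scipy.optimize import minimize

        def chaos_bfgs(f, lower, upper, n_chaos=2000, seed=0):
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            x = np.random.default_rng(seed).uniform(0.05, 0.95, lower.size)  # chaotic states in (0, 1)
            best_x, best_f = None, np.inf
            for _ in range(n_chaos):
                x = 4.0 * x * (1.0 - x)               # Logistic map at r = 4 (fully chaotic)
                cand = lower + x * (upper - lower)    # map chaotic values to design variables
                val = f(cand)
                if val < best_f:
                    best_x, best_f = cand.copy(), val
            res = minimize(f, best_x, method="BFGS")  # local quasi-Newton refinement
            return res.x, res.fun

        rastrigin = lambda v: 10 * v.size + np.sum(v ** 2 - 10 * np.cos(2 * np.pi * v))
        print(chaos_bfgs(rastrigin, [-5.12, -5.12], [5.12, 5.12]))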

  19. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

    Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C, most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If the climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood that microevolution plays a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting future probabilities of extreme population events.

  20. Analysis of future nuclear power plants competitiveness with stochastic methods

    International Nuclear Information System (INIS)

    Feretic, D.; Tomsic, Z.

    2004-01-01

    To satisfy the increased demand it is necessary to build new electrical power plants which can meet the imposed acceptability criteria in an optimal way. The main criteria are the potential to supply the required energy, to supply this energy with minimal (or at least acceptable) costs, to satisfy licensing requirements, and to be acceptable to the public. The main competitors for unlimited electricity production in the next few decades are fossil power plants (coal and gas) and nuclear power plants. New renewable power plants (solar, wind, biomass) are also important but, due to limited energy supply potential and high costs, can only supplement the main generating units. Large hydropower plants would be competitive under the condition that suitable sites exist for the construction of such plants. The paper describes the application of a stochastic method for comparing economic parameters of future electrical power generating systems, including conventional and nuclear power plants. The method is applied to establish competitive specific investment costs of future nuclear power plants when compared with combined-cycle gas-fired units combined with wind electricity generators, using best-estimate and optimistic input data. The basis for the economic comparison of potential options is the plant lifetime levelized electricity generating cost. The purpose is to assess the uncertainty of several key performance and cost parameters of electricity produced in coal-fired, gas-fired, and nuclear power plants, developing the probability distribution of the levelized price of electricity from the different power plants, the cumulative probability of the levelized price of electricity for each technology, and the probability distribution of the cost difference between the technologies. The key parameters evaluated include: levelized electrical energy cost (USD/kWh), discount rate, interest rate for credit repayment, rate of expected increase of fuel cost, plant investment cost, fuel cost, constant annual
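
    A hedged sketch of the stochastic comparison described above: sample uncertain inputs, compute a simplified levelized cost of electricity for two technologies, and report the probability that one is cheaper. All numbers, distributions, and the simplified cost formula are invented placeholders, not the paper's data or model.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100000

        def lcoe(capex, fixed_om, fuel, discount, life=40, cf=0.85):
            """Very simplified levelized cost in USD/MWh for 1 kW of capacity."""
            crf = discount * (1 + discount) ** life / ((1 + discount) ** life - 1)
            mwh_per_kw_year = 8.76 * cf               # 8760 h * capacity factor / 1000
            return (capex * crf + fixed_om) / mwh_per_kw_year + fuel

        nuclear = lcoe(capex=rng.normal(4500, 600, N), fixed_om=100.0,
                       fuel=rng.normal(9, 1, N), discount=rng.uniform(0.05, 0.08, N))
        gas_ccgt = lcoe(capex=rng.normal(900, 100, N), fixed_om=25.0,
                        fuel=rng.normal(55, 12, N), discount=rng.uniform(0.05, 0.08, N), life=30)

        diff = gas_ccgt - nuclear
        print("P(nuclear cheaper) =", (diff > 0).mean())
        print("median cost difference, USD/MWh =", round(float(np.median(diff)), 1))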

  1. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  2. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  3. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  4. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  5. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  6. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  7. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  8. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
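
    To make the inverted-S shape concrete, the sketch below uses the standard one-parameter Tversky-Kahneman weighting function together with a power utility. It is shown only as an illustration of probability distortion; the gamma and utility exponent are arbitrary, and this is not claimed to be the parametric form fitted in the study.

        def weight(p, gamma=0.6):
            """Inverted-S weighting: overweights small p, underweights large p."""
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        def distorted_value(p_win, reward_ml=0.5, utility=lambda x: x ** 0.8):
            """Subjective value of a gamble paying reward_ml with probability p_win."""
            return weight(p_win) * utility(reward_ml)

        for p in (0.05, 0.25, 0.50, 0.75, 0.95):
            print(f"p = {p:4.2f}   w(p) = {weight(p):.3f}   value = {distorted_value(p):.3f}")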

  9. Traffic routing for multicomputer networks with virtual cut-through capability

    Science.gov (United States)

    Kandlur, Dilip D.; Shin, Kang G.

    1992-01-01

    Consideration is given to the problem of selecting routes for interprocess communication in a network with virtual cut-through capability, while balancing the network load and minimizing the number of times that a message gets buffered. An approach is proposed that formulates the route selection problem as a minimization problem with a link cost function that depends upon the traffic through the link. The form of this cost function is derived using the probability of establishing a virtual cut-through route. The route selection problem is shown to be NP-hard, and an algorithm is developed to incrementally reduce the cost by rerouting the traffic. The performance of this algorithm is exemplified by two network topologies: the hypercube and the C-wrapped hexagonal mesh.

  10. Prediction of shock initiation thresholds and ignition probability of polymer-bonded explosives using mesoscale simulations

    Science.gov (United States)

    Kim, Seokpum; Wei, Yaochi; Horie, Yasuyuki; Zhou, Min

    2018-05-01

    The design of new materials requires establishment of macroscopic measures of material performance as functions of microstructure. Traditionally, this process has been an empirical endeavor. An approach to computationally predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs) using mesoscale simulations is developed. The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture processes responsible for the development of hotspots and damage. The specific mechanisms tracked include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to directly mimic relevant experiments for quantification of statistical variations of material behavior due to inherent material heterogeneities. The particular thresholds and ignition probabilities predicted are expressed in James type and Walker-Wasley type relations, leading to the establishment of explicit analytical expressions for the ignition probability as function of loading. Specifically, the ignition thresholds corresponding to any given level of ignition probability and ignition probability maps are predicted for PBX 9404 for the loading regime of Up = 200-1200 m/s where Up is the particle speed. The predicted results are in good agreement with available experimental measurements. A parametric study also shows that binder properties can significantly affect the macroscopic ignition behavior of PBXs. The capability to computationally predict the macroscopic engineering material response relations out of material microstructures and basic constituent and interfacial properties lends itself to the design of new materials as well as the analysis of existing materials.

  11. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  12. The fractal feature and price trend in the gold future market at the Shanghai Futures Exchange (SFE)

    Science.gov (United States)

    Wu, Binghui; Duan, Tingting

    2017-05-01

    The price of gold futures is affected by many factors, which include the fluctuation of the gold price and changes in the trading environment. Fractal analysis can help investors gain a better understanding of the price fluctuation and make reasonable investment decisions in the gold future market. After analyzing gold future prices from January 2nd, 2014 to April 12th, 2016 at the Shanghai Futures Exchange (SFE) in China, the conclusion is drawn that the gold future market shows persistence in each trading day, with all Hurst indexes greater than 0.5. The changing features of the Hurst index indicate that the persistence of the gold future market is strengthened first and weakened later. As a complicated nonlinear system, the gold future market can be well reflected by an Elman neural network, which is capable of memorizing previous prices and is particularly suited for forecasting time series in comparison with other types of neural networks. After analyzing the price trend in the gold future market, the results show that the relative error between the actual value of gold futures and the predictive value of the Elman neural network is small. This model, which has better performance in data fitting and prediction, can help investors analyze and foresee the price tendency in the gold future market.
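
    A minimal rescaled-range (R/S) sketch of the Hurst exponent mentioned above; values greater than 0.5 indicate a persistent series. The window sizes and the synthetic price series are illustrative only, and the Elman network itself is not reproduced here.

        import numpy as np

        def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
            series = np.asarray(series, float)
            log_n, log_rs = [], []
            for n in window_sizes:
                rs_values = []
                for start in range(0, len(series) - n + 1, n):
                    chunk = series[start:start + n]
                    dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations
                    r = dev.max() - dev.min()               # range
                    s = chunk.std(ddof=1)                   # standard deviation
                    if s > 0:
                        rs_values.append(r / s)
                log_n.append(np.log(n))
                log_rs.append(np.log(np.mean(rs_values)))
            slope, _ = np.polyfit(log_n, log_rs, 1)         # slope of the log-log fit
            return slope

        prices = 300 + np.cumsum(np.random.default_rng(2).normal(0, 1, 2048))  # synthetic prices
        print("estimated Hurst exponent:", round(hurst_rs(np.diff(prices)), 3))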

  13. Moxie matters: associations of future orientation with active life expectancy.

    Science.gov (United States)

    Laditka, Sarah B; Laditka, James N

    2017-10-01

    Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, age 55 through death. Life expectancy from age 55 for white men with high future orientation was age 77.6 (95% confidence interval 75.5-79.0), 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation 9.1% of remaining life from age 55 was disabled (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
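
    A hedged sketch of the microsimulation step described above: given monthly transition probabilities between active, disabled, and dead states, simulate many lives from age 55 and summarize life expectancy and the share of remaining life spent with disability. The transition matrix below is invented; in the study the probabilities come from age- and covariate-specific multinomial logistic models.

        import numpy as np

        rng = np.random.default_rng(0)
        # Monthly transition probabilities between active (0), disabled (1), dead (2)
        P = np.array([[0.9920, 0.0060, 0.0020],
                      [0.0150, 0.9750, 0.0100],
                      [0.0000, 0.0000, 1.0000]])

        def simulate_life(max_months=720):
            state, months_alive, months_disabled = 0, 0, 0
            for _ in range(max_months):                 # at most 60 years past age 55
                state = rng.choice(3, p=P[state])
                if state == 2:
                    break
                months_alive += 1
                months_disabled += state == 1
            return months_alive, months_disabled

        sims = np.array([simulate_life() for _ in range(5000)])
        life_expectancy = 55 + sims[:, 0].mean() / 12
        pct_disabled = 100 * sims[:, 1].sum() / sims[:, 0].sum()
        print(f"life expectancy ~{life_expectancy:.1f} years, {pct_disabled:.1f}% of months disabled")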

  14. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  15. Energy future 2050

    Energy Technology Data Exchange (ETDEWEB)

    Syri, S; Kainiemi, L; Riikonen, V [Aalto Univ. School of Engineering, Espoo (Finland). Dept. of Energy Technology

    2011-07-01

    The track was organized by the Department of Energy Technology, School of Engineering, at Aalto University. The Energy Future 2050 track introduced participants to the global long-term challenges of achieving a sustainable energy supply. According to the Intergovernmental Panel on Climate Change (IPCC), effective climate change mitigation would require global greenhouse gas emissions to be reduced by 50-85% from the present level by 2050. For industrialized countries, this would probably mean a practically carbon-neutral economy and energy supply, as developing countries need more possibilities for growth and will probably enter stricter emission reduction commitments with some delay. At the beginning of the workshop, students were introduced to global energy scenarios and the challenge of climate change mitigation. Students worked in three groups on the following topics: How to gain public acceptance of Carbon (dioxide) Capture and Storage (CCS)? Personal emissions trading as a tool to achieve deep emission cuts. How to get rid of fossil fuel subsidies? Nordic cases are peat use in Finland and Sweden. (orig.)

  16. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients with respect to the application failure probability can be satisfied. In an optical grid, when the DAG-based (directed acyclic graph) application is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
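
    A hedged sketch of a task-based failure estimate in the spirit of the abstract: an application is a set of tasks, each needing a compute node and the links on its lightpath; with independent component failures, the application fails if any required component fails, and adding a backup node lowers a task's failure probability. The topology and numbers are invented, and treating each task's resources independently (including shared links) is a simplification, not the paper's exact model.

        node_fail = {"n1": 0.02, "n2": 0.03, "n3": 0.02}        # invented node failure probabilities
        link_fail = {"l1": 0.01, "l2": 0.015, "l3": 0.01}       # invented link failure probabilities

        # task -> (primary node, backup node or None, links on its lightpath)
        tasks = {
            "t1": ("n1", None, ["l1"]),
            "t2": ("n2", "n3", ["l1", "l2"]),
            "t3": ("n3", None, ["l2", "l3"]),
        }

        def task_success(node, backup, links):
            p_node = 1 - node_fail[node]
            if backup is not None:                      # task survives if either node works
                p_node = 1 - node_fail[node] * node_fail[backup]
            p_links = 1.0
            for link in links:
                p_links *= 1 - link_fail[link]
            return p_node * p_links

        # Series assumption: every task must succeed for the application to succeed
        p_app_success = 1.0
        for node, backup, links in tasks.values():
            p_app_success *= task_success(node, backup, links)

        print("application failure probability:", round(1 - p_app_success, 4))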

  17. Transforming organizational capabilities in strategizing

    DEFF Research Database (Denmark)

    Jørgensen, Claus; Friis, Ole Uhrskov; Koch, Christian

    2014-01-01

    Offshored and networked enterprises are becoming an important if not leading organizational form and this development seriously challenges their organizational capabilities. More specifically, over the last years, SMEs have commenced entering these kinds of arrangements. As the organizational capabilities of SMEs are limited at the outset, even more emphasis is needed regarding the issues of developing relevant organizational capabilities. This paper aims at investigating how capabilities evolve during an offshoring process of more than 5 years in two Danish SMEs, i.e. not only short- but long-term evolvements within the companies. We develop our framework of understanding organizational capabilities drawing on dynamic capability, relational capability and strategy as practice concepts, appreciating the performative aspects of developing new routines. Our two cases are taken from one author's Ph...

  18. Spacetime quantum probabilities II: Relativized descriptions and Popperian propensities

    Science.gov (United States)

    Mugur-Schächter, M.

    1992-02-01

    In the first part of this work(1) we have explicated the spacetime structure of the probabilistic organization of quantum mechanics. We have shown that each quantum mechanical state, in consequence of the spacetime characteristics of the epistemic operations by which the observer produces the state to be studied and the processes of qualification of these, brings in a tree-like spacetime structure, a “quantum mechanical probability tree,” that transgresses the theory of probabilities as it now stands. In this second part we develop the general implications of these results. Starting from the lowest level of cognitive action and creating an appropriate symbolism, we construct a “relativizing epistemic syntax,” a “general method of relativized conceptualization” where—systematically—each description is explicitly referred to the epistemic operations by which the observer produces the entity to be described and obtains qualifications of it. The method generates a typology of increasingly complex relativized descriptions where the question of realism admits of a particularly clear pronouncement. Inside this typology the epistemic processes that lie—UNIVERSALLY—at the basis of any conceptualization, reveal a tree-like spacetime structure. It appears in particular that the spacetime structure of the relativized representation of a probabilistic description, which transgresses the nowadays theory of probabilities, is the general mould of which the quantum mechanical probability trees are only particular realizations. This entails a clear definition of the descriptional status of quantum mechanics. While the recognition of the universal cognitive content of the quantum mechanical formalism opens up vistas toward mathematical developments of the relativizing epistemic syntax. The relativized representation of a probabilistic description leads with inner necessity to a “morphic” interpretation of probabilities that can be regarded as a formalized and

  19. Vortex-flow aerodynamics - An emerging design capability

    Science.gov (United States)

    Campbell, J. F.

    1981-01-01

    Promising current theoretical and simulational developments in the field of leading-edge vortex-generating delta, arrow, and ogival wings are reported, along with the history of theory and experiment leading to them. The effects of wing slenderness, leading edge nose radius, Mach number and incidence variations, and planform on the onset of vortex generation and redistribution of aerodynamic loads are considered. The range of design possibilities in this field is consequential for the future development of strategic aircraft, supersonic transports and commercial cargo aircraft which will possess low-speed, high-lift capability by virtue of leading edge vortex generation and control without recourse to heavy and expensive leading edge high-lift devices and compound airfoils. Attention is given to interactive graphics simulation devices recently developed.

  20. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  2. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  3. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  4. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  6. Introduction of an Evaluation Tool to Predict the Probability of Success of Companies: The Innovativeness, Capabilities and Potential Model (ICP)

    Directory of Open Access Journals (Sweden)

    Michael Lewrick

    2009-05-01

    Full Text Available Successful innovation requires management and in this paper a model to help manage the innovation process is presented. This model can be used to audit the management capability to innovate and to monitor how sales increase is related to innovativeness. The model was developed from a study of companies in the high technology cluster around Munich and validated using statistical procedures. The model was found to be effective at predicting the success or otherwise of the innovation strategy pursued by the company. The use of this model and how it can be used to identify areas for improvement are documented in this paper.

  7. The Future of Drones in the Modern Farming Industry

    Directory of Open Access Journals (Sweden)

    Nathan Stein

    2018-01-01

    partnerships between hardware and software providers are enabling the creation of more integrated, end-to-end drone solutions, capable of meeting the needs of agriculture professionals worldwide and influencing future developments in modern farming.

  8. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  9. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  10. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  11. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  12. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  13. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  14. The Public Health Practitioner of the Future.

    Science.gov (United States)

    Erwin, Paul Campbell; Brownson, Ross C

    2017-08-01

    The requisite capacities and capabilities of the public health practitioner of the future are being driven by multiple forces of change, including public health agency accreditation, climate change, health in all policies, social media and informatics, demographic transitions, globalized travel, and the repercussions of the Affordable Care Act. We describe five critical capacities and capabilities that public health practitioners can build on to successfully prepare for and respond to these forces of change: systems thinking and systems methods, communication capacities, an entrepreneurial orientation, transformational ethics, and policy analysis and response. Equipping the public health practitioner with the requisite capabilities and capacities will require new content and methods for those in public health academia, as well as a recommitment to lifelong learning on the part of the practitioner, within an increasingly uncertain and polarized political environment.

  15. Sensor Alerting Capability

    Science.gov (United States)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, language and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. The alerts can be configured at the service level and at the sensor data level. For example, it can alert on irregular data delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability decision makers can monitor their assets and data streams, correct failures or be alerted about a coming phenomenon.
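
    As a rough illustration of the data-level rule described above (not IAI's implementation), the sketch below compares a rolling statistic of a sensor stream against a preset threshold and emits an alert record when it is crossed; the readings, window and threshold are hypothetical.

      from collections import deque

      def rolling_mean_alerts(observations, window=5, threshold=30.0):
          """Yield (index, statistic) whenever the rolling mean crosses the threshold."""
          buf = deque(maxlen=window)
          for i, value in enumerate(observations):
              buf.append(value)
              if len(buf) == window:
                  stat = sum(buf) / window
                  if stat > threshold:
                      yield i, stat

      # Hypothetical sea-surface temperature readings from an SOS feed.
      readings = [28.1, 28.4, 28.9, 29.7, 30.5, 31.2, 31.0, 30.8]
      for index, stat in rolling_mean_alerts(readings, window=3, threshold=30.0):
          print(f"alert at observation {index}: rolling mean {stat:.2f} exceeds threshold")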

  16. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi while investigating with the rapidity gap probability (that measures the chance of finding no particle in the pseudo-rapidity interval Δη) found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of a void probability in galaxy correlation study. The main aim in this paper is to study the scaling behavior of the rapidity gap probability

  17. Interference suppression capabilities of smart cognitive-femto networks (SCFN)

    KAUST Repository

    Shakir, Muhammad; Atat, Rachad; Alouini, Mohamed-Slim

    2013-01-01

    Cognitive Radios are considered a standard part of future heterogeneous mobile network architectures. In this chapter, a two-tier heterogeneous network with multiple Radio Access Technologies (RATs) is considered, namely (1) the secondary network, which comprises Cognitive-Femto BS (CFBS), and (2) the macrocell network, which is considered a primary network. By exploiting the cooperation among the CFBS, the multiple CFBS can be considered a single base station with multiple geographically dispersed antennas, which can reduce the interference levels by directing the main beam toward the desired femtocell mobile user. The resultant network is referred to as Smart Cognitive-Femto Network (SCFN). In order to determine the effectiveness of the proposed smart network, the interference rejection capabilities of the SCFN are studied. It has been shown that the smart network offers significant performance improvements in interference suppression and Signal to Interference Ratio (SIR) and may be considered a promising solution to the interference management problems in future heterogeneous networks. © 2013, IGI Global.

  18. Dynamic Capabilities of Universities in the Knowledge Economy

    Directory of Open Access Journals (Sweden)

    Ruxandra BEJINARU

    2017-12-01

    Full Text Available The purpose of this paper is to analyze how universities can develop dynamic capabilities based on their strategic resources in order to increase their competitiveness in the knowledge economy. ”Dynamic capabilities” is a concept introduced by Teece, Pisano, and Shuen (1997 to emphasize the managerial capacity of a given organization of using efficiently its strategic resources in transforming opportunities in business success. They reflect the capacity of the organization to sense, seize, adapt to the changing environment. For a university, the most important strategic resources are information, knowledge, and ideas which constitute its intellectual capital. The present paper is analyzing critically these strategic resources of a university and the necessary conditions to develop dynamic capabilities in order to use efficiently these resources in a turbulent economic environment. Universities are in the knowledge economy the most important hubs for knowledge creation and its transfer to the students, as well as to their communities. At the same time, professors contribute to the development of the generic thinking skills of their students to help them for employability in a future with many new jobs and new business activities.

  19. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
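
    For orientation, the classical conditionalization that the paper generalizes is ordinary Bayes' theorem, P(h|e) = P(e|h)P(h)/P(e); a minimal sketch with hypothetical numbers (the paraconsistent version itself is not reproduced here):

      def bayes_update(prior, likelihood):
          """Classical conditionalization: return the posterior P(h|e) for each hypothesis h."""
          evidence = sum(prior[h] * likelihood[h] for h in prior)  # P(e)
          return {h: prior[h] * likelihood[h] / evidence for h in prior}

      # Hypothetical example: two hypotheses and the likelihood of one piece of evidence.
      prior = {"h": 0.3, "not_h": 0.7}
      likelihood = {"h": 0.9, "not_h": 0.2}   # P(e | h), P(e | not_h)
      print(bayes_update(prior, likelihood))  # posterior mass shifts towards "h"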

  20. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)
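
    On the simplest reading, the probability mass referred to above is the total absolute weight of a quasi-probability distribution, which equals 1 exactly when no entry is negative; the paper's precise definition and its minimization are not reproduced here. A small hypothetical signed joint distribution illustrates the quantity:

      import numpy as np

      # Hypothetical joint quasi-distribution q(a, b) for two binary outcomes; one entry
      # is negative, yet the marginals are still proper (non-negative) probabilities.
      q = np.array([[ 0.6, -0.1],
                    [ 0.1,  0.4]])

      print("total weight     :", q.sum())          # 1.0, as required
      print("probability mass :", np.abs(q).sum())  # > 1.0 signals negativity
      print("marginal over b  :", q.sum(axis=1))    # [0.5, 0.5]
      print("marginal over a  :", q.sum(axis=0))    # [0.7, 0.3]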

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book''s clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  2. A Future with Hope :China Agriculture Outlook 2007

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    China's macro economy has remained in a good and stable condition overall, experiencing an annual GDP growth of over 10% for several consecutive years. Under this basic condition, the main focus of the Outlook was China's current grain and oil supply, and the demand market with its probable future prices.

  3. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes is possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  4. Scenarios for future agriculture in Finland: a Delphi study among agri-food sector stakeholders

    Directory of Open Access Journals (Sweden)

    P. RIKKONEN

    2008-12-01

    Full Text Available This article presents alternative scenarios for future agriculture in Finland up to 2025. These scenarios are the results of a large Delphi study carried out among Finnish agri-food sector stakeholders. The Delphi panel members gave their future view on desirable and probable futures. From these two dimensions, three scenarios were elaborated through the future images – the subjective future path and the importance analysis. The scenarios represent a technology optimistic “day-dream agriculture”, a probable future as “industrialised agriculture” and an undesirable future path as “drifting agriculture”. Two mini-scenarios are also presented. They are based on a discontinuity event as an unexpected impact of climate change and an analogy event as an ecological breakdown due to the expansive animal disease epidemics. In both mini-scenarios, the directions of storylines are dramatically changed. The scenarios support strategic planning introducing not only one forecast but alternative outcomes as a basis for future strategy and decisions. In this study the scenarios were constructed to address the opportunities as a desired vision and also the threats as to an undesirable future in the agricultural sector. These results bring to the table a Finnish agri-food expert community view of the future directions of relevant key issues in the agricultural policy agenda.;

  5. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  6. History and future of remote sensing technology and education

    Science.gov (United States)

    Colwell, R. N.

    1980-01-01

    A historical overview of the discovery and development of photography, related sciences, and remote sensing technology is presented. The role of education to date in the development of remote sensing is discussed. The probable future and potential of remote sensing and training is described.

  7. Annunciation - building product team capabilities to support utility operational improvement

    International Nuclear Information System (INIS)

    Doucet, R.; Brown, R.; Trask, D.; Leger, R.; Mitchel, G.; Judd, R.; Davey, E.

    2003-01-01

    The purpose of this paper is to describe an AECL initiative to enhance the capabilities to assist utilities with undertaking annunciation improvement. This initiative was undertaken to complement a recent annunciation product upgrade, and in anticipation of developing commercial opportunities to assist Canadian and foreign utilities with control room annunciation improvement. Utilities are relying more and more on external engineering product and service providers to meet their plant support needs as they reduce in-house staffing to lower ongoing support costs. This evolving commercial environment places new demands on product and service providers, and provides new opportunities for increasing the proportion of product and service provider participation in plant improvement projects. This paper outlines recent AECL experience in the annunciation product area. The paper discusses the rationale for product support capability improvement, discusses the approaches undertaken, describes lessons learned, and outlines a proposed utility support model for assisting with future annunciation improvements. (author)

  8. The Los Alamos universe: Using multimedia to promote laboratory capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Kindel, J.

    2000-03-01

    This project consists of a multimedia presentation that explains the technological capabilities of Los Alamos National Laboratory. It takes the form of a human-computer interface built around the metaphor of the universe. The project is intended promote Laboratory capabilities to a wide audience. Multimedia is simply a means of communicating information through a diverse set of tools--be they text, sound, animation, video, etc. Likewise, Los Alamos National Laboratory is a collection of diverse technologies, projects, and people. Given the ample material available at the Laboratory, there are tangible benefits to be gained by communicating across media. This paper consists of three parts. The first section provides some basic information about the Laboratory, its mission, and its needs. The second section introduces this multimedia presentation and the metaphor it is based on along with some basic concepts of color and user interaction used in the building of this project. The final section covers construction of the project, pitfalls, and future improvements.

  9. A conceptual framework and classification of capability areas for business process maturity

    Science.gov (United States)

    Van Looy, Amy; De Backer, Manu; Poels, Geert

    2014-03-01

    The article elaborates on business process maturity, which indicates how well an organisation can perform based on its business processes, i.e. on its way of working. This topic is of paramount importance for managers who try to excel in today's competitive world. Hence, business process maturity is an emerging research field. However, no consensus exists on the capability areas (or skills) needed to excel. Moreover, their theoretical foundation and synergies with other fields are frequently neglected. To overcome this gap, our study presents a conceptual framework with six main capability areas and 17 sub areas. It draws on theories regarding the traditional business process lifecycle, which are supplemented by recognised organisation management theories. The comprehensiveness of this framework is validated by mapping 69 business process maturity models (BPMMs) to the identified capability areas, based on content analysis. Nonetheless, as a consensus neither exists among the collected BPMMs, a classification of different maturity types is proposed, based on cluster analysis and discriminant analysis. Consequently, the findings contribute to the grounding of business process literature. Possible future avenues are evaluating existing BPMMs, directing new BPMMs or investigating which combinations of capability areas (i.e. maturity types) contribute more to performance than others.

  10. Higgs and SUSY searches at future colliders

    Indian Academy of Sciences (India)

    ... searches at future colliders, particularly comparing and contrasting the capabilities of LHC and next linear collider (NLC), including the aspects of Higgs searches in supersymmetric theories. I will also discuss how the search and study of sparticles other than the Higgs can be used to give information about the parameters ...

  11. Resource-Based Capability on Development Knowledge Management Capabilities of Coastal Community

    Science.gov (United States)

    Teniwut, Roberto M. K.; Hasyim, Cawalinya L.; Teniwut, Wellem A.

    2017-10-01

    Building sustainable knowledge management capabilities in the coastal area might face a whole new challenge since there are many intangible factors involved from openness on new knowledge, access and ability to use the latest technology to the various local wisdom that still in place. The aimed of this study was to identify and analyze the resource-based condition of coastal community in this area to have an empirical condition of tangible and intangible infrastructure on developing knowledge management capability coastal community in Southeast Maluku, Indonesia. We used qualitative and quantitative analysis by depth interview and questionnaire for collecting the data with multiple linear regression as our analysis method. The result provided the information on current state of resource-based capability of a coastal community in this Southeast Maluku to build a sustainability model of knowledge management capabilities especially on utilization marine and fisheries resources. The implication of this study can provide an empirical information for government, NGO and research institution to dictate on how they conducted their policy and program on developing coastal community region.

  12. Physics goals of future colliders

    International Nuclear Information System (INIS)

    Kane, G.L.

    1987-01-01

    These lectures describe some of the physics goals that future colliders are designed to achieve. Emphasis is on the SSC, but its capabilities are compared to those of other machines, and set in a context of what will be measured before the SSC is ready. Physics associated with the Higgs sector is examined most thoroughly, with a survey of the opportunities to find evidence of extended gauge theories

  13. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  14. Multi-path transportation futures study : vehicle characterization and scenario analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Plotkin, S. E.; Singh, M. K.; Energy Systems; TA Engineering; ORNL

    2009-12-03

    Projecting the future role of advanced drivetrains and fuels in the light vehicle market is inherently difficult, given the uncertainty (and likely volatility) of future oil prices, inadequate understanding of likely consumer response to new technologies, the relative infancy of several important new technologies with inevitable future changes in their performance and costs, and the importance - and uncertainty - of future government marketplace interventions (e.g., new regulatory standards or vehicle purchase incentives). This Multi-Path Transportation Futures (MP) Study has attempted to improve our understanding of this future role by examining several scenarios of vehicle costs, fuel prices, government subsidies, and other key factors. These are projections, not forecasts, in that they try to answer a series of 'what if' questions without assigning probabilities to most of the basic assumptions.

  15. Measurement of the two track separation capability of hybrid pixel sensors

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz, F.J., E-mail: Francisca.MunozSanchez@manchester.ac.uk [University of Manchester (United Kingdom); Battaglia, M. [University of California, Santa Cruz, United States of America (United States); CERN, The European Organization for Nuclear Research (Switzerland); Da Vià, C. [University of Manchester (United Kingdom); La Rosa, A. [University of California, Santa Cruz, United States of America (United States); Dann, N. [University of Manchester (United Kingdom)

    2017-02-11

    Large Hadron Collider experiments face new challenges in Run-2 conditions due to the increased beam energy, the interest in searches of new physics signals with higher jet pT and the consequent longer decay length of heavy hadrons. In this new scenario, the capability of the innermost pixel sensors to distinguish tracks in very dense environments becomes crucial for efficient tracking and flavour tagging performance. In this work, we discuss the measurement in a test beam of the two track separation capability of hybrid pixel sensors using the interaction particles out of the collision of high energy pions on a thin copper target. With this method we are able to evaluate the effect of merged hits in the sensors under test due to tracks closer than the sensor spatial granularity in terms of collected charge, multiplicity and reconstruction efficiency. - Highlights: • Measurement of the two-track separation capability of hybrid pixel sensors. • Emulating a track-dense environment with a copper target in a test beam. • Copper target in between telescope arms to create vertices. • Validation of simulation and reconstruction algorithm for future vertex detectors. • New qualification method for pixel modules in track-dense environments.

  16. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  17. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  18. Quantifying Detection Probabilities for Proliferation Activities in Undeclared Facilities

    International Nuclear Information System (INIS)

    Listner, C.; Canty, M.; Niemeyer, I.; Rezniczek, A.; Stein, G.

    2015-01-01

    International Safeguards is currently in an evolutionary process to increase effectiveness and efficiency of the verification system. This is an obvious consequence of the inability to detect the Iraq's clandestine nuclear weapons programme in the early 90s. By the adoption of the Programme 93+2, this has led to the development of Integrated Safeguards and the State-level concept. Moreover, the IAEA's focus was extended onto proliferation activities outside the State's declared facilities. The effectiveness of safeguards activities within declared facilities can and have been quantified with respect to costs and detection probabilities. In contrast, when verifying the absence of undeclared facilities this quantification has been avoided in the past because it has been considered to be impossible. However, when balancing the allocation of budget between the declared and the undeclared field, explicit reasoning is needed why safeguards effort is distributed in a given way. Such reasoning can be given by a holistic, information and risk-driven approach to Acquisition Path Analysis comprising declared and undeclared facilities. Regarding the input, this approach relies on the quantification of several factors, i.e., costs of attractiveness values for specific proliferation activities, potential safeguards measures and detection probabilities for these measures also for the undeclared field. In order to overcome the lack of quantification for detection probabilities in undeclared facilities, the authors of this paper propose a general verification error model. Based on this model, four different approaches are explained and assessed with respect to their advantages and disadvantages: the analogy approach, the Bayes approach, the frequentist approach and the process approach. The paper concludes with a summary and an outlook on potential future research activities. (author)
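
    As a purely illustrative sketch of the Bayes approach mentioned above (the conjugate Beta model and all numbers are assumptions, not taken from the paper), a prior belief about a detection probability can be updated with the outcomes of simulated or expert-elicited trials:

      # Hypothetical Bayes-approach illustration: a Beta(2, 2) prior on a detection
      # probability is updated with outcomes of assumed trials for one safeguards measure.
      alpha, beta = 2.0, 2.0             # assumed prior pseudo-counts
      detections, misses = 7, 3          # hypothetical trial results
      alpha += detections
      beta += misses
      posterior_mean = alpha / (alpha + beta)
      print(f"posterior mean detection probability: {posterior_mean:.2f}")  # 0.64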

  19. Capability and Mechanisms of Macrofungi in Heavy Metal Accumulation:A Review

    Directory of Open Access Journals (Sweden)

    CHEN Miao-miao

    2017-10-01

    Full Text Available Some macrofungi have the ability to accumulate heavy metals, which is comparable to hyper-accumulator plants. Cordyceps militaris can accumulate Zn up to 20 000 mg·kg-1. Therefore, macrofungi have the potential to be used as an important bioremediation tool for heavy metals. In this review, we summarized the heavy metal resistant capacity of typical macrofungi and known relevant mechanisms. Generally, straw-decay fungi presented better capability for Cu, Ag and Cd enrichment than wood-decay fungi, while wood-decay fungi could accumulate Cr, Mg, Se and Pb. Different macrofungi species, different growth periods (mycelium and fruiting body) and different parts of the fruiting body showed different capability for heavy metals accumulation. General mechanisms for heavy metals accumulation in macrofungi included extracellular precipitation in the forms of polymeric substances, cell wall adsorption and intracellular absorption. Macrofungi could also detoxify by chelating metal ions with metallothionein (MT), secreting antioxidant enzymes (SOD, CAT, POD) and degrading misfolded proteins via the ubiquitin-proteasome system (UPS). We also explored the potential of macrofungi in heavy metal remediation and pollution diagnostics as a biological indicator. Some macrofungi had been applied in the remediation of heavy metal contaminated soils and water. Finally, some future research areas including strain breeding and genetic engineering were discussed, which might provide references for future studies.

  20. NASA's Space Launch System: An Enabling Capability for International Exploration

    Science.gov (United States)

    Creech, Stephen D.; May, Todd A.; Robinson, Kimberly F.

    2014-01-01

    As the program moves out of the formulation phase and into implementation, work is well underway on NASA's new Space Launch System, the world's most powerful launch vehicle, which will enable a new era of human exploration of deep space. As assembly and testing of the rocket is taking place at numerous sites around the United States, mission planners within NASA and at the agency's international partners continue to evaluate utilization opportunities for this ground-breaking capability. Developed with the goals of safety, affordability, and sustainability in mind, the SLS rocket will launch the Orion Multi-Purpose Crew Vehicle (MPCV), equipment, supplies, and major science missions for exploration and discovery. NASA is developing this new capability in an austere economic climate, a fact which has inspired the SLS team to find innovative solutions to the challenges of designing, developing, fielding, and operating the largest rocket in history, via a path that will deliver an initial 70 metric ton (t) capability in December 2017 and then continuing through an incremental evolutionary strategy to reach a full capability greater than 130 t. SLS will be enabling for the first missions of human exploration beyond low Earth in almost half a century, and from its first crewed flight will be able to carry humans farther into space than they have ever voyaged before. In planning for the future of exploration, the International Space Exploration Coordination Group, representing 12 of the world's space agencies, has created the Global Exploration Roadmap, which outlines paths toward a human landing on Mars, beginning with capability-demonstrating missions to the Moon or an asteroid. The Roadmap and corresponding NASA research outline the requirements for reference missions for these destinations. SLS will offer a robust way to transport international crews and the air, water, food, and equipment they would need for such missions.

  1. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience
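
    The expected count quoted above follows from multiplying the per-MCO probabilities by the 400 staged MCOs; a one-line check (the rupture-disk count is implied by the stated 0.01 probability, not quoted in the abstract):

      n_mco = 400
      p_relief_valve = 0.08   # P(water content > 1.6 kg), actuating the relief valve
      p_rupture_disk = 0.01   # P(water content > 2.5 kg), actuating the rupture disk
      print(n_mco * p_relief_valve)  # 32.0 MCOs expected to reach relief-valve pressure
      print(n_mco * p_rupture_disk)  #  4.0 MCOs expected to reach rupture-disk pressure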

  2. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  3. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
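
    A minimal sketch of the contrast the activity draws, using the probability of drawing an ace: the classical (aprioristic) value is 4/52, while a frequentist estimate is the relative frequency over many simulated draws. The simulation below is illustrative and not part of the article's activity.

      import random

      # Classical / aprioristic: favourable cases over equally likely cases.
      classical = 4 / 52

      # Frequentist: relative frequency of aces over many simulated draws.
      random.seed(1)
      deck = ["ace"] * 4 + ["other"] * 48
      n_draws = 100_000
      frequentist = sum(random.choice(deck) == "ace" for _ in range(n_draws)) / n_draws

      print(f"classical: {classical:.4f}, frequentist estimate: {frequentist:.4f}")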

  4. Development of students learning capabilities and professional capabilities

    DEFF Research Database (Denmark)

    Ringtved, Ulla Lunde; Wahl, Christian; Belle, Gianna

    This paper describes the work in progress on a project that aims to develop a tool that, via learning analytics methods, enables students to enhance, document and assess the development of their learning capabilities and professional capabilities in consequence of their self-initiated study activities during their bachelor educations. The tool aims at enhancing the development of students' capabilities to self-initiate, self-regulate and self-assess their study activities. The tool uses the concept of collective intelligence as a source for motivation and inspiration in self-initiating study activities as well as self-assessing them. The tool is based on a heutagogical approach to support reflection on the learning potential in these activities. This enhances the educational use of students' self-initiated learning activities by bringing visibility and evidence to them, and thereby bringing value to the assessment...

  5. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition. The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps, whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
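
    In the classical case, the construction described above reduces to ordinary conditioning of a finite joint distribution on a predicate; the sketch below shows only that classical instance, with hypothetical numbers, and none of the categorical machinery.

      # Joint distribution p(x, y) on a finite set, as a dictionary of probabilities.
      joint = {("x0", "y0"): 0.24, ("x0", "y1"): 0.16,
               ("x1", "y0"): 0.06, ("x1", "y1"): 0.54}

      def condition(joint, predicate):
          """Return the distribution of x conditional on the predicate holding for y."""
          weight = sum(p for (x, y), p in joint.items() if predicate(y))  # normalisation
          return {x: sum(p for (x2, y), p in joint.items() if x2 == x and predicate(y)) / weight
                  for x in {x for (x, _) in joint}}

      print(condition(joint, lambda y: y == "y1"))  # distribution of x given y = y1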

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  7. The Impact of Managerial and Adaptive Capabilities to Stimulate Organizational Innovation in SMEs: A Complementary PLS–SEM Approach

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2017-11-01

    Full Text Available The aim of this study is to empirically explore and propose a rigorous model for the positive impact of managerial capability (in terms of decision-making, management style, people development, and succession planning and adaptive capability (in terms of horizon scanning, change management, and resilience on organizational innovation in the context of small and medium-sized enterprises (SMEs. The study uses partial least squares structural equation modeling (PLS–SEM to test the model hypotheses, and importance-performance matrix analysis (IPMA to provide information regarding the significance and relevance of the dimensions of managerial and adaptive capability in explaining organizational innovation in the proposed model. The empirical data is gathered through questionnaires from 210 SMEs. The results show a strong and significant relationship between managerial capability, adaptive capability, and organizational innovation. This study found that all of the dimensions of managerial capability and adaptive capability help to develop and improve the performance of organizational innovation in SMEs. The study concludes with a comprehensive discussion of the research limitations, and provides suggestions for future research.

  8. Transportation Energy Futures Series: Freight Transportation Demand: Energy-Efficient Scenarios for a Low-Carbon Future

    Energy Technology Data Exchange (ETDEWEB)

    Grenzeback, L. R.; Brown, A.; Fischer, M. J.; Hutson, N.; Lamm, C. R.; Pei, Y. L.; Vimmerstedt, L.; Vyas, A. D.; Winebrake, J. J.

    2013-03-01

    Freight transportation demand is projected to grow to 27.5 billion tons in 2040, and to nearly 30.2 billion tons in 2050. This report describes the current and future demand for freight transportation in terms of tons and ton-miles of commodities moved by truck, rail, water, pipeline, and air freight carriers. It outlines the economic, logistics, transportation, and policy and regulatory factors that shape freight demand, the trends and 2050 outlook for these factors, and their anticipated effect on freight demand. After describing federal policy actions that could influence future freight demand, the report then summarizes the capabilities of available analytical models for forecasting freight demand. This is one in a series of reports produced as a result of the Transportation Energy Futures project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for reducing GHGs and petroleum dependence related to transportation.

  9. Potential economic benefits of adapting agricultural production systems to future climate change

    Science.gov (United States)

    Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.

    2010-01-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with locals producers; (2) simulating crop yields for two soils, crop prices, crop enterprises costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha averaged over all crop enterprises decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting
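
    A minimal sketch of the simulation step described above: a distribution is fitted to model-generated yields, 100 values are drawn from it, and they are combined with simulated prices and costs to give net returns. The lognormal choice, the price and cost models, and all numbers are hypothetical rather than those of the study.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      # Hypothetical EPIC-style yield output (t/ha) for one crop enterprise and soil.
      epic_yields = rng.gamma(shape=20.0, scale=0.15, size=45)

      # Fit a candidate distribution to the simulated yields and draw 100 values.
      shape, loc, scale = stats.lognorm.fit(epic_yields, floc=0.0)
      yields = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=100, random_state=rng)

      # Hypothetical price ($/t) and cost ($/ha) distributions.
      prices = rng.normal(180.0, 25.0, size=100)
      costs = rng.triangular(250.0, 300.0, 380.0, size=100)

      net_returns = yields * prices - costs
      print(f"mean net return per ha: {net_returns.mean():.1f} $")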

  10. Potential Economic Benefits of Adapting Agricultural Production Systems to Future Climate Change

    Science.gov (United States)

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E.; Williams, Jimmy R.

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with locals producers; (2) simulating crop yields for two soils, crop prices, crop enterprises costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha averaged over all crop enterprises decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to

  11. Potential economic benefits of adapting agricultural production systems to future climate change.

    Science.gov (United States)

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E; Williams, Jimmy R

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO(2) emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with locals producers; (2) simulating crop yields for two soils, crop prices, crop enterprises costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha averaged over all crop enterprises decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs

  12. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
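
    A minimal sketch of the sample-reuse idea described above, under simplifying assumptions that are not the paper's: failure occurs when a grown crack exceeds a critical size, the POD curve is the one-parameter form POD(a) = 1 - exp(-lam*a), and the sensitivity of the POF to lam is obtained by differentiating the non-detection weight while reusing the same Monte Carlo samples, then checked against a finite difference.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Hypothetical random inputs: crack size at inspection and a growth factor.
      a0 = rng.lognormal(mean=-1.0, sigma=0.4, size=n)   # crack size at inspection (mm)
      growth = rng.lognormal(mean=1.2, sigma=0.3, size=n)
      a_final = a0 * growth
      a_crit = 12.0                                      # assumed critical crack size (mm)

      lam = 0.8                                          # POD parameter: POD(a) = 1 - exp(-lam*a)
      fails = (a_final > a_crit).astype(float)
      miss = np.exp(-lam * a0)                           # probability the inspection misses the crack

      pof = np.mean(fails * miss)
      # Sensitivity dPOF/dlam from differentiating the weight, reusing the same samples.
      dpof_dlam = np.mean(fails * (-a0) * miss)

      # Finite-difference check on the identical sample set.
      eps = 1e-4
      pof_eps = np.mean(fails * np.exp(-(lam + eps) * a0))
      print(pof, dpof_dlam, (pof_eps - pof) / eps)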

  13. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ►The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  14. Capitalizing on capabilities.

    Science.gov (United States)

    Ulrich, Dave; Smallwood, Norm

    2004-06-01

    By making the most of organizational capabilities--employees' collective skills and fields of expertise--you can dramatically improve your company's market value. Although there is no magic list of proficiencies that every organization needs in order to succeed, the authors identify 11 intangible assets that well-managed companies tend to have: talent, speed, shared mind-set and coherent brand identity, accountability, collaboration, learning, leadership, customer connectivity, strategic unity, innovation, and efficiency. Such companies typically excel in only three of these capabilities while maintaining industry parity in the other areas. Organizations that fall below the norm in any of the 11 are likely candidates for dysfunction and competitive disadvantage. So you can determine how your company fares in these categories (or others, if the generic list doesn't suit your needs), the authors explain how to conduct a "capabilities audit," describing in particular the experiences and findings of two companies that recently performed such audits. In addition to highlighting which intangible assets are most important given the organization's history and strategy, this exercise will gauge how well your company delivers on its capabilities and will guide you in developing an action plan for improvement. A capabilities audit can work for an entire organization, a business unit, or a region--indeed, for any part of a company that has a strategy to generate financial or customer-related results. It enables executives to assess overall company strengths and weaknesses, senior leaders to define strategy, midlevel managers to execute strategy, and frontline leaders to achieve tactical results. In short, it helps turn intangible assets into concrete strengths.

  15. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  16. Assessing the origins, evolution and prospects of the literature on dynamic capabilities: A bibliometric analysis

    Directory of Open Access Journals (Sweden)

    Gema Albort-Morant

    2018-01-01

    Full Text Available The purpose of this study is to serve as orientation and guidance to academics that are starting or currently developing their research within the field of dynamic capabilities, in order to enhance their knowledge about which are the key scientific journals, authors and articles shaping this topic. This paper presents a bibliometric analysis on dynamic capabilities, making use of the Web of Science database to perform it. This analysis comprises fundamental issues such as (i the number of studies published per year, (ii the countries with the highest rate of productivity, (iii the most prolific and influential authors, (iv assessment of studies citing dynamic capabilities, and (v the most productive journals on dynamic capabilities and recent studies on this topic. Results reveal an exponential growth in the number of publications on dynamic capabilities for the 2000–2012 period. Although, since 2012 this growth has decelerated, the number of publications on this topic remains noteworthy. This study brings useful information for those academics and practitioners attempting to analyze and deepen within this particular field of research, at the same time that provides some insights concerning the future development and progress of the dynamic capabilities topic in the management, business and economics academic literature.

  17. Ecological Capability Evaluation of Rural Development by Means of GIS

    Directory of Open Access Journals (Sweden)

    J Nouri, R Sharifipour

    2004-07-01

    Full Text Available Execution of development and creation of appropriate points for rural development without considering ecological capability will result in the appearance of several environmental, economic and social problems. This research is done in an analysis approach frame of a system with the aim of choosing the most suitable location for rural development in Abadeh with an area of 22,000 km2. in 2002 by applying geographic information system (GIS precious tools. Based on the above objective, ecological resources of concerned area were recognized and surveyed. The obtained data changed into digital figures and together with the other descriptive data were shifted to Arc/Info and Arcview systems for the purpose of creation of data base. Based on specific ecological models of Iran and special conditions of the area and by using structured query language (SQL in Arcview, the ecological capability of concerned area for rural development was determined. By considering current natural limitations, such as limited severe earthquake danger in central areas, limitation of flood danger in some of the central and western areas, development of evaporating deposits and salt domes in east and precipitation under 500mm in the studied area, no suitable place for the first grade rural development was found. However, it showed capability for second-grade rural development aspect. This area includes 3.8% of total area of the studied place. For improving present management in the studied region, it is recommended that in future development of the region, offered appropriated points while emphasizing on the land having low production capability to be considered.

  18. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are taken to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm that takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results of the IP algorithm and those obtained with the Ehrenfest method.
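    As an illustration of the consistency requirement described above (the notation here is a paraphrase, not taken from the paper), the fraction of trajectories evolving on electronic state $k$ should track the ensemble-averaged quantum population, and the per-step hopping probabilities $g_{k\to j}$ must preserve that balance:

    \[ \frac{N_k(t)}{N} \;\approx\; \overline{|c_k(t)|^2}, \qquad N_k(t+\Delta t) \;=\; N_k(t) + \sum_{j\neq k}\bigl[N_j(t)\,g_{j\to k} - N_k(t)\,g_{k\to j}\bigr], \]

    where $N_k$ is the number of trajectories currently running on surface $k$ and $c_k$ is the quantum amplitude of state $k$. The CP algorithm chooses trajectory-independent $g_{k\to j}$ from the averaged populations, while the IP algorithm sets them per trajectory equal to the quantum population of the target state.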

  19. High Pressure Oxygen Generation for Future Exploration Missions, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is the development of a cathode feed electrolysis cell stack capable of generating 3600 psi oxygen at a relevant scale for future exploration...

  20. Role of nuclear reactors in future military satellites

    International Nuclear Information System (INIS)

    Buden, D.; Angelo, J.A. Jr.

    1982-01-01

    Future military capabilities will be profoundly influenced by emerging Shuttle Era space technology. Regardless of the specific direction or content of tomorrow's military space program, it is clear that advanced space transportation systems, orbital support facilities, and large-capacity power subsystems will be needed to create the generally larger, more sophisticated military space systems of the future. This paper explores the critical role that space nuclear reactors should play in America's future space program and reviews the current state of nuclear reactor power plant technology. Space nuclear reactor technologies have the potential of satisfying power requirements ranging from 10 kW(e) to 100 MW(e).

  1. An evaluation of the historical issues associated with achieving non-helicopter V/STOL capability and the search for the flying car

    OpenAIRE

    Saeed, B; Gratton, GB

    2010-01-01

    Copyright @ 2010 The Royal Aeronautical Society. This article is the final author version of the published paper. Combined vertical and short take-off and landing, or 'V/STOL', capability has been in great demand and of interest in the field of aeronautics since the creation of the aircraft. V/STOL capability is a targeted capability for many projected or prototype future aircraft. Past V/STOL aircraft are reviewed and analysed with regard to their performance parameters. This research has fou...

  2. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  3. An eHealth Capabilities Framework for Graduates and Health Professionals: Mixed-Methods Study

    Science.gov (United States)

    McGregor, Deborah; Keep, Melanie; Janssen, Anna; Spallek, Heiko; Quinn, Deleana; Jones, Aaron; Tseris, Emma; Yeung, Wilson; Togher, Leanne; Solman, Annette; Shaw, Tim

    2018-01-01

    Background The demand for an eHealth-ready and adaptable workforce is placing increasing pressure on universities to deliver eHealth education. At present, eHealth education is largely focused on components of eHealth rather than considering a curriculum-wide approach. Objective This study aimed to develop a framework that could be used to guide health curriculum design based on current evidence, and stakeholder perceptions of eHealth capabilities expected of tertiary health graduates. Methods A 3-phase, mixed-methods approach incorporated the results of a literature review, focus groups, and a Delphi process to develop a framework of eHealth capability statements. Results Participants (N=39) with expertise or experience in eHealth education, practice, or policy provided feedback on the proposed framework, and following the fourth iteration of this process, consensus was achieved. The final framework consisted of 4 higher-level capability statements that describe the learning outcomes expected of university graduates across the domains of (1) digital health technologies, systems, and policies; (2) clinical practice; (3) data analysis and knowledge creation; and (4) technology implementation and codesign. Across the capability statements are 40 performance cues that provide examples of how these capabilities might be demonstrated. Conclusions The results of this study inform a cross-faculty eHealth curriculum that aligns with workforce expectations. There is a need for educational curriculum to reinforce existing eHealth capabilities, adapt existing capabilities to make them transferable to novel eHealth contexts, and introduce new learning opportunities for interactions with technologies within education and practice encounters. As such, the capability framework developed may assist in the application of eHealth by emerging and existing health care professionals. Future research needs to explore the potential for integration of findings into workforce development

  4. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based
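    As a rough illustration of how a ladder of binary contracts could be turned into a consensus distribution (the thresholds and prices below are hypothetical, not quotes from Intrade), each contract price can be read as the implied probability of exceeding its threshold, and differencing the implied CDF gives bin probabilities:

        # Hypothetical example: binary contracts "anomaly > T" priced between 0 and 1.
        import numpy as np

        thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])   # deg C (hypothetical)
        prices = np.array([0.90, 0.72, 0.48, 0.21, 0.06])       # price ~ P(anomaly > T)

        cdf = 1.0 - prices                                       # implied P(anomaly <= T)
        pdf_bins = np.diff(np.concatenate(([0.0], cdf, [1.0])))  # probability mass per bin
        edges = np.concatenate(([-np.inf], thresholds))
        for lo, p in zip(edges, pdf_bins):
            print(f"P(anomaly in bin starting at {lo}) = {p:.2f}")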

  5. The evolution of alliance capabilities

    NARCIS (Netherlands)

    Heimeriks, K.H.; Duysters, G.M.; Vanhaverbeke, W.P.M.

    2004-01-01

    This paper assesses the effectiveness and differential performance effects of learning mechanisms on the evolution of alliance capabilities. Relying on the concept of capability lifecycles, prior research has suggested that different capability levels could be identified in which different

  6. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  8. Future of the reprocessing business at the RT-1 plant

    International Nuclear Information System (INIS)

    Bukharin, O.

    1995-01-01

    Economic viability of reprocessing operations at the RT-1 plant is provided by contracts with nuclear utilities from Finland and Hungary. Finland will stop sending fuel to Mayak for reprocessing after 1996. Hungary will be capable of resolving the problem of spent fuel domestically some time in the future. This increases the vulnerability of the reprocessing business at Mayak to future political uncertainties. (author)

  9. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  10. THE CLOUD TECHNOLOGIES IN PROFESSIONAL EDUCATION OF THE FUTURE ECONOMISTS

    OpenAIRE

    Yu. Dyulicheva

    2014-01-01

    The use of cloud services in the professional education of future economists is investigated. The following cloud services are analyzed in the paper: 1) the cloud service gantter for project management, resource management and risk evaluation, with examples of its capabilities for Gantt chart creation and project critical path determination; 2) the cloud service SageMath Cloud with capabilities of programming languages R, Python, Cytho...

  11. Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.

    Science.gov (United States)

    Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J

    2009-07-10

    There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should encourage today's medical students to appreciate the subjects' relevance to their future careers. Copyright 2009 John Wiley & Sons, Ltd.

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
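    Two textbook relations behind the approach outlined above may help fix ideas (an illustration, not the authors' derivation): the encounter probability of the design significant wave height with return period $T_R$ years over a structure lifetime of $L$ years, and the characteristic largest of $N$ Rayleigh-distributed individual wave heights for significant wave height $H_s$,

    \[ p_{enc} = 1 - \Bigl(1 - \frac{1}{T_R}\Bigr)^{L}, \qquad H_{max} \approx H_s\,\sqrt{\tfrac{1}{2}\ln N}. \]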

  14. Systems Engineering for Space Exploration Medical Capabilities

    Science.gov (United States)

    Mindock, Jennifer; Reilly, Jeffrey; Rubin, David; Urbina, Michelle; Hailey, Melinda; Hanson, Andrea; Burba, Tyler; McGuire, Kerry; Cerro, Jeffrey; Middour, Chris

    2017-01-01

    Human exploration missions that reach destinations beyond low Earth orbit, such as Mars, will present significant new challenges to crew health management. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is applying systems engineering principles and practices to accomplish its goals. This paper discusses the structured and integrative approach that is guiding the medical system technical development. Assumptions for the required levels of care on exploration missions, medical system goals, and a Concept of Operations are early products that capture and clarify stakeholder expectations. Model-Based Systems Engineering techniques are then applied to define medical system behavior and architecture. Interfaces to other flight and ground systems, and within the medical system are identified and defined. Initial requirements and traceability are established, which sets the stage for identification of future technology development needs. An early approach for verification and validation, taking advantage of terrestrial and near-Earth exploration system analogs, is also defined to further guide system planning and development.

  15. Needs and Requirements for Future Research Reactors (ORNL Perspectives)

    Energy Technology Data Exchange (ETDEWEB)

    Ilas, Germina [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bryan, Chris [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gehin, Jess C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-02-10

    The High Flux Isotope Reactor (HFIR) is a vital national and international resource for neutron science research, production of radioisotopes, and materials irradiation. While HFIR is expected to continue operation for the foreseeable future, interest is growing in understanding future research reactors features, needs, and requirements. To clarify, discuss, and compile these needs from the perspective of Oak Ridge National Laboratory (ORNL) research and development (R&D) missions, a workshop, titled “Needs and Requirements for Future Research Reactors”, was held at ORNL on May 12, 2015. The workshop engaged ORNL staff that is directly involved in research using HFIR to collect valuable input on the reactor’s current and future missions. The workshop provided an interactive forum for a fruitful exchange of opinions, and included a mix of short presentations and open discussions. ORNL staff members made 15 technical presentations based on their experience and areas of expertise, and discussed those capabilities of the HFIR and future research reactors that are essential for their current and future R&D needs. The workshop was attended by approximately 60 participants from three ORNL directorates. The agenda is included in Appendix A. This document summarizes the feedback provided by workshop contributors and participants. It also includes information and insights addressing key points that originated from the dialogue started at the workshop. A general overview is provided on the design features and capabilities of high performance research reactors currently in use or under construction worldwide. Recent and ongoing design efforts in the US and internationally are briefly summarized, followed by conclusions and recommendations.

  16. OPSAID improvements and capabilities report.

    Energy Technology Data Exchange (ETDEWEB)

    Halbgewachs, Ronald D.; Chavez, Adrian R.

    2011-08-01

    Process Control System (PCS) and Industrial Control System (ICS) security is critical to our national security. But there are a number of technological, economic, and educational impediments to PCS owners implementing effective security on their systems. Sandia National Laboratories has performed the research and development of the OPSAID (Open PCS Security Architecture for Interoperable Design), a project sponsored by the US Department of Energy Office of Electricity Delivery and Energy Reliability (DOE/OE), to address this issue. OPSAID is an open-source architecture for PCS/ICS security that provides a design basis for vendors to build add-on security devices for legacy systems, while providing a path forward for the development of inherently-secure PCS elements in the future. Using standardized hardware, a proof-of-concept prototype system was also developed. This report describes the improvements and capabilities that have been added to OPSAID since an initial report was released. Testing and validation of this architecture has been conducted in another project, Lemnos Interoperable Security Project, sponsored by DOE/OE and managed by the National Energy Technology Laboratory (NETL).

  17. Study on Informational Transaction and Its Effect on China's Stock Index Futures Market

    Directory of Open Access Journals (Sweden)

    Hongli Che

    2014-01-01

    Full Text Available Information is one of the important factors that influence the behavior of investors and, in turn, the prices of risky assets in the market. First, the procedure developed by Easley et al. (2011) is used to estimate the Volume-Synchronized Probability of Informed Trading (VPIN) of the Chinese stock index futures market. VPIN is then examined for specific scenarios. We find that the futures contracts generally involve a large amount of informed trading. We also find that, for particular scenarios, the probability of informed trading in the market shows clear anomalies. The larger the proportion of informed traders, the higher the price volatility.
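    A minimal sketch of the VPIN calculation, under simplifying assumptions made for this illustration (bar-level data, whole bars assigned to each volume bucket, Gaussian bulk volume classification) rather than the exact procedure of Easley et al. (2011):

        import numpy as np
        from scipy.stats import norm

        def vpin(prices, volumes, bucket_volume, window=50):
            """prices, volumes: per-bar NumPy arrays; returns a rough VPIN series."""
            dp = np.diff(prices, prepend=prices[0])
            sigma = np.std(dp) or 1.0
            buy_frac = norm.cdf(dp / sigma)               # bulk classification of each bar
            buy_vol, sell_vol = volumes * buy_frac, volumes * (1 - buy_frac)

            # Fill equal-volume buckets sequentially (coarse: whole bars per bucket).
            imbalances, cur_buy, cur_sell = [], 0.0, 0.0
            for b, s in zip(buy_vol, sell_vol):
                cur_buy, cur_sell = cur_buy + b, cur_sell + s
                if cur_buy + cur_sell >= bucket_volume:
                    imbalances.append(abs(cur_buy - cur_sell))
                    cur_buy, cur_sell = 0.0, 0.0

            imbalances = np.array(imbalances)
            # Rolling VPIN: average order imbalance over `window` buckets, per unit bucket volume.
            return np.array([imbalances[max(0, i - window + 1):i + 1].mean() / bucket_volume
                             for i in range(len(imbalances))])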

  18. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum–classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
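    The violation discussed above is commonly written as an interference correction to the classical law of total probability (a standard formula stated here for orientation, not quoted from the paper):

    \[ P(b) \;=\; P(1)\,P(b|1) + P(2)\,P(b|2) + 2\sqrt{P(1)\,P(b|1)\,P(2)\,P(b|2)}\,\cos\theta_b, \]

    where $P(i)$ is the probability of passage through slit $i$, $P(b|i)$ is the probability of detection at $b$ with only slit $i$ open, and the cosine term vanishes in the purely Kolmogorovian case.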

  19. Brandishing Cyberattack Capabilities

    Science.gov (United States)

    2013-01-01

    Advertising cyberwar capabilities may be helpful. It may back up a deterrence strategy. It might dissuade other states from conventional mischief or...to enable the attack. Many of the instruments of the attack remain with the target system, nestled in its log files, or even in the malware itself...debatable. Even if demonstrated, what worked yesterday may not work today. But difficult does not mean impossible. Advertising cyberwar capabilities

  20. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Science.gov (United States)

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116

  1. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Directory of Open Access Journals (Sweden)

    Dennis Akos

    2011-09-01

    Full Text Available Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.

  2. Relationship between innovation capability, innovation type, and firm performance

    Directory of Open Access Journals (Sweden)

    R.P. Jayani Rajapathirana

    2018-01-01

    Full Text Available Insurers are well versed in the litany of challenging conditions facing the sector. These challenges are economic, political, regulatory, legal, social, and technological. As a result of those pressures, the industry is experiencing increasing competition, muted growth, and an excess of capital. The increased connectivity among household and workplace devices, the development of autonomous vehicles and the rising threat of cyber attacks are transforming the way people live and the risks they need to mitigate with insurance products. Insurers need to adapt their business models to address these changes, which can threaten the growth of the industry (Deloitte, 2017). Innovation is widely regarded as a pinnacle success factor in a highly competitive and global economy. An innovation perspective draws a clear picture of the future opportunities that lie ahead. The main purpose of this paper is to explore the relationship among innovation capability, innovation type, and different aspects of firm performance, including innovation, market and financial performance, based on an empirical study covering the insurance industry in Sri Lanka. The research framework developed in this study was tested with 379 senior managers of insurance companies. The empirical verification of the assumptions of this model provides evidence that the relationships between innovation capability, innovation efforts and firm performance are significant and strong. The results of this study could support more effective management of innovation capability, which helps to deliver more effective innovation outcomes and generate better performance, and would benefit the management of insurance companies.

  3. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  4. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements to Gauss-Hermite quadrature for the complex probability function.
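    A minimal sketch of the Gauss-Hermite approach described above, assuming the complex probability (Faddeeva) function w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt with Im(z) > 0 (an illustration, not the code from the report):

        import numpy as np

        def faddeeva_gauss_hermite(z, n=64):
            """Approximate w(z) with an n-point Gauss-Hermite rule; assumes Im(z) > 0."""
            t, w = np.polynomial.hermite.hermgauss(n)   # roots and weights for weight exp(-t^2)
            return (1j / np.pi) * np.sum(w / (z - t))

        # Accuracy degrades as Im(z) approaches the real axis, where the pole nears the nodes;
        # compare against scipy.special.wofz if available.
        print(faddeeva_gauss_hermite(1.0 + 1.0j))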

  5. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  6. Probability judgments under ambiguity and conflict.

    Science.gov (United States)

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.

  7. KSC Technical Capabilities Website

    Science.gov (United States)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  8. Heavy-ion fusion: Future promise and future directions

    International Nuclear Information System (INIS)

    Dudziak, D.J.; Saylor, W.W.; Pendergrass, J.H.

    1986-01-01

    The previous several papers in this heavy-ion fusion special session have described work that has taken place as part of the Heavy-Ion Fusion Systems Assessment (HIFSA) project. Key technical issues in the design and costing of targets, accelerator systems, beam transport, reactor and balance-of-plant, and systems integration have been identified and described. The HIFSA systems model was used to measure the relative value of improvements in physics understanding and technology developments in many different areas. The result of this study has been to, within the limits of our 1986 imagination and creativity, define the "most attractive" future heavy-ion fusion (HIF) power plant at some time in the future (beyond the year 2020 in this case). The project has specifically avoided narrowing the focus to a point facility design; thus, the generic systems modeling capability developed in the process allows for a relative comparison among design options. The authors describe what are thought to be achievable breakthroughs and what the relative significance of the breakthroughs will be, although the specific mechanism for achieving some breakthroughs may not be clear at this point.

  9. Error detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3

    Science.gov (United States)

    Fujiwara, Toru; Kasami, Tadao; Lin, Shu

    1989-09-01

    The error-detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3 are investigated. These codes are also used for error detection in the data link layer of the Ethernet, a local area network. The weight distributions for various code lengths are calculated to obtain the probability of undetectable error and that of detectable error for a binary symmetric channel with bit-error rate between 0.00001 and 1/2.
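    For a linear code, the quantity computed from the weight distribution is the standard undetected-error probability on a binary symmetric channel (a textbook formula, stated here for orientation rather than taken from the paper):

    \[ P_{ud}(\varepsilon) \;=\; \sum_{i=1}^{n} A_i\,\varepsilon^{i}(1-\varepsilon)^{\,n-i}, \]

    where $A_i$ is the number of codewords of Hamming weight $i$, $n$ is the code length, and $\varepsilon$ is the channel bit-error rate; an undetected error occurs exactly when the error pattern is itself a nonzero codeword.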

  10. Organizational Economics of Capability and Heterogeneity

    DEFF Research Database (Denmark)

    Argyres, Nicholas S.; Felin, Teppo; Foss, Nicolai Juul

    2012-01-01

    For decades, the literatures on firm capabilities and organizational economics have been at odds with each other, specifically relative to explaining organizational boundaries and heterogeneity. We briefly trace the history of the relationship between the capabilities literature and organizational...... economics, and we point to the dominance of a “capabilities first” logic in this relationship. We argue that capabilities considerations are inherently intertwined with questions about organizational boundaries and internal organization, and we use this point to respond to the prevalent capabilities first...... logic. We offer an integrative research agenda that focuses first on the governance of capabilities and then on the capability of governance....

  11. Future of dual-use space awareness technologies

    Science.gov (United States)

    Kislitsyn, Boris V.; Idell, Paul S.; Crawford, Linda L.

    2000-10-01

    The use of all classes of space systems, whether owned by defense, civil, commercial, scientific, allied or foreign organizations, is increasing rapidly. In turn, the surveillance of such systems and activities in space is of interest to all parties. Interest will only increase over time and with new ways to exploit the space environment. However, the current space awareness infrastructure and capabilities are not keeping pace with the demands and advanced technologies being brought online. The use of surveillance technologies, some of which will be discussed in the conference, will provide the eventual capability to observe and assess the environment, satellite health and status, and the uses of assets on orbit. This provides a space awareness that is critical to the military operator and to the commercial entrepreneur for their respective successes. Thus the term 'dual-use technologies' has become a reality. For this reason we will briefly examine the background, current, and future technology trends that can lead us to some insights for future products and services.

  12. Metrology Measurement Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Glen E. Gronniger

    2007-10-02

    This document contains descriptions of Federal Manufacturing & Technologies (FM&T) Metrology capabilities, traceability flow charts, and the measurement uncertainty of each measurement capability. Metrology provides NIST traceable precision measurements or equipment calibration for a wide variety of parameters, ranges, and state-of-the-art uncertainties. Metrology laboratories conform to the requirements of the Department of Energy Development and Production Manual Chapter 13.2, ANSI/ISO/IEC 17025:2005, and ANSI/NCSL Z540-1. FM&T Metrology laboratories are accredited by NVLAP for the parameters, ranges, and uncertainties listed in the specific scope of accreditation under NVLAP Lab code 200108-0. See the Internet at http://ts.nist.gov/Standards/scopes/2001080.pdf. These parameters are summarized. The Honeywell Federal Manufacturing & Technologies (FM&T) Metrology Department has developed measurement technology and calibration capability in four major fields of measurement: (1) Mechanical; (2) Environmental, Gas, Liquid; (3) Electrical (DC, AC, RF/Microwave); and (4) Optical and Radiation. Metrology Engineering provides the expertise to develop measurement capabilities for virtually any type of measurement in the fields listed above. A strong audit function has been developed to provide a means to evaluate the calibration programs of our suppliers and internal calibration organizations. Evaluation includes measurement audits and technical surveys.

  13. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
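    The core scoring idea, a quasi-log-likelihood built from single order statistics of the probability-integral-transformed sample, can be sketched as follows (a simplified illustration under assumptions of this sketch, not the authors' implementation):

        import numpy as np
        from scipy import stats

        def order_statistic_score(sample, trial_cdf):
            """Higher score means the transformed sample looks more uniform."""
            u = np.sort(trial_cdf(np.asarray(sample)))        # probability integral transform
            n = len(u)
            ranks = np.arange(1, n + 1)
            # The i-th uniform order statistic follows Beta(i, n - i + 1).
            return np.sum(stats.beta.logpdf(u, ranks, n - ranks + 1))

        rng = np.random.default_rng(0)
        data = rng.normal(loc=2.0, scale=0.5, size=500)
        good = order_statistic_score(data, stats.norm(2.0, 0.5).cdf)   # matching trial CDF
        bad = order_statistic_score(data, stats.norm(0.0, 1.0).cdf)    # mismatched trial CDF
        print(good > bad)   # the better-matching CDF scores higher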

  14. Some thoughts on the future of neutron scattering

    International Nuclear Information System (INIS)

    Egelstaff, P.A.

    1991-01-01

    Attendees at ICANS meetings believe that neutron scattering has a bright future, but critics of neutron scattering argue that its practitioners are an aging group, that they use a few, very expensive neutron sources and that the interesting science may be done by other techniques. The ICANS committee asked me to comment on the future of neutron scattering in the light of this contrast. Some comments will be made on the age distribution, on the proper distribution of sources, on the convenient availability of neutron instruments and methods, on the expansion into new areas of science, on applications to industry and on the probable impact of synchrotron sources. It is hoped that these comments will lead to an outward-looking discussion on the future. (author)

  15. Space station evolution: Planning for the future

    Science.gov (United States)

    Diaz, Alphonso V.; Askins, Barbara S.

    1987-06-01

    The need for permanently manned presence in space has been recognized by the United States and its international partners for many years. The development of this capability was delayed due to the concurrent recognition that reusable earth-to-orbit transportation was also needed and should be developed first. While the decision to go ahead with a permanently manned Space Station was on hold, requirements for the use of the Station were accumulating as ground-based research and the data from unmanned spacecraft sparked the imagination of both scientists and entrepreneurs. Thus, by the time of the Space Station implementation decision in the early 1980's, a variety of disciplines, with a variety of requirements, needed to be accommodated on one Space Station. Additional future requirements could be forecast for advanced missions that were still in the early planning stages. The logical response was the development of a multi-purpose Space Station with the ability to evolve on-orbit to new capabilities as required by user needs and national or international decisions, i.e., to build an evolutionary Space Station. Planning for evolution is conducted in parallel with the design and development of the baseline Space Station. Evolution planning is a strategic management process to facilitate change and protect future decisions. The objective is not to forecast the future, but to understand the future options and the implications of these on today's decisions. The major actions required now are: (1) the incorporation of evolution provisions (hooks and scars) in the baseline Space Station; and (2) the initiation of an evolution advanced development program.

  16. Space station evolution: Planning for the future

    Science.gov (United States)

    Diaz, Alphonso V.; Askins, Barbara S.

    1987-01-01

    The need for permanently manned presence in space has been recognized by the United States and its international partners for many years. The development of this capability was delayed due to the concurrent recognition that reusable earth-to-orbit transportation was also needed and should be developed first. While the decision to go ahead with a permanently manned Space Station was on hold, requirements for the use of the Station were accumulating as ground-based research and the data from unmanned spacecraft sparked the imagination of both scientists and entrepreneurs. Thus, by the time of the Space Station implementation decision in the early 1980's, a variety of disciplines, with a variety of requirements, needed to be accommodated on one Space Station. Additional future requirements could be forecast for advanced missions that were still in the early planning stages. The logical response was the development of a multi-purpose Space Station with the ability to evolve on-orbit to new capabilities as required by user needs and national or international decisions, i.e., to build an evolutionary Space Station. Planning for evolution is conducted in parallel with the design and development of the baseline Space Station. Evolution planning is a strategic management process to facilitate change and protect future decisions. The objective is not to forecast the future, but to understand the future options and the implications of these on today's decisions. The major actions required now are: (1) the incorporation of evolution provisions (hooks and scars) in the baseline Space Station; and (2) the initiation of an evolution advanced development program.

  17. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capability for the United States is shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid to late 1990's: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  18. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  19. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 x 10^-7 accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test conditions for impact is approximately 10^-9 per mile.

  20. The Capability to Hold Property

    NARCIS (Netherlands)

    Claassen, Rutger

    2015-01-01

    This paper discusses the question of whether a capability theory of justice (such as that of Martha Nussbaum) should accept a basic “capability to hold property.” Answering this question is vital for bridging the gap between abstract capability theories of justice and their institutional