WorldWideScience

Sample records for improved coverage probabilities

  1. Impact of proof test interval and coverage on probability of failure of safety instrumented function

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Hu, Bin; Wang, Xiaodong

    2016-01-01

Highlights: • Introducing proof test coverage makes the calculation of the probability of failure for a SIF more accurate. • The probability of failure undetected by the proof test is defined separately as P_TIF and calculated. • P_TIF is quantified using a reliability block diagram and the simplified formula for PFD_avg. • Improving proof test coverage and adopting a reasonable test period can reduce the probability of failure for a SIF. - Abstract: Imperfection of the proof test can result in failure of the safety function of a safety instrumented system (SIS) at any time during its life period. IEC 61508 and other references ignore or only superficially analyze the imperfection of proof tests. To further study the impact of proof test imperfection on the probability of failure of a safety instrumented function (SIF), the necessity of proof testing and the influence of its imperfection on system performance are first analyzed theoretically. The probability of failure of the SIF resulting from the imperfection of the proof test is defined as the probability of test-independent failures (P_TIF), and P_TIF is calculated separately by introducing the proof test coverage and adopting a reliability block diagram, with reference to the simplified calculation formula for the average probability of failure on demand (PFD_avg). The results show that a shorter proof test period and a higher proof test coverage yield a smaller probability of failure for the SIF, and that the probability of failure calculated by introducing proof test coverage is more accurate.
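As a concrete illustration of how the test-independent term enters the calculation, the sketch below uses the common simplified 1oo1 approximation PFD ≈ λ_DU·τ/2, splitting the contribution revealed by proof tests from the P_TIF part that is never revealed before end of life. The function name and the example numbers are illustrative assumptions, not the paper's exact formulas or data.

```python
# Illustrative sketch (not the authors' exact model): a simplified 1oo1
# PFD_avg split into a proof-tested part and a test-independent part (P_TIF),
# following the common approximation PFD ~ lambda_DU * tau / 2.

def pfd_with_proof_test_coverage(lambda_du, test_interval_h, lifetime_h, coverage):
    """Return (pfd_tested, p_tif, total).

    lambda_du       : dangerous undetected failure rate [1/h]
    test_interval_h : proof test interval TI [h]
    lifetime_h      : system lifetime / mission time LT [h]
    coverage        : proof test coverage PTC in [0, 1]
    """
    pfd_tested = coverage * lambda_du * test_interval_h / 2.0   # revealed by proof tests
    p_tif = (1.0 - coverage) * lambda_du * lifetime_h / 2.0     # never revealed before LT
    return pfd_tested, p_tif, pfd_tested + p_tif

if __name__ == "__main__":
    # Example: lambda_DU = 1e-6 /h, yearly proof test, 10-year lifetime, 90% coverage
    tested, tif, total = pfd_with_proof_test_coverage(1e-6, 8760, 87600, 0.9)
    print(f"PFD (proof-tested part) = {tested:.2e}")
    print(f"P_TIF                   = {tif:.2e}")
    print(f"PFD_avg total           = {total:.2e}")
```

Shortening the test interval shrinks only the first term, while raising the coverage shifts weight out of the much larger lifetime-driven P_TIF term, which is why both levers reduce the overall probability of failure.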

  2. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  3. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham; Alouini, Mohamed-Slim

    2017-01-01

    This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP

  4. Clinical implementation of coverage probability planning for nodal boosting in locally advanced cervical cancer

    DEFF Research Database (Denmark)

    Ramlov, Anne; Assenholt, Marianne S; Jensen, Maria F

    2017-01-01

    PURPOSE: To implement coverage probability (CovP) for dose planning of simultaneous integrated boost (SIB) of pathologic lymph nodes in locally advanced cervical cancer (LACC). MATERIAL AND METHODS: CovP constraints for SIB of the pathological nodal target (PTV-N) with a central dose peak...

  5. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty estimates, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially well suited to anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold for a testing sample and is usually set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of the prediction interval (ROC-PI), based on the definition of the ROC curve, which depicts the trade-off between the PI width and the PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, and the CP that minimizes it is derived with the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its strong performance in practical application.
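The selection step can be pictured with a small sketch: candidate CPs of a Gaussian prediction interval are scored by a Youden-style criterion built from the true/false alarm rates of the resulting anomaly flags. The paper minimizes a modified Youden index with simulated annealing; here a plain grid search stands in for that optimizer, and mu/sigma are assumed to be the predictive mean and standard deviation from a GPR or RVM model.

```python
# Hypothetical sketch of CP selection via a Youden-style score; the grid search
# replaces the paper's simulated-annealing step for brevity.
import numpy as np
from scipy.stats import norm

def youden_style_score(is_anomaly, y_obs, mu, sigma, cp):
    """Smaller is better: (1 - TPR) + FPR for the two-sided interval at coverage cp."""
    is_anomaly, y_obs, mu, sigma = map(np.asarray, (is_anomaly, y_obs, mu, sigma))
    z = norm.ppf(0.5 + cp / 2.0)                 # interval half-width in predictive sigmas
    flagged = np.abs(y_obs - mu) > z * sigma     # samples falling outside the interval
    tpr = flagged[is_anomaly == 1].mean() if np.any(is_anomaly == 1) else 0.0
    fpr = flagged[is_anomaly == 0].mean() if np.any(is_anomaly == 0) else 0.0
    return (1.0 - tpr) + fpr

def best_cp(is_anomaly, y_obs, mu, sigma, grid=np.linspace(0.80, 0.999, 50)):
    """Return the candidate CP with the lowest Youden-style score."""
    scores = [youden_style_score(is_anomaly, y_obs, mu, sigma, cp) for cp in grid]
    return float(grid[int(np.argmin(scores))])
```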

  6. On the Meta Distribution of Coverage Probability in Uplink Cellular Networks

    KAUST Repository

    Elsawy, Hesham

    2017-04-07

This letter studies the meta distribution of coverage probability (CP), within a stochastic geometry framework, for cellular uplink transmission with fractional path-loss inversion power control. Using the widely accepted Poisson point process (PPP) for modeling the spatial locations of base stations (BSs), we obtain the percentiles of users that achieve a target uplink CP over an arbitrary, but fixed, realization of the PPP. To this end, the effects of the users' activity factor (p) and the path-loss compensation factor on the uplink performance are analyzed. The results show that decreasing p and/or increasing the path-loss compensation factor reduces the CP variation around the spatially averaged value.
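For readers who want to experiment with the quantity being studied, the following exploratory Monte Carlo sketch (not the paper's analytical derivation) estimates per-user conditional coverage probabilities over one fixed PPP realization, with nearest-BS association, fractional path-loss inversion transmit power r^(alpha*eps), Rayleigh fading and an activity factor p; the fraction of users whose conditional CP exceeds a target approximates one point of the meta distribution. All parameter values and the one-user-per-drop simplification are assumptions.

```python
# Exploratory Monte Carlo sketch of the meta distribution of uplink coverage.
import numpy as np

rng = np.random.default_rng(0)
ALPHA, EPS, P_ACT, SIR_TH = 4.0, 0.75, 0.5, 1.0   # path-loss exp., compensation, activity, threshold
LAM_BS, AREA = 1.0, 20.0                          # BS density and square side (arbitrary units)

# One fixed realization of the BS PPP, with uniformly dropped users and nearest-BS association.
n_bs = rng.poisson(LAM_BS * AREA ** 2)
bs = rng.uniform(0, AREA, (n_bs, 2))
users = rng.uniform(0, AREA, (n_bs, 2))
serv = np.argmin(np.linalg.norm(users[:, None] - bs[None], axis=2), axis=1)  # serving BS index
r_serv = np.linalg.norm(users - bs[serv], axis=1)
p_tx = r_serv ** (ALPHA * EPS)                    # fractional path-loss inversion power control

def conditional_cp(k, n_fading=500):
    """Coverage probability of user k's uplink at its serving BS, over fading and activity."""
    d = np.linalg.norm(users - bs[serv[k]], axis=1)   # distances of all users to k's serving BS
    cov = 0
    for _ in range(n_fading):
        h = rng.exponential(1.0, n_bs)                # Rayleigh fading power gains
        active = rng.random(n_bs) < P_ACT
        active[k] = True
        sig = h[k] * p_tx[k] * r_serv[k] ** (-ALPHA)
        interf = np.sum(h * p_tx * d ** (-ALPHA) * active) - sig
        cov += sig / max(interf, 1e-12) > SIR_TH
    return cov / n_fading

cps = np.array([conditional_cp(k) for k in range(min(n_bs, 200))])
print("Fraction of users with conditional CP >= 0.5:", np.mean(cps >= 0.5))
```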

  7. Improved Differential Evolution Algorithm for Wireless Sensor Network Coverage Optimization

    Directory of Open Access Journals (Sweden)

    Xing Xu

    2014-04-01

In order to improve the efficiency of ecological monitoring of Poyang Lake, an improved hybrid algorithm, combining differential evolution and particle swarm optimization, is proposed and applied to optimize the coverage problem of a wireless sensor network. The effects of the population size and the number of iterations on the coverage performance are then discussed and analyzed. Four kinds of statistical results on the coverage rate are obtained through extensive simulation experiments.
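For context, the objective typically optimized in such work is the area coverage rate of the deployment; a minimal sketch under the binary disc sensing model is given below. The grid resolution, field size and sensing radius are illustrative assumptions, not values from the paper.

```python
# Illustrative fitness function: area coverage rate of a sensor deployment,
# estimated on a discrete grid under the binary disc sensing model.
import numpy as np

def coverage_rate(sensor_xy, radius, field=(100.0, 100.0), grid_step=1.0):
    """Fraction of grid points covered by at least one sensor disc."""
    xs = np.arange(0, field[0] + grid_step, grid_step)
    ys = np.arange(0, field[1] + grid_step, grid_step)
    gx, gy = np.meshgrid(xs, ys)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    d2 = ((pts[:, None, :] - sensor_xy[None, :, :]) ** 2).sum(axis=2)
    covered = (d2 <= radius ** 2).any(axis=1)
    return covered.mean()

# Example: 30 randomly placed sensors with a 15 m sensing radius
rng = np.random.default_rng(1)
print(coverage_rate(rng.uniform(0, 100, (30, 2)), radius=15.0))
```

A hybrid DE/PSO optimizer would treat the flattened sensor coordinates as the decision vector and maximize this coverage rate.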

  8. Improved HIV testing coverage after scale-up of ... - Lusaka

    African Journals Online (AJOL)

    Improved HIV testing coverage after scale-up of antiretroviral therapy programs in urban Zambia: Evidence from serial hospital surveillance. ... Background: We evaluated changing HIV testing coverage and prevalence rates before and after expanding city-wide antiretroviral therapy (ART) programs in Lusaka, Zambia.

  9. K Coverage Probability of 5G Wireless Cognitive Radio Network under Shadow Fading Effects

    Directory of Open Access Journals (Sweden)

    Ankur S. Kang

    2016-09-01

Land mobile communication is burdened with typical propagation constraints due to the channel characteristics in radio systems. The propagation characteristics vary from place to place and, as the mobile unit moves, from time to time. Hence, the transmission path between transmitter and receiver varies from a simple direct LOS path to one that is severely obstructed by buildings, foliage and terrain. Multipath propagation and shadow fading effects affect the signal strength of an arbitrary transmitter-receiver pair through rapid fluctuations in the phase and amplitude of the signal, which also determine the average power over an area of tens or hundreds of meters. Shadowing introduces additional fluctuations, so the received local mean power varies around the area mean. The present work deals with the performance analysis of a fifth-generation wireless cognitive radio network on the basis of the signal- and interference-level-based k-coverage probability under shadow fading effects.
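As a pointer to the underlying calculation, the textbook single-link expression for coverage under log-normal shadowing is sketched below: the local-mean power exceeds the receiver threshold with probability Q((P_th - P_mean)/sigma) in dB units. This is only the basic building block, not the paper's k-coverage analysis for cognitive radio networks.

```python
# Minimal sketch (assumed notation): coverage probability of a single link
# under log-normal shadowing, P_cov = Q((P_th_dB - P_mean_dB) / sigma_dB).
from math import erfc, sqrt

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * erfc(x / sqrt(2.0))

def shadowing_coverage_probability(p_mean_dbm, p_threshold_dbm, sigma_db):
    """Coverage probability at a point with area-mean power p_mean_dbm [dBm],
    receiver threshold p_threshold_dbm [dBm], and shadowing std sigma_db [dB]."""
    return q_function((p_threshold_dbm - p_mean_dbm) / sigma_db)

# Example: area-mean -85 dBm, threshold -95 dBm, 8 dB shadowing -> ~0.89
print(shadowing_coverage_probability(-85.0, -95.0, 8.0))
```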

  10. Extending Coverage and Lifetime of K-coverage Wireless Sensor Networks Using Improved Harmony Search

    Directory of Open Access Journals (Sweden)

    Shohreh Ebrahimnezhad

    2011-07-01

K-coverage wireless sensor networks try to ensure that each hotspot region is covered by at least k sensors. Because the fundamental evaluation metrics of such networks are coverage and lifetime, an approach that extends both of them simultaneously is of great interest. In this article, it is assumed that two kinds of nodes are available: static and mobile. The proposed method first tries to balance energy among sensor nodes using the Improved Harmony Search (IHS) algorithm in a k-covered and connected wireless sensor network in order to obtain a sensor node deployment. The method also proposes a suitable place for a gateway node (sink) that collects data from all sensors. Second, in order to prolong the network lifetime, some of the high energy-consuming mobile nodes are moved to the closest positions of low energy-consuming ones, and vice versa, after a while. This increases the lifetime of the network while connectivity and k-coverage are preserved. Through computer simulations, experimental results verify that the proposed IHS-based algorithm finds better solutions than some related methods.

  11. Camera Network Coverage Improving by Particle Swarm Optimization

    NARCIS (Netherlands)

    Xu, Y.C.; Lei, B.; Hendriks, E.A.

    2011-01-01

    This paper studies how to improve the field of view (FOV) coverage of a camera network. We focus on a special but practical scenario where the cameras are randomly scattered in a wide area and each camera may adjust its orientation but cannot move in any direction. We propose a particle swarm

  12. [Strategies to improve influenza vaccination coverage in Primary Health Care].

    Science.gov (United States)

    Antón, F; Richart, M J; Serrano, S; Martínez, A M; Pruteanu, D F

    2016-04-01

Vaccination coverage reached in adults is insufficient, and there is a real need for new strategies. The aim was to compare strategies for improving influenza vaccination coverage in persons older than 64 years. New strategies were introduced in our health care centre during the 2013-2014 influenza vaccination campaign, which included vaccinating patients in homes for the aged as well as in the health care centre. A comparison was made of vaccination coverage over the last 4 years in 3 practices of our health care centre: P1, the general physician vaccinated patients older than 64 who came to the practice; P2, the general physician systematically insisted on vaccination in elderly patients, strongly advising them to book appointments; and P3, the general physician did not insist. These practices cared for 278 (P1), 320 (P2) and 294 (P3) patients older than 64 years. Overall/P1/P2/P3 coverages were, in 2010: 51.2/51.4/55/46.9% (P=NS), in 2011: 52.4/52.9/53.8/50.3% (P=NS), in 2012: 51.9/52.5/55.3/47.6% (P=NS), and in 2013: 63.5/79.1/59.7/52.7% (P=.000, P1 versus P2 and P3; P=NS between P2 and P3). Comparing the coverages in 2012-2013 within each practice: P1 (P=.000); P2 (P=.045); P3 (P=.018). In P2 and P3 all vaccinations were given by the nurses as previously scheduled. In P3, 55% of the vaccinations were given by the nurses, 24.1% by the GP, 9.7% rejected vaccination, and the remainder did not come to the practice during the vaccination period (October 2013-February 2014). The strategy of vaccinating in the homes for the aged improved the vaccination coverage by 5% in each practice. The strategy of "I've got you here, I jab you here" in P1 improved the vaccination coverage by 22%. Copyright © 2014 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España, S.L.U. All rights reserved.

  13. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  14. Coverage Improvement for Wireless Sensor Networks using Grid Quorum based Node Mobility

    DEFF Research Database (Denmark)

    Mathur, Prateek; Nielsen, Rasmus Hjorth; Prasad, Neeli R.

    2012-01-01

    Coverage of wireless sensor networks (WSNs) is an important quality of service (QoS) metric and often the desired coverage is not attainable at the initial deployment, but node mobility can be used to improve the coverage by relocating sensor nodes. Unconstrained node mobility is considered infea...

  15. Root coverage procedures improve patient aesthetics. A systematic review and Bayesian network meta-analysis.

    Science.gov (United States)

    Cairo, Francesco; Pagliaro, Umberto; Buti, Jacopo; Baccini, Michela; Graziani, Filippo; Tonelli, Paolo; Pagavino, Gabriella; Tonetti, Maurizio S

    2016-11-01

The aim of this study was to perform a systematic review (SR) of randomized controlled trials (RCTs) to explore whether periodontal plastic surgery procedures for the treatment of single and multiple gingival recessions (Rec) may improve aesthetics at patient and professional levels. In order to combine evidence from direct and indirect comparisons by different trials, a Bayesian network meta-analysis (BNM) was planned. A literature search on PubMed, the Cochrane libraries, EMBASE, and hand-searched journals until January 2016 was conducted to identify RCTs presenting aesthetic outcomes after root coverage using standardized evaluations at patient and professional level. A total of 16 RCTs were selected in the SR; three RCTs presenting professional aesthetic evaluation with the Root coverage Aesthetic Score (RES) and three showing final self-perception using the Visual Analogue Scale (VAS Est) could be included in a BNM model. Coronally Advanced Flap plus Connective Tissue Graft (CAF + CTG) and CAF + Acellular Dermal Matrix (ADM) and Autologous Fibroblasts (AF) were associated with the best RES outcomes (best probability = 24% and 64%, respectively), while CAF + CTG and CAF + CTG + Enamel Matrix Derivatives (EMD) obtained the highest values of the VAS Est score (best probability = 44% and 26%, respectively). Periodontal Plastic Surgery (PPS) techniques applying grafts underneath CAF with or without the addition of EMD are associated with improved aesthetics assessed by final patient perception and by RES as a professional evaluation system. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Improving detection probabilities for pests in stored grain.

    Science.gov (United States)

    Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant

    2010-12-01

    The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models. Copyright © 2010 Society of Chemical Industry.
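The core argument can be illustrated with a simplified mixture model (an assumption for illustration, not the authors' exact formulation): only a proportion theta of the grain is infested, insect counts within infested grain are Poisson, and detection requires at least one insect in at least one subsample.

```python
# Hedged illustration of detection probability under a clustered (mixture)
# infestation: proportion theta of grain infested, density lam insects/kg
# within infested grain, n subsamples of m kg each with Poisson counts.
from math import exp

def detection_probability(theta, lam, n_subsamples, subsample_kg):
    """P(detect >=1 insect) = 1 - (1 - theta * (1 - e^(-lam*m)))^n."""
    p_hit_one = theta * (1.0 - exp(-lam * subsample_kg))   # one subsample finds >= 1 insect
    return 1.0 - (1.0 - p_hit_one) ** n_subsamples

# Same total mass (3 kg) split into more, smaller subsamples raises detection:
for n in (1, 3, 10):
    print(n, round(detection_probability(theta=0.1, lam=2.0, n_subsamples=n,
                                          subsample_kg=3.0 / n), 3))
```

The example reproduces the qualitative finding above: at constant total sample mass, increasing the number of subsamples increases the probability of detection when the infestation is clustered.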

  17. Interventions for improving coverage of childhood immunisation in low- and middle-income countries.

    Science.gov (United States)

    Oyo-Ita, Angela; Wiysonge, Charles S; Oringanje, Chioma; Nwachukwu, Chukwuemeka E; Oduwole, Olabisi; Meremikwu, Martin M

    2016-07-10

, and extracted data in duplicate; resolving discrepancies by consensus. We then conducted random-effects meta-analyses and used GRADE to assess the certainty of evidence. Fourteen studies (10 cluster RCTs and four individual RCTs) met our inclusion criteria. These were conducted in Georgia (one study), Ghana (one study), Honduras (one study), India (two studies), Mali (one study), Mexico (one study), Nicaragua (one study), Nepal (one study), Pakistan (four studies), and Zimbabwe (one study). One study had an unclear risk of bias, and 13 had high risk of bias. The interventions evaluated in the studies included community-based health education (three studies), facility-based health education (three studies), household incentives (three studies), regular immunisation outreach sessions (one study), home visits (one study), supportive supervision (one study), information campaigns (one study), and integration of immunisation services with intermittent preventive treatment of malaria (one study). We found moderate-certainty evidence that health education at village meetings or at home probably improves coverage with three doses of diphtheria-tetanus-pertussis vaccines (DTP3: risk ratio (RR) 1.68, 95% confidence interval (CI) 1.09 to 2.59). We also found low-certainty evidence that facility-based health education plus redesigned vaccination reminder cards may improve DTP3 coverage (RR 1.50, 95% CI 1.21 to 1.87). Household monetary incentives may have little or no effect on full immunisation coverage (RR 1.05, 95% CI 0.90 to 1.23, low-certainty evidence). Regular immunisation outreach may improve full immunisation coverage (RR 3.09, 95% CI 1.69 to 5.67, low-certainty evidence), which may substantially improve if combined with household incentives (RR 6.66, 95% CI 3.93 to 11.28, low-certainty evidence). Home visits to identify non-vaccinated children and refer them to health clinics may improve uptake of three doses of oral polio vaccine (RR 1.22, 95% CI 1.07 to 1.39, low

  18. Interventions for improving coverage of childhood immunisation in low- and middle-income countries

    Science.gov (United States)

    Oyo-Ita, Angela; Wiysonge, Charles S; Oringanje, Chioma; Nwachukwu, Chukwuemeka E; Oduwole, Olabisi; Meremikwu, Martin M

    2016-01-01

    full texts of potentially eligible articles, assessed risk of bias, and extracted data in duplicate; resolving discrepancies by consensus. We then conducted random-effects meta-analyses and used GRADE to assess the certainty of evidence. Main results Fourteen studies (10 cluster RCTs and four individual RCTs) met our inclusion criteria. These were conducted in Georgia (one study), Ghana (one study), Honduras (one study), India (two studies), Mali (one study), Mexico (one study), Nicaragua (one study), Nepal (one study), Pakistan (four studies), and Zimbabwe (one study). One study had an unclear risk of bias, and 13 had high risk of bias. The interventions evaluated in the studies included community-based health education (three studies), facility-based health education (three studies), household incentives (three studies), regular immunisation outreach sessions (one study), home visits (one study), supportive supervision (one study), information campaigns (one study), and integration of immunisation services with intermittent preventive treatment of malaria (one study). We found moderate-certainty evidence that health education at village meetings or at home probably improves coverage with three doses of diphtheria-tetanus-pertussis vaccines (DTP3: risk ratio (RR) 1.68, 95% confidence interval (CI) 1.09 to 2.59). We also found low-certainty evidence that facility-based health education plus redesigned vaccination reminder cards may improve DTP3 coverage (RR 1.50, 95% CI 1.21 to 1.87). Household monetary incentives may have little or no effect on full immunisation coverage (RR 1.05, 95% CI 0.90 to 1.23, low-certainty evidence). Regular immunisation outreach may improve full immunisation coverage (RR 3.09, 95% CI 1.69 to 5.67, low-certainty evidence) which may substantially improve if combined with household incentives (RR 6.66, 95% CI 3.93 to 11.28, low-certainty evidence). Home visits to identify non-vaccinated children and refer them to health clinics may improve

  19. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies ... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown ... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity

  20. Can Plan Recommendations Improve the Coverage Decisions of Vulnerable Populations in Health Insurance Marketplaces?

    Science.gov (United States)

    Barnes, Andrew J; Hanoch, Yaniv; Rice, Thomas

    2016-01-01

    The Affordable Care Act's marketplaces present an important opportunity for expanding coverage but consumers face enormous challenges in navigating through enrollment and re-enrollment. We tested the effectiveness of a behaviorally informed policy tool--plan recommendations--in improving marketplace decisions. Data were gathered from a community sample of 656 lower-income, minority, rural residents of Virginia. We conducted an incentive-compatible, computer-based experiment using a hypothetical marketplace like the one consumers face in the federally-facilitated marketplaces, and examined their decision quality. Participants were randomly assigned to a control condition or three types of plan recommendations: social normative, physician, and government. For participants randomized to a plan recommendation condition, the plan that maximized expected earnings, and minimized total expected annual health care costs, was recommended. Primary data were gathered using an online choice experiment and questionnaire. Plan recommendations resulted in a 21 percentage point increase in the probability of choosing the earnings maximizing plan, after controlling for participant characteristics. Two conditions, government or providers recommending the lowest cost plan, resulted in plan choices that lowered annual costs compared to marketplaces where no recommendations were made. As millions of adults grapple with choosing plans in marketplaces and whether to switch plans during open enrollment, it is time to consider marketplace redesigns and leverage insights from the behavioral sciences to facilitate consumers' decisions.

  1. Can Plan Recommendations Improve the Coverage Decisions of Vulnerable Populations in Health Insurance Marketplaces?

    Directory of Open Access Journals (Sweden)

    Andrew J Barnes

The Affordable Care Act's marketplaces present an important opportunity for expanding coverage but consumers face enormous challenges in navigating through enrollment and re-enrollment. We tested the effectiveness of a behaviorally informed policy tool--plan recommendations--in improving marketplace decisions. Data were gathered from a community sample of 656 lower-income, minority, rural residents of Virginia. We conducted an incentive-compatible, computer-based experiment using a hypothetical marketplace like the one consumers face in the federally-facilitated marketplaces, and examined their decision quality. Participants were randomly assigned to a control condition or three types of plan recommendations: social normative, physician, and government. For participants randomized to a plan recommendation condition, the plan that maximized expected earnings, and minimized total expected annual health care costs, was recommended. Primary data were gathered using an online choice experiment and questionnaire. Plan recommendations resulted in a 21 percentage point increase in the probability of choosing the earnings maximizing plan, after controlling for participant characteristics. Two conditions, government or providers recommending the lowest cost plan, resulted in plan choices that lowered annual costs compared to marketplaces where no recommendations were made. As millions of adults grapple with choosing plans in marketplaces and whether to switch plans during open enrollment, it is time to consider marketplace redesigns and leverage insights from the behavioral sciences to facilitate consumers' decisions.

  2. Improving the Methods for Accounting the Coverages of Payments to Employees

    Directory of Open Access Journals (Sweden)

    Zhurakovska Iryna V.

    2017-03-01

The article is aimed at exploring the theoretical and practical problems of accounting for the coverages of payments to employees and, on this basis, developing ways of addressing them. An analysis of international and national accounting standards, the practices of domestic enterprises, and the scientific literature has helped to identify the problematic issues in accounting for the coverages of payments to employees, including: lack of disclosure in accounting and reporting, absence of adequate documentary support, complexity of the calculation methods, etc. The authors suggest ways to improve the accounting of payments to employees: documenting the coverages through the development of a statement of accrued coverages, simplifying the calculation of payments to employees together with the related reflection in analytical accounting, disclosure in the accounting policy, and so forth. Such decisions would improve the accounting of the coverages of payments to employees, increase the frequency with which such coverages are applied in enterprises, and improve their disclosure in the financial statements.

  3. ArcticDEM Year 3; Improving Coverage, Repetition and Resolution

    Science.gov (United States)

    Morin, P. J.; Porter, C. C.; Cloutier, M.; Howat, I.; Noh, M. J.; Willis, M. J.; Candela, S. G.; Bauer, G.; Kramer, W.; Bates, B.; Williamson, C.

    2017-12-01

Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. The ArcticDEM project is using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency, petascale computing, and open source photogrammetry software to produce a time-tagged 2m posting elevation model and a 5m posting mosaic of the entire Arctic region. As ArcticDEM enters its third year, the region has gone from having some of the sparsest and poorest elevation data to some of the most precise and complete data of any region on the globe. To date, we have produced and released over 80,000,000 km2 of elevation data as 57,000 time-stamped DEMs at 2m posting. The Arctic, on average, is covered four times, though there are hotspots with more than 100 DEMs. In addition, the version 1 release includes a 5m posting mosaic covering the entire 20,000,000 km2 region. All products are publicly available through arcticdem.org, ESRI web services, and a web viewer. The final year of the project will consist of a complete refiltering of clouds/water and re-mosaicing of all elevation data. Since inception of the project, post-processing techniques have improved significantly, resulting in fewer voids, better registration, sharper coastlines, and fewer inaccuracies due to clouds. All ArcticDEM data will be released in 2018. Data, documentation, web services and the web viewer are available at arcticdem.org

  4. Accuracy and impact of spatial aids based upon satellite enumeration to improve indoor residual spraying spatial coverage.

    Science.gov (United States)

    Bridges, Daniel J; Pollard, Derek; Winters, Anna M; Winters, Benjamin; Sikaala, Chadwick; Renn, Silvia; Larsen, David A

    2018-02-23

    Indoor residual spraying (IRS) is a key tool in the fight to control, eliminate and ultimately eradicate malaria. IRS protection is based on a communal effect such that an individual's protection primarily relies on the community-level coverage of IRS with limited protection being provided by household-level coverage. To ensure a communal effect is achieved through IRS, achieving high and uniform community-level coverage should be the ultimate priority of an IRS campaign. Ensuring high community-level coverage of IRS in malaria-endemic areas is challenging given the lack of information available about both the location and number of households needing IRS in any given area. A process termed 'mSpray' has been developed and implemented and involves use of satellite imagery for enumeration for planning IRS and a mobile application to guide IRS implementation. This study assessed (1) the accuracy of the satellite enumeration and (2) how various degrees of spatial aid provided through the mSpray process affected community-level IRS coverage during the 2015 spray campaign in Zambia. A 2-stage sampling process was applied to assess accuracy of satellite enumeration to determine number and location of sprayable structures. Results indicated an overall sensitivity of 94% for satellite enumeration compared to finding structures on the ground. After adjusting for structure size, roof, and wall type, households in Nchelenge District where all types of satellite-based spatial aids (paper-based maps plus use of the mobile mSpray application) were used were more likely to have received IRS than Kasama district where maps used were not based on satellite enumeration. The probability of a household being sprayed in Nchelenge district where tablet-based maps were used, did not differ statistically from that of a household in Samfya District, where detailed paper-based spatial aids based on satellite enumeration were provided. IRS coverage from the 2015 spray season benefited from

  5. Potential Improvements to VLBA UV-Coverages by the Addition of a 32-m Peruvian Antenna

    Science.gov (United States)

    Horiuchi, S.; Murphy, D. W.; Ishitsuka, J. K.; Ishitsuka, M.

    2005-12-01

A plan is currently being developed to convert a 32-m telecommunications antenna in the Peruvian Andes into a radio astronomy facility. Significant improvements to stand-alone VLBA UV-coverages can be obtained with the addition of this southern hemisphere telescope to VLBA observations.

  6. Universal health coverage in Latin American countries: how to improve solidarity-based schemes.

    Science.gov (United States)

    Titelman, Daniel; Cetrángolo, Oscar; Acosta, Olga Lucía

    2015-04-04

    In this Health Policy we examine the association between the financing structure of health systems and universal health coverage. Latin American health systems encompass a wide range of financial sources, which translate into different solidarity-based schemes that combine contributory (payroll taxes) and non-contributory (general taxes) sources of financing. To move towards universal health coverage, solidarity-based schemes must heavily rely on countries' capacity to increase public expenditure in health. Improvement of solidarity-based schemes will need the expansion of mandatory universal insurance systems and strengthening of the public sector including increased fiscal expenditure. These actions demand a new model to integrate different sources of health-sector financing, including general tax revenue, social security contributions, and private expenditure. The extent of integration achieved among these sources will be the main determinant of solidarity and universal health coverage. The basic challenges for improvement of universal health coverage are not only to spend more on health, but also to reduce the proportion of out-of-pocket spending, which will need increased fiscal resources. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. A cross sectional study at subcentre level reflecting need for improving coverage of maternal health services

    Directory of Open Access Journals (Sweden)

    Geetu Singh

    2015-03-01

Background: A health subcentre is the most peripheral unit and first point of contact between the primary health care system and the community. It is imperative to gain insight into the functioning of subcentres, which were established with the objective of minimizing the hardships of the rural people. Objective: To study the coverage of maternal services at subcentres in Jhansi district. Material & Methods: A cross-sectional study was conducted on a sample of 20 subcentres in Jhansi district from June 2012 to July 2013. Various records of the health workers were examined for maternal health service coverage and noted down on a pre-designed questionnaire. Results: The present study showed that 72.1% of currently married pregnant women aged 15-49 years were registered for antenatal care (ANC). Around 50% of women received an antenatal check-up in the first trimester at the subcentres. Only 29% of women received 3 or more antenatal visits. A mere 3.6% of women received iron-folic acid (IFA) for 100 days or more, and only 3% of women received a full antenatal check-up. Family planning coverage for female sterilization was 60%, but for male sterilization it was just 0.5%. Conclusion: Greater emphasis needs to be given to better coverage of all maternal services. There should be provision for improving the competence, confidence and motivation of health workers to ensure the full range of maternal care activities specified under the NRHM programme.

  8. Upregulation of transmitter release probability improves a conversion of synaptic analogue signals into neuronal digital spikes

    Science.gov (United States)

    2012-01-01

Action potentials at neurons and graded signals at synapses are the primary codes in the brain. In terms of their functional interaction, previous studies focused on the influence of presynaptic spike patterns on synaptic activities. How synapse dynamics quantitatively regulates the encoding of postsynaptic digital spikes remains unclear. We investigated this question at unitary glutamatergic synapses on cortical GABAergic neurons, in particular the quantitative influence of release probability on synapse dynamics and neuronal encoding. Glutamate release probability and synaptic strength are proportionally upregulated by presynaptic sequential spikes. The upregulation of release probability and the efficiency of probability-driven synaptic facilitation are strengthened by elevating presynaptic spike frequency and Ca2+. The upregulation of release probability improves spike capacity and timing precision at the postsynaptic neuron. These results suggest that the upregulation of presynaptic glutamate release facilitates a conversion of synaptic analogue signals into digital spikes in postsynaptic neurons, i.e., a functional compatibility between presynaptic and postsynaptic partners. PMID:22852823

  9. Change in challenging times: a plan for extending and improving health coverage.

    Science.gov (United States)

    Lambrew, Jeanne M; Podesta, John D; Shaw, Teresa L

    2005-01-01

    Some speculate that Americans are neither politically capable of nor morally committed to solving the health system problems. We disagree. We propose a plan that insures all and improves the value and cost-effectiveness of health care by knitting together employer-sponsored insurance and Medicaid; promoting prevention, research, and information technology; and financing its investments through a dedicated value-added tax. By prioritizing practicality, fairness, and responsibility, the plan aims to avoid ideological battles and prevent fear of major change. By emphasizing the moral imperative for change, especially relative to other options on the policy agenda, it aims to create momentum for expanding and improving health coverage for all.

  10. Improved candidate generation and coverage analysis methods for design optimization of symmetric multi-satellite constellations

    Science.gov (United States)

    Matossian, Mark G.

    1997-01-01

Much attention in recent years has focused on commercial telecommunications ventures involving constellations of spacecraft in low and medium Earth orbit. These projects often require investments on the order of billions of dollars (US$) for development and operations, but surprisingly little work has been published on constellation design optimization for coverage analysis, traffic simulation and launch sequencing for constellation build-up strategies. This paper addresses the two most critical aspects of constellation orbital design: efficient constellation candidate generation and coverage analysis. Inefficiencies and flaws in the current standard algorithm for constellation modeling are identified, and a corrected and improved algorithm is presented. In the 1970s, John Walker and G. V. Mozhaev developed innovative strategies for continuous global coverage using symmetric non-geosynchronous constellations (sometimes referred to as rosette, or Walker, constellations). In 1980, the late Arthur Ballard extended and generalized the work of Walker into a detailed algorithm for the NAVSTAR/GPS program, which deployed a 24-satellite symmetric constellation. Ballard's important contribution was published in his "Rosette Constellations of Earth Satellites."

  11. Localized probability of improvement for kriging based multi-objective optimization

    Science.gov (United States)

    Li, Yinjiang; Xiao, Song; Barba, Paolo Di; Rotaru, Mihai; Sykulski, Jan K.

    2017-12-01

    The paper introduces a new approach to kriging based multi-objective optimization by utilizing a local probability of improvement as the infill sampling criterion and the nearest neighbor check to ensure diversification and uniform distribution of Pareto fronts. The proposed method is computationally fast and linearly scalable to higher dimensions.
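For reference, the standard (global) probability-of-improvement criterion that the localized variant builds on is sketched below for minimization; the localization and nearest-neighbour check that are the paper's contribution are omitted, and the function names are illustrative.

```python
# Standard probability-of-improvement (PoI) infill criterion for a kriging
# model, shown for minimization. mu and sigma are the kriging predictive
# mean and standard deviation at a candidate point.
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal CDF Phi(z)."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def probability_of_improvement(mu, sigma, y_best, xi=0.0):
    """PoI = Phi((y_best - xi - mu) / sigma); xi is an optional improvement margin."""
    if sigma <= 0.0:
        return 0.0
    return normal_cdf((y_best - xi - mu) / sigma)

# Example: current best objective 1.0; candidate predicted at 0.8 +/- 0.3 -> ~0.75
print(probability_of_improvement(mu=0.8, sigma=0.3, y_best=1.0))
```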

  12. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Directory of Open Access Journals (Sweden)

    Michael F Sloma

    2017-11-01

Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  13. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2017-11-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  14. Improved detection probability of low level light and infrared image fusion system

    Science.gov (United States)

    Luo, Yuxiang; Fu, Rongguo; Zhang, Junju; Wang, Wencong; Chang, Benkang

    2018-02-01

Low level light (LLL) images contain rich information on environment details but are easily affected by the weather. In the case of smoke, rain, cloud or fog, much target information is lost. Infrared images, which are formed from the radiation produced by objects themselves, can "actively" obtain target information in the scene. However, their contrast and resolution are poor, their ability to capture target details is very limited, and the imaging mode does not conform to human visual habits. The fusion of LLL and infrared images can make up for the deficiencies of each sensor while exploiting the advantages of each single sensor. First, we present the hardware design of the fusion circuit. Then, through calculation of the recognition probability of the target (one person) and the background image (trees), we find that the detection probability of trees in the LLL image is higher than that in the infrared image, while the detection probability of the person in the infrared image is clearly higher than that in the LLL image. The detection probability of the fusion image for both the person and the trees is higher than that of either single detector. Therefore, image fusion can significantly increase recognition probability and improve detection efficiency.

  15. Local mandate improves equity of paid sick leave coverage: Seattle’s experience

    Directory of Open Access Journals (Sweden)

    Jennifer L. Romich

    2017-01-01

Background Paid sick leave allows workers to take time off work for personal or family health needs, improving health and potentially limiting infectious diseases. The U.S. has no national sick leave mandate, and many American workers - particularly those at lower income levels - have no right to paid time off for their own or family members’ health needs. This article reports on outcomes of a local mandate, the City of Seattle Paid Sick and Safe Time Ordinance, which requires certain employers to provide paid sick leave to eligible workers. Methods Survey collectors contacted a stratified random sample of Seattle employers before the Ordinance went into effect and one year later. Pre- and post-analysis draws on responses to survey items by 345 employers who were subject to the paid sick leave mandate. Results Awareness of the policy and provision of paid leave grew significantly over the year after the Ordinance was enacted. More employers offered leave to full-time workers (80.8 to 93.9%, p < .001) and part-time workers (47.1 to 66.7%, p < .001), with particularly large increases in the hospitality sector, which includes food workers (coverage of any hospitality employee: 27.5 to 85.0%, p < .001). Conclusions Absent a federal policy, local paid sick time mandates can increase paid sick leave coverage, an important social determinant of health.

  16. SU-E-T-471: Improvement of Gamma Knife Treatment Planning Through Tumor Control Probability for Metastatic Brain Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Z [East Carolina University, Greenville, NC (United States); Feng, Y [East Carolina Univ, Rockville, MD (United States); Lo, S [Case Western Reserve University, Cleveland, OH (United States); Grecula, J [Ohio State University, Columbus, OH (United States); Mayr, N; Yuh, W [University of Washington, Seattle, WA (United States)

    2015-06-15

Purpose: The dose–volume histogram (DVH) is commonly accepted as a tool for treatment plan evaluation. However, spatial information is lacking in the DVH. As a supplement to the DVH in three-dimensional treatment planning, the differential DVH (DDVH) provides the spatial variation, size and magnitude of the different dose regions within a region of interest, which can be incorporated into a tumor control probability model. This study was to provide a method for evaluating and improving Gamma Knife treatment planning. Methods: 10 patients with brain metastases from different primary tumors including melanoma (#1, #4, #5, #10), breast cancer (#2), prostate cancer (#3) and lung cancer (#6–9) were analyzed. By using Leksell GammaPlan software, two plans were prepared for each patient. Special attention was given to the DDVHs that were different for different plans and were used for a comparison between two plans. The dose distribution inside the target and the tumor control probability (TCP) based on the DDVH were calculated, where cell density and radiobiological parameters were adopted from the literature. The plans were compared based on DVH, DDVH and TCP. Results: Using the DVH, the coverage and selectivity were the same between plans for the 10 patients. DDVHs were different between the two plans for each patient. The paired t-test showed no significant difference in TCP between the two plans. For brain metastases from melanoma (#1, #4–5), breast cancer (#2) and lung cancer (#6–8), the difference in TCP was less than 5%. But the difference in TCP was about 6.5% for patient #3 with the metastasis from prostate cancer, and 10.1% and 178.7% for two patients (#9–10) with metastases from lung cancer. Conclusion: Although the DVH provides average dose–volume information, the DDVH provides differential dose–volume information with respect to different regions inside the tumor. TCP provides radiobiological information and adds additional information on improving treatment planning as well as adaptive

  17. SU-E-T-471: Improvement of Gamma Knife Treatment Planning Through Tumor Control Probability for Metastatic Brain Tumors

    International Nuclear Information System (INIS)

    Huang, Z; Feng, Y; Lo, S; Grecula, J; Mayr, N; Yuh, W

    2015-01-01

Purpose: The dose–volume histogram (DVH) is commonly accepted as a tool for treatment plan evaluation. However, spatial information is lacking in the DVH. As a supplement to the DVH in three-dimensional treatment planning, the differential DVH (DDVH) provides the spatial variation, size and magnitude of the different dose regions within a region of interest, which can be incorporated into a tumor control probability model. This study was to provide a method for evaluating and improving Gamma Knife treatment planning. Methods: 10 patients with brain metastases from different primary tumors including melanoma (#1, #4, #5, #10), breast cancer (#2), prostate cancer (#3) and lung cancer (#6–9) were analyzed. By using Leksell GammaPlan software, two plans were prepared for each patient. Special attention was given to the DDVHs that were different for different plans and were used for a comparison between two plans. The dose distribution inside the target and the tumor control probability (TCP) based on the DDVH were calculated, where cell density and radiobiological parameters were adopted from the literature. The plans were compared based on DVH, DDVH and TCP. Results: Using the DVH, the coverage and selectivity were the same between plans for the 10 patients. DDVHs were different between the two plans for each patient. The paired t-test showed no significant difference in TCP between the two plans. For brain metastases from melanoma (#1, #4–5), breast cancer (#2) and lung cancer (#6–8), the difference in TCP was less than 5%. But the difference in TCP was about 6.5% for patient #3 with the metastasis from prostate cancer, and 10.1% and 178.7% for two patients (#9–10) with metastases from lung cancer. Conclusion: Although the DVH provides average dose–volume information, the DDVH provides differential dose–volume information with respect to different regions inside the tumor. TCP provides radiobiological information and adds additional information on improving treatment planning as well as adaptive
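A generic Poisson TCP computed from a differential DVH, of the kind described above, can be sketched as follows; the clonogen density and linear-quadratic parameters are placeholders rather than the values used in the study.

```python
# Illustrative Poisson TCP from a differential DVH (DDVH) for a single-fraction
# plan; rho, alpha and beta are placeholder radiobiological parameters.
from math import exp

def poisson_tcp(ddvh, rho=1e7, alpha=0.35, beta=0.035):
    """ddvh: list of (dose_Gy, volume_cm3) bins.
    Returns TCP = prod_i exp(-rho * v_i * SF(D_i)) with LQ survival SF."""
    tcp = 1.0
    for dose, vol in ddvh:
        surviving_fraction = exp(-alpha * dose - beta * dose ** 2)
        tcp *= exp(-rho * vol * surviving_fraction)
    return tcp

# Example DDVH: 0.4 cm3 at 18 Gy, 0.5 cm3 at 20 Gy, 0.1 cm3 at 24 Gy
print(poisson_tcp([(18.0, 0.4), (20.0, 0.5), (24.0, 0.1)]))
```

Because each dose bin contributes multiplicatively, two plans with identical DVH coverage but different DDVHs can yield different TCP values, which is the point the abstract makes.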

  18. Improving N-Glycan Coverage using HPLC-MS with Electrospray Ionization at Subambient Pressure

    Energy Technology Data Exchange (ETDEWEB)

    Marginean, Ioan; Kronewitter, Scott R.; Moore, Ronald J.; Slysz, Gordon W.; Monroe, Matthew E.; Anderson, Gordon A.; Tang, Keqi; Smith, Richard D.

    2012-10-01

Human serum glycan profiling with mass spectrometry (MS) has been employed to study several disease conditions and is demonstrating promise for, e.g., clinical biomarker discovery. However, the poor glycan ionization efficiency and the large dynamic range of glycan concentrations in human sera hinder comprehensive profiling. In particular, large glycans are problematic because they are present at low concentrations and prone to fragmentation. Here we show that sub-ambient pressure ionization with nanoelectrospray (SPIN)-MS can expand the serum glycome profile when compared with conventional atmospheric pressure electrospray ionization (ESI)-MS with a heated capillary inlet. Notably, the ions generated by the SPIN interface were observed at higher charge states for 50% of the annotated glycans. Out of a total of 130 detected glycans, 34 were only detected with the SPIN-MS, resulting in improved coverage of glycan families as well as of glycans with larger numbers of labile monosaccharides.

  19. Crossed-Slot Cavity-Backed Antenna with Improved Hemispherical Coverage

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Breinbjerg, Olav; Østergaard, Allan

    2005-01-01

The paper presents the results of an investigation of the crossed-slot cavity-backed antenna with complementary crossed electric dipoles added to compensate for the degradation of the circularly polarized (CP) radiation pattern near the horizon. The dependences of the radiation characteristics of the modified crossed-slot cavity-backed antenna on the length, width and height of the crossed electric dipoles are shown. Effects of a finite-size ground plane are taken into account by means of the full-wave electromagnetic analysis software utilized in the parametric investigations. Simulated and measured results for a selected antenna configuration prove that properly adjusted crossed electric dipoles are able to improve the coverage and CP polarization characteristics of the crossed-slot cavity-backed antenna.

  20. Improving equity in health care financing in China during the progression towards Universal Health Coverage.

    Science.gov (United States)

    Chen, Mingsheng; Palmer, Andrew J; Si, Lei

    2017-12-29

China is reforming the way it finances health care as it moves towards Universal Health Coverage (UHC) after the failure of market-oriented mechanisms for health care. Improving financing equity is a major policy goal of the health care system during the progression towards universal coverage. We used progressivity analysis and dominance tests to evaluate the financing channels of general taxation, public health insurance, and out-of-pocket (OOP) payments. In 2012 a survey of 8854 individuals in 3008 households recorded the socioeconomic and demographic status, and health care payments, of those households. The overall Kakwani index (KI) of China's health care financing system is 0.0444. For general taxation the KI was -0.0241 (95% confidence interval (CI): -0.0315 to -0.0166). The indices for the public health insurance schemes (Urban Employee Basic Medical Insurance, Urban Resident's Basic Medical Insurance, New Rural Cooperative Medical Scheme) were, respectively, 0.1301 (95% CI: 0.1008 to 0.1594), -0.1737 (95% CI: -0.2166 to -0.1308), and -0.5598 (95% CI: -0.5830 to -0.5365); and for OOP payments the KI was 0.0896 (95% CI: 0.0345 to 0.1447). OOP payments are still the dominant part of China's health care financing system, and the system as a whole is not equitable. Reducing the proportion of indirect taxes would considerably improve health care financing equity. The flat-rate contribution mechanism is not recommended for use in public health insurance schemes, and more attention should be given to optimizing benefit packages during China's progression towards UHC.
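For readers unfamiliar with the index, the Kakwani index reported above is the concentration index of health payments (with households ranked by ability to pay) minus the Gini coefficient of ability to pay; a minimal unweighted sketch using the standard covariance formula is shown below. Function names and the toy data are illustrative, not the survey data used in the study.

```python
# Minimal sketch of the Kakwani index: KI = C_H - G_I, the concentration index
# of health payments (ranked by income) minus the Gini of income. Survey
# weights and household equivalization are ignored here.
import numpy as np

def concentration_index(payments, income):
    """Concentration index of `payments` with units ranked by `income`."""
    order = np.argsort(income, kind="stable")
    y = np.asarray(payments, float)[order]
    n = len(y)
    ranks = (np.arange(1, n + 1) - 0.5) / n            # fractional ranks
    return 2.0 * np.cov(y, ranks, bias=True)[0, 1] / y.mean()

def kakwani_index(payments, income):
    # Concentration index of income ranked by itself is the Gini coefficient.
    return concentration_index(payments, income) - concentration_index(income, income)

# Tiny example: payments rising more slowly than income -> regressive (KI < 0)
income   = np.array([100, 200, 400, 800, 1600], float)
payments = np.array([ 20,  30,  40,  50,   60], float)
print(round(kakwani_index(payments, income), 3))
```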

  1. Improved method for estimating particle scattering probabilities to finite detectors for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Mickael, M.; Gardner, R.P.; Verghese, K.

    1988-01-01

    An improved method for calculating the total probability of particle scattering within the solid angle subtended by finite detectors is developed, presented, and tested. The limiting polar and azimuthal angles subtended by the detector are measured from the direction that most simplifies their calculation rather than from the incident particle direction. A transformation of the particle scattering probability distribution function (pdf) is made to match the transformation of the direction from which the limiting angles are measured. The particle scattering probability to the detector is estimated by evaluating the integral of the transformed pdf over the range of the limiting angles measured from the preferred direction. A general formula for transforming the particle scattering pdf is derived from basic principles and applied to four important scattering pdf's; namely, isotropic scattering in the Lab system, isotropic neutron scattering in the center-of-mass system, thermal neutron scattering by the free gas model, and gamma-ray Klein-Nishina scattering. Some approximations have been made to these pdf's to enable analytical evaluations of the final integrals. These approximations are shown to be valid over a wide range of energies and for most elements. The particle scattering probability to spherical, planar circular, and right circular cylindrical detectors has been calculated using the new and previously reported direct approach. Results indicate that the new approach is valid and is computationally faster by orders of magnitude
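To make the geometry concrete, the simplest special case (isotropic scattering in the lab system toward an on-axis circular detector) is sketched below: measuring the polar angle from the direction to the detector centre reduces the scattering probability to a fractional solid angle. The paper's method covers additional scattering pdfs and detector shapes; the function below is only an illustrative assumption.

```python
# Hedged illustration: isotropic lab-system scattering into an on-axis disc.
# With the polar angle measured from the direction toward the detector centre,
# P = (1 - cos(theta_max)) / 2, where theta_max = atan(R / d).
from math import atan, cos

def isotropic_scatter_probability_to_disc(detector_radius, distance):
    """Probability that an isotropically scattered particle heads into a
    circular detector of radius `detector_radius` centred at `distance`
    along the chosen reference direction (same length units)."""
    theta_max = atan(detector_radius / distance)
    return 0.5 * (1.0 - cos(theta_max))

# Example: 5 cm radius detector, 50 cm away -> ~0.25% of scatters
print(isotropic_scatter_probability_to_disc(5.0, 50.0))
```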

  2. Strategies to improve treatment coverage in community-based public health programs: A systematic review of the literature.

    Directory of Open Access Journals (Sweden)

    Katrina V Deardorff

    2018-02-01

    Full Text Available Community-based public health campaigns, such as those used in mass deworming, vitamin A supplementation and child immunization programs, provide key healthcare interventions to targeted populations at scale. However, these programs often fall short of established coverage targets. The purpose of this systematic review was to evaluate the impact of strategies used to increase treatment coverage in community-based public health campaigns. We systematically searched CAB Direct, Embase, and PubMed archives for studies utilizing specific interventions to increase coverage of community-based distribution of drugs, vaccines, or other public health services. We identified 5,637 articles, from which 79 full texts were evaluated according to pre-defined inclusion and exclusion criteria. Twenty-eight articles met inclusion criteria and data were abstracted regarding strategy-specific changes in coverage from these sources. Strategies used to increase coverage included community-directed treatment (n = 6, pooled percent change in coverage: +26.2%), distributor incentives (n = 2, +25.3%), distribution along kinship networks (n = 1, +24.5%), intensified information, education, and communication activities (n = 8, +21.6%), fixed-point delivery (n = 1, +21.4%), door-to-door delivery (n = 1, +14.0%), integrated service distribution (n = 9, +12.7%), conversion from school- to community-based delivery (n = 3, +11.9%), and management by a non-governmental organization (n = 1, +5.8%). Strategies that target improving community member ownership of distribution appear to have a large impact on increasing treatment coverage. However, all strategies used to increase coverage successfully did so. These results may be useful to National Ministries, programs, and implementing partners in optimizing treatment coverage in community-based public health programs.

  3. Can coverage of schistosomiasis and soil transmitted helminthiasis control programmes targeting school-aged children be improved? New approaches.

    Science.gov (United States)

    Massa, K; Olsen, A; Sheshe, A; Ntakamulenga, R; Ndawi, B; Magnussen, P

    2009-11-01

    Control programmes generally use a school-based strategy of mass drug administration to reduce morbidity of schistosomiasis and soil-transmitted helminthiasis (STH) in school-aged populations. The success of school-based programmes depends on treatment coverage. The community-directed treatment (ComDT) approach has been implemented in the control of onchocerciasis and lymphatic filariasis in Africa and improves treatment coverage. This study compared the treatment coverage between the ComDT approach and the school-based treatment approach, where non-enrolled school-aged children were invited for treatment, in the control of schistosomiasis and STH among enrolled and non-enrolled school-aged children. Coverage during the first treatment round among enrolled children was similar for the two approaches (ComDT: 80.3% versus school: 82.1%, P=0.072). However, for the non-enrolled children the ComDT approach achieved a significantly higher coverage than the school-based approach (80.0 versus 59.2%, P<0.001). Similar treatment coverage levels were attained at the second treatment round. Again, equal levels of treatment coverage were found between the two approaches for the enrolled school-aged children, while the ComDT approach achieved a significantly higher coverage in the non-enrolled children. The results of this study showed that the ComDT approach can obtain significantly higher treatment coverage among the non-enrolled school-aged children compared to the school-based treatment approach for the control of schistosomiasis and STH.

  4. Improving polio vaccination coverage in Nigeria through the use of geographic information system technology.

    Science.gov (United States)

    Barau, Inuwa; Zubairu, Mahmud; Mwanza, Michael N; Seaman, Vincent Y

    2014-11-01

    Historically, microplanning for polio vaccination campaigns in Nigeria relied on inaccurate and incomplete hand-drawn maps, resulting in the exclusion of entire settlements and missed children. The goal of this work was to create accurate, coordinate-based maps for 8 polio-endemic states in northern Nigeria to improve microplanning and support tracking of vaccination teams, thereby enhancing coverage, supervision, and accountability. Settlement features were identified in the target states, using high-resolution satellite imagery. Field teams collected names and geocoordinates for each settlement feature, with the help of local guides. Global positioning system (GPS) tracking of vaccination teams was conducted in selected areas and daily feedback provided to supervisors. Geographic information system (GIS)-based maps were created for 2238 wards in the 8 target states. The resulting microplans included all settlements and more-efficient team assignments, owing to the improved spatial reference. GPS tracking was conducted in 111 high-risk local government areas, resulting in improved team performance and the identification of missed/poorly covered settlements. Accurate and complete maps are a necessary part of an effective polio microplan, and tracking vaccinators gives supervisors a tool to ensure that all settlements are visited. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Alternative Enzymes Lead to Improvements in Sequence Coverage and PTM Analysis

    Science.gov (United States)

    Hooper, Kyle; Rosenblatt, Michael; Urh, Marjeta; Saveliev, Sergei; Hosfield, Chris; Kobs, Gary; Ford, Michael; Jones, Richard; Amunugama, Ravi; Allen, David; Brazas, Robert

    2013-01-01

    The profiling of proteins using biological mass spectrometry (bottom-up proteomics) most commonly requires trypsin. Trypsin is advantageous in that it produces peptides of optimal charge and size. However, for applications in which the proteins under investigation are part of a complex mixture or not isolated at high levels (i.e. low ng from an immunoprecipitation), sequence coverage is rarely complete. In addition, we have found that in several cases, like phosphorylation, acetylation, and methylation, alternative proteases are required to prepare peptides suitable for MS detection. This poster will provide specific examples which demonstrate this observation. For example, the application of a combined Trypsin/Lys-C mixture reduces the number of missed cleavages by more than 3-fold, producing samples with lower CVs (for biological replicates). The mixture is also well-suited for the complete proteolysis of hydrophobic, compact proteins. The addition of chymotrypsin and elastase has been found to be useful for identifying phosphorylation sites on proteins, especially on sequences where the site of phosphorylation inhibits trypsin (i.e. proximal to K or R). Many epigenetic applications have focused on histone modifications, like lysine acetylation and arginine methylation. Alternative proteases like Asp-N, Glu-C, and chymotrypsin have been especially useful given the fact that the modified K and R residues are resistant to C-terminal cleavage by trypsin. Finally, in the case of serum profiling, the addition of the endoglycosidase PNGase F has been found to improve sequence coverage due to the removal of N-linked glycans.

  6. Enhanced Positioning Algorithm of ARPS for Improving Accuracy and Expanding Service Coverage

    Directory of Open Access Journals (Sweden)

    Kyuman Lee

    2016-08-01

    Full Text Available The airborne relay-based positioning system (ARPS), which employs the relaying of navigation signals, was proposed as an alternative positioning system. However, the ARPS has limitations, such as relatively large vertical error and service restrictions, because firstly, the user position is estimated based on airborne relays that are located in one direction, and secondly, the positioning is processed using only relayed navigation signals. In this paper, we propose an enhanced positioning algorithm to improve the performance of the ARPS. The main idea of the enhanced algorithm is the adaptable use of either virtual or direct measurements of reference stations in the calculation process based on the structural features of the ARPS. Unlike the existing two-step algorithm for airborne relay and user positioning, the enhanced algorithm is divided into two cases based on whether the required number of navigation signals for user positioning is met. In the first case, where the number of signals is greater than four, the user first estimates the positions of the airborne relays and its own initial position. Then, the user position is re-estimated by integrating a virtual measurement of a reference station that is calculated using the initial estimated user position and known reference positions. To prevent performance degradation, the re-estimation is performed after determining its requirement through comparing the expected position errors. If the navigation signals are insufficient, such as when the user is outside of airborne relay coverage, the user position is estimated by additionally using direct signal measurements of the reference stations in place of absent relayed signals. The simulation results demonstrate that a higher accuracy level can be achieved because the user position is estimated based on the measurements of airborne relays and a ground station. Furthermore, the service coverage is expanded by using direct measurements of reference stations.

  7. Improved Membership Probability for Moving Groups: Bayesian and Machine Learning Approaches

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-01-01

    Gravitationally unbound loose stellar associations (i.e., young nearby moving groups; moving groups hereafter) have been intensively explored because they are important in planet and disk formation studies, exoplanet imaging, and age calibration. Among the many efforts devoted to the search for moving group members, a Bayesian approach (e.g., using the code BANYAN) has become popular recently because of the many advantages it offers. However, the resultant membership probability needs to be carefully adopted because of its sensitive dependence on input models. In this study, we have developed an improved membership calculation tool focusing on the beta-Pic moving group. We made three improvements for building the models used in BANYAN II: (1) updating the list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZUVW. Our improved tool can change membership probabilities by up to 70%. Membership probability is critical and must be better defined: for example, our code identifies only one third of the candidate members in SIMBAD that are believed to be kinematically associated with the beta-Pic moving group. Additionally, we performed a cluster analysis of young nearby stars using an unsupervised machine learning approach. As more moving groups and their members are identified, the complexity and ambiguity of the moving group configuration have increased. To clarify this issue, we analyzed ~4,000 X-ray bright young stellar candidates; here, we present the preliminary results. By re-identifying moving groups with the least human intervention, we expect to understand the composition of the solar neighborhood. Moreover, better defined moving group membership will help us understand star formation and evolution in relatively low density environments, especially for the low-mass stars that will be identified in the coming Gaia release.

  8. Modeling detection probability to improve marsh bird surveys in southern Canada and the Great Lakes states

    Directory of Open Access Journals (Sweden)

    Douglas C. Tozer

    2016-12-01

    Full Text Available Marsh birds are notoriously elusive, with variation in detection probability across species, regions, seasons, and different times of day and weather. Therefore, it is important to develop regional field survey protocols that maximize detections, but that also produce data for estimating and analytically adjusting for remaining differences in detections. We aimed to improve regional field survey protocols by estimating detection probability of eight elusive marsh bird species throughout two regions that have ongoing marsh bird monitoring programs: the southern Canadian Prairies (Prairie region) and the southern portion of the Great Lakes basin and parts of southern Québec (Great Lakes-St. Lawrence region). We accomplished our goal using generalized binomial N-mixture models and data from ~22,300 marsh bird surveys conducted between 2008 and 2014 by Bird Studies Canada's Prairie, Great Lakes, and Québec Marsh Monitoring Programs. Across all species, on average, detection probability was highest in the Great Lakes-St. Lawrence region from the beginning of May until mid-June, and then fell throughout the remainder of the season until the end of June; was lowest in the Prairie region in mid-May and then increased throughout the remainder of the season until the end of June; was highest during darkness compared with light; and did not vary significantly according to temperature (range: 0-30°C), cloud cover (0%-100%), or wind (0-20 kph), or during morning versus evening. We used our results to formulate improved marsh bird survey protocols for each region. Our analysis and recommendations are useful and contribute to conservation of wetland birds at various scales from local single-species studies to the continental North American Marsh Bird Monitoring Program.

  9. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    Science.gov (United States)

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates below 0.1%; error rates exceeded 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.

  10. A drawback and an improvement of the classical Weibull probability plot

    International Nuclear Information System (INIS)

    Jiang, R.

    2014-01-01

    The classical Weibull Probability Paper (WPP) plot has been widely used to identify a model for fitting a given dataset. It is based on a match in shape between the WPP plots of the model and the data. This paper carries out an analysis of the Weibull transformations that create the WPP plot and shows that the shape of the WPP plot of data randomly generated from a distribution model can be significantly different from the shape of the WPP plot of the model, due to the high non-linearity of the Weibull transformations. As such, choosing a model based on the shape of the WPP plot of the data can be unreliable. A cdf-based weighted least squares method is proposed to improve the parameter estimation accuracy, and an improved WPP plot is suggested to avoid the drawback of the classical WPP plot. The appropriateness and usefulness of the proposed estimation method and probability plot are illustrated by simulation and real-world examples.
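
    To make the classical WPP transformation concrete, the sketch below simulates Weibull data and fits a straight line to y = ln(-ln(1-F)) versus x = ln(t) by ordinary least squares; all parameter values are illustrative. The paper's point is that a cdf-based weighted least squares fit and an improved plot behave better than this classical recipe, which is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(42)
beta_true, eta_true = 1.8, 100.0                    # shape and scale of the simulated model
t = np.sort(rng.weibull(beta_true, size=200) * eta_true)

# Median-rank (Benard) plotting positions for the empirical CDF.
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# Classical WPP transformation: a Weibull sample should fall on a straight line
# with slope beta and intercept -beta * ln(eta).
x = np.log(t)
y = np.log(-np.log(1.0 - F))

slope, intercept = np.polyfit(x, y, 1)              # ordinary (unweighted) least squares
print(f"estimated beta ~ {slope:.2f}, estimated eta ~ {np.exp(-intercept / slope):.1f}")
```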

  11. Halobacterium salinarum NRC-1 PeptideAtlas: toward strategies for targeted proteomics and improved proteome coverage.

    Science.gov (United States)

    Van, Phu T; Schmid, Amy K; King, Nichole L; Kaur, Amardeep; Pan, Min; Whitehead, Kenia; Koide, Tie; Facciotti, Marc T; Goo, Young Ah; Deutsch, Eric W; Reiss, David J; Mallick, Parag; Baliga, Nitin S

    2008-09-01

    The relatively small numbers of proteins and fewer possible post-translational modifications in microbes provide a unique opportunity to comprehensively characterize their dynamic proteomes. We have constructed a PeptideAtlas (PA) covering 62.7% of the predicted proteome of the extremely halophilic archaeon Halobacterium salinarum NRC-1 by compiling approximately 636 000 tandem mass spectra from 497 mass spectrometry runs in 88 experiments. Analysis of the PA with respect to biophysical properties of constituent peptides, functional properties of parent proteins of detected peptides, and performance of different mass spectrometry approaches has highlighted plausible strategies for improving proteome coverage and selecting signature peptides for targeted proteomics. Notably, discovery of a significant correlation between absolute abundances of mRNAs and proteins has helped identify low abundance of proteins as the major limitation in peptide detection. Furthermore, we have discovered that iTRAQ labeling for quantitative proteomic analysis introduces a significant bias in peptide detection by mass spectrometry. Therefore, despite identifying at least one proteotypic peptide for almost all proteins in the PA, a context-dependent selection of proteotypic peptides appears to be the most effective approach for targeted proteomics.

  12. Improving the Yule-Nielsen modified Neugebauer model by dot surface coverages depending on the ink superposition conditions

    Science.gov (United States)

    Hersch, Roger David; Crete, Frederique

    2005-01-01

    Dot gain is different when dots are printed alone, printed in superposition with one ink or printed in superposition with two inks. In addition, the dot gain may also differ depending on the solid ink onto which the considered halftone layer is superposed. In a previous research project, we developed a model for computing the effective surface coverage of a dot according to its superposition conditions. In the present contribution, we improve the Yule-Nielsen modified Neugebauer model by integrating into it our effective dot surface coverage computation model. Calibration of the reproduction curves mapping nominal to effective surface coverages in every superposition condition is carried out by fitting effective dot surfaces which minimize the sum of square differences between the measured reflection density spectra and reflection density spectra predicted according to the Yule-Nielsen modified Neugebauer model. In order to predict the reflection spectrum of a patch, its known nominal surface coverage values are converted into effective coverage values by weighting the contributions from different reproduction curves according to the weights of the contributing superposition conditions. We analyze the colorimetric prediction improvement brought by our extended dot surface coverage model for clustered-dot offset prints, thermal transfer prints and ink-jet prints. The color differences induced by the differences between measured reflection spectra and reflection spectra predicted according to the new dot surface estimation model are quantified on 729 different cyan, magenta, yellow patches covering the full color gamut. As a reference, these differences are also computed for the classical Yule-Nielsen modified spectral Neugebauer model incorporating a single halftone reproduction curve for each ink. Taking into account dot surface coverages according to different superposition conditions considerably improves the predictions of the Yule-Nielsen modified Neugebauer model.
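
    As background for the model being extended, the sketch below is a generic Yule-Nielsen modified spectral Neugebauer prediction: Demichel area weights computed from (effective) ink surface coverages are applied to the Neugebauer primary reflectances raised to the power 1/n. The spectra, coverages and n value are made up, and the paper's actual contribution, effective coverages that depend on the ink superposition condition, is not implemented here.

```python
import numpy as np

def demichel_weights(c, m, y):
    """Area weights of the 8 Neugebauer primaries (white, C, M, Y, CM, CY, MY, CMY)
    for cyan, magenta and yellow surface coverages c, m, y."""
    return np.array([
        (1-c)*(1-m)*(1-y), c*(1-m)*(1-y), (1-c)*m*(1-y), (1-c)*(1-m)*y,
        c*m*(1-y),         c*(1-m)*y,     (1-c)*m*y,     c*m*y,
    ])

def yule_nielsen_neugebauer(c, m, y, primary_reflectances, n=2.0):
    """Predicted reflectance spectrum: R = (sum_i w_i * R_i**(1/n))**n."""
    w = demichel_weights(c, m, y)                 # shape (8,)
    Rp = np.asarray(primary_reflectances)         # shape (8, number of wavelength bands)
    return ((w[:, None] * Rp ** (1.0 / n)).sum(axis=0)) ** n

# Toy primary reflectances over 4 wavelength bands, ordered white, C, M, Y, CM, CY, MY, CMY.
primaries = np.array([
    [0.90, 0.90, 0.90, 0.90], [0.10, 0.40, 0.80, 0.85], [0.80, 0.15, 0.30, 0.75],
    [0.85, 0.80, 0.20, 0.15], [0.08, 0.10, 0.25, 0.60], [0.09, 0.35, 0.15, 0.12],
    [0.70, 0.12, 0.10, 0.10], [0.05, 0.08, 0.07, 0.08],
])
print(yule_nielsen_neugebauer(0.3, 0.5, 0.2, primaries, n=2.0))
```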

  13. Improving coverage of postnatal care in rural Ethiopia using a community-based, collaborative quality improvement approach.

    Science.gov (United States)

    Tesfaye, Solomon; Barry, Danika; Gobezayehu, Abebe Gebremariam; Frew, Aynalem Hailemichael; Stover, Kim Ethier; Tessema, Hana; Alamineh, Lamesgin; Sibley, Lynn M

    2014-01-01

    Ethiopia has high maternal and neonatal mortality and low use of skilled maternity care. The Maternal and Newborn Health in Ethiopia Partnership (MaNHEP), a 3.5-year learning project, used a community collaborative quality improvement approach to improve maternal and newborn health care during the birth-to-48-hour period. This study examines how the promotion of community maternal and newborn health (CMNH) family meetings and labor and birth notification contributed to increased postnatal care within 48 hours by skilled providers or health extension workers. Baseline and endline surveys, monthly quality improvement data, and MaNHEP's CMNH change package, a compendium of the most effective changes developed and tested by communities, were reviewed. Logistic regression assessed factors associated with postnatal care receipt. Monthly postnatal care receipt was plotted with control charts. The baseline (n = 1027) and endline (n = 1019) surveys showed significant increases in postnatal care, from 5% to 51% and from 15% to 47% in the Amhara and Oromiya regions, respectively (both P care. Women with any antenatal care were 1.7 times more likely to have had a postnatal care visit (odds ratio [OR], 1.67; 95% confidence interval [CI], 1.10-2.54; P care (OR, 4.86; 95% CI, 2.67-8.86; P care far exceeds the 7% postnatal care coverage rate reported in the 2011 Ethiopian Demographic and Health Survey (EDHS). This result was linked to ideas generated by community quality improvement teams for labor and birth notification and cooperation with community-level health workers to promote antenatal care and CMNH family meetings. © 2014 by the American College of Nurse-Midwives.

  14. Improved collision probability method for thermal-neutron-flux calculation in a cylindrical reactor cell

    International Nuclear Information System (INIS)

    Bosevski, T.

    1986-01-01

    An improved collision probability method for thermal-neutron-flux calculation in a cylindrical reactor cell has been developed. Expanding the neutron flux and source into a series of even powers of the radius, one gets a convenient method for integration of the one-energy-group integral transport equation. It is shown that it is possible to perform an analytical integration in the x-y plane in one variable and to use an effective Gaussian integration over the other. Choosing a convenient distribution of space points in fuel and moderator, the transport matrix calculation and the cell reaction rate integration were condensed. On the basis of the proposed method, the computer program DISKRET for the ZUSE-Z 23 K computer has been written. The suitability of the proposed method for the calculation of the thermal-neutron-flux distribution in a reactor cell can be seen from the test results obtained. Compared with other collision probability methods, the proposed treatment excels in mathematical simplicity and faster convergence. (author)

  15. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Science.gov (United States)

    O'Connor, Kelly M; Nathan, Lucas R; Liberati, Marjorie R; Tingley, Morgan W; Vokoun, Jason C; Rittenhouse, Tracy A G

    2017-01-01

    Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species, often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two-camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori identify
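
    The gain from adding cameras reported above has a simple probabilistic core: if each camera at a site independently detects a present species with per-camera probability p over the season, an array of k cameras detects it with probability 1 - (1 - p)^k. The per-camera probabilities below are hypothetical, and the independence assumption is ours rather than a finding of the study; the sketch only shows why infrequently detected species benefit most from larger arrays.

```python
def array_detection_probability(p_single: float, n_cameras: int) -> float:
    """Season-long detection probability of an array of independent cameras."""
    return 1.0 - (1.0 - p_single) ** n_cameras

# Hypothetical per-camera, season-long detection probabilities for two kinds of species.
for species, p in [("frequently detected ", 0.75), ("infrequently detected", 0.15)]:
    row = "  ".join(f"k={k}: {array_detection_probability(p, k):.2f}" for k in (1, 2, 4, 8))
    print(f"{species}  {row}")
```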

  16. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Directory of Open Access Journals (Sweden)

    Kelly M O'Connor

    Full Text Available Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species, often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two-camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori

  17. [Leadership and vision in the improvement of universal health care coverage in low-income countries].

    Science.gov (United States)

    Meda, Ziemlé Clément; Konate, Lassina; Ouedraogo, Hyacinthe; Sanou, Moussa; Hercot, David; Sombie, Issiaka

    2011-01-01

    implementing a decentralized approach to tuberculosis detection, succeeded in improving access to care and enabled us to quantify the rate of tuberculosis-HIV co-infection in the HD. The fourth intervention improved financial access to emergency obstetric care by providing essential drugs and consumables for emergency obstetric surgery free of charge. The fifth intervention boosted the motivation of health workers by an annual 'competition of excellence', organised for workers and teams in the HD. Finally, our sixth intervention was the introduction of a "culture" of evaluation and transparency, by means of a local health journal, used to interact with stakeholders both at the local level and in the health sector more broadly. We also present our experiences regularly during national health science symposia. Although the DT operates with limited resources, it has over time managed to improve care and services in the HD, through its dynamic management and strategic planning. It has reduced inpatient mortality and improved access to care, particularly for vulnerable groups, in line with the Primary Health Care and Bamako Initiative principles. This case study would have benefited from a stronger methodology. However, it shows that in a context of limited resources it is still possible to strengthen the local health system by improving management practices. To progress towards universal health coverage, all core functions of a DT are worth implementing, including leadership and vision. National and international health strategies should thus include a plan to provide for and train local health system managers who can provide both leadership and strategic vision.

  18. Modeling coverage gaps in haplotype frequencies via Bayesian inference to improve stem cell donor selection.

    Science.gov (United States)

    Louzoun, Yoram; Alter, Idan; Gragert, Loren; Albrecht, Mark; Maiers, Martin

    2018-05-01

    Regardless of sampling depth, accurate genotype imputation is limited in regions of high polymorphism which often have a heavy-tailed haplotype frequency distribution. Many rare haplotypes are thus unobserved. Statistical methods to improve imputation by extending reference haplotype distributions using linkage disequilibrium patterns that relate allele and haplotype frequencies have not yet been explored. In the field of unrelated stem cell transplantation, imputation of highly polymorphic human leukocyte antigen (HLA) genes has an important application in identifying the best-matched stem cell donor when searching large registries totaling over 28,000,000 donors worldwide. Despite these large registry sizes, a significant proportion of searched patients present novel HLA haplotypes. Supporting this observation, HLA population genetic models have indicated that many extant HLA haplotypes remain unobserved. The absent haplotypes are a significant cause of error in haplotype matching. We have applied a Bayesian inference methodology for extending haplotype frequency distributions, using a model where new haplotypes are created by recombination of observed alleles. Applications of this joint probability model offer significant improvement in frequency distribution estimates over the best existing alternative methods, as we illustrate using five-locus HLA frequency data from the National Marrow Donor Program registry. Transplant matching algorithms and disease association studies involving phasing and imputation of rare variants may benefit from this statistical inference framework.
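
    A minimal sketch of the general idea, not the paper's Bayesian model: extend an observed haplotype frequency table by giving unobserved recombinants of the observed alleles a small prior mass proportional to the product of their allele frequencies, then renormalize. The haplotype labels, frequencies, and the blending weight epsilon below are all hypothetical.

```python
from collections import defaultdict
from itertools import product
from math import prod

# Observed two-locus haplotype frequencies (hypothetical labels and values).
observed = {("A*01", "B*08"): 0.55, ("A*02", "B*07"): 0.40, ("A*02", "B*08"): 0.05}
n_loci = 2

# Marginal allele frequencies at each locus, accumulated from the observed haplotypes.
marginals = [defaultdict(float) for _ in range(n_loci)]
for hap, f in observed.items():
    for i, allele in enumerate(hap):
        marginals[i][allele] += f

# Candidate haplotypes: every recombination of observed alleles across loci.
candidates = list(product(*[list(m) for m in marginals]))

epsilon = 0.01  # total mass reserved for haplotypes never seen in the data
prior = {h: prod(marginals[i][a] for i, a in enumerate(h)) for h in candidates}
z = sum(p for h, p in prior.items() if h not in observed)

extended = {}
for h in candidates:
    if h in observed:
        extended[h] = (1 - epsilon) * observed[h]
    else:
        extended[h] = epsilon * prior[h] / z

for h, f in sorted(extended.items(), key=lambda kv: -kv[1]):
    print(h, round(f, 4))
```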

  19. Beta-decay rate and beta-delayed neutron emission probability of improved gross theory

    Science.gov (United States)

    Koura, Hiroyuki

    2014-09-01

    A theoretical study has been carried out on the beta-decay rate and the beta-delayed neutron emission probability. The gross theory of beta decay is based on the idea of a sum rule for the beta-decay strength function, and has succeeded in describing beta-decay half-lives of nuclei over the whole nuclear mass region. The gross theory includes not only the allowed transitions (Fermi and Gamow-Teller), but also the first-forbidden transitions. In this work, some improvements are introduced, namely a nuclear shell correction on nuclear level densities and nuclear deformation in the nuclear strength functions; these effects were not included in the original gross theory. The shell energy and the nuclear deformation for unmeasured nuclei are adopted from the KTUY nuclear mass formula, which is based on the spherical-basis method. Considering the properties of the integrated Fermi function, the excitation-energy region of a daughter nucleus can be roughly categorized into three regions: a highly excited region, which fully affects the delayed neutron probability; a middle region, which is estimated to contribute to the decay heat; and a region neighboring the ground state, which determines the beta-decay rate. Some results will be given in the presentation.

  20. PoliMedia - Improving Analyses of Radio, TV & Newspaper Coverage of Political Debates

    NARCIS (Netherlands)

    M.J. Kemman (Max); M. Kleppe (Martijn)

    2013-01-01

    Abstract. Analysing media coverage across several types of media-outlets is a challenging task for academic researchers. The PoliMedia project aimed to showcase the potential of cross-media analysis by linking the digitised transcriptions of the debates at the Dutch Parliament (Dutch

  1. Improving Community Coverage of Oral Cholera Mass Vaccination Campaigns: Lessons Learned in Zanzibar

    Science.gov (United States)

    Schaetti, Christian; Ali, Said M.; Chaignat, Claire-Lise; Khatib, Ahmed M.; Hutubessy, Raymond; Weiss, Mitchell G.

    2012-01-01

    , local decision-makers should reconsider how careful logistical arrangements may improve community coverage and thus effectiveness of vaccination campaigns. PMID:22844489

  2. Improving Spectral Capacity and Wireless Network Coverage by Cognitive Radio Technology and Relay Nodes in Cellular Systems

    DEFF Research Database (Denmark)

    Frederiksen, Flemming Bjerge

    2008-01-01

    Methods to enhance the use of the frequency spectrum by automatic spectrum sensing plus spectrum sharing in a cognitive radio technology context have been presented and discussed in this paper. Ideas to improve wireless transmission by orthogonal OFDM-based communication and to increase the coverage of cellular systems by future wireless networks, relay channels, relay stations and collaborative radio have been presented as well. A revised hierarchical deployment of future wireless and wired networks is briefly discussed.

  3. Active offer of vaccinations during hospitalization improves coverage among splenectomized patients: An Italian experience.

    Science.gov (United States)

    Gallone, Maria Serena; Martino, Carmen; Quarto, Michele; Tafuri, Silvio

    2017-08-01

    In 2014, an Italian hospital implemented a protocol for offering pneumococcal, meningococcal, and Haemophilus influenzae type b vaccines to splenectomized patients during their hospitalization. After 1 year, coverage for the recommended vaccinations increased from 5.7% to 66.7%, and the average time between splenectomy and vaccine administration decreased from 84.7 to 7.5 days. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  4. Electronic and postal reminders for improving immunisation coverage in children: protocol for a systematic review and meta-analysis.

    Science.gov (United States)

    Chachou, Martel J; Mukinda, Fidele K; Motaze, Villyen; Wiysonge, Charles S

    2015-10-15

    Worldwide, suboptimal immunisation coverage causes the deaths of more than one million children under five from vaccine-preventable diseases every year. Reasons for suboptimal coverage are multifactorial, and a combination of interventions is needed to improve compliance with immunisation schedules. One intervention relies on reminders, where the health system prompts caregivers to attend immunisation appointments on time or re-engages caregivers who have defaulted on scheduled appointments. We undertake this systematic review to investigate the potential of reminders using emails, phone calls, social media, letters or postcards to improve immunisation coverage in children under five. We will search for published and unpublished randomised controlled trials and non-randomised controlled trials in PubMed, Scopus, CINAHL, CENTRAL, Science Citation Index, WHOLIS, Clinicaltrials.gov and the WHO International Clinical Trials Platform. We will conduct screening of search results, study selection, data extraction and risk-of-bias assessment in duplicate, resolving disagreements by consensus. In addition, we will pool data from clinically homogeneous studies using random-effects meta-analysis; assess heterogeneity of effects using the χ² test of homogeneity; and quantify any observed heterogeneity using the I² statistic. This protocol does not need approval by an ethics committee because we will use publicly available data, without directly involving human participants. The results will provide updated evidence on the effects of electronic and postal reminders on immunisation coverage, and we will discuss the applicability of the findings to low and middle-income countries. We plan to disseminate review findings through publication in a peer-reviewed journal and presentation at relevant conferences. In addition, we will prepare a policymaker-friendly summary using a validated format (eg, SUPPORT Summary) and disseminate this through social media and email discussion
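
    The pooling and heterogeneity statistics named in this protocol (random-effects meta-analysis, the χ² test of homogeneity, and I²) can be computed with a standard DerSimonian-Laird calculation, sketched below. The effect sizes and variances are invented, and this is a generic illustration rather than the review's analysis plan.

```python
import numpy as np
from scipy import stats

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling with Cochran's Q and I^2."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q (chi-squared test of homogeneity)
    k = len(y)
    p_het = stats.chi2.sf(q, df=k - 1)
    i2 = max(0.0, (q - (k - 1)) / q) * 100.0 if q > 0 else 0.0
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return y_re, se_re, q, p_het, i2, tau2

# Hypothetical per-trial effect estimates (e.g., log risk ratios) and their variances.
est, se, q, p, i2, tau2 = random_effects_pool(
    [0.25, 0.40, 0.10, 0.55, 0.30], [0.02, 0.05, 0.03, 0.04, 0.02])
print(f"pooled={est:.3f}  SE={se:.3f}  Q={q:.2f} (p={p:.3f})  I2={i2:.1f}%  tau2={tau2:.4f}")
```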

  5. Improving Health Care Coverage, Equity, And Financial Protection Through A Hybrid System: Malaysia's Experience.

    Science.gov (United States)

    Rannan-Eliya, Ravindra P; Anuranga, Chamara; Manual, Adilius; Sararaks, Sondi; Jailani, Anis S; Hamid, Abdul J; Razif, Izzanie M; Tan, Ee H; Darzi, Ara

    2016-05-01

    Malaysia has made substantial progress in providing access to health care for its citizens and has been more successful than many other countries that are better known as models of universal health coverage. Malaysia's health care coverage and outcomes are now approaching levels achieved by member nations of the Organization for Economic Cooperation and Development. Malaysia's results are achieved through a mix of public services (funded by general revenues) and parallel private services (predominantly financed by out-of-pocket spending). We examined the distributional aspects of health financing and delivery and assessed financial protection in Malaysia's hybrid system. We found that this system has been effective for many decades in equalizing health care use and providing protection from financial risk, despite modest government spending. Our results also indicate that a high out-of-pocket share of total financing is not a consistent proxy for financial protection; greater attention is needed to the absolute level of out-of-pocket spending. Malaysia's hybrid health system presents continuing unresolved policy challenges, but the country's experience nonetheless provides lessons for other emerging economies that want to expand access to health care despite limited fiscal resources. Project HOPE—The People-to-People Health Foundation, Inc.

  6. Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics

    International Nuclear Information System (INIS)

    Marshman, Emily; Singh, Chandralekha

    2017-01-01

    A solid grasp of the probability distributions for measuring physical observables is central to connecting the quantum formalism to measurements. However, students often struggle with the probability distributions of measurement outcomes for an observable and have difficulty expressing this concept in different representations. Here we first describe the difficulties that upper-level undergraduate and PhD students have with the probability distributions for measuring physical observables in quantum mechanics. We then discuss how student difficulties found in written surveys and individual interviews were used as a guide in the development of a quantum interactive learning tutorial (QuILT) to help students develop a good grasp of the probability distributions of measurement outcomes for physical observables. The QuILT strives to help students become proficient in expressing the probability distributions for the measurement of physical observables in Dirac notation and in the position representation and be able to convert from Dirac notation to position representation and vice versa. We describe the development and evaluation of the QuILT and findings about the effectiveness of the QuILT from in-class evaluations. (paper)
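
    The concept targeted by the QuILT, the probability distribution of measurement outcomes for an observable, is the Born rule applied in the observable's eigenbasis. The sketch below uses a hypothetical two-level observable and state to compute the outcome probabilities |⟨a_i|ψ⟩|².

```python
import numpy as np

# A Hermitian observable and a normalized state, both expressed in the same basis
# (values are illustrative only).
A = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)       # eigenvalues are +1 and -1
psi = np.array([1.0, 0.0], dtype=complex)      # state prepared as the first basis vector

evals, evecs = np.linalg.eigh(A)               # eigenvalues a_i and eigenvectors |a_i> (columns)
amplitudes = evecs.conj().T @ psi              # <a_i|psi>
probs = np.abs(amplitudes) ** 2                # Born rule

for a, p in zip(evals, probs):
    print(f"outcome {a:+.3f}   probability {p:.3f}")
print("probabilities sum to", probs.sum())
```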

  7. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
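
    The catch-adjustment step described above amounts to dividing raw catch by a modeled capture probability. The logistic model, covariates, and coefficients below are placeholders rather than the fitted Ozark Highland model; they only illustrate the mechanics of turning a count into a reach-scale abundance estimate.

```python
import math

def capture_probability(discharge_cms, depth_m, fish_length_mm,
                        b0=-0.2, b_q=-0.8, b_d=-1.1, b_len=0.004):
    """Logit-linear capture probability from reach covariates and fish size
    (coefficients are hypothetical)."""
    eta = b0 + b_q * discharge_cms + b_d * depth_m + b_len * fish_length_mm
    return 1.0 / (1.0 + math.exp(-eta))

def adjusted_abundance(catch, p_capture):
    """Horvitz-Thompson style adjustment: N_hat = catch / p_hat."""
    return catch / p_capture

p = capture_probability(discharge_cms=0.6, depth_m=0.4, fish_length_mm=85)
print(f"capture probability: {p:.2f}")
print(f"catch of 23 fish -> estimated reach abundance: {adjusted_abundance(23, p):.1f}")
```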

  8. The application of probability methods with a view to improving the quality of equipment

    International Nuclear Information System (INIS)

    Carnino, A.; Gachot, B.; Greppo, J.-F.; Guitton, J.

    1976-01-01

    After stating that reliability and availability can be considered as parameters allowing the quality of equipment to be estimated, the chief aspects of the use of probability methods in the field of quality are described. These methods are mainly applied at the design, operation and maintenance level of the equipment, as well as at the compilation stage of the corresponding data [fr]

  9. 42 CFR 416.43 - Conditions for coverage-Quality assessment and performance improvement.

    Science.gov (United States)

    2010-10-01

    ..., an ongoing program that demonstrates measurable improvement in patient health outcomes, and improves patient safety by using quality indicators or performance measures associated with improved health outcomes and by the identification and reduction of medical errors. (2) The ASC must measure, analyze, and...

  10. National health expenditure projections, 2013-23: faster growth expected with expanded coverage and improving economy.

    Science.gov (United States)

    Sisko, Andrea M; Keehan, Sean P; Cuckler, Gigi A; Madison, Andrew J; Smith, Sheila D; Wolfe, Christian J; Stone, Devin A; Lizonitz, Joseph M; Poisal, John A

    2014-10-01

    In 2013 health spending growth is expected to have remained slow, at 3.6 percent, as a result of the sluggish economic recovery, the effects of sequestration, and continued increases in private health insurance cost-sharing requirements. The combined effects of the Affordable Care Act's coverage expansions, faster economic growth, and population aging are expected to fuel health spending growth this year and thereafter (5.6 percent in 2014 and 6.0 percent per year for 2015-23). However, the average rate of increase through 2023 is projected to be slower than the 7.2 percent average growth experienced during 1990-2008. Because health spending is projected to grow 1.1 percentage points faster than the average economic growth during 2013-23, the health share of the gross domestic product is expected to rise from 17.2 percent in 2012 to 19.3 percent in 2023. Project HOPE—The People-to-People Health Foundation, Inc.

  11. Examining dog owners' beliefs regarding rabies vaccination during government-funded vaccine clinics in Grenada to improve vaccine coverage rates.

    Science.gov (United States)

    Thomas, D; Delgado, A; Louison, B; Lefrancois, T; Shaw, J

    2013-07-01

    Vaccination of domestic pets is an important component of rabies control and prevention in countries where the disease is maintained in a wildlife reservoir. In Grenada, vaccine coverage rates were low, despite extensive public education and advertising of government-sponsored vaccine clinics where rabies vaccine is administered to animals at no cost to animal owners. Information was needed on reasons for decreased dog owner participation in government-funded rabies vaccination clinics. A total of 120 dog owners from 6 different parishes were asked to complete a questionnaire assessing their currently held beliefs about rabies vaccination and perception of the risk posed by rabies. Over 70% of respondents believed that problems in the organization and management of clinic sites could allow for fighting between dogs or disease spread among dogs, while 35% of owners did not believe that they had the ability or adequate help to bring their dogs to the clinic sites. Recommendations for improving vaccine coverage rates included: improved scheduling of clinic sites and dates; increased biosecurity at clinic locations; focused advertising on the availability of home visits, particularly for aggressive dogs or dogs with visible skin-related diseases such as mange; and the recruitment of community volunteers to assist with bringing dogs to the clinic sites. Copyright © 2013. Published by Elsevier B.V.

  12. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  13. Improving immunization in Afghanistan: results from a cross-sectional community-based survey to assess routine immunization coverage

    Directory of Open Access Journals (Sweden)

    Raveesha R. Mugali

    2017-04-01

    Full Text Available Abstract Background Despite progress in recent years, Afghanistan is lagging behind in realizing the full potential of immunization. The country is still endemic for polio transmission and measles outbreaks continue to occur. In spite of significant reductions over the past decade, the mortality rate of children under 5 years of age continues to remain high at 91 per 1000 live births. Methods The study was a descriptive community-based cross sectional household survey. The survey aimed to estimate the levels of immunization coverage at national and province levels. Specific objectives are to: establish valid baseline information to monitor progress of the immunization program; identify reasons why children are not immunized; and make recommendations to enhance access and quality of immunization services in Afghanistan. The survey was carried out in all 34 provinces of the country, with a sample of 6125 mothers of children aged 12–23 months. Results Nationally, 51% of children participating in the survey received all doses of each antigen irrespective of the recommended date of immunization or recommended interval between doses. About 31% of children were found to be partially vaccinated. Reasons for partial vaccination included: place to vaccinate child too far (23%), not aware of the need of vaccination (17%), no faith in vaccination (16%), mother was too busy (15%), and fear of side effects (11%). Conclusion The innovative mechanism of contracting out delivery of primary health care services in Afghanistan, including immunization, to non-governmental organizations is showing some positive results in quickly increasing coverage of essential interventions, including routine immunization. Much ground still needs to be covered with proper planning and management of resources in order to improve the immunization coverage in Afghanistan and increase survival and health status of its children.

  14. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    Science.gov (United States)

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    %, and 20% of missing values. This expansion in the KB coverage allowed solving complex disease diagnostic queries that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms yield a significant performance overhead. We observed that plausible reasoning approaches, by generating tentative inferences and leveraging domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.

  15. Expanded microbial genome coverage and improved protein family annotation in the COG database.

    Science.gov (United States)

    Galperin, Michael Y; Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V

    2015-01-01

    Microbial genome sequencing projects produce numerous sequences of deduced proteins, only a small fraction of which have been or will ever be studied experimentally. This leaves sequence analysis as the only feasible way to annotate these proteins and assign to them tentative functions. The Clusters of Orthologous Groups of proteins (COGs) database (http://www.ncbi.nlm.nih.gov/COG/), first created in 1997, has been a popular tool for functional annotation. Its success was largely based on (i) its reliance on complete microbial genomes, which allowed reliable assignment of orthologs and paralogs for most genes; (ii) orthology-based approach, which used the function(s) of the characterized member(s) of the protein family (COG) to assign function(s) to the entire set of carefully identified orthologs and describe the range of potential functions when there were more than one; and (iii) careful manual curation of the annotation of the COGs, aimed at detailed prediction of the biological function(s) for each COG while avoiding annotation errors and overprediction. Here we present an update of the COGs, the first since 2003, and a comprehensive revision of the COG annotations and expansion of the genome coverage to include representative complete genomes from all bacterial and archaeal lineages down to the genus level. This re-analysis of the COGs shows that the original COG assignments had an error rate below 0.5% and allows an assessment of the progress in functional genomics in the past 12 years. During this time, functions of many previously uncharacterized COGs have been elucidated and tentative functional assignments of many COGs have been validated, either by targeted experiments or through the use of high-throughput methods. A particularly important development is the assignment of functions to several widespread, conserved proteins many of which turned out to participate in translation, in particular rRNA maturation and tRNA modification. The new version of the

  16. Improving normal tissue complication probability models: the need to adopt a "data-pooling" culture.

    Science.gov (United States)

    Deasy, Joseph O; Bentzen, Søren M; Jackson, Andrew; Ten Haken, Randall K; Yorke, Ellen D; Constine, Louis S; Sharma, Ashish; Marks, Lawrence B

    2010-03-01

    Clinical studies of the dependence of normal tissue response on dose-volume factors are often confusingly inconsistent, as the QUANTEC reviews demonstrate. A key opportunity to accelerate progress is to begin storing high-quality datasets in repositories. Using available technology, multiple repositories could be conveniently queried, without divulging protected health information, to identify relevant sources of data for further analysis. After obtaining institutional approvals, data could then be pooled, greatly enhancing the capability to construct predictive models that are more widely applicable and better powered to accurately identify key predictive factors (whether dosimetric, image-based, clinical, socioeconomic, or biological). Data pooling has already been carried out effectively in a few normal tissue complication probability studies and should become a common strategy. Copyright 2010 Elsevier Inc. All rights reserved.

  17. Adjuvant Chemotherapy Improves the Probability of Freedom From Recurrence in Patients With Resected Stage IB Lung Adenocarcinoma.

    Science.gov (United States)

    Hung, Jung-Jyh; Wu, Yu-Chung; Chou, Teh-Ying; Jeng, Wen-Juei; Yeh, Yi-Chen; Hsu, Wen-Hu

    2016-04-01

    The benefit of adjuvant chemotherapy remains controversial for patients with stage IB non-small-cell lung cancer (NSCLC). This study investigated the effect of adjuvant chemotherapy and the predictors of benefit from adjuvant chemotherapy in patients with stage IB lung adenocarcinoma. A total of 243 patients with completely resected pathologic stage IB lung adenocarcinoma were included in the study. Predictors of the benefits of improved overall survival (OS) or probability of freedom from recurrence (FFR) from platinum-based adjuvant chemotherapy in patients with resected stage IB lung adenocarcinoma were investigated. Among the 243 patients, 70 (28.8%) had received platinum-based doublet adjuvant chemotherapy. A micropapillary/solid-predominant pattern (versus an acinar/papillary-predominant pattern) was a significantly worse prognostic factor for probability of FFR (p = 0.033). Although adjuvant chemotherapy (versus surgical intervention alone) was not a significant prognostic factor for OS (p = 0.303), it was a significant prognostic factor for a better probability of FFR (p = 0.029) on multivariate analysis. In propensity-score-matched pairs, there was no significant difference in OS between patients who received adjuvant chemotherapy and those who did not (p = 0.386). Patients who received adjuvant chemotherapy had a significantly better probability of FFR than those who did not (p = 0.043). For patients with a predominantly micropapillary/solid pattern, adjuvant chemotherapy (p = 0.033) was a significant prognostic factor for a better probability of FFR on multivariate analysis. Adjuvant chemotherapy is a favorable prognostic factor for the probability of FFR in patients with stage IB lung adenocarcinoma, particularly in those with a micropapillary/solid-predominant pattern. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  18. An Improvement to DCPT: The Particle Transfer Probability as a Function of Particle's Age

    International Nuclear Information System (INIS)

    L. Pan; G. S. Bodvarsson

    2001-01-01

    Multi-scale features of transport processes in fractured porous media make numerical modeling a difficult task of both conceptualization and computation. Dual-continuum particle tracker (DCPT) is an attractive method for modeling large-scale problems typically encountered in the field, such as those in the unsaturated zone (UZ) of Yucca Mountain, Nevada. Its major advantage is its capability to capture the major features of flow and transport in fractured porous rock (i.e., a fast fracture sub-system combined with a slow matrix sub-system) with reasonable computational resources. However, like other conventional dual-continuum approach-based numerical methods, DCPT (v1.0) is often criticized for failing to capture the transient features of the diffusion depth into the matrix. It may overestimate the transport of tracers through the fractures, especially for cases with large fracture spacing, and predict artificial early breakthroughs. The objective of this study is to develop a new theory for calculating the particle transfer probability that captures the transient features of the diffusion depth into the matrix within the framework of the dual-continuum random walk particle method (RWPM).

  19. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    Science.gov (United States)

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. We then applied a decision-making model, sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than those with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
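
    The record does not reproduce the classifier internals; the sketch below shows only the generic Wald-style accumulation rule that SPRT uses, assuming the per-sample evidence (log-likelihood ratios for the two motor-imagery classes) is supplied by an upstream feature extraction stage that is not modeled here.

        # Minimal sketch of Wald's sequential probability ratio test (SPRT).
        # The evidence stream and the Gaussian-free interface are illustrative
        # assumptions, not taken from the paper.
        import math

        def sprt(evidence, alpha=0.05, beta=0.05):
            """evidence: iterable of per-sample log-likelihood ratios log p(x|H1)/p(x|H0).
            Returns (decision, samples_used); decision is "H1", "H0", or "undecided"
            if the evidence runs out before a boundary is crossed."""
            upper = math.log((1 - beta) / alpha)   # accept H1 above this boundary
            lower = math.log(beta / (1 - alpha))   # accept H0 below this boundary
            llr, n = 0.0, 0
            for n, step in enumerate(evidence, start=1):
                llr += step
                if llr >= upper:
                    return "H1", n
                if llr <= lower:
                    return "H0", n
            return "undecided", n

        if __name__ == "__main__":
            # Toy evidence stream favouring H1 (positive log-likelihood ratios).
            print(sprt([0.4, 0.3, 0.5, 0.6, 0.7, 0.5, 0.4, 0.6]))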

  20. Mandatory Rest Stops Improve Athlete Safety during Event Medical Coverage for Ultramarathons.

    Science.gov (United States)

    Joslin, Jeremy; Mularella, Joshua; Bail, Allison; Wojcik, Susan; Cooney, Derek R

    2016-02-01

    Provision of medical direction and clinical services for ultramarathons requires specific attention to heat illness. Heat stress can affect athlete performance negatively, and heat accumulation without acclimatization is associated with the development of exertional heat stroke (EHS). To mitigate this safety risk, the Jungle Marathon (Para, Brazil) instituted mandatory rest periods during the first two days of this 7-day staged Brazilian ultramarathon. Race records were reviewed retrospectively to determine the number of runners who suffered an emergency medical complication related to heat stress and did not finish (DNF) the race. The review covered three years before and three years after the institution of these mandatory rest periods. A total of 326 runners competed in the Jungle Marathon during the 2008-2013 period of study. During the pre-intervention years, a total of 46 athletes (21%) did not finish the full race, with 25 (54.3%) cases attributed to heat-related factors. During the post-intervention years, a total of 26 athletes (24.3%) did not finish the full race, with four (15.4%) cases attributed to heat-related factors. Mandatory rest stops during extreme running events in hot or tropical environments, like the Jungle Marathon, are likely to improve athlete safety and improve the heat acclimatization process.

  1. SMS text message reminders to improve infant vaccination coverage in Guatemala: A pilot randomized controlled trial.

    Science.gov (United States)

    Domek, Gretchen J; Contreras-Roldan, Ingrid L; O'Leary, Sean T; Bull, Sheana; Furniss, Anna; Kempe, Allison; Asturias, Edwin J

    2016-05-05

    Patient reminder systems are an evidence-based way to improve childhood vaccination rates but are difficult to implement in low- and middle-income countries (LMICs). Short Message Service (SMS) texts may offer a potential low-cost solution, especially in LMICs where mobile phones are becoming more ubiquitous. To determine if an SMS-based vaccination reminder system aimed at improving completion of the infant primary immunization series is feasible and acceptable in Guatemala, a pilot randomized controlled trial was conducted at two public health clinics in Guatemala City. Infants aged 8-14 weeks presenting for the first dose of the primary immunization series were enrolled in March-April 2013. Participants randomized into the intervention received three SMS reminders in the week before the second and third doses. A follow-up acceptability survey was administered to both groups. The participation rate was 86.8% (321/370); 8 did not own a cell phone and 12 could not use SMS. 96.9% of intervention parents were sent at least one SMS reminder prior to visit 2 and 96.3% prior to visit 3. Both intervention and usual care participants had high rates of vaccine and visit completion, with a non-statistically significant higher percentage of children in the intervention completing both visit 2 (95.0% vs. 90.1%, p=.12) and visit 3 (84.4% vs. 80.7%, p=.69). More intervention vs. usual care parents agreed that SMS reminders would be helpful for remembering appointments (p<.0001), agreed to being interested in receiving future SMS reminders (p<.0001), and said that they would be willing to pay for future SMS reminders (p=.01). This proof of concept evaluation showed that a new application of SMS technology is feasible to implement in a LMIC with high user satisfaction. Larger studies with modifications in the SMS system are needed to determine effectiveness (Clinical Trial Registry NCT01663636). Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Cost and sustainability of a successful package of interventions to improve vaccination coverage for children in urban slums of Bangladesh.

    Science.gov (United States)

    Hayford, K; Uddin, M J; Koehlmoos, T P; Bishai, D M

    2014-04-25

    To estimate the incremental economic costs and explore satisfaction with a highly effective intervention for improving immunization coverage among slum populations in Dhaka, Bangladesh. A package of interventions based on extended clinic hours, vaccinator training, active surveillance, and community participation was piloted in two slum areas of Dhaka, and resulted in an increase in valid fully immunized children (FIC) from 43% pre-intervention to 99% post-intervention. Cost data and stakeholder perspectives were collected January-February 2010 via document review and 10 key stakeholder interviews to estimate the financial and opportunity costs of the intervention, including uncompensated time, training and supervision costs. The total economic cost of the 1-year intervention was $18,300, comprising external management and supervision (73%), training (11%), coordination costs (1%), uncompensated staff time and clinic costs (2%), and communications, supplies and other costs (13%). An estimated 874 additional children were correctly and fully immunized due to the intervention, at an average cost of $20.95 per valid FIC. Key stakeholders ranked extended clinic hours and vaccinator training as the most important components of the intervention. External supervision was viewed as the most important factor for the intervention's success but also the costliest. All stakeholders would like to reinstate the intervention because it was effective, but additional funding would be needed to make the intervention sustainable. Targeting slum populations with an intensive immunization intervention was highly effective but would nearly triple the amount spent on immunization per FIC in slum areas. Those committed to increasing vaccination coverage for hard-to-reach children need to be prepared for substantially higher costs to achieve results. Copyright © 2014. Published by Elsevier Ltd.

  3. Probability or Reasoning: Current Thinking and Realistic Strategies for Improved Medical Decisions.

    Science.gov (United States)

    Nantha, Yogarabindranath Swarna

    2017-11-01

    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support the clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.

  4. Improved techniques for outgoing wave variational principle calculations of converged state-to-state transition probabilities for chemical reactions

    Science.gov (United States)

    Mielke, Steven L.; Truhlar, Donald G.; Schwenke, David W.

    1991-01-01

    Improved techniques and well-optimized basis sets are presented for application of the outgoing wave variational principle to calculate converged quantum mechanical reaction probabilities. They are illustrated with calculations for the reactions D + H2 → HD + H with total angular momentum J = 3 and F + H2 → HF + H with J = 0 and 3. The optimization involves the choice of distortion potential, the grid for calculating half-integrated Green's functions, the placement, width, and number of primitive distributed Gaussians, and the computationally most efficient partition between dynamically adapted and primitive basis functions. Benchmark calculations with 224-1064 channels are presented.

  5. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  6. Tailored liquid chromatography-mass spectrometry analysis improves the coverage of the intracellular metabolome of HepaRG cells.

    Science.gov (United States)

    Cuykx, Matthias; Negreira, Noelia; Beirnaert, Charlie; Van den Eede, Nele; Rodrigues, Robim; Vanhaecke, Tamara; Laukens, Kris; Covaci, Adrian

    2017-03-03

    Metabolomics protocols are often combined with Liquid Chromatography-Mass Spectrometry (LC-MS) using mostly reversed phase chromatography coupled to accurate mass spectrometry, e.g. quadrupole time-of-flight (QTOF) mass spectrometers, to measure as many metabolites as possible. In this study, we optimised the LC-MS separation of cell extracts after fractionation in polar and non-polar fractions. Both phases were analysed separately in a tailored approach in four different runs (two for the non-polar and two for the polar fraction), each of them specifically adapted to improve the separation of the metabolites present in the extract. This approach improves the coverage of a broad range of the metabolome of the HepaRG cells and the separation of intra-class metabolites. The non-polar fraction was analysed using a C18-column with end-capping; mobile phase compositions were specifically adapted for each ionisation mode using different co-solvents and buffers. The polar extracts were analysed with a mixed mode Hydrophilic Interaction Liquid Chromatography (HILIC) system. Acidic metabolites from glycolysis and the Krebs cycle, together with phosphorylated compounds, were best detected with a method using ion pairing (IP) with tributylamine and separation on a phenyl-hexyl column. Accurate mass detection was performed with the QTOF in MS-mode only using an extended dynamic range to improve the quality of the dataset. Parameters with the greatest impact on the detection were the balance between mass accuracy and linear range, the fragmentor voltage, the capillary voltage, the nozzle voltage, and the nebuliser pressure. By using a tailored approach for the intracellular HepaRG metabolome, consisting of three different LC techniques, over 2200 metabolites can be measured with high precision and an acceptable linear range. The developed method is suited for qualitative untargeted LC-MS metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Public sector scale-up of zinc and ORS improves coverage in selected districts in Bihar, India.

    Science.gov (United States)

    Walker, Christa L Fischer; Taneja, Sunita; Lamberti, Laura M; Black, Robert E; Mazumder, Sarmila

    2015-12-01

    In Bihar, India, a new initiative to enhance diarrhea treatment with zinc and ORS in the public sector was rolled out in selected districts. We conducted an external evaluation to measure changes in diarrhea careseeking and treatment in intervention districts. We conducted baseline and endline household surveys among caregivers of children 2-59 months of age. We calculated summary statistics for household characteristics, knowledge, careseeking and treatments given to children with a diarrhea episode in the last 14 days and built logistic regression models to compare baseline and endline values. Caregivers named a public health center as an appropriate source of care for childhood diarrhea more often at endline (71.3%) compared to baseline (38.4%) but did not report increased careseeking to public sector providers for the current diarrhea episode. In logistic regression analyses, the odds of receiving zinc, with or without oral rehydration salts (ORS), increased at endline by a factor of more than 2.7 compared to baseline. Children who were taken to the public sector for care were more likely to receive zinc (odds ratio, OR = 3.93) and zinc in addition to ORS (OR = 6.10) compared to children who were not taken to the public sector. Coverage of zinc and ORS can improve with public sector programs targeted at training and increasing product availability, but demand creation may be needed to increase public sector careseeking in areas where the private sector has historically provided much of the care.

  8. Public sector scale–up of zinc and ORS improves coverage in selected districts in Bihar, India

    Directory of Open Access Journals (Sweden)

    Christa L. Fischer Walker

    2015-02-01

    Full Text Available In Bihar, India, a new initiative to enhance diarrhea treatment with zinc and ORS in the public sector was rolled out in selected districts. We conducted an external evaluation to measure changes in diarrhea careseeking and treatment in intervention districts. We conducted baseline and endline household surveys among caregivers of children 2–59 months of age. We calculated summary statistics for household characteristics, knowledge, careseeking and treatments given to children with a diarrhea episode in the last 14 days and built logistic regression models to compare baseline and endline values. Caregivers named a public health center as an appropriate source of care for childhood diarrhea more often at endline (71.3%) compared to baseline (38.4%) but did not report increased careseeking to public sector providers for the current diarrhea episode. In logistic regression analyses, the odds of receiving zinc, with or without oral rehydration salts (ORS), increased at endline by a factor of more than 2.7 compared to baseline. Children who were taken to the public sector for care were more likely to receive zinc (odds ratio, OR = 3.93) and zinc in addition to ORS (OR = 6.10) compared to children who were not taken to the public sector. Coverage of zinc and ORS can improve with public sector programs targeted at training and increasing product availability, but demand creation may be needed to increase public sector careseeking in areas where the private sector has historically provided much of the care.

  9. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Directory of Open Access Journals (Sweden)

    Laurissa Tokarchuk

    Full Text Available In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The

  10. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Science.gov (United States)

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The novel combination of
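
    The burst detection algorithm applied to the divided streams is not specified in this record; as an illustration of the kind of temporal analysis described, a generic sliding-window detector that flags time bins whose message counts exceed the recent mean by several standard deviations is sketched below, purely as a stand-in for the framework's actual component.

        # Generic burst detector over time-binned message counts: a bin is
        # flagged when its count exceeds the mean of the preceding window by
        # k standard deviations. Illustrative only, not the STRIM algorithm.
        import statistics

        def detect_bursts(counts, window=12, k=3.0):
            """counts: message counts per time bin; returns indices of bursty bins."""
            bursts = []
            for i in range(window, len(counts)):
                history = counts[i - window:i]
                mean = statistics.fmean(history)
                stdev = statistics.pstdev(history) or 1.0   # avoid dividing by zero
                if counts[i] > mean + k * stdev:
                    bursts.append(i)
            return bursts

        if __name__ == "__main__":
            stream = [5, 6, 4, 7, 5, 6, 5, 4, 6, 5, 7, 6, 48, 52, 8, 6]   # toy data
            print(detect_bursts(stream))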

  11. Multi-GPU parallel algorithm design and analysis for improved inversion of probability tomography with gravity gradiometry data

    Science.gov (United States)

    Hou, Zhenlong; Huang, Danian

    2017-09-01

    In this paper, we first study the inversion of probability tomography (IPT) with gravity gradiometry data. The spatial resolution of the results is improved by multi-tensor joint inversion, a depth weighting matrix, and other methods. To address the problems posed by the large data volumes encountered in exploration, we present a parallel algorithm and its performance analysis, combining Compute Unified Device Architecture (CUDA) with Open Multi-Processing (OpenMP) for Graphics Processing Unit (GPU) acceleration. Tests on a synthetic model and on real data from Vinton Dome yield improved results, and the improved inversion algorithm is shown to be effective and feasible. The parallel algorithm we designed performs better than the other CUDA-based implementations; the maximum speedup could be more than 200. In the performance analysis, multi-GPU speedup and multi-GPU efficiency are applied to analyze the scalability of the multi-GPU programs. The designed parallel algorithm is demonstrated to be able to process larger-scale data, and the new analysis method is practical.
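
    The scalability metrics named in the performance analysis (multi-GPU speedup and efficiency) follow their usual definitions, speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p; the sketch below computes them from wall-clock timings, using purely hypothetical numbers rather than the paper's measurements.

        # Compute multi-GPU speedup and parallel efficiency relative to a
        # single-GPU run. The timing values are hypothetical placeholders.
        def multi_gpu_scalability(single_gpu_seconds, timings_by_gpu_count):
            """Return {gpu_count: (speedup, efficiency)} relative to the 1-GPU run."""
            metrics = {}
            for gpus, seconds in sorted(timings_by_gpu_count.items()):
                speedup = single_gpu_seconds / seconds
                metrics[gpus] = (speedup, speedup / gpus)
            return metrics

        if __name__ == "__main__":
            # Hypothetical wall-clock times in seconds, not results from the paper.
            print(multi_gpu_scalability(120.0, {1: 120.0, 2: 65.0, 4: 36.0}))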

  12. Percent Coverage

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Percent Coverage is a spreadsheet that keeps track of and compares the number of vessels that have departed with and without observers to the numbers of vessels...

  13. Innovations that reach the patient : Early health technology assessment and improving the chances of coverage and implementation

    NARCIS (Netherlands)

    Van Harten, W. H.; Retèl, V. P.

    2016-01-01

    With growing concern about the sustainability of the financial burden that health care, and especially cancer services, places on national budgets, the role of health economic analyses in coverage decisions is likely to grow. One of the strategies for the biomedical research field, also in oncology

  14. Feasibility of using global system for mobile communication (GSM)-based tracking for vaccinators to improve oral poliomyelitis vaccine campaign coverage in rural Pakistan.

    Science.gov (United States)

    Chandir, Subhash; Dharma, Vijay Kumar; Siddiqi, Danya Arif; Khan, Aamir Javed

    2017-09-05

    Despite multiple rounds of immunization campaigns, it has not been possible to achieve optimum immunization coverage for poliovirus in Pakistan. Supplementary activities to improve coverage of immunization, such as door-to-door campaigns, are constrained by several factors including inaccurate hand-drawn maps and a lack of means to objectively monitor field teams in real time, resulting in suboptimal vaccine coverage during campaigns. Global System for Mobile Communications (GSM)-based tracking of mobile subscriber identity modules (SIMs) of vaccinators provides a low-cost solution to identify missed areas and ensure effective immunization coverage. We conducted a pilot study to investigate the feasibility of using GSM technology to track vaccinators through observing indicators including acceptability, ease of implementation, costs and scalability as well as the likelihood of ownership by District Health Officials. The real-time location of the field teams was displayed on a GSM tracking web dashboard accessible by supervisors and managers for effective monitoring of workforce attendance including 'time in-time out', and discerning if all target areas - specifically remote and high-risk locations - had been reached. Direct access to this information by supervisors eliminated the possibility of data fudging and inaccurate reporting by workers regarding their mobility. The tracking cost per vaccinator was USD 0.26/month. Our study shows that GSM-based tracking is potentially a cost-efficient approach, results in better monitoring and accountability, is scalable and provides the potential for improved geographic coverage of health services. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. An Innovative Open Data-driven Approach for Improved Interpretation of Coverage Data at NASA JPL's PO.DAAC

    Science.gov (United States)

    McGibbney, L. J.; Armstrong, E. M.

    2016-12-01

    Figuratively speaking, Scientific Datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth Sciences, the most notable general examples include the HDF family, NetCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the Oceanographic, Atmospheric and Meteorological sciences. Such file formats contain Coverage Data, i.e. a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that this is still not a straightforward issue due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way which fits in with the linked data publication paradigm, hence lowering the barrier for interpretation by consumers via mobile devices, client applications, etc., as well as by data producers who can build next-generation, Web-friendly Web services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data embedded within SD landing pages as well as via semantic data repositories. We are currently evaluating how utilization of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers are not currently
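
    For orientation only, a heavily simplified CoverageJSON-style document for a tiny sea surface temperature grid might look like the sketch below; the field names follow a reading of the draft CoverageJSON specification and are abbreviated here (the coordinate referencing block required by the specification is omitted, and the values are invented), so the specification itself should be consulted for the authoritative structure.

        # Simplified, illustrative CoverageJSON-style document built as a
        # Python dict. Not guaranteed to be spec-complete; values are toy data.
        import json

        coverage = {
            "type": "Coverage",
            "domain": {
                "type": "Domain",
                "domainType": "Grid",
                "axes": {
                    "x": {"values": [-120.0, -119.0]},
                    "y": {"values": [30.0, 31.0]},
                    "t": {"values": ["2016-01-01T00:00:00Z"]},
                },
                # Coordinate referencing (CRS) information omitted for brevity.
            },
            "parameters": {
                "SST": {
                    "type": "Parameter",
                    "observedProperty": {"label": {"en": "Sea surface temperature"}},
                    "unit": {"symbol": "K"},
                }
            },
            "ranges": {
                "SST": {
                    "type": "NdArray",
                    "dataType": "float",
                    "axisNames": ["t", "y", "x"],
                    "shape": [1, 2, 2],
                    "values": [290.1, 290.4, 289.8, 290.0],
                }
            },
        }

        print(json.dumps(coverage, indent=2))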

  16. Immunization Coverage

    Science.gov (United States)

    WHO fact sheet on immunization coverage, with links to Global Health Observatory (GHO) immunization data and related vaccine and immunization news.

  17. Functional coverages

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; Jagers, H.R.A.; Van Dam, A.

    2011-01-01

    A new Application Programming Interface (API) is presented which simplifies working with geospatial coverages as well as many other data structures of a multi-dimensional nature. The main idea extends the Common Data Model (CDM) developed at the University Corporation for Atmospheric Research

  18. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support

    OpenAIRE

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Background Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mech...

  19. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    Science.gov (United States)

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically less than one, and accounting for this probability of detection is important for obtaining robust estimates of population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater than in Torres Strait, and the effect of correcting for depth-specific detection probability was much less. The methodology has application to visual surveys of coastal megafauna including surveys using Unmanned
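
    The depth-specific correction described above amounts to weighting sightings by the inverse of the probability that an animal at that depth is available for detection; the sketch below shows that generic Horvitz-Thompson style step with hypothetical availability values, not the study's estimates.

        # Generic availability correction for sighting counts: each sighting is
        # weighted by the inverse of its depth class's availability probability.
        # The availability values below are hypothetical, not the paper's.
        def corrected_abundance(sightings, availability_by_depth_class):
            """sightings: list of (count, depth_class); returns corrected total."""
            total = 0.0
            for count, depth_class in sightings:
                p_available = availability_by_depth_class[depth_class]
                total += count / p_available   # Horvitz-Thompson style weighting
            return total

        if __name__ == "__main__":
            availability = {"shallow": 0.45, "deep": 0.15}   # hypothetical values
            print(corrected_abundance([(12, "shallow"), (3, "deep")], availability))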

  20. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  1. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  2. Community-based interventions to improve HPV vaccination coverage among 13- to 15-year-old females: measures implemented by local governments in Japan.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Fujiwara

    Full Text Available The purpose of this study was to examine the effect of various community-based interventions in support of HPV vaccination implemented by cities and towns within Tochigi prefecture, Japan with a view to identifying useful indicators which might guide future interventions to improve HPV vaccination coverage in the prefecture. A postal questionnaire survey of all 27 local governments in Tochigi Prefecture was conducted in December 2010. All 27 responded, and 22 provided the exact numbers of the targeted and vaccinated populations of 13- to 15-year-old girls from April to December 2010. The local governments also answered questions on the type of interventions implemented including public subsidies, school-based programs, direct mail, free tickets and recalls. Local governments that conducted a school-based vaccination program reported 96.8% coverage for the 1st dose, 96.2% for the 2nd dose, and 91.2% for the 3rd dose. Those that provided subsidies without school-based programs reported a wide range of vaccination rates: 45.7%-95.0% for the 1st dose, 41.1%-93.7% for the 2nd dose and 3.1%-90.1% for the 3rd dose. Among this group, the combination of a free ticket, direct mail and recall was most effective, with 95.0% coverage for the 1st dose, 93.7% for the 2nd dose, and 90.1% for the 3rd dose. The governments that did not offer a subsidy had the lowest vaccination coverage, with 0.8%-1.4% for the 1st dose, 0.0%-0.8% for the 2nd dose, and 0.1%-0.1% for the 3rd dose. The results of this survey indicate that school-based vaccinations and public subsidies are the most effective method to improve HPV vaccination coverage; however, the combination of a free ticket, direct mail, and recalls with public subsidies are also important measures in increasing the vaccination rate. These data may afford important indicators for the successful implementation of future HPV vaccination programs.

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  6. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
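
    The dissertation's sharp asymptotics and estimators are not given in this record; for context only, a crude (naive) Monte Carlo estimate of a tail probability P(X1 + ... + Xn > x) for heavy-tailed summands can be sketched as below, with Pareto margins and parameters chosen purely for illustration, not taken from the work itself.

        # Crude Monte Carlo estimate of P(X1 + ... + Xn > x) for i.i.d. Pareto
        # summands. A naive baseline only; not a variance-reduced estimator.
        import random

        def pareto(alpha):
            """Draw one Pareto(alpha) variate with scale 1 via inverse transform."""
            return random.random() ** (-1.0 / alpha)

        def tail_probability(n_terms, threshold, alpha=1.5, n_samples=100_000):
            hits = sum(
                1
                for _ in range(n_samples)
                if sum(pareto(alpha) for _ in range(n_terms)) > threshold
            )
            return hits / n_samples

        if __name__ == "__main__":
            print(tail_probability(n_terms=5, threshold=50.0))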

  7. Use of mobile phones for improving vaccination coverage among children living in rural hard-to-reach areas and urban streets of Bangladesh.

    Science.gov (United States)

    Uddin, Md Jasim; Shamsuzzaman, Md; Horng, Lily; Labrique, Alain; Vasudevan, Lavanya; Zeller, Kelsey; Chowdhury, Mridul; Larson, Charles P; Bishai, David; Alam, Nurul

    2016-01-04

    In Bangladesh, full vaccination rates among children living in rural hard-to-reach areas and urban streets are low. We conducted a quasi-experimental pre-post study of a 12-month mobile phone intervention to improve vaccination among children aged 0-11 months in rural hard-to-reach and urban street dweller areas. Software named "mTika" was employed within the existing public health system to electronically register each child's birth and remind mothers about upcoming vaccination dates with text messages. Android smart phones with mTika were provided to all health assistants/vaccinators and supervisors in intervention areas, while mothers used plain cell phones already owned by themselves or their families. Pre- and post-intervention vaccination coverage was surveyed in intervention and control areas. Among children over 298 days old, full vaccination coverage actually decreased in control areas (rural baseline 65.9% to endline 55.2%; urban baseline 44.5% to endline 33.9%) while increasing in intervention areas (rural baseline 58.9% to endline 76.8%, difference +18.8%, 95% CI 5.7-31.9; urban baseline 40.7% to endline 57.1%, difference +16.5%, 95% CI 3.9-29.0). Difference-in-difference (DID) estimates were +29.5% for rural intervention versus control areas and +27.1% for urban areas for full vaccination in children over 298 days old, and logistic regression adjusting for maternal education, mobile phone ownership, and sex of child showed intervention effect odds ratios (OR) of 3.8 (95% CI 1.5-9.2) in rural areas and 3.0 (95% CI 1.4-6.4) in urban areas. Among all age groups, intervention effects on age-appropriate vaccination coverage were positive: DIDs +13.1-30.5% and ORs 2.5-4.6. A mobile phone intervention can improve vaccination coverage in rural hard-to-reach and urban street dweller communities in Bangladesh. This small-scale successful demonstration should serve as an example to other low-income countries with high mobile phone usage. Copyright © 2015
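
    The difference-in-difference estimates quoted above can be illustrated with a minimal calculation: the change in coverage in intervention areas minus the change in control areas. The function below computes an unadjusted DID from four coverage percentages; the numbers in the example are hypothetical placeholders rather than the study's survey values, and the paper's reported DIDs may additionally reflect adjustment.

        # Unadjusted difference-in-differences (DID) estimate in percentage points.
        # Example inputs are hypothetical, not the study's figures.
        def did(intervention_baseline, intervention_endline,
                control_baseline, control_endline):
            """Return (intervention change) - (control change)."""
            return ((intervention_endline - intervention_baseline)
                    - (control_endline - control_baseline))

        if __name__ == "__main__":
            print(did(58.0, 76.0, 66.0, 55.0))   # hypothetical coverage percentages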

  8. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  9. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  10. Improving the coverage of prevention of mother-to-child transmission of HIV services in Nigeria: should traditional birth attendants be engaged?

    Science.gov (United States)

    O Olakunde, Babayemi; Wakdok, Sabastine; Olaifa, Yewande; Agbo, Francis; Essen, Uduak; Ojo, Mathews; Oke, Maria; Ibi, Sarah

    2018-06-01

    Traditional birth attendants (TBAs) play an important role in the provision of care to pregnant women in rural parts of Nigeria, but they are barely engaged by the formal healthcare system in expanding the low coverage of prevention of mother-to-child transmission of HIV (PMTCT) services. Using a systematic approach, we engaged TBAs in Abia and Taraba States to scale-up PMTCT services under the National Agency for Control of AIDS Comprehensive AIDS Program with States. We conducted mapping of the TBAs, built their capacities, obtained their buy-in on mobilization of their clients and other pregnant women for HIV testing service outreaches, and established referral and linkage systems. A total of 720 TBAs were mapped (Abia 407; Taraba 313). Three hundred and ninety-nine TBAs who participated in the capacity-building meeting were linked to 115 primary healthcare centers (PHCs) in Abia State, while 245 TBAs were linked to 27 PHCs in Taraba State. From July 2016 to March 2017, the outreaches contributed 20% to the overall total number of pregnant women counseled, tested and received results, and 12% to the total number of HIV-infected women identified. There was a comparable yield of HIV-infected pregnant women among those tested in the TBA outreaches in comparison with the supported antenatal facilities (2% versus 3%, respectively). Engaging TBAs has the potential to improve the coverage of PMTCT services in Nigeria.

  11. Improved process for calculating the probability of being hit by crashing aircraft by the Balfanz-model

    International Nuclear Information System (INIS)

    Hennings, W.

    1988-01-01

    A model for calculating the probability of different buildings being hit by crashing military aircraft was introduced; it has already been used in conventional fields. In the context of converting the research reactor BER II, this model was also used in the nuclear field. The report introduces this model and shows its application to a vertical cylinder as an example. Compared to the previous model, an exact and also simpler solution of the model approach for determining the shade surface for different shapes of buildings is derived. The problems with the distribution of crashes given by the previous model are treated via the vertical angle, and an attempt to solve these problems is given. (orig./HP) [de

  12. Assuring Access to Affordable Coverage

    Data.gov (United States)

    U.S. Department of Health & Human Services — Under the Affordable Care Act, millions of uninsured Americans will gain access to affordable coverage through Affordable Insurance Exchanges and improvements in...

  13. Improved imputation accuracy of rare and low-frequency variants using population-specific high-coverage WGS-based imputation reference panel.

    Science.gov (United States)

    Mitt, Mario; Kals, Mart; Pärn, Kalle; Gabriel, Stacey B; Lander, Eric S; Palotie, Aarno; Ripatti, Samuli; Morris, Andrew P; Metspalu, Andres; Esko, Tõnu; Mägi, Reedik; Palta, Priit

    2017-06-01

    Genetic imputation is a cost-efficient way to improve the power and resolution of genome-wide association (GWA) studies. Current publicly accessible imputation reference panels accurately predict genotypes for common variants with minor allele frequency (MAF) ≥ 5% and low-frequency variants (0.5% ≤ MAF < 5%) across diverse populations, but the imputation of rare variation (MAF < 0.5%) is still rather limited. In the current study, we evaluate imputation accuracy achieved with reference panels from diverse populations against a population-specific, high-coverage (30×) whole-genome sequencing (WGS) based reference panel, comprising 2244 Estonian individuals (0.25% of adult Estonians). Although the Estonian-specific panel contains fewer haplotypes and variants, the imputation confidence and accuracy of imputed low-frequency and rare variants was significantly higher. The results indicate the utility of population-specific reference panels for human genetic studies.

  14. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  15. Improving immunisation coverage in rural India: clustered randomised controlled evaluation of immunisation campaigns with and without incentives.

    Science.gov (United States)

    Banerjee, Abhijit Vinayak; Duflo, Esther; Glennerster, Rachel; Kothari, Dhruva

    2010-05-17

    To assess the efficacy of modest non-financial incentives on immunisation rates in children aged 1-3 and to compare it with the effect of only improving the reliability of the supply of services. Clustered randomised controlled study. Rural Rajasthan, India. 1640 children aged 1-3 at end point. 134 villages were randomised to one of three groups: a once monthly reliable immunisation camp (intervention A; 379 children from 30 villages); a once monthly reliable immunisation camp with small incentives (raw lentils and metal plates for completed immunisation; intervention B; 382 children from 30 villages), or control (no intervention, 860 children in 74 villages). Surveys were undertaken in randomly selected households at baseline and about 18 months after the interventions started (end point). Proportion of children aged 1-3 at the end point who were partially or fully immunised. Among children aged 1-3 in the end point survey, rates of full immunisation were 39% (148/382, 95% confidence interval 30% to 47%) for intervention B villages (reliable immunisation with incentives), 18% (68/379, 11% to 23%) for intervention A villages (reliable immunisation without incentives), and 6% (50/860, 3% to 9%) for control villages. The relative risk of complete immunisation for intervention B versus control was 6.7 (4.5 to 8.8) and for intervention B versus intervention A was 2.2 (1.5 to 2.8). Children in areas neighbouring intervention B villages were also more likely to be fully immunised than those from areas neighbouring intervention A villages (1.9, 1.1 to 2.8). The average cost per immunisation was $56 (2202 rupees) in intervention A and $28 (1102 rupees, about £16 or €19) in intervention B. Improving reliability of services improves immunisation rates, but the effect remains modest. Small incentives have large positive impacts on the uptake of immunisation services in resource poor areas and are more cost effective than purely improving supply. ISRCTN87759937.

  16. Daily Isocenter Correction With Electromagnetic-Based Localization Improves Target Coverage and Rectal Sparing During Prostate Radiotherapy

    International Nuclear Information System (INIS)

    Rajendran, Ramji Ramaswamy; Plastaras, John P.; Mick, Rosemarie; McMichael Kohler, Diane; Kassaee, Alireza; Vapiwala, Neha

    2010-01-01

    Purpose: To evaluate dosimetric consequences of daily isocenter correction during prostate cancer radiation therapy using the Calypso 4D localization system. Methods and Materials: Data were analyzed from 28 patients with electromagnetic transponders implanted in their prostates for daily target localization and tracking. Treatment planning isocenters were recorded based on the values of the vertical, longitudinal, and lateral axes. Isocenter location obtained via alignment with skin tattoos was compared with that obtained via the electromagnetic localization system. Daily isocenter shifts, based on the isocenter location differences between the two alignment methods in each spatial axis, were calculated for each patient over their entire course. The mean isocenter shifts were used to determine dosimetric consequences of treatment based on skin tattoo alignments alone. Results: The mean ± SD of the percentages of treatment days with shifts beyond ± 0.5 cm for vertical, longitudinal and lateral shifts were 62% ± 28%, 35% ± 26%, and 38% ± 21%, respectively. If daily electromagnetic localization was not used, the excess in prescribed dose delivered to 70% of the rectum was 10 Gy and the deficit in prescribed dose delivered to 95% of the planning target volume was 10 Gy. The mean isocenter shift was not associated with the volumes of the prostate, rectum, or bladder, or with patient body mass index. Conclusions: Daily isocenter localization can reduce the treatment dose to the rectum. Correcting for this variability could lead to improved dose delivery, reduced side effects, and potentially improved treatment outcomes.

  17. Medicare Coverage Database

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicare Coverage Database (MCD) contains all National Coverage Determinations (NCDs) and Local Coverage Determinations (LCDs), local articles, and proposed NCD...

  18. Qualification of the calculational methods of the fluence in the pressurised water reactors. Improvement of the cross sections treatment by the probability table method

    International Nuclear Information System (INIS)

    Zheng, S.H.

    1994-01-01

    It is indispensable to know the fluence on the nuclear reactor pressure vessel. The cross sections and their treatment play an important role in this problem. In this study, two "benchmarks" have been interpreted with the Monte Carlo transport program TRIPOLI to qualify the calculational method and the cross sections used in the calculations. For the treatment of the cross sections, the multigroup method is usually used, but it has some problems, such as the difficulty of choosing the weighting function and the need for a great number of energy groups to represent the fluctuations of the cross sections well. In this thesis, we propose a new method called the "Probability Table Method" to treat the neutron cross sections. For the qualification, a program simulating neutron transport by the Monte Carlo method in one dimension has been written; the comparison of the multigroup results and the probability table results shows the advantages of this new method. The probability table has also been introduced into the TRIPOLI program; the calculational results for the iron deep-penetration benchmark have been improved in comparison with the experimental results. So it is of interest to use this new method in shielding and neutronics calculations. (author). 42 refs., 109 figs., 36 tabs
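
    The record does not reproduce how a probability table is built or sampled; as a generic illustration of the idea (representing the fluctuating cross section within one energy band by a small set of probability-weighted values instead of a single multigroup average), a Monte Carlo sampling step might look like the sketch below, with hypothetical table values.

        # Generic sampling of a total cross section from a probability table for
        # one energy band: (probability, cross_section) pairs summing to 1.
        # Table values are hypothetical, not taken from the thesis.
        import random

        def sample_cross_section(table):
            """table: list of (probability, sigma) pairs; returns one sigma."""
            u = random.random()
            cumulative = 0.0
            for probability, sigma in table:
                cumulative += probability
                if u <= cumulative:
                    return sigma
            return table[-1][1]   # guard against floating-point round-off

        if __name__ == "__main__":
            band_table = [(0.2, 1.5), (0.5, 4.0), (0.3, 20.0)]   # barns, hypothetical
            print(sample_cross_section(band_table))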

  19. Coverage-based constraints for IMRT optimization

    Science.gov (United States)

    Mescher, H.; Ulrich, S.; Bangert, M.

    2017-09-01

    Radiation therapy treatment planning requires an incorporation of uncertainties in order to guarantee an adequate irradiation of the tumor volumes. In current clinical practice, uncertainties are accounted for implicitly with an expansion of the target volume according to generic margin recipes. Alternatively, it is possible to account for uncertainties by explicit minimization of objectives that describe worst-case treatment scenarios, the expectation value of the treatment or the coverage probability of the target volumes during treatment planning. In this note we show that approaches relying on objectives to induce a specific coverage of the clinical target volumes are inevitably sensitive to variation of the relative weighting of the objectives. To address this issue, we introduce coverage-based constraints for intensity-modulated radiation therapy (IMRT) treatment planning. Our implementation follows the concept of coverage-optimized planning that considers explicit error scenarios to calculate and optimize patient-specific probabilities q(\hat{d}, \hat{v}) of covering a specific target volume fraction \hat{v} with a certain dose \hat{d} . Using a constraint-based reformulation of coverage-based objectives we eliminate the trade-off between coverage and competing objectives during treatment planning. In-depth convergence tests including 324 treatment plan optimizations demonstrate the reliability of coverage-based constraints for varying levels of probability, dose and volume. General clinical applicability of coverage-based constraints is demonstrated for two cases. A sensitivity analysis regarding penalty variations within this planning study, based on IMRT treatment planning using (1) coverage-based constraints, (2) coverage-based objectives, (3) probabilistic optimization, (4) robust optimization and (5) conventional margins, illustrates the potential benefit of coverage-based constraints that do not require tedious adjustment of target volume objectives.
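
    As a minimal illustration of the coverage probability q(\hat{d}, \hat{v}) used in coverage-optimized planning, the sketch below estimates, over a set of explicit error scenarios, the fraction of scenarios in which at least a fraction \hat{v} of the target voxels receives at least dose \hat{d}; the scenario dose arrays and numbers are toy values, not output from any treatment planning system.

        # Estimate the coverage probability q(d_hat, v_hat): the fraction of error
        # scenarios in which at least a fraction v_hat of target voxels receives
        # at least dose d_hat. Scenario doses below are toy values.
        import numpy as np

        def coverage_probability(scenario_doses, d_hat, v_hat):
            """scenario_doses: (n_scenarios, n_voxels) array of target doses in Gy."""
            covered_fraction = (scenario_doses >= d_hat).mean(axis=1)   # per scenario
            return float((covered_fraction >= v_hat).mean())

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            doses = rng.normal(loc=60.0, scale=2.0, size=(100, 500))    # toy scenarios
            print(coverage_probability(doses, d_hat=57.0, v_hat=0.95))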

  20. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  1. Beyond coverage: improving the quality of antenatal care delivery through integrated mentorship and quality improvement at health centers in rural Rwanda.

    Science.gov (United States)

    Manzi, Anatole; Nyirazinyoye, Laetitia; Ntaganira, Joseph; Magge, Hema; Bigirimana, Evariste; Mukanzabikeshimana, Leoncie; Hirschhorn, Lisa R; Hedt-Gauthier, Bethany

    2018-02-23

    Inadequate antenatal care (ANC) can lead to missed diagnosis of danger signs or delayed referral to emergency obstetrical care, contributing to maternal mortality. In developing countries, ANC quality is often limited by skill and knowledge gaps of the health workforce. In 2011, the Mentorship, Enhanced Supervision for Healthcare and Quality Improvement (MESH-QI) program was implemented to strengthen providers' ANC performance at 21 rural health centers in Rwanda. We evaluated the effect of MESH-QI on the completeness of danger sign assessments. Completeness of danger sign assessments was measured by expert nurse mentors using standardized observation checklists. Checklists completed from October 2010 to May 2011 (n = 330) were used as baseline measurement and checklists completed between February and November 2012 (12-15 months after the start of MESH-QI implementation) were used for follow-up. We used a mixed-effects linear regression model to assess the effect of the MESH-QI intervention on the danger sign assessment score, controlling for potential confounders and the clustering of effect at the health center level. Complete assessment of all danger signs improved from 2.1% at baseline to 84.2% after MESH-QI, as did other ANC screening items. After controlling for potential confounders, the improvement in danger sign assessment score was significant. However, the effect of the MESH-QI was different by intervention district and type of observed ANC visit. In Southern Kayonza District, the increase in the danger sign assessment score was 6.28 (95% CI: 5.59, 6.98) for non-first ANC visits and 5.39 (95% CI: 4.62, 6.15) for first ANC visits. In Kirehe District, the increase in danger sign assessment score was 4.20 (95% CI: 3.59, 4.80) for non-first ANC visits and 3.30 (95% CI: 2.80, 3.81) for first ANC visits. Assessment of critical danger signs improved under MESH-QI, even when controlling for nurse-mentees' education level and previous training in focused ANC. MESH

  2. HMM-ModE – Improved classification using profile hidden Markov models by optimising the discrimination threshold and modifying emission probabilities with negative training sequences

    Directory of Open Access Journals (Sweden)

    Nandi Soumyadeep

    2007-03-01

    Background: Profile Hidden Markov Models (HMMs) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used to identify remote homologues with considerable success. These conservation patterns arise from fold-specific signals, shared across multiple families, and function-specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family-specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross-validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results: The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% among the group, though each sub-group has a different substrate specificity. The optimisation of the discrimination threshold, using negative sequences scored against the model, improves specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is provided in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-protein coupled receptors
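
    The abstract does not spell out the exact scoring criterion used to pick the cutoff; as a hedged illustration of optimising a discrimination threshold against negative sequences, the sketch below chooses the bit-score cutoff that maximises the Matthews correlation coefficient on synthetic positive and negative score distributions:

```python
import numpy as np

def optimise_threshold(pos_scores, neg_scores):
    """Pick the score cutoff that best separates in-family (positive) from
    out-of-family (negative) HMM scores, here by maximising the Matthews
    correlation coefficient (MCC)."""
    candidates = np.unique(np.concatenate([pos_scores, neg_scores]))
    best_t, best_mcc = None, -1.0
    for t in candidates:
        tp = np.sum(pos_scores >= t)
        fn = np.sum(pos_scores < t)
        fp = np.sum(neg_scores >= t)
        tn = np.sum(neg_scores < t)
        denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
        mcc = (tp * tn - fp * fn) / denom if denom > 0 else 0.0
        if mcc > best_mcc:
            best_t, best_mcc = t, mcc
    return best_t, best_mcc

# Invented bit scores: true family members tend to score higher than members
# of the other sub-families used as negative training sequences.
rng = np.random.default_rng(1)
pos = rng.normal(120, 15, 200)
neg = rng.normal(80, 20, 1000)
threshold, mcc = optimise_threshold(pos, neg)
print(f"chosen cutoff = {threshold:.1f} bits, MCC = {mcc:.2f}")
```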

  3. Women's Health Insurance Coverage

    Science.gov (United States)

    Women's Health Policy: Women's Health Insurance Coverage. Published: Oct 31, 2017. ... that many women continue to face. Sources of Health Insurance Coverage. Employer-Sponsored Insurance: approximately 57.9 million ...

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. Physical Activity Improves Verbal and Spatial Memory in Older Adults with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Lindsay S. Nagamatsu

    2013-01-01

    We report secondary findings from a randomized controlled trial on the effects of exercise on memory in older adults with probable MCI. We randomized 86 women aged 70–80 years with subjective memory complaints into one of three groups: resistance training, aerobic training, or balance and tone (control). All participants exercised twice per week for six months. We measured verbal memory and learning using the Rey Auditory Verbal Learning Test (RAVLT) and spatial memory using a computerized test, before and after trial completion. We found that the aerobic training group remembered significantly more items in the loss-after-interference condition of the RAVLT compared with the control group after six months of training. In addition, both experimental groups showed improved spatial memory performance in the most difficult condition, where they were required to memorize the spatial location of three items, compared with the control group. Lastly, we found a significant correlation between spatial memory performance and overall physical capacity after intervention in the aerobic training group. Taken together, our results provide support for the prevailing notion that exercise can positively impact cognitive functioning and may represent an effective strategy to improve memory in those who have begun to experience cognitive decline.

  6. WE-EF-207-08: Improve Cone Beam CT Using a Synchronized Moving Grid, An Inter-Projection Sensor Fusion and a Probability Total Variation Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H; Kong, V; Jin, J [Georgia Regents University Cancer Center, Augusta, GA (Georgia); Ren, L; Zhang, Y; Giles, W [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To present a cone beam computed tomography (CBCT) system, which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed, and was used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its 2 neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image using projections after IPSF processing for pTV. A probability map was generated depending on the confidence of estimation in IPSF for the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image for a Catphan, and was compared to the conventional CBCT image without using SMOG, images without using IPSF (SMOG + FDK and SMOG + mask-TV), and an image without using pTV (SMOG + IPSF + FDK). Results: The conventional CBCT without using SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + TV) artifacts, possibly due to using projections with missing data. The 2 approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also reduces the imaging dose by half. Conclusion: The proposed technique is promising in improving CBCT image quality while reducing imaging dose.
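
    The precise pTV formulation is not given in the abstract; the Python sketch below only illustrates, on a toy 2D denoising problem with invented parameters, how a per-pixel probability (confidence) map could weight a total-variation penalty so that low-confidence regions (such as areas blocked by the grid) are smoothed more strongly:

```python
import numpy as np

def weighted_tv_denoise(img, prob_map, lam=0.1, n_iter=200, step=0.1):
    """Gradient-descent sketch of a confidence-weighted total-variation model:
    the data term is weighted by prob_map and the TV term by (1 - prob_map)."""
    x = img.copy()
    eps = 1e-6
    for _ in range(n_iter):
        dx = np.diff(x, axis=1, append=x[:, -1:])
        dy = np.diff(x, axis=0, append=x[-1:, :])
        nx = dx / np.sqrt(dx**2 + eps)
        ny = dy / np.sqrt(dy**2 + eps)
        div = (nx - np.roll(nx, 1, axis=1)) + (ny - np.roll(ny, 1, axis=0))
        grad = prob_map * (x - img) - lam * (1.0 - prob_map) * div
        x -= step * grad
    return x

# Toy phantom with a low-confidence vertical stripe, purely illustrative.
rng = np.random.default_rng(2)
phantom = np.zeros((64, 64)); phantom[16:48, 16:48] = 1.0
noisy = phantom + rng.normal(0.0, 0.2, phantom.shape)
confidence = np.ones_like(phantom); confidence[:, 30:34] = 0.2
denoised = weighted_tv_denoise(noisy, confidence)
print(float(np.abs(denoised - phantom).mean()))
```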

  7. WE-EF-207-08: Improve Cone Beam CT Using a Synchronized Moving Grid, An Inter-Projection Sensor Fusion and a Probability Total Variation Reconstruction

    International Nuclear Information System (INIS)

    Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W

    2015-01-01

    Purpose: To present a cone beam computed tomography (CBCT) system, which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed, and was used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its 2 neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image using projections after IPSF processing for pTV. A probability map was generated depending on the confidence of estimation in IPSF for the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image for a Catphan, and was compared to the conventional CBCT image without using SMOG, images without using IPSF (SMOG + FDK and SMOG + mask-TV), and an image without using pTV (SMOG + IPSF + FDK). Results: The conventional CBCT without using SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + TV) artifacts, possibly due to using projections with missing data. The 2 approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also reduces the imaging dose by half. Conclusion: The proposed technique is promising in improving CBCT image quality while reducing imaging dose.

  8. Community-based distribution of sulfadoxine-pyrimethamine for intermittent preventive treatment of malaria during pregnancy improved coverage but reduced antenatal attendance in southern Malawi

    NARCIS (Netherlands)

    Msyamboza, K. P.; Savage, E. J.; Kazembe, P. N.; Gies, S.; Kalanda, G.; D'Alessandro, U.; Brabin, B. J.

    2009-01-01

    To evaluate the impact of a 2-year programme of community-based delivery of sulfadoxine-pyrimethamine (SP) on the coverage of intermittent preventive treatment during pregnancy, antenatal clinic attendance and pregnancy outcome. Fourteen intervention and 12 control villages in the catchment areas of

  9. SU-F-19A-12: Split-Ring Applicator with Interstitial Needle for Improved Volumetric Coverage in HDR Brachytherapy for Cervical Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Sherertz, T; Ellis, R; Colussi, V; Mislmani, M; Traughber, B; Herrmann, K; Podder, T [University Hospitals Case Medical Center, Cleveland, OH (United States)

    2014-06-15

    Purpose: To evaluate volumetric coverage of a Mick Radionuclear titanium Split-Ring applicator (SRA) with/without interstitial needle compared to an intracavitary Vienna applicator (VA), interstitial-intracavitary VA, and intracavitary ring and tandem applicator (RTA). Methods: A 57-year-old female with FIGO stage IIB cervical carcinoma was treated following chemoradiotherapy (45 Gy pelvic and 5.4 Gy parametrial boost) with high-dose-rate (HDR) brachytherapy to 30 Gy in 5 fractions using an SRA. A single interstitial needle was placed using the Ellis Interstitial Cap for the final three fractions to increase coverage of left-sided gross residual disease identified on 3T-MRI. High-risk (HR) clinical target volume (CTV) and intermediate-risk (IR) CTV were defined using axial T2-weighted 2D and 3D MRI sequences (Philips PET/MRI unit). Organs at risk (OARs) were delineated on CT. The Oncentra planning system was used for treatment optimization satisfying GEC-ESTRO guidelines for target coverage and OAR constraints. Retrospectively, treatment plans (an additional 20 plans) were simulated using the intracavitary SRA (without needle), intracavitary VA (without needle), interstitial-intracavitary VA, and intracavitary RTA with this same patient case. Plans were optimized for each fraction to maintain coverage of the HR-CTV. Results: Interstitial-intracavitary SRA achieved the following combined coverage for external radiation and brachytherapy (EQD2): D90 HR-CTV = 94.6 Gy; Bladder-2cc = 88.9 Gy; Rectum-2cc = 65.1 Gy; Sigmoid-2cc = 48.9 Gy; Left vaginal wall (VW) = 103 Gy, Right VW = 99.2 Gy. Interstitial-intracavitary VA was able to achieve an identical D90 HR-CTV = 94.6 Gy, yet Bladder-2cc = 91.9 Gy (exceeding the GEC-ESTRO recommendation of 2cc < 90 Gy), Left VW = 120.8 Gy and Right VW = 115.5 Gy. Neither the SRA nor the VA without an interstitial needle could cover the HR-CTV adequately without exceeding the dose to Bladder-2cc. Conventional RTA was unable to achieve target coverage for the HR-CTV > 80 Gy without severely

  10. Temporal variations of the fractal properties of seismicity in the western part of the north Anatolian fault zone: possible artifacts due to improvements in station coverage

    Directory of Open Access Journals (Sweden)

    A. O. Öncel

    1995-01-01

    Seismically active fault zones are complex natural systems exhibiting scale-invariant or fractal correlation between earthquakes in space and time, and a power-law scaling of fault length or earthquake source dimension consistent with the exponent b of the Gutenberg-Richter frequency-magnitude relation. The fractal dimension of seismicity is a measure of the degree of both the heterogeneity of the process (whether fixed or self-generated) and the clustering of seismic activity. Temporal variations of the b-value and the two-point fractal (correlation) dimension Dc have been related to the preparation process for natural earthquakes and rock fracture in the laboratory. These statistical scaling properties of seismicity may therefore have the potential, at least, to be sensitive short-term predictors of major earthquakes. The North Anatolian Fault Zone (NAFZ) is a seismically active dextral strike-slip fault zone which forms the northern boundary of the westward-moving Anatolian plate. It is splayed into three branches at about 31°E and continues westward toward the northern Aegean Sea. In this study, we investigate the temporal variation of Dc and the Gutenberg-Richter b-value for seismicity in the western part of the NAFZ (including the northern Aegean Sea) for earthquakes of Ms > 4.5 occurring in the period between 1900 and 1992. b ranges from 0.6 to 1.6 and Dc from 0.6 to 1.4. The b-value is found to be weakly negatively correlated with Dc (r = -0.56). However, the log of the event rate N is positively correlated with b, with a similar degree of statistical significance (r = 0.42), and negatively correlated with Dc (r = -0.48). Since N increases dramatically with improved station coverage since 1970, the observed negative correlation between b and Dc is therefore more likely to be due to this effect than to any underlying physical process in this case. We present this as an example of how man-made artefacts of recording can have similar statistical effects to

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  13. Improving the Phosphoproteome Coverage for Limited Sample Amounts Using TiO2-SIMAC-HILIC (TiSH) Phosphopeptide Enrichment and Fractionation

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin R

    2016-01-01

    spectrometry (LC-MS/MS) analysis. Due to the sample loss resulting from fractionation, this procedure is mainly performed when large quantities of sample are available. To make large-scale phosphoproteomics applicable to smaller amounts of protein we have recently combined highly specific TiO2-based...... protocol we describe the procedure step by step to allow for comprehensive coverage of the phosphoproteome utilizing only a few hundred micrograms of protein....

  14. Evaluating the McDonald’s business model for HIV prevention among truckers to improve program coverage and service utilization in India, 2004–2010

    Science.gov (United States)

    Rao, Vasudha Tirumalasetti; Mahapatra, Bidhubhusan; Juneja, Sachin; Singh, Indra R

    2013-01-01

    Background: This study describes the experiences and results of a large-scale human immunodeficiency virus (HIV) prevention intervention for long-distance truck drivers operating on the national highways of India. Methods: The intervention for long-distance truckers started in 2004 across 34 trans-shipment locations. However, due to poor coverage and utilization of services by truckers in the initial 18-month period, the intervention was redesigned to focus on only 17 trans-shipment locations. The redesigned intervention model was based on the McDonald’s business franchise model, where the focus is on optimal placement of services, supported with branding and standardization of services offered, and a surround sound communication approach. Program output indicators were assessed using program monitoring data over 7 years (2004–2010) and two rounds of cross-sectional behavioral surveys conducted in January 2008 (n = 1402) and July 2009 (n = 1407). Results: The number of truckers contacted per month per site increased from 374 in 2004 to 4327 in 2010. Analysis of survey data showed a seven-fold increase in clinic visits in the past 12 months from 2008 to 2009 (21% versus 63%, P < 0.001). A significant increase was also observed in the percentage of truckers who watched street plays (10% to 56%, P < 0.001) and participated in health exhibitions (6% to 35%, P < 0.001). Furthermore, an increase from round 1 to round 2 was observed in the percentage who received condoms (13% to 22%, P < 0.001) and attended one-one counseling (15% to 21%, P < 0.01). Treatment-seeking from program clinics for symptoms related to sexually transmitted infections increased six-fold during this period (16% versus 50%, P < 0.001). Conclusion: Adoption of a business model for HIV prevention helped to increase program coverage and service utilization among long-distance truckers. Implementing HIV prevention programs in a highly mobile population such as truckers, in a limited number of high-impact locations, supported by branding of services, could help in saturating coverage and optimum utilization of available resources. PMID:23439724

  15. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  16. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    Science.gov (United States)

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs), which has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation to the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the least number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
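
    A minimal sketch of the joint detection probability used by this kind of probabilistic sensing model; the exponential-decay sensing function and its parameter are illustrative assumptions, not the paper's exact model:

```python
import math

def detection_probability(distance, lam=0.5):
    """Hypothetical probabilistic sensing model: the detection probability
    decays exponentially with sensor-to-target distance."""
    return math.exp(-lam * distance)

def joint_detection_probability(distances):
    """Probability that at least one of several independent sensors detects
    the target: one minus the product of the individual miss probabilities."""
    miss = 1.0
    for d in distances:
        miss *= 1.0 - detection_probability(d)
    return 1.0 - miss

# A target observed by three sensors at different distances; epsilon-detection
# coverage would require this joint probability to be at least 1 - epsilon.
print(joint_detection_probability([1.0, 2.5, 3.0]))
```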

  17. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  18. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  19. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  20. Evaluating the McDonald's business model for HIV prevention among truckers to improve program coverage and service utilization in India, 2004–2010

    Directory of Open Access Journals (Sweden)

    Tirumalasetti Rao V

    2013-02-01

    Vasudha Tirumalasetti Rao (1), Bidhubhusan Mahapatra (2), Sachin Juneja (1), Indra R Singh (1); (1) Transport Corporation of India Foundation, Gurgaon, Haryana, India; (2) Population Council, New Delhi, India. Background: This study describes the experiences and results of a large-scale human immunodeficiency virus (HIV) prevention intervention for long-distance truck drivers operating on the national highways of India. Methods: The intervention for long-distance truckers started in 2004 across 34 trans-shipment locations. However, due to poor coverage and utilization of services by truckers in the initial 18-month period, the intervention was redesigned to focus on only 17 trans-shipment locations. The redesigned intervention model was based on the McDonald's business franchise model, where the focus is on optimal placement of services, supported with branding and standardization of services offered, and a surround sound communication approach. Program output indicators were assessed using program monitoring data over 7 years (2004–2010) and two rounds of cross-sectional behavioral surveys conducted in January 2008 (n = 1402) and July 2009 (n = 1407). Results: The number of truckers contacted per month per site increased from 374 in 2004 to 4327 in 2010. Analysis of survey data showed a seven-fold increase in clinic visits in the past 12 months from 2008 to 2009 (21% versus 63%, P < 0.001). A significant increase was also observed in the percentage of truckers who watched street plays (10% to 56%, P < 0.001) and participated in health exhibitions (6% to 35%, P < 0.001). Furthermore, an increase from round 1 to round 2 was observed in the percentage who received condoms (13% to 22%, P < 0.001) and attended one-one counseling (15% to 21%, P < 0.01). Treatment-seeking from program clinics for symptoms related to sexually transmitted infections increased six-fold during this period (16% versus 50%, P < 0.001). Conclusion: Adoption of a business model for HIV prevention helped to

  1. Insurance premiums and insurance coverage of near-poor children.

    Science.gov (United States)

    Hadley, Jack; Reschovsky, James D; Cunningham, Peter; Kenney, Genevieve; Dubay, Lisa

    States increasingly are using premiums for near-poor children in their public insurance programs (Medicaid/SCHIP) to limit private insurance crowd-out and constrain program costs. Using national data from four rounds of the Community Tracking Study Household Surveys spanning the seven years from 1996 to 2003, this study estimates a multinomial logistic regression model examining how public and private insurance premiums affect insurance coverage outcomes (Medicaid/SCHIP coverage, private coverage, and no coverage). Higher public premiums are significantly associated with a lower probability of public coverage and higher probabilities of private coverage and uninsurance; higher private premiums are significantly related to a lower probability of private coverage and higher probabilities of public coverage and uninsurance. The results imply that uninsurance rates will rise if both public and private premiums increase, and suggest that states that impose or increase public insurance premiums for near-poor children will succeed in discouraging crowd-out of private insurance, but at the expense of higher rates of uninsurance. Sustained increases in private insurance premiums will continue to create enrollment pressures on state insurance programs for children.
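
    As a hedged illustration of the kind of model described (not the authors' data or exact specification), the sketch below fits a multinomial logit of coverage outcome on public and private premiums using synthetic data and statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data: coverage outcome (0 = uninsured, 1 = public,
# 2 = private) as a function of monthly public and private premiums.
rng = np.random.default_rng(3)
n = 2000
public_premium = rng.uniform(0, 50, n)
private_premium = rng.uniform(50, 400, n)
u_public = 1.0 - 0.03 * public_premium       # invented utilities used only
u_private = 0.5 - 0.005 * private_premium    # to generate plausible outcomes
utilities = np.column_stack([np.zeros(n), u_public, u_private])
probs = np.exp(utilities) / np.exp(utilities).sum(axis=1, keepdims=True)
outcome = np.array([rng.choice(3, p=row) for row in probs])

X = sm.add_constant(pd.DataFrame({"public_premium": public_premium,
                                  "private_premium": private_premium}))
model = sm.MNLogit(outcome, X).fit(disp=False)
print(model.summary())
```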

  2. Monitoring intervention coverage in the context of universal health coverage.

    Directory of Open Access Journals (Sweden)

    Ties Boerma

    2014-09-01

    Monitoring universal health coverage (UHC) focuses on information on health intervention coverage and financial protection. This paper addresses monitoring intervention coverage, related to the full spectrum of UHC, including health promotion and disease prevention, treatment, rehabilitation, and palliation. A comprehensive core set of indicators most relevant to the country situation should be monitored on a regular basis as part of health progress and systems performance assessment for all countries. UHC monitoring should be embedded in a broad results framework for the country health system, but focus on indicators related to the coverage of interventions that most directly reflect the results of UHC investments and strategies in each country. A set of tracer coverage indicators can be selected, divided into two groups, promotion/prevention and treatment/care, as illustrated in this paper. Disaggregation of the indicators by the main equity stratifiers is critical to monitor progress in all population groups. Targets need to be set in accordance with baselines, historical rate of progress, and measurement considerations. Critical measurement gaps also exist, especially for treatment indicators, covering issues such as mental health, injuries, chronic conditions, surgical interventions, rehabilitation, and palliation. Consequently, further research and proxy indicators need to be used in the interim. Ideally, indicators should include a quality of intervention dimension. For some interventions, use of a single indicator is feasible, such as management of hypertension; but in many areas additional indicators are needed to capture quality of service provision. The monitoring of UHC has significant implications for health information systems. Major data gaps will need to be filled. At a minimum, countries will need to administer regular household health surveys with biological and clinical data collection. Countries will also need to improve the

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  5. Evaluating the McDonald's business model for HIV prevention among truckers to improve program coverage and service utilization in India, 2004-2010.

    Science.gov (United States)

    Rao, Vasudha Tirumalasetti; Mahapatra, Bidhubhusan; Juneja, Sachin; Singh, Indra R

    2013-01-01

    This study describes the experiences and results of a large-scale human immunodeficiency virus (HIV) prevention intervention for long-distance truck drivers operating on the national highways of India. The intervention for long-distance truckers started in 2004 across 34 trans-shipment locations. However, due to poor coverage and utilization of services by truckers in the initial 18-month period, the intervention was redesigned to focus on only 17 trans-shipment locations. The redesigned intervention model was based on the McDonald's business franchise model where the focus is on optimal placement of services, supported with branding and standardization of services offered, and a surround sound communication approach. Program output indicators were assessed using program monitoring data over 7 years (2004-2010) and two rounds of cross-sectional behavioral surveys conducted in January 2008 (n = 1402) and July 2009 (n = 1407). The number of truckers contacted per month per site increased from 374 in 2004 to 4327 in 2010. Analysis of survey data showed a seven-fold increase in clinic visits in the past 12 months from 2008 to 2009 (21% versus 63%, P < 0.001). A significant increase was also observed in the percentage of truckers who watched street plays (10% to 56%, P < 0.001), and participated in health exhibitions (6% to 35%, P < 0.001). Furthermore, an increase from round 1 to round 2 was observed in the percentage who received condoms (13% to 22%, P < 0.001), and attended one-one counseling (15% to 21%, P < 0.01). Treatment-seeking from program clinics for symptoms related to sexually transmitted infections increased six-fold during this period (16% versus 50%, P < 0.001). Adoption of a business model for HIV prevention helped to increase program coverage and service utilization among long-distance truckers. Implementing HIV prevention programs in a highly mobile population such as truckers, in a limited number of high-impact locations, supported by branding of

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it becomes more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks and OPC techniques have been developed. In the case of model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only the real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also overall wafer process time. In the general case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors due to borderless design, so it is very difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows varying CD bias, which depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and reduced the review time needed to find real errors. In this paper, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model
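
    As a rough illustration of a spacing-dependent biasing rule of the kind discussed (the spacing ranges and bias values below are invented, not the paper's rules):

```python
# Hypothetical spacing-dependent bias rule: metal edges with a small spacing
# to their neighbor etch smaller, so they receive a negative final-CD bias.
BIAS_RULES_NM = [
    (0,   60,  -4.0),  # spacing in [0, 60) nm: shrink the edge by 4 nm
    (60,  120, -2.0),  # intermediate spacing: smaller correction
    (120, 1e9,  0.0),  # effectively isolated lines: no bias
]

def edge_bias(spacing_nm):
    """Return the bias (nm) to apply to a metal edge given its spacing."""
    for low, high, bias in BIAS_RULES_NM:
        if low <= spacing_nm < high:
            return bias
    return 0.0

print(edge_bias(45), edge_bias(90), edge_bias(200))
```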

  8. Cooperative Cloud Service Aware Mobile Internet Coverage Connectivity Guarantee Protocol Based on Sensor Opportunistic Coverage Mechanism

    Directory of Open Access Journals (Sweden)

    Qin Qin

    2015-01-01

    In order to improve the Internet coverage ratio and provide a connectivity guarantee, we propose a coverage connectivity guarantee protocol for the mobile Internet based on a sensor opportunistic coverage mechanism and cooperative cloud services. In this scheme, based on the opportunistic covering rules, a network coverage algorithm with high reliability and real-time security is achieved by using the opportunistic contacts of the sensor nodes and the mobile Internet nodes. A cloud service business support platform is then created on top of the Internet application service management capabilities and the wireless sensor network communication service capabilities, which constitutes the architecture of the cloud support layer. A cooperative cloud service aware model is proposed. Finally, we present the mobile Internet coverage connectivity guarantee protocol. Experimental results demonstrate that the proposed algorithm performs well in terms of Internet security, stability and coverage connectivity.

  9. Medicaid Coverage Expansions and Cigarette Smoking Cessation Among Low-income Adults.

    Science.gov (United States)

    Koma, Jonathan W; Donohue, Julie M; Barry, Colleen L; Huskamp, Haiden A; Jarlenski, Marian

    2017-12-01

    Expanding Medicaid coverage to low-income adults may have increased smoking cessation through improved access to evidence-based treatments. Our study sought to determine if states' decisions to expand Medicaid increased recent smoking cessation. Using pooled cross-sectional data from the Behavioral Risk Factor Surveillance Survey for the years 2011-2015, we examined the association between state Medicaid coverage and the probability of recent smoking cessation among low-income adults without dependent children who were current or former smokers (n=36,083). We used difference-in-differences estimation to examine the effects of Medicaid coverage on smoking cessation, comparing low-income adult smokers in states with Medicaid coverage to comparable adults in states without Medicaid coverage, and those ages 18-64 years to those ages 65 years and above. Analyses were conducted for the full sample and stratified by sex. Residence in a state with Medicaid coverage among low-income adult smokers ages 18-64 years was associated with an increase in recent smoking cessation of 2.1 percentage points (95% confidence interval, 0.25-3.9). In the comparison group of individuals ages 65 years and above, residence in a state with Medicaid coverage expansion was not associated with a change in recent smoking cessation (-0.1 percentage point, 95% confidence interval, -2.1 to 1.8). Similar increases in smoking cessation among those ages 18-64 years were estimated for females and males (1.9 and 2.2 percentage points, respectively). Findings are consistent with the hypothesis that Medicaid coverage expansions may have increased smoking cessation among low-income adults without dependent children via greater access to preventive health care services, including evidence-based smoking cessation services.
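
    The study uses a triple-difference design; as a simplified, hedged illustration of the underlying difference-in-differences logic (synthetic data, invented effect sizes, and a linear probability model rather than the authors' specification), consider:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in: `expansion` marks states with Medicaid coverage for
# low-income adults, `post` marks years after expansion, and `quit` indicates
# recent smoking cessation.  All values are invented for illustration.
rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "expansion": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
})
baseline = 0.06 + 0.01 * df["expansion"] + 0.005 * df["post"]
effect = 0.02 * df["expansion"] * df["post"]     # the effect of interest
df["quit"] = rng.binomial(1, baseline + effect)

# Linear probability model; the coefficient on expansion:post is the
# difference-in-differences estimate of the effect on cessation.
model = smf.ols("quit ~ expansion * post", data=df).fit(cov_type="HC1")
print(model.params["expansion:post"])
```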

  10. Volume arc therapy of gynaecological tumours: target volume coverage improvement without dose increase for critical organs; Arctherapie volumique des tumeurs gynecologiques: amelioration de la couverture du volume cible sans augmentation de la dose aux organes critiques

    Energy Technology Data Exchange (ETDEWEB)

    Ducteil, A.; Kerr, C.; Idri, K.; Fenoglietto, P.; Vieillot, S.; Ailleres, N.; Dubois, J.B.; Azria, D. [CRLC Val-d' Aurelle, Montpellier (France)

    2011-10-15

    The authors report an assessment of conventional intensity-modulated conformal radiotherapy (IMRT) and volumetric arc therapy (RapidArc) for the treatment of cervical cancers, compared with conventional radiotherapy. The dosimetric plans associated with each of these techniques were compared. Dose-volume histograms of the three plans were also compared for the planning target volume (PTV), the organs at risk, and healthy tissue. The intensity-modulated techniques are equivalent in terms of sparing of the organs at risk, and improve target volume coverage with respect to conventional radiotherapy. Arc therapy significantly reduces treatment duration. Short communication

  11. Effects of coverage gap reform on adherence to diabetes medications.

    Science.gov (United States)

    Zeng, Feng; Patel, Bimal V; Brunetti, Louis

    2013-04-01

    To investigate the impact of Part D coverage gap reform on diabetes medication adherence. Retrospective data analysis based on pharmacy claims data from a national pharmacy benefit manager. We used a difference-in-difference-in-difference method to evaluate the impact of coverage gap reform on adherence to diabetes medications. Two cohorts (2010 and 2011) were constructed to represent the last year before Affordable Care Act (ACA) reform and the first year after reform, respectively. Each patient had 2 observations: 1 before and 1 after entering the coverage gap. Patients in each cohort were divided into groups based on type of gap coverage: no coverage, partial coverage (generics only), and full coverage. Following ACA reform, patients with no gap coverage and patients with partial gap coverage experienced substantial drops in copayments in the coverage gap in 2011. Their adherence to diabetes medications in the gap, measured by percentage of days covered, improved correspondingly (2.99 percentage points, 95% confidence interval [CI] 0.49-5.48, P = .019 for patients with no coverage; 6.46 percentage points, 95% CI 3.34-9.58 for patients with partial coverage). Copayments in the gap also fell for patients with full gap coverage in 2011; however, their adherence did not increase (-0.13 percentage point, P = .8011). In the first year of ACA coverage gap reform, copayments in the gap decreased substantially for all patients. Patients with no coverage and patients with partial coverage in the gap had better adherence in the gap in 2011.

  12. ε-Net Approach to Sensor k-Coverage

    Directory of Open Access Journals (Sweden)

    Fusco Giordano

    2010-01-01

    Wireless sensors rely on battery power, and in many applications it is difficult or prohibitive to replace them. Hence, in order to prolong the system's lifetime, some sensors can be kept inactive while others perform all the tasks. In this paper, we study the k-coverage problem of activating the minimum number of sensors to ensure that every point in the area is covered by at least k sensors. This ensures higher fault tolerance and robustness, and improves many operations, among which position detection and intrusion detection. The k-coverage problem is trivially NP-complete, and hence we can only provide approximation algorithms. In this paper, we present an algorithm based on an extension of the classical ε-net technique. This method gives an approximation guarantee expressed in terms of the number of sensors in an optimal solution. We do not make any particular assumption on the shape of the areas covered by each sensor, besides that they must be closed, connected, and without holes.
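
    The paper's algorithm is based on ε-nets; the sketch below is not that algorithm but a plain greedy heuristic for the same k-coverage problem, included only to make the problem statement concrete (sensor positions, sensing radius and the grid of points are invented):

```python
import itertools
import random

def covers(sensor, point, radius=3.0):
    """True if the point lies inside the sensor's circular sensing area."""
    return (sensor[0] - point[0]) ** 2 + (sensor[1] - point[1]) ** 2 <= radius ** 2

def greedy_k_coverage(sensors, points, k):
    """Activate, at each step, the sensor that covers the most points still
    lacking k-coverage, until every point is covered by at least k sensors."""
    active = []
    count = {p: 0 for p in points}
    while any(c < k for c in count.values()):
        candidates = [s for s in sensors if s not in active]
        if not candidates:
            raise ValueError("k-coverage is not achievable with these sensors")
        gain, best = max(
            (sum(1 for p in points if count[p] < k and covers(s, p)), s)
            for s in candidates
        )
        if gain == 0:
            raise ValueError("k-coverage is not achievable with these sensors")
        active.append(best)
        for p in points:
            if covers(best, p):
                count[p] += 1
    return active

random.seed(5)
sensors = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(150)]
points = list(itertools.product(range(0, 11, 2), repeat=2))  # coarse grid
print(len(greedy_k_coverage(sensors, points, k=2)))
```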

  13. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  14. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  15. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  16. Childhood immunization rates in rural Intibucá, Honduras: an analysis of a local database tool and community health center records for assessing and improving vaccine coverage.

    Science.gov (United States)

    He, Yuan; Zarychta, Alan; Ranz, Joseph B; Carroll, Mary; Singleton, Lori M; Wilson, Paria M; Schlaudecker, Elizabeth P

    2012-12-07

    Vaccines are highly effective at preventing infectious diseases in children, and prevention is especially important in resource-limited countries where treatment is difficult to access. In Honduras, the World Health Organization (WHO) reports very high immunization rates in children. To determine whether or not these estimates accurately depict the immunization coverage in non-urban regions of the country, we compared the WHO data to immunization rates obtained from a local database tool and community health center records in rural Intibucá, Honduras. We used data from two sources to comprehensively evaluate immunization rates in the area: 1) census data from a local database and 2) immunization data collected at health centers. We compared these rates using logistic regression, and we compared them to publicly available WHO-reported estimates using confidence interval inclusion. We found that mean immunization rates for each vaccine were high (range 84.4 to 98.8 percent), but rates recorded at the health centers were significantly higher than those reported from the census data (p ≤ 0.001). Combining the results from both databases, the mean rates of four out of five vaccines were less than WHO-reported rates (p 0.05), except for diphtheria/tetanus/pertussis vaccine (p=0.02) and oral polio vaccine. Immunization rates in rural Honduras were high across data sources, though most of the rates recorded in rural Honduras were less than WHO-reported rates. Despite geographical difficulties and barriers to access, the local database and Honduran community health workers have developed a thorough system for ensuring that children receive their immunizations on time. The successful integration of community health workers and a database within the Honduran decentralized health system may serve as a model for other immunization programs in resource-limited countries where health care is less accessible.

  17. Does Motion Assessment With 4-Dimensional Computed Tomographic Imaging for Non–Small Cell Lung Cancer Radiotherapy Improve Target Volume Coverage?

    Directory of Open Access Journals (Sweden)

    Naseer Ahmed

    2017-03-01

    Introduction: Modern radiotherapy with 4-dimensional computed tomographic (4D-CT) image acquisition for non–small cell lung cancer (NSCLC) captures respiratory-mediated tumor motion to provide more accurate target delineation. This study compares conventional 3-dimensional (3D) conformal radiotherapy (3DCRT) plans generated with standard helical free-breathing CT (FBCT) with plans generated on 4D-CT contoured volumes to determine whether target volume coverage is affected. Materials and methods: Fifteen patients with stage I to IV NSCLC were enrolled in the study. Free-breathing CT and 4D-CT data sets were acquired at the same simulation session and with the same immobilization. Gross tumor volume (GTV) for primary and/or nodal disease was contoured on FBCT (GTV_3D). The 3DCRT plans were obtained, and the patients were treated according to our institution's standard protocol using FBCT imaging. Gross tumor volume was contoured on 4D-CT for primary and/or nodal disease on all 10 respiratory phases and merged to create internal gross tumor volume (IGTV_4D). Clinical target volume margin was 5 mm in both plans, whereas planning tumor volume (PTV) expansion was 1 cm axially and 1.5 cm superior/inferior for FBCT-based plans to incorporate setup errors and an estimate of respiratory-mediated tumor motion vs 8 mm isotropic margin for setup error only in all 4D-CT plans. The 3DCRT plans generated from the FBCT scan were copied on the 4D-CT data set with the same beam parameters. GTV_3D, IGTV_4D, PTV, and dose volume histogram from both data sets were analyzed and compared. Dice coefficient evaluated PTV similarity between FBCT and 4D-CT data sets. Results: In total, 14 of the 15 patients were analyzed. One patient was excluded as there was no measurable GTV. Mean GTV_3D was 115.3 cm3 and mean IGTV_4D was 152.5 cm3 (P = .001). Mean PTV_3D was 530.0 cm3 and PTV_4D was 499.8 cm3 (P = .40). Both gross primary and nodal disease analyzed separately were larger
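
    A minimal sketch of the Dice similarity coefficient used to compare PTV_3D and PTV_4D; the voxel masks below are toy stand-ins, not patient data:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity of two binary volumes: 2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two slightly offset box-shaped PTV masks on a small voxel grid.
ptv_3d = np.zeros((40, 40, 40), dtype=bool); ptv_3d[5:30, 5:30, 5:30] = True
ptv_4d = np.zeros((40, 40, 40), dtype=bool); ptv_4d[8:32, 6:31, 5:30] = True
print(f"Dice = {dice_coefficient(ptv_3d, ptv_4d):.3f}")
```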

  18. Does introducing an immunization package of services for migrant children improve the coverage, service quality and understanding? An evidence from an intervention study among 1548 migrant children in eastern China.

    Science.gov (United States)

    Hu, Yu; Luo, Shuying; Tang, Xuewen; Lou, Linqiao; Chen, Yaping; Guo, Jing; Zhang, Bing

    2015-07-15

    An EPI (Expanded Program on Immunization) intervention package was implemented from October 2011 to May 2014 among migrant children in Yiwu, east China. This study aimed to evaluate its impacts on vaccination coverage, maternal understanding of EPI and the local immunization service performance. A pre- and post-test design was used. The EPI intervention package included: (1) extending the EPI service time and increasing the frequency of vaccination service; (2) training program for vaccinators; (3) developing a screening tool to identify vaccination demands among migrant clinic attendants; (4) Social mobilization for immunization. Data were obtained from random sampling investigations, vaccination service statistics and qualitative interviews with vaccinators and mothers of migrant children. The analysis of quantitative data was based on a "before and after" evaluation and qualitative data were analyzed using content analysis. The immunization registration (records kept by immunization clinics) rate increased from 87.4 to 91.9% (P = 0.016) after implementation of the EPI intervention package and the EPI card holding (EPI card kept by caregivers) rate increased from 90.9 to 95.6% (P = 0.003). The coverage of fully immunized increased from 71.5 to 88.6% for migrant children aged 1-4 years (P < 0.001) and increased from 42.2 to 80.5% for migrant children aged 2-4 years (P < 0.001). The correct response rates on valid doses and management of adverse events among vaccinators were over 90% after training. The correct response rates on immunization among mothers of migrant children were 86.8-99.3% after interventions. Our study showed a substantial improvement in vaccination coverage among migrant children in Yiwu after implementation of the EPI intervention package. Further studies are needed to evaluate the cost-effectiveness of the interventions, to identify individual interventions that make the biggest contribution to coverage, and to examine the

  19. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  20. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  1. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  2. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  3. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  5. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  6. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  7. Development and formative evaluation of an innovative mHealth intervention for improving coverage of community-based maternal, newborn and child health services in rural areas of India

    Directory of Open Access Journals (Sweden)

    Dhiren Modi

    2015-02-01

    Full Text Available Background: A new cadre of village-based frontline health workers, called Accredited Social Health Activists (ASHAs), was created in India. However, coverage of selected community-based maternal, newborn and child health (MNCH) services remains low. Objective: This article describes the process of development and formative evaluation of a complex mHealth intervention (ImTeCHO) to increase the coverage of proven MNCH services in rural India by improving the performance of ASHAs. Design: The Medical Research Council (MRC) framework for developing complex interventions was used. Gaps were identified in the usual care provided by ASHAs, based on a literature search and SEWA Rural's three decades of grassroots experience. The components of the intervention (mHealth strategies) were designed to overcome the gaps in care. The intervention, in the form of the ImTeCHO mobile phone and web application, along with the delivery model, was developed to incorporate these mHealth strategies. The intervention was piloted through 45 ASHAs among 45 villages in Gujarat (population: 45,000) over 7 months in 2013 to assess the acceptability, feasibility, and usefulness of the intervention and to identify barriers to its delivery. Results: Inadequate supervision and support to ASHAs were noted as a gap in usual care, resulting in low coverage of selected MNCH services and care received by complicated cases. Therefore, the ImTeCHO application was developed to integrate mHealth strategies in the form of job aid to ASHAs to assist with scheduling, behavior change communication, diagnosis, and patient management, along with supervision and support of ASHAs. During the pilot, the intervention and its delivery were found to be largely acceptable, feasible, and useful. A few changes were made to the intervention and its delivery, including (1) a new helpline for ASHAs, (2) further simplification of processes within the ImTeCHO incentive management system and (3) additional web

  8. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  10. Resolution, coverage, and geometry beyond traditional limits

    Energy Technology Data Exchange (ETDEWEB)

    Ronen, Shuki; Ferber, Ralf

    1998-12-31

    The presentation relates to the optimization of the image of seismic data and improved resolution and coverage of acquired data. Non-traditional processing methods such as inversion to zero offset (IZO) are used. To realize the potential of saving acquisition cost by reducing in-fill and to plan resolution improvement by processing, geometry QC methods such as the DMO Dip Coverage Spectrum (DDCS) and Bull's Eyes Analysis are used. The DDCS is a 2-D spectrum whose entries consist of the DMO (Dip Move Out) coverage for a particular reflector specified by its true time dip and reflector normal strike. The Bull's Eyes Analysis relies on real-time processing of synthetic data generated with the real geometry. 4 refs., 6 figs.

  11. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  12. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  13. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  14. Whole-Pelvis Radiotherapy in Combination With Interstitial Brachytherapy: Does Coverage of the Pelvic Lymph Nodes Improve Treatment Outcome in High-Risk Prostate Cancer?

    International Nuclear Information System (INIS)

    Bittner, Nathan; Merrick, Gregory S.; Wallner, Kent E.; Butler, Wayne M.; Galbreath, Robert; Adamovich, Edward

    2010-01-01

    Purpose: To compare biochemical progression-free survival (bPFS), cause-specific survival (CSS), and overall survival (OS) rates among high-risk prostate cancer patients treated with brachytherapy and supplemental external beam radiation (EBRT) using either a mini-pelvis (MP) or a whole-pelvis (WP) field. Methods and Materials: From May 1995 to October 2005, 186 high-risk prostate cancer patients were treated with brachytherapy and EBRT with or without androgen-deprivation therapy (ADT). High-risk prostate cancer was defined as a Gleason score of ≥8 and/or a prostate-specific antigen (PSA) concentration of ≥20 ng/ml. Results: With a median follow-up of 6.7 years, the 10-year bPFS, CSS, and OS rates for the WP vs. the MP arms were 91.7% vs. 84.4% (p = 0.126), 95.5% vs. 92.6% (p = 0.515), and 79.5% vs. 67.1% (p = 0.721), respectively. Among those patients who received ADT, the 10-year bPFS, CSS, and OS rates for the WP vs. the MP arms were 93.6% vs. 90.1% (p = 0.413), 94.2% vs. 96.0% (p = 0.927), and 73.7% vs. 70.2% (p = 0.030), respectively. Among those patients who did not receive ADT, the 10-year bPFS, CSS, and OS rates for the WP vs. the MP arms were 82.4% vs. 75.0% (p = 0.639), 100% vs. 88% (p = 0.198), and 87.5% vs. 58.8% (p = 0.030), respectively. Based on multivariate analysis, none of the evaluated parameters predicted for CSS, while bPFS was best predicted by ADT and percent positive biopsy results. OS was best predicted by age and percent positive biopsy results. Conclusions: For high-risk prostate cancer patients receiving brachytherapy, there is a nonsignificant trend toward improved bPFS, CSS, and OS rates when brachytherapy is given with WPRT. This trend is most apparent among ADT-naive patients, for whom a significant improvement in OS was observed.

  15. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  16. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  17. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  18. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  19. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  20. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  1. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  2. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  3. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  4. Increasing the coverage area through relay node deployment in long term evolution advanced cellular networks

    Science.gov (United States)

    Aldhaibani, Jaafar A.; Ahmad, R. B.; Yahya, A.; Azeez, Suzan A.

    2015-05-01

    Wireless multi-hop relay networks have become very important technologies in mobile communications. These networks ensure high throughput and coverage extension at low cost. The poor capacity at cell edges is not enough to meet the growing demand for high capacity and throughput irrespective of the user's placement in the cellular network. In this paper we propose an optimal placement of the relay node that provides the maximum achievable rate at the users and enhances the throughput and coverage in the cell-edge region. The proposed scheme is based on the outage probability at the users and takes into account the interference between nodes. Numerical analyses along with simulation results indicate an improvement of about 40% in capacity for users at the cell edge relative to the overall cell capacity.
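
    The paper's exact propagation model is not given in the abstract; the sketch below only illustrates the kind of trade-off described, assuming a two-hop decode-and-forward relay over Rayleigh fading with a simple distance power law, and sweeping the relay position to minimize the end-to-end outage probability at a cell-edge user. All parameter values are placeholders.

    ```python
    import math

    PATHLOSS_EXP = 3.5      # assumed path-loss exponent
    SNR_REF = 1e6           # assumed reference SNR at unit distance (linear scale)
    GAMMA_TH = 10.0         # SNR threshold defining outage (linear, roughly 10 dB)

    def outage_rayleigh(distance: float) -> float:
        """Outage of a single Rayleigh-fading hop: P(SNR < threshold) = 1 - exp(-th/mean_snr)."""
        mean_snr = SNR_REF * distance ** (-PATHLOSS_EXP)
        return 1.0 - math.exp(-GAMMA_TH / mean_snr)

    def two_hop_outage(bs_to_relay: float, relay_to_user: float) -> float:
        """Decode-and-forward: the end-to-end link fails if either hop is in outage."""
        p1, p2 = outage_rayleigh(bs_to_relay), outage_rayleigh(relay_to_user)
        return 1.0 - (1.0 - p1) * (1.0 - p2)

    # Cell-edge user at distance 10 from the base station; sweep relay position along the line.
    cell_edge = 10.0
    best = min((two_hop_outage(x, cell_edge - x), x) for x in [i / 10 for i in range(5, 96)])
    print(f"best relay position ~= {best[1]:.1f} (outage {best[0]:.3e})")
    ```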

  5. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
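
    The authors' two-parameter model is not reproduced here. As a toy illustration of intermittent, step-like updating, the sketch below holds a current estimate of the hidden Bernoulli parameter and steps to a new value only when the recent outcomes become sufficiently surprising under the held estimate; the window length and surprise threshold are arbitrary assumptions, not the paper's parameters.

    ```python
    import math
    import random

    def surprise(k: int, n: int, p: float) -> float:
        """Negative log-likelihood of k successes in n trials under estimate p."""
        p = min(max(p, 1e-6), 1 - 1e-6)
        return -(k * math.log(p) + (n - k) * math.log(1 - p))

    def stepwise_estimates(outcomes, window=20, threshold=3.0):
        """Step to a new estimate only when the recent window is surprising under the current one."""
        estimate, recent, estimates = 0.5, [], []
        for x in outcomes:
            recent = (recent + [x])[-window:]
            k, n = sum(recent), len(recent)
            # Compare surprise under the held estimate with surprise under the best-fitting rate.
            if n == window and surprise(k, n, estimate) - surprise(k, n, k / n) > threshold:
                estimate = k / n           # intermittent, step-like revision
                recent = []
            estimates.append(estimate)
        return estimates

    rng = random.Random(1)
    truth = [0.2] * 300 + [0.8] * 300   # hidden parameter changes once
    data = [1 if rng.random() < p else 0 for p in truth]
    est = stepwise_estimates(data)
    print("estimate near trial 250:", round(est[250], 2), "near trial 550:", round(est[550], 2))
    ```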

  6. Contraceptive Coverage and the Affordable Care Act.

    Science.gov (United States)

    Tschann, Mary; Soon, Reni

    2015-12-01

    A major goal of the Patient Protection and Affordable Care Act is reducing healthcare spending by shifting the focus of healthcare toward preventive care. Preventive services, including all FDA-approved contraception, must be provided to patients without cost-sharing under the ACA. No-cost contraception has been shown to increase uptake of highly effective birth control methods and reduce unintended pregnancy and abortion; however, some institutions and corporations argue that providing contraceptive coverage infringes on their religious beliefs. The contraceptive coverage mandate is evolving due to legal challenges, but it has already demonstrated success in reducing costs and improving access to contraception. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  10. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
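
    A standard special case may make the gradient property concrete (it is not specific to this paper): for i.i.d. Gumbel utility shocks, the CPGF is the log-sum-exp of the systematic utilities, and its gradient is exactly the multinomial logit choice probabilities. The sketch below checks this numerically.

    ```python
    import numpy as np

    def cpgf_logit(v: np.ndarray) -> float:
        """Log-sum-exp generating function associated with the multinomial logit model."""
        return float(np.log(np.sum(np.exp(v))))

    def logit_probs(v: np.ndarray) -> np.ndarray:
        """Closed-form multinomial logit choice probabilities (softmax)."""
        e = np.exp(v - v.max())
        return e / e.sum()

    v = np.array([1.0, 0.5, -0.2])
    eps = 1e-6
    # Numerical gradient of the CPGF recovers the choice probabilities.
    grad = np.array([
        (cpgf_logit(v + eps * np.eye(3)[i]) - cpgf_logit(v - eps * np.eye(3)[i])) / (2 * eps)
        for i in range(3)
    ])
    print("gradient of CPGF:   ", np.round(grad, 4))
    print("logit probabilities:", np.round(logit_probs(v), 4))
    ```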

  11. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  12. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
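
    The report's model is not reproduced in the abstract; a common structure for such analyses, consistent with the abstract's description of a causation factor derived as a second step, multiplies the number of geometric collision candidates by a causation probability. The sketch below uses purely illustrative numbers, not values from the MS Dextra study.

    ```python
    # Annual ship-ship collision frequency as (geometric candidates) x (causation factor).
    # All numbers are illustrative placeholders, not values from the cited report.

    def geometric_candidates(traffic_per_year: float, route_width_km: float,
                             collision_diameter_km: float) -> float:
        """Expected number of encounters per year that would end in collision under blind navigation."""
        return traffic_per_year * (collision_diameter_km / route_width_km)

    def collision_frequency(traffic_per_year: float = 20_000,
                            route_width_km: float = 5.0,
                            collision_diameter_km: float = 0.2,
                            causation_factor: float = 2e-4) -> float:
        """Causation factor = probability that navigators fail to resolve a critical situation."""
        return geometric_candidates(traffic_per_year, route_width_km, collision_diameter_km) * causation_factor

    print(f"estimated collisions per year ~= {collision_frequency():.3f}")
    ```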

  13. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  14. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  15. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  16. Tetanus toxoid immunization coverage among mothers of below one ...

    African Journals Online (AJOL)

    Poverty and lack of health facilities also contributed to the low level of immunization coverage. For TT immunization to improve in the area studied, factors impeding immunization must be addressed. Keywords: tetanus, immunization, coverage. African Journal of Clinical and Experimental Microbiology Vol. 6 (3) 2005: 233- ...

  17. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... vised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  18. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  19. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  20. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  1. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  2. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  3. Strategies for expanding health insurance coverage in vulnerable populations.

    Science.gov (United States)

    Jia, Liying; Yuan, Beibei; Huang, Fei; Lu, Ying; Garner, Paul; Meng, Qingyue

    2014-11-26

    evaluated the effects of strategies on increasing health insurance coverage for vulnerable populations. We defined strategies as measures to improve the enrolment of vulnerable populations into health insurance schemes. Two categories and six specified strategies were identified as the interventions. At least two review authors independently extracted data and assessed the risk of bias. We undertook a structured synthesis. We included two studies, both from the United States. People offered health insurance information and application support by community-based case managers were probably more likely to enrol their children into health insurance programmes (risk ratio (RR) 1.68, 95% confidence interval (CI) 1.44 to 1.96, moderate quality evidence) and were probably more likely to continue insuring their children (RR 2.59, 95% CI 1.95 to 3.44, moderate quality evidence). Of all the children that were insured, those in the intervention group may have been insured more quickly (47.3 fewer days, 95% CI 20.6 to 74.0 fewer days, low quality evidence) and parents may have been more satisfied on average (satisfaction score average difference 1.07, 95% CI 0.72 to 1.42, low quality evidence). In the second study applications were handed out in emergency departments at hospitals, compared to not handing out applications, and may have had an effect on enrolment (RR 1.5, 95% CI 1.03 to 2.18, low quality evidence). Community-based case managers who provide health insurance information and application support, and negotiate with the insurer, probably increase enrolment of children in health insurance schemes. However, the transferability of this intervention to other populations or other settings is uncertain. Handing out insurance application materials in hospital emergency departments may help increase the enrolment of children in health insurance schemes. Further studies evaluating the effectiveness of different strategies for expanding health insurance coverage in vulnerable populations are
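
    For readers unfamiliar with how such effect estimates are reported, the sketch below computes a risk ratio and its 95% confidence interval from a two-arm table. The enrolment counts are invented (chosen only so the point estimate lands near the reported RR of 1.68) and are not the review's underlying data.

    ```python
    import math

    def risk_ratio_ci(events_int: int, n_int: int, events_ctl: int, n_ctl: int, z: float = 1.96):
        """Risk ratio with a log-normal 95% confidence interval from a two-arm trial."""
        rr = (events_int / n_int) / (events_ctl / n_ctl)
        se_log_rr = math.sqrt(1 / events_int - 1 / n_int + 1 / events_ctl - 1 / n_ctl)
        lo = math.exp(math.log(rr) - z * se_log_rr)
        hi = math.exp(math.log(rr) + z * se_log_rr)
        return rr, lo, hi

    # Hypothetical enrolment counts for intervention vs. control families.
    rr, lo, hi = risk_ratio_ci(events_int=168, n_int=300, events_ctl=100, n_ctl=300)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```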

  4. Intensity Modulated Radiotherapy Improves Target Coverage and Parotid Gland Sparing When Delivering Total Mucosal Irradiation in Patients With Squamous Cell Carcinoma of Head and Neck of Unknown Primary Site

    International Nuclear Information System (INIS)

    Bhide, Shreerang; Clark, Catherine; Harrington, Kevin; Nutting, Christopher M.

    2007-01-01

    Head and neck squamous cell carcinoma with occult primary site represents a controversial clinical problem. Conventional total mucosal irradiation (TMI) maximizes local control, but at the expense of xerostomia. IMRT has been shown to spare salivary tissue in head and neck cancer patients. This study has been performed to investigate the potential of IMRT to deliver nodal irradiation and TMI and also allow parotid gland sparing in this patient group. Conventional radiotherapy (CRT) and IMRT plans were produced for six patients to treat the ipsilateral (involved) post-operative neck (PTV1) and the un-operated contralateral neck and mucosal axis (PTV2). Plans were produced with and without the inclusion of the nasopharynx in the PTV2. The potential to improve target coverage and spare the parotid glands was investigated for the IMRT plans. There was no significant difference in the mean doses to the PTV1 using CRT and IMRT (59.7 and 60.0 respectively, p = 0.5). The maximum doses to PTV1 and PTV2 were lower for the IMRT technique as compared to CRT (P = 0.008 and P < 0.0001), respectively, and the minimum doses to PTV1 and PTV2 were significantly higher for IMRT as compared to CRT (P = 0.001 and P = 0.001), respectively, illustrating better dose homogeneity with IMRT. The mean dose to the parotid gland contralateral to PTV1 was significantly lower for IMRT (23.21 ± 0.7) as compared to CRT (50.5 ± 5.8) (P < 0.0001). There was a significant difference in parotid dose between plans with and without the inclusion of the nasopharynx. IMRT offers improved dose homogeneity in PTV1 and PTV2 and allows for parotid sparing

  5. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  6. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  7. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  8. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
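
    The calculation itself is not reproduced in the abstract. As a sketch of its final step, turning categorized event counts into a probability of occurrence per fuel-assembly movement, a simple per-demand frequency estimate with hypothetical counts might look like this; the real counts come from the Framatome ANP 2001a review.

    ```python
    # Hypothetical event counts; the actual values come from the Framatome ANP 2001a data set.
    fa_movements_total = 2_000_000      # assumed number of fuel assembly movements reviewed
    misload_events = 45                 # assumed count of FA-misload events
    damage_events = 120                 # assumed count of FA-damage events

    def per_demand_probability(events: int, demands: int) -> float:
        """Point estimate of the probability of occurrence per FA movement."""
        return events / demands

    print(f"P(misload per movement) ~= {per_demand_probability(misload_events, fa_movements_total):.2e}")
    print(f"P(damage per movement)  ~= {per_demand_probability(damage_events, fa_movements_total):.2e}")
    ```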

  9. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  10. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  11. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  12. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  13. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two counts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  14. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
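
    A minimal sketch of the core interaction such an app implements, with no relation to the actual BMIL code: converting between the angular size of a colored slice and the probability it represents.

    ```python
    def slice_angle_to_probability(angle_degrees: float) -> float:
        """Probability represented by a slice spanning `angle_degrees` of the wheel."""
        if not 0.0 <= angle_degrees <= 360.0:
            raise ValueError("slice angle must be between 0 and 360 degrees")
        return angle_degrees / 360.0

    def probability_to_slice_angle(p: float) -> float:
        """Angular size of the slice that displays probability p."""
        if not 0.0 <= p <= 1.0:
            raise ValueError("probability must be between 0 and 1")
        return p * 360.0

    # A participant drags the slice to 126 degrees -> elicited probability 0.35.
    print(slice_angle_to_probability(126.0))   # 0.35
    print(probability_to_slice_angle(0.35))    # 126.0
    ```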

  15. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  16. Rural water supply and sanitation (RWSS) coverage in Swaziland: Toward achieving millennium development goals

    Science.gov (United States)

    Mwendera, E. J.

    An assessment of rural water supply and sanitation (RWSS) coverage in Swaziland was conducted in 2004/2005 as part of the Rural Water Supply and Sanitation Initiative (RWSSI). The initiative was developed by the African Development Bank with the aim of implementing it in the Regional Member Countries (RMCs), including Swaziland. Information on the RWSS sector programmes, costs, financial requirements and other related activities was obtained from a wide range of national documents, including sector papers and project files and progress reports. Interviews were held with staff from the central offices and field stations of Government of Swaziland (GOS) ministries and departments, non-governmental organizations (NGOs), bilateral and multilateral external support agencies, and private sector individuals and firms with some connection to the sector and/or its programmes. The assessment also involved field visits to various regions in order to obtain first hand information about the various technologies and institutional structures used in the provision of water supplies and sanitation services in the rural areas of the country. The results showed that the RWSS sector has made significant progress towards meeting the national targets of providing water and sanitation to the entire rural population by the year 2022. The assessment indicated that rural water supply coverage was 56% in 2004 while sanitation coverage was 63% in the same year. The results showed that there is some decline in the incidence of water-related diseases, such as diarrhoeal diseases, probably due to improved water supply and sanitation coverage. The study also showed that, with adequate financial resources, Swaziland is likely to achieve 100% coverage of both water supply and sanitation by the year 2022. It was concluded that in achieving its own national goals Swaziland will exceed the Millennium Development Goals (MDGs). However, such achievement is subject to adequate financial resources being

  17. Proton Therapy Coverage for Prostate Cancer Treatment

    International Nuclear Information System (INIS)

    Vargas, Carlos; Wagner, Marcus; Mahajan, Chaitali; Indelicato, Daniel; Fryer, Amber; Falchook, Aaron; Horne, David C.; Chellini, Angela; McKenzie, Craig C.; Lawlor, Paula C.; Li Zuofeng; Lin Liyong; Keole, Sameer

    2008-01-01

    Purpose: To determine the impact of prostate motion on dose coverage in proton therapy. Methods and Materials: A total of 120 prostate positions were analyzed on 10 treatment plans for 10 prostate patients treated using our low-risk proton therapy prostate protocol (University of Florida Proton Therapy Institute 001). Computed tomography and magnetic resonance imaging T2-weighted turbo spin-echo scans were registered for all cases. The planning target volume included the prostate with a 5-mm axial and 8-mm superoinferior expansion. The prostate was repositioned using 5- and 10-mm one-dimensional vectors and 10-mm multidimensional vectors (Points A-D). The beam was realigned for the 5- and 10-mm displacements. The prescription dose was 78 Gy equivalent (GE). Results: The mean percentage of rectum receiving 70 Gy (V70) was 7.9%, the bladder V70 was 14.0%, the femoral head/neck V50 was 0.1%, and the mean pelvic dose was 4.6 GE. The percentage of prostate receiving 78 Gy (V78) with the 5-mm movements changed by -0.2% (range, 0.006-0.5%, p > 0.7). However, after a 10-mm displacement the prostate V78 changed significantly, with a large reduction of 17.4% (range, 13.5-17.4%) in V78 coverage of the clinical target volume. The minimal prostate dose was reduced by 33% (25.8 GE), on average, for Points A-D. The prostate minimal dose improved from 69.3 GE to 78.2 GE (p < 0.001) with realignment for 10-mm movements. Conclusion: The good dose coverage and low normal-tissue doses achieved for the initial plan were maintained with movements of ≤5 mm. Beam realignment improved coverage for 10-mm displacements
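
    The V-metrics quoted above (for example, rectum V70 and prostate V78) are dose-volume statistics: the percentage of a structure's volume receiving at least the stated dose. The sketch below computes them from an array of voxel doses; the dose values are illustrative, not the study data.

    ```python
    import numpy as np

    def v_metric(dose_gy: np.ndarray, threshold_gy: float) -> float:
        """Percentage of voxels in the structure receiving at least `threshold_gy`."""
        return 100.0 * float(np.mean(dose_gy >= threshold_gy))

    # Illustrative voxel doses (Gy equivalent) for two contoured structures.
    rng = np.random.default_rng(0)
    prostate_dose = rng.normal(loc=79.0, scale=1.0, size=10_000)
    rectum_dose = rng.normal(loc=40.0, scale=15.0, size=10_000)

    print(f"prostate V78 ~= {v_metric(prostate_dose, 78.0):.1f}%")
    print(f"rectum   V70 ~= {v_metric(rectum_dose, 70.0):.1f}%")
    ```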

  18. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  19. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  20. A High-Efficiency Uneven Cluster Deployment Algorithm Based on Network Layered for Event Coverage in UWSNs

    Directory of Open Access Journals (Sweden)

    Shanen Yu

    2016-12-01

    Most existing deployment algorithms for event coverage in underwater wireless sensor networks (UWSNs) do not consider that network communication has non-uniform characteristics in three-dimensional underwater environments. Such deployment algorithms ignore that nodes are distributed at different depths and have different probabilities of data acquisition, leading to imbalances in overall network energy consumption, decreased network performance, and poor, unreliable operation late in the network's life. In this study, we therefore propose an uneven cluster deployment algorithm based on network layering for event coverage. First, according to the energy consumption required by the communication load at different depths of the underwater network, we derive the expected number of deployed nodes and the distribution density of each network layer through theoretical analysis. The network is then divided into multiple layers based on uneven clusters, and the heterogeneous communication radius of nodes improves the network connectivity rate. A recovery strategy balances the energy consumption of nodes within a cluster and can efficiently reconstruct the network topology, ensuring that the network maintains high coverage and connectivity over a long period of data acquisition. Simulation results show that the proposed algorithm improves network reliability and prolongs network lifetime by significantly reducing blind movement of network nodes while maintaining high network coverage and connectivity.

  1. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
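
    To make the pair of generalized functions concrete, here is a small Python sketch of one widely used one-parameter deformation of the logarithm and its inverse; this particular parameterization is an assumption for illustration and may differ from the paper's exact form:

      import numpy as np

      def gen_log(x, q):
          """One-parameter generalized logarithm; reduces to ln(x) as q -> 0."""
          x = np.asarray(x, dtype=float)
          if abs(q) < 1e-12:
              return np.log(x)
          return (x**q - 1.0) / q

      def gen_exp(y, q):
          """Inverse of gen_log; reduces to exp(y) as q -> 0."""
          y = np.asarray(y, dtype=float)
          if abs(q) < 1e-12:
              return np.exp(y)
          return (1.0 + q * y) ** (1.0 / q)

      x = 2.5
      for q in (0.0, 0.3, -0.5):
          # round trip gen_exp(gen_log(x)) returns x for each deformation parameter
          print(q, gen_log(x, q), gen_exp(gen_log(x, q), q))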

  2. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  3. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

    This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems, ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross's text has tremendously clear exposition, plus real data ...

  4. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  5. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
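
    As a rough illustration of the quadrature idea discussed in this record (not the authors' implementation), the complex probability (Faddeeva) function w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt for Im(z) > 0 can be approximated directly from Gauss-Hermite nodes and weights, with scipy's wofz as a reference value; the node count below is an arbitrary choice:

      import numpy as np
      from numpy.polynomial.hermite import hermgauss
      from scipy.special import wofz   # reference Faddeeva implementation

      def w_gauss_hermite(z, n=40):
          """Approximate w(z) = (i/pi) * int exp(-t^2)/(z - t) dt, Im(z) > 0.
          The Gauss-Hermite weights absorb the exp(-t^2) factor; accuracy
          degrades close to the real axis, a known shortcoming of this approach."""
          t, wts = hermgauss(n)
          return (1j / np.pi) * np.sum(wts / (z - t))

      z = 1.5 + 0.8j
      print(w_gauss_hermite(z))   # quadrature approximation
      print(wofz(z))              # reference value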

  6. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  7. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time
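
    For orientation only — this is the classical Wigner rational approximation, not the Hwang-Toppel algorithm used in ROLAIDS — a quick estimate of the escape probability for an infinite cylinder can be written from its mean chord length 4V/S = 2R; the cross section and radius below are made-up example values:

      def escape_probability_wigner(sigma_t, radius_cm):
          """Wigner rational approximation: P_esc ~ 1 / (1 + Sigma_t * l_bar),
          with mean chord length l_bar = 4V/S = 2R for an infinite cylinder."""
          mean_chord = 2.0 * radius_cm             # cm
          return 1.0 / (1.0 + sigma_t * mean_chord)

      # Example: fuel pin of radius 0.5 cm with total cross section 0.7 cm^-1.
      print(escape_probability_wigner(sigma_t=0.7, radius_cm=0.5))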

  8. [Options for flap coverage in pressure sores].

    Science.gov (United States)

    Nae, S; Antohi, N; Stîngu, C; Stan, V; Parasca, S

    2010-01-01

    Despite improvements in reconstructive techniques for pressure sores, recurrences are still seen frequently, and success rates remain variable. During 2003-2007, at the Emergency Hospital for Plastic Surgery and Burns in Bucharest, 27 patients underwent surgical repair of 45 pressure sores located in the sacral (22 ulcers), ischial (12 ulcers) and trochanteric (11 ulcers) regions. The mean patient age was 57.1 years (range 26 to 82 years). Mean postoperative follow-up was 6 months (range 2 months - 2 years). There were 18 complications for the 45 sores (40%). At 6 months postoperatively, recurrence was noted in 12 ulcers (27%). Details regarding indications, contraindications, advantages and disadvantages of the different coverage options are outlined. The authors advocate the importance of surgical coverage in reducing morbidity, mortality and treatment costs.

  9. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  10. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
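
    The problem COVAL addresses — obtaining the distribution of a function of random variables from the distributions of the variables themselves — can be made concrete with plain Monte Carlo sampling. This is a generic alternative shown only for illustration, not COVAL's numerical transformation method, and the load/resistance model below is hypothetical:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000

      # Hypothetical reliability problem: a structure fails when a random load L
      # exceeds a random resistance R; the quantity of interest is the margin M = R - L.
      load = rng.gumbel(loc=50.0, scale=8.0, size=n)          # kN
      resistance = rng.normal(loc=90.0, scale=10.0, size=n)   # kN

      margin = resistance - load                  # function of the random variables
      print("P(failure) =", np.mean(margin < 0.0))
      print("5% / 95% quantiles of margin:", np.quantile(margin, [0.05, 0.95]))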

  11. Graphene transfer process and optimization of graphene coverage

    OpenAIRE

    Sabki Syarifah Norfaezah; Shamsuri Shafiq Hafly; Fauzi Siti Fazlina; Chon-Ki Meghashama Lim; Othman Noraini

    2017-01-01

    Graphene grown on transition metals is known to be of high quality due to its controlled amount of defects and is potentially useful for many electronic applications. The transfer process of graphene grown on a transition metal to a new substrate requires optimization in order to ensure that high graphene coverage can be obtained. In this work, an improvement in the graphene transfer process is demonstrated for graphene grown on copper foil. It has been observed that the graphene coverage is affected by ...

  12. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  13. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  14. A Novel Deployment Scheme Based on Three-Dimensional Coverage Model for Wireless Sensor Networks

    Science.gov (United States)

    Xiao, Fu; Yang, Yang; Wang, Ruchuan; Sun, Lijuan

    2014-01-01

    Coverage pattern and deployment strategy are directly related to the optimum allocation of limited resources in wireless sensor networks, such as node energy, communication bandwidth, and computing power, and quality improvement for wireless sensor networks is largely determined by them. A three-dimensional coverage pattern and deployment scheme are proposed in this paper. First, by analyzing regular polyhedron models in a three-dimensional scene, a coverage pattern based on cuboids is proposed, the relationship between coverage and the sensor nodes' radius is deduced, and the minimum number of sensor nodes needed to maintain full coverage of the network area is calculated. Finally, sensor nodes are deployed according to the coverage pattern after the monitored area is subdivided into a finite 3D grid. Experimental results show that, compared with the traditional random method, the number of sensor nodes is reduced effectively while the coverage rate of the monitored area is ensured using our coverage pattern and deterministic deployment scheme. PMID:25045747
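
    A back-of-the-envelope version of the cuboid idea (my own sketch, not the authors' derivation): a cube inscribed in a sensing sphere of radius r has side 2r/sqrt(3), so full coverage of an L x W x H region with a cubic grid needs roughly ceil(L/s) * ceil(W/s) * ceil(H/s) nodes placed at the cell centers. The region size and sensing radius below are arbitrary example values:

      import math

      def grid_deployment(region, sensing_radius):
          """Node count and positions for full 3D coverage with a cubic grid.
          Each grid cell (side s = 2r/sqrt(3)) fits inside one node's sensing sphere,
          so a node at the cell center covers every point of its cell."""
          L, W, H = region
          s = 2.0 * sensing_radius / math.sqrt(3.0)
          nx, ny, nz = math.ceil(L / s), math.ceil(W / s), math.ceil(H / s)
          positions = [((i + 0.5) * s, (j + 0.5) * s, (k + 0.5) * s)
                       for i in range(nx) for j in range(ny) for k in range(nz)]
          return nx * ny * nz, positions

      count, _ = grid_deployment(region=(100.0, 60.0, 40.0), sensing_radius=10.0)
      print("nodes needed:", count)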

  15. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  16. Recommendation system for immunization coverage and monitoring.

    Science.gov (United States)

    Bhatti, Uzair Aslam; Huang, Mengxing; Wang, Hao; Zhang, Yu; Mehmood, Anum; Di, Wu

    2018-01-02

    Immunization averts an expected 2 to 3 million deaths every year from diphtheria, tetanus, pertussis (whooping cough), and measles; however, an additional 1.5 million deaths could be avoided if vaccination coverage were improved worldwide (source: http://www.who.int/mediacentre/factsheets/fs378/en/). New vaccination technologies provide earlier diagnoses, personalized treatments and a wide range of other benefits for both patients and health care professionals. Childhood diseases that were commonplace less than a generation ago have become rare because of vaccines. However, 100% vaccination coverage is still the target to avoid further mortality. Governments have launched special campaigns to create awareness of vaccination. In this paper, we have focused on data mining algorithms for big data using a collaborative approach for vaccination datasets to resolve problems with planning vaccinations in children, stocking vaccines, and tracking and monitoring non-vaccinated children appropriately. Geographical mapping of vaccination records helps to tackle red zone areas, where vaccination rates are poor, while green zone areas, where vaccination rates are good, can be monitored to enable health care staff to plan the administration of vaccines. Our recommendation algorithm assists in these processes by using deep data mining and by accessing records of other hospitals to highlight locations with lower rates of vaccination. The overall performance of the model is good. The model has been implemented in hospitals to control vaccination across the coverage area.

  17. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid, so that the failure probability of the entire application can be quantified and the effectiveness of different backup strategies in reducing it can be compared; the different requirements of different clients can then be satisfied according to the corresponding application failure probability. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, achieving a compromise between the network operator and the application submitter. Differentiated services can thereby be achieved in an optical grid.
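
    A simplified version of the task-based idea (my sketch, assuming independent task failures rather than the paper's full model): if an application's DAG completes only when every scheduled task succeeds, the application failure probability is one minus the product of the per-task success probabilities, and an independent backup replica multiplies that task's failure term. The per-task probabilities below are hypothetical:

      def app_failure_probability(task_fail_probs, backups=None):
          """P(app fails) = 1 - prod_i P(task i succeeds), assuming independence.
          backups maps a task index to its number of independent backup replicas."""
          backups = backups or {}
          success = 1.0
          for i, p in enumerate(task_fail_probs):
              replicas = 1 + backups.get(i, 0)
              success *= 1.0 - p ** replicas   # a task fails only if all replicas fail
          return 1.0 - success

      tasks = [0.02, 0.05, 0.01, 0.03]                 # hypothetical per-task failure probabilities
      print(app_failure_probability(tasks))            # no backups
      print(app_failure_probability(tasks, {1: 1}))    # one backup for the riskiest task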

  18. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  19. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
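
    A compact, generic Bayesian-optimization loop (a Gaussian-process surrogate with expected improvement, via scikit-learn) makes the search strategy concrete; the toy log-probability, kernel, and iteration counts below are hypothetical choices, and this is not the authors' code:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF
      from scipy.stats import norm

      def log_prob(x):
          # Hypothetical "expensive" log-probability we want to maximize.
          return -0.5 * (x - 2.0) ** 2 / 0.3 ** 2

      rng = np.random.default_rng(0)
      X = rng.uniform(-5, 5, size=(5, 1))            # initial design points
      y = log_prob(X[:, 0])
      grid = np.linspace(-5, 5, 400).reshape(-1, 1)  # candidate points

      for _ in range(15):
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                        normalize_y=True).fit(X, y)
          mu, sd = gp.predict(grid, return_std=True)
          best = y.max()
          z = (mu - best) / np.maximum(sd, 1e-12)
          ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
          x_next = grid[np.argmax(ei)]                        # next point to evaluate
          X = np.vstack([X, x_next])
          y = np.append(y, log_prob(x_next[0]))

      print("estimated maximizer:", X[np.argmax(y)][0])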

  20. Mediating Trust in Terrorism Coverage

    DEFF Research Database (Denmark)

    Mogensen, Kirsten

    Mass mediated risk communication can contribute to perceptions of threats and fear of “others” and/or to perceptions of trust in fellow citizens and society to overcome problems. This paper outlines a cross-disciplinary holistic framework for research in mediated trust building during an acute crisis. While the framework is presented in the context of television coverage of a terror-related crisis situation, it can equally be used in connection with all other forms of mediated trust. Key words: national crisis, risk communication, crisis management, television coverage, mediated trust.

  1. Insurance coverage and prenatal care among low-income pregnant women: an assessment of states' adoption of the "Unborn Child" option in Medicaid and CHIP.

    Science.gov (United States)

    Jarlenski, Marian P; Bennett, Wendy L; Barry, Colleen L; Bleich, Sara N

    2014-01-01

    The "Unborn Child" (UC) option provides state Medicaid/Children's Health Insurance Program (CHIP) programs with a new strategy to extend prenatal coverage to low-income women who would otherwise have difficulty enrolling in or would be ineligible for Medicaid. To examine the association of the UC option with the probability of enrollment in Medicaid/CHIP during pregnancy and the probability of receiving adequate prenatal care. We use pooled cross-sectional data from the Pregnancy Risk Assessment Monitoring System from 32 states between 2004 and 2010 (n = 81,983). Multivariable regression is employed to examine the association of the UC option with Medicaid/CHIP enrollment during pregnancy among eligible women who were uninsured preconception (n = 45,082) and those who had insurance (but not Medicaid) preconception (n = 36,901). Multivariable regression is also employed to assess the association between the UC option and receipt of adequate prenatal care, measured by the Adequacy of Prenatal Care Utilization Index. Residing in a state with the UC option is associated with a greater probability of Medicaid enrollment during pregnancy relative to residing in a state without the policy among women uninsured preconception (88% vs. 77%). The UC option is not significantly associated with receiving adequate prenatal care, among both women with and without insurance preconception. The UC option provides states a key way to expand or simplify prenatal insurance coverage, but further policy efforts are needed to ensure that coverage improves access to high-quality prenatal care.

  2. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.

  3. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  4. Newspaper coverage of mental illness in England 2008-2011.

    Science.gov (United States)

    Thornicroft, Amalia; Goulden, Robert; Shefer, Guy; Rhydderch, Danielle; Rose, Diana; Williams, Paul; Thornicroft, Graham; Henderson, Claire

    2013-04-01

    Better newspaper coverage of mental health-related issues is a target for the Time to Change (TTC) anti-stigma programme in England, whose population impact may be influenced by how far concurrent media coverage perpetuates stigma and discrimination. To compare English newspaper coverage of mental health-related topics each year of the TTC social marketing campaign (2009-2011) with baseline coverage in 2008. Content analysis was performed on articles in 27 local and national newspapers on two randomly chosen days each month. There was a significant increase in the proportion of anti-stigmatising articles between 2008 and 2011. There was no concomitant proportional decrease in stigmatising articles, and the contribution of mixed or neutral elements decreased. These findings provide promising results on improvements in press reporting of mental illness during the TTC programme in 2009-2011, and a basis for guidance to newspaper journalists and editors on reporting mental illness.

  5. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  6. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    Directory of Open Access Journals (Sweden)

    Anxing Shan

    2017-05-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement that guarantees both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with this analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.
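
    Under a probabilistic sensing model, the collaborative detection probability is one minus the product of the individual miss probabilities. A minimal sketch follows; the exponential-decay sensing model, its parameters, and the sensor positions are assumptions for illustration, not the paper's exact model:

      import math

      def detection_prob(distance, r_max=30.0, decay=0.08):
          """Assumed probabilistic sensing model: detection probability decays
          exponentially with distance and is zero beyond the maximum range."""
          return math.exp(-decay * distance) if distance <= r_max else 0.0

      def collaborative_detection(target, sensors):
          """P(at least one sensor detects the target) = 1 - prod_i (1 - p_i)."""
          miss = 1.0
          for s in sensors:
              miss *= 1.0 - detection_prob(math.dist(target, s))
          return 1.0 - miss

      sensors = [(0.0, 0.0), (20.0, 5.0), (10.0, 25.0)]   # hypothetical sensor positions
      print(collaborative_detection((12.0, 8.0), sensors))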

  7. Terrorism and nuclear damage coverage

    International Nuclear Information System (INIS)

    Horbach, N. L. J. T.; Brown, O. F.; Vanden Borre, T.

    2004-01-01

    This paper deals with nuclear terrorism and the manner in which nuclear operators can insure themselves against it, based on the international nuclear liability conventions. It concludes that terrorism is currently not covered under the treaty exoneration provisions on 'war-like events', based on an analysis of the concept of 'terrorism' and the travaux preparatoires. Consequently, operators remain liable for nuclear damage resulting from terrorist acts, for which mandatory insurance is applicable. Since the nuclear insurance industry is looking to exclude such coverage from its policies in the near future, this article suggests alternative means of insurance, in order to ensure adequate compensation for innocent victims. The September 11, 2001 attacks on the World Trade Center in New York City and the Pentagon in Washington, DC resulted in the largest loss in the history of insurance, inevitably leading to concerns about nuclear damage coverage, should future such assaults target a nuclear power plant or other nuclear installation. Since the attacks, some insurers have signalled their intentions to exclude coverage for terrorism from their nuclear liability and property insurance policies. Other insurers are maintaining coverage for terrorism, but are establishing aggregate limits or sublimits and are increasing premiums. Additional changes by insurers are likely to occur. Highlighted by the September 11th events, and most recently by those in Madrid on 11 March 2004, are questions about how to define acts of terrorism and the extent to which such acts are covered under the international nuclear liability conventions and various domestic nuclear liability laws. Of particular concern to insurers is the possibility of coordinated simultaneous attacks on multiple nuclear facilities. This paper provides a survey of the issues, and recommendations for future clarifications and coverage options. (author)

  8. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'Physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  9. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  10. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities, including cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Strategies for expanding health insurance coverage in vulnerable populations

    Science.gov (United States)

    Jia, Liying; Yuan, Beibei; Huang, Fei; Lu, Ying; Garner, Paul; Meng, Qingyue

    2014-01-01

    … studies and interrupted time series (ITS) studies that evaluated the effects of strategies on increasing health insurance coverage for vulnerable populations. We defined strategies as measures to improve the enrolment of vulnerable populations into health insurance schemes. Two categories and six specified strategies were identified as the interventions. Data collection and analysis: At least two review authors independently extracted data and assessed the risk of bias. We undertook a structured synthesis. Main results: We included two studies, both from the United States. People offered health insurance information and application support by community-based case managers were probably more likely to enrol their children into health insurance programmes (risk ratio (RR) 1.68, 95% confidence interval (CI) 1.44 to 1.96, moderate quality evidence) and were probably more likely to continue insuring their children (RR 2.59, 95% CI 1.95 to 3.44, moderate quality evidence). Of all the children that were insured, those in the intervention group may have been insured quicker (47.3 fewer days, 95% CI 20.6 to 74.0 fewer days, low quality evidence) and parents may have been more satisfied on average (satisfaction score average difference 1.07, 95% CI 0.72 to 1.42, low quality evidence). In the second study, applications were handed out in emergency departments at hospitals, compared to not handing out applications, and may have had an effect on enrolment (RR 1.5, 95% CI 1.03 to 2.18, low quality evidence). Authors' conclusions: Community-based case managers who provide health insurance information, application support, and negotiate with the insurer probably increase enrolment of children in health insurance schemes. However, the transferability of this intervention to other populations or other settings is uncertain. Handing out insurance application materials in hospital emergency departments may help increase the enrolment of children in health insurance schemes. Further studies

  14. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: Decomposition integral; Superdecomposition integral; Probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  15. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  16. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  17. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  18. Root coverage with bridge flap

    Directory of Open Access Journals (Sweden)

    Pushpendra Kumar Verma

    2013-01-01

    Gingival recession in anterior teeth is a common concern due to esthetic reasons or root sensitivity. Gingival recession, especially in multiple anterior teeth, is of huge concern for esthetic reasons. Various mucogingival surgeries are available for root coverage. This case report presents a new bridge flap technique, which allows the dentist not only to cover the previously denuded root surfaces but also to increase the zone of attached gingiva in a single step. In this case, a coronally advanced flap along with a vestibular deepening technique was used as the root coverage procedure for the treatment of a multiple recession-type defect. Here, the vestibular deepening technique is used to increase the width of the attached gingiva. The predictability of this procedure results in an esthetically healthy periodontium, along with a gain in keratinized tissue and good patient acceptance.

  19. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  20. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. We have developed a practical AutoLISP program in PC AutoCAD to quantify acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with a normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial CT scans. These slices were prepared with a fixed coordinate system and in continuous sections of 5 mm in thickness. The contours of the cartilage of each section were digitized into a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized as normal. The polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, the medial and lateral coverage ratios are indispensable in cases of dysplastic hip because the variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  1. Disparities in Private Health Insurance Coverage of Skilled Care

    Directory of Open Access Journals (Sweden)

    Stacey A. Tovino

    2017-10-01

    This article compares and contrasts public and private health insurance coverage of skilled medical rehabilitation, including cognitive rehabilitation, physical therapy, occupational therapy, speech-language pathology, and skilled nursing services (collectively, skilled care). As background, prior scholars writing in this area have focused on Medicare coverage of skilled care and have challenged coverage determinations limiting Medicare coverage to beneficiaries who are able to demonstrate improvement in their conditions within a specific period of time (the Improvement Standard). By and large, these scholars have applauded the settlement agreement approved on 24 January 2013 by the U.S. District Court for the District of Vermont in Jimmo v. Sebelius (Jimmo), as well as related motions, rulings, orders, government fact sheets, and Medicare program manual statements clarifying that Medicare covers skilled care that is necessary to prevent or slow a beneficiary's deterioration or to maintain a beneficiary at his or her maximum practicable level of function even though no further improvement in the beneficiary's condition is expected. Scholars who have focused on beneficiaries who have suffered severe brain injuries, in particular, have framed public insurance coverage of skilled brain rehabilitation as an important civil, disability, and educational right. Given that approximately two-thirds of Americans with health insurance are covered by private health insurance and that many private health plans continue to require their insureds to demonstrate improvement within a short period of time to obtain coverage of skilled care, scholarship assessing private health insurance coverage of skilled care is important but noticeably absent from the literature. This article responds to this gap by highlighting state benchmark plans' and other private health plans' continued use of the Improvement Standard in skilled care coverage decisions and

  2. A probabilistic coverage for on-the-fly test generation algorithms

    OpenAIRE

    Goga, N.

    2003-01-01

    This paper describes a way to compute the coverage for an on-the-fly test generation algorithm based on a probabilistic approach. The on-the-fly test generation and execution process and the development process of an implementation from a specification are viewed as a stochastic process. The probabilities of the stochastic processes are integrated in a generalized definition of coverage. The generalized formulas are instantiated for the ioco theory and for the specification of the TorX test g...

  3. The effect of EDTA in attachment gain and root coverage.

    Science.gov (United States)

    Kassab, Moawia M; Cohen, Robert E; Andreana, Sebastiano; Dentino, Andrew R

    2006-06-01

    Root surface biomodification using low-pH agents such as citric acid and tetracycline has been proposed to enhance root coverage following connective tissue grafting. The authors hypothesized that root conditioning with neutral-pH edetic acid would improve vertical recession depth, root surface coverage, pocket depth, and clinical attachment levels. Twenty teeth in 10 patients with Miller class I and II recession were treated with connective tissue grafting. The experimental sites received 24% edetic acid in sterile distilled water applied to the root surface for 2 minutes before grafting. Controls were pretreated with only sterile distilled water. Measurements were taken before surgery and 6 months after surgery. Analysis of variance was used to determine differences between the experimental and control groups. We found significant postoperative improvements in vertical recession depth, root surface coverage, and clinical attachment levels in both test and control groups, compared to preoperative values. Pocket depth differences were not significant (P<.01).

  4. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
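
    The core point — that a consistent nonparametric regression learner applied to a 0/1 response yields consistent estimates of the conditional probability P(Y = 1 | X) — can be sketched with a random forest regressor. This is a generic illustration on synthetic data, not the authors' experiments on the appendicitis or Pima data:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(3)
      n = 5000
      X = rng.uniform(-2, 2, size=(n, 2))

      # True conditional probability of the binary outcome (logistic in x1, x2).
      p_true = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))
      y = rng.binomial(1, p_true)                  # observed 0/1 responses

      # Regressing the 0/1 response estimates E[Y | X] = P(Y = 1 | X) directly.
      rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=20, random_state=0)
      rf.fit(X, y)

      p_hat = rf.predict(X)
      print("mean absolute error of probability estimates:",
            np.abs(p_hat - p_true).mean())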

  5. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  6. 29 CFR 95.31 - Insurance coverage.

    Science.gov (United States)

    2010-07-01

    § 95.31 Insurance coverage. Recipients shall, at a minimum, provide the equivalent insurance coverage ... recipient. Federally-owned property need not be insured unless required by the terms and conditions of the ...

  7. Assessing Measurement Error in Medicare Coverage

    Data.gov (United States)

    U.S. Department of Health & Human Services — Assessing Measurement Error in Medicare Coverage From the National Health Interview Survey Using linked administrative data, to validate Medicare coverage estimates...

  8. 76 FR 7767 - Student Health Insurance Coverage

    Science.gov (United States)

    2011-02-11

    ... Student Health Insurance Coverage AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION... health insurance coverage under the Public Health Service Act and the Affordable Care Act. The proposed rule would define ``student health insurance ...

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
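
    The following sketch is a numerical check of the claim above rather than the paper's analytical derivation: for a log-normal risk factor, the threshold is set at the estimated (1 - eps) quantile computed from a finite sample, and the realised failure probability, averaged over many replications, comes out above the nominal level eps. The sample size, nominal level, and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0            # true (unknown to the decision maker) parameters
n, eps, reps = 30, 0.01, 20000  # data size, nominal failure probability, replications

exceed = np.empty(reps)
for r in range(reps):
    x = rng.lognormal(mu, sigma, size=n)                  # observed risk-factor data
    m, s = np.log(x).mean(), np.log(x).std(ddof=1)        # fitted parameters
    threshold = np.exp(m + s * stats.norm.ppf(1 - eps))   # estimated (1 - eps) quantile
    # true probability that a new realisation exceeds the chosen threshold
    exceed[r] = 1 - stats.norm.cdf((np.log(threshold) - mu) / sigma)

print("nominal failure probability :", eps)
print("average realised probability:", exceed.mean())     # typically larger than eps
```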

  10. Solving k-Barrier Coverage Problem Using Modified Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yanhua Zhang

    2017-01-01

    Full Text Available The coverage problem is a critical issue in wireless sensor networks for security applications, and k-barrier coverage is an effective measure to ensure robustness. In this paper, we formulate the k-barrier coverage problem as a constrained optimization problem and introduce an energy constraint on the sensor nodes to prolong the lifetime of the k-barrier coverage. A novel hybrid particle swarm optimization and gravitational search algorithm (PGSA) is proposed to solve this problem. The proposed PGSA adopts a probability-based k-barrier coverage generation strategy, integrates the exploitation ability of particle swarm optimization into the velocity update to enhance the global search capability, and introduces a boundary mutation strategy for agents to increase population diversity and search accuracy. Extensive simulations are conducted to demonstrate the effectiveness of the proposed algorithm.

  11. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  12. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by ""the Newton of France"" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences.Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory ""to the most important questions of life, which are, in effect, for the most part, problems in probability."" Thus, without the use of higher mathematics, he demonstrates the application

  13. Increasing Coverage of Hepatitis B Vaccination in China

    Science.gov (United States)

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-01-01

    Abstract This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- or Chinese-language studies carried out in mainland China, using PubMed, the Cochrane databases, Web of Knowledge, China National Knowledge Infrastructure, Wanfang data, and other relevant databases. Nineteen articles about the effectiveness and impact of interventions on improving the coverage of hepatitis B vaccine were included. Strong or moderate evidence showed that reinforcing health education, training and supervision, providing subsidies for facility birth, strengthening the coordination among health care providers, and using out-of-cold-chain storage for vaccines were all important to improving vaccination coverage. We found evidence that community education was the most commonly used intervention, and outreach programs such as the out-of-cold-chain strategy were more effective in increasing vaccination coverage in remote areas where the facility birth rate was relatively low. The essential impact factors were found to be strong government commitment and the cooperation of the different government departments. Public interventions relying on basic health care systems combined with outreach care services were critical elements in improving the hepatitis B vaccination rate in China. This success could not have occurred without exceptional national commitment. PMID:27175710

  14. MC/DC and Toggle Coverage Measurement Tool for FBD Program Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Jung, Se Jin; Kim, Jae Yeob; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2016-05-15

    The functional verification of an FBD program can be implemented with various techniques such as testing and simulation. Simulation is preferable for verifying an FBD program because it also replicates the operation of the PLC. The PLC executes repeatedly, based on its scan time, for as long as the controlled system is running; likewise, the simulation technique operates continuously and sequentially. Although engineers try to verify the functionality completely, it is difficult to find residual errors in the design. Even if 100% functional coverage is accomplished, code coverage may reach only 50%, which might indicate that the scenario is missing some key features of the design. Unfortunately, errors and bugs are often found at these missing points. To assure a high quality of functional verification, code coverage is as important as functional coverage. We developed a pair of tools, 'FBDSim' and 'FBDCover', for FBD simulation and coverage measurement. 'FBDSim' automatically simulates a set of FBD simulation scenarios. While 'FBDSim' simulates the FBD program, it calculates the MC/DC and toggle coverage and identifies unstimulated points. After the FBD simulation is done, 'FBDCover' reads the coverage results and shows the coverage with a graphical feature and the uncovered points with a tree feature. The coverages and uncovered points can help engineers to improve the quality of simulation. Both coverage measures are treated only briefly here; they will be dealt with in a more concrete and rigorous manner in future work.
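
    As a generic illustration of the kind of bookkeeping such a tool performs (this is not the FBDSim/FBDCover implementation, and the signal names are invented), the sketch below measures toggle coverage over simulated scan cycles: a signal counts as covered once it has been seen making both a 0-to-1 and a 1-to-0 transition.

```python
from collections import defaultdict

def toggle_coverage(traces):
    """traces: dict mapping signal name -> list of boolean values, one per scan cycle."""
    seen = defaultdict(set)                     # signal -> observed transition kinds
    for name, values in traces.items():
        for prev, curr in zip(values, values[1:]):
            if prev != curr:
                seen[name].add("rise" if curr else "fall")
    covered = [n for n in traces if seen[n] == {"rise", "fall"}]
    uncovered = [n for n in traces if n not in covered]
    return len(covered) / len(traces), uncovered

# Example: OUT1 rises but never falls back to 0, so it remains uncovered.
cov, missing = toggle_coverage({"IN1": [0, 1, 0, 1], "OUT1": [0, 0, 1, 1]})
print(cov, missing)   # 0.5 ['OUT1']
```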

  15. 5 CFR 890.1106 - Coverage.

    Science.gov (United States)

    2010-01-01

    ... family member is an individual whose relationship to the enrollee meets the requirements of 5 U.S.C. 8901... EMPLOYEES HEALTH BENEFITS PROGRAM Temporary Continuation of Coverage § 890.1106 Coverage. (a) Type of enrollment. An individual who enrolls under this subpart may elect coverage for self alone or self and family...

  16. 40 CFR 51.356 - Vehicle coverage.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Vehicle coverage. 51.356 Section 51.356....356 Vehicle coverage. The performance standard for enhanced I/M programs assumes coverage of all 1968 and later model year light duty vehicles and light duty trucks up to 8,500 pounds GVWR, and includes...

  17. 29 CFR 801.3 - Coverage.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Coverage. 801.3 Section 801.3 Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR OTHER LAWS APPLICATION OF THE EMPLOYEE POLYGRAPH PROTECTION ACT OF 1988 General § 801.3 Coverage. (a) The coverage of the Act extends to “any...

  18. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  20. Expanding the universe of universal coverage: the population health argument for increasing coverage for immigrants.

    Science.gov (United States)

    Nandi, Arijit; Loue, Sana; Galea, Sandro

    2009-12-01

    As the US recession deepens, furthering the debate about healthcare reform is now even more important than ever. Few plans aimed at facilitating universal coverage make any mention of increasing access for uninsured non-citizens living in the US, many of whom are legally restricted from certain types of coverage. We conducted a critical review of the public health literature concerning the health status and access to health services among immigrant populations in the US. Using examples from infectious and chronic disease epidemiology, we argue that access to health services is at the intersection of the health of uninsured immigrants and the general population and that extending access to healthcare to all residents of the US, including undocumented immigrants, is beneficial from a population health perspective. Furthermore, from a health economics perspective, increasing access to care for immigrant populations may actually reduce net costs by increasing primary prevention and reducing the emphasis on emergency care for preventable conditions. It is unlikely that proposals for universal coverage will accomplish their objectives of improving population health and reducing social disparities in health if they do not address the substantial proportion of uninsured non-citizens living in the US.

  1. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  2. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  4. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  5. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  6. Handover Incentives for Self-Interested WLANs with Overlapping Coverage

    DEFF Research Database (Denmark)

    Fafoutis, Xenofon; Siris, Vasilios A.

    2012-01-01

    We consider an environment where self-interested IEEE 802.11 Wireless Local Area Networks (WLANs) have overlapping coverage, and investigate the incentives that can trigger handovers between the WLANs. Our focus is on the incentives for supporting handovers due solely to the improved performance...

  7. The Role of Media Coverage in Meeting Operational Objectives

    National Research Council Canada - National Science Library

    Mitchell-Musumarra, Mary

    2003-01-01

    ...: Operation Desert Storm, Operation Restore Hope (Somalia) and Operation Iraqi Freedom. It describes some of the motivations and concerns of the news media, and examines doctrine from the perspective of the media's requirements for information. Finally, recommendations are made to improve future media coverage of operations.

  8. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  9. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  10. Whole brain CT perfusion in acute anterior circulation ischemia: coverage size matters

    International Nuclear Information System (INIS)

    Emmer, B.J.; Rijkee, M.; Walderveen, M.A.A. van; Niesten, J.M.; Velthuis, B.K.; Wermer, M.J.H.

    2014-01-01

    Our aim was to compare infarct core volume on whole brain CT perfusion (CTP) with several limited coverage sizes (i.e., 3, 4, 6, and 8 cm), as currently used in routine clinical practice. In total, 40 acute ischemic stroke patients with non-contrast CT (NCCT) and CTP imaging of anterior circulation ischemia were included. Imaging was performed using a 320-multislice CT. Average volumes of infarct core of all simulated partial coverage sizes were calculated. Infarct core volume of each partial brain coverage was compared with infarct core volume of whole brain coverage and expressed using a percentage. To determine the optimal starting position for each simulated CTP coverage, the percentage of infarct coverage was calculated for every possible starting position of the simulated partial coverage in relation to Alberta Stroke Program Early CT Score in Acute Stroke Triage (ASPECTS 1) level. Whole brain CTP coverage further increased the percentage of infarct core volume depicted by 10 % as compared to the 8-cm coverage when the bottom slice was positioned at the ASPECTS 1 level. Optimization of the position of the region of interest (ROI) in 3 cm, 4 cm, and 8 cm improved the percentage of infarct depicted by 4 % for the 8-cm, 7 % for the 4-cm, and 13 % for the 3-cm coverage size. This study shows that whole brain CTP is the optimal coverage for CTP with a substantial improvement in accuracy in quantifying infarct core size. In addition, our results suggest that the optimal position of the ROI in limited coverage depends on the size of the coverage. (orig.)

  11. Whole brain CT perfusion in acute anterior circulation ischemia: coverage size matters

    Energy Technology Data Exchange (ETDEWEB)

    Emmer, B.J. [Erasmus Medical Centre, Department of Radiology, Postbus 2040, Rotterdam (Netherlands); Rijkee, M.; Walderveen, M.A.A. van [Leiden University Medical Centre, Department of Radiology, Leiden (Netherlands); Niesten, J.M.; Velthuis, B.K. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Wermer, M.J.H. [Leiden University Medical Centre, Department of Neurology, Leiden (Netherlands)

    2014-12-15

    Our aim was to compare infarct core volume on whole brain CT perfusion (CTP) with several limited coverage sizes (i.e., 3, 4, 6, and 8 cm), as currently used in routine clinical practice. In total, 40 acute ischemic stroke patients with non-contrast CT (NCCT) and CTP imaging of anterior circulation ischemia were included. Imaging was performed using a 320-multislice CT. Average volumes of infarct core of all simulated partial coverage sizes were calculated. Infarct core volume of each partial brain coverage was compared with infarct core volume of whole brain coverage and expressed using a percentage. To determine the optimal starting position for each simulated CTP coverage, the percentage of infarct coverage was calculated for every possible starting position of the simulated partial coverage in relation to Alberta Stroke Program Early CT Score in Acute Stroke Triage (ASPECTS 1) level. Whole brain CTP coverage further increased the percentage of infarct core volume depicted by 10 % as compared to the 8-cm coverage when the bottom slice was positioned at the ASPECTS 1 level. Optimization of the position of the region of interest (ROI) in 3 cm, 4 cm, and 8 cm improved the percentage of infarct depicted by 4 % for the 8-cm, 7 % for the 4-cm, and 13 % for the 3-cm coverage size. This study shows that whole brain CTP is the optimal coverage for CTP with a substantial improvement in accuracy in quantifying infarct core size. In addition, our results suggest that the optimal position of the ROI in limited coverage depends on the size of the coverage. (orig.)

  12. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  13. Immunization coverage among Hispanic ancestry, 2003 National Immunization Survey.

    Science.gov (United States)

    Darling, Natalie J; Barker, Lawrence E; Shefer, Abigail M; Chu, Susan Y

    2005-12-01

    The Hispanic population is increasing and heterogeneous (Hispanic refers to persons of Spanish, Hispanic, or Latino descent). The objective was to examine immunization rates among Hispanic ancestry for the 4:3:1:3:3 series (≥4 doses of diphtheria, tetanus toxoids, and pertussis vaccine; ≥3 doses of poliovirus vaccine; ≥1 dose of measles-containing vaccine; ≥3 doses of Haemophilus influenzae type b vaccine; and ≥3 doses of hepatitis B vaccine). The National Immunization Survey measures immunization coverage among 19- to 35-month-old U.S. children. Coverage was compared from combined 2001-2003 data among Hispanics and non-Hispanic whites using t-tests, and among Hispanic ancestry using a chi-square test. Hispanics were categorized as Mexican, Mexican American, Central American, South American, Puerto Rican, Cuban, Spanish Caribbean (primarily Dominican Republic), other, and multiple ancestry. Children of Hispanic ancestry increased from 21% in 1999 to 25% in 2003. These Hispanic children were less well immunized than non-Hispanic whites (77.0% ± 2.1% [95% confidence interval] compared to 82.5% ± 1.1% in 2003). Immunization coverage did not vary significantly among Hispanics of varying ancestries (p=0.26); however, there was substantial geographic variability. In some areas, immunization coverage among Hispanics was significantly higher than non-Hispanic whites. Hispanic children were less well immunized than non-Hispanic whites; however, coverage varied notably by geographic area. Although a chi-square test found no significant differences in coverage among Hispanic ancestries, the range of coverage, from 79.2% ± 5.1% for Cuban Americans to 72.1% ± 2.4% for those of Mexican descent, may suggest a need for improved and more localized monitoring among Hispanic communities.

  14. [The registration of deaths in Venezuela: an evaluation of coverage].

    Science.gov (United States)

    Bidegain, G; Lopez, D

    1987-08-01

    "This paper presents six indirect techniques for estimating the degree of death coverage as applied to vital statistics information in Venezuela between 1960 and 1982, collected by two public institutions, namely, the 'Oficina Central de Estadistica e Informatica' (OCEI) and the Ministry of Health and Social Assistance (MSAS).... The results show remarkable improvements in the death registry coverage for both institutions, that amount to 97 or 98 per cent at the beginning of the 80's. Nevertheless, great differences can be observed between them regarding both structure and volume of deaths by sex and age." Among the problems discussed are the impact of immigration and errors in age reporting. (SUMMARY IN ENG) excerpt

  15. Quad-Tree Visual-Calculus Analysis of Satellite Coverage

    Science.gov (United States)

    Lo, Martin W.; Hockney, George; Kwan, Bruce

    2003-01-01

    An improved method of analysis of coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites has been developed. This method is intended to supplant an older method in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method provides for rapid and efficient analysis. This method is derived from a satellite-to-ground perspective and involves a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.

  16. Coverage maximization for a poisson field of drone cells

    KAUST Repository

    Azari, Mohammad Mahdi

    2018-02-15

    The use of drone base stations to provide wireless connectivity for ground terminals is becoming a promising part of future technologies. The design of such aerial networks is, however, different from that of 2D cellular networks, as the drone antennas point downward and the channel model becomes height-dependent. In this paper, we study the effect of antenna patterns and height-dependent shadowing. We consider a random network topology to capture the effect of dynamic changes of the flying base stations. First, we characterize the aggregate interference imposed by the co-channel neighboring drones. Then we derive the link coverage probability between a ground user and its associated drone base station. The result is used to obtain the optimum system parameters in terms of the drones' antenna beamwidth, density, and altitude. We also derive the average LoS probability of the associated drone and show that it is a good approximation and simplification of the coverage probability at low altitudes of up to 500 m, depending on the required signal-to-interference-plus-noise ratio (SINR).
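
    A heavily simplified Monte Carlo sketch of the coverage-probability metric studied above is given below; it is not the paper's analytical result. Drones form a 2D Poisson point process at a fixed altitude, the ground user associates with the nearest drone, and every other drone interferes. Antenna patterns and LoS/NLoS shadowing are deliberately omitted, and the density, altitude, path-loss exponent, and SINR threshold are illustrative values only.

```python
import numpy as np

rng = np.random.default_rng(1)

def coverage_probability(density=5e-7, altitude=100.0, alpha=3.0,
                         noise=1e-13, tx_power=1.0, sinr_thresh_db=0.0,
                         radius=5000.0, trials=5000):
    thresh = 10 ** (sinr_thresh_db / 10)
    covered = 0
    for _ in range(trials):
        n = rng.poisson(density * np.pi * radius**2)     # number of drones in the disk
        if n == 0:
            continue                                      # no serving drone -> not covered
        r = radius * np.sqrt(rng.random(n))               # uniform locations in the disk
        d = np.sqrt(r**2 + altitude**2)                   # 3D distances to the user at the origin
        rx = tx_power * d ** (-alpha)                     # simple power-law path loss
        serving = rx.argmax()                             # nearest drone serves the user
        sinr = rx[serving] / (rx.sum() - rx[serving] + noise)
        covered += sinr > thresh
    return covered / trials

print(coverage_probability())
```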

  17. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
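
    The sketch below is in the spirit of, but not identical to, the construction described above: pointwise order-statistic intervals for standardized normal samples are widened by simulation until roughly 1-α of simulated samples fall entirely inside the band, giving approximately simultaneous coverage. The sample size, α, simulation count, and grid of candidate levels are illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, alpha, sims = 50, 0.05, 4000

# standardized order statistics of simulated N(0,1) samples
z = np.sort(rng.standard_normal((sims, n)), axis=1)
z = (z - z.mean(axis=1, keepdims=True)) / z.std(axis=1, ddof=1, keepdims=True)

def band(level):
    return (np.quantile(z, level / 2, axis=0),
            np.quantile(z, 1 - level / 2, axis=0))

# shrink the pointwise level until the band has ~(1 - alpha) simultaneous coverage
for level in np.linspace(alpha, alpha / 50, 200):
    lo, hi = band(level)
    inside = np.mean(np.all((z >= lo) & (z <= hi), axis=1))
    if inside >= 1 - alpha:
        break

theo = stats.norm.ppf((np.arange(1, n + 1) - 0.5) / n)   # plotting positions
print(f"pointwise level {level:.4f} gives simultaneous coverage {inside:.3f}")
# A sample plotted against `theo` is judged non-normal if any standardized
# order statistic falls outside the band [lo, hi].
```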

  18. Probability and statistics with R

    CERN Document Server

    Ugarte, Maria Dolores; Arnholt, Alan T

    2008-01-01

    -Technometrics, May 2009, Vol. 51, No. 2 The book is comprehensive and well written. The notation is clear and the mathematical derivations behind nontrivial equations and computational implementations are carefully explained. Rather than presenting a collection of R scripts together with a summary of relevant theoretical results, this book offers a well-balanced mix of theory, examples and R code.-Raquel Prado, University of California, Santa Cruz,  The American Statistician, February 2009… an impressive book … Overall, this is a good reference book with comprehensive coverage of the details

  19. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  20. Realizing right to health through universal health coverage

    Directory of Open Access Journals (Sweden)

    ANJALI Singh

    2014-07-01

    Full Text Available Recognition of the right to health is an essential step toward improving public health and attaining the highest standard of physical and mental health of the people. The right to health in India is an implicit part of the right to life under Article 19 of the Constitution of India but is not recognized per se. Universal Health Coverage adopts a rights-based approach and the principles of universality, equity, empowerment and comprehensiveness of care. The Universal Coverage Report of India makes recommendations in six identified areas to revamp the health systems in order to ensure the right to health of Indians. These areas are: health financing and financial protection; health service norms; human resources for health; community participation and citizen engagement; access to medicines, vaccines and technology; and management and institutional reforms. This paper attempts to determine the ways in which Universal Health Coverage can contribute to realizing the right to health, and thus human rights, in developing countries.

  1. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  2. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated, and optimal system parameters - damping and natural frequency - are developed so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
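
    A minimal numerical sketch of the final check described above follows, under stated assumptions: the relative displacement of the isolated mass is zero-mean Gaussian with an RMS value obtained elsewhere (for example, from a random-vibration analysis of the isolator), and the vibration criterion is expressed as a displacement limit. The numbers are illustrative, not the article's data.

```python
from math import erf, sqrt

def prob_exceeding_criterion(sigma_um: float, d_crit_um: float) -> float:
    """P(|X| > d_crit) for a zero-mean Gaussian displacement X with RMS sigma."""
    phi = 0.5 * (1 + erf(d_crit_um / (sigma_um * sqrt(2))))  # standard normal CDF
    return 2 * (1 - phi)

# e.g. an RMS relative displacement of 0.08 um against a 0.25 um criterion
print(prob_exceeding_criterion(0.08, 0.25))   # ~0.0018, comfortably below 0.04
```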

  3. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  4. Enhancing Political Will for Universal Health Coverage in Nigeria.

    Science.gov (United States)

    Aregbeshola, Bolaji S

    2017-01-01

    Universal health coverage aims to increase equity in access to quality health care services and to reduce financial risk due to health care costs. It is a key component of international health agenda and has been a subject of worldwide debate. Despite differing views on its scope and pathways to reach it, there is a global consensus that all countries should work toward universal health coverage. The goal remains distant for many African countries, including Nigeria. This is mostly due to lack of political will and commitment among political actors and policymakers. Evidence from countries such as Ghana, Chile, Mexico, China, Thailand, Turkey, Rwanda, Vietnam and Indonesia, which have introduced at least some form of universal health coverage scheme, shows that political will and commitment are key to the adoption of new laws and regulations for reforming coverage. For Nigeria to improve people's health, reduce poverty and achieve prosperity, universal health coverage must be vigorously pursued at all levels. Political will and commitment to these goals must be expressed in legal mandates and be translated into policies that ensure increased public health care financing for the benefit of all Nigerians. Nigeria, as part of a global system, cannot afford to lag behind in striving for this overarching health goal.

  5. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  6. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  7. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  8. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  9. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  10. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  11. Increasing Coverage of Hepatitis B Vaccination in China

    OpenAIRE

    Wang, Shengnan; Smith, Helen; Peng, Zhuoxin; Xu, Biao; Wang, Weibing

    2016-01-01

    Abstract This study used a system evaluation method to summarize China's experience in improving the coverage of hepatitis B vaccine, especially the strategies employed to improve the uptake of the timely birth dose. Identifying successful methods and strategies will provide strong evidence for policy makers and health workers in other countries with high hepatitis B prevalence. We conducted a literature review that included English- or Chinese-language studies carried out in mainland China, using PubMed, ...

  12. [Evaluation of dental care coverage in the State Military Police in Salvador, Bahia, Brazil].

    Science.gov (United States)

    Ribeiro-Sobrinho, Clóvis; Souza, Luís Eugênio Portela Fernandes de; Chaves, Sônia Cristina Lima

    2008-02-01

    This study seeks to evaluate dental care coverage in the State Military Police in Salvador, Bahia State, Brazil, from 2002 to 2004, estimating potential and real coverage rates. A single descriptive study was performed. Calculations were made of potential coverage rates considering hourly workloads of staff dentists and the real rates resulting from actual outpatient treatment. Potential human resources coverage was adequate (1 dentist per 1,618 policemen), while the real coverage rate was considered below the standard proposed by the Brazilian Ministry of Health (0.39 procedures per policeman per year). The low real coverage rate could be related to low productivity, the reasons for which should be investigated in greater depth in future studies, and might include organizational problems and lack of a management system to improve the quality of professional practice, with specifically defined targets.

  13. Cost-Utility Analysis of Extending Public Health Insurance Coverage to Include Diabetic Retinopathy Screening by Optometrists.

    Science.gov (United States)

    van Katwyk, Sasha; Jin, Ya-Ping; Trope, Graham E; Buys, Yvonne; Masucci, Lisa; Wedge, Richard; Flanagan, John; Brent, Michael H; El-Defrawy, Sherif; Tu, Hong Anh; Thavorn, Kednapa

    2017-09-01

    Diabetic retinopathy (DR) is one of the leading causes of vision loss and blindness in Canada. Eye examinations play an important role in early detection. However, DR screening by optometrists is not always universally covered by public or private health insurance plans. This study assessed whether expanding public health coverage to include diabetic eye examinations for retinopathy by optometrists is cost-effective from the perspective of the health care system. We conducted a cost-utility analysis of extended coverage for diabetic eye examinations in Prince Edward Island to include examinations by optometrists, not currently publicly covered. We used a Markov chain to simulate disease burden based on eye examination rates and DR progression over a 30-year time horizon. Results were presented as an incremental cost per quality-adjusted life year (QALY) gained. A series of one-way and probabilistic sensitivity analyses were performed. Extending public health coverage to eye examinations by optometrists was associated with higher costs ($9,908,543.32) and improved QALYs (156,862.44), over 30 years, resulting in an incremental cost-effectiveness ratio of $1668.43/QALY gained. Sensitivity analysis showed that the most influential determinants of the results were the cost of optometric screening and selected utility scores. At the commonly used threshold of $50,000/QALY, the probability that the new policy was cost-effective was 99.99%. Extending public health coverage to eye examinations by optometrists is cost-effective based on a commonly used threshold of $50,000/QALY. Findings from this study can inform the decision to expand public-insured optometric services for patients with diabetes. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
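
    To make the mechanics of such an analysis concrete, the toy sketch below runs two screening strategies through the same three-state Markov cohort model and reports an incremental cost-effectiveness ratio. It is emphatically not the study's model: the states, transition probabilities, costs, utilities, and discount rate are all invented for illustration.

```python
import numpy as np

def run(trans, screen_cost, years=30, discount=0.03):
    state = np.array([1.0, 0.0, 0.0])                  # cohort starts with no DR
    utilities = np.array([0.85, 0.75, 0.50])           # no DR, DR, vision loss
    state_costs = np.array([screen_cost, screen_cost + 500.0, 4000.0])
    cost = qaly = 0.0
    for t in range(years):
        d = (1 + discount) ** -t                       # discount factor
        cost += d * state @ state_costs
        qaly += d * state @ utilities
        state = state @ trans                          # advance the cohort one year
    return cost, qaly

# rows: from-state; columns: to-state (no DR, DR, vision loss)
base     = np.array([[0.95, 0.05, 0.00], [0.00, 0.90, 0.10], [0.00, 0.00, 1.00]])
expanded = np.array([[0.95, 0.05, 0.00], [0.00, 0.95, 0.05], [0.00, 0.00, 1.00]])

c0, q0 = run(base, screen_cost=20.0)       # current coverage
c1, q1 = run(expanded, screen_cost=60.0)   # extended optometrist coverage
print(f"ICER = {(c1 - c0) / (q1 - q0):.0f} per QALY gained")
```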

  14. Sideline coverage of youth football.

    Science.gov (United States)

    Rizzone, Katie; Diamond, Alex; Gregory, Andrew

    2013-01-01

    Youth football is a popular sport in the United States and has been for some time. There are currently more than 3 million participants in youth football leagues according to USA Football. While the number of participants and overall injuries may be higher in other sports, football has a higher rate of injuries. Most youth sporting events do not have medical personnel on the sidelines in event of an injury or emergency. Therefore it is necessary for youth sports coaches to undergo basic medical training in order to effectively act in these situations. In addition, an argument could be made that appropriate medical personnel should be on the sideline for collision sports at all levels, from youth to professional. This article will discuss issues pertinent to sideline coverage of youth football, including coaching education, sideline personnel, emergency action plans, age and size divisions, tackle versus flag football, and injury prevention.

  15. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
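
    A hedged sketch of this kind of dynamic update is shown below; it is not the authors' fitted algorithm. Starting from an initial event probability P0 issued at the X-ray peak, the probability is revised downward as time passes with no 10 pfu onset, using the cumulative distribution F(t) of flare-to-onset delay times. Here F(t) is modelled as a lognormal purely for illustration, whereas the paper derives it from the NOAA event list as a function of source longitude.

```python
from math import erf, log, sqrt

def delay_cdf(t_hours, median=8.0, sigma=0.8):
    """Illustrative lognormal CDF of the X-ray-peak-to-SEP-onset delay."""
    if t_hours <= 0:
        return 0.0
    return 0.5 * (1 + erf(log(t_hours / median) / (sigma * sqrt(2))))

def dynamic_probability(p0, t_hours):
    """P(an SEP event will still occur | none observed t_hours after the flare)."""
    surv = 1 - delay_cdf(t_hours)
    return p0 * surv / (p0 * surv + (1 - p0))

for t in (0, 6, 12, 24, 48):
    print(t, round(dynamic_probability(0.5, t), 3))   # probability decays as no event appears
```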

  16. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  17. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  18. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  19. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the largest Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  20. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  1. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
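
    The sketch below illustrates the "risk machine" idea in the simplest possible way, under clearly labelled assumptions: a random forest serves as the probability machine, and a conditional effect size for a binary exposure is read off as the average difference between counterfactual predictions with the exposure set to 1 versus 0. The synthetic data, covariate names, and forest settings are invented for this example and are not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 2000
age = rng.normal(50, 10, n)
exposure = rng.integers(0, 2, n)
logit = -4 + 0.05 * age + 0.8 * exposure          # logistic data-generating model
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, exposure])
rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20, random_state=0)
rf.fit(X, y)

# counterfactual predictions: exposure forced to 1 and to 0 for every subject
X1, X0 = X.copy(), X.copy()
X1[:, 1], X0[:, 1] = 1, 0
risk_diff = rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]
print("average risk difference for the exposure:", round(risk_diff.mean(), 3))
```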

  2. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  3. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  4. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
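
    For concreteness, the sketch below implements one widely cited probability-to-possibility transformation (the construction usually attributed to Dubois and Prade, in which the possibility of an outcome is the total probability of all outcomes that are no more probable than it). The note compares several transformations, and this is only one of them; the example distribution is invented.

```python
def prob_to_poss(p):
    """p: probabilities summing to 1. Returns possibilities in the same order."""
    order = sorted(range(len(p)), key=lambda i: p[i], reverse=True)
    poss = [0.0] * len(p)
    cum = sum(p)
    for i in order:            # visit outcomes from most to least probable
        poss[i] = cum          # pi_i = sum of all p_j with p_j <= p_i (ties broken arbitrarily)
        cum -= p[i]
    return poss

print(prob_to_poss([0.5, 0.3, 0.2]))   # [1.0, 0.5, 0.2]
```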

  5. Country-level predictors of vaccination coverage and inequalities in Gavi-supported countries.

    Science.gov (United States)

    Arsenault, Catherine; Johri, Mira; Nandi, Arijit; Mendoza Rodríguez, José M; Hansen, Peter M; Harper, Sam

    2017-04-25

    Important inequalities in childhood vaccination coverage persist between countries and population groups. Understanding why some countries achieve higher and more equitable levels of coverage is crucial to redress these inequalities. In this study, we explored the country-level determinants of (1) coverage of the third dose of diphtheria-tetanus-pertussis- (DTP3) containing vaccine and (2) within-country inequalities in DTP3 coverage in 45 countries supported by Gavi, the Vaccine Alliance. We used data from the most recent Demographic and Health Surveys (DHS) conducted between 2005 and 2014. We measured national DTP3 coverage and the slope index of inequality in DTP3 coverage with respect to household wealth, maternal education, and multidimensional poverty. We collated data on country health systems, health financing, governance and geographic and sociocultural contexts from published sources. We used meta-regressions to assess the relationship between these country-level factors and variations in DTP3 coverage and inequalities. To validate our findings, we repeated these analyses for coverage with measles-containing vaccine (MCV). We found considerable heterogeneity in DTP3 coverage and in the magnitude of inequalities across countries. Results for MCV were consistent with those from DTP3. Political stability, gender equality and smaller land surface were important predictors of higher and more equitable levels of DTP3 coverage. Inequalities in DTP3 coverage were also lower in countries receiving more external resources for health, with lower rates of out-of-pocket spending and with higher national coverage. Greater government spending on heath and lower linguistic fractionalization were also consistent with better vaccination outcomes. Improving vaccination coverage and reducing inequalities requires that policies and programs address critical social determinants of health including geographic and social exclusion, gender inequality and the availability of
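
    For readers unfamiliar with the slope index of inequality (SII) mentioned above, the sketch below computes it under a common convention (not necessarily the authors' exact estimator): group-level DTP3 coverage is regressed on the midpoint of each wealth quintile's cumulative population share, and the fitted slope is the SII, i.e. the estimated coverage gap between the very bottom and the very top of the wealth distribution. The quintile shares and coverage figures are invented numbers.

```python
import numpy as np

shares   = np.array([0.2, 0.2, 0.2, 0.2, 0.2])        # population share per wealth quintile
coverage = np.array([0.55, 0.63, 0.70, 0.78, 0.88])   # DTP3 coverage, poorest -> richest

cum = np.cumsum(shares)
midpoints = cum - shares / 2                           # ridit-style rank midpoints in [0, 1]
slope, intercept = np.polyfit(midpoints, coverage, 1, w=shares)
print(f"SII = {slope:.3f}")   # ~0.40, i.e. a 40-percentage-point gap across the distribution
```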

  6. A note on the transition probability over Csup(*)-algebras

    International Nuclear Information System (INIS)

    Alberti, P.M.; Karl-Marx-Universitaet, Leipzig

    1983-01-01

    The algebraic structure of Uhlmann's transition probability between mixed states on unital Csup(*)-algebras is analyzed. Several improvements of methods to calculate the transition probability are fixed, examples are given (e.g., the case of quasi-local Csup(*)-algebras is dealt with) and two more functional characterizations are proved in general. (orig.)

  7. Providing Universal Health Insurance Coverage in Nigeria.

    Science.gov (United States)

    Okebukola, Peter O; Brieger, William R

    2016-07-07

    Despite a stated goal of achieving universal coverage, the National Health Insurance Scheme of Nigeria had achieved only 4% coverage 12 years after it was launched. This study assessed the plans of the National Health Insurance Scheme to achieve universal health insurance coverage in Nigeria by 2015 and discusses the challenges facing the scheme in achieving insurance coverage. In-depth interviews from various levels of the health-care system in the country, including providers, were conducted. The results of the analysis suggest that challenges to extending coverage include the difficulty in convincing autonomous state governments to buy into the scheme and an inadequate health workforce that might not be able to meet increased demand. Recommendations for increasing the scheme's coverage include increasing decentralization and strengthening human resources for health in the service delivery systems. Strong political will is needed as a catalyst to achieving these goals. © The Author(s) 2016.

  8. Shared access protocol (SAP) in femtocell channel resources for cellular coverage enhancement

    KAUST Repository

    Magableh, Amer M.

    2012-12-01

    Femtocells are promising techniques employed in cellular systems to enhance indoor coverage, especially in areas with high density and high traffic rates. In this paper, we propose an efficient resource utilization protocol, named shared access protocol (SAP), that enables unlicensed macro-cell user equipments (MC-UE) to communicate with partially closed access femtocell base stations and hence improves the overall system performance in closed environments. For the proposed system model, we obtain, in closed form, the main signal-to-interference plus noise ratio (SINR) characteristics, including the probability density function (PDF) and the cumulative distribution function (CDF). In addition, these expressions are further used to derive several performance metrics in closed form, such as the average bit error rate (BER), outage probability, and the average channel capacity for the proposed SAP. Furthermore, Monte Carlo simulations as well as numerical results are provided showing a good match that confirms the correctness of the derived expressions. © 2012 IEEE.

  9. Universal health coverage in Turkey: enhancement of equity.

    Science.gov (United States)

    Atun, Rifat; Aydın, Sabahattin; Chakraborty, Sarbani; Sümer, Safir; Aran, Meltem; Gürol, Ipek; Nazlıoğlu, Serpil; Ozgülcü, Senay; Aydoğan, Ulger; Ayar, Banu; Dilmen, Uğur; Akdağ, Recep

    2013-07-06

    Turkey has successfully introduced health system changes and provided its citizens with the right to health to achieve universal health coverage, which helped to address inequities in financing, health service access, and health outcomes. We trace the trajectory of health system reforms in Turkey, with a particular emphasis on 2003-13, which coincides with the Health Transformation Program (HTP). The HTP rapidly expanded health insurance coverage and access to health-care services for all citizens, especially the poorest population groups, to achieve universal health coverage. We analyse the contextual drivers that shaped the transformations in the health system, explore the design and implementation of the HTP, identify the factors that enabled its success, and investigate its effects. Our findings suggest that the HTP was instrumental in achieving universal health coverage to enhance equity substantially, and led to quantifiable and beneficial effects on all health system goals, with an improved level and distribution of health, greater fairness in financing with better financial protection, and notably increased user satisfaction. After the HTP, five health insurance schemes were consolidated to create a unified General Health Insurance scheme with harmonised and expanded benefits. Insurance coverage for the poorest population groups in Turkey increased from 2·4 million people in 2003, to 10·2 million in 2011. Health service access increased across the country-in particular, access and use of key maternal and child health services improved to help to greatly reduce the maternal mortality ratio, and under-5, infant, and neonatal mortality, especially in socioeconomically disadvantaged groups. Several factors helped to achieve universal health coverage and improve outcomes. These factors include economic growth, political stability, a comprehensive transformation strategy led by a transformation team, rapid policy translation, flexible implementation with

  10. Improved work zone design guidelines and enhanced model of travel delays in work zones : Phase I, portability and scalability of interarrival and service time probability distribution functions for different locations in Ohio and the establishment of impr

    Science.gov (United States)

    2006-01-01

    The project focuses on two major issues: the improvement of current work zone design practices and an analysis of vehicle interarrival time (IAT) and speed distributions for the development of a digital computer simulation model for queues and t...

  11. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the

  12. A Two-Phase Coverage-Enhancing Algorithm for Hybrid Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Qingguo Zhang

    2017-01-01

    Full Text Available Providing field coverage is a key task in many sensor network applications. In certain scenarios, the sensor field may have coverage holes due to random initial deployment of sensors; thus, the desired level of coverage cannot be achieved. A hybrid wireless sensor network is a cost-effective solution to this problem, which is achieved by repositioning a portion of the mobile sensors in the network to meet the network coverage requirement. This paper investigates how to redeploy mobile sensor nodes to improve network coverage in hybrid wireless sensor networks. We propose a two-phase coverage-enhancing algorithm for hybrid wireless sensor networks. In phase one, we use a differential evolution algorithm to compute candidate target positions for the mobile sensor nodes that could potentially improve coverage. In the second phase, we use an optimization scheme on the candidate target positions calculated in phase one to reduce the accumulated potential moving distance of mobile sensors, such that the exact mobile sensor nodes that need to be moved as well as their final target positions can be determined. Experimental results show that the proposed algorithm provided significant improvement in terms of area coverage rate, average moving distance, area coverage–distance rate and the number of moved mobile sensors, when compared with other approaches.
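    As a rough illustration of the objective such a two-phase scheme optimizes, the sketch below estimates the area coverage rate of a candidate deployment by Monte Carlo sampling; the field size, sensing radius, and random deployment are illustrative assumptions, and the differential evolution and distance-minimization phases of the cited algorithm are not reproduced.

```python
# Minimal sketch (not the paper's implementation): estimating the area coverage
# rate of a candidate sensor deployment by Monte Carlo sampling. Field size,
# sensing radius, and sensor positions are illustrative assumptions.
import math
import random

def coverage_rate(sensors, radius, field=(100.0, 100.0), samples=20000):
    """Fraction of uniformly sampled field points within `radius` of any sensor."""
    covered = 0
    for _ in range(samples):
        x, y = random.uniform(0, field[0]), random.uniform(0, field[1])
        if any(math.hypot(x - sx, y - sy) <= radius for sx, sy in sensors):
            covered += 1
    return covered / samples

if __name__ == "__main__":
    random.seed(1)
    deployment = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]
    print(f"estimated coverage rate: {coverage_rate(deployment, radius=12.0):.3f}")
```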

  13. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  14. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  15. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  16. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  17. Review of Literature for Model Assisted Probability of Detection

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Susan L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lareau, John P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Anderson, Michael T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-09-30

    This is a draft technical letter report for an NRC client documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components to improve field NDE performance estimations.

  18. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  19. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  20. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  1. Achieving Crossed Strong Barrier Coverage in Wireless Sensor Network.

    Science.gov (United States)

    Han, Ruisong; Yang, Wei; Zhang, Li

    2018-02-10

    Barrier coverage has been widely used to detect intrusions in wireless sensor networks (WSNs). It can fulfill the monitoring task while extending the lifetime of the network. Though barrier coverage in WSNs has been intensively studied in recent years, previous research failed to consider the problem of intrusion in transversal directions. If an intruder knows the deployment configuration of sensor nodes, then there is a high probability that it may traverse the whole target region from particular directions, without being detected. In this paper, we introduce the concept of crossed barrier coverage that can overcome this defect. We prove that the problem of finding the maximum number of crossed barriers is NP-hard and integer linear programming (ILP) is used to formulate the optimization problem. The branch-and-bound algorithm is adopted to determine the maximum number of crossed barriers. In addition, we also propose a multi-round shortest path algorithm (MSPA) to solve the optimization problem, which works heuristically to guarantee efficiency while maintaining near-optimal solutions. Several conventional algorithms for finding the maximum number of disjoint strong barriers are also modified to solve the crossed barrier problem and for the purpose of comparison. Extensive simulation studies demonstrate the effectiveness of MSPA.
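    The transversal-intrusion weakness that motivates crossed barriers can be illustrated with a toy check: under the assumptions of disk sensing and straight vertical crossing paths over a rectangular strip, the sketch below tests whether a random deployment leaves an undetected crossing line. It is not the paper's ILP or MSPA formulation.

```python
# Toy check (illustrative assumptions, not the paper's algorithm): does a randomly
# deployed strip of disk sensors leave a straight vertical crossing path that stays
# out of every sensor's range, i.e. does barrier coverage fail against an informed
# intruder crossing the strip from top to bottom?
import random

def has_undetected_vertical_path(sensor_xs, field_width, radius):
    """True if some x in [0, field_width] is farther than `radius` from every sensor."""
    intervals = sorted((x - radius, x + radius) for x in sensor_xs)
    reach = 0.0  # rightmost x covered so far, sweeping from the left border
    for lo, hi in intervals:
        if lo > reach:            # uncovered gap before this interval
            return True
        reach = max(reach, hi)
        if reach >= field_width:  # the whole width is already blocked
            return False
    return reach < field_width    # uncovered gap at the right border

if __name__ == "__main__":
    random.seed(0)
    xs = [random.uniform(0.0, 100.0) for _ in range(25)]
    print("undetected crossing possible:", has_undetected_vertical_path(xs, 100.0, 5.0))
```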

  2. Sky coverage modeling for the whole sky for laser guide star multiconjugate adaptive optics.

    Science.gov (United States)

    Wang, Lianqi; Andersen, David; Ellerbroek, Brent

    2012-06-01

    The scientific productivity of laser guide star adaptive optics systems strongly depends on the sky coverage, which describes the probability of finding natural guide stars for the tip/tilt wavefront sensor(s) to achieve a certain performance. Knowledge of the sky coverage is also important for astronomers planning their observations. In this paper, we present an efficient method to compute the sky coverage for the laser guide star multiconjugate adaptive optics system, the Narrow Field Infrared Adaptive Optics System (NFIRAOS), being designed for the Thirty Meter Telescope project. We show that NFIRAOS can achieve more than 70% sky coverage over most of the accessible sky with the requirement of 191 nm total rms wavefront error.

  3. Measuring coverage in MNCH: challenges and opportunities in the selection of coverage indicators for global monitoring.

    Directory of Open Access Journals (Sweden)

    Jennifer Harris Requejo

    Full Text Available Global monitoring of intervention coverage is a cornerstone of international efforts to improve reproductive, maternal, newborn, and child health. In this review, we examine the process and implications of selecting a core set of coverage indicators for global monitoring, using as examples the processes used by the Countdown to 2015 for Maternal, Newborn and Child Survival and the Commission on Accountability for Women's and Children's Health. We describe how the generation of data for global monitoring involves five iterative steps: development of standard indicator definitions and measurement approaches to ensure comparability across countries; collection of high-quality data at the country level; compilation of country data at the global level; organization of global databases; and rounds of data quality checking. Regular and rigorous technical review processes that involve high-level decision makers and experts familiar with indicator measurement are needed to maximize uptake and to ensure that indicators used for global monitoring are selected on the basis of available evidence of intervention effectiveness, feasibility of measurement, and data availability as well as programmatic relevance. Experience from recent initiatives illustrates the challenges of striking this balance as well as strategies for reducing the tensions inherent in the indicator selection process. We conclude that more attention and continued investment need to be directed to global monitoring, to support both the process of global database development and the selection of sets of coverage indicators to promote accountability. The stakes are high, because these indicators can drive policy and program development at the country and global level, and ultimately impact the health of women and children and the communities where they live.

  4. Impact of state Medicaid coverage on utilization of inpatient rehabilitation facilities among patients with stroke.

    Science.gov (United States)

    Skolarus, Lesli E; Burke, James F; Morgenstern, Lewis B; Meurer, William J; Adelman, Eric E; Kerber, Kevin A; Callaghan, Brian C; Lisabeth, Lynda D

    2014-08-01

    Poststroke rehabilitation is associated with improved outcomes. Medicaid coverage of inpatient rehabilitation facility (IRF) admissions varies by state. We explored the role of state Medicaid IRF coverage on IRF utilization among patients with stroke. Working age ischemic stroke patients with Medicaid were identified from the 2010 Nationwide Inpatient Sample. Medicaid coverage of IRFs (yes versus no) was ascertained. Primary outcome was discharge to IRF (versus other discharge destinations). We fit a logistic regression model that included patient demographics, Medicaid coverage, comorbidities, length of stay, tissue-type plasminogen activator use, state Medicaid IRF coverage, and the interaction between patient Medicaid status and state Medicaid IRF coverage while accounting for hospital clustering. Medicaid did not cover IRFs in 4 (TN, TX, SC, WV) of 42 states. The impact of state Medicaid IRF coverage was limited to Medicaid stroke patients (significant interaction between patient Medicaid status and state coverage). Compared with Medicaid stroke patients in states with Medicaid IRF coverage, Medicaid stroke patients hospitalized in states without Medicaid IRF coverage were less likely to be discharged to an IRF: 11.6% (95% confidence interval, 8.5%-14.7%) versus 19.5% (95% confidence interval, 18.3%-20.8%). Given the increasing stroke incidence among the working age and Medicaid expansion under the Affordable Care Act, careful attention to state Medicaid policy for poststroke rehabilitation and analysis of its effects on stroke outcome disparities are warranted. © 2014 American Heart Association, Inc.

  5. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  6. CDMA coverage under mobile heterogeneous network load

    NARCIS (Netherlands)

    Saban, D.; van den Berg, Hans Leo; Boucherie, Richardus J.; Endrayanto, A.I.

    2002-01-01

    We analytically investigate coverage (determined by the uplink) under non-homogeneous and moving traffic load of third generation UMTS mobile networks. In particular, for different call assignment policies, we investigate cell breathing and the movement of the coverage gap occurring between cells

  7. 5 CFR 531.402 - Employee coverage.

    Science.gov (United States)

    2010-01-01

    GENERAL SCHEDULE Within-Grade Increases § 531.402 Employee coverage. (a) Except as provided in paragraph (b) of this section, this subpart applies to employees who— (1) Are classified and paid under the...

  8. 22 CFR 226.31 - Insurance coverage.

    Science.gov (United States)

    2010-04-01

    AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Post-award Requirements Property Standards § 226.31 Insurance coverage. Recipients...

  9. 14 CFR 1260.131 - Insurance coverage.

    Science.gov (United States)

    2010-01-01

    ... coverage. Recipients shall, at a minimum, provide the equivalent insurance coverage for real property and equipment acquired with Federal funds as provided for property owned by the recipient. Federally-owned property need not be insured unless required by the terms and conditions of the award. ...

  10. 2 CFR 215.31 - Insurance coverage.

    Science.gov (United States)

    2010-01-01

    ... Insurance coverage. Recipients shall, at a minimum, provide the equivalent insurance coverage for real property and equipment acquired with Federal funds as provided to property owned by the recipient. Federally-owned property need not be insured unless required by the terms and conditions of the award. ...

  11. 36 CFR 1210.31 - Insurance coverage.

    Science.gov (United States)

    2010-07-01

    § 1210.31 Insurance coverage. Recipients shall, at a minimum, provide the equivalent insurance coverage for real property and equipment acquired with NHPRC funds as provided to property owned by the recipient. Federally-owned property need not be insured unless required by the terms and conditions of the award. ...

  12. Coverage matters: insurance and health care

    National Research Council Canada - National Science Library

    Board on Health Care Services Staff; Institute of Medicine Staff; Institute of Medicine; National Academy of Sciences

    2001-01-01

    ...? How does the system of insurance coverage in the U.S. operate, and where does it fail? The first of six Institute of Medicine reports that will examine in detail the consequences of having a large uninsured population, Coverage Matters...

  13. Legislating health care coverage for the unemployed.

    Science.gov (United States)

    Palley, H A; Feldman, G; Gallner, I; Tysor, M

    1985-01-01

    Because the unemployed and their families are often likely to develop stress-related health problems, ensuring them access to health care is a public health issue. Congressional efforts thus far to legislate health coverage for the unemployed have proposed a system that recognizes people's basic need for coverage but has several limitations.

  14. Quantifying the impact of cross coverage on physician's workload and performance in radiation oncology.

    Science.gov (United States)

    Mosaly, Prithima R; Mazur, Lukasz M; Jones, Ellen L; Hoyle, Lesley; Zagar, Timothy; Chera, Bhishamjit S; Marks, Lawrence B

    2013-01-01

    To quantitatively assess the difference in workload and performance of radiation oncology physicians during radiation therapy treatment planning tasks under the conditions of "cross coverage" versus planning a patient with whom they were familiar. Eight physicians (3 experienced faculty physicians and 5 physician residents) performed 2 cases. The first case represented a "cross-coverage" scenario where the physicians had no prior information about the case to be planned. The second exposure represented a "regular-coverage" scenario where the physicians were familiar with the patient case to be planned. Each case involved 3 tasks to be completed systematically. Workload was assessed both subjectively (perceived) using National Aeronautics and Space Administration-Task Load Index (NASA-TLX), and objectively (physiological) throughout the task using eye data (via monitoring pupil size and blink rate). Performance of each task and the case was measured using completion time. Subjective willingness to approve or disapprove the generated plan was obtained after completion of the case only. Forty-eight perceived and 48 physiological workload assessments were obtained. Overall, results revealed a significant increase in perceived workload (high NASA-TLX score) and decrease in performance (longer completion time and reduced approval rate) during cross coverage. There were nonsignificant increases in pupil diameter and decreases in the blink rate during cross-coverage versus regular-coverage scenario. In both cross-coverage and regular-coverage scenarios the level of experience did not affect workload and performance. The cross-coverage scenario significantly increases perceived workload and degrades performance versus regular coverage. Hence, to improve patient safety, efforts must be made to develop policies, standard operating procedures, and usability improvements to electronic medical record and treatment planning systems for "easier" information processing to deal with

  15. Impact of conflict on infant immunisation coverage in Afghanistan: a countrywide study 2000–2003

    Directory of Open Access Journals (Sweden)

    Seino Kaoruko

    2007-06-01

    Full Text Available Abstract Background Infant immunisation is an effective public health intervention to reduce the morbidity and mortality of vaccine preventable diseases. However, some developing countries fail to achieve desirable vaccination coverage; Afghanistan is one such country. The present study was performed to evaluate the progress and variation in infant immunisation coverage by district and region in Afghanistan and to assess the impact of conflict and resource availability on immunisation coverage. Results This study analysed reports of infant immunisation from 331 districts across 7 regions of Afghanistan between 2000 and 2003. Geographic information system (GIS) analysis was used to visualise the distribution of immunisation coverage in districts and to identify geographic inequalities in the process of improvement of infant immunisation coverage. The number of districts reporting immunisation coverage increased substantially during the four years of the study. Progress in Bacillus Calmette-Guerin (BCG) immunisation coverage was observed in all 7 regions, although satisfactory coverage of 80% remained unequally distributed. Progress in the third dose of Diphtheria-Pertussis-Tetanus (DPT3) immunisation differed among regions, in addition to the unequal distribution of immunisation coverage in 2000. The results of multivariate logistic regression analysis indicated a significant negative association between lack of security in the region and achievement of 80% coverage of immunisation regardless of available resources for immunisation, while resource availability showed no relation to immunisation coverage. Conclusion Although progress was observed in all 7 regions, geographic inequalities in these improvements remain a cause for concern. The results of the present study indicated that security within a country is an important factor affecting the delivery of immunisation services.

  16. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  17. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  18. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  19. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts with a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values.
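    A minimal sketch of the bounding idea, assuming only two events and the classical Fréchet limits (no dependence information): interval inputs propagate to interval outputs, and pinching one input to a point value tightens the result, which is the essence of the sensitivity analysis described above. This is not the cited authors' software.

```python
# Minimal sketch of probability bounding: Frechet bounds on P(A and B) with no
# dependence assumption, given interval probabilities for A and B, and the effect
# of "pinching" one input to a point value. All numbers are illustrative.

def frechet_and(p, q):
    """Bounds on P(A and B) from bounds p=(p_lo, p_hi), q=(q_lo, q_hi), any dependence."""
    (p_lo, p_hi), (q_lo, q_hi) = p, q
    return (max(0.0, p_lo + q_lo - 1.0), min(p_hi, q_hi))

if __name__ == "__main__":
    a, b = (0.6, 0.8), (0.7, 0.9)
    print("baseline bounds:", frechet_and(a, b))            # roughly (0.3, 0.8)
    print("A pinched to 0.7:", frechet_and((0.7, 0.7), b))  # roughly (0.4, 0.7)
```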

  20. Directional Bias and Pheromone for Discovery and Coverage on Networks

    Energy Technology Data Exchange (ETDEWEB)

    Fink, Glenn A.; Berenhaut, Kenneth S.; Oehmen, Christopher S.

    2012-09-11

    Natural multi-agent systems often rely on “correlated random walks” (random walks that are biased toward a current heading) to distribute their agents over a space (e.g., for foraging, search, etc.). Our contribution involves creation of a new movement and pheromone model that applies the concept of heading bias in random walks to a multi-agent, digital-ants system designed for cyber-security monitoring. We examine the relative performance effects of both pheromone and heading bias on speed of discovery of a target and search-area coverage in a two-dimensional network layout. We found that heading bias was unexpectedly helpful in reducing search time and that it was more influential than pheromone for improving coverage. We conclude that while pheromone is very important for rapid discovery, heading bias can also greatly improve both performance metrics.
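    A minimal sketch of a correlated random walk, assuming Gaussian turning noise and a unit-cell visit count as a crude coverage metric; it is not the cited digital-ants model, but it shows how a smaller turning spread (stronger heading bias) changes the area explored.

```python
# Minimal sketch (not the cited digital-ants system): a correlated random walk in
# the plane, where each step's heading is the previous heading plus Gaussian noise,
# and a crude coverage metric counting distinct unit cells visited.
import math
import random

def correlated_walk(steps, turn_sd, step_len=1.0, seed=0):
    rng = random.Random(seed)
    x = y = 0.0
    heading = rng.uniform(0.0, 2.0 * math.pi)
    visited = {(0, 0)}
    for _ in range(steps):
        heading += rng.gauss(0.0, turn_sd)   # small turn_sd = strong heading bias
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        visited.add((int(x), int(y)))
    return len(visited)

if __name__ == "__main__":
    for sd in (0.1, 1.0, 3.0):
        print(f"turn sd {sd}: cells visited = {correlated_walk(2000, sd)}")
```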

  1. Constructing and Using Broad-coverage Lexical Resource for Enhancing Morphological Analysis of Arabic

    OpenAIRE

    Sawalha, M.; Atwell, E.S.

    2010-01-01

    Broad-coverage language resources which provide prior linguistic knowledge must improve the accuracy and the performance of NLP applications. We are constructing a broad-coverage lexical resource to improve the accuracy of morphological analyzers and part-of-speech taggers of Arabic text. Over the past 1200 years, many different kinds of Arabic language lexicons were constructed; these lexicons are different in ordering, size and aim or goal of construction. We collected 23 machine-readable l...

  2. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
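    A toy sketch of the dominance check, under strong simplifying assumptions (a three-component series system, interval failure probabilities, and a fixed multiplicative effect for each improvement); because series-system reliability is multilinear in the component probabilities, comparing two portfolios at the vertices of the probability box decides pointwise dominance over the whole box. This is not the authors' exact or approximate method.

```python
# Assumption-laden toy, not the cited method: checking portfolio dominance for a
# 3-component series system whose component failure probabilities are only known
# to lie in intervals. The reliability difference is multilinear in the component
# probabilities, so its sign over the whole box is decided at the box vertices.
from itertools import product

INTERVALS = [(0.05, 0.10), (0.10, 0.20), (0.02, 0.08)]  # illustrative bounds
REDUCTION = 0.5   # assumed effect of an improvement on a component's failure prob.

def reliability(p, portfolio):
    """Series-system reliability after applying improvements in `portfolio`."""
    r = 1.0
    for i, pi in enumerate(p):
        r *= 1.0 - (pi * REDUCTION if i in portfolio else pi)
    return r

def dominates(a, b):
    """True if portfolio `a` is at least as good as `b` at every vertex, better at one."""
    diffs = [reliability(p, a) - reliability(p, b)
             for p in product(*INTERVALS)]          # all 2^3 vertices of the box
    return all(d >= 0 for d in diffs) and any(d > 0 for d in diffs)

if __name__ == "__main__":
    print(dominates({1}, {2}))   # improving the weakest component vs. the strongest
    print(dominates({2}, {1}))
```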

  3. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  4. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
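    As a quick illustration of the kind of inter-distribution relation such a resource catalogues (using scipy.stats rather than the Distributome webapps themselves), an exponential distribution coincides with a gamma distribution whose shape parameter is 1:

```python
# Quick illustration (scipy.stats, not the Distributome API): Exponential(scale)
# is the Gamma distribution with shape parameter a = 1, so their densities match.
import numpy as np
from scipy import stats

x = np.linspace(0.0, 10.0, 5)
expon_pdf = stats.expon(scale=2.0).pdf(x)
gamma_pdf = stats.gamma(a=1.0, scale=2.0).pdf(x)
print(np.allclose(expon_pdf, gamma_pdf))   # True: the two densities coincide
```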

  5. Network television news coverage of environmental risks

    International Nuclear Information System (INIS)

    Greenberg, M.R.; Sandman, P.M.; Sachsman, D.V.; Salomone, K.L.

    1989-01-01

    Despite the criticisms that surround television coverage of environmental risk, there have been relatively few attempts to measure what and whom television shows. Most research has focused on a few weeks of coverage of major stories like the gas leak at Bhopal, the Three Mile Island nuclear accident, or the Mount St. Helens eruption. To advance the research into television coverage of environmental risk, an analysis has been made of all environmental risk coverage by the network nightly news broadcasts for a period of more than two years. Researchers have analyzed all environmental risk coverage (564 stories in 26 months) presented on ABC, CBS, and NBC's evening news broadcasts from January 1984 through February 1986. The quantitative information from the 564 stories was balanced by a more qualitative analysis of the television coverage of two case studies: the dioxin contamination in Times Beach, Missouri, and the suspected methyl isocyanate emissions from the Union Carbide plant in Institute, West Virginia. Both qualitative and quantitative data contributed to the analysis of the role played by experts and environmental advocacy sources in coverage of environmental risk and to the suggestions for increasing that role.

  6. Insurance Coverage Policies for Personalized Medicine

    Directory of Open Access Journals (Sweden)

    Andrew Hresko

    2012-10-01

    Full Text Available Adoption of personalized medicine in practice has been slow, in part due to the lack of evidence of clinical benefit provided by these technologies. Coverage by insurers is a critical step in achieving widespread adoption of personalized medicine. Insurers consider a variety of factors when formulating medical coverage policies for personalized medicine, including the overall strength of evidence for a test, availability of clinical guidelines and health technology assessments by independent organizations. In this study, we reviewed coverage policies of the largest U.S. insurers for genomic (disease-related) and pharmacogenetic (PGx) tests to determine the extent that these tests were covered and the evidence basis for the coverage decisions. We identified 41 coverage policies for 49 unique tests: 22 tests for disease diagnosis, prognosis and risk and 27 PGx tests. Fifty percent (or less) of the tests reviewed were covered by insurers. Lack of evidence of clinical utility appears to be a major factor in decisions of non-coverage. The inclusion of PGx information in drug package inserts appears to be a common theme of PGx tests that are covered. This analysis highlights the variability of coverage determinations and factors considered, suggesting that the adoption of personalized medicine will be affected by numerous factors, but will continue to be slowed due to lack of demonstrated clinical benefit.

  7. Deep coverage of the beer proteome.

    Science.gov (United States)

    Grochalová, Martina; Konečná, Hana; Stejskal, Karel; Potěšil, David; Fridrichová, Danuše; Srbová, Eva; Ornerová, Kateřina; Zdráhal, Zbyněk

    2017-06-06

    We adopted an approach based on peptide immobilized pH gradient-isoelectric focusing (IPG-IEF) separation, coupled with LC-MS/MS, in order to maximize coverage of the beer proteome. A lager beer brewed using traditional Czech technology was degassed, desalted and digested. Tryptic peptides were separated by isoelectric focusing on an immobilized pH gradient strip and, after separation, the gel strip was divided into seven equally sized parts. Peptides extracted from gel fractions were analyzed by LC-MS/MS. This approach resulted in a three-fold increase in the number of proteins identified (over 1700) when compared to analysis of unfractionated beer processed by a filter-aided sample preparation (FASP). Over 1900 protein groups (PGs) in total were identified by both approaches. The study significantly extends knowledge about the beer proteome and demonstrates its complexity. Detailed knowledge of the protein content, especially gluten proteins, will enhance the evaluation of potential health risks related to beer consumption (coeliac disease) and will contribute to improving beer quality. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Health insurance coverage and its impact on medical cost: observations from the floating population in China.

    Directory of Open Access Journals (Sweden)

    Yinjun Zhao

    Full Text Available China has the world's largest floating (migrant) population, which has characteristics largely different from the rest of the population. Our goal is to study health insurance coverage and its impact on medical cost for this population. A telephone survey was conducted in 2012. 644 subjects were surveyed. Univariate and multivariate analysis were conducted on insurance coverage and medical cost. 82.2% of the surveyed subjects were covered by basic insurance at hometowns with hukou or at residences. Subjects' characteristics including age, education, occupation, and presence of chronic diseases were associated with insurance coverage. After controlling for confounders, insurance coverage was not significantly associated with gross or out-of-pocket medical cost. For the floating population, health insurance coverage needs to be improved. Policy interventions are needed so that health insurance can have a more effective protective effect on cost.

  9. A Novel Nonlinear Multitarget k-Degree Coverage Preservation Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zeyu Sun

    2016-01-01

    Full Text Available Due to the existence of a large number of redundant data in the process of covering multiple targets, the effective coverage of the monitored region decreases, causing the network to consume more energy. To solve this problem, this paper proposes a multitarget k-degree coverage preservation protocol. Firstly, the affiliation between the sensor nodes and target nodes is established in the network model, and the method used to calculate the coverage expectation value of the monitored region is put forward. Secondly, with respect to network energy conversion, scheduling mechanisms are used on the sensor nodes to balance the network energy and achieve different levels of network coverage quality through energy conversion between different nodes. Finally, simulation results show that NMCP can improve the network lifetime by effectively reducing the number of active nodes to meet certain coverage requirements.

  10. Health Insurance Coverage and Its Impact on Medical Cost: Observations from the Floating Population in China

    Science.gov (United States)

    Zhao, Yinjun; Kang, Bowei; Liu, Yawen; Li, Yichong; Shi, Guoqing; Shen, Tao; Jiang, Yong; Zhang, Mei; Zhou, Maigeng; Wang, Limin

    2014-01-01

    Background China has the world's largest floating (migrant) population, which has characteristics largely different from the rest of the population. Our goal is to study health insurance coverage and its impact on medical cost for this population. Methods A telephone survey was conducted in 2012. 644 subjects were surveyed. Univariate and multivariate analysis were conducted on insurance coverage and medical cost. Results 82.2% of the surveyed subjects were covered by basic insurance at hometowns with hukou or at residences. Subjects' characteristics including age, education, occupation, and presence of chronic diseases were associated with insurance coverage. After controlling for confounders, insurance coverage was not significantly associated with gross or out-of-pocket medical cost. Conclusion For the floating population, health insurance coverage needs to be improved. Policy interventions are needed so that health insurance can have a more effective protective effect on cost. PMID:25386914

  11. Potential Uses of Administrative Records for Triple System Modeling for Estimation of Census Coverage Error in 2020

    Directory of Open Access Journals (Sweden)

    Griffin Richard A.

    2014-06-01

    Full Text Available Heterogeneity in capture probabilities is known to produce bias in the dual system estimates that have been used to estimate census coverage in U.S. Censuses since 1980. Triple system estimation using an administrative records list as a third source along with the census and coverage measurement survey has the potential to produce estimates with less bias. This is particularly important for hard-to-reach populations.
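    For orientation, the dual system estimator referred to above is the classical Lincoln-Petersen capture-recapture formula; the sketch below uses illustrative counts and assumes independent captures with homogeneous capture probabilities, exactly the assumption whose failure motivates the triple system approach.

```python
# Minimal sketch: the classical dual-system (Lincoln-Petersen) estimator, with
# illustrative counts. The triple-system extension adds an administrative-records
# list as a third capture source; it is not reproduced here.

def dual_system_estimate(census_count, survey_count, matched):
    """Estimated true population size from two capture sources, assuming
    independent captures and homogeneous capture probabilities."""
    return census_count * survey_count / matched

if __name__ == "__main__":
    # Illustrative numbers: 9,500 counted in the census, 1,000 in the coverage
    # survey, 900 of the survey people matched to a census record.
    print(f"estimated population: {dual_system_estimate(9500, 1000, 900):.0f}")
```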

  12. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  13. [Medical coverage of a road bicycle race].

    Science.gov (United States)

    Reifferscheid, Florian; Stuhr, Markus; Harding, Ulf; Schüler, Christine; Thoms, Jürgen; Püschel, Klaus; Kappus, Stefan

    2010-07-01

    Major sport events require adequate expertise and experience concerning medical coverage and support. Medical and ambulance services need to cover both participants and spectators. Likewise, residents at the venue need to be provided for. Concepts have to include the possibility of major incidents related to the event. Using the example of the Hamburg Cyclassics, a road bicycle race and major event for professional and amateur cyclists, this article describes the medical coverage, number of patients, types of injuries and emergencies. Objectives regarding the planning of future events and essential medical coverage are consequently discussed. (c) Georg Thieme Verlag Stuttgart-New York.

  14. 42 CFR 440.330 - Benchmark health benefits coverage.

    Science.gov (United States)

    2010-10-01

    ...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A... coverage. Health benefits coverage that is offered and generally available to State employees in the State...

  15. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  16. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  17. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  18. Comparative analysis of the machine repair Problem with imperfect coverage and service pressure condition

    Science.gov (United States)

    Wang, Kuo-Hsiung; Liou, Cheng-Dar; Lin, Yu-Hsueh

    2013-02-01

    We analyze the warm-standby M/M/R machine repair problem with multiple imperfect coverage, which involves the service pressure condition. When an operating machine (or warm standby) fails, it may be immediately detected, located, and replaced with a coverage probability c by a standby if one is available. A recursive method is used to develop the steady-state analytic solutions. The total expected profit function per unit time is derived to determine the joint optimal values at the maximum profit. We utilize the direct search method to measure the various characteristics of the profit function, followed by the Quasi-Newton method to search for the optimal solutions.
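
    A minimal sketch of the kind of recursive steady-state computation such Markovian repair models use, for a generic birth-death chain with state-dependent failure and repair rates; the rate functions and the way the coverage probability c enters are illustrative placeholders rather than the authors' warm-standby M/M/R formulation:

        # Illustrative steady-state recursion for a generic birth-death repair model.
        # n = number of failed machines; the rate functions below are placeholders,
        # not the paper's warm-standby M/M/R model with service pressure.

        def steady_state(birth, death, n_max):
            """Stationary distribution via the balance recursion
            p[n+1] = p[n] * birth(n) / death(n+1), followed by normalization."""
            p = [1.0]
            for n in range(n_max):
                p.append(p[-1] * birth(n) / death(n + 1))
            total = sum(p)
            return [x / total for x in p]

        M, S, R = 5, 2, 2                        # operating machines, warm standbys, repairmen
        lam, lam_s, mu, c = 0.1, 0.05, 1.0, 0.9  # failure rates, repair rate, coverage probability

        def birth(n):
            # Crude way to fold the coverage probability c into the failure rate;
            # the paper treats uncovered (non-switched) failures in more detail.
            operating = min(M, M + S - n)
            standbys = max(0, S - n)
            return c * (operating * lam + standbys * lam_s)

        def death(n):
            return min(n, R) * mu                # at most R repairmen busy

        probs = steady_state(birth, death, M + S)
        print("P(all units failed) =", probs[-1])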

  19. Stochastic persistence and stationary distribution in an SIS epidemic model with media coverage

    Science.gov (United States)

    Guo, Wenjuan; Cai, Yongli; Zhang, Qimin; Wang, Weiming

    2018-02-01

    This paper studies an SIS epidemic model with media coverage, extending a general deterministic model to a stochastic differential equation with environmental fluctuation. Mathematically, we use the Markov semigroup theory to prove that the basic reproduction number R0s can be used to control the dynamics of the stochastic system. Epidemiologically, we show that environmental fluctuation can inhibit the occurrence of the disease, namely, in the case of disease persistence for the deterministic model, the disease still dies out with probability one for the stochastic model. So, to a great extent, the stochastic perturbation under media coverage affects the outbreak of the disease.

  20. The probable effect of integrated reporting on audit quality

    Directory of Open Access Journals (Sweden)

    Tamer A. El Nashar

    2016-06-01

    This paper examines a probable effect of integrated reporting on improving the audit quality of organizations. I relate the hypothesis of this paper to current trends in protecting economies, financial markets and societies. I predict an improvement in audit quality as a result of an estimated percentage of organizations relying on integrated reporting from an accountability perspective. I used a decision tree and a Bayes' theorem approach to predict the probabilities of a significant effect on improving audit quality. The overall result of this paper indicates that a significant probability of organizations relying on integrated reporting also predicts a significant improvement in audit quality.
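
    As a hedged illustration of the Bayes' theorem step described above, the toy update below computes the posterior probability of an audit-quality improvement given reliance on integrated reporting; the prior and likelihood values are invented for the example and are not the paper's estimates:

        # Toy Bayes'-theorem update: probability that audit quality improves (Q)
        # given that an organization relies on integrated reporting (R).
        # All numbers are illustrative assumptions, not the paper's estimates.

        p_q = 0.40              # prior P(Q): audit quality improves
        p_r_given_q = 0.80      # P(R | Q)
        p_r_given_not_q = 0.30  # P(R | not Q)

        p_r = p_r_given_q * p_q + p_r_given_not_q * (1 - p_q)  # law of total probability
        p_q_given_r = p_r_given_q * p_q / p_r                  # Bayes' theorem

        print(f"P(quality improvement | integrated reporting) = {p_q_given_r:.3f}")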

  1. Summary of DOD Acquisition Program Audit Coverage

    National Research Council Canada - National Science Library

    2001-01-01

    This report will provide the DoD audit community with information to support their planning efforts and provide management with information on the extent of audit coverage of DoD acquisition programs...

  2. NOAA Weather Radio - County Coverage by State

    Science.gov (United States)

  3. Media Coverage of Nuclear Energy after Fukushima

    International Nuclear Information System (INIS)

    Oltra, C.; Roman, P.; Prades, A.

    2013-01-01

    This report presents the main findings of a content analysis of printed media coverage of nuclear energy in Spain before and after the Fukushima accident. Our main objective is to understand the changes in the presentation of nuclear fission and nuclear fusion as a result of the accident in Japan. We specifically analyze the volume of coverage and thematic content in the media coverage for nuclear fusion from a sample of Spanish print articles in more than 20 newspapers from 2008 to 2012. We also analyze the media coverage of nuclear energy (fission) in three main Spanish newspapers one year before and one year after the accident. The results illustrate how the media contributed to the presentation of nuclear power in the months before and after the accident. This could have implications for the public understanding of nuclear power. (Author)

  5. 22 CFR 518.31 - Insurance coverage.

    Science.gov (United States)

    2010-04-01

    ... property owned by the recipient. Federally-owned property need not be insured unless required by the terms... Requirements Property Standards § 518.31 Insurance coverage. Recipients shall, at a minimum, provide the...

  6. 7 CFR 3019.31 - Insurance coverage.

    Science.gov (United States)

    2010-01-01

    ... recipient. Federally-owned property need not be insured unless required by the terms and conditions of the... Standards § 3019.31 Insurance coverage. Recipients shall, at a minimum, provide the equivalent insurance...

  7. 34 CFR 74.31 - Insurance coverage.

    Science.gov (United States)

    2010-07-01

    ... by the recipient. Federally-owned property need not be insured unless required by the terms and... Property Standards § 74.31 Insurance coverage. Recipients shall, at a minimum, provide the equivalent...

  8. 49 CFR 19.31 - Insurance coverage.

    Science.gov (United States)

    2010-10-01

    ... property owned by the recipient. Federally-owned property need not be insured unless required by the terms... Requirements Property Standards § 19.31 Insurance coverage. Recipients shall, at a minimum, provide the...

  9. 10 CFR 600.131 - Insurance coverage.

    Science.gov (United States)

    2010-01-01

    ... provided to property owned by the recipient. Federally-owned property need not be insured unless required... Nonprofit Organizations Post-Award Requirements § 600.131 Insurance coverage. Recipients shall, at a minimum...

  10. 20 CFR 435.31 - Insurance coverage.

    Science.gov (United States)

    2010-04-01

    ... funds as provided to property owned by the recipient. Federally-owned property need not be insured... ORGANIZATIONS Post-Award Requirements Property Standards § 435.31 Insurance coverage. Recipients must, at a...

  11. 28 CFR 70.31 - Insurance coverage.

    Science.gov (United States)

    2010-07-01

    ... with Federal funds as provided to property owned by the recipient. Federally-owned property need not be...-PROFIT ORGANIZATIONS Post-Award Requirements Property Standards § 70.31 Insurance coverage. Recipients...

  12. Coverage for SCS Pre-1941 Aerial Photography

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — This shapefile was generated by the U.S. Bureau of Land Management (BLM) at the New Mexico State Office to show the coverage for the Pre-1941 aerial photography...

  13. Medicaid Coverage for Methadone Maintenance and Use of Opioid Agonist Therapy in Specialty Addiction Treatment.

    Science.gov (United States)

    Saloner, Brendan; Stoller, Kenneth B; Barry, Colleen L

    2016-06-01

    This study examined differences in opioid agonist therapy (OAT) utilization among Medicaid-enrolled adults receiving public-sector opioid use disorder treatment in states with Medicaid coverage of methadone maintenance, states with block grant funding only, and states without public coverage of methadone. Person-level treatment admission data from 36 states, which included information on reason for treatment and use of OAT, were linked to state-level Medicaid policies collected in a 50-state survey. Probabilities of OAT use among Medicaid enrollees in opioid addiction treatment were calculated, with adjustment for demographic characteristics and patterns of substance use. In adjusted analysis, 45.0% of Medicaid-enrolled individuals in opioid addiction treatment in states with Medicaid coverage for methadone maintenance used OAT, compared with 30.1% in states with block grant coverage only and 17.0% in states with no coverage. Differences were widest in nonintensive outpatient settings. Medicaid methadone maintenance coverage is critical for encouraging OAT among individuals with opioid use disorders.

  14. Qualification of the calculational methods of the fluence in the pressurised water reactors. Improvement of the cross sections treatment by the probability table method

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, S H

    1994-01-01

    It is indispensable to know the fluence on the nuclear reactor pressure vessel. The cross sections and their treatment play an important role in this problem. In this study, two "benchmarks" have been interpreted with the Monte Carlo transport program TRIPOLI to qualify the calculational method and the cross sections used in the calculations. For the treatment of the cross sections, the multigroup method is usually used, but it presents some problems, such as the difficulty of choosing the weighting function and the need for a large number of energy groups to represent the fluctuations of the cross sections well. In this thesis, we propose a new method, called the "Probability Table Method", to treat the neutron cross sections. For the qualification, a program simulating neutron transport by the Monte Carlo method in one dimension has been written; the comparison of the multigroup results and the probability table results shows the advantages of this new method. The probability table has also been introduced into the TRIPOLI program; the calculational results for the iron deep-penetration benchmark have been improved in comparison with the experimental results. It is therefore of interest to use this new method in shielding and neutronics calculations. (author). 42 refs., 109 figs., 36 tabs.
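
    A minimal sketch of the idea behind a probability-table treatment: within an energy group the cross section is represented by a few bands, each with a probability and a representative value, and the Monte Carlo code samples a band instead of using a single group-averaged value; the table below is invented and is not the thesis' implementation:

        import random

        # One energy group represented by a probability table of (probability, sigma) bands.
        # The values are invented for illustration, not evaluated nuclear data.
        prob_table = [(0.60, 2.0), (0.30, 10.0), (0.10, 80.0)]  # barns

        def sample_sigma(table, rng=random):
            """Sample a cross-section band from the table (band probabilities sum to 1)."""
            xi = rng.random()
            cum = 0.0
            for p, sigma in table:
                cum += p
                if xi <= cum:
                    return sigma
            return table[-1][1]

        # The table preserves the group-average cross section while retaining its
        # spread, which a single multigroup value cannot do.
        mean_sigma = sum(p * s for p, s in prob_table)
        samples = [sample_sigma(prob_table) for _ in range(100000)]
        print(mean_sigma, sum(samples) / len(samples))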

  15. Length and coverage of inhibitory decision rules

    KAUST Repository

    Alsolami, Fawaz

    2012-01-01

    The authors present algorithms for optimization of inhibitory rules relative to length and coverage. Inhibitory rules have a relation "attribute ≠ value" on the right-hand side. The considered algorithms are based on extensions of dynamic programming. The paper also contains a comparison of the length and coverage of inhibitory rules constructed by a greedy algorithm and by the dynamic programming algorithm. © 2012 Springer-Verlag.

  16. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180, where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  17. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
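
    A minimal sketch of the post-processing step described here: given many equally likely simulated realizations of contamination, the probability map is simply the fraction of realizations exceeding a threshold at each grid cell (the random arrays below are stand-ins for actual geostatistical simulations):

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in for N equally probable geostatistical realizations on a grid
        # (a real study would use conditional simulations honoring the samples).
        n_real, ny, nx = 200, 50, 50
        realizations = rng.lognormal(mean=2.0, sigma=0.8, size=(n_real, ny, nx))

        threshold = 35.0  # e.g. a remediation action level (illustrative units)

        # Probability-of-exceedance map: fraction of realizations above the threshold.
        prob_exceed = (realizations > threshold).mean(axis=0)

        # Expected-magnitude map for comparison.
        expected = realizations.mean(axis=0)

        print("maximum exceedance probability on the grid:", prob_exceed.max())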

  18. A new way to measure the world's protected area coverage.

    Science.gov (United States)

    Barr, Lissa M; Pressey, Robert L; Fuller, Richard A; Segan, Daniel B; McDonald-Madden, Eve; Possingham, Hugh P

    2011-01-01

    Protected areas are effective at stopping biodiversity loss, but their placement is constrained by the needs of people. Consequently protected areas are often biased toward areas that are unattractive for other human uses. Current reporting metrics that emphasise the total area protected do not account for this bias. To address this problem we propose that the distribution of protected areas be evaluated with an economic metric used to quantify inequality in income--the Gini coefficient. Using a modified version of this measure we discover that 73% of countries have inequitably protected their biodiversity and that common measures of protected area coverage do not adequately reveal this bias. Used in combination with total percentage protection, the Gini coefficient will improve the effectiveness of reporting on the growth of protected area coverage, paving the way for better representation of the world's biodiversity.
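
    A minimal sketch of a standard Gini coefficient computed from the mean absolute difference; the authors use a modified version, and the protection values below are invented for illustration:

        import numpy as np

        def gini(x):
            """Gini coefficient via the mean absolute difference:
            G = sum_ij |x_i - x_j| / (2 * n**2 * mean(x))."""
            x = np.asarray(x, dtype=float)
            if x.mean() == 0:
                return 0.0
            diff_sum = np.abs(x[:, None] - x[None, :]).sum()
            return diff_sum / (2 * len(x) ** 2 * x.mean())

        # Illustrative: fraction of each ecoregion (or species range) that is protected.
        protection = [0.00, 0.02, 0.05, 0.10, 0.40, 0.75]
        print(f"Gini = {gini(protection):.2f}")  # 0 = perfectly even, 1 = maximally uneven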

  19. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
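
    A small sketch of the one-parameter Prelec weighting function w(p) = exp(-(-ln p)^α) discussed above; α = 0.65 is an arbitrary illustrative value:

        import math

        def prelec_w(p, alpha=0.65):
            """One-parameter Prelec (1998) probability weighting function:
            w(p) = exp(-(-ln p)**alpha), with w(0) = 0, w(1/e) = 1/e, w(1) = 1."""
            if p <= 0.0:
                return 0.0
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 1.0):
            # small probabilities are overweighted, large ones underweighted
            print(f"w({p:.3f}) = {prelec_w(p):.3f}")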

  20. Cueing spatial attention through timing and probability.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, to what extent unattended information affects attentional control is not fully understood. Here we provide evidence of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven modality within a single task. Using a spatial cueing paradigm we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more robust occurrence. Conversely, cuing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on the asynchronies' probability and not on their duration. Our findings clearly demonstrate that, through fine-grained decision-making performed trial-by-trial, the brain utilizes implicit information to decide whether or not to voluntarily shift spatial attention. As if according to a cost-planning strategy, the cognitive effort of shifting attention depending on the cue is made only when the expected advantages are higher. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. The actual current density of gas-evolving electrodes—Notes on the bubble coverage

    International Nuclear Information System (INIS)

    Vogt, H.

    2012-01-01

    All investigations of electrochemical reactors with gas-evolving electrodes must take account of the fact that the actual current density controlling cell operation commonly differs substantially from the nominal current density used for practical purposes. The two quantities are interrelated by the fractional bubble coverage. This parameter is shown to be affected by a large number of operational quantities. However, available relationships for the bubble coverage take account only of the nominal current density. A further essential shortcoming is their inconsistency with reality at very large values of the bubble coverage, which are relevant for operating conditions leading to anode effects. An improved relationship applicable to the total range is proposed.
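
    A one-line sketch of the interrelation described here, under the common assumption that the current passes only through the bubble-free fraction of the electrode surface; the paper's improved relationship is not reproduced, and the numbers are illustrative:

        def actual_current_density(j_nominal, theta):
            """Actual (effective) current density on the bubble-free electrode area,
            assuming bubbles shield a fraction theta of the surface:
            j_actual = j_nominal / (1 - theta)."""
            if not 0.0 <= theta < 1.0:
                raise ValueError("fractional bubble coverage must be in [0, 1)")
            return j_nominal / (1.0 - theta)

        print(actual_current_density(2000.0, 0.3))  # A/m^2, illustrative values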

  2. 45 CFR 148.124 - Certification and disclosure of coverage.

    Science.gov (United States)

    2010-10-01

    ... method of counting creditable coverage, and the requesting entity may identify specific information that... a payroll deduction for health coverage, a health insurance identification card, a certificate of...

  3. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may effect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  6. Impact of state mandatory insurance coverage on the use of diabetes preventive care

    Directory of Open Access Journals (Sweden)

    Barker Lawrence

    2010-05-01

    Background 46 U.S. states and the District of Columbia have passed laws and regulations mandating that health insurance plans cover diabetes treatment and preventive care. Previous research on state mandates suggested that these policies had little impact, since many health plans already covered the benefits. Here, we analyze the contents of state mandates and model their effect. We examined how state mandates impacted the likelihood of using three types of diabetes preventive care: annual eye exams, annual foot exams, and performing daily self-monitoring of blood glucose (SMBG). Methods We collected information on the diabetes benefits specified in state mandates and the time the mandates were enacted. To assess impact, we used data that the Behavioral Risk Factor Surveillance System gathered between 1996 and 2000. 4,797 individuals with self-reported diabetes and covered by private insurance were included; 3,195 of these resided in the 16 states that passed state mandates between 1997 and 1999; 1,602 resided in the 8 states or the District of Columbia without state mandates by 2000. Multivariate logistic regression models (with state fixed effects, controlling for patient demographic characteristics and socio-economic status, state characteristics, and time trend) were used to model the association between passing state mandates and the usage of the forms of diabetes preventive care, both individually and collectively. Results State mandates were positively associated with a 6.3 (P = 0.04) and a 5.8 (P = 0.03) percentage point increase in the probability of privately insured diabetic patients performing SMBG and simultaneously receiving all three types of preventive care, respectively; state mandates were not

  7. Medicaid Coverage of Methadone Maintenance and the Use of Opioid Agonist Therapy Among Pregnant Women in Specialty Treatment.

    Science.gov (United States)

    Bachhuber, Marcus A; Mehta, Pooja K; Faherty, Laura J; Saloner, Brendan

    2017-12-01

    Opioid agonist therapy (OAT) is the standard of care for pregnant women with opioid use disorder (OUD). Medicaid coverage policies may strongly influence OAT use in this group. To examine the association between Medicaid coverage of methadone maintenance and planned use of OAT in the publicly funded treatment system. Retrospective cross-sectional analysis of treatment admissions in 30 states extracted from the Treatment Episode Data Set (2013 and 2014). Medicaid-insured pregnant women with OUD (n=3354 treatment admissions). The main outcome measure was planned use of OAT on admission. The main exposure was state Medicaid coverage of methadone maintenance. Using multivariable logistic regression models adjusting for sociodemographic, substance use, and treatment characteristics, we compared the probability of planned OAT use in states with Medicaid coverage of methadone maintenance versus states without coverage. A total of 71% of pregnant women admitted to OUD treatment were 18-29 years old, 85% were white non-Hispanic, and 56% used heroin. Overall, 74% of admissions occurred in the 18 states with Medicaid coverage of methadone maintenance and 53% of admissions involved planned use of OAT. Compared with states without Medicaid coverage of methadone maintenance, admissions in states with coverage were significantly more likely to involve planned OAT use (adjusted difference: 32.9 percentage points, 95% confidence interval, 19.2-46.7). Including methadone maintenance in the Medicaid benefit is essential to increasing OAT among pregnant women with OUD and should be considered a key policy strategy to enhance outcomes for mothers and newborns.

  8. Improving health insurance coverage in Ghana : a case study

    NARCIS (Netherlands)

    Kotoh, A.M.

    2013-01-01

    Ghana is one of the first sub-Saharan African countries to introduce national health insurance to ensure more equity in access to health care. The response of the population has been disappointing, however. This study describes and examines an experiment with so-called 'problem-solving groups' that

  9. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  10. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  11. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  12. The impact of the macroeconomy on health insurance coverage: evidence from the Great Recession.

    Science.gov (United States)

    Cawley, John; Moriya, Asako S; Simon, Kosali

    2015-02-01

    This paper investigates the impact of the macroeconomy on the health insurance coverage of Americans using panel data from the Survey of Income and Program Participation for 2004-2010, a period that includes the Great Recession of 2007-2009. We find that a one percentage point increase in the state unemployment rate is associated with a 1.67 percentage point (2.12%) reduction in the probability that men have health insurance; this effect is strongest among college-educated, white, and older (50-64 years old) men. For women and children, health insurance coverage is not significantly correlated with the unemployment rate, which may be the result of public health insurance acting as a social safety net. Compared with the previous recession, the health insurance coverage of men is more sensitive to the unemployment rate, which may be due to the nature of the Great Recession. Copyright © 2013 John Wiley & Sons, Ltd.

  13. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability

  14. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  15. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  16. Improved Variable Window Kernel Estimates of Probability Densities

    OpenAIRE

    Hall, Peter; Hu, Tien Chung; Marron, J. S.

    1995-01-01

    Variable window width kernel density estimators, with the width varying proportionally to the square root of the density, have been thought to have superior asymptotic properties. The rate of convergence has been claimed to be as good as those typical for higher-order kernels, which makes the variable width estimators more attractive because no adjustment is needed to handle the negativity usually entailed by the latter. However, in a recent paper, Terrell and Scott show that these results ca...
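
    A minimal sketch of a variable-window kernel estimator of the kind studied here, using the Abramson-type choice in which the local bandwidth varies with the inverse square root of a pilot density estimate; the data and bandwidth constant are illustrative assumptions:

        import numpy as np

        def kde_fixed(x_eval, data, h):
            """Fixed-bandwidth Gaussian KDE, used here as the pilot estimate."""
            u = (x_eval[:, None] - data[None, :]) / h
            return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

        def kde_adaptive(x_eval, data, h0):
            """Adaptive KDE with local bandwidth h_i proportional to pilot(X_i)**-0.5
            (Abramson-type square-root law), geometric-mean normalized."""
            pilot = kde_fixed(data, data, h0)
            local_h = h0 * np.sqrt(np.exp(np.mean(np.log(pilot))) / pilot)
            u = (x_eval[:, None] - data[None, :]) / local_h[None, :]
            k = np.exp(-0.5 * u**2) / (local_h[None, :] * np.sqrt(2 * np.pi))
            return k.sum(axis=1) / len(data)

        rng = np.random.default_rng(1)
        data = rng.standard_normal(500)
        grid = np.linspace(-4, 4, 9)
        print(np.round(kde_adaptive(grid, data, h0=0.4), 3))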

  17. Evaluation of Coverage and Barriers to Access to MAM Treatment in West Pokot County, Kenya

    International Nuclear Information System (INIS)

    Basquin, Cecile; Imelda, Awino; Gallagher, Maureen

    2014-01-01

    pitting oedema. Results showed that none of the four sub-counties achieved high coverage classification. The coverage for OTP was moderate in North and South Pokot, whilst low in West and Central Pokot. The overall county coverage classification was moderate. SFP coverage classification was found to be low across all four sub-counties and county-wide as well. The assessment also identified that barriers to access to SAM and MAM treatment were often similar, e.g., the main barrier to access for both services was lack of programme awareness in Central and West Pokot. Some key recommendations towards increasing coverage included improving stakeholder awareness via advocacy, engaging mass media, increasing outreach activities, and minimizing rejection by revising the screening methods and systems in place. These were applicable to both OTP and SFP components of IMAM. Coverage evaluations for MAM treatment are less commonly conducted than those for SAM treatment as it is more challenging to identify cases of MAM physically. Nonetheless, in order to document program effectiveness, it will be important to further explore methods that can evaluate coverage of MAM programming. (author)

  18. Conventional sunscreen application does not lead to sufficient body coverage.

    Science.gov (United States)

    Jovanovic, Z; Schornstein, T; Sutor, A; Neufang, G; Hagens, R

    2017-10-01

    This study aimed to assess sunscreen application habits and relative body coverage after single whole body application. Fifty-two healthy volunteers were asked to use the test product once, following their usual sunscreen application routine. Standardized UV photographs, which were evaluated by Image Analysis, were conducted before and immediately after product application to evaluate relative body coverage. In addition to these procedures, the volunteers completed an online self-assessment questionnaire to assess sunscreen usage habits. After product application, the front side showed significantly less non-covered skin (4.35%) than the backside (17.27%) (P = 0.0000). Females showed overall significantly less non-covered skin (8.98%) than males (13.16%) (P = 0.0381). On the backside, females showed significantly less non-covered skin (13.57%) (P = 0.0045) than males (21.94%), while on the front side, this difference between females (4.14%) and males (4.53%) was not significant. In most cases, the usual sunscreen application routine does not provide complete body coverage even though an extra light sunscreen with good absorption properties was used. On average, 11% of the body surface was not covered by sunscreen at all. Therefore, appropriate consumer education is required to improve sunscreen application and to warrant effective sun protection. © 2017 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  19. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions which maximize the regional probabilities considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...

  20. Sensitivity of postplanning target and OAR coverage estimates to dosimetric margin distribution sampling parameters.

    Science.gov (United States)

    Xu, Huijun; Gordon, J James; Siebers, Jeffrey V

    2011-02-01

    A dosimetric margin (DM) is the margin in a specified direction between a structure and a specified isodose surface, corresponding to a prescription or tolerance dose. The dosimetric margin distribution (DMD) is the distribution of DMs over all directions. Given a geometric uncertainty model, representing inter- or intrafraction setup uncertainties or internal organ motion, the DMD can be used to calculate coverage Q, which is the probability that a realized target or organ-at-risk (OAR) dose metric D, exceeds the corresponding prescription or tolerance dose. Postplanning coverage evaluation quantifies the percentage of uncertainties for which target and OAR structures meet their intended dose constraints. The goal of the present work is to evaluate coverage probabilities for 28 prostate treatment plans to determine DMD sampling parameters that ensure adequate accuracy for postplanning coverage estimates. Normally distributed interfraction setup uncertainties were applied to 28 plans for localized prostate cancer, with prescribed dose of 79.2 Gy and 10 mm clinical target volume to planning target volume (CTV-to-PTV) margins. Using angular or isotropic sampling techniques, dosimetric margins were determined for the CTV, bladder and rectum, assuming shift invariance of the dose distribution. For angular sampling, DMDs were sampled at fixed angular intervals w (e.g., w = 1 degree, 2 degrees, 5 degrees, 10 degrees, 20 degrees). Isotropic samples were uniformly distributed on the unit sphere resulting in variable angular increments, but were calculated for the same number of sampling directions as angular DMDs, and accordingly characterized by the effective angular increment omega eff. In each direction, the DM was calculated by moving the structure in radial steps of size delta (=0.1, 0.2, 0.5, 1 mm) until the specified isodose was crossed. Coverage estimation accuracy deltaQ was quantified as a function of the sampling parameters omega or omega eff and delta. The
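
    A heavily simplified sketch of the kind of coverage computation the dosimetric margin distribution implies: sample setup errors from an assumed Gaussian model, look up the dosimetric margin in the nearest sampled direction, and count the fraction of shifts that stay inside it; the DMD values and setup SD below are invented, and a real evaluation works on the 3-D dose and structure data:

        import numpy as np

        rng = np.random.default_rng(2)

        # Invented dosimetric margins (mm) sampled over n_dir random directions;
        # a real evaluation derives them from the 3-D dose and structure data.
        n_dir = 200
        dirs = rng.standard_normal((n_dir, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        dm = rng.uniform(5.0, 12.0, size=n_dir)   # DMD samples, mm

        sigma = 3.0                               # isotropic setup SD, mm (assumed)
        shifts = rng.normal(0.0, sigma, size=(20000, 3))
        r = np.linalg.norm(shifts, axis=1)
        unit = shifts / np.maximum(r[:, None], 1e-12)

        # For each sampled shift, use the margin of the closest sampled direction
        # and count the shift as "covered" if it stays inside that margin.
        nearest = np.argmax(unit @ dirs.T, axis=1)
        covered = r <= dm[nearest]

        print("estimated coverage Q =", covered.mean())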

  1. Conceptualising the lack of health insurance coverage.

    Science.gov (United States)

    Davis, J B

    2000-01-01

    This paper examines the lack of health insurance coverage in the US as a public policy issue. It first compares the problem of health insurance coverage to the problem of unemployment to show that in terms of the numbers of individuals affected lack of health insurance is a problem comparable in importance to the problem of unemployment. Secondly, the paper discusses the methodology involved in measuring health insurance coverage, and argues that the current method of estimation of the uninsured underestimates the extent that individuals go without health insurance. Third, the paper briefly introduces Amartya Sen's functioning and capabilities framework to suggest a way of representing the extent to which individuals are uninsured. Fourth, the paper sketches a means of operationalizing the Sen representation of the uninsured in terms of the disability-adjusted life year (DALY) measure.

  2. Cataract surgical coverage rate among adults aged 40 years and older

    Directory of Open Access Journals (Sweden)

    Lusianawaty Tana

    2016-02-01

    Cataract is a leading cause of curable blindness. Hence, in its global declaration of 'Vision 2020 Right to Sight', the World Health Organization (WHO) encouraged its member countries to address the problem of incident cataract. Many factors are related to the cataract surgical coverage rate, such as gender and diabetes mellitus. The objective of this study was to determine the cataract surgical coverage rate and investigate the determinant factors of the cataract surgical coverage rate among adults 40 years old and above with cataract. A cross-sectional study was conducted using National Basic Health Research (Riskesdas) 2007 data. Cataract surgery was defined as surgery conducted within the last 12 months before the survey was performed. There were 6939 subjects (3105 male, 3834 female) who fulfilled the study criteria. The cataract surgical coverage rate was 19.3%. The cataract surgical coverage rate was lower in subjects with low education, in the group of farmers/fishermen/laborers, in the 40-49 years age group, in rural areas, and in subjects of low socioeconomic status (p<0.05). Determinants that were related to the cataract surgical coverage rate were age, type of area of residence, socioeconomic status, and region of residence (p<0.001). The implementation of educational programs and reforms to local ophthalmic health services may improve the cataract surgical coverage rate.

  3. [Vaccination coverage in young, middle age and elderly adults in Mexico].

    Science.gov (United States)

    Cruz-Hervert, Luis Pablo; Ferreira-Guerrero, Elizabeth; Díaz-Ortega, José Luis; Trejo-Valdivia, Belem; Téllez-Rojo, Martha María; Mongua-Rodríguez, Norma; Hernández-Serrato, María I; Montoya-Rodríguez, Airain Alejandra; García-García, Lourdes

    2013-01-01

    To estimate vaccination coverage in adults 20 years of age and older. Analysis of data obtained from the National Health and Nutrition Survey 2012. Among adults 20-59 years old, coverage with the complete scheme, measles and rubella (MR) and tetanus toxoid and diphtheria toxoid (Td) was 44.7, 49. and 67.3%, respectively. Coverage and percentage of vaccination were significantly higher among women than men. Among women 20-49 years, coverages with the complete scheme, MR and Td were 48.3, 53.2 and 69.8%, respectively. Among adults 60-64 years old, coverages with the complete scheme, Td and influenza vaccine were 46.5, 66.2 and 56.0%, respectively. Among adults >65 years, coverages for the complete scheme, Td, influenza vaccine and pneumococcal vaccine were 44.0, 69.0, 63.3 and 62.0%, respectively. Vaccination coverage among the adult population, as obtained from vaccination cards or self-report, is below optimal values, although the data may be underestimated. Recommendations for improvements are proposed.

  4. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  5. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  6. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  7. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  8. Progress toward universal health coverage in ASEAN.

    Science.gov (United States)

    Van Minh, Hoang; Pocock, Nicola Suyin; Chaiyakunapruk, Nathorn; Chhorvann, Chhea; Duc, Ha Anh; Hanvoravongchai, Piya; Lim, Jeremy; Lucero-Prisno, Don Eliseo; Ng, Nawi; Phaholyothin, Natalie; Phonvisay, Alay; Soe, Kyaw Min; Sychareun, Vanphanom

    2014-01-01

    The Association of Southeast Asian Nations (ASEAN) is characterized by much diversity in terms of geography, society, economic development, and health outcomes. The health systems as well as healthcare structure and provisions vary considerably. Consequently, the progress toward Universal Health Coverage (UHC) in these countries also varies. This paper aims to describe the progress toward UHC in the ASEAN countries and discuss how regional integration could influence UHC. Data reported in this paper were obtained from published literature, reports, and gray literature available in the ASEAN countries. We used both online and manual search methods to gather the information and 'snowball' further data. We found that, in general, ASEAN countries have made good progress toward UHC, partly due to relatively sustained political commitments to endorse UHC in these countries. However, all the countries in ASEAN are facing several common barriers to achieving UHC, namely 1) financial constraints, including low levels of overall and government spending on health; 2) supply side constraints, including inadequate numbers and densities of health workers; and 3) the ongoing epidemiological transition at different stages characterized by increasing burdens of non-communicable diseases, persisting infectious diseases, and reemergence of potentially pandemic infectious diseases. The ASEAN Economic Community's (AEC) goal of regional economic integration and a single market by 2015 presents both opportunities and challenges for UHC. Healthcare services have become more available but health and healthcare inequities will likely worsen as better-off citizens of member states might receive more benefits from the liberalization of trade policy in health, either via regional outmigration of health workers or intra-country health worker movement toward private hospitals, which tend to be located in urban areas. For ASEAN countries, UHC should be explicitly considered to mitigate

  10. Measuring coverage in MNCH: a validation study linking population survey derived coverage to maternal, newborn, and child health care records in rural China.

    Directory of Open Access Journals (Sweden)

    Li Liu

    Accurate data on coverage of key maternal, newborn, and child health (MNCH) interventions are crucial for monitoring progress toward the Millennium Development Goals 4 and 5. Coverage estimates are primarily obtained from routine population surveys through self-reporting, the validity of which is not well understood. We aimed to examine the validity of the coverage of selected MNCH interventions in Gongcheng County, China. We conducted a validation study by comparing women's self-reported coverage of MNCH interventions relating to antenatal and postnatal care, mode of delivery, and child vaccinations in a community survey with their paper- and electronic-based health care records, treating the health care records as the reference standard. Of 936 women recruited, 914 (97.6%) completed the survey. Results show that self-reported coverage of these interventions had moderate to high sensitivity (0.57 [95% confidence interval (CI): 0.50-0.63] to 0.99 [95% CI: 0.98-1.00]) and low to high specificity (0 to 0.83 [95% CI: 0.80-0.86]). Despite varying overall validity, with the area under the receiver operating characteristic curve (AUC) ranging between 0.49 [95% CI: 0.39-0.57] and 0.90 [95% CI: 0.88-0.92], bias in the coverage estimates at the population level was small to moderate, with the test to actual positive (TAP) ratio ranging between 0.8 and 1.5 for 24 of the 28 indicators examined. Our ability to accurately estimate validity was affected by several caveats associated with the reference standard. Caution should be exercised when generalizing the results to other settings. The overall validity of self-reported coverage was moderate across selected MNCH indicators. However, at the population level, self-reported coverage appears to have a small to moderate degree of bias. Accuracy of the coverage was particularly high for indicators with high recorded coverage or low recorded coverage but high specificity. The study provides insights into the accuracy of
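
    A small sketch of the validation metrics used here, computed from a 2x2 comparison of self-report against the record-based reference standard; the counts are invented for illustration:

        # Invented 2x2 counts: self-report (test) vs. health-care record (reference).
        tp, fp, fn, tn = 420, 60, 35, 399

        sensitivity = tp / (tp + fn)  # P(reported | truly received)
        specificity = tn / (tn + fp)  # P(not reported | truly not received)

        # Test-to-actual-positive (TAP) ratio: survey-based coverage divided by
        # record-based coverage; values near 1 indicate little population-level bias.
        tap = (tp + fp) / (tp + fn)

        print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} TAP={tap:.2f}")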

  11. Aspects of coverage in medical DNA sequencing

    Directory of Open Access Journals (Sweden)

    Wilson Richard K

    2008-05-01

    Background DNA sequencing is now emerging as an important component in biomedical studies of diseases like cancer. Short-read, highly parallel sequencing instruments are expected to be used heavily for such projects, but many design specifications have yet to be conclusively established. Perhaps the most fundamental of these is the redundancy required to detect sequence variations, which bears directly upon genomic coverage and the consequent resolving power for discerning somatic mutations. Results We address the medical sequencing coverage problem via an extension of the standard mathematical theory of haploid coverage. The expected diploid multi-fold coverage, as well as its generalization for aneuploidy are derived and these expressions can be readily evaluated for any project. The resulting theory is used as a scaling law to calibrate performance to that of standard BAC sequencing at 8× to 10× redundancy, i.e. for expected coverages that exceed 99% of the unique sequence. A differential strategy is formalized for tumor/normal studies wherein tumor samples are sequenced more deeply than normal ones. In particular, both tumor alleles should be detected at least twice, while both normal alleles are detected at least once. Our theory predicts these requirements can be met for tumor and normal redundancies of approximately 26× and 21×, respectively. We explain why these values do not differ by a factor of 2, as might intuitively be expected. Future technology developments should prompt even deeper sequencing of tumors, but the 21× value for normal samples is essentially a constant. Conclusion Given the assumptions of standard coverage theory, our model gives pragmatic estimates for required redundancy. The differential strategy should be an efficient means of identifying potential somatic mutations for further study.
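
    A minimal sketch of the haploid/diploid coverage arithmetic underlying such redundancy estimates, using the standard Poisson idealization in which a position is sequenced Poisson(c) times at redundancy c and each allele of a diploid sample receives on average c/2 of the reads; the thresholds follow the text, but the model is the textbook simplification, not the paper's exact extension:

        import math

        def poisson_cdf(k, lam):
            """P(X <= k) for X ~ Poisson(lam)."""
            return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

        def p_covered_at_least(k, redundancy):
            """Probability that a given position is sequenced at least k times."""
            return 1.0 - poisson_cdf(k - 1, redundancy)

        def diploid_detection(redundancy, k_each):
            """Probability that both alleles are each seen >= k_each times when the
            reads split evenly (on average) between the two alleles."""
            per_allele = p_covered_at_least(k_each, redundancy / 2.0)
            return per_allele ** 2

        # Tumor: both alleles at least twice; normal: both alleles at least once.
        print("tumor  at 26x:", round(diploid_detection(26.0, 2), 5))
        print("normal at 21x:", round(diploid_detection(21.0, 1), 5))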

  12. 29 CFR 2.13 - Audiovisual coverage prohibited.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Audiovisual coverage prohibited. 2.13 Section 2.13 Labor Office of the Secretary of Labor GENERAL REGULATIONS Audiovisual Coverage of Administrative Hearings § 2.13 Audiovisual coverage prohibited. The Department shall not permit audiovisual coverage of the...

  13. 28 CFR 55.6 - Coverage under section 203(c).

    Science.gov (United States)

    2010-07-01

    ... THE VOTING RIGHTS ACT REGARDING LANGUAGE MINORITY GROUPS Nature of Coverage § 55.6 Coverage under section 203(c). (a) Coverage formula. There are four ways in which a political subdivision can become subject to section 203(c). 2 2 The criteria for coverage are contained in section 203(b). (1) Political...

  14. Microstrip Antenna Design for Femtocell Coverage Optimization

    Directory of Open Access Journals (Sweden)

    Afaz Uddin Ahmed

    2014-01-01

    Full Text Available A microstrip antenna is designed for multielement antenna coverage optimization in a femtocell network. Interference is the foremost concern for cellular operators in vast commercial deployments of femtocells. Many techniques at the physical, data-link, and network layers have been analysed and developed to settle the interference issues. A multielement technique with self-configuration features is analysed here for coverage optimization of the femtocell. The paper also focuses on the implementation of the microstrip antenna in the multielement configuration. The antenna is designed for LTE Band 7 using a standard FR4 dielectric substrate. The performance of the proposed antenna in the femtocell application is discussed along with results.

  15. Practical differences among probabilities, possibilities, and credibilities

    Science.gov (United States)

    Grandin, Jean-Francois; Moulin, Caroline

    2002-03-01

    This paper presents some important differences that exist between theories which allow uncertainty management in data fusion. The main comparative results illustrated in this paper are the following: Incompatibility between decisions obtained from probabilities and credibilities is highlighted. In the dynamic frame, as remarked in [19] or [17], the belief and plausibility of the Dempster-Shafer model do not bracket the Bayesian probability. This bracketing can, however, be obtained by the Modified Dempster-Shafer approach. It can also be obtained in the Bayesian framework either by simulation techniques or with a studentization. The uncommitted mass in the Dempster-Shafer approach, i.e. the mass assigned to ignorance, gives a mechanism similar to reliability in the Bayesian model. Uncommitted mass in Dempster-Shafer theory and reliability in Bayes theory act like a filter that weakens extracted information and improves robustness to outliers. So it is logical to observe, on examples like the one presented in particular by D.M. Buede, faster convergence of a Bayesian method that does not take reliability into account compared with a Dempster-Shafer method that uses uncommitted mass. But for Bayesian masses, if reliability is taken into account at the same level as the uncommitted mass, e.g. F = 1 - m, we observe an equivalent rate of convergence. When the Dempster-Shafer and Bayes operators are informed by uncertainty, faster or slower convergence can be exhibited on non-Bayesian masses. This is due to positive or negative synergy between the information delivered by sensors. This effect is a direct consequence of non-additivity when considering non-Bayesian masses. Lack of knowledge of the prior in Bayesian techniques can be quickly compensated by information accumulated over time by a set of sensors. All these results are presented on simple examples, and developed when necessary.
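
    The role of the uncommitted mass can be seen in a toy two-hypothesis fusion step. The sketch below is a minimal implementation of Dempster's rule over the frame {A, B}, with 30% of each sensor's mass left uncommitted (assigned to the whole frame); the mass values are arbitrary illustrative choices, not taken from the paper.

```python
def dempster_combine(m1, m2):
    """Dempster's rule over the frame {'A', 'B'}, with 'AB' denoting the
    uncommitted mass assigned to the whole frame (ignorance)."""
    def meet(x, y):
        if x == 'AB':
            return y
        if y == 'AB':
            return x
        return x if x == y else None       # None marks the empty intersection

    combined = {'A': 0.0, 'B': 0.0, 'AB': 0.0}
    conflict = 0.0
    for x, wx in m1.items():
        for y, wy in m2.items():
            inter = meet(x, y)
            if inter is None:
                conflict += wx * wy
            else:
                combined[inter] += wx * wy
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two identical sensors slightly favouring A, each keeping 30% uncommitted.
sensor = {'A': 0.5, 'B': 0.2, 'AB': 0.3}
print(dempster_combine(sensor, sensor))
# The residual 'AB' mass weakens the evidence for A, acting like a reliability filter.
```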

  16. Handoff Rate and Coverage Analysis in Multi-tier Heterogeneous Networks

    OpenAIRE

    Sadr, Sanam; Adve, Raviraj S.

    2015-01-01

    This paper analyzes the impact of user mobility in multi-tier heterogeneous networks. We begin by obtaining the handoff rate for a mobile user in an irregular cellular network with the access point locations modeled as a homogeneous Poisson point process. The received signal-to-interference-ratio (SIR) distribution along with a chosen SIR threshold is then used to obtain the probability of coverage. To capture potential connection failures due to mobility, we assume that a fraction of handoff...
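
    The coverage probability referred to in this abstract can be approximated by straightforward Monte Carlo rather than the paper's analytical route. The sketch below assumes a homogeneous PPP of access points, nearest-point association, Rayleigh fading, an interference-limited (noise-free) SIR, and a path-loss exponent of 4; the density, window size, and threshold are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage_probability(density=1e-4, alpha=4.0, sir_threshold_db=0.0,
                         window=2000.0, trials=2000):
    """Monte Carlo estimate of P[SIR > threshold] for a user at the origin of a
    window x window region, served by the nearest access point of a homogeneous
    PPP (points per square metre), with Rayleigh fading, path-loss exponent
    alpha, and interference-limited (noise-free) SIR."""
    theta = 10.0 ** (sir_threshold_db / 10.0)
    covered = 0
    for _ in range(trials):
        n = rng.poisson(density * window * window)
        if n == 0:
            continue                                  # no access point at all
        xy = rng.uniform(-window / 2, window / 2, size=(n, 2))
        dist = np.hypot(xy[:, 0], xy[:, 1])
        power = rng.exponential(1.0, size=n) * dist ** (-alpha)
        serving = np.argmin(dist)                     # nearest-AP association
        interference = power.sum() - power[serving]
        sir = np.inf if interference == 0.0 else power[serving] / interference
        covered += sir > theta
    return covered / trials

print("coverage probability at a 0 dB SIR threshold:", coverage_probability())
```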

  17. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
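
    The convolution mentioned above is the distribution of a sum of independent, non-identical Bernoulli indicators, and multiplying the per-element probability generating functions is equivalent to repeated polynomial convolution. The sketch below illustrates that idea on a hypothetical five-element bundle; it is a generic PGF calculation, not the authors' implementation.

```python
import numpy as np

def exact_compliance_distribution(probs):
    """Exact distribution of the number of compliant elements in a bundle,
    obtained by multiplying per-element probability generating functions
    (1 - p_i) + p_i * z, i.e. by repeated polynomial convolution."""
    dist = np.array([1.0])                 # PGF of the empty bundle
    for p in probs:
        dist = np.convolve(dist, [1.0 - p, p])
    return dist                            # dist[k] = P(exactly k compliant elements)

# Hypothetical 5-element bundle with unequal compliance probabilities.
probs = [0.95, 0.90, 0.85, 0.99, 0.80]
dist = exact_compliance_distribution(probs)
print("P(all 5 compliant)      =", dist[-1])
print("P(at least 4 compliant) =", dist[-2:].sum())
```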

  18. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue in improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It provides a reduced set of tuning parameters that may lead to the reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems
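
    A small example of why tuning matters for rare-event estimators: below, naive Monte Carlo is compared with importance sampling for P(X > 5) with X standard normal, where the mean shift of the proposal plays the role of the tuning parameter. This is a generic illustration under my own choice of target and proposal, not one of the paper's splitting benchmarks.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
t, n = 5.0, 100_000                     # rare event {X > t}, X ~ N(0, 1)

# Naive Monte Carlo: almost never observes the event at this sample size.
x = rng.standard_normal(n)
print("naive estimate:", np.mean(x > t))

# Importance sampling with a proposal shifted into the event region.
# The shift mu is the tuning parameter; mu = t is a common heuristic.
mu = t
y = rng.normal(mu, 1.0, n)
weights = norm.pdf(y) / norm.pdf(y, loc=mu)
print("IS estimate   :", np.mean((y > t) * weights))
print("exact value   :", norm.sf(t))
```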

  19. Health workers and vaccination coverage in developing countries: an econometric analysis.

    Science.gov (United States)

    Anand, Sudhir; Bärnighausen, Till

    2007-04-14

    Vaccine-preventable diseases cause more than 1 million deaths among children in developing countries every year. Although health workers are needed to do vaccinations, the role of human resources for health as a determinant of vaccination coverage at the population level has not been investigated. Our aim was to test whether health worker density was positively associated with childhood vaccination coverage in developing countries. We did cross-country multiple regression analyses with coverage of three vaccinations--measles-containing vaccine (MCV); diphtheria, tetanus, and pertussis (DTP3); and poliomyelitis (polio3)--as dependent variables. Aggregate health worker density was an independent variable in one set of regressions; doctor and nurse densities were used separately in another set. We controlled for national income per person, female adult literacy, and land area. Health worker density was significantly associated with coverage of all three vaccinations (MCV p=0.0024; DTP3 p=0.0004; polio3 p=0.0008). However, when the effects of doctors and nurses were assessed separately, we found that nurse density was significantly associated with coverage of all three vaccinations (MCV p=0.0097; DTP3 p=0.0083; polio3 p=0.0089), but doctor density was not (MCV p=0.7953; DTP3 p=0.7971; polio3 p=0.7885). Female adult literacy was positively associated, and land area negatively associated, with vaccination coverage. National income per person had no effect on coverage. A higher density of health workers (nurses) increases the availability of vaccination services over time and space, making it more likely that children will be vaccinated. After controlling for other determinants, the level of income does not contribute to improved immunisation coverage. Health workers can be a major constraining factor on vaccination coverage in developing countries.

  20. Seasonal influenza vaccination coverage and its determinants among nursing homes personnel in western France

    Directory of Open Access Journals (Sweden)

    Christelle Elias

    2017-07-01

    Full Text Available Abstract Background Influenza-associated death is an important risk for the elderly in nursing homes (NHs) worldwide. Vaccination coverage among residents is high but poorly effective due to immunosenescence. Hence, vaccination of personnel is an efficient way to protect residents. Our objective was to quantify seasonal influenza vaccination (IV) coverage among workers in NHs for the elderly and identify its determinants in France. Methods We conducted a cross-sectional study in March 2016 in a randomized sample of NHs of the Ille-et-Vilaine department of Brittany, in western France. A standardized questionnaire was administered to a randomized sample of NH workers in face-to-face interviews. General data about each establishment were also collected. Results Among the 33 NHs surveyed, IV coverage for the 2015–2016 season among permanent workers was estimated at 20% (95% Confidence Interval (CI) 15.3%–26.4%), ranging from 0% to 69% depending on the establishment surveyed. Moreover, IV was associated with having previously experienced a “severe” influenza episode in the past (Prevalence Ratio 1.48, 95% CI 1.01–2.17), and varied by professional category (p < 0.004), with better coverage among administrative staff. Better knowledge about influenza prevention tools was also correlated (p < 0.001) with higher IV coverage. Individual perceptions of vaccination benefits had a significant influence on IV coverage (p < 0.001). Although IV coverage did not reach a high rate, our study showed that personnel considered themselves sufficiently informed about IV. Conclusions IV coverage remains low in the NH worker population in Ille-et-Vilaine and possibly in France as a whole. Strong variations in IV coverage among NHs suggest that management and the working environment play an important role. To overcome vaccine “hesitancy”, specific communication tools may need to be adapted to the various NH professional groups to improve influenza prevention.

  1. Estimating Premium Sensitivity for Children's Public Health Insurance Coverage: Selection but No Death Spiral

    Science.gov (United States)

    Marton, James; Ketsche, Patricia G; Snyder, Angela; Adams, E Kathleen; Zhou, Mei

    2015-01-01

    Objective To estimate the effect of premium increases on the probability that near-poor and moderate-income children disenroll from public coverage. Data Sources Enrollment, eligibility, and claims data for Georgia's PeachCare for Kids™ (CHIP) program for multiple years. Study Design We exploited policy-induced variation in premiums generated by cross-sectional differences and changes over time in enrollee age, family size, and income to estimate the duration of enrollment as a function of the effective (per child) premium. We classify children as being of low, medium, or high illness severity. Principal Findings A dollar increase in the per-child premium is associated with a slight increase in a typical child's monthly probability of exiting coverage from 7.70 to 7.83 percent. Children with low illness severity have a significantly higher monthly baseline probability of exiting than children with medium or high illness severity, but the enrollment response to premium increases is similar across all three groups. Conclusions Success in achieving coverage gains through public programs is tempered by persistent problems in maintaining enrollment, which is modestly affected by premium increases. Retention is subject to adverse selection problems, but premium increases do not appear to significantly magnify the selection problem in this case. PMID:25130764

  2. Broader health coverage is good for the nation's health: evidence from country level panel data.

    Science.gov (United States)

    Moreno-Serra, Rodrigo; Smith, Peter C

    2015-01-01

    Progress towards universal health coverage involves providing people with access to needed health services without entailing financial hardship and is often advocated on the grounds that it improves population health. The paper offers econometric evidence on the effects of health coverage on mortality outcomes at the national level. We use a large panel data set of countries, examined by using instrumental variable specifications that explicitly allow for potential reverse causality and unobserved country-specific characteristics. We employ various proxies for the coverage level in a health system. Our results indicate that expanded health coverage, particularly through higher levels of publicly funded health spending, results in lower child and adult mortality, with the beneficial effect on child mortality being larger in poorer countries.

  3. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field, which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods that account for this type of uncertainty are usually computationally intensive and not practical for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user-defined structures. A case study illustrates an application in test case prioritization
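
    The equivalent-failures idea can be illustrated on a toy system with two test categories: category A exercises only component 1, while category B exercises components 1 and 2 in series and masks which component caused a failure. The EM sketch below is my own reduced example (hypothetical counts, series structure only), not the paper's bipartite-graph formulation.

```python
def em_masked_failures(nA, fA, nB, fB, iters=200):
    """Toy EM estimate of component failure probabilities (p1, p2).
    Category A tests exercise only component 1 (fA failures out of nA tests).
    Category B tests exercise components 1 and 2 in series (fB failures out of
    nB tests) and do not reveal which component caused the failure."""
    p1, p2 = 0.1, 0.1                       # initial guesses
    for _ in range(iters):
        p_fail_b = 1.0 - (1.0 - p1) * (1.0 - p2)
        e1 = p1 / p_fail_b                  # E[component 1 failed | B test failed]
        e2 = p2 / p_fail_b                  # E[component 2 failed | B test failed]
        p1 = (fA + fB * e1) / (nA + nB)     # component 1 is exercised in every test
        p2 = (fB * e2) / nB                 # component 2 is exercised only in B tests
    return p1, p2

# Hypothetical counts: 200 A-tests with 10 failures, 300 B-tests with 39 failures.
print(em_masked_failures(200, 10, 300, 39))
```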

  4. 24 CFR 51.302 - Coverage.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Coverage. 51.302 Section 51.302 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development... significantly prolongs the physical or economic life of existing facilities or which, in the case of Accident...

  5. 5 CFR 880.304 - FEGLI coverage.

    Science.gov (United States)

    2010-01-01

    ... under § 880.205, FEGLI premiums and benefits will be computed using the date of death established under...) RETIREMENT AND INSURANCE BENEFITS DURING PERIODS OF UNEXPLAINED ABSENCE Continuation of Benefits § 880.304 FEGLI coverage. (a) FEGLI premiums will not be collected during periods when an annuitant is a missing...

  6. 44 CFR 17.610 - Coverage.

    Science.gov (United States)

    2010-10-01

    ... SECURITY GENERAL GOVERNMENTWIDE REQUIREMENTS FOR DRUG-FREE WORKPLACE (GRANTS) § 17.610 Coverage. (a) This... covered by this subpart, except where specifically modified by this subpart. In the event of any conflict... are deemed to control with respect to the implementation of drug-free workplace requirements...

  7. 77 FR 16453 - Student Health Insurance Coverage

    Science.gov (United States)

    2012-03-21

    ... eliminating annual and lifetime dollar limits would result in dramatic premium hikes for student plans and.... Industry and university commenters noted that student health insurance coverage benefits typically... duplication of benefits and makes student plans more affordable. Industry commenters noted that student health...

  8. Coverage of space by random sets

    Indian Academy of Sciences (India)

    Consider the non-negative integer line. For each integer point we toss a coin. If the toss at location i is Heads, we place an interval (of random length) there and move to location i + 1; if it is Tails, we simply move to location i + 1.
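
    The coin-tossing construction in this record is easy to simulate. The sketch below estimates the covered fraction of a long stretch of the integer line, assuming (my choice, since the record does not specify it) a fair coin and interval lengths that are geometric with mean 2.

```python
import numpy as np

rng = np.random.default_rng(2)

def covered_fraction(n_sites=100_000, p_heads=0.5, mean_length=2.0):
    """Simulate the coin-tossing scheme on {0, ..., n_sites - 1}: at each site,
    with probability p_heads place an interval whose length is geometric with
    the given mean, then return the fraction of sites covered at least once."""
    covered = np.zeros(n_sites, dtype=bool)
    heads = rng.random(n_sites) < p_heads
    lengths = rng.geometric(1.0 / mean_length, size=n_sites)
    for i in np.nonzero(heads)[0]:
        covered[i:i + lengths[i]] = True     # slices past the end are clipped
    return covered.mean()

print("estimated coverage fraction:", covered_fraction())
```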

  9. 5 CFR 610.402 - Coverage.

    Science.gov (United States)

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS HOURS OF DUTY Flexible and Compressed Work Schedules § 610.402 Coverage. The regulations contained in this subpart apply only to flexible work schedules and compressed work schedules established under subchapter 11 of chapter 61 of...

  10. 14 CFR 205.5 - Minimum coverage.

    Science.gov (United States)

    2010-01-01

    ... 18,000 pounds maximum payload capacity, carriers need only maintain coverage of $2,000,000 per... than 30 seats or 7,500 pounds maximum cargo payload capacity, and a maximum authorized takeoff weight... not be contingent upon the financial condition, solvency, or freedom from bankruptcy of the carrier...

  11. 5 CFR 734.401 - Coverage.

    Science.gov (United States)

    2010-01-01

    ...) POLITICAL ACTIVITIES OF FEDERAL EMPLOYEES Employees in Certain Agencies and Positions § 734.401 Coverage. (a... Criminal Investigation of the Internal Revenue Service. (11) The Office of Investigative Programs of the... Firearms; (13) The Criminal Division of the Department of Justice; (14) The Central Imagery Office; (15...

  12. Danish Media coverage of 22/7

    DEFF Research Database (Denmark)

    Hervik, Peter; Boisen, Sophie

    2013-01-01

    ’s Danish connections through an analysis of the first 100 days of Danish media coverage. We scrutinised 188 articles in the largest daily newspapers to find out how Danish actors related to ABB’s ideas. The key argument is that the discourses and opinions reflect pre-existing opinions and entrenched...

  13. Binning metagenomic contigs by coverage and composition

    NARCIS (Netherlands)

    Alneberg, J.; Bjarnason, B.S.; Bruijn, de I.; Schirmer, M.; Quick, J.; Ijaz, U.Z.; Lahti, L.M.; Loman, N.J.; Andersson, A.F.; Quince, C.

    2014-01-01

    Shotgun sequencing enables the reconstruction of genomes from complex microbial communities, but because assembly does not reconstruct entire genomes, it is necessary to bin genome fragments. Here we present CONCOCT, a new algorithm that combines sequence composition and coverage across multiple

  14. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  15. Numerical Loading of a Maxwellian Probability Distribution Function

    International Nuclear Information System (INIS)

    Lewandowski, J.L.V.

    2003-01-01

    A renormalization procedure for the numerical loading of a Maxwellian probability distribution function (PDF) is formulated. The procedure, which involves the solution of three coupled nonlinear equations, yields a numerically loaded PDF with improved properties for higher velocity moments. This method is particularly useful for low-noise particle-in-cell simulations with electron dynamics

  16. Learning about Posterior Probability: Do Diagrams and Elaborative Interrogation Help?

    Science.gov (United States)

    Clinton, Virginia; Alibali, Martha W.; Nathan, Mitchell J.

    2016-01-01

    To learn from a text, students must make meaningful connections among related ideas in that text. This study examined the effectiveness of two methods of improving connections--elaborative interrogation and diagrams--in written lessons about posterior probability. Undergraduate students (N = 198) read a lesson in one of three questioning…

  17. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  18. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition. The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps, whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  19. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows, how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related with well-known classical distributions. The relation between the conditional probability density, given some averages as constraints and the appropriate ensemble is elucidated.
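
    The maximum-entropy assignment step described above can be illustrated with the classic constrained-die example: among all distributions on {1,...,6} with a prescribed mean, the entropy maximizer has Gibbs form. The sketch below solves for the multiplier numerically; it is a generic illustration and is not tied to the paper's quantum construction.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy probabilities for a six-sided die constrained to a given
# mean: the maximizer has the Gibbs form p_k proportional to exp(-beta * k),
# with beta chosen so that the mean constraint holds.
faces = np.arange(1, 7)
target_mean = 4.5

def mean_given_beta(beta):
    w = np.exp(-beta * faces)
    return float((faces * w).sum() / w.sum())

beta = brentq(lambda b: mean_given_beta(b) - target_mean, -10.0, 10.0)
p = np.exp(-beta * faces)
p /= p.sum()
print("beta =", round(beta, 4))
print("max-entropy probabilities:", np.round(p, 4))
```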

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  1. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  2. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  3. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming and if they were near the high probability hotspot (probability cuing. In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of targets locations and features.

  4. Factors associated with routine immunization coverage of children under one year old in Lao People's Democratic Republic.

    Science.gov (United States)

    Phoummalaysith, Bounfeng; Yamamoto, Eiko; Xeuatvongsa, Anonh; Louangpradith, Viengsakhone; Keohavong, Bounxou; Saw, Yu Mon; Hamajima, Nobuyuki

    2018-05-03

    Routine vaccination is administered free of charge to all children under one year old in the Lao People's Democratic Republic (Lao PDR), and the national goal is to achieve at least 95% coverage with all vaccines included in the national immunization program by 2025. In this study, factors related to the immunization system and characteristics of provinces and districts in Lao PDR were examined to evaluate their association with routine immunization coverage. Coverage rates for Bacillus Calmette-Guerin (BCG), Diphtheria-Tetanus-Pertussis-Hepatitis B (DTP-HepB), DTP-HepB-Hib (Haemophilus influenzae type B), polio (OPV), and measles (MCV1) vaccines from 2002 to 2014, collected through the regular reporting system, were used to identify immunization coverage trends in Lao PDR. Correlation analysis was performed using immunization coverage, characteristics of provinces or districts (population, population density, and proportion of poor villages and high-risk villages), and factors related to immunization services (including the proportions of the following: villages served by each health facility level, vaccine session types, and presence of well-functioning cold chain equipment). To determine factors associated with low coverage, provinces were categorized based on 80% DTP-HepB-Hib3 coverage (<80% = low group; ≥80% = high group). Coverage of BCG, DTP-HepB3, OPV3 and MCV1 increased gradually from 2007 to 2014 (82.2-88.3% in 2014). However, BCG coverage showed the least improvement from 2002 to 2014. The coverage of each vaccine correlated with the coverage of the other vaccines and with the DTP-HepB-Hib dropout rate in provinces as well as districts. Provinces with low immunization coverage had higher proportions of poor villages. Routine immunization coverage has been improving over the last 13 years, but the national goal has not yet been reached in Lao PDR. The results of this study suggest that BCG coverage and poor villages should be targeted to improve

  5. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  6. Policy Choices for Progressive Realization of Universal Health Coverage Comment on "Ethical Perspective: Five Unacceptable Trade-offs on the Path to Universal Health Coverage".

    Science.gov (United States)

    Tangcharoensathien, Viroj; Patcharanarumol, Walaiporn; Panichkriangkrai, Warisa; Sommanustweechai, Angkana

    2016-07-31

    In response to Norheim's editorial, this commentary offers reflections from Thailand on how the five unacceptable trade-offs were applied to the universal health coverage (UHC) reforms between 1975 and 2002, when the whole population of 64 million people became covered by one of the three public health insurance systems. This commentary aims to generate global discussion on how UHC can best be gradually achieved. Beyond the proposed five discrete trade-offs within each dimension, there are also trade-offs between the three dimensions of UHC: population coverage, service coverage and cost coverage. Findings from Thai UHC show that equity was applied to the extension of population coverage, with low-income households and the informal sector as the priority population groups for coverage extension by different prepayment schemes in 1975 and 1984, respectively. An exception was public sector employees, who were historically covered as part of fringe benefits well before the poor; private sector employees were covered last, in 1990. Historically, Thailand applied a comprehensive benefit package in which a few items were excluded using a negative list, until capacities in health technology assessment improved and cost-effectiveness came to be used for the inclusion of new interventions into the benefit package. Not only cost-effectiveness but also long-term budget impact, equity and ethical considerations are taken into account. Cost coverage is mostly determined by fiscal capacity. A close-ended budget with a mix of provider payment methods is used as a tool to trade off service coverage against financial risk protection. Introducing copayment in the context of fee-for-service can be harmful to beneficiaries due to supplier-induced demand, inefficiency and unpredictable out-of-pocket payments by households. UHC achieved favorable outcomes as it was implemented when there was full geographical coverage of primary healthcare in all districts and sub

  7. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small while noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape will take exponentially large time as noise approaches zero. The majority of the time is wasted on the uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and by applying the reinjection on them. This method can be used to calculate the exit location distribution. It is verified by examining two classical examples and is compared with theoretical predictions. The results show that the method performs well for weak noise while may induce certain deviations for large noise. Finally, some possible ways to improve our method are discussed.
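
    For orientation, the sketch below runs the naive baseline that the paper improves upon: plain Euler-Maruyama paths in a toy anisotropic well, stopped at the unit circle, giving an empirical exit-location (exit-angle) distribution. The potential, noise level, and step size are my own illustrative choices, and no interfaces or reinjection are used, which is why the cost explodes as the noise is reduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def exit_angles(eps=0.25, dt=1e-3, n_paths=400, max_steps=2_000_000):
    """Naive Euler-Maruyama exit-location sampling for the overdamped dynamics
    dX = -grad V dt + sqrt(2*eps) dW with V(x, y) = (x**2 + 2*y**2) / 2,
    started at the origin and stopped on first exit from the unit disc."""
    x = np.zeros((n_paths, 2))
    angles = np.full(n_paths, np.nan)
    active = np.ones(n_paths, dtype=bool)
    for _ in range(max_steps):
        if not active.any():
            break
        xa = x[active]
        drift = -np.column_stack([xa[:, 0], 2.0 * xa[:, 1]])
        x[active] = xa + drift * dt + np.sqrt(2.0 * eps * dt) * rng.standard_normal(xa.shape)
        hit = active & ((x ** 2).sum(axis=1) >= 1.0)
        angles[hit] = np.arctan2(x[hit, 1], x[hit, 0])
        active &= ~hit
    return angles[~np.isnan(angles)]

a = exit_angles()
near_x = (np.abs(a) < np.pi / 6) | (np.abs(a) > 5 * np.pi / 6)
print("fraction of exits near the shallow x-direction:", round(near_x.mean(), 3))
```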

  8. TU-AB-BRB-00: New Methods to Ensure Target Coverage

    International Nuclear Information System (INIS)

    2015-01-01

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand

  9. Theoretical analysis on the probability of initiating persistent fission chain

    International Nuclear Information System (INIS)

    Liu Jianjun; Wang Zhe; Zhang Ben'ai

    2005-01-01

    For a finite multiplying system of fissile material in the presence of a weak neutron source, the authors analyse the probability of initiating a persistent fission chain using the stochastic theory of neutron multiplication. In the theoretical treatment, the conventional point-reactor model is developed into an improved form with position x and velocity v dependence. The estimated results, including an approximate value of the probability mentioned above and its distribution, are given by means of a diffusion approximation and compared with those of the previous point-reactor model. They are basically consistent; however, the present model can provide details of the distribution. (authors)
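
    In the simplest point model, the probability of initiating a persistent chain is one minus the extinction probability of a Galton-Watson process, where the offspring generating function mixes absorption/leakage with the fission multiplicity distribution. The sketch below implements that textbook calculation with hypothetical numbers; it does not include the position- and velocity-dependent refinements the authors develop.

```python
def persistent_chain_probability(p_fission, nu_dist, tol=1e-12):
    """Point-model probability that one source neutron starts a persistent
    chain, computed as 1 minus the Galton-Watson extinction probability.
    p_fission: probability a neutron induces fission (vs. leakage/capture);
    nu_dist:   {neutrons emitted per fission: probability}."""
    def offspring_pgf(z):
        return (1.0 - p_fission) + p_fission * sum(
            p * z ** nu for nu, p in nu_dist.items())

    q = 0.0                                  # extinction-probability iterate
    while True:
        q_new = offspring_pgf(q)
        if abs(q_new - q) < tol:
            return 1.0 - q_new
        q = q_new

# Hypothetical supercritical example: k = 0.45 * mean(nu) = 0.45 * 2.5 > 1.
nu_dist = {0: 0.05, 1: 0.15, 2: 0.25, 3: 0.35, 4: 0.20}
print(persistent_chain_probability(0.45, nu_dist))
```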

  10. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model so that it can be used for practical forensic genetics and stimulate further discussions. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.
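
    A minimal sketch of the kind of logistic model discussed above: the drop-out probability is modelled on a logit scale as a function of a signal proxy (log peak height) and the number of PCR cycles. The data are simulated, and the covariates and coefficients are assumptions made for illustration, not the model or data of Tvedebrink et al.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Simulated data: drop-out probability follows a logistic model in log peak
# height and number of PCR cycles (coefficients chosen for illustration only).
n = 2000
log_height = rng.normal(3.0, 0.8, n)
cycles = rng.choice([28.0, 29.0, 30.0], size=n)
true_logit = 8.0 - 3.0 * log_height + 0.15 * (cycles - 28.0)
dropout = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

# Fit the logistic regression and predict a drop-out probability.
X = sm.add_constant(np.column_stack([log_height, cycles]))
fit = sm.GLM(dropout, X, family=sm.families.Binomial()).fit()
print(fit.params)                                  # intercept, log height, cycles
print(fit.predict(np.array([[1.0, 2.5, 30.0]])))   # [constant, log height, cycles]
```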

  11. Early clinical outcome of coverage probability based treatment planning for simultaneous integrated boost of nodes in locally advanced cervical cancer

    DEFF Research Database (Denmark)

    Lindegaard, Jacob Chr; Assenholt, Marianne; Ramlov, Anne

    2017-01-01

    ) using volumetric arc therapy (VMAT) followed by magnetic resonance imaging (MRI) guided brachytherapy. PAN RT (13 pts) was given if >2 nodes or if node(s) were present at the common iliac vessels or PAN. Nodal gross tumour volumes (GTV-N) were contoured on both PET-CT and MRI. Clinical target volume......% and CTV-N D50 ≥ 101.5%. RESULTS: Seventy-four nodes were boosted. A consistent 5.0 ± 0.7 Gy dose reduction from CTV-N D98 to PTV-N D98 was obtained. In total, 73/74 nodes were in complete remission at 3 months PET-CT and MRI. Pelvic control was obtained in 21/23 patients. One patient (IB2, clear cell) had...

  12. Impact of the error sensing probability in wide coverage areas of clustered-based wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Edgar Romo-Montiel

    2016-01-01

    Full Text Available Wireless sensor networks are composed of a large number of autonomous nodes that monitor some environmental parameter of interest, such as temperature, humidity, or even mobile targets. This work focuses on the detection of mobile targets in wide areas, such as the monitoring of animals in a forest or the detection of vehicles in security missions. Specifically, a low-energy-consumption clustering protocol is proposed, analyzed, and studied. To this end, two communication schemes based on the well-known LEACH protocol are presented. The performance of the system is studied by means of a mathematical model that describes the behaviour of the network in terms of its most relevant parameters: coverage radius, transmission radius, and number of nodes in the network. Additionally, the transmission probability in the cluster-formation phase is studied under realistic assumptions about the wireless channel, where signal detection suffers errors due to interference and noise on the access channel.

  13. Coverage probability of bootstrap confidence intervals in heavy-tailed frequency models, with application to precipitation data

    Czech Academy of Sciences Publication Activity Database

    Kyselý, Jan

    2010-01-01

    Roč. 101, 3-4 (2010), s. 345-361 ISSN 0177-798X R&D Projects: GA AV ČR KJB300420801 Institutional research plan: CEZ:AV0Z30420517 Keywords : bootstrap * extreme value analysis * confidence intervals * heavy-tailed distributions * precipitation amounts Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 1.684, year: 2010
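
    The coverage probability of a bootstrap confidence interval, the quantity studied in this record, can be checked directly by simulation. The sketch below estimates the empirical coverage of a nominal 90% percentile interval for the mean of a heavy-tailed (log-normal) sample; the distribution, sample size, and level are illustrative choices, not taken from the precipitation analysis.

```python
import numpy as np

rng = np.random.default_rng(5)

def percentile_ci_coverage(n=60, n_sim=500, n_boot=999, level=0.90):
    """Monte Carlo estimate of the coverage probability of a bootstrap
    percentile confidence interval for the mean of a log-normal(0, 1) sample
    (a convenient heavy-tailed stand-in)."""
    true_mean = np.exp(0.5)                      # E[X] for log-normal(0, 1)
    hits = 0
    for _ in range(n_sim):
        x = rng.lognormal(0.0, 1.0, n)
        idx = rng.integers(0, n, size=(n_boot, n))
        boot_means = x[idx].mean(axis=1)
        lo, hi = np.quantile(boot_means, [(1 - level) / 2, (1 + level) / 2])
        hits += (lo <= true_mean) and (true_mean <= hi)
    return hits / n_sim

print("empirical coverage of a nominal 90% percentile interval:",
      percentile_ci_coverage())
```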

  14. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  15. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability

  16. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  17. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  18. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method

  19. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  20. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  1. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  2. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  3. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  4. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  5. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  6. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  7. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  8. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  9. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  10. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  11. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  12. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  13. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  14. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  15. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
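
    The binomial point-estimate demonstration can be made concrete with the well-known 29-of-29 case: passing with no misses is consistent, at 95% confidence, with a POD of at least about 0.90, since 0.90^29 is just above 0.05. The sketch below computes the probability of passing the demonstration (PPD) as a function of the true POD and the allowed number of misses; it does not address the false-call (POF) side, and the example POD values are arbitrary.

```python
from math import comb

def prob_passing_demo(pod, n=29, max_misses=0):
    """Probability of passing a binomial POD demonstration: detect at least
    n - max_misses of n same-size flaws, given the true POD."""
    return sum(comb(n, k) * pod ** k * (1 - pod) ** (n - k)
               for k in range(n - max_misses, n + 1))

print("PPD at true POD 0.90:", round(prob_passing_demo(0.90), 3))   # about 0.047
print("PPD at true POD 0.98:", round(prob_passing_demo(0.98), 3))   # about 0.557
print("chance of passing despite POD 0.80:", round(prob_passing_demo(0.80), 4))
```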

  16. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology, such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
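
    As a small, self-contained illustration of the quantum case (not taken from the commentary): when events are represented by projectors onto subspaces of a Hilbert space, probabilities are obtained by projecting a unit state vector, and the order of incompatible events matters, unlike in a Boolean event algebra.

      import numpy as np

      psi = np.array([1.0, 0.0])                      # unit state vector

      def projector(theta):
          """Rank-1 projector onto the line at angle theta in R^2."""
          v = np.array([np.cos(theta), np.sin(theta)])
          return np.outer(v, v)

      P_A = projector(0.0)          # event A: the x-axis subspace
      P_B = projector(np.pi / 4)    # event B: a subspace at 45 degrees

      p_A_then_B = np.linalg.norm(P_B @ P_A @ psi) ** 2   # 0.50
      p_B_then_A = np.linalg.norm(P_A @ P_B @ psi) ** 2   # 0.25
      print(p_A_then_B, p_B_then_A)  # sequential probabilities differ with order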

  17. In-hospital fellow coverage reduces communication errors in the surgical intensive care unit.

    Science.gov (United States)

    Williams, Mallory; Alban, Rodrigo F; Hardy, James P; Oxman, David A; Garcia, Edward R; Hevelone, Nathanael; Frendl, Gyorgy; Rogers, Selwyn O

    2014-06-01

    Staff coverage strategies of intensive care units (ICUs) impact clinical outcomes. High-intensity staff coverage strategies are associated with lower morbidity and mortality. Accessible clinical expertise, teamwork, and effective communication have all been attributed to the success of this coverage strategy. We evaluated the impact of in-hospital fellow coverage (IHFC) on improving communication of cardiorespiratory events. A prospective observational study was performed in an academic tertiary care center with high-intensity staff coverage. The main outcome measure was resident-to-fellow communication of cardiorespiratory events during IHFC vs home coverage (HC) periods. Three hundred twelve cardiorespiratory events were collected in 114 surgical ICU patients over 134 study days. Complete data were available for 306 events. One hundred three communication errors occurred. IHFC was associated with significantly better communication of events compared to HC: 89% of events were communicated during IHFC vs 51% of events during HC. Communication patterns of junior and midlevel residents were similar. Midlevel residents communicated 68% of all on-call events (87% IHFC vs 50% HC), and junior residents communicated 66% of events (94% IHFC vs 52% HC). Communication errors were lower in all ICUs during IHFC. In-hospital fellow coverage reduces communication errors. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. A comparative analysis of coverage decisions for outpatient pharmaceuticals: evidence from Denmark, Norway and Sweden.

    Science.gov (United States)

    Grepstad, Mari; Kanavos, Panos

    2015-02-01

    This study analyses the reasons for differences and similarities in coverage recommendations for outpatient pharmaceuticals in Denmark, Norway and Sweden, following HTA appraisals. A comparative analysis of all outpatient drug appraisals carried out between January 2009 and December 2012 was performed, including an analysis of the divergent coverage recommendations made by the three countries. Agreement levels between HTA agencies were measured using kappa scores. Consultations with stakeholders in the three countries were carried out to complement the discussion on HTA processes and reimbursement outcomes. Nineteen outpatient drug-indication pairs appraised in each of the three countries were identified, of which 6 pairs (32%) had divergent coverage recommendations. An uneven distribution of coverage recommendations was observed, with the highest overlap in appraisals between Norway and Sweden (free-marginal kappa 0.89). Similarities were found in priority-setting principles, mode of appraisal and reasoning for coverage recommendations. The study shows that health economic evaluation is less prominent or explicit in outpatient drug appraisals in Denmark than in Norway and Sweden, that all three countries could benefit from improved communication between appraisers and manufacturers, and that final coverage recommendations rely on factors other than safety, comparative efficacy or cost-effectiveness. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
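
    The study's appraisal data are not reproduced here; the sketch below only shows how a free-marginal (Brennan-Prediger) kappa is computed for two agencies issuing binary cover/do-not-cover recommendations, kappa = (P_o - 1/k) / (1 - 1/k) with k categories and P_o the observed agreement. The ratings are made up so that the result lands near the 0.89 reported for Norway and Sweden.

      def free_marginal_kappa(ratings_a, ratings_b, n_categories=2):
          """Brennan-Prediger (free-marginal) kappa for two raters."""
          assert len(ratings_a) == len(ratings_b)
          p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / len(ratings_a)
          p_e = 1.0 / n_categories          # chance agreement with free marginals
          return (p_o - p_e) / (1.0 - p_e)

      # Illustrative ratings for 19 drug-indication pairs (not the study's data).
      agency_1 = ["cover"] * 17 + ["reject"] * 2
      agency_2 = ["cover"] * 16 + ["reject"] * 3
      print(round(free_marginal_kappa(agency_1, agency_2), 2))   # 0.89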

  19. A framework to estimate the coverage of AOPs in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinkyun; Jung, Wondea [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, a framework to estimate the coverage of AOPs (abnormal operating procedures) in NPPs is proposed based on an SPV (Single Point Vulnerability) model. Sufficient coverage of AOPs is a prerequisite for improving the operational safety of NPPs because AOPs provide the series of proper actions to be conducted by human operators, which is crucial for coping with off-normal conditions caused by the failure of critical components. In this light, the catalog of BEs (basic events, i.e., SPV components) identified from an SPV model is a good source of information for enhancing the coverage of AOPs. Unfortunately, because of the very large number of corresponding MCSs (minimal cut sets), a screening process is needed to select critical MCSs. For this reason, the MCSC score is defined along with the DIF concept. Based on the MCSC score, a framework that allows the coverage of AOPs to be investigated systematically is proposed in Ref. As a result, the coverage of the AOPs used in OPR1000 is estimated to be about 63%. There are a couple of limitations in this study. For example, the precision of the coverage estimate depends entirely on that of the SPV model scrutinized by the proposed framework. This implies that independent reviews by SMEs (subject matter experts) with sufficient knowledge of both the configuration and operation of NPPs are needed to confirm the appropriateness of the suggested framework.
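
    The MCSC score and the DIF concept are not defined in the abstract, so the following is only a rough sketch of the general shape of such a framework, assuming a placeholder importance score (cut-set probability): rank the minimal cut sets, keep the most important ones, and report the fraction addressed by at least one AOP. All names and numbers are hypothetical.

      def cut_set_probability(cut_set, basic_event_prob):
          """Point estimate of a minimal cut set: product of its basic-event probabilities."""
          p = 1.0
          for be in cut_set:
              p *= basic_event_prob[be]
          return p

      def aop_coverage(cut_sets, basic_event_prob, aop_covered_events, keep_fraction=0.1):
          """Keep the top `keep_fraction` of cut sets by the placeholder score and
          report the share containing at least one AOP-covered basic event."""
          ranked = sorted(cut_sets,
                          key=lambda cs: cut_set_probability(cs, basic_event_prob),
                          reverse=True)
          critical = ranked[:max(1, int(keep_fraction * len(ranked)))]
          covered = sum(any(be in aop_covered_events for be in cs) for cs in critical)
          return covered / len(critical)

      # Hypothetical example: four cut sets over basic events A-D; AOPs address A and C.
      p = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-5}
      cut_sets = [{"A", "B"}, {"C"}, {"B", "D"}, {"A", "D"}]
      print(aop_coverage(cut_sets, p, aop_covered_events={"A", "C"}, keep_fraction=0.5))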

  20. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1 and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
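
    The US NRC's simplified PSA model is not reproduced here. The sketch below only illustrates the kind of sensitivity study the abstract describes, assuming a made-up two-cut-set model: compute the core damage probability with a rare-event approximation and sweep one component's failure probability between 0 and 1.

      import numpy as np

      def core_damage_probability(p):
          """Hypothetical model with two minimal cut sets:
          (initiating event AND HPI pump fails AND AFW pump fails) and
          (initiating event AND emergency diesel generator fails)."""
          cut1 = p["IE"] * p["HPI"] * p["AFW"]
          cut2 = p["IE"] * p["EDG"]
          return min(1.0, cut1 + cut2)      # rare-event approximation, capped at 1

      base = {"IE": 1e-2, "HPI": 1e-3, "AFW": 1e-3, "EDG": 1e-4}
      for p_hpi in np.logspace(-4, 0, 5):   # sweep the HPI pump failure probability
          probs = dict(base, HPI=p_hpi)
          print(f"P(HPI fails) = {p_hpi:.0e}  ->  P(core damage) = {core_damage_probability(probs):.2e}")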