Failure-probability driven dose painting
Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark); Berthelsen, Anne K. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet, University of Copenhagen, Copenhagen 2100 (Denmark)]; Bentzen, Søren M. [Department of Radiation Oncology, Rigshospitalet, University of Copenhagen, Copenhagen 2100, Denmark and Departments of Human Oncology and Medical Physics, University of Wisconsin, Madison, Wisconsin 53792 (United States)]
2013-08-15
Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature, and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose–response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%), as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
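The multi-compartment TCP logic described above can be sketched as follows. This is a minimal illustration, assuming a logistic dose–response per subvolume with overall control requiring control of every compartment; the D50 values, slope, and dose levels are invented for the example and are not the paper's fitted parameters.

```python
import math

def tcp_logistic(dose, d50, gamma50):
    """Logistic dose-response: control probability at `dose` for a
    compartment with 50% control dose `d50` and normalized slope `gamma50`."""
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose / d50)))

def total_tcp(doses, d50s, gamma50=2.0):
    """Total TCP is the product of per-compartment control probabilities:
    the tumour is controlled only if every subvolume is controlled."""
    p = 1.0
    for dose, d50 in zip(doses, d50s):
        p *= tcp_logistic(dose, d50, gamma50)
    return p

# Illustrative comparison: a uniform prescription vs a redistributed one
# (higher dose centrally, lower in elective volumes). D50s are made up,
# decreasing from the central (radioresistant) to the elective compartments.
d50s = [72.0, 66.0, 60.0, 45.0, 40.0]
uniform = total_tcp([68.0] * 5, d50s)
painted = total_tcp([82.0, 76.0, 68.0, 54.0, 50.0], d50s)
```

With these toy numbers the redistributed plan yields a higher predicted total TCP at a similar overall treatment intensity, mirroring the mechanism the abstract describes.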
Hanford Environmental Dose Reconstruction Project
Finch, S.M.; McMakin, A.H. (comps.)
1992-02-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; environmental pathways and dose estimates.
Hanford Environmental Dose Reconstruction Project
Finch, S.M.; McMakin, A.H. (comps.)
1991-01-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon and Washington, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demographics, Agriculture, and Food Habits; and Environmental Pathways and Dose Estimates.
Hanford Environmental Dose Reconstruction Project
McMakin, A.H.; Cannon, S.D.; Finch, S.M. (comps.)
1992-07-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates. Progress is discussed.
Hanford Environmental Dose Reconstruction Project
Finch, S.M. (comp.)
1990-01-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates). The Source Terms Task develops estimates of radioactive emissions from Hanford facilities since 1944. The Environmental Transport Task reconstructs the movement of radioactive materials from the areas of release to populations. The Environmental Monitoring Data Task assembles, evaluates, and reports historical environmental monitoring data. The Demographics, Agriculture, Food Habits Task develops the data needed to identify the populations that could have been affected by the releases. In addition to population and demographic data, the food and water resources and consumption patterns for populations are estimated because they provide a primary pathway for the intake of radionuclides. The Environmental Pathways and Dose Estimates Task uses the information produced by the other tasks to estimate the radiation doses populations could have received from Hanford radiation. Project progress is documented in this monthly report, which is available to the public. 3 figs., 3 tabs.
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Probability of causation: Implications for radiological protection and dose limitation
Fabrikant, J.I.
1987-05-01
This report on the probability of causation of radiation-induced cancer is an attempt to bring together biology, chemistry, physics, and statistics to calculate a value in the form of a ratio expressed as a percentage. It involves the interactions of numerous cancer risk factors, and all are fraught with technical difficulties and uncertainties. It is a computational approach to a societal problem that should be resolved in the political arena by men and women of government and law. But it must be examined, because at present we have no reasonable method to explain the complexity of the mechanism of radiation-induced cancer and the probability of injury to an individual exposed in the past to ionizing radiation, and because society does not know how to compensate such a person who may have been injured by radiation, particularly low-level radiation. Five questions are discussed that concern probability of causation of radiation-induced cancer. First, what is it and how can we best define the concept? Second, what are the methods of estimation and cancer causation? Third, what are the uncertainties involved? Fourth, what are the strengths and limitations of the computational approach? And fifth, what are the implications for radiological protection and dose limitation?
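The "ratio expressed as a percentage" is conventionally computed as the probability of causation, PC = ERR / (1 + ERR), where ERR is the excess relative risk attributable to the exposure. A one-line sketch of that standard formulation follows; it is the textbook definition, not necessarily the report's exact computation:

```python
def probability_of_causation(excess_relative_risk):
    """PC = ERR / (1 + ERR): the share of the total cancer risk
    attributable to the radiation exposure, given the excess relative
    risk ERR estimated from dose, age, sex, and other risk factors."""
    return excess_relative_risk / (1.0 + excess_relative_risk)

# Example: an estimated ERR of 0.25 implies a PC of 20%.
pc = probability_of_causation(0.25)
```

The uncertainties the report emphasizes enter through the ERR estimate itself, which depends on dose reconstruction and the assumed risk model.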
Probability distribution fitting of schedule overruns in construction projects
Love, P.E.D.; Sing, C.-P.; Wang, X.; Edwards, D.J.; Odeyinka, H.
2013-01-01
The probability of schedule overruns for construction and engineering projects can be ascertained using a ‘best fit’ probability distribution from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data; including the Kolmogorov–...
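The 'best fit' selection described above can be sketched with SciPy. The data below are synthetic stand-ins for the 276-project sample; candidate distributions are fitted by maximum likelihood and ranked by the Kolmogorov-Smirnov statistic, as the abstract's truncated sentence suggests.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic schedule-overrun data (percent); real data would come from
# the 276 Australian construction and engineering projects analysed.
overruns = rng.lognormal(mean=2.0, sigma=0.6, size=276)

# Non-Gaussian shape shows up in skewness and kurtosis, as reported.
skew = stats.skew(overruns)
kurt = stats.kurtosis(overruns)

# Fit candidate distributions and rank by the Kolmogorov-Smirnov statistic.
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma, "norm": stats.norm}
fits = {}
for name, dist in candidates.items():
    params = dist.fit(overruns)
    ks_stat, p_value = stats.kstest(overruns, name, args=params)
    fits[name] = ks_stat

best = min(fits, key=fits.get)  # smallest KS statistic = best empirical fit
```

For the right-skewed synthetic sample, the skewed candidates beat the normal distribution, which is the qualitative conclusion the abstract draws for real overrun data.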
Tilly, David; Ahnesjö, Anders
2015-07-01
A fast algorithm is constructed to facilitate dose calculation for a large number of randomly sampled treatment scenarios, each representing a possible realisation of a full treatment with geometric, fraction specific displacements for an arbitrary number of fractions. The algorithm is applied to construct a dose volume coverage probability map (DVCM) based on dose calculated for several hundred treatment scenarios to enable the probabilistic evaluation of a treatment plan. For each treatment scenario, the algorithm calculates the total dose by perturbing a pre-calculated dose, separately for the primary and scatter dose components, for the nominal conditions. The ratio of the scenario specific accumulated fluence, and the average fluence for an infinite number of fractions is used to perturb the pre-calculated dose. Irregularities in the accumulated fluence may cause numerical instabilities in the ratio, which is mitigated by regularisation through convolution with a dose pencil kernel. Compared to full dose calculations the algorithm demonstrates a speedup factor of ~1000. The comparisons to full calculations show a 99% gamma index (2%/2 mm) pass rate for a single highly modulated beam in a virtual water phantom subject to setup errors during five fractions. The gamma comparison shows a 100% pass rate in a moving tumour irradiated by a single beam in a lung-like virtual phantom. DVCM iso-probability lines computed with the fast algorithm, and with full dose calculation for each of the fractions, for a hypo-fractionated prostate case treated with rotational arc therapy treatment were almost indistinguishable.
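The core perturbation step can be sketched in 1D. This is a toy illustration of the fluence-ratio idea under stated assumptions: the function name is invented, a Gaussian profile stands in for the pencil-kernel regularisation, and primary/scatter components are not separated as they are in the full algorithm.

```python
import numpy as np

def fluence_ratio_dose(nominal_dose, acc_fluence, avg_fluence, kernel):
    """Scenario dose ~ nominal dose scaled by the ratio of the scenario-
    accumulated fluence to the infinite-fraction average fluence. Both
    fluences are convolved with a kernel profile first, which regularises
    the ratio where the accumulated fluence is irregular."""
    num = np.convolve(acc_fluence, kernel, mode="same")
    den = np.convolve(avg_fluence, kernel, mode="same")
    return nominal_dose * num / np.maximum(den, 1e-12)

# Toy 1D scenario: 3 fractions of a flat field with random setup shifts.
rng = np.random.default_rng(1)
x = np.arange(100)
avg = ((x > 20) & (x < 80)).astype(float)        # expected fluence profile
shifts = rng.integers(-3, 4, size=3)             # per-fraction displacements
acc = np.mean([np.roll(avg, s) for s in shifts], axis=0)
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
kernel /= kernel.sum()                           # normalised Gaussian kernel
nominal = avg * 2.0                              # pretend 2 Gy in-field dose
dose = fluence_ratio_dose(nominal, acc, avg, kernel)
```

Deep inside the field the accumulated and average fluence agree, so the scenario dose reproduces the nominal dose; the perturbation only acts near the field edges, which is why the approach is so much cheaper than recomputing dose per scenario.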
Hanford Environmental Dose Reconstruction Project Monthly Report
Finch, S.M. (comp.)
1990-01-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics; agriculture; food habits; and environmental pathways and dose estimates. 3 figs.
Hanford Environmental Dose Reconstruction Project monthly report
Finch, S.M. (comp.)
1990-12-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates. 3 figs., 3 tabs.
Hanford Environmental Dose Reconstruction Project Monthly Report
Finch, S.M. (comp.)
1991-07-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): Source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates. 2 figs., 2 tabs.
Hanford Environmental Dose Reconstruction Project monthly report
Finch, S.M. (comp.)
1991-10-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates.
Hanford Environmental Dose Reconstruction Project Monthly Report
Dennis, B.S. (comp.)
1990-04-01
This monthly report summarizes the technical progress and project status for the Hanford Environmental Dose Reconstruction (HEDR) Project being conducted at Pacific Northwest Laboratory (PNL) under the direction of a Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks address each of the primary steps in the path from radioactive releases to dose estimates: source terms, environmental transport, environmental monitoring data, demographics, agriculture, and food habits, and environmental pathways and dose estimates. The source terms task will develop estimates for radioactive emissions from Hanford facilities since 1944. These estimates will be based on historical measurements and production information. 1 fig., 1 tab.
Project Management Plan for the Hanford Environmental Dose Reconstruction Project
Shipler, D.B.
1992-03-01
This Project Management Plan (PMP) describes the approach that will be used to manage the Hanford Environmental Dose Reconstruction (HEDR) Project. The plan describes the management structure and the technical and administrative control systems that will be used to plan and control the HEDR Project performance. The plan also describes the relationship among key project participants: Battelle, the Centers for Disease Control (CDC), and the Technical Steering Panel (TSP).
Project Management Plan for the Hanford Environmental Dose Reconstruction Project
Shipler, D.B.; McMakin, A.H.; Finch, S.M.
1992-09-01
This Project Management Plan (PMP) describes the approach being used to manage the Hanford Environmental Dose Reconstruction (HEDR) Project. The plan describes the management structure and the technical and administrative control systems used to plan and control HEDR Project performance. The plan also describes the relationship among key project participants: Battelle, the Centers for Disease Control (CDC), and the Technical Steering Panel (TSP). Battelle's contract with CDC only extends through May 1994, when the key technical work will be completed. Therefore, this plan is focused only on the period during which Battelle is a participant.
Selvaraj, Jothybasu; Baker, Colin; Nahum, Alan
2016-06-01
The impact of microscopic disease extension (MDE), extra-CTV tumour islets (TIs), incidental dose, and dose conformity on tumour control probability (TCP) is analyzed using in silico simulations in this study. MDE in the region between the GTV and CTV is simulated, inclusive of geometric uncertainties (GE), using spherical targets and spherical dose distributions. To study the effect of incidental dose on TIs and the effect of the dose-response curve (DRC) on tumour control, islets were randomly distributed and TCP was calculated for various dose levels by rescaling the dose. Further, the impact of dose conformity on required PTV margins is also studied. The required PTV margins are ~2 mm smaller when an exponential clonogen-density fall-off in the GTV-CTV region is assumed rather than a uniform clonogen density. However, margins are almost equal in both cases if GE is higher, which shows that GE has a profound impact on margins. The effect of TIs showed a biphasic relation with increasing dose, indicating that patients with islets not in the beam paths do not benefit from dose escalation. Increasing dose conformity is also found to have a considerable effect on TCP loss, especially for larger GE. Further, smaller margins in IGRT should be used with caution where uncertainty in CTV definition is of concern.
Probability Distribution and Projected Trends of Daily Precipitation in China
Cao, Li-Ge; Zhong, Jun; Su, Bu-Da; Zhai, Jian-Qing; Gemmer, Marco
2013-01-01
Based on observed daily precipitation data from 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of the daily precipitation series and daily precipitation extremes are analyzed. Results show that, except for the western Qinghai-Tibetan Plateau and South China, the distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. The CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China, and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increases of the maximum daily rainfall and the longest non-precipitation period during the flood season in the aforementioned regions also indicate increasing trends of droughts and floods in the next 40 years.
Impact of temporal probability in 4D dose calculation for lung tumors.
Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi
2015-11-08
The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation matrix included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can
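The accumulation step described above, summing the deformed phase doses weighted by their temporal probabilities, can be sketched as follows. The function name and toy dose arrays are illustrative; real phase doses would be 3D grids already deformed onto the breath-hold CT.

```python
import numpy as np

def accumulate_4d_dose(phase_doses, temporal_probs):
    """4D dose on the reference (breath-hold) geometry: the sum of the
    phase doses, each already deformed to the reference CT, weighted by
    the fraction of time the patient spends in that respiratory phase."""
    probs = np.asarray(temporal_probs, dtype=float)
    assert np.isclose(probs.sum(), 1.0), "temporal probabilities must sum to 1"
    return np.tensordot(probs, np.asarray(phase_doses), axes=1)

# Toy example: 10 respiratory phases, 3-voxel dose arrays, with a uniform
# temporal probability distribution (each phase occupies 10% of the cycle).
phase_doses = [np.array([2.0, 1.9, 0.5]) + 0.01 * k for k in range(10)]
uniform = accumulate_4d_dose(phase_doses, [0.1] * 10)
```

Swapping in a sinusoidal or patient-specific probability vector changes only the weights, which is exactly the comparison the study performs.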
(none listed)
2009-01-01
[Usage] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps," it indicates that something is very likely, usually expressing a positive inference or judgment based on the present situation.
High-dose neutron detector project update
Menlove, Howard Olsen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Henzlova, Daniela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-08-10
These are the slides for a progress review meeting by the sponsor. This is an update on the high-dose neutron detector project. In summary, improvements in both boron coating and signal amplification have been achieved; improved boron coating materials and procedures have increased efficiency by ~ 30-40% without the corresponding increase in the detector plate area; low dead-time via thin cell design (~ 4 mm gas gaps) and fast amplifiers; prototype PDT 8” pod has been received and testing is in progress; significant improvements in efficiency and stability have been verified; use commercial PDT ^{10}B design and fabrication to obtain a faster path from the research to practical high-dose neutron detector.
Hanford Environmental Dose Reconstruction Project monthly report, August 1992
McMakin, A.H.; Cannon, S.D.; Finch, S.M. (comps.)
1992-09-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates.
Gibbs, S J
2000-10-01
Effective dose equivalents (H(E)) and effective doses (E) for radiographic projections common in dentistry, calculated from the same organ dose distributions, are presented to determine whether the 2 quantities can be directly compared. Doses to all organs and tissues in the head, neck, trunk, and proximal extremities were determined for each projection (intraoral full-mouth radiographic survey, panoramic, cephalometric, temporomandibular tomograms, and submentovertex view) by computer simulation with Monte Carlo methods. H(E) and E were calculated from these complete distributions and by methods prescribed by the International Commission on Radiological Protection (ICRP). H(E) and E computed from complete dose distributions were found comparable within a few percentage points. However, those computed by strict application of ICRP methods were not. For radiographic projections with highly localized dose distributions, such as those common in dentistry, direct comparison of H(E) and E may not be meaningful, unless both computation algorithms are known.
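Both H(E) and E are tissue-weighted sums of organ doses, E = Σ_T w_T · H_T, differing in which weights and organs are included; the comparison problem the abstract describes arises because highly localized dental exposures concentrate dose in a few tissues. A minimal sketch of the weighted sum, using a hypothetical subset of tissues and weights for illustration only:

```python
# Effective dose as a tissue-weighted sum of organ equivalent doses,
# E = sum_T w_T * H_T. The organs, weights, and doses below are a
# made-up subset chosen for illustration, not a full ICRP weighting table.
def effective_dose(organ_doses, tissue_weights):
    """Weighted sum of organ equivalent doses (same unit as the inputs)."""
    return sum(tissue_weights[organ] * h for organ, h in organ_doses.items())

weights = {"thyroid": 0.05, "bone_surface": 0.01, "remainder": 0.05}
doses = {"thyroid": 0.4, "bone_surface": 0.9, "remainder": 0.2}  # mSv, made up
e = effective_dose(doses, weights)
```

Because only a few w_T terms dominate for a localized exposure, truncating or redistributing the remainder tissues (as strict application of the ICRP recipe can do) shifts E noticeably, which is the discrepancy the study reports.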
Phase 1 of the Hanford Environmental Dose Reconstruction Project
1990-07-20
This report summarizes the water pathway portion of the first phase of the Hanford Environmental Dose Reconstruction (HEDR) Project, conducted by Battelle staff at the Pacific Northwest Laboratory under the direction of an independent Technical Steering Panel. The HEDR Project is estimating radiation doses that could have been received by the public from the Department of Energy's Hanford Site, in southeastern Washington State. Phase 1 of the water-pathway dose reconstruction sought to determine whether dose estimates could be calculated for populations in the area from above the Hanford Site at Priest Rapids Dam to below the site at McNary Dam from January 1964 to December 1966. Of the potential sources of radionuclides from the river, fish consumption was the most important. Later phases of the HEDR Project will address dose estimates for periods other than 1964--1966 and for populations downstream of McNary Dam. 17 refs., 20 figs., 1 tab.
Project of evaluation of doses in computed tomography in Poland
Slusarczyk-Kacprzyk, W.; Skrzynski, W.; Bulski, W. [Centre of Oncology, Medical Physics Dept., Warsaw (Poland)
2006-07-01
The project for the evaluation of doses in computed tomography in Poland is based on the organizational solutions implemented and evaluated at one of the Polish oncological centres. In this study we analyzed doses for a group of 484 patients who underwent an examination with a G.E. HiSpeed CT scanner at the Centre of Oncology in Warsaw. Patient doses (weighted computed tomography dose index, CTDIw, and dose-length product, DLP) have been compared against reference values published by the Polish Ministry of Health. We found that typical patient doses do not exceed the reference values. As reference dose levels are defined only for standard-size patients, they may sometimes be exceeded for a properly performed examination. Polish reference dose levels are not based on up-to-date data and should be revised. (authors)
Hanford Environmental Dose Reconstruction Project. Monthly report, December 1991
Finch, S.M.; McMakin, A.H. (comps.)
1991-12-31
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon and Washington, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demographics, Agriculture, and Food Habits; and Environmental Pathways and Dose Estimates.
Phase 1 of the Hanford Environmental Dose Reconstruction Project
1991-08-01
The work described in this report was prompted by the public's concern about potential effects from the radioactive materials released from the Hanford Site. The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses the public might have received from the Hanford Site since 1944, when facilities began operating. Phase 1 of the HEDR Project is a "pilot" or "demonstration" phase. The objectives of this initial phase were to determine whether enough historical information could be found or reconstructed to be used for dose estimation, and to develop and test conceptual and computational models for calculating credible dose estimates. Preliminary estimates of radiation doses were produced in Phase 1 because they are needed to achieve these objectives. The reader is cautioned that the dose estimates provided in this and other Phase 1 HEDR reports are preliminary. As the HEDR Project continues, the dose estimates will change for at least three reasons: more complete input information for models will be developed; the models themselves will be refined; and the size and shape of the geographic study area will change. This is one of three draft reports that summarize the first phase of the four-phased HEDR Project. This, the Summary Report, is directed to readers who want a general understanding of the Phase 1 work and preliminary dose estimates. The two other reports -- the Air Pathway Report and the Columbia River Pathway Report -- are for readers who understand the radiation dose assessment process and want to see more technical detail. Detailed descriptions of the dose reconstruction process are available in more than 20 supporting reports listed in Appendix A. 32 refs., 46 figs.
Updating Dosimetry for Emergency Response Dose Projections.
DeCair, Sara
2016-02-01
In 2013, the U.S. Environmental Protection Agency (EPA) proposed an update to the 1992 Protective Action Guides (PAG) Manual. The PAG Manual provides guidance to state and local officials planning for radiological emergencies. EPA requested public comment on the proposed revisions, while making them available for interim use by officials faced with an emergency situation. Developed with interagency partners, EPA's proposal incorporates newer dosimetric methods, identifies tools and guidelines developed since the current document was issued, and extends the scope of the PAGs to all significant radiological incidents, including radiological dispersal devices or improvised nuclear devices. In order to best serve the emergency management community, scientific policy direction had to be set on how to use International Commission on Radiological Protection Publication 60 age groups in dose assessment when implementing emergency guidelines. Certain guidelines that lend themselves to different PAGs for different subpopulations are the PAGs for potassium iodide (KI), food, and water. These guidelines provide age-specific recommendations because of the radiosensitivity of the thyroid and young children with respect to ingestion and inhalation doses in particular. Taking protective actions like using KI, avoiding certain foods or using alternative sources of drinking water can be relatively simple to implement by the parents of young children. Clear public messages can convey which age groups should take which action, unlike how an evacuation or relocation order should apply to entire households or neighborhoods. New in the PAG Manual is planning guidance for the late phase of an incident, after the situation is stabilized and efforts turn toward recovery. Because the late phase can take years to complete, decision makers are faced with managing public exposures in areas not fully remediated. The proposal includes quick-reference operational guidelines to inform re-entry to
Project DORIS - Dose reduction in Swedish BWRs
Lundgren, K.; Elkert, J.; Ingemansson, T.
1994-12-01
Radiation exposures show an increasing trend in Swedish BWRs, whereas the corresponding trend in foreign BWRs is decreasing. The overall result is that the Swedish BWRs can no longer be regarded as low-exposure plants in an international comparison. The changed situation has called for the establishment of more fundamental ALARA programs in the Swedish BWRs, and the purpose of the DORIS project, ordered by SSI, is to serve as a basis for such utility efforts. The basis of the investigation is a comprehensive analysis of exposure and radiation data from the ABB Atom BWRs. The analysis shows that the main reason for the increasing exposures in the Swedish BWRs is gradually increasing radiation levels, and this increase is mainly due to the buildup of Co-60 activity on system surfaces. Extensive computer simulations have been performed to find the factors responsible for this radiation buildup. The following main factors have been identified: cobalt inflow to the reactor circuit from erosion-corrosion of Stellite in turbine and reactor systems; ever higher burnup levels for BWR fuel; a tendency toward too low iron inflow during recent years in some of the reactors; fuel failures resulting in considerable contamination of the fuel with tramp uranium; lower inflow of zinc due to replacement of brass tubes in turbine condensers with titanium tubes; and high moisture content in the reactor steam, especially after uprated power levels. 22 refs.
Smoothing and projecting age-specific probabilities of death by TOPALS
Joop de Beer
2012-10-01
BACKGROUND TOPALS is a new relational model for smoothing and projecting age schedules. The model is operationally simple, flexible, and transparent. OBJECTIVE This article demonstrates how TOPALS can be used for both smoothing and projecting age-specific mortality for 26 European countries and compares the results of TOPALS with those of other smoothing and projection methods. METHODS TOPALS uses a linear spline to describe the ratios between the age-specific death probabilities of a given country and a standard age schedule. For smoothing purposes I use the average of death probabilities over 15 Western European countries as standard, whereas for projection purposes I use an age schedule of 'best practice' mortality. A partial adjustment model projects how quickly the death probabilities move in the direction of the best-practice level of mortality. RESULTS On average, TOPALS performs better than the Heligman-Pollard model and the Brass relational method in smoothing mortality age schedules. TOPALS can produce projections that are similar to those of the Lee-Carter method, but can easily be used to produce alternative scenarios as well. This article presents three projections of life expectancy at birth for the year 2060 for 26 European countries. The Baseline scenario assumes a continuation of the past trend in each country, the Convergence scenario assumes that there is a common trend across European countries, and the Acceleration scenario assumes that the future decline of death probabilities will exceed that in the past. The Baseline scenario projects that average European life expectancy at birth will increase to 80 years for men and 87 years for women in 2060, whereas the Acceleration scenario projects an increase to 90 and 93 years respectively. CONCLUSIONS TOPALS is a useful new tool for demographers for both smoothing age schedules and making scenarios.
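The TOPALS mechanism described above -- fit a linear spline to the ratio between an observed schedule of death probabilities and a standard schedule, then scale the standard schedule by the fitted ratio -- can be sketched in a few lines. The knot ages, noise model, and toy data below are illustrative assumptions, not de Beer's actual specification:

```python
import numpy as np

def topals_smooth(q_obs, q_std, knots):
    """TOPALS-style smoothing (sketch): model the ratio of observed to
    standard age-specific death probabilities with a linear spline,
    then multiply the fitted ratio back onto the standard schedule."""
    ages = np.arange(len(q_obs), dtype=float)
    # Linear-spline basis: intercept, slope, and one hinge max(0, age - k) per knot
    basis = np.column_stack(
        [np.ones_like(ages), ages] + [np.maximum(0.0, ages - k) for k in knots]
    )
    coef, *_ = np.linalg.lstsq(basis, q_obs / q_std, rcond=None)
    return q_std * (basis @ coef)

# Toy usage: a noisy schedule sitting roughly 20% above a smooth standard
rng = np.random.default_rng(0)
q_std = np.linspace(0.001, 0.1, 91)                # ages 0..90
q_obs = q_std * (1.2 + rng.normal(0.0, 0.05, 91))  # multiplicative noise
q_hat = topals_smooth(q_obs, q_std, knots=[15, 50, 80])
```

Because the true (constant) ratio lies in the spline's span, the least-squares fit averages out most of the noise while keeping the result anchored to the standard schedule's age shape.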
Estimation of food consumption. Hanford Environmental Dose Reconstruction Project
Callaway, J.M. Jr.
1992-04-01
The research reported in this document was conducted as a part of the Hanford Environmental Dose Reconstruction (HEDR) Project. The objective of the HEDR Project is to estimate the radiation doses that people could have received from operations at the Hanford Site. Information required to estimate these doses includes estimates of the amounts of potentially contaminated foods that individuals in the region consumed during the study period. In that general framework, the objective of the Food Consumption Task was to develop a capability to provide information about the parameters of the distribution(s) of daily food consumption for representative groups in the population for selected years during the study period. This report describes the methods and data used to estimate food consumption and presents the results developed for Phase I of the HEDR Project.
Validation of HEDR models. Hanford Environmental Dose Reconstruction Project
Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.
1994-05-01
The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.
吕渭济; 崔巍
2001-01-01
In this paper, two kinds of models for project investment risk income are presented and optimized on the basis of the probability X distribution. One kind of model is proved to have only a maximal value, and the other kind is proved to have no extreme values.
Balásházy, Imre; Madas, Balázs Gergely; Hofmann, Werner
2013-01-01
Cellular hit probabilities of alpha particles emitted by inhaled radon progenies in sensitive bronchial epithelial cell nuclei were simulated at low exposure levels to obtain useful data for the rejection or in support of the linear-non-threshold (LNT) hypothesis. In this study, local distributions of deposited inhaled radon progenies in airway bifurcation models were computed at exposure conditions, which are characteristic of homes and uranium mines. Then, maximum local deposition enhancement factors at bronchial airway bifurcations, expressed as the ratio of local to average deposition densities, were determined to characterize the inhomogeneity of deposition and to elucidate their effect on resulting hit probabilities. The results obtained suggest that in the vicinity of the carinal regions of the central airways the probability of multiple hits can be quite high even at low average doses. Assuming a uniform distribution of activity there are practically no multiple hits and the hit probability as a funct...
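The multiple-hit argument in the abstract above has a simple quantitative core: if hits to a cell nucleus are Poisson-distributed around some mean, the chance of two or more hits follows directly. A minimal sketch (the mean hit numbers used here are illustrative, not values from the study):

```python
import math

def multi_hit_prob(mean_hits):
    """P(at least 2 hits) for a Poisson-distributed hit count.
    Under a uniform activity distribution the mean is small and this
    probability is negligible; local deposition hot spots (e.g. near
    the carina) raise the mean and make multiple hits likely."""
    return 1.0 - math.exp(-mean_hits) * (1.0 + mean_hits)

low = multi_hit_prob(0.1)  # uniform-deposition regime: well under 1%
hot = multi_hit_prob(3.0)  # hot-spot regime: multiple hits dominate
```

This is why deposition inhomogeneity matters for the LNT debate: the same average dose yields very different multi-hit probabilities depending on how concentrated the deposition is.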
Valdes, Gilmer, E-mail: gilmer.valdes@uphs.upenn.edu [Department of Radiation Oncology, Perelman Center for Advanced Medicine, University of Pennsylvania, Philadelphia, PA (United States); Robinson, Clifford [Department of Radiation Oncology, Siteman Cancer Center, Washington University in St. Louis, St. Louis, MO (United States); Lee, Percy [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States); Morel, Delphine [Department of Biomedical Engineering, AIX Marseille 2 University, Marseille (France); Department of Medical Physics, Joseph Fourier University, Grenoble (France); Low, Daniel; Iwamoto, Keisuke S.; Lamb, James M. [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States)
2015-04-01
Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVH) of the gross tumor volume (GTV), planning tumor volume (PTV), and PTV_setup were compared according to target coverage and dose. The PTV_setup was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases, it was by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include margin for setup uncertainty but not respiratory motion.
Havelaar, A. H.; Swart, A. N.
2014-01-01
Dose-response models in microbial risk assessment consider two steps in the process ultimately leading to illness: from exposure to (asymptomatic) infection, and from infection to (symptomatic) illness. Most data and theoretical approaches are available for the exposure-infection step; the infection
FY 1991 project plan for the Hanford Environmental Dose Reconstruction Project, Phase 2
1991-02-01
Phase 1 of the Hanford Environmental Dose Reconstruction Project was designed to develop and demonstrate a method for estimating radiation doses people may have received from Hanford Site operations since 1944. The method researchers developed relied on a variety of measured and reconstructed data as input to a modular computer model that generates dose estimates and their uncertainties. As part of Phase 1, researchers used the reconstructed data and computer model to calculate preliminary dose estimates for populations in a limited geographical area and time period. Phase 2, now under way, is designed to evaluate the Phase 1 data and model and improve them to calculate more accurate and precise dose estimates. Phase 2 will also be used to obtain preliminary estimates of two categories of doses: for Native American tribes and for individuals included in the pilot phase of the Hanford Thyroid Disease Study (HTDS). TSP Directive 90-1 required HEDR staff to develop Phase 2 task plans for TSP approval. Draft task plans for Phase 2 were submitted to the TSP at the October 11--12, 1990 public meeting, and, after discussions of each activity and associated budget needs, the TSP directed HEDR staff to proceed with a slate of specific project activities for FY 1991 of Phase 2. This project plan contains detailed information about those activities. Phase 2 is expected to last 15--18 months. In mid-FY 1991, project activities and budget will be reevaluated to determine whether technical needs or priorities have changed. Separate from, but related to, this project plan will be an integrated plan for the remainder of the project. HEDR staff will work with the TSP to map out a strategy that clearly describes "end products" for the project and the work necessary to complete them. This level of planning will provide a framework within which project decisions in Phases 2, 3, and 4 can be made.
Atwell, William; Tylka, Allan J.; Dietrich, William F.; Rojdev, Kristina; Matzkind, Courtney
2016-01-01
In an earlier paper presented at ICES in 2015, we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the monthly smoothed sunspot number (SSN) was less than 30. Although such months are generally considered "solar-quiet", SPEs observed during these months even include Ground Level Events, the most energetic type of SPE. In this paper, we add to the previous study those SPEs that occurred in 1973-2015 when the SSN was greater than 30 but less than 50. Based on the observable energy range of the solar protons, we classify the events as GLEs, sub-GLEs, and sub-sub-GLEs, all of which are potential contributors to the radiation hazard. We use the spectra of these events to construct a probabilistic model of the absorbed dose due to solar protons when SSN < 50 at various confidence levels for various depths of shielding and for various mission durations. We provide plots and tables of solar proton-induced absorbed dose as functions of confidence level, shielding thickness, and mission duration that will be useful to system designers.
Calculating tumor trajectory and dose-of-the-day using cone-beam CT projections
Jones, Bernard L; Miften, Moyed
2015-01-01
Purpose: Cone-beam CT (CBCT) projection images provide anatomical data in real-time over several respiratory cycles, forming a comprehensive picture of tumor movement. We developed and validated a method which uses these projections to determine the trajectory of and dose to highly mobile tumors during each fraction of treatment. Methods: CBCT images of a respiration phantom were acquired, the trajectory of which mimicked a lung tumor with high amplitude (up to 2.5 cm) and hysteresis. A template-matching algorithm was used to identify the location of a steel BB in each CBCT projection, and a Gaussian probability density function for the absolute BB position was calculated which best fit the observed trajectory of the BB in the imager geometry. Two modifications of the trajectory reconstruction were investigated: first, using respiratory phase information to refine the trajectory estimation (Phase), and second, using the Monte Carlo (MC) method to sample the estimated Gaussian tumor position distribution. Resu...
Laheij GMH; Uijt de Haag PAM
1993-01-01
The research presented in this report forms part of the PROSA (PRObabilistic Safety Assessment) project. PROSA aims to determine the radiological effects on humans and the characteristics relevant to safety of disposal concepts for radioactive waste in rocksalt formations. This report describes
Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)
2015-01-15
To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)
Phase 1 of the Hanford Environmental Dose Reconstruction Project
1990-07-20
For more than 40 years, the US government made plutonium for nuclear weapons at the Hanford Site in southeastern Washington State. Radioactive materials were released to both the air and water from Hanford. People could have been exposed to these materials, called radionuclides. The Hanford Environmental Dose Reconstruction (HEDR) Project is a multi-year scientific study to estimate the radiation doses the public may have received as a result of these releases. The study began in 1988. During the first phase, scientists began to develop and test methods for reconstructing the radiation doses. To do this, scientists found or reconstructed information about the amount and type of radionuclides that were released from Hanford facilities, where they traveled in the environment, and how they reached people. Information about the people who could have been exposed was also found or reconstructed. Scientists then developed a computer model that can estimate doses from radiation exposure received many years ago. All the information that had been gathered was fed into the computer model. Then scientists did a "test run" to see whether the model was working properly. As part of its "test run," scientists asked the computer model to generate two types of preliminary results: amounts of radionuclides in the environment (air, soil, pasture grass, food, and milk) and preliminary doses people could have received from all the routes of radiation exposure, called exposure pathways. Preliminary dose estimates were made for categories of people who shared certain characteristics and for the Phase 1 population as a whole. 26 refs., 48 figs.
Stereoscopic interpretation of low-dose breast tomosynthesis projection images.
Muralidhar, Gautam S; Markey, Mia K; Bovik, Alan C; Haygood, Tamara Miner; Stephens, Tanya W; Geiser, William R; Garg, Naveen; Adrada, Beatriz E; Dogan, Basak E; Carkaci, Selin; Khisty, Raunak; Whitman, Gary J
2014-04-01
The purpose of this study was to evaluate stereoscopic perception of low-dose breast tomosynthesis projection images. In this Institutional Review Board exempt study, craniocaudal breast tomosynthesis cases (N = 47), consisting of 23 biopsy-proven malignant mass cases and 24 normal cases, were retrospectively reviewed. A stereoscopic pair comprised of two projection images that were ±4° apart from the zero angle projection was displayed on a Planar PL2010M stereoscopic display (Planar Systems, Inc., Beaverton, OR, USA). An experienced breast imager verified the truth for each case stereoscopically. A two-phase blinded observer study was conducted. In the first phase, two experienced breast imagers rated their ability to perceive 3D information using a scale of 1-3 and described the most suspicious lesion using the BI-RADS® descriptors. In the second phase, four experienced breast imagers were asked to make a binary decision on whether they saw a mass for which they would initiate a diagnostic workup or not and also report the location of the mass and provide a confidence score in the range of 0-100. The sensitivity and the specificity of the lesion detection task were evaluated. The results from our study suggest that radiologists who can perceive stereo can reliably interpret breast tomosynthesis projection images using stereoscopic viewing.
Buffa, F M; Nahum, A E
2000-10-01
The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, sigma(d); whilst the quantities d and sigma(d) depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens with increasing statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10(8) from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error
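The Poisson tcp model referenced above maps a dose distribution to a control probability via the expected number of surviving clonogens, so noise in voxel doses propagates nonlinearly into tcp. A minimal single-hit sketch (the clonogen density, radiosensitivity, and uniform-dose DVH below are illustrative assumptions, and the quadratic LQ term is omitted):

```python
import math

def poisson_tcp(dvh, rho, alpha):
    """Poisson tumour control probability from a differential DVH (sketch).
    dvh: iterable of (dose_Gy, volume_cm3) bins; rho: clonogens per cm3;
    alpha: radiosensitivity per Gy (linear term only, an assumption)."""
    # Expected number of surviving clonogens, summed over dose bins
    surviving = sum(rho * v * math.exp(-alpha * d) for d, v in dvh)
    return math.exp(-surviving)

# Toy usage: a 100 cm3 tumour receiving a uniform dose
uniform_70 = [(70.0, 1.0)] * 100
tcp_70 = poisson_tcp(uniform_70, rho=1e7, alpha=0.35)
tcp_75 = poisson_tcp([(75.0, 1.0)] * 100, rho=1e7, alpha=0.35)
```

Because exp(-alpha*d) is convex in d, zero-mean dose noise inflates the expected number of surviving clonogens (Jensen's inequality) and therefore lowers the computed tcp, which is consistent with the systematic underestimation the abstract reports.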
Morris, J M
1986-08-01
The performance analysis of maneuverable reentry vehicles (MaRV), in terms of the probability of penetration (PoP) against terminal engagement by a ballistic missile defense (BMD) system and the associated circular error probability (CEP) at impact, is a very complex problem. A thorough study of this problem under the MaRV Penetration Study Project will require the development of a number of analytical and simulation tools. As a result of a preliminary study, a MaRV PoP vs CEP analysis concept has been formulated to support the MaRV Penetration Study Project. The concept is based on analytical models and techniques; moreover, it exploits the existing knowledge base and is physically intuitive. The analysis concept, as formulated, is applicable to arbitrary MaRVs and BMD systems.
Freshley, M.D.; Thorne, P.D.
1992-08-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is being conducted to estimate radiation doses that populations and individuals could have received from Hanford Site operations from 1944 to the present. Four possible pathways by which radionuclides migrating in ground water on the Hanford Site could have reached the public have been identified: (1) through contaminated ground water migrating to the Columbia River; (2) through wells on or adjacent to the Hanford Site; (3) through wells next to the Columbia River downstream of Hanford that draw some or all of their water from the river (riparian wells); and (4) through atmospheric deposition resulting in contamination of a small watershed that, in turn, results in contamination of a shallow well or spring by transport in the ground water. These four pathways make up the "ground-water pathway," which is the subject of this study. Assessment of the ground-water pathway was performed by (1) reviewing the existing extensive literature on ground water and ground-water monitoring at Hanford and (2) performing calculations to estimate radionuclide concentrations where no monitoring data were collected. Radiation doses that would result from exposure to these radionuclides were calculated.
Zhang, H; Kong, V; Jin, J [Georgia Regents University Cancer Center, Augusta, GA (United States)]; Ren, L; Zhang, Y; Giles, W [Duke University Medical Center, Durham, NC (United States)]
2015-06-15
Purpose: To present a cone beam computed tomography (CBCT) system, which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed, and was used to acquire gridded projections with complimentary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its 2 neighboring projections. Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image using projections after IPSF processing for pTV. A probability map was generated depending on the confidence of estimation in IPSF for the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image for a Catphan, and was compared to conventional CBCT image without using SMOG, images without using IPSF (SMOG + FDK and SMOG + mask-TV), and image without using pTV (SMOG + IPSF + FDK). Results: The conventional CBCT without using SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + TV) artifacts, possibly due to using projections of missing data. The 2 approaches with SMOG + IPSF removes the cup artifacts, and the pTV approach is superior than the FDK by substantially reducing the noise. Using the SMOG also reduces half of the imaging dose. Conclusion: The proposed technique is promising in improving CBCT image quality while reducing imaging dose.
A multiscale filter for noise reduction of low-dose cone beam projections.
Yao, Weiguang; Farr, Jonathan B
2015-08-21
The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a certain detector. This dependence indicates the natural requirement of multiscale filters to smooth noise while preserving structures of the imaged object on the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(-x²/(2σ_f²)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression of σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio. We analytically derived the variance of residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of residual noise, the optimal σ_f² is proved to be proportional to the noiseless fluence and modulated by local structure strength, expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After applying the filter to the Catphan phantom projections scanned with a pulse time of 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with a pulse time of 4 ms, the visibility of soft tissue structures in the patient was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
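The central idea above -- a Gaussian filter whose scale tracks the local (noiseless) fluence -- amounts to smoothing each sample with its own kernel width. A minimal 1D sketch of such per-sample filtering (the choice of scales below is a placeholder, not the paper's optimal σ_f² formula):

```python
import numpy as np

def adaptive_gaussian_1d(signal, sigmas):
    """Smooth each sample with its own Gaussian scale (sketch).
    signal: 1D array; sigmas: per-sample kernel widths, e.g. chosen
    proportional to an estimate of the noiseless fluence."""
    n = len(signal)
    positions = np.arange(n, dtype=float)
    out = np.empty(n)
    for i in range(n):
        # Gaussian weights centred on sample i, normalised over the array
        w = np.exp(-0.5 * ((positions - i) / max(sigmas[i], 1e-6)) ** 2)
        out[i] = np.dot(w, signal) / w.sum()
    return out

# Toy usage: heavier smoothing where the (assumed known) signal level is higher
signal = np.concatenate([np.full(50, 10.0), np.full(50, 100.0)])
noisy = signal + np.random.default_rng(1).normal(0.0, 1.0, 100)
smoothed = adaptive_gaussian_1d(noisy, sigmas=0.05 * signal)
```

In the paper the scales additionally shrink near structure (via the linear fitting error) so that edges are preserved; this sketch keeps only the fluence-proportional part.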
Hanford Environmental Dose Reconstruction Project. Quarterly report, June--August 1993
Cannon, S.D.; Finch, S.M. [comps.]
1993-10-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms, Environmental Transport, Environmental Monitoring Data, Demography, Food Consumption, and Agriculture, and Environmental Pathways and Dose Estimates.
Gholamreza Norouzi
2015-01-01
In the project management context, time management is one of the most important factors affecting project success. This paper proposes a new method to solve research project scheduling problems (RPSP) containing Fuzzy Graphical Evaluation and Review Technique (FGERT) networks. Through the deliverables of this method, a proper estimation of project completion time (PCT) and success probability can be achieved. Algorithms were developed to cover all features of the problem based on three main parameters: duration, occurrence probability, and success probability. These developed algorithms are known as PR-FGERT (Parallel and Reversible-Fuzzy GERT networks). The main framework involves simplifying the project network and taking regular steps to determine PCT and success probability. Simplifications include (1) equivalent making of parallel and series branches in the fuzzy network considering the concepts of probabilistic nodes, (2) equivalent making of delay or reversible-to-itself branches and the impact of changing the parameters of time and probability based on removing related branches, (3) equivalent making of simple and complex loops, and (4) an algorithm provided to resolve the no-loop fuzzy network after equivalent making. Finally, the performance of the models was compared with existing methods. The results showed proper and realistic performance of the models in comparison with existing methods.
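The "equivalent making" of series and parallel branches can be illustrated with crisp numbers in place of fuzzy ones. The reduction rules below are generic GERT-style ones assumed for illustration, not taken from the paper; each branch is a (duration, success probability) pair.

```python
def series_reduce(branches):
    """Series branches collapse to one branch: durations add and success
    probabilities multiply (every branch must succeed)."""
    duration = sum(d for d, _ in branches)
    p_success = 1.0
    for _, p in branches:
        p_success *= p
    return duration, p_success

def parallel_reduce(branches, occurrence):
    """Exclusive parallel branches leaving a probabilistic node collapse to
    the occurrence-weighted mean duration and mean success probability."""
    total = sum(occurrence)
    duration = sum(o * d for o, (d, _) in zip(occurrence, branches)) / total
    p_success = sum(o * p for o, (_, p) in zip(occurrence, branches)) / total
    return duration, p_success
```

For example, series_reduce([(2, 0.9), (3, 0.8)]) yields a 5-unit duration with 0.72 success probability; repeated application of the two rules shrinks a loop-free network to a single equivalent branch, mirroring the paper's simplification steps.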
FY 1992 revised task plans for the Hanford Environmental Dose Reconstruction Project. Revision 1
Shipler, D.B.
1992-04-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to populations and individuals. The primary objective of the work to be performed in FY 1992 is to determine the appropriate scope (space, time, radionuclides, pathways, and individuals/population groups) and accuracy (level of uncertainty in dose estimates) for the project. Another objective is to use a refined computer model to estimate Native American tribal doses and individual doses for the Hanford Thyroid Disease Study (HTDS). Project scope and accuracy requirements defined in FY 1992 can be translated into model and data requirements that must be satisfied during FY 1993.
Peres, David Johnny; Cancelliere, Antonino
2017-04-01
Climate change related to uncontrolled greenhouse gas emissions is expected to modify climate characteristics in a harmful way, increasing the frequency of many precipitation-triggered natural hazards, landslides included. In our study we analyse regional climate model (RCM) projections with the aim of assessing the potential future modifications of rainfall event characteristics linked to shallow landslide triggering, such as: event duration, total depth, and inter-arrival time. Factors of change of the mean and the variance of these rainfall-event characteristics are exploited to adjust a stochastic rainfall generator aimed at simulating precipitation series likely to occur in the future. Then Monte Carlo simulations - where the stochastic rainfall generator and a physically based hydromechanical model are coupled - are carried out to estimate the probability of landslide triggering for future time horizons, and its changes with respect to current climate conditions. The proposed methodology is applied to the Peloritani region in Sicily, Italy, an area that in the past two decades has experienced several catastrophic shallow and rapidly moving landslide events. Different RCM simulations from the Coordinated regional Climate Downscaling Experiment (CORDEX) initiative are considered in the application, as well as two different emission scenarios, known as Representative Concentration Pathways: intermediate (RCP 4.5) and high-emissions (RCP 8.5). The estimated rainfall event characteristics modifications differ significantly both in magnitude and in direction (increase/decrease) from one model to another. RCMs are concordant only in predicting an increase of the mean of inter-event dry intervals. The variance of rainfall depth exhibits maximum changes (increase or decrease depending on the RCM), and it is the characteristic to which landslide triggering seems to be more sensitive. Some RCMs indicate significant variations of landslide probability due to climate
de Bruijn, Renée; Dabekaussen, Willem; Hijma, Marc; Wiersma, Ane; Abspoel-Bukman, Linda; Boeije, Remco; Courage, Wim; van der Geest, Johan; Hamburg, Marc; Harmsma, Edwin; Helmholt, Kristian; van den Heuvel, Frank; Kruse, Henk; Langius, Erik; Lazovik, Elena
2017-04-01
Due to heterogeneity of the subsurface in the delta environment of the Netherlands, differential subsidence over short distances results in tension and subsequent wear of subsurface infrastructure, such as water and gas pipelines. Due to uncertainties in the build-up of the subsurface, however, it is unknown where this problem is the most prominent. This is a problem for asset managers deciding when a pipeline needs replacement: damaged pipelines endanger security of supply and pose a significant threat to safety, yet premature replacement raises needless expenses. In both cases, costs - financial or other - are high. Therefore, an interdisciplinary research team of geotechnicians, geologists and Big Data engineers from research institutes TNO, Deltares and SkyGeo developed a stochastic model to predict differential subsidence and the probability of consequent pipeline failure on a (sub-)street level. In this project pipeline data from company databases is combined with a stochastic geological model and information on (historical) groundwater levels and overburden material. Probability of pipeline failure is modelled by a coupling with a subsidence model and two separate models on pipeline behaviour under stress, using a probabilistic approach. The total length of pipelines (approx. 200,000 km operational in the Netherlands) and the complexity of the model chain that is needed to calculate a probability of failure results in large computational challenges, as it requires massive evaluation of possible scenarios to reach the required level of confidence. To cope with this, a scalable computational infrastructure has been developed, composing a model workflow in which components have a heterogeneous technological basis. Three pilot areas covering an urban, a rural and a mixed environment, characterised by different groundwater-management strategies and different overburden histories, are used to evaluate the differences in subsidence and uncertainties that come with
Witte, Marnix G.; Sonke, Jan-Jakob; Siebers, Jeffrey; Deasy, Joseph O.; van Herk, Marcel
2017-10-01
In the past, hypothetical spherical target volumes and ideally conformal dose distributions were analyzed to establish the safety of planning target volume (PTV) margins. In this work we extended these models to estimate how alternative methods of shaping dose distributions could lead to clinical improvements. Based on a spherical clinical target volume (CTV) and Gaussian distributions of systematic and random geometrical uncertainties, idealized 3D dose distributions were optimized to exhibit specific stochastic properties. A nearby spherical organ at risk (OAR) was introduced to explore the benefit of non-spherical dose distributions. Optimizing for the same minimum dose safety criterion as implied by the generally accepted use of a PTV, the extent of the high dose region in one direction could be reduced by half provided that dose in other directions is sufficiently compensated. Further reduction of this unilateral dosimetric margin decreased the target dose confidence, however the actual minimum CTV dose at 90% confidence typically exceeded the minimum PTV dose by 20% of prescription. Incorporation of smooth dose-effect relations within the optimization led to more concentrated dose distributions compared to the use of a PTV, with an improved balance between the probability of tumor cell kill and the risk of geometrical miss, and lower dose to surrounding tissues. Tumor control rate improvements in excess of 20% were found to be common for equal integral dose, while at the same time evading a nearby OAR. These results were robust against uncertainties in dose-effect relations and target heterogeneity, and did not depend on ‘shoulders’ or ‘horns’ in the dose distributions.
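The confidence bookkeeping described above can be sketched with a 1D Monte Carlo: sample Gaussian systematic shifts, record the minimum CTV dose per simulated treatment course, and read off the dose exceeded at 90% confidence. The flat example profile and all parameter values here are hypothetical, not taken from the paper.

```python
import numpy as np

def min_dose_at_confidence(dose_fn, ctv_points, sigma_sys,
                           confidence=0.9, n_trials=4000, seed=0):
    """Monte Carlo estimate of the minimum CTV dose achieved in at least
    `confidence` of treatment courses under Gaussian systematic errors."""
    rng = np.random.default_rng(seed)
    shifts = rng.normal(0.0, sigma_sys, size=n_trials)
    mins = np.array([min(dose_fn(x + s) for x in ctv_points) for s in shifts])
    # The dose exceeded in `confidence` of courses is the lower quantile.
    return float(np.quantile(mins, 1.0 - confidence))

def example_dose(x):
    """Hypothetical ideally conformal profile: full dose within 2 cm of axis."""
    return 1.0 if abs(x) <= 2.0 else 0.0
```

For a 1 cm half-width CTV inside this profile (i.e. a 1 cm margin), a small sigma_sys leaves the 90%-confidence minimum dose at the full prescription, while a sigma_sys comparable to the margin drops it sharply: the geometric-miss regime the paper trades off against tumor cell kill.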
Mejía, C. A. Reynoso; Burgos, A. E. Buenfil; Trejo, C. Ruiz; García, A. Mota; Durán, E. Trejo; Ponce, M. Rodríguez; de Buen, I. Gamboa
2010-12-01
The aim of this thesis project is to compare doses calculated from the treatment planning system using computed tomography images with those measured "in vivo" using thermoluminescent dosimeters placed at different regions of the rectum and bladder of a patient during high-dose-rate intracavitary brachytherapy treatment of uterine cervical carcinoma. The experimental characterisation and calibration of the dosimeters have been concluded, and the protocol to carry out the "in vivo" measurements has been established. In this work, the calibration curves of two types of thermoluminescent dosimeters (rods and chips) are presented, and the proposed protocol to measure the "in vivo" dose is fully described.
Probability-weighted ensembles of U.S. county-level climate projections for climate risk analysis
Rasmussen, D J; Kopp, Robert E
2015-01-01
Quantitative assessment of climate change risk requires a method for constructing probabilistic time series of changes in physical climate parameters. Here, we develop two such methods, Surrogate/Model Mixed Ensemble (SMME) and Monte Carlo Pattern/Residual (MCPR), and apply them to construct joint probability density functions (PDFs) of temperature and precipitation change over the 21st century for every county in the United States. Both methods produce likely (67% probability) temperature and precipitation projections consistent with the Intergovernmental Panel on Climate Change's interpretation of an equal-weighted Coupled Model Intercomparison Project 5 (CMIP5) ensemble, but also provide full PDFs that include tail estimates. For example, both methods indicate that, under representative concentration pathway (RCP) 8.5, there is a 5% chance that the contiguous United States could warm by at least 8 °C. Variance decomposition of SMME and MCPR projections indicates that background variability dominates...
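A tail statement such as "a 5% chance of at least 8 °C warming" is an exceedance probability over the weighted ensemble PDF. A minimal sketch with made-up member values and weights (not the SMME/MCPR machinery itself):

```python
import numpy as np

def exceedance_probability(member_values, member_weights, threshold):
    """Probability-weighted chance that the projected change meets or
    exceeds a threshold, one projected value per ensemble member."""
    v = np.asarray(member_values, dtype=float)
    w = np.asarray(member_weights, dtype=float)
    w = w / w.sum()  # normalize so the weights form a probability mass
    return float(w[v >= threshold].sum())
```

With four equally weighted members projecting 2, 4, 8, and 9 degrees of warming, the chance of at least 8 degrees is 0.5; unequal weights shift the tail accordingly.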
Napier, B.A.
1992-12-01
A series of scoping calculations has been undertaken to evaluate the absolute and relative contributions of different radionuclides and exposure pathways to doses that may have been received by individuals living in the vicinity of the Hanford Site. This scoping calculation (Calculation 004) examined the contributions of numerous radionuclides to cumulative dose via environmental exposures and accumulation in foods. Addressed in this calculation were the contributions to organ and effective dose of infants and adults from (1) air submersion and groundshine external dose, (2) inhalation, (3) ingestion of soil by humans, (4) ingestion of leafy vegetables, (5) ingestion of other vegetables and fruits, (6) ingestion of meat, (7) ingestion of eggs, and (8) ingestion of cows' milk from Feeding Regime 1, as described in calculation 002. This calculation specifically addresses cumulative radiation doses to infants and adults resulting from releases occurring over the period 1945 through 1972.
Morris, R.
1996-05-01
Building 2 on the U.S. Department of Energy (DOE) Grand Junction Projects Office (GJPO) site, which is operated by Rust Geotech, is part of the GJPO Remedial Action Program. This report describes measurements and modeling efforts to evaluate the radiation dose to members of the public who might someday occupy or tear down Building 2. The assessment of future doses to those occupying or demolishing Building 2 is based on assumptions about future uses of the building, measured data when available, and predictive modeling when necessary. Future use of the building is likely to be as an office facility. The DOE sponsored program, RESRAD-BUILD, Version. 1.5 was chosen for the modeling tool. Releasing the building for unrestricted use instead of demolishing it now could save a substantial amount of money compared with the baseline cost estimate because the site telecommunications system, housed in Building 2, would not be disabled and replaced. The information developed in this analysis may be used as part of an as low as reasonably achievable (ALARA) cost/benefit determination regarding disposition of Building 2.
Smoothing and projecting age-specific probabilities of death by TOPALS
de Beer, J.A.A.
2012-01-01
Background: TOPALS is a new relational model for smoothing and projecting age schedules. The model is operationally simple, flexible, and transparent. Objective: This article demonstrates how TOPALS can be used for both smoothing and projecting age-specific mortality for 26 European countries and co
Ikenberry, T. A.; Burnett, R. A.; Napier, B. A.; Reitz, N. A.; Shipler, D. B.
1992-02-01
Preliminary radiation doses were estimated and reported during Phase I of the Hanford Environmental Dose Reconstruction (HEDR) Project. As the project has progressed, additional information regarding the magnitude and timing of past radioactive releases has been developed, and the general scope of the required calculations has been enhanced. The overall HEDR computational model for computing doses attributable to atmospheric releases from Hanford Site operations is called HEDRIC (Hanford Environmental Dose Reconstruction Integrated Codes). It consists of four interrelated models: source term, atmospheric transport, environmental accumulation, and individual dose. The source term and atmospheric transport models are documented elsewhere. This report describes the initial implementation of the design specifications for the environmental accumulation model and computer code, called DESCARTES (Dynamic EStimates of Concentrations and Accumulated Radionuclides in Terrestrial Environments), and the individual dose model and computer code, called CIDER (Calculation of Individual Doses from Environmental Radionuclides). The computations required of these models and the design specifications for their codes were documented in Napier et al. (1992). Revisions to the original specifications and the basis for modeling decisions are explained. This report is not the final code documentation but gives the status of the model and code development to date. Final code documentation is scheduled to be completed in FY 1994 following additional code upgrades and refinements. The user's guide included in this report describes the operation of the environmental accumulation and individual dose codes and associated pre- and post-processor programs. A programmer's guide describes the logical structure of the programs and their input and output files.
LU Wei-ji; CUI Wei
2001-01-01
In this paper, two kinds of models for project investment risk income are presented and optimized on the basis of the probability χ distribution. One kind of model is proved to have only a maximal value, and the other kind is proved to have no extreme values.
Leyton, F.; Nogueira, M. S.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Post-graduation in Sciences and Technology of Radiations, Minerals and Materials, Pte. Antonio Carlos No. 6627, Belo Horizonte 31270-901, Minas Gerais (Brazil); Gubolino, L.; Pivetta, M. R. [Hospital dos Fornecedores de Cana de Piracicaba, Av. Barao de Valenca 616, 13405-233 Piracicaba (Brazil); Ubeda, C., E-mail: leyton.fernando@gmail.com [Tarapaca University, Health Sciences Faculty, Radiological Sciences Center, Av. Gral. Velasquez 1775, 1000007 Arica, Arica and Parinacota (Chile)
2015-10-15
Cases of radiation-induced cataract among cardiology professionals have been reported in several studies. In view of this evidence of radiation injury, the ICRP recommends limiting the radiation dose to the lens to 20 mSv per year for occupational exposure. The aim of this work was to report scattered radiation doses at the height of the operator's eyes in an interventional cardiology facility, from procedures performed without radiation protection devices, correlated with different angiographic projections and operational modes. Measurements were made in a cardiac laboratory with a GE angiography X-ray system equipped with a flat-panel detector. PMMA plates of 30 x 30 x 5 cm were used to simulate a patient with a thickness of 20 cm. Two fluoroscopy modes (low and normal, 15 frames/s) and a cine mode at 15 frames/s were used, with four angiographic projections: anterior-posterior (Ap), lateral (Lat), left anterior oblique caudal (spider), and left anterior oblique cranial (Lao-45/cra-30), under a cardiac protocol for patients between 70 and 90 kg. Measurements of phantom entrance dose rate and scatter dose rate were performed with two Unfors Xi plus meters. The detector measuring scatter radiation was positioned at the usual distance of the cardiologist's eyes during working conditions (1 m from the isocenter and 1.7 m from the floor). There is a good linear correlation between the kerma-area product and the scatter dose at the lens. Experimental correlation factors of 2.3, 12.0, 12.2, and 17.6 μSv/Gy·cm² were found for the Ap, Lao/cra, spider, and Lat projections, respectively. The entrance dose rate of the PMMA phantom for low fluoroscopy, medium fluoroscopy, and cine was 13, 39, and 282 mGy/min, respectively, for the Ap projection. (Author)
Weiss, E V
2000-01-01
This report provides estimates of the expected whole body and extremity radiological dose, expressed as dose equivalent (DE), to workers conducting planned plutonium (Pu) stabilization processes at the Hanford Site Plutonium Finishing Plant (PFP). The report is based on a time and motion dose study commissioned for Project W-460, Plutonium Stabilization and Handling, to provide personnel exposure estimates for construction work in the PFP storage vault area plus operation of stabilization and packaging equipment at PFP.
Dose-projection considerations for emergency conditions at nuclear power plants
Stoetzel, G.A.; Ramsdell, J.V.; Poeton, R.W.; Powell, D.C.; Desrosiers, A.E.
1983-05-01
The purpose of this report is to review the problems and issues associated with making environmental radiation-dose projections during emergencies at nuclear power plants. The review is divided into three areas: source-term development, characterization of atmospheric dispersion and selection of appropriate dispersion models, and development of dosimetry calculations for determining thyroid dose and whole-body dose for ground-level and elevated releases. A discussion of uncertainties associated with these areas is also provided.
He, Weili; Cao, Xiting; Xu, Lu
2012-02-28
The evaluation of clinical proof of concept, optimal dose selection, and phase III probability of success has traditionally been conducted by a subjective and qualitative assessment of the efficacy and safety data. This, in part, was responsible for the numerous failed phase III programs in the past. The need to utilize more quantitative approaches to assess efficacy and safety profiles has never been greater. In this paper, we propose a framework that incorporates efficacy and safety data simultaneously for the joint evaluation of clinical proof of concept, optimal dose selection, and phase III probability of success. Simulation studies were conducted to evaluate the properties of our proposed methods. The proposed approach was applied to two real clinical studies. On the basis of the true outcome of the two clinical studies, the assessment based on our proposed approach suggested a reasonable path forward for both clinical programs.
Bone marrow dose in chest radiography: the posteroanterior vs. anteroposterior projection
Archer, B.R.; Whitmore, R.C.; North, L.B.; Bushong, S.C.
1979-10-01
The dose to active bone marrow resulting from anteroposterior (AP) and posteroanterior (PA) chest examinations was estimated using an Alderson Rando phantom and extruded lithium fluoride dosimeters. The AP projections resulted in a mean marrow dose range of 1.9 to 2.6 mrad (0.019 to 0.026 mGy) as compared to doses for PA projections of 3.4 to 3.8 mrad (0.034 to 0.038 mGy) for optimally diagnostic exposures taken at 70, 90, and 120 kVp.
Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios
Okjeong Lee
2016-01-01
According to the IPCC Fifth Assessment Report, air temperature and humidity are expected to gradually increase in the future over current levels. In this study, future PMPs are estimated by using future dew point temperature projection data obtained from RCM data provided by the Korea Meteorological Administration. First, the bias included in the future dew point temperature projection data, which is provided on a daily basis, is corrected through a quantile-mapping method. Next, using a scale-invariance technique, 12-hour-duration, 100-year-return-period dew point temperatures, which are essential input data for PMP estimation, are estimated from the bias-corrected future dew point temperature data. After estimating future PMPs, it can be shown that PMPs in all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5) are very likely to increase.
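The quantile-mapping bias correction named in the abstract can be sketched as empirical CDF matching; the 101-point quantile grid and the synthetic data below are arbitrary choices, not the study's configuration.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Bias-correct future model values: locate each value's quantile in the
    historical model CDF, then return the observed value at that quantile."""
    qs = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, qs)   # historical model quantiles
    obs_q = np.quantile(obs_hist, qs)       # observed quantiles
    ranks = np.interp(model_future, model_q, qs)
    return np.interp(ranks, qs, obs_q)
```

If the model runs uniformly 2 °C warm against observations, the mapping subtracts that bias from every future value within the calibrated range; distribution-shape biases are corrected quantile by quantile rather than by a single offset.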
Columbia River pathway report: phase I of the Hanford Environmental Dose Reconstruction Project
1991-07-01
This report summarizes the river-pathway portion of the first phase of the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project is estimating radiation doses that could have been received by the public from the Department of Energy's Hanford Site, in southeastern Washington State. Phase 1 of the river-pathway dose reconstruction effort sought to determine whether dose estimates could be calculated for populations in the area from above the Hanford Site at Priest Rapids Dam to below the site at McNary Dam from January 1964 to December 1966. Of the potential sources of radionuclides from the river, fish consumption was the most important. Doses from drinking water were lower at Pasco than at Richland and lower at Kennewick than at Pasco. The median values of preliminary dose estimates calculated by HEDR are similar to independent, previously published estimates of average doses to Richland residents. Later phases of the HEDR Project will address dose estimates for periods other than 1964--1966 and for populations downstream of McNary Dam. 17 refs., 19 figs., 1 tab.
Farris, W.T.; Napier, B.A.; Simpson, J.C.; Snyder, S.F.; Shipler, D.B.
1994-04-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of radionuclide emissions since 1944 from the Hanford Site. One objective of the HEDR Project is to estimate doses to individuals who were exposed to the radionuclides released to the Columbia River (the river pathway). This report documents the last in a series of dose calculations conducted on the Columbia River pathway. The report summarizes the technical approach used to estimate radiation doses to three classes of representative individuals who may have used the Columbia River as a source of drinking water, food, or for recreational or occupational purposes. In addition, the report briefly explains the approaches used to estimate the radioactivity released to the river, the development of the parameters used to model the uptake and movement of radioactive materials in aquatic systems such as the Columbia River, and the method of calculating the Columbia River's transport of radioactive materials. Potential Columbia River doses have been determined for representative individuals since the initiation of site activities in 1944. For this report, dose calculations were performed using conceptual models and computer codes developed for the purpose of estimating doses. All doses were estimated for representative individuals who share similar characteristics with segments of the general population.
Nahar, S N; Chen, G X; Pradhan, A K; Nahar, Sultana N.; Eissner, Werner; Chen, Guo-Xin; Pradhan, Anil K.
2003-01-01
An extensive set of fine structure levels and corresponding transition probabilities for allowed and forbidden transitions in Fe XVII is presented. A total of 490 bound energy levels of Fe XVII of total angular momenta 0 <= J <= 7 of even and odd parities with 2 <= n <= 10, 0 <= l <= 8, 0 <= L <= 8, and singlet and triplet multiplicities, are obtained. They translate to over 2.6 x 10^4 allowed (E1) transitions that are of dipole and intercombination type, and about 3000 forbidden transitions that include electric quadrupole (E2), magnetic dipole (M1), electric octupole (E3), and magnetic quadrupole (M2) type, representing the most detailed calculations to date for the ion. Oscillator strengths f, line strengths S, and coefficients A of spontaneous emission for the E1 type transitions are obtained in the relativistic Breit-Pauli R-matrix approximation. A-values for the forbidden transitions are obtained from atomic structure calculations using the codes SUPERSTRUCTURE and GRASP. The energy le...
Sheng, Ke; Cai, Jing; Brookeman, James; Molloy, Janelle; Christopher, John; Read, Paul
2006-09-01
Lung tumor motion trajectories measured by four-dimensional CT or dynamic MRI can be converted to a probability density function (PDF), which describes the probability of the tumor being at a certain position, for PDF-based treatment planning. Using this method in simulated sequential tomotherapy, we study the dose reduction of normal tissues and, more importantly, the effect of PDF reproducibility on the accuracy of dosimetry. For these purposes, realistic PDFs were obtained from two dynamic MRI scans of a healthy volunteer within a 2-week interval. The first PDF was accumulated from a 300 s scan and the second PDF was calculated from variable scan times from 5 s (one breathing cycle) to 300 s. Optimized beam fluences based on the second PDF were delivered to the hypothetical gross target volume (GTV) of a lung phantom that moved following the first PDF. The reproducibility between the two PDFs varied from low (78%) to high (94.8%) as the second scan time increased from 5 s to 300 s. When a highly reproducible PDF was used in optimization, the dose coverage of the GTV was maintained; the phantom lung volume receiving 10%-20% of the prescription dose was reduced by 40%-50% and the mean phantom lung dose was reduced by 9.6%. However, optimization based on a PDF with low reproducibility resulted in a 50% underdosed GTV. The dosimetric error increased nearly exponentially as the PDF error increased. Therefore, although the dose to the tissue surrounding the tumor can theoretically be reduced by PDF-based treatment planning, the reliability and applicability of this method depend highly on whether a reproducible PDF exists and is measurable. By correlating the dosimetric error and the PDF error, a useful guideline for PDF data acquisition and patient qualification for PDF-based planning can be derived.
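Building the planning PDF from a motion trace, and scoring the reproducibility of two PDFs, can be sketched as below; histogram intersection is an assumed similarity measure, not necessarily the percentage agreement used in the study.

```python
import numpy as np

def motion_pdf(trajectory, bins=20, pos_range=(-10.0, 10.0)):
    """Histogram a sampled tumor position trace (e.g. mm of superior-inferior
    displacement) into a discrete PDF over position."""
    hist, edges = np.histogram(trajectory, bins=bins, range=pos_range)
    pdf = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, pdf

def pdf_overlap(p, q):
    """Histogram intersection of two PDFs on the same grid (1.0 = identical)."""
    return float(np.minimum(p, q).sum())
```

Two PDFs accumulated from the same breathing pattern score near 1.0; a baseline shift between scans, as in the low-reproducibility case the abstract warns about, drives the overlap down.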
Annika eJakobi
2015-11-01
Introduction: Presently used radio-chemotherapy regimens result in moderate local control rates for patients with advanced head and neck squamous cell carcinoma (HNSCC). Dose escalation (DE) may be an option to improve patient outcome, but may also increase the risk of toxicities in healthy tissue. The presented treatment planning study evaluated the feasibility of two DE levels for advanced HNSCC patients, planned with either intensity-modulated photon therapy (IMXT) or proton therapy (IMPT). Materials and Methods: For 45 HNSCC patients, IMXT and IMPT treatment plans were created including DE via a simultaneous integrated boost (SIB) in the high-risk volume, while maintaining standard fractionation with 2 Gy per fraction in the remaining target volume. Two DE levels for the SIB were compared: 2.3 Gy and 2.6 Gy. Treatment plan evaluation included assessment of tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Results: An increase of approximately 10% in TCP was estimated between the DE levels. A pronounced high-dose rim surrounding the SIB volume was identified in IMXT treatment. Compared to IMPT, this extra dose slightly increased the TCP values and, to a larger extent, the NTCP values. For both modalities, the higher DE level led only to a small increase in NTCP values (mean differences < 2% in all models), except for the risk of aspiration, which increased on average by 8% and 6% with IMXT and IMPT, respectively, but showed considerable patient dependence. Conclusions: Both DE levels appear applicable to patients with IMXT and IMPT since all calculated NTCP values, except for one, increased only little for the higher DE level. The estimated TCP increase is of relevant magnitude. The higher DE schedule needs to be investigated carefully in the setting of a prospective clinical trial, especially regarding toxicities caused by high local doses that lack a sound dose response description, e.g., ulcers.
Simpson, J.C.; Ramsdell, J.V. Jr.
1993-04-01
Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analysis plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models, where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.
Miley, T.B.; Eslinger, P.W.; Nichols, W.E.; Lessor, K.S.; Ouderkirk, S.J.
1994-05-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the Hanford Site near Richland, Washington. The HEDR Project work is conducted under several technical and administrative tasks, among which is the Environmental Pathways and Dose Estimates task. The staff on this task have developed a suite of computer codes which are used to estimate doses to individuals in the public. This document contains the user instructions for the DESCARTES (Dynamic Estimates of Concentrations and Accumulated Radionuclides in Terrestrial Environments) suite of codes. In addition to the DESCARTES code, this includes two air data preprocessors, a database postprocessor, and several utility routines that are used to format input data needed for DESCARTES.
Murray, C.E.; Lee, W.J.
1992-12-01
The purpose of this report is to provide a bibliography for the Native American tribe participants in the Hanford Environmental Dose Reconstruction (HEDR) Project to use. The HEDR Project's primary objective is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's Hanford Site near Richland, Washington. Eight Native American tribes are responsible for estimating daily and seasonal consumption of traditional foods, demography, and other lifestyle factors that could have affected the radiation dose received by tribal members. This report provides a bibliography of recorded accounts that tribal researchers may use to verify their estimates. The bibliographic citations include references to information on the specific tribes, Columbia River plateau ethnobotany, infant feeding practices and milk consumption, nutritional studies and radiation, tribal economic and demographic characteristics (1940--1970), research methods, primary sources from the National Archives, regional archives, libraries, and museums.
Shipler, D.B.
1992-09-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to populations and individuals. The primary objective of work to be performed through May 1994 is to (1) determine the project's appropriate scope (space, time, radionuclides, pathways and individuals/population groups), (2) determine the project's appropriate level of accuracy (level of uncertainty in dose estimates) for the project, (3) complete model and data development, and (4) estimate doses for the Hanford Thyroid Disease Study (HTDS), representative individuals, and special populations as described herein. The plan for FY 1992 through May 1994 has been prepared based on activities and budgets approved by the Technical Steering Panel (TSP) at its meetings on August 19--20, 1991, and April 23--25, 1992. The activities can be divided into four broad categories: (1) model and data evaluation activities, (2)additional dose estimates, (3) model and data development activities, and (4)technical and communication support.
The PA projection of the clavicle: a dose-reducing technique.
Mc Entee, Mark F
2010-06-01
This study compares dose and image quality during PA and AP radiography of the clavicle. The methodology involved a cadaver-based dose and image quality study. Results demonstrate a statistically significant 56.1% (p
Muecke, Ralph [Dept. of Radiotherapy, St. Josefs-Hospital. Wiesbaden (Germany); Micke, Oliver [Dept. of Radiotherapy, Muenster Univ. Hospital (Germany); Reichl, Berthold [Dept. of Radiotherapy, Weiden Hospital (DE)] (and others)
2007-03-15
A total of 502 patients treated between 1990 and 2002 with low-dose radiotherapy (RT) for painful heel spurs were analysed for prognostic factors for long-term treatment success. The median follow-up was 26 months, ranging from 1 to 103 months. Events were defined as (1) slightly improved or unchanged pain after therapy, or (2) recurrent pain sensations during the follow-up period. Overall 8-year event-free probability was 60.9%. Event-free probabilities of patients with one/two series (414/88) were 69.7%/32.2% (p < 0.001); >58/≤58 years (236/266), 81.3%/47.9% (p = 0.001); high voltage/orthovoltage (341/161), 67.9%/60.6% (p = 0.019); pain anamnesis ≤6 months/>6 months (308/194), 76.3%/43.9% (p = 0.001); single dose 0.5/1.0 Gy (100/401), 86.2%/55.1% (p = 0.009); without/with prior treatment (121/381), 83.1%/54.9% (p = 0.023); men/women (165/337), 61.2%/61.5% (p = 0.059). The multivariate Cox regression analysis with inclusion of the number of treatment series, age, photon energy, pain history, single dose and prior treatments revealed patients with only one treatment series (p < 0.001), an age >58 years (p = 0.011) and therapy with high-voltage photons (p = 0.050) to be significant prognostic factors for pain relief. Overall, low-dose RT is a very effective treatment in painful heel spurs.
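Event-free probabilities of this kind are conventionally estimated with the Kaplan-Meier method; a minimal sketch follows, with follow-up times and event flags invented purely for illustration (none of the numbers are from the study):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the event-free probability S(t).

    times: follow-up time per patient (e.g. months);
    events: 1 = event observed, 0 = censored.
    Returns a list of (time, S(t)) steps at each event time.
    """
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)       # events at time t
        n_t = sum(1 for tt, _ in data if tt >= t)     # patients still at risk
        if d > 0:
            s *= 1.0 - d / n_t
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:      # skip ties at time t
            i += 1
    return curve

# Invented follow-up data (months, event flag):
times  = [3, 8, 12, 12, 20, 26, 40, 55, 70, 103]
events = [1, 0,  1,  1,  0,  1,  0,  1,  0,   0]
curve = kaplan_meier(times, events)
```

The multivariate step reported in the abstract (Cox regression) would then model how covariates such as treatment series, age, and photon energy shift this baseline curve.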
Interim report on the meteorological database. Hanford Environmental Dose Reconstruction Project
Stage, S.A.; Ramsdell, J.V.; Simonen, C.A.; Burk, K.W.
1993-01-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is estimating radiation doses that individuals may have received from operations at Hanford from 1944 to the present. An independent Technical Steering Panel (TSP) directs the project, which is being conducted by the Battelle, Pacific Northwest Laboratories in Richland, Washington. The goals of HEDR, as approved by the TSP, include dose estimates and determination of confidence ranges for these estimates. This letter report describes the current status of the meteorological database. The report defines the meteorological data available for use in climate model calculations, describes the data collection procedures and the preparation and control of the meteorological database. This report also provides an initial assessment of the data quality. The available meteorological data are adequate for atmospheric calculations. Initial checks of the data indicate the data entry accuracy meets the data quality objectives.
Draft Air Pathway Report: Phase 1 of the Hanford Environmental Dose Reconstruction Project
1990-07-20
This report summarizes the air pathway portion of the first phase of the Hanford Environmental Dose Reconstruction (HEDR) Project, conducted by Battelle staff at the Pacific Northwest Laboratory under the direction of an independent Technical Steering Panel. The HEDR Project is estimating historical radiation doses that could have been received by populations near the Department of Energy's Hanford Site, in southeastern Washington State. Phase 1 of the air-pathway dose reconstruction sought to determine whether dose estimates could be calculated for populations in the 10 counties nearest the Hanford Site from atmospheric releases of iodine-131 from the site from 1944--1947. Phase 1 demonstrated the following: HEDR-calculated source-term estimates of iodine-131 releases to the atmosphere were within 20% of previously published estimates; calculated vegetation concentrations of iodine-131 agree well with previously published measurements; the highest of the Phase 1 preliminary dose estimates to the thyroid are consistent with independent, previously published estimates of doses to maximally exposed individuals; and relatively crude, previously published measurements of thyroid burdens for Hanford workers are in the range of average burdens that the HEDR model estimated for similar "reference individuals" for the period 1944--1947. 4 refs., 10 figs., 9 tabs.
Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem; Cucinotta, Francis A.
Solar particle events (SPEs) pose the risk of acute radiation sickness (ARS) to astronauts because organ doses from large SPEs may reach critical levels during extravehicular activities (EVAs) or in lightly shielded spacecraft. NASA has developed an organ dose projection model using the Baryon transport code (BRYNTRN) with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations staff in the mission operations directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs is in support of mission design and operations planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof of concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, covering the major components of the overall system, subsystem interconnections, and external interfaces.
Byram, S.J.
1991-05-01
The Hanford Environmental Dose Reconstruction (HEDR) Project will estimate radiation exposures people may have received from radioactive materials released during past operations at the Department of Energy's Hanford Site near Richland, Washington. The project is being conducted by Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The Centers for Disease Control (CDC) will use HEDR dose estimates in studies to investigate a potential link between thyroid disease and historical Hanford emissions. The HEDR Project was initiated to address public concerns about the possible health impacts from past releases of radioactive materials from Hanford. The TSP recognized early in the project that special mechanisms would be required to communicate effectively to the many different concerned audiences. To identify and develop these mechanisms, the TSP issued Directive 89-7 to PNL in May 1989. The TSP directed PNL to examine methods to communicate the causes and effects of uncertainties in the dose estimates. A literature review was conducted as the first activity in response to the TSP's directive. This report presents the results of the literature review. The objective of the literature review was to identify "key principles" that could be applied to develop communications strategies for the project. 26 refs., 6 figs.
Duncan, J.P.
1994-03-01
This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project. The goal of the HEDR Project is to estimate the radiation dose that individuals could have received from emissions since 1944 at the Hanford Site near Richland, Washington. Members of the HEDR Project's Environmental Monitoring Data Task have developed databases of historical environmental measurements of such emissions. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories. This report is the third in a series that documents the information available on measurements of iodine-131 concentrations in vegetation. The first two reports provide the data for 1945--1951. This report provides an overview of the historical documents, which contain vegetation data for 1952--1983. The overview is organized according to the documents available for any given year. Each section, covering one year, contains a discussion of the media sampled, the sampling locations, significant events if there were any, emission quantities, constituents measured, and a list of the documents with complete reference information. Because the emissions which affected vegetation were significantly less after 1951, the vegetation monitoring data after that date have not been used in the HEDR Project. However, access to these data may be of interest to the public. This overview is, therefore, being published.
Stage, S.A.; Ramsdell, J.V. Jr.; Simonen, C.A.; Burk, K.W.; Berg, L.K.
1993-11-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is estimating radiation doses that individuals may have received from operations at Hanford from 1944 to the present. A number of computer programs are being developed by the HEDR Project to estimate doses and confidence ranges associated with radionuclides transported through the atmosphere and the Columbia River. One computer program is the Regional Atmospheric Transport Code for Hanford Emissions Tracking (RATCHET). RATCHET combines release data with information on atmospheric conditions including wind direction and speed. The RATCHET program uses these data to produce estimates of time-integrated air concentrations and surface contamination. These estimates are used in calculating dose by the Dynamic EStimates of Concentrations And Radionuclides in Terrestrial EnvironmentS (DESCARTES) and the Calculations of Individual Doses from Environmental Radionuclides (CIDER) computer programs. This report describes the final status of the meteorological database used by RATCHET. Data collection procedures and the preparation and control of the meteorological database are described, along with an assessment of the data quality.
Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney
2016-01-01
In an earlier paper (Atwell et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs contain Ground Level Events (GLEs), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009; Tylka and Dietrich, 2008; Atwell et al., 2008). GLEs are extremely energetic solar particle events having proton energies extending into the several GeV range and producing secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several hundred MeV range, but do not produce secondary atmospheric particles. Sub-sub-GLEs are even less energetic, with an observable increase in protons at energies greater than 30 MeV but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to designers of these smaller spacecraft.
Data base on dose reduction research projects for nuclear power plants. Volume 5
Khan, T.A.; Yu, C.K.; Roecklein, A.K. [Brookhaven National Lab., Upton, NY (United States)
1994-05-01
This is the fifth volume in a series of reports that provide information on dose reduction research and health physics technology for nuclear power plants. The information is taken from two of several databases maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. The research section of the report covers dose reduction projects that are in the experimental or developmental phase. It includes topics such as steam generator degradation, decontamination, robotics, improvements in reactor materials, and inspection techniques. The section on health physics technology discusses dose reduction efforts that are in place or in the process of being implemented at nuclear power plants. A total of 105 new or updated projects are described. All project abstracts from this report are available to nuclear industry professionals with access to a fax machine through the ACEFAX system, or a computer with a modem and the proper communications software through the ACE system. Detailed descriptions of how to access all the databases electronically are in the appendices of the report.
Holmes, C.W.
1991-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project will estimate radiation doses people may have received from exposure to radioactive materials released during past operations at the US Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project was initiated in response to public concerns about possible health impacts from past releases of radioactive materials from Hanford. The Technical Steering Panel (TSP) recognized early in the project that special mechanisms would be required to communicate effectively to the many different concerned audiences. Accordingly, the TSP directed Pacific Northwest Laboratory (PNL) to examine methods for communicating causes and effects of uncertainties in the dose estimates. After considering the directive and discussing it with the Communications Subcommittee of the TSP, PNL undertook a broad investigation of communications methods to consider for inclusion in the TSP's current communications program. As part of this investigation, a literature review was conducted regarding risk communication. A key finding was that, in order to successfully communicate risk-related information, a thorough understanding of the knowledge level, concerns, and information needs of the intended recipients (i.e., the audience) is necessary. Hence, a preliminary audience analysis was conducted as part of the present research. This report summarizes the results of this analysis. 1 ref., 9 tabs.
Air pathway report: Phase I of the Hanford Environmental Dose Reconstruction Project
1991-07-01
Phase 1 of the air-pathway portion of the Hanford Environmental Dose Reconstruction (HEDR) Project sought to determine whether dose estimates could be calculated for populations in the 10 counties nearest the Hanford Site from atmospheric releases of iodine-131 from the site from 1944--1947. Phase 1 demonstrated the following: HEDR-calculated source-term estimates of iodine-131 releases to the atmosphere were within 20% of previously published estimates; calculated vegetation concentrations of iodine-131 agree well with previously published measurements; the highest of the Phase 1 preliminary dose estimates to the thyroid are consistent with independent, previously published estimates of doses to maximally exposed individuals; and relatively crude, previously published measurements of thyroid burdens for Hanford workers are in the range of average burdens that the HEDR model estimated for similar "reference individuals" for the period 1944--1947. Preliminary median dose estimates summed over the years 1945--1947 for the primary pathway, air-pasture-cow-milk-thyroid, ranged from low median values of 0.006 rad for upwind adults who obtained milk from backyard cows not on pasture, to high median values of 68.0 rad for downwind infants who drank milk from pasture-fed cows. Extremes of the estimated range are a low of essentially zero to upwind adults and a high of almost 3000 rem to downwind infants. 37 refs., 37 figs., 2 tabs.
Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.
1994-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.
Dale, E; Hellebust, T P; Skjønsberg, A; Høgberg, T; Olsen, D R
2000-07-01
To calculate the normal tissue complication probability (NTCP) of late radiation effects on the rectum and bladder from repetitive CT scans during fractionated high-dose-rate brachytherapy (HDRB) and external beam radiotherapy (EBRT) of the uterine cervix, and to compare the NTCP with the clinical frequency of late effects. Fourteen patients with cancer of the uterine cervix (Stage IIb-IVa) underwent 3-6 (mean, 4.9) CT scans in treatment position during their course of HDRB using a ring applicator with an iridium stepping source. The rectal and bladder walls were delineated on the treatment-planning system, such that a constant wall volume independent of organ filling was achieved. Dose-volume histograms (DVHs) of the rectal and bladder walls were acquired. A method of summing multiple DVHs accounting for variable dose per fraction was applied to the DVHs of HDRB and EBRT, together with the Lyman-Kutcher NTCP model fitted to clinical dose-volume tolerance data from recent studies. The D(mean) of the DVH from EBRT was close to the D(max) for both the rectum and bladder, confirming that the DVH from EBRT corresponded to homogeneous whole-organ irradiation. The NTCP of the rectum was 19.7% (13.5%, 25.9%) (mean and 95% confidence interval), whereas the clinical frequency of late rectal sequelae (Grade 3-4, RTOG/EORTC) was 13%, based on material from 200 patients. For the bladder, the NTCP was 61.9% (46.8%, 76.9%), as compared to the clinical frequency of Grade 3-4 late effects of 14%. If only 1 CT scan from HDRB was assumed available, the relative uncertainty (standard deviation, SD) of the NTCP value for an arbitrary patient was 20-30%, whereas 4 CT scans provided an uncertainty of 12-13%. The NTCP for the rectum was almost consistent with the clinical frequency of late effects, whereas the NTCP for the bladder was too high. To obtain reliable (SD of 12-13%) NTCP values, 3-4 CT scans are needed during 5-7 fractions of HDRB treatments.
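The Lyman-Kutcher(-Burman) model referred to above reduces a DVH to a generalized equivalent uniform dose (gEUD) and maps it through a normal CDF. A minimal sketch follows; the DVH bins and the parameters TD50, m, n below are illustrative placeholders, not the fitted tolerance values used in the study:

```python
import math

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose: reduces a DVH to one dose.

    doses: bin doses (Gy); volumes: fractional volumes (sum to 1);
    n: volume-effect parameter (small n -> serial organ behaviour).
    """
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP: normal CDF of (gEUD - TD50)/(m*TD50)."""
    t = (gEUD(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative rectal-wall DVH (dose bins in Gy, fractional volumes):
doses = [20.0, 40.0, 60.0, 70.0]
volumes = [0.4, 0.3, 0.2, 0.1]
ntcp = lkb_ntcp(doses, volumes, td50=76.9, m=0.13, n=0.09)
```

Summing DVHs across fractions with variable dose per fraction, as the study does, would additionally require converting each fraction's dose bins to an equivalent dose (e.g. via the linear-quadratic model) before accumulation.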
FY 1992 task plans for the Hanford Environmental Dose Reconstruction Project
None
1991-10-01
Phase 1 of the HEDR Project was designed to develop and demonstrate a method for estimating radiation doses people may have received from Hanford Site operations since 1944. The method researchers developed relied on a variety of measured and reconstructed data as input to a modular computer model that generates dose estimates and their uncertainties. As part of Phase 1, researchers used the reconstructed data and computer model to calculate preliminary dose estimates for populations from limited radionuclides, in a limited geographical area and time period. Phase 1 ended in FY 1990. In February 1991, the TSP decided to shift the project planning approach away from phases--which were centered around completion of major portions of technical activities--to individual fiscal years (FYs), which span October of one year through September of the next. Therefore, activities that were previously designated to occur in phases are now designated in an integrated schedule to occur in one or more of the next fiscal years into FY 1995. Task plans are updated every 6 months. In FY 1992, scientists will continue to improve Phase 1 data and models to calculate more accurate and precise dose estimates. The plan for FY 1992 has been prepared based on activities and budgets approved by the Technical Steering Panel (TSP) at its meeting on August 19--20, 1991. The activities can be divided into four categories: (1) model and data evaluation activities, (2) additional dose estimates, (3) model and data development activities, and (4) technical and communication support. 3 figs., 2 tabs.
Eslinger, P.W.; Ouderkirk, S.J.; Nichols, W.E.
1993-01-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is developing several computer codes to model the airborne release, transport, and environmental accumulation of radionuclides resulting from Hanford operations from 1944 through 1972. In order to calculate the dose of radiation a person may have received in any given location, the geographic area addressed by the HEDR Project will be divided into a grid. The grid size suggested by the draft requirements contains 2091 units called nodes. Two of the codes being developed are DESCARTES and CIDER. The DESCARTES code will be used to estimate the concentration of radionuclides in environmental pathways from the output of the air transport code RATCHET. The CIDER code will use information provided by DESCARTES to estimate the dose received by an individual. The requirements that Battelle (BNW) set for these two codes were released to the HEDR Technical Steering Panel (TSP) in a draft document on November 10, 1992. This document reports on the preliminary work performed by the code development team to determine if the requirements could be met.
Ramsdell, J.V.
1991-07-01
Radiation doses that may have resulted from operations at the Hanford Site are being estimated in the Hanford Environmental Dose Reconstruction (HEDR) Project. One of the project subtasks, atmospheric transport, is responsible for estimating the transport, diffusion and deposition of radionuclides released to the atmosphere. This report discusses modeling transport and diffusion in the atmospheric pathway. It is divided into three major sections. The first section of the report presents the atmospheric modeling approach selected following discussion with the Technical Steering Panel that directs the HEDR Project. In addition, the section discusses the selection of the MESOI/MESORAD suite of atmospheric dispersion models that form the basis for initial calculations and future model development. The second section of the report describes alternative modeling approaches that were considered. Emphasis is placed on the family of plume and puff models that are based on Gaussian solution to the diffusion equations. The final portion of the section describes the performance of various models. The third section of the report discusses factors that bear on the selection of an atmospheric transport modeling approach for HEDR. These factors, which include the physical setting of the Hanford Site and the available meteorological data, serve as constraints on model selection. Five appendices are included in the report. 39 refs., 4 figs., 2 tabs.
Horeweg, Nanda; van Rosmalen, Joost; Heuvelmans, Marjolein A; van der Aalst, Carlijn M; Vliegenthart, Rozemarijn; Scholten, Ernst Th; ten Haaf, Kevin; Nackaerts, Kristiaan; Lammers, Jan-Willem J; Weenink, Carla; Groen, Harry J; van Ooijen, Peter; de Jong, Pim A; de Bock, Geertruida H; Mali, Willem; de Koning, Harry J; Oudkerk, Matthijs
2014-11-01
The main challenge in CT screening for lung cancer is the high prevalence of pulmonary nodules and the relatively low incidence of lung cancer. Management protocols use thresholds for nodule size and growth rate to determine which nodules require additional diagnostic procedures, but these should be based on individuals' probabilities of developing lung cancer. In this prespecified analysis, using data from the NELSON CT screening trial, we aimed to quantify how nodule diameter, volume, and volume doubling time affect the probability of developing lung cancer within 2 years of a CT scan, and to propose and evaluate thresholds for management protocols. Eligible participants in the NELSON trial were those aged 50-75 years who had smoked 15 cigarettes or more per day for more than 25 years, or ten cigarettes or more per day for more than 30 years, and were still smoking or had stopped smoking less than 10 years earlier. Participants were randomly assigned to low-dose CT screening at increasing intervals, or no screening. We included all participants assigned to the screening group who had attended at least one round of screening and whose results were available from the national cancer registry database. We calculated lung cancer probabilities, stratified by nodule diameter, volume, and volume doubling time, and did logistic regression analysis using diameter, volume, volume doubling time, and multinodularity as potential predictor variables. We assessed management strategies based on nodule threshold characteristics for specificity and sensitivity, and compared them with the American College of Chest Physicians (ACCP) guidelines. The NELSON trial is registered at www.trialregister.nl, number ISRCTN63545820. Volume, volume doubling time, and volumetry-based diameter of 9681 non-calcified nodules detected by CT screening in 7155 participants in the screening group of NELSON were used to quantify lung cancer probability. Lung cancer probability was low in participants with a nodule
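Volume doubling time (VDT), the growth measure used in NELSON, follows directly from two volume measurements under an exponential growth assumption; a minimal sketch (the example volumes and interval are invented for illustration):

```python
import math

def volume_doubling_time(v1, v2, days):
    """VDT in days from nodule volumes v1, v2 (mm^3) measured `days` apart.

    Assumes exponential growth: V(t) = v1 * 2**(t / VDT).
    Requires v2 > v1 (a growing nodule).
    """
    return days * math.log(2.0) / math.log(v2 / v1)

# A nodule growing from 150 mm^3 to 300 mm^3 over 90 days has VDT = 90 days:
vdt = volume_doubling_time(150.0, 300.0, 90.0)
```

A management protocol would then compare the computed VDT (and the nodule volume itself) against the chosen thresholds to decide whether further diagnostic workup is needed.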
Marsh, T.L.; Anderson, D.M.; Farris, W.T.; Ikenberry, T.A.; Napier, B.A.; Wilfert, G.L.
1992-09-01
This letter report summarizes a scoping study that examined the potential importance of fresh fruit and vegetable pathways to dose. A simple production index was constructed with data collected from the Washington State Department of Agriculture (WSDA), the United States Bureau of the Census, and the United States Department of Agriculture (USDA). Hanford Environmental Dose Reconstruction (HEDR) Project staff from Battelle, Pacific Northwest Laboratories, in cooperation with members of the Technical Steering Panel (TSP), selected lettuce and spinach as the produce pathways most likely to impact dose. County agricultural reports published in 1956 provided historical descriptions of the predominant distribution patterns of fresh lettuce and spinach from production regions to local population centers. Pathway rankings and screening dose estimates were calculated for specific populations living in selected locations within the HEDR study area.
2007-04-30
with Goldratt's observation that negative human behavior is a major cause of the project-scheduling problem. Goldratt (1997) developed the Critical... 2002). Assessment of cost uncertainties for large technology projects: A methodology and an application. Interfaces 32(4), 52-66. Goldratt, E.M
Nielsen, Sven P.; Isaksson, M.; Nilsson, Elisabeth (and others)
2005-07-01
The NKS B-programme EcoDoses project started in 2003 as a collaboration between all the Nordic countries. The aim of the project is to improve the radiological assessments of doses to man from terrestrial ecosystems. The present report sums up the work performed in the second phase of the project. The main topics in 2004 have been: (i) A continuation of previous work on a better approach for estimating global fallout on a regional or national scale, based on a correlation between precipitation and deposition rates. (ii) Further extension of the EcoDoses milk database, estimation of effective ecological half-lives of {sup 137}Cs in cow's milk focussing on suitable post-Chernobyl time-series, and modelling of the integrated transfer of {sup 137}Cs to cow's milk in the Nordic countries. (iii) Determination of effective ecological half-lives for fresh-water fish from Nordic lakes. (iv) Investigation of the radioecological sensitivity of Nordic populations. (v) Food-chain modelling using the Ecosys model, which is the underlying food- and dose-module in several computerised decision-making systems. (au)
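The effective-half-life estimates referred to above reduce to fitting an exponential decline to a concentration time series. As a minimal sketch (our own illustration with synthetic data, not the project's code):

```python
import math

# Minimal sketch (synthetic data, our assumptions): an effective ecological
# half-life is obtained by fitting C(t) = C0 * exp(-lam * t), i.e. a
# least-squares line through ln(C) versus t, then T_eff = ln 2 / lam.
def effective_half_life(years, concentrations):
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(math.log(c) for c in concentrations) / n
    sxy = sum((x - xbar) * (math.log(c) - ybar)
              for x, c in zip(years, concentrations))
    sxx = sum((x - xbar) ** 2 for x in years)
    lam = -sxy / sxx           # decay constant, 1/year
    return math.log(2) / lam   # effective half-life, years

# Synthetic series with a true half-life of 2.0 years:
t = [0, 1, 2, 3, 4]
c = [100.0 * 0.5 ** (ti / 2.0) for ti in t]
print(round(effective_half_life(t, c), 2))  # 2.0
```

Real milk or fish time series are noisy, so the fitted slope (and hence T_eff) would carry an uncertainty that this sketch omits.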
An adaptive nonlocal filtering for low-dose CT in both image and projection domains
Yingmei Wang
2015-04-01
An important problem in low-dose CT is the image quality degradation caused by photon starvation. Many algorithms operating in the sinogram domain or the image domain have been proposed to address this problem. In view of the strong self-similarity of the sinusoid-like strip data in sinogram space, we propose a novel non-local filtering whose averaging weights are derived from both the image FBP (filtered backprojection) reconstructed from restored sinogram data and the image directly FBP-reconstructed from noisy sinogram data. In the sinogram restoration step, we apply a non-local method whose smoothness parameters are adjusted adaptively to the variance of the noisy sinogram data, which makes the method effective for noise reduction in the sinogram domain. Simulation experiments show that the proposed method, by filtering in both the image and projection domains, achieves better noise reduction and detail preservation in the reconstructed images.
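The non-local averaging idea underlying this abstract can be sketched as follows. This is our own minimal illustration, not the authors' implementation; the patch size, search window, and smoothing parameter `h` are invented for demonstration (the adaptive variant would tie `h` to the local noise variance):

```python
import numpy as np

# Minimal non-local means sketch: each pixel becomes a weighted average of
# pixels whose surrounding patches look similar; weights decay exponentially
# with the mean squared patch difference.
def nlm_filter(img, patch=1, search=3, h=0.1):
    pad = patch + search
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ic, jc = i + pad, j + pad
            ref = p[ic-patch:ic+patch+1, jc-patch:jc+patch+1]
            wsum = vsum = 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    nb = p[ic+di-patch:ic+di+patch+1, jc+dj-patch:jc+dj+patch+1]
                    d2 = float(((ref - nb) ** 2).mean())
                    w = np.exp(-d2 / (h * h))
                    wsum += w
                    vsum += w * p[ic + di, jc + dj]
            out[i, j] = vsum / wsum
    return out

noisy = np.ones((8, 8)) + 0.1 * np.random.default_rng(0).standard_normal((8, 8))
smoothed = nlm_filter(noisy)
print(smoothed.std(), noisy.std())  # smoothed std should be smaller
```

A production implementation would vectorize the weight computation; the quadruple loop here is kept for readability.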
Bergan, T. [Lavrans Skuterud, Haevard Thoerring (Norway); Liland, A. [Norwegian Radiation Protection Authority (NRPA) (Denmark)] (eds.)
2004-05-01
The NKS B-programme EcoDoses project started in 2003 as a collaboration between all the Nordic countries. The aim of the project is to improve the radiological assessments of doses to man from terrestrial ecosystems. The first part, conducted in 2003, focussed on an extensive collation and review of both published and unpublished data from all the Nordic countries for the nuclear weapons fallout period and the post-Chernobyl period. This included data on radionuclides in air filters, precipitation, soil samples, milk and reindeer. Based on this, an improved model for estimating radioactive fallout from precipitation data during the nuclear weapons fallout period has been developed. Effective ecological half-lives for 137Cs and 90Sr in milk have been calculated for the nuclear weapons fallout period. For reindeer, the ecological half-lives for 137Cs have been calculated for both the nuclear weapons fallout period and the post-Chernobyl period. The data were also used to compare modelling results with observed concentrations. This was done at a workshop where the radioecological food-and-dose module in the ARGOS decision support system was used to predict the transfer of deposited radionuclides to foodstuffs and the subsequent radiation doses to man. The work conducted in the first year is presented in this report and gives interesting new results relevant for terrestrial radioecology. (au)
Domienik, J; Brodecki, M; Carinou, E; Donadille, L; Jankowski, J; Koukorava, C; Krim, S; Nikodemova, D; Ruiz-Lopez, N; Sans-Mercé, M; Struelens, L; Vanhavere, F
2011-03-01
The main objective of WP1 of the ORAMED (Optimization of RAdiation protection for MEDical staff) project is to obtain a set of standardised data on extremity and eye lens doses for staff in interventional radiology (IR) and cardiology (IC) and to optimise staff protection. A coordinated measurement program in different hospitals in Europe will help towards this direction. This study aims at analysing the first results of the measurement campaign performed in IR and IC procedures in 34 European hospitals. The highest doses were found for pacemakers, renal angioplasties and embolisations. Left finger and wrist seem to receive the highest extremity doses, while the highest eye lens doses are measured during embolisations. Finally, it was concluded that it is difficult to find a general correlation between kerma area product and extremity or eye lens doses.
Thiede, M.E.; Bates, D.J.; Mart, E.I.; Hanf, R.W.
1994-03-01
This report is a guide to the work accomplished by the Environmental Monitoring Data Task, which is one of the tasks in the Hanford Environmental Dose Reconstruction (HEDR) Project. The objective of the Environmental Monitoring Data Task was to recover, evaluate, process, and/or reconstruct the environmental monitoring data for the period 1945--1972. The period of time for which environmental monitoring data were sought was determined by the start-up and shut-down dates of the Hanford facilities that emitted the majority of radionuclides to the two major pathways: air and the Columbia River. Radionuclide emissions to the air were mainly the result of the operation of the chemical separations plants from 1944--1972 (Heeb 1994). Radionuclide emissions to the Columbia River were mainly the result of the operation of the single-pass production reactors from 1944--1971 (Heeb and Bates 1994). Therefore, the historical environmental monitoring data sought were for the period 1945--1972. Within the period of 1945--1972, specific periods of interest to the HEDR Project vary depending on the pathway. For example, 1945--1951 was the peak period for radionuclide emissions to the air and hence vegetation uptake of radionuclides, while 1956--1965 was the peak period for radionuclide emissions to the Columbia River and hence fish uptake of radionuclides. However, adequate historical data were not always available for the periods of interest. In the case of vegetation measurements, conversion and correction factors had to be developed to convert the historical measurements to modern standard measurements. Table S.1 lists the reports that explain these conversion and correction factors. In the case of Columbia River fish and waterfowl, bioconcentration factors were developed for use in any year where the river pathway data are insufficient.
Hartman, H; Engström, L; Lundberg, H
2015-01-01
We report lifetime measurements of the 6 levels in the 3d6(5D)4d e6G term in Fe ii at an energy of 10.4 eV, and f-values for 14 transitions from the investigated levels. The lifetimes were measured using time-resolved laser-induced fluorescence on ions in a laser-produced plasma. The high excitation energy, and the fact that the levels have the same parity as the low-lying states directly populated in the plasma, necessitated the use of a two-photon excitation scheme. The probability for this process is greatly enhanced by the presence of the 3d6(5D)4p z6F levels at roughly half the energy difference. The f-values are obtained by combining the experimental lifetimes with branching fractions derived using relative intensities from a hollow cathode discharge lamp recorded with a Fourier transform spectrometer. The data are important for benchmarking atomic calculations of astrophysically important quantities and useful for spectroscopy of hot stars.
Hartman, H.; Nilsson, H.; Engström, L.; Lundberg, H.
2015-12-01
We report lifetime measurements of the 6 levels in the 3d6(5D)4d e6G term in Fe ii at an energy of 10.4 eV, and f-values for 14 transitions from the investigated levels. The lifetimes were measured using time-resolved laser-induced fluorescence on ions in a laser-produced plasma. The high excitation energy, and the fact that the levels have the same parity as the low-lying states directly populated in the plasma, necessitated the use of a two-photon excitation scheme. The probability for this process is greatly enhanced by the presence of the 3d6(5D)4p z6F levels at roughly half the energy difference. The f-values are obtained by combining the experimental lifetimes with branching fractions derived using relative intensities from a hollow cathode discharge lamp recorded with a Fourier transform spectrometer. The data are important for benchmarking atomic calculations of astrophysically important quantities and useful for spectroscopy of hot stars.
Lee, Minwook; Kim, Myung-Joon; Lee, Mi-Jung [Yonsei University College of Medicine, Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, 50 Yonsei-ro, Seodaemun-gu, Seoul (Korea, Republic of); Han, Kyung Hwa [Yonsei University College of Medicine, Gangnam Medical Research Center, Biostatistics Collaboration Unit, Seoul (Korea, Republic of)
2014-07-17
Iterative reconstruction can help reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated in children during abdominal CT. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). The objective image quality was evaluated as noise. Subjective image quality analysis was performed comparing each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. The half-S2 and -S3 studies produced the greatest sharpness, and the half-S5 studies were the worst owing to a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively, and half-S3 studies subjectively, showed image quality comparable to full studies in pediatric abdominal CT. (orig.)
Gudder, Stanley P
2014-01-01
Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities..., extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
Woloschak, Gayle E [Northwestern Univ., Evanston, IL (United States); Grdina, David [Univ. of Chicago, IL (United States); Li, Jian-Jian [Univ. of California, Davis, CA (United States)
2017-06-12
Low-dose ionizing radiation effects are difficult to study in human populations because of numerous confounding factors such as genetic and lifestyle differences. Research in mammalian model systems and in vitro is generally used in order to overcome this difficulty. In this program project, three projects joined together to investigate the effects of low doses of ionizing radiation, i.e., doses at and below 10 cGy of low linear energy transfer ionizing radiation such as X-rays and gamma rays. This project was focused on cellular signaling associated with nuclear factor kappa B (NFκB) and mitochondria, subcellular organelles critical for cell aging and the aging-like changes induced by ionizing radiation. In addition to cells in culture, this project utilized animal tissues accumulated in a radiation biology tissue archive housed at Northwestern University (http://janus.northwestern.edu/janus2/index.php). The major thrust of Project 1 was to gather all of the DOE-sponsored irradiated animal (mouse, rat and dog) data and tissues under one roof and investigate mitochondrial DNA changes and microRNA changes in these samples. Through comparison of different samples we sought to delineate mitochondrial DNA quantity alterations and microRNA expression differences associated with different doses and dose rates of radiation. Historic animal irradiation experiments sponsored by DOE were done in several national laboratories and universities between the 1950s and 1990s; when these experiments were closed, data and tissues were released to Project 1. Project 2 used cells in culture to investigate the effects that low doses of radiation have on NFκB and its target genes: manganese superoxide dismutase (MnSOD) and genes involved in the cell cycle, cyclins (B1 and D1) and cyclin-dependent kinases (CDKs). Project 3 used cells in culture such as “normal” human cells (the breast epithelial cell line MCF10A and skin keratinocyte cells HK18) and mouse embryo fibroblasts (mef
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Walter, Joan E.; Heuvelmans, Marjolein A.; de Jong, Pim A.; Vliegenthart, Rozemarijn; van Ooijen, Peter M. A.; Peters, Robin B.; ten Haaf, Kevin; Yousaf-Khan, Uraujh; van der Aalst, Carlijn M.; de Bock, Geertruida H.; Mali, Willem; Groen, Harry J. M.; de Koning, Harry J.; Oudkerk, Matthijs
2016-01-01
Background US guidelines now recommend lung cancer screening with low-dose CT for high-risk individuals. Reports of new nodules after baseline screening have been scarce and are inconsistent because of differences in definitions used. We aimed to identify the occurrence of new solid nodules and thei
Walter, Joan E.; Heuvelmans, Marjolein A.; de Jong, Pim A.; Vliegenthart, Rozemarijn; van Ooijen, Peter M. A.; Peters, Robin B.; ten Haaf, Kevin; Yousaf-Khan, Uraujh; van der Aalst, Carlijn M.; de Bock, Geertruida H.; Mali, Willem; Groen, Harry J. M.; de Koning, Harry J.; Oudkerk, Matthijs
2016-01-01
BACKGROUND: US guidelines now recommend lung cancer screening with low-dose CT for high-risk individuals. Reports of new nodules after baseline screening have been scarce and are inconsistent because of differences in definitions used. We aimed to identify the occurrence of new solid nodules and the
Walter, Joan E.; Heuvelmans, Marjolein A.; de Jong, Pim A.; Vliegenthart, Rozemarijn; van Ooijen, Peter M A; Peters, Robin B.; ten Haaf, Kevin; Yousaf-Khan, Uraujh; van der Aalst, Carlijn M.; de Bock, Geertruida H.; Mali, Willem P Th M; Groen, Harry J M; de Koning, Harry J.; Oudkerk, Matthijs
BACKGROUND: US guidelines now recommend lung cancer screening with low-dose CT for high-risk individuals. Reports of new nodules after baseline screening have been scarce and are inconsistent because of differences in definitions used. We aimed to identify the occurrence of new solid nodules and
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U
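The axioms CP1-CP3 are easy to sanity-check numerically in the discrete case, where a conditional probability is generated from a single finite measure by µ(V | U) = µ(V ∩ U)/µ(U). The check below is our own illustration, not code from the paper:

```python
from itertools import chain, combinations

# Finite sample space with a strictly positive measure on singletons.
W = {0, 1, 2, 3}
weight = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

def mu(A):
    return sum(weight[w] for w in A)

def cond(V, U):  # mu(V | U) = mu(V & U) / mu(U), defined when mu(U) > 0
    return mu(V & U) / mu(U)

subsets = [set(s) for s in chain.from_iterable(combinations(W, r) for r in range(5))]
pos = [U for U in subsets if mu(U) > 0]

for U in pos:
    assert abs(cond(U, U) - 1.0) < 1e-12                                   # CP1
    for V1 in subsets:
        for V2 in subsets:
            if not (V1 & V2):
                assert abs(cond(V1 | V2, U) - cond(V1, U) - cond(V2, U)) < 1e-12  # CP2
for V in subsets:
    for X in pos:
        for U in pos:
            if V <= X <= U:
                assert abs(cond(V, U) - cond(V, X) * cond(X, U)) < 1e-12   # CP3
print("CP1-CP3 hold")
```

CP3 holds here because for V ⊆ X ⊆ U, µ(V | X)·µ(X | U) = (µ(V)/µ(X))·(µ(X)/µ(U)) = µ(V)/µ(U) = µ(V | U), the usual chain rule.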
Bardy, Guillaume; Cathala, Philippe; Eiden, Céline; Baccino, Eric; Petit, Pierre; Mathieu, Olivier
2015-01-01
Buprenorphine is largely prescribed for maintenance treatment of opioid dependence due to its safety profile. Nevertheless, fatalities at therapeutic doses have been described when it is associated with other central nervous system depressants, such as ethanol or benzodiazepines. Here, we report a case of death due to the association of buprenorphine at therapeutic dose with benzodiazepines and ethanol. Although toxicity has often been attributed to its metabolite norbuprenorphine rather than to buprenorphine itself, in our case norbuprenorphine was not detected in urine and bile and was found only in traces in blood. Moreover, the presence in blood of free buprenorphine but not of glucuronide metabolites argues for an unusually early death, at the beginning of buprenorphine metabolism. We propose that, in the context of prior toxic impregnation, buprenorphine directly (and not via its metabolite norbuprenorphine) acted as a triggering factor by blocking the ventilatory response, rapidly leading to fatal respiratory depression.
Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.
1992-09-01
This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC) with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think... of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...
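For subexponential claim sizes such as Pareto, these tail probabilities obey the "single big jump" asymptotic P(X1 + … + Xn > x) ~ n·P(X1 > x) as x → ∞. A crude Monte Carlo check (our own sketch, with invented parameters, not from the thesis):

```python
import random

# Pareto(alpha) on [1, inf): P(X > t) = t**(-alpha); simulated by inversion.
random.seed(1)
alpha, n, x, trials = 1.5, 2, 50.0, 200_000

def pareto():
    return (1.0 - random.random()) ** (-1.0 / alpha)

hits = sum(1 for _ in range(trials) if sum(pareto() for _ in range(n)) > x)
mc = hits / trials               # Monte Carlo estimate of P(X1 + X2 > x)
asymptotic = n * x ** (-alpha)   # single-big-jump approximation n * P(X > x)
print(mc, asymptotic)            # the two values should be of the same order
```

The agreement improves as x grows; for moderate x the true tail is somewhat larger than the asymptotic value.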
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando
Shanahan, M C
2017-08-01
The purpose of this study was to compare radiation dose measurements generated using a virtual radiography simulation with experimental dosimeter measurements for two radiation dose reduction techniques in digital radiography. Entrance Surface Dose (ESD) measurements were generated for an antero-posterior lumbar spine radiograph experimentally, using NanoDOT™ single-point dosimeters, for two radiographic systems (systems 1 and 2) and using Projection VR™, a virtual radiography simulation (system 3). Two dose reduction methods were tested: application of the 15% kVp rule, or its simplified 10 kVp form, and the exposure maintenance formula. The 15% and 10 kVp rules use a specified increase in kVp and a halving of the mAs to reduce patient ESD. The exposure maintenance formula uses an increase in source-to-object distance to reduce ESD. Increasing kVp from 75 to 96 kVp, with the concomitant decrease in mAs, resulted in percent ESD reductions of 59.5% (4.02-1.63 mGy), 60.8% (3.55-1.39 mGy), and 60.3% (6.65-2.64 mGy) for experimental systems 1 and 2 and the virtual simulation (system 3), respectively. Increasing the SID (with the appropriate increase in mAs) from 100 to 140 cm reduced ESD by 22.3%, 18.8%, and 23.5% for experimental systems 1 and 2 and the virtual simulation (system 3), respectively. Percent dose reduction measurements were similar between the experimental and virtual measurement systems investigated. For the dose reduction practices tested, Projection VR™ provides a realistic alternative to direct dosimetry for estimating percent dose reduction. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
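The percent ESD reductions quoted in the abstract follow from the simple ratio (ESD_before − ESD_after)/ESD_before, which can be checked against the reported mGy values:

```python
# Percent entrance-surface-dose (ESD) reduction from before/after readings.
def percent_reduction(before_mGy: float, after_mGy: float) -> float:
    return 100.0 * (before_mGy - after_mGy) / before_mGy

# kVp-rule readings reported for systems 1-3:
print(round(percent_reduction(4.02, 1.63), 1))  # 59.5
print(round(percent_reduction(3.55, 1.39), 1))  # 60.8
print(round(percent_reduction(6.65, 2.64), 1))  # 60.3
```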
Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project
Anderson, D.M.; Bates, D.J.; Marsh, T.L.
1993-07-01
This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.
Anderson, D.M.; Bates, D.J.; Marsh, T.L.
1993-03-01
This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.
Varga, Tamas
This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…
Kurz, Jochen H. [Fraunhofer-Institut fuer Zerstoerungsfreie Pruefverfahren (IZEP), Saarbruecken (Germany); Dugan, Sandra; Juengert, Anne [Stuttgart Univ. (Germany). Materialpruefungsanstalt (MPA)
2013-07-01
Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information on all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of the construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in the Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The probability of detection (POD) curves of specific flaws under specific testing conditions are often not known. This paper shows the results of a research project designed for POD determination of ultrasound phased-array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning are discussed, and results of the ultrasound inspections are presented. In
Shimada, Mitsuhiro; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R; Yahiro, Masanobu
2016-01-01
We perform simultaneous analysis of (1) matter radii, (2) $B(E2; 0^+ \rightarrow 2^+)$ transition probabilities, and (3) excitation energies, $E(2^+)$ and $E(4^+)$, for $^{24-40}$Mg by using the beyond mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric $\beta_2$ deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for $r_{\rm m}$, $B(E2)$, and $E(2^+)$ and $E(4^+)$, indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter $\beta_2$ deduced from measured values of $B(E2)$ and $r_{\rm m}$. This framework makes it possible to investigate the effects of $\beta_2$ deformation, the change in $\beta_2$ due to restoration of rotational symmetry, $\beta_2$ configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation,...
Shimada, Mitsuhiro; Watanabe, Shin; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R.; Yahiro, Masanobu
2016-06-01
We perform simultaneous analysis of (1) matter radii, (2) B(E2; 0+ → 2+) transition probabilities, and (3) excitation energies, E(2+) and E(4+), for 24-40Mg by using the beyond-mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric β2 deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for r_m, B(E2), and E(2+) and E(4+), indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter β2 deduced from measured values of B(E2) and r_m. This framework makes it possible to investigate the effects of β2 deformation, the change in β2 due to restoration of rotational symmetry, β2 configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation, we clarify which effect is important for each of the three measurements and propose the kinds of BMF calculations that are practical for each of the three kinds of observables.
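For readers unfamiliar with how β2 is deduced from a measured B(E2), the conventional rotational-model relation (a standard textbook formula, not one quoted in this abstract) is:

```latex
% Rotational-model relation between the measured transition strength and the
% quadrupole deformation parameter, with the usual radius convention:
\beta_2 \;=\; \frac{4\pi}{3 Z R_0^{2}}
\left[\frac{B(E2;\,0^{+}\rightarrow 2^{+})}{e^{2}}\right]^{1/2},
\qquad R_0 = 1.2\,A^{1/3}\ \mathrm{fm}.
```

Values of β2 extracted this way are model dependent, which is why the abstract discusses them alongside the BMF calculation rather than taking them at face value.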
Busch, H.P.; Busch, S.; Decker, C.; Schilz, C. [Krankenhaus der Barmherzigen Brueder Trier, Abt. fuer Radiologie, Trier (Germany)
2003-01-01
Purpose: Comparison of the imaging capabilities of storage phosphor (computed) radiography and flat plate radiography with conventional film-screen radiography to find new strategies for quality and dose management, i.e., optimizing imaging quality and dose depending on the imaging method and clinical situation. Materials and Methods: Images of a CDRAD-phantom, hand-phantom, abdomen-phantom and chest-phantom obtained with different exposure voltages (50 kV, 73 kV, 109 kV) and different speeds (200, 400, 800, 1600) were processed with various digital systems (flat plate detector: Digital Diagnost [Philips]; storage phosphors: ADC-70 [Agfa], ADC-Solo [Agfa], FCR XG 1 [Fuji]) and a conventional film-screen system (HT100G/Ortho Regular [Agfa]). Results: The evaluation of CDRAD images found the flat plate detector system to have the highest contrast detectability for all dose levels, followed by the FCR XG 1, ADC-Solo and ADC-70 systems. Comparison of the organ-phantom images found the flat plate detector system to be equal to film-screen radiography and especially to storage phosphor systems even for low exposure doses. Conclusions: Flat plate radiography systems demonstrate the highest potential for high image quality when reducing the exposure dose. Depending on the system generation, the storage phosphor systems also show an improved image quality, but the possibility of a dose reduction is limited in comparison with the flat plate detector system. (orig.) [German original, translated:] Purpose: The comparative presentation of the imaging properties of various storage phosphor devices and a flat-panel detector system against film-screen images can lead to new strategies for quality and dose management, i.e., an optimization of image quality and dose depending on the examination method and the clinical question. Method: Images of a CDRAD phantom, hand phantom, abdomen phantom and chest phantom were acquired with various digital systems (flat-panel detector
Mart, E.I.; Denham, D.H.; Thiede, M.E.
1993-12-01
This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project, whose goal is to estimate the radiation dose that individuals could have received from emissions since 1944 at the U.S. Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories (BNW). One of the radionuclides emitted that would affect the radiation dose was iodine-131. This report describes in detail the reconstructed conversion and correction factors for historical measurements of iodine-131 in Hanford-area vegetation collected from the beginning of October 1945 through the end of December 1947.
Ahmadalipour, A.; Rana, A.; Moradkhani, H.
2014-12-01
Global climatic change is expected to have severe effects on natural systems along with various socio-economic aspects of human life. Global Climate Models (GCMs) are widely used to study future impacts, with varied projections/simulations from the participating member GCMs. This has urged scientific communities across the world to try to improve the understanding of future climate conditions and to reduce the associated uncertainties. In the present study, we have used various multi-modelling methods, both deterministic and probabilistic, to reduce the model uncertainties over the historical period 1970-2000. The analysis is performed for uncertainty bounds of precipitation and temperature using 10 selected GCMs from the Climate Model Intercomparison Project Phase 5 (CMIP5) dataset over 10 sub-basins of the Columbia River Basin (CRB). All the multi-modelling methods are applied and evaluated, according to their performance indicators, using Taylor diagrams on simulated past climate for all 10 sub-basins. The best-performing multi-model method, based on the performance of all the climatic parameters, is chosen for each sub-basin and used to develop a probable future scenario for the period 2010-2099. All analyses and computations are performed on statistically downscaled GCM data to increase accuracy, better capture the uncertainty bounds at the sub-basin scale, and enhance the ability of the multi-modelling techniques. All the future time series are used to assess the uncertainties of climatic parameters for climate change analysis. The results bring insight into each of the multi-modelling techniques, highlighting the pros and cons of the applied methods. It was also inferred that the best multi-modelling technique varied from basin to basin and with different variables, according to their capability to capture the observed spread/uncertainty. Eventually, the different ensemble time series
Heeb, C.M.
1993-03-01
Detailed results of the Hanford Environmental Dose Reconstruction (HEDR) Project iodine-131 release reconstruction are presented in this volume. Included are daily data on B, D, and F reactor operations from the P-Department Daily Reports (General Electric Company 1947), and tables of B and T Plant material processed from the three principal sources on separations plant operations: the Jaech report (Jaech undated), the 200 Area Report (Acken and Bird 1945; Bird and Donihee 1945), and the Metal History Reports (General Electric Company 1946). A computer-generated transcription of the Jaech report is also provided because the report is not readily readable in its original format. The iodine-131 release data are from the STRM model. Cut-by-cut release estimates are provided, along with daily, monthly, and yearly summations based on the hourly release estimates. The hourly data are contained in a 28-megabyte electronic file; interested individuals may request a copy.
Eliazar, Iddo; Klafter, Joseph
2008-06-01
We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
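As an illustration of the power-law structure these classes share (a sketch under standard definitions, not code from the paper; the function name is mine), the Pareto law can be sampled by inverse-transform sampling:

```python
import random

def sample_pareto(alpha, xmin, u=None):
    # If U ~ Uniform(0, 1), then X = xmin * U**(-1/alpha) has the Pareto
    # survival function P(X > x) = (x / xmin)**(-alpha) for x >= xmin.
    if u is None:
        u = random.random()
    return xmin * u ** (-1.0 / alpha)

# Deterministic check: u = 0.25 with alpha = 2 gives x = xmin * 2.
print(sample_pareto(2.0, 1.0, u=0.25))
```

The same inverse-transform idea applies to the Weibull and Fréchet classes, with the appropriate quantile function in place of the power law.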
Olaciregui-Ruiz, Igor; Rozendaal, Roel; van Oers, René F M; Mijnheer, Ben; Mans, Anton
2017-05-01
At our institute, a transit back-projection algorithm is used clinically to reconstruct in vivo patient and in phantom 3D dose distributions using EPID measurements behind a patient or a polystyrene slab phantom, respectively. In this study, an extension to this algorithm is presented whereby in air EPID measurements are used in combination with CT data to reconstruct 'virtual' 3D dose distributions. By combining virtual and in vivo patient verification data for the same treatment, patient-related errors can be separated from machine, planning and model errors. The virtual back-projection algorithm is described and verified against the transit algorithm with measurements made behind a slab phantom, against dose measurements made with an ionization chamber and with the OCTAVIUS 4D system, as well as against TPS patient data. Virtual and in vivo patient dose verification results are also compared. Virtual dose reconstructions agree within 1% with ionization chamber measurements. The average γ-pass rate values (3% global dose/3mm) in the 3D dose comparison with the OCTAVIUS 4D system and the TPS patient data are 98.5±1.9%(1SD) and 97.1±2.9%(1SD), respectively. For virtual patient dose reconstructions, the differences with the TPS in median dose to the PTV remain within 4%. Virtual patient dose reconstruction makes pre-treatment verification based on deviations of DVH parameters feasible and eliminates the need for phantom positioning and re-planning. Virtual patient dose reconstructions have additional value in the inspection of in vivo deviations, particularly in situations where CBCT data is not available (or not conclusive). Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Nielsen, Tine B; Wieslander, Elinore; Fogliata, Antonella;
2011-01-01
To investigate differences in calculated doses and normal tissue complication probability (NTCP) values between different dose algorithms.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Walters, W.H.; Dirkes, R.L.; Napier, B.A.
1992-11-01
As part of the Hanford Environmental Dose Reconstruction (HEDR) Project, Battelle, Pacific Northwest Laboratories reviewed literature and data on radionuclide concentrations and distribution in the water, sediment, and biota of the Columbia River and adjacent coastal areas. Over 600 documents were reviewed including Hanford reports, reports by offsite agencies, journal articles, and graduate theses. Radionuclide concentration data were used in preliminary estimates of individual dose for the period 1964 through 1966. This report summarizes the literature and database reviews and the results of the preliminary dose estimates.
Walters, W.H.; Dirkes, R.L.; Napier, B.A.
1992-04-01
As part of the Hanford Environmental Dose Reconstruction Project, Pacific Northwest Laboratory reviewed literature and data on radionuclide concentrations and distribution in the water, sediment, and biota of the Columbia River and adjacent coastal areas. Over 600 documents were reviewed including Hanford reports, reports by offsite agencies, journal articles, and graduate theses. Certain radionuclide concentration data were used in preliminary estimates of individual dose for the 1964--1966 time period. This report summarizes the literature and database review and the results of the preliminary dose estimates.
Ciraj-Bjelac, Olivera, E-mail: ociraj@vinca.rs [Vinca Institute of Nuclear Sciences, Belgrade (Serbia); Avramova-Cholakova, Simona, E-mail: s_avramova@mail.bg [National Centre of Radiobiology and Radiation Protection (NCRRP), Ministry of Health, Sofia (Bulgaria); Beganovic, Adnan, E-mail: adnanbeg@gmail.com [University of Sarajevo, Institute of Radiology, Sarajevo, Bosnia and Herzegovina (Bosnia and Herzegovina); Economides, Sotirios, E-mail: adnanbeg@gmail.com [Ministry of Development, Greek Atomic Energy Commission, Athens (Greece); Faj, Dario, E-mail: dariofaj@mefos.hr [University Hospital Osijek, Osijek (Croatia); Gershan, Vesna, E-mail: vgersan@gmail.com [National Commission on Radiation Protection, Institute of Radiology, Skopje, The Former Yugoslav Republic of Macedonia (Macedonia, The Former Yugoslav Republic of); Grupetta, Edward, E-mail: edward.gruppetta@gov.mt [St. Luke' s Hospital, Diagnostic Radiology Unit, Guardamangia (Malta); Kharita, M.H., E-mail: mhkharita@aec.org.sy [Atomic Energy Commission of Syria (AECS), Department of Protection and Safety, Radiation and Nuclear Regulatory Office, Damascus (Syrian Arab Republic); Milakovic, Milomir, E-mail: mmilomir@teol.net [Ministry of Health of the Republic of Srpska, Public Health Institute of Republic of Srpska, Banja Luka, Bosnia and Herzegovina (Bosnia and Herzegovina); Milu, Constantin, E-mail: milu.constantin@yahoo.com [Institute of Public Health, SSDL, Bucharest (Romania); Muhogora, Wilbroad E., E-mail: wmuhogora@yahoo.com [Tanzania Atomic Energy Commission, Arusha, Tanzania (Tanzania, United Republic of); Muthuvelu, Pirunthavany, E-mail: mpvany@gmail.com [Ministry of Health, Radiation Health Safety Branch, Putra Jaya (Malaysia); Oola, Samuel, E-mail: ooladavidson@yahoo.com [Mulago Hospital, Department of Radiology, Kampala (Uganda); Setayeshi, Saeid, E-mail: setayesh@aut.ac.ir [Ministry of Health, Treatment, and Medical Training, Tehran (Iran, Islamic Republic of); and others
2012-09-15
Purpose: The objective is to study mammography practice from an optimisation point of view by assessing the impact of simple and immediately implementable corrective actions on image quality. Materials and methods: This prospective multinational study included 54 mammography units in 17 countries. More than 21,000 mammography images were evaluated using a three-level image quality scoring system. Following initial assessment, appropriate corrective actions were implemented and image quality was re-assessed in 24 units. Results: The fraction of images that were considered acceptable without any remark in the first phase (before the implementation of corrective actions) was 70% and 75% for cranio-caudal and medio-lateral oblique projections, respectively. The main causes for poor image quality before corrective actions were related to film processing, damaged or scratched image receptors, film-screen combinations that are not spectrally matched, inappropriate radiographic techniques and lack of training. Average glandular dose to a standard breast was 1.5 mGy (range: 0.59–3.2 mGy). After optimisation the frequency of poor quality images decreased, but the relative contributions of the various causes remained similar. Image quality improvements following appropriate corrective actions were up to 50 percentage points in some facilities. Conclusions: Poor image quality is a major source of unnecessary radiation dose to the breast. An increased awareness of good quality mammograms is of particular importance for countries that are moving towards introduction of population-based screening programmes. The study demonstrated how simple and low-cost measures can be a valuable tool in improving image quality in mammography.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Broeders, M.J.; Voorde, M. Ten; Veldkamp, W.J.H.; Engen, R.E. van; Landsveld-Verhoeven, C. van; Jong-Gunneman, M.N. t; Win, J. de; Greve, K.D.; Paap, E.; Heeten, GJ. den
2015-01-01
PURPOSE: To compare pain, projected breast area, radiation dose and image quality between flexible (FP) and rigid (RP) breast compression paddles. METHODS: The study was conducted in a Dutch mammographic screening unit (288 women). To compare both paddles one additional image with RP was made, consi
Pontana, Francois; Pagniez, Julien; Faivre, Jean-Baptiste; Hachulla, Anne-Lise; Remy, Jacques [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Duhamel, Alain [University Lille Nord de France, Department of Medical Statistics, Lille (France); Flohr, Thomas [Computed Tomography Division, Siemens Healthcare, Forchheim (Germany); Remy-Jardin, Martine [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Hospital Calmette, Department of Thoracic Imaging, Lille cedex (France)
2011-03-15
To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up CT examination of the chest underwent a low-dose CT examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstructions using three (Group 2a) and five iterations (Group 2b). Despite a mean decrease of 35.5% in the dose-length-product, there was no statistically significant difference between Group 2a and Group 1 in the objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios and distribution of the overall image quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced with increased SNR and CNR and a trend towards improved image quality. Iterative reconstructions using three iterations provide similar image quality compared with the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, even higher dose reductions than 35% may be feasible by using more than three iterations. (orig.)
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
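One classic example of this kind (whether or not it is among the three the article discusses) is the matching problem: by inclusion-exclusion, the probability that a random permutation of n items has no fixed point is the partial sum of the series for 1/e, and it converges to 1/e very quickly:

```python
import math

def no_fixed_point_prob(n):
    # Inclusion-exclusion: P(no match among n items) = sum_{k=0}^{n} (-1)^k / k!
    return sum((-1) ** k / math.factorial(k) for k in range(n + 1))

for n in (4, 7, 10):
    print(n, no_fixed_point_prob(n), math.exp(-1))
```

Already at n = 7 the partial sum agrees with 1/e to four decimal places, which is why "about 1/e" answers appear for even modest n.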
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Zhang, H; Kong, V; Jin, J [Georgia Regents University, Augusta, GA (Georgia); Ren, L [Duke University Medical Center, Durham, NC (United States)
2014-06-01
Purpose: Synchronized moving grid (SMOG) is a promising technique to reduce scatter and ghost artifacts in cone beam computed tomography (CBCT). However, it requires 2 projections at the same gantry angle to obtain full information due to signal blockage by the grid. We proposed an inter-projection interpolation (IPI) method to estimate blocked signals, which may reduce the scan time and the dose. This study aims to provide a framework to achieve a balance between speed, dose and image quality. Methods: The IPI method is based on the hypothesis that an abrupt signal in a projection can be well predicted by the information in the two immediate neighboring projections if the gantry angle step is small. The study was performed on a Catphan and a head phantom. The SMOG was simulated by erasing the information (filling with “0”) of the areas in each projection corresponding to the grid. An IPI algorithm was applied on each projection to recover the erased information. The FDK algorithm was used to reconstruct CBCT images from the IPI-processed projections, which were compared with the original image in terms of signal-to-noise ratio (SNR) measured over the whole reconstruction image range. The effect of gantry angle step was investigated by comparing the CBCT images from projection sets of various gantry intervals, with IPI-predicted projections filling the missing projections in each interval. Results: The IPI processing time was 1.79 s ± 0.53 s per projection. SNR after IPI was 29.0 dB and 28.1 dB for the Catphan and head phantom, respectively, compared to 15.3 dB and 22.7 dB for an inpainting-based interpolation technique. SNR was 28.3, 28.3, 21.8, 19.3 and 17.3 dB for gantry angle intervals of 1, 1.5, 2, 2.5 and 3 degrees, respectively. Conclusion: IPI is feasible for estimating the missing information and achieves reasonable CBCT image quality with reduced dose and scan time. This study is supported by NIH/NCI grant 1R01CA166948-01.
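The core idea can be sketched as follows. This is a deliberately simplified linear interpolation between the two neighboring projections; the authors' actual IPI predictor may be more elaborate, and the function name and data layout are hypothetical:

```python
def ipi_fill(prev_proj, next_proj, cur_proj, blocked):
    # Estimate the grid-blocked pixels of the current projection as the
    # average of the same pixels in the two immediate neighboring
    # projections, assuming little change over a small gantry-angle step.
    return [
        [0.5 * (p + n) if b else c
         for p, n, c, b in zip(prev_row, next_row, cur_row, blk_row)]
        for prev_row, next_row, cur_row, blk_row
        in zip(prev_proj, next_proj, cur_proj, blocked)
    ]

prev = [[1.0, 1.0], [1.0, 1.0]]   # projection at gantry angle theta - d
nxt = [[3.0, 3.0], [3.0, 3.0]]    # projection at gantry angle theta + d
cur = [[0.0, 0.0], [0.0, 0.0]]    # blocked pixels read as 0
mask = [[True, False], [False, True]]  # which pixels the grid blocked
print(ipi_fill(prev, nxt, cur, mask))
```

The dependence on gantry-angle step reported in the Results is consistent with this picture: the larger the step, the worse a neighbor-based estimate becomes.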
Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.
2009-10-23
This is the concluding Progress Report for Project 1.1 of the U.S./Russia Joint Coordinating Committee on Radiation Effects Research (JCCRER). An overwhelming majority of our work this period has been to complete our primary obligation of providing a new version of the Techa River Dosimetry System (TRDS), which we call TRDS-2009D; the D denotes deterministic. This system provides estimates of individual doses to members of the Extended Techa River Cohort (ETRC) and post-natal doses to members of the Techa River Offspring Cohort (TROC). The latter doses were calculated with use of the TRDS-2009D. The doses for the members of the ETRC have been made available to the American and Russian epidemiologists in September for their studies in deriving radiogenic risk factors. Doses for members of the TROC are being provided to European and Russian epidemiologists, as partial input for studies of risk in this population. Two of our original goals for the completion of this nine-year phase of Project 1.1 were not completed. These are completion of TRDS-2009MC, which was to be a Monte Carlo version of TRDS-2009 that could be used for more explicit analysis of the impact of uncertainty in doses on uncertainty in radiogenic risk factors. The second incomplete goal was to be the provision of household specific external doses (rather than village average). This task was far along, but had to be delayed due to the lead investigator’s work on consideration of a revised source term.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid
Nielsen, Sven P.; Andersson, Kasper; Thoerring, H. (and others)
2006-04-15
Considerable variations in activity concentrations in milk of {sup 137}Cs and {sup 90}Sr were observed between countries or regions due to precipitation patterns, soil types and inhomogeneity of Chernobyl fallout. Time trends indicate that factors influencing ecological half-lives for {sup 90}Sr are not the same as for {sup 137}Cs in the pasture-milk system. Internal doses to Faroese people derive mainly from dairy products, lamb and potatoes. The largest doses were received from nuclear weapons fallout in the early 1960's. {sup 137}Cs causes higher doses than {sup 90}Sr, and the regional variability is larger for {sup 137}Cs than for {sup 90}Sr. {sup 137}Cs deposition maps were made of Sweden. Values of {sup 137}Cs deposition and precipitation were used in the calculations of Nuclear Weapons Fallout (NWF). The deposition of {sup 137}Cs from the Chernobyl accident was calculated for western Sweden. The lowest levels of NWF {sup 137}Cs deposition density were noted in north-eastern and eastern Sweden and the highest levels in the western parts. The Chernobyl {sup 137}Cs deposition is highest along the coast and lowest in the south-eastern part and along the middle. The calculated deposition from NWF and Chernobyl in western Sweden was compared to observed deposition and showed good agreement. Ecological halftimes of {sup 137}Cs in perch in Finnish lakes vary by a factor of three. The longest halftime of {sup 137}Cs in perch was 9 y and the shortest 3 y. Norwegian lakes differ from each other with respect to the rates of decrease of {sup 137}Cs in fish. Ecological halftimes of {sup 137}Cs in trout and Arctic char varied from 1 to 5 y. A more rapid reduction of {sup 137}Cs in fish is found in certain Norwegian lakes compared to Finnish lakes. In two Norwegian lakes the {sup 137}Cs concentrations in trout have remained at about 100 Bq/kg since 1990. The European decision support systems, ARGOS and RODOS, include foodchain modules with default parameters derived from southern Germany. Many
Fullwood, R.R.
1989-04-01
The Advanced Neutron Source (ANS) (Difilippo, 1986; Gamble, 1986; West, 1986; Selby, 1987) will be the world's best facility for low energy neutron research. This performance requires the highest flux density of all non-pulsed reactors, with concomitant low thermal inertia and fast response to upset conditions. One of the primary concerns is that a flow cessation on the order of a second may result in fuel damage. Such a flow stoppage could be the result of a break in the primary piping. This report is a review of methods for assessing pipe break probabilities based on historical operating experience in power reactors, scaling methods, fracture mechanics and fracture growth models. The goal of this work is to develop parametric guidance for the ANS design to make the event highly unlikely, and to review and select methods that may be used in an interactive IBM-PC model providing fast and reasonably accurate models to aid the ANS designers in achieving the safety requirements. 80 refs., 7 figs.
Tapiador, Francisco J.; Sanchez, Enrique; Romera, Raquel (Inst. of Environmental Sciences, Univ. of Castilla-La Mancha (UCLM), 45071 Toledo (Spain)). e-mail: francisco.tapiador@uclm.es
2009-07-01
Regional climate models (RCMs) are dynamical downscaling tools aimed at improving the modelling of local physical processes. Ensembles of RCMs are widely used to improve the coarse-grain estimates of global climate models (GCMs), since the use of several RCMs helps to palliate uncertainties arising from different dynamical cores and numerical schemes. In this paper, we analyse the differences and similarities in the climate change response for an ensemble of heterogeneous RCMs forced by one GCM (HadAM3H) and one emissions scenario (IPCC's SRES-A2). In contrast with previous approaches using the PRUDENCE database, the statistical description of climate characteristics is made through the spatial and temporal aggregation of the RCM outputs into probability distribution functions (PDFs) of monthly values. This procedure is a complementary approach to conventional seasonal analyses. Our results provide new, stronger evidence of expected marked regional differences in Europe under the A2 scenario in terms of precipitation and temperature changes. While we found an overall increase in the mean temperature and extreme values, we also found mixed regional differences for precipitation
Probability and Relative Frequency
Drieschner, Michael
2016-01-01
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
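The claim that the expectation of the relative frequency equals the predicted value is the standard Bernoulli computation (a sketch in conventional notation, not the paper's own derivation):

```latex
\mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
  = \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i]
  = \frac{1}{n}\, n p
  = p,
\qquad X_i \sim \mathrm{Bernoulli}(p)\ \text{i.i.d.}
```

Linearity of expectation is all that is used, so the identity holds even without independence, though independence is what makes the relative frequency concentrate around p.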
Elements of probability theory
Rumshiskii, L Z
1965-01-01
Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments
Evaluating probability forecasts
Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902
2012-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
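A minimal example of the kind of scoring rule being assessed is the Brier score (a standard choice, not necessarily the one the authors analyze):

```python
def brier_score(forecasts, outcomes):
    # Mean squared difference between the forecast probabilities and the
    # 0/1 indicators of whether each predicted event occurred.
    return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)

# Three forecasts: 0.9 and 0.7 for events that occurred, 0.2 for one that did not.
print(brier_score([0.9, 0.2, 0.7], [1, 0, 1]))
```

Lower is better; a perfectly calibrated and perfectly sharp forecaster scores 0. The loss-function approach described in the abstract instead compares forecast probabilities to the actual event probabilities rather than to realized 0/1 outcomes.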
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Roussas, George G
2006-01-01
Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Introduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Edwards, William F.; Shiflett, Ray C.; Shultz, Harris
2008-01-01
The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
2013-01-01
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob
Chen, Chien-Ming; Lin, Yang-Yu; Hsu, Ming-Yi; Hung, Chien-Fu; Liao, Ying-Lan; Tsai, Hui-Yu
2016-09-01
To evaluate the performance of Adaptive Iterative Dose Reduction 3D (AIDR 3D) and compare it with filtered back projection (FBP) regarding radiation dose and image quality for 80-kVp abdominal CT. An abdominal phantom underwent four CT acquisitions and reconstruction algorithms (FBP; AIDR 3D mild, standard and strong). Sixty-three patients underwent unenhanced liver CT with FBP and standard-level AIDR 3D; further post-acquisition reconstruction with strong-level AIDR 3D was made. Patients were divided into two groups (radiation dose by 72% in the phantom and 47.1% in the patient study compared with FBP. There was no difference in mean attenuations. Image noise was the lowest and signal-to-noise ratio the highest using strong-level AIDR 3D in both patient groups. For Deffradiation dose and maintenance of image quality compared with FBP. Using AIDR 3D reconstruction, patients with larger abdomen circumference could be imaged at 80 kVp. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Dynamical Simulation of Probabilities
Zak, Michail
1996-01-01
It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability - frequentist, propensity, classical, Bayesian, and objective Bayesian - and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
Hanf, R.W.; Duncan, J.P.; Thiede, M.E.
1993-09-01
The purpose of the HEDR Project is to estimate the radiation dose that individuals could have received as a result of radionuclide emissions since 1944 from DOE's Hanford Site. A major objective of the HEDR Project is to estimate doses to the thyroid of individuals who were exposed to iodine-131. A principal pathway for many of these individuals was milk from cows that ate vegetation contaminated by iodine-131 released into the air from Hanford facilities. The HEDR Project work is conducted under several technical and administrative tasks, among which is the Environmental Monitoring Data Task. Members of the Environmental Monitoring Data Task have developed databases of historical environmental measurements. These databases include iodine-131 concentrations for vegetation samples collected on and around the Hanford Site since 1944, the initial year in which the chemical separation plants were operated and whose effluents led to the release of radioactive iodine. These data will be used to assist in the HEDR model validation studies. To date, the vegetation data for 1945-1947 have been published. Because the factors used from 1945-1951 to convert raw count data to iodine-131 activity levels and to adjust reported iodine-131 activity levels did not account for all parameters affecting counting efficiency, other conversion and correction factors need to be applied.
Proposal for Modified Damage Probability Distribution Functions
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...
Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)
2013-07-15
To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)
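Signal-to-noise and contrast-to-noise ratios in CT image-quality studies like the one above are computed from ROI statistics. A minimal sketch, with hypothetical Hounsfield-unit values that are illustrative only, not taken from the paper:

```python
# Illustrative SNR/CNR computation from ROI statistics.
# The numeric values below are made-up examples, not study data.

def snr(mean_signal, noise_sd):
    """Signal-to-noise ratio: ROI mean attenuation divided by image noise (SD)."""
    return mean_signal / noise_sd

def cnr(mean_signal, mean_background, noise_sd):
    """Contrast-to-noise ratio: attenuation difference divided by image noise."""
    return abs(mean_signal - mean_background) / noise_sd

# Hypothetical ROI measurements in Hounsfield units:
disc_hu, muscle_hu, noise_sd_hu = 80.0, 60.0, 8.0
print(snr(disc_hu, noise_sd_hu))            # 10.0
print(cnr(disc_hu, muscle_hu, noise_sd_hu))  # 2.5
```

Lower image noise at the same attenuation (as iterative reconstruction provides) raises both ratios, which is what the study measures.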
Weidlich, P; Adam, C; Sroka, R; Lanzl, I; Assmann, W; Stief, C
2007-09-01
The treatment of urethral strictures represents an unsolved urological problem. The effect of a (32)P-coated urethral catheter, in the sense of low-dose-rate brachytherapy to modulate wound healing, will be analyzed in an animal experiment. It is not yet possible to present any results, because this approach is being studied for the first time, there is no experience with low-dose-rate brachytherapy and this form of application in the lower urinary tract, and the animal experiment will only start in the near future. Both decades-long experience with radiotherapy to treat benign diseases and our own results from previous studies in otolaryngology and ophthalmology lead us to expect a significantly lower rate of urethral stricture formation after internal urethrotomy. This study will contribute to improving the treatment of urethral strictures as demanded in previous papers.
Probability and radical behaviorism
Espinosa, James M.
1992-01-01
The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
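The strict relative-frequency interpretation described in this abstract identifies the probability of an event with the limiting relative frequency n_A / n over repeated trials. A minimal sketch of that idea (the fair-coin simulation is my own illustration, not from the paper):

```python
# Relative-frequency estimate of a probability: count how often an
# event occurs across repeated trials and divide by the trial count.
import random

def relative_frequency(trials, event):
    """Fraction of trials on which `event` returns True."""
    hits = sum(1 for t in trials if event(t))
    return hits / len(trials)

random.seed(1)  # reproducible illustration
flips = [random.random() < 0.5 for _ in range(10_000)]
freq = relative_frequency(flips, lambda outcome: outcome)
print(round(freq, 2))  # close to 0.5 for a fair coin
```

As n grows, the law of large numbers drives this estimate toward the underlying probability, which is the sense in which the frequency interpretation grounds probability in data such as cumulative records.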
(*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Zvolanek, Kristina; Ma, Rongtao; Zhou, Christina; Liang, Xiaoying; Wang, Shuo; Verma, Vivek; Zhu, Xiaofeng; Zhang, Qinghui; Driewer, Joseph; Lin, Chi; Zhen, Weining; Wahl, Andrew; Zhou, Su-Min; Zheng, Dandan
2017-05-01
Inhomogeneity dose modeling and respiratory motion description are two critical technical challenges for lung stereotactic body radiotherapy, an important treatment modality for small-size primary and secondary lung tumors. Recent studies revealed lung density-dependent target dose differences between Monte Carlo (Type-C) algorithms and earlier algorithms. Therefore, this study aimed to investigate the equivalence of the two most popular CT datasets for treatment planning, free breathing (FB) and average intensity projection (AIP) CTs, using Type-C algorithms, and comparing with two older generation algorithms (Type-A and Type-B). Twenty patients (twenty-one lesions) were planned using a Type-A algorithm on the FB CT. Lung was contoured separately on FB and AIP CTs and compared. Dose comparison was obtained between the two CTs using four commercial dose algorithms including one Type-A (Pencil Beam Convolution - PBC), one Type-B (Analytical Anisotropic Algorithm - AAA), and two Type-C algorithms (Voxel Monte Carlo - VMC and Acuros External Beam - AXB). For each algorithm, the dosimetric parameters of the target (PTV, Dmin, Dmax, Dmean, D95, and D90) and lung (V5, V10, V20, V30, V35, and V40) were compared between the two CTs using the Wilcoxon signed rank test. Correlations between dosimetric differences and density differences for each algorithm were studied using linear regression and Spearman correlation, in which both global and local density differences were evaluated. Although the lung density differences on FB and AIP CTs were statistically significant (P = 0.003), the magnitude was small at 1.21 ± 1.45%. Correspondingly, for the two Type-C algorithms, target and lung dosimetric differences were small in magnitude and statistically insignificant (P > 0.05) for all but one instance, similar to the findings for the older generation algorithms. Nevertheless, a significant correlation was shown between the dosimetric and density differences for Type-C and Type
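The paired comparisons in the study above use the Wilcoxon signed rank test. A minimal pure-Python sketch of the test statistic (statistic only; obtaining p-values would require a reference distribution or a library routine such as scipy.stats.wilcoxon; the paired dose values below are hypothetical):

```python
# Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples.
# Zero differences are dropped; tied |differences| receive average ranks.

def wilcoxon_statistic(paired_a, paired_b):
    """Return W = min(W+, W-) for two paired samples."""
    diffs = [a - b for a, b in zip(paired_a, paired_b) if a != b]
    ordered = sorted(diffs, key=abs)  # rank by absolute difference
    ranks = {}
    i = 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and abs(ordered[j]) == abs(ordered[i]):
            j += 1
        avg_rank = (i + 1 + j) / 2  # mean of ranks i+1 .. j for ties
        for k in range(i, j):
            ranks[k] = avg_rank
        i = j
    w_plus = sum(r for k, r in ranks.items() if ordered[k] > 0)
    w_minus = sum(r for k, r in ranks.items() if ordered[k] < 0)
    return min(w_plus, w_minus)

# Hypothetical paired target doses (Gy) on two CT datasets:
ct_a = [50.1, 49.8, 50.5, 50.0, 50.3]
ct_b = [49.1, 51.8, 47.5, 54.0, 45.3]
print(wilcoxon_statistic(ct_a, ct_b))  # 6.0 for these illustrative values
```

A small W relative to its null distribution indicates a systematic difference between the paired measurements.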
On Quantum Conditional Probability
Isabel Guerra Bobo
2013-02-01
We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
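In the classic multinomial logit special case of the ARUM framework (i.i.d. Gumbel utility shocks; a standard textbook instance, not necessarily the paper's general construction), the choice-probability generating function is the log-sum-exp of the utilities and its gradient is the softmax vector of choice probabilities:

```python
# Multinomial logit as a CPGF example: G(u) = log(sum_j exp(u_j)),
# with dG/du_j = exp(u_j) / sum_k exp(u_k), the choice probabilities.
import math

def cpgf(utilities):
    """Log-sum-exp generating function, computed stably."""
    m = max(utilities)
    return m + math.log(sum(math.exp(u - m) for u in utilities))

def choice_probabilities(utilities):
    """Gradient of the CPGF: softmax over utilities."""
    m = max(utilities)
    w = [math.exp(u - m) for u in utilities]
    z = sum(w)
    return [x / z for x in w]

probs = choice_probabilities([1.0, 2.0, 3.0])
print([round(p, 3) for p in probs])  # nonnegative, sums to 1
```

Higher-utility alternatives receive higher choice probabilities, and the probabilities always form a proper distribution, which is the property the CPGF characterization generalizes.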
Probability, Nondeterminism and Concurrency
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)
2015-08-15
Highlights: • Iterative reconstruction (IR) in staging CT provides objective image quality equal to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP] vs. 6.34 ± 2.25 mGy [ASIR]; p < 0.001), by 37.6%, and significantly increased CNRs compared to FBP (complete tumor-to-liver, 2.76 ± 1.87 [FBP] vs. 3.2 ± 2.32 [ASIR], p < 0.05; complete tumor-to-muscle, 2.74 ± 2.67 [FBP] vs. 4.31 ± 4.61 [ASIR], p < 0.05). Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesions (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
THE NUCLEAR ENCOUNTER PROBABILITY
SMULDERS, PJM
1994-01-01
This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Probability in quantum mechanics
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Degteva, M.O. [Urals Research Center for Radiation Medicine, Chelyabinsk (Russian Federation); Drozhko, E. [Branch 1 of Moscow Biophysics Inst., Ozersk (Russian Federation); Anspaugh, L.R. [Lawrence Livermore National Lab., CA (United States); Napier, B.A. [Pacific Northwest National Lab., Richland, WA (United States); Bouville, A.C. [National Cancer Inst., Bethesda, MD (United States); Miller, C.W. [Centers for Disease Control and Prevention, Atlanta, GA (United States)
1996-02-01
This work is being carried out as a feasibility study to determine if a long-term course of work can be implemented to assess the long-term risks of radiation exposure delivered at low to moderate dose rates to the populations living in the vicinity of the Mayak Industrial Association (MIA). This work was authorized and conducted under the auspices of the US-Russian Joint Coordinating Committee on Radiation Effects Research (JCCRER) and its Executive Committee (EC). The MIA was the first Russian site for the production and separation of plutonium. This plant began operation in 1948, and during its early days there were technological failures that resulted in the release of large amounts of waste into the rather small Techa River. There were also gaseous releases of radioiodines and other radionuclides during the early days of operation. In addition, there was an accidental explosion in a waste storage tank in 1957 that resulted in a significant release. The Techa River Cohort has been studied for several years by scientists from the Urals Research Centre for Radiation Medicine and an increase in both leukemia and solid tumors has been noted.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
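A toy version of the estimation task this paper models: track the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. Here a simple sliding-window relative frequency stands in for the paper's change-point encoding model, and the outcome sequence is a fixed illustrative pattern, not subject data:

```python
# Sliding-window estimate of a stepwise nonstationary Bernoulli
# parameter: after each outcome, estimate p as the mean of the most
# recent `window` outcomes.

def sliding_estimates(outcomes, window=10):
    """Return the running estimate of p after each outcome."""
    ests = []
    for i in range(len(outcomes)):
        lo = max(0, i - window + 1)
        chunk = outcomes[lo:i + 1]
        ests.append(sum(chunk) / len(chunk))
    return ests

# Deterministic stand-in for p jumping from 0.2 to 0.8 halfway through:
outcomes = [1, 0, 0, 0, 0] * 8 + [1, 1, 1, 1, 0] * 8
ests = sliding_estimates(outcomes)
print(ests[30], ests[70])  # 0.2 before the step, 0.8 after
```

Unlike the paper's model, which steps between estimates only at inferred change points, this estimator changes after every outcome; the contrast is exactly what the change-point account is meant to capture.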
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Hanf, R.W.; Dirkes, R.L.; Duncan, J.P.
1992-07-01
The objective of the Hanford Environmental Dose Reconstruction Project (HEDR) is to estimate the potential radiation doses received by people living within the sphere of influence of the Hanford Site. A potential critical pathway for human radiation exposure is through the consumption of waterfowl that frequent onsite waste-water ponds or through eating of fish, shellfish, and waterfowl that reside in/on the Columbia River and its tributaries downstream of the reactors. This document summarizes information on fish, shellfish, and waterfowl radiation contamination for samples collected by Hanford monitoring personnel and offsite agencies for the period 1945 to 1972. Specific information includes the types of organisms sampled, the kinds of tissues and organs analyzed, the sampling locations, and the radionuclides reported. Some tissue concentrations are also included. We anticipate that these yearly summaries will be helpful to individuals and organizations interested in evaluating aquatic pathway information for locations impacted by Hanford operations and will be useful for planning the direction of future HEDR studies.
Takx, Richard A.P. [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Schoepf, U. Joseph, E-mail: schoepf@musc.edu [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Division of Cardiology, Department of Medicine, Medical University of South Carolina, Charleston, SC (United States); Moscariello, Antonio [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Policlinico Universitario Campus Bio-Medico, Rome (Italy); Das, Marco [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Rowe, Garrett [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Schoenberg, Stefan O.; Fink, Christian [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Henzler, Thomas [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany)
2013-02-15
Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data were reconstructed using standard FBP at full dose (Group 1a) and at 80% reduced tube current (Group 1b). The low-dose raw data were additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest, and signal-to-noise ratio (SNR) as well as contrast-to-noise ratio (CNR) were calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of Group 2 was lowered by 22% on average compared to Group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR in Group 2 compared to Group 1b (p < 0.0001–0.0232). Subjective image quality of Group 2 (1.88 ± 0.63) was also rated significantly higher than that of Group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced, iteratively reconstructed cCTA raw data is significantly improved compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA.
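The SNR and CNR reported above are simple ratios of ROI measurements. A minimal sketch of the standard definitions (the HU values below are illustrative placeholders, not data from the study):

```python
def snr(mean_hu, noise_hu):
    # Signal-to-noise ratio: mean ROI attenuation divided by image noise (SD)
    return mean_hu / noise_hu

def cnr(mean_hu, background_hu, noise_hu):
    # Contrast-to-noise ratio: attenuation difference to a background ROI,
    # normalized by image noise
    return (mean_hu - background_hu) / noise_hu

# Illustrative values (HU): contrast-filled aorta vs. perivascular fat
aorta, fat, noise = 450.0, -80.0, 25.0
print(snr(aorta, noise))       # 18.0
print(cnr(aorta, fat, noise))  # 21.2
```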
Galindo G, I. F.; Vergara del C, J. A.; Galvan A, S. J. [Instituto Nacional de Electricidad y Energias Limpias, Reforma 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico); Tijerina S, F., E-mail: francisco.tijerina@cfe.gob.mx [CFE, Central Nucleoelectrica Laguna Verde, Carretera Federal Cardel-Nautla Km 42.5, 91476 Municipio Alto Lucero, Veracruz (Mexico)
2016-09-15
The use of specialized codes to estimate the projected radiation dose for a postulated emergency event at a nuclear power plant requires that certain plant data be available according to the event being simulated. Calculating the possible radiological release is the critical activity for carrying out the emergency actions. However, not all of the required plant data are obtained directly from the plant; some need to be calculated. In this paper we present a computational tool that calculates the plant data required to use the radiological dose estimation codes. The tool provides the required information for an emergency venting event of gases from the primary containment atmosphere, whether from the wet well or the dry well, and also calculates the time in which the spent fuel would become uncovered in the event of a water leak through one of the walls or the floor of the spent fuel pool. The tool incorporates mathematical models for the processes involved, such as compressible flow in pipes with area change and with constant area, taking into account the effects of friction, and, for the case of the spent fuel pool, hydraulic models to calculate the time in which a container is emptied. The models implemented in the tool are validated against data from the literature for simulated cases, and the results are very close to the reference values. The tool will also support the use of radiological dose estimation codes in postulated emergency cases, so that the actions to be taken can be determined adequately and efficiently and the impact is kept as small as possible. (Author)
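The spent-fuel-pool part of the tool can be illustrated with the classic Torricelli tank-drain model, a simplified stand-in for the hydraulic models described above (the geometry, discharge coefficient, and water levels below are assumed for illustration, not taken from the paper):

```python
import math

def drain_time(area_pool, area_hole, h0, h1, g=9.81, cd=0.6):
    """Time for the water level to fall from h0 to h1 through a hole
    (Torricelli outflow with discharge coefficient cd).
    dh/dt = -(cd*a/A)*sqrt(2*g*h)  integrates to
    t = (A/(cd*a)) * (sqrt(2*h0/g) - sqrt(2*h1/g))."""
    factor = area_pool / (cd * area_hole)
    return factor * (math.sqrt(2 * h0 / g) - math.sqrt(2 * h1 / g))

# 100 m^2 pool, 0.01 m^2 crack, level falling from 12 m to 8 m
t = drain_time(100.0, 0.01, 12.0, 8.0)  # roughly 4.8e3 seconds
```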
Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)
2014-08-15
The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2 years; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired for each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT when IR and airway luminal volumetry techniques are used. Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)
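The WA% metric above is a simple ratio of airway cross-sectional areas. A one-line sketch of the usual definition (the areas below are illustrative, not measurements from the study):

```python
def wall_area_percent(total_area, lumen_area):
    # WA% = wall area / total airway area (wall + lumen) * 100
    return 100.0 * (total_area - lumen_area) / total_area

# Bronchus with 50 mm^2 total cross-section and 20 mm^2 lumen
print(wall_area_percent(50.0, 20.0))  # 60.0
```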
Zurek, W H
2004-01-01
I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, p_k ∝ |ψ_k|². Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with, and found superior to, the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
Collision Probability Analysis
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving …
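The two-step structure described above, geometrically possible encounters times a causation factor, reduces to a simple product. A tiny sketch with assumed, illustrative numbers (not values from the report):

```python
def collision_probability(n_geometric, causation_factor):
    """Expected number of collisions per year: the number of geometrically
    possible encounters (ships on collision course) multiplied by the
    causation factor, i.e. the probability that the navigators fail to
    resolve a critical situation."""
    return n_geometric * causation_factor

# e.g. 200 candidate encounters per year, causation factor 2e-4
annual = collision_probability(200, 2e-4)  # 0.04 expected collisions/year
```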
Choice probability generating functions
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications …
Negative Probabilities and Contextuality
de Barros, J Acacio; Oas, Gary
2015-01-01
There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Carr, D.B.; Tolley, H.D.
1982-12-01
This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
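The idea of estimating a tail probability from data and then extrapolating beyond the data using the tail shape can be sketched as follows. This is a generic peaks-over-threshold exponential fit for illustration, not the paper's specific weighted estimators:

```python
import math
import random

def tail_prob_empirical(data, x):
    # Empirical exceedance probability P(X > x)
    return sum(1 for d in data if d > x) / len(data)

def tail_prob_exponential(data, x, threshold):
    """Extrapolate beyond the data: fit an exponential tail to the
    exceedances over a threshold, then multiply by the empirical
    probability of exceeding the threshold."""
    exc = [d - threshold for d in data if d > threshold]
    rate = len(exc) / sum(exc)          # MLE for the exponential rate
    p_exceed = len(exc) / len(data)     # empirical P(X > threshold)
    return p_exceed * math.exp(-rate * (x - threshold))

random.seed(0)
sample = [random.expovariate(1.0) for _ in range(10000)]
# True P(X > 5) = exp(-5) ~ 0.0067 for a unit exponential
est = tail_prob_exponential(sample, 5.0, 2.0)
```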
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Tomographic probability representation for quantum fermion fields
Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D
2009-01-01
Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical; for example, states are morphisms. We introduce the category S_n D cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_n D.
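Membership in the cogenerating simplex S_n is a simple pair of checks. A small sketch (the tolerance handling is an implementation choice, not from the paper):

```python
def in_simplex(x, tol=1e-12):
    # Membership in S_n = {x in [0,1]^n : sum(x) <= 1}:
    # every coordinate lies in [0,1] and the coordinates sum to at most 1.
    return all(0.0 <= xi <= 1.0 for xi in x) and sum(x) <= 1.0 + tol

print(in_simplex([0.2, 0.3]))  # True  (0.2 + 0.3 <= 1)
print(in_simplex([0.7, 0.6]))  # False (0.7 + 0.6 > 1)
```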
Bergstroem, Ulla (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Avila, Rodolfo; Ekstroem, Per-Anders; Cruz, Idalmis de la (Facilia AB, Bromma (Sweden))
2008-06-15
Following a review by the Swedish regulatory authorities of the safety analysis of the SFR 1 disposal facility for low and intermediate level waste, SKB has prepared an updated safety analysis, SAR-08. This report presents estimations of annual doses to the most exposed groups from potential radionuclide releases from the SFR 1 repository for a number of calculation cases, selected using a systematic approach for identifying relevant scenarios for the safety analysis. The dose estimates can be used for demonstrating that the long term safety of the repository is in compliance with the regulatory requirements. In particular, the mean values of the annual doses can be used to estimate the expected risks to the most exposed individuals, which can then be compared with the regulatory risk criteria for human health. The conversion from doses to risks is performed in the main report. For one scenario however, where the effects of an earthquake taking place close to the repository are analysed, risk calculations are presented in this report. In addition, prediction of concentrations of radionuclides in environmental media, such as water and soil, are compared with concentration limits suggested by the Erica-project as a base for estimating potential effects on the environment. The assessment of the impact on non-human biota showed that the potential impact is negligible. Committed collective dose for an integration period of 10,000 years for releases occurring during the first thousand years after closure are also calculated. The collective dose commitment was estimated to be 8 manSv. The dose calculations were carried out for a period of 100,000 years, which was sufficient to observe peak doses in all scenarios considered. Releases to the landscape and to a well were considered. The peaks of the mean annual doses from releases to the landscape are associated with C-14 releases to a future lake around year 5,000 AD. In the case of releases to a well, the peak annual doses
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Contributions to quantum probability
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Superpositions of probability distributions
Jizba, Petr; Kleinert, Hagen
2008-09-01
Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
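One way to see why such superpositions matter: for a zero-mean scale mixture of Gaussians, the fourth moment is 3·E[v²], so the kurtosis is 3·E[v²]/E[v]² ≥ 3, strictly greater than the Gaussian value whenever the variance genuinely fluctuates. A small sketch with illustrative mixture weights (this standard moment identity is general background, not a result from the paper):

```python
def mixture_kurtosis(variances, weights):
    # Kurtosis of a zero-mean scale mixture of Gaussians X | v ~ N(0, v):
    # E[X^2] = E[v], E[X^4] = 3*E[v^2], so kurtosis = 3*E[v^2]/E[v]^2 >= 3.
    ev = sum(w * v for v, w in zip(variances, weights))
    ev2 = sum(w * v * v for v, w in zip(variances, weights))
    return 3.0 * ev2 / ev**2

# 50/50 superposition of v = 1 and v = 9: heavier tails than any Gaussian
print(mixture_kurtosis([1.0, 9.0], [0.5, 0.5]))  # 4.92
```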
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically, and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Probability Constructs in Preschool Education and How they Are Taught
Antonopoulos, Konstantinos; Zacharos, Konstantinos
2013-01-01
The teaching of Probability Theory constitutes a new trend in mathematics education internationally. The purpose of this research project was to explore the degree to which preschoolers understand key concepts of probabilistic thinking, such as sample space, the probability of an event and probability comparisons. At the same time, we evaluated an…
May, M.S.; Eller, A.; Stahl, C. [University Hospital Erlangen (Germany). Dept. of Radiology; and others
2014-06-15
Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard reconstructions with filtered back projection (FBP) and of half-dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between the two tube-detector systems. For the HD datasets, only data from one tube-detector system were utilized; thus, full-dose (FD) and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: standard FD images applying FBP, which served as a reference, and HD images applying FBP and IR. Objective image quality analysis was performed by measuring the image noise in tissue and air. The subjective image quality was evaluated by 2 radiologists according to European guidelines. Additional assessment of artifacts, lesion conspicuity and edge sharpness was performed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioles. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, given the same data for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability at a given probability of false alarm and with the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than one that implements set-based detectors, effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.
Whiting, Alan B
2014-01-01
Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters shows that it does not have some features he intended and does not solve the problems they have identified.
Probably Almost Bayes Decisions
Anoulova, S.; Fischer, Paul; Poelt, S.
1996-01-01
In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.
Sirca, Simon
2016-01-01
This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.
Generalized Probability Functions
Alexandre Souto Martinez
2009-01-01
From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
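The generalized functions described above can be sketched numerically. This is a minimal illustration, assuming the commonly used one-parameter form ln_q(x) = (x^q - 1)/q; the function names and the parameter symbol q are ours, not the paper's:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; reduces to ln(x) as q -> 0."""
    if q == 0:
        return math.log(x)
    return (x ** q - 1.0) / q

def gen_exp(y, q):
    """Generalized exponential, the inverse of gen_log."""
    if q == 0:
        return math.exp(y)
    return (1.0 + q * y) ** (1.0 / q)

# The pair inverts itself for any parameter value, and the q -> 0 limit
# recovers the ordinary logarithm and exponential.
for q in (0.0, 0.3, -0.5):
    assert abs(gen_exp(gen_log(2.5, q), q) - 2.5) < 1e-12
assert abs(gen_log(2.0, 1e-9) - math.log(2.0)) < 1e-6
```

Distributions such as the stretched exponential arise by replacing log/exp in a standard pdf with this generalized pair.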
Radiological dose assessment for vault storage concepts
Richard, R.F.
1997-02-25
This radiological dose assessment presents neutron and photon dose rates in support of project W-460. Dose rates are provided for a single 3013 container, the "infloor" storage vault concept, and the "cubicle" storage vault concept.
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
Probabilities for Solar Siblings
Valtonen, M; Bobylev, V V; Myllari, A
2015-01-01
We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
Emptiness Formation Probability
Crawford, Nicholas; Ng, Stephen; Starr, Shannon
2016-08-01
We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the side length of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.
Learning unbelievable marginal probabilities
Pitkow, Xaq; Miller, Ken D
2011-01-01
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...
People's conditional probability judgments follow probability theory (plus noise).
Costello, Fintan; Watts, Paul
2016-09-01
A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
Savage's Concept of Probability
熊卫
2003-01-01
Starting with personal preference, Savage [3] constructs a foundational theory of probability, proceeding from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...
Probability Theory without Bayes' Rule
Rodriques, Samuel G.
2014-01-01
Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...
Callahan, Michael J; Anandalwar, Seema P; MacDougall, Robert D; Stamoulis, Catherine; Kleinman, Patricia L; Rangel, Shawn J; Bachur, Richard G; Taylor, George A
2015-03-01
OBJECTIVE. The purpose of this study was to determine the effect of a nominal 50% reduction in median absorbed radiation dose on sensitivity, specificity, and negative appendectomy rate of CT for acute appendicitis in children. MATERIALS AND METHODS. On the basis of a departmental practice quality improvement initiative using computer-generated gaussian noise for CT dose reduction, we applied a nominal dose reduction of 50% to abdominal CT techniques used for bowel imaging. This retrospective study consisted of 494 children who underwent a CT for suspected acute appendicitis before (n = 244; mean age, 133 months) and after (n = 250; mean age, 145 months) the nominal 50% dose reduction. Test performance characteristics of CT for acute appendicitis and impact on the negative appendectomy rate were compared for both time periods. Primary analyses were performed with histologic diagnosis as the outcome standard. Volume CT dose index and dose-length product were recorded from dose reports and size-specific dose estimates were calculated. RESULTS. The nominal 50% dose reduction resulted in an actual 39% decrease in median absorbed radiation dose. Sensitivity of CT for diagnosis of acute appendicitis was 98% (95% CI, 91-100%) versus 97% (91-100%), and specificity was 93% (88-96%) versus 94% (90-97%) before and after dose reduction, respectively. The negative appendectomy rate was 4.5% (0.8-10.25%) before dose reduction and 4.0% (0.4-7.6%) after dose reduction. CONCLUSION. The negative appendectomy rate and performance characteristics of the CT-based diagnosis of acute appendicitis were not affected by a 39% reduction in median absorbed radiation dose.
Probability state modeling theory.
Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I
2015-07-01
As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.
Probability distributions for magnetotellurics
Stodt, John A.
1982-11-01
Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
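The setup described above is easy to explore by Monte Carlo. The following sketch is our own illustration (the transfer-function value and the 10% error level are hypothetical): it draws ratios of complex normal variables and collects the phase and the log squared magnitude, the two quantities whose near-normal behavior the paper analyzes.

```python
import cmath
import math
import random
import statistics

def complex_normal(mean, sigma, rng):
    """Circular complex Gaussian: independent real and imaginary parts."""
    return complex(rng.gauss(mean.real, sigma), rng.gauss(mean.imag, sigma))

rng = random.Random(0)
true_z = complex(2.0, 1.0)   # hypothetical transfer function
err = 0.10                   # ~10% errors in numerator and denominator

estimates = []
for _ in range(50_000):
    num = complex_normal(true_z, err * abs(true_z), rng)
    den = complex_normal(1 + 0j, err, rng)
    estimates.append(num / den)   # estimate as a ratio of complex normals

phases = [cmath.phase(z) for z in estimates]
log_mag2 = [2.0 * math.log(abs(z)) for z in estimates]
# For ~10% errors, both the phase and the log squared magnitude cluster
# tightly around their true values, consistent with the normal
# approximations working well in this error range.
```

Raising `err` toward the 75% end of the range studied in the paper makes the squared-magnitude distribution visibly non-normal, while the log-magnitude remains better behaved.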
RANDOM VARIABLE WITH FUZZY PROBABILITY
吕恩琳; 钟佑明
2003-01-01
A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, is studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a probability fuzzy-number set is given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP), as well as its fuzzy distribution function and fuzzy probability distribution sequence, are put forward. The fuzzy probability resolution theorem, with the closure of fuzzy probability operations, is given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP are also studied. The entire mathematical description of the RVFP is closed under fuzzy probability operations; as a result, a foundation is laid for perfecting fuzzy probability operational methods.
Evaluating dose response from flexible dose clinical trials
Baron David
2008-01-01
Background The true dose effect in flexible-dose clinical trials may be obscured and even reversed because dose and outcome are related. Methods To evaluate the dose effect in response on primary efficacy scales from 2 randomized, double-blind, flexible-dose trials of patients with bipolar mania who received olanzapine (N = 234, 5–20 mg/day) or patients with schizophrenia who received olanzapine (N = 172, 10–20 mg/day), we used marginal structural models with inverse probability of treatment weighting (MSM, IPTW) methodology. Dose profiles for mean changes from baseline were evaluated using weighted MSM with a repeated measures model. To adjust for selection bias due to non-random dose assignment and dropouts, patient-specific time-dependent weights were determined as products of (i) stable weights based on the inverse probability of receiving the sequence of dose assignments actually received by a patient up to a given time, multiplied by (ii) stable weights based on the inverse probability of the patient remaining on treatment by that time. Results were compared with those of unweighted analyses. Results While the observed difference in efficacy scores for dose groups in the unweighted analysis strongly favored lower doses, the weighted analyses showed no strong dose effects and, in some cases, reversed the apparent "negative dose effect." Conclusion While a naïve comparison of groups by last or modal dose in a flexible-dose trial may result in severely biased efficacy analyses, the MSM with IPTW estimators approach may be a valuable method of removing these biases and evaluating the potential dose effect, which may prove useful for planning confirmatory trials.
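The time-dependent weights described in the Methods can be sketched in a few lines. This is an illustrative toy computation, not the trial's analysis code; the probabilities below are made-up placeholders for the outputs of fitted treatment and dropout models.

```python
def stabilized_weight(marginal_probs, conditional_probs):
    """Stabilized inverse-probability weight for one patient: the product
    over visits of P(observed dose) / P(observed dose | patient history)."""
    w = 1.0
    for p_marg, p_cond in zip(marginal_probs, conditional_probs):
        w *= p_marg / p_cond
    return w

# (i) weight for the sequence of dose assignments actually received
w_dose = stabilized_weight([0.5, 0.4], [0.3, 0.2])
# (ii) weight for remaining on treatment up to each visit
w_stay = stabilized_weight([0.9, 0.8], [0.95, 0.85])
# The final time-dependent weight is the product of the two parts.
w_total = w_dose * w_stay
# A patient whose observed doses were unlikely given their history is
# up-weighted (w_total > 1), compensating for non-random dose assignment.
```

The weighted repeated-measures model then uses `w_total` for each patient-visit, which is what removes the selection bias the unweighted analysis suffers from.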
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
Probabilities and health risks: a qualitative approach.
Heyman, B; Henriksen, M; Maughan, K
1998-11-01
Health risks, defined in terms of the probability that an individual will suffer a particular type of adverse health event within a given time period, can be understood as referencing either natural entities or complex patterns of belief which incorporate the observer's values and knowledge, the position adopted in the present paper. The subjectivity inherent in judgements about adversity and time frames can be easily recognised, but social scientists have tended to accept uncritically the objectivity of probability. Most commonly in health risk analysis, the term probability refers to rates established by induction, and so requires the definition of a numerator and denominator. Depending upon their specification, many probabilities may be reasonably postulated for the same event, and individuals may change their risks by deciding to seek or avoid information. These apparent absurdities can be understood if probability is conceptualised as the projection of expectation onto the external world. Probabilities based on induction from observed frequencies provide glimpses of the future at the price of acceptance of the simplifying heuristic that statistics derived from aggregate groups can be validly attributed to individuals within them. The paper illustrates four implications of this conceptualisation of probability with qualitative data from a variety of sources, particularly a study of genetic counselling for pregnant women in a U.K. hospital. Firstly, the official selection of a specific probability heuristic reflects organisational constraints and values as well as predictive optimisation. Secondly, professionals and service users must work to maintain the facticity of an established heuristic in the face of alternatives. Thirdly, individuals, both lay and professional, manage probabilistic information in ways which support their strategic objectives. Fourthly, predictively sub-optimum schema, for example the idea of AIDS as a gay plague, may be selected because
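The point above, that many probabilities may be reasonably postulated for the same event depending on the specification of numerator and denominator, can be made concrete with a toy calculation (the counts are invented for illustration):

```python
# Invented counts: the same 12 adverse events yield different rates
# depending on which denominator (reference class) is specified.
events = 12
person_years_population = 4000   # denominator: everyone observed
person_years_exposed = 500       # denominator: exposed subgroup only

rate_population = events / person_years_population  # 0.003 per person-year
rate_exposed = events / person_years_exposed        # 0.024 per person-year

# Both are defensible "probabilities" of the same event; the choice of
# numerator and denominator reflects values and purposes, not just data.
```

The eightfold gap between the two rates is exactly the kind of heuristic-dependent variation the paper argues is obscured when probability is treated as a natural entity.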
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Conditionals, probability, and belief revision
Voorbraak, F.
1989-01-01
A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called
Berger, Thomas
The radiation environment encountered in space differs in nature from that on earth, consisting mostly of high energetic ions from protons up to iron, resulting in radiation levels far exceeding the ones present on earth for occupational radiation workers. Accurate knowledge of the physical characteristics of the space radiation field in dependence on the solar activity, the orbital parameters and the different shielding configurations of the International Space Station (ISS) is therefore needed. For the investigation of the spatial and temporal distribution of the radiation field inside the European Columbus module the experiment “Dose Distribution Inside the ISS” (DOSIS), under the project and science lead of the German Aerospace Center (DLR), was launched on July 15th 2009 with STS-127 to the ISS. The DOSIS experiment consists of a combination of “Passive Detector Packages” (PDP) distributed at eleven locations inside Columbus for the measurement of the spatial variation of the radiation field and two active Dosimetry Telescopes (DOSTELs) with a Data and Power Unit (DDPU) in a dedicated nomex pouch mounted at a fixed location beneath the European Physiology Module rack (EPM) for the measurement of the temporal variation of the radiation field parameters. The DOSIS experiment suite measured during the lowest solar minimum conditions in the space age from July 2009 to June 2011. In July 2011 the active hardware was transferred to ground for refurbishment and preparation for the follow up DOSIS 3D experiment. The hardware for DOSIS 3D was launched with Soyuz 30S to the ISS on May 15th 2012. The PDPs are replaced with each even number Soyuz flight starting with Soyuz 30S. Data from the active detectors is transferred to ground via the EPM rack which is activated once a month for this action. The presentation will give an overview of the DOSIS and DOSIS 3D experiment and focus on the results from the passive radiation detectors from the DOSIS 3D experiment
The Art of Probability Assignment
Dimitrov, Vesselin I
2012-01-01
The problem of assigning probabilities when little is known is analyzed for the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Rényi distance between the original and the shifted distribution.
Probability workshop to be better in probability topic
Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed
2015-02-01
The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, meaning that a higher level of statistics anxiety does not lead to a lower score in probability topic performance. The study also revealed that motivated students benefited from the probability workshop: their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Adachi, N; Adjovi, Y; Aida, K; Akamatsu, H; Akiyama, S; Akli, A; Ando, A; Andrault, T; Antonietti, H; Anzai, S; Arkoun, G; Avenoso, C; Ayrault, D; Banasiewicz, M; Banaśkiewicz, M; Bernandini, L; Bernard, E; Berthet, E; Blanchard, M; Boreyko, D; Boros, K; Charron, S; Cornette, P; Czerkas, K; Dameron, M; Date, I; De Pontbriand, M; Demangeau, F; Dobaczewski, Ł; Dobrzyński, L; Ducouret, A; Dziedzic, M; Ecalle, A; Edon, V; Endo, K; Endo, T; Endo, Y; Etryk, D; Fabiszewska, M; Fang, S; Fauchier, D; Felici, F; Fujiwara, Y; Gardais, C; Gaul, W; Guérin, L; Hakoda, R; Hamamatsu, I; Handa, K; Haneda, H; Hara, T; Hashimoto, M; Hashimoto, T; Hashimoto, K; Hata, D; Hattori, M; Hayano, R; Hayashi, R; Higasi, H; Hiruta, M; Honda, A; Horikawa, Y; Horiuchi, H; Hozumi, Y; Ide, M; Ihara, S; Ikoma, T; Inohara, Y; Itazu, M; Ito, A; Janvrin, J; Jout, I; Kanda, H; Kanemori, G; Kanno, M; Kanomata, N; Kato, T; Kato, S; Katsu, J; Kawasaki, Y; Kikuchi, K; Kilian, P; Kimura, N; Kiya, M; Klepuszewski, M; Kluchnikov, E; Kodama, Y; Kokubun, R; Konishi, F; Konno, A; Kontsevoy, V; Koori, A; Koutaka, A; Kowol, A; Koyama, Y; Kozioł, M; Kozue, M; Kravtchenko, O; Kruczała, W; Kudła, M; Kudo, H; Kumagai, R; Kurogome, K; Kurosu, A; Kuse, M; Lacombe, A; Lefaillet, E; Magara, M; Malinowska, J; Malinowski, M; Maroselli, V; Masui, Y; Matsukawa, K; Matsuya, K; Matusik, B; Maulny, M; Mazur, P; Miyake, C; Miyamoto, Y; Miyata, K; Miyata, K; Miyazaki, M; Molęda, M; Morioka, T; Morita, E; Muto, K; Nadamoto, H; Nadzikiewicz, M; Nagashima, K; Nakade, M; Nakayama, C; Nakazawa, H; Nihei, Y; Nikul, R; Niwa, S; Niwa, O; Nogi, M; Nomura, K; Ogata, D; Ohguchi, H; Ohno, J; Okabe, M; Okada, M; Okada, Y; Omi, N; Onodera, H; Onodera, K; Ooki, S; Oonishi, K; Oonuma, H; Ooshima, H; Oouchi, H; Orsucci, M; Paoli, M; Penaud, M; Perdrisot, C; Petit, M; Piskowski, A; Płocharski, A; Polis, A; Polti, L; Potsepnia, T; Przybylski, D; Pytel, M; Quillet, W; Remy, A; Robert, C; Sadowski, M; Saito, M; Sakuma, D; Sano, K; Sasaki, Y; Sato, N; 
Schneider, T; Schneider, C; Schwartzman, K; Selivanov, E; Sezaki, M; Shiroishi, K; Shustava, I; Śniecińska, A; Stalchenko, E; Staroń, A; Stromboni, M; Studzińska, W; Sugisaki, H; Sukegawa, T; Sumida, M; Suzuki, Y; Suzuki, K; Suzuki, R; Suzuki, H; Suzuki, K; Świderski, W; Szudejko, M; Szymaszek, M; Tada, J; Taguchi, H; Takahashi, K; Tanaka, D; Tanaka, G; Tanaka, S; Tanino, K; Tazbir, K; Tcesnokova, N; Tgawa, N; Toda, N; Tsuchiya, H; Tsukamoto, H; Tsushima, T; Tsutsumi, K; Umemura, H; Uno, M; Usui, A; Utsumi, H; Vaucelle, M; Wada, Y; Watanabe, K; Watanabe, S; Watase, K; Witkowski, M; Yamaki, T; Yamamoto, J; Yamamoto, T; Yamashita, M; Yanai, M; Yasuda, K; Yoshida, Y; Yoshida, A; Yoshimura, K; Żmijewska, M; Zuclarelli, E
2015-01-01
Twelve high schools in Japan (of which six are in Fukushima Prefecture), four in France, eight in Poland and two in Belarus cooperated in the measurement and comparison of individual external doses in 2014. In total, 216 high-school students and teachers participated in the study. Each participant wore an electronic personal dosimeter "D-shuttle" for two weeks and kept a journal of his/her whereabouts and activities. The distributions of annual external doses estimated for each region overlap with each other, demonstrating that the individual external doses in locations where residence is currently allowed in Fukushima Prefecture and in Belarus are well within the range of annual doses estimated from the background radiation levels of other regions/countries.
Adachi, N; Adamovitch, V; Adjovi, Y; Aida, K; Akamatsu, H; Akiyama, S; Akli, A; Ando, A; Andrault, T; Antonietti, H; Anzai, S; Arkoun, G; Avenoso, C; Ayrault, D; Banasiewicz, M; Banaśkiewicz, M; Bernardini, L; Bernard, E; Berthet, E; Blanchard, M; Boreyko, D; Boros, K; Charron, S; Cornette, P; Czerkas, K; Dameron, M; Date, I; De Pontbriand, M; Demangeau, F; Dobaczewski, Ł; Dobrzyński, L; Ducouret, A; Dziedzic, M; Ecalle, A; Edon, V; Endo, K; Endo, T; Endo, Y; Etryk, D; Fabiszewska, M; Fang, S; Fauchier, D; Felici, F; Fujiwara, Y; Gardais, C; Gaul, W; Gurin, L; Hakoda, R; Hamamatsu, I; Handa, K; Haneda, H; Hara, T; Hashimoto, M; Hashimoto, T; Hashimoto, K; Hata, D; Hattori, M; Hayano, R; Hayashi, R; Higasi, H; Hiruta, M; Honda, A; Horikawa, Y; Horiuchi, H; Hozumi, Y; Ide, M; Ihara, S; Ikoma, T; Inohara, Y; Itazu, M; Ito, A; Janvrin, J; Jout, I; Kanda, H; Kanemori, G; Kanno, M; Kanomata, N; Kato, T; Kato, S; Katsu, J; Kawasaki, Y; Kikuchi, K; Kilian, P; Kimura, N; Kiya, M; Klepuszewski, M; Kluchnikov, E; Kodama, Y; Kokubun, R; Konishi, F; Konno, A; Kontsevoy, V; Koori, A; Koutaka, A; Kowol, A; Koyama, Y; Kozioł, M; Kozue, M; Kravtchenko, O; Kruczała, W; Kudła, M; Kudo, H; Kumagai, R; Kurogome, K; Kurosu, A; Kuse, M; Lacombe, A; Lefaillet, E; Magara, M; Malinowska, J; Malinowski, M; Maroselli, V; Masui, Y; Matsukawa, K; Matsuya, K; Matusik, B; Maulny, M; Mazur, P; Miyake, C; Miyamoto, Y; Miyata, K; Miyata, K; Miyazaki, M; Molȩda, M; Morioka, T; Morita, E; Muto, K; Nadamoto, H; Nadzikiewicz, M; Nagashima, K; Nakade, M; Nakayama, C; Nakazawa, H; Nihei, Y; Nikul, R; Niwa, S; Niwa, O; Nogi, M; Nomura, K; Ogata, D; Ohguchi, H; Ohno, J; Okabe, M; Okada, M; Okada, Y; Omi, N; Onodera, H; Onodera, K; Ooki, S; Oonishi, K; Oonuma, H; Ooshima, H; Oouchi, H; Orsucci, M; Paoli, M; Penaud, M; Perdrisot, C; Petit, M; Piskowski, A; Płocharski, A; Polis, A; Polti, L; Potsepnia, T; Przybylski, D; Pytel, M; Quillet, W; Remy, A; Robert, C; Sadowski, M; Saito, M; Sakuma, D; Sano, K; 
Sasaki, Y; Sato, N; Schneider, T; Schneider, C; Schwartzman, K; Selivanov, E; Sezaki, M; Shiroishi, K; Shustava, I; Śniecińska, A; Stalchenko, E; Staroń, A; Stromboni, M; Studzińska, W; Sugisaki, H; Sukegawa, T; Sumida, M; Suzuki, Y; Suzuki, K; Suzuki, R; Suzuki, H; Suzuki, K; Świderski, W; Szudejko, M; Szymaszek, M; Tada, J; Taguchi, H; Takahashi, K; Tanaka, D; Tanaka, G; Tanaka, S; Tanino, K; Tazbir, K; Tcesnokova, N; Tgawa, N; Toda, N; Tsuchiya, H; Tsukamoto, H; Tsushima, T; Tsutsumi, K; Umemura, H; Uno, M; Usui, A; Utsumi, H; Vaucelle, M; Wada, Y; Watanabe, K; Watanabe, S; Watase, K; Witkowski, M; Yamaki, T; Yamamoto, J; Yamamoto, T; Yamashita, M; Yanai, M; Yasuda, K; Yoshida, Y; Yoshida, A; Yoshimura, K; Żmijewska, M; Zuclarelli, E
2016-03-01
Twelve high schools in Japan (of which six are in Fukushima Prefecture), four in France, eight in Poland and two in Belarus cooperated in the measurement and comparison of individual external doses in 2014. In total, 216 high-school students and teachers participated in the study. Each participant wore an electronic personal dosimeter 'D-shuttle' for two weeks and kept a journal of his/her whereabouts and activities. The distributions of annual external doses estimated for each region overlap with each other, demonstrating that the individual external doses in locations where residence is currently allowed in Fukushima Prefecture and in Belarus are well within the range of annual doses estimated from the terrestrial background radiation levels of other regions/countries.
Hidden Variables or Positive Probabilities?
Rothman, Tony
2001-01-01
Despite claims that Bell's inequalities are based on the Einstein locality condition, or an equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that, because quantum mechanics assumes that particles are emitted in a superposition of states, the theory cannot produce such a set of probabilities. We examine a paper by Eberhard, who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that neither hidden variables nor "locality" is at issue here; positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
钱姗姗; 黄静; 马建华; 张华; 刘楠; 张喜乐; 冯前进; 陈武凡
2011-01-01
In order to improve the reconstruction quality of low-dose CT images, a new approach based on restoring the low-dose CT projection data is proposed. First, the projection data are transformed from a Poisson distribution to an approximately Gaussian distribution using the nonlinear Anscombe transform. Then, the transformed data are filtered by an efficient Nonmonotone Total Variation Minimization (NTVM) denoising algorithm. Last, the reconstruction is achieved by applying the inverse Anscombe transform followed by conventional filtered back projection (FBP). Simulated and clinical low-dose CT experiments demonstrate that the method performs well in noise removal, artifact suppression, and reconstruction time.
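The pipeline described in this record (Poisson projections, variance stabilization, Gaussian denoising, inverse transform, FBP) hinges on the Anscombe transform. A minimal sketch of the variance-stabilizing step follows; it is not the authors' code, and it uses the simple algebraic inverse (the paper's pipeline would place its NTVM denoiser between the two transforms):

```python
import numpy as np

def anscombe(x):
    """Forward Anscombe transform: approximately Gaussianizes Poisson data
    with unit variance (accurate for means above roughly 4)."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    """Simple algebraic inverse (unbiased inverses exist but are more involved)."""
    return (y / 2.0) ** 2 - 3.0 / 8.0

# Poisson "projection data" with mean 20: the raw standard deviation is
# sqrt(20) ~ 4.47, but after the transform it is close to 1 for any mean,
# which is what makes Gaussian denoisers applicable.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=20.0, size=100_000)
stabilized = anscombe(counts)
print(np.std(counts))      # ~ 4.47
print(np.std(stabilized))  # ~ 1.0
print(inverse_anscombe(anscombe(np.array([20.0]))))  # round-trips to ~20
```

The constant 3/8 is the standard Anscombe offset; the design point is that a single denoiser tuned for unit-variance Gaussian noise can then be reused across all dose levels.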
High-order noise analysis for low dose iterative image reconstruction methods: ASIR, IRIS, and MBAI
Do, Synho; Singh, Sarabjeet; Kalra, Mannudeep K.; Karl, W. Clem; Brady, Thomas J.; Pien, Homer
2011-03-01
Iterative reconstruction techniques (IRTs) have been shown to suppress noise significantly in low-dose CT imaging. However, physicians hesitate to accept this new technology because the visual impression of IRT images differs from that of full-dose filtered back-projection (FBP) images. The most common noise measurements, such as the mean and standard deviation of a homogeneous region in the image, do not provide a sufficient characterization of the noise statistics when the probability density function becomes non-Gaussian. In this study, we measure L-moments of the intensity values of images acquired at 10% of the normal dose and reconstructed by the IRT methods of two state-of-the-art clinical scanners (GE HDCT and Siemens DSCT Flash), keeping the dose level identical for both. High- and low-dose scans (i.e., 10% of the high dose) were acquired from each scanner, and L-moments of noise patches were calculated for the comparison.
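The L-moments mentioned in this record can be estimated directly from a noise patch. The sketch below (not the authors' code) uses Hosking's unbiased probability-weighted-moment estimator; the Gaussian test signal is an illustrative assumption standing in for a noise patch:

```python
import numpy as np

def sample_l_moments(x):
    """Unbiased sample L-moments l1..l4 via probability-weighted moments
    (Hosking's estimator). x is a 1-D sample, e.g. a flattened noise patch."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) * x) / (n * (n - 1))
    b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
    l1 = b0                                 # location
    l2 = 2 * b1 - b0                        # scale
    l3 = 6 * b2 - 6 * b1 + b0               # l3/l2 = L-skewness
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0   # l4/l2 = L-kurtosis
    return l1, l2, l3, l4

# For Gaussian noise: l1 ~ 0, l2 ~ sigma/sqrt(pi) ~ 0.564, L-skewness ~ 0,
# L-kurtosis ~ 0.1226. Non-Gaussian IRT noise shifts the last two ratios.
rng = np.random.default_rng(1)
l1, l2, l3, l4 = sample_l_moments(rng.normal(0.0, 1.0, 50_000))
print(l1, l2, l3 / l2, l4 / l2)
```

Because L-moments are linear in the order statistics, they remain well defined and stable for the heavy-tailed or skewed noise that defeats the ordinary mean/standard-deviation summary.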
NONE
2011-07-01
This document reports a dose assessment study performed by the IRSN (the French Radioprotection and Safety Nuclear Institute) 66 days after the Fukushima nuclear accident. A new dose assessment was carried out by IRSN to estimate projected doses due to external exposure from radioactive deposits, for exposure durations of 3 months, 1 year and 4 years before evacuation. The purpose of this report is to provide insight on all radiological assessments performed to the knowledge of the IRSN (the French Radioprotection and Safety Nuclear Institute) to date and the impact of population evacuation measures to be taken to minimize the medium and long-term risks of developing leukaemia or other radiation-induced cancers. This report only considers the external doses already received as well as the doses that may be received in the future from fallout deposits, regardless of doses received previously from the radioactive plume
2002-01-01
Electron paramagnetic resonance (EPR) dosimetry is a physical method for the assessment of absorbed dose from ionising radiation. It is based on the measurement of stable radiation induced radicals in human calcified tissues (primarily in tooth enamel). EPR dosimetry with teeth is now firmly established in retrospective dosimetry. It is a powerful method for providing information on exposure to ionising radiation many years after the event, since the 'signal' is 'stored' in the tooth or the bone. This technique is of particular relevance to relatively low dose exposures or when the results of conventional dosimetry are not available (e.g. in accidental circumstances). The use of EPR dosimetry, as an essential tool for retrospective assessment of radiation exposure is an important part of radioepidemiological studies and also provides data to select appropriate countermeasures based on retrospective evaluation of individual doses. Despite well established regulations and protocols for maintaining radiation pro...
Understanding Students' Beliefs about Probability.
Konold, Clifford
Probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes students' alternative frameworks from the viewpoint of constructivism. There have been various interpretations of probability through mathematical history: classical, frequentist, and subjectivist.…
Expected utility with lower probabilities
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Varieties of Belief and Probability
D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik
2015-01-01
For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for
Landau-Zener Probability Reviewed
Valencia, C
2008-01-01
We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
Probability and Statistics: 5 Questions
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...
A graduate course in probability
Tucker, Howard G
2014-01-01
Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Linear Positivity and Virtual Probability
Hartle, J B
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...
Survival probability and ruin probability of a risk model
LUO Jian-hua
2008-01-01
In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the claim-occurrence process is a p-thinning process. Integral representations of the survival probability are obtained, and an explicit formula for the survival probability on the infinite interval is derived in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.
NONE
2005-01-01
People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.
Thiede, M.E.; Duncan, J.P.
1994-03-01
This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project. The goal of the HEDR Project is to estimate the radiation dose that individuals could have received from radionuclide emissions since 1944 at the Hanford Site near Richland, Washington. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories. The time periods of greatest interest to the HEDR study vary depending on the type of environmental media concerned. Concentrations of radionuclides in Columbia River media from 1960--1970 provide the best historical data for validation of the Columbia River pathway computer models. This report provides the historical radionuclide measurements in Columbia River water (1960--1970), fish (1960--1967), waterfowl (1960--1970), gamebirds (1967--1970), and shellfish (1960--1970). Because of the large size of the databases (845 pages), this report is being published on diskette. A diskette of this report is available from the Technical Steering Panel (c/o K. CharLee, Office of Nuclear Waste Management, Department of Ecology, Technical Support and Publication Information Section, P.O. Box 47651, Olympia, Washington 98504-7651).
Probability-consistent spectrum and code spectrum
沈建文; 石树中
2004-01-01
In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by the Code for Seismic Design of Buildings (GB50011-2001), and sometimes there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for key projects would be lower than that for ordinary industrial and civil buildings. In this paper, the relation between the PCS and the CDS is discussed using an idealized simple potential seismic source. The results show that in most areas, influenced mainly by potential sources of epicentral and regional earthquakes, the PCS is generally lower than the CDS at long periods. We point out that the long-period response spectra of the code should be studied further and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it is prudent to be cautious in using the long-period response spectra given by SSE for key projects when they are lower than the CDS.
刘楠; 黄静; 马建华; 陈武凡; 卢虹冰; Zhengrong Liang
2011-01-01
To address the quality degradation in low-dose CT imaging, a projection-data-restoration-guided non-local means (NL-means) method for low-dose CT reconstruction is proposed, combining data recovery in the projection domain and in the image domain. First, the projection data are transformed from a Poisson distribution to an approximately Gaussian distribution using the nonlinear Anscombe transform, so that the projection noise can be filtered more easily. Second, after the transformed data are filtered, the inverse Anscombe transform is applied and an image is reconstructed from the filtered projection data with the classical filtered back projection (FBP) method. Last, NL-means weights computed from this restored-projection FBP image are used to guide NL-means filtering of the FBP image reconstructed directly from the unrestored projection data. Simulated and clinical experimental results demonstrate that the proposed method performs very well in reducing noise and suppressing artifacts while preserving image edges.
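The guided NL-means step in this record (similarity weights computed on a higher-quality prior image, then applied to the noisier image) can be illustrated in one dimension. The sketch below is not the paper's implementation; patch size, bandwidth h, and the synthetic signals are illustrative assumptions:

```python
import numpy as np

def nl_means_1d(signal, guide, patch=3, h=0.5):
    """Minimal 1-D guided non-local means: weights are computed from `guide`
    (standing in for the restored-projection FBP image) and applied to
    `signal` (standing in for the directly reconstructed FBP image)."""
    n = len(signal)
    pad = patch // 2
    g = np.pad(guide, pad, mode='reflect')
    patches = np.stack([g[i:i + patch] for i in range(n)])   # (n, patch)
    out = np.empty(n)
    for i in range(n):
        d2 = np.sum((patches - patches[i]) ** 2, axis=1)     # patch distances
        w = np.exp(-d2 / (h * h))                            # similarity weights
        out[i] = np.dot(w, signal) / w.sum()                 # weighted average
    return out

rng = np.random.default_rng(3)
clean = np.repeat([0.0, 1.0, 0.0], 50)                 # piecewise-constant row
noisy = clean + rng.normal(0, 0.3, clean.size)         # "low-dose" image
guide = clean + rng.normal(0, 0.05, clean.size)        # higher-quality prior
denoised = nl_means_1d(noisy, guide)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

Because the weights come from the cleaner guide, similar pixels are identified reliably and the edge at the step survives averaging, which is the point of guiding NL-means with restored data.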
Doi, M. [National Inst. of Radiological Sciences, Chiba (Japan); Lagarde, F. [Karolinska Inst., Stockholm (Sweden). Inst. of Environmental Medicine; Falk, R.; Swedjemark, G.A. [Swedish Radiation Protection Inst., Stockholm (Sweden)
1996-12-01
Effective dose per unit radon progeny exposure to Swedish population in 1992 is estimated by the risk projection model based on the Swedish epidemiological study of radon and lung cancer. The resulting values range from 1.29 - 3.00 mSv/WLM and 2.58 - 5.99 mSv/WLM, respectively. Assuming a radon concentration of 100 Bq/m{sup 3}, an equilibrium factor of 0.4 and an occupancy factor of 0.6 in Swedish houses, the annual effective dose for the Swedish population is estimated to be 0.43 - 1.98 mSv/year, which should be compared to the value of 1.9 mSv/year, according to the UNSCEAR 1993 report. 27 refs, tabs, figs.
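The annual-dose range quoted in this record follows from its stated assumptions plus two standard conversion constants (1 WL = 3700 Bq/m3 of equilibrium-equivalent concentration and 170 hours per working-level month), which are conventions assumed here rather than stated in the record:

```python
# Sketch of the dose arithmetic implied by the record; the 3700 Bq/m3 per WL
# and 170 h per WLM conversion constants are standard assumptions.
radon_bq_m3 = 100.0        # assumed indoor radon concentration
equilibrium_factor = 0.4   # radon progeny equilibrium factor
occupancy = 0.6            # fraction of the year spent indoors

eec = radon_bq_m3 * equilibrium_factor                 # 40 Bq/m3 equilibrium-equivalent
working_level = eec / 3700.0                           # 1 WL = 3700 Bq/m3 EEC
hours_per_year = 8760.0 * occupancy
wlm_per_year = working_level * hours_per_year / 170.0  # ~0.33 WLM per year

# Dose-conversion bounds taken from the record (mSv per WLM):
low, high = 1.29 * wlm_per_year, 5.99 * wlm_per_year
print(round(low, 2), round(high, 2))  # ~0.43 and ~2.0 mSv/year
```

This reproduces the 0.43 mSv/year lower bound exactly and the ~1.98 mSv/year upper bound to within rounding, confirming how the annual estimate was assembled from the per-WLM values.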
SureTrak Probability of Impact Display
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability-of-impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability-of-impact information quickly, and in a format that is clearly understandable; this application is meant to fill that need. The software reuses part of the software developed for an earlier project, the Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls, to supply chain risks with inventory controls, and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
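The central claim of this record, that parameter error inflates the expected failure frequency, can be checked with a small Monte Carlo experiment. The setup below (normal log-losses with known scale and estimated location) is an illustrative special case of the paper's location-scale setting, not the authors' own construction:

```python
import numpy as np

# A threshold set at an *estimated* 95% quantile is exceeded more often than
# the nominal 5%, because estimation error in the location parameter adds
# variance to the exceedance event.
rng = np.random.default_rng(42)
nominal_p = 0.05          # target failure probability
n = 20                    # observations used to estimate the location
trials = 100_000
z95 = 1.6448536269514722  # standard normal 95% quantile (avoids needing scipy)

samples = rng.normal(0.0, 1.0, (trials, n))  # log-losses; scale known, location unknown
mu_hat = samples.mean(axis=1)                # estimated location per trial
next_loss = rng.normal(0.0, 1.0, trials)     # the future loss to be controlled
realized = float(np.mean(next_loss > mu_hat + z95))
print(realized)  # consistently above the nominal 0.05 (about 0.054 for n=20)
```

Analytically, next_loss minus mu_hat is normal with variance 1 + 1/n, so the realized failure probability is the upper tail of z95/sqrt(1 + 1/n), about 0.054 here; it approaches the nominal 0.05 only as n grows, matching the paper's point that the nominal level must be tightened for small data sets.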
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
Probability Ranking in Vector Spaces
Melucci, Massimo
2011-01-01
The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.
Holographic probabilities in eternal inflation.
Bousso, Raphael
2006-11-10
In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.
Local Causality, Probability and Explanation
Healey, Richard A
2016-01-01
In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Lukas Ebner
2014-01-01
Objective: The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and with an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a routine thorax protocol: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mA, with automated tube current modulation using reference milliamperes. CARE kV was used on the Flash and Edge scanners, while on the SOMATOM Sensation the tube potential was selected individually between 100 and 140 kVp by the medical technologists. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP). Results: The DLP with FBP for the average chest CT was 308 mGy·cm ± 99.6. In contrast, the DLP for chest CT with the IR algorithm was 196.8 mGy·cm ± 68.8 (P = 0.0001). A further decline in dose was noted with IR and the ICD: DLP 166.4 mGy·cm ± 54.5 (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. The signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for IR/ICD in combination compared to FBP (P values ranged from 0.003 to 0.048). Overall, the contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower radiation dose in chest CT examinations.
Probable warfarin interaction with menthol cough drops.
Coderre, Karen; Faria, Claudio; Dyer, Earl
2010-01-01
Warfarin is a widely used and effective oral anticoagulant; however, the agent has an extensive drug and food interaction profile. We describe a 46-year-old African-American man who was receiving warfarin for a venous thromboembolism and experienced a decrease in his international normalized ratio (INR). No corresponding reduction had been made in his warfarin dosage, and no changes had been made in his concomitant drug therapy or diet. The patient's INR fell from a therapeutic value of 2.6 (target range 2-3) to 1.6 while receiving a weekly warfarin dose of 50 mg. His INR remained stable at 1.6 for 3 weeks despite incremental increases in his warfarin dose. The patient reported that he had been taking 8-10 menthol cough drops/day due to dry conditions at his workplace during the time period that the INR decreased. Five days after discontinuing the cough drops, his INR increased from 1.6 to 2.9. Over the subsequent 5 weeks, his INR was stabilized at a much lower weekly warfarin dose of 40 mg. Use of the Naranjo adverse drug reaction probability scale indicated that the decreased INR was probably related to the concomitant use of menthol cough drops during warfarin therapy. The mechanism for this interaction may be related to the potential for menthol to affect the cytochrome P450 system as an inducer and inhibitor of certain isoenzymes that would potentially interfere with the metabolism of warfarin. To our knowledge, this is the second case report of an interaction between warfarin and menthol. Patients receiving warfarin should be closely monitored, as they may choose to take over-the-counter products without considering the potential implications, and counseled about a possible interaction with menthol cough drops.
Diurnal distribution of sunshine probability
Aydinli, S.
1982-01-01
The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances produced by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of sunshine duration available. It is therefore necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a "side-scene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on location and season, and their influence on the predetermination of global radiation are investigated and discussed.
Probability representation of classical states
Man'ko, OV; Man'ko, VI; Pilyavets, OV
2005-01-01
Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables, which are functions on phase space, are studied. The explicit form of the kernel of the commutative star-product of the tomographic symbols is obtained.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
The probabilities of unique events.
Sangeet S Khemlani
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
Joint probabilities and quantum cognition
de Barros, J Acacio
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Three lectures on free probability
2012-01-01
These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.
Takahashi, Shintaro; Inoue, Kazuya; Suzuki, Masatoshi; Urushihara, Yusuke; Kuwahara, Yoshikazu; Hayashi, Gohei; Shiga, Soichiro; Fukumoto, Motoi; Kino, Yasushi; Sekine, Tsutomu; Abe, Yasuyuki; Fukuda, Tomokazu; Isogai, Emiko; Yamashiro, Hideaki; Fukumoto, Manabu
2015-12-01
It is not an exaggeration to say that, without nuclear accidents or the analysis of radiation therapy, there is no way in which we are able to quantify radiation effects on humans. Therefore, the livestock abandoned in the ex-evacuation zone and euthanized due to the Fukushima Daiichi Nuclear Power Plant (FNPP) accident are extremely valuable for analyzing the environmental pollution, its biodistribution, the metabolism of radionuclides, dose evaluation and the influence of internal exposure. We, therefore, sought to establish an archive system and to open it to researchers for increasing our understanding of radiation biology and improving protection against radiation. The sample bank of animals affected by the FNPP accident consists of frozen tissue samples, formalin-fixed paraffin-embedded specimens, dose of radionuclides deposited, etc., with individual sampling data.
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
Cluster Membership Probability: Polarimetric Approach
Medhi, Biman J
2013-01-01
Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
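The construction this abstract describes, a normal probability plot augmented with per-point intervals, can be sketched as follows. This is a minimal illustration using pointwise order-statistic intervals (the i-th order statistic of n uniforms is Beta(i, n-i+1), mapped through the normal quantile function); the paper's simultaneous 1-α intervals need a joint calibration not reproduced here, and the function name and plotting positions are my own.

```python
import numpy as np
from scipy.stats import beta, norm

def npp_with_bands(x, alpha=0.05):
    """Normal probability plot coordinates plus pointwise (1-alpha) bands.

    Note: the paper builds *simultaneous* bands (joint coverage 1-alpha);
    the pointwise order-statistic bands below are a simpler stand-in.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    # Theoretical quantiles at the usual plotting positions (i - 0.5)/n.
    theo = norm.ppf((i - 0.5) / n)
    # The i-th order statistic of n uniforms is Beta(i, n-i+1);
    # map its alpha/2 and 1-alpha/2 quantiles through the normal ppf.
    lo = norm.ppf(beta.ppf(alpha / 2, i, n - i + 1))
    hi = norm.ppf(beta.ppf(1 - alpha / 2, i, n - i + 1))
    z = (x - x.mean()) / x.std(ddof=1)   # standardized sample
    inside = bool(np.all((z >= lo) & (z <= hi)))
    return theo, z, lo, hi, inside
```

Plotting `z` against `theo` with the `lo`/`hi` curves gives the augmented plot; the sample "passes" the graphical test exactly when `inside` is true.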
Chopra, Arvind; Saluja, Manjit; Tillu, Girish; Venugopalan, Anuradha; Narsimulu, Gumdal; Sarmukaddam, Sanjeev; Patwardhan, Bhushan
2012-01-01
Results of an exploratory trial suggested activity trends of Zingiber officinale-Tinospora cordifolia (platform combination)-based formulations in the treatment of Osteoarthritis (OA) Knees. These formulations were "platform combination+Withania somnifera+Tribulus terrestris" (formulation B) and "platform combination+Emblica officinalis" (formulation C). This paper reports safety of these formulations when used in higher doses (1.5-2 times) along with Sallaki Guggul and Bhallataka Parpati (a Semecarpus anacardium preparation). Ninety-two patients with symptomatic OA knees were enrolled in a 6-week investigator-blind, randomized, parallel-efficacy, 4-arm multicenter drug trial. The 4 arms were (I) formulation B, 2 t.i.d.; (II) formulation B, 2 q.i.d.; (III) platform combination+Sallaki Guggul; (IV) Bhallataka Parpati+formulation C. A detailed enquiry was carried out for adverse events (AE) and drug toxicity as per an a priori checklist and volunteered information. Laboratory evaluation included detailed hematology and metabolic parameters. Patients were examined at baseline, first and fourth weeks, and on completion. A standard statistical program (SPSS version 12.5) was used for analysis. None of the patients reported serious AE or withdrew due to any drug-related toxicity. Mild gut-related (mostly epigastric burning) AE was reported. A mild increase in liver enzymes [serum glutamic pyruvate transaminase (SGPT), serum glutamic oxaloacetic transaminase (SGOT)] without any other hepatic abnormality was reported in 2 patients (group IV). Other laboratory parameters remained normal. The mean improvement in active pain visual analog scale (1.4, CI 0.5-2.22), WOMAC (functional activity questionnaire) pain score (1.37, CI 0.22-2.5), and urinary C-TAX (cartilage collagen breakdown product) assay was maximum (NS) in group IV. Lower-dose group I showed numerically superior improvement compared with higher-dose group II. The results suggested that despite higher doses, standardized
Arvind Chopra
2012-01-01
Background: Results of an exploratory trial suggested activity trends of Zingiber officinale-Tinospora cordifolia (platform combination)-based formulations in the treatment of Osteoarthritis (OA) Knees. These formulations were "platform combination+Withania somnifera+Tribulus terrestris" (formulation B) and "platform combination+Emblica officinalis" (formulation C). This paper reports safety of these formulations when used in higher doses (1.5-2 times) along with Sallaki Guggul and Bhallataka Parpati (a Semecarpus anacardium preparation). Materials and Methods: Ninety-two patients with symptomatic OA knees were enrolled in a 6-week investigator-blind, randomized, parallel-efficacy, 4-arm multicenter drug trial. The 4 arms were (I) formulation B, 2 t.i.d.; (II) formulation B, 2 q.i.d.; (III) platform combination+Sallaki Guggul; (IV) Bhallataka Parpati+formulation C. A detailed enquiry was carried out for adverse events (AE) and drug toxicity as per an a priori checklist and volunteered information. Laboratory evaluation included detailed hematology and metabolic parameters. Patients were examined at baseline, first and fourth weeks, and on completion. A standard statistical program (SPSS version 12.5) was used for analysis. Results: None of the patients reported serious AE or withdrew due to any drug-related toxicity. Mild gut-related (mostly epigastric burning) AE was reported. A mild increase in liver enzymes [serum glutamic pyruvate transaminase (SGPT), serum glutamic oxaloacetic transaminase (SGOT)] without any other hepatic abnormality was reported in 2 patients (group IV). Other laboratory parameters remained normal. The mean improvement in active pain visual analog scale (1.4, CI 0.5-2.22), WOMAC (functional activity questionnaire) pain score (1.37, CI 0.22-2.5), and urinary C-TAX (cartilage collagen breakdown product) assay was maximum (NS) in group IV. Lower-dose group I showed numerically superior improvement compared with higher-dose group II. Conclusion: The
Detonation probabilities of high explosives
Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.
1995-07-01
The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Smirnov Vladimir Alexandrovich
2012-10-01
The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are determined such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
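The core calculation the abstract describes, the chance that a zero-mean Gaussian relative displacement exceeds a vibration criterion, reduces to a tail probability. A minimal sketch, with purely illustrative numbers (the criterion and response standard deviation below are not from the paper):

```python
import math

def p_exceed(vc_um, sigma_um):
    """Probability that a zero-mean Gaussian relative displacement
    exceeds the criterion: P(|X| > vc) = erfc(vc / (sigma * sqrt(2)))."""
    return math.erfc(vc_um / (sigma_um * math.sqrt(2.0)))

# Hypothetical figures: criterion 3.0 um, response std 1.4 um.
p = p_exceed(3.0, 1.4)
print(f"P(exceed) = {p:.4f}")
```

With these made-up numbers the exceedance probability lands below the 0.04 target mentioned in the abstract; the paper's actual analysis optimizes damping and natural frequency to achieve that bound.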
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Probability on real Lie algebras
Franz, Uwe
2016-01-01
This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Innovation and social probable knowledge
Marco Crocco
2000-01-01
In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as one of dealing with collective uncertainty about how to solve a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...
Knowledge typology for imprecise probabilities.
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Comments on quantum probability theory.
Sloman, Steven
2014-01-01
Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.
Exact Probability Distribution versus Entropy
Kerstin Andersson
2014-10-01
The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
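For small alphabets the quantity the abstract discusses can be computed exactly by brute force: enumerate all words under the first-order (i.i.d. letter) model, sort them by decreasing probability, and average the rank. A sketch, assuming the i.i.d. model and with my own function name:

```python
import itertools
import math

def avg_guesses(letter_probs, length):
    """Average number of guesses when words (i.i.d. letters, first-order
    model) are guessed in decreasing order of probability."""
    # Probability of each word = product of its letter probabilities.
    word_probs = [math.prod(w)
                  for w in itertools.product(letter_probs, repeat=length)]
    # Guess in decreasing order; the word at rank k is found on guess k.
    word_probs.sort(reverse=True)
    return sum(k * p for k, p in enumerate(word_probs, start=1))

# Uniform letters: all 3**4 = 81 words equally likely, average = (81+1)/2.
uniform = avg_guesses([1/3, 1/3, 1/3], 4)
# A skewed alphabet needs fewer guesses on average than a uniform one.
skewed = avg_guesses([0.5, 0.3, 0.2], 4)
```

This exact enumeration is exponential in word length, which is precisely why the paper develops approximations for realistic alphabet and word sizes.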
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Fuzzy Markov chains: uncertain probabilities
2002-01-01
We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
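One crisp way to see the idea is to replace each fuzzy transition probability by an interval (an alpha-cut of the fuzzy number) and propagate bounds on the state distribution. This is a simplified stand-in, not the paper's restricted fuzzy matrix multiplication, which additionally constrains each realized row to sum to exactly 1, so the naive bounds below are looser than theirs:

```python
def step(dist_lo, dist_hi, P_lo, P_hi):
    """One interval-arithmetic step: elementwise bounds on the next
    state distribution, given bounds on the current distribution and
    on the transition matrix (all entries in [0, 1])."""
    n = len(dist_lo)
    nxt_lo = [sum(dist_lo[i] * P_lo[i][j] for i in range(n)) for j in range(n)]
    nxt_hi = [sum(dist_hi[i] * P_hi[i][j] for i in range(n)) for j in range(n)]
    return nxt_lo, nxt_hi

# 2-state example: exact initial state 0, interval transition matrix.
P_lo = [[0.6, 0.3], [0.2, 0.7]]
P_hi = [[0.7, 0.4], [0.3, 0.8]]
lo1, hi1 = step([1.0, 0.0], [1.0, 0.0], P_lo, P_hi)
```

Iterating `step` gives bounds after several transitions; the paper's restricted multiplication keeps these bounds from growing as fast as plain interval arithmetic does.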
DECOFF Probabilities of Failed Operations
Gintautas, Tomas
A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against standard Alpha...
A Novel Approach to Probability
Kafri, Oded
2016-01-01
When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes is empty; in reality, however, the probability of the empty box is always the highest. This fact stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
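The counter-intuitive claim in this abstract, that under equally probable configurations the empty box is the most probable occupancy, is easy to verify by brute force for a small dense system. A sketch (my own enumeration, not the authors' derivation):

```python
from itertools import product
from collections import Counter

def occupancy_distribution(P, L):
    """Marginal occupancy of one box when every configuration (composition
    of P indistinguishable balls into L distinguishable boxes) is equally
    probable. Brute force, so keep P and L small."""
    counts = Counter()
    for balls in product(range(P + 1), repeat=L - 1):
        rest = P - sum(balls)
        if rest >= 0:              # valid configuration
            counts[rest] += 1      # occupancy of the remaining box
    total = sum(counts.values())
    return {k: c / total for k, c in sorted(counts.items())}

dist = occupancy_distribution(P=8, L=3)   # dense: P > L
print(max(dist, key=dist.get))            # prints 0: empty box is the mode
```

Analytically, the number of configurations with a given box holding k balls is the number of compositions of P-k into L-1 parts, which decreases in k, so k = 0 is always the mode, exactly the "long tail" behavior the abstract describes.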
Probability representations of fuzzy systems
LI Hongxing
2006-01-01
In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
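The COG (center-of-gravity) defuzzification the abstract calls optimal in the mean-square sense is just the mean of the membership function read as an unnormalized density, since the conditional mean minimizes mean-square error. A minimal sketch on a discrete grid (grid and membership values are illustrative):

```python
def cog(xs, mu):
    """Center-of-gravity defuzzification: COG = sum(mu*x) / sum(mu).
    Reading mu as an unnormalized probability density, this is the mean,
    which is the mean-square-optimal point estimate."""
    num = sum(x * m for x, m in zip(xs, mu))
    den = sum(mu)
    return num / den

# Triangular membership on a grid, peaked at 2.0.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
mu = [0.0, 0.5, 1.0, 0.5, 0.0]
print(cog(xs, mu))   # symmetric membership -> 2.0
```

For a symmetric membership function the COG coincides with the peak; for skewed memberships it shifts toward the heavier side, exactly as the mean of the corresponding "inner kernel" distribution would.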
On the probability of cure for heavy-ion radiotherapy.
Hanin, Leonid; Zaider, Marco
2014-07-21
The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
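The extinction-probability framing connects to the familiar Poisson tumor-control probability model: if N clonogens each survive dose D with probability S(D), the cure probability is exp(-N·S(D)). A sketch of this textbook model with the standard linear-quadratic survival curve; the parameter values are illustrative and this is not the paper's microdosimetric bound calculation:

```python
import math

def tcp_poisson(n_clonogens, total_dose, dose_per_fx, alpha, beta):
    """Poisson tumor-control probability with linear-quadratic survival:
    TCP = exp(-N * S(D)),  S(D) = exp(-(alpha*D + beta*d*D)),
    where D is total dose and d is dose per fraction.
    Standard textbook model with illustrative parameters."""
    surviving = n_clonogens * math.exp(-(alpha * total_dose
                                         + beta * dose_per_fx * total_dose))
    return math.exp(-surviving)

# Hypothetical case: 1e9 clonogens, 70 Gy in 2 Gy fractions,
# alpha = 0.25 / Gy, beta = 0.025 / Gy^2.
p = tcp_poisson(1e9, 70.0, 2.0, 0.25, 0.025)
print(f"TCP = {p:.3f}")
```

The model makes the paper's central difficulty concrete: TCP depends exponentially on the unknown initial clonogen number N, which is why bounds on N translate directly into bounds on the cure probability.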
Measures, Probability and Holography in Cosmology
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10⁻¹²³ and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We
Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M
Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
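The discounting curve itself is commonly quantified with a hyperbolic model of subjective value against the odds-against ratio. The sketch below fits such a curve; the model form V = A/(1 + h·θ), the reward amount, and the data points are illustrative assumptions, not values from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hyperbolic discounting of an uncertain reward: V = A / (1 + h * theta),
# where theta = (1 - p) / p is the odds against the large reinforcer and a
# larger h means steeper discounting (more risk aversion). All numbers here
# are made-up illustrations.
def hyperbolic(theta, h):
    A = 100.0  # nominal amount of the large reinforcer (assumed)
    return A / (1.0 + h * theta)

p = np.array([1.0, 0.75, 0.5, 0.25, 0.125])  # probability of the large reward
theta = (1.0 - p) / p                        # odds against
V = hyperbolic(theta, 1.5)                   # noiseless synthetic choice data

(h_fit,), _ = curve_fit(hyperbolic, theta, V, p0=[1.0])
print(round(h_fit, 3))  # recovers h = 1.5
```

Comparing fitted h values between ascending- and descending-schedule groups is one way the steepness difference reported above could be expressed.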
Understanding Y haplotype matching probability.
Brenner, Charles H
2014-01-01
The Y haplotype population-genetic terrain is better explored from a fresh perspective than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, as opposed to autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important, while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
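The kappa method itself is developed in the cited paper; as a loosely related classical estimate (not Brenner's formula), the Good-Turing singleton count gives the total probability that a random profile is of a type absent from the database, which is the kind of "matching probability, not frequency" quantity at issue. The labels and database below are invented for illustration.

```python
from collections import Counter

def unseen_mass(sample):
    """Good-Turing estimate of the total probability of types not present
    in the sample: n1 / n, where n1 is the number of singletons. This is a
    related classical estimate, NOT Brenner's kappa formula itself."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)  # types seen exactly once
    return n1 / len(sample)

# toy 'database' of haplotype labels (illustrative, not real data)
db = ["A", "A", "B", "C", "C", "C", "D", "E"]
print(unseen_mass(db))  # 3 singletons (B, D, E) out of 8 -> 0.375
```

The point the abstract makes survives even in this toy: the estimate concerns matching a hitherto-unseen type, with no appeal to a "population frequency" of that type.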
Webb, S.; Evans, P.M.; Swindell, W. (Institute of Cancer Research, Sutton (United Kingdom). Surrey Branch Royal Marsden Hospital, Sutton (United Kingdom)); Deasy, J.O. (Wisconsin Univ., Madison, WI (United States). Dept. of Medical Physics Wisconsin Univ., Madison, WI (United States). Dept. of Human Oncology)
1994-11-01
In this note it is shown that for a fixed integral dose to the planning target volume, the highest tumour control probability arises when the dose is spatially uniform. This 'uniform dose theorem' is proved both for (i) a specific TCP model based on Poisson/independent voxel statistics and (ii) any model for voxel control probability having a specific shape with respect to increasing dose. (author).
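The Poisson/independent-voxel version of the theorem can be checked numerically: since N·exp(-α·d) is convex in d, spreading a fixed mean dose unevenly always raises the expected surviving clonogen count and lowers TCP. The radiosensitivity, clonogen number, and dose values below are illustrative assumptions, not values from the note.

```python
import numpy as np

# Poisson/independent-voxel TCP: TCP = prod_i exp(-N * exp(-alpha * d_i)).
# Parameter values are illustrative only.
alpha, N = 0.3, 1.0e4    # Gy^-1 radiosensitivity, clonogens per voxel (assumed)

def tcp(doses):
    d = np.asarray(doses, dtype=float)
    return float(np.prod(np.exp(-N * np.exp(-alpha * d))))

uniform = [60.0, 60.0, 60.0, 60.0]   # mean dose 60 Gy
skewed = [50.0, 58.0, 62.0, 70.0]    # same mean (integral) dose of 60 Gy
print(tcp(uniform) > tcp(skewed))    # True: uniform dose wins
```

Any redistribution at fixed mean dose gives the same ordering, which is the content of the 'uniform dose theorem'.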
Probability biases as Bayesian inference
Andre; C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Cluster pre-existence probability
Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)
2011-10-15
The pre-existence probabilities of the fragments for the complete binary spectrum of different systems such as ⁵⁶Ni, ¹¹⁶Ba, ²²⁶Ra and ²⁵⁶Fm are calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, the pre-existence probability is calculated even for negative Q-value systems. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)
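The WKB quantity underlying such calculations is a penetration factor, P = exp(-2S), with the action S integrated between the classical turning points where the potential crosses the energy. The parabolic barrier, mass, and Q value below are invented toy values in ħ = 1 units, not the paper's nuclear potentials.

```python
import numpy as np

# Schematic WKB penetration factor: P = exp(-2 * S), where
# S = integral of sqrt(2 * mu * (V(r) - Q)) dr between the turning points.
# All shapes and numbers here are illustrative toys (hbar = 1).
mu, Q = 1.0, 0.5
V = lambda r: 1.0 - (r - 1.0) ** 2            # barrier peaked at r = 1

a, b = 1 - np.sqrt(0.5), 1 + np.sqrt(0.5)     # turning points where V(r) = Q
r = np.linspace(a, b, 100001)
integrand = np.sqrt(2 * mu * np.clip(V(r) - Q, 0.0, None))
S = float(np.sum(integrand) * (r[1] - r[0]))  # simple Riemann sum
P = np.exp(-2 * S)
print(0.0 < P < 1.0)  # True: a penetration/pre-existence probability
```

For this parabolic toy the action has the closed form S = √2·π/4, which the Riemann sum reproduces; a realistic calculation would replace V(r) with the fragment-fragment interaction potential.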
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Sm Transition Probabilities and Abundances
Lawler, J E; Sneden, C; Cowan, J J
2005-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).
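The conversion at the heart of this method is simple: a level's branching fractions sum to one, so dividing each by the measured radiative lifetime yields the individual transition probabilities (Einstein A-coefficients). The numbers below are illustrative, not measured Sm II values.

```python
# Einstein A-coefficients from a radiative lifetime and branching fractions:
# A_ul = BF_ul / tau_u, since 1/tau_u is the upper level's total decay rate.
# All numbers are illustrative assumptions.
tau = 6.0e-9                 # upper-level lifetime, seconds (assumed)
bf = [0.55, 0.30, 0.15]      # branching fractions of its decay lines (sum to 1)

A = [b / tau for b in bf]    # transition probabilities, s^-1
print(abs(sum(A) * tau - 1.0) < 1e-9)  # True: the A-values sum to 1/tau
```

The quoted +/- 5% lifetime accuracy propagates directly into each A-value, with the branching-fraction uncertainty adding line by line.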
Knot probabilities in random diagrams
Cantarella, Jason; Chapman, Harrison; Mastin, Matt
2016-10-01
We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
Probability distributions for multimeric systems.
Albert, Jaroslav; Rooman, Marianne
2016-01-01
We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising the sum of squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
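The moment-matching idea can be sketched in one dimension: express the moments of a skew normal in terms of its parameters and minimize the squared residuals against target moments. This replaces the paper's master-equation moments and multivariate MSND with toy targets and a univariate fit; it is a sketch of the strategy, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Univariate skew normal SN(xi, omega, alpha): with delta = alpha/sqrt(1+alpha^2)
# and m = delta*sqrt(2/pi), its mean, variance, and skewness have closed forms.
def sn_moments(params):
    xi, omega, alpha = params
    delta = alpha / np.sqrt(1 + alpha ** 2)
    m = delta * np.sqrt(2 / np.pi)
    mean = xi + omega * m
    var = omega ** 2 * (1 - m ** 2)
    skew = (4 - np.pi) / 2 * m ** 3 / (1 - m ** 2) ** 1.5
    return np.array([mean, var, skew])

# Illustrative target moments (in the paper these would come from the
# master-equation moment equations):
target = np.array([2.0, 1.5, 0.5])
fit = least_squares(lambda q: sn_moments(q) - target, x0=[0.0, 1.0, 1.0])
print(np.allclose(sn_moments(fit.x), target, atol=1e-5))  # True
```

The same least-squares-on-moment-equations structure generalizes to the multivariate skew normal, at the cost of many more parameters and coupled equations.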
Asbestos and Probable Microscopic Polyangiitis
George S Rashed Philteos
2004-01-01
Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase)-positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.
Logic, Probability, and Human Reasoning
2015-01-01
Johnson-Laird, P.N.; Khemlani, Sangeet S.; Goodwin, Geoffrey P. (Princeton University, Princeton, NJ)
Deductions underlie mathematics, science, and technology. Plato claimed that emotions upset reasoning; however, individuals in the grip...
Probability and statistics: A reminder
Clément Benoit
2013-07-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Earthquake probabilities: theoretical assessments and reality
Kossobokov, V. G.
2013-12-01
It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems for contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated at about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g., Poisson, periodic) or, conversely, delicately designed (e.g., STEP, ETAS) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers, and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology for assessing seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance
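The closing point, that the model choice entirely determines the "daily probability", can be made concrete. The once-per-hundred-years expectation comes from the abstract; the two model choices contrasted below are standard illustrations, not the author's calculation.

```python
import math

# One expected event per ~100 years fixes only the long-run rate; the daily
# probability then depends entirely on the assumed model.
rate_per_day = 1.0 / (100 * 365.25)

# (a) memoryless Poisson model: the same tiny probability on every date
p_poisson = 1.0 - math.exp(-rate_per_day)

# (b) a strictly periodic model instead concentrates probability ~1 on the
# 'due' date and ~0 on every other date -- same rate, opposite daily answers.
print(p_poisson < 3e-5)  # True: about 2.7e-5 per day under the Poisson model
```

Between these two extremes lie renewal and clustering models, each assigning a different probability to the same calendar date, which is exactly the 0-to-100% range the abstract describes.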
Objective probability and quantum fuzziness
Mohrhoff, U
2007-01-01
This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper, the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...
Empirical and Computational Tsunami Probability
Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.
2008-12-01
A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
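The aggregation step described above can be sketched in miniature: an assumed tapered power-law size distribution combined with a Poisson occurrence rate yields the probability of at least one exceedance during an exposure time. All parameter values below are invented for illustration, not catalog-derived.

```python
import math

# Tapered power-law survival function for event size x >= x0:
#   S(x) = (x0 / x)**beta * exp((x0 - x) / xc)
# combined with a Poisson rate lam of events above x0:
#   P(at least one event of size >= x in T years) = 1 - exp(-lam * S(x) * T).
# All parameters are illustrative assumptions.
def survival(x, x0=1.0, beta=1.0, xc=10.0):
    return (x0 / x) ** beta * math.exp((x0 - x) / xc)

lam, T = 0.2, 50.0   # events/year above x0; exposure time in years
x = 5.0              # runup/height threshold of interest
p = 1.0 - math.exp(-lam * survival(x) * T)
print(round(p, 3))   # ~ 0.738
```

A full analysis would aggregate such terms over many sources (near- and far-field) and fold in epistemic and aleatory uncertainty, as the abstract outlines.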
Improved Estimation of Forestry Edge Effects Accounting for Detection Probability
Hocking, Daniel; Babbitt, Kimberly; Yamasaki, Mariko
2013-01-01
Poster presented at the 98th annual meeting of the Ecological Society of America (ESA) in Minneapolis, Minnesota, USA. We used a non-linear, parametric model accounting for detection probability to quantify red-backed salamander (Plethodon cinereus) abundance across clearcut-forest edges. This approach allows for projection across landscapes and prediction given alternative logging plans.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather forecasting resembles interpolation in state space, whereas climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Estimating Probabilities in Recommendation Systems
Sun, Mingxuan; Kidwell, Paul
2010-01-01
Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
PROBABILITY MODEL OF GUNTHER GENERATOR
无
2007-01-01
This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. It gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and analyzes the coincidence between the output sequence and input sequences of the Gunther generator. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
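As a small taste of the models reviewed, the forward algorithm computes the likelihood of an observation sequence under a hidden Markov model by propagating state probabilities. The two-state model and all matrices below are made-up toy numbers.

```python
import numpy as np

# Forward algorithm for a toy two-state HMM with two output symbols.
# All parameters are illustrative assumptions.
pi = np.array([0.6, 0.4])                 # initial state distribution
A = np.array([[0.7, 0.3], [0.4, 0.6]])    # state transition matrix
B = np.array([[0.9, 0.1], [0.2, 0.8]])    # emission matrix (rows = states)

def likelihood(obs):
    alpha = pi * B[:, obs[0]]             # initialize with first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # propagate and weight by emission
    return float(alpha.sum())

print(round(likelihood([0, 1, 0]), 4))    # 0.1089
```

The same recursion, run with logarithms or rescaling, underpins the efficient inference algorithms the chapter discusses for longer sequences.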
Probability of Detection Demonstration Transferability
Parker, Bradford H.
2008-01-01
The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.
Nakaura, Takeshi; Iyama, Yuji; Kidoh, Masafumi; Yokoyama, Koichi [Amakusa Medical Center, Diagnostic Radiology, Amakusa, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Oda, Seitaro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Tokuyasu, Shinichi [Philips Electronics, Kumamoto (Japan); Harada, Kazunori [Amakusa Medical Center, Department of Surgery, Kumamoto (Japan)
2016-03-15
The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT, especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs at 1 and 5 mm slice thicknesses were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)
The probability that a complete intersection is smooth
Bucur, Alina
2010-01-01
Given a smooth subscheme of a projective space over a finite field, we compute the probability that its intersection with a fixed number of hypersurface sections of large degree is smooth of the expected dimension. This generalizes the case of a single hypersurface, due to Poonen. We use this result to give a probabilistic model for the number of rational points of such a complete intersection. A somewhat surprising corollary is that the number of rational points on a random smooth intersection of two curves in projective 3-space is strictly less than the number of points on the projective line.
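The finite-field densities behind such results can be checked by brute force in small cases. A fact in the same circle of ideas as Poonen's theorem is that the density of squarefree monic polynomials of degree n >= 2 over F_q is 1 - 1/q; the sketch below verifies it for monic cubics over F_3 via the gcd(f, f') criterion, and is a hypothetical illustration, not the paper's computation.

```python
from itertools import product

p = 3  # coefficient field F_3

def trim(f):
    while f and f[-1] == 0:
        f.pop()
    return f

def polymod(f, g):
    """Remainder of f modulo g over F_p (coefficient lists, low -> high)."""
    f = trim([c % p for c in f])
    inv = pow(g[-1], p - 2, p)         # inverse of g's leading coefficient
    while len(f) >= len(g):
        c = f[-1] * inv % p
        s = len(f) - len(g)
        for i, b in enumerate(g):
            f[s + i] = (f[s + i] - c * b) % p
        trim(f)
    return f

def polygcd(f, g):
    while g:
        f, g = g, polymod(f, g)
    return f

def deriv(f):
    return trim([(i * c) % p for i, c in enumerate(f)][1:])

n, total, squarefree = 3, 0, 0
for low in product(range(p), repeat=n):     # all monic cubics over F_3
    f = list(low) + [1]
    total += 1
    if len(polygcd(f, deriv(f))) == 1:      # gcd(f, f') constant -> squarefree
        squarefree += 1
print(squarefree, total)  # 18 27, i.e. density 2/3 = 1 - 1/p
```

The gcd test correctly catches the characteristic-p subtlety: when f' vanishes identically (e.g. x^3 + c over F_3), f is a p-th power and is rightly counted as non-squarefree.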
Hf Transition Probabilities and Abundances
Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I
2006-01-01
Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...
Gd Transition Probabilities and Abundances
Den Hartog, E A; Sneden, C; Cowan, J J
2006-01-01
Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...
Probable warfarin and dapsone interaction.
Truong, Teresa; Haley, James
2012-01-01
We describe a case of a 41-year-old woman who was stable for over a year on 22.5 mg/week of warfarin. At a follow-up visit, her international normalized ratio (INR) was found to be supratherapeutic at 3.9. Her only significant change was acyclovir initiation for shingles, and clindamycin and dapsone for infection on her right foot. An interaction report was run using Micromedex with no interactions reported. Sixteen percent of the weekly dose was held and maintenance dose was continued. Two weeks later, the INR remained supratherapeutic at 4.3, with discontinuation of clindamycin and dapsone, 5 days earlier, as the only change. This time an interaction report was run using Lexi-Comp, which identified an interaction between warfarin and dapsone. The INR has been therapeutic and stable since discontinuation of transient factors. It is hypothesized that warfarin and dapsone compete for binding on the CYP2C9 and CYP3A4 isoenzymes and therefore serum concentration of warfarin was elevated.
Post-Classical Probability Theory
Barnum, Howard
2012-01-01
This paper offers a brief introduction to the framework of "general probabilistic theories", otherwise known as the "convex-operational" approach to the foundations of quantum mechanics. Broadly speaking, the goal of research in this vein is to locate quantum mechanics within a very much more general, but conceptually very straightforward, generalization of classical probability theory. The hope is that, by viewing quantum mechanics "from the outside", we may be better able to understand it. We illustrate several respects in which this has proved to be the case, reviewing work on cloning and broadcasting, teleportation and entanglement swapping, key distribution, and ensemble steering in this general framework. We also discuss a recent derivation of the Jordan-algebraic structure of finite-dimensional quantum theory from operationally reasonable postulates.
Associativity and normative credal probability.
Snow, P
2002-01-01
Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments in financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
The Inductive Applications of Probability Calculus
Corrado Gini
2015-06-01
The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.
Probability landscapes for integrative genomics
Benecke Arndt
2008-05-01
Abstract Background The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches, as well as the prediction tools, are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s)-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence, using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and for the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a
Identification of dose-reduction techniques for BWR and PWR repetitive high-dose jobs
Dionne, B.J.; Baum, J.W.
1984-01-01
As a result of concern about the apparent increase in collective radiation dose to workers at nuclear power plants, this project will provide information to industry in preplanning for radiation protection during maintenance operations. This study identifies Boiling Water Reactor (BWR) and Pressurized Water Reactor (PWR) repetitive jobs, and respective collective dose trends and dose reduction techniques. 3 references, 2 tables. (ACR)
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Statistical convergence of order $\\alpha$ in probability
Pratulananda Das; Sanjoy Ghosal; Sumit Som
2016-01-01
In this paper ideas of different types of convergence of a sequence of random variables in probability, namely, statistical convergence of order $\\alpha$ in probability, strong $p$-Ces$\\grave{\\mbox{a}}$ro summability of order $\\alpha$ in probability, lacunary statistical convergence or $S_{\\theta}$-convergence of order $\\alpha$ in probability, ${N_{\\theta}}$-convergence of order $\\alpha$ in probability have been introduced and their certain basic properties have been studied.
Csete, I. [National Office of Measures (OMH) - pilot laboratory and corresponding author (Hungary); Leiton, A.G. [Research Centre for Energy, Environment and Technology (CMRI-CIEMAT) (Spain); Sochor, V. [Czech Metrology Institute (CMI) (Czech Republic); Lapenas, A. [Latvian National Metrology Center (LNMC-RMTC) (Latvia); Grindborg, J.E. [Swedish Radiation Protection Authority (SSI) (Sweden); Jokelainen, I. [Radiation and Nuclear Safety Authority (STUK) (Finland); Bjerke, H. [Norwegian Radiation Protection Authority (NRPA) (Norway); Dobrovodsky, J. [Slovak Institute of Metrology (SMU) (Slovakia); Megzifene, A. [International Atomic Energy Agency, IAEA, Vienna (Austria); Hourdakis, C.J. [Hellenic Atomic Energy Committee (HAEC-HIRCL) (Greece); Ivanov, R. [National Centre of Metrology (NCM) (Bulgaria); Vekic, B. [Rudjer Boskovic Institute (IRB) (Croatia); Kokocinski, J. [Central Office of Measures (GUM) (Poland); Cardoso, J. [Institute for Nuclear Technology (ITN-LMRIR) (Portugal); Buermann, L. [Physikalisch Technische Bundesanstalt (PTB) (Germany); Tiefenboeck, W. [Bundesamt fur Eich und Vermessungswesen (BEV) (Austria); Stucki, G. [Bundesamt fur Metrologie (METAS) (Switzerland); Van Dijk, E. [NMi Van Swinden Laboratorium (NMi) (Netherlands); Toni, M.P. [ENEA-CR Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA) (Italy); Minniti, R. [National Institute of Standards and Technology (NIST) (United States); McCaffrey, J.P. [National Research Council Canada (NRC) (Canada); Silva, C.N.M. [National Metrology Laboratory of Ionizing Radiation (LNMRI-IRD) (Brazil); Kharitonov, I. [D I Mendeleyev Institute for Metrology (VNIIM) (Russian Federation); Webb, D. [Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) (Australia); Saravi, M. [National Atomic Energy Commission (CNEA-CAE) (Argentina); Delaunay, F. [Laboratoire National Henri Becquerel (LNE-LNHB) (France)
2010-06-15
The results of an unprecedented international effort involving 26 countries are reported. The EUROMET.RI(I)-K1 and EUROMET.RI(I)-K4 key comparisons were conducted with the goal of supporting the relevant calibration and measurement capabilities (CMC) planned for publication by the participant laboratories. The measured quantities were the air kerma (K_air) and the absorbed dose to water (D_w) in 60Co radiotherapy beams. The comparison was conducted by the pilot laboratory MKEH (Hungary), in a star-shaped arrangement, from January 2005 to December 2008. The calibration coefficients of four transfer ionization chambers were measured using two electrometers. The largest deviation between any two calibration coefficients for the four chambers was 2.7% in terms of air kerma and 3.3% in terms of absorbed dose to water. An analysis of the participants' uncertainty budgets enabled the calculation of degrees of equivalence (DoE), in terms of the deviations of the results and their associated uncertainties. As a result of this EUROMET project 813 comparison, the BIPM key comparison database (KCDB) will include eleven new K_air and fourteen new D_w DoE values of European secondary standard dosimetry laboratories (SSDLs), and the KCDB will be updated with the new DoE values of the other participant laboratories. The pair-wise degrees of equivalence of the participants were also calculated. In addition to assessing the calibration techniques and uncertainty calculations of the participants, these comparisons enabled the experimental determination of N_Dw/N_Kair ratios in the 60Co gamma radiation beam for the four radiotherapy transfer chambers. (authors)
Fusion probability in heavy nuclei
Banerjee, Tathagata; Nath, S.; Pal, Santanu
2015-03-01
Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_p Z_t), as well as with the fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections
王平贵; 高丽; 安婧; 王旭霞; 张宁静; 唐宇; 邓琳; 李慧
2013-01-01
Objective To evaluate a pilot project for improving the timely birth-dose coverage rate of hepatitis B vaccine in Gansu province, and to provide evidence for making a control strategy for hepatitis B. Methods Using the probability proportionate to size (PPS) sampling method, 30 villages in one of 7 counties in Tianshui city were surveyed. A total of 1 470 children and their parents were investigated about hepatitis B virus vaccination and awareness of hepatitis B before and after the implementation of the project, and 210 village physicians were investigated about factors affecting the timely birth-dose coverage of hepatitis B vaccine. In each county, 2 medical institutions at county level or above and 3 township hospitals were randomly selected and investigated on hepatitis B vaccination among newborns, the rate of hepatitis B virus surface antigen (HBsAg) detection among hospitalized pregnant women, and the awareness of hepatitis B among medical workers. Results With the implementation of the project, the timely birth-dose coverage rate increased from 75.24% (1 106/1 470) to 94.83% (1 394/1 470). The hospitalized delivery rate increased from 55.85% (818/1 470) to 81.43% (1 197/1 470). The rate of HBsAg detection among hospitalized pregnant women increased from 80.00% (14 830/18 537) to 99.21% (32 584/32 842). The awareness rates of hepatitis B among the medical workers and the children's parents were significantly increased compared with those of the baseline survey. The village physicians reported that the major factors affecting the timely birth-dose coverage of hepatitis B vaccine were not knowing of the birth of the neonate, the distance being too far to reach the newborn's home, and the lack of hepatitis B vaccine in the local area. Conclusion Through implementation of the project, the timely birth-dose coverage rate of hepatitis B vaccine in Tianshui city was significantly improved. The awareness rates of hepatitis B in medical workers and children's parents were significantly
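The probability proportionate to size (PPS) sampling method used in this survey weights each cluster's chance of selection by its size. A minimal sketch, with replacement; the village names and population sizes are hypothetical:

```python
import random

def pps_sample(units, sizes, n, seed=0):
    """Draw n units with probability proportionate to size (PPS),
    with replacement, via cumulative-size selection."""
    rng = random.Random(seed)
    total = sum(sizes)
    # Cumulative size boundaries: unit i is chosen when the random
    # draw falls inside its size interval.
    cum, running = [], 0
    for s in sizes:
        running += s
        cum.append(running)
    chosen = []
    for _ in range(n):
        r = rng.uniform(0, total)
        for unit, bound in zip(units, cum):
            if r <= bound:
                chosen.append(unit)
                break
    return chosen

villages = ["A", "B", "C"]
populations = [100, 300, 600]   # hypothetical village sizes
print(pps_sample(villages, populations, 5))
```

Larger villages dominate the sample in proportion to their size, which is the property the survey design relies on.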
Probable Linezolid-Induced Pancytopenia
Nita Lakhani
2005-01-01
A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5x10^12/L, leukocytes 2.9x10^9/L, platelets 59x10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.
The Black Hole Formation Probability
Clausen, Drew; Ott, Christian D
2014-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...
Avoiding Negative Probabilities in Quantum Mechanics
Nyambuya, Golden Gadzirayi
2013-01-01
As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(1/e)=1/e, w(1)=1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
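Prelec's one-parameter weighting function, w(p) = exp(-(-ln p)^alpha) with 0 < alpha < 1, is easy to check numerically; alpha = 0.65 below is an illustrative value only, since the fixed points w(1/e) = 1/e and w(1) = 1 hold for every alpha in (0, 1):

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha).
    alpha = 0.65 is a hypothetical illustrative value (0 < alpha < 1)."""
    if p == 0.0:
        return 0.0
    return math.exp(-(-math.log(p)) ** alpha)

# Fixed points shared by every alpha in (0, 1):
print(prelec_w(1.0))                     # 1.0
print(prelec_w(math.exp(-1)))            # 0.36787944... = 1/e
# Characteristic overweighting of small probabilities:
print(prelec_w(0.01) > 0.01)             # True
```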
Kicken, P.J.H.; Zankl, M.; Kemerink, G.J
1999-07-01
X ray projection data (see Part I) and the GSF phantoms ADAM and EVA were used as input for the GSF Monte Carlo transport code to calculate hitherto unavailable dose conversion coefficients (DCCs) for common projections in arteriography of the lower limbs. These DCCs served to estimate organ equivalent doses and effective dose in a study of 455 patients. The effective dose caused by percutaneous needle puncture arteriography of one leg was on average 1 mSv, by Seldinger catheterisation for arteriography of both legs 4 mSv, and by intravenous digital subtraction arteriography (DSA) 5 mSv. For needle puncture and Seldinger arteriography the effective dose attributable to fluoroscopy was about 50% for male and 60% for female patients. The contribution of DSA was between 15 and 35%, that of cut films between 17 and 28%, depending on gender and procedure. The effective dose in intravenous arteriography was mainly due to DSA (91-93%). (author)
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
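For a biased spinner of the kind advocated here, the probability of landing on a given side can be taken as that side's sector angle divided by 360°. A minimal sketch with exact arithmetic; the sides and angles are hypothetical:

```python
from fractions import Fraction

# A biased spinner modeled by sector angles in degrees; the probability of
# landing on a side is its sector angle divided by 360 (angles hypothetical).
angles = {"red": 180, "blue": 90, "green": 60, "yellow": 30}

probabilities = {side: Fraction(a, 360) for side, a in angles.items()}

print(probabilities["red"])          # 1/2
print(sum(probabilities.values()))   # 1
```

Using `Fraction` keeps the connection to the fraction arithmetic the article highlights, and makes the check that the probabilities sum to one exact.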
Conditional probability modulates visual search efficiency.
Cort, Bryan; Anderson, Britt
2013-01-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
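The cue-target contingency described in this design can be illustrated with a toy table: each cue pair is equally likely, the conditional probability of one target color varies from 0.1 to 0.9, yet the absolute probability stays at 0.5. The cue labels and values below are hypothetical stand-ins for the study's design:

```python
# P(target is red | cue pair); each of the four cue pairs is equally likely.
p_red_given_cue = {("A", "A"): 0.9, ("A", "B"): 0.5,
                   ("B", "A"): 0.5, ("B", "B"): 0.1}

# The absolute (marginal) probability averages over the equally likely cues.
marginal = sum(p_red_given_cue.values()) / len(p_red_given_cue)
print(marginal)  # 0.5
```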
Probability, random processes, and ergodic properties
Gray, Robert M
1988-01-01
This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...
Using classical probability to guarantee properties of infinite quantum sequences
Gutmann, S
1995-01-01
We consider the product of infinitely many copies of a spin-1/2 system. We construct projection operators on the corresponding nonseparable Hilbert space which measure whether the outcome of an infinite sequence of σ^x measurements has any specified property. In many cases, product states are eigenstates of the projections, and therefore the result of measuring the property is determined. Thus we obtain a nonprobabilistic quantum analogue to the law of large numbers, the randomness property, and all other familiar almost-sure theorems of classical probability.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
47 CFR 1.1623 - Probability calculation.
2010-10-01
47 Telecommunication, Pt. 1 (2010-10-01 edition), Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify.
Inferring Beliefs as Subjectively Imprecise Probabilities
Andersen, Steffen; Fountain, John; Harrison, Glenn W.;
2012-01-01
We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e... probabilities are indeed best characterized as probability distributions with non-zero variance.
Scoring Rules for Subjective Probability Distributions
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;
report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
The trajectory of the target probability effect.
Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B
2013-05-01
The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.
FUZZY SETS THEORY AS THE PART OF PROBABILITY THEORY
Orlov A. I.
2013-01-01
One of the key provisions of the system of fuzzy interval mathematics is the claim that the theory of fuzzy sets is part of the theory of random sets and thus part of probability theory. The article is devoted to the justification of this statement. A number of theorems are proved showing that fuzzy sets, and the results of operations on them, can be viewed as projections of random sets and of the results of the corresponding operations on them.
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Juliana Bueno-Soler
2016-09-01
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
Probability of Failure in Random Vibration
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
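The first-passage probability of a barrier crossing can also be approximated by simulation; the sketch below uses a plain Gaussian random walk as a stand-in for the structural response process, with hypothetical barrier and step statistics (it illustrates the quantity being approximated, not the integral equation method itself):

```python
import random

def first_passage_prob(barrier, n_steps, n_paths, seed=0):
    """Monte Carlo sketch: probability that a Gaussian random walk
    crosses `barrier` at least once within n_steps."""
    rng = random.Random(seed)
    crossed = 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, 1.0)  # unit-variance increments
            if x >= barrier:          # first up-crossing of the barrier
                crossed += 1
                break
    return crossed / n_paths

print(first_passage_prob(barrier=5.0, n_steps=100, n_paths=2000))
```

A higher barrier gives a smaller first-passage probability, as the integral equation approach would also predict.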
唐瑜; 崔学英; 张权; 刘祎; 桂志国
2014-01-01
Because a lower X-ray dose is used, low-dose CT greatly reduces the harm of radiation to the human body. The problem it brings, however, is that the projection data are seriously polluted by noise, which degrades the quality of the reconstructed image. To address this problem, we present an improved projection-domain denoising algorithm based on partial differential equations. Building on the anisotropic diffusion equation, the algorithm uses local entropy, which effectively reflects the local characteristics of the image, to control the degree of diffusion. Experimental results show that the new algorithm better preserves the detail and edge information of the reconstructed image while increasing its signal-to-noise ratio.
On the computability of conditional probability
Ackerman, Nathanael L; Roy, Daniel M
2010-01-01
We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
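Step (v) above combines the per-pixel probabilities with a simple maximum rule. As a hedged illustration (the function name and all numeric values are invented for this sketch, not taken from the paper), the combination can be written as:

```python
import numpy as np

def integrated_landslide_probability(p_release, p_impact, p_zonal):
    """Integrated spatial landslide probability per pixel: the maximum of the
    release probability and the product of impact and zonal probabilities.
    All inputs are 2-D arrays of equal shape."""
    p_release = np.asarray(p_release, dtype=float)
    p_impact = np.asarray(p_impact, dtype=float)
    p_zonal = np.asarray(p_zonal, dtype=float)
    return np.maximum(p_release, p_impact * p_zonal)

# Toy 2x2 example with made-up probabilities
p_rel = np.array([[0.10, 0.40], [0.05, 0.00]])
p_imp = np.array([[0.80, 0.20], [0.50, 0.90]])
p_zon = np.array([[0.30, 0.30], [0.60, 0.60]])
p_int = integrated_landslide_probability(p_rel, p_imp, p_zon)
# e.g. top-left pixel: max(0.10, 0.80 * 0.30) = 0.24
```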
刘建军; 滕丽娟; 刘嘉伟
2015-01-01
The main canal of China's Middle Route South-to-North Water Transfer Project spans a wide range of latitudes. From south to north the air temperature gradually decreases, passing from a mild climate zone into a severely cold one; along the route the discharge, velocity, and water depth all decrease, and many structures cross the channel, so in cold weather ice hazards are inevitable. On the basis of extensive collected data, this paper analyzes the ice conditions that may occur during ice-period water conveyance and the conditions under which they form, and proposes measures for preventing and controlling ice hazards during operation.
Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution
Sumida, Iori, E-mail: sumida@radonc.med.osaka-u.ac.jp [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Yamaguchi, Hajime; Kizaki, Hisao; Aboshi, Keiko; Tsujii, Mari; Yoshikawa, Nobuhiko; Yamada, Yuji [Department of Radiation Oncology, NTT West Osaka Hospital, Osaka (Japan); Suzuki, Osamu; Seo, Yuji [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Isohashi, Fumiaki [Department of Radiation Oncology, NTT West Osaka Hospital, Osaka (Japan); Yoshioka, Yasuo [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Ogawa, Kazuhiko [Department of Radiation Oncology, NTT West Osaka Hospital, Osaka (Japan)
2015-07-15
Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control probability (TCP) and normal tissue complication probability (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of the relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under a 2%/2 mm tolerance, and the physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. The radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: The RGI was proposed to integrate radiobiological effects into the PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of predicted dose distribution.
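The physical gamma index underlying this analysis combines a dose-difference tolerance (2%) and a distance-to-agreement tolerance (2 mm) into one dimensionless quantity; a point passes when gamma ≤ 1. A minimal 1-D brute-force sketch follows (the paper's 3-D implementation and its radiobiological extension are not specified, so this is only the standard physical index on toy data):

```python
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.02, dta=2.0):
    """Per-point physical gamma index in 1-D.
    dd: fractional dose tolerance (of the max reference dose);
    dta: distance-to-agreement tolerance in the units of x (e.g. mm)."""
    dose_norm = dd * dose_ref.max()
    gam = np.empty_like(dose_ref, dtype=float)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta) ** 2                 # distance term
        diff2 = ((dose_eval - di) / dose_norm) ** 2   # dose-difference term
        gam[i] = np.sqrt(np.min(dist2 + diff2))       # minimize over eval points
    return gam

x = np.linspace(0.0, 10.0, 101)            # positions in mm
ref = np.exp(-((x - 5.0) ** 2) / 4.0)      # toy reference dose profile
ev = np.exp(-((x - 5.1) ** 2) / 4.0)       # profile shifted by 0.1 mm
g = gamma_1d(x, ref, ev)
passing_rate = 100.0 * np.mean(g <= 1.0)
```

A 0.1 mm shift is well inside the 2 mm distance tolerance, so every point passes.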
The OSIRIS Weight of Evidence approach: ITS for the endpoints repeated-dose toxicity (RepDose ITS).
Tluczkiewicz, Inga; Batke, Monika; Kroese, Dinant; Buist, Harrie; Aldenberg, Tom; Pauné, Eduard; Grimm, Helvi; Kühne, Ralph; Schüürmann, Gerrit; Mangelsdorf, Inge; Escher, Sylvia E
2013-11-01
In the FP6 European project OSIRIS, Integrated Testing Strategies (ITSs) for relevant toxicological endpoints were developed to avoid new animal testing and thus to reduce time and costs. The present paper describes the development of an ITS for repeated-dose toxicity, called RepDose ITS, which evaluates the conditions under which in vivo non-guideline studies are reliable. In a tiered approach, three aspects of these "non-guideline" studies are assessed: the documentation of the study (reliability), the quality of the study design (adequacy), and the scope of examination (validity). The reliability is addressed by the method "Knock-out criteria", which consists of four essential criteria for repeated-dose toxicity studies. A second tool, termed QUANTOS (Quality Assessment of Non-guideline Toxicity Studies), evaluates and weights the adequacy of the study by using intra-criterion and inter-criteria weighting. Finally, the Coverage approach calculates the probability that the detected Lowest-Observed-Effect-Level (LOEL) is similar to the LOEL of a guideline study, depending on the examined targets and organs of the non-guideline study. If the validity and adequacy of the non-guideline study are insufficient for risk assessment, the ITS proposes to apply a category approach or the Threshold of Toxicological Concern (TTC) concept, and, only as a last resort, new animal testing.
Bell Could Become the Copernicus of Probability
Khrennikov, Andrei
2016-07-01
Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
Application of Risk Probability Evaluation Method to Offshore Platform Construction
YU Jianxing; TAN Zhendong
2005-01-01
Offshore project risk involves many influencing factors with complex relationships, and traditional methods cannot be used to evaluate risk probability. To deal with this problem, a new method was developed by combining an improved technique for order preference by similarity to ideal solution (TOPSIS), the analytic hierarchy process (AHP), and the network response surface method. The risk probability was calculated by network response surface analysis based on the state variable of a known event and its degree of membership. This quantification method was applied to an offshore platform project, the Bonan oil and gas field project in Bohai Bay, in June 2004. There were 7 sub-projects, each including 4 risk factors. The values of the 28 risk factors, ranging from 10^-6 to 10^-4, were obtained. This precision conforms to the international as-low-as-reasonably-practicable principle. The evaluation indicates that the comprehensive level of the construction group and the ability of technical personnel on the spot rank relatively high among all risk factors, so these two factors should receive particular attention in offshore platform construction.
Cell-cycle times and the tumour control probability.
Maler, Adrian; Lutscher, Frithjof
2010-12-01
Mechanistic dynamic cell population models for the tumour control probability (TCP) to date have used a simplistic representation of the cell cycle: either an exponential cell-cycle time distribution (Zaider & Minerbo, 2000, Tumour control probability: a formulation applicable to any temporal protocol of dose delivery. Phys. Med. Biol., 45, 279-293) or a two-compartment model (Dawson & Hillen, 2006, Derivation of the tumour control probability (TCP) from a cell cycle model. Comput. Math. Methods Med., 7, 121-142; Hillen, de Vries, Gong & Yurtseven, 2009, From cell population models to tumour control probability: including cell cycle effects. Acta Oncol. (submitted)). Neither of these simplifications captures realistic cell-cycle time distributions, which are rather narrowly peaked around the mean. We investigate how including such distributions affects predictions of the TCP. At first, we revisit the so-called 'active-quiescent' model that splits the cell cycle into two compartments and explore how an assumption of compartmental independence influences the predicted TCP. Then, we formulate a deterministic age-structured model and a corresponding branching process. We find that under realistic cell-cycle time distributions, lower treatment intensities are sufficient to obtain the same TCP as in the aforementioned models with simplified cell cycles, as long as the treatment is constant in time. For fractionated treatment, the situation reverses such that under realistic cell-cycle time distributions, the model requires more intense treatment to obtain the same TCP.
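For orientation, the baseline Poisson TCP model that these cell-cycle formulations generalize (Zaider & Minerbo extend it to arbitrary temporal dose-delivery protocols) takes the standard textbook form, with $N_0$ initial clonogenic cells and linear-quadratic survival:

```latex
\mathrm{TCP} = \exp\!\bigl(-N_0\, S(D)\bigr), \qquad S(D) = e^{-\alpha D - \beta D^{2}},
```

where $\alpha$ and $\beta$ are the usual linear-quadratic parameters. This is background material, not the age-structured or branching-process model developed in the paper.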
Towards a Categorical Account of Conditional Probability
Robert Furber
2015-11-01
This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL probability wheel
Sheng-Cheng Huang
2016-01-01
A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Total variation denoising of probability measures using iterated function systems with probabilities
La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.
2017-01-01
In this paper we present a total variation denoising problem for probability measures using the set of fixed point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.
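The Collage Theorem bound invoked here is the standard one for contraction mappings: for a map $T$ with contractivity factor $c \in [0,1)$ on a complete metric space, with fixed point $\bar{x}$,

```latex
d(x, \bar{x}) \;\le\; \frac{1}{1-c}\, d\bigl(x, T x\bigr),
```

so minimizing the "collage distance" $d(x, Tx)$ over the IFSP probabilities controls the distance between the target measure and the attainable fixed-point measure. (This states the general theorem only; the paper's specific metric and operator are not given in the abstract.)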
Bayesian Probabilities and the Histories Algebra
Marlow, Thomas
2006-01-01
We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
Non-Boolean probabilities and quantum measurement
Niestegge, Gerd
2001-08-03
A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)
Data analysis recipes: Probability calculus for inference
Hogg, David W
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.
Spatial probability aids visual stimulus discrimination
Michael Druker
2010-08-01
We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
Some New Results on Transition Probability
Yu Quan XIE
2008-01-01
In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.
Probabilities are single-case, or nothing
Appleby, D M
2004-01-01
Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
Real analysis and probability solutions to problems
Ash, Robert P
1972-01-01
Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory. Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.
Melucci, Massimo
2012-01-01
Probabilistic models require the notion of an event space for defining a probability measure. An event space has a probability measure which obeys the Kolmogorov axioms. However, the probabilities observed from distinct sources, such as that of relevance of documents, may not admit a single event space, thus causing some issues. In this article, some results are introduced for ensuring whether the observed probabilities of relevance of documents admit a single event space. Moreover, an alternative framework of probability is introduced, thus challenging the use of classical probability for ranking documents. Some reflections on the convenience of extending the classical probabilistic retrieval toward a more general framework which encompasses these issues are made.
Acoustic dose and acoustic dose-rate.
Duck, Francis
2009-10-01
Acoustic dose is defined as the energy deposited by absorption of an acoustic wave per unit mass of the medium supporting the wave. Expressions for acoustic dose and acoustic dose-rate are given for plane-wave conditions, including temporal and frequency dependencies of energy deposition. The relationship between the acoustic dose-rate and the resulting temperature increase is explored, as is the relationship between acoustic dose-rate and radiation force. Energy transfer from the wave to the medium by means of acoustic cavitation is considered, and an approach is proposed in principle that could allow cavitation to be included within the proposed definitions of acoustic dose and acoustic dose-rate.
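Under the plane-wave conditions mentioned above, the textbook relation between intensity and heating connects the definition of acoustic dose-rate to temperature rise. Using the amplitude absorption coefficient $\alpha$, intensity $I$, density $\rho$, and specific heat capacity $c_p$ (standard symbols chosen here for illustration, not necessarily the paper's notation, and neglecting conduction and perfusion):

```latex
\dot{D} = \frac{2\alpha I}{\rho}, \qquad \left.\frac{dT}{dt}\right|_{t=0} \approx \frac{\dot{D}}{c_p} = \frac{2\alpha I}{\rho\, c_p}.
```

The factor 2 arises because intensity decays as $e^{-2\alpha x}$ when pressure amplitude decays as $e^{-\alpha x}$.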
Banerjee, D.; Bøtter-Jensen, L.; Murray, A.S.
2000-01-01
meets the fundamental requirement of the single-aliquot regenerative-dose protocol, in that any change in the luminescence recombination probability can be corrected for by using the OSL response to a fixed test dose. The response of a particular aliquot is examined after three different treatments...... temperature and test-dose size. Finally, dose-depth profiles are presented for two bricks. These profiles demonstrate that the high precisions (similar to 1%) obtained using the regenerative-dose protocol are reflected in smooth dose-depth dependencies. (C) 2000 Elsevier Science Ltd. All rights reserved....
The Role of Age on Dose Limiting Toxicities (DLTs) in Phase I Dose-escalation Trials
Schwandt, A; Harris, P. J.; Hunsberger, S.; Deleporte, A.; Smith, G. L.; Vulih, D.; Anderson, B. D.; Ivy, S. P.
2016-01-01
Purpose: Elderly oncology patients are not enrolled in early phase trials in proportion to the numbers of geriatric patients with cancer. There may be concern that elderly patients will not tolerate investigational agents as well as younger patients, resulting in a disproportionate number of dose-limiting toxicities (DLTs). Recent single-institution studies provide conflicting data on the relationship between age and DLT. Experimental Design: We retrospectively reviewed data about patients treated on single-agent, dose-escalation, phase I clinical trials sponsored by the Cancer Therapy Evaluation Program (CTEP) of the National Cancer Institute. Patients' dose levels were described as a percentage of the maximum tolerated dose (%MTD; the highest dose level at which <33% of patients had a DLT) or of the recommended phase II dose (RP2D). Mixed-effect logistic regression models were used to analyze relationships between the probability of a DLT and age and other explanatory variables. Results: Increasing dose, increasing age, and worsening performance status (PS) were significantly related to an increased probability of a DLT in this model (p<0.05). There was no association between dose level administered and age (p=0.57). Conclusions: This analysis of phase I dose-escalation trials, involving over 500 patients older than 70 years of age, is the largest reported. As age and dose level increased and PS worsened, the probability of a DLT increased. While increasing age was associated with occurrence of DLT, this risk remained within accepted thresholds of risk for phase I trials. There was no evidence of age bias on enrollment of patients at low or high dose levels. PMID:25028396
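The abstract's mixed-effect logistic model is not specified in detail; as a hedged sketch, a plain fixed-effects logistic regression of DLT probability on dose (as a fraction of MTD), age, and performance status can be fit by Newton-Raphson. All data and coefficients below are simulated for illustration, not CTEP data:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Maximum-likelihood logistic regression via Newton-Raphson.
    X includes an intercept column; returns the coefficient vector."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))          # fitted probabilities
        grad = X.T @ (y - p)                         # score vector
        hess = X.T @ (X * (p * (1.0 - p))[:, None])  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

# Simulated cohort: dose as fraction of MTD, standardized age, poor-PS flag
rng = np.random.default_rng(42)
n = 2000
dose = rng.uniform(0.2, 1.0, n)
age = rng.normal(0.0, 1.0, n)
poor_ps = rng.integers(0, 2, n).astype(float)
X = np.column_stack([np.ones(n), dose, age, poor_ps])
true_beta = np.array([-2.5, 2.0, 0.4, 0.8])   # invented effect sizes
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p_true).astype(float)

beta_hat = fit_logistic(X, y)
# Positive coefficients on dose, age, and poor PS mirror the abstract's
# qualitative finding: each raises the probability of a DLT.
```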
Comparing linear probability model coefficients across groups
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
Recent Developments in Applied Probability and Statistics
Devroye, Luc; Kohler, Michael; Korn, Ralf
2010-01-01
This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.
Examples of Neutrosophic Probability in Physics
Fu Yuhua
2015-01-01
This paper re-discusses the problems of the so-called “law of nonconservation of parity” and the “accelerating expansion of the universe”, and presents examples of determining the Neutrosophic Probability of the 1957 experiment of Chien-Shiung Wu et al., and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.
Average Transmission Probability of a Random Stack
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Analytical Study of Thermonuclear Reaction Probability Integrals
Chaudhry, M A; Mathai, A M
2000-01-01
An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
Average Transmission Probability of a Random Stack
Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg
2010-01-01
The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
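The distinction the abstract draws, averaging T itself rather than log T, can be illustrated with a toy Monte Carlo model (not the paper's recurrence relation): identical slabs of single-slab transmission t1 are composed through gaps that contribute a uniformly random round-trip phase. By the AM-GM inequality the arithmetic mean of T always exceeds the geometric mean exp(⟨ln T⟩), which is why the two averaging conventions disagree. All parameter values are illustrative.

```python
import math
import random

random.seed(42)

def stack_transmission(n_slabs, t1=0.8):
    """Compose n identical slabs (single-slab transmission t1) separated by
    gaps that each add a uniformly random phase (toy coherent model)."""
    r1 = 1.0 - t1
    t = t1
    for _ in range(n_slabs - 1):
        r = 1.0 - t
        phi = random.uniform(0.0, 2.0 * math.pi)
        # Airy-type two-mirror composition with a random round-trip phase
        t = t * t1 / (1.0 + r * r1 - 2.0 * math.sqrt(r * r1) * math.cos(phi))
    return t

samples = [stack_transmission(10) for _ in range(5000)]
avg_t = sum(samples) / len(samples)                            # <T>
avg_log_t = sum(math.log(s) for s in samples) / len(samples)   # <ln T>
print(avg_t, math.exp(avg_log_t))  # <T> exceeds exp(<ln T>) by AM-GM
```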
Probability of Grounding and Collision Events
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.
Probability of Grounding and Collision Events
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
Norris, David C
2017-01-01
Background. Absent adaptive, individualized dose-finding in early-phase oncology trials, subsequent 'confirmatory' Phase III trials risk suboptimal dosing, with resulting loss of statistical power and reduced probability of technical success for the investigational therapy. While progress has been made toward explicitly adaptive dose-finding and quantitative modeling of dose-response relationships, most such work continues to be organized around a concept of 'the' maximum tolerated dose (MTD). The purpose of this paper is to demonstrate concretely how the aim of early-phase trials might be conceived, not as 'dose-finding', but as dose titration algorithm (DTA)-finding. Methods. A Phase I dosing study is simulated, for a notional cytotoxic chemotherapy drug, with neutropenia constituting the critical dose-limiting toxicity. The drug's population pharmacokinetics and myelosuppression dynamics are simulated using published parameter estimates for docetaxel. The amenability of this model to linearization is explored empirically. The properties of a simple DTA targeting a neutrophil nadir of 500 cells/mm^3 using a Newton-Raphson heuristic are explored through simulation in 25 simulated study subjects. Results. Individual-level myelosuppression dynamics in the simulation model approximately linearize under simple transformations of neutrophil concentration and drug dose. The simulated dose titration exhibits largely satisfactory convergence, with great variance in individualized optimal dosing. Some titration courses exhibit overshooting. Conclusions. The large inter-individual variability in simulated optimal dosing underscores the need to replace 'the' MTD with an individualized concept, MTD_i. To illustrate this principle, the simplest possible DTA capable of realizing such a concept is demonstrated. Qualitative phenomena observed in this demonstration support discussion of the notion of tuning such algorithms. Although here illustrated specifically in relation to
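The titration idea can be sketched in a few lines. Under an assumed toy model in which log nadir falls linearly with dose (a stand-in for the paper's transformed myelosuppression dynamics, not its actual pharmacokinetic model), a secant-style Newton update drives each simulated subject's neutrophil nadir toward the 500 cells/mm^3 target. All parameter values and function names here are illustrative.

```python
import math

TARGET_NADIR = 500.0  # cells/mm^3, the dose-limiting neutropenia target

def nadir(dose, base, k):
    """Toy myelosuppression model (assumption): log-nadir is linear in dose."""
    return base * math.exp(-k * dose)

def titrate(base, k, d0=50.0, d1=80.0, n_cycles=5):
    """Secant-style Newton updates on log-nadir, one per treatment cycle."""
    y_target = math.log(TARGET_NADIR)
    y0, y1 = math.log(nadir(d0, base, k)), math.log(nadir(d1, base, k))
    for _ in range(n_cycles):
        if abs(y1 - y_target) < 1e-12 or d1 == d0:
            break  # converged; avoid a zero-division in the slope estimate
        slope = (y1 - y0) / (d1 - d0)       # estimated local dose sensitivity
        d0, y0 = d1, y1
        d1 = d1 + (y_target - y1) / slope   # Newton/secant step
        y1 = math.log(nadir(d1, base, k))
    return d1

# Two simulated subjects with different sensitivities reach very
# different individualized doses (the MTD_i idea):
print(titrate(base=5000.0, k=0.02), titrate(base=5000.0, k=0.04))
```

Because the toy model is exactly linear in log-nadir, the secant step converges almost immediately; the paper's interest lies in how such updates behave when the linearization is only approximate.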
Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.33 as warranting further attention. A Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
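The abstract's closing numbers can be reproduced with standard Cpk arithmetic (this is the generic capability-index relation, not the paper's stage-wise computations): for a centered, normally distributed attribute, the fraction outside the ±3·Cpk specification limits is 2Φ(−3·Cpk).

```python
from statistics import NormalDist

def defects_per_million(cpk):
    """Defect rate (ppm) for a centered normal process with capability cpk.
    Spec limits sit at +/- 3*cpk standard deviations from the process mean."""
    p_outside = 2.0 * NormalDist().cdf(-3.0 * cpk)
    return 1e6 * p_outside

def acceptance_probability(cpk):
    """Per-unit probability of meeting specification."""
    return 1.0 - defects_per_million(cpk) / 1e6

print(defects_per_million(4.0 / 3.0))     # about 63 ppm
print(acceptance_probability(4.0 / 3.0))  # exceeds 99.99%
```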
Laboratory-Tutorial activities for teaching probability
Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.
2006-01-01
We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture critical parts. The paper provides a discussion of optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
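The 29-flaw demonstration follows simple binomial logic: if all 29 flaws must be found, the probability of passing is p^29, so a procedure with a true POD of only 0.90 passes less than 5% of the time, which is what underwrites the usual 90/95 POD claim. A sketch (the n = 29, zero-miss rule is from the abstract; the function name is ours):

```python
from math import comb

def prob_pass_demo(pod, n_flaws=29, max_misses=0):
    """Probability of passing a POD demonstration: detect at least
    n_flaws - max_misses of n_flaws flaws, each found with probability pod."""
    return sum(
        comb(n_flaws, k) * pod**k * (1.0 - pod) ** (n_flaws - k)
        for k in range(n_flaws - max_misses, n_flaws + 1)
    )

print(prob_pass_demo(0.90))  # < 0.05: a 90%-POD procedure rarely passes 29/29
print(prob_pass_demo(0.98))  # a genuinely capable procedure passes far more often
```

Trading off PPD against POF while shrinking the flaw size is then an optimization over this kind of curve.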
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
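The root of the bias can be seen with a tiny simulation: when classification probabilities vary logit-normally across sampling units, the population-average probability differs from the probability at the average logit (Jensen's inequality), and the latter is what a homogeneous-probability model implicitly plugs in. The parameter values below are illustrative, not from the paper.

```python
import math
import random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

mu, sigma = 1.0, 1.5  # logit-scale mean and spread (illustrative)
draws = [sigmoid(random.gauss(mu, sigma)) for _ in range(100_000)]

mean_prob = sum(draws) / len(draws)  # E[sigmoid(X)]: the true population average
prob_at_mean = sigmoid(mu)           # sigmoid(E[X]): the homogeneous-model value
print(mean_prob, prob_at_mean)       # the gap is the source of the bias
```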
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2016-01-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is achieved by first checking such structures in covariant quantum mechanics, and then passing to spin foam models via the general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the Hilbert space of the canonical theory and the relevant quantum logical structure. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize property transitions and causality in this categorical context in connection with presheaves on quantaloids and respectively causal categories. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Project 2010 Project Management
Happy, Robert
2010-01-01
The ideal on-the-job reference guide for project managers who use Microsoft Project 2010. This must-have guide to using Microsoft Project 2010 is written from a real project manager's perspective and is packed with information you can use on the job. The book explores using Project 2010 during phases of project management, reveals best practices, and walks you through project flow from planning through tracking to closure. This valuable book follows the processes defined in the PMBOK Guide, Fourth Edition, and also provides exam prep for Microsoft's MCTS: Project 2010 certification. Explains
Evaluation of World Population-Weighted Effective Dose due to Cosmic Ray Exposure
Sato, Tatsuhiko
2016-01-01
After the release of the Report of the United Nations Scientific Committee of the Effects of Atomic Radiation in 2000 (UNSCEAR2000), it became commonly accepted that the world population-weighted effective dose due to cosmic-ray exposure is 0.38 mSv, with a range from 0.3 to 2 mSv. However, these values were derived from approximate projections of altitude and geographic dependences of the cosmic-ray dose rates as well as the world population. This study hence re-evaluated the population-weighted annual effective doses and their probability densities for the entire world as well as for 230 individual nations, using a sophisticated cosmic-ray flux calculation model in tandem with detailed grid population and elevation databases. The resulting world population-weighted annual effective dose was determined to be 0.32 mSv, which is smaller than the UNSCEAR’s evaluation by 16%, with a range from 0.23 to 0.70 mSv covering 99% of the world population. These values were noted to vary with the solar modulation condition within a range of approximately 15%. All assessed population-weighted annual effective doses as well as their statistical information for each nation are provided in the supplementary files annexed to this report. These data improve our understanding of cosmic-ray radiation exposures to populations globally. PMID:27650664
On the Percolation BCFT and the Crossing Probability of Watts
Ridout, David
2008-01-01
The logarithmic conformal field theory describing critical percolation is further explored using Watts' determination of the probability that there exists a cluster connecting both horizontal and vertical edges. The boundary condition changing operator which governs Watts' computation is identified with a primary field which does not fit naturally within the extended Kac table. Instead a "shifted" extended Kac table is shown to be relevant. Augmenting the previously known logarithmic theory based on Cardy's crossing probability by this field, a larger theory is obtained, in which new classes of indecomposable rank-2 modules are present. No rank-3 Jordan cells are yet observed. A highly non-trivial check of the identification of Watts' field is that no Gurarie-Ludwig-type inconsistencies are observed in this augmentation. The article concludes with an extended discussion of various topics related to extending these results including projectivity, boundary sectors and inconsistency loopholes.
LET-painting increases tumour control probability in hypoxic tumours.
Bassler, Niels; Toftegaard, Jakob; Lühr, Armin; Sørensen, Brita Singers; Scifoni, Emanuele; Krämer, Michael; Jäkel, Oliver; Mortensen, Lise Saksø; Overgaard, Jens; Petersen, Jørgen B
2014-01-01
LET-painting was suggested as a method to overcome tumour hypoxia. In vitro experiments have demonstrated a well-established relationship between the oxygen enhancement ratio (OER) and linear energy transfer (LET), where the OER approaches unity for high LET values. However, high-LET radiation also increases the risk of side effects in normal tissue. LET-painting attempts to restrict high-LET radiation to compartments that are found to be hypoxic, while applying lower-LET radiation to normoxic tissues. Methods. Carbon-12 and oxygen-16 ion treatment plans, with four fields and homogeneous dose in the target volume, are applied to an oropharyngeal cancer case with an identified hypoxic entity within the tumour. The target dose is optimised to achieve a tumour control probability (TCP) of 95% when assuming fully normoxic tissue. Using the same primary particle energy fluence needed for this plan, TCP is recalculated for three cases assuming hypoxia: first, LET is redistributed to match the hypoxic structure (LET-painting); second, plans are recalculated for varying hypoxic tumour volumes in order to investigate the threshold volume at which TCP can be established; finally, a slight dose boost (5-20%) is additionally allowed in the hypoxic subvolume to assess its impact on TCP. Results. LET-painting with carbon-12 ions can only achieve tumour control for hypoxic subvolumes smaller than 0.5 cm^3. Using oxygen-16 ions, tumour control can be achieved for tumours with hypoxic subvolumes of up to 1 or 2 cm^3. Tumour control can be achieved for tumours with even larger hypoxic subvolumes if a slight dose boost is allowed in combination with LET-painting. Conclusion. Our findings clearly indicate that a substantial increase in tumour control can be achieved when applying the LET-painting concept using oxygen-16 ions on hypoxic tumours, ideally with a slight dose boost.
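The qualitative TCP behavior described above can be sketched with a generic Poisson/linear-quadratic model (not the article's treatment-planning calculation): dividing the effective dose in a hypoxic compartment by an OER near 2.8 collapses TCP unless that compartment receives extra dose or lower-OER (high-LET) radiation. All parameter values are illustrative.

```python
import math

def tcp(dose_gy, n_clonogens=1e7, alpha=0.35, oer=1.0):
    """Poisson TCP with an alpha-only LQ survival curve.
    oer > 1 models hypoxia by scaling down the effective dose."""
    surviving = n_clonogens * math.exp(-alpha * dose_gy / oer)
    return math.exp(-surviving)

dose = 54.5  # Gy; chosen so the normoxic TCP is roughly 95% in this toy model
print(tcp(dose))                 # normoxic: high control probability
print(tcp(dose, oer=2.8))        # same dose, hypoxic compartment: control lost
print(tcp(dose * 2.8, oer=2.8))  # compensating the effective dose restores control
```

High-LET radiation plays the role of lowering the effective OER rather than raising the physical dose, which is the trade the LET-painting concept exploits.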
Survival probability in patients with liver trauma.
Buci, Skender; Kukeli, Agim
2016-08-01
Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using anatomical and physiological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - Where it predicts correctly, the logistic model shows that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma to the liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
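A logistic model of the kind described maps a linear score of patient covariates to a survival probability between 0 and 1. The coefficients below are invented for illustration; only the functional form and the covariate types are taken from the abstract.

```python
import math

def survival_probability(injury_grade, multi_organ, days_hospitalized, surgical):
    """Logistic survival model; coefficients are hypothetical, not the paper's fit."""
    score = (
        3.2
        - 0.8 * injury_grade        # higher trauma grade lowers survival
        - 0.9 * multi_organ         # liver plus other organs injured (0/1)
        + 0.05 * days_hospitalized  # longer stays mark survivors in this toy score
        - 0.4 * surgical            # 1 = operative, 0 = conservative management
    )
    return 1.0 / (1.0 + math.exp(-score))

print(survival_probability(2, 0, 7, 0))  # milder injury, conservative care
print(survival_probability(5, 1, 3, 1))  # severe multi-organ injury, surgery
```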
Finite doses are employed in experimental toxicology studies. Under the traditional methodology, the point of departure (POD) value for low dose extrapolation is identified as one of these doses. Dose spacing necessarily precludes a more accurate description of the POD value. ...
Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)
2015-09-01
This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.
Probability an introduction with statistical applications
Kinney, John J
2014-01-01
Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h
Pre-aggregation for Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate ... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
Are All Probabilities Fundamentally Quantum Mechanical?
Pradhan, Rajat Kumar
2011-01-01
The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
2014-01-01
We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with popular non-expected utility preference representations that satisfy weak conditions.
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Computation of the Complex Probability Function
Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-22
The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the n-th degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
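The quadrature the report discusses can be stated compactly: for Im z > 0, w(z) = (i/π) ∫ exp(−t²)/(z − t) dt, and Gauss-Hermite quadrature replaces the integral by a weighted sum over the Hermite roots. A minimal sketch (the node count and the asymptotic spot check are ours, not the report's):

```python
import math

import numpy as np

def complex_probability(z, n=64):
    """Gauss-Hermite approximation of w(z) = (i/pi) * ∫ exp(-t^2)/(z-t) dt.
    Valid for Im z > 0; accuracy degrades as z approaches the real axis,
    which is one of the shortcomings the report discusses."""
    nodes, weights = np.polynomial.hermite.hermgauss(n)
    return (1j / math.pi) * np.sum(weights / (z - nodes))

# Spot check against the large-|z| asymptote w(z) ~ i/(sqrt(pi)*z):
z = 10j
print(complex_probability(z), 1j / (math.sqrt(math.pi) * z))  # closely agree
```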
Eliciting Subjective Probabilities with Binary Lotteries
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with certain Non-Expected Utility preference representations that satisfy weak conditions that we identify.
Probability amplitude in quantum like games
Grib, A A; Starkov, K
2003-01-01
Examples of games between two partners with mixed strategies, calculated by the use of the probability amplitude, are given. The first game is described by the quantum formalism of a spin one-half system for which two noncommuting observables are measured. The second game corresponds to the spin-one case. Quantum logical orthocomplemented nondistributive lattices for these two games are presented. Interference terms for the probability amplitudes are analyzed using the so-called contextual approach to probability (in the von Mises frequency approach). We underline that our games are not based on the use of microscopic systems. The whole scenario is macroscopic.
Basic Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems--as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability theory. These key concepts are all presented in the first
Advanced Probability Theory for Biomedical Engineers
Enderle, John
2006-01-01
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob
Comparing linear probability model coefficients across groups
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
Pre-aggregation for Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....
Handbook of probability theory and applications
Rudas, Tamas
2008-01-01
""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari
Reduced reward-related probability learning in schizophrenia patients
Yılmaz A
2012-01-01
Alpaslan Yilmaz,1,2 Fatma Simsek,2 Ali Saffet Gonul2,3; 1Department of Sport and Health, Physical Education and Sports College, Erciyes University, Kayseri, Turkey; 2Department of Psychiatry, SoCAT Lab, Ege University School of Medicine, Bornova, Izmir, Turkey; 3Department of Psychiatry and Behavioral Sciences, Mercer University School of Medicine, Macon, GA, USA. Abstract: Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale but that it was not related to antipsychotic dose. In conclusion, the present study showed that the schizophrenia patients had impaired reward-based learning and that this was independent of their medication status. Keywords: reinforcement learning, reward, punishment, motivation
NONE
2015-07-01
In 2011, a specific agreement was signed between the Nuclear Safety Council and the University of Malaga to carry out a survey of the radiology procedures used in Spanish health centers, their frequency, and the doses received by patients. (Author)
Zika Probably Not Spread Through Saliva: Study
Research with ... (HealthDay News) -- Scientists have some interesting news about Zika: You're unlikely to get the virus from ...
Teaching Elementary Probability Through its History.
Kunoff, Sharon; Pines, Sylvia
1986-01-01
Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)
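The problem of points mentioned above has a compact closed-form solution from the Pascal-Fermat correspondence: if the match is interrupted, play can be decided in at most a + b - 1 further rounds, so player A's fair share is a binomial tail probability. A minimal sketch (the function name and the fair-coin example are illustrative, not from the article):

```python
from math import comb

def points_share(a_needs, b_needs, p=0.5):
    """Probability that player A wins an interrupted match when A needs
    a_needs more points, B needs b_needs, and A wins each round with
    probability p: at least a_needs successes in the a_needs + b_needs - 1
    rounds that suffice to decide the match."""
    n = a_needs + b_needs - 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(a_needs, n + 1))

print(points_share(2, 3))  # classic Pascal-Fermat case: 11/16 = 0.6875
```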
Probability and statistics with integrated software routines
Deep, Ronald
2005-01-01
Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods
Encounter Probability of Individual Wave Height
Liu, Z.; Burcharth, H. F.
1998-01-01
wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
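The two quantities this abstract works with can be sketched numerically: the encounter probability of a T-year significant wave height within a structure lifetime, and the exceedance probability of an individual wave height among N Rayleigh-distributed waves. A hedged illustration assuming i.i.d. annual maxima (function names and example values are mine, not the paper's):

```python
import math

def encounter_probability(return_period, lifetime):
    """Probability that the T-year event is exceeded at least once during
    a lifetime of L years, assuming independent annual maxima."""
    return 1.0 - (1.0 - 1.0 / return_period) ** lifetime

def individual_wave_exceedance(h, hs, n_waves):
    """Probability that at least one of n_waves individual waves exceeds h,
    with heights Rayleigh-distributed given significant wave height hs."""
    p_single = math.exp(-2.0 * (h / hs) ** 2)
    return 1.0 - (1.0 - p_single) ** n_waves

print(round(encounter_probability(100, 50), 3))  # ~0.395
```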
Pre-Aggregation with Probability Distributions
Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach
2006-01-01
Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....
Modelling the probability of building fires
Vojtěch Barták
2014-12-01
Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
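The logistic-regression step described here can be sketched as follows; the covariates and coefficient values below are purely illustrative placeholders, since the fitted model parameters are not given in the abstract:

```python
import math

# Hypothetical fitted coefficients, purely for illustration; the study's
# actual covariates and estimates are not reported in the abstract.
COEF = {"intercept": -6.0, "age": 0.02, "area": 0.15, "solid_fuel": 0.8}

def fire_probability(age, area, solid_fuel):
    """Logistic-regression probability that a building experiences a fire:
    age in years, area in units of 100 m^2, solid_fuel a 0/1 heating flag."""
    z = (COEF["intercept"] + COEF["age"] * age
         + COEF["area"] * area + COEF["solid_fuel"] * solid_fuel)
    return 1.0 / (1.0 + math.exp(-z))

print(fire_probability(age=60, area=4.0, solid_fuel=1))
```

Evaluating this function over every building in a region, then coloring each building by its predicted probability, yields the probability maps the abstract describes.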
Probability calculations under the IAC hypothesis
Wilson, Mark C; 10.1016/j.mathsocsci.2007.05.003
2012-01-01
We show how powerful algorithms recently developed for counting lattice points and computing volumes of convex polyhedra can be used to compute probabilities of a wide variety of events of interest in social choice theory. Several illustrative examples are given.
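Under the IAC (Impartial Anonymous Culture) hypothesis every anonymous profile is equally likely, which corresponds to uniform sampling over stars-and-bars compositions of the voter count. A Monte Carlo sketch for a classic event of interest, the absence of a Condorcet winner with three candidates (the paper computes such probabilities exactly by lattice-point counting; this simulation is only an illustration):

```python
import random
from itertools import permutations

CANDS = "ABC"
ORDERS = list(permutations(CANDS))  # the 6 strict preference orders

def random_iac_profile(n, rng):
    """Draw voter counts for the 6 orders uniformly over all anonymous
    profiles of n voters (stars and bars), as the IAC hypothesis assumes."""
    cuts = sorted(rng.sample(range(n + 5), 5))
    parts, prev = [], -1
    for c in cuts + [n + 5]:
        parts.append(c - prev - 1)
        prev = c
    return dict(zip(ORDERS, parts))

def has_condorcet_winner(profile):
    total = sum(profile.values())
    def beats(x, y):
        v = sum(cnt for order, cnt in profile.items()
                if order.index(x) < order.index(y))
        return 2 * v > total
    return any(all(beats(x, y) for y in CANDS if y != x) for x in CANDS)

rng = random.Random(1)
trials = 20000
paradox = sum(not has_condorcet_winner(random_iac_profile(101, rng))
              for _ in range(trials)) / trials
print(paradox)  # close to the known IAC limit of 1/16 = 0.0625
```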
Inclusion probability with dropout: an operational formula.
Milot, E; Courteau, J; Crispino, F; Mailly, F
2015-05-01
In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
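For reference, the classic no-dropout baseline of the quantities involved looks like this; the paper's contribution is the dropout extension, which is more involved, so this sketch (with assumed allele frequencies) only shows the standard RMNE calculation:

```python
def locus_pi(visible_allele_freqs):
    """Classic one-locus probability of inclusion (RMNE) without dropout:
    the chance a random person carries only alleles visible in the mixture."""
    return sum(visible_allele_freqs) ** 2

def cumulative_pi(loci):
    """Combined PI across independent loci (product rule)."""
    cpi = 1.0
    for freqs in loci:
        cpi *= locus_pi(freqs)
    return cpi

# Two loci whose visible alleles have frequency sums 0.6 and 0.5:
print(cumulative_pi([[0.2, 0.4], [0.1, 0.4]]))  # 0.36 * 0.25 = 0.09
```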
Survival probability for open spherical billiards
Dettmann, Carl P.; Rahman, Mohammed R.
2014-12-01
We study the survival probability for long times in an open spherical billiard, extending previous work on the circular billiard. We provide details of calculations regarding two billiard configurations, specifically a sphere with a circular hole and a sphere with a square hole. The constant terms of the long-time survival probability expansions have been derived analytically. Terms that vanish in the long time limit are investigated analytically and numerically, leading to connections with the Riemann hypothesis.
Data analysis recipes: Probability calculus for inference
Hogg, David W.
2012-01-01
In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...
Ruin Probability in Linear Time Series Model
ZHANG Lihong
2005-01-01
This paper analyzes a continuous-time risk model in which the claim process is described by a linear time series model. Time is discretized stochastically at the instants when claims occur, and Doob's stopping time theorem and martingale inequalities are used to obtain expressions for the ruin probability, as well as exponential and non-exponential upper bounds on the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
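The surplus process and exponential (Lundberg-type) bound behind such results can be illustrated by Monte Carlo. The sketch below uses the classical compound-Poisson model with exponential claims as a stand-in; the parameter values and claim-size assumption are mine for illustration, not the linear time series model analyzed in the paper:

```python
import math
import random

def ruin_probability_mc(u, lam, mu, c, horizon, trials, seed=0):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    compound-Poisson surplus process: initial capital u, claim rate lam,
    exponential claim mean mu, premium rate c."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)   # waiting time to next claim
            if t > horizon:
                break                   # survived the horizon
            claims += rng.expovariate(1.0 / mu)
            if u + c * t - claims < 0.0:
                ruined += 1
                break
    return ruined / trials

u, lam, mu, c = 5.0, 1.0, 1.0, 1.2
est = ruin_probability_mc(u, lam, mu, c, horizon=200.0, trials=4000)
lundberg = math.exp(-(1.0 / mu - lam / c) * u)  # exponential upper bound
print(est, lundberg)
```

For exponential claims the adjustment coefficient is R = 1/mu - lam/c, so the estimate should fall below the Lundberg bound exp(-R u).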
Representing Uncertainty by Probability and Possibility
Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer...
De Finetti's contribution to probability and statistics
Cifarelli, Donato Michele; Regazzini, Eugenio
1996-01-01
This paper summarizes the scientific activity of de Finetti in probability and statistics. It falls into three sections: Section 1 includes an essential biography of de Finetti and a survey of the basic features of the scientific milieu in which he took the first steps of his scientific career; Section 2 concerns de Finetti's work in probability: (a) foundations, (b) processes with independent increments, (c) sequences of exchangeable random variables, and (d) contributions which fall within ...
Characteristic Functions over C*-Probability Spaces
王勤; 李绍宽
2003-01-01
Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.
Imprecise Probability Methods for Weapons UQ
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-13
Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
Probability, clinical decision making and hypothesis testing
A Banerjee
2009-01-01
Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.
Probability and statistics for computer science
Johnson, James L
2011-01-01
Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.
Site occupancy models with heterogeneous detection probabilities
Royle, J. Andrew
2006-01-01
Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
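The zero-inflated binomial mixture likelihood has a convenient closed form when the mixing distribution for p is a beta, one of the mixture classes of the kind the article considers. A sketch with illustrative parameter values:

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """Log of the Beta function via log-gamma."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binomial_pmf(y, J, a, b):
    """P(y detections in J visits) when detection prob p ~ Beta(a, b)."""
    return comb(J, y) * exp(log_beta(a + y, b + J - y) - log_beta(a, b))

def site_likelihood(y, J, psi, a, b):
    """Zero-inflated beta-binomial likelihood for one site: occupied with
    probability psi; detection heterogeneity enters via the beta mixture."""
    lik = psi * beta_binomial_pmf(y, J, a, b)
    if y == 0:
        lik += 1.0 - psi  # an unoccupied site yields zero detections for sure
    return lik

print(site_likelihood(0, 5, psi=0.6, a=2.0, b=2.0))  # ≈ 0.4643
```

Multiplying site_likelihood over all surveyed sites gives the integrated likelihood that is maximized to estimate psi and the mixture parameters.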
Translating Climate-Change Probabilities into Impact Risks - Overcoming the Impact- Model Bottleneck
Dettinger, M.
2008-12-01
Projections of climate change in response to increasing greenhouse-gas concentrations are uncertain and likely to remain so for the foreseeable future. As more projections become available for analysts, we are increasingly able to characterize the probabilities of obtaining various levels of climate change in current projections. However, the probabilities of most interest in impact assessments are not the probabilities of climate changes, but rather the probabilities (or risks) of various levels and kinds of climate-change impact. These risks can be difficult to estimate even if the climate-change probabilities are well known. The difficulty arises because, frequently, impact models and assessments are computationally demanding or require time-consuming, hands-on analysis by human experts, so that severe limits are placed on the number of climate-change scenarios for which detailed impacts can be assessed. With only a few resulting examples, estimating the risks of various impacts is generally difficult. However, real-world examples from the water-resources sector will be used to show that, by applying several different "derived distributions" approaches for estimating the risks of various impacts from known climate-change probabilities to just a few impact-model simulations, risks can be estimated along with indications of how accurate the impact-risk estimates are. The prospects for a priori selection of a few climate-change scenarios (from a larger ensemble of available projections) that will allow the best, most economical estimates of impact risks will be explored with a simple but real-world example.
New Approach to Total Dose Specification for Spacecraft Electronics
Xapsos, Michael
2017-01-01
Variability of the space radiation environment is investigated with regard to total dose specification for spacecraft electronics. It is shown to have a significant impact. A new approach is developed for total dose requirements that replaces the radiation design margin concept with failure probability during a mission.
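The shift from a fixed radiation design margin to a mission failure probability can be sketched as follows; the lognormal environment and normally distributed part capability, and all numbers, are illustrative assumptions rather than the paper's models:

```python
import math
import random

def failure_probability(env_median, env_sigma_ln, cap_mean, cap_sigma,
                        trials=100000, seed=42):
    """Probability that the mission total dose exceeds part capability,
    with a lognormal dose environment and a normally distributed part
    failure level (illustrative distributions, not the paper's)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        dose = env_median * math.exp(rng.gauss(0.0, env_sigma_ln))
        capability = rng.gauss(cap_mean, cap_sigma)
        if dose > capability:
            fails += 1
    return fails / trials

print(failure_probability(10.0, 0.5, 30.0, 5.0))  # doses in krad(Si), say
```

Unlike a single design margin, this number responds directly to the spread of the environment: doubling the median mission dose raises the failure probability even if the margin-to-median is unchanged.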
[Absorbed doses in dental radiology].
Bianchi, S D; Roccuzzo, M; Albrito, F; Ragona, R; Anglesio, S
1996-01-01
The growing use of dento-maxillo-facial radiographic examinations has been accompanied by the publication of a large number of studies on dosimetry. A thorough review of the literature is presented in this article. Most studies were carried out on tissue-equivalent skull phantoms, while only a few were in vivo. The aim of the present study was to evaluate in vivo absorbed doses during Orthopantomography (OPT), Full Mouth Periapical Examination (FMPE) and Intraoral Tube Panoramic Radiography (ITPR). Measurements were made on 30 patients, reproducing clinical conditions, in 46 anatomical sites, with 24 intra- and 22 extra-oral thermoluminescent dosimeters (TLDs). The highest doses were measured, in orthopantomography, at the right mandibular angle (1899 μGy); in FMPE, on the right naso-labial fold (5640 μGy); and in ITPR, on the palatal surface of the left second upper molar (1936 μGy). Intraoral doses ranged from 21 μGy, in orthopantomography, to 4494 μGy in FMPE. Standard errors ranged from 142% in ITPR to 5% in orthopantomography; the highest standard errors were found in FMPE and ITPR. The data collected in this trial are in agreement with major literature reports; disagreements are probably due to differences in examination technique and data collection. Such differences, which emerge when comparing several sites, explain the lower doses found in FMPE and ITPR. Advantages and disadvantages of in vivo dosimetry of the maxillary region are discussed, the former being its close resemblance to clinical conditions of examination and the latter the impossibility of measuring doses at depth in the tissues. Finally, both ITPR and FMPE required lower doses than expected and can therefore be reconsidered with respect to their radiation risk.
Tsunami probability in the Caribbean region
Parsons, T.; Geist, E. L.
2008-12-01
We calculated tsunami runup probability at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. A remarkable ~500-year empirical record was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c=0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack back-arc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30 percent regionally.
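Under the Poissonian model in this abstract, each coastal cell reduces to one number, its mean runup exceedance rate, and the 30-year probability follows directly. A minimal sketch with an illustrative rate:

```python
import math

def runup_probability(mean_rate_per_year, years):
    """Poisson probability of at least one tsunami runup exceeding the
    threshold at a coastal cell within the given exposure time."""
    return 1.0 - math.exp(-mean_rate_per_year * years)

# A cell with one exceedance per 500 years, over a 30-year exposure:
print(round(runup_probability(1 / 500, 30), 4))  # ~0.0582
```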
Domingo, C., E-mail: carles.domingo@uab.ca [Grup de Fisica de les Radiacions, Departament de Fisica. Edifici C, Campus UAB, Universitat Autonoma de Barcelona, E-08193 Bellaterra (Spain); Garcia-Fuste, M.J.; Morales, E.; Amgarou, K. [Grup de Fisica de les Radiacions, Departament de Fisica. Edifici C, Campus UAB, Universitat Autonoma de Barcelona, E-08193 Bellaterra (Spain); Terron, J.A. [Servicio de Radiofisica, Hospital Universitario Virgen Macarena. E- 41009 Sevilla. Spain (Spain); Rosello, J.; Brualla, L. [ERESA, Avda. Tres Cruces s/n. E-46014 Valencia (Spain); Nunez, L. [Servicio de Radiofisica, Hospital. Puerta de Hierro. E-28222 Majadahonda (Spain); Colmenares, R. [Serv. de Oncologia Radioterapica, Hosp. Ramon y Cajal, E-28049 Madrid (Spain); Gomez, F. [Dpto. de Particulas. Univ. de Santiago. E-15782 Santiago de Compostela. Spain (Spain); Hartmann, G.H. [DKFZ E0400 Im Neuenheimer Feld 280. D-69120 Heidelberg (Germany) (Germany); Sanchez-Doblado, F. [Servicio de Radiofisica, Hospital Universitario Virgen Macarena. E- 41009 Sevilla. Spain (Spain); Dpto. de Fisiologia Medica y Biofisica. Universidad de Sevilla. E-41009 Sevilla. Spain (Spain); Fernandez, F. [Grup de Fisica de les Radiacions, Departament de Fisica. Edifici C, Campus UAB, Universitat Autonoma de Barcelona, E-08193 Bellaterra (Spain); Consejo de Seguridad Nuclear, Justo Dorado 11 E-28040 Madrid (Spain)
2010-12-15
A project has been set up to study the effect on a radiotherapy patient of the neutrons produced around the LINAC accelerator head by photonuclear reactions induced by photons above ~8 MeV. These neutrons may reach the patient directly, or they may interact with the surrounding materials until they become thermalised, scattering all over the treatment room and affecting the patient as well, contributing to the peripheral dose. Spectrometry was performed with a calibrated and validated set of Bonner spheres at a point located 50 cm from the isocenter, as well as at the place where a digital device for measuring neutrons, based on the upset of SRAM memories induced by thermal neutrons, is located inside the treatment room. Exposures took place in six LINAC accelerators with different energies (from 15 to 23 MV) with the aim of relating the spectrometer measurements to the readings of the digital device under various exposure and room geometry conditions. The final purpose of the project is to be able to relate, under any given treatment condition and room geometry, the readings of this digital device to patient neutron effective dose and peripheral dose in organs of interest. This would allow inferring the probability of developing second malignancies as a consequence of the treatment. Results indicate that unit neutron fluence spectra at 50 cm from the isocenter do not depend on accelerator characteristics, while spectra at the place of the digital device are strongly influenced by the treatment room geometry.
Measurement of entrance skin dose and estimation of organ dose during pediatric chest radiography.
Kumaresan, M; Kumar, Rajesh; Biju, K; Choubey, Ajay; Kantharia, S
2011-06-01
Entrance skin dose (ESD) was measured to calculate the organ doses from the anteroposterior (AP) and posteroanterior (PA) chest x-ray projections for pediatric patients in an Indian hospital. High-sensitivity tissue-equivalent thermoluminescent dosimeters (TLD, LiF:Mg,Cu,P chips) were used for measuring entrance skin dose. The respective organ doses were calculated using the Monte Carlo method (MCNP 3.1) to simulate the examination set-up and a three-dimensional mathematical phantom representing an average 5-year-old Indian child. Using this method, conversion coefficients were derived for translating the measured ESD to organ doses. The average measured ESDs for the chest AP and PA projections were 0.305 mGy and 0.171 mGy, respectively. The average calculated organ doses in the AP and the PA projections were 0.196 and 0.086 mSv for the thyroid, 0.167 and 0.045 mSv for the trachea, 0.078 and 0.043 mSv for the lungs, 0.110 and 0.013 mSv for the liver, 0.002 and 0.016 mSv for the bone marrow, 0.024 and 0.002 mSv for the kidneys, and 0.109 and 0.023 mSv for the heart, respectively. The ESD and organ doses can be reduced significantly with the proper radiological technique. According to these results, the chest PA projection should be preferred over the AP projection in pediatric patients. The estimated organ doses for the chest AP and PA projections can be used for the estimation of the associated risk.
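The conversion step is simple arithmetic once the Monte Carlo-derived coefficients exist. In this sketch the coefficients are back-calculated from the AP thyroid and lung numbers quoted above, just to show the mechanics; the study's actual coefficients come from the MCNP phantom simulation:

```python
# Mean AP entrance skin dose (mGy) from the study, with conversion
# coefficients (mSv per mGy of ESD) back-calculated here from the quoted
# AP organ doses purely for illustration.
ESD_AP = 0.305
COEFF_AP = {"thyroid": 0.196 / 0.305, "lungs": 0.078 / 0.305}

def organ_dose(esd_mgy, coeff):
    """Organ dose (mSv) = entrance skin dose (mGy) x conversion coefficient."""
    return esd_mgy * coeff

print(round(organ_dose(ESD_AP, COEFF_AP["thyroid"]), 3))  # 0.196
```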