Converting dose distributions into tumour control probability
International Nuclear Information System (INIS)
Nahum, A.E.
1996-01-01
The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP is shown, as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
Converting dose distributions into tumour control probability
Energy Technology Data Exchange (ETDEWEB)
Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics
1996-08-01
The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP is shown, as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.
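The population-averaged TCP model described in the two abstracts above can be sketched numerically. The sketch below assumes a simple Poisson/linear-quadratic form (α term only) with illustrative parameter values — the clonogen number, mean α, and σ_α are placeholders, not the fitted values from the paper:

```python
import numpy as np

def tcp_single(alpha, dose, n_clonogens):
    """Poisson TCP for a uniformly irradiated tumour (LQ model, alpha term only)."""
    surviving = n_clonogens * np.exp(-alpha * dose)
    return np.exp(-surviving)

def tcp_population(dose, n_clonogens=1e7, alpha_mean=0.3, alpha_sd=0.06,
                   n_samples=20000, seed=0):
    """Average TCP over a normally distributed radiosensitivity alpha (truncated at 0)."""
    rng = np.random.default_rng(seed)
    alphas = rng.normal(alpha_mean, alpha_sd, n_samples)
    alphas = alphas[alphas > 0]          # discard unphysical negative sensitivities
    return float(tcp_single(alphas, dose, n_clonogens).mean())
```

Sweeping `dose` over 40–80 Gy with and without `alpha_sd` reproduces the qualitative effect the abstract describes: inter-patient spread in α flattens the otherwise unrealistically steep population dose-response curve.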
Failure-probability driven dose painting
International Nuclear Information System (INIS)
Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.; Aznar, Marianne C.; Kristensen, Claus A.; Rasmussen, Jacob; Specht, Lena; Berthelsen, Anne K.; Bentzen, Søren M.
2013-01-01
Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose–response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%), as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
Failure-probability driven dose painting
DEFF Research Database (Denmark)
Vogelius, Ivan R; Håkansson, Katrin; Due, Anne K
2013-01-01
To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study...
Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Cannon, S.D.; Finch, S.M.
1992-10-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The independent Technical Steering Panel (TSP) provides technical direction. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates
Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Finch, S.M.; McMakin, A.H.
1991-04-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates
Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Finch, S.M.; McMakin, A.H.
1992-06-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Battelle Pacific Northwest Laboratories under contract with the Centers for Disease Control. The independent Technical Steering Panel (TSP) provides technical direction. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; environmental pathways and dose estimates
Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa
2011-01-01
This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
McMakin, A.H.; Cannon, S.D.; Finch, S.M.
1992-07-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed technical members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates. Progress is discussed
International Nuclear Information System (INIS)
Zhou, J; Ding, X; Liang, J; Zhang, J; Wang, Y; Yan, D
2016-01-01
Purpose: With energy repainting in lung IMPT, the delivered dose is approximately the convolution of the dose in each phase with the corresponding breathing PDF. This study computes the breathing-PDF-weighted 4D dose in lung IMPT treatment and compares it to the initial robust plan. Methods: Six lung patients were evaluated in this study. Amsterdam shroud images were generated from pre-treatment 4D cone-beam projections. The diaphragm motion curve was extracted from the shroud image and the breathing PDF was generated. Each patient was planned to 60 Gy (12 Gy × 5). In the initial plans, the ITV density on the average CT was overridden with its maximum value for planning, using two IMPT beams with robust optimization (5 mm uncertainty in patient position and 3.5% range uncertainty). The plan was applied to all 4D CT phases. The dose in each phase was deformed to a reference phase. The 4D dose was reconstructed by summing all these doses with the corresponding weighting from the PDF. Plan parameters, including maximum dose (Dmax), ITV V100, homogeneity index (HI = D2/D98), R50 (50% IDL/ITV), and the lung-GTV V12.5 and V5, were compared between the reconstructed 4D dose and the initial plans. Results: Dmax is significantly lower in the reconstructed 4D dose, 68.12±3.5 Gy vs. 70.1±4.3 Gy in the initial plans (p=0.015). No significant difference is found for the ITV V100, HI, and R50: 92.2%±15.4% vs. 96.3%±2.5% (p=0.565), 1.033±0.016 vs. 1.038±0.017 (p=0.548), and 19.2±12.1 vs. 18.1±11.6 (p=0.265), for the 4D dose and initial plans, respectively. The lung-GTV V12.5 and V5 are significantly higher in the 4D dose: 13.9%±4.8% vs. 13.0%±4.6% (p=0.021) and 17.6%±5.4% vs. 16.9%±5.2% (p=0.011), respectively. Conclusion: 4D dose reconstruction based on the phase PDF can be used to evaluate the dose received by the patient. Robust optimization based on the phase PDF may further improve patient care.
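The reconstruction step this abstract describes — weighting per-phase deformed doses by the breathing PDF — reduces to a weighted sum over the phase axis. A minimal sketch (function and variable names are illustrative, not from the study's software):

```python
import numpy as np

def reconstruct_4d_dose(phase_doses, breathing_pdf):
    """Breathing-PDF-weighted 4D dose.

    phase_doses: array of shape (n_phases, ...) holding the per-phase dose
    grids already deformed to a common reference phase.
    breathing_pdf: per-phase weights from the breathing probability density.
    """
    w = np.asarray(breathing_pdf, dtype=float)
    w = w / w.sum()                      # normalise in case raw counts are given
    d = np.asarray(phase_doses, dtype=float)
    return np.tensordot(w, d, axes=1)    # weighted sum over the phase axis
```

For example, two phases with equal weights and per-voxel doses `[60, 60]` and `[58, 62]` Gy reconstruct to `[59, 61]` Gy.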
International Nuclear Information System (INIS)
Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan
2001-01-01
In order to estimate the influence of a non-uniform dose distribution on the clinical treatment result, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different dose distribution homogeneities. The results show that the tumor control probability corresponding to the same total dose will decrease if the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%
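The effect described here — TCP falling as the dose distribution becomes less homogeneous at the same total dose — follows directly from a voxelised Poisson TCP, since cold spots dominate the product over voxels. A toy illustration (α and the clonogen density are arbitrary placeholder values, not the paper's heavy-ion parameters):

```python
import numpy as np

def voxel_tcp(doses, alpha=0.3, clonogens_per_voxel=1e5):
    """Poisson TCP for a tumour irradiated voxel by voxel (LQ, alpha term only).

    The whole tumour is controlled only if every voxel is controlled, so the
    expected number of surviving clonogens is summed over voxels.
    """
    d = np.asarray(doses, dtype=float)
    surviving = clonogens_per_voxel * np.exp(-alpha * d)
    return float(np.exp(-surviving.sum()))

uniform = np.full(100, 60.0)                                   # perfectly flat
hot_cold = np.concatenate([np.full(50, 55.0), np.full(50, 65.0)])  # same mean dose
# The 55 Gy cold spot dominates: TCP drops although the mean dose is unchanged.
```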
International Nuclear Information System (INIS)
Coleman, J.H.
1980-10-01
A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor resulting from several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of doses from the intermittent releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used in the calculation of the convolution
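The computation this abstract describes — the density of the accumulated dose as the FFT-based convolution of the per-release densities — can be sketched as follows, assuming all densities are tabulated on a common dose grid of spacing `dx` (names are illustrative):

```python
import numpy as np

def fft_convolve(a, b):
    """Linear convolution of two sampled functions via the FFT."""
    n = len(a) + len(b) - 1
    nfft = 1 << (n - 1).bit_length()        # zero-pad to a power of two
    spec = np.fft.rfft(a, nfft) * np.fft.rfft(b, nfft)
    return np.fft.irfft(spec, nfft)[:n]

def accumulated_dose_density(release_pdfs, dx):
    """Density of the summed dose from several independent releases.

    Each entry of release_pdfs is a probability density sampled at spacing dx;
    each pairwise convolution integral contributes one factor of dx.
    """
    acc = np.asarray(release_pdfs[0], dtype=float)
    for p in release_pdfs[1:]:
        acc = fft_convolve(acc, np.asarray(p, dtype=float)) * dx
    return acc
```

Convolving two uniform single-release densities yields the expected triangular density for the two-release total, and the result still integrates to one.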
Spent Nuclear Fuel Project dose management plan
International Nuclear Information System (INIS)
Bergsman, K.H.
1996-03-01
This dose management plan facilitates meeting the dose management and ALARA requirements applicable to the design activities of the Spent Nuclear Fuel Project, and establishes consistency of information used by multiple subprojects in ALARA evaluations. The method for meeting the ALARA requirements applicable to facility designs involves two components. The first is each Spent Nuclear Fuel Project subproject incorporating ALARA principles, ALARA design optimizations, and ALARA design reviews throughout the design of facilities and equipment. The second component is the Spent Nuclear Fuel Project management providing overall dose management guidance to the subprojects and oversight of the subproject dose management efforts
Hanford Environmental Dose Reconstruction Project monthly report
International Nuclear Information System (INIS)
Finch, S.M.
1990-12-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that populations could have received from nuclear operations at Hanford since 1944. The project is being managed and conducted by the Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates. 3 figs., 3 tabs
Hanford Environmental Dose Reconstruction Project monthly report
International Nuclear Information System (INIS)
Finch, S.M.
1991-10-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed, from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demographics, agriculture, food habits; and environmental pathways and dose estimates
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables
Probability Distribution and Projected Trends of Daily Precipitation in China
Institute of Scientific and Technical Information of China (English)
CAO Li-Ge; ZHONG Jun; SU Bu-Da; ZHAI Jian-Qing; Marco GEMMER
2013-01-01
Based on observed daily precipitation data of 540 stations and 3,839 gridded data points from the high-resolution regional climate model COSMO-Climate Limited-area Modeling (CCLM) for 1961–2000, the ability of CCLM to simulate daily precipitation in China is examined, and the variation of the daily precipitation distribution pattern is revealed. By applying probability distribution and extreme value theory to the projected daily precipitation (2011–2050) under the SRES A1B scenario with CCLM, trends of the daily precipitation series and of daily precipitation extremes are analyzed. Results show that, except for the western Qinghai-Tibetan Plateau and South China, the distribution patterns of the kurtosis and skewness calculated from the simulated and observed series are consistent with each other; their spatial correlation coefficients are above 0.75. CCLM can well capture the distribution characteristics of daily precipitation over China. It is projected that in some parts of the Jianghuai region, central-eastern Northeast China, and Inner Mongolia, the kurtosis and skewness will increase significantly, and precipitation extremes will increase during 2011–2050. The projected increases of maximum daily rainfall and of the longest non-precipitation period during the flood season in the aforementioned regions also indicate increasing trends of droughts and floods in the next 40 years.
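The distribution diagnostics used in this study — skewness and kurtosis of daily precipitation series — are standardised third and fourth moments and can be computed directly (this sketch returns excess kurtosis, i.e. zero for a normal distribution):

```python
import numpy as np

def precip_moments(daily_precip):
    """Sample skewness and excess kurtosis of a daily precipitation series."""
    x = np.asarray(daily_precip, dtype=float)
    z = (x - x.mean()) / x.std()         # standardise (population std)
    skewness = float((z ** 3).mean())
    excess_kurtosis = float((z ** 4).mean() - 3.0)
    return skewness, excess_kurtosis
```

A heavier right tail (more extreme wet days) raises both statistics, which is why the projected increases in kurtosis and skewness signal more frequent precipitation extremes.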
The Hanford Environmental Dose Reconstruction Project: Overview
International Nuclear Information System (INIS)
Haerer, H.A.; Freshley, M.D.; Gilbert, R.O.; Morgan, L.G.; Napier, B.A.; Rhoads, R.E.; Woodruff, R.K.
1990-01-01
In 1988, researchers began a multiyear effort to estimate radiation doses that people could have received since 1944 at the U.S. Department of Energy's Hanford Site. The study was prompted by increasing concern about potential health effects to the public from more than 40 yr of nuclear activities. We will provide an overview of the Hanford Environmental Dose Reconstruction Project and its technical approach. The work has required development of new methods and tools for dealing with unique technical and communication challenges. Scientists are using a probabilistic, rather than the more typical deterministic, approach to generate dose distributions rather than single-point estimates. Uncertainties in input parameters are reflected in dose results. Sensitivity analyses are used to optimize project resources and define the project's scope. An independent technical steering panel directs and approves the work in a public forum. Dose estimates are based on review and analysis of historical data related to operations, effluents, and monitoring; determination of important radionuclides; and reconstruction of source terms, environmental conditions that affected transport, concentrations in environmental media, and human elements, such as population distribution, agricultural practices, food consumption patterns, and lifestyles. A companion paper in this volume, The Hanford Environmental Dose Reconstruction Project: Technical Approach, describes the computational framework for the work
Hanford Environmental Dose Reconstruction Project monthly report
International Nuclear Information System (INIS)
McMakin, A.H.; Cannon, S.D.; Finch, S.M.
1992-09-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The TSP consists of experts in environmental pathways, epidemiology, surface-water transport, ground-water transport, statistics, demography, agriculture, meteorology, nuclear engineering, radiation dosimetry, and cultural anthropology. Included are appointed members representing the states of Oregon, Washington, and Idaho, a representative of Native American tribes, and an individual representing the public. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates
International Nuclear Information System (INIS)
Booth, J.T.; Zavgorodni, S.F.; Royal Adelaide Hospital, SA
2001-01-01
Uncertainty in the precise quantity of radiation dose delivered to tumours in external beam radiotherapy is present due to many factors, and can result in either spatially uniform (Gaussian) or spatially non-uniform dose errors. These dose errors are incorporated into the calculation of tumour control probability (TCP) and produce a distribution of possible TCP values over a population. We also study the effect of inter-patient cell sensitivity heterogeneity on the population distribution of patient TCPs. This study aims to investigate the relative importance of these three uncertainties (spatially uniform dose uncertainty, spatially non-uniform dose uncertainty, and inter-patient cell sensitivity heterogeneity) on the delivered dose and TCP distribution following a typical course of fractionated external beam radiotherapy. The dose distributions used for patient treatments are modelled in one dimension. Geometric positioning uncertainties during and before treatment are considered as shifts of a pre-calculated dose distribution. Following the simulation of a population of patients, distributions of dose across the patient population are used to calculate mean treatment dose, standard deviation in mean treatment dose, mean TCP, standard deviation in TCP, and TCP mode. These parameters are calculated with each of the three uncertainties included separately. The calculations show that the dose errors in the tumour volume are dominated by the spatially uniform component of dose uncertainty. This could be related to machine specific parameters, such as linear accelerator calibration. TCP calculation is affected dramatically by inter-patient variation in the cell sensitivity and to a lesser extent by the spatially uniform dose errors. The positioning errors with the 1.5 cm margins used cause dose uncertainty outside the tumour volume and have a small effect on mean treatment dose (in the tumour volume) and tumour control. Copyright (2001) Australasian College of
Dose prescription complexity versus tumor control probability in biologically conformal radiotherapy
International Nuclear Information System (INIS)
South, C. P.; Evans, P. M.; Partridge, M.
2009-01-01
The technical feasibility and potential benefits of voxel-based nonuniform dose prescriptions for biologically heterogeneous tumors have been widely demonstrated. In some cases, an "ideal" dose prescription has been generated by individualizing the dose to every voxel within the target, but often this voxel-based prescription has been discretized into a small number of compartments. The number of dose levels utilized and the methods used for prescribing doses and assigning tumor voxels to different dose compartments have varied significantly. The authors present an investigation into the relationship between the complexity of the dose prescription and the tumor control probability (TCP) for a number of these methods. The linear quadratic model of cell killing was used in conjunction with a number of modeled tumors heterogeneous in clonogen density, oxygenation, or proliferation. Models based on simple mathematical functions, published biological data, and biological image data were investigated. Target voxels were assigned to dose compartments using (i) simple rules based on the initial biological distribution, (ii) iterative methods designed to maximize the achievable TCP, or (iii) methods based on an ideal dose prescription. The relative performance of the simple rules was found to depend on the form of heterogeneity of the tumor, while the iterative and ideal dose methods performed comparably for all models investigated. In all cases the maximum achievable TCP was approached within the first few (typically two to five) compartments. Results suggest that irrespective of the pattern of heterogeneity, the optimal dose prescription can be well approximated using only a few dose levels but only if both the compartment boundaries and prescribed dose levels are well chosen.
Hanford environmental dose reconstruction project - an overview
International Nuclear Information System (INIS)
Shipler, D.B.; Napier, B.A.; Farris, W.T.
1996-01-01
The Hanford Environmental Dose Reconstruction Project was initiated because of public interest in the historical releases of radioactive materials from the Hanford Site, located in southcentral Washington State. By 1986, over 38,000 pages of environmental monitoring documentation from the early years of Hanford operations had been released. Special committees reviewing the documents recommended initiation of the Hanford Environmental Dose Reconstruction Project, which began in October 1987, and is conducted by Battelle, Pacific Northwest Laboratories. The technical approach taken was to reconstruct releases of radioactive materials based on facility operating information; develop and/or adapt transport, pathway, and dose models and computer codes; reconstruct environmental, meteorological, and hydrological monitoring information; reconstruct demographic, agricultural, and lifestyle characteristics; apply statistical methods to all forms of uncertainty in the information, parameters, and models; and perform scientific investigations that were technically defensible. The geographic area for the study includes ~2 × 10^5 km^2 (75,000 mi^2) in eastern Washington, western Idaho, and northeastern Oregon (essentially the Mid-Columbia Basin of the Pacific Northwest). Three exposure pathways were considered: the atmosphere, the Columbia River, and ground water
International Nuclear Information System (INIS)
Lyman, J.T.; Wolbarst, A.B.
1987-01-01
To predict the likelihood of success of a therapeutic strategy, one must be able to assess the effects of the treatment upon both diseased and healthy tissues. This paper proposes a method for determining the probability that a healthy organ that receives a non-uniform distribution of X-irradiation, heat, chemotherapy, or other agent will escape complications. Starting with any given dose distribution, a dose-cumulative-volume histogram for the organ is generated. This is then reduced by an interpolation scheme (involving the volume-weighting of complication probabilities) to a slightly different histogram that corresponds to the same overall likelihood of complications, but which contains one less step. The procedure is repeated, one step at a time, until there remains a final, single-step histogram, for which the complication probability can be determined. The formalism makes use of a complication response function C(D, V) which, for the given treatment schedule, represents the probability of complications arising when the fraction V of the organ receives dose D and the rest of the organ gets none. Although the data required to generate this function are sparse at present, it should be possible to obtain the necessary information from in vivo and clinical studies. Volume effects are taken explicitly into account in two ways: the precise shape of the patient's histogram is employed in the calculation, and the complication response function is a function of the volume
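Lyman's step-by-step interpolation is hard to reproduce without the underlying C(D, V) tables; the closely related Kutcher–Burman effective-volume reduction, which likewise collapses a dose-volume histogram into a single equivalent step, can be sketched instead. The volume parameter `n` is organ-dependent; the value 0.5 below is an arbitrary placeholder:

```python
import numpy as np

def effective_volume(doses, volumes, n=0.5):
    """Kutcher-Burman reduction of a differential DVH.

    Returns the effective fractional volume that, irradiated uniformly at the
    maximum dose, is taken to carry the same complication risk as the
    original nonuniform histogram.
    """
    d = np.asarray(doses, dtype=float)       # dose bin values
    v = np.asarray(volumes, dtype=float)     # fractional organ volumes per bin
    d_max = d.max()
    v_eff = np.sum(v * (d / d_max) ** (1.0 / n))
    return float(v_eff), float(d_max)
```

A uniformly irradiated organ reduces to its total irradiated fraction, while bins below the maximum dose contribute less than their full volume, mirroring the volume effect the abstract discusses.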
Calculation of complication probability of pion treatment at PSI using dose-volume histograms
International Nuclear Information System (INIS)
Nakagawa, Keiichi; Akanuma, Atsuo; Aoki, Yukimasa
1991-01-01
In the conformation technique a target volume is irradiated uniformly as in conventional radiotherapy, whereas surrounding tissues and organs are nonuniformly irradiated. Clinical data on radiation injuries that accumulate with conventional radiation are not applicable without appropriate compensation. Recently a putative solution of this problem was proposed by Lyman using dose-volume histograms. This histogram reduction method reduces a given dose-volume histogram of an organ, by interpolation, to a single step which corresponds to the equivalent complication probability. As a result it converts nonuniform radiation into a unique dose to the whole organ which has the equivalent likelihood of radiation injury. This method is based on low LET radiation with conventional fractionation schedules. When it is applied to high LET radiation such as negative pion treatment, a high LET dose should be converted to an equivalent photon dose using an appropriate value of RBE. In the present study the histogram reduction method was applied to actual patients treated by the negative pion conformation technique at the Paul Scherrer Institute. Out of 90 evaluable cases of pelvic tumors, 16 developed grade III-IV bladder injury, and 7 developed grade III-IV rectal injury. The 90 cases were divided into roughly equal groups according to the equivalent doses to the entire bladder and rectum. Complication rates and equivalent doses to the full organs in these groups could be represented by a sigmoid dose-effect relation. When the RBE from a pion dose to a photon dose is assumed to be 2.1 for bladder injury, the rates of bladder complications fit best to the theoretical complication curve. When the RBE value was 2.3, the rates of rectal injury fit the theoretical curve best. These values are close to the conversion factor of 2.0 that is used in clinical practice at PSI. This agreement suggests the clinical feasibility of the histogram reduction method in conformation radiotherapy. (author)
Hanford Environmental Dose Reconstruction Project monthly report, August 1992
International Nuclear Information System (INIS)
McMakin, A.H.; Cannon, S.D.; Finch, S.M.
1992-01-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks. These tasks correspond to the path radionuclides followed from release to impact on humans (dose estimates): source terms; environmental transport; environmental monitoring data; demography, food consumption, and agriculture; and environmental pathways and dose estimates
Institute of Scientific and Technical Information of China (English)
Weixu Dai; Weiwei Wu; Bo Yu; Yunhao Zhu
2016-01-01
A success-probability-oriented optimization model for resource allocation of the technological innovation multi-project system is studied. Based on the definition of the technological innovation multi-project system, the leveling optimization of cost and success probability is set as the objective of resource allocation. The cost function and the probability function of the optimization model are constructed. Then the objective function of the model is constructed and the solving process is explained. The model is applied to the resource allocation of an enterprise's technological innovation multi-project system. The results show that the proposed model is more effective in rational resource allocation, and is more applicable in maximizing the utility of the technological innovation multi-project system.
Overview of the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Shipler, D.B.; Napier, B.A.; Ikenberry, T.A.
1992-04-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that specific and representative individuals and populations may have received as a result of releases of radioactive materials from historical operations at the Hanford Site. These dose estimates would account for the uncertainties of information regarding facilities operations, environmental monitoring, demography, food consumption and lifestyles, and the variability of natural phenomena. Other objectives of the HEDR Project include: supporting the Hanford Thyroid Disease Study (HTDS), declassifying Hanford-generated information and making it available to the public, performing high-quality, credible science, and conducting the project in an open, public forum. The project is briefly described
Mathematical models of tumor growth: translating absorbed dose to tumor control probability
International Nuclear Information System (INIS)
Sgouros, G.
1996-01-01
cell loss due to irradiation, the log-kill model, therefore, predicts that incomplete treatment of a kinetically heterogeneous tumor will yield a more proliferative tumor. The probability of tumor control in such a simulation may be obtained from the nadir in tumor cell number. If the nadir is not sufficiently low to yield a high probability of tumor control, then the tumor will re-grow. Since cells in each sub-population are assumed to be lost at the same rate, the sub-population with the shortest potential doubling time will re-grow the fastest, yielding a recurrent tumor that is more proliferative. A number of assumptions and simplifications are made, both implicitly and explicitly, in converting absorbed dose to tumor control probability. The modeling analyses described above must, therefore, be viewed in terms of understanding and evaluating different treatment approaches with the goal of treatment optimization rather than outcome prediction
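The re-growth argument above can be made concrete with a toy log-kill simulation; all cell numbers, surviving fractions, and doubling times below are illustrative assumptions, not values from the paper.

```python
import math

def simulate(subpops, sf_per_fraction, n_fractions, regrow_days):
    """Log-kill model: each fraction kills the same fraction of cells in
    every sub-population; survivors regrow exponentially with their own
    potential doubling time Td (in days)."""
    out = []
    for n0, td in subpops:
        nadir = n0 * sf_per_fraction ** n_fractions    # cells at nadir
        regrown = nadir * 2.0 ** (regrow_days / td)    # exponential regrowth
        out.append(regrown)
    return out

# Two sub-populations: (initial cell number, potential doubling time in days)
before = [(1e8, 5.0), (1e8, 20.0)]   # fast vs. slow proliferators, equal at start
after = simulate(before, sf_per_fraction=0.5, n_fractions=30, regrow_days=100)
fast_fraction = after[0] / (after[0] + after[1])
```

Because every sub-population loses the same fraction of cells per treatment fraction, the nadirs are identical, so the sub-population with the shortest doubling time dominates the recurrence: the recurrent tumor is more proliferative than the original.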
International Nuclear Information System (INIS)
Ramirez, M. L.; Ruiz, A.; Ferrer, N.; Alonso Farto, J. C.; Alvarez, C.; Rodriguez, M.
2013-01-01
The DOMNES Project was created in 2001 to carry out a survey of the nuclear medicine procedures used in Spanish health centers, their frequency, and the doses given to patients. In addition, it reports information to the Dose Datamed 2 project, which focuses on radiology exams. (Author)
Smoothing and projecting age-specific probabilities of death by TOPALS
Directory of Open Access Journals (Sweden)
Joop de Beer
2012-10-01
BACKGROUND TOPALS is a new relational model for smoothing and projecting age schedules. The model is operationally simple, flexible, and transparent. OBJECTIVE This article demonstrates how TOPALS can be used for both smoothing and projecting age-specific mortality for 26 European countries and compares the results of TOPALS with those of other smoothing and projection methods. METHODS TOPALS uses a linear spline to describe the ratios between the age-specific death probabilities of a given country and a standard age schedule. For smoothing purposes I use the average of death probabilities over 15 Western European countries as standard, whereas for projection purposes I use an age schedule of 'best practice' mortality. A partial adjustment model projects how quickly the death probabilities move in the direction of the best-practice level of mortality. RESULTS On average, TOPALS performs better than the Heligman-Pollard model and the Brass relational method in smoothing mortality age schedules. TOPALS can produce projections that are similar to those of the Lee-Carter method, but can easily be used to produce alternative scenarios as well. This article presents three projections of life expectancy at birth for the year 2060 for 26 European countries. The Baseline scenario assumes a continuation of the past trend in each country, the Convergence scenario assumes that there is a common trend across European countries, and the Acceleration scenario assumes that the future decline of death probabilities will exceed that in the past. The Baseline scenario projects that average European life expectancy at birth will increase to 80 years for men and 87 years for women in 2060, whereas the Acceleration scenario projects an increase to 90 and 93 years respectively. CONCLUSIONS TOPALS is a useful new tool for demographers for both smoothing age schedules and making scenarios.
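A minimal sketch of the TOPALS idea described above: the fitted death probability at each age is a standard schedule multiplied by a piecewise-linear ratio defined at spline knots. The toy standard schedule and the hand-picked knot ratios below are illustrative; the real model estimates the spline from data.

```python
def topals_fit(standard_qx, knot_ages, knot_ratios, age):
    """TOPALS sketch: fitted q(age) = standard q(age) times a
    piecewise-linear ratio interpolated between knot values."""
    for (a0, r0), (a1, r1) in zip(zip(knot_ages, knot_ratios),
                                  zip(knot_ages[1:], knot_ratios[1:])):
        if a0 <= age <= a1:
            w = (age - a0) / (a1 - a0)          # position within the knot interval
            ratio = r0 + w * (r1 - r0)          # linear interpolation of the ratio
            return standard_qx[age] * ratio
    raise ValueError("age outside knot range")

# Toy standard schedule: death probability rising geometrically with age
standard = {a: 0.001 * 1.1 ** a for a in range(0, 101)}
q50 = topals_fit(standard, [0, 50, 100], [1.2, 0.9, 1.0], 50)
```

Because only the handful of knot ratios are free parameters, the fitted schedule inherits the smooth age pattern of the standard, which is what makes the model simple and transparent.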
International Nuclear Information System (INIS)
Begnozzi, L.; Gentile, F.P.; Di Nallo, A.M.; Chiatti, L.; Zicari, C.; Consorti, R.; Benassi, M.
1994-01-01
Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning, they can be used in the statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived by combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in the case of homogeneous or partial brain irradiation, has been considered. For the same total or partial volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction sizes of 1.5 Gy and of 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and of the α/β value has also been simulated: considering α/β=1.6 Gy or α/β=4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.)
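The NTD-plus-Lyman combination described above can be sketched as follows. The parameter values (TD50, m, n, α/β) are illustrative placeholders, not the fitted values from the report, and the volume dependence uses the common power-law form TD50(v) = TD50(1)·v^(-n).

```python
import math

def ntd(total_dose, dose_per_fraction, alpha_beta, ref_fraction=2.0):
    """Normalized total dose: the LQ-equivalent dose delivered in
    reference (2-Gy) fractions."""
    return total_dose * (dose_per_fraction + alpha_beta) / (ref_fraction + alpha_beta)

def lyman_ntcp(dose, volume_fraction, td50_1, m, n):
    """Lyman model: NTCP is the normal CDF of
    t = (D - TD50(v)) / (m * TD50(v)), with TD50(v) = TD50(1) * v**(-n)."""
    td50_v = td50_1 * volume_fraction ** (-n)
    t = (dose - td50_v) / (m * td50_v)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# 60 Gy in 3-Gy fractions to the whole organ, assumed alpha/beta = 3 Gy:
d_eq = ntd(60.0, 3.0, 3.0)    # 72 Gy equivalent in 2-Gy fractions
p = lyman_ntcp(d_eq, 1.0, td50_1=65.0, m=0.15, n=0.5)
```

Using NTD rather than physical dose is what lets the same Lyman tolerance parameters be applied to schedules with different fraction sizes, as the abstract describes.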
Work plan for the Hanford Environmental Dose Reconstruction Project
Energy Technology Data Exchange (ETDEWEB)
1989-12-01
The primary objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that populations could have received from nuclear operations at the Hanford Site since 1944, with descriptions of uncertainties inherent in such estimates. The secondary objective is to make project records--information that HEDR staff members used to estimate radiation doses--available to the public. Preliminary dose estimates for a limited geographic area and time period, certain radionuclides, and certain populations are planned to be available in 1990; complete results are planned to be reported in 1993. Project reports and references used in the reports are available to the public in the DOE Public Reading Room in Richland, Washington. Project progress is documented in monthly reports, which are also available to the public in the DOE Public Reading Room.
International Nuclear Information System (INIS)
Zhou Sumin; Das, Shiva; Wang Zhiheng; Marks, Lawrence B.
2004-01-01
The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose from the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges
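A small numerical sketch of the two formalisms being compared above. A simple exponential cell-survival model and illustrative clonogen numbers are assumed; the paper's argument is analytical, not numerical.

```python
import math

def geud(dvh, a):
    """Generalized EUD of a differential DVH given as
    [(volume_fraction, dose), ...], using the power-law formalism."""
    return sum(v * d ** a for v, d in dvh) ** (1.0 / a)

def poisson_tcp(dose, n_clonogens, alpha):
    """Poisson TCP with a simple exponential cell-survival model."""
    surviving = n_clonogens * math.exp(-alpha * dose)
    return math.exp(-surviving)

dvh = [(0.5, 60.0), (0.5, 70.0)]   # heterogeneous target dose

# TCP of the heterogeneous distribution: product over sub-volumes,
# with clonogens split in proportion to volume
tcp_het = 1.0
for v, d in dvh:
    tcp_het *= poisson_tcp(d, 1e7 * v, 0.3)

# TCP of the homogeneous GEUD dose (a < 0 is the usual choice for tumors)
tcp_geud = poisson_tcp(geud(dvh, a=-10.0), 1e7, 0.3)
```

For this heterogeneous DVH the two TCP values differ, which is a numerical echo of the paper's conclusion that the GEUD and TCP formalisms are, in general, mathematically inconsistent.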
International Nuclear Information System (INIS)
Niemierko, Andrzej; Goitein, Michael
1991-01-01
The authors investigate a model of normal tissue complication probability for tissues that may be represented by a critical element architecture. They derive formulas for complication probability that apply both to a partial volume irradiation and to an arbitrary inhomogeneous dose distribution. The dose-volume isoeffect relationship which is a consequence of a critical element architecture is discussed and compared to the empirical power law relationship. A dose-volume histogram reduction scheme for a 'pure' critical element model is derived. In addition, a point-based algorithm which does not require precomputation of a dose-volume histogram is derived. The existing published dose-volume histogram reduction algorithms are analyzed. The authors show that the existing algorithms, developed empirically without an explicit biophysical model, have a close relationship to the critical element model at low levels of complication probability. However, it is also shown that they have aspects which are not compatible with a critical element model, and the authors propose a modification to one of them to circumvent its restriction to low complication probabilities. (author). 26 refs.; 7 figs
International Nuclear Information System (INIS)
Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake
2000-01-01
Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high dose region so that the obtained cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values because of the methods used for DVH reduction and radiobiological models and their input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use
Integration of models for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Napier, B.A.
1991-01-01
The objective of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since 1944. The objective of phase 1 of the project was to demonstrate through calculations that adequate models and support data exist or could be developed to allow realistic estimations of doses to individuals from releases of radionuclides to the environment that occurred as long as 45 years ago. Much of the data used in phase 1 was preliminary; therefore, the doses calculated must be considered preliminary approximations. This paper describes the integration of various models that was implemented for initial computer calculations. Models were required for estimating the quantity of radioactive material released, for evaluating its transport through the environment, for estimating human exposure, and for evaluating resultant doses
Updating Dosimetry for Emergency Response Dose Projections.
DeCair, Sara
2016-02-01
In 2013, the U.S. Environmental Protection Agency (EPA) proposed an update to the 1992 Protective Action Guides (PAG) Manual. The PAG Manual provides guidance to state and local officials planning for radiological emergencies. EPA requested public comment on the proposed revisions, while making them available for interim use by officials faced with an emergency situation. Developed with interagency partners, EPA's proposal incorporates newer dosimetric methods, identifies tools and guidelines developed since the current document was issued, and extends the scope of the PAGs to all significant radiological incidents, including radiological dispersal devices or improvised nuclear devices. In order to best serve the emergency management community, scientific policy direction had to be set on how to use International Commission on Radiological Protection Publication 60 age groups in dose assessment when implementing emergency guidelines. Certain guidelines that lend themselves to different PAGs for different subpopulations are the PAGs for potassium iodide (KI), food, and water. These guidelines provide age-specific recommendations because of the radiosensitivity of the thyroid and young children with respect to ingestion and inhalation doses in particular. Taking protective actions like using KI, avoiding certain foods or using alternative sources of drinking water can be relatively simple to implement by the parents of young children. Clear public messages can convey which age groups should take which action, unlike how an evacuation or relocation order should apply to entire households or neighborhoods. New in the PAG Manual is planning guidance for the late phase of an incident, after the situation is stabilized and efforts turn toward recovery. Because the late phase can take years to complete, decision makers are faced with managing public exposures in areas not fully remediated. The proposal includes quick-reference operational guidelines to inform re-entry to
Phase 1 of the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
1991-08-01
The work described in this report was prompted by the public's concern about potential effects from the radioactive materials released from the Hanford Site. The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate the radiation doses the public might have received from the Hanford Site since 1944, when facilities began operating. Phase 1 of the HEDR Project is a 'pilot' or 'demonstration' phase. The objectives of this initial phase were to determine whether enough historical information could be found or reconstructed to be used for dose estimation, and to develop and test conceptual and computational models for calculating credible dose estimates. Preliminary estimates of radiation doses were produced in Phase 1 because they are needed to achieve these objectives. The reader is cautioned that the dose estimates provided in this and other Phase 1 HEDR reports are preliminary. As the HEDR Project continues, the dose estimates will change for at least three reasons: more complete input information for models will be developed; the models themselves will be refined; and the size and shape of the geographic study area will change. This is one of three draft reports that summarize the first phase of the four-phased HEDR Project. This, the Summary Report, is directed to readers who want a general understanding of the Phase 1 work and preliminary dose estimates. The two other reports -- the Air Pathway Report and the Columbia River Pathway Report -- are for readers who understand the radiation dose assessment process and want to see more technical detail. Detailed descriptions of the dose reconstruction process are available in more than 20 supporting reports listed in Appendix A. 32 refs., 46 figs
Schumacher, Sandra; Pierau, Roberto; Wirth, Wolfgang
2017-04-01
In recent years, the development of geothermal plants in Germany has increased significantly due to a favorable political setting and the resulting financial incentives. However, most projects are developed by local communities or private investors, which cannot afford a project to fail. To cover the risk of total loss if the geothermal well does not provide the energy output necessary for an economically viable project, investors try to procure insurance for this worst-case scenario. In order to issue such insurance, the insurance companies insist on so-called probability-of-success studies (POS studies), in which the geological risk of not achieving the necessary temperatures and/or flow rates for an economically successful project is quantified. Quantifying the probability of reaching a minimum temperature, which has to be defined by the project investors, is relatively straightforward, as subsurface temperatures in Germany are comparatively well known thanks to tens of thousands of hydrocarbon wells. Moreover, for the German Molasse Basin a method to characterize the hydraulic potential of a site based on pump-test analysis has been developed and refined in recent years. However, quantifying the probability of reaching a given flow rate with a given drawdown is much more challenging in areas where pump-test data are generally not available (e.g. the North German Basin). Therefore, a new method based on log- and core-derived porosity and permeability data was developed to quantify the geological risk of reaching a specified flow rate in such areas. We present both methods for POS studies and show how subsurface data such as pump tests or log and core measurements can be used to predict the chances of a potential geothermal project from a geological point of view.
Therapeutic treatment plan optimization with probability density-based dose prescription
International Nuclear Information System (INIS)
Lian Jun; Cotrutz, Cristian; Xing Lei
2003-01-01
The dose optimization in inverse planning is realized under the guidance of an objective function. The prescription doses in a conventional approach are usually rigid values, defining in most instances an ill-conditioned optimization problem. In this work, we propose a more general dose optimization scheme based on a statistical formalism [Xing et al., Med. Phys. 21, 2348-2358 (1999)]. Instead of a rigid dose, the prescription to a structure is specified by a preference function, which describes the user's preference over other doses in case the most desired dose is not attainable. The variation range of the prescription dose and the shape of the preference function are predesigned by the user based on prior clinical experience. Consequently, during the iterative optimization process, the prescription dose is allowed to deviate, with a certain preference level, from the most desired dose. By not restricting the prescription dose to a fixed value, the optimization problem becomes less ill-defined. The conventional inverse planning algorithm represents a special case of the new formalism. An iterative dose optimization algorithm is used to optimize the system. The performance of the proposed technique is systematically studied using a hypothetical C-shaped tumor with an abutting circular critical structure and a prostate case. It is shown that the final dose distribution can be manipulated flexibly by tuning the shape of the preference function and that using a preference function can lead to optimized dose distributions in accordance with the planner's specification. The proposed framework offers an effective mechanism to formalize the planner's priorities over different possible clinical scenarios and incorporate them into dose optimization. The enhanced control over the final plan may greatly facilitate the IMRT treatment planning process
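The preference-function idea described above can be caricatured in one dimension: instead of forcing the target dose to a rigid 70 Gy, a Gaussian-shaped preference (expressed here as its negative log) lets the optimizer trade target dose against an organ-at-risk limit. The toy dose model, preference width, and OAR limit below are invented for illustration and are not from the paper.

```python
def preference_penalty(dose, desired, width):
    """Negative log of a Gaussian preference function (up to a constant):
    zero at the most desired dose, growing smoothly with deviation."""
    return ((dose - desired) / width) ** 2

def objective(w):
    d_target = w           # toy model: target voxel dose equals the beam weight
    d_oar = 0.5 * w        # organ-at-risk voxel sees half the dose
    f = preference_penalty(d_target, desired=70.0, width=3.0)
    f += max(0.0, d_oar - 30.0) ** 2   # penalty once the OAR exceeds 30 Gy
    return f

# Simple 1-D search over the beam weight in 0.1-Gy steps
best_w = min((w * 0.1 for w in range(0, 1001)), key=objective)
```

Because the prescription is a preference rather than a fixed value, the optimum settles a few gray below 70 Gy to spare the organ at risk; a rigid prescription would have forced the full dose and a larger OAR penalty.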
Population doses in Spain. Contribution of the project dopoes a dose Datamed 2
International Nuclear Information System (INIS)
Ruiz Cruces, R.; Canete Hidalgo, S.; Perez Martinez, M.; Pola, A.; Moreno, S.; Rodriguez, M.; Alvarez, C.; Gil, M.
2013-01-01
Frequency and effective dose values are of the order of those reported by neighbouring countries in the publication Radiation Protection 154. Spain participated actively in the project DDM2 by sending all the required information, and this has served as a test to ensure the correct development of the DOPOES project, which is ongoing. (Author)
Communication tools for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Blazek, Mary Lou; Power, Max S.
1992-01-01
From 1944 to 1989, the U.S. Department of Energy produced plutonium at the Hanford Site in southeast Washington State. In the early days of operation, large amounts of radioactive materials were released to the environment. Documents about the releases were made public in 1986. The Hanford Environmental Dose Reconstruction Project began in 1987. The Project will determine how much radioactive material was released, how that material may have exposed people, and what radiation doses people may have received. The Project will be complete in 1995. The federal government pays for the work. The scientific work on the study is done by Battelle's Pacific Northwest Laboratory. Public credibility and valid science are equally important to those directing the dose reconstruction work. A number of tools are used to inform the public and encourage public participation. These tools are examined in this paper. (author)
Final design review report for K basin dose reduction project
International Nuclear Information System (INIS)
Blackburn, L.D.
1996-01-01
The strategy for reducing radiation dose originating from radionuclides absorbed in the K East Basin concrete is to raise the pool water level to provide additional shielding. This report documents a final design review for cleaning/coating basin walls and modifying other basin components where appropriate. The conclusion of this review was that the documents developed constitute an acceptable design for the Dose Reduction Project
Decision management for the Hanford Environmental Dose Reconstruction Project
Energy Technology Data Exchange (ETDEWEB)
Roberds, W.J.; Haerer, H.A. [Golder Associates, Inc., Redmond, WA (United States); Winterfeldt, D.V. [Decision Insights, Laguna Beach, CA (United States)
1992-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is in the process of developing estimates for the radiation doses that individuals and population groups may have received as a result of past activities at the Hanford Reservation in Eastern Washington. A formal decision-aiding methodology has been developed to assist the HEDR Project in making significant and defensible decisions regarding how this study will be conducted. These decisions relate primarily to policy (e.g., the appropriate level of public participation in the study) and specific technical aspects (e.g., the appropriate domain and depth of the study), and may have significant consequences with respect to technical results, costs, and public acceptability.
Decision management for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Roberds, W.J.; Haerer, H.A.; Winterfeldt, D.V.
1992-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project is in the process of developing estimates for the radiation doses that individuals and population groups may have received as a result of past activities at the Hanford Reservation in Eastern Washington. A formal decision-aiding methodology has been developed to assist the HEDR Project in making significant and defensible decisions regarding how this study will be conducted. These decisions relate primarily to policy (e.g., the appropriate level of public participation in the study) and specific technical aspects (e.g., the appropriate domain and depth of the study), and may have significant consequences with respect to technical results, costs, and public acceptability
Developing milk industry estimates for dose reconstruction projects
International Nuclear Information System (INIS)
Beck, D.M.; Darwin, R.F.
1991-01-01
One of the most important contributors to radiation doses from Hanford during the 1944-1947 period was radioactive iodine. Consumption of milk from cows that ate vegetation contaminated with iodine is likely the dominant pathway of human exposure. To estimate the doses people could have received from this pathway, it is necessary to reconstruct the amount of milk consumed by people living near Hanford, the source of the milk, and the type of feed that the milk cows ate. This task is challenging because the dairy industry has undergone radical changes since the end of World War II, and records that document the impact of these changes on the study area are scarce. Similar problems are faced by researchers on most dose reconstruction efforts. The purpose of this work is to document and evaluate the methods used on the Hanford Environmental Dose Reconstruction (HEDR) Project to reconstruct the milk industry and to present preliminary results
The NIOSH Radiation Dose Reconstruction Project: managing technical challenges.
Moeller, Matthew P; Townsend, Ronald D; Dooley, David A
2008-07-01
Approximately two years after promulgation of the Energy Employees Occupational Illness Compensation Program Act, the National Institute for Occupational Safety and Health Office of Compensation and Analysis Support selected a contractor team to perform many aspects of the radiation dose reconstruction process. The project scope and schedule necessitated the development of an organization involving a comparatively large number of health physicists. From the initial stages, there were many technical and managerial challenges that required continuous planning, integration, and conflict resolution. This paper identifies those challenges and describes the resolutions and lessons learned. These insights are hopefully useful to managers of similar scientific projects, especially those requiring significant data, technical methods, and calculations. The most complex challenge has been to complete defensible, individualized dose reconstructions that support timely compensation decisions at an acceptable production level. Adherence to applying claimant-favorable and transparent science consistent with the requirements of the Act has been the key to establishing credibility, which is essential to this large and complex project involving tens of thousands of individual stakeholders. The initial challenges included garnering sufficient and capable scientific staff, developing an effective infrastructure, establishing necessary methods and procedures, and integrating activities to ensure consistent, quality products. The continuing challenges include maintaining the project focus on recommending a compensation determination (rather than generating an accurate dose reconstruction), managing the associated very large data and information management challenges, and ensuring quality control and assurance in the presence of an evolving infrastructure. The lessons learned concern project credibility, claimant favorability, project priorities, quality and consistency, and critical
International Nuclear Information System (INIS)
Rasin, I.M.; Sarapul'tsev, I.A.
1975-01-01
The probability distributions of tissue radiation doses in the skeleton were studied in experiments on swine and dogs. When Sr-90 was introduced into the organism from the day of birth up to 90 days of age, the dose-rate probability distribution is characterized by one aggregate or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law
Atwell, William; Tylka, Allan J.; Dietrich, William F.; Rojdev, Kristina; Matzkind, Courtney
2016-01-01
In an earlier paper presented at ICES in 2015, we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the monthly smoothed sunspot number (SSN) was less than 30. Although such months are generally considered "solar-quiet", SPEs observed during these months even include Ground Level Events, the most energetic type of SPE. In this paper, we add to that study those SPEs that occurred in 1973-2015 when the SSN was greater than 30 but less than 50. Based on the observable energy range of the solar protons, we classify the events as GLEs, sub-GLEs, and sub-sub-GLEs, all of which are potential contributors to the radiation hazard. We use the spectra of these events to construct a probabilistic model of the absorbed dose due to solar protons when SSN < 50 at various confidence levels for various depths of shielding and for various mission durations. We provide plots and tables of solar proton-induced absorbed dose as functions of confidence level, shielding thickness, and mission duration that will be useful to system designers.
Review on Population Projection Methodology for Radiological Dose Assessment
Energy Technology Data Exchange (ETDEWEB)
Jang, M. S.; Kang, H. S.; Kim, S. R. [NESS, Daejeon (Korea, Republic of); Hwang, W. T. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Yang, Y. H. [KHNP, Daejeon (Korea, Republic of)
2015-05-15
The radiation environment report (RER), one of the essential documents for a plant operating license or continuous operation license, includes a population projection. Population estimates are utilized in determining the collective dose at the operation or restart time of a nuclear power plant. Many population projection models have been suggested and more are under development. We carried out a sensitivity analysis of various population projection models with Daejeon city as a target. Daejeon city showed increases and decreases in its cross-sectional population because of developments such as Sejong city and the Doan new town. We analyzed the population of Daejeon city using the statistical ARIMA model and various simple population projection models. It is important, but not easy, to determine the population limit in the modified exponential model. Therefore, the various properties of the area, such as decreases and increases in population, new town development plans, and changes in the social and natural environment, should be carefully reviewed to estimate the future population of any area.
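As an example of one simple projection model mentioned above, a modified exponential projection approaches an assumed population limit K along P(t) = K - (K - P0)·exp(-r·t). The limit, rate, and starting population below are invented for illustration; choosing K is exactly the difficulty the abstract points out.

```python
import math

def modified_exponential(p0, limit, rate, years):
    """Modified exponential projection: the population approaches an
    assumed limit K along P(t) = K - (K - P0) * exp(-r * t)."""
    return limit - (limit - p0) * math.exp(-rate * years)

# Toy projection: a city of 1.5 million with an assumed limit of 1.8 million
proj_10 = modified_exponential(1.5e6, 1.8e6, 0.05, 10)
proj_50 = modified_exponential(1.5e6, 1.8e6, 0.05, 50)
```

The projection grows monotonically toward, but never beyond, the assumed limit, so the whole forecast hinges on how well K reflects the area's development plans.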
Review on Population Projection Methodology for Radiological Dose Assessment
International Nuclear Information System (INIS)
Jang, M. S.; Kang, H. S.; Kim, S. R.; Hwang, W. T.; Yang, Y. H.
2015-01-01
The radiation environment report (RER), one of the essential documents for a plant operating license or continuous operation license, includes a population projection. Population estimates are utilized in determining the collective dose at the operation or restart time of a nuclear power plant. Many population projection models have been suggested and more are under development. We carried out a sensitivity analysis of various population projection models with Daejeon city as a target. Daejeon city showed increases and decreases in its cross-sectional population because of developments such as Sejong city and the Doan new town. We analyzed the population of Daejeon city using the statistical ARIMA model and various simple population projection models. It is important, but not easy, to determine the population limit in the modified exponential model. Therefore, the various properties of the area, such as decreases and increases in population, new town development plans, and changes in the social and natural environment, should be carefully reviewed to estimate the future population of any area
Data base on nuclear power plant dose reduction research projects
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Dionne, B.J.; Baum, J.W.
1985-12-01
This report contains project information on the research and development activities of the nuclear power industry in the area of dose reduction. It is based on a data base of information set up at the ALARA Center of Brookhaven National Laboratory. One purpose of this report is to draw attention to work in progress and to enable researchers and subscribers to obtain further information from the investigators and project managers. Information is provided on 180 projects, divided according to whether they are oriented to Engineering Research or to Health Physics Technology. The report contains indices on main category, project manager, principal investigator, sponsoring organization, contracting organization, and subject. This is an initial report. It is intended that periodic updates be issued whenever sufficient material has been accumulated.
Data base on nuclear power plant dose reduction research projects
International Nuclear Information System (INIS)
Khan, T.A.; Dionne, B.J.; Baum, J.W.
1985-12-01
This report contains project information on the research and development activities of the nuclear power industry in the area of dose reduction. It is based on a data base of information set up at the ALARA Center of Brookhaven National Laboratory. One purpose of this report is to draw attention to work in progress and to enable researchers and subscribers to obtain further information from the investigators and project managers. Information is provided on 180 projects, divided according to whether they are oriented to Engineering Research or to Health Physics Technology. The report contains indices on main category, project manager, principal investigator, sponsoring organization, contracting organization, and subject. This is an initial report. It is intended that periodic updates be issued whenever sufficient material has been accumulated
FY 1991 project plan for the Hanford Environmental Dose Reconstruction Project, Phase 2
International Nuclear Information System (INIS)
1991-02-01
Phase 1 of the Hanford Environmental Dose Reconstruction Project was designed to develop and demonstrate a method for estimating radiation doses people may have received from Hanford Site operations since 1944. The method researchers developed relied on a variety of measured and reconstructed data as input to a modular computer model that generates dose estimates and their uncertainties. As part of Phase 1, researchers used the reconstructed data and computer model to calculate preliminary dose estimates for populations in a limited geographical area and time period. Phase 2, now under way, is designed to evaluate the Phase 1 data and model and improve them to calculate more accurate and precise dose estimates. Phase 2 will also be used to obtain preliminary estimates of two categories of doses: for Native American tribes and for individuals included in the pilot phase of the Hanford Thyroid Disease Study (HTDS). TSP Directive 90-1 required HEDR staff to develop Phase 2 task plans for TSP approval. Draft task plans for Phase 2 were submitted to the TSP at the October 11-12, 1990 public meeting, and, after discussions of each activity and associated budget needs, the TSP directed HEDR staff to proceed with a slate of specific project activities for FY 1991 of Phase 2. This project plan contains detailed information about those activities. Phase 2 is expected to last 15-18 months. In mid-FY 1991, project activities and budget will be reevaluated to determine whether technical needs or priorities have changed. Separate from, but related to, this project plan will be an integrated plan for the remainder of the project. HEDR staff will work with the TSP to map out a strategy that clearly describes "end products" for the project and the work necessary to complete them. This level of planning will provide a framework within which project decisions in Phases 2, 3, and 4 can be made
The Hanford Environmental Dose Reconstruction (HEDR) Project: Technical approach
International Nuclear Information System (INIS)
Napier, B.A.; Freshley, M.D.; Gilbert, R.O.; Haerer, H.A.; Morgan, L.G.; Rhoads, R.E.; Woodruff, R.K.
1990-01-01
Historical measurements and current assessment techniques are being combined to estimate potential radiation doses to people from radioactive releases to the air, the Columbia River, soils, and ground water at the Hanford Site since 1944. Environmental contamination from these releases has been monitored, at varying levels of detail, for 45 yr. Phase I of the Hanford Environmental Dose Reconstruction Project will estimate the magnitude of potential doses, their areal extent, and their associated uncertainties. The Phase I study area comprises 10 counties in eastern Washington and northern Oregon, within a 100-mi radius of the site, including the stretch of the Columbia River that was most significantly affected. These counties contain a range of projected and measured contaminant levels, environmental exposure pathways, and population groups. Phase I dose estimates are being developed for the periods 1944 through 1947 for air pathways and 1964 through 1966 for river pathways. Important radionuclide/pathway combinations include fission products, such as 131I in milk, for early atmospheric releases and activation products, such as 32P and 65Zn in fish, for releases to the river. Potential doses range over several orders of magnitude within the study area. We will expand the time periods and study area in three successive phases, as warranted by the results of Phase I
Tugnoli, Alessandro; Gubinelli, Gianfilippo; Landucci, Gabriele; Cozzani, Valerio
2014-08-30
The evaluation of the initial direction and velocity of the fragments generated in the fragmentation of a vessel due to internal pressure is important information in the assessment of damage caused by fragments, in particular within the quantitative risk assessment (QRA) of chemical and process plants. In the present study an approach is proposed for the identification and validation of probability density functions (pdfs) for the initial direction of the fragments. A detailed review of a large number of past accidents provided the background information for the validation procedure. A specific method was developed for the validation of the proposed pdfs. Validated pdfs were obtained for both the vertical and horizontal angles of projection and for the initial velocity of the fragments.
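As a sketch of how such pdfs feed a downstream QRA, the snippet below draws one fragment's initial direction by rejection sampling; the uniform horizontal pdf and truncated-normal vertical pdf are placeholder assumptions, not the validated distributions of the study.

```python
import math
import random

def sample_fragment_direction(rng=random):
    """Draw one initial fragment direction (phi, theta) in radians.
    Placeholder pdfs: a uniform horizontal angle and a normal vertical
    angle truncated to [-pi/2, pi/2] by rejection sampling; the study's
    validated pdfs would replace both."""
    phi = rng.uniform(0.0, 2.0 * math.pi)        # horizontal angle of projection
    while True:
        theta = rng.gauss(0.0, math.pi / 8)      # vertical angle, biased toward horizontal
        if -math.pi / 2 <= theta <= math.pi / 2:
            return phi, theta

phi, theta = sample_fragment_direction()
```

In a full damage assessment these sampled angles (plus an initial velocity drawn from its own pdf) would seed a ballistic trajectory and impact-probability calculation.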
International Nuclear Information System (INIS)
Mantz, C.A.; Song, P.; Farhangi, E.; Nautiyal, J.; Awan, A.; Ignacio, L.; Weichselbaum, R.; Vijayakumar, S.
1997-01-01
Purpose: Impotence is a familiar sequela of definitive external beam radiation therapy (EBRT) for localized prostate cancer; however, nerve-sparing radical prostatectomy (NSRP) has offered potency rates as high as 70% for selected patients in several large series. To the authors' knowledge, age- and stage-matched comparisons between the effects of EBRT and NSRP upon the normal age trend of impotence have not been performed. Herein, we report the change in potency over time in an EBRT-treated population, determine the health factors significantly predisposing to impotence in this population, and compare age- and stage-matched potency rates with those of normal males and prostatectomy patients. Methods and Materials: Our results are obtained from a retrospective study of 114 patients ranging in age from 52 to 85 (mean, 68) who were diagnosed with clinical stages A-C (T1-T4N0M0) prostate cancer and then treated conformally with megavoltage x-rays to 6500-7000 cGy (180-200 cGy per fraction) using the four-field box technique. Information concerning pre-RT potency, medical and surgical history, and medications was documented for each patient, as was the time of post-RT change in potency during regular follow-up. The median follow-up time was 18.5 months. Results: The actuarial probability of potency for all patients gradually decreased throughout post-RT follow-up. At months 1, 12, 24, and 36, potency rates were 98, 92, 75, and 66%, respectively. For those patients who became impotent, the median time to impotence was 14 months. Factors identified from logistic regression analysis as significant predictors of post-EBRT impotence include pre-EBRT partial potency (p < 0.001), vascular disease (p < 0.001), and diabetes (p = 0.003). Next, an actuarial plot of potency probability against patient age for the EBRT-treated population was compared to that obtained from the Massachusetts Male Aging Study of normal males. The two curves were not significantly different (logrank
Data base on nuclear power plant dose reduction research projects
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Baum, J.W.
1986-10-01
Staff at the ALARA Center of Brookhaven National Laboratory have established a data base of information about current research that is likely to result in lower radiation doses to workers. The data base, concerned primarily with nuclear power generation, is part of a project that the ALARA Center is carrying out for the Nuclear Regulatory Commission. This report describes its current status. A substantial amount of research on reducing occupational exposure is being done in the US and abroad. This research is beginning to have an impact on the collective dose expenditures at nuclear power plants. The collective radiation doses in Europe, Japan, and North America all show downward trends. A large part of the research in the US is either sponsored by the nuclear industry through joint industry organizations such as EPRI and ESEERCO or is done by individual corporations. There is also significant participation by smaller companies. The main emphasis of the research on dose reduction is on engineering approaches aimed at reducing radiation fields or keeping people out of high-exposure areas by using robotics. Effective ALARA programs are also underway at a large number of nuclear plants. Additional attention should be given to non-engineering approaches to dose reduction, which are potentially very useful and cost effective but require quantitative study and analysis based on data from nuclear power plants. 9 refs., 1 fig.
Guidance on internal dose assessments from monitoring data (Project IDEAS)
International Nuclear Information System (INIS)
Doerfel, H.; Andrasi, A.; Bailey, M.; Berkovski, V.; Castellani, M.; Hurtgen, C.; Jourdain, R.; Le Guen, B.
2003-01-01
Several international intercomparison exercises on intake and internal dose assessments from monitoring data led to the conclusion that the results calculated by different participants varied significantly, mainly due to the broad variety of methods and assumptions applied in the assessment procedure. Based on this experience, the need for harmonisation of the procedures was formulated as an EU research project under the 5th Framework Programme, with the aim of developing general guidelines for standardising assessments of intakes and internal doses. Eight institutions from seven European countries are participating in the IDEAS project, also drawing on input from internal dosimetry professionals across Europe to ensure broad consensus in the outcome of the project. To ensure that the guidelines are applicable to a wide range of practical situations, the first step will be to compile a database of well-documented cases of internal contamination. In parallel, an improved version of existing software will be developed and distributed to the partners for further use. Many cases from the database will be evaluated independently by several partners using the same software; the results will be discussed and draft guidelines prepared. The guidelines will then be revised and refined on the basis of the experience and discussions of two workshops and an intercomparison exercise organised within the project, which will be open to all internal dosimetry professionals. (author)
Hanford Environmental Dose Reconstruction Project independent direction and oversight
International Nuclear Information System (INIS)
Blazek, M.L.; Power, M.
1991-01-01
Hanford was selected in 1942 as one of the sites for the Manhattan Project. It produced plutonium for one of the world's first nuclear weapons. The US Department of Energy (DOE) and its predecessors continued to make plutonium for nuclear weapons at Hanford for more than four decades. In the early days of Hanford operations, radioactive materials were routinely released to the environment by many processes. The DOE disclosed documents about these releases in 1986. In 1987, Washington, Oregon, and regional Indian tribes gathered an independent panel of experts. This group recommended dose reconstruction and health effects feasibility studies. Later that year, DOE hired Battelle Pacific Northwest Laboratory (PNL) to reconstruct potential public radiation doses from Hanford's past releases of radioactive material. The DOE agreed with the states and tribes that project direction would come from an independent Technical Steering Panel (TSP). This approach was critical to gaining public credibility for the project and the science. The TSP directs the project and makes policy. That is now clear, but it was hard-earned. Conducting science in an open public process is new, challenging, and clearly worthwhile. The panel's product is good science that is believed and accepted by the public, our client
Liu, Rong
2017-01-01
Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. We then applied a decision-making model, the sequential probability ratio test (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that the proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% for the sequential Bayesian (SB) method. The SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781
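A generic SPRT skeleton (Wald's test), not the paper's EEG pipeline, might look as follows; in the actual method the per-trial log-likelihood ratios would come from the power projective features.

```python
import math

def sprt(llr_stream, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test: accumulate per-sample
    log-likelihood ratios log[p(x|H1)/p(x|H0)] until a threshold is
    crossed. Thresholds use Wald's approximations for error rates
    alpha (false H1) and beta (false H0). Returns (decision, n_samples)."""
    upper = math.log((1 - beta) / alpha)   # cross upward  -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0
    s, n = 0.0, 0
    for n, llr in enumerate(llr_stream, start=1):
        s += llr
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", n                  # evidence ran out first
```

The accumulation is what gives the time-accuracy trade-off the abstract mentions: tightening alpha and beta raises the thresholds, which lowers the error rate at the cost of a later stopping time.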
Luxton, Gary; Keall, Paul J; King, Christopher R
2008-01-07
To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within approximately 0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD(50), and conversely m and TD(50) are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d(ref), n, v(eff) and the Niemierko equivalent uniform dose (EUD), where d(ref) and v(eff) are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.
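For orientation, the Lyman formula that the exponential function approximates is a probit in the normalized distance of EUD from TD(50); a minimal sketch using `math.erf`, with parameter values that are illustrative rather than published Emami-Burman fits:

```python
import math

def lyman_ntcp(eud, td50, m):
    """Lyman NTCP: the standard normal CDF evaluated at
    t = (EUD - TD50) / (m * TD50). Parameter values used below
    are illustrative, not organ-specific fits."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# At EUD == TD50 the complication probability is 50% by construction.
p = lyman_ntcp(eud=65.0, td50=65.0, m=0.14)
```

The slope parameter m controls how steeply NTCP rises around TD(50); the single-parameter exponential substitute described in the abstract is fitted to track this probit curve to within roughly 0.3%.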
International Nuclear Information System (INIS)
Luxton, Gary; Keall, Paul J; King, Christopher R
2008-01-01
To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within ∼0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD50, and conversely m and TD50 are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d_ref, n, v_eff and the Niemierko equivalent uniform dose (EUD), where d_ref and v_eff are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data
Energy Technology Data Exchange (ETDEWEB)
Thomas E. Widner; et al.
1999-07-01
In the early 1990s, concern about the Oak Ridge Reservation's past releases of contaminants to the environment prompted Tennessee's public health officials to pursue an in-depth study of potential off-site health effects at Oak Ridge. This study, the Oak Ridge dose reconstruction, was supported by an agreement between the U.S. Department of Energy (DOE) and the State of Tennessee, and was overseen by a 12-member panel of individuals appointed by Tennessee's Commissioner of Health. The panel requested that the principal investigator for the project prepare the following report, "Oak Ridge Dose Reconstruction Project Summary Report," to serve the following purposes: (1) summarize in a single, less technical report the methods and results of the various investigations that comprised Phase II of the dose reconstruction; (2) describe the systematic searching of classified and unclassified historical records that was a vital component of the project; and (3) summarize the less detailed, screening-level assessments that were performed to evaluate the potential health significance of a number of materials, such as uranium, whose priority did not require a complete dose reconstruction effort. This report describes each major step of the dose reconstruction study: (1) the review of thousands of historical records to obtain information relating to past operations at each facility; (2) estimation of the quantity and timing of releases of radioiodines from X-10, of mercury from Y-12, of PCBs from all facilities, and of cesium-137 and other radionuclides from White Oak Creek; (3) evaluation of the routes taken by these contaminants through the environment to nearby populations; and (4) estimation of doses and health risks to exposed groups. Calculations found the highest excess cancer risks for a female born in 1952 who drank goat milk; the highest non-cancer health risk was for children in a farm family exposed to PCBs in and near East Fork Poplar Creek.
International Nuclear Information System (INIS)
Widner, Thomas E.; email = twidner@jajoneses.com
1999-01-01
In the early 1990s, concern about the Oak Ridge Reservation's past releases of contaminants to the environment prompted Tennessee's public health officials to pursue an in-depth study of potential off-site health effects at Oak Ridge. This study, the Oak Ridge dose reconstruction, was supported by an agreement between the U.S. Department of Energy (DOE) and the State of Tennessee, and was overseen by a 12-member panel of individuals appointed by Tennessee's Commissioner of Health. The panel requested that the principal investigator for the project prepare the following report, "Oak Ridge Dose Reconstruction Project Summary Report," to serve the following purposes: (1) summarize in a single, less technical report the methods and results of the various investigations that comprised Phase II of the dose reconstruction; (2) describe the systematic searching of classified and unclassified historical records that was a vital component of the project; and (3) summarize the less detailed, screening-level assessments that were performed to evaluate the potential health significance of a number of materials, such as uranium, whose priority did not require a complete dose reconstruction effort. This report describes each major step of the dose reconstruction study: (1) the review of thousands of historical records to obtain information relating to past operations at each facility; (2) estimation of the quantity and timing of releases of radioiodines from X-10, of mercury from Y-12, of PCBs from all facilities, and of cesium-137 and other radionuclides from White Oak Creek; (3) evaluation of the routes taken by these contaminants through the environment to nearby populations; and (4) estimation of doses and health risks to exposed groups. Calculations found the highest excess cancer risks for a female born in 1952 who drank goat milk; the highest non-cancer health risk was for children in a farm family exposed to PCBs in and near East Fork Poplar Creek. More detailed
International Nuclear Information System (INIS)
Buffa, Francesca M.
2000-01-01
The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σ_d; whilst the quantities d and σ_d depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when the statistical noise of the dose distribution increases, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When the number of energy deposition events is increased, either by increasing the voxel dimensions or by increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing biological heterogeneity, whereas it becomes significant under the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of 10^8 histories from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the
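The Poisson tcp model discussed above, with an optional Gaussian spread of radiosensitivity to represent biological heterogeneity (the σ_a mechanism of the companion TCP paper), can be sketched as follows; the clonogen number, alpha and sigma values are illustrative, and the LQ beta term is omitted for brevity.

```python
import math

def poisson_tcp(dose, n_clonogens, alpha):
    """Poisson tumour control probability for a uniform dose:
    TCP = exp(-N * exp(-alpha * D)), with the LQ beta term omitted."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose))

def population_tcp(dose, n_clonogens, alpha_mean, sigma_alpha, n_grid=201):
    """Average TCP over a Gaussian spread of radiosensitivity alpha,
    representing inter-patient heterogeneity. Simple midpoint quadrature
    over +/- 4 sigma; non-physical alpha <= 0 values are skipped."""
    if sigma_alpha == 0:
        return poisson_tcp(dose, n_clonogens, alpha_mean)
    total = weight = 0.0
    for i in range(n_grid):
        a = alpha_mean + sigma_alpha * (-4.0 + 8.0 * i / (n_grid - 1))
        if a <= 0:
            continue
        w = math.exp(-0.5 * ((a - alpha_mean) / sigma_alpha) ** 2)
        total += w * poisson_tcp(dose, n_clonogens, a)
        weight += w
    return total / weight
```

Increasing sigma_alpha flattens the dose-response curve toward clinically realistic slopes, which is also why noise-induced tcp errors shrink as biological heterogeneity grows.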
Memorandum concerning the European project of dose passport (dosimetry booklet)
International Nuclear Information System (INIS)
2013-01-01
In fact, the European project represents the implementation in European law of the 90/641 EURATOM directive, which proposed a common European system for the follow-up of the occupational irradiation of workers. The EURATOM directive recommends a computer system, while the European project proposes to write down the information in a simple booklet. Some experts highlight that it would be easier and more reliable to update a computer file than a booklet, and that the information must be available in different European languages. The experts recommend that the European countries agree on what information should be compulsory, on an accurate definition of the radiation dose to be reported, and on how to measure it, in order to have a consistent system throughout Europe. (A.C.)
Energy Technology Data Exchange (ETDEWEB)
Zhang, H; Kong, V; Jin, J [Georgia Regents University Cancer Center, Augusta, GA (United States)]; Ren, L; Zhang, Y; Giles, W [Duke University Medical Center, Durham, NC (United States)
2015-06-15
Purpose: To present a cone beam computed tomography (CBCT) system which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed and used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image for pTV from the projections after IPSF processing. A probability map was generated according to the confidence of the IPSF estimation in the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image of a Catphan phantom, and the result was compared to a conventional CBCT image without SMOG, to images without IPSF (SMOG + FDK and SMOG + mask-TV), and to an image without pTV (SMOG + IPSF + FDK). Results: The conventional CBCT image without SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + mask-TV) artifacts, possibly due to using projections with missing data. The two approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also cuts the imaging dose in half. Conclusion: The proposed technique is promising for improving CBCT image quality while reducing imaging dose.
International Nuclear Information System (INIS)
Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W
2015-01-01
Purpose: To present a cone beam computed tomography (CBCT) system which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed and used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from data in its two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image for pTV from the projections after IPSF processing. A probability map was generated according to the confidence of the IPSF estimation in the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image of a Catphan phantom, and the result was compared to a conventional CBCT image without SMOG, to images without IPSF (SMOG + FDK and SMOG + mask-TV), and to an image without pTV (SMOG + IPSF + FDK). Results: The conventional CBCT image without SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + mask-TV) artifacts, possibly due to using projections with missing data. The two approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also cuts the imaging dose in half. Conclusion: The proposed technique is promising for improving CBCT image quality while reducing imaging dose
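As a didactic stand-in for the pTV reconstruction described above (not the paper's actual algorithm), the sketch below runs a confidence-weighted smoothing iteration in 2D: pixels with low data confidence (probability near 0) are governed mainly by the regularizer, while high-confidence pixels stay close to the measurement. A discrete Laplacian is used as a simple quadratic surrogate for the smoothed-TV gradient.

```python
import numpy as np

def ptv_denoise(noisy, prob, lam=0.1, step=0.1, iters=100):
    """Toy probability-weighted regularized denoising by gradient descent.
    `prob` is a per-pixel confidence map in [0, 1] weighting the
    data-fidelity term; `lam` weights the smoothness term (a discrete
    Laplacian standing in for the TV gradient). Boundaries wrap."""
    x = noisy.copy()
    for _ in range(iters):
        grad = prob * (x - noisy)                      # weighted data term
        lap = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +  # 4-neighbor Laplacian
               np.roll(x, 1, 1) + np.roll(x, -1, 1) - 4 * x)
        x -= step * (grad - lam * lap)
    return x
```

In the actual system the confidence map comes from the IPSF estimation quality in the grid-blocked and penumbra regions; here it is simply a user-supplied array.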
Review of some dosimetry and dose assessment European projects
International Nuclear Information System (INIS)
Bolognese-Milsztajn, T.; Frank, D.; Lacoste, V.; Pihet, P.
2006-01-01
Full text of publication follows: Within the 5th Framework Programme of the European Commission, several projects dealing with dosimetry and dose assessment for internal and external exposure have been supported. A review of the results of some of them is presented in this paper. The EURADOS network, which involves 50 dosimetry institutes in Europe, has coordinated the DOSIMETRY NETWORK project; the main results achieved within this action are the following: - The release on the World Wide Web of the EURADOS Database of Dosimetry Research Facilities; - The realisation of the report 'Harmonization of Individual Monitoring (IM) in Europe'; - The continuation of the intercomparison programme for environmental radiation monitoring systems; - The realisation of the report 'Cosmic radiation exposure of aircraft crew'. The EVIDOS project aimed at evaluating state-of-the-art dosimetry techniques in representative workplaces of the nuclear industry with complex mixed neutron-photon radiation fields. This paper summarises the main findings from a practical point of view. Conclusions and recommendations are given concerning the characterisation of radiation fields, methods to derive radiation protection quantities, and dosimeter results. The IDEA project aimed to improve the assessment of incorporated radionuclides through the development of advanced in-vivo and bioassay monitoring techniques, and to use these enhancements to improve routine monitoring. The primary goal was to categorize the new developments with regard to their potential and eligibility for the routine monitoring community. The costs of monitoring for internal exposures in the workplace are usually significantly greater than the equivalent costs for external exposures. There is therefore a need to ensure that resources are employed with maximum effectiveness. The EC-funded OMINEX (Optimisation of Monitoring for Internal Exposure) project has developed methods for optimising the design and implementation of
Directory of Open Access Journals (Sweden)
Gholamreza Norouzi
2015-01-01
In the project management context, time management is one of the most important factors affecting project success. This paper proposes a new method to solve research project scheduling problems (RPSP) containing Fuzzy Graphical Evaluation and Review Technique (FGERT) networks. Through the deliverables of this method, a proper estimation of project completion time (PCT) and success probability can be achieved. Algorithms were developed to cover all features of the problem based on three main parameters: duration, occurrence probability, and success probability. These algorithms are known as PR-FGERT (Parallel and Reversible Fuzzy GERT networks). The main framework involves simplifying the project network and taking regular steps to determine PCT and success probability. Simplifications include (1) equivalent-making of parallel and series branches in the fuzzy network, considering the concepts of probabilistic nodes; (2) equivalent-making of delay or reversible-to-itself branches, with the impact of changing the time and probability parameters based on removing the related branches; (3) equivalent-making of simple and complex loops; and (4) an algorithm provided to resolve the no-loop fuzzy network after equivalent-making. Finally, the performance of the models was compared with existing methods. The results showed proper and realistic performance of the models in comparison with existing methods.
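Where the PR-FGERT reductions solve such networks analytically, the same quantities (PCT and success probability) can be approximated for a toy network by Monte Carlo simulation; all durations, branch probabilities and the deadline below are invented for illustration.

```python
import random

def simulate_project(n_runs=10_000, deadline=14.0, rng=random):
    """Monte Carlo sketch of a tiny project network: two serial activities
    with uncertain (triangular) durations and a rework loop on the second
    activity taken with 20% probability. Returns the mean completion time
    and the probability of finishing by the deadline. All numbers are
    invented; an FGERT solver would compute these analytically."""
    times = []
    for _ in range(n_runs):
        t = rng.triangular(2, 6, 4) + rng.triangular(3, 9, 5)
        while rng.random() < 0.2:          # rework: repeat second activity
            t += rng.triangular(3, 9, 5)
        times.append(t)
    mean = sum(times) / len(times)
    p_success = sum(t <= deadline for t in times) / len(times)
    return mean, p_success

mean_pct, p_by_deadline = simulate_project()
```

The reversible-to-itself branch in the abstract corresponds to the rework `while` loop here; removing it (as in the paper's equivalent-making step 2) folds its expected repetitions into a single equivalent branch.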
de Bruijn, Renée; Dabekaussen, Willem; Hijma, Marc; Wiersma, Ane; Abspoel-Bukman, Linda; Boeije, Remco; Courage, Wim; van der Geest, Johan; Hamburg, Marc; Harmsma, Edwin; Helmholt, Kristian; van den Heuvel, Frank; Kruse, Henk; Langius, Erik; Lazovik, Elena
2017-04-01
Due to the heterogeneity of the subsurface in the delta environment of the Netherlands, differential subsidence over short distances results in tension and subsequent wear of subsurface infrastructure, such as water and gas pipelines. Due to uncertainties in the build-up of the subsurface, however, it is unknown where this problem is most prominent. This is a problem for asset managers deciding when a pipeline needs replacement: damaged pipelines endanger security of supply and pose a significant threat to safety, yet premature replacement raises needless expenses. In both cases, costs - financial or other - are high. Therefore, an interdisciplinary research team of geotechnicians, geologists and Big Data engineers from the research institutes TNO, Deltares and SkyGeo developed a stochastic model to predict differential subsidence and the probability of consequent pipeline failure at a (sub-)street level. In this project, pipeline data from company databases are combined with a stochastic geological model and information on (historical) groundwater levels and overburden material. The probability of pipeline failure is modelled by coupling a subsidence model with two separate models of pipeline behaviour under stress, using a probabilistic approach. The total length of pipelines (approximately 200,000 km operational in the Netherlands) and the complexity of the model chain needed to calculate a probability of failure result in large computational challenges, as massive evaluation of possible scenarios is required to reach the necessary level of confidence. To cope with this, a scalable computational infrastructure has been developed, composing a model workflow whose components have a heterogeneous technological basis. Three pilot areas covering an urban, a rural and a mixed environment, characterised by different groundwater-management strategies and different overburden histories, are used to evaluate the differences in subsidence and uncertainties that come with
Energy Technology Data Exchange (ETDEWEB)
Samuels, Stuart E.; Eisbruch, Avraham; Vineberg, Karen; Lee, Jae; Lee, Choonik; Matuszak, Martha M.; Ten Haken, Randall K.; Brock, Kristy K., E-mail: kbrock@med.umich.edu
2016-11-01
Purpose: Strategies to reduce the toxicities of head and neck radiation (ie, dysphagia [difficulty swallowing] and xerostomia [dry mouth]) are currently underway. However, the predicted benefit of dose and planning target volume (PTV) reduction strategies is unknown. The purpose of the present study was to compare the normal tissue complication probabilities (NTCP) for swallowing and salivary structures in standard plans (70 Gy [P70]), dose-reduced plans (60 Gy [P60]), and plans eliminating the PTV margin. Methods and Materials: A total of 38 oropharyngeal cancer (OPC) plans were analyzed. Standard organ-sparing volumetric modulated arc therapy plans (P70) were created and then modified by eliminating the PTVs and treating the clinical tumor volumes (CTVs) only (C70) or maintaining the PTV but reducing the dose to 60 Gy (P60). NTCP dose models for the pharyngeal constrictors, glottis/supraglottic larynx, parotid glands (PGs), and submandibular glands (SMGs) were analyzed. The minimal clinically important benefit was defined as a mean change in NTCP of >5%. The P70 NTCP thresholds and overlap percentages of the organs at risk with the PTVs (56-59 Gy, vPTV56) were evaluated to identify the predictors for NTCP improvement. Results: With the P60 plans, only the ipsilateral PG (iPG) benefited (23.9% vs 16.2%; P<.01). With the C70 plans, only the iPG (23.9% vs 17.5%; P<.01) and contralateral SMG (cSMG) (NTCP 32.1% vs 22.9%; P<.01) benefited. An iPG NTCP threshold of 20% and 30% predicted NTCP benefits for the P60 and C70 plans, respectively (P<.001). A cSMG NTCP threshold of 30% predicted an NTCP benefit with the C70 plans (P<.001). Furthermore, for the iPG, a vPTV56 >13% predicted benefit with P60 (P<.001) and C70 (P=.002). For the cSMG, a vPTV56 >22% predicted benefit with C70 (P<.01). Conclusions: PTV elimination and dose reduction lowered the NTCP of the iPG, and PTV elimination lowered the NTCP of the cSMG. NTCP thresholds and the
International Nuclear Information System (INIS)
Soehn, Matthias; Yan Di; Liang Jian; Meldolesi, Elisa; Vargas, Carlos; Alber, Markus
2007-01-01
Purpose: Accurate modeling of rectal complications based on dose-volume histogram (DVH) data is necessary to allow safe dose escalation in radiotherapy of prostate cancer. We applied different equivalent uniform dose (EUD)-based and dose-volume-based normal tissue complication probability (NTCP) models to rectal wall DVHs and follow-up data for 319 prostate cancer patients to identify the dosimetric factors most predictive for Grade ≥ 2 rectal bleeding. Methods and Materials: Data for 319 patients treated at the William Beaumont Hospital with three-dimensional conformal radiotherapy (3D-CRT) under an adaptive radiotherapy protocol were used for this study. The following models were considered: (1) the Lyman model and (2) the logit formula, each with the DVH reduced to a generalized EUD; (3) the serial reconstruction unit (RU) model; (4) the Poisson-EUD model; and (5) the mean-dose and (6) cutoff-dose logistic regression models. The parameters and their confidence intervals were determined using maximum likelihood estimation. Results: Of the patients, 51 (16.0%) showed Grade 2 or higher bleeding. As assessed qualitatively and quantitatively, the Lyman-EUD, logit-EUD, serial RU, and Poisson-EUD models fitted the data very well. Rectal wall mean dose did not correlate with Grade 2 or higher bleeding. For the cutoff-dose model, the volume receiving >73.7 Gy showed the most significant correlation with bleeding. However, this model fitted the data more poorly than the EUD-based models. Conclusions: Our study clearly confirms a volume effect for late rectal bleeding. This can be described very well by the EUD-like models, of which the serial RU and Poisson-EUD models can describe the data with only two parameters. Dose-volume-based cutoff-dose models performed worse.
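For orientation, the Lyman model with the DVH reduced to a generalized EUD (model 1 above, often called the LKB model) can be computed in a few lines. The DVH bins and parameter values below are illustrative placeholders, not the fitted values from this study:

```python
import math

def geud(doses, volumes, a):
    """Generalized EUD of a DVH: (sum_i v_i * d_i^a)^(1/a), volumes normalized."""
    total = sum(volumes)
    return sum((v / total) * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def lyman_ntcp(eud, td50, m):
    """Lyman NTCP = Phi((EUD - TD50) / (m * TD50)), Phi = standard normal CDF."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative rectal-wall DVH and parameters (placeholders, not the study's fit)
doses = [20.0, 40.0, 60.0, 75.0]   # Gy, bin doses
vols = [0.4, 0.3, 0.2, 0.1]        # fractional volumes
e = geud(doses, vols, a=8.0)       # large 'a' -> serial-like volume effect
p = lyman_ntcp(e, td50=80.0, m=0.15)
```

Note that with a = 1 the gEUD reduces to the mean dose, which connects model (5) to the EUD family; the volume effect reported above corresponds to a being well above 1.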
International Nuclear Information System (INIS)
Schilstra, C.; Meertens, H.
2001-01-01
Purpose: Usually, models that predict normal tissue complication probability (NTCP) are fitted to clinical data with the maximum likelihood (ML) method. This method inevitably causes a loss of information contained in the data. In this study, an alternative method is investigated that calculates the parameter probability distribution (PD) and thus conserves all information. The PD method also allows calculation of the uncertainty in the NTCP, which is an (often neglected) prerequisite for the intercomparison of both treatment plans and NTCP models. The PD and ML methods are applied to parotid gland data, and the results are compared. Methods and Materials: The drop in salivary flow due to radiotherapy was measured in 25 parotid glands of 15 patients. Together with the parotid gland dose-volume histograms (DVH), this enabled the calculation of the parameter PDs for three different NTCP models (Lyman, relative seriality, and critical volume). From these PDs, the NTCP and its uncertainty could be calculated for arbitrary parotid gland DVHs. ML parameters and the resulting NTCP values were calculated as well. Results: All models fitted equally well. The parameter PDs turned out to have nonnormal shapes and long tails. The NTCP predictions of the ML and PD methods usually differed considerably, depending on the NTCP model and the nature of the irradiation. NTCP curves and ML parameters suggested a highly parallel organization of the parotid gland. Conclusions: Considering the substantial differences between the NTCP predictions of the ML and PD methods, the use of the PD method is preferred, because it is the only method that takes all information contained in the clinical data into account. Furthermore, the PD method gives a true measure of the uncertainty in the NTCP.
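The contrast between an ML point estimate and a parameter probability distribution can be illustrated with a toy dose-response fit on a parameter grid. The logistic model, binary outcome data, and grid ranges below are invented for illustration and are not the NTCP models or clinical data of the study:

```python
import math
import itertools

# Toy binary outcomes: (dose in Gy, complication yes/no)
data = [(40, 0), (50, 0), (55, 1), (60, 0), (65, 1), (70, 1), (75, 1)]

def p_model(d, d50, k):
    """Logistic dose-response curve (a stand-in for an NTCP model)."""
    return 1.0 / (1.0 + math.exp(-k * (d - d50)))

def log_lik(d50, k):
    """Bernoulli log-likelihood of the data for one parameter pair."""
    return sum(math.log(p_model(d, d50, k) if y else 1.0 - p_model(d, d50, k))
               for d, y in data)

d50s = [50.0 + i for i in range(21)]           # 50..70 Gy
ks = [0.05 + 0.05 * i for i in range(10)]      # 0.05..0.5 per Gy
grid = [(a, b, math.exp(log_lik(a, b))) for a, b in itertools.product(d50s, ks)]
z = sum(w for _, _, w in grid)                 # normalizer of the parameter PD

# ML prediction uses only the single best parameter pair; the PD prediction
# averages the model over the whole (normalized) likelihood surface.
ml = max(grid, key=lambda t: t[2])
p_ml = p_model(62.0, ml[0], ml[1])
p_pd = sum(w * p_model(62.0, a, b) for a, b, w in grid) / z
```

Because the likelihood surface here (as in the study) is skewed with long tails, `p_pd` generally differs from `p_ml`, and the spread of the weighted predictions gives the NTCP uncertainty that the ML point estimate discards.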
A multiscale filter for noise reduction of low-dose cone beam projections.
Yao, Weiguang; Farr, Jonathan B
2015-08-21
The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a given detector element. This dependence indicates the natural requirement of multiscale filters to smooth noise while preserving structures of the imaged object on the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(-x²/(2σ_f²)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression of σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio. We analytically derived the variance of the residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of the residual noise, the optimal σ_f² is proved to be proportional to the noiseless fluence and modulated by the local structure strength, expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After applying the filter to the Catphan phantom projections scanned with a pulse time of 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with a pulse time of 4 ms, the visibility of soft tissue structures in the patient was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
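A minimal 1-D sketch of the idea follows: the filter scale grows with the estimated fluence and shrinks where the local linear fitting error (structure) is large. The constant `c`, the window size, and the specific modulation `var ~ fluence / (1 + error)` are assumptions for illustration, not the paper's exact analytical expression:

```python
import math

def local_linear_fit_error(sig, i, half=3):
    """RMS error of a least-squares line fitted over a window centred on i."""
    xs = list(range(max(0, i - half), min(len(sig), i + half + 1)))
    n = len(xs)
    mx = sum(xs) / n
    my = sum(sig[j] for j in xs) / n
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
    slope = sum((x - mx) * (sig[x] - my) for x in xs) / sxx
    return math.sqrt(sum((sig[x] - (my + slope * (x - mx))) ** 2 for x in xs) / n)

def adaptive_gauss(sig, c=0.01):
    """Smooth with a per-sample Gaussian whose variance grows with the local
    (estimated) fluence and shrinks where the fit error signals structure."""
    out = []
    for i, v in enumerate(sig):
        err = local_linear_fit_error(sig, i)
        var = c * max(v, 1.0) / (1.0 + err)    # sigma_f^2 ~ fluence / structure
        sigma = max(math.sqrt(var), 1e-6)
        w = [math.exp(-((j - i) ** 2) / (2.0 * sigma ** 2)) for j in range(len(sig))]
        s = sum(w)
        out.append(sum(wi * sj for wi, sj in zip(w, sig)) / s)
    return out

# Hypothetical projection profile: a flat region followed by an edge
proj = [100.0, 101.0, 99.0, 100.0, 250.0, 251.0, 249.0, 250.0]
smoothed = adaptive_gauss(proj)
```

Near the edge the fit error is large, so the local sigma collapses and the edge is preserved; in the flat regions the larger sigma averages the noise down, which is the multiscale behaviour the abstract describes.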
Hanford Environmental Dose Reconstruction Project, Quarterly report, September--November 1993
International Nuclear Information System (INIS)
Cannon, S.D.; Finch, S.M.
1993-01-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation doses that individuals and populations could have received from nuclear operations at Hanford since 1944. The project is divided into the following technical tasks, which correspond to the path radionuclides followed from release to impact on humans (dose estimates): Source Terms; Environmental Transport; Environmental Monitoring Data; Demography, Food Consumption, and Agriculture; and Environmental Pathways and Dose Estimates.
Energy Technology Data Exchange (ETDEWEB)
Lu, S; Streit, R D; Chou, C K
1980-01-01
This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)
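The simulation approach above, i.e. a deterministic crack-growth law driven by stochastic inputs, can be sketched in a few lines. This is a toy Paris-law Monte Carlo with invented parameters and crack-depth thresholds, not the report's fracture mechanics model:

```python
import math
import random

random.seed(1)

def paris_growth(a0, cycles, C=1e-12, m=3.0, stress=40.0):
    """Deterministic Paris law da/dN = C*(dK)^m with dK ~ stress*sqrt(pi*a)."""
    a = a0
    for _ in range(cycles):
        dK = stress * math.sqrt(math.pi * a)
        a += C * dK ** m
    return a

A_LEAK, A_BREAK = 0.01, 0.05   # illustrative crack depths (m) for leak / rupture
N = 500
leaks = breaks = 0
for _ in range(N):
    # Stochastic input: lognormal initial crack size (parameters invented)
    a0 = random.lognormvariate(math.log(1e-3), 1.0)
    a = paris_growth(a0, cycles=2000)
    if a >= A_BREAK:
        breaks += 1
    elif a >= A_LEAK:
        leaks += 1
```

Even in this toy version the qualitative finding is reproduced: leaks (smaller critical depth) occur far more often than complete ruptures, whose probability is dominated by the extreme tail of the initial crack size distribution.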
International Nuclear Information System (INIS)
Lu, S.; Streit, R.D.; Chou, C.K.
1980-01-01
This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)
FY 1992 revised task plans for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Shipler, D.B.
1992-04-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to populations and individuals. The primary objective of the work to be performed in FY 1992 is to determine the appropriate scope (space, time, radionuclides, pathways, and individuals/population groups) and accuracy (level of uncertainty in dose estimates) for the project. Another objective is to use a refined computer model to estimate Native American tribal doses and individual doses for the Hanford Thyroid Disease Study (HTDS). Project scope and accuracy requirements defined in FY 1992 can be translated into model and data requirements that must be satisfied during FY 1993.
Dose rate analysis for Tank 101 AZ (Project W151)
International Nuclear Information System (INIS)
Schwarz, R.A.; Hillesland, K.E.; Carter, L.L.
1994-11-01
This document describes the expected dose rates for modifications to tank 101 AZ, including modifications to the steam coil, mixer pump, and temperature probes. The thrust of the effort is to determine dose rates from: modification of a steam coil and caisson; the installation of mixer pumps; the installation of temperature probes; and estimates of dose rates that will be encountered while making these changes. Because the dose rates for all of these configurations depend upon the photon source within the supernate and sludge, comparisons were also made between measured dose rates within a drywell and the corresponding calculated dose rates. The calculational tool used is a Monte Carlo (MCNP) code, since complicated three-dimensional geometries are involved. A summary of the most important results of the entire study is given in Section 2. The basic calculational geometry model of the tank is discussed in Section 3, along with a tabulation of the photon sources that were used within the supernate and the sludge, and a discussion of uncertainties. The calculated dose rates around the steam coil and caisson before and after modification are discussed in Section 4. The configuration for the installation of the mixer pumps and the resulting dose rates are given in Section 5. The predicted changes in dose rates due to a possible dilution of the supernate source are given in Section 6. The calculational configuration used to model the installation of temperature probes and the resulting predicted dose rates are discussed in Section 7. Finally, comparisons of measured to calculated dose rates within a drywell are summarized in Section 8. Extended discussions of calculational models and Monte Carlo optimization techniques used are included in Appendix A.
Energy Technology Data Exchange (ETDEWEB)
Heeb, C.M.
1994-05-01
The purpose of the Hanford Environmental Dose Reconstruction Project is to estimate the radiation dose that individuals could have received as a result of radionuclide emissions since 1944 from the Hanford Site. The first step in determining dose is to estimate the amount and timing of radionuclide releases to air and water. This report provides the air release information.
Directory of Open Access Journals (Sweden)
Slama Remy
2006-03-01
Full Text Available Background: Male gonadal exposure to ionizing radiation may disrupt spermatogenesis, but its influence on the fecundity of couples has rarely been studied. We aimed to characterize the influence of the male gonadal dose of ionizing radiation delivered during radiodiagnostic examinations on the monthly probability of pregnancy. Methods: We recruited a random sample of women who retrospectively described 1110 periods of unprotected intercourse beginning between 1985 and 1999 and leading either to a live birth or to no pregnancy; their duration was censored after 13 months. The male partner answered a telephone questionnaire on radiodiagnostic examinations. We assigned a mean gonadal dose to each type of radiodiagnostic examination. We defined the male dose for each period of unprotected intercourse as the sum of the gonadal doses of the X-ray examinations experienced between 18 years of age and the date of discontinuation of contraception. Time to pregnancy was analysed using a discrete Cox model with a random effect, allowing estimation of hazard ratios of pregnancy. Results: After adjustment for female factors likely to influence fecundity, there was no evidence of an association between male dose and the probability of pregnancy (test of homogeneity, p = 0.55). When compared to couples with a male gonadal dose between 0.01 and 0.20 milligrays (n = 321 periods of unprotected intercourse), couples with a gonadal dose above 10 milligrays had a hazard ratio of pregnancy of 1.44 (95% confidence interval, 0.73-2.86; n = 31). Conclusion: Our study provides no evidence of a long-term detrimental effect of the male gonadal dose of ionizing radiation delivered during radiodiagnostic examinations on the monthly probability of pregnancy during the year following discontinuation of contraceptive use. Classification errors due to the retrospective assessment of male gonadal exposure may have limited the statistical power of our study.
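In discrete time, a constant monthly probability p of conception implies a cumulative probability of pregnancy of 1 - (1 - p)^n after n cycles, which is the quantity the time-to-pregnancy analysis tracks. In the sketch below, the baseline fecundability is an assumed illustrative value and treating the hazard ratio as a simple multiplier on the per-cycle probability is a simplification; only the hazard ratio of 1.44 comes from the abstract:

```python
def prob_pregnant_by(months, monthly_p):
    """Discrete-time survival: P(pregnant within n cycles) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - monthly_p) ** months

base = 0.20                      # assumed baseline monthly probability (illustrative)
hr = 1.44                        # hazard ratio reported for the >10 mGy group
exposed = min(base * hr, 1.0)    # crude per-cycle probability under that HR

p12_base = prob_pregnant_by(12, base)
p12_exposed = prob_pregnant_by(12, exposed)
```

Because a hazard ratio above 1 raises the per-cycle probability, the point estimate of 1.44 corresponds to a slightly higher cumulative pregnancy probability over 12 cycles, consistent with the study finding no detrimental effect.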
Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios
Directory of Open Access Journals (Sweden)
Okjeong Lee
2016-01-01
Full Text Available According to the IPCC Fifth Assessment Report, air temperature and humidity are expected to gradually increase in the future over current levels. In this study, future probable maximum precipitations (PMPs) are estimated by using future dew point temperature projection data obtained from RCM data provided by the Korea Meteorological Administration. First, the bias in the future dew point temperature projection data, which are provided on a daily basis, is corrected through a quantile-mapping method. Next, using a scale-invariance technique, the 12-hour-duration, 100-year-return-period dew point temperatures that are essential input data for PMP estimation are estimated from the bias-corrected future dew point temperature data. After estimating future PMPs, it is shown that PMPs under all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5) are very likely to increase.
Dose and risk assessment approach for the Fernald CERCLA D ampersand D Project
International Nuclear Information System (INIS)
Throckmorton, J.D.; Clark, T.R.; Waligora, S.J. Jr.; Haaker, R.F.
1994-01-01
At the Fernald Environmental Management Project (FEMP), the uranium processing facilities used from 1952 through 1989 are near or beyond their intended design life. These conditions present an increasing probability of future releases of hazardous substances to the environment. To support a decision by the U.S. Department of Energy (DOE) and the Environmental Protection Agency (EPA) to remediate the buildings, a dose and risk assessment was performed to determine the extent of exposure that would be associated with the controlled decontamination and dismantlement (D&D) of the Fernald facilities. A conceptual risk assessment model was developed, with exposure mechanisms and associated pathways for each potential receptor. The three receptor groups were defined as the remediation workers, other on-site workers (those not performing D&D), and off-site residents. For use in the conceptual model, an airborne source term was developed through process knowledge, other historical information and data, and air sample data from within the facilities. Individual and collective doses and risks were developed for each receptor and for each population group. The risk assessment demonstrated that all exposures resulting from the action would be within the acceptable DOE administrative control level of 2.0 rem per year for occupational workers and the acceptable EPA risk range of 10⁻⁶ to 10⁻⁴ for the general public.
International Nuclear Information System (INIS)
Morris, R.
1996-05-01
Building 2 on the U.S. Department of Energy (DOE) Grand Junction Projects Office (GJPO) site, which is operated by Rust Geotech, is part of the GJPO Remedial Action Program. This report describes measurements and modeling efforts to evaluate the radiation dose to members of the public who might someday occupy or tear down Building 2. The assessment of future doses to those occupying or demolishing Building 2 is based on assumptions about future uses of the building, measured data when available, and predictive modeling when necessary. Future use of the building is likely to be as an office facility. The DOE-sponsored program RESRAD-BUILD, Version 1.5, was chosen as the modeling tool. Releasing the building for unrestricted use instead of demolishing it now could save a substantial amount of money compared with the baseline cost estimate, because the site telecommunications system, housed in Building 2, would not have to be disabled and replaced. The information developed in this analysis may be used as part of an as low as reasonably achievable (ALARA) cost/benefit determination regarding disposition of Building 2.
International Nuclear Information System (INIS)
Leyton, Fernando; Nogueira, Maria S.; Gubolino, Luiz A.; Pivetta, Makyson R.; Ubeda, Carlos
2016-01-01
Studies have reported cases of radiation-induced cataract among cardiology professionals. In view of the evidence from epidemiological studies, the ICRP recommends a new threshold for opacities and a new eye-lens dose limit of 20 mSv per year for occupational exposure. The aim of this paper is to report scattered radiation doses at the height of the operator's eye in an interventional cardiology facility, without radiation protection devices in place, and to correlate these values with different angiographic projections and operational modes. Measurements were taken in a cardiac laboratory with an angiography X-ray system equipped with a flat-panel detector. PMMA plates of 30×30×5 cm were stacked to a total thickness of 20 cm. Measurements were taken in two fluoroscopy modes (low and normal, 15 pulses/s) and in cine mode (15 frames/s). Four angiographic projections were used: anterior-posterior; lateral; left anterior oblique caudal (spider); and left anterior oblique cranial, with a cardiac protocol for patients weighing between 70 and 90 kg. Measurements of the phantom entrance dose rate and the scatter dose rate were performed with two Unfors Xi plus detectors. The detector measuring scatter radiation was positioned at the usual distance of the cardiologist's eyes during working conditions. There is a good linear correlation between the kerma-area product and the scatter dose at the lens. Experimental correlation factors of 2.3, 12.0, 12.2 and 17.6 μSv/(Gy·cm²) were found for the different projections. PMMA entrance dose rates for the low-fluoroscopy, medium-fluoroscopy, and cine modes were 13, 39 and 282 mGy/min, respectively, for the AP projection. - Highlights: • A method is presented to estimate the scatter radiation dose at operator eye height. • The method allows estimating scatter radiation dose by measuring ambient dose equivalent. • Operators could exceed the threshold for lens opacities if protection tools are not used. • There is a good linear correlation between kerma
Required doses for projection methods in X-ray diagnosis
International Nuclear Information System (INIS)
Hagemann, G.
1992-01-01
The ideal dose requirement was stated by Cohen et al. (1981) in a formula based on a parallel beam, maximum quantum yield, and the Bucky grid effect, depending on the signal-to-noise ratio and object contrast. This was checked by means of contrast-detail diagrams measured with a hole phantom and was additionally compared with measurement results obtained with acrylic glass phantoms. The optimal dose requirement is obtained by the closest technically possible approach to the ideal requirement level. Examples are given for X-ray equipment with Gd2O2S screen-film systems, for grid-screen mammography, and for new thoracic examination systems for mass screening. Finally, a few values concerning the dose requirement, or the analogous time required for fluorescent screening, are stated for angiography and interventional radiology, as well as for dentistry and paediatric X-ray diagnostics. (orig./HP)
Energy Technology Data Exchange (ETDEWEB)
Leyton, F.; Nogueira, M. S.; Da Silva, T. A. [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Post-graduation in Sciences and Technology of Radiations, Minerals and Materials, Pte. Antonio Carlos No. 6627, Belo Horizonte 31270-901, Minas Gerais (Brazil); Gubolino, L.; Pivetta, M. R. [Hospital dos Fornecedores de Cana de Piracicaba, Av. Barao de Valenca 616, 13405-233 Piracicaba (Brazil); Ubeda, C., E-mail: leyton.fernando@gmail.com [Tarapaca University, Health Sciences Faculty, Radiological Sciences Center, Av. Gral. Velasquez 1775, 1000007 Arica, Arica and Parinacota (Chile)
2015-10-15
Cases of radiation-induced cataract among cardiology professionals have been reported in several studies. In view of the evidence of radiation injuries, the ICRP recommends limiting the radiation dose to the lens to 20 mSv per year for occupational exposure. The aim of this work was to report scattered radiation doses at the height of the operator's eye in an interventional cardiology facility, from procedures performed without the use of radiation protection devices, correlated with different angiographic projections and operational modes. Measurements were made in a cardiac laboratory with a GE angiography X-ray system equipped with a flat-panel detector. PMMA plates of 30 x 30 x 5 cm were used to simulate a patient with a thickness of 20 cm. Two fluoroscopy modes (low and normal, 15 frames/s) and a cine mode (15 frames/s) were used, with four angiographic projections: anterior-posterior (AP), lateral (Lat), left anterior oblique caudal (spider), and left anterior oblique cranial (Lao-45/cra-30), under a cardiac protocol for patients between 70 and 90 kg. Measurements of the phantom entrance dose rates and scatter dose rates were performed with two Unfors Xi plus detectors. The detector measuring scatter radiation was positioned at the usual distance of the cardiologist's eyes during working conditions (1 m from the isocenter and 1.7 m from the floor). There is a good linear correlation between the kerma-area product and the scatter dose at the lens. Experimental correlation factors of 2.3, 12.0, 12.2 and 17.6 μSv/(Gy·cm²) were found for the AP, Lao/cra, spider and Lat projections, respectively. The PMMA entrance dose rates for low fluoroscopy, medium fluoroscopy and cine were 13, 39 and 282 mGy/min, respectively, for AP. (Author)
International Nuclear Information System (INIS)
Leyton, F.; Nogueira, M. S.; Da Silva, T. A.; Gubolino, L.; Pivetta, M. R.; Ubeda, C.
2015-10-01
Cases of radiation-induced cataract among cardiology professionals have been reported in several studies. In view of the evidence of radiation injuries, the ICRP recommends limiting the radiation dose to the lens to 20 mSv per year for occupational exposure. The aim of this work was to report scattered radiation doses at the height of the operator's eye in an interventional cardiology facility, from procedures performed without the use of radiation protection devices, correlated with different angiographic projections and operational modes. Measurements were made in a cardiac laboratory with a GE angiography X-ray system equipped with a flat-panel detector. PMMA plates of 30 x 30 x 5 cm were used to simulate a patient with a thickness of 20 cm. Two fluoroscopy modes (low and normal, 15 frames/s) and a cine mode (15 frames/s) were used, with four angiographic projections: anterior-posterior (AP), lateral (Lat), left anterior oblique caudal (spider), and left anterior oblique cranial (Lao-45/cra-30), under a cardiac protocol for patients between 70 and 90 kg. Measurements of the phantom entrance dose rates and scatter dose rates were performed with two Unfors Xi plus detectors. The detector measuring scatter radiation was positioned at the usual distance of the cardiologist's eyes during working conditions (1 m from the isocenter and 1.7 m from the floor). There is a good linear correlation between the kerma-area product and the scatter dose at the lens. Experimental correlation factors of 2.3, 12.0, 12.2 and 17.6 μSv/(Gy·cm²) were found for the AP, Lao/cra, spider and Lat projections, respectively. The PMMA entrance dose rates for low fluoroscopy, medium fluoroscopy and cine were 13, 39 and 282 mGy/min, respectively, for AP. (Author)
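Using the per-projection correlation factors reported above, an unshielded operator's eye-lens dose can be roughly projected from the kerma-area product (KAP) of a procedure. The example KAP value and the procedure count below are hypothetical; only the factors and the 20 mSv/year limit come from the abstracts:

```python
# uSv of eye-lens scatter dose per Gy*cm^2 of KAP, per projection (from the study)
FACTORS = {"AP": 2.3, "LAO/cra": 12.0, "spider": 12.2, "LAT": 17.6}
ANNUAL_LIMIT_USV = 20000.0   # ICRP eye-lens limit: 20 mSv per year

def eye_dose_usv(projection, kap_gycm2):
    """Unshielded operator eye-lens dose projected from the procedure KAP."""
    return FACTORS[projection] * kap_gycm2

# Hypothetical procedure delivering 50 Gy*cm^2 entirely in the lateral projection
dose = eye_dose_usv("LAT", 50.0)
procedures_per_year = ANNUAL_LIMIT_USV / dose
```

The lateral projection, with the largest factor, dominates the eye dose; this kind of back-of-the-envelope projection is why the highlights warn that unprotected operators could exceed the lens-opacity threshold.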
International Nuclear Information System (INIS)
WEISS, E.V.
2000-01-01
This report provides estimates of the expected whole body and extremity radiological dose, expressed as dose equivalent (DE), to workers conducting planned plutonium (Pu) stabilization processes at the Hanford Site Plutonium Finishing Plant (PFP). The report is based on a time and motion dose study commissioned for Project W-460, Plutonium Stabilization and Handling, to provide personnel exposure estimates for construction work in the PFP storage vault area plus operation of stabilization and packaging equipment at PFP
Weiss, E V
2000-01-01
This report provides estimates of the expected whole body and extremity radiological dose, expressed as dose equivalent (DE), to workers conducting planned plutonium (Pu) stabilization processes at the Hanford Site Plutonium Finishing Plant (PFP). The report is based on a time and motion dose study commissioned for Project W-460, Plutonium Stabilization and Handling, to provide personnel exposure estimates for construction work in the PFP storage vault area plus operation of stabilization and packaging equipment at PFP.
Dose-projection considerations for emergency conditions at nuclear power plants
International Nuclear Information System (INIS)
Stoetzel, G.A.; Ramsdell, J.V.; Poeton, R.W.; Powell, D.C.; Desrosiers, A.E.
1983-05-01
The purpose of this report is to review the problems and issues associated with making environmental radiation-dose projections during emergencies at nuclear power plants. The review is divided into three areas: source-term development, characterization of atmospheric dispersion and selection of appropriate dispersion models, and development of dosimetry calculations for determining thyroid dose and whole-body dose for ground-level and elevated releases. A discussion of uncertainties associated with these areas is also provided
Dose-projection considerations for emergency conditions at nuclear power plants
Energy Technology Data Exchange (ETDEWEB)
Stoetzel, G.A.; Ramsdell, J.V.; Poeton, R.W.; Powell, D.C.; Desrosiers, A.E.
1983-05-01
The purpose of this report is to review the problems and issues associated with making environmental radiation-dose projections during emergencies at nuclear power plants. The review is divided into three areas: source-term development, characterization of atmospheric dispersion and selection of appropriate dispersion models, and development of dosimetry calculations for determining thyroid dose and whole-body dose for ground-level and elevated releases. A discussion of uncertainties associated with these areas is also provided.
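The chain described in the report, atmospheric dispersion followed by a dose calculation, can be illustrated with the standard Gaussian plume expression for a ground-level receptor downwind of an elevated release, combined with an inhalation dose coefficient. This is a generic textbook sketch under stated assumptions, not the report's own model, and all names are hypothetical:

```python
import math

def ground_level_concentration(q_bq_s, u_m_s, sigma_y_m, sigma_z_m, y_m, stack_height_m):
    """Gaussian plume air concentration (Bq/m^3) at ground level for an
    elevated point release, assuming total ground reflection."""
    return (q_bq_s / (math.pi * sigma_y_m * sigma_z_m * u_m_s)
            * math.exp(-y_m**2 / (2.0 * sigma_y_m**2))
            * math.exp(-stack_height_m**2 / (2.0 * sigma_z_m**2)))

def inhalation_dose_rate_sv_s(concentration_bq_m3, breathing_rate_m3_s, dcf_sv_bq):
    """Committed dose rate (Sv/s) from inhaling air at the given concentration,
    using a nuclide-specific dose conversion factor (Sv/Bq)."""
    return concentration_bq_m3 * breathing_rate_m3_s * dcf_sv_bq
```

An elevated release reduces the ground-level concentration through the stack-height term, which is the source of the ground-level versus elevated-release distinction mentioned in the abstract.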
Saltybaeva, Natalia; Krauss, Andreas; Alkadhi, Hatem
2017-03-01
Purpose To calculate the contribution of the localizer radiography projection to the total radiation dose, including both the dose from localizer radiography and that from the subsequent chest computed tomography (CT) with tube current modulation (TCM). Materials and Methods An anthropomorphic phantom was scanned with 192-section CT without and with differently sized breast attachments. Chest CT with TCM was performed after one localizer radiographic examination with an anteroposterior (AP) or posteroanterior (PA) projection. Dose distributions were obtained by means of Monte Carlo simulations based on the acquired CT data. For Monte Carlo simulations of localizer radiography, the tube position was fixed at 0° and 180°; for chest CT, a spiral trajectory with TCM was used. The effect of tube start angles on dose distribution was investigated with Monte Carlo simulations by using TCM curves with fixed start angles (0°, 90°, and 180°). Total doses for lungs, heart, and breast were calculated as the sum of the doses from localizer radiography and CT. Image noise was defined as the standard deviation of attenuation measured in 14 circular regions of interest. The Wilcoxon signed rank test, paired t test, and Friedman analysis of variance were conducted to evaluate differences in noise, TCM curves, and organ doses, respectively. Results Organ doses from localizer radiography were lower when using a PA instead of an AP projection (P = .005). The use of a PA projection resulted in higher TCM values for chest CT (P chest CT. © RSNA, 2016 Online supplemental material is available for this article.
Preliminary design review report for K Basin Dose Reduction Project
International Nuclear Information System (INIS)
Blackburn, L.D.
1996-01-01
The strategy for reducing radiation dose originating from radionuclides absorbed in the K East Basin concrete is to raise the pool water level to provide additional shielding. This report documents a preliminary design review conducted to ensure that design approaches for cleaning/coating basin walls and modifying other basin components were appropriate. The conclusion of this review was that design documents presently completed or in the process of modification are an acceptable basis for proceeding to complete the design
International Nuclear Information System (INIS)
Levegruen, Sabine; Jackson, Andrew; Zelefsky, Michael J.; Venkatraman, Ennapadam S.; Skwarchuk, Mark W.; Schlegel, Wolfgang; Fuks, Zvi; Leibel, Steven A.; Ling, C. Clifton
2000-01-01
Purpose: To investigate tumor control following three-dimensional conformal radiation therapy (3D-CRT) of prostate cancer and to identify dose-distribution variables that correlate with local control assessed through posttreatment prostate biopsies. Methods and Materials: Data from 132 patients, treated at Memorial Sloan-Kettering Cancer Center (MSKCC), who had a prostate biopsy 2.5 years or more after 3D-CRT for T1c-T3 prostate cancer with prescription doses of 64.8-81 Gy were analyzed. Variables derived from the dose distribution in the PTV included: minimum dose (Dmin), maximum dose (Dmax), mean dose (Dmean), and dose to n% of the PTV (Dn), where n = 1%, ..., 99%. The concept of the equivalent uniform dose (EUD) was evaluated for different values of the surviving fraction at 2 Gy (SF2). Four tumor control probability (TCP) models (one phenomenologic model using a logistic function and three Poisson cell kill models) were investigated using two sets of input parameters, one for low and one for high T-stage tumors. Application of both sets to all patients was also investigated. In addition, several tumor-related prognostic variables were examined (including T-stage and Gleason score). Univariate and multivariate logistic regression analyses were performed. The ability of the logistic regression models (univariate and multivariate) to predict the biopsy result correctly was tested by performing cross-validation analyses and evaluating the results in terms of receiver operating characteristic (ROC) curves. Results: In univariate analysis, prescription dose (Dprescr), Dmax, Dmean, and dose to n% of the PTV with n of 70% or less correlate with outcome (p 2 : EUD correlates significantly with outcome for SF2 of 0.4 or more, but not for lower SF2 values. Using either of the two input parameter sets, all TCP models correlate with outcome (p 2 , is limited because the low dose region may not coincide with the tumor location. Instead, for MSKCC prostate cancer patients with their
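The EUD concept evaluated above can be stated compactly. The sketch below follows the common SF2-based (Niemierko-style) formulation for a differential DVH; the function name is ours and the values used in the comments are illustrative, not the study's data:

```python
import math

def eud_gy(dose_bins_gy, volume_fractions, sf2):
    """SF2-based equivalent uniform dose (Gy) for a differential DVH:
    EUD = 2 * ln( sum_i v_i * SF2^(D_i/2) ) / ln(SF2).
    For a uniform dose distribution this reduces to the physical dose."""
    survival = sum(v * sf2 ** (d / 2.0)
                   for d, v in zip(dose_bins_gy, volume_fractions))
    return 2.0 * math.log(survival) / math.log(sf2)

# e.g. a uniform 70 Gy gives EUD = 70 Gy; splitting the volume between a
# 60 Gy cold spot and an 80 Gy hot spot pulls the EUD toward the cold spot.
```

Because cell survival is exponential in dose, the cold-spot term dominates the sum, which is why EUD penalizes underdosed subvolumes more than the mean dose does.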
International Nuclear Information System (INIS)
Shipler, D.B.
1993-09-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to representative individuals. The primary objective of work to be performed through May 1994 is to determine the project's appropriate scope: space, time, radionuclides, pathways and representative individuals; determine the project's appropriate level of accuracy/level of uncertainty in dose estimates; complete model and data development; and estimate doses for the Hanford Thyroid Disease Study and representative individuals. A major objective of the HEDR Project is to estimate doses to the thyroid of individuals who were exposed to iodine-131. A principal pathway for many of these individuals was milk from cows that ate vegetation contaminated by iodine-131 released into the air from Hanford facilities. The plan for June 1992 through May 1994 has been prepared based on activities and budgets approved by the Technical Steering Panel (TSP) at its meetings on January 7--9, 1993 and February 25--26, 1993. The activities can be divided into three broad categories: (1) computer code and data development activities, (2) calculation of doses, and (3) technical and communication support to the TSP and the TSP Native American Working Group (NAWG). The following activities will be conducted to accomplish project objectives through May 1994
Directory of Open Access Journals (Sweden)
Annika eJakobi
2015-11-01
Introduction: Presently used radio-chemotherapy regimens result in moderate local control rates for patients with advanced head and neck squamous cell carcinoma (HNSCC). Dose escalation (DE) may be an option to improve patient outcome, but may also increase the risk of toxicity in healthy tissue. The presented treatment planning study evaluated the feasibility of two DE levels for advanced HNSCC patients, planned with either intensity-modulated photon therapy (IMXT) or proton therapy (IMPT). Materials and Methods: For 45 HNSCC patients, IMXT and IMPT treatment plans were created including DE via a simultaneous integrated boost (SIB) in the high-risk volume, while maintaining standard fractionation with 2 Gy per fraction in the remaining target volume. Two DE levels for the SIB were compared: 2.3 Gy and 2.6 Gy per fraction. Treatment plan evaluation included assessment of tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Results: An increase of approximately 10% in TCP was estimated between the DE levels. A pronounced high-dose rim surrounding the SIB volume was identified in IMXT treatment. Compared to IMPT, this extra dose slightly increased the TCP values and, to a larger extent, the NTCP values. For both modalities, the higher DE level led only to a small increase in NTCP values (mean differences < 2% in all models), except for the risk of aspiration, which increased on average by 8% and 6% with IMXT and IMPT, respectively, but showed considerable patient dependence. Conclusions: Both DE levels appear applicable with IMXT and IMPT, since all calculated NTCP values except one increased only slightly at the higher DE level, while the estimated TCP increase is of relevant magnitude. The higher DE schedule needs to be investigated carefully in the setting of a prospective clinical trial, especially regarding toxicities caused by high local doses that lack a sound dose-response description, e.g., ulcers.
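TCP estimates of the kind compared above are typically produced by Poisson cell-kill models under the linear-quadratic model. A generic, hedged sketch (the parameter values in the test are illustrative, not those fitted in the study):

```python
import math

def tcp_poisson(n_clonogens, dose_per_fraction_gy, n_fractions, alpha, beta):
    """Poisson cell-kill tumour control probability under the LQ model:
    surviving clonogens S = N * exp(-n*(alpha*d + beta*d^2)), TCP = exp(-S).
    A generic textbook formulation, not the study's specific parameter set."""
    d, n = dose_per_fraction_gy, n_fractions
    surviving = n_clonogens * math.exp(-n * (alpha * d + beta * d * d))
    return math.exp(-surviving)
```

Raising the dose per fraction in the boost volume, as in the 2.3 Gy versus 2.6 Gy comparison, increases cell kill exponentially and hence TCP, which is the mechanism behind the roughly 10% TCP gain estimated between the two DE levels.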
Columbia River pathway report: phase I of the Hanford Environmental Dose Reconstruction Project
Energy Technology Data Exchange (ETDEWEB)
1991-07-01
This report summarizes the river-pathway portion of the first phase of the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project is estimating radiation doses that could have been received by the public from the Department of Energy's Hanford Site, in southeastern Washington State. Phase 1 of the river-pathway dose reconstruction effort sought to determine whether dose estimates could be calculated for populations in the area from above the Hanford Site at Priest Rapids Dam to below the site at McNary Dam from January 1964 to December 1966. Of the potential sources of radionuclides from the river, fish consumption was the most important. Doses from drinking water were lower at Pasco than at Richland and lower at Kennewick than at Pasco. The median values of preliminary dose estimates calculated by HEDR are similar to independent, previously published estimates of average doses to Richland residents. Later phases of the HEDR Project will address dose estimates for periods other than 1964--1966 and for populations downstream of McNary Dam. 17 refs., 19 figs., 1 tab.
Energy Technology Data Exchange (ETDEWEB)
Silveira, T.B.; Cerbaro, B.Q.; Rosa, L.A.R. da, E-mail: thiago.fisimed@gmail.com, E-mail: tbsilveira@inca.gov.br [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro - RJ (Brazil)
2017-07-01
The aim of this work was to implement a simple algorithm to evaluate the isocenter dose in a phantom using the back-projected transmitted dose acquired with an Electronic Portal Imaging Device (EPID) available on a Varian Trilogy accelerator with two nominal photon beams of 6 and 10 MV. The algorithm was developed in MATLAB to calibrate the EPID-measured dose in absolute dose, using a deconvolution process, and to incorporate all scattering and attenuation contributions due to photon interactions with the phantom. The modeling process was simplified by using empirical curve fits to describe the contributions of scattering and attenuation effects. The implemented algorithm and method were validated using 19 patient treatment plans with 104 clinical irradiation fields projected onto the phantom. Results for EPID absolute dose calibration by deconvolution showed percent deviations lower than 1%. The final method validation presented average percent deviations between isocenter doses calculated by back-projection and isocenter doses determined with an ionization chamber of 1.86% (SD 1.00%) and -0.94% (SD 0.61%) for 6 and 10 MV, respectively. A normalized field-by-field analysis showed deviations smaller than 2% for 89% of all data for 6 MV beams and 94% for 10 MV beams. It was concluded that the proposed algorithm is sufficiently accurate for in vivo dosimetry, being sensitive enough to detect dose delivery errors larger than 3-4% for conformal and intensity-modulated radiation therapy techniques. (author)
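The back-projection idea can be caricatured in a few lines: absolute calibration of the EPID reading, an inverse-square step from the imager plane back to the isocenter, and an empirical transmission correction for the phantom. This is a deliberately simplified stand-in for the deconvolution-based algorithm described above, with hypothetical names and factors:

```python
# Simplified back-projection sketch (our illustration, not the authors' MATLAB
# code): EPID reading -> absolute dose at the imager -> dose at the isocenter.

def isocenter_dose_gy(epid_reading_cu, cal_factor_gy_per_cu,
                      sdd_cm, sad_cm, transmission_factor):
    """epid_reading_cu: EPID signal in calibrated units.
    cal_factor_gy_per_cu: absolute calibration (stands in for the deconvolution step).
    sdd_cm / sad_cm: source-to-detector and source-to-axis distances.
    transmission_factor: empirical phantom transmission (attenuation + scatter)."""
    dose_at_epid = epid_reading_cu * cal_factor_gy_per_cu       # absolute calibration
    dose_backprojected = dose_at_epid * (sdd_cm / sad_cm) ** 2  # inverse-square to isocenter plane
    return dose_backprojected / transmission_factor             # undo phantom transmission
```

In the real algorithm the calibration and transmission terms are field- and depth-dependent empirical curves rather than constants, which is exactly what the deconvolution and curve-fit steps in the abstract provide.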
International Nuclear Information System (INIS)
Vanhavere, F.; Carinou, E.; Domienik, J.; Donadille, L.; Ginjaume, M.; Gualdrini, G.; Koukorava, C.; Krim, S.; Nikodemova, D.; Ruiz-Lopez, N.; Sans-Merce, M.; Struelens, L.
2011-01-01
Within the ORAMED project (Optimization of Radiation Protection of Medical Staff), a coordinated measurement program for occupationally exposed medical staff was performed in different hospitals in Europe (www.oramed-fp7.eu). The main objective was to obtain a set of standardized data on extremity and eye lens doses for staff involved in interventional radiology and cardiology and to optimize radiation protection. Special attention was given to the measurement of doses to the eye lens. This paper gives an overview of the measured eye lens doses and the main factors influencing them. The measured eye lens doses were extrapolated to annual doses; the extrapolations showed that monitoring of the eye lens should be performed on a routine basis.
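The extrapolation from a per-procedure to an annual eye-lens dose is a simple scaling by an assumed annual workload; a sketch (the 20 mSv/y comparison value is the ICRP limit cited elsewhere in this collection, while the workload and per-procedure figures below are purely illustrative):

```python
ICRP_EYE_LENS_LIMIT_MSV_PER_YEAR = 20.0  # occupational limit cited in this collection

def annual_eye_lens_dose_msv(dose_per_procedure_usv: float,
                             procedures_per_year: int) -> float:
    """Extrapolate a measured per-procedure eye-lens dose (uSv) to an
    annual dose (mSv) for an assumed annual workload."""
    return dose_per_procedure_usv * procedures_per_year / 1000.0
```

For example, an unprotected operator receiving 50 μSv per procedure would reach the 20 mSv/y limit at about 400 procedures per year, which is why routine monitoring is recommended.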
NRC source term assessment for incident response dose projections
International Nuclear Information System (INIS)
Easley, P.; Pasedag, W.
1984-01-01
The NRC provides advice and assistance to licensees and State and local authorities in responding to accidents. The TACT code supports this function by providing source term projections for two situations during early (15 to 60 minutes) accident response: (1) Core/containment damage is indicated, but there are no measured releases. Quantification of a predicted release permits emergency response before people are exposed. With TACT, response personnel can estimate releases based on fuel and cladding conditions, coolant boundary and containment integrity, and mitigative systems operability. For this type of estimate, TACT is intermediate between default assumptions and time-consuming mechanistic codes. (2) A combination of plant status and limited release data are available. In this situation, iterating between predictions based on known conditions and comparisons with measured releases gives reasonable confidence in supplemental source term information otherwise unavailable: nuclide mix, releases not monitored, and trending or abrupt changes. The assumptions and models used in TACT, and examples of its use, are given in this paper
Energy Technology Data Exchange (ETDEWEB)
Hauck, Carlin R.; Ye, Hong; Chen, Peter Y.; Gustafson, Gary S.; Limbacher, Amy; Krauss, Daniel J., E-mail: Daniel.krauss@beaumont.edu
2017-05-01
Purpose: Prostate-specific antigen (PSA) bounce is a temporary elevation of the PSA level above a prior nadir. The purpose of this study was to determine whether the frequency of a PSA bounce following high-dose-rate (HDR) interstitial brachytherapy for the treatment of prostate cancer is associated with individual treatment fraction size. Methods and Materials: Between 1999 and 2014, 554 patients underwent treatment of low- or intermediate-risk prostate cancer with definitive HDR brachytherapy as monotherapy and had ≥3 subsequent PSA measurements. Four different fraction sizes were used: 950 cGy × 4 fractions, 1200 cGy × 2 fractions, 1350 cGy × 2 fractions, 1900 cGy × 1 fraction. Four definitions of PSA bounce were applied: ≥0.2, ≥0.5, ≥1.0, and ≥2.0 ng/mL above the prior nadir with a subsequent return to the nadir. Results: The median follow-up period was 3.7 years. The actuarial 3-year rate of PSA bounce for the entire cohort was 41.3%, 28.4%, 17.4%, and 6.8% for nadir +0.2, +0.5, +1.0, and +2.0 ng/mL, respectively. The 3-year rate of PSA bounce >0.2 ng/mL was 42.2%, 32.1%, 41.0%, and 59.1% for the 950-, 1200-, 1350-, and 1900-cGy/fraction levels, respectively (P=.002). The hazard ratio for bounce >0.2 ng/mL for patients receiving a single fraction of 1900 cGy compared with those receiving treatment in multiple fractions was 1.786 (P=.024). For patients treated with a single 1900-cGy fraction, the 1-, 2-, and 3-year rates of PSA bounce exceeding the Phoenix biochemical failure definition (nadir +2 ng/mL) were 4.5%, 18.7%, and 18.7%, respectively, higher than the rates for all other administered dose levels (P=.025). Conclusions: The incidence of PSA bounce increases with single-fraction HDR treatment. Knowledge of posttreatment PSA kinetics may aid in decision making regarding management of potential biochemical failures.
International Nuclear Information System (INIS)
Hauck, Carlin R.; Ye, Hong; Chen, Peter Y.; Gustafson, Gary S.; Limbacher, Amy; Krauss, Daniel J.
2017-01-01
Purpose: Prostate-specific antigen (PSA) bounce is a temporary elevation of the PSA level above a prior nadir. The purpose of this study was to determine whether the frequency of a PSA bounce following high-dose-rate (HDR) interstitial brachytherapy for the treatment of prostate cancer is associated with individual treatment fraction size. Methods and Materials: Between 1999 and 2014, 554 patients underwent treatment of low- or intermediate-risk prostate cancer with definitive HDR brachytherapy as monotherapy and had ≥3 subsequent PSA measurements. Four different fraction sizes were used: 950 cGy × 4 fractions, 1200 cGy × 2 fractions, 1350 cGy × 2 fractions, 1900 cGy × 1 fraction. Four definitions of PSA bounce were applied: ≥0.2, ≥0.5, ≥1.0, and ≥2.0 ng/mL above the prior nadir with a subsequent return to the nadir. Results: The median follow-up period was 3.7 years. The actuarial 3-year rate of PSA bounce for the entire cohort was 41.3%, 28.4%, 17.4%, and 6.8% for nadir +0.2, +0.5, +1.0, and +2.0 ng/mL, respectively. The 3-year rate of PSA bounce >0.2 ng/mL was 42.2%, 32.1%, 41.0%, and 59.1% for the 950-, 1200-, 1350-, and 1900-cGy/fraction levels, respectively (P=.002). The hazard ratio for bounce >0.2 ng/mL for patients receiving a single fraction of 1900 cGy compared with those receiving treatment in multiple fractions was 1.786 (P=.024). For patients treated with a single 1900-cGy fraction, the 1-, 2-, and 3-year rates of PSA bounce exceeding the Phoenix biochemical failure definition (nadir +2 ng/mL) were 4.5%, 18.7%, and 18.7%, respectively, higher than the rates for all other administered dose levels (P=.025). Conclusions: The incidence of PSA bounce increases with single-fraction HDR treatment. Knowledge of posttreatment PSA kinetics may aid in decision making regarding management of potential biochemical failures.
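The bounce definitions applied above (a rise of ≥0.2, ≥0.5, ≥1.0, or ≥2.0 ng/mL over the prior nadir with a subsequent return to the nadir) translate directly into a small classifier over a PSA time series. An illustrative sketch, not clinical software:

```python
def has_psa_bounce(psa_series, threshold_ng_ml=0.2):
    """Detect a PSA bounce per the definitions above: PSA rises at least
    `threshold_ng_ml` above the running nadir and later returns to (or
    below) that nadir. Illustrative only."""
    nadir = float("inf")
    bounced = False
    for psa in psa_series:
        if bounced and psa <= nadir:
            return True                  # rose above nadir + threshold, then returned
        if psa >= nadir + threshold_ng_ml:
            bounced = True               # elevation above the prior nadir
        nadir = min(nadir, psa)          # update running nadir
    return False
```

A rise without a subsequent return does not count as a bounce under these definitions; with a 2.0 ng/mL threshold, such a rise would instead meet the Phoenix biochemical failure criterion (nadir + 2 ng/mL) mentioned in the abstract.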
Data base on dose reduction research projects for nuclear power plants: Volume 3
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Baum, J.W.
1989-05-01
This is the third volume in a series of reports that provide information on dose-reduction research and health physics technology for nuclear power plants. The information is taken from a data base maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. This report presents information on 80 new projects, covering a wide range of activities. Projects on steam generator degradation, decontamination, robotics, improvement in reactor materials, and inspection techniques, among others, are described in the research section. The section on health physics technology includes some simple and very cost-effective projects to reduce radiation exposures. Collective dose data from the United States and other countries are also presented. In the conclusion, we suggest that although new advanced reactor design technology will eventually reduce radiation exposures at nuclear power plants to levels below serious concern, in the interim an aggressive approach to dose reduction remains necessary. 20 refs.
Energy Technology Data Exchange (ETDEWEB)
Murray, C.E.; Lee, W.J.
1992-12-01
The purpose of this report is to provide a bibliography for the Native American tribe participants in the Hanford Environmental Dose Reconstruction (HEDR) Project to use. The HEDR Project's primary objective is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's Hanford Site near Richland, Washington. Eight Native American tribes are responsible for estimating daily and seasonal consumption of traditional foods, demography, and other lifestyle factors that could have affected the radiation dose received by tribal members. This report provides a bibliography of recorded accounts that tribal researchers may use to verify their estimates. The bibliographic citations include references to information on the specific tribes, Columbia River plateau ethnobotany, infant feeding practices and milk consumption, nutritional studies and radiation, tribal economic and demographic characteristics (1940--1970), research methods, primary sources from the National Archives, regional archives, libraries, and museums.
The PA projection of the clavicle: a dose-reducing technique.
LENUS (Irish Health Repository)
Mc Entee, Mark F
2010-06-01
This study compares dose and image quality during PA and AP radiography of the clavicle. The methodology involved a cadaver-based dose and image quality study. Results demonstrate a statistically significant 56.1% (p
Acute Radiation Risk and BRYNTRN Organ Dose Projection Graphical User Interface
Cucinotta, Francis A.; Hu, Shaowen; Nounu, Hateni N.; Kim, Myung-Hee
2011-01-01
The integration of human space applications risk projection models for organ dose and acute radiation risk has been a key problem. NASA has developed an organ dose projection model using the BRYNTRN and SUM DOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). BRYNTRN is a baryon transport code and SUM DOSE an output data processing code; the risk projection models for organ doses and ARR take the BRYNTRN output as input to their calculations. BRYNTRN operation requires extensive input preparation, and only a graphical user interface (GUI) can handle its input and output for the response models easily and correctly. The GUI for the ARR and BRYNTRN Organ Dose (ARRBOD) projection code provides the seamless integration of input and output manipulations required to operate the ARRBOD modules (BRYNTRN, SLMDOSE, and the ARR probabilistic response model) in assessing the acute risk and organ doses from significant Solar Particle Events (SPEs). The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations staff in the Mission Operations Directorate (MOD), and space biophysics researchers. The assessment of astronauts' radiation risk from SPEs supports mission design and operational planning to manage radiation risks in future space missions. The ARRBOD GUI can identify suitable shielding solutions, using gender-specific organ dose assessments, in order to avoid ARR symptoms and to stay within the current NASA short-term dose limits. The quantified evaluation of ARR severities based on any given shielding configuration and a specified EVA or other mission
International Nuclear Information System (INIS)
Shipler, D.B.
1992-09-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to populations and individuals. The primary objective of work to be performed through May 1994 is to (1) determine the project's appropriate scope (space, time, radionuclides, pathways and individuals/population groups), (2) determine the project's appropriate level of accuracy (level of uncertainty in dose estimates), (3) complete model and data development, and (4) estimate doses for the Hanford Thyroid Disease Study (HTDS), representative individuals, and special populations as described herein. The plan for FY 1992 through May 1994 has been prepared based on activities and budgets approved by the Technical Steering Panel (TSP) at its meetings on August 19--20, 1991, and April 23--25, 1992. The activities can be divided into four broad categories: (1) model and data evaluation activities, (2) additional dose estimates, (3) model and data development activities, and (4) technical and communication support
Anandavadivelan, Poorna; Brismar, Torkel B; Nilsson, Magnus; Johar, Asif M; Martin, Lena
2016-06-01
Profound weight loss and malnutrition subsequent to severe dysphagia and cancer cachexia are cardinal symptoms in oesophageal cancer (OC). Low muscle mass/sarcopenia has been linked to toxicity during neo-adjuvant therapy in other cancers, with worse effects in sarcopenic obesity. In this study the association between sarcopenia and/or sarcopenic obesity and dose-limiting toxicity (DLT) during cycle one chemotherapy in resectable OC patients was evaluated. Body composition was assessed from computed tomography scans of 72 consecutively diagnosed OC patients. Lean body mass and body fat mass were estimated. Patients were grouped as sarcopenic or non-sarcopenic based on pre-defined gender-specific cut-offs for sarcopenia, and as underweight/normal (BMI sarcopenia combined with overweight and obesity. DLT was defined as temporary reduction/delay or permanent discontinuation of drugs due to adverse effects. Odds ratios for developing toxicity were ascertained using multiple logistic regression. Of 72 patients, 85% (n = 61) were males. Sarcopenia and sarcopenic obesity were present in 31 (43%) and 10 (14%), respectively, prior to chemotherapy. Sarcopenic patients had a significantly lower adipose tissue index (p = 0.02) than non-sarcopenic patients. Patients with DLT (n = 24) had lower skeletal muscle mass (p = 0.04) than those without DLT. Sarcopenic patients (OR = 2.47; 95% CI: 0.88-6.93) showed a trend towards increased DLT risk (p < 0.10). Logistic regression with BMI as an interaction term indicated a higher DLT risk in sarcopenic patients with normal BMI (OR = 1.60; 95% CI: 0.30-8.40), but this was non-significant. In the sarcopenic obese, the risk of DLT increased significantly (OR = 5.54; 95% CI: 1.12-27.44). Sarcopenic and sarcopenic obese OC patients may be at higher risk of developing DLT during chemotherapy than non-sarcopenic OC patients. Copyright © 2015 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
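The study's odds ratios come from multiple logistic regression; as a simplified illustration only, the unadjusted equivalent can be computed directly from a 2x2 table with a Wald confidence interval (all names and numbers here are ours, not the study's data):

```python
import math

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table
    (exposed: a events / b non-events; unexposed: c events / d non-events)
    with a Wald 95% confidence interval. A simplified stand-in for the
    adjusted logistic-regression estimates reported in the abstract."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi
```

A confidence interval that spans 1.0, as for the sarcopenia-only group above (0.88-6.93), corresponds to a non-significant association at the 5% level.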
FY 1991 Task plans for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
1991-04-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to populations and individuals. The objectives of work in Fiscal Year (FY) 1991 are to analyze data and models used in Phase 1 and restructure the models to increase accuracy and reduce uncertainty in dose estimation capability. Databases will be expanded and efforts will begin to determine the appropriate scope (space, time, radionuclides, pathways and individuals/population groups) and accuracy (level of uncertainty in dose estimates) for the project. Project scope and accuracy requirements, once defined, can be translated into additional model and data requirements later in the project. Task plans for FY 1991 have been prepared based on activities approved by the Technical Steering Panel (TSP) in October 1990 and mid-year revisions discussed at the TSP planning/budget workshop in February 1991. The activities can be divided into two broad categories: (1) model and data development and evaluation, (2) project, technical and communication support. 3 figs., 1 tab
International Nuclear Information System (INIS)
Andrasi, A.; Bailey, M.; Puncher, M.; Berkovski, V.; Blanchardon, E.; Jourdain, J.-R.; Castellani, C.-M.; Doerfel, H.; Hurtgen, Ch.; Le Guen, B.
2003-01-01
Several international intercomparison exercises on intake and internal dose assessment from monitoring data led to the conclusion that the results calculated by different participants varied significantly, mainly because of the wide variety of methods and assumptions applied in the assessment procedure. Based on this experience, the need to harmonise these procedures was formulated as an EU research project under the 5th Framework Programme (2001-2005), with the aim of developing general guidelines for standardising assessments of intakes and internal doses. Eight institutions from seven European countries participate in the IDEAS project, drawing also on input from internal dosimetry professionals across Europe to ensure broad consensus in the outcome of the project. The IDEAS project is explained
Energy Technology Data Exchange (ETDEWEB)
Yamada, Yoshitake, E-mail: yamada@rad.med.keio.ac.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Jinzaki, Masahiro, E-mail: jinzaki@rad.med.keio.ac.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Hosokawa, Takahiro, E-mail: hosokawa@rad.med.keio.ac.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Tanami, Yutaka, E-mail: tanami@rad.med.keio.ac.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Sugiura, Hiroaki, E-mail: hsugiura@rad.med.keio.ac.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Abe, Takayuki, E-mail: tabe@z5.keio.jp [Center for Clinical Research, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan); Kuribayashi, Sachio, E-mail: skuribay@a5.keio.jp [Department of Diagnostic Radiology, Keio University School of Medicine, 35 Shinanomachi, Shinjuku-ku, Tokyo 160-8582 (Japan)
2012-12-15
Objectives: To assess the effectiveness of adaptive iterative dose reduction (AIDR) and AIDR 3D in improving the image quality in low-dose chest CT (LDCT). Materials and methods: Fifty patients underwent standard-dose chest CT (SDCT) and LDCT simultaneously, performed under automatic exposure control with noise index of 19 and 38 (for a 2-mm slice thickness), respectively. The SDCT images were reconstructed with filtered back projection (SDCT-FBP images), and the LDCT images with FBP, AIDR and AIDR 3D (LDCT-FBP, LDCT-AIDR and LDCT-AIDR 3D images, respectively). On all the 200 lung and 200 mediastinal image series, objective image noise and signal-to-noise ratio (SNR) were measured in several regions, and two blinded radiologists independently assessed the subjective image quality. Wilcoxon's signed rank sum test with Bonferroni's correction was used for the statistical analyses. Results: The mean dose reduction in LDCT was 64.2% as compared with the dose in SDCT. LDCT-AIDR 3D images showed significantly reduced objective noise and significantly increased SNR in all regions as compared to the SDCT-FBP, LDCT-FBP and LDCT-AIDR images (all, P ≤ 0.003). In all assessments of the image quality, LDCT-AIDR 3D images were superior to LDCT-AIDR and LDCT-FBP images. The overall diagnostic acceptability of both the lung and mediastinal LDCT-AIDR 3D images was comparable to that of the lung and mediastinal SDCT-FBP images. Conclusions: AIDR 3D is superior to AIDR. Intra-individual comparisons between SDCT and LDCT suggest that AIDR 3D allows a 64.2% reduction of the radiation dose as compared to SDCT, by substantially reducing the objective image noise and increasing the SNR, while maintaining the overall diagnostic acceptability.
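The objective image-quality metrics used above, image noise as the standard deviation of attenuation within a region of interest (ROI) and SNR as mean over standard deviation, can be sketched as follows; rectangular index slices stand in for the circular ROIs, and all names are ours:

```python
import numpy as np

def roi_noise_and_snr(image, rois):
    """Mean objective image noise (SD of pixel values per ROI) and mean SNR
    (mean/SD per ROI) over a list of ROIs, each given as a
    (row_slice, col_slice) pair. Illustrative sketch of the measurement
    described in the abstract, not the authors' software."""
    noises, snrs = [], []
    for row_slice, col_slice in rois:
        patch = image[row_slice, col_slice]
        sd = patch.std()                 # objective image noise in this ROI
        noises.append(sd)
        snrs.append(patch.mean() / sd)   # signal-to-noise ratio in this ROI
    return float(np.mean(noises)), float(np.mean(snrs))
```

Iterative reconstruction lowers the per-ROI standard deviation at a given dose, which is exactly how the reduced objective noise and increased SNR of the AIDR 3D images are quantified.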
Energy Technology Data Exchange (ETDEWEB)
Arvold, Nils D. [Harvard Radiation Oncology Program, Harvard Medical School, Boston, MA (United States); Niemierko, Andrzej; Broussard, George P.; Adams, Judith; Fullerton, Barbara; Loeffler, Jay S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Shih, Helen A., E-mail: hshih@partners.org [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)
2012-07-15
Purpose: To calculate projected second tumor rates and dose to organs at risk (OAR) in patients with benign intracranial meningioma (BM), according to dosimetric comparisons between proton radiotherapy (PRT) and photon radiotherapy (XRT) treatment plans. Methods and Materials: Ten patients with BM treated at Massachusetts General Hospital during 2006-2010 with PRT were replanned with XRT (intensity-modulated or three-dimensional conformal radiotherapy), optimizing dose to the tumor while sparing OAR. Total dose was 54 Gy in 1.8 Gy per fraction for all plans. We calculated equivalent uniform doses, normal tissue complication probabilities, and whole brain-based estimates of excess risk of radiation-associated intracranial second tumors. Results: Excess risk of second tumors was significantly lower among PRT compared with XRT plans (1.3 vs. 2.8 per 10,000 patients per year, p < 0.002). Mean equivalent uniform doses were lower among PRT plans for the whole brain (19.0 vs. 22.8 Gy, p < 0.0001), brainstem (23.8 vs. 35.2 Gy, p = 0.004), hippocampi (left, 13.5 vs. 25.6 Gy, p < 0.0001; right, 7.6 vs. 21.8 Gy, p = 0.001), temporal lobes (left, 25.8 vs. 34.6 Gy, p = 0.007; right, 25.8 vs. 32.9 Gy, p = 0.008), pituitary gland (29.2 vs. 37.0 Gy, p = 0.047), optic nerves (left, 28.5 vs. 33.8 Gy, p = 0.04; right, 25.1 vs. 31.1 Gy, p = 0.07), and cochleas (left, 12.2 vs. 15.8 Gy, p = 0.39; right, 1.5 vs. 8.8 Gy, p = 0.01). Mean normal tissue complication probability was <1% for all structures and not significantly different between PRT and XRT plans. Conclusions: Compared with XRT, PRT for BM decreases the risk of RT-associated second tumors by half and delivers significantly lower doses to neurocognitive and critical structures of vision and hearing.
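The equivalent uniform doses compared above are commonly computed with the generalized EUD, gEUD = (Σᵢ vᵢ dᵢᵃ)^(1/a), where vᵢ is the fractional volume of the structure receiving dose dᵢ and a is a tissue-specific parameter. A hedged sketch; the DVH values below are illustrative, not taken from the study:

```python
import numpy as np

def gEUD(dose_bins, vol_fractions, a):
    """Generalized equivalent uniform dose from a differential DVH:
    gEUD = (sum_i v_i * d_i**a) ** (1/a).

    dose_bins     : dose in each DVH bin (Gy)
    vol_fractions : fractional volume in each bin (sums to 1)
    a             : tissue-specific parameter (a=1 gives the mean dose)
    """
    d = np.asarray(dose_bins, dtype=float)
    v = np.asarray(vol_fractions, dtype=float)
    return float(np.sum(v * d**a) ** (1.0 / a))

# Uniform 54 Gy plan: gEUD equals the prescribed dose for any 'a'.
print(gEUD([54.0], [1.0], a=8))                        # ~54.0
# Inhomogeneous plan: with a=1 the gEUD reduces to the mean dose (~54 Gy here).
print(gEUD([50.0, 54.0, 58.0], [0.2, 0.6, 0.2], a=1))
```

For a > 1 the power mean weights hot spots more heavily, which is why serial-organ structures are evaluated with large a while a = 1 recovers the plain mean dose.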
Handbook of selected organ doses for projections common in pediatric radiology
International Nuclear Information System (INIS)
Rosenstein, M.; Beck, T.J.; Warner, G.G.
1979-05-01
This handbook contains data from which absorbed dose (mrad) to selected organs can be estimated for common projections in pediatric radiology. The organ doses are for three reference patients: a newborn (0 to 6 months), a 1-year-old child, and a 5-year-old child. One intent of the handbook is to permit the user to evaluate the effect on organ dose to these reference pediatric patients as a function of certain changes in technical parameters used in or among facilities. A second intent is to permit a comparison to be made of organ doses as a function of age. This comparison can be extended to a reference adult by referring to the previous Handbook of Selected Organ Doses for Projections Common in Diagnostic Radiology, FDA 76-8031. Assignment of organ doses to individual pediatric patients using the Handbook data is not recommended unless the physical characteristics of the patient closely correlate with one of the three reference pediatric patients given in Appendix A.
Draft Air Pathway Report: Phase 1 of the Hanford Environmental Dose Reconstruction Project
Energy Technology Data Exchange (ETDEWEB)
1990-07-20
This report summarizes the air pathway portion of the first phase of the Hanford Environmental Dose Reconstruction (HEDR) Project, conducted by Battelle staff at the Pacific Northwest Laboratory under the direction of an independent Technical Steering Panel. The HEDR Project is estimating historical radiation doses that could have been received by populations near the Department of Energy's Hanford Site, in southeastern Washington State. Phase 1 of the air-pathway dose reconstruction sought to determine whether dose estimates could be calculated for populations in the 10 counties nearest the Hanford Site from atmospheric releases of iodine-131 from the site from 1944--1947. Phase 1 demonstrated the following: HEDR-calculated source-term estimates of iodine-131 releases to the atmosphere were within 20% of previously published estimates; calculated vegetation concentrations of iodine-131 agree well with previously published measurements; the highest of the Phase 1 preliminary dose estimates to the thyroid are consistent with independent, previously published estimates of doses to maximally exposed individuals; and relatively crude, previously published measurements of thyroid burdens for Hanford workers are in the range of average burdens that the HEDR model estimated for similar "reference individuals" for the period 1944--1947. 4 refs., 10 figs., 9 tabs.
Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2010-01-01
Solar particle events (SPEs) pose the risk of acute radiation sickness (ARS) to astronauts, because organ doses from large SPEs may reach critical levels during extra vehicular activities (EVAs) or in lightly shielded spacecraft. NASA has developed an organ dose projection model of the Baryon transport code (BRYNTRN) with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operations of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs is in support of mission design and operation planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof of concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, for the major components of the overall system, subsystem interconnections, and external interfaces.
International Nuclear Information System (INIS)
Byram, S.J.
1991-05-01
The Hanford Environmental Dose Reconstruction (HEDR) Project will estimate radiation exposures people may have received from radioactive materials released during past operations at the Department of Energy's Hanford Site near Richland, Washington. The project is being conducted by Pacific Northwest Laboratory (PNL) under the direction of an independent Technical Steering Panel (TSP). The Centers for Disease Control (CDC) will use HEDR dose estimates in studies to investigate a potential link between thyroid disease and historical Hanford emissions. The HEDR Project was initiated to address public concerns about the possible health impacts from past releases of radioactive materials from Hanford. The TSP recognized early in the project that special mechanisms would be required to communicate effectively to the many different concerned audiences. To identify and develop these mechanisms, the TSP issued Directive 89-7 to PNL in May 1989. The TSP directed PNL to examine methods to communicate the causes and effects of uncertainties in the dose estimates. A literature review was conducted as the first activity in response to the TSP's directive. This report presents the results of the literature review. The objective of the literature review was to identify ''key principles'' that could be applied to develop communications strategies for the project. 26 refs., 6 figs
International Nuclear Information System (INIS)
Fang Dong; Yuan Xianshun; Zhang Dongsheng
2012-01-01
Objective: To evaluate the subject's absorbed dose, equivalent dose and effective dose. Methods: The CBCT unit was Implagraphy, and three scan projections were selected: mandible, maxilla and temporomandibular joint (TMJ). Thermoluminescent dosimeter tubes were used to record the absorbed dose at special positions in the head and neck region of an adult skull and tissue-equivalent phantom. The 16 organs of interest included the pituitary, lens, parotid glands, submandibular glands, sublingual glands, diploe, spongy bone of the chin and cervical vertebra, skins of cheeks and nuchal region, thyroid and esophagus. The absorbed dose was measured in these organs, and the effective doses (E1990, E2007) were then calculated according to the different ICRP tissue weighting factors. Results: The absorbed doses of the mandible, maxilla and TMJ scans varied from (0.99 ± 0.09) to (12.85 ± 0.09) mGy, (0.93 ± 0.01) to (13.07 ± 0.02) mGy and (0.68 ± 0.01) to (10.18 ± 0.04) mGy, respectively. There was a significant difference among the three scan projections (F = 19.61-30992.27, P < 0.05). The equivalent doses of the lens and skin were (1.11 ± 0.07)-(5.76 ± 0.06) mSv and (6.96 ± 0.06)-(10.64 ± 0.07) mSv, respectively, with significant differences among the three scan projections (F = 4473.02, 9385.50, P < 0.05). The effective doses (E1990, E2007) were [(191.35 ± 1.53), (325.17 ± 2.58)] μSv for the mandible scan, [(106.62 ± 2.17), (226.28 ± 2.81)] μSv for the maxilla scan, and [(104.21 ± 1.02), (142.36 ± 1.90)] μSv for the TMJ scan, respectively. Conclusions: Valid measures should be taken to reduce the subject's dose, such as a careful history and clinical examination before performing CBCT, an up-to-date risk/benefit assessment, precise scan positioning, shielding of the thyroid and brain, and the smallest suitable volume size. (authors)
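The two effective doses quoted above differ only in the tissue weighting factors applied: E = Σ_T w_T H_T under ICRP 60 (E1990) versus ICRP 103 (E2007). A minimal sketch of that calculation with a small subset of tissues; the weights shown for thyroid, oesophagus, skin and bone surface are the published ICRP 60 / ICRP 103 values, while the organ equivalent doses H are invented example numbers, not the study's measurements:

```python
# Subset of ICRP tissue weighting factors (full tables include many more tissues
# plus a remainder term; this sketch ignores the remainder handling).
W_ICRP60  = {"thyroid": 0.05, "oesophagus": 0.05, "skin": 0.01, "bone_surface": 0.01}
W_ICRP103 = {"thyroid": 0.04, "oesophagus": 0.04, "skin": 0.01, "bone_surface": 0.01}

def effective_dose(organ_H_mSv, weights):
    """E = sum_T w_T * H_T over the tissues that were measured."""
    return sum(weights[t] * h for t, h in organ_H_mSv.items() if t in weights)

# Hypothetical organ equivalent doses (mSv) for one scan.
H = {"thyroid": 2.0, "oesophagus": 0.5, "skin": 8.0, "bone_surface": 5.0}
print(round(effective_dose(H, W_ICRP60), 3))   # 0.255
print(round(effective_dose(H, W_ICRP103), 3))  # 0.23
```

With the same measured H, the two standards can give noticeably different E, which is exactly why the abstract reports E1990 and E2007 side by side.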
Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney
2016-01-01
In an earlier paper (Atwell et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs comprise Ground Level Events (GLEs), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009; Tylka and Dietrich, 2008; Atwell et al., 2008). GLEs are extremely energetic solar particle events having proton energies extending into the several-GeV range and producing secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several-hundred-MeV range, but do not produce secondary atmospheric particles. Sub-sub-GLEs are even less energetic, with an observable increase in protons at energies greater than 30 MeV but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to designers of these smaller spacecraft.
Energy Technology Data Exchange (ETDEWEB)
Napier, B.A.
1991-07-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. A vital step in the estimation of radiation doses is the determination of the "source term," that is, the quantities of radionuclides that were released to the environment from the various Hanford operations. Hanford operations have at various times involved hundreds of different radionuclides, some in relatively large quantities. Those radionuclides present in the largest quantities, although significant from an operational handling point of view, may not necessarily have been those of greatest concern for offsite radiation dose. This report documents the selection of the dominant radionuclides (those that may have resulted in the largest portion of the received doses) in the source term for Phase 1 of the HEDR Project, that is, for atmospheric releases from 1944 through 1947 and for surface water releases from 1964 through 1966. 15 refs., 3 figs., 10 tabs.
International Nuclear Information System (INIS)
Doerfel, H.; Andrasi, A.; Bailey, M.; Blanchardon, E.; Cruz-Suarez, R.; Berkovski, V.; Castellani, C. M.; Hurtgenv, C.; Leguen, B.; Malatova, I.; Marsh, J.; Stather, J.; Zeger, J.
2007-01-01
In recent major international intercomparison exercises on intake and internal dose assessments from monitoring data, the results calculated by different participants varied significantly. Based on this experience, the need for harmonisation of the procedures was formulated within an EU 5th Framework Programme research project. The aim of the project, IDEAS, was to develop general guidelines for standardising assessments of intakes and internal doses. The IDEAS project started in October 2001 and ended in June 2005. The project is closely related to some goals of the work of Committee 2 of the ICRP, and since 2003 there has been close cooperation between the two groups. To ensure that the guidelines are applicable to a wide range of practical situations, the first step was to compile a database of well-documented cases of internal contamination. In parallel, an improved version of an existing software package was developed and distributed to the partners for further use. A large number of cases from the database were evaluated independently by the partners and the results reviewed. Based on these evaluations, guidelines were drafted and discussed with dosimetry professionals from around the world by means of a virtual workshop on the Internet early in 2004. The guidelines have been revised and refined on the basis of the experiences and discussions in this virtual workshop. The general philosophy of the Guidelines is presented here, focusing on the principles of harmonisation, optimisation and proportionality. Finally, the proposed Levels of Task to structure the approach of internal dose evaluation are reported. (authors)
Rescue dose orders as an alternative to range orders: an evidence-based practice project.
Yi, Cassia
2015-06-01
Relief of pain is a fundamental aspect of optimal patient care. However, pain management in the inpatient setting is often constrained by concerns related to regulatory oversight, particularly with regard to the use of opioid dose range orders. These concerns can inadvertently result in the development of policies and practices that can negatively impact the health care team's ability to deliver optimal and individualized pain management. An evidence-based practice project was undertaken to address concerns about regulatory oversight of pain management processes by changing the way pain was managed in a large academic hospital setting. A novel pain management approach using rescue dose medications was established as an alternative to opioid dose range orders. The use of the rescue dose protocol was successfully implemented. Outcomes included an overall reduction in the administration of inappropriate intravenous opioids and opioid-acetaminophen combination medications, with a subsequent increase in single-entity first-line opioid analgesics. Rescue dose protocols may offer an alternative to opioid dose range orders as a means of effectively managing pain. Copyright © 2015 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Hu Weigang; Graff, Pierre; Boettger, Thomas; Pouliot, Jean [Department of Radiation Oncology, University of California, San Francisco, San Francisco, California 94143 (United States); and others
2011-04-15
Purpose: To develop a spatially encoded dose difference maximal intensity projection (DD-MIP) as an online patient dose evaluation tool for visualizing the dose differences between the planning dose and the dose on the treatment day. Methods: Megavoltage cone-beam CT (MVCBCT) images acquired on the treatment day are used for generating the dose difference index. Each index is represented by a different color for the underdose, acceptable, and overdose regions. A maximal intensity projection (MIP) algorithm is developed to compress all the information of an arbitrary 3D dose difference index into a 2D DD-MIP image. In such an algorithm, a distance transformation is generated based on the planning CT. Then, two new volumes representing the overdose and underdose regions of the dose difference index are encoded with the distance transformation map. The distance-encoded indices of each volume are normalized using the skin distance obtained on the planning CT. After that, two MIPs are generated based on the underdose and overdose volumes with green-to-blue and green-to-red lookup tables, respectively. Finally, the two MIPs are merged with an appropriate transparency level and rendered in planning CT images. Results: The spatially encoded DD-MIP was implemented in a dose-guided radiotherapy prototype and tested on 33 MVCBCT images from six patients. The user can easily establish the thresholds for overdose and underdose. A 3% difference between the treatment and planning dose was used as the threshold in the study; hence, the DD-MIP shows red or blue for dose differences above +3% or below -3%, respectively. With such a method, the overdose and underdose regions can be visualized and distinguished without being overshadowed by superficial dose differences. Conclusions: A DD-MIP algorithm was developed that compresses information from 3D into a single or two orthogonal projections while indicating to the user whether the dose difference is on the skin surface or deeper.
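The pipeline described above (dose-difference index, skin-distance encoding, then MIP) can be sketched as follows. This is a simplified reconstruction, not the authors' implementation; `brute_edt` is a toy stand-in for a real Euclidean distance transform, and all volumes are invented examples:

```python
import numpy as np

def brute_edt(mask):
    """Distance from each in-body voxel to the nearest outside voxel; a tiny
    brute-force stand-in for scipy.ndimage.distance_transform_edt."""
    fg, bg = np.argwhere(mask), np.argwhere(~mask)
    depth = np.zeros(mask.shape)
    for idx in fg:
        depth[tuple(idx)] = np.sqrt(((bg - idx) ** 2).sum(axis=1).min())
    return depth

def dd_mip(planned, delivered, body_mask, threshold=0.03, axis=0):
    """Sketch of a distance-encoded dose-difference MIP: keep voxels whose
    relative dose difference exceeds the threshold, weight them by normalized
    depth below the skin (so deep differences are not overshadowed by
    superficial ones), then take a maximum-intensity projection along `axis`."""
    diff = (delivered - planned) / np.maximum(planned, 1e-6)
    depth = brute_edt(body_mask)
    depth /= max(depth.max(), 1e-6)                  # skin-distance normalization
    over  = np.where(diff >  threshold, depth, 0.0)  # source of the red MIP
    under = np.where(diff < -threshold, depth, 0.0)  # source of the blue MIP
    return over.max(axis=axis), under.max(axis=axis)

# Toy 8x8x8 "patient" with a +6% overdose at its centre.
plan = np.full((8, 8, 8), 50.0)
meas = plan.copy()
meas[4, 4, 4] = 53.0
mask = np.zeros((8, 8, 8), bool)
mask[1:-1, 1:-1, 1:-1] = True
over_mip, under_mip = dd_mip(plan, meas, mask)
print(over_mip.max() > 0, under_mip.max() == 0)      # True True
```

Because the central overdose voxel is the deepest point of the toy body, it survives the projection with the maximum depth weight, while the underdose MIP stays empty.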
Initial communication survey results for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Beck, D.M.
1991-03-01
To support the public communication efforts of the Technical Steering Panel of the Hanford Environmental Dose Reconstruction (HEDR) Project, a public survey was conducted. The survey was intended to provide information about the public's knowledge of and interest in the project and the best ways to communicate project results. Questions about the project were included as part of an omnibus survey conducted by Washington State University. The survey was conducted by phone with Washington State residents in the spring of 1990. This report gives the HEDR-related questions and summary data of responses. Questions associated with the HEDR Project were grouped into four categories: knowledge of the HEDR Project; interest in the project; preferred ways of receiving information about the project (including public information meetings, a newsletter mailed to homes, presentations to civic groups in the respondent's community, a computer bulletin board the respondent could access with a modem, information displays at public buildings and shopping malls, and an information video sent to the respondent); and level of concern over past exposure from Hanford operations. Questions about whom state residents are most likely to trust on radiation issues were also part of the omnibus survey, and responses are included in this report.
International Nuclear Information System (INIS)
Paz, L.R. de la; Palattao, M.V.; Estacio, J.F.L.; Anden, A.
1987-04-01
Doses arising from the ingestion of radioactive contamination from the Chernobyl accident are calculated using the various radioactivity limits adopted by different organizations after the accident. These are compared with the limits adopted in the Philippines. Projected concentrations of Cs-137 and Cs-134 in various food items in the affected countries, one month and one year after the accident, are calculated using a model proposed by Boone, Ng and Palms. Except for food produced in one or two hot spots, the projected concentrations after one year are expected to return to within the range of pre-Chernobyl values. (Auth.) 12 refs.; 13 tabs.; 6 figs.
A real-time stack radioactivity monitoring system and dose projection program
Energy Technology Data Exchange (ETDEWEB)
Hull, A.P.; Michael, P.A. [Brookhaven National Laboratory, Upton, NY (United States); Bernstein, H.J. [Bernstein & Sons, Bellport, NY (United States)
1995-02-01
At Brookhaven National Laboratory, a commercial Low- and High-Range Air Effluent Monitor has become operational at the 60 MW(t) High Flux Beam Reactor. Its output data are combined with those from ground-level and elevated meteorological sensors to provide a real-time projection of the downwind dose rates from noble gases and radioiodines released from the HFBR's 100 m stack. The output of the monitor, the meteorological sensors and the dose projections can be viewed at emergency response terminals located in the Reactor Control Room, its Technical Support Center and at the laboratory's separately located Meteorological Station and Monitoring and Assessment Center.
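The abstract does not describe the projection algorithm itself; a standard building block for real-time downwind dose projections of this kind is the Gaussian plume dilution factor χ/Q, which is then multiplied by the monitored release rate and a nuclide-specific dose-rate factor. A sketch under that assumption; all numeric parameters below are hypothetical, not BNL's values:

```python
import numpy as np

def chi_over_Q(y, z, u, H, sigma_y, sigma_z):
    """Gaussian plume dilution factor chi/Q (s/m^3) at crosswind offset y (m)
    and height z (m), for wind speed u (m/s) and effective release height H (m).
    sigma_y, sigma_z are the dispersion coefficients (m) evaluated at the
    downwind distance of interest; the second vertical term is the usual
    ground-reflection (image source) term."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return lateral * vertical / (2.0 * np.pi * u * sigma_y * sigma_z)

# Ground-level centreline dilution ~1 km downwind of a 100 m stack, with
# hypothetical neutral-stability sigmas of roughly 75 m (y) and 32 m (z):
dilution = chi_over_Q(y=0.0, z=0.0, u=5.0, H=100.0, sigma_y=75.0, sigma_z=32.0)
# downwind dose rate ~ release rate (Bq/s) x chi/Q (s/m^3) x dose-rate factor
print(f"chi/Q ~ {dilution:.2e} s/m^3")
```

Moving off the plume centreline only decreases the lateral term, so the centreline value is the bounding dilution at a given downwind distance.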
International Nuclear Information System (INIS)
Napier, B.A.
1991-07-01
The objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions from nuclear operations at Hanford since their inception in 1944. The purpose of this report is to outline the basic algorithm and the computer calculations needed to calculate radiation doses to specific and hypothetical individuals in the vicinity of Hanford. The system design requirements, those things that must be accomplished, are defined. The system design specifications, the techniques by which those requirements are met, are outlined. Included are the basic equations, logic diagrams, and preliminary definitions of the nature of each input distribution. 4 refs., 10 figs., 9 tabs.
Kim, Myung-Hee; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2010-01-01
The space radiation environment, particularly solar particle events (SPEs), poses the risk of acute radiation sickness (ARS) to humans, and organ doses from SPE exposure may reach critical levels during extra vehicular activities (EVAs) or within lightly shielded spacecraft. NASA has developed an organ dose projection model using the BRYNTRN and SUMDOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). The codes BRYNTRN and SUMDOSE, written in FORTRAN, are a Baryon transport code and an output data processing code, respectively. The ARR code is written in C. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. BRYNTRN code operation requires extensive input preparation. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected easily and correctly to BRYNTRN in a user-friendly way. A GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operations of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. The ARRBOD GUI will serve as a proof-of-concept example for future integration of other human space applications risk projection models. The current version of the ARRBOD GUI is a new self-contained product and will have follow-on versions, as options are added: 1) human geometries of MAX/FAX in addition to CAM/CAF; 2) shielding distributions for spacecraft, Mars surface and atmosphere; 3) various space environmental and biophysical models; and 4) other response models to be connected to the BRYNTRN. The major components of the overall system, the subsystem interconnections, and external interfaces are described in this paper.
International Nuclear Information System (INIS)
Brnic, Zoran; Hebrang, Andrija
2001-01-01
Introduction: Standard mammography includes two views, craniocaudal and mediolateral oblique. Depending on the patient's body constitution, the central beam angle in the mediolateral oblique projection may vary, with 45 deg. being suitable for the majority of patients in routine daily practice. With continuous improvement in X-ray technology and radiographers' training, the risk of radiation-induced carcinogenesis is considerably reduced and acceptable when compared to the benefit. However, the risk still exists, being cumulative and directly related to the absorbed glandular dose. There is no minimal dose of radiation which is absolutely harmless, and every effort to reduce the dose is welcome. In this retrospective study, two different angles (45 vs. 60 deg.) of the mediolateral oblique view were compared with respect to radiation dose and efficacy of breast compression. Patients and methods: In 52 women, additional 60 deg. oblique films were taken after the craniocaudal and mediolateral oblique 45 deg. films, with the same kVp and positioning technique. Breast thickness, tube current-time products (mAs) and absorbed doses were compared between the 45 deg. and 60 deg. films. Subgroups of women with large, small, prominent and pendulous breasts were analyzed separately, following the same methodology as for the whole group. Results: mAs values were 11.5% lower and compression 7% better with an angle of 60 deg. than with 45 deg. In the subgroup of women with small breasts, mAs values were 13% lower and compression 9% better with 60 deg. than with 45 deg., while in the subgroup with large breasts, mAs values were 9% lower and compression 5% better. In the subgroup of patients with pendulous breasts, mAs values were 12% lower and compression 10% better with 60 deg. than with 45 deg., while in the subgroup with prominent breasts, mAs values were 4% lower and compression 3% better. The absorbed glandular dose was estimated to be approximately 20% lower when the oblique mammogram was done at 60 deg. instead of 45 deg.
Data base on dose reduction research projects for nuclear power plants
International Nuclear Information System (INIS)
Khan, T.A.; Yu, C.K.; Roecklein, A.K.
1994-05-01
This is the fifth volume in a series of reports that provide information on dose reduction research and health physics technology for nuclear power plants. The information is taken from two of several databases maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. The research section of the report covers dose reduction projects that are in the experimental or developmental phase. It includes topics such as steam generator degradation, decontamination, robotics, improvements in reactor materials, and inspection techniques. The section on health physics technology discusses dose reduction efforts that are in place or in the process of being implemented at nuclear power plants. A total of 105 new or updated projects are described. All project abstracts from this report are available to nuclear industry professionals with access to a fax machine through the ACEFAX system, or a computer with a modem and the proper communications software through the ACE system. Detailed descriptions of how to access all the databases electronically are in the appendices of the report.
Data base on dose reduction research projects for nuclear power plants. Volume 5
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Yu, C.K.; Roecklein, A.K. [Brookhaven National Lab., Upton, NY (United States)]
1994-05-01
This is the fifth volume in a series of reports that provide information on dose reduction research and health physics technology for nuclear power plants. The information is taken from two of several databases maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. The research section of the report covers dose reduction projects that are in the experimental or developmental phase. It includes topics such as steam generator degradation, decontamination, robotics, improvements in reactor materials, and inspection techniques. The section on health physics technology discusses dose reduction efforts that are in place or in the process of being implemented at nuclear power plants. A total of 105 new or updated projects are described. All project abstracts from this report are available to nuclear industry professionals with access to a fax machine through the ACEFAX system, or a computer with a modem and the proper communications software through the ACE system. Detailed descriptions of how to access all the databases electronically are in the appendices of the report.
Energy Technology Data Exchange (ETDEWEB)
Holmes, C.W.
1991-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project will estimate radiation doses people may have received from exposure to radioactive materials released during past operations at the US Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project was initiated in response to public concerns about possible health impacts from past releases of radioactive materials from Hanford. The Technical Steering Panel (TSP) recognized early in the project that special mechanisms would be required to effectively communicate to the many different concerned audiences. Accordingly, the TSP directed Pacific Northwest Laboratory (PNL) to examine methods for communicating causes and effects of uncertainties in the dose estimates. After considering the directive and discussing it with the Communications Subcommittee of the TSP, PNL undertook a broad investigation of communications methods to consider for inclusion in the TSP's current communications program. As part of this investigation, a literature review was conducted regarding risk communication. A key finding was that, in order to successfully communicate risk-related information, a thorough understanding of the knowledge level, concerns and information needs of the intended recipients (i.e., the audience) is necessary. Hence, a preliminary audience analysis was conducted as part of the present research. This report summarizes the results of this analysis. 1 ref., 9 tabs.
General guidelines for the Assessment of Internal Dose from Monitoring Data (Project IDEAS)
International Nuclear Information System (INIS)
Doerfel, H.; Andrasi, A.; Bailey, M.; Blanchardon, E.; Berkovski, V.; Castellani, C. M.; Hurtgen, C.; Jourdain, J. R.; LeGuen, B.; Puncher, M.
2004-01-01
In recent major international intercomparison exercises on intake and internal dose assessments from monitoring data the results calculated by different participants varied significantly. This was mainly due to the broad variety of methods and assumptions applied in the assessment procedure. Based on these experiences the need for harmonisation of the procedures has been formulated within an EU research project under the 5th Framework Programme. The aim of the project, IDEAS, is to develop general guidelines for standardising assessments of intakes and internal doses. The IDEAS project started in October 2001 and will end in March 2005. Eight institutions from seven European countries are participating. Inputs from internal dosimetry professionals from across Europe are also being used to ensure a broad consensus in the outcome of the project. The IDEAS project is closely related to some goals of the work of Committee 2 of the ICRP and since 2003 there has been close cooperation between the two groups. To ensure that the guidelines are applicable to a wide range of practical situations, the first step has been to compile a database of well-documented cases of internal contamination. In parallel, an improved version of an existing software package has been developed and distributed to the partners for further use. A large number of cases from the database have been evaluated independently by partners in the project using the same software and the results have been reviewed. Based on these evaluations guidelines are being drafted and will be discussed with dosimetry professionals from around the world by means of a virtual workshop on the Internet early in 2004. The guidelines will be revised and refined on the basis of the experiences and discussions of this virtual workshop and the outcome of an intercomparison exercise organised as part of the project. This will be open to all internal dosimetry professionals. (Author) 10 refs
Energy Technology Data Exchange (ETDEWEB)
Deonigi, D.E.; Anderson, D.M.; Wilfert, G.L.
1994-04-01
The Hanford Environmental Dose Reconstruction (HEDR) Project was established to estimate radiation doses that people could have received from nuclear operations at the Hanford Site since 1944. For this period iodine-131 is the most important offsite contributor to radiation doses from Hanford operations. Consumption of milk from cows that ate vegetation contaminated by iodine-131 is the dominant radiation pathway for individuals who drank milk (Napier 1992). Information has been developed on commercial milk cow locations and commercial milk distribution during 1945 and 1951. The year 1945 was selected because during 1945 the largest amount of iodine-131 was released from Hanford facilities in a calendar year (Heeb 1993); therefore, 1945 was the year in which an individual was likely to have received the highest dose. The year 1951 was selected to provide data for comparing the changes that occurred in commercial milk flows (i.e., sources, processing locations, and market areas) between World War II and the post-war period. To estimate the doses people could have received from this milk flow, it is necessary to estimate the amount of milk people consumed, the source of the milk, the specific feeding regime used for milk cows, and the amount of iodine-131 contamination deposited on feed.
FY 1993 task plans for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
Shipler, D.B.
1991-10-01
The purpose of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate radiation doses from Hanford Site operations since 1944 to individuals and populations. The primary objective of work to be performed in FY 1993 is to complete the source term estimates and dose estimates for key radionuclides for the air and river pathways. At the end of FY 1993, the capability will be in place to estimate doses for individuals in the extended (32-county) study area, 1944--1991. Native American research will continue to provide input for tribal dose estimates. In FY 1993, the Technical Steering Panel (TSP) will decide whether demographic and river pathways data collection should be extended beyond FY 1993 levels. The FY 1993 work scopes and milestones in this document are based on the work plan discussed at the TSP Budget/Fiscal Subcommittee meeting on August 19--20, 1991. Table 1 shows the FY 1993 milestones; Table 2 shows estimated costs. The subsequent work scope descriptions are based on the milestones. This document and the FY 1992 task plans will form the basis for a contract with Battelle and the Centers for Disease Control (CDC). The 2-year dose reconstruction contract is expected to begin in February 1992. This contract will replace the current arrangement, whereby the US Department of Energy directly funds the Pacific Northwest Laboratory to conduct dose reconstruction work. In late FY 1992, the FY 1993 task plans will be more fully developed with detailed technical approaches, data quality objectives, and budgeted labor hours. The task plans will be updated again in July 1993 to reflect any scope, milestone, or cost changes directed during the year by the TSP. 2 tabs
National Research Council Canada - National Science Library
Kujawski, Edouard
2007-01-01
In the real world, "Money Allocated is Money Spent" (MAIMS). As a consequence, cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost...
FY 1992 task plans for the Hanford Environmental Dose Reconstruction Project
International Nuclear Information System (INIS)
1991-10-01
Phase 1 of the HEDR Project was designed to develop and demonstrate a method for estimating radiation doses people may have received from Hanford Site operations since 1944. The method researchers developed relied on a variety of measured and reconstructed data as input to a modular computer model that generates dose estimates and their uncertainties. As part of Phase 1, researchers used the reconstructed data and computer model to calculate preliminary dose estimates for populations from limited radionuclides, in a limited geographical area and time period. Phase 1 ended in FY 1990. In February 1991, the TSP decided to shift the project planning approach away from phases--which were centered around completion of major portions of technical activities--to individual fiscal years (FYs), which span October of one year through September of the next. Therefore, activities that were previously designated to occur in phases are now designated in an integrated schedule to occur in one or more of the next fiscal years into FY 1995. Task plans are updated every 6 months. In FY 1992, scientists will continue to improve Phase 1 data and models to calculate more accurate and precise dose estimates. The plan for FY 1992 has been prepared based on activities and budgets approved by the Technical Steering Panel (TSP) at its meeting on August 19--20, 1991. The activities can be divided into four categories: (1) model and data evaluation activities, (2) additional dose estimates, (3) model and data development activities, and (4) technical and communication support. 3 figs., 2 tabs
International Nuclear Information System (INIS)
Ramsdell, J.V.
1991-07-01
Radiation doses that may have resulted from operations at the Hanford Site are being estimated in the Hanford Environmental Dose Reconstruction (HEDR) Project. One of the project subtasks, atmospheric transport, is responsible for estimating the transport, diffusion and deposition of radionuclides released to the atmosphere. This report discusses modeling transport and diffusion in the atmospheric pathway. It is divided into three major sections. The first section of the report presents the atmospheric modeling approach selected following discussion with the Technical Steering Panel that directs the HEDR Project. In addition, the section discusses the selection of the MESOI/MESORAD suite of atmospheric dispersion models that form the basis for initial calculations and future model development. The second section of the report describes alternative modeling approaches that were considered. Emphasis is placed on the family of plume and puff models that are based on Gaussian solution to the diffusion equations. The final portion of the section describes the performance of various models. The third section of the report discusses factors that bear on the selection of an atmospheric transport modeling approach for HEDR. These factors, which include the physical setting of the Hanford Site and the available meteorological data, serve as constraints on model selection. Five appendices are included in the report. 39 refs., 4 figs., 2 tabs
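The MESOI/MESORAD codes discussed above belong to the family of Gaussian plume and puff models mentioned in the report. As an illustration only (not the HEDR implementation), the steady-state Gaussian plume concentration with ground reflection can be computed as:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration with ground reflection.

    q: source strength (e.g. Bq/s), u: wind speed (m/s), h: effective release
    height (m); sigma_y, sigma_z: dispersion parameters (m) evaluated at the
    downwind distance of interest. Returns concentration in source units per m^3."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# centreline ground-level concentration for a hypothetical unit release
conc = gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, h=50.0,
                      sigma_y=80.0, sigma_z=40.0)
```

Puff models apply the same Gaussian kernel to discrete releases advected along trajectories, which suits the time-varying winds of the Hanford region better than a single straight-line plume.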
Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand
2017-04-01
The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.
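The paper's experiments use R; as a hedged Python analogue of the kind of hands-on probability simulation the authors describe, here is a short Monte Carlo estimate of the birthday-collision probability:

```python
import random

def birthday_collision_prob(n_people, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that at least two of
    n_people share a birthday (365 equally likely days)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        days = [rng.randrange(365) for _ in range(n_people)]
        if len(set(days)) < n_people:   # a repeated day means a collision
            hits += 1
    return hits / trials

estimate = birthday_collision_prob(23)  # the exact answer is about 0.5073
```

Experiments like this let students compare a simulated estimate against the exact combinatorial answer, which is the data-analysis loop the paper advocates.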
Acceptance test procedure for K basins dose reduction project clean and coat equipment
International Nuclear Information System (INIS)
Creed, R.F.
1996-01-01
This document is the Acceptance Test Procedure (ATP) for the clean and coat equipment designed by Oceaneering Hanford, Inc. under purchase order MDK-XVC-406988 for use in the 105 K East Basin. The ATP provides the guidelines and criteria to test the equipment's ability to clean and coat the concrete perimeter, divider walls, and dummy elevator pit above the existing water level. This equipment was designed and built in support of the Spent Nuclear Fuel, Dose Reduction Project. The ATP will be performed at the 305 test facility in the 300 Area at Hanford. The test results will be documented in WHC-SD-SNF-ATR-020
International Nuclear Information System (INIS)
Sabatier, L.
2006-01-01
RISC-RAD (Radiosensitivity of Individuals and Susceptibility to Cancer induced by ionizing Radiations) is an Integrated Project funded by the European Commission under the 6th Framework Programme / EURATOM. RISC-RAD started on 1 January 2004 for a duration of four years. Coordinated by CEA (Dr Laure Sabatier), it involves 11 European countries (Austria, Denmark, Finland, France, Germany, Ireland, Italy, the Netherlands, Spain, Sweden and the United Kingdom) and 29 research institutions. Objectives: Exposures to low and protracted doses of ionizing radiation are very frequent in the normal living environment, at workplaces, in industry and in medicine. The effects of these exposures on human health cannot be reliably assessed by epidemiological methods, nor are they thoroughly understood by biologists. The RISC-RAD project proposes to help bridge this gap in scientific knowledge. To achieve this goal, a necessary key step is to understand the basic mechanisms by which radiation induces cancer. Studying this multistage process in an integrated way, the project offers a new biological approach characterised by a clear-cut and objective-driven scientific policy: the project is focused on the effects of low doses (less than 100 mSv) and protracted doses of radiation. It aims at identifying new parameters that take into account the differences in radiation responses between individuals. A group of modellers works closely with the experimental teams in order to better quantify the risks associated with low and protracted doses. Research work is divided into five work packages interacting closely with each other. WP1 is dedicated to DNA damage. Ionizing radiation (IR) produces a broad spectrum of base modifications and DNA strand breaks of different kinds, among which are double-strand breaks and 'clustered damage', the latter thought to be a major feature in the biological effectiveness of IR. The aim of Work Package 1 is to improve understanding of the initial DNA damage induced by
Data base on dose reduction research projects for nuclear power plants. Volume 4
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Vulin, D.S.; Liang, H.; Baum, J.W. [Brookhaven National Lab., Upton, NY (United States)]
1992-08-01
This is the fourth volume in a series of reports that provide information on dose reduction research and health physics technology for nuclear power plants. The information is taken from a data base maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. This report presents information on 118 new or updated projects, covering a wide range of activities. Projects including steam generator degradation, decontamination, robotics, improvement in reactor materials, and inspection techniques, among others, are described in the research section of the report. The section on health physics technology includes some simple and very cost-effective projects to reduce radiation exposures. Included in this volume is a detailed description of how to access the BNL data bases which store this information. All project abstracts from this report, as well as many other useful documents, can be accessed, with permission, through our on-line system, ACE. A computer equipped with a modem, or a fax machine is all that is required to connect to ACE. Many features of ACE, including software, hardware, and communications specifics, are explained in this report.
Data base on dose reduction research projects for nuclear power plants
Energy Technology Data Exchange (ETDEWEB)
Khan, T.A.; Vulin, D.S.; Liang, H.; Baum, J.W. (Brookhaven National Lab., Upton, NY (United States))
1992-08-01
This is the fourth volume in a series of reports that provide information on dose reduction research and health physics technology for nuclear power plants. The information is taken from a data base maintained by Brookhaven National Laboratory's ALARA Center for the Nuclear Regulatory Commission. This report presents information on 118 new or updated projects, covering a wide range of activities. Projects including steam generator degradation, decontamination, robotics, improvement in reactor materials, and inspection techniques, among others, are described in the research section of the report. The section on health physics technology includes some simple and very cost-effective projects to reduce radiation exposures. Included in this volume is a detailed description of how to access the BNL data bases which store this information. All project abstracts from this report, as well as many other useful documents, can be accessed, with permission, through our on-line system, ACE. A computer equipped with a modem, or a fax machine is all that is required to connect to ACE. Many features of ACE, including software, hardware, and communications specifics, are explained in this report.
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
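As a small illustration of the classical model the book treats (not code from the book), the compound Poisson model with exponential claims has a closed-form infinite-horizon ruin probability, against which a Monte Carlo simulation can be checked:

```python
import math
import random

def ruin_prob_mc(u, lam=1.0, mu=1.0, theta=0.5, horizon=100.0,
                 n_paths=10000, seed=7):
    """Monte Carlo ruin probability for the compound Poisson (Cramér-Lundberg)
    model: Poisson(lam) claim arrivals, Exp(mean mu) claim sizes, premium
    rate c = (1 + theta) * lam * mu, initial reserve u."""
    c = (1.0 + theta) * lam * mu
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            dt = rng.expovariate(lam)           # time to next claim
            t += dt
            if t > horizon:                     # horizon long enough that
                break                           # later ruin is negligible
            surplus += c * dt - rng.expovariate(1.0 / mu)
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# For exponential claims the exact infinite-horizon ruin probability is
# psi(u) = exp(-theta * u / ((1 + theta) * mu)) / (1 + theta)
exact = math.exp(-0.5 * 2.0 / 1.5) / 1.5
estimate = ruin_prob_mc(2.0)
```

The parameter values (lam, mu, theta, u) are arbitrary; the point is that the simulation reproduces the closed form, which then serves as a sanity check for the heavier machinery (phase-type claims, Markov-modulation) the book develops.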
International Nuclear Information System (INIS)
Khelassi-Toutaoui, Nadia; Merad, Ahmed; Toutaoui, A.E.K.; Bairi, Souad
2008-01-01
Full text: Purpose: To evaluate patient doses in Interventional Radiology (IR) and Cardiology (IC) procedures in Algeria, within the framework of an International Atomic Energy Agency (IAEA) regional project on radiation protection of patients and medical exposure control (RAF 9033). Materials and Methods: Three public hospitals (CHU Bab el Oued, CHU Parnet and CHU Mustapha) and one specialised Cardiology Service (Clinique Maouche) were chosen for the study. For Maximum Skin Dose (MSD) evaluation, gafchromic films XR type R were used, placed on the patient's back before the procedure. The Dose Area Product (DAP) and MSD were measured in 57 IR and IC procedures, either diagnostic or therapeutic. Results: The results revealed large variations in MSD (0.06-3.3 Gy) and DAP (5.5-332 mGy·cm²). Mean MSD was 0.227 Gy in cerebral angiography, 0.202 Gy in coronary angiography, 1.162 Gy in Percutaneous Transluminal Coronary Angioplasty (PTCA) and 0.128 Gy in abdominal angiography. The correlation of DAP and MSD was significant (r = 0.7). The correlation between DAP and fluoroscopy time was also significant (r = 0.8). Conclusion: The highest MSD values were found in PTCA, which is a therapeutic procedure. Two PTCAs out of the 57 procedures measured in total had MSD over the threshold of 2 Gy for deterministic effects (MSD 1 = 3.0 Gy and MSD 2 = 3.3 Gy). The large variations in MSD reveal the need to continuously monitor patient doses in IR and IC procedures, with special emphasis on the PTCA procedure. (author)
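The reported correlations are ordinary Pearson coefficients between paired per-procedure readings. A minimal sketch with synthetic data (not the study's measurements; the linear trend and noise level are assumptions tuned to give a similar correlation strength):

```python
import numpy as np

# synthetic paired per-procedure readings (NOT the study's data): an assumed
# linear DAP-MSD trend plus measurement noise
rng = np.random.default_rng(0)
dap = rng.uniform(5.5, 332.0, size=57)                  # dose-area product
msd = 0.008 * dap + rng.normal(0.0, 0.75, size=57)      # maximum skin dose
r = float(np.corrcoef(dap, msd)[0, 1])                  # Pearson coefficient
```

With 57 procedures, a coefficient around 0.7 is strong but leaves real scatter, which is why the authors argue for monitoring MSD directly rather than inferring it from DAP alone.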
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
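A classical one-sample P-P plot, the construction these generalized plots extend, compares the model CDF evaluated at the sorted data against the empirical CDF levels i/n. A minimal sketch, using a standard normal model as the hypothesized distribution:

```python
import numpy as np
from math import erf, sqrt

def pp_points(sample, cdf):
    """Classical one-sample P-P plot coordinates: model CDF at the sorted
    data against the empirical CDF levels i/n."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    return cdf(x), np.arange(1, n + 1) / n

# standard normal CDF, vectorised via the error function
norm_cdf = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / sqrt(2.0))))

rng = np.random.default_rng(42)
data = rng.normal(loc=0.0, scale=1.0, size=500)
theor, empir = pp_points(data, norm_cdf)
max_dev = np.max(np.abs(theor - empir))  # small when the hypothesized model fits
```

When the model is correct the points hug the diagonal; the generalized plots of the paper index this comparison by closed intervals rather than half-lines.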
International Nuclear Information System (INIS)
Padmanaban, Sriram; Warren, Samantha; Walsh, Anthony; Partridge, Mike; Hawkins, Maria A
2014-01-01
To investigate systematic changes in dose arising when treatment plans optimised using the Anisotropic Analytical Algorithm (AAA) are recalculated using Acuros XB (AXB) in patients treated with definitive chemoradiotherapy (dCRT) for locally advanced oesophageal cancers. We have compared treatment plans created using AAA with those recalculated using AXB. Although the Anisotropic Analytical Algorithm (AAA) is currently more widely used in clinical routine, Acuros XB (AXB) has been shown to more accurately calculate the dose distribution, particularly in heterogeneous regions. Studies to predict clinical outcome should be based on modelling the dose delivered to the patient as accurately as possible. CT datasets from ten patients were selected for this retrospective study. VMAT (volumetric modulated arc therapy) plans with 2 arcs, collimator rotation ± 5-10° and a dose prescription of 50 Gy / 25 fractions were created using Varian Eclipse (v10.0). The initial dose calculation was performed with AAA, and AXB plans were created by re-calculating the dose distribution using the same number of monitor units (MU) and multileaf collimator (MLC) files as the original plan. The difference in calculated dose to organs at risk (OAR) was compared using dose-volume histogram (DVH) statistics, and p values were calculated using the Wilcoxon signed rank test. The potential clinical effect of dosimetric differences in the gross tumour volume (GTV) was evaluated using three different TCP models from the literature. PTV median dose was apparently 0.9 Gy lower (range: 0.5 Gy - 1.3 Gy; p < 0.05) for VMAT AAA plans re-calculated with AXB, and GTV mean dose was reduced by on average 1.0 Gy (0.3 Gy - 1.5 Gy; p < 0.05). An apparent difference in TCP of between 1.2% and 3.1% was found depending on the choice of TCP model. OAR mean dose was lower in the AXB recalculated plan than in the AAA plan (on average, dose reduction: lung 1.7%, heart 2.4%). Similar trends were seen for CRT plans.
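TCP models of the kind used for such evaluations are typically of Poisson form, TCP = exp(-N·exp(-αD)) for N clonogens with radiosensitivity α. A hedged sketch with illustrative parameter values (not those of the three literature models used in the study), showing how a ~1 Gy dose reduction translates into a TCP change:

```python
import math

def poisson_tcp(dose, alpha=0.35, n_clonogens=1.0e7):
    """Poisson tumour control probability with simple exponential cell kill:
    TCP = exp(-N * exp(-alpha * D)). alpha (per Gy) and the clonogen
    number N are illustrative values, not those of the study's models."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose))

tcp_nominal = poisson_tcp(50.0)   # planned (AAA-reported) dose
tcp_lower = poisson_tcp(49.0)     # ~1 Gy lower recalculated dose
delta = tcp_nominal - tcp_lower   # TCP loss implied by the dose difference
```

The steep exponential-in-an-exponential form is why a dose difference of only ~2% of the prescription can shift the predicted TCP by several percentage points, as the study reports.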
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Kalmbach, K.; Röhlig, K.-J.
2016-01-01
Within the ENTRIA project, an interdisciplinary group of scientists developed a research paper aiming at a synthesis of the technical, sociology of knowledge, legal, societal, and political aspects of dose limits within the field of radioactive waste management. In this paper, the ENTRIA project is
International Nuclear Information System (INIS)
Catlin, R.J.; Goldman, M.; Anspaugh, L.R.
1988-01-01
Estimates of projected collective dose and average individual dose commitments from Chernobyl releases were made for various regions. Consideration was given to the possible effectiveness of protective actions taken by various countries to reduce projected doses to their populations. Although some preliminary data indicate possible mean reductions of about 25% in total collective doses over the first year, and of about 55% in collective dose to the thyroid, no corrections were made to these dose estimates because of the variable nature of the data. A new combined set of dose-effect models recently published by the United States Nuclear Regulatory Commission was then applied to estimate the ranges of possible future additional health effects due to the Chernobyl accident. In this method possible health effects are estimated on an individual site basis and the results are then summed. Both absolute and relative risk projection models are used. By use of these methods, "best" estimates of possible additional health effects were projected for the Northern Hemisphere as follows: 1) over the next 50 years, up to 28 thousand radiation-induced fatal cancers, compared to an expected 600 million cancer deaths from natural or spontaneous causes; 2) over the next year, up to 700 additional cases of severe mental retardation, compared to a normal expectation of 340 thousand cases; and 3) in the first generation, up to 1.9 thousand radiation-induced genetic disorders, compared to 180 million naturally-occurring cases. The possibility of zero health effects at very low doses and dose rates cannot be excluded. Owing to the very large numbers of naturally-occurring health effects, it is unlikely that any additional health effects will be demonstrable except, perhaps, for the more highly exposed population in the immediate vicinity of Chernobyl. 13 refs, 4 figs, 6 tabs
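Collective-dose projections of this kind rest on linear arithmetic: excess effects ≈ collective dose × nominal risk coefficient. A sketch with hypothetical numbers (the report's own dose-effect models are more detailed, applying site-by-site absolute and relative risk projections):

```python
def projected_excess_cancers(collective_dose_person_sv, risk_per_person_sv=0.05):
    """Linear projection: excess fatal cancers = collective dose * nominal
    risk coefficient. The coefficient 0.05 per person-Sv is a commonly
    quoted nominal value, used here purely for illustration."""
    return collective_dose_person_sv * risk_per_person_sv

# hypothetical hemisphere-wide collective dose of 600,000 person-Sv
excess = projected_excess_cancers(6.0e5)   # about 30,000 projected cases
```

The comparison the abstract draws — tens of thousands of projected cases against hundreds of millions of spontaneous cancer deaths — is why such excesses are expected to be statistically undetectable outside the most exposed populations.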
Padmanaban, Sriram; Warren, Samantha; Walsh, Anthony; Partridge, Mike; Hawkins, Maria A
2014-12-23
To investigate systematic changes in dose arising when treatment plans optimised using the Anisotropic Analytical Algorithm (AAA) are recalculated using Acuros XB (AXB) in patients treated with definitive chemoradiotherapy (dCRT) for locally advanced oesophageal cancers. We have compared treatment plans created using AAA with those recalculated using AXB. Although the Anisotropic Analytical Algorithm (AAA) is currently more widely used in clinical routine, Acuros XB (AXB) has been shown to more accurately calculate the dose distribution, particularly in heterogeneous regions. Studies to predict clinical outcome should be based on modelling the dose delivered to the patient as accurately as possible. CT datasets from ten patients were selected for this retrospective study. VMAT (Volumetric modulated arc therapy) plans with 2 arcs, collimator rotation ± 5-10° and dose prescription 50 Gy / 25 fractions were created using Varian Eclipse (v10.0). The initial dose calculation was performed with AAA, and AXB plans were created by re-calculating the dose distribution using the same number of monitor units (MU) and multileaf collimator (MLC) files as the original plan. The difference in calculated dose to organs at risk (OAR) was compared using dose-volume histogram (DVH) statistics and p values were calculated using the Wilcoxon signed rank test. The potential clinical effect of dosimetric differences in the gross tumour volume (GTV) was evaluated using three different TCP models from the literature. PTV median dose was apparently 0.9 Gy lower (range: 0.5 Gy - 1.3 Gy; p < 0.05) for VMAT AAA plans re-calculated with AXB and GTV mean dose was reduced by on average 1.0 Gy (0.3 Gy - 1.5 Gy; p < 0.05). An apparent difference in TCP of between 1.2% and 3.1% was found depending on the choice of TCP model. OAR mean dose was lower in the AXB recalculated plan than the AAA plan (on average, dose reduction: lung 1.7%, heart 2.4%). Similar trends were seen for CRT plans. Differences in dose distribution are observed with VMAT and CRT plans recalculated with AXB, particularly within soft tissue at the tumour/lung interface, where AXB has been shown to more accurately calculate the dose distribution.
International Nuclear Information System (INIS)
Wang, Rui; Schoepf, U. Joseph; Wu, Runze; Reddy, Ryan P.; Zhang, Chuanchen; Yu, Wei; Liu, Yi; Zhang, Zhaoqi
2012-01-01
Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV and images were reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than that in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with 50% reduction of radiation dose.
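The objective metrics reported are computed from region-of-interest statistics: SNR is the mean attenuation divided by the image noise, and CNR additionally subtracts the background mean. A sketch with synthetic ROI samples (the HU figures are illustrative, not the study's measurements):

```python
import numpy as np

def snr_cnr(vessel_roi, background_roi):
    """SNR and CNR from two region-of-interest pixel samples (in HU):
    SNR = vessel mean / background noise (SD),
    CNR = (vessel mean - background mean) / background noise."""
    noise = float(np.std(background_roi))
    v_mean = float(np.mean(vessel_roi))
    b_mean = float(np.mean(background_roi))
    return v_mean / noise, (v_mean - b_mean) / noise

# synthetic ROIs: enhanced aorta ~400 HU, fat ~-80 HU, noise SD ~25 HU
rng = np.random.default_rng(3)
snr, cnr = snr_cnr(rng.normal(400.0, 25.0, 1000), rng.normal(-80.0, 25.0, 1000))
```

Iterative reconstruction earns its dose saving here: SAFIRE suppresses the noise term in the denominator, so SNR and CNR can be held constant even as tube output drops.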
Energy Technology Data Exchange (ETDEWEB)
Nielsen, Sven P.; Isaksson, M.; Nilsson, Elisabeth (and others)
2005-07-01
The NKS B-programme EcoDoses project started in 2003 as a collaboration between all the Nordic countries. The aim of the project is to improve the radiological assessments of doses to man from terrestrial ecosystems. The present report sums up the work performed in the second phase of the project. The main topics in 2004 have been: (i) A continuation of previous work with a better approach for estimating global fallout on a regional or national scale, based on a correlation between precipitation and deposition rates. (ii) Further extension of the EcoDoses milk database. Estimation of effective ecological half-lives of 137Cs in cows' milk, focussing on suitable post-Chernobyl time-series. Modelling integrated transfer of 137Cs to cow's milk from Nordic countries. (iii) Determination of effective ecological half-lives for fresh water fish from Nordic lakes. (iv) Investigation of radioecological sensitivity for Nordic populations. (v) Food-chain modelling using the Ecosys model, which is the underlying food and dose module in several computerised decision-making systems. (au)
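An effective ecological half-life, as estimated for 137Cs in milk and fish, is conventionally obtained by fitting a single exponential to a post-deposition time series. A minimal sketch on synthetic data (the 1.5-year value is an assumption for illustration, not a project result):

```python
import numpy as np

def effective_half_life(t_years, activity):
    """Effective ecological half-life from a log-linear least-squares fit of
    A(t) = A0 * exp(-lambda_eff * t)."""
    slope, _intercept = np.polyfit(t_years, np.log(activity), 1)
    return np.log(2.0) / -slope

# synthetic 137Cs-in-milk time series with a built-in 1.5-year half-life
t = np.arange(0.0, 10.0, 0.5)
a = 100.0 * np.exp(-np.log(2.0) / 1.5 * t)
half_life = effective_half_life(t, a)   # recovers 1.5 years
```

On real monitoring data the log-linear fit also absorbs measurement scatter, and the effective half-life folds together physical decay and ecological removal processes.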
International Nuclear Information System (INIS)
Nielsen, Sven P.; Isaksson, M.; Nilsson, Elisabeth
2005-07-01
The NKS B-programme EcoDoses project started in 2003 as a collaboration between all the Nordic countries. The aim of the project is to improve the radiological assessments of doses to man from terrestrial ecosystems. The present report sums up the work performed in the second phase of the project. The main topics in 2004 have been: (i) continuation of previous work on a better approach for estimating global fallout on a regional or national scale, based on a correlation between precipitation and deposition rates; (ii) further extension of the EcoDoses milk database, estimation of effective ecological half-lives of 137Cs in cow's milk focussing on suitable post-Chernobyl time-series, and modelling of the integrated transfer of 137Cs to cow's milk in the Nordic countries; (iii) determination of effective ecological half-lives for freshwater fish from Nordic lakes; (iv) investigation of the radioecological sensitivity of Nordic populations; (v) food-chain modelling using the Ecosys model, which is the underlying food and dose module in several computerised decision-making systems. (au)
International Nuclear Information System (INIS)
Marsh, J. W.; Castellani, C. M.; Hurtgen, C.; Lopez, M. A.; Andrasi, A.; Bailey, M. R.; Birchall, A.; Blanchardon, E.; Desai, A. D.; Dorrian, M. D.; Doerfel, H.; Koukouliou, V.; Luciani, A.; Malatova, I.; Molokanov, A.; Puncher, M.; Vrba, T.
2008-01-01
The work of Task Group 5.1 (uncertainty studies and revision of IDEAS guidelines) and Task Group 5.5 (update of IDEAS databases) of the CONRAD project is described. Scattering factor (SF) values (i.e. measurement uncertainties) have been calculated for different radionuclides and types of monitoring data using real data contained in the IDEAS Internal Contamination Database. Based upon this work and other published values, default SF values are suggested. Uncertainty studies have been carried out using both a Bayesian and a frequentist (classical) approach. The IDEAS guidelines have been revised in areas relating to the evaluation of an effective AMAD; guidance is given on evaluating wound cases with the NCRP wound model, and suggestions are made on the number and type of measurements required for dose assessment. (authors)
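Under the lognormal error model commonly assumed in internal dosimetry, the scattering factor is the geometric standard deviation of repeated measurement results for the same quantity. A hedged sketch with invented bioassay values (not database entries):

```python
import math

# Illustrative repeated 24-h urine 137Cs measurements (Bq) for one subject;
# the values are hypothetical. Under a lognormal error model the scattering
# factor SF is the geometric standard deviation of the results.
measurements = [52.0, 61.0, 47.0, 70.0, 55.0, 44.0]

logs = [math.log(m) for m in measurements]
mu = sum(logs) / len(logs)                              # log-scale mean
var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
sf = math.exp(math.sqrt(var))                           # geometric standard deviation

print(round(sf, 3))
```

An SF near 1 indicates tight reproducibility; whole-body counting typically yields smaller SF values than excretion data, which is one reason per-monitoring-type defaults are tabulated.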
International Nuclear Information System (INIS)
Stolk, D.J.
1987-04-01
At the request of the Netherlands government, FEL-TNO is developing a decision support system with the acronym RAMBOS for the assessment of the off-site consequences of an accident with hazardous materials. This is a user-friendly interactive computer program with sophisticated graphics. RAMBOS supports the emergency planning organization in two ways. Firstly, the risk to residents in the surroundings of the accident is quantified in terms of severity and magnitude (number of casualties, etc.). Secondly, the consequences of countermeasures, such as sheltering and evacuation, are predicted. By evaluating several countermeasures the user can determine an optimum policy to reduce the impact of the accident. Within the framework of the EC project 'Benchmark exercise on dose estimation in a regulatory context', and at the request of the Ministry of Housing, Physical Planning and Environment, calculations were carried out with the RAMBOS system. This report contains the results of these calculations. 3 refs.; 2 figs.; 10 tabs
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are quantum objects; it shows only that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules to decision making is connected with the nontrivial process of making decisions over composite prospects under uncertainty. Such a process involves the deliberations of a decision maker when making a choice. In addition to evaluating the utilities of the considered prospects, real decision makers also appraise their respective attractiveness. Human choice is therefore not based solely on the utility of prospects, but requires resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as probabilities determined by quantum rules. We show that quantum behavioral probabilities do not merely explain qualitatively how human decisions are made; they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
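A minimal sketch of the form such behavioral probabilities take: a classical utility factor corrected by an attraction factor, with the average attraction magnitude set to 0.25 (the "quarter law" of quantum decision theory). The utilities and the sign of the attraction term below are invented for illustration:

```python
import math

# Sketch of the quantum-decision-theory form p(A) = f(A) + q(A): the
# behavioral probability is a utility factor f (classical, from utilities)
# plus an attraction factor q of average magnitude 0.25 ("quarter law").
# The utilities are hypothetical.
utilities = {"A": 1.0, "B": 0.5}

# Utility factor via a logit (Luce) rule
z = {k: math.exp(u) for k, u in utilities.items()}
s = sum(z.values())
f = {k: v / s for k, v in z.items()}

# Attraction factor: +0.25 for the subjectively more attractive prospect,
# -0.25 for the other (here B is assumed more attractive), then clipped so
# probabilities stay in [0, 1] and renormalized to sum to 1.
q = {"A": -0.25, "B": +0.25}
p = {k: min(1.0, max(0.0, f[k] + q[k])) for k in f}
total = sum(p.values())
p = {k: v / total for k, v in p.items()}

print({k: round(v, 3) for k, v in p.items()})
```

The point of the construction is that B can end up the more probable choice even though A has the higher utility, which is the utility-attraction duality the abstract describes.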
Parameter calculation tool for the application of radiological dose projection codes
International Nuclear Information System (INIS)
Galindo G, I. F.; Vergara del C, J. A.; Galvan A, S. J.; Tijerina S, F.
2016-09-01
The use of specialized codes to project radiological doses for a postulated emergency event at a nuclear power plant requires that certain plant data be available according to the event being simulated. Calculating the possible radiological release is the critical step in deciding the emergency actions. However, not all of the required plant data are obtained directly from the plant; some must be calculated. In this paper we present a computational tool that calculates the plant data required to run the radiological dose projection codes. The tool provides the required information for an emergency venting of the primary containment atmosphere, whether from the wet well or the dry well, and also calculates the time at which the fuel in the spent fuel pool would become uncovered in the event of a water leak through one of the walls or the floor of the pool. The tool implements mathematical models for the processes involved: compressible flow in pipes, with and without area change, taking friction into account; and, for the spent fuel pool, hydraulic models to calculate the time in which a vessel empties. The models implemented in the tool were validated against literature data for simulated cases, and the results agree closely with the references. The tool will thus support the use of radiological dose projection codes in postulated emergencies, so that protective actions can be determined adequately and efficiently with the least possible impact. (Author)
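The pool-emptying calculation described above can be sketched with a Torricelli-type hydraulic model. All dimensions below are illustrative assumptions, not plant data:

```python
import math

# Hedged sketch: time for the water level of a spent fuel pool to fall from
# h0 to h_fuel through a hole of area a in the pool floor, using Torricelli's
# law Q = Cd * a * sqrt(2 g h). All dimensions are invented for illustration.
g = 9.81                 # gravitational acceleration, m/s^2
A = 100.0                # pool free-surface area, m^2
a = 0.01                 # leak area, m^2
Cd = 0.6                 # discharge coefficient
h0, h_fuel = 12.0, 4.0   # initial level and level at which fuel uncovers, m

# Integrating A dh/dt = -Cd * a * sqrt(2 g h) from h0 down to h_fuel gives:
t = 2 * A * (math.sqrt(h0) - math.sqrt(h_fuel)) / (Cd * a * math.sqrt(2 * g))
print(round(t / 3600, 2), "hours")
```

A leak through a wall would use the same outflow law with the head measured above the breach elevation; the square-root dependence on head is why the level drop decelerates as the pool drains.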
Cavities at the Si projected range by high dose and energy Si ion implantation in Si
International Nuclear Information System (INIS)
Canino, M.; Regula, G.; Lancin, M.; Xu, M.; Pichaud, B.; Ntzoenzok, E.; Barthe, M.F.
2009-01-01
Two series of n-type Si samples, α and β, were implanted with Si ions at high dose (1 × 10^16) and high energies, 0.3 and 1.0 MeV, respectively. Both sets of samples were then implanted with 5 × 10^16 He cm^-2 (at 10 or 50 keV) and eventually with B atoms. Some of the samples were annealed at temperatures ranging from 800 to 1000 °C to allow the thermal growth of He cavities, located between the sample surface and the projected range (R_p) of Si. After the triple ion implantation, which constitutes the defect engineering, samples were characterized by cross-section transmission electron microscopy (XTEM). Voids (or bubbles) are observed not only at R_p(He) in all annealed samples, but also at R_p(Si) in β samples implanted with He at 50 keV. The samples were also studied by positron annihilation spectroscopy (PAS); the spectra confirm that as-implanted samples contain di-vacancies and that the annealed ones, even at high temperature, have bigger open volumes, which are assumed to be the same voids observed by XTEM. It is demonstrated that a single Si implantation at high energy and dose is efficient in creating cavities that are thermally stable up to 1000 °C only in the presence of He.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are the subject of intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators......
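One well-known Monte Carlo estimator for such tail probabilities is the Asmussen-Kroese conditional estimator, sketched here for a sum of Pareto variables; the distribution and parameters are chosen for illustration, not taken from the dissertation.

```python
import random

# Sketch of the Asmussen-Kroese conditional Monte Carlo estimator for the
# tail probability P(X1 + ... + Xn > x) with heavy-tailed (Pareto) summands:
# estimate n * E[ P(X > max(M_{n-1}, x - S_{n-1})) ], where M and S are the
# max and sum of n-1 simulated summands. For subexponential sums this has
# far lower relative error than crude Monte Carlo.
random.seed(1)

alpha, n, x, reps = 1.5, 4, 100.0, 20000

def pareto():                  # Pareto(alpha) sample on [1, inf)
    return random.random() ** (-1.0 / alpha)

def tail(t):                   # exact P(X > t) for the same Pareto
    return 1.0 if t <= 1.0 else t ** (-alpha)

est = 0.0
for _ in range(reps):
    xs = sorted(pareto() for _ in range(n - 1))
    m, s = xs[-1], sum(xs)
    est += n * tail(max(m, x - s))
est /= reps

# Subexponential asymptotics for comparison: P(Sn > x) ~ n * P(X > x)
print(est, n * x ** (-alpha))
```

The estimator exploits the single-big-jump principle: conditioning on the maximum summand removes the rare-event character of the problem.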
Energy Technology Data Exchange (ETDEWEB)
Bergan, T.; Skuterud, Lavrans; Thoerring, Haavard (Norway); Liland, A. (Norwegian Radiation Protection Authority (NRPA)) (eds.)
2004-05-01
The NKS B-programme EcoDoses project started in 2003 as a collaboration between all the Nordic countries. The aim of the project is to improve the radiological assessments of doses to man from terrestrial ecosystems. The first part, conducted in 2003, focussed on an extensive collation and review of both published and unpublished data from all the Nordic countries for the nuclear weapons fallout period and the post-Chernobyl period. This included data on radionuclides in air filters, precipitation, soil samples, milk and reindeer. Based on this, an improved model for estimating radioactive fallout from precipitation data during the nuclear weapons fallout period has been developed. Effective ecological half-lives for 137Cs and 90Sr in milk have been calculated for the nuclear weapons fallout period. For reindeer, the ecological half-lives for 137Cs have been calculated for both the nuclear weapons fallout period and the post-Chernobyl period. The data were also used to compare modelling results with observed concentrations. This was done at a workshop where the radioecological food-and-dose module in the ARGOS decision support system was used to predict the transfer of deposited radionuclides to foodstuffs and the subsequent radiation doses to man. The work conducted in the first year is presented in this report and gives interesting new results relevant for terrestrial radioecology. (au)
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
Intercomparison exercise on internal dose assessment. Final report of a joint IAEA-IDEAS project
International Nuclear Information System (INIS)
2007-09-01
There have been several intercomparison exercises organized already at national and international levels for the assessment of occupational exposure due to intakes of radionuclides. These intercomparison exercises revealed significant differences in approaches, methods and assumptions, and consequently in the results. Because of the relevance of the issue for internal dosimetrists, the IAEA organized a new intercomparison exercise in cooperation with the IDEAS project 'General Guidelines for the Evaluation of Incorporation Monitoring Data', launched under the 5th EU Framework Programme (EU Contract No. FIKR-CT2001-00160). This new intercomparison exercise focused especially on the effect of the guidelines on the harmonization of internal dosimetry. It also considered the following aspects: - to provide possibilities for the participating laboratories to check the quality of their internal dose assessment methods in applying the recent ICRP recommendations (e.g. for the new respiratory tract model); - to compare different approaches in the interpretation of internal contamination monitoring data; - to quantify the differences in internal dose assessments based on the new guidelines or on other procedures, respectively; - to provide some figures for the influence of the input parameters on the monitoring results; and - to provide a broad forum for information exchange. Several cases were selected for this exercise with the aim of covering a wide range of practices in the nuclear fuel cycle and in medical applications. The cases were: 1. Acute intake of HTO; 2. Acute inhalation of the fission products 137Cs and 90Sr; 3. Intake of 60Co; 4. Repeated intakes of 131I; 5. Intake of enriched uranium; 6. Single intake of plutonium radionuclides and 241Am. An Internet-based approach was used for the presentation of the cases, collection of responses and discussion of the results. Solutions to these cases were reported by 80 participants worldwide. This report presents the results of the exercise.
Engelsman, Martijn; Remeijer, Peter; van Herk, Marcel; Mijnheer, Ben; Damen, Eugène
2003-01-01
To assess the benefit of beam fringe (50%-90% dose level) sharpening for lung tumors, we performed a numerical simulation in which all geometrical errors (breathing motion, random and systematic errors) are included. A 50 mm diameter lung tumor, located centrally in a lung-equivalent phantom was
Takahashi, Fumihiro; Morita, Satoshi
2018-02-08
Phase II clinical trials are conducted to determine the optimal dose of the study drug for use in Phase III clinical trials while also balancing efficacy and safety. In conducting these trials, it may be important to consider subpopulations of patients grouped by background factors such as drug metabolism and kidney and liver function. Determining the optimal dose, as well as maximizing the effectiveness of the study drug by analyzing patient subpopulations, requires a complex decision-making process. In extreme cases, drug development has to be terminated due to inadequate efficacy or severe toxicity, and such a decision may be based on a particular subpopulation. We propose a Bayesian utility approach (BUART) to randomized Phase II clinical trials, which uses a first-order bivariate normal dynamic linear model for efficacy and safety in order to determine the optimal dose and study population for a subsequent Phase III clinical trial. We carried out a simulation study under a wide range of clinical scenarios to evaluate the performance of the proposed method in comparison with a conventional method that analyzes efficacy and safety separately in each patient population. The proposed method showed more favorable operating characteristics in determining the optimal population and dose.
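The utility-based trade-off at the heart of such designs can be sketched very simply: combine per-dose estimates of efficacy and toxicity probability into a single utility and pick the maximizer. The probabilities and trade-off weights below are invented, and the sketch deliberately omits the Bayesian dynamic linear model of the actual proposal:

```python
# Hedged sketch of utility-weighted dose selection. All numbers are
# hypothetical; a real design would use posterior probability estimates.
doses = [10, 20, 40, 80]            # mg, illustrative dose levels
p_eff = [0.20, 0.45, 0.60, 0.70]    # estimated response probability per dose
p_tox = [0.05, 0.10, 0.25, 0.55]    # estimated toxicity probability per dose

# Trade-off weights: a unit of toxicity risk costs more utility than a
# unit of efficacy gains, encoding the safety emphasis of the design.
w_eff, w_tox = 1.0, 1.5

utilities = [w_eff * e - w_tox * t for e, t in zip(p_eff, p_tox)]
best = doses[utilities.index(max(utilities))]
print(best, [round(u, 3) for u in utilities])
```

Note how the highest dose, despite the best efficacy, is rejected once its toxicity is penalized; subpopulation analysis repeats this calculation with subgroup-specific probability estimates.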
Timmers, Janine; ten Voorde, Marloes; van Engen, Ruben E.; van Landsveld-Verhoeven, Cary; Pijnappel, Ruud; Droogh-de Greve, Kitty; den Heeten, Gerard J.; Broeders, Mireille J. M.
2015-01-01
Purpose: To compare projected breast area, image quality, pain experience and radiation dose between mammography performed with and without radiolucent positioning sheets. Methods: 184 women screened in the Dutch breast screening programme (May-June 2012) provided written informed consent to have
Timmers, Janine; ten Voorde, Marloes; van Engen, Ruben E.; van Landsveld-Verhoeven, Cary; Pijnappel, Ruud; Droogh-de Greve, Kitty; den Heeten, Gerard J.; Broeders, Mireille J. M.
2015-01-01
To compare projected breast area, image quality, pain experience and radiation dose between mammography performed with and without radiolucent positioning sheets. 184 women screened in the Dutch breast screening programme (May-June 2012) provided written informed consent to have one additional image
Energy Technology Data Exchange (ETDEWEB)
Woloschak, Gayle E [Northwestern Univ., Evanston, IL (United States); Grdina, David [Univ. of Chicago, IL (United States); Li, Jian-Jian [Univ. of California, Davis, CA (United States)
2017-06-12
Low dose ionizing radiation effects are difficult to study in human populations because of numerous confounding factors such as genetic and lifestyle differences. Research in mammalian model systems and in vitro is generally used to overcome this difficulty. In this program project, three projects joined together to investigate the effects of low doses of ionizing radiation: doses at and below 10 cGy of low linear energy transfer ionizing radiation such as X-rays and gamma rays. The project focused on cellular signaling associated with nuclear factor kappa B (NFκB) and mitochondria, subcellular organelles critical for cell aging and the aging-like changes induced by ionizing radiation. In addition to cells in culture, the project utilized animal tissues accumulated in a radiation biology tissue archive housed at Northwestern University (http://janus.northwestern.edu/janus2/index.php). The major thrust of Project 1 was to gather all of the DoE-sponsored irradiated animal (mouse, rat and dog) data and tissues under one roof and investigate mitochondrial DNA changes and microRNA changes in these samples. Through comparison of different samples we sought to delineate mitochondrial DNA quantity alterations and microRNA expression differences associated with different doses and dose rates of radiation. Historic animal irradiation experiments sponsored by DoE were done in several national laboratories and universities between the 1950s and 1990s; when these experiments were closed, data and tissues were released to Project 1. Project 2 used cells in culture to investigate the effects that low doses of radiation have on NFκB and its target genes manganese superoxide dismutase (MnSOD) and genes involved in the cell cycle: cyclins (B1 and D1) and cyclin-dependent kinases (CDKs). Project 3 used cells in culture such as "normal" human cells (the breast epithelial cell line MCF10A and skin keratinocyte cells HK18) and mouse embryo fibroblast (mef
International Nuclear Information System (INIS)
2010-01-01
In recent years, many surgical procedures have increasingly been replaced by interventional procedures that guide catheters into the arteries under X ray fluoroscopic guidance to perform a variety of operations such as ballooning, embolization, implantation of stents, etc. The radiation exposure to patients and staff in such procedures is much higher than in simple radiographic examinations like X rays of the chest or abdomen, such that radiation-induced skin injuries to patients and eye lens opacities among workers have been reported from the 1990s onwards. Interventional procedures have grown both in frequency and importance during the last decade. This Coordinated Research Project (CRP) and TECDOC were developed within the International Atomic Energy Agency's (IAEA) framework of statutory responsibility to provide for the worldwide application of the standards for the protection of people against exposure to ionizing radiation. The CRP took place between 2003 and 2005 in six countries, with a view to optimizing the radiation protection of patients undergoing interventional procedures. The Fundamental Safety Principles and the International Basic Safety Standards for Protection against Ionizing Radiation (BSS) issued by the IAEA and co-sponsored by the Food and Agriculture Organization of the United Nations (FAO), the International Labour Organization (ILO), the World Health Organization (WHO), the Pan American Health Organization (PAHO) and the Nuclear Energy Agency (NEA), among others, require the radiation protection of patients undergoing medical exposures through justification of the procedures involved and through optimization. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients encourages the reduction of patient doses. To facilitate this, it has issued specific advice on the application of the BSS in the field of radiology in Safety Reports Series No. 39 and the three volumes on Radiation
International Nuclear Information System (INIS)
Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne
2013-01-01
Reliable assessment procedures are an important aspect of maintenance concepts, and non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters, and NDT methods can acquire information in all of these areas. However, the level of detail of that information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are important whenever quantitative information is required. Some design concepts, e.g. the damage-tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of the construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA), and only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The probability of detection (POD) curves of specific flaws under specific testing conditions are often not known. This paper presents the results of a research project designed to determine the POD of ultrasound phased array inspections of real and artificial cracks; a further objective of the project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning are discussed, and results of the ultrasound inspections are presented.
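POD curves of the kind generated in such projects are commonly fitted with a hit/miss model, POD(a) = logistic(b0 + b1·ln a). A hedged sketch with synthetic crack-size/detection data (not the project's measurements), fitted by plain gradient ascent on the Bernoulli log-likelihood:

```python
import math

# Hit/miss POD model: POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a))).
# Crack sizes (mm) and detection outcomes below are synthetic.
sizes = [0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]
hits  = [0,   0,   1,   0,   1,   1,   1,   1]

b0, b1, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    g0 = g1 = 0.0
    for a, y in zip(sizes, hits):
        x = math.log(a)
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        g0 += (y - p)          # gradient of log-likelihood w.r.t. b0
        g1 += (y - p) * x      # gradient w.r.t. b1
    b0 += lr * g0
    b1 += lr * g1

# a90: the crack size detected with 90% probability (logit(0.9) = ln 9)
a90 = math.exp((math.log(9.0) - b0) / b1)
print(round(a90, 2), "mm")
```

In qualification practice one would go on to compute a90/95 (the 95% confidence bound on a90), which requires the covariance of the fitted parameters rather than the point fit alone.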
Lu, Bo; Lu, Haibin; Palta, Jatinder
2010-05-12
The objective of this study was to evaluate the effect of kilovoltage cone-beam computed tomography (CBCT) on registration accuracy and image quality with a reduced number of planar projections used in volumetric image reconstruction. The ultimate goal was to evaluate the possibility of reducing the patient dose while maintaining registration accuracy under different projection-number schemes for various clinical sites. An Elekta Synergy linear accelerator with an onboard CBCT system was used in this study. The quality of the Elekta XVI cone-beam three-dimensional volumetric images reconstructed with a decreasing number of projections was quantitatively evaluated with a Catphan phantom. Subsequently, we tested the registration accuracy of imaging data sets on three rigid anthropomorphic phantoms and three real patient sites under reduced projection-number (as low as 1/6th of the full set) reconstruction of the CBCT data with different rectilinear shifts and rotations. CBCT scans of the Catphan phantom indicated that the CBCT images became noisier as the number of projections was reduced, but their spatial resolution and uniformity were hardly affected. The maximum registration errors under small transformations of the reference CT images were within 0.7 mm of translation and 0.3° of rotation. However, when the projection number was lower than one-fourth of the full set with a large transformation of the reference CT images, the registration could easily be trapped in local minima for a nonrigid anatomy. We concluded that, with a conscientiously applied projection-number reduction strategy, the image-guided localization procedure can achieve a lower patient dose without losing registration accuracy for various clinical sites and situations. A faster scanning time is the main advantage over mA-reduction-based dose-reduction methods.
International Nuclear Information System (INIS)
Walrand, Stephan; Hanin, François-Xavier; Pauwels, Stanislas; Jamar, François
2012-01-01
Clinical trials of 177Lu-90Y therapy have used empirical activity ratios. Radionuclides (RN) with a larger maximal beta range could favourably replace 90Y. Our aim is to provide RN dose-deposition kernels and to compare the tumour control probability (TCP) of RN combinations. Dose kernels were derived by integration of the mono-energetic beta-ray dose distributions (computed using Monte Carlo) weighted by their respective beta spectra. Nine homogeneous spherical tumours (1-25 mm in diameter) and four spherical tumours including a lattice of cold, but alive, spheres (1, 3, 5, 7 mm in diameter) were modelled. The TCP for 93Y, 90Y and 125Sn in combination with 177Lu in variable proportions (keeping the renal cortex biological effective dose constant) were derived by 3D dose kernel convolution. For a mean tumour-absorbed dose of 180 Gy, 2 mm homogeneous tumours and tumours including 3 mm diameter cold alive spheres were both well controlled (TCP > 0.9) using a 75-25% combination of 177Lu and 90Y activity. However, 125Sn-177Lu achieved a significantly better result by controlling 1 mm homogeneous tumours simultaneously with tumours including 5 mm diameter cold alive spheres. Clinical trials using RN combinations should use RN proportions tuned to the patient dosimetry. 125Sn production and its coupling to a somatostatin analogue appear feasible. Assuming similar pharmacokinetics, 125Sn is the best RN for combination with 177Lu in peptide receptor radiotherapy, justifying pharmacokinetic studies of 125Sn-labelled somatostatin analogues in rodents. (paper)
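The TCP comparison rests on the Poisson model of cell kill, TCP = exp(−Σᵢ Nᵢ e^(−αDᵢ)), applied to the convolved dose distribution. A minimal sketch with an illustrative radiosensitivity and clonogen density (not the paper's parameters), showing how a small cold subvolume dominates TCP:

```python
import math

# Poisson TCP model: TCP = exp(-sum_i N_i * exp(-alpha * D_i)), where N_i is
# the number of clonogens in region i receiving dose D_i. The values of
# alpha, clonogen density, and the doses below are illustrative assumptions.
alpha = 0.35    # radiosensitivity, Gy^-1
rho = 1.0e7     # clonogen density, clonogens per cm^3

def tcp(regions):
    """regions: list of (dose in Gy, volume in cm^3) pairs."""
    surviving = sum(rho * v * math.exp(-alpha * d) for d, v in regions)
    return math.exp(-surviving)

uniform   = [(100.0, 4.0)]                   # 4 cm^3 tumour, uniform 100 Gy
with_cold = [(100.0, 3.9), (40.0, 0.1)]      # same tumour, 0.1 cm^3 cold spot

print(round(tcp(uniform), 3), round(tcp(with_cold), 3))
```

Even a 2.5%-by-volume cold region can pull TCP far below 1, which is why the cold-sphere lattices in the abstract discriminate so sharply between radionuclide combinations of different beta range.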
Energy Technology Data Exchange (ETDEWEB)
Anderson, D.M.; Bates, D.J.; Marsh, T.L.
1993-03-01
This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945-1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.
Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project
Energy Technology Data Exchange (ETDEWEB)
Anderson, D.M.; Bates, D.J.; Marsh, T.L.
1993-07-01
This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945-1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.
Energy Technology Data Exchange (ETDEWEB)
Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.
1992-09-01
This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC), with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.
International Nuclear Information System (INIS)
Jacobs, D.G.; Easterly, C.E.; Phillips, J.E.
1979-01-01
Releases of tritium in the past have been largely in the form of tritiated water, and the projected radiation doses could be estimated by assuming tritium behaviour to parallel that of water. There is increasing interest in potential releases of tritium in the form of HT because of significant recent advances in fusion reactor research. Several recent studies have shown that bacteria containing the enzyme hydrogenase can catalyse the conversion of HT to HTO at rates several orders of magnitude faster than the rates measured in atmospheric systems. Rates of conversion in the soil have been combined with estimates of rates of permeation of HT into the soil and with global and local models depicting tritium transport and cycling. The results suggest that for the expected conversion rates, the impact on projected radiation doses should be relatively minor. (author)
SU-F-18C-15: Model-Based Multiscale Noise Reduction On Low Dose Cone Beam Projection
International Nuclear Information System (INIS)
Yao, W; Farr, J
2014-01-01
Purpose: To improve the image quality of low dose cone beam CT for patient positioning in radiation therapy. Methods: In low dose cone beam CT (CBCT) imaging systems, a Poisson process governs the randomness of the photon fluence at the x-ray source and at the detector, because photon absorption in the medium is an independent binomial process. On a CBCT projection, the variance of the fluence consists of the variance of the noiseless imaged structure and that of the Poisson noise, which is proportional to the mean (noiseless) fluence at the detector. This calls for multiscale filters that smooth noise while preserving the structural information of the imaged object. We used a mathematical model of the Poisson process to design multiscale filters and established the balance between noise correction and structure blurring. The algorithm was checked with low dose kilovoltage CBCT projections acquired from a Varian OBI system. Results: The investigation of low dose CBCT of a Catphan phantom and patients showed that our model-based multiscale technique could efficiently reduce noise while keeping the fine structure of the imaged object. After the image processing, the number of visible line pairs in the Catphan phantom scanned with 4 ms pulse time was similar to that scanned with 32 ms, and the soft tissue structure in simulated 4 ms patient head-and-neck images was also comparable with that in scanned 20 ms ones. Compared with a fixed-scale technique, the multiscale technique improved image quality. Conclusion: The use of projection-specific multiscale filters achieves a better balance between noise reduction and loss of structural information. The image quality of low dose CBCT can be improved by using multiscale filters.
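The mean-proportional Poisson variance is the reason a single filter scale cannot suit bright and dark regions of a projection alike. One standard way to see (and sidestep) this, sketched here with synthetic fluences, is the Anscombe variance-stabilizing transform; this illustrates the noise model only, not the authors' filter design:

```python
import math
import random

# For Poisson counts the variance equals the mean, so dark and bright
# projection regions have very different noise levels. The Anscombe
# transform 2*sqrt(x + 3/8) approximately stabilizes the variance to ~1,
# after which one smoothing scale serves the whole image. Fluences are synthetic.
random.seed(0)

def poisson(lam):
    """Simple Poisson sampler (Knuth's method, fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

dark   = [poisson(10.0)  for _ in range(4000)]   # low-fluence region
bright = [poisson(100.0) for _ in range(4000)]   # high-fluence region

a_dark   = [2.0 * math.sqrt(x + 0.375) for x in dark]
a_bright = [2.0 * math.sqrt(x + 0.375) for x in bright]

# Raw variances differ ~10x; Anscombe-transformed variances are both ~1.
print(round(var(dark), 1), round(var(bright), 1),
      round(var(a_dark), 2), round(var(a_bright), 2))
```

A projection-specific multiscale filter achieves the same end differently, by adapting the smoothing scale to the local mean instead of transforming the data.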
Popowicz, Natalia; Bintcliffe, Oliver; De Fonseka, Duneesha; Blyth, Kevin G; Smith, Nicola A; Piccolo, Francesco; Martin, Geoffrey; Wong, Donny; Edey, Anthony; Maskell, Nick; Lee, Y C Gary
2017-06-01
Intrapleural therapy with a combination of tissue plasminogen activator (tPA) 10 mg and DNase 5 mg administered twice daily has been shown in randomized and open-label studies to successfully manage over 90% of patients with pleural infection without surgery. Potential bleeding risks associated with intrapleural tPA and its costs remain important concerns. The aim of the ongoing Alteplase Dose Assessment for Pleural infection Therapy (ADAPT) project is to investigate the efficacy and safety of dose de-escalation for intrapleural tPA. The first of several planned studies is presented here. To evaluate the efficacy and safety of a reduced starting dose regimen of 5 mg of tPA with 5 mg of DNase administered intrapleurally for pleural infection. Consecutive patients with pleural infection at four participating centers in Australia, the United Kingdom, and New Zealand were included in this observational, open-label study. Treatment was initiated with tPA 5 mg and DNase 5 mg twice daily. Subsequent dose escalation was permitted at the discretion of the attending physician. Data relating to treatment success, radiological and systemic inflammatory changes (blood C-reactive protein), volume of fluid drained, length of hospital stay, and treatment complications were extracted retrospectively from the medical records. We evaluated 61 patients (41 males; age, 57 ± 16 yr). Most patients (n = 58 [93.4%]) were successfully treated without requiring surgery for pleural infection. Treatment success was corroborated by clearance of pleural opacities visualized by chest radiography (from 42% [interquartile range, 22-58] to 16% [8-31] of hemithorax; P < 0.001), increase in pleural fluid drainage (from 175 ml in the 24 h preceding treatment to 2,025 ml [interquartile range, 1,247-2,984] over 72 h of therapy; P < 0.05) and a reduction in blood C-reactive protein (P < 0.05). Seven patients (11.5%) had dose escalation of tPA to 10 mg. Three patients underwent
International Nuclear Information System (INIS)
Mitsumori, Lee M.; Shuman, William P.; Busey, Janet M.; Kolokythas, Orpheus; Koprowicz, Kent M.
2012-01-01
To compare routine dose liver CT reconstructed with filtered back projection (FBP) versus low dose images reconstructed with FBP and adaptive statistical iterative reconstruction (ASIR). In this retrospective study, patients had a routine dose protocol reconstructed with FBP and, again within 17 months (median 6.1 months), a low dose protocol reconstructed twice, with FBP and ASIR. These reconstructions were compared for noise, image quality, and radiation dose. Nineteen patients were included (12 male, mean age 58). Noise was significantly lower in low dose images reconstructed with ASIR compared to routine dose images reconstructed with FBP (liver: p < 0.05, aorta: p < 0.001). Low dose FBP images were scored significantly lower for subjective image quality than low dose ASIR images (2.1 ± 0.5 versus 3.2 ± 0.8, p < 0.001). There was no difference in subjective image quality scores between routine dose FBP images and low dose ASIR images (3.6 ± 0.5 versus 3.2 ± 0.8, NS). Radiation dose was 41% less for the low dose protocol (4.4 ± 2.4 mSv versus 7.5 ± 5.5 mSv, p < 0.05). Our initial results suggest low dose CT images reconstructed with ASIR may have lower measured noise, similar image quality, yet significantly less radiation dose compared with higher dose images reconstructed with FBP. (orig.)
International Nuclear Information System (INIS)
Catlin, R.J.; Goldman, M.; Anspaugh, L.R.
1987-01-01
Best estimates of possible additional health effects were projected for the Northern Hemisphere: (1) over the next 50 years, up to 28 thousand radiation-induced fatal cancers, compared to an expected 600 million cancer deaths from natural or spontaneous causes; (2) over the next year, up to 700 additional cases of severe mental retardation, compared to a normal expectation of 340 thousand cases; and (3) in the first generation, up to 1.9 thousand radiation-induced genetic disorders, compared to 180 million naturally-occurring cases. The possibility of zero health effects at very low doses and dose rates cannot be excluded. Due to the very large numbers of naturally-occurring health effects, it is unlikely that any additional health effects will be demonstrable except, perhaps, for the more highly exposed population in the immediate vicinity of Chernobyl. 13 refs., 4 figs., 6 tabs
International Nuclear Information System (INIS)
Till, J.E.
1995-01-01
This article describes the process by which the author came to recognize the importance of openness to the public in environmental studies during the Hanford Environmental Dose Reconstruction Project. Drawing on that public-involvement experience, the article then offers a general guide to constructing a new, positive framework for conducting future public studies. The steps include: putting the public in the study; building credibility into a public study (1. search for proof in historical records; 2. define the domain and the exposed population; 3. characterize the material released; 4. identify key materials, pathways and receptors; 5. encourage public participation; 6. explain the meaning of the results); and reconciling scientific and public issues
Conceptual design review report for K Basin Dose Reduction Project clean and coat task
International Nuclear Information System (INIS)
Blackburn, L.D.
1996-01-01
The strategy for reducing radiation dose originating from radionuclides absorbed in the concrete is to raise the pool water level to provide additional shielding. The concrete walls need to be coated to prevent future radionuclide absorption into the walls. This report documents a conceptual design review of equipment to clean and coat basin walls. The review concluded that the proposed concepts were an acceptable basis for proceeding with detailed final design
Radiation Dose Assessment for Military Personnel of the Enewetak Atoll Cleanup Project (1977-1980)
2018-04-13
… effects. This conclusion is supported by the Health Physics Society's position statement regarding radiation health risks: Substantial and … support from DoD's Dose Assessment and Recording Working Group (DARWG) and professional health physics experts of the military services who are ECUP … sample. At the time ECUP was underway, a trigger level was established based on the proposal of the American Health Physics Society Plutonium …
Final design review report for K Basin Dose Reduction Project Clean and Coat Task
International Nuclear Information System (INIS)
Blackburn, L.D.
1996-02-01
The strategy for reducing radiation dose originating from radionuclides absorbed in the concrete is to raise the pool water level to provide additional shielding. The concrete walls need to be coated to prevent future radionuclide absorption into the walls. This report documents a final design review of equipment to clean and coat basin walls. The review concluded that the design presented was acceptable for release for fabrication
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probabilistic reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Directory of Open Access Journals (Sweden)
Ta-Chuan Hung
2009-12-01
Conclusion: Both criteria demonstrate that MS is highly prevalent in elderly hypertensive patients in Taiwan. Additionally in women, but not men, the predicted probability of stroke is higher in MS than in non-MS patients. The diagnosis of MS is potentially useful for identifying elderly hypertensive females with an elevated risk of stroke in Taiwan.
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Occupational dose for the water cooling system of the SEAFP project
International Nuclear Information System (INIS)
Sandri, S.; Di Pace, L.
1996-01-01
The Occupational Radiation Exposure (ORE) for the water primary cooling system (PCS) of the SEAFP was assessed, taking into account the first wall/blanket section only. All potential radiological sources were considered, and the analysis was restricted to the most important source in the PCS, the activated corrosion products (ACP). The relevant dose rate was evaluated using the computer code MCNP. Comparison of the results with corresponding values measured at fission PWR plants made it possible to transfer the parameters relevant to the working activities to the SEAFP PCS. Maintenance and inspection were found to be the only working tasks applicable to the SEAFP circuit, and worker access was assumed to be allowed only 24 h after plant shutdown. 12 refs., 11 tabs
Energy Technology Data Exchange (ETDEWEB)
Mart, E.I.; Denham, D.H.; Thiede, M.E.
1993-12-01
This report is a result of the Hanford Environmental Dose Reconstruction (HEDR) Project whose goal is to estimate the radiation dose that individuals could have received from emissions since 1944 at the U.S. Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project is conducted by Battelle, Pacific Northwest Laboratories (BNW). One of the radionuclides emitted that would affect the radiation dose was iodine-131. This report describes in detail the reconstructed conversion and correction factors for historical measurements of iodine-131 in Hanford-area vegetation collected from the beginning of October 1945 through the end of December 1947.
International Nuclear Information System (INIS)
2001-07-01
The primary purpose of this CRP was to provide a co-ordinated international effort to assemble and evaluate relevant data using sound technical judgement concerning the effects that fires, explosions or breaches of hulls of ships might have on the integrity of radioactive material packages. The probability and expected consequences of such events could thereby be assessed. If it were shown that the proportion of maritime accidents with severity in excess of the IAEA regulatory requirements was expected to be higher than that for land transport, then pertinent proposals could be submitted to the forthcoming Revision Panels to amend the IAEA Regulations for Safe Transport of Radioactive Material and their supporting documents. Four main areas of research were included in the CRP. These consisted of studying the probability of ship accidents; fire; collision; and radiological consequences
International Nuclear Information System (INIS)
Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.
1984-11-01
The present study implemented a probability encoding method to estimate the probability distributions of selected hydrologic variables for the Cohassett basalt flow top and flow interior, and the anisotropy ratio of the interior of the Cohassett basalt flow beneath the Hanford Site. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. However, this information is required to complete preliminary performance assessment studies. Rockwell chose a probability encoding method developed by SRI International to generate credible and auditable estimates of the probability distributions of effective porosity and hydraulic conductivity anisotropy. The results indicate significant differences of opinion among the experts. This was especially true of the values of the effective porosity of the Cohassett basalt flow interior, for which estimates differ by more than five orders of magnitude. The experts are in greater agreement about the values of effective porosity of the Cohassett basalt flow top; their estimates for this variable are generally within one to two orders of magnitude of each other. For anisotropy ratio, the expert estimates are generally within two or three orders of magnitude of each other. Based on this study, the Rockwell hydrologists estimate the effective porosity of the Cohassett basalt flow top to be generally higher than do the independent experts. For the effective porosity of the Cohassett basalt flow top, the estimates of the Rockwell hydrologists indicate a smaller uncertainty than do the estimates of the independent experts. On the other hand, for the effective porosity and anisotropy ratio of the Cohassett basalt flow interior, the estimates of the Rockwell hydrologists indicate a larger uncertainty than do the estimates of the independent experts
International Nuclear Information System (INIS)
Mazleha Maskin; Tom, P.P.; Ahmad Hassan Sallehudin Mohd Sarif; Faizal Mohamed; Mohd Fazli Zakaria; Muhamad Puad Abu
2014-01-01
This article reports the lessons learnt from the development of the Level 1 probabilistic safety assessment (PSA) project implemented under the IAEA mentoring programme for the TRIGA MARK II PUSPATI research reactor (RTP). For a project involving more than three organizations, strategic planning of the management and implementation of individual assignments was a demanding task. This report compiles all related activities, from the forming of the Malaysian PSA team up to the final report submitted to the IAEA. (author)
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
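The post-processing step described above reduces to a per-cell frequency count over the stack of equally likely realizations. A minimal sketch with synthetic data follows; the field values and clean-up threshold are illustrative stand-ins, not the Fernald measurements.

```python
# Exceedance-probability map from equally likely geostatistical realizations.
# Synthetic stand-in data; values and threshold are illustrative only.
import numpy as np

def exceedance_probability(realizations, threshold):
    """Per-cell fraction of realizations whose value exceeds `threshold`."""
    return (realizations > threshold).mean(axis=0)

rng = np.random.default_rng(42)
# 200 equally likely 50x50 simulated concentration maps (lognormal, arbitrary units)
sims = rng.lognormal(mean=3.0, sigma=1.0, size=(200, 50, 50))
p_map = exceedance_probability(sims, threshold=35.0)  # assumed clean-up threshold
hot_cells = np.argwhere(p_map > 0.5)                  # cells likely needing remediation
```

Because each realization honors the measured data, `p_map` directly supports the risk-adjusted decision models mentioned in the abstract (e.g. remediate a parcel only where the exceedance probability crosses a cost-derived cutoff).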
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Mock-up experiments for the project of high dose irradiation on the RPV concrete
International Nuclear Information System (INIS)
Zdarek, J.; Brabec, P.; Frybort, O.; Lahodova, Z.; Vit, J.; Stemberk, P.
2015-01-01
Aging of NPP concrete structures is attracting growing interest in connection with life extension programmes for operated units. Securing continued safe operation of NPPs calls for additional proof of suitable long-term behaviour of loaded reinforced concrete structures. An irradiation test of concrete samples was performed in the core of the LVR-15 reactor. The irradiation capsule was hung in the irradiation channel, and cooling of the capsule was ensured through direct contact of the capsule wall with the primary circuit water. Cylindrical serpentine-concrete samples (50 mm in diameter and 100 mm in length), representing the composition of the WWER RPV cavity, were chosen as a compromise between mechanical-property testing needs and the dimensional limitations of the reactor irradiation channel. Heating during the irradiation test was kept below 93 Celsius degrees by cooling and was controlled by an embedded thermocouple. Design of the cooling management was supported by computational analysis. The dependence of the heated concrete samples on the neutron fluence and the gamma heating was obtained by changing the thermal power of the reactor and by changing the vertical position of the sample in the irradiation channel. The irradiation capsule was filled with inert gas (helium) to allow measurement of the generated gas. The determination of concrete sample activity for long-term irradiation was performed on the principles of Neutron Activation Analysis. Preliminary mock-up tests have proved the ability to fulfil the technical needs of the planned high dose irradiation experiment
International Nuclear Information System (INIS)
Filipy, R.E.; Borst, F.J.; Cross, F.T.; Park, J.F.; Moss, O.R.
1980-06-01
The report presents a mathematical model for the purpose of predicting the fraction of human population which would die within 1 year of an accidental exposure to airborne radionuclides. The model is based on data from laboratory experiments with rats, dogs and baboons, and from human epidemiological data. Doses from external, whole-body irradiation and from inhaled, alpha- and beta-emitting radionuclides are calculated for several organs. The probabilities of death from radiation pneumonitis and from bone marrow irradiation are predicted from doses accumulated within 30 days of exposure to the radioactive aerosol. The model is compared with existing similar models under hypothetical exposure conditions. Suggestions for further experiments with inhaled radionuclides are included
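A model of this shape needs a step that combines the organ-specific death probabilities into one overall probability. A common way to do this, shown purely as a hedged illustration, is 1 − Π(1 − p_k) under an independence assumption between the two causes of death. The logistic dose-response curves and every parameter value below are placeholders, not the report's fitted values from the rat, dog and baboon data.

```python
# Hedged sketch: combining per-cause death probabilities into an overall
# 1-year death probability, assuming independent competing risks.
# All dose-response parameters are illustrative placeholders.
import math

def logistic(dose_gy, d50, slope):
    """Illustrative sigmoid dose-response (placeholder parameters)."""
    return 1.0 / (1.0 + math.exp(-slope * (dose_gy - d50)))

def p_death(lung_dose_gy, marrow_dose_gy):
    """Overall probability of death from either radiation pneumonitis or
    bone-marrow failure, combined as independent competing risks."""
    p_pneumonitis = logistic(lung_dose_gy, d50=50.0, slope=0.2)   # assumed curve
    p_marrow = logistic(marrow_dose_gy, d50=4.0, slope=1.5)       # assumed curve
    return 1.0 - (1.0 - p_pneumonitis) * (1.0 - p_marrow)
```

With curves like these, the population fraction dying within a year would be obtained by integrating `p_death` over the distribution of organ doses produced by the exposure scenario.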
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
Allen, George E.; Brown, Simon G. A.; Buckley, Nicholas A.; O’Leary, Margaret A.; Page, Colin B.; Currie, Bart J.; White, Julian; Isbister, Geoffrey K.
2012-01-01
Background: Snakebite is a global health issue and treatment with antivenom continues to be problematic. Brown snakes (genus Pseudonaja) are the most medically important group of Australian snakes and there is controversy over the dose of brown snake antivenom. We aimed to investigate the clinical and laboratory features of definite brown snake (Pseudonaja spp.) envenoming, and determine the dose of antivenom required. Methods and Findings: This was a prospective observational study of definite brown snake envenoming from the Australian Snakebite Project (ASP) based on snake identification or specific enzyme immunoassay for Pseudonaja venom. From January 2004 to January 2012 there were 149 definite brown snake bites [median age 42y (2–81y); 100 males]. Systemic envenoming occurred in 136 (88%) cases. All envenomed patients developed venom induced consumption coagulopathy (VICC), with complete VICC in 109 (80%) and partial VICC in 27 (20%). Systemic symptoms occurred in 61 (45%) and mild neurotoxicity in 2 (1%). Myotoxicity did not occur. Severe envenoming occurred in 51 patients (38%) and was characterised by collapse or hypotension (37), thrombotic microangiopathy (15), major haemorrhage (5), cardiac arrest (7) and death (6). The median peak venom concentration in 118 envenomed patients was 1.6 ng/mL (Range: 0.15–210 ng/mL). The median initial antivenom dose was 2 vials (Range: 1–40) in 128 patients receiving antivenom. There was no difference in INR recovery or clinical outcome between patients receiving one or more than one vial of antivenom. Free venom was not detected in 112/115 patients post-antivenom, with only low concentrations (0.4 to 0.9 ng/ml) in three patients. Conclusions: Envenoming by brown snakes causes VICC, and over a third of patients had serious complications including major haemorrhage, collapse and microangiopathy. The results of this study support accumulating evidence that giving more than one vial of antivenom is unnecessary in brown snake
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Olaciregui-Ruiz, Igor; Rozendaal, Roel; van Oers, René F M; Mijnheer, Ben; Mans, Anton
2017-05-01
At our institute, a transit back-projection algorithm is used clinically to reconstruct in vivo patient and in-phantom 3D dose distributions from EPID measurements behind a patient or a polystyrene slab phantom, respectively. In this study, an extension to this algorithm is presented whereby in-air EPID measurements are used in combination with CT data to reconstruct 'virtual' 3D dose distributions. By combining virtual and in vivo patient verification data for the same treatment, patient-related errors can be separated from machine, planning and model errors. The virtual back-projection algorithm is described and verified against the transit algorithm with measurements made behind a slab phantom, against dose measurements made with an ionization chamber and with the OCTAVIUS 4D system, as well as against TPS patient data. Virtual and in vivo patient dose verification results are also compared. Virtual dose reconstructions agree within 1% with ionization chamber measurements. The average γ-pass rates (3% global dose/3 mm) in the 3D dose comparison with the OCTAVIUS 4D system and the TPS patient data are 98.5 ± 1.9% (1 SD) and 97.1 ± 2.9% (1 SD), respectively. For virtual patient dose reconstructions, the differences with the TPS in median dose to the PTV remain within 4%. Virtual patient dose reconstruction makes pre-treatment verification based on deviations of DVH parameters feasible and eliminates the need for phantom positioning and re-planning. Virtual patient dose reconstructions have additional value in the inspection of in vivo deviations, particularly in situations where CBCT data are not available (or not conclusive).
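The γ pass-rate criterion quoted above (3% global dose, 3 mm) can be illustrated with a brute-force 1D computation. This is a generic sketch of the standard γ index on synthetic profiles, not the authors' implementation.

```python
# Brute-force global gamma index on a 1D dose profile (3% / 3 mm criterion).
# Illustrative only; clinical tools work on 3D grids with interpolation.
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Fraction of reference points with gamma <= 1 (global dose normalization)."""
    x = np.arange(len(ref)) * spacing_mm
    d_norm = dose_tol * ref.max()          # global normalization to max reference dose
    gammas = []
    for xi, di in zip(x, ref):
        dd = (ev - di) / d_norm            # dose-difference term at every evaluated point
        dx = (x - xi) / dta_mm             # distance-to-agreement term
        gammas.append(np.min(np.hypot(dd, dx)))
    return float(np.mean(np.asarray(gammas) <= 1.0))

ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2) * 2.0  # synthetic Gaussian profile (Gy)
ev = ref * 1.01                                           # evaluated dose with a +1% error
rate = gamma_pass_rate(ref, ev, spacing_mm=1.0)           # small error: full pass
```

A uniform 1% dose error passes everywhere under a 3% global criterion, while a 10% scaling error fails near the profile peak, which is the behaviour the pass-rate statistics in the abstract summarize.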
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
International Nuclear Information System (INIS)
Fullwood, R.R.
1989-04-01
The Advanced Neutron Source (ANS) (Difilippo, 1986; Gamble, 1986; West, 1986; Selby, 1987) will be the world's best facility for low energy neutron research. This performance requires the highest flux density of all non-pulsed reactors, with concomitant low thermal inertia and fast response to upset conditions. One of the primary concerns is that a flow cessation on the order of a second may result in fuel damage. Such a flow stoppage could be the result of a break in the primary piping. This report is a review of methods for assessing pipe break probabilities based on historical operating experience in power reactors, scaling methods, fracture mechanics and fracture growth models. The goal of this work is to develop parametric guidance for the ANS design to make the event highly unlikely. It is also to review and select methods that may be used in an interactive IBM-PC model, providing fast and reasonably accurate models to aid the ANS designers in achieving the safety requirements. 80 refs., 7 figs
Energy Technology Data Exchange (ETDEWEB)
Walters, W.H.; Dirkes, R.L.; Napier, B.A.
1992-11-01
As part of the Hanford Environmental Dose Reconstruction (HEDR) Project, Battelle, Pacific Northwest Laboratories reviewed literature and data on radionuclide concentrations and distribution in the water, sediment, and biota of the Columbia River and adjacent coastal areas. Over 600 documents were reviewed including Hanford reports, reports by offsite agencies, journal articles, and graduate theses. Radionuclide concentration data were used in preliminary estimates of individual dose for the period 1964 through 1966. This report summarizes the literature and database reviews and the results of the preliminary dose estimates.
On probability-possibility transformations
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
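One common member of the family of transformations being compared is the ratio-scale transformation, which maps each probability to its ratio with the largest one; its usual inverse renormalises the possibility profile. The sketch below assumes this particular choice and is an illustration, not code from the paper:

```python
def prob_to_poss(p):
    """Ratio-scale probability-to-possibility transformation."""
    m = max(p)
    return [pi / m for pi in p]

def poss_to_prob(r):
    """Inverse: renormalise the possibility profile to sum to one."""
    s = sum(r)
    return [ri / s for ri in r]

p = [0.5, 0.3, 0.2]
r = prob_to_poss(p)      # most probable outcome gets possibility 1.0
print(r)
print(poss_to_prob(r))   # renormalised back to a probability distribution
```

Comparisons like those in the study then ask how well properties such as noninteraction and projections of joint distributions survive a round trip through such a transformation.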
International Nuclear Information System (INIS)
Ciraj-Bjelac, Olivera; Avramova-Cholakova, Simona; Beganovic, Adnan; Economides, Sotirios; Faj, Dario; Gershan, Vesna; Grupetta, Edward; Kharita, M.H.; Milakovic, Milomir; Milu, Constantin; Muhogora, Wilbroad E.; Muthuvelu, Pirunthavany; Oola, Samuel; Setayeshi, Saeid
2012-01-01
Purpose: The objective is to study mammography practice from an optimisation point of view by assessing the impact of simple and immediately implementable corrective actions on image quality. Materials and methods: This prospective multinational study included 54 mammography units in 17 countries. More than 21,000 mammography images were evaluated using a three-level image quality scoring system. Following initial assessment, appropriate corrective actions were implemented and image quality was re-assessed in 24 units. Results: The fraction of images that were considered acceptable without any remark in the first phase (before the implementation of corrective actions) was 70% and 75% for cranio-caudal and medio-lateral oblique projections, respectively. The main causes of poor image quality before corrective actions were film processing, damaged or scratched image receptors, film-screen combinations that were not spectrally matched, inappropriate radiographic techniques, and lack of training. The mean average glandular dose to a standard breast was 1.5 mGy (range 0.59–3.2 mGy). After optimisation the frequency of poor-quality images decreased, but the relative contributions of the various causes remained similar. Image quality improvements following appropriate corrective actions were up to 50 percentage points in some facilities. Conclusions: Poor image quality is a major source of unnecessary radiation dose to the breast. An increased awareness of what constitutes a good-quality mammogram is of particular importance for countries that are moving towards the introduction of population-based screening programmes. The study demonstrated how simple and low-cost measures can be a valuable tool in improving image quality in mammography.
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Buck, David; Subramanyam, Rajeev; Varughese, Anna
2016-01-01
The use of a single-dose vial across multiple patients presents a risk to sterility and is against CDC guidelines. We initiated a quality improvement (QI) project to reduce the intraoperative use of single-dose vials of fentanyl across multiple patients at Cincinnati Children's Hospital Medical Center (CCHMC). The initial step of the improvement project was the development of a Key Driver Diagram. The diagram has the SMART aim of the project, key drivers inherent to the process we are trying to improve, and specific interventions targeting the key drivers. The number of patients each week receiving an IV dose of fentanyl from a vial previously accessed for another patient was tracked in a high-turnover operating room (OR). The improvement model used was based on the concept of building Plan-Do-Study-Act (PDSA) cycles. Tests of change included provider education, provision of an increased number of fentanyl vials, alternate wasting processes, and provision of single-use fentanyl syringes by the pharmacy. Prior to initiation of this project, it was common for a single fentanyl vial to be accessed for multiple patients. Our data showed an average percentage of failures of just over 50%. By the end of the project, after 7 months, the mean percentage of failures had dropped to 5%. Preparation of 20 mcg single-use fentanyl syringes by pharmacy, combined with education of providers on appropriate use, was successful in reducing failures to below our goal of 25%. Appropriately sized fentanyl syringes prepared by pharmacy, education on correct use of single-dose vials, and reminders in the OR reduced the percentage of patients receiving a dose of fentanyl from a vial previously accessed for another patient in a high-volume otolaryngology room. © 2015 John Wiley & Sons Ltd.
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Energy Technology Data Exchange (ETDEWEB)
Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.
2009-10-23
This is the concluding Progress Report for Project 1.1 of the U.S./Russia Joint Coordinating Committee on Radiation Effects Research (JCCRER). The overwhelming majority of our work in this period went to completing our primary obligation of providing a new version of the Techa River Dosimetry System (TRDS), which we call TRDS-2009D; the D denotes deterministic. This system provides estimates of individual doses to members of the Extended Techa River Cohort (ETRC) and post-natal doses to members of the Techa River Offspring Cohort (TROC), the latter calculated using TRDS-2009D. The doses for the members of the ETRC were made available to the American and Russian epidemiologists in September for their studies deriving radiogenic risk factors. Doses for members of the TROC are being provided to European and Russian epidemiologists as partial input for studies of risk in this population. Two of our original goals for this nine-year phase of Project 1.1 were not completed. The first was TRDS-2009MC, a Monte Carlo version of TRDS-2009 intended for more explicit analysis of the impact of uncertainty in doses on uncertainty in radiogenic risk factors. The second was the provision of household-specific external doses (rather than village averages); this task was far along, but had to be delayed owing to the lead investigator's work on a revised source term.
International Nuclear Information System (INIS)
Zhang, H; Kong, V; Jin, J; Ren, L
2014-01-01
Purpose: Synchronized moving grid (SMOG) is a promising technique to reduce scatter and ghost artifacts in cone beam computed tomography (CBCT). However, it requires 2 projections at the same gantry angle to obtain full information, due to signal blockage by the grid. We proposed an inter-projection interpolation (IPI) method to estimate blocked signals, which may reduce the scan time and the dose. This study aims to provide a framework to achieve a balance between speed, dose and image quality. Methods: The IPI method is based on the hypothesis that an abrupt signal in a projection can be well predicted by the information in the two immediate neighboring projections if the gantry angle step is small. The study was performed on a Catphan and a head phantom. The SMOG was simulated by erasing the information (filling with "0") in the areas of each projection corresponding to the grid. An IPI algorithm was applied to each projection to recover the erased information. The FDK algorithm was used to reconstruct CBCT images from the IPI-processed projections, and the results were compared with the original image in terms of signal-to-noise ratio (SNR) measured over the whole reconstructed image range. The effect of gantry angle step was investigated by comparing the CBCT images from projection sets of various gantry intervals, with IPI-predicted projections filling the missing projections in each interval. Results: The IPI processing time was 1.79 s ± 0.53 s per projection. SNR after IPI was 29.0 dB and 28.1 dB for the Catphan and head phantom, respectively, compared to 15.3 dB and 22.7 dB for an inpainting-based interpolation technique. SNR was 28.3, 28.3, 21.8, 19.3 and 17.3 dB for gantry angle intervals of 1, 1.5, 2, 2.5 and 3 degrees, respectively. Conclusion: IPI is feasible for estimating the missing information and achieves reasonable CBCT image quality with reduced dose and scan time. This study is supported by NIH/NCI grant 1R01CA166948-01
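The core hypothesis, that signal blocked by the grid can be predicted from the two immediate neighbouring projections, can be illustrated with a plain pixelwise average. The published IPI algorithm is more elaborate than this; the arrays below are synthetic stand-ins for projections at three neighbouring gantry angles:

```python
import numpy as np

def ipi_fill(current, prev_p, next_p, blocked):
    """Fill grid-blocked pixels from the two neighbouring projections.

    A simple pixelwise average stands in for the full IPI algorithm;
    `blocked` is a boolean mask of the grid shadows.
    """
    out = current.copy()
    out[blocked] = 0.5 * (prev_p[blocked] + next_p[blocked])
    return out

# synthetic projections at gantry angles theta - d, theta, theta + d
prev_p = np.full((4, 4), 1.0)
cur    = np.full((4, 4), 2.0)
next_p = np.full((4, 4), 3.0)
blocked = np.zeros((4, 4), dtype=bool)
blocked[1:3, :] = True                 # rows shadowed by the grid
lost = cur.copy()
lost[blocked] = 0.0                    # signal erased behind the grid, as simulated
restored = ipi_fill(lost, prev_p, next_p, blocked)
print(np.allclose(restored, cur))      # average of 1 and 3 recovers 2
```

The smaller the gantry angle step, the better such a smoothness assumption holds, which is consistent with the reported SNR falling off as the interval grows from 1 to 3 degrees.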
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
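A standard kinetic-theory starting point for such collision assessments treats crossings of the satellite's swept area as a Poisson process, giving P = 1 − exp(−flux · area · time). The sketch below uses this generic model with made-up numbers; it is not the parametric model of the report:

```python
import math

def collision_probability(flux_per_m2_yr, area_m2, years):
    """Collision probability for a Poisson flux of crossing objects.

    flux_per_m2_yr: object flux through the orbit region (1/m^2/yr),
    area_m2: collision cross-section, years: mission duration.
    """
    return 1.0 - math.exp(-flux_per_m2_yr * area_m2 * years)

# hypothetical station: 1000 m^2 cross-section, 10-year mission,
# object flux of 1e-9 per m^2 per year in its orbital shell
p = collision_probability(1e-9, 1000.0, 10.0)
print(p)   # ~1e-5: for small expected counts, P ~ flux * area * time
```

Parameterizing over miss distance and mission duration, as the report does, amounts to varying `area_m2` and `years` in this expression while the flux term carries the orbit-population dependence.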
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended…
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
International Nuclear Information System (INIS)
Choi, Noah C.; Fischman, Alan J.; Niemierko, Andrzej; Ryu, Jin-Sook; Lynch, Thomas; Wain, John; Wright, Cameron; Fidias, Panos; Mathisen, Douglas
2002-01-01
Purpose: To determine the dose-response relationship between the probability of tumor control on the basis of pathologic tumor response (pTCP) and the residual metabolic rate of glucose (MRglc) in response to preoperative chemoradiotherapy in locally advanced non-small-cell lung cancer and to define the level of residual MRglc that corresponds to pTCP 50% and pTCP ≥95%. Methods and Materials: Quantitative dynamic 18F-2-fluoro-2-deoxy-D-glucose (18F-FDG) positron emission tomography was performed to measure regional MRglc at the primary lesion before and 2 weeks after preoperative chemoradiotherapy in an initial group of 13 patients with locally advanced NSCLC. A simplified kinetic method was developed subsequently from the initial dynamic study and used in the subsequent 16 patients. The preoperative radiotherapy programs consisted of (1) a split course of 42 Gy in 28 fractions within a period of 28 days using a twice-daily treatment schedule for Stage IIIA(N2) NSCLC (n=18) and (2) standard once-daily radiation schedule of 45-63 Gy in 25-35 fractions during a 5-7-week period (n=11). The preoperative chemotherapy regimens included two cycles of cisplatin, vinblastine, and 5-fluorouracil (n=24), cisplatin and etoposide (n=2), and cisplatin, Taxol, and 5-fluorouracil (n=3). Patients free of tumor progression after preoperative chemoradiotherapy underwent surgery. The degree of residual MRglc measured 2 weeks after preoperative chemoradiotherapy and 2 weeks before surgery was correlated with the pathologic tumor response. The relationship between MRglc and pTCP was modeled using logistic regression. Results: Of 32 patients entered into the study, 29 (16 men and 13 women; 30 lesions) were evaluated for the correlation between residual MRglc and pathologic tumor response. Three patients did not participate in the second study because of a steady decline in general condition. The median age was 60 years (range 42-78). One of the 29 patients had two separate lesions, and
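The logistic dose-response relationship between residual MRglc and pTCP described above can be sketched with a standard logistic function in which lower residual metabolism corresponds to higher control probability. The parameter values below (m50, k) are hypothetical placeholders for illustration, not the fitted values from the study:

```python
import math

def ptcp(mrglc, m50, k):
    """Logistic dose-response: probability of pathologic tumor control
    as a function of residual MRglc (lower metabolism -> higher pTCP).
    m50 is the residual MRglc giving pTCP = 50%; k sets the slope.
    """
    return 1.0 / (1.0 + math.exp((mrglc - m50) / k))

# hypothetical parameters; MRglc units arbitrary for this sketch
m50, k = 0.10, 0.02
print(ptcp(m50, m50, k))             # 0.5 by construction
# residual MRglc threshold for pTCP >= 95%: m50 - k*ln(19)
m95 = m50 - k * math.log(19.0)
print(ptcp(m95, m50, k))             # 0.95
```

Logistic regression, as used in the study, amounts to estimating m50 and k from the paired (residual MRglc, pathologic response) data; the pTCP 50% and ≥95% levels then fall out of the fitted curve as above.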
Energy Technology Data Exchange (ETDEWEB)
Nielsen, Sven P.; Andersson, Kasper; Thoerring, H. (and others)
2006-04-15
Considerable variations in activity concentrations of {sup 137}Cs and {sup 90}Sr in milk were observed between countries or regions due to precipitation patterns, soil types and inhomogeneity of Chernobyl fallout. Time trends indicate that factors influencing ecological half-lives for {sup 90}Sr are not the same as for {sup 137}Cs in the pasture-milk system. Internal doses to Faroese people derive mainly from dairy products, lamb and potatoes. The largest doses were received from nuclear weapons fallout in the early 1960's. {sup 137}Cs causes higher doses than {sup 90}Sr, and the regional variability is larger for {sup 137}Cs than for {sup 90}Sr. {sup 137}Cs deposition maps were made of Sweden. Values of {sup 137}Cs deposition and precipitation were used in the calculations of Nuclear Weapons Fallout (NWF). The deposition of {sup 137}Cs from the Chernobyl accident was calculated for western Sweden. The lowest levels of NWF {sup 137}Cs deposition density were noted in north-eastern and eastern Sweden and the highest levels in the western parts. The Chernobyl {sup 137}Cs deposition is highest along the coast and lowest in the south-eastern part and along the middle. The calculated deposition from NWF and Chernobyl in western Sweden was compared to observed deposition and showed good agreement. Ecological half-times of {sup 137}Cs in perch in Finnish lakes vary by a factor of three. The longest half-time of {sup 137}Cs in perch was 9 y and the shortest 3 y. Norwegian lakes differ from each other with respect to the rates of decrease of {sup 137}Cs in fish. Ecological half-times of {sup 137}Cs in trout and Arctic char varied from 1 to 5 y. A more rapid reduction of {sup 137}Cs in fish is found in certain Norwegian lakes compared to Finnish lakes. In two Norwegian lakes the {sup 137}Cs concentrations in trout have remained at about 100 Bq/kg since 1990. The European decision support systems, ARGOS and RODOS, include foodchain modules with default parameters derived from southern Germany. Many
International Nuclear Information System (INIS)
Nielsen, Sven P.; Andersson, Kasper; Thoerring, H.
2006-04-01
Considerable variations in activity concentrations of 137Cs and 90Sr in milk were observed between countries or regions due to precipitation patterns, soil types and inhomogeneity of Chernobyl fallout. Time trends indicate that factors influencing ecological half-lives for 90Sr are not the same as for 137Cs in the pasture-milk system. Internal doses to Faroese people derive mainly from dairy products, lamb and potatoes. The largest doses were received from nuclear weapons fallout in the early 1960's. 137Cs causes higher doses than 90Sr, and the regional variability is larger for 137Cs than for 90Sr. 137Cs deposition maps were made of Sweden. Values of 137Cs deposition and precipitation were used in the calculations of Nuclear Weapons Fallout (NWF). The deposition of 137Cs from the Chernobyl accident was calculated for western Sweden. The lowest levels of NWF 137Cs deposition density were noted in north-eastern and eastern Sweden and the highest levels in the western parts. The Chernobyl 137Cs deposition is highest along the coast and lowest in the south-eastern part and along the middle. The calculated deposition from NWF and Chernobyl in western Sweden was compared to observed deposition and showed good agreement. Ecological half-times of 137Cs in perch in Finnish lakes vary by a factor of three. The longest half-time of 137Cs in perch was 9 y and the shortest 3 y. Norwegian lakes differ from each other with respect to the rates of decrease of 137Cs in fish. Ecological half-times of 137Cs in trout and Arctic char varied from 1 to 5 y. A more rapid reduction of 137Cs in fish is found in certain Norwegian lakes compared to Finnish lakes. In two Norwegian lakes the 137Cs concentrations in trout have remained at about 100 Bq/kg since 1990. The European decision support systems, ARGOS and RODOS, include foodchain modules with default parameters derived from southern Germany. Many parameters describing foodchain transfer are subject to considerable variation
Callahan, Michael J; Kleinman, Patricia L; Strauss, Keith J; Bandos, Andriy; Taylor, George A; Tsai, Andy; Kleinman, Paul K
2015-01-01
The purpose of this study was to develop a departmental practice quality improvement project to systematically reduce CT doses for the evaluation of suspected pediatric appendicitis by introducing computer-generated Gaussian noise. Two hundred MDCT abdominopelvic examinations of patients younger than 20 years performed with girth-based scanning parameters for suspected appendicitis were reviewed. Two judges selected 45 examinations in which the diagnosis of appendicitis was excluded (14, appendix not visualized; 31, normal appendix visualized). Gaussian noise was introduced into axial image series, creating five additional series simulating acquisition at 25-76% of the original dose. Two readers reviewed 270 image series for appendix visualization (4-point Likert scale and arrow localization). Volume CT dose index (CTDIvol) and size-specific dose estimate (SSDE) were calculated by use of patient girth. Confidence ratings and localization accuracy were analyzed with mixed models and nonparametric bootstrap analysis at a 0.05 significance level. The mean baseline SSDE for the 45 patients was 16 mGy (95% CI, 12-20 mGy), and the corresponding CTDIvol was 10 mGy (95% CI, 4-16 mGy). Changes in correct appendix localization frequencies were minor. There was no substantial trend with decreasing simulated dose level (p = 0.46). Confidence ratings decreased with increasing dose reduction (p = 0.007). The average decreases were -0.27 for the 25% simulated dose (p = 0.01), -0.17 for 33% (p = 0.03), and -0.03 for 43% (p = 0.65). Pediatric abdominal MDCT can be performed with 43% of the original dose (SSDE, 7 mGy; CTDIvol, 4.3 mGy) without substantially affecting visualization of a normal appendix.
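The noise-insertion idea rests on quantum noise in CT scaling roughly as 1/√dose: to simulate a scan at dose fraction f from a full-dose image with noise σ_full, one adds zero-mean Gaussian noise in quadrature up to σ_full/√f. The sketch below works under that simplifying assumption and is not the authors' validated noise-insertion pipeline:

```python
import numpy as np

def simulate_reduced_dose(img, sigma_full, dose_fraction, rng=None):
    """Add zero-mean Gaussian noise so total noise matches a lower-dose scan.

    Quantum noise scales roughly as 1/sqrt(dose), so a scan at dose
    fraction f has total sigma = sigma_full / sqrt(f); we add the
    quadrature difference to the full-dose image.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return img + rng.normal(0.0, sigma_add, img.shape)

rng = np.random.default_rng(0)
img = np.zeros((256, 256))   # noise-free stand-in for a full-dose image
low = simulate_reduced_dose(img, sigma_full=10.0, dose_fraction=0.25, rng=rng)
# added noise component should have sigma ~= 10*sqrt(3) ~= 17.3
print(low.std())
```

Realistic insertion tools additionally account for the scanner's noise power spectrum and the spatially varying attenuation, which a single global σ ignores.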
Energy Technology Data Exchange (ETDEWEB)
Roberts, F.F.; Kishore, P.R.S.; Cunningham, M.E.
1978-08-01
A series of 86 pediatric lumbar spine abnormalities was evaluated to determine the diagnostic benefit of radiography in oblique projection as compared to frontal-lateral projections alone. In only four patients was an abnormality apparent on the oblique view which had not already been demonstrated by the frontal-lateral series; each of these represented an isolated spondylolysis. Because the diagnostic yield was low at the cost of more than doubling the gonadal radiation dose to the patient, it is recommended that oblique views be eliminated in the routine radiography of the pediatric lumbar spine.
Some open problems in noncommutative probability
International Nuclear Information System (INIS)
Kruszynski, P.
1981-01-01
A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)
Proposal for Modified Damage Probability Distribution Functions
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup; Hansen, Peter Friis
1996-01-01
Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project the present proposal for modified damage stability probability distribution functions has been developed and submitted to "Sub-committee on st…
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
International Nuclear Information System (INIS)
2015-01-01
In 2011, a specific agreement was signed between the Nuclear Safety Council and the University of Malaga to carry out a survey of the radiology procedures used in Spanish health centres, their frequency, and the doses received by patients. (Author)
International Nuclear Information System (INIS)
Sont, W.N.
1995-01-01
A method is introduced to estimate career doses from a combination of historical monitoring data and a single year's dose data. This method, called D1, eliminates the bias arising from incorporating historical dose data from times when occupational doses were generally much higher than they are today. Doses calculated by this method are still conditional on the preservation of the status quo in the effectiveness of radiation protection. The method takes into account the variation of the annual dose, and of the probability of being monitored, with the time elapsed since the start of a career. It also allows for the calculation of a standard error of the projected career dose. Results from recent Canadian dose data are presented. (author)
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Energy Technology Data Exchange (ETDEWEB)
NONE
2002-12-01
Electron paramagnetic resonance (EPR) dosimetry is a physical method for the assessment of absorbed dose from ionising radiation. It is based on the measurement of stable radiation-induced radicals in human calcified tissues (primarily in tooth enamel). EPR dosimetry with teeth is now firmly established in retrospective dosimetry. It is a powerful method for providing information on exposure to ionising radiation many years after the event, since the 'signal' is 'stored' in the tooth or the bone. This technique is of particular relevance to relatively low dose exposures or when the results of conventional dosimetry are not available (e.g. in accidental circumstances). The use of EPR dosimetry as an essential tool for retrospective assessment of radiation exposure is an important part of radioepidemiological studies and also provides data to select appropriate countermeasures based on retrospective evaluation of individual doses. Despite well-established regulations and protocols for maintaining radiation protection dose limits, the assurance that these limits will not be exceeded cannot be guaranteed, thus providing new challenges for development of accurate methods of individual dose assessment. To meet some of these challenges, in 1998 the IAEA initiated a co-ordinated research project (CRP) with the objective to review the available methods, current research and development in EPR biodosimetry technology, which may be of practical use. The major goal of this CRP was to investigate the use of EPR biodosimetry for reconstruction of absorbed dose in tooth enamel with the aim of providing Member States with up-to-date, and generally agreed upon advice regarding the most suitable procedures and the best focus for their research. The co-ordinated research project was conducted over four years and this publication presents the results and findings by a group of investigators from different countries. The available cytogenetic methods for radiation dose assessment were
International Nuclear Information System (INIS)
2002-12-01
Electron paramagnetic resonance (EPR) dosimetry is a physical method for the assessment of absorbed dose from ionising radiation. It is based on the measurement of stable radiation-induced radicals in human calcified tissues (primarily in tooth enamel). EPR dosimetry with teeth is now firmly established in retrospective dosimetry. It is a powerful method for providing information on exposure to ionising radiation many years after the event, since the 'signal' is 'stored' in the tooth or the bone. This technique is of particular relevance to relatively low-dose exposures or when the results of conventional dosimetry are not available (e.g. in accidental circumstances). The use of EPR dosimetry as an essential tool for retrospective assessment of radiation exposure is an important part of radioepidemiological studies, and it also provides data to select appropriate countermeasures based on retrospective evaluation of individual doses. Despite well-established regulations and protocols for maintaining radiation protection dose limits, the assurance that these limits will not be exceeded cannot be guaranteed, thus providing new challenges for the development of accurate methods of individual dose assessment. To meet some of these challenges, in 1998 the IAEA initiated a co-ordinated research project (CRP) with the objective of reviewing the available methods and the current research and development in EPR biodosimetry technology which may be of practical use. The major goal of this CRP was to investigate the use of EPR biodosimetry for reconstruction of absorbed dose in tooth enamel, with the aim of providing Member States with up-to-date and generally agreed-upon advice regarding the most suitable procedures and the best focus for their research. The co-ordinated research project was conducted over four years, and this publication presents the results and findings by a group of investigators from different countries. The available cytogenetic methods for radiation dose assessment were
Shorack, Galen R
2017-01-01
This second-edition textbook offers a rigorous introduction to measure-theoretic probability with particular attention to topics of interest to mathematical statisticians: a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic-function presentation. Additionally, considerable emphasis is placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace, Innsbruck, Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
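How probability enters at the most basic level can be sketched with the Born rule for a single qubit: outcome probabilities are the squared magnitudes of complex amplitudes. The state below is an assumed example for illustration, not one from the article.

```python
import math

# Born rule: measurement probabilities are squared amplitude magnitudes.
# Assumed single-qubit state (an equal superposition with a relative phase).
alpha = complex(1 / math.sqrt(2), 0)   # amplitude of |0>
beta = complex(0, 1 / math.sqrt(2))    # amplitude of |1>

p0 = abs(alpha) ** 2   # probability of measuring 0
p1 = abs(beta) ** 2    # probability of measuring 1

assert abs(p0 + p1 - 1.0) < 1e-12      # state is normalised
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```

The measurement itself is irreducibly random: repeated runs of an identically prepared circuit yield 0 and 1 with these frequencies, which is one way the uncertainty discussed in the article enters the computation.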
Quantum computing and probability
International Nuclear Information System (INIS)
Ferry, David K
2009-01-01
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)
International Nuclear Information System (INIS)
Bodelle, B.; Isler, S.; Scholtz, J.-E.; Frellesen, C.; Luboldt, W.; Vogl, T.J.; Beeres, M.
2016-01-01
Aim: To evaluate the advantage of sinogram-affirmed iterative reconstruction (SIR) compared to filtered back projection (FBP) in upper abdomen computed tomography (CT) after transarterial chemoembolisation (TACE) at different tube currents. Materials and methods: The study was approved by the institutional review board. Written informed consent was obtained from all patients. Post-TACE CT was performed with different tube currents successively varied in four steps (180, 90, 45 and 23 mAs) with 40 patients per group (mean age: 60±12 years, range: 23–85 years, sex: 70 female, 90 male). The data were reconstructed with standard FBP and five different SIR strengths. Image quality was independently rated by two readers on a five-point scale. High (Lipiodol-to-liver) as well as low (liver-to-fat) contrast-to-noise ratios (CNRs) were intra-individually compared within one dose to determine the optimal strength (S1–S5) and inter-individually between different doses to determine the possibility of dose reduction using the Kruskal–Wallis test. Results: Subjective image quality and objective CNR analysis were concordant: intra-individually, SIR was significantly (p<0.001) superior to FBP. Inter-individually, regarding different doses (180 versus 23 ref mAs), there was no significant (p=1.00) difference when using S5 SIR at 23 mAs instead of FBP. Conclusion: SIR allows for an 88% dose reduction from 3.43 to 0.4 mSv in unenhanced CT of the liver following TACE without subjective or objective loss in image quality. - Highlights: • Diagnostic image quality and radiation dose of ultra-low-dose CT of the upper abdomen using sinogram affirmed iterative reconstruction following transarterial chemoembolization in comparison to low-dose and standard dose CT and filtered back projection technique. • Ultra-low dose CT of the upper abdomen using sinogram affirmed iterative reconstruction allows for significant dose reduction by 88%. • Ultra-low dose CT of the upper abdomen
Neutrosophic Probability, Set, And Logic (first version)
Smarandache, Florentin
2000-01-01
This project is part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set" and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.
Energy Technology Data Exchange (ETDEWEB)
Khawaja, Ranish Deedar Ali, E-mail: rkhawaja@mgh.harvard.edu [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Singh, Sarabjeet [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Madan, Rachna [Division of Thoracic Radiology, Brigham and Women' s Hospital and Harvard Medical School, Boston (United States); Sharma, Amita; Padole, Atul; Pourjabbar, Sarvenaz; Digumarthy, Subba; Shepard, Jo-Anne; Kalra, Mannudeep K. [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States)
2014-10-15
Highlights: • Filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected sizes of patients. • Lesion detection (such as solid non-calcified lung nodules) in lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with filtered back projection technique using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra-low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were performed for 29 patients (age, 57–87 years; F:M, 15:12; BMI, 16–32 kg/m{sup 2}). All patients provided written informed consent for the acquisition of additional ultra-low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDI{sub vol} 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV], 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP{sub 0.9}, FBP{sub 0.5}, and FBP{sub 0.2}. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). With
International Nuclear Information System (INIS)
Benjamin, J.; Wesley, S.G.; Rajan, M.P.
2013-01-01
Invertebrates are significant reference organisms, and some of them tend to accumulate certain radionuclides in increased levels. It is imperative that the levels of radionuclides are measured in certain organism in the vicinity of any major nuclear power project before its commissioning; hence, this study was carried out in the surroundings of the Kudankulam Nuclear Power Project site. The natural radionuclide polonium-210 having affinity to the organic matter in the soil and to the protein content of the animals, is very significant as it delivers a high internal dose to the organisms. The activity concentration of this radionuclide, its transfer and dose were assessed in two terrestrial (earthworm, Pheretima posthuma and land snail, Trachea vittata) and two aquatic (apple snail, Pila globosa and bivalve mollusc, Lamellidens marginalis) invertebrates. The activity concentration of 210 Po was found to be the highest in the earthworm and the lowest in the land snail. The per-animal dose due to 210 Po was the highest for the apple snail and the lowest for the earthworm. The results indicate that 210 Po does not constitute a significant radiological threat to the organisms. (author)
DEFF Research Database (Denmark)
Rasmussen, Troels A.; Merritt, Timothy R.
2017-01-01
CNC cutting machines have become essential tools for designers and architects, enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would have previously been thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts, reducing material waste. Contextual interviews and ideation sessions led to a deeper...
International Nuclear Information System (INIS)
Jong, E.J. de; Koester, H.W.; Vries, W.J. de; Lembrechts, J.F.
1990-03-01
Parts are presented of the results of a safety-assessment study of disposal of medium- and low-level radioactive waste in salt formations in the Netherlands. The study concerns several disposal concepts for two kinds of salt formation, a deep dome and a shallow dome. Seven cases were studied with the same Dutch inventory and one with a reference inventory R, in order to compare results with those of other PACOMA participants. The total activity of the reference inventory R is 30 percent lower than the Dutch inventory, but some long-lived nuclides such as I-129, Np-237 and U-238 have a considerably higher activity. This reference inventory R has been combined with the disposal concept of mined cavities in a shallow salt dome. In each case, the released fraction of stored radionuclides moves gradually with water through the geosphere to the biosphere, where it enters a river. River water is used for sprinkler irrigation and for drinking by man and livestock. The dispersal of the radionuclides into the biosphere is calculated with the BIOS program of the NRPB. Subroutines linked to the program add doses via different pathways to obtain a maximum individual dose, a collective dose and an integrated collective dose. This study presents the results of these calculations. (author). 11 refs.; 39 figs.; 111 tabs
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
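The task the subjects face can be sketched in a few lines: a hidden Bernoulli parameter that jumps stepwise, observed outcome by outcome. The running-mean observer below is only a naive baseline for contrast; the paper's model instead encodes the sequence compactly by its change points.

```python
import random

random.seed(1)

# Hidden stepwise-nonstationary Bernoulli parameter: p jumps at trial 100.
true_p = [0.2] * 100 + [0.8] * 100
outcomes = [1 if random.random() < p else 0 for p in true_p]

# Naive observer: estimate p as the running mean of all outcomes so far.
# (The paper's subjects do better: they detect the change point quickly,
# whereas this estimator is dragged down by pre-change outcomes.)
estimates = []
total = 0
for i, x in enumerate(outcomes, start=1):
    total += x
    estimates.append(total / i)

print(f"estimate after 100 trials: {estimates[99]:.2f}")   # near 0.2
print(f"estimate after 200 trials: {estimates[199]:.2f}")  # lags the jump to 0.8
```

The gap between the post-jump running mean and the true value 0.8 illustrates why change-point encoding, rather than trial-by-trial averaging, is needed to reproduce the rapid detection reported in finding (e).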
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
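The structural parallel invoked here can be written out with the standard definitions (notation assumed, not taken from the paper): in both cases, conditioning reduces a joint quantity by the marginal of the conditioning variable.

```latex
% Conditional probability and conditional (Shannon) entropy share one structure:
% a joint quantity normalised, or reduced, by the marginal on the condition.
P(A \mid B) = \frac{P(A \cap B)}{P(B)},
\qquad
H(X \mid Y) = H(X, Y) - H(Y)
            = -\sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(y)} .
```

The division defining $P(A \mid B)$ becomes, after taking logarithms and averaging, the subtraction defining $H(X \mid Y)$, which is the sense in which the two conditionals have "the same logical structure".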
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas, it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
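A minimal experiment of the kind advocated here can also be simulated once students have done it by hand; the dice example below is an illustration, not taken from the article.

```python
import random

random.seed(0)

# Estimate the probability of rolling a six by experiment; as the number
# of rolls grows, the empirical frequency approaches the theoretical 1/6.
def empirical_p_six(n_rolls):
    hits = sum(1 for _ in range(n_rolls) if random.randint(1, 6) == 6)
    return hits / n_rolls

for n in (60, 600, 6000):
    print(n, round(empirical_p_six(n), 3))
```

Comparing the printed frequencies against 1/6 ≈ 0.167 mirrors the classroom progression from observed outcomes to the formula-based answer.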
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, provided a given probability of ...
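The two quality measures named here can be computed for any ranking at a cutoff k; the documents and relevance judgements below are invented for illustration, and nothing quantum-specific is assumed.

```python
def detection_and_fallout(ranking, relevant, k):
    """Probability of detection (recall) and probability of false alarm
    (fallout) for the top-k items of a ranking."""
    top = set(ranking[:k])
    hits = len(top & relevant)                      # relevant items retrieved
    n_nonrelevant = len(ranking) - len(relevant)    # non-relevant items in total
    p_detect = hits / len(relevant)
    p_false = len(top - relevant) / n_nonrelevant
    return p_detect, p_false

# Hypothetical ranking of 6 documents, 3 of which are relevant.
ranking = ["d1", "d2", "d3", "d4", "d5", "d6"]
relevant = {"d1", "d3", "d6"}

print(detection_and_fallout(ranking, relevant, k=3))  # (2/3, 1/3)
```

The paper's claim is then a statement about these two numbers: at a matched probability of false alarm, the quantum-probability ranking attains the higher probability of detection.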
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
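One concrete instance of the gradient property: for the multinomial logit model the CPGF is the log-sum-exp of the utilities, and differentiating it recovers the softmax choice probabilities. The check below assumes this logit special case; the paper's framework covers far more general RUMs.

```python
import math

def cpgf(u):
    """Logit choice-probability generating function: log of summed exp-utilities."""
    return math.log(sum(math.exp(ui) for ui in u))

def choice_probs(u):
    """Gradient of the logit CPGF, i.e. the softmax choice probabilities."""
    z = sum(math.exp(ui) for ui in u)
    return [math.exp(ui) / z for ui in u]

u = [1.0, 0.5, -0.2]          # assumed utilities for three alternatives
probs = choice_probs(u)

# Verify numerically that dG/du_i equals the i-th choice probability.
h = 1e-6
for i, p in enumerate(probs):
    u_plus = u[:]
    u_plus[i] += h
    numeric_grad = (cpgf(u_plus) - cpgf(u)) / h
    assert abs(numeric_grad - p) < 1e-4

print([round(p, 3) for p in probs])  # probabilities sum to 1
```

The same gradient relation is what lets mixtures of RUMs be handled through logarithmic mixtures of their CPGFs.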
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Contents: Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...
Energy Technology Data Exchange (ETDEWEB)
Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)
2013-07-15
To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)
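The contrast-to-noise ratio used as the objective quality metric in studies like this one can be sketched with a common definition; the study's exact ROI protocol is not reproduced here, and the Hounsfield-unit values below are invented for illustration.

```python
def cnr(mean_roi, mean_ref, noise_sd):
    """Contrast-to-noise ratio between a region of interest and a reference
    region, with image noise taken as the standard deviation measured in a
    homogeneous area (one common definition among several in use)."""
    return abs(mean_roi - mean_ref) / noise_sd

# Hypothetical measurements: disc vs. dural sac contrast with two noise levels,
# mimicking how iterative reconstruction raises CNR by lowering noise.
print(cnr(mean_roi=60.0, mean_ref=-100.0, noise_sd=20.0))  # FBP-like noise -> 8.0
print(cnr(mean_roi=60.0, mean_ref=-100.0, noise_sd=10.0))  # IR-like noise  -> 16.0
```

Because the contrast term is fixed by the anatomy and the protocol, halving the noise doubles the CNR, which is the mechanism behind the dose-reduction result reported in the abstract.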
International Nuclear Information System (INIS)
Becce, Fabio; Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick; Verdun, Francis R.; Meuli, Reto
2013-01-01
To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)
International Nuclear Information System (INIS)
Rodrigues, J.C.L.; Negus, I.S.; Manghat, N.E.; Hamilton, M.C.K.
2013-01-01
Aim: To investigate the effect of incorporating a lateral scan projection radiograph (topogram) in addition to the standard frontal topogram on excess scan length in computed tomography pulmonary angiography (CTPA), and to quantify the impact on effective dose. Materials and methods: Fifty consecutive patients referred for exclusion of pulmonary embolism who had undergone a CTPA examination with a conventional frontal topogram to plan scan length (protocol A) were compared with 50 consecutive patients who had undergone a CTPA study with frontal and additional lateral topograms for planning (protocol B) in a retrospective audit. Optimal scan length was defined from lung apex to lung base. Mean excess scan length beyond these landmarks was determined. The mean organ doses to the thyroid, liver, and stomach, as well as the mean effective dose, were estimated using standard conversion factors. Results: The mean excess scan length was significantly lower in protocol B compared to the protocol A cohort (19.5 ± 17.4 mm [mean ± standard deviation] versus 39.1 ± 20.4 mm, p < 0.0001). The mean excess scan length below the lung bases was significantly lower in the protocol B cohort compared to the protocol A group (7.5 ± 12.7 mm versus 23 ± 16.6 mm, p < 0.0001), as were the mean organ doses to the stomach (4.24 ± 0.81 mGy versus 5.22 ± 1.06 mGy, p < 0.0001) and liver (5.60 ± 0.64 mGy versus 6.38 ± 0.81 mGy, p < 0.0001). A non-significant reduction in over-scanning above the apices in protocol B was observed compared with protocol A (12 ± 8.8 mm versus 16.2 ± 13.6 mm, p = 0.07), which equated to a lower mean thyroid organ dose (3.28 ± 1.76 mGy versus 4.11 ± 3.11 mGy, p = 0.104). Conclusion: The present audit indicates that incorporation of a lateral topogram into the CTPA protocol, together with radiographer education, reduces excess scan length, which significantly reduces the dose to the liver and stomach, and potentially lowers the dose to the thyroid. This simple
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
…either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still…
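The calibration problem can be made concrete with a small numerical sketch (an illustration of the general idea, not the authors' estimator): under a quadratic scoring rule, a risk-averse expected-utility maximizer with CRRA utility reports a probability shaded toward 0.5, so joint knowledge of the utility curvature is needed to invert an observed report back to the latent belief. The stake parameters `a` and `b` below are illustrative assumptions.

```python
def optimal_report(p, rho, a=10.0, b=8.0):
    """Report r maximizing expected CRRA utility under a quadratic
    scoring rule paying a - b*(1 - r)**2 if the event occurs and
    a - b*r**2 if it does not (a, b chosen to keep payoffs positive)."""
    def u(x):
        return x if rho == 0 else x ** (1 - rho) / (1 - rho)
    grid = [i / 1000 for i in range(1, 1000)]
    return max(grid, key=lambda r: p * u(a - b * (1 - r) ** 2)
                                   + (1 - p) * u(a - b * r ** 2))

# A risk-neutral agent (rho = 0) reports the true belief...
print(optimal_report(0.7, 0.0))  # 0.7
# ...while a risk-averse agent (rho = 0.7) shades toward 0.5,
# so the raw report understates the latent probability.
print(optimal_report(0.7, 0.7))  # strictly between 0.5 and 0.7
```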
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including…
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Franchi, F; Lazzeri, C; Foschi, M; Tosti-Guerra, C; Barletta, G
2002-08-01
Pharmacological and clinical studies on the effects of angiotensin-converting enzyme (ACE) inhibitors support the idea of a central role played by angiotensin II, which can cause cardiovascular and renal disease independently of its blood-pressure-elevating effects. The present investigation aimed to evaluate the effects of three different pharmacological regimens on both blood pressure and sympathetic drive in uncomplicated essential hypertension, by means of laboratory blood pressure measurements and ambulatory monitoring, 24-h heart rate variability and plasma noradrenaline levels. Thus, an ACE-inhibitor monotherapy (trandolapril, 2 mg/day), an AT(1)-receptor antagonist monotherapy (irbesartan, 300 mg/day), their low-dose combination (0.5 mg/day plus 150 mg/day, respectively) and placebo were given, in a randomised, single-blind, crossover fashion for a period of 3 weeks each, to 12 mild essential hypertensives. Power spectral analysis (short recordings) and noradrenaline measurements were also performed in the supine position and after a postural challenge (60 degrees head-up tilting test: HUT). The low-dose combination therapy induced the greatest reduction in the LF component and in the LF/HF ratio, both in the resting and tilted positions, as well as in blood pressure. However, the physiological autonomic response to HUT was maintained. Noradrenaline plasma levels were lower after the combined therapy than after each drug alone. Our data demonstrate that in mild and uncomplicated essential hypertension, chronic low-dose combination therapy with an ACE inhibitor and an AT(1) antagonist is more effective than the recommended full-dose monotherapy with either drug in influencing the autonomic regulation of the heart, suggesting a relative reduction in sympathetic drive at both cardiac and systemic levels.
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
The current status of advanced theoretical methods for transition probabilities of atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test of the theoretical methods
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
International Nuclear Information System (INIS)
2000-06-01
In many Member States the use of large cobalt-60 gamma ray facilities and electron beam accelerators with beam energies from about 0.1 to 10 MeV for industrial processing continues to increase. For these processes, quality assurance relies on the application of well established dosimetry systems and procedures. This is especially the case for health regulated processes, such as the radiation sterilization of health care products, and the irradiation of food to eliminate pathogenic organisms or to control insect pests. A co-ordinated research project (CRP) was initiated by the IAEA in June 1995. Research contracts and research agreements in areas of high dose dosimetry were initiated to meet these challenges. The major goals of this CRP were to investigate the parameters that influence the response of dosimeters and to develop reference and transfer dosimetry techniques, especially for electron beams of energy less than 4 MeV and for high energy X ray sources (up to 5 MV). These will help to unify the radiation measurements performed by different radiation processing facilities and other high dose dosimetry users in Member States and encourage efforts to obtain traceability to primary and secondary standards laboratories. It will also aim to strengthen and expand the present International Dose Assurance Service (IDAS) provided by the IAEA
International Nuclear Information System (INIS)
Takx, Richard A.P.; Schoepf, U. Joseph; Moscariello, Antonio; Das, Marco; Rowe, Garrett; Schoenberg, Stefan O.; Fink, Christian; Henzler, Thomas
2013-01-01
Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data was reconstructed using standard FBP at full-dose (Group 1a) and 80% tube current reduced low-dose (Group 1b). The low-dose raw data was additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest and signal-to-noise-ratio (SNR) as well as contrast-to-noise-ratio (CNR) was calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of group 2 was lowered by 22% on average when compared to group 1b (p 2 compared to group 1b (p 2 (1.88 ± 0.63) was also rated significantly higher when compared to group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA
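The SNR and CNR figures in such image-quality studies are simple region-of-interest statistics. A minimal sketch of the standard definitions (the HU and noise values below are hypothetical, not data from this study):

```python
def snr(mean_hu, noise_sd):
    """Signal-to-noise ratio: mean ROI attenuation over image noise (SD)."""
    return mean_hu / noise_sd

def cnr(mean_signal, mean_background, noise_sd):
    """Contrast-to-noise ratio between two ROIs."""
    return (mean_signal - mean_background) / noise_sd

# Hypothetical ROI measurements (HU): contrast-filled aorta vs. chest wall
aorta, chest_wall, noise = 450.0, 50.0, 25.0
print(snr(aorta, noise))              # 18.0
print(cnr(aorta, chest_wall, noise))  # 16.0
```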
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
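The Bayesian parameter estimation of Part III admits a one-screen illustration (a standard textbook example, not code from this book): conjugate Beta-Binomial updating of an unknown success probability.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Posterior Beta(alpha', beta') parameters after Binomial data:
    the Beta prior is conjugate, so updating is just addition."""
    return alpha + successes, beta + failures

# Flat Beta(1, 1) prior, then 7 successes observed in 10 trials
a, b = beta_binomial_update(1.0, 1.0, 7, 3)
posterior_mean = a / (a + b)           # (1 + 7) / (2 + 10) = 2/3
print(a, b, round(posterior_mean, 4))  # 8.0 4.0 0.6667
```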
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
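The calculation described, a frequency estimate from categorized handling events, reduces to simple counting. A hedged sketch of the arithmetic (the event counts and movement totals below are hypothetical, not values from the Framatome ANP report):

```python
def event_probability(events, total_handled):
    """Point estimate of the per-movement event probability."""
    return events / total_handled

def rule_of_three_upper(total_handled):
    """Approximate 95% upper bound on the event probability when
    zero events were observed in total_handled trials ('rule of three')."""
    return 3.0 / total_handled

# Hypothetical: 4 misload events over 80,000 FA movements
p_misload = event_probability(4, 80_000)
print(p_misload)                    # 5e-05
# Hypothetical: no damage events observed in the same sample
print(rule_of_three_upper(80_000))  # 3.75e-05
```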
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved; however, the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
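A '95 percent interval of measurement uncertainty' is, in the common GUM convention, the mean plus or minus an expanded uncertainty U = k·u with coverage factor k of about 2. A minimal sketch of that convention (the readings below are hypothetical; the book's own 'extended' classical methods refine the choice of k):

```python
import math

def expanded_uncertainty(readings, k=2.0):
    """Mean and expanded uncertainty U = k * s / sqrt(n), where s is the
    sample standard deviation and s/sqrt(n) is the standard uncertainty."""
    n = len(readings)
    mean = sum(readings) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    return mean, k * s / math.sqrt(n)

# Hypothetical repeated readings of a nominal 10.00 quantity
mean, U = expanded_uncertainty([9.98, 10.02, 10.01, 9.99, 10.00])
print(round(mean, 2), round(U, 3))  # 10.0 0.014
```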
International Nuclear Information System (INIS)
Böning, G.; Schäfer, M.; Grupp, U.; Kaul, D.; Kahn, J.; Pavel, M.; Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F.
2015-01-01
Highlights: • Iterative reconstruction (IR) in staging CT provides objective image quality equal to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) and many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0
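The reported 37.6% dose reduction follows directly from the two CTDIvol values. As an arithmetic check (using the rounded decimals from the abstract, which reproduce the stated figure to within rounding):

```python
# Reported dose indices (mGy)
ctdi_fbp, ctdi_asir = 10.17, 6.34

# Percentage reduction relative to FBP
reduction_pct = (ctdi_fbp - ctdi_asir) / ctdi_fbp * 100
print(round(reduction_pct, 1))  # 37.7, matching the stated 37.6% to rounding
```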
Energy Technology Data Exchange (ETDEWEB)
Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)
2015-08-15
Highlights: • Iterative reconstruction (IR) in staging CT provides objective image quality equal to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) and many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDI{sub vol}) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDI{sub vol} (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
Böning, G; Schäfer, M; Grupp, U; Kaul, D; Kahn, J; Pavel, M; Maurer, M; Denecke, T; Hamm, B; Streitparth, F
2015-08-01
To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0.05). ASIR can be used to reduce radiation dose without sacrificing image quality and diagnostic confidence in staging CT of NET patients. This may be beneficial for patients with frequent follow-up and significant cumulative radiation exposure. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
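One common parameterization of such a one-parameter generalized logarithm (the paper's exact convention may differ) is ln_q(x) = (x^q - 1)/q, with inverse exp_q(x) = (1 + qx)^(1/q); both reduce to the ordinary log and exp in the limit q → 0. A sketch checking the inverse-pair and limit properties:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; ordinary log as q -> 0."""
    return math.log(x) if q == 0 else (x ** q - 1) / q

def gen_exp(x, q):
    """Inverse of gen_log; ordinary exp as q -> 0."""
    return math.exp(x) if q == 0 else (1 + q * x) ** (1 / q)

# exp_q inverts ln_q for q = 0.5
print(round(gen_exp(gen_log(2.5, 0.5), 0.5), 10))       # 2.5
# ln_q recovers the natural logarithm as q -> 0
print(abs(gen_log(2.5, 1e-8) - math.log(2.5)) < 1e-6)   # True
```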
2014-06-30
…precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen… Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in… Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996). 50. Ledoux
International Nuclear Information System (INIS)
Fraass, B.
2015-01-01
Over the past 20 years the NIH has funded individual grants, program project grants, and clinical trials which have been instrumental in advancing patient care. The ways that each grant mechanism lends itself to the different phases of translating research into clinical practice will be described. Major technological innovations, such as IMRT and proton therapy, have been advanced with R01-type and P01-type funding and will be discussed. Similarly, the role of program project grants in identifying and addressing key hypotheses on the potential of 3D conformal therapy, normal tissue-guided dose escalation and motion management will be described. An overview will be provided of how these technological innovations have been applied in multi-institutional NIH-sponsored trials. Finally, the panel will discuss which research questions should be funded by the NIH to inspire the next advances in radiation therapy. Learning Objectives: Understand the different funding mechanisms of the NIH; Learn about research advances that have led to innovation in delivery; Review achievements due to NIH-funded program project grants in radiotherapy over the past 20 years; Understand example advances achieved with multi-institutional clinical trials. NIH
International Nuclear Information System (INIS)
2007-11-01
Active personal dosimeters (APD) are widely used in many countries, e.g. in the medical field and as operational dosimeters in nuclear power plants. Their use as legal dosimeters is already established in a few countries and will increase in the near future. In the majority of countries, APDs have not undergone accreditation programmes or intercomparisons. In 2001, a EURADOS (European Radiation Dosimetry Group) Working Group on harmonization of individual monitoring was formed, funded by the European Commission under the fifth framework programme and by the participating institutes. The work addressed four issues, inter alia an inventory of new developments in individual monitoring, with an emphasis on the possibilities and performance of active (electronic) dosimeters for both photon/beta and neutron dosimetry. Within the work on this issue, a catalogue of the most extensively used active personal dosimeters (APDs) suitable for individual monitoring was made. On the basis of the knowledge gained in this activity, the organization of an international intercomparison addressing APDs was considered of great value to the dosimetric community. The IAEA, in cooperation with EURADOS, organized such an intercomparison, in which most of the testing criteria described in two internationally accepted standards (IEC 61526 and IEC 61283) were used. Additionally, simulated workplace fields were used to test the APDs' response to pulsed X ray fields and mixed gamma/X ray fields. This is the first time that the results of comparisons of this type have been published, which is of great importance for APD end users in medical diagnostic and surgical X ray applications. Nine suppliers from six countries in Europe and the USA participated in the intercomparison with 13 different models. One of the models was a special design for extremity dose measurements. Irradiation and readout were done by two accredited calibration laboratories in Belgium and France and the French
International Nuclear Information System (INIS)
Degteva, M.O.; Drozhko, E.; Anspaugh, L.R.; Napier, B.A.; Bouville, A.C.; Miller, C.W.
1996-02-01
This work is being carried out as a feasibility study to determine if a long-term course of work can be implemented to assess the long-term risks of radiation exposure delivered at low to moderate dose rates to the populations living in the vicinity of the Mayak Industrial Association (MIA). This work was authorized and conducted under the auspices of the US-Russian Joint Coordinating Committee on Radiation Effects Research (JCCRER) and its Executive Committee (EC). The MIA was the first Russian site for the production and separation of plutonium. This plant began operation in 1948, and during its early days there were technological failures that resulted in the release of large amounts of waste into the rather small Techa River. There were also gaseous releases of radioiodines and other radionuclides during the early days of operation. In addition, there was an accidental explosion in a waste storage tank in 1957 that resulted in a significant release. The Techa River Cohort has been studied for several years by scientists from the Urals Research Centre for Radiation Medicine and an increase in both leukemia and solid tumors has been noted.
Energy Technology Data Exchange (ETDEWEB)
Degteva, M.O. [Urals Research Center for Radiation Medicine, Chelyabinsk (Russian Federation); Drozhko, E. [Branch 1 of Moscow Biophysics Inst., Ozersk (Russian Federation); Anspaugh, L.R. [Lawrence Livermore National Lab., CA (United States); Napier, B.A. [Pacific Northwest National Lab., Richland, WA (United States); Bouville, A.C. [National Cancer Inst., Bethesda, MD (United States); Miller, C.W. [Centers for Disease Control and Prevention, Atlanta, GA (United States)
1996-02-01
This work is being carried out as a feasibility study to determine if a long-term course of work can be implemented to assess the long-term risks of radiation exposure delivered at low to moderate dose rates to the populations living in the vicinity of the Mayak Industrial Association (MIA). This work was authorized and conducted under the auspices of the US-Russian Joint Coordinating Committee on Radiation Effects Research (JCCRER) and its Executive Committee (EC). The MIA was the first Russian site for the production and separation of plutonium. This plant began operation in 1948, and during its early days there were technological failures that resulted in the release of large amounts of waste into the rather small Techa River. There were also gaseous releases of radioiodines and other radionuclides during the early days of operation. In addition, there was an accidental explosion in a waste storage tank in 1957 that resulted in a significant release. The Techa River Cohort has been studied for several years by scientists from the Urals Research Centre for Radiation Medicine and an increase in both leukemia and solid tumors has been noted.
International Nuclear Information System (INIS)
2011-01-01
This document reports a dose assessment study performed by IRSN (the French Institute for Radiation Protection and Nuclear Safety) 66 days after the Fukushima nuclear accident. A new dose assessment was carried out by IRSN to estimate projected doses due to external exposure from radioactive deposits, for exposure durations of 3 months, 1 year and 4 years before evacuation. The purpose of this report is to provide insight on all radiological assessments performed, to the knowledge of IRSN to date, and on the impact of the population evacuation measures to be taken to minimize the medium- and long-term risks of developing leukaemia or other radiation-induced cancers. This report only considers the external doses already received as well as the doses that may be received in the future from fallout deposits, regardless of doses received previously from the radioactive plume.
Probable maximum flood control
International Nuclear Information System (INIS)
DeGabriele, C.E.; Wu, C.L.
1991-11-01
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
Conditional probability on MV-algebras
Czech Academy of Sciences Publication Activity Database
Kroupa, Tomáš
2005-01-01
Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005
Energy Technology Data Exchange (ETDEWEB)
Galindo G, I. F.; Vergara del C, J. A.; Galvan A, S. J. [Instituto Nacional de Electricidad y Energias Limpias, Reforma 113, Col. Palmira, 62490 Cuernavaca, Morelos (Mexico); Tijerina S, F., E-mail: francisco.tijerina@cfe.gob.mx [CFE, Central Nucleoelectrica Laguna Verde, Carretera Federal Cardel-Nautla Km 42.5, 91476 Municipio Alto Lucero, Veracruz (Mexico)
2016-09-15
The use of specialized codes to estimate projected radiation doses for a postulated emergency event at a nuclear power plant requires that certain plant data be available according to the event being simulated. The calculation of the possible radiological release is the critical activity for carrying out the emergency actions. However, not all of the required plant data are obtained directly from the plant; some need to be calculated. In this paper we present a computational tool that calculates the plant data required by the radiological dose estimation codes. The tool provides the required information when there is an emergency venting of gases from the primary containment atmosphere, whether from the wet well or the dry well, and also calculates the time in which the spent fuel pool would become uncovered in the event of a water leak through one of the walls or the floor of the pool. The tool implements mathematical models for the processes involved, such as compressible flow in pipes with area change and with constant area, taking into account the effects of friction, and, for the case of the spent fuel pool, hydraulic models to calculate the time in which a container is emptied. The models implemented in the tool were validated against data from the literature for simulated cases; the results obtained with the tool are very close to the reference values. This tool will also support the use of the radiological dose estimation codes in postulated emergency cases, so that the actions to be taken can be determined adequately and efficiently, in a way that affects the population as little as possible. (Author)
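The tank-emptying calculation mentioned above rests on a standard hydraulic result. A minimal sketch, assuming ideal Torricelli outflow through a breach of constant area (the pool and breach dimensions below are invented for illustration, not taken from the tool):

```python
import math

def drain_time(area_tank, area_hole, h0, g=9.81):
    """Time (s) to empty a tank of cross-section area_tank (m^2)
    through a hole of area area_hole (m^2) from initial level h0 (m).
    Torricelli's law gives the outflow speed v = sqrt(2 g h);
    integrating A_t dh/dt = -A_h sqrt(2 g h) yields the closed form below."""
    return (area_tank / area_hole) * math.sqrt(2.0 * h0 / g)

# Hypothetical case: 100 m^2 pool, 0.01 m^2 breach, 12 m of water
t = drain_time(100.0, 0.01, 12.0)
print(f"time to empty: {t / 3600.0:.1f} h")
```

A real spent-fuel-pool model would also account for the breach discharge coefficient and changing pool geometry; this sketch only shows the idealized closed-form result.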
Energy Technology Data Exchange (ETDEWEB)
Hanf, R.W.; Dirkes, R.L.; Duncan, J.P.
1992-07-01
The objective of the Hanford Environmental Dose Reconstruction Project (HEDR) is to estimate the potential radiation doses received by people living within the sphere of influence of the Hanford Site. A potential critical pathway for human radiation exposure is through the consumption of waterfowl that frequent onsite waste-water ponds or through eating of fish, shellfish, and waterfowl that reside in/on the Columbia River and its tributaries downstream of the reactors. This document summarizes information on fish, shellfish, and waterfowl radiation contamination for samples collected by Hanford monitoring personnel and offsite agencies for the period 1945 to 1972. Specific information includes the types of organisms sampled, the kinds of tissues and organs analyzed, the sampling locations, and the radionuclides reported. Some tissue concentrations are also included. We anticipate that these yearly summaries will be helpful to individuals and organizations interested in evaluating aquatic pathway information for locations impacted by Hanford operations and will be useful for planning the direction of future HEDR studies.
Energy Technology Data Exchange (ETDEWEB)
Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)
2014-08-15
The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreement was poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)
Bayesian maximum posterior probability method for interpreting plutonium urinalysis data
International Nuclear Information System (INIS)
Miller, G.; Inkret, W.C.
1996-01-01
A new internal dosimetry code for interpreting urinalysis data in terms of radionuclide intakes is described for the case of plutonium. The mathematical method is to maximise the Bayesian posterior probability using an entropy function as the prior probability distribution. A software package (MEMSYS) developed for image reconstruction is used. Some advantages of the new code are that it ensures positive calculated doses, it smooths out fluctuating data, and it provides an estimate of the propagated uncertainty in the calculated doses. (author)
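The posterior-maximisation idea can be illustrated with a toy grid search. This is a hedged sketch, not the MEMSYS algorithm: the excretion fractions, measurement values, Gaussian noise model and entropy-prior weighting below are all invented for illustration.

```python
import numpy as np

# Toy MAP estimate of a plutonium intake I (Bq) from urinalysis.
# Hypothetical model: expected excretion = I * e(t) for known excretion
# fractions e(t); measurement noise is Gaussian.  The prior is an entropy
# functional S(I) = -(I/m) ln(I/m) + I/m scaled by alpha, which enforces
# I > 0 and regularises the estimate (loosely in the spirit of the
# maximum-entropy prior described above).
e_t = np.array([1e-3, 6e-4, 4e-4])       # excretion fractions at 3 sample times
y = np.array([2.1e-3, 1.1e-3, 0.9e-3])   # measured urine activities (Bq/day)
sigma = 2e-4                             # measurement standard deviation
m, alpha = 1.0, 0.1                      # entropy default model and weight

I_grid = np.linspace(0.01, 10.0, 100_000)  # candidate intakes
residuals = y[None, :] - I_grid[:, None] * e_t[None, :]
log_like = -0.5 * np.sum(residuals**2, axis=1) / sigma**2
entropy = -(I_grid / m) * np.log(I_grid / m) + I_grid / m
log_post = log_like + alpha * entropy      # log posterior up to a constant
I_map = I_grid[np.argmax(log_post)]        # maximum a posteriori intake
print(f"MAP intake estimate: {I_map:.2f} Bq")
```

With the small prior weight chosen here the MAP estimate stays close to the least-squares fit; increasing `alpha` pulls it toward the default model `m`.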
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-05-01
Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
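The problem COVAL solves by numerical transformation of distributions can also be illustrated by Monte Carlo sampling. A sketch of the structural-reliability application named above, with invented load and resistance distributions (this is not the COVAL method itself, just the same question answered by simulation):

```python
import numpy as np

rng = np.random.default_rng(42)

# Resistance R and random load L of a structure; we want the distribution
# of the safety margin M = R - L and the failure probability P(M < 0).
n = 1_000_000
R = rng.normal(loc=10.0, scale=1.0, size=n)   # structural resistance
L = rng.normal(loc=6.0, scale=1.5, size=n)    # random load
M = R - L                                     # function of the two variables

p_fail = np.mean(M < 0.0)
print(f"mean margin = {M.mean():.2f}, P(failure) ~ {p_fail:.2e}")
```

For two independent normals the answer is available analytically (M is normal with mean 4 and standard deviation sqrt(1 + 1.5^2)), which makes this a convenient check on the sampling approach.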
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
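The underlying scenarios of the second-child problem can be made explicit by enumerating the equally likely families. A minimal sketch under the standard assumption that each child is independently a boy or girl with probability 1/2:

```python
from itertools import product
from fractions import Fraction

# All equally likely two-child families, ordered (elder, younger).
families = list(product("BG", repeat=2))  # BB, BG, GB, GG

def cond(event, given):
    """P(event | given) by counting equally likely families."""
    kept = [f for f in families if given(f)]
    hits = [f for f in kept if event(f)]
    return Fraction(len(hits), len(kept))

# Scenario 1: "the elder child is a boy" -> P(both boys) = 1/2
p1 = cond(lambda f: f == ("B", "B"), lambda f: f[0] == "B")
# Scenario 2: "at least one child is a boy" -> P(both boys) = 1/3
p2 = cond(lambda f: f == ("B", "B"), lambda f: "B" in f)
print(p1, p2)  # 1/2 1/3
```

The two answers differ precisely because the conditioning events differ, which is the explication of scenarios and assumptions the contestants' debate boils down to.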
International Nuclear Information System (INIS)
Novakova, O.; Ryba, J.; Slezak, V.; Svobodova, B.; Viererbl, L.
1984-10-01
Methods of measuring dose rate or dose using a scintillation counter are discussed. A plastic scintillator based on polystyrene with PBD and POPOP activators and coated with ZnS(Ag) was chosen for the projected monitor. The scintillators were cylindrical and spherical in shape and of different sizes; black polypropylene tubes were chosen as the best casing for the probes. For the counter with different plastic scintillators, the statistical error 2σ for natural background was determined. For determining the suitable thickness of the ZnS(Ag) layer, the energy dependence of the counter was measured. The radioisotopes 137Cs, 241Am and 109Cd were chosen as radiation sources. The best suited ZnS(Ag) thickness was found to be 0.5 μm. Experiments were carried out to determine the directional dependence of the detector response and the signal-to-noise ratio. The temperature dependence of the detector response and its compensation were studied, as were the time stability and fatigue manifestations of the photomultiplier. The design of a laboratory prototype of a dose rate and dose monitor is described. Block diagrams are given of the various functional parts of the instrument. The designed instrument is easily portable, battery powered, measures dose rates from natural background over a range of five orders of magnitude, i.e. 10^-2 to 10^3 nGy/s, and allows determination of doses up to 10 mGy. Accuracy of measurement in the energy range of 50 keV to 1 MeV is better than ±20%. (E.S.)
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable
International Nuclear Information System (INIS)
Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn; Yoon, Jeong Hee; Choi, Jin Woo
2014-01-01
To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and the image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to the body habitus using phantoms with different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using the FBP and three different strengths of the AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase in SNR and CNR as well as noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing the image noise as well as improving the image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.
Energy Technology Data Exchange (ETDEWEB)
Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)
2014-04-15
To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and the image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to the body habitus using phantoms with different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using the FBP and three different strengths of the AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase in SNR and CNR as well as noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing the image noise as well as improving the image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.
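The agreement measure quoted in the study, the concordance correlation coefficient, can be computed directly from paired measurements. A sketch of Lin's formulation; the paired LV% readings below are hypothetical, not the study's data:

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement
    series (e.g. LV% from standard-dose vs low-dose CT).  Uses population
    moments: CCC = 2*cov(x,y) / (var(x) + var(y) + (mean(x)-mean(y))^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical paired airway luminal volume percentages:
std_ct = [62.0, 55.5, 71.2, 48.9, 66.3]
low_ct = [61.1, 56.0, 70.0, 50.2, 65.8]
print(f"CCC = {ccc(std_ct, low_ct):.3f}")
```

Unlike the Pearson correlation, the CCC penalises systematic offsets between the two series, which is why the study could see high correlation yet only moderate agreement.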
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
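The post-processing step described above, turning a stack of equally likely realizations into a direct probability map, can be sketched as follows. The synthetic lognormal fields and the threshold are placeholders for actual conditional geostatistical simulations:

```python
import numpy as np

rng = np.random.default_rng(7)

# A stack of equally likely realizations of a contaminant field on a grid.
n_real, ny, nx = 200, 20, 20
realizations = rng.lognormal(mean=2.0, sigma=0.5, size=(n_real, ny, nx))

threshold = 10.0  # e.g. a cleanup concentration limit (arbitrary units)

# Per-cell probability of exceeding the threshold: the fraction of
# realizations above it.  Also the expected concentration per cell.
p_exceed = np.mean(realizations > threshold, axis=0)
expected = realizations.mean(axis=0)

print(f"fraction of cells with P(exceed) > 0.5: {np.mean(p_exceed > 0.5):.2f}")
```

In a real application each realization would honour the measured sample values and their spatial covariance; the exceedance and expectation maps then feed directly into remediate/do-not-remediate decisions per parcel.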
International Nuclear Information System (INIS)
Bergstroem, Ulla; Avila, Rodolfo; Ekstroem, Per-Anders; Cruz, Idalmis de la
2008-05-01
Following a review by the Swedish regulatory authorities of the safety analysis of the SFR 1 disposal facility for low and intermediate level waste, SKB has prepared an updated safety analysis, SAR-08. This report presents estimations of annual doses to the most exposed groups from potential radionuclide releases from the SFR 1 repository for a number of calculation cases, selected using a systematic approach for identifying relevant scenarios for the safety analysis. The dose estimates can be used for demonstrating that the long term safety of the repository is in compliance with the regulatory requirements. In particular, the mean values of the annual doses can be used to estimate the expected risks to the most exposed individuals, which can then be compared with the regulatory risk criteria for human health. The conversion from doses to risks is performed in the main report. For one scenario however, where the effects of an earthquake taking place close to the repository are analysed, risk calculations are presented in this report. In addition, prediction of concentrations of radionuclides in environmental media, such as water and soil, are compared with concentration limits suggested by the Erica-project as a base for estimating potential effects on the environment. The assessment of the impact on non-human biota showed that the potential impact is negligible. Committed collective dose for an integration period of 10,000 years for releases occurring during the first thousand years after closure are also calculated. The collective dose commitment was estimated to be 8 manSv. The dose calculations were carried out for a period of 100,000 years, which was sufficient to observe peak doses in all scenarios considered. Releases to the landscape and to a well were considered. The peaks of the mean annual doses from releases to the landscape are associated with C-14 releases to a future lake around year 5,000 AD. In the case of releases to a well, the peak annual doses
Energy Technology Data Exchange (ETDEWEB)
Bergstroem, Ulla (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Avila, Rodolfo; Ekstroem, Per-Anders; Cruz, Idalmis de la (Facilia AB, Bromma (Sweden))
2008-06-15
Following a review by the Swedish regulatory authorities of the safety analysis of the SFR 1 disposal facility for low and intermediate level waste, SKB has prepared an updated safety analysis, SAR-08. This report presents estimations of annual doses to the most exposed groups from potential radionuclide releases from the SFR 1 repository for a number of calculation cases, selected using a systematic approach for identifying relevant scenarios for the safety analysis. The dose estimates can be used for demonstrating that the long term safety of the repository is in compliance with the regulatory requirements. In particular, the mean values of the annual doses can be used to estimate the expected risks to the most exposed individuals, which can then be compared with the regulatory risk criteria for human health. The conversion from doses to risks is performed in the main report. For one scenario however, where the effects of an earthquake taking place close to the repository are analysed, risk calculations are presented in this report. In addition, prediction of concentrations of radionuclides in environmental media, such as water and soil, are compared with concentration limits suggested by the Erica-project as a base for estimating potential effects on the environment. The assessment of the impact on non-human biota showed that the potential impact is negligible. Committed collective dose for an integration period of 10,000 years for releases occurring during the first thousand years after closure are also calculated. The collective dose commitment was estimated to be 8 manSv. The dose calculations were carried out for a period of 100,000 years, which was sufficient to observe peak doses in all scenarios considered. Releases to the landscape and to a well were considered. The peaks of the mean annual doses from releases to the landscape are associated with C-14 releases to a future lake around year 5,000 AD. In the case of releases to a well, the peak annual doses
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
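As a minimal illustration of a continuous probability distribution, the normal distribution assigns a probability to any interval of outcomes. A sketch using the standard library; the height figures are illustrative, not from the paper:

```python
from statistics import NormalDist

# Illustrative continuous variable: adult height ~ N(175 cm, 7 cm).
height = NormalDist(mu=175, sigma=7)

p_tall = 1 - height.cdf(190)                 # P(height > 190 cm)
p_band = height.cdf(182) - height.cdf(168)   # P(168 <= height <= 182), ~68%
print(f"P(>190) = {p_tall:.3f}, P(within 1 sigma) = {p_band:.3f}")
```

The one-sigma band recovering roughly 68% of the probability mass is the familiar rule of thumb that makes the normal distribution so useful in statistical inference.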
Pointwise probability reinforcements for robust statistical inference.
Frénay, Benoît; Verleysen, Michel
2014-02-01
Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Introduction to probability and measure theories
International Nuclear Information System (INIS)
Partasarati, K.
1983-01-01
Chapters of probability and measure theories are presented. Borel maps of measure spaces into each other and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on the extension of measures to the projective limits of measure spaces. The integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
Heart sounds analysis using probability assessment
Czech Academy of Sciences Publication Activity Database
Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel
2017-01-01
Roč. 38, č. 8 (2017), s. 1685-1700 ISSN 0967-3334 R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : heart sounds * FFT * machine learning * signal averaging * probability assessment Subject RIV: FS - Medical Facilities ; Equipment OBOR OECD: Medical engineering Impact factor: 2.058, year: 2016
Tumor control probability after a radiation of animal tumors
International Nuclear Information System (INIS)
Urano, Muneyasu; Ando, Koichi; Koike, Sachiko; Nesumi, Naofumi
1975-01-01
Tumor control and regrowth probability of animal tumors irradiated with a single x-ray dose were determined, using a spontaneous C3H mouse mammary carcinoma. Cellular radiation sensitivity of tumor cells and tumor control probability of the tumor were examined by the TD50 and TCD50 assays, respectively. Tumor growth kinetics were measured by counting the percentage of labelled mitoses and by measuring the growth curve. A mathematical analysis of tumor control probability was made from these results. A formula was proposed that accounted for cell population kinetics (a division probability model), cell sensitivity to radiation and the number of tumor cells. (auth.)
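A common form of the Poisson tumour control model combines a surviving fraction (here the linear-quadratic model) with the clonogen number: TCP = exp(-N * SF(D)), the probability that no clonogen survives. This sketch uses illustrative parameter values, not those fitted in the animal study:

```python
import math

def tcp(total_dose, dose_per_fraction, n_clonogens, alpha, beta):
    """Poisson tumour control probability for a uniform fractionated dose.
    Cell kill follows the linear-quadratic model per fraction of size d:
    SF = exp(-n * (alpha*d + beta*d^2)); then TCP = exp(-N * SF)."""
    n = total_dose / dose_per_fraction
    sf = math.exp(-n * (alpha * dose_per_fraction
                        + beta * dose_per_fraction ** 2))
    return math.exp(-n_clonogens * sf)

# Illustrative values: 2 Gy fractions, 1e7 clonogens,
# alpha = 0.35 /Gy, beta = 0.035 /Gy^2
for D in (30.0, 40.0, 50.0):
    print(f"D = {D:4.1f} Gy  ->  TCP = {tcp(D, 2.0, 1e7, 0.35, 0.035):.3f}")
```

The steep sigmoid rise of TCP with dose over a narrow range is the characteristic feature such formulas capture, and it is the single-patient analogue of the dose-response curves discussed for the TCP model in the heading abstract.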
International Nuclear Information System (INIS)
Konstantinov, Y.O.; Bruk, G.Y.; Ershov, E.B.
2000-01-01
The cohort of children in the western districts of the Bryansk Region of Russia exposed to radiation following the Chernobyl accident is described in this paper. The cohort was selected under the Joint Medical Research Project on Dosimetry Associated with the Chernobyl Accident, conducted by the Sasakawa Memorial Health Foundation (SMHF, Japan) and the Research Institute of Radiation Hygiene (RIRH, Russia). The subjects of the Research Project are people residing in the most contaminated areas of Russia who were 0 to 10 years old at the time of exposure. At present the cohort comprises 1210 subjects, though this number may decrease slightly in the course of follow-up owing to migration of the population. Most cohort subjects had their health status examined within the framework of the Chernobyl Sasakawa Health and Medical Cooperation Project (CSHMCP) from 1991-1996. Since the main findings of the CSHMCP studies were thyroid abnormalities, subjects were selected on the basis of credible estimates of thyroid dose. Preference for inclusion in the cohort was determined by the availability of health examination data from the previous study (1991-1996) and of individual dosimetry, environmental and social data that may prove useful for reconstruction of individual dose. The primary data analyzed for subject selection are measurements of iodine-131 in the thyroid in May-June 1986, questionnaire data on individual food habits, and early measurements of radiocesium in the body of subjects made by RIRH from May to September 1986. Plausible analytical models were applied to calculate thyroid dose from the available data. Previously developed methods of thyroid dose reconstruction, using early measurements of radiocesium content in the body and questionnaire data on individual consumption of locally produced milk, were re-evaluated. Based on these analytical procedures, an individual thyroid dose was ascribed to each member of the cohort. The
Prediction and probability in sciences
International Nuclear Information System (INIS)
Klein, E.; Sacquin, Y.
1998-01-01
This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
International Nuclear Information System (INIS)
Cumak, V.; Morgun, A.; Bakhanova, O.; Loganovs'kij, K.; Loganovs'ka, T.; Marazziti, D.
2015-01-01
This study aimed to investigate the radiation exposure of the hippocampus in interventional medical professionals irradiated in the operating room, and to compare doses to the hippocampus with the effective dose (protection quantity), as well as with the doses measured by individual dosimeters, in order to estimate, through a Monte Carlo simulation, the probability of reaching levels of radiation-induced cognitive and other neuropsychiatric alterations during their working career. The results showed that cranial irradiation was very heterogeneous and depended on the projection: doses to the left and right hippocampi may differ by up to a factor of 2.5, and under certain conditions the dose to the left hippocampus may be twice the effective dose estimated by the conventional double-dosimetry algorithm. Over a professional career, doses to the irradiated hippocampus may exceed the threshold able to provoke cognitive and emotional-behavioral impairment. Therefore, in-depth studies of the effects of brain irradiation in occupationally exposed interventional medical personnel appear urgently needed and crucial.
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, analogous to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
Fisher classifier and its probability of error estimation
Chittineni, C. B.
1979-01-01
Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
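The paper's closed-form leave-one-out expressions are not given in the abstract; a brute-force sketch of the quantities involved (Fisher's direction, a midpoint threshold, and the leave-one-out error estimate those expressions make efficient) might look as follows, with all data and parameters assumed for illustration:

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher's discriminant direction w = Sw^{-1}(m1 - m0), where Sw is
    the within-class scatter matrix (a textbook formulation)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    return np.linalg.solve(Sw, m1 - m0)

def loo_error(X, y):
    """Brute-force leave-one-out error: retrain without each point, then
    classify it by projecting onto Fisher's direction with a midpoint
    threshold. (The paper derives efficient closed-form expressions that
    avoid this explicit retraining.)"""
    errors = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        w = fisher_direction(Xtr[ytr == 0], Xtr[ytr == 1])
        midpoint = 0.5 * (Xtr[ytr == 0].mean(axis=0) + Xtr[ytr == 1].mean(axis=0))
        pred = int((X[i] - midpoint) @ w > 0)
        errors += pred != y[i]
    return errors / len(y)

# Hypothetical, well-separated two-class Gaussian data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.repeat([0, 1], 50)
print("LOO error estimate:", loo_error(X, y))
```

The point of the paper is precisely that this O(n)-retraining loop can be replaced by direct computational expressions.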
Normal tissue complication probability for salivary glands
International Nuclear Information System (INIS)
Rana, B.S.
2008-01-01
The purpose of radiotherapy is to strike a favourable balance between morbidity (due to the side effects of radiation) and cure of the malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable is established. In this study salivary gland complications have been considered. The cases treated on a 60Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinicians' judgement in ascertaining the end points was the only means of observation. The end points were early and late xerostomia, which were considered for NTCP evaluation over a period of 5 years.
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
International Nuclear Information System (INIS)
Oinam, A.S.; Dubey, S.; Kehwar, T.S.; Rout, Sanjaya K.; Patel, F.D.; Sharma, S.C.; Goyal, D.R.; Narayan, P.
2002-01-01
Intracavitary brachytherapy is one of the well-established techniques for the treatment of carcinoma of the cervix. Predicting late effects in normal tissues such as the rectum and bladder requires defining the volumes of the bladder and rectum in situ. In conventional planning of intracavitary and interstitial implants, simulator radiographs are used to reconstruct the applicator geometry, and dose points are used to represent the dose to critical organs. CT-based brachytherapy can define such volumes directly, instead of relying on dose points to represent the dose to these critical organs.
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
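The paper's central point, that a consistent nonparametric regression machine doubles as a probability machine, can be illustrated with the simplest such machine: k-nearest-neighbor regression on the 0/1 response. Everything below (data, k, the synthetic probability function) is an assumed toy setup, not taken from the paper:

```python
import numpy as np

def knn_probability(X_train, y_train, x, k=25):
    """k-NN 'probability machine' (sketch): regression of the 0/1 response
    on the k nearest neighbors, i.e. the local fraction of positives."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    return float(y_train[nearest].mean())

# Synthetic check where the true probability is known: P(Y=1 | X=x) = x
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (5000, 1))
y = (rng.uniform(0, 1, 5000) < X[:, 0]).astype(float)
for q in (0.2, 0.8):
    print("estimate at x =", q, ":", knn_probability(X, y, np.array([q]), k=200))
```

Because the regression target is an indicator, the regression estimate is automatically a conditional probability estimate, which is exactly the consistency argument the abstract invokes for random forests and nearest neighbors.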
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
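The central claim, that parameter uncertainty inflates the expected failure frequency, can be checked numerically. The sketch below is an assumed setup (normal losses, n = 10 observations, nominal 5% level), not the article's exact analytics for general location-scale families:

```python
import numpy as np
from math import erf, sqrt

def exceedance_prob(t):
    """P(X > t) for X ~ N(0, 1)."""
    return 0.5 * (1.0 - erf(t / sqrt(2.0)))

rng = np.random.default_rng(42)
n, nominal, reps = 10, 0.05, 20_000
z95 = 1.6448536269514722  # standard normal 95% quantile

# For each replication: estimate mean and sd from n observations, set the
# threshold at the plug-in 95% point, then compute the TRUE exceedance
# probability under the actual N(0, 1) risk factor.
freqs = []
for _ in range(reps):
    sample = rng.normal(0.0, 1.0, n)
    threshold = sample.mean() + z95 * sample.std(ddof=1)
    freqs.append(exceedance_prob(threshold))

print("nominal failure probability:", nominal)
print("expected failure frequency under parameter uncertainty:",
      round(float(np.mean(freqs)), 4))
```

With only ten observations the expected failure frequency comes out noticeably above the nominal 5%, which is the effect the article quantifies exactly and then corrects via approaches (1) and (2).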
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Bakker, Astrid; de Lange, René; Ahmed, Abdulfatah; Garcia, André; Tomkinson, David; Salamin, Julie; Buyvidovich, Sergey A.; Sohrabi, Tina; Dominguez, Alexandre; Campeanu, Cosmin
2015-01-01
Background: Computed tomography (CT) is one of the most widely used modalities for diagnostics in paediatric populations, which is a concern because it also delivers a high patient dose. Research has focused on developing computer algorithms that provide better image quality at lower dose. The iterative
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Joint probabilities and quantum cognition
International Nuclear Information System (INIS)
Acacio de Barros, J.
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Joint probabilities and quantum cognition
Energy Technology Data Exchange (ETDEWEB)
Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)
2012-12-18
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
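A minimal concrete instance of random variables without a joint distribution (a standard textbook example, assumed here for illustration rather than drawn from the paper's neural-oscillator model): three ±1-valued variables whose three pairwise correlations are all -1 admit no joint distribution, because the sum of the pairwise products is bounded below at every outcome:

```python
from itertools import product

# Any joint distribution over ±1-valued (X, Y, Z) gives correlations with
# E[XY] + E[YZ] + E[XZ] >= -1, because the sum xy + yz + zx is at least
# -1 at every one of the 8 outcomes. Pairwise correlations of (-1, -1, -1),
# which sum to -3, therefore cannot come from any joint distribution.
min_sum = min(x * y + y * z + x * z for x, y, z in product((-1, 1), repeat=3))
print("pointwise minimum of xy + yz + zx:", min_sum)  # -1, so -3 is impossible
```

Each pair of variables can still be perfectly anticorrelated when measured together; it is only the demand for a single joint distribution over all three that fails, which is the contextuality phenomenon the paper obtains from neural oscillators.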
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
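The Merton-style mechanism can be sketched numerically: defaults occur when correlated latent asset values fall below a barrier, and the correlation of the default indicators rises with the default probability. The asset correlation and PD levels below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
rho, nsim = 0.3, 400_000

# One-factor latent asset returns with pairwise correlation rho
common = rng.standard_normal(nsim)
a1 = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal(nsim)
a2 = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal(nsim)

def default_corr(pd_level):
    """Correlation of the two default indicators when the barrier is
    set so that each loan defaults with probability pd_level."""
    barrier = np.quantile(a1, pd_level)
    d1, d2 = a1 < barrier, a2 < barrier
    return float(np.corrcoef(d1, d2)[0, 1])

print("default-indicator correlation at PD = 1%: ", round(default_corr(0.01), 3))
print("default-indicator correlation at PD = 20%:", round(default_corr(0.20), 3))
```

Even with the asset correlation held fixed, raising the default probability raises the indicator correlation, which is the channel through which portfolio standard deviation grows as credit quality deteriorates.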
The Probabilities of Unique Events
2012-08-30
Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the
Probability Matching, Fast and Slow
Koehler, Derek J.; James, Greta
2014-01-01
A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...
SureTrak Probability of Impact Display
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
Research advances in probability of causation calculation of radiogenic neoplasms
International Nuclear Information System (INIS)
Ning Jing; Yuan Yong; Xie Xiangdong; Yang Guoshan
2009-01-01
Probability of causation (PC) was used to facilitate the adjudication of compensation claims for cancers diagnosed following exposure to ionizing radiation. In this article, the excess cancer risk assessment models used for PC calculation are reviewed. Cancer risk transfer models between different populations, dependence of cancer risk on dose and dose rate, modification by epidemiological risk factors and application of PC are also discussed in brief. (authors)
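The abstract does not restate the PC formula itself; in the standard formulation (an assumption here, since the review only references it), PC is the excess relative risk divided by the total relative risk:

```python
def probability_of_causation(excess_relative_risk):
    """Standard PC definition (assumed here; the review surveys the excess
    cancer risk models that produce ERR): PC = ERR / (1 + ERR)."""
    return excess_relative_risk / (1.0 + excess_relative_risk)

# If radiation exposure doubled the cancer risk (ERR = 1), PC = 0.5
print(probability_of_causation(1.0))
```

This is why the transfer of excess risk between populations and its dose-rate dependence, discussed in the article, matter: they directly change the ERR entering this ratio and hence the compensation decision.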
Probably not: future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...
International Nuclear Information System (INIS)
Nakamoto, Atsushi; Kim, Tonsok; Hori, Masatoshi; Onishi, Hiromitsu; Tsuboyama, Takahiro; Sakane, Makoto; Tatsumi, Mitsuaki; Tomiyama, Noriyuki
2015-01-01
Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols: Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise ratio (SNR) for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to
Energy Technology Data Exchange (ETDEWEB)
Nakamoto, Atsushi, E-mail: a-nakamoto@radiol.med.osaka-u.ac.jp; Kim, Tonsok, E-mail: kim@radiol.med.osaka-u.ac.jp; Hori, Masatoshi, E-mail: mhori@radiol.med.osaka-u.ac.jp; Onishi, Hiromitsu, E-mail: h-onishi@radiol.med.osaka-u.ac.jp; Tsuboyama, Takahiro, E-mail: t-tsuboyama@radiol.med.osaka-u.ac.jp; Sakane, Makoto, E-mail: m-sakane@radiol.med.osaka-u.ac.jp; Tatsumi, Mitsuaki, E-mail: m-tatsumi@radiol.med.osaka-u.ac.jp; Tomiyama, Noriyuki, E-mail: tomiyama@radiol.med.osaka-u.ac.jp
2015-09-15
Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols: Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise ratio (SNR) for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
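The simultaneous-interval idea in the abstract above can be sketched by Monte Carlo calibration. This is an illustrative simplification, not the paper's exact construction: it uses a single constant half-width around the theoretical quantiles, calibrated so that roughly a 1-α fraction of simulated normal samples fall entirely inside the band. The function name and parameters are made up for the sketch.

```python
import random
from statistics import NormalDist

def simultaneous_envelope(n, alpha=0.05, reps=2000, seed=1):
    """Monte Carlo calibration of a band around the theoretical normal
    quantiles that contains all n plotted points simultaneously with
    probability ~ 1 - alpha when the data really are standard normal."""
    rng = random.Random(seed)
    # Theoretical plotting positions (expected quantiles).
    centers = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    sims = [sorted(rng.gauss(0.0, 1.0) for _ in range(n)) for _ in range(reps)]
    # Largest deviation of any order statistic from its center, per sample;
    # the (1-alpha)-quantile of these maxima gives a simultaneous half-width.
    devs = sorted(max(abs(s[i] - centers[i]) for i in range(n)) for s in sims)
    width = devs[min(int((1 - alpha) * reps), reps - 1)]
    return [(c - width, c + width) for c in centers]

band = simultaneous_envelope(20)
print(len(band), all(lo < hi for lo, hi in band))  # prints: 20 True
```

A plotted sample would then be judged "close to the straight line" exactly when every ordered observation falls inside its interval.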
Energy Technology Data Exchange (ETDEWEB)
Laqmani, Azien, E-mail: a.laqmani@uke.de [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Avanesov, Maxim; Butscheidt, Sebastian; Kurfürst, Maximilian [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Sehner, Susanne [Department of Medical Biometry and Epidemiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Schmidt-Holtz, Jakob; Derlin, Thorsten; Behzadi, Cyrus [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Nagel, Hans D. [Science & Technology for Radiology, Fritz-Reuter-Weg 5f, 21244 Buchholz, Germany, (Germany); Adam, Gerhard; Regier, Marc [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany)
2016-11-15
Objective: To compare both image quality and visibility of normal and abnormal findings at submillisievert chest CT (smSv-CT) using filtered back projection (FBP) and the two different iterative reconstruction (IR) techniques iterative model reconstruction (IMR) and iDose⁴™. Materials and methods: This institutional review board-approved study was based on retrospective interpretation of clinically indicated acquired data. The requirement to obtain informed consent was waived. 81 patients with suspected pneumonia underwent smSv-CT (Brilliance iCT, Philips Healthcare; mean effective dose: 0.86 ± 0.2 mSv). Data were reconstructed using FBP and the two different IR techniques iDose⁴™ and IMR (Philips Healthcare) at various iteration levels. Objective image noise (OIN) was measured. Two experienced readers independently assessed all images for image noise, image appearance and visibility of normal anatomic and abnormal findings. A random intercept model was used for statistical analysis. Results: Compared to FBP and iDose⁴™, IMR reduced OIN by up to 88% and 72%, respectively (p < 0.001). A mild blotchy image appearance was seen in IMR images, affecting diagnostic confidence. iDose⁴™ images provided satisfactory to good image quality for visibility of normal and abnormal findings and were superior to FBP (p < 0.001). IMR images were significantly inferior for visibility of normal structures compared to iDose⁴™, while being superior for visibility of abnormal findings except for reticular pattern (p < 0.001). Conclusion: IMR results for visibility of normal and abnormal lung findings are heterogeneous, indicating that IMR may not represent a priority technique for clinical routine. iDose⁴™ represents a suitable method for evaluation of lung tissue at submillisievert chest CT.
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book follows the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start, by way of laying the groundwork for the later claim of the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdős-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of a vibration isolation system exceeding the vibration criteria is evaluated. Optimal system parameters, damping and natural frequency, are derived such that the probability of exceeding vibration criteria VC-E and VC-D is below 0.04.
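The Gaussian-exceedance idea underlying this abstract reduces to a one-line formula: if the relative displacement of the isolated mass is zero-mean Gaussian with standard deviation σ, then the chance of exceeding a criterion V is 2·(1 − Φ(V/σ)). The numerical values below are made up for illustration; they are not from the article.

```python
from statistics import NormalDist

def exceedance_probability(criterion, sigma):
    """P(|displacement| > criterion) for a zero-mean Gaussian response."""
    phi = NormalDist().cdf(criterion / sigma)
    return 2.0 * (1.0 - phi)

# Hypothetical numbers: criterion 3 units, response standard deviation 1 unit.
p = exceedance_probability(3.0, 1.0)
print(round(p, 4))  # prints: 0.0027
```

With a response three standard deviations below the criterion, the exceedance probability is well under the 0.04 target quoted in the abstract.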
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Future southcentral US wildfire probability due to climate change
Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.
2018-01-01
Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes where the fire probability response (+, −) may change direction (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.
Are low radiation doses Dangerous?
International Nuclear Information System (INIS)
Garcia Lima, O.; Cornejo, N.
1996-01-01
In recent years the answers to this question have been both affirmative and negative. From a radiation protection point of view, low doses of ionizing radiation potentially constitute an agent causing stochastic effects. A linear no-threshold relation is assumed between dose and the probability of occurrence of these effects. Arguments against the danger of low-dose radiation are reflected in concepts such as hormesis and adaptive response, phenomena that are being studied at present.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The purpose of the probability tables is twofold: to obtain a dense data representation, and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner and Sandy Zabell.
Al Qaroot, Bashar; Hogg, Peter; Twiste, Martin; Howard, David
2014-01-01
Patients with vertebral column deformations are exposed to high risks associated with ionising radiation exposure. Risks are further increased by the serial X-ray images that are needed to measure and assess their spinal deformation using Cobb or superimposition methods. Therefore, optimising such X-ray practice, by reducing dose whilst maintaining image quality, is a necessity. With a specific focus on lateral thoraco-lumbar images for Cobb and superimposition measurements, this paper outlines a systematic procedure for the optimisation of X-ray practice. Optimisation was conducted on the basis of suitable image quality from minimal dose. Image quality was appraised using a visual analogue rating scale, and Monte Carlo modelling was used for dose estimation. The optimised X-ray practice was identified by imaging healthy normal-weight male adult living human volunteers. The optimised practice consisted of: anode towards the head, broad focus, no OID or grid, 80 kVp, 32 mAs and 130 cm SID. Images of suitable quality for laterally assessing spinal conditions using Cobb or superimposition measurements were produced from an effective dose of 0.05 mSv, which is 83% less than the average effective dose used in the UK for lateral thoracic/lumbar exposures. This optimisation procedure can be adopted and used for the optimisation of other radiographic techniques.
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
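The core idea of decreasing an SEP event probability as time passes with no onset can be sketched with Bayes' rule. This is a minimal illustration, not the authors' algorithm: it assumes a prior probability p0 that the flare produces an SEP event and, as a stand-in for the observed delay-time distribution, an exponential distribution of onset delays with mean tau hours. All numbers are hypothetical.

```python
import math

def updated_probability(p0, hours_elapsed, tau=12.0):
    """P(event will still occur | no onset observed by hours_elapsed),
    assuming exponential onset delays with mean tau (an assumption here)."""
    survival = math.exp(-hours_elapsed / tau)  # P(onset later than t | event)
    return p0 * survival / (p0 * survival + (1.0 - p0))

p_start = updated_probability(0.5, 0.0)
p_later = updated_probability(0.5, 24.0)
print(p_start, round(p_later, 3))  # prints: 0.5 0.119
```

The forecast starts at the prior and decays toward zero as the flare fails to produce a ≥10 pfu onset, which is the qualitative behavior the abstract describes.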
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations the relation between seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...
Risk estimation using probability machines
2014-01-01
Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
Probability and statistics: A reminder
International Nuclear Information System (INIS)
Clement, B.
2013-01-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
2002-01-01
Electron paramagnetic resonance (EPR) dosimetry is a physical method for the assessment of absorbed dose from ionising radiation. It is based on the measurement of stable radiation induced radicals in human calcified tissues (primarily in tooth enamel). EPR dosimetry with teeth is now firmly established in retrospective dosimetry. It is a powerful method for providing information on exposure to ionising radiation many years after the event, since the 'signal' is 'stored' in the tooth or the bone. This technique is of particular relevance to relatively low dose exposures or when the results of conventional dosimetry are not available (e.g. in accidental circumstances). The use of EPR dosimetry, as an essential tool for retrospective assessment of radiation exposure is an important part of radioepidemiological studies and also provides data to select appropriate countermeasures based on retrospective evaluation of individual doses. Despite well established regulations and protocols for maintaining radiation pro...
International Nuclear Information System (INIS)
Boehlke, S.; Niegoth, H.
2012-01-01
In the nuclear power plant Leibstadt (KKL), large components will be dismantled during the next year and stored for final disposal within the interim storage facility ZENT at the NPP site. Before construction of ZENT, appropriate estimations of the local dose rate inside and outside the building and of the collective dose for normal operation have to be performed. The shielding calculations are based on the properties of the stored components and radiation sources and on the concepts for working place requirements. The installation of control and monitoring areas will depend on these calculations. For the determination of the shielding potential of concrete walls and steel doors with the defined boundary conditions, point-kernel codes like MicroShield® are used. Complex problems cannot be modeled with this code. Therefore the point-kernel code VISIPLAN® was developed for the determination of the local dose distribution functions in 3D models. The possibility of motion sequence inputs allows an optimization of collective dose estimations for the operational phases of a nuclear facility.
Karstad, K. (Kristina); Rugulies, R. (Reiner); Skotte, J. (Jørgen); Munch, P.K. (Pernille Kold); Greiner, B.A. (Birgit A.); Burdorf, A. (Alex); Søgaard, K. (Karen); A. Holtermann (Andreas)
2018-01-01
The aim of the study was to develop and evaluate the reliability of the "Danish observational study of eldercare work and musculoskeletal disorders" (DOSES) observation instrument to assess physical and psychosocial risk factors for musculoskeletal disorders (MSD) in eldercare work.
Jump probabilities in the non-Markovian quantum jump method
International Nuclear Information System (INIS)
Haerkoenen, Kari
2010-01-01
The dynamics of a non-Markovian open quantum system described by a general time-local master equation is studied. The propagation of the density operator is constructed in terms of two processes: (i) deterministic evolution and (ii) evolution of a probability density functional in the projective Hilbert space. The analysis provides a derivation for the jump probabilities used in the recently developed non-Markovian quantum jump (NMQJ) method (Piilo et al 2008 Phys. Rev. Lett. 100 180402).
On the probability of cure for heavy-ion radiotherapy
International Nuclear Information System (INIS)
Hanin, Leonid; Zaider, Marco
2014-01-01
The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, both in individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)
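The extinction-probability framework this work builds on is, in its simplest form, the standard Poisson cure model: TCP = exp(−N·S(D)), where N is the initial number of clonogenic cells and S(D) the per-cell survival probability after total dose D. The sketch below shows that model and its inversion into an upper bound on N; it is not the authors' microdosimetric derivation, and the numerical values are illustrative only.

```python
import math

def tcp(n_clonogens, surviving_fraction):
    """Poisson probability that no clonogenic cell survives treatment."""
    return math.exp(-n_clonogens * surviving_fraction)

def clonogen_upper_bound(observed_tcp, surviving_fraction):
    """Invert the model: the N consistent with an observed cure rate."""
    return -math.log(observed_tcp) / surviving_fraction

# Illustrative values: 1e7 clonogens, per-cell survival 1e-7 after treatment.
p_cure = tcp(1e7, 1e-7)
n_max = clonogen_upper_bound(0.5, 1e-7)
print(round(p_cure, 3), n_max > 0)  # prints: 0.368 True
```

Bounding N from an empirically observed cure rate in conventional RT is what allows, in the spirit of the abstract, transferring cure-probability estimates to a new treatment modality.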
Energy Technology Data Exchange (ETDEWEB)
NONE
2011-07-01
This document reports a dose assessment study performed by the IRSN (the French Radioprotection and Nuclear Safety Institute) 66 days after the Fukushima nuclear accident. A new dose assessment was carried out by IRSN to estimate projected doses due to external exposure from radioactive deposits, for exposure durations of 3 months, 1 year and 4 years before evacuation. The purpose of this report is to provide insight into all radiological assessments performed to the knowledge of the IRSN to date and the impact of the population evacuation measures to be taken to minimize the medium- and long-term risks of developing leukaemia or other radiation-induced cancers. This report only considers the external doses already received as well as the doses that may be received in the future from fallout deposits, regardless of doses received previously from the radioactive plume.
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Probability matching and strategy availability.
Koehler, Derek J; James, Greta
2010-09-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
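The accuracy gap between the two strategies discussed above is simple arithmetic: if one outcome occurs with probability p > 0.5, matching predicts it only a fraction p of the time, while maximizing always predicts it. The functions below just make that comparison explicit.

```python
def matching_accuracy(p):
    """Expected hit rate when prediction frequencies match outcome frequencies."""
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    """Expected hit rate when the more likely outcome is always predicted."""
    return max(p, 1 - p)

p = 0.7
print(round(matching_accuracy(p), 2), maximizing_accuracy(p))  # prints: 0.58 0.7
```

For p = 0.7 the matcher is right 58% of the time versus 70% for the maximizer, which is why matching is the inferior strategy that nevertheless "comes readily to mind".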
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Measures, Probability and Holography in Cosmology
Phillips, Daniel
This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Lambda using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Lambda may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than rho_k = 40 rho_m are disfavored by more than 99.99%, with a peak value at rho_Lambda = 7.9 x 10^-123 and rho_k = 4.3 rho_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We
Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M
Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
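Discounting data from such procedures are commonly summarized with a hyperbolic function of the odds against the large reinforcer, V = A / (1 + h·θ), where a steeper h indicates greater risk aversion. A sketch under that standard analysis model, with parameter values that are hypothetical rather than taken from this study:

```python
def odds_against(p):
    """Odds against reinforcement: theta = (1 - p) / p."""
    return (1 - p) / p

def discounted_value(amount, p, h):
    """Hyperbolic probability discounting: subjective value of an
    'amount'-sized reinforcer delivered with probability p."""
    return amount / (1 + h * odds_against(p))

# Block probabilities like those in an ascending/descending schedule:
for p in (1.0, 0.5, 0.25, 0.125):
    print(p, discounted_value(100, p, h=1.0))
```

Fitting h separately for ascending and descending schedules is one way to quantify the order effect the study reports.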
Logic, Probability, and Human Reasoning
2015-01-01
accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Probability matching and strategy availability
Koehler, Derek J.; James, Greta
2010-01-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...
Energy Technology Data Exchange (ETDEWEB)
Doi, M [National Inst. of Radiological Sciences, Chiba (Japan); Lagarde, F [Karolinska Inst., Stockholm (Sweden). Inst. of Environmental Medicine; Falk, R; Swedjemark, G A [Swedish Radiation Protection Inst., Stockholm (Sweden)
1996-12-01
Effective dose per unit radon progeny exposure to Swedish population in 1992 is estimated by the risk projection model based on the Swedish epidemiological study of radon and lung cancer. The resulting values range from 1.29 - 3.00 mSv/WLM and 2.58 - 5.99 mSv/WLM, respectively. Assuming a radon concentration of 100 Bq/m{sup 3}, an equilibrium factor of 0.4 and an occupancy factor of 0.6 in Swedish houses, the annual effective dose for the Swedish population is estimated to be 0.43 - 1.98 mSv/year, which should be compared to the value of 1.9 mSv/year, according to the UNSCEAR 1993 report. 27 refs, tabs, figs.
International Nuclear Information System (INIS)
Doi, M.; Lagarde, F.
1996-12-01
Effective dose per unit radon progeny exposure to Swedish population in 1992 is estimated by the risk projection model based on the Swedish epidemiological study of radon and lung cancer. The resulting values range from 1.29 - 3.00 mSv/WLM and 2.58 - 5.99 mSv/WLM, respectively. Assuming a radon concentration of 100 Bq/m3, an equilibrium factor of 0.4 and an occupancy factor of 0.6 in Swedish houses, the annual effective dose for the Swedish population is estimated to be 0.43 - 1.98 mSv/year, which should be compared to the value of 1.9 mSv/year, according to the UNSCEAR 1993 report. 27 refs, tabs, figs
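The annual-dose arithmetic in the two records above can be reproduced from the stated assumptions, using standard working-level conversions that the abstract does not spell out (assumed here: 1 WL corresponds to about 3740 Bq/m3 of equilibrium-equivalent radon, and 1 WLM is 1 WL inhaled for 170 hours):

```python
BQ_M3_PER_WL = 3740.0   # equilibrium-equivalent radon per working level (assumed)
HOURS_PER_WLM = 170.0   # exposure hours per working level month

def annual_wlm(radon_bq_m3, equilibrium_factor, occupancy):
    """Annual exposure in WLM from an indoor radon concentration."""
    eec = radon_bq_m3 * equilibrium_factor   # equilibrium-equivalent concentration
    hours_indoors = 8760.0 * occupancy       # hours per year spent in the dwelling
    return (eec / BQ_M3_PER_WL) * hours_indoors / HOURS_PER_WLM

exposure = annual_wlm(100, 0.4, 0.6)          # ~0.33 WLM/year
low, high = 1.29 * exposure, 5.99 * exposure  # quoted mSv/WLM range
print(f"{exposure:.2f} WLM/y -> {low:.2f}-{high:.2f} mSv/y")
```

With these conversions the quoted 0.43 - 1.98 mSv/year range follows directly from the 1.29 and 5.99 mSv/WLM endpoints.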
High-order noise analysis for low dose iterative image reconstruction methods: ASIR, IRIS, and MBAI
Do, Synho; Singh, Sarabjeet; Kalra, Mannudeep K.; Karl, W. Clem; Brady, Thomas J.; Pien, Homer
2011-03-01
Iterative reconstruction techniques (IRTs) have been shown to suppress noise significantly in low-dose CT imaging. However, medical doctors hesitate to accept this new technology because the visual impression of IRT images differs from that of full-dose filtered back-projection (FBP) images. The most common noise measurements, such as the mean and standard deviation of a homogeneous region in the image, do not provide sufficient characterization of noise statistics when the probability density function becomes non-Gaussian. In this study, we measure L-moments of intensity values of images acquired at 10% of normal dose and reconstructed by IRT methods of two state-of-the-art clinical scanners (i.e., GE HDCT and Siemens DSCT Flash), keeping the dosage level identical for both. The high- and low-dose scans (i.e., 10% of high dose) were acquired from each scanner, and L-moments of noise patches were calculated for the comparison.
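Because L-moments are linear combinations of order statistics, they remain well-behaved for heavy-tailed, non-Gaussian noise where ordinary higher moments are unstable. A minimal estimator via unbiased probability-weighted moments — an illustrative sketch, not the study's implementation:

```python
import numpy as np

def l_moments(sample):
    """First four sample L-moments via unbiased probability-weighted
    moments; returns (l1, l2, L-skewness, L-kurtosis)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    b = np.zeros(4)
    b[0] = x.mean()
    j = np.arange(n, dtype=float)
    for r in range(1, 4):
        w = np.ones(n)
        for k in range(r):
            w *= (j - k) / (n - 1 - k)   # weights vanish for j < r automatically
        b[r] = np.mean(w * x)
    l1 = b[0]
    l2 = 2*b[1] - b[0]
    l3 = 6*b[2] - 6*b[1] + b[0]
    l4 = 20*b[3] - 30*b[2] + 12*b[1] - b[0]
    return l1, l2, l3 / l2, l4 / l2

rng = np.random.default_rng(0)
print("gaussian:", l_moments(rng.normal(size=50_000)))
print("laplace: ", l_moments(rng.laplace(size=50_000)))  # heavier tails -> larger L-kurtosis
```

For a Gaussian, L-kurtosis is about 0.123; an IRT noise patch drifting away from that value signals the non-Gaussian statistics mentioned above.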
Directory of Open Access Journals (Sweden)
Lukas Ebner
2014-01-01
Full Text Available Objective: The aim of the present study was to evaluate a dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on the radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a thorax routine: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was constantly set to the reference level of 100 mAs, with automated tube current modulation using reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists on the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product. Results: The dose-length product (DLP) with FBP for the average chest CT was 308 ± 99.6 mGy·cm. In contrast, the DLP for chest CT with the IR algorithm was 196.8 ± 68.8 mGy·cm (P = 0.0001). A further decline in dose can be noted with IR and the ICD: DLP 166.4 ± 54.5 mGy·cm (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. The signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for IR/ICD compared to FBP (the P values ranged from 0.003 to 0.048). Overall contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower the radiation dose in chest CT examinations.
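The reported percentage reductions follow directly from the mean DLP values; a quick check (the IR/ICD figure computed from the rounded means comes out near 46%, slightly above the reported 45.6%, presumably because the published means are themselves rounded):

```python
def dose_reduction_pct(dlp_reference, dlp_protocol):
    """Percent dose reduction of a protocol relative to a reference DLP."""
    return 100.0 * (dlp_reference - dlp_protocol) / dlp_reference

fbp, ir, ir_icd = 308.0, 196.8, 166.4   # mean DLP values in mGy*cm
print(f"IR vs FBP:     {dose_reduction_pct(fbp, ir):.1f}%")      # 36.1%
print(f"IR+ICD vs FBP: {dose_reduction_pct(fbp, ir_icd):.1f}%")
```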
Arvind Chopra; Manjit Saluja; Girish Tillu; Anuradha Venugopalan; Gumdal Narsimulu; Sanjeev Sarmukaddam; Bhushan Patwardhan
2012-01-01
Background: Results of an exploratory trial suggested activity trends of Zingiber officinale-Tinospora cordifolia (platform combination)-based formulations in the treatment of Osteoarthritis (OA) Knees. These formulations were "platform combination+Withania somnifera+Tribulus terrestris" (formulation B) and "platform combination+Emblica officinale" (formulation C). This paper reports safety of these formulations when used in higher doses (1.5-2 times) along with Sallaki Guggul and Bhallataka...
Monte Carlo calculations of patient doses from dental radiography
International Nuclear Information System (INIS)
Gibbs, S.J.; Pujol, A.; Chen, T.S.; Malcolm, A.W.
1984-01-01
A Monte Carlo computer program has been developed to calculate patient dose from diagnostic radiologic procedures. Input data include patient anatomy as serial CT scans at 1-cm intervals from a typical cadaver, beam spectrum, and projection geometry. The program tracks single photons, accounting for photoelectric effect, coherent (using atomic form factors) and incoherent (using scatter functions) scatter. Inhomogeneities (bone, teeth, muscle, fat, lung, air cavities, etc.) are accounted for as they are encountered. Dose is accumulated in a three-dimensional array of voxels, corresponding to the CT input. Output consists of isodose curves, doses to specific organs, and effective dose equivalent, H_E, as defined by ICRP. Initial results, from dental bite-wing projections using 90-kVp, half-wave rectified dental spectra, have produced H_E values ranging from 3 to 17 microsieverts (0.3-1.7 mrem) per image, depending on image receptor and projection geometry. The probability of stochastic effect is estimated by ICRP as 10^-2/Sv, or about 10^-7 to 10^-8 per image
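The structure of such a photon-history code can be conveyed by a drastically simplified one-dimensional sketch: exponential free paths, a choice between photoelectric absorption and (forward-only) scatter at each interaction, and a per-voxel energy tally. Everything here — geometry, coefficients, unit energy deposition — is illustrative, not the program described above:

```python
import math
import random

def mc_depth_dose(mu=0.2, p_absorb=0.3, slab_cm=10.0, nvox=10,
                  photons=200_000, seed=42):
    """Tally energy deposited per voxel for photons entering a 1-D slab.
    mu: total attenuation coefficient (1/cm); p_absorb: probability an
    interaction is photoelectric (deposits all energy locally)."""
    rng = random.Random(seed)
    dose = [0.0] * nvox
    dx = slab_cm / nvox
    for _ in range(photons):
        x = 0.0
        while True:
            x += -math.log(1.0 - rng.random()) / mu   # sample a free path
            if x >= slab_cm:
                break                                  # photon escapes the slab
            if rng.random() < p_absorb:
                dose[int(x / dx)] += 1.0               # photoelectric: deposit locally
                break
            # otherwise scattered; this toy keeps the photon moving forward
    return dose

profile = mc_depth_dose()
print([round(d) for d in profile])   # deposition falls off with depth
```

A real code like the one above adds 3-D geometry, angular scattering distributions, energy-dependent cross sections, and tissue inhomogeneities, but the sample-path/tally loop is the same.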
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
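As a concrete instance of the inference algorithms mentioned above, here is the forward algorithm for a toy two-state HMM — the "GC-content" interpretation and all parameter values are invented for illustration — with per-step scaling so long sequences do not underflow:

```python
import numpy as np

# Toy 2-state HMM: hidden states 0/1, emissions indexed 0..3 (A, C, G, T).
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])                 # state transition matrix
E = np.array([[0.30, 0.20, 0.20, 0.30],    # emission probabilities, state 0
              [0.15, 0.35, 0.35, 0.15]])   # state 1 is GC-rich
pi = np.array([0.5, 0.5])                  # initial state distribution

def forward_loglik(obs):
    """Log-likelihood of an observation sequence via the forward
    algorithm, rescaling alpha at every step."""
    alpha = pi * E[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * E[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

seq = [2, 1, 2, 2, 1, 0, 3, 0]   # G C G G C A T A
print(forward_loglik(seq))
```

The same recursion, run with expected sufficient statistics, is the E-step used when learning such models from partially observed data.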
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
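In the degenerate case where each input is known only as an interval (a p-box with no distributional information), PBA reduces to interval arithmetic, and "pinching" an input to a point value shows how much of the output uncertainty that input carries. A minimal sketch with hypothetical quantities:

```python
class Interval:
    """Closed interval [lo, hi]: bounds with no distribution assumed."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))
    def width(self):
        return self.hi - self.lo
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

rate = Interval(0.1, 0.3)      # uncertain event rate (hypothetical)
demands = Interval(100, 200)   # uncertain number of demands (hypothetical)
risk = rate * demands
print(risk)                    # [10.0, 60.0]

# Pinching 'rate' to a point value: the remaining output width is the
# share of uncertainty contributed by 'demands' alone.
pinched = Interval(0.2, 0.2) * demands
print(pinched, pinched.width() / risk.width())
```

Full PBA carries bounding cumulative distribution functions through the same operations, so the output is a pair of CDFs rather than two endpoints.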
Gonad dose in cineurethrocystography
International Nuclear Information System (INIS)
Ardran, G.M.; Dixon-Brown, A.; Fursdon, P.S.
1978-01-01
The technical factors used for cineurethrocystography for the true lateral projection in females are given. The mid-line radiation dose has been measured with LiF TLD inserted into the vagina in 19 examinations. The average dose recorded was 148 mrad, the range being 50 to 306 mrad, the average number of cine frames exposed was 96. Data obtained using a Rando phantom indicated that the average ovary dose would be 30% greater than the mid-line dose since the near ovary receives a higher dose than the more distant one. The technique used for men is also given, the average gonad dose in six men being 123 mrad, range 56 to 243 mrad when simple lead foil gonad protection was used; the average number of cine frames was 107. The dose in one man without gonad protection was 1575 mrad for 112 cine frames. The results for both sexes compare favourably with those of others reported in the literature and with gonad doses recorded in typical IVP examinations. (author)
Lectures on probability and statistics
International Nuclear Information System (INIS)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
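The forward (probability) and inverse (statistics) problems contrasted above can both be illustrated with dice; the loaded-die probabilities below are, of course, hypothetical:

```python
from fractions import Fraction
from itertools import product

# Forward problem: probability that two fair dice sum to 7.
outcomes = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
print(p_seven)   # 1/6

# Inverse problem: given observed rolls of one die, compare the likelihood
# of "fair" against a hypothetical die loaded towards six.
rolls = [6, 3, 6, 6, 1, 6, 6, 2]

def likelihood(face_probs):
    result = Fraction(1)
    for r in rolls:
        result *= face_probs[r - 1]
    return result

fair = [Fraction(1, 6)] * 6
loaded = [Fraction(1, 10)] * 5 + [Fraction(1, 2)]
print(likelihood(loaded) / likelihood(fair))   # ratio > 1 favours "loaded"
```

As the notes say, the inverse direction is the harder one: a likelihood ratio orders the hypotheses but does not by itself settle how strongly the data should shift one's beliefs.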
Chopra, Arvind; Saluja, Manjit; Tillu, Girish; Venugopalan, Anuradha; Narsimulu, Gumdal; Sarmukaddam, Sanjeev; Patwardhan, Bhushan
2012-01-01
Results of an exploratory trial suggested activity trends of Zingiber officinale-Tinospora cordifolia (platform combination)-based formulations in the treatment of Osteoarthritis (OA) Knees. These formulations were "platform combination+Withania somnifera+Tribulus terrestris" (formulation B) and "platform combination+Emblica officinale" (formulation C). This paper reports safety of these formulations when used in higher doses (1.5-2 times) along with Sallaki Guggul and Bhallataka Parpati (a Semecarpus anacardium preparation). Ninety-two patients with symptomatic OA knees were enrolled in a 6 weeks investigator blind, randomized parallel efficacy 4-arm multicenter drug trial. The 4 arms were (I) formulation B, 2 t.i.d.; (II) formulation B, 2 q.i.d.; (III) platform combination+Sallaki Guggul; (IV) Bhallataka Parpati+formulation C. A detailed enquiry was carried out for adverse events (AE) and drug toxicity as per a priori check list and volunteered information. Laboratory evaluation included detailed hematology and metabolic parameters. Patients were examined at baseline, first and fourth weeks, and on completion. Standard statistical program (SPSS version 12.5) was used for analysis. None of the patients reported serious AE or withdrew due to any drug-related toxicity. Mild gut-related (mostly epigastric burning) AE was reported. A mild increase in liver enzymes [serum glutamic pyruvate transaminase (SGPT), serum glutamic oxaloacetic transaminase (SGOT)] without any other hepatic abnormality was reported in 2 patients (group IV). Other laboratory parameters remained normal. The mean improvement in active pain visual analog scale (1.4, CI 0.5-2.22), WOMAC (functional activity questionnaire) pain score (1.37, CI 0.22-2.5), and urinary C-TAX (cartilage collagen breakdown product) assay was maximum (NS) in group IV. Lower dose group I showed numerically superior improvement compared with higher dose group II. The results suggested that despite higher doses, standardized
Directory of Open Access Journals (Sweden)
Arvind Chopra
2012-01-01
Full Text Available Background: Results of an exploratory trial suggested activity trends of Zingiber officinale-Tinospora cordifolia (platform combination)-based formulations in the treatment of Osteoarthritis (OA) Knees. These formulations were "platform combination+Withania somnifera+Tribulus terrestris" (formulation B) and "platform combination+Emblica officinale" (formulation C). This paper reports safety of these formulations when used in higher doses (1.5-2 times) along with Sallaki Guggul and Bhallataka Parpati (a Semecarpus anacardium preparation). Materials and Methods: Ninety-two patients with symptomatic OA knees were enrolled in a 6 weeks investigator blind, randomized parallel efficacy 4-arm multicenter drug trial. The 4 arms were (I) formulation B, 2 t.i.d.; (II) formulation B, 2 q.i.d.; (III) platform combination+Sallaki Guggul; (IV) Bhallataka Parpati+formulation C. A detailed enquiry was carried out for adverse events (AE) and drug toxicity as per a priori check list and volunteered information. Laboratory evaluation included detailed hematology and metabolic parameters. Patients were examined at baseline, first and fourth weeks, and on completion. Standard statistical program (SPSS version 12.5) was used for analysis. Results: None of the patients reported serious AE or withdrew due to any drug-related toxicity. Mild gut-related (mostly epigastric burning) AE was reported. A mild increase in liver enzymes [serum glutamic pyruvate transaminase (SGPT), serum glutamic oxaloacetic transaminase (SGOT)] without any other hepatic abnormality was reported in 2 patients (group IV). Other laboratory parameters remained normal. The mean improvement in active pain visual analog scale (1.4, CI 0.5-2.22), WOMAC (functional activity questionnaire) pain score (1.37, CI 0.22-2.5), and urinary C-TAX (cartilage collagen breakdown product) assay was maximum (NS) in group IV. Lower dose group I showed numerically superior improvement compared with higher dose group II. Conclusion: The
International Nuclear Information System (INIS)
2005-05-01
Mammography is an extremely useful non-invasive imaging technique with unparalleled advantages for the detection of breast cancer. It has played an immense role in the screening of women above a certain age or with a family history of breast cancer. The IAEA has a statutory responsibility to establish standards for the protection of people against exposure to ionizing radiation and to provide for the worldwide application of those standards. A fundamental requirement of the International Basic Safety Standards for Protection Against Ionizing Radiation (BSS) and for the Safety of Radiation Sources, issued by the IAEA and co-sponsored by FAO, ILO, WHO, PAHO and NEA, is the optimization of radiological protection of patients undergoing medical exposure. In keeping with its responsibility on the application of standards, the IAEA programme on Radiological Protection of Patients attempts to reduce radiation doses to patients while balancing quality assurance considerations. IAEA-TECDOC-796, Radiation Doses in Diagnostic Radiology and Methods for Dose Reduction (1995), addresses this aspect. The related IAEA-TECDOC-1423 on Optimization of the Radiological Protection of Patients undergoing Radiography, Fluoroscopy and Computed Tomography, (2004) constitutes the final report of the coordinated research in Africa, Asia and eastern Europe. The preceding publications do not explicitly consider mammography. Mindful of the importance of this imaging technique, the IAEA launched a Coordinated Research Project on Optimization of Protection in Mammography in some eastern European States. The present publication is the outcome of this project: it is aimed at evaluating the situation in a number of countries, identifying variations in the technique, examining the status of the equipment and comparing performance in the light of the norms established by the European Commission. A number of important aspects are covered, including: - quality control of mammography equipment; - imaging
Normal tissue dose-effect models in biological dose optimisation
International Nuclear Information System (INIS)
Alber, M.
2008-01-01
Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
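The equivalent uniform dose concept referred to above is commonly evaluated as the generalized EUD, gEUD = (Σ_i v_i d_i^a)^(1/a) over a normalized differential DVH, where the exponent a encodes the tissue-specific volume effect. A sketch using Niemierko's standard form with a hypothetical DVH (the numbers are illustrative only):

```python
import numpy as np

def geud(doses, volumes, a):
    """Generalized equivalent uniform dose over a differential DVH.
    Large positive a -> serial behaviour (hot spots dominate);
    a = 1 -> mean dose (parallel-like); a < 0 -> target coverage
    (cold spots dominate)."""
    d = np.asarray(doses, dtype=float)
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()                      # normalize fractional volumes
    return float((v @ d**a) ** (1.0 / a))

# Hypothetical target DVH: mostly 60 Gy, with a cold spot and a hot spot.
d = [50.0, 60.0, 66.0]
v = [0.05, 0.85, 0.10]
print(geud(d, v, a=1))     # mean dose, 60.1 Gy
print(geud(d, v, a=10))    # serial-like: pulled towards the hot spot
print(geud(d, v, a=-10))   # target-like: penalized by the cold spot
```

A single exponent per structure gives the optimizer the transparent, tissue-specific control over the volume effect that the abstract argues dose-volume constraint sets lack.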
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs
Bermudez, Edgar F.
2012-01-01
of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs
Excluding joint probabilities from quantum theory
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
International Nuclear Information System (INIS)
Fitoussi, L.
1987-12-01
The dose limit is defined to be the level of harmfulness which must not be exceeded, so that an activity can be exercised in a regular manner without running a risk unacceptable to man and the society. The paper examines the effects of radiation categorised into stochastic and non-stochastic. Dose limits for workers and the public are discussed
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
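A minimal numerical sketch of Prelec's one-parameter weighting function, as reconstructed above; the parameter value α = 0.65 is an illustrative assumption, not a value from the abstract:

```python
import math

def prelec_w(p: float, alpha: float = 0.65, beta: float = 1.0) -> float:
    """Prelec (1998) probability weighting function
    w(p) = exp(-beta * (-ln p)**alpha), with w(0) = 0 and w(1) = 1.
    For beta = 1 the curve crosses the diagonal at the fixed point p = 1/e.
    """
    if p <= 0.0:
        return 0.0
    return math.exp(-beta * (-math.log(p)) ** alpha)
```

For 0 < α < 1 this reproduces the inverse-S shape reported in prospect theory: small probabilities are overweighted (e.g. `prelec_w(0.01) > 0.01`) and large probabilities underweighted (`prelec_w(0.99) < 0.99`).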
Research and assessment of national population dose
International Nuclear Information System (INIS)
Pan Ziqiang
1984-01-01
This article describes the necessity and feasibility of conducting research on the assessment of national population dose, and discusses some problems that deserve attention in this work. (author)
International Nuclear Information System (INIS)
Alvarez R, J.T.; Anaya M, R.A.
2004-01-01
With the purpose of resolving the controversy over the linear no-threshold hypothesis that underlies the dose-limitation systems of the ICRP 26 and 60 recommendations, at the end of the last decade R. Clarke, president of the ICRP, proposed the concept of Controllable Dose: the dose, or sum of doses, that an individual receives from a particular source and that can reasonably be controlled by any means. This concept proposes a change in the philosophy of radiological protection, from a concern with societal criteria to a focus on the individual. This work presents an overview of the foundations, advantages and disadvantages that this proposal has raised in the international radiological protection community, with the purpose of familiarizing the Mexican radiological protection community with these new concepts. (Author)
Patient dose measurement and dose reduction in chest radiography
Directory of Open Access Journals (Sweden)
Milatović Aleksandra A.
2014-01-01
Full Text Available Investigations presented in this paper represent the first estimation of patient doses in chest radiography in Montenegro. In the initial stage of our study, we measured the entrance surface air kerma and kerma area product for chest radiography in five major health institutions in the country. A total of 214 patients were observed. We reported the mean value, minimum and third quartile values, as well as maximum values of surface air kerma and kerma area product of patient doses. In the second stage, the possibilities for dose reduction were investigated. Mean kerma area product values were 0.8 ± 0.5 Gycm2 for the posterior-anterior projection and 1.6 ± 0.9 Gycm2 for the lateral projection. The max/min ratio for the entrance surface air kerma was found to be 53 for the posterior-anterior projection and 88 for the lateral projection. Comparing the results obtained in Montenegro with results from other countries, we concluded that patient doses in our medical centres are significantly higher. Changes in exposure parameters and increased filtration contributed to a dose reduction of up to 36% for posterior-anterior chest examinations. The variability of the estimated dose values points to a significant space for dose reduction throughout the process of radiological practice optimisation.
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
THE BLACK HOLE FORMATION PROBABILITY
International Nuclear Information System (INIS)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-01-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment
The Black Hole Formation Probability
Clausen, Drew; Piro, Anthony L.; Ott, Christian D.
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
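The abstract does not give P_BH(M_ZAMS) in closed form; as a purely illustrative stand-in (the logistic shape, the midpoint mass, and the width below are assumptions, not the authors' fit), a probabilistic BH-versus-NS outcome of the kind a population synthesis code would consume can be sketched as:

```python
import math
import random

def p_bh(m_zams: float, m_mid: float = 21.0, width: float = 2.0) -> float:
    """Illustrative logistic P_BH(M_ZAMS): probability that a star of the
    given ZAMS mass (solar masses) collapses to a black hole rather than a
    neutron star. m_mid and width are assumed, not fitted, values."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_mid) / width))

def sample_remnant(m_zams: float, rng: random.Random) -> str:
    """Draw one stochastic remnant outcome, as a population synthesis
    study might do per star."""
    return "BH" if rng.random() < p_bh(m_zams) else "NS"
```

The point of the framework is exactly this interface: downstream studies query a probability rather than a hard mass cut.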
Projected 24-hour post-dose ocular itching scores post-treatment with olopatadine 0.7% versus 0.2%.
Fidler, Matthew L; Ogundele, Abayomi; Covert, David; Sarangapani, Ramesh
2018-04-21
Olopatadine is an antihistamine and mast cell stabilizer used for treating allergic conjunctivitis. Olopatadine 0.7% has been recently approved for daily dosing in the US, which supersedes the previously approved 0.2% strength. The objective of this analysis was to characterize patients who have better itching relief at 24 h when taking olopatadine 0.7% treatment instead of olopatadine 0.2% (in terms of proportions of responses) and relate this to the severity of baseline itching as an indirect metric of a patient's sensitivity to antihistamines. A differential odds model was developed using data from two conjunctival allergen challenge (CAC) studies to characterize individual-level and population-level response to ocular itching following olopatadine treatment, and the data were analyzed retrospectively. This modeling analysis was designed to predict 24 h ocular itching scores and to quantify the differences in 24 h itching relief following treatment with olopatadine 0.2% versus 0.7% in patients with moderate-to-high baseline itching. A one-compartment kinetic-pharmacodynamic Emax model was used to determine the effect of olopatadine. The impacts of baseline itching severity, the vehicle effect, and the drug effect on the overall itching scores post-treatment were explicitly incorporated in the model. The model quantified trends observed in the clinical data with regard to both mean scores and the proportions of patients responding to olopatadine treatment. The model predicts that a higher proportion of patients in the olopatadine 0.7% versus 0.2% group will experience relief within 24 h. This prediction was confirmed with retrospective clinical data analysis. The number of allergy patients relieved with olopatadine 0.7% increased with higher baseline itching severity scores, when compared to olopatadine 0.2%.
The irradiation tolerance dose of the proximal vagina
International Nuclear Information System (INIS)
Au, Samuel P.; Grigsby, Perry W.
2003-01-01
Purpose: The purpose of this investigation was to determine the irradiation tolerance level and complication rates of the proximal vagina to combined external irradiation and low dose rate (LDR) brachytherapy. Also, the mucosal tolerance for fractionated high dose rate (HDR) brachytherapy is further projected based on the biologically effective dose (BED) of LDR for an acceptable complication rate. Materials and methods: Two hundred seventy-four patients with stages I-IV cervical carcinoma treated with irradiation therapy alone from 1987 to 1997 were retrospectively reviewed for radiation-associated late sequelae of the proximal vagina. All patients received LDR brachytherapy and 95% also received external pelvic irradiation. Follow-up ranged from 15 to 126 months (median, 43 months). The proximal vaginal mucosa dose from a single ovoid (single source) or from both ovoids plus the tandem (all sources), together with the external irradiation dose, was used to derive the probability of a complication using the maximum likelihood logistic regression technique. The BED based on the linear-quadratic model was used to compute the corresponding tolerance levels for LDR or HDR brachytherapy. Results: Grades 1 and 2 complications occurred in 10.6% of patients and Grade 3 complications occurred in 3.6%. There were no Grade 4 complications. Complications occurred from 3 to 71 months (median, 7 months) after completion of irradiation, with over 60% occurring in the first year. By logistic regression analysis, both the mucosal dose from a single ovoid and that from all sources, combined with the external irradiation dose, demonstrated a statistically significant fit to the dose-response complication curves (both with P=0.016). The single source dose was highly correlated with the all source dose, with a cross-correlation coefficient of 0.93. The all source dose was approximately 1.4 times the single source dose. Over the LDR brachytherapy dose rate range, the complication rate was
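The fractionated (HDR) side of the BED comparison described above can be sketched with the standard linear-quadratic expression; the schedules and the α/β value below are illustrative assumptions, not values from the study:

```python
def bed_fractionated(n_fractions: int, dose_per_fraction: float,
                     alpha_beta: float) -> float:
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (
        1.0 + dose_per_fraction / alpha_beta)

# Two hypothetical HDR schedules compared for a late-responding tissue
# (alpha/beta = 3 Gy is a conventional choice for late effects):
print(bed_fractionated(5, 6.0, 3.0))  # 5 fractions x 6 Gy
print(bed_fractionated(3, 8.0, 3.0))  # 3 fractions x 8 Gy
```

Matching the BED of an LDR regimen at an acceptable complication rate then amounts to solving this expression for n and d.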
Foundations of the theory of probability
Kolmogorov, AN
2018-01-01
This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
DEFF Research Database (Denmark)
Ramlov, Anne; Assenholt, Marianne S; Jensen, Maria F
2017-01-01
PURPOSE: To implement coverage probability (CovP) for dose planning of simultaneous integrated boost (SIB) of pathologic lymph nodes in locally advanced cervical cancer (LACC). MATERIAL AND METHODS: CovP constraints for SIB of the pathological nodal target (PTV-N) with a central dose peak...
Void probability scaling in hadron nucleus interactions
International Nuclear Information System (INIS)
Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima
2002-01-01
Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability
Dose. Detriment. Limit assessment
International Nuclear Information System (INIS)
Breckow, J.
2015-01-01
One goal of radiation protection is the limitation of stochastic effects due to radiation exposure. The probability of occurrence of a radiation-induced stochastic effect, however, is only one of several parameters which determine the radiation detriment. Though the ICRP concept of detriment is quantitatively defined, the weighting of detriment includes somewhat subjective elements. In this sense, the detriment concept of the ICRP already represents, at the stage of effective dose, a kind of assessment. Thus, when comparing radiation protection standards and concepts with one another, or with those of environmental or occupational protection, one should be aware of the possibly different principles of detriment assessment.
Dose/dose-rate responses of shrimp larvae to UV-B radiation
International Nuclear Information System (INIS)
Damkaer, D.M.
1981-01-01
Previous work indicated dose-rate thresholds in the effects of UV-B on the near-surface larvae of three shrimp species. Additional observations suggest that the total dose response varies with dose rate. Below an irradiance of 0.002 W m^-2 (DNA-weighted) no significant effect is noted on activity, development, or survival. Beyond that dose-rate threshold, shrimp larvae are significantly affected if the total dose exceeds about 85 J m^-2 (DNA-weighted). Predictions cannot be made without both the dose rate and the dose. These dose/dose-rate thresholds are compared to four-year mean dose/dose-rate solar UV-B irradiances at the experimental site, measured at the surface and calculated for 1 m depth. The probability that the shrimp larvae would receive a lethal irradiance is low for the first half of the season of surface occurrence, even with a 44% increase in damaging UV radiation. (orig.)
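The two-threshold logic of the abstract (an effect is predicted only when both the dose-rate and the total-dose thresholds are exceeded) can be sketched as:

```python
# Thresholds reported in the abstract, in DNA-weighted units.
DOSE_RATE_THRESHOLD = 0.002  # W m^-2 (DNA-weighted)
TOTAL_DOSE_THRESHOLD = 85.0  # J m^-2 (DNA-weighted)

def significant_uv_effect(dose_rate: float, total_dose: float) -> bool:
    """A significant effect on larval activity, development, or survival
    is predicted only when BOTH thresholds are exceeded; neither the dose
    rate nor the total dose alone suffices."""
    return (dose_rate > DOSE_RATE_THRESHOLD
            and total_dose > TOTAL_DOSE_THRESHOLD)
```

This is why, as the abstract stresses, predictions cannot be made from the dose or the dose rate in isolation.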
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Impact of the probability of causation on the radiation protection program
International Nuclear Information System (INIS)
Meinhold, C.B.
1988-01-01
Although the probability of causation approach is the only scientific basis on which a given cancer can be judged to be causally related to a given exposure, the impact of this concept on the radiation safety program could be counter-productive. As health physicists, the practices and the concepts we employ have been developed to protect the worker. Effective dose equivalent and committed dose equivalent are protective concepts but useless for probability of causation analysis. Perhaps extensive records will be the only way that good radiation protection and probability of causation analysis can coexist
Energy Technology Data Exchange (ETDEWEB)
Nakaura, Takeshi; Iyama, Yuji; Kidoh, Masafumi; Yokoyama, Koichi [Amakusa Medical Center, Diagnostic Radiology, Amakusa, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Oda, Seitaro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Tokuyasu, Shinichi [Philips Electronics, Kumamoto (Japan); Harada, Kazunori [Amakusa Medical Center, Department of Surgery, Kumamoto (Japan)
2016-03-15
The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods, were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs in 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)
Dependent Human Error Probability Assessment
International Nuclear Information System (INIS)
Simic, Z.; Mikulicic, V.; Vukovic, I.
2006-01-01
This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and an estimate of the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
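The five dependence levels named above map to conditional probabilities via the standard THERP equations of NUREG/CR-1278; the sketch below assumes those textbook formulas, which the abstract itself does not quote:

```python
def conditional_hep(basic_hep: float, level: str) -> float:
    """Conditional human error probability for a subsequent action, given
    failure of the preceding one, per the THERP dependence equations in
    NUREG/CR-1278. basic_hep is the action's independent (basic) HEP."""
    p = basic_hep
    formulas = {
        "ZD": p,                  # zero dependence: unchanged
        "LD": (1 + 19 * p) / 20,  # low dependence
        "MD": (1 + 6 * p) / 7,    # moderate dependence
        "HD": (1 + p) / 2,        # high dependence
        "CD": 1.0,                # complete dependence: certain failure
    }
    return formulas[level]
```

Note how strongly the level matters for small HEPs: a basic HEP of 0.01 becomes about 0.06 under LD and 0.5 under HD, which is why the judged level drives the CDF impact.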
International Nuclear Information System (INIS)
Rantalainen, L.
1996-01-01
The estimated annual radiation doses to Finns have been reduced in the recent years without any change in the actual radiation environment. This is because the radiation types have been changed. The risk factors will probably be changed again in the future, because recent studies show discrepancies in the neutron dosimetry concerning the city of Hiroshima. Neutron dosimetry discrepancy has been found between the predicted and estimated neutron radiation. The prediction of neutron radiation is calculated by Monte Carlo simulations, which have also been used when designing recommendations for the limits of radiation doses (ICRP60). Estimation of the neutron radiation is made on the basis of measured neutron activation of materials in the city. The estimated neutron dose beyond 1 km is two to ten, or more, times as high as the predicted dose. This discrepancy is important, because the most relevant distances with respect to radiation risk evaluation are between 1 and 2 km. Because of this discrepancy, the present radiation risk factors for gamma and neutron radiation, which rely on the Monte Carlo calculations, are false, too. The recommendations of ICRP60 have been adopted in a few countries, including Finland, and they affect the planned common limits of the EU. It is questionable whether happiness is increased by adopting false limits, even if they are common. (orig.) (2 figs., 1 tab.)
Probability, random processes, and ergodic properties
Gray, Robert M
1988-01-01
This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...
International Nuclear Information System (INIS)
Tirmarche, M.; Hubert, P.
1992-01-01
Currently, epidemiological studies must establish whether the assessment of cancer risk can be verified at low chronic radiation doses. Population surveillance must extend over a very long period, since the side effects and cancers of such radiation appear much later. In France, such an epidemiological study of nuclear workers was decided on recently. Before describing the experiment and the French projects in the epidemiology of nuclear workers, the authors present the main British and American studies
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
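The integral-equation machinery itself is beyond a short sketch, but the quantity being approximated, the first-passage probability of failure, can be estimated by brute-force Monte Carlo for comparison; the scaled-random-walk response model below is an assumption chosen only for simplicity, not the paper's oscillator model:

```python
import random

def first_passage_probability(barrier: float, n_steps: int, dt: float,
                              n_paths: int = 20000, seed: int = 1) -> float:
    """Monte Carlo estimate of the probability that the response exceeds
    the barrier at least once during the observation window. The response
    is modeled as a Gaussian random walk (discretized Brownian motion)
    standing in for the random vibration response."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, 1.0) * dt ** 0.5
            if x >= barrier:
                failures += 1
                break  # first passage: this path has failed
    return failures / n_paths
```

Such a sampling estimate converges slowly for high barriers, which is precisely the regime where the integral-equation approximations are attractive.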
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, presents a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
What is the probability that radiation caused a particular cancer
International Nuclear Information System (INIS)
Voelz, G.L.
1983-01-01
Courts, lawyers, health physicists, physicians, and others are searching for a credible answer to the question posed in the title of this paper. The cases in which the question arises frequently stem from an individual who has cancer and is convinced, or whose next-of-kin are convinced, that a past radiation exposure - usually small - is responsible for causing it. An arithmetic expression of this problem is simple: the probability of causation by the radiation dose in question is equal to the risk of cancer from the radiation dose divided by the risk of cancer from all causes. The application of risk factors to this equation is not so simple. It must involve careful evaluation of the reliability of, and variations in, risk coefficients for the development of cancer due to radiation exposure, other carcinogenic agents, and natural causes for the particular individual. Examination of our knowledge of these various factors indicates that a large range in the answers can result from the variability and imprecision of the data. Nevertheless, attempts to calculate the probability that radiation caused the cancer are extremely useful to provide a gross perspective on the probability of causation. It will likely rule in or out a significant number of cases despite the limitations in our understanding of the etiology of cancer and the risks from various factors. For the remaining cases, a thoughtful and educated judgment based on selected data and the circumstances of the case will also be needed before the expert can develop and support his opinion
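The arithmetic expression stated in the abstract is direct to encode; the numeric values in the example are hypothetical, chosen only to show the order of magnitude involved:

```python
def probability_of_causation(radiation_risk: float,
                             all_causes_risk: float) -> float:
    """PC = (risk of cancer from the radiation dose in question)
         / (risk of cancer from all causes, radiation included)."""
    return radiation_risk / all_causes_risk

# Hypothetical example: a dose adding 0.01 to a 0.25 baseline lifetime
# risk gives PC = 0.01 / 0.26, i.e. roughly 4%.
pc = probability_of_causation(0.01, 0.26)
```

As the abstract stresses, the difficulty lies not in this division but in the large uncertainty of both risk coefficients for the particular individual.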
Energy Technology Data Exchange (ETDEWEB)
Ulanowski, Alexander; Eidemueller, Markus; Guethlin, Denise; Kaiser, Jan Christian; Shemiakina, Elena; Jacob, Peter [Helmholtz Zentrum Muenchen - Deutsches Forschungszentrum fuer Gesundheit und Umwelt, Muenchen (Germany). Inst fuer Strahlenschutz
2016-11-15
Methodology and a corresponding computer program, ProZES, were developed to estimate the probability that a previous radiation exposure of a specific person, in a given exposure situation, has resulted in cancer (the probability of causation, or relationship between the exposure and the disease, Z). ProZES can provide the scientific basis to support decisions on compensation claims for cancer following occupational exposure to radiation. Starting from the results achieved in the first version of ProZES, in which the general methodology and risk models for colon, stomach, lung, and female breast were implemented, the second stage of the ProZES development focused on risk models for all other cancer sites, including leukaemias and lymphomas, as well as risk models for lung cancer after exposure to radon. The models for estimating the cancer risks and the associated probability Z are mostly based on the observed cancer incidence in the cohort of the atomic bomb survivors in Hiroshima and Nagasaki. Most of the models are newly developed for the project. For the frequent types of cancer, specific models of radiation risk have been developed, while for the less common diseases the risk models were developed for groups of functionally similar diseases. Since various models built on the same data can result in different predictions for dose-effect relationships, the method of multi-model inference is used for some types of cancer to derive risk factors which are less dependent on individual models and which take model uncertainties into account. Risk estimates for the Japanese population must be transferred to the German population. An essential element is the estimation of the uncertainty of the associated probability. ProZES was developed as a user-friendly stand-alone program, which can assess and present the individualised estimate of the probability of a relationship between radiation exposure and the disease.
International Nuclear Information System (INIS)
Li Yunhai; Liao Yuan; Zhou Lijun; Pan Ziqiang; Feng Yan
2003-01-01
Objective: On the basis of physical dose optimization, the LQ model was used to investigate the difference between curves of biological effective dose and physical isodose, and to discuss the influence of applying the biological dose concept to three-dimensional conformal radiotherapy of prostate carcinoma. Methods: Four treatment plans were designed for physical dose optimization: three fields, four box fields, five fields and six fields. Target dose uniformity and protection of the critical tissue (rectum) were used as the principal criteria for designing the treatment plans. Biological effective dose (BED) was calculated with the LQ model. The difference between the BED curve drawn in the central layer and the physical isodose curve was studied, as was the difference between the adjusted physical dose (APD) and the physical dose. Results: The five-field plan was the best in target dose uniformity and protection of the rectum. The physical dose was uniform in the target, but the biological effective doses showed great discrepancy in the biological model. The adjusted physical dose distribution also displayed a larger discrepancy than the unadjusted physical dose. Conclusions: Intensity-modulated radiotherapy (IMRT) with inverse planning using the biological dose concept may be much more advantageous for reaching a high tumor control probability and a low normal tissue complication probability.
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture and from marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
Microbeams, microdosimetry and specific dose
International Nuclear Information System (INIS)
Randers-Pehrson, H.
2002-01-01
Dose and its usefulness as a single parameter to describe the amount of radiation absorbed are well established for most situations. The conditions where the concept of dose starts to break down are well known, mostly from the study of microdosimetry. For low doses of high LET radiation it is noted that the process of taking the limiting value of the energy absorbed within a test volume divided by the mass within that volume yields either zero or a relatively large value. The problem is further exacerbated with microbeam irradiations where the uniformity of the energy deposition is experimentally manipulated on the spatial scale of cells being irradiated. Booz introduced a quantity to deal with these problems: the unfortunately named 'mean specific energy in affected volumes'. This quantity multiplied by the probability that a test volume has received an energy deposit is equal to dose (in situations where dose can be defined). I propose that Booz's quantity be renamed 'specific dose', that is the mean energy deposited divided by the mass within a specified volume. If we believe for instance that the nucleus of a cell is the critical volume for biological effects, we can refer to the nuclear specific dose. A microbeam experiment wherein 10 per cent of the cell nuclei were targeted with 10 alpha particles would be described as delivering a nuclear specific dose of 1.6 Gy to 10 per cent of the population. (author)
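Booz's relation as quoted in the abstract (conventional dose equals the specific dose multiplied by the probability of any energy deposit) can be checked against the microbeam example given there; the function name below is ours, introduced only for this illustration.

```python
# Booz's relation as quoted in the abstract: conventional dose equals the
# "specific dose" (mean energy per unit mass in the volumes actually hit)
# multiplied by the probability that a volume received any deposit.
# The helper name is ours, not the author's.

def conventional_dose(specific_dose_gy: float, hit_fraction: float) -> float:
    """Dose averaged over all test volumes, hit or not."""
    return specific_dose_gy * hit_fraction

# The microbeam example from the abstract: 10% of cell nuclei receive a
# nuclear specific dose of 1.6 Gy.
print(f"{conventional_dose(1.6, 0.10):.2f} Gy")  # 0.16 Gy
```

This makes concrete why dose alone is ambiguous for microbeam irradiations: very different (specific dose, hit fraction) pairs can yield the same conventional dose.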
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition. The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps, whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
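As a sketch of the maximum-entropy assignment the abstract describes: for a mean-energy constraint the MaxEnt solution is Boltzmann-like, p_i proportional to exp(-beta*E_i), with beta fixed by the constraint. The energy levels and the target mean below are illustrative assumptions, not taken from the paper.

```python
import math

# Maximum-entropy probability assignment under a mean-value constraint:
# p_i ∝ exp(-beta * E_i), with beta chosen so that sum_i E_i p_i equals
# the prescribed average. Energies and target mean are illustrative.

def maxent_distribution(energies, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Solve for beta by bisection and return the MaxEnt probabilities."""
    def mean_at(beta):
        weights = [math.exp(-beta * e) for e in energies]
        z = sum(weights)
        return sum(e * w for e, w in zip(energies, weights)) / z
    # mean_at is monotonically decreasing in beta, so bisection applies.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

# Three levels with energies 0, 1, 2 and a constrained mean energy of 0.8.
probs = maxent_distribution([0.0, 1.0, 2.0], target_mean=0.8)
print([round(p, 4) for p in probs])
```

With no constraint beyond normalization the same machinery returns the uniform distribution, which is the usual MaxEnt sanity check.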
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Defining Probability in Sex Offender Risk Assessment.
Elwood, Richard W
2016-12-01
There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
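The Bayesian view the author advocates can be illustrated with a minimal odds-form update. The base rate and likelihood ratio below are hypothetical, not values from any actuarial instrument.

```python
# A minimal Bayesian update of a single individual's risk, in the spirit of
# the abstract's argument that only Bayesian probability applies to the
# single case. The base rate and likelihood ratio are hypothetical
# illustrations, not values from any real actuarial scale.

def posterior_probability(base_rate: float, likelihood_ratio: float) -> float:
    """Update a prior probability by a likelihood ratio via Bayes' theorem
    in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = base_rate / (1.0 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical: 10% base rate of recidivism, and an assessment result
# whose likelihood ratio is 3.
print(f"{posterior_probability(0.10, 3.0):.3f}")  # 0.250
```

The odds form makes the Frequentist/Bayesian distinction concrete: the base rate is group-level data, but the posterior is a degree of belief about one person.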
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
Bayesian estimation of dose rate effectiveness
International Nuclear Information System (INIS)
Arnish, J.J.; Groer, P.G.
2000-01-01
A Bayesian statistical method was used to quantify the effectiveness of high dose rate 137Cs gamma radiation at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice. The Bayesian approach considers both the temporal and dose dependence of radiation carcinogenesis and total mortality. This paper provides the first direct estimation of dose rate effectiveness using Bayesian statistics. This statistical approach provides a quantitative description of the uncertainty of the factor characterising the dose rate in terms of a probability density function. The results show that a fixed dose from 137Cs gamma radiation delivered at a high dose rate is more effective at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice than the same dose delivered at a low dose rate. (author)
International Nuclear Information System (INIS)
Csete, I.; Leiton, A.G.; Sochor, V.; Lapenas, A.; Grindborg, J.E.; Jokelainen, I.; Bjerke, H.; Dobrovodsky, J.; Megzifene, A.; Hourdakis, C.J.; Ivanov, R.; Vekic, B.; Kokocinski, J.; Cardoso, J.; Buermann, L.; Tiefenboeck, W.; Stucki, G.; Van Dijk, E.; Toni, M.P.; Minniti, R.; McCaffrey, J.P.; Silva, C.N.M.; Kharitonov, I.; Webb, D.; Saravi, M.; Delaunay, F.
2010-01-01
The results of an unprecedented international effort involving 26 countries are reported. The EUROMET.RI(I)-K1 and EUROMET.RI(I)-K4 key comparisons were conducted with the goal of supporting the relevant calibration and measurement capabilities (CMC) planned for publication by the participant laboratories. The measured quantities were the air kerma (Kair) and the absorbed dose to water (Dw) in 60Co radiotherapy beams. The comparison was conducted by the pilot laboratory MKEH (Hungary), in a star-shaped arrangement from January 2005 to December 2008. The calibration coefficients of four transfer ionization chambers were measured using two electrometers. The largest deviation between any two calibration coefficients for the four chambers in terms of air kerma and absorbed dose to water was 2.7% and 3.3% respectively. An analysis of the participant uncertainty budgets enabled the calculation of degrees of equivalence (DoE), in terms of the deviations of the results and their associated uncertainties. As a result of this EUROMET project 813 comparison, the BIPM key comparison database (KCDB) will include eleven new Kair and fourteen new Dw DoE values of European secondary standard dosimetry laboratories (SSDLs), and the KCDB will be updated with the new DoE values of the other participant laboratories. The pair-wise degrees of equivalence of participants were also calculated. In addition to assessing calibration techniques and uncertainty calculations of the participants, these comparisons enabled the experimental determinations of NDw/NKair ratios in the 60Co gamma radiation beam for the four radiotherapy transfer chambers. (authors)
Energy Technology Data Exchange (ETDEWEB)
Csete, I. [National Office of Measures (OMH) - pilot laboratory and corresponding author (Hungary); Leiton, A.G. [Research Centre for Energy, Environment and Technology (CMRI-CIEMAT) (Spain); Sochor, V. [Czech Metrology Institute (CMI) (Czech Republic); Lapenas, A. [Latvian National Metrology Center (LNMC-RMTC) (Latvia); Grindborg, J.E. [Swedish Radiation Protection Authority (SSI) (Sweden); Jokelainen, I. [Radiation and Nuclear Safety Authority (STUK) (Finland); Bjerke, H. [Norwegian Radiation Protection Authority (NRPA) (Norway); Dobrovodsky, J. [Slovak Institute of Metrology (SMU) (Slovakia); Megzifene, A. [International Atomic Energy Agency, IAEA, Vienna (Austria); Hourdakis, C.J. [Hellenic Atomic Energy Committee (HAEC-HIRCL) (Greece); Ivanov, R. [National Centre of Metrology (NCM) (Bulgaria); Vekic, B. [Rudjer Boskovic Institute (IRB) (Croatia); Kokocinski, J. [Central Office of Measures (GUM) (Poland); Cardoso, J. [Institute for Nuclear Technology (ITN-LMRIR) (Portugal); Buermann, L. [Physikalisch Technische Bundesanstalt (PTB) (Germany); Tiefenboeck, W. [Bundesamt fur Eich und Vermesungswesen (BEV) (Austria); Stucki, G. [Bundesamt fur Metrologie (METAS) (Switzerland); Van Dijk, E. [NMi Van Swinden Laboratorium (NMi) (Netherlands); Toni, M.P. [ENEA-CR Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA) (Italy); Minniti, R. [National Institute of Standards and Technology (NIST) (United States); McCaffrey, J.P. [National Research Council Canada (NRC) (Canada); Silva, C.N.M. [National Metrology Laboratory of Ionizing Radiation (LNMRI-IRD) (Brazil); Kharitonov, I. [D I Mendeleyev Institute for Metrology (VNIIM) (RU); Webb, D. [Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) (Australia); Saravi, M. [National Atomic Energy Commission (CNEA-CAE) (Argentina); Delaunay, F. [Laboratoire National Henri Becquerel (LNE-LNHB) (France)
2010-06-15
The results of an unprecedented international effort involving 26 countries are reported. The EUROMET.RI(I)-K1 and EUROMET.RI(I)-K4 key comparisons were conducted with the goal of supporting the relevant calibration and measurement capabilities (CMC) planned for publication by the participant laboratories. The measured quantities were the air kerma (K{sub air}) and the absorbed dose to water (Dw) in {sup 60}Co radiotherapy beams. The comparison was conducted by the pilot laboratory MKEH (Hungary), in a star-shaped arrangement from January 2005 to December 2008. The calibration coefficients of four transfer ionization chambers were measured using two electrometers. The largest deviation between any two calibration coefficients for the four chambers in terms of air kerma and absorbed dose to water was 2.7% and 3.3% respectively. An analysis of the participant uncertainty budgets enabled the calculation of degrees of equivalence (DoE), in terms of the deviations of the results and their associated uncertainties. As a result of this EUROMET project 813 comparison, the BIPM key comparison database (KCDB) will include eleven new Kair and fourteen new D{sub w} DoE values of European secondary standard dosimetry laboratories (SSDLs), and the KCDB will be updated with the new DoE values of the other participant laboratories. The pair-wise degrees of equivalence of participants were also calculated. In addition to assessing calibration techniques and uncertainty calculations of the participants, these comparisons enabled the experimental determinations of N{sub Dw}/N{sub Kair} ratios in the {sup 60}Co gamma radiation beam for the four radiotherapy transfer chambers. (authors)
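A degree of equivalence is reported above as a deviation from the reference value paired with its associated uncertainty. One common convention (uncorrelated standard uncertainties, coverage factor k = 2) can be sketched as follows; the abstract does not state the exact convention used, and the calibration coefficients below are invented.

```python
import math

# A sketch of a degree-of-equivalence (DoE) calculation in one common
# convention: the deviation of a laboratory's result from the reference
# value, with an expanded (k = 2) uncertainty of that deviation. Whether
# this exact convention was used in the comparison is an assumption, and
# the calibration coefficients below are invented for illustration.

def degree_of_equivalence(lab_value, lab_u, ref_value, ref_u, k=2.0):
    """Return (deviation, expanded uncertainty of the deviation),
    assuming the two standard uncertainties are uncorrelated."""
    deviation = lab_value - ref_value
    expanded_u = k * math.sqrt(lab_u ** 2 + ref_u ** 2)
    return deviation, expanded_u

# Invented calibration coefficients (arbitrary units).
d, U = degree_of_equivalence(lab_value=1.003, lab_u=0.002,
                             ref_value=1.000, ref_u=0.001)
print(f"DoE: {d:+.3f} +/- {U:.4f}")
```

A result is then conventionally judged consistent with the reference when |d| does not exceed U.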
Is probability of frequency too narrow?
International Nuclear Information System (INIS)
Martz, H.F.
1993-01-01
Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
Calculating radiation exposure and dose
International Nuclear Information System (INIS)
Hondros, J.
1987-01-01
This paper discusses the methods and procedures used to calculate the radiation exposures and radiation doses to designated employees of the Olympic Dam Project. Each of the three major exposure pathways is examined. These are: gamma irradiation, radon daughter inhalation and radioactive dust inhalation. A further section presents the ICRP methodology for combining individual pathway exposures to give a total dose figure. Computer programs used for calculations and data storage are also presented briefly.
Collective dose commitments from nuclear power programmes
International Nuclear Information System (INIS)
Beninson, D.
1977-01-01
The concepts of collective dose and collective dose commitment are discussed, particularly regarding their use to compare the relative importance of the exposure from several radiation sources and to predict future annual doses from a continuing practice. The collective dose commitment contributions from occupational exposure and population exposure due to the different components of the nuclear power fuel cycle are evaluated. A special discussion is devoted to exposures delivered over a very long time by released radionuclides of long half-lives and to the use of the incomplete collective dose commitment. The maximum future annual 'per caput' doses from present and projected nuclear power programmes are estimated.
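The collective dose concept discussed above can be illustrated with a toy computation: a sum over population groups of group size times mean individual dose (person-sievert), with the per-caput dose obtained by dividing by the total population. The group sizes and doses below are hypothetical.

```python
# Collective dose as the sum over population groups of (group size) times
# (mean individual dose), in person-sievert; per-caput dose is the
# collective dose divided by the total population. All figures here are
# hypothetical illustrations.

groups = [
    {"size": 1_000, "mean_dose_sv": 2e-3},      # e.g. occupationally exposed
    {"size": 1_000_000, "mean_dose_sv": 5e-6},  # e.g. general population
]

collective_dose = sum(g["size"] * g["mean_dose_sv"] for g in groups)
per_caput = collective_dose / sum(g["size"] for g in groups)
print(f"collective dose: {collective_dose:.1f} person-Sv")
print(f"per-caput dose: {per_caput:.2e} Sv")
```

The commitment variant integrates such contributions over the future duration of a practice, which is why long-lived radionuclides dominate it.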
Dose rate constants for new dose quantities
International Nuclear Information System (INIS)
Tschurlovits, M.; Daverda, G.; Leitner, A.
1992-01-01
Conceptual changes and new quantities have made it necessary to reassess dose rate quantities. Calculations of the dose rate constant were done for air kerma, ambient dose equivalent and directional dose equivalent. The number of radionuclides considered is more than 200. The threshold energy is selected as 20 keV for the dose equivalent constants. The dose rate constant for the photon equivalent dose, used mainly in German-speaking countries as a temporary quantity, is also included. (Author)
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship
Probability of Grounding and Collision Events
DEFF Research Database (Denmark)
Pedersen, Preben Terndrup
1996-01-01
To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability
Introducing Disjoint and Independent Events in Probability.
Kelly, I. W.; Zwiers, F. W.
Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
Selected papers on probability and statistics
2009-01-01
This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.
Collective probabilities algorithm for surface hopping calculations
International Nuclear Information System (INIS)
Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto
2003-01-01
General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
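The core idea behind collective probabilities, choosing hop probabilities so that the ensemble-averaged classical populations track the quantum populations, can be illustrated with a two-state toy. This is a schematic of that idea only, not the actual CP, IP, or fewest-switches formulas, and the population curve below is invented.

```python
import random

# Toy illustration of population-matching hops: each step's hop probability
# is taken from the change in a prescribed "quantum" population so that, on
# average over the ensemble, the classical state fractions track it. Not
# the real CP/IP/FS equations; the population curve is invented.

random.seed(0)

def track_population(p1_curve, n_traj=20_000):
    """Evolve an ensemble of trajectories on states 0/1 so that the
    fraction on state 1 follows the prescribed population p1."""
    states = [0] * n_traj  # all trajectories start on state 0
    p1_prev = p1_curve[0]
    for p1 in p1_curve[1:]:
        delta = p1 - p1_prev
        if delta > 0:  # hop 0 -> 1 with probability delta / p0
            hop_p = delta / (1.0 - p1_prev)
            for i, s in enumerate(states):
                if s == 0 and random.random() < hop_p:
                    states[i] = 1
        elif delta < 0:  # hop 1 -> 0 with probability -delta / p1
            hop_p = -delta / p1_prev
            for i, s in enumerate(states):
                if s == 1 and random.random() < hop_p:
                    states[i] = 0
        p1_prev = p1
    return sum(states) / n_traj

# Prescribed population of state 1 rising from 0 to 0.6 in small steps.
curve = [0.06 * k for k in range(11)]
frac = track_population(curve)
print(f"classical fraction on state 1: {frac:.3f} (target: 0.600)")
```

Scaling the hop probability by the current-state population is what keeps the expected number of hops low, the property the abstract attributes to the CP construction.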
Examples of Neutrosophic Probability in Physics
Directory of Open Access Journals (Sweden)
Fu Yuhua
2015-01-01
This paper re-discusses the problems of the so-called "law of nonconservation of parity" and "accelerating expansion of the universe", and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.
Eliciting Subjective Probabilities with Binary Lotteries
DEFF Research Database (Denmark)
Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd
objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...
Probability Issues in without Replacement Sampling
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
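The standard multiplication rule for sampling without replacement can be made concrete with exact rational arithmetic; the two-red-cards example below is ours, not from the paper.

```python
from fractions import Fraction

# Sampling without replacement: the probability of drawing two red cards
# in a row from a standard 52-card deck, via the multiplication rule
# P(A and B) = P(A) * P(B | A). Exact rationals avoid rounding issues.

p_first_red = Fraction(26, 52)
p_second_red_given_first = Fraction(25, 51)  # one red card already removed
p_both_red = p_first_red * p_second_red_given_first
print(p_both_red)  # 25/102
```

The with-replacement answer would be (26/52)^2 = 1/4, slightly larger, which is exactly the contrast such courses use to motivate conditional probability.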
Probability: A Matter of Life and Death
Hassani, Mehdi; Kippen, Rebecca; Mills, Terence
2016-01-01
Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
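A life table's central quantities can be computed in a few lines: from one-year death probabilities q_x one builds survivorship l_x and a (curtate) life expectancy. The q_x values below are invented for a toy five-age table, not real demographic data.

```python
# A minimal life-table computation: from hypothetical one-year death
# probabilities q_x, derive survivorship l_x and the curtate life
# expectancy at age 0. The q_x values are invented toy data.

q = [0.01, 0.02, 0.05, 0.10, 1.0]  # q at ages 0..4; no one survives age 4

l = [1.0]  # l_x: probability of surviving to exact age x
for qx in q:
    l.append(l[-1] * (1.0 - qx))

# Curtate expectation of life at age 0: sum of l_x for x >= 1, over l_0.
e0 = sum(l[1:]) / l[0]
print(f"life expectancy at birth (toy table): {e0:.2f} years")
```

Real life tables follow the same recursion over a full age range, which is why they condense "some essential features of the health of a population" into a few columns.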
Teaching Probability: A Socio-Constructivist Perspective
Sharma, Sashi
2015-01-01
There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.
Stimulus Probability Effects in Absolute Identification
Kent, Christopher; Lamberts, Koen
2016-01-01
This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…
47 CFR 1.1623 - Probability calculation.
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...
Simulations of Probabilities for Quantum Computing
Zak, M.
1996-01-01
It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without using any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
Against All Odds: When Logic Meets Probability
van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.
2017-01-01
This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can
An introduction to probability and stochastic processes
Melsa, James L
2013-01-01
Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.
The probability of the false vacuum decay
International Nuclear Information System (INIS)
Kiselev, V.; Selivanov, K.
1983-01-01
The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed, and a complete answer for (1+1) dimensions is given.
Probability elements of the mathematical theory
Heathcote, C R
2000-01-01
Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.
The transition probabilities of the reciprocity model
Snijders, T.A.B.
1999-01-01
The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
Probability numeracy and health insurance purchase
Dillingh, Rik; Kooreman, Peter; Potters, Jan
2016-01-01
This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.
The enigma of probability and physics
International Nuclear Information System (INIS)
Mayants, L.
1984-01-01
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of the kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed; quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Optimizing Probability of Detection Point Estimate Demonstration
Koshti, Ajay M.
2017-01-01
Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
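The binomial logic behind the classic 29-flaw demonstration can be sketched as follows. This is a generic illustration of the point estimate method under stated assumptions (number of flaws, allowed misses), not NASA's actual qualification procedure:

```python
from math import comb

def prob_passing_demo(true_pod, n_flaws=29, allowed_misses=0):
    """Probability of passing an n-flaw demonstration when each flaw is
    detected independently with probability `true_pod` (binomial model):
    pass iff the number of misses is <= allowed_misses."""
    return sum(
        comb(n_flaws, k) * (1.0 - true_pod) ** k * true_pod ** (n_flaws - k)
        for k in range(allowed_misses + 1)
    )

# The confidence statement behind "detect 29 of 29": if the true POD
# were only 0.90, the chance of detecting all 29 flaws is
# 0.9**29 ≈ 0.047 < 0.05, i.e. 90% POD at roughly 95% confidence.
p_pass_if_pod_090 = prob_passing_demo(0.90)
```

Raising `allowed_misses` (with a larger flaw set) trades a higher probability of passing against a weaker confidence statement, which is the optimization trade-off the abstract describes.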
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
Doses to the Norwegian population from naturally occuring radiation and from the Chernobyl fallout
International Nuclear Information System (INIS)
Strand, T.
1987-01-01
The doses to the Norwegian population from naturally occurring radiation are extensively reviewed. The annual population-weighted average dose equivalent to the Norwegian population from ²²²Rn and its daughters is estimated to be between 3.5 and 4.5 mSv. The average concentration of ²²⁰Rn daughters in Norwegian dwellings is most probably between 1.0 and 1.5 Bq m⁻³. The corresponding effective dose equivalent for ²²⁰Rn and its daughters is estimated to be between 0.4 and 0.6 mSv. The total annual collective dose equivalent from naturally occurring radiation in Norway is found to be between 21000 and 27000 man Sv. The doses to the Norwegian population from the Chernobyl fallout are briefly discussed. Based on the results of a 'food basket' project and supplementary data from about 30000 measurements on food samples in the first year after the reactor accident, the total annual effective dose equivalent from foodstuffs to an average Norwegian consumer during this first year is estimated to be 0.15 ± 0.002 mSv at the 95% confidence level. The per caput effective dose equivalent from external fallout gamma radiation in the first year after the Chernobyl accident is approximately 82 μSv in Norway
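As a rough consistency check on the collective dose figures, dividing by the Norwegian population recovers a per-caput annual dose of the expected magnitude. The population value of about 4.2 million is my assumption for the late 1980s; the abstract does not state it:

```python
# Collective dose (man Sv) -> per-caput dose (mSv), given a population.
population = 4.2e6                      # assumed Norwegian population, ~1987
collective_low, collective_high = 21_000.0, 27_000.0  # man Sv, from abstract

per_caput_low_mSv = collective_low / population * 1e3   # ≈ 5.0 mSv
per_caput_high_mSv = collective_high / population * 1e3  # ≈ 6.4 mSv
```

A per-caput total of roughly 5 to 6.5 mSv is consistent with the radon contribution of 3.5 to 4.5 mSv quoted above plus the other natural sources.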
International Nuclear Information System (INIS)
Shimada, Yoshio
2000-01-01
It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, which can calculate core damage probability in a short time; component failure probabilities were varied between 0 and 1, using either Japanese or American initiating event frequency data. The analysis showed: (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probability changes by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
Options and pitfalls of normal tissues complication probability models
International Nuclear Information System (INIS)
Dorr, Wolfgang
2011-01-01
Full text: Technological improvements in the physical administration of radiotherapy have led to increasing conformation of the treatment volume (TV) with the planning target volume (PTV), and of the irradiated volume (IV) with the TV. In this process of improving the physical quality of radiotherapy, the total volumes of organs at risk exposed to significant doses have decreased significantly, resulting in increased inhomogeneities in the dose distributions within these organs. This has created a need to identify and quantify volume effects in different normal tissues. Today, irradiated volume must be considered a 6th 'R' of radiotherapy, in addition to the 5 'Rs' defined by Withers and Steel in the mid-to-late 1980s. The current status of knowledge of these volume effects has recently been summarized for many organs and tissues by the QUANTEC (Quantitative Analysis of Normal Tissue Effects in the Clinic) initiative [Int. J. Radiat. Oncol. Biol. Phys. 76 (3) Suppl., 2010]. However, the concept of using dose-volume histogram parameters as a basis for dose constraints, even without applying any models for normal tissue complication probabilities (NTCP), rests on assumptions that are not met in routine clinical treatment planning. First, and most important, dose-volume histogram (DVH) parameters are usually derived from a single 'snapshot' CT scan, without considering physiological (urinary bladder, intestine) or radiation-induced (edema, patient weight loss) changes during radiotherapy. Also, individual variations, or different institutional strategies for delineating organs at risk, are rarely considered. Moreover, the reduction of the 3-dimensional dose distribution into a '2-dimensional' DVH parameter implies that the localization of the dose within an organ is irrelevant; there are ample examples that this assumption is not justified. Routinely used dose constraints also do not take into account that the residual function of an organ may be
International Nuclear Information System (INIS)
Clark, J.; Limbrick, A.
1996-01-01
This paper identifies and reviews the issues to be addressed and the procedures to be followed during the mobilisation of projects using LFG as an energy source. Knowledge of the procedures involved in project mobilisation, their sequence and probable timescales, is essential for efficient project management. It is assumed that the majority of projects will be situated on existing, licensed landfill sites and, in addition to complying with the relevant conditions of the waste management licence and original planning consent, any proposed developments on the site will require a separate planning consent. Experience in the UK indicates that obtaining planning permission rarely constitutes a barrier to the development of schemes for the utilisation of LFG. Even so, an appreciation of the applicable environmental and planning legislation is essential, as this will enable the developer to recognise the main concerns of the relevant planning authority at an early stage of the project, resulting in the preparation of an informed and well-structured application for planning permission. For an LFG utilisation scheme on an existing landfill site, the need to carry out an environmental assessment (EA) as part of the application for planning permission will, in virtually all cases, be discretionary. Even if not deemed necessary by the planning authority, an EA is a useful tool at the planning application stage, to identify and address potential problems and to support discussions with bodies such as the Environment Agency, from whom consents or authorisations may be required. Carrying out an EA can thus provide for more cost-effective project development and enhanced environmental protection. Typically, the principal contractual arrangements, such as the purchase of gas or the sale of electricity, will have been established before the project mobilisation phase. However, there are many other contractual arrangements that must be established, and consents and permits that may be
Acceptance Probability (P a) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with C_pk > 1.33 as performing well within statistical control; a C_pk of 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
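The C_pk-to-defect-rate figure cited at the end is a standard normal tail calculation, reproducible with the standard library. This is a generic sketch of that conversion, not code from the paper:

```python
from math import erfc, sqrt

def ppm_defective(cpk):
    """Defect rate (parts per million) for a centered normal process whose
    nearest specification limit sits 3*Cpk standard deviations from the
    mean; both tails fall outside the limits for a centered process."""
    z = 3.0 * cpk
    one_sided_tail = 0.5 * erfc(z / sqrt(2.0))  # P(X > z) for standard normal
    return 2.0 * one_sided_tail * 1e6

# Cpk = 4/3 puts the limits 4 sigma out: roughly 63 ppm defective,
# i.e. an acceptance probability above 99.99%.
ppm_at_cpk_133 = ppm_defective(4.0 / 3.0)
```

The same machinery underlies converting any normally distributed quality attribute with fixed limits into an acceptance probability.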
Assessing the clinical probability of pulmonary embolism
International Nuclear Information System (INIS)
Miniati, M.; Pistolesi, M.
2001-01-01
Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
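The score cut-offs and prevalences reported for the standardized clinical score map naturally onto a small lookup. The function below is an illustration built only from the figures in the abstract; the name and return structure are my own:

```python
def clinical_probability_class(score):
    """Map the clinical score for suspected PE to the probability class
    and the PE prevalence reported in the abstract:
    score <= 4 -> low (10%), 5-8 -> intermediate (38%), >= 9 -> high (81%)."""
    if score <= 4:
        return ("low", 0.10)
    elif score <= 8:
        return ("intermediate", 0.38)
    else:
        return ("high", 0.81)
```

Treating the class-specific prevalence as the pretest probability is exactly how such a score feeds the post-test probability calculation after further objective testing.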