Statistical and low dose response
The low dose response and the lower limit of detection of the Hanford dosimeter depend upon many factors, including the energy of the radiation, whether the exposure is to a single radiation type or to mixed fields, annealing cycles, environmental factors, and how well various batches of TLD materials are matched in the system. A careful statistical study and sensitivity analysis were performed to determine how these factors influence the response of the dosimeter system. The study includes estimates of the standard deviation of calculated dose for various mixed-field exposures from 0 to 1000 mrem.
Lack of Statistical Significance
Kehle, Thomas J.; Bray, Melissa A.; Chafouleas, Sandra M.; Kawano, Takuji
2007-01-01
Criticism has been leveled against the use of statistical significance testing (SST) in many disciplines. However, the field of school psychology has been largely devoid of critiques of SST. Inspection of the primary journals in school psychology indicated numerous examples of SST with nonrandom samples and/or samples of convenience. In this…
J.J.P. Jansen (Justin); A. Burdorf (Alex)
2003-01-01
BACKGROUND: In epidemiological studies on physical workloads and back complaints, among the important features in modelling dose-response relations are the measurement strategy of the exposure and the nature of the dose-response relation that is assumed. AIM: To evaluat
Anaman, K. A.; Ibrahim, N.
The effects on human health resulting from the January to April 1998 haze-related air pollution episode in Brunei Darussalam were analysed for five groups of diseases of the respiratory system. The analysis concentrated on the statistical estimation of dose-response functions which related the number of cases of respiratory diseases to the level of quality of the ambient environment as measured by the pollutant standards index (PSI) and other environmental variables. The total number of cases of the five groups of diseases was shown to be significantly related to PSI and temperature. Societal costs were also estimated. The results showed that societal costs were significantly related to PSI, temperature and relative humidity. Societal costs increased with higher PSI and relative humidity but decreased with increasing temperature.
Statistically significant relational data mining
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second are statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Self-play: statistical significance
Haworth, Guy McCrossan
2003-01-01
Heinz recently completed a comprehensive experiment in self-play using the FRITZ chess engine to establish the ‘decreasing returns’ hypothesis with specific levels of statistical confidence. This note revisits the results and recalculates the confidence levels of this and other hypotheses. These appear to be better than Heinz’ initial analysis suggests.
Poisson regression methods are used to describe dose-response relations for cancer mortality for a subcohort of 28,347 white male radiation workers. Age-specific baseline rates are described using both internal and external (US white male) rates. Regression analyses are based on an analytic data structure (ADS) that consists of a table of observed deaths, expected deaths, and person-years at risk for each combination of levels of seven risk factors. The factors are socioeconomic status, length of employment, birth cohort, age at risk, facility, internal exposure, and external exposure. Each observation in the ADS consists of the index value of each of the stratifying factors, the observed deaths, the expected deaths, the person-years, and the ten-year-lagged average cumulative dose. Regression diagnostics show that a linear exponential relative risk model is not appropriate for these data. Results are presented using a main effects model for factors other than external radiation, and an excess relative risk term for cumulative external radiation dose.
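The modelling setup this abstract describes, Poisson death counts with person-years entering as a log offset and a log-linear dose term, can be sketched as follows. The simulated strata, the two-parameter model, and the function name are illustrative assumptions, not the authors' seven-factor analysis.

```python
import numpy as np

def poisson_glm(X, y, offset, n_iter=30):
    """Poisson regression with a log link and an offset (here log person-years),
    fitted by Newton-Raphson.  Assumes the first column of X is an intercept."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log((y.sum() + 0.5) / np.exp(offset).sum())  # sane start value
    for _ in range(n_iter):
        mu = np.exp(offset + X @ beta)                 # expected deaths
        # Newton step: (X' W X)^-1 X' (y - mu) with W = diag(mu)
        beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))
    return beta

# hypothetical strata: person-years and a ten-year-lagged cumulative dose
rng = np.random.default_rng(0)
py = rng.uniform(1000, 5000, size=200)
dose = rng.uniform(0.0, 2.0, size=200)                 # illustrative units
true_b0, true_b1 = -6.0, 0.5
deaths = rng.poisson(py * np.exp(true_b0 + true_b1 * dose))
X = np.column_stack([np.ones(200), dose])
b0_hat, b1_hat = poisson_glm(X, deaths, np.log(py))
```

With a few thousand simulated deaths, the fitted dose coefficient lands close to the value used to generate the data.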
The thresholds for statistical and clinical significance
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per;
2014-01-01
BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not...... reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore...... probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance...
Assessing statistical significance of periodogram peaks
Baluev, Roman V
2007-01-01
The least-squares (or Lomb-Scargle) periodogram is a powerful tool which is used routinely in many branches of astronomy to search for periodicities in observational data. The problem of assessing statistical significance of candidate periodicities for different periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. They include an upper limit to the false alarm probability (or a lower limit to the significance). These estimations are tested numerically in order to establish regions of their practical applicability.
Assessing the statistical significance of periodogram peaks
Baluev, R. V.
2008-04-01
The least-squares (or Lomb-Scargle) periodogram is a powerful tool that is routinely used in many branches of astronomy to search for periodicities in observational data. The problem of assessing the statistical significance of candidate periodicities for a number of periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. These include an upper limit to the false alarm probability (or a lower limit to the significance). The estimations are tested numerically in order to establish regions of their practical applicability.
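As a rough sketch of the calculation these abstracts concern, the following computes a classic normalized Lomb-Scargle periodogram and the crude independent-frequencies bound on the false alarm probability, 1 - (1 - e^(-z))^M. The paper's analytic estimates are sharper; this is only the textbook baseline, run on simulated data.

```python
import numpy as np

def lomb_scargle_power(t, y, freqs):
    """Classic normalized Lomb-Scargle power z(f) at each trial frequency."""
    y = y - y.mean()
    var = y.var()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        # time offset tau makes the sine and cosine terms orthogonal
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s)) / (2 * var)
    return power

def fap_upper_bound(z, n_freqs):
    """Crude upper bound on the false alarm probability of the highest peak,
    treating the trial frequencies as independent."""
    return 1.0 - (1.0 - np.exp(-z)) ** n_freqs

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 50.0, 200))               # uneven sampling
y = np.sin(2 * np.pi * 0.7 * t) + 0.5 * rng.normal(size=200)
freqs = np.linspace(0.05, 2.0, 400)
z = lomb_scargle_power(t, y, freqs)
best = freqs[np.argmax(z)]
```

The injected 0.7 cycles-per-unit signal produces a dominant peak whose false alarm probability bound is vanishingly small.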
Dose-response analysis using R
Ritz, Christian; Baty, Florent; Streibig, Jens Carl;
2015-01-01
Dose-response analysis can be carried out using multi-purpose commercial statistical software, but except for a few special cases the analysis easily becomes cumbersome as relevant, non-standard output requires manual programming. The extension package drc for the statistical environment R provides...... a flexible and versatile infrastructure for dose-response analyses in general. The present version of the package, reflecting extensions and modifications over the last decade, provides a user-friendly interface to specify the model assumptions about the dose-response relationship and comes with a...
Social significance of community structure: Statistical view
Li, Hui-Jia
2015-01-01
Community structure analysis is a powerful tool for social networks that can simplify their topological and functional analysis considerably. However, since community detection methods involve random factors and real social networks obtained from complex systems always contain error edges, evaluating the significance of the partitioned community structure is an urgent and important question. In this paper, integrating the specific characteristics of real society, we present a novel framework for analyzing the significance of social communities. The dynamics of social interactions are modeled by identifying social leaders and corresponding hierarchical structures. Instead of a direct comparison with the average outcome of a random model, we compute the similarity of a given node with the leader by the number of common neighbors. To determine the membership vector, an efficient community detection algorithm is proposed based on the position of nodes and their corresponding leaders. Then, using log-likelihood sco...
Nonlinearity of dose responses in thermoluminescence dosimetry
All dose responses in thermoluminescence (TL) dosimetry can be described by a dose response function derived from the Poisson distribution. Two characteristic parameters of this function, the one-hit factor R and the characteristic dose D0, can be used to analyze the nonlinearity of TL responses. The one-hit factor R indicates whether there is a linear region in the dose response, and whether the response is linear-sublinear or linear-supralinear-sublinear. The characteristic dose D0 is used to compare the range of the linear region of the response and the sensitivity of the TLD. When coupled with physical mechanisms in the TL process, the dominant features of the nonlinear TL behavior observed in experiments can be explained. (8 refs., 7 figs., 1 tab.)
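A minimal sketch of the saturating one-hit behaviour the abstract refers to, assuming the filled-trap fraction follows 1 - exp(-D/D0); the paper's full two-parameter function with the one-hit factor R is not reproduced here.

```python
import numpy as np

def one_hit_response(dose, s0, d0):
    """Saturating one-hit (Poisson) dose response: the probability that a trap
    records at least one hit is 1 - exp(-D/D0).  Sketch of the simplest case
    only; the paper's full function also carries the one-hit factor R."""
    return s0 * (1.0 - np.exp(-dose / d0))

d0 = 100.0
low = one_hit_response(np.array([1.0, 2.0]), 1.0, d0)       # D << D0
high = one_hit_response(np.array([400.0, 800.0]), 1.0, d0)  # D >> D0
```

For doses well below D0 the response is linear (doubling the dose doubles the signal); well above D0 it saturates, which is the sublinear regime.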
Effect size as a supplement to statistical significance testing
Gašper Cankar; Boštjan Bajec
2003-01-01
Researchers in the field of psychology often face the situation that statistical significance depends largely on the sample size and its statistical power. Effect size is a statistical measure that can offer some solutions for constructive research, since it can overcome the problems that are connected to the sample size. This article presents statistical significance testing as we meet it in psychology and the usage of a smaller group of effect size measures – measures of the standardi...
A Statistical Significance Simulation Study for the General Scientist
Levman, Jacob
2011-01-01
When scientists perform an experiment, they normally acquire a set of measurements and are expected to demonstrate that their results are "statistically significant", thus confirming whatever hypothesis they are testing. The main method for establishing statistical significance involves demonstrating that there is a low probability that the observed experimental results were the product of random chance. This is typically defined as p < 0.05, which indicates there is less than a 5% chance tha...
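The point that p < 0.05 corresponds to a 5% false alarm rate under the null is easy to check by simulation; here both groups are drawn from the same population, so every "significant" result is spurious and the nominal rate should be recovered. This is an illustrative sketch, not the paper's simulation design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_trials, n = 2000, 30
false_pos = 0
for _ in range(n_trials):
    a = rng.normal(size=n)      # both samples come from the same population,
    b = rng.normal(size=n)      # so any "significant" difference is spurious
    if stats.ttest_ind(a, b).pvalue < 0.05:
        false_pos += 1
rate = false_pos / n_trials     # should hover near the nominal 0.05
```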
Statistical significance test for transition matrices of atmospheric Markov chains
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
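A simplified shuffle-based version of such a Monte Carlo significance test can be sketched as follows; the authors' procedure for atmospheric flow regimes is more involved, and the persistent two-regime chain here is synthetic.

```python
import numpy as np

def transition_counts(seq, k):
    """Count transitions between the k regimes in an integer sequence."""
    C = np.zeros((k, k), dtype=int)
    np.add.at(C, (seq[:-1], seq[1:]), 1)
    return C

def mc_transition_pvalues(seq, k, n_sim=999, seed=None):
    """One-sided Monte Carlo p-values: how often a random reshuffling of the
    regime sequence produces at least as many of each transition as observed."""
    rng = np.random.default_rng(seed)
    obs = transition_counts(seq, k)
    ge = np.zeros((k, k))
    for _ in range(n_sim):
        ge += transition_counts(rng.permutation(seq), k) >= obs
    return (ge + 1) / (n_sim + 1)

# a persistent two-regime chain: self-transitions should come out significant
rng = np.random.default_rng(3)
seq = [0]
for _ in range(399):
    seq.append(seq[-1] if rng.random() < 0.9 else 1 - seq[-1])
p = mc_transition_pvalues(np.array(seq), 2, seed=7)
```

Shuffling destroys the temporal order while preserving regime frequencies, so the persistence of each regime shows up as significantly inflated diagonal counts.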
Dose response relationship in local radiotherapy for hepatocellular carcinoma
Park, Hee Chul; Seong, Jin Sil; Han, Kwang Hyub; Chon, Chae Yoon; Moon, Young Myoung; Song, Jae Seok; Suh, Chang Ok [College of Medicine, Yonsei Univ., Seoul (Korea, Republic of)
2001-06-01
In this study, it was investigated whether dose response relation existed or not in local radiotherapy for primary hepatocellular carcinoma. From January 1992 to March 2000, 158 patients were included in the present study. Exclusion criteria included the presence of extrahepatic metastasis, liver cirrhosis of Child's class C, tumors occupying more than two thirds of the entire liver, and performance status on the ECOG scale of more than 3. Radiotherapy was given to the field including tumor with generous margin using 6- and 10-MV X-rays. Mean tumor dose was 48.2±7.9 Gy in daily 1.8 Gy fractions. Tumor response was based on diagnostic radiologic examinations such as CT scan, MR imaging, hepatic artery angiography at 4-8 weeks following completion of treatment. Statistical analysis was done to investigate the existence of dose response relationship of local radiotherapy when it was applied to the treatment of primary hepatocellular carcinoma. An objective response was observed in 106 of 158 patients, giving a response rate of 67.1%. Statistical analysis revealed that total dose was the most significant factor in relation to tumor response when local radiotherapy was applied to the treatment of primary hepatocellular carcinoma. Only 29.2% showed objective response in patients treated with dose less than 40 Gy, while 68.6% and 77.1% showed major response in patients with 40-50 Gy and more than 50 Gy, respectively. Child-Pugh classification was significant factor in the development of ascites, overt radiation induced liver disease and gastroenteritis. Radiation dose was an important factor for development of radiation induced gastroduodenal ulcer. Present study showed the existence of dose response relationship in local radiotherapy for primary hepatocellular carcinoma. Only radiotherapy dose was a significant factor to predict the objective response. Further study is required to predict the maximal tolerance dose in consideration of liver function and non
Caveats for using statistical significance tests in research assessments
Schneider, Jesper Wiborg
2013-01-01
controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such...... tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We...... argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators are...
Dose response relationship and ALARA
In this paper, it will be shown how dose-response relationships make it possible to give quantitative figures for the detriment of irradiation. At this stage, the detriment is expressed directly as a certain number of health effects, whose valuation is not dealt with here. The present tools for quantification, their weaknesses and their strengths, and their scientific basis will be developed.
On detection and assessment of statistical significance of Genomic Islands
Chaudhuri Probal
2008-04-01
Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements confirming their horizontal acquirement. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
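The unsupervised Monte Carlo idea, comparing a candidate window's composition with randomly selected same-length segments of the chromosome, can be sketched on a synthetic sequence with GC content as the only statistic. Design-Island itself uses several compositional measures and a refinement phase; everything below is a toy assumption.

```python
import numpy as np

def gc_fraction(seq):
    """Fraction of G and C bases in a sequence string."""
    return sum(c in "GC" for c in seq) / len(seq)

def mc_gc_pvalue(genome, start, length, n_sim=1000, seed=None):
    """Two-sided Monte Carlo p-value for a candidate window's GC content,
    against same-length windows drawn at random from the chromosome."""
    rng = np.random.default_rng(seed)
    obs = gc_fraction(genome[start:start + length])
    null = np.array([gc_fraction(genome[s:s + length])
                     for s in rng.integers(0, len(genome) - length, n_sim)])
    extreme = np.abs(null - null.mean()) >= abs(obs - null.mean())
    return (extreme.sum() + 1) / (n_sim + 1)

# synthetic chromosome at ~50% GC with a GC-rich island spliced in
rng = np.random.default_rng(0)
background = "".join(rng.choice(list("ACGT"), size=20000))
island = "".join(rng.choice(list("ACGT"), size=1000, p=[0.1, 0.4, 0.4, 0.1]))
genome = background[:10000] + island + background[10000:]
p_island = mc_gc_pvalue(genome, 10000, 1000, seed=1)
p_normal = mc_gc_pvalue(genome, 2000, 1000, seed=2)
```

Because the null distribution comes from the chromosome itself, the test needs no training data, which is the unsupervised aspect the abstract emphasizes.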
On use of the multistage dose-response model for assessing laboratory animal carcinogenicity
Nitcheva, Daniella; Piegorsch, Walter W.; West, R. Webster
2007-01-01
We explore how well a statistical multistage model describes dose-response patterns in laboratory animal carcinogenicity experiments from a large database of quantal response data. The data are collected from the U.S. EPA’s publicly available IRIS data warehouse and examined statistically to determine how often higher-order values in the multistage predictor yield significant improvements in explanatory power over lower-order values. Our results suggest that the addition of a second-order par...
A relational retrieval database has been developed compiling toxicological studies assessing the occurrence of hormetic dose responses and their quantitative characteristics. This database permits an evaluation of these studies over numerous parameters, including study design and dose-response features and physical/chemical properties of the agents. The database contains approximately 5600 dose-response relationships satisfying evaluative criteria for hormesis across over approximately 900 agents from a broadly diversified spectrum of chemical classes and physical agents. The assessment reveals that hormetic dose-response relationships occur in males and females of numerous animal models in all principal age groups as well as across species displaying a broad range of differential susceptibilities to toxic agents. The biological models are extensive, including plants, viruses, bacteria, fungi, insects, fish, birds, rodents, and primates, including humans. The spectrum of endpoints displaying hormetic dose responses is also broad being inclusive of growth, longevity, numerous metabolic parameters, disease incidences (including cancer), various performance endpoints such as cognitive functions, immune responses among others. Quantitative features of the hormetic dose response reveal that the vast majority of cases display a maximum stimulatory response less than two-fold greater than the control while the width of the stimulatory response is typically less than 100-fold in dose range immediately contiguous with the toxicological NO(A)EL. The database also contains a quantitative evaluation component that differentiates among the various dose responses concerning the strength of the evidence supporting a hormetic conclusion based on study design features, magnitude of the stimulatory response, statistical significance, and reproducibility of findings
Mahalanobis distance and variable selection to optimize dose response
A battery of statistical techniques are combined to improve detection of low-level dose response. First, Mahalanobis distances are used to classify objects as normal or abnormal. Then the proportion classified abnormal is regressed on dose. Finally, a subset of regressor variables is selected which maximizes the slope of the dose response line. Use of the techniques is illustrated by application to mouse sperm damaged by low doses of x-rays
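The pipeline described, Mahalanobis classification against the control group followed by regression of the proportion abnormal on dose, can be sketched as follows. The simulated data, the 95% cutoff, and the function name are illustrative assumptions, and the variable-selection step is omitted.

```python
import numpy as np

def abnormal_rates(control, groups, q=0.95):
    """Classify objects as abnormal when their Mahalanobis distance from the
    control mean exceeds the control q-quantile, then return the proportion
    abnormal in each dose group."""
    mu = control.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(control, rowvar=False))
    def d2(x):                      # squared Mahalanobis distances
        diff = x - mu
        return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
    cut = np.quantile(d2(control), q)
    return np.array([np.mean(d2(g) > cut) for g in groups])

# simulated two-variable measurements: dose shifts the first variable
rng = np.random.default_rng(2)
control = rng.normal(size=(500, 2))
doses = np.array([0.0, 1.0, 2.0, 4.0])
groups = [rng.normal(size=(300, 2)) + np.array([0.6 * d, 0.0]) for d in doses]
rates = abnormal_rates(control, groups)
slope, intercept = np.polyfit(doses, rates, 1)   # fitted dose-response line
```

The slope of the fitted line is the low-level dose response; the paper's final step then searches for the variable subset that maximizes it.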
Priya Ranganathan
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals', and clarify the importance of including both values in a paper.
Jia, Bin; Lynn, Henry S
2015-01-01
Background: The CONSORT statement requires clinical trials to report confidence intervals, which help to assess the precision and clinical importance of the treatment effect. Conventional sample size calculations for clinical trials, however, only consider issues of statistical significance (that is, significance level and power). Method: A more consistent approach is proposed whereby sample size planning also incorporates information on clinical significance as indicated by the boundaries of t...
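For contrast, a conventional two-group sample size calculation really does use only the significance level and power, as the abstract notes. A sketch under the usual normal-approximation assumptions; the function name is ours, and the authors' clinical-significance boundary would add a further constraint.

```python
import math
from scipy.stats import norm

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Conventional sample size per group for comparing two means: only the
    significance level (alpha) and power enter the calculation."""
    z = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)
    return math.ceil(2.0 * (z * sd / delta) ** 2)
```

For example, `n_per_group(0.5, 1.0)` gives 63 participants per group for detecting a half-standard-deviation difference at 80% power.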
Systematic reviews of anesthesiologic interventions reported as statistically significant
Imberger, Georgina; Gluud, Christian; Boylan, John;
2015-01-01
nominally statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that...... updates. RESULTS: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median...
Skull base chordomas: analysis of dose-response characteristics
Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6-79.2 Gy. The data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the Equivalent Uniform Dose (EUD)). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 (36%) of patients, with actuarial local control rates at 5 years of 59.2%. The proportional hazards model revealed significant dependence of gender on the probability of recurrence, with female patients having significantly poorer prognosis (hazard ratio of 2.3 with the p value of 0.008). The Wilcoxon and the log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of significance of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate
Implicit dose-response curves.
Pérez Millán, Mercedes; Dickenstein, Alicia
2015-06-01
We develop tools from computational algebraic geometry for the study of steady state features of autonomous polynomial dynamical systems via elimination of variables. In particular, we obtain nontrivial bounds for the steady state concentration of a given species in biochemical reaction networks with mass-action kinetics. This species is understood as the output of the network and we thus bound the maximal response of the system. The improved bounds give smaller starting boxes to launch numerical methods. We apply our results to the sequential enzymatic network studied in Markevich et al. (J Cell Biol 164(3):353-359, 2004) to find nontrivial upper bounds for the different substrate concentrations at steady state. Our approach does not require any simulation, analytical expression to describe the output in terms of the input, or the absence of multistationarity. Instead, we show how to extract information from effectively computable implicit dose-response curves, with the use of resultants and discriminants. We moreover illustrate in the application to an enzymatic network, the relation between the exact implicit dose-response curve we obtain symbolically and the standard hysteresis diagram provided by a numerical ode solver. The setting and tools we propose could yield many other results adapted to any autonomous polynomial dynamical system, beyond those where it is possible to get explicit expressions. PMID:25008963
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance
Kramer, Karen L.; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children’s growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children’s monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children’s growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children’s growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children’s growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance. PMID:26938742
Statistical significance across multiple optimization models for community partition
Li, Ju; Li, Hui-Jia; Mao, He-Jin; Chen, Junhua
2016-05-01
The study of community structure is an important problem in a wide range of applications, which can help us understand the real network system deeply. However, due to the existence of random factors and error edges in real networks, how to measure the significance of community structure efficiently is a crucial question. In this paper, we present a novel statistical framework computing the significance of community structure across multiple optimization methods. Different from the universal approaches, we calculate the similarity between a given node and its leader and employ the distribution of link tightness to derive the significance score, instead of a direct comparison to a randomized model. Based on the distribution of community tightness, a new "p-value" form significance measure is proposed for community structure analysis. Specially, the well-known approaches and their corresponding quality functions are unified to a novel general formulation, which facilitates in providing a detailed comparison across them. To determine the position of leaders and their corresponding followers, an efficient algorithm is proposed based on the spectral theory. Finally, we apply the significance analysis to some famous benchmark networks, and the good performance verifies the effectiveness and efficiency of our framework.
Large SDSS quasar groups and their statistical significance
Park, Changbom; Einasto, Maret; Lietzen, Heidi; Heinamaki, Pekka
2015-01-01
We use a volume-limited sample of quasars in the Sloan Digital Sky Survey (SDSS) DR7 quasar catalog to identify quasar groups and address their statistical significance. This quasar sample has a uniform selection function on the sky and nearly a maximum possible contiguous volume that can be drawn from the DR7 catalog. Quasar groups are identified by using the Friend-of-Friend algorithm with a set of fixed comoving linking lengths. We find that the richness distribution of the richest 100 quasar groups and the size distribution of the largest 100 groups are statistically equivalent to those of randomly-distributed points with the same number density and sky coverage when groups are identified with a linking length of 70 h^-1 Mpc. It is shown that large-scale structures like the huge Large Quasar Group (U1.27) reported by Clowes et al. (2013) can be found with high probability even if quasars have no physical clustering, and hence do not challenge the initially homogeneous cosmological models. Our results are...
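A minimal two-dimensional sketch of Friend-of-Friend grouping with a fixed linking length (the actual analysis uses comoving separations for quasar positions): groups are the connected components of the graph joining all pairs closer than the linking length.

```python
import numpy as np

def friends_of_friends(points, linking_length):
    """Group points into Friend-of-Friend clusters with a plain union-find;
    an O(n^2) toy version, fine for small samples."""
    n = len(points)
    parent = list(range(n))
    def find(i):                        # find root with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for i in range(n):
        for j in range(i + 1, n):
            if d[i, j] < linking_length:
                parent[find(i)] = find(j)   # merge the two groups
    return np.array([find(i) for i in range(n)])

# two well-separated point clouds should come out as two groups
rng = np.random.default_rng(5)
a = rng.normal(0.0, 0.3, size=(20, 2))
b = rng.normal(0.0, 0.3, size=(20, 2)) + np.array([10.0, 0.0])
labels = friends_of_friends(np.vstack([a, b]), 2.0)
```

The significance question the paper addresses is then whether groups found this way are any richer or larger than those recovered from randomly placed points.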
Lexical Co-occurrence, Statistical Significance, and Word Association
Chaudhari, Dipak; Laxman, Srivatsan
2010-01-01
Lexical co-occurrence is an important cue for detecting word associations. We present a theoretical framework for discovering statistically significant lexical co-occurrences from a given corpus. In contrast with the prevalent practice of giving weight to unigram frequencies, we focus only on the documents containing both the terms (of a candidate bigram). We detect biases in the span distributions of associated words while remaining agnostic to variations in global unigram frequencies. Our framework has the fidelity to distinguish different classes of lexical co-occurrences, based on the strengths of the document- and corpus-level cues of co-occurrence in the data. We perform extensive experiments on benchmark data sets to study the performance of various co-occurrence measures currently known in the literature. We find that a relatively obscure measure called Ochiai, and a newly introduced measure, CSA, capture the notion of lexical co-occurrence best, followed next by LLR, Dice, and TTest, while another popular m...
Wilkinson, Mick
2014-01-01
Decisions about support for therapies in light of data are made using statistical inference. The dominant approach is null-hypothesis significance testing. Applied correctly, it provides a procedure for making dichotomous decisions about zero-effect null hypotheses with known and controlled error rates. Type I and type II error rates must be specified in advance, and the latter controlled by an a priori sample size calculation. This approach does not provide the probability of hypotheses or the st...
Wilkinson, Mick
2014-01-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by cond...
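The a priori sample-size step that both abstracts above require can be sketched numerically. This is the standard normal-approximation calculation for a two-sample comparison of means; the effect size, alpha, and power values are illustrative assumptions, not figures from either paper.

```python
import math
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample
    comparison of means with standardized effect size d."""
    z = NormalDist().inv_cdf
    z_alpha = z(1.0 - alpha / 2.0)  # two-sided Type I error criterion
    z_beta = z(power)               # controls the Type II error rate
    return math.ceil(2.0 * ((z_alpha + z_beta) / d) ** 2)

print(n_per_group(0.5))  # 63 per group at d = 0.5, alpha = .05, power = .80
```

Larger assumed effects need fewer subjects, which is exactly why the effect size must be fixed before, not after, the data are seen.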
Statistical controversies in clinical research: statistical significance-too much of a good thing ….
Buyse, M; Hurvitz, S A; Andre, F; Jiang, Z; Burris, H A; Toi, M; Eiermann, W; Lindsay, M-A; Slamon, D
2016-05-01
The use and interpretation of P values is a matter of debate in applied research. We argue that P values are useful as a pragmatic guide to interpret the results of a clinical trial, not as a strict binary boundary that separates real treatment effects from lack thereof. We illustrate our point using the result of BOLERO-1, a randomized, double-blind trial evaluating the efficacy and safety of adding everolimus to trastuzumab and paclitaxel as first-line therapy for HER2+ advanced breast cancer. In this trial, the benefit of everolimus was seen only in the predefined subset of patients with hormone receptor-negative breast cancer at baseline (progression-free survival hazard ratio = 0.66, P = 0.0049). A strict interpretation of this finding, based on complex 'alpha splitting' rules to assess statistical significance, led to the conclusion that the benefit of everolimus was not statistically significant either overall or in the subset. We contend that this interpretation does not do justice to the data, and we argue that the benefit of everolimus in hormone receptor-negative breast cancer is both statistically compelling and clinically relevant. PMID:26861602
Analysis of Dose Response for Circulatory Disease After Radiotherapy for Benign Disease
Little, Mark P., E-mail: mark.little@nih.gov [Radiation Epidemiology Branch, National Cancer Institute, Executive Plaza South, Rockville, Maryland (United States); Kleinerman, Ruth A. [Radiation Epidemiology Branch, National Cancer Institute, Executive Plaza South, Rockville, Maryland (United States); Stovall, Marilyn; Smith, Susan A. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Mabuchi, Kiyohiko [Radiation Epidemiology Branch, National Cancer Institute, Executive Plaza South, Rockville, Maryland (United States)
2012-12-01
Purpose: To assess the shape of the dose-response for various circulatory disease endpoints, and modifiers by age and time since exposure. Methods and Materials: This was an analysis of the US peptic ulcer data testing for heterogeneity of radiogenic risk by circulatory disease endpoint (ischemic heart, cerebrovascular, other circulatory disease). Results: There were significant excess risks for all circulatory disease, with an excess relative risk Gy⁻¹ of 0.082 (95% CI 0.031-0.140), and ischemic heart disease, with an excess relative risk Gy⁻¹ of 0.102 (95% CI 0.039-0.174) (both p = 0.01), and indications of excess risk for stroke. There were no statistically significant (p > 0.2) differences between risks by endpoint, and few indications of curvature in the dose-response. There were significant (p < 0.001) modifications of relative risk by time since exposure, the magnitude of which did not vary between endpoints (p > 0.2). Risk modifications were similar if analysis was restricted to patients receiving radiation, although the relative risks were slightly larger and the risk of stroke failed to be significant. The slopes of the dose-response were generally consistent with those observed in the Japanese atomic bomb survivors and in occupationally and medically exposed groups. Conclusions: There were excess risks for a variety of circulatory diseases in this dataset, with significant modification of risk by time since exposure. The consistency of the dose-response slopes with those observed in radiotherapeutically treated groups at much higher dose, as well as in lower dose-exposed cohorts such as the Japanese atomic bomb survivors and nuclear workers, implies that there may be little sparing effect of fractionation of dose or low-dose-rate exposure.
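The linear excess-relative-risk model used in analyses like this one evaluates simply as RR(D) = 1 + β·D. The β below is the all-circulatory-disease estimate of 0.082 Gy⁻¹ quoted in the abstract; the example doses are arbitrary.

```python
def relative_risk(dose_gy, err_per_gy=0.082):
    """Linear excess relative risk model: RR(D) = 1 + beta * D."""
    return 1.0 + err_per_gy * dose_gy

for d in (0.5, 1.0, 2.0):
    print(f"{d} Gy -> RR = {relative_risk(d):.3f}")  # 1.041, 1.082, 1.164
```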
After statistics reform : Should we still teach significance testing?
A. Hak (Tony)
2014-01-01
In the longer term, null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of then-abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in
Dose/response relationships and policy formulation
The ICRP 26 cost/benefit approach to establishing operational radiation protection guidelines is discussed. The purpose is to aid the policy maker in the decision-making process, using the dose-response curve as a basis.
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite increasing criticism of statistical significance testing by researchers, particularly following the publication of the 1994 American Psychological Association style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
Smith, Ariana L; Wein, Alan J
2011-05-01
To evaluate the statistical and clinical efficacy of the pharmacological treatments of nocturia using non-antidiuretic agents. A literature review of treatments of nocturia specifically addressing the impact of alpha blockers, 5-alpha reductase inhibitors (5ARI) and antimuscarinics on reduction in nocturnal voids. Despite commonly reported statistically significant results, nocturia has shown a poor clinical response to traditional therapies for benign prostatic hyperplasia including alpha blockers and 5ARI. Similarly, nocturia has shown a poor clinical response to traditional therapies for overactive bladder including antimuscarinics. Statistical success has been achieved in some groups with a variety of alpha blockers and antimuscarinic agents, but the clinical significance of these changes is doubtful. It is likely that other types of therapy will need to be employed in order to achieve a clinically significant reduction in nocturia. PMID:21518417
The risk of developing thyroid cancer and other thyroid neoplasms after radiation exposure is well known, but specific modifiers of the dose-response relationship are not. The authors have identified 4,296 subjects who received treatment before their sixteenth birthday with orthovoltage radiation for benign conditions in the head and neck area. Individual thyroid dose estimates were calculated for 3,843 subjects. Of the 2,634 subjects who have been found, 1,043 have developed thyroid nodules of all types, and 309 have developed thyroid cancer. The radiation dose-response relationship was consistent with a linear excess relative risk model for thyroid cancer and thyroid nodules within the range of observed doses. Women developed thyroid cancer and thyroid nodules at a higher rate, but the slopes of the dose-response curves were the same for men and women. Age at radiation exposure was a significant modifier of risk, with a lower age at exposure associated with a higher risk. To determine the effect of the wide publicity and the screening program, which began in 1974, the authors compared the dose-response relationship for cases diagnosed before and after 1974. The overall rates increased dramatically after 1974, but the estimates of the slopes of the dose-response curves were not statistically different. The slope of the dose-response curve for thyroid neoplasms appears to have reached a maximum 25-29 yr after radiation exposure, but the dose response continued to be elevated at the end of follow-up. These data are consistent with the tumorigenic effects of radiation lasting at least 40 yr.
Early radiation dose-response in lung: an ultrastructural study
A systematic fine-structural study of dog lungs was undertaken to ascertain the radiation dose response in the lungs of large animals. The capillary endothelium appeared to be the initial site of the post-irradiation pulmonary damage. This subpleural response included diffuse septal thickening, fibrosis, edema, and reduced alveolar lumina. The deep parenchymal response involved perivascular fibrosis, which was associated with perivascular hyperplasia of Type II pneumocytes, increased number and sizes of lamellar bodies, and increased production and release of lamellar surfactant. No changes of alveolar luminal size were noted. The most significant changes were observed in those dose zones exposed to greater than 2400 rad, suggesting the possibility of an identifiable dose-response relationship. Early detection of radiation pneumonitis by electron microscopy is demonstrated, and qualitative and quantitative correlation of injury with both postirradiation time and dose is presented
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Modeling of dose-response relationships.
Altshuler, B
1981-01-01
The main dose-response models for chronic toxicity are considered. For dichotomous response, the log probit, multi-hit, and multistage models are presented. For time-to-occurrence response, the log-normal and three variations of multistage models are presented. Finally, the Cornfield hockey-stick model is considered, and, for low-dose extrapolation, it is suggested that response be taken to be proportional to dose and to a power of time determined by background response.
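Of the dichotomous models the abstract names, the one-hit special case makes the low-dose extrapolation point concrete: P(d) = 1 - exp(-q1·d), which is approximately q1·d (proportional to dose) for small d. The slope q1 = 0.02 per unit dose here is an arbitrary illustrative value, not from the paper.

```python
import math

def one_hit(dose, q1=0.02):
    """Dichotomous one-hit model: P(response) = 1 - exp(-q1 * dose)."""
    return 1.0 - math.exp(-q1 * dose)

# At low dose the model is effectively linear: P(d) ~ q1 * d.
print(one_hit(0.01))   # ~0.0002, i.e. 0.02 * 0.01
print(one_hit(200.0))  # saturates toward 1 at high dose
```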
Inhalation Anthrax: Dose Response and Risk Analysis
Coleman, Margaret E.; Thran, Brandolyn; Morse, Stephen S.; Hugh-Jones, Martin; Massulik, Stacey
2008-01-01
The notion that inhalation of a single Bacillus anthracis spore is fatal has become entrenched nearly to the point of urban legend, in part because of incomplete articulation of the scientific basis for microbial risk assessment, particularly dose-response assessment. Risk analysis (ie, risk assessment, risk communication, risk management) necessitates transparency: distinguishing scientific facts, hypotheses, judgments, biases in interpretations, and potential misinformation. The difficulty ...
Brouwer, Danny; Meijer, Rob; Zevalkink, J.
2013-01-01
Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual chan
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
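The bootstrap procedure above can be sketched in a few lines: compute the Euclidean distance between two normalized histograms, then resample from the pooled data to build the null distribution of that distance. The data, bin edges, and bootstrap size below are synthetic placeholders, not the cloud-object measurements.

```python
import math, random

def hist(data, edges):
    """Normalized histogram; values outside the edges are ignored."""
    counts = [0] * (len(edges) - 1)
    for x in data:
        for k in range(len(edges) - 1):
            if edges[k] <= x < edges[k + 1]:
                counts[k] += 1
                break
    return [c / len(data) for c in counts]

def euclid(h1, h2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def bootstrap_p(x, y, edges, n_boot=500, seed=0):
    """p-value: fraction of pooled resamples at least as distant as observed."""
    rng = random.Random(seed)
    observed = euclid(hist(x, edges), hist(y, edges))
    pooled = x + y
    hits = 0
    for _ in range(n_boot):
        bx = [rng.choice(pooled) for _ in x]
        by = [rng.choice(pooled) for _ in y]
        if euclid(hist(bx, edges), hist(by, edges)) >= observed:
            hits += 1
    return hits / n_boot

rng = random.Random(1)
same_a = [rng.gauss(0, 1) for _ in range(200)]
same_b = [rng.gauss(0, 1) for _ in range(200)]
edges = [-4, -2, -1, 0, 1, 2, 4]
print(bootstrap_p(same_a, same_b, edges))  # large p: no significant difference
```

Swapping the Euclidean distance for the Jeffries-Matusita or Kuiper distance only changes `euclid`; the bootstrap machinery stays the same.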
Tornetta Paul; Siegel Judith; Sung Jinsil; Bhandari Mohit
2008-01-01
Background: Evidence-based medicine posits that health care research is founded upon clinically important differences in patient centered outcomes. Statistically significant differences between two treatments may not necessarily reflect a clinically important difference. We aimed to quantify the sample sizes and magnitude of treatment effects in a review of orthopaedic randomized trials with statistically significant findings. Methods: We conducted a comprehensive search (PubMed, Cochr...
Zhang, Yuhong; Misra, Sanchit; Agrawal, Ankit; Patwary, Md Mostofa Ali; Liao, Wei-keng; Qin, Zhiguang; Choudhary, Alok
2012-01-01
Background: Pairwise statistical significance has been recognized to be able to accurately identify related sequences, which is a very important cornerstone procedure in numerous bioinformatics applications. However, it is both computationally and data intensive, which poses a big challenge in terms of performance and scalability. Results: We present a GPU implementation to accelerate pairwise statistical significance estimation of local sequence alignment using standard substitution matrices. ...
Vermeesch, Pieter
2011-02-01
In my Eos Forum of 24 November 2009 (90(47), 443), I used the chi-square test to reject the null hypothesis that earthquakes occur independent of the weekday to make the point that statistical significance should not be confused with geological significance. Of the five comments on my article, only the one by Sornette and Pisarenko [2011] disputes this conclusion, while the remaining comments take issue with certain aspects of the geophysical case study. In this reply I will address all of these points, after providing some necessary further background about statistical tests. Two types of error can result from a hypothesis test. A Type I error occurs when a true null hypothesis is erroneously rejected by chance. A Type II error occurs when a false null hypothesis is erroneously accepted by chance. By definition, the p value is the probability, under the null hypothesis, of obtaining a test statistic at least as extreme as the one observed. In other words, the smaller the p value, the lower the probability that a Type I error has been made. In light of the exceedingly small p value of the earthquake data set, Tseng and Chen's [2011] assertion that a Type I error has been committed is clearly wrong. How about Type II errors?
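The weekday chi-square test at issue can be sketched as follows: compare observed counts per weekday against the uniform expectation, and take the p-value from the closed-form chi-square survival function for 6 degrees of freedom (df = 7 - 1; for even df = 2m, P(X > x) = exp(-x/2)·Σ_{j<m}(x/2)^j/j!). The counts below are synthetic, not the earthquake catalogue.

```python
import math

def chi2_stat(observed):
    """Pearson chi-square statistic against a uniform expectation."""
    expected = sum(observed) / len(observed)
    return sum((o - expected) ** 2 / expected for o in observed)

def chi2_sf(x, df):
    """Survival function P(X > x) for even df = 2m (closed form)."""
    m = df // 2
    return math.exp(-x / 2) * sum((x / 2) ** j / math.factorial(j)
                                  for j in range(m))

counts = [120, 98, 105, 110, 99, 250, 101]  # one wildly over-represented day
stat = chi2_stat(counts)
p_value = chi2_sf(stat, df=6)
print(p_value < 0.001)  # True: uniformity rejected for this synthetic example
```

A tiny p-value controls the Type I error story only; as the reply stresses, it says nothing by itself about the geological importance of the effect.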
Engsted, Tom
I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot … be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role … of how to obtain reliable estimates, and I argue that significance tests are useful tools in those cases where a statistical model serves as input in the quantification of an economic model. Finally, I provide a specific example from economics, asset return predictability, where the distinction…
Sullivan, Jeremy R.
2001-01-01
Summarizes the post-1994 literature in psychology and education regarding statistical significance testing, emphasizing limitations and defenses of statistical testing and alternatives or supplements to statistical significance testing. (SLD)
TESS-based dose-response using pediatric clonidine exposures
Objective: The toxic and lethal doses of clonidine in children are unclear. This study was designed to determine whether data from the American Association of Poison Control Centers Toxic Exposure Surveillance System (TESS) could be utilized to determine a dose-response relationship for pediatric clonidine exposure. Methods: 3458 single-substance clonidine exposures in children <6 years of age reported to TESS from January 2000 through December 2003 were examined. Dose ingested, age, and medical outcome were available for 1550 cases. Respiratory arrest cases (n = 8) were classified as the most severe of the medical outcome categories (Arrest, Major, Moderate, Mild, and No effect). Exposures reported as a 'taste or lick' (n = 51) were included as a dose of 1/10 of the dosage form involved. Dose ranged from 0.4 to 1980 (median 13) μg/kg. Weight was imputed based on a quadratic estimate of weight for age. Dose certainty was coded as exact (26% of cases) or not exact (74%). Medical outcome (response) was examined via logistic regression using SAS JMP (release 5.1). Results: The logistic model describing medical outcome (P < 0.0001) included Log dose/kg (P = 0.0000) and Certainty (P = 0.045). Conclusion: TESS data can provide the basis for a statistically sound description of dose-response for pediatric clonidine poisoning exposures
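The logistic dose-response idea behind the analysis can be sketched as follows: the probability of a severe outcome is modelled as a logistic function of log10(dose/kg), fitted here to synthetic data by plain batch gradient ascent on the log-likelihood. The coefficients and data are invented for illustration, not taken from the TESS analysis.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit P(y=1|x) = sigmoid(b0 + b1*x) by batch gradient ascent."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(b0 + b1 * x)  # gradient of the log-likelihood
            g0 += err
            g1 += err * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

rng = random.Random(42)
log_doses = [rng.uniform(0.0, 3.0) for _ in range(400)]  # log10(dose, ug/kg)
outcomes = [1 if rng.random() < sigmoid(-4.0 + 2.5 * x) else 0
            for x in log_doses]

b0, b1 = fit_logistic(log_doses, outcomes)
print(b1 > 0)  # True: fitted severity probability rises with log dose
```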
Little, Mark P., E-mail: mark.little@nih.gov [Radiation Epidemiology Branch, National Cancer Institute, Rockville, Maryland (United States); Stovall, Marilyn; Smith, Susan A. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Kleinerman, Ruth A. [Radiation Epidemiology Branch, National Cancer Institute, Rockville, Maryland (United States)
2013-02-01
Purpose: To assess the shape of the dose response for various cancer endpoints and modifiers by age and time. Methods and Materials: Reanalysis of the US peptic ulcer data testing for heterogeneity of radiogenic risk by cancer endpoint (stomach, pancreas, lung, leukemia, all other). Results: There are statistically significant (P<.05) excess risks for all cancer and for lung cancer and borderline statistically significant risks for stomach cancer (P=.07), and leukemia (P=.06), with excess relative risks Gy⁻¹ of 0.024 (95% confidence interval [CI] 0.011, 0.039), 0.559 (95% CI 0.221, 1.021), 0.042 (95% CI -0.002, 0.119), and 1.087 (95% CI -0.018, 4.925), respectively. There is statistically significant (P=.007) excess risk of pancreatic cancer when adjusted for dose-response curvature. General downward curvature is apparent in the dose response, statistically significant (P<.05) for all cancers, pancreatic cancer, and all other cancers (ie, other than stomach, pancreas, lung, leukemia). There are indications of reduction in relative risk with increasing age at exposure (for all cancers, pancreatic cancer), but no evidence for quadratic variations in relative risk with age at exposure. If a linear-exponential dose response is used, there is no significant heterogeneity in the dose response among the 5 endpoints considered or in the speed of variation of relative risk with age at exposure. The risks are generally consistent with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers. Conclusions: There are excess risks for various malignancies in this data set. Generally there is a marked downward curvature in the dose response and significant reduction in relative risk with increasing age at exposure. The consistency of risks with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers implies that there may be little sparing effect of fractionation of dose or low-dose-rate exposure.
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas. PMID:24109865
Quantitative Dose-Response Curves from Subcellular Lipid Multilayer Microarrays
Kusi-Appiah, A. E.; Lowry, T. W.; Darrow, E. M.; Wilson, K.; Chadwick, B. P.; Davidson, M. W.; Lenhert, S.
2015-01-01
The dose-dependent bioactivity of small molecules on cells is a crucial factor in drug discovery and personalized medicine. Although small-molecule microarrays are a promising platform for miniaturized screening, it has been a challenge to use them to obtain quantitative dose-response curves in vitro, especially for lipophilic compounds. Here we establish a small-molecule microarray assay capable of controlling the dosage of small lipophilic molecules delivered to cells by varying the sub-cellular volumes of surface supported lipid micro- and nanostructure arrays fabricated with nanointaglio. Features with sub-cellular lateral dimensions were found necessary to obtain normal cell adhesion with HeLa cells. The volumes of the lipophilic drug-containing nanostructures were determined using a fluorescence microscope calibrated by atomic-force microscopy. We used the surface supported lipid volume information to obtain EC-50 values for the response of HeLa cells to three FDA-approved lipophilic anticancer drugs, docetaxel, imiquimod and triethylenemelamine, which were found to be significantly different from neat lipid controls. No significant toxicity was observed on the control cells surrounding the drug/lipid patterns, indicating lack of interference or leakage from the arrays. Comparison of the microarray data to dose-response curves for the same drugs delivered liposomally from solution revealed quantitative differences in the efficacy values, which we explain in terms of cell-adhesion playing a more important role in the surface-based assay. The assay should be scalable to a density of at least 10,000 dose response curves on the area of a standard microtiter plate. PMID:26167949
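Dose-response curves like those extracted from the microarray data are conventionally summarized with a Hill (log-logistic) model, whose midpoint parameter is the EC50. This is a generic sketch of that model, not the authors' fitting procedure; all parameter values are illustrative.

```python
def hill(dose, ec50, n=1.0, bottom=0.0, top=1.0):
    """Hill (log-logistic) dose-response curve:
    response = bottom + (top - bottom) / (1 + (ec50 / dose)**n)."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** n)

print(hill(5.0, ec50=5.0))   # 0.5: half-maximal response at the EC50
print(hill(50.0, ec50=5.0))  # approaches the top asymptote at high dose
```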
Zhang, Zhang
2012-03-22
Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure, the Codon Deviation Coefficient (CDC), which provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. © 2012 Zhang et al.; licensee BioMed Central Ltd.
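The bootstrapping step the abstract describes can be sketched as follows: score a coding sequence's codon-usage deviation against the usage expected from its own base composition, then bootstrap composition-matched random sequences to get a p-value. The deviation statistic here (summed absolute difference of codon frequencies) is a simplified stand-in for CDC, which additionally conditions composition on codon position.

```python
import random
from collections import Counter

def codon_freqs(seq):
    codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    return {c: k / len(codons) for c, k in Counter(codons).items()}

def expected_freqs(seq):
    """Codon frequencies expected from the sequence's base composition."""
    p = {b: k / len(seq) for b, k in Counter(seq).items()}
    return {a + b + c: p[a] * p[b] * p[c] for a in p for b in p for c in p}

def deviation(seq):
    obs, exp = codon_freqs(seq), expected_freqs(seq)
    return sum(abs(obs.get(c, 0.0) - f) for c, f in exp.items())

def cub_bootstrap_p(seq, n_boot=200, seed=0):
    """Fraction of composition-matched random sequences at least as deviant."""
    rng = random.Random(seed)
    bases, n = list(seq), len(seq)
    observed = deviation(seq)
    hits = sum(1 for _ in range(n_boot)
               if deviation("".join(rng.choice(bases)
                                    for _ in range(n))) >= observed)
    return hits / n_boot

biased = "GCT" * 100  # maximally biased: one codon used exclusively
print(cub_bootstrap_p(biased) < 0.05)  # True: bias is statistically significant
```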
The conference attracts approximately 500 scientists researching non-linear low dose effects. These scientists represent a wide range of biological/medical fields and technical disciplines. Biphasic dose responses are frequently reported in each of these areas, yet similar dose-response relationships across disciplines are rarely recognized and exploited. Bringing together scientists of such diverse backgrounds who work on the common area of non-linear dose-response relationships will enhance our understanding of the occurrence, origin, mechanism, significance, and practical applications of such dose-response relationships
Fidler, Fiona
2010-07-01
A statistically significant result and a non-significant result may differ little, although their significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs, but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.
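The fictitious-studies scenario can be made concrete with two similar effect estimates, one just under and one just over p = .05, whose confidence intervals nonetheless overlap heavily. The means and standard errors below are invented for illustration.

```python
from statistics import NormalDist

def summarize(mean_diff, se):
    """Two-sided p-value and 95% CI for a normally distributed estimate."""
    z = mean_diff / se
    p = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
    half = 1.96 * se
    return p, (mean_diff - half, mean_diff + half)

p1, ci1 = summarize(2.0, 0.95)  # study 1: p ~ .035, "significant"
p2, ci2 = summarize(1.8, 1.00)  # study 2: p ~ .072, "non-significant"
print(p1 < 0.05 < p2)            # significance status differs...
print(ci1, ci2)                  # ...but the CIs overlap heavily: consistent
```

This is exactly the case where reading only the dichotomous verdicts suggests conflict, while reading the intervals suggests consistency.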
EasyGene – a prokaryotic gene finder that ranks ORFs by statistical significance
Larsen, Thomas Schou; Krogh, Anders Stærmose
2003-01-01
…annotated as genes. Results: In this paper, we present a new automated gene-finding method, EasyGene, which estimates the statistical significance of a predicted gene. The gene finder is based on a hidden Markov model (HMM) that is automatically estimated for a new genome. Using extensions of similarities … is the expected number of ORFs in one megabase of random sequence at the same significance level or better, where the random sequence has the same statistics as the genome in the sense of a third-order Markov chain. Conclusions: The result is a flexible gene finder whose overall performance matches or exceeds…
Does Statistical Significance Help to Evaluate Predictive Performance of Competing Models?
Levent Bulut
2016-04-01
In a Monte Carlo experiment with simulated data, we show that as a point-forecast criterion, Clark and West's (2006) unconditional test of mean squared prediction errors does not reflect the relative performance of a superior model over a relatively weaker one. The simulation results show that even though the mean squared prediction errors of a constructed superior model are far below those of a weaker alternative, the Clark-West test does not reflect this in its test statistics. Therefore, studies that use this statistic in testing the predictive accuracy of alternative exchange rate models, stock return predictability, inflation forecasting, and unemployment forecasting should not put too much weight on the magnitude of statistically significant Clark-West test statistics.
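For reference, the Clark-West (2006) adjusted-MSPE statistic for nested models compares squared errors after subtracting the (f1 - f2)² noise-penalty term from the larger model. The sketch below applies it to synthetic forecasts; the data-generating process and forecasts are invented for illustration.

```python
import math, random, statistics

def clark_west_stat(y, f1, f2):
    """Clark-West adjusted-MSPE t-type statistic; model 2 nests model 1.
    Compare to a one-sided normal critical value (e.g. 1.645 at 5%)."""
    adj = [(yi - a) ** 2 - ((yi - b) ** 2 - (a - b) ** 2)
           for yi, a, b in zip(y, f1, f2)]
    se = statistics.stdev(adj) / math.sqrt(len(adj))
    return statistics.fmean(adj) / se

rng = random.Random(7)
x = [rng.gauss(0.0, 1.0) for _ in range(200)]   # the predictor
y = [xi + rng.gauss(0.0, 0.5) for xi in x]      # truly predictable target
f1 = [0.0] * 200   # parsimonious "no predictability" benchmark forecast
f2 = x             # forecasts from the larger model using the predictor
print(clark_west_stat(y, f1, f2) > 1.645)  # True: CW rejects equal accuracy
```

As the abstract warns, a large statistic here signals rejection of the null, not the size of the forecasting advantage.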
Fidler, Fiona; Burgman, Mark A; Cumming, Geoff; Buttrose, Robert; Thomason, Neil
2006-10-01
Over the last decade, criticisms of null-hypothesis significance testing have grown dramatically, and several alternative practices, such as confidence intervals, information theoretic, and Bayesian methods, have been advocated. Have these calls for change had an impact on the statistical reporting practices in conservation biology? In 2000 and 2001, 92% of sampled articles in Conservation Biology and Biological Conservation reported results of null-hypothesis tests. In 2005 this figure dropped to 78%. There were corresponding increases in the use of confidence intervals, information theoretic, and Bayesian techniques. Of those articles reporting null-hypothesis testing--which still easily constitute the majority--very few report statistical power (8%) and many misinterpret statistical nonsignificance as evidence for no effect (63%). Overall, results of our survey show some improvements in statistical practice, but further efforts are clearly required to move the discipline toward improved practices. PMID:17002771
Bayesian Dose-Response Modeling in Sparse Data
Kim, Steven B.
This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second setting is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when they have disagreeing prior opinions. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics which reflects the perspective of trial participants. The second level is population-level ethics which reflects the perspective of future patients. We extensively compare two existing statistical methods which focus on each perspective and propose a new method which balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potential non-monotonic dose-response relationship which is known as hormesis. Briefly, hormesis is a phenomenon which can be characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter, which is known as a benchmark dose, can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach which considers both monotonicity and hormesis as a possibility. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a
EasyGene – a prokaryotic gene finder that ranks ORFs by statistical significance
Larsen Thomas
2003-06-01
Abstract Background Contrary to other areas of sequence analysis, a measure of statistical significance of a putative gene has not been devised to help in discriminating real genes from the masses of random Open Reading Frames (ORFs) in prokaryotic genomes. Therefore, many genomes have too many short ORFs annotated as genes. Results In this paper, we present a new automated gene-finding method, EasyGene, which estimates the statistical significance of a predicted gene. The gene finder is based on a hidden Markov model (HMM) that is automatically estimated for a new genome. Using extensions of similarities in Swiss-Prot, a high quality training set of genes is automatically extracted from the genome and used to estimate the HMM. Putative genes are then scored with the HMM, and based on score and length of an ORF, the statistical significance is calculated. The measure of statistical significance for an ORF is the expected number of ORFs in one megabase of random sequence at the same significance level or better, where the random sequence has the same statistics as the genome in the sense of a third order Markov chain. Conclusions The result is a flexible gene finder whose overall performance matches or exceeds other methods. The entire pipeline of computer processing from the raw input of a genome or set of contigs to a list of putative genes with significance is automated, making it easy to apply EasyGene to newly sequenced organisms. EasyGene with pre-trained models can be accessed at http://www.cbs.dtu.dk/services/EasyGene.
Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.
2011-01-01
In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
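The permutation logic described above can be sketched in a few lines: destroy the association by shuffling one variable, recompute the statistic, and count how often the permuted statistic is at least as extreme as the observed one. The sketch below applies this to a plain Pearson correlation rather than to PCA loadings; the data and function names are illustrative, not from the paper.

```python
import random

def pearson(a, b):
    """Plain Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def permutation_pvalue(x, y, n_perm=5000, seed=1):
    """Two-sided permutation p-value for the correlation of x and y."""
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)                     # break any x-y association
        if abs(pearson(x, y_perm)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)            # add-one: never report p = 0

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2, 13.8, 16.1]  # strongly linear in x
p = permutation_pvalue(x, y)
```

The same loop generalizes to any statistic: only the function computed on the shuffled data changes.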
Spinella, Sarah
2011-01-01
As result replicability is essential to science and difficult to achieve through external replicability, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
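As a minimal illustration of the bootstrap idea the abstract describes, the sketch below computes a percentile bootstrap confidence interval for a mean. The sample values are invented, and this pure-Python resampling is a sketch, not the authors' heuristic example.

```python
import random

def bootstrap_ci_mean(data, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        sum(rng.choice(data) for _ in range(n)) / n  # one resampled mean
        for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [4.1, 5.3, 3.8, 4.9, 5.0, 4.4, 5.6, 4.7, 4.2, 5.1]
low, high = bootstrap_ci_mean(sample)
```

Unlike a bare p-value, the interval conveys both the location and the uncertainty of the estimate, which is the point the paper makes in favor of the bootstrap.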
Statistical significance of trends in monthly heavy precipitation over the US
Mahajan, Salil
2011-05-11
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upwards trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation as some GCM simulations show a downwards trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
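The non-parametric trend test mentioned above (Kendall's τ, in its Mann-Kendall form) can be sketched as follows; the series is invented, and the variance formula assumes no tied values.

```python
import itertools
import math

def mann_kendall(series):
    """Mann-Kendall trend test: S statistic, normal-approximation Z, and
    two-sided p-value (variance formula assumes no tied values)."""
    n = len(series)
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i, j in itertools.combinations(range(n), 2)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, z, p

# Invented series with an imposed upward trend
series = [1.0, 1.3, 1.2, 1.8, 2.1, 2.0, 2.6, 2.9, 3.1, 3.5]
s, z, p = mann_kendall(series)
```

The bootstrap approach in the paper replaces the normal approximation with a resampled null distribution, which is why the two agree when the approximation is adequate.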
Statistical Significance, Effect Size, and Replication: What Do the Journals Say?
DeVaney, Thomas A.
2001-01-01
Studied the attitudes of representatives of journals in education, sociology, and psychology through an electronic survey completed by 194 journal representatives. Results suggest that the majority of journals do not have written policies concerning the reporting of results from statistical significance testing, and most indicated that statistical…
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power-analysis in estimating the...
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
Deegear, James
This paper summarizes the literature regarding statistical significance testing with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate unless "corrected" effect…
Yang Yaning
2003-12-01
Abstract Background With the increasing amount of data generated in molecular genetics laboratories, it is often difficult to make sense of results because of the vast number of different outcomes or variables studied. Examples include expression levels for large numbers of genes and haplotypes at large numbers of loci. It is then natural to group observations into smaller numbers of classes that allow for an easier overview and interpretation of the data. This grouping is often carried out in multiple steps with the aid of hierarchical cluster analysis, each step leading to a smaller number of classes by combining similar observations or classes. At each step, either implicitly or explicitly, researchers tend to interpret results and eventually focus on that set of classes providing the "best" (most significant) result. While this approach makes sense, the overall statistical significance of the experiment must include the clustering process, which modifies the grouping structure of the data and often removes variation. Results For hierarchically clustered data, we propose considering the strongest result or, equivalently, the smallest p-value as the experiment-wise statistic of interest and evaluating its significance level for a global assessment of statistical significance. We apply our approach to datasets from haplotype association and microarray expression studies where hierarchical clustering has been used. Conclusion In all of the cases we examine, we find that relying on one set of classes in the course of clustering leads to significance levels that are too small when compared with the significance level associated with an overall statistic that incorporates the process of clustering. In other words, relying on one step of clustering may furnish a formally significant result while the overall experiment is not significant.
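To see why picking the smallest p-value across many looks at the data overstates significance, consider the idealized case of k independent tests; the Šidák adjustment below quantifies the inflation. Real clustering steps are dependent, which is exactly why the paper evaluates the overall statistic by simulation rather than by a closed formula like this one.

```python
def sidak_adjust(p_min, k):
    """Global p-value for the smallest of k *independent* p-values:
    P(min p <= p_min) = 1 - (1 - p_min) ** k."""
    return 1 - (1 - p_min) ** k

# A nominally 'significant' p = 0.01, cherry-picked from 10 looks at the
# data, corresponds to a much weaker global result:
global_p = sidak_adjust(0.01, 10)
```

Even under independence, a cherry-picked p = 0.01 from ten tests is globally closer to p ≈ 0.1.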
Faraone, Stephen V.; Spencer, Thomas J.; Kollins, Scott H.; Glatt, Stephen J.; Goodman, David
2012-01-01
Objective: To explore dose-response effects of lisdexamfetamine dimesylate (LDX) treatment for ADHD. Method: This was a 4-week, randomized, double-blinded, placebo-controlled, parallel-group, forced-dose titration study in adult participants, aged 18 to 55 years, meeting "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.)…
Confidence bounds for nonlinear dose-response relationships.
Baayen, C; Hougaard, P
2015-11-30
An important aim of drug trials is to characterize the dose-response relationship of a new compound. Such a relationship can often be described by a parametric (nonlinear) function that is monotone in dose. If such a model is fitted, it is useful to know the uncertainty of the fitted curve. It is well known that Wald confidence intervals are based on linear approximations and are often unsatisfactory in nonlinear models. Apart from incorrect coverage rates, they can be unreasonable in the sense that the lower confidence limit of the difference to placebo can be negative, even when an overall test shows a significant positive effect. Bootstrap confidence intervals solve many of the problems of the Wald confidence intervals but are computationally intensive and prone to undercoverage for small sample sizes. In this work, we propose a profile likelihood approach to compute confidence intervals for the dose-response curve. These confidence bounds have better coverage than Wald intervals and are more precise and generally faster than bootstrap methods. Moreover, if monotonicity is assumed, the profile likelihood approach takes this automatically into account. The approach is illustrated using a public dataset and simulations based on the Emax and sigmoid Emax models. PMID:26112765
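The Emax model named above has the form E(d) = E0 + Emax·d/(ED50 + d), which is monotone in dose. A minimal sketch with invented parameter values:

```python
def emax_response(dose, e0, emax, ed50):
    """Emax dose-response model: E(d) = E0 + Emax * d / (ED50 + d)."""
    return e0 + emax * dose / (ed50 + dose)

doses = [0.0, 10.0, 25.0, 50.0, 100.0, 200.0]
# Invented parameters: placebo effect 1.0, maximal effect 8.0, ED50 = 25
effects = [emax_response(d, e0=1.0, emax=8.0, ed50=25.0) for d in doses]
```

At d = ED50 the response is halfway between E0 and E0 + Emax, and the curve approaches E0 + Emax asymptotically; the paper's profile likelihood intervals are for curves of this kind.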
Stern, J M
2010-01-01
This book presents our case in defense of a constructivist epistemological framework and the use of compatible statistical theory and inference tools. The basic metaphor of decision theory is the maximization of a gambler's expected fortune, according to his own subjective utility, prior beliefs and learned experiences. This metaphor has proven to be very useful, leading to the development of Bayesian statistics since its twentieth-century revival, rooted in the work of de Finetti, Savage and others. The basic metaphor presented in this text, as a foundation for cognitive constructivism, is that of an eigen-solution, and the verification of its objective epistemic status. The FBST - Full Bayesian Significance Test - is the cornerstone of a set of statistical tools conceived to assess the epistemic value of such eigen-solutions, according to their four essential attributes, namely, sharpness, stability, separability and composability. We believe that this alternative perspective, complementary to the one offered by dec...
Dose-response in stereotactic irradiation of lung tumors
The dose-response for local tumor control after stereotactic radiotherapy of 92 pulmonary tumors (36 NSCLC and 56 metastases) was evaluated. Short-course irradiation of 1-8 fractions with different fraction doses was used. After a median follow-up of 14 months (2-85 months), 11 local recurrences were observed, with a significant advantage for higher doses. When normalization to a biologically effective dose (BED) is used, doses of 94 Gy at the isocenter and 50 Gy at the PTV margin are demonstrated to give 50% probability of tumor control (TCD50). Multivariate analysis revealed the dose at the PTV margin as the only significant factor for local control
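The BED normalization used above comes from the linear-quadratic model, BED = n·d·(1 + d/(α/β)). A minimal sketch, assuming the common tumor value α/β = 10 Gy and invented fractionation schemes:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    """Biologically effective dose under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Invented schedules: a hypofractionated stereotactic course (37.5 Gy total)
# vs a conventionally fractionated course (60 Gy total). Despite the lower
# physical dose, the hypofractionated BED is higher.
stereotactic = bed(3, 12.5)   # 3 x 12.5 Gy
conventional = bed(30, 2.0)   # 30 x 2 Gy
```

This is why schedules with very different fraction sizes, as in the study above, must be compared on the BED scale rather than by total physical dose.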
Fernández-Somoano Ana; Suárez-Gil Patricio; Silva-Ayçaguer Luis
2010-01-01
Abstract Background The null hypothesis significance test (NHST) is the most frequently used statistical method, although its inferential validity has been widely criticized since its introduction. In 1988, the International Committee of Medical Journal Editors (ICMJE) warned against sole reliance on NHST to substantiate study conclusions and suggested supplementary use of confidence intervals (CI). Our objective was to evaluate the extent and quality in the use of NHST and CI, both in Englis...
Tornetta Paul
2008-01-01
Abstract Background Evidence-based medicine posits that health care research is founded upon clinically important differences in patient centered outcomes. Statistically significant differences between two treatments may not necessarily reflect a clinically important difference. We aimed to quantify the sample sizes and magnitude of treatment effects in a review of orthopaedic randomized trials with statistically significant findings. Methods We conducted a comprehensive search (PubMed, Cochrane) for all randomized controlled trials between 1/1/95 to 12/31/04. Eligible studies include those that focused upon orthopaedic trauma. Baseline characteristics and treatment effects were abstracted by two reviewers. Briefly, for continuous outcome measures (i.e., functional scores), we calculated effect sizes (mean difference/standard deviation). Dichotomous variables (i.e., infection, nonunion) were summarized as absolute risk differences and relative risk reductions (RRR). Effect sizes >0.80 and RRRs >50% were defined as large effects. Using regression analysis we examined the association between the total number of outcome events and treatment effect (dichotomous outcomes). Results Our search yielded 433 randomized controlled trials (RCTs), of which 76 RCTs with statistically significant findings on 184 outcomes (122 continuous/62 dichotomous outcomes) met study eligibility criteria. The mean effect size across studies with continuous outcome variables was 1.7 (95% confidence interval: 1.43–1.97). For dichotomous outcomes, the mean risk difference was 30% (95% confidence interval: 24%–36%) and the mean relative risk reduction was 61% (95% confidence interval: 55%–66%; range: 0%–97%). Fewer numbers of total outcome events in studies was strongly correlated with increasing magnitude of the treatment effect (Pearson's R = -0.70, p Conclusion Our review suggests that statistically significant results in orthopaedic trials have the following implications-1 On average
Dagmar Sigmundová
2012-01-01
BACKGROUND: An adequate statistical analysis of obtained data is a necessary condition for the right interpretation of results followed by correct formulation of conclusions. The use of appropriate statistical procedures is repeatedly a subject of justified criticism and recommendations in our and other international publications. For example, during the interpretation of results we often "blindly" rely on the statistical significance, and the practical significance of the research is ignored. OBJECTIVE: The main aim of this study is to highlight the formulation of the practical significance of the results and its correct interpretation. We used an analysis of the international studies and own findings from physical activity monitoring. Another aim is to introduce an applicable effect size coefficient as a guide for assessing the practical significance. METHODS: Materials for formulating rules to assess the practical significance and introduction of effect size coefficients comprised the 29 international and Czech studies about the level of week and short time (PE lesson, training, exercise lesson) physical activity of Czech children and adolescents (1,129 girls and 938 boys) and adults (5,727 females and 5,426 males) with use of accelerometers, pedometers and IPAQ questionnaires during 2000-2010. RESULTS: Rules for assessing the practical significance consider measurement error, data variability and the size of position from the beginning on the measurement scale. The formulation of the practical significance should include (a) a determination of the minimal values of certain measurement unit that will be limiting for assessing the significance of the difference; (b) the determination of a minimal size of mutual relationship between the expected results and findings. The presentation of the "effect size" coefficients (d, r, r2, K2, ω2) comprises their definition, the conditions for use as well as the calculation and interpretation of
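Among the effect size coefficients listed (d, r, r², ...), Cohen's d with a pooled standard deviation is the most widely used; a minimal sketch with invented group data:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Invented data for two groups; values and group labels are illustrative only
d = cohens_d([5, 6, 7, 8], [3, 4, 5, 6])
```

Unlike a p-value, d does not grow with sample size, which is what makes it a measure of practical rather than merely statistical significance.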
On determining the statistical significance of discontinuities within ordered ecological data
Current ecological theory hypothesizes that boundaries between adjacent ecosystem units are important in determining ecosystem structure and function across heterogeneous landscapes, and that such boundaries are potentially important sites for early detection of global climate change effects. Yet traditional data analysis methods focus primarily on homogeneous units rather than on the boundaries between them; thus, new methods are being developed for detecting, characterizing and classifying boundaries, e.g., split moving-window boundary analysis (SMW). SMW is a simple yet sensitive method for locating discontinuities that may exist within multivariate, serial data at various scales relative to the length of the data series. However, SMW is subjective and relative, and therefore locates apparent discontinuities even within random, serial data. In this paper they present two nonparametric methods for determining the statistical significance of discontinuities detected by SMW. First, they describe a Monte Carlo method for determining the statistical significance of scale-dependent discontinuities. Second, they propose a nonparametric, scale-independent method that is more appropriate for locating statistically significant discontinuities that separate different, relatively homogeneous groups of varying size along a series. They examine the robustness of these two methods using computer-generated data having varying intensities of imposed discontinuities, and illustrate their application to locating boundaries between vegetation samples collected at systematic intervals across a desert landscape in southern New Mexico
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Testing statistical significance scores of sequence comparison methods with structure similarity
Leunissen Jack AM
2006-10-01
Abstract Background In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested if this could be validated when applied to existing, evolutionary related protein sequences. Results All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
Alcohol and cirrhosis: dose--response or threshold effect?
Kamper-Jørgensen, Mads; Grønbaek, Morten; Tolstrup, Janne;
2004-01-01
BACKGROUND/AIMS: General population studies have shown a strong association between alcohol intake and death from alcoholic cirrhosis, but whether this is a dose-response or a threshold effect remains unknown, and the relation among alcohol misusers has not been studied. METHODS: A cohort of 6152 alcohol-misusing men and women aged 15-83 were interviewed about drinking pattern and social issues and followed for 84,257 person-years. Outcome was alcoholic cirrhosis mortality. Data were analyzed by means of Cox-regression models. RESULTS: In this large prospective cohort study of alcohol misusers there was a 27-fold increased mortality from alcoholic cirrhosis in men and a 35-fold increased mortality from alcoholic cirrhosis in women compared to the Danish population. Number of drinks per day was not significantly associated with death from alcoholic cirrhosis, since there was no additional risk of death...
Island method for estimating the statistical significance of profile-profile alignment scores
Poleksic Aleksandar
2009-04-01
Abstract Background In the last decade, a significant improvement in detecting remote similarity between protein sequences has been made by utilizing alignment profiles in place of amino-acid strings. Unfortunately, no analytical theory is available for estimating the significance of a gapped alignment of two profiles. Many experiments suggest that the distribution of local profile-profile alignment scores is of the Gumbel form. However, estimating distribution parameters by random simulations turns out to be computationally very expensive. Results We demonstrate that the background distribution of profile-profile alignment scores heavily depends on profiles' composition and thus the distribution parameters must be estimated independently, for each pair of profiles of interest. We also show that accurate estimates of statistical parameters can be obtained using the "island statistics" for profile-profile alignments. Conclusion The island statistics can be generalized to profile-profile alignments to provide an efficient method for the alignment score normalization. Since multiple island scores can be extracted from a single comparison of two profiles, the island method has a clear speed advantage over the direct shuffling method for comparable accuracy in parameter estimates.
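Once the Gumbel parameters K and λ have been estimated (e.g., by the island method above), converting an alignment score to an E-value and p-value is direct. The sketch below uses invented parameter values for illustration only.

```python
import math

def gumbel_evalue(score, k, lam, m, n):
    """Expected number of chance alignments scoring >= `score` between
    sequences of lengths m and n, given Gumbel parameters K and lambda."""
    return k * m * n * math.exp(-lam * score)

def evalue_to_pvalue(evalue):
    """P(at least one chance alignment that good) = 1 - exp(-E)."""
    return 1 - math.exp(-evalue)

# Invented parameter values for illustration only
e = gumbel_evalue(score=50, k=0.1, lam=0.3, m=300, n=400)
p = evalue_to_pvalue(e)
```

For small E the two quantities nearly coincide, which is why E-values are quoted directly in database searches; the hard part, and the paper's contribution, is estimating K and λ per profile pair.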
Testing statistical significance of trends in learning, ageing and safety indicators
A relatively new subject for probabilistic safety methodology is statistical analysis of trends in observed failures and other safety indicators reflecting ageing or learning in operational and maintenance experience at industrial facilities. Random variations of the indicators can mask real changes or cause false alarms. Methodology is proposed for testing statistical significance of apparent trends in safety indicators. Improved methods are developed for detecting both monotonic and non-monotonic trends, some demonstrated by simulation studies and real examples to be more powerful than those known so far. An effective way to use standard trend tests with transformed data for testing exponentiality of data is also demonstrated and found superior to a well-known Lilliefors' goodness-of-fit test
Statistically Non-significant Papers in Environmental Health Studies included more Outcome Variables
Pentti Nieminen; Khaled Abass; Kirsi Vähäkangas; Arja Rautio
2015-01-01
Objective The number of analyzed outcome variables is important in the statistical analysis and interpretation of research findings. This study investigated published papers in the field of environmental health studies. We aimed to examine whether differences in the number of reported outcome variables exist between papers with non-significant findings compared to those with significant findings. Articles on the maternal exposure to mercury and child development were used as examples. Methods Articles published between 1995 and 2013 focusing on the relationships between maternal exposure to mercury and child development were collected from Medline and Scopus. Results Of 87 extracted papers, 73 used statistical significance testing and 38 (43.7%) of these reported ‘non-significant’ (P>0.05) findings. The median number of child development outcome variables in papers reporting ‘significant’ (n=35) and ‘non-significant’ (n=38) results was 4 versus 7, respectively (Mann-Whitney test P-value=0.014). An elevated number of outcome variables was especially found in papers reporting non-significant associations between maternal mercury and outcomes when mercury was the only analyzed exposure variable. Conclusion Authors often report analyzed health outcome variables based on their P-values rather than on stated primary research questions. Such a practice probably skews the research evidence.
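The Mann-Whitney comparison of outcome-variable counts reported above can be sketched as follows; the data are invented, and the normal approximation omits the tie correction.

```python
import math

def mann_whitney(x, y):
    """Mann-Whitney U with a large-sample normal approximation
    (two-sided, no tie correction)."""
    n1, n2 = len(x), len(y)
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mean_u = n1 * n2 / 2
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mean_u) / sd_u
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Invented outcome-variable counts per paper, for two groups of papers
significant = [3, 4, 4, 5, 6]
non_significant = [6, 7, 7, 8, 9]
u, p = mann_whitney(significant, non_significant)
```

The rank-based U statistic compares the whole distributions rather than the means, which suits count data like the number of outcome variables per paper.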
Biological dosimetry in radiological protection: dose response curves elaboration for 60Co and 137Cs
Ionizing radiation sources for peaceful uses are being extensively utilized by modern society, and the applications of these sources have raised the probability of the occurrence of accidents. Accidental exposure to radiation creates a necessity for the development of methods to evaluate dose quantity. These data could be obtained by the measurement of damage caused by radiation in the exposed person. The radiation dose can be estimated in exposed persons through physical methods (physical dosimetry), but the biological methods cannot be dispensed with, and among them, the cytogenetic one that makes use of chromosome aberrations (dicentric and centric ring) formed in peripheral blood lymphocytes (PBL) exposed to ionizing radiation. This method correlates the frequency of radioinduced aberrations with the estimated absorbed dose, both in vitro and in vivo, which is called cytogenetic dosimetry. With the introduction of improved new culture techniques, differences among slide analysers in the interpretation of aberrations, and the adoption of different statistical programs to analyse the data, significant differences are observed among laboratories in dose-response curves (calibration curves). The estimation of absorbed dose utilizing another laboratory's calibration curve may introduce some uncertainties, so the International Atomic Energy Agency (IAEA) advises that each laboratory elaborate its own dose-response curve for cytogenetic dosimetry. The results were obtained from peripheral blood lymphocytes of healthy, non-smoking donors exposed to 60Co and 137Cs radiation, with a dose rate of 5 cGy.min-1. Six dose points were determined (20, 50, 100, 200, 300, 400 cGy) plus a non-irradiated control. The analysed aberrations were of chromosomal type, dicentric and centric ring. The dose-response curve for dicentrics was obtained by fitting the weighted frequencies to a linear-quadratic model, and the resulting equation was, for 60Co: Y = (3.46 ± 2.14)×10-4 cGy-1 + (3.45 ± 0
Purpose: The purpose of this paper is to use the outcome of a dose escalation protocol for three-dimensional conformal radiation therapy (3D-CRT) of prostate cancer to study the dose-response for late rectal toxicity and to identify anatomic, dosimetric, and clinical factors that correlate with late rectal bleeding in multivariate analysis. Methods and Materials: Seven hundred forty-three patients with T1c-T3 prostate cancer were treated with 3D-CRT with prescribed doses of 64.8 to 81.0 Gy. The 5-year actuarial rate of late rectal toxicity was assessed using Kaplan-Meier statistics. A retrospective dosimetric analysis was performed for patients treated to 70.2 Gy (52 patients) or 75.6 Gy (119 patients) who either exhibited late rectal bleeding (RTOG Grade 2/3) within 30 months after treatment (i.e., 70.2 Gy--13 patients, 75.6 Gy--36 patients) or were nonbleeding for at least 30 months (i.e., 70.2 Gy--39 patients, 75.6 Gy--83 patients). Univariate and multivariate logistic regression was performed to correlate late rectal bleeding with several anatomic, dosimetric, and clinical variables. Results: A dose response for ≥ Grade 2 late rectal toxicity was observed. By multivariate analysis, the following factors were significantly correlated with ≥ Grade 2 late rectal bleeding for patients prescribed 70.2 Gy: 1) enclosure of the outer rectal contour by the 50% isodose on the isocenter slice (i.e., Iso50) (p max (p max
Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per; Lange, Theis; Gluud, Christian
2014-01-01
BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are being insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: ... Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most ... The proposed eight-step procedure will increase the validity of assessments of intervention effects in systematic reviews of randomised clinical trials.
Ritter, Axel; Muñoz-Carpena, Rafael
2013-02-01
Summary: Success in the use of computer models for simulating environmental variables and processes requires objective model calibration and verification procedures. Several methods for quantifying the goodness-of-fit of observations against model-calculated values have been proposed, but none of them is free of limitations, and they are often ambiguous. When a single indicator is used, it may lead to incorrect verification of the model. Instead, a combination of graphical results, absolute-value error statistics (i.e. root mean square error), and normalized goodness-of-fit statistics (i.e. the Nash-Sutcliffe Efficiency coefficient, NSE) is currently recommended. Interpretation of NSE values is often subjective, and may be biased by the magnitude and number of data points, data outliers and repeated data. The statistical significance of the performance statistics is a generally ignored aspect that helps reduce subjectivity in the proper interpretation of model performance. In this work, approximated probability distributions for two common indicators (NSE and root mean square error) are derived with bootstrapping (block bootstrapping when dealing with time series), followed by bias-corrected and accelerated calculation of confidence intervals. Hypothesis testing of the indicators exceeding threshold values is proposed in a unified framework for statistically accepting or rejecting the model performance. It is illustrated how model performance is not linearly related to NSE, which is critical for its proper interpretation. Additionally, the sensitivity of the indicators to model bias, outliers and repeated data is evaluated. The potential of the difference between root mean square error and mean absolute error for detecting outliers is explored, showing that this may be considered a necessary but not a sufficient condition of outlier presence. The usefulness of the approach for the evaluation of model performance is illustrated with case studies including those with
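The bootstrap procedure described above can be sketched in a few lines. This is a minimal illustration using an ordinary percentile bootstrap on synthetic data; the paper itself uses block bootstrapping for time series and bias-corrected and accelerated (BCa) intervals, which are not reproduced here:

```python
import math
import random

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def bootstrap_ci(obs, sim, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a goodness-of-fit statistic."""
    rng = random.Random(seed)
    n = len(obs)
    reps = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        reps.append(stat([obs[i] for i in idx], [sim[i] for i in idx]))
    reps.sort()
    return reps[int((alpha / 2) * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

# Synthetic example: a simulated series that tracks observations with noise.
rng = random.Random(0)
obs = [10 + 5 * math.sin(t / 3) + rng.gauss(0, 1) for t in range(60)]
sim = [o + rng.gauss(0, 1.5) for o in obs]
point = nse(obs, sim)
lo, hi = bootstrap_ci(obs, sim, nse)
print(f"NSE = {point:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The confidence interval, rather than the point estimate alone, is what supports the hypothesis test of the indicator exceeding a threshold.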
Hu, R; Hu, Rui; Wang, Bin
2000-01-01
Finding statistically significant words in DNA and protein sequences forms the basis for many genetic studies. By applying the maximal entropy principle, we give one systematic way to study the nonrandom occurrence of words in DNA or protein sequences. Through comparison with experimental results, it was shown that patterns of regulatory binding sites in Saccharomyces cerevisiae (yeast) genomes tend to occur significantly in the promoter regions. We studied two correlated gene families of yeast. The method successfully extracts the binding sites verified by experiments in each family. Many putative regulatory sites in the upstream regions are proposed. The study also suggested that some regulatory sites are active in both directions, while others show directional preference.
Characterization of a developmental toxicity dose-response model
Faustman, E.M.; Wellington, D.G.; Smith, W.P.; Kimmel, C.A.
1989-02-01
The Rai and Van Ryzin dose-response model proposed for teratology experiments has been characterized for its appropriateness and applicability in modeling the dichotomous response data from developmental toxicity studies. Modifications were made in the initial probability statements to reflect more accurately biological events underlying developmental toxicity. Data sets used for the evaluation were obtained from the National Toxicology Program and U.S. EPA laboratories. The studies included developmental evaluations of ethylene glycol, diethylhexyl phthalate, di- and triethylene glycol dimethyl ethers, and nitrofen in rats, mice, or rabbits. Graphic examination and statistical evaluation demonstrate that this model is sensitive to the data when compared to directly measured experimental outcomes. The model was used to interpolate to low-risk dose levels, and comparisons were made between the values obtained and the no-observed-adverse-effect levels (NOAELs) divided by an uncertainty factor. Our investigation suggests that the Rai and Van Ryzin model is sensitive to the developmental toxicity end points, prenatal deaths, and malformations, and appears to model closely their relationship to dose.
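A minimal illustration of fitting dichotomous dose-response data and interpolating to a low-risk dose, using a plain logistic model as a stand-in (the Rai and Van Ryzin model itself has a different functional form); the counts and doses below are hypothetical:

```python
import math

# Grouped dichotomous data (hypothetical): affected fetuses out of `total`
# examined at each dose (g/kg).
doses    = [0.0, 0.25, 0.5, 1.0, 2.0]
affected = [1, 3, 6, 12, 18]
total    = [20, 20, 20, 20, 20]

def fit_logistic(doses, affected, total, lr=0.01, steps=50000):
    """Maximum-likelihood fit of P(affected) = 1/(1+exp(-(a + b*dose)))
    by gradient ascent on the binomial log-likelihood (which is concave)."""
    a, b = 0.0, 0.0
    for _ in range(steps):
        ga = gb = 0.0
        for d, r, m in zip(doses, affected, total):
            p = 1.0 / (1.0 + math.exp(-(a + b * d)))
            ga += r - m * p          # dL/da
            gb += d * (r - m * p)    # dL/db
        a += lr * ga
        b += lr * gb
    return a, b

a, b = fit_logistic(doses, affected, total)

def risk(d):
    return 1.0 / (1.0 + math.exp(-(a + b * d)))

# Interpolate to the dose giving 10% extra risk over background, by bisection.
lo_d, hi_d = 0.0, 2.0
target = risk(0.0) + 0.10
for _ in range(60):
    mid = (lo_d + hi_d) / 2
    lo_d, hi_d = (mid, hi_d) if risk(mid) < target else (lo_d, mid)
print(f"a={a:.2f}, b={b:.2f}, dose at 10% extra risk = {lo_d:.3f} g/kg")
```

The interpolated low-risk dose can then be compared against a NOAEL divided by an uncertainty factor, as done in the study.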
Pulmonary inflammation and crystalline silica in respirable coal mine dust: dose-response
E D Kuempel; M D Attfield; V Vallyathan; N L Lapp; J M Hale; R J Smith; V Castranova
2003-02-01
This study describes the quantitative relationships between early pulmonary responses and the estimated lung burden or cumulative exposure of respirable quartz or coal mine dust. Data from a previous bronchoalveolar lavage (BAL) study in coal miners (n = 20) and nonminers (n = 16) were used, including cell counts of alveolar macrophages (AMs) and polymorphonuclear leukocytes (PMNs), and levels of the antioxidant superoxide dismutase (SOD). Miners' individual working-lifetime particulate exposures were estimated from work histories and mine air sampling data, and quartz lung burdens were estimated using a lung dosimetry model. Results show that quartz, as either cumulative exposure or estimated lung burden, was a highly statistically significant predictor of PMN response (P < 0.0001); however, cumulative coal dust exposure did not significantly add to the prediction of PMNs (P = 0.2) above that predicted by cumulative quartz exposure (P < 0.0001). Despite the small study size, radiographic category was also significantly related to increasing levels of both PMNs and quartz lung burden (P-values < 0.04). SOD in BAL fluid rose linearly with quartz lung burden (P < 0.01), but AM count in BAL fluid did not (P > 0.4). This study demonstrates dose-response relationships between respirable crystalline silica in coal mine dust and pulmonary inflammation, antioxidant production, and radiographic small opacities.
Statistical Significance of Non-Reproducibility of Cross Sections in Dissipative Reactions
王琦; 董玉川; 李松林; 田文栋; 李志常; 路秀琴; 赵葵; 符长波; 刘建成; 姜华; 胡桂青
2003-01-01
Two independent excitation function measurements have been performed on the reaction system 19F + 93Nb using two target foils of the same nominal thickness. We measured the dissipative reaction products at incident energies of 102 through 108 MeV in steps of 250 keV. The variance of the energy autocorrelation functions of the reaction products was found to be three times that expected from randomized counting rates. By analysing the probability distributions of the deviations in the measured cross sections, we found that about 20% of all the deviations exceed three standard deviations. This indicates that the non-reproducibility of the cross sections in the two independent measurements is statistically significant and does not originate from random fluctuations of counting rates.
Molecular circuits, biological switches, and nonlinear dose-response relationships.
Andersen, Melvin E.; Yang, Raymond S.H.; French, C. Tenley; Chubb, Laura S; Dennison, James E
2002-01-01
Signaling motifs (nuclear transcriptional receptors, kinase/phosphatase cascades, G-coupled protein receptors, etc.) have composite dose-response behaviors in relation to concentrations of protein receptors and endogenous signaling molecules. "Molecular circuits" include the biological components and their interactions that comprise the workings of these signaling motifs. Many of these molecular circuits have nonlinear dose-response behaviors for endogenous ligands and for exogenous toxicants...
Froehlich, Tanya E.; Epstein, Jeffery N.; Nick, Todd G.; Melguizo Castro, Maria S.; Stein, Mark A.; Brinkman, William B.; Graham, Amanda J.; Langberg, Joshua M.; Kahn, Robert S.
2011-01-01
Objective: Because of significant individual variability in attention-deficit/hyperactivity disorder (ADHD) medication response, there is increasing interest in identifying genetic predictors of treatment effects. This study examined the role of four catecholamine-related candidate genes in moderating methylphenidate (MPH) dose-response. Method:…
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding of the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of change. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale, a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale, no significant trend of any temperature-related parameter was revealed in most cases for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming, as confirmed by several different temperature-related parameters such as mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis, August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise from temperature anomalies (e.g. the warm winter 2006/07) or substantial variations in the winter
Carcinogen dose-response curves for both ionizing radiation and chemicals are typically assumed to be linear at environmentally relevant doses. This assumption is used to ensure protection of the public health in the absence of relevant dose-response data. A theoretical justification for the assumption has been provided by the argument that low dose linearity is expected when an exogenous agent adds to an ongoing endogenous process. Here, we use computational modeling to evaluate (1) how two biological adaptive processes, induction of DNA repair and cell cycle checkpoint control, may affect the shapes of dose-response curves for DNA-damaging carcinogens and (2) how the resulting dose-response behaviors may vary within a population. Each model incorporating an adaptive process was capable of generating not only monotonic dose-responses but also nonmonotonic (J-shaped) and threshold responses. Monte Carlo analysis suggested that all these dose-response behaviors could coexist within a population, as the spectrum of qualitative differences arose from quantitative changes in parameter values. While this analysis is largely theoretical, it suggests that (a) accurate prediction of the qualitative form of the dose-response requires a quantitative understanding of the mechanism, (b) significant uncertainty is associated with human health risk prediction in the absence of such quantitative understanding, and (c) a stronger experimental and regulatory focus on biological mechanisms and interindividual variability would allow flexibility in regulatory treatment of environmental carcinogens without compromising human health.
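The kind of adaptive-repair mechanism discussed above can be sketched with a toy model: if repair capacity is induced by dose, net damage can dip below background (a J-shape), and Monte Carlo variation of the induction parameter across a population yields a mix of curve shapes. All parameters here are illustrative, not taken from the study:

```python
import random

def net_damage(d, e0=1.0, r0=1.0, r1=0.5, k=0.05):
    """Steady-state unrepaired damage: (endogenous + exogenous insult) divided
    by repair capacity. Repair is inducible (the r1 term rises with dose),
    which can produce a J-shaped dose-response."""
    return (e0 + d) / (r0 + r1 * d / (k + d))

# Monte Carlo over individuals: vary repair inducibility r1 across a population.
rng = random.Random(3)
shapes = {"monotonic": 0, "J-shaped": 0}
doses = [0.01 * i for i in range(1, 101)]
for _ in range(1000):
    r1 = max(0.0, rng.gauss(0.3, 0.3))  # some individuals induce no repair
    curve = [net_damage(d, r1=r1) for d in doses]
    baseline = net_damage(0.0, r1=r1)
    shapes["J-shaped" if min(curve) < baseline else "monotonic"] += 1
print(shapes)
```

Qualitatively different dose-response shapes thus coexist in one population purely through quantitative variation of a single parameter, as the abstract argues.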
Cumulative lognormal distributions of dose-response vs. dose distributions
A review of the author's findings over four decades will show that the lognormal probability density function can be fit to many types of positive-variate radiation measurement and response data. The cumulative lognormal plot on probability vs. logarithmic coordinate graph paper can be shown to be useful in comparing trends in exposure distributions or responses under differing conditions or experimental parameters. For variates that can take on only positive values, such a model is more natural than the 'normal' (Gaussian) model. Such modeling can also be helpful in elucidating underlying mechanisms that cause the observed data distributions. It is important, however, to differentiate between the cumulative plot of a dose distribution, in which successive percentages of data are not statistically independent, and the plots of dose-response data for which independent groups of animals or persons are irradiated or observed for selected doses or dose intervals. While independent response points can often be best fitted by appropriate regression methods, the density functions for cumulative dose or concentration distributions must be fit by particular maximum likelihood estimates from the data. Also, as indicated in the texts by D.J. Finney and by R.O. Gilbert, for example, a simple plot of such data on available probability (or probit) vs. log scale graph paper will quickly show whether an adequate representation of the data is a lognormal function. Processes that naturally generate lognormal variates are sometimes estimated by statistics that follow the lognormal straight line for a cumulative plot on a probability vs. log scale; on the other hand, sometimes the statistics of interpretation follow such a line only over a certain range. Reported examples of lognormal occupational exposure distributions include those in some facilities in which roundoff biases were removed for some years. However, for a number of exposure distributions at licensed facilities in the
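A short sketch of lognormal fitting as described: the maximum likelihood estimates are simply the mean and standard deviation of the log-transformed values, conventionally reported as a geometric mean (GM) and geometric standard deviation (GSD). The data below are synthetic:

```python
import math
import random
import statistics

# Generate positive-valued "exposure" data that is lognormal by construction.
rng = random.Random(1)
data = [math.exp(rng.gauss(0.5, 0.8)) for _ in range(500)]

# MLE for a lognormal: mean and std of the log-transformed values.
logs = [math.log(x) for x in data]
mu = statistics.fmean(logs)
sigma = statistics.pstdev(logs)

# Geometric mean and geometric standard deviation, the usual reporting form.
gm = math.exp(mu)
gsd = math.exp(sigma)

def lognormal_cdf(x, mu, sigma):
    """P(X <= x) for a lognormal, via the error function."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2))))

# On probability-vs-log paper a lognormal sample plots as a straight line:
# the probit of the empirical CDF is linear in log(x). The GM sits at the
# 50th percentile, so it should be close to the sample median.
data.sort()
median = data[len(data) // 2]
print(f"GM = {gm:.3f}, GSD = {gsd:.3f}, sample median = {median:.3f}")
```

As the abstract notes, this MLE route is appropriate for cumulative exposure distributions, whereas independent dose-response points call for regression methods instead.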
There's more than one way to conduct a replication study: Beyond statistical significance.
Anderson, Samantha F; Maxwell, Scott E
2016-03-01
As the field of psychology struggles to trust published findings, replication research has begun to become more of a priority to both scientists and journals. With this increasing emphasis placed on reproducibility, it is essential that replication studies be capable of advancing the field. However, we argue that many researchers have been only narrowly interpreting the meaning of replication, with studies being designed with a simple statistically significant or nonsignificant results framework in mind. Although this interpretation may be desirable in some cases, we develop a variety of additional "replication goals" that researchers could consider when planning studies. Even if researchers are aware of these goals, we show that they are rarely used in practice, as results are typically analyzed in a manner only appropriate to a simple significance test. We discuss each goal conceptually, explain appropriate analysis procedures, and provide one or more examples to illustrate these analyses in practice. We hope that these various goals will allow researchers to develop a more nuanced understanding of replication that can be flexible enough to answer the various questions that researchers might seek to understand. PMID:26214497
Homeopathy: statistical significance versus the sample size in experiments with Toxoplasma gondii
Ana Lúcia Falavigna Guilherme
2011-09-01
Introduction: Toxoplasmosis is a zoonosis that represents a serious public health problem, caused by Toxoplasma gondii, which affects 20-90% of the world human population [1,2]. It is a serious problem especially when considering congenital transmission, due to congenital sequelae. Treatment with highly diluted substances is one of the alternative/complementary medicines most employed in the world [3,4]. The current ethical rules regarding the number of animals used in animal experimental protocols, together with the use of more conservative statistical methods [5], cannot enhance the biological effects of highly diluted substances observed by the experience of the researcher. Aim: To evaluate the minimum number of animals per group needed to achieve a significant difference between groups of animals treated with the biotherapic T. gondii and infected with the protozoan, regarding the number of cysts observed in the brain. Material and methods: A blind randomized controlled trial was performed using eleven Swiss male mice, aged 57 days, divided into two groups: BIOT-200DH, treated with the biotherapic (n=6), and CONTROL, treated with 7% hydroalcoholic solution (n=7). The animals of the BIOT-200DH group were treated for 3 consecutive days with a single dose of 0.1 ml/dose/day and were then orally infected with 20 cysts of ME49 T. gondii. The animals of the control group were treated with 7% cereal alcohol for 3 consecutive days and then infected orally with 20 cysts of ME49 T. gondii. The biotherapic 200DH T. gondii was prepared from homogenized mouse brain with 20 cysts of T. gondii/100 μL, according to the Brazilian Homeopathic Pharmacopoeia [6], in laminar flow. Sixty days post-infection the animals were killed in a chamber saturated with halothane, the brains were homogenized and resuspended in 1 ml of saline solution. Cysts were counted in 25 μl of this suspension, covered with a 24x24 mm coverglass
Yoshiguchi, H; Sato, K; Yoshiguchi, Hiroyuki; Nagataki, Shigehiro; Sato, Katsuhiko
2004-01-01
Recently, the High Resolution Fly's Eye (HiRes) experiment claims that there is no small scale anisotropy in the arrival distribution of ultra-high energy cosmic rays (UHECRs) above $E>10^{19}$ eV contrary to the Akeno Giant Air Shower Array (AGASA) observation. In this paper, we discuss the statistical significance of this discrepancy between the two experiments. We calculate arrival distribution of UHECRs above $10^{19}$ eV predicted by the source models constructed using the Optical Redshift Survey galaxy sample. We apply the new method developed by us for calculating arrival distribution in the presence of the galactic magnetic field. The great advantage of this method is that it enables us to calculate UHECR arrival distribution with lower energy ($\\sim 10^{19}$ eV) than previous studies within reasonable time by following only the trajectories of UHECRs actually reaching the earth. It has been realized that the small scale anisotropy observed by the AGASA can be explained with the source number density ...
ASTRO definition, at 5 years after radiotherapy, the dose required for 50% tumor control (TCD50) for low-risk patients was 57.3 Gy (95% confidence interval [CI], 47.6-67.0). The γ50 was 1.4 (95% CI, -0.1 to 2.9) around 57 Gy. A statistically significant dose-response relation was found using the ASTRO definition. However, no dose-response relation was noted using the CN + 2 definition for these low-risk patients. For the intermediate-risk patients, using the ASTRO definition, the TCD50 was 67.5 Gy (95% CI, 65.5-69.5) Gy and the γ50 was 2.2 (95% CI, 1.1-3.2) around TCD50. Using the CN + 2 definition, the TCD50 was 57.8 Gy (95% CI, 49.8-65.9) and the γ50 was 1.4 (95% CI, 0.2-2.5). Recursive partitioning analysis identified two subgroups within the low-risk group, as well as the intermediate-risk group: PSA level 78 Gy for these patients. A dose-response relation was noted for the intermediate-risk patients using either the CN + 2 or ASTRO definition. Most of the benefit from the higher doses also derived from the intermediate-risk patients with higher PSA levels. Some room for improvement appears to exist with additional dose increases in this group
Helmut Kern
2012-03-01
Aging is a multifactorial process characterized by decline in muscle mass and performance. Several factors, including reduced exercise, poor nutrition and modified hormonal metabolism, are responsible for changes in the rates of protein synthesis and degradation that drive skeletal muscle mass reduction, with a consequent decline in force generation and functional mobility performance. Seniors with a normal lifestyle were enrolled: two groups in Vienna (n=32) and two groups in Bratislava (n=19). All subjects were healthy and declared not to have any specific physical/disease problems. The two Vienna groups of seniors exercised for 10 weeks with two different types of training (leg press at the hospital or home-based functional electrical stimulation, h-b FES). Demographic data (age, height and weight) were recorded, and before and after the training period the patients underwent functional mobility analyses and muscle biopsies. The functional mobility analyses were: 1. gait speed (10 m test at fastest speed, in m/s); 2. the time the subject needed to rise from a chair five times (5x Chair-Rise, in s); 3. Timed-Up-and-Go Test, in s; 4. Stair Test, in s; 5. isometric measurement of quadriceps force (torque/kg, in Nm/kg); and 6. dynamic balance, in mm. Preliminary analyses of quadriceps muscle biopsies from some of the Vienna and Bratislava patients present morphometric results consistent with their functional behaviors. The statistically significant improvements in functional testing reported here demonstrate the effectiveness of h-b FES and strongly support h-b FES as a safe home-based method to improve contractility and performance of ageing muscles.
Review of dose-response curves for acute antimigraine drugs
Hougaard, Anders; Tfelt-Hansen, Peer
2015-01-01
calcitonin-gene-related peptide receptor antagonists (telcagepant, MK-3207, BI 44370 TA and BMS-927711) in placebo-controlled trials were reviewed. In addition, dose-response curves for adverse events (AEs) were reviewed. Expert opinion: For most triptans, the dose-response curve for efficacy is flat… there are many unmet needs. Although upcoming drugs may not be superior to triptans, migraine patients will potentially benefit greatly from these, especially patients who are triptan non-responders and patients with cardiovascular disease.
Vujović Svetlana R.
2013-01-01
This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and the interpretation of a water quality data set of natural water bodies obtained during the 2010 year of monitoring of 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, summing to more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounting for 28% of the total variance, represents the hydrochemical dimension of water quality. The second factor, F2, accounting for 18% of the total variance, may be taken as a factor of water eutrophication. The third factor, F3, accounting for 17% of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounting for 13% of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
Deng, Nina; Allison, Jeroan J; Fang, Hua Julia; Ash, Arlene S.; Ware, John E.
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups ...
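The relative validity statistic and its bootstrap significance test can be sketched as follows; the group means, standard deviations, and sample sizes are hypothetical, and patients are resampled with their paired scores intact:

```python
import random

def f_statistic(groups):
    """One-way ANOVA F statistic: between-group MS over within-group MS."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

def relative_validity(groups):
    """RV of measure A vs measure B; each patient is an (a, b) score pair."""
    a = [[p[0] for p in g] for g in groups]
    b = [[p[1] for p in g] for g in groups]
    return f_statistic(a) / f_statistic(b)

# Hypothetical paired scores on two PRO measures across three clinically
# defined severity groups (40 patients per group).
rng = random.Random(7)
groups = [[(rng.gauss(ma, 8), rng.gauss(mb, 8)) for _ in range(40)]
          for ma, mb in [(50, 50), (55, 53), (62, 57)]]

point = relative_validity(groups)

# Bootstrap: resample patients (keeping their paired scores) within groups.
reps = []
for _ in range(1000):
    boot = [[g[rng.randrange(len(g))] for _ in g] for g in groups]
    reps.append(relative_validity(boot))
reps.sort()
lo, hi = reps[24], reps[974]
print(f"RV = {point:.2f}, bootstrap 95% CI = ({lo:.2f}, {hi:.2f})")
```

Resampling patients rather than the two measures independently preserves the within-patient correlation between measures, which is the point of the paper's bootstrap approach.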
Cigarette smoking and risk of rheumatoid arthritis: a dose-response meta-analysis
2014-01-01
Introduction Although previous studies found that cigarette smoking is associated with risk of rheumatoid arthritis (RA), the dose-response relationship remains unclear. This meta-analysis quantitatively summarizes accumulated evidence regarding the association of lifelong exposure to cigarette smoking assessed as pack-years with the risk of RA. Methods Relevant studies were identified by a search of MEDLINE and EMBASE from 1966 to October 2013, with no restrictions. Reference lists from retrieved articles were also reviewed. Studies that reported relative risks (RR) or odds ratio (OR) estimates with 95% confidence intervals (CIs) for the association between pack-years of cigarette smoking and rheumatoid arthritis were included in a dose-response random-effects meta-regression analysis. Results We included 3 prospective cohorts and 7 case-control studies in the meta-analysis. They included a total of 4,552 RA cases. There was no indication of heterogeneity (P for heterogeneity = 0.32) and publication bias did not affect the results. Compared to never smokers, the risk of developing RA increased by 26% (RR = 1.26, 95% CI 1.14 to 1.39) among those who smoked 1 to 10 pack-years and doubled among those with more than 20 pack-years (RR for 21 to 30 pack-years = 1.94, 95% CI 1.65 to 2.27). The risk of RA was not increasing further for higher exposure levels (RR for >40 pack-years = 2.07, 95% CI 1.15 to 3.73). The risk of RA was statistically significantly higher among rheumatoid factor (RF)-positive RA cases (RR = 2.47, 95% CI 2.02 to 3.02) compared to RF-negative (RR = 1.58, 95% CI 1.15 to 2.18) when comparing the highest versus lowest category of pack-years for the individual studies. Conclusions Lifelong cigarette smoking was positively associated with the risk of RA even among smokers with a low lifelong exposure. The risk of RA did not further increase with an exposure higher than 20 pack-years. PMID:24594022
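A dose-response meta-analysis of this kind pools log relative risks across studies; a minimal sketch of DerSimonian-Laird random-effects pooling for a single exposure category is shown below (the input RRs and CIs are illustrative, not the values reported above):

```python
import math

# Hypothetical per-study relative risks with 95% CIs for one pack-year category.
studies = [(1.30, 1.05, 1.61), (1.22, 0.95, 1.57), (1.28, 1.08, 1.52)]

# Work on the log scale; the SE is recovered from the CI width,
# since CI = estimate * exp(+/- 1.96 * SE).
y = [math.log(est) for est, low, high in studies]
se = [(math.log(high) - math.log(low)) / (2 * 1.96) for est, low, high in studies]
w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
df = len(y) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights and the pooled RR with its 95% CI.
wr = [1 / (s**2 + tau2) for s in se]
pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
se_pooled = math.sqrt(1 / sum(wr))
rr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
print(f"pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A full dose-response meta-regression additionally models the trend of log RR across pack-year categories, which is beyond this sketch.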
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator-driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number-dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number-dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only the slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy
Testing statistical significance scores of sequence comparison methods with structure similarity
Hulsen, T.; Vlieg, de J.; Leunissen, J.A.M.; Groenen, P.
2006-01-01
Background - In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical
Dose-response aligned circuits in signaling systems.
Long Yan
Cells use biological signal transduction pathways to respond to environmental stimuli, and the behavior of many cell types depends on precise sensing and transmission of external information. A notable property of signal transduction, characterized in the yeast Saccharomyces cerevisiae and in many mammalian cells, is the alignment of dose-response curves: the dose response of the receptor closely matches the dose responses of downstream components. This dose-response alignment (DoRA) renders equal sensitivities and concordant responses in different parts of the signaling system and guarantees faithful information transmission. The experimental observations raise interesting questions about the nature of information transmission through DoRA signaling networks and the design principles of signaling systems with this function. Here, we performed an exhaustive computational analysis of network architectures that underlie the DoRA function in simple regulatory networks composed of two and three enzymes. The minimal circuits capable of DoRA were examined with Michaelis-Menten kinetics. Several motifs essential for the dynamical function of DoRA were identified. Systematic analysis of the topology space of robust DoRA circuits revealed that, rather than fine-tuning the network's parameters, the function is primarily realized by enzymatic regulation of the controlled node constrained to limiting regions of saturation or linearity.
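Dose-response alignment under Michaelis-Menten kinetics can be illustrated directly: when the downstream enzyme operates in its linear regime, its half-maximal dose matches the receptor's; when saturated, it does not. All parameters are illustrative:

```python
def mm(x, vmax, km):
    """Michaelis-Menten response: hyperbolic saturation in input x."""
    return vmax * x / (km + x)

def half_max_dose(doses, resp):
    """Smallest dose whose response reaches half of the curve's maximum."""
    top = max(resp)
    return next(d for d, r in zip(doses, resp) if r >= top / 2)

# Receptor stage followed by one downstream enzyme.
doses = [0.1 * i for i in range(1, 200)]
receptor = [mm(d, vmax=1.0, km=1.0) for d in doses]

# Downstream stage in its linear regime (Km >> receptor output):
# its dose-response aligns with the receptor's.
downstream_linear = [mm(r, vmax=1.0, km=100.0) for r in receptor]

# Downstream stage saturated (Km << receptor output): half-maximal response
# is reached at a much lower dose, so the curves no longer align.
downstream_saturated = [mm(r, vmax=1.0, km=0.01) for r in receptor]

print(half_max_dose(doses, receptor),
      half_max_dose(doses, downstream_linear),
      half_max_dose(doses, downstream_saturated))
```

This matches the abstract's conclusion that DoRA is realized by keeping the controlled node in a limiting regime (here, linearity) rather than by fine-tuning parameters.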
A Bayesian Semiparametric Model for Radiation Dose-Response Estimation.
Furukawa, Kyoji; Misumi, Munechika; Cologne, John B; Cullings, Harry M
2016-06-01
In evaluating the risk of exposure to health hazards, characterizing the dose-response relationship and estimating acceptable exposure levels are the primary goals. In analyses of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, little has been known about the possible biological effects at low doses, for example, below 0.1 Gy, which is the dose range relevant to most radiation exposures of concern today. A conventional approach to radiation dose-response estimation based on simple parametric forms, such as the linear nonthreshold model, can be misleading in evaluating the risk and, in particular, its uncertainty at low doses. As an alternative approach, we consider a Bayesian semiparametric model that has a connected piece-wise-linear dose-response function with prior distributions having an autoregressive structure among the random slope coefficients defined over closely spaced dose categories. With a simulation study and application to analysis of cancer incidence data among Japanese atomic bomb survivors, we show that this approach can produce smooth and flexible dose-response estimation while reasonably handling the risk uncertainty at low doses and elsewhere. With relatively few assumptions and modeling options to be made by the analyst, the method can be particularly useful in assessing risks associated with low-dose radiation exposures. PMID:26581473
Dose-response relationship with radiotherapy: an evidence?
The dose-response relationship is a fundamental basis of radiobiology. Despite abundant clinical data, it remains difficult to demonstrate a relation between dose and local control: the relative role of treatments associated with radiation therapy (surgery, chemotherapy, hormonal therapy), tumor heterogeneity, few prospective randomized studies, and uncertainty in local control assessment. Three situations are discussed: tumors with high local control probabilities, for which a dose effect is demonstrated by randomized studies (breast cancer) or sound retrospective data (soft tissue sarcomas); tumors with intermediate local control probabilities, for which a dose effect seems important according to retrospective studies and ongoing or published phase III trials (prostate cancer); and tumors with low local control probabilities, for which the dose effect appears modest beyond standard doses and inferior to the benefit of concurrent chemotherapy (lung and oesophageal cancer). For head and neck tumors, the dose-response relationship has been explored through hyperfractionation and accelerated radiation therapy; a dose effect has been demonstrated but must be weighed against the benefit of concurrent chemotherapy. Last but not least, the development of conformal radiotherapy allows exploration of the dose-response relationship for tumors, such as hepatocellular carcinomas, traditionally excluded from the field of conventional radiation therapy. In conclusion, the dose-response relationship remains a sound basis of radiation therapy for many tumors and is a parameter to take into account in further randomized studies. (author)
Radiation Dose-Response Relationships and Risk Assessment
The notion of a dose-response relationship was probably invented shortly after the discovery of poisons, the invention of alcoholic beverages, and the bringing of fire into a confined space in the forgotten depths of ancient prehistory. The amount of poison or medicine ingested can easily be observed to affect the behavior, health, or sickness outcome. Threshold effects, such as death, could be easily understood for intoxicants, medicines, and poisons. As Paracelsus (1493-1541), the 'father' of modern toxicology, said, 'It is the dose that makes the poison.' Perhaps less obvious is the fact that implicit in such dose-response relationships is also the notion of dose rate. Usually, the dose is administered fairly acutely, in a single injection, pill, or swallow; a few puffs on a pipe; or a meal of eating or drinking. The same amount of intoxicant, medicine, or poison administered over a week or month might have little or no observable effect. Thus, before the discovery of ionizing radiation in the late 19th century, toxicology ('the science of poisons') and pharmacology had deeply ingrained notions of dose-response relationships. This chapter demonstrates that the notion of a dose-response relationship for ionizing radiation is hopelessly simplistic from a scientific standpoint. While useful from a policy or regulatory standpoint, dose-response relationships cannot possibly convey enough information to describe the problem from a quantitative view of radiation biology, nor can they address societal values. Three sections of this chapter address the concepts, observations, and theories that contribute to the scientific input to the practice of managing risks from exposure to ionizing radiation. The presentation begins with irradiation regimes, followed by responses to high and low doses of ionizing radiation, and a discussion of how all of this can inform radiation risk management. The knowledge that is really needed for prediction of individual risk is presented.
Significance and statistical errors in the analysis of DNA microarray data
Brody, James P.; Williams, Brian A.; Wold, Barbara J.; Quake, Stephen R
2002-01-01
DNA microarrays are important devices for high-throughput measurement of gene expression, but no rational foundation has been established for understanding the sources of within-chip statistical error. We designed a specialized chip and protocol to investigate the distribution and magnitude of within-chip errors and discovered that, consistent with theoretical expectations, measurement errors follow a Lorentzian-like distribution, which explains the widely observed but unexplained ill-repro...
Dose response curve of 60Co for premature condensed chromosome fragments of human lymphocytes
The dose-response curves obtained by the premature condensed chromosome (PCC) and conventional cytogenetic methods can be represented by two linear equations. The ratio of the slopes, K_PCC/K_M1, is about 28. In comparison with the conventional method, the PCC method has many advantages: it is faster, simpler, more sensitive, and more accurate. Its significance in the study of radiation damage is also discussed.
Dominic Beaulieu-Prévost
2006-03-01
For the last 50 years of research in the quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on rejection (or not) of the null hypothesis. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and implausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to an NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power with an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
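The three-value logic described in this abstract is easy to state in code. The comparison against an analyst-chosen minimal substantial effect follows the abstract; the exact decision rule and labels below are assumptions for illustration:

```python
def ci_decision(ci_low, ci_high, min_substantial):
    # Three-value logic from a confidence interval: the CI's position
    # relative to the smallest effect deemed substantial decides the call.
    if ci_low >= min_substantial:
        return "probable presence"    # whole CI above the minimal effect
    if ci_high < min_substantial:
        return "probable absence"     # whole CI below the minimal effect
    return "undetermined"             # CI straddles the minimal effect
```

Unlike a point-null test, this rule can positively support the *absence* of a substantial effect when the interval sits entirely below the chosen threshold.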
The difficulties involved in the control of biological and radioimmunological assay systems and in the maintenance of standards, as well as the usual heterogeneity of assayed samples, require some evidence of similarity between the dose-response curves obtained with the standard and with the sample. Nowadays the parallelism test is used to provide such evidence. However, indications of non-normal error distributions, such as the presence of outliers, render the parallelism test both conceptually implausible and statistically inefficient. We therefore suggest the non-parametric 'frequencial' test as a sounder option. (author)
Dose response in prostate cancer with 8-12 years' follow-up
/mL, although large numbers of patients are required to demonstrate a difference. The radiation dose, Gleason score, and palpation T stage were significant predictors for the entire patient set, as well as for those with pretreatment PSA levels between 10 and 20 ng/mL. The FDM rate for all patients included in this series was 89%, 83%, and 83% at 5, 10, and 12 years, respectively. For patients with pretreatment PSA levels 9 years of median follow-up confirm the existence of a dose response for both bNED control and FDM. The dose response in prostate cancer is real, and the absence of biochemical recurrence after 8 years demonstrates the lack of late failure and suggests cure
Dose-response curve to salbutamol during acute and chronic treatment with formoterol in COPD
La Piana GE
2011-07-01
Giuseppe Emanuele La Piana¹, Luciano Corda², Enrica Bertella¹, Luigi Taranto Montemurro¹, Laura Pini¹, Claudio Tantucci¹; ¹Cattedra di Malattie dell'Apparato Respiratorio, Università di Brescia, ²Prima Divisione di Medicina Interna, Spedali Civili, Brescia, Italy. Background: Use of short-acting β2-agonists in chronic obstructive pulmonary disease (COPD) during treatment with long-acting β2-agonists is recommended as needed, but its effectiveness is unclear. The purpose of this study was to assess the additional bronchodilating effect of increasing doses of salbutamol during acute and chronic treatment with formoterol in patients with COPD. Methods: Ten patients with COPD underwent a dose-response curve to salbutamol (up to a cumulative dose of 800 µg) after a 1-week washout (baseline), 8 hours after the first administration of formoterol 12 µg (day 1), and after 12-week and 24-week periods of treatment with formoterol (12 µg twice daily by dry powder inhaler). Peak expiratory flow, forced expiratory volume in one second (FEV1), forced vital capacity, and inspiratory capacity were measured at the different treatment periods and at the different steps of the dose-response curve. Results: Despite acute or chronic administration of formoterol, maximal values of peak expiratory flow, FEV1, and forced vital capacity after 800 µg of salbutamol were unchanged compared with baseline. The baseline FEV1 dose-response curve was steeper than that at day 1, week 12, or week 24 (P < 0.0001). Within each dose-response curve, FEV1 differed only at baseline and at day 1 (P < 0.001), when FEV1 was still greater at 800 µg than at 0 µg (P < 0.02). In contrast, the forced vital capacity dose-response curves were similar across periods, while within each dose-response curve, forced vital capacity differed in all instances (P < 0.001), always being higher at 800 µg than at 0 µg (P < 0.05). Conclusion: In patients with stable COPD, the maximal effect
Model Averaging Software for Dichotomous Dose Response Risk Estimation
Matthew W. Wheeler
2008-02-01
Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models that are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose-response model from which benchmark dose and benchmark dose lower bound estimates are derived. The software fulfills a need for risk assessors, allowing them to go beyond a single model in risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
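A minimal sketch of the averaging step, assuming Akaike-type weights; note that MADr-BMD averages the fitted dose-response models themselves rather than the per-model estimates, so averaging benchmark doses directly, as below, is a simplification for illustration:

```python
import math

def akaike_weights(aics):
    # w_i is proportional to exp(-delta_AIC_i / 2), normalised to sum to one,
    # so better-fitting models (lower AIC) carry more weight.
    best = min(aics)
    raw = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def model_averaged_estimate(estimates, aics):
    # Weighted combination of per-model estimates (e.g. benchmark doses).
    return sum(w * e for w, e in zip(akaike_weights(aics), estimates))
```

With equal AICs every model contributes equally; a model fitting 2 AIC units worse gets roughly e^-1 ≈ 0.37 times the weight of the best model before normalisation.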
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance of statistics and its two kinds; the presentation of sample data; the definition, illustration, and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution, and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained. The text the
Krumbholz, Aniko; Anielski, Patricia; Gfrerer, Lena; Graw, Matthias; Geyer, Hans; Schänzer, Wilhelm; Dvorak, Jiri; Thieme, Detlef
2014-01-01
Clenbuterol is a well-established β2-agonist, which is prohibited in sports and strictly regulated for use in the livestock industry. During the last few years, clenbuterol-positive results in doping controls and in samples from residents of or travellers from a high-risk country were suspected to be related to the illegal use of clenbuterol for fattening. A sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed to detect low clenbuterol residues in hair with a detection limit of 0.02 pg/mg. A sub-therapeutic application study and a field study with volunteers who have a high risk of contamination were performed. For the application study, a total dosage of 30 µg clenbuterol was applied to 20 healthy volunteers on 5 subsequent days. One month after the beginning of the application, clenbuterol was detected in the proximal hair segment (0-1 cm) in concentrations between 0.43 and 4.76 pg/mg. For the second part, samples of 66 Mexican soccer players were analyzed. In 89% of these volunteers, clenbuterol was detectable in their hair at concentrations between 0.02 and 1.90 pg/mg. A comparison of both parts showed no statistical difference between sub-therapeutic application and contamination. In contrast, discrimination from a typical abuse of clenbuterol is apparently possible. These findings allow results of real doping control samples to be evaluated. PMID:25388545
Onwuegbuzie, Anthony J.; Roberts, J. Kyle; Daniel, Larry G.
2005-01-01
In this article, the authors (a) illustrate how displaying disattenuated correlation coefficients alongside their unadjusted counterparts will allow researchers to assess the impact of unreliability on bivariate relationships and (b) demonstrate how a proposed new "what if reliability" analysis can complement null hypothesis significance tests of…
Schöllnberger, H; Kaiser, J C; Jacob, P; Walsh, L
2012-05-01
The non-cancer mortality data for cerebrovascular disease (CVD) and cardiovascular diseases from Report 13 on the atomic bomb survivors published by the Radiation Effects Research Foundation were analysed to investigate the dose-response for the influence of radiation on these detrimental health effects. Various parametric and categorical models (such as linear-no-threshold (LNT) and a number of threshold and step models) were analysed with a statistical selection protocol that rated the model description of the data. Instead of applying the usual approach of identifying one preferred model for each data set, a set of plausible models was applied, and a sub-set of non-nested models was identified that all fitted the data about equally well. Subsequently, this sub-set of non-nested models was used to perform multi-model inference (MMI), an innovative method of mathematically combining different models to allow risk estimates to be based on several plausible dose-response models rather than just relying on a single model of choice. This procedure thereby produces more reliable risk estimates based on a more comprehensive appraisal of model uncertainties. For CVD, MMI yielded a weak dose-response (with a risk estimate of about one-third of the LNT model) below a step at 0.6 Gy and a stronger dose-response at higher doses. The calculated risk estimates are consistent with zero risk below this threshold-dose. For mortalities related to cardiovascular diseases, an LNT-type dose-response was found with risk estimates consistent with zero risk below 2.2 Gy based on 90% confidence intervals. The MMI approach described here resolves a dilemma in practical radiation protection when one is forced to select between models with profoundly different dose-responses for risk estimates. PMID:22437350
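The MMI combination of a sub-set of plausible models can be sketched as a weighted sum of their predictions at a given dose. The model shapes and all coefficients below are placeholders for illustration, not the fitted values from Report 13:

```python
def mmi_risk(dose, models, weights):
    # Multi-model inference: the combined risk estimate is the weighted sum
    # of each plausible model's prediction at the same dose.
    return sum(w * m(dose) for m, w in zip(models, weights))

# Illustrative (not fitted) dose-response shapes:
lnt = lambda d: 0.3 * d                        # linear no-threshold
step = lambda d: 0.0 if d < 0.6 else 0.3 * d   # zero risk below a 0.6 Gy step
```

Where the step model contributes zero below its threshold, the averaged curve is pulled toward zero at low doses, which is how MMI can yield a weaker low-dose response than LNT alone.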
Jewett, M. E.; Dijk, D. J.; Kronauer, R. E.; Dinges, D. F.
1999-01-01
Although it has been well documented that sleep is required for human performance and alertness to recover from low levels after prolonged periods of wakefulness, it remains unclear whether they increase in a linear or asymptotic manner during sleep. It has been postulated that there is a relation between the rate of improvement in neurobehavioral functioning and rate of decline of slow-wave sleep and/or slow-wave activity (SWS/SWA) during sleep, but this has not been verified. Thus, a cross-study comparison was conducted in which dose-response curves (DRCs) were constructed for Stanford Sleepiness Scale (SSS) and Psychomotor Vigilance Task (PVT) tests taken at 1000 hours by subjects who had been allowed to sleep 0 hours, 2 hours, 5 hours or 8 hours the previous night. We found that the DRCs to each PVT metric improved in a saturating exponential manner, with recovery rates that were similar [time constant (T) approximately 2.14 hours] for all the metrics. This recovery rate was slightly faster than, though not statistically significantly different from, the reported rate of SWS/SWA decline (T approximately 2.7 hours). The DRC to the SSS improved much more slowly than psychomotor vigilance, so that it could be fit equally well by a linear function (slope = -0.26) or a saturating exponential function (T = 9.09 hours). We conclude that although SWS/SWA, subjective alertness, and a wide variety of psychomotor vigilance metrics may all change asymptotically during sleep, it remains to be determined whether the underlying physiologic processes governing their expression are different.
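A saturating exponential recovery curve of the kind fitted to the DRCs can be written directly, and its time constant recovered by a crude grid search; the asymptote, amplitude, and candidate values below are illustrative assumptions, not the study's fitted parameters:

```python
import math

def saturating_exp(t, asymptote, amplitude, tau):
    # Performance recovers toward `asymptote` with time constant `tau`.
    return asymptote - amplitude * math.exp(-t / tau)

def fit_tau(times, values, asymptote, amplitude, candidates):
    # Crude grid search over candidate time constants (illustration only).
    def sse(tau):
        return sum((saturating_exp(t, asymptote, amplitude, tau) - v) ** 2
                   for t, v in zip(times, values))
    return min(candidates, key=sse)
```

With a time constant near 2.14 hours, roughly 63% of the recovery is complete after one time constant of sleep and over 95% after three.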
Asymptotic significance levels of tests for monotone trends in rates or proportions can be profoundly anticonservative when applied to small numbers of events and when distributions of exposure to risk are highly skewed. In such cases, Monte Carlo (MC) estimation of observed levels of significance ('p-values') can be very useful. We describe a simple technique of importance sampling (IS) which can greatly improve the efficiency of MC estimation in this setting. Implementation of the IS technique is described, and the variance of the IS estimator is derived. It is shown that, in many situations likely to occur in practice, the variance is substantially less than that of a simple MC estimator proposed earlier. Generalizations beyond the case of survival data without ties are described, and the use of IS is illustrated with data regarding mortality among atomic bomb survivors. (author)
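The abstract applies IS to permutation distributions of trend tests; the variance-reduction idea itself can be shown on a simpler problem, estimating a small normal tail probability by sampling from a shifted proposal and reweighting. The proposal choice below is an assumption for illustration, not the paper's method:

```python
import math
import random

def is_tail_prob(z, n=100_000, seed=1):
    # Estimate P(Z > z) for standard normal Z by importance sampling:
    # draw from the shifted proposal N(z, 1), so about half the draws land
    # in the tail, and reweight each hit by the likelihood ratio
    # phi(x) / phi(x - z) = exp(-z*x + z*z/2).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(z, 1.0)
        if x > z:
            total += math.exp(-z * x + z * z / 2.0)
    return total / n
```

Plain MC at z = 3 wastes almost every draw (only ~0.13% land in the tail); the shifted proposal concentrates samples where the integrand lives, which is exactly the efficiency gain the paper exploits for p-value estimation.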
Diethylene glycol-induced toxicities show marked threshold dose response in rats
Landry, Greg M., E-mail: Landry.Greg@mayo.edu [Department of Pharmacology, Toxicology, & Neuroscience, Louisiana State University Health Sciences Center, Shreveport, LA (United States); Dunning, Cody L., E-mail: cdunni@lsuhsc.edu [Department of Pharmacology, Toxicology, & Neuroscience, Louisiana State University Health Sciences Center, Shreveport, LA (United States); Abreo, Fleurette, E-mail: fabreo@lsuhsc.edu [Department of Pathology, Louisiana State University Health Sciences Center, Shreveport, LA (United States); Latimer, Brian, E-mail: blatim@lsuhsc.edu [Department of Pharmacology, Toxicology, & Neuroscience, Louisiana State University Health Sciences Center, Shreveport, LA (United States); Orchard, Elysse, E-mail: eorcha@lsuhsc.edu [Department of Pharmacology, Toxicology, & Neuroscience, Louisiana State University Health Sciences Center, Shreveport, LA (United States); Division of Animal Resources, Louisiana State University Health Sciences Center, Shreveport, LA (United States); McMartin, Kenneth E., E-mail: kmcmar@lsuhsc.edu [Department of Pharmacology, Toxicology, & Neuroscience, Louisiana State University Health Sciences Center, Shreveport, LA (United States)
2015-02-01
Diethylene glycol (DEG) exposure poses risks to human health because of widespread industrial use and accidental exposures from contaminated products. To enhance the understanding of the mechanistic role of metabolites in DEG toxicity, this study used a dose response paradigm to determine a rat model that would best mimic DEG exposure in humans. Wistar and Fischer-344 (F-344) rats were treated by oral gavage with 0, 2, 5, or 10 g/kg DEG and blood, kidney and liver tissues were collected at 48 h. Both rat strains treated with 10 g/kg DEG had equivalent degrees of metabolic acidosis, renal toxicity (increased BUN and creatinine and cortical necrosis) and liver toxicity (increased serum enzyme levels, centrilobular necrosis and severe glycogen depletion). There was no liver or kidney toxicity at the lower DEG doses (2 and 5 g/kg) regardless of strain, demonstrating a steep threshold dose response. Kidney diglycolic acid (DGA), the presumed nephrotoxic metabolite of DEG, was markedly elevated in both rat strains administered 10 g/kg DEG, but no DGA was present at 2 or 5 g/kg, asserting its necessary role in DEG-induced toxicity. These results indicate that mechanistically in order to produce toxicity, metabolism to and significant target organ accumulation of DGA are required and that both strains would be useful for DEG risk assessments. - Highlights: • DEG produces a steep threshold dose response for kidney injury in rats. • Wistar and F-344 rats do not differ in response to DEG-induced renal injury. • The dose response for renal injury closely mirrors that for renal DGA accumulation. • Results demonstrate the importance of DGA accumulation in producing kidney injury.
We verified the setup error (SE) achieved by two-person radiation therapist teams consisting of staff therapists and newly hired therapists, and performed significance tests on the SE of the staff group versus the new-hire group. One group consisted of four staff therapists with 5 to 30 years of experience; the other consisted of newly hired radiation therapists with 1 to 1.5 years of experience. We analyzed 53 patients diagnosed with pelvic cancer (seven who underwent 3-dimensional conformal radiation therapy (3DCRT) and 46 who underwent intensity-modulated radiation therapy (IMRT)). Image verification was performed 1460 times through setup verification by cone beam computed tomography (CBCT), and we measured the SE in four directions (lateral, longitudinal, vertical, 3D). We performed Student's t-test on the difference in average error between the staff group and the new-hire group. The results of the significance tests show no difference between the SE of the staff group and that of the new-hire group in radiotherapy. (author)
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Márcio Mourão
We investigated commonly used methods (autocorrelation, Enright, and discrete Fourier transform) to estimate the periodicity of oscillatory data and to determine which method most accurately estimates periods while being least vulnerable to the presence of noise. Both simulated and experimental data were used in the analysis. We determined the significance of calculated periods by applying these methods to several random permutations of the data and then calculating the probability of obtaining the period's peak in the corresponding periodograms. Our analysis suggests that the Enright method is the most accurate for estimating the period of oscillatory data. We further show that, to accurately estimate the period of oscillatory data, at least five cycles must be sampled, using at least four data points per cycle. These results suggest that the Enright method should be more widely applied in order to improve the analysis of oscillatory data.
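The permutation-based significance procedure described above can be sketched with the autocorrelation method (the simplest of the three compared); the number of permutations and the choice of peak statistic are illustrative assumptions:

```python
import math
import random

def autocorr(series, lag):
    # Sample autocorrelation at a given lag, normalised by total variance.
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) or 1.0
    return sum((series[i] - mean) * (series[i + lag] - mean)
               for i in range(n - lag)) / var

def autocorr_period(series, min_lag=2):
    # Period estimate: the lag with the strongest positive autocorrelation.
    lags = range(min_lag, len(series) // 2)
    return max(lags, key=lambda lag: autocorr(series, lag))

def permutation_pvalue(series, n_perm=199, seed=0, min_lag=2):
    # Significance as in the abstract: compare the observed peak against
    # peaks computed on random permutations of the same data.
    rng = random.Random(seed)
    def peak(s):
        return max(autocorr(s, lag) for lag in range(min_lag, len(s) // 2))
    observed = peak(series)
    exceed = sum(peak(rng.sample(series, len(series))) >= observed
                 for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)
```

On a clean sinusoid the observed peak far exceeds anything the shuffled data produce, so the p-value bottoms out at 1/(n_perm + 1).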
A company's overall safety program becomes an important consideration to continue performing work and for procuring future contract awards. When injuries or accidents occur, the employer ultimately loses on two counts - increased medical costs and employee absences. This paper summarizes the human and organizational components that contributed to successful safety programs implemented by WESKEM, LLC's Environmental, Safety, and Health Departments located in Paducah, Kentucky, and Oak Ridge, Tennessee. The philosophy of 'safety, compliance, and then production' and programmatic components implemented at the start of the contracts were qualitatively identified as contributing factors resulting in a significant accumulation of safe work hours and an Experience Modification Rate (EMR) of <1.0. Furthermore, a study by the Associated General Contractors of America quantitatively validated components, already found in the WESKEM, LLC programs, as contributing factors to prevent employee accidents and injuries. Therefore, an investment in the human and organizational components now can pay dividends later by reducing the EMR, which is the key to reducing Workers' Compensation premiums. Also, knowing your employees' demographics and taking an active approach to evaluate and prevent fatigue may help employees balance work and non-work responsibilities. In turn, this approach can assist employers in maintaining a healthy and productive workforce. For these reasons, it is essential that safety needs be considered as the starting point when performing work. (authors)
Mutans Streptococci Dose Response to Xylitol Chewing Gum
Milgrom, P.; Ly, K.A.; Roberts, M C; Rothen, M; Mueller, G.; Yamaguchi, D.K.
2006-01-01
Xylitol is promoted in caries-preventive strategies, yet its effective dose range is unclear. This study determined the dose-response of mutans streptococci in plaque and unstimulated saliva to xylitol gum. Participants (n = 132) were randomized: controls (G1) (sorbitol/maltitol), or combinations giving xylitol 3.44 g/day (G2), 6.88 g/day (G3), or 10.32 g/day (G4). Groups chewed 3 pellets/4 times/d. Samples were taken at baseline, 5 wks, and 6 mos, and were cultured on modified Mitis Salivari...
Purpose: MR-Linac devices under development worldwide will require standard calibration, commissioning, and quality assurance. Solid-state radiation detectors are often used for dose profile and percent depth dose measurements. The dose response of selected solid-state detectors is therefore evaluated in varying transverse and longitudinal magnetic fields for this purpose. Methods: The Monte Carlo code PENELOPE was used to model irradiation of a PTW 60003 diamond detector and an IBA PFD diode detector in the presence of a magnetic field. The field itself was varied in strength and oriented both transversely and longitudinally with respect to the incident photon beam. The long axis of the detectors was oriented either parallel or perpendicular to the photon beam. The dose to the active volume of each detector in air was scored, and its ratio to the dose at zero magnetic field strength was determined as the "dose response" in a magnetic field. Measurements at low fields in transverse magnetic fields were taken for both detectors to evaluate the accuracy of the simulations. Additional simulations were performed in a water phantom to obtain a few representative points for beam profile and percent depth dose measurements. Results: Simulations show a significant dose response as a function of magnetic field in transverse field geometries. This response can approach 20% at 1.5 T, and it is highly dependent on the detectors' relative orientation to the magnetic field, the energy of the photon beam, and detector composition. Measurements at low transverse magnetic fields verify the simulations for both detectors in their relative orientations to the radiation beam. Longitudinal magnetic fields, in contrast, show little dose response, rising slowly with magnetic field and reaching 0.5%-1% at 1.5 T regardless of detector orientation. Water tank and in-air simulation results were the same within simulation uncertainty where lateral electronic equilibrium is present and expectedly
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside back cover of the Review shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products
Maximum likelihood estimation for cytogenetic dose-response curves
In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ[γd + g(t, τ)d²], where t is time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure
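The Poisson log-likelihood for a linear-quadratic yield can be written directly; the grid search below is a crude stand-in for the general-purpose maximum likelihood procedure the abstract describes, and all parameter values in the test of use are invented for illustration:

```python
import math

def poisson_loglik(alpha, beta, doses, dicentrics, cells):
    # log L = sum over dose points of [ y log(mu) - mu ], dropping the
    # log(y!) constant; mu = n * (alpha*d + beta*d^2) is the expected
    # dicentric count among n scored cells at dose d.
    ll = 0.0
    for d, y, n in zip(doses, dicentrics, cells):
        mu = n * (alpha * d + beta * d * d)
        ll += y * math.log(mu) - mu
    return ll

def grid_mle(doses, dicentrics, cells, alphas, betas):
    # Crude grid-search MLE: pick the (alpha, beta) pair maximising log L.
    return max(((a, b) for a in alphas for b in betas),
               key=lambda p: poisson_loglik(p[0], p[1],
                                            doses, dicentrics, cells))
```

Because y·log(μ) − μ is maximised at μ = y for each dose point, data generated exactly from a given (α, β) are recovered whenever that pair is on the grid; a real analysis would use an iterative optimiser instead.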
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products.
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees.
Dose selection for prostate cancer patients based on dose comparison and dose response studies
Purpose: To better define the appropriate dose for individual prostate cancer patients treated with three-dimensional conformal radiation therapy (3D CRT). Methods and Materials: Six hundred eighteen patients treated with 3D CRT between 4/89 and 4/97 with a median follow-up of 53 months are the subject of this study. The bNED outcomes were assessed by the American Society for Therapeutic Radiology and Oncology (ASTRO) definition. The patients were grouped into three groups by prostate-specific antigen (PSA) level (<10 ng/ml, 10-19.9 ng/ml, and 20+ ng/ml) and further subgrouped into six subgroups by favorable (T1, 2A and Gleason score ≤6 and no perineural invasion) and unfavorable characteristics (one or more of T2B, T3, Gleason 7-10, perineural invasion). Dose comparisons for bNED studies were made for each of the six subgroups by dividing patients at 76 Gy for all subgroups except the favorable <10 ng/ml subgroup, which was divided at 72.5 Gy. Five-year bNED rates were compared for the median dose of each dose comparison subgroup. Dose response functions were plotted based on 5-year bNED rates for the six patient groupings, with the data from each of the six subgroups divided into three dose groups. The 5-year bNED rate was also estimated using the dose response function to compare 73 Gy with 78 Gy. Results: Dose comparisons show a significant difference in 5-year bNED rates for three of the six subgroups but not for the favorable <10 ng/ml, the favorable 10-19.9 ng/ml, or the unfavorable ≥20 ng/ml subgroups. The significant differences ranged from 22% to 40% improvement in 5-year bNED with higher dose. Dose response functions show significant differences in 5-year bNED rates comparing 73 Gy and 78 Gy for four of the six subgroups. Again, no difference was observed for the favorable <10 ng/ml group or the unfavorable ≥20 ng/ml group. The significant differences observed in 5-year bNED ranged from 15% to 43%. Conclusions: Dose response varies by patient
Time course and dose response of alpha tocopherol on oxidative stress in haemodialysis patients
Coombes Jeff S
2009-10-01
Full Text Available Abstract Background Oxidative stress is associated with increased cardiovascular morbidity and mortality, particularly in patients with end stage kidney disease. Although observational data from the general population have shown dietary antioxidant intake is associated with reduced cardiovascular morbidity and mortality, most clinical intervention trials have failed to support this relationship. This may be a consequence of not using an effective antioxidant dose and/or not investigating patients with elevated oxidative stress. The SPACE study, conducted in haemodialysis patients, reported that 800 IU/day of alpha tocopherol significantly reduced cardiovascular disease endpoints. A recent time course and dose response study conducted in hypercholesterolaemic patients found that 1600 IU/day of alpha tocopherol was an optimal dose. There is no such dose response data available for haemodialysis patients. Therefore the aim of this study is to investigate the effect of different doses of oral alpha tocopherol on oxidative stress in haemodialysis patients with elevated oxidative stress and the time taken to achieve this effect. Methods The study will consist of a time-course followed by a dose response study. In the time course study 20 haemodialysis patients with elevated oxidative stress will take either 1600 IU/day natural (RRR) alpha tocopherol for 20 weeks or placebo. Blood will be collected every two weeks and analysed for a marker of oxidative stress (plasma F2-isoprostanes) and alpha tocopherol. The optimum time period to significantly decrease plasma F2-isoprostanes will be determined from this study. In the dose response study 60 patients will be randomised to receive either placebo, 100, 200, 400, 800 or 1600 IU/day of natural (RRR) alpha tocopherol for a time period determined from the time course study. Blood will be collected at baseline and every two weeks and analysed for plasma F2-isoprostanes and alpha tocopherol. It is hypothesised that
Dose response curves for effects of low-level radiation
The linear dose-response model used by international committees to assess the genetic and carcinogenic hazards of low-level radiation appears to be the most reasonable interpretation of the available scientific data that are relevant to this topic. There are, of course, reasons to believe that this model may overestimate radiation hazards in certain instances, a fact acknowledged in recent reports of these committees. The linear model is now also being utilized to estimate the potential carcinogenic hazards of other agents such as asbestos and polycyclic aromatic hydrocarbons. This model implies that there is no safe dose for any of these agents and that potential health hazards will increase in direct proportion to total accumulated dose. The practical implication is the recommendation that all exposures should be kept 'as low as reasonably achievable, economic and social factors being taken into account'. (auth)
Dose Response of Alanine Detectors Irradiated with Carbon Ion Beams
Herrmann, Rochus; Jäkel, Oliver; Palmans, Hugo;
2011-01-01
Purpose: The dose response of the alanine detector shows a dependence on particle energy and type when irradiated with ion beams. The purpose of this study is to investigate the response behaviour of the alanine detector in clinical carbon ion beams and compare the results with model predictions. Methods: Alanine detectors have been irradiated with carbon ions with an energy range of 89-400 MeV/u. The relative effectiveness of alanine has been measured in this regime. Pristine and spread-out Bragg peak depth-dose curves have been measured with alanine dosimeters. The track-structure based alanine response model developed by J. Hansen and K. Olsen has been implemented in the Monte Carlo code FLUKA, and calculations were compared with experimental results. Results: Calculations of the relative effectiveness deviate less than 5% from the measured values for mono-energetic beams. Measured depth...
Proposal of a probabilistic dose-response model
A biologically updated dose-response model is presented as an alternative to the linear-quadratic model currently in use for cancer risk assessment. The new model is based on the probability functions for misrepair and/or unrepair of DNA lesions, in terms of the radiation damage production rate in the cell (supposedly, a stem cell) and its repair-rate constant. The model makes use of the ''dose and dose-rate effectiveness factor'' of ICRP, interpreting it on the basis of misrepair probabilities, and provides a way for a continuous extrapolation between the high and low dose-rate regions, ratifying the ''linear non-threshold hypothesis'' as the main option. However, the model raises some doubts about the additivity of dose. (author)
Baluev, Roman V.
2013-11-01
We consider the `multifrequency' periodogram, in which the putative signal is modelled as a sum of two or more sinusoidal harmonics with independent frequencies. It is useful in cases when the data may contain several periodic components, especially when their interaction with each other and with the data sampling patterns might produce misleading results. Although the multifrequency statistic itself was constructed earlier, for example by G. Foster in his CLEANest algorithm, its probabilistic properties (the detection significance levels) are still poorly known and much of what is deemed known is not rigorous. These detection levels are nonetheless important for data analysis. We argue that to prove the simultaneous existence of all n components revealed in a multiperiodic variation, it is mandatory to apply at least 2n - 1 significance tests, among which most involve various multifrequency statistics, and only n tests are single-frequency ones. The main result of this paper is an analytic estimation of the statistical significance of the frequency tuples that the multifrequency periodogram can reveal. Using the theory of extreme values of random fields (the generalized Rice method), we find a useful approximation to the relevant false alarm probability. For the double-frequency periodogram, this approximation is given by the elementary formula (π/16)W²e^(−z)z², where W denotes the normalized width of the settled frequency range, and z is the observed periodogram maximum. We carried out intensive Monte Carlo simulations to show that the practical quality of this approximation is satisfactory. A similar analytic expression for the general multifrequency periodogram is also given, although with less numerical verification.
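The elementary false-alarm approximation quoted above is simple to evaluate directly. The function name and the example values of z and W below are arbitrary illustrations, not from the paper:

```python
import math

def fap_double_frequency(z, W):
    """Approximate false alarm probability for a maximum z of the
    double-frequency periodogram scanned over a normalized frequency
    width W, via the Rice-method estimate (pi/16) * W^2 * exp(-z) * z^2."""
    return (math.pi / 16.0) * W**2 * math.exp(-z) * z**2

# Example: a periodogram maximum of z = 30 over a normalized width W = 100
print(f"FAP ~ {fap_double_frequency(30.0, 100.0):.3e}")
```

As expected of a false alarm probability, the estimate falls steeply with the observed maximum z and grows with the width of the scanned frequency range.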
The dose-response relationship for UV-tumorigenesis
The main objective of the investigations was to extend the knowledge on experimental UV-carcinogenesis and to use the experimental results as guidelines for developing a dose-response model for UV-carcinogenesis. The animal experiments carried out were all long-term ones. It was decided that - in anticipation of the data to be obtained - a model for such an assessment should be developed using the experimental results available at the start of the present study (1977). This initial study is presented. The results of two animal experiments are presented, which show that UV radiation is capable of inducing a systemic effect that enhances the de novo formation of UV induced tumors. The results of the main experiment are presented. In this experiment groups of mice were subjected to daily exposure to a certain dose of UV radiation in order to find the dose-response relationship. The relation between the daily dose and the duration of the treatment till the appearance of tumors (for instance, as measured by the yield) was ascertained for tumors of different sizes. It appears that the growth of a tumor is dose-independent, and, therefore, only the initiation of a tumor is dose-dependent. Finally an experiment is presented in which it was measured that, if a mouse is subjected to daily UV exposure, the transmission of the epidermis in the shortwave UV region decreases continuously. This decrease is due to hyperplasia of the epidermis, i.e., thickening of the epidermis by an increase in the number of cells per unit surface area. (Auth.)
Dose-response relationships for radium-induced bone sarcomas
The incidence of bone sarcomas among 3055 female radium-dial workers who entered the dial industry before 1950 was used to determine dose-response relationships for the induction of bone sarcomas by radium. Two subpopulations were analyzed: all measured cases who survived at least five years after the start of employment and all cases who survived at least two years after first measurement. The first constituted a group based on year of entry; it contained 1468 women who experienced 42 bone sarcomas; the expected number was 0.4. The second comprised a group based on first measurement; it contained 1257 women who experienced 13 bone sarcomas; the expected number was 0.2. The dose-response function, I = (C + αD + βD²)e^(−βD), and simplifications of this general form, were fit to each data set. Two functions, I = (C + αD + βD²)e^(−βD) and I = (C + βD²)e^(−βD), fit the data for year of entry (p ≥ 0.05); both these functions and I = (C + αD) fit the data for first measurement. The function I = (C + βD²)e^(−βD) was used to predict the number of bone sarcomas in all other pre-1950 radium cases (medical, laboratory, and other exposure); fewer were actually observed than the fit of this function to the female dial workers predicted.
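The general dose-response form quoted above can be written down directly. This is a minimal sketch; the function name and the parameter values in the example are invented for illustration, not the fitted values from the study:

```python
import math

def bone_sarcoma_incidence(D, C, alpha, beta):
    """Dose-response function I = (C + alpha*D + beta*D^2) * exp(-beta*D):
    a quadratic induction term damped by an exponential cell-killing term,
    as in the radium-dial analysis. Parameter values are illustrative."""
    return (C + alpha * D + beta * D**2) * math.exp(-beta * D)

# The exponential factor makes incidence rise, peak, then fall at high dose.
for D in (0.0, 50.0, 200.0, 2000.0):
    print(f"D={D:7.1f}  I={bone_sarcoma_incidence(D, 0.5, 0.1, 0.01):.4f}")
```

Setting α = 0 recovers the pure-quadratic variant I = (C + βD²)e^(−βD) that the abstract reports as the preferred predictive form.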
Andersen, Klaus Ejner; Lidén, C; Hansen, J;
1993-01-01
of the different concentrations. Readings were performed blind. The results were analysed by means of polynomial multiple-regression methods and a logistic dose-response model. Half the patients (38/72) had a threshold patch-test concentration for nickel sulphate in the range of 3-0.3 microgram/cm2. The 'angry back' phenomenon was not apparent in this study, as the spill-over effect was not statistically significant. Strong reactions to high concentrations of nickel sulphate did not enhance the response to adjacent lower concentrations of nickel sulphate.
L. Østvand
2014-03-01
Full Text Available Various interpretations of the notion of a trend in the context of global warming are discussed, contrasting the difference between viewing a trend as the deterministic response to an external forcing and viewing it as a slow variation which can be separated from the background spectral continuum of long-range persistent climate noise. The emphasis in this paper is on the latter notion, and a general scheme is presented for testing a multi-parameter trend model against a null hypothesis which models the observed climate record as an autocorrelated noise. The scheme is applied to the instrumental global sea-surface temperature record and the global land temperature record. A trend model comprising a linear plus an oscillatory trend with a period of approximately 70 yr, and the statistical significance of the trends, are tested against three different null models: first-order autoregressive process, fractional Gaussian noise, and fractional Brownian motion. The parameters of the null models are estimated from the instrumental record, but are also checked to be consistent with a Northern Hemisphere temperature reconstruction prior to 1750 for which an anthropogenic trend is negligible. The linear trend in the period 1850–2010 AD is significant in all cases, but the oscillatory trend is insignificant for ocean data and barely significant for land data. However, by using the significance of the linear trend to constrain the null hypothesis, the oscillatory trend in the land record appears to be statistically significant. The results suggest that the global land record may be better suited for detection of the global warming signal than the ocean record.
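The testing scheme described (a trend statistic compared against surrogates drawn from a fitted null model) can be sketched for its simplest case: a linear trend tested against an AR(1) null. All names are mine, and the record is simulated, not the instrumental data:

```python
import numpy as np

rng = np.random.default_rng(42)

def linear_slope(y):
    """Ordinary least-squares slope of y against the time index."""
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]

def ar1_surrogate(n, phi, sigma, rng):
    """Simulate a first-order autoregressive (AR(1)) noise record."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(0.0, sigma)
    return x

# Synthetic 'temperature' record: a genuine linear trend plus AR(1) noise.
n = 160
record = 0.005 * np.arange(n) + ar1_surrogate(n, phi=0.6, sigma=0.1, rng=rng)

# Null hypothesis: the record is pure AR(1) noise. Estimate phi and sigma
# from the detrended record, then build the null distribution of the slope.
t = np.arange(n)
resid = record - np.polyval(np.polyfit(t, record, 1), t)
phi_hat = np.corrcoef(resid[:-1], resid[1:])[0, 1]
sigma_hat = np.std(resid) * np.sqrt(1.0 - phi_hat**2)

obs_slope = linear_slope(record)
null_slopes = np.array([linear_slope(ar1_surrogate(n, phi_hat, sigma_hat, rng))
                        for _ in range(2000)])
p_value = np.mean(np.abs(null_slopes) >= abs(obs_slope))
print(f"observed slope={obs_slope:.5f}, Monte Carlo p={p_value:.3f}")
```

The paper's full scheme additionally uses fractional Gaussian noise and fractional Brownian motion nulls and an oscillatory trend component; the Monte Carlo comparison step is the same shape as here.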
E. M. Dunne
2012-06-01
Full Text Available Observed correlations between short-term decreases in cosmic ray ionisation and cloud and aerosol properties have been attributed to short-term decreases in the ion-induced nucleation rate. We use a global aerosol microphysics model to determine whether a 10-day reduction of 15% in the nucleation rate could generate a statistically significant response in aerosol concentrations and optical properties. As an upper limit to the possible effect of changes in the ion-induced nucleation rate, we perturb the total nucleation rate, which has been shown to generate particle concentrations and nucleation events in reasonable agreement with global observations. When measured against a known aerosol control state, the model predicts a 0.15% decrease in global mean cloud condensation nucleus concentrations at the surface. However, taking into account the variability in aerosol, no statistically significant response can be detected in concentrations of particles with diameters larger than 10 nm, in cloud condensation nuclei with diameters larger than 70 nm, or in the Ångström exponent. The results suggest that the observed correlation between short-term decreases in cosmic ray ionisation and cloud and aerosol properties cannot be explained by associated changes in the large-scale nucleation rate.
Baluev, Roman V
2013-01-01
We consider the "multi-frequency" periodogram, in which the putative signal is modelled as a sum of two or more sinusoidal harmonics with independent frequencies. It is useful in cases when the data may contain several periodic components, especially when their interaction with each other and with the data sampling patterns might produce misleading results. Although the multi-frequency statistic itself was already constructed, e.g. by G. Foster in his CLEANest algorithm, its probabilistic properties (the detection significance levels) are still poorly known and much of what is deemed known is not rigorous. These detection levels are nonetheless important for the data analysis. We argue that to prove the simultaneous existence of all $n$ components revealed in a multi-periodic variation, it is mandatory to apply at least $2^n-1$ significance tests, among which most involve various multi-frequency statistics, and only $n$ tests are single-frequency ones. The main result of the paper is an analytic estima...
X-ray dose response of calcite-A comprehensive analysis for optimal application in TL dosimetry
Kalita, J. M.; Wary, G.
2016-09-01
The effect of various annealing treatments on the dosimetric characteristics of orange calcite (CaCO3) mineral has been studied in detail. Quantitative analysis of the dose response shows that the 573 K annealed sample has a sublinear dose response from 10 mGy to 1 Gy. The fading and reproducibility of this sample are also good enough for dosimetric application. However, a specific annealing treatment after irradiation yields some significant improvements in the dosimetric characteristics of the sample. The 773 K pre-annealed sample, after X-ray irradiation and post-annealing at 340 K for 6 min, provides a linear dose response from 10 mGy to 3.60 Gy, very little fading and good reproducibility. Moreover, this sample after post-annealing at 380 K for 6 min shows a linear dose response from 10 mGy to 5.40 Gy when analyzed from the ∼408 K thermoluminescence (TL) glow peak. Analysis of the TL glow curves confirmed that the 1.30 eV trap center in the calcite crystal is the most effective trapping site for dosimetric application.
Application of Dempster-Shafer theory in dose response outcome analysis
Chen, Wenzhou; Cui, Yunfeng; He, Yanyan; Yu, Yan; Galvin, James; Hussaini, Yousuff M.; Xiao, Ying
2012-09-01
The Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) reviews summarize the currently available three-dimensional dose/volume/outcome data from multi-institutions and numerous articles to update and refine the normal tissue dose/volume tolerance guidelines. As pointed out in the review, the data have limitations and even some inconsistency. However, with the help of new physical and statistical techniques, the information in the review could be updated so that patient care can be continually improved. The purpose of this work is to demonstrate the application of a mathematical theory, the Dempster-Shafer theory, in dose/volume/outcome data analysis. We applied this theory to the original data obtained from published clinical studies describing dose response for radiation pneumonitis. Belief and plausibility concepts were introduced for dose response evaluation. We were also able to consider the uncertainty and inconsistency of the data from these studies with Yager's combination rule, a special methodology of Dempster-Shafer theory, to fuse the data at several specific doses. The values of belief and plausibility functions were obtained at the corresponding doses. Then we applied the Lyman-Kutcher-Burman (LKB) model to fit these values and a belief-plausibility range was obtained. This range could be considered as a probability range to assist physicians and treatment planners in determining acceptable dose-volume constraints. Finally, the parameters obtained from the LKB model fitting were compared with those in Emami and Burman's papers and those from other frequentist statistics methods. We found that Emami and Burman's parameters are within the belief-plausibility range we calculated by the Dempster-Shafer theory.
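For uniform whole-organ irradiation, the Lyman-Kutcher-Burman model used in the fitting above reduces to a probit function of dose. A minimal sketch follows; the function name and the TD50 and m values in the example are illustrative assumptions, not the fitted parameters from the study:

```python
from math import erf, sqrt

def lkb_ntcp(dose, td50, m):
    """LKB normal tissue complication probability for uniform irradiation:
    NTCP = Phi(t), with t = (D - TD50) / (m * TD50) and Phi the standard
    normal CDF. TD50 is the dose giving 50% complication probability and
    m sets the steepness. Parameter values below are illustrative."""
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# By construction the curve passes through 0.5 at D = TD50 and rises
# steeply around it for small m.
print(f"NTCP at 24.5 Gy: {lkb_ntcp(24.5, td50=24.5, m=0.18):.2f}")
```

In the full model a non-uniform dose distribution is first collapsed to an effective uniform dose via the volume parameter n before this probit step; only the probit step is shown here.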
Linear dose response curves in fungi and tradescantia
Tradescantia Clone 02 data suggest that linear non-threshold dose responses are expected at the lowest doses and dose rates of low linear energy transfer (LET) radiation. This is likely to be true for other living organisms even though Clone 02 is radiation sensitive. It is concluded that Clone 02 is partially defective in the RAD 6 pathway for the repair of DNA interstrand cross-links (ISCL) and other loss of coding damage (LCD), based on its cross sensitivities to EMS and ionizing radiation. Tradescantia Clone 02 data showing linear non-threshold induction of somatic genetic events in part reflect the repair deficiency of this Clone. More DNA damage is repaired by recombinational mechanisms in Clone 02 than would occur in a wild-type strain. Two important classes of lesions are induced in DNA by ionizing radiation: double strand breaks (DSB), which are repaired by recombination mechanisms, and loss of coding information damage (LCD), which is repaired by error-prone mechanisms but may also be a substrate for recombinational repair. Based on data from yeast, there are two different repair pathways which deal with these differing lesions, with different somatic genetic consequences. From yeast, yield cross sections can be derived and applied to DNA damage and repair in Tradescantia. For Clone 02, per lesion, more visible genetic events are scored than in wild-type strains. In a radiation-derived sub-clone, Clone 0106, which is more variable than Clone 02, even more events occur per lesion. This derivative clone, plus breeding experiments, indicates that Clone 02 is heterozygous, or a 'carrier' for a mutant version of a gene in the Tradescantia RAD 6 repair pathway. Clone 02 is, therefore, much like a Fanconi's anemia carrier in a human population, while the Clone 0106 derivative is much like a homozygous Fanconi's anemia patient, with respect to its response to ionizing radiation damage. Two anomalies in its dose response curves for 'pink' loss of
A Randomized, Open-Label, Dose-Response Study of Losartan in Hypertensive Children
Wells, Thomas G.; Shahinfar, Shahnaz; Massaad, Rachid; Dankner, Wayne M.; Lam, Chun; Santoro, Emanuela Palumbo; McCrary Sisk, Christine; Blaustein, Robert O.
2014-01-01
Background and objectives Once-daily losartan reduces BP in a dose-dependent manner and is well tolerated in hypertensive children aged 6–16 years. This study assessed the dose-response relationship, safety, and tolerability of losartan in hypertensive children aged 6 months to 6 years. Design, setting, participants, & measurements This was a 12-week, randomized, open-label, dose-ranging study, with a 2-year extension. Patients were randomized to losartan at the following dosages: 0.1 mg/kg per day (low), 0.3 mg/kg per day (medium), or 0.7 mg/kg per day (high). Losartan was titrated to the next dose level (to a 1.4 mg/kg per day maximum dosage, not exceeding 100 mg/d, which was not one of the three original doses offered at randomization) at weeks 3, 6, and 9 for patients who did not attain their goal BP and were not taking the highest dose. Dose response was evaluated by analyzing the slope of change in sitting systolic BP (SBP; primary end point) and diastolic BP (DBP; secondary end point) after 3 weeks compared with baseline. Adverse events (AEs) were recorded throughout. Results Of the 101 patients randomized, 99 were included in the analysis (low dose, n=32; medium dose, n=34; and high dose, n=33). Mean sitting BP decreased from baseline in the low-, medium-, and high-dose groups by 7.3, 7.6, and 6.7 mmHg, respectively, for SBP and 8.2, 5.1, and 6.7 mmHg, respectively, for DBP after 3 weeks. No dose-response relationship was established by the slope analysis on SBP (P=0.75) or DBP (P=0.64). The BP-lowering effect was observed throughout the 2-year extension. The incidence of AEs was low and comparable between groups. Conclusions Hypertensive children aged 6 months to 6 years treated with losartan 0.1–0.7 mg/kg per day had clinically significant decreases from baseline in SBP and DBP, yet no dose-response relationship was evident. Losartan, at a dosage up to 1.4 mg/kg per day, was well tolerated. PMID:24875194
Investigating quartz optically stimulated luminescence dose-response curves at high doses
Lowick, Sally E., E-mail: lowick@geo.unibe.c [Institut fuer Geologie, Universitaet Bern, Baltzerstrasse 1-3, 3012 Bern (Switzerland); Preusser, Frank [Institut fuer Geologie, Universitaet Bern, Baltzerstrasse 1-3, 3012 Bern (Switzerland); Wintle, Ann G. [Institute of Geography and Earth Sciences, Aberystwyth University, Aberystwyth SY23 3DB (United Kingdom)
2010-10-15
Despite the general expectation that optically stimulated luminescence (OSL) growth should be described by a simple saturating exponential function, an additional high dose component is often reported in the dose response of quartz. Although often reported as linear, it appears that this response is the early expression of a second saturating exponential. While some studies using equivalent doses that fall in this high dose region have produced ages that correlate well with independent dating, others report that it results in unreliable age determinations. Two fine grain sedimentary quartz samples that display such a response were used to investigate the origin of this additional high dose component: three experiments were conducted to examine their dose-response up to >1000 Gy. The high dose rates provided by laboratory irradiation were found not to induce a sensitivity change in the response to a subsequent test dose, with the latter not being significantly different from those generated following naturally acquired doses. The relative percentage contributions of the fast and medium OSL components remained fixed throughout the dose-response curve, suggesting that the electron traps that give rise to the initial OSL do not change with dose. An attempt was made to investigate a change in luminescence centre recombination probability by monitoring the depletion of the '325 °C' thermoluminescence (TL) during the optical stimulation that would result in depletion of the OSL signal. The emissions measured through both the conventional ultraviolet (UV), and a longer wavelength violet/blue (VB) window, displayed similar relative growth with dose, although it was not possible to resolve the origin of the VB emissions. No evidence was found to indicate whether the additional component at high doses occurs naturally or is a product of laboratory treatment. However, it appears that these samples display an increased sensitivity of quartz OSL to high doses.
Webb-Robertson, Bobbie-Jo M.; McCue, Lee Ann; Waters, Katrina M.; Matzke, Melissa M.; Jacobs, Jon M.; Metz, Thomas O.; Varnum, Susan M.; Pounds, Joel G.
2010-11-01
Liquid chromatography-mass spectrometry-based (LC-MS) proteomics uses peak intensities of proteolytic peptides to infer the differential abundance of peptides/proteins. However, substantial run-to-run variability in peptide intensities and observations (presence/absence) of peptides makes data analysis quite challenging. The missing abundance values in LC-MS proteomics data are difficult to address with traditional imputation-based approaches because the mechanisms by which data are missing are unknown a priori. Data can be missing due to random mechanisms such as experimental error, or non-random mechanisms such as a true biological effect. We present a statistical approach that uses a test of independence known as a G-test to test the null hypothesis of independence between the number of missing values and the experimental groups. We pair the G-test results evaluating independence of missing data (IMD) with a standard analysis of variance (ANOVA) that uses only means and variances computed from the observed data. Each peptide is therefore represented by two statistical confidence metrics, one for qualitative differential observation and one for quantitative differential intensity. We use two simulated and two real LC-MS datasets to demonstrate the robustness and sensitivity of the ANOVA-IMD approach for assigning confidence to peptides with significant differential abundance among experimental groups.
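The G-test of independence described above is straightforward to implement. The contingency table in the example is invented to illustrate a peptide whose missingness tracks the experimental group; the function name is mine:

```python
import numpy as np
from scipy.stats import chi2

def g_test(table):
    """G-test of independence on a contingency table of counts.
    G = 2 * sum(O * ln(O / E)); under H0, G follows a chi-squared
    distribution with (rows-1)*(cols-1) degrees of freedom. Zero cells
    are skipped, since x*ln(x) -> 0 as x -> 0."""
    table = np.asarray(table, dtype=float)
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    mask = table > 0
    g = 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return g, chi2.sf(g, dof)

# Rows: peptide observed / missing; columns: experimental groups A and B.
# The peptide is seen in 9/10 runs of group A but only 1/9 runs of group B,
# suggesting a non-random (biological) missingness mechanism.
g, p = g_test([[9, 2], [1, 8]])
print(f"G={g:.2f}, p={p:.4f}")
```

In the ANOVA-IMD scheme this qualitative p-value sits alongside the quantitative ANOVA p-value computed from the observed intensities, giving each peptide the two confidence metrics the abstract describes.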
Background and purpose: We fit phenomenological tumor control probability (TCP) models to biopsy outcome after three-dimensional conformal radiation therapy (3D-CRT) of prostate cancer patients to quantify the local dose-response of prostate cancer. Materials and methods: We analyzed the outcome after photon beam 3D-CRT of 103 patients with stage T1c-T3 prostate cancer treated at Memorial Sloan-Kettering Cancer Center (MSKCC) (prescribed target doses between 64.8 and 81 Gy) who had a prostate biopsy performed ≥2.5 years after end of treatment. A univariate logistic regression model based on Dmean (mean dose in the planning target volume of each patient) was fit to the whole data set and separately to subgroups characterized by low and high values of tumor-related prognostic factors T-stage (6), and pre-treatment prostate-specific antigen (PSA) (≤10 ng/ml vs. >10 ng/ml). In addition, we evaluated five different classifications of the patients into three risk groups, based on all possible combinations of two or three prognostic factors, and fit bivariate logistic regression models with Dmean and the risk group category to all patients. Dose-response curves were characterized by TCD50, the dose to control 50% of the tumors, and γ50, the normalized slope of the dose-response curve at TCD50. Results: Dmean correlates significantly with biopsy outcome in all patient subgroups and larger values of TCD50 are observed for patients with unfavorable compared to favorable prognostic factors. For example, TCD50 for high T-stage patients is 7 Gy higher than for low T-stage patients. For all evaluated risk group definitions, Dmean and the risk group category are independent predictors of biopsy outcome in bivariate analysis. The fit values of TCD50 show a clear separation of 9-10.6 Gy between low and high risk patients. The corresponding dose-response curves are steeper (γ50=3.4-5.2) than those obtained when all patients are analyzed together (γ50=2.9). Conclusions: Dose-response
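A common parametrization of the logistic dose-response in terms of TCD50 and γ50 is TCP(D) = 1/(1 + exp(4·γ50·(1 − D/TCD50))), which by construction gives TCP = 0.5 at TCD50 with normalized slope γ50 there. A sketch with illustrative parameter values in the reported range, not the fitted MSKCC estimates:

```python
import math

def tcp_logistic(dose, tcd50, gamma50):
    """Logistic tumour control probability parametrized by TCD50 (dose
    controlling 50% of tumours) and gamma50 (normalized slope at TCD50)."""
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose / tcd50)))

# Illustrative values only: TCD50 = 70 Gy, gamma50 = 3.4.
p_mid = tcp_logistic(70.0, 70.0, 3.4)
p_low = tcp_logistic(60.0, 70.0, 3.4)
p_high = tcp_logistic(80.0, 70.0, 3.4)

# Numerical check that the normalized slope D * dTCP/dD at TCD50 equals gamma50.
num_gamma = 70.0 * (tcp_logistic(70.001, 70.0, 3.4)
                    - tcp_logistic(69.999, 70.0, 3.4)) / 0.002
```

Shifting TCD50 upward for unfavorable risk groups, as reported above, slides this whole curve to higher doses without changing its shape.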
Development of a mid-head radiation dose response function
Calculations have been made of the incident neutron and gamma-ray absorbed dose response as a function of energy in the mid-head position of a phantom model. The calculations were performed with the DOT discrete ordinates transport code in the adjoint mode using co-axial cylinders to represent the head and torso. Results, given in a coupled 37-neutron-group, 21-gamma-ray-group structure (37/21) and a 22-neutron-group, 18-gamma-ray-group structure (22/18), are compared with previously obtained results. The mid-head response is less than the conventional radiation protection fluence-to-dose factors which are based on maximum phantom values. In the case of a fission source in air the neutron dose is about a factor of 4 less, and the secondary gamma-ray dose is about a factor of 1.5 less. For a fusion source the neutron dose ratio varies from about 1.9 at close range to about 3. The gamma-ray dose ratio is about the same as for the fission source. Tables of the various response functions are presented in Appendix A.
Mutans Streptococci Dose Response to Xylitol Chewing Gum
Milgrom, P.; Ly, K.A.; Roberts, M.C.; Rothen, M.; Mueller, G.; Yamaguchi, D.K.
2008-01-01
Xylitol is promoted in caries-preventive strategies, yet its effective dose range is unclear. This study determined the dose-response of mutans streptococci in plaque and unstimulated saliva to xylitol gum. Participants (n = 132) were randomized: controls (G1) (sorbitol/maltitol), or combinations giving xylitol 3.44 g/day (G2), 6.88 g/day (G3), or 10.32 g/day (G4). Groups chewed 3 pellets/4 times/d. Samples were taken at baseline, 5 wks, and 6 mos, and were cultured on modified Mitis Salivarius agar for mutans streptococci and on blood agar for total culturable flora. At 5 wks, mutans streptococci levels in plaque were 10x lower than baseline in G3 and G4 (P = 0.007/0.003). There were no differences in saliva. At 6 mos, mutans streptococci in plaque for G3 and G4 remained 10x lower than baseline (P = 0.007/0.04). Saliva for G3 and G4 was lower than baseline by 8 to 9x (P = 0.011/0.038). Xylitol at 6.88 g/day and 10.32 g/day reduces mutans streptococci in plaque at 5 wks, and in plaque and unstimulated saliva at 6 mos. A plateau effect is suggested between 6.88 g and 10.32 g xylitol/day. PMID:16434738
Optimal dose-response relationships in voice therapy.
Roy, Nelson
2012-10-01
Like other areas of speech-language pathology, the behavioural management of voice disorders lacks precision regarding optimal dose-response relationships. In voice therapy, dosing can presumably vary from no measurable effect (i.e., no observable benefit or adverse effect), to ideal dose (maximum benefit with no adverse effects), to doses that produce toxic or harmful effects on voice production. Practicing specific vocal exercises will inevitably increase vocal load. At ideal doses, these exercises may be non-toxic and beneficial, while at intermediate or high doses, the same exercises may actually be toxic or damaging to vocal fold tissues. In pharmacology, toxicity is a critical concept, yet it is rarely considered in voice therapy, with little known regarding "effective" concentrations of specific voice therapies vs "toxic" concentrations. The potential for vocal fold tissue damage related to overdosing on specific vocal exercises has been under-studied. In this commentary, the issue of dosing will be explored within the context of voice therapy, with particular emphasis placed on possible "overdosing". PMID:22574765
Nonlinear dose-response relationships and inducible cellular defence mechanisms
With the inclusion of inducible radioprotective mechanisms in a radiobiological state-vector model it was possible to explain plateaus in dose-response relationships for neoplastic transformation produced by in vitro irradiation of different cell lines with low-LET irradiation at high dose rates. The current study repeated the simulation of one data set that contains a plateau at mid doses. In contrast to earlier studies, the new one did not model the repair of double-strand breaks (DSBs) located in bulk DNA (likely via non-homologous end joining) as being inducible. Repair of specific DSBs located in actively transcribed genes was assumed to occur via homologous recombination and was considered to be inducible. This reduced the number of parameters that have to be determined by fitting the model to data. In addition, all types of radical scavengers were formerly considered to be inducible by radiation. This was redefined in the current work and the effectiveness of scavengers was implemented in a refined way. The current work investigated whether these and other model adjustments lead to an improved fit of the data set. (author)
The effect of measurement error on the dose-response curve.
Yoshimura, I
1990-01-01
In epidemiological studies for an environmental risk assessment, doses are often observed with errors. However, such errors have received little attention in data analysis. This paper studies the effect of measurement errors on the observed dose-response curve. Under the assumptions of the monotone likelihood ratio on errors and a monotone increasing dose-response curve, it is verified that the slope of the observed dose-response curve is likely to be gentler than the true one. The observed variance...
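The flattening result can be illustrated with the classical errors-in-variables attenuation effect in a seeded simulation, using a linear curve as a stand-in for the monotone dose-response relation; all numbers are invented:

```python
import random

random.seed(42)

# True linear dose-response: y = beta * true_dose + noise, but the dose is
# recorded with additive measurement error. Classical errors-in-variables
# theory predicts the fitted slope is attenuated by var(x)/(var(x)+var(err)),
# here 4/(4+4) = 0.5.
beta = 1.0
n = 5000
true_dose = [random.gauss(10.0, 2.0) for _ in range(n)]
obs_dose = [d + random.gauss(0.0, 2.0) for d in true_dose]
response = [beta * d + random.gauss(0.0, 0.5) for d in true_dose]

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

slope_obs = ols_slope(obs_dose, response)   # gentler than the true slope
slope_true = ols_slope(true_dose, response)
```

With this error variance the observed slope comes out near 0.5 rather than the true 1.0, the "gentler than the true one" effect the paper proves for general monotone curves.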
Optimal designs for dose-response models with restricted design spaces
Dette, Holger; Biedermann, Stefanie; Zhu, Wei
2004-01-01
In dose-response studies, the dose range is often restricted due to concerns over drug toxicity and/or efficacy. We derive optimal designs for estimating the underlying dose-response curve for a restricted or unrestricted dose range with respect to a broad class of optimality criteria. The underlying curve belongs to a diversified set of link functions suitable for dose-response studies and having a common canonical form. These include the fundamental binary response models -- t...
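One way to see why the restricted design space matters: compare the D-optimality criterion (determinant of the Fisher information) of a two-point design straddling the ED50 against one capped below it. A sketch for the logistic link with invented parameter values, not the paper's general derivation:

```python
import math

def info_det(design, a, b):
    """Determinant of the 2x2 Fisher information for a logistic model
    p(d) = 1/(1+exp(-(a+b*d))), for a design given as (dose, weight) pairs."""
    m11 = m12 = m22 = 0.0
    for d, w in design:
        p = 1.0 / (1.0 + math.exp(-(a + b * d)))
        u = w * p * (1.0 - p)   # per-point information weight
        m11 += u
        m12 += u * d
        m22 += u * d * d
    return m11 * m22 - m12 * m12

a, b = -3.0, 1.0                      # hypothetical: ED50 at dose 3
wide = [(2.0, 0.5), (4.0, 0.5)]        # straddles the ED50
restricted = [(0.0, 0.5), (1.0, 0.5)]  # dose range capped below the ED50

det_wide = info_det(wide, a, b)
det_restricted = info_det(restricted, a, b)
```

The design confined to the low-dose region is far less informative, which is why restricted-range optimal designs must be derived separately rather than truncating the unrestricted solution.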
WANG Hanjie; SHI Weilai; CHEN Xiaohong
2006-01-01
The West Development Policy being implemented in China is causing significant land use and land cover (LULC) changes in West China. With the up-to-date satellite database of the Global Land Cover Characteristics Database (GLCCD) that characterizes the lower boundary conditions, the regional climate model RIEMS-TEA is used to simulate possible impacts of the significant LULC variation. The model was run for five continuous three-month periods from 1 June to 1 September of 1993, 1994, 1995, 1996, and 1997, and the results of the five groups are examined by means of a Student's t-test to identify the statistical significance of regional climate variation. The main results are: (1) The regional climate is affected by the LULC variation because the equilibrium of water and heat transfer in the air-vegetation interface is changed. (2) The integrated impact of the LULC variation on regional climate is not only limited to West China where the LULC varies, but also to some areas in the model domain where the LULC does not vary at all. (3) The East Asian monsoon system and its vertical structure are adjusted by the large scale LULC variation in western China, where the consequences are the enhancement of the westward water vapor transfer from the east coast and the relevant increase of wet-hydrostatic energy in the middle-upper atmospheric layers. (4) The ecological engineering in West China affects significantly the regional climate in Northwest China, North China and the middle-lower reaches of the Yangtze River; there are obvious effects in South, Northeast, and Southwest China, but minor effects in Tibet.
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8 and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, correspondingly, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
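The quoted significance levels can be reproduced with a simple binomial calculation against the null hypothesis that each event independently falls in the alarm region with probability equal to its space-time fraction. A sketch:

```python
from math import comb

def alarm_confidence(hits, total, alarm_fraction):
    """Confidence that the prediction beats chance: one minus the binomial
    probability of scoring at least `hits` of `total` events if each event
    independently lands in the alarm region with probability `alarm_fraction`."""
    p_tail = sum(comb(total, k) * alarm_fraction**k
                 * (1 - alarm_fraction)**(total - k)
                 for k in range(hits, total + 1))
    return 1.0 - p_tail

# M8, magnitude 7.5+: 10 of 19 events predicted with 40% space-time volume.
conf_m8 = alarm_confidence(10, 19, 0.40)
```

For the M8 magnitude 7.5+ case this gives roughly the 81% significance level quoted above; the actual study used a normalized product measure of the epicenter distribution rather than this simplified uniform-fraction null.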
Østvand, Lene; Rypdal, Martin
2013-01-01
Various interpretations of the notion of a trend in the context of global warming are discussed, contrasting the difference between viewing a trend as the deterministic response to an external forcing and viewing it as a slow variation which can be separated from the background spectral continuum of long-range persistent climate noise. The emphasis in this paper is on the latter notion, and a general scheme is presented for testing a multi-parameter trend model against a null hypothesis which models the observed climate record as an autocorrelated noise. The scheme is applied to the instrumental global sea-surface temperature record and the global land-temperature record. A trend model comprising a linear plus an oscillatory trend with period of approximately 60 yr, and the statistical significance of the trends, are tested against three different null models: first-order autoregressive process, fractional Gaussian noise, and fractional Brownian motion. The linear trend is significant in all cases, but the o...
Ortiz de García, Sheyla; García-Encina, Pedro A; Irusta-Mata, Rubén
2016-01-01
The presence of pharmaceuticals and personal care products (PPCPs) in the environment has become a real and widespread concern in recent years. Therefore, the primary goal of this study was to investigate 20 common and widely used PPCPs to assess their individual and combined effect on an important species in one trophic level, i.e., bacteria. The ecotoxicological effects of PPCPs at two different concentration ranges were determined in the bacterium Vibrio fischeri using Microtox(®) and were statistically analyzed using three models in the GraphPad Prism 6 program for Windows, v.6.03. A four-parameter model best fit the majority of the compounds. The half maximal effective concentration (EC50) of each PPCP was estimated using the best-fitting model and was compared with the results from a recent study. Comparative analysis indicated that most compounds showed the same level of toxicity. Moreover, the stimulatory effects of PPCPs at environmental concentrations (low doses) were assessed. These results indicated that certain compounds have traditional inverted U- or J-shaped dose-response curves, and 55% of them presented a stimulatory effect below the zero effect-concentration point. Effective concentrations of 0 (EC0), 5 (EC5) and 50% (EC50) were calculated for each PPCP as the ecotoxicological points. All compounds that presented narcosis as a mode of toxic action at high doses also exhibited stimulation at low concentrations. The maximum stimulatory effect of a mixture was higher than the highest stimulatory effect of each individually tested compound. Moreover, when the exposure time was increased, the hormetic effect decreased. Hormesis is being increasingly included in dose-response studies because this may have a harmful, beneficial or indifferent effect in an environment. Despite the results obtained in this research, further investigations need to be conducted to elucidate the behavior of PPCPs in aquatic environments. PMID:26518677
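The four-parameter model referred to above is commonly the four-parameter log-logistic; a minimal sketch with invented parameters (note this monotone form cannot itself produce the inverted U- or J-shaped hormetic curves described, which need an extension such as the Brain-Cousens model):

```python
def four_param_logistic(conc, bottom, top, ec50, hill):
    """Four-parameter log-logistic dose-response model: the effect rises from
    `bottom` to `top`, with half-maximal effect at `ec50` and slope `hill`.
    conc must be > 0."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

# Illustrative parameters only (not fitted values from the study):
# effect 0-100%, EC50 = 1 unit, Hill slope 1.
e_low = four_param_logistic(0.1, 0.0, 100.0, 1.0, 1.0)
e_mid = four_param_logistic(1.0, 0.0, 100.0, 1.0, 1.0)
e_high = four_param_logistic(10.0, 0.0, 100.0, 1.0, 1.0)
```

EC0, EC5 and EC50 are then just the concentrations at which this fitted curve crosses the corresponding effect levels.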
Dose Response of MARV/Angola Infection in Cynomolgus Macaques following IM or Aerosol Exposure.
Sara C Johnston
Marburg virus infection in humans causes a hemorrhagic disease with a high case fatality rate. Countermeasure development requires the use of well-characterized animal models that mimic human disease. To further characterize the cynomolgus macaque model of MARV/Angola, two independent dose response studies were performed using the intramuscular or aerosol routes of exposure. All animals succumbed at the lowest target dose; therefore, a dose effect could not be determined. For intramuscular-exposed animals, 100 PFU was the first target dose that was not significantly different than higher target doses in terms of time to disposition, clinical pathology, and histopathology. Although a significant difference was not observed between aerosol-exposed animals in the 10 PFU and 100 PFU target dose groups, 100 PFU was determined to be the lowest target dose that could be consistently obtained and accurately titrated in aerosol studies.
Shared Dosimetry Error in Epidemiological Dose-Response Analyses
Stram, Daniel O.; Preston, Dale L.; Sokolnikov, Mikhail; Napier, Bruce; Kopecky, Kenneth J.; Boice, John; Beck, Harold; Till, John; Bouville, Andre
2015-01-01
Radiation dose reconstruction systems for large-scale epidemiological studies are sophisticated both in providing estimates of dose and in representing dosimetry uncertainty. For example, a computer program was used by the Hanford Thyroid Disease Study to provide 100 realizations of possible dose to study participants. The variation in realizations reflected the range of possible dose for each cohort member consistent with the data on dose determinants in the cohort. Another example is the Mayak Worker Dosimetry System 2013 which estimates both external and internal exposures and provides multiple realizations of "possible" dose history to workers given dose determinants. This paper takes up the problem of dealing with complex dosimetry systems that provide multiple realizations of dose in an epidemiologic analysis. In this paper we derive expected scores and the information matrix for a model used widely in radiation epidemiology, namely the linear excess relative risk (ERR) model that allows for a linear dose response (risk in relation to radiation) and distinguishes between modifiers of background rates and of the excess risk due to exposure. We show that treating the mean dose for each individual (calculated by averaging over the realizations) as if it was true dose (ignoring both shared and unshared dosimetry errors) gives asymptotically unbiased estimates (i.e. the score has expectation zero) and valid tests of the null hypothesis that the ERR slope β is zero. Although the score is unbiased the information matrix (and hence the standard errors of the estimate of β) is biased for β≠0 when ignoring errors in dose estimates, and we show how to adjust the information matrix to remove this bias, using the multiple realizations of dose. The use of these methods in the context of several studies, including the Mayak Worker Cohort and the U.S. Atomic Veterans Study, is discussed. PMID:25799311
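The setup can be sketched as follows: average each subject's dose realizations and plug the mean into the linear ERR model. All numbers below are hypothetical, and (per the paper) naive standard errors for β ≠ 0 would still need the information-matrix correction:

```python
def err_relative_risk(dose, beta):
    """Linear excess relative risk model: RR(D) = 1 + beta * D."""
    return 1.0 + beta * dose

# Hypothetical dose realizations (Gy) for three cohort members from a
# multiple-realization dosimetry system (rows = subjects, columns = realizations).
realizations = [
    [0.10, 0.14, 0.08, 0.12],
    [0.50, 0.42, 0.61, 0.47],
    [1.10, 0.95, 1.30, 1.05],
]

# Treating the per-subject mean dose as if it were the true dose gives an
# unbiased score, and hence a valid test of beta = 0.
mean_dose = [sum(r) / len(r) for r in realizations]
rr = [err_relative_risk(d, beta=0.5) for d in mean_dose]  # beta is illustrative
```

Each subject's mean dose here is 0.11, 0.50 and 1.10 Gy, and the fitted relative risks increase linearly with it.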
Confidence levels, clinical significance curves, and risk-benefit contours are tools improving analysis of clinical studies and minimizing misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has 4 spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any
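The confidence-level calculation described for Sheet 2 can be sketched with a normal approximation to the difference of two proportions; this is one plausible implementation, not necessarily the workbook's exact formulas, and the counts are hypothetical:

```python
import math

def confidence_a_better(events_a, n_a, events_b, n_b, margin=0.0):
    """One-sided confidence that arm A's event (e.g. survival) rate exceeds
    arm B's by at least `margin`, via a normal approximation to the
    difference of proportions. margin=0.10 asks 'at least 10% better'."""
    pa, pb = events_a / n_a, events_b / n_b
    se = math.sqrt(pa * (1 - pa) / n_a + pb * (1 - pb) / n_b)
    z = (pa - pb - margin) / se
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical trial: 60/100 survivors on A vs 50/100 on B.
c0 = confidence_a_better(60, 100, 50, 100)           # confidence of any benefit
c10 = confidence_a_better(60, 100, 50, 100, 0.10)    # at least 10% absolute benefit
```

Sweeping `margin` over a range of benefit levels traces out exactly the kind of clinical significance curve the spreadsheet plots.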
Volume and heterogeneity dependence of the dose response relationship for head and neck tumours
Based on the Poisson statistics of cell kill a model for the response of heterogeneous tumours to non-uniform dose delivery has been developed. The five parameters required to characterize the response are the 50% response dose, D50, the normalized dose-response gradient, γ, the tumour heterogeneity factor, h, the relative volume, y and the extra daily dose required to counteract the tumour cell proliferation, δ. The model has been fitted to data from a number of clinical investigations to allow the derivation of clinically relevant radiation response parameters for head and neck tumours. The D50 value for T2 larynx cancers is 59.9 Gy in 41 days with a relative standard deviation of 2.1 Gy and the γ value is 2.9 with a relative standard deviation of 0.3. The value of δ, which is most consistent with the clinical data for laryngeal tumours, is 0.35 Gy/day and this value should be used if the treatment time is changed from the 41 days normalization. The heterogeneity factor, h, is close to zero for laryngeal tumours which indicates that their response is basically governed by Poisson statistics. Nasopharyngeal tumours, on the other hand, exhibit h values around 0.2 which indicates that these tumours are more heterogeneous in their internal organization and so are their responses to radiation. (orig.)
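The underlying Poisson cell-kill assumption can be sketched as TCP = exp(−N0·e^(−αD)): the probability that no clonogenic cell survives a uniform dose D. This is a generic illustration with invented parameters, not the paper's five-parameter heterogeneous-tumour model or its fitted larynx values:

```python
import math

def tcp_poisson(dose, n0, alpha):
    """Poisson tumour control probability: TCP = exp(-N0 * exp(-alpha*D)),
    i.e. the chance that zero of N0 clonogens survive a uniform dose D (Gy)."""
    return math.exp(-n0 * math.exp(-alpha * dose))

def d50(n0, alpha):
    """Dose giving TCP = 0.5, from solving N0 * exp(-alpha*D) = ln 2."""
    return math.log(n0 / math.log(2.0)) / alpha

# Hypothetical values: 1e7 clonogens, radiosensitivity 0.3 per Gy.
N0, ALPHA = 1.0e7, 0.3
d_half = d50(N0, ALPHA)
```

The sigmoid shape of this curve around `d_half` is what the D50/γ parametrization in the abstract summarizes, with the heterogeneity factor h flattening it further.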
Dose response of surfactants to attenuate gas embolism related platelet aggregation
Eckmann, David M.; Eckmann, Yonaton Y.; Tomczyk, Nancy
2014-03-01
Intravascular gas embolism promotes blood clot formation, cellular activation, and adhesion events, particularly with platelets. Populating the interface with surfactants is a chemical-based intervention to reduce injury from gas embolism. We studied platelet activation and platelet aggregation, prominent adverse responses to blood contact with bubbles. We examined dose-response relationships for two chemically distinct surfactants to attenuate the rise in platelet function stimulated by exposure to microbubbles. Significant reduction in platelet aggregation and platelet activation occurred with increasing concentration of the surfactants, indicating presence of a saturable system. A population balance model for platelet aggregation in the presence of embolism bubbles and surfactants was developed. Monte Carlo simulations for platelet aggregation were performed. Results agree qualitatively with experimental findings. Surfactant dose-dependent reductions in platelet activation and aggregation indicate inhibition of the gas/liquid interface's ability to stimulate cellular activation mechanically.
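The saturable attenuation suggested by the dose-response data can be sketched with an Emax-type inhibition curve; this is a generic stand-in, not the paper's population balance or Monte Carlo model, and all values are hypothetical:

```python
def aggregation(surfactant_conc, baseline, imax, ic50):
    """Saturable (Emax-type) attenuation of bubble-induced platelet
    aggregation: the response falls from `baseline` toward baseline*(1-imax)
    as surfactant concentration rises, with half-maximal inhibition at ic50."""
    inhibition = imax * surfactant_conc / (ic50 + surfactant_conc)
    return baseline * (1.0 - inhibition)

# Hypothetical values: 80% maximal inhibition, IC50 = 0.1 mg/ml.
r0 = aggregation(0.0, 100.0, 0.8, 0.1)      # no surfactant: full response
r_ic50 = aggregation(0.1, 100.0, 0.8, 0.1)  # half of the maximal inhibition
r_sat = aggregation(100.0, 100.0, 0.8, 0.1) # near the saturation plateau
```

The plateau at high concentration is the signature of a saturable system: beyond it, added surfactant yields no further reduction in aggregation.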
Severe iodine deficiency (ID) results in adverse health outcomes and remains a benchmark for understanding the effects of developmental hypothyroidism. The implications of marginal ID, however, remain less well known. The current study examined the relationship between graded levels of ID in rats and serum thyroid hormones, thyroid iodine content, and urinary iodide excretion. The goals of this study were to provide parametric and dose-response information for development of a quantitative model of the thyroid axis. Female Long Evans rats were fed casein-based diets containing varying iodine (I) concentrations for 8 weeks. Diets were created by adding 975, 200, 125, 25, or 0 μg/kg I to the base diet (∼25 μg I/kg chow) to produce 5 nominal I levels, ranging from excess (basal + added I, Treatment 1: 1000 μg I/kg chow) to deficient (Treatment 5: 25 μg I/kg chow). Food intake and body weight were monitored throughout and on 2 consecutive days each week over the 8-week exposure period, animals were placed in metabolism cages to capture urine. Food, water intake, and body weight gain did not differ among treatment groups. Serum T4 was dose-dependently reduced relative to Treatment 1 with significant declines (19 and 48%) at the two lowest I groups, and no significant changes in serum T3 or TSH were detected. Increases in thyroid weight and decreases in thyroidal and urinary iodide content were observed as a function of decreasing I in the diet. Data were compared with predictions from a recently published biologically based dose-response (BBDR) model for ID. Relative to model predictions, female Long Evans rats under the conditions of this study appeared more resilient to low I intake. These results challenge existing models and provide essential information for development of quantitative BBDR models for ID during pregnancy and lactation.
Zuyderduyn Scott D
2007-08-01
Background: Serial analysis of gene expression (SAGE) is used to obtain quantitative snapshots of the transcriptome. These profiles are count-based and are assumed to follow a Binomial or Poisson distribution. However, tag counts observed across multiple libraries (for example, one or more groups of biological replicates) have additional variance that cannot be accommodated by this assumption alone. Several models have been proposed to account for this effect, all of which utilize a continuous prior distribution to explain the excess variance. Here, a Poisson mixture model, which assumes excess variability arises from sampling a mixture of distinct components, is proposed and the merits of this model are discussed and evaluated. Results: The goodness of fit of the Poisson mixture model on 15 sets of biological SAGE replicates is compared to the previously proposed hierarchical gamma-Poisson (negative binomial) model, and a substantial improvement is seen. In further support of the mixture model, there is observed: (1) an increase in the number of mixture components needed to fit the expression of tags representing more than one transcript; and (2) a tendency for components to cluster libraries into the same groups. A confidence score is presented that can identify tags that are differentially expressed between groups of SAGE libraries. Several examples where this test outperforms those previously proposed are highlighted. Conclusion: The Poisson mixture model performs well as (a) a method to represent SAGE data from biological replicates, and (b) a basis to assign significance when testing for differential expression between multiple groups of replicates. Code for the R statistical software package is included to assist investigators in applying this model to their own data.
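The overdispersion argument can be illustrated directly: a finite Poisson mixture has variance strictly greater than its mean, which a single Poisson (variance = mean) cannot reproduce. A sketch in Python rather than the paper's R, with invented weights and rates:

```python
import math

def poisson_mixture_pmf(k, weights, lams):
    """pmf of a finite Poisson mixture: sum_j w_j * Pois(k; lambda_j)."""
    return sum(w * math.exp(-l) * l**k / math.factorial(k)
               for w, l in zip(weights, lams))

# Hypothetical two-component mixture of tag-count rates.
weights, lams = [0.5, 0.5], [2.0, 10.0]

# Moments of the mixture: mean = sum w*lambda; for each Poisson component
# E[X^2] = lambda + lambda^2, so the mixture variance picks up an extra
# between-component term and exceeds the mean.
mean = sum(w * l for w, l in zip(weights, lams))
var = sum(w * (l + l**2) for w, l in zip(weights, lams)) - mean**2
```

Here the mean is 6 but the variance is 22, the kind of excess variance across replicate libraries that motivates the mixture model.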
The incidence of leukemia during 1950 to 1971 in a fixed mortality sample of atomic bomb survivors in Hiroshima and Nagasaki was analyzed as a function of neutron and γ kerma and marrow doses. Two dose-response models were tested for acute leukemia, chronic granulocytic leukemia, and all types of leukemia, respectively. Each model postulates that the leukemia incidence depends upon the sum of separate risks imposed by γ and neutron doses. In Model I the risk from both types of radiation is assumed to be directly proportional to the respective doses, while Model II assumes that whereas the risk from neutrons is directly proportional to the dose, the risk from γ rays is proportional to dose-squared. The analysis demonstrated that the dose-response of the two types of leukemia differed by type of radiation. The data suggested that the response of acute leukemia was best explained by Model II, while the response of chronic granulocytic leukemia depended almost linearly upon neutron dose alone, because the regression coefficients associated with γ radiation for both Models I and II were not significant. The relative biological effectiveness (RBE) of neutrons in relation to γ rays for incidence of acute leukemia was estimated to be approximately 30/√Dn (95% confidence limits: 17/√Dn to 54/√Dn) for kerma and 32/√Dn (95% confidence limits: 18/√Dn to 58/√Dn) for marrow dose, where Dn is the neutron dose. If acute and chronic granulocytic leukemias are considered together as all types of leukemia, Model II appears to fit the data slightly better than Model I, but neither model is statistically rejected by the data.
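In the abstract's notation, with $D_\gamma$ and $D_n$ the γ-ray and neutron doses, the two competing models can be written compactly (an interpretive sketch of the description above; the coefficient symbols are assumptions):

```latex
\text{Model I:}\quad  I = c_0 + \alpha_1 D_\gamma   + \beta D_n
\qquad
\text{Model II:}\quad I = c_0 + \alpha_2 D_\gamma^2 + \beta D_n
```

Under Model II, equating the γ-ray and neutron risk terms ($\alpha_2 D_\gamma^2 = \beta D_n$) gives an RBE that scales as $\mathrm{RBE} = D_\gamma / D_n \propto 1/\sqrt{D_n}$, which is why the quoted estimates take the form $30/\sqrt{D_n}$.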
Fisher, Aaron; Anderson, G. Brooke; Peng, Roger; Leek, Jeff
2014-01-01
Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%–49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%–76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/. PMID:25337457
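The task the subjects faced, deciding whether a scatterplot's relationship is significant at P < 0.05, corresponds to a simple slope test. A minimal NumPy sketch on synthetic data (the data and the normal approximation for the p-value are illustrative assumptions, not the study's materials):

```python
import numpy as np
from math import erf, sqrt

# synthetic scatterplot data with a modest true slope
rng = np.random.default_rng(42)
n = 100
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

# ordinary least squares slope and its standard error
xc, yc = x - x.mean(), y - y.mean()
slope = float(xc @ yc) / float(xc @ xc)
resid = yc - slope * xc
se = sqrt(float(resid @ resid) / (n - 2) / float(xc @ xc))

# two-sided p-value for slope = 0, normal approximation (adequate at n = 100)
t = slope / se
p = 2 * (1 - 0.5 * (1 + erf(abs(t) / sqrt(2))))
significant = p < 0.05   # the criterion subjects had to judge by eye
```

For small effects the eye and this formal test often disagree, which is the mismatch the trial quantified.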
Bosch, Ernesto; Nyboe Andersen, Anders; Barri, Pedro; García-Velasco, Juan Antonio; de Sutter, Petra; Fernández-Sánchez, Manuel; Visnova, Hana; Klein, Bjarke M; Mannaerts, Bernadette; Arce, Joan-Carles
2015-01-01
Follicular development and endocrine parameters during controlled ovarian stimulation (COS) with rhFSH. RESULTS: Serum FSH levels increased with increasing rhFSH doses and steady-state levels for each dose were similar in both AMH strata. In the whole study population, significant (P < 0.001) positive dose responses were observed for the number of follicles ≥12 mm, and serum levels of oestradiol, inhibin B, inhibin A and progesterone at end of stimulation. In comparison with the higher AMH stratum, patients in the lower AMH stratum had significantly different slopes of the dose-response curves for these...
In a reappraisal of the data of the Hiroshima and Nagasaki A-bomb survivors, the cancer risk is calculated based on the five lowest dose groups. It is concluded that a linear extrapolation underestimates the cancer risk. Therefore, a supralinear dose response curve is fitted for the lowest dose groups (< 0.6 Sv). This reappraisal has been evaluated by the Ministry of social affairs, health and energy. The evaluation shows that a supralinear dose response relationship is not significant for the lowest dose groups. By restriction to the lowest dose groups, part of the information on the dose response relation is discarded. Including all dose groups, however, does not yield significant supralinearity. (orig.)
The Shape of the Dose-Response Relationship between Sugars and Caries in Adults.
Bernabé, E; Vehkalahti, M M; Sheiham, A; Lundqvist, A; Suominen, A L
2016-02-01
Dental caries is considered a diet-mediated disease, as sugars are essential in the caries process. However, some gaps in knowledge about the sugars-caries relationship still need addressing. This longitudinal study aimed to explore 1) the shape of the dose-response association between sugars intake and caries in adults, 2) the relative contribution of frequency and amount of sugars intake to caries levels, and 3) whether the association between sugars intake and caries varies by exposure to fluoride toothpaste. We used data from 1,702 dentate adults who participated in at least 2 of 3 surveys in Finland (Health 2000, 2004/05 Follow-up Study of Adults' Oral Health, and Health 2011). Frequency and amount of sugars intake were measured with a validated food frequency questionnaire. The DMFT index was the repeated outcome measure. Data were analyzed with fractional polynomials and linear mixed effects models. None of the 43 fractional polynomials tested provided a better fit to the data than the simpler linear model. In a mutually adjusted linear mixed effects model, the amount of, but not the frequency of, sugars intake was significantly associated with DMFT throughout the follow-up period. Furthermore, the longitudinal association between amount of sugars intake and DMFT was weaker in adults who used fluoride toothpaste daily than in those using it less often than daily. The findings of this longitudinal study among Finnish adults suggest a linear dose-response relationship between sugars and caries, with amount of intake being more important than frequency of ingestion. Also, daily use of fluoride toothpaste reduced but did not eliminate the association between amount of sugars intake and dental caries. PMID:26553884
Progesterone in experimental permanent stroke: a dose-response and therapeutic time-window study.
Wali, Bushra; Ishrat, Tauheed; Won, Soonmi; Stein, Donald G; Sayeed, Iqbal
2014-02-01
Currently, the only approved treatment for ischaemic stroke is tissue plasminogen activator, a clot-buster. This treatment can have dangerous consequences if not given within the first 4 h after stroke. Our group and others have shown progesterone to be beneficial in preclinical studies of stroke, but a progesterone dose-response and time-window study is lacking. We tested male Sprague-Dawley rats (12 months old) with permanent middle cerebral artery occlusion or sham operations on multiple measures of sensory, motor and cognitive performance. For the dose-response study, animals received intraperitoneal injections of progesterone (8, 16 or 32 mg/kg) at 1 h post-occlusion, and subcutaneous injections at 6 h and then once every 24 h for 7 days. For the time-window study, the optimal dose of progesterone was given starting at 3, 6 or 24 h post-stroke. Behavioural recovery was evaluated at repeated intervals. Rats were killed at 22 days post-stroke and brains extracted for evaluation of infarct volume. Both 8 and 16 mg/kg doses of progesterone produced attenuation of infarct volume compared with the placebo, and improved functional outcomes up to 3 weeks after stroke on locomotor activity, grip strength, sensory neglect, gait impairment, motor coordination and spatial navigation tests. In the time-window study, the progesterone group exhibited substantial neuroprotection as late as 6 h after stroke onset. Compared with placebo, progesterone showed a significant reduction in infarct size with 3- and 6-h delays. Moderate doses (8 and 16 mg/kg) of progesterone reduced infarct size and improved functional deficits in our clinically relevant model of stroke. The 8 mg/kg dose was optimal in improving motor, sensory and memory function, and this effect was observed over a large therapeutic time window. Progesterone shows promise as a potential therapeutic agent and should be examined for safety and efficacy in a clinical trial for ischaemic stroke. PMID:24374329
Reproducibility test of ferrous xylenol orange gel dose response with optical cone beam CT scanning
Jordan, K.; Battista, J.
2004-01-01
Our previous studies of ferrous xylenol orange gelatin gel have revealed a spatial dependence of the dose response of samples contained in 10 cm diameter cylinders. Dose response is defined as the change in optical attenuation coefficient divided by the dose (units: cm⁻¹ Gy⁻¹). This set of experiments was conducted to determine the reproducibility of our preparation, irradiation and full 3D optical cone beam CT scanning. The data provided an internal check of a larger storage time-dose response dependence study.
Threshold estimation based on a p-value framework in dose-response and regression settings
Mallik, Atul; Banerjee, Moulinath; Michailidis, George
2011-01-01
We use p-values to identify the threshold level at which a regression function takes off from its baseline value, a problem motivated by applications in toxicological and pharmacological dose-response studies and environmental statistics. We study the problem in two sampling settings: one where multiple responses can be obtained at a number of different covariate levels, and the other the standard regression setting involving a limited number of response values at each covariate. Our procedure involves testing the hypothesis that the regression function is at its baseline at each covariate value and then computing the (potentially approximate) p-value of the test. An estimate of the threshold is obtained by fitting a piecewise constant function with a single jump discontinuity, otherwise known as a stump, to these observed p-values, as they behave in markedly different ways on the two sides of the threshold. The estimate is shown to be consistent and its finite sample properties are studied through simulations. Ou...
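The stump-fitting step can be sketched directly: fit a single-jump piecewise-constant function to the p-values by minimizing the total within-segment sum of squares. This is an illustrative sketch on synthetic data, not the authors' implementation:

```python
import numpy as np

def stump_threshold(x, p):
    """Estimate the threshold by fitting a one-jump piecewise-constant
    function (a 'stump') to p-values observed at covariate levels x."""
    order = np.argsort(x)
    x, p = np.asarray(x)[order], np.asarray(p)[order]
    best_sse, best_x = np.inf, x[0]
    for k in range(1, len(x)):          # candidate jump between positions k-1 and k
        left, right = p[:k], p[k:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_x = sse, x[k]
    return best_x

# synthetic example: p-values roughly Uniform(0,1) below the threshold
# (null holds), near zero above it (signal present)
rng = np.random.default_rng(1)
dose = np.linspace(0, 10, 200)
pvals = np.where(dose < 6, rng.uniform(0, 1, 200), rng.uniform(0, 0.02, 200))
tau_hat = stump_threshold(dose, pvals)
```

The estimate lands near the true take-off point of 6, because the p-values behave so differently on either side of the threshold.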
Baluev, Roman V.
2013-01-01
We consider the "multi-frequency" periodogram, in which the putative signal is modelled as a sum of two or more sinusoidal harmonics with independent frequencies. It is useful in the cases when the data may contain several periodic components, especially when their interaction with each other and with the data sampling patterns might produce misleading results. Although the multi-frequency statistic itself was already constructed, e.g. by G. Foster in his CLEANest algorithm, its probabilistic ...
PCBS: CANCER DOSE-RESPONSE ASSESSMENT AND APPLICATION TO ENVIRONMENTAL MIXTURES (1996)
This report updates the cancer dose-response assessment for polychlorinated biphenyls (PCBs) and shows how information on toxicity, disposition, and environmental processes can be considered together to evaluate health risks from PCB mixtures in the environment. Processes that ch...
Wang, Hong-Qiang; Tsai, Chung-Jui
2013-01-01
With the rapid increase of omics data, correlation analysis has become an indispensable tool for inferring meaningful associations from a large number of observations. Pearson correlation coefficient (PCC) and its variants are widely used for such purposes. However, it remains challenging to test whether an observed association is reliable both statistically and biologically. We present here a new method, CorSig, for statistical inference of correlation significance. CorSig is based on a biol...
Kjellman, B; Tollig, H; Wettrell, G
1980-10-01
In this study the effects of nebulized racemic epinephrine (Micronephrine) were investigated in children with asthma. The drug was inhaled by a compressor nebulizer with a plastic mask. In the first part of the study it is shown that nebulized Micronephrine has a dose-dependent bronchodilatory effect. In the second part the effect is compared with that of nebulized salbutamol in 10 children (7-16 years of age) with bronchial asthma. The highest dose used in the dose-response trials (=0.9 mg Micronephrine/kg body-weight) was compared with 0.15 mg salbutamol/kg body-weight, which is the dose commonly used in Sweden. There was no significant difference between the drugs as regards increase of forced expiratory volume in 1 sec or duration of the increase. There was a small but significant increase in systolic blood pressure, measured 5 min after the inhalation of Micronephrine but no significant change in diastolic pressure or heart rate. Four children complained of temporary sore throat after the inhalation. PMID:7468946
Time and total dose response of non-volatile UVPROMs
While survivability testing of floating gate non-volatile UVPROM memory devices has been documented in numerous journals, this paper reports on the total dose radiation response and intrinsic charge loss as a function of operating time in a system. Five groups of Intel and Signetics 27C256 devices were aged from one to five years through accelerated bake to simulate system use. Characterizations of the groups with five years of simulated use will be presented in detail in this paper. Device margin voltage was characterized before and after aging and after exposure to five total dose radiation levels (1K - 5K rads (Si)). A statistical model based upon the characterization data was developed to establish re-programming intervals for these devices when used in airborne electronic systems
Folate intake and the risk of breast cancer: a dose-response meta-analysis of prospective studies.
Yu-Fei Zhang
BACKGROUND: Previous observational studies regarding the existence of an association between folate intake and the risk of breast cancer have been inconsistent. This study aimed to summarize the evidence regarding this relationship using a dose-response meta-analytic approach. METHODOLOGY AND PRINCIPAL FINDINGS: We performed electronic searches of the PubMed, EmBase, and Cochrane Library databases to identify studies published through June 2013. Only prospective observational studies that reported breast cancer effect estimates with 95% confidence intervals (CIs) for more than 2 folate intake categories were included. We excluded traditional case-control studies because of possible bias from various confounding factors. Overall, we included 14 prospective studies that reported data on 677,858 individuals. Folate intake had little effect on the breast cancer risk (relative risk (RR) for highest versus lowest category = 0.97; 95% CI, 0.90-1.05; P = 0.451). Dose-response meta-analysis also suggested that a 100 µg/day increase in folate intake had no significant effect on the risk of breast cancer (RR = 0.99; 95% CI, 0.98-1.01; P = 0.361). Furthermore, we used restricted cubic splines to evaluate the nonlinear relationship between folate intake and the risk of breast cancer, and discovered a potential J-shaped correlation between folate intake and breast cancer risk (P = 0.007), revealing that a daily folate intake of 200-320 µg was associated with a lower breast cancer risk; however, the breast cancer risk increased significantly with a daily folate intake >400 µg. CONCLUSION/SIGNIFICANCE: Our study revealed that folate intake had little or no effect on the risk of breast cancer; moreover, a dose-response meta-analysis suggested a J-shaped association between folate intake and breast cancer.
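The pooling behind such a meta-analysis is typically inverse-variance weighting of log relative risks, with each study's standard error recovered from its 95% CI. A sketch with hypothetical study inputs (not the 14 included studies):

```python
import numpy as np

def pool_log_rr(rr, ci_low, ci_high):
    """Fixed-effect inverse-variance pooling of per-study relative risks.
    The study inputs used below are hypothetical, for illustration only."""
    log_rr = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from 95% CI width
    w = 1 / se ** 2                                       # inverse-variance weights
    pooled = (w * log_rr).sum() / w.sum()
    pooled_se = np.sqrt(1 / w.sum())
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * pooled_se),
            np.exp(pooled + 1.96 * pooled_se))

# three hypothetical studies: RR with 95% CI bounds
rr, lo, hi = pool_log_rr(np.array([0.95, 1.02, 0.92]),
                         np.array([0.85, 0.90, 0.80]),
                         np.array([1.06, 1.16, 1.06]))
```

Nonlinear (e.g. J-shaped) dose-response curves require the restricted cubic spline machinery the abstract mentions, which this fixed-effect sketch does not cover.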
Dose-response relationship between noise exposure and the risk of occupational injury
Jin-Ha Yoon
2015-01-01
Many workers worldwide experience fatality and disability caused by occupational injuries. This study examined the relationship between noise exposure and occupational injuries at factories in Korea. A total of 1790 factories located in northern Gyeonggi Province, Korea, was evaluated. The time-weighted average levels of dust and noise exposure were taken from Workplace Exposure Assessment data. Apart from occupational injuries, sports events, traffic accidents, and other accidents occurring outside workplaces were excluded. The incidences of occupational injury in each factory were calculated from data from the Korea Workers' Compensation and Welfare Services. Workplaces were classified according to the incidence of any occupational injuries (incident or nonincident workplaces, respectively). Workplace dust exposure was classified as high or low, and noise exposure as <80 dB, 80-89 dB, or ≥90 dB. Workplaces with high noise exposure were significantly associated with being incident workplaces, whereas workplaces with high dust exposure were not. The odds ratios (95% confidence intervals) derived from a logistic regression model were 1.68 (1.27-2.24) and 3.42 (2.26-5.17) at 80-89 dB and ≥90 dB versus <80 dB. These associations remained significant in a separate analysis according to high or low dust exposure level. Noise exposure increases the risk of occupational injury in the workplace. Furthermore, the risk of occupational injury increases with noise exposure level in a dose-response relationship. Therefore, strategies for reducing noise exposure level are required to decrease the risk of occupational injury.
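The reported odds ratios come from logistic regression; for a single exposure category versus the reference, the same quantity can be illustrated from a 2x2 table with a Woolf-type confidence interval. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio for a 2x2 table (a=exposed cases, b=exposed non-cases,
    c=unexposed cases, d=unexposed non-cases) with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# hypothetical: 120 incident / 380 non-incident workplaces at >=90 dB
# versus 90 incident / 900 non-incident at <80 dB
or_, lo, hi = odds_ratio_ci(120, 380, 90, 900)
```

The study's own estimates additionally adjust for covariates in the logistic model, which a raw 2x2 table cannot do.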
Dose-Response Effect of Sunlight on Vitamin D2 Production in Agaricus bisporus Mushrooms.
Urbain, Paul; Jakobsen, Jette
2015-09-23
The dose-response effect of UV-B irradiation from sunlight on the vitamin D2 content of sliced Agaricus bisporus (white button mushroom) during the process of sun-drying was investigated. Real-time UV-B and UV-A data were obtained using a high-performance spectroradiometer. During the first hour of sunlight exposure, the vitamin D2 content of the mushrooms increased in a linear manner, with concentrations increasing from 0.1 μg/g up to 3.9 ± 0.8 μg/g dry weight (DW). At the subsequent two measurements, one and 3 h later, respectively, a plateau was reached. Two hours of additional exposure triggered a significant decline in vitamin D2 content. After just 15 min of sun exposure and a UV-B dose of 0.13 J/cm², the vitamin D2 content increased significantly to 2.2 ± 0.5 μg/g DW (P < 0.0001), which is equivalent to 17.6 μg (704 IU) vitamin D2 per 100 g of fresh mushrooms and comparable to levels found in fatty fish like the Atlantic salmon. PMID:26314311
Lorenzen, Ellen; Einer-Jensen, Katja; Martinussen, T.; LaPatra, S.E.; Lorenzen, Niels
2000-01-01
To investigate the potential for protecting fish against VHS by DNA vaccination, experiments were conducted to determine the amount of plasmid DNA needed for induction of protective immunity. The time to onset of immunity and the duration of protection following administration of a protective vaccine dose were also analyzed. The dose-response analysis revealed that significant protection of rainbow trout fingerlings was obtained following intramuscular injection of only 0.01 μg of plasmid DNA encoding the VHSV glycoprotein gene. In addition, higher doses of DNA induced immunity to a virus isolate...
The alanine detector in BNCT dosimetry: Dose response in thermal and epithermal neutron fields
Schmitz, T., E-mail: schmito@uni-mainz.de [Institute for nuclear chemistry, Johannes Gutenberg-University, Mainz D-55128 (Germany); Bassler, N. [Department of Physics and Astronomy, Aarhus University, Ny Munkegade 120, Aarhus C, Aarhus 8000 (Denmark); Blaickner, M. [AIT Austrian Institute of Technology GmbH, Vienna A-1220 (Austria); Ziegner, M. [AIT Austrian Institute of Technology GmbH, Vienna A-1220, Austria and TU Wien, Vienna University of Technology, Vienna A-1020 (Austria); Hsiao, M. C. [Insitute of Nuclear Engineering and Science, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Liu, Y. H. [Nuclear Science and Technology Development Center, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Koivunoro, H. [Department of Physics, University of Helsinki, POB 64, FI-00014, Finland and HUS Medical Imaging Center, Helsinki University Central Hospital, FI-00029 HUS (Finland); Auterinen, I.; Serén, T.; Kotiluoto, P. [VTT Technical Research Centre of Finland, Espoo (Finland); Palmans, H. [National Physical Laboratory, Acoustics and Ionising Radiation Division, Teddington TW11 0LW, United Kingdom and Medical Physics Group, EBG MedAustron GmbH, Wiener Neustadt A-2700 (Austria); Sharpe, P. [National Physical Laboratory, Acoustics and Ionising Radiation Division, Teddington TW11 0LW (United Kingdom); Langguth, P. [Department of Pharmacy and Toxicology, University of Mainz, Mainz D-55128 (Germany); Hampel, G. [Institut für Kernchemie, Johannes Gutenberg-Universität, Mainz D-55128 (Germany)
2015-01-15
Purpose: The response of alanine solid state dosimeters to ionizing radiation strongly depends on particle type and energy. Due to nuclear interactions, neutron fields usually also consist of secondary particles such as photons and protons of diverse energies. Various experiments have been carried out in three different neutron beams to explore the alanine dose response behavior and to validate model predictions. Additionally, application in medical neutron fields for boron neutron capture therapy is discussed. Methods: Alanine detectors have been irradiated in the thermal neutron field of the research reactor TRIGA Mainz, Germany, in five experimental conditions, generating different secondary particle spectra. Further irradiations have been made in the epithermal neutron beams at the research reactor FiR 1 in Helsinki, Finland, and the Tsing Hua open-pool reactor in Hsinchu, Taiwan ROC. Readout has been performed with electron spin resonance spectrometry with reference to an absorbed dose standard in a ⁶⁰Co gamma ray beam. Absorbed doses and dose components have been calculated using the Monte Carlo codes FLUKA and MCNP. The relative effectiveness (RE), linking absorbed dose and detector response, has been calculated using the Hansen and Olsen alanine response model. Results: The measured dose response of the alanine detector in the different experiments has been evaluated and compared to model predictions. Therefore, a relative effectiveness has been calculated for each dose component, accounting for its dependence on particle type and energy. Agreement within 5% between model and measurement has been achieved for most irradiated detectors. Significant differences have been observed in response behavior between thermal and epithermal neutron fields, especially regarding dose composition and depth dose curves. The calculated dose components could be verified with the experimental results in the different primary and secondary particle fields. Conclusions: The
Anonymous
2007-01-01
In order to meet the demand of nowcasting convective storms in Beijing, the climatological characteristics of convective storms in Beijing and its vicinity were analyzed based on the infrared (IR) temperature of black body (TBB) data during May-August of 1997-2004. The climatological probabilities, the diurnal cycle and the spatial distribution of convective storms are given respectively in this paper. The results show that the climatological characteristics of convective storms denoted by TBB ≤ -52°C are consistent with those of statistical studies based on the surface and lightning observations. Furthermore, the climatological characteristics of May and June are very different from those of July and August, showing that there are two types of convective storms in this region. One occurs in the transient polar air mass on the midlatitude continent during the late spring and early summer; this type of convection arises with thunder, strong wind gusts and hail over the mountainous area in the northern part of this region from afternoon to nightfall. The other occurs with heavy rainfall in the warm and moist air mass over the North China Plain and the vicinity of the Bohai Sea. This study also shows that the long-term IR TBB data observed by geostationary satellite can complement the temporal and spatial limitations of the weather radar and surface observations.
The aim of this thesis is to contribute to a better understanding of the health effects of chronic external low doses of ionising radiation. This work is based on the French cohort of CEA-AREVA NC nuclear workers. The main stages of this thesis were (1) conducting a review of epidemiological studies on nuclear workers, (2) completing the database and performing a descriptive analysis of the cohort, (3) quantifying risk by different statistical methods and (4) modelling the exposure-time-risk relationship. The cohort includes monitored workers employed for more than one year between 1950 and 1994 at the CEA or AREVA NC companies. Individual annual external exposure, work history, vital status and causes of death were reconstructed for each worker. Standardized mortality ratios were computed using French national mortality rates as an external reference. The exposure-risk analysis was conducted in the cohort using the linear excess relative risk model, based on both Poisson regression and the Cox model. Time-dependent modifying factors were investigated by adding an interaction term to the model or by using exposure time windows. The cohort includes 36,769 workers, followed up until age 60 on average. During the 1968-2004 period, 5,443 deaths, 2,213 cancers, 62 leukemias and 1,314 cardiovascular diseases were recorded. Among the 57% of workers who were exposed, the mean cumulative dose was 21.5 millisieverts (mSv). A strong healthy worker effect is observed in the cohort. Significantly elevated risks of pleural cancer and melanoma deaths were observed in the cohort but were not associated with dose. No significant association was observed with solid cancers, lung cancer or cardiovascular diseases. A significant dose-response relationship was observed for leukemia excluding chronic lymphatic leukemia, mainly for doses received less than 15 years before and for yearly dose rates higher than 10 mSv. This PhD work contributes to the evaluation of risks associated with chronic external radiation.
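The standardized mortality ratio (SMR) used in such cohort analyses is observed deaths over expected deaths, with Poisson confidence limits. A sketch with hypothetical counts (the figures below are not the cohort's actual SMR inputs; Byar's approximation is one common choice for the CI):

```python
import math

def smr_ci(observed, expected):
    """Standardized mortality ratio with an approximate 95% CI
    (Byar's approximation to the exact Poisson limits)."""
    smr = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - 1.96 / (3 * math.sqrt(observed))) ** 3 / expected
    hi = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                           + 1.96 / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return smr, lo, hi

# hypothetical: 62 observed deaths against 80 expected from national rates;
# an SMR below 1 is the signature of a healthy worker effect
smr, lo, hi = smr_ci(62, 80.0)
```

An SMR whose CI lies entirely below 1 would indicate mortality significantly lower than the reference population, as a healthy worker effect produces.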
X-ray energy dependence of the dose response of SIRAD radiation dosimeters
Cheung, Tsang [City University of Hong Kong, Department of Physics and Materials Science, Kowloon Tong, Hong Kong (China); Butson, Martin J. [City University of Hong Kong, Department of Physics and Materials Science, Kowloon Tong, Hong Kong (China); Illawarra Cancer Care Centre, Department of Medical Physics, Crown St, Wollongong, NSW 2500 (Australia); Centre for Medical Radiation Physics, University of Wollongong, Northfields Ave, Gwyneville 2518, NSW (Australia); E-mail: martin.Butson@sesiahs.health.nsw.gov.au; Yu, Peter K.N. [City University of Hong Kong, Department of Physics and Materials Science, Kowloon Tong, Hong Kong (China)
2007-07-15
SIRADs (self-indicating instant radiation alert dosimeters) are designed to measure accident radiation doses. As the energy of radiation is usually unknown in such situations, a detector with a weak energy dependence of its dose response would be ideal. We have studied the energy dependence of the dose response of SIRADs in the range from 50 kVp to 10 MV, which corresponds to photon equivalent energies from 25.5 keV to 2.2 MeV. The response to the same dose at 25.5 keV is (29 ± 4)% (±1 SD) lower than the response at 1.4 MeV. The dose response increases slowly with radiation energy. This energy dependence is relatively weak in comparison with that of radiographic films and similar in magnitude to that of lithium fluoride thermoluminescence dosimeters. This energy dependence of the response diminishes the accuracy of dose assessments in radiation fields of unknown energy, but does not significantly compromise the core ability of the devices to provide visual estimates of radiation doses.
Gamma- and electron dose response of the electrical conductivity of polyaniline based polymer blends
Conducting polymers, also known as 'synthetic metals', have been the subject of widespread investigation over the past decade due to their very promising characteristics. Polyaniline (PANI) holds a special position among conducting polymers in that its most highly conducting doped form can be reached by protonic acid doping or oxidative doping. It was published earlier that the electrical conductivity of some polyaniline based polymer composites increases to a significant extent when irradiated with gamma, electron or UV radiation. The aim of the present study was to measure the high frequency conductivity of blended films of PANI with poly(vinylchloride), PVC, and chlorinated poly(propylene) irradiated in air to different doses. In order to find the most suitable composition of these composites, the mass percentage of PANI within the PPCl and PVC matrix was varied between 5 and 30%. These samples were then gamma irradiated and the induced electrical conductivity was measured in the 1 kHz - 1 MHz frequency range to determine the most sensitive evaluation conditions. After selecting both the most suitable measuring conditions and the blend compositions, the dose response of the chosen samples was determined in the dose range of 10-250 kGy. With respect to potential dosimetry applications, the effect of electron irradiation, the effect of irradiation temperature and the stability of the irradiated samples have also been investigated.
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values…
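The sensitivity of Fisher's statistic to small p-values is easy to demonstrate. Under the global null, X = −2 Σ ln(p_i) follows a chi-square distribution with 2k degrees of freedom. A minimal SciPy-based sketch (the helper name is illustrative, not from the paper):

```python
import math
from scipy import stats

def fisher_combined(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the global null."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    return stats.chi2.sf(x, df=2 * k)  # combined p-value

# A single very small p-value dominates several unremarkable ones:
print(fisher_combined([1e-6, 0.9, 0.9]))  # highly significant overall
print(fisher_combined([0.2, 0.2, 0.2]))   # not significant overall
```

The first call illustrates the abstract's point: one extreme p-value drives the combined statistic even when all other tests are unremarkable.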
Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie
2016-01-01
Background When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. Methods The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Results Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. Conclusions The study emphasizes the importance of combining clinical relevance
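The core decision rule of such a benchmarking method can be sketched as follows. This is a simplified normal-approximation illustration with made-up counts, not Riksstroke's actual case-mix adjustment:

```python
import math

def flag_low_performer(events, n, benchmark_risk, confidence_z=1.96):
    """Label a hospital an outlier only if the lower confidence bound of
    its observed risk exceeds the benchmark value. Normal-approximation
    sketch; the real method uses case-mix adjusted risks."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p - confidence_z * se > benchmark_risk

# A large hospital can clear the bar on a modest excess over the benchmark...
print(flag_low_performer(340, 1000, 0.30))
# ...while a small hospital with the same observed risk cannot.
print(flag_low_performer(34, 100, 0.30))
```

The two calls reproduce the size effect the abstract describes: identical observed risks yield different verdicts purely because of sample size, which is why the confidence level and benchmark value trade off sensitivity against specificity.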
Dose-response relationships for female radium dial workers
Among 1474 women employed in the United States radium dial-painting industry before 1930, there are 61 known cases of bone sarcoma and 21 cases of carcinoma of the paranasal sinuses or the mastoid air cells ('head carcinomas'). The relative effectiveness of 226Ra and 228Ra and dose-incidence relationships were examined for the 759 of these women whose radium body burden has been determined; there are 38 cases of bone sarcoma and 17 cases of head carcinoma in this group. Incidence (I) was expressed as tumor cases per person-year and the dose parameter (D) was the quantity (microcuries) of radium that entered the blood during the period of exposure. To the observed data for each type of tumor were fitted equations of the general form I = (C + αD + βD²)e^(−γD), where C, the natural incidence for this population, was about 10^−5 per person-year. For each equation, the best values of the dose coefficients were found by a least-squares fitting procedure. An equation of the form I = (C + βD²)e^(−γD) provided the best fit for the bone sarcomas, when the dose was expressed as microcuries of 226Ra plus 2.5 times microcuries of 228Ra. An acceptable fit to the head carcinoma data was provided by the linear equation I = C + αD, with D equal to microcuries of 226Ra. As a test of bias due to selection of cases with known symptoms of malignancy, the analyses were repeated after removal of all cases for whom radium was determined only after exhumation, and no significant changes in the fitted coefficients were found. The dose-incidence equations obtained when the dose was expressed as average skeletal dose in rad are also given.
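The least-squares fitting step for the bone-sarcoma form I = (C + βD²)e^(−γD) can be sketched with SciPy. Data and parameter values below are synthetic, chosen only to illustrate the procedure; they are not the paper's fitted coefficients:

```python
import numpy as np
from scipy.optimize import curve_fit

def bone_sarcoma_incidence(D, C, beta, gamma):
    """I = (C + beta*D^2) * exp(-gamma*D): quadratic tumour induction
    damped by an exponential cell-killing term at high dose."""
    return (C + beta * D**2) * np.exp(-gamma * D)

# Synthetic dose-incidence data with made-up "true" parameters.
rng = np.random.default_rng(42)
D = np.linspace(1.0, 100.0, 40)                 # microcuries entering blood
I_true = bone_sarcoma_incidence(D, 1e-5, 2e-7, 0.02)
I_obs = I_true * (1.0 + 0.05 * rng.standard_normal(D.size))

# Nonlinear least-squares recovery of (C, beta, gamma).
popt, _ = curve_fit(bone_sarcoma_incidence, D, I_obs, p0=(1e-5, 1e-7, 0.01))
C_hat, beta_hat, gamma_hat = popt
```

With 5% relative noise the damping coefficient γ is recovered close to its true value, mirroring the paper's approach of fitting the dose coefficients by least squares.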
This study was carried out to investigate the radiation dose response of micronucleus frequencies in Tradescantia pollen mother cells. The number of micronuclei in the tetrads increased as a result of chromosome deletion after irradiation. The maximal micronucleus frequency showed a good dose-response relationship in the dose range 0-50 cGy. On the basis of this relationship, a dose of 1 cGy resulted in two additional micronuclei per 100 tetrads. The radiation dose-response relationship of micronucleus occurrence is a prerequisite for biological monitoring of radiation and can be adapted for biological risk assessment of toxicants and for safety testing of water or soil integrity.
Engelbrecht, C. A.; Frescura, F. A. M.; Frank, B. S.
2009-01-01
We have used Lomb-Scargle periodogram analysis and Monte Carlo significance tests to detect periodicities above the 3-sigma level in the Beta Cephei stars V400 Car, V401 Car, V403 Car and V405 Car. These methods produce six previously unreported periodicities in the expected frequency range of excited pulsations: one in V400 Car, three in V401 Car, one in V403 Car and one in V405 Car. One of these six frequencies is significant above the 4-sigma level. We provide statistical significances for...
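A Lomb-Scargle periodogram with a Monte Carlo significance threshold, as used above, can be sketched for unevenly sampled data. The signal is synthetic, and the ~3-sigma threshold is taken as the empirical 99.7th percentile of the maximum power over shuffled (signal-free) series:

```python
import numpy as np
from scipy.signal import lombscargle

# Unevenly sampled noisy sinusoid (true angular frequency 2.0 rad/unit time).
rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0.0, 50.0, 300))
y = np.sin(2.0 * t) + 0.3 * rng.standard_normal(t.size)

freqs = np.linspace(0.5, 5.0, 1000)     # angular frequencies to scan
power = lombscargle(t, y, freqs, precenter=True, normalize=True)
best = freqs[np.argmax(power)]          # detected periodicity

# Monte Carlo significance: shuffling y destroys any periodicity, so the
# distribution of maximum shuffled power gives a detection threshold.
null_max = [lombscargle(t, rng.permutation(y), freqs,
                        precenter=True, normalize=True).max()
            for _ in range(100)]
threshold = np.percentile(null_max, 99.7)
```

A peak in `power` that exceeds `threshold` is significant at roughly the 3-sigma level, which is the style of criterion applied to the Beta Cephei light curves.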
Dose response and fading characteristics of an alanine-agarose gel
The dose response of an alanine-agarose gel, analyzed by ESR spectrometry, and the stability of the radiation-induced free radicals have been investigated. The stability of the ESR signal is higher for dosimeter samples analyzed at 77 K than for dried samples, analyzed at room-temperature. The dose response is linear to within ±2% in the absorbed dose interval 2-100 Gy. The variations in spectral line shape were analyzed at temperatures between 77 and 270 K. The experimental ESR spectrum at 77 K was compared with a simulated spectrum of polycrystals of L-α-alanine. (Author)
Nazife Sefi Yurdakul
2015-10-01
Full Text Available Objectives: To evaluate the results of symmetric and asymmetric surgery and responses to surgical amounts in patients with infantile esotropia. Materials and Methods: The records of patients with infantile esotropia who underwent bilateral medial rectus recession (symmetric surgery) or unilateral medial rectus recession with lateral rectus resection (asymmetric surgery) were analyzed. The results of the cases with symmetric (group 1), asymmetric (group 2), successful (group 3) and failed (group 4) surgeries were compared, and responses to the amount of surgery were investigated. Results: There were no significant differences between group 1 (n=71) and group 2 (n=13) cases in terms of gender, refraction, preoperative distance deviation, anisometropia and postoperative deviation angles, binocular vision, surgical success or follow-up period (p>0.05). The rate of amblyopia, near deviation and amount of surgery were higher in group 2 cases (p<0.05). The average postoperative follow-up period was 15.41±19.93 months (range, 6-98 months) in group 3 cases and 40.45±40.06 months (range, 6-143 months) in group 4 cases (p=0.000). No significant difference was detected in the amount of deviation corrected per 1 mm of surgical procedure between the successful cases in the symmetric and asymmetric groups (p>0.05). Conclusion: Symmetric or asymmetric surgery may be preferable in patients with infantile esotropia according to the clinical features. It is necessary for every clinic to review its own dose-response results. (Turk J Ophthalmol 2015; 45: 197-202)
Two methods for measuring failure fraction on irradiated coated-particle fuels have been developed, one in the United States (the IMGA system - Irradiated-Microsphere Gamma Analyzer) and one in the Federal Republic of Germany (FRG) (the PIAA procedure - Postirradiation Annealing and Beta Autoradiography). A comparison of the two methods on two standardized sets of irradiated particles was undertaken to evaluate the accuracy, operational procedures, and expense of each method in obtaining statistically significant results. From the comparison, the postirradiation examination method employing the IMGA system was found to be superior to the PIAA procedure for measuring statistically significant failure fractions. Both methods require that the irradiated fuel be in the form of loose particles, each requires extensive remote hot-cell facilities, and each is capable of physically separating failed particles from unfailed particles. Important differences noted in the comparison are described
Paula, Débora P; Andow, David A; Bellinati, André; Timbó, Renata Velozo; Souza, Lucas M; Pires, Carmen S S; Sujii, Edison R
2016-04-01
Dose-response assays and surrogate species are standard methods for risk analysis for environmental chemicals. These assume that individuals within a species have unimodal responses and that a surrogate species can predict responses of other related taxa. We exposed immature individuals of closely related aphidophagous coccinellid predators, Cycloneda sanguinea and Harmonia axyridis, to Cry1Ac and Cry1F toxins through uniform and constant artificial tritrophic exposure through Myzus persicae aphids. Both toxins were detected in coccinellid pupae, with individual and interspecific variation. Uptake was significantly higher in H. axyridis than in C. sanguinea, both in the proportion of individuals and the concentrations per individual. We also observed bimodal uptake of the Cry toxins by H. axyridis, which indicated that some individuals had low bioaccumulation and some had high bioaccumulation. This suggests that standard dose-response assays need to be interpreted with caution and future assays should examine the modality of the responses. In addition, the similarity in the biological effects of the Cry toxins in the two predators was due to different biological exposure mechanisms. The majority of H. axyridis were exposed both internally and in the gut, while C. sanguinea was exposed primarily in the gut. Thus, despite their close phylogenetic relatedness, these species would not be good surrogates for each other and the surrogate species methodology should be tested more rigorously. PMID:26846212
Kalimeris, A.; Potirakis, S. M.; Eftaxias, K.; Antonopoulos, G.; Kopanas, J.; Nomikos, C.
2016-05-01
A multi-spectral analysis of the kHz electromagnetic time series associated with the Athens earthquake (M = 5.9, 7 September 1999) is presented here, which results in the reliable discrimination of the fracto-electromagnetic emissions from the natural geo-electromagnetic field background. Five spectral analysis methods are utilized in order to resolve the statistically significant variability modes of the studied dynamical system out of a red noise background (among them the revised Multi-Taper Method, Singular Spectrum Analysis, and Wavelet Analysis). The performed analysis reveals the existence of three distinct epochs in the time series for the period before the earthquake: a "quiet", a "transitional" and an "active" epoch. Towards the end of the active epoch, during a sub-period starting approximately two days before the earthquake, the dynamical system passes into a high-activity state, where electromagnetic signal emissions become powerful and statistically significant on almost all time-scales. The temporal behavior of the studied system in each of these epochs is further examined through mathematical reconstruction, in the time domain, of those spectral features that were found to be statistically significant. The transition of the system from the quiet to the active state proved to be detectable first in the long time-scales and afterwards in the short scales. Finally, a Hurst exponent analysis revealed persistent characteristics embedded in the two strong EM bursts observed during the "active" epoch.
Reynolds, S J; Donham, K J; Whitten, P; Merchant, J A; Burmeister, L F; Popendorf, W J
1996-01-01
Studies describing respiratory health hazards for workers in swine production facilities have been published in the United States, Sweden, Canada, the Netherlands, and Denmark. Up to 50% of these workers experience bronchitis, organic dust toxic syndrome, hyper-reactive airways disease, chronic mucous membrane irritation, and other respiratory effects. These studies clearly show that this occupational environment poses a significant health risk, and that control methods are needed to protect the worker. Before precise control strategies can be developed, implemented, and evaluated, dose-response studies are required to determine acceptable target levels for exposure. A previous manuscript described the development of multiple regression equations characterizing the relationships between environmental exposures and pulmonary response in a cohort of 207 swine producers. Baseline pulmonary function was included as a significant predictor of cross-shift decrements in pulmonary function, in addition to personal measurements of dust, endotoxin, and ammonia concentrations. These equations were then used to predict specific exposure levels of dust and ammonia that could be expected to elicit significant decrements in cross-shift pulmonary function. This paper presents the results from analysis of follow-up data obtained on this same cohort 2 years after the initial measurements. At the second measurement period of the study (time-2), swine workers were found to have a mean cross-shift decrease in FEV1 of 2%. Cross-shift change in FEV1 was significantly correlated with personal exposures to total dust, total endotoxin, respirable endotoxin, and ammonia. The magnitude of the decrease in FEV1 was associated with increasing airborne concentrations of these environmental parameters, thus confirming the dose-response relationship observed in the initial study (time-1). The correlation of dust with FEV1 changes in workers with more than 6 years of exposure (time
Imre, G; Fokkema, DS; Den Boer, JA; Ter Horst, GJ
2006-01-01
The present dose-response study sought to determine the effects of subanesthetic dosages (4-16 mg/kg) of ketamine on locomotion, sensorimotor gating (PPI), working memory, as well as c-fos expression in various limbic regions implicated in the pathogenesis of schizophrenia. In addition, we examined
Steinman, Kenneth J.; Ferketich, Amy K.; Sahr, Timothy
2008-01-01
This article addresses two inconsistent findings in the literature on adolescent religious activity (RA) and substance use: whether a dose-response relationship characterizes the association of these variables, and whether the association varies by grade, gender, ethnicity, family structure, school type, and type of substance. Multinomial logistic…
G. Koshy; A. Delpisheh; B.J. Brabin
2011-01-01
The combined dose response effects of pregnancy cigarette smoke exposure on childhood overweight, obesity and short stature have not been reported. A community based cross-sectional survey of 3038 children aged 5-11 years from 15 primary schools in Merseyside, UK. Self-completed parental questionnai
Cytogenetics dosimetry: dose-response curve for low doses of X-ray
The purpose of this study was to conduct a preliminary investigation toward the future standardization of a dose-response curve for low doses of X-rays, through the analysis of in vitro cultures of peripheral blood samples from 3 men and 3 women aged 18-40 years, occupationally unexposed to artificial sources of ionizing radiation and, where possible, nonsmokers
Dose-Response Issues Concerning the Relations between Regular Physical Activity and Health.
Rankinen, Tuomo; Bouchard, Claude
2002-01-01
This paper categorizes the many benefits of physical activity, offering information concerning the type of dose necessary to get that benefit. In 2000, Health Canada and the United States Centers for Disease Control and Prevention, along with other agencies, sponsored a symposium to determine whether there was a dose-response relationship between…
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. A substantial portion of human T. gondii infections may be acquired through the consumption of meats. The dose-response relationship for human exposure...
Kerger, Brent D; Copeland, Teri L; DeCaprio, Anthony P
2011-10-01
Persistent organic chemicals, such as perfluorooctanoic acid (PFOA), perfluorooctanesulfonate (PFOS), dioxins, and polychlorinated biphenyls, pose investigative challenges because they are found in virtually everyone (there is no unexposed control group). To overcome this problem, outcome data in some studies are sorted by chemical dose level and findings in low-end dose groups are compared to sequential higher dose groups. An example is the C8 Health Project that evaluated serum PFOA/PFOS (C8) and total cholesterol among 46,294 West Virginia residents who lived, worked, or went to school for at least 1 year in a C8 contaminated drinking-water district and were over age 18 in 2005-2006. The risk for high total cholesterol (>240 mg/dL) measured via odds ratios (ORs) in logistic regression models showed sequential OR increases with PFOA quartile, in comparison to the lowest quartile (OR = 1.00), that were each significantly elevated (OR = 1.21, 1.33, and 1.40, respectively), but age, sex, and body mass index were stronger correlates. Importantly, the magnitude of cholesterol increase was small (12 mg/dL from lowest to highest exposure deciles) and comparison to similar statistics for the general U.S. population showed the C8 cohort had lower rates of high cholesterol. This suggests that inadvertent selection bias may have affected the lowest exposure quartile (control group), making tenuous the dose-response relationship between PFOA/PFOS and risk of high cholesterol. This case illustrates the substantial difficulties in assigning toxicological importance to statistical comparisons for common disease states that utilize subgroups with low exposures as an effective control group. PMID:21770727
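The quartile-versus-referent comparison underlying such odds ratios can be sketched with a crude 2x2 odds ratio and a Woolf confidence interval. The counts are illustrative; the C8 analyses themselves used adjusted logistic regression models:

```python
import math

def odds_ratio(cases_exp, noncases_exp, cases_ref, noncases_ref):
    """Crude odds ratio for an exposure quartile versus the referent
    (lowest) quartile, with a Woolf 95% CI computed on the log scale."""
    or_ = (cases_exp * noncases_ref) / (noncases_exp * cases_ref)
    se = math.sqrt(1/cases_exp + 1/noncases_exp + 1/cases_ref + 1/noncases_ref)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Identical risk in both groups gives OR = 1.0 with a CI spanning 1:
print(odds_ratio(50, 450, 50, 450))
```

The abstract's caution applies here: if the referent (lowest-exposure) group is affected by selection bias, every quartile OR is inflated relative to it, even when the arithmetic is correct.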
Intravenous Nicotine Self-Administration in Smokers: Dose-Response Function and Sex Differences.
Jensen, Kevin P; DeVito, Elise E; Valentine, Gerald; Gueorguieva, Ralitza; Sofuoglu, Mehmet
2016-07-01
Sex differences in the sensitivity to nicotine may influence vulnerability to tobacco dependence. The goal of this study was to investigate the dose-response function for the reinforcing and subjective effects of intravenous nicotine in male and female smokers. Tobacco-dependent subjects (12 male and 14 female) participated in four experimental sessions in which they received sample infusions of saline and nicotine (0.1, 0.2, 0.3, or 0.4 mg doses) in a randomized double-blind crossover design. During each session, subjects first received the sample infusions, and heart rate (HR), blood pressure, and subjective stimulatory, pleasurable and aversive responses were monitored. Immediately following the sample infusions, subjects self-administered either nicotine or saline in six double-blind forced-choice trials. A sex by dose interaction was observed in the nicotine choice paradigm. Nicotine self-administration rate was negatively correlated with nicotine dose in males (males displayed choice preference for low doses of nicotine over high doses of nicotine), but no significant relationship between dose and choice preference was evident in females. Relative to placebo, sample doses of nicotine increased heart rate and blood pressure, and induced stimulatory, pleasurable, and aversive subjective effects. Diastolic blood pressure increased dose dependently in males, but not in females. These findings, which demonstrate sex differences in nicotine self-administration for doses that are near to the reinforcement threshold, suggest that male and female smokers may respond differently to the changes in nicotine doses available for self-administration. PMID:26717881
Dose-response relationship between sports activity and musculoskeletal pain in adolescents.
Kamada, Masamitsu; Abe, Takafumi; Kitayuguchi, Jun; Imamura, Fumiaki; Lee, I-Min; Kadowaki, Masaru; Sawada, Susumu S; Miyachi, Motohiko; Matsui, Yuzuru; Uchio, Yuji
2016-06-01
Physical activity has multiple health benefits but may also increase the risk of developing musculoskeletal pain (MSP). However, the relationship between physical activity and MSP has not been well characterized. This study examined the dose-response relationship between sports activity and MSP among adolescents. Two school-based serial surveys were conducted 1 year apart in adolescents aged 12 to 18 years in Unnan, Japan. Self-administered questionnaires were completed by 2403 students. Associations between time spent in organized sports activity and MSP were analyzed cross-sectionally (n = 2403) and longitudinally (n = 374, students free of pain and in seventh or 10th grade at baseline) with repeated-measures Poisson regression and restricted cubic splines, with adjustment for potential confounders. The prevalence of overall pain, defined as having pain recently at least several times a week in at least one part of the body, was 27.4%. In the cross-sectional analysis, sports activity was significantly associated with pain prevalence. Each additional 1 h/wk of sports activity was associated with a 3% higher probability of having pain (prevalence ratio = 1.03, 95% confidence interval = 1.02-1.04). Similar trends were found across causes (traumatic and nontraumatic pain) and anatomic locations (upper limbs, lower back, and lower limbs). In longitudinal analysis, the risk ratio for developing pain at 1-year follow-up per 1 h/wk increase in baseline sports activity was 1.03 (95% confidence interval = 1.02-1.05). Spline models indicated a linear association: the more adolescents played sports, the more likely they were to have and develop pain. PMID:26894915
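Under the reported log-linear association (PR 1.03 per additional h/wk), the implied prevalence ratio compounds multiplicatively with weekly volume. A tiny illustrative helper, not the paper's spline model:

```python
def implied_prevalence_ratio(hours_per_week, pr_per_hour=1.03):
    """Prevalence ratio implied at a given weekly sports volume under a
    log-linear dose-response of 1.03 per additional h/wk. Illustrative
    extrapolation only; the study fitted restricted cubic splines."""
    return pr_per_hour ** hours_per_week

# 10 h/wk of sports implies roughly a 34% higher pain prevalence
# than no organized sports, under the log-linear assumption.
print(implied_prevalence_ratio(10))
```

This compounding is what a "3% per hour" coefficient means in practice: small per-unit effects accumulate to substantial differences across the observed activity range.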
Intracoronary irradiation: dose response for the prevention of restenosis in swine
Purpose: Restenosis after percutaneous transluminal coronary angioplasty represents, in part, a proliferative response of vascular smooth muscle at the site of injury. We have previously shown that high-dose radiation (20 Gy), delivered via an intracoronary 192Ir source, causes focal medial fibrosis and markedly impairs the restenosis process after balloon angioplasty in swine. This study sought to delineate the dose-response characteristics of this effect. Methods and Materials: Forty juvenile swine underwent coronary angiography; a segment of the left coronary artery was chosen as a target for balloon injury. In 30 swine, a 2 cm ribbon of 192Ir was positioned at the target segment and 20, 15, or 10 Gy were delivered to the vessel wall (10 animals/dose). Subsequently, overdilatation balloon angioplasty was performed at the irradiated segment. In 10 control swine, overdilatation balloon angioplasty was performed without previous irradiation. Thirty-eight animals survived until sacrifice at 30 ± 3 days. Histopathological analysis was performed by a pathologist in a blinded manner. The area of maximal luminal compromise within the target segment was analyzed via computer-assisted planimetry. Results: Neointimal area was decreased by 71.4% at 20 Gy and by 58.3% at 15 Gy compared with control animals (p < 0.05 for both). A stimulatory effect on smooth muscle cell proliferation was noted at 10 Gy, with a 123% increase in neointimal area compared with controls (p < 0.05). Mean percent area stenosis was also reduced by 63% at 20 Gy and by 74.8% at 15 Gy compared with controls (p < 0.05 for both). Conclusions: Intracoronary irradiation prior to overstretch balloon angioplasty markedly reduces neointima formation; this effect is dose dependent, with evidence of a significant stimulatory effect at 10 Gy. The effective therapeutic dose range for the prevention of restenosis in this model begins at approximately 15 Gy delivered to the vessel wall
El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.
2006-11-01
Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
Dengfeng Gao
BACKGROUND: The consumption of dairy products may influence the risk of type 2 diabetes mellitus (T2DM), but inconsistent findings have been reported. Moreover, the large variation in the types of dairy intake has not yet been fully explored. METHODS AND RESULTS: We conducted a systematic review and meta-analysis to clarify the dose-response association of dairy products intake and T2DM risk. We searched PubMed, EMBASE and Scopus for studies of dairy products intake and T2DM risk published up to the end of October 2012. Random-effects models were used to estimate summary relative risk (RR) statistics. Dose-response relations were evaluated using data from different dairy products in each study. We included 14 articles of cohort studies that reported RR estimates and 95% confidence intervals (95% CIs) of T2DM with dairy products intake. We found an inverse linear association of consumption of total dairy products (13 studies), low-fat dairy products (8 studies), cheese (7 studies) and yogurt (7 studies) and risk of T2DM. The pooled RRs were 0.94 (95% CI 0.91-0.97) and 0.88 (0.84-0.93) for 200 g/day total and low-fat dairy consumption, respectively. The pooled RRs were 0.80 (0.69-0.93) and 0.91 (0.82-1.00) for 30 g/d cheese and 50 g/d yogurt consumption, respectively. We also found a nonlinear association of total and low-fat dairy intake and T2DM risk, and the inverse association appeared to be strongest within 200 g/d intake. CONCLUSION: A modest increase in daily intake of dairy products such as low-fat dairy, cheese and yogurt may contribute to the prevention of T2DM, which needs confirmation in randomized controlled trials.
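The random-effects pooling step described above can be sketched with a DerSimonian-Laird estimator on the log-RR scale. This is a generic textbook implementation with made-up study inputs, not the authors' exact analysis:

```python
import math

def dersimonian_laird(rr, ci_lo, ci_hi):
    """Pool per-study relative risks (with 95% CIs) using the
    DerSimonian-Laird random-effects model on the log scale.
    Expects two or more studies."""
    logs = [math.log(r) for r in rr]
    ses = [(math.log(h) - math.log(l)) / (2 * 1.96)
           for l, h in zip(ci_lo, ci_hi)]
    w = [1 / s**2 for s in ses]                      # fixed-effect weights
    mean_fe = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - mean_fe)**2 for wi, yi in zip(w, logs))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rr) - 1)) / c) if c > 0 else 0.0
    w_re = [1 / (s**2 + tau2) for s in ses]          # random-effects weights
    mean_re = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(mean_re),
            math.exp(mean_re - 1.96 * se_re),
            math.exp(mean_re + 1.96 * se_re))

# Illustrative pooling of three hypothetical cohort RRs:
pooled, lo, hi = dersimonian_laird([0.85, 0.95, 0.90],
                                   [0.70, 0.82, 0.78],
                                   [1.03, 1.10, 1.04])
```

When between-study heterogeneity (tau²) is zero, the estimator reduces to the fixed-effect inverse-variance mean; heterogeneous studies widen the pooled CI.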
Pottenger, Lynn H; Gollapudi, B Bhaskar
2010-01-01
For more than 40 years, genotoxicity data have been interpreted in a qualitative, binary mode; a chemical is considered either positive or negative for a response in the test system. Although dose-response information is sometimes used in this decision, it is not routine to obtain the amount of information needed to inform risk assessment, for example to determine no-observed-genotoxic-effect-levels, primarily due to the historical view of genotoxic responses as "linear, no-threshold." Only recently have researchers begun to address this issue through robust experimental designs and application of statistical models. A growing body of evidence supports the existence of response thresholds for a number of mutagenic agents, in vitro and in vivo. Clearly, simple observation of a "hockey-stick" dose-response curve is not sufficient to establish a threshold. Collection of robust empirical data must be supported with an analysis of biological plausibility for the observed threshold. In this context, a chemical-specific mode-of-action (MOA) approach, which identifies key events responsible for the observed mutagenic effect, is extremely valuable. Biomarkers of key events, providing qualitative and quantitative information, can be integrated in a weight-of-evidence-based assessment of genotoxicity data from multiple test systems and used to identify data gaps to resolve/reduce uncertainties during the risk assessment process. To this end, specific recommendations on study design and data analysis are proposed. As the Environmental Mutagen Society celebrates its 40th anniversary, the field of genetic toxicology is marking a milestone on the path to a new paradigm, using a MOA, data-driven approach to answer questions about thresholds for genotoxic agents. PMID:20806283
Farjam, R; Pramanik, P; Srinivasan, A; Chapman, C; Tsien, C; Lawrence, T; Cao, Y [University of Michigan, Ann Arbor, MI (United States)
2014-06-15
Purpose: Vascular injury could be a cause of hippocampal dysfunction leading to late neurocognitive decline in patients receiving brain radiotherapy (RT). Hence, our aim was to develop a multivariate interaction model for characterization of hippocampal vascular dose-response and early prediction of radiation-induced late neurocognitive impairments. Methods: 27 patients (17 males and 10 females, age 31–80 years) were enrolled in an IRB-approved prospective longitudinal study. All patients were diagnosed with a low-grade glioma or benign tumor and treated by 3-D conformal or intensity-modulated RT with a median dose of 54 Gy (50.4–59.4 Gy in 1.8-Gy fractions). Six DCE-MRI scans were performed from pre-RT to 18 months post-RT. DCE data were fitted to the modified Toft model to obtain the transfer constant of gadolinium influx from the intravascular space into the extravascular extracellular space, Ktrans, and the fraction of blood plasma volume, Vp. The hippocampus vascular property alterations after starting RT were characterized by changes in the hippocampal mean values μh(Ktrans)τ and μh(Vp)τ. The dose-response, Δμh(Ktrans/Vp)pre->τ, was modeled using a multivariate linear regression considering interactions of dose with age, sex, hippocampal laterality and presence of tumor/edema near a hippocampus. Finally, the early vascular dose-response in hippocampus was correlated with neurocognitive decline 6 and 18 months post-RT. Results: The μh(Ktrans) increased significantly from pre-RT to 1 month post-RT (p<0.0004). The multivariate model showed that the dose effect on Δμh(Ktrans)pre->1M post-RT interacted with sex (p<0.0007) and age (p<0.00004), with the dose-response more pronounced in older females. Also, the vascular dose-response in the left hippocampus of females was significantly correlated with memory function decline at 6 (r = −0.95, p<0.0006) and 18 (r = −0.88, p<0.02) months post-RT. Conclusion: The hippocampal vascular
Karen Larwin
2014-02-01
The present study examined students' statistics-related self-efficacy, as measured with the current statistics self-efficacy (CSSE) inventory developed by Finney and Schraw (2003). Structural equation modeling was used to check the confirmatory factor analysis of the one-dimensional factor of CSSE. Once confirmed, this factor was used to test whether a significant link to prior mathematics experiences exists. Additionally, a new post-structural equation modeling (SEM) application was employed to compute error-free latent variable scores for CSSE in an effort to examine the ancillary effects of gender, age, ethnicity, department, degree level, hours completed, expected course grade, number of college-level math classes, and current GPA on students' CSSE scores. Results support the one-dimensional construct and, as expected, the model demonstrated a significant link between CSSE scores and prior mathematics experiences. Additionally, the students' department, expected grade, and number of prior math classes were found to have a significant effect on students' CSSE scores.
Kuang, Dan; Zhang, Wangzhen; Deng, Qifei; Zhang, Xiao; Huang, Kun; Guan, Lei; Hu, Die; Wu, Tangchun; Guo, Huan
2013-07-01
Polycyclic aromatic hydrocarbons (PAHs) are known to induce reactive oxygen species and oxidative stress, but the dose-response relationships between exposure to PAHs and oxidative stress levels have not been established. In this study, we recruited 1333 male coke oven workers, monitored the levels of environmental PAHs, and measured internal PAH exposure biomarkers including 12 urinary PAH metabolites and plasma benzo[a]pyrene-r-7,t-8,t-9,c-10-tetrahydotetrol-albumin (BPDE-Alb) adducts, as well as the two oxidative biomarkers urinary 8-hydroxydeoxyguanosine (8-OHdG) and 8-iso-prostaglandin-F2α (8-iso-PGF2α). We found that the total concentration of urinary PAH metabolites and plasma BPDE-Alb adducts were both significantly associated with increased 8-OHdG and 8-iso-PGF2α, in both smokers and nonsmokers, among these coke oven workers. PMID:23745771
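A dose-response trend of this kind is commonly screened with an ordinary least-squares fit on log-transformed biomarker levels. A minimal sketch follows; the worker values are hypothetical illustrations, not the cohort's data:

```python
def ols_slope(x, y):
    """Ordinary least-squares slope and intercept of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical workers: log total urinary PAH metabolites vs. log urinary 8-OHdG.
log_pah   = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
log_8ohdg = [0.9, 1.1, 1.4, 1.5, 1.8, 2.0]
slope, intercept = ols_slope(log_pah, log_8ohdg)  # positive slope = exposure-response trend
```

In practice the cohort analysis would add covariate adjustment (age, smoking, BMI), but the sign and magnitude of the exposure slope is the quantity of interest.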
The impact of different dose response parameters on biologically optimized IMRT in breast cancer
Costa Ferreira, Brigida; Mavroidis, Panayiotis; Adamus-Górka, Magdalena; Svensson, Roger; Lind, Bengt K.
2008-05-01
The full potential of biologically optimized radiation therapy can only be realized with the prediction of individual patient radiosensitivity prior to treatment. Unfortunately, the available biological parameters, derived from clinical trials, reflect an average radiosensitivity of the examined populations. In the present study, a breast cancer patient of stage I-II with positive lymph nodes was chosen in order to analyse the effect of the variation of individual radiosensitivity on the optimal dose distribution. Thus, deviations from the average biological parameters, describing tumour, heart and lung response, were introduced covering the range of patient radiosensitivity reported in the literature. Two treatment configurations of three and seven biologically optimized intensity-modulated beams were employed. The different dose distributions were analysed using biological and physical parameters such as the complication-free tumour control probability (P+), the biologically effective uniform dose ($\bar{\bar{D}}$), dose volume histograms, mean doses, standard deviations, maximum and minimum doses. In the three-beam plan, the difference in P+ between the optimal dose distribution (when the individual patient radiosensitivity is known) and the reference dose distribution, which is optimal for the average patient biology, ranges up to 13.9% when varying the radiosensitivity of the target volume, up to 0.9% when varying the radiosensitivity of the heart and up to 1.3% when varying the radiosensitivity of the lung. Similarly, in the seven-beam plan, the differences in P+ are up to 13.1% for the target, up to 1.6% for the heart and up to 0.9% for the left lung. When the radiosensitivity of the most important tissues in breast cancer radiation therapy was simultaneously changed, the maximum gain in outcome was as high as 7.7%. The impact of the dose-response uncertainties on the treatment outcome was clinically insignificant for the majority of the simulated patients.
Is There a Dose-Response Relationship for Heart Disease With Low-Dose Radiation Therapy?
Purpose: To quantify cardiac radiation therapy (RT) exposure using sensitive measures of cardiac dysfunction; and to correlate dysfunction with heart doses, in the setting of adjuvant RT for left-sided breast cancer. Methods and Materials: On a randomized trial, 32 women with node-positive left-sided breast cancer underwent pre-RT stress single photon emission computed tomography (SPECT-CT) myocardial perfusion scans. Patients received RT to the breast/chest wall and regional lymph nodes to doses of 50 to 52.2 Gy. Repeat SPECT-CT scans were performed 1 year after RT. Perfusion defects (PD), summed stress defects scores (SSS), and ejection fractions (EF) were evaluated. Doses to the heart and coronary arteries were quantified. Results: The mean difference in pre- and post-RT PD was −0.38% ± 3.20% (P=.68), with no clinically significant defects. To assess for subclinical effects, PD were also examined using a 1.5-SD below the normal mean threshold, with a mean difference of 2.53% ± 12.57% (P=.38). The mean differences in SSS and EF before and after RT were 0.78% ± 2.50% (P=.08) and 1.75% ± 7.29% (P=.39), respectively. The average heart Dmean and D95 were 2.82 Gy (range, 1.11-6.06 Gy) and 0.90 Gy (range, 0.13-2.17 Gy), respectively. The average Dmean and D95 to the left anterior descending artery were 7.22 Gy (range, 2.58-18.05 Gy) and 3.22 Gy (range, 1.23-6.86 Gy), respectively. No correlations were found between cardiac doses and changes in PD, SSS, and EF. Conclusions: Using sensitive measures of cardiac function, no clinically significant defects were found after RT, with the average heart Dmean <5 Gy. Although a dose response may exist for measures of cardiac dysfunction at higher doses, no correlation was found in the present study for low doses delivered to cardiac structures and perfusion, SSS, or EF
Park, Eun-Jung; Choi, Sang-Cheon; Ahn, Jung-Hwan; Min, Young-Gi
2013-01-01
The objectives of our study were to investigate the dose-response relationship of the TASER X26 discharge duration in an anesthetized swine model. Fourteen swine were anesthetized and then exposed to TASER X26 discharge for 5 sec (n = 5) or for 10 sec (n = 6). The sham control group (n = 3) was anesthetized and studied using the same protocol, except for the TASER X26 discharges during the experiments. Hemodynamic parameters were obtained. Blood pressure and total peripheral resistance decreased significantly after TASER discharge and returned to baseline values at 15 min after 5 sec of TASER discharge, but did not return to baseline values during the 30-min observation period after 10 sec of TASER discharge. Repetitive TASER X26 discharge resulted in adverse physiologic events with a dose-response relationship related to the duration of TASER X26 discharge in an anesthetized swine model. PMID:23066880
Dose Effects of Ion Beam Exposure on Deinococcus Radiodurans: Survival and Dose Response
[No author listed]
2001-01-01
To explore the survival and dose response of organisms exposed to different radiation sources is of great importance in radiobiology research. In this study, the survival-dose response of Deinococcus radiodurans (with E. coli as the control) to ultraviolet (UV) radiation, γ-ray radiation and ion beam exposure was investigated. Shoulder-type survival curves were found for both UV and γ-ray ionizing radiation, but saddle-type survival curves were observed for H+, N+ (20 keV and 30 keV) and Ar+ beam exposure: survival initially decreased with increasing dose, then increased in the high dose range, and finally decreased again in the higher dose range. Our experimental results suggest that D. radiodurans, which is considerably radio-resistant to UV, X-ray and γ-ray ionizing radiation, does not resist ion beam exposure.
Christensen, Erik R.; Kusk, Kresten Ole; Nyholm, Niels
2009-01-01
We derive equations for the effective concentration giving 10% inhibition (EC10) with 95% confidence limits for probit (log-normal), Weibull, and logistic dose-response models on the basis of experimentally derived median effective concentrations (EC50s) and the curve slope at the central point (50% inhibition). For illustration, data from closed, freshwater algal assays are analyzed using the green alga Pseudokirchneriella subcapitata with growth rate as the response parameter. Dose-response regressions for four test chemicals (tetraethylammonium bromide, musculamine, benzonitrile, and 4-[4-(trifluoromethyl)phenoxy]phenol) with ranges of representative slopes at 50% response (0.54-2.62) and EC50s (2.20-357 mg/L) were selected. Reference EC50s and EC10s with 95% confidence limits using probit or Weibull models are calculated by nonlinear regression on the whole dataset using a dose-response model.
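For the three model families named above, EC10 follows from EC50 and a shape parameter in closed form. The sketch below parameterizes each inhibition curve by a shape parameter b, which is related to, but not identical to, the slope at 50% inhibition used in the paper; it is an illustrative assumption, not the authors' exact derivation:

```python
import math
from statistics import NormalDist

def ec10(ec50, b, model="logistic"):
    """EC10 from EC50 and shape parameter b for three inhibition models.

    Inhibition I(c): logistic  1 / (1 + (EC50/c)**b)
                     Weibull   1 - exp(-ln(2) * (c/EC50)**b)
                     probit    Phi(b * log10(c/EC50))
    Each is anchored so I(EC50) = 0.5; EC10 solves I(c) = 0.10 analytically.
    """
    if model == "logistic":
        return ec50 * (1.0 / 9.0) ** (1.0 / b)
    if model == "weibull":
        return ec50 * (math.log(10.0 / 9.0) / math.log(2.0)) ** (1.0 / b)
    if model == "probit":
        return ec50 * 10.0 ** (NormalDist().inv_cdf(0.10) / b)
    raise ValueError(model)
```

Note how strongly EC10 depends on the model family and the shape parameter: for a shallow slope the probit and Weibull forms can give EC10s differing by several-fold, which is why the paper reports model-specific confidence limits.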
Shinde, S.H. [Radiation Safety Systems Division, Bhabha Atomic Research Centre, Mumbai 400 085 (India)]. E-mail: shs_barc@yahoo.com; Mukherjee, T. [Radiation Safety Systems Division, Chemistry Group, Bhabha Atomic Research Centre, Mumbai 400 085 (India)
2007-02-15
Aspartame tablets were studied for gamma dose response, using a spectrophotometric read-out method. The optimum concentration of ferrous ions was 2×10^-4 mol dm^-3, with xylenol orange and 2.5×10^-1 mol dm^-3 of sulphuric acid for the optimum acidity of the FX solution. The wavelength of maximum absorbance is 548 nm. Post-irradiation stability is appreciable, i.e. not less than one month. The dose response is non-linear, with a third-order polynomial fit, in the dose range of 1000-10000 Gy. This aspartame system was further used for relative percentage dose profile measurement in a Gamma Cell-220. The results were inter-compared with those of a glutamine dosimeter, showing that the maximum difference between the values of the aspartame and glutamine systems is within ±10%.
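A third-order polynomial calibration of the kind described can be sketched as below. The dose and absorbance numbers are hypothetical illustrations, not the paper's measurements, and the inversion by bisection assumes the fitted response is monotone over the calibrated range:

```python
import numpy as np

# Hypothetical calibration: delivered dose (kGy) vs. optical absorbance at 548 nm.
doses_kgy  = np.array([1.0, 2.5, 5.0, 7.5, 10.0])
absorbance = np.array([0.12, 0.27, 0.48, 0.63, 0.72])

# Third-order polynomial calibration, as in the abstract's non-linear dose response.
coeffs = np.polyfit(doses_kgy, absorbance, deg=3)

def dose_to_absorbance(d_kgy):
    return np.polyval(coeffs, d_kgy)

def absorbance_to_dose(a, lo=1.0, hi=10.0):
    """Invert the calibration by bisection over the calibrated dose range."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if dose_to_absorbance(mid) < a:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A round trip dose → absorbance → dose recovers the delivered dose to well within the ±10% agreement the abstract reports against the glutamine dosimeter.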
Best, R; Harrell, A; Geesey, C; Libby, B; Wijesooriya, K [University of Virginia, Charlottesville, VA (United States)
2014-06-15
Purpose: The purpose of this study is to inter-compare and find statistically significant differences between flattened field fixed-beam (FB) IMRT and flattening-filter free (FFF) volumetric modulated arc therapy (VMAT) for stereotactic body radiation therapy (SBRT). Methods: SBRT plans using FB IMRT and FFF VMAT were generated for fifteen SBRT lung patients using 6 MV beams. For each patient, both IMRT and VMAT plans were created for comparison. Plans were generated utilizing RTOG 0915 (peripheral, 10 patients) and RTOG 0813 (medial, 5 patients) lung protocols. Target dose, critical structure dose, and treatment time were compared and tested for statistical significance. Parameters of interest included prescription isodose surface coverage, target dose heterogeneity, high dose spillage (location and volume), low dose spillage (location and volume), lung dose spillage, and critical structure maximum- and volumetric-dose limits. Results: For all criteria, we found equivalent or higher conformality with VMAT plans as well as reduced critical structure doses. Several differences passed a Student's t-test of significance: VMAT reduced the high dose spillage, evaluated with conformality index (CI), by an average of 9.4%±15.1% (p=0.030) compared to IMRT. VMAT plans reduced the lung volume receiving 20 Gy by 16.2%±15.0% (p=0.016) compared with IMRT. For the RTOG 0915 peripheral lesions, the volumes of lung receiving 12.4 Gy and 11.6 Gy were reduced by 27.0%±13.8% and 27.5%±12.6% (for both, p<0.001) in VMAT plans. Of the 26 protocol pass/fail criteria, VMAT plans were able to achieve an average of 0.2±0.7 (p=0.026) more constraints than the IMRT plans. Conclusions: FFF VMAT has dosimetric advantages over fixed beam IMRT for lung SBRT. Significant advantages included increased dose conformity and reduced organs-at-risk doses. The overall improvements in terms of protocol pass/fail criteria were more modest and will require more patient data to establish differences.
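The plan comparisons above rest on paired Student's t-tests across the 15 patients. A minimal sketch of the paired statistic is below; the per-patient V20 values are hypothetical illustrations, not the study's data:

```python
import math

def paired_t(x, y):
    """Paired Student's t statistic for per-patient metric differences
    (e.g., lung V20 under IMRT vs. VMAT). Returns (t, degrees of freedom)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)   # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Hypothetical per-patient V20 (% of lung receiving 20 Gy) for 15 patients.
imrt = [8.1, 6.4, 9.2, 7.7, 5.9, 10.3, 8.8, 7.2, 6.6, 9.5, 8.0, 7.4, 6.1, 9.9, 8.5]
vmat = [6.8, 5.5, 7.9, 6.5, 5.1, 8.6, 7.5, 6.0, 5.7, 8.1, 6.9, 6.3, 5.2, 8.4, 7.2]
t, df = paired_t(imrt, vmat)
# Two-sided critical value for df = 14 at alpha = 0.05 is about 2.145 (tabulated).
significant = abs(t) > 2.145
```

The pairing matters: each patient serves as their own control, so the test compares within-patient differences rather than the two plan populations, which is exactly the design used when both plans are generated for every patient.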
Marina Yepifanova
2012-01-01
It is theoretically argued and experimentally demonstrated that continuity in forming a socially significant hierarchy of learning motives in senior pupils can be maintained by means of multimedia technology in physics instruction. The pedagogical category of continuity, as one of the important didactic principles, is discussed. On the basis of a wide pedagogical experiment covering a number of educational institutions, the efficiency of multimedia technology in forming a socially significant hierarchy of learning motives in schoolchildren is shown. The possibility of introducing modern statistical estimation measures into pedagogical research is discussed, and their efficiency with reference to the problem of continuity in forming the learning motives of senior pupils is shown.
Dose-response curves for DNA neutral (pH 9.6) filter elution were obtained with synchronized CHO cells exposed to X-rays at various phases of the cell cycle. The dose response was similar in synchronized and plateau-phase G1 cells, as well as in cells arrested at the G1/S border using aphidicolin; it flattened as cells progressed into S phase and reached a minimum in the middle of this phase. An increase in DNA elution dose response, to values only slightly lower than those obtained with G1 cells, was observed as cells entered G2 phase. Significant alterations in the sedimentation properties of the DNA during S phase were also observed in Ehrlich ascites tumor cells using the neutral sucrose gradient centrifugation technique. A significant proportion of the DNA from S cells irradiated with 10 Gy sedimented at speeds (350S-700S) well above the maximum sedimentation speed expected for free sedimenting DNA molecules (s(max) = 350S), indicating the formation of a DNA complex. DNA from G1, G1/S, or G2 + M cells sedimented as expected for free sedimenting molecules. (author)
Dose response of hydrazine-deproteinated tooth enamel under blue light stimulation
Yuece, Ulkue Rabia, E-mail: ulkuyuce@hotmail.co [Ankara University, Faculty of Engineering, Department of Engineering Physics, 06100, Tandogan - Ankara (Turkey); Meric, Niyazi, E-mail: meric@ankara.edu.t [Ankara University, Faculty of Engineering, Department of Engineering Physics, 06100, Tandogan - Ankara (Turkey); Atakol, Orhan, E-mail: atakol@science.ankara.edu.t [Ankara University, Science Faculty, Department of Chemistry, 06100, Tandogan - Ankara (Turkey); Yasar, Fusun, E-mail: ab121310@adalet.gov.t [Council of Forensic Medicine, Ankara Branch, Ankara (Turkey)
2010-08-15
The beta dose response and Optically Stimulated Luminescence (OSL) signal stability characteristics of human tooth enamel deproteinated by hydrazine reagent under blue photon stimulation are reported. Removal of the protein organic component of tooth enamel resulted in a higher OSL sensitivity and slower fading of OSL signals. The effect of chemical sample preparation on the enamel sample sensitivity is discussed and further steps to make this deproteinization treatment suitable for in vitro dose reconstruction studies are suggested.