Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M
2008-11-07
Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of
Scanning Tunneling Microscopy - image interpretation
International Nuclear Information System (INIS)
Maca, F.
1998-01-01
The basic ideas of image interpretation in Scanning Tunneling Microscopy are presented using simple quantum-mechanical models, together with examples of successful application. The importance of correctly interpreting the results of this remarkable experimental surface technique is stressed.
Equivalent statistics and data interpretation.
Francis, Gregory
2017-08-01
Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
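The equivalence claimed for the two-sample t test can be checked numerically: with n1 and n2 known, Cohen's d (pooled-SD definition) is a deterministic function of the t statistic, d = t·sqrt(1/n1 + 1/n2), so either quantity carries the same information about the data. A minimal sketch with illustrative data (not taken from the paper):

```python
import math

def cohens_d_from_t(t, n1, n2):
    # With sample sizes known, d is a deterministic function of t
    return t * math.sqrt(1 / n1 + 1 / n2)

def two_sample_t(x, y):
    # Classical pooled-variance two-sample t statistic and Cohen's d
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    v1 = sum((a - m1) ** 2 for a in x) / (n1 - 1)
    v2 = sum((b - m2) ** 2 for b in y) / (n2 - 1)
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)  # pooled variance
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    d = (m1 - m2) / math.sqrt(sp2)                          # Cohen's d directly
    return t, d

x = [5.1, 4.9, 6.0, 5.5, 5.8]
y = [4.2, 4.8, 4.4, 5.0, 4.1]
t, d_direct = two_sample_t(x, y)
d_from_t = cohens_d_from_t(t, len(x), len(y))
assert abs(d_direct - d_from_t) < 1e-12  # same information, two labels
```

The same invertibility argument extends (with more algebra) to the confidence interval of d and to the JZS Bayes factor, which is the paper's point: the statistics differ in interpretation, not information.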
Statistical interpretation of geochemical data
International Nuclear Information System (INIS)
Carambula, M.
1990-01-01
Statistical results were obtained from a geochemical survey of the following four aerial-photograph sheets: Zapican, Carape, Las Canias, Alferez. In total, 3020 samples were analysed for 22 chemical elements using plasma emission spectrometry.
Spatial scan statistics using elliptic windows
DEFF Research Database (Denmark)
Christiansen, Lasse Engbo; Andersen, Jens Strodl; Wegener, Henrik Caspar
The spatial scan statistic is widely used to search for clusters in epidemiologic data. This paper shows that the usually applied elimination of secondary clusters, as implemented in SaTScan, is sensitive to smooth changes in the shape of the clusters. We present an algorithm for generation of set...
Spatial scan statistics using elliptic windows
DEFF Research Database (Denmark)
Christiansen, Lasse Engbo; Andersen, Jens Strodl; Wegener, Henrik Caspar
2006-01-01
The spatial scan statistic is widely used to search for clusters. This article shows that the usually applied elimination of secondary clusters, as implemented in SaTScan, is sensitive to smooth changes in the shape of the clusters. We present an algorithm for generation of a set of confocal elliptic...
Statistics and Data Interpretation for Social Work
Rosenthal, James
2011-01-01
"Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down-to-earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication." -Praise for the First Edition. Written by a social worker for social work students, this is a nuts-and-bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes
Data-driven inference for the spatial scan statistic
Directory of Open Access Journals (Sweden)
Duczmal Luiz H
2011-08-01
Background: Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results: A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is given, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, as it bears on the correctness of the decision based on that inference. Conclusions: A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
Data-driven inference for the spatial scan statistic.
Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C
2011-08-02
Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is given, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, as it bears on the correctness of the decision based on that inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
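The Monte Carlo inference that the authors modify, and the size-k conditioning they propose, can be sketched in one dimension, with contiguous runs of areas standing in for circular zones. This is an illustrative toy (synthetic data, not the paper's implementation): each null replicate records both its maximum likelihood ratio and the size of its most likely cluster, so the usual p-value and a size-conditioned p-value can be compared.

```python
import math, random

def llr(c, n, C, N):
    # Kulldorff's Poisson log-likelihood ratio for a window holding c of the
    # C cases and n of the N persons; 0 unless the window rate is elevated
    if c == 0 or c / n <= (C - c) / (N - n):
        return 0.0
    out = C - c
    if out == 0:  # window captured every case
        return c * math.log(c / n) - C * math.log(C / N)
    return (c * math.log(c / n) + out * math.log(out / (N - n))
            - C * math.log(C / N))

def scan(cases, pop):
    # contiguous runs of areas stand in for SaTScan's circular windows
    C, N = sum(cases), sum(pop)
    best, best_k = 0.0, 0
    for i in range(len(cases)):
        c = n = 0
        for j in range(i, len(cases)):
            c += cases[j]
            n += pop[j]
            if n == N:          # skip the window covering the whole map
                break
            s = llr(c, n, C, N)
            if s > best:
                best, best_k = s, j - i + 1
    return best, best_k

def null_cases(C, pop):
    # redistribute all C cases proportionally to population (the null)
    out = [0] * len(pop)
    for a in random.choices(range(len(pop)), weights=pop, k=C):
        out[a] += 1
    return out

random.seed(1)
pop = [100] * 20
cases = [random.randint(0, 3) for _ in range(20)]
cases[7] += 10                  # planted cluster in area 7
obs, k = scan(cases, pop)

C = sum(cases)
null = [scan(null_cases(C, pop), pop) for _ in range(199)]
p_usual = (1 + sum(s >= obs for s, _ in null)) / 200
# the modified question: compare only against null replicates whose most
# likely cluster also has size k
same_k = [s for s, kk in null if kk == k]
p_cond = (1 + sum(s >= obs for s in same_k)) / (1 + len(same_k))
```

The paper's point is that `p_usual` pools null replicates of all cluster sizes, which penalizes sizes unevenly; conditioning on k, as in `p_cond`, is a rough stand-in for the data-driven correction it develops.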
International Nuclear Information System (INIS)
Tadaki, Kohtaro
2010-01-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced thermodynamic quantities such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T) into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where partial randomness is a stronger representation of the compression rate in terms of program-size complexity. Furthermore, we showed that the same holds for the temperature T itself, one of the most typical thermodynamic quantities: for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T in (0,1) to have partial randomness equal to T. In this paper, based on a physical argument at the same level of mathematical rigor as conventional statistical mechanics in physics, we develop a complete statistical mechanical interpretation of AIT that realizes a perfect correspondence with conventional statistical mechanics. We do this by identifying a microcanonical ensemble within the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
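For orientation, the thermodynamic quantities named above are defined by analogy with the Boltzmann weights of statistical mechanics, with program length |p| playing the role of energy. The forms below are a sketch of the standard definitions from the cited works (base-2 logarithms, Boltzmann constant set to 1, U the optimal prefix-free machine), shown here for the reader's convenience rather than quoted from the paper:

```latex
Z(T)=\sum_{p\in\operatorname{dom}U}2^{-|p|/T},\qquad
F(T)=-T\log_2 Z(T),
```
```latex
E(T)=\frac{1}{Z(T)}\sum_{p\in\operatorname{dom}U}|p|\,2^{-|p|/T},\qquad
S(T)=\frac{E(T)-F(T)}{T}.
```

At T = 1 the partition function reduces to Chaitin's halting probability Omega, which is why the temperature range (0,1) and partial randomness enter the picture.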
Skeletal blood flow: implications for bone-scan interpretation
International Nuclear Information System (INIS)
Charkes, N.D.
1980-01-01
The dispersion of the skeleton throughout the body and its complex vascular anatomy require indirect methods for the measurement of skeletal blood flow. The results of one such method, compartmental analysis of skeletal tracer kinetics, are presented. The assumptions underlying the models were tested in animals and found to be in agreement with experimental observations. Based upon the models and the experimental results, inferences concerning bone-scan interpretation can be drawn: decreased cardiac output produces low-contrast (technically poor) scans; decreased skeletal flow produces photon-deficient lesions; increase of cardiac output or of generalized systemic blood flow is undetectable 1 to 2 h after dose; increased local skeletal blood flow results from disturbance of the bone microvasculature and can occur from neurologic (sympatholytic) disorders or in association with focal abnormalities that also incite the formation of reactive bone (e.g., metastasis, fracture, etc.). Mathematical solutions of tracer kinetic data thus become relevant to bone-scan interpretation
Interpretation of commonly used statistical regression models.
Kasza, Jessica; Wolfe, Rory
2014-01-01
A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
The Statistical Interpretation of Entropy: An Activity
Timmberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
A nonparametric spatial scan statistic for continuous data.
Jung, Inkyung; Cho, Ho Jin
2015-10-20
Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compared the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
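The rank-based construction can be sketched in one dimension: replace the continuous observations by their ranks, slide windows over the areas, and take the maximum standardized Wilcoxon rank-sum as the scan statistic, with permutation of area labels as the nonparametric null. A toy sketch with synthetic skewed data (one observation per area; not the authors' implementation):

```python
import math, random

def wilcoxon_scan(values):
    # one continuous observation per area; contiguous windows stand in for
    # spatial zones; statistic = max standardized Wilcoxon rank sum
    N = len(values)
    order = sorted(range(N), key=lambda i: values[i])
    rank = [0.0] * N
    for r, i in enumerate(order, start=1):
        rank[i] = r                      # assumes no ties, for brevity
    best = 0.0
    for i in range(N):
        w = 0.0
        for j in range(i, N):
            w += rank[j]
            n_in = j - i + 1
            if n_in == N:
                break
            mu = n_in * (N + 1) / 2
            var = n_in * (N - n_in) * (N + 1) / 12
            best = max(best, (w - mu) / math.sqrt(var))  # one-sided: high values
    return best

random.seed(7)
vals = [random.lognormvariate(0, 1) for _ in range(30)]  # skewed baseline
for i in range(12, 17):
    vals[i] *= 8                         # planted cluster of elevated values

obs = wilcoxon_scan(vals)
# permutation inference: shuffling area labels is the nonparametric null
perm = []
for _ in range(99):
    random.shuffle(vals)
    perm.append(wilcoxon_scan(vals))
p = (1 + sum(s >= obs for s in perm)) / 100
```

Because only ranks enter the statistic, the same code behaves identically under any monotone transformation of the data, which is the source of the robustness to skewed and heavy-tailed distributions reported in the abstract.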
Huffman and linear scanning methods with statistical language models.
Roark, Brian; Fried-Oken, Melanie; Gibbons, Chris
2015-03-01
Current scanning access methods for text generation in AAC devices are limited to relatively few options, most notably row/column variations within a matrix. We present Huffman scanning, a new method for applying statistical language models to binary-switch, static-grid typing AAC interfaces, and compare it to other scanning options under a variety of conditions. We present results for 16 adults without disabilities and one 36-year-old man with locked-in syndrome who presents with complex communication needs and uses AAC scanning devices for writing. Huffman scanning with a statistical language model yielded significant typing speedups for the 16 participants without disabilities versus any of the other methods tested, including two row/column scanning methods. A similar pattern of results was found with the individual with locked-in syndrome. Interestingly, faster typing speeds were obtained with Huffman scanning using a more leisurely scan rate than relatively fast individually calibrated scan rates. Overall, the results reported here demonstrate great promise for the usability of Huffman scanning as a faster alternative to row/column scanning.
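The speedup described above comes from assigning short binary-switch decision sequences to likely symbols, which is Huffman coding over the language model's next-symbol distribution. A toy sketch with made-up unigram probabilities (illustrative only; the paper uses full statistical language models, not unigrams):

```python
import heapq

def huffman_depths(probs):
    # Build a Huffman tree over the symbol distribution; the depth of each
    # symbol is the number of binary-switch decisions needed to select it.
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in probs}
    n = len(heap)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)   # two least likely groups
        p2, _, g2 = heapq.heappop(heap)
        for s in g1 + g2:
            depth[s] += 1                 # one more decision for all members
        n += 1
        heapq.heappush(heap, (p1 + p2, n, g1 + g2))
    return depth

# toy unigram model (illustrative probabilities, not from the paper)
probs = {"e": 0.4, "t": 0.2, "a": 0.15, "o": 0.15, "_": 0.1}
depth = huffman_depths(probs)
expected = sum(probs[s] * depth[s] for s in probs)  # decisions per symbol
```

For these five symbols a fixed-length binary scan needs 3 decisions per symbol, while the Huffman assignment averages 2.2, and the gap grows as the distribution becomes more peaked, which is the mechanism behind the typing speedups reported.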
Combinatorial interpretation of Haldane-Wu fractional exclusion statistics.
Aringazin, A K; Mazhitov, M I
2002-08-01
Assuming that the maximal allowed number of identical particles in a state is an integer parameter q, we derive the statistical weight and analyze the associated equation that defines the statistical distribution. The derived distribution covers the Fermi-Dirac and Bose-Einstein distributions in the particular cases q = 1 and q → ∞ (n_i/q → 1), respectively. We show that the derived statistical weight provides a natural combinatorial interpretation of Haldane-Wu fractional exclusion statistics, and present exact solutions of the distribution equation.
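The two limiting cases quoted above can be checked against the Gentile-type occupation number for maximal occupancy q, a standard form consistent with the abstract (the paper itself derives it from the statistical weight):

```latex
\bar n_i=\frac{1}{e^{x_i}-1}-\frac{q+1}{e^{(q+1)x_i}-1},\qquad
x_i=\frac{\varepsilon_i-\mu}{k_BT}.
```

At q = 1 this reduces algebraically to the Fermi-Dirac form 1/(e^{x_i}+1), and as q → ∞ (with x_i > 0) the second term vanishes, leaving the Bose-Einstein form 1/(e^{x_i}-1).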
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Advanced statistics to improve the physical interpretation of atomization processes
International Nuclear Information System (INIS)
Panão, Miguel R.O.; Radu, Lucian
2013-01-01
Highlights: • Finite pdf mixtures improve the physical interpretation of sprays. • A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. • The statistical method identifies multiple droplet clusters in a spray. • Multiple drop clusters are eventually associated with multiple atomization mechanisms. • The spray is described by its drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account possible multimodality and heterogeneity in drop size distributions. It therefore provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in heterogeneous data, improving the physical interpretation of atomization processes. Moreover, it overcomes the limitations of analyzing spray droplet characteristics through moments alone, in particular the masking of distinct droplet-formation mechanisms. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
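The idea of resolving droplet subgroups with a finite mixture can be sketched with a two-component Gaussian mixture fitted by EM, a deliberately simpler stand-in for the paper's Bayesian/MCMC fit over mixture families (synthetic bimodal "drop size" data, illustrative component parameters):

```python
import math, random, statistics

def normal_pdf(v, mu, sd):
    return math.exp(-0.5 * ((v - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_normals(x, iters=100):
    # EM for a two-component Gaussian mixture; each component plays the role
    # of one droplet subgroup (one atomization mechanism)
    w, mu1, mu2 = 0.5, min(x), max(x)
    s1 = s2 = statistics.stdev(x)
    for _ in range(iters):
        # E-step: responsibility of component 1 for each droplet diameter
        r = [w * normal_pdf(v, mu1, s1)
             / (w * normal_pdf(v, mu1, s1) + (1 - w) * normal_pdf(v, mu2, s2))
             for v in x]
        # M-step: re-estimate weight, means, and spreads
        n1 = sum(r)
        n2 = len(x) - n1
        w = n1 / len(x)
        mu1 = sum(ri * v for ri, v in zip(r, x)) / n1
        mu2 = sum((1 - ri) * v for ri, v in zip(r, x)) / n2
        s1 = math.sqrt(sum(ri * (v - mu1) ** 2 for ri, v in zip(r, x)) / n1)
        s2 = math.sqrt(sum((1 - ri) * (v - mu2) ** 2 for ri, v in zip(r, x)) / n2)
    return w, (mu1, s1), (mu2, s2)

random.seed(5)
# synthetic bimodal drop sizes: two atomization mechanisms, microns
x = ([random.gauss(20, 3) for _ in range(300)]
     + [random.gauss(60, 8) for _ in range(200)])
w, c1, c2 = em_two_normals(x)
```

A moment-based summary of `x` (mean ≈ 36) would hide both subgroups; the fitted mixture recovers the two modes near 20 and 60, which is the interpretive gain the abstract describes.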
A critical look at prospective surveillance using a scan statistic.
Correa, Thais R; Assunção, Renato M; Costa, Marcelo A
2015-03-30
The scan statistic is a very popular surveillance technique for purely spatial, purely temporal, and spatial-temporal disease data. It was extended to the prospective surveillance case, and it has been applied quite extensively in this situation. When the usual signal rules, as those implemented in SaTScan(TM) (Boston, MA, USA) software, are used, we show that the scan statistic method is not appropriate for the prospective case. The reason is that it does not adjust properly for the sequential and repeated tests carried out during the surveillance. We demonstrate that the nominal significance level α is not meaningful and there is no relationship between α and the recurrence interval or the average run length (ARL). In some cases, the ARL may be equal to ∞, which makes the method ineffective. This lack of control of the type-I error probability and of the ARL leads us to strongly oppose the use of the scan statistic with the usual signal rules in the prospective context. Copyright © 2014 John Wiley & Sons, Ltd.
Difficulties in the interpretation of focal changes on liver scans
Energy Technology Data Exchange (ETDEWEB)
Derimanov, S.G. (Oblastnoj Onkologicheskij Dispanser, Veliko-Tyrnovo (Bulgaria))
1983-01-01
A series of cases is presented in which liver scans led to erroneous conclusions. Analysis of these errors has shown that they stem from the similarity of the scan appearance of focal changes in various diseases of the liver (cancer, cirrhosis, echinococcus, etc.) and of the adjacent organs (bronchogenic cyst, renal tumor, etc.). Consequently, attempts to determine the etiology of a pathological process from liver scans alone, without correlation with the results of a comprehensive examination, frequently result in errors.
Normal variants and artifacts in bone scan: potential for errors in interpretation
International Nuclear Information System (INIS)
Sohn, Myung Hee
2004-01-01
Bone scan is one of the most frequently performed studies in nuclear medicine. In bone scan, the amount of radioisotope taken up by a lesion depends primarily on the local rate of bone turnover rather than on the bone mass. Bone scan is extremely sensitive for detecting bony abnormalities, but abnormalities that appear on bone scan may not always represent disease. The normal scan appearance may be affected not only by skeletal physiology and anatomy but also by a variety of technical factors that influence image quality. Many normal variants and artifacts may appear on bone scan; they can simulate a pathologic process and mislead the reader into a wrong diagnostic interpretation. Their recognition is therefore necessary to avoid misdiagnosis, and a nuclear medicine physician should be aware of the variable appearance of normal variants and artifacts on bone scan. In this article, a variety of normal variants and artifacts mimicking real pathologic lesions in bone scan interpretation are discussed and illustrated.
Statistical transformation and the interpretation of inpatient glucose control data.
Saulnier, George E; Castro, Janna C; Cook, Curtiss B
2014-03-01
This study introduces a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. The glucose data distribution was examined before and after Box-Cox transformation and compared with normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine whether out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
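The transform-then-chart pipeline can be sketched as follows (synthetic lognormal "glucose" values with a late upward shift; for brevity the control limits are estimated from the same series, whereas a real chart would calibrate them on an in-control reference period):

```python
import math, random, statistics

def boxcox(x, lam):
    # Box-Cox power transform; lam = 0 is the log transform. In practice lam
    # is chosen by profile likelihood rather than fixed in advance.
    return [math.log(v) if lam == 0 else (v ** lam - 1) / lam for v in x]

def ewma_signal(x, weight=0.2, L=3.0):
    # EWMA control chart: returns the index of the first out-of-control
    # sample, or None if the series stays within the time-varying limits
    mu, sd = statistics.mean(x), statistics.stdev(x)
    z = mu
    for t, v in enumerate(x, start=1):
        z = weight * v + (1 - weight) * z
        half = L * sd * math.sqrt(weight / (2 - weight)
                                  * (1 - (1 - weight) ** (2 * t)))
        if abs(z - mu) > half:
            return t
    return None

random.seed(3)
glucose = [random.lognormvariate(math.log(140), 0.25) for _ in range(150)]
glucose += [random.lognormvariate(math.log(190), 0.25) for _ in range(30)]  # shift

raw = ewma_signal(glucose)                       # chart on skewed raw values
transformed = ewma_signal(boxcox(glucose, 0.0))  # chart after log transform
```

Because the EWMA limits assume roughly normal inputs, the skewed raw series and the near-normal transformed series can flag out-of-control states at different samples, which is exactly the discrepancy the study documents.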
A spatial scan statistic for compound Poisson data.
Rosychuk, Rhonda J; Chang, Hsing-Ming
2013-12-20
The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.
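The distributional building block can be simulated directly: the number of individuals with events in a region is Poisson, each individual contributes a random number of repeat visits, and the regional total S is therefore compound Poisson with E[S] = λ·E[X]. A sketch with illustrative parameters (the paper builds a scan likelihood on this model rather than simulating it):

```python
import math, random, statistics

def poisson(lam):
    # Knuth's inversion method (adequate for small lam)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def regional_events(lam, mu):
    # N ~ Poisson(lam) individuals; each makes X = 1 + Poisson(mu)
    # disease-related visits (every counted individual has at least one)
    return sum(1 + poisson(mu) for _ in range(poisson(lam)))

random.seed(11)
lam, mu = 3.0, 1.5
totals = [regional_events(lam, mu) for _ in range(20000)]
mean_total = statistics.fmean(totals)
expected = lam * (1 + mu)   # compound Poisson mean: E[S] = lam * E[X]
```

Treating `totals` as plain Poisson counts would misstate the variance (compound Poisson is overdispersed), which is why a scan test for repeated events needs the compound model rather than the usual Poisson scan statistic.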
Identifying clusters of active transportation using spatial scan statistics.
Huang, Lan; Stinchcomb, David G; Pickle, Linda W; Dill, Jennifer; Berrigan, David
2009-08-01
There is an intense interest in the possibility that neighborhood characteristics influence active transportation such as walking or biking. The purpose of this paper is to illustrate how a spatial cluster identification method can evaluate the geographic variation of active transportation and identify neighborhoods with unusually high/low levels of active transportation. Self-reported walking/biking prevalence, demographic characteristics, street connectivity variables, and neighborhood socioeconomic data were collected from respondents to the 2001 California Health Interview Survey (CHIS; N=10,688) in Los Angeles County (LAC) and San Diego County (SDC). Spatial scan statistics were used to identify clusters of high or low prevalence (with and without age-adjustment) and the quantity of time spent walking and biking. The data, a subset from the 2001 CHIS, were analyzed in 2007-2008. Geographic clusters of significantly high or low prevalence of walking and biking were detected in LAC and SDC. Structural variables such as street connectivity and shorter block lengths are consistently associated with higher levels of active transportation, but associations between active transportation and socioeconomic variables at the individual and neighborhood levels are mixed. Only one cluster with less time spent walking and biking among walkers/bikers was detected in LAC, and this was of borderline significance. Age-adjustment affects the clustering pattern of walking/biking prevalence in LAC, but not in SDC. The use of spatial scan statistics to identify significant clustering of health behaviors such as active transportation adds to the more traditional regression analysis that examines associations between behavior and environmental factors by identifying specific geographic areas with unusual levels of the behavior independent of predefined administrative units.
Brazilian Amazonia Deforestation Detection Using Spatio-Temporal Scan Statistics
Vieira, C. A. O.; Santos, N. T.; Carneiro, A. P. S.; Balieiro, A. A. S.
2012-07-01
Spatio-temporal models developed for the analysis of diseases can also be used in other fields of study, including forest monitoring and deforestation. The aim of this paper is to quantitatively identify priority areas for combating deforestation in the Amazon forest, using the space-time scan statistic. The study area lies in the south of Amazonas State and covers around 297,183 square kilometres, including the municipalities of Boca do Acre, Labrea, Canutama, Humaita, Manicore, Novo Aripuana and Apui in the north region of Brazil. This area has shown significant land-cover change, reflected in an increasing number of deforestation alerts. The situation is therefore a concern and warrants further investigation aimed at curbing the factors that increase the number of cases in the area. The methodology uses the location and year in which each deforestation alert occurred. These alerts are mapped by DETER (the Detection System of Deforestation in Real Time in Amazonia), which is operated by the Brazilian National Institute for Space Research (INPE). The SaTScan™ v7.0 software was used to run the space-time permutation scan statistic for detection of deforestation cases. The outcome of this experiment is an efficient model for detecting space-time clusters of deforestation alerts. The model efficiently detected the location, size, order and characteristics of the clustering activity by the end of the experiments. Two clusters were considered active and remained active until the end of the study; these clusters are located in Canutama and Lábrea County. This quantitative spatial modelling of deforestation warnings allowed: firstly, identifying active clusters of deforestation, on which environmental government officials can concentrate their actions; secondly, identifying historic clusters of deforestation, which environmental government officials can monitor to keep them from becoming active again; and finally
BRAZILIAN AMAZONIA DEFORESTATION DETECTION USING SPATIO-TEMPORAL SCAN STATISTICS
Directory of Open Access Journals (Sweden)
C. A. O. Vieira
2012-07-01
Spatio-temporal models developed for the analysis of diseases can also be used in other fields of study, including forest monitoring and deforestation. The aim of this paper is to quantitatively identify priority areas for combating deforestation in the Amazon forest, using the space-time scan statistic. The study area lies in the south of Amazonas State and covers around 297,183 square kilometres, including the municipalities of Boca do Acre, Labrea, Canutama, Humaita, Manicore, Novo Aripuana and Apui in the north region of Brazil. This area has shown significant land-cover change, reflected in an increasing number of deforestation alerts. The situation is therefore a concern and warrants further investigation aimed at curbing the factors that increase the number of cases in the area. The methodology uses the location and year in which each deforestation alert occurred. These alerts are mapped by DETER (the Detection System of Deforestation in Real Time in Amazonia), which is operated by the Brazilian National Institute for Space Research (INPE). The SaTScan™ v7.0 software was used to run the space-time permutation scan statistic for detection of deforestation cases. The outcome of this experiment is an efficient model for detecting space-time clusters of deforestation alerts. The model efficiently detected the location, size, order and characteristics of the clustering activity by the end of the experiments. Two clusters were considered active and remained active until the end of the study; these clusters are located in Canutama and Lábrea County. This quantitative spatial modelling of deforestation warnings allowed: firstly, identifying active clusters of deforestation, on which environmental government officials can concentrate their actions; secondly, identifying historic clusters of deforestation, which environmental government officials can monitor to keep them from becoming
A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data
Directory of Open Access Journals (Sweden)
Scherer Stephen W
2011-05-01
Background: Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlap greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. Results: We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. Conclusions: The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
Workplace Statistical Literacy for Teachers: Interpreting Box Plots
Pierce, Robyn; Chick, Helen
2013-01-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…
DEFF Research Database (Denmark)
Ljungberg, Thomas; Scott, D; Kristiansen, Søren Munch
Digital elevation models derived from high-precision Airborne Laser Scanning (ALS or LiDAR) point clouds are becoming increasingly available throughout the world. These elevation models present a very valuable tool for locating and interpreting geomorphological as well as archaeological features...
Simulators of tray distillation columns as tools for interpreting gamma-ray scan profile signal
International Nuclear Information System (INIS)
Offei-Mensah, P.S.; Gbadago, J.K.; Dagadu, C.P.K.; Danso, K.A.
2008-01-01
Simulators of tray distillation columns were used to provide technical guidelines for interpreting signals from gamma-ray scans used for analysing malfunctions in distillation columns. The transmitted radiation intensities at 0.05 m intervals were determined from top to bottom of simulators of tray distillation columns exposed to 20 mCi of {sup 137}Cs. Signals generated from the simulators were identical to the experimental signals obtained from the stabilizer column of the crude oil distillation unit at the Tema Oil Refinery Ghana Limited. Changes in the signal level were observed with changes in diameter, type of material (gasoline, air, debris, steel) and orientation of the scan line. The analysis provided accurate interpretation of gamma scan profiles. (au)
Interpreting Statistical Findings A Guide For Health Professionals And Students
Walker, Jan
2010-01-01
This book is aimed at those studying and working in the field of health care, including nurses and the professions allied to medicine, who have little prior knowledge of statistics but for whom critical review of research is an essential skill.
Variation in reaction norms: Statistical considerations and biological interpretation.
Morrissey, Michael B; Liefting, Maartje
2016-09-01
Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
Theoretical, analytical, and statistical interpretation of environmental data
International Nuclear Information System (INIS)
Lombard, S.M.
1974-01-01
The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)
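The propagation-of-errors principle the abstract invokes can be sketched as a quadrature combination of Poisson counting error with independent relative systematic terms. The input values and term names below are hypothetical, not taken from the report:

```python
import math

def net_activity_uncertainty(gross_counts, bkg_counts, rel_sys_terms):
    """Relative uncertainty of a net count: Poisson counting error on the
    gross and background counts, combined in quadrature with independent
    relative systematic terms (e.g. chemical yield, detector calibration,
    soil-aliquot inhomogeneity)."""
    net = gross_counts - bkg_counts
    # single measurements: var(gross) = gross, var(bkg) = bkg
    counting_rel = math.sqrt(gross_counts + bkg_counts) / net
    total_rel = math.sqrt(counting_rel ** 2 +
                          sum(r ** 2 for r in rel_sys_terms))
    return counting_rel, total_rel
```

The point of the abstract is visible in the arithmetic: the systematic terms enter on an equal footing with counting statistics, so quoting the counting error alone understates the true uncertainty.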
A statistical model for interpreting computerized dynamic posturography data
Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.
2002-01-01
Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
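The latent-variable model described above can be illustrated with a small simulation: a continuous latent equilibrium score always exists, and a fall, recorded as ES = 0, occurs with a probability that rises as the latent score drops. The logistic fall-probability form and every parameter value below are assumptions for illustration, not the authors' fitted model:

```python
import math
import random

def simulate_trials(n, mu, sigma, fall_steepness=0.15, seed=1):
    """Simulate equilibrium scores under a latent-variable model:
    draw a latent score, decide a fall conditionally on it, and record
    either 0.0 (fall) or the latent score clipped to [0, 100]."""
    random.seed(seed)
    observed = []
    for _ in range(n):
        latent = random.gauss(mu, sigma)
        # hypothetical logistic link: falls become likely below ES ~ 40
        p_fall = 1.0 / (1.0 + math.exp(fall_steepness * (latent - 40.0)))
        observed.append(0.0 if random.random() < p_fall
                        else max(0.0, min(100.0, latent)))
    return observed
```

Data generated this way reproduce the mixed discrete-continuous shape the abstract describes: a point mass at zero plus a skewed continuous part, which is why standard ANOVA on the observed ES misleads.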
A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.
Tango, Toshiro; Takahashi, Kunihiko
2012-12-30
Spatial scan statistics are widely used tools for detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed different spatial scan statistics, including the elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, this method imposes a practical limit of at most 30 nearest neighbors when searching for candidate clusters, because of its heavy computational load. In this paper, we present a flexible spatial scan statistic implemented with a restricted likelihood ratio proposed by Tango (2008) that (1) eliminates the 30-nearest-neighbor limit and (2) requires far less computational time than the original flexible spatial scan statistic. As a side effect, Monte Carlo simulation shows that it is able to detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo Metropolitan area, Japan. Copyright © 2012 John Wiley & Sons, Ltd.
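Both the circular and the flexible scan statistics score each candidate zone with Kulldorff's Poisson likelihood ratio before any restriction is applied. A minimal sketch of that score (one-sided, high-risk clusters only; the zone-growing search and Tango's restriction are omitted):

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff's Poisson-model log-likelihood ratio for a candidate zone
    with c observed and e expected cases, out of C total cases.
    One-sided: zones with no excess risk score zero."""
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))
```

The scan statistic is the maximum of this score over all candidate zones; its significance is assessed by Monte Carlo replication under the null.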
A Scan Statistic for Continuous Data Based on the Normal Probability Model
Konty, Kevin; Kulldorff, Martin; Huang, Lan
2009-01-01
Abstract Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight...
Effect of clinical information in brain CT scan interpretation : a blinded double crossover study
International Nuclear Information System (INIS)
Zhianpour, M.; Janghorbani, M.
2004-01-01
Errors and variations in interpretation can happen in clinical imaging. Few studies have examined the biasing effect of clinical information on the reporting of brain CT scans. In a blinded double crossover design, we studied whether three radiologists were biased by clinical information when making CT scan diagnoses of the brain. Three consultant radiologists assessed 100 consecutive brain CT scans in three rounds, with at least a one-month interval between rounds. In the first round, the 100 films were given to the radiologists without clinical information. In the second round, the same 100 films were given with the true clinical information available. In the third round, the same 100 films were given with false clinical information allocated. In 180 cases (60%) the evaluation resulted in the same diagnosis on all three occasions (95% confidence interval (CI): 54.5, 65.5), whereas 120 (40%; 95% CI: 34.5, 45.5) sets were evaluated differently. 48 cases (16%; 95% CI: 11.9, 20.1) had discordant evaluations with true and 33 (11%; 95% CI: 7.5, 14.5) with false clinical information. Discordance between readings without clinical information and those with true and false clinical information was 39 (13%; 95% CI: 9.2, 16.8). Correct clinical information improved the brain CT report, while the report became less accurate when false clinical information was allocated. These results indicate that radiologists are biased by clinical information when reporting brain CT scans
Interpretation of the results of statistical measurements. [search for basic probability model
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
Statistical image reconstruction methods for simultaneous emission/transmission PET scans
International Nuclear Information System (INIS)
Erdogan, H.; Fessler, J.A.
1996-01-01
Transmission scans are necessary for estimating the attenuation correction factors (ACFs) to yield quantitatively accurate PET emission images. To reduce the total scan time, post-injection transmission scans have been proposed in which one can simultaneously acquire emission and transmission data using rod sources and sinogram windowing. However, since the post-injection transmission scans are corrupted by emission coincidences, accurate correction for attenuation becomes more challenging. Conventional methods (emission subtraction) for ACF computation from post-injection scans are suboptimal and require relatively long scan times. We introduce statistical methods based on penalized-likelihood objectives to compute ACFs and then use them to reconstruct lower noise PET emission images from simultaneous transmission/emission scans. Simulations show the efficacy of the proposed methods. These methods improve image quality and SNR of the estimates as compared to conventional methods
Energy Technology Data Exchange (ETDEWEB)
Lubin, E.; Lewitus, Z.; Rosenfeld, J.; Levi, M. [Beilinson Medical Centre, University of Tel Aviv School of Medicine (Israel)
1969-05-15
Every impairment of the kidneys' blood supply, or of the production or excretion of urine, is reflected in abnormal renograms that are not specific enough in the information they provide. The subsequent performance of renal scans with Hippuran {sup 131}I adds information on the topographical distribution of the Hippuran {sup 131}I in the renal parenchyma and on the dynamics of its transport through the urinary system. The information thus obtained is valuable in itself and is necessary for the correct interpretation of renography. We have found this especially useful in the follow-up of transplanted kidneys. In cases of anuria following renal transplantation, the renogram is useful for indicating that renal circulation is present in the transplanted kidney, but it is inadequate as a method to differentiate between other renal and post-renal causes of anuria. The anuric transplanted kidney should be scanned thirty minutes after the injection of Hippuran {sup 131}I. Patterns of complementary results of renograms and renal scans are presented that correspond to prerenal, renal and postrenal causes of anuria, with all the important therapeutic implications this differential diagnosis has. (author)
Statistics translated a step-by-step guide to analyzing and interpreting data
Terrell, Steven R
2012-01-01
Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying independent and dependent variables, and selecting and interpreting appropriate statistical tests. All techniques are demonstrated both manually and with the help of SPSS software. The book provides students and others who may need to read and interpret sta
A scan statistic for continuous data based on the normal probability model
Directory of Open Access Journals (Sweden)
Huang Lan
2009-10-01
Full Text Available Abstract Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.
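For continuous data, the normal-model scan statistic compares a single-mean model against a model with separate means inside and outside the scanning window. A minimal sketch with maximum-likelihood estimates and a common variance (a simplification of the published statistic, which also handles unequal variances):

```python
import math

def normal_scan_llr(inside, outside):
    """Normal-model scan LLR: one common mean vs separate means inside and
    outside the window, common variance, all parameters at their MLEs.
    For equal-variance normal models the LLR reduces to
    0.5 * N * log(SS_null / SS_alt)."""
    both = inside + outside
    N = len(both)
    grand = sum(both) / N
    m_in = sum(inside) / len(inside)
    m_out = sum(outside) / len(outside)
    ss_null = sum((x - grand) ** 2 for x in both)
    ss_alt = (sum((x - m_in) ** 2 for x in inside)
              + sum((x - m_out) ** 2 for x in outside))
    return 0.5 * N * math.log(ss_null / ss_alt)
```

As with the count-data versions, the statistic is maximized over all candidate windows and its alpha level is maintained by permuting the observations over locations.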
Background Noise Removal in Ultrasonic B-scan Images Using Iterative Statistical Techniques
Wells, I.; Charlton, P. C.; Mosey, S.; Donne, K. E.
2008-01-01
The interpretation of ultrasonic B-scan images can be a time-consuming process and its success depends on operator skills and experience. Removal of the image background will potentially improve its quality and hence improve operator diagnosis. An automatic background noise removal algorithm is
The uterine blush. A potential false-positive in Meckel's scan interpretation
International Nuclear Information System (INIS)
Fink-Bennett, D.
1982-01-01
To determine the presence, prevalence, and clinical importance of /sup 99m/Tc pertechnetate uterine uptake, this retrospective analysis of 71 Meckel's scans was undertaken. Specifically, each study was evaluated for the presence of a focal accumulation of radiotracer cephalad to the bladder. Patients received an intravenous dose of 150 microCi/kg of /sup 99m/Tc pertechnetate. Each study consisted of 15 one minute anterior serial gamma camera images, and a 15, 30, and 60 minute anterior, right lateral and posterior scintiscan. Menstrual histories were obtained from all patients except two. No males (33/33), nor premenstrual (13/13), menopausal (4/4) or posthysterectomy (2/2) patients revealed a uterine blush. Eleven of 15 patients (73%) with regular menses demonstrated a uterine blush. They were in the menstrual or secretory phases of their cycle. Four demonstrated no uterine uptake, had regular periods, but were in the proliferative phase of their cycle. Two with irregular periods, and one with no recorded menstrual history, manifested the blush. Radiotracer should be expected in the uterus during the menstrual and secretory phases of the menstrual cycle. It is a manifestation of a normal physiologic phenomenon, and must be recognized to prevent false-positive Meckel's scan interpretations
The uterine blush. A potential false-positive in Meckel's scan interpretation
Energy Technology Data Exchange (ETDEWEB)
Fink-Bennett, D.
1982-10-01
To determine the presence, prevalence, and clinical importance of /sup 99m/Tc pertechnetate uterine uptake, this retrospective analysis of 71 Meckel's scans was undertaken. Specifically, each study was evaluated for the presence of a focal accumulation of radiotracer cephalad to the bladder. Patients received an intravenous dose of 150 microCi/kg of /sup 99m/Tc pertechnetate. Each study consisted of 15 one minute anterior serial gamma camera images, and a 15, 30, and 60 minute anterior, right lateral and posterior scintiscan. Menstrual histories were obtained from all patients except two. No males (33/33), nor premenstrual (13/13), menopausal (4/4) or posthysterectomy (2/2) patients revealed a uterine blush. Eleven of 15 patients (73%) with regular menses demonstrated a uterine blush. They were in the menstrual or secretory phases of their cycle. Four demonstrated no uterine uptake, had regular periods, but were in the proliferative phase of their cycle. Two with irregular periods, and one with no recorded menstrual history, manifested the blush. Radiotracer should be expected in the uterus during the menstrual and secretory phases of the menstrual cycle. It is a manifestation of a normal physiologic phenomenon, and must be recognized to prevent false-positive Meckel's scan interpretations.
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
A log-Weibull spatial scan statistic for time to event data.
Usman, Iram; Rosychuk, Rhonda J
2018-06-13
Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. These statistics accompanied by the appropriate distribution can also identify geographic areas with either longer or shorter time to events. Other authors have proposed the spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time to events data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effect of type I differential censoring and power have been investigated through simulated data. Methods are also illustrated on time to specialist visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found northern regions of Alberta had longer times to specialist visit than other areas. We proposed the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters for time to event data. The simulation studies suggest that the test performs well for log-Weibull data.
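The exponential model, which the Weibull and log-Weibull scan statistics generalize, gives the simplest likelihood-ratio score for time-to-event data: with d events over T person-time, the MLE rate is d/T. A sketch of that baseline score (not the authors' log-Weibull statistic, whose extra shape parameter is what handles non-constant hazards):

```python
import math

def exponential_scan_llr(d_in, t_in, d_out, t_out):
    """Exponential-model scan LLR for time-to-event data: compares separate
    inside/outside event rates against one common rate. At the MLE the
    log-likelihood is d*log(d/T) - d, and the constant -d terms cancel."""
    d, t = d_in + d_out, t_in + t_out
    ll = lambda k, s: k * math.log(k / s)
    return ll(d_in, t_in) + ll(d_out, t_out) - ll(d, t)
```

A zone with shorter times to event accumulates events faster per unit person-time, so it scores high; the two-sided versions also flag zones with unusually long times, as in the Alberta specialist-visit example.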
Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.
Gangnon, Ronald E
2012-03-01
The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
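The Gumbel approximation referred to above fits an extreme-value distribution to Monte Carlo replicates of the maximum scan statistic and reads the p-value from the fitted tail, rather than from the empirical rank. A method-of-moments sketch (the toy replicate list in the usage test is illustrative):

```python
import math

EULER_GAMMA = 0.5772156649

def gumbel_pvalue(observed_stat, null_maxima):
    """Fit a Gumbel distribution to Monte Carlo replicates of the maximum
    scan statistic by the method of moments, then return the approximate
    upper-tail p-value P(max > observed_stat)."""
    n = len(null_maxima)
    mean = sum(null_maxima) / n
    var = sum((x - mean) ** 2 for x in null_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi       # Gumbel scale
    mu = mean - EULER_GAMMA * beta              # Gumbel location
    z = (observed_stat - mu) / beta
    return 1.0 - math.exp(-math.exp(-z))
```

Because the fitted distribution has a smooth tail, p-values far smaller than 1/(number of replicates) can be reported, which is the main practical advantage over the empirical rank method.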
A spatial scan statistic for nonisotropic two-level risk cluster.
Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie
2012-01-30
Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster. Copyright © 2011 John Wiley & Sons, Ltd.
DEFF Research Database (Denmark)
Löfgren, Johan; Loft, Annika; Barbosa de Lima, Vinicius Araújo
2017-01-01
PURPOSE: To evaluate, in a controlled prospective manner with double-blind read, whether there are differences in interpretations of PET/CT scans at our tertiary medical centre, Rigshospitalet, compared to the external hospitals. METHODS: Ninety consecutive patients referred to our department who had an external F-18-FDG PET/CT scan were included. Only information that had been available at the time of the initial reading at the external hospital was available at re-interpretation. Teams with one radiologist and one nuclear medicine physician working side by side performed the re...
International Nuclear Information System (INIS)
Lan, B.L.
2001-01-01
An alternative interpretation to Bohm's 'quantum force' and 'active information' is proposed. Numerical evidence is presented, which suggests that the time series of Bohm's 'quantum force' evaluated at the Bohmian position for non-stationary quantum states are typically non-Gaussian stable distributed with a flat power spectrum in classically chaotic Hamiltonian systems. An important implication of these statistical properties is briefly mentioned. (orig.)
Alternative interpretations of statistics on health effects of low-level radiation
International Nuclear Information System (INIS)
Hamilton, L.D.
1983-01-01
Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered
Farrell, Mary Beth
2018-06-01
This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic, the higher the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around the mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution, taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
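The chance-corrected agreement described above, Cohen's κ, can be computed directly from two readers' paired interpretations; the reader labels in the usage test are invented for illustration:

```python
def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two readers: observed agreement corrected for the
    agreement expected by chance from each reader's category frequencies."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    p_chance = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                   for c in categories)
    return (p_obs - p_chance) / (1 - p_chance)
```

For example, two readers who agree on 35 of 50 scans (70%) may still have κ of only 0.4 once chance agreement is removed, which is why κ is preferred over raw percentage agreement.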
Small nodule detectability evaluation using a generalized scan-statistic model
International Nuclear Information System (INIS)
Popescu, Lucretiu M; Lewitt, Robert M
2006-01-01
In this paper, the use of the scan statistic for evaluating the detectability of small nodules in medical images is investigated. The scan-statistic method is often used in applications in which random fields must be searched for abnormal local features. Several results of the detection-with-localization theory are reviewed and a generalization is presented using the noise-nodule distribution obtained by scanning arbitrary areas. One benefit of the noise-nodule model is that it enables determination of the scan-statistic distribution by using only a few image samples, in a way suitable for both simulation and experimental setups. Also, based on the noise-nodule model, the case of multiple targets per image is addressed, and an image abnormality test using the likelihood ratio and an alternative test using multiple decision thresholds are derived. The results obtained reveal that in the case of low-contrast nodules or multiple nodules, the usual test strategy based on a single decision threshold underperforms compared with the alternative tests. That is a consequence of the fact that not only the contrast or the size, but also the number of suspicious nodules is a clue indicating the image abnormality. In the case of the likelihood ratio test, the multiple clues are unified in a single decision variable. Other tests that process multiple clues differently do not necessarily produce a unique ROC curve, as shown in examples using a test involving two decision thresholds. We present examples with two-dimensional time-of-flight (TOF) and non-TOF PET image sets analysed using the scan statistic for different search areas, as well as the fixed-position observer
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed as the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy associated states.
Drug safety data mining with a tree-based scan statistic.
Kulldorff, Martin; Dashevsky, Inna; Avery, Taliser R; Chan, Arnold K; Davis, Robert L; Graham, David; Platt, Richard; Andrade, Susan E; Boudreau, Denise; Gunter, Margaret J; Herrinton, Lisa J; Pawloski, Pamala A; Raebel, Marsha A; Roblin, Douglas; Brown, Jeffrey S
2013-05-01
In post-marketing drug safety surveillance, data mining can potentially detect rare but serious adverse events. Assessing an entire collection of drug-event pairs is traditionally performed on a predefined level of granularity. It is unknown a priori whether a drug causes a very specific or a set of related adverse events, such as mitral valve disorders, all valve disorders, or different types of heart disease. This methodological paper evaluates the tree-based scan statistic data mining method to enhance drug safety surveillance. We use a three-million-member electronic health records database from the HMO Research Network. Using the tree-based scan statistic, we assess the safety of selected antifungal and diabetes drugs, simultaneously evaluating overlapping diagnosis groups at different granularity levels, adjusting for multiple testing. Expected and observed adverse event counts were adjusted for age, sex, and health plan, producing a log likelihood ratio test statistic. Out of 732 evaluated disease groupings, 24 were statistically significant, divided among 10 non-overlapping disease categories. Five of the 10 signals are known adverse effects, four are likely due to confounding by indication, while one may warrant further investigation. The tree-based scan statistic can be successfully applied as a data mining tool in drug safety surveillance using observational data. The total number of statistical signals was modest and does not imply a causal relationship. Rather, data mining results should be used to generate candidate drug-event pairs for rigorous epidemiological studies to evaluate the individual and comparative safety profiles of drugs. Copyright © 2013 John Wiley & Sons, Ltd.
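The tree-based scan statistic evaluates the same Poisson-model likelihood-ratio score at every node of the diagnosis hierarchy, where a node's counts are the sums over its leaf diagnoses. A minimal sketch (the toy tree and counts are invented for illustration; the published method's covariate adjustment of expected counts and its multiple-testing correction are omitted):

```python
import math

def node_llr(obs, exp, total_obs):
    """Poisson-model log-likelihood ratio for one tree node, one-sided:
    only excess risk signals. Assumes expected counts are scaled so that
    they sum to the observed total."""
    if obs <= exp or exp <= 0:
        return 0.0
    rest_o, rest_e = total_obs - obs, total_obs - exp
    if rest_o <= 0:
        return obs * math.log(obs / exp)
    return obs * math.log(obs / exp) + rest_o * math.log(rest_o / rest_e)

def tree_scan(tree, counts, expected):
    """Score every node in a diagnosis hierarchy; `tree` maps each node to
    its leaf diagnoses. Returns the (node, score) with the maximum LLR."""
    total = sum(counts.values())
    scores = {node: node_llr(sum(counts[l] for l in leaves),
                             sum(expected[l] for l in leaves),
                             total)
              for node, leaves in tree.items()}
    return max(scores.items(), key=lambda kv: kv[1])
```

Because overlapping groupings at all granularity levels are scored simultaneously, the method can report whichever level (a single diagnosis, or a broader category) best concentrates the excess.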
Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data.
Kim, Sehwi; Jung, Inkyung
2017-01-01
The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed using the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adapt the Gini coefficient to the spatial scan statistic for ordinal data to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes, with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns.
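The Gini coefficient used in this line of work summarizes how strongly a collection of reported clusters concentrates cases relative to their expected counts; in the Han et al. approach, the candidate maximum reported cluster size whose cluster collection gives the largest Gini coefficient is chosen. A rough sketch of such a computation over one cluster collection; the ordering and naming here are illustrative assumptions, not the authors' exact procedure:

```python
def gini(cases, expected):
    """Gini coefficient of a Lorenz curve built from reported clusters,
    ordered by decreasing relative risk (cases/expected). Returns ~0 when
    cases are spread evenly relative to expectation and approaches 1 when
    they are concentrated in a few high-risk clusters."""
    pairs = sorted(zip(cases, expected), key=lambda p: p[0] / p[1], reverse=True)
    tc, te = sum(cases), sum(expected)
    area = 0.0
    x0 = y0 = 0.0
    for c, e in pairs:
        x1, y1 = x0 + e / te, y0 + c / tc
        area += (x1 - x0) * (y0 + y1) / 2  # trapezoid under the Lorenz curve
        x0, y0 = x1, y1
    return 2 * area - 1
```

One would call `gini` once per candidate maximum cluster size and report the clusters from the size that maximizes it.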
A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.
Read, S; Bath, P A; Willett, P; Maheswaran, R
2013-08-30
The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
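The Gumbel technique discussed above replaces the raw Monte Carlo rank p-value with one read off a Gumbel (extreme value) distribution fitted to the maxima of the replicate scan statistics, which is what makes very small p-values attainable from a modest number of replicates. A sketch of the idea using a method-of-moments fit; this is an illustration of the technique, not SaTScan's exact implementation:

```python
import math
import statistics

EULER_MASCHERONI = 0.5772156649015329

def gumbel_pvalue(observed, null_max_stats):
    """Approximate p-value of an observed scan statistic by fitting a
    Gumbel distribution (method of moments) to the maximum statistics
    obtained from Monte Carlo replicates under the null."""
    mean = statistics.fmean(null_max_stats)
    sd = statistics.stdev(null_max_stats)
    beta = sd * math.sqrt(6) / math.pi        # scale parameter
    mu = mean - EULER_MASCHERONI * beta       # location parameter
    # P(max > observed) under the fitted Gumbel
    return 1.0 - math.exp(-math.exp(-(observed - mu) / beta))
```

An observed statistic far above the replicate maxima yields a p-value far smaller than the 1/(replicates+1) floor of the rank-based method.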
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
Misuse of statistics in the interpretation of data on low-level radiation
International Nuclear Information System (INIS)
Hamilton, L.D.
1982-01-01
Four misuses of statistics in the interpretation of data on low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.
Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data
Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.
2018-01-01
One of the foundations of statistics for high school students is the ability to read data, present data in the form of tables and diagrams, and interpret them. The purpose of this study is to describe high school students' competencies in reading, interpreting and presenting data. Subjects consisted of male and female students with high levels of mathematical ability. Data were collected in the form of task formulations and analyzed by reducing, presenting and verifying the data. Results showed that the students read the data based on explicit features of the diagram, such as explaining points in the diagram as relations between the x and y axes and determining simple trends of a graph, including the maximum and minimum points. In interpreting and summarizing the data, both subjects paid attention to general data trends and used them to predict increases or decreases in the data. The male student estimated the (n+1)th value of the weight data by using the mode of the data, while the female student estimated the weight by using the average. The male student tended not to consider the characteristics of the data, while the female student considered the characteristics of the data more carefully.
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its objectivity in data analysis, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
Saulnier, George E; Castro, Janna C; Cook, Curtiss B
2014-05-01
Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
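The analysis above combines a normalizing Box-Cox transformation of the glucose values with EWMA control charting. The sketch below uses the log transform (the lambda = 0 member of the Box-Cox family) and illustrative smoothing and limit parameters; none of the numeric choices are the authors':

```python
import math

def ewma_chart(values, lam=0.2, L=3.0):
    """EWMA control chart on log-transformed values. lam is the EWMA
    smoothing constant, L the control-limit width in sigma units; both
    are illustrative choices. Returns (ewma, lower_limits, upper_limits)."""
    x = [math.log(v) for v in values]
    n = len(x)
    mu = sum(x) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in x) / (n - 1))
    z, lcl, ucl = [], [], []
    prev = mu  # EWMA starts at the process mean
    for i, xi in enumerate(x, start=1):
        prev = lam * xi + (1 - lam) * prev
        # time-varying control limits for the EWMA statistic
        half = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        z.append(prev)
        lcl.append(mu - half)
        ucl.append(mu + half)
    return z, lcl, ucl

def out_of_control(values):
    """Indices of samples whose EWMA statistic breaches a control limit."""
    z, lcl, ucl = ewma_chart(values)
    return [i for i, (zi, lo, hi) in enumerate(zip(z, lcl, ucl))
            if zi < lo or zi > hi]
```

On a stable series followed by a sustained shift, only samples after the shift are flagged, which is the kind of out-of-control signal the study compares between transformed and non-transformed data.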
Semi-quantitative interpretation of the bone scan in metabolic bone disease
Energy Technology Data Exchange (ETDEWEB)
Fogelman, I; Turner, J G; Hay, I D; Boyle, I T [Royal Infirmary, Glasgow (UK). Dept. of Nuclear Medicine; Citrin, D L [Wisconsin Univ., Madison (USA). Dept. of Human Oncology; Bessent, G R
1979-01-01
Certain easily recognisable features are commonly seen in the bone scans of patients with metabolic bone disorders. Seven such features have been numerically graded by three independent observers in the scans of 100 patients with metabolic bone disease and of 50 control subjects. The total score for each patient is defined as the metabolic index. The mean metabolic index for each group of patients with metabolic bone disease is significantly greater than that for the control group (P < 0.001). (orig.).
Dourado, Jules Carlos; Pereira, Júlio Leonardo Barbosa; Albuquerque, Lucas Alverne Freitas de; Carvalho, Gervásio Teles Cardos de; Dias, Patrícia; Dias, Laura; Bicalho, Marcos; Magalhães, Pollyana; Dellaretti, Marcos
2015-08-01
The power of interpretation in the analysis of cranial computed tomography (CCT) among neurosurgeons and radiologists has rarely been studied. This study aimed to assess the rate of agreement in the interpretation of CCTs between neurosurgeons and a radiologist in an emergency department. A total of 227 CCTs were independently analyzed by two neurosurgeons (NS1 and NS2) and a radiologist (RAD). The level of agreement in interpreting the examinations was studied. The Kappa values obtained between NS1, NS2 and RAD indicated nearly perfect to substantial agreement. The highest levels of agreement when evaluating abnormalities were observed in the identification of tumors, hydrocephalus and intracranial hematomas. The worst levels of agreement were observed for leukoaraiosis and reduced brain volume. For diseases in which the emergency room procedure must be determined, agreement in the interpretation of CCTs between the radiologist and neurosurgeons was satisfactory.
Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.
Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro
2010-01-01
This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one module and two angles) that were analyzed by spherical statistics. A study case is presented and discussed in detail. Errors were calculated using 53 check points (CPs), whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed both by the conventional method (modular error analysis) and by the proposed method (angular error analysis), using 3D graphics and numerical spherical statistics. Two packages in the R programming language were written to produce the graphics automatically. The results indicate that the proposed method is advantageous, as it offers a more complete analysis of the positional accuracy (angular error components, uniformity of the vector distribution, and error isotropy) in addition to the modular error component given by linear statistics.
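A basic building block of such spherical error analysis is separating each 3D error vector into a module (length) and a direction, then summarizing the directions by the mean resultant length: near 0 for an isotropic, uniform direction distribution, near 1 for strongly concentrated directions. A minimal sketch in Python (the paper's own tooling is in R; names here are ours):

```python
import math

def spherical_summary(errors):
    """Summarize 3D error vectors (x, y, z): returns (mean module, R),
    where R is the mean resultant length of the unit direction vectors.
    R ~ 0 suggests isotropic errors; R ~ 1 a systematic direction bias."""
    n = len(errors)
    modules = [math.sqrt(x * x + y * y + z * z) for x, y, z in errors]
    # sum of unit direction vectors
    sx = sum(x / m for (x, y, z), m in zip(errors, modules))
    sy = sum(y / m for (x, y, z), m in zip(errors, modules))
    sz = sum(z / m for (x, y, z), m in zip(errors, modules))
    R = math.sqrt(sx * sx + sy * sy + sz * sz) / n
    return sum(modules) / n, R
```

Identical directions give R = 1, while directions that cancel pairwise give R = 0 regardless of the error magnitudes.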
Gerrits, Reinie G; Kringos, Dionne S; van den Berg, Michael J; Klazinga, Niek S
2018-03-07
Policy-makers, managers, scientists, patients and the general public are confronted daily with figures on health and healthcare through public reporting in newspapers, webpages and press releases. However, information on the key characteristics of these figures necessary for their correct interpretation is often not adequately communicated, which can lead to misinterpretation and misinformed decision-making. The objective of this research was to map the key characteristics relevant to the interpretation of figures on health and healthcare, and to develop a Figure Interpretation Assessment Tool-Health (FIAT-Health) through which figures on health and healthcare can be systematically assessed, allowing for a better interpretation of these figures. The abovementioned key characteristics of figures on health and healthcare were identified through systematic expert consultations in the Netherlands on four topic categories of figures, namely morbidity, healthcare expenditure, healthcare outcomes and lifestyle. The identified characteristics were used as a frame for the development of the FIAT-Health. Development of the tool and its content was supported and validated through regular review by a sounding board of potential users. Identified characteristics relevant for the interpretation of figures in the four categories relate to the figures' origin, credibility, expression, subject matter, population and geographical focus, time period, and underlying data collection methods. The characteristics were translated into a set of 13 dichotomous and 4-point Likert scale questions constituting the FIAT-Health, and two final assessment statements. Users of the FIAT-Health were provided with a summary overview of their answers to support a final assessment of the correctness of a figure and the appropriateness of its reporting. FIAT-Health can support policy-makers, managers, scientists, patients and the general public to systematically assess the quality of publicly reported
International Nuclear Information System (INIS)
Shafieloo, Arman
2012-01-01
By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular form of parametrization for cosmological quantities like the luminosity distance, Hubble parameter or equation of state of dark energy. Instead, hyper-parameters of the Crossing functions perform as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It is also shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method in testing cosmological models dealing with data with high uncertainties.
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process or product, typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is a lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Braun, Stefan; Pokorná, Šárka; Šachl, Radek; Hof, Martin; Heerklotz, Heiko; Hoernke, Maria
2018-01-23
The mode of action of membrane-active molecules, such as antimicrobial, anticancer, cell penetrating, and fusion peptides and their synthetic mimics, transfection agents, drug permeation enhancers, and biological signaling molecules (e.g., quorum sensing), involves either the general or local destabilization of the target membrane or the formation of defined, rather stable pores. Some effects aim at killing the cell, while others need to be limited in space and time to avoid serious damage. Biological tests reveal translocation of compounds and cell death but do not provide a detailed, mechanistic, and quantitative understanding of the modes of action and their molecular basis. Model membrane studies of membrane leakage have been used for decades to tackle this issue, but their interpretation in terms of biology has remained challenging and often quite limited. Here we compare two recent, powerful protocols to study model membrane leakage: the microscopic detection of dye influx into giant liposomes and time-correlated single photon counting experiments to characterize dye efflux from large unilamellar vesicles. A statistical treatment of both data sets does not only harmonize apparent discrepancies but also makes us aware of principal issues that have been confusing the interpretation of model membrane leakage data so far. Moreover, our study reveals a fundamental difference between nano- and microscale systems that needs to be taken into account when conclusions about microscale objects, such as cells, are drawn from nanoscale models.
Energy Technology Data Exchange (ETDEWEB)
Daisuke Onozuka; Akihito Hagihara [Fukuoka Institute of Health and Environmental Sciences, Fukuoka (Japan). Department of Information Science
2007-07-01
Tuberculosis (TB) has reemerged as a global public health epidemic in recent years. Although evaluating local disease clusters leads to effective prevention and control of TB, there are few, if any, spatiotemporal comparisons for epidemic diseases. TB cases among residents in Fukuoka Prefecture between 1999 and 2004 (n = 9,119) were geocoded at the census tract level (n = 109) based on residence at the time of diagnosis. The spatial and space-time scan statistics were then used to identify clusters of census tracts with elevated proportions of TB cases. In the purely spatial analyses, the most likely clusters were in the Chikuho coal mining area (in 1999, 2002, 2003, 2004), the Kita-Kyushu industrial area (in 2000), and the Fukuoka urban area (in 2001). In the space-time analysis, the most likely cluster was the Kita-Kyushu industrial area (in 2000). The north part of Fukuoka Prefecture was the most likely to have a cluster with a significantly high occurrence of TB. The spatial and space-time scan statistics are effective ways of describing circular disease clusters. Since, in reality, infectious diseases might form other cluster types, the effectiveness of the method may be limited under actual practice. The sophistication of the analytical methodology, however, is a topic for future study. 48 refs., 3 figs., 3 tabs.
Directory of Open Access Journals (Sweden)
Elżbieta Biernat
2014-12-01
Background: The aim of this paper is to assess whether basic descriptive statistics are sufficient to interpret data on the physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max-min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables at the significance level of 0.05 (Kruskal-Wallis and Pearson's Chi² tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous: 23.5%, moderate: 30.2%, walking: 39.5%). The total OPA amounted to 2751.1 MET-min/week (MET: Metabolic Equivalent of Task), with a very high standard deviation (SD = 5302.8) and max = 35,511 MET-min/week. It concerned different types of activities. Approximately 10% of respondents (above the 90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession or the type of activity. The average sitting time was 256 min/day. As many as 39% of the respondents met the World Health Organization standards through OPA alone (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to define quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of the IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding results from further analysis. Med Pr 2014;65(6):743-753
Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability
Directory of Open Access Journals (Sweden)
Daniela Lucini
2018-04-01
In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic nervous system (ANS) regulation. We apply a three-step statistical procedure, first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of the information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. controls. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through dedicated graphical tools, i.e., the significance diagram and the ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination: e.g., athletes differ from controls in all domains, but with a graded strength: maximal in the (normalized) oscillatory and pulse domains, slightly less in the pressure domain and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines
Interpretation of gamma-scanning data from the ORR demonstration elements
International Nuclear Information System (INIS)
Bretscher, M.M.; Snelgrove, J.L.; Hobbs, R.W.
1989-01-01
The HEU and LEU fuel elements used in the ORR whole-core demonstration were gamma-scanned to determine the axial distributions of the 140La and 137Cs activities. Analysis of these data is now complete. From the 140La activity distributions, cycle-averaged powers were determined, while the 137Cs data provided a measure of the final 235U burnup in the fuel elements. A method for calculating correction factors for activity gradients transverse to the fuel element axis is presented and applied to the first mixed core used in the demonstration during the gradual transition to an all-LEU core. Results based on the gamma-scanning of the LEU fuel followers are also presented. Improved burnup calculations against which the experimental results are to be compared are now in progress. 7 refs., 21 figs., 3 tabs
Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma
2015-11-01
When the DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is necessary to establish and expand databases that reflect the actual allele frequencies in the population concerned. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database representing the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise likelihood ratio (LR) estimates for statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
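An inbreeding correction like the one estimated above typically enters match-probability calculations through theta-corrected conditional single-locus genotype probabilities in the style of NRC II Recommendation 4.2 (Balding-Nichols). A sketch of that calculation, using the inbreeding value reported in the abstract as the default theta; the function names and the use of these particular formulas here are our own assumptions, not necessarily the paper's exact procedure:

```python
def genotype_freq(p, q=None, theta=0.0106):
    """Theta-corrected conditional genotype probability at one locus
    (Balding-Nichols style). Pass one allele frequency for a homozygote,
    two for a heterozygote. theta defaults to the inbreeding parameter
    reported in the abstract; theta = 0 recovers p*p and 2*p*q."""
    d = (1 + theta) * (1 + 2 * theta)
    if q is None:  # homozygote a/a
        return (2 * theta + (1 - theta) * p) * (3 * theta + (1 - theta) * p) / d
    return 2 * (theta + (1 - theta) * p) * (theta + (1 - theta) * q) / d

def profile_freq(loci, theta=0.0106):
    """Random match probability of a multi-locus profile, multiplying
    per-locus probabilities across independent loci (a sketch)."""
    pm = 1.0
    for alleles in loci:
        pm *= genotype_freq(*alleles, theta=theta)
    return pm
```

A positive theta inflates homozygote probabilities relative to the naive p², which is the conservative direction for forensic reporting.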
Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology
Directory of Open Access Journals (Sweden)
Parsuram Nayak
2018-01-01
There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. Advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting by simulation models for plant diseases has great potential in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through the construction of box-and-whisker plots has been suggested. Probable applications of recent advanced tools of linear and non-linear mixed models, such as the linear mixed model, generalized linear model, and generalized linear mixed models, have been presented. The most recent technologies such as micro-array analysis, though cost effective, provide estimates of gene expression for thousands of genes simultaneously and need attention by molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in
International Nuclear Information System (INIS)
Xin Jun; Zhao Zhoushe; Li Hong; Lu Zhe; Wu Wenkai; Guo Qiyong
2013-01-01
Objective: To improve the image quality of low-dose CT in whole-body PET/CT using adaptive statistical iterative reconstruction (ASiR) technology. Methods: A GE water phantom was scanned twice, with scan parameters of 120 kV and 120 or 300 mA, respectively. In addition, 30 subjects referred for PET/CT were selected randomly; whole-body PET/CT was performed after 18F-FDG injection of 3.70 MBq/kg, using Sharp IR + time-of-flight + VUE Point HD technology at 1.5 min/bed for PET. Spiral CT was performed at 120 kV using automatic exposure control (30-210 mA, noise index 25). Phantom and whole-body patient CT images were reconstructed with the conventional method and with 40% ASiR, respectively, and the CT attenuation value and noise index were measured. Results: Phantom and clinical results showed that the standard deviation of CT values with the ASiR method was 33.0% lower than with the conventional reconstruction method (t=27.76, P<0.01); the standard deviation of CT values in normal tissues (brain, lung, mediastinum, liver and vertebral body) and in lesions (brain, lung, mediastinum, liver and vertebral body) was reduced by 21.08% (t=23.35, P<0.01) and 24.43% (t=16.15, P<0.01) respectively. For normal liver tissue and liver lesions in particular, standard deviations of CT values were reduced by 51.33% (t=34.21, P<0.01) and 49.54% (t=15.21, P<0.01) respectively. Conclusion: The ASiR reconstruction method significantly reduced the noise of low-dose CT images and improved CT image quality in whole-body PET/CT, making the images more suitable for quantitative analysis and clinical applications. (authors)
Interpretation of scanning tunneling quasiparticle interference and impurity states in cuprates.
Kreisel, A; Choubey, Peayush; Berlijn, T; Ku, W; Andersen, B M; Hirschfeld, P J
2015-05-29
We apply a recently developed method combining first principles based Wannier functions with solutions to the Bogoliubov-de Gennes equations to the problem of interpreting STM data in cuprate superconductors. We show that the observed images of Zn on the surface of Bi_{2}Sr_{2}CaCu_{2}O_{8} can only be understood by accounting for the tails of the Cu Wannier functions, which include significant weight on apical O sites in neighboring unit cells. This calculation thus puts earlier crude "filter" theories on a microscopic foundation and solves a long-standing puzzle. We then study quasiparticle interference phenomena induced by out-of-plane weak potential scatterers, and show how patterns long observed in cuprates can be understood in terms of the interference of Wannier functions above the surface. Our results show excellent agreement with experiment and enable a better understanding of novel phenomena in the cuprates via STM imaging.
A statistical pixel intensity model for segmentation of confocal laser scanning microscopy images.
Calapez, Alexandre; Rosa, Agostinho
2010-09-01
Confocal laser scanning microscopy (CLSM) has been widely used in the life sciences for the characterization of cell processes because it allows the recording of the distribution of fluorescence-tagged macromolecules on a section of the living cell. It is in fact the cornerstone of many molecular transport and interaction quantification techniques where the identification of regions of interest through image segmentation is usually a required step. In many situations, because of the complexity of the recorded cellular structures or because of the amounts of data involved, image segmentation either is too difficult or inefficient to be done by hand and automated segmentation procedures have to be considered. Given the nature of CLSM images, statistical segmentation methodologies appear as natural candidates. In this work we propose a model to be used for statistical unsupervised CLSM image segmentation. The model is derived from the CLSM image formation mechanics and its performance is compared to the existing alternatives. Results show that it provides a much better description of the data on classes characterized by their mean intensity, making it suitable not only for segmentation methodologies with known number of classes but also for use with schemes aiming at the estimation of the number of classes through the application of cluster selection criteria.
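The abstract does not give the model's exact form, but the general idea, unsupervised segmentation of photon-count images by fitting a mixture of intensity classes with EM, can be sketched in a few lines. The two-class Poisson mixture below is an illustrative stand-in under stated assumptions, not the authors' model:

```python
import math
import random

def poisson_logpmf(k, lam):
    # log P(K = k) for a Poisson(lam) photon count
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def sample_poisson(lam, rng):
    # Knuth's method; adequate for the small rates used here
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def em_two_class(pixels, n_iter=100):
    """EM for a two-class Poisson mixture over pixel counts.
    Returns (class means, class weights)."""
    lam = [min(pixels) + 1.0, max(pixels) + 1.0]  # crude initialisation
    w = [0.5, 0.5]
    for _ in range(n_iter):
        resp_sum, resp_x = [0.0, 0.0], [0.0, 0.0]
        for x in pixels:
            p = [w[c] * math.exp(poisson_logpmf(x, lam[c])) for c in (0, 1)]
            tot = p[0] + p[1]
            for c in (0, 1):
                r = p[c] / tot          # responsibility of class c for pixel x
                resp_sum[c] += r
                resp_x[c] += r * x
        for c in (0, 1):                # M-step: update weights and class means
            w[c] = resp_sum[c] / len(pixels)
            lam[c] = max(resp_x[c] / resp_sum[c], 1e-9)
    return lam, w

rng = random.Random(0)
# synthetic image: dim background (mean 2) and brighter structures (mean 20)
pixels = [sample_poisson(2.0, rng) for _ in range(1500)] + \
         [sample_poisson(20.0, rng) for _ in range(500)]
lam, w = em_two_class(pixels)
```

With the classes recovered, segmentation assigns each pixel to the class with the larger responsibility; the same machinery extends to more classes, which is where cluster selection criteria enter.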
Interpretation of the MEG-MUSIC scan in biomagnetic source localization
Energy Technology Data Exchange (ETDEWEB)
Mosher, J.C.; Lewis, P.S. [Los Alamos National Lab., NM (United States); Leahy, R.M. [University of Southern California, Los Angeles, CA (United States). Signal and Image Processing Inst.
1993-09-01
MEG-MUSIC is a new approach to MEG source localization. It is based on a spatio-temporal source model in which the observed biomagnetic fields are generated by a small number of current dipole sources with fixed positions/orientations and varying strengths. From the spatial covariance matrix of the observed fields, a signal subspace can be identified. The rank of this subspace is equal to the number of elemental sources present. This signal subspace is used in a projection metric that scans the three-dimensional head volume. Given a perfect signal subspace estimate and a perfect forward model, the metric will peak at unity at each dipole location. In practice, the signal subspace estimate is contaminated by noise, which in turn yields MUSIC peaks that are less than unity. Previously we examined the lower bounds on localization error, independent of the choice of localization procedure. In this paper, we analyze the effects of noise and temporal coherence on the signal subspace estimate and the resulting effects on the MEG-MUSIC peaks.
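The scanning metric can be illustrated with a toy one-dimensional version. Here the "forward model" g(s) mapping a source position to sensor amplitudes is an arbitrary smooth stand-in (not a real biomagnetic lead field), and the signal subspace is obtained from noise-free data by Gram-Schmidt; the metric then peaks at unity exactly at the true source positions, as the abstract describes for the ideal case:

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def orthonormal_basis(columns, tol=1e-10):
    # Gram-Schmidt; with noise-free data this spans the signal subspace
    basis = []
    for v in columns:
        w = list(v)
        for b in basis:
            c = dot(w, b)
            w = [wi - c * bi for wi, bi in zip(w, b)]
        n = norm(w)
        if n > tol:
            basis.append([wi / n for wi in w])
    return basis

def music_metric(g, basis):
    # norm of the projection of the normalised forward vector onto the
    # signal subspace; equals 1 when g lies in the subspace (true source)
    return math.sqrt(sum(dot(g, b) ** 2 for b in basis)) / norm(g)

sensors = [float(j) for j in range(8)]
def forward(s):
    # toy "lead field": sensor response falls off with distance to source s
    return [1.0 / (1.0 + (x - s) ** 2) for x in sensors]

g1, g2 = forward(2.0), forward(5.5)       # two fixed dipole positions
# spatio-temporal data: fixed sources with independent time courses
times = [0.3 * t for t in range(12)]
data_columns = [[a * math.sin(t) + b * math.cos(2 * t) for a, b in zip(g1, g2)]
                for t in times]
subspace = orthonormal_basis(data_columns)  # rank 2 = number of sources
```

Scanning `music_metric(forward(s), subspace)` over candidate positions s reproduces the peak-at-unity behaviour; adding noise to `data_columns` is the quickest way to see the sub-unity peaks the paper analyzes.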
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence faces growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
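The CPI/CPE calculation itself is simple once the alleles observed in the mixture and their population frequencies are in hand. A minimal sketch with illustrative, made-up frequencies; real casework applies further corrections (e.g. for population substructure and dropout) that are omitted here:

```python
def cpi(per_locus_freqs):
    """Combined Probability of Inclusion.
    per_locus_freqs: for each locus, the population frequencies of
    every allele observed in the mixture at that locus."""
    result = 1.0
    for freqs in per_locus_freqs:
        # probability that a random person carries only observed alleles
        result *= sum(freqs) ** 2
    return result

def cpe(per_locus_freqs):
    """Combined Probability of Exclusion: chance a random person
    is excluded as a possible contributor."""
    return 1.0 - cpi(per_locus_freqs)

# hypothetical two-locus mixture
mixture = [[0.10, 0.20, 0.15],   # locus 1: three observed alleles
           [0.30, 0.10]]         # locus 2: two observed alleles
```

For this toy mixture the per-locus inclusion probabilities are 0.45² and 0.40², so CPI = 0.0324 and CPE = 0.9676; the article's protocol governs which loci may legitimately enter this product.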
van Driel, A.F.; Nikolaev, I.; Vergeer, P.; Lodahl, P.; Vanmaekelbergh, D.; Vos, Willem L.
2007-01-01
We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters and the
International Nuclear Information System (INIS)
Morris, A.; Lipe, T.
1996-01-01
In a previous article Van Humbeeck and Planes made a number of criticisms of the authors' recent paper concerning the interpretation of results obtained by Differential Scanning Calorimetry (DSC) from the martensitic transformation of Cu-Al-Ni-Mn-B alloys. Although the martensitic transformation of these shape memory alloys is generally classified as athermal, it has been confirmed that the capacity of the alloys to undergo a more complete thermoelastic transformation (i.e. better reversibility of the transformation) increases with the Mn content. This behavior has been explained by interpreting the DSC results obtained during thermal cycling in terms of a thermally activated mechanism controlling the direct and reverse transformations. When the heating rate increases during the reverse transformation, the DSC curves shift towards higher temperatures, while they shift towards lower temperatures when the cooling rate is increased during the direct transformation. Since the starting transformation temperatures (As, Ms) do not shift, Van Humbeeck and Planes state that there is no real peak shift and assume that the DSC experiments were carried out without taking into account the thermal lag effect between sample and cell. They then deduce a time constant, τ, of 60 seconds because the peak maximum shifts. In fact the assumption made by Van Humbeeck and Planes is false
Analysis of health in health centers area in Depok using correspondence analysis and scan statistic
Basir, C.; Widyaningsih, Y.; Lestari, D.
2017-07-01
Hotspots indicate areas with a higher case intensity than others. In area health problems, for example, the number of illness cases in a region can be used as a parameter describing the condition of the area and determining its severity. If this condition is known early, it can be addressed preventively. Many factors affect the severity level of an area. The health factors considered in this study are the numbers of infants with low birth weight, malnourished children under five years old, deaths of children under five, maternal deaths, births without the help of health personnel, infants without health care, and infants without basic immunization. The number of cases is based on each public health center area in Depok. Correspondence analysis provides graphical information about the relationship between two nominal variables. It creates a plot based on row and column scores and shows categories that are strongly related at close distances. The scan statistic method is used to examine hotspots based on selected variables occurring in the study area, and correspondence analysis is used to picture the association between the regions and the variables. Using the SaTScan software, the Sukatani health center is obtained as a point hotspot, and correspondence analysis shows that the health centers and the seven variables have a very significant relationship, with the majority of health centers close to all variables except Cipayung, which is distantly related to the number of maternal deaths. These results can be used as input for government agencies to improve the health level in the area.
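Computationally, correspondence analysis of a health-center-by-indicator contingency table reduces to an SVD of the matrix of standardized residuals; the row and column scores that packages plot follow directly. A hedged sketch on a toy table (not the Depok data):

```python
import numpy as np

def correspondence_analysis(table):
    """Row/column principal coordinates from a contingency table."""
    N = np.asarray(table, dtype=float)
    P = N / N.sum()                          # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)      # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]    # row principal coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None] # column principal coordinates
    return rows, cols, sv

table = [[20, 5, 3],    # toy counts: 4 areas x 3 health indicators
         [4, 18, 6],
         [2, 7, 19],
         [10, 9, 8]]
rows, cols, sv = correspondence_analysis(table)

# sanity check: total inertia equals chi-square / n, a standard CA identity
N = np.asarray(table, float)
expected = N.sum() * np.outer(N.sum(1) / N.sum(), N.sum(0) / N.sum())
chi2 = ((N - expected) ** 2 / expected).sum()
```

Plotting the first two columns of `rows` and `cols` on shared axes gives the kind of joint map the study uses to read off which health centers sit close to which indicators.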
Honda, O; Yanagawa, M; Inoue, A; Kikuyama, A; Yoshida, S; Sumikawa, H; Tobino, K; Koyama, M; Tomiyama, N
2011-04-01
We investigated the image quality of multiplanar reconstruction (MPR) using adaptive statistical iterative reconstruction (ASIR). Inflated and fixed lungs were scanned with a garnet-detector CT in high-resolution mode (HR mode) or non-high-resolution (non-HR) mode, and MPR images were then reconstructed. Observers compared 15 MPR images of ASIR (40%) and ASIR (80%) with those of ASIR (0%), and assessed image quality using a visual five-point scale (1, definitely inferior; 5, definitely superior), with particular emphasis on normal pulmonary structures, artefacts, noise and overall image quality. The mean overall image quality scores in HR mode were 3.67 with ASIR (40%) and 4.97 with ASIR (80%). Those in non-HR mode were 3.27 with ASIR (40%) and 3.90 with ASIR (80%). The mean artefact scores in HR mode were 3.13 with ASIR (40%) and 3.63 with ASIR (80%), but those in non-HR mode were 2.87 with ASIR (40%) and 2.53 with ASIR (80%). The mean scores of the other parameters were greater than 3, and those in HR mode were higher than those in non-HR mode. There were significant differences between ASIR (40%) and ASIR (80%) in overall image quality. ASIR did not suppress the severe artefacts of contrast medium. In general, MPR image quality with ASIR (80%) was superior to that with ASIR (40%). However, there was an increased incidence of artefacts with ASIR when CT images were obtained in non-HR mode.
Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong
2016-01-01
Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.
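The role of the maximum-cluster-size parameter is easy to see in a one-dimensional sketch of the Poisson scan statistic: the cap defines the candidate set of windows over which the log-likelihood ratio is maximised. The code below is a simplified stand-in (contiguous windows on a line rather than circles on a map, and no significance testing), not the MCS-P procedure itself:

```python
import math

def poisson_llr(c, e, C):
    # Kulldorff log-likelihood ratio for a window with c observed cases and
    # e expected cases, out of C total cases (high-rate windows only)
    if c <= e or c == 0 or c == C:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def best_window(cases, pops, max_size):
    """Scan all contiguous windows up to max_size regions; return the
    window with the largest log-likelihood ratio."""
    C, P = sum(cases), sum(pops)
    best = (0.0, None)
    for i in range(len(cases)):
        for j in range(i, min(i + max_size, len(cases))):
            c = sum(cases[i:j + 1])
            e = C * sum(pops[i:j + 1]) / P   # expected under the null
            llr = poisson_llr(c, e, C)
            if llr > best[0]:
                best = (llr, (i, j))
    return best

# ten regions with equal populations; regions 3-5 carry excess cases
pops = [100] * 10
cases = [5, 5, 5, 15, 15, 15, 5, 5, 5, 5]
llr, window = best_window(cases, pops, max_size=5)
```

Re-running with `max_size=2` forces the statistic to report only a fragment of the true three-region cluster, which is exactly the kind of parameter sensitivity a measure like MCS-P is designed to diagnose.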
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper reviews two motivations for conducting "what if" analyses using Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do, and in applied research either prospectively to estimate what sample size…
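The core of such a "what if" exercise is simply recomputing a p-value while holding the effect size fixed and varying n. A sketch using a two-sided z-test, a simplification of the t-test a spreadsheet version would likely use:

```python
import math

def two_sided_p(z):
    # two-sided normal-theory p-value
    return math.erfc(abs(z) / math.sqrt(2.0))

def what_if_p(effect_size_d, n):
    """p-value for a one-sample z-test with standardized effect size d
    (mean difference / sd) and sample size n."""
    return two_sided_p(effect_size_d * math.sqrt(n))

# the same modest effect (d = 0.2) is "non-significant" at n = 50
# but "significant" at n = 100
p50, p100 = what_if_p(0.2, 50), what_if_p(0.2, 100)
```

Tabulating `what_if_p` over a grid of n values makes the teaching point concrete: statistical significance reflects sample size as much as effect size.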
Directory of Open Access Journals (Sweden)
Ozonoff Al
2010-07-01
Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression
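The permutation logic evaluated here can be sketched independently of the GAM machinery: hold locations fixed, permute case/control labels, and compare an observed spatial statistic against its permutation distribution. The statistic below is a deliberately simple stand-in (difference in mean distance to a hypothetical point source) rather than the GAM deviance statistic the paper tests:

```python
import math
import random

def mean_dist(points, source=(0.0, 0.0)):
    return sum(math.hypot(x - source[0], y - source[1])
               for x, y in points) / len(points)

def permutation_p(case_pts, control_pts, n_perm=999, seed=0):
    """One-sided Monte Carlo p-value for 'cases sit closer to the source'."""
    rng = random.Random(seed)
    observed = mean_dist(control_pts) - mean_dist(case_pts)
    pooled = list(case_pts) + list(control_pts)
    n_cases = len(case_pts)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)              # relabel under the null hypothesis
        stat = mean_dist(pooled[n_cases:]) - mean_dist(pooled[:n_cases])
        if stat >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)     # Monte Carlo p-value

rng = random.Random(1)
# synthetic point source at the origin: cases cluster near it, controls do not
cases = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(20)]
controls = [(rng.gauss(4, 1), rng.gauss(4, 1)) for _ in range(20)]
p = permutation_p(cases, controls)
```

Because only the labels are permuted while locations stay fixed, the test conditions on the spatial sampling pattern, which is the property that makes it comparable to the scan statistic in the simulations described.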
On the statistical interpretation of quantum mechanics: evolution of the density matrix
International Nuclear Information System (INIS)
Benzecri, J.P.
1986-01-01
Without attempting to identify ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first three of these five theses.
Webb, Samuel J; Hanser, Thierry; Howlin, Brendan; Krause, Paul; Vessey, Jonathan D
2014-03-25
A new algorithm has been developed to enable the interpretation of black box models. The algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. It has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen, as there is no change in the prediction; the interpretation is produced directly from the model's behaviour for the specific query. Models have been built using multiple learning algorithms, including support vector machine and random forest. The models were built on public Ames mutagenicity data, and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretations revealed close links with understood mechanisms for Ames mutagenicity. This methodology allows for greater utilisation of the predictions made by black box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development.
Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall
2016-01-01
Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis discusses also several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott
2014-05-12
In hospitals, Clostridium difficile infection (CDI) surveillance relies on unvalidated guidelines or threshold criteria to identify outbreaks. This can result in false-positive and -negative cluster alarms. The application of statistical methods to identify and understand CDI clusters may be a useful alternative or complement to standard surveillance techniques. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting CDI clusters and determine if there are significant differences in the rate of CDI cases by month, season, and year in a community hospital. Bacteriology reports of patients identified with a CDI from August 2006 to February 2011 were collected. For patients detected with CDI from March 2010 to February 2011, stool specimens were obtained. Clostridium difficile isolates were characterized by ribotyping and investigated for the presence of toxin genes by PCR. CDI clusters were investigated using a retrospective temporal scan test statistic. Statistically significant clusters were compared to known CDI outbreaks within the hospital. A negative binomial regression model was used to identify associations between year, season, month and the rate of CDI cases. Overall, 86 CDI cases were identified. Eighteen specimens were analyzed and nine ribotypes were classified with ribotype 027 (n = 6) the most prevalent. The temporal scan statistic identified significant CDI clusters at the hospital (n = 5), service (n = 6), and ward (n = 4) levels (P ≤ 0.05). Three clusters were concordant with the one C. difficile outbreak identified by hospital personnel. Two clusters were identified as potential outbreaks. The negative binomial model indicated years 2007-2010 (P ≤ 0.05) had decreased CDI rates compared to 2006 and spring had an increased CDI rate compared to the fall (P = 0.023). Application of the temporal scan statistic identified several clusters, including potential outbreaks not detected by hospital
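A retrospective temporal scan of the kind applied here can be sketched in a few lines: slide windows over a monthly count series, score each with a Poisson log-likelihood ratio, and rank the best observed window against Monte Carlo replicates generated under the null of a constant rate. The counts below are toy values, not the hospital's CDI data:

```python
import math
import random

def llr(c, e, C):
    # Kulldorff Poisson log-likelihood ratio, high-rate windows only
    if c <= e or c == C:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def max_llr_window(counts, max_len):
    C, T = sum(counts), len(counts)
    best = (0.0, None)
    for i in range(T):
        for j in range(i, min(i + max_len, T)):
            c = sum(counts[i:j + 1])
            e = C * (j - i + 1) / T          # uniform-rate expectation
            score = llr(c, e, C)
            if score > best[0]:
                best = (score, (i, j))
    return best

def temporal_scan(counts, max_len=6, n_sim=199, seed=2):
    observed, window = max_llr_window(counts, max_len)
    C, T = sum(counts), len(counts)
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sim):
        sim = [0] * T
        for _ in range(C):                   # redistribute cases uniformly
            sim[rng.randrange(T)] += 1
        if max_llr_window(sim, max_len)[0] >= observed:
            exceed += 1
    p = (exceed + 1) / (n_sim + 1)           # Monte Carlo rank p-value
    return window, observed, p

# twelve months of counts with an excess in months 6-8 (0-indexed)
counts = [4, 4, 4, 4, 4, 4, 20, 20, 20, 4, 4, 4]
window, score, p = temporal_scan(counts)
```

Running the same scan at hospital, service, and ward levels, as the study does, is just a matter of which count series is fed in; the conditioning on the total case count C is what makes the Monte Carlo replicates valid under the null.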
Jieyi Li; Arandjelovic, Ognjen
2017-07-01
Computer science and machine learning in particular are increasingly lauded for their potential to aid medical practice. However, the highly technical nature of the state of the art techniques can be a major obstacle in their usability by health care professionals and thus, their adoption and actual practical benefit. In this paper we describe a software tool which focuses on the visualization of predictions made by a recently developed method which leverages data in the form of large scale electronic records for making diagnostic predictions. Guided by risk predictions, our tool allows the user to explore interactively different diagnostic trajectories, or display cumulative long term prognostics, in an intuitive and easily interpretable manner.
Statistical interpretation of the process of evolution and functioning of Audiovisual Archives
Directory of Open Access Journals (Sweden)
Nuno Miguel Epifânio
2013-03-01
The article characterizes the operating conditions of audiovisual archives, interpreting the results obtained from a quantitative sample survey. The study covered 43 institutions of differing nature and size, both national and foreign, based on questionnaires answered by communication services and cultural institutions. The analysis found a variety of guidelines on the management of information preservation, and characterized the typology of the record collections of each archive. The data collected thus allowed building an overview of the operating model of each organization surveyed in this study.
Ergodic theory, interpretations of probability and the foundations of statistical mechanics
van Lith, J.H.
2001-01-01
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time
Directory of Open Access Journals (Sweden)
James A Fordyce
BACKGROUND: Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate is the gamma statistic. METHODOLOGY: Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. CONCLUSIONS: The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
Fordyce, James A
2010-07-23
Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
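The gamma statistic itself is a short computation on the internode intervals of an ultrametric tree. A sketch following the standard Pybus-Harvey formulation, where g_k is the interval during which k lineages existed:

```python
import math

def gamma_statistic(internode_intervals):
    """Pybus & Harvey's gamma. internode_intervals[0] is g_2, the
    interval with 2 lineages, through g_n; n = number of taxa."""
    g = internode_intervals
    n = len(g) + 1
    if n < 3:
        raise ValueError("gamma requires at least 3 taxa")
    weighted = [(k + 2) * gk for k, gk in enumerate(g)]  # k+2 runs 2..n
    total = sum(weighted)                  # T = sum of j * g_j for j = 2..n
    running, acc = 0.0, 0.0
    for w in weighted[:-1]:                # partial sums T_2 .. T_{n-1}
        running += w
        acc += running
    mean_partial = acc / (n - 2)
    return (mean_partial - total / 2.0) / (
        total * math.sqrt(1.0 / (12.0 * (n - 2))))

# equal internode intervals already give negative gamma (under pure birth,
# recent intervals should be shorter); stretching the most recent interval,
# i.e. a diversification slowdown, drives gamma further negative
g_equal = gamma_statistic([1.0, 1.0, 1.0])
g_slow = gamma_statistic([1.0, 1.0, 5.0])
```

The abstract's warning follows directly from the structure of the formula: the most recent intervals carry the largest lineage-count weights, so gamma responds strongly to what happens near the present.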
Davies, Hugh Trevor Frimston
still reflects regional ventilation in this age group. The doubt cast on the interpretation of the Kr81m steady state image could limit the value of V/Q lung scans in following regional lung function through childhood, a period when specific ventilation is falling rapidly as the child grows. Therefore the first aim of this study was to examine the application of this theoretical model to children and determine whether the changing specific ventilation seen through childhood significantly alters the interpretation of the steady state Kr81m image. This is a necessary first step before conducting longitudinal studies of regional ventilation and perfusion in children. The effect of posture on regional ventilation and perfusion in the adult human lung has been extensively studied. Radiotracer studies have consistently shown that both ventilation and perfusion are preferentially distributed to dependent lung regions during tidal breathing regardless of posture. There is little published information concerning the pattern in children yet there are many differences in lung and chest wall mechanics of children and adults which, along with clinical observation, have led to the hypothesis that the pattern of regional ventilation observed in adults may not be seen in children. Recent reports of regional ventilation in infants and very young children have provided support for this theory. The paper of Heaf et al demonstrated that these differences may in certain circumstances be clinically important. It is not clear however at what age children adopt the "adult pattern of ventilation". In addition to the problems referred to above, attenuation of Kr81m activity as it passes through the chest wall and the changing geometry of the chest during tidal breathing have made quantitative analysis of the image difficult although fractional ventilation and perfusion to each lung can be calculated from the steady state image. In clinical practise, therefore, ventilation and perfusion are
Coakley, K J; Imtiaz, A; Wallis, T M; Weber, J C; Berweger, S; Kabos, P
2015-03-01
Near-field scanning microwave microscopy offers great potential to facilitate characterization, development and modeling of materials. By acquiring microwave images at multiple frequencies and amplitudes (along with other modalities) one can study material and device physics at different lateral and depth scales. Images are typically noisy and contaminated both by artifacts that can vary from scan line to scan line and by planar-like trends due to sample tilt errors. Here, we level images based on an estimate of a smooth 2-d trend determined with a robust implementation of a local regression method. In this robust approach, features and outliers which are not due to the trend are automatically downweighted. We denoise images with the Adaptive Weights Smoothing method. This method smooths out additive noise while preserving edge-like features in images. We demonstrate the feasibility of our methods on topography images and microwave |S11| images. For one challenging test case, we demonstrate that our method outperforms alternative methods from the scanning probe microscopy data analysis software package Gwyddion. Our methods should be useful for massive image data sets where manual selection of landmarks or image subsets by a user is impractical. Published by Elsevier B.V.
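The leveling step described above, a robust 2-D trend fit that automatically downweights features and outliers, can be sketched with a simpler robust plane fit; the authors use local regression and Adaptive Weights Smoothing, which are not reproduced here. The function name and the Tukey-biweight choice are illustrative assumptions:

```python
import numpy as np

def robust_plane_level(img, n_iter=10, c=4.685):
    """Remove a planar tilt from an image by iteratively reweighted
    least squares with Tukey biweights, so outlier pixels (features,
    scan-line spikes) do not bias the estimated trend."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(nx * ny)])
    z = img.ravel().astype(float)
    w = np.ones_like(z)
    for _ in range(n_iter):
        # weighted least-squares fit of z = a*x + b*y + d
        coef, *_ = np.linalg.lstsq(A * w[:, None], z * w, rcond=None)
        r = z - A @ coef
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.clip(r / (c * s), -1.0, 1.0)
        w = (1.0 - u**2) ** 2                        # Tukey biweight
    return (z - A @ coef).reshape(ny, nx)
```

A pixel lying far from the trend is driven to zero weight, so a single bright feature does not tilt the recovered background plane.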
The statistical interpretations of counting data from measurements of low-level radioactivity
International Nuclear Information System (INIS)
Donn, J.J.; Wolke, R.L.
1977-01-01
The statistical model appropriate to measurements of low-level or background-dominant radioactivity is examined and the derived relationships are applied to two practical problems involving hypothesis testing: 'Does the sample exhibit a net activity above background?' and 'Is the activity of the sample below some preselected limit?'. In each of these cases, the appropriate decision rule is formulated, procedures are developed for estimating the preset count which is necessary to achieve a desired probability of detection, and a specific sequence of operations is provided for the worker in the field. (author)
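The first decision rule ('net activity above background?') is commonly formulated, for paired sample and background counts under Poisson statistics, as a Currie-style critical level; the sketch below assumes that standard formulation rather than the paper's exact derivation, and the function names are illustrative:

```python
import math

def critical_limit(b_counts, k=1.645):
    """Critical level L_C for detecting net activity: with equal-time
    paired gross/background counts, the net-count variance under the
    null hypothesis is approximately 2*B, so L_C = k * sqrt(2*B).
    k = 1.645 gives a ~5% false-positive rate."""
    return k * math.sqrt(2.0 * b_counts)

def detected(gross_counts, b_counts, k=1.645):
    """Decide whether the sample shows net activity above background."""
    net = gross_counts - b_counts
    return net > critical_limit(b_counts, k)
```

For example, with 100 background counts the critical level is about 23 net counts, so a gross count of 130 is declared a detection while 110 is not.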
DEFF Research Database (Denmark)
Van Driel, A.F.; Nikolaev, I.S.; Vergeer, P.
2007-01-01
We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters...... and the intensity in an emission decay curve are not proportional, but the density is a time integral of the intensity. The integral relation is crucial to correctly interpret non-single-exponential decay. We derive the proper normalization for both a discrete and a continuous distribution of rates, where every...... decay component is multiplied by its radiative decay rate. A central result of our paper is the derivation of the emission decay curve when both radiative and nonradiative decays are independently distributed. In this case, the well-known emission quantum efficiency can no longer be expressed...
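The integral relation emphasized in the abstract can be written out explicitly. The notation below (n(t) for the excited-emitter density, I(t) for the detected intensity, sigma(Gamma) for the distribution of decay rates) is assumed for illustration, not taken from the paper:

```latex
% Intensity is the (negative) time derivative of the excited-state
% density, so the density is the time integral of the intensity:
I(t) \propto -\frac{\mathrm{d}n(t)}{\mathrm{d}t}
\quad\Longrightarrow\quad
n(t) \propto \int_{t}^{\infty} I(t')\,\mathrm{d}t' .
% For a continuous distribution of decay rates, every decay component
% enters the intensity multiplied by its decay rate:
I(t) = \int_{0}^{\infty} \sigma(\Gamma)\,\Gamma\,e^{-\Gamma t}\,\mathrm{d}\Gamma .
```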
Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.
Walum, Hasse; Waldman, Irwin D; Young, Larry J
2016-02-01
Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
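The abstract's reasoning about statistical power, prestudy odds and bias follows the standard framework (Ioannidis-style) in which the post-study probability that a significant finding is true (the positive predictive value, PPV) is computed from those quantities; a sketch of that calculation, with an illustrative function name:

```python
def positive_predictive_value(power, alpha, prior_odds, bias=0.0):
    """Probability that a 'significant' finding reflects a true effect,
    given power (1 - beta), significance level alpha, prestudy odds R
    that a tested effect is real, and a bias fraction u that converts
    some non-significant results into reported significant ones."""
    R, u = prior_odds, bias
    true_pos = (power + u * (1.0 - power)) * R   # true effects found significant
    false_pos = alpha + u * (1.0 - alpha)        # null effects found significant
    return true_pos / (true_pos + false_pos)
```

With low power and low prestudy odds, typical of the intranasal OT literature as characterized above, the PPV drops well below one half, which is the basis for the article's skeptical conclusion.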
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
Energy Technology Data Exchange (ETDEWEB)
Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
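The "fractional differential analogue of Ohm's law" is not written out in the abstract; in the fractional-kinetics literature on dispersive transport it typically takes the following form, shown here as an assumed sketch (alpha is the dispersion parameter, sigma_alpha a generalized conductivity, and the operator a Riemann-Liouville fractional derivative):

```latex
% Assumed standard form; the abstract does not give the equation itself.
j(t) \;=\; \sigma_{\alpha}\, {}_{0}D_{t}^{\,1-\alpha} E(t),
\qquad
{}_{0}D_{t}^{\,1-\alpha} f(t)
  \;=\; \frac{1}{\Gamma(\alpha)}\,\frac{\mathrm{d}}{\mathrm{d}t}
        \int_{0}^{t}\frac{f(\tau)}{(t-\tau)^{1-\alpha}}\,\mathrm{d}\tau .
```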
Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J
2018-04-03
The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion with clearly defined criteria for conclusions and known error rate information thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.
Statistical analysis of CT brain scans in the evaluation of cerebral atrophy and hydrocephalus
International Nuclear Information System (INIS)
Oberthur, J.; Baddeley, H.; Jayasinghe, L.; Walsh, P.
1983-01-01
All the subjects with a visual CT diagnosis of atrophy or hydrocephalus showed variations from the normal in excess of two standard deviations, so the standard deviation analysis method can be regarded as being as sensitive as visual interpretation. However, three patients in the control group were also flagged, although their results were only in the borderline range. Limitations of the study are discussed
Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott
2014-07-08
In healthcare facilities, conventional surveillance techniques using rule-based guidelines may result in under- or over-reporting of methicillin-resistant Staphylococcus aureus (MRSA) outbreaks, as these guidelines are generally unvalidated. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting MRSA clusters, validate clusters using molecular techniques and hospital records, and determine significant differences in the rate of MRSA cases using regression models. Patients admitted to a community hospital between August 2006 and February 2011, and identified with MRSA>48 hours following hospital admission, were included in this study. Between March 2010 and February 2011, MRSA specimens were obtained for spa typing. MRSA clusters were investigated using a retrospective temporal scan statistic. Tests were conducted on a monthly scale and significant clusters were compared to MRSA outbreaks identified by hospital personnel. Associations between the rate of MRSA cases and the variables year, month, and season were investigated using a negative binomial regression model. During the study period, 735 MRSA cases were identified and 167 MRSA isolates were spa typed. Nine different spa types were identified with spa type 2/t002 (88.6%) the most prevalent. The temporal scan statistic identified significant MRSA clusters at the hospital (n=2), service (n=16), and ward (n=10) levels (P ≤ 0.05). Seven clusters were concordant with nine MRSA outbreaks identified by hospital staff. For the remaining clusters, seven events may have been equivalent to true outbreaks and six clusters demonstrated possible transmission events. The regression analysis indicated years 2009-2011, compared to 2006, and months March and April, compared to January, were associated with an increase in the rate of MRSA cases (P ≤ 0.05). The application of the temporal scan statistic identified several MRSA clusters that were not detected by hospital
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Energy Technology Data Exchange (ETDEWEB)
Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake
2007-01-15
With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensors (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, because geothermal temperature changes along the wellbore are small for horizontal wells, interpretation of a temperature log becomes difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
Energy Technology Data Exchange (ETDEWEB)
Charleston, D. B.; Beck, R. N.; Eidelberg, P.; Schuh, M. W. [Argonne Cancer Research Hospital, Chicago, IL (United States)
1964-10-15
This paper discusses a range of techniques which assist in evaluating and interpreting scanning read-out display. This range extends from simple internal calibration for photographic read-out to fairly elaborate auxiliary equipment for presentation of accumulated digital scan information to a computer programme. The direct and remarkably useful method of using a random pulse generator to produce a calibrated step-wedge of spots, which are projected on to a film by the same projection light source as is used during the scan, allows the viewer to compare exposure densities of regions of interest on the scan to similar regions on the wedge which are calibrated directly in count-rate units. Auxiliary equipment, such as a multichannel analyser used in the multiscaling mode, permits the accumulation of digital information for a "total count per scan line" display for each index step. Small animal scans have been made which accumulate and display "counts per scan line" for each index step. This produces an accurate quantitative measure of the distribution of activity over the animal and a profile display of activity similar to the slit scan display of a linear scanning system. The same multiscaling technique is carried further by accumulating digital information for a "count per unit area" display. A profile curve is obtained for each scan line of each index step. From this it is possible to visualize or construct an area profile of count-rate. Scan displays with or without contrast enhancement and with or without "time lag" from integrating circuitry and scans with various spot sizes and shapes have been produced under identical statistical conditions by means of multiple read-outs while scanning a phantom with a single-detector system. Direct comparison of displays combined with the "count per unit area" mapping technique aid in the interpretation of scan results. Precise position information must be included with the data record. Computations of percentage
Directory of Open Access Journals (Sweden)
Paul A. Swinton
2018-05-01
The concept of personalized nutrition and exercise prescription represents a topical and exciting progression for the discipline, given the large inter-individual variability that exists in response to virtually all performance and health related interventions. Appropriate interpretation of intervention-based data from an individual or group of individuals requires practitioners and researchers to consider a range of concepts, including the confounding influence of measurement error and biological variability. In addition, likely statistical and practical improvements can be quantified using concepts such as confidence intervals (CIs) and the smallest worthwhile change (SWC). The purpose of this review is to provide accessible and applicable recommendations for practitioners and researchers who interpret and report personalized data. To achieve this, the review is structured in three sections that progressively develop a statistical framework. Section 1 explores fundamental concepts related to measurement error and describes how typical error and CIs can be used to express uncertainty in baseline measurements. Section 2 builds upon these concepts and demonstrates how CIs can be combined with the concept of SWC to assess whether meaningful improvements occur post-intervention. Finally, section 3 introduces the concept of biological variability and discusses the subsequent challenges in identifying individual response and non-response to an intervention. Worked numerical examples and interactive Supplementary Material are incorporated to solidify concepts and assist with implementation in practice.
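The building blocks of sections 1 and 2 (typical error from a test-retest pair, a CI on an individual's observed change, and the SWC) can be sketched numerically. The 0.2 times between-subject SD default for the SWC is a common convention assumed here, not necessarily the review's own choice, and function names are illustrative:

```python
import math
import statistics

def typical_error(test, retest):
    """Typical (within-subject) error from a test-retest pair:
    SD of the difference scores divided by sqrt(2)."""
    diffs = [b - a for a, b in zip(test, retest)]
    return statistics.stdev(diffs) / math.sqrt(2)

def change_ci(pre, post, te, z=1.96):
    """CI for an individual's observed change score. A change built
    from two measurements has SD sqrt(2) * TE."""
    change = post - pre
    half = z * math.sqrt(2) * te
    return change - half, change + half

def smallest_worthwhile_change(baseline_sd, k=0.2):
    """Common default: 0.2 times the between-subject SD."""
    return k * baseline_sd
```

An observed improvement is then judged "likely meaningful" when the whole change CI lies beyond the SWC, rather than by comparing the point estimate alone.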
Comparison of statistical sampling methods with ScannerBit, the GAMBIT scanning module
Energy Technology Data Exchange (ETDEWEB)
Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); McKay, James; Scott, Pat [Imperial College London, Department of Physics, Blackett Laboratory, London (United Kingdom); Farmer, Ben; Conrad, Jan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Roebber, Elinore [McGill University, Department of Physics, Montreal, QC (Canada); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Collaboration: The GAMBIT Scanner Workgroup
2017-11-15
We introduce ScannerBit, the statistics and sampling module of the public, open-source global fitting framework GAMBIT. ScannerBit provides a standardised interface to different sampling algorithms, enabling the use and comparison of multiple computational methods for inferring profile likelihoods, Bayesian posteriors, and other statistical quantities. The current version offers random, grid, raster, nested sampling, differential evolution, Markov Chain Monte Carlo (MCMC) and ensemble Monte Carlo samplers. We also announce the release of a new standalone differential evolution sampler, Diver, and describe its design, usage and interface to ScannerBit. We subject Diver and three other samplers (the nested sampler MultiNest, the MCMC GreAT, and the native ScannerBit implementation of the ensemble Monte Carlo algorithm T-Walk) to a battery of statistical tests. For this we use a realistic physical likelihood function, based on the scalar singlet model of dark matter. We examine the performance of each sampler as a function of its adjustable settings, and the dimensionality of the sampling problem. We evaluate performance on four metrics: optimality of the best fit found, completeness in exploring the best-fit region, number of likelihood evaluations, and total runtime. For Bayesian posterior estimation at high resolution, T-Walk provides the most accurate and timely mapping of the full parameter space. For profile likelihood analysis in less than about ten dimensions, we find that Diver and MultiNest score similarly in terms of best fit and speed, outperforming GreAT and T-Walk; in ten or more dimensions, Diver substantially outperforms the other three samplers on all metrics. (orig.)
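For orientation, the classic DE/rand/1/bin scheme underlying differential evolution samplers such as Diver can be sketched as follows; Diver's actual variants, parallelism and default settings are not reproduced here, and this is a minimizer rather than a likelihood scanner:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimizer: mutate with a scaled difference
    of two random population members, crossover with the target vector,
    and keep the trial point if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)          # guarantee one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)      # clip to the prior box
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:                     # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

In a global-fit setting, f would be the negative log-likelihood and the number of f evaluations (pop_size times generations) is exactly the cost metric the benchmark above compares across samplers.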
Więckowska, Barbara; Marcinkowska, Justyna
2017-11-06
When searching for epidemiological clusters, a useful approach is to compare one's own data against an incidence rate taken from the literature as the reference level. Values exceeding this level may indicate the presence of a cluster at that location. This paper presents a method of searching for clusters that have significantly higher incidence rates than those specified by the investigator. The proposed method uses the classic binomial exact test for one proportion and an algorithm that joins areas with potential clusters while reducing the number of multiple comparisons needed. The new method preserves sensitivity and specificity while avoiding the Monte Carlo approach, and it delivers results comparable to the commonly used Kulldorff's scan statistic and other similar methods of localising clusters. The accompanying statistical software also allows the results to be analysed and presented cartographically.
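The core ingredient, an exact binomial test of one proportion against a literature-derived reference rate p0, can be sketched directly from the binomial tail (variable names illustrative; the paper's area-joining algorithm is not reproduced):

```python
from math import comb

def binom_sf_pvalue(k, n, p0):
    """One-sided exact binomial test: P(X >= k) for X ~ Bin(n, p0).
    A small p-value flags an area whose incidence rate exceeds the
    reference level p0 taken from the literature."""
    return sum(comb(n, i) * p0**i * (1 - p0)**(n - i)
               for i in range(k, n + 1))
```

For example, 12 cases among 200 residents against a reference rate of 3% (expected 6 cases) yields a small upper-tail p-value, so the area would be a cluster candidate, while 7 cases would not.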
Energy Technology Data Exchange (ETDEWEB)
Munoz, Gerard; Bauer, Klaus; Moeck, Inga; Schulze, Albrecht; Ritter, Oliver [Deutsches GeoForschungsZentrum (GFZ), Telegrafenberg, 14473 Potsdam (Germany)
2010-03-15
Exploration for geothermal resources is often challenging because there are no geophysical techniques that provide direct images of the parameters of interest, such as porosity, permeability and fluid content. Magnetotelluric (MT) and seismic tomography methods yield information about subsurface distribution of resistivity and seismic velocity on similar scales and resolution. The lack of a fundamental law linking the two parameters, however, has limited joint interpretation to a qualitative analysis. By using a statistical approach in which the resistivity and velocity models are investigated in the joint parameter space, we are able to identify regions of high correlation and map these classes (or structures) back onto the spatial domain. This technique, applied to a seismic tomography-MT profile in the area of the Gross Schoenebeck geothermal site, allows us to identify a number of classes in accordance with the local geology. In particular, a high-velocity, low-resistivity class is interpreted as related to areas with thinner layers of evaporites; regions where these sedimentary layers are highly fractured may be of higher permeability. (author)
Hervind, Widyaningsih, Y.
2017-07-01
Concurrent infection with multiple infectious agents may occur in a single patient; it appears frequently with dengue hemorrhagic fever (DHF) and typhoid fever. This paper depicts the association between DHF and typhoid from a spatial point of view. Given the paucity of data on dengue and typhoid co-infection, the data used are the numbers of patients with those diseases in every district (kecamatan) in Jakarta in 2014 and 2015, obtained from the Jakarta surveillance website. The Poisson spatial scan statistic is used to detect DHF and typhoid hotspot districts in Jakarta separately. After obtaining the hotspots, Fisher's exact test is applied to validate the association between the two diseases' hotspots. The results show that the hotspots of DHF and typhoid are located around central Jakarta. A further analysis using the Poisson space-time scan statistic reveals hotspots in both space and time: DHF and typhoid fever are more likely to occur from January until May, in areas broadly similar to those from the purely spatial result. Preventive action could be targeted at the hotspot areas, and further study is required to identify the causes based on the characteristics of those areas.
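The validation step, Fisher's exact test on a 2x2 table cross-classifying districts by DHF-hotspot and typhoid-hotspot membership, can be sketched from the hypergeometric distribution; the exact table construction used in the paper is an assumption here:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table (with the same
    margins) that is no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def hyper(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = hyper(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    return sum(hyper(x) for x in range(lo, hi + 1)
               if hyper(x) <= p_obs + 1e-12)
```

Here a would count districts that are hotspots for both diseases, b and c the discordant districts, and d the districts that are hotspots for neither; a small p-value supports a spatial association between the two hotspot maps.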
Bosomprah, Samuel; Dotse-Gborgbortsi, Winfred; Aboagye, Patrick; Matthews, Zoe
2016-11-01
To identify and evaluate clusters of births that occurred outside health facilities in Ghana for targeted intervention. A retrospective study was conducted using a convenience sample of live births registered in Ghanaian health facilities from January 1 to December 31, 2014. Data were extracted from the district health information system. A spatial scan statistic was used to investigate clusters of home births through a discrete Poisson probability model. Scanning with a circular spatial window was conducted only for clusters with high rates of such deliveries. The district was used as the geographic unit of analysis. The likelihood P value was estimated using Monte Carlo simulations. Ten statistically significant clusters with a high rate of home birth were identified. The relative risks ranged from 1.43 ("least likely" cluster; P=0.001) to 1.95 ("most likely" cluster; P=0.001). The relative risks of the top five "most likely" clusters ranged from 1.68 to 1.95; these clusters were located in the Ashanti, Brong Ahafo, Western, Eastern, and Greater Accra regions. Health facility records, geospatial techniques, and geographic information systems provided locally relevant information to assist policy makers in delivering targeted interventions to small geographic areas. Copyright © 2016 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.
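The discrete Poisson scan statistic used here scores each candidate circular window by a log-likelihood ratio and assesses significance by Monte Carlo replication of the case data. A sketch of the window score, assuming expected counts scaled so they sum to the total case count (the usual indirect standardization):

```python
import math

def poisson_scan_llr(c, e, total_cases):
    """Kulldorff-style log-likelihood ratio for one candidate window
    with c observed and e expected cases, scanning for HIGH-rate
    clusters only (expected counts sum to total_cases)."""
    C = total_cases
    if c <= e:
        return 0.0                       # not elevated: no evidence
    inside = c * math.log(c / e)
    outside = 0.0 if c == C else (C - c) * math.log((C - c) / (C - e))
    return inside + outside
```

The window with the largest LLR is the "most likely" cluster; its P value comes from ranking the observed maximum against the maxima from many random Poisson replications of the same expected counts.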
Hotspot detection using space-time scan statistics on children under five years of age in Depok
Verdiana, Miranti; Widyaningsih, Yekti
2017-03-01
Among the problems affecting health levels in Depok are persistently high malnutrition rates and the spread of infectious and non-communicable diseases in some areas. Children under five years old are a part of the population vulnerable to malnutrition and disease. For this reason, it is important to identify where and when malnutrition in Depok has occurred at high intensity. To obtain the locations and times of hotspots of malnutrition and of the diseases that attack children under five years old, the space-time scan statistic can be used. The space-time scan statistic is a hotspot detection method in which spatial and temporal information are taken into account simultaneously. The method detects a hotspot with a cylindrical scanning window: the cylinder's base describes the area, and its height describes the time period. Each cylinder formed is a candidate hotspot, which requires a hypothesis test to determine whether it can be declared a hotspot. Hotspot detection in this study was carried out for several combinations of variables. Some combinations of variables provide hotspot detection results that tend to be the same, and thus form groups (clusters). For the health of children under five in Depok city, the Beji health care center region in 2011-2012 is a hotspot; across the variable combinations used in the detection, Beji health care center appears most frequently as a hotspot. It is hoped that the local government can adopt the right policies to improve the health of children under five in the city of Depok.
International Nuclear Information System (INIS)
Rubinstein, Michael; VanDaele, Paul; Wegener, William MD; Guardia, Miguel de la
2004-01-01
Purpose: To determine if imaging with acetobromo (immunoscintigraphy) is sensitive to technical and interpretative techniques that must be mastered in order to obtain reliable results. We studied the impact of training to reduce the learning curve. Methods: 1) Evaluate the performance of experienced Nuclear Medicine physicians (Team A), un-blinded, on their initial series of patients, compared with the conclusions of experts (Team B) blinded to any clinical information; 2) Team A is trained by the expert team on image acquisition, processing and interpretation techniques, as well as on using all clinical information and anatomic studies for comparison; 3) Assess the performance of Team A on a second series of patients. 4) Questionnaires were sent to 65 consecutive physicians trained by experts to determine whether the learned techniques improved interpretation of immunoscintigrams. Results: Twenty-three (23) patients with CRC were included, 13 in the pre- and 10 in the post-teaching phase, with a total of 30 clinically confirmed lesions (pathologically proven or demonstrated on follow-up). The clinically confirmed lesions included: 8 primary, 12 pelvic recurrences and 10 metastatic sites. On the pre-teaching series, Team A correctly identified only 6/19 lesions (32%). On the post-teaching series, Team A found 8/11 lesions (73%), including 4/5 pelvic recurrences (80%), all 3 primary lesions, and 1/3 metastases, which compares favorably to published results. To determine the effect of blinded reading of immunoscintigrams, Team B reviewed the first 13 studies without any clinical information or CT for comparison. Team B found 10/19 lesions (53%) with 4 false positives. Questionnaires were mailed to 65 trained physicians (54 returned); 67% of responders found that training improved their results, 22% experienced mixed results and 11% did not notice any improvement. Conclusion: The lower than expected sensitivity of the blinded expert team confirms that the overall accuracy
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550mA (450–600) vs. 650mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29mSv (2.84–6.02) vs. 5.84mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ±standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
International Nuclear Information System (INIS)
Saunders, R.L. de C.H.
1980-01-01
Microfocal radiography is used to study, post mortem, the microcirculatory and neuronal organization of the normal and diseased brain, as well as to interpret the images obtained clinically by the new techniques of cerebral magnification angiography and X-ray brain scanning. An outline of the basic technique underlying CT scanning and magnification radiography of the living human brain is given to facilitate the understanding of why microfocal radiography is central to magnification radiography and complementary to CT scanning. Microangiography, one of the microfocal radiographic techniques, is discussed at length in relation to the microvasculature of the human cerebral cortex, the vasculature of the subcortical or medullary white matter, the microvascular patterns of the central grey matter and internal capsule, the vascular patterns of the visual cortex and hippocampus; the application of microangiography to the spinal cord and nerve roots is also discussed. Another microfocal radiographic technique described is cerebral historadiography, i.e. X-ray studies of brain histology, with particular reference to the human hippocampal formation. Finally, the correlation of microfocal X-ray and brain CT scan images is discussed. (U.K.)
Directory of Open Access Journals (Sweden)
Chien-Chou Chen
2016-11-01
Full Text Available Abstract Background Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. Methods A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space–time scan statistics by SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and p value <0.05 was identified as a dengue-epidemic area. Assuming an ongoing transmission might continuously spread for two consecutive weeks, we estimated the sensitivity and specificity for detecting outbreaks by comparing the scan-based classification (dengue-epidemic vs. dengue-free village) with the true cumulative case numbers from the TCDC’s surveillance statistics. Results Among the 1648 villages in Tainan and Kaohsiung, the overall sensitivity for detecting outbreaks increases as case numbers grow in a total of 92 weekly simulations. The specificity for detecting outbreaks behaves inversely, compared to the sensitivity. On average, the mean sensitivity and specificity of 2-week hot spot detection were 0.615 and 0.891 respectively (p value <0.001) for the covariate adjustment model, as the maximum spatial and temporal windows were specified as 50% of the total population at risk and 28 days. Dengue-epidemic villages were visualized and explored in an interactive map. Conclusions We designed an online analytical tool for
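The sensitivity and specificity reported above come from comparing the scan-based village classification against the surveillance truth. A small self-contained sketch of that evaluation step (toy labels, not TCDC data):

```python
def confusion_counts(predicted, truth):
    """Count TP/FP/TN/FN; True means a village flagged/confirmed as epidemic."""
    tp = fp = tn = fn = 0
    for p, t in zip(predicted, truth):
        if p and t:
            tp += 1
        elif p and not t:
            fp += 1
        elif t:
            fn += 1
        else:
            tn += 1
    return tp, fp, tn, fn

def sensitivity_specificity(predicted, truth):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp, fp, tn, fn = confusion_counts(predicted, truth)
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 6 villages, 4 classified correctly by the scan.
scan_flags = [True, True, False, False, True, False]
truth = [True, False, True, False, True, False]
sens, spec = sensitivity_specificity(scan_flags, truth)
```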
Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel
2017-09-01
In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG) have been recently applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be a very efficient and low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Zielinski, Jerzy S.
The dramatic increase in the number and volume of digital images produced in medical diagnostics, the escalating demand for rapid access to these relevant medical data, and the need for interpretation and retrieval have become of paramount importance to a modern healthcare system. Therefore, there is an ever-growing need for processed, interpreted and saved images of various types. Due to the high cost and unreliability of human-dependent image analysis, it is necessary to develop an automated method for feature extraction, using sophisticated mathematical algorithms and reasoning. This work is focused on digital image signal processing of biological and biomedical data in one-, two- and three-dimensional space. Methods and algorithms presented in this work were used to acquire data from genomic sequences, breast cancer, and biofilm images. One-dimensional analysis was applied to DNA sequences, which were presented as a non-stationary sequence and modeled by a time-dependent autoregressive moving average (TD-ARMA) model. Two-dimensional analyses used a 2D-ARMA model and applied it to detect breast cancer from X-ray mammograms or ultrasound images. Three-dimensional detection and classification techniques were applied to biofilm images acquired using confocal laser scanning microscopy. Modern medical images are geometrically arranged arrays of data. The broadening scope of imaging as a way to organize our observations of the biophysical world has led to a dramatic increase in our ability to apply new processing techniques and to combine multiple channels of data into sophisticated and complex mathematical models of physiological function and dysfunction. With the explosion of the amount of data produced in the field of biomedicine, it is crucial to be able to construct accurate mathematical models of the data at hand. The two main purposes of signal modeling are data size conservation and parameter extraction. Specifically, in biomedical imaging we have four key problems
Energy Technology Data Exchange (ETDEWEB)
Roman, Frida, E-mail: roman@mmt.upc.edu [Laboratori de Termodinamica, Departament de Maquines i Motors Termics, ETSEIAT, Universitat Politecnica de Catalunya, Carrer Colom 11, 08222 Terrassa (Spain); Calventus, Yolanda, E-mail: calventus@mmt.upc.edu [Laboratori de Termodinamica, Departament de Maquines i Motors Termics, ETSEIAT, Universitat Politecnica de Catalunya, Carrer Colom 11, 08222 Terrassa (Spain); Colomer, Pere, E-mail: colomer@mmt.upc.edu [Laboratori de Termodinamica, Departament de Maquines i Motors Termics, ETSEIAT, Universitat Politecnica de Catalunya, Carrer Colom 11, 08222 Terrassa (Spain); Hutchinson, John M., E-mail: hutchinson@mmt.upc.edu [Laboratori de Termodinamica, Departament de Maquines i Motors Termics, ETSEIAT, Universitat Politecnica de Catalunya, Carrer Colom 11, 08222 Terrassa (Spain)
2012-08-10
Highlights: ► Comparison of DSC and DRS in the cure of epoxy nanocomposites. ► Dependence of exfoliation of nanocomposite on clay content. ► Anionically initiated homopolymerisation in PLS nanocomposites. - Abstract: The effect of nanoclay on the non-isothermal cure kinetics of polymer layered silicate nanocomposites based upon epoxy resin is studied by calorimetric techniques (DSC and TGA) and by dielectric relaxation spectroscopy (DRS) in non-isothermal cure at constant heating rate. The cure process takes place by homopolymerisation, initiated anionically using 3 wt% dimethylaminopyridine (DMAP), and the influence of the nanoclay content has been analysed. Interesting differences are observed between the nanocomposites with 2 wt% and 5 wt% clay content. At low heating rates, these samples vitrify and then devitrify during the cure. For the sample with 2 wt% clay, the devitrification is accompanied by a thermally initiated homopolymerisation, which can be identified by DRS but not by DSC. The effect of this is to improve the exfoliation of the nanocomposite with 2 wt% clay, as verified by transmission electron microscopy, with a corresponding increase in the glass transition temperature. These observations are interpreted in respect of the nanocomposite preparation method and the cure kinetics.
International Nuclear Information System (INIS)
Román, Frida; Calventus, Yolanda; Colomer, Pere; Hutchinson, John M.
2012-01-01
Highlights: ► Comparison of DSC and DRS in the cure of epoxy nanocomposites. ► Dependence of exfoliation of nanocomposite on clay content. ► Anionically initiated homopolymerisation in PLS nanocomposites. - Abstract: The effect of nanoclay on the non-isothermal cure kinetics of polymer layered silicate nanocomposites based upon epoxy resin is studied by calorimetric techniques (DSC and TGA) and by dielectric relaxation spectroscopy (DRS) in non-isothermal cure at constant heating rate. The cure process takes place by homopolymerisation, initiated anionically using 3 wt% dimethylaminopyridine (DMAP), and the influence of the nanoclay content has been analysed. Interesting differences are observed between the nanocomposites with 2 wt% and 5 wt% clay content. At low heating rates, these samples vitrify and then devitrify during the cure. For the sample with 2 wt% clay, the devitrification is accompanied by a thermally initiated homopolymerisation, which can be identified by DRS but not by DSC. The effect of this is to improve the exfoliation of the nanocomposite with 2 wt% clay, as verified by transmission electron microscopy, with a corresponding increase in the glass transition temperature. These observations are interpreted in respect of the nanocomposite preparation method and the cure kinetics.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
Van Humbeeck, J.; Planes, A.
1996-01-01
Experimentally, two distinct classes of martensitic transformations are considered: athermal and isothermal. In the former class, on cooling, at some well-defined start temperature (Ms), isolated small regions of the martensitic product begin to appear in the parent phase. The transformation at any temperature appears to be instantaneous on practical time scales, and the amount of transformed material (x) does not depend on time, i.e., it increases at each step of lowering temperature. The transition is not completed until the temperature is lowered below Mf (the martensite finish temperature). The transformation temperatures are determined only by chemical (composition and degree of order) and microstructural factors. The external controlling parameter (T or applied stress) determines the free energy difference between the high and the low temperature phases, which provides the driving force for the transition. In the development of athermal martensite, activation kinetics is secondary. Athermal martensite, as observed in the well-known shape memory alloys Cu-Zn-Al, Cu-Al-Ni and Ni-Ti, cannot be attributed to a thermally activated mechanism for which kinetics are generally described by the Arrhenius rate equation. However, the latter has been applied by Lipe and Morris to results for the martensitic transformation of Cu-Al-Ni-B-Mn obtained by conventional Differential Scanning Calorimetry (DSC). It is the concern of the authors of this letter to point out the incongruences arising from the analysis of calorimetric results, corresponding to forward and reverse thermoelastic martensitic transformations, in terms of standard kinetic analysis based on the Arrhenius rate equation.
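For reference, the Arrhenius rate equation at issue in the letter is k = A·exp(−Ea/RT): a thermally activated rate that rises steeply with temperature, which is precisely the behavior athermal martensite does not show. A minimal sketch (the pre-exponential factor and activation energy below are illustrative, not values from the cited work):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(A, Ea, T):
    """Thermally activated rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

# Hypothetical A (1/s) and Ea (J/mol): the rate grows strongly with T,
# unlike an athermal transformation, whose extent tracks temperature only.
k_300 = arrhenius_rate(1e13, 8.0e4, 300.0)
k_400 = arrhenius_rate(1e13, 8.0e4, 400.0)
```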
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2017-07-01
This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
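A much-reduced sketch of the continuous voting idea described above: votes (here one-dimensional parameter samples, e.g. candidate cylinder radii) are accumulated in a Gaussian kernel density estimator, and candidate shapes are read off as local maxima of the density. The real method votes in a multi-dimensional cylinder-parameter space with data-driven bandwidth selection; all numbers below are illustrative.

```python
import math

def kde(votes, bandwidth):
    """Gaussian kernel density estimator over a list of 1-D votes."""
    def density(x):
        norm = len(votes) * bandwidth * math.sqrt(2 * math.pi)
        return sum(math.exp(-0.5 * ((x - v) / bandwidth) ** 2) for v in votes) / norm
    return density

def local_maxima(density, grid):
    """Grid points that are strict local maxima of the density."""
    vals = [density(x) for x in grid]
    return [grid[i] for i in range(1, len(grid) - 1)
            if vals[i] > vals[i - 1] and vals[i] > vals[i + 1]]

# Two clusters of radius votes -> two detected candidate radii.
votes = [0.20, 0.21, 0.22, 0.53, 0.55, 0.56]
grid = [i / 100 for i in range(101)]
peaks = local_maxima(kde(votes, 0.03), grid)
```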
Directory of Open Access Journals (Sweden)
Erfan Ayubi
2017-05-01
Full Text Available OBJECTIVES The aim of this study was to explore the spatial pattern of female breast cancer (BC) incidence at the neighborhood level in Tehran, Iran. METHODS The present study included all registered incident cases of female BC from March 2008 to March 2011. The raw standardized incidence ratio (SIR) of BC for each neighborhood was estimated by comparing observed cases relative to expected cases. The estimated raw SIRs were smoothed by a Besag, York, and Mollie spatial model and the spatial empirical Bayesian method. The purely spatial scan statistic was used to identify spatial clusters. RESULTS There were 4,175 incident BC cases in the study area from 2008 to 2011, of which 3,080 were successfully geocoded to the neighborhood level. Higher than expected rates of BC were found in neighborhoods located in northern and central Tehran, whereas lower rates appeared in southern areas. The most likely cluster of higher than expected BC incidence involved neighborhoods in districts 3 and 6, with an observed-to-expected ratio of 3.92 (p<0.001), whereas the most likely cluster of lower than expected rates involved neighborhoods in districts 17, 18, and 19, with an observed-to-expected ratio of 0.05 (p<0.001). CONCLUSIONS Neighborhood-level inequality in the incidence of BC exists in Tehran. These findings can serve as a basis for resource allocation and preventive strategies in at-risk areas.
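The raw SIR used above is simply observed over expected cases, with the expectation derived from the overall rate and the neighborhood population. A hedged sketch of that calculation (toy numbers, not Tehran data):

```python
def expected_cases(population, overall_rate):
    """Expected case count under the null: population * overall incidence rate."""
    return population * overall_rate

def standardized_incidence_ratio(observed, population, overall_rate):
    """Raw SIR = observed / expected; values >1 mean more cases than expected."""
    return observed / expected_cases(population, overall_rate)

# A neighborhood of 1,000 women with 8 observed cases, against an overall
# rate of 4 cases per 1,000, has SIR = 2: twice the expected burden.
sir = standardized_incidence_ratio(8, 1000, 0.004)
```

Smoothing (Besag-York-Mollie or empirical Bayes) then shrinks these raw ratios toward their neighbors to stabilize small-population estimates.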
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
Bonetti, R.; Milazzo, L.C.; Melanotte, M.
1983-01-01
A number of (p,n), (n,p), and (3He,p) reactions have been interpreted on the basis of the statistical multistep compound emission mechanism. Good agreement with experiment is found both in spectrum shape and in the value of the coherence widths.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
Directory of Open Access Journals (Sweden)
Kevin Walsh
2016-05-01
Full Text Available The Abri Faravel, discovered in 2010 at 2,133 m asl in the Parc National des Ecrins, Freissinières, Southern French Alps, is probably the most enigmatic high-altitude site in the Alps. This rock shelter saw phases of human activity from the Mesolithic through to the medieval period; the artefactual assemblages comprise Mesolithic and Neolithic flint tools, Iron Age hand-thrown pottery, a Roman fibula and some medieval metalwork. However, the most interesting and unique features on the site are the prehistoric rock paintings: the highest representations of animals (quadrupeds) in Europe. These paintings are presented in this article. The paintings themselves were the object of a white-light scan, whilst the rock shelter and surrounding landscape were scanned using a Faro laser scanner. Both of these models are presented here, and their interpretation is elucidated by an assessment of the different phases of activity at the shelter, combined with a synthesis of other evidence from the area and pertinent environmental evidence.
DEFF Research Database (Denmark)
Denwood, M.J.; McKendrick, I.J.; Matthews, L.
Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has...... been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power...... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple...
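The observed efficacy that the framework tests against its two null hypotheses is, at its simplest, the faecal egg count reduction. A toy sketch of that summary (the method itself uses four summary statistics and a dedicated statistical test, which are not reproduced here):

```python
def fecrt_efficacy(pre_counts, post_counts):
    """Observed FECR efficacy (%) = 100 * (1 - mean(post) / mean(pre))."""
    mean = lambda xs: sum(xs) / len(xs)
    return 100.0 * (1.0 - mean(post_counts) / mean(pre_counts))

# Toy egg counts: treatment drops the mean from 100 to 5 -> ~95% efficacy,
# which would then be compared against the expected efficacy of the drug.
efficacy = fecrt_efficacy([100, 100, 120, 80], [5, 5, 6, 4])
```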
Azad Henareh Khalyani; William A. Gould; Eric Harmsen; Adam Terando; Maya Quinones; Jaime A. Collazo
2016-01-01
statistically downscaled general circulation models (GCMs) taking Puerto Rico as a test case. Two model selection/model averaging strategies were used: the average of all available GCMs and the average of the models that are able to...
Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios
2005-01-01
Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...
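A common entry point to MSPC (a generic sketch, not taken from the paper) is Hotelling's T² statistic, which flags observations that are jointly unusual even when each variable is individually within its univariate limits. A minimal bivariate version with a hard-coded 2×2 covariance inverse:

```python
def hotelling_t2(x, mean, cov):
    """T^2 = (x - m)^T S^{-1} (x - m) for a 2x2 covariance matrix S."""
    dx = (x[0] - mean[0], x[1] - mean[1])
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = ((cov[1][1] / det, -cov[0][1] / det),
           (-cov[1][0] / det, cov[0][0] / det))
    return (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
            + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))

# Two positively correlated characteristics: the point (1, -1) breaks the
# correlation structure and scores far higher than (1, 1), even though both
# points are equally far from the mean coordinate-wise.
cov = ((1.0, 0.9), (0.9, 1.0))
t2_conforming = hotelling_t2((1.0, 1.0), (0.0, 0.0), cov)
t2_outlier = hotelling_t2((1.0, -1.0), (0.0, 0.0), cov)
```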
Directory of Open Access Journals (Sweden)
Takahiro Kawabe
2013-09-01
Full Text Available Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent’s action, and thus humans should also be able to acquire the statistical features between an action and its external outcome. We report that the acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating the optimal acquisition of the action-effect statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced. The Bayesian model again accounted for the attenuated PSE shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated; however, the Bayesian model could not account for the shifts. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-effect causal relation.
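For orientation only: the simplest Bayesian building block in this kind of modelling is precision-weighted fusion of a sensory measurement with a learned prior. Note that this generic form predicts attraction toward the prior mean; the authors' model, which accounts for the repulsive shifts reported above, is more elaborate and is not reproduced here.

```python
def bayesian_estimate(sensory, sensory_var, prior_mean, prior_var):
    """Posterior mean of a Gaussian prior combined with a Gaussian likelihood:
    a precision-weighted average of the sensory sample and the prior mean."""
    w = (1.0 / sensory_var) / (1.0 / sensory_var + 1.0 / prior_var)
    return w * sensory + (1.0 - w) * prior_mean

# Equally reliable cue and prior -> the estimate lands halfway between them.
estimate = bayesian_estimate(10.0, 1.0, 0.0, 1.0)
```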
International Nuclear Information System (INIS)
Shapiro, B.
1986-01-01
Radionuclide scanning is the production of images of normal and diseased tissues and organs by means of the gamma-ray emissions from radiopharmaceutical agents having specific distributions in the body. The gamma rays are detected at the body surface by a variety of instruments that convert the invisible rays into visible patterns representing the distribution of the radionuclide in the body. The patterns, or images, obtained can be interpreted to provide or to aid diagnoses, to follow the course of disease, and to monitor the management of various illnesses. Scanning is a sensitive technique, but its specificity may be low when interpreted alone. To be used most successfully, radionuclide scanning must be interpreted in conjunction with other techniques, such as bone radiographs with bone scans, chest radiographs with lung scans, and ultrasonic studies with thyroid scans. Interpretation is also enhanced by providing pertinent clinical information because the distribution of radiopharmaceutical agents can be altered by drugs and by various procedures besides physiologic and pathologic conditions. Discussion of the patient with the radionuclide scanning specialist prior to the study and review of the results with that specialist after the study are beneficial
Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong
2013-01-01
As a useful tool for the geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function offers an alternative, indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's method, we adopt a Monte Carlo test for significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistic for clusters of high population density or large size; otherwise, Kulldorff's statistic is superior.
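The Monte Carlo significance test shared by Kulldorff's method and the hypergeometric variant can be sketched in a few lines. This is a toy illustration with assumed inputs, not either paper's implementation: the window statistic here is a simple maximum cell count standing in for the likelihood-ratio statistic, and cases are redistributed uniformly under the null.

```python
import random

def monte_carlo_p(observed_stat, simulate_stat, n_reps=999, seed=0):
    """Monte Carlo p-value: rank of the observed scan statistic among
    statistics computed on datasets simulated under the null hypothesis."""
    rng = random.Random(seed)
    hits = sum(simulate_stat(rng) >= observed_stat for _ in range(n_reps))
    return (hits + 1) / (n_reps + 1)

def simulate_stat(rng, n_cases=100, n_windows=10):
    """Toy null model: drop a fixed case total uniformly into 10 candidate
    windows and return the maximum window count (a crude stand-in for the
    maximized window likelihood ratio)."""
    counts = [0] * n_windows
    for _ in range(n_cases):
        counts[rng.randrange(n_windows)] += 1
    return max(counts)

p = monte_carlo_p(observed_stat=25, simulate_stat=simulate_stat)
```

An observed window with 25 of 100 cases in one of ten windows far exceeds the null maxima, so the Monte Carlo rank yields a small p-value.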
Link, J; Pachaly, J
1975-08-01
In a retrospective 18-month study, the infusion therapy applied in a large anesthesia institute was examined. The anesthesia course data routinely recorded on magnetic tape were analysed for this purpose by computer with the statistical program SPSS. It could be shown that the behaviour of individual anesthetists differs considerably. Various correlations are discussed.
Directory of Open Access Journals (Sweden)
Brayan Alexander Fonseca Martinez
2017-11-01
One of the most common observational study designs employed in veterinary science is the cross-sectional study with binary outcomes. To measure an association with exposure, prevalence ratios (PR) or odds ratios (OR) can be used. In human epidemiology, much has been discussed about reserving the OR exclusively for case-control studies, and some authors report that there is no good justification for fitting logistic regression when the prevalence of the disease is high, in which case the OR overestimates the PR. Moreover, interpretation of the OR is difficult, since confusing risk with odds can lead to incorrect quantitative interpretation of data, such as "the risk is X times greater," commonly reported in studies that use the OR. The aims of this study were (1) to review articles with cross-sectional designs to assess the statistical method used and the appropriateness of the interpretation of the estimated measure of association and (2) to illustrate the use of alternative statistical methods that estimate the PR directly. An overview of statistical methods and their interpretation, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, was conducted across a diverse set of peer-reviewed journals in the veterinary sciences, using PubMed as the search engine. From each article, the statistical method used and the appropriateness of the interpretation of the estimated measure of association were recorded. Additionally, four alternative models to logistic regression that estimate the PR directly were tested on our own dataset from a cross-sectional study on bovine viral diarrhea virus. The initial search strategy found 62 articles; 6 were excluded, leaving 56 studies for the overall analysis. The review showed that, independent of the level of prevalence reported, 96% of articles employed logistic regression, thus estimating the OR. Results of the multivariate models
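The point that the OR overestimates the PR at high prevalence can be seen with simple arithmetic on a hypothetical 2x2 table from a cross-sectional study (the counts below are illustrative, not from the bovine viral diarrhea dataset):

```python
# Hypothetical cross-sectional 2x2 table with high disease prevalence:
#                          diseased  healthy
exposed_d, exposed_h     = 60, 40
unexposed_d, unexposed_h = 30, 70

prev_exposed = exposed_d / (exposed_d + exposed_h)        # 0.60
prev_unexposed = unexposed_d / (unexposed_d + unexposed_h)  # 0.30

PR = prev_exposed / prev_unexposed                          # 2.0
OR = (exposed_d * unexposed_h) / (exposed_h * unexposed_d)  # 4200/1200 = 3.5
```

Here the PR says exposed animals are twice as likely to be diseased, while the OR of 3.5, if misread as a risk ratio, would exaggerate the association by 75%; at low prevalence the two measures converge.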
Directory of Open Access Journals (Sweden)
David Smith
2013-03-01
Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied to spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method: the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely related signals with overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key implementation decisions relate to defining exposures and outcomes and to informed choice of signaling thresholds.
International Nuclear Information System (INIS)
Podorozhnyi, D.M.; Postnikov, E.B.; Sveshnikova, L.G.; Turundaevsky, A.N.
2005-01-01
A multivariate statistical procedure for solving problems of estimating physical parameters on the basis of data from measurements with multichannel equipment is described. Within the multivariate procedure, an algorithm is constructed for estimating the energy of primary cosmic rays and the exponent in their power-law spectrum. They are investigated by using the KLEM spectrometer (NUCLEON project) as a specific example of measuring equipment. The results of computer experiments simulating the operation of the multivariate procedure for this equipment are given, the proposed approach being compared in these experiments with the one-parameter approach presently used in data processing
Nash, J. Thomas; Frishman, David
1983-01-01
Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, and Yb have a positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have a negative association. For most lithologic subsets, Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks it requires methods that are expensive and not amenable to routine use.
Kennedy, R R; Merry, A F
2011-09-01
Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has previously been demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model and updated every second; after a period of stability, a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five-level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles; 95% confidence interval 0.4 to 2.5 cycles; P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%], without trend detection 100% [98 to 100%]; P = 0.8). We conclude that simple statistical trend detection may speed detection of changes during routine anaesthesia, even when a graphical trend display is present.
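Trigg's Tracking Variable is commonly formulated as the ratio of the exponentially smoothed forecast error to the exponentially smoothed absolute error, giving a signal in [-1, 1] whose magnitude approaches 1 under a sustained change. A minimal sketch (the smoothing constant and input values are illustrative, not those used in the study):

```python
class TriggTracker:
    """Trigg's tracking signal: smoothed forecast error divided by smoothed
    absolute error. Values near 0 indicate stability; |signal| near 1
    indicates a sustained, one-sided change in the monitored variable."""

    def __init__(self, alpha=0.2, initial=0.0):
        self.alpha = alpha
        self.forecast = initial  # exponentially smoothed estimate of the variable
        self.err = 0.0           # smoothed signed forecast error
        self.mad = 1e-9          # smoothed absolute error (tiny seed avoids 0/0)

    def update(self, x):
        e = x - self.forecast
        a = self.alpha
        self.err = a * e + (1 - a) * self.err
        self.mad = a * abs(e) + (1 - a) * self.mad
        self.forecast += a * e   # simple exponential smoothing of the level
        return self.err / self.mad

t = TriggTracker(initial=100.0)
stable = [t.update(100.0) for _ in range(20)][-1]   # flat signal -> 0
shifted = [t.update(120.0) for _ in range(10)][-1]  # step change -> near 1
```

A five-level display like the one studied could simply bin |signal| into, say, 0.2-wide bands.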
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-02
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
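The expected-by-chance arithmetic can be checked directly. The sketch below assumes independent tests operating exactly at the nominal level, which, as the authors note, is not true of the NTP's actual analyses; it merely reproduces Gaus' 4800 × 0.05 calculation and shows that 209 significant results is, if anything, below the chance expectation under those assumptions:

```python
from math import lgamma, log, exp

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), summed in log space so the huge
    binomial coefficients and tiny probabilities do not overflow/underflow."""
    def log_pmf(i):
        return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
                + i * log(p) + (n - i) * log(1 - p))
    return sum(exp(log_pmf(i)) for i in range(k, n + 1))

n_tests, alpha = 4800, 0.05
expected = n_tests * alpha                     # 240 significant by chance alone
p_at_least_209 = binom_sf(209, n_tests, alpha)  # probability of 209 or more
```

Under these (unrealistic) independence assumptions, seeing at least 209 nominally significant results is highly probable, underscoring that the raw count carries no evidence either way; the striking individual p-values do.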
Evaluation of the ICS and DEW scatter correction methods for low statistical content scans in 3D PET
International Nuclear Information System (INIS)
Sossi, V.; Oakes, T.R.; Ruth, T.J.
1996-01-01
The performance of the Integral Convolution and the Dual Energy Window scatter correction methods in 3D PET has been evaluated over a wide range of statistical content of acquired data (1M to 400M events). The order in which scatter correction and detector normalization should be applied has also been investigated. Phantom and human neuroreceptor studies were used with the following figures of merit: axial and radial uniformity, sinogram and image noise, contrast accuracy and contrast accuracy uniformity. Both scatter correction methods perform reliably in the range of numbers of events examined. Normalization applied after scatter correction yields better radial uniformity and fewer image artifacts.
Azahari Razak, Khamarrul; Straatsma, Menno; van Westen, Cees; Malet, Jean-Philippe; de Jong, Steven M.
2010-05-01
Airborne Laser Scanning (ALS) is the state-of-the-art technology for topographic mapping over a wide variety of spatial and temporal scales. It is also a promising technique for the identification and mapping of landslides in a forested mountainous landscape, since it can pass through gaps in the forest foliage and record the terrain height under vegetation cover. To date, most images derived from satellite imagery, aerial photographs or synthetic aperture radar are not appropriate for visual interpretation of landslide features covered by dense vegetation. However, careful landslide mapping is necessary to understand landslide processes and is essential for landslide hazard and risk assessment. This research demonstrates the capabilities of high-resolution ALS data to recognize and identify different types of landslides in mixed forest in Barcelonnette, France, and tropical rainforest in the Cameron Highlands, Malaysia. ALS measurements over the 100-year-old forest in the Bois Noir catchment were carried out in 2007 and 2009. Both ALS datasets were captured using a Riegl laser scanner. First and last pulses with a density of one point per square meter were derived from the 2007 ALS dataset, whereas multiple returns (of up to five per pulse) were derived from the July 2009 ALS dataset, which consists of 60 points per square meter over forested terrain. Generally, this catchment is highly affected by shallow landslides, which mostly occur beneath dense vegetation. It is located in the dry intra-Alpine zone and is representative of the climate of the South French Alps. In the Cameron Highlands, first and last pulse data were captured in 2004, covering an area of up to 300 square kilometres. Here, an Optech laser scanner with a relatively low point density was used under the Malaysian national pilot study. With precipitation intensities of up to 3000 mm per year over rugged topography and elevations up to 2800 m a
Energy Technology Data Exchange (ETDEWEB)
Serhal, M.; Dordea, M.; Cymbalista, M. [Hopital de Montfermeil, Service de Radiologie, 93 - Montfermeil (France); Halimi, P. [Hopital Europeen Georges-Pompidou, Service de Radiologie, 75 - Paris (France); Iffenecker, C. [Clinique Radiologique, 62 - Boulogne sur Mer (France); Bensimon, J.L
2003-02-01
The accurate description of bony changes in ear CT scans has great diagnostic and therapeutic impact. The third part shows how to analyze bone remodeling when a CT scan is performed for tumors in the vicinity of the temporal bone, for intratemporal lesions of the facial nerve and for external auditory canal malformations. It demonstrates how bony analysis should be included in the postoperative report of an ear CT scan. The importance of bony signs in tumors and pseudotumors of the inner ear is outlined. (authors)
IMANISHI, M.; NEWTON, A. E.; VIEIRA, A. R.; GONZALEZ-AVILES, G.; KENDALL SCOTT, M. E.; MANIKONDA, K.; MAXWELL, T. N.; HALPIN, J. L.; FREEMAN, M. M.; MEDALLA, F.; AYERS, T. L.; DERADO, G.; MAHON, B. E.; MINTZ, E. D.
2016-01-01
Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space–time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space–time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space–time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection. PMID:25427666
Yih, W Katherine; Maro, Judith C; Nguyen, Michael; Baker, Meghan A; Balsbaugh, Carolyn; Cole, David V; Dashevsky, Inna; Mba-Jonas, Adamma; Kulldorff, Martin
2018-06-01
The self-controlled tree-temporal scan statistic, a new signal-detection method, can evaluate whether any of a wide variety of health outcomes are temporally associated with receipt of a specific vaccine, while adjusting for multiple testing. Neither the health outcomes nor the postvaccination potential periods of increased risk need be prespecified. Using US medical claims data in the Food and Drug Administration's Sentinel system, we employed the method to evaluate adverse events occurring after receipt of quadrivalent human papillomavirus vaccine (4vHPV). Incident outcomes recorded in emergency department or inpatient settings within 56 days after first doses of 4vHPV received by 9- through 26.9-year-olds in 2006-2014 were identified using International Classification of Diseases, Ninth Revision, diagnosis codes and analyzed by pairing the new method with a standard hierarchical classification of diagnoses. On scanning diagnoses of 1.9 million 4vHPV recipients, 2 statistically significant categories of adverse events were found: cellulitis on days 2-3 after vaccination and "other complications of surgical and medical procedures" on days 1-3 after vaccination. Cellulitis is a known adverse event. Clinically informed investigation of electronic claims records of the patients with "other complications" did not suggest any previously unknown vaccine safety problem. Considering that thousands of potential short-term adverse events and hundreds of potential risk intervals were evaluated, these findings add significantly to the growing safety record of 4vHPV.
Energy Technology Data Exchange (ETDEWEB)
Wjihi, Sarra [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Dhaou, Houcine [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Yahia, Manel Ben; Knani, Salah [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Jemni, Abdelmajid [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Lamine, Abdelmottaleb Ben, E-mail: abdelmottaleb.benlamine@gmail.com [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia)
2015-12-15
A statistical physics treatment is used to study the desorption of hydrogen from LaNi{sub 4.75}Fe{sub 0.25}, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi{sub 4.75}Fe{sub 0.25} are fitted at three temperatures (293 K, 303 K and 313 K) using a monolayer desorption model. Six parameters of the model are fitted, namely the numbers of molecules per site n{sub α} and n{sub β}, the receptor site densities N{sub αM} and N{sub βM}, and the energetic parameters P{sub α} and P{sub β}. The behaviors of these parameters are discussed in relation to the desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamic potential functions are derived by statistical physics calculations from the adopted model.
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy
2015-10-15
The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
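The Latin Hypercube Sampling step used to select the 50 boundary-condition sets can be sketched in pure Python (a generic implementation on the unit hypercube; scaling each coordinate to the actual design and flow factor ranges is omitted):

```python
import random

def latin_hypercube(n_samples, n_factors, seed=42):
    """Latin hypercube sample on the unit hypercube: each factor's range is
    divided into n_samples equal strata, and each stratum is used exactly
    once per factor, covering the range more evenly than plain random
    sampling."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_factors):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # random pairing of strata across factors
        columns.append([(s + rng.random()) / n_samples for s in strata])
    return [list(point) for point in zip(*columns)]

# e.g., 50 boundary-condition sets over 6 screened design/flow factors
design = latin_hypercube(50, 6)
```

Each row is one CFD boundary-condition set; each column, viewed alone, hits every one of the 50 strata exactly once.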
Rubino, Corrado; Mazzarello, Vittorio; Faenza, Mario; Montella, Andrea; Santanelli, Fabio; Farace, Francesco
2015-06-01
The aim of this study was to evaluate the effects on adipocyte morphology of 2 techniques of fat harvesting and of fat purification in lipofilling, considering that the number of viable healthy adipocytes is important for fat survival in recipient areas of lipofilling. Fat harvesting was performed in 10 female patients from the flanks, on one side with a 2-mm Coleman cannula and on the other side with a 3-mm Mercedes cannula. Thirty milliliters of fat tissue from each side was collected and divided into three 10 mL syringes: A, B, and C. The fat inside syringe A was left untreated, the fat in syringe B underwent simple sedimentation, and the fat inside syringe C underwent centrifugation at 3000 rpm for 3 minutes. Each fat graft specimen was processed for examination under a low-vacuum scanning electron microscope. The diameter (μ), number of adipocytes per square millimeter, and number of altered adipocytes per square millimeter were evaluated. Untreated specimens harvested with the 2 different techniques were compared first; then sedimented versus centrifuged specimens harvested with the same technique were compared. Statistical analysis was performed using the Wilcoxon signed rank test. The number of adipocytes per square millimeter was statistically higher in specimens harvested with the 3-mm Mercedes cannula (P = 0.0310). The number of altered cells was statistically higher in centrifuged than in sedimented specimens with both methods of fat harvesting (P = 0.0080 with the 2-mm Coleman cannula and P = 0.0050 with the 3-mm Mercedes cannula). Alterations in adipocyte morphology consisted of wrinkling of the membrane, opening of pores with leakage of oily material, reduction of cellular diameter, and total collapse of the cellular membrane. Fat harvesting with a 3-mm cannula yields a higher number of adipocytes, and centrifugation of the harvested fat results in a higher number of morphologically altered cells than sedimentation.
Royse, C F; Haji, D L; Faris, J G; Veltman, M G; Kumar, A; Royse, A G
2012-05-01
Limited transthoracic echocardiography performed by treating physicians may facilitate assessment of haemodynamic abnormalities in perioperative and critical care patients. The interpretative skills of one hundred participants who completed an education program in limited transthoracic echocardiography were assessed by having them report five pre-recorded case studies. A high level of agreement with expert interpretation was observed for ventricular volume assessment (left 95%, right 96%), systolic function (left 99%, right 96%), left atrial pressure (96%) and haemodynamic state (97%). The highest rate of failure to report an answer (that is, no answer given) was for right ventricular volume and function. For moderate or severe valve lesions, agreement ranged from 90 to 98%. Overall, participants who completed the educational program showed good agreement with experts in the interpretation of valve and ventricular function.
International Nuclear Information System (INIS)
Wang, Xin Lian; He, Wen; Chen, Jian Hong; Hu, Zhi Hai; Zhao, Li Qin
2015-01-01
To evaluate image quality of female pelvic computed tomography (CT) scans reconstructed with the adaptive statistical iterative reconstruction (ASIR) technique combined with low tube-voltage and to explore the feasibility of its clinical application. Ninety-four patients were divided into two groups. The study group used 100 kVp, and images were reconstructed with 30%, 50%, 70%, and 90% ASIR. The control group used 120 kVp, and images were reconstructed with 30% ASIR. The noise index was 15 for the study group and 11 for the control group. The CT values and noise levels of different tissues were measured. The contrast to noise ratio (CNR) was calculated. A subjective evaluation was carried out by two experienced radiologists. The CT dose index volume (CTDIvol) was recorded. A 44.7% reduction in CTDIvol was observed in the study group (8.18 ± 3.58 mGy) compared with that in the control group (14.78 ± 6.15 mGy). No significant differences were observed in the tissue noise levels and CNR values between the 70% ASIR group and the control group (p = 0.068-1.000). The subjective scores indicated that visibility of small structures, diagnostic confidence, and the overall image quality score in the 70% ASIR group was the best, and were similar to those in the control group (1.87 vs. 1.79, 1.26 vs. 1.28, and 4.53 vs. 4.57; p = 0.122-0.585). No significant difference in diagnostic accuracy was detected between the study group and the control group (42/47 vs. 43/47, p = 1.000). Low tube-voltage combined with automatic tube current modulation and 70% ASIR allowed the low CT radiation dose to be reduced by 44.7% without losing image quality on female pelvic scan
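The contrast-to-noise ratio used above to compare reconstructions is typically computed as the tissue contrast divided by the image noise. One common definition, with hypothetical Hounsfield-unit values (not measurements from this study):

```python
def cnr(roi_mean, background_mean, background_sd):
    """Contrast-to-noise ratio: absolute difference of mean attenuation
    between a region of interest and background, divided by the noise
    (standard deviation) of the background. One common definition;
    variants divide by pooled or ROI noise instead."""
    return abs(roi_mean - background_mean) / background_sd

# Hypothetical values: soft-tissue ROI at 55 HU, background at 35 HU,
# background noise (SD) of 8 HU.
value = cnr(roi_mean=55.0, background_mean=35.0, background_sd=8.0)
```

Iterative reconstruction such as ASIR lowers the noise term, which is why CNR can be preserved at reduced tube voltage and dose.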
Technetium phosphate bone scan in the diagnosis of septic arthritis in childhood
International Nuclear Information System (INIS)
Sundberg, S.B.; Savage, J.P.; Foster, B.K.
1989-01-01
The technetium phosphate bone scans of 106 children with suspected septic arthritis were reviewed to determine whether the bone scan can accurately differentiate septic from nonseptic arthropathy. Only 13% of children with proven septic arthritis had a correct blind scan interpretation. The clinically adjusted interpretation did not identify septic arthritis in 30%. Septic arthritis was incorrectly identified in 32% of children with no evidence of septic arthritis. No statistically significant differences were noted between the scan findings in the septic and nonseptic groups, and no scan findings correlated specifically with the presence or absence of joint sepsis.
Directory of Open Access Journals (Sweden)
Michael Robert Cunningham
2016-10-01
The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and funnel plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect, contrary to their title.
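The PET and PEESE procedures criticized above are, at their core, weighted meta-regressions of effect sizes on standard errors (PET) or sampling variances (PEESE), with the intercept taken as the bias-adjusted effect estimate. A minimal sketch with inverse-variance weights; all effect sizes and standard errors below are invented for illustration, not taken from either meta-analysis:

```python
def wls(x, y, w):
    # Weighted least squares for y = a + b*x; returns (intercept, slope).
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    return ybar - b * xbar, b

# Hypothetical per-study effect sizes and standard errors
d = [0.62, 0.55, 0.48, 0.40, 0.35, 0.30]
se = [0.30, 0.25, 0.20, 0.15, 0.10, 0.05]
w = [1 / s ** 2 for s in se]  # inverse-variance weights

pet_intercept, _ = wls(se, d, w)                    # PET: regress d on SE
peese_intercept, _ = wls([s ** 2 for s in se], d, w)  # PEESE: regress d on SE^2
```

The intercepts estimate the effect a hypothetical infinitely precise study (SE = 0) would show; PET and PEESE generally disagree, which is part of why their untested use drew criticism.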
Walkden, N. R.; Wynn, A.; Militello, F.; Lipschultz, B.; Matthews, G.; Guillemaut, C.; Harrison, J.; Moulton, D.; Contributors, JET
2017-08-01
This paper presents the use of a novel modelling technique based around intermittent transport due to filament motion, to interpret experimental profile and fluctuation data in the scrape-off layer (SOL) of JET during the onset and evolution of a density profile shoulder. A baseline case is established, prior to shoulder formation, and the stochastic model is shown to be capable of simultaneously matching the time averaged profile measurement as well as the PDF shape and autocorrelation function from the ion-saturation current time series at the outer wall. Aspects of the stochastic model are then varied with the aim of producing a profile shoulder with statistical measurements consistent with experiment. This is achieved through a strong localised reduction in the density sink acting on the filaments within the model. The required reduction of the density sink occurs over a highly localised region with the timescale of the density sink increased by a factor of 25. This alone is found to be insufficient to model the expansion and flattening of the shoulder region as the density increases, which requires additional changes within the stochastic model. An example is found which includes both a reduction in the density sink and filament acceleration and provides a consistent match to the experimental data as the shoulder expands, though the uniqueness of this solution cannot be guaranteed. Within the context of the stochastic model, this implies that the localised reduction in the density sink can trigger shoulder formation, but additional physics is required to explain the subsequent evolution of the profile.
Directory of Open Access Journals (Sweden)
Nadja Stumberg
2014-05-01
The vegetation in the forest-tundra ecotone zone is expected to be highly affected by climate change and requires effective monitoring techniques. Airborne laser scanning (ALS) has been proposed as a tool for the detection of small pioneer trees for such vast areas using laser height and intensity data. The main objective of the present study was to assess a possible improvement in the performance of classifying tree and nontree laser echoes from high-density ALS data. The data were collected along a 1000 km long transect stretching from southern to northern Norway. Different geostatistical and statistical measures derived from laser height and intensity values were used to extend and potentially improve simpler models that ignore the spatial context. Generalised linear models (GLM) and support vector machines (SVM) were employed as classification methods. Total accuracies and Cohen's kappa coefficients were calculated and compared to those of simpler models from a previous study. For both classification methods, all models revealed total accuracies similar to the results of the simpler models. Concerning classification performance, however, the comparison of the kappa coefficients indicated a significant improvement for some models both using GLM and SVM, with classification accuracies >94%.
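Cohen's kappa, used above to compare classifier performance, corrects raw accuracy for chance agreement. A minimal sketch with hypothetical tree/nontree echo labels (the counts are invented, not the study's data):

```python
def cohens_kappa(truth, pred):
    # kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    n = len(truth)
    labels = set(truth) | set(pred)
    p_o = sum(t == p for t, p in zip(truth, pred)) / n
    p_e = sum((truth.count(c) / n) * (pred.count(c) / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical classification of 100 laser echoes, 90 of them correct
truth = ["tree"] * 40 + ["nontree"] * 60
pred = ["tree"] * 35 + ["nontree"] * 5 + ["nontree"] * 55 + ["tree"] * 5
kappa = cohens_kappa(truth, pred)  # ~0.79 despite 90% raw accuracy
```

The gap between 0.90 accuracy and kappa ≈ 0.79 illustrates why the study reports both totals and kappa coefficients.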
International Nuclear Information System (INIS)
Zhao Yongxia; Chang Jin; Zuo Ziwei; Zhang Changda; Zhang Tianle
2014-01-01
Objective: To investigate the best weighting of the adaptive statistical iterative reconstruction (ASIR) algorithm and optimized low-dose scanning parameters in thoracic aorta CT angiography (CTA). Methods: A total of 120 patients with a body mass index (BMI) of 19-24 were randomly divided into 6 groups. All patients underwent thoracic aorta CTA with a GE Discovery CT750 HD scanner (scan range 290-330 mm). The default parameters (100 kV, 240 mAs) were applied in Group 1. Reconstructions were performed with different weightings of ASIR (10%-100% in 10% steps), and the signal to noise ratio (S/N) and contrast to noise ratio (C/N) of the images were calculated. The image series were evaluated by 2 independent radiologists on a 5-point scale, and the best weighting was determined. The mAs in Groups 2-6 were then set to 210, 180, 150, 120 and 90 at 100 kV. The CTDIvol and DLP of every scan series were recorded and the effective dose (E) was calculated. The S/N and C/N were calculated and the image quality was assessed by two radiologists. Results: The best weighting of ASIR was 60% at 100 kV and 240 mAs. Under 60% ASIR and 100 kV, the image quality scores from 240 mAs to 90 mAs were (4.78±0.30)-(3.15±0.23). The CTDIvol and DLP were 12.64-4.41 mGy and 331.81-128.27 mGy·cm, and E was 4.98-1.92 mSv. The image qualities among Groups 1-5 were not significantly different (F = 5.365, P > 0.05), but the CTDIvol and DLP of Group 5 were reduced by 37.0% and 36.9%, respectively, compared with Group 1. Conclusions: In thoracic aorta CTA, the best weighting of ASIR is 60%, and 120 mAs is the optimal tube current at 100 kV in patients with BMI 19-24. (authors)
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
Austin, Peter C; Steyerberg, Ewout W
2012-06-20
When outcomes are binary, the c-statistic (equivalent to the area under the receiver operating characteristic curve) is a standard measure of the predictive accuracy of a logistic regression model. An analytical expression for the c-statistic was derived under the assumption that a continuous explanatory variable follows a normal distribution in those with and without the condition. We then conducted an extensive set of Monte Carlo simulations to examine whether the expressions derived under the assumption of binormality allowed for accurate prediction of the empirical c-statistic when the explanatory variable followed a normal distribution in the combined sample of those with and without the condition. We also examined the accuracy of the predicted c-statistic when the explanatory variable followed a gamma, log-normal or uniform distribution in the combined sample of those with and without the condition. Under the assumption of binormality with equality of variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the product of the standard deviation of the normal components (reflecting more heterogeneity) and the log-odds ratio (reflecting larger effects). Under the assumption of binormality with unequal variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the standardized difference of the explanatory variable in those with and without the condition. In our Monte Carlo simulations, we found that these expressions allowed for reasonably accurate prediction of the empirical c-statistic when the distribution of the explanatory variable was normal, gamma, log-normal, and uniform in the entire sample of those with and without the condition. The discriminative ability of a continuous explanatory variable cannot be judged by its odds ratio alone, but always needs to be considered in relation to the heterogeneity of the population.
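The equal-variance binormal result described above (the c-statistic is a standard normal CDF evaluated at the product of the predictor's standard deviation and the log-odds ratio, scaled by 1/√2) can be checked with a small simulation. This is a sketch, not the authors' code; the σ and log-odds-ratio values are arbitrary:

```python
import math
import random

def phi(x):
    # Standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def predicted_c(sigma, log_or):
    # Binormal, equal variances: c = Phi(sigma * log-odds-ratio / sqrt(2))
    return phi(sigma * log_or / math.sqrt(2.0))

def empirical_c(cases, controls):
    # AUC via the rank-sum (Mann-Whitney U) statistic
    pooled = sorted((v, g) for g, vals in ((1, cases), (0, controls)) for v in vals)
    rank_sum = sum(i + 1 for i, (_, g) in enumerate(pooled) if g == 1)
    n1, n0 = len(cases), len(controls)
    return (rank_sum - n1 * (n1 + 1) / 2) / (n1 * n0)

rng = random.Random(42)
sigma, log_or = 1.0, 1.5
delta = log_or * sigma ** 2  # mean shift implied by the log-odds ratio
controls = [rng.gauss(0.0, sigma) for _ in range(5000)]
cases = [rng.gauss(delta, sigma) for _ in range(5000)]
pc = predicted_c(sigma, log_or)  # analytic prediction, ~0.856
ec = empirical_c(cases, controls)  # simulated c-statistic, close to pc
```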
Interpretation of computed tomographic images
International Nuclear Information System (INIS)
Stickle, R.L.; Hathcock, J.T.
1993-01-01
This article discusses the production of optimal CT images in small animal patients as well as principles of radiographic interpretation. Technical factors affecting image quality and aiding image interpretation are included. Specific considerations for scanning various anatomic areas are given, including indications and potential pitfalls. Principles of radiographic interpretation are discussed. Selected patient images are illustrated
Austin, Peter C
2008-09-01
Propensity-score matching is frequently used in the cardiology literature. Recent systematic reviews have found that this method is, in general, poorly implemented in the medical literature. The study objective was to examine the quality of the implementation of propensity-score matching in the general cardiology literature. A total of 44 articles published in the American Heart Journal, the American Journal of Cardiology, Circulation, the European Heart Journal, Heart, the International Journal of Cardiology, and the Journal of the American College of Cardiology between January 1, 2004, and December 31, 2006, were examined. Twenty of the 44 studies did not provide adequate information on how the propensity-score-matched pairs were formed. Fourteen studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. Only 4 studies explicitly used statistical methods appropriate for matched studies to compare baseline characteristics between treated and untreated subjects. Only 11 (25%) of the 44 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Only 2 studies described the matching method used, assessed balance in baseline covariates by appropriate methods, and used appropriate statistical methods to estimate the treatment effect and its significance. Application of propensity-score matching was poor in the cardiology literature. Suggestions for improving the reporting and analysis of studies that use propensity-score matching are provided.
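Propensity-score matching as discussed above is commonly implemented as greedy 1:1 nearest-neighbour matching within a caliper; the abstract's central complaint is that the subsequent analysis must then use matched-pair methods. A minimal sketch under those assumptions; the subject IDs, scores, and caliper are hypothetical, and this is not the specific algorithm of any reviewed study:

```python
def greedy_match(treated, untreated, caliper=0.05):
    # Greedy 1:1 nearest-neighbour matching on the propensity score,
    # within a maximum allowed score distance (the caliper).
    pairs, used = [], set()
    for tid, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        best_id, best_dist = None, caliper
        for uid, u_ps in untreated.items():
            if uid in used:
                continue
            dist = abs(t_ps - u_ps)
            if dist <= best_dist:
                best_id, best_dist = uid, dist
        if best_id is not None:
            used.add(best_id)
            pairs.append((tid, best_id))
    return pairs

# Hypothetical propensity scores
treated = {"t1": 0.30, "t2": 0.50}
untreated = {"u1": 0.28, "u2": 0.52, "u3": 0.90}
pairs = greedy_match(treated, untreated)  # [("t1", "u1"), ("t2", "u2")]
```

After matching, outcomes should be compared with paired methods (e.g., a paired t-test or McNemar's test), which is precisely the step the review found missing in most articles.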
DEFF Research Database (Denmark)
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat
2015-01-01
using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor...... of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii) assessment of modelling the onset of transient and compression settling. Furthermore, the optimal level of model discretization...
International Nuclear Information System (INIS)
Famy, C.; Brough, A.R.; Taylor, H.F.W.
2003-01-01
Scanning electron microscopy (SEM) microanalyses of the calcium-silicate-hydrate (C-S-H) gel in Portland cement pastes rarely represent single phases. Essential experimental requirements are summarised and new procedures for interpreting the data are described. These include, notably, plots of Si/Ca against other atom ratios, 3D plots to allow three such ratios to be correlated, and solution of linear simultaneous equations to test and quantify hypotheses regarding the phases contributing to individual microanalyses. Application of these methods to the C-S-H gel of a 1-day-old mortar identified a phase with Al/Ca=0.67 and S/Ca=0.33, which we consider to be a highly substituted ettringite of probable composition C6A2S̄2H34 or {Ca6[Al(OH)6]2·24H2O}(SO4)2[Al(OH)4]2. If this is true for Portland cements in general, it might explain observed discrepancies between observed and calculated aluminate concentrations in the pore solution. The C-S-H gel of a similar mortar aged 600 days contained unsubstituted ettringite and an AFm phase with S/Ca=0.125.
2011-01-01
Background: The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods: We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results: Statistical measures signal large differences between patient and physician satisfaction ratings, with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions: Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement, augmented by bar charts of differences, was able to indicate this. Trial registration: ISRCTN71348772. PMID:21592337
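The Bland-Altman method highlighted in the conclusions reduces to computing the mean difference (bias) and the 95% limits of agreement. A minimal sketch with invented rating data, not the trial's measurements:

```python
import statistics

def bland_altman(a, b):
    # Bias (mean difference) and 95% limits of agreement: bias +/- 1.96*SD
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical satisfaction ratings for five consultations
patients = [4, 5, 5, 4, 5]
physicians = [4, 4, 5, 3, 5]
bias, (lo, hi) = bland_altman(patients, physicians)
```

Plotting each difference against each pair's mean (the usual Bland-Altman plot, or the bar charts of differences the authors used) then reveals ceiling effects that correlation coefficients hide.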
International Nuclear Information System (INIS)
Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G
2007-01-01
We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from 239Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (239Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of 239Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both.
Nuclear scans use radioactive substances to see structures and functions inside your body. They use a special ... images. Most scans take 20 to 45 minutes. Nuclear scans can help doctors diagnose many conditions, including ...
Tchabo, William; Ma, Yongkun; Kwaw, Emmanuel; Zhang, Haining; Xiao, Lulu; Apaliya, Maurice T
2018-01-15
The four different methods of color measurement of wine proposed by Boulton, Giusti, Glories and the Commission Internationale de l'Eclairage (CIE) were applied to assess the statistical relationship between the phytochemical profile and chromatic characteristics of sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes. The alterations in chromatic properties and phenolic composition of non-thermally aged mulberry wine were examined, aided by the use of Pearson correlation, cluster and principal component analysis. The results revealed a positive effect of the non-thermal processes on the phytochemical families of the wines. From Pearson correlation analysis, relationships between chromatic indexes and flavonols as well as anthocyanins were established. Cluster analysis highlighted similarities between the Boulton and Giusti parameters, as well as the Glories and CIE parameters, in the assessment of the chromatic properties of wines. Finally, principal component analysis was able to discriminate wines subjected to different maturation techniques on the basis of their chromatic and phenolic characteristics. Copyright © 2017. Published by Elsevier Ltd.
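The Pearson correlation step used to relate phenolics to chromatic indexes can be sketched as follows; the anthocyanin and colour-index values are hypothetical, not the study's measurements:

```python
import math

def pearson(x, y):
    # Pearson product-moment correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical anthocyanin content (mg/L) vs. a colour-intensity index
anthocyanin = [120, 150, 180, 210, 260]
colour_index = [0.8, 1.0, 1.3, 1.4, 1.9]
r = pearson(anthocyanin, colour_index)  # strong positive association
```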
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Vasikaran, Samuel
2008-08-01
* Clinical laboratories should be able to offer interpretation of the results they produce.
* At a minimum, contact details for interpretative advice should be available on laboratory reports. Interpretative comments may be verbal or written and printed.
* Printed comments on reports should be offered judiciously, only where they would add value; no comment is preferable to an inappropriate or dangerous comment.
* Interpretation should be based on locally agreed or nationally recognised clinical guidelines where available.
* Standard tied comments ("canned" comments) can have some limited use. Individualised narrative comments may be particularly useful in the case of tests that are new, complex or unfamiliar to the requesting clinicians and where clinical details are available.
* Interpretative commenting should only be provided by appropriately trained and credentialed personnel.
* Audit of comments and continued professional development of personnel providing them are important for quality assurance.
Auvinen, Jussi; Bernhard, Jonah E.; Bass, Steffen A.; Karpenko, Iurii
2018-04-01
We determine the probability distributions of the shear viscosity to entropy density ratio η/s in the quark-gluon plasma formed in Au + Au collisions at √s_NN = 19.6, 39, and 62.4 GeV, using Bayesian inference and Gaussian process emulators for a model-to-data statistical analysis that probes the full input parameter space of a transport + viscous hydrodynamics hybrid model. We find the most likely value of η/s to be larger at smaller √s_NN, although the uncertainties still allow for a constant value between 0.10 and 0.15 for the investigated collision energy range.
Objective interpretation as conforming interpretation
Directory of Open Access Journals (Sweden)
Lidka Rodak
2011-12-01
The practical discourse willingly uses the formula of "objective interpretation", with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what "objective interpretation" could mean and how it could be understood in practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists' position, is not possible to uphold and should rather be linked with conforming interpretation. What this actually implies is that it is not the virtues of certainty and predictability (which are usually associated with objectivity) but coherence that forms the foundation of the applicability of objectivity in law. What can be observed from the analyses is that both the phenomenon of conforming interpretation and objective interpretation play the role of arguments in the interpretive discourse, arguments that provide justification that an interpretation is not arbitrary or subjective. With regard to an important part of the ideology of legal application, namely the conviction that decisions should be taken on the basis of law in order to exclude arbitrariness, objective interpretation can be read as the question of what kind of authority "supports" a certain interpretation, a question that is almost never free of judicial creativity and judicial activism. One can say that objective and conforming interpretation are just further arguments used in legal discourse.
Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min
2016-01-05
To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.
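Step (i), neutral-loss scanning, amounts to flagging precursor/fragment pairs whose mass difference matches a target neutral loss within a tolerance. A minimal sketch with hypothetical m/z values; 136.125 Da is the nominal monoisotopic mass of a C10H16 monoterpene unit, consistent with the terpene-conjugated curcuminoids described, but the peak list below is invented:

```python
def neutral_loss_hits(pairs, loss, tol=0.01):
    # Keep precursor/fragment pairs whose mass difference matches the
    # target neutral loss within the given m/z tolerance.
    return [(p, f) for p, f in pairs if abs((p - f) - loss) <= tol]

# Hypothetical precursor/fragment m/z pairs from a product-ion experiment
pairs = [(505.28, 369.15), (467.22, 331.10), (415.20, 301.14)]
hits = neutral_loss_hits(pairs, 136.125)  # first two pairs match
```

In the paper's workflow, such substructure-diagnostic losses drive the targeted identification step, while multivariate statistics on the total ion current drives the untargeted one.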
Renal scan (also called renogram or kidney scan): MedlinePlus encyclopedia entry at //medlineplus.gov/ency/article/003790.htm
... disease, lung nodules and liver masses; monitor the effectiveness of certain treatments, such as cancer treatment; detect ... scan done in a hospital or an outpatient facility. CT scans are painless and, with newer machines, ...
Alruwaili, A R; Pannek, K; Coulthard, A; Henderson, R; Kurniawan, N D; McCombe, P
2018-02-01
This study aims to compare the cortical and subcortical deep gray matter (GM) and white matter (WM) of ALS subjects and controls and to compare ALS subjects with (ALScog) and without (ALSnon-cog) cognitive impairment. The study was performed in 30 ALS subjects and 19 healthy controls. Structural T1- and diffusion-weighted MRI data were analyzed using voxel-based morphometry (VBM) and tract-based spatial statistics (TBSS). All DTI measures and GM volume differed significantly between ALS subjects and controls. Compared to controls, greater DTI changes were present in ALScog than ALSnon-cog subjects. GM results showed a reduction in caudate nucleus volume in ALScog subjects compared to ALSnon-cog; and comparing all ALS subjects with controls, there were changes on the right side and in a small region of the left middle frontal gyrus. This combined DTI and VBM study showed changes in motor and extra-motor regions. The DTI changes were more extensive in ALScog than ALSnon-cog subjects. It is likely that the inclusion of ALS subjects with cognitive impairment in previous studies resulted in extra-motor WM abnormalities being reported in ALS subjects. Copyright © 2017. Published by Elsevier Masson SAS.
Kissick, David J.; Muir, Ryan D.; Sullivan, Shane Z.; Oglesbee, Robert A.; Simpson, Garth J.
2013-02-01
Despite the ubiquitous use of multi-photon and confocal microscopy measurements in biology, the core techniques typically suffer from fundamental compromises between signal to noise (S/N) and linear dynamic range (LDR). In this study, direct synchronous digitization of voltage transients coupled with statistical analysis is shown to allow S/N approaching the theoretical maximum throughout an LDR spanning more than 8 decades, limited only by the dark counts of the detector on the low end and by the intrinsic nonlinearities of the photomultiplier tube (PMT) detector on the high end. Synchronous digitization of each voltage transient represents a fundamental departure from established methods in confocal/multi-photon imaging, which are currently based on either photon counting or signal averaging. High information-density data acquisition (up to 3.2 GB/s of raw data) enables the smooth transition between the two modalities on a pixel-by-pixel basis and the ultimate writing of much smaller files (few kB/s). Modeling of the PMT response allows extraction of key sensor parameters from the histogram of voltage peak-heights. Applications in second harmonic generation (SHG) microscopy are described demonstrating S/N approaching the shot-noise limit of the detector over large dynamic ranges.
Llope, W. J.; STAR Collaboration
2013-10-01
Specific products of the statistical moments of the multiplicity distributions of identified particles can be directly compared to susceptibility ratios obtained from lattice QCD calculations. They may also diverge for nuclear systems formed close to a possible QCD critical point due to the phenomenon of critical opalescence. Of particular interest are the moments products for net-protons, net-kaons, and net-charge, as these are considered proxies for conserved quantum numbers. The moments products have been measured by the STAR experiment for Au+Au collisions at seven beam energies ranging from 7.7 to 200 GeV. In this presentation, the experimental results are compared to data-based calculations in which the intra-event correlations of the numbers of positive and negative particles are broken by construction. The importance of intra-event correlations to the moments products values for net-protons, net-kaons, and net-charge can thus be evaluated. Work supported by the U.S. Dept of Energy under grant DE-PS02-09ER09.
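The moments products mentioned above are cumulant ratios such as Sσ = C3/C2 and κσ² = C4/C2 of the net-particle multiplicity distribution. A sketch that estimates them for a toy net-particle sample, a Skellam-distributed difference of two independent Poisson counts, for which Sσ = 0 and κσ² = 1 exactly; the multiplicities are invented, not STAR data:

```python
import math
import random

def poisson(lam, rng):
    # Knuth's algorithm; adequate for small lambda
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def cumulants(xs):
    # First cumulants from central moments; C2 = m2, C3 = m3, C4 = m4 - 3*m2^2
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m2, m3, m4 - 3 * m2 ** 2

rng = random.Random(7)
# Toy "net-proton" sample: difference of two independent Poisson multiplicities
net = [poisson(5.0, rng) - poisson(5.0, rng) for _ in range(100000)]
c2, c3, c4 = cumulants(net)
s_sigma = c3 / c2       # S*sigma: exactly 0 for a symmetric Skellam
kappa_sigma2 = c4 / c2  # kappa*sigma^2: exactly 1 for a Skellam
```

Deviations of the measured ratios from such uncorrelated baselines are what make them candidate critical-point signatures; breaking intra-event correlations by construction, as in the presentation, tests exactly this baseline.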
Localized Smart-Interpretation
Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom
2014-05-01
The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model, the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that a lot of the information available from the geophysical surveys is unexploited, which is a problem because the resulting geological model does not fulfil its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of the geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with interpretation, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. When a model f
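The statistical model f(d,m) described above can be caricatured as any regression of interpreted picks d on quantified information m that is refit as the expert adds interpretations. A deliberately simple one-attribute sketch; the depths and attribute values are invented, and the real method is not claimed to be a linear fit:

```python
class InterpretationModel:
    """Toy stand-in for f(d, m): a least-squares line relating interpreted
    depth d to a single quantified attribute m, refit after every new pick."""

    def __init__(self):
        self.m, self.d = [], []

    def add_interpretation(self, m, d):
        # Each expert pick adds an (m, d) pair, sharpening the model
        self.m.append(m)
        self.d.append(d)

    def predict(self, m_new):
        # Ordinary least-squares line through the accumulated picks
        n = len(self.m)
        mbar, dbar = sum(self.m) / n, sum(self.d) / n
        slope = (sum((x - mbar) * (y - dbar) for x, y in zip(self.m, self.d))
                 / sum((x - mbar) ** 2 for x in self.m))
        return dbar + slope * (m_new - mbar)

model = InterpretationModel()
for m_val, d_val in [(10.0, 5.0), (20.0, 9.0), (30.0, 13.0)]:
    model.add_interpretation(m_val, d_val)
pred = model.predict(25.0)  # 11.0 on these exactly collinear picks
```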
Neuman, Yair
2010-10-01
Interpretation is at the center of psychoanalytic activity. However, interpretation is always challenged by that which is beyond our grasp, the 'dark matter' of our mind, what Bion describes as 'O'. O is one of the most central and difficult concepts in Bion's thought. In this paper, I explain the enigmatic nature of O as a high-dimensional mental space and point to the price one should pay for substituting the pre-symbolic lexicon of the emotion-laden and high-dimensional unconscious for a low-dimensional symbolic representation. This price is reification: objectifying lived experience and draining it of vitality and complexity. In order to address the difficulty of approaching O through symbolization, I introduce the term 'Penultimate Interpretation', a form of interpretation that seeks 'loopholes' through which the analyst and the analysand may reciprocally save themselves from the curse of reification. Three guidelines for 'Penultimate Interpretation' are proposed and illustrated through an imaginary dialogue. Copyright © 2010 Institute of Psychoanalysis.
MacKinnon, Edward
2012-01-01
This book is the first to offer a systematic account of the role of language in the development and interpretation of physics. An historical-conceptual analysis of the co-evolution of mathematical and physical concepts leads to the classical/quantum interface. Bohrian orthodoxy stresses the indispensability of classical concepts and the functional role of mathematics. This book analyses ways of extending, and then going beyond, this orthodoxy. Finally, the book analyzes how a revised interpretation of physics impacts on basic philosophical issues: conceptual revolutions, realism, and r
Kothe, Elsa Lenz; Berard, Marie-France
2013-01-01
Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…
M. Zukowski (Marcin); P.A. Boncz (Peter); M.L. Kersten (Martin)
2004-01-01
Data mining, information retrieval and other application areas exhibit a query load with multiple concurrent queries touching a large fraction of a relation. This leads to individual query plans based on a table scan or large index scan. The implementation of this access path in most
Fisher, Anthony C; McCulloch, Daphne L; Borchert, Mark S; Garcia-Filion, Pamela; Fink, Cassandra; Eleuteri, Antonio; Simpson, David M
2015-08-01
Pattern electroretinograms (PERGs) have inherently low signal-to-noise ratios and can be difficult to detect when degraded by pathology or noise. We compare an objective system for automated PERG analysis with expert human interpretation in children with optic nerve hypoplasia (ONH) with PERGs ranging from clear to undetectable. PERGs were recorded uniocularly with chloral hydrate sedation in children with ONH (aged 3.5-35 months). Stimuli were reversing checks of four sizes focused using an optical system incorporating the cycloplegic refraction. Forty PERG records were analysed; 20 selected at random and 20 from eyes with good vision (fellow eyes or eyes with mild ONH) from over 300 records. Two experts identified P50 and N95 of the PERGs after manually deleting trials with movement artefact, slow-wave EEG (4-8 Hz) or other noise from raw data for 150 check reversals. The automated system first identified present/not-present responses using a magnitude-squared coherence criterion and then, for responses confirmed as present, estimated the P50 and N95 cardinal positions as the turning points in local third-order polynomials fitted in the -3 dB bandwidth [0.25 … 45] Hz. Confidence limits were estimated from bootstrap re-sampling with replacement. The automated system uses an interactive Internet-available webpage tool (see http://clinengnhs.liv.ac.uk/esp_perg_1.htm). The automated system detected 28 PERG signals above the noise level (p ≤ 0.05 for H0). Good subjective quality ratings were indicative of significant PERGs; however, poor subjective quality did not necessarily predict non-significant signals. P50 and N95 implicit times showed good agreement between the two experts and between experts and the automated system. For the N95 amplitude measured to P50, the experts differed by an average of 13% consistent with differing interpretations of peaks within noise, while the automated amplitude measure was highly correlated with the expert measures but was
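The confidence-limit step above rests on percentile bootstrap resampling with replacement, a standard technique the abstract names. The sketch below is a generic illustration with hypothetical latency values; it is not the study's PERG pipeline, and `bootstrap_ci` and the data are invented for exposition.

```python
import random
import statistics

def bootstrap_ci(samples, estimator, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample with replacement, take quantiles of estimates."""
    rng = random.Random(seed)
    n = len(samples)
    ests = sorted(
        estimator([samples[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    return ests[int(alpha / 2 * n_boot)], ests[int((1 - alpha / 2) * n_boot) - 1]

# Hypothetical implicit-time estimates (ms) standing in for repeated P50 measurements.
latencies = [48, 50, 51, 49, 52, 50, 47, 53, 50, 51]
lo, hi = bootstrap_ci(latencies, statistics.mean)
```

The resulting (lo, hi) brackets the point estimate without any distributional assumption, which is why bootstrapping suits noisy, non-Gaussian signals like degraded PERGs.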
Bott, Lewis; Frisson, Steven; Murphy, Gregory L
2009-04-01
The interpretation generated from a sentence of the form P and Q can often be different to that generated by Q and P, despite the fact that and has a symmetric truth-conditional meaning. We experimentally investigated to what extent this difference in meaning is due to the connective and and to what extent it is due to order of mention of the events in the sentence. In three experiments, we collected interpretations of sentences in which we varied the presence of the conjunction, the order of mention of the events, and the type of relation holding between the events (temporally vs. causally related events). The results indicated that the effect of using a conjunction was dependent on the discourse relation between the events. Our findings contradict a narrative marker theory of and, but provide partial support for a single-unit theory derived from Carston (2002). The results are discussed in terms of conjunction processing and implicatures of temporal order.
Reeve, Joanne
2010-01-01
Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the
Objective interpretation as conforming interpretation
Lidka Rodak
2011-01-01
The practical discourse willingly uses the formula of “objective interpretation”, with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what “objective interpretation” could mean and how it could be understood in practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists’ position, is not possible to uphold, and ...
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Energy Technology Data Exchange (ETDEWEB)
Durand, O., E-mail: olivier.durand@insa-rennes.fr [Universite Europeenne de Bretagne, INSA, FOTON, UMR 6082, 20 avenue des Buttes de Coesmes, F-35708 RENNES (France); Letoublon, A. [Universite Europeenne de Bretagne, INSA, FOTON, UMR 6082, 20 avenue des Buttes de Coesmes, F-35708 RENNES (France); Rogers, D.J. [Nanovation SARL, 103 bis rue de Versailles, 91400 Orsay (France); SUPA, School of Physics and Astronomy, University of St. Andrews, St. Andrews, KY16 9SS (United Kingdom); Hosseini Teherani, F. [Nanovation SARL, 103 bis rue de Versailles, 91400 Orsay (France)
2011-07-29
X-ray scattering methods were applied to the study of thin mosaic ZnO layers deposited on c-Al₂O₃ substrates using Pulsed Laser Deposition. High Resolution (HR) studies revealed two components in the ω scans (transverse scans) which were not resolved in conventional 'open-detector' ω rocking curves: a narrow, resolution-limited peak, characteristic of long-range correlation, and a broad peak, attributed to defect-related diffuse scattering inducing a limited transverse structural correlation length. Thus, for such mosaic films, the conventional ω rocking curve Full Width at Half Maximum linewidth was found to be ill-adapted as an overall figure-of-merit for the structural quality, in that the different contributions were not meaningfully represented. A 'Williamson-Hall like' integral breadth (IB) metric for the HR (00.l) transverse scans was thus developed as a reliable, fast, accurate and robust alternative to the rocking curve linewidth for routine non-destructive testing of such mosaic thin films. For a typical ZnO/c-Al₂O₃ film, the IB method gave a limited structural correlation length of 110 nm ± 9 nm. The results are coherent with a thin film containing misfit dislocations at the film-substrate interface.
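The integral breadth at the heart of the metric above is simply the integrated peak area divided by the peak maximum, which is less dominated by a narrow resolution-limited spike than the FWHM is. The sketch below uses a synthetic triangular profile; the function name, the profile, and any conversion from breadth to correlation length are illustrative assumptions, not the paper's calibration.

```python
def integral_breadth(q, intensity):
    """IB = integrated peak area / peak maximum, in the same units as q."""
    # Trapezoidal integration over the measured profile.
    area = sum(0.5 * (intensity[i] + intensity[i + 1]) * (q[i + 1] - q[i])
               for i in range(len(q) - 1))
    return area / max(intensity)

# Triangular test peak: base 2, height 2, so area = 2 and IB = area/height = 1.
beta = integral_breadth([0.0, 1.0, 2.0], [0.0, 2.0, 0.0])
print(beta)  # 1.0
```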
International Nuclear Information System (INIS)
Tabor, L.
1987-01-01
For mammography to be an effective diagnostic method, it must be performed to a very high standard of quality. Otherwise many lesions, in particular cancer in its early stages, will simply not be detectable on the films, regardless of the skill of the mammographer. Mammographic interpretation consists of two basic steps: perception and analysis. The process of mammographic interpretation begins with perception of the lesion on the mammogram. Perception is influenced by several factors. One of the most important is the parenchymal pattern of the breast tissue, detection of pathologic lesions being easier with fatty involution. The mammographer should use a method for the systematic viewing of the mammograms that will ensure that all parts of each mammogram are carefully searched for the presence of lesions. The method of analysis proceeds according to the type of lesion. Contour analysis is of primary importance in the evaluation of circumscribed tumors. After having analyzed the contour and density of a lesion and considered its size, the mammographer should be fairly certain whether the circumscribed tumor is benign or malignant. Fine-needle puncture and/or US may assist the mammographer in making this decision. Painstaking analysis is required because many circumscribed tumors do not need to be biopsied. The perception of circumscribed tumors seldom causes problems, but their analysis needs careful attention. On the other hand, the major challenge with star-shaped lesions is perception. They may be difficult to discover when small. Although the final diagnosis of a stellate lesion can be made only with the help of histologic examination, the preoperative mammographic differential diagnosis can be highly accurate. The differential diagnostic problem is between malignant tumors (scirrhous carcinoma) on the one hand, and traumatic fat necrosis as well as radial scars on the other
An Interpreter's Interpretation: Sign Language Interpreters' View of Musculoskeletal Disorders
National Research Council Canada - National Science Library
Johnson, William L
2003-01-01
Sign language interpreters are at increased risk for musculoskeletal disorders. This study used content analysis to obtain detailed information about these disorders from the interpreters' point of view...
1960-01-01
Before the invention of wire chambers, particle tracks were analysed on scanning tables like this one. Today, the process is electronic and much faster. Bubble chamber film was used for this analysis of the particle tracks.
Interpreting the Customary Rules on Interpretation
Merkouris, Panos
2017-01-01
International courts have at times interpreted the customary rules on interpretation. This is interesting because what is being interpreted is: i) rules of interpretation, which sounds dangerously tautological, and ii) customary law, the interpretation of which has not been the object of critical
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Variability in the interpretation of DMSA scintigraphy after urine infection
International Nuclear Information System (INIS)
Craig, J.; Howman-Giles, R.; Uren, R.; Irwig, L.; Bernard, E.; Knight, J.; Sureshkumar, P.; Roy, L.P.
1997-01-01
Full text: This study investigated the extent of, and potential reasons for, interpretation disagreement of 99mTc-DMSA scans after urine infection in children. Methods: 441 scans were selected from children with a first urine infection (UTI) from 1993-1995. 294 scans were performed at a median time of seven days after UTI and 147 in children free from infection over one year of follow-up. Two nuclear medicine physicians independently reported according to whether renal abnormality was present or absent and used the four-level grading system described by Goldraich: grade 1, no more than two cortical defects; grade 2, more than two defects; grade 3, diffuse reduction in uptake with or without defects; grade 4, shrunken kidney with <10% function. Indices for variability used were the percentage of agreement and the kappa statistic, expressed as a percentage. For the grading scale used, both measures were weighted with integers representing the number of categories from perfect agreement. Disagreement was analysed for children, kidneys and kidney zones. Results: There was agreement in 86 per cent (kappa 69%) for the normal-abnormal DMSA scan dichotomy; the weighted agreement was 94 per cent (kappa 82%) for the grading scale. Disagreement of DMSA scan interpretation by ≥ two grades was present in three cases (0.7%). The same level of agreement was present for the patient, kidney and kidney zone comparisons. Agreement was not influenced by age or the timing of scintigraphy after urine infection. Conclusion: Two experienced physicians showed good agreement in the interpretation of DMSA scintigraphy in children after urine infection using the grading system of Goldraich
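The weighted kappa reported above penalizes disagreements by their distance on the ordinal grading scale. A minimal sketch of linear-weighted Cohen's kappa for two raters follows; the toy gradings are invented for illustration and are not the study's data.

```python
from collections import Counter

def weighted_kappa(rater1, rater2, k):
    """Linear-weighted Cohen's kappa for two raters on an ordinal scale 0..k-1."""
    n = len(rater1)
    obs = Counter(zip(rater1, rater2))
    m1, m2 = Counter(rater1), Counter(rater2)
    w = lambda i, j: abs(i - j) / (k - 1)  # disagreement weight by grade distance
    disagree_obs = sum(w(i, j) * obs[(i, j)] / n
                       for i in range(k) for j in range(k))
    disagree_exp = sum(w(i, j) * (m1[i] / n) * (m2[j] / n)
                       for i in range(k) for j in range(k))
    return 1.0 - disagree_obs / disagree_exp

# Perfect agreement on a four-level scale yields kappa = 1.
grades = [0, 1, 2, 3, 0, 1, 2]
print(weighted_kappa(grades, grades, 4))  # 1.0
```

With this weighting, a one-grade disagreement costs far less than a three-grade disagreement, which is why the weighted agreement (94%) exceeds the unweighted dichotomous agreement (86%).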
International Nuclear Information System (INIS)
Natali, S.
1984-01-01
This chapter reports on the scanning of 1000 holograms taken in HOBC at CERN. Each hologram is triggered by an interaction in the chamber, the primary particles being pions at 340 GeV/c. The aim of the experiment is the study of charm production. The holograms, recorded on 50 mm film with the 'in line' technique, can be analyzed by shining a parallel expanded laser beam through the film, obtaining immediately above it the real image of the chamber, which can then be scanned and measured with a technique half way between emulsions and bubble chambers. The results indicate that holograms can be analyzed as quickly and reliably as in other visual techniques and that to them is open the same order of magnitude of large scale experiments
International Nuclear Information System (INIS)
Hetherington, V.J.
1989-01-01
Oftentimes, in managing podiatric complaints, clinical and conventional radiographic techniques are insufficient in determining a patient's problem. This is especially true in the early stages of bone infection. Bone scanning or imaging can provide additional information in the diagnosis of the disorder. However, bone scans are not specific and must be correlated with clinical, radiographic, and laboratory evaluation. In other words, bone scanning does not provide the diagnosis but is an important bit of information aiding in the process of diagnosis. The more useful radionuclides in skeletal imaging are technetium phosphate complexes and gallium citrate. These compounds are administered intravenously and are detected at specific time intervals post-injection by a rectilinear scanner; when minification is used, the entire skeleton can be imaged from head to toe. Minification allows visualization of the entire skeleton in a single image. A gamma camera can concentrate on an isolated area; however, it requires multiple views to cover the whole skeleton. Recent advances have allowed computer augmentation of the data received from radionuclide imaging. The purpose of this chapter is to present the current radionuclides clinically useful in podiatric patients
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Image analysis enhancement and interpretation
International Nuclear Information System (INIS)
Glauert, A.M.
1978-01-01
The necessary practical and mathematical background are provided for the analysis of an electron microscope image in order to extract the maximum amount of structural information. Instrumental methods of image enhancement are described, including the use of the energy-selecting electron microscope and the scanning transmission electron microscope. The problems of image interpretation are considered with particular reference to the limitations imposed by radiation damage and specimen thickness. A brief survey is given of the methods for producing a three-dimensional structure from a series of two-dimensional projections, although the emphasis is on the analysis, processing and interpretation of the two-dimensional projection of a structure. (Auth.)
Preoperative nuclear scans in patients with melanoma
International Nuclear Information System (INIS)
Au, F.C.; Maier, W.P.; Malmud, L.S.; Goldman, L.I.; Clark, W.H. Jr.
1984-01-01
One hundred forty-one liver scans, 137 brain scans, and 112 bone scans were performed in 192 patients with clinical Stage 1 melanoma. One liver scan was interpreted as abnormal; liver biopsy of that patient showed no metastasis. There were 11 suggestive liver scans; three of the patients with suggestive liver scans had negative liver biopsies. The remaining eight patients were followed from 4 to 6 years, and none of those patients developed clinical evidence of hepatic metastases. All of the brain scans were normal. Five patients had suggestive bone scans, and none of those patients had manifested symptoms of osseous metastases with a follow-up of 2 to 4.5 years. This study demonstrates that the use of preoperative liver, brain and bone scans in the evaluation of patients with clinical Stage 1 melanoma is virtually unproductive
Interpretive Media Study and Interpretive Social Science.
Carragee, Kevin M.
1990-01-01
Defines the major theoretical influences on interpretive approaches in mass communication, examines the central concepts of these perspectives, and provides a critique of these approaches. States that the adoption of interpretive approaches in mass communication has ignored varied critiques of interpretive social science. Suggests that critical…
Interpreters, Interpreting, and the Study of Bilingualism.
Valdes, Guadalupe; Angelelli, Claudia
2003-01-01
Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, which are basic conception and meaning of statistical thermodynamics, Maxwell-Boltzmann's statistics, ensemble, thermodynamics function and fluctuation, statistical dynamics with independent particle system, ideal molecular system, chemical equilibrium and chemical reaction rate in ideal gas mixture, classical statistical thermodynamics, ideal lattice model, lattice statistics and nonideal lattice model, imperfect gas theory on liquid, theory on solution, statistical thermodynamics of interface, statistical thermodynamics of a high molecule system and quantum statistics
Does environmental data collection need statistics?
Pulles, M.P.J.
1998-01-01
The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Day, George S; Schoemaker, Paul J H
2005-11-01
Companies often face new rivals, technologies, regulations, and other environmental changes that seem to come out of left field. How can they see these changes sooner and capitalize on them? Such changes often begin as weak signals on what the authors call the periphery, or the blurry zone at the edge of an organization's vision. As with human peripheral vision, these signals are difficult to see and interpret but can be vital to success or survival. Unfortunately, most companies lack a systematic method for determining where on the periphery they should be looking, how to interpret the weak signals they see, and how to allocate limited scanning resources. This article provides such a method-a question-based framework for helping companies scan the periphery more efficiently and effectively. The framework divides questions into three categories: learning from the past (What have been our past blind spots? What instructive analogies do other industries offer? Who in the industry is skilled at picking up weak signals and acting on them?); evaluating the present (What important signals are we rationalizing away? What are our mavericks, outliers, complainers, and defectors telling us? What are our peripheral customers and competitors really thinking?); and envisioning the future (What future surprises could really hurt or help us? What emerging technologies could change the game? Is there an unthinkable scenario that might disrupt our business?). Answering these questions is a good first step toward anticipating problems or opportunities that may appear on the business horizon. The article concludes with a self-test that companies can use to assess their need and capability for peripheral vision.
On court interpreters' visibility
DEFF Research Database (Denmark)
Dubslaff, Friedel; Martinsen, Bodil
…of the service they receive. Ultimately, the findings will be used for training purposes. Future - and, for that matter, already practising - interpreters as well as the professional users of interpreters ought to take the reality of the interpreters' work in practice into account when assessing the quality… on the interpreter's interpersonal role and, in particular, on signs of the interpreter's visibility, i.e. active co-participation. At first sight, the interpreting assignment in question seems to be a short and simple routine task which would not require the interpreter to deviate from the traditional picture…
CERN. Geneva
2005-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Introduction to Statistics course
CERN. Geneva HR-RFA
2006-01-01
The four lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
CERN. Geneva
2004-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Bone scanning in severe external otitis
International Nuclear Information System (INIS)
Levin, W.J.; Shary, J.H. III; Nichols, L.T.; Lucente, F.E.
1986-01-01
Technetium-99m methylene diphosphonate (MDP) bone scanning has been considered a valuable early tool to diagnose necrotizing progressive (malignant) external otitis. However, to our knowledge, no formal studies have actually compared bone scans of otherwise young, healthy patients with severe external otitis to scans of patients with a clinical presentation of malignant external otitis. Twelve patients with only severe external otitis were studied with Tc-99m MDP and were compared to known cases of malignant otitis. All scans were evaluated by two neuroradiologists with no prior knowledge of the clinical status of the patients. Nine of the 12 patients had positive bone scans, with many scans resembling those reported in malignant external otitis. Interestingly, there was no consistent correlation between the severity of the clinical presentation and the amount of technetium uptake. These findings suggest that a positive bone scan alone should not be interpreted as indicative of malignant external otitis
Interpreting Impoliteness: Interpreters’ Voices
Directory of Open Access Journals (Sweden)
Tatjana Radanović Felberg
2017-11-01
Interpreters in the public sector in Norway interpret in a variety of institutional encounters, and the interpreters evaluate the majority of these encounters as polite. However, some encounters are evaluated as impolite, and they pose challenges when it comes to interpreting impoliteness. This issue raises the question of whether interpreters should take a stance on their own evaluation of impoliteness and whether they should interfere in communication. In order to find out more about how interpreters cope with this challenge, in 2014 a survey was sent to all interpreters registered in the Norwegian Register of Interpreters. The survey data were analyzed within the theoretical framework of impoliteness theory, using the notion of moral order as an explanatory tool in a close reading of interpreters' answers. The analysis shows that interpreters reported using a variety of strategies for interpreting impoliteness, including omissions and downtoning. However, the interpreters also gave examples of individual strategies for coping with impoliteness, such as interrupting and postponing interpreting. These strategies border on behavioral strategies and conflict with the Norwegian ethical guidelines for interpreting. In light of the ethical guidelines and actual practice, mapping and discussing different strategies used by interpreters might heighten interpreters' and interpreter-users' awareness of the role impoliteness can play in institutional interpreter-mediated encounters.
... results on a PET scan. Blood sugar or insulin levels may affect the test results in people with diabetes . PET scans may be done along with a CT scan. This combination scan is called a PET/CT. Alternative Names Brain positron emission tomography; PET scan - brain References Chernecky ...
... nuclear medicine scan; Heart positron emission tomography; Myocardial PET scan ... A PET scan requires a small amount of radioactive material (tracer). This tracer is given through a vein (IV), ...
Statistical Reform in School Psychology Research: A Synthesis
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
The use of CT scan in the pre-operative staging of bronchogenic carcinoma
International Nuclear Information System (INIS)
Pada, C.C.
1992-01-01
Surgery remains the treatment of choice in patients with localized bronchogenic carcinoma. Pre-operative identification of inoperability spares the patient unnecessary surgery. This prospective study was carried out to determine the correctness of judgement regarding a patient's operability or inoperability based on pre-operative CT staging, and to find out the sensitivity, specificity and overall accuracy of the CT scan in estimating tumor description, nodal status and metastatic spread to the chest. Staging was done by three senior radiologists aware of the diagnosis. The surgical and histopathologic findings and staging were gathered and used as the measurement of truth in arriving at the CT scan's accuracy. The overall accuracy rate of the CT scan in determining operability or inoperability is 80%; accuracy of assessment of tumor description is 87%, and nodal status estimation has an accuracy of 60%. Sensitivity of the CT scan in assessment of metastatic spread to the chest is 93%. There is no statistically significant difference in the judgement of operability or inoperability by CT scan compared to surgical and histopathologic results. The CT scan is recommended as a valuable tool in the pre-operative staging of patients with bronchogenic carcinoma who are candidates for surgery. (auth.). 21 refs.; 8 tabs.
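Sensitivity, specificity and overall accuracy as quoted above come from a standard 2x2 confusion table. The sketch below uses hypothetical counts chosen only to illustrate the arithmetic; the study reports the resulting rates, not the underlying table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics from true/false positive/negative counts."""
    sensitivity = tp / (tp + fn)              # true-positive rate
    specificity = tn / (tn + fp)              # true-negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration only.
sens, spec, acc = diagnostic_metrics(tp=45, fp=5, fn=5, tn=45)
print(sens, spec, acc)  # 0.9 0.9 0.9
```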
Interpretation of Confidence Interval Facing the Conflict
Andrade, Luisa; Fernández, Felipe
2016-01-01
As literature has reported, it is usual that university students in statistics courses, and even statistics teachers, interpret the confidence level associated with a confidence interval as the probability that the parameter value will be between the lower and upper interval limits. To confront this misconception, class activities have been…
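The misconception described above can be confronted by simulation: the confidence level is the long-run coverage of the interval-building procedure, not a probability statement about any single realized interval. The sketch below is a generic illustration; the normal model, sample size and parameters are assumptions for exposition.

```python
import math
import random
import statistics

def mean_ci(samples, z=1.96):
    """Approximate 95% z-interval for the mean of a sample."""
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / math.sqrt(len(samples))
    return m - z * se, m + z * se

rng = random.Random(0)
true_mu, covered, trials = 10.0, 0, 1000
for _ in range(trials):
    data = [rng.gauss(true_mu, 2.0) for _ in range(30)]
    lo, hi = mean_ci(data)
    covered += lo <= true_mu <= hi
print(covered / trials)  # close to 0.95 over many repetitions
```

Each individual interval either contains the fixed parameter or it does not; only the proportion across repetitions approaches 95%, which is the correct reading of the confidence level.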
International Nuclear Information System (INIS)
Olivari, D.
1984-01-01
An attempt is made at identifying the most important factors which introduce difficulties into the analysis of results from tests on pollutant dispersal: the unsteadiness of the phenomenon, the effect of external uncontrollable parameters, and the inherent complexity of the problem itself. The basic models for predicting the dispersion of passive contaminants are discussed, in particular a Lagrangian approach which seems to provide accurate results. For the analysis of results many problems arise. First, the statistical quantities which describe the results must be computed: the mean, the variance and higher-order moments are important. It is shown that there is no easy solution if the duration and/or the number of independent "events" to be analyzed are too limited. The probability density function provides the most useful information, but is not easy to measure. A family of functions is recalled which predicts reasonably well the trend of the pdf. The role of intermittency is then shown in some detail. Its importance should not be underestimated, and its relationship to the pdf and its effects on measurements are shown to be rather complex. Finally, an example is given to show the effects of the variance of external factors
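As an illustration of the first step the abstract mentions, computing the statistical quantities that describe the results, the sketch below estimates the mean, variance and standardized skewness of a hypothetical, positively skewed sample standing in for concentration measurements. The distribution and its parameters are assumptions for illustration, not data from the dispersal tests discussed:

```python
import random

random.seed(0)
# Hypothetical stand-in for intermittent concentration readings:
# a lognormal sample is positively skewed, like many plume records.
data = [random.lognormvariate(0.0, 0.5) for _ in range(10_000)]

n = len(data)
mean = sum(data) / n
var = sum((x - mean) ** 2 for x in data) / n                 # second central moment
skew = sum((x - mean) ** 3 for x in data) / n / var ** 1.5   # standardized third moment

print(mean, var, skew)  # skew comes out positive for this right-skewed sample
```

Higher-order moments like the skewness converge much more slowly than the mean, which is the abstract's point about limited duration or too few independent events.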
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
DEFF Research Database (Denmark)
Auken, Sune
2015-01-01
Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of the intricacies of that relationship through an analysis of the generic interpretation carrie...
Engineering Definitional Interpreters
DEFF Research Database (Denmark)
Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford
2013-01-01
A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...
Technical considerations on scanning and image analysis for amyloid PET in dementia
International Nuclear Information System (INIS)
Akamatsu, Go; Ohnishi, Akihito; Aita, Kazuki; Ikari, Yasuhiko; Senda, Michio; Yamamoto, Yasuji
2017-01-01
Brain imaging techniques, such as computed tomography (CT), magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), and positron emission tomography (PET), can provide essential and objective information for the early and differential diagnosis of dementia. Amyloid PET is especially useful to evaluate the amyloid-β pathological process as a biomarker of Alzheimer's disease. This article reviews critical points about technical considerations on the scanning and image analysis methods for amyloid PET. Each amyloid PET agent has its own proper administration instructions and recommended uptake time, scan duration, and the method of image display and interpretation. In addition, we have introduced general scanning information, including subject positioning, reconstruction parameters, and quantitative and statistical image analysis. We believe that this article could make amyloid PET a more reliable tool in clinical study and practice. (author)
What dementia reveals about proverb interpretation and its neuroanatomical correlates.
Kaiser, Natalie C; Lee, Grace J; Lu, Po H; Mather, Michelle J; Shapira, Jill; Jimenez, Elvira; Thompson, Paul M; Mendez, Mario F
2013-08-01
Neuropsychologists frequently include proverb interpretation as a measure of executive abilities. A concrete interpretation of proverbs, however, may reflect semantic impairments from anterior temporal lobes, rather than executive dysfunction from frontal lobes. The investigation of proverb interpretation among patients with different dementias with varying degrees of temporal and frontal dysfunction may clarify the underlying brain-behavior mechanisms for abstraction from proverbs. We propose that patients with behavioral variant frontotemporal dementia (bvFTD), who are characteristically more impaired on proverb interpretation than those with Alzheimer's disease (AD), are disproportionately impaired because of anterior temporal-mediated semantic deficits. Eleven patients with bvFTD and 10 with AD completed the Delis-Kaplan Executive Function System (D-KEFS) Proverbs Test and a series of neuropsychological measures of executive and semantic functions. The analysis included both raw and age-adjusted normed data for multiple choice responses on the D-KEFS Proverbs Test using independent samples t-tests. Tensor-based morphometry (TBM) applied to 3D T1-weighted MRI scans mapped the association between regional brain volume and proverb performance. Computations of mean Jacobian values within select regions of interest provided a numeric summary of regional volume, and voxel-wise regression yielded 3D statistical maps of the association between tissue volume and proverb scores. The patients with bvFTD were significantly worse than those with AD in proverb interpretation. The worse performance of the bvFTD patients involved a greater number of concrete responses to common, familiar proverbs, but not to uncommon, unfamiliar ones. These concrete responses to common proverbs correlated with semantic measures, whereas concrete responses to uncommon proverbs correlated with executive functions. After controlling for dementia diagnosis, TBM analyses indicated significant
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
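The core diagnostic-test quantities the review covers can all be computed from a 2x2 confusion table. A small sketch, using made-up counts rather than any figures from the article:

```python
# Hypothetical 2x2 table: rows = test result, columns = disease status.
tp, fp = 90, 10   # test positive: true positives, false positives
fn, tn = 5, 95    # test negative: false negatives, true negatives

sensitivity = tp / (tp + fn)              # P(test positive | disease present)
specificity = tn / (tn + fp)              # P(test negative | disease absent)
accuracy = (tp + tn) / (tp + fp + fn + tn)
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio

print(sensitivity, specificity, accuracy, lr_pos, lr_neg)
```

Note that sensitivity and specificity are properties of the test, while post-test probability also depends on prevalence, which is why the likelihood ratios are often the more clinically useful summary.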
Yanagawa, Masahiro; Honda, Osamu; Kikuyama, Ayano; Gyobu, Tomoko; Sumikawa, Hiromitsu; Koyama, Mitsuhiro; Tomiyama, Noriyuki
2012-10-01
To evaluate the effects of ASIR on a CAD system for pulmonary nodules using clinical routine-dose CT and lower-dose CT. Thirty-five patients (body mass index, 22.17 ± 4.37 kg/m(2)) were scanned by multidetector-row CT with tube currents (clinical routine-dose CT, automatically adjusted mA; lower-dose CT, 10 mA) and X-ray voltage (120 kVp). Each 0.625-mm-thick image was reconstructed at 0%-, 50%-, and 100%-ASIR: 0%-ASIR is reconstructed using only the filtered back-projection algorithm (FBP), while 100%-ASIR is reconstructed using the maximum ASIR and 50%-ASIR implies a blending of 50% FBP and ASIR. CAD output was compared retrospectively with the results of the reference standard, which was established using a consensus panel of three radiologists. Data were analyzed using Bonferroni/Dunn's method. Radiation dose was calculated by multiplying dose-length product by a conversion coefficient of 0.021. The consensus panel found 265 non-calcified nodules ≤30 mm (ground-glass opacity [GGO], 103; part-solid, 34; and solid, 128). CAD sensitivity was significantly higher at 100%-ASIR [clinical routine-dose CT, 71% (overall), 49% (GGO); lower-dose CT, 52% (overall), 67% (solid)] than at 0%-ASIR [clinical routine-dose CT, 54% (overall), 25% (GGO); lower-dose CT, 36% (overall), 50% (solid)] (p < 0.001). The mean number of false-positive findings per examination was significantly higher at 100%-ASIR (clinical routine-dose CT, 8.5; lower-dose CT, 6.2) than at 0%-ASIR (clinical routine-dose CT, 4.6; lower-dose CT, 3.5; p < 0.001). Effective doses were 10.77 ± 3.41 mSv in clinical routine-dose CT and 2.67 ± 0.17 mSv in lower-dose CT. CAD sensitivity at 100%-ASIR on lower-dose CT is almost equal to that at 0%-ASIR on clinical routine-dose CT. ASIR can increase CAD sensitivity despite increased false-positive findings. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Waller, Derek L
2008-01-01
Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and
Statistics As Principled Argument
Abelson, Robert P
2012-01-01
In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well-known in parts of industry but less so in academia, to analysing and interpreting process dat
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affect the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts guaranteeing low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
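The contrast between the two approaches can be made concrete with a one-dimensional stack: the arithmetic (worst-case) tolerance adds component tolerances linearly, while the quadratic (statistical, root-sum-square) tolerance adds them in quadrature. A sketch with hypothetical values, not drawn from the paper:

```python
import math

# Hypothetical stack of five components, each toleranced at +/- 0.1 mm.
tols = [0.1] * 5

worst_case = sum(tols)                      # arithmetic stack: 0.5 mm
rss = math.sqrt(sum(t ** 2 for t in tols))  # quadratic (RSS) stack: ~0.224 mm

print(worst_case, rss)
```

The quadratic stack is tighter because it assumes independent, roughly centered deviations; that assumption is exactly why, as the paper notes, statistical tolerancing demands a sustained exchange of information between design and manufacture.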
The emergent Copenhagen interpretation of quantum mechanics
Hollowood, Timothy J.
2014-05-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Scanning probe recognition microscopy investigation of tissue scaffold properties
Fan, Yuan; Chen, Qian; Ayres, Virginia M; Baczewski, Andrew D; Udpa, Lalita; Kumar, Shiva
2007-01-01
Scanning probe recognition microscopy is a new scanning probe microscopy technique which enables selective scanning along individual nanofibers within a tissue scaffold. Statistically significant data for multiple properties can be collected by repetitively fine-scanning an identical region of interest. The results of a scanning probe recognition microscopy investigation of the surface roughness and elasticity of a series of tissue scaffolds are presented. Deconvolution and statistical methods were developed and used for data accuracy along curved nanofiber surfaces. Nanofiber features were also independently analyzed using transmission electron microscopy, with results that supported the scanning probe recognition microscopy-based analysis. PMID:18203431
Reading Statistics And Research
Akbulut, Reviewed By Yavuz
2008-01-01
The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of previous editions might check the interpretation of skewness and ku...
Statistical methods in quality assurance
International Nuclear Information System (INIS)
Eckhard, W.
1980-01-01
During the different phases of a production process - planning, development and design, manufacturing, assembling, etc. - most decisions rest on a basis of statistics: the collection, analysis and interpretation of data. Statistical methods can be thought of as a kit of tools that help to solve problems in the quality functions of the quality loop, with respect to producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated. (RW)
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...
Energy Technology Data Exchange (ETDEWEB)
Yanagawa, Masahiro, E-mail: m-yanagawa@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Honda, Osamu, E-mail: ohonda@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Kikuyama, Ayano, E-mail: a-kikuyama@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Gyobu, Tomoko, E-mail: t-gyobu@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Sumikawa, Hiromitsu, E-mail: h-sumikawa@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Koyama, Mitsuhiro, E-mail: m-koyama@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan); Tomiyama, Noriyuki, E-mail: tomiyama@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine, 2-2 Yamadaoka, Suita-city, Osaka 565-0871 (Japan)
2012-10-15
Purpose: To evaluate the effects of ASIR on CAD system of pulmonary nodules using clinical routine-dose CT and lower-dose CT. Materials and methods: Thirty-five patients (body mass index, 22.17 ± 4.37 kg/m{sup 2}) were scanned by multidetector-row CT with tube currents (clinical routine-dose CT, automatically adjusted mA; lower-dose CT, 10 mA) and X-ray voltage (120 kVp). Each 0.625-mm-thick image was reconstructed at 0%-, 50%-, and 100%-ASIR: 0%-ASIR is reconstructed using only the filtered back-projection algorithm (FBP), while 100%-ASIR is reconstructed using the maximum ASIR and 50%-ASIR implies a blending of 50% FBP and ASIR. CAD output was compared retrospectively with the results of the reference standard which was established using a consensus panel of three radiologists. Data were analyzed using Bonferroni/Dunn's method. Radiation dose was calculated by multiplying dose-length product by conversion coefficient of 0.021. Results: The consensus panel found 265 non-calcified nodules ≤30 mm (ground-glass opacity [GGO], 103; part-solid, 34; and solid, 128). CAD sensitivity was significantly higher at 100%-ASIR [clinical routine-dose CT, 71% (overall), 49% (GGO); lower-dose CT, 52% (overall), 67% (solid)] than at 0%-ASIR [clinical routine-dose CT, 54% (overall), 25% (GGO); lower-dose CT, 36% (overall), 50% (solid)] (p < 0.001). Mean number of false-positive findings per examination was significantly higher at 100%-ASIR (clinical routine-dose CT, 8.5; lower-dose CT, 6.2) than at 0%-ASIR (clinical routine-dose CT, 4.6; lower-dose CT, 3.5; p < 0.001). Effective doses were 10.77 ± 3.41 mSv in clinical routine-dose CT and 2.67 ± 0.17 mSv in lower-dose CT. Conclusion: CAD sensitivity at 100%-ASIR on lower-dose CT is almost equal to that at 0%-ASIR on clinical routine-dose CT. ASIR can increase CAD sensitivity despite increased false-positive findings.
Wilson, Donald A
2014-01-01
Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh
Statistical concepts a second course
Lomax, Richard G
2012-01-01
Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes
Ovesen, Christian; Jakobsen, Janus Christian; Gluud, Christian; Steiner, Thorsten; Law, Zhe; Flaherty, Katie; Dineen, Rob A; Bath, Philip M; Sprigg, Nikola; Christensen, Hanne
2018-06-13
We present the statistical analysis plan of a prespecified Tranexamic Acid for Hyperacute Primary Intracerebral Haemorrhage (TICH)-2 sub-study aiming to investigate whether tranexamic acid has a different effect in intracerebral haemorrhage patients with the spot sign on admission compared to spot sign negative patients. The TICH-2 trial recruited more than 2000 participants with intracerebral haemorrhage arriving in hospital within 8 h after symptom onset. They were included irrespective of radiological signs of on-going haematoma expansion. Participants were randomised to tranexamic acid versus matching placebo. In this subgroup analysis, we will include all participants in TICH-2 with a computed tomography angiography on admission allowing adjudication of the participants' spot sign status. The primary outcome will be the ability of tranexamic acid to limit absolute haematoma volume on computed tomography at 24 h (± 12 h) after randomisation among spot sign positive and spot sign negative participants, respectively. For all outcome measures, the effect of tranexamic acid in spot sign positive/negative participants will be compared using tests of interaction. This sub-study will investigate the important clinical hypothesis that spot sign positive patients might benefit more from administration of tranexamic acid than spot sign negative patients. Trial registration ISRCTN93732214 ( http://www.isrctn.com ).
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Liver-lung scan in the diagnosis of right subphrenic abscess
International Nuclear Information System (INIS)
Middleton, H.M. III; Patton, D.D.; Hoyumpa, A.M. Jr.; Schenker, S.
1976-01-01
To assess the value of liver-lung scanning in the diagnosis of right subphrenic abscess, 148 scans were reviewed against the corresponding charts. Of 91 scans with adequate clinical data, the overall scanning error was 19.3 percent, with 14 false positive and 3 false negative scans. Among 49 scans (of the initial group of 91 studies) in which the presence or absence of actual pathology was proved by surgery and/or autopsy, there were 3 true positive, 12 false positive, 29 true negative, and 3 false negative scans. Analysis of the data indicated lower accuracy of scan interpretation than generally reported, low specificity for positive scans and high specificity for negative scans, correlation of false interpretations with atypical degrees of liver-lung separation and with scanning defects in liver and lung, and failure of rereading to significantly improve the accuracy of interpretation.
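The counts reported in the abstract (3 true positive, 12 false positive, 29 true negative, 3 false negative) let the standard diagnostic summaries be recomputed. A minimal sketch, assuming the usual confusion-matrix definitions:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard confusion-matrix summaries for a diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # predictive value of a positive scan
        "npv": tn / (tn + fn),          # predictive value of a negative scan
    }

# Counts reported in the abstract for the surgically/autopsy-verified scans
m = diagnostic_metrics(tp=3, fp=12, tn=29, fn=3)
print(m["ppv"], m["npv"])  # 0.2 0.90625
```

The very low positive predictive value (0.2) and high negative predictive value (0.90625) match the abstract's observation of low specificity for positive scans and high specificity for negative scans.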
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
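The measures of location and spread discussed in the chapter can be computed directly with Python's standard library; the data values here are hypothetical illustration values, not taken from the chapter:

```python
import statistics

data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9]  # hypothetical measurements

# Measures of location
print("mean:  ", statistics.mean(data))
print("median:", statistics.median(data))

# Measures of spread
print("sample SD:", statistics.stdev(data))     # spread around the mean
print("variance: ", statistics.variance(data))  # SD squared
```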
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Linguistics in Text Interpretation
DEFF Research Database (Denmark)
Togeby, Ole
2011-01-01
A model for how text interpretation proceeds from what is pronounced, through what is said, to what is communicated, and a definition of the concepts 'presupposition' and 'implicature'.
Statistical interpretation of low energy nuclear level schemes
Energy Technology Data Exchange (ETDEWEB)
Egidy, T von; Schmidt, H H; Behkami, A N
1988-01-01
Nuclear level schemes and neutron resonance spacings yield information on level densities and level spacing distributions. A total of 75 nuclear level schemes with 1761 levels of known spins and parities was investigated. The A-dependence of level density parameters is discussed. The spacing distributions of levels near the ground state indicate transitional character between regular and chaotic properties, while chaos dominates near the neutron binding energy.
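The regular-versus-chaotic distinction drawn from spacing distributions is conventionally made by comparing against two reference forms: the Poisson distribution (uncorrelated, regular spectra) and the Wigner surmise (chaotic spectra, exhibiting level repulsion). A sketch of those two standard reference distributions:

```python
import math

def poisson_spacing(s: float) -> float:
    """P(s) = exp(-s): nearest-neighbor spacings of uncorrelated (regular) spectra."""
    return math.exp(-s)

def wigner_spacing(s: float) -> float:
    """Wigner surmise P(s) = (pi*s/2) * exp(-pi*s^2/4): level repulsion (chaotic)."""
    return (math.pi * s / 2.0) * math.exp(-math.pi * s * s / 4.0)

# Level repulsion: the chaotic form vanishes at zero spacing, the regular one does not
print(poisson_spacing(0.0), wigner_spacing(0.0))  # 1.0 0.0
```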
Statistical interpretation of WEBNET seismograms by artificial neural nets
Czech Academy of Sciences Publication Activity Database
Plešinger, Axel; Růžek, Bohuslav; Boušková, Alena
2000-01-01
Roč. 44, č. 2 (2000), s. 251-271 ISSN 0039-3169 R&D Projects: GA AV ČR IAA312104; GA ČR GA205/99/0907 Institutional research plan: CEZ:AV0Z3012916 Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.761, year: 2000
Radiopharmaceutical scanning agents
International Nuclear Information System (INIS)
1976-01-01
This invention is directed to dispersions useful in preparing radiopharmaceutical scanning agents, to technetium labelled dispersions, to methods for preparing such dispersions and to their use as scanning agents
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time-consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics that quantify the information in each image and use those to identify and pick out images of interest.
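One concrete per-image statistic of the kind described, shown purely as an illustration (the abstract does not specify which statistics the authors use), is the Shannon entropy of the pixel-value histogram: a flat image carries no information, a varied one more.

```python
import math
from collections import Counter

def histogram_entropy(pixels) -> float:
    """Shannon entropy (bits) of the empirical pixel-value distribution."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

flat = [0, 0, 0, 0]      # uniform image: zero information
half = [0, 0, 255, 255]  # two equally likely values: exactly 1 bit
print(histogram_entropy(half))  # 1.0
```

Images could then be ranked by this score and the top fraction flagged for inspection.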
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of an infinite sample often break down in the solution of real problems, and, dependent on data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).
International Nuclear Information System (INIS)
Engdahl, L.W.; Batter, J.F. Jr.; Stout, K.J.
1977-01-01
A scanning system for a gamma camera providing for the overlapping of adjacent scan paths is described. A collimator mask having tapered edges provides for a graduated reduction in intensity of radiation received by a detector thereof, the reduction in intensity being graduated in a direction normal to the scanning path to provide a blending of images of adjacent scan paths. 31 claims, 15 figures
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Wilhelm Wundt's Theory of Interpretation
Directory of Open Access Journals (Sweden)
Jochen Fahrenberg
2008-09-01
Wilhelm WUNDT was a pioneer in experimental and physiological psychology. However, his theory of interpretation (hermeneutics) remains virtually neglected. According to WUNDT, psychology belongs to the domain of the humanities (Geisteswissenschaften), and, throughout his books and research, he advocated two basic methodologies: experimentation (as the means of controlled self-observation) and interpretative analysis of mental processes and products. He was an experimental psychologist and a profound expert in traditional hermeneutics. Today, he still may be acknowledged as the author of the monumental Völkerpsychologie, but not for his advances in epistemology and methodology. His subsequent work, the Logik (1908/1921), contains about 120 pages on hermeneutics. In the present article a number of issues are addressed. Noteworthy was WUNDT's general intention to account for the logical constituents and the psychological process of understanding, and his reflections on quality control. In general, WUNDT demanded methodological pluralism and a complementary approach to the study of consciousness and neurophysiological processes. In the present paper WUNDT's approach is related to the continuing controversy on basic issues in methodology, e.g. experimental and statistical methods vs. qualitative (hermeneutic) methods. Varied explanations are given for the one-sided or distorted reception of WUNDT's methodology. Presently, in Germany the basic program of study in psychology lacks thorough teaching and training in qualitative (hermeneutic) methods. Appropriate courses are not included in the curricula, in contrast to the training in experimental design, observation methods, and statistics. URN: urn:nbn:de:0114-fqs0803291
Design research in statistics education : on symbolizing and computer tools
Bakker, A.
2004-01-01
The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
Interpretation of ultrasonic images; Interpretation von Ultraschall-Abbildungen
Energy Technology Data Exchange (ETDEWEB)
Mueller, W; Schmitz, V; Kroening, M [Fraunhofer-Institut fuer Zerstoerungsfreie Pruefverfahren, Saarbruecken (Germany)
1998-11-01
During the evaluation of ultrasonic images, e.g. SAFT-reconstructed B-scan images (SAFT = Synthetic Aperture Focusing Technique), it is often difficult to decide what caused a reconstructed image point: a defect, the specimen geometry, or mode conversion. To facilitate this evaluation, a tool based on the comparison of data sets was developed. Different kinds of data comparison are possible: identification of the RF-signals which caused a reconstructed image point, i.e. comparison of a reconstructed image with the corresponding RF-data; comparison of two reconstructed images by superposition using logical operators, e.g. comparing the reconstruction of an unknown reflector with that of a known one; and comparison of raw RF-data by simultaneous scanning through two data sets, where the echoes of an unknown reflector are compared with the echoes of a known one. The necessary data sets of known reflectors may be generated experimentally on reference reflectors or modelled. The aim is the identification of the reflector type, e.g. cracklike or not, and the determination of position, size and orientation, as well as the identification of accompanying satellite echoes. The interpretation of the SAFT-reconstructed B-scan image results in a complete description of the reflector. Beyond interpretation, the tool described is well suited for educating and training ultrasonic testers. (orig./MM)
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
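The random phasor sums the book covers can be illustrated with a short Monte Carlo: summing N unit-amplitude phasors with independent, uniformly distributed phases yields a resultant whose mean intensity equals N (the fully developed speckle regime). The parameters below are arbitrary illustration values:

```python
import cmath
import math
import random

def random_phasor_intensity(n_phasors: int, rng: random.Random) -> float:
    """Intensity |sum of n unit phasors with uniform random phases|^2."""
    total = sum(cmath.exp(1j * rng.uniform(0.0, 2.0 * math.pi))
                for _ in range(n_phasors))
    return abs(total) ** 2

rng = random.Random(0)  # seeded for reproducibility
trials = [random_phasor_intensity(100, rng) for _ in range(2000)]
mean_intensity = sum(trials) / len(trials)
print(mean_intensity)  # close to 100, the number of phasors
```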
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical Mechanics is self-contained and written in a lucid manner, with the university examination system in mind. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
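The Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics given full coverage in the text reduce to three closely related occupation-number formulas; a minimal sketch in reduced units:

```python
import math

def occupancy(e_over_kt: float, mu_over_kt: float, kind: str) -> float:
    """Mean occupation number at reduced energy (E - mu)/kT for the three statistics."""
    x = e_over_kt - mu_over_kt
    if kind == "maxwell-boltzmann":
        return math.exp(-x)                 # classical limit
    if kind == "fermi-dirac":
        return 1.0 / (math.exp(x) + 1.0)    # at most one particle per state
    if kind == "bose-einstein":
        return 1.0 / (math.exp(x) - 1.0)    # requires E > mu
    raise ValueError(kind)

# At E = mu, a Fermi-Dirac level is exactly half filled
print(occupancy(2.0, 2.0, "fermi-dirac"))  # 0.5
```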
DEFF Research Database (Denmark)
Agerbo, Heidi
2017-01-01
Approximately a decade ago, it was suggested that a new function should be added to the lexicographical function theory: the interpretive function(1). However, hardly any research has been conducted into this function, and though it was only suggested that this new function was relevant...... to incorporate into lexicographical theory, some scholars have since then assumed that this function exists(2), including the author of this contribution. In Agerbo (2016), I present arguments supporting the incorporation of the interpretive function into the function theory and suggest how non-linguistic signs...... can be treated in specific dictionary articles. However, in the current article, due to the results of recent research, I argue that the interpretive function should not be considered an individual main function. The interpretive function, contrary to some of its definitions, is not connected...
Cytological artifacts masquerading interpretation
Directory of Open Access Journals (Sweden)
Khushboo Sahay
2013-01-01
Conclusions: In order to justify a cytosmear interpretation, a cytologist must be well acquainted with delayed-fixation-induced cellular changes and the microscopic appearances of common contaminants, so as to support better prognosis and therapy.
Schrodinger's mechanics interpretation
Cook, David B
2018-01-01
The interpretation of quantum mechanics has been in dispute for nearly a century with no sign of a resolution. Using a careful examination of the relationship between the final form of classical particle mechanics (the Hamilton-Jacobi equation) and Schrödinger's mechanics, this book presents a coherent way of addressing the problems and paradoxes that emerge through conventional interpretations. Schrödinger's Mechanics critiques the popular way of giving physical interpretation to the various terms in perturbation theory and other technologies and places an emphasis on development of the theory and not on an axiomatic approach. When this interpretation is made, the extension of Schrödinger's mechanics in relation to other areas, including spin, relativity and fields, is investigated and new conclusions are reached.
Normative interpretations of diversity
DEFF Research Database (Denmark)
Lægaard, Sune
2009-01-01
Normative interpretations of particular cases consist of normative principles or values coupled with social theoretical accounts of the empirical facts of the case. The article reviews the most prominent normative interpretations of the Muhammad cartoons controversy over the publication of drawings...... of the Prophet Muhammad in the Danish newspaper Jyllands-Posten. The controversy was seen as a case of freedom of expression, toleration, racism, (in)civility and (dis)respect, and the article notes different understandings of these principles and how the application of them to the controversy implied different...... social theoretical accounts of the case. In disagreements between different normative interpretations, appeals are often made to the 'context', so it is also considered what roles 'context' might play in debates over normative interpretations...
Principles of radiological interpretation
International Nuclear Information System (INIS)
Rowe, L.J.; Yochum, T.R.
1987-01-01
Conventional radiographic procedures (plain film) are the most frequently utilized imaging modality in the evaluation of the skeletal system. This chapter outlines the essentials of skeletal imaging, anatomy, physiology, and interpretation
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Energy Technology Data Exchange (ETDEWEB)
Kwon, Yongwonn [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Park, Hee Sun, E-mail: heesun.park@gmail.com [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Kim, Young Jun; Jung, Sung Il; Jeon, Hae Jeong [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of)
2012-08-15
Objective: The purpose of this study is to evaluate whether nonenhanced CT or contrast-enhanced portal phase CT can replace multiphasic pancreas protocol CT in the short-term monitoring of patients with acute pancreatitis. Materials and methods: This retrospective study was approved by the Institutional Review Board. From April 2006 to May 2010, a total of 52 patients with acute pancreatitis who underwent initial dual phase multidetector-row CT (unenhanced, arterial, and portal phase) at admission and a short-term (within 30 days) follow-up dual phase CT (mean interval 10.3 days, range 3-28 days) were included. Two abdominal radiologists performed an independent review of three sets of follow-up CT images (nonenhanced scan, single portal phase scan, and dual phase scan). Interpretations of the image sets were separated by intervals of at least 2 weeks. The radiologists evaluated the severity of acute pancreatitis with regard to pancreatic inflammation, pancreatic necrosis, and extrapancreatic complications, based on the modified CT severity index. Scores for each image set were compared using a paired t-test, and interobserver agreement was evaluated using intraclass correlation coefficient statistics. Results: Mean scores of the summed CT severity index on nonenhanced scan, portal phase scan, and dual phase scan were 5.7, 6.6, and 6.5 for radiologist 1, and 5.0, 5.6, and 5.8 for radiologist 2, respectively. For both radiologists, the contrast-enhanced scans (portal phase scan and dual phase scan) showed significantly higher severity scores compared with the unenhanced scan (P < 0.05), while the portal phase and dual phase scans showed no significant difference from each other. The trend was similar for pancreatic inflammation and extrapancreatic complications, in which contrast-enhanced scans showed significantly higher scores compared with the unenhanced scan, while no significant difference was observed between the portal phase scan and dual phase scan. In pancreatic necrosis
Phillips, Richard L.; Chang, Kyu Hyun; Friedler, Sorelle A.
2017-01-01
Active learning has long been a topic of study in machine learning. However, as increasingly complex and opaque models have become standard practice, the process of active learning, too, has become more opaque. There has been little investigation into interpreting what specific trends and patterns an active learning strategy may be exploring. This work expands on the Local Interpretable Model-agnostic Explanations framework (LIME) to provide explanations for active learning recommendations. W...
Comparison of supine, upright, and prone positions for liver scans
International Nuclear Information System (INIS)
Harolds, J.A.; Brill, A.B.; Patton, J.A.; Touya, J.J.
1983-01-01
We compared liver scan interpretations based on anterior images obtained in the upright, prone, and supine positions. Receiver-operating-characteristic curves were generated for three well-trained observers. Results showed that reading the three different views together was more accurate than reading any individual image. Furthermore, interpretations based on either the prone or the upright view were superior to those using the supine view alone. The prone and upright views should be used more often in liver scanning.
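Observer studies like this one summarize each reading condition by the area under its ROC curve. A sketch of the AUC via the rank (Mann-Whitney) formulation, with hypothetical observer confidence scores; not the study's own analysis:

```python
def auc(labels, scores):
    """Area under the ROC curve: the probability that a randomly chosen
    positive case outscores a randomly chosen negative one, ties counting half."""
    pos = [s for l, s in zip(labels, scores) if l]
    neg = [s for l, s in zip(labels, scores) if not l]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

With scores pooled per viewing position, the per-position AUCs could then be compared to rank the supine, prone, and upright readings.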
International Nuclear Information System (INIS)
Hofer, Werner A
2012-01-01
In a recent paper we introduced a model of extended electrons, which is fully compatible with quantum mechanics in the formulation of Schrödinger. However, it contradicts the current interpretation of electrons as point-particles. Here, we show by a statistical analysis of high-resolution scanning tunneling microscopy (STM) experiments, that the interpretation of electrons as point particles and, consequently, the interpretation of the density of electron charge as a statistical quantity will lead to a conflict with the Heisenberg uncertainty principle. Given the precision in these experiments we find that the uncertainty principle would be violated by close to two orders of magnitude, if this interpretation were correct. We are thus forced to conclude that the density of electron charge is a physically real, i.e. in principle precisely measurable quantity, as derived in a recent paper. Experimental evidence to the contrary, in particular high-energy scattering experiments, is briefly discussed. The finding is expected to have wide implications in condensed matter physics, chemistry, and biology, scientific disciplines which are based on the properties and interactions of electrons.
Interpreter-mediated dentistry.
Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F
2015-05-01
The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter-mediated talk in cross-cultural general dentistry in Hong Kong, where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese-speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting were indicated: dentist-designated expansions; dentist-initiated interpretations; and assistant-initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product-delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals.
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.
Statistical analysis with Excel for dummies
Schmuller, Joseph
2013-01-01
Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro
Geographic analysis of forest health indicators using spatial scan statistics
John W. Coulston; Kurt H. Riitters
2003-01-01
Forest health analysts seek to define the location, extent, and magnitude of changes in forest ecosystems, to explain the observed changes when possible, and to draw attention to the unexplained changes for further investigation. The data come from a variety of sources including satellite images, field plot measurements, and low-altitude aerial surveys. Indicators...
MR guided spatial normalization of SPECT scans
International Nuclear Information System (INIS)
Crouch, B.; Barnden, L.R.; Kwiatek, R.
2010-01-01
Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)
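The affine coregistration step above maximizes mutual information between the SPECT and MR volumes. The criterion itself is simple to state for discretized intensities; a histogram-based sketch on toy intensity lists, not SPM's implementation:

```python
import math
from collections import Counter

def mutual_information(a, b):
    """Histogram-based mutual information (in nats) between two equally long
    sequences of discretized voxel intensities."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    # Sum p(x, y) * log(p(x, y) / (p(x) p(y))) over observed intensity pairs.
    return sum((c / n) * math.log((c / n) / ((pa[x] / n) * (pb[y] / n)))
               for (x, y), c in pab.items())
```

A registration algorithm would evaluate this criterion on the overlapping voxels after each candidate affine transform and keep the transform that maximizes it.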
... Chest PET scan; Lung positron emission tomography; PET - chest; PET - lung; PET - tumor imaging; ... Grainger & Allison's Diagnostic Radiology: A Textbook of Medical Imaging . 6th ed. Philadelphia, ...
International Nuclear Information System (INIS)
Robillard, J.
1977-01-01
The Centers against cancer of Caen, Angers, Montpellier and Strasbourg, and the Curie Foundation, have compared their experience in the detection of bone metastases by total-body scanning. From the investigation of 1,467 cancer patients by this procedure, the following results emerge: the comparison between radiography and scanning shows rates of false positives and false negatives identical to those in the literature; count-based scanning reduces the number of false positives; and scanning can guide bone biopsy and improve the yield of histological examination. [fr]
The Role of Biased Scanning in Counterattitudinal Advocacy
Cunningham, John D.; Collins, Barry E.
1977-01-01
Experiments tested biased-scanning hypothesis that high financial inducement leads to greater cognitive contact with counterattitudinal arguments and thus to greater attitude change. No differences in biased scanning or attitude change were observed as a function of financial inducement. Results were interpreted in framework of reactance and…
Scanning tunneling microscopy III theory of STM and related scanning probe methods
Güntherodt, Hans-Joachim
1996-01-01
Scanning Tunneling Microscopy III provides a unique introduction to the theoretical foundations of scanning tunneling microscopy and related scanning probe methods. The different theoretical concepts developed in the past are outlined, and the implications of the theoretical results for the interpretation of experimental data are discussed in detail. The book therefore serves as a most useful guide for experimentalists as well as for theoreticians working in the field of local probe methods. In this second edition the text has been updated and new methods are discussed.
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Evaluation of processing methods for static radioisotope scan images
International Nuclear Information System (INIS)
Oakberg, J.A.
1976-12-01
Radioisotope scanning in the field of nuclear medicine provides a method for the mapping of a radioactive drug in the human body to produce maps (images) which prove useful in detecting abnormalities in vital organs. At best, radioisotope scanning methods produce images with poor counting statistics. One solution to improving the body scan images is using dedicated small computers with appropriate software to process the scan data. Eleven methods for processing image data are compared
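The report compares eleven processing methods for such count-limited images. As one illustrative example of the kind of smoothing involved (not necessarily one of the report's eleven methods), a 3x3 median filter suppresses isolated noise spikes typical of poor counting statistics:

```python
from statistics import median

def median3x3(img):
    """3x3 median filter over a 2-D grid of counts, a classic smoothing step
    for count-limited scan images; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = median(img[j][i] for j in (y - 1, y, y + 1)
                                         for i in (x - 1, x, x + 1))
    return out
```

Unlike a mean filter, the median preserves edges between organ and background while removing single hot or cold pixels.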
FDG-PET scan in assessing lymphomas and the application of Deauville Criteria
International Nuclear Information System (INIS)
Awan, U.E.K.; Siddiqui, N.; Muzaffar, N.; Farooqui, Z.S.
2013-01-01
To evaluate the role of Fluorine-18-fluorodeoxyglucose Positron Emission Tomography (FDG-PET) scans in staging and their implications for the treatment of lymphoma, and to study the concordance between visual assessment and the Deauville criteria for the interpretation of interim scans. Methods: The prospective single-arm experimental study was conducted at the Shaukat Khanum Memorial Cancer Hospital, Lahore, from May 2011 to October 2011. It comprised 53 newly diagnosed lymphoma patients who agreed to participate in the study. All patients underwent scans with contrast-enhanced computerised tomography at baseline. The treatment plan was formulated based on the final stage. Interim scans were acquired after 2 cycles of chemotherapy and were reported using visual criteria and compared with the 5-point Deauville criteria. A score of 1-3 was taken as disease-negative, while 4-5 was taken as disease-positive. SPSS 19 was used for statistical analysis. Results: Of the 53 patients, 35 (66%) had Hodgkin's Lymphoma, while 18 (34%) had Non-Hodgkin's Lymphoma. Scans resulted in disease upstaging in 4 (7.5%) patients, and detected increased disease burden in 12 (23%). On interim scans, complete remission was achieved in 38 (71%) patients (Deauville score 1-3); 12 (23%) showed partial response (Deauville score 4-5); and 3 (6%) had progression. The kappa test was statistically significant (kappa 0.856; p < 0.001). Conclusion: Positron emission tomography helped to upstage lymphoma and reflected increased disease burden. The Deauville criteria correlated very well with the visual assessment criteria and can be applied in this patient population. (author)
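The concordance figure above is Cohen's kappa, computed between the two dichotomized readings (visual vs. Deauville, each reduced to negative/positive). A sketch of the statistic itself, with hypothetical rating lists rather than the study's data:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement between two raters' categorical
    labels, corrected for the agreement expected by chance."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in c1) / (n * n)          # chance agreement
    return (po - pe) / (1 - pe)
```

A kappa of 0.856, as reported, sits in the range conventionally read as "almost perfect" agreement.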
Conjunctive interpretations of disjunctions
Directory of Open Access Journals (Sweden)
Robert van Rooij
2010-09-01
Full Text Available In this extended commentary I discuss the problem of how to account for "conjunctive" readings of some sentences with embedded disjunctions for globalist analyses of conversational implicatures. Following Franke (2010, 2009), I suggest that earlier proposals failed because they did not take into account the interactive reasoning about what else the speaker could have said, and how else the hearer could have interpreted the (alternative) sentence(s). I show how Franke's idea relates to more traditional pragmatic interpretation strategies. doi:10.3765/sp.3.11
Measurement and statistics for teachers
Van Blerkom, Malcolm
2008-01-01
Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available.Comprehensive and accessible, Measurement and Statistics for Teachers includes:Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...
Strunk, Amber; Gazdovich, Jennifer; Redouté, Oriane; Reverte, Juan Manuel; Shelley, Samantha; Todorova, Vesela
2018-05-01
This paper provides a brief introduction to antimatter and how it, along with other modern physics topics, is utilized in positron emission tomography (PET) scans. It further describes a hands-on activity for students to help them gain an understanding of how PET scans assist in detecting cancer. Modern physics topics provide an exciting way to introduce students to current applications of physics.
Scanning laser Doppler vibrometry
DEFF Research Database (Denmark)
Brøns, Marie; Thomsen, Jon Juel
With a Scanning Laser Doppler Vibrometer (SLDV) a vibrating surface is automatically scanned over predefined grid points, and data processed for displaying vibration properties like mode shapes, natural frequencies, damping ratios, and operational deflection shapes. Our SLDV – a PSV-500H from...
Full Text Available Thyroid Scan and Uptake: Thyroid scan and uptake uses ...
International Nuclear Information System (INIS)
Luo Chuanwen; Wang Gang; Wang Chuncheng; Wei Junjie
2009-01-01
The concepts of uniform index and expectation uniform index are two mathematical descriptions of the uniformity and the mean uniformity of a finite set in a polyhedron. The concepts of instantaneous chaometry (ICM) and k-step chaometry (kSCM) are introduced in order to apply statistical methods to the study of nonlinear difference equations. It is found that k-step chaometry is an indirect estimate of the expectation uniform index. Simulations illustrate that the expectation uniform index for the Lorenz system increases linearly with parameter b, but nonlinearly for Chen's system. In other words, the orbits of each system become more and more uniform as parameter b increases. Finally, a conjecture is put forward, which implies that chaos can be interpreted through the mean uniformity of its orbit, described by the expectation uniform index and indirectly estimated by kSCM. The kSCM of the heart rate shows the weakening and aging process of the heart.
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Transverse section scanning mechanism
International Nuclear Information System (INIS)
Doherty, E.J.
1978-01-01
Apparatus is described for scanning a transverse, radionuclide scan-field using an array of focussed collimators. The collimators are movable tangentially on rails, driven by a single motor via a coupled screw. The collimators are also movable in a radial direction on rails driven by a step motor via coupled screws and bevel gears. Adjacent bevel gears rotate in opposite directions so adjacent collimators move in radially opposite directions. In use, the focal point of each collimator scans at least half of the scan-field, e.g. a human head located in the central aperture, and the electrical outputs of detectors associated with each collimator are used to determine the distribution of radioactive emission intensity at a number of points in the scan-field. (author)
Quantum Statistics and Entanglement Problems
Trainor, L. E. H.; Lumsden, Charles J.
2002-01-01
Interpretations of quantum measurement theory have been plagued by two questions, one concerning the role of observer consciousness and the other the entanglement phenomenon arising from the superposition of quantum states. We emphasize here the remarkable role of quantum statistics in describing the entanglement problem correctly and discuss the relationship to issues arising from current discussions of intelligent observers in entangled, decohering quantum worlds.
An introduction to medical statistics
International Nuclear Information System (INIS)
Hilgers, R.D.; Bauer, P.; Scheiber, V.; Heitmann, K.U.
2002-01-01
This textbook teaches all aspects and methods of biometrics as a field of concentration in medical education. Instrumental interpretations of the theory, concepts and terminology of medical statistics are enhanced by numerous illustrations and examples. With problems, questions and answers. (orig./CB) [de
Statistics Poster Challenge for Schools
Payne, Brad; Freeman, Jenny; Stillman, Eleanor
2013-01-01
The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.
Interpreting & Biomechanics. PEPNet Tipsheet
PEPNet-Northeast, 2001
2001-01-01
Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…
Tokens: Facts and Interpretation.
Schmandt-Besserat, Denise
1986-01-01
Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…
DEFF Research Database (Denmark)
Hauschild, Michael Z.; Bonou, Alexandra; Olsen, Stig Irving
2018-01-01
The interpretation is the final phase of an LCA where the results of the other phases are considered together and analysed in the light of the uncertainties of the applied data and the assumptions that have been made and documented throughout the study. This chapter teaches how to perform an inte...
Interpretations of Greek Mythology
Bremmer, Jan
1987-01-01
This collection of original studies offers new interpretations of some of the best known characters and themes of Greek mythology, reflecting the complexity and fascination of the Greek imagination. Following analyses of the concept of myth and the influence of the Orient on Greek mythology, the
Translation, Interpreting and Lexicography
DEFF Research Database (Denmark)
Dam, Helle Vrønning; Tarp, Sven
2018-01-01
in the sense that their practice fields are typically ‘about something else’. Translators may, for example, be called upon to translate medical texts, and interpreters may be assigned to work on medical speeches. Similarly, practical lexicography may produce medical dictionaries. In this perspective, the three...
Directory of Open Access Journals (Sweden)
V. V. Elizarov
2016-11-01
Full Text Available Subject of Research. The results of the development of a combined lidar scanning unit for locating hydrocarbon leaks are presented. The unit performs high-speed scanning of the investigated space in wide- and narrow-angle fields. Method. Scanning in the wide angular field is produced along a one-line scanning path by means of a movable aluminum mirror with a frequency of 20 Hz and a swing amplitude of 20 degrees. Narrow-field scanning is performed along a spiral path by the deflector. The beam is deflected by rotating the optical wedges forming part of the deflector at an angle of ±50. The scanning node is controlled by specialized software written in the C# programming language. Main Results. The scanning unit scans the investigated area at a distance of 50-100 m with spatial resolution at the level of 3 cm. The positioning accuracy of the laser beam in space is 15'. The developed unit can sweep the entire investigated area in no more than 1 ms at a rotation frequency of each wedge from 50 to 200 Hz. The problem of unambiguously determining the beam's geographic coordinates in space is solved at the software level from the rotation angles of the mirrors and optical wedges. The lidar system's coordinates are determined by means of GPS. Practical Relevance. These results open the possibility of increasing the spatial resolution of the scanning systems of a wide range of lidars and can provide high positioning accuracy of the laser beam in space.
READING STATISTICS AND RESEARCH
Directory of Open Access Journals (Sweden)
Reviewed by Yavuz Akbulut
2008-10-01
Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of the skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).
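The skewness and kurtosis indices the review singles out are straightforward to compute from central moments. A sketch using the biased (population) definitions; note that textbooks and packages differ on the small-sample correction applied:

```python
from statistics import mean

def skew_kurt(xs):
    """Skewness and excess kurtosis from biased (population) central moments.
    Returns (skewness, excess kurtosis); a normal sample gives values near (0, 0)."""
    m = mean(xs)
    n = len(xs)
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3
```

This is the kind of detail the review praises the book for revisiting: whether a reported kurtosis is raw or excess (raw minus 3) changes its interpretation entirely.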
Applied statistics in ecology: common pitfalls and simple solutions
E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick
2013-01-01
The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...
Directory of Open Access Journals (Sweden)
Håkan Olsson
2012-09-01
Full Text Available The introduction of Airborne Laser Scanning (ALS to forests has been revolutionary during the last decade. This development was facilitated by combining earlier ranging lidar discoveries [1–5], with experience obtained from full-waveform ranging radar [6,7] to new airborne laser scanning systems which had components such as a GNSS receiver (Global Navigation Satellite System, IMU (Inertial Measurement Unit and a scanning mechanism. Since the first commercial ALS in 1994, new ALS-based forest inventory approaches have been reported feasible for operational activities [8–12]. ALS is currently operationally applied for stand level forest inventories, for example, in Nordic countries. In Finland alone, the adoption of ALS for forest data collection has led to an annual savings of around 20 M€/year, and the work is mainly done by companies instead of governmental organizations. In spite of the long implementation times and there being a limited tradition of making changes in the forest sector, laser scanning was commercially and operationally applied after about only one decade of research. When analyzing high-ranked journal papers from ISI Web of Science, the topic of laser scanning of forests has been the driving force for the whole laser scanning research society over the last decade. Thus, the topic “laser scanning in forests” has provided a significant industrial, societal and scientific impact. [...
International Nuclear Information System (INIS)
Gordon, I.; Peters, A.M.
1987-01-01
In 1984, a survey carried out in 21 countries in Europe showed that bone scintigraphy comprised 16% of all paediatric radioisotope scans. Although the value of bone scans in paediatrics is potentially great, their quality varies greatly, and poor-quality images are giving this valuable technique a bad reputation. The handling of children requires a sensitive staff and the provision of a few simple, inexpensive items of distraction. Attempting simply to scan a child between two adult patients in a busy general department is a recipe for an unhappy, uncooperative child, with the probable result of poor images. The intravenous injection of isotope should be given adjacent to the gamma camera room, unless dynamic scans are required, so that the child does not associate the camera with the injection. The injection is best carried out by someone competent in paediatric venipuncture; the entire procedure should be explained to the child and parent, who should remain with the child throughout. It is naive to think that silence makes for a cooperative child. The sensitivity of bone-seeking radioisotope tracers and the marked improvement in gamma camera resolution have allowed bone scanning to become an integrated technique in the assessment of children suspected of suffering from pathological bone conditions. The tracer most commonly used for routine bone scanning is 99mTc diphosphonate (MDP); other isotopes used include 99mTc colloid for bone marrow scans, and 67Ga citrate and 111In white blood cells (111In WBC) for the investigation of inflammatory/infective lesions.
Isotope scanning for tumor localization
Energy Technology Data Exchange (ETDEWEB)
NONE
1961-09-15
At the request of the Government of the United Arab Republic, the Agency provided the services of an expert for the establishment in the UAR of a tumor localization program using photoscanning techniques and appropriate radioactive tracers. Photoscanning is a recently developed technique whereby the differences in isotope concentrations are enhanced on the record, which facilitates its interpretation. A variety of brain tumors were located using a suitable radioactive tracer (Hg-203-labelled Neohydrin) obtained from the USA. In some other investigations, processes in the kidney were scanned. Further, radioactive gold was used to demonstrate the normal and pathological spleen and liver, and these tests showed various types of space-occupying lesions resulting from malignancy and the parasitic infections endemic to the area. While the localization of brain tumors by scanning techniques is extremely useful, it does not always establish the precise extent of the tumor, which should be known at the time of surgery. Dr. Bender, therefore, thought it advisable to instruct personnel in the use of what is known as an in-vivo needle scintillation probe - a technique for the investigation of the isotope concentration in a particular tissue during operation. The necessary instrument was obtained for this purpose and demonstrations were given; one patient was examined in this way at the time of surgery at the University of Alexandria Hospital.
Taylor, Andrew T; Garcia, Ernest V
2014-01-01
The goal of artificial intelligence, expert systems, decision support systems and computer-assisted diagnosis (CAD) in imaging is the development and implementation of software to assist in the detection and evaluation of abnormalities, to alert physicians to cognitive biases, to reduce intra- and inter-observer variability and to facilitate the interpretation of studies at a faster rate and with a higher level of accuracy. These developments are needed to meet the challenges resulting from a rapid increase in the volume of diagnostic imaging studies coupled with a concurrent increase in the number and complexity of images in each patient dataset. The convergence of an expanding knowledge base and escalating time constraints increases the likelihood of physician errors. Errors are even more likely when physicians interpret low-volume studies such as 99mTc-MAG3 diuretic scans, where imagers may have had limited training or experience. Decision support systems include neural networks, case-based reasoning, expert systems and statistical systems. iRENEX (renal expert) is an expert system for diuretic renography that uses a set of rules obtained from human experts to analyze a knowledge base of both clinical parameters and quantitative parameters derived from the renogram. Initial studies have shown that the interpretations provided by iRENEX are comparable to the interpretations of a panel of experts. iRENEX provides immediate patient-specific feedback at the time of scan interpretation, can be queried to provide the reasons for its conclusions and can be used as an educational tool to teach trainees to better interpret renal scans. iRENEX also has the capacity to populate a structured reporting module and generate a clear and concise impression based on the elements contained in the report; adherence to the procedural and data entry components of the structured reporting module assures and documents procedural competency. Finally, although the focus is CAD applied to
Personal literary interpretation
Directory of Open Access Journals (Sweden)
Michał Januszkiewicz
2015-11-01
Full Text Available The article titled “Personal literary interpretation” deals with problems which have usually been marginalized in literary studies, but which seem to be very important in the context of the humanities, as broadly defined. The author of this article intends to rethink the problem of literary studies not in objective, but in personal terms. This is why the author wants to talk about what he calls personal literary interpretation, which has nothing to do with subjective or irrational thinking, but which is rather grounded in the hermeneutical rule that says that one must believe in order to understand a text or the other (where ‘believe’ also means: ‘to love’, ‘engage’, and ‘be open’). The article presents different determinants of this attitude, ranging from Dilthey to Heidegger and Gadamer. Finally, the author subscribes to the theory of personal interpretation, which is always dialogical.
Interpretation and clinical applications
International Nuclear Information System (INIS)
Higgins, C.B.
1987-01-01
This chapter discusses the factors to be kept in mind during routine interpretation of MR images. This includes the factors that determine contrast on standard spin-echo images and some distinguishing features between true lesions and artifactually simulated lesions. This chapter also indicates the standard protocols for MRI of various portions of the body. Finally, the current indications for MRI of various portions of the body are suggested; however, it is recognized that the indications for MRI are rapidly increasing and consequently, at the time of publication of this chapter, it is likely that many more applications will have become evident. Interpretation of magnetic resonance (MR) images requires consideration of anatomy and tissue characteristics and extraction of artifacts resulting from motion and other factors
Gianni Vattimo
2013-01-01
Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietz...
Directory of Open Access Journals (Sweden)
Gianni Vattimo
2013-01-01
Full Text Available Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietzsche’s writings on nihilism, which is not to be understood in a purely negative sense. Vattimo suggests that nihilism not only expands the Christian message of charity, but also transforms it into its endless human potential. In “The Age of Interpretation,” the author shows that hermeneutical radicalism “reduces all reality to message,” so that the opposition between facts and norms turns out to be misguided, for both are governed by the interpretative paradigms through which someone (always a concrete, historically situated someone) makes sense of them. Vattimo rejects some of the deplorable political consequences of hermeneutics and claims that traditional hermeneutics is in collusion with various political-ideological neutralizations.
Full Text Available ... for a thyroid scan is 30 minutes or less. Thyroid Uptake You will be given radioactive iodine ( ... for each thyroid uptake is five minutes or less. top of page What will I experience during ...
Full Text Available ... evaluate changes in the gland following medication use, surgery, radiotherapy or chemotherapy top of page How should ... such as an x-ray or CT scan, surgeries or treatments using iodinated contrast material within the ...
Full Text Available ... abnormal was found, and should not be a cause of concern for you. If you had an ... abnormal was found, and should not be a cause of concern for you. Actual scanning time for ...
Tomographic scanning apparatus
International Nuclear Information System (INIS)
1981-01-01
Details are given of a tomographic scanning apparatus, with particular reference to a multiplexer slip ring means for receiving output from the detectors and enabling interfeed to the image reconstruction station. (U.K.)
Tomographic scanning apparatus
International Nuclear Information System (INIS)
1981-01-01
Details are presented of a tomographic scanning apparatus, its rotational assembly, and the control and circuit elements, with particular reference to the amplifier and multiplexing circuits enabling detector signal calibration. (U.K.)
Tomographic scanning apparatus
International Nuclear Information System (INIS)
1981-01-01
This patent specification relates to a tomographic scanning apparatus using a fan beam and digital output signal, and particularly to the design of the gas-pressurized ionization detection system. (U.K.)
The Radiation Epidemiology Branch and collaborators have initiated a retrospective cohort study to evaluate the relationship between radiation exposure from CT scans conducted during childhood and adolescence and the subsequent development of cancer.
Full Text Available ... which are encased in metal and plastic and most often shaped like a box, attached to a ... will I experience during and after the procedure? Most thyroid scan and thyroid uptake procedures are painless. ...
... make to decrease the risk of heart disease. Risks of CT scans include: being exposed to ...
Full Text Available ... eat for several hours before your exam because eating can affect the accuracy of the uptake measurement. ... often unattainable using other imaging procedures. For many diseases, nuclear medicine scans yield the most useful information ...
Full Text Available ... A thyroid scan is a type of nuclear medicine imaging. The radioactive iodine uptake test (RAIU) is ... thyroid function, but does not involve imaging. Nuclear medicine is a branch of medical imaging that uses ...
Full Text Available ... that help physicians diagnose and evaluate medical conditions. These imaging scans use radioactive materials called radiopharmaceuticals or ... or had thyroid cancer. A physician may perform these imaging tests to: determine if the gland is ...
Full Text Available ... Because nuclear medicine procedures are able to pinpoint molecular activity within the body, they offer the potential ... or imaging device that produces pictures and provides molecular information. The thyroid scan and thyroid uptake provide ...
Full Text Available ... Actual scanning time for each thyroid uptake is five minutes or less. top of page What will ... diagnostic procedures have been used for more than five decades, and there are no known long-term ...
Full Text Available ... top of page Additional Information and Resources RTAnswers.org Radiation Therapy for Head and Neck Cancer top ...
Full Text Available ... often unattainable using other imaging procedures. For many diseases, nuclear medicine scans yield the most useful information needed to make a diagnosis or to determine appropriate treatment, if any. Nuclear medicine is less expensive and ...
Full Text Available ... the gamma camera and single-photon emission-computed tomography (SPECT). The gamma camera, also called a scintillation ... high as with other imaging techniques, such as CT or MRI. However, nuclear medicine scans are more ...
Scanning Auger Electron Microscope
Federal Laboratory Consortium — A JEOL model 7830F field emission source, scanning Auger microscope. Specifications / Capabilities: ultra-high vacuum (UHV), electron gun range from 0.1 kV to 25 kV, ...
Full Text Available ... as an overactive thyroid gland, a condition called hyperthyroidism , cancer or other growths assess the nature of ... an x-ray or CT scan, surgeries or treatments using iodinated contrast material within the last two ...
Full Text Available ... painless. However, during the thyroid scan, you may feel uncomfortable when lying completely still with your head ... When the radiotracer is given intravenously, you will feel a slight pin prick when the needle is ...
Full Text Available ... energy. top of page What are some common uses of the procedure? The thyroid scan is used ... community, you can search the ACR-accredited facilities database . This website does not provide cost information. The ...
Full Text Available ... scan and thyroid uptake provide information about the structure and function of the thyroid. The thyroid is ... computer, create pictures offering details on both the structure and function of organs and tissues in your ...
Full Text Available ... found, and should not be a cause of concern for you. If you had an intravenous line ... found, and should not be a cause of concern for you. Actual scanning time for each thyroid ...
... a CT scan can be reformatted in multiple planes, and can even generate three-dimensional images. These ... other medical conditions and whether you have a history of heart disease, asthma, diabetes, kidney disease or ...
Full Text Available ... the gland following medication use, surgery, radiotherapy or chemotherapy top of page How should I prepare? You ... You will receive specific instructions based on the type of scan you are undergoing. top of page ...
Full Text Available ... Uptake? A thyroid scan is a type of nuclear medicine imaging. The radioactive iodine uptake test (RAIU) ... of thyroid function, but does not involve imaging. Nuclear medicine is a branch of medical imaging that ...
Tomographic scanning apparatus
International Nuclear Information System (INIS)
1981-01-01
This patent specification describes a tomographic scanning apparatus, with particular reference to the adjustable fan beam and its collimator system, together with the facility for taking a conventional x-radiograph without moving the patient. (U.K.)
Full Text Available ... exam of any medications you are taking, including vitamins and herbal supplements. You should also inform them ... of scan you are undergoing. top of page What does the equipment look like? The special camera ...
The Scanning Optical Microscope.
Sheppard, C. J. R.
1978-01-01
Describes the principle of the scanning optical microscope and explains its advantages over the conventional microscope in the improvement of resolution and contrast, as well as the possibility of producing a picture from optical harmonies generated within the specimen.
Full Text Available ... the gland following medication use, surgery, radiotherapy or chemotherapy top of page How should I prepare? You ... but is often performed on hospitalized patients as well. Thyroid Scan You will be positioned on an ...
The role of key image notes in CT imaging study interpretation.
Fan, Shu-Feng; Xu, Zhe; He, Hai-Qing; Ding, Jian-Rong; Teng, Gao-Jun
2011-04-01
The objective of the study was to investigate the clinical effects of CT key image notes (KIN) in the interpretation of a CT imaging study. All experiments were approved by the ethics committee of the local district. Six experienced radiologists were equally divided into a routine reporting (RR) group and a KIN reporting (KIN) group. CT scans of 100 consecutive cases each, before and after using the KIN technique, were randomly selected, and the reports were made by groups RR and KIN, respectively. All the reports were reviewed again 3 months later by both groups. All the results, with or without KIN, were interpreted and reinterpreted after 3 months by six clinicians experienced in picture archiving and communication system (PACS) applications, who were equally divided into a clinical routine report group and a clinical KIN report group. The results were statistically analyzed; the time used in making a report, the re-reading time 3 months later, and the consistency of imaging interpretation were determined and compared between groups. After using the KIN technique, the time used in making a report was significantly increased (8.77 ± 5.27 vs. 10.53 ± 5.71 min, P < 0.05), the re-reading time was decreased (5.23 ± 2.54 vs. 4.99 ± 1.70 min, P < 0.05), the clinical interpretation and reinterpretation times after 3 months were decreased, and the consistency of interpretation and reinterpretation between different doctors at different times was markedly improved (P < 0.01). A CT report made with the KIN technique in PACS can significantly improve the consistency of interpretation and the efficiency of routine clinical work.
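Paired comparisons of the kind reported above (the same readers' reporting times measured before and after a change of technique) are commonly assessed with a paired t-test. The sketch below is purely illustrative, using made-up timing data rather than the study's, and the 2.571 critical value is the standard two-sided 5% threshold for 5 degrees of freedom:

```python
import math

def paired_t(before, after):
    """Paired t statistic for two matched samples (illustrative sketch)."""
    assert len(before) == len(after)
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Hypothetical report-making times (minutes) for six readers, before/after:
before = [8.1, 9.4, 7.7, 10.2, 8.8, 9.9]
after = [10.5, 11.0, 9.8, 12.1, 10.4, 11.6]

t = paired_t(before, after)
# |t| is compared against the t distribution with n - 1 = 5 degrees of
# freedom; |t| > 2.571 corresponds to P < 0.05 (two-sided).
print(round(t, 2))
```

A negative t here indicates the "after" times are longer, mirroring the direction of the reported increase in report-making time.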
Multimodal integration in statistical learning
DEFF Research Database (Denmark)
Mitchell, Aaron; Christiansen, Morten Hyllekvist; Weiss, Dan
2014-01-01
Recent advances in the field of statistical learning have established that learners are able to track regularities of multimodal stimuli, yet it is unknown whether the statistical computations are performed on integrated representations or on separate, unimodal representations. In the present study, we investigated the ability of adults to integrate audio and visual input during statistical learning. We presented learners with a speech stream synchronized with a video of a speaker’s face. In the critical condition, the visual (e.g., /gi/) and auditory (e.g., /mi/) signals were occasionally ... facilitated participants’ ability to segment the speech stream. Our results therefore demonstrate that participants can integrate audio and visual input to perceive the McGurk illusion during statistical learning. We interpret our findings as support for modality-interactive accounts of statistical learning.
Full Text Available ... of page Who interprets the results and how do I get them? A radiologist or other physician ...
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Scanning high-Tc SQUID imaging system for magnetocardiography
International Nuclear Information System (INIS)
Yang, H-C; Wu, T-Y; Horng, H-E; Wu, C-C; Yang, S Y; Liao, S-H; Wu, C-H; Jeng, J T; Chen, J C; Chen, Kuen-Lin; Chen, M J
2006-01-01
A scanning magnetocardiography (MCG) system constructed from SQUID sensors offers potential for basic and clinical research in biomagnetism. In this work, we study a first-order scanning electronic high-Tc (HTS) SQUID MCG system for biomagnetic signals. The scanning MCG system was equipped with an x-y translation bed powered by step motors. Using noise cancellation and μ-metal shielding, we reduced the noise level substantially. The established scanning HTS MCG system was used to study the magnetophysiology of hypercholesterolaemic (HC) rabbits. The MCG data of HC rabbits were analysed. The MCG contour map of HC rabbits provides experimental models for the interpretation of human cardiac patterns
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
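The matrix formulation of linear least squares and the role of the variance-covariance matrix described above can be sketched briefly. This is a generic illustration (synthetic straight-line data, not from the text): the solution is beta = (XᵀX)⁻¹Xᵀy, and parameter standard errors come from the diagonal of s²(XᵀX)⁻¹, where s² is the residual variance.

```python
import numpy as np

# Synthetic data: y = 2.0 + 0.5 x plus Gaussian noise (illustrative values).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)

X = np.column_stack([np.ones_like(x), x])      # design matrix [1, x]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # fitted [intercept, slope]

resid = y - X @ beta
dof = x.size - X.shape[1]                      # degrees of freedom
s2 = resid @ resid / dof                       # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)              # variance-covariance matrix
se = np.sqrt(np.diag(cov))                     # parameter standard errors

print(beta)  # close to [2.0, 0.5]
print(se)
```

The same variance-covariance matrix is what the abstract refers to when discussing correlated parameters, error propagation, and experiment design.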
Statistical ensembles in quantum mechanics
International Nuclear Information System (INIS)
Blokhintsev, D.
1976-01-01
The interpretation of quantum mechanics presented in this paper is based on the concept of quantum ensembles. This concept differs essentially from the canonical one in that the observer's interference with the state of a microscopic system is of no greater importance than in any other field of physics. Owing to this fact, the laws established by quantum mechanics are of no less objective character than the laws governing classical statistical mechanics. The paradoxical nature of some statements of quantum mechanics, which result from the interpretation of the wave function as the observer's notebook, greatly stimulated the development of the idea presented. (Auth.)
Interpretation of Internet technology
DEFF Research Database (Denmark)
Madsen, Charlotte Øland
2001-01-01
Research scope: The topic of the research project is to investigate how new internet technologies such as e-trade and customer relation marketing and management are implemented in Danish food processing companies. The aim is to use Weick's (1995) sensemaking concept to analyse the strategic processes leading to the use of internet marketing technologies and to investigate how these new technologies are interpreted into the organisation. Investigating the organisational socio-cognitive processes underlying the decision making processes will give further insight into the socio...
Changing interpretations of Plotinus
DEFF Research Database (Denmark)
Catana, Leo
2013-01-01
about method point in other directions. Eduard Zeller (active in the second half of the 19th century) is typically regarded as the first to give a satisfying account of Plotinus’ philosophy as a whole. In this article, on the other hand, Zeller is seen as the one who finalised a tradition initiated in the 18th century. Very few Plotinus scholars have examined the interpretative development prior to Zeller. Schiavone (1952) and Bonetti (1971), for instance, have given little attention to Brucker’s introduction of the concept system of philosophy. The present analysis, then, has value...
SOCR: Statistics Online Computational Resource
Directory of Open Access Journals (Sweden)
Ivo D. Dinov
2006-10-01
Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.
Data Interpretation: Using Probability
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
Physical interpretation of antigravity
Bars, Itzhak; James, Albin
2016-02-01
Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM+GR), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.
International Nuclear Information System (INIS)
Charkes, N.D.; Malmud, L.S.; Caswell, T.; Goldman, L.; Hall, J.; Lauby, V.; Lightfoot, W.; Maier, W.; Rosemond, G.
1975-01-01
Strontium nitrate Sr-87m bone scans were made preoperatively in a group of women with suspected breast cancer, 35 of whom subsequently underwent radical mastectomy. In 3 of the 35 (9 percent), the scans were abnormal despite the absence of clinical or roentgenographic evidence of metastatic disease. All three patients had extensive axillary lymph node involvement by tumor, and went on to have additional bone metastases, from which one died. Roentgenograms failed to detect the metastases in all three. Occult bone metastases account in part for the failure of radical mastectomy to cure some patients with breast cancer. It is recommended that all candidates for radical mastectomy have a preoperative bone scan. (U.S.)
Frequency scanning microstrip antennas
DEFF Research Database (Denmark)
Danielsen, Magnus; Jørgensen, Rolf
1979-01-01
The principles of using radiating microstrip resonators as elements in a frequency scanning antenna array are described. The resonators are cascade-coupled. This gives a scan of the main lobe due to the phase-shift in the resonator in addition to that created by the transmission line phase-shift. Experimental results in X-band, in good agreement with the theory, show that it is possible to scan the main lobe an angle of ±30° by a variation of the frequency of ±300 MHz, where the 3 dB beamwidth is less than 10°. The directivity was 14.7 dB, while the gain was 8.1 dB. The efficiency might be improved...
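The quoted gain and directivity figures directly imply the antenna's radiation efficiency, since efficiency is the ratio of gain to directivity. A quick back-of-the-envelope check, using only the numbers given in the abstract:

```python
# Figures quoted in the abstract:
directivity_db = 14.7
gain_db = 8.1

# Radiation efficiency in linear terms: eta = G / D = 10^((G_dB - D_dB) / 10)
efficiency = 10 ** ((gain_db - directivity_db) / 10)
print(f"{efficiency:.0%}")  # roughly 22%
```

An efficiency near 22% is consistent with the abstract's closing remark that the efficiency might be improved.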
The Statistics of wood assays for preservative retention
Patricia K. Lebow; Scott W. Conklin
2011-01-01
This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.
Radiopharmaceutical agents for skeletal scanning
International Nuclear Information System (INIS)
Jansen, S.E.; Van Aswegen, A.; Loetter, M.G.; Minnaar, P.C.; Otto, A.C.; Goedhals, L.; Dedekind, P.S.
1987-01-01
The quality of bone scan images obtained with a locally produced and with an imported radiopharmaceutical bone agent, methylene diphosphonate (MDP), was compared visually. Standard skeletal imaging was carried out on 10 patients using both agents, with a period of 2 to 7 days between studies with the alternate agents. Equal amounts of activity were administered for both agents. All images were acquired on Polaroid film for subsequent evaluation. The acquisition time for a standard number of counts per study was recorded. Three physicians with applicable experience evaluated image quality (on a 4-point scale) and detectability of metastases (on a 3-point scale). There was no statistically significant difference (p > 0.05) between the two agents by paired t-test or Hotelling's T² analysis. It is concluded that the imaging properties of the locally produced and the imported MDP are similar.
Tomographic scanning apparatus
International Nuclear Information System (INIS)
Abele, M.
1983-01-01
A computerized tomographic scanning apparatus suitable for diagnosis and for improving target identification in stereotactic neurosurgery is described. It consists of a base, a source of penetrating energy, a detector which produces scanning signals and detector positioning means. A frame with top and bottom arms secures the detector and source to the top and bottom arms respectively. A drive mechanism rotates the frame about an axis along which the frame may also be moved. Finally, the detector may be moved relative to the bottom arm in a direction contrary to the rotation of the frame. (U.K.)
Scanning the phenomenological MSSM
Wuerzinger, Jonas
2017-01-01
A framework to perform scans in the 19-dimensional phenomenological MSSM is developed and used to re-evaluate the ATLAS experiment's sensitivity to R-parity-conserving supersymmetry with LHC Run 2 data ($\sqrt{s}=13$ TeV), using results from 14 separate ATLAS searches. We perform a $\tilde{t}_1$-dedicated scan, only considering models with $m_{\tilde{t}_1}<1$ TeV, while allowing both a neutralino ($\tilde{\chi}_1^0$) and a sneutrino ($\tilde{\
DEFF Research Database (Denmark)
Gómez Arranz, Paula; Courtney, Michael
This report describes the tests carried out on a scanning lidar at the DTU Test Station for large wind turbines, Høvsøre. The tests were divided in two parts. In the first part, the purpose was to obtain wind speed calibrations at two heights against two cup anemometers mounted on a mast. Additio...
Adaptive Optical Scanning Holography
Tsang, P. W. M.; Poon, Ting-Chung; Liu, J.-P.
2016-01-01
Optical Scanning Holography (OSH) is a powerful technique that employs a single-pixel sensor and a row-by-row scanning mechanism to capture the hologram of a wide-view, three-dimensional object. However, the time required to acquire a hologram with OSH is rather lengthy. In this paper, we propose an enhanced framework, which is referred to as Adaptive OSH (AOSH), to shorten the holographic recording process. We have demonstrated that the AOSH method is capable of decreasing the acquisition time by up to an order of magnitude, while preserving the content of the hologram favorably. PMID:26916866
Shnirelman peak in the level spacing statistics
International Nuclear Information System (INIS)
Chirikov, B.V.; Shepelyanskij, D.L.
1994-01-01
The first results on the statistical properties of quantum quasidegeneracy are presented. A physical interpretation of the Shnirelman theorem, which predicted the bulk quasidegeneracy, is given. The conditions for a strong impact of degeneracy on the quantum level statistics are formulated, which allows the application of the Shnirelman theorem to be extended to a broad class of quantum systems. 14 refs., 3 figs
Biblical Interpretation Beyond Historicity
DEFF Research Database (Denmark)
Biblical Interpretation beyond Historicity evaluates the new perspectives that have emerged since the crisis over historicity in the 1970s and 80s in the field of biblical scholarship. Several new studies in the field, as well as the ‘deconstructive’ side of literary criticism that emerged from… writers such as Derrida and Wittgenstein, among others, lead biblical scholars today to view the texts of the Bible more as literary narratives than as sources for a history of Israel. Increased interest in archaeological and anthropological studies in writing the history of Palestine and the ancient Near… and the commitment to a new approach to both the history of Palestine and the Bible’s place in ancient history. This volume features essays from a range of highly regarded scholars, and is divided into three sections: “Beyond Historicity”, which explores alternative historical roles for the Bible, “Greek Connections…
Interpretation of galaxy counts
International Nuclear Information System (INIS)
Tinsley, B.M.
1980-01-01
New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a "local" density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd-magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Experimental statistics for biological sciences.
Bang, Heejung; Davidian, Marie
2010-01-01
In this chapter, we cover basic and fundamental principles and methods in statistics, from "What are data and statistics?" to "ANOVA and linear regression," which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in the statistical sciences. We hope that from this chapter readers will understand the key statistical concepts and terminology, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameters and make inferences), and how to interpret the results. The text is most useful as supplemental material while readers take their own statistics courses, or as a self-teaching guide and reference alongside the manual for any statistical software.
Bone scanning in the evaluation of lung cancer
International Nuclear Information System (INIS)
Jung, Kun Sik; Zeon, Seok Kil; Lee, Hee Jung; Song, Hong Suk
1994-01-01
We studied the diagnostic significance of bone scans in the evaluation of bone metastasis from lung cancer, the prevalence rate, and the causes of false positive bone scans and of soft tissue accumulation of the bone-seeking agent. The subjects included 73 lung cancer patients who underwent bone scans. We analyzed the frequency of metastasis, its distribution and configuration, and any relationship between bone pain and the corresponding region on the bone scan. Positive bone scan findings were compared with plain X-ray films, CT, MRI and other diagnostic modalities. The false positive bone scans and the soft tissue accumulation of the bone-seeking agent were analyzed. Positive findings on bone scan were noted in 26 cases (36%), and they coexisted with bone pain in 30%. The correspondence between bone scan and bone X-ray was 38%. False positive bone scans were seen in 12 cases (16%), which included fractures due to thoracotomy and trauma, degenerative bone disease, and bifid rib. Accumulation of the bone-seeking agent in soft tissue was seen in 13 cases (18%), which included primary tumor, enlarged cervical lymph nodes, pleural effusion, ascites and pleural thickening. Bone scans should be carefully interpreted in detecting bone metastasis in primary malignancy, because of the 16% false positive rate and the 18% soft tissue accumulation rate. It is very important to note that the correlation between bone pain and positive findings on bone scans was only 38%
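The rates reported in this abstract follow directly from the stated counts. As a quick arithmetic check (assuming, as the abstract implies, that each rate is taken over all 73 patients):

```python
# Reproducing the percentages reported in the bone-scan study,
# assuming each rate is computed over the 73 enrolled patients.
n_patients = 73
positive_scans = 26      # reported as 36%
false_positives = 12     # reported as 16%
soft_tissue_uptake = 13  # reported as 18%

def pct(count, total):
    """Percentage rounded to the nearest whole number."""
    return round(100 * count / total)

print(pct(positive_scans, n_patients))     # 36
print(pct(false_positives, n_patients))    # 16
print(pct(soft_tissue_uptake, n_patients)) # 18
```

All three reported percentages are consistent with the raw counts.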
Bone scanning in the evaluation of lung cancer
Energy Technology Data Exchange (ETDEWEB)
Jung, Kun Sik; Zeon, Seok Kil; Lee, Hee Jung; Song, Hong Suk [School of Medicine, Keimyung University, Daegu (Korea, Republic of)
1994-05-15
Full Text Available ... process that regulates the rate at which the body converts food to energy. What are some common uses of the procedure? The thyroid scan is used to determine the size, shape and position of the thyroid gland. The ...
Full Text Available ... Thyroid Scan and Uptake ...
Dialogue scanning measuring systems
International Nuclear Information System (INIS)
Borodyuk, V.P.; Shkundenkov, V.N.
1985-01-01
The main developments of scanning measuring systems intended for mass precision processing of films in nuclear physics problems and related fields are reviewed. Special attention is paid to the creation of dialogue systems, which permit simplifying the development of control computer software
Energy Technology Data Exchange (ETDEWEB)
Cox, B. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)
1970-05-15
The JSM-11 scanning electron microscope at CRNL has been used extensively for topographical studies of oxidized metals, fracture surfaces, entomological and biological specimens. A non-dispersive X-ray attachment permits the microanalysis of the surface features. Techniques for the production of electron channeling patterns have been developed. (author)
International Nuclear Information System (INIS)
Binnig, G.; Rohrer, H.
1983-01-01
Based on vacuum tunneling, a novel type of microscope, the scanning tunneling microscope (STM) was developed. It has an unprecedented resolution in real space on an atomic scale. The authors review the important technical features, illustrate the power of the STM for surface topographies and discuss its potential in other areas of science and technology. (Auth.)
International Nuclear Information System (INIS)
Morales G, R.; Cano P, R.; Mendoza P, R.
1993-01-01
In this chapter a review is made of the different uses of bone scans in rheumatic diseases. These include reflex sympathetic dystrophy, osteomyelitis, spondyloarthropathies, metabolic bone diseases, avascular bone necrosis and bone injuries due to sports. There are also some comments concerning pediatric pathology and orthopedics. (authors). 19 refs., 9 figs
Full Text Available ... information. The thyroid scan and thyroid uptake provide information about the structure and function of the thyroid. The thyroid is a gland in the neck that controls metabolism , a chemical process that regulates the rate at which the body ...
Tomographic scanning apparatus
International Nuclear Information System (INIS)
1981-01-01
Details are given of a tomographic scanning apparatus, with particular reference to the means of adjusting the apparent gain of the signal processing means for receiving output signals from the detectors, to compensate for drift in the gain characteristics, including means for passing a reference signal. (U.K.)
Stabilized radiographic scanning agent
International Nuclear Information System (INIS)
Fawzi, M.B.
1979-01-01
A stable composition useful in the preparation of technetium-99m-based radiographic scanning agents has been developed. The composition contains a stabilizing amount of a gentisate stabilizer selected from gentisic acid and its soluble pharmaceutically acceptable salts and esters. (E.G.)
International Nuclear Information System (INIS)
Anon.
1980-01-01
The principle underlying the design of the scanning electron microscope (SEM), the design and functioning of SEM are described. Its applications in the areas of microcircuitry and materials science are outlined. The development of SEM in India is reviewed. (M.G.B.)
International Nuclear Information System (INIS)
Tofe, A.J.
1976-01-01
A stable radiographic scanning agent on a 99mTc basis has been developed. The substance contains a pertechnetate reduction agent (tin(II) chloride, chromium(II) chloride, or iron(II) sulphate), as well as an organ-specific carrier and ascorbic acid or a pharmacologically admissible salt or ester of ascorbic acid. (VJ) [de
Full Text Available ... you: have had any tests, such as an x-ray or CT scan, surgeries or treatments using iodinated ... How does the procedure work? With ordinary x-ray examinations, an image is made by passing x- ...
Full Text Available ... for a thyroid scan is 30 minutes or less. Thyroid Uptake You will be given radioactive iodine (I-123 or I-131) in liquid or capsule form to swallow. The thyroid uptake will begin several hours to 24 hours later. Often, two separate uptake ...
Full Text Available ... an x-ray or CT scan, surgeries or treatments using iodinated contrast material within the last two months. are taking medications or ingesting other substances that contain iodine , including kelp, seaweed, cough syrups, multivitamins or heart medications. have any ...
Critical Assessment of Metagenome Interpretation
DEFF Research Database (Denmark)
Sczyrba, Alexander; Hofmann, Peter; Belmann, Peter
2017-01-01
Methods for assembly, taxonomic profiling and binning are key to interpreting metagenome data, but a lack of consensus about benchmarking complicates performance assessment. The Critical Assessment of Metagenome Interpretation (CAMI) challenge has engaged the global developer community to benchma...
Statistics for scientists and engineers
Shanmugam, Ramalingam
2015-01-01
This book provides the theoretical framework needed to build, analyze and interpret various statistical models, helping readers choose the model that best captures the data or solves the problem at hand. It is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter presents the basic concepts. Next, the authors discuss measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is used throughout the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
The interpretation of administrative contracts
Directory of Open Access Journals (Sweden)
Cătălin-Silviu SĂRARU
2014-06-01
Full Text Available The article analyzes the principles of interpretation for administrative contracts in French law and in Romanian law, highlighting the derogations from the rules of contract interpretation in the common law of contracts. It examines the exceptions to the principle of good faith, the principle of common intention (the will of the parties), the principle of good administration, and the principle of extensive interpretation of the administrative contract. The article highlights the importance and role of interpretation in administrative contracts.
Monitoring and interpreting bioremediation effectiveness
International Nuclear Information System (INIS)
Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.
1993-01-01
Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in the ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the concentration of fertilizer nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency
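The hopane-normalization idea described in this abstract can be sketched numerically: because hopane is effectively non-biodegradable, it serves as a conserved internal marker, and the fraction of a hydrocarbon lost is estimated from the drop in its hydrocarbon-to-hopane ratio between an initial and a later sample. A minimal illustration follows; the concentration values are invented for the example and are not from the study.

```python
# Sketch of hopane-normalized depletion: hopane acts as a conserved
# internal marker, so the percent loss of an analyte is estimated from
# the decline of its analyte/hopane ratio over time.
def percent_depleted(c0, h0, ct, ht):
    """Percent of analyte degraded, using hopane as the conserved marker.

    c0, ct: analyte concentration at time 0 and time t
    h0, ht: hopane concentration in the same two samples
    """
    return 100.0 * (1.0 - (ct / ht) / (c0 / h0))

# Invented example: the analyte/hopane ratio falls from 5.0 to 2.0,
# so 60% of the analyte is estimated to have degraded.
print(percent_depleted(c0=50.0, h0=10.0, ct=24.0, ht=12.0))  # 60.0
```

Normalizing to hopane cancels out dilution and sampling heterogeneity, which is why the abstract describes it as the most definitive monitoring method.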
Structural interpretation of seismic data and inherent uncertainties
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretive basis for the science is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement, but also whether interpreters thought faults existed at all, or assigned them the same sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, we focus on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone to face the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill is often low, because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward coursework related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course with respect to statistical reasoning skill. The observation was done by analyzing the results of the misconception test and the statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If a minimal value of 65 is taken as the standard for achieving course competence, the students' mean values are lower than the standard. The result of the misconception study emphasizes which sub-topics should be reconsidered. Based on the assessment result, it was found that students' misconceptions happen in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
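As an illustration not taken from the study itself, the gap between the reported means and the 65-point competence standard can be quantified with a one-sample t-statistic computed from the summary statistics above (n = 32):

```python
import math

def one_sample_t(mean, sd, n, mu0):
    """One-sample t-statistic from summary statistics:
    t = (mean - mu0) / (sd / sqrt(n))."""
    return (mean - mu0) / (sd / math.sqrt(n))

# Summary statistics reported in the abstract, tested against the
# 65-point competence standard (a post-hoc illustration, not the
# authors' analysis).
t_misconception = one_sample_t(49.7, 10.6, 32, 65.0)
t_reasoning = one_sample_t(51.8, 8.5, 32, 65.0)

# Both statistics are strongly negative, i.e. both class means sit
# many standard errors below the competence standard.
print(round(t_misconception, 2), round(t_reasoning, 2))
```

With roughly eight standard errors separating each mean from the standard, the abstract's conclusion that the class falls below the competence threshold is unambiguous.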
Thiese, Matthew S; Walker, Skyler; Lindsey, Jenna
2017-10-01
Distribution of valuable research discoveries is needed for the continual advancement of patient care; publication and subsequent reliance on false study results would be detrimental to patient care. Unfortunately, research misconduct may originate from many sources. While there is evidence of ongoing research misconduct in all its forms, it is challenging to identify the actual occurrence of research misconduct, which is especially true for misconduct in clinical trials. Research misconduct is challenging to measure and there are few studies reporting the prevalence or underlying causes of research misconduct among biomedical researchers. Reported prevalence estimates of misconduct are probably underestimates, and range from 0.3% to 4.9%. There have been efforts to measure the prevalence of research misconduct; however, the relatively few published studies are not freely comparable because of varying characterizations of research misconduct and the methods used for data collection. There are some signs which may point to an increased possibility of research misconduct; however, there is a need for continued self-policing by biomedical researchers. There are existing resources to assist in ensuring appropriate statistical methods and preventing other types of research fraud. These include the "Statistical Analyses and Methods in the Published Literature" (SAMPL) guidelines, which help scientists determine the appropriate way of reporting various statistical methods; the "Strengthening Analytical Thinking for Observational Studies" (STRATOS) initiative, which emphasizes the execution and interpretation of results; and the Committee on Publication Ethics (COPE), which was created in 1997 to deliver guidance about publication ethics. COPE has a sequence of views and strategies grounded in the values of honesty and accuracy.
International Nuclear Information System (INIS)
Mainsbridge, B.
1994-01-01
In late 1959, Richard Feynman observed that manoeuvring atoms was something that could be done in principle but has not been done, 'because we are too big'. In 1982, the scanning tunnelling microscope (STM) was invented and is now a central tool for the construction of nanoscale devices in what was known as molecular engineering, and now, nanotechnology. The principles of the microscope are outlined and references are made to other scanning devices which have evolved from the original invention. The method of employment of the STM as a machine tool is described and references are made to current speculations on applications of the instrument in nanotechnology. A short bibliography on this topic is included. 27 refs., 7 figs
Energy Technology Data Exchange (ETDEWEB)
Mainsbridge, B [Murdoch Univ., WA (Australia). School of Mathematical and Physical Sciences
1994-12-31
International Nuclear Information System (INIS)
Niden, A.H.; Mishkin, F.S.; Khurana, M.M.L.; Pick, R.
1977-01-01
Twenty-three patients with clinical signs of pulmonary embolic disease and lung infiltrates were studied to determine the value of gallium citrate (67Ga) lung scanning in differentiating embolic from inflammatory lung disease. In 11 patients without angiographically proved embolism, only seven had corresponding ventilation-perfusion defects compatible with inflammatory disease. In seven of these 11 patients, the 67Ga concentration indicated inflammatory disease. Of the 12 patients with angiographically proved embolic disease, six had corresponding ventilation-perfusion defects compatible with inflammatory disease; none had an accumulation of 67Ga in the area of pulmonary infiltrate. Thus, ventilation-perfusion lung scans are of limited value when lung infiltrates are present. In contrast, the accumulation of 67Ga in the lung indicates an inflammatory process. Gallium imaging can help select those patients with lung infiltrates who need angiography
Horizon Scanning for Pharmaceuticals
DEFF Research Database (Denmark)
Lepage-Nefkens, Isabelle; Douw, Karla; Mantjes, GertJan
for a joint horizon scanning system (HSS). We propose to create a central “horizon scanning unit” to perform the joint HS activities (a newly established unit, an existing HS unit, or a third party commissioned and financed by the collaborating countries). The unit will be responsible for the identification… and filtration of new and emerging pharmaceutical products. It will maintain and update the HS database, organise company pipeline meetings, and disseminate the HSS’s outputs. The HS unit works closely together with the designated national HS experts in each collaborating country. The national HS experts… will collect country-specific information, liaise between the central HS unit and country-specific clinical and other experts, coordinate the national prioritization process (to select products for early assessment), and communicate the output of the HSS to national decision makers. The outputs of the joint…
Combination and interpretation of observables in Cosmology
Directory of Open Access Journals (Sweden)
Virey Jean-Marc
2010-04-01
Full Text Available The standard cosmological model has deep theoretical foundations but needs the introduction of two major unknown components, dark matter and dark energy, to be in agreement with various observations. Dark matter describes a non-relativistic collisionless fluid of (non-baryonic) matter which amounts to 25% of the total density of the universe. Dark energy is a new kind of fluid, not of matter type, representing 70% of the total density, which should explain the recent acceleration of the expansion of the universe. Alternatively, one can reject the idea of adding one or two new components and argue instead that the equations used to make the interpretation should be modified on cosmological scales. Instead of dark matter one can invoke a failure of Newton's laws. Instead of dark energy, two approaches are proposed: general relativity (in terms of the Einstein equation) should be modified, or the cosmological principle which fixes the metric used for cosmology should be abandoned. One of the main objectives of the community is to find the path to the relevant interpretations thanks to the next generation of experiments, which should provide large statistics of observational data. Unfortunately, cosmological information is difficult to pin down directly from the measurements, and it is mandatory to combine the various observables to get the cosmological parameters. This is not problematic from the statistical point of view, but assumptions and approximations made in the analysis may bias our interpretation of the data. Consequently, strong attention should be paid to the statistical methods used for parameter estimation and for model testing. After a review of the basics of cosmology, where the cosmological parameters are introduced, we discuss the various cosmological probes and their associated observables used to extract cosmological information. We present the results obtained from several statistical analyses combining data of different nature but
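The combination step described in this abstract can be illustrated schematically. In the simplest case (a generic statistical sketch, not taken from the article), two independent Gaussian measurements of the same cosmological parameter are combined by summing their chi-square terms, which reduces to an inverse-variance weighted mean with a smaller combined uncertainty. The numerical values below are purely illustrative.

```python
# Schematic combination of two independent Gaussian "probes" of a single
# parameter: minimizing chi^2 = ((x-m1)/s1)^2 + ((x-m2)/s2)^2 gives the
# inverse-variance weighted mean. Values are illustrative only.
def combine(m1, s1, m2, s2):
    """Inverse-variance weighted combination of two measurements."""
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    mean = (w1 * m1 + w2 * m2) / (w1 + w2)
    sigma = (w1 + w2) ** -0.5
    return mean, sigma

# e.g. a matter-density-like parameter from two hypothetical probes:
# 0.30 +/- 0.02 and 0.33 +/- 0.04
mean, sigma = combine(0.30, 0.02, 0.33, 0.04)
# The combined value lies between the inputs and the combined error is
# smaller than either input error.
print(mean, sigma)
```

Real cosmological combinations are far richer (correlated systematics, non-Gaussian likelihoods, many parameters), which is exactly why the abstract stresses care with the statistical methods used.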
Multichannel scanning spectrophotometer
International Nuclear Information System (INIS)
Lagutin, A.F.
1979-01-01
A spectrophotometer designed at the Crimean Astrophysical Observatory is described. The spectrophotometer is intended for installation at the telescope to measure energy distribution in stellar spectra in the 3100-8550 Å range. The device is built around a fixed diffraction grating. The choice of the optical-kinematic scheme is explained. The main design elements are shown, and some peculiarities of the scanning drive kinematics are considered. The device performance is given
Energy Technology Data Exchange (ETDEWEB)
Jin, Jian; Xiang, Chengxiang; Gregoire, John
2017-05-09
Electrochemical experiments are performed on a collection of samples by suspending a drop of electrolyte solution between an electrochemical experiment probe and one of the samples that serves as a test sample. During the electrochemical experiment, the electrolyte solution is added to the drop and an output solution is removed from the drop. The probe and collection of samples can be moved relative to one another so the probe can be scanned across the samples.
Energy Technology Data Exchange (ETDEWEB)
Jin, Jian; Xiang, Chengxiang; Gregoire, John M.; Shinde, Aniketa A.; Guevarra, Dan W.; Jones, Ryan J.; Marcin, Martin R.; Mitrovic, Slobodan
2017-05-09
Electrochemical or electrochemical and photochemical experiments are performed on a collection of samples by suspending a drop of electrolyte solution between an electrochemical experiment probe and one of the samples that serves as a test sample. During the electrochemical experiment, the electrolyte solution is added to the drop and an output solution is removed from the drop. The probe and collection of samples can be moved relative to one another so the probe can be scanned across the samples.
Energy Technology Data Exchange (ETDEWEB)
Baek, Sang Yeol; Park, Dae Kyu; Ahn, Sang Bok; Ju, Yong Sun; Jeon, Yong Bum
1997-06-01
The gamma scanning system installed in IMEF is equipment for obtaining the gamma-ray spectrum from irradiated fuels. This equipment can afford useful data on spent fuels, such as burn-up measurements. We describe the specifications of the equipment and its accessories, and also its operation procedure, so that an operator can use this report as the operating procedure. (author). 1 tab., 11 figs., 11 refs.
International Nuclear Information System (INIS)
Baek, Sang Yeol; Park, Dae Kyu; Ahn, Sang Bok; Ju, Yong Sun; Jeon, Yong Bum.
1997-06-01
The gamma scanning system installed in IMEF is equipment for obtaining the gamma-ray spectrum from irradiated fuels. This equipment can afford useful data on spent fuels, such as burn-up measurements. We describe the specifications of the equipment and its accessories, and also its operation procedure, so that an operator can use this report as the operating procedure. (author). 1 tab., 11 figs., 11 refs.
International Nuclear Information System (INIS)
Plaige, Yves.
1976-01-01
This invention concerns a measurement scanning assembly for collectron-type detectors. It is used in measuring the neutron flux in nuclear reactors. As the number of these detectors in a reactor can be very great, they are not usually all connected permanently to the measuring facility, but rather in turn by means of a scanning device which carries out, as it were, multiplexing between all the collectrons and the input of a single measuring system. The object of the invention is a scanning assembly of relative simplicity thanks to an original organisation. Specifically, according to this organisation, the collectron outputs are grouped together in bunches, each bunch being processed by a multiplexing sub-assembly belonging to a first stage; the outputs of the multiplexing sub-assemblies of this first stage are grouped together again in bunches processed by multiplexers forming a new stage, and so forth. Further, this structure is specially adapted for use with collectrons by utilising a current amplifier at each multiplexing level, so that from one end of the multiplexing system to the other, the commutations are carried out on currents and not on voltages
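The staged grouping described above can be sketched as a simple addressing computation. This is a minimal sketch only; the bunch size and function name are illustrative assumptions, not taken from the patent.

```python
# Sketch of the two-stage multiplexing described above: collectron outputs
# are grouped into bunches, each bunch feeds a first-stage multiplexer, and
# the first-stage outputs are multiplexed again by a second stage.
def select_collectron(index: int, bunch_size: int = 8):
    """Return (second_stage_channel, first_stage_channel) for a collectron."""
    return divmod(index, bunch_size)

# With 64 collectrons in bunches of 8, collectron 42 is reached by selecting
# channel 5 on the second stage and channel 2 on that bunch's sub-multiplexer.
print(select_collectron(42))  # (5, 2)
```

The same divmod pattern extends to further stages by applying it repeatedly to the higher-stage channel number.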
Forensic Scanning Electron Microscope
Keeley, R. H.
1983-03-01
The scanning electron microscope equipped with an x-ray spectrometer is a versatile instrument which has many uses in the investigation of crime and preparation of scientific evidence for the courts. Major applications include microscopy and analysis of very small fragments of paint, glass and other materials which may link an individual with a scene of crime, identification of firearms residues and examination of questioned documents. Although simultaneous observation and chemical analysis of the sample is the most important feature of the instrument, other modes of operation such as cathodoluminescence spectrometry, backscattered electron imaging and direct x-ray excitation are also exploited. Marks on two bullets or cartridge cases can be compared directly by sequential scanning with a single beam or electronic linkage of two instruments. Particles of primer residue deposited on the skin and clothing when a gun is fired can be collected on adhesive tape and identified by their morphology and elemental composition. It is also possible to differentiate between the primer residues of different types of ammunition. Bullets may be identified from the small fragments left behind as they pass through the body tissues. In the examination of questioned documents the scanning electron microscope is used to establish the order in which two intersecting ink lines were written and to detect traces of chemical markers added to the security inks on official documents.
Revisiting organizational interpretation and three types of uncertainty
DEFF Research Database (Denmark)
Sund, Kristian J.
2015-01-01
that might help explain and untangle some of the conflicting empirical results found in the extant literature. The paper illustrates how the literature could benefit from re-conceptualizing the perceived environmental uncertainty construct to take into account different types of uncertainty. Practical....... Design/methodology/approach – This conceptual paper extends existing conceptual work by distinguishing between general and issue-specific scanning and linking the interpretation process to three different types of perceived uncertainty: state, effect and response uncertainty. Findings – It is proposed...... on existing work by linking the interpretation process to three different types of uncertainty (state, effect and response uncertainty) with several novel and testable propositions. The paper also differentiates clearly general (regular) scanning from issue-specific (irregular) scanning. Finally, the paper...
Orientalismi: nuove prospettive interpretative
Directory of Open Access Journals (Sweden)
Gabriele Proglio
2012-11-01
Full Text Available This paper is aimed at reconsidering the concept of Orientalism in a new and multiple perspective, and at proposing a different interpretation of the relationship between culture and power, starting from Edward Said's theoretical frame of reference. If Said's representational model is repositioned outside structuralist and Foucauldian frameworks and separated from the Gramscian idea of hegemony-subordination, indeed, it may be possible to re-discuss the traditional profile identifying the Other in the European cultures. My basic assumption here is that Orientalism should not be understood as a consensus mechanism, which is able to produce diversified images of the Orient and the Oriental on demand. Although, of course, in most cases Orientalism is connected to the issue of power, its meanings could also be explained otherwise, as will soon be shown. Let us take Invisible Cities by Italo Calvino as an example. Here the narratives are not just multiple repetitions of Venice (in Said's case, the same would hold for Europeanism), but they could be strategically re-appropriated by those "others" and "alterities" whose bodies and identities are imposed by the Eurocentric discourse. In this sense, a double link may be identified with queer theories and postcolonial studies, and the notion of subordination will be rethought. Finally, from the above-mentioned borders, a new idea of image emerges, which appears as linear, uniform and flattened only to the European gaze, whereas in actual fact it is made of imaginaries and forms of knowledge, which combine representation with the conceptualization of power relationships.
A synthetic interpretation: the double-preparation theory
International Nuclear Information System (INIS)
Gondran, Michel; Gondran, Alexandre
2014-01-01
In the 1927 Solvay conference, three apparently irreconcilable interpretations of the quantum mechanics wave function were presented: the pilot-wave interpretation by de Broglie, the soliton wave interpretation by Schrödinger and the Born statistical rule by Born and Heisenberg. In this paper, we demonstrate the complementarity of these interpretations corresponding to quantum systems that are prepared differently and we deduce a synthetic interpretation: the double-preparation theory. We first introduce in quantum mechanics the concept of semi-classical statistically prepared particles, and we show that in the Schrödinger equation these particles converge, when h→0, to the equations of a statistical set of classical particles. These classical particles are undiscerned, and if we assume continuity between classical mechanics and quantum mechanics, we conclude the necessity of the de Broglie–Bohm interpretation for the semi-classical statistically prepared particles (statistical wave). We then introduce in quantum mechanics the concept of a semi-classical deterministically prepared particle, and we show that in the Schrödinger equation this particle converges, when h→0, to the equations of a single classical particle. This classical particle is discerned and assuming continuity between classical mechanics and quantum mechanics, we conclude the necessity of the Schrödinger interpretation for the semi-classical deterministically prepared particle (the soliton wave). Finally we propose, in the semi-classical approximation, a new interpretation of quantum mechanics, the ‘theory of the double preparation’, which depends on the preparation of the particles. (paper)
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
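The correct reading of P = 0.05 described above can be illustrated with a small simulation: when the null hypothesis is true by construction, a result at least as extreme as a given observation occurs in roughly the corresponding fraction of repeated studies. The sample size, observed difference and number of repetitions below are hypothetical choices for illustration.

```python
import random
import statistics

random.seed(42)

def one_null_study(n=30):
    """Draw two samples from the SAME distribution (so the null hypothesis
    is true) and return the absolute difference in sample means."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    return abs(statistics.mean(a) - statistics.mean(b))

# Hypothetical observed difference from a single study.
observed = 0.5

# Fraction of null studies with a result at least as extreme:
# an empirical P-value, matching the interpretation in the abstract.
reps = 2000
p_empirical = sum(one_null_study() >= observed for _ in range(reps)) / reps
print(f"empirical P-value under the null: {p_empirical:.3f}")
```

Note that the simulation says nothing about whether the null hypothesis is true in any real study; it only shows how often extreme data arise when it is.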
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
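As a minimal sketch of the estimation-with-confidence-intervals alternative outlined above: the data are simulated, and the t critical value for 24 degrees of freedom is hard-coded rather than looked up from a table, both assumptions made for illustration.

```python
import math
import random
import statistics

# A 95% confidence interval conveys effect magnitude and precision,
# unlike a bare reject/fail-to-reject decision from a hypothesis test.
random.seed(1)
sample = [random.gauss(5.0, 2.0) for _ in range(25)]  # hypothetical measurements

mean = statistics.mean(sample)
sem = statistics.stdev(sample) / math.sqrt(len(sample))
t_crit = 2.064  # two-sided 95% t critical value, 24 degrees of freedom

lo, hi = mean - t_crit * sem, mean + t_crit * sem
print(f"mean = {mean:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

The interval width makes the imprecision of the estimate visible directly, which a P-value alone does not.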
International Nuclear Information System (INIS)
Way, Ted W; Chan, H-P; Goodsitt, Mitchell M; Sahiner, Berkman; Hadjiiski, Lubomir M; Zhou Chuan; Chughtai, Aamer
2008-01-01
The purpose of this study is to investigate the effects of CT scanning and reconstruction parameters on automated segmentation and volumetric measurements of nodules in CT images. Phantom nodules of known sizes were used so that segmentation accuracy could be quantified in comparison to ground-truth volumes. Spherical nodules having 4.8, 9.5 and 16 mm diameters and 50 and 100 mg cc -1 calcium contents were embedded in lung-tissue-simulating foam which was inserted in the thoracic cavity of a chest section phantom. CT scans of the phantom were acquired with a 16-slice scanner at various tube currents, pitches, fields-of-view and slice thicknesses. Scans were also taken using identical techniques either within the same day or five months apart for study of reproducibility. The phantom nodules were segmented with a three-dimensional active contour (3DAC) model that we previously developed for use on patient nodules. The percentage volume errors relative to the ground-truth volumes were estimated under the various imaging conditions. There was no statistically significant difference in volume error for repeated CT scans or scans taken with techniques where only pitch, field of view, or tube current (mA) were changed. However, the slice thickness significantly (p < 0.05) affected the volume error. Therefore, to evaluate nodule growth, consistent imaging conditions and high resolution should be used for acquisition of the serial CT scans, especially for smaller nodules. Understanding the effects of scanning and reconstruction parameters on volume measurements by 3DAC allows better interpretation of data and assessment of growth. Tracking nodule growth with computerized segmentation methods would reduce inter- and intraobserver variabilities
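The percentage volume error used in the study above can be written down directly. The 4.8 mm diameter is the smallest phantom nodule from the abstract; the 12% over-segmentation is a hypothetical value chosen for illustration.

```python
import math

# Percentage volume error of a segmented nodule relative to ground truth,
# the metric used to compare scanning and reconstruction settings.
def sphere_volume(diameter_mm: float) -> float:
    r = diameter_mm / 2.0
    return (4.0 / 3.0) * math.pi * r ** 3

def percent_volume_error(segmented_mm3: float, truth_mm3: float) -> float:
    return 100.0 * (segmented_mm3 - truth_mm3) / truth_mm3

truth = sphere_volume(4.8)   # smallest phantom nodule (4.8 mm diameter)
segmented = truth * 1.12     # hypothetical 12% over-segmentation
print(f"{percent_volume_error(segmented, truth):.1f}%")  # 12.0%
```

Because the error is relative, a fixed absolute segmentation error translates into a much larger percentage error for small nodules, which is why the abstract stresses consistent, high-resolution acquisition for them.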
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly
International Nuclear Information System (INIS)
Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A.
1998-01-01
Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors, or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory, Twinsburg, OH) using high-resolution fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age
Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly
Energy Technology Data Exchange (ETDEWEB)
Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A. [Austin and Repatriation Medical Centre, Heidelberg, VIC (Australia). Departments of Nuclear Medicine and Centre for PET Neurology and Clinical Neuropsychology
1998-06-01
Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors, or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory, Twinsburg, OH) using high-resolution fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age
Haddad, D. E.; Arrowsmith, R.
2009-12-01
Terrestrial laser scanning (TLS) technology is rapidly becoming an effective three-dimensional imaging tool. Precariously balanced rocks (PBRs) are a subset of spheroidally weathered boulders. They are balanced on bedrock pedestals and are formed in upland drainage basins and pediments of exhumed plutons. Precarious rocks are used as negative evidence of earthquake-driven extreme ground motions. Field surveys of PBRs are coupled with cosmogenic radionuclide (CRN) surface-exposure dating techniques to determine their exhumation rates. These rates are used in statistical simulations to estimate the magnitudes and recurrences of earthquake-generated extreme ground shaking as a means to physically validate seismic hazard analyses. However, the geomorphic setting of PBRs in the landscape is poorly constrained when interpreting their exhumation rates from CRN surface-exposure dates. Are PBRs located on steep or gentle hillslopes? Are they located near drainages or hillslope crests? What geomorphic processes control the spatial distribution of PBRs in a landscape, and where do these processes dominate? Because the fundamental hillslope transport laws are largely controlled by local hillslope gradient and contributing area, the location of a PBR is controlled by the geomorphic agents and their rates acting on it. Our latest efforts involve using a combination of TLS and airborne laser swath mapping (ALSM) to characterize the geomorphic situation of PBRs. We used a Riegl LPM 800i (LPM 321) terrestrial laser scanner to scan a ~1.5 m tall by ~1 m wide precariously balanced rock in the Granite Dells, central Arizona. The PBR was scanned from six positions, and the scans were aligned to a point cloud totaling 3.4M points. We also scanned a ~50 m by ~150 m area covering PBR hillslopes from five scan positions. The resulting 5.5M points were used to create a digital terrain model of precarious rocks and their hillslopes. Our TLS- and ALSM-generated surface models and DEMs provide a
DEFF Research Database (Denmark)
Moshavegh, Ramin
on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...
International Nuclear Information System (INIS)
Bevan, J.A.
1983-01-01
This invention relates to radiodiagnostic agents and more particularly to a composition and method for preparing a highly effective technetium-99m-based bone scanning agent. One deficiency of x-ray examination is the inability of that technique to detect skeletal metastases in their incipient stages. It has been discovered that the methanehydroxydiphosphonate bone mineral-seeking agent is unique in that it provides the dual benefits of sharp radiographic imaging and excellent lesion detection when used with technetium-99m. This agent can also be used with technetium-99m for detecting soft tissue calcification in the manner of the inorganic phosphate radiodiagnostic agents
International Nuclear Information System (INIS)
Nakagawa, Hiroshi
1982-01-01
Methods of CT of the cervical and thoracic spine were explained, and normal CT pictures of them were described. Spinal CT was evaluated in comparison with other methods in various spinal diseases. Plain CT revealed stenosis due to spondylosis or ossification of the posterior longitudinal ligament, and hernia of the intervertebral disc. CT plays an important role in the diagnosis of spinal cord tumors with calcification and destruction of the bone. CT scanning in combination with other methods was also useful for the diagnosis of spinal injuries, congenital anomalies and infections. (Ueda, J.)
Scanning Color Laser Microscope
Awamura, D.; Ode, T.; Yonezawa, M.
1988-01-01
A confocal color laser microscope which utilizes a three color laser light source (Red: He-Ne, Green: Ar, Blue: Ar) has been developed and is finding useful applications in the semiconductor field. The color laser microscope, when compared to a conventional microscope, offers superior color separation, higher resolution, and sharper contrast. Recently some new functions including a Focus Scan Memory, a Surface Profile Measurement System, a Critical Dimension Measurement system (CD) and an Optical Beam Induced Current Function (OBIC) have been developed for the color laser microscope. This paper will discuss these new features.
International Nuclear Information System (INIS)
Brunnett, C.J.
1980-01-01
A novel method is described for processing the analogue signals from the photomultiplier tubes in a tomographic X-ray scanner. The system produces a series of pulses whose instantaneous frequency depends on the detected intensity of the X-radiation. A timer unit is used to determine the segment scan intervals and also to deduce the average radiation intensity detected during each interval. The overall system is claimed to possess the advantageous properties of low time delay, wide bandwidth and relatively low cost. (U.K.)
Working memory and simultaneous interpreting
Timarova, Sarka
2009-01-01
Working memory is a cognitive construct underlying a number of abilities, and it has been hypothesised for many years that it is crucial for interpreting. A number of studies have been conducted with the aim to support this hypothesis, but research has not yielded convincing results. Most researchers focused on studying working memory differences between interpreters and non-interpreters with the rationale that differences in working memory between the two groups would provide evidence of wor...
Conversion factors and oil statistics
International Nuclear Information System (INIS)
Karbuz, Sohbet
2004-01-01
World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass-to-volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers
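A toy calculation illustrates the barrels-to-tonnes issue raised above: the chosen conversion factor alone materially shifts an aggregated figure. The factors and production volume below are illustrative round numbers, not official conversion factors.

```python
# Barrels per tonne varies with crude density, so the chosen factor
# changes an aggregated oil balance. All numbers here are illustrative.
barrels_per_tonne = {
    "light crude (illustrative)": 7.6,
    "generic average (illustrative)": 7.33,
    "heavy crude (illustrative)": 6.8,
}

production_tonnes = 100_000_000  # hypothetical annual production, tonnes

for label, factor in barrels_per_tonne.items():
    barrels = production_tonnes * factor
    print(f"{label}: {barrels / 1e6:,.0f} million barrels")

# Spread between the extreme factors, as a share of the lower estimate:
spread = (7.6 - 6.8) / 6.8
print(f"spread: {spread:.1%}")
```

Even with these round numbers, the same tonnage maps to volume estimates more than ten percent apart, which is the kind of discrepancy the paper argues is far from negligible in a world oil balance.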
Automated, computer interpreted radioimmunoassay results
International Nuclear Information System (INIS)
Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.
1984-01-01
90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for use on a Hewlett Packard Model 1000 mini-computer system with conventional dot-matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient data base for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretations. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using our computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program were generally two to three days, whereas the computerized interpretation system allows reports to generally be issued the day assays are completed
NEW SCANNING DEVICE FOR SCANNING TUNNELING MICROSCOPE APPLICATIONS
SAWATZKY, GA; Koops, Karl Richard
A small, single piezo XYZ translator has been developed. The device has been used as a scanner for a scanning tunneling microscope and has been tested successfully in air and in UHV. Its simple design results in a rigid and compact scanning unit which permits high scanning rates.
Subeihi, Haitham
dimensional accuracy, which is defined as the absolute value of deviation in micrometers from the reference model. A two-way analysis of variance (ANOVA) was applied to determine whether the measurements for the six test groups were statistically significantly different from the original reference model as well as between test groups (p < 0.05). Results: The mean (± SD) RMS was 29.42 ± 5.80 microns for digital models produced from polyether (PE) impression scans, 27.58 ± 5.85 microns for digital models from PVS impression scans, 24.08 ± 4.89 microns for digital models produced from VPES impression scans, 26.08 ± 6.58 microns for digital models produced by scanning stone casts poured from PE, 31.67 ± 9.95 microns for digital models produced by scanning stone casts poured from PVS, and 22.58 ± 2.84 microns for digital models produced by scanning stone casts poured from VPES. In the two-way ANOVA, the p-value for the material factor was 0.004, reflecting a statistically significant difference between the accuracy of the three impression materials, with VPES showing the highest accuracy (mean RMS = 23.33 ± 3.99 microns), followed by PE (mean RMS = 27.75 ± 6.3 microns) and PVS (mean RMS = 29.63 ± 8.25 microns). For the technique factor, the p-value was 0.870, reflecting no statistically significant difference between the accuracy of the two techniques (impression scan and stone cast scan); the mean RMS values were 27.03 ± 5.82 microns and 26.78 ± 7.85 microns, respectively. In the post-hoc tests for the material factor, a significant difference was found between the accuracy of VPES and PVS (p-value = 0.004), with VPES having the higher accuracy (lower mean RMS). No significant difference was found between the accuracies of PE and PVS (p-value = 0.576) or between the accuracies of PE and VPES (p-value = 0.054). Conclusions: Within the limitations of this in vitro study, it can be concluded that: 1. There is no statistically significant difference in dimensional accuracy
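The RMS accuracy metric reported above is straightforward to compute once per-point deviations from the reference model are available. The deviation values below are hypothetical, chosen only to illustrate the calculation.

```python
import math

# Root-mean-square (RMS) deviation between a scanned model and the
# reference model: the accuracy metric, in microns, reported per group.
def rms_deviation(deviations_um):
    """RMS of signed point deviations (micrometers) from the reference."""
    return math.sqrt(sum(d * d for d in deviations_um) / len(deviations_um))

# Hypothetical per-point deviations for one scanned model (microns).
deviations = [-30.0, 25.0, -20.0, 35.0, -28.0, 22.0]
print(f"RMS = {rms_deviation(deviations):.2f} microns")
```

Because deviations are squared, RMS weights large local misfits more heavily than a plain mean of absolute deviations would.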
Ultrafast scanning tunneling microscopy
Energy Technology Data Exchange (ETDEWEB)
Botkin, D.A. [California Univ., Berkeley, CA (United States). Dept. of Physics; Lawrence Berkeley Lab., CA (United States)]
1995-09-01
I have developed an ultrafast scanning tunneling microscope (USTM) based on uniting stroboscopic methods of ultrafast optics and scanned probe microscopy to obtain nanometer spatial resolution and sub-picosecond temporal resolution. USTM increases the achievable time resolution of a STM by more than 6 orders of magnitude; this should enable exploration of mesoscopic and nanometer size systems on time scales corresponding to the period or decay of fundamental excitations. USTM consists of a photoconductive switch with subpicosecond response time in series with the tip of a STM. An optical pulse from a mode-locked laser activates the switch to create a gate for the tunneling current, while a second laser pulse on the sample initiates a dynamic process which affects the tunneling current. By sending a large sequence of identical pulse pairs and measuring the average tunnel current as a function of the relative time delay between the pulses in each pair, one can map the time evolution of the surface process. USTM was used to measure the broadband response of the STM's atomic size tunnel barrier at frequencies from tens to hundreds of GHz. The USTM signal amplitude decays linearly with the tunnel junction conductance, so the spatial resolution of the time-resolved signal is comparable to that of a conventional STM. Geometrical capacitance of the junction does not appear to play an important role in the measurement, but a capacitive effect intimately related to tunneling contributes to the measured signals and may limit the ultimate resolution of the USTM.
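The stroboscopic scheme described above, in which the average gated current is recorded while the delay between gate and excitation pulses is swept, can be sketched numerically. The pulse shapes and time constants below are invented for illustration; the point is only that the measured trace is a cross-correlation of the gate window with the sample's response.

```python
import numpy as np

# Invented waveforms: the abstract gives no actual pulse shapes.
t = np.linspace(0.0, 20e-12, 2001)          # 20 ps time grid
dt = t[1] - t[0]
gate = np.exp(-t / 0.5e-12)                 # sub-picosecond current gate
response = np.exp(-t / 3e-12)               # sample's excitation decay

# Average gated current vs. pump-probe delay: built up over many
# identical pulse pairs, it is the cross-correlation of gate and response.
delays = np.linspace(0.0, 10e-12, 51)
avg_current = np.array([
    np.sum(gate * np.interp(t - tau, t, response, left=0.0)) * dt
    for tau in delays
])

# The trace peaks at zero delay and decays on the sample's time scale.
print(avg_current[0], avg_current[-1])
```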
Rényi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
Rényi statistics is examined in the canonical and microcanonical ensembles, both in general and for the ideal gas in particular. In the microcanonical ensemble, Rényi statistics is equivalent to Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, on taking the thermodynamic limit, Rényi statistics is likewise equivalent to Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics: the thermodynamic potential of the statistical ensemble is a first-degree homogeneous function of its extensive variables of state. We conclude that in this limit Rényi statistics yields the same thermodynamic relations as Boltzmann-Gibbs statistics.
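For reference, the Rényi entropy and its Boltzmann-Gibbs (Shannon) limit, in standard notation with Rényi index q:

```latex
S_R = \frac{1}{1-q} \ln \sum_i p_i^{\,q},
\qquad
\lim_{q \to 1} S_R = -\sum_i p_i \ln p_i = S_{BG}
```

The two entropies coincide as q → 1; the abstract's stronger claim is that the resulting equilibrium thermodynamics agrees in the thermodynamic limit.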
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58. Mohan Delampady and V. R. Padmawar. General Article.
Neutrons and antimony: physical measurements and interpretations
International Nuclear Information System (INIS)
Smith, A. B.
2000-01-01
New experimental information for the elastic and inelastic scattering of ∼ 4-10 MeV neutrons from elemental antimony is presented. The differential measurements are made at ∼ 40 or more scattering angles and at incident neutron-energy intervals of ∼ 0.5 MeV. The present experimental results, together with those previously reported from this laboratory and those found in the literature, are comprehensively interpreted using spherical optical-statistical and dispersive-optical models. Direct vibrational processes via core excitation, isospin and shell effects are discussed. Antimony models for applications are proposed and compared with global, regional, and specific models reported in the literature.
hepawk - A language for scanning high energy physics events
International Nuclear Information System (INIS)
Ohl, T.
1992-01-01
We present the programming language hepawk, designed for convenient scanning of data structures arising in the simulation of high energy physics events. The interpreter for this language has been implemented in FORTRAN-77, therefore hepawk runs on any machine with a FORTRAN-77 compiler. (orig.)
The disagreeable behaviour of the kappa statistic.
Flight, Laura; Julious, Steven A
2015-01-01
It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
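The marginal-sensitivity the abstract warns about is easy to demonstrate: two rater tables with identical observed agreement can yield very different kappa values. A minimal sketch on synthetic 2×2 tables (the counts are invented for illustration):

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square contingency table of two raters."""
    table = np.asarray(table, dtype=float)
    total = table.sum()
    p_o = np.trace(table) / total                          # observed agreement
    p_e = (table.sum(0) * table.sum(1)).sum() / total**2   # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Same observed agreement (90%), very different marginal distributions.
balanced = [[45, 5], [5, 45]]
skewed = [[85, 5], [5, 5]]
print(cohens_kappa(balanced))  # ≈ 0.80
print(cohens_kappa(skewed))    # ≈ 0.44

# Prevalence-and-bias-adjusted kappa (PABAK) depends only on p_o,
# so it reports 0.80 for both tables.
p_o = 0.9
print(2 * p_o - 1)
```

This is exactly why the abstract recommends reporting concordance, maximum attainable kappa and PABAK alongside kappa itself.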
Interpretive Reporting of Protein Electrophoresis Data by Microcomputer
Talamo, Thomas S.; Losos, Frank J.; Kessler, G. Frederick
1982-01-01
A microcomputer based system for interpretive reporting of protein electrophoretic data has been developed. Data for serum, urine and cerebrospinal fluid protein electrophoreses as well as immunoelectrophoresis can be entered. Patient demographic information is entered through the keyboard, followed by manual entry of total and fractionated protein levels obtained after densitometer scanning of the electrophoretic strip. The patterns are then coded, interpreted, and final reports generated. In most cases interpretation time is less than one second. Misinterpretation by computer is uncommon and can be corrected by edit functions within the system. These discrepancies between computer and pathologist interpretation are automatically stored in a data file for later review and possible program modification. Any or all previous tests on a patient may be reviewed with graphic display of the electrophoretic pattern. The system has been in use for several months and is presently well accepted by both laboratory and clinical staff. It also allows rapid storage, retrieval and analysis of protein electrophoretic data.
Scanning device for a spectrometer
International Nuclear Information System (INIS)
Ignat'ev, V.M.
1982-01-01
The invention belongs to scanning devices and is intended for spectrum scanning in spectral instruments. The purpose of the invention is to broaden the spectral scanning range. The device's construction provides a scanning range extending from fractions of a revolution to several revolutions of the monochromator drum head; any whole number of drum revolutions, plus a fraction of a revolution, can be set with a high degree of accuracy.
Tax Treaty Interpretation in Spain
Soler Roch, María Teresa; Ribes Ribes, Aurora
2001-01-01
This paper provides insight in the interpretation of Spanish double taxation conventions. Taking as a premise the Vienna Convention on the Law of Treaties and the wording of Article 3(2) OECD Model Convention, the authors explore the relevance of mutual agreements, tax authority practice and foreign court decisions on the tax treaty interpretation.
Pragmatics in Court Interpreting: Additions
DEFF Research Database (Denmark)
Jacobsen, Bente
2003-01-01
Danish court interpreters are expected to follow ethical guidelines, which instruct them to deliver exact verbatim versions of source texts. However, this requirement often clashes with the reality of the interpreting situation in the courtroom. This paper presents and discusses the findings of a...
Intercultural pragmatics and court interpreting
DEFF Research Database (Denmark)
Jacobsen, Bente
2008-01-01
This paper reports on an on-going investigation of conversational implicature in triadic speech events: Interpreter-mediated questionings in criminal proceedings in Danish district courts. The languages involved are Danish and English, and the mode of interpreting is the consecutive mode. The c...
Interpreting Recoil for Undergraduate Students
Elsayed, Tarek A.
2012-01-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…
Observation of Liver Color Scan
Energy Technology Data Exchange (ETDEWEB)
Choe, Y K; Ahn, S B [Yonsei University College of Medicine, Seoul (Korea, Republic of)
1969-09-15
In the past few years, scintigraphy has become increasingly important in clinical practice, and the use of a color-printing technique has permitted a more accurate interpretation of the scan image. Our liver color scintigrams comprise 51 hepatomas, 35 cases of liver cirrhosis, 22 liver abscesses, 10 cases of hepatitis and 13 other liver diseases, clinically and pathologically diagnosed at Severance Hospital, Yonsei Univ., from Feb. 1969 through Sept. 1969. These scintigrams have been analyzed in terms of various pathologic morphology, such as size, shape and margin of the liver, distribution of radioactivity, and shape of the space-occupying lesions. The results are as follows: 1) Enlargement of the liver was the most common finding in the diseased livers; right-lobe enlargement was particularly prominent in liver abscess. 2) Irregular distribution of radioactivity in the liver (so-called mottling) was present in 78% of hepatomas, while it was seen in only 31% of liver abscesses. 3) Liver cirrhosis tends to show perihilar accumulation of the isotope (57%). 4) Deformity of the lowermost angle of the right lobe and of the lateral margin of the left lobe was also impressive throughout the cases (74-95% of all diseased livers). 5) The frequency of visualization of the spleen was influenced by the size of the space-occupying lesions and the amount of functioning liver. 6) Differentiation between liver abscess and hepatoma seems to be possible on the scintigram when the shape and margin of the defect and the pattern of distribution of radioactivity in the remaining liver are clearly demonstrated.
Factors influencing bone scan quality
International Nuclear Information System (INIS)
Adams, F.G.; Shirley, A.W.
1983-01-01
A reliable subjective method of assessing bone scan quality is described. A large number of variables which theoretically could influence scan quality were submitted to regression and factor analysis. Obesity, age, sex and abnormality of scan were found to be significant but weak variables. (orig.)
International Nuclear Information System (INIS)
Imanishi, Masami; Morimoto, Tetsuya; Iida, Noriyuki; Hisanaga, Manabu; Kinugawa, Kazuhiko
1980-01-01
Generally, CT scans reveal a decrease in the volume of the ventricular system, sylvian fissures and cortical sulci in the acute stage of encephalitis, and softening of the cerebral lobes with dilatation of the lateral ventricles and subarachnoid spaces in the chronic stage. We encountered three cases of encephalitis: mumps (case 1), herpes simplex (case 2), and syphilis (case 3). In case 1, brain edema was seen in the acute stage and brain atrophy in the chronic stage. In case 2, necrosis of the temporal pole, which is pathognomonic of herpes simplex encephalitis, was recognized. In case 3, multiple lesions that enhanced with contrast material were found scattered over the whole brain; these lesions were diagnosed as inflammatory granulomas by histological examination. (author)
Application of descriptive statistics in analysis of experimental data
Mirilović Milorad; Pejin Ivana
2008-01-01
Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass phenomena. In fact, statistics comprise a group of methods used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistics and inferential statistics. The values which represent the results of an experiment, and which are the subj...
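The descriptive side of that division can be illustrated with Python's standard library on a small hypothetical sample (the values are invented):

```python
import statistics as st

# Hypothetical experimental measurements.
sample = [4.1, 4.4, 3.9, 5.0, 4.6, 4.2, 4.8, 4.3]

print("mean   =", st.mean(sample))
print("median =", st.median(sample))
print("stdev  =", st.stdev(sample))   # sample standard deviation (n - 1)
print("range  =", max(sample) - min(sample))
```

These summaries describe the sample itself; inferential statistics would go further and draw conclusions about the population the sample came from.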
Scanning device for scintigraphy
International Nuclear Information System (INIS)
Casale, R.
1975-01-01
A device is described for scintigraphic scanning in a horizontal plane, comprising: (a) A support provided with two horizontal, longitudinal guides, one located in the upper part of the support and the second in the lower part; (b) A carriage, movable with respect to the support along the two guides, provided in its upper part, projecting above the support, with rolling means suitable to support and to slide along its axis a support rod for the first detector, horizontally and transversely located, said carriage being further provided in its lower part with a recess with optional rolling means suitable to support and to slide along its axis a second support rod for the second detector, said second rod being located parallel to the first rod and below it; (c) One or two support rods for the detectors, the first of said rods being slidably supported above the support, along its axis, by the rolling means located in the upper part of the carriage, and the second rod, if present, being slidably supported along its axis by the optional rolling means contained in the recess provided in the lower part of the carriage; and (d) A vertical shaft supported by said carriage on which is mounted a toothed wheel for each rod, each toothed wheel engaging a positive drive belt or the like connected to each said rod, so that rotation of the shaft produces the simultaneous displacement of the two rods along their axes; and single motor means for driving said shaft during a scanning operation. (U.S.)
Do Interpreters Indeed Have Superior Working Memory in Interpreting?
Institute of Scientific and Technical Information of China (English)
于飞
2012-01-01
With the frequent communication between China and western countries in the fields of economy, politics and culture, interpreting becomes more and more important to people in all walks of life. This paper aims to test the author's hypothesis that professional interpreters have short-term memory similar to that of unprofessional interpreters, but superior working memory. After a review of the literature concerning consecutive interpreting, short-term memory and working memory, the experiments are designed and the analyses are described.
Over-all accuracy of 99mTc-pertechnetate brain scanning for brain tumours
International Nuclear Information System (INIS)
Bjoernsson, O.G.; Petursson, E.; Sigurbjoernsson, B.; Davidsson, D.
1978-01-01
A 3-year follow-up and re-evaluation of all scans on all patients referred for brain scanning in Iceland during one year was performed in order to assess the diagnostic reliability of radioisotope scanning for brain tumours. The study included 471 patients. Of these, 25 had primary brain tumours and 7 had brain metastases. Scans were positive and correctly interpreted in 68% of the patients with primary brain tumours and in 3 of the 7 patients with metastases. The over-all accuracy of brain scanning for brain tumours, defined as the total number of correct positive scans and correct negative scans versus the total number of scans examined, was 96%, this figure being mainly influenced by the high number of true negative scans. (orig.)
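The quoted accuracy figure is simple arithmetic on the confusion counts. The sketch below reconstructs it from the abstract's percentages; the true-negative count is an inference, not a reported number:

```python
# Illustrative arithmetic for the quoted "over-all accuracy".
total_scans = 471
primaries, metastases = 25, 7

true_pos = round(0.68 * primaries) + 3   # 17 correctly read primaries + 3 metastases
true_neg = 432                           # inferred so that accuracy comes out at ~96%
accuracy = (true_pos + true_neg) / total_scans
print(f"over-all accuracy = {accuracy:.0%}")  # ≈ 96%
```

With only 32 tumours among 471 patients, the denominator is dominated by negatives, which is exactly why the abstract notes the figure is driven by true negative scans.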
Handbook of univariate and multivariate data analysis and interpretation with SPSS
Ho, Robert
2006-01-01
Many statistics texts tend to focus more on the theory and mathematics underlying statistical tests than on their applications and interpretation. This can leave readers with little understanding of how to apply statistical tests or how to interpret their findings. While the SPSS statistical software has done much to alleviate the frustrations of social science professionals and students who must analyze data, they still face daunting challenges in selecting the proper tests, executing the tests, and interpreting the test results.With emphasis firmly on such practical matters, this handbook se
An objective interpretation of Lagrangian quantum mechanics
International Nuclear Information System (INIS)
Roberts, K.V.
1978-01-01
Unlike classical mechanics, the Copenhagen interpretation of quantum mechanics does not provide an objective space-time picture of the actual history of a physical system. This paper suggests how the conceptual foundations of quantum mechanics can be reformulated, without changing the mathematical content of the theory or its detailed agreement with experiment and without introducing any hidden variables, in order to provide an objective, covariant, Lagrangian description of reality which is deterministic and time-symmetric on the microscopic scale. The basis of this description can be expressed either as an action functional or as a summation over Feynman diagrams or paths. The probability laws associated with the quantum-mechanical measurement process, and the asymmetry in time of the principles of macroscopic causality and of the laws of statistical mechanics, are interpreted as consequences of the particular boundary conditions that apply to the actual universe. The objective interpretation does not include the observer and the measurement process among the fundamental concepts of the theory, but it does not entail a revision of the ideas of determinism and of time, since in a Lagrangian theory both initial and final boundary conditions on the action functional are required. (author)
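The "summation over Feynman diagrams or paths" mentioned as one basis of the description has the standard form (propagator K between endpoints a and b, action S built from the Lagrangian L):

```latex
K(b, a) = \sum_{\text{paths}} e^{\,i S[x(t)]/\hbar},
\qquad
S[x(t)] = \int_{t_a}^{t_b} L(x, \dot{x}, t)\, dt
```

In a Lagrangian formulation of this kind, boundary data at both t_a and t_b enter the action functional, which is the time-symmetric feature the interpretation relies on.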
Data analysis and interpretation for environmental surveillance
International Nuclear Information System (INIS)
1992-06-01
The Data Analysis and Interpretation for Environmental Surveillance Conference was held in Lexington, Kentucky, February 5-7, 1990. The conference was sponsored by what is now the Office of Environmental Compliance and Documentation, Oak Ridge National Laboratory. Participants included technical professionals from all Martin Marietta Energy Systems facilities, Westinghouse Materials Company of Ohio, Pacific Northwest Laboratory, and several technical support contractors. Presentations at the conference ranged across the full spectrum of issues that affect the analysis and interpretation of environmental data. Topics included tracking systems for samples and schedules associated with ongoing programs; coalescing data from a variety of sources and pedigrees into integrated databases; methods for evaluating the quality of environmental data through empirical estimates of parameters such as charge balance, pH, and specific conductance; statistical applications to the interpretation of environmental information; and uses of environmental information in risk and dose assessments. Hearing about and discussing this wide variety of topics provided an opportunity to capture the subtlety of each discipline and to appreciate the continuity required among the disciplines to perform high-quality environmental information analysis.
Abstract Interpretation and Attribute Grammars
DEFF Research Database (Denmark)
Rosendahl, Mads
The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard ...... is presented in the thesis. Methods from abstract interpretation can also be used in correctness proofs of attribute grammars. This proof technique introduces a new class of attribute grammars based on domain theory. This method is illustrated with examples....
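A toy instance of semantics-based analysis in the abstract-interpretation style: evaluate arithmetic over an abstract "sign" domain instead of concrete integers. The domain and operator tables below are a standard textbook illustration, not taken from the thesis:

```python
# Sign domain: negative, zero, positive, and TOP (= "unknown sign").
NEG, ZERO, POS, TOP = "-", "0", "+", "T"

def abs_sign(n):
    """Abstraction of a concrete integer to its sign."""
    return NEG if n < 0 else POS if n > 0 else ZERO

def abs_add(a, b):
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b and a != TOP else TOP   # e.g. NEG + POS is unknown

def abs_mul(a, b):
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

# The analysis shows x*x + 1 is never negative for any known-sign x,
# without running the program on concrete values; at TOP it soundly
# over-approximates to "unknown".
for x in (NEG, ZERO, POS, TOP):
    print(x, "->", abs_add(abs_mul(x, x), abs_sign(1)))
```

The TOP case shows the characteristic precision loss of abstract domains: the sign domain cannot express "non-negative", so it answers "unknown" even though the concrete result is always positive.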
An Introduction to Statistical Concepts
Lomax, Richard G
2012-01-01
This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included. I