WorldWideScience

Sample records for scan statistic interpretation

  1. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    Science.gov (United States)

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents composed of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is optimal for identifying clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of…
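
    The multi-scale strategy described above can be sketched in a few lines. The toy below is not SaTScan and all data and parameter names are invented: it runs a circular Poisson scan at several maximum-radius settings and records, for each location, the fraction of runs in which it falls inside the most likely cluster, which is one simple way to operationalize the "reliability" idea.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      xy = rng.uniform(0, 10, size=(n, 2))            # location centroids
      pop = rng.integers(50, 500, size=n)             # population at risk
      rate = np.where(np.hypot(xy[:, 0] - 3, xy[:, 1] - 3) < 1.5, 0.02, 0.01)
      cases = rng.poisson(pop * rate)                 # simulated case counts

      def poisson_llr(c, e, C):
          """Kulldorff's Poisson log-likelihood ratio for one window."""
          if c <= e or c >= C:
              return 0.0
          return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

      def most_likely_cluster(cases, pop, xy, max_radius):
          C, P = cases.sum(), pop.sum()
          best_llr, best_mask = 0.0, np.zeros(len(xy), dtype=bool)
          for cx, cy in xy:                           # circles centred on locations
              mask = np.hypot(xy[:, 0] - cx, xy[:, 1] - cy) <= max_radius
              llr = poisson_llr(cases[mask].sum(), C * pop[mask].sum() / P, C)
              if llr > best_llr:
                  best_llr, best_mask = llr, mask
          return best_llr, best_mask

      # "Reliability": how often each location sits in the most likely cluster
      # as the scaling parameter (here, the window radius) is varied.
      radii = [0.5, 1.0, 1.5, 2.0, 2.5]
      hits = np.zeros(n)
      for r in radii:
          hits += most_likely_cluster(cases, pop, xy, r)[1]
      reliability = hits / len(radii)
      print("locations in the most likely cluster at every scale:",
            int((reliability == 1.0).sum()))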

  2. Data-driven inference for the spatial scan statistic.

    Science.gov (United States)

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important for the correctness of the decision based on this inference when the p-value computed by the usual inference process is near the alpha significance level. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
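
    A minimal sketch of the size-conditioned p-value under the reading above: the observed most likely cluster's log-likelihood ratio is compared only against null replicates whose most likely cluster has the same size k. The toy 1-D Poisson scan and all numbers are illustrative, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(1)
      m = 30                                     # regions on a line
      pop = rng.integers(100, 1000, size=m)

      def mlc(cases, pop):
          """Most likely cluster over contiguous windows: (llr, size)."""
          C, P = cases.sum(), pop.sum()
          best = (0.0, 0)
          for i in range(m):
              for j in range(i + 1, m + 1):
                  c, e = cases[i:j].sum(), C * pop[i:j].sum() / P
                  if e < c < C:
                      llr = c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))
                      if llr > best[0]:
                          best = (llr, j - i)
          return best

      p0 = 0.01
      risk = p0 * np.where((np.arange(m) // 5) == 2, 2.0, 1.0)   # regions 10-14 elevated
      observed = rng.poisson(pop * risk)
      obs_llr, obs_k = mlc(observed, pop)

      null = [mlc(rng.poisson(pop * p0), pop) for _ in range(999)]
      usual_p = (1 + sum(l >= obs_llr for l, _ in null)) / 1000
      same_k = [l for l, k in null if k == obs_k]                # condition on size k
      cond_p = (1 + sum(l >= obs_llr for l in same_k)) / (1 + len(same_k))
      print(f"k={obs_k}  usual p={usual_p:.3f}  size-conditioned p={cond_p:.3f}")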

  3. Data-driven inference for the spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Duczmal Luiz H

    2011-08-01

    Full Text Available Abstract Background Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important for the correctness of the decision based on this inference when the p-value computed by the usual inference process is near the alpha significance level. Conclusions A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.

  4. Scanning Tunneling Microscopy - image interpretation

    International Nuclear Information System (INIS)

    Maca, F.

    1998-01-01

    The basic ideas of image interpretation in Scanning Tunneling Microscopy are presented using simple quantum-mechanical models and supplied with examples of successful application. The importance of correctly interpreting the images produced by this brilliant experimental surface technique is stressed

  5. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  6. A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data

    Directory of Open Access Journals (Sweden)

    Scherer Stephen W

    2011-05-01

    Full Text Available Abstract Background Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlapping greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. Results We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power, and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. Conclusions The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
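
    A rough sketch of the idea as described (the authors' actual statistic and search are more elaborate): grow candidate gene clusters outward along pathway edges from each seed gene and score each cluster with a binomial likelihood ratio asking whether its rare-CNV hits over-represent cases. The toy graph, counts, and greedy search below are all hypothetical.

      import math

      graph = {  # toy pathway: gene -> neighbouring genes
          "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
          "D": ["B", "C", "E"], "E": ["D", "F"], "F": ["E"],
      }
      hits = {"A": (1, 1), "B": (6, 1), "C": (5, 0),    # (case, control)
              "D": (7, 2), "E": (1, 2), "F": (0, 1)}    # rare-CNV hits per gene
      N_CASE, N_CTRL = 500, 500

      def score(cluster):
          """Binomial LLR: do the cluster's hits over-represent cases?"""
          c = sum(hits[g][0] for g in cluster)
          n = c + sum(hits[g][1] for g in cluster)
          p0 = N_CASE / (N_CASE + N_CTRL)
          if n == 0 or c / n <= p0:
              return 0.0
          def ll(k, m, p):  # binomial log-likelihood, constants dropped
              out = 0.0
              if k:
                  out += k * math.log(p)
              if m - k:
                  out += (m - k) * math.log(1 - p)
              return out
          return ll(c, n, c / n) - ll(c, n, p0)

      def grow(seed, max_size=3):
          """Greedily add the neighbouring gene that maximises the LLR."""
          cluster = {seed}
          while len(cluster) < max_size:
              frontier = {g for v in cluster for g in graph[v]} - cluster
              if not frontier:
                  break
              cluster.add(max(frontier, key=lambda g: score(cluster | {g})))
          return cluster

      best = max((grow(s) for s in graph), key=score)
      print("highest-scoring gene cluster:", sorted(best),
            "LLR =", round(score(best), 3))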

  7. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compare the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
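
    A minimal sketch of the statistic as the abstract describes it: each circular window is scored by the Wilcoxon rank-sum z-statistic comparing values inside versus outside, and the maximum is calibrated by permuting values over fixed locations. The data and window radius are invented.

      import numpy as np
      from scipy.stats import ranksums

      rng = np.random.default_rng(2)
      n = 150
      xy = rng.uniform(0, 10, size=(n, 2))
      vals = rng.lognormal(0.0, 1.0, size=n)            # skewed, non-normal data
      inside = np.hypot(xy[:, 0] - 7, xy[:, 1] - 7) < 1.5
      vals[inside] *= 2.0                               # planted high-value cluster

      def windows(radius=1.5):
          for cx, cy in xy:                             # circles centred on points
              yield np.hypot(xy[:, 0] - cx, xy[:, 1] - cy) <= radius

      def max_ranksum(vals):
          best = -np.inf
          for mask in windows():
              if 1 < mask.sum() < len(vals) - 1:
                  z, _ = ranksums(vals[mask], vals[~mask])
                  best = max(best, z)                   # one-sided: high inside
          return best

      obs = max_ranksum(vals)
      null = [max_ranksum(rng.permutation(vals)) for _ in range(199)]
      p = (1 + sum(s >= obs for s in null)) / 200
      print(f"max rank-sum z = {obs:.2f}, Monte Carlo p = {p:.3f}")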

  8. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    International Nuclear Information System (INIS)

    Tadaki, Kohtaro

    2010-01-01

    The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in the interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T in (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical strictness as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT which actualizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
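
    For orientation, the thermodynamic quantities named in the abstract are commonly presented in a base-2 form along the following lines (a sketch of Tadaki's definitions up to normalization conventions, with sums ranging over the domain of an optimal prefix-free machine U and T playing the role of temperature):

      \begin{align*}
        Z(T) &= \sum_{p \,\in\, \mathrm{dom}\,U} 2^{-|p|/T}, &
        F(T) &= -T \log_2 Z(T), \\
        E(T) &= \frac{1}{Z(T)} \sum_{p \,\in\, \mathrm{dom}\,U} |p|\, 2^{-|p|/T}, &
        S(T) &= \frac{E(T) - F(T)}{T}.
      \end{align*}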

  9. Skeletal blood flow: implications for bone-scan interpretation

    International Nuclear Information System (INIS)

    Charkes, N.D.

    1980-01-01

    The dispersion of the skeleton throughout the body and its complex vascular anatomy require indirect methods for the measurement of skeletal blood flow. The results of one such method, compartmental analysis of skeletal tracer kinetics, are presented. The assumptions underlying the models were tested in animals and found to be in agreement with experimental observations. Based upon the models and the experimental results, inferences concerning bone-scan interpretation can be drawn: decreased cardiac output produces low-contrast (technically poor) scans; decreased skeletal flow produces photon-deficient lesions; increase of cardiac output or of generalized systemic blood flow is undetectable 1 to 2 h after dose; increased local skeletal blood flow results from disturbance of the bone microvasculature and can occur from neurologic (sympatholytic) disorders or in association with focal abnormalities that also incite the formation of reactive bone (e.g., metastasis, fracture, etc.). Mathematical solutions of tracer kinetic data thus become relevant to bone-scan interpretation

  10. Spatial scan statistics using elliptic windows

    DEFF Research Database (Denmark)

    Christiansen, Lasse Engbo; Andersen, Jens Strodl; Wegener, Henrik Caspar

    The spatial scan statistic is widely used to search for clusters in epidemiologic data. This paper shows that the usually applied elimination of secondary clusters as implemented in SaTScan is sensitive to smooth changes in the shape of the clusters. We present an algorithm for generation of a set…

  11. Spatial scan statistics using elliptic windows

    DEFF Research Database (Denmark)

    Christiansen, Lasse Engbo; Andersen, Jens Strodl; Wegener, Henrik Caspar

    2006-01-01

    The spatial scan statistic is widely used to search for clusters. This article shows that the usually applied elimination of secondary clusters as implemented in SaTScan is sensitive to smooth changes in the shape of the clusters. We present an algorithm for generation of a set of confocal elliptic…
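
    The abstract is truncated, but "confocal elliptic" windows admit a compact construction worth sketching (details assumed): ellipses sharing a pair of foci are indexed by their semi-major axis a, and a point lies inside one exactly when the sum of its distances to the two foci is at most 2a.

      import numpy as np

      rng = np.random.default_rng(3)
      pts = rng.uniform(0, 10, size=(300, 2))

      def confocal_masks(pts, f1, f2, a_values):
          """For each semi-major axis a, yield the points inside the ellipse
          with foci f1 and f2 (requires 2a >= |f1 - f2|)."""
          d = np.hypot(*(pts - f1).T) + np.hypot(*(pts - f2).T)
          for a in a_values:
              yield pts[d <= 2 * a]

      f1, f2 = np.array([4.0, 5.0]), np.array([6.0, 5.0])  # |f1 - f2| = 2, so c = 1
      a_values = [1.2, 1.6, 2.0]
      for a, inside in zip(a_values, confocal_masks(pts, f1, f2, a_values)):
          print(f"a = {a}: {len(inside)} points inside the ellipse")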

  12. Normal variants and artifacts in bone scan: potential for errors in interpretation

    International Nuclear Information System (INIS)

    Sohn, Myung Hee

    2004-01-01

    Bone scan is one of the most frequently performed studies in nuclear medicine. In bone scan, the amount of radioisotope taken up by a lesion depends primarily on the local rate of bone turnover rather than on the bone mass. Bone scan is extremely sensitive for detecting bony abnormalities. However, abnormalities that appear on bone scan may not always represent disease. The normal scan appearances may be affected not only by skeletal physiology and anatomy but also by a variety of technical factors which can influence image quality. Many normal variants and artifacts may appear on bone scan. They can simulate a pathologic process and mislead the reader into a wrong diagnostic interpretation. Therefore, their recognition is necessary to avoid misdiagnosis. A nuclear medicine physician should be aware of the variable appearance of normal variants and artifacts on bone scan. In this article, a variety of normal variants and artifacts mimicking real pathologic lesions in bone scan interpretation are discussed and illustrated

  13. Huffman and linear scanning methods with statistical language models.

    Science.gov (United States)

    Roark, Brian; Fried-Oken, Melanie; Gibbons, Chris

    2015-03-01

    Current scanning access methods for text generation in AAC devices are limited to relatively few options, most notably row/column variations within a matrix. We present Huffman scanning, a new method for applying statistical language models to binary-switch, static-grid typing AAC interfaces, and compare it to other scanning options under a variety of conditions. We present results for 16 adults without disabilities and one 36-year-old man with locked-in syndrome who presents with complex communication needs and uses AAC scanning devices for writing. Huffman scanning with a statistical language model yielded significant typing speedups for the 16 participants without disabilities versus any of the other methods tested, including two row/column scanning methods. A similar pattern of results was found with the individual with locked-in syndrome. Interestingly, faster typing speeds were obtained with Huffman scanning using a more leisurely scan rate than relatively fast individually calibrated scan rates. Overall, the results reported here demonstrate great promise for the usability of Huffman scanning as a faster alternative to row/column scanning.
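
    The core of Huffman scanning is the code construction itself, which can be sketched directly: given next-letter probabilities from a language model, build a Huffman code and let each binary switch answer follow one code bit, so likely letters cost fewer presses. The probability table below is made up for illustration.

      import heapq
      from itertools import count

      def huffman(probs):
          """Return {symbol: bitstring} for a probability table."""
          tie = count()  # tie-breaker so heapq never compares dicts
          heap = [(p, next(tie), {s: ""}) for s, p in probs.items()]
          heapq.heapify(heap)
          while len(heap) > 1:
              p1, _, c1 = heapq.heappop(heap)
              p2, _, c2 = heapq.heappop(heap)
              merged = {s: "0" + b for s, b in c1.items()}
              merged.update({s: "1" + b for s, b in c2.items()})
              heapq.heappush(heap, (p1 + p2, next(tie), merged))
          return heap[0][2]

      # next-letter distribution after typing "th" (illustrative numbers only)
      probs = {"e": 0.62, "a": 0.14, "i": 0.09, "o": 0.07, "r": 0.05, "u": 0.03}
      code = huffman(probs)
      for s in sorted(code, key=lambda s: -probs[s]):
          print(f"{s}: {code[s]}  ({len(code[s])} switch answers)")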

  14. Statistics and Data Interpretation for Social Work

    CERN Document Server

    Rosenthal, James

    2011-01-01

    "Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication.". -Praise for the First Edition. Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes

  15. Combinatorial interpretation of Haldane-Wu fractional exclusion statistics.

    Science.gov (United States)

    Aringazin, A K; Mazhitov, M I

    2002-08-01

    Assuming that the maximal allowed number of identical particles in a state is an integer parameter, q, we derive the statistical weight and analyze the associated equation that defines the statistical distribution. The derived distribution covers Fermi-Dirac and Bose-Einstein ones in the particular cases q = 1 and q → ∞ (n_i/q → 1), respectively. We show that the derived statistical weight provides a natural combinatorial interpretation of Haldane-Wu fractional exclusion statistics, and present exact solutions of the distribution equation.
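
    For reference, the intermediate (Gentile-type) distribution with maximal occupancy q that this abstract refers to can be written as follows (a standard form; the paper's own parameterization may differ), with x_i = (ε_i − μ)/k_B T:

      \[
        \bar{n}_i \;=\; \frac{1}{e^{x_i} - 1} \;-\; \frac{q+1}{e^{(q+1)x_i} - 1},
      \]

    which reduces to the Fermi-Dirac form 1/(e^{x_i} + 1) at q = 1 and to the Bose-Einstein form 1/(e^{x_i} − 1) as q → ∞.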

  16. A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.

    Science.gov (United States)

    Tango, Toshiro; Takahashi, Kunihiko

    2012-12-30

    Spatial scan statistics are widely used tools for detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed different spatial scan statistics, including the elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, because of its heavy computational load, this method restricts the search for candidate clusters to a maximum of 30 nearest neighbors. In this paper, we show that a flexible spatial scan statistic implemented with the restricted likelihood ratio proposed by Tango (2008) can (1) eliminate the limitation of 30 nearest neighbors and (2) run in surprisingly much less computational time than the original flexible spatial scan statistic. As a side effect, Monte Carlo simulation shows that it can detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo Metropolitan area, Japan. Copyright © 2012 John Wiley & Sons, Ltd.

  17. A log-Weibull spatial scan statistic for time to event data.

    Science.gov (United States)

    Usman, Iram; Rosychuk, Rhonda J

    2018-06-13

    Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition such as disease outbreaks. Accompanied by the appropriate distribution, these statistics can also identify geographic areas with either longer or shorter time to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and the power of the test have been investigated through simulated data. Methods are also illustrated on time to specialist visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found northern regions of Alberta had longer times to specialist visit than other areas. We proposed the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters for time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.

  18. A spatial scan statistic for nonisotropic two-level risk cluster.

    Science.gov (United States)

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2012-01-30

    Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster. Copyright © 2011 John Wiley & Sons, Ltd.

  19. Simulators of tray distillation columns as tools for interpreting gamma-ray scan profile signal

    International Nuclear Information System (INIS)

    Offei-Mensah, P.S.; Gbadago, J.K.; Dagadu, C.P.K.; Danso, K.A.

    2008-01-01

    Simulators of tray distillation columns were used to provide technical guidelines for interpreting signals from gamma-ray scans used for analysing malfunctions in distillation columns. The transmitted radiation intensities at 0.05 m intervals were determined from top to bottom of simulators of tray distillation columns exposed to 20 mCi of ¹³⁷Cs. Signals generated from the simulators were identical to the experimental signals obtained from the stabilizer column of the crude oil distillation unit at the Tema Oil Refinery Ghana Limited. Changes in the signal level were observed with changes in diameter, type of material (gasoline, air, debris, steel) and orientation of the scan line. The analysis provided accurate interpretation of gamma scan profiles. (au)
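
    The underlying physics is ordinary narrow-beam attenuation; as a reminder (a standard result, not taken from the paper), the transmitted intensity of a beam of initial intensity I_0 crossing material of linear attenuation coefficient μ and thickness x is

      \[
        I \;=\; I_0 \, e^{-\mu x},
      \]

    so at each scan elevation the detector reading reflects the average density along the chord (tray metal, liquid, vapour, or debris), which is what the simulator signals reproduce.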

  20. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
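
    The paper's central equivalence is easy to demonstrate numerically for the two-sample t-test: with n1 and n2 known, t, the two-sided p-value, and Cohen's d are deterministic transforms of one another (the JZS Bayes factor is likewise a function of t and the sample sizes). The numbers below are illustrative; the confidence interval uses a common large-sample approximation.

      import numpy as np
      from scipy import stats

      n1, n2 = 24, 26
      t = 2.31                                    # observed t statistic
      df = n1 + n2 - 2

      p = 2 * stats.t.sf(abs(t), df)              # two-sided p-value from t
      d = t * np.sqrt(1 / n1 + 1 / n2)            # Cohen's d from t
      t_back = d / np.sqrt(1 / n1 + 1 / n2)       # ...and back again, losslessly

      # approximate 95% CI for d via its large-sample standard error
      se_d = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
      ci = (d - 1.96 * se_d, d + 1.96 * se_d)

      print(f"t={t}  p={p:.4f}  d={d:.3f}  (t recovered: {t_back:.2f})")
      print(f"approx 95% CI for d: ({ci[0]:.3f}, {ci[1]:.3f})")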

  1. Small nodule detectability evaluation using a generalized scan-statistic model

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M; Lewitt, Robert M

    2006-01-01

    This paper investigates the use of the scan statistic for evaluating the detectability of small nodules in medical images. The scan-statistic method is often used in applications in which random fields must be searched for abnormal local features. Several results of the detection with localization theory are reviewed and a generalization is presented using the noise nodule distribution obtained by scanning arbitrary areas. One benefit of the noise nodule model is that it enables determination of the scan-statistic distribution by using only a few image samples in a way suitable for both simulation and experimental setups. Also, based on the noise nodule model, the case of multiple targets per image is addressed and an image abnormality test using the likelihood ratio and an alternative test using multiple decision thresholds are derived. The results obtained reveal that in the case of low-contrast nodules or multiple nodules the usual test strategy based on a single decision threshold underperforms compared with the alternative tests. That is a consequence of the fact that not only the contrast or the size, but also the number of suspicious nodules is a clue indicating image abnormality. In the case of the likelihood ratio test, the multiple clues are unified in a single decision variable. Other tests that process multiple clues differently do not necessarily produce a unique ROC curve, as shown in examples using a test involving two decision thresholds. We present examples with two-dimensional time-of-flight (TOF) and non-TOF PET image sets analysed using the scan statistic for different search areas, as well as the fixed position observer

  2. A critical look at prospective surveillance using a scan statistic.

    Science.gov (United States)

    Correa, Thais R; Assunção, Renato M; Costa, Marcelo A

    2015-03-30

    The scan statistic is a very popular surveillance technique for purely spatial, purely temporal, and spatial-temporal disease data. It was extended to the prospective surveillance case, and it has been applied quite extensively in this situation. When the usual signal rules, such as those implemented in SaTScan(TM) (Boston, MA, USA) software, are used, we show that the scan statistic method is not appropriate for the prospective case. The reason is that it does not adjust properly for the sequential and repeated tests carried out during the surveillance. We demonstrate that the nominal significance level α is not meaningful and there is no relationship between α and the recurrence interval or the average run length (ARL). In some cases, the ARL may be equal to ∞, which makes the method ineffective. This lack of control of the type-I error probability and of the ARL leads us to strongly oppose the use of the scan statistic with the usual signal rules in the prospective context. Copyright © 2014 John Wiley & Sons, Ltd.
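
    The objection can be reproduced in miniature (a sketch, not the paper's analysis): apply a fresh nominal α = 0.05 test to each week of null data and signal on any rejection; the false-alarm probability then grows steadily with the surveillance horizon, so α alone says nothing about the average run length.

      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(4)
      mu, weeks, sims = 20.0, 104, 2000          # null weekly mean, horizon, trials
      threshold = poisson.isf(0.05, mu)          # per-week critical value

      first_alarm = []
      for _ in range(sims):
          counts = rng.poisson(mu, size=weeks)   # data generated under the null
          alarms = np.nonzero(counts > threshold)[0]
          first_alarm.append(alarms[0] + 1 if alarms.size else np.inf)
      first_alarm = np.array(first_alarm)

      for t in (4, 26, 52, 104):
          print(f"P(at least one false alarm by week {t}) = "
                f"{np.mean(first_alarm <= t):.2f}")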

  3. A Scan Statistic for Continuous Data Based on the Normal Probability Model

    OpenAIRE

    Konty, Kevin; Kulldorff, Martin; Huang, Lan

    2009-01-01

    Abstract Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight...

  4. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    Science.gov (United States)

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
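
    The Gumbel device itself is simple to sketch (the replicates below are simulated stand-ins; in a real analysis they would be max log-likelihood ratios from null Monte Carlo datasets): fit a Gumbel distribution to the simulated maxima and read the p-value off the fitted tail rather than the empirical rank.

      import numpy as np
      from scipy.stats import gumbel_r

      rng = np.random.default_rng(5)
      # pretend these are max log-likelihood ratios from 999 null replicates
      null_max_llr = rng.gumbel(loc=7.0, scale=1.2, size=999)
      observed_llr = 11.4

      loc, scale = gumbel_r.fit(null_max_llr)
      p_gumbel = gumbel_r.sf(observed_llr, loc, scale)      # smoothed tail p
      p_rank = (1 + np.sum(null_max_llr >= observed_llr)) / 1000

      print(f"Gumbel-based p = {p_gumbel:.5f}, rank-based p = {p_rank:.5f}")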

  5. A scan statistic for continuous data based on the normal probability model

    Directory of Open Access Journals (Sweden)

    Huang Lan

    2009-10-01

    Full Text Available Abstract Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.
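
    Assuming the usual form of the normal-model scan likelihood (window-specific means with a common variance), a window's log-likelihood ratio reduces to N/2 times the log of the ratio of the null to the alternative variance estimates, as in this sketch with invented data:

      import numpy as np

      def normal_llr(y, mask):
          """LLR for a candidate cluster `mask` under the normal scan model."""
          N = len(y)
          var0 = np.mean((y - y.mean()) ** 2)                 # null: one mean
          inside, outside = y[mask], y[~mask]
          rss1 = np.sum((inside - inside.mean()) ** 2) + \
                 np.sum((outside - outside.mean()) ** 2)      # alt: two means
          return 0.5 * N * np.log(var0 / (rss1 / N))

      rng = np.random.default_rng(6)
      y = rng.normal(3.0, 1.0, size=200)    # e.g. birth weights (kg), made up
      y[:25] -= 0.6                         # a planted low-mean cluster
      mask = np.zeros(200, dtype=bool)
      mask[:25] = True
      print(f"LLR for the planted window: {normal_llr(y, mask):.1f}")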

  6. Drug safety data mining with a tree-based scan statistic.

    Science.gov (United States)

    Kulldorff, Martin; Dashevsky, Inna; Avery, Taliser R; Chan, Arnold K; Davis, Robert L; Graham, David; Platt, Richard; Andrade, Susan E; Boudreau, Denise; Gunter, Margaret J; Herrinton, Lisa J; Pawloski, Pamala A; Raebel, Marsha A; Roblin, Douglas; Brown, Jeffrey S

    2013-05-01

    In post-marketing drug safety surveillance, data mining can potentially detect rare but serious adverse events. Assessing an entire collection of drug-event pairs is traditionally performed on a predefined level of granularity. It is unknown a priori whether a drug causes a very specific or a set of related adverse events, such as mitral valve disorders, all valve disorders, or different types of heart disease. This methodological paper evaluates the tree-based scan statistic data mining method to enhance drug safety surveillance. We use a three-million-member electronic health records database from the HMO Research Network. Using the tree-based scan statistic, we assess the safety of selected antifungal and diabetes drugs, simultaneously evaluating overlapping diagnosis groups at different granularity levels, adjusting for multiple testing. Expected and observed adverse event counts were adjusted for age, sex, and health plan, producing a log likelihood ratio test statistic. Out of 732 evaluated disease groupings, 24 were statistically significant, divided among 10 non-overlapping disease categories. Five of the 10 signals are known adverse effects, four are likely due to confounding by indication, while one may warrant further investigation. The tree-based scan statistic can be successfully applied as a data mining tool in drug safety surveillance using observational data. The total number of statistical signals was modest and does not imply a causal relationship. Rather, data mining results should be used to generate candidate drug-event pairs for rigorous epidemiological studies to evaluate the individual and comparative safety profiles of drugs. Copyright © 2013 John Wiley & Sons, Ltd.
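
    The scanning step can be sketched on a toy hierarchy (structure assumed from the abstract; the Monte Carlo calibration and the age/sex/plan adjustment are omitted for brevity): every node aggregates the observed and expected counts of its leaves, each node receives a Poisson log-likelihood ratio, and the maximum over nodes is the test statistic.

      import numpy as np

      tree = {  # node -> children; counts live at the leaves
          "heart": ["valve", "rhythm"],
          "valve": ["mitral", "aortic"],
          "rhythm": [], "mitral": [], "aortic": [],
      }
      obs = {"mitral": 9, "aortic": 4, "rhythm": 6}        # observed events
      exp = {"mitral": 4.0, "aortic": 5.0, "rhythm": 6.5}  # expected events

      def totals(node):
          """Observed/expected totals over the leaves under `node`."""
          if not tree[node]:
              return obs[node], exp[node]
          o = e = 0.0
          for child in tree[node]:
              co, ce = totals(child)
              o, e = o + co, e + ce
          return o, e

      def llr(o, e):
          """Poisson log-likelihood ratio for an elevated node."""
          return o * np.log(o / e) - (o - e) if o > e else 0.0

      scores = {node: llr(*totals(node)) for node in tree}
      print(max(scores.items(), key=lambda kv: kv[1]))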

  7. Better Visualisation of Air-borne Laser Scanning for geomorphological and archaeological interpretation

    DEFF Research Database (Denmark)

    Ljungberg, Thomas; Scott, D; Kristiansen, Søren Munch

    Digital elevation models derived from high-precision Air-borne Laser Scanning (ALS or LiDAR) point clouds are becoming increasingly available throughout the world. These elevation models present a very valuable tool for locating and interpreting geomorphological as well as archaeological features…

  8. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  9. Statistical image reconstruction methods for simultaneous emission/transmission PET scans

    International Nuclear Information System (INIS)

    Erdogan, H.; Fessler, J.A.

    1996-01-01

    Transmission scans are necessary for estimating the attenuation correction factors (ACFs) to yield quantitatively accurate PET emission images. To reduce the total scan time, post-injection transmission scans have been proposed in which one can simultaneously acquire emission and transmission data using rod sources and sinogram windowing. However, since the post-injection transmission scans are corrupted by emission coincidences, accurate correction for attenuation becomes more challenging. Conventional methods (emission subtraction) for ACF computation from post-injection scans are suboptimal and require relatively long scan times. We introduce statistical methods based on penalized-likelihood objectives to compute ACFs and then use them to reconstruct lower noise PET emission images from simultaneous transmission/emission scans. Simulations show the efficacy of the proposed methods. These methods improve image quality and SNR of the estimates as compared to conventional methods

  10. A spatial scan statistic for survival data based on Weibull distribution.

    Science.gov (United States)

    Bhatt, Vijaya; Tiwari, Neeraj

    2014-05-20

    The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
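
    One plausible form of the window likelihood (hedged: the paper's exact parameterization may differ) holds the Weibull shape k common and gives each group its own scale λ, with right-censoring indicators δ_i:

      \[
        \ell(\lambda, k) \;=\; \sum_{i} \Big[ \delta_i \big( \ln k + (k-1)\ln t_i - k \ln \lambda \big) - (t_i/\lambda)^k \Big],
        \qquad
        \hat{\lambda} \;=\; \Big( \sum_i t_i^{\,k} \Big/ \sum_i \delta_i \Big)^{1/k},
      \]

    and a window Z is scored by the likelihood ratio

      \[
        \mathrm{LLR}(Z) \;=\; \ell_{\mathrm{in}}(\hat\lambda_{\mathrm{in}}, k)
        + \ell_{\mathrm{out}}(\hat\lambda_{\mathrm{out}}, k)
        - \ell_{\mathrm{all}}(\hat\lambda_{\mathrm{all}}, k),
      \]

    maximised over candidate windows and calibrated by Monte Carlo, as is standard for scan statistics.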

  11. Statistics translated a step-by-step guide to analyzing and interpreting data

    CERN Document Server

    Terrell, Steven R

    2012-01-01

    Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying independent and dependent variables, and selecting and interpreting appropriate statistical tests. All techniques are demonstrated both manually and with the help of SPSS software. The book provides students and others who may need to read and interpret sta

  12. Advanced statistics to improve the physical interpretation of atomization processes

    International Nuclear Information System (INIS)

    Panão, Miguel R.O.; Radu, Lucian

    2013-01-01

    Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are possibly associated with multiple atomization mechanisms. ► The spray is described by the drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely, finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account possible multimodality and heterogeneities in drop size distributions. Therefore, it provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data. This improves the physical interpretation of atomization processes. Moreover, it also overcomes the limitations induced by analyzing spray droplet characteristics through moments alone, which can in particular hide the presence of distinct droplet-formation mechanisms. Finally, the method is applied to physically interpret a case study based on multijet atomization processes

  13. Detection of Clostridium difficile infection clusters, using the temporal scan statistic, in a community hospital in southern Ontario, Canada, 2006-2011.

    Science.gov (United States)

    Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott

    2014-05-12

    In hospitals, Clostridium difficile infection (CDI) surveillance relies on unvalidated guidelines or threshold criteria to identify outbreaks. This can result in false-positive and -negative cluster alarms. The application of statistical methods to identify and understand CDI clusters may be a useful alternative or complement to standard surveillance techniques. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting CDI clusters and determine if there are significant differences in the rate of CDI cases by month, season, and year in a community hospital. Bacteriology reports of patients identified with a CDI from August 2006 to February 2011 were collected. For patients detected with CDI from March 2010 to February 2011, stool specimens were obtained. Clostridium difficile isolates were characterized by ribotyping and investigated for the presence of toxin genes by PCR. CDI clusters were investigated using a retrospective temporal scan test statistic. Statistically significant clusters were compared to known CDI outbreaks within the hospital. A negative binomial regression model was used to identify associations between year, season, month and the rate of CDI cases. Overall, 86 CDI cases were identified. Eighteen specimens were analyzed and nine ribotypes were classified, with ribotype 027 (n = 6) the most prevalent. The temporal scan statistic identified significant CDI clusters at the hospital (n = 5), service (n = 6), and ward (n = 4) levels (P ≤ 0.05). Three clusters were concordant with the one C. difficile outbreak identified by hospital personnel. Two clusters were identified as potential outbreaks. The negative binomial model indicated years 2007-2010 (P ≤ 0.05) had decreased CDI rates compared to 2006 and spring had an increased CDI rate compared to the fall (P = 0.023). Application of the temporal scan statistic identified several clusters, including potential outbreaks not detected by hospital personnel
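
    A retrospective temporal scan of the kind used here can be sketched compactly (all details assumed, not taken from the study): slide every day-window across the series, score it with the usual Poisson log-likelihood ratio against a uniform-rate null, and calibrate the maximum by redistributing the case total uniformly over days.

      import numpy as np

      rng = np.random.default_rng(7)
      days = 182
      counts = rng.poisson(0.2, size=days)
      counts[80:94] += rng.poisson(1.0, size=14)       # a planted cluster

      def max_llr(counts, max_len=30):
          C, T = counts.sum(), len(counts)
          cs = np.concatenate(([0], np.cumsum(counts)))
          best = 0.0
          for i in range(T):
              for j in range(i + 1, min(i + max_len, T) + 1):
                  c, e = cs[j] - cs[i], C * (j - i) / T
                  if e < c < C:
                      llr = (c * np.log(c / e)
                             + (C - c) * np.log((C - c) / (C - e)))
                      best = max(best, llr)
          return best

      obs = max_llr(counts)
      C = counts.sum()
      null = [max_llr(np.bincount(rng.integers(0, days, C), minlength=days))
              for _ in range(99)]
      p = (1 + sum(s >= obs for s in null)) / 100
      print(f"max LLR = {obs:.1f}, Monte Carlo p = {p:.2f}")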

  14. Selection of the Maximum Spatial Cluster Size of the Spatial Scan Statistic by Using the Maximum Clustering Set-Proportion Statistic.

    Science.gov (United States)

    Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong

    2016-01-01

    Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.

  15. Technetium phosphate bone scan in the diagnosis of septic arthritis in childhood

    International Nuclear Information System (INIS)

    Sundberg, S.B.; Savage, J.P.; Foster, B.K.

    1989-01-01

    The technetium phosphate bone scans of 106 children with suspected septic arthritis were reviewed to determine whether the bone scan can accurately differentiate septic from nonseptic arthropathy. Only 13% of children with proved septic arthritis had correct blind scan interpretation. The clinically adjusted interpretation did not identify septic arthritis in 30%. Septic arthritis was incorrectly identified in 32% of children with no evidence of septic arthritis. No statistically significant differences were noted between the scan findings in the septic and nonseptic groups and no scan findings correlated specifically with the presence or absence of joint sepsis

  16. Background Noise Removal in Ultrasonic B-scan Images Using Iterative Statistical Techniques

    NARCIS (Netherlands)

    Wells, I.; Charlton, P. C.; Mosey, S.; Donne, K. E.

    2008-01-01

    The interpretation of ultrasonic B-scan images can be a time-consuming process and its success depends on operator skills and experience. Removal of the image background will potentially improve its quality and hence improve operator diagnosis. An automatic background noise removal algorithm is

  17. Clinical importance of re-interpretation of PET/CT scanning in patients referred to a tertiary care medical centre

    DEFF Research Database (Denmark)

    Löfgren, Johan; Loft, Annika; Barbosa de Lima, Vinicius Araújo

    2017-01-01

    PURPOSE: To evaluate, in a controlled prospective manner with double-blind read, whether there are differences in interpretations of PET/CT scans at our tertiary medical centre, Rigshospitalet, compared to the external hospitals. METHODS: Ninety consecutive patients referred to our department who had an external F-18-FDG PET/CT scan were included. Only information that had been available at the time of the initial reading at the external hospital was available at re-interpretation. Teams with one radiologist and one nuclear medicine physician working side by side performed the re-interpretations…

  18. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  19. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  20. Radionuclide scanning

    International Nuclear Information System (INIS)

    Shapiro, B.

    1986-01-01

    Radionuclide scanning is the production of images of normal and diseased tissues and organs by means of the gamma-ray emissions from radiopharmaceutical agents having specific distributions in the body. The gamma rays are detected at the body surface by a variety of instruments that convert the invisible rays into visible patterns representing the distribution of the radionuclide in the body. The patterns, or images, obtained can be interpreted to provide or to aid diagnoses, to follow the course of disease, and to monitor the management of various illnesses. Scanning is a sensitive technique, but its specificity may be low when interpreted alone. To be used most successfully, radionuclide scanning must be interpreted in conjunction with other techniques, such as bone radiographs with bone scans, chest radiographs with lung scans, and ultrasonic studies with thyroid scans. Interpretation is also enhanced by providing pertinent clinical information because the distribution of radiopharmaceutical agents can be altered by drugs and by various procedures besides physiologic and pathologic conditions. Discussion of the patient with the radionuclide scanning specialist prior to the study and review of the results with that specialist after the study are beneficial

  1. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

    Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM…

  2. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    Science.gov (United States)

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

    A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression…
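
    The permutation test described above can be sketched with a simple Gaussian-kernel smoother standing in for the bivariate LOESS GAM (an explicit simplification, not the authors' method): permute case-control labels over fixed locations and compare the observed deviance gain of the spatial smooth to its permutation null.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 400
      xy = rng.uniform(0, 10, size=(n, 2))
      risk = np.where(np.hypot(xy[:, 0] - 5, xy[:, 1] - 5) < 2, 0.6, 0.4)
      y = rng.binomial(1, risk)                       # 1 = case, 0 = control

      def smooth_fit(xy, y, h=1.5):
          """Gaussian-kernel estimate of P(case) at each location."""
          d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
          w = np.exp(-d2 / (2 * h * h))
          return np.clip(w @ y / w.sum(1), 1e-6, 1 - 1e-6)

      def deviance_gain(xy, y):
          """How much better the spatial smooth fits than a flat mean."""
          p_hat = smooth_fit(xy, y)
          p_bar = np.clip(y.mean(), 1e-6, 1 - 1e-6)
          ll = lambda p: np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
          return 2 * (ll(p_hat) - ll(np.full_like(p_hat, p_bar)))

      obs = deviance_gain(xy, y)
      null = [deviance_gain(xy, rng.permutation(y)) for _ in range(199)]
      p = (1 + sum(s >= obs for s in null)) / 200
      print(f"deviance gain = {obs:.1f}, permutation p = {p:.3f}")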

  3. Brazilian Amazonia Deforestation Detection Using Spatio-Temporal Scan Statistics

    Science.gov (United States)

    Vieira, C. A. O.; Santos, N. T.; Carneiro, A. P. S.; Balieiro, A. A. S.

    2012-07-01

    The spatio-temporal models developed for analyses of diseases can also be used for other fields of study, including concerns about forest and deforestation. The aim of this paper is to quantitatively identify priority areas for combating deforestation in the Amazon forest, using the space-time scan statistic. The study area is located in the south of Amazonas State and covers around 297,183 square kilometres, including the municipalities of Boca do Acre, Labrea, Canutama, Humaita, Manicore, Novo Aripuana and Apui, in the north region of Brazil. This area has shown significant land cover change, which has increased the number of deforestation alerts. This situation is therefore a concern that warrants further investigation, aiming to curb the factors that increase the number of cases in the area. The methodology includes the location and year in which each deforestation alert occurred. These deforestation alerts are mapped by DETER (Detection System of Deforestation in Real Time in Amazonia), which is carried out by the Brazilian Space Agency (INPE). The software SaTScanTM v7.0 was used to apply the space-time permutation scan statistic for detection of deforestation cases. The outcome of this experiment shows an efficient model for detecting space-time clusters of deforestation alerts. The model was efficient in detecting the location, size, order and characteristics of alert activity by the end of the experiments. Two clusters were considered active and remained active up to the end of the study. These clusters are located in Canutama and Lábrea County. This quantitative spatial modelling of deforestation warnings allowed: firstly, identifying active clusters of deforestation, on which environmental government officials can concentrate their actions; secondly, identifying historic clusters of deforestation, which environmental government officials can monitor in order to prevent them from becoming active again; and finally…

  4. BRAZILIAN AMAZONIA DEFORESTATION DETECTION USING SPATIO-TEMPORAL SCAN STATISTICS

    Directory of Open Access Journals (Sweden)

    C. A. O. Vieira

    2012-07-01

    Full Text Available The spatio-temporal models developed for analyses of diseases can also be used for other fields of study, including concerns about forest and deforestation. The aim of this paper is to quantitatively identify priority areas for combating deforestation in the Amazon forest, using the space-time scan statistic. The study area is located in the south of Amazonas State and covers around 297,183 square kilometres, including the municipalities of Boca do Acre, Labrea, Canutama, Humaita, Manicore, Novo Aripuana and Apui, in the north region of Brazil. This area has shown significant land cover change, which has increased the number of deforestation alerts. This situation is therefore a concern that warrants further investigation, aiming to curb the factors that increase the number of cases in the area. The methodology includes the location and year in which each deforestation alert occurred. These deforestation alerts are mapped by DETER (Detection System of Deforestation in Real Time in Amazonia), which is carried out by the Brazilian Space Agency (INPE). The software SaTScanTM v7.0 was used to apply the space-time permutation scan statistic for detection of deforestation cases. The outcome of this experiment shows an efficient model for detecting space-time clusters of deforestation alerts. The model was efficient in detecting the location, size, order and characteristics of alert activity by the end of the experiments. Two clusters were considered active and remained active up to the end of the study. These clusters are located in Canutama and Lábrea County. This quantitative spatial modelling of deforestation warnings allowed: firstly, identifying active clusters of deforestation, on which environmental government officials can concentrate their actions; secondly, identifying historic clusters of deforestation, which environmental government officials can monitor in order to prevent them from becoming active again; and finally…
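
    For reference, the space-time permutation scan statistic applied to the DETER alerts has a standard form (Kulldorff's, stated here from general knowledge rather than from the paper): with c_zd alerts in zone z on day d and C alerts in total, a space-time cylinder A is evaluated as

      \[
        \mu_A \;=\; \frac{1}{C} \sum_{(z,d) \in A} \Big( \sum_{d'} c_{zd'} \Big) \Big( \sum_{z'} c_{z'd} \Big),
        \qquad
        \mathrm{LLR}(A) \;=\; c_A \ln \frac{c_A}{\mu_A} + (C - c_A) \ln \frac{C - c_A}{C - \mu_A},
      \]

    with the maximum over cylinders calibrated by Monte Carlo permutation of the alert dates, so that only the space-time interaction, not purely spatial or purely temporal trends, drives a signal.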

  5. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.

    Science.gov (United States)

    Farrell, Mary Beth

    2018-06-01

    This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic, the higher the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around the mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being…
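
    A worked example of the Cohen κ-statistic described above, using a made-up 2 × 2 table of two readers' positive/negative calls:

      import numpy as np

      # rows: reader 1 (pos, neg); columns: reader 2 (pos, neg)
      table = np.array([[45, 5],
                        [8, 42]], dtype=float)
      n = table.sum()

      p_observed = np.trace(table) / n                        # raw % agreement
      p_chance = (table.sum(1) * table.sum(0)).sum() / n**2   # expected by chance
      kappa = (p_observed - p_chance) / (1 - p_chance)

      print(f"agreement = {p_observed:.2f}, chance = {p_chance:.2f}, "
            f"kappa = {kappa:.2f}")

    Here raw agreement is 0.87, but chance agreement alone would give 0.50, so κ ≈ 0.74, which is why κ is the more robust summary.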

  6. Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data.

    Science.gov (United States)

    Kim, Sehwi; Jung, Inkyung

    2017-01-01

    The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed to use the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adapt the Gini coefficient to the spatial scan statistic for ordinal data in order to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes, with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns.
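
    The Gini coefficient in this context is derived from a Lorenz curve over the reported clusters, comparing cumulative observed cases against cumulative expected cases. The Python sketch below illustrates that general idea with invented cluster counts; the exact construction used by Han et al. and by the ordinal adaptation in this paper may differ in detail:

        import numpy as np

        def gini_coefficient(observed, expected):
            # clusters ordered by observed/expected ratio; Lorenz curve of
            # cumulative observed vs. cumulative expected cases; larger values
            # suggest a more heterogeneous, informative cluster collection
            observed = np.asarray(observed, float)
            expected = np.asarray(expected, float)
            order = np.argsort(observed / expected)          # ascending risk
            x = np.concatenate([[0.0], np.cumsum(expected[order]) / expected.sum()])
            y = np.concatenate([[0.0], np.cumsum(observed[order]) / observed.sum()])
            return 1.0 - 2.0 * np.trapz(y, x)                # area-based Gini

        # hypothetical clusters reported at one maximum-size setting
        print(gini_coefficient(observed=[40, 25, 10], expected=[20, 20, 15]))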

  7. The uterine blush. A potential false-positive in Meckel's scan interpretation

    International Nuclear Information System (INIS)

    Fink-Bennett, D.

    1982-01-01

    To determine the presence, prevalence, and clinical importance of 99mTc pertechnetate uterine uptake, this retrospective analysis of 71 Meckel's scans was undertaken. Specifically, each study was evaluated for the presence of a focal accumulation of radiotracer cephalad to the bladder. Patients received an intravenous dose of 150 microCi/kg of 99mTc pertechnetate. Each study consisted of 15 one-minute anterior serial gamma camera images, and a 15, 30, and 60 minute anterior, right lateral and posterior scintiscan. Menstrual histories were obtained from all patients except two. No males (33/33), nor premenstrual (13/13), menopausal (4/4) or posthysterectomy (2/2) patients revealed a uterine blush. Eleven of 15 patients (73%) with regular menses demonstrated a uterine blush. They were in the menstrual or secretory phases of their cycle. Four demonstrated no uterine uptake, had regular periods, but were in the proliferative phase of their cycle. Two with irregular periods, and one with no recorded menstrual history, manifested the blush. Radiotracer should be expected in the uterus during the menstrual and secretory phases of the menstrual cycle. It is a manifestation of a normal physiologic phenomenon, and must be recognized to prevent false-positive Meckel's scan interpretations

  8. A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.

    Science.gov (United States)

    Read, S; Bath, P A; Willett, P; Maheswaran, R

    2013-08-30

    The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
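
    The essence of the Gumbel method is easy to sketch: rather than ranking the observed maximum log-likelihood ratio among the Monte Carlo maxima, a Gumbel distribution is fitted to those maxima and the p-value is read from the fitted tail, which also allows p-values below the Monte Carlo resolution of 1/(R+1). In the minimal Python sketch below, the null replicates are simulated directly from a Gumbel distribution purely for illustration:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # stand-in for the max log-likelihood ratio from each of 999 Monte Carlo
        # replicates generated under the null (SaTScan produces these internally)
        null_max_llr = rng.gumbel(loc=7.0, scale=1.2, size=999)
        observed_llr = 13.5

        # standard Monte Carlo p-value: resolution limited to 1/(999 + 1)
        p_mc = (1 + np.sum(null_max_llr >= observed_llr)) / (len(null_max_llr) + 1)

        # Gumbel approximation: fit the replicates and read p from the fitted tail
        loc, scale = stats.gumbel_r.fit(null_max_llr)
        p_gumbel = stats.gumbel_r.sf(observed_llr, loc, scale)

        print(f"Monte Carlo p = {p_mc:.4f}, Gumbel p = {p_gumbel:.2e}")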

  9. Identifying clusters of active transportation using spatial scan statistics.

    Science.gov (United States)

    Huang, Lan; Stinchcomb, David G; Pickle, Linda W; Dill, Jennifer; Berrigan, David

    2009-08-01

    There is an intense interest in the possibility that neighborhood characteristics influence active transportation such as walking or biking. The purpose of this paper is to illustrate how a spatial cluster identification method can evaluate the geographic variation of active transportation and identify neighborhoods with unusually high/low levels of active transportation. Self-reported walking/biking prevalence, demographic characteristics, street connectivity variables, and neighborhood socioeconomic data were collected from respondents to the 2001 California Health Interview Survey (CHIS; N=10,688) in Los Angeles County (LAC) and San Diego County (SDC). Spatial scan statistics were used to identify clusters of high or low prevalence (with and without age-adjustment) and the quantity of time spent walking and biking. The data, a subset from the 2001 CHIS, were analyzed in 2007-2008. Geographic clusters of significantly high or low prevalence of walking and biking were detected in LAC and SDC. Structural variables such as street connectivity and shorter block lengths are consistently associated with higher levels of active transportation, but associations between active transportation and socioeconomic variables at the individual and neighborhood levels are mixed. Only one cluster with less time spent walking and biking among walkers/bikers was detected in LAC, and this was of borderline significance. Age-adjustment affects the clustering pattern of walking/biking prevalence in LAC, but not in SDC. The use of spatial scan statistics to identify significant clustering of health behaviors such as active transportation adds to the more traditional regression analysis that examines associations between behavior and environmental factors by identifying specific geographic areas with unusual levels of the behavior independent of predefined administrative units.

  10. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    Science.gov (United States)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations exhibiting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy-associated states.
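
    For reference, the classic empirical Menzerath-Altmann relation between construct size x and mean constituent size y is y(x) = a*x^b*exp(-c*x); the paper derives a transformed (generalized) form of this model. The Python sketch below fits the classic form to invented data using scipy's curve_fit:

        import numpy as np
        from scipy.optimize import curve_fit

        def menzerath_altmann(x, a, b, c):
            # classic empirical form: y = a * x**b * exp(-c * x)
            return a * x**b * np.exp(-c * x)

        # hypothetical data: construct size x (e.g., words per sentence) vs.
        # mean constituent size y (e.g., syllables per word)
        x = np.arange(1, 11, dtype=float)
        rng = np.random.default_rng(1)
        y = menzerath_altmann(x, a=3.0, b=-0.25, c=0.05) + 0.02 * rng.normal(size=10)

        params, _ = curve_fit(menzerath_altmann, x, y, p0=(1.0, -0.1, 0.01))
        print("fitted a, b, c:", np.round(params, 3))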

  11. Bohm's mysterious 'quantum force' and 'active information': alternative interpretation and statistical properties

    International Nuclear Information System (INIS)

    Lan, B.L.

    2001-01-01

    An alternative interpretation to Bohm's 'quantum force' and 'active information' is proposed. Numerical evidence is presented, which suggests that the time series of Bohm's 'quantum force' evaluated at the Bohmian position for non-stationary quantum states are typically non-Gaussian stable distributed with a flat power spectrum in classically chaotic Hamiltonian systems. An important implication of these statistical properties is briefly mentioned. (orig.)

  12. Alternative interpretations of statistics on health effects of low-level radiation

    International Nuclear Information System (INIS)

    Hamilton, L.D.

    1983-01-01

    Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered

  13. Statistical transformation and the interpretation of inpatient glucose control data.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
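
    Both ingredients of this analysis are simple to sketch: a Box-Cox transformation pulls the skewed glucose distribution toward normality, and an EWMA chart flags samples whose smoothed value leaves the control limits. The Python sketch below uses simulated glucose values; in the study the control limits were established from separate data subsets rather than from the monitored series itself:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # hypothetical right-skewed POC glucose values (mg/dL)
        glucose = rng.lognormal(mean=np.log(140), sigma=0.3, size=200)

        # Box-Cox transformation; lambda is estimated by maximum likelihood
        transformed, lam = stats.boxcox(glucose)
        print(f"estimated Box-Cox lambda = {lam:.2f}")

        def ewma_chart(x, w=0.2, L=3.0):
            # EWMA statistic z_i = w*x_i + (1-w)*z_{i-1}, with time-varying
            # limits mu +/- L*sigma*sqrt(w/(2-w)*(1-(1-w)^(2i)))
            mu, sigma = x.mean(), x.std(ddof=1)
            prev, flags = mu, []
            for i, xi in enumerate(x):
                prev = w * xi + (1 - w) * prev
                half = L * sigma * np.sqrt(w / (2 - w) * (1 - (1 - w) ** (2 * (i + 1))))
                if abs(prev - mu) > half:
                    flags.append(i)
            return flags

        flags = ewma_chart(transformed)
        print("first out-of-control sample:", flags[0] if flags else "none")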

  14. A spatial scan statistic for compound Poisson data.

    Science.gov (United States)

    Rosychuk, Rhonda J; Chang, Hsing-Ming

    2013-12-20

    The topic of spatial cluster detection gained attention in statistics during the late 1980s and early 1990s. Effort has been devoted to the development of methods for detecting spatial clustering of cases and events in the biological sciences, astronomy and epidemiology. More recently, research has examined detecting clusters of correlated count data associated with health conditions of individuals. Such a method allows researchers to examine spatial relationships of disease-related events rather than just incident or prevalent cases. We introduce a spatial scan test that identifies clusters of events in a study region. Because an individual case may have multiple (repeated) events, we base the test on a compound Poisson model. We illustrate our method for cluster detection on emergency department visits, where individuals may make multiple disease-related visits. Copyright © 2013 John Wiley & Sons, Ltd.

  15. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces growing interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), the exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
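
    The CPI calculation itself is straightforward: at each locus, sum the population frequencies of all alleles detected in the mixture, square that sum (the probability that both alleles of a random person are among those detected), and multiply across loci; CPE is its complement. A minimal Python sketch with invented allele frequencies:

        def combined_probability_of_inclusion(locus_allele_freqs):
            # CPI: product over loci of the squared sum of population
            # frequencies of all alleles detected at that locus
            cpi = 1.0
            for freqs in locus_allele_freqs:
                p = sum(freqs)          # chance a random allele is included
                cpi *= p ** 2           # both alleles of a random person included
            return cpi

        # hypothetical allele frequencies observed at three loci
        loci = [[0.12, 0.08, 0.20], [0.05, 0.15], [0.10, 0.22, 0.07]]
        cpi = combined_probability_of_inclusion(loci)
        print(f"CPI = {cpi:.4e}, CPE = {1 - cpi:.4f}")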

  16. Effect of clinical information in brain CT scan interpretation : a blinded double crossover study

    International Nuclear Information System (INIS)

    Zhianpour, M.; Janghorbani, M.

    2004-01-01

    Errors and variations in interpretation can happen in clinical imaging. Few studies have examined the biasing effect of clinical information on the reporting of brain CT scans. In a blinded double crossover design, we studied whether three radiologists were biased by clinical information when making CT diagnoses of the brain. Three consultant radiologists assessed 100 consecutive brain CT scans in three rounds separated by at least a one-month interval. In the first round, clinical information was not available, and the 100 films were given to the radiologists without clinical information. In the second round, the same 100 films were given and true clinical information was available. In the third round, the same 100 films were given and false clinical information was allocated. In 180 cases (60%) the evaluation resulted in the same diagnosis on all three occasions (95% confidence interval (CI): 54.5, 65.5), whereas 120 (40%; 95% CI: 34.5, 45.5) sets were evaluated differently. 48 cases (16%; 95% CI: 11.9, 20.1) had discordant evaluation with true clinical information and 33 (11%; 95% CI: 7.5, 14.5) with false clinical information. Discordance between readings without clinical information and those with true and false clinical information occurred in 39 cases (13%; 95% CI: 9.2, 16.8). Correct clinical information improved the brain CT report, while the report became less accurate when false clinical information was allocated. These results indicate that radiologists are biased by clinical information when reporting brain CT scans.
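
    The confidence intervals quoted above are consistent with the normal approximation for a proportion applied to 300 film-reader evaluations (100 films read by three radiologists); the Python check below, under that assumed denominator, reproduces the reported interval for the 60% concordance figure:

        import math

        def proportion_ci(successes, n, z=1.96):
            # normal-approximation 95% confidence interval for a proportion
            p = successes / n
            half_width = z * math.sqrt(p * (1 - p) / n)
            return p - half_width, p + half_width

        # 180 concordant evaluations out of an assumed 300 film-reader pairs
        lo, hi = proportion_ci(180, 300)
        print(f"60% (95% CI: {lo:.1%}, {hi:.1%})")   # ~ (54.5%, 65.5%)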

  17. Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data

    Science.gov (United States)

    Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.

    2018-01-01

    One of the foundations for high school students in statistics is to be able to read data and to present data in the form of tables and diagrams along with their interpretation. The purpose of this study is to describe high school students' competencies in reading, interpreting and presenting data. The subjects consisted of male and female students with high levels of mathematical ability. Data were collected in the form of task formulations and analyzed by reducing, presenting and verifying the data. Results showed that the students read the data based on explicit explanations in the diagram, such as explaining the points in the diagram as the relation between the x and y axes and determining the simple trend of a graph, including the maximum and minimum points. In interpreting and summarizing the data, both subjects paid attention to general data trends and used them to predict increases or decreases in the data. The male student estimated the (n+1)th value of the weight data by using the mode of the data, while the female student estimated the weight by using the average. The male student tended not to consider the characteristics of the data, while the female student considered the characteristics of the data more carefully.

  18. Misuse of statistics in the interpretation of data on low-level radiation

    International Nuclear Information System (INIS)

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds

  20. Variability in the interpretation of DMSA scintigraphy after urine infection

    International Nuclear Information System (INIS)

    Craig, J.; Howman-Giles, R.; Uren, R.; Irwig, L.; Bernard, E.; Knight, J.; Sureshkumar, P.; Roy, L.P.

    1997-01-01

    Full text: This study investigated the extent of, and potential reasons for, interpretation disagreement on 99mTc-DMSA scans after urine infection in children. Methods: 441 scans were selected from children with a first urine infection (UTI) from 1993-1995. 294 scans were performed at a median time of seven days after UTI, and 147 in children free from infection over one year of follow-up. Two nuclear medicine physicians independently reported according to whether renal abnormality was present or absent and used the four-level grading system described by Goldraich: grade 1, no more than two cortical defects; grade 2, more than two defects; grade 3, diffuse reduction in uptake with or without defects; grade 4, shrunken kidney with <10% function. The indices used for variability were the percentage of agreement and the kappa statistic, expressed as a percentage. For the grading scale used, both measures were weighted with integers representing the number of categories from perfect agreement. Disagreement was analysed for children, kidneys and kidney zones. Results: There was agreement in 86 per cent (kappa 69%) for the normal-abnormal DMSA scan dichotomy; the weighted agreement was 94 per cent (kappa 82%) for the grading scale. Disagreement of DMSA scan interpretation by two or more grades was present in three cases (0.7%). The same level of agreement was present for the patient, kidney and kidney-zone comparisons. Agreement was not influenced by age or the timing of scintigraphy after urine infection. Conclusion: Two experienced physicians showed good agreement in the interpretation of DMSA scintigraphy in children after urine infection, using the grading system of Goldraich.

  2. The use of the temporal scan statistic to detect methicillin-resistant Staphylococcus aureus clusters in a community hospital.

    Science.gov (United States)

    Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott

    2014-07-08

    In healthcare facilities, conventional surveillance techniques using rule-based guidelines may result in under- or over-reporting of methicillin-resistant Staphylococcus aureus (MRSA) outbreaks, as these guidelines are generally unvalidated. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting MRSA clusters, to validate clusters using molecular techniques and hospital records, and to determine significant differences in the rate of MRSA cases using regression models. Patients admitted to a community hospital between August 2006 and February 2011, and identified with MRSA more than 48 hours following hospital admission, were included in this study. Between March 2010 and February 2011, MRSA specimens were obtained for spa typing. MRSA clusters were investigated using a retrospective temporal scan statistic. Tests were conducted on a monthly scale and significant clusters were compared to MRSA outbreaks identified by hospital personnel. Associations between the rate of MRSA cases and the variables year, month, and season were investigated using a negative binomial regression model. During the study period, 735 MRSA cases were identified and 167 MRSA isolates were spa typed. Nine different spa types were identified, with spa type 2/t002 (88.6%) the most prevalent. The temporal scan statistic identified significant MRSA clusters at the hospital (n=2), service (n=16), and ward (n=10) levels (P ≤ 0.05). Seven clusters were concordant with nine MRSA outbreaks identified by hospital staff. For the remaining clusters, seven events may have been equivalent to true outbreaks and six clusters demonstrated possible transmission events. The regression analysis indicated that the years 2009-2011, compared to 2006, and the months March and April, compared to January, were associated with an increase in the rate of MRSA cases (P ≤ 0.05). The application of the temporal scan statistic identified several MRSA clusters that were not detected by hospital personnel.
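
    The rate comparison described above can be sketched as a negative binomial GLM. The Python example below uses statsmodels on invented monthly counts with year and month as categorical predictors; the study's actual model, and any exposure offset such as patient-days, may differ:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        months = pd.period_range("2006-08", "2011-02", freq="M")
        df = pd.DataFrame({
            "cases": rng.negative_binomial(5, 0.4, size=len(months)),
            "year": months.year.astype(str),
            "month": months.strftime("%b"),
        })

        # MRSA case counts modelled with year and month as categorical predictors
        X = pd.get_dummies(df[["year", "month"]], drop_first=True).astype(float)
        X = sm.add_constant(X)
        result = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
        print(result.summary().tables[1])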

  3. Kidney Scanning with Hippuran: A Necessary Complement for Correct Interpretation of Renography in the Transplanted Kidney

    Energy Technology Data Exchange (ETDEWEB)

    Lubin, E.; Lewitus, Z.; Rosenfeld, J.; Levi, M. [Beilinson Medical Centre, University of Tel Aviv School of Medicine (Israel)

    1969-05-15

    Every impairment of the kidneys' blood supply, or of the production or excretion of urine, is reflected in abnormal renograms, but these are not specific enough in the information they provide. The subsequent performance of renal scans with Hippuran 131I adds information on the topographical distribution of Hippuran 131I in the renal parenchyma and on the dynamics of its transport through the urinary system. The information thus obtained is valuable in itself and is necessary for the correct interpretation of renography. We have found this especially useful in the follow-up of transplanted kidneys. In cases of anuria following renal transplantation, the renogram is useful for indicating that renal circulation is present in the transplanted kidney, but it is inadequate as a method to differentiate between other renal and post-renal causes of anuria. The anuric transplanted kidney should be scanned thirty minutes after the injection of Hippuran 131I. Patterns of complementary results of renograms and renal scans are presented that correspond to prerenal, renal and postrenal causes of anuria, with all the important therapeutic implications this differential diagnosis has. (author)

  4. Localized Smart-Interpretation

    Science.gov (United States)

    Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom

    2014-05-01

    The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys is unexploited, which is a problem, because the resulting geological model does not fulfil its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as, for example, the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. When a model f

  5. Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data

    Science.gov (United States)

    Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.

    2017-09-01

    The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enable the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.

  6. An extended model of electrons: experimental evidence from high-resolution scanning tunneling microscopy

    International Nuclear Information System (INIS)

    Hofer, Werner A

    2012-01-01

    In a recent paper we introduced a model of extended electrons, which is fully compatible with quantum mechanics in the formulation of Schrödinger. However, it contradicts the current interpretation of electrons as point particles. Here, we show by a statistical analysis of high-resolution scanning tunneling microscopy (STM) experiments that the interpretation of electrons as point particles and, consequently, the interpretation of the density of electron charge as a statistical quantity, will lead to a conflict with the Heisenberg uncertainty principle. Given the precision in these experiments, we find that the uncertainty principle would be violated by close to two orders of magnitude if this interpretation were correct. We are thus forced to conclude that the density of electron charge is a physically real, i.e. in principle precisely measurable, quantity, as derived in a recent paper. Experimental evidence to the contrary, in particular high-energy scattering experiments, is briefly discussed. The finding is expected to have wide implications in condensed matter physics, chemistry, and biology, scientific disciplines which are based on the properties and interactions of electrons.

  7. Statistical analysis of time-resolved emission from ensembles of semiconductor quantum dots: interpretation of exponential decay models

    NARCIS (Netherlands)

    van Driel, A.F.; Nikolaev, I.; Vergeer, P.; Lodahl, P.; Vanmaekelbergh, D.; Vos, Willem L.

    2007-01-01

    We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters and the

  8. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. Through a simulation on independent benchmark data, it is indicated that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise, Kulldorff's statistics are superior.
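
    As a toy analogue of the procedure, the Python sketch below scores candidate windows by their null log-likelihood under the hypergeometric model, takes the most extreme window as the test statistic, and obtains significance by Monte Carlo redistribution of the cases; the one-dimensional windows and all data are invented, and the authors' actual construction may differ in detail:

        import numpy as np
        from scipy.stats import hypergeom

        def window_score(cases_in, n_in, total_cases, total_n):
            # null log-likelihood of the window count under the hypergeometric
            # model: total_cases distributed at random among total_n individuals
            return hypergeom.logpmf(cases_in, total_n, total_cases, n_in)

        def scan(cases, pops, windows):
            C, N = cases.sum(), pops.sum()
            # the most unlikely window defines the test statistic
            return min(window_score(cases[w].sum(), pops[w].sum(), C, N) for w in windows)

        rng = np.random.default_rng(5)
        pops = rng.integers(200, 1000, size=30)               # population per area
        cases = rng.binomial(pops, 0.01)                      # cases per area
        windows = [list(range(i, i + 5)) for i in range(26)]  # toy scanning windows

        t_obs = scan(cases, pops, windows)
        # Monte Carlo significance: redistribute the same cases at random
        t_null = [scan(rng.multivariate_hypergeometric(pops, cases.sum()), pops, windows)
                  for _ in range(999)]
        p = (1 + sum(t <= t_obs for t in t_null)) / (999 + 1)
        print(f"Monte Carlo p = {p:.3f}")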

  9. Liver-lung scan in the diagnosis of right subphrenic abscess

    International Nuclear Information System (INIS)

    Middleton, H.M. III; Patton, D.D.; Hoyumpa, A.M. Jr.; Schenker, S.

    1976-01-01

    To assess the value of liver-lung scanning in the diagnosis of right subphrenic abscess, 148 scans were reviewed against the corresponding charts. Of 91 scans with adequate clinical data, the overall scanning error was 19.3 percent, with 14 false positive and 3 false negative scans. Among 49 scans (of the initial group of 91 studies) with the presence or absence of actual pathology proved by surgery and/or autopsy, there were 3 true positive, 12 false positive, 29 true negative, and 3 false negative scans. Analysis of the data indicated a lower accuracy of scan interpretation than generally reported, low specificity for positive scans and high specificity for negative scans, correlation of false interpretations with atypical degrees of liver-lung separation and with scanning defects in liver and lung, and failure of rereading to significantly improve the accuracy of interpretation
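
    From the verified counts quoted above (3 true positives, 12 false positives, 29 true negatives, 3 false negatives; note that these sum to 47 rather than 49), the basic accuracy measures follow directly, as the short Python check below shows:

        def diagnostic_accuracy(tp, fp, tn, fn):
            # standard measures from a 2x2 verification table
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "accuracy": (tp + tn) / (tp + fp + tn + fn),
            }

        # counts reported for the surgery/autopsy-verified scans
        print(diagnostic_accuracy(tp=3, fp=12, tn=29, fn=3))
        # -> sensitivity 0.50, specificity ~0.71, overall accuracy ~0.68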

  10. Statistical interpretation of geochemical data

    International Nuclear Information System (INIS)

    Carambula, M.

    1990-01-01

    Statistical results were obtained from a geochemical survey covering the following four aerial photograph sheets: Zapican, Carape, Las Canias and Alferez. In total, 3020 samples were studied and analyzed for 22 chemical elements using plasma emission spectrometry methods.

  11. Geographic prediction of tuberculosis clusters in Fukuoka, Japan, using the space-time scan statistic

    Energy Technology Data Exchange (ETDEWEB)

    Daisuke Onozuka; Akihito Hagihara [Fukuoka Institute of Health and Environmental Sciences, Fukuoka (Japan). Department of Information Science

    2007-07-01

    Tuberculosis (TB) has reemerged as a global public health epidemic in recent years. Although evaluating local disease clusters leads to effective prevention and control of TB, there are few, if any, spatiotemporal comparisons for epidemic diseases. TB cases among residents in Fukuoka Prefecture between 1999 and 2004 (n = 9,119) were geocoded at the census tract level (n = 109) based on residence at the time of diagnosis. The spatial and space-time scan statistics were then used to identify clusters of census tracts with elevated proportions of TB cases. In the purely spatial analyses, the most likely clusters were in the Chikuho coal mining area (in 1999, 2002, 2003, 2004), the Kita-Kyushu industrial area (in 2000), and the Fukuoka urban area (in 2001). In the space-time analysis, the most likely cluster was the Kita-Kyushu industrial area (in 2000). The north part of Fukuoka Prefecture was the most likely to have a cluster with a significantly high occurrence of TB. The spatial and space-time scan statistics are effective ways of describing circular disease clusters. Since, in reality, infectious diseases might form other cluster types, the effectiveness of the method may be limited under actual practice. The sophistication of the analytical methodology, however, is a topic for future study. 48 refs., 3 figs., 3 tabs.

  13. Interpretation of computed tomographic images

    International Nuclear Information System (INIS)

    Stickle, R.L.; Hathcock, J.T.

    1993-01-01

    This article discusses the production of optimal CT images in small animal patients as well as principles of radiographic interpretation. Technical factors affecting image quality and aiding image interpretation are included. Specific considerations for scanning various anatomic areas are given, including indications and potential pitfalls. Principles of radiographic interpretation are discussed. Selected patient images are illustrated

  14. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) the classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
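
    Of the classical tools named above, the Kaplan-Meier estimator is the simplest to make concrete. The Python sketch below implements it from first principles on invented follow-up data; the study itself applied it, together with a Cox model, to real screening data:

        import numpy as np

        def kaplan_meier(times, events):
            # S(t) = prod over event times t_i <= t of (1 - d_i / n_i), where
            # d_i events occur at t_i among the n_i subjects still at risk
            times, events = np.asarray(times), np.asarray(events)
            s, curve = 1.0, []
            for t in np.unique(times):
                d = np.sum((times == t) & (events == 1))   # events at time t
                n = np.sum(times >= t)                     # still at risk at t
                if d > 0:
                    s *= 1 - d / n
                curve.append((t, s))
            return curve

        # hypothetical follow-up times (months) and event indicators (1 = event)
        print(kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1]))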

  15. Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.

    Science.gov (United States)

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro

    2010-01-01

    This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one module and two angles) that were analyzed by spherical statistics. A case study is presented and discussed in detail. Errors were calculated using 53 check points (CP), whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) using 3D graphics and numerical spherical statistics. Two packages in the R programming language were written to produce the graphics automatically. The results indicated that the proposed method is advantageous, as it offers a more complete analysis of the positional accuracy, including the angular error component, the uniformity of the vector distribution and error isotropy, in addition to the modular error component given by linear statistics.
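
    The decomposition underlying the paper is easy to illustrate: each 3D error vector is split into a module and two angles, and the spread of error directions is summarized by the mean resultant length of the unit vectors. The Python sketch below uses simulated errors at 53 check points (the paper's own implementation is in R):

        import numpy as np

        rng = np.random.default_rng(11)
        err = rng.normal(0, 0.005, size=(53, 3))   # simulated 3D errors (metres)

        # modular component (vector length) and two angular components
        module = np.linalg.norm(err, axis=1)
        theta = np.arccos(err[:, 2] / module)      # inclination from the z-axis
        phi = np.arctan2(err[:, 1], err[:, 0])     # azimuth in the xy-plane

        # mean resultant length R in [0, 1]: R near 0 means directions spread
        # almost uniformly (no systematic bias); R near 1 means a shared direction
        unit = err / module[:, None]
        R = np.linalg.norm(unit.mean(axis=0))
        print(f"mean module = {module.mean():.4f} m, mean inclination = {theta.mean():.2f} rad")
        print(f"azimuth range = ({phi.min():.2f}, {phi.max():.2f}) rad, R = {R:.3f}")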

  16. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    Science.gov (United States)

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.

  17. Technical considerations on scanning and image analysis for amyloid PET in dementia

    International Nuclear Information System (INIS)

    Akamatsu, Go; Ohnishi, Akihito; Aita, Kazuki; Ikari, Yasuhiko; Senda, Michio; Yamamoto, Yasuji

    2017-01-01

    Brain imaging techniques, such as computed tomography (CT), magnetic resonance imaging (MRI), single photon emission computed tomography (SPECT), and positron emission tomography (PET), can provide essential and objective information for the early and differential diagnosis of dementia. Amyloid PET is especially useful to evaluate the amyloid-β pathological process as a biomarker of Alzheimer's disease. This article reviews critical points about technical considerations on the scanning and image analysis methods for amyloid PET. Each amyloid PET agent has its own proper administration instructions and recommended uptake time, scan duration, and the method of image display and interpretation. In addition, we have introduced general scanning information, including subject positioning, reconstruction parameters, and quantitative and statistical image analysis. We believe that this article could make amyloid PET a more reliable tool in clinical study and practice. (author)

  19. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Science.gov (United States)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
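
    The "abnormal points or patterns" such a system must recognize are typically codified as run rules, for example the Western Electric rules. The Python sketch below implements two of the classic rules on invented data to illustrate the pattern-detection task that the expert system automates:

        import numpy as np

        def western_electric_flags(x, mu, sigma):
            # two classic rules: a point beyond 3 sigma, and 8 consecutive
            # points on the same side of the centre line
            z = (np.asarray(x) - mu) / sigma
            flags = []
            for i in range(len(z)):
                if abs(z[i]) > 3:
                    flags.append((i, "beyond 3 sigma"))
                if i >= 7 and (np.all(z[i-7:i+1] > 0) or np.all(z[i-7:i+1] < 0)):
                    flags.append((i, "8 points one side"))
            return flags

        rng = np.random.default_rng(2)
        data = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.5, 1, 15)])
        print(western_electric_flags(data, mu=0.0, sigma=1.0))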

  20. Drug Adverse Event Detection in Health Plan Data Using the Gamma Poisson Shrinker and Comparison to the Tree-based Scan Statistic

    Directory of Open Access Journals (Sweden)

    David Smith

    2013-03-01

    Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied in spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method, the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely-related signals of overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key method implementation decisions relate to defining exposures and outcomes and informed choice of signaling thresholds.

  1. Hotspot detection using space-time scan statistics on children under five years of age in Depok

    Science.gov (United States)

    Verdiana, Miranti; Widyaningsih, Yekti

    2017-03-01

    Among the problems affecting the health level in Depok are high malnutrition rates from year to year and the increasing spread of infectious and non-communicable diseases in some areas. Children under five years old are a vulnerable part of the population with respect to malnutrition and diseases. For this reason, it is important to observe the location and time, where and when, malnutrition in Depok occurred with high intensity. To obtain the location and time of the hotspots of malnutrition and of the diseases that attack children under five years old, the space-time scan statistic method can be used. The space-time scan statistic is a hotspot detection method in which area and time information are taken into account simultaneously when detecting hotspots. This method detects a hotspot with a cylindrical scanning window: the cylinder base describes the area, and the height of the cylinder describes the time. Each cylinder formed is a candidate hotspot, which requires hypothesis testing to decide whether it can be declared a hotspot. Hotspot detection in this study was carried out by forming combinations of several variables. Some combinations of variables provide hotspot detection results that tend to be the same, and thus form groups (clusters). For the health level of children under five in Depok, the Beji health care center region in 2011-2012 is a hotspot. Across the combinations of variables used in the detection of hotspots, Beji health care center appeared most frequently as a hotspot. It is hoped that the local government can adopt the right policies to improve the health level of children under five in the city of Depok.

  2. Scanning tunneling microscopy III theory of STM and related scanning probe methods

    CERN Document Server

    Güntherodt, Hans-Joachim

    1996-01-01

    Scanning Tunneling Microscopy III provides a unique introduction to the theoretical foundations of scanning tunneling microscopy and related scanning probe methods. The different theoretical concepts developed in the past are outlined, and the implications of the theoretical results for the interpretation of experimental data are discussed in detail. Therefore, this book serves as a most useful guide for experimentalists as well as for theoreticians working in the field of local probe methods. In this second edition the text has been updated and new methods are discussed.

  3. Dengue hemorrhagic fever and typhoid fever association based on spatial standpoint using scan statistics in DKI Jakarta

    Science.gov (United States)

    Hervind, Widyaningsih, Y.

    2017-07-01

    Concurrent infection with multiple infectious agents may occur in one patient, and it appears frequently with dengue hemorrhagic fever (DHF) and typhoid fever. This paper depicts the association between DHF and typhoid from a spatial point of view. Given the paucity of data on dengue and typhoid co-infection, the data used are the numbers of patients with those diseases in every district (kecamatan) in Jakarta in 2014 and 2015, obtained from the Jakarta surveillance website. The Poisson spatial scan statistic is used to detect DHF and typhoid hotspot districts in Jakarta separately. After obtaining the hotspots, Fisher's exact test is applied to validate the association between the hotspots of the two diseases. The results show that hotspots of DHF and typhoid are located around central Jakarta. A further analysis used the Poisson space-time scan statistic to reveal hotspots in both space and time. DHF and typhoid fever are most likely to occur from January until May, in areas relatively similar to those of the purely spatial result. Preventive action could be taken especially in the hotspot areas, and further study is required to investigate the causes based on the characteristics of the hotspot areas.
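
    The validation step described above reduces to Fisher's exact test on a 2x2 table of districts cross-classified by hotspot membership for the two diseases. A minimal Python sketch with an invented table (the paper's actual counts are not reproduced here):

        from scipy.stats import fisher_exact

        # hypothetical cross-tabulation of Jakarta districts:
        # rows = in a DHF hotspot (yes/no), cols = in a typhoid hotspot (yes/no)
        table = [[9, 3],
                 [4, 28]]
        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")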

  4. FDG-PET scan in assessing lymphomas and the application of Deauville Criteria

    International Nuclear Information System (INIS)

    Awan, U.E.K.; Siddiqui, N.; Muzaffar, N.; Farooqui, Z.S.

    2013-01-01

    To evaluate the role of Fluorine-18-fluorodeoxyglucose Positron Emission Tomography (FDG-PET) scanning in staging and its implications for the treatment of lymphoma, and to study the concordance between visual assessment and the Deauville criteria for the interpretation of interim scans. Methods: The prospective single-arm experimental study was conducted at the Shaukat Khanum Memorial Cancer Hospital, Lahore, from May 2011 to October 2011. It comprised 53 newly diagnosed lymphoma patients who agreed to participate in the study. All patients underwent scans with contrast-enhanced computerised tomography at baseline. The treatment plan was formulated based on the final stage. Interim scans were acquired after 2 cycles of chemotherapy and were reported using visual criteria and compared with the 5-point Deauville criteria. A score of 1-3 was taken as disease-negative, while 4-5 was taken as disease-positive. SPSS 19 was used for statistical analysis. Results: Of the 53 patients, 35 (66%) had Hodgkin's Lymphoma, while 18 (34%) had Non-Hodgkin's Lymphoma. Scans resulted in disease upstaging in 4 (7.5%) patients and detected increased disease burden in 12 (23%). On interim scans, complete remission was achieved in 38 (71%) patients (Deauville score 1-3); 12 (23%) showed partial response (Deauville score 4-5); and 3 (6%) had progression. The kappa test was statistically significant (kappa 0.856; p <0.001). Conclusion: Positron emission tomography helped to upstage lymphoma and reflected increased disease burden. The Deauville criteria correlated very well with the visual assessment criteria and can be applied in this patient population. (author)

  5. The application of microfocal radiography to neuroanatomy and neuropathology research, and its relation to cerebral magnification angiography and brain scan interpretation. Chapter 3

    International Nuclear Information System (INIS)

    Saunders, R.L. de C.H.

    1980-01-01

    Microfocal radiography is used to study, post mortem, the microcirculatory and neuronal organization of the normal and diseased brain, as well as to interpret the images obtained clinically by the new techniques of cerebral magnification angiography and X-ray brain scanning. An outline of the basic technique underlying CT scanning and magnification radiography of the living human brain is given to facilitate understanding of why microfocal radiography is central to magnification radiography and complementary to CT scanning. Microangiography, one of the microfocal radiographic techniques, is discussed at length in relation to the microvasculature of the human cerebral cortex, the vasculature of the subcortical or medullary white matter, the microvascular patterns of the central grey matter and internal capsule, and the vascular patterns of the visual cortex and hippocampus; the application of microangiography to the spinal cord and nerve roots is also discussed. Another microfocal radiographic technique described is cerebral historadiography, i.e. X-ray studies of brain histology, with particular reference to the human hippocampal formation. Finally, the correlation of microfocal X-ray and brain CT scan images is discussed. (U.K.)

  6. Difficulties in the interpretation of focal changes on liver scans

    Energy Technology Data Exchange (ETDEWEB)

    Derimanov, S.G. (Oblastnoj Onkologicheskij Dispanser, Veliko-Tyrnovo (Bulgaria))

    1983-01-01

    A series of cases is presented in which erroneous conclusions were drawn from liver scans. The analysis of these errors has shown that they stem from the similarity of the scanographic picture of focal changes in various diseases of the liver (cancer, cirrhosis, echinococcus, etc.) and of the adjacent organs (bronchogenic cyst, renal tumor, etc.). Consequently, attempts to determine the etiology of a pathological process from liver scans alone, without correlation with the results of multidimensional examination, frequently result in errors.

  7. A synthetic interpretation: the double-preparation theory

    International Nuclear Information System (INIS)

    Gondran, Michel; Gondran, Alexandre

    2014-01-01

    In the 1927 Solvay conference, three apparently irreconcilable interpretations of the quantum mechanics wave function were presented: the pilot-wave interpretation by de Broglie, the soliton wave interpretation by Schrödinger and the Born statistical rule by Born and Heisenberg. In this paper, we demonstrate that these interpretations are complementary, corresponding to quantum systems that are prepared differently, and we deduce a synthetic interpretation: the double-preparation theory. We first introduce in quantum mechanics the concept of semi-classical statistically prepared particles, and we show that in the Schrödinger equation these particles converge, when h→0, to the equations of a statistical set of classical particles. These classical particles are undiscerned, and if we assume continuity between classical mechanics and quantum mechanics, we conclude that the de Broglie–Bohm interpretation is necessary for semi-classical statistically prepared particles (the statistical wave). We then introduce in quantum mechanics the concept of a semi-classical deterministically prepared particle, and we show that in the Schrödinger equation this particle converges, when h→0, to the equations of a single classical particle. This classical particle is discerned and, assuming continuity between classical mechanics and quantum mechanics, we conclude that the Schrödinger interpretation is necessary for the semi-classical deterministically prepared particle (the soliton wave). Finally we propose, in the semi-classical approximation, a new interpretation of quantum mechanics, the 'theory of the double preparation', which depends on the preparation of the particles. (paper)

  8. Crossing statistic: Bayesian interpretation, model selection and resolving dark energy parametrization problem

    International Nuclear Information System (INIS)

    Shafieloo, Arman

    2012-01-01

    By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular parametrization of cosmological quantities such as the luminosity distance, the Hubble parameter or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It will also be shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method in testing cosmological models dealing with data with high uncertainties.

  9. Quantitation of PET signal as an adjunct to visual interpretation of florbetapir imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pontecorvo, Michael J.; Arora, Anupa K.; Devine, Marybeth; Lu, Ming; Galante, Nick; Siderowf, Andrew; Devadanam, Catherine; Joshi, Abhinay D.; Heun, Stephen L.; Teske, Brian F.; Truocchio, Stephen P.; Krautkramer, Michael; Devous, Michael D.; Mintun, Mark A. [Avid Radiopharmaceuticals (a wholly owned subsidiary of Eli Lilly and Company), Philadelphia, PA (United States)

    2017-05-15

    This study examined the feasibility of using quantitation to augment interpretation of florbetapir PET amyloid imaging. A total of 80 physician readers were trained on quantitation of florbetapir PET images and the principles for using quantitation to augment a visual read. On day 1, the readers completed a visual read of 96 scans (46 autopsy-verified and 50 from patients seeking a diagnosis). On day 2, 69 of the readers reinterpreted the 96 scans augmenting their interpretation with quantitation (VisQ method) using one of three commercial software packages. A subset of 11 readers reinterpreted all scans on day 2 based on a visual read only (VisVis control). For the autopsy-verified scans, the neuropathologist's modified CERAD plaque score was used as the truth standard for interpretation accuracy. Because an autopsy truth standard was not available for scans from patients seeking a diagnosis, the majority VisQ interpretation of the three readers with the best accuracy in interpreting autopsy-verified scans was used as the reference standard. Day 1 visual read accuracy was high for both the autopsy-verified scans (90%) and the scans from patients seeking a diagnosis (87.3%). Accuracy improved from the visual read to the VisQ read (from 90.1% to 93.1%, p < 0.0001). Importantly, access to quantitative information did not decrease interpretation accuracy of the above-average readers (>90% on day 1). Accuracy in interpreting the autopsy-verified scans also increased from the first to the second visual read (VisVis group). However, agreement with the reference standard (best readers) for scans from patients seeking a diagnosis did not improve with a second visual read, and in this cohort the VisQ group was significantly improved relative to the VisVis group (change 5.4% vs. -1.1%, p < 0.0001). These results indicate that augmentation of visual interpretation of florbetapir PET amyloid images with quantitative information obtained using commercially available software can improve the accuracy of scan interpretation.

  10. Quantitation of PET signal as an adjunct to visual interpretation of florbetapir imaging

    International Nuclear Information System (INIS)

    Pontecorvo, Michael J.; Arora, Anupa K.; Devine, Marybeth; Lu, Ming; Galante, Nick; Siderowf, Andrew; Devadanam, Catherine; Joshi, Abhinay D.; Heun, Stephen L.; Teske, Brian F.; Truocchio, Stephen P.; Krautkramer, Michael; Devous, Michael D.; Mintun, Mark A.

    2017-01-01

    This study examined the feasibility of using quantitation to augment interpretation of florbetapir PET amyloid imaging. A total of 80 physician readers were trained on quantitation of florbetapir PET images and the principles for using quantitation to augment a visual read. On day 1, the readers completed a visual read of 96 scans (46 autopsy-verified and 50 from patients seeking a diagnosis). On day 2, 69 of the readers reinterpreted the 96 scans augmenting their interpretation with quantitation (VisQ method) using one of three commercial software packages. A subset of 11 readers reinterpreted all scans on day 2 based on a visual read only (VisVis control). For the autopsy-verified scans, the neuropathologist's modified CERAD plaque score was used as the truth standard for interpretation accuracy. Because an autopsy truth standard was not available for scans from patients seeking a diagnosis, the majority VisQ interpretation of the three readers with the best accuracy in interpreting autopsy-verified scans was used as the reference standard. Day 1 visual read accuracy was high for both the autopsy-verified scans (90%) and the scans from patients seeking a diagnosis (87.3%). Accuracy improved from the visual read to the VisQ read (from 90.1% to 93.1%, p < 0.0001). Importantly, access to quantitative information did not decrease interpretation accuracy of the above-average readers (>90% on day 1). Accuracy in interpreting the autopsy-verified scans also increased from the first to the second visual read (VisVis group). However, agreement with the reference standard (best readers) for scans from patients seeking a diagnosis did not improve with a second visual read, and in this cohort the VisQ group was significantly improved relative to the VisVis group (change 5.4% vs. -1.1%, p < 0.0001). These results indicate that augmentation of visual interpretation of florbetapir PET amyloid images with quantitative information obtained using commercially available software can improve the accuracy of scan interpretation.

  11. Statistical analysis of time-resolved emission from ensembles of semiconductor quantum dots: Interpretation of exponential decay models

    DEFF Research Database (Denmark)

    Van Driel, A.F.; Nikolaev, I.S.; Vergeer, P.

    2007-01-01

    We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters and the intensity in an emission decay curve are not proportional, but the density is a time integral of the intensity. The integral relation is crucial to correctly interpret non-single-exponential decay. We derive the proper normalization for both a discrete and a continuous distribution of rates, where every decay component is multiplied by its radiative decay rate. A central result of our paper is the derivation of the emission decay curve when both radiative and nonradiative decays are independently distributed. In this case, the well-known emission quantum efficiency can no longer be expressed ...
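
    The integral relation at the centre of this abstract can be written compactly. The notation below is ours, a sketch rather than the paper's own equations: f(t) is the measured emission intensity and n(t) the density of excited emitters, so that for a continuous distribution of decay rates each exponential component is weighted by its rate.

      % Sketch in our notation; f(t) = intensity, n(t) = excited-emitter density.
      f(t) = -\frac{\mathrm{d}n(t)}{\mathrm{d}t}, \qquad
      n(t) = \int_{t}^{\infty} f(t')\,\mathrm{d}t'
      % Hence, for a continuous distribution \rho(\Gamma) of decay rates, the
      % normalized decay curve carries a factor \Gamma per component:
      f(t) = \int_{0}^{\infty} \rho(\Gamma)\,\Gamma\,e^{-\Gamma t}\,\mathrm{d}\Gamma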

  12. Interpretation of commonly used statistical regression models.

    Science.gov (United States)

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
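
    As a compact reminder of the interpretations such a review covers (our summary of standard results, not the authors' text): in simple linear regression a coefficient is an additive change in the mean outcome, whereas in logistic regression the exponentiated coefficient is a multiplicative change in the odds.

      % Standard coefficient interpretations (our summary):
      \mathbb{E}[Y] = \beta_0 + \beta_1 X
          \quad\Rightarrow\quad \beta_1 = \text{change in mean } Y \text{ per unit } X
      \log\frac{p}{1-p} = \beta_0 + \beta_1 X
          \quad\Rightarrow\quad e^{\beta_1} = \text{odds ratio per unit } X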

  13. Interpretation of some (p,n), (n,p), and (3He, p) reactions by means of the statistical multistep compound emission theory

    International Nuclear Information System (INIS)

    Bonetti, R.; Milazzo, L.C.; Melanotte, M.

    1983-01-01

    A number of (p,n), (n,p), and (3He, p) reactions have been interpreted on the basis of the statistical multistep compound emission mechanism. Good agreement with experiment is found both in spectrum shape and in the value of the coherence widths.

  14. Computer assisted diagnosis in renal nuclear medicine: rationale, methodology and interpretative criteria for diuretic renography

    Science.gov (United States)

    Taylor, Andrew T; Garcia, Ernest V

    2014-01-01

    The goal of artificial intelligence, expert systems, decision support systems and computer assisted diagnosis (CAD) in imaging is the development and implementation of software to assist in the detection and evaluation of abnormalities, to alert physicians to cognitive biases, to reduce intra- and inter-observer variability and to facilitate the interpretation of studies at a faster rate and with a higher level of accuracy. These developments are needed to meet the challenges resulting from a rapid increase in the volume of diagnostic imaging studies coupled with a concurrent increase in the number and complexity of images in each patient dataset. The convergence of an expanding knowledge base and escalating time constraints increases the likelihood of physician errors. Errors are even more likely when physicians interpret low-volume studies such as 99mTc-MAG3 diuretic scans, where imagers may have had limited training or experience. Decision support systems include neural networks, case-based reasoning, expert systems and statistical systems. iRENEX (renal expert) is an expert system for diuretic renography that uses a set of rules obtained from human experts to analyze a knowledge base of both clinical parameters and quantitative parameters derived from the renogram. Initial studies have shown that the interpretations provided by iRENEX are comparable to the interpretations of a panel of experts. iRENEX provides immediate patient-specific feedback at the time of scan interpretation, can be queried to provide the reasons for its conclusions and can be used as an educational tool to teach trainees to better interpret renal scans. iRENEX also has the capacity to populate a structured reporting module and generate a clear and concise impression based on the elements contained in the report; adherence to the procedural and data entry components of the structured reporting module assures and documents procedural competency. Finally, although the focus is CAD applied to ...

  15. Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability

    Directory of Open Access Journals (Sweden)

    Daniela Lucini

    2018-04-01

    Full Text Available In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic (ANS) regulation. We apply a three-step statistical procedure, first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. controls. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through peculiar graphical tools, i.e., the significance diagram and the ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination: e.g., athletes differ from controls in all domains, but with a graded strength: maximal in the (normalized) oscillatory and in the pulse domains, slightly less in the pressure domain and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines the autonomic differentiation map as a novel informational tool for the interpretation of HRV.
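
    The three-step procedure can be sketched in a few lines of Python. Everything below is hypothetical (simulated data, our variable names, scikit-learn in place of whatever software the authors used): step 1 residualizes the proxies on age and gender, step 2 extracts four latent domains by factor analysis; the resulting scores would then feed the group-discrimination tests of step 3.

      # Hypothetical sketch of the three-step procedure; not the authors' code.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(0)
      n = 200
      hrv_indices = rng.normal(size=(n, 10))      # stand-in for HRV/baroreflex proxies
      age_gender = np.column_stack([rng.uniform(20, 70, n), rng.integers(0, 2, n)])

      # Step 1: residualize each ANS proxy on age and gender.
      fit = LinearRegression().fit(age_gender, hrv_indices)
      residuals = hrv_indices - fit.predict(age_gender)

      # Step 2: extract four latent ANS domains by factor analysis.
      fa = FactorAnalysis(n_components=4, random_state=0)
      scores = fa.fit_transform(residuals)        # per-subject domain scores
      print(scores.shape)                         # (200, 4)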

  16. A statistical pixel intensity model for segmentation of confocal laser scanning microscopy images.

    Science.gov (United States)

    Calapez, Alexandre; Rosa, Agostinho

    2010-09-01

    Confocal laser scanning microscopy (CLSM) has been widely used in the life sciences for the characterization of cell processes because it allows the recording of the distribution of fluorescence-tagged macromolecules on a section of the living cell. It is in fact the cornerstone of many molecular transport and interaction quantification techniques, where the identification of regions of interest through image segmentation is usually a required step. In many situations, because of the complexity of the recorded cellular structures or because of the amounts of data involved, image segmentation is either too difficult or too inefficient to be done by hand, and automated segmentation procedures have to be considered. Given the nature of CLSM images, statistical segmentation methodologies appear as natural candidates. In this work we propose a model to be used for statistical unsupervised CLSM image segmentation. The model is derived from the CLSM image formation mechanics and its performance is compared to the existing alternatives. Results show that it provides a much better description of the data on classes characterized by their mean intensity, making it suitable not only for segmentation methodologies with a known number of classes but also for use with schemes aiming at the estimation of the number of classes through the application of cluster selection criteria.

  17. The use of CT scan in the pre-operative staging of bronchogenic carcinoma

    International Nuclear Information System (INIS)

    Pada, C.C.

    1992-01-01

    Surgery remains the treatment of choice in patients with localized bronchogenic carcinoma. Pre-operative identification of inoperability spares the patient from unnecessary surgery. This prospective study was carried out to determine the correctness of judgement regarding a patient's operability or inoperability based on pre-operative CT staging, and to find out the sensitivity, specificity and overall accuracy of the CT scan in estimating tumor description, nodal status and metastatic spread to the chest. Staging was done by 3 senior radiologists aware of the diagnosis. The surgical and histopathologic findings and staging were gathered and used as the measurement of truth in assessing the CT scan's accuracy. The overall accuracy rate of the CT scan in determining operability or inoperability is 80%; tumor description is assessed with an accuracy of 87%, and nodal status estimation has an accuracy of 60%. Sensitivity of the CT scan in assessment of metastatic spread to the chest is 93%. There is no statistically significant difference in the judgement of operability or inoperability by CT scan compared to surgical and histopathologic results. The CT scan is recommended as a valuable tool in the pre-operative staging of patients with bronchogenic carcinoma who are candidates for surgery. (auth.). 21 refs.; 8 tabs

  18. Biomembrane Permeabilization: Statistics of Individual Leakage Events Harmonize the Interpretation of Vesicle Leakage.

    Science.gov (United States)

    Braun, Stefan; Pokorná, Šárka; Šachl, Radek; Hof, Martin; Heerklotz, Heiko; Hoernke, Maria

    2018-01-23

    The mode of action of membrane-active molecules, such as antimicrobial, anticancer, cell penetrating, and fusion peptides and their synthetic mimics, transfection agents, drug permeation enhancers, and biological signaling molecules (e.g., quorum sensing), involves either the general or local destabilization of the target membrane or the formation of defined, rather stable pores. Some effects aim at killing the cell, while others need to be limited in space and time to avoid serious damage. Biological tests reveal translocation of compounds and cell death but do not provide a detailed, mechanistic, and quantitative understanding of the modes of action and their molecular basis. Model membrane studies of membrane leakage have been used for decades to tackle this issue, but their interpretation in terms of biology has remained challenging and often quite limited. Here we compare two recent, powerful protocols to study model membrane leakage: the microscopic detection of dye influx into giant liposomes and time-correlated single photon counting experiments to characterize dye efflux from large unilamellar vesicles. A statistical treatment of both data sets does not only harmonize apparent discrepancies but also makes us aware of principal issues that have been confusing the interpretation of model membrane leakage data so far. Moreover, our study reveals a fundamental difference between nano- and microscale systems that needs to be taken into account when conclusions about microscale objects, such as cells, are drawn from nanoscale models.

  19. Semi-quantitative interpretation of the bone scan in metabolic bone disease

    Energy Technology Data Exchange (ETDEWEB)

    Fogelman, I; Turner, J G; Hay, I D; Boyle, I T [Royal Infirmary, Glasgow (UK). Dept. of Nuclear Medicine; Citrin, D L [Wisconsin Univ., Madison (USA). Dept. of Human Oncology; Bessent, G R

    1979-01-01

    Certain easily recognisable features are commonly seen in the bone scans of patients with metabolic bone disorders. Seven such features have been numerically graded by three independent observers in the scans of 100 patients with metabolic bone disease and of 50 control subjects. The total score for each patient is defined as the metabolic index. The mean metabolic index for each group of patients with metabolic bone disease is significantly greater than that for the control group (P < 0.001). (orig.).

  20. Interpreting the Rock Paintings of Abri Faravel: laser and white-light scanning at 2,133m in the southern French Alps

    Directory of Open Access Journals (Sweden)

    Kevin Walsh

    2016-05-01

    Full Text Available The Abri Faravel, discovered in 2010 at 2,133m asl in the Parc National des Ecrins, Freissinières, Southern French Alps, is probably the most enigmatic high-altitude site in the Alps. This rock shelter saw phases of human activity from the Mesolithic through to the medieval period; the artefactual assemblages comprise Mesolithic and Neolithic flint tools, Iron Age hand-thrown pottery, a Roman fibula and some medieval metalwork. However, the most interesting and unique features of the site are the prehistoric rock paintings, the highest representations of animals (quadrupeds) in Europe. These paintings are presented in this article. The paintings themselves were the object of a white-light scan, whilst the rock shelter and surrounding landscape were scanned using a Faro laser scanner. Both of these models are presented here, and their interpretation is elucidated by an assessment of the different phases of activity at the shelter, combined with a synthesis of other evidence from the area and pertinent environmental evidence.

  1. Revisiting organizational interpretation and three types of uncertainty

    DEFF Research Database (Denmark)

    Sund, Kristian J.

    2015-01-01

    Design/methodology/approach – This conceptual paper extends existing conceptual work by distinguishing between general and issue-specific scanning and linking the interpretation process to three different types of perceived uncertainty: state, effect and response uncertainty. Findings – It is proposed ... that might help explain and untangle some of the conflicting empirical results found in the extant literature. The paper illustrates how the literature could benefit from re-conceptualizing the perceived environmental uncertainty construct to take into account different types of uncertainty. Practical ... The paper builds on existing work by linking the interpretation process to three different types of uncertainty (state, effect and response uncertainty) with several novel and testable propositions. The paper also differentiates clearly general (regular) scanning from issue-specific (irregular) scanning. Finally, the paper ...

  2. Making the invisible body visible. Bone scans, osteoporosis and women's bodily experiences.

    Science.gov (United States)

    Reventlow, Susanne Dalsgaard; Hvas, Lotte; Malterud, Kirsti

    2006-06-01

    The imaging technology of bone scans allows visualization of the bone structure, and determination of a numerical value. Both these are subjected to professional interpretation according to medical (epidemiological) evidence to estimate the individual's risk of fractures. But when bodily experience is challenged by a visual diagnosis, what effect does this have on an individual? The aim of this study was to explore women's bodily experiences after a bone scan and to analyse how the scan affects women's self-awareness, sense of bodily identity and integrity. We interviewed 16 Danish women (aged 61-63) who had had a bone scan for osteoporosis. The analysis was based on Merleau-Ponty's perspective of perception as an embodied experience in which bodily experience is understood to be the existential ground of culture and self. Women appeared to take the scan literally and planned their lives accordingly. They appeared to believe that the 'pictures' revealed some truth in themselves. The information supplied by the scan fostered a new body image. The women interpreted the scan result (a mark on a curve) to mean bodily fragility which they incorporated into their bodily perception. The embodiment of this new body image produced new symptom interpretations and preventive actions, including caution. The result of the bone scan and its cultural interpretation triggered a reconstruction of the body self as weak with reduced capacity. Women's interpretation of the bone scan reorganized their lived space and time, and their relations with others and themselves. Technological information about osteoporosis appeared to leave most affected women more uncertain and restricted rather than empowered. The findings raise some fundamental questions concerning the use of medical technology for the prevention of asymptomatic disorders.

  3. Workplace Statistical Literacy for Teachers: Interpreting Box Plots

    Science.gov (United States)

    Pierce, Robyn; Chick, Helen

    2013-01-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…

  4. Interpretation of gamma-scanning data from the ORR demonstration elements

    International Nuclear Information System (INIS)

    Bretscher, M.M.; Snelgrove, J.L.; Hobbs, R.W.

    1989-01-01

    The HEU and LEU fuel elements used in the ORR whole-core demonstration were gamma-scanned to determine the axial distribution of the 140La and 137Cs activities. Analysis of this data is now complete. From the 140La activity distributions, cycle-averaged powers were determined, while the 137Cs data provided a measure of the final 235U burnup in the fuel elements. A method for calculating correction factors for activity gradients transverse to the fuel element axis is presented and is applied to the first mixed core used in the demonstration during the gradual transition to an all-LEU core. Results based on the gamma-scanning of the LEU fuel followers are also presented. Improved burnup calculations against which the experimental results are to be compared are now in progress. 7 refs., 21 figs., 3 tabs

  5. Adaptive statistical iterative reconstruction technology in the application of PET/CT whole body scans

    International Nuclear Information System (INIS)

    Xin Jun; Zhao Zhoushe; Li Hong; Lu Zhe; Wu Wenkai; Guo Qiyong

    2013-01-01

    Objective: To improve the image quality of low-dose CT in whole-body PET/CT using adaptive statistical iterative reconstruction (ASiR) technology. Methods: Two CT scans were performed with a GE water phantom, with scan parameters of 120 kV and, respectively, 120 and 300 mA. In addition, 30 subjects examined with PET/CT were selected randomly; whole-body PET/CT was performed after an 18F-FDG injection of 3.70 MBq/kg. Sharp IR + time of flight + VUE Point HD technology was used at 1.5 min/bed for PET; spiral CT was performed at 120 kV using automatic exposure control technology (30-210 mA, noise index 25). Phantom and patient whole-body CT images were reconstructed with the conventional and 40% ASiR methods respectively, and the CT attenuation value and noise index were measured. Results: The phantom and clinical studies showed that the standard deviation of CT values with the ASiR method was 33.0% lower than with the conventional reconstruction method (t=27.76, P<0.01); the standard deviations of CT values in normal tissues (brain, lung, mediastinum, liver and vertebral body) and lesions (brain, lung, mediastinum, liver and vertebral body) were reduced by 21.08% (t=23.35, P<0.01) and 24.43% (t=16.15, P<0.01) respectively. In particular, for normal liver tissue and liver lesions, the standard deviations of CT values were reduced by 51.33% (t=34.21, P<0.01) and 49.54% (t=15.21, P<0.01) respectively. Conclusion: The ASiR reconstruction method significantly reduced the noise of low-dose CT images and improved CT image quality in whole-body PET/CT, making it more suitable for quantitative analysis and clinical applications. (authors)

  6. A Statistical Framework to Interpret Individual Response to Intervention: Paving the Way for Personalized Nutrition and Exercise Prescription

    Directory of Open Access Journals (Sweden)

    Paul A. Swinton

    2018-05-01

    Full Text Available The concept of personalized nutrition and exercise prescription represents a topical and exciting progression for the discipline, given the large inter-individual variability that exists in response to virtually all performance and health related interventions. Appropriate interpretation of intervention-based data from an individual or group of individuals requires practitioners and researchers to consider a range of concepts, including the confounding influence of measurement error and biological variability. In addition, the means to quantify likely statistical and practical improvements are facilitated by concepts such as confidence intervals (CIs) and the smallest worthwhile change (SWC). The purpose of this review is to provide accessible and applicable recommendations for practitioners and researchers that interpret and report personalized data. To achieve this, the review is structured in three sections that progressively develop a statistical framework. Section 1 explores fundamental concepts related to measurement error and describes how typical error and CIs can be used to express uncertainty in baseline measurements. Section 2 builds upon these concepts and demonstrates how CIs can be combined with the concept of SWC to assess whether meaningful improvements occur post-intervention. Finally, section 3 introduces the concept of biological variability and discusses the subsequent challenges in identifying individual response and non-response to an intervention. Worked numerical examples and interactive Supplementary Material are incorporated to solidify concepts and assist with implementation in practice.
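
    Section 2's central idea can be illustrated in a few lines. The sketch below is our construction under the review's general framework, not the authors' Supplementary Material: the observed change gets a CI built from the typical error (TE) of the measurement, and is then compared with the SWC.

      # Illustrative sketch, not the authors' code; all numbers are invented.
      import math

      def change_ci(pre, post, typical_error, z=1.96):
          """95% CI for an individual's true change. The observed change carries
          measurement error from both the pre and post tests, hence sqrt(2)*TE."""
          change = post - pre
          half_width = z * math.sqrt(2) * typical_error
          return change - half_width, change + half_width

      swc = 1.0                     # smallest worthwhile change, domain-specific
      lo, hi = change_ci(pre=50.0, post=53.0, typical_error=0.8)
      print(round(lo, 2), round(hi, 2))                 # 0.78 5.22
      print("meaningful responder" if lo > swc else "uncertain response")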

  7. Augmenting Amyloid PET Interpretations With Quantitative Information Improves Consistency of Early Amyloid Detection.

    Science.gov (United States)

    Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M

    2017-08-01

    Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five 18F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using a standardized quantitative analysis software (MIMneuro) to generate whole brain and region of interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations demonstrated lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
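
    The "Quantitative Read" rule is simple enough to state directly in code. The sketch below reflects our reading of the abstract; the region names and SUV ratios are illustrative, and only the 1.1 threshold and the 2-of-6 rule are taken from the text.

      # Sketch of the Quantitative Read rule as described in the abstract.
      SUVR_THRESHOLD = 1.1
      MIN_ELEVATED_REGIONS = 2

      def quantitative_read(suvr_by_region):
          """Return 'elevated' or 'nonelevated' from regional SUV ratios."""
          n_elevated = sum(v > SUVR_THRESHOLD for v in suvr_by_region.values())
          return "elevated" if n_elevated >= MIN_ELEVATED_REGIONS else "nonelevated"

      scan = {"precuneus": 1.18, "anterior_cingulate": 1.12, "frontal": 1.05,
              "parietal": 1.02, "temporal": 0.98, "posterior_cingulate": 1.08}
      print(quantitative_read(scan))   # 'elevated' (two regions above 1.1)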

  8. Techniques Which Aid in Quantitative Interpretation of Scan Data; Methodes Facilitant l'Interpretation Quantitative des Scintigrammes; Metody, oblegchayushchie kolichestvennuyu interpretatsiyu dannykh skennirovaniya; Tecnicas Que Facilitan la Interpretacion Cuantitativa de los Datos Centelleograficos

    Energy Technology Data Exchange (ETDEWEB)

    Charleston, D. B.; Beck, R. N.; Eidelberg, P.; Schuh, M. W. [Argonne Cancer Research Hospital, Chicago, IL (United States)

    1964-10-15

    This paper discusses a range of techniques which assist in evaluating and interpreting scanning read-out displays. This range extends from simple internal calibration for photographic read-out to fairly elaborate auxiliary equipment for presentation of accumulated digital scan information to a computer programme. The direct and remarkably useful method of using a random pulse generator to produce a calibrated step-wedge of spots, which are projected on to a film by the same projection light source as is used during the scan, allows the viewer to compare exposure densities of regions of interest on the scan to similar regions on the wedge, which are calibrated directly in count-rate units. Auxiliary equipment, such as a multichannel analyser used in the multiscaling mode, permits the accumulation of digital information for a ''total count per scan line'' display for each index step. Small animal scans have been made which accumulate and display ''counts per scan line'' for each index step. This produces an accurate quantitative measure of the distribution of activity over the animal and a profile display of activity similar to the slit scan display of a linear scanning system. The same multiscaling technique is carried further by accumulating digital information for a ''count per unit area'' display. A profile curve is obtained for each scan line of each index step. From this it is possible to visualize or construct an area profile of count-rate. Scan displays with or without contrast enhancement and with or without ''time lag'' from integrating circuitry, and scans with various spot sizes and shapes, have been produced under identical statistical conditions by means of multiple read-outs while scanning a phantom with a single-detector system. Direct comparison of displays combined with the ''count per unit area'' mapping technique aids in the interpretation of scan results. Precise position information must be included with the data record. Computations of percentage ...

  9. Comparison Between Postprocessing Software and Repeated Scanning to Eliminate Subdiaphragmatic Activity in Myocardial Perfusion Scintigraphy

    International Nuclear Information System (INIS)

    Theerakulpisut, Daris; Chotipanich, Chanisa

    2016-01-01

    Myocardial perfusion single photon emission computed tomography (SPECT) is a powerful test for the evaluation of coronary artery disease, but subdiaphragmatic radiotracer activity often interferes with the interpretation of inferior wall findings. This study aims to evaluate the effectiveness of software-based elimination of subdiaphragmatic activity in terms of the correctness of image interpretation and the overall image quality of myocardial perfusion scintigraphy (MPS). MPS studies from January 2010 to October 2012 at our institution were reviewed. Thirty-two SPECT studies were included, all of which had significant subdiaphragmatic activity in the first scan and needed to be delayed to let the activity clear. Each scan was interpreted by using semiquantitative scoring in 17 segments according to the degree of radiotracer uptake. The first scan, which had interfering activity, was manipulated by masking out the unwanted activity with software native to our image processing software suite. The manipulated images were then compared with delayed images of the same patient, in which the subdiaphragmatic activity had cleared spontaneously with time. The first scan masked by software correlated with the delayed scan for myocardial regions supplied by the left circumflex (LCx) and right coronary artery (RCA), but not the left anterior descending (LAD) artery. However, the quality of the masked scans was perceived by the observer to be better in terms of quality and ease of interpretation. Using software to mask out unwanted subdiaphragmatic activity has no detrimental effect on the interpretation of MPS images when compared with delayed scanning, but it can improve subjective scan quality and ease of interpretation.

  10. CutL: an alternative to Kulldorff's scan statistics for cluster detection with a specified cut-off level.

    Science.gov (United States)

    Więckowska, Barbara; Marcinkowska, Justyna

    2017-11-06

    When searching for epidemiological clusters, an important tool can be to carry out one's own research with the incidence rate from the literature as the reference level. Values exceeding this level may indicate the presence of a cluster in that location. This paper presents a method of searching for clusters that have significantly higher incidence rates than a level specified by the investigator. The proposed method uses the classic binomial exact test for one proportion and an algorithm that joins areas with potential clusters while reducing the number of multiple comparisons needed. Sensitivity and specificity are preserved by this new method, which avoids the Monte Carlo approach while still delivering results comparable to the commonly used Kulldorff's scan statistics and other similar methods of localising clusters. A strong contributing factor is the accompanying statistical software, which allows the results to be analysed and presented cartographically.
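
    The core test here is the classic exact binomial test against a literature-derived reference rate. The sketch below is our illustration of that single-area test (all numbers invented); the paper's actual contribution, the area-joining algorithm that controls multiple comparisons, is not reproduced.

      # Illustrative single-area test; not the CutL implementation.
      from scipy.stats import binomtest

      population = 12000          # people at risk in the area (invented)
      observed_cases = 25         # cases observed there (invented)
      reference_rate = 0.0012     # cut-off incidence rate from the literature

      result = binomtest(observed_cases, n=population, p=reference_rate,
                         alternative="greater")
      print(result.pvalue)        # a small p-value flags a potential cluster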

  11. Comments on the interpretation of differential scanning calorimetry results for thermoelastic martensitic transformations: Athermal versus thermally activated kinetics

    International Nuclear Information System (INIS)

    Morris, A.; Lipe, T.

    1996-01-01

    In a previous article, Van Humbeeck and Planes made a number of criticisms of the authors' recent paper concerning the interpretation of results obtained by Differential Scanning Calorimetry (DSC) from the martensitic transformation of Cu-Al-Ni-Mn-B alloys. Although the martensitic transformation of these shape memory alloys is generally classified as athermal, it has been confirmed that the capacity of the alloys to undergo a more complete thermoelastic transformation (i.e. better reversibility of the transformation) increased with the Mn content. This behavior has been explained by interpreting the DSC results obtained during thermal cycling in terms of a thermally activated mechanism controlling the direct and reverse transformations. When the heating rate increases during the reverse transformation, the DSC curves shift towards higher temperatures, while they shift towards lower temperatures when the cooling rate is increased during the direct transformation. Since the starting transformation temperatures (As, Ms) do not shift, Van Humbeeck and Planes state that there is no real peak shift and assume that the DSC experiments were carried out without taking into account the thermal lag effect between sample and cell. On the following line they deduce a time constant, τ, of 60 seconds because the peak maximum shifts. In fact, the assumption made by Van Humbeeck and Planes is false.

  12. Odds Ratio or Prevalence Ratio? An Overview of Reported Statistical Methods and Appropriateness of Interpretations in Cross-sectional Studies with Dichotomous Outcomes in Veterinary Medicine

    Directory of Open Access Journals (Sweden)

    Brayan Alexander Fonseca Martinez

    2017-11-01

    Full Text Available One of the observational study designs most commonly employed in veterinary science is the cross-sectional study with binary outcomes. To measure an association with exposure, the use of prevalence ratios (PR) or odds ratios (OR) is possible. In human epidemiology, much has been discussed about the use of the OR exclusively for case–control studies, and some authors have reported that there is no good justification for fitting logistic regression when the prevalence of the disease is high, in which case the OR overestimates the PR. Nonetheless, interpretation of the OR is difficult, since confusion between risk and odds can lead to incorrect quantitative interpretation of data such as "the risk is X times greater," commonly reported in studies that use the OR. The aims of this study were (1) to review articles with cross-sectional designs to assess the statistical method used and the appropriateness of the interpretation of the estimated measure of association and (2) to illustrate the use of alternative statistical methods that estimate the PR directly. An overview of statistical methods and their interpretation following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted and included a diverse set of peer-reviewed journals in the veterinary science field, using PubMed as the search engine. From each article, the statistical method used and the appropriateness of the interpretation of the estimated measure of association were registered. Additionally, four alternative models for logistic regression that estimate the PR directly were tested using our own dataset from a cross-sectional study on bovine viral diarrhea virus. The initial search strategy found 62 articles, of which 6 were excluded, so 56 studies were used for the overall analysis. The review showed that, independent of the level of prevalence reported, 96% of articles employed logistic regression, thus estimating the OR. Results of the multivariate models ...
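
    One commonly cited alternative that estimates the PR directly is Poisson regression with robust standard errors. The sketch below is ours, not the paper's analysis: it uses simulated cross-sectional data with a deliberately common outcome to show the OR exceeding the PR.

      # Hedged sketch with simulated data; not the paper's four tested models.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 2000
      exposed = rng.integers(0, 2, n)
      p = np.where(exposed == 1, 0.45, 0.30)      # high prevalence on purpose
      disease = rng.binomial(1, p)
      X = sm.add_constant(exposed.astype(float))

      # Poisson GLM with robust (HC0) errors estimates the PR directly.
      pr_fit = sm.GLM(disease, X, family=sm.families.Poisson()).fit(cov_type="HC0")
      # Logistic regression estimates the OR.
      or_fit = sm.GLM(disease, X, family=sm.families.Binomial()).fit()

      print(np.exp(pr_fit.params[1]))   # ~1.5, the prevalence ratio
      print(np.exp(or_fit.params[1]))   # ~1.9, the odds ratio, larger when common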

  13. Scanning probe recognition microscopy investigation of tissue scaffold properties

    Science.gov (United States)

    Fan, Yuan; Chen, Qian; Ayres, Virginia M; Baczewski, Andrew D; Udpa, Lalita; Kumar, Shiva

    2007-01-01

    Scanning probe recognition microscopy is a new scanning probe microscopy technique which enables selective scanning along individual nanofibers within a tissue scaffold. Statistically significant data for multiple properties can be collected by repetitively fine-scanning an identical region of interest. The results of a scanning probe recognition microscopy investigation of the surface roughness and elasticity of a series of tissue scaffolds are presented. Deconvolution and statistical methods were developed and used for data accuracy along curved nanofiber surfaces. Nanofiber features were also independently analyzed using transmission electron microscopy, with results that supported the scanning probe recognition microscopy-based analysis. PMID:18203431

  14. The Role of Biased Scanning in Counterattitudinal Advocacy

    Science.gov (United States)

    Cunningham, John D.; Collins, Barry E.

    1977-01-01

    Experiments tested the biased-scanning hypothesis that high financial inducement leads to greater cognitive contact with counterattitudinal arguments and thus to greater attitude change. No differences in biased scanning or attitude change were observed as a function of financial inducement. Results were interpreted in the framework of reactance and…

  15. Interpretation of ultrasonic images; Interpretation von Ultraschall-Abbildungen

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, W; Schmitz, V; Kroening, M [Fraunhofer-Institut fuer Zerstoerungsfreie Pruefverfahren, Saarbruecken (Germany)

    1998-11-01

    During the evaluation of ultrasonic images, e.g. SAFT-reconstructed B-scan images (SAFT = Synthetic Aperture Focusing Technique), it is often difficult to decide what caused the reconstructed image points: were they produced by defects, by the specimen's geometry, or by mode conversions? To facilitate this evaluation, a tool based on the comparison of data sets was developed. Several kinds of data comparison are possible: identification of the RF-signals which caused a given reconstructed image point, i.e. comparison of a reconstructed image with the corresponding RF-data; comparison of two reconstructed images by superposition using logical operators, e.g. comparing the reconstruction of an unknown reflector with that of a known one; and comparison of raw RF-data by simultaneous scanning through two data sets, in which the echoes of an unknown reflector are compared with the echoes of a known one. The necessary data sets of known reflectors may be generated experimentally on reference reflectors or modelled. The aim is the identification of the reflector type (e.g. crack-like or not) and the determination of position, size and orientation, as well as the identification of accompanying satellite echoes. The interpretation of the SAFT-reconstructed B-scan image is completed by a full description of the reflector. Beyond interpretation, the tool described is also well suited for educating and training ultrasonic testers. (orig./MM)

  16. The Statistical Interpretation of Entropy: An Activity

    Science.gov (United States)

    Timmberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  17. Improving interpretation of publically reported statistics on health and healthcare: the Figure Interpretation Assessment Tool (FIAT-Health).

    Science.gov (United States)

    Gerrits, Reinie G; Kringos, Dionne S; van den Berg, Michael J; Klazinga, Niek S

    2018-03-07

    Policy-makers, managers, scientists, patients and the general public are confronted daily with figures on health and healthcare through public reporting in newspapers, webpages and press releases. However, information on the key characteristics of these figures necessary for their correct interpretation is often not adequately communicated, which can lead to misinterpretation and misinformed decision-making. The objective of this research was to map the key characteristics relevant to the interpretation of figures on health and healthcare, and to develop a Figure Interpretation Assessment Tool-Health (FIAT-Health) through which figures on health and healthcare can be systematically assessed, allowing for a better interpretation of these figures. The abovementioned key characteristics of figures on health and healthcare were identified through systematic expert consultations in the Netherlands on four topic categories of figures, namely morbidity, healthcare expenditure, healthcare outcomes and lifestyle. The identified characteristics were used as a frame for the development of the FIAT-Health. Development of the tool and its content was supported and validated through regular review by a sounding board of potential users. Identified characteristics relevant for the interpretation of figures in the four categories relate to the figures' origin, credibility, expression, subject matter, population and geographical focus, time period, and underlying data collection methods. The characteristics were translated into a set of 13 dichotomous and 4-point Likert scale questions constituting the FIAT-Health, and two final assessment statements. Users of the FIAT-Health were provided with a summary overview of their answers to support a final assessment of the correctness of a figure and the appropriateness of its reporting. FIAT-Health can support policy-makers, managers, scientists, patients and the general public to systematically assess the quality of publicly reported

  18. Online platform for applying space–time scan statistics for prospectively detecting emerging hot spots of dengue fever

    Directory of Open Access Journals (Sweden)

    Chien-Chou Chen

    2016-11-01

    Full Text Available Background: Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. Methods: A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space–time scan statistics with SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and p value <0.05 was identified as a dengue-epidemic area. Assuming an ongoing transmission might continuously spread for two consecutive weeks, we estimated the sensitivity and specificity for detecting outbreaks by comparing the scan-based classification (dengue-epidemic vs. dengue-free village) with the true cumulative case numbers from the TCDC's surveillance statistics. Results: Among the 1648 villages in Tainan and Kaohsiung, the overall sensitivity for detecting outbreaks increases as case numbers grow, in a total of 92 weekly simulations. The specificity for detecting outbreaks behaves inversely to the sensitivity. On average, the mean sensitivity and specificity of 2-week hot spot detection were 0.615 and 0.891 respectively (p value <0.001) for the covariate adjustment model, with the maximum spatial and temporal windows specified as 50% of the total population at risk and 28 days. Dengue-epidemic villages were visualized and explored in an interactive map. Conclusions: We designed an online analytical tool for ...
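
    At the heart of such a tool is the Poisson likelihood-ratio score that scan statistics such as SaTScan maximize over candidate space-time cylinders. The function below is our minimal illustration of that score for one cylinder; it is not SaTScan code, and the numbers are invented.

      # Poisson scan score for one candidate cylinder (our illustration).
      import math

      def poisson_llr(c, expected, total_cases):
          """Log likelihood ratio for a cylinder with c observed and `expected`
          expected cases out of `total_cases`; 0 unless the cylinder is high-rate."""
          if c <= expected:
              return 0.0
          inside = c * math.log(c / expected)
          outside = (total_cases - c) * math.log((total_cases - c) /
                                                 (total_cases - expected))
          return inside + outside

      # A village-week cylinder with 30 cases where 12 were expected, out of 500:
      print(poisson_llr(30, 12.0, 500))   # larger scores = more anomalous clusters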

  19. Preoperative nuclear scans in patients with melanoma

    International Nuclear Information System (INIS)

    Au, F.C.; Maier, W.P.; Malmud, L.S.; Goldman, L.I.; Clark, W.H. Jr.

    1984-01-01

    One hundred forty-one liver scans, 137 brain scans, and 112 bone scans were performed in 192 patients with clinical Stage 1 melanoma. One liver scan was interpreted as abnormal; liver biopsy of that patient showed no metastasis. There were 11 suggestive liver scans; three of the patients with suggestive liver scans had negative liver biopsies. The remaining eight patients were followed from 4 to 6 years, and none of those patients developed clinical evidence of hepatic metastases. All of the brain scans were normal. Five patients had suggestive bone scans, and none of those patients manifested symptoms of osseous metastases with a follow-up of 2 to 4.5 years. This study demonstrates that the use of preoperative liver, brain and bone scans in the evaluation of patients with clinical Stage 1 melanoma is virtually unproductive.

  20. Bone scanning in severe external otitis

    International Nuclear Information System (INIS)

    Levin, W.J.; Shary, J.H. III; Nichols, L.T.; Lucente, F.E.

    1986-01-01

    Technetium-99m methylene diphosphonate bone scanning has been considered a valuable early tool to diagnose necrotizing progressive malignant external otitis. However, to our knowledge, no formal studies have actually compared bone scans of otherwise young, healthy patients with severe external otitis to scans of patients with a clinical presentation of malignant external otitis. Twelve patients with only severe external otitis were studied with Technetium-99m methylene diphosphonate and were compared to known cases of malignant otitis. All scans were evaluated by two neuroradiologists with no prior knowledge of the clinical status of the patients. Nine of the 12 patients had positive bone scans, with many scans resembling those reported with malignant external otitis. Interestingly, there was no consistent correlation between the severity of clinical presentation and the amount of Technetium uptake. These findings suggest that a positive bone scan alone should not be interpreted as indicative of malignant external otitis.

  1. Comparison of supine, upright, and prone positions for liver scans

    International Nuclear Information System (INIS)

    Harolds, J.A.; Brill, A.B.; Patton, J.A.; Touya, J.J.

    1983-01-01

    We compared liver scan interpretations based on anterior images obtained in the upright, prone, and supine positions. Receiver-operating-characteristic curves were generated for three well-trained observers. Results showed that reading the three different views together was more accurate than reading any individual image. Furthermore, interpretations based on either the prone or upright view were superior to those using the supine view alone. The prone and upright views should be used more often in liver scanning.

  2. Handbook of univariate and multivariate data analysis and interpretation with SPSS

    CERN Document Server

    Ho, Robert

    2006-01-01

    Many statistics texts tend to focus more on the theory and mathematics underlying statistical tests than on their applications and interpretation. This can leave readers with little understanding of how to apply statistical tests or how to interpret their findings. While the SPSS statistical software has done much to alleviate the frustrations of social science professionals and students who must analyze data, they still face daunting challenges in selecting the proper tests, executing the tests, and interpreting the test results.With emphasis firmly on such practical matters, this handbook se

  3. Use of a spatial scan statistic to identify clusters of births occurring outside Ghanaian health facilities for targeted intervention.

    Science.gov (United States)

    Bosomprah, Samuel; Dotse-Gborgbortsi, Winfred; Aboagye, Patrick; Matthews, Zoe

    2016-11-01

    To identify and evaluate clusters of births that occurred outside health facilities in Ghana for targeted intervention. A retrospective study was conducted using a convenience sample of live births registered in Ghanaian health facilities from January 1 to December 31, 2014. Data were extracted from the district health information system. A spatial scan statistic was used to investigate clusters of home births through a discrete Poisson probability model. Scanning with a circular spatial window was conducted only for clusters with high rates of such deliveries. The district was used as the geographic unit of analysis. The likelihood P value was estimated using Monte Carlo simulations. Ten statistically significant clusters with a high rate of home birth were identified. The relative risks ranged from 1.43 ("least likely" cluster; P=0.001) to 1.95 ("most likely" cluster; P=0.001). The relative risks of the top five "most likely" clusters ranged from 1.68 to 1.95; these clusters were located in Ashanti, Brong Ahafo, and the Western, Eastern, and Greater Accra regions. Health facility records, geospatial techniques, and geographic information systems provided locally relevant information to assist policy makers in delivering targeted interventions to small geographic areas. Copyright © 2016 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.
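
    The Monte Carlo inference step mentioned above can be sketched as follows (our illustration, not the study's analysis): the maximum scan score from the observed data is ranked against maxima from datasets simulated under the null, with cases allocated in proportion to the population at risk. For brevity, the "scan" here considers single districts rather than circular windows joining several districts.

      # Monte Carlo p-value for a simplified spatial scan (our illustration).
      import numpy as np

      rng = np.random.default_rng(42)

      def max_poisson_llr(cases, expected):
          """Best single-district Poisson LLR, standing in for a full window scan."""
          total_c, total_e = cases.sum(), expected.sum()
          best = 0.0
          for c, e in zip(cases, expected):
              e_null = e * total_c / total_e   # expectation rescaled to total cases
              if c > e_null:
                  llr = (c * np.log(c / e_null) +
                         (total_c - c) * np.log((total_c - c) / (total_c - e_null)))
                  best = max(best, llr)
          return best

      expected = rng.uniform(20.0, 200.0, size=50)   # expected home births (invented)
      observed = rng.poisson(expected)
      observed[7] += 40                              # plant an excess in one district
      t_obs = max_poisson_llr(observed, expected)

      null_max = [max_poisson_llr(rng.multinomial(observed.sum(),
                                                  expected / expected.sum()), expected)
                  for _ in range(999)]
      p_value = (1 + sum(m >= t_obs for m in null_max)) / (999 + 1)
      print(t_obs, p_value)                          # small p = significant cluster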

  4. Statistical analysis with Excel for dummies

    CERN Document Server

    Schmuller, Joseph

    2013-01-01

    Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro

  5. An expert system for improving the gamma-ray scanning technique

    International Nuclear Information System (INIS)

    Laraki, K.; Alami, R.; Cherkaoui El Moursli, R.; Bensitel, A.; El Badri, L.

    2007-01-01

    The gamma-ray scanning technique is widely used in the diagnosis of industrial installations in general and of distillation columns in particular, the latter being considered the most critical components in petrochemical plants. It provides essential data to optimise the performance of columns and identify maintenance requirements. Due to the various difficulties that can arise while analysing a scanning profile, and in order to benefit from the continuous advent of new technologies in the fields of electronics and data processing, the team of the Division of Instrumentation and Industrial Applications of CNESTEN has conducted a project aiming at the elaboration of an expert system for the acquisition, processing and interpretation of scanning results. This system consists of two main modules: the first is devoted to the preparation and control of the scanning operation conditions, while the second has been developed to carry out easy and effective automatic (on-line) analysis and interpretation of the scan profiles.

  6. Applications and interpretation of krypton 81m ventilation/technetium 99m macroaggregate perfusion lung scanning in childhood

    Science.gov (United States)

    Davies, Hugh Trevor Frimston

    still reflects regional ventilation in this age group. The doubt cast on the interpretation of the Kr81m steady state image could limit the value of V/Q lung scans in following regional lung function through childhood, a period when specific ventilation is falling rapidly as the child grows. Therefore the first aim of this study was to examine the application of this theoretical model to children and determine whether the changing specific ventilation seen through childhood significantly alters the interpretation of the steady state Kr81m image. This is a necessary first step before conducting longitudinal studies of regional ventilation and perfusion in children. The effect of posture on regional ventilation and perfusion in the adult human lung has been extensively studied. Radiotracer studies have consistently shown that both ventilation and perfusion are preferentially distributed to dependent lung regions during tidal breathing regardless of posture. There is little published information concerning the pattern in children, yet there are many differences in the lung and chest wall mechanics of children and adults which, along with clinical observation, have led to the hypothesis that the pattern of regional ventilation observed in adults may not be seen in children. Recent reports of regional ventilation in infants and very young children have provided support for this theory. The paper of Heaf et al. demonstrated that these differences may in certain circumstances be clinically important. It is not clear, however, at what age children adopt the "adult pattern of ventilation". In addition to the problems referred to above, attenuation of Kr81m activity as it passes through the chest wall and the changing geometry of the chest during tidal breathing have made quantitative analysis of the image difficult, although fractional ventilation and perfusion to each lung can be calculated from the steady state image. In clinical practice, therefore, ventilation and perfusion are

  7. Exploring the Gross Schoenebeck (Germany) geothermal site using a statistical joint interpretation of magnetotelluric and seismic tomography models

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Gerard; Bauer, Klaus; Moeck, Inga; Schulze, Albrecht; Ritter, Oliver [Deutsches GeoForschungsZentrum (GFZ), Telegrafenberg, 14473 Potsdam (Germany)

    2010-03-15

    Exploration for geothermal resources is often challenging because there are no geophysical techniques that provide direct images of the parameters of interest, such as porosity, permeability and fluid content. Magnetotelluric (MT) and seismic tomography methods yield information about subsurface distribution of resistivity and seismic velocity on similar scales and resolution. The lack of a fundamental law linking the two parameters, however, has limited joint interpretation to a qualitative analysis. By using a statistical approach in which the resistivity and velocity models are investigated in the joint parameter space, we are able to identify regions of high correlation and map these classes (or structures) back onto the spatial domain. This technique, applied to a seismic tomography-MT profile in the area of the Gross Schoenebeck geothermal site, allows us to identify a number of classes in accordance with the local geology. In particular, a high-velocity, low-resistivity class is interpreted as related to areas with thinner layers of evaporites; regions where these sedimentary layers are highly fractured may be of higher permeability. (author)

  8. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear - this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou

  9. On Improving the Quality and Interpretation of Environmental Assessments using Statistical Analysis and Geographic Information Systems

    Science.gov (United States)

    Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.

    2014-12-01

    An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.

  10. The role of key image notes in CT imaging study interpretation.

    Science.gov (United States)

    Fan, Shu-Feng; Xu, Zhe; He, Hai-Qing; Ding, Jian-Rong; Teng, Gao-Jun

    2011-04-01

    The objective of the study was to investigate the clinical effects of CT key image notes (KIN) in the interpretation of a CT image study. All experiments were approved by the ethics committee of the local district. Six experienced radiologists were equally divided into a routine reporting (RR) group and a KIN reporting (KIN) group. CT scans of 100 consecutive cases each, before and after the introduction of the KIN technique, were randomly selected, and the reports were made by groups RR and KIN, respectively. All the reports were reviewed again 3 months later by both groups. All the results, with or without KIN, were interpreted and reinterpreted after 3 months by six clinicians, who were experienced in picture archiving and communication system (PACS) applications and were equally divided into a clinical routine report group and a clinical KIN report group, respectively. The results were statistically analyzed; the time used in making a report, the re-reading time 3 months later, and the consistency of imaging interpretation were determined and compared between groups. After using the KIN technique, the time used in making a report was significantly increased (8.77 ± 5.27 vs. 10.53 ± 5.71 min, P < 0.05), the re-reading time was decreased (5.23 ± 2.54 vs. 4.99 ± 1.70 min, P < 0.05), the clinical interpretation and reinterpretation times after 3 months were decreased, and the consistency of interpretation and reinterpretation between different doctors at different times was markedly improved (P < 0.01). A CT report with the KIN technique in PACS can significantly improve the consistency of interpretation and efficiency in routine clinical work.

  11. Interpreting the gamma statistic in phylogenetic diversification rate studies: a rate decrease does not necessarily indicate an early burst.

    Science.gov (United States)

    Fordyce, James A

    2010-07-23

    Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
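
    For reference, the statistic under discussion is the Pybus-Harvey gamma, computed from the internode intervals of an ultrametric tree; under the pure-birth null it is approximately standard normal. The sketch below also reproduces the abstract's point that stretching only the most recent intervals drives gamma strongly negative (the simulated intervals and the factor-of-five slowdown are invented):

    ```python
    # Pybus & Harvey (2000) gamma from internode intervals.
    import numpy as np

    def gamma_stat(g):
        """g[k-2] is the interval during which k lineages exist, k = 2..n."""
        n = len(g) + 1
        k = np.arange(2, n + 1)
        T = np.sum(k * g)                      # total (weighted) tree depth
        inner = np.cumsum(k * g)[:-1]          # partial sums for i = 2..n-1
        num = inner.mean() - T / 2
        return num / (T * np.sqrt(1.0 / (12 * (n - 2))))

    rng = np.random.default_rng(2)
    n, lam = 100, 1.0
    # Pure birth: g_k ~ Exponential with rate k*lambda, so gamma ~ N(0, 1).
    g = rng.exponential(1.0 / (np.arange(2, n + 1) * lam))
    print(f"pure birth:    gamma = {gamma_stat(g):+.2f}")

    # Recent slowdown: stretch the most recent intervals only.
    g_slow = g.copy()
    g_slow[-20:] *= 5
    print(f"rate decrease: gamma = {gamma_stat(g_slow):+.2f}")
    ```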

  12. Interpreting the gamma statistic in phylogenetic diversification rate studies: a rate decrease does not necessarily indicate an early burst.

    Directory of Open Access Journals (Sweden)

    James A Fordyce

    Full Text Available BACKGROUND: Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. METHODOLOGY: Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. CONCLUSIONS: The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.

  13. Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology

    Directory of Open Access Journals (Sweden)

    Parsuram Nayak

    2018-01-01

    Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting by simulation models for plant diseases has great potential in practical disease control strategies. Common mathematical tools such as monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models like the linear mixed model, generalized linear model, and generalized linear mixed models have been presented. The most recent technologies such as micro-array analysis, though cost-effective, provide estimates of gene expressions for thousands of genes simultaneously and need attention by molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. The rice research scientists should take advantage of these new opportunities adequately in

  14. Comparison of dimensional accuracy of digital dental models produced from scanned impressions and scanned stone casts

    Science.gov (United States)

    Subeihi, Haitham

    dimensional accuracy, which is defined as the absolute value of deviation in micrometers from the reference model. A two-way analysis of variance (ANOVA) was applied to determine whether the measurements for the six test groups were statistically significantly different from the original reference model as well as between test groups (p < 0.05). Results: The mean (± SD) RMS was 29.42 ± 5.80 microns for digital models produced from polyether impression scans, 27.58 ± 5.85 microns for digital models from PVS impression scans, and 24.08 ± 4.89 microns for digital models produced from VPES impression scans; 26.08 ± 6.58 microns for digital models produced by scanning stone casts poured from PE, 31.67 ± 9.95 microns for digital models produced by scanning stone casts poured from PVS, and 22.58 ± 2.84 microns for digital models produced by scanning stone casts poured from VPES. In the two-way ANOVA, the p-value for the material factor was 0.004, reflecting a statistically significant difference between the accuracy of the three impression materials, with VPES showing the highest accuracy (mean RMS = 23.33 ± 3.99 microns) followed by PE (mean RMS = 27.75 ± 6.3 microns) and PVS (mean RMS = 29.63 ± 8.25 microns). For the technique factor, the p-value was 0.870, reflecting no statistically significant difference between the accuracy of the two techniques (impression scan and stone cast scan); the mean RMS values were 27.03 ± 5.82 microns and 26.78 ± 7.85 microns, respectively. In the post-hoc tests for the material factor, a significant difference was found between the accuracy of VPES and PVS (p-value = 0.004), with VPES having the higher accuracy (lower mean RMS). No significant difference was found between the accuracies of PE and PVS (p-value = 0.576), or between the accuracies of PE and VPES (p-value = 0.054). Conclusions: Within the limitations of this in vitro study, it can be concluded that: 1. There is no statistically significant difference in dimensional accuracy
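
    A sketch of the two-way fixed-effects design used above, with simulated RMS values seeded from the group means reported in the abstract (the within-group standard deviation and per-cell sample size are assumptions):

    ```python
    # Hypothetical material x technique ANOVA on simulated RMS deviations.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(3)
    materials = ["PE", "PVS", "VPES"]
    techniques = ["impression_scan", "cast_scan"]
    means = {"PE": 27.8, "PVS": 29.6, "VPES": 23.3}   # mean RMS from the abstract

    rows = [{"material": m, "technique": t,
             "rms": rng.normal(means[m], 6.0)}        # assumed within-group SD
            for m in materials for t in techniques for _ in range(10)]
    df = pd.DataFrame(rows)

    model = ols("rms ~ C(material) * C(technique)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))            # material, technique, interaction
    ```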

  15. Assessment of statistical agreement of three techniques for the study of cut marks: 3D digital microscope, laser scanning confocal microscopy and micro-photogrammetry.

    Science.gov (United States)

    Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel

    2017-09-01

    In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG) have been recently applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be very efficient and a low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  16. Image analysis enhancement and interpretation

    International Nuclear Information System (INIS)

    Glauert, A.M.

    1978-01-01

    The necessary practical and mathematical background is provided for the analysis of an electron microscope image in order to extract the maximum amount of structural information. Instrumental methods of image enhancement are described, including the use of the energy-selecting electron microscope and the scanning transmission electron microscope. The problems of image interpretation are considered with particular reference to the limitations imposed by radiation damage and specimen thickness. A brief survey is given of the methods for producing a three-dimensional structure from a series of two-dimensional projections, although the emphasis is on the analysis, processing and interpretation of the two-dimensional projection of a structure. (Auth.)

  17. Applied statistics in ecology: common pitfalls and simple solutions

    Science.gov (United States)

    E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick

    2013-01-01

    The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...

  18. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  19. Evaluation of Two Commercial Systems for Automated Processing, Reading, and Interpretation of Lyme Borreliosis Western Blots

    Science.gov (United States)

    Binnicker, M. J.; Jespersen, D. J.; Harring, J. A.; Rollins, L. O.; Bryant, S. C.; Beito, E. M.

    2008-01-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis. PMID:18463211

  20. Evaluation of two commercial systems for automated processing, reading, and interpretation of Lyme borreliosis Western blots.

    Science.gov (United States)

    Binnicker, M J; Jespersen, D J; Harring, J A; Rollins, L O; Bryant, S C; Beito, E M

    2008-07-01

    The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis.

  1. Scanning high-Tc SQUID imaging system for magnetocardiography

    International Nuclear Information System (INIS)

    Yang, H-C; Wu, T-Y; Horng, H-E; Wu, C-C; Yang, S Y; Liao, S-H; Wu, C-H; Jeng, J T; Chen, J C; Chen, Kuen-Lin; Chen, M J

    2006-01-01

    A scanning magnetocardiography (MCG) system constructed from SQUID sensors offers potential for basic and clinical research in biomagnetism. In this work, we study a first-order scanning electronic high-Tc (HTS) SQUID MCG system for biomagnetic signals. The scanning MCG system was equipped with an x-y translation bed powered by step motors. Using noise cancellation and μ-metal shielding, we reduced the noise level substantially. The established scanning HTS MCG system was used to study the magnetophysiology of hypercholesterolaemic (HC) rabbits. The MCG data of HC rabbits were analysed. The MCG contour map of HC rabbits provides experimental models for the interpretation of human cardiac patterns.

  2. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans.

    Science.gov (United States)

    Ritchie, Alexander J; Sanghera, Calvin; Jacobs, Colin; Zhang, Wei; Mayo, John; Schmidt, Heidi; Gingras, Michel; Pasian, Sergio; Stewart, Lori; Tsai, Scott; Manos, Daria; Seely, Jean M; Burrowes, Paul; Bhatia, Rick; Atkar-Khattra, Sukhinder; van Ginneken, Bram; Tammemagi, Martin; Tsao, Ming Sound; Lam, Stephen

    2016-05-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in which a technician assisted by computer vision (CV) software acts as a first reader, with the aim of improving speed, consistency, and quality of scan interpretation. Without knowledge of the diagnosis, a technician reviewed 828 randomly batched scans (136 with lung cancers, 556 with benign nodules, and 136 without nodules) from the baseline Pan-Canadian Early Detection of Lung Cancer Study that had been annotated by the CV software CIRRUS Lung Screening (Diagnostic Image Analysis Group, Nijmegen, The Netherlands). The scans were classified as either normal (no nodules ≥1 mm or benign nodules) or abnormal (nodules or other abnormality). The results were compared with the diagnostic interpretation by Pan-Canadian Early Detection of Lung Cancer Study radiologists. The overall sensitivity and specificity of the technician in identifying an abnormal scan were 97.8% (95% confidence interval: 96.4-98.8) and 98.0% (95% confidence interval: 89.5-99.7), respectively. Of the 112 prevalent nodules that were found to be malignant in follow-up, 92.9% were correctly identified by the technician plus CV compared with 84.8% by the study radiologists. The average time taken by the technician to review a scan after CV processing was 208 ± 120 seconds. Prescreening CV software and a technician as first reader is a promising strategy for improving the consistency and quality of screening interpretation of LDCT scans. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  3. Typhoid fever acquired in the United States, 1999-2010: epidemiology, microbiology, and use of a space-time scan statistic for outbreak detection.

    Science.gov (United States)

    Imanishi, M; Newton, A E; Vieira, A R; Gonzalez-Aviles, G; Kendall Scott, M E; Manikonda, K; Maxwell, T N; Halpin, J L; Freeman, M M; Medalla, F; Ayers, T L; Derado, G; Mahon, B E; Mintz, E D

    2015-08-01

    Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space-time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space-time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space-time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.
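
    The space-time scan statistic used in the study above sweeps cylinders: a circular geographic base and a time-interval height. A minimal sketch on simulated weekly county counts (locations, baselines, window limits, and the planted outbreak are all invented; the Monte Carlo significance step, done as in the purely spatial sketch earlier, is omitted for brevity):

    ```python
    # Hypothetical space-time scan: cylinders = k-nearest-county base x week interval.
    import numpy as np

    def llr(c, e, C):
        """Poisson log-likelihood ratio for a high-rate cylinder."""
        if c <= e or c >= C:
            return 0.0
        return c * np.log(c / e) + (C - c) * np.log((C - c) / (C - e))

    rng = np.random.default_rng(4)
    n_loc, n_wk = 30, 104
    xy = rng.uniform(0, 10, (n_loc, 2))                 # county centroids
    baseline = rng.uniform(0.02, 0.10, n_loc)           # expected weekly cases
    counts = rng.poisson(baseline[:, None], (n_loc, n_wk)).astype(float)
    counts[3, 40:44] += rng.poisson(3.0, 4)             # planted 4-week outbreak

    C = counts.sum()
    expected = np.tile(baseline[:, None], (1, n_wk))
    expected *= C / expected.sum()                      # standardize to total observed

    best = (0.0, None)
    for i in range(n_loc):                              # base: 5 nearest counties
        zone = np.argsort(((xy - xy[i]) ** 2).sum(axis=1))[:5]
        for t0 in range(n_wk):
            for t1 in range(t0, min(t0 + 8, n_wk)):     # height: at most 8 weeks
                s = llr(counts[zone, t0:t1 + 1].sum(),
                        expected[zone, t0:t1 + 1].sum(), C)
                if s > best[0]:
                    best = (s, (i, t0, t1))
    print("most likely cylinder (centre county, first week, last week):", best[1])
    ```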

  4. Variation in reaction norms: Statistical considerations and biological interpretation.

    Science.gov (United States)

    Morrissey, Michael B; Liefting, Maartje

    2016-09-01

    Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
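
    A sketch of the mixed-model route the abstract recommends: a random-intercept, random-slope model treats each genotype's reaction norm as a draw from a population, so the fitted variance components estimate variation in elevation and plasticity directly (the genotype count, environments, and all variance values below are invented):

    ```python
    # Hypothetical random-regression fit of linear reaction norms.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n_geno, n_env = 30, 6
    env = np.tile(np.linspace(-1, 1, n_env), n_geno)     # standardized environment
    gid = np.repeat(np.arange(n_geno), n_env)
    a = rng.normal(0.0, 0.5, n_geno)                     # genotype intercepts
    b = rng.normal(1.0, 0.3, n_geno)                     # genotype slopes (plasticity)
    y = a[gid] + b[gid] * env + rng.normal(0, 0.2, env.size)
    df = pd.DataFrame({"y": y, "env": env, "genotype": gid})

    # Random intercepts and slopes per genotype; their covariance matrix is
    # the object of interest for "variation in reaction norms".
    fit = smf.mixedlm("y ~ env", df, groups="genotype", re_formula="~env").fit()
    print(fit.cov_re)        # estimated intercept/slope (co)variances
    ```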

  5. Does environmental data collection need statistics?

    NARCIS (Netherlands)

    Pulles, M.P.J.

    1998-01-01

    The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental

  6. Statistical Reform in School Psychology Research: A Synthesis

    Science.gov (United States)

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  7. The usefulness of descriptive statistics in the interpretation of data on occupational physical activity of Poles

    Directory of Open Access Journals (Sweden)

    Elżbieta Biernat

    2014-12-01

    Full Text Available Background: The aim of this paper is to assess whether basic descriptive statistics is sufficient to interpret the data on physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables with the significance level of 0.05 (Kruskal-Wallis and Pearson’s Chi2 tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task) with a very high standard deviation (SD = 5302.8) and max = 35 511 MET-min/week. It concerned different types of activities. Approximately 10% (90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession, or the type of activity. The average time of sitting was 256 min/day. As many as 39% of the respondents met the World Health Organization standards only due to OPA (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to define quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding results from further analysis. Med Pr 2014;65(6):743–753
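
    A small sketch of the point the authors make: for heavily right-skewed activity data, means alone mislead, so quantiles should be reported alongside them; group comparisons then use rank-based tests (the log-normal parameters and group labels below are assumptions chosen only to mimic the reported skew):

    ```python
    # Quantiles vs. means for skewed activity data, plus a Kruskal-Wallis test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    groups = {g: rng.lognormal(mean=7.0, sigma=1.2, size=300)   # MET-min/week
              for g in ("white-collar", "admin/technical", "physical")}

    for name, x in groups.items():
        med, p90 = np.percentile(x, [50, 90])
        print(f"{name:16s} mean={x.mean():7.0f}  median={med:6.0f}  p90={p90:7.0f}")

    h, p = stats.kruskal(*groups.values())
    print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
    ```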

  8. New scanning technique using Adaptive Statistical lterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with FBP group, 550mA (450–600) vs. 650mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was 27% effective radiation dose reduction in the ASIR group compared with FBP group, 4.29mSv (2.84–6.02) vs. 5.84mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
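
    Since the dose results above are summarized as medians with interquartile ranges, a rank-based two-group comparison is the natural test; a sketch on simulated doses (the log-normal shape is an assumption seeded from the reported medians and group sizes):

    ```python
    # Nonparametric comparison of simulated effective doses (mSv).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    fbp  = rng.lognormal(np.log(5.84), 0.45, 238)   # seeded from reported medians
    asir = rng.lognormal(np.log(4.29), 0.45, 109)

    def iqr(x):
        return np.percentile(x, [25, 50, 75])

    u, p = stats.mannwhitneyu(asir, fbp, alternative="two-sided")
    print("ASIR median (IQR): {1:.2f} ({0:.2f}-{2:.2f})".format(*iqr(asir)))
    print("FBP  median (IQR): {1:.2f} ({0:.2f}-{2:.2f})".format(*iqr(fbp)))
    print(f"Mann-Whitney U = {u:.0f}, p = {p:.2g}")
    ```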

  9. What dementia reveals about proverb interpretation and its neuroanatomical correlates.

    Science.gov (United States)

    Kaiser, Natalie C; Lee, Grace J; Lu, Po H; Mather, Michelle J; Shapira, Jill; Jimenez, Elvira; Thompson, Paul M; Mendez, Mario F

    2013-08-01

    Neuropsychologists frequently include proverb interpretation as a measure of executive abilities. A concrete interpretation of proverbs, however, may reflect semantic impairments from anterior temporal lobes, rather than executive dysfunction from frontal lobes. The investigation of proverb interpretation among patients with different dementias with varying degrees of temporal and frontal dysfunction may clarify the underlying brain-behavior mechanisms for abstraction from proverbs. We propose that patients with behavioral variant frontotemporal dementia (bvFTD), who are characteristically more impaired on proverb interpretation than those with Alzheimer's disease (AD), are disproportionately impaired because of anterior temporal-mediated semantic deficits. Eleven patients with bvFTD and 10 with AD completed the Delis-Kaplan Executive Function System (D-KEFS) Proverbs Test and a series of neuropsychological measures of executive and semantic functions. The analysis included both raw and age-adjusted normed data for multiple choice responses on the D-KEFS Proverbs Test using independent samples t-tests. Tensor-based morphometry (TBM) applied to 3D T1-weighted MRI scans mapped the association between regional brain volume and proverb performance. Computations of mean Jacobian values within select regions of interest provided a numeric summary of regional volume, and voxel-wise regression yielded 3D statistical maps of the association between tissue volume and proverb scores. The patients with bvFTD were significantly worse than those with AD in proverb interpretation. The worse performance of the bvFTD patients involved a greater number of concrete responses to common, familiar proverbs, but not to uncommon, unfamiliar ones. These concrete responses to common proverbs correlated with semantic measures, whereas concrete responses to uncommon proverbs correlated with executive functions. After controlling for dementia diagnosis, TBM analyses indicated significant

  10. Symmetry, Invariance and Ontology in Physics and Statistics

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-09-01

    Full Text Available This paper has three main objectives: (a) Discuss the formal analogy between some important symmetry-invariance arguments used in physics, probability and statistics. Specifically, we will focus on Noether’s theorem in physics, the maximum entropy principle in probability theory, and de Finetti-type theorems in Bayesian statistics; (b) Discuss the epistemological and ontological implications of these theorems, as they are interpreted in physics and statistics. Specifically, we will focus on the positivist (in physics) or subjective (in statistics) interpretations vs. objective interpretations that are suggested by symmetry and invariance arguments; (c) Introduce the cognitive constructivism epistemological framework as a solution that overcomes the realism-subjectivism dilemma and its pitfalls. The work of the physicist and philosopher Max Born will be particularly important in our discussion.

  11. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  12. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  13. Statistical physics modeling of hydrogen desorption from LaNi{sub 4.75}Fe{sub 0.25}: Stereographic and energetic interpretations

    Energy Technology Data Exchange (ETDEWEB)

    Wjihi, Sarra [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Dhaou, Houcine [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Yahia, Manel Ben; Knani, Salah [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Jemni, Abdelmajid [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Lamine, Abdelmottaleb Ben, E-mail: abdelmottaleb.benlamine@gmail.com [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia)

    2015-12-15

    Statistical physics treatment is used to study the desorption of hydrogen on LaNi{sub 4.75}Fe{sub 0.25}, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi{sub 4.75}Fe{sub 0.25} are fitted at three temperatures (293 K, 303 K and 313 K), using a monolayer desorption model. Six parameters of the model are fitted, namely the number of molecules per site n{sub α} and n{sub β}, the receptor site densities N{sub αM} and N{sub βM}, and the energetic parameters P{sub α} and P{sub β}. The behaviors of these parameters are discussed in relation to the desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamical potential functions are derived by statistical physics calculations from our adopted model.
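
    A sketch of how such a six-parameter isotherm fit can be carried out by nonlinear least squares (the two-site Hill-type expression, parameter values, and noise level below are assumptions for illustration, not the authors' exact model):

    ```python
    # Hypothetical two-site monolayer isotherm fit by nonlinear least squares.
    import numpy as np
    from scipy.optimize import curve_fit

    def monolayer(P, na, nb, NaM, NbM, Pa, Pb):
        """Assumed Hill-type form: each site family contributes n*N_M/(1+(P_half/P)**n)."""
        return na * NaM / (1 + (Pa / P) ** na) + nb * NbM / (1 + (Pb / P) ** nb)

    rng = np.random.default_rng(7)
    P = np.linspace(0.2, 10, 40)                     # pressure (arbitrary units)
    true = (1.2, 0.8, 2.0, 1.5, 1.0, 4.0)            # invented parameter values
    q = monolayer(P, *true) + rng.normal(0, 0.02, P.size)

    popt, pcov = curve_fit(monolayer, P, q, p0=(1, 1, 1, 1, 1, 3),
                           bounds=(1e-6, np.inf))
    for name, v, t in zip(("na", "nb", "NaM", "NbM", "Pa", "Pb"), popt, true):
        print(f"{name:4s} fitted {v:6.2f}  (true {t})")
    ```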

  14. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  15. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  16. Interpretation of Confidence Interval Facing the Conflict

    Science.gov (United States)

    Andrade, Luisa; Fernández, Felipe

    2016-01-01

    As literature has reported, it is usual that university students in statistics courses, and even statistics teachers, interpret the confidence level associated with a confidence interval as the probability that the parameter value will be between the lower and upper interval limits. To confront this misconception, class activities have been…
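
    A small simulation makes the correct reading concrete: the 95% is the long-run coverage of the interval-generating procedure, not the probability that any one computed interval contains the parameter (the distribution and sample size below are arbitrary):

    ```python
    # Coverage interpretation of a 95% confidence interval, by simulation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    mu, sigma, n, reps = 10.0, 2.0, 25, 1000
    covered = 0
    for _ in range(reps):
        x = rng.normal(mu, sigma, n)
        lo, hi = stats.t.interval(0.95, n - 1, loc=x.mean(), scale=stats.sem(x))
        covered += (lo <= mu <= hi)
    print(f"{covered / reps:.1%} of {reps} intervals contain the fixed mu")
    ```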

  17. Adaptive and robust statistical methods for processing near-field scanning microwave microscopy images.

    Science.gov (United States)

    Coakley, K J; Imtiaz, A; Wallis, T M; Weber, J C; Berweger, S; Kabos, P

    2015-03-01

    Near-field scanning microwave microscopy offers great potential to facilitate characterization, development and modeling of materials. By acquiring microwave images at multiple frequencies and amplitudes (along with the other modalities) one can study material and device physics at different lateral and depth scales. Images are typically noisy and contaminated by artifacts that can vary from scan line to scan line and planar-like trends due to sample tilt errors. Here, we level images based on an estimate of a smooth 2-d trend determined with a robust implementation of a local regression method. In this robust approach, features and outliers which are not due to the trend are automatically downweighted. We denoise images with the Adaptive Weights Smoothing method. This method smooths out additive noise while preserving edge-like features in images. We demonstrate the feasibility of our methods on topography images and microwave |S11| images. For one challenging test case, we demonstrate that our method outperforms alternative methods from the scanning probe microscopy data analysis software package Gwyddion. Our methods should be useful for massive image data sets where manual selection of landmarks or image subsets by a user is impractical. Published by Elsevier B.V.
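
    A generic stand-in for the leveling step described above: an iteratively reweighted plane fit with Tukey biweight downweighting, so tall features and outlier scan lines barely influence the estimated tilt (this is not the authors' exact local-regression implementation):

    ```python
    # Robust planar leveling of a scan image via IRLS with Tukey biweight.
    import numpy as np

    def robust_level(img, iters=10, c=4.685):
        """Fit and subtract a plane; residual features get downweighted."""
        ny, nx = img.shape
        Y, X = np.mgrid[:ny, :nx]
        A = np.column_stack([X.ravel(), Y.ravel(), np.ones(img.size)])
        z = img.ravel()
        w = np.ones(img.size)
        coef = np.zeros(3)
        for _ in range(iters):
            sw = np.sqrt(w)
            coef, *_ = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)
            r = z - A @ coef
            s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # MAD scale
            u = np.clip(np.abs(r) / (c * s), 0, 1)
            w = (1 - u ** 2) ** 2                                     # Tukey biweight
        return (z - A @ coef).reshape(img.shape)

    rng = np.random.default_rng(9)
    Yg, Xg = np.mgrid[:64, :64]
    img = 0.05 * Xg - 0.02 * Yg + rng.normal(0, 0.1, (64, 64))  # tilted noisy scan
    img[20:30, 20:30] += 3.0              # tall feature that should not bias the fit
    flat = robust_level(img)
    print(f"row-mean spread after leveling: {np.ptp(flat.mean(axis=1)):.3f}")
    ```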

  18. Typhoid fever acquired in the United States, 1999–2010: epidemiology, microbiology, and use of a space–time scan statistic for outbreak detection

    Science.gov (United States)

    IMANISHI, M.; NEWTON, A. E.; VIEIRA, A. R.; GONZALEZ-AVILES, G.; KENDALL SCOTT, M. E.; MANIKONDA, K.; MAXWELL, T. N.; HALPIN, J. L.; FREEMAN, M. M.; MEDALLA, F.; AYERS, T. L.; DERADO, G.; MAHON, B. E.; MINTZ, E. D.

    2016-01-01

    SUMMARY Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space–time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space–time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space–time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection. PMID:25427666

  19. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.

  20. A Study on Liver Scan using 113mIn Colloid

    International Nuclear Information System (INIS)

    Koh, Chang Soon; Rhee, Chong Hoen; Chang, Kochang; Hong, Chang Gi

    1969-01-01

    There have been numerous reported cases of liver scanning with 198Au colloid by many investigators; however, liver scanning with 113mIn colloid has not yet been reported in this country. The dose of 113mIn giving high diagnostic value in the examination of each organ was determined, and the diagnostic interpretability of liver scanning with 113mIn was carefully evaluated in comparison with the results of liver scanning with the conventionally applied radioisotope. The comparative study of liver scans obtained with 113mIn colloid and 198Au colloid gave the following results: 1) The liver uptake rate and clearance into peripheral blood were more pronounced with 113mIn colloid than with 198Au colloid. 2) The interpretability of space-occupying lesions in liver scanning with 113mIn was also superior to that with 198Au. 3) The liver scan obtained with 113mIn colloid does not always correspond to that with 198Au; this difference can be explained by differences in the phagocytic ability of the reticuloendothelial system within the liver. 4) In liver scanning with 113mIn colloid, the spleen is also visualized, even in normal examinees. 5) In cases of disturbed liver function, uptake is more decreased with 113mIn colloid than with 198Au; in the spleen, however, the opposite holds. 6) With 113mIn colloid, the time required for scanning could be shortened in comparison with 198Au. 7) Filtration of the 113mIn colloid prior to human administration gives an expectation of better scan images.

  1. Characterizing the geomorphic setting of precariously balanced rocks using terrestrial laser scanning technology

    Science.gov (United States)

    Haddad, D. E.; Arrowsmith, R.

    2009-12-01

    Terrestrial laser scanning (TLS) technology is rapidly becoming an effective three-dimensional imaging tool. Precariously balanced rocks are a subset of spheroidally weathered boulders. They are balanced on bedrock pedestals and are formed in upland drainage basins and pediments of exhumed plutons. Precarious rocks are used as negative evidence of earthquake-driven extreme ground motions. Field surveys of PBRs are coupled with cosmogenic radionuclide (CRN) surface exposure dating techniques to determine their exhumation rates. These rates are used in statistical simulations to estimate the magnitudes and recurrences of earthquake-generated extreme ground shaking as a means to physically validate seismic hazard analyses. However, the geomorphic setting of PBRs in the landscape is poorly constrained when interpreting their exhumation rates from CRN surface exposure dates. Are PBRs located on steep or gentle hillslopes? Are they located near drainages or hillslope crests? What geomorphic processes control the spatial distribution of PBRs in a landscape, and where do these processes dominate? Because the fundamental hillslope transport laws are largely controlled by local hillslope gradient and contributing area, the location of a PBR is controlled by the geomorphic agents and their rates acting on it. Our latest efforts involve using a combination of TLS and airborne laser swath mapping (ALSM) to characterize the geomorphic situation of PBRs. We used a Riegl LPM 800i (LPM 321) terrestrial laser scanner to scan a ~1.5 m tall by ~1 m wide precariously balanced rock in the Granite Dells, central Arizona. The PBR was scanned from six positions, and the scans were aligned to a point cloud totaling 3.4M points. We also scanned a ~50 m by ~150 m area covering PBR hillslopes from five scan positions. The resulting 5.5M points were used to create a digital terrain model of precarious rocks and their hillslopes. Our TLS- and ALSM-generated surface models and DEMs provide a

  2. Horizon scanning for environmental foresight: a review of issues and approaches

    Science.gov (United States)

    David N. Bengston

    2013-01-01

    Natural resource management organizations carry out a range of activities to examine possible future conditions and trends as part of their planning process, but the distinct approach of formal horizon scanning is often a missing component of strategic thinking and strategy development in these organizations. Horizon scanning is a process for finding and interpreting...

  3. Evaluation of processing methods for static radioisotope scan images

    International Nuclear Information System (INIS)

    Oakberg, J.A.

    1976-12-01

    Radioisotope scanning in the field of nuclear medicine provides a method for mapping the distribution of a radioactive drug in the human body, producing images that are useful for detecting abnormalities in vital organs. Even at best, radioisotope scanning methods produce images with poor counting statistics. One solution for improving body-scan images is to use dedicated small computers with appropriate software to process the scan data. Eleven methods for processing image data are compared.

  4. Theoretical, analytical, and statistical interpretation of environmental data

    International Nuclear Information System (INIS)

    Lombard, S.M.

    1974-01-01

    The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)
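
    As a minimal illustration of the propagation-of-errors point made above, the sketch below combines several relative uncertainty components in quadrature for a product/quotient measurement model; the component values are hypothetical, not taken from the report:

        import math

        def combined_relative_uncertainty(*components):
            """First-order propagation for a product/quotient model:
            relative uncertainties add in quadrature."""
            return math.sqrt(sum(u * u for u in components))

        u_counting = 0.04   # counting statistics alone
        u_tracer   = 0.03   # chemical-yield tracer calibration
        u_sampling = 0.08   # soil sample heterogeneity
        print(round(combined_relative_uncertainty(u_counting, u_tracer, u_sampling), 3))
        # ~0.094: counting statistics alone would understate the total error.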

  5. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty designers have in anticipating the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the resulting characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacturing to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
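
    A short sketch contrasting the two tolerancing formulas discussed above: arithmetic (worst-case) stack-up versus quadratic (root-sum-square) stack-up. The tolerance values are hypothetical:

        import math

        def arithmetic_tolerance(tolerances):
            """Worst-case stack-up: tolerances add linearly."""
            return sum(tolerances)

        def statistical_tolerance(tolerances):
            """Quadratic (root-sum-square) stack-up for independent,
            centred contributions."""
            return math.sqrt(sum(t * t for t in tolerances))

        tols = [0.10, 0.10, 0.10, 0.10]              # hypothetical chain
        print(arithmetic_tolerance(tols))            # 0.4
        print(round(statistical_tolerance(tols), 2)) # 0.2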

  6. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  7. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  8. Evaluation of 111In leukocyte whole body scanning

    International Nuclear Information System (INIS)

    McDougall, I.R.; Baumert, J.E.; Lantieri, R.L.

    1979-01-01

    Polymorphonuclear cells isolated and labeled with 111In-oxine were used for studying abscesses and inflammatory conditions. There were 64 scans in 59 patients (32 male, 27 female) aged 3 to 81 years (average, 51). The original clinical diagnosis was abscess in 33 patients. The white blood cell scan was abnormal in 12 (36%) of these, and a good clinical correlation was obtained in 11 of the 12. Of the 21 with a normal scan, 18 had no evidence of abscess, yielding one false-positive and three false-negative interpretations in the abscess group. Thirteen patients had fever of unknown origin: nine had negative scans and no subsequent evidence of abscess, and four had positive scans, with good correlation in three. Acute bone and joint infections were positive on scan (4/4), whereas chronic osteomyelitis was negative (0/2). Three patients with acute myocardial infarction and three of four with subacute bacterial endocarditis had normal scans. All three studies in renal transplant rejection showed positive uptake in the pelvic kidneys. Indium-111 white blood cell scans have proved useful for diagnosing or excluding abscesses and inflammatory conditions infiltrated with polymorphonuclear leukocytes.

  9. SOCR: Statistics Online Computational Resource

    OpenAIRE

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...

  10. Statistical significance of cis-regulatory modules

    Directory of Open Access Journals (Sweden)

    Smith Andrew D

    2007-01-01

    Background: It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome-scale scanning. Results: We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimate of the statistical significance of cis-regulatory module sites. Conclusion: The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and the MODSTORM software.
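
    The single-site significance estimation described above reduces, in its simplest form, to an empirical p-value against scores collected from background sequence; a minimal sketch (the Gaussian background here is a stand-in for motif scores computed over a promoter database):

        import random

        def empirical_site_pvalue(score, background_scores):
            """P(background score >= observed), with add-one smoothing
            so the estimate is never exactly zero."""
            exceed = sum(s >= score for s in background_scores)
            return (exceed + 1) / (len(background_scores) + 1)

        random.seed(0)
        background = [random.gauss(0.0, 1.0) for _ in range(100_000)]
        print(empirical_site_pvalue(3.0, background))   # ~0.0013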

  11. Assessment of Quadrivalent Human Papillomavirus Vaccine Safety Using the Self-Controlled Tree-Temporal Scan Statistic Signal-Detection Method in the Sentinel System.

    Science.gov (United States)

    Yih, W Katherine; Maro, Judith C; Nguyen, Michael; Baker, Meghan A; Balsbaugh, Carolyn; Cole, David V; Dashevsky, Inna; Mba-Jonas, Adamma; Kulldorff, Martin

    2018-06-01

    The self-controlled tree-temporal scan statistic-a new signal-detection method-can evaluate whether any of a wide variety of health outcomes are temporally associated with receipt of a specific vaccine, while adjusting for multiple testing. Neither health outcomes nor postvaccination potential periods of increased risk need be prespecified. Using US medical claims data in the Food and Drug Administration's Sentinel system, we employed the method to evaluate adverse events occurring after receipt of quadrivalent human papillomavirus vaccine (4vHPV). Incident outcomes recorded in emergency department or inpatient settings within 56 days after first doses of 4vHPV received by 9- through 26.9-year-olds in 2006-2014 were identified using International Classification of Diseases, Ninth Revision, diagnosis codes and analyzed by pairing the new method with a standard hierarchical classification of diagnoses. On scanning diagnoses of 1.9 million 4vHPV recipients, 2 statistically significant categories of adverse events were found: cellulitis on days 2-3 after vaccination and "other complications of surgical and medical procedures" on days 1-3 after vaccination. Cellulitis is a known adverse event. Clinically informed investigation of electronic claims records of the patients with "other complications" did not suggest any previously unknown vaccine safety problem. Considering that thousands of potential short-term adverse events and hundreds of potential risk intervals were evaluated, these findings add significantly to the growing safety record of 4vHPV.
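
    A heavily simplified sketch of the temporal half of such a scan: all post-vaccination risk intervals within a 56-day window are scanned with a binomial log-likelihood ratio, and the multiplicity of intervals is handled by Monte Carlo comparison against the maximum statistic under a uniform null. The published method additionally scans a hierarchical tree of diagnoses and uses a self-controlled design; the counts below are hypothetical:

        import math, random

        DAYS = 56  # post-vaccination follow-up window

        def max_llr(day_counts):
            """Largest binomial log-likelihood ratio over all candidate
            risk intervals [a, b] versus a uniform-risk null."""
            n, best = sum(day_counts), 0.0
            for a in range(DAYS):
                c = 0
                for b in range(a, DAYS):
                    c += day_counts[b]
                    p0 = (b - a + 1) / DAYS       # null fraction of the window
                    if c > n * p0:                # only excess-risk intervals
                        llr = c * math.log(c / (n * p0))
                        if n > c:
                            llr += (n - c) * math.log((n - c) / (n * (1 - p0)))
                        best = max(best, llr)
            return best

        def scan_pvalue(day_counts, replications=199, seed=1):
            """Monte Carlo p-value; comparing against the max LLR of null
            data absorbs the multiple testing over all intervals."""
            rng = random.Random(seed)
            observed, n = max_llr(day_counts), sum(day_counts)
            hits = 0
            for _ in range(replications):
                null = [0] * DAYS
                for _ in range(n):
                    null[rng.randrange(DAYS)] += 1
                hits += max_llr(null) >= observed
            return (hits + 1) / (replications + 1)

        counts = [1] * DAYS                # hypothetical background of events
        counts[1], counts[2] = 9, 7        # excess on days 2-3, as for cellulitis
        print(scan_pvalue(counts))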

  12. Occipital and Cingulate Hypometabolism are Significantly Under-Reported on 18-Fluorodeoxyglucose Positron Emission Tomography Scans of Patients with Lewy Body Dementia.

    Science.gov (United States)

    Hamed, Moath; Schraml, Frank; Wilson, Jeffrey; Galvin, James; Sabbagh, Marwan N

    2018-01-01

    To determine whether occipital and cingulate hypometabolism is being under-reported or missed on 18-fluorodeoxyglucose positron emission tomography (FDG-PET) CT scans in patients with Dementia with Lewy Bodies (DLB). Recent studies have reported higher sensitivity and specificity for occipital and cingulate hypometabolism on FDG-PET of DLB patients. This retrospective chart review looked at regions of interest (ROIs) in FDG-PET CT scan reports of 35 consecutive patients with a clinical diagnosis of probable, possible, or definite DLB as defined by the latest DLB Consortium Report. ROIs consisting of glucose hypometabolism in frontal, parietal, temporal, occipital, and cingulate areas were tabulated and charted separately by the authors from the reports. A blinded nuclear medicine physician read the images independently and marked ROIs separately. A Cohen's kappa coefficient was calculated to determine agreement between the reports and the blinded reads. On the radiology reports, 25.71% and 17.14% of patients had occipital and cingulate hypometabolism reported, respectively. The independent reads demonstrated significant disagreement with the proportions reported on the initial reads: 91.43% and 85.71%, respectively. The Cohen's kappa determinations demonstrated significant agreement only for parietal hypometabolism. Occipital and cingulate hypometabolism is under-reported and frequently missed on clinical interpretations of FDG-PET scans of patients with DLB, and the frequency of hypometabolism is even higher than previously reported. Further studies with more statistical power and receiver operating characteristic analyses are needed to delineate the sensitivity and specificity of these in vivo biomarkers.
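
    Cohen's kappa, the agreement statistic used above, can be computed directly from paired labels; a minimal sketch with hypothetical reads:

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa for two raters' labels on the same scans."""
            n = len(rater_a)
            p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            freq_a, freq_b = Counter(rater_a), Counter(rater_b)
            p_exp = sum(freq_a[c] * freq_b[c]
                        for c in set(rater_a) | set(rater_b)) / n ** 2
            return (p_obs - p_exp) / (1 - p_exp)

        # Hypothetical reads of 10 scans (1 = hypometabolism noted):
        report  = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]   # clinical report
        blinded = [1, 1, 1, 1, 0, 1, 1, 1, 1, 0]   # blinded re-read
        print(round(cohens_kappa(report, blinded), 2))   # low agreement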

  13. Value of hepatobiliary scanning in complex liver trauma

    International Nuclear Information System (INIS)

    Gartman, D.M.; Zeman, R.K.; Cahow, C.E.; Baker, C.C.

    1985-01-01

    To evaluate the use of biliary scintigraphy with 99mTc-dimethyl analogs (HIDA) in traumatic liver injuries, a group of 26 patients with penetrating and blunt liver injuries was studied. The results indicate that HIDA scanning is an effective noninvasive method of evaluating the hepatobiliary tree in the post-traumatic setting. The HIDA scan is a sensitive tool for studying the hepatic parenchyma and the presence or absence of bile leaks. Its evaluation of the extrahepatic biliary ductal system is not specific and should be assessed with further studies. Gallbladder nonvisualization on HIDA scans in this setting cannot be presumed to be secondary to acute cholecystitis and should be interpreted with extreme caution

  14. Indices of agreement between neurosurgeons and a radiologist in interpreting tomography scans in an emergency department.

    Science.gov (United States)

    Dourado, Jules Carlos; Pereira, Júlio Leonardo Barbosa; Albuquerque, Lucas Alverne Freitas de; Carvalho, Gervásio Teles Cardos de; Dias, Patrícia; Dias, Laura; Bicalho, Marcos; Magalhães, Pollyana; Dellaretti, Marcos

    2015-08-01

    The accuracy of interpretation in the analysis of cranial computed tomography (CCT) by neurosurgeons versus radiologists has rarely been studied. This study aimed to assess the rate of agreement in the interpretation of CCTs between neurosurgeons and a radiologist in an emergency department. A total of 227 CCTs were independently analyzed by two neurosurgeons (NS1 and NS2) and a radiologist (RAD). The level of agreement in interpreting the examinations was studied. The kappa values obtained between NS1, NS2, and RAD indicated nearly perfect and substantial agreement. The highest levels of agreement in evaluating abnormalities were observed in the identification of tumors, hydrocephalus, and intracranial hematomas; the worst levels of agreement were observed for leukoaraiosis and reduced brain volume. For diseases in which the emergency-room procedure must be determined, agreement in the interpretation of CCTs between the radiologist and the neurosurgeons was satisfactory.

  15. Developments in statistical evaluation of clinical trials

    CERN Document Server

    Oud, Johan; Ghidey, Wendimagegn

    2014-01-01

    This book describes various ways of approaching and interpreting the data produced by clinical trial studies, with a special emphasis on the essential role that biostatistics plays in clinical trials. Over the past few decades the role of statistics in the evaluation and interpretation of clinical data has become of paramount importance. As a result the standards of clinical study design, conduct and interpretation have undergone substantial improvement. The book includes 18 carefully reviewed chapters on recent developments in clinical trials and their statistical evaluation, with each chapter providing one or more examples involving typical data sets, enabling readers to apply the proposed procedures. The chapters employ a uniform style to enhance comparability between the approaches.

  16. Inter-Observer Agreement on Diffusion-Weighted Magnetic Resonance Imaging Interpretation for Diagnosis of Acute Ischemic Stroke Among Emergency Physicians

    Directory of Open Access Journals (Sweden)

    Deniz ORAY

    2015-06-01

    Objectives: Diffusion-weighted magnetic resonance imaging (DW-MRI) is a highly sensitive tool for the detection of early ischemic stroke and is excellent at detecting small and early infarcts. Nevertheless, conflicts may arise and judgments may differ among interpreters. Inter-observer variability captures the systematic differences among observers and is expressed as the kappa (κ) coefficient. In this study, we aimed to determine the inter-observer variability among emergency physicians using DW-MRI for the diagnosis of acute ischemic stroke. Methods: Cranial DW-MRI images of 50 patients were interpreted in this retrospective observational cross-sectional study. Patients who underwent DW-MRI imaging for a suspected acute ischemic stroke were included, unless the scans had been ordered by one of the reviewers or were absent from the system. The scans were blindly and randomly interpreted by four emergency physicians. Inter-observer agreement between reviewers was evaluated using Fleiss' kappa statistics. Results: The mean kappa values for high signal on diffusion-weighted images (DWI) and for reduction on the apparent diffusion coefficient (ADC) map were substantial (κ = 0.67) and moderate (κ = 0.60), respectively. The correlation for detection of the presence and location of ischemia was substantial (κ = 0.67). There were 18 false-positive and 4 false-negative evaluations of DWI, and 15 false-positive and 8 false-negative evaluations of ADC. Conclusions: Our data suggest that DW-MRI is reliable in screening for ischemic stroke when interpreted by emergency physicians in the emergency department. The levels of stroke identification and variability show that emergency physicians can achieve an acceptable level of agreement. Key words: emergency department, diffusion-weighted magnetic resonance imaging, inter-observer agreement, ischemic stroke
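
    Fleiss' kappa generalizes agreement to more than two raters, as used above with four emergency physicians; a minimal sketch with hypothetical ratings:

        def fleiss_kappa(ratings):
            """Fleiss' kappa. `ratings` is an N x k table: ratings[i][j] is
            the number of raters assigning subject i to category j; every
            subject must be rated by the same number of raters."""
            n_subjects, n_raters = len(ratings), sum(ratings[0])
            n_categories = len(ratings[0])
            totals = [sum(row[j] for row in ratings) for j in range(n_categories)]
            p_j = [t / (n_subjects * n_raters) for t in totals]
            p_e = sum(p * p for p in p_j)                 # chance agreement
            p_i = [(sum(c * c for c in row) - n_raters)
                   / (n_raters * (n_raters - 1)) for row in ratings]
            p_bar = sum(p_i) / n_subjects                 # observed agreement
            return (p_bar - p_e) / (1 - p_e)

        # Hypothetical: 5 scans, 4 raters, categories (infarct, no infarct)
        print(round(fleiss_kappa([[4, 0], [3, 1], [4, 0], [2, 2], [4, 0]]), 2))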

  17. Confidence intervals permit, but don't guarantee, better inference than statistical significance testing

    Directory of Open Access Journals (Sweden)

    Melissa Coulson

    2010-07-01

    A statistically significant result and a non-significant result may differ little, although their significance status may tempt an interpretation of difference. Two studies are reported that compared interpretations of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs; but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.
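
    A small numeric illustration of the paper's point: two studies with similar effects can straddle the significance threshold while their confidence intervals overlap almost completely. The means, standard deviations, and sample sizes below are invented:

        import math

        def mean_ci(mean, sd, n, z=1.96):
            """Approximate 95% confidence interval for a mean."""
            half = z * sd / math.sqrt(n)
            return mean - half, mean + half

        ci1 = mean_ci(3.6, 10.0, 44)   # excludes 0: "significant"
        ci2 = mean_ci(2.2, 10.0, 36)   # includes 0: "non-significant"
        print(ci1, ci2)
        # The intervals overlap heavily: the two results are consistent,
        # despite their differing significance status.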

  18. Interpretive Reporting of Protein Electrophoresis Data by Microcomputer

    Science.gov (United States)

    Talamo, Thomas S.; Losos, Frank J.; Kessler, G. Frederick

    1982-01-01

    A microcomputer-based system for interpretive reporting of protein electrophoretic data has been developed. Data for serum, urine, and cerebrospinal fluid protein electrophoreses, as well as immunoelectrophoresis, can be entered. Patient demographic information is entered through the keyboard, followed by manual entry of total and fractionated protein levels obtained after densitometer scanning of the electrophoretic strip. The patterns are then coded, interpreted, and final reports generated; in most cases interpretation time is less than one second. Misinterpretation by the computer is uncommon and can be corrected by edit functions within the system. Discrepancies between computer and pathologist interpretation are automatically stored in a data file for later review and possible program modification. Any or all previous tests on a patient may be reviewed, with graphic display of the electrophoretic pattern. The system has been in use for several months and is well accepted by both laboratory and clinical staff. It also allows rapid storage, retrieval, and analysis of protein electrophoretic data.

  19. A statistical model for interpreting computerized dynamic posturography data

    Science.gov (United States)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
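
    A simplified stand-in for the censoring idea described above: if a fall is recorded as ES = 0, the likelihood treats such trials as left-censored observations of a latent score (a Tobit-style model; the paper's actual model instead makes the fall probability depend on the realized latent ES). All numbers are hypothetical:

        import math

        def norm_logpdf(x, mu, sigma):
            z = (x - mu) / sigma
            return -0.5 * z * z - math.log(sigma * math.sqrt(2 * math.pi))

        def norm_logcdf(x, mu, sigma):
            return math.log(0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2)))))

        def censored_nll(params, scores):
            """Negative log-likelihood with ES left-censored at 0 (a fall)."""
            mu, sigma = params
            nll = 0.0
            for es in scores:
                if es <= 0:                  # fall: latent ES unobserved
                    nll -= norm_logcdf(0.0, mu, sigma)
                else:                        # ordinary observed trial
                    nll -= norm_logpdf(es, mu, sigma)
            return nll

        trials = [72, 65, 0, 80, 58, 0, 75]  # hypothetical ESs; 0 = fall
        print(round(censored_nll((70.0, 15.0), trials), 2))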

  20. Scanning the periphery.

    Science.gov (United States)

    Day, George S; Schoemaker, Paul J H

    2005-11-01

    Companies often face new rivals, technologies, regulations, and other environmental changes that seem to come out of left field. How can they see these changes sooner and capitalize on them? Such changes often begin as weak signals on what the authors call the periphery, or the blurry zone at the edge of an organization's vision. As with human peripheral vision, these signals are difficult to see and interpret but can be vital to success or survival. Unfortunately, most companies lack a systematic method for determining where on the periphery they should be looking, how to interpret the weak signals they see, and how to allocate limited scanning resources. This article provides such a method: a question-based framework for helping companies scan the periphery more efficiently and effectively. The framework divides questions into three categories: learning from the past (What have been our past blind spots? What instructive analogies do other industries offer? Who in the industry is skilled at picking up weak signals and acting on them?); evaluating the present (What important signals are we rationalizing away? What are our mavericks, outliers, complainers, and defectors telling us? What are our peripheral customers and competitors really thinking?); and envisioning the future (What future surprises could really hurt or help us? What emerging technologies could change the game? Is there an unthinkable scenario that might disrupt our business?). Answering these questions is a good first step toward anticipating problems or opportunities that may appear on the business horizon. The article concludes with a self-test that companies can use to assess their need and capability for peripheral vision.

  1. Bone scanning in the evaluation of lung cancer

    International Nuclear Information System (INIS)

    Jung, Kun Sik; Zeon, Seok Kil; Lee, Hee Jung; Song, Hong Suk

    1994-01-01

    We studied the diagnostic significance of bone scans in the evaluation of bone metastasis from lung cancer, the prevalence rate, and the causes of false-positive bone scans and of soft-tissue accumulation of bone-seeking agent. The subjects included 73 lung cancer patients who underwent bone scans. We analyzed the frequency of metastasis, its distribution and configuration, and any relationship between bone pain and the corresponding region on the bone scan. The positive findings of bone scans were compared with plain X-ray films, CT, MRI, and other diagnostic modalities, and the false-positive bone scans and soft-tissue accumulations of bone-seeking agent were analyzed. Positive findings on bone scans were noted in 26 cases (36%) and coexisted with bone pain in 30%. The correspondence between bone scan and bone X-ray was 38%. False-positive bone scans were seen in 12 cases (16%), including fracture due to thoracotomy or trauma, degenerative bone disease, and bifid rib. Accumulation of bone-seeking agent in soft tissue was seen in 13 cases (18%), including primary tumor, enlarged cervical lymph nodes, pleural effusion, ascites, and pleural thickening. Bone scans should be interpreted carefully when searching for bone metastasis from a primary malignancy, given the 16% false-positive rate and the 18% soft-tissue accumulation rate. It is also important to note that the correlation between bone pain and positive bone scan findings was only 30%.

  3. Interpreting Statistical Findings A Guide For Health Professionals And Students

    CERN Document Server

    Walker, Jan

    2010-01-01

    This book is aimed at those studying and working in the field of health care, including nurses and the professions allied to medicine, who have little prior knowledge of statistics but for whom critical review of research is an essential skill.

  4. hepawk - A language for scanning high energy physics events

    International Nuclear Information System (INIS)

    Ohl, T.

    1992-01-01

    We present the programming language hepawk, designed for convenient scanning of data structures arising in the simulation of high energy physics events. The interpreter for this language has been implemented in FORTRAN-77, therefore hepawk runs on any machine with a FORTRAN-77 compiler. (orig.)

  5. Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly

    International Nuclear Information System (INIS)

    Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A.

    1998-01-01

    Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors, or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory, Twinsburg, OH) using high-resolution, fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age...

  7. Statistical methods in quality assurance

    International Nuclear Information System (INIS)

    Eckhard, W.

    1980-01-01

    During the different phases of a production process - planning, development and design, manufacturing, assembling, etc. - most decisions rest on a basis of statistics: the collection, analysis, and interpretation of data. Statistical methods can be thought of as a kit of tools for solving problems in the quality functions of the quality loop, with the aim of producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated. (RW)

  8. Penalized likelihood and multi-objective spatial scans for the detection and inference of irregular clusters

    Directory of Open Access Journals (Sweden)

    Fonseca Carlos M

    2010-10-01

    Background: Irregularly shaped spatial clusters are difficult to delineate. A cluster found by an algorithm often spreads through large portions of the map, impacting its geographical meaning. Penalized likelihood methods for Kulldorff's spatial scan statistic have been used to control the excessive freedom of the shape of clusters, and penalty functions based on cluster geometry and non-connectivity have been proposed recently. Another approach involves the use of a multi-objective algorithm to maximize two objectives: the spatial scan statistic and a geometric penalty function. Results & Discussion: We present a novel scan statistic algorithm employing a function based on graph topology to penalize the presence of under-populated disconnection nodes in candidate clusters, the disconnection-nodes cohesion function. A disconnection node is defined as a region within a cluster whose removal disconnects the cluster. By applying this function, the most geographically meaningful clusters are sifted from the immense set of possible irregularly shaped candidate cluster solutions. To evaluate the statistical significance of solutions for multi-objective scans, a statistical approach based on the concept of the attainment function is used. In this paper we compare different penalized likelihoods employing the geometric and non-connectivity regularity functions and the novel disconnection-nodes cohesion function. We also build multi-objective scans using those three functions and compare them with the previous penalized-likelihood scans. An application is presented using comprehensive state-wide data for Chagas' disease in puerperal women in Minas Gerais state, Brazil. Conclusions: We show that, compared to the single-objective algorithms, multi-objective scans present better performance regarding power, sensitivity, and positive predictive value. The multi-objective non-connectivity scan is faster and better suited for the...
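
    The disconnection nodes defined above are exactly the articulation points of the cluster's adjacency graph; a minimal sketch of finding them on a toy region map (using networkx):

        import networkx as nx

        def disconnection_nodes(cluster_regions, adjacency):
            """Regions whose removal disconnects the candidate cluster:
            the articulation points of its adjacency subgraph."""
            g = nx.Graph()
            g.add_nodes_from(cluster_regions)
            g.add_edges_from((a, b) for a, b in adjacency
                             if a in cluster_regions and b in cluster_regions)
            return set(nx.articulation_points(g))

        # Toy region map: a chain 1-2-3 with a leaf 4 attached to 3.
        print(disconnection_nodes({1, 2, 3, 4}, [(1, 2), (2, 3), (3, 4)]))
        # {2, 3}: the penalty would then check how populated these are.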

  9. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  10. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

    This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well known in parts of industry but less so in academia, to analysing and interpreting process data.

  11. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans

    NARCIS (Netherlands)

    Ritchie, A.J.; Sanghera, C.; Jacobs, C.; Zhang, W.; Mayo, J.; Schmidt, H.; Gingras, M.; Pasian, S.; Stewart, L.; Tsai, S.; Manos, D.; Seely, J.M.; Burrowes, P.; Bhatia, R.; Atkar-Khattra, S.; Ginneken, B. van; Tammemagi, M.; Tsao, M.S.; Lam, S.; et al.,

    2016-01-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in

  12. Over-all accuracy of 99mTc-pertechnetate brain scanning for brain tumours

    International Nuclear Information System (INIS)

    Bjoernsson, O.G.; Petursson, E.; Sigurbjoernsson, B.; Davidsson, D.

    1978-01-01

    A 3-year follow-up and re-evaluation of all scans of all patients referred for brain scanning in Iceland during one year was performed in order to assess the diagnostic reliability of radioisotope scanning for brain tumours. The study included 471 patients; of these, 25 had primary brain tumours and 7 had brain metastases. Scans were positive and correctly interpreted in 68% of the patients with primary brain tumours and in 3 of the 7 patients with metastases. The over-all accuracy of brain scanning for brain tumours, defined as the total number of correct positive and correct negative scans versus the total number of scans examined, was 96%, this figure being mainly influenced by the high number of true negative scans. (orig.)
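
    The over-all accuracy defined above is (true positives + true negatives) / all scans. The counts below are an approximate reconstruction from the figures quoted (68% of 25 primary tumours is about 17, plus 3 of 7 metastases, giving about 20 true positives; a 96% over-all accuracy over 471 scans then implies roughly 432 true negatives and 7 false positives):

        def overall_accuracy(tp, tn, fp, fn):
            """Correct positives plus correct negatives over all scans."""
            return (tp + tn) / (tp + tn + fp + fn)

        tp, fn = 20, 12     # tumour patients: positive / negative scans
        fp, tn = 7, 432     # tumour-free patients: positive / negative scans
        print(round(overall_accuracy(tp, tn, fp, fn), 2))   # 0.96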

  13. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    Science.gov (United States)

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime-scene matches that of a suspect, the weight of DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is required to establish and expand the databases that reflect the actual allele frequencies in the population applied. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16) including five, new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
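
    A minimal sketch of how an inbreeding coefficient such as the F_IS = 0.0106 reported above enters profile frequency estimation, using the standard Wright-corrected genotype frequencies; the allele frequencies below are hypothetical:

        def genotype_frequency(p, q=None, f=0.0106):
            """Wright-corrected genotype frequency; homozygote if q is None.
            f defaults to the F_IS value reported above."""
            if q is None:
                return p * p + p * (1 - p) * f
            return 2 * p * q * (1 - f)

        def profile_match_probability(genotypes):
            """Product over independent STR loci; (p,) for a homozygote,
            (p, q) for a heterozygote. Illustrative only."""
            pm = 1.0
            for g in genotypes:
                pm *= genotype_frequency(*g)
            return pm

        profile = [(0.12,), (0.08, 0.21), (0.15, 0.05)]   # hypothetical loci
        print(f"{profile_match_probability(profile):.3e}")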

  14. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    International Nuclear Information System (INIS)

    Benzecri, J.P.

    1986-01-01

    Without attempting to identify an ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first three of these five theses.
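
    For orientation, the evolution problem referred to above is, in the standard unitary case, governed by the von Neumann equation (stated here in LaTeX; the article's discussion is more general):

        i\hbar \frac{d\rho}{dt} = [H, \rho],
        \qquad
        \rho(t) = U(t)\,\rho(0)\,U^{\dagger}(t),
        \qquad
        U(t) = e^{-iHt/\hbar}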

  15. Automated Quantification of Stroke Damage on Brain Computed Tomography Scans: e-ASPECTS

    Directory of Open Access Journals (Sweden)

    James Hampton-Till

    2015-08-01

    Emergency radiological diagnosis of acute ischaemic stroke requires the accurate detection and appropriate interpretation of relevant imaging findings. Non-contrast computed tomography (CT) provides fast and low-cost assessment of the early signs of ischaemia and is the most widely used diagnostic modality for acute stroke. The Alberta Stroke Program Early CT Score (ASPECTS) is a quantitative and clinically validated method to measure the extent of ischaemic signs on brain CT scans. The CE-marked electronic ASPECTS (e-ASPECTS) software automates the ASPECTS score. The Anglia Ruskin Clinical Trials Unit (ARCTU) independently carried out a clinical investigation of the e-ASPECTS software, an automated scoring system that can be integrated into the diagnostic pathway of an acute ischaemic stroke patient, thereby assisting the physician with expert interpretation of the brain CT scan. Here we describe a literature review of the clinical importance of reliable assessment of early ischaemic signs on plain CT scans, and of technologies automating these scoring systems for ischaemic stroke on CT scans, focusing on the e-ASPECTS software. To be suitable for critical appraisal in this evaluation, published studies needed a sample size of at least 10 cases. All randomised studies were screened, and data deemed relevant to the demonstration of ASPECTS performance were appraised. The literature review focused on three domains: (i) interpretation of brain CT scans of stroke patients, (ii) the application of the ASPECTS score in ischaemic stroke, and (iii) automation of brain CT analysis. Finally, the appraised references are discussed in the context of the clinical impact of e-ASPECTS and the expected performance, which will be independently evaluated in a non-inferiority study conducted by the ARCTU.

  16. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings, using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that, if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis, and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  17. Image quality of multiplanar reconstruction of pulmonary CT scans using adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Honda, O; Yanagawa, M; Inoue, A; Kikuyama, A; Yoshida, S; Sumikawa, H; Tobino, K; Koyama, M; Tomiyama, N

    2011-04-01

    We investigated the image quality of multiplanar reconstruction (MPR) using adaptive statistical iterative reconstruction (ASIR). Inflated and fixed lungs were scanned with a garnet-detector CT in high-resolution mode (HR mode) or non-high-resolution mode (non-HR mode), and MPR images were then reconstructed. Observers compared 15 MPR images of ASIR (40%) and ASIR (80%) with those of ASIR (0%), and assessed image quality using a visual five-point scale (1, definitely inferior; 5, definitely superior), with particular emphasis on normal pulmonary structures, artefacts, noise, and overall image quality. The mean overall image-quality scores in HR mode were 3.67 with ASIR (40%) and 4.97 with ASIR (80%); those in non-HR mode were 3.27 with ASIR (40%) and 3.90 with ASIR (80%). The mean artefact scores in HR mode were 3.13 with ASIR (40%) and 3.63 with ASIR (80%), but those in non-HR mode were 2.87 with ASIR (40%) and 2.53 with ASIR (80%). The mean scores of the other parameters were greater than 3, and those in HR mode were higher than those in non-HR mode. There were significant differences between ASIR (40%) and ASIR (80%) in overall image quality. ASIR did not suppress the severe artefacts caused by contrast medium. In general, MPR image quality with ASIR (80%) was superior to that with ASIR (40%); however, there was an increased incidence of artefacts with ASIR when CT images were obtained in non-HR mode.

  18. Feature combination networks for the interpretation of statistical machine learning models: application to Ames mutagenicity.

    Science.gov (United States)

    Webb, Samuel J; Hanser, Thierry; Howlin, Brendan; Krause, Paul; Vessey, Jonathan D

    2014-03-25

    A new algorithm has been developed to enable the interpretation of black-box models. The algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys, and hashed fingerprints. It has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query, and an output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen, as there is no change in the prediction; the interpretation is produced directly from the model's behaviour on the specific query. Models were built using multiple learning algorithms, including support vector machine and random forest, on public Ames mutagenicity data with a variety of fingerprint descriptors. These models produced good performance in both internal and external validation, with accuracies around 82%, and were used to evaluate the interpretation algorithm. The interpretations revealed close links with understood mechanisms for Ames mutagenicity. This methodology allows for greater utilisation of the predictions made by black-box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development.
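
    A toy analogue of the interpretation idea described above: mask the fingerprint bits belonging to a fragment, re-score, and report the sign of the change as activating or deactivating. The model, fingerprint, and fragment names below are invented; the published algorithm enumerates fragments from the query structure itself:

        def fragment_contributions(predict, fingerprint, fragments):
            """Mask each fragment's bits, re-score, and report whether the
            fragment was activating or deactivating for this query."""
            base = predict(fingerprint)
            report = {}
            for name, bits in fragments.items():
                delta = base - predict(fingerprint - bits)
                report[name] = ("activating" if delta > 0 else "deactivating",
                                round(delta, 3))
            return report

        def toy_predict(bits):   # invented model: bits 3, 7 activate; 11 protects
            score = 0.2 + 0.3 * (3 in bits) + 0.3 * (7 in bits) - 0.2 * (11 in bits)
            return min(1.0, max(0.0, score))

        fp = {1, 3, 7, 11}
        frags = {"nitroaromatic": {3, 7}, "bulky substituent": {11}}
        print(fragment_contributions(toy_predict, fp, frags))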

  19. Interpretation of the MEG-MUSIC scan in biomagnetic source localization

    Energy Technology Data Exchange (ETDEWEB)

    Mosher, J.C.; Lewis, P.S. [Los Alamos National Lab., NM (United States); Leahy, R.M. [University of Southern California, Los Angeles, CA (United States). Signal and Image Processing Inst.

    1993-09-01

    MEG-MUSIC is a new approach to MEG source localization, based on a spatio-temporal source model in which the observed biomagnetic fields are generated by a small number of current dipole sources with fixed positions and orientations and varying strengths. From the spatial covariance matrix of the observed fields, a signal subspace can be identified; the rank of this subspace is equal to the number of elemental sources present. This signal subspace is used in a projection metric that scans the three-dimensional head volume. Given a perfect signal-subspace estimate and a perfect forward model, the metric will peak at unity at each dipole location. In practice, the signal-subspace estimate is contaminated by noise, which in turn yields MUSIC peaks that are less than unity. Previously we examined the lower bounds on localization error, independent of the choice of localization procedure. In this paper, we analyze the effects of noise and temporal coherence on the signal-subspace estimate and the resulting effects on the MEG-MUSIC peaks.
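
    A minimal numeric sketch of the MUSIC projection metric described above: estimate the signal subspace from the spatial covariance and measure how much of a candidate dipole's (normalized) lead-field vector lies within it. The sensor count, lead fields, and time courses are synthetic:

        import numpy as np

        def music_metric(data, lead_field, n_sources):
            """Subspace correlation: fraction of a candidate dipole's
            (normalized) lead-field energy lying in the signal subspace.
            Approaches 1 at true source locations as noise vanishes."""
            cov = data @ data.T / data.shape[1]      # spatial covariance
            u, _, _ = np.linalg.svd(cov)
            signal_space = u[:, :n_sources]          # rank = number of dipoles
            g = lead_field / np.linalg.norm(lead_field)
            proj = signal_space.T @ g
            return float((proj ** 2).sum())

        rng = np.random.default_rng(0)
        g1, g2 = rng.normal(size=(16, 1)), rng.normal(size=(16, 1))  # 16 sensors
        data = g1 @ rng.normal(size=(1, 200)) + g2 @ rng.normal(size=(1, 200))
        print(music_metric(data, g1, 2))                        # ~1.0
        print(music_metric(data, rng.normal(size=(16, 1)), 2))  # < 1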

  20. Structural interpretation of seismic data and inherent uncertainties

    Science.gov (United States)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretive basis is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that individual practitioners deploy to make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations: experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts are studied in depth to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with...

  1. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  3. Local crystallography analysis for atomically resolved scanning tunneling microscopy images

    International Nuclear Information System (INIS)

    Lin, Wenzhi; Li, Qing; Belianinov, Alexei; Gai, Zheng; Baddorf, Arthur P; Pan, Minghu; Jesse, Stephen; Kalinin, Sergei V; Sales, Brian C; Sefat, Athena

    2013-01-01

    Scanning probe microscopy has emerged as a powerful and flexible tool for atomically resolved imaging of surface structures. However, given the amount of information extracted, in many cases the interpretation of such data is limited to qualitative and semi-quantitative analysis. At the same time, much can be learned from local atomic parameters, such as distances and angles, that can be analyzed and interpreted as variations of local chemical bonding or as order-parameter fields. Here, we demonstrate an iterative algorithm for indexing and determining atomic positions that allows the analysis of inhomogeneous surfaces. The approach is illustrated by local crystallographic analysis of several real surfaces, including highly ordered pyrolytic graphite and the Fe-based superconductor FeTe0.55Se0.45. This study provides a new pathway to extract and quantify local properties from scanning probe microscopy images. (paper)
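
    The first step of such local crystallographic analysis, locating atomic sites as local maxima of the image, can be sketched briefly; the test image is a synthetic lattice of Gaussian bumps, and the subsequent indexing against an ideal lattice is omitted:

        import numpy as np
        from scipy.ndimage import gaussian_filter, maximum_filter

        def atom_positions(image, neighborhood=5, threshold=0.5):
            """Atomic sites as local maxima of a lightly smoothed image;
            lattice indexing against an ideal cell would follow."""
            smoothed = gaussian_filter(image, sigma=1.0)
            peaks = maximum_filter(smoothed, size=neighborhood) == smoothed
            peaks &= smoothed > threshold * smoothed.max()
            return np.argwhere(peaks)            # (row, col) for each atom

        # Synthetic lattice of Gaussian bumps as a stand-in STM image:
        y, x = np.mgrid[0:64, 0:64]
        image = sum(np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 4.0)
                    for cx in range(8, 64, 12) for cy in range(8, 64, 12))
        print(len(atom_positions(image)))        # 25 sites on a 5 x 5 grid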

  4. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy: the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research

  5. Introduction to statistics

    CERN Multimedia

    CERN. Geneva

    2005-01-01

    The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  6. Introduction to statistics

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  7. MR guided spatial normalization of SPECT scans

    International Nuclear Information System (INIS)

    Crouch, B.; Barnden, L.R.; Kwiatek, R.

    2010-01-01

    Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)
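
    The affine coregistration step described above maximises mutual information between the SPECT and MR volumes. A minimal histogram-based version of that objective (ours, not SPM's implementation) looks like this:

      import numpy as np

      def mutual_information(img_a, img_b, bins=32):
          """Histogram-based mutual information between two same-shape images."""
          joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)      # marginal of image A
          py = pxy.sum(axis=0, keepdims=True)      # marginal of image B
          nz = pxy > 0                             # avoid log(0)
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # an optimiser would adjust the affine parameters to maximise this value
      rng = np.random.default_rng(1)
      mr = rng.random((64, 64))
      spect = 0.7 * mr + 0.3 * rng.random((64, 64))  # partially shared signal
      print(mutual_information(mr, spect))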

  8. The Statistics of wood assays for preservative retention

    Science.gov (United States)

    Patricia K. Lebow; Scott W. Conklin

    2011-01-01

    This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.

  9. Prospective evaluation of radionuclide scanning in detection of intestinal necrosis in neonatal necrotizing enterocolitis

    International Nuclear Information System (INIS)

    Haase, G.M.; Sfakianakis, G.N.; Lobe, T.E.; Boles, E.T.

    1981-01-01

    The ability of external imaging to demonstrate intestinal infarction in neonatal necrotizing enterocolitis (NEC) was prospectively evaluated. The radiopharmaceutical technetium-99m diphosphonate was injected intravenously and the patients subsequently underwent abdominal scanning. Clinical patient care and interpretation of the images were entirely independent throughout the study. Of 33 studies, 7 were positive, 4 were suspicious, and 22 were negative. One false-positive study detected ischemia without transmural infarction. The second false-positive scan occurred postoperatively and was due to misinterpretation of the hyperactivity along the surgical incision. None of the suspicious cases had damaged bowel. The two false-negative studies clearly failed to demonstrate frank intestinal necrosis. The presence of very small areas of infarction, errors in technical settings, subjective interpretation of scans and delayed clearance of the radionuclide in a critically ill neonate may all limit the accuracy of external abdominal scanning. However, in spite of an error rate of 12%, it is likely that this technique will enhance the present clinical, laboratory, and radiologic parameters of patient management in NEC.

  10. The emergent Copenhagen interpretation of quantum mechanics

    Science.gov (United States)

    Hollowood, Timothy J.

    2014-05-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent bookkeeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.

  11. The emergent Copenhagen interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Hollowood, Timothy J

    2014-01-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent bookkeeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR–Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems. (paper)

  12. Xenon ventilation-perfusion lung scans. The early diagnosis of inhalation injury

    International Nuclear Information System (INIS)

    Schall, G.L.; McDonald, H.D.; Carr, L.B.; Capozzi, A.

    1978-01-01

    The use of xenon Xe-133 ventilation-perfusion lung scans for the early diagnosis of inhalation injury was evaluated in 67 patients with acute thermal burns. Study results were interpreted as normal if there was complete pulmonary clearance of the radioactive gas by 150 seconds. Thirty-two scans were normal, 32 abnormal, and three technically inadequate. There were three false-positive study results and one false-negative study result. Good correlation was found between the scan results and various historical, physical, and laboratory values currently used to evaluate inhalation injury. The scans appeared to be the most sensitive method for the detection of early involvement, often being abnormal several days before the chest roentgenogram. Xenon lung scanning is a safe, easy, accurate, and sensitive method for the early diagnosis of inhalation injury and has important therapeutic and prognostic implications as well.

  13. Image statistics and nonlinear artifacts in composed transmission x-ray tomography

    International Nuclear Information System (INIS)

    Duerinckx, A.J.G.

    1979-01-01

    Knowledge of the image quality and image statistics in Computed Tomography (CT) images obtained with transmission x-ray CT scanners can increase the amount of clinically useful information that can be retrieved. Artifacts caused by nonlinear shadows are strongly object-dependent and are visible over larger areas of the image. No simple technique exists for their complete elimination. One source of artifacts in the first-order statistics is the nonlinearities in the measured shadow or projection data used to reconstruct the image. One of the leading causes is the polychromaticity of the x-ray beam used in transmission CT scanners. Ways to improve the resulting image quality and techniques to extract additional information using dual-energy scanning are discussed. A unique formalism consisting of a vector representation of the material dependence of the photon-tissue interactions is generalized to allow an in-depth analysis. Poly-correction algorithms are compared using this analytic approach. Both quantum and detector electronic noise decrease the quality or information content of first-order statistics. Preliminary results are presented using a heuristic adaptive nonlinear noise filter system for projection data. This filter system can be improved and/or modified to remove artifacts in both first- and second-order image statistics. Artifacts in the second-order image statistics arise from the contribution of quantum noise. This can be described with a nonlinear detection equivalent model, similar to the model used to study artifacts in first-order statistics. When analyzing these artifacts in second-order statistics, one can divide them into linear artifacts, which do not present any problem of interpretation, and nonlinear artifacts, referred to as noise artifacts. A study of noise artifacts is presented together with a discussion of their relative importance in diagnostic radiology.
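
    The vector representation mentioned above treats the measured attenuation at two tube energies as a linear combination of two basis interactions, so a per-pixel dual-energy decomposition reduces to a 2x2 linear solve. The following is a toy sketch with invented basis coefficients, not the thesis's formalism.

      import numpy as np

      # rows: low/high tube energy; columns: two basis interactions
      # (photoelectric-like, Compton-like). Values are made up.
      A = np.array([[0.80, 0.20],
                    [0.30, 0.18]])

      def decompose(mu_low, mu_high):
          """Recover the two basis-material coefficients for one voxel."""
          return np.linalg.solve(A, np.array([mu_low, mu_high]))

      a1, a2 = decompose(mu_low=0.45, mu_high=0.23)
      print(f"photoelectric-like={a1:.3f}  Compton-like={a2:.3f}")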

  14. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

    having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database...... searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here....

  15. Statistical and stochastic aspects of the delocalization problem in quantum mechanics

    International Nuclear Information System (INIS)

    Claverie, P.; Diner, S.

    1976-01-01

    The space-time behaviour of electrons in atoms and molecules is reviewed. The wave conception of the electron is criticized and the poverty of the non-reductionist attitude is underlined. Further, the two main interpretations of quantum mechanics are recalled: the Copenhagen and the Statistical Interpretations. The meaning and the successes of the Statistical Interpretation are explained and it is shown that it does not solve all problems because quantum mechanics is irreducible to a classical statistical theory. The fluctuation of the particle number and its relationship to loge theory, delocalization and correlation is studied. Finally, different stochastic models for microphysics are reviewed. The Markovian Fényes-Nelson process allows an interpretation of the original heuristic considerations of Schrödinger. Non-Markov processes with Schrödinger time evolution are shown to be equivalent to the base state analysis of Feynman, but they are unsatisfactory from a probabilistic point of view. Stochastic electrodynamics is presented as the most satisfactory conception nowadays.

  16. Statistical Surface Recovery: A Study on Ear Canals

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Olesen, Oline Vinter; Paulsen, Rasmus Reinhold

    2012-01-01

    We present a method for surface recovery in partial surface scans based on a statistical model. The framework is based on multivariate point prediction, where the distribution of the points is learned from an annotated data set. The training set consists of surfaces with dense correspondence...... that are Procrustes aligned. The average shape and point covariances can be estimated from this set. It is shown how missing data in a new given shape can be predicted using the learned statistics. The method is evaluated on a data set of 29 scans of ear canal impressions. By using a leave-one-out approach we...

  17. A decision support system and rule-based algorithm to augment the human interpretation of the 12-lead electrocardiogram.

    Science.gov (United States)

    Cairns, Andrew W; Bond, Raymond R; Finlay, Dewar D; Guldenring, Daniel; Badilini, Fabio; Libretti, Guido; Peace, Aaron J; Leslie, Stephen J

    The 12-lead Electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, there is a significant cognitive workload required from the interpreter. This complexity in ECG interpretation often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation, improving interpretation accuracy and reducing missed co-abnormalities. The Differential Diagnoses Algorithm (DDA) was developed using web technologies where diagnostic ECG criteria are defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed where subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. A total of 375 interpretations were collected. The IPI+DDA approach was shown to improve diagnostic accuracy by 8.7% (although not statistically significant, p-value=0.1852); the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. Although results were not found to be statistically significant, we found: 1) our decision support tool increased the number of correct interpretations, 2) the DDA algorithm suggested the correct
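
    A minimal sketch of the rule-based suggestion step in the spirit of the DDA: diagnostic criteria stored as JSON are matched against the interpreter's annotations. The criteria, field names and finding labels below are invented for illustration; the published system is more elaborate.

      import json

      # invented criteria: each rule lists the annotated findings it requires
      CRITERIA = json.loads("""
      [
        {"diagnosis": "Anterior STEMI",
         "requires": ["st_elevation_v2", "st_elevation_v3"]},
        {"diagnosis": "First-degree AV block",
         "requires": ["pr_interval_prolonged"]}
      ]
      """)

      def suggest_diagnoses(annotations, criteria=CRITERIA):
          """Return every diagnosis whose required findings were all annotated."""
          found = set(annotations)
          return [c["diagnosis"] for c in criteria
                  if found.issuperset(c["requires"])]

      print(suggest_diagnoses(["st_elevation_v2", "st_elevation_v3", "sinus"]))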

  18. Representations and Techniques for 3D Object Recognition and Scene Interpretation

    CERN Document Server

    Hoiem, Derek

    2011-01-01

    One of the grand challenges of artificial intelligence is to enable computers to interpret 3D scenes and objects from imagery. This book organizes and introduces major concepts in 3D scene and object representation and inference from still images, with a focus on recent efforts to fuse models of geometry and perspective with statistical machine learning. The book is organized into three sections: (1) Interpretation of Physical Space; (2) Recognition of 3D Objects; and (3) Integrated 3D Scene Interpretation. The first discusses representations of spatial layout and techniques to interpret physi

  19. Neural Correlates of Morphology Acquisition through a Statistical Learning Paradigm.

    Science.gov (United States)

    Sandoval, Michelle; Patterson, Dianne; Dai, Huanping; Vance, Christopher J; Plante, Elena

    2017-01-01

    The neural basis of statistical learning as it occurs over time was explored with stimuli drawn from a natural language (Russian nouns). The input reflected the "rules" for marking categories of gendered nouns, without making participants explicitly aware of the nature of what they were to learn. Participants were scanned while listening to a series of gender-marked nouns during four sequential scans, and were tested for their learning immediately after each scan. Although participants were not told the nature of the learning task, they exhibited learning after their initial exposure to the stimuli. Independent component analysis of the brain data revealed five task-related sub-networks. Unlike prior statistical learning studies of word segmentation, this morphological learning task robustly activated the inferior frontal gyrus during the learning period. This region was represented in multiple independent components, suggesting it functions as a network hub for this type of learning. Moreover, the results suggest that subnetworks activated by statistical learning are driven by the nature of the input, rather than reflecting a general statistical learning system.
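
    The sub-network decomposition described above relies on independent component analysis. As a generic illustration of the technique (not the authors' fMRI pipeline), FastICA can unmix synthetic time series:

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)
      t = np.linspace(0, 8, 2000)
      sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]  # two latent signals
      mixing = np.array([[1.0, 0.5],
                         [0.5, 2.0]])
      observed = sources @ mixing.T + 0.05 * rng.standard_normal((2000, 2))

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(observed)  # estimated independent components
      print(recovered.shape)                   # (2000, 2)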

  20. Metal artifact reduction of CT scans to improve PET/CT

    NARCIS (Netherlands)

    Van Der Vos, Charlotte S.; Arens, Anne I.J.; Hamill, James J.; Hofmann, Christian; Panin, Vladimir Y.; Meeuwis, Antoi P.W.; Visser, Eric P.; De Geus-Oei, Lioe Fee

    2017-01-01

    In recent years, different metal artifact reduction methods have been developed for CT. These methods have only recently been introduced for PET/CT even though they could be beneficial for interpretation, segmentation, and quantification of the PET/CT images. In this study, phantom and patient scans

  1. Metal Artifact Reduction of CT Scans to Improve PET/CT

    NARCIS (Netherlands)

    Vos, C.S. van der; Arens, A.I.J.; Hamill, J.J.; Hofmann, C.; Panin, V.Y.; Meeuwis, A.P.W.; Visser, E.P.; Geus-Oei, L.F. de

    2017-01-01

    In recent years, different metal artifact reduction methods have been developed for CT. These methods have only recently been introduced for PET/CT even though they could be beneficial for interpretation, segmentation, and quantification of the PET/CT images. In this study, phantom and patient scans

  2. Interpreting the Customary Rules on Interpretation

    NARCIS (Netherlands)

    Merkouris, Panos

    2017-01-01

    International courts have at times interpreted the customary rules on interpretation. This is interesting because what is being interpreted is: i) rules of interpretation, which sounds dangerously tautological, and ii) customary law, the interpretation of which has not been the object of critical

  3. A statistical analysis of volatile organic compounds observed during the TEXAQS2000 air quality study at LaPorte, Tx, using proton-transfer-reaction mass spectrometry

    International Nuclear Information System (INIS)

    Kuster, B.; Williams, E.; Fehsenfeld, F.; Jobson, T.; Fall, R.; Lindinger, W.; Karl, T.

    2002-01-01

    A statistical analysis of online VOC measurements obtained by proton-transfer-reaction mass spectrometry (PTR-MS) during the TEXAQS2000 intensive period is presented. The experiment was based at the La Porte site, near the Houston Ship Channel (HSC), where the instrument was deployed for continuous long-term monitoring. Multivariate techniques helped to identify various VOC sources in the vicinity of the HSC and distinguish between different anthropogenic emissions. An assessment is given of the selectivity and interpretation of mass scans from this online technique in a complex urban and industrial VOC matrix. (author)

  4. The relevance of electrostatics for scanning-gate microscopy

    International Nuclear Information System (INIS)

    Schnez, S; Guettinger, J; Stampfer, C; Ensslin, K; Ihn, T

    2011-01-01

    Scanning-probe techniques have been developed to extract local information from a given physical system. In particular, conductance maps obtained by means of scanning-gate microscopy (SGM), where a conducting tip of an atomic-force microscope is used as a local and movable gate, seem to present an intuitive picture of the underlying physical processes. Here, we argue that the interpretation of such images is complex and not very intuitive under certain circumstances: scanning a graphene quantum dot (QD) in the Coulomb-blockaded regime, we observe an apparent shift of features in scanning-gate images as a function of gate voltages, which cannot be a real shift of the physical system. Furthermore, we demonstrate the appearance of more than one set of Coulomb rings arising from the graphene QD. We attribute these effects to screening between the metallic tip and the gates. Our results are relevant for SGM on any kind of nanostructure, but are of particular importance for nanostructures that are not covered with a dielectric, e.g. graphene or carbon nanotube structures.

  5. Hunting Down Interpretations of the HERA Large-$Q^{2}$ data

    CERN Document Server

    Ellis, John R.

    1999-01-01

    Possible interpretations of the HERA large-Q^2 data are reviewed briefly. The possibility of statistical fluctuations cannot be ruled out, and it seems premature to argue that the H1 and ZEUS anomalies are incompatible. The data cannot be explained away by modifications of parton distributions, nor do contact interactions help. A leptoquark interpretation would need a large tau-q branching ratio. Several R-violating squark interpretations are still viable despite all the constraints, and offer interesting experimental signatures, but please do not hold your breath.

  6. Statistics Poster Challenge for Schools

    Science.gov (United States)

    Payne, Brad; Freeman, Jenny; Stillman, Eleanor

    2013-01-01

    The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.

  7. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  8. Effect of CT scanning parameters on volumetric measurements of pulmonary nodules by 3D active contour segmentation: a phantom study

    International Nuclear Information System (INIS)

    Way, Ted W; Chan, H-P; Goodsitt, Mitchell M; Sahiner, Berkman; Hadjiiski, Lubomir M; Zhou Chuan; Chughtai, Aamer

    2008-01-01

    The purpose of this study is to investigate the effects of CT scanning and reconstruction parameters on automated segmentation and volumetric measurements of nodules in CT images. Phantom nodules of known sizes were used so that segmentation accuracy could be quantified in comparison to ground-truth volumes. Spherical nodules having 4.8, 9.5 and 16 mm diameters and 50 and 100 mg cc⁻¹ calcium contents were embedded in lung-tissue-simulating foam which was inserted in the thoracic cavity of a chest section phantom. CT scans of the phantom were acquired with a 16-slice scanner at various tube currents, pitches, fields of view and slice thicknesses. Scans were also taken using identical techniques either within the same day or five months apart for study of reproducibility. The phantom nodules were segmented with a three-dimensional active contour (3DAC) model that we previously developed for use on patient nodules. The percentage volume errors relative to the ground-truth volumes were estimated under the various imaging conditions. There was no statistically significant difference in volume error for repeated CT scans or scans taken with techniques where only pitch, field of view, or tube current (mA) were changed. However, the slice thickness significantly (p < 0.05) affected the volume error. Therefore, to evaluate nodule growth, consistent imaging conditions and high resolution should be used for acquisition of the serial CT scans, especially for smaller nodules. Understanding the effects of scanning and reconstruction parameters on volume measurements by 3DAC allows better interpretation of data and assessment of growth. Tracking nodule growth with computerized segmentation methods would reduce inter- and intraobserver variabilities.
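
    The evaluation logic described above (percentage volume error against a known phantom volume, compared across scan settings with a one-way ANOVA) can be sketched as follows; the measurement numbers are synthetic placeholders, not study data.

      import numpy as np
      from scipy import stats

      def percent_volume_error(measured, truth):
          return 100.0 * (np.asarray(measured) - truth) / truth

      truth = 4.0 / 3.0 * np.pi * (4.8 / 2.0) ** 3   # 4.8 mm sphere, in mm^3

      rng = np.random.default_rng(3)
      # stand-in measurements for e.g. three slice thicknesses, 10 repeats each
      errors_by_setting = [
          percent_volume_error(truth * (1 + 0.05 * rng.standard_normal(10)), truth)
          for _ in range(3)
      ]
      f_stat, p_value = stats.f_oneway(*errors_by_setting)
      print(f"F={f_stat:.2f}, p={p_value:.3f}")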

  9. Accuracy of Buccal Scan Procedures for the Registration of Habitual Intercuspation.

    Science.gov (United States)

    Zimmermann, M; Ender, A; Attin, T; Mehl, A

    2018-04-09

    Accurate reproduction of the jaw relationship is important in many fields of dentistry. Maximum intercuspation can be registered with digital buccal scan procedures implemented in the workflow of many intraoral scanning systems. The aim of this study was to investigate the accuracy of buccal scan procedures with intraoral scanning devices for the registration of habitual intercuspation in vivo. The hypothesis was that there is no statistically significant difference for buccal scan procedures compared to registration methods with poured model casts. Ten individuals (full dentition, no dental rehabilitations) were subjects for five different habitual intercuspation registration methods: (CI) poured model casts, manual hand registration, buccal scan with inEOS X5; (BC) intraoral scan, buccal scan with CEREC Bluecam; (OC4.2) intraoral scan, buccal scan with CEREC Omnicam software version 4.2; (OC4.5β) intraoral scan, buccal scan with CEREC Omnicam version 4.5β; and (TR) intraoral scan, buccal scan with Trios 3. The buccal scan was repeated three times. Analysis of rotation (Rot) and translation (Trans) parameters was performed with difference analysis software (OraCheck). Statistical analysis was performed with one-way analysis of variance and the post hoc Scheffé test (p < 0.05). There were no statistically significant (p > 0.05) differences in terms of translation between groups CI_Trans (98.74±112.01 μm), BC_Trans (84.12±64.95 μm), OC4.2_Trans (60.70±35.08 μm), OC4.5β_Trans (68.36±36.67 μm), and TR_Trans (66.60±64.39 μm). For rotation, there were no significant differences (p > 0.05) for groups CI_Rot (0.23±0.25°), BC_Rot (0.73±0.52°), OC4.2_Rot (0.45±0.31°), OC4.5β_Rot (0.50±0.36°), and TR_Rot (0.47±0.65°). Intraoral scanning devices allow the reproduction of the static relationship of the maxillary and mandibular teeth with the same accuracy as registration methods with poured model casts.

  10. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

    In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative

  11. ISBN and QR Barcode Scanning Mobile App for Libraries

    Directory of Open Access Journals (Sweden)

    Graham McCarthy

    2011-04-01

    Full Text Available This article outlines the development of a mobile application for the Ryerson University Library. The application provides for ISBN barcode scanning that results in a lookup of library copies and services for the book scanned, as well as QR code scanning. Two versions of the application were developed, one for iOS and one for Android. The article includes some details on the free packages used for barcode scanning functionality. Source code for the Ryerson iOS and Android applications are freely available, and instructions are provided on customizing the Ryerson application for use in other library environments. Some statistics on the number of downloads of the Ryerson mobile app by users are included.

  12. Technetium 99mTc Pertechnetate Brain Scanning

    International Nuclear Information System (INIS)

    Rhee, Sang Min; Park, Jin Yung; Lee, Ahn Ki; Chung, Choo Il; Hong, Chang Gi; Rhee, Chong Heon; Koh, Chang Soon

    1968-01-01

    Technetium-99m (⁹⁹ᵐTc) pertechnetate brain scans were performed in 3 cases of head injury (2 chronic subdural hematomas and 1 acute epidural hematoma), 2 cases of brain abscess and 1 case of intracerebral hematoma associated with arteriovenous anomaly. In all the cases brain scintigrams showed 'hot areas.' The literature on radioisotope scanning of intracranial lesions was briefly reviewed. With the improvement of radioisotope scanners and the development of new radiopharmaceuticals, brain scanning became a safe and useful screening test for the diagnosis of intracranial lesions. Brain scanning can be easily performed even on a moribund patient without any of the discomfort and risk to the patient that are associated with cerebral angiography or pneumoencephalography. Brain scanning has been useful in the diagnosis of brain tumor, brain abscess, subdural hematoma, and cerebral vascular diseases. In 80 to 90% of brain tumors positive scintigrams can be expected. Early studies were done with ²⁰³Hg-Neohydrin or ¹³¹I-serum albumin. With these agents, however, patients receive rather much radiation to the whole body and kidneys. In 1965 Harper introduced ⁹⁹ᵐTc to reduce the radiation dose to the patient and improve statistical variation in isotope scanning.

  13. Momentum conservation decides Heisenberg's interpretation of the uncertainty formulas

    International Nuclear Information System (INIS)

    Angelidis, T.D.

    1977-01-01

    In the light of Heisenberg's interpretation of the uncertainty formulas, the conditions necessary for the derivation of the quantitative statement or law of momentum conservation are considered. The result of such considerations is a contradiction between the formalism of quantum physics and the asserted consequences of Heisenberg's interpretation. This contradiction decides against Heisenberg's interpretation of the uncertainty formulas, on the grounds that the formalism of quantum physics is both consistent and complete, at least insofar as the statement of momentum conservation can be proved within this formalism. A few comments are also included on Bohr's complementarity interpretation of the formalism of quantum physics. A suggestion, based on a statistical mode of empirical testing of the uncertainty formulas, does not give rise to any such contradiction.

  14. Multimodal integration in statistical learning

    DEFF Research Database (Denmark)

    Mitchell, Aaron; Christiansen, Morten Hyllekvist; Weiss, Dan

    2014-01-01

    , we investigated the ability of adults to integrate audio and visual input during statistical learning. We presented learners with a speech stream synchronized with a video of a speaker’s face. In the critical condition, the visual (e.g., /gi/) and auditory (e.g., /mi/) signals were occasionally...... facilitated participants’ ability to segment the speech stream. Our results therefore demonstrate that participants can integrate audio and visual input to perceive the McGurk illusion during statistical learning. We interpret our findings as support for modality-interactive accounts of statistical learning.......Recent advances in the field of statistical learning have established that learners are able to track regularities of multimodal stimuli, yet it is unknown whether the statistical computations are performed on integrated representations or on separate, unimodal representations. In the present study...

  15. An introduction to medical statistics

    International Nuclear Information System (INIS)

    Hilgers, R.D.; Bauer, P.; Scheiber, V.; Heitmann, K.U.

    2002-01-01

    This textbook teaches all aspects and methods of biometrics as a field of concentration in medical education. Instrumental interpretations of the theory, concepts and terminology of medical statistics are enhanced by numerous illustrations and examples. With problems, questions and answers. (orig./CB) [de]

  16. Ergodic theory, interpretations of probability and the foundations of statistical mechanics

    NARCIS (Netherlands)

    van Lith, J.H.

    2001-01-01

    The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time

  17. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  18. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics, the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics describes predominantly averaged properties of matter, reaching from few-particle systems and states of matter to stellar objects. Statistical thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of multiple-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  19. Combination and interpretation of observables in Cosmology

    Directory of Open Access Journals (Sweden)

    Virey Jean-Marc

    2010-04-01

    Full Text Available The standard cosmological model has deep theoretical foundations but needs the introduction of two major unknown components, dark matter and dark energy, to be in agreement with various observations. Dark matter describes a non-relativistic collisionless fluid of (non-baryonic) matter which amounts to 25% of the total density of the universe. Dark energy is a new kind of fluid, not of matter type, representing 70% of the total density, which should explain the recent acceleration of the expansion of the universe. Alternatively, one can reject the idea of adding one or two new components and argue instead that the equations used to make the interpretation should be modified on cosmological scales. Instead of dark matter one can invoke a failure of Newton's laws. Instead of dark energy, two approaches are proposed: general relativity (in terms of the Einstein equation) should be modified, or the cosmological principle which fixes the metric used for cosmology should be abandoned. One of the main objectives of the community is to find the path to the relevant interpretations thanks to the next generation of experiments, which should provide large statistics of observational data. Unfortunately, cosmological information is difficult to pin down directly from the measurements, and it is mandatory to combine the various observables to get the cosmological parameters. This is not problematic from the statistical point of view, but assumptions and approximations made for the analysis may bias our interpretation of the data. Consequently, strong attention should be paid to the statistical methods used for parameter estimation and model testing. After a review of the basics of cosmology where the cosmological parameters are introduced, we discuss the various cosmological probes and their associated observables used to extract cosmological information. We present the results obtained from several statistical analyses combining data of different nature but

  20. Visualization of the variability of 3D statistical shape models by animation.

    Science.gov (United States)

    Lamecker, Hans; Seebass, Martin; Lange, Thomas; Hege, Hans-Christian; Deuflhard, Peter

    2004-01-01

    Models of the 3D shape of anatomical objects and the knowledge about their statistical variability are of great benefit in many computer-assisted medical applications like image analysis, therapy or surgery planning. Statistical models of shape have successfully been applied to automate the task of image segmentation. The generation of 3D statistical shape models requires the identification of corresponding points on two shapes. This remains a difficult problem, especially for shapes of complicated topology. In order to interpret and validate variations encoded in a statistical shape model, visual inspection is of great importance. This work describes the generation and interpretation of statistical shape models of the liver and the pelvic bone.
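
    A point-based statistical shape model of the kind visualised in this work can be sketched generically: Procrustes-aligned training shapes yield a mean shape and principal modes of variation, and "animating" a mode means sweeping its coefficient. Synthetic 2D contours stand in for the liver and pelvic-bone meshes below; this is an illustrative toy, not the authors' pipeline.

      import numpy as np

      rng = np.random.default_rng(4)
      n_shapes, n_points = 20, 50
      angles = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
      base = np.c_[np.cos(angles), np.sin(angles)]
      # training set: ellipses with random elongation (one true mode)
      shapes = np.stack([
          base * (1 + 0.2 * rng.standard_normal() * np.array([1.0, -1.0]))
          for _ in range(n_shapes)
      ])
      X = shapes.reshape(n_shapes, -1)          # one row per training shape

      mean_shape = X.mean(axis=0)
      U, s, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
      mode = Vt[0]                              # first principal mode

      # animation frames: sweep the mode coefficient over +/- 2 std. dev.
      for c in (-2, 0, 2):
          frame = mean_shape + c * (s[0] / np.sqrt(n_shapes)) * mode
          print(f"coeff {c:+d}: first point = {frame[:2]}")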

  1. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    Science.gov (United States)

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
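
    The multiplicity arithmetic at issue (4800 tests at alpha = 0.05 giving 4800 × 0.05 = 240 expected chance significances) is binomial under the naive independence assumption that the critique itself relies on; a quick check, bearing in mind that real tumour-site tests are neither independent nor exactly at the nominal level, which is part of the rebuttal above:

      from scipy import stats

      n_tests, alpha = 4800, 0.05
      expected = n_tests * alpha                # 240 chance significances
      observed = 209
      # probability of seeing 209 or fewer "hits" if all nulls were true
      p_at_most = stats.binom.cdf(observed, n_tests, alpha)
      print(f"expected={expected:.0f}, P(X <= {observed}) = {p_at_most:.3f}")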

  2. Interpretation of scanning tunneling quasiparticle interference and impurity states in cuprates.

    Science.gov (United States)

    Kreisel, A; Choubey, Peayush; Berlijn, T; Ku, W; Andersen, B M; Hirschfeld, P J

    2015-05-29

    We apply a recently developed method combining first-principles based Wannier functions with solutions to the Bogoliubov-de Gennes equations to the problem of interpreting STM data in cuprate superconductors. We show that the observed images of Zn on the surface of Bi₂Sr₂CaCu₂O₈ can only be understood by accounting for the tails of the Cu Wannier functions, which include significant weight on apical O sites in neighboring unit cells. This calculation thus puts earlier crude "filter" theories on a microscopic foundation and solves a long-standing puzzle. We then study quasiparticle interference phenomena induced by out-of-plane weak potential scatterers, and show how patterns long observed in cuprates can be understood in terms of the interference of Wannier functions above the surface. Our results show excellent agreement with experiment and enable a better understanding of novel phenomena in the cuprates via STM imaging.

  3. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available. Comprehensive and accessible, Measurement and Statistics for Teachers includes: short vignettes showing concepts in action; numerous classroom examples; highlighted vocabulary; boxes summarizing related concepts; end-of-chapter exercises and problems; six full chapters devoted to the essential topic of classroom tests; instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests; and a five-chapter section on descriptive statistics, giving instructors the option of more thoroughly teaching basic measur...

  4. THE STATISTICAL INDICATORS OF POTATO PRODUCED IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Elena BULARCA

    2013-12-01

    Full Text Available In this study we have analyzed and interpreted the main statistical indicators of potato production in Romania. First of all, we start by presenting some information about potatoes: their origin and appearance, and their importance and necessity in the life of people and animals. Then, on the basis of the specific statistical indicators, we interpret the evolution of the cultivated area, the share of the main counties in the area cultivated with potatoes, the average yield per hectare, as well as the import and export of potatoes in a given period. Each indicator was analyzed, and corresponding remarks and conclusions have been drawn.

  5. Patient exposure during thyroid scan in Khartoum Hospital

    International Nuclear Information System (INIS)

    Saeed, N. E. B.

    2013-03-01

    The aim of this study was to measure patient exposure during thyroid scans using the technetium-99m radioactive isotope. The study was conducted on 35 patients undergoing thyroid scans, measured in the Alnelein diagnostic center; data collected for the study included age, sex, height, weight, the material used in the examination and the activity half-life of the material. The mean age was 41.83 years, the mean body mass index (BMI) was 24.40, and the average effective dose was 2.65±0.24 mSv. The collected data were analyzed with Excel software and a statistical analysis program, with analysis categories such as age and weight of the patient, time of scan, activity and effective dose; it was found that thyroid scans were more common in female than in male patients. (Author)

  6. Isocount scintillation scanner with preset statistical data reliability

    International Nuclear Information System (INIS)

    Ikebe, J.; Yamaguchi, H.; Nawa, O.A.

    1975-01-01

    A scintillation detector scans an object such as a live body along horizontal straight scanning lines in such a manner that the scintillation detector is stopped at each scanning point during the time interval T required for counting a predetermined number N of pulses. The rate R_N = N/T is then calculated, and output signal pulses whose number represents the rate R, or the corresponding output signal, are used as the recording signal for forming the scintigram. In contrast to the usual scanner, the isocount scanner scans an object stepwise in order to gather data with statistically uniform reliability.
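
    The preset-count idea can be illustrated with a small simulation: dwelling until a fixed N counts arrive makes the relative error of the rate estimate R_N = N/T approximately 1/sqrt(N) at every scan point, independent of the local count rate. This is our toy model, not the instrument's actual electronics.

      import numpy as np

      def isocount_rate(true_rate, n_preset, rng):
          """Dwell until n_preset counts arrive; return the estimate N/T."""
          # waiting time for N Poisson events is gamma-distributed
          T = rng.gamma(shape=n_preset, scale=1.0 / true_rate)
          return n_preset / T

      rng = np.random.default_rng(5)
      for rate in (10.0, 100.0, 1000.0):        # counts per second
          est = np.array([isocount_rate(rate, 400, rng) for _ in range(2000)])
          # relative spread stays near 1/sqrt(400) = 0.05 at every rate
          print(f"rate={rate:7.1f}  relative std = {est.std() / est.mean():.3f}")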

  7. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone to face the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the misconception test results and the statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with standard deviation 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with standard deviation 8.5. If the minimum value to meet the standard competence of a course is 65, the students' mean values are lower than the standard competence. The result of the misconception study emphasized which subtopics should be considered. Based on the assessment result, it was found that students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. In statistical reasoning skill, the assessment was done to measure reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  8. Introduction to Statistics course

    CERN Multimedia

    CERN. Geneva HR-RFA

    2006-01-01

    The four lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.

  9. Shnirelman peak in the level spacing statistics

    International Nuclear Information System (INIS)

    Chirikov, B.V.; Shepelyanskij, D.L.

    1994-01-01

    The first results on the statistical properties of quantum quasidegeneracy are presented. A physical interpretation of the Shnirelman theorem, which predicted the bulk quasidegeneracy, is given. The conditions for a strong impact of the degeneracy on the quantum level statistics are formulated, which allows the application of the Shnirelman theorem to be extended to a broad class of quantum systems. 14 refs., 3 figs

  10. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
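
    The inter-rater agreement figures quoted above (Fleiss' kappa) follow from a standard formula over a subjects × categories count matrix (number of raters assigning each scan to each category); a compact implementation with made-up ratings:

      import numpy as np

      def fleiss_kappa(counts):
          """counts[i, j]: raters assigning subject i to category j."""
          counts = np.asarray(counts, dtype=float)
          n = counts.sum(axis=1)[0]             # raters per subject (constant)
          p_j = counts.sum(axis=0) / counts.sum()
          P_i = (counts * (counts - 1)).sum(axis=1) / (n * (n - 1))
          P_bar, P_e = P_i.mean(), float((p_j ** 2).sum())
          return (P_bar - P_e) / (1 - P_e)

      # toy data: 5 raters, 4 scans, 3 diagnostic categories
      ratings = [[5, 0, 0],
                 [3, 2, 0],
                 [0, 4, 1],
                 [1, 1, 3]]
      print(f"kappa = {fleiss_kappa(ratings):.3f}")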

  11. Tumor markers and bone scan in breast cancer patients

    International Nuclear Information System (INIS)

    Ugrinska, A.; Vaskova, O.; Kraleva, S.; Petrova, D.; Smickova, S.

    2004-01-01

    Full text: The objective of this study was to compare the levels of CA 15-3 and CEA with the bone scan findings in patients with breast cancer. Retrospective analysis of 76 bone scans from 61 patients diagnosed with breast cancer in the last 5 years was performed by two nuclear medicine specialists. All bone scans were performed after surgical treatment of the disease. Patients with loco-regional residual disease or distant metastases in the liver, lung or the brain were excluded from the study. According to the bone scan, the patients were divided into 5 groups: normal bone scan (N), equivocal bone scan (E), single metastasis (1MS), three metastases (3MS) and multiple metastases (MMS). Tumor markers were determined within a month before or after the bone scan was performed. The cut-off value for CA 15-3 was 35 U/ml, and for CEA 3 ng/ml. Statistical analysis was performed using descriptive statistics and the Kolmogorov-Smirnov test. Bone metastases were revealed in 38% of the patients referred for bone scintigraphy, of which 26% had MMS, 7.8% had a single MS and 4% had 3MS. The results of 6.5% of the patients were classified as equivocal. The values of CA 15-3 were higher in all patient groups compared with the group that had a normal bone scan, but this difference reached statistical significance only in the groups with 3MS and MMS (p < 0.01). The values of CEA were significantly higher only in patients with multiple metastases when compared with group N (p < 0.01). Values higher than the cut-off value for CA 15-3 were found in 9 patients out of 42 in the group with a normal bone scan. The highest value of CA 15-3 in this group was 47 U/ml. Only one patient in this group showed elevated levels of CEA. Three patients in the group with a single metastasis had normal CA 15-3, while CEA was elevated only in one patient. All patients in the group with 3MS had elevated levels of CA 15-3 while CEA was in the normal range. All patients with MMS had elevated CA 15-3 values while CEA was elevated in
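
    The group comparison described above used the Kolmogorov-Smirnov test; with SciPy the two-sample version is a single call. The CA 15-3 values below are synthetic stand-ins, not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      ca153_normal_scan = rng.lognormal(mean=3.0, sigma=0.3, size=42)
      ca153_multiple_mets = rng.lognormal(mean=4.0, sigma=0.4, size=20)

      stat, p = stats.ks_2samp(ca153_normal_scan, ca153_multiple_mets)
      print(f"KS statistic = {stat:.2f}, p = {p:.4g}")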

  12. Logarithmic axicon characterized by scanning optical probe system.

    Science.gov (United States)

    Cao, Zhaolou; Wang, Keyi; Wu, Qinglin

    2013-05-15

    A scanning optical probe system is proposed to measure a logarithmic axicon (LA) with subwavelength resolution. Intensity profiles measured by a fiber probe in multiple planes are interpreted by solving an optimization problem to obtain the phase retardation function (PRF) of the LA. Experimental results show that this approach can accurately recover the PRF, from which the optical path difference of the generated quasi-nondiffracting beam during propagation is calculated.

  13. Medical Radioisotope Scanning, Vol. II. Proceedings of the Symposium on Medical Radioisotope Scanning

    International Nuclear Information System (INIS)

    1964-01-01

    scientific papers presented at the Symposium together with the subsequent discussions. Volume I covers the sessions devoted to theoretical principles, instrumentation and techniques, whilst Volume II deals with choice of radioisotopes and labelled compounds, clinical applications and interpretation of results. It is hoped that together they will provide a valuable guide to the present status and likely future development of medical radioisotope scanning and its applications. The Agency gratefully acknowledges the co-operation of the staff of the Greek Atomic Energy Commission

  14. Application of dual-energy scanning technique with dual-source CT in pulmonary mass lesions

    International Nuclear Information System (INIS)

    Jiang Jie; Xu Yiming; He Bo; Xie Xiaojie; Han Dan

    2012-01-01

    Objective: To explore the feasibility of the DSCT dual-energy technique in pulmonary mass lesions. Methods: A total of 100 patients with pulmonary masses underwent a conventional plain CT scan and a dual-energy enhanced CT scan. The virtual non-contrast (VNC) images were obtained at the post-processing workstation. The mean CT value, enhancement value, signal-to-noise ratio (SNR), image quality and radiation dose of pulmonary masses were compared between the two scan techniques using the F or t test, and the detectability of lesions was compared using the Wilcoxon test. Results: There was no statistically significant difference among VNC (A) (32.89 ± 12.58) HU, VNC (S) (30.86 ± 9.60) HU and conventional plain images (35.89 ± 9.99) HU in the mean CT value of the mass (F=2.08, P>0.05). There was a statistically significant difference among VNC (A) (3.29 ± 1.45), VNC (S) (3.93 ± 1.49) and conventional plain images (4.61 ± 1.50) in SNR (F=6.01, P<0.05), with the SNR of the conventional plain scan higher than that of VNC. There was no statistically significant difference between the enhancement value of the mass in the conventional enhanced scan (60.74 ± 13.9) HU and the iodine distribution from VNC (A) (58.26 ± 31.99) HU (t=0.48, P>0.05), but there was a significant difference between the conventional enhanced scan (56.51 ± 17.94) HU and the iodine distribution from VNC (S) (52.65 ± 16.78) HU (t=4.45, P<0.05). There was no statistically significant difference among the conventional plain scan (4.69 ± 0.06), VNC (A) (4.60 ± 0.09) and VNC (S) (4.61 ± 0.11) in image quality at the mediastinal window (F=3.014, P>0.05). The appearance, size and internal features of the mass (such as necrosis, calcification and cavity) were shown the same in the conventional plain scan, VNC (A) and VNC (S). Of 41 patients with a hilar mass, 18 patients were found to have lobular and segmental perfusion decrease or defect. A perfusion defect area was found in 59 patients with a peripheral lung mass. The radiation dose of the dual-energy enhanced scan was lower than that of

  15. Objective interpretation as conforming interpretation

    Directory of Open Access Journals (Sweden)

    Lidka Rodak

    2011-12-01

    The practical discourse willingly uses the formula of “objective interpretation”, with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what “objective interpretation” could mean and how it could be understood in practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists’ position, is not possible to uphold and should rather be linked with conforming interpretation. What this actually implies is that it is not the virtues of certainty and predictability - which are usually associated with objectivity - but coherence that makes the foundation of the applicability of objectivity in law. What can be observed from the analyses is that both the phenomenon of conforming interpretation and objective interpretation play the role of arguments in the interpretive discourse - arguments that provide justification that an interpretation is not arbitrary or subjective. With regard to an important part of the ideology of legal application - the conviction that decisions should be taken on the basis of law in order to exclude arbitrariness - objective interpretation could be read as the question of what kind of authority “supports” a certain interpretation, one that is almost never free of judicial creativity and judicial activism. One can say that objective and conforming interpretation are just further arguments used in legal discourse.

  16. Radiologic head CT interpretation errors in pediatric abusive and non-abusive head trauma patients

    International Nuclear Information System (INIS)

    Kralik, Stephen F.; Finke, Whitney; Wu, Isaac C.; Ho, Chang Y.; Hibbard, Roberta A.; Hicks, Ralph A.

    2017-01-01

    Pediatric head trauma, including abusive head trauma, is a significant cause of morbidity and mortality. The purpose of this research was to identify and evaluate radiologic interpretation errors of head CTs performed on abusive and non-abusive pediatric head trauma patients from a community setting referred for a secondary interpretation at a tertiary pediatric hospital. A retrospective search identified 184 patients <5 years of age with head CT for known or potential head trauma who had a primary interpretation performed at a referring community hospital by a board-certified radiologist. Two board-certified fellowship-trained neuroradiologists at an academic pediatric hospital independently interpreted the head CTs, compared their interpretations to determine inter-reader discrepancy rates, and resolved discrepancies to establish a consensus second interpretation. The primary interpretation was compared to the consensus second interpretation using the RADPEER™ scoring system to determine the primary interpretation-second interpretation overall and major discrepancy rates. MRI and/or surgical findings were used to validate the primary interpretation or second interpretation when possible. The diagnosis of abusive head trauma was made using clinical and imaging data by a child abuse specialist to separate patients into abusive head trauma and non-abusive head trauma groups. Discrepancy rates were compared for both groups. Lastly, primary interpretations and second interpretations were evaluated for discussion of imaging findings concerning for abusive head trauma. There were statistically significant differences between primary interpretation-second interpretation versus inter-reader overall and major discrepancy rates (28% vs. 6%, P=0.0001; 16% vs. 1%, P=0.0001). There were significant differences in the primary interpretation-second interpretation overall and major discrepancy rates for abusive head trauma patients compared to non-abusive head trauma

  17. Radiologic head CT interpretation errors in pediatric abusive and non-abusive head trauma patients

    Energy Technology Data Exchange (ETDEWEB)

    Kralik, Stephen F.; Finke, Whitney; Wu, Isaac C.; Ho, Chang Y. [Indiana University School of Medicine, Department of Radiology and Imaging Sciences, Indianapolis, IN (United States); Hibbard, Roberta A.; Hicks, Ralph A. [Indiana University School of Medicine, Department of Pediatrics, Section of Child Protection Programs, Indianapolis, IN (United States)

    2017-07-15

    Pediatric head trauma, including abusive head trauma, is a significant cause of morbidity and mortality. The purpose of this research was to identify and evaluate radiologic interpretation errors of head CTs performed on abusive and non-abusive pediatric head trauma patients from a community setting referred for a secondary interpretation at a tertiary pediatric hospital. A retrospective search identified 184 patients <5 years of age with head CT for known or potential head trauma who had a primary interpretation performed at a referring community hospital by a board-certified radiologist. Two board-certified fellowship-trained neuroradiologists at an academic pediatric hospital independently interpreted the head CTs, compared their interpretations to determine inter-reader discrepancy rates, and resolved discrepancies to establish a consensus second interpretation. The primary interpretation was compared to the consensus second interpretation using the RADPEER™ scoring system to determine the primary interpretation-second interpretation overall and major discrepancy rates. MRI and/or surgical findings were used to validate the primary interpretation or second interpretation when possible. The diagnosis of abusive head trauma was made using clinical and imaging data by a child abuse specialist to separate patients into abusive head trauma and non-abusive head trauma groups. Discrepancy rates were compared for both groups. Lastly, primary interpretations and second interpretations were evaluated for discussion of imaging findings concerning for abusive head trauma. There were statistically significant differences between primary interpretation-second interpretation versus inter-reader overall and major discrepancy rates (28% vs. 6%, P=0.0001; 16% vs. 1%, P=0.0001). There were significant differences in the primary interpretation-second interpretation overall and major discrepancy rates for abusive head trauma patients compared to non-abusive head trauma

  18. Reducing task-based fMRI scanning time using simultaneous multislice echo planar imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kiss, Mate [Hungarian Academy of Sciences, Brain Imaging Centre, Research Centre for Natural Sciences, Budapest (Hungary); Janos Szentagothai PhD School, MR Research Centre, Budapest (Hungary); National Institute of Clinical Neuroscience, Department of Neuroradiology, Budapest (Hungary); Hermann, Petra; Vidnyanszky, Zoltan; Gal, Viktor [Hungarian Academy of Sciences, Brain Imaging Centre, Research Centre for Natural Sciences, Budapest (Hungary)

    2018-03-15

    Maintaining alertness and remaining motionless during scanning represent a substantial challenge for patients/subjects involved in both clinical and research functional magnetic resonance imaging (fMRI) examinations. Therefore, the availability and application of new data acquisition protocols allowing the shortening of scan time without compromising data quality and statistical power are of major importance. Higher-order category-selective visual cortical areas were identified individually, and a rapid event-related fMRI design was used to compare three different sampling rates (TR = 2000, 1000, and 410 ms, using state-of-the-art simultaneous multislice imaging) and four different scanning lengths to match the statistical power of traditional scanning methods to the high sampling-rate design. The results revealed that ≈4 min of scan time at a 1 Hz (TR = 1000 ms) sampling rate and ≈2 min of scanning at a ≈2.5 Hz (TR = 410 ms) sampling rate provide localization sensitivity and selectivity similar to those obtained with an 11-min session at the conventional 0.5 Hz (TR = 2000 ms) sampling rate. Our findings suggest that task-based fMRI examination of clinical populations prone to distress, such as in presurgical mapping experiments, might substantially benefit from the reduced (20-40%) scanning time that can be achieved by the application of simultaneous multislice sequences. (orig.)

  19. Precision of guided scanning procedures for full-arch digital impressions in vivo.

    Science.gov (United States)

    Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert

    2017-11-01

    System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with particular regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to conventional impression techniques in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included, along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90-10%)/2 quantile, and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions, at 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05); precision for group Cerec Omnicam Ortho was 74.5 ± 39.2 µm and for group Ormco Lythos 91.4 ± 48.8 µm. The in vivo precision of guided scanning procedures exceeds that of conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.

  20. The disagreeable behaviour of the kappa statistic.

    Science.gov (United States)

    Flight, Laura; Julious, Steven A

    2015-01-01

    It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
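
    The abstract's central caution - that kappa depends on the marginal totals - is easy to demonstrate numerically. The sketch below (plain NumPy; the function name and example tables are ours) computes Cohen's kappa together with the companion statistics recommended here: observed concordance, maximum attainable kappa, and the prevalence- and bias-adjusted kappa (PABAK), for two tables with identical raw agreement but different marginals.

    ```python
    import numpy as np

    def kappa_diagnostics(table):
        """Agreement diagnostics for a 2x2 table of two raters.

        Rows are rater A's categories, columns rater B's. Returns Cohen's
        kappa plus the companion statistics the abstract recommends.
        """
        t = np.asarray(table, dtype=float)
        n = t.sum()
        po = np.trace(t) / n                   # observed proportion of agreement
        row, col = t.sum(axis=1) / n, t.sum(axis=0) / n
        pe = np.sum(row * col)                 # chance agreement from marginals
        kappa = (po - pe) / (1 - pe)
        po_max = np.sum(np.minimum(row, col))  # best agreement marginals allow
        kappa_max = (po_max - pe) / (1 - pe)   # maximum attainable kappa
        pabak = 2 * po - 1                     # prevalence/bias adjusted kappa
        return {"po": po, "kappa": kappa, "kappa_max": kappa_max, "pabak": pabak}

    # Two tables with identical observed agreement (90%) but very different
    # marginal distributions: kappa diverges while PABAK stays the same.
    balanced = [[45, 5], [5, 45]]
    skewed = [[85, 5], [5, 5]]
    for name, tab in [("balanced", balanced), ("skewed", skewed)]:
        print(name, {k: round(float(v), 3) for k, v in kappa_diagnostics(tab).items()})
    ```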

  1. Multidetector row computed tomography of acute pancreatitis: Utility of single portal phase CT scan in short-term follow up

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yongwonn [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Park, Hee Sun, E-mail: heesun.park@gmail.com [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of); Kim, Young Jun; Jung, Sung Il; Jeon, Hae Jeong [Department of Radiology, Konkuk University Medical Center, 4-12, Hwayang-dong, Gwangjin-gu, Seoul 143-729 (Korea, Republic of)

    2012-08-15

    Objective: The purpose of this study was to evaluate whether nonenhanced CT or contrast-enhanced portal phase CT can replace multiphasic pancreas-protocol CT in short-term monitoring of patients with acute pancreatitis. Materials and methods: This retrospective study was approved by the Institutional Review Board. From April 2006 to May 2010, a total of 52 patients with acute pancreatitis who underwent initial dual-phase multidetector row CT (unenhanced, arterial, and portal phase) at admission and a short-term (within 30 days) follow-up dual-phase CT (mean interval 10.3 days, range 3-28 days) were included. Two abdominal radiologists performed an independent review of three sets of follow-up CT images (nonenhanced scan, single portal phase scan, and dual-phase scan). Interpretation of each image set was done with at least a 2-week interval. The radiologists evaluated the severity of acute pancreatitis with regard to pancreatic inflammation, pancreatic necrosis, and extrapancreatic complications, based on the modified CT severity index. Scores for each image set were compared using a paired t-test, and interobserver agreement was evaluated using intraclass correlation coefficient statistics. Results: Mean scores of the sum of the CT severity index on the nonenhanced scan, portal phase scan, and dual-phase scan were 5.7, 6.6, and 6.5 for radiologist 1, and 5.0, 5.6, and 5.8 for radiologist 2, respectively. For both radiologists, the contrast-enhanced scans (portal phase scan and dual-phase scan) showed significantly higher severity scores than the unenhanced scan (P < 0.05), while the portal phase and dual-phase scans showed no significant difference from each other. The trend was similar for pancreatic inflammation and extrapancreatic complications, in which contrast-enhanced scans showed significantly higher scores than unenhanced scans, while no significant difference was observed between the portal phase scan and dual-phase scan. In pancreatic necrosis

  2. Social indicators and other income statistics using the EUROMOD baseline: a comparison with Eurostat and National Statistics

    OpenAIRE

    Mantovani, Daniela; Sutherland, Holly

    2003-01-01

    This paper reports an exercise to validate EUROMOD output for 1998 by comparing income statistics calculated from the baseline micro-output with comparable statistics from other sources, including the European Community Household Panel. The main potential reasons for discrepancies are identified. While there are some specific national issues that arise, there are two main general points to consider in interpreting EUROMOD estimates of social indicators across EU member States: (a) the method ...

  3. Full information acquisition in scanning probe microscopy and spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jesse, Stephen; Belianinov, Alex; Kalinin, Sergei V.; Somnath, Suhas

    2017-04-04

    Apparatus and methods are described for scanning probe microscopy and spectroscopy based on acquisition of full probe response. The full probe response contains valuable information about the probe-sample interaction that is lost in traditional scanning probe microscopy and spectroscopy methods. The full probe response is analyzed post data acquisition using fast Fourier transform and adaptive filtering, as well as multivariate analysis. The full response data is further compressed to retain only statistically significant components before being permanently stored.
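
    The processing chain described here - transform the full response, keep only significant components, store the compressed result - can be illustrated schematically. The sketch below (ours, not the patented implementation; the sampling rate, signal content, and 6x threshold are all assumptions) compresses a simulated probe response by retaining only Fourier components well above the estimated noise floor.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated full probe response: two interaction harmonics buried in noise.
    fs = 1.0e6                                   # sampling rate, Hz (assumed)
    t = np.arange(4096) / fs
    response = (1.0 * np.sin(2 * np.pi * 60e3 * t)
                + 0.3 * np.sin(2 * np.pi * 120e3 * t)
                + rng.normal(0.0, 0.5, t.size))

    # Fast Fourier transform of the full response.
    spectrum = np.fft.rfft(response)
    power = np.abs(spectrum)

    # Keep only statistically significant components: magnitudes far above
    # the noise floor estimated from the median spectral magnitude.
    noise_floor = np.median(power)
    keep = power > 6.0 * noise_floor             # significance threshold (assumed)
    compressed = {i: spectrum[i] for i in np.flatnonzero(keep)}
    print(f"stored {len(compressed)} of {power.size} components")

    # Reconstruction from the retained components only.
    sparse = np.zeros_like(spectrum)
    for i, c in compressed.items():
        sparse[i] = c
    reconstructed = np.fft.irfft(sparse, n=response.size)
    print("rms error:", np.sqrt(np.mean((response - reconstructed) ** 2)))
    ```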

  4. Marginal and internal fit of zirconia copings obtained using different digital scanning methods

    Directory of Open Access Journals (Sweden)

    Lorena Oliveira PEDROCHE

    The objective of this study was to evaluate the marginal and internal fit of zirconia copings obtained with different digital scanning methods. A human mandibular first molar was set in a typodont with its adjacent and antagonist teeth and prepared for an all-ceramic crown. Digital impressions were made using an intraoral scanner (3Shape). Polyvinyl siloxane impressions and Type IV gypsum models were also obtained and scanned with a benchtop laboratory scanner (3Shape D700). Ten zirconia copings were fabricated for each group using CAD-CAM technology. The marginal and internal fit of the zirconia copings was assessed by the silicone replica technique. Four sections of each replica were obtained, and each section was evaluated at four points: marginal gap (MG), axial wall (AW), axio-occlusal edge (AO) and centro-occlusal wall (CO), using image-analysis software. The data were submitted to one-way ANOVA and Tukey’s test (α = 0.05) and showed statistically significant differences for MG, AO and CO. Regarding MG, intraoral scanning showed lower gap values, whereas gypsum model scanning showed higher gap values. Regarding AO and CO, intraoral digital scanning showed lower gap values. Polyvinyl siloxane impression scanning and gypsum model scanning showed higher gap values and were statistically similar. It can be concluded that intraoral digital scanning provided a lower mean gap value, in comparison with conventional impressions and gypsum casts scanned with a standard benchtop laboratory scanner.

  5. Probabilistic and Statistical Aspects of Quantum Theory

    CERN Document Server

    Holevo, Alexander S

    2011-01-01

    This book is devoted to aspects of the foundations of quantum mechanics in which probabilistic and statistical concepts play an essential role. The main part of the book concerns the quantitative statistical theory of quantum measurement, based on the notion of positive operator-valued measures. During the past years there has been substantial progress in this direction, stimulated to a great extent by new applications such as Quantum Optics, Quantum Communication and high-precision experiments. The questions of statistical interpretation, quantum symmetries, theory of canonical commutation relations

  6. Analysis of health in health centers area in Depok using correspondence analysis and scan statistic

    Science.gov (United States)

    Basir, C.; Widyaningsih, Y.; Lestari, D.

    2017-07-01

    Hotspots indicate areas with a higher case intensity than others. In health problems of an area, for example, the number of illness cases in a region can be used as a parameter for determining the severity of the area; if such conditions are identified early, they can be addressed preventively. Many factors affect the severity level of an area. The health factors considered in this study are the number of infants with low birth weight, malnourished children under five years old, deaths of children under five, maternal deaths, births without the help of health personnel, infants whose health is not monitored, and infants without basic immunization. Case counts are based on each public health center area in Depok. Correspondence analysis provides graphical information about the relationship between two nominal variables: it creates a plot based on row and column scores, placing strongly related categories at close distances. The scan statistic method is used to detect hotspots based on selected variables in the study area, and correspondence analysis is used to picture the association between the regions and the variables. Using SaTScan software, the Sukatani health center is identified as a point hotspot, and correspondence analysis shows that the health centers and the seven variables have a very significant relationship, with the majority of health centers close to all variables except Cipayung, which is distantly related to the number of maternal deaths. These results can be used as input for government agencies to improve the health level in the area.
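
    Because this record applies Kulldorff's scan statistic through SaTScan, a compact illustration of what such software computes may be useful. The sketch below (entirely our own; the data are simulated and all names are illustrative) implements a circular Poisson spatial scan with Monte Carlo significance testing.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def poisson_llr(c, n, C, N):
        """Kulldorff's Poisson log-likelihood ratio for a zone with c of C
        cases and n of N population, counted only when the zone is high-risk."""
        if n == 0 or n == N or c / n <= (C - c) / (N - n):
            return 0.0
        inside = c * np.log(c / n)
        outside = 0.0 if c == C else (C - c) * np.log((C - c) / (N - n))
        return inside + outside - C * np.log(C / N)

    def scan(xy, cases, pop, max_frac=0.5):
        """Circular spatial scan: grow a circle around each area centroid and
        keep the zone with the largest log-likelihood ratio."""
        C, N = cases.sum(), pop.sum()
        best_llr, best_zone = 0.0, None
        for i in range(len(xy)):
            order = np.argsort(np.hypot(*(xy - xy[i]).T))  # areas by distance
            c = n = 0.0
            for k, j in enumerate(order):
                c, n = c + cases[j], n + pop[j]
                if n > max_frac * N:                       # cap the zone size
                    break
                llr = poisson_llr(c, n, C, N)
                if llr > best_llr:
                    best_llr, best_zone = llr, order[:k + 1]
        return best_llr, best_zone

    # Toy data: 30 areas with an elevated-risk cluster near the origin.
    xy = rng.uniform(0, 10, size=(30, 2))
    pop = rng.integers(500, 2000, size=30).astype(float)
    risk = np.where(np.hypot(xy[:, 0], xy[:, 1]) < 4, 3e-3, 1e-3)
    cases = rng.poisson(pop * risk).astype(float)

    obs_llr, zone = scan(xy, cases, pop)

    # Monte Carlo inference: redistribute the same case total under the null
    # (proportional to population) and compare maximum LLRs.
    sims = [scan(xy, rng.multinomial(int(cases.sum()), pop / pop.sum()).astype(float), pop)[0]
            for _ in range(99)]
    p_value = (1 + sum(s >= obs_llr for s in sims)) / (len(sims) + 1)
    print(f"most likely cluster: areas {sorted(zone.tolist())}, "
          f"LLR = {obs_llr:.2f}, p = {p_value:.2f}")
    ```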

  7. Technetium {sup 99m}Tc Pertechnetate Brain Scanning

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Sang Min; Park, Jin Yung; Lee, Ahn Ki; Chung, Choo Il; Hong, Chang Gi [Capital Army Hospital, ROKA, Seoul (Korea, Republic of); Rhee, Chong Heon; Koh, Chang Soon [Radiological Research Institute, Seoul (Korea, Republic of)

    1968-03-15

    Technetium {sup 99m}Tc pertechnetate brain scanning was performed in 3 cases of head injury (2 chronic subdural hematomas and 1 acute epidural hematoma), 2 cases of brain abscess and 1 case of intracerebral hematoma associated with an arteriovenous anomaly. In all cases the brain scintigrams showed 'hot areas.' The literature on radioisotope scanning of intracranial lesions is briefly reviewed. With improvements in radioisotope scanners and the development of new radiopharmaceuticals, brain scanning has become a safe and useful screening test for the diagnosis of intracranial lesions. Brain scanning can be performed easily, even on a moribund patient, without the discomfort and risk associated with cerebral angiography or pneumoencephalography. Brain scanning has been useful in the diagnosis of brain tumor, brain abscess, subdural hematoma, and cerebrovascular disease. Positive scintigrams can be expected in 80 to 90% of brain tumors. Early studies were done with {sup 203}Hg-Neohydrin or {sup 131}I-serum albumin; with these agents, however, patients receive a rather large radiation dose to the whole body and kidneys. In 1965 Harper introduced {sup 99m}Tc to reduce the radiation dose to the patient and to improve the counting statistics in isotope scanning.

  8. The value of indium 111 leukocyte scanning in the evaluation of painful or infected total knee arthroplasties

    International Nuclear Information System (INIS)

    Rand, J.A.; Brown, M.L.

    1990-01-01

    Evaluation of painful total knee arthroplasties (TKAs) for infection can be difficult. Indium 111 ({sup 111}In) leukocyte bone scanning provides a minimally invasive technique for evaluation of possible infection. Thirty-eight patients with a painful TKA who had surgical exploration after {sup 111}In leukocyte scanning were reviewed. The scan had an accuracy of 84%, a sensitivity of 83%, and a specificity of 85%. The {sup 111}In leukocyte scans must be interpreted in conjunction with the clinical evaluation of the patient because they are less accurate for study of TKAs than of total hip arthroplasties.

  9. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up. An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Probability with Applications for Engineers and Scientists

  10. Exploring structures of the Rochefort Cave (Belgium) with 3D models from LIDAR scans and UAV photoscans.

    Science.gov (United States)

    Watlet, A.; Triantafyllou, A.; Kaufmann, O.; Le Mouelic, S.

    2016-12-01

    Amongst today's techniques able to produce 3D point clouds, LIDAR and UAV (Unmanned Aerial Vehicle) photogrammetry are probably the most commonly used. Both methods have their own advantages and limitations. LIDAR scans create high-resolution and high-precision 3D point clouds, but such methods are generally costly, especially for sporadic surveys. Compared to LIDAR, UAVs (e.g., drones) are cheap and flexible to use in different types of environments. Moreover, the photogrammetric processing workflow for digital images taken with UAVs is becoming easier with the rise of many affordable software packages (e.g., Agisoft PhotoScan, MicMac, VisualSFM). In this context, we present a challenging study made at the Rochefort Cave Laboratory (South Belgium) comprising surface and underground surveys. The main chamber of the cave (~10,000 m³) was the principal target of the study. A LIDAR scan and a UAV photoscan were acquired underground, producing respective 3D models. An additional 3D photoscan was performed at the surface, in the sinkhole in direct connection with the main chamber. The main goal of the project is to combine these different datasets to quantify the orientation of inaccessible geological structures (e.g., faults, tectonic and gravitational joints, and sediment bedding) and to compare them to structural data surveyed in the field. For structural interpretation, we used a subsampling method that merges neighbouring model polygons with similar orientations, allowing statistical analyses of the polygons' spatial distribution. The benefit of this method is to verify the spatial continuity of in-situ structural measurements at larger scales. Roughness and colorimetric/spectral analyses may also be of great interest for several geoscience purposes, discriminating different facies among the geological beddings. Amongst other things, this study helped refine the local petrophysical properties associated with particular geological layers, what

  11. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  12. An approach to the interpretation of backpropagation neural network models in QSAR studies.

    Science.gov (United States)

    Baskin, I I; Ait, A O; Halberstam, N M; Palyulin, V A; Zefirov, N S

    2002-03-01

    An approach to the interpretation of backpropagation neural network models for quantitative structure-activity and structure-property relationship (QSAR/QSPR) studies is proposed. The method is based on analyzing the first and second moments of the distribution of the values of the first and second partial derivatives of neural network outputs with respect to inputs, calculated at the data points. The use of such statistics makes it possible not only to obtain essentially the same characteristics as in the case of traditional "interpretable" statistical methods, such as linear regression analysis, but also to reveal important additional information regarding the non-linear character of QSAR/QSPR relationships. The approach is illustrated by an example of interpreting a backpropagation neural network model for predicting the position of the long-wave absorption band of cyanine dyes.
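
    To make the method concrete, the sketch below (our own illustration, not the authors' code; random weights stand in for a trained QSAR model) computes the first and second partial derivatives of a one-hidden-layer tanh network's output with respect to its inputs in closed form, then summarizes them by their first and second moments over the data points, as the approach prescribes.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # One-hidden-layer tanh network; random weights stand in for a trained model.
    n_in, n_hid = 4, 8
    W1 = rng.normal(size=(n_hid, n_in))
    b1 = rng.normal(size=n_hid)
    w2 = rng.normal(size=n_hid)

    def first_derivs(X):
        """dy/dx_j at every data point, for y = w2 . tanh(W1 x + b1)."""
        A = np.tanh(X @ W1.T + b1)                # hidden activations, (n, n_hid)
        return (w2 * (1 - A**2)) @ W1             # (n, n_in)

    def second_derivs(X):
        """d2y/dx_j dx_k at every data point (same network)."""
        A = np.tanh(X @ W1.T + b1)
        g = w2 * (-2 * A * (1 - A**2))            # (n, n_hid)
        return np.einsum('nh,hj,hk->njk', g, W1, W1)

    X = rng.normal(size=(200, n_in))              # data points

    D1, D2 = first_derivs(X), second_derivs(X)

    # First and second moments of the derivative distributions: the mean plays
    # the role of a regression coefficient; the spread flags non-linearity.
    print("mean dy/dx_j   :", D1.mean(axis=0).round(3))
    print("std  dy/dx_j   :", D1.std(axis=0).round(3))
    print("mean d2y/dx_j^2:", np.diagonal(D2.mean(axis=0)).round(3))
    ```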

  13. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  14. Statistics for scientists and engineers

    CERN Document Server

    Shanmugam , Ramalingam

    2015-01-01

    This book provides the theoretical framework needed to build, analyze and interpret various statistical models. It helps readers choose the correct model, distinguish among various choices that best captures the data, or solve the problem at hand. This is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter in this book presents the basic concepts. Next, the authors discuss the measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob

  15. On the statistical properties of photons

    International Nuclear Information System (INIS)

    Cini, M.

    1990-01-01

    The interpretation, in terms of a transition from Maxwell-Boltzmann to Bose-Einstein statistics, of the quantum-optics effect of degenerate light discovered by De Martini and Di Fonzo is discussed. It is shown that the results of the experiment can be explained using only the quantum-mechanical rule that the states of an assembly of bosons should be completely symmetrical, without mentioning their statistical properties in any way. This means that photons are indeed identical particles.

  16. Teaching Business Statistics with Real Data to Undergraduates and the Use of Technology in the Class Room

    Science.gov (United States)

    Singamsetti, Rao

    2007-01-01

    In this paper an attempt is made to highlight some issues in the interpretation of statistical concepts and results as taught in undergraduate business statistics courses. The use of modern technology in the classroom is shown to have increased the efficiency and ease of learning and teaching in statistics. The importance of…

  17. Critical Analysis of an e-Learning and Interactive Teaching Module with Respect to the Interpretation of Emergency Computed Tomography of the Brain.

    Science.gov (United States)

    Groth, Michael; Barthe, Käthe Greta; Riemer, Martin; Ernst, Marielle; Herrmann, Jochen; Fiehler, Jens; Buhk, Jan-Hendrik

    2018-04-01

    To compare the learning benefit of three different teaching strategies on the interpretation of emergency cerebral computed tomography (CT) pathologies by medical students. Three groups of students with different types of teaching (e-learning, interactive teaching, and standard curricular education in neuroradiology) were tested with respect to the detection of seven CT pathologies. The test results of each group were compared for each CT pathology using the chi-square test. A p-value ≤ 0.05 was considered to be significant. As opposed to the results of the comparison group (curricular education), the e-learning group and interactive teaching tutorial group both showed a significantly better performance in detecting the hyperdense middle cerebral artery sign (p = 0.001 and p e-learning group, with statistical significance in the latter (p = 0.03 and p e-learning module group with respect to reading CT scans, with slightly different advantages. Thus, the introduction of new learning methods in radiological education might be reasonable at an undergraduate stage but requires learning content-based considerations. · E-learning can offer benefits regarding the reading of cerebral CT scans by students. · Interactive tutorial can offer benefits regarding the reading of cerebral CT scans by students. · E-learning and interactive tutorial feature different strengths for student learning in radiology. · Application of interactive teaching methods in radiology requires learning content-based considerations. · Groth M, Barthe KG, Riemer M et al. Critical Analysis of an e-Learning and Interactive Teaching Module with Respect to the Interpretation of Emergency Computed Tomography of the Brain. Fortschr Röntgenstr 2017; 190: 334 - 340. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Microgamma Scan System for analyzing radial isotopic profiles of irradiated transmutation fuels

    International Nuclear Information System (INIS)

    Hilton, Bruce A.; McGrath, Christopher A.

    2008-01-01

    The U. S. Global Nuclear Energy Partnership / Advanced Fuel Cycle Initiative (GNEP/AFCI) is developing metallic transmutation alloys as a fuel form to transmute the long-lived transuranic actinide isotopes contained in spent nuclear fuel into shorter-lived fission products. A micro-gamma scan system is being developed to analyze the radial distribution of fission products, such as Cs-137, Cs-134, Ru-106, and Zr-95, in irradiated fuel cross-sections. The micro-gamma scan system consists of a precision linear stage with integrated sample holder and a tungsten alloy collimator, which interfaces with the Idaho National Laboratory (INL) Analytical Laboratory Hot Cell (ALHC) Gamma Scan System high purity germanium detector, multichannel analyzer, and removable collimators. A simplified model of the micro-gamma scan system was developed in MCNP (Monte-Carlo N-Particle Transport Code) and used to investigate the system performance and to interpret data from the scoping studies. Preliminary measurements of the micro-gamma scan system are discussed. (authors)

  19. Applied Statistics for the Social and Health Sciences

    CERN Document Server

    Gordon, Rachel A A

    2012-01-01

    Applied Statistics for the Social and Health Sciences provides graduate students in the social and health sciences with the basic skills that they need to estimate, interpret, present, and publish statistical models using contemporary standards. The book targets the social and health science branches such as human development, public health, sociology, psychology, education, and social work in which students bring a wide range of mathematical skills and have a wide range of methodological affinities. For these students, a successful course in statistics will not only offer statistical content

  20. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  1. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8% and 89.9% for visual analysis and 96.9% and 90.9% for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9% and specificity 100%. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  2. Review of P-scan computer-based ultrasonic inservice inspection system. Supplement 1

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.

    1995-12-01

    This Supplement reviews the P-scan system, a computer-based ultrasonic system used for inservice inspection of piping and other components in nuclear power plants. The Supplement was prepared using the methodology described in detail in Appendix A of NUREG/CR-5985, and is based on one month of using the system in a laboratory. This Supplement describes and characterizes: computer system, ultrasonic components, and mechanical components; scanning, detection, digitizing, imaging, data interpretation, operator interaction, data handling, and record-keeping. It includes a general description, a review checklist, and detailed results of all tests performed

  3. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression," which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in statistical sciences. We hope that from this chapter, readers would understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inference), and how to interpret the results. This text would be most useful if it is used as a supplemental material, while the readers take their own statistical courses or it would serve as a great reference text associated with a manual for any statistical software as a self-teaching guide.

  4. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  5. Interpretation of computed tomography imaging of the eye and orbit. A systematic approach

    Directory of Open Access Journals (Sweden)

    Naik Milind

    2002-01-01

    Computed tomography (CT) has revolutionised the diagnosis and management of ocular and orbital diseases. The use of thin sections with multiplanar scanning (axial, coronal and sagittal planes) and the possibility of three-dimensional reconstruction permits thorough evaluation. To make the most of this technique, users must familiarize themselves with the pertinent CT principles and terminology. The diagnostic yield is optimal when the ophthalmologist and radiologist collaborate in the radiodiagnostic workup. In this article we describe a systematic approach to the interpretation of ocular and orbital CT scans.

  6. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…

  7. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  8. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  9. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
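
    As a companion to this review, the sketch below (simulated data; function names ours) computes several of the test-efficacy statistics discussed: sensitivity, specificity, accuracy, likelihood ratios, and the area under the ROC curve via its rank-statistic (Mann-Whitney) formulation.

    ```python
    import numpy as np

    def diagnostic_summary(truth, score, threshold):
        """Basic test-efficacy statistics from binary truth and a continuous score."""
        truth = np.asarray(truth, bool)
        pred = np.asarray(score) >= threshold
        tp = np.sum(pred & truth); fp = np.sum(pred & ~truth)
        fn = np.sum(~pred & truth); tn = np.sum(~pred & ~truth)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return {
            "sensitivity": sens,
            "specificity": spec,
            "accuracy": (tp + tn) / truth.size,
            "LR+": sens / (1 - spec),            # positive likelihood ratio
            "LR-": (1 - sens) / spec,            # negative likelihood ratio
        }

    def auc(truth, score):
        """Area under the ROC curve via the rank (Mann-Whitney) formulation."""
        truth = np.asarray(truth, bool)
        pos, neg = np.asarray(score)[truth], np.asarray(score)[~truth]
        wins = (pos[:, None] > neg).sum() + 0.5 * (pos[:, None] == neg).sum()
        return wins / (pos.size * neg.size)

    rng = np.random.default_rng(3)
    truth = rng.random(500) < 0.3                       # 30% disease prevalence
    score = rng.normal(loc=np.where(truth, 1.0, 0.0))   # imperfect diagnostic score
    for k, v in diagnostic_summary(truth, score, threshold=0.5).items():
        print(f"{k}: {v:.3f}")
    print(f"AUC: {auc(truth, score):.3f}")
    ```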

  10. Statistical ensembles in quantum mechanics

    International Nuclear Information System (INIS)

    Blokhintsev, D.

    1976-01-01

    The interpretation of quantum mechanics presented in this paper is based on the concept of quantum ensembles. This concept differs essentially from the canonical one in that the interference of the observer with the state of a microscopic system is of no greater importance than in any other field of physics. Owing to this fact, the laws established by quantum mechanics are no less objective in character than the laws governing classical statistical mechanics. The paradoxical nature of some statements of quantum mechanics, which results from interpreting the wave function as the observer's notebook, greatly stimulated the development of the idea presented. (Auth.)

  11. Global Profiling and Novel Structure Discovery Using Multiple Neutral Loss/Precursor Ion Scanning Combined with Substructure Recognition and Statistical Analysis (MNPSS): Characterization of Terpene-Conjugated Curcuminoids in Curcuma longa as a Case Study.

    Science.gov (United States)

    Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min

    2016-01-05

    To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.
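
    The targeted step of this workflow - recognizing a substructure through a characteristic neutral loss - can be mimicked post-acquisition on peak lists. The sketch below (ours; the loss masses, tolerance, and spectra are illustrative stand-ins, not values from the paper) flags precursor ions that show a fragment at the precursor mass minus a given neutral loss. On a real triple-quadrupole instrument the neutral loss scan is performed in hardware during acquisition; this in-silico analogue only illustrates the matching logic.

    ```python
    # In-silico analogue of a neutral-loss scan over a set of MS/MS spectra.
    # Each spectrum is (precursor m/z, list of fragment m/z). The losses and
    # tolerance below are illustrative, not values from the paper.
    NEUTRAL_LOSSES = {"loss_A": 368.13, "loss_B": 338.12}
    TOL = 0.01  # m/z tolerance

    spectra = [
        (585.25, [567.24, 448.18, 217.12]),   # hypothetical conjugate ions
        (555.24, [537.23, 217.12]),
        (471.20, [453.19, 309.11]),
    ]

    def neutral_loss_hits(spectra, loss, tol=TOL):
        """Return precursors with a fragment at (precursor - loss) +/- tol."""
        hits = []
        for precursor, fragments in spectra:
            target = precursor - loss
            if any(abs(f - target) <= tol for f in fragments):
                hits.append(precursor)
        return hits

    for name, loss in NEUTRAL_LOSSES.items():
        print(name, "candidates:", neutral_loss_hits(spectra, loss))
    ```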

  12. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques -- now updated and revised. In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of interest

  13. An Interpreter's Interpretation: Sign Language Interpreters' View of Musculoskeletal Disorders

    National Research Council Canada - National Science Library

    Johnson, William L

    2003-01-01

    Sign language interpreters are at increased risk for musculoskeletal disorders. This study used content analysis to obtain detailed information about these disorders from the interpreters' point of view...

  14. Statistical Estimation of Heterogeneities: A New Frontier in Well Testing

    Science.gov (United States)

    Neuman, S. P.; Guadagnini, A.; Illman, W. A.; Riva, M.; Vesselinov, V. V.

    2001-12-01

    Well-testing methods have traditionally relied on analytical solutions of groundwater flow equations in relatively simple domains, consisting of one or at most a few units having uniform hydraulic properties. Recently, attention has been shifting toward methods and solutions that would allow one to characterize subsurface heterogeneities in greater detail. On one hand, geostatistical inverse methods are being used to assess the spatial variability of parameters, such as permeability and porosity, on the basis of multiple cross-hole pressure interference tests. On the other hand, analytical solutions are being developed to describe the mean and variance (first and second statistical moments) of flow to a well in a randomly heterogeneous medium. Geostatistical inverse interpretation of cross-hole tests yields a smoothed but detailed "tomographic" image of how parameters actually vary in three-dimensional space, together with corresponding measures of estimation uncertainty. Moment solutions may soon allow one to interpret well tests in terms of statistical parameters such as the mean and variance of log permeability, its spatial autocorrelation and statistical anisotropy. The idea of geostatistical cross-hole tomography is illustrated through pneumatic injection tests conducted in unsaturated fractured tuff at the Apache Leap Research Site near Superior, Arizona. The idea of using moment equations to interpret well-tests statistically is illustrated through a recently developed three-dimensional solution for steady state flow to a well in a bounded, randomly heterogeneous, statistically anisotropic aquifer.

  15. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  16. Statistical interpretation of the process of evolution and functioning of Audiovisual Archives

    Directory of Open Access Journals (Sweden)

    Nuno Miguel Epifânio

    2013-03-01

    The article offers a typology of the operating conditions of audiovisual archives, based on the interpretation of results obtained from a quantitative sample survey. The study involved 43 institutions of different natures and sizes, ranging across national and foreign organizations, drawing on questionnaires answered by communication services and cultural institutions. The analysis found a variety of guidelines on the management of information preservation and characterized the typology of the record collections of each archive. The data collected thus allowed an overview to be built of the operating model of each organization surveyed in this study.

  17. College Students' Interpretation of Research Reports on Group Differences: The Tall-Tale Effect

    Science.gov (United States)

    Hogan, Thomas P.; Zaboski, Brian A.; Perry, Tiffany R.

    2015-01-01

    How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations' reports and asked for estimates of scores obtained by the original participants in the studies. These estimates…

  18. ADHD Rating Scale-IV: Checklists, Norms, and Clinical Interpretation

    Science.gov (United States)

    Pappas, Danielle

    2006-01-01

    This article reviews the "ADHD Rating Scale-IV: Checklists, norms, and clinical interpretation," a norm-referenced checklist that measures the symptoms of attention deficit/hyperactivity disorder (ADHD) according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; American Psychiatric…

  19. Lessons learned on the presentation of scan data

    International Nuclear Information System (INIS)

    King, David A.; Vitkus, Tim

    2015-01-01

    Technicians performed a radiological survey of a surplus metal tank to support disposition planning at an Oak Ridge, Tennessee site. The survey included radiation scans to identify contamination and, if identified, define the boundary and magnitude of contamination. Fixed-point 1-minute measurements were also collected at randomly selected locations for comparison against the site's free release limit of 5,000 disintegrations per minute per 100 cm² (dpm/100 cm²) (0.83 Bq/cm²). Scan data were recorded using a data logger as a means to document surveyor observation - logged data captured at 1-second intervals and converted to counts per minute (cpm) by the data logger software were presented in the project report. Both the qualitative scan data (in cpm) and the quantitative direct measurements (in dpm/100 cm²) were reported for completeness, so stakeholders had all available information to support disposition decisions. However, a new stakeholder - introduced to the project at the reporting phase of work - used the instrument efficiency and background data to convert the scan data from cpm to dpm/100 cm², then compared the converted results to the site limit. Many of the converted values exceeded 5,000 dpm/100 cm². This resulted in delays in tank disposition and additional project costs which could have been avoided if the proper use and interpretation of scan data, and the implications of radon progeny buildup on oxidized metal surfaces, had been better communicated.
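
    The conversion at the heart of the dispute is simple arithmetic: net count rate divided by detection efficiency, normalized to 100 cm². The sketch below reproduces it with made-up instrument constants (efficiency, background and probe area are assumptions, not values from the project report); the abstract's point is precisely that applying this conversion to qualitative scan data can mislead.

    ```python
    # All instrument constants below are assumed for illustration only.
    background_cpm = 60.0      # assumed detector background, counts per minute
    total_efficiency = 0.10    # assumed counts per disintegration
    probe_area_cm2 = 15.5      # assumed active probe area

    def cpm_to_dpm_per_100cm2(gross_cpm):
        net_cpm = gross_cpm - background_cpm
        dpm = net_cpm / total_efficiency       # correct for detection efficiency
        return dpm * (100.0 / probe_area_cm2)  # normalize to a 100 cm^2 area

    for cpm in (100, 250, 500):
        print(f"{cpm} cpm -> {cpm_to_dpm_per_100cm2(cpm):.0f} dpm/100 cm^2")
    ```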

  20. Wilhelm Wundt's Theory of Interpretation

    Directory of Open Access Journals (Sweden)

    Jochen Fahrenberg

    2008-09-01

    Full Text Available Wilhelm WUNDT was a pioneer in experimental and physiological psychology. However, his theory of interpretation (hermeneutics) remains virtually neglected. According to WUNDT psychology belongs to the domain of the humanities (Geisteswissenschaften), and, throughout his books and research, he advocated two basic methodologies: experimentation (as the means of controlled self-observation) and interpretative analysis of mental processes and products. He was an experimental psychologist and a profound expert in traditional hermeneutics. Today, he may still be acknowledged as the author of the monumental Völkerpsychologie, but his advances in epistemology and methodology go unacknowledged. His subsequent work, the Logik (1908/1921), contains about 120 pages on hermeneutics. In the present article a number of issues are addressed. Noteworthy was WUNDT's general intention to account for the logical constituents and the psychological process of understanding, and his reflections on quality control. In general, WUNDT demanded methodological pluralism and a complementary approach to the study of consciousness and neurophysiological processes. In the present paper WUNDT's approach is related to the continuing controversy on basic issues in methodology, e.g. experimental and statistical methods vs. qualitative (hermeneutic) methods. Varied explanations are given for the one-sided or distorted reception of WUNDT's methodology. Presently, in Germany the basic program of study in psychology lacks thorough teaching and training in qualitative (hermeneutic) methods. Appropriate courses are not included in the curricula, in contrast to the training in experimental design, observation methods, and statistics. URN: urn:nbn:de:0114-fqs0803291

  1. Nanoscale electrical property studies of individual GeSi quantum rings by conductive scanning probe microscopy.

    Science.gov (United States)

    Lv, Yi; Cui, Jian; Jiang, Zuimin M; Yang, Xinju

    2012-11-29

    The nanoscale electrical properties of individual self-assembled GeSi quantum rings (QRs) were studied by scanning probe microscopy-based techniques. The surface potential distributions of individual GeSi QRs are obtained by scanning Kelvin microscopy (SKM). Ring-shaped work function distributions are observed, showing that the rim of the QRs has a larger work function than their central hole. By combining the SKM results with those obtained by conductive atomic force microscopy and scanning capacitance microscopy, the correlations between the surface potential, conductance, and carrier density distributions are revealed, and a possible interpretation for the QRs' conductance distributions is suggested.

  2. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  4. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12–16 January 2015. The book explores the applicability of robust methods in non-traditional areas, including new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  5. Components of the Pearson-Fisher chi-squared statistic

    Directory of Open Access Journals (Sweden)

    G. D. Raynery

    2002-01-01

    interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.

  6. Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures

    Energy Technology Data Exchange (ETDEWEB)

    Udey, Ruth Norma [Michigan State Univ., East Lansing, MI (United States)

    2013-01-01

    Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.

  7. Advanced radiographic scanning, enhancement and electronic data storage

    International Nuclear Information System (INIS)

    Savoie, C.; Rivest, D.

    2003-01-01

    It is a well-known fact that radiographs deteriorate with time, and substantial cost is attributed to cataloguing and storage. To eliminate deterioration issues and save time retrieving radiographs, laser scanning techniques were developed in conjunction with viewing and enhancement software, allowing radiographs to be scanned and stored electronically for future reference. Today's radiographic laser scanners are capable of capturing images with an optical density of up to 4.1 at 256 grey levels and resolutions up to 4096 pixels per line. An industrial software interface was developed for the nondestructive testing industry so that certain parameters, such as scan resolution, number of scans, file format and save location, could be adjusted as needed. Once the radiographs have been scanned, the TIFF images are stored, or retrieved into Radiance software (developed by Rivest Technologies Inc.), which helps to properly interpret the radiographs. Radiance was developed to allow the user to quickly verify a radiograph's correctness or enhance its defects for comparison and future evaluation. Radiance also allows the user to zoom, measure and annotate areas of interest. Physical costs associated with cataloguing, storing and retrieving radiographs can be eliminated: radiographs can now be retrieved and viewed from CD media or a dedicated hard drive at will. For continuous searches and/or field access, dedicated hard drives controlled by a server would be the media of choice. All scanned radiographs are archived to CD media (CD-R). Laser scanning with a proper acquisition interface and easy-to-use viewing software will permit a qualified user to identify areas of interest and share this information with his/her colleagues via e-mail or web data access. (author)

  8. The Role of the Sampling Distribution in Understanding Statistical Inference

    Science.gov (United States)

    Lipson, Kay

    2003-01-01

    Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…

  9. Feasibility of F-18-FDG PET/CT scan in abdominopelvic regions

    International Nuclear Information System (INIS)

    Suga, Kazuyoshi

    2008-01-01

    F-18-2-Fluoro-2-deoxy-D-glucose (FDG) positron emission tomography (PET)/CT, which simultaneously provides metabolic function and morphology on the same tomographic section, is becoming the key imaging modality for the diagnosis and treatment strategy of malignancies in various organs. FDG PET/CT scanning of the whole body beneficially allows the assessment of the primary tumor and regional lymph nodes, distant metastases, and co-existing benign or other malignant lesions, in a ''one-stop shopping'' fashion. This technique contributes to the selection of the optimal treatment in individual patients, and can also predict histopathologic response to treatment and postoperative/post-chemoradiation prognosis. In this paper, we describe the fundamental knowledge required for accurate interpretation of FDG PET/CT scans, and review the utility of this technique for the diagnosis and treatment strategy of malignancies in the abdominal and pelvic regions. (author)

  10. Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.

    Science.gov (United States)

    Walum, Hasse; Waldman, Irwin D; Young, Larry J

    2016-02-01

    Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  11. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
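
    As a sketch of the calculation described above, the following approximates the power of a two-sided, two-sample comparison for a standardized effect size (Cohen's d) using the normal approximation; the effect size, alpha and sample sizes are illustrative, not taken from the article:

    ```python
    from scipy.stats import norm

    def power_two_sample(d, n_per_group, alpha=0.05):
        """Approximate power of a two-sided two-sample test for effect size d."""
        se = (2.0 / n_per_group) ** 0.5    # SE of the mean difference, in SD units
        z_crit = norm.ppf(1 - alpha / 2)   # two-sided critical value
        z_eff = d / se                     # noncentrality under the alternative
        return norm.cdf(z_eff - z_crit) + norm.cdf(-z_eff - z_crit)

    for n in (10, 25, 50, 100):
        print(f"n = {n:3d} per group: power = {power_two_sample(0.5, n):.2f}")
    ```

    With a medium effect (d = 0.5), roughly 64 subjects per group are needed to reach the conventional 0.8 threshold, which is why small studies so often fall below the 0.5 floor mentioned above.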

  12. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  13. Improving Classification of Airborne Laser Scanning Echoes in the Forest-Tundra Ecotone Using Geostatistical and Statistical Measures

    Directory of Open Access Journals (Sweden)

    Nadja Stumberg

    2014-05-01

    Full Text Available The vegetation in the forest-tundra ecotone zone is expected to be highly affected by climate change and requires effective monitoring techniques. Airborne laser scanning (ALS) has been proposed as a tool for the detection of small pioneer trees over such vast areas using laser height and intensity data. The main objective of the present study was to assess a possible improvement in the performance of classifying tree and nontree laser echoes from high-density ALS data. The data were collected along a 1000 km long transect stretching from southern to northern Norway. Different geostatistical and statistical measures derived from laser height and intensity values were used to extend and potentially improve simpler models that ignore the spatial context. Generalised linear models (GLM) and support vector machines (SVM) were employed as classification methods. Total accuracies and Cohen's kappa coefficients were calculated and compared to those of simpler models from a previous study. For both classification methods, all models revealed total accuracies similar to the results of the simpler models. Concerning classification performance, however, the comparison of the kappa coefficients indicated a significant improvement for some models using both GLM and SVM, with classification accuracies >94%.
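
    A minimal sketch of the two agreement measures used in the study, total accuracy and Cohen's kappa, computed here with scikit-learn on made-up tree/nontree labels:

    ```python
    from sklearn.metrics import accuracy_score, cohen_kappa_score

    y_true = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]  # 1 = tree echo, 0 = nontree echo
    y_pred = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]  # hypothetical classifier output

    print("total accuracy:", accuracy_score(y_true, y_pred))
    print("Cohen's kappa: ", round(cohen_kappa_score(y_true, y_pred), 3))
    ```

    Kappa corrects the raw accuracy for chance agreement, which is why it is the more informative of the two when classes are imbalanced, as tree and nontree echoes typically are.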

  14. Use of demonstrations and experiments in teaching business statistics

    OpenAIRE

    Johnson, D. G.; John, J. A.

    2003-01-01

    The aim of a business statistics course should be to help students think statistically and to interpret and understand data, rather than to focus on mathematical detail and computation. To achieve this students must be thoroughly involved in the learning process, and encouraged to discover for themselves the meaning, importance and relevance of statistical concepts. In this paper we advocate the use of experiments and demonstrations as aids to achieving these goals. A number of demonstrations...

  15. Wind Statistics from a Forested Landscape

    DEFF Research Database (Denmark)

    Arnqvist, Johan; Segalini, Antonio; Dellwik, Ebba

    2015-01-01

    An analysis and interpretation of measurements from a 138-m tall tower located in a forested landscape is presented. Measurement errors and statistical uncertainties are carefully evaluated to ensure high data quality. A 40° wide wind-direction sector is selected as the most...... representative for large-scale forest conditions, and from that sector first-, second- and third-order statistics, as well as analyses regarding the characteristic length scale, the flux-profile relationship and surface roughness are presented for a wide range of stability conditions. The results are discussed...

  16. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankin...

  17. The influence of control parameter estimation on large scale geomorphological interpretation of pointclouds

    Science.gov (United States)

    Dorninger, P.; Koma, Z.; Székely, B.

    2012-04-01

    In recent years, laser scanning, also referred to as LiDAR, has proved to be an important tool for topographic data acquisition. Basically, laser scanning acquires a more or less homogeneously distributed point cloud. These points represent natural objects like terrain and vegetation as well as man-made objects such as buildings, streets, power lines, or other constructions. Due to the enormous amount of data provided by current scanning systems, which capture up to several hundred thousand points per second, the immediate application of such point clouds for large-scale interpretation and analysis is often prohibitive due to restrictions of the hardware and software infrastructure. To overcome this, numerous methods for the determination of derived products exist. Commonly, Digital Terrain Models (DTM) or Digital Surface Models (DSM) are derived to represent the topography using a regular grid as the data structure. The obvious advantages are a significant reduction of the amount of data and the introduction of an implicit neighborhood topology enabling the application of efficient post-processing methods. The major disadvantages are the loss of 3D information (i.e. overhangs) as well as the loss of information due to the interpolation approach used. We introduced a segmentation approach enabling the determination of planar structures within a given point cloud. It was originally developed for the purpose of building modeling but has proven to be well suited for large-scale geomorphological analysis as well. The result is an assignment of the original points to a set of planes. Each plane is represented by its plane parameters. Additionally, numerous quality and quantity parameters are determined (e.g. aspect, slope, local roughness, etc.). In this contribution, we investigate the influence of the control parameters required for the plane segmentation on the geomorphological interpretation of the derived product. The respective control parameters may be determined
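
    A minimal sketch of the core operation such a segmentation rests on: fitting a plane to a point neighbourhood by least squares (via SVD) and deriving per-plane attributes such as slope, aspect and local roughness from the normal vector and the residuals. The point cloud is synthetic, and the method's region growing and control parameters are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 10, size=(200, 2))                          # x, y positions
    z = 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + rng.normal(0, 0.02, 200)  # tilted noisy plane
    cloud = np.column_stack([xy, z])

    centroid = cloud.mean(axis=0)
    _, _, vt = np.linalg.svd(cloud - centroid)  # principal axes of the points
    normal = vt[-1]                             # least-variance direction = plane normal
    if normal[2] < 0:                           # orient the normal upwards
        normal = -normal

    slope = np.degrees(np.arccos(normal[2]))                     # tilt from horizontal
    aspect = np.degrees(np.arctan2(normal[1], normal[0])) % 360  # downslope direction
    roughness = ((cloud - centroid) @ normal).std()              # rms plane residual
    print(f"slope {slope:.1f} deg, aspect {aspect:.1f} deg, roughness {roughness:.3f}")
    ```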

  18. A systematic approach to the interpretation of preoperative staging MRI for rectal cancer.

    Science.gov (United States)

    Taylor, Fiona G M; Swift, Robert I; Blomqvist, Lennart; Brown, Gina

    2008-12-01

    The purpose of this article is to provide an aid to the systematic evaluation of MRI in staging rectal cancer. MRI has been shown to be an effective tool for the accurate preoperative staging of rectal cancer. In the Magnetic Resonance Imaging and Rectal Cancer European Equivalence Study (MERCURY), imaging workshops were held for participating radiologists to ensure standardization of scan acquisition techniques and interpretation of the images. In this article, we report how the information was obtained and give examples of the images and how they are interpreted, with the aim of providing a systematic approach to the reporting process.

  19. Basic principles of pulmonary anatomy and physiology for CT interpretation of lung diseases

    International Nuclear Information System (INIS)

    Remy-Jardin, M.; Beigelman, C.; Desfontaines, C.; Dupont, S.; Remy, J.

    1989-01-01

    High-resolution CT is now the method of choice in the diagnosis of lung diseases, especially in their early recognition. However, the radiologist must be aware of precise anatomic, pathologic and physiologic data which are observed when the patient is supine. This concept leads to a transversal analysis of lung diseases by CT, as previously proposed in the coronal and sagittal planes for conventional chest X-ray interpretation. The aim of the study is to demonstrate that these regional differences in the lung must be included in the method of chest scanning but also in the interpretation of lung diseases [fr]

  20. The scanning Compton polarimeter for the SLD experiment

    International Nuclear Information System (INIS)

    Woods, M.

    1996-10-01

    For the 1994/95 run of the SLD experiment at SLAC, a Compton polarimeter measured the luminosity-weighted electron beam polarization to be (77.2 ± 0.5)%. This excellent accuracy is achieved by measuring the rate asymmetry of Compton-scattered electrons near the kinematic endpoint. The polarimeter takes data continuously while the electron and positron beams are in collision and achieves a statistical precision of better than 1% in a three minute run. To calibrate the polarimeter and demonstrate its accuracy, many scans are frequently done. These include scans of the laser polarization, the detector position with respect to the kinematic edge, and the laser power

  1. Reading Statistics And Research

    OpenAIRE

    Akbulut, Reviewed By Yavuz

    2008-01-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and ku...

  2. Surveys Assessing Students' Attitudes toward Statistics: A Systematic Review of Validity and Reliability

    Science.gov (United States)

    Nolan, Meaghan M.; Beran, Tanya; Hecker, Kent G.

    2012-01-01

    Students with positive attitudes toward statistics are likely to show strong academic performance in statistics courses. Multiple surveys measuring students' attitudes toward statistics exist; however, a comparison of the validity and reliability of interpretations based on their scores is needed. A systematic review of relevant electronic…

  3. Philosophical perspectives on quantum chaos: Models and interpretations

    Science.gov (United States)

    Bokulich, Alisa Nicole

    2001-09-01

    The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and

  4. Interpretive Media Study and Interpretive Social Science.

    Science.gov (United States)

    Carragee, Kevin M.

    1990-01-01

    Defines the major theoretical influences on interpretive approaches in mass communication, examines the central concepts of these perspectives, and provides a critique of these approaches. States that the adoption of interpretive approaches in mass communication has ignored varied critiques of interpretive social science. Suggests that critical…

  5. Interpreters, Interpreting, and the Study of Bilingualism.

    Science.gov (United States)

    Valdes, Guadalupe; Angelelli, Claudia

    2003-01-01

    Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…

  6. Comparison of statistical sampling methods with ScannerBit, the GAMBIT scanning module

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); McKay, James; Scott, Pat [Imperial College London, Department of Physics, Blackett Laboratory, London (United Kingdom); Farmer, Ben; Conrad, Jan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Roebber, Elinore [McGill University, Department of Physics, Montreal, QC (Canada); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Collaboration: The GAMBIT Scanner Workgroup

    2017-11-15

    We introduce ScannerBit, the statistics and sampling module of the public, open-source global fitting framework GAMBIT. ScannerBit provides a standardised interface to different sampling algorithms, enabling the use and comparison of multiple computational methods for inferring profile likelihoods, Bayesian posteriors, and other statistical quantities. The current version offers random, grid, raster, nested sampling, differential evolution, Markov Chain Monte Carlo (MCMC) and ensemble Monte Carlo samplers. We also announce the release of a new standalone differential evolution sampler, Diver, and describe its design, usage and interface to ScannerBit. We subject Diver and three other samplers (the nested sampler MultiNest, the MCMC GreAT, and the native ScannerBit implementation of the ensemble Monte Carlo algorithm T-Walk) to a battery of statistical tests. For this we use a realistic physical likelihood function, based on the scalar singlet model of dark matter. We examine the performance of each sampler as a function of its adjustable settings, and the dimensionality of the sampling problem. We evaluate performance on four metrics: optimality of the best fit found, completeness in exploring the best-fit region, number of likelihood evaluations, and total runtime. For Bayesian posterior estimation at high resolution, T-Walk provides the most accurate and timely mapping of the full parameter space. For profile likelihood analysis in less than about ten dimensions, we find that Diver and MultiNest score similarly in terms of best fit and speed, outperforming GreAT and T-Walk; in ten or more dimensions, Diver substantially outperforms the other three samplers on all metrics. (orig.)
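
    For readers unfamiliar with the sampler family Diver belongs to, the sketch below runs a generic differential evolution optimizer on a toy two-parameter negative log-likelihood. It uses SciPy's implementation, not GAMBIT's ScannerBit interface, and the Rosenbrock target is only a stand-in for a physics likelihood:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    def neg_log_like(theta):
        """Toy two-parameter target standing in for a physics likelihood."""
        x, y = theta
        return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2  # Rosenbrock valley

    result = differential_evolution(neg_log_like, bounds=[(-5, 5), (-5, 5)],
                                    seed=1, tol=1e-10, maxiter=2000)
    print("best fit:", np.round(result.x, 4), " -lnL:", result.fun)
    ```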

  7. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
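
    A minimal sketch of the empirical Bayes idea described above: a study's own linkage statistic at a locus is shrunk toward a prior estimated from other genome scans, with a weight set by the between-study variance. This normal-normal shrinkage with assumed values is a generic illustration, not the authors' exact estimator.

    ```python
    import numpy as np

    def eb_update(z_own, se_own, z_others):
        """Precision-weighted combination of one scan's statistic with a
        prior estimated empirically from other scans of the same phenotype."""
        prior_mean = np.mean(z_others)
        prior_var = np.var(z_others, ddof=1)       # between-study heterogeneity
        w = prior_var / (prior_var + se_own ** 2)  # weight on the study's own data
        return w * z_own + (1.0 - w) * prior_mean

    # Hypothetical linkage statistics at one locus from four scans
    print(eb_update(z_own=3.1, se_own=1.0, z_others=[1.8, 2.4, 2.1]))
    ```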

  8. Data precision of X-ray fluorescence (XRF) scanning of discrete samples with the ITRAX XRF core-scanner exemplified on loess-paleosol samples

    Science.gov (United States)

    Profe, Jörn; Ohlendorf, Christian

    2017-04-01

    XRF-scanning has been the state-of-the-art technique for geochemical analyses in marine and lacustrine sedimentology for more than a decade. However, little attention has been paid to data precision and technical limitations so far. Using homogenized, dried and powdered samples (certified geochemical reference standards and samples from a lithologically contrasting loess-paleosol sequence) minimizes many adverse effects that influence the XRF signal when analyzing wet sediment cores. This allows the investigation of data precision under ideal conditions and documents a new application of the XRF core-scanner technology at the same time. Reliable interpretation of XRF results requires evaluating the data precision of single elements as a function of X-ray tube, measurement time, sample compaction and quality of peak fitting. Data precision was quantified by measuring each sample ten times. The data precision of XRF measurements theoretically obeys Poisson statistics. Fe and Ca exhibit the largest deviations from Poisson statistics. The same elements show the lowest mean relative standard deviations, in the range of 0.5% to 1%. This represents the technical limit of data precision achievable by the installed detector. Measurement times ≥ 30 s reveal mean relative standard deviations below 4% for most elements. The quality of peak fitting is only relevant for elements with overlapping fluorescence lines, such as Ba, Ti and Mn, or for elements with low concentrations, such as Y. Differences in sample compaction are marginal and do not change the mean relative standard deviation considerably. Data precision is in the range reported for geochemical reference standards measured by conventional techniques. Therefore, XRF scanning of discrete samples provides a cost- and time-efficient alternative to conventional multi-element analyses. As the best trade-off between economical operation and data quality, we recommend a measurement time of 30 s, resulting in a total scan time of 30 minutes
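
    A minimal sketch of the Poisson benchmark against which the study judges precision: for counting statistics the relative standard deviation of N counts is 1/sqrt(N), so longer measurement times give proportionally smaller relative scatter. Count rates and times are illustrative, not instrument values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    rate = 500                                  # assumed counts per second for one element
    for t in (5, 10, 30, 60):                   # measurement times in seconds
        replicates = rng.poisson(rate * t, 10)  # ten-fold repeat measurement
        rsd = replicates.std(ddof=1) / replicates.mean()
        print(f"{t:3d} s: observed RSD {100 * rsd:5.2f}%, "
              f"Poisson limit {100 / np.sqrt(rate * t):5.2f}%")
    ```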

  9. Bone scan as a screening test for missed fractures in severely injured patients.

    Science.gov (United States)

    Lee, K-J; Jung, K; Kim, J; Kwon, J

    2014-12-01

    In many cases, patients with severe blunt trauma have multiple fractures throughout the body. These fractures are often not detectable by history or physical examination, and their diagnosis can be delayed or even missed. Thus, a screening test for fractures of the whole body is required after initial management. We performed this study to evaluate the reliability of bone scans for detecting missed fractures in patients with multiple severe traumas, and we analyzed the causes of the missed fractures detected by bone scan. A bone scan is useful as a screening test for fractures of the entire body in severe trauma patients who have passed the acute phase. We reviewed the electronic medical records of severe trauma patients who underwent a bone scan from September 2009 to December 2010. Demographic and medical data were compared and statistically analyzed to determine whether missed fractures were detected after bone scan in the two groups. A total of 382 patients with an injury severity score (ISS) greater than 16 points and multiple traumas visited the emergency room. One hundred and thirty-one patients underwent bone scan, and 81 patients were identified with missed fractures by bone scan. The most frequent location for missed fractures was the rib area (55 cases, 41.98%), followed by the extremities (42 cases, 32.06%). The missed fractures that required surgery or splinting were most common in the extremities (11 cases). In univariate analysis, higher ISS scores and mechanism of injury were related to the probability that missed fractures would be found with a bone scan. The ISS score was statistically significant in multivariate analysis. Bone scan is an effective method of detecting missed fractures among patients with multiple severe traumas. Level IV, retrospective study. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  10. On the interpretations of Langevin stochastic equation in different coordinate systems

    International Nuclear Information System (INIS)

    Martinez, E.; Lopez-Diaz, L.; Torres, L.; Alejos, O.

    2004-01-01

    The stochastic Langevin Landau-Lifshitz equation is usually utilized in the micromagnetics formalism to account for thermal effects. Commonly, two different interpretations of the stochastic integrals can be made: Ito and Stratonovich. In this work, the Langevin-Landau-Lifshitz (LLL) equation is written in both Cartesian and spherical coordinates. If spherical coordinates are employed, the noise is additive, and therefore the Ito and Stratonovich solutions are equal. This is not the case when the (LLL) equation is written in Cartesian coordinates. In this case, the Langevin equation must be interpreted in the Stratonovich sense in order to reproduce correct statistical results. Nevertheless, the statistics of the numerical results obtained from Euler-Ito and Euler-Stratonovich schemes are equivalent due to the additional numerical constraint imposed in the Cartesian system after each time step, which itself assures that the magnitude of the magnetization is preserved.
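
    The Ito/Stratonovich distinction for multiplicative noise can be illustrated without the full LLL machinery. The sketch below integrates scalar geometric Brownian motion with an Euler-Maruyama (Ito) scheme and a Heun (Stratonovich) scheme; the mean of ln x differs between the two by the spurious drift b²/2, which vanishes for additive noise. This is a one-dimensional illustration with assumed parameters, not a micromagnetic simulation.

    ```python
    import numpy as np

    a, b = 0.0, 0.5                  # drift and multiplicative noise strength
    dt, n_steps, n_paths = 1e-3, 1000, 10000
    rng = np.random.default_rng(7)

    x_ito = np.ones(n_paths)
    x_str = np.ones(n_paths)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        x_ito += a * x_ito * dt + b * x_ito * dW        # Euler-Maruyama (Ito)
        pred = x_str + a * x_str * dt + b * x_str * dW  # Heun predictor
        x_str += a * x_str * dt + 0.5 * b * (x_str + pred) * dW  # Stratonovich corrector

    T = n_steps * dt
    print(f"Ito          mean ln x: {np.log(x_ito).mean():+.3f}  theory: {(a - b**2 / 2) * T:+.3f}")
    print(f"Stratonovich mean ln x: {np.log(x_str).mean():+.3f}  theory: {a * T:+.3f}")
    ```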

  11. Fast-scan, beam-profile monitor

    International Nuclear Information System (INIS)

    Waugh, A.F.

    1977-01-01

    A minimodular data-acquisition system can be used to rapidly interrogate a 45-point matrix of beam-current sampling targets over the 3- x 12-in. rectangular output beam cross section of a 50-A neutral-beam ion source. This system, operating at a throughput rate of 12 μs per channel, can make several complete scans during the 10- to 25-ms-duration beam pulse. Data obtained are available in both analog and digital form. The analog signal is used to create an immediately interpretable CRT display of the beam-current density profile that shows how well the source is aimed. The digital data are held in buffer memory until transfer to a minicomputer for software processing and plotting

  12. Toward Establishing the Validity of the Resource Interpreter's Self-Efficacy Instrument

    Science.gov (United States)

    Smith, Grant D.

    Interpretive rangers serve as one of the major educational resources that visitors may encounter during their visit to a park or other natural area, yet our understanding of their professional growth remains limited. This study helps address this issue by developing an instrument that evaluates the beliefs of resource interpreters regarding their capabilities of communicating with the public. The resulting 11-item instrument was built around the construct of Albert Bandura's self-efficacy theory (Bandura, 1977, 1986, 1997), used guidelines and principles developed over the course of 30 years of teacher efficacy studies (Bandura, 2006; Gibson & Dembo, 1984; Riggs & Enochs, 1990; Tschannen-Moran & Hoy, 2001; Tschannen-Moran, Hoy, & Hoy, 1998), and probed areas of challenge that are unique to the demands of resource interpretation (Brochu & Merriman, 2002; Ham, 1992; Knudson, Cable, & Beck, 2003; Larsen, 2003; Tilden, 1977). A voluntary convenience sample of 364 National Park Service rangers was collected in order to conduct the statistical analyses needed to winnow the draft instrument down from 47 items in its original form to 11 items in its final state. Statistical analyses used in this process included item-total correlation, index of discrimination, exploratory factor analysis, and confirmatory factor analysis.

  13. Quantum Statistics and Entanglement Problems

    OpenAIRE

    Trainor, L. E. H.; Lumsden, Charles J.

    2002-01-01

    Interpretations of quantum measurement theory have been plagued by two questions, one concerning the role of observer consciousness and the other the entanglement phenomenon arising from the superposition of quantum states. We emphasize here the remarkable role of quantum statistics in describing the entanglement problem correctly and discuss the relationship to issues arising from current discussions of intelligent observers in entangled, decohering quantum worlds.

  14. Crunching Numbers: What Cancer Screening Statistics Really Tell Us

    Science.gov (United States)

    Cancer screening studies have shown that more screening does not necessarily translate into fewer cancer deaths. This article explains how to interpret the statistics used to describe the results of screening studies.

  15. Tip-Dependent Scanning Tunneling Microscopy Imaging of Ultrathin FeO Films on Pt(111)

    DEFF Research Database (Denmark)

    Merte, Lindsay Richard; Grabow, Lars C.; Peng, Guowen

    2011-01-01

    High-resolution scanning tunneling microscope (STM) images of moiré-structured FeO films on Pt(111) were obtained in a number of different tip-dependent imaging modes. For the first time, the STM images are distinguished and interpreted unambiguously with the help of distinct oxygen...

  16. Toward smartphone applications for geoparks information and interpretation systems in China

    Science.gov (United States)

    Li, Qian; Tian, Mingzhong; Li, Xingle; Shi, Yihua; Zhou, Xu

    2015-11-01

    Geopark information and interpretation systems are both necessary infrastructure in geopark planning and construction programs, and they are also essential for geoeducation and geoconservation in geopark tourism. The current state and development of information and interpretation systems in China's geoparks are presented and analyzed in this paper. Statistics showed that fewer than half of the geoparks ran websites, fewer still maintained databases, and less than one percent of all Internet/smartphone applications were used for geopark tourism. The results of our analysis indicated that smartphone applications in geopark information and interpretation systems would provide benefits such as accelerated geopark science popularization and education and facilitated interactive communication between geoparks and tourists.

  17. Vocational students' learning preferences: the interpretability of ipsative data.

    Science.gov (United States)

    Smith, P J

    2000-02-01

    A number of researchers have argued that ipsative data are not suitable for statistical procedures designed for normative data. Others have argued that the interpretability of such analyses of ipsative data is little affected where the number of variables and the sample size are sufficiently large. The research reported here is a factor analysis of the scores on the Canfield Learning Styles Inventory for 1,252 students in vocational education. The results of the factor analysis of these ipsative data were examined in the context of existing theory and research on vocational students and lend support to the argument that factor analysis of ipsative data can provide sensibly interpretable results.

  18. The statistics of multi-step direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1991-01-01

    We propose a quantum-statistical framework that provides an integrated perspective on the differences and similarities between the many current models for multi-step direct reactions in the continuum. It is argued that, to obtain a statistical theory, two physically different ways of postulating randomness are conceivable, called leading-particle statistics and residual-system statistics respectively. We present a new leading-particle statistics theory for multi-step direct reactions. It is shown that the model of Feshbach et al. can be derived as a simplification of this theory and thus can be founded solely upon leading-particle statistics. The models developed by Tamura et al. and Nishioka et al. are based upon residual-system statistics and hence fall into a physically different class of multi-step direct theories, although the resulting cross-section formulae for the important first step are shown to be the same. The widely used semi-classical models, such as the generalized exciton model, can be interpreted as further phenomenological simplifications of the leading-particle statistics theory. A more comprehensive exposition will appear before long. (author). 32 refs, 4 figs

  19. Scan Order in Gibbs Sampling: Models in Which it Matters and Bounds on How Much.

    Science.gov (United States)

    He, Bryan; De Sa, Christopher; Mitliagkas, Ioannis; Ré, Christopher

    2016-01-01

    Gibbs sampling is a Markov Chain Monte Carlo sampling technique that iteratively samples variables from their conditional distributions. There are two common scan orders for the variables: random scan and systematic scan. Due to the benefits of locality in hardware, systematic scan is commonly used, even though most statistical guarantees are only for random scan. While it has been conjectured that the mixing times of random scan and systematic scan do not differ by more than a logarithmic factor, we show by counterexample that this is not the case, and we prove that the mixing times do not differ by more than a polynomial factor under mild conditions. To prove these relative bounds, we introduce a method of augmenting the state space to study systematic scan using conductance.
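
    A minimal sketch of the two scan orders, applied to a bivariate Gaussian with correlation rho, whose conditionals are known in closed form: systematic scan sweeps both coordinates every iteration, while random scan updates one uniformly chosen coordinate. Parameters are illustrative.

    ```python
    import numpy as np

    rho, n_iter = 0.9, 50000
    rng = np.random.default_rng(3)
    cond_sd = np.sqrt(1 - rho ** 2)  # conditional standard deviation

    def gibbs(systematic):
        x = np.zeros(2)
        samples = np.empty((n_iter, 2))
        for t in range(n_iter):
            order = (0, 1) if systematic else (rng.integers(2),)
            for i in order:
                x[i] = rng.normal(rho * x[1 - i], cond_sd)  # exact conditional draw
            samples[t] = x
        return samples

    for name, s in (("systematic", gibbs(True)), ("random", gibbs(False))):
        print(name, "sample correlation:", np.corrcoef(s.T)[0, 1].round(3))
    ```

    Both chains target the same distribution; what differs, and what the paper bounds, is how quickly they mix.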

  20. Generation of 3D Virtual Geographic Environment Based on Laser Scanning Technique

    Institute of Scientific and Technical Information of China (English)

    DU Jie; CHEN Xiaoyong; FumioYamazaki

    2003-01-01

    This paper demonstrates an experiment on the generation of a 3D virtual geographic environment from experimental airborne laser scanning data, using a set of algorithms and methods developed to automatically interpret range images, extract geo-spatial features and reconstruct geo-objects. The algorithms and methods for the interpretation and modeling of laser scanner data include triangulated-irregular-network (TIN)-based range image interpolation; mathematical-morphology (MM)-based range image filtering, feature extraction and range image segmentation; feature generalization and optimization; 3D object reconstruction and modeling; and computer-graphics (CG)-based visualization and animation of the virtual geographic environment.

  1. Statistical learning from a regression perspective

    CERN Document Server

    Berk, Richard A

    2016-01-01

    This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...

  2. Image acquisition and interpretation criteria for 99mTc-HMPAO-labelled white blood cell scintigraphy: results of a multicentre study

    International Nuclear Information System (INIS)

    Erba, Paola A.; Glaudemans, Andor W.J.M.; Dierckx, Rudi A.J.O.; Veltman, Niels C.; Sollini, Martina; Pacilio, Marta; Galli, Filippo; Signore, Alberto; Sapienza Univ., Rome; Sapienza Univ., Rome

    2014-01-01

    There is no consensus yet on the best protocol for planar image acquisition and interpretation in radiolabelled white blood cell (WBC) scintigraphy. This may account for differences in reported diagnostic accuracy amongst different centres. This was a multicentre retrospective study analysing 235 WBC scans divided into two groups. The first group of scans (105 patients) was acquired with a fixed-time acquisition protocol and the second group (130 patients) with a decay time-corrected acquisition protocol. Planar images were interpreted both qualitatively and semiquantitatively. Three blinded readers analysed the images. The most accurate imaging acquisition protocol comprised image acquisition at 3 - 4 h and at 20 - 24 h in time mode, with acquisition times corrected for isotope decay. Using this protocol, visual analysis had high sensitivity and specificity in the diagnosis of infection. Semiquantitative analysis could be used in doubtful cases, with no cut-off for the percentage increase in radiolabelled WBC over time as a criterion to define a positive scan. (orig.)

  3. Statistical analysis of CT brain scans in the evaluation of cerebral atrophy and hydrocephalus

    International Nuclear Information System (INIS)

    Oberthur, J.; Baddeley, H.; Jayasinghe, L.; Walsh, P.

    1983-01-01

    All the subjects with a visual CT diagnosis of atrophy or hydrocephalus showed variations from normal in excess of two standard deviations, so the standard-deviation analysis method can be regarded as being as sensitive as visual interpretation. However, three patients in the control group were also flagged, although their results were only in the borderline range. Limitations of the study are discussed

  4. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. The use, reporting, and interpretation of inferential statistics in nursing research need substantial
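
    As a sketch of two practices the review tracked, the following reports Cohen's d with an approximate 95% confidence interval alongside a two-sample t-test; the data are simulated for illustration.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    a = rng.normal(10.0, 2.0, 40)  # treatment group (simulated)
    b = rng.normal(9.0, 2.0, 40)   # control group (simulated)

    t, p = stats.ttest_ind(a, b)
    n1, n2 = len(a), len(b)
    pooled_sd = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1))
                        / (n1 + n2 - 2))
    d = (a.mean() - b.mean()) / pooled_sd

    # Large-sample standard error of d (Hedges & Olkin approximation)
    se_d = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}, "
          f"95% CI [{d - 1.96 * se_d:.2f}, {d + 1.96 * se_d:.2f}]")
    ```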

  5. Critical Views of 8th Grade Students toward Statistical Data in Newspaper Articles: Analysis in Light of Statistical Literacy

    Science.gov (United States)

    Guler, Mustafa; Gursoy, Kadir; Guven, Bulent

    2016-01-01

    Understanding and interpreting biased data, decision-making in accordance with the data, and critically evaluating situations involving data are among the fundamental skills necessary in the modern world. To develop these required skills, emphasis on statistical literacy in school mathematics has been gradually increased in recent years. The…

  6. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    Full Text Available For any quality evaluation in sports, science, education, and similar fields, it is useful to collect data in order to construct a strategy for improving the quality of the services offered to users. For this purpose, statistical software packages are used to process the collected data with the aim of increasing customer satisfaction. The principle is demonstrated with the example of student satisfaction ratings at the Belgrade Polytechnic, where students assessed the quality of the institution. The emphasis here is on statistical analysis as a tool for quality control aimed at improvement, not on the interpretation of results. The approach can therefore be used as a model for improving overall results in sport.

  7. Surgical retroperitoneoscopic and transperitoneoscopic access in varicocelectomy: duplex scan results in pediatric population.

    Science.gov (United States)

    Mancini, Stefano; Bulotta, Anna Lavinia; Molinaro, Francesco; Ferrara, Francesco; Tommasino, Giulio; Messina, Mario

    2014-12-01

    This is a retrospective study comparing duplex scan results of the laparoscopic Palomo technique performed through retroperitoneal and transperitoneal approaches for varicocelectomy in children. We statistically analysed recurrence, testicular volume growth and complications. Surgical intervention was performed using transperitoneoscopic (group A) or retroperitoneoscopic access (group B). Duplex scan control was performed after 12 months (T1), after 2 years (T2) and, in most patients, a final time at 18 years of age. Statistical analysis was performed using the t-test for parametric data. Differences in proportions were evaluated using the χ2 or Fisher's exact test. We treated 120 children (age range 10-17 years) who presented with asymptomatic grade IV reflux, Coolsaet 1, associated with left testicular hypotrophy in 36.6% of the cases (44 patients). No intra-operative complications occurred. Duplex scan examination showed an increase of left testicular growth in both groups, with complete disappearance of hypotrophy in both groups after 24 months. Hydrocele, diagnosed clinically and confirmed with duplex scan, was the most frequent post-operative complication (22/120 cases; 18.3%). This study showed the importance of duplex scan at all steps of this vascular pathology in children, and that there is no significant difference in results between the two surgical techniques except for hydrocele in transperitoneoscopic access. Copyright © 2014 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  8. Value and pitfalls of the lateral lung scan

    International Nuclear Information System (INIS)

    Sy, W.M.; Krol, G.; Faunce, H.; Bay, R.

    1975-01-01

    Two hundred eighty-one of 443 lung scans composed of anterior, posterior, and lateral projections (done in our hospital) demonstrated defects. In 3.9 percent of them (11 cases), the defects were delineated in the lateral views only, while in 29.2 percent (82 cases), the lateral views either outlined additional defects not appreciated on the straight views, or showed more extensive lung involvement. In the majority of instances, 56.6 percent (159 cases), the lateral views showed comparable findings and also tended to segmentally localize the defects better. However, in 10.3 percent (29 cases), defects present on the straight projections were not detected on the lateral views. Various causes that could give rise to artefactual abnormalities in the lateral lung scan and therefore inhibit its proper interpretation are reviewed and discussed. Despite these problems, the lateral may be the only view to demonstrate abnormalities and, in fact, frequently provides additional useful information

  9. Statistical considerations in the development of injury risk functions.

    Science.gov (United States)

    McMurry, Timothy L; Poplin, Gerald S

    2015-01-01

    We address 4 frequently misunderstood and important statistical ideas in the construction of injury risk functions. These include the similarities of survival analysis and logistic regression, the correct scale on which to construct pointwise confidence intervals for injury risk, the ability to discern which form of injury risk function is optimal, and the handling of repeated tests on the same subject. The statistical models are explored through simulation and examination of the underlying mathematics. We provide recommendations for the statistically valid construction and correct interpretation of single-predictor injury risk functions. This article aims to provide useful and understandable statistical guidance to improve the practice in constructing injury risk functions.
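
    One recommendation above, building pointwise confidence intervals on the linear-predictor scale and transforming them afterwards so the bounds stay within [0, 1], can be sketched as follows; the data, the statsmodels logistic fit, and the 95% level are illustrative assumptions rather than the authors' procedure.

        import numpy as np
        import statsmodels.api as sm
        from scipy.special import expit

        # Hypothetical single-predictor injury data: stimulus level vs. outcome
        x = np.array([10.0, 15, 20, 25, 30, 35, 40, 45, 50, 55])
        y = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

        fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)

        # Confidence band on the logit scale, then mapped through the inverse
        # link, so the risk bounds can never leave the [0, 1] interval
        grid = sm.add_constant(np.linspace(x.min(), x.max(), 100))
        eta = grid @ fit.params
        se = np.sqrt(np.einsum("ij,jk,ik->i", grid, fit.cov_params(), grid))
        risk, lo, hi = expit(eta), expit(eta - 1.96 * se), expit(eta + 1.96 * se)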

  10. False positive and false negative FDG-PET scans in various thoracic diseases

    International Nuclear Information System (INIS)

    Chang, Jung Min; Lee, Hyun Ju; Goo, Jin Mo; Lee, Ho Young; Lee, Jong Jin; Chung, June Key; Im, Jung Gi

    2006-01-01

    Fluorodeoxyglucose (FDG)-positron emission tomography (PET) is being used more and more to differentiate benign from malignant focal lesions, and it has been shown to be more efficacious than conventional chest computed tomography (CT). However, FDG is not a cancer-specific agent, and false positive findings in benign diseases have been reported. Infectious diseases (mycobacterial, fungal, bacterial infection), sarcoidosis, radiation pneumonitis and post-operative surgical conditions have shown intense uptake on PET scan. On the other hand, tumors with low glycolytic activity such as adenomas, bronchioloalveolar carcinomas, carcinoid tumors, low grade lymphomas and small-sized tumors have revealed false negative findings on PET scan. Furthermore, in diseases located near the physiologic uptake sites (heart, bladder, kidney, and liver), FDG-PET should be complemented with other imaging modalities to confirm results and to minimize false negative findings. Familiarity with these false positive and negative findings will help radiologists interpret PET scans more accurately and also will help to determine the significance of the findings. In this review, we illustrate false positive and negative findings of PET scan in a variety of diseases

  11. Feasibility study of radiation dose reduction in adult female pelvic CT scan with low tube-voltage and adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xin Lian; He, Wen; Chen, Jian Hong; Hu, Zhi Hai; Zhao, Li Qin [Dept. of Radiology, Beijing Friendship Hospital, Capital Medical University, Beijing (China)

    2015-10-15

    To evaluate image quality of female pelvic computed tomography (CT) scans reconstructed with the adaptive statistical iterative reconstruction (ASIR) technique combined with low tube-voltage and to explore the feasibility of its clinical application. Ninety-four patients were divided into two groups. The study group used 100 kVp, and images were reconstructed with 30%, 50%, 70%, and 90% ASIR. The control group used 120 kVp, and images were reconstructed with 30% ASIR. The noise index was 15 for the study group and 11 for the control group. The CT values and noise levels of different tissues were measured. The contrast to noise ratio (CNR) was calculated. A subjective evaluation was carried out by two experienced radiologists. The CT dose index volume (CTDIvol) was recorded. A 44.7% reduction in CTDIvol was observed in the study group (8.18 ± 3.58 mGy) compared with that in the control group (14.78 ± 6.15 mGy). No significant differences were observed in the tissue noise levels and CNR values between the 70% ASIR group and the control group (p = 0.068-1.000). The subjective scores indicated that visibility of small structures, diagnostic confidence, and the overall image quality score in the 70% ASIR group was the best, and were similar to those in the control group (1.87 vs. 1.79, 1.26 vs. 1.28, and 4.53 vs. 4.57; p = 0.122-0.585). No significant difference in diagnostic accuracy was detected between the study group and the control group (42/47 vs. 43/47, p = 1.000). Low tube-voltage combined with automatic tube current modulation and 70% ASIR allowed the low CT radiation dose to be reduced by 44.7% without losing image quality on female pelvic scan.
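
    The contrast-to-noise ratio reported above is a simple region-of-interest statistic; a minimal sketch of the usual computation (the function and array names are hypothetical, and conventions for the noise term vary between studies):

        import numpy as np

        def cnr(roi_tissue: np.ndarray, roi_reference: np.ndarray) -> float:
            """CNR between a tissue ROI and a reference ROI (e.g., muscle or fat),
            using the reference ROI's standard deviation as the noise estimate."""
            contrast = abs(roi_tissue.mean() - roi_reference.mean())
            return contrast / roi_reference.std(ddof=1)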

  12. Feasibility study of radiation dose reduction in adult female pelvic CT scan with low tube-voltage and adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Wang, Xin Lian; He, Wen; Chen, Jian Hong; Hu, Zhi Hai; Zhao, Li Qin

    2015-01-01

    To evaluate image quality of female pelvic computed tomography (CT) scans reconstructed with the adaptive statistical iterative reconstruction (ASIR) technique combined with low tube-voltage and to explore the feasibility of its clinical application. Ninety-four patients were divided into two groups. The study group used 100 kVp, and images were reconstructed with 30%, 50%, 70%, and 90% ASIR. The control group used 120 kVp, and images were reconstructed with 30% ASIR. The noise index was 15 for the study group and 11 for the control group. The CT values and noise levels of different tissues were measured. The contrast to noise ratio (CNR) was calculated. A subjective evaluation was carried out by two experienced radiologists. The CT dose index volume (CTDIvol) was recorded. A 44.7% reduction in CTDIvol was observed in the study group (8.18 ± 3.58 mGy) compared with that in the control group (14.78 ± 6.15 mGy). No significant differences were observed in the tissue noise levels and CNR values between the 70% ASIR group and the control group (p = 0.068-1.000). The subjective scores indicated that visibility of small structures, diagnostic confidence, and the overall image quality score in the 70% ASIR group was the best, and were similar to those in the control group (1.87 vs. 1.79, 1.26 vs. 1.28, and 4.53 vs. 4.57; p = 0.122-0.585). No significant difference in diagnostic accuracy was detected between the study group and the control group (42/47 vs. 43/47, p = 1.000). Low tube-voltage combined with automatic tube current modulation and 70% ASIR allowed the low CT radiation dose to be reduced by 44.7% without losing image quality on female pelvic scan

  13. Feasibility Study of Radiation Dose Reduction in Adult Female Pelvic CT Scan with Low Tube-Voltage and Adaptive Statistical Iterative Reconstruction

    Science.gov (United States)

    Wang, Xinlian; Chen, Jianghong; Hu, Zhihai; Zhao, Liqin

    2015-01-01

    Objective To evaluate image quality of female pelvic computed tomography (CT) scans reconstructed with the adaptive statistical iterative reconstruction (ASIR) technique combined with low tube-voltage and to explore the feasibility of its clinical application. Materials and Methods Ninety-four patients were divided into two groups. The study group used 100 kVp, and images were reconstructed with 30%, 50%, 70%, and 90% ASIR. The control group used 120 kVp, and images were reconstructed with 30% ASIR. The noise index was 15 for the study group and 11 for the control group. The CT values and noise levels of different tissues were measured. The contrast to noise ratio (CNR) was calculated. A subjective evaluation was carried out by two experienced radiologists. The CT dose index volume (CTDIvol) was recorded. Results A 44.7% reduction in CTDIvol was observed in the study group (8.18 ± 3.58 mGy) compared with that in the control group (14.78 ± 6.15 mGy). No significant differences were observed in the tissue noise levels and CNR values between the 70% ASIR group and the control group (p = 0.068-1.000). The subjective scores indicated that visibility of small structures, diagnostic confidence, and the overall image quality score in the 70% ASIR group was the best, and were similar to those in the control group (1.87 vs. 1.79, 1.26 vs. 1.28, and 4.53 vs. 4.57; p = 0.122-0.585). No significant difference in diagnostic accuracy was detected between the study group and the control group (42/47 vs. 43/47, p = 1.000). Conclusion Low tube-voltage combined with automatic tube current modulation and 70% ASIR allowed the low CT radiation dose to be reduced by 44.7% without losing image quality on female pelvic scan. PMID:26357499

  14. Automatic classification of DMSA scans using an artificial neural network

    Science.gov (United States)

    Wright, J. W.; Duguid, R.; Mckiddie, F.; Staff, R. T.

    2014-04-01

    DMSA imaging is carried out in nuclear medicine to assess the level of functional renal tissue in patients. This study investigated the use of an artificial neural network to perform diagnostic classification of these scans. Using the radiological report as the gold standard, the network was trained to classify DMSA scans as positive or negative for defects using a representative sample of 257 previously reported images. The trained network was then independently tested using a further 193 scans and achieved a binary classification accuracy of 95.9%. The performance of the network was compared with three qualified expert observers who were asked to grade each scan in the 193 image testing set on a six-point defect scale, from 'definitely normal' to 'definitely abnormal'. A receiver operating characteristic analysis comparison between a consensus operator, generated from the scores of the three expert observers, and the network revealed a statistically significant increase (α < 0.05) in favour of the network, suggesting that it could be used as a quality assurance assistant in clinical practice.

  15. Introduction of a Journal Excerpt Activity Improves Undergraduate Students' Performance in Statistics

    Science.gov (United States)

    Rabin, Laura A.; Nutter-Upham, Katherine E.

    2010-01-01

    We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…

  16. Tools to support interpreting multiple regression in the face of multicollinearity.

    Science.gov (United States)

    Kraha, Amanda; Turner, Heather; Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K

    2012-01-01

    While multicollinearity may increase the difficulty of interpreting multiple regression (MR) results, it should not cause undue problems for the knowledgeable researcher. In the current paper, we argue that rather than using one technique to investigate regression results, researchers should consider multiple indices to understand the contributions that predictors make not only to a regression model, but to each other as well. Some of the techniques to interpret MR effects include, but are not limited to, correlation coefficients, beta weights, structure coefficients, all possible subsets regression, commonality coefficients, dominance weights, and relative importance weights. This article will review a set of techniques to interpret MR effects, identify the elements of the data on which the methods focus, and identify statistical software to support such analyses.
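
    Two of the indices listed above lend themselves to a compact illustration: standardized beta weights and structure coefficients (the correlation of each predictor with the predicted scores, which is unaffected by how collinearity splits credit among the betas). A minimal numpy sketch, with hypothetical inputs X (n x p) and y (n,):

        import numpy as np

        def beta_and_structure(X: np.ndarray, y: np.ndarray):
            # Standardize predictors and criterion, then fit OLS:
            # the resulting coefficients are the standardized beta weights
            Xz = (X - X.mean(0)) / X.std(0, ddof=1)
            yz = (y - y.mean()) / y.std(ddof=1)
            beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
            yhat = Xz @ beta
            # Structure coefficient: correlation of each predictor with yhat
            rs = np.array([np.corrcoef(Xz[:, j], yhat)[0, 1]
                           for j in range(Xz.shape[1])])
            return beta, rs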

  17. An objective interpretation of Lagrangian quantum mechanics

    International Nuclear Information System (INIS)

    Roberts, K.V.

    1978-01-01

    Unlike classical mechanics, the Copenhagen interpretation of quantum mechanics does not provide an objective space-time picture of the actual history of a physical system. This paper suggests how the conceptual foundations of quantum mechanics can be reformulated, without changing the mathematical content of the theory or its detailed agreement with experiment and without introducing any hidden variables, in order to provide an objective, covariant, Lagrangian description of reality which is deterministic and time-symmetric on the microscopic scale. The basis of this description can be expressed either as an action functional or as a summation over Feynman diagrams or paths. The probability laws associated with the quantum-mechanical measurement process, and the asymmetry in time of the principles of macroscopic causality and of the laws of statistical mechanics, are interpreted as consequences of the particular boundary conditions that apply to the actual universe. The objective interpretation does not include the observer and the measurement process among the fundamental concepts of the theory, but it does not entail a revision of the ideas of determinism and of time, since in a Lagrangian theory both initial and final boundary conditions on the action functional are required. (author)

  18. Data analysis and interpretation for environmental surveillance

    International Nuclear Information System (INIS)

    1992-06-01

    The Data Analysis and Interpretation for Environmental Surveillance Conference was held in Lexington, Kentucky, February 5-7, 1990. The conference was sponsored by what is now the Office of Environmental Compliance and Documentation, Oak Ridge National Laboratory. Participants included technical professionals from all Martin Marietta Energy Systems facilities, Westinghouse Materials Company of Ohio, Pacific Northwest Laboratory, and several technical support contractors. Presentations at the conference ranged across the full spectrum of issues that affect the analysis and interpretation of environmental data. Topics included tracking systems for samples and schedules associated with ongoing programs; coalescing data from a variety of sources and pedigrees into integrated data bases; methods for evaluating the quality of environmental data through empirical estimates of parameters such as charge balance, pH, and specific conductance; statistical applications to the interpretation of environmental information; and uses of environmental information in risk and dose assessments. Hearing about and discussing this wide variety of topics provided an opportunity to capture the subtlety of each discipline and to appreciate the continuity that is required among the disciplines in order to perform high-quality environmental information analysis

  19. How the Mastery Rubric for Statistical Literacy Can Generate Actionable Evidence about Statistical and Quantitative Learning Outcomes

    Directory of Open Access Journals (Sweden)

    Rochelle E. Tractenberg

    2016-12-01

    Full Text Available Statistical literacy is essential to an informed citizenry, and two emerging trends highlight a growing need for training that achieves this literacy. The first trend is towards "big" data: while automated analyses can exploit massive amounts of data, the interpretation—and possibly more importantly, the replication—of results are challenging without adequate statistical literacy. The second trend is that science and scientific publishing are struggling with insufficient/inappropriate statistical reasoning in writing, reviewing, and editing. This paper describes a model for statistical literacy (SL) and its development that can support modern scientific practice. An established curriculum development and evaluation tool—the Mastery Rubric—is integrated with a new, developmental, model of statistical literacy that reflects the complexity of reasoning and habits of mind that scientists need to cultivate in order to recognize, choose, and interpret statistical methods. This developmental model provides actionable evidence, and explicit opportunities for consequential assessment that serves students, instructors, developers/reviewers/accreditors of a curriculum, and institutions. By supporting the enrichment, rather than increasing the amount, of statistical training in the basic and life sciences, this approach supports curriculum development, evaluation, and delivery to promote statistical literacy for students and a collective quantitative proficiency more broadly.

  20. Comparative interpretations of renormalization inversion technique for reconstructing unknown emissions from measured atmospheric concentrations

    Science.gov (United States)

    Singh, Sarvesh Kumar; Kumar, Pramod; Rani, Raj; Turbelin, Grégory

    2017-04-01

    The study highlights a theoretical comparison and various interpretations of a recent inversion technique, called renormalization, developed for the reconstruction of unknown tracer emissions from their measured concentrations. The comparative interpretations are presented in relation to the other inversion techniques based on principle of regularization, Bayesian, minimum norm, maximum entropy on mean, and model resolution optimization. It is shown that the renormalization technique can be interpreted in a similar manner to other techniques, with a practical choice of a priori information and error statistics, while eliminating the need of additional constraints. The study shows that the proposed weight matrix and weighted Gram matrix offer a suitable deterministic choice to the background error and measurement covariance matrices, respectively, in the absence of statistical knowledge about background and measurement errors. The technique is advantageous since it (i) utilizes weights representing a priori information apparent to the monitoring network, (ii) avoids dependence on background source estimates, (iii) improves on alternative choices for the error statistics, (iv) overcomes the colocalization problem in a natural manner, and (v) provides an optimally resolved source reconstruction. A comparative illustration of source retrieval is made by using the real measurements from a continuous point release conducted in Fusion Field Trials, Dugway Proving Ground, Utah.
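
    For orientation, the minimum-norm family of inversions that the renormalization technique is compared with above has a closed form once a weight matrix is chosen; a numpy sketch (the matrices A and W and the measurement vector y are hypothetical placeholders, not the authors' algorithm):

        import numpy as np

        def weighted_minimum_norm(A: np.ndarray, W: np.ndarray, y: np.ndarray):
            """Minimize s^T W s subject to A s = y.
            A: (m, n) source-receptor sensitivity matrix,
            W: (n, n) positive-definite weight matrix encoding a priori information,
            y: (m,) measured concentrations."""
            Winv = np.linalg.inv(W)
            gram = A @ Winv @ A.T          # weighted Gram matrix, (m, m)
            return Winv @ A.T @ np.linalg.solve(gram, y)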

  1. Challenges in 3D scanning: Focusing on Ears and Multiple View Stereopsis

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl

    It is the goal of this thesis to address some of the challenges in 3D scanning. This has been done with focus on direct in-ear scanning and on Multiple View Stereopsis. Seven papers have been produced over the course of the Ph.D., out of which, six have been included. Two papers concern volumetric...... segmentation based on Markov Random Fields. These have been formulated to address problems relating to noise filtering in direct in-ear scanning and Intracranial Volume estimation. Another two papers have been produced on the topic of recovering surface data based on a strong statistical prior. This was done...

  2. Qualitative evaluation of coronary flow during anesthetic induction using thallium-201 perfusion scans

    Energy Technology Data Exchange (ETDEWEB)

    Kleinman, B.; Henkin, R.E.; Glisson, S.N.; el-Etr, A.A.; Bakhos, M.; Sullivan, H.J.; Montoya, A.; Pifarre, R.

    1986-02-01

    Qualitative distribution of coronary flow using thallium-201 perfusion scans immediately postintubation was studied in 22 patients scheduled for elective coronary artery bypass surgery. Ten patients received a thiopental (4 mg/kg) and halothane induction. Twelve patients received a fentanyl (100 micrograms/kg) induction. Baseline thallium-201 perfusion scans were performed 24 h prior to surgery. These scans were compared with the scans performed postintubation. A thallium-positive scan was accepted as evidence of relative hypoperfusion. Baseline hemodynamic and ECG data were obtained prior to induction of anesthesia. These data were compared with the data obtained postintubation. Ten patients developed postintubation thallium-perfusion scan defects (thallium-positive scan), even though there was no statistical difference between their baseline hemodynamics and hemodynamics at the time of intubation. There was no difference in the incidence of thallium-positive scans between those patients anesthetized by fentanyl and those patients anesthetized with thiopental-halothane. The authors conclude that relative hypoperfusion, and possibly ischemia, occurred in 45% of patients studied, despite stable hemodynamics, and that the incidence of these events was the same with two different anesthetic techniques.

  3. Qualitative evaluation of coronary flow during anesthetic induction using thallium-201 perfusion scans

    International Nuclear Information System (INIS)

    Kleinman, B.; Henkin, R.E.; Glisson, S.N.; el-Etr, A.A.; Bakhos, M.; Sullivan, H.J.; Montoya, A.; Pifarre, R.

    1986-01-01

    Qualitative distribution of coronary flow using thallium-201 perfusion scans immediately postintubation was studied in 22 patients scheduled for elective coronary artery bypass surgery. Ten patients received a thiopental (4 mg/kg) and halothane induction. Twelve patients received a fentanyl (100 micrograms/kg) induction. Baseline thallium-201 perfusion scans were performed 24 h prior to surgery. These scans were compared with the scans performed postintubation. A thallium-positive scan was accepted as evidence of relative hypoperfusion. Baseline hemodynamic and ECG data were obtained prior to induction of anesthesia. These data were compared with the data obtained postintubation. Ten patients developed postintubation thallium-perfusion scan defects (thallium-positive scan), even though there was no statistical difference between their baseline hemodynamics and hemodynamics at the time of intubation. There was no difference in the incidence of thallium-positive scans between those patients anesthetized by fentanyl and those patients anesthetized with thiopental-halothane. The authors conclude that relative hypoperfusion, and possibly ischemia, occurred in 45% of patients studied, despite stable hemodynamics, and that the incidence of these events was the same with two different anesthetic techniques

  4. Bone scanning a useful addition in the diagnosis of ankle joint trauma

    International Nuclear Information System (INIS)

    Schmidt, C.

    1983-01-01

    A retrospective study of the indication in 169 scintigraphic examinations of the ankle joint was made. Usually joints respond to trauma with a generalized increase in the concentration of the radiopharmaceutical. By using a carefully performed technique, the focal hot spot caused by the fracture can be seen in the bone scan. The focal accumulation of the radioactive material does not necessarily correspond to a bone fracture. The ligamentous avulsion of a bone chip and/or the periosteum can yield the same image, but it cannot be diagnosed by radiographic techniques. Initially, the routine radiograph and even the tomograph are often interpreted as normal or equivocal. In these cases of ankle trauma, bone scanning completes the clinical evaluation. Although bone scanning is very important in the diagnosis of any traumatic lesion of the ankle joints, it cannot replace the conventional X-ray technique. (orig.) [de

  5. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  6. Robust Proton Pencil Beam Scanning Treatment Planning for Rectal Cancer Radiation Therapy

    International Nuclear Information System (INIS)

    Blanco Kiely, Janid Patricia; White, Benjamin M.

    2016-01-01

    Purpose: To investigate, in a treatment plan design and robustness study, whether proton pencil beam scanning (PBS) has the potential to offer advantages, relative to interfraction uncertainties, over photon volumetric modulated arc therapy (VMAT) in a locally advanced rectal cancer patient population. Methods and Materials: Ten patients received a planning CT scan, followed by an average of 4 weekly offline verification CT scans, which were rigidly co-registered to the planning CT. Clinical PBS plans were generated on the planning CT, using a single-field uniform-dose technique with single-posterior and parallel-opposed (LAT) field geometries. The VMAT plans were generated on the planning CT using two 6-MV, 220° coplanar arcs. Clinical plans were forward-calculated on verification CTs to assess robustness relative to anatomic changes. Setup errors were assessed by forward-calculating clinical plans with a ±5-mm (left–right, anterior–posterior, superior–inferior) isocenter shift on the planning CT. Differences in clinical target volume and organ at risk dose–volume histogram (DVH) indicators between plans were tested for significance using an appropriate Wilcoxon test (P<.05). Results: Dosimetrically, PBS plans were statistically different from VMAT plans, showing greater organ at risk sparing. However, the bladder was statistically identical among LAT and VMAT plans. The clinical target volume coverage was statistically identical among all plans. The robustness test found that all DVH indicators for PBS and VMAT plans were robust, except the LAT's genitalia (V5, V35). The verification CT plans showed that all DVH indicators were robust. Conclusions: Pencil beam scanning plans were found to be as robust as VMAT plans relative to interfractional changes during treatment when posterior beam angles and appropriate range margins are used. Pencil beam scanning dosimetric gains in the bowel (V15, V20) over VMAT suggest that using PBS to treat rectal cancer

  7. Robust Proton Pencil Beam Scanning Treatment Planning for Rectal Cancer Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Blanco Kiely, Janid Patricia, E-mail: jkiely@sas.upenn.edu; White, Benjamin M.

    2016-05-01

    Purpose: To investigate, in a treatment plan design and robustness study, whether proton pencil beam scanning (PBS) has the potential to offer advantages, relative to interfraction uncertainties, over photon volumetric modulated arc therapy (VMAT) in a locally advanced rectal cancer patient population. Methods and Materials: Ten patients received a planning CT scan, followed by an average of 4 weekly offline verification CT scans, which were rigidly co-registered to the planning CT. Clinical PBS plans were generated on the planning CT, using a single-field uniform-dose technique with single-posterior and parallel-opposed (LAT) field geometries. The VMAT plans were generated on the planning CT using two 6-MV, 220° coplanar arcs. Clinical plans were forward-calculated on verification CTs to assess robustness relative to anatomic changes. Setup errors were assessed by forward-calculating clinical plans with a ±5-mm (left–right, anterior–posterior, superior–inferior) isocenter shift on the planning CT. Differences in clinical target volume and organ at risk dose–volume histogram (DVH) indicators between plans were tested for significance using an appropriate Wilcoxon test (P<.05). Results: Dosimetrically, PBS plans were statistically different from VMAT plans, showing greater organ at risk sparing. However, the bladder was statistically identical among LAT and VMAT plans. The clinical target volume coverage was statistically identical among all plans. The robustness test found that all DVH indicators for PBS and VMAT plans were robust, except the LAT's genitalia (V5, V35). The verification CT plans showed that all DVH indicators were robust. Conclusions: Pencil beam scanning plans were found to be as robust as VMAT plans relative to interfractional changes during treatment when posterior beam angles and appropriate range margins are used. Pencil beam scanning dosimetric gains in the bowel (V15, V20) over VMAT suggest that using PBS to treat rectal

  8. Chest CT scans are frequently abnormal in asymptomatic patients with newly diagnosed acute myeloid leukemia.

    Science.gov (United States)

    Vallipuram, Janaki; Dhalla, Sidika; Bell, Chaim M; Dresser, Linda; Han, Heekyung; Husain, Shahid; Minden, Mark D; Paul, Narinder S; So, Miranda; Steinberg, Marilyn; Vallipuram, Mayuran; Wong, Gary; Morris, Andrew M

    2017-04-01

    Chest computed tomography (CT) findings of nodules, ground glass opacities, and consolidations are often interpreted as representing invasive fungal infection in individuals with febrile neutropenia. We assessed whether these CT findings were present in asymptomatic individuals with acute myeloid leukemia (AML) at low risk of invasive fungal disease. A retrospective study of consecutive asymptomatic adult patients with newly diagnosed AML over a 2-year period was performed at a tertiary care oncology center. Radiology reports of baseline chest CTs were reviewed. Of 145 CT scans, the majority (88%) had pulmonary abnormalities. Many (70%) had one or both of unspecified opacities (52%) and nodules (49%). Ground glass opacities (18%) and consolidations (12%) occurred less frequently. Radiologists suggested pneumonia as a possible diagnosis in 32% (n = 47) of scans. Chest CT may result in over-diagnosis of invasive fungal disease in individuals with febrile neutropenia if interpreted without correlation to the patients' clinical status.

  9. Ether and interpretation of some physical phenomena and concepts

    International Nuclear Information System (INIS)

    Rzayev, S.G.

    2008-01-01

    On the basis of the concept of the existence of an ether, the notions of time, space, matter, and physical field are examined in depth, and the essence of such phenomena as corpuscular-wave dualism and the changes of time, scale, and mass of moving bodies is elucidated. The possibility of a transition from the probability-statistical interpretation of quantum phenomena to Laplace's determinism is shown

  10. Textural features of (18)F-fluorodeoxyglucose positron emission tomography scanning in diagnosing aortic prosthetic graft infection

    NARCIS (Netherlands)

    Saleem, Ben R; Beukinga, Roelof J.; Boellaard, Ronald; Glaudemans, Andor W J M; Reijnen, Michel M P J; Zeebregts, Clark J; Slart, Riemer H J A

    BACKGROUND: The clinical problem in suspected aortoiliac graft infection (AGI) is to obtain proof of infection. Although (18)F-fluorodeoxyglucose ((18)F-FDG) positron emission tomography scanning (PET) has been suggested to play a pivotal role, an evidence-based interpretation is lacking. The

  11. Use of newly developed standardized form for interpretation of high-resolution CT in screening for pneumoconiosis

    International Nuclear Information System (INIS)

    Julien, P.J.; Sider, L.; Silverman, J.M.; Dahlgren, J.; Harber, P.; Bunn, W.

    1991-01-01

    This paper reports that although the International Labour Office (ILO) standard for interpretation of the posteroanterior chest radiograph has been available for 10 years, there has been no attempt to standardize high-resolution CT (HRCT) readings for screening of pneumoconiosis. An integrated respiratory surveillance program for 87 workers exposed to inorganic dust was conducted. This program consisted of a detailed occupational exposure history, physical symptoms and signs, spirometry, chest radiography, and HRCT. Two groups of workers with known exposure were studied with HRCT. Group 1 had normal spirometry results and chest radiographs, and group 2 had abnormalities at spirometry or on chest radiographs. The HRCT scans were read independently of the clinical findings and chest radiographs. The HRCT scans were interpreted by using an ILO-based standard form developed by the authors for this project. With the newly developed HRCT form, individual descriptive abnormality, localized severity, and overall rating systems have been developed and compared for inter- and intraobserver consistency

  12. Image acquisition and interpretation criteria for {sup 99m}Tc-HMPAO-labelled white blood cell scintigraphy: results of a multicentre study

    Energy Technology Data Exchange (ETDEWEB)

    Erba, Paola A. [University of Pisa Medical School (Italy). Regional Center of Nuclear Medicine; Glaudemans, Andor W.J.M.; Dierckx, Rudi A.J.O. [University Medical Center Groningen (Netherlands). Dept. of Nuclear Medicine and Molecular Imaging; Veltman, Niels C. [Jeroen Bosch Hospital, 's-Hertogenbosch (Netherlands). Dept. of Nuclear Medicine; Sollini, Martina [Arcispedale S. Maria Nuova - IRCCS, Reggio Emilia (Italy). Nuclear Medicine Unit; Pacilio, Marta; Galli, Filippo [Sapienza Univ., Rome (Italy). Nuclear Medicine Unit; Signore, Alberto [University Medical Center Groningen (Netherlands). Dept. of Nuclear Medicine and Molecular Imaging; Sapienza Univ., Rome (Italy). Nuclear Medicine Unit; Sapienza Univ., Rome (Italy). Ospedale S. Andrea Medicina Nucleare

    2014-04-15

    There is no consensus yet on the best protocol for planar image acquisition and interpretation of radiolabelled white blood cell (WBC) scintigraphy. This may account for differences in reported diagnostic accuracy amongst different centres. This was a multicentre retrospective study analysing 235 WBC scans divided into two groups. The first group of scans (105 patients) were acquired with a fixed-time acquisition protocol and the second group (130 patients) were acquired with a decay time-corrected acquisition protocol. Planar images were interpreted both qualitatively and semiquantitatively. Three blinded readers analysed the images. The most accurate imaging acquisition protocol comprised image acquisition at 3 - 4 h and at 20 - 24 h in time mode with acquisition times corrected for isotope decay. Using this protocol, visual analysis had high sensitivity and specificity in the diagnosis of infection. Semiquantitative analysis could be used in doubtful cases, with no cut-off for the percentage increase in radiolabelled WBC over time, as a criterion to define a positive scan. (orig.)

  13. SpaSM: A MATLAB Toolbox for Sparse Statistical Modeling

    DEFF Research Database (Denmark)

    Sjöstrand, Karl; Clemmensen, Line Harder; Larsen, Rasmus

    2018-01-01

    Applications in biotechnology such as gene expression analysis and image processing have led to a tremendous development of statistical methods with emphasis on reliable solutions to severely underdetermined systems. Furthermore, interpretations of such solutions are of importance, meaning...

  14. Beam diffusion measurements using collimator scans in the LHC

    CERN Document Server

    Valentino, Gianluca; Bruce, Roderik; Burkart, Florian; Previtali, Valentina; Redaelli, Stefano; Salvachua, Belen; Stancari, Giulio; Valishev, Alexander

    2013-01-01

    The time evolution of beam losses during a collimator scan provides information on halo diffusion and population. This is an essential input for machine performance characterization and for the design of collimation systems. Beam halo measurements in the CERN Large Hadron Collider were conducted through collimator scrapings in a dedicated beam study for the first time at 4 TeV. Four scans were performed with two collimators, in the vertical plane for beam 1 and horizontally for beam 2, before and after bringing the beams into collisions. Inward and outward steps were performed. A diffusion model was used to interpret the observed loss rate evolution in response to the collimator steps. With this technique, diffusion coefficients were estimated as a function of betatron oscillation amplitude from approximately 3 to 7 standard deviations of the transverse beam distribution. A comparison of halo diffusion and core emittance growth rates is also presented.

  15. Differential scanning calorimetry (DSC) of semicrystalline polymers.

    Science.gov (United States)

    Schick, C

    2009-11-01

    Differential scanning calorimetry (DSC) is an effective analytical tool to characterize the physical properties of a polymer. DSC enables determination of melting, crystallization, and mesomorphic transition temperatures, and the corresponding enthalpy and entropy changes, and characterization of glass transition and other effects that show either changes in heat capacity or a latent heat. Calorimetry takes a special place among other methods. In addition to its simplicity and universality, the energy characteristics (heat capacity C_P and its integral over temperature T, the enthalpy H), measured via calorimetry, have a clear physical meaning even though interpretation may sometimes be difficult. With the introduction of differential scanning calorimeters (DSCs) in the early 1960s, calorimetry became a standard tool in polymer science. The advantage of DSC compared with other calorimetric techniques lies in the broad dynamic range regarding heating and cooling rates, including isothermal and temperature-modulated operation. Today 12 orders of magnitude in scanning rate can be covered by combining different types of DSCs. Rates as low as 1 microK s(-1) are possible, and at the other extreme heating and cooling at 1 MK s(-1) and higher is possible. The broad dynamic range is especially of interest for semicrystalline polymers because they are commonly far from equilibrium and phase transitions are strongly time (rate) dependent. Nevertheless, there are still several unsolved problems regarding calorimetry of polymers. I try to address a few of these, for example determination of baseline heat capacity, which is related to the problem of crystallinity determination by DSC, or the occurrence of multiple melting peaks. Possible solutions using advanced calorimetric techniques, for example fast scanning and high-frequency AC (temperature-modulated) calorimetry, are discussed.
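
    The heat-capacity/enthalpy relation mentioned above, and the standard DSC crystallinity estimate that depends on it, can be written compactly; the crystallinity formula is the common textbook convention, with the 100%-crystalline reference enthalpy taken from tables:

        \Delta H = \int_{T_1}^{T_2} C_P(T)\,\mathrm{d}T,
        \qquad
        X_c = \frac{\Delta H_m - \Delta H_{cc}}{\Delta H_m^{0}}

    where \Delta H_m is the measured melting enthalpy, \Delta H_{cc} any cold-crystallization enthalpy released during the scan, and \Delta H_m^{0} the melting enthalpy of a fully crystalline sample. The baseline-heat-capacity problem discussed above enters through the integration limits and the baseline subtracted from C_P(T).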

  16. A basic introduction to statistics for the orthopaedic surgeon.

    Science.gov (United States)

    Bertrand, Catherine; Van Riet, Roger; Verstreken, Frederik; Michielsen, Jef

    2012-02-01

    Orthopaedic surgeons should review the orthopaedic literature in order to keep pace with the latest insights and practices. A good understanding of basic statistical principles is of crucial importance to the ability to read articles critically, to interpret results and to arrive at correct conclusions. This paper explains some of the key concepts in statistics, including hypothesis testing, Type I and Type II errors, testing of normality, sample size and p values.

  17. Optimization of low-dose protocol in thoracic aorta CTA: weighting of adaptive statistical iterative reconstruction (ASIR) algorithm and scanning parameters

    International Nuclear Information System (INIS)

    Zhao Yongxia; Chang Jin; Zuo Ziwei; Zhang Changda; Zhang Tianle

    2014-01-01

    Objective: To investigate the best weighting of the adaptive statistical iterative reconstruction (ASIR) algorithm and optimized low-dose scanning parameters in thoracic aorta CT angiography (CTA). Methods: Totally 120 patients with a body mass index (BMI) of 19-24 were randomly divided into 6 groups. All patients underwent thoracic aorta CTA with a GE Discovery CT 750 HD scanner (scan range 290-330 mm). The default parameters (100 kV, 240 mAs) were applied in Group 1. Reconstructions were performed with different weightings of ASIR (10%-100% in 10% steps), and the signal-to-noise ratio (S/N) and contrast-to-noise ratio (C/N) of the images were calculated. The image series were evaluated by 2 independent radiologists on a 5-point scale, and the best weighting was determined. The mAs in Groups 2-6 were then set to 210, 180, 150, 120 and 90 at a tube voltage of 100 kV. The CTDIvol and DLP of every scan series were recorded and the effective dose (E) was calculated. The S/N and C/N were calculated and the image quality was assessed by two radiologists. Results: The best weighting of ASIR was 60% at 100 kV, 240 mAs. With 60% ASIR and 100 kV, the image quality scores from 240 mAs down to 90 mAs ranged from 4.78 ± 0.30 to 3.15 ± 0.23. The CTDIvol was 12.64-4.41 mGy, the DLP was 331.81-128.27 mGy·cm, and E was 4.98-1.92 mSv. The image quality among Groups 1-5 was not significantly different (F = 5.365, P > 0.05), but the CTDIvol and DLP of Group 5 were reduced by 37.0% and 36.9%, respectively, compared with Group 1. Conclusions: In thoracic aorta CT angiography, the best weighting of ASIR is 60%, and 120 mAs is the optimal mAs at 100 kV in patients with BMI 19-24. (authors)

  18. Design and calibration of a scanning tunneling microscope for large machined surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Grigg, D.A.; Russell, P.E.; Dow, T.A.

    1988-12-01

    During the last year the large sample STM has been designed, built and used for the observation of several different samples. Calibration of the scanner for proper dimensional interpretation of surface features has been a chief concern, as well as corrections for non-linear effects such as hysteresis during scans. Several procedures used in calibration and correction of the piezoelectric scanners used in the laboratory's STMs are described.

  19. Pedicle measurement of the thoracolumbar spine: a cadaveric, radiographic, and CT scan study in Filipinos

    International Nuclear Information System (INIS)

    Molano, A.M.V.; Sison, A.B.; Fong, H.C.; Lim, N.T.; Sabile, K.

    1994-01-01

    With the popular usage of spinal pedicular screw fixation, it is essential to have a knowledge of the morphometry of the pedicles of the spine of particular populations. This study compared the direct pedicle measurements of ten cadavers in an institution with their respective radiographic and computerized tomographic (CT) scan values, and also compared the effective pedicle diameter (EPD) with the conventional outer pedicle diameter (OPD) measurements. A compilation of pedicle values was also made from X-ray and CT scan plates of a Filipino population. A statistical analysis of the 2,760 pedicle measurements taken from cadaveric T6-L5 vertebrae showed that direct measurements were significantly different from X-ray and CT scan values. The mean values of the EPD differed from those of the OPD, but the difference was not statistically significant. Comparison with previous foreign studies revealed significant differences in these pedicle dimensions. Pedicle measurements in a living Filipino population were found to differ significantly between sexes. Accurate measurement of pedicle diameters and lengths is indeed critical for the success of a spinal stabilization procedure using pedicular screws. (author). 8 refs.; 5 figs.; 1 tab

  20. Use of statistical parametric mapping of 18F-FDG-PET in frontal lobe epilepsy

    International Nuclear Information System (INIS)

    Plotkin, M.; Amthauer, H.; Luedemann, L.; Hartkop, E.; Ruf, J.; Gutberlet, M.; Bertram, H.; Felix, R.; Venz, St.; Merschhemke, M.; Meencke, H.-J.

    2003-01-01

    Aim: Evaluation of the use of statistical parametric mapping (SPM) of FDG-PET for seizure lateralization in frontal lobe epilepsy. Patients: 38 patients with suspected frontal lobe epilepsy supported by clinical findings and video-EEG monitoring. Method: Statistical parametric maps were generated by subtraction of individual scans from a control group formed by 16 patients with a negative neurological/psychiatric history and no abnormalities on the MR scan. The scans were also analyzed visually as well as semiquantitatively using manually drawn ROIs. Results: SPM showed better accordance with the results of surface EEG monitoring than visual scan analysis and ROI quantification. In comparison with intracranial EEG recordings, the best performance was achieved by combining the ROI-based quantification with SPM analysis. Conclusion: These findings suggest that SPM analysis of FDG-PET data could be useful as a complementary tool in the evaluation of seizure focus lateralization in patients with suspected frontal lobe epilepsy. (orig.)

  1. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds

    Science.gov (United States)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2017-07-01

    This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
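
    A stripped-down sketch of the continuous voting step described above: votes accumulated in parameter space, with modes read off a kernel density estimate. scipy's gaussian_kde with its default bandwidth stands in here for the paper's data-driven bandwidth selection, and returning only the global mode replaces the full local-maxima search.

        import numpy as np
        from scipy.stats import gaussian_kde

        def densest_vote(votes: np.ndarray) -> np.ndarray:
            """votes: (d, n) array of cylinder-parameter votes cast by point pairs.
            Returns the vote with the highest estimated density (the global mode);
            no discretization of the parameter space is required."""
            kde = gaussian_kde(votes)   # continuous KDE over the raw votes
            return votes[:, np.argmax(kde(votes))]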

  2. An interchangeable scanning Hall probe/scanning SQUID microscope

    International Nuclear Information System (INIS)

    Tang, Chiu-Chun; Lin, Hui-Ting; Wu, Sing-Lin; Chen, Tse-Jun; Wang, M. J.; Ling, D. C.; Chi, C. C.; Chen, Jeng-Chung

    2014-01-01

    We have constructed a scanning probe microscope for magnetic imaging, which can function as a scanning Hall probe microscope (SHPM) and as a scanning SQUID microscope (SSM). The scanning scheme, applicable to SHPM and SSM, consists of a mechanical positioning (sub)micron XY stage and a flexible direct contact to the sample without a feedback control system for the Z-axis. With the interchangeable capability of operating in two distinct scanning modes, our microscope can combine the advantageous functionalities of the SHPM and SSM: a large scan range of up to a millimeter, high spatial resolution (≤4 μm), and high field sensitivity over a wide range of temperature (4.2 K-300 K) and magnetic field (10^-7 T-1 T). To demonstrate the capabilities of the system, we present magnetic images scanned with SHPM and SSM, including an NdFeB magnet and a nickel grid pattern at room temperature, surface magnetic domain structures of a La2/3Ca1/3MnO3 thin film at 77 K, and superconducting vortices in a striped niobium film at 4.2 K

  3. CONSENSUS STATEMENT BY THE AMERICAN ASSOCIATION OF CLINICAL ENDOCRINOLOGISTS AND AMERICAN COLLEGE OF ENDOCRINOLOGY ON THE QUALITY OF DXA SCANS AND REPORTS.

    Science.gov (United States)

    Licata, Angelo A; Binkley, Neil; Petak, Steven M; Camacho, Pauline M

    2018-02-01

    High-quality dual-energy X-ray absorptiometry (DXA) scans are necessary for accurate diagnosis of osteoporosis and monitoring of therapy; however, DXA scan reports may contain errors that cause confusion about diagnosis and treatment. This American Association of Clinical Endocrinologists/American College of Endocrinology consensus statement was generated to draw attention to many common technical problems affecting DXA report conclusions and provide guidance on how to address them to ensure that patients receive appropriate osteoporosis care. The DXA Writing Committee developed a consensus based on discussion and evaluation of available literature related to osteoporosis and osteodensitometry. Technical errors may include errors in scan acquisition and/or analysis, leading to incorrect diagnosis and reporting of change over time. Although the International Society for Clinical Densitometry advocates training for technologists and medical interpreters to help eliminate these problems, many lack skill in this technology. Suspicion that reports are wrong arises when clinical history is not compatible with scan interpretation (e.g., dramatic increase/decrease in a short period of time; declines in previously stable bone density after years of treatment), when different scanners are used, or when inconsistent anatomic sites are used for monitoring the response to therapy. Understanding the concept of least significant change will minimize erroneous conclusions about changes in bone density. Clinicians must develop the skills to differentiate technical problems, which confound reports, from real biological changes. We recommend that clinicians review actual scan images and data, instead of relying solely on the impression of the report, to pinpoint errors and accurately interpret DXA scan images. AACE = American Association of Clinical Endocrinologists; BMC = bone mineral content; BMD = bone mineral density; DXA = dual-energy X-ray absorptiometry; ISCD = International
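
    The least significant change invoked above is conventionally derived from the facility's measured precision error; at 95% confidence it is (per the standard densitometry convention)

        \mathrm{LSC}_{95\%} = 1.96\sqrt{2}\;\mathrm{PE} \approx 2.77\,\mathrm{PE}

    where PE is the precision error (the root-mean-square standard deviation of repeat scans in a precision study), so only BMD changes exceeding the LSC should be reported as real biological change.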

  4. The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation

    Directory of Open Access Journals (Sweden)

    Daniel Pemstein

    2011-08-01

    Full Text Available The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function over-loading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.

  5. Textural features of 18F-fluorodeoxyglucose positron emission tomography scanning in diagnosing aortic prosthetic graft infection

    NARCIS (Netherlands)

    Saleem, Ben R.; Beukinga, Roelof J.; Boellaard, Ronald; Glaudemans, Andor W.J.M.; Reijnen, Michel M.P.J.; Zeebregts, Clark J.; Slart, Riemer H.J.A.

    2017-01-01

    Background: The clinical problem in suspected aortoiliac graft infection (AGI) is to obtain proof of infection. Although 18F-fluorodeoxyglucose (18F-FDG) positron emission tomography scanning (PET) has been suggested to play a pivotal role, an evidence-based interpretation is lacking. The objective

  6. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing the image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom. The CNR was measured in the second study using a contrast phantom, and spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.05). In the third study, images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.05). ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.

  7. Utility of the indium 111-labeled human immunoglobulin G scan for the detection of focal vascular graft infection

    International Nuclear Information System (INIS)

    LaMuraglia, G.M.; Fischman, A.J.; Strauss, H.W.; Keech, F.; Wilkinson, R.; Callahan, R.J.; Khaw, B.A.; Rubin, R.H.

    1989-01-01

    The ability to diagnose and localize vascular graft infections has been a major challenge. Recent studies in animal models and humans with focal bacterial infection have shown that radiolabeled, polyclonal, human immunoglobulin G accumulates at the site of inflammation and can serve as the basis for an imaging technique. This study investigated this new technique for the diagnosis and localization of vascular graft infections. Twenty-five patients with suspected vascular infections involving grafts (22), atherosclerotic aneurysms (2), and subclavian vein thrombophlebitis (1) were studied. Gamma camera images of the suspected area were obtained between 5 and 48 hours after intravenous administration of 1.5 to 2.0 mCi (56 to 74 MBq) of indium 111-labeled, human, polyclonal immunoglobulin G. Scan results were interpreted without clinical information about the patient and were subsequently correlated with surgical findings, other imaging modalities, and/or clinical follow-up. In all 10 patients with positive scan results, localized infections were confirmed at the involved sites. In 14 of 15 patients whose scan results were interpreted as negative, no vascular infections were identified at follow-up. The one patient with a false-negative result, who had recurrent bacteremia from an aortoduodenal fistula, was scanned at a time when his disease was quiescent. These data suggest that nonspecific, human, indium 111-labeled immunoglobulin G scanning can be a useful noninvasive means of localizing vascular infections.
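
    From the counts reported above (10 true positives, 14 true negatives, 1 false negative, no false positives), the usual screening metrics follow directly; a minimal sketch:

    ```python
    # Counts taken from the abstract: 10 true-positive scans, 14 true-negative
    # scans, 1 false-negative (the quiescent aortoduodenal fistula), 0 false-positives.
    tp, tn, fn, fp = 10, 14, 1, 0

    sensitivity = tp / (tp + fn)   # proportion of infections detected
    specificity = tn / (tn + fp)   # proportion of non-infected correctly cleared

    print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
    # sensitivity = 90.9%, specificity = 100.0%
    ```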

  8. Brief guidelines for methods and statistics in medical research

    CERN Document Server

    Ab Rahman, Jamalludin

    2015-01-01

    This book serves as a practical guide to methods and statistics in medical research. It includes step-by-step instructions on using SPSS software for statistical analysis, as well as relevant examples to help those readers who are new to research in health and medical fields. Simple texts and diagrams are provided to help explain the concepts covered, and print screens for the statistical steps and the SPSS outputs are provided, together with interpretations and examples of how to report on findings. Brief Guidelines for Methods and Statistics in Medical Research offers a valuable quick reference guide for healthcare students and practitioners conducting research in health related fields, written in an accessible style.

  9. Functional brain mapping using H2[15O] positron emission tomography (I): statistical parametric mapping method

    International Nuclear Information System (INIS)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul

    1998-01-01

    We investigated the statistical methods used to compose a functional brain map of human working memory and the principal factors that affect localization. Repeated PET scans with four successive tasks, consisting of one control task and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H2[15O] at 30-minute intervals. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented in Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and normalized using linear and nonlinear transformation methods. The significance of the difference between the control and each activation state was estimated at every voxel based on the general linear model. Differences in global counts were removed using analysis of covariance (ANCOVA) with global activity as a covariate. Using the mean and variance for each condition, adjusted by ANCOVA, t-statistics were computed for every voxel. To ease interpretation, t-values were transformed to the standard Gaussian distribution (Z-scores). All the subjects carried out the activation and control tests successfully, with an average rate of correct answers of 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for the conjunctive activation of these three tasks. Verbal working memory predominantly activated left-sided structures, whereas visual memory activated the right hemisphere. We conclude that rCBF PET imaging and the statistical parametric mapping method were useful for localizing the brain regions involved in verbal and visual working memory.
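
    The t-to-Z conversion described above is a probability integral transform; a minimal sketch using SciPy (the degrees of freedom and t-value are hypothetical):

    ```python
    from scipy.stats import norm, t

    def t_to_z(t_value, df):
        """Map a t-statistic to a standard-normal Z-score via the probability
        integral transform, as done voxel-wise in SPM-style analyses."""
        return norm.ppf(t.cdf(t_value, df))

    # Hypothetical example: a voxel t-value of 3.2 with 5 degrees of freedom
    print(t_to_z(3.2, df=5))
    ```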

  10. Bone scanning in the child and young adult. Pt. 1

    Energy Technology Data Exchange (ETDEWEB)

    Murray, I.P.C. [Prince of Wales Hospital, Randwick (Australia). Dept. of Nuclear Medicine]

    1980-02-01

    Radionuclide bone scanning will readily identify areas of the skeleton where vascularity or osteogenesis is disturbed. Frequently, this is achieved with greater sensitivity than orthodox radiology, because the scan reflects the altered local physiology of bone. The procedure is therefore valuable not only for identifying metastatic disease, but also in benign skeletal disorders characterised by altered blood flow or osteoblastic reaction. These changes occur in many diseases involving bone which are more common in children and young adults. Special attention to the performance of the study and to its interpretation is, however, required in these age groups. The bone scan is invaluable in detecting metastatic disease related to either primary bone tumours or other neoplasia, both in the initial investigation and in the evaluation of therapy. Extra-osseous uptake may also occur, providing useful information relevant to the care of these patients.

  11. The Ulysses fast latitude scans: COSPIN/KET results

    Directory of Open Access Journals (Sweden)

    B. Heber

    2003-06-01

    Full Text Available Ulysses, launched in October 1990, began its second out-of-ecliptic orbit in December 1997, and its second fast latitude scan in September 2000. In contrast to the first fast latitude scan in 1994/1995, during the second fast latitude scan solar activity was close to maximum. The solar magnetic field reversed its polarity around July 2000. While the first latitude scan mainly gave a snapshot of the spatial distribution of galactic cosmic rays, the second one is dominated by temporal variations. Solar particle increases are observed at all heliographic latitudes, including events that produce >250 MeV protons and 50 MeV electrons. Using observations from the University of Chicago’s instrument on board IMP8 at Earth, we find that most solar particle events are observed at both high and low latitudes, indicating either acceleration of these particles over a broad latitude range or efficient latitudinal transport. The latter is supported by "quiet time" variations in the MeV electron background, if interpreted as Jovian electrons. No latitudinal gradient was found for >106 MeV galactic cosmic ray protons during the solar maximum fast latitude scan. The electron to proton ratio remains constant and has practically the same value as in the previous solar maximum. Both results indicate that drift is of minor importance. It was expected that, with the reversal of the solar magnetic field and in the declining phase of the solar cycle, this ratio should increase. This was, however, not observed, probably because the transition to the new magnetic cycle was not completely terminated within the heliosphere, as indicated by the Ulysses magnetic field and solar wind measurements. We argue that the new A<0 solar magnetic modulation epoch will establish itself once both polar coronal holes have developed. Key words: Interplanetary physics (cosmic rays; energetic particles; interplanetary magnetic fields)

  12. Isotope scanning for tumor localization

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1961-09-15

    At the request of the Government of the United Arab Republic, the Agency provided the services of an expert for the establishment in the UAR of a tumor localization program using photoscanning techniques and appropriate radioactive tracers. Photoscanning is a recently developed technique whereby the differences in isotope concentrations are enhanced on the record, which facilitates the interpretation of the record. A variety of brain tumors were located using a suitable radioactive tracer (Hg-203-labelled Neohydrin) obtained from the USA. In some other investigations, processes in the kidney were scanned. Further, radioactive gold was used to demonstrate the normal and pathological spleen and liver, and these tests showed various types of space occupying lesions resulting from malignancy and the parasitic infections endemic to the area. While the localization of brain tumors by scanning techniques is extremely useful, it does not always establish the precise extent of the tumor, which should be known at the time of surgery. Dr. Bender therefore thought it advisable to instruct personnel in the use of what is known as an in-vivo needle scintillation probe - a technique for the investigation of the isotope concentration in a particular tissue during operation. The necessary instrument was obtained for this purpose and demonstrations were given; one patient was examined in this way at the time of surgery at the University of Alexandria Hospital.

  13. THE BENEFITS OF TERRESTRIAL LASER SCANNING AND HYPERSPECTRAL DATA FUSION PRODUCTS

    Directory of Open Access Journals (Sweden)

    S. J. Buckley

    2012-10-01

    Full Text Available Close range hyperspectral imaging is a developing method for the analysis and identification of material composition in many applications, such as within the earth sciences. Using compact imaging devices in the field allows near-vertical topography to be imaged, thus bypassing the key limitations of viewing angle and resolution that preclude the use of airborne and spaceborne platforms. Terrestrial laser scanning allows 3D topography to be captured with high precision and spatial resolution. The combination of 3D geometry from laser scanning and material properties from hyperspectral imaging allows new fusion products to be created, adding new information for solving application problems. This paper highlights the advantages of terrestrial lidar and hyperspectral integration, focussing on the qualitative and quantitative aspects, with examples from a geological field application. Accurate co-registration of the two data types is required. This allows 2D pixels to be linked to the 3D lidar geometry, giving increased quantitative analysis as classified material vectors are projected to 3D space for calculation of areas and examination of spatial relationships. User interpretation of hyperspectral results in a spatially-meaningful manner is facilitated using visual methods that combine the geometric and mineralogical products in a 3D environment. Point cloud classification and the use of photorealistic modelling enhance qualitative validation and interpretation, and allow image registration accuracy to be checked. A method for texture mapping of lidar meshes with multiple image textures, both conventional digital photos and hyperspectral results, is described. The integration of terrestrial laser scanning and hyperspectral imaging is a valuable means of providing new analysis methods, suitable for many applications requiring linked geometric and chemical information.

  14. An Error Analysis of Structured Light Scanning of Biological Tissue

    DEFF Research Database (Denmark)

    Jensen, Sebastian Hoppe Nesgaard; Wilm, Jakob; Aanæs, Henrik

    2017-01-01

    This paper presents an error analysis and correction model for four structured light methods applied to three common types of biological tissue: skin, fat and muscle. Despite its many advantages, structured light is based on the assumption of direct reflection at the object surface only. This assumption is violated by most biological material, e.g. human skin, which exhibits subsurface scattering. In this study, we find that in general, structured light scans of biological tissue deviate significantly from the ground truth. We show that a large portion of this error can be predicted with a simple, statistical linear model based on the scan geometry. As such, scans can be corrected without introducing any specially designed pattern strategy or hardware. We can effectively reduce the error in a structured light scanner applied to biological tissue by as much as a factor of two or three.
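
    The paper's exact model terms are not given in this record; the sketch below illustrates the general idea of fitting a simple linear error model to scan-geometry features by ordinary least squares, with synthetic data and hypothetical features (incidence angle, camera distance):

    ```python
    import numpy as np

    # Hypothetical training data: per-point depth error (mm) against simple
    # scan-geometry features such as incidence angle and distance to the camera.
    rng = np.random.default_rng(0)
    angle = rng.uniform(0, 60, 500)          # degrees
    dist = rng.uniform(300, 600, 500)        # mm
    error = 0.02 * angle + 0.001 * dist + rng.normal(0, 0.1, 500)

    # Ordinary least squares: error ~ b0 + b1*angle + b2*dist
    X = np.column_stack([np.ones_like(angle), angle, dist])
    coef, *_ = np.linalg.lstsq(X, error, rcond=None)

    # Correction: subtract the predicted systematic error from new scan points.
    predicted = X @ coef
    corrected_residual = error - predicted
    print(coef, corrected_residual.std())
    ```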

  15. An interchangeable scanning Hall probe/scanning SQUID microscope

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Chiu-Chun; Lin, Hui-Ting; Wu, Sing-Lin [Department of Physics, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Chen, Tse-Jun; Wang, M. J. [Institute of Astronomy and Astrophysics, Academia Sinica, Taipei 10617, Taiwan (China); Ling, D. C. [Department of Physics, Tamkang University, Tamsui Dist., New Taipei City 25137, Taiwan (China); Chi, C. C.; Chen, Jeng-Chung [Department of Physics, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Frontier Research Center on Fundamental and Applied Sciences of Matters, National Tsing Hua University, Hsinchu 30013, Taiwan (China)

    2014-08-15

    We have constructed a scanning probe microscope for magnetic imaging, which can function as a scanning Hall probe microscope (SHPM) and as a scanning SQUID microscope (SSM). The scanning scheme, applicable to SHPM and SSM, consists of a mechanical positioning (sub)micron XY stage and a flexible direct contact to the sample without a feedback control system for the Z-axis. With the interchangeable capability of operating two distinct scanning modes, our microscope can incorporate the advantageous functionalities of the SHPM and SSM with a large scan range of up to a millimeter, high spatial resolution (⩽4 μm), and high field sensitivity over a wide range of temperature (4.2 K to 300 K) and magnetic field (10^-7 T to 1 T). To demonstrate the capabilities of the system, we present magnetic images scanned with SHPM and SSM, including a NdFeB magnet and a nickel grid pattern at room temperature, surface magnetic domain structures of a La2/3Ca1/3MnO3 thin film at 77 K, and superconducting vortices in a striped niobium film at 4.2 K.

  16. Development of an ultra wide band microwave radar based footwear scanning system

    Science.gov (United States)

    Rezgui, Nacer Ddine; Bowring, Nicholas J.; Andrews, David A.; Harmer, Stuart W.; Southgate, Matthew J.; O'Reilly, Dean

    2013-10-01

    At airports, security screening can cause long delays. To speed up screening, a method is needed that lets passengers keep their shoes on rather than removing them for X-ray scanning. To detect threats or contraband hidden within the shoe, a screening method using frequency-swept signals between 15 and 40 GHz has been developed, in which the scan is carried out whilst the shoes are being worn. Most footwear is transparent to microwaves to some extent in this band. The scans, data processing and interpretation of the 2D image of the cross section of the shoe are completed in a few seconds. Using safe low power UWB radar, scattered signals from the shoe can be observed which are caused by changes in material properties such as cavities, dielectric or metal objects concealed within the shoe. By moving the transmission horn along the length of the shoe, a 2D image corresponding to a cross section through the footwear is built up, which can be interpreted by the user, or automatically, to reveal the presence of a concealed threat within the shoe. A prototype system with a resolution of 6 mm or less has been developed and results obtained for a wide range of commonly worn footwear, some modified by the inclusion of concealed material. Clear differences between the measured images of modified and unmodified shoes are seen. Procedures for enhancing the image through electronic image synthesis techniques and image processing methods are discussed, and preliminary performance data are presented.

  17. Pre- and postoperative ventilation-perfusion scan findings in patients undergoing total hip replacement or knee arthroplasty

    International Nuclear Information System (INIS)

    Kim, S.M.; Park, C.H.; Intenzo, C.M.

    1988-01-01

    Venous thromboembolism is one of the major postoperative complications in patients undergoing total hip replacement (THR) or knee arthroplasty (TKA). The reported incidence of pulmonary embolism in this group is as high as 20%. The purpose of this report was to evaluate the value of preoperative and 7th-day postoperative ventilation-perfusion (V/Q) lung scans in the management of patients undergoing elective reconstructive surgery of the hips or knees. Routine preoperative and 7th-day postoperative V/Q lung scans were obtained in 34 patients who underwent THR (17 patients) or TKA (17 patients). There were 15 male and 19 female patients, with an age distribution ranging from 56 to 80 years. Chest radiographs were obtained within 1 day of the pre- or postoperative lung scan. Lung scans were interpreted by two experienced nuclear physicians.

  18. Statistical physics of medical ultrasonic images

    International Nuclear Information System (INIS)

    Wagner, R.F.; Insana, M.F.; Brown, D.G.; Smith, S.W.

    1987-01-01

    The physical and statistical properties of backscattered signals in medical ultrasonic imaging are reviewed in terms of: 1) the radiofrequency signal; 2) the envelope (video or magnitude) signal; and 3) the density of samples in simple and in compounded images. There is a wealth of physical information in backscattered signals in medical ultrasound. This information is contained in the radiofrequency spectrum - which is not typically displayed to the viewer - as well as in the higher statistical moments of the envelope or video signal - which are not readily accessed by the human viewer of typical B-scans. This information may be extracted from the detected backscattered signals by straightforward signal processing techniques at low resolution

  19. Importance of bony analysis for interpreting ear CT scans: part three; ORL - tomodensitometrie de l'oreille. Interet de l'analyse osseuse dans l'interpretation des scanners de l'oreille (troisieme partie)

    Energy Technology Data Exchange (ETDEWEB)

    Serhal, M.; Dordea, M.; Cymbalista, M. [Hopital de Montfermeil, Service de Radiologie, 93 - Montfermeil (France); Halimi, P. [Hopital Europeen Georges-Pompidou, Service de Radiologie, 75 - Paris (France); Iffenecker, C. [Clinique Radiologique, 62 - Boulogne sur Mer (France); Bensimon, J.L

    2003-02-01

    An accurate description of bony changes on ear CT scans has great diagnostic and therapeutic impact. This third part shows how to analyze bone remodeling when CT is performed for tumors in the vicinity of the temporal bone, for intratemporal lesions of the facial nerve, and for external auditory canal malformations. It demonstrates how bony analysis should be included in the postoperative report of an ear CT scan. The importance of bony signs in tumors and pseudotumors of the inner ear is outlined. (authors)

  20. The statistical interpretations of counting data from measurements of low-level radioactivity

    International Nuclear Information System (INIS)

    Donn, J.J.; Wolke, R.L.

    1977-01-01

    The statistical model appropriate to measurements of low-level or background-dominant radioactivity is examined and the derived relationships are applied to two practical problems involving hypothesis testing: 'Does the sample exhibit a net activity above background' and 'Is the activity of the sample below some preselected limit'. In each of these cases, the appropriate decision rule is formulated, procedures are developed for estimating the preset count which is necessary to achieve a desired probability of detection, and a specific sequence of operations is provided for the worker in the field. (author)
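
    The record does not reproduce the derived decision rules, but the standard Currie-style formulation of the two tests reads roughly as follows; this is a sketch under the assumption of a paired background count and a one-sided 5% error rate, not the paper's exact procedure:

    ```python
    import math

    def critical_level(background_counts, k=1.645):
        """Currie-style critical level for a net count above a paired
        background measurement (one-sided alpha of roughly 0.05)."""
        return k * math.sqrt(2 * background_counts)

    def detection_limit(background_counts, k=1.645):
        """A-priori detection limit for the same setup (alpha = beta)."""
        return k**2 + 2 * critical_level(background_counts, k)

    b = 400  # hypothetical background counts in the counting interval
    print(critical_level(b), detection_limit(b))  # ~46.5, ~95.8
    ```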

  1. NEW SCANNING DEVICE FOR SCANNING TUNNELING MICROSCOPE APPLICATIONS

    NARCIS (Netherlands)

    SAWATZKY, GA; Koops, Karl Richard

    A small, single piezo XYZ translator has been developed. The device has been used as a scanner for a scanning tunneling microscope and has been tested successfully in air and in UHV. Its simple design results in a rigid and compact scanning unit which permits high scanning rates.

  2. Advanced optical system for scanning-spot photorefractive keratectomy (PRK)

    Science.gov (United States)

    Mrochen, Michael; Wullner, Christian; Semchishen, Vladimir A.; Seiler, Theo

    1999-06-01

    Purpose: The goal of this presentation is to discuss the use of the Light Shaping Beam Homogenizer (LSBH) in an optical system for scanning-spot PRK. Methods: The basic principle of the LSBH is the transformation of any incident intensity distribution by light scattering on an irregular microlens structure z = f(x,y). The relief of this microlens structure is determined by a defined statistical function, i.e. it is characterized by the root-mean-square tilt σ of the surface relief. The beam evolution after the LSBH and in the focal plane of an imaging lens was therefore measured for various rms tilts. In addition, an optical setup for scanning-spot PRK was assembled according to the theoretical and experimental results. Results: The divergence, homogeneity and Gaussian radius of the intensity distribution in the treatment plane of the scanning-spot PRK laser system depend mainly on the root-mean-square tilt σ of the LSBH, as explained by the theoretical description of the LSBH. Conclusions: The LSBH represents a simple, low cost beam homogenizer with low energy losses for scanning-spot excimer laser systems.

  3. Principles of Statistics: What the Sports Medicine Professional Needs to Know.

    Science.gov (United States)

    Riemann, Bryan L; Lininger, Monica R

    2018-07-01

    Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.
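
    To make the distinction between statistical significance and clinical meaningfulness concrete, the sketch below computes a p-value alongside Cohen's d and a 95% confidence interval for a mean difference; the group data are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical outcome scores for two independent groups
    a = np.array([12.1, 14.3, 11.8, 13.5, 12.9, 14.0])
    b = np.array([10.2, 11.1, 9.8, 10.9, 11.4, 10.5])

    t_stat, p_val = stats.ttest_ind(a, b)

    # Cohen's d with a pooled standard deviation
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    d = (a.mean() - b.mean()) / pooled_sd

    # 95% confidence interval for the mean difference
    se = pooled_sd * np.sqrt(1 / na + 1 / nb)
    ci = stats.t.interval(0.95, na + nb - 2, loc=a.mean() - b.mean(), scale=se)

    print(f"p = {p_val:.4f}, d = {d:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```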

  4. Role of delayed indium-111 labeled leukocyte scan in the management of Crohn's disease

    International Nuclear Information System (INIS)

    Slaton, G.D.; Navab, F.; Boyd, C.M.; Diner, W.C.; Texter, E.C. Jr.

    1985-01-01

    Comparison of nine patients with Crohn's disease who had a positive delayed (24 hr) indium-111 leukocyte scan and 10 patients with a negative scan showed no significant difference between the two groups in Crohn's disease activity index, sedimentation rate, survival, complications, number of days in hospital, outpatient visits, or readmissions. Despite the apparent lack of statistical significance for the Crohn's disease activity index, the scan was positive in nine of 16 patients with an activity index above 150, and in none of three patients with an index below 150. In the patients studied, there were no false-positive leukocyte scans. In nine of 10 patients with ileocolonic disease, scanning results correctly predicted the proper management. Six patients with a positive scan and enteroclysis responded to medical treatment. Four patients had positive enteroclysis and a negative scan; of these, three had radiographic features of chronic ileal stricture, which was confirmed at operation. The results suggest that a negative delayed indium-111 leukocyte scan may be useful in the diagnosis of chronic fibrotic ileal stricture.

  5. Nuclear Scans

    Science.gov (United States)

    Nuclear scans use radioactive substances to see structures and functions inside your body. They use a special ... images. Most scans take 20 to 45 minutes. Nuclear scans can help doctors diagnose many conditions, including ...

  6. Beam diffusion measurements using collimator scans in the LHC

    Directory of Open Access Journals (Sweden)

    Gianluca Valentino

    2013-02-01

    Full Text Available The time evolution of beam losses during a collimator scan provides information on halo diffusion and population. This is an essential input for machine performance characterization and for the design of collimation systems. Beam halo measurements in the CERN Large Hadron Collider were conducted through collimator scrapings in a dedicated beam study for the first time at 4 TeV. Four scans were performed with two collimators, in the vertical plane for beam 1 and horizontally for beam 2, before and after bringing the beams into collisions. Inward and outward steps were performed. A diffusion model was used to interpret the observed loss rate evolution in response to the collimator steps. With this technique, diffusion coefficients were estimated as a function of betatron oscillation amplitude from approximately 3 to 7 standard deviations of the transverse beam distribution. A comparison of halo diffusion and core emittance growth rates is also presented.

  7. Detection of lacunar infarction in brain CT-scans: No evidence of bias from accompanying patient information

    International Nuclear Information System (INIS)

    Bonke, B.; Knippenberg, F.C.E. van; Duivenvoorden, H.J.; Kappelle, L.J.

    1989-01-01

    Interobserver agreement in assessing brain CT-scans is, in general, high. The extent, however, to which such agreement is caused by bias through knowledge of other clinical details remains uncertain. The hypothesis that observers are somehow prejudiced before assessing ambiguous CT-scans in this particular situation was tested. Sixteen neurologists and 16 radiologists volunteered to interpret two ambiguous brain CT-scans with regard to the presence or absence of a lacunar infarct in the region of the internal capsule. The scans were accompanied by 'patient' information that was or was not suggestive of a stroke. These scans were camouflaged by a variety of other scans, to be assessed in the same way, to mask the purpose of the study. It was assumed that the observers, in their assessments of the scans, would somehow let their ratings of the likelihood of a lacunar infarction in or near the internal capsule be influenced by the accompanying information. Results showed lower ratings by neurologists (i.e., less likelihood of an infarction) than by radiologists in the majority of all assessments, but no bias from the accompanying information. (orig.)

  8. On the low latitude scanning photometer signatures of equatorial ionosphere plasma bubbles

    International Nuclear Information System (INIS)

    Abdu, M.A.; Sobral, J.H.A.; Nakamura, Y.

    1985-01-01

    Meridional and east-west scanning 6300 Å night airglow photometers are being extensively used at the low latitude station Cachoeira Paulista (23°S, 45°W, dip latitude 14°), Brazil, for investigation of trans-equatorial ionospheric plasma bubble dynamics. The zonal velocities of the flux-aligned plasma bubbles can be determined, in a straightforward way, from the east-west displacement of the airglow intensity valleys observed by the east-west scan photometer. On the other hand, the determination of the other velocity component of the plasma bubble motion (namely, vertical motion in the equatorial plane) has to be based on the meridional propagation of the airglow valleys observed by the meridional scan photometer. Such determinations of the bubbles' vertical rise velocity should, however, involve considerations of different bubble parameters such as, for example, the phase of the bubble event (whether growth, mature or decay phase), the limited east-west extension, and the often observed westward tilt of the bubble. In this brief report, the possible influences of these different factors on the interpretation of low latitude scanning photometer data used to infer trans-equatorial plasma bubble dynamics are considered in some detail. (author)

  9. Quantitative analysis of tip-sample interaction in non-contact scanning force spectroscopy

    International Nuclear Information System (INIS)

    Palacios-Lidon, Elisa; Colchero, Jaime

    2006-01-01

    Quantitative characterization of the tip-sample interaction in scanning force microscopy is fundamental for optimum image acquisition as well as data interpretation. In this work we discuss how to precisely characterize the electrostatic and van der Waals contributions to the tip-sample interaction in non-contact scanning force microscopy. The spectroscopic technique presented is based on the simultaneous measurement of cantilever deflection, oscillation amplitude and frequency shift as a function of tip-sample voltage and tip-sample distance, as well as on advanced data processing. Data are acquired at a fixed lateral position as interaction images, with the bias voltage as the fast scan and the tip-sample distance as the slow scan. Due to the quadratic dependence of the electrostatic interaction on the tip-sample voltage, the van der Waals force can be separated from the electrostatic force. Using appropriate data processing, the van der Waals interaction, the capacitance and the contact potential can be determined as a function of tip-sample distance. The measurement of the resonance frequency shift yields a very high signal to noise ratio and the absolute calibration of the measured quantities, while the acquisition of cantilever deflection allows the determination of the tip-sample distance.
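
    The separation described above rests on the electrostatic contribution being parabolic in the bias voltage, with its apex at the contact potential; a minimal sketch of recovering the contact potential by a parabola fit (synthetic curve, hypothetical values):

    ```python
    import numpy as np

    # Synthetic frequency-shift curve: quadratic in bias voltage, with its
    # apex at the contact potential difference (CPD).
    rng = np.random.default_rng(0)
    v = np.linspace(-2.0, 2.0, 81)
    v_cpd, curvature, offset = 0.35, -4.0, -12.0   # hypothetical values
    dfreq = curvature * (v - v_cpd) ** 2 + offset + rng.normal(0, 0.05, v.size)

    # Fit a parabola y = a*v^2 + b*v + c and recover the CPD from the apex -b/(2a).
    a, b, c = np.polyfit(v, dfreq, 2)
    print("estimated CPD:", -b / (2 * a))
    ```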

  10. Evaluation of lateral margin of left lobe of the liver on CT scan : focus on perisplenic extension

    International Nuclear Information System (INIS)

    Seo, Chang Hye; Cha, Seong Sook; Lee, Byung Jin; Choi, Jae Young; Choi, Seok Jin; Eun, Choong Ki

    1996-01-01

    The perisplenic extension of the left lobe of the liver can be misinterpreted as a splenic or perisplenic lesion on ultrasonography (US) and computed tomography (CT). The purpose of our study was to classify the lateral margin of the left lobe of the liver into three types and to evaluate the incidence of each type and its relationship with abnormal liver on CT scans. A total of 515 abdominal CT scans from patients over 15 years old were retrospectively evaluated. Liver contours were divided into three types on the basis of the degree of left lateral extension of the left lobe. Type A was defined as lateral extension of the left lobe to the medial portion of the stomach, type C as extension to the perisplenic portion, and type B as between the two. Each type was further divided into normal and abnormal liver groups based on clinical, CT, surgical and pathologic findings, and evaluated for the ratio of normal to abnormal liver, the intrahepatic diseases associated with an abnormal liver, and the statistical significance of the difference between normal and abnormal liver. The incidence of the three types among the 515 patients was 360 (69.9%), 121 (23.5%) and 34 (6.6%) for types A, B and C, respectively. Type C showed normal liver in six patients, 2.7% of all normal livers (221/515), and abnormal liver in 28 patients, 9.5% of all abnormal livers (294/515). Type A showed normal liver in 49.7% and abnormal liver in 50.3%, a difference that was not statistically significant (p>0.05). Type B showed normal liver in 29.8% and abnormal liver in 70.2%; type C showed normal liver in 17.6% and abnormal liver in 82.4%, a statistically significant difference (P<0.001). A space occupying lesion (SOL) was most common (52.6%) in all the abnormal livers, and hepatoma was the most common disease among the SOLs (47.2%). In the abnormal type C liver, SOL (58%) and diffuse hepatopathy (32.8%) were

  11. Selection of patients with infrainguinal arterial occlusive disease for percutaneous transluminal angioplasty with duplex scanning

    International Nuclear Information System (INIS)

    Bostroem Ardin, A.; Hellberg, A.; Ljungman, C.; Logason, K.; Karacagil, S.; Loefberg, A.M.; Andren, B.

    2002-01-01

    Aim: To evaluate the role of duplex scanning in the selection of patients with infrainguinal arterial occlusive disease for percutaneous transluminal angioplasty (PTA). Material and Methods: From January 1995 through May 2000, 702 patients (952 limbs), with chronic lower extremity ischemia due to infrainguinal atherosclerotic disease diagnosed by duplex scanning, were retrospectively studied. Diagnostic angiography (130 limbs) or infrainguinal PTA (108 limbs) was performed in 238 limbs. Two investigators retrospectively analyzed the duplex examinations and angiographies in a blinded manner and used similar criteria for the interpretation of lesions suitable or not suitable for PTA. Results: The superficial femoral, popliteal and crural artery lesions were correctly selected for PTA in 85%, 66% and 32%, respectively. The accuracy, sensitivity, specificity, negative predictive value and positive predictive value of duplex scanning to appropriately categorize femoropopliteal lesions as suitable or unsuitable for PTA were 89%, 83%, 92%, 94% and 78%, respectively. The accuracy of duplex scanning for predicting the performance of infrainguinal PTA was 83%. Conclusion: Duplex scanning has an important impact on the selection of treatment modalities in limbs with infrainguinal arterial occlusive disease. Femoropopliteal lesions can be reliably selected to PTA according to duplex scan findings

  12. X-ray fluorescent scanning of the thyroid

    International Nuclear Information System (INIS)

    Jonckheer, M.H.; Deconinck, F.

    1983-01-01

    The main emphasis of the technical chapters of this monograph lies on the aspects which are of direct importance to thyroid scanning: the general principles of X-ray fluorescence, the choice and characteristics of appropriate sources and detectors, a stationary system, quantification problems, and the pitfalls in the interpretation of the intrathyroidal iodine imaging and quantification. The clinical part of the monograph consists of chapters on the role of stable iodine and the thyroid function, on endemic non-toxic goiter, on hyperthyroidism as a result of iodine overload, on feasibility of dynamic studies, on stable iodine stores in thyroiditis, and on a general review of the clinical usefulness of XRF in thyroid disease. (Auth.)

  13. 3D digital image processing for biofilm quantification from confocal laser scanning microscopy: Multidimensional statistical analysis of biofilm modeling

    Science.gov (United States)

    Zielinski, Jerzy S.

    The dramatic increase in the number and volume of digital images produced in medical diagnostics, and the escalating demand for rapid access to these relevant medical data, along with the need for interpretation and retrieval, have become of paramount importance to a modern healthcare system. There is therefore an ever growing need for processed, interpreted and saved images of various types. Due to the high cost and unreliability of human-dependent image analysis, it is necessary to develop an automated method for feature extraction, using sophisticated mathematical algorithms and reasoning. This work is focused on digital image signal processing of biological and biomedical data in one-, two- and three-dimensional space. Methods and algorithms presented in this work were used to acquire data from genomic sequences, breast cancer, and biofilm images. One-dimensional analysis was applied to DNA sequences, which were represented as a non-stationary sequence and modeled by a time-dependent autoregressive moving average (TD-ARMA) model. Two-dimensional analysis used a 2D-ARMA model and applied it to detect breast cancer from x-ray mammograms or ultrasound images. Three-dimensional detection and classification techniques were applied to biofilm images acquired using confocal laser scanning microscopy. Modern medical images are geometrically arranged arrays of data. The broadening scope of imaging as a way to organize our observations of the biophysical world has led to a dramatic increase in our ability to apply new processing techniques and to combine multiple channels of data into sophisticated and complex mathematical models of physiological function and dysfunction. With the explosion of the amount of data produced in the field of biomedicine, it is crucial to be able to construct accurate mathematical models of the data at hand. The two main purposes of signal modeling are data size conservation and parameter extraction. Specifically, in biomedical imaging we have four key problems
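
    As a small illustration of the ARMA modeling mentioned above (not the paper's TD-ARMA implementation), an ARMA(p, q) fit to a 1-D signal can be expressed as an ARIMA(p, 0, q) model in statsmodels:

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Simulate a stationary AR(2) series as a stand-in for a numerically
    # encoded biological signal.
    rng = np.random.default_rng(1)
    e = rng.normal(size=500)
    x = np.zeros(500)
    for i in range(2, 500):
        x[i] = 0.5 * x[i - 1] - 0.2 * x[i - 2] + e[i]

    # An ARMA(2, 1) model is an ARIMA model with zero differencing.
    fit = ARIMA(x, order=(2, 0, 1)).fit()
    print(fit.params)  # compact parameter representation of the signal
    ```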

  14. New nuclear scanning and surveillance systems for global security and safeguards

    International Nuclear Information System (INIS)

    Kemeny, L. G.

    2003-01-01

    This paper discusses new innovative techniques for both cargo and personnel scanning and for plant and infrastructure surveillance and protection. It contains Intellectual Property, and some of the systems described are covered by Patents. For example, a typical container inspection system is based on Pulsed Fast Neutron Analysis operating on the following principles: 1. An accelerator produces pulses of fast neutrons, which interact with the elemental composition of the cargo under inspection. In a manner similar to radar scanning, the timing and positioning of the pulsed neutrons indicate where the interactions occur. These interactions initiate the emission of gamma radiation which characterises the elemental composition and which is collected by sensor arrays. 2. The gamma ray signals are analysed in a high speed processor which identifies the presence and location of the chemical element combinations in all types of contraband. These may be drugs, explosives or nuclear material. 3. High resolution images display the location and shape of all contraband in the cargo under inspection. An x-ray like image of the cargo can also be provided. Because the scanning system software already contains standard gamma ray material signatures, the need for time consuming and unreliable manual interpretation of the complicated images obtained in x-ray scanning systems is completely eliminated.

  15. Scanning the business external environment for information: evidence from Greece

    Directory of Open Access Journals (Sweden)

    L. Kourteli

    2005-01-01

    Full Text Available Introduction. This paper examines business external environment scanning theory for information in the context of Greece. Method. A questionnaire was developed to explore the relationships between the general and task business environment, perceived uncertainty, scanning strategy, and sources of information with respect to type of environment, size and industry. The research was based on a sample of 144 private organizations operating in northern Greece. Analysis. Data collected were analysed using SPSS. The statistical procedures of the chi-squared homogeneity test, ANOVA, Duncan's test of homogeneity of means, and the related-samples t-test were followed for testing the hypotheses developed. Results. The results show that perceived uncertainty of the general and task business external environment factors depends on the type of environment, size of organization, and industry in which the organizations operate; organizations adapt their scanning strategy to the complexity of the environment; personal sources of information seem to be more important than impersonal sources; external sources of information are equally as important as internal sources; and higher levels of environmental uncertainty are associated with higher levels of scanning of the various sources. Conclusion. Business external environment scanning for information is influenced by the characteristics of the organizations themselves and by the characteristics of the external environment within which they operate. The study contributes to environmental scanning theory and has important messages for practitioners.

  16. Statistics for X-chromosome associations.

    Science.gov (United States)

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

    In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because of the difference between males and females in number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal, and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur even under the null hypothesis when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations, such as trait variance differences between the sexes or small sex differences in allele frequencies, from assumptions. © 2018 WILEY PERIODICALS, INC.
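
    One commonly proposed X-chromosome coding, assumed here for illustration and not necessarily the statistic the authors recommend, counts male genotypes as 0/2 so that a hemizygous male matches a homozygous female under X inactivation, with sex included as a covariate:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n, maf = 1000, 0.3
    sex = rng.integers(0, 2, n)              # 0 = female, 1 = male

    # Females carry two X alleles, males carry one.
    geno = np.where(sex == 1,
                    2 * rng.binomial(1, maf, n),   # males coded 0/2
                    rng.binomial(2, maf, n))       # females coded 0/1/2

    y = rng.binomial(1, 0.3, n)              # phenotype simulated under the null
    X = sm.add_constant(np.column_stack([geno, sex]))
    res = sm.Logit(y, X).fit(disp=0)
    print(res.pvalues)                       # genotype and sex effects
    ```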

  17. Effect of variable scanning protocols on the pre-implant site evaluation of the mandible in reformatted computed tomography

    International Nuclear Information System (INIS)

    Kim, Kee Deog; Park, Chang Seo

    1999-01-01

    To evaluate the effect of variable computed tomography scanning protocols on the evaluation of the pre-implant site of the mandible, the reformatted cross-sectional images of helical CT scans obtained with various imaging parameters were compared with those of conventional CT scans. A dry mandible was imaged using conventional nonoverlapped CT scans with 1 mm slice thickness and helical CT scans with 1 mm slice thickness and pitches of 1.0, 1.5, 2.0, 2.5 and 3.0. All helical images were reconstructed at a reconstruction interval of 1 mm. DentaScan reformatted images were obtained to allow standardized visualization of cross-sectional images of the mandible. The reformatted images were reviewed and measured separately by 4 dental radiologists, who evaluated image quality (continuity of the cortical outline, trabecular bone structure and visibility of the mandibular canal) and measured the distances between anatomic structures. In image quality and in horizontal measurement, there was no statistically significant difference among conventional scans and helical scans with pitches of 1.0, 1.5 and 2.0. In vertical measurement, there was no statistically significant difference among the conventional scans and the helical CT scans at any pitch (1.0, 1.5, 2.0, 2.5 and 3.0). The images of helical CT scans with 1 mm slice thickness and pitches of 1.0, 1.5 and 2.0 are thus as good as those of conventional CT scans with 1 mm slice thickness for evaluation of the pre-implant site of the mandible. Considering radiation dose and patient comfort, helical CT scanning with 1 mm slice thickness and a pitch of 2.0 is recommended for evaluation of the pre-implant site of the mandible.

  18. Diffusion-weighted MRI to assess response to chemoradiotherapy in rectal cancer: main interpretation pitfalls and their use for teaching

    International Nuclear Information System (INIS)

    Lambregts, Doenja M.J.; Lahaye, Max J.; Maas, Monique; Heeswijk, Miriam M. van; Delli Pizzi, Andrea; Elderen, Saskia G.C. van; Andrade, Luisa; Peters, Nicky H.G.M.; Osinga-de Jong, Margreet; Kint, Peter A.M.; Bipat, Shandra; Ooms, Rik; Beets, Geerard L.; Bakers, Frans C.H.; Beets-Tan, Regina G.H.

    2017-01-01

    To establish the most common image interpretation pitfalls for non-expert readers using diffusion-weighted imaging (DWI) to assess response to chemoradiotherapy in patients with rectal cancer and to explore the use of these pitfalls in an expert teaching setting. Two independent non-expert readers (R1 and R2) scored the restaging DW MRI scans (b1,000 DWI, in conjunction with ADC maps and T2-W MRI scans for anatomical reference) in 100 patients for the likelihood of a complete response versus residual tumour using a five-point confidence score. The readers received expert feedback and the final response outcome for each case. The supervising expert documented any potential interpretation errors/pitfalls discussed for each case to identify the most common pitfalls. The most common pitfalls were the interpretation of low signal on the ADC map, small susceptibility artefacts, T2 shine-through effects, suboptimal sequence angulation and collapsed rectal wall. Diagnostic performance (area under the ROC curve) was 0.78 (R1) and 0.77 (R2) in the first 50 patients and 0.85 (R1) and 0.85 (R2) in the final 50 patients. Five main image interpretation pitfalls were identified and used for teaching and feedback. Both readers achieved a good diagnostic performance with an AUC of 0.85. (orig.)

  19. Diffusion-weighted MRI to assess response to chemoradiotherapy in rectal cancer: main interpretation pitfalls and their use for teaching

    Energy Technology Data Exchange (ETDEWEB)

    Lambregts, Doenja M.J.; Lahaye, Max J.; Maas, Monique [The Netherlands Cancer Institute, Department of Radiology, Amsterdam (Netherlands); Heeswijk, Miriam M. van [The Netherlands Cancer Institute, Department of Radiology, Amsterdam (Netherlands); Maastricht University Medical Centre, Department of Radiology, Maastricht (Netherlands); Maastricht University Medical Centre, Department of Surgery, Maastricht (Netherlands); Maastricht University, GROW School for Oncology and Developmental Biology, Maastricht (Netherlands); Delli Pizzi, Andrea [Gabriele d' Annunzio University, SS. Annunziate Hospital, Department of Neuroscience and Imaging, Chieti (Italy); Elderen, Saskia G.C. van [Leiden University Medical Centre, Department of Radiology, Leiden (Netherlands); Andrade, Luisa [Hospitais da Universidade de Coimbra, Department of Radiology, Coimbra (Portugal); Peters, Nicky H.G.M.; Osinga-de Jong, Margreet [Zuyderland Medical Center, location Heerlen, Heerlen (Netherlands); Kint, Peter A.M. [Amphia Hospital, Department of Radiology, Breda (Netherlands); Bipat, Shandra [Academic Medical Centre, Department of Radiology, Amsterdam (Netherlands); Ooms, Rik [Maxima Medical Centre, Department of Radiology, Eindhoven-Veldhoven (Netherlands); Beets, Geerard L. [The Netherlands Cancer Institute, Department of Surgery, Amsterdam (Netherlands); Maastricht University, GROW School for Oncology and Developmental Biology, Maastricht (Netherlands); Bakers, Frans C.H. [Maastricht University Medical Centre, Department of Radiology, Maastricht (Netherlands); Beets-Tan, Regina G.H. [The Netherlands Cancer Institute, Department of Radiology, Amsterdam (Netherlands); Maastricht University, GROW School for Oncology and Developmental Biology, Maastricht (Netherlands)

    2017-10-15

    To establish the most common image interpretation pitfalls for non-expert readers using diffusion-weighted imaging (DWI) to assess response to chemoradiotherapy in patients with rectal cancer and to explore the use of these pitfalls in an expert teaching setting. Two independent non-expert readers (R1 and R2) scored the restaging DW MRI scans (b1,000 DWI, in conjunction with ADC maps and T2-W MRI scans for anatomical reference) in 100 patients for the likelihood of a complete response versus residual tumour using a five-point confidence score. The readers received expert feedback and the final response outcome for each case. The supervising expert documented any potential interpretation errors/pitfalls discussed for each case to identify the most common pitfalls. The most common pitfalls were the interpretation of low signal on the ADC map, small susceptibility artefacts, T2 shine-through effects, suboptimal sequence angulation and collapsed rectal wall. Diagnostic performance (area under the ROC curve) was 0.78 (R1) and 0.77 (R2) in the first 50 patients and 0.85 (R1) and 0.85 (R2) in the final 50 patients. Five main image interpretation pitfalls were identified and used for teaching and feedback. Both readers achieved a good diagnostic performance with an AUC of 0.85. (orig.)

  20. Interpretative commenting.

    Science.gov (United States)

    Vasikaran, Samuel

    2008-08-01

    * Clinical laboratories should be able to offer interpretation of the results they produce.
    * At a minimum, contact details for interpretative advice should be available on laboratory reports.
    * Interpretative comments may be verbal or written and printed.
    * Printed comments on reports should be offered judiciously, only where they would add value; no comment preferred to inappropriate or dangerous comment.
    * Interpretation should be based on locally agreed or nationally recognised clinical guidelines where available.
    * Standard tied comments ("canned" comments) can have some limited use. Individualised narrative comments may be particularly useful in the case of tests that are new, complex or unfamiliar to the requesting clinicians and where clinical details are available.
    * Interpretative commenting should only be provided by appropriately trained and credentialed personnel.
    * Audit of comments and continued professional development of personnel providing them are important for quality assurance.

  1. An Introduction to Statistical Concepts

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included.

  2. Statistics corner: A guide to appropriate use of correlation coefficient in medical research.

    Science.gov (United States)

    Mukaka, M M

    2012-09-01

    Correlation is a statistical method used to assess a possible linear association between two continuous variables. It is simple both to calculate and to interpret. However, misuse of correlation is so common among researchers that some statisticians have wished that the method had never been devised at all. The aim of this article is to provide a guide to the appropriate use of correlation in medical research and to highlight some misuse. Examples of applications of the correlation coefficient are provided using data from statistical simulations as well as real data. A rule of thumb for interpreting the size of a correlation coefficient is also provided.
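
    A minimal sketch of computing and labeling a correlation coefficient; the data are simulated, and the rule-of-thumb thresholds below are one common convention (cut-offs vary between texts):

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(3)
    x = rng.normal(50, 10, 100)              # e.g. a clinical measurement
    y = 0.5 * x + rng.normal(0, 12, 100)     # a correlated second variable

    r, p = pearsonr(x, y)

    # One common rule of thumb for the magnitude of r:
    bands = [(0.9, "very high"), (0.7, "high"), (0.5, "moderate"), (0.3, "low")]
    label = next((name for cut, name in bands if abs(r) >= cut), "negligible")
    print(f"r = {r:.2f} ({label}), p = {p:.4f}")
    ```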

  3. The Impact of Language Experience on Language and Reading: A Statistical Learning Approach

    Science.gov (United States)

    Seidenberg, Mark S.; MacDonald, Maryellen C.

    2018-01-01

    This article reviews the important role of statistical learning for language and reading development. Although statistical learning--the unconscious encoding of patterns in language input--has become widely known as a force in infants' early interpretation of speech, the role of this kind of learning for language and reading comprehension in…

  4. Brain PET scan

    Science.gov (United States)

    ... results on a PET scan. Blood sugar or insulin levels may affect the test results in people with diabetes. PET scans may be done along with a CT scan; this combination scan is called a PET/CT. Alternative names: brain positron emission tomography; PET scan - brain.

  5. Automatic segmentation of cell nuclei from confocal laser scanning microscopy images

    International Nuclear Information System (INIS)

    Kelemen, A.; Reist, H.W.

    1997-01-01

    A newly developed experimental method combines the possibility of irradiating more than a thousand cells simultaneously with an efficient colony-forming ability, and with the capability of localizing a particle track through a cell nucleus, together with assessment of the energy transfer, by digital superposition of the image containing the track with that of the cells. To assess the amount of energy deposited by particles traversing the cell nucleus, the intersection lengths of the particle tracks have to be known. Intersection lengths can be obtained by determining the 3D surface contours of the irradiated cell nuclei. Confocal laser scanning microscopy using a specific DNA fluorescent dye offers a possible way to determine the 3D shape of individual nuclei. Unfortunately, such experiments cannot be performed on living cells. One solution to this problem is to build a statistical model of the shape of the nuclei of the exposed cells. In order to build such a statistical model, a large number of cell nuclei have to be identified and segmented from confocal laser scanning microscopy images. The present paper describes a method to perform this 3D segmentation in an automatic manner in order to create a solid basis for the statistical model. (author) 2 figs., 4 refs
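
    A minimal sketch of the kind of automatic segmentation step involved, using thresholding and connected-component labeling on a stand-in 3D stack; the threshold and size cut-off are hypothetical, not the paper's method:

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(4)
    stack = rng.random((32, 64, 64))      # stand-in for a 3D fluorescence stack
    binary = stack > 0.995                # hypothetical intensity threshold

    labels, n = ndimage.label(binary)     # 6-connected components by default
    sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
    keep = np.flatnonzero(np.asarray(sizes) >= 2) + 1  # drop single-voxel speckle
    print(f"{n} raw components, {keep.size} retained as candidate nuclei")
    ```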

  6. SDE decomposition and A-type stochastic interpretation in nonequilibrium processes

    Science.gov (United States)

    Yuan, Ruoshi; Tang, Ying; Ao, Ping

    2017-12-01

    An innovative theoretical framework for stochastic dynamics based on the decomposition of a stochastic differential equation (SDE) into a dissipative component, a detailed-balance-breaking component, and a dual-role potential landscape has been developed, which has fruitful applications in physics, engineering, chemistry, and biology. It introduces the A-type stochastic interpretation of the SDE beyond the traditional Ito or Stratonovich interpretation or even the α-type interpretation for multidimensional systems. The potential landscape serves as a Hamiltonian-like function in nonequilibrium processes without detailed balance, which extends this important concept from equilibrium statistical physics to the nonequilibrium region. A question on the uniqueness of the SDE decomposition was recently raised. Our review of both the mathematical and physical aspects shows that uniqueness is guaranteed. The demonstration leads to a better understanding of the robustness of the novel framework. In addition, we discuss related issues including the limitations of an approach to obtaining the potential function from a steady-state distribution.
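
    A compact statement of the decomposition, in notation commonly used for this framework (an assumption based on the description above, not a quotation from the cited work):

    ```latex
    % Decomposition of \dot{x} = f(x) + \xi(x,t) into the A-type form:
    %   a dissipative part D, a detailed-balance-breaking part Q,
    %   and a dual-role potential landscape \phi.
    \dot{x} = -\left[ D(x) + Q(x) \right] \nabla \phi(x) + \xi(x,t),
    \qquad D = D^{\top} \succeq 0, \qquad Q = -Q^{\top}.
    ```

    Here D is symmetric positive semi-definite (dissipation), Q is antisymmetric (breaking detailed balance), and φ serves as the Hamiltonian-like potential whose minima correspond to the stable states of the nonequilibrium process.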

  7. Conversion factors and oil statistics

    International Nuclear Information System (INIS)

    Karbuz, Sohbet

    2004-01-01

    World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues: mass to volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
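
    To see how much the choice of factor matters, the sketch below converts the same volume of crude with two plausible barrels-per-tonne factors (illustrative values; actual factors depend on crude density):

    ```python
    barrels = 10_000_000   # hypothetical aggregate volume

    # Two plausible barrels-per-tonne factors; real factors vary by crude grade.
    light = barrels / 7.6  # lighter crude: more barrels per tonne
    heavy = barrels / 7.0  # heavier crude: fewer barrels per tonne

    print(f"{light:,.0f} t vs {heavy:,.0f} t ({heavy - light:,.0f} t apart)")
    # Roughly an 8-9% spread in tonnage from the conversion factor alone.
    ```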

  8. Choice of optimal phase for liver angiography and multi-phase scanning with multi-slice spiral CT

    International Nuclear Information System (INIS)

    Fang Hong; Song Yunlong; Bi Yongmin; Wang Dong; Shi Huiping; Zhang Wanshi; Zhu Hongxian; Yang Hua; Ji Xudong; Fan Hongxia

    2008-01-01

    Objective: To evaluate the efficacy of the test bolus technique with multi-slice spiral CT (MSCT) for determining the optimal scan delay time in CT hepatic artery (HA)-portal vein (PV) angiography and multi-phase scanning. Methods: MSCT liver angiography and multi-phase scanning were performed in 187 patients divided randomly into two groups. In group A (n=59), the scan delay time was set according to the subjective experience of the operators; in group B (n=128), the scan delay time was determined by the test bolus technique. The abdominal aorta and superior mesenteric vein were selected as target blood vessels, and 50 HU was set as the enhancement threshold. 20 ml of contrast agent was injected intravenously, time-density curves of the target blood vessels were obtained, and the HA and PV scan delay times were calculated respectively. The quality of the CTA images obtained with the two methods was compared and statistically analysed using the chi-square test. Results: For the hepatic artery phase, the images of group A were excellent in 34 (58%), good in 17 (29%), and poor in 8 (13%), while those of group B were excellent in 128 (100%), good in 0 (0%), and poor in 0 (0%). For the portal vein phase, the images of group A were excellent in 23 (39%), good in 27 (46%), and poor in 9 (15%), while those of group B were excellent in 96 (75%), good in 28 (22%), and poor in 4 (3%), respectively. There was a statistically significant difference in image quality between group A and group B (χ2 = 14.97 and 9.18, P < 0.05). Conclusion: An accurate scan delay time is best determined using the test bolus technique, which can improve the image quality of liver angiography and multi-phase scanning. (authors)
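
    The image-quality comparison reported above is a straightforward chi-square test on a ratings contingency table; a minimal sketch with SciPy follows, using the counts quoted for the portal vein phase. Note that the paper's quoted χ2 values may correspond to a different grouping of the rating categories, so this is a sketch of the method rather than a reproduction of the published statistic.

```python
# Chi-square test on the portal-vein-phase image-quality counts quoted
# above (rows: group A, group B; columns: excellent, good, poor).
from scipy.stats import chi2_contingency

table = [[23, 27, 9],    # group A: delay set by operator experience
         [96, 28, 4]]    # group B: delay set by test bolus technique

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```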

  9. A comparison between intrastomal 3D ultrasonography, CT scanning and findings at surgery in patients with stomal complaints.

    Science.gov (United States)

    Näsvall, P; Wikner, F; Gunnarsson, U; Rutegård, J; Strigård, K

    2014-10-01

    Since there are no reliable investigative tools for imaging parastomal hernia, new techniques are needed. The aim of this study was to assess the validity of intrastomal three-dimensional (3D) ultrasonography as an alternative to CT scanning for the assessment of stomal complaints. Twenty patients with stomal complaints warranting surgery were examined preoperatively with a CT scan in the supine position and 3D intrastomal ultrasonography in the supine and erect positions. Comparison was made with the findings at surgery, considered to be the true state. Both imaging methods showed high sensitivity (ultrasound 15/18, CT scan 15/18) and specificity (ultrasound 2/2, CT scan 1/2) when judged by a dedicated radiologist. The corresponding values for interpretation of CT scans in routine clinical practice were 17/18 for sensitivity and 1/2 for specificity. 3D ultrasonography has high validity and is a promising alternative to CT scanning in the supine position for distinguishing a bulge from a parastomal hernia.
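
    The validity figures above are simple proportions from a 2x2 comparison against the surgical reference standard; a minimal sketch of the computation, using the ultrasound counts quoted (15/18 sensitivity, 2/2 specificity), follows.

```python
# Sensitivity and specificity from counts against the surgical reference
# standard, using the 3D ultrasound figures quoted above.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# 18 hernias at surgery (15 detected), 2 without hernia (2 true negatives)
sens, spec = sens_spec(tp=15, fn=3, tn=2, fp=0)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```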

  10. Emergence of quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C

    2009-01-01

    The conceptual setting of quantum mechanics has been the subject of an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems, and we discuss the conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.

  11. A new generation scanning system for the high-speed analysis of nuclear emulsions

    Science.gov (United States)

    Alexandrov, A.; Buonaura, A.; Consiglio, L.; D'Ambrosio, N.; De Lellis, G.; Di Crescenzo, A.; Galati, G.; Lauria, A.; Montesi, M. C.; Tioukov, V.; Vladymyrov, M.

    2016-06-01

    The development of automatic scanning systems was a fundamental issue for large-scale neutrino detectors exploiting nuclear emulsions as particle trackers. Such systems significantly speed up event analysis in emulsion, making experiments with unprecedented statistics feasible. In the early 1990s, R&D programs were carried out by Japanese and European laboratories, leading to increasingly efficient automatic scanning systems. Recent progress in digital signal processing and image acquisition technology allows the construction of new systems with higher performance. In this paper we report the description and the performance of a new-generation scanning system able to operate at the record speed of 84 cm2/hour, based on the Large Angle Scanning System for OPERA (LASSO) software infrastructure developed by the Naples scanning group. This improvement reduces the scanning time by a factor of 4 with respect to the available systems, allowing the readout of huge amounts of nuclear emulsion in a reasonable time. This opens new perspectives for the employment of such detectors in a wider variety of applications.

  12. A new generation scanning system for the high-speed analysis of nuclear emulsions

    International Nuclear Information System (INIS)

    Alexandrov, A.; Buonaura, A.; Consiglio, L.; Lellis, G. De; Crescenzo, A. Di; Galati, G.; Lauria, A.; Montesi, M.C.; Tioukov, V.; D'Ambrosio, N.; Vladymyrov, M.

    2016-01-01

    The development of automatic scanning systems was a fundamental issue for large-scale neutrino detectors exploiting nuclear emulsions as particle trackers. Such systems significantly speed up event analysis in emulsion, making experiments with unprecedented statistics feasible. In the early 1990s, R and D programs were carried out by Japanese and European laboratories, leading to increasingly efficient automatic scanning systems. Recent progress in digital signal processing and image acquisition technology allows the construction of new systems with higher performance. In this paper we report the description and the performance of a new-generation scanning system able to operate at the record speed of 84 cm2/hour, based on the Large Angle Scanning System for OPERA (LASSO) software infrastructure developed by the Naples scanning group. This improvement reduces the scanning time by a factor of 4 with respect to the available systems, allowing the readout of huge amounts of nuclear emulsion in a reasonable time. This opens new perspectives for the employment of such detectors in a wider variety of applications.

  13. Is triple contrast computed tomographic scanning useful in the selective management of stab wounds to the back?

    Science.gov (United States)

    McAllister, E; Perez, M; Albrink, M H; Olsen, S M; Rosemurgy, A S

    1994-09-01

    We devised a protocol to prospectively manage stab wounds to the back with the hypothesis that the triple contrast computed tomographic (CT) scan is an effective means of detecting occult injury in these patients. All wounds to the back in hemodynamically stable adults were locally explored. All patients with muscular fascial penetration underwent triple contrast CT scanning utilizing oral, rectal, and IV contrast. Patients did not undergo surgical exploration if their CT scan was interpreted as negative or if the CT scan demonstrated injuries not requiring surgical intervention. Fifty-three patients were entered into the protocol. The time to complete the triple contrast CT scan ranged from 3 to 6 hours at a cost of $1050 for each scan. In 51 patients (96%), the CT scan either had negative findings (n = 31) or showed injuries not requiring exploration (n = 20). These patients did well with nonsurgical management. Two CT scans documented significant injury and led to surgical exploration and therapeutic celiotomies. Although triple contrast CT scanning was able to detect occult injury in patients with stab wounds to the back it did so at considerable cost and the results rarely altered clinical care. Therefore, its routine use in these patients is not recommended.

  14. Quality of pediatric abdominal CT scans performed at a dedicated children's hospital and its referring institutions: a multifactorial evaluation

    International Nuclear Information System (INIS)

    Snow, Aisling; Milliren, Carly E.; Graham, Dionne A.; Callahan, Michael J.; MacDougall, Robert D.; Robertson, Richard L.; Taylor, George A.

    2017-01-01

    Pediatric patients requiring transfer to a dedicated children's hospital from an outside institution may undergo CT imaging as part of their evaluation. Whether this imaging is performed before or after transfer has been shown to affect the radiation dose imparted to the patient. Other quality variables could also be affected by the pediatric experience and expertise of the scanning institution. The aim of this study was to identify differences in quality between abdominal CT scans and reports produced at a dedicated children's hospital and those produced at referring institutions. Fifty consecutive pediatric abdominal CT scans performed at outside institutions were matched (for age, gender and indication) with 50 CT scans performed at a dedicated freestanding children's hospital. We analyzed the scans for technical parameters, report findings, correlation with final clinical diagnosis, and clinical utility. Technical evaluation included use of intravenous and oral contrast agents, anatomical coverage, number of scan phases and size-specific dose estimate (SSDE) for each scan. Outside institution scans were re-reported when the child was admitted to the children's hospital; they were also re-interpreted for this study by children's hospital radiologists who were provided with only the referral information given in the outside institution's report. Anonymized original outside institution reports and children's hospital admission re-reports were analyzed by two emergency medicine physicians for ease of understanding, degree to which the clinical question was answered, and level of confidence in the report. Mean SSDE was lower for children's hospital scans (8.68) than for outside institution scans (13.29, P = 0.03). Concordance with final clinical diagnosis was significantly lower for original outside institution reports (38/48, 79%) than for both the admission and study children's hospital reports (48/50, 96%; P = 0.005). Children's hospital admission reports were rated higher
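
    The concordance comparison above (38/48 vs. 48/50 reports agreeing with the final diagnosis) is a two-proportion problem; a minimal sketch with SciPy's Fisher exact test follows (the study's own P = 0.005 may come from a different test, so treat this as an illustration of the calculation).

```python
# Fisher's exact test comparing report concordance with final diagnosis:
# outside-institution originals (38/48) vs. children's hospital (48/50).
from scipy.stats import fisher_exact

#            concordant  discordant
outside   = [38, 10]
childrens = [48, 2]

odds_ratio, p = fisher_exact([outside, childrens])
print(f"OR = {odds_ratio:.2f}, p = {p:.4f}")
```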

  15. Adherence to Imaging Protocol and Impact of Training on the Interpretation of CEA-Scan (arcitumomab) Imaging for Colorectal Cancer

    International Nuclear Information System (INIS)

    Rubinstein, Michael; VanDaele, Paul; Wegener, William MD; Guardia, Miguel de la

    2004-01-01

    Purpose: To determine whether imaging with CEA-Scan (arcitumomab) immunoscintigraphy is sensitive to technical and interpretative techniques that must be mastered in order to obtain reliable results, and to study the impact of training in reducing the learning curve. Methods: 1) Evaluate the performance of experienced nuclear medicine physicians (Team A), un-blinded, on their initial series of patients against the conclusions of experts (Team B) blinded to any clinical information; 2) Team A is then trained by the expert team in image acquisition, processing and interpretation techniques, as well as in using all clinical information and anatomic studies for comparison; 3) Assess the performance of Team A on a second series of patients; 4) Questionnaires were sent to 65 consecutive physicians trained by the experts to determine whether the learned techniques improved their interpretation of immunoscintigrams. Results: Twenty-three (23) patients with CRC were included, 13 in the pre- and 10 in the post-teaching phase, with a total of 30 clinically confirmed lesions (pathologically proven or demonstrated on follow-up). The clinically confirmed lesions comprised 8 primary tumors, 12 pelvic recurrences and 10 metastatic sites. In the pre-teaching series, Team A correctly identified only 6/19 lesions (32%). In the post-teaching series, Team A found 8/11 lesions (73%), including 4/5 pelvic recurrences (80%), all 3 primary lesions, and 1/3 metastases, which compares favorably to published results. To determine the effect of blinded reading of immunoscintigrams, Team B reviewed the first 13 studies without any clinical information or CT for comparison. Team B found 10/19 lesions (53%) with 4 false positives. Questionnaires were mailed to 65 trained physicians (54 returned): 67% of responders found that training improved their results, 22% experienced mixed results and 11% did not notice any improvement. Conclusion: The lower than expected sensitivity of the blinded expert team confirms that the overall accuracy

  16. Consequences of Not Interpreting Structure Coefficients in Published CFA Research: A Reminder

    Science.gov (United States)

    Graham, James M.; Guthrie, Abbie C.; Thompson, Bruce

    2003-01-01

    Confirmatory factor analysis (CFA) is a statistical procedure frequently used to test the fit of data to measurement models. Published CFA studies typically report factor pattern coefficients. Few reports, however, also present factor structure coefficients, which can be essential for the accurate interpretation of CFA results. The interpretation…

  17. Nuclear material statistical accountancy system

    International Nuclear Information System (INIS)

    Argentest, F.; Casilli, T.; Franklin, M.

    1979-01-01

    The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, it can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based, the computational formula used by NUMSAS in calculating the standard deviation, and the information about nuclear material measurements and the plant measurement system that is required as input for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS, each implemented by a different computer program. The activities carried out in each phase can be summarised as follows: the pre-processing phase; the selection and update phase; the transformation phase; and the computation phase.
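
    The core of such a calculation is propagating measurement-error variances through the material balance (MUF = beginning inventory + receipts - shipments - ending inventory). The sketch below shows the simplest case with independent errors -- an assumption made here for illustration, since NUMSAS's actual statistical model also accounts for correlated errors across accountancy records.

```python
# Simplest-case sketch of a material balance and its standard deviation,
# assuming independent measurement errors (NUMSAS's real model is richer:
# it tracks correlations between measurement errors across records).
import math

def material_balance(begin, receipts, shipments, end):
    """Each argument is a (value_kg, std_dev_kg) pair."""
    muf = begin[0] + receipts[0] - shipments[0] - end[0]
    # Variances add for independent errors, regardless of sign.
    sd = math.sqrt(begin[1]**2 + receipts[1]**2 + shipments[1]**2 + end[1]**2)
    return muf, sd

muf, sd = material_balance((120.0, 0.4), (35.0, 0.2), (30.0, 0.2), (124.5, 0.4))
print(f"MUF = {muf:.2f} kg, sigma(MUF) = {sd:.2f} kg")
```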

  18. Statistical application of groundwater monitoring data at the Hanford Site

    International Nuclear Information System (INIS)

    Chou, C.J.; Johnson, V.G.; Hodges, F.N.

    1993-09-01

    Effective use of groundwater monitoring data requires both statistical and geohydrologic interpretations. At the Hanford Site in south-central Washington state such interpretations are used for (1) detection monitoring, assessment monitoring, and/or corrective action at Resource Conservation and Recovery Act sites; (2) compliance testing for operational groundwater surveillance; (3) impact assessments at active liquid-waste disposal sites; and (4) cleanup decisions at Comprehensive Environmental Response Compensation and Liability Act sites. Statistical tests such as the Kolmogorov-Smirnov two-sample test are used to test the hypothesis that chemical concentrations from spatially distinct subsets or populations are identical within the uppermost unconfined aquifer. Experience at the Hanford Site in applying groundwater background data indicates that background must be treated as a statistical distribution of concentrations, rather than a single value or threshold. The use of a single numerical value as a background-based standard ignores important information and may result in excessive or unnecessary remediation. Appropriate statistical evaluation techniques include the Wilcoxon rank sum test, the Quantile test, 'hot spot' comparisons, and Kolmogorov-Smirnov-type tests. Application of such tests is illustrated with several case studies derived from Hanford groundwater monitoring programs. To avoid possible misuse of such data, an understanding of their limitations is needed. In addition to statistical test procedures, geochemical and hydrologic considerations are integral parts of the decision process. For this purpose a phased approach is recommended that proceeds from the simple to the more complex, and from an overview to detailed analysis.
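
    As an illustration of the background-versus-site comparison described above, the sketch below applies the two-sample Kolmogorov-Smirnov and Wilcoxon rank-sum tests to two sets of concentration measurements; the data are made-up numbers, not Hanford values.

```python
# Comparing a site's concentration distribution against background with
# the two tests named above. Data are fabricated for illustration only.
import numpy as np
from scipy.stats import ks_2samp, ranksums

rng = np.random.default_rng(0)
background = rng.lognormal(mean=1.0, sigma=0.4, size=40)   # e.g. ug/L
downgradient = rng.lognormal(mean=1.3, sigma=0.4, size=25)

ks_stat, ks_p = ks_2samp(background, downgradient)
w_stat, w_p = ranksums(background, downgradient)
print(f"KS: D = {ks_stat:.2f}, p = {ks_p:.4f}")
print(f"Wilcoxon rank-sum: z = {w_stat:.2f}, p = {w_p:.4f}")

# Treating background as a distribution rather than a single threshold
# is exactly the point above: a lone exceedance of max(background) need
# not indicate contamination.
```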

  19. Application of multivariate statistical techniques in microbial ecology.

    Science.gov (United States)

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. A particularly noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
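
    As a small illustration of the exploratory techniques surveyed, the sketch below runs a principal coordinates analysis (classical MDS) on Bray-Curtis distances between samples, a common ordination in microbial ecology; the community matrix is random toy data, and the double-centering step is written out for transparency.

```python
# Principal coordinates analysis (classical MDS) on Bray-Curtis distances.
# Toy data only: 10 samples x 50 taxa.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
counts = rng.poisson(5, size=(10, 50))

D = squareform(pdist(counts, metric="braycurtis"))
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gower matrix
eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]            # largest eigenvalues first
coords = eigvecs[:, order[:2]] * np.sqrt(np.maximum(eigvals[order[:2]], 0))
print(coords)                                # sample positions on PCo1-PCo2
```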

  20. Exploring neighborhood inequality in female breast cancer incidence in Tehran using Bayesian spatial models and a spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Erfan Ayubi

    2017-05-01

    Full Text Available OBJECTIVES The aim of this study was to explore the spatial pattern of female breast cancer (BC) incidence at the neighborhood level in Tehran, Iran. METHODS The present study included all registered incident cases of female BC from March 2008 to March 2011. The raw standardized incidence ratio (SIR) of BC for each neighborhood was estimated by comparing observed cases relative to expected cases. The estimated raw SIRs were smoothed by a Besag, York, and Mollie spatial model and the spatial empirical Bayesian method. The purely spatial scan statistic was used to identify spatial clusters. RESULTS There were 4,175 incident BC cases in the study area from 2008 to 2011, of which 3,080 were successfully geocoded to the neighborhood level. Higher than expected rates of BC were found in neighborhoods located in northern and central Tehran, whereas lower rates appeared in southern areas. The most likely cluster of higher than expected BC incidence involved neighborhoods in districts 3 and 6, with an observed-to-expected ratio of 3.92 (p<0.001), whereas the most likely cluster of lower than expected rates involved neighborhoods in districts 17, 18, and 19, with an observed-to-expected ratio of 0.05 (p<0.001). CONCLUSIONS Neighborhood-level inequality in the incidence of BC exists in Tehran. These findings can serve as a basis for resource allocation and preventive strategies in at-risk areas.
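
    The raw SIRs described above are simply observed-over-expected counts, with expected counts typically derived by indirect standardization; a minimal sketch with made-up neighborhood counts follows, adding a Poisson-based check of which SIRs differ from 1.

```python
# Raw standardized incidence ratios (observed / expected) per neighborhood,
# with a two-sided Poisson p-value per neighborhood. Counts are fabricated.
from scipy.stats import poisson

observed = [12, 30, 7, 3]
expected = [10.2, 14.5, 8.8, 9.1]   # from age-standardized city-wide rates

for o, e in zip(observed, expected):
    sir = o / e
    # two-sided exact-style p-value for H0: observed ~ Poisson(expected)
    p = 2 * min(poisson.cdf(o, e), poisson.sf(o - 1, e))
    print(f"SIR = {sir:.2f} (O={o}, E={e:.1f}), p = {min(p, 1):.3f}")
```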

  1. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  2. Interpreting Impoliteness: Interpreters’ Voices

    Directory of Open Access Journals (Sweden)

    Tatjana Radanović Felberg

    2017-11-01

    Full Text Available Interpreters in the public sector in Norway interpret in a variety of institutional encounters, and the interpreters evaluate the majority of these encounters as polite. However, some encounters are evaluated as impolite, and they pose challenges when it comes to interpreting impoliteness. This issue raises the question of whether interpreters should take a stance on their own evaluation of impoliteness and whether they should interfere in communication. In order to find out more about how interpreters cope with this challenge, in 2014 a survey was sent to all interpreters registered in the Norwegian Register of Interpreters. The survey data were analyzed within the theoretical framework of impoliteness theory, using the notion of moral order as an explanatory tool in a close reading of interpreters' answers. The analysis shows that interpreters reported using a variety of strategies for interpreting impoliteness, including omissions and downtoning. However, the interpreters also gave examples of individual strategies for coping with impoliteness, such as interrupting and postponing interpreting. These strategies border on behavioral strategies and conflict with the Norwegian ethical guidelines for interpreting. In light of the ethical guidelines and actual practice, mapping and discussing the different strategies used by interpreters might heighten interpreters' and interpreter-users' awareness of the role impoliteness can play in institutional interpreter-mediated encounters.

  3. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2006-01-01

    Simplifying the often confusing array of software programs for fitting linear mixed models (LMMs), Linear Mixed Models: A Practical Guide Using Statistical Software provides a basic introduction to primary concepts, notation, software implementation, model interpretation, and visualization of clustered and longitudinal data. This easy-to-navigate reference details the use of procedures for fitting LMMs in five popular statistical software packages: SAS, SPSS, Stata, R/S-plus, and HLM. The authors introduce basic theoretical concepts, present a heuristic approach to fitting LMMs based on bo

  4. Penultimate interpretation.

    Science.gov (United States)

    Neuman, Yair

    2010-10-01

    Interpretation is at the center of psychoanalytic activity. However, interpretation is always challenged by that which is beyond our grasp, the 'dark matter' of our mind, what Bion describes as 'O'. O is one of the most central and difficult concepts in Bion's thought. In this paper, I explain the enigmatic nature of O as a high-dimensional mental space and point to the price one must pay for substituting a low-dimensional symbolic representation for the pre-symbolic lexicon of the emotion-laden, high-dimensional unconscious. This price is reification--objectifying lived experience and draining it of vitality and complexity. In order to address the difficulty of approaching O through symbolization, I introduce the term 'Penultimate Interpretation'--a form of interpretation that seeks 'loopholes' through which the analyst and the analysand may reciprocally save themselves from the curse of reification. Three guidelines for 'Penultimate Interpretation' are proposed and illustrated through an imaginary dialogue. Copyright © 2010 Institute of Psychoanalysis.

  5. A simple and robust statistical framework for planning, analysing and interpreting faecal egg count reduction test (FECRT) studies

    DEFF Research Database (Denmark)

    Denwood, M.J.; McKendrick, I.J.; Matthews, L.

    Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power ... that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple
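
    The four summary statistics a FECRT needs are essentially the counts, means, and variances of the treatment and control egg counts; the sketch below computes the observed efficacy and a rough delta-method interval. This illustrates the general calculation, not the specific test proposed in the abstract; the egg counts are fabricated.

```python
# Observed faecal egg count reduction with a rough large-sample interval.
# Illustrative only; not the authors' exact test. Data are made up.
import numpy as np

control = np.array([220, 180, 350, 90, 400, 260, 310, 150, 275, 205])
treated = np.array([10, 0, 25, 5, 40, 0, 15, 30, 0, 10])

r = treated.mean() / control.mean()
reduction = 1 - r
# Delta-method variance of a ratio of independent sample means:
var_r = r**2 * (treated.var(ddof=1) / (len(treated) * treated.mean()**2)
                + control.var(ddof=1) / (len(control) * control.mean()**2))
half_width = 1.96 * np.sqrt(var_r)
print(f"efficacy = {100*reduction:.1f}% "
      f"(95% CI {100*(1 - r - half_width):.1f}-{100*(1 - r + half_width):.1f}%)")
```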

  6. Reproducibility of trabecular bone score with different scan modes using dual-energy X-ray absorptiometry: a phantom study

    Energy Technology Data Exchange (ETDEWEB)

    Bandirali, Michele; Messina, Carmelo [Universita degli Studi di Milano, Scuola di Specializzazione in Radiodiagnostica, Milano (Italy); Di Leo, Giovanni [Unita di Radiologia, IRCCS Policlinico San Donato, San Donato Milanese (Italy); Pastor Lopez, Maria Juana; Ulivieri, Fabio M. [Servizio di Medicina Nucleare, Ospedale Maggiore, Mineralometria Ossea Computerizzata e Ambulatorio Malattie Metabolismo Minerale e Osseo, Milano (Italy); Mai, Alessandro [Universita degli Studi di Milano, Tecniche di Radiologia Medica, per Immagini e Radioterapia, Milano (Italy); Sardanelli, Francesco [Unita di Radiologia, IRCCS Policlinico San Donato, San Donato Milanese (Italy); Universita degli Studi di Milano, Dipartimento di Scienze Biomediche per la Salute, San Donato Milanese (Italy)

    2014-08-12

    The trabecular bone score (TBS) accounts for bone microarchitecture and is calculated on dual-energy X-ray absorptiometry (DXA) scans. We estimated the reproducibility of the TBS using different scan modes and compared it to the reproducibility of bone mineral density (BMD). A spine phantom was used with a Hologic QDR-Discovery A densitometer. For each scan mode [fast array, array, high definition (HD)], 25 scans were performed automatically without phantom repositioning; a further 25 scans were performed with phantom repositioning. For each scan, the TBS was obtained. The coefficient of variation (CoV) was calculated as the ratio between standard deviation and mean; percent least significant change (LSC%) as 2.8 x CoV; and reproducibility as the complement to 100% of LSC%. Differences among scan modes were assessed using ANOVA. Without phantom repositioning, the mean TBS (mm^-1) was 1.352 (fast array), 1.321 (array), and 1.360 (HD); with phantom repositioning, it was 1.345, 1.332, and 1.362, respectively. Reproducibility of the TBS without phantom repositioning was 97.7% (fast array), 98.3% (array), and 98.2% (HD); with phantom repositioning, it was 97.9%, 98.7%, and 98.4%, respectively. LSC% was ≤2.26%. Differences among scan modes were all statistically significant (p ≤ 0.019). Reproducibility of BMD was 99.1% with all scan modes, while LSC% ranged from 0.86% to 0.91%. The reproducibility error of the TBS was 2-3-fold higher than that of BMD. Although statistically significant, differences in TBS among scan modes were within the highest LSC%. Thus, the three scan modes can be considered interchangeable. (orig.)
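
    The reproducibility arithmetic used above is easy to reproduce: CoV = SD/mean, LSC% = 2.8 x CoV, reproducibility = 100% - LSC%. A minimal sketch follows; the scan series is fabricated for illustration, not taken from the phantom study.

```python
# Precision metrics as defined above: CoV, least significant change (LSC%),
# and reproducibility. The TBS scan series is fabricated for illustration.
import numpy as np

tbs = np.array([1.360, 1.358, 1.363, 1.355, 1.362, 1.359, 1.361, 1.357])

cov = tbs.std(ddof=1) / tbs.mean()       # coefficient of variation
lsc_pct = 2.8 * cov * 100                # least significant change, percent
reproducibility = 100 - lsc_pct
print(f"CoV = {100*cov:.2f}%, LSC% = {lsc_pct:.2f}%, "
      f"reproducibility = {reproducibility:.1f}%")
```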

  7. Some statistical issues important to future developments in human radiation research

    International Nuclear Information System (INIS)

    Vaeth, Michael

    1991-01-01

    Using his two years' experience at the Radiation Effects Research Foundation in Hiroshima, the author tries to outline some of the areas of statistics where methodologies relevant to future developments in human radiation research are likely to be found. Problems related to the statistical analysis of existing data are discussed, together with methodological developments in non-parametric and semi-parametric regression modelling, and the interpretation and presentation of results. (Author)

  8. Misleading reporting and interpretation of results in major infertility journals.

    Science.gov (United States)

    Glujovsky, Demian; Sueldo, Carlos E; Borghi, Carolina; Nicotra, Pamela; Andreucci, Sara; Ciapponi, Agustín

    2016-05-01

    To evaluate the proportion of randomized controlled trials (RCTs) published in top infertility journals indexed on PubMed that reported their results with proper effect estimates and their precision, while correctly interpreting both measures. Cross-sectional study evaluating all the RCTs published in top infertility journals during 2014. Setting, patients, and interventions: not applicable. Main outcome measure: the proportion of RCTs that reported both relative and absolute effect size measures and their precision. Among the 32 RCTs published in 2014 in the top infertility journals reviewed, 37.5% (95% confidence interval [CI], 21.1-56.3) did not mention in their abstracts whether the difference among the study arms was statistically or clinically significant, and only 6.3% (95% CI, 0.8-20.8) used a CI of the absolute difference. Similarly, in the results section, these elements were observed in 28.2% (95% CI, 13.7-46.7) and 15.6% (95% CI, 5.3-32.8), respectively. Only one study clearly expressed the minimal clinically important difference in its methods section, but we found related proxies in 53% (95% CI, 34.7-70.9). None of the studies used CIs to draw conclusions about clinical or statistical significance. We found 13 studies in which the interpretation of the findings could be misleading. Recommended reporting items are underused in top infertility journals, which could lead to misleading interpretations. Authors, reviewers, and editorial boards should emphasize their use to improve reporting quality. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
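
    One of the underused items flagged above -- a confidence interval for the absolute difference between arms -- takes only a few lines to compute. A minimal Wald-interval sketch with hypothetical trial counts follows; exact or score intervals would be preferable for small samples.

```python
# Wald 95% CI for an absolute risk difference between two trial arms.
# Counts are hypothetical; for small samples prefer exact/score intervals.
import math

def risk_difference_ci(events_a, n_a, events_b, n_b, z=1.96):
    p_a, p_b = events_a / n_a, events_b / n_b
    diff = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - z * se, diff + z * se

diff, lo, hi = risk_difference_ci(45, 150, 30, 150)  # e.g. live birth rates
print(f"absolute difference = {diff:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```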

  9. Uptake of 18F-DCFPyL in Paget's Disease of Bone, an Important Potential Pitfall in Clinical Interpretation of PSMA PET Studies.

    Science.gov (United States)

    Rowe, Steven P; Deville, Curtiland; Paller, Channing; Cho, Steve Y; Fishman, Elliot K; Pomper, Martin G; Ross, Ashley E; Gorin, Michael A

    2015-12-01

    Prostate-specific membrane antigen (PSMA)-targeted PET imaging is an emerging technique for evaluating patients with prostate cancer (PCa) in a variety of clinical contexts. As with any new imaging modality, there are interpretive pitfalls that are beginning to be recognized. In this image report, we describe the findings in a 63-year-old male with biochemically recurrent PCa after radical prostatectomy who was imaged with 18F-DCFPyL, a small-molecule inhibitor of PSMA. Diffuse radiotracer uptake was noted throughout the sacrum, corresponding to imaging findings on contrast-enhanced CT, bone scan, and pelvic MRI consistent with Paget's disease of bone. The uptake of 18F-DCFPyL in Paget's disease is most likely due to hyperemia and increased radiotracer delivery. In light of the overlap in patients affected by PCa and Paget's disease, it is important for nuclear medicine physicians and radiologists interpreting PSMA PET/CT scans to be aware of this potential diagnostic pitfall. Correlation with findings on conventional imaging such as diagnostic CT and bone scan can help confirm the diagnosis.

  10. TrueAllele casework on Virginia DNA mixture evidence: computer and manual interpretation in 72 reported criminal cases.

    Directory of Open Access Journals (Sweden)

    Mark W Perlin

    Full Text Available Mixtures are a commonly encountered form of biological evidence that contain DNA from two or more contributors. Laboratory analysis of mixtures produces data signals that usually cannot be separated into distinct contributor genotypes. Computer modeling can resolve the genotypes up to probability, reflecting the uncertainty inherent in the data. Human analysts address the problem by simplifying the quantitative data in a threshold process that discards considerable identification information. Elevated stochastic threshold levels potentially discard more information. This study examines three different mixture interpretation methods. In 72 criminal cases, 111 genotype comparisons were made between 92 mixture items and relevant reference samples. TrueAllele computer modeling was performed on all the evidence samples and documented in DNA match reports that were provided as evidence for each case. Threshold-based Combined Probability of Inclusion (CPI) and stochastically modified CPI (mCPI) analyses were performed as well. TrueAllele's identification information in 101 positive matches was used to assess the reliability of its modeling approach. Comparison was made with 81 CPI and 53 mCPI DNA match statistics that were manually derived from the same data. There were statistically significant differences between the DNA interpretation methods. TrueAllele gave an average match statistic of 113 billion, CPI averaged 6.68 million, and mCPI averaged 140. The computer was highly specific, with a false positive rate under 0.005%. The modeling approach was precise, having a within-group standard deviation of a factor of two. TrueAllele accuracy was indicated by uniformly distributed match statistics over the data set. The computer could make genotype comparisons that were impossible or impractical using manual methods. TrueAllele computer interpretation of DNA mixture evidence is sensitive, specific, precise, accurate and more informative than manual methods

  11. Interobserver variability in interpretation of mammogram

    International Nuclear Information System (INIS)

    Lee, Kyung Jae; Lee, Hae Kyung; Lee, Won Chul; Hwang, In Young; Park, Young Gyu; Jung, Sang Seol; Kim, Hoon Kyo; Kim, Mi Hye; Kim, Hak Hee

    2004-01-01

    The purpose of this study was to evaluate the performance of radiologists in mammographic screening and to analyze interobserver agreement in the interpretation of mammograms. Fifty women were selected as subjects from the patients screened with mammography at two university hospitals. The images were analyzed by five radiologists working independently, without any knowledge of the final diagnosis. Interobserver variation was analyzed using the kappa statistic. There was moderate agreement for the findings of parenchymal pattern (κ = 0.44; 95% CI 0.39-0.49), calcification type (κ = 0.66; 95% CI 0.60-0.72) and calcification distribution (κ = 0.43; 95% CI 0.38-0.48). The mean kappa values ranged from 0.42 to 0.66 for the mass findings. The mean kappa value for the final conclusion was 0.44 (95% CI 0.38-0.51). In general, moderate agreement was evident for all the categories evaluated. The overall agreement was moderate, but there was wide variability in some findings. To improve accuracy and reduce variability among physicians in interpretation, proper training of radiologists and standardization of criteria are essential for breast screening
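
    Pairwise agreement of this kind is usually quantified with Cohen's kappa; a minimal sketch with scikit-learn follows, using made-up final conclusions from two readers (a five-reader study like this one would typically average pairwise kappas or use Fleiss' kappa instead).

```python
# Cohen's kappa for two readers' final conclusions on the same mammograms.
# Ratings are fabricated for illustration.
from sklearn.metrics import cohen_kappa_score

reader1 = ["benign", "benign", "suspicious", "malignant", "benign",
           "suspicious", "benign", "malignant", "suspicious", "benign"]
reader2 = ["benign", "suspicious", "suspicious", "malignant", "benign",
           "benign", "benign", "malignant", "malignant", "benign"]

kappa = cohen_kappa_score(reader1, reader2)
print(f"kappa = {kappa:.2f}")   # 1 = perfect agreement, 0 = chance level
```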

  12. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  13. Targeting Change: Assessing a Faculty Learning Community Focused on Increasing Statistics Content in Life Science Curricula

    Science.gov (United States)

    Parker, Loran Carleton; Gleichsner, Alyssa M.; Adedokun, Omolola A.; Forney, James

    2016-01-01

    Transformation of research in all biological fields necessitates the design, analysis and, interpretation of large data sets. Preparing students with the requisite skills in experimental design, statistical analysis, and interpretation, and mathematical reasoning will require both curricular reform and faculty who are willing and able to integrate…

  14. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops improving his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  15. Born in an infinite universe: A cosmological interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Aguirre, Anthony; Tegmark, Max

    2011-01-01

    We study the quantum measurement problem in the context of an infinite, statistically uniform space, as could be generated by eternal inflation. It has recently been argued that when identical copies of a quantum measurement system exist, the standard projection operators and Born rule method for calculating probabilities must be supplemented by estimates of relative frequencies of observers. We argue that an infinite space actually renders the Born rule redundant, by physically realizing all outcomes of a quantum measurement in different regions, with relative frequencies given by the square of the wave-function amplitudes. Our formal argument hinges on properties of what we term the quantum confusion operator, which projects onto the Hilbert subspace where the Born rule fails, and we comment on its relation to the oft-discussed quantum frequency operator. This analysis unifies the classical and quantum levels of parallel universes that have been discussed in the literature, and has implications for several issues in quantum measurement theory. Replacing the standard hypothetical ensemble of measurements repeated ad infinitum by a concrete decohered spatial collection of experiments carried out in different distant regions of space provides a natural context for a statistical interpretation of quantum mechanics. It also shows how, even for a single measurement, probabilities may be interpreted as relative frequencies in unitary (Everettian) quantum mechanics. We also argue that after discarding a zero-norm part of the wave function, the remainder consists of a superposition of indistinguishable terms, so that arguably 'collapse' of the wave function is irrelevant, and the 'many worlds' of Everett's interpretation are unified into one. Finally, the analysis suggests a 'cosmological interpretation' of quantum theory in which the wave function describes the actual spatial collection of identical quantum systems, and quantum uncertainty is attributable to the

  16. Neutrons and antimony physical measurements and interpretations

    International Nuclear Information System (INIS)

    Smith, A. B.

    2000-01-01

    New experimental information on the elastic and inelastic scattering of ∼4-10 MeV neutrons from elemental antimony is presented. The differential measurements were made at ∼40 or more scattering angles and at incident neutron-energy intervals of ∼0.5 MeV. The present experimental results, together with those previously reported from this laboratory and those found in the literature, are comprehensively interpreted using spherical optical-statistical and dispersive optical models. Direct vibrational processes via core excitation, isospin and shell effects are discussed. Antimony models for applications are proposed and compared with global, regional, and specific models reported in the literature.

  17. Use of keyword hierarchies to interpret gene expression patterns.

    Science.gov (United States)

    Masys, D R; Welsh, J B; Lynn Fink, J; Gribskov, M; Klacansky, I; Corbeil, J

    2001-04-01

    High-density microarray technology permits the quantitative and simultaneous monitoring of thousands of genes. The interpretation challenge is to extract relevant information from this large amount of data. A growing variety of statistical analysis approaches are available to identify clusters of genes that share common expression characteristics, but provide no information regarding the biological similarities of genes within clusters. The published literature provides a potential source of information to assist in interpretation of clustering results. We describe a data mining method that uses indexing terms ('keywords') from the published literature linked to specific genes to present a view of the conceptual similarity of genes within a cluster or group of interest. The method takes advantage of the hierarchical nature of Medical Subject Headings used to index citations in the MEDLINE database, and the registry numbers applied to enzymes.

  18. [How to fit and interpret multilevel models using SPSS].

    Science.gov (United States)

    Pardo, Antonio; Ruiz, Miguel A; San Martín, Rafael

    2007-05-01

    Hierarchic or multilevel models are used to analyse data when cases belong to known groups and sample units are selected both at the individual level and at the group level. In this work, the multilevel models most commonly discussed in the statistical literature are described, explaining how to fit them using the SPSS program (any version from 11 onwards) and how to interpret the outcomes of the analysis. Five particular models are described, fitted, and interpreted: (1) one-way analysis of variance with random effects, (2) regression analysis with means-as-outcomes, (3) one-way analysis of covariance with random effects, (4) regression analysis with random coefficients, and (5) regression analysis with means- and slopes-as-outcomes. All models are explained, trying to make them understandable to researchers in the health and behavioural sciences.
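
    The same models can be fitted outside SPSS; as an illustration, the sketch below fits model (1), a one-way ANOVA with random group effects, via a random-intercept mixed model in Python's statsmodels on simulated clustered data. This is a substitution for the SPSS procedure the paper describes, not a reproduction of it.

```python
# One-way random-effects ANOVA (model 1 above) as a random-intercept
# linear mixed model in statsmodels. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
groups = np.repeat(np.arange(20), 15)                 # 20 groups x 15 cases
group_effect = rng.normal(0, 2.0, size=20)[groups]    # between-group sd = 2
y = 50 + group_effect + rng.normal(0, 5.0, size=groups.size)

df = pd.DataFrame({"y": y, "group": groups})
model = smf.mixedlm("y ~ 1", df, groups=df["group"]).fit()
print(model.summary())   # grand mean plus between-group variance component
```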

  19. Statistical black-hole thermodynamics

    International Nuclear Information System (INIS)

    Bekenstein, J.D.

    1975-01-01

    Traditional methods from statistical thermodynamics, with appropriate modifications, are used to study several problems in black-hole thermodynamics. Jaynes's maximum-uncertainty method for computing probabilities is used to show that the earlier-formulated generalized second law is respected in statistically averaged form in the process of spontaneous radiation by a Kerr black hole discovered by Hawking, and also in the case of a Schwarzschild hole immersed in a bath of black-body radiation, however cold. The generalized second law is used to motivate a maximum-entropy principle for determining the equilibrium probability distribution for a system containing a black hole. As an application we derive the distribution for the radiation in equilibrium with a Kerr hole (it is found to agree with what would be expected from Hawking's results) and the form of the associated distribution among Kerr black-hole solution states of definite mass. The same results are shown to follow from a statistical interpretation of the concept of black-hole entropy as the natural logarithm of the number of possible interior configurations that are compatible with the given exterior black-hole state. We also formulate a Jaynes-type maximum-uncertainty principle for black holes, and apply it to obtain the probability distribution among Kerr solution states for an isolated radiating Kerr hole
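
    The statistical reading of black-hole entropy invoked above can be stated in one line; in units with k_B = 1, and writing N for the number of interior configurations compatible with the observed exterior black-hole state:

```latex
% Statistical interpretation of black-hole entropy, as described above
S_{\mathrm{bh}} = \ln N
```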

  20. On court interpreters' visibility

    DEFF Research Database (Denmark)

    Dubslaff, Friedel; Martinsen, Bodil

    ... of the service they receive. Ultimately, the findings will be used for training purposes. Future - and, for that matter, already practising - interpreters as well as the professional users of interpreters ought to take the reality of the interpreters' work in practice into account when assessing the quality ... on the interpreter's interpersonal role and, in particular, on signs of the interpreter's visibility, i.e. active co-participation. At first sight, the interpreting assignment in question seems to be a short and simple routine task which would not require the interpreter to deviate from the traditional picture

  1. Do Interpreters Indeed Have Superior Working Memory in Interpreting

    Institute of Scientific and Technical Information of China (English)

    于飞

    2012-01-01

    With frequent communication between China and Western countries in the fields of economy, politics, culture, etc., interpreting has become more and more important to people in all walks of life. This paper aims to test the author's hypothesis that professional interpreters have short-term memory similar to that of unprofessional interpreters, but superior working memory. After a review of the literature on consecutive interpreting, short-term memory and working memory, the experiments are designed and the analyses described.

  2. Interpretative approaches to identifying sources of hydrocarbons in complex contaminated environments

    International Nuclear Information System (INIS)

    Sauer, T.C.; Brown, J.S.; Boehm, P.D.

    1993-01-01

    Recent advances in analytical instrument hardware and software have permitted the use of more sophisticated approaches for identifying, or fingerprinting, sources of hydrocarbons in complex-matrix environments. In natural resource damage assessments and contaminated site investigations of both terrestrial and aquatic environments, chemical fingerprinting has become an important interpretative tool. The alkyl homologues of the major polycyclic and heterocyclic aromatic hydrocarbons (e.g., phenanthrenes/anthracenes, dibenzothiophenes, chrysenes) have been found to be the most valuable hydrocarbons for differentiating hydrocarbon sources, but there are other hydrocarbon analytes, such as the chemical biomarkers steranes and triterpanes and the alkyl homologues of benzene, and chemical methodologies, such as scanning UV fluorescence, that have been found useful in certain environments. This presentation focuses on recent data interpretation approaches for hydrocarbon source identification. Selection of appropriate target analytes and data quality requirements is discussed, and example cases, including results from the Arabian Gulf War oil spill, are presented

  3. Effect of Work Improvement for Promotion of Outpatient Satisfaction on CT scan

    International Nuclear Information System (INIS)

    Han, Man Seok; Kim, Tae Hyung; Lee, Seung Youl; Lee, Myeong Goo; Jeon, Min Cheol; Cho, Jae Hwan

    2012-01-01

    Nowadays, most hospitals offer a 'one-stop service' for CT scans: patients can be scanned on the same day they register for the examination. Despite this time convenience, patients remain dissatisfied with long waiting times and unfriendly staff. The objective of this study was to improve patient satisfaction with CT scanning by analyzing the factors causing inconvenience and improving service quality. From April 1 to August 30, 2011, we surveyed the satisfaction of patients who underwent contrast-enhanced abdominal CT, analyzing 89 questionnaires collected before and after the service improvements. Staff kindness, the environment of the CT room, and patients' understanding of the CT examination were assessed by questionnaire, and the waiting time for same-day CT was drawn from medical information statistics; the period before improvement ran from April to June and the period after improvement from July to September. The questionnaires were analyzed with SPSS V15.0. The kindness score improved by 32%, satisfaction with the environment by 52.54%, and understanding of the CT examination by 52.36%, while the waiting time for same-day CT was shortened by 21% through the service enhancement program. Consequently, these efforts can be expected to contribute to increased hospital revenue.

  4. Effect of Work Improvement for Promotion of Outpatient Satisfaction on CT scan

    Energy Technology Data Exchange (ETDEWEB)

    Han, Man Seok; Kim, Tae Hyung [Dept. of Radiological Science, Kangwon National University, Chuncheon (Korea, Republic of); Lee, Seung Youl; Lee, Myeong Goo; Jeon, Min Cheol [Dept. of Radiology, Chungnam National University Hospital, Daejeon (Korea, Republic of); Cho, Jae Hwan [Dept. of Radiological Science, Gyeongsan University College, Daegu (Korea, Republic of)

    2012-03-15

    Nowadays, most hospitals offer a 'one-stop service' for CT scans: patients can be scanned on the same day they register for the examination. Despite this time convenience, patients remain dissatisfied with long waiting times and unfriendly staff. The objective of this study was to improve patient satisfaction with CT scanning by analyzing the factors causing inconvenience and improving service quality. From April 1 to August 30, 2011, we surveyed the satisfaction of patients who underwent contrast-enhanced abdominal CT, analyzing 89 questionnaires collected before and after the service improvements. Staff kindness, the environment of the CT room, and patients' understanding of the CT examination were assessed by questionnaire, and the waiting time for same-day CT was drawn from medical information statistics; the period before improvement ran from April to June and the period after improvement from July to September. The questionnaires were analyzed with SPSS V15.0. The kindness score improved by 32%, satisfaction with the environment by 52.54%, and understanding of the CT examination by 52.36%, while the waiting time for same-day CT was shortened by 21% through the service enhancement program. Consequently, these efforts can be expected to contribute to increased hospital revenue.

  5. Contrast bolus technique with rapid CT scanning

    International Nuclear Information System (INIS)

    Arnold, H.; Kuehne, D.; Rohr, W.; Heller, M.

    1981-01-01

    Twenty-three patients complying with the clinical criteria for brain death were studied by contrast-enhanced CT. In all but one, the great intracranial vessels escaped visualization; accordingly, angiography demonstrated cerebral circulatory arrest. In the remaining case, faint enhancement of the circle of Willis corresponded to angiographic demonstration of the proximal segments of the cerebral arteries. Neither in normal brain nor in dead brain did slow CT scanning disclose any postcontrast increase in parenchymal attenuation. An improved technique is proposed to demonstrate the transit of the contrast bolus by rapid CT with image splitting. If cerebral blood flow is preserved, the grey and white matter will enhance significantly following administration of contrast medium; conversely, the absence of enhancement confirms brain death, even in instances in which the great cerebral vessels are obscured by hemorrhage or other extensive lesions. Two additional cases of brain death were evaluated by rapid CT scanning. As to brain death, the technique obviates the need for angiography or radionuclide angiography, usually applied in prospective organ donors, because its informative content is superior to that of either method. The CT technique described affords a reliable and safe diagnosis of brain death, and can be interpreted easily. (orig.)

  6. Field-based scanning tunneling microscope manipulation of antimony dimers on Si(001)

    NARCIS (Netherlands)

    Rogge, S.; Timmerman, R.H.; Scholte, P.M.L.O.; Geerligs, L.J.; Salemink, H.W.M.

    2001-01-01

    The manipulation of antimony dimers, Sb2, on the silicon (001) surface by means of a scanning tunneling microscope (STM) has been experimentally investigated. Directed hopping of the Sb2 dimers due to the STM tip can dominate over the thermal motion at temperatures between 300 and 500 K. Statistics on

  7. Three Insights from a Bayesian Interpretation of the One-Sided "P" Value

    Science.gov (United States)

    Marsman, Maarten; Wagenmakers, Eric-Jan

    2017-01-01

    P values have been critiqued on several grounds but remain entrenched as the dominant inferential method in the empirical sciences. In this article, we elaborate on the fact that in many statistical models, the one-sided "P" value has a direct Bayesian interpretation as the approximate posterior mass for values lower than zero. The…
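
    The correspondence the authors describe is easy to verify numerically. Below is a minimal sketch, assuming a normal likelihood for an effect estimate with known standard error and a flat prior; the numbers are illustrative, not taken from the article. Under these assumptions the posterior for the effect is normal, and its mass below zero reproduces the one-sided P value.

        # Sketch: one-sided P value vs. posterior mass below zero (hypothetical numbers).
        from scipy.stats import norm

        effect, se = 0.42, 0.25            # hypothetical estimate and standard error
        z = effect / se

        p_one_sided = norm.sf(z)           # P(Z >= z) under H0: effect <= 0
        # With a flat prior, the posterior is N(effect, se^2), so the posterior
        # mass below zero equals the one-sided P value.
        posterior_below_zero = norm.cdf(0.0, loc=effect, scale=se)

        print(f"one-sided P:         {p_one_sided:.4f}")
        print(f"posterior mass < 0:  {posterior_below_zero:.4f}")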

  8. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
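
    For readers who want to reproduce the general workflow, here is a minimal sketch of the CA / PCA / DA sequence using scikit-learn on a synthetic evaluation matrix; the dimensions and data are invented stand-ins for the BSM2 simulation outputs, not the study's data.

        # Sketch: cluster analysis, PCA and discriminant analysis on a synthetic
        # "evaluation matrix" (rows = control strategies, columns = criteria).
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(1)
        X = rng.normal(size=(30, 8))                 # 30 strategies, 8 criteria

        Xs = StandardScaler().fit_transform(X)       # standardize the criteria
        scores = PCA(n_components=3).fit_transform(Xs)   # PCA/FA step

        # CA: natural groups of control strategies with similar behaviour
        labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(scores)

        # DA: which criteria discriminate the groups found by the cluster analysis
        lda = LinearDiscriminantAnalysis().fit(Xs, labels)
        print("discriminant weights per criterion:\n", lda.coef_)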

  9. Applications of quantum entropy to statistics

    International Nuclear Information System (INIS)

    Silver, R.N.; Martz, H.F.

    1994-01-01

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods.
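
    The role of Lagrange multipliers as hyperparameters can be illustrated with the classical (Shannon) special case. The sketch below fits a maximum-entropy distribution on a small discrete support subject to a hypothetical mean constraint, solving for the multiplier by root-finding; the paper's quantum generalization is not attempted here.

        # Sketch: classical maximum-entropy distribution on {0,...,9} with a mean
        # constraint; the Lagrange multiplier is found by root-finding.
        import numpy as np
        from scipy.optimize import brentq

        support = np.arange(10)
        target_mean = 6.0                  # hypothetical moment constraint

        def mean_at(lam):
            w = np.exp(lam * support)      # ME solution is exponential in the constraint
            p = w / w.sum()
            return p @ support

        lam = brentq(lambda l: mean_at(l) - target_mean, -5.0, 5.0)
        p = np.exp(lam * support); p /= p.sum()
        print(f"lambda = {lam:.4f}, achieved mean = {p @ support:.4f}")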

  10. Lung scintigraphy in the diagnosis of pulmonary embolism: current methods and interpretation criteria in clinical practice

    International Nuclear Information System (INIS)

    Skarlovnik, Ajda; Hrastnik, Damjana; Fettich, Jure; Grmek, Marko

    2014-01-01

    In current clinical practice lung scintigraphy is mainly used to exclude pulmonary embolism (PE). Modified diagnostic criteria for planar lung scintigraphy are considered, as newer scintigraphic methods, especially single photon emission computed tomography (SPECT), are becoming more popular. Data of 98 outpatients who underwent planar ventilation/perfusion (V/Q) scintigraphy and 49 outpatients who underwent V/Q SPECT from the emergency department (ED) were retrospectively collected. Planar V/Q images were interpreted according to 0.5 segment mismatch criteria and revised PIOPED II criteria, and perfusion scans according to PISA-PED criteria. V/Q SPECT images were interpreted according to the criteria suggested in the EANM guidelines. The final diagnosis of PE was based on the clinical decision of an attending physician and evaluation of a 12-month follow-up period. Using 0.5 segment mismatch criteria and revised PIOPED II criteria, planar V/Q scans were diagnostic in 93% and 84% of cases, respectively. Among the diagnostic planar scan readings, specificity was 98% for the 0.5 segment mismatch criteria and 99% for the revised PIOPED II criteria. V/Q SPECT showed a sensitivity of 100% and a specificity of 98%, without any non-diagnostic cases. In patients with low pretest probability for PE, planar V/Q scans assessed by 0.5 segment mismatch criteria were diagnostic in 92%, and in 85% using revised PIOPED II criteria, while perfusion scintigraphy without ventilation scans was diagnostic in 80%. Lung scintigraphy yielded diagnostically definitive results and is reliable in ruling out PE in patients from the ED. V/Q SPECT has excellent specificity and sensitivity without any non-diagnostic results. The percentage of non-diagnostic results in planar lung scintigraphy is considerably smaller when 0.5 segment mismatch criteria are used instead of revised PIOPED II criteria. The diagnostic value of perfusion scintigraphy according to PISA-PED criteria is inferior to combined V/Q scintigraphy; the difference is

  11. Head CT scan

    Science.gov (United States)

    ... scan - orbits; CT scan - sinuses; Computed tomography - cranial; CAT scan - brain ... head size in children Changes in thinking or behavior Fainting Headache, when you have certain other signs ...

  12. Counterbalancing and Other Uses of Repeated-Measures Latin-Square Designs: Analyses and Interpretations.

    Science.gov (United States)

    Reese, Hayne W.

    1997-01-01

    Recommends that when repeated-measures Latin-square designs are used to counterbalance treatments across a procedural variable or to reduce the number of treatment combinations given to each participant, effects be analyzed statistically, and that in all uses, researchers consider alternative interpretations of the variance associated with the…

  13. Genre and Interpretation

    DEFF Research Database (Denmark)

    Auken, Sune

    2015-01-01

    Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of the intricacies of that relationship through an analysis of the generic interpretation carrie...

  14. Using assemblage data in ecological indicators: A comparison and evaluation of commonly available statistical tools

    Science.gov (United States)

    Smith, Joseph M.; Mather, Martha E.

    2012-01-01

    Ecological indicators are science-based tools used to assess how human activities have impacted environmental resources. For monitoring and environmental assessment, existing species assemblage data can be used to make these comparisons through time or across sites. An impediment to using assemblage data, however, is that these data are complex and need to be simplified in an ecologically meaningful way. Because multivariate statistics are mathematical relationships, statistical groupings may not make ecological sense and will not have utility as indicators. Our goal was to define a process to select defensible and ecologically interpretable statistical simplifications of assemblage data in which researchers and managers can have confidence. For this, we chose a suite of statistical methods, compared the groupings that resulted from these analyses, identified convergence among groupings, and then interpreted the groupings using species and ecological guilds. When we tested this approach using a statewide stream fish dataset, not all statistical methods worked equally well. For our dataset, logistic regression (Log), detrended correspondence analysis (DCA), cluster analysis (CL), and non-metric multidimensional scaling (NMDS) provided consistent, simplified output. Specifically, the Log, DCA, CL-1, and NMDS-1 groupings were ≥60% similar to each other, overlapped with the fluvial-specialist ecological guild, and contained a common subset of species. Groupings based on number of species (e.g., Log, DCA, CL and NMDS) outperformed groupings based on abundance [e.g., principal components analysis (PCA) and Poisson regression]. Although the specific methods that worked on our test dataset have generality, here we are advocating a process (e.g., identifying convergent groupings with redundant species composition that are ecologically interpretable) rather than the automatic use of any single statistical tool. We summarize this process in step-by-step guidance for the
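
    As a rough illustration of deriving and comparing groupings, the sketch below clusters synthetic presence/absence data with hierarchical cluster analysis and with NMDS followed by k-means, then scores convergence between the two partitions; the data and parameters are invented, not the statewide fish dataset.

        # Sketch: two species groupings from the same distance matrix, plus an
        # agreement score to check convergence among groupings.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist, squareform
        from sklearn.manifold import MDS
        from sklearn.cluster import KMeans
        from sklearn.metrics import adjusted_rand_score

        rng = np.random.default_rng(7)
        sites_by_species = rng.integers(0, 2, size=(40, 25))   # 40 sites x 25 species

        # Group species (columns) by their distribution across sites
        d = pdist(sites_by_species.T, metric="jaccard")
        cl_groups = fcluster(linkage(d, method="average"), t=3, criterion="maxclust")

        # NMDS ordination of the same distances, then k-means on the ordination
        coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                     random_state=7).fit_transform(squareform(d))
        nmds_groups = KMeans(n_clusters=3, n_init=10, random_state=7).fit_predict(coords)

        # Convergence among groupings (1.0 = identical partitions)
        print("agreement:", adjusted_rand_score(cl_groups, nmds_groups))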

  15. Evaluation of the interpretative skills of participants of a limited transthoracic echocardiography training course (H.A.R.T.scan course).

    Science.gov (United States)

    Royse, C F; Haji, D L; Faris, J G; Veltman, M G; Kumar, A; Royse, A G

    2012-05-01

    Limited transthoracic echocardiography performed by treating physicians may facilitate assessment of haemodynamic abnormalities in perioperative and critical care patients. The interpretative skills of one hundred participants who completed an education program in limited transthoracic echocardiography were assessed by reporting five pre-recorded case studies. A high level of agreement was observed in ventricular volume assessment (left 95%, right 96%), systolic function (left 99%, right 96%), left atrial pressure (96%) and haemodynamic state (97%). The highest rate of failure to report an answer (that is, no answer given) was for right ventricular volume and function. For moderate or severe valve lesions, agreement ranged from 90 to 98%. Participants who completed the educational program showed good agreement with experts in the interpretation of valve and ventricular function.

  16. Heart PET scan

    Science.gov (United States)

    ... nuclear medicine scan; Heart positron emission tomography; Myocardial PET scan ... A PET scan requires a small amount of radioactive material (tracer). This tracer is given through a vein (IV), ...

  17. Unusual cause for recurrent Cushing syndrome and its diagnosis by computed tomography and NP-59 radiocholesterol scanning

    International Nuclear Information System (INIS)

    Harris, R.D.; Herwig, K.R.

    1990-01-01

    Cushing syndrome can recur following an adrenalectomy. One of the primary causes is recurrence of adrenal carcinoma, either locally or from metastases. Hyperplasia and hyperfunction of adrenal remnants may also occur if there is pituitary stimulation. We describe a patient in whom recurrent Cushing syndrome developed from small nonmalignant deposits of adrenal tissue in the perirenal adipose tissue following adrenalectomy for a benign adenoma. These deposits were identifiable by computed tomography. A false-negative NP-59 iodocholesterol scan was instructive in pointing out some problems in the interpretation of this type of scan for adrenal tissue.

  18. Engineering Definitional Interpreters

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford

    2013-01-01

    A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...

  19. Functional brain mapping using H{sub 2}{sup 15}O positron emission tomography (I): statistical parametric mapping method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)

    1998-08-01

    We investigated statistical methods for composing a functional brain map of human working memory and the principal factors that affect localization. Repeated PET scans with four successive tasks, consisting of one control and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H{sub 2}{sup 15}O at intervals of 30 minutes. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented in Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and normalized using linear and nonlinear transformation methods. Significant differences between the control and each activation state were estimated at every voxel based on the general linear model. Differences in global counts were removed using analysis of covariance (ANCOVA) with global activity as covariate. Using the mean and variance for each condition, adjusted by ANCOVA, a t-statistic was computed for every voxel. To make the results easier to interpret, t-values were transformed to the standard Gaussian distribution (Z-scores). All subjects carried out the activation and control tests successfully, with an average rate of correct answers of 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for the conjunction of these three tasks. Verbal working memory activated predominantly left-sided structures, whereas visual memory activated the right hemisphere. We conclude that rCBF PET imaging and the statistical parametric mapping method were useful in localizing the brain regions for verbal and visual working memory.
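
    The voxel-wise ANCOVA-adjusted t test and the t-to-Z transformation can be sketched compactly. The following fragment is an illustrative stand-in for the SPM96 pipeline, using random data in place of registered, normalized PET volumes; the design matrix, contrast and dimensions are invented.

        # Sketch: voxel-wise GLM with global activity as covariate (ANCOVA),
        # then conversion of t values to Z scores.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_scans, n_voxels = 24, 1000
        Y = rng.normal(size=(n_scans, n_voxels))         # scans x voxels
        task = np.tile([0, 1], 12)                       # control vs. activation
        g = Y.mean(axis=1)                               # global activity per scan

        X = np.column_stack([np.ones(n_scans), task, g]) # design matrix
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        dof = n_scans - X.shape[1]
        sigma2 = (resid ** 2).sum(axis=0) / dof

        c = np.array([0.0, 1.0, 0.0])                    # contrast: activation effect
        var_c = c @ np.linalg.inv(X.T @ X) @ c
        t = (c @ beta) / np.sqrt(sigma2 * var_c)

        z = stats.norm.ppf(stats.t.cdf(t, dof))          # t -> standard Gaussian Z
        print("max |Z| =", np.abs(z).max())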

  20. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    Science.gov (United States)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used, e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
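
    The LHS step is straightforward to reproduce. Below is a minimal sketch using scipy's quasi-Monte Carlo module; the factor names and ranges are hypothetical, not those screened in the study.

        # Sketch: selecting 50 design/flow boundary-condition sets by Latin
        # Hypercube Sampling over hypothetical factor ranges.
        from scipy.stats import qmc

        factors = ["inlet_flow_m3d", "feed_solids_gL", "tank_depth_m", "radius_m"]
        lower   = [10000.0, 2.0, 3.0, 10.0]
        upper   = [60000.0, 6.0, 6.0, 30.0]

        sampler = qmc.LatinHypercube(d=len(factors), seed=42)
        unit = sampler.random(n=50)               # 50 points in [0, 1)^d
        designs = qmc.scale(unit, lower, upper)   # map to physical factor ranges

        # Each row of `designs` is one boundary-condition set for a CFD run
        print(designs[:3])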

  1. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    Energy Technology Data Exchange (ETDEWEB)

    Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D [University of Pittsburgh Medical Center, Pittsburgh, PA (United States)

    2014-06-01

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover; the same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, the respiratory signals show determinism and nonlinear stationarity, and statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity; the nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on the signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window; a longer respiratory period is conducive to signal reproducibility as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features. Dynamics of breathing, so crucial for
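
    A compact sketch of two of the ingredients, delay-coordinate embedding and sample entropy, is given below for a synthetic breathing-like trace; the embedding and tolerance parameters are illustrative, not those used in the study, and the match counting is simplified.

        # Sketch: delay embedding and sample entropy for a standardized
        # respiratory-like signal.
        import numpy as np

        rng = np.random.default_rng(3)
        t = np.arange(2000) * 0.04                       # ~25 Hz sampling
        x = np.sin(2 * np.pi * t / 4.0) + 0.1 * rng.normal(size=t.size)
        x = (x - x.mean()) / x.std()                     # standardized signal

        def embed(signal, m, tau):
            """Delay vectors: orbits in the embedded space."""
            n = signal.size - (m - 1) * tau
            return np.column_stack([signal[i * tau:i * tau + n] for i in range(m)])

        def sample_entropy(signal, m=2, r=0.2):
            """SampEn = -log(A/B); A, B = template-match counts at m+1 and m."""
            def match_count(dim):
                v = embed(signal, dim, 1)
                d = np.max(np.abs(v[:, None, :] - v[None, :, :]), axis=2)
                return (d[np.triu_indices(v.shape[0], k=1)] < r).sum()
            return -np.log(match_count(m + 1) / match_count(m))

        print("SampEn:", sample_entropy(x[:400]))        # subsample: O(n^2) pairwise cost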

  2. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    International Nuclear Information System (INIS)

    Michalski, D; Huq, M; Bednarz, G; Lalonde, R; Yang, Y; Heron, D

    2014-01-01

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is larger for scans without the cover; the same holds for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, the respiratory signals show determinism and nonlinear stationarity, and statistical tests on surrogate data reveal their nonlinearity. The LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity; the nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on the signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window; a longer respiratory period is conducive to signal reproducibility as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity given its deterministic chaotic nature; this contrasts with measures based on harmonic analysis, which are blind to nonlinear features. Dynamics of breathing, so crucial for

  3. The Nirex Sellafield site investigation: the role of geophysical interpretation

    International Nuclear Information System (INIS)

    Muir Wood, R.; Woo, G.; MacMillan, G.

    1992-01-01

    This report reviews the methods by which geophysical data are interpreted, and used to characterize the 3-D geology of a site for potential storage of radioactive waste. The report focuses on the NIREX site investigation at Sellafield, for which geophysical observations provide a significant component of the structural geological understanding. In outlining the basic technical principles of seismic data processing and interpretation, and borehole logging, an attempt has been made to identify errors, uncertainties, and the implicit use of expert judgement. To enhance the reliability of a radiological probabilistic risk assessment, recommendations are proposed for independent use of the primary NIREX geophysical site investigation data in characterizing the site geology. These recommendations include quantitative procedures for undertaking an uncertainty audit using a combination of statistical analysis and expert judgement. (author)

  4. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    Science.gov (United States)

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  5. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    Science.gov (United States)

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, (15)O-H(2)O and (18)F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
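
    The article fits a piecewise-linear residue by quadratic programming; the sketch below solves a simplified variant, a step-function residue constrained to be nonnegative and nonincreasing (survival-like), by nonnegative least squares. The input function, residue and frame times are synthetic, not real tracer data.

        # Sketch: nonparametric residue estimation from a tissue time-course,
        # tissue = arterial (x) residue, posed as constrained least squares.
        import numpy as np
        from scipy.optimize import nnls

        dt, n = 2.0, 30                                    # 2 s frames, 60 s of data
        t = np.arange(n) * dt
        arterial = t * np.exp(-t / 10.0)                   # synthetic input function
        true_residue = np.exp(-t / 20.0)                   # true survival-like residue

        # Discrete convolution matrix: tissue = dt * (arterial * residue)
        M = dt * np.array([[arterial[i - j] if i >= j else 0.0 for j in range(n)]
                           for i in range(n)])
        tissue = M @ true_residue + 0.01 * np.random.default_rng(5).normal(size=n)

        # Reparameterize residue R = C h with h >= 0, which enforces a nonnegative,
        # nonincreasing estimate; solve by nonnegative least squares.
        C = np.triu(np.ones((n, n)))
        h, _ = nnls(M @ C, tissue)
        residue_hat = C @ h

        print("estimated R(0):", residue_hat[0], "true R(0):", true_residue[0])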

  6. Quality of pediatric abdominal CT scans performed at a dedicated children's hospital and its referring institutions: a multifactorial evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Snow, Aisling [Boston Children's Hospital, Department of Radiology, Boston, MA (United States); Our Lady's Children's Hospital, Department of Radiology, Dublin (Ireland); Milliren, Carly E.; Graham, Dionne A. [Boston Children's Hospital, Program for Patient Safety and Quality, Boston, MA (United States); Callahan, Michael J.; MacDougall, Robert D.; Robertson, Richard L.; Taylor, George A. [Boston Children's Hospital, Department of Radiology, Boston, MA (United States)

    2017-04-15

    Pediatric patients requiring transfer to a dedicated children's hospital from an outside institution may undergo CT imaging as part of their evaluation. Whether this imaging is performed prior to or after transfer has been shown to impact the radiation dose imparted to the patient. Other quality variables could also be affected by the pediatric experience and expertise of the scanning institution. To identify differences in quality between abdominal CT scans and reports performed at a dedicated children's hospital, and those performed at referring institutions. Fifty consecutive pediatric abdominal CT scans performed at outside institutions were matched (for age, gender and indication) with 50 CT scans performed at a dedicated freestanding children's hospital. We analyzed the scans for technical parameters, report findings, correlation with final clinical diagnosis, and clinical utility. Technical evaluation included use of intravenous and oral contrast agents, anatomical coverage, number of scan phases and size-specific dose estimate (SSDE) for each scan. Outside institution scans were re-reported when the child was admitted to the children's hospital; they were also re-interpreted for this study by children's hospital radiologists who were provided with only the referral information given in the outside institution's report. Anonymized original outside institutional reports and children's hospital admission re-reports were analyzed by two emergency medicine physicians for ease of understanding, degree to which the clinical question was answered, and level of confidence in the report. Mean SSDE was lower (8.68) for children's hospital scans, as compared to outside institution scans (13.29, P = 0.03). Concordance with final clinical diagnosis was significantly lower for original outside institution reports (38/48, 79%) than for both the admission and study children's hospital reports (48/50, 96%; P = 0.005). Children

  7. Which statistics should tropical biologists learn?

    Science.gov (United States)

    Loaiza Velásquez, Natalia; González Lutz, María Isabel; Monge-Nájera, Julián

    2011-09-01

    Tropical biologists study the richest and most endangered biodiversity in the planet, and in these times of climate change and mega-extinctions, the need for efficient, good quality research is more pressing than in the past. However, the statistical component in research published by tropical authors sometimes suffers from poor quality in data collection; mediocre or bad experimental design and a rigid and outdated view of data analysis. To suggest improvements in their statistical education, we listed all the statistical tests and other quantitative analyses used in two leading tropical journals, the Revista de Biología Tropical and Biotropica, during a year. The 12 most frequent tests in the articles were: Analysis of Variance (ANOVA), Chi-Square Test, Student's T Test, Linear Regression, Pearson's Correlation Coefficient, Mann-Whitney U Test, Kruskal-Wallis Test, Shannon's Diversity Index, Tukey's Test, Cluster Analysis, Spearman's Rank Correlation Test and Principal Component Analysis. We conclude that statistical education for tropical biologists must abandon the old syllabus based on the mathematical side of statistics and concentrate on the correct selection of these and other procedures and tests, on their biological interpretation and on the use of reliable and friendly freeware. We think that their time will be better spent understanding and protecting tropical ecosystems than trying to learn the mathematical foundations of statistics: in most cases, a well designed one-semester course should be enough for their basic requirements.

  8. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Niu Yantao; Zhao Lei; Zhang Wei; Yan Shulin

    2002-01-01

    Objective: To strengthen the scientific management of the automatic processor and promote quality control (QC) by analyzing the QC management chart for the automatic processor with statistical methods, and by evaluating and interpreting the data and trends of the chart. Method: The speed, contrast and minimum density of a step-wedge film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated, and the data and working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, the authors can judge whether or not it is a symmetric bell-shaped curve. If not, a few extremes overstepping the control limits are possibly pulling the curve to the left or right. If the distribution is normal, the standard deviation (s) is examined: when x-bar ± 2s lies within the upper and lower control limits of the relative performance indexes, the processor worked in a stable state during that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. The authors can deepen the understanding and application of the trend chart and raise quality management to a new level.
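
    The chart logic reduces to a few lines. The sketch below flags daily readings that fall outside x-bar ± 2s control limits; the speed values are made up for illustration.

        # Sketch: evaluating daily processor QC readings against +/- 2s limits.
        import numpy as np

        speed = np.array([1.20, 1.22, 1.19, 1.25, 1.18, 1.21, 1.35, 1.20, 1.22, 1.19])

        xbar, s = speed.mean(), speed.std(ddof=1)
        ucl, lcl = xbar + 2 * s, xbar - 2 * s            # upper/lower control limits

        out_of_control = np.flatnonzero((speed > ucl) | (speed < lcl))
        print(f"mean={xbar:.3f}, s={s:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}")
        print("days out of control:", out_of_control)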

  9. Stable statistical representations facilitate visual search.

    Science.gov (United States)

    Corbett, Jennifer E; Melcher, David

    2014-10-01

    Observers represent the average properties of object ensembles even when they cannot identify individual elements. To investigate the functional role of ensemble statistics, we examined how modulating statistical stability affects visual search. We varied the mean and/or individual sizes of an array of Gabor patches while observers searched for a tilted target. In "stable" blocks, the mean and/or local sizes of the Gabors were constant over successive displays, whereas in "unstable" baseline blocks they changed from trial to trial. Although there was no relationship between the context and the spatial location of the target, observers found targets faster (as indexed by faster correct responses and fewer saccades) as the global mean size became stable over several displays. Building statistical stability also facilitated scanning the scene, as measured by larger saccadic amplitudes, faster saccadic reaction times, and shorter fixation durations. These findings suggest a central role for peripheral visual information, creating context to free resources for detailed processing of salient targets and maintaining the illusion of visual stability.

  10. Bone scan and joint scan of hands and feet in rheumatoid arthritis

    International Nuclear Information System (INIS)

    Carpentier, N.; Verbeke, S.; Perdrisot, R.; Grilo, R.M.; Quenesson, E.; Bonnet, C.; Vergne, P.; Treves, R.; Bertin, P.; Boutros-Toni, F.

    2000-01-01

    The aim of this study was to determine the ability of the joint scan and bone scan of the hands and feet to localize the altered joints in patients with rheumatoid arthritis. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the joint scan were determined in comparison with clinical joint assessment. Fifteen patients (780 joints) were clinically examined (pain and synovitis); on the same day, a bone scan and a joint scan were performed after intravenous injection of 99mTc-oxidronate. Patients were scanned 5 minutes (tissue time, Tt) and 3 1/4 hours (bone time, T0) after administration. The uptake of the bisphosphonate was evaluated with a qualitative method using a grey scale, and the uptake of 99mTc-oxidronate was quantitated using an extra-articular region of interest. The sensitivity, specificity, PPV and NPV of the scan at Tt were 46%, 96%, 85% and 78%; the same parameters were 75%, 66%, 53% and 84% for the scan at T0. The joint scan showed 22% false positives. These false positives could be a consequence of earlier detection of joint alterations by the scan; the joint scan may thus forecast the evolution of the joints in patients with rheumatoid arthritis. (author)
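
    The reported metrics follow from a 2x2 table of scan results against clinical assessment. The sketch below shows the computation; the counts are illustrative, chosen only to reproduce metrics of the same order as those reported for Tt, and are not taken from the paper.

        # Sketch: sensitivity, specificity, PPV and NPV from a 2x2 table.
        def diagnostic_metrics(tp, fp, fn, tn):
            return {"sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp),
                    "ppv": tp / (tp + fp),
                    "npv": tn / (tn + fn)}

        # Hypothetical counts over 780 joints
        print(diagnostic_metrics(tp=120, fp=21, fn=141, tn=498))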

  11. F-18 FDG PET scan findings in patients with pulmonary involvement in the hypereosinophilic syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Hoon; Kim, Tae Hoon; Yun, Mi Jin [College of Medicine, Yonsei University, Seoul (Korea, Republic of)] (and others)

    2005-08-15

    Hypereosinophilic syndrome (HES) is an infiltrative disease of eosinophils affecting multiple organs, including the lung. F-18 2-fluoro-2-deoxyglucose (F-18 FDG) may accumulate at sites of inflammation or infection, making interpretation of whole-body PET scans difficult in patients with cancer. The aim of this study was to evaluate the PET findings of HES with lung involvement and to identify PET features that differentiate lung malignancy from HES with lung involvement. F-18 FDG PET and low-dose chest CT scans were performed for lung cancer screening. Eight patients who showed ground-glass attenuation (GGA) and consolidation on chest CT with peripheral blood eosinophilia were included; patients with a history of parasite infection, allergy or collagen vascular disease were excluded. CT features and FDG PET findings were meticulously evaluated for the distribution of GGA, consolidation and nodules on the CT scan, and for the mean and maximal SUV of abnormalities depicted on the F-18 FDG PET scan. In all eight patients, follow-up chest CT and FDG PET scans were performed one or two weeks after the initial study. F-18 FDG PET identified metabolically active lesions in seven of the eight patients; maximal SUV ranged from 2.8 to 10.6 and mean SUV from 2.2 to 7.2. The remaining patient had a maximal SUV of 1.3. Follow-up FDG PET scans taken one to four weeks later showed a decreased degree of the initially noted FDG uptake or migration of the previously noted abnormal uptake. Lung involvement in HES may thus appear as foci of abnormal uptake on FDG PET, mimicking lung cancer. Follow-up FDG PET and CT scans, looking for migration or resolution of abnormalities and a decrease in SUV, would help to differentiate lung cancer from HES with lung involvement.

  12. F-18 FDG PET scan findings in patients with pulmonary involvement in the hypereosinophilic syndrome

    International Nuclear Information System (INIS)

    Lee, Jae Hoon; Kim, Tae Hoon; Yun, Mi Jin

    2005-01-01

    Hypereosinophilic syndrome (HES) is an infiltrative disease of eosinophils affecting multiple organs, including the lung. F-18 2-fluoro-2-deoxyglucose (F-18 FDG) may accumulate at sites of inflammation or infection, making interpretation of whole-body PET scans difficult in patients with cancer. The aim of this study was to evaluate the PET findings of HES with lung involvement and to identify PET features that differentiate lung malignancy from HES with lung involvement. F-18 FDG PET and low-dose chest CT scans were performed for lung cancer screening. Eight patients who showed ground-glass attenuation (GGA) and consolidation on chest CT with peripheral blood eosinophilia were included; patients with a history of parasite infection, allergy or collagen vascular disease were excluded. CT features and FDG PET findings were meticulously evaluated for the distribution of GGA, consolidation and nodules on the CT scan, and for the mean and maximal SUV of abnormalities depicted on the F-18 FDG PET scan. In all eight patients, follow-up chest CT and FDG PET scans were performed one or two weeks after the initial study. F-18 FDG PET identified metabolically active lesions in seven of the eight patients; maximal SUV ranged from 2.8 to 10.6 and mean SUV from 2.2 to 7.2. The remaining patient had a maximal SUV of 1.3. Follow-up FDG PET scans taken one to four weeks later showed a decreased degree of the initially noted FDG uptake or migration of the previously noted abnormal uptake. Lung involvement in HES may thus appear as foci of abnormal uptake on FDG PET, mimicking lung cancer. Follow-up FDG PET and CT scans, looking for migration or resolution of abnormalities and a decrease in SUV, would help to differentiate lung cancer from HES with lung involvement.

  13. Scanning gamma camera

    International Nuclear Information System (INIS)

    Engdahl, L.W.; Batter, J.F. Jr.; Stout, K.J.

    1977-01-01

    A scanning system for a gamma camera providing for the overlapping of adjacent scan paths is described. A collimator mask having tapered edges provides for a graduated reduction in intensity of radiation received by a detector thereof, the reduction in intensity being graduated in a direction normal to the scanning path to provide a blending of images of adjacent scan paths. 31 claims, 15 figures

  14. Statistical behavior of the tensile property of heated cotton fiber

    Science.gov (United States)

    The temperature dependence of the tensile property of single cotton fibers was studied in the range of 160-300°C using the Favimat test, and its statistical behavior was interpreted in terms of structural changes. The tenacity of the control cotton fiber was well described by the single Weibull distribution,...
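
    A single-Weibull description of tenacity can be reproduced with scipy. The sketch below fits a two-parameter Weibull to synthetic single-fiber tenacity values and checks the fit; the data are simulated, not the Favimat measurements.

        # Sketch: fitting a single Weibull distribution to fiber tenacity values.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        tenacity = stats.weibull_min.rvs(c=5.0, scale=30.0, size=200, random_state=rng)

        shape, loc, scale = stats.weibull_min.fit(tenacity, floc=0)  # 2-parameter fit
        print(f"Weibull shape={shape:.2f}, scale={scale:.2f} cN/tex")

        # Goodness of fit: Kolmogorov-Smirnov test against the fitted distribution
        print(stats.kstest(tenacity, "weibull_min", args=(shape, loc, scale)))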

  15. Histopathological correlation of 11C-choline PET scans for target volume definition in radical prostate radiotherapy

    International Nuclear Information System (INIS)

    Chang, Joe H.; Joon, Daryl Lim; Lee, Sze Ting; Gong, Sylvia J.; Scott, Andrew M.; Davis, Ian D.; Clouston, David; Bolton, Damien; Hamilton, Christopher S.; Khoo, Vincent

    2011-01-01

    Background and purpose: To evaluate the accuracy of 11C-choline PET scans in defining dominant intraprostatic lesions (DILs) for radiotherapy target volume definition. Material and methods: Eight men with prostate cancer who had 11C-choline PET scans prior to radical prostatectomy were studied. Several methods were used to contour the DIL on the PET scans: visual, PET Edge, Region Grow, absolute standardised uptake value (SUV) thresholds and percentage of maximum SUV thresholds. Prostatectomy specimens were sliced in the transverse plane and DILs were delineated on these by a pathologist. These were then compared with the PET scans. The accuracy of correlation was assessed by the Dice similarity coefficient (DSC) and the Youden index. Results: The contouring method resulting in both the highest DSC and the highest Youden index was 60% of the maximum SUV (SUV60%), with values of 0.64 and 0.51, respectively. However, SUV60% was not statistically significantly better than all of the other methods by either measure. Conclusions: Although not statistically significant, SUV60% resulted in the best correlation between 11C-choline PET and pathology amongst all the methods studied. The degree of correlation shown here is consistent with previous studies that have justified using imaging for DIL radiotherapy target volume definition.
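
    The overlap measure is simple to compute once the PET contour and the pathology mask are on a common grid. Below is a minimal sketch of an SUV60%-style contour and the Dice similarity coefficient, using toy arrays rather than co-registered images.

        # Sketch: percentage-of-maximum-SUV contour and Dice similarity coefficient.
        import numpy as np

        rng = np.random.default_rng(2)
        suv = rng.gamma(2.0, 1.5, size=(64, 64))           # synthetic SUV slice
        pathology_mask = suv + rng.normal(0, 1, suv.shape) > 4.0   # toy histology mask

        pet_contour = suv >= 0.60 * suv.max()              # SUV60% contour

        def dice(a, b):
            """DSC = 2 * |A & B| / (|A| + |B|)."""
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        print("DSC:", dice(pet_contour, pathology_mask))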

  16. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data: the probability distribution of charged particle scattering is determined using a statistical multiple scattering model, and a substantially maximum likelihood estimate of the object volume scattering density is obtained with an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
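
    The generic form of such an ML/EM reconstruction step, for Poisson-distributed data y ~ Poisson(Ax), is the familiar multiplicative update. The sketch below uses small random stand-ins for the system matrix and measurements, not muon-tracker data or the patent's scattering model.

        # Sketch: multiplicative ML/EM updates for a Poisson linear model.
        import numpy as np

        rng = np.random.default_rng(4)
        A = rng.uniform(0.0, 1.0, size=(50, 20))       # system (projection) matrix
        x_true = rng.uniform(0.5, 2.0, size=20)        # density voxels
        y = rng.poisson(A @ x_true)                    # measured counts

        x = np.ones(20)                                # flat initial estimate
        sens = A.sum(axis=0)                           # sensitivity (back-projected 1s)
        for _ in range(100):
            x *= (A.T @ (y / np.maximum(A @ x, 1e-12))) / sens

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))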

  17. Interpretation of Spirometry: Selection of Predicted Values and Defining Abnormality.

    Science.gov (United States)

    Chhabra, S K

    2015-01-01

    Spirometry is the most frequently performed investigation to evaluate pulmonary function. It provides clinically useful information on the mechanical properties of the lung and the thoracic cage and aids in taking management-related decisions in a wide spectrum of diseases and disorders. Few measurements in medicine are so dependent on factors related to equipment, operator and the patient. Good spirometry requires quality assured measurements and a systematic approach to interpretation. Standard guidelines on the technical aspects of equipment and their calibration as well as the test procedure have been developed and revised from time to time. Strict compliance with standardisation guidelines ensures quality control. Interpretation of spirometry data is based on only two basic measurements--the forced vital capacity (FVC) and the forced expiratory volume in 1 second (FEV1)--and their ratio, FEV1/FVC. A meaningful and clinically useful interpretation of the measured data requires a systematic approach and consideration of several important issues. Central to interpretation is the understanding of the development and application of prediction equations. Selection of prediction equations that are appropriate for the ethnic origin of the patient is vital to avoid erroneous interpretation. Defining abnormal values is a debatable but critical aspect of spirometry. A statistically valid definition of the lower limits of normal has been advocated as the better method over the more commonly used approach of defining abnormality as a fixed percentage of the predicted value. Spirometry rarely provides a specific diagnosis. Examination of the flow-volume curve and the measured data provides information to define patterns of ventilatory impairment. Spirometry must be interpreted in conjunction with clinical information including results of other investigations.
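
    The contrast between the fixed-percentage rule and the lower limit of normal (LLN) is easy to make concrete. In the sketch below, the prediction equation and its residual standard deviation are hypothetical; real interpretation would use published reference equations appropriate to the patient. The LLN is taken as the 5th percentile of the reference distribution, predicted - 1.645 × RSD.

        # Sketch: "80% of predicted" rule vs. statistically defined LLN for FEV1.
        def fev1_predicted(age_yr, height_cm):
            return -1.0 + 0.035 * height_cm - 0.025 * age_yr   # hypothetical equation

        RSD = 0.45   # hypothetical residual standard deviation of the equation (L)

        def interpret(fev1_measured, age_yr, height_cm):
            pred = fev1_predicted(age_yr, height_cm)
            lln = pred - 1.645 * RSD        # 5th percentile of the reference range
            pct = 100.0 * fev1_measured / pred
            return {"pct_predicted": round(pct, 1),
                    "below_80pct": pct < 80.0,      # fixed-percentage rule
                    "below_lln": fev1_measured < lln}

        print(interpret(fev1_measured=2.6, age_yr=60, height_cm=170))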

  18. Lineament interpretation. Short review and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Tiren, Sven (GEOSIGMA AB (Sweden))

    2010-11-15

    interpretation, and the skill of the interpreter. Images and digital terrain models that display the relief of the studied area should, if possible, be illuminated in at least four directions to reduce biases regarding the orientation of structures. The resolution in the source data should be fully used and extrapolation of structures avoided in the primary interpretation of the source data. The interpretation of lineaments should be made in steps: a. Interpretation of each data set/image/terrain model is conducted separately; b. Compilation of all interpretations in a base lineament map and classification of the lineaments; and c. Construction of thematical maps, e.g. structural maps, rock block maps, and statistical presentation of lineaments. Generalisations and extrapolations of lineaments/structures may be made when producing the thematical maps. The construction of thematical maps should be supported by auxiliary information (geological and geomorphologic data and information on human impact in the area). Inferred tectonic structures should be controlled in the field.

  19. Lineament interpretation. Short review and methodology

    International Nuclear Information System (INIS)

    Tiren, Sven

    2010-11-01

    the interpreter. Images and digital terrain models that display the relief of the studied area should, if possible, be illuminated in at least four directions to reduce biases regarding the orientation of structures. The resolution in the source data should be fully used and extrapolation of structures avoided in the primary interpretation of the source data. The interpretation of lineaments should be made in steps: a. Interpretation of each data set/image/terrain model is conducted separately; b. Compilation of all interpretations in a base lineament map and classification of the lineaments; and c. Construction of thematical maps, e.g. structural maps, rock block maps, and statistical presentation of lineaments. Generalisations and extrapolations of lineaments/structures may be made when producing the thematical maps. The construction of thematical maps should be supported by auxiliary information (geological and geomorphologic data and information on human impact in the area). Inferred tectonic structures should be controlled in the field.

  20. Analytical and statistical analysis of elemental composition of lichens

    International Nuclear Information System (INIS)

    Calvelo, S.; Baccala, N.; Bubach, D.; Arribere, M.A.; Riberio Guevara, S.

    1997-01-01

    The elemental composition of lichens from remote regions of southern South America has been studied with analytical and statistical techniques to determine whether the values obtained reflect species, growth forms or habitat characteristics. The enrichment factors were calculated, discriminated by species and collection site, and compared with data available in the literature. The elemental concentrations were standardized and compared for the different species. The information was statistically processed: a cluster analysis was performed using the first three principal axes of the PCA, and the three groups formed are presented. Their relationship with the species, collection sites and lichen growth forms is interpreted. (author)
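
    The enrichment factor is conventionally computed against a crustal reference element, EF = (C_x / C_ref)_sample / (C_x / C_ref)_crust. A minimal sketch, with Fe as a hypothetical reference element and illustrative concentrations rather than the study's measurements:

        # Sketch: enrichment factors for a lichen sample relative to crustal values.
        crust = {"Fe": 56300.0, "Zn": 70.0, "Pb": 12.5}      # mg/kg, illustrative
        lichen = {"Fe": 1200.0, "Zn": 45.0, "Pb": 8.0}       # one sample, mg/kg

        def enrichment_factor(element, sample, ref="Fe"):
            return (sample[element] / sample[ref]) / (crust[element] / crust[ref])

        for el in ("Zn", "Pb"):
            print(el, round(enrichment_factor(el, lichen), 1))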