Equivalent statistics and data interpretation.
Francis, Gregory
2017-08-01
Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
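The equivalence described above can be sketched numerically (the samples, sizes, and seed below are invented for illustration): given the sample sizes, Cohen's d is a deterministic function of t, and the two-sided p value can be inverted back to |t|, so the statistics carry the same information about the data.

```python
import numpy as np
from scipy import stats

# Synthetic two-sample data; only the sample sizes matter for the argument.
rng = np.random.default_rng(0)
a = rng.normal(0.5, 1.0, size=30)
b = rng.normal(0.0, 1.0, size=25)
n1, n2 = len(a), len(b)
df = n1 + n2 - 2

t, p = stats.ttest_ind(a, b)      # pooled-variance two-sample t test
d = t * np.sqrt(1 / n1 + 1 / n2)  # Cohen's d recovered from t and the n's alone

# Inverting the two-sided p value recovers |t| exactly: no information is lost.
t_back = stats.t.isf(p / 2, df)
assert np.isclose(abs(t), t_back)
```

The same inversion works in reverse from a confidence interval of d, which is the sense in which the abstract calls these summaries mathematically equivalent.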
Statistical interpretation of geochemical data
International Nuclear Information System (INIS)
Carambula, M.
1990-01-01
Statistical results have been obtained from a geochemical survey covering the following four aerial-photograph sheets: Zapican, Carape, Las Canias, Alferez. A total of 3020 samples were analyzed for 22 chemical elements using plasma emission spectrometry.
Statistics and Data Interpretation for Social Work
Rosenthal, James
2011-01-01
"Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down-to-earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication." - Praise for the First Edition. Written by a social worker for social work students, this is a nuts-and-bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes
International Nuclear Information System (INIS)
Tadaki, Kohtaro
2010-01-01
The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed by our former works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced the notion of thermodynamic quantities, such as partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T), into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. Furthermore, we showed that this situation holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T gives a sufficient condition for T ∈ (0,1) to satisfy the condition that the partial randomness of T equals T. In this paper, based on a physical argument on the same level of mathematical rigor as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT that realizes a perfect correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble in the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.
Interpretation of commonly used statistical regression models.
Kasza, Jessica; Wolfe, Rory
2014-01-01
A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
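A minimal sketch of the coefficient interpretation discussed above, for simple linear regression; the x (pack-years of smoking) and y (FEV1 in litres) values are invented for illustration, not data from the cited study.

```python
# Invented respiratory-health-flavored data, for illustration only.
x = [0, 5, 10, 20, 30, 40]        # pack-years of smoking
y = [4.0, 3.8, 3.7, 3.3, 3.0, 2.6]  # FEV1 (L)

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
sxx = sum((xi - mx) ** 2 for xi in x)
slope = sxy / sxx             # expected change in FEV1 per additional pack-year
intercept = my - slope * mx   # expected FEV1 at zero pack-years
```

The slope is the model's entire interpretive content here: each unit increase in the predictor is associated with a `slope` change in the expected outcome, holding nothing else (in this single-predictor case) constant.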
The Statistical Interpretation of Entropy: An Activity
Timmberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
Making Statistical Data More Easily Accessible on the Web Results of the StatSearch Case Study
Rajman, M; Boynton, I M; Fridlund, B; Fyhrlund, A; Sundgren, B; Lundquist, P; Thelander, H; Wänerskär, M
2005-01-01
In this paper we present the results of the StatSearch case study, which aimed at providing enhanced access to statistical data available on the Web. In the scope of this case study we developed a prototype of an information access tool combining a query-based search engine with semi-automated navigation techniques exploiting the hierarchical structuring of the available data. This tool enables better control of the information retrieval, improving the quality and ease of access to statistical information. The central part of the presented StatSearch tool is an algorithm for automated navigation through a tree-like hierarchical document structure. The algorithm relies on the computation of query-related relevance score distributions over the available database to identify the most relevant clusters in the data structure. These most relevant clusters are then proposed to the user for navigation or, alternatively, serve as the basis for the automated navigation process. Several appro...
Combinatorial interpretation of Haldane-Wu fractional exclusion statistics.
Aringazin, A K; Mazhitov, M I
2002-08-01
Assuming that the maximal allowed number of identical particles in a state is an integer parameter, q, we derive the statistical weight and analyze the associated equation that defines the statistical distribution. The derived distribution covers the Fermi-Dirac and Bose-Einstein ones in the particular cases q = 1 and q → ∞ (n_i/q → 1), respectively. We show that the derived statistical weight provides a natural combinatorial interpretation of Haldane-Wu fractional exclusion statistics, and present exact solutions of the distribution equation.
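The limiting behavior described above can be checked numerically. The occupation number below is the Gentile-type distribution for at most q particles per state, which matches the abstract's setup; treating it as the derived distribution is an assumption made here for illustration. It reduces to Fermi-Dirac at q = 1 and approaches Bose-Einstein for large q.

```python
import math

def gentile_n(x, q):
    """Mean occupation with at most q particles per state
    (Gentile-type distribution; x = (eps - mu) / kT, x > 0)."""
    return 1 / math.expm1(x) - (q + 1) / math.expm1((q + 1) * x)

x = 0.7
fermi = 1 / (math.exp(x) + 1)   # Fermi-Dirac occupation
bose = 1 / math.expm1(x)        # Bose-Einstein occupation

assert math.isclose(gentile_n(x, 1), fermi)      # q = 1 recovers Fermi-Dirac
assert math.isclose(gentile_n(x, 100), bose)     # large q approaches Bose-Einstein
```

For q = 100 the correction term (q+1)/ (e^{(q+1)x} - 1) is of order 10^-29 at x = 0.7, so the large-q limit is already numerically indistinguishable from Bose-Einstein.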
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Advanced statistics to improve the physical interpretation of atomization processes
International Nuclear Information System (INIS)
Panão, Miguel R.O.; Radu, Lucian
2013-01-01
Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by the drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach takes into account eventual multimodality and heterogeneities in drop size distributions. It therefore provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in heterogeneous data, improving the physical interpretation of atomization processes. Moreover, it overcomes the limitations of analyzing spray droplet characteristics through moments alone, in particular the masking of different mechanisms of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
Statistical transformation and the interpretation of inpatient glucose control data.
Saulnier, George E; Castro, Janna C; Cook, Curtiss B
2014-03-01
To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
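The two steps the abstract pairs, a Box-Cox transformation toward normality followed by an EWMA control chart, can be sketched as follows. The lognormal data standing in for POC-BG values, the weight w = 0.2, and the 3-sigma-style limits are conventional illustrative choices, not the paper's.

```python
import numpy as np
from scipy import stats

# Synthetic right-skewed glucose values (mg/dL), invented for illustration.
rng = np.random.default_rng(1)
glucose = rng.lognormal(mean=5.0, sigma=0.25, size=300)

# Box-Cox: lambda is chosen by maximum likelihood to normalize the data.
transformed, lam = stats.boxcox(glucose)

# EWMA statistic z_t = w*x_t + (1-w)*z_{t-1} with time-varying 3-sigma limits.
w = 0.2
mu, sigma = transformed.mean(), transformed.std(ddof=1)
z = mu
out_of_control = []
for i, xval in enumerate(transformed):
    z = w * xval + (1 - w) * z
    halfwidth = 3 * sigma * np.sqrt(w / (2 - w) * (1 - (1 - w) ** (2 * (i + 1))))
    if abs(z - mu) > halfwidth:
        out_of_control.append(i)
```

Because the EWMA limits depend on the standard deviation of the monitored series, running the chart on transformed rather than raw values can flag different samples, which is exactly the discrepancy the study reports.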
Workplace Statistical Literacy for Teachers: Interpreting Box Plots
Pierce, Robyn; Chick, Helen
2013-01-01
As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are increasingly expected to interpret and apply complex data about student and school performance, and yet it is not clear that they always have the…
Interpreting Statistical Findings A Guide For Health Professionals And Students
Walker, Jan
2010-01-01
This book is aimed at those studying and working in the field of health care, including nurses and the professions allied to medicine, who have little prior knowledge of statistics but for whom critical review of research is an essential skill.
Variation in reaction norms: Statistical considerations and biological interpretation.
Morrissey, Michael B; Liefting, Maartje
2016-09-01
Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
Theoretical, analytical, and statistical interpretation of environmental data
International Nuclear Information System (INIS)
Lombard, S.M.
1974-01-01
The reliability of data from radiochemical analyses of environmental samples cannot be determined from nuclear counting statistics alone. The rigorous application of the principles of propagation of errors, an understanding of the physics and chemistry of the species of interest in the environment, and the application of information from research on the analytical procedure are all necessary for a valid estimation of the errors associated with analytical results. The specific case of the determination of plutonium in soil is considered in terms of analytical problems and data reliability. (U.S.)
A statistical model for interpreting computerized dynamic posturography data
Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.
2002-01-01
Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
Interpretation of the results of statistical measurements. [search for basic probability model
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
Statistics translated a step-by-step guide to analyzing and interpreting data
Terrell, Steven R
2012-01-01
Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying independent and dependent variables, and selecting and interpreting appropriate statistical tests. All techniques are demonstrated both manually and with the help of SPSS software. The book provides students and others who may need to read and interpret sta
Jieyi Li; Arandjelovic, Ognjen
2017-07-01
Computer science and machine learning in particular are increasingly lauded for their potential to aid medical practice. However, the highly technical nature of the state of the art techniques can be a major obstacle in their usability by health care professionals and thus, their adoption and actual practical benefit. In this paper we describe a software tool which focuses on the visualization of predictions made by a recently developed method which leverages data in the form of large scale electronic records for making diagnostic predictions. Guided by risk predictions, our tool allows the user to explore interactively different diagnostic trajectories, or display cumulative long term prognostics, in an intuitive and easily interpretable manner.
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
International Nuclear Information System (INIS)
Lan, B.L.
2001-01-01
An alternative interpretation to Bohm's 'quantum force' and 'active information' is proposed. Numerical evidence is presented, which suggests that the time series of Bohm's 'quantum force' evaluated at the Bohmian position for non-stationary quantum states are typically non-Gaussian stable distributed with a flat power spectrum in classically chaotic Hamiltonian systems. An important implication of these statistical properties is briefly mentioned. (orig.)
Alternative interpretations of statistics on health effects of low-level radiation
International Nuclear Information System (INIS)
Hamilton, L.D.
1983-01-01
Four examples of the interpretation of statistics of data on low-level radiation are reviewed: (a) genetic effects of the atomic bombs at Hiroshima and Nagasaki, (b) cancer at Rocky Flats, (c) childhood leukemia and fallout in Utah, and (d) cancer among workers at the Portsmouth Naval Shipyard. Aggregation of data, adjustment for age, and other problems related to the determination of health effects of low-level radiation are discussed. Troublesome issues related to post hoc analysis are considered.
Farrell, Mary Beth
2018-06-01
This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic, the higher the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around the mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution, taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
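The Cohen κ-statistic described above can be computed directly from two readers' calls. The reads below are invented for illustration; κ compares observed agreement with the agreement expected by chance from each reader's marginal rates.

```python
def cohen_kappa(r1, r2):
    """Agreement beyond chance between two readers' categorical calls."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    p_obs = sum(x == y for x, y in zip(r1, r2)) / n   # raw percent agreement
    p_chance = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_chance) / (1 - p_chance)

# Invented scan reads: 8/10 raw agreement, but kappa corrects for chance.
reader1 = ["pos", "pos", "neg", "neg", "pos", "neg", "neg", "pos", "neg", "neg"]
reader2 = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "neg"]
kappa = cohen_kappa(reader1, reader2)
```

Here raw agreement is 80%, yet κ is only about 0.58, illustrating why the article calls κ the more robust measure: much of the raw agreement is expected by chance alone.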
Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization
Eroglu, Sertac
2014-10-01
The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations exhibiting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through a properly defined structure-dependent parameter and the associated energy states.
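For reference, the classical (untransformed) Menzerath-Altmann law itself can be sketched as y(x) = a · x^(−b) · exp(−c·x): mean constituent size y shrinks as construct size x grows. The parameter values below are invented for illustration, not fitted to any corpus.

```python
import math

def menzerath(x, a, b, c):
    """Menzerath-Altmann law: mean constituent size as a function of
    construct size, y(x) = a * x**(-b) * exp(-c * x).
    Parameters here are illustrative, not estimated."""
    return a * x ** (-b) * math.exp(-c * x)

sizes = range(1, 9)  # e.g., word length in syllables
y = [menzerath(x, a=4.0, b=0.3, c=0.05) for x in sizes]

# For positive b and c the law is monotonically decreasing:
# the longer the word, the shorter its syllables on average.
assert all(y1 > y2 for y1, y2 in zip(y, y[1:]))
```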
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
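The control-chart logic described above can be sketched minimally: estimate limits from an in-control baseline, then flag new points outside them. The baseline values are invented; real SPC practice for individuals charts typically derives limits from the average moving range rather than the sample SD used here.

```python
def control_limits(baseline):
    """Shewhart-style limits from an in-control baseline: mean +/- 3 SD.
    (Simplified; individuals charts usually use moving-range-based sigma.)"""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

# Invented in-control measurements used to set the limits (phase I).
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.0, 10.3]
lo, hi = control_limits(baseline)

# New observations are monitored against the fixed limits (phase II).
new_points = [10.0, 9.9, 11.2]
signals = [x for x in new_points if not lo <= x <= hi]
```

An expert system like the AISC prototype automates exactly this kind of rule, plus pattern rules (runs, trends) that a single-point threshold misses.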
Misuse of statistics in the interpretation of data on low-level radiation
International Nuclear Information System (INIS)
Hamilton, L.D.
1982-01-01
Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds
Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data
Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.
2018-01-01
One of the foundations for high school students in statistics is to be able to read data and to present data in the form of tables and diagrams along with their interpretation. The purpose of this study is to describe high school students' competencies in reading, interpreting and presenting data. Subjects consisted of male and female students who had high levels of mathematical ability. Data were collected through task formulation and analyzed by data reduction, presentation and verification. Results showed that the students read the data based on explicit explanations on the diagram, such as explaining the points in the diagram as the relation between the x and y axes and determining the simple trend of a graph, including the maximum and minimum points. In interpreting and summarizing the data, both subjects paid attention to general data trends and used them to predict increases or decreases in data. The male student estimated the (n+1)th value of the weight data by using the mode of the data, while the female student estimated the weight by using the average. The male tended not to consider the characteristics of the data, while the female considered them more carefully.
Saulnier, George E; Castro, Janna C; Cook, Curtiss B
2014-05-01
Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
Gerrits, Reinie G; Kringos, Dionne S; van den Berg, Michael J; Klazinga, Niek S
2018-03-07
Policy-makers, managers, scientists, patients and the general public are confronted daily with figures on health and healthcare through public reporting in newspapers, webpages and press releases. However, information on the key characteristics of these figures necessary for their correct interpretation is often not adequately communicated, which can lead to misinterpretation and misinformed decision-making. The objective of this research was to map the key characteristics relevant to the interpretation of figures on health and healthcare, and to develop a Figure Interpretation Assessment Tool-Health (FIAT-Health) through which figures on health and healthcare can be systematically assessed, allowing for a better interpretation of these figures. The abovementioned key characteristics of figures on health and healthcare were identified through systematic expert consultations in the Netherlands on four topic categories of figures, namely morbidity, healthcare expenditure, healthcare outcomes and lifestyle. The identified characteristics were used as a frame for the development of the FIAT-Health. Development of the tool and its content was supported and validated through regular review by a sounding board of potential users. Identified characteristics relevant for the interpretation of figures in the four categories relate to the figures' origin, credibility, expression, subject matter, population and geographical focus, time period, and underlying data collection methods. The characteristics were translated into a set of 13 dichotomous and 4-point Likert scale questions constituting the FIAT-Health, and two final assessment statements. Users of the FIAT-Health were provided with a summary overview of their answers to support a final assessment of the correctness of a figure and the appropriateness of its reporting. FIAT-Health can support policy-makers, managers, scientists, patients and the general public to systematically assess the quality of publicly reported
International Nuclear Information System (INIS)
Shafieloo, Arman
2012-01-01
By introducing Crossing functions and hyper-parameters I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or assume any particular form of parametrization for the cosmological quantities like luminosity distance, Hubble parameter or equation of state of dark energy. Instead, hyper-parameters of Crossing functions perform as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters, hence the issue of dark energy parametrization is resolved. It will also be shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method in testing cosmological models dealing with data with high uncertainties.
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Braun, Stefan; Pokorná, Šárka; Šachl, Radek; Hof, Martin; Heerklotz, Heiko; Hoernke, Maria
2018-01-23
The mode of action of membrane-active molecules, such as antimicrobial, anticancer, cell penetrating, and fusion peptides and their synthetic mimics, transfection agents, drug permeation enhancers, and biological signaling molecules (e.g., quorum sensing), involves either the general or local destabilization of the target membrane or the formation of defined, rather stable pores. Some effects aim at killing the cell, while others need to be limited in space and time to avoid serious damage. Biological tests reveal translocation of compounds and cell death but do not provide a detailed, mechanistic, and quantitative understanding of the modes of action and their molecular basis. Model membrane studies of membrane leakage have been used for decades to tackle this issue, but their interpretation in terms of biology has remained challenging and often quite limited. Here we compare two recent, powerful protocols to study model membrane leakage: the microscopic detection of dye influx into giant liposomes and time-correlated single photon counting experiments to characterize dye efflux from large unilamellar vesicles. A statistical treatment of both data sets does not only harmonize apparent discrepancies but also makes us aware of principal issues that have been confusing the interpretation of model membrane leakage data so far. Moreover, our study reveals a fundamental difference between nano- and microscale systems that needs to be taken into account when conclusions about microscale objects, such as cells, are drawn from nanoscale models.
Directory of Open Access Journals (Sweden)
Elżbieta Biernat
2014-12-01
Full Text Available Background: The aim of this paper is to assess whether basic descriptive statistics is sufficient to interpret the data on physical activity of Poles within the occupational domain of life. Material and Methods: The study group consisted of 964 randomly selected Polish working professionals. The long version of the International Physical Activity Questionnaire (IPAQ) was used. Descriptive statistics included characteristics of variables using: mean (M), median (Me), maximal and minimal values (max–min), standard deviation (SD) and percentile values. Statistical inference was based on the comparison of variables with the significance level of 0.05 (Kruskal-Wallis and Pearson's Chi² tests). Results: Occupational physical activity (OPA) was declared by 46.4% of respondents (vigorous – 23.5%, moderate – 30.2%, walking – 39.5%). The total OPA amounted to 2751.1 MET-min/week (Metabolic Equivalent of Task) with a very high standard deviation (SD = 5302.8) and max = 35 511 MET-min/week. It concerned different types of activities. Approximately 10% (the 90th percentile) overstated the average. However, there was no significant difference depending on the character of the profession or the type of activity. The average time of sitting was 256 min/day. As many as 39% of the respondents met the World Health Organization standards only due to OPA (42.5% of white-collar workers, 38% of administrative and technical employees and only 37.9% of physical workers). Conclusions: In the data analysis it is necessary to define quantiles to provide a fuller picture of the distributions of OPA in MET-min/week. It is also crucial to update the guidelines for data processing and analysis of the long version of IPAQ. It seems that 16 h of activity/day is not a sufficient criterion for excluding the results from further analysis. Med Pr 2014;65(6):743–753
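The descriptive statistics this abstract lists (M, Me, max–min, SD, percentiles) can be sketched with Python's standard library alone. The MET-min/week values below are hypothetical, chosen only to mimic the heavy right skew the authors describe:

```python
import statistics

# Hypothetical weekly occupational physical activity values (MET-min/week);
# strongly right-skewed, as reported for OPA data.
opa = [0, 0, 120, 240, 480, 594, 693, 990, 1386, 2772, 5544, 16632]

m = statistics.mean(opa)
me = statistics.median(opa)
sd = statistics.stdev(opa)

# Deciles: the 9 cut points splitting the data into 10 equal-count groups.
deciles = statistics.quantiles(opa, n=10, method="inclusive")
p90 = deciles[-1]  # 90th percentile

print(f"M={m:.1f}, Me={me:.1f}, SD={sd:.1f}, P90={p90:.1f}")
# For skewed data the mean far exceeds the median, so reporting quantiles
# gives a much fuller picture than M and SD alone - the paper's conclusion.
```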
Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability
Directory of Open Access Journals (Sweden)
Daniela Lucini
2018-04-01
Full Text Available In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic (ANS) regulation. We apply a three-step statistical procedure, by first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. control. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through peculiar graphical tools, i.e., the significance diagram and the ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination; e.g., athletes differ from controls in all domains, but with a graded strength: maximal in the (normalized) oscillatory and in the pulse domains, slightly less in the pressure domain and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines
Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma
2015-11-01
When the DNA profile from a crime-scene matches that of a suspect, the weight of DNA evidence depends on the unbiased estimation of the match probability of the profiles. For this reason, it is required to establish and expand the databases that reflect the actual allele frequencies in the population applied. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database to represent the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16) including five, new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the frequency of profiles estimated. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared to the previously published ones. We found that the FIS had less effect on frequency values in the 21,473 samples than the application of minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for statistical interpretation of forensic casework and allows us to use lower allele frequencies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
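The inbreeding correction described here follows the standard Hardy-Weinberg formulas with an inbreeding coefficient F (the paper's overall FIS was 0.0106). A minimal sketch; the allele frequencies in the example profile are hypothetical, for illustration only:

```python
def genotype_freq(p, q=None, f=0.0106):
    """Expected genotype frequency under Hardy-Weinberg with inbreeding
    coefficient f (FIS). Homozygote p/p if q is None, else heterozygote p/q."""
    if q is None:
        return p * p + p * (1.0 - p) * f
    return 2.0 * p * q * (1.0 - f)

def profile_frequency(genotypes, f=0.0106):
    """Profile match probability pM: product of per-locus genotype
    frequencies, assuming independent loci."""
    pm = 1.0
    for alleles in genotypes:
        if len(alleles) == 1:            # homozygous locus
            pm *= genotype_freq(alleles[0], f=f)
        else:                            # heterozygous locus
            pm *= genotype_freq(alleles[0], alleles[1], f=f)
    return pm

# Hypothetical three-locus profile with illustrative allele frequencies.
profile = [(0.12,), (0.08, 0.21), (0.05, 0.30)]
pm = profile_frequency(profile)
lr = 1.0 / pm   # likelihood ratio for a single-source match
```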
Application of Statistical Tools for Data Analysis and Interpretation in Rice Plant Pathology
Directory of Open Access Journals (Sweden)
Parsuram Nayak
2018-01-01
Full Text Available There has been a significant advancement in the application of statistical tools in plant pathology during the past four decades. These tools include multivariate analysis of disease dynamics involving principal component analysis, cluster analysis, factor analysis, pattern analysis, discriminant analysis, multivariate analysis of variance, correspondence analysis, canonical correlation analysis, redundancy analysis, genetic diversity analysis, and stability analysis, which involves joint regression, additive main effects and multiplicative interactions, and genotype-by-environment interaction biplot analysis. The advanced statistical tools, such as non-parametric analysis of disease association, meta-analysis, Bayesian analysis, and decision theory, take an important place in the analysis of disease dynamics. Disease forecasting by simulation models for plant diseases has great potential in practical disease control strategies. Common mathematical tools such as the monomolecular, exponential, logistic, Gompertz and linked differential equations take an important place in growth curve analysis of disease epidemics. The highly informative means of displaying a range of numerical data through the construction of box and whisker plots has been suggested. The probable applications of recent advanced tools of linear and non-linear mixed models, like the linear mixed model, generalized linear model, and generalized linear mixed models, have been presented. The most recent technologies, such as micro-array analysis, though cost-effective, provide estimates of gene expression for thousands of genes simultaneously and need attention by molecular biologists. Some of these advanced tools can be well applied in different branches of rice research, including crop improvement, crop production, crop protection, social sciences as well as agricultural engineering. Rice research scientists should take advantage of these new opportunities adequately in
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
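The CPI/CPE calculation this article expounds has a simple closed form: per locus, square the sum of the frequencies of all alleles observed in the mixture, then multiply across loci; CPE is its complement. A minimal sketch with hypothetical allele frequencies:

```python
def combined_probability_of_inclusion(loci):
    """CPI: product over loci of (sum of observed allele frequencies)^2,
    i.e., the probability a random person could not be excluded."""
    cpi = 1.0
    for freqs in loci:
        cpi *= sum(freqs) ** 2
    return cpi

# Hypothetical two-locus mixture: frequencies of every allele observed.
loci = [
    [0.10, 0.20, 0.30],  # locus 1: sum = 0.6 -> contributes 0.36
    [0.25, 0.25],        # locus 2: sum = 0.5 -> contributes 0.25
]
cpi = combined_probability_of_inclusion(loci)  # 0.36 * 0.25 = 0.09
cpe = 1.0 - cpi                                # 0.91
```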
van Driel, A.F.; Nikolaev, I.; Vergeer, P.; Lodahl, P.; Vanmaekelbergh, D.; Vos, Willem L.
2007-01-01
We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters and the
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
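A "what if" analysis of the kind described can be sketched in a few lines: hold the observed effect size fixed and vary the sample size to watch the p value cross the significance threshold. The sketch below uses a one-sample normal-approximation z-test (not the paper's actual Excel/R materials) so it stays self-contained:

```python
import math

def p_value_one_sample(effect_size_d, n):
    """Two-sided p value via the normal approximation z = d * sqrt(n)."""
    z = effect_size_d * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2.0))

# Same observed effect (d = 0.3); only the sample size changes.
for n in (10, 20, 50, 100):
    p = p_value_one_sample(0.3, n)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"n={n:4d}  p={p:.4f}  {verdict}")
# The identical effect flips from "not significant" to "significant"
# purely as n grows - the point such "what if" analyses teach.
```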
On the statistical interpretation of quantum mechanics: evolution of the density matrix
International Nuclear Information System (INIS)
Benzecri, J.P.
1986-01-01
Without attempting to identify ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first 3 of these 5 theses [fr
Webb, Samuel J; Hanser, Thierry; Howlin, Brendan; Krause, Paul; Vessey, Jonathan D
2014-03-25
A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen as there is no change in the prediction; the interpretation is produced directly on the model's behaviour for the specific query. Models have been built using multiple learning algorithms including support vector machine and random forest. The models were built on public Ames mutagenicity data and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretations revealed links that agree closely with understood mechanisms for Ames mutagenicity. This methodology allows for a greater utilisation of the predictions made by black box models and can expedite further study based on the output for a (quantitative) structure activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development.
Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall
2016-01-01
Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) the classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by these approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.
Statistical interpretation of the process of evolution and functioning of Audiovisual Archives
Directory of Open Access Journals (Sweden)
Nuno Miguel Epifânio
2013-03-01
Full Text Available The article provides a typology of the operating conditions of audiovisual archives, based on the interpretation of results obtained from a quantitative sampling study. The study involved 43 institutions of different natures and dimensions, from national and foreign organizations, drawing on questions answered by communication services and cultural institutions. The analysis found a variety of guidelines on the management of information preservation, and characterized the typology of the record collections of each archive. The data collected thus allowed building an overview of the operating model of each organization surveyed in this study.
Ergodic theory, interpretations of probability and the foundations of statistical mechanics
van Lith, J.H.
2001-01-01
The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time
Directory of Open Access Journals (Sweden)
James A Fordyce
Full Text Available BACKGROUND: Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. METHODOLOGY: Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. CONCLUSIONS: The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
Fordyce, James A
2010-07-23
Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate-variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages through time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases compared to early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
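The gamma statistic discussed in this abstract is the standard Pybus-Harvey gamma, computable directly from a tree's internode intervals. A minimal sketch; the tiny three-taxon tree below is hypothetical:

```python
import math

def gamma_statistic(internode_intervals):
    """Pybus-Harvey gamma from internode intervals g_2..g_n, where g_k is
    the time during which the reconstructed tree has exactly k lineages."""
    g = internode_intervals            # g[0] is g_2, ..., g[-1] is g_n
    n = len(g) + 1                     # number of tips; needs n >= 3
    # T_i = sum_{k=2}^{i} k * g_k ; T = T_n is the total weighted depth.
    partial, total = [], 0.0
    for k, gk in enumerate(g, start=2):
        total += k * gk
        partial.append(total)
    mean_partial = sum(partial[:-1]) / (n - 2)   # average of T_2..T_{n-1}
    return (mean_partial - total / 2.0) / (
        total * math.sqrt(1.0 / (12.0 * (n - 2))))

# Hypothetical 3-taxon tree: one time unit with 2 lineages, one with 3.
# A negative gamma indicates branching concentrated toward the root.
print(round(gamma_statistic([1.0, 1.0]), 4))
```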
The statistical interpretations of counting data from measurements of low-level radioactivity
International Nuclear Information System (INIS)
Donn, J.J.; Wolke, R.L.
1977-01-01
The statistical model appropriate to measurements of low-level or background-dominant radioactivity is examined and the derived relationships are applied to two practical problems involving hypothesis testing: 'Does the sample exhibit a net activity above background' and 'Is the activity of the sample below some preselected limit'. In each of these cases, the appropriate decision rule is formulated, procedures are developed for estimating the preset count which is necessary to achieve a desired probability of detection, and a specific sequence of operations is provided for the worker in the field. (author)
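The first hypothesis test described ("Does the sample exhibit a net activity above background?") is commonly formulated as a critical-level decision rule in the style of Currie. The sketch below assumes Poisson counting with equal sample and background counting times, which may differ in detail from the authors' derivation:

```python
import math

def critical_level(background_counts, k=1.645):
    """Critical level L_C for a paired background measurement: declare net
    activity only when (gross - background) exceeds L_C. For Poisson
    counting the net count variance is approximately 2B, and k = 1.645
    gives a ~5% false-positive rate."""
    return k * math.sqrt(2.0 * background_counts)

def has_net_activity(gross_counts, background_counts):
    net = gross_counts - background_counts
    return net > critical_level(background_counts)

# Example: B = 100 counts gives L_C ~ 23.3, so a gross count of 120
# is not distinguishable from background, while 130 is.
print(has_net_activity(120, 100), has_net_activity(130, 100))
```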
DEFF Research Database (Denmark)
Van Driel, A.F.; Nikolaev, I.S.; Vergeer, P.
2007-01-01
We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters...... and the intensity in an emission decay curve are not proportional, but the density is a time integral of the intensity. The integral relation is crucial to correctly interpret non-single-exponential decay. We derive the proper normalization for both a discrete and a continuous distribution of rates, where every...... decay component is multiplied by its radiative decay rate. A central result of our paper is the derivation of the emission decay curve when both radiative and nonradiative decays are independently distributed. In this case, the well-known emission quantum efficiency can no longer be expressed...
Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.
Walum, Hasse; Waldman, Irwin D; Young, Larry J
2016-02-01
Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
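The argument about statistical power, prestudy odds, and bias can be made concrete with the standard post-study probability formula from this literature (the Ioannidis 2005 framework the authors draw on); the power and odds values below are illustrative, not taken from the article:

```python
def probability_finding_is_true(power, alpha, prestudy_odds, bias=0.0):
    """Post-study probability that a 'significant' result reflects a true
    effect: true positives over all positives, with an optional bias term
    u that converts would-be negatives into claimed positives."""
    r, u = prestudy_odds, bias
    true_pos = power * r + u * (1.0 - power) * r
    false_pos = alpha + u * (1.0 - alpha)
    return true_pos / (true_pos + false_pos)

# Well-powered scenario vs. an underpowered one at the same prior odds.
good = probability_finding_is_true(power=0.80, alpha=0.05, prestudy_odds=0.25)
weak = probability_finding_is_true(power=0.15, alpha=0.05, prestudy_odds=0.25)
# With low power, a significant finding is more likely false than true -
# the core of the authors' skepticism about underpowered studies.
```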
Rapp, J.B.
1991-01-01
Q-mode factor analysis was used to quantitate the distribution of the major aliphatic hydrocarbon (n-alkanes, pristane, phytane) systems in sediments from a variety of marine environments. The compositions of the pure end members of the systems were obtained from factor scores and the distribution of the systems within each sample was obtained from factor loadings. All the data, from the diverse environments sampled (estuarine (San Francisco Bay), fresh-water (San Francisco Peninsula), polar-marine (Antarctica) and geothermal-marine (Gorda Ridge) sediments), were reduced to three major systems: a terrestrial system (mostly high molecular weight aliphatics with odd-numbered-carbon predominance), a mature system (mostly low molecular weight aliphatics without predominance) and a system containing mostly high molecular weight aliphatics with even-numbered-carbon predominance. With this statistical approach, it is possible to assign the percentage contribution from various sources to the observed distribution of aliphatic hydrocarbons in each sediment sample. © 1991.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
Energy Technology Data Exchange (ETDEWEB)
Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)
2011-08-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays
International Nuclear Information System (INIS)
Sibatov, R T
2011-01-01
A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of the Ohm law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
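The Scher-Montroll ingredient this abstract invokes, a power-law distribution of waiting times in localized states, can be sampled by inverse transform. This is a generic sketch of that ingredient, not the authors' full Coulomb-blockade model:

```python
import random

def waiting_time(alpha, t0=1.0, rng=random):
    """Inverse-transform sample from a heavy-tailed waiting-time
    distribution P(T > t) = (t0/t)^alpha with 0 < alpha < 1, the form
    assumed in Scher-Montroll-type dispersive transport models."""
    u = rng.random()
    return t0 * u ** (-1.0 / alpha)

random.seed(42)
times = [waiting_time(alpha=0.5) for _ in range(10_000)]

# With alpha < 1 the mean waiting time diverges: a few enormous trapping
# times dominate, producing power-law current transients and memory
# effects. The median, by contrast, stays near t0 * 2**(1/alpha).
times.sort()
median = times[len(times) // 2]
```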
Swofford, H J; Koertner, A J; Zemp, F; Ausdemore, M; Liu, A; Salyards, M J
2018-04-03
The forensic fingerprint community has faced increasing amounts of criticism by scientific and legal commentators, challenging the validity and reliability of fingerprint evidence due to the lack of an empirically demonstrable basis to evaluate and report the strength of the evidence in a given case. This paper presents a method, developed as a stand-alone software application, FRStat, which provides a statistical assessment of the strength of fingerprint evidence. The performance was evaluated using a variety of mated and non-mated datasets. The results show strong performance characteristics, often with values supporting specificity rates greater than 99%. This method provides fingerprint experts the capability to demonstrate the validity and reliability of fingerprint evidence in a given case and report the findings in a more transparent and standardized fashion with clearly defined criteria for conclusions and known error rate information thereby responding to concerns raised by the scientific and legal communities. Published by Elsevier B.V.
A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements
Energy Technology Data Exchange (ETDEWEB)
Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake
2007-01-15
With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern temperature measuring instruments such as fiber optic distributed temperature sensor (DTS) in intelligent completions and will potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose the downhole flow conditions. However, geothermal temperature changes along the wellbore being small for horizontal wells, interpretations of a temperature log become difficult. The primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a prediction model of the temperature and pressure by coupling the wellbore and reservoir equations. Also, we applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases at varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide if reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
Directory of Open Access Journals (Sweden)
Paul A. Swinton
2018-05-01
Full Text Available The concept of personalized nutrition and exercise prescription represents a topical and exciting progression for the discipline, given the large inter-individual variability that exists in response to virtually all performance and health related interventions. Appropriate interpretation of intervention-based data from an individual or group of individuals requires practitioners and researchers to consider a range of concepts, including the confounding influence of measurement error and biological variability. In addition, the means to quantify likely statistical and practical improvements are facilitated by concepts such as confidence intervals (CIs) and the smallest worthwhile change (SWC). The purpose of this review is to provide accessible and applicable recommendations for practitioners and researchers that interpret and report personalized data. To achieve this, the review is structured in three sections that progressively develop a statistical framework. Section 1 explores fundamental concepts related to measurement error and describes how typical error and CIs can be used to express uncertainty in baseline measurements. Section 2 builds upon these concepts and demonstrates how CIs can be combined with the concept of SWC to assess whether meaningful improvements occur post-intervention. Finally, section 3 introduces the concept of biological variability and discusses the subsequent challenges in identifying individual response and non-response to an intervention. Worked numerical examples and interactive Supplementary Material are incorporated to solidify concepts and assist with implementation in practice.
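The section-2 logic, comparing a confidence interval for an observed change against the smallest worthwhile change, can be sketched as follows (the typical error, change score, and SWC values are illustrative, not from the review):

```python
import math

def change_ci(observed_change, typical_error, z=1.96):
    """CI for an individual's true change: a change score combines the
    noise of two measurements, hence the sqrt(2) inflation of the
    typical error (TE)."""
    half_width = z * typical_error * math.sqrt(2.0)
    return observed_change - half_width, observed_change + half_width

def interpret(observed_change, typical_error, swc):
    lo, hi = change_ci(observed_change, typical_error)
    if lo > swc:
        return "meaningful improvement"   # whole CI beyond the SWC
    if hi < swc:
        return "no meaningful improvement"
    return "unclear"                      # CI straddles the SWC

# Illustrative numbers: TE = 1.0 units; SWC = 0.2 * between-subject SD of 5.
print(interpret(observed_change=3.0, typical_error=1.0, swc=1.0))
```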
Energy Technology Data Exchange (ETDEWEB)
Munoz, Gerard; Bauer, Klaus; Moeck, Inga; Schulze, Albrecht; Ritter, Oliver [Deutsches GeoForschungsZentrum (GFZ), Telegrafenberg, 14473 Potsdam (Germany)
2010-03-15
Exploration for geothermal resources is often challenging because no geophysical technique provides direct images of the parameters of interest, such as porosity, permeability and fluid content. Magnetotelluric (MT) and seismic tomography methods yield information about the subsurface distribution of resistivity and seismic velocity on similar scales and resolutions. The lack of a fundamental law linking the two parameters, however, has limited joint interpretation to qualitative analysis. By using a statistical approach in which the resistivity and velocity models are investigated in the joint parameter space, we are able to identify regions of high correlation and map these classes (or structures) back onto the spatial domain. This technique, applied to a seismic tomography-MT profile in the area of the Gross Schoenebeck geothermal site, allows us to identify a number of classes in accordance with the local geology. In particular, a high-velocity, low-resistivity class is interpreted as related to areas with thinner layers of evaporites; regions where these sedimentary layers are highly fractured may be of higher permeability. (author)
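A simple way to realize this kind of joint parameter-space classification is to cluster co-located (velocity, resistivity) pairs and map the class labels back onto the grid. The sketch below applies k-means to synthetic values; the class centres, spreads and the choice of k-means are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)

# Synthetic co-located model values (illustrative): class A is high-velocity /
# low-resistivity, class B is low-velocity / high-resistivity.
n = 400
velocity = np.concatenate([rng.normal(5.8, 0.1, n), rng.normal(4.6, 0.1, n)])  # km/s
log_res = np.concatenate([rng.normal(1.0, 0.2, n), rng.normal(2.5, 0.2, n)])   # log10 ohm-m

# Standardize, then cluster in the joint parameter space.
X = np.column_stack([velocity, log_res])
X = (X - X.mean(axis=0)) / X.std(axis=0)
centroids, labels = kmeans2(X, 2, minit='++', seed=42)

# Every cell now carries a class label that can be mapped back onto the spatial
# domain; here we just check that the two simulated classes were recovered.
frac = float(np.mean(labels[:n] == labels[0]))
purity = max(frac, 1.0 - frac)
```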
Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; Maceachren, Alan M
2008-11-07
Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context, and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while statistically significant, have heterogeneous contents comprising both high-risk and low-risk locations, and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is optimal for identifying clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a logical approach to proceed through the analysis of
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
Bonetti, R.; Milazzo, L.C.; Melanotte, M.
1983-01-01
A number of (p,n), (n,p), and (³He,p) reactions have been interpreted on the basis of the statistical multistep compound emission mechanism. Good agreement with experiment is found both in spectrum shape and in the value of the coherence widths.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
DEFF Research Database (Denmark)
Denwood, M.J.; McKendrick, I.J.; Matthews, L.
Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power simulations confirm that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple...
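The flavour of such a framework can be sketched with a delta-method test built from the four group summary statistics (means and variances), driving a Monte Carlo power estimate over overdispersed egg counts. All distributional settings and thresholds below are invented for illustration; this is not the authors' published method:

```python
import numpy as np

rng = np.random.default_rng(1)

def inferior_efficacy_test(control, treated, target=0.95, z=1.645):
    # Four summary statistics: the mean and variance of each group.
    mc, vc = control.mean(), control.var(ddof=1)
    mt, vt = treated.mean(), treated.var(ddof=1)
    ratio = mt / mc  # ratio of means = 1 - observed efficacy
    # Delta-method standard error of log(ratio).
    se = np.sqrt(vt / (len(treated) * mt**2) + vc / (len(control) * mc**2))
    upper_efficacy = 1.0 - ratio * np.exp(-z * se)  # one-sided upper bound
    return upper_efficacy < target  # declare efficacy inferior to the target

def power(n_control, n_treated, true_eff=0.85, mean_epg=300.0, k=0.7, n_sim=500):
    detections = 0
    for _ in range(n_sim):
        # Negative-binomial egg counts with dispersion k (gamma-Poisson form).
        control = rng.negative_binomial(k, k / (k + mean_epg), n_control)
        treated = rng.negative_binomial(k, k / (k + mean_epg * (1 - true_eff)), n_treated)
        if control.mean() == 0 or treated.mean() == 0:
            continue  # degenerate sample; skip
        detections += bool(inferior_efficacy_test(control, treated))
    return detections / n_sim

p20 = power(20, 20)
p40 = power(40, 40)  # more animals -> higher power, echoing the abstract's trend
```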
Azad Henareh Khalyani; William A. Gould; Eric Harmsen; Adam Terando; Maya Quinones; Jaime A. Collazo
2016-01-01
statistically downscaled general circulation models (GCMs) taking Puerto Rico as a test case. Two model selection/model averaging strategies were used: the average of all available GCMs and the average of the models that are able to...
Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios
2005-01-01
Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...
Directory of Open Access Journals (Sweden)
Takahiro eKawabe
2013-09-01
Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent's action, and thus humans should also be able to acquire the statistical features relating an action to its external outcome. We report that the acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating the optimal acquisition of the action-effect statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced. The Bayesian model again accounted for the attenuated PSE shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated; however, the Bayesian model could not account for these shifts. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-effect causal relation.
Link, J; Pachaly, J
1975-08-01
In a retrospective 18-month study, the infusion therapy applied in a large anesthesia institute was examined. The anesthesia course data routinely recorded on magnetic tape were analysed by computer with the statistical program SPSS. It was shown that the behaviour of individual anesthetists differs considerably. Various correlations are discussed.
Directory of Open Access Journals (Sweden)
Brayan Alexander Fonseca Martinez
2017-11-01
One of the most common observational study designs employed in veterinary research is the cross-sectional study with binary outcomes. To measure an association with exposure, the use of prevalence ratios (PR) or odds ratios (OR) is possible. In human epidemiology, much has been discussed about the use of the OR exclusively for case-control studies, and some authors report that there is no good justification for fitting logistic regression when the prevalence of the disease is high, in which case the OR overestimates the PR. Nonetheless, interpretation of the OR is difficult, since confusion between risk and odds can lead to incorrect quantitative interpretation of data such as "the risk is X times greater," commonly reported in studies that use the OR. The aims of this study were (1) to review articles with cross-sectional designs to assess the statistical method used and the appropriateness of the interpretation of the estimated measure of association and (2) to illustrate the use of alternative statistical methods that estimate the PR directly. An overview of statistical methods and their interpretation using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted and included a diverse set of peer-reviewed journals in the veterinary science field, using PubMed as the search engine. From each article, the statistical method used and the appropriateness of the interpretation of the estimated measure of association were registered. Additionally, four alternative models to logistic regression that estimate the PR directly were tested using our own dataset from a cross-sectional study on bovine viral diarrhea virus. The initial search strategy found 62 articles, of which 6 were excluded; therefore 56 studies were used for the overall analysis. The review showed that, independent of the level of prevalence reported, 96% of articles employed logistic regression, thus estimating the OR. Results of the multivariate models
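For a single binary exposure, the PR can be estimated directly from the 2×2 table, and comparing it with the OR shows the overestimation at high prevalence. The counts below are invented for illustration:

```python
import math

def prevalence_ratio(a, b, c, d, z=1.96):
    """2x2 table: exposed (a diseased, b healthy), unexposed (c diseased, d healthy).
    Returns the PR with a Wald CI computed on the log scale."""
    p1 = a / (a + b)          # prevalence among exposed
    p0 = c / (c + d)          # prevalence among unexposed
    pr = p1 / p0
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# High-prevalence example: the OR markedly overstates the PR.
pr, lo, hi = prevalence_ratio(60, 40, 30, 70)
or_ = odds_ratio(60, 40, 30, 70)
```

Here the prevalence is 60% among exposed and 30% among unexposed, so the PR is 2.0 while the OR is 3.5; reporting the OR as "risk" would exaggerate the association.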
The use of easily debondable orthodontic adhesives with ceramic brackets.
Ryu, Chiyako; Namura, Yasuhiro; Tsuruoka, Takashi; Hama, Tomohiko; Kaji, Kaori; Shimizu, Noriyoshi
2011-01-01
We experimentally produced an easily debondable orthodontic adhesive (EDA) containing heat-expandable microcapsules. The purpose of this in vitro study was to evaluate the best debonding condition when EDA was used with ceramic brackets. Shear bond strengths were measured before and after heating and compared statistically. Temperatures of the bracket base and pulp wall were also examined during heating. Bond strengths of EDA containing 30 wt% and 40 wt% heat-expandable microcapsules were 13.4 and 12.9 MPa, respectively, and decreased significantly to 3.8 and 3.7 MPa, respectively, after heating. The temperature of the pulp wall increased by 1.8-3.6°C during heating, less than that required to induce pulp damage. Based on the results, we conclude that heating for 8 s during debonding of ceramic brackets bonded with EDA containing 40 wt% heat-expandable microcapsules is the most effective and safest method for the enamel and pulp.
International Nuclear Information System (INIS)
Podorozhnyi, D.M.; Postnikov, E.B.; Sveshnikova, L.G.; Turundaevsky, A.N.
2005-01-01
A multivariate statistical procedure for solving problems of estimating physical parameters on the basis of data from measurements with multichannel equipment is described. Within the multivariate procedure, an algorithm is constructed for estimating the energy of primary cosmic rays and the exponent of their power-law spectrum. These are investigated using the KLEM spectrometer (NUCLEON project) as a specific example of measuring equipment. The results of computer experiments simulating the operation of the multivariate procedure for this equipment are given, and the proposed approach is compared in these experiments with the one-parameter approach presently used in data processing.
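As a toy version of the spectrum-exponent part of the problem, the maximum-likelihood (Hill) estimator recovers the exponent of a pure power-law spectrum from simulated event energies. This is a generic estimator sketch, not the paper's multivariate procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

def hill_exponent(energies, e_min):
    """ML estimate of gamma for a power-law spectrum p(E) ~ E^(-gamma),
    E >= e_min: gamma_hat = 1 + n / sum(ln(E / e_min))."""
    e = np.asarray(energies, dtype=float)
    e = e[e >= e_min]
    return 1.0 + e.size / np.sum(np.log(e / e_min))

# Draw a synthetic power-law sample with gamma = 2.7 via inverse-CDF sampling:
# CDF(E) = 1 - E^(1 - gamma) for E >= 1, so E = (1 - u)^(-1 / (gamma - 1)).
gamma_true = 2.7
u = rng.random(50_000)
energies = (1.0 - u) ** (-1.0 / (gamma_true - 1.0))
gamma_hat = float(hill_exponent(energies, e_min=1.0))
```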
Nash, J. Thomas; Frishman, David
1983-01-01
Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Eu, Tb, and Yb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks it requires methods that are expensive and not amenable to routine use.
Kennedy, R R; Merry, A F
2011-09-01
Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has been previously demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model and updated every second; after a period of stability a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five-level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles; 95% confidence interval 0.4 to 2.5 cycles; P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%], without trend detection 100% [98 to 100%]; P = 0.8). We conclude that simple statistical trend detection may speed detection of changes during routine anaesthesia, even when a graphical trend display is present.
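Trigg's tracking variable itself is only a few lines: the ratio of an exponentially smoothed forecast error to the smoothed mean absolute error, so |T| approaches 1 when errors become systematic. A sketch with an invented step change (the smoothing constant and series are illustrative):

```python
def triggs_tracking_variable(values, alpha=0.2):
    """Trigg's tracking signal: smoothed error / smoothed mean absolute error.
    |T| stays near 0 while forecasts track the data and approaches 1 when a
    systematic change appears."""
    forecast = values[0]
    smoothed_err = 0.0
    mad = 1e-9  # tiny positive start avoids division by zero
    signals = []
    for x in values:
        err = x - forecast
        smoothed_err = alpha * err + (1 - alpha) * smoothed_err
        mad = alpha * abs(err) + (1 - alpha) * mad
        signals.append(smoothed_err / mad)
        forecast = alpha * x + (1 - alpha) * forecast  # simple EWMA forecast
    return signals

# A stable variable followed by a step change: the signal should jump toward +1.
series = [100.0] * 30 + [115.0] * 15
sig = triggs_tracking_variable(series)
```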
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-02
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
Energy Technology Data Exchange (ETDEWEB)
Wjihi, Sarra [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Dhaou, Houcine [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Yahia, Manel Ben; Knani, Salah [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Jemni, Abdelmajid [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Lamine, Abdelmottaleb Ben, E-mail: abdelmottaleb.benlamine@gmail.com [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia)
2015-12-15
Statistical physics treatment is used to study the desorption of hydrogen on LaNi4.75Fe0.25, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi4.75Fe0.25 are fitted at three temperatures (293 K, 303 K and 313 K) using a monolayer desorption model. Six parameters of the model are fitted, namely the numbers of molecules per site n_α and n_β, the receptor site densities N_αM and N_βM, and the energetic parameters P_α and P_β. The behaviors of these parameters are discussed in relation to the desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamic potential functions are derived by statistical physics calculations from the adopted model.
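A monolayer model of this general shape (two independent site families, each contributing a Hill-type term) can be fitted to isotherm data with nonlinear least squares. The functional form, parameter values and noise level below are illustrative assumptions, not the paper's fitted results:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_site_monolayer(p, n_a, n_b, N_aM, N_bM, P_a, P_b):
    # Two receptor-site families; each term saturates at its site density.
    return N_aM / (1 + (P_a / p) ** n_a) + N_bM / (1 + (P_b / p) ** n_b)

# Synthetic "experimental" isotherm generated from known parameters plus noise.
true_params = (1.2, 2.0, 0.8, 1.5, 0.5, 3.0)
pressure = np.linspace(0.05, 10.0, 80)
q_clean = two_site_monolayer(pressure, *true_params)
rng = np.random.default_rng(3)
q_obs = q_clean + rng.normal(0.0, 0.002, pressure.size)

# Positivity bounds keep the fractional exponents well defined during the fit.
popt, pcov = curve_fit(two_site_monolayer, pressure, q_obs,
                       p0=(1.0, 1.8, 1.0, 1.2, 0.7, 2.5),
                       bounds=(1e-6, np.inf))
rmse = float(np.sqrt(np.mean((two_site_monolayer(pressure, *popt) - q_clean) ** 2)))
```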
INVERSE ELECTRON TRANSFER IN PEROXYOXALATE CHEMIEXCITATION USING EASILY REDUCIBLE ACTIVATORS
Bartoloni, Fernando Heering; Monteiro Leite Ciscato, Luiz Francisco; Augusto, Felipe Alberto; Baader, Wilhelm Josef
2010-01-01
Chemiluminescence properties of the peroxyoxalate reaction in the presence of activators bearing electron withdrawing substituents were studied, to evaluate the possible occurrence of an inverse electron
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy
2015-10-15
The present study aims to use statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools used, e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and on system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter are then identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementing a combined flocculation, hindered, transient and compression settling velocity function; and (iii
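The LHS step can be reproduced with scipy's quasi-Monte Carlo module: sample the unit hypercube so that each factor's range is evenly stratified, then scale to physical bounds. The three factors and their bounds below are invented placeholders, not the study's screened factors:

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=11)
unit = sampler.random(n=50)  # 50 points in the unit cube [0, 1)^3

# Scale to hypothetical design/flow ranges: tank radius (m), inlet flow (m3/h),
# inlet solids concentration (kg/m3) -- illustrative bounds only.
lower = [10.0, 100.0, 2.0]
upper = [25.0, 600.0, 6.0]
designs = qmc.scale(unit, lower, upper)

# The LHS property: each of the 50 equal-width strata along every dimension
# contains exactly one sample.
bins = np.floor(unit * 50).astype(int)
one_per_stratum = all(len(set(bins[:, j].tolist())) == 50 for j in range(3))
```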
Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets
Kulp, Christopher W.; Sprechini, Gene D.
2016-01-01
A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
Directory of Open Access Journals (Sweden)
Michael Robert Cunningham
2016-10-01
The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger, Wood, Stiff, and Chatzisarantis, 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter, Kofler, Forster, and McCullough's (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria, and an emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and funnel-plot asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. meta-analysis results actually indicate that there is a real depletion effect - contrary to their title.
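For readers unfamiliar with the criticized procedure, PET is essentially a weighted regression of study effect size on its standard error, with the intercept read as the small-study-bias-adjusted effect. A minimal sketch, with purely hypothetical study-level data:

```python
def pet_intercept(effects, ses):
    """Precision-Effect Test (PET) sketch: weighted least-squares
    regression of effect size on standard error (weights 1/se^2).
    The intercept is the effect predicted at se = 0."""
    w = [1.0 / s ** 2 for s in ses]
    sw = sum(w)
    mx = sum(wi * s for wi, s in zip(w, ses)) / sw
    my = sum(wi * e for wi, e in zip(w, effects)) / sw
    sxx = sum(wi * (s - mx) ** 2 for wi, s in zip(w, ses))
    sxy = sum(wi * (s - mx) * (e - my) for wi, s, e in zip(w, ses, effects))
    slope = sxy / sxx
    return my - slope * mx  # intercept: "bias-adjusted" effect estimate

# hypothetical effects constructed to rise with the standard error
effects = [0.30, 0.40, 0.50, 0.60]
ses = [0.10, 0.20, 0.30, 0.40]
adjusted = pet_intercept(effects, ses)
```

In this constructed example the effects lie exactly on the line 0.2 + 1.0 x se, so PET returns 0.2; the debate summarized above concerns whether such an adjustment is valid for real, noisy meta-analytic data.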
Walkden, N. R.; Wynn, A.; Militello, F.; Lipschultz, B.; Matthews, G.; Guillemaut, C.; Harrison, J.; Moulton, D.; Contributors, JET
2017-08-01
This paper presents the use of a novel modelling technique, based on intermittent transport due to filament motion, to interpret experimental profile and fluctuation data in the scrape-off layer (SOL) of JET during the onset and evolution of a density profile shoulder. A baseline case is established, prior to shoulder formation, and the stochastic model is shown to be capable of simultaneously matching the time-averaged profile measurement as well as the PDF shape and autocorrelation function from the ion-saturation current time series at the outer wall. Aspects of the stochastic model are then varied with the aim of producing a profile shoulder with statistical measurements consistent with experiment. This is achieved through a strong localised reduction in the density sink acting on the filaments within the model. The required reduction of the density sink occurs over a highly localised region, with the timescale of the density sink increased by a factor of 25. This alone is found to be insufficient to model the expansion and flattening of the shoulder region as the density increases, which requires additional changes within the stochastic model. An example is found which includes both a reduction in the density sink and filament acceleration and provides a consistent match to the experimental data as the shoulder expands, though the uniqueness of this solution cannot be guaranteed. Within the context of the stochastic model, this implies that the localised reduction in the density sink can trigger shoulder formation, but additional physics is required to explain the subsequent evolution of the profile.
The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...
Pyrochemical recovery of easily reducible species from spent nuclear fuel
International Nuclear Information System (INIS)
Jouault, C.
2000-01-01
The purpose of the reprocessing of spent fuel is to separate noble metals and other easily reducible species, actinides and lanthanides. A thermodynamic and bibliographical study allowed us to devise a process that performs these separations in several steps. Experimental validation of the steps concerning the extraction of noble metals and easily reducible species required designing an apparatus suited to the study of the two steps in question: the reduction by a gas of fission-product oxides, and the extraction of the metallic particles obtained by reduction through digestion in a liquid metal. Digestion experiments carried out on molybdenum and ruthenium particles led us to conclude that the transfer of metallic particles from a molten salt into a liquid metal is governed by complex wettability phenomena between the metallic particle, the molten salt, the liquid metal and the gas. The transfer from the salt to the metal is a chain of two steps: emersion of the particles from the salt into the gas, followed by transfer from the gas into the metal. Kinetics are limited by the transfer through the metal surface. The kinetics study identified the experimental parameters and metal properties that influence the digestion rate. A model of the transfer into a liquid metal of a particle trapped at the fluid/metal interface confirmed the experimental conclusions and clarified the influence of stirring. All the results suggest that the extraction of noble metals and easily reducible species is feasible in this way. (author) [fr
A method for easily customizable gradient gel electrophoresis.
Miller, Andrew J; Roman, Brandon; Norstrom, Eric
2016-09-15
Gradient polyacrylamide gel electrophoresis is a powerful tool for the resolution of polypeptides by relative mobility. Here, we present a simplified method for generating polyacrylamide gradient gels for routine analysis without the need for specialized mixing equipment. The method allows for easily customizable gradients which can be optimized for specific polypeptide resolution requirements. Moreover, the method eliminates the possibility of buffer cross contamination in mixing equipment, and the time and resources saved with this method in place of traditional gradient mixing, or the purchase of pre-cast gels, are noteworthy given the frequency with which many labs use gradient gel SDS-PAGE. Copyright © 2016 Elsevier Inc. All rights reserved.
The non-easily ionized elements as spectrochemical buffers
International Nuclear Information System (INIS)
Tripkovic, M.; Radovanov, S.; Holclajtner-Antunovic, I.; Todorovic, M.
1985-01-01
A method is developed for determining trace elements (In, Ga, B, V, Mo, Mn, Pt, P, Be) in graphite with the aid of a low current d.c. arc. The method makes use of the enhancement of the radiation intensities of trace elements by non-easily ionized elements (NEIE). As a NEIE, this method uses Cd which is added up to a concentration of 150 mg/g sample. The absolute detection limits for all of the above mentioned elements are at the ng-level. (orig.) [de
The reaction of organocerium reagents with easily enolizable ketones
International Nuclear Information System (INIS)
Imamoto, Tsuneo; Kusumoto, Tetsuo; Sugiura, Yasushi; Suzuki, Nobuyo; Takiyama, Nobuyuki
1985-01-01
Organocerium(III) reagents were conveniently generated by the reaction of organolithium compounds with anhydrous cerium(III) chloride. The reagents are less basic than organolithiums and Grignard reagents, and they react readily at -78 deg C with easily enolizable ketones such as 2-tetralone to afford addition products in high yields. Cerium(III) enolates were also generated from lithium enolates and cerium(III) chloride. The cerium(III) enolates undergo aldol addition with ketones or sterically crowded aldehydes to give the corresponding β-hydroxy ketones in good to high yields. (author)
Plasmonic Films Can Easily Be Better: Rules and Recipes
2015-01-01
High-quality materials are critical for advances in plasmonics, especially as researchers now investigate quantum effects at the limit of single surface plasmons or exploit ultraviolet- or CMOS-compatible metals such as aluminum or copper. Unfortunately, due to inexperience with deposition methods, many plasmonics researchers deposit metals under the wrong conditions, severely limiting performance unnecessarily. This is then compounded as others follow their published procedures. In this perspective, we describe simple rules collected from the surface-science literature that allow high-quality plasmonic films of aluminum, copper, gold, and silver to be easily deposited with commonly available equipment (a thermal evaporator). Recipes are also provided so that films with optimal optical properties can be routinely obtained. PMID:25950012
An Introduction to Statistical Concepts
Lomax, Richard G
2012-01-01
This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included.
Austin, Peter C; Steyerberg, Ewout W
2012-06-20
When outcomes are binary, the c-statistic (equivalent to the area under the Receiver Operating Characteristic curve) is a standard measure of the predictive accuracy of a logistic regression model. An analytical expression for the c-statistic was derived under the assumption that a continuous explanatory variable follows a normal distribution in those with and without the condition. We then conducted an extensive set of Monte Carlo simulations to examine whether the expressions derived under the assumption of binormality allowed for accurate prediction of the empirical c-statistic when the explanatory variable followed a normal distribution in the combined sample of those with and without the condition. We also examined the accuracy of the predicted c-statistic when the explanatory variable followed a gamma, log-normal or uniform distribution in the combined sample of those with and without the condition. Under the assumption of binormality with equality of variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the product of the standard deviation of the normal components (reflecting more heterogeneity) and the log-odds ratio (reflecting larger effects). Under the assumption of binormality with unequal variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the standardized difference of the explanatory variable in those with and without the condition. In our Monte Carlo simulations, we found that these expressions allowed for reasonably accurate prediction of the empirical c-statistic when the distribution of the explanatory variable was normal, gamma, log-normal, or uniform in the entire sample of those with and without the condition. The discriminative ability of a continuous explanatory variable cannot be judged by its odds ratio alone, but always needs to be considered in relation to the heterogeneity of the population.
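The equal-variance binormal result can be checked numerically: if the explanatory variable is N(δ, σ²) in cases and N(0, σ²) in controls, the analytic c-statistic Φ(δ/(σ√2)) should match the empirical Mann-Whitney estimate. A sketch with simulated data (sample sizes and δ chosen for illustration):

```python
import math
import random

def phi(x):
    # standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def theoretical_c(delta, sigma=1.0):
    # Under binormality with equal variances, X_case - X_control is
    # N(delta, 2*sigma^2), so P(X_case > X_control) = Phi(delta/(sigma*sqrt(2)))
    return phi(delta / (sigma * math.sqrt(2.0)))

def empirical_c(cases, controls):
    # Mann-Whitney estimate of P(X_case > X_control); ties count half
    wins = sum(1.0 if c > k else (0.5 if c == k else 0.0)
               for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

rng = random.Random(1)
delta = 1.0  # illustrative separation between the two groups
cases = [rng.gauss(delta, 1.0) for _ in range(800)]
controls = [rng.gauss(0.0, 1.0) for _ in range(800)]
```

With δ = 1 and σ = 1 the analytic value is Φ(1/√2) ≈ 0.76, and the empirical estimate agrees up to sampling error.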
Design of two easily-testable VLSI array multipliers
Energy Technology Data Exchange (ETDEWEB)
Ferguson, J.; Shen, J.P.
1983-01-01
Array multipliers are well-suited to VLSI implementation because of the regularity in their iterative structure. However, most VLSI circuits are very difficult to test. This paper shows that, with appropriate cell design, array multipliers can be designed to be very easily testable. An array multiplier is called c-testable if all its adder cells can be exhaustively tested while requiring only a constant number of test patterns. The testability of two well-known array multiplier structures is studied. The conventional design of the carry-save array multiplier is shown to be not c-testable. However, a modified design, using a modified adder cell, is presented and shown to be c-testable, requiring only 16 test patterns. Similar results are obtained for the Baugh-Wooley two's complement array multiplier. A modified design of the Baugh-Wooley array multiplier is shown to be c-testable, requiring 55 test patterns. The implementation of a practical c-testable 16x16 array multiplier is also presented. 10 references.
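The key observation behind c-testability is that an adder cell has only 2³ = 8 input combinations, so each cell can be tested exhaustively with a constant number of patterns regardless of array size; the design problem is arranging the array so all cells receive all patterns simultaneously. A behavioral sketch (the cell below is a generic full adder, not the paper's modified cell):

```python
def full_adder(a, b, cin):
    """Behavioral model of a generic adder cell (sum, carry-out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def exhaustive_patterns(n_inputs):
    """All input patterns for a cell with n_inputs binary inputs."""
    return [tuple((i >> k) & 1 for k in range(n_inputs))
            for i in range(2 ** n_inputs)]

# a 3-input adder cell is exhaustively tested by just 8 patterns,
# independent of array size -- the essence of c-testability
patterns = exhaustive_patterns(3)
results = {p: full_adder(*p) for p in patterns}
```

The invariant s + 2*cout == a + b + cin holds for every pattern, which is what an exhaustive per-cell test verifies.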
A highly versatile and easily configurable system for plant electrophysiology.
Gunsé, Benet; Poschenrieder, Charlotte; Rankl, Simone; Schröeder, Peter; Rodrigo-Moreno, Ana; Barceló, Juan
2016-01-01
In this study we present a highly versatile and easily configurable system for measuring plant electrophysiological parameters and ionic flow rates, connected to a computer-controlled, highly accurate positioning device. The modular software used allows easily customizable configurations for the measurement of electrophysiological parameters. Both the operational tests and the experiments already performed have been fully successful and rendered a low-noise and highly stable signal. Assembly, programming and configuration examples are discussed. The system is a powerful technique that not only gives precise measurement of plant electrophysiological status, but also allows easy development of ad hoc configurations that are not constrained to plant studies.
• We developed a highly modular system for electrophysiology measurements that can be used on either organs or cells and performs steady or dynamic intra- and extracellular measurements, taking advantage of the ease of visual object-oriented programming.
• High-precision data acquisition under electrically noisy environments allows the system to run even in a laboratory close to equipment that produces electrical noise.
• The system improves on currently used systems for monitoring and controlling high-precision measurements and micromanipulation, providing an open and customizable environment for multiple experimental needs.
Experimental statistics for biological sciences.
Bang, Heejung; Davidian, Marie
2010-01-01
In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression" - which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in statistical sciences. We hope that from this chapter, readers will understand the key statistical concepts and terminologies; how to design a study (experimental or observational); how to analyze the data (e.g., describe the data and/or estimate the parameters and make inferences); and how to interpret the results. This text would be most useful as supplemental material while readers take their own statistics courses, or as a self-teaching guide and reference text to accompany the manual of any statistical software.
Austin, Peter C
2008-09-01
Propensity-score matching is frequently used in the cardiology literature. Recent systematic reviews have found that this method is, in general, poorly implemented in the medical literature. The study objective was to examine the quality of the implementation of propensity-score matching in the general cardiology literature. A total of 44 articles published in the American Heart Journal, the American Journal of Cardiology, Circulation, the European Heart Journal, Heart, the International Journal of Cardiology, and the Journal of the American College of Cardiology between January 1, 2004, and December 31, 2006, were examined. Twenty of the 44 studies did not provide adequate information on how the propensity-score-matched pairs were formed. Fourteen studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. Only 4 studies explicitly used statistical methods appropriate for matched studies to compare baseline characteristics between treated and untreated subjects. Only 11 (25%) of the 44 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Only 2 studies described the matching method used, assessed balance in baseline covariates by appropriate methods, and used appropriate statistical methods to estimate the treatment effect and its significance. Application of propensity-score matching was poor in the cardiology literature. Suggestions for improving the reporting and analysis of studies that use propensity-score matching are provided.
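The review's central complaint is that analyses often ignore the matched structure of the data. One appropriate approach for a continuous outcome is a paired t-test on within-pair differences; the sketch below uses invented outcome values, not data from the reviewed studies:

```python
import math
import statistics

def paired_t(treated, control):
    """Paired t-test on propensity-matched pairs: analyze within-pair
    differences rather than treating the groups as independent.
    Returns (mean difference, t statistic with n-1 df)."""
    diffs = [t - c for t, c in zip(treated, control)]
    n = len(diffs)
    mean_d = statistics.fmean(diffs)
    sd_d = statistics.stdev(diffs)
    t_stat = mean_d / (sd_d / math.sqrt(n))
    return mean_d, t_stat

# illustrative matched outcomes (e.g. systolic blood pressure), invented
treated = [128, 131, 125, 140, 122, 135, 129, 133]
control = [132, 135, 124, 146, 127, 138, 133, 139]
mean_d, t_stat = paired_t(treated, control)
```

The paired statistic uses the standard deviation of the differences, which is typically smaller than the pooled between-group variability when matching induces correlation within pairs.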
DEFF Research Database (Denmark)
Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat
2015-01-01
2011-01-01
Background The correspondence of satisfaction ratings between physicians and patients can be assessed on different dimensions. One may examine whether they differ between the two groups or focus on measures of association or agreement. The aim of our study was to evaluate methodological difficulties in calculating the correspondence between patient and physician satisfaction ratings and to show the relevance for shared decision making research. Methods We utilised a structured tool for cardiovascular prevention (arriba™) in a pragmatic cluster-randomised controlled trial. Correspondence between patient and physician satisfaction ratings after individual primary care consultations was assessed using the Patient Participation Scale (PPS). We used the Wilcoxon signed-rank test, the marginal homogeneity test, Kendall's tau-b, weighted kappa, percentage of agreement, and the Bland-Altman method to measure differences, associations, and agreement between physicians and patients. Results Statistical measures signal large differences between patient and physician satisfaction ratings with more favourable ratings provided by patients and a low correspondence regardless of group allocation. Closer examination of the raw data revealed a high ceiling effect of satisfaction ratings and only slight disagreement regarding the distributions of differences between physicians' and patients' ratings. Conclusions Traditional statistical measures of association and agreement are not able to capture a clinically relevant appreciation of the physician-patient relationship by both parties in skewed satisfaction ratings. Only the Bland-Altman method for assessing agreement augmented by bar charts of differences was able to indicate this. Trial registration ISRCTN: ISRCT71348772 PMID:21592337
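The Bland-Altman method the authors favor is simple to compute: per-pair differences, their mean (the bias), and 95% limits of agreement at bias ± 1.96 standard deviations. A minimal sketch with invented satisfaction ratings (not the trial data):

```python
import statistics

def bland_altman(x, y):
    """Bland-Altman agreement analysis for paired measurements.
    Returns the bias (mean difference) and the 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# illustrative satisfaction ratings on a 1-5 scale, invented for the sketch
patients = [5, 5, 4, 5, 5, 4, 5, 5, 3, 5]
physicians = [4, 5, 4, 4, 5, 3, 4, 5, 3, 4]
bias, (lo, hi) = bland_altman(patients, physicians)
```

Because the method looks at the distribution of differences rather than rank association, it remains informative even when ratings pile up against a ceiling, which is exactly the situation described in the abstract.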
International Nuclear Information System (INIS)
Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G
2007-01-01
We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from 239Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (239Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of 239Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout, or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both.
Tchabo, William; Ma, Yongkun; Kwaw, Emmanuel; Zhang, Haining; Xiao, Lulu; Apaliya, Maurice T
2018-01-15
The four different methods of color measurement of wine proposed by Boulton, Giusti, Glories and the Commission Internationale de l'Eclairage (CIE) were applied to assess the statistical relationship between the phytochemical profile and chromatic characteristics of sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes. The alteration in chromatic properties and phenolic composition of non-thermal aged mulberry wine was examined, aided by the use of Pearson correlation, cluster and principal component analysis. The results revealed a positive effect of non-thermal processes on phytochemical families of wines. From Pearson correlation analysis, relationships between chromatic indexes and flavonols as well as anthocyanins were established. Cluster analysis highlighted similarities between Boulton and Giusti parameters, as well as Glories and CIE parameters, in the assessment of chromatic properties of wines. Finally, principal component analysis was able to discriminate wines subjected to different maturation techniques on the basis of their chromatic and phenolic characteristics. Copyright © 2017. Published by Elsevier Ltd.
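The Pearson correlation between a phenolic family and a chromatic index, as used in the analysis above, can be computed directly from its definition; the values below are illustrative, not the paper's measurements:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# hypothetical anthocyanin content (mg/L) vs a colour intensity index
anthocyanins = [120, 150, 180, 210, 260, 300]
colour_index = [0.45, 0.52, 0.61, 0.70, 0.82, 0.95]
r = pearson(anthocyanins, colour_index)
```

A value of r near +1, as in this constructed example, is the kind of positive chromatic-anthocyanin relationship the abstract reports.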
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Conversion factors and oil statistics
International Nuclear Information System (INIS)
Karbuz, Sohbet
2004-01-01
World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention to statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues: mass-to-volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
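The impact of the choice of conversion factor is easy to demonstrate: converting the same barrel count with plausible crude-specific factors shifts the tonnage estimate by well over ten percent. The factors below are approximate illustrative values, not an official convention:

```python
# barrels-per-tonne factors vary with crude density; these are
# illustrative approximations, not an agreed standard
BARRELS_PER_TONNE = {
    "world_average": 7.33,  # a commonly quoted average factor
    "light_crude": 7.8,     # lighter crude: more barrels per tonne
    "heavy_crude": 6.8,     # heavier crude: fewer barrels per tonne
}

def tonnes_from_barrels(barrels, factor):
    """Convert a volume in barrels to mass in tonnes via a given factor."""
    return barrels / factor

production_bbl = 10_000_000  # hypothetical 10 million barrels
estimates = {name: tonnes_from_barrels(production_bbl, f)
             for name, f in BARRELS_PER_TONNE.items()}
spread = max(estimates.values()) - min(estimates.values())
```

For the same 10 million barrels, the tonnage estimates span roughly 1.28 to 1.47 million tonnes, a spread of almost 15% - exactly the kind of distortion the paper warns about when factors are chosen carelessly.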
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with standard deviation 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with standard deviation 8.5. If 65 is taken as the minimum value indicating achievement of the course competence, the students' mean values fall below that standard. The misconception results highlight which subtopics should be reconsidered. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Vasikaran, Samuel
2008-08-01
* Clinical laboratories should be able to offer interpretation of the results they produce.
* At a minimum, contact details for interpretative advice should be available on laboratory reports. Interpretative comments may be verbal, or written and printed.
* Printed comments on reports should be offered judiciously, only where they would add value; no comment is preferable to an inappropriate or dangerous comment.
* Interpretation should be based on locally agreed or nationally recognised clinical guidelines where available.
* Standard tied comments ("canned" comments) can have some limited use. Individualised narrative comments may be particularly useful for tests that are new, complex or unfamiliar to the requesting clinicians and where clinical details are available.
* Interpretative commenting should only be provided by appropriately trained and credentialed personnel.
* Audit of comments and continued professional development of the personnel providing them are important for quality assurance.
Objective interpretation as conforming interpretation
Directory of Open Access Journals (Sweden)
Lidka Rodak
2011-12-01
The practical discourse willingly uses the formula of "objective interpretation", with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what "objective interpretation" could mean and how it could be understood in practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists' position, is not possible to uphold, and should rather be linked with conforming interpretation. What this actually implies is that it is not the virtues of certainty and predictability (usually associated with objectivity) but coherence that forms the foundation of the applicability of objectivity in law. What can be observed from the analyses is that both conforming interpretation and objective interpretation play the role of arguments in interpretive discourse, arguments that provide justification that an interpretation is not arbitrary or subjective. With regard to an important part of the ideology of legal application, namely the conviction that decisions should be taken on the basis of law in order to exclude arbitrariness, objective interpretation can be read as the question of what kind of authority "supports" a certain interpretation, one that is almost never free of judicial creativity and judicial activism. One can say that objective and conforming interpretation are just further arguments used in legal discourse.
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings, using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
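The interpretation stated above is easy to verify by simulation: when the null hypothesis is true, roughly 5% of replications yield P < 0.05. A sketch using a two-sample z-test with known unit variance (a simplifying assumption made so the P-value can be computed from the normal CDF alone):

```python
import math
import random

def phi(x):
    # standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def two_sample_p(a, b):
    """Two-sided P-value from a z-test assuming known unit variance."""
    n = len(a)
    z = (sum(a) / n - sum(b) / n) / math.sqrt(2.0 / n)
    return 2.0 * (1.0 - phi(abs(z)))

rng = random.Random(42)
n, reps = 30, 2000
rejections = 0
for _ in range(reps):
    # both groups drawn from the SAME distribution: the null is true
    a = [rng.gauss(0.0, 1.0) for _ in range(n)]
    b = [rng.gauss(0.0, 1.0) for _ in range(n)]
    if two_sample_p(a, b) < 0.05:
        rejections += 1
rate = rejections / reps
```

The rejection rate lands near 0.05, illustrating that a "significant" P-value speaks to how surprising the data would be under the null, not to the probability that the null is false.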
An interpretation of signature inversion
International Nuclear Information System (INIS)
Onishi, Naoki; Tajima, Naoki
1988-01-01
An interpretation in terms of the cranking model is presented to explain why signature inversion occurs for positive values of the axially asymmetric deformation parameter γ and why it emerges in specific orbitals. By introducing a continuous variable, the eigenvalue equation can be reduced to a one-dimensional Schroedinger equation, by means of which one can easily understand the cause of signature inversion. (author)
Localized Smart-Interpretation
Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom
2014-05-01
The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model, the geologist needs to rely on the geophysical data. The problem is, however, that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys goes unexploited, which is a problem because the resulting geological model does not fulfill its full potential and hence is less trustworthy. We suggest an approach to geological modeling that 1. allows all geophysical data to be considered when building the geological model, 2. is fast, and 3. allows quantification of geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as, for example, the depth to the base of a ground water reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with the interpretation, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases. When a model f
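The inference loop described in this abstract can be caricatured in a few lines (a hypothetical toy, not the authors' implementation; the numbers and the linear form of f(d,m) are invented for illustration):

```python
import numpy as np

# Hypothetical training pairs: m is what the geologist knows (e.g. a value
# derived from a geophysical inversion), d is what the geologist interpreted
# (e.g. depth to the base of a groundwater reservoir) at the same location.
m_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
d_obs = np.array([10.2, 14.8, 20.1, 24.7, 30.3])

# Infer a simple statistical model f(d, m): a linear trend plus a residual
# spread that quantifies the expert's interpretation scatter.
coeffs = np.polyfit(m_obs, d_obs, deg=1)
spread = float(np.std(d_obs - np.polyval(coeffs, m_obs)))

# Propose an interpretation (with uncertainty) where only m is available.
d_pred = float(np.polyval(coeffs, 3.5))
print(f"predicted depth {d_pred:.1f} +/- {spread:.2f}")
```

Each new expert pick would be appended to (m_obs, d_obs) and the fit recomputed, mirroring how the statistical model sharpens as interpretation proceeds.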
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advanc
Neuman, Yair
2010-10-01
Interpretation is at the center of psychoanalytic activity. However, interpretation is always challenged by that which is beyond our grasp, the 'dark matter' of our mind, what Bion describes as ' O'. O is one of the most central and difficult concepts in Bion's thought. In this paper, I explain the enigmatic nature of O as a high-dimensional mental space and point to the price one should pay for substituting the pre-symbolic lexicon of the emotion-laden and high-dimensional unconscious for a low-dimensional symbolic representation. This price is reification--objectifying lived experience and draining it of vitality and complexity. In order to address the difficulty of approaching O through symbolization, I introduce the term 'Penultimate Interpretation'--a form of interpretation that seeks 'loopholes' through which the analyst and the analysand may reciprocally save themselves from the curse of reification. Three guidelines for 'Penultimate Interpretation' are proposed and illustrated through an imaginary dialogue. Copyright © 2010 Institute of Psychoanalysis.
Mottelson, Ida Nygaard; Sodemann, Morten; Nielsen, Dorthe Susanne
2018-03-01
Immigrants, refugees, and their descendants comprise 12% of Denmark's population. Some of these people do not speak or understand Danish well enough to communicate with the staff in a healthcare setting and therefore need interpreter services. Interpretation through video conferencing equipment (video interpretation) is frequently used and creates a forum where the interpreter is not physically present in the medical consultation. The aim of this study was to investigate the attitudes to and experiences with video interpretation among charge nurses in a Danish university hospital. An electronic questionnaire was sent to 99 charge nurses. The questionnaire comprised both closed and open-ended questions. The answers were analysed using descriptive statistics and thematic text condensation. Of the 99 charge nurses, 78 (79%) completed the questionnaire. Most charge nurses, 21 (91%) of the daily/monthly users, and 21 (72%) of the monthly/yearly users, said that video interpretation increased the quality of their conversations with patients. A total of 19 (24%) departments had not used video interpretation within the last 12 months. The more the charge nurses used video interpretation, the more satisfied they were. Most of the charge nurses using video interpretation expressed satisfaction with the technology and found it easy to use. Some charge nurses are still content to allow family or friends to interpret. To reach its full potential, video interpretation technology has to be reliable and easily accessible for any consultation, including at the bedside.
MacKinnon, Edward
2012-01-01
This book is the first to offer a systematic account of the role of language in the development and interpretation of physics. An historical-conceptual analysis of the co-evolution of mathematical and physical concepts leads to the classical/quantum interface. Bohrian orthodoxy stresses the indispensability of classical concepts and the functional role of mathematics. This book analyses ways of extending, and then going beyond, this orthodoxy. Finally, the book analyzes how a revised interpretation of physics impacts on basic philosophical issues: conceptual revolutions, realism, and r
Kothe, Elsa Lenz; Berard, Marie-France
2013-01-01
Utilizing a/r/tographic methodology to interrogate interpretive acts in museums, multiple areas of inquiry are raised in this paper, including: which knowledge is assigned the greatest value when preparing a gallery talk; what lies outside of disciplinary knowledge; how invitations to participate invite and disinvite in the same gesture; and what…
Spreadsheets as tools for statistical computing and statistics education
Neuwirth, Erich
2000-01-01
Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.
Fisher, Anthony C; McCulloch, Daphne L; Borchert, Mark S; Garcia-Filion, Pamela; Fink, Cassandra; Eleuteri, Antonio; Simpson, David M
2015-08-01
Pattern electroretinograms (PERGs) have inherently low signal-to-noise ratios and can be difficult to detect when degraded by pathology or noise. We compare an objective system for automated PERG analysis with expert human interpretation in children with optic nerve hypoplasia (ONH) with PERGs ranging from clear to undetectable. PERGs were recorded uniocularly with chloral hydrate sedation in children with ONH (aged 3.5-35 months). Stimuli were reversing checks of four sizes focused using an optical system incorporating the cycloplegic refraction. Forty PERG records were analysed; 20 selected at random and 20 from eyes with good vision (fellow eyes or eyes with mild ONH) from over 300 records. Two experts identified P50 and N95 of the PERGs after manually deleting trials with movement artefact, slow-wave EEG (4-8 Hz) or other noise from raw data for 150 check reversals. The automated system first identified present/not-present responses using a magnitude-squared coherence criterion and then, for responses confirmed as present, estimated the P50 and N95 cardinal positions as the turning points in local third-order polynomials fitted in the -3 dB bandwidth [0.25 … 45] Hz. Confidence limits were estimated from bootstrap re-sampling with replacement. The automated system uses an interactive Internet-available webpage tool (see http://clinengnhs.liv.ac.uk/esp_perg_1.htm). The automated system detected 28 PERG signals above the noise level (p ≤ 0.05 for H0). Good subjective quality ratings were indicative of significant PERGs; however, poor subjective quality did not necessarily predict non-significant signals. P50 and N95 implicit times showed good agreement between the two experts and between experts and the automated system. For the N95 amplitude measured to P50, the experts differed by an average of 13% consistent with differing interpretations of peaks within noise, while the automated amplitude measure was highly correlated with the expert measures but was
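A coherence-based presence test of the kind the automated system uses can be sketched with synthetic data (a toy illustration, not the authors' pipeline; the 8 Hz reversal rate, amplitudes, and segment length are invented):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs, f_stim = 1000, 8.0                      # sample rate and stimulus rate (Hz)
t = np.arange(0, 10, 1 / fs)
reference = np.sin(2 * np.pi * f_stim * t)  # stimulus-locked reference

# One record with a genuine stimulus-locked component, one with noise only.
response = 0.5 * np.sin(2 * np.pi * f_stim * t) + rng.normal(size=t.size)
noise_only = rng.normal(size=t.size)

# Magnitude-squared coherence with the reference, evaluated at the stimulus
# frequency, separates "response present" from "response absent".
f, c_resp = signal.coherence(reference, response, fs=fs, nperseg=1000)
_, c_noise = signal.coherence(reference, noise_only, fs=fs, nperseg=1000)
idx = int(np.argmin(np.abs(f - f_stim)))
print(c_resp[idx], c_noise[idx])
```

In practice the decision threshold would come from the null distribution of coherence over the averaged segments, which is what gives the p ≤ 0.05 criterion mentioned in the abstract.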
Bott, Lewis; Frisson, Steven; Murphy, Gregory L
2009-04-01
The interpretation generated from a sentence of the form P and Q can often be different to that generated by Q and P, despite the fact that and has a symmetric truth-conditional meaning. We experimentally investigated to what extent this difference in meaning is due to the connective and and to what extent it is due to order of mention of the events in the sentence. In three experiments, we collected interpretations of sentences in which we varied the presence of the conjunction, the order of mention of the events, and the type of relation holding between the events (temporally vs. causally related events). The results indicated that the effect of using a conjunction was dependent on the discourse relation between the events. Our findings contradict a narrative marker theory of and, but provide partial support for a single-unit theory derived from Carston (2002). The results are discussed in terms of conjunction processing and implicatures of temporal order.
Reeve, Joanne
2010-01-01
Patient-centredness is a core value of general practice; it is defined as the interpersonal processes that support the holistic care of individuals. To date, efforts to demonstrate their relationship to patient outcomes have been disappointing, whilst some studies suggest values may be more rhetoric than reality. Contextual issues influence the quality of patient-centred consultations, impacting on outcomes. The legitimate use of knowledge, or evidence, is a defining aspect of modern practice, and has implications for patient-centredness. Based on a critical review of the literature, on my own empirical research, and on reflections from my clinical practice, I critique current models of the use of knowledge in supporting individualised care. Evidence-Based Medicine (EBM), and its implementation within health policy as Scientific Bureaucratic Medicine (SBM), define best evidence in terms of an epistemological emphasis on scientific knowledge over clinical experience. It provides objective knowledge of disease, including quantitative estimates of the certainty of that knowledge. Whilst arguably appropriate for secondary care, involving episodic care of selected populations referred in for specialist diagnosis and treatment of disease, application to general practice can be questioned given the complex, dynamic and uncertain nature of much of the illness that is treated. I propose that general practice is better described by a model of Interpretive Medicine (IM): the critical, thoughtful, professional use of an appropriate range of knowledges in the dynamic, shared exploration and interpretation of individual illness experience, in order to support the creative capacity of individuals in maintaining their daily lives. Whilst the generation of interpreted knowledge is an essential part of daily general practice, the profession does not have an adequate framework by which this activity can be externally judged to have been done well. Drawing on theory related to the
Objective interpretation as conforming interpretation
Lidka Rodak
2011-01-01
The practical discourse willingly uses the formula of “objective interpretation”, with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what “objective interpretation” could mean and how it could be understood in the practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists’ position, is not possible to uphold, and ...
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
International Nuclear Information System (INIS)
Tabor, L.
1987-01-01
For mammography to be an effective diagnostic method, it must be performed to a very high standard of quality. Otherwise many lesions, in particular cancer in its early stages, will simply not be detectable on the films, regardless of the skill of the mammographer. Mammographic interpretation consists of two basic steps: perception and analysis. The process of mammographic interpretation begins with perception of the lesion on the mammogram. Perception is influenced by several factors. One of the most important is the parenchymal pattern of the breast tissue, detection of pathologic lesions being easier with fatty involution. The mammographer should use a method for the systematic viewing of the mammograms that will ensure that all parts of each mammogram are carefully searched for the presence of lesions. The method of analysis proceeds according to the type of lesion. Contour analysis is of primary importance in the evaluation of circumscribed tumors. After having analyzed the contour and density of a lesion and considered its size, the mammographer should be fairly certain whether a circumscribed tumor is benign or malignant. Fine-needle puncture and/or US may assist the mammographer in making this decision. Painstaking analysis is required because many circumscribed tumors do not need to be biopsied. The perception of circumscribed tumors seldom causes problems, but their analysis needs careful attention. On the other hand, the major challenge with star-shaped lesions is perception. They may be difficult to discover when small. Although the final diagnosis of a stellate lesion can be made only with the help of histologic examination, the preoperative mammographic differential diagnosis can be highly accurate. The differential diagnostic problem is between malignant tumors (scirrhous carcinoma) on the one hand, and traumatic fat necrosis as well as radial scars on the other.
An Interpreter's Interpretation: Sign Language Interpreters' View of Musculoskeletal Disorders
National Research Council Canada - National Science Library
Johnson, William L
2003-01-01
Sign language interpreters are at increased risk for musculoskeletal disorders. This study used content analysis to obtain detailed information about these disorders from the interpreters' point of view...
Interpreting the Customary Rules on Interpretation
Merkouris, Panos
2017-01-01
International courts have at times interpreted the customary rules on interpretation. This is interesting because what is being interpreted is: i) rules of interpretation, which sounds dangerously tautological, and ii) customary law, the interpretation of which has not been the object of critical
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
LAMINAR FLOW THROUGH A TUBE WITH AN EASILY PENETRABLE ROUGHNESS NEAR AXIS
Directory of Open Access Journals (Sweden)
Є.О. Гаєв
2012-12-01
Full Text Available A mathematical model has been suggested and an investigation carried out of laminar flow through a round tube with a porous insertion (easily penetrable roughness, EPR) in its middle, along the axis. Velocity and shear fields have been found analytically for the stable flow region, as well as the hydraulic resistance as a function of EPR density and height.
Clearly written, easily comprehended? The readability of websites providing information on epilepsy
Brigo, Francesco; Otte, Wim; Igwe, Stanley C.; Tezzon, Frediano; Nardone, Raffaele
2015-01-01
There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular
Blunt bilateral diaphragmatic rupture—A right side can be easily missed
Directory of Open Access Journals (Sweden)
Maria Michailidou
2015-12-01
Full Text Available Blunt diaphragmatic rupture (BDR) is uncommon, with a reported incidence range of 1%–2%. The true incidence is not known. Bilateral BDR is particularly rare. We present a case of bilateral BDR, and we think that the incidence is under-recognised because of an easily missed and difficult-to-diagnose right-sided injury. Keywords: Blunt, Diaphragm, Bilateral, Injury
Ooms, L.; Veenhof, C.
2014-01-01
Introduction: The Dutch government stimulates sport and physical activity opportunities in the neighborhood to make it easier for people to adopt a physically active lifestyle. Seven National Sports Federations (NSFs) were funded to develop easily accessible sporting programs, targeted at groups
Interpretive Media Study and Interpretive Social Science.
Carragee, Kevin M.
1990-01-01
Defines the major theoretical influences on interpretive approaches in mass communication, examines the central concepts of these perspectives, and provides a critique of these approaches. States that the adoption of interpretive approaches in mass communication has ignored varied critiques of interpretive social science. Suggests that critical…
Interpreters, Interpreting, and the Study of Bilingualism.
Valdes, Guadalupe; Angelelli, Claudia
2003-01-01
Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, the statistical dynamics of independent particle systems, the ideal molecular system, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect-gas theory of liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of high-molecular systems, and quantum statistics.
Does environmental data collection need statistics?
Pulles, M.P.J.
1998-01-01
The term 'statistics' with reference to environmental science and policymaking might mean different things: the development of statistical methodology, the methodology developed by statisticians to interpret and analyse such data, or the statistical data that are needed to understand environmental
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
International Nuclear Information System (INIS)
Fu Jie; Zhou Qifu; Chen Dongliang
2011-01-01
This paper mainly introduces the shielding-design content of NCRP Report No. 151, published in 2005, and discusses some issues that are easily overlooked during the environmental impact assessment of medical electron accelerators in China. It is intended as a reference for the shielding design and assessment of medical electron accelerators, so as to achieve scientific, reasonable, feasible and economical radiation shielding protection. (authors)
OntologyWidget – a reusable, embeddable widget for easily locating ontology terms
Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin
2007-01-01
Abstract Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to r...
Statistical inference a short course
Panik, Michael J
2012-01-01
A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests on the assumptions of randomness and normality and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
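The distribution-free confidence interval for a population median mentioned here is classically built from order statistics and the Binomial(n, ½) distribution; a minimal sketch of that standard construction (not taken from the book):

```python
import math

def median_ci(data, conf=0.95):
    """Distribution-free CI for the population median via order statistics.

    Classical rule: with k the largest integer such that
    P(Binomial(n, 1/2) <= k) <= (1 - conf)/2, the interval is
    (x_(k+1), x_(n-k)) in 1-based order-statistic notation.
    """
    x = sorted(data)
    n = len(x)
    alpha = 1.0 - conf
    cdf, k = 0.0, -1
    for i in range(n + 1):
        cdf += math.comb(n, i) / 2.0**n
        if cdf <= alpha / 2.0:
            k = i
        else:
            break
    if k < 0:
        raise ValueError("sample too small for the requested confidence")
    return x[k], x[n - k - 1]  # 0-based indices of x_(k+1) and x_(n-k)

print(median_ci(list(range(1, 11))))  # classical result for n=10: (2, 9)
```

Because the interval endpoints are order statistics, the achieved coverage is at least the nominal level but is usually conservative (for n=10 at 95% it is about 97.9%).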
Energy Technology Data Exchange (ETDEWEB)
Institute of Paper Science Technology
2004-01-30
In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996 the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process, which has added to the complexity of the already overwhelming task of providing a uniform and clean recycled furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed from the repulper and become more difficult to remove the further down the system they get. This can be detrimental to final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. The properties of stickies are very similar to those of the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Also, their physical and chemical structure allows them to extrude through screens and attach to fibers, process equipment, wires and felts. Stickies can break down and then reagglomerate and appear at seemingly any place in the mill. When subjected to a number of factors including changes
An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions
International Nuclear Information System (INIS)
Ruan, Guihua; Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui; Du, Fuyou
2016-01-01
A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-Trypsin was then immobilized on the monolithic polyHIPE. The performance of the resultant enzyme reactor was assessed according to its ability to convert N_α-benzoyl-L-arginine ethyl ester to N_α-benzoyl-L-arginine, and the protein digestibility of bovine serum albumin (BSA) and cytochrome C (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverage of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated easily with 4 M HCl without any structural destruction. Multiple interconnected chambers with good permeability, fast digestion and easy reproducibility indicate that the polyHIPE enzyme reactor is a promising tool for application in proteomics and catalysis areas. - Graphical abstract: Schematic illustration of the preparation of a hypercrosslinked polyHIPE immobilized-enzyme reactor for on-column protein digestion. - Highlights: • A reactor was prepared and used for enzyme immobilization and continuous on-column protein digestion. • The new polyHIPE IMER was well suited to protein digestion, with good properties. • On-column digestion revealed that the IMER is easily regenerated by HCl without any structural destruction.
Cellulose with a High Fractal Dimension Is Easily Hydrolysable under Acid Catalysis
Directory of Open Access Journals (Sweden)
Mariana Díaz
2017-05-01
Full Text Available The adsorption of three diverse amino acid couples onto the surface of microcrystalline cellulose was studied. Characterisation of the modified celluloses included changes in polarity and in roughness. The amino acids partially break down the hydrogen-bonding network of the cellulose structure, leading to more reactive cellulose residues that are easily hydrolysed to glucose in the presence of hydrochloric acid or tungstophosphoric acid catalysts. The conversion of cellulose and the selectivity for glucose were highly dependent on the self-assembled amino acids adsorbed onto the cellulose and on the catalyst.
An AAA-DDD triply hydrogen-bonded complex easily accessible for supramolecular polymers.
Han, Yi-Fei; Chen, Wen-Qiang; Wang, Hong-Bo; Yuan, Ying-Xue; Wu, Na-Na; Song, Xiang-Zhi; Yang, Lan
2014-12-15
For a complementary hydrogen-bonded complex in which every hydrogen-bond acceptor is on one side and every hydrogen-bond donor is on the other, all secondary interactions are attractive and the complex is highly stable. AAA-DDD (A = acceptor, D = donor) is considered the most stable among triply hydrogen-bonded sequences. An easily synthesized and further derivatizable AAA-DDD system is very desirable for hydrogen-bonded functional materials. In this case, AAA and DDD were synthesized, starting from 4-methoxybenzaldehyde, with the Hantzsch pyridine synthesis and the Friedländer annulation reaction. The association constant determined by fluorescence titration in chloroform at room temperature is 2.09×10⁷ M⁻¹. The AAA and DDD components are not coplanar, but form a V shape in the solid state. Supramolecular polymers based on the AAA-DDD triple hydrogen bond have also been developed. This work may make AAA-DDD triply hydrogen-bonded sequences easily accessible for stimuli-responsive materials. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.
2012-02-01
The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.
On court interpreters' visibility
DEFF Research Database (Denmark)
Dubslaff, Friedel; Martinsen, Bodil
of the service they receive. Ultimately, the findings will be used for training purposes. Future - and, for that matter, already practising - interpreters as well as the professional users of interpreters ought to take the reality of the interpreters' work in practice into account when assessing the quality...... on the interpreter's interpersonal role and, in particular, on signs of the interpreter's visibility, i.e. active co-participation. At first sight, the interpreting assignment in question seems to be a short and simple routine task which would not require the interpreter to deviate from the traditional picture...
CERN. Geneva
2005-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Introduction to Statistics course
CERN. Geneva HR-RFA
2006-01-01
The four lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
CERN. Geneva
2004-01-01
The three lectures will present an introduction to statistical methods as used in High Energy Physics. As the time will be very limited, the course will seek mainly to define the important issues and to introduce the most widely used tools. Topics will include the interpretation and use of probability, estimation of parameters and testing of hypotheses.
Interpreting Impoliteness: Interpreters’ Voices
Directory of Open Access Journals (Sweden)
Tatjana Radanović Felberg
2017-11-01
Interpreters in the public sector in Norway interpret in a variety of institutional encounters, and they evaluate the majority of these encounters as polite. However, some encounters are evaluated as impolite, and they pose challenges when it comes to interpreting impoliteness. This raises the question of whether interpreters should take a stance on their own evaluation of impoliteness and whether they should interfere in communication. In order to find out more about how interpreters cope with this challenge, in 2014 a survey was sent to all interpreters registered in the Norwegian Register of Interpreters. The survey data were analyzed within the theoretical framework of impoliteness theory, using the notion of moral order as an explanatory tool in a close reading of interpreters' answers. The analysis shows that interpreters reported using a variety of strategies for interpreting impoliteness, including omission and downtoning. However, the interpreters also gave examples of individual strategies for coping with impoliteness, such as interrupting and postponing interpreting. These strategies border on behavioral strategies and conflict with the Norwegian ethical guidelines for interpreting. In light of the ethical guidelines and actual practice, mapping and discussing the different strategies used by interpreters might heighten interpreters' and interpreter-users' awareness of the role impoliteness can play in institutional interpreter-mediated encounters.
Evaluation of easily measured risk factors in the prediction of osteoporotic fractures
Directory of Open Access Journals (Sweden)
Brown Jacques P
2005-09-01
Background Fracture represents the single most important clinical event in patients with osteoporosis, yet remains under-predicted. As few premonitory symptoms for fracture exist, it is of critical importance that physicians effectively and efficiently identify individuals at increased fracture risk. Methods Of 3426 postmenopausal women in CANDOO, 40, 158, 99, and 64 women developed a new hip, vertebral, wrist or rib fracture, respectively. Seven easily measured risk factors predictive of fracture in research trials were examined in clinical practice, including: age (65–69, 70–74, 75–79, 80+ years), rising from a chair with arms (yes, no), weight (≥57 kg), maternal history of hip fracture (yes, no), prior fracture after age 50 (yes, no), hip T-score (>-1, -1 to >-2.5, ≤-2.5), and current smoking status (yes, no). Multivariable logistic regression analysis was conducted. Results The inability to rise from a chair without the use of arms (3.58; 95% CI: 1.17, 10.93) was the most significant risk factor for new hip fracture. Notable risk factors for predicting new vertebral fractures were: low body weight (1.57; 95% CI: 1.04, 2.37), current smoking (1.95; 95% CI: 1.20, 3.18) and age between 75–79 years (1.96; 95% CI: 1.10, 3.51). New wrist fractures were significantly identified by low body weight (1.71; 95% CI: 1.01, 2.90) and prior fracture after 50 years (1.96; 95% CI: 1.19, 3.22). Predictors of new rib fractures included a maternal history of a hip fracture (2.89; 95% CI: 1.04, 8.08) and a prior fracture after 50 years (2.16; 95% CI: 1.20, 3.87). Conclusion This study has shown that there exists a variety of predictors of future fracture, besides BMD, that can be easily assessed by a physician. The significance of each variable depends on the site of incident fracture. Of greatest interest is that an inability to rise from a chair is perhaps the most readily identifiable significant risk factor for hip fracture and can be easily incorporated
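The odds-ratio-with-CI format reported above (e.g., 3.58; 95% CI: 1.17, 10.93) can be reproduced for a single risk factor from a 2×2 table. The sketch below is a minimal, unadjusted illustration with invented counts; the study itself used multivariable logistic regression, which adjusts each estimate for the other factors:

```python
import math

# Hypothetical 2x2 table (counts are invented for illustration):
# exposure = "cannot rise from a chair without using arms",
# outcome  = new hip fracture.
a, b = 12, 88      # exposed:   fracture / no fracture
c, d = 148, 3178   # unexposed: fracture / no fracture

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf SE of log(OR)
low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 2), (round(low, 2), round(high, 2)))  # 2.93 (1.57, 5.47)
```

A confidence interval whose lower bound exceeds 1, as here, is what marks a factor as a significant predictor in this style of reporting.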
An easily Prepared Fluorescent pH Probe Based on Dansyl.
Sha, Chunming; Chen, Yuhua; Chen, Yufen; Xu, Dongmei
2016-09-01
A novel fluorescent pH probe based on dansyl chloride and thiosemicarbazide was easily prepared and fully characterized by ¹H NMR, ¹³C NMR, LC-MS, infrared spectra and elemental analysis. The probe exhibited high selectivity and sensitivity to H⁺, with a pKa value of 4.98. The fluorescence intensity at 510 nm was quenched by 99.5% when the pH dropped from 10.88 to 1.98. In addition, the dansyl-based probe responded quickly and reversibly to pH variation, and various common metal ions showed negligible interference. The recognition can be ascribed to intramolecular charge transfer caused by protonation of the nitrogen in the dimethylamino group.
Can four-quark states be easily detected in baryon-antibaryon scattering?
International Nuclear Information System (INIS)
Roberts, W.; Silvestre-Brac, B.; Gignoux, C.
1990-01-01
We attempt to explain the experimental sparsity of diquonia candidates given the theoretical abundance of such states. We do this by investigating the lowest-order contributions of such states as intermediates in p̄p scattering into exclusive baryon-antibaryon final states. We find that the contributions depend on the partial widths for the meson-meson decays of the diquonia, and that resonant effects can easily be made to disappear. We conclude that if the meson-meson widths of diquonia are larger than about 50 MeV, most of these states will be extremely difficult to observe in p̄p scattering, for instance. We note that diquonia may offer a convenient means of describing some aspects of the dynamics of baryon-antibaryon scattering.
Yang, Huishan; Yu, Yaoyao; Wu, Lishuang; Qu, Biao; Lin, Wenyan; Yu, Ye; Wu, Zhijun; Xie, Wenfa
2018-02-01
We have realized highly efficient tandem organic light-emitting devices (OLEDs) employing an easily fabricated charge generation unit (CGU) combining 1,4,5,8,9,11-hexaazatriphenylene-hexacarbonitrile with ultrathin bilayers of CsN3 and Al. The charge generation and separation processes of the CGU have been demonstrated by studying the differences in the current density-voltage characteristics of external-carrier-excluding devices. At high luminances of 1000 and 10,000 cd/m², the current efficiencies of the phosphorescent tandem device are about 2.2- and 2.3-fold those of the corresponding single-unit device, respectively. Simultaneously, an efficient tandem white OLED exhibiting high color stability and warm white emission has also been fabricated.
Yang, Hui; Deng, Yan
2017-12-01
All-dielectric metasurfaces for wavefront deflection and optical vortex generation with broadband, high-efficiency operation are demonstrated. The unit cell of the metasurfaces is optimized to function as a half-wave plate with high polarization conversion efficiency (94%) and transmittance (94.5%) at the telecommunication wavelength. Under this condition, the complicated parameter-sweep process for selecting phase shifts can be avoided. Hence, phase coverage ranging from 0 to 2π can easily be obtained by introducing the Pancharatnam-Berry phase. Metasurfaces composed of the two pre-designed super cells are demonstrated for optical beam deflection and vortex beam generation. It is found that metasurfaces with more phase-shift sampling points (smaller phase-shift increments) exhibit better performance. Moreover, optical vortex beams can be generated by the designed metasurfaces within a wavelength range of 200 nm. These results provide a viable route for designing broadband, high-efficiency devices based on phase modulation.
TRANSALPINA CAN EASILY BE CONSIDERED THE DIAMOND OF THE COUNTRY: LANDSCAPES, ADVENTURE AND MYSTERY
Directory of Open Access Journals (Sweden)
Constanta ENEA
2014-05-01
If Transfăgărăşan is the pearl of the Romanian mountains, this road can easily be considered the diamond: country landscapes, adventure and mystery. Hell's Kitchen has developed and evolved naturally. There was no certainty of the success, and the money required to build the infrastructure first, and then see whether investors would come, was lacking, so we cannot blame the local authorities here. The difficulties encountered in implementing funding programs make funds hard enough to obtain. In this paper, I will briefly mention some ideas that could help the two towns which administratively hold Rancière, the burgeoning tourist development area of Gorj County. I sincerely hope that there are among us people with vision who want to stand up and take action to provide a decent future for our children.
Solving block linear systems with low-rank off-diagonal blocks is easily parallelizable
Energy Technology Data Exchange (ETDEWEB)
Menkov, V. [Indiana Univ., Bloomington, IN (United States)
1996-12-31
An easily and efficiently parallelizable direct method is given for solving a block linear system Bx = y, where B = D + Q is the sum of a non-singular block diagonal matrix D and a matrix Q with low-rank blocks. This implicitly defines a new preconditioning method with an operation count close to the cost of calculating a matrix-vector product Qw for some w, plus at most twice the cost of calculating Qw for some w. When implemented on a parallel machine, the processor utilization can be as good as that of those operations. Order estimates are given for the general case, and an implementation is compared to block SSOR preconditioning.
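The structure described in this abstract can be sketched with the Woodbury identity when the low-rank part is written globally as Q = U Vᵀ. This is a simplification of the paper's block setting (D is taken diagonal for brevity, and all matrices are synthetic), but it shows why such systems are cheap: only applications of D⁻¹, products with the thin factors, and a small r×r solve are needed, and the D⁻¹ applications parallelize block by block:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 5                      # system size, rank of the off-diagonal part

# Hypothetical system: B = D + Q with D non-singular diagonal (a special
# case of block diagonal) and Q = U @ V.T of low rank r.
D = np.diag(rng.uniform(1.0, 2.0, n))
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
y = rng.standard_normal(n)

# Woodbury identity:
# (D + U V^T)^{-1} y = D^{-1} y - D^{-1} U (I + V^T D^{-1} U)^{-1} V^T D^{-1} y
Dinv_y = y / np.diag(D)
Dinv_U = U / np.diag(D)[:, None]
small = np.eye(r) + V.T @ Dinv_U            # only an r x r solve is needed
x = Dinv_y - Dinv_U @ np.linalg.solve(small, V.T @ Dinv_y)

# Check against a dense direct solve.
x_ref = np.linalg.solve(D + U @ V.T, y)
print(np.allclose(x, x_ref))                # True
```

Each processor can apply its own diagonal (or block-diagonal) piece of D⁻¹ independently, which is the source of the easy parallelism the abstract claims.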
The study on development of easily chewable and swallowable foods for elderly.
Kim, Soojeong; Joo, Nami
2015-08-01
When the functions involved in the ingestion of food fail, not only is the enjoyment of eating lost, but protein-energy malnutrition follows. Dysmasesis (chewing difficulty) and difficulty swallowing occur in various diseases and may be major consequences of aging, and the number of elderly people with chewing and swallowing difficulties is expected to increase rapidly in the aging society. In this study, we carried out a survey targeting nutritionists who work in elderly care facilities, and examined the characteristics of the foods offered to the elderly and the demand for the development of easily chewable and swallowable foods for elderly people who can crush foods with their own tongues but sometimes have difficulty drinking water and tea. Elderly care facilities were found to provide finely chopped food, or food ground with water in a blender, for the elderly with chewing difficulties. Elderly satisfaction with the provided foods appeared low overall. An investigation of the applicability of foods for the elderly, and the willingness to reflect them in menus, showed the highest response rate for a gelification method from molecular gastronomy. An investigation of the foods frequently offered to the elderly, with beef, pork, white fish, anchovies and spinach as representative menu items, showed that Korean barbecue beef, hot-pepper-paste stir-fried pork, pan-fried white fish, stir-fried anchovy and seasoned spinach had the highest offer frequency. This study will provide the fundamentals of the development of easily chewable and swallowable foods, through gelification, for the elderly. The study also illustrates that, for the elderly, food that has undergone gelification will reduce the risk of swallowing down the wrong pipe and improve overall food preference.
Effects of easily ionizable elements on the liquid sampling-atmospheric pressure glow discharge
International Nuclear Information System (INIS)
Venzie, Jacob L.; Marcus, R. Kenneth
2006-01-01
A series of studies has been undertaken to determine the susceptibility of the liquid sampling-atmospheric pressure glow discharge (LS-APGD) atomic emission source to easily ionizable element (EIE) effects. The initial portions of the study involved monitoring the voltage drop across the plasma as a function of the pH to ascertain whether or not the conductivity of the liquid eluent alters the plasma energetics and subsequently the analyte signal strength. It was found that altering the pH (0.0 to 2.0) in the sample matrix did not significantly change the discharge voltage. The emission signal intensities for Cu(I) 327.4 nm, Mo(I) 344.7 nm, Sc(I) 326.9 nm and Hg(I) 253.6 nm were measured as a function of the easily ionizable element (sodium and calcium) concentration in the injection matrix. A range of 0.0 to 0.1% (w/v) EIE in the sample matrix did not cause a significant change in the Cu, Sc, and Mo signal-to-background ratios, with only a slight change noted for Hg. In addition to this test of analyte response, the plasma energetics as a function of EIE concentration are assessed using the ratio of Mg(II) to Mg(I) (280.2 nm and 285.2 nm, respectively) intensities. The Mg(II)/Mg(I) ratio showed that the plasma energetics did not change significantly over the same range of EIE addition. These results are best explained by the electrolytic nature of the eluent acting as an ionic (and perhaps spectrochemical) buffer
GoCxx: a tool to easily leverage C++ legacy code for multicore-friendly Go libraries and frameworks
International Nuclear Information System (INIS)
Binet, Sébastien
2012-01-01
Current HENP libraries and frameworks were written before multicore systems became widely deployed and used. From this environment, a 'single-thread' processing model naturally emerged, but the implicit assumptions it encouraged are greatly impairing our abilities to scale in a multicore/manycore world. Writing scalable code in C++ for multicore architectures, while doable, is no panacea. Sure, C++11 will improve on the current situation (by standardizing on std::thread, introducing lambda functions and defining a memory model), but it will do so at the price of further complicating an already quite sophisticated language. This level of sophistication has probably already strongly motivated analysis groups to migrate to CPython, hoping for its current limitations with respect to multicore scalability to be either lifted (Global Interpreter Lock removal) or overcome by the advent of a new Python VM better tailored for this kind of environment (PyPy, Jython, …). Could HENP migrate to a language with none of the deficiencies of C++ (build time, deployment, low-level tools for concurrency) and with the fast turn-around time, simplicity and ease of coding of Python? This paper will try to make the case that Go - a young open source language with built-in facilities to easily express and expose concurrency - is such a language. We introduce GoCxx, a tool leveraging gcc-xml's output to automate the tedious work of creating Go wrappers for foreign languages, a critical task for any language wishing to leverage legacy and field-tested code. We conclude with the first results of applying GoCxx to real C++ code.
Statistical Reform in School Psychology Research: A Synthesis
Swaminathan, Hariharan; Rogers, H. Jane
2007-01-01
Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.
Interpretation of Confidence Interval Facing the Conflict
Andrade, Luisa; Fernández, Felipe
2016-01-01
As literature has reported, it is usual that university students in statistics courses, and even statistics teachers, interpret the confidence level associated with a confidence interval as the probability that the parameter value will be between the lower and upper interval limits. To confront this misconception, class activities have been…
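The procedure-level reading of a confidence level can be illustrated with a short simulation (a sketch with arbitrary parameters, using a known-σ normal interval for simplicity): across many repeated samples, roughly 95% of the intervals cover the fixed parameter, which is a different claim from "this one realized interval contains the parameter with probability 0.95":

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, trials = 10.0, 2.0, 30, 10_000
z = 1.96                               # ~95% two-sided normal quantile

covered = 0
for _ in range(trials):
    sample = rng.normal(mu, sigma, n)
    half = z * sigma / np.sqrt(n)      # half-width of the known-sigma interval
    m = sample.mean()
    covered += (m - half <= mu <= m + half)

print(covered / trials)                # close to 0.95
```

Once a particular interval is computed, it either contains the fixed parameter or it does not; the 95% describes the long-run behavior of the procedure, which is exactly the misconception the abstract addresses.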
MedlinePlus Statistics (https://medlineplus.gov/usestatistics.html): quarterly usage statistics for MedlinePlus (page views and unique visitors by quarter, beginning Oct-Dec 1998).
International Nuclear Information System (INIS)
Olivari, D.
1984-01-01
An attempt is made at identifying the most important factors which introduce difficulties into the analysis of results from tests on pollutant dispersal: the unsteadiness of the phenomenon, the effect of external uncontrollable parameters, and the inherent complexity of the problem itself. The basic models for predicting the dispersion of passive contaminants are discussed, in particular a Lagrangian approach which seems to provide accurate results. Many problems arise in the analysis of results. First, the statistical quantities which describe the results must be computed: the mean, the variance and higher-order moments are important. It is shown that there is no easy solution if the duration and/or the number of independent "events" to be analyzed are too limited. The probability density function provides the most useful information, but is not easy to measure. A family of functions is recalled which predicts reasonably well the trend of the pdf. The role of intermittency is then shown in some detail. Its importance cannot be underestimated, and its relationship to the pdf and its effects on measurements are shown to be rather complex. Finally, an example is given to show the effects of the variance of external factors.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Whole Frog Project and Virtual Frog Dissection Statistics: wwwstats output of server access statistics, corrected for duplicate or extraneous accesses (byte counts are under-represented).
DEFF Research Database (Denmark)
Auken, Sune
2015-01-01
Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of the intricacies of that relationship through an analysis of the generic interpretation carrie...
Engineering Definitional Interpreters
DEFF Research Database (Denmark)
Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford
2013-01-01
A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...
Measurement of thermal properties of white radish (R. raphanistrum) using easily constructed probes.
Directory of Open Access Journals (Sweden)
Mfrekemfon Samuel Obot
Thermal properties are necessary for the design and control of processes and storage facilities for food materials. This study proposes the measurement of thermal properties using easily constructed probes, with specific heat capacity calculated, as opposed to the use of a Differential Scanning Calorimeter (DSC) or other. These probes were constructed and used to measure the thermal properties of white radish in the temperature range of 80-20°C and moisture content of 91-6.1% wb. Results showed the thermal properties were within the range of 0.71-0.111 W m⁻¹ °C⁻¹ for thermal conductivity, 1.869×10⁻⁷-0.72×10⁻⁸ m² s⁻¹ for thermal diffusivity and 4.316-1.977 kJ kg⁻¹ °C⁻¹ for specific heat capacity. These results agree with reports for similar products studied using DSC and commercially available line heat source probes. Empirical models were developed for each property through linear multiple regression. The data generated would be useful in the modeling and control of its processing and in equipment design.
Measurement of thermal properties of white radish (R. raphanistrum) using easily constructed probes.
Obot, Mfrekemfon Samuel; Li, Changcheng; Fang, Ting; Chen, Jinquan
2017-01-01
Thermal properties are necessary for the design and control of processes and storage facilities for food materials. This study proposes the measurement of thermal properties using easily constructed probes, with specific heat capacity calculated, as opposed to the use of a Differential Scanning Calorimeter (DSC) or other. These probes were constructed and used to measure the thermal properties of white radish in the temperature range of 80-20°C and moisture content of 91-6.1% wb. Results showed the thermal properties were within the range of 0.71-0.111 W m⁻¹ °C⁻¹ for thermal conductivity, 1.869×10⁻⁷-0.72×10⁻⁸ m² s⁻¹ for thermal diffusivity and 4.316-1.977 kJ kg⁻¹ °C⁻¹ for specific heat capacity. These results agree with reports for similar products studied using DSC and commercially available line heat source probes. Empirical models were developed for each property through linear multiple regression. The data generated would be useful in the modeling and control of its processing and in equipment design.
Bar-Eyal, Leeat; Eisenberg, Ido; Faust, Adam; Raanan, Hagai; Nevo, Reinat; Rappaport, Fabrice; Krieger-Liszkay, Anja; Sétif, Pierre; Thurotte, Adrien; Reich, Ziv; Kaplan, Aaron; Ohad, Itzhak; Paltiel, Yossi; Keren, Nir
2015-10-01
Biological desert sand crusts are the foundation of desert ecosystems, stabilizing the sands and allowing colonization by higher order organisms. The first colonizers of the desert sands are cyanobacteria. Facing the harsh conditions of the desert, these organisms must withstand frequent desiccation-hydration cycles, combined with high light intensities. Here, we characterize structural and functional modifications to the photosynthetic apparatus that enable a cyanobacterium, Leptolyngbya sp., to thrive under these conditions. Using multiple in vivo spectroscopic and imaging techniques, we identified two complementary mechanisms for dissipating absorbed energy in the desiccated state. The first mechanism involves the reorganization of the phycobilisome antenna system, increasing excitonic coupling between antenna components. This provides better energy dissipation in the antenna rather than directed exciton transfer to the reaction center. The second mechanism is driven by constriction of the thylakoid lumen which limits diffusion of plastocyanin to P700. The accumulation of P700(+) not only prevents light-induced charge separation but also efficiently quenches excitation energy. These protection mechanisms employ existing components of the photosynthetic apparatus, forming two distinct functional modes. Small changes in the structure of the thylakoid membranes are sufficient for quenching of all absorbed energy in the desiccated state, protecting the photosynthetic apparatus from photoinhibitory damage. These changes can be easily reversed upon rehydration, returning the system to its high photosynthetic quantum efficiency. Copyright © 2015 Elsevier B.V. All rights reserved.
Estimating subsoil resistance to nitrate leaching from easily measurable pedological properties
Directory of Open Access Journals (Sweden)
Fábio Keiti Nakagawa
2012-11-01
Leaching of nitrate (NO3-) can increase the groundwater concentration of this anion and reduce the agronomic effectiveness of nitrogen fertilizers. The main soil property inversely related to NO3- leaching is the anion exchange capacity (AEC), whose determination is, however, too time-consuming to be carried out in soil testing laboratories. For this reason, this study evaluated whether more easily measurable soil properties could be used to estimate the resistance of subsoils to NO3- leaching. Samples from the subsurface layer (20-40 cm) of 24 representative soils of São Paulo State were characterized for particle-size distribution and for chemical and electrochemical properties. The subsoil content of adsorbed NO3- was calculated from the difference between the NO3- contents extracted with 1 mol L-1 KCl and with water; furthermore, NO3- leaching was studied in miscible displacement experiments. The results of both the adsorption and the leaching experiments were consistent with the well-known role exerted by the AEC on nitrate behavior in weathered soils. Multiple regression analysis indicated that in subsoils with (i) low values of remaining phosphorus (Prem), (ii) low soil pH values measured in water (pH H2O), and (iii) high pH values measured in 1 mol L-1 KCl (pH KCl), the amounts of surface positive charge tend to be greater. For this reason, NO3- leaching tends to be slower in these subsoils, even under saturated flow conditions.
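A multiple linear regression of the kind mentioned in the abstract can be sketched as follows. The predictor names follow the abstract (Prem, pH H2O, pH KCl), but all numbers below are synthetic and the fitted coefficients are purely illustrative; the synthetic response is generated with the signs the abstract reports (more NO3- retention at low Prem, low pH in water, high pH in KCl):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 24                                   # e.g. one row per subsoil sample

# Hypothetical predictors.
Prem   = rng.uniform(5, 40, n)           # remaining phosphorus
pH_h2o = rng.uniform(4.0, 6.5, n)        # pH measured in water
pH_kcl = rng.uniform(3.5, 6.0, n)        # pH measured in 1 mol/L KCl

# Synthetic response standing in for adsorbed NO3-, with the reported signs.
no3_ads = 20 - 0.3 * Prem - 2.0 * pH_h2o + 2.5 * pH_kcl \
          + rng.normal(0, 0.5, n)

# Ordinary least squares via the normal equations (lstsq).
X = np.column_stack([np.ones(n), Prem, pH_h2o, pH_kcl])
coef, *_ = np.linalg.lstsq(X, no3_ads, rcond=None)
print(coef)   # intercept, then one slope per predictor
```

With real data, the signs and sizes of the slopes are what carry the pedological interpretation; here they merely recover the signs built into the synthetic response.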
Shaft seals with an easily removable cylinder holder for low-pressure steam turbines
Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.
2016-01-01
The article is devoted to problems that occur during the operation of the LPC shaft seals (SS) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of the SS and bearings, and the consequences they can entail, are considered. The existing SS housing construction types are considered and their operational features are specified. A new SS construction type with an easily removable holder is presented, and the construction of its main elements is described. A sequence of operations is proposed for repair personnel restoring the spacings of the new SS type. A comparative analysis of the new and the existing SS construction types is carried out, and the results of assessing the efficiency, the operational convenience, and the economic effect of installing the new type of seals are given. Conclusions about the prospects of the proposed construction are drawn from the comparative analysis and the assessment. The main advantage of this design is the possibility of restoring spacings both in the SS and in the oil-protecting rings during a short-term stop of a turbine, even without cooling it. The construction was successfully tested on an operating K-300-23.5 LMP turbine, and its adaptation for other turbines is quite possible.
Easily recycled Bi2O3 photocatalyst coatings prepared via ball milling followed by calcination
Cheng, Lijun; Hu, Xumin; Hao, Liang
2017-06-01
Bi2O3 photocatalyst coatings derived from Bi coatings were first prepared by a two-step method, namely ball milling followed by calcination. The as-prepared samples were characterized by XRD, SEM, XPS and UV-Vis spectroscopy. The results showed that monoclinic Bi2O3 coatings were obtained after sintering Bi coatings at 673 or 773 K, while mixed-phase monoclinic/triclinic Bi2O3 coatings were obtained at 873 or 973 K. The topographies of the samples differed markedly, varying from flower-like, irregular and polygonal shapes to nanosized particles with increasing calcination temperature. Photodegradation of malachite green under simulated solar irradiation for 180 min showed that the highest degradation efficiency, 86.2%, was achieved over the Bi2O3 photocatalyst coatings sintered at 873 K. The Bi2O3 photocatalyst coatings, encapsulated on Al2O3 balls with an average diameter of around 1 mm, are quite easily recycled, which provides an alternative visible-light-driven photocatalyst suitable for practical water treatment applications.
Solar-assisted photodegradation of isoproturon over easily recoverable titania catalysts.
Tolosana-Moranchel, A; Carbajo, J; Faraldos, M; Bahamonde, A
2017-03-01
An easily recoverable homemade TiO2 catalyst (GICA-1) has been evaluated over the overall photodegradation process, understood as photocatalytic efficiency plus the catalyst recovery step, in the solar-light-assisted photodegradation of isoproturon and its reuse in two consecutive cycles. The overall feasibility has been compared with that of commercial TiO2 P25. The homemade GICA-1 catalyst presented better sedimentation efficiency than TiO2 P25 at all studied pHs, which can be explained by its larger average hydrodynamic particle size (3 μm) and other physicochemical surface properties. Evaluation of the overall process (isoproturon photo-oxidation + catalyst recovery) revealed the strengths of the homemade GICA-1 titania catalyst: total removal of isoproturon in less than 60 min, easy recovery by sedimentation, and reusability over two consecutive cycles without any loss of photocatalytic efficiency. Therefore, considering the whole photocatalytic cycle (good photodegradation performance plus the catalyst recovery step), the homemade GICA-1 photocatalyst proved more affordable than commercial TiO2 P25. Graphical abstract.
Optimal Design of Multitype Groundwater Monitoring Networks Using Easily Accessible Tools.
Wöhling, Thomas; Geiges, Andreas; Nowak, Wolfgang
2016-11-01
Monitoring networks are expensive to establish and to maintain. In this paper, we extend an existing data-worth estimation method from the suite of PEST utilities with a global optimization method for optimal sensor placement (optimal design) in groundwater monitoring networks. The design optimization treats multiple simultaneous sensor locations and multiple sensor types together as decision variables. Our method combines linear uncertainty quantification with a modified genetic algorithm for discrete multilocation, multitype search. The efficiency of the global optimization is enhanced by an archive of past samples and by parallel computing. We demonstrate our methodology for a groundwater monitoring network at the Steinlach experimental site, south-western Germany, which was established to monitor river-groundwater exchange processes. The target of optimization is the best possible exploration for minimum variance in predicting the mean travel time of the hyporheic exchange. Our results demonstrate that the information gain of monitoring network designs can be explored efficiently, and with easily accessible tools, prior to taking new field measurements or installing additional measurement points. The proposed methods proved to be efficient and can be applied for model-based optimal design of any type of monitoring network in approximately linear systems. Our key contributions are (1) the use of easy-to-implement tools for an otherwise complex task and (2) the consideration of data-worth interdependencies in the simultaneous optimization of multiple sensor locations and sensor types. © 2016, National Ground Water Association.
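The discrete multilocation search can be pictured with a toy genetic algorithm over k-subsets of candidate sensor locations. This is a sketch under stated assumptions only: the paper's method couples PEST-based linear uncertainty quantification, a sample archive, and parallel evaluation, none of which is reproduced here, and the spread-out utility below is hypothetical.

```python
import random

def genetic_sensor_design(candidates, k, utility, generations=200,
                          pop_size=40, rng=random):
    """Toy genetic search for a k-subset of candidate locations maximizing
    a data-worth utility.  Elitist: the best half survives each generation."""
    def random_design():
        return tuple(sorted(rng.sample(candidates, k)))

    def mutate(design):
        out = list(design)
        out[rng.randrange(k)] = rng.choice(candidates)  # swap one location
        return tuple(sorted(set(out))) if len(set(out)) == k else random_design()

    pop = [random_design() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=utility, reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(rng.choice(elite)) for _ in elite]
    return max(pop, key=utility)

# Hypothetical utility: prefer designs whose sensors are spread out along a
# 1-D transect (the real objective is predictive-variance reduction).
best = genetic_sensor_design(list(range(100)), k=4,
                             utility=lambda d: min(b - a for a, b in zip(d, d[1:])))
```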
OntologyWidget – a reusable, embeddable widget for easily locating ontology terms
Directory of Open Access Journals (Sweden)
Skene JH Pate
2007-09-01
Abstract. Background: Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because few tools exist that allow one to quickly find relevant ontology terms to easily populate a web form. Results: We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an HTML (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. Conclusion: We have developed Ontology
Clearly written, easily comprehended? The readability of websites providing information on epilepsy.
Brigo, Francesco; Otte, Willem M; Igwe, Stanley C; Tezzon, Frediano; Nardone, Raffaele
2015-03-01
There is a general need for high-quality, easily accessible, and comprehensive health-care information on epilepsy to better inform the general population about this highly stigmatized neurological disorder. The aim of this study was to evaluate the health literacy level of eight popular English-language websites that provide information on epilepsy, in quantitative terms of readability. Educational epilepsy material on these websites, including 41 Wikipedia articles, was analyzed for its overall level of readability and the corresponding academic grade level needed to comprehend the published texts on a first reading. The Flesch Reading Ease (FRE) was used to assess ease of comprehension, while the Gunning Fog Index, Coleman-Liau Index, Flesch-Kincaid Grade Level, Automated Readability Index, and Simple Measure of Gobbledygook scales estimated the corresponding academic grade level needed for comprehension. The average readability of the websites yielded results indicative of a difficult-to-fairly-difficult readability level (FRE results: 44.0±8.2), with text readability corresponding to an 11th academic grade level (11.3±1.9). The average FRE score of the Wikipedia articles was indicative of a difficult readability level (25.6±9.5), with the other readability scales yielding results corresponding to a 14th grade level (14.3±1.7). Popular websites providing information on epilepsy, including Wikipedia, often demonstrate a low level of readability. This can be ameliorated by increasing access to clear and concise online information on epilepsy and health in general. Short "basic" summaries targeted at patients and nonmedical users should be added to articles published on specialist websites and Wikipedia to ease readability. Copyright © 2014 Elsevier Inc. All rights reserved.
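The FRE scale used above rewards short sentences and short words: FRE = 206.835 - 1.015(words/sentences) - 84.6(syllables/words). A sketch of the computation (the syllable counter below is a crude vowel-group heuristic of our own; production readability tools use pronunciation dictionaries):

```python
import re

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text; scores below ~50 read as 'difficult'."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))

    def syllables(word: str) -> int:
        # Crude heuristic: count runs of vowels as syllables, minimum one.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    n_syllables = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (n_syllables / n_words)
```

Short plain sentences score far above long polysyllabic ones, which is why jargon-heavy epilepsy articles land in the 'difficult' band.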
Easily exchangeable x-ray mirrors and hybrid monochromator modules a study of their performance
Energy Technology Data Exchange (ETDEWEB)
Lin, Fan [Philips Analytical, Asia Pacific, Toa Payoh (Singapore)]; Kogan, V. [Philips Analytical, EA Almelo (Netherlands)]; Saito, K. [Philips Analytical, Tokyo (Japan)]
1999-12-01
Full text: PreFix prealigned optical mounts allowing rapid and easy changeover will be presented. The benefits of laterally graded multilayer X-ray mirrors coupled with these PreFix mounts (conversion of a divergent beam to a parallel beam, an increase in intensity by a factor of 3-7, monochromation to α1 and α2, and a dynamic range of 10^4-10^5 cps) will be demonstrated in areas such as thin-film and powder analysis. Data will be shown for a diffraction profile of a thin film (Cr/SiO2) with and without a mirror, and for Si powder with and without a mirror. Further enhancement will be demonstrated by combining a channel-cut monochromator-collimator with an X-ray mirror to produce a high-intensity, parallel, pure Cu Kα1 beam with an intensity of up to 4.5 x 10^8 cps and a divergence down to 0.01 deg. The applicability to various techniques, ranging from high-resolution work to thin-film/reflectivity measurements, Rietveld structural refinement and phase analysis, will be shown. The rocking curve of a HEMT 10 nm InGaAs layer on InP will be presented using various 'standard' optics and hybrid optics, along with Si powder and Rietveld refinements of CuSO4·5H2O and aspirin. A comparison of the benefits and applications of X-ray mirrors and hybrid mirror/monochromators will be given. The data presented will show that by using X-ray mirrors and hybrid modules the performance of standard 'laboratory' diffractometers can be greatly enhanced to a level previously unachievable, with great practical benefits. Copyright (1999) Australian X-ray Analytical Association Inc.
Wang, Mengmeng; Gao, Yuzhen; Feng, Huijuan; Warner, Elisa; An, Mingrui; Jia, Jian'an; Chen, Shipeng; Fang, Meng; Ji, Jun; Gu, Xing; Gao, Chunfang
2018-03-01
Intrahepatic cholangiocarcinoma (ICC) and hepatocellular carcinoma (HCC) are the most prevalent histologic types of primary liver cancer (PLC). Although ICC and HCC share similar risk factors and clinical manifestations, ICC usually bears poorer prognosis than HCC. Confidently discriminating ICC and HCC before surgery is beneficial to both treatment and prognosis. Given the lack of effective differential diagnosis biomarkers and methods, construction of models based on available clinicopathological characteristics is in need. Nomograms present a simple and efficient way to make a discrimination. A total of 2894 patients who underwent surgery for PLC were collected. Of these, 1614 patients formed the training cohort for nomogram construction, and thereafter, 1280 patients formed the validation cohort to confirm the model's performance. Histopathologically confirmed ICC was diagnosed in 401 (24.8%) and 296 (23.1%) patients in these two cohorts, respectively. A nomogram integrating six easily obtained variables (Gender, Hepatitis B surface antigen, Aspartate aminotransferase, Alpha-fetoprotein, Carcinoembryonic antigen, Carbohydrate antigen 19-9) is proposed in accordance with Akaike's Information Criterion (AIC). A score of 15 was determined as the cut-off value, and the corresponding discrimination efficacy was sufficient. Additionally, patients who scored higher than 15 suffered poorer prognosis than those with lower scores, regardless of the subtype of PLC. A nomogram for clinical discrimination of ICC and HCC has been established, where a higher score indicates ICC and poor prognosis. Further application of this nomogram in multicenter investigations may confirm the practicality of this tool for future clinical use. © 2018 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Eric Denion
We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite, non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Sclerotic scatter is thus more easily elicited under mesopic conditions than under photopic conditions, and it requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast.
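The reported log-log fit implies a power law between the two illuminances: if a 10% rise in ALI yields a 28.9% rise in the LLIT, the exponent is log(1.289)/log(1.10). A sketch of that arithmetic, plus the censored linear model for mesopic levels (function names are ours; only the fitted constants come from the abstract):

```python
import math

# Power-law exponent implied by the log-log fit: LLIT ∝ ALI**b,
# since a factor of 1.10 in ALI maps to a factor of 1.289 in LLIT.
b = math.log(1.289) / math.log(1.10)
print(round(b, 2))  # ≈ 2.66

def llit_mesopic(ali_lux: float) -> float:
    """Censored linear model from the abstract: LLIT = 34.4 * ALI,
    fitted only for mesopic ambient levels (0 < ALI <= 40 lux)."""
    if not 0 < ali_lux <= 40:
        raise ValueError("model fitted only for 0 < ALI <= 40 lux")
    return 34.4 * ali_lux
```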
Zerth, Herb; Harwood, Robert; Tommaso, Laura; Girzadas, Daniel V
2012-12-01
Pericardiocentesis is a low-frequency, high-risk procedure integral to the practice of emergency medicine. Ultrasound-guided pericardiocentesis is the preferred technique for providing this critical intervention. Traditionally, emergency physicians learned pericardiocentesis in real time, at the bedside, on critically ill patients. Medical education is moving toward simulation for training and assessment of procedures such as pericardiocentesis because it allows learners to practice time-sensitive skills without risk to patient or learner. The retail market for models for pericardiocentesis practice is limited and expensive. We have developed an ultrasound-guided pericardiocentesis task trainer that allows the physician to insert a needle under ultrasound guidance, pierce the "pericardial sac" and aspirate "blood." Our model can be simply constructed in a home kitchen, and the overall preparation time is 1 h. Our model costs $20.00 (US, 2008). Materials needed for the construction include 16 ounces of plain gelatin, one large balloon, one golf ball, food coloring, non-stick cooking spray, one wooden cooking skewer, surgical iodine solution, and a 4-quart sized plastic food storage container. Refrigeration and a heat source for cooking are also required. Once prepared, the model is usable for 2 weeks at room temperature and may be preserved an additional week if refrigerated. When the model shows signs of wear, it can be easily remade, by simply recycling the existing materials. The self-made model was well liked by training staff due to accessibility of a simulation model, and by learners of the technique as they felt more at ease performing pericardiocentesis on a live patient. Copyright © 2012 Elsevier Inc. All rights reserved.
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including its geometrical aspects. A short introduction to probability and statistics then lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics: because of its general approach, the field bridges several areas such as the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics describes predominantly averaged properties of matter, ranging from few-particle systems and states of matter to stellar objects. Statistical thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of many-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
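The diagnostic-accuracy measures discussed above all derive from the 2x2 confusion table. A minimal sketch (the counts in the example are hypothetical, chosen only for illustration):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic-test summary statistics from a 2x2 table."""
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
    return {"sensitivity": sensitivity, "specificity": specificity,
            "accuracy": accuracy, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical counts: 90 true positives, 10 false negatives,
# 20 false positives, 80 true negatives.
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
print(m["sensitivity"], m["specificity"])  # → 0.9 0.8
```

An LR+ well above 1 (here 4.5) means a positive result meaningfully raises the post-test probability of disease.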
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Waller, Derek L
2008-01-01
Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and
Statistics As Principled Argument
Abelson, Robert P
2012-01-01
In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative
Neave, Henry R
2012-01-01
This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well known in parts of industry but less so in academia, to analysing and interpreting process data
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
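One widely used statistical criterion for limiting incorrect peptide identifications is target-decoy false discovery rate (FDR) estimation. Whether the chapter prescribes exactly this recipe is not stated here, so treat the sketch as a generic illustration with made-up scores:

```python
def target_decoy_fdr(target_scores, decoy_scores, threshold):
    """Estimate the FDR at a score threshold via the target-decoy approach:
    FDR ~ (decoy hits passing threshold) / (target hits passing threshold)."""
    passing_targets = sum(s >= threshold for s in target_scores)
    passing_decoys = sum(s >= threshold for s in decoy_scores)
    return passing_decoys / passing_targets if passing_targets else 0.0

# Illustrative scores only: at threshold 50, four targets and one decoy pass,
# so roughly one in four accepted identifications is expected to be wrong.
targets = [72, 65, 58, 51, 44, 30]
decoys = [52, 38, 21]
print(target_decoy_fdr(targets, decoys, 50))  # → 0.25
```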
Conformity and statistical tolerancing
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty designers have in anticipating the risks of the approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
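The arithmetic-versus-quadratic contrast can be made concrete with the two classic stack-up rules: worst case simply adds the component tolerances, while the quadratic (root-sum-of-squares) rule assumes independent, centred variation. A sketch with illustrative tolerances of our own choosing:

```python
import math

def arithmetic_stack(tolerances):
    """Worst-case (arithmetic) stack: component tolerances simply add."""
    return sum(tolerances)

def statistical_stack(tolerances):
    """Quadratic (RSS) stack: assumes independent, centred component variation."""
    return math.sqrt(sum(t * t for t in tolerances))

tols = [0.1, 0.1, 0.1, 0.1]            # four components, ±0.1 mm each
print(round(arithmetic_stack(tols), 3))  # → 0.4 (worst case)
print(round(statistical_stack(tols), 3)) # → 0.2 (statistical)
```

The statistical stack is tighter, which is exactly why it is attractive to designers and why, as the paper argues, its conformity interpretation needs care.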
The emergent Copenhagen interpretation of quantum mechanics
Hollowood, Timothy J.
2014-05-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR-Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems.
The emergent Copenhagen interpretation of quantum mechanics
International Nuclear Information System (INIS)
Hollowood, Timothy J
2014-01-01
We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. This interpretation describes a world in which definite measurement results are obtained with probabilities that reproduce the Born rule. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent book keeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations like macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and that a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint probabilities for the ergodic subsets of states of disjoint macro-systems only arise as emergent quantities. Finally we give an account of the EPR–Bohm thought experiment and show that the interpretation implies the violation of the Bell inequality characteristic of quantum mechanics but in a way that is rather novel. The final conclusion is that the Copenhagen interpretation gives a completely satisfactory phenomenology of macro-systems interacting with micro-systems. (paper)
Effect size, confidence intervals and statistical power in psychological research.
Directory of Open Access Journals (Sweden)
Téllez A.
2015-07-01
Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool used by researchers to evaluate hypotheses and make decisions to accept or reject them. In this paper, the various statistical tools used in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect sizes and their respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements can also facilitate data interpretation and the detection of trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and to ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must attend to the teaching of statistics at the graduate level: students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
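As a worked illustration of the effect-size-plus-interval reporting recommended above, here is a sketch of Cohen's d for two independent groups with an approximate confidence interval. The variance formula is the common large-sample normal approximation, not the exact noncentral-t interval, and the function name is ours:

```python
import math
from statistics import mean, stdev

def cohens_d_ci(x, y, z=1.96):
    """Cohen's d for two independent samples, with an approximate 95% CI
    based on the large-sample variance of d."""
    n1, n2 = len(x), len(y)
    s_pooled = math.sqrt(((n1 - 1) * stdev(x) ** 2 + (n2 - 1) * stdev(y) ** 2)
                         / (n1 + n2 - 2))
    d = (mean(x) - mean(y)) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2)))
    return d, (d - z * se, d + z * se)

d, ci = cohens_d_ci([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
print(round(d, 2))  # ≈ -0.63
```

When the interval excludes zero, a two-sided NHST at the matching alpha would also reject, which is the sense in which the two reporting styles carry the same information.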
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
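A Poisson process with harmonic intensity c/t on the positive half-line can be simulated by the standard time-change trick: unit-rate Poisson arrivals s_k mapped through the inverse cumulative intensity t = t0*exp(s/c), since the cumulative intensity over [t0, t] is c*ln(t/t0). A sketch under that assumption (the function name and parameters are ours, not the paper's):

```python
import math
import random

def harmonic_poisson(c: float, t0: float, t1: float, rng=random):
    """Sample the points of a Poisson process with intensity c/t on [t0, t1].
    Unit-rate arrivals are generated from exponential gaps and mapped through
    the inverse cumulative intensity t = t0 * exp(s / c)."""
    points, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)     # next unit-rate Poisson arrival
        t = t0 * math.exp(s / c)
        if t > t1:
            return points
        points.append(t)

pts = harmonic_poisson(c=2.0, t0=1.0, t1=100.0)
# Expected number of points on [t0, t1] is c * ln(t1/t0), here about 9.2.
```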
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Reading Statistics And Research
Akbulut, Reviewed By Yavuz
2008-01-01
The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis...
Statistical methods in quality assurance
International Nuclear Information System (INIS)
Eckhard, W.
1980-01-01
During the different phases of a production process (planning, development and design, manufacturing, assembling, etc.) most decisions rest on a basis of statistics: the collection, analysis and interpretation of data. Statistical methods can be thought of as a kit of tools that help solve problems in the quality functions of the quality loop, with the aim of producing quality products and reducing quality costs. Various statistical methods are presented, and typical examples of their practical application are demonstrated. (RW)
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2006-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis...
Wilson, Donald A
2014-01-01
Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh
Statistical concepts a second course
Lomax, Richard G
2012-01-01
Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical anal- ysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
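The measures of location and spread that this chapter covers can be illustrated with a minimal Python sketch (the data values below are invented for illustration; the chapter itself is not tied to any language):

```python
import statistics

# A small made-up sample of quantitative measurements
data = [2.3, 3.1, 4.8, 5.0, 6.2, 7.7]

mean = statistics.mean(data)      # measure of location
median = statistics.median(data)  # robust measure of location
sd = statistics.stdev(data)       # measure of spread (sample SD, n - 1 denominator)

print(f"mean={mean:.2f} median={median:.2f} sd={sd:.2f}")
```

Note that `statistics.stdev` uses the sample (n - 1) denominator; `statistics.pstdev` gives the population version.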
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Linguistics in Text Interpretation
DEFF Research Database (Denmark)
Togeby, Ole
2011-01-01
A model for how text interpretation proceeds from what is pronounced, through what is said, to what is communicated, and a definition of the concepts 'presupposition' and 'implicature'.
Statistical interpretation of low energy nuclear level schemes
Energy Technology Data Exchange (ETDEWEB)
Egidy, T von; Schmidt, H H; Behkami, A N
1988-01-01
Nuclear level schemes and neutron resonance spacings yield information on level densities and level spacing distributions. A total of 75 nuclear level schemes with 1761 levels and known spins and parities was investigated. The A-dependence of level density parameters is discussed. The spacing distributions of levels near the ground state indicate transitional character between regular and chaotic properties, while chaos dominates near the neutron binding energy.
Statistical interpretation of WEBNET seismograms by artificial neural nets
Czech Academy of Sciences Publication Activity Database
Plešinger, Axel; Růžek, Bohuslav; Boušková, Alena
2000-01-01
Roč. 44, č. 2 (2000), s. 251-271 ISSN 0039-3169 R&D Projects: GA AV ČR IAA312104; GA ČR GA205/99/0907 Institutional research plan: CEZ:AV0Z3012916 Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.761, year: 2000
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of an infinite sampling often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political Arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).
Topics in statistical data analysis for high-energy physics
International Nuclear Information System (INIS)
Cowan, G.
2011-01-01
These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability not only to cover the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
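The Bayesian step described above, attaching a degree of belief to a hypothesis and updating it with data, reduces to Bayes' theorem. A minimal numerical sketch follows; the prior and likelihood values are invented for illustration, not taken from the lectures:

```python
# Prior degree of belief that the signal hypothesis H is true (assumed value)
p_h = 0.5
# Assumed likelihoods of the observed data under H and under background-only
p_data_h = 0.8
p_data_bg = 0.2

# Bayes' theorem: P(H | data) = P(data | H) P(H) / P(data)
p_data = p_data_h * p_h + p_data_bg * (1 - p_h)
posterior = p_data_h * p_h / p_data

print(f"posterior belief in H: {posterior:.2f}")
```

With these numbers the posterior rises from 0.5 to 0.8; the same update rule underlies the hypothesis probabilities discussed in the lectures.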
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Groen-Blokhuis, Maria M; Middeldorp, Christel M; M van Beijsterveldt, Catharina E; Boomsma, Dorret I
2011-10-01
In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate genetic analysis was performed using structural equation modeling in the Mx software package. The influence of maternal personality characteristics and demographic and lifestyle factors was tested to identify specific risk factors that may underlie the shared environment of twins. Furthermore, it was tested whether crying without a cause and being easily upset were predictive of later internalizing, externalizing and attention problems. Crying without a cause yielded a heritability estimate of 60% in boys and girls. For easily upset, the heritability was estimated at 43% in boys and 31% in girls. The variance explained by shared environment varied between 35% and 63%. The correlation between crying without a cause and easily upset (r = .36) was explained both by genetic and shared environmental factors. Birth cohort, gestational age, socioeconomic status, parental age, parental smoking behavior and alcohol use during pregnancy did not explain the shared environmental component. Neuroticism of the mother explained a small proportion of the additive genetic, but not of the shared environmental effects for easily upset. Crying without a cause and being easily upset at age 2 were predictive of internalizing, externalizing and attention problems at age 7, with effect sizes of .28-.42. A large influence of shared environmental factors on crying without a cause and easily upset was detected. Although these effects could be specific to these items, we could not explain them by personality characteristics of the mother or by demographic and lifestyle factors, and we recognize that these effects may reflect other maternal characteristics. A substantial influence of genetic factors
Wilhelm Wundt's Theory of Interpretation
Directory of Open Access Journals (Sweden)
Jochen Fahrenberg
2008-09-01
Wilhelm WUNDT was a pioneer in experimental and physiological psychology. However, his theory of interpretation (hermeneutics) remains virtually neglected. According to WUNDT, psychology belongs to the domain of the humanities (Geisteswissenschaften), and, throughout his books and research, he advocated two basic methodologies: experimentation (as the means of controlled self-observation) and interpretative analysis of mental processes and products. He was an experimental psychologist and a profound expert in traditional hermeneutics. Today, he may still be acknowledged as the author of the monumental Völkerpsychologie, but not for his advances in epistemology and methodology. His subsequent work, the Logik (1908/1921), contains about 120 pages on hermeneutics. In the present article a number of issues are addressed. Noteworthy were WUNDT's general intention to account for the logical constituents and the psychological process of understanding, and his reflections on quality control. In general, WUNDT demanded methodological pluralism and a complementary approach to the study of consciousness and neurophysiological processes. In the present paper WUNDT's approach is related to the continuing controversy on basic issues in methodology, e.g. experimental and statistical methods vs. qualitative (hermeneutic) methods. Varied explanations are given for the one-sided or distorted reception of WUNDT's methodology. Presently, in Germany the basic program of study in psychology lacks thorough teaching and training in qualitative (hermeneutic) methods. Appropriate courses are not included in the curricula, in contrast to the training in experimental design, observation methods, and statistics. URN: urn:nbn:de:0114-fqs0803291
Design research in statistics education : on symbolizing and computer tools
Bakker, A.
2004-01-01
The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the exam system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
DEFF Research Database (Denmark)
Agerbo, Heidi
2017-01-01
Approximately a decade ago, it was suggested that a new function should be added to the lexicographical function theory: the interpretive function(1). However, hardly any research has been conducted into this function, and though it was only suggested that this new function was relevant... to incorporate into lexicographical theory, some scholars have since then assumed that this function exists(2), including the author of this contribution. In Agerbo (2016), I present arguments supporting the incorporation of the interpretive function into the function theory and suggest how non-linguistic signs... can be treated in specific dictionary articles. However, in the current article, due to the results of recent research, I argue that the interpretive function should not be considered an individual main function. The interpretive function, contrary to some of its definitions, is not connected...
Cytological artifacts masquerading interpretation
Directory of Open Access Journals (Sweden)
Khushboo Sahay
2013-01-01
Conclusions: In order to justify a cytosmear interpretation, a cytologist must be well acquainted with delayed-fixation-induced cellular changes and the microscopic appearances of common contaminants, so as to support better prognosis and therapy.
Schrodinger's mechanics interpretation
Cook, David B
2018-01-01
The interpretation of quantum mechanics has been in dispute for nearly a century with no sign of a resolution. Using a careful examination of the relationship between the final form of classical particle mechanics (the Hamilton-Jacobi equation) and Schrödinger's mechanics, this book presents a coherent way of addressing the problems and paradoxes that emerge through conventional interpretations. Schrödinger's Mechanics critiques the popular way of giving physical interpretation to the various terms in perturbation theory and other technologies and places an emphasis on development of the theory and not on an axiomatic approach. When this interpretation is made, the extension of Schrödinger's mechanics in relation to other areas, including spin, relativity and fields, is investigated and new conclusions are reached.
Normative interpretations of diversity
DEFF Research Database (Denmark)
Lægaard, Sune
2009-01-01
Normative interpretations of particular cases consist of normative principles or values coupled with social theoretical accounts of the empirical facts of the case. The article reviews the most prominent normative interpretations of the Muhammad cartoons controversy over the publication of drawings... of the Prophet Muhammad in the Danish newspaper Jyllands-Posten. The controversy was seen as a case of freedom of expression, toleration, racism, (in)civility and (dis)respect, and the article notes different understandings of these principles and how the application of them to the controversy implied different... social theoretical accounts of the case. In disagreements between different normative interpretations, appeals are often made to the 'context', so it is also considered what roles 'context' might play in debates over normative interpretations...
Principles of radiological interpretation
International Nuclear Information System (INIS)
Rowe, L.J.; Yochum, T.R.
1987-01-01
Conventional radiographic procedures (plain film) are the most frequently utilized imaging modality in the evaluation of the skeletal system. This chapter outlines the essentials of skeletal imaging, anatomy, physiology, and interpretation
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Phillips, Richard L.; Chang, Kyu Hyun; Friedler, Sorelle A.
2017-01-01
Active learning has long been a topic of study in machine learning. However, as increasingly complex and opaque models have become standard practice, the process of active learning, too, has become more opaque. There has been little investigation into interpreting what specific trends and patterns an active learning strategy may be exploring. This work expands on the Local Interpretable Model-agnostic Explanations framework (LIME) to provide explanations for active learning recommendations. W...
Interpreter-mediated dentistry.
Bridges, Susan; Drew, Paul; Zayts, Olga; McGrath, Colman; Yiu, Cynthia K Y; Wong, H M; Au, T K F
2015-05-01
The global movements of healthcare professionals and patient populations have increased the complexities of medical interactions at the point of service. This study examines interpreter mediated talk in cross-cultural general dentistry in Hong Kong where assisting para-professionals, in this case bilingual or multilingual Dental Surgery Assistants (DSAs), perform the dual capabilities of clinical assistant and interpreter. An initial language use survey was conducted with Polyclinic DSAs (n = 41) using a logbook approach to provide self-report data on language use in clinics. Frequencies of mean scores using a 10-point visual analogue scale (VAS) indicated that the majority of DSAs spoke mainly Cantonese in clinics and interpreted for postgraduates and professors. Conversation Analysis (CA) examined recipient design across a corpus (n = 23) of video-recorded review consultations between non-Cantonese speaking expatriate dentists and their Cantonese L1 patients. Three patterns of mediated interpreting indicated were: dentist designated expansions; dentist initiated interpretations; and assistant initiated interpretations to both the dentist and patient. The third, rather than being perceived as negative, was found to be framed either in response to patient difficulties or within the specific task routines of general dentistry. The findings illustrate trends in dentistry towards personalized care and patient empowerment as a reaction to product delivery approaches to patient management. Implications are indicated for both treatment adherence and the education of dental professionals. Copyright © 2015 Elsevier Ltd. All rights reserved.
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
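The coin-toss example in the fragment above turns on simulating a fair (or biased) toss from a uniform random number on [0, 1). A minimal Python sketch, not taken from the series itself (the function name and the 0.5 threshold for a fair toss are illustrative assumptions):

```python
import random

def toss(p_heads=0.5):
    """One (possibly biased) coin toss: heads iff a uniform draw on [0, 1) is below p_heads."""
    return "heads" if random.random() < p_heads else "tails"

random.seed(1)                 # fix the seed so the simulation is reproducible
tosses = [toss() for _ in range(1000)]
print(tosses.count("heads"))   # roughly 500 for a fair coin
```

A toss "may of course not be fair" simply by choosing p_heads different from 0.5.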
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.
Harrigan, George G; Harrison, Jay M
2012-01-01
New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to describe more clearly the influence of the numerous sources of compositional variation, we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA), with its emphasis on visualization and graphics-based approaches, and ii) Bayesian statistical methodology, which provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.
Statistical analysis with Excel for dummies
Schmuller, Joseph
2013-01-01
Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
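The Borda rule mentioned in this abstract can be sketched in a few lines. The following Python snippet is an illustrative implementation only (the function name `borda_winner` and the toy ballot profile are our own), not the estimation procedure studied in the paper:

```python
from collections import defaultdict

def borda_winner(rankings):
    """Return the Borda-count winner given voters' rankings.

    Each ranking lists candidates from most to least preferred; a
    candidate scores one point per candidate ranked below it.
    """
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, candidate in enumerate(ranking):
            scores[candidate] += n - 1 - position
    return max(scores, key=scores.get)

# Three voters ranking candidates a, b, c:
ballots = [["a", "b", "c"], ["b", "a", "c"], ["a", "c", "b"]]
print(borda_winner(ballots))  # a scores 5, b scores 3, c scores 1
```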
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Groen-Blokhuis, Maria M.; Middeldorp, Christel M.; van Beijsterveldt, Catharina E.; Boomsma, Dorret I.
2011-01-01
In order to estimate the influence of genetic and environmental factors on 'crying without a cause' and 'being easily upset' in 2-year-old children, a large twin study was carried out. Prospective data were available for ~18,000 2-year-old twin pairs from the Netherlands Twin Register. A bivariate
Mesci, Gunkut; Schwartz, Renee' S.
2017-01-01
The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and…
International Nuclear Information System (INIS)
Altarelli, Fabrizio; Monasson, Remi; Zamponi, Francesco
2007-01-01
For large clause-to-variable ratios, typical K-SAT instances drawn from the uniform distribution have no solution. We argue, based on statistical mechanics calculations using the replica and cavity methods, that rare satisfiable instances from the uniform distribution are very similar to typical instances drawn from the so-called planted distribution, where instances are chosen uniformly between the ones that admit a given solution. It then follows, from a recent article by Feige, Mossel and Vilenchik (2006 Complete convergence of message passing algorithms for some satisfiability problems Proc. Random 2006 pp 339-50), that these rare instances can be easily recognized (in O(log N) time and with probability close to 1) by a simple message-passing algorithm
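As a rough illustration of the planted distribution described above, the following Python sketch fixes a hidden assignment and keeps only clauses that it satisfies; the instance is therefore satisfiable by construction. Names and instance sizes are our own choices, not from the paper:

```python
import random

def planted_3sat(n_vars, n_clauses, seed=0):
    """Draw a satisfiable 3-SAT instance from the planted distribution:
    fix a hidden assignment, then keep only clauses it satisfies."""
    rng = random.Random(seed)
    hidden = [rng.choice([True, False]) for _ in range(n_vars)]
    clauses = []
    while len(clauses) < n_clauses:
        vars_ = rng.sample(range(n_vars), 3)          # three distinct variables
        clause = [(v, rng.choice([True, False])) for v in vars_]
        # a literal (v, sign) is satisfied when hidden[v] == sign
        if any(hidden[v] == sign for v, sign in clause):
            clauses.append(clause)
    return hidden, clauses

hidden, clauses = planted_3sat(20, 100)
# every clause is satisfied by the hidden (planted) assignment
assert all(any(hidden[v] == s for v, s in c) for c in clauses)
```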
Conjunctive interpretations of disjunctions
Directory of Open Access Journals (Sweden)
Robert van Rooij
2010-09-01
Full Text Available In this extended commentary I discuss the problem of how to account for "conjunctive" readings of some sentences with embedded disjunctions for globalist analyses of conversational implicatures. Following Franke (2010, 2009), I suggest that earlier proposals failed because they did not take into account the interactive reasoning of what else the speaker could have said, and how else the hearer could have interpreted the (alternative) sentence(s). I show how Franke's idea relates to more traditional pragmatic interpretation strategies. doi:10.3765/sp.3.11
Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo
2015-04-01
A real opportunity and challenge for hazard mapping is offered by the use of smartphones and the low-cost, flexible photogrammetric technique known as 'Structure-from-Motion' (SfM). Unlike traditional photogrammetric methods, SfM reconstructs three-dimensional geometries (Digital Surface Models, DSMs) from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take a large number of pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help quantify the volumes of material eroded by landslides and recognize the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark against which to compare the SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique, through smartphones, really offers a fast, simple and affordable alternative to lidar
Measurement and statistics for teachers
Van Blerkom, Malcolm
2008-01-01
Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available. Comprehensive and accessible, Measurement and Statistics for Teachers includes: Short vignettes showing concepts in action Numerous classroom examples Highlighted vocabulary Boxes summarizing related concepts End-of-chapter exercises and problems Six full chapters devoted to the essential topic of Classroom Tests Instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests A five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur...
International Nuclear Information System (INIS)
Luo Chuanwen; Wang Gang; Wang Chuncheng; Wei Junjie
2009-01-01
The concepts of uniform index and expectation uniform index are two mathematical descriptions of the uniformity and the mean uniformity of a finite set in a polyhedron. The concepts of instantaneous chaometry (ICM) and k step chaometry (k SCM) are introduced in order to apply statistical methods to the study of nonlinear difference equations. It is found that k step chaometry is an indirect estimate of the expectation uniform index. The simulations illustrate that the expectation uniform index increases linearly with parameter b for the Lorenz system, but nonlinearly for Chen's system. In other words, the orbits of each system become more and more uniform as parameter b increases. Finally, a conjecture is brought forward, which implies that chaos can be interpreted through the mean uniformity of its orbits, as described by the expectation uniform index and indirectly estimated by k SCM. The k SCM of the heart rate reflects the enfeeblement and aging of the heart.
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Quantum Statistics and Entanglement Problems
Trainor, L. E. H.; Lumsden, Charles J.
2002-01-01
Interpretations of quantum measurement theory have been plagued by two questions, one concerning the role of observer consciousness and the other the entanglement phenomenon arising from the superposition of quantum states. We emphasize here the remarkable role of quantum statistics in describing the entanglement problem correctly and discuss the relationship to issues arising from current discussions of intelligent observers in entangled, decohering quantum worlds.
An introduction to medical statistics
International Nuclear Information System (INIS)
Hilgers, R.D.; Bauer, P.; Scheiber, V.; Heitmann, K.U.
2002-01-01
This textbook teaches all aspects and methods of biometrics as a field of concentration in medical education. Instrumental interpretations of the theory, concepts and terminology of medical statistics are enhanced by numerous illustrations and examples. With problems, questions and answers. (orig./CB) [de
Statistics Poster Challenge for Schools
Payne, Brad; Freeman, Jenny; Stillman, Eleanor
2013-01-01
The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.
Interpreting & Biomechanics. PEPNet Tipsheet
PEPNet-Northeast, 2001
2001-01-01
Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…
Tokens: Facts and Interpretation.
Schmandt-Besserat, Denise
1986-01-01
Summarizes some of the major pieces of evidence concerning the archeological clay tokens, specifically the technique for their manufacture, their geographic distribution, chronology, and the context in which they are found. Discusses the interpretation of tokens as the first example of visible language, particularly as an antecedent of Sumerian…
DEFF Research Database (Denmark)
Hauschild, Michael Z.; Bonou, Alexandra; Olsen, Stig Irving
2018-01-01
The interpretation is the final phase of an LCA where the results of the other phases are considered together and analysed in the light of the uncertainties of the applied data and the assumptions that have been made and documented throughout the study. This chapter teaches how to perform an inte...
Interpretations of Greek Mythology
Bremmer, Jan
1987-01-01
This collection of original studies offers new interpretations of some of the best known characters and themes of Greek mythology, reflecting the complexity and fascination of the Greek imagination. Following analyses of the concept of myth and the influence of the Orient on Greek mythology, the
Translation, Interpreting and Lexicography
DEFF Research Database (Denmark)
Dam, Helle Vrønning; Tarp, Sven
2018-01-01
in the sense that their practice fields are typically ‘about something else’. Translators may, for example, be called upon to translate medical texts, and interpreters may be assigned to work on medical speeches. Similarly, practical lexicography may produce medical dictionaries. In this perspective, the three...
READING STATISTICS AND RESEARCH
Directory of Open Access Journals (Sweden)
Reviewed by Yavuz Akbulut
2008-10-01
Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help the readers to get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter to conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition published by Sage in 2005).
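As a reminder of what the skewness and kurtosis indices discussed in the review measure, here is a minimal moment-based Python sketch (the helper names and the toy sample are ours, not from the book):

```python
def skewness(xs):
    """Third standardized moment: > 0 means a longer right tail."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3: 0 for a normal distribution."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3

data = [1, 2, 2, 3, 3, 3, 4, 10]   # one large value drags the right tail out
print(skewness(data))              # positive: right-skewed sample
```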
Applied statistics in ecology: common pitfalls and simple solutions
E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick
2013-01-01
The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...
Energy Technology Data Exchange (ETDEWEB)
Viebach, Dieter
2010-07-01
Following an easily comprehensible description of the function and characteristics of Stirling engines, the author of the book under consideration describes the construction of a model Stirling engine on the basis of clear construction drawings. A delicacy for experienced modelers: the 'amazing model', a miniature Stirling engine made from beverage cans, runs on the warmth of the human hand. Even for this technically demanding model, the construction is described accurately in detailed construction drawings.
Personal literary interpretation
Directory of Open Access Journals (Sweden)
Michał Januszkiewicz
2015-11-01
Full Text Available The article titled "Personal literary interpretation" deals with problems which have usually been marginalized in literary studies, but which seem to be very important in the context of the humanities, as broadly defined. The author of this article intends to rethink the problem of literary studies not in objective, but in personal terms. This is why the author wants to talk about what he calls personal literary interpretation, which has nothing to do with subjective or irrational thinking, but which is rather grounded in the hermeneutical rule that says that one must believe in order to understand a text or the other (where 'believe' also means: 'to love', 'engage', and 'be open'). The article presents different determinants of this attitude, ranging from Dilthey to Heidegger and Gadamer. Finally, the author subscribes to the theory of personal interpretation, which is always dialogical.
Interpretation and clinical applications
International Nuclear Information System (INIS)
Higgins, C.B.
1987-01-01
This chapter discusses the factors to be kept in mind during routine interpretation of MR images. This includes the factors that determine contrast on standard spin-echo images and some distinguishing features between true lesions and artifactually simulated lesions. This chapter also indicates the standard protocols for MRI of various portions of the body. Finally, the current indications for MRI of various portions of the body are suggested; however, it is recognized that the indications for MRI are rapidly increasing and consequently, at the time of publication of this chapter, it is likely that many more applications will have become evident. Interpretation of magnetic resonance (MR) images requires consideration of anatomy and tissue characteristics and extraction of artifacts resulting from motion and other factors
Gianni Vattimo
2013-01-01
Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietz...
Directory of Open Access Journals (Sweden)
Gianni Vattimo
2013-01-01
Full Text Available Gianni Vattimo, who is both a Catholic and a frequent critic of the Church, explores the surprising congruence between Christianity and hermeneutics in light of the dissolution of metaphysical truth. As in hermeneutics, Vattimo claims, interpretation is central to Christianity. Influenced by hermeneutics and borrowing largely from the Nietzschean and Heideggerian heritage, the Italian philosopher, who has been instrumental in promoting a nihilistic approach to Christianity, draws here on Nietzsche’s writings on nihilism, which is not to be understood in a purely negative sense. Vattimo suggests that nihilism not only expands the Christian message of charity, but also transforms it into its endless human potential. In “The Age of Interpretation,” the author shows that hermeneutical radicalism “reduces all reality to message,” so that the opposition between facts and norms turns out to be misguided, for both are governed by the interpretative paradigms through which someone (always a concrete, historically situated someone) makes sense of them. Vattimo rejects some of the deplorable political consequences of hermeneutics and claims that traditional hermeneutics is in collusion with various political-ideological neutralizations.
Permutation statistical methods an integrated approach
Berry, Kenneth J; Johnston, Janis E
2016-01-01
This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
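The monograph's core idea, that permutation methods depend only on the data at hand, can be sketched as a two-sample permutation test. This is an illustrative Python implementation of the general technique (the names and the toy data are ours, not taken from the book):

```python
import random

def permutation_test(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test for a difference in means.

    Unlike a classical t test, no theoretical distribution is assumed:
    group labels are shuffled to build the null distribution of the
    absolute mean difference.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        xp, yp = pooled[:len(x)], pooled[len(x):]
        diff = abs(sum(xp) / len(xp) - sum(yp) / len(yp))
        if diff >= observed:
            count += 1
    return count / n_perm  # two-sided p value

x = [12.1, 14.3, 13.8, 15.2, 14.9]
y = [10.2, 11.1, 10.8, 11.9, 11.4]
p = permutation_test(x, y)
print(p)  # small: the group labels clearly matter here
```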
Multimodal integration in statistical learning
DEFF Research Database (Denmark)
Mitchell, Aaron; Christiansen, Morten Hyllekvist; Weiss, Dan
2014-01-01
Recent advances in the field of statistical learning have established that learners are able to track regularities of multimodal stimuli, yet it is unknown whether the statistical computations are performed on integrated representations or on separate, unimodal representations. In the present study, we investigated the ability of adults to integrate audio and visual input during statistical learning. We presented learners with a speech stream synchronized with a video of a speaker’s face. In the critical condition, the visual (e.g., /gi/) and auditory (e.g., /mi/) signals were occasionally … facilitated participants’ ability to segment the speech stream. Our results therefore demonstrate that participants can integrate audio and visual input to perceive the McGurk illusion during statistical learning. We interpret our findings as support for modality-interactive accounts of statistical learning.
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
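The matrix formulation of linear least squares described in this abstract can be illustrated with NumPy. This is a generic sketch with made-up data, not an example from the paper; the variance-covariance computation follows the standard formulas beta = (X^T X)^{-1} X^T y and Var(beta) = s^2 (X^T X)^{-1}:

```python
import numpy as np

# Straight-line fit y = a + b*x by linear least squares in matrix form.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])          # roughly y = 1 + 2x

X = np.column_stack([np.ones_like(x), x])        # design matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # [intercept, slope]
residuals = y - X @ beta
dof = len(y) - X.shape[1]                        # degrees of freedom
s2 = residuals @ residuals / dof                 # residual variance
cov = s2 * np.linalg.inv(X.T @ X)                # variance-covariance matrix
se = np.sqrt(np.diag(cov))                       # parameter standard errors

print(beta)  # close to [1.0, 2.0]
```

The variance-covariance matrix computed here is the same object the abstract mentions for optimizing experiment design.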
Is the statistic value all we should care about in neuroimaging?
Chen, Gang; Taylor, Paul A; Cox, Robert W
2017-02-15
Here we address an important issue that has been embedded within the neuroimaging community for a long time: the absence of effect estimates in results reporting in the literature. The statistic value itself, as a dimensionless measure, does not provide information on the biophysical interpretation of a study, and it certainly does not represent the whole picture of a study. Unfortunately, in contrast to standard practice in most scientific fields, effect (or amplitude) estimates are usually not provided in most results reporting in the current neuroimaging publications and presentations. Possible reasons underlying this general trend include (1) lack of general awareness, (2) software limitations, (3) inaccurate estimation of the BOLD response, and (4) poor modeling due to our relatively limited understanding of FMRI signal components. However, as we discuss here, such reporting damages the reliability and interpretability of the scientific findings themselves, and there is in fact no overwhelming reason for such a practice to persist. In order to promote meaningful interpretation, cross validation, reproducibility, meta and power analyses in neuroimaging, we strongly suggest that, as part of good scientific practice, effect estimates should be reported together with their corresponding statistic values. We provide several easily adaptable recommendations for facilitating this process. Published by Elsevier Inc.
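The recommendation above, reporting effect estimates together with their statistic values, can be illustrated with a minimal Python sketch. The numbers are made up, `report` is our own helper, and the 1.96 normal quantile is used as a rough stand-in for the exact t quantile:

```python
import math
from statistics import mean, stdev

def report(group_a, group_b):
    """Report the effect estimate and its 95% CI, not just the t value.

    A dimensionless t statistic says nothing about the size of the
    effect in the units of measurement; the mean difference does.
    """
    na, nb = len(group_a), len(group_b)
    diff = mean(group_a) - mean(group_b)              # effect estimate
    sp2 = ((na - 1) * stdev(group_a) ** 2 +
           (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    se = math.sqrt(sp2 * (1 / na + 1 / nb))           # pooled standard error
    t = diff / se
    ci = (diff - 1.96 * se, diff + 1.96 * se)         # normal approximation
    return diff, se, t, ci

# e.g. percent-signal change in two conditions (hypothetical numbers):
a = [0.82, 0.91, 0.78, 0.99, 0.88]
b = [0.61, 0.72, 0.58, 0.69, 0.66]
diff, se, t, ci = report(a, b)
print(f"effect = {diff:.2f}, t = {t:.1f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```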
Directory of Open Access Journals (Sweden)
Filippo Nisic
2014-01-01
Full Text Available Five fulleropyrrolidines and methanofullerenes, bearing one or two terthiophene moieties, have been prepared in a convenient way and well characterized. These novel fullerene derivatives are characterized by good solubility and by better harvesting of the solar radiation with respect to traditional PCBM. In addition, they have a relatively high LUMO level and a low band gap that can be easily tuned by an adequate design of the link between the fullerene and the terthiophene. Preliminary results show that they are potential acceptors for the creation of efficient bulk-heterojunction solar cells based on donor polymers containing thiophene units.
International Nuclear Information System (INIS)
Luo, Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.
2009-01-01
Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second order chromaticities. We propose a method with 4 knobs corresponding to 4 pairs of chromatic sextupole families to correct the second order chromaticities online. Numerical simulation justifies this method, showing that it reduces the unbalance in the correction strengths of the sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100 GeV polarized proton run lattices
Statistical ensembles in quantum mechanics
International Nuclear Information System (INIS)
Blokhintsev, D.
1976-01-01
The interpretation of quantum mechanics presented in this paper is based on the concept of quantum ensembles. This concept differs essentially from the canonical one in that the observer's interference with the state of a microscopic system is of no greater importance than in any other field of physics. Owing to this fact, the laws established by quantum mechanics are no less objective in character than the laws governing classical statistical mechanics. The paradoxical nature of some statements of quantum mechanics, which results from interpreting the wave function as the observer's notebook, greatly stimulated the development of the idea presented. (Auth.)
Interpretation of Internet technology
DEFF Research Database (Denmark)
Madsen, Charlotte Øland
2001-01-01
Research scope: The topic of the research project is to investigate how new internet technologies such as e-trade and customer relation marketing and management are implemented in Danish food processing companies. The aim is to use Weick's (1995) sensemaking concept to analyse the strategic...... processes leading to the use of internet marketing technologies and to investigate how these new technologies are interpreted into the organisation. Investigating the organisational socio-cognitive processes underlying the decision making processes will give further insight into the socio...
Changing interpretations of Plotinus
DEFF Research Database (Denmark)
Catana, Leo
2013-01-01
about method point in other directions. Eduard Zeller (active in the second half of the 19th century) is typically regarded as the first who gave a satisfying account of Plotinus’ philosophy as a whole. In this article, on the other hand, Zeller is seen as the one who finalised a tradition initiated...... in the 18th century. Very few Plotinus scholars have examined the interpretative development prior to Zeller. Schiavone (1952) and Bonetti (1971), for instance, have given little attention to Brucker’s introduction of the concept system of philosophy. The present analysis, then, has value...
SOCR: Statistics Online Computational Resource
Directory of Open Access Journals (Sweden)
Ivo D. Dinov
2006-10-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.
Data Interpretation: Using Probability
Drummond, Gordon B.; Vowler, Sarah L.
2011-01-01
Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…
Statistical Power in Longitudinal Network Studies
Stadtfeld, Christoph; Snijders, Tom A. B.; Steglich, Christian; van Duijn, Marijtje
2018-01-01
Longitudinal social network studies may easily suffer from a lack of statistical power. This is the case in particular for studies that simultaneously investigate change of network ties and change of nodal attributes. Such selection and influence studies have become increasingly popular due to the
Evaluation of observables in statistical multifragmentation theories
International Nuclear Information System (INIS)
Cole, A.J.
1989-01-01
The canonical formulation of equilibrium statistical multifragmentation is examined. It is shown that the explicit construction of observables (average values) by sampling the partition probabilities is unnecessary insofar as closed expressions in the form of recursion relations can be obtained quite easily. Such expressions may conversely be used to verify the sampling algorithms.
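The abstract does not reproduce the recursion relations themselves; a standard closed form of this type (assumed here for illustration, not taken from the paper) is the canonical fragmentation recursion Z_A = (1/A) * sum_k k * omega_k * Z_{A-k}, from which average multiplicities follow without any sampling. A minimal sketch with toy fragment weights:

```python
def partition_function(A, omega):
    """Canonical partition functions Z_0..Z_A for A nucleons, given
    single-fragment weights omega[k] (k = 1..A), via the recursion
    Z_n = (1/n) * sum_k k * omega[k] * Z_{n-k}, with Z_0 = 1."""
    Z = [1.0] + [0.0] * A
    for n in range(1, A + 1):
        Z[n] = sum(k * omega[k] * Z[n - k] for k in range(1, n + 1)) / n
    return Z

def mean_multiplicity(A, k, omega):
    """Average number of fragments of size k: <n_k> = omega[k] * Z_{A-k} / Z_A."""
    Z = partition_function(A, omega)
    return omega[k] * Z[A - k] / Z[A]

# Toy weights: omega[k] = 1 for every fragment size (index 0 unused).
A = 10
omega = [0.0] + [1.0] * A
# Sanity check implied by the recursion: mass conservation, sum_k k*<n_k> = A.
total_mass = sum(k * mean_multiplicity(A, k, omega) for k in range(1, A + 1))
```

The mass-conservation identity holds exactly for any weights, which is precisely the kind of closed-expression check the abstract suggests using to verify sampling algorithms.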
Physical interpretation of antigravity
Bars, Itzhak; James, Albin
2016-02-01
Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.
Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R
2013-10-01
Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe the development and validation of a quick, easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as whitefly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use.
The Statistics of wood assays for preservative retention
Patricia K. Lebow; Scott W. Conklin
2011-01-01
This paper covers general statistical concepts that apply to interpreting wood assay retention values. In particular, since wood assays are typically obtained from a single composited sample, the statistical aspects, including advantages and disadvantages, of simple compositing are covered.
Pattern recognition in menstrual bleeding diaries by statistical cluster analysis
Directory of Open Access Journals (Sweden)
Wessel Jens
2009-07-01
Background: The aim of this paper is to empirically identify a treatment-independent statistical method to describe clinically relevant bleeding patterns by using bleeding diaries of clinical studies on various sex-hormone-containing drugs. Methods: We used the four cluster analysis methods single, average and complete linkage, as well as the method of Ward, for pattern recognition in menstrual bleeding diaries. The optimal number of clusters was determined using the semi-partial R², the cubic clustering criterion, the pseudo-F- and the pseudo-t²-statistic. Finally, the interpretability of the results from a gynecological point of view was assessed. Results: The method of Ward yielded distinct clusters of the bleeding diaries. The other methods successively chained the observations into one cluster. The optimal number of distinctive bleeding patterns was six. We found two desirable and four undesirable bleeding patterns. Cyclic and non-cyclic bleeding patterns were well separated. Conclusion: Using this cluster analysis with the method of Ward, medications and devices having an impact on bleeding can be easily compared and categorized.
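The method of Ward named above merges, at each step, the pair of clusters whose union least increases the total within-cluster variance. A naive pure-Python sketch of that criterion (toy one-dimensional data, not the diary data of the study):

```python
def ward_cluster(points, n_clusters):
    """Agglomerative clustering with Ward's minimum-variance criterion:
    repeatedly merge the two clusters whose union gives the smallest
    increase in within-cluster variance. `points` is a list of
    equal-length numeric tuples."""
    clusters = [[p] for p in points]

    def centroid(c):
        return [sum(x[i] for x in c) / len(c) for i in range(len(c[0]))]

    def merge_cost(a, b):
        # Ward increase: |a||b|/(|a|+|b|) * squared centroid distance.
        ca, cb = centroid(a), centroid(b)
        d2 = sum((x - y) ** 2 for x, y in zip(ca, cb))
        return len(a) * len(b) / (len(a) + len(b)) * d2

    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: merge_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two well-separated 1-D "intensity" groups (hypothetical values).
data = [(0.0,), (0.2,), (0.1,), (5.0,), (5.2,), (5.1,)]
groups = ward_cluster(data, 2)
```

Unlike single linkage, which chains nearest neighbours one at a time (the behaviour the authors observed), Ward's cost penalizes absorbing points into large clusters, which is why it yields the distinct groups reported.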
Shnirelman peak in the level spacing statistics
International Nuclear Information System (INIS)
Chirikov, B.V.; Shepelyanskij, D.L.
1994-01-01
The first results on the statistical properties of quantum quasidegeneracy are presented. A physical interpretation of the Shnirelman theorem, which predicted the bulk quasidegeneracy, is given. The conditions for a strong impact of the degeneracy on the quantum level statistics are formulated, which allows the application of the Shnirelman theorem to be extended to a broad class of quantum systems. 14 refs., 3 figs.
Biblical Interpretation Beyond Historicity
DEFF Research Database (Denmark)
Biblical Interpretation beyond Historicity evaluates the new perspectives that have emerged since the crisis over historicity in the 1970s and 80s in the field of biblical scholarship. Several new studies in the field, as well as the ‘deconstructive’ side of literary criticism that emerged from...... writers such as Derrida and Wittgenstein, among others, lead biblical scholars today to view the texts of the Bible more as literary narratives than as sources for a history of Israel. Increased interest in archaeological and anthropological studies in writing the history of Palestine and the ancient Near...... and the commitment to a new approach to both the history of Palestine and the Bible’s place in ancient history. This volume features essays from a range of highly regarded scholars, and is divided into three sections: “Beyond Historicity”, which explores alternative historical roles for the Bible, “Greek Connections...
Interpretation of galaxy counts
International Nuclear Information System (INIS)
Tinsley, B.M.
1980-01-01
New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a 'local' density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation.
Statistical ecology comes of age
Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-01-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151
Scanning Tunneling Microscopy - image interpretation
International Nuclear Information System (INIS)
Maca, F.
1998-01-01
The basic ideas of image interpretation in Scanning Tunneling Microscopy are presented using simple quantum-mechanical models, together with examples of successful application. The importance of a correct interpretation of this brilliant experimental surface technique is stressed.
Critical Assessment of Metagenome Interpretation
DEFF Research Database (Denmark)
Sczyrba, Alexander; Hofmann, Peter; Belmann, Peter
2017-01-01
Methods for assembly, taxonomic profiling and binning are key to interpreting metagenome data, but a lack of consensus about benchmarking complicates performance assessment. The Critical Assessment of Metagenome Interpretation (CAMI) challenge has engaged the global developer community to benchma...
Statistics for scientists and engineers
Shanmugam , Ramalingam
2015-01-01
This book provides the theoretical framework needed to build, analyze and interpret various statistical models. It helps readers choose the correct model, distinguish which of various choices best captures the data, and solve the problem at hand. This is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter in this book presents the basic concepts. Next, the authors discuss the measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
Semiclassical statistical mechanics
International Nuclear Information System (INIS)
Stratt, R.M.
1979-04-01
On the basis of an approach devised by Miller, a formalism is developed which allows the nonperturbative incorporation of quantum effects into equilibrium classical statistical mechanics. The resulting expressions bear a close similarity to classical phase space integrals and, therefore, are easily molded into forms suitable for examining a wide variety of problems. As a demonstration of this, three such problems are briefly considered: the simple harmonic oscillator, the vibrational state distribution of HCl, and the density-independent radial distribution function of ⁴He. A more detailed study is then made of two more general applications involving the statistical mechanics of nonanalytic potentials and of fluids. The former, which is a particularly difficult problem for perturbative schemes, is treated with only limited success by restricting phase space and by adding an effective potential. The problem of fluids, however, is readily found to yield to a semiclassical pairwise interaction approximation, which in turn permits any classical many-body model to be expressed in a convenient form. The remainder of the discussion concentrates on some ramifications of having a phase space version of quantum mechanics. To test the breadth of the formulation, the task of constructing quantal ensemble averages of phase space functions is undertaken, and in the process several limitations of the formalism are revealed. A rather different approach is also pursued. The concept of quantum mechanical ergodicity is examined through the use of numerically evaluated eigenstates of the Barbanis potential, and the existence of this quantal ergodicity - normally associated with classical phase space - is verified. 21 figures, 4 tables
Easily Dispersible NiFe2O4/RGO Composite for Microwave Absorption Properties in the X-Band
Bateer, Buhe; Zhang, Jianjao; Zhang, Hongchen; Zhang, Xiaochen; Wang, Chunyan; Qi, Haiqun
2018-01-01
Composites with good dispersion and excellent microwave absorption properties have important applications. Therefore, an easily dispersible NiFe2O4/reduced graphene oxide (RGO) composite has been prepared conveniently through a simple hydrothermal method. Highly crystalline, small size (about 7 nm) monodispersed NiFe2O4 nanoparticles (NPs) are evenly distributed on the surface of RGO. The microwave absorbability revealed that the NiFe2O4/RGO composite exhibits excellent microwave absorption properties in the X-band (8-12 GHz), and the minimum reflection loss of the NiFe2O4/RGO composite is -27.7 dB at 9.2 GHz. The NiFe2O4/RGO composite has good dispersibility in nonpolar solvent, which facilitates the preparation of stable commercial microwave absorbing coatings. It can be a promising candidate for lightweight microwave absorption materials in many application fields.
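Reflection-loss figures like the -27.7 dB quoted above are conventionally obtained from the single-layer, metal-backed transmission-line model (an assumption here; the paper's own procedure is not given in the abstract). A sketch with hypothetical material parameters, not the measured values of the NiFe2O4/RGO composite:

```python
import cmath
import math

def reflection_loss_db(eps_r, mu_r, f_hz, d_m, c=3e8):
    """Reflection loss (dB) of a metal-backed absorber layer from the
    standard transmission-line model:
    Z_in = sqrt(mu_r/eps_r) * tanh(j * 2*pi*f*d * sqrt(mu_r*eps_r) / c),
    RL = 20 * log10 |(Z_in - 1) / (Z_in + 1)|  (impedances normalized)."""
    z_in = cmath.sqrt(mu_r / eps_r) * cmath.tanh(
        1j * 2 * math.pi * f_hz * d_m * cmath.sqrt(mu_r * eps_r) / c
    )
    return 20 * math.log10(abs((z_in - 1) / (z_in + 1)))

# Hypothetical complex permittivity/permeability at 9.2 GHz, 2 mm layer.
rl = reflection_loss_db(5 - 2j, 1.2 - 0.8j, 9.2e9, 0.002)  # negative dB value
```

More negative values mean stronger absorption; for a lossless free-space "layer" (eps_r = mu_r = 1) the model correctly returns 0 dB.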
Isolated Post-Traumatic Radial Head Dislocation, A Rare and Easily Missed Injury-A Case Report
Directory of Open Access Journals (Sweden)
V Gupta
2013-03-01
Dislocation of the head of the radius may be either congenital, an isolated injury, or more commonly part of a complex injury to the elbow such as the Monteggia fracture-dislocation. Isolated traumatic radial head dislocation without associated injuries in children is a rare and easily missed condition. We report such a case in a 7-year-old boy without any associated injuries or co-morbid conditions. Initially the diagnosis was missed, and 6 weeks later open reduction was performed with annular ligament reconstruction surgery. At the one-year follow-up, the patient had returned to most normal activities, showing only slight terminal restriction of pronation. We discuss the injury mechanism and management of the Monteggia fracture-dislocation and review the available literature.
FIDEA: a server for the functional interpretation of differential expression analysis.
D'Andrea, Daniel
2013-06-10
The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement.
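The "statistically significant overrepresentation of one or more category terms" described above is typically assessed with a hypergeometric (Fisher's exact) tail probability; the sketch below illustrates that test with hypothetical gene counts (FIDEA's actual implementation is not specified in the abstract):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided overrepresentation p-value: the probability of seeing
    at least k genes from a category of size K when drawing n genes
    from a universe of N (upper hypergeometric tail)."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Hypothetical numbers: 20,000 genes in the genome, 100 in a pathway,
# 500 differentially expressed, 12 of which fall in the pathway.
p = hypergeom_enrichment_p(20000, 100, 500, 12)  # far below 0.05
```

Under these numbers about 2.5 pathway genes would be expected by chance, so observing 12 yields a very small p-value, i.e. the pathway would be flagged as overrepresented.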
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
The interpretation of administrative contracts
Directory of Open Access Journals (Sweden)
Cătălin-Silviu SĂRARU
2014-06-01
The article analyzes the principles of interpretation of administrative contracts in French law and in Romanian law. It highlights derogations from the ordinary rules of contract interpretation. It examines the exceptions to the principle of good faith, as well as the principle of common intention (the will of the parties), the principle of good administration, and the principle of extensive interpretation of the administrative contract. The article highlights the importance and role of interpretation in administrative contracts.
Monitoring and interpreting bioremediation effectiveness
International Nuclear Information System (INIS)
Bragg, J.R.; Prince, R.C.; Harner, J.; Atlas, R.M.
1993-01-01
Following the Exxon Valdez oil spill in 1989, extensive research was conducted by the US Environmental Protection Agency and Exxon to develop and implement bioremediation techniques for oil spill cleanup. A key challenge of this program was to develop effective methods for monitoring and interpreting bioremediation effectiveness on extremely heterogeneous intertidal shorelines. Fertilizers were applied to shorelines at concentrations known to be safe, and the effectiveness achieved in accelerating biodegradation of oil residues was measured using several techniques. This paper describes the most definitive method identified, which monitors biodegradation loss by measuring changes in the ratios of hydrocarbons to hopane, a cycloalkane present in the oil that showed no measurable degradation. Rates of loss measured by the hopane ratio method have high levels of statistical confidence, and show that the fertilizer addition stimulated biodegradation rates as much as fivefold. Multiple regression analyses of the data show that the amount of fertilizer nitrogen in interstitial pore water per unit of oil load was the most important parameter affecting biodegradation rate, and the results suggest that monitoring nitrogen concentrations in the subsurface pore water is the preferred technique for determining fertilizer dosage and reapplication frequency.
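The conserved-marker logic of the hopane ratio method reduces to simple arithmetic: because hopane is not degraded, any decline in a hydrocarbon-to-hopane ratio reflects loss of that hydrocarbon. A sketch with hypothetical ratios, not the study's measurements:

```python
def percent_biodegraded(ratio_t, ratio_0):
    """Percent loss of a hydrocarbon inferred from the decline of its
    ratio to hopane (assumed conserved): 100 * (1 - R_t / R_0)."""
    return 100.0 * (1.0 - ratio_t / ratio_0)

# Hypothetical: the hydrocarbon/hopane ratio falls from 4.0 to 1.0
# between the initial and later samples.
loss = percent_biodegraded(1.0, 4.0)  # 75.0 (% degraded)
```

The ratio formulation is what makes the method robust to the shoreline heterogeneity mentioned above: absolute oil loads vary from sample to sample, but the ratio to an undegradable internal marker does not depend on how much oil a given sample happened to contain.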
Interpretation of computed tomographic images
International Nuclear Information System (INIS)
Stickle, R.L.; Hathcock, J.T.
1993-01-01
This article discusses the production of optimal CT images in small animal patients, as well as principles of radiographic interpretation. Technical factors affecting image quality and aiding image interpretation are included. Specific considerations for scanning various anatomic areas are given, including indications and potential pitfalls. Principles of radiographic interpretation are discussed, and selected patient images are illustrated.
Structural interpretation of seismic data and inherent uncertainties
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretative basis is fundamental at all levels of the science, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive ability in analysing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought the faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts are studied to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with
Thiese, Matthew S; Walker, Skyler; Lindsey, Jenna
2017-10-01
Distribution of valuable research discoveries is needed for the continual advancement of patient care. Publication of, and subsequent reliance on, false study results would be detrimental to patient care. Unfortunately, research misconduct may originate from many sources. While there is evidence of ongoing research misconduct in all its forms, it is challenging to identify its actual occurrence, which is especially true for misconduct in clinical trials. Research misconduct is challenging to measure, and there are few studies reporting the prevalence or underlying causes of research misconduct among biomedical researchers. Reported prevalence estimates of misconduct are probably underestimates, and range from 0.3% to 4.9%. There have been efforts to measure the prevalence of research misconduct; however, the relatively few published studies are not freely comparable because of varying characterizations of research misconduct and the methods used for data collection. There are some signs which may point to an increased possibility of research misconduct; however, there is a need for continued self-policing by biomedical researchers. There are existing resources to assist in ensuring appropriate statistical methods and preventing other types of research fraud. These include the "Statistical Analyses and Methods in the Published Literature", also known as the SAMPL guidelines, which help scientists determine the appropriate way of reporting various statistical methods; the "Strengthening Analytical Thinking for Observational Studies", or STRATOS, which emphasizes the execution and interpretation of results; and the Committee on Publication Ethics (COPE), which was created in 1997 to deliver guidance about publication ethics. COPE has a series of views and strategies grounded in the values of honesty and accuracy.
Quantum physics and statistical physics. 5. ed.
International Nuclear Information System (INIS)
Alonso, Marcelo; Finn, Edward J.
2012-01-01
With a logical and uniform presentation, this recognized introduction to modern physics treats both experimental and theoretical aspects. The first part of the book deals with quantum mechanics and its application to atoms, molecules, nuclei, solids, and elementary particles. Statistical physics, comprising classical statistics, thermodynamics, and quantum statistics, is the theme of the second part. Alonso and Finn avoid complicated mathematical developments; through numerous sketches and diagrams, as well as many problems and examples, they quickly and easily familiarize the reader with the concepts of modern physics.
Combination and interpretation of observables in Cosmology
Directory of Open Access Journals (Sweden)
Virey Jean-Marc
2010-04-01
The standard cosmological model has deep theoretical foundations but needs the introduction of two major unknown components, dark matter and dark energy, to be in agreement with various observations. Dark matter describes a non-relativistic collisionless fluid of (non-baryonic) matter which amounts to 25% of the total density of the universe. Dark energy is a new kind of fluid, not of matter type, representing 70% of the total density, which should explain the recent acceleration of the expansion of the universe. Alternatively, one can reject the idea of adding one or two new components and argue instead that the equations used to make the interpretation should be modified on cosmological scales. Instead of dark matter, one can invoke a failure of Newton's laws. Instead of dark energy, two approaches are proposed: general relativity (in terms of the Einstein equation) should be modified, or the cosmological principle which fixes the metric used for cosmology should be abandoned. One of the main objectives of the community is to find the path to the relevant interpretations thanks to the next generation of experiments, which should provide large statistics of observational data. Unfortunately, cosmological information is difficult to pin down directly from the measurements, and it is mandatory to combine the various observables to get the cosmological parameters. This is not problematic from the statistical point of view, but assumptions and approximations made for the analysis may bias our interpretation of the data. Consequently, strong attention should be paid to the statistical methods used for parameter estimation and model testing. After a review of the basics of cosmology, where the cosmological parameters are introduced, we discuss the various cosmological probes and their associated observables used to extract cosmological information. We present the results obtained from several statistical analyses combining data of different nature but
Orientalismi: nuove prospettive interpretative
Directory of Open Access Journals (Sweden)
Gabriele Proglio
2012-11-01
This paper aims to reconsider the concept of Orientalism in a new and multiple perspective, and to propose a different interpretation of the relationship between culture and power, starting from Edward Said's theoretical frame of reference. If Said's representational model is repositioned outside structuralist and Foucauldian frameworks and separated from the Gramscian idea of hegemony-subordination, it may indeed be possible to re-discuss the traditional profile identifying the Other in European cultures. My basic assumption here is that Orientalism should not be understood as a consensus mechanism able to produce diversified images of the Orient and the Oriental on demand. Although in most cases Orientalism is connected to the issue of power, its meanings could also be explained otherwise, as will soon be shown. Let us take Invisible Cities by Italo Calvino as an example. Here the narratives are not just multiple repetitions of Venice (in Said's case, the same would hold for Europeanism), but they could be strategically re-appropriated by those "others" and "alterities" whose bodies and identities are imposed by the Eurocentric discourse. In this sense, a double link may be identified with queer theories and postcolonial studies, and the notion of subordination will be rethought. Finally, from the above-mentioned borders, a new idea of image emerges, which appears linear, uniform and flattened only to the European gaze, whereas in actual fact it is made of imaginaries and forms of knowledge which combine representation with the conceptualization of power relationships.
A synthetic interpretation: the double-preparation theory
International Nuclear Information System (INIS)
Gondran, Michel; Gondran, Alexandre
2014-01-01
In the 1927 Solvay conference, three apparently irreconcilable interpretations of the quantum mechanics wave function were presented: the pilot-wave interpretation by de Broglie, the soliton wave interpretation by Schrödinger and the Born statistical rule by Born and Heisenberg. In this paper, we demonstrate the complementarity of these interpretations corresponding to quantum systems that are prepared differently and we deduce a synthetic interpretation: the double-preparation theory. We first introduce in quantum mechanics the concept of semi-classical statistically prepared particles, and we show that in the Schrödinger equation these particles converge, when h→0, to the equations of a statistical set of classical particles. These classical particles are undiscerned, and if we assume continuity between classical mechanics and quantum mechanics, we conclude the necessity of the de Broglie–Bohm interpretation for the semi-classical statistically prepared particles (statistical wave). We then introduce in quantum mechanics the concept of a semi-classical deterministically prepared particle, and we show that in the Schrödinger equation this particle converges, when h→0, to the equations of a single classical particle. This classical particle is discerned and assuming continuity between classical mechanics and quantum mechanics, we conclude the necessity of the Schrödinger interpretation for the semi-classical deterministically prepared particle (the soliton wave). Finally we propose, in the semi-classical approximation, a new interpretation of quantum mechanics, the ‘theory of the double preparation’, which depends on the preparation of the particles. (paper)
Childhood Cancer Statistics – Graphs and Infographics (number of diagnoses, incidence rates)
Filipova, Olga
2016-01-01
About This Book Learn how to propagate DOM changes across the website without writing extensive jQuery callback code. Learn how to achieve reactivity and easily compose views with Vue.js, and understand what it does behind the scenes. Explore the core features of Vue.js with small examples, learn how to build dynamic content into preexisting web applications, and build Vue.js applications from scratch. Who This Book Is For This book is perfect for novice web developers seeking to learn new technologies or frameworks and also for web development gurus eager to enrich their experience. Whatever your level of expertise, this book is a great introduction to the wonderful world of reactive web apps. What You Will Learn Build a fully functioning reactive web application in Vue.js from scratch. The importance of the MVVM architecture and how Vue.js compares with other frameworks such as Angular.js and React.js. How to bring reactivity to an existing static application using Vue.js. How to use p...
Shi, Le
2018-05-26
Solar-driven water distillation by photothermal materials is emerging as a promising route to renewable energy-driven clean water production. In designing photothermal materials, light absorption, photo-to-thermal conversion efficiency, and the ability to localize thermal energy at the water-air interface are three important considerations. However, one additional consideration, regenerability, has so far been overlooked in photothermal material design. This work reveals that a fouling layer forms during photothermal evaporation of real seawater (Red Sea water) and domestic wastewater, which, once formed, is difficult to remove. Herein, we synthesize a SiC-C composite monolith as an effective photothermal material in which carbon acts as the photothermal component and SiC serves as a heat conductor and strong structural support. The high mechanical strength of the monolithic composite allows it to withstand repeated high-strength physical cleaning by brush scrubbing and sonication, and the anti-carbon-loss mechanism results in zero carbon loss during physical cleaning. In the case of domestic wastewater evaporation, the bio- and organic foulants on the SiC-C composite monolith can be totally removed by annealing at 1000 °C in an N2 atmosphere. We believe that SiC-C composite monoliths are promising photothermal materials for practical solar-driven water evaporation applications thanks to their highly stable and easily regenerable properties, and therefore more research effort is warranted to further improve their performance.
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
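The contrast the abstract draws between a bare P-value and the more informative estimation alternatives can be made concrete. A stdlib-only sketch with invented sample values; the normal approximation and the 1.96 critical value are simplifying assumptions (a real analysis would use a t distribution):

```python
# Hedged sketch: a two-sided test statistic versus an interval estimate
# for a two-sample comparison (stdlib only; data are illustrative).
from statistics import mean, stdev, NormalDist

a = [5.1, 4.9, 5.4, 5.0, 5.3, 5.2, 4.8, 5.5]
b = [4.6, 4.8, 4.5, 4.9, 4.7, 4.4, 4.8, 4.6]

diff = mean(a) - mean(b)
se = (stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b)) ** 0.5

# Two-sided p-value under H0: no difference (normal approximation).
z = diff / se
p = 2 * (1 - NormalDist().cdf(abs(z)))

# 95% confidence interval: reports the magnitude, not just "significant or not".
half_width = 1.96 * se
ci = (diff - half_width, diff + half_width)

print(f"difference = {diff:.3f}, p ≈ {p:.4f}, 95% CI ≈ ({ci[0]:.3f}, {ci[1]:.3f})")
```

The interval carries the information the article recommends reporting: how large the effect plausibly is, not merely whether it differs from zero.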
Efficiently and easily integrating differential equations with JiTCODE, JiTCDDE, and JiTCSDE
Ansmann, Gerrit
2018-04-01
We present a family of Python modules for the numerical integration of ordinary, delay, or stochastic differential equations. The key features are that the user enters the derivative symbolically and it is just-in-time-compiled, allowing the user to efficiently integrate differential equations from a higher-level interpreted language. The presented modules are particularly suited for large systems of differential equations such as those used to describe dynamics on complex networks. Through the selected method of input, the presented modules also allow almost complete automatization of the process of estimating regular as well as transversal Lyapunov exponents for ordinary and delay differential equations. We conceptually discuss the modules' design, analyze their performance, and demonstrate their capabilities by application to timely problems.
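The modules described above compile a symbolically entered derivative; their actual API is not reproduced here. As a hedged, stdlib-only illustration of the numerics such modules generate, a minimal hand-rolled Runge-Kutta loop for a harmonic oscillator (all values illustrative):

```python
# Minimal sketch of the integration loop that symbolic-input modules
# like these automate; classical RK4, stdlib only.
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Harmonic oscillator y'' = -y, written as a first-order system.
f = lambda t, y: [y[1], -y[0]]
t, y, h = 0.0, [1.0, 0.0], 0.01
while t < 2 * math.pi:
    y = rk4_step(f, t, y, h)
    t += h
print(y)  # after one full period, y is back near [1, 0]
```

The just-in-time compilation the authors describe replaces the interpreted `f` above with compiled code, which is what makes large network systems tractable.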
Christman, Stephen D; Henning, Bradley R; Geers, Andrew L; Propper, Ruth E; Niebauer, Christopher L
2008-09-01
Research has shown that persons with mixed hand preference (i.e., who report using their non-dominant hand for at least some manual activities) display an increased tendency to update beliefs in response to information inconsistent with those beliefs. This has been interpreted as reflecting the fact that the left hemisphere maintains our current beliefs while the right hemisphere evaluates and updates those beliefs when appropriate. Belief evaluation is thus dependent on interhemispheric interaction, and mixed-handedness is associated with increased interhemispheric interaction. In Experiment 1 mixed-handers exhibited higher levels of persuasion in a standard attitude-change paradigm, while in Experiment 2 mixed-handers exhibited higher levels of gullibility as measured by the Barnum Effect.
Energy Technology Data Exchange (ETDEWEB)
Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
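As a rough illustration of why the abstract calls the LASSO model easily interpretable, a toy proximal-gradient (ISTA) implementation is sketched below. The data, penalty, and step size are invented; this is not the NTCP modeling code of the study:

```python
# Hedged sketch: LASSO via proximal gradient descent (ISTA) on a tiny
# dataset, showing how the L1 penalty zeroes out weak predictors --
# the property that makes the resulting model interpretable.
def lasso_ista(X, y, lam=0.1, step=0.01, iters=5000):
    n, p = len(X), len(X[0])
    w = [0.0] * p
    soft = lambda v, t: max(v - t, 0.0) if v > 0 else min(v + t, 0.0)
    for _ in range(iters):
        resid = [sum(X[i][j] * w[j] for j in range(p)) - y[i] for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) / n for j in range(p)]
        # Gradient step on the squared loss, then soft-thresholding (L1 prox).
        w = [soft(w[j] - step * grad[j], step * lam) for j in range(p)]
    return w

# y depends on the first feature only; the second is pure noise.
X = [[1.0, 0.3], [2.0, -0.2], [3.0, 0.1], [4.0, -0.4], [5.0, 0.2]]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
w = lasso_ista(X, y)
print(w)  # first weight near 2, second driven to (or very near) zero
```

A stepwise procedure also yields a sparse model, but by greedy inclusion; the L1 penalty instead shrinks all coefficients jointly, which is part of why it predicted better in the study.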
MQSA National Statistics – Archived Scorecard Statistics (2018, 2017, 2016)
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Working memory and simultaneous interpreting
Timarova, Sarka
2009-01-01
Working memory is a cognitive construct underlying a number of abilities, and it has been hypothesised for many years that it is crucial for interpreting. A number of studies have been conducted with the aim to support this hypothesis, but research has not yielded convincing results. Most researchers focused on studying working memory differences between interpreters and non-interpreters with the rationale that differences in working memory between the two groups would provide evidence of wor...
Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong
2015-11-01
Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and a health concern. It is therefore necessary to detect pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers in this study were six PPCPs, acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET) and salicylic acid (SA), and three ASs, acesulfame (ACF), cyclamate (CYC) and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC are suitable as sewage molecular markers since they were almost stable against sorption and biodegradation during the soil column experiments. In contrast, the transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different acetate concentrations (0-100 mg/L), as an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r² > 0.75) between the removal of some selected sewage chemical markers, including ACF, CF, ACT, CYC and SAC, and the acetate concentration. CTMT also decreased with the addition of acetate, but increasing the acetate concentration did not affect its removal. CBZ and DEET removal did not depend on the presence of acetate. Copyright © 2015 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Ding, Yaobin; Tang, Hebin; Zhang, Shenghua; Wang, Songbo; Tang, Heqing
2016-01-01
Highlights: • CuFeO_2 microparticles were prepared by a microwave-assisted hydrothermal method. • CuFeO_2 microparticles efficiently catalyzed the activation of peroxymonosulfate. • Quenching experiments confirmed sulfate radicals as the major reactive radicals. • Carbamazepine was rapidly degraded by micro-CuFeO_2/peroxymonosulfate. • The feasibility of CuFeO_2/peroxymonosulfate was tested for the treatment of actual water. - Abstract: Microscaled CuFeO_2 particles (micro-CuFeO_2) were rapidly prepared via a microwave-assisted hydrothermal method and characterized by scanning electron microscopy, X-ray powder diffraction and X-ray photoelectron spectroscopy. The micro-CuFeO_2 was of pure phase and rhombohedral structure, with sizes in the range of 2.8 ± 0.6 μm. The micro-CuFeO_2 efficiently catalyzed the activation of peroxymonosulfate (PMS) to generate sulfate radicals (SO_4·−), causing fast degradation of carbamazepine (CBZ). The catalytic activity of micro-CuFeO_2 was observed to be 6.9 and 25.3 times that of micro-Cu_2O and micro-Fe_2O_3, respectively. The enhanced activity of micro-CuFeO_2 for the activation of PMS was attributed to a synergistic effect of surface-bonded Cu(I) and Fe(III). The sulfate radical was the primary radical species responsible for CBZ degradation. As a microscaled catalyst, micro-CuFeO_2 can be easily recovered by gravity settlement and exhibited improved catalytic stability compared with micro-Cu_2O during five successive degradation cycles. Oxidative degradation of CBZ by the PMS/CuFeO_2 couple was effective in the actual aqueous environmental systems studied.
Automated, computer interpreted radioimmunoassay results
International Nuclear Information System (INIS)
Hill, J.C.; Nagle, C.E.; Dworkin, H.J.; Fink-Bennett, D.; Freitas, J.E.; Wetzel, R.; Sawyer, N.; Ferry, D.; Hershberger, D.
1984-01-01
90,000 radioimmunoassay results have been interpreted and transcribed automatically using software developed for a Hewlett Packard Model 1000 mini-computer system with conventional dot matrix printers. The computer program correlates the results of a combination of assays, interprets them and prints a report ready for physician review and signature within minutes of completion of the assay. The authors designed and wrote a computer program to query their patient data base for radioassay laboratory results and to produce a computer-generated interpretation of these results using an algorithm that produces normal and abnormal interpretives. Their laboratory assays 50,000 patient samples each year using 28 different radioassays. Of these, 85% have been interpreted using the computer program. Allowances are made for drug and patient history, and individualized reports are generated with regard to the patient's age and sex. Finalization of reports is still subject to change by the nuclear physician at the time of final review. Automated, computerized interpretations have realized cost savings through reduced personnel and personnel time and provided uniformity of the interpretations among the five physicians. Prior to computerization of interpretations, all radioassay results had to be dictated and reviewed for signing by one of the resident or staff physicians. Turnaround times for reports prior to the automated computer program were generally two to three days, whereas the computerized interpretation system allows reports to be issued, in general, the day assays are completed.
Directory of Open Access Journals (Sweden)
Dominic Beaulieu-Prévost
2006-03-01
Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles demonstrated that this method was problematic. In summary, null hypothesis testing (NHT is unfalsifiable, its results depend directly on sample size and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect and probabilistic undetermination. The demonstration includes a complete example.
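The three-value logic the article proposes can be sketched directly. The interval endpoints and the minimal substantial value of 0.2 below are illustrative assumptions, not values from the article:

```python
# Hedged sketch of the three-value CI logic described above: compare a
# 95% confidence interval for an effect against a minimal substantial
# value chosen in advance.
def interpret(ci_low, ci_high, minimal=0.2):
    """Classify an effect from its confidence interval."""
    if ci_low >= minimal:
        return "probable presence of a substantial effect"
    if ci_high <= minimal:  # whole interval below the substantial threshold
        return "probable absence of a substantial effect"
    return "probabilistic undetermination"

print(interpret(0.35, 0.80))   # interval entirely above the minimal value
print(interpret(-0.10, 0.15))  # interval entirely below it
print(interpret(0.05, 0.60))   # interval straddles it: more data needed
```

Unlike a significance test, the third outcome makes the indeterminate case explicit instead of collapsing it into "not significant".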
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Rényi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Rényi statistics is equivalent to the Boltzmann-Gibbs statistics. Using the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Rényi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Rényi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
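For reference, the standard textbook definition of the Rényi entropy underlying these statistics, together with the Boltzmann-Gibbs limit the abstract invokes (this is the generic form, not transcribed from the paper):

```latex
S_q = \frac{k_B}{1-q}\,\ln \sum_i p_i^{\,q},
\qquad
\lim_{q \to 1} S_q = -k_B \sum_i p_i \ln p_i ,
```

so the q → 1 limit recovers the Boltzmann-Gibbs entropy, consistent with the equivalence the abstract reports in the thermodynamic limit.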
Sampling, Probability Models and Statistical Reasoning Statistical
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...
Neutrons and antimony physical measurements and interpretations
International Nuclear Information System (INIS)
Smith, A. B.
2000-01-01
New experimental information for the elastic and inelastic scattering of ∼ 4--10 MeV neutrons from elemental antimony is presented. The differential measurements are made at ∼ 40 or more scattering angles and at incident neutron-energy intervals of ∼ 0.5 MeV. The present experimental results, those previously reported from this laboratory and as found in the literature are comprehensively interpreted using spherical optical-statistical and dispersive-optical models. Direct vibrational processes via core-excitation, isospin and shell effects are discussed. Antimony models for applications are proposed and compared with global, regional, and specific models reported in the literature
The disagreeable behaviour of the kappa statistic.
Flight, Laura; Julious, Steven A
2015-01-01
It is often of interest to measure the agreement between a number of raters when an outcome is nominal or ordinal. The kappa statistic is used as a measure of agreement. The statistic is highly sensitive to the distribution of the marginal totals and can produce unreliable results. Other statistics such as the proportion of concordance, maximum attainable kappa and prevalence and bias adjusted kappa should be considered to indicate how well the kappa statistic represents agreement in the data. Each kappa should be considered and interpreted based on the context of the data being analysed. Copyright © 2014 John Wiley & Sons, Ltd.
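A minimal sketch of the kappa computation discussed above, with invented ratings; note how the marginal totals enter the chance-agreement term, which is exactly where the statistic's sensitivity comes from:

```python
# Hedged sketch: Cohen's kappa for two raters on a nominal outcome.
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    m1, m2 = Counter(rater1), Counter(rater2)
    # Chance agreement from the product of the marginal proportions.
    expected = sum(m1[c] * m2[c] for c in m1) / n ** 2
    return (observed - expected) / (1 - expected)

r1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
r2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "no"]
print(cohens_kappa(r1, r2))  # → 0.5
```

Here observed agreement is 0.75 but chance agreement from the balanced marginals is 0.5, so kappa drops to 0.5; skewed marginals would change kappa even with the same observed agreement, which is the behaviour the article warns about.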
Energy Technology Data Exchange (ETDEWEB)
Keller, Sune H [Rigshospitalet, University of Copenhagen, Copenhagen (Denmark); Jakoby, Bjorn [University of Surrey, Guildford (United Kingdom); Hansen, Adam Espe; Svalling, Susanne; Klausen, Thomas L [Rigshospitalet, University of Copenhagen, Copenhagen (Denmark)
2015-05-18
We present a quick and easy method to perform quantitatively accurate PET scans of typical water-filled PET plastic shell phantoms on the Siemens mMR PET/MR scanner. We perform regular cross calibrations (Xcals) of our PET scanners, including the Siemens mMR PET/MR, with a Siemens mCT water phantom, and we evaluate the mMR cross calibration stability over a 3-year period. Recently, the mMR software (VB20P) offered the option of using predefined μ-maps. We evaluated this option by using either the predefined μ-map of the long mMR water phantom or a system-integrated user-defined CT-based μ-map of the mCT water phantom used for Xcal. On 54 cross calibrations acquired over 3 years, the mMR on average underestimated the concentration by 16% due to the use of MR-based μ-maps. The mMR produced the narrowest range and lowest standard deviation of the Xcal ratios, implying it is the most stable of the 6 scanners included in this study over the 3-year period. With correctly segmented μ-maps, the mMR produced Xcal ratios of 1.00-1.02, well within the acceptance range [0.95-1.05]. Measuring the concentration in a centrally placed cylindrical VOI allows for some robustness against misregistration of the μ-maps, but the misregistration should be no more than a few millimeters in the x-y plane, while the tolerance is larger on the z-axis (when, as always with PET, keeping clear of the axial edges of the FOV). The mMR is the most stable scanner in this study, and the mean underestimation is no longer an issue with the easily accessible μ-map, which in all 7 tests resulted in correct Xcal ratios. We will share the user-defined μ-map of the mCT phantom and the protocol with interested mMR users.
Tax Treaty Interpretation in Spain
Soler Roch, María Teresa; Ribes Ribes, Aurora
2001-01-01
This paper provides insight in the interpretation of Spanish double taxation conventions. Taking as a premise the Vienna Convention on the Law of Treaties and the wording of Article 3(2) OECD Model Convention, the authors explore the relevance of mutual agreements, tax authority practice and foreign court decisions on the tax treaty interpretation.
Pragmatics in Court Interpreting: Additions
DEFF Research Database (Denmark)
Jacobsen, Bente
2003-01-01
Danish court interpreters are expected to follow ethical guidelines, which instruct them to deliver exact verbatim versions of source texts. However, this requirement often clashes with the reality of the interpreting situation in the courtroom. This paper presents and discusses the findings of a...
Intercultural pragmatics and court interpreting
DEFF Research Database (Denmark)
Jacobsen, Bente
2008-01-01
This paper reports on an on-going investigation of conversational implicature in triadic speech events: Interpreter-mediated questionings in criminal proceedings in Danish district courts. The languages involved are Danish and English, and the mode of interpreting is the consecutive mode. The c...
Interpreting Recoil for Undergraduate Students
Elsayed, Tarek A.
2012-01-01
The phenomenon of recoil is usually explained to students in the context of Newton's third law. Typically, when a projectile is fired, the recoil of the launch mechanism is interpreted as a reaction to the ejection of the smaller projectile. The same phenomenon is also interpreted in the context of the conservation of linear momentum, which is…
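The momentum-conservation reading of recoil mentioned above reduces to one line of algebra. The masses and muzzle velocity below are illustrative values, not from the article:

```python
# Hedged numerical illustration of the momentum-conservation view of
# recoil: total linear momentum is zero before and after firing.
m_projectile, m_launcher = 0.02, 4.0   # kg (illustrative values)
v_projectile = 300.0                   # m/s

# Conservation of momentum: m1*v1 + m2*v2 = 0, so v2 = -m1*v1/m2.
v_recoil = -m_projectile * v_projectile / m_launcher
print(v_recoil)  # → -1.5 (m/s), opposite the projectile's direction
```

The same number can be read through Newton's third law: equal and opposite impulses act on projectile and launcher, so the lighter body ends up with the larger speed.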
Directory of Open Access Journals (Sweden)
hossein sabourifard
2018-02-01
Full Text Available Introduction: One of the most important requirements in planning the production and processing of medicinal plants in order to obtain high yield and high quality is an initial assessment of the physical and chemical properties of the soil, which reduces production cost by avoiding unnecessary soil analyses. Summer savory (Satureja hortensis L.) is one of the most widely used medicinal plants, and the quality index of the plant is related to the quantity and constituents of its essential oil content. The relations between the quantity and quality of medicinal plants and the many physical and chemical properties of soil are very complex, and estimating how medicinal plant parameters change with soil quality characteristics is even more difficult. Today, with the arrival of multivariable regression models and artificial neural network models in research, many complex relationships found in nature have become understandable. Hence the need to estimate the biomass yield of savory quickly, cheaply and with acceptable accuracy is felt. Materials and Methods: The present study was performed at the Agricultural Research Station of Neyshabur as a pot experiment based on a completely randomized design with three replications. Fifty-three soil samples were collected from different parts of the Neyshabur region, and soil texture, organic matter, pH, salinity, phosphorus, potassium, nitrogen and carbon content were selected as the easily available parameters. Before planting, the parameters were measured in the laboratory. Approximately 90 days after planting seeds in pots containing the soil samples, plants were sampled according to the treatments. For drying, samples were placed for 24 hours in an oven at 40 °C. Finally, the relationship between the biomass yield and the easily available soil parameters was determined using an artificial neural network in MATLAB 7.9 software. Results and Discussion: The results showed that soil variability is a key element in
Application of descriptive statistics in analysis of experimental data
Mirilović Milorad; Pejin Ivana
2008-01-01
Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...
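By way of a concrete example of the descriptive side, a short Python sketch using the standard library (the sample values are invented):

```python
import statistics

# Hypothetical measurements from a small experiment.
data = [4.2, 5.1, 4.8, 5.5, 4.9, 5.0, 4.7, 5.3]

n = len(data)
mean = statistics.mean(data)
median = statistics.median(data)
sd = statistics.stdev(data)   # sample standard deviation (n - 1 denominator)
cv = 100 * sd / mean          # coefficient of variation, in percent

print(f"n={n} mean={mean:.3f} median={median:.3f} sd={sd:.3f} cv={cv:.1f}%")
```

These are exactly the kinds of summary values (central tendency, spread, relative variability) that descriptive analysis reports before any inferential step.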
Do Interpreters Indeed Have Superior Working Memory in Interpreting
Institute of Scientific and Technical Information of China (English)
于飞
2012-01-01
With the frequent communication between China and western countries in the fields of economy, politics, culture, etc., interpreting has become more and more important to people in all walks of life. This paper aims to test the author's hypothesis that professional interpreters have short-term memory similar to that of unprofessional interpreters, but superior working memory. After a review of the literature on consecutive interpreting, short-term memory and working memory, the experiments are designed and the analyses are described.
ARSENIC CONTAMINATION IN GROUNDWATER: A STATISTICAL MODELING
Palas Roy; Naba Kumar Mondal; Biswajit Das; Kousik Das
2013-01-01
High arsenic levels in the natural groundwater of most tubewells in the Purbasthali-Block II area of Burdwan district (W.B., India) have recently become a serious environmental concern. This paper intends to illustrate statistical modeling of the arsenic-contaminated groundwater in order to identify how the arsenic content is interrelated with the other participating groundwater parameters, so that the arsenic contamination level can easily be predicted by analyzing only those parameters. Mul...
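The multivariate-regression idea behind such modeling can be sketched as follows, with synthetic stand-in data; the parameter names and the linear relationship are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic groundwater parameters for 40 hypothetical tubewells.
n = 40
iron = rng.uniform(0.1, 5.0, n)        # mg/L (placeholder variable)
phosphate = rng.uniform(0.05, 2.0, n)  # mg/L (placeholder variable)
ph = rng.uniform(6.5, 8.5, n)

# Synthetic arsenic level with a known linear dependence plus noise.
arsenic = 10 + 4.0 * iron + 2.5 * phosphate - 1.0 * ph + rng.normal(0, 0.5, n)

# Ordinary least squares via the design matrix [1, iron, phosphate, ph].
X = np.column_stack([np.ones(n), iron, phosphate, ph])
coef, *_ = np.linalg.lstsq(X, arsenic, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((arsenic - pred) ** 2) / np.sum((arsenic - arsenic.mean()) ** 2)

print("coefficients:", np.round(coef, 2), "R^2:", round(float(r2), 3))
```

If the fitted model explains most of the variance, the easily measured covariates can serve as a cheap proxy for the contaminant of interest, which is the logic the abstract describes.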
Handbook of univariate and multivariate data analysis and interpretation with SPSS
Ho, Robert
2006-01-01
Many statistics texts tend to focus more on the theory and mathematics underlying statistical tests than on their applications and interpretation. This can leave readers with little understanding of how to apply statistical tests or how to interpret their findings. While the SPSS statistical software has done much to alleviate the frustrations of social science professionals and students who must analyze data, they still face daunting challenges in selecting the proper tests, executing the tests, and interpreting the test results. With emphasis firmly on such practical matters, this handbook se
Kutchinsky, B
1991-01-01
from extreme scarcity to relative abundance. If (violent) pornography causes rape, this exceptional development in the availability of (violent) pornography should definitely somehow influence the rape statistics. Since, however, the rape figures could not simply be expected to remain steady during the period in question (when it is well known that most other crimes increased considerably), the development of rape rates was compared with that of non-sexual violent offences and nonviolent sexual offences (in so far as available statistics permitted). The results showed that in none of the countries did rape increase more than nonsexual violent crimes. This finding in itself would seem sufficient to discard the hypothesis that pornography causes rape.(ABSTRACT TRUNCATED AT 400 WORDS)
An objective interpretation of Lagrangian quantum mechanics
International Nuclear Information System (INIS)
Roberts, K.V.
1978-01-01
Unlike classical mechanics, the Copenhagen interpretation of quantum mechanics does not provide an objective space-time picture of the actual history of a physical system. This paper suggests how the conceptual foundations of quantum mechanics can be reformulated, without changing the mathematical content of the theory or its detailed agreement with experiment and without introducing any hidden variables, in order to provide an objective, covariant, Lagrangian description of reality which is deterministic and time-symmetric on the microscopic scale. The basis of this description can be expressed either as an action functional or as a summation over Feynman diagrams or paths. The probability laws associated with the quantum-mechanical measurement process, and the asymmetry in time of the principles of macroscopic causality and of the laws of statistical mechanics, are interpreted as consequences of the particular boundary conditions that apply to the actual universe. The objective interpretation does not include the observer and the measurement process among the fundamental concepts of the theory, but it does not entail a revision of the ideas of determinism and of time, since in a Lagrangian theory both initial and final boundary conditions on the action functional are required. (author)
Data analysis and interpretation for environmental surveillance
International Nuclear Information System (INIS)
1992-06-01
The Data Analysis and Interpretation for Environmental Surveillance Conference was held in Lexington, Kentucky, February 5--7, 1990. The conference was sponsored by what is now the Office of Environmental Compliance and Documentation, Oak Ridge National Laboratory. Participants included technical professionals from all Martin Marietta Energy Systems facilities, Westinghouse Materials Company of Ohio, Pacific Northwest Laboratory, and several technical support contractors. Presentations at the conference spanned the full spectrum of issues that affect the analysis and interpretation of environmental data. Topics included tracking systems for samples and schedules associated with ongoing programs; coalescing data from a variety of sources and pedigrees into integrated databases; methods for evaluating the quality of environmental data through empirical estimates of parameters such as charge balance, pH, and specific conductance; statistical applications to the interpretation of environmental information; and uses of environmental information in risk and dose assessments. Hearing about and discussing this wide variety of topics provided an opportunity to capture the subtlety of each discipline and to appreciate the continuity that is required among the disciplines in order to perform high-quality environmental information analysis
Optimal state discrimination using particle statistics
International Nuclear Information System (INIS)
Bose, S.; Ekert, A.; Omar, Y.; Paunkovic, N.; Vedral, V.
2003-01-01
We present an application of particle statistics to the problem of optimal ambiguous discrimination of quantum states. The states to be discriminated are encoded in the internal degrees of freedom of identical particles, and we use the bunching and antibunching of the external degrees of freedom to discriminate between various internal states. We show that we can achieve the optimal single-shot discrimination probability using only the effects of particle statistics. We discuss interesting applications of our method to detecting entanglement and purifying mixed states. Our scheme can easily be implemented with the current technology
Abstract Interpretation and Attribute Grammars
DEFF Research Database (Denmark)
Rosendahl, Mads
The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard ...... is presented in the thesis. Methods from abstract interpretation can also be used in correctness proofs of attribute grammars. This proof technique introduces a new class of attribute grammars based on domain theory. This method is illustrated with examples....
On the statistical properties of photons
International Nuclear Information System (INIS)
Cini, M.
1990-01-01
The interpretation in terms of a transition from Maxwell-Boltzmann to Bose-Einstein statistics of the effect in quantum optics of degenerate light discovered by De Martini and Di Fonzo is discussed. It is shown that the results of the experiment can be explained by using only the quantum-mechanical rule that the states of an assembly of bosons should be completely symmetrical, without mentioning in any way their statistical properties. This means that photons are indeed identical particles
GIGMF - A statistical model program
International Nuclear Information System (INIS)
Vladuca, G.; Deberth, C.
1978-01-01
The program GIGMF computes the differential and integrated statistical model cross sections for reactions proceeding through a compound nuclear stage. The computational method is based on the Hauser-Feshbach-Wolfenstein theory, modified to include the modern version of Tepel et al. Although the program was written for a PDP-15 computer with 16K high-speed memory, many reaction channels can be taken into account, with the following restrictions: the projectile spin must be less than 2, and the maximum spin momenta of the compound nucleus cannot be greater than 10. These restrictions are due solely to the storage allotments and may be easily relaxed. The energy of the impinging particle, the target and projectile masses, the spins and parities of the projectile, target, emergent and residual nuclei, the maximum orbital momentum, and the transmission coefficients for each reaction channel are the input parameters of the program. (author)
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
... the body. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Andries van Aarde's Matthew Interpretation
African Journals Online (AJOL)
2011-01-14
Jan 14, 2011 ... Secondly, it is an individual and independent interpretation of the Matthean .... specific social context is emphasised: certain events in the early church ...... Moses-theology, a Covenant-theology or any other exclusive theology.
Dialectica Interpretation with Marked Counterexamples
Directory of Open Access Journals (Sweden)
Trifon Trifonov
2011-01-01
Full Text Available Goedel's functional "Dialectica" interpretation can be used to extract functional programs from non-constructive proofs in arithmetic by employing two sorts of higher-order witnessing terms: positive realisers and negative counterexamples. In the original interpretation decidability of atoms is required to compute the correct counterexample from a set of candidates. When combined with recursion, this choice needs to be made for every step in the extracted program, however, in some special cases the decision on negative witnesses can be calculated only once. We present a variant of the interpretation in which the time complexity of extracted programs can be improved by marking the chosen witness and thus avoiding recomputation. The achieved effect is similar to using an abortive control operator to interpret computational content of non-constructive principles.
Federal Aviation Administration Legal Interpretations
Department of Transportation — Legal Interpretations and the Chief Counsel's opinions are now available at this site. You may choose to search by year or by text search. Please note that not all...
Interpreting Sustainability for Urban Forests
Directory of Open Access Journals (Sweden)
Camilo Ordóñez
2010-06-01
Full Text Available Incisive interpretations of urban-forest sustainability are important in furthering our understanding of how to sustain the myriad values associated with urban forests. Our analysis of earlier interpretations reveals conceptual gaps. These interpretations are attached to restrictive definitions of a sustainable urban forest and limited to a rather mechanical view of maintaining the biophysical structure of trees. The probing of three conceptual domains (urban forest concepts, sustainable development, and sustainable forest management leads to a broader interpretation of urban-forest sustainability as the process of sustaining urban forest values through time and across space. We propose that values—and not services, benefits, functions or goods—is a superior concept to refer to what is to be sustained in and by an urban forest.
Tarague Interpretive Trail Mitigation Plan
National Research Council Canada - National Science Library
Welch, David
2001-01-01
...), International Archaeological Research Institute, Inc. (lARfI) has prepared a mitigation plan for development of an interpretive trail at Tarague Beach, located on the north coast of the island of Guam (Fig. 1...
STATISTICS IN SERVICE QUALITY ASSESSMENT
Directory of Open Access Journals (Sweden)
Dragana Gardašević
2012-09-01
Full Text Available For any quality evaluation in sports, science, education, and so on, it is useful to collect data in order to construct a strategy to improve the quality of services offered to the user. For this purpose, we use statistical software packages to process the collected data in order to increase customer satisfaction. The principle is demonstrated by the example of student satisfaction ratings at Belgrade Polytechnic, where students, as users, rate the quality of the institution. Here the emphasis is on statistical analysis as a tool for quality control aimed at improvement, and not on the interpretation of results. Therefore, the above can be used as a model in sport to improve overall results.
Begriffsverwirrung? Interpretation Analyse Bedeutung Applikation
Directory of Open Access Journals (Sweden)
Mayr, Jeremia Josef M.
2017-11-01
Full Text Available Empirical research on the reception of biblical texts confronts scientific exegesis with valid and challenging requests and demands. The hermeneutic question of the compatibility of interpretations resulting out of different contexts (e.g. scientific exegesis and ordinary readers‘ exegesis plays an important role. Taking these requests seriously by coherently restructuring fundamental and central aspects of the theory of scientific interpretation, the present article attempts to offer a stimulating approach for further investigation.
Court interpreting and pragmatic meaning
DEFF Research Database (Denmark)
Jacobsen, Bente
In Denmark, court interpreters are required to deliver verbatim translations of speakers' originals and to refrain from transferring pragmatic meaning. Yet, as this paper demonstrates, pragmatic meaning is central to courtroom interaction.
Interpretation of macroscopic quantum phenomena
International Nuclear Information System (INIS)
Baumann, K.
1986-01-01
It is argued that a quantum theory without an observer is required for the interpretation of macroscopic quantum tunnelling. Such a theory is obtained by augmenting QED with the actual electric field in the rest system of the universe. An equation of motion for this field is formulated, from which the correct macroscopic behavior of the universe and the validity of the Born interpretation are derived. Care is taken to use only mathematically sound concepts. (Author)
International Conference on Robust Statistics 2015
Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta
2016-01-01
This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in other non-traditional areas which includes the use of new techniques such as skew and mixture of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas being circular data models and prediction of mortality and life expectancy. The contributions are of both theoretical as well as applied in nature. Robust statistics is a relatively young branch of statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...
Interpreter services in emergency medicine.
Chan, Yu-Feng; Alagappan, Kumar; Rella, Joseph; Bentley, Suzanne; Soto-Greene, Marie; Martin, Marcus
2010-02-01
Emergency physicians are routinely confronted with problems associated with language barriers. It is important for emergency health care providers and the health system to strive for cultural competency when communicating with members of an increasingly diverse society. Possible solutions that can be implemented include appropriate staffing, use of new technology, and efforts to develop new kinds of ties to the community served. Linguistically specific solutions include professional interpretation, telephone interpretation, the use of multilingual staff members, the use of ad hoc interpreters, and, more recently, the use of mobile computer technology at the bedside. Each of these methods carries a specific set of advantages and disadvantages. Although professionally trained medical interpreters offer improved communication, improved patient satisfaction, and overall cost savings, they are often underutilized due to their perceived inefficiency and the inconclusive results of their effect on patient care outcomes. Ultimately, the best solution for each emergency department will vary depending on the population served and available resources. Access to the multiple interpretation options outlined above and solid support and commitment from hospital institutions are necessary to provide proper and culturally competent care for patients. Appropriate communications inclusive of interpreter services are essential for culturally and linguistically competent provider/health systems and overall improved patient care and satisfaction. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Philosophical perspectives on quantum chaos: Models and interpretations
Bokulich, Alisa Nicole
2001-09-01
The problem of quantum chaos is a special case of the larger problem of understanding how the classical world emerges from quantum mechanics. While we have learned that chaos is pervasive in classical systems, it appears to be almost entirely absent in quantum systems. The aim of this dissertation is to determine what implications the interpretation of quantum mechanics has for attempts to explain the emergence of classical chaos. There are three interpretations of quantum mechanics that have set out programs for solving the problem of quantum chaos: the standard interpretation, the statistical interpretation, and the deBroglie-Bohm causal interpretation. One of the main conclusions of this dissertation is that an interpretation alone is insufficient for solving the problem of quantum chaos and that the phenomenon of decoherence must be taken into account. Although a completely satisfactory solution of the problem of quantum chaos is still outstanding, I argue that the deBroglie-Bohm interpretation with the help of decoherence outlines the most promising research program to pursue. In addition to making a contribution to the debate in the philosophy of physics concerning the interpretation of quantum mechanics, this dissertation reveals two important methodological lessons for the philosophy of science. First, issues of reductionism and intertheoretic relations cannot be divorced from questions concerning the interpretation of the theories involved. Not only is the exploration of intertheoretic relations a central part of the articulation and interpretation of an individual theory, but the very terms used to discuss intertheoretic relations, such as `state' and `classical limit', are themselves defined by particular interpretations of the theory. The second lesson that emerges is that, when it comes to characterizing the relationship between classical chaos and quantum mechanics, the traditional approaches to intertheoretic relations, namely reductionism and
Interpreting clinical trial results by deductive reasoning: In search of improved trial design.
Kurbel, Sven; Mihaljević, Slobodan
2017-10-01
Clinical trial results are often interpreted by inductive reasoning, in a trial design-limited manner, directed toward modifications of the current clinical practice. Deductive reasoning is an alternative in which results of relevant trials are combined in indisputable premises that lead to a conclusion easily testable in future trials. © 2017 WILEY Periodicals, Inc.
Developments in statistical evaluation of clinical trials
Oud, Johan; Ghidey, Wendimagegn
2014-01-01
This book describes various ways of approaching and interpreting the data produced by clinical trial studies, with a special emphasis on the essential role that biostatistics plays in clinical trials. Over the past few decades the role of statistics in the evaluation and interpretation of clinical data has become of paramount importance. As a result the standards of clinical study design, conduct and interpretation have undergone substantial improvement. The book includes 18 carefully reviewed chapters on recent developments in clinical trials and their statistical evaluation, with each chapter providing one or more examples involving typical data sets, enabling readers to apply the proposed procedures. The chapters employ a uniform style to enhance comparability between the approaches.
ADHD Rating Scale-IV: Checklists, Norms, and Clinical Interpretation
Pappas, Danielle
2006-01-01
This article reviews the "ADHD Rating Scale-IV: Checklists, norms, and clinical interpretation," a norm-referenced checklist that measures the symptoms of attention deficit/hyperactivity disorder (ADHD) according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; American Psychiatric…
Ether and interpretation of some physical phenomena and concepts
International Nuclear Information System (INIS)
Rzayev, S.G.
2008-01-01
On the basis of the concept of the existence of an ether, the notions of time, space, matter and physical fields are deepened, and the essence of such phenomena as corpuscular-wave dualism and the change of time, scale and mass with a body's motion is revealed. The possibility of a transition from the probabilistic-statistical interpretation of quantum phenomena to Laplace's determinism is shown
Statistics in Schools
Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Transport Statistics - Transport - UNECE
The impact of working memory on interpreting
Institute of Scientific and Technical Information of China (English)
白云安; 张国梅
2016-01-01
This paper investigates the role of working memory in the interpreting process. First, it gives a brief introduction to interpreting. Second, the paper exemplifies the role of working memory in interpreting. The results reveal that the working memory capacity of interpreters is not absolutely proportional to the quality of interpreting under real interpreting conditions. The performance of an interpreter with a well-equipped working memory capacity will be comprehensively influenced by various elements.
Probabilistic and Statistical Aspects of Quantum Theory
Holevo, Alexander S
2011-01-01
This book is devoted to aspects of the foundations of quantum mechanics in which probabilistic and statistical concepts play an essential role. The main part of the book concerns the quantitative statistical theory of quantum measurement, based on the notion of positive operator-valued measures. During the past years there has been substantial progress in this direction, stimulated to a great extent by new applications such as Quantum Optics, Quantum Communication and high-precision experiments. The questions of statistical interpretation, quantum symmetries, theory of canonical commutation re
Concept of probability in statistical physics
Guttmann, Y M
1999-01-01
Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
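A minimal way to see how a one-parameter family of statistics can contain FDS and BES as special cases is to cap the single-mode occupation at some maximum p. This is a generic illustration of the idea (with the chemical potential set to zero for brevity), not necessarily the paper's specific construction:

```latex
% Single-mode partition function with maximum occupation p:
Z_p(\beta,\varepsilon) = \sum_{n=0}^{p} e^{-\beta n \varepsilon}
  = \frac{1 - e^{-\beta (p+1) \varepsilon}}{1 - e^{-\beta \varepsilon}},
\qquad
\bar{n}_p = \frac{\sum_{n=0}^{p} n \, e^{-\beta n \varepsilon}}{Z_p}.
% p = 1 recovers Fermi-Dirac:      \bar{n}_1      = 1/(e^{\beta\varepsilon} + 1)
% p \to \infty recovers Bose-Einstein: \bar{n}_\infty = 1/(e^{\beta\varepsilon} - 1)
```

Setting p = 1 forbids double occupation and yields the Fermi-Dirac mean occupancy, while letting p grow without bound yields the Bose-Einstein form, so the cutoff dimension of the single-particle Fock space plays exactly the interpolating role the abstract describes.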
Momentum conservation decides Heisenberg's interpretation of the uncertainty formulas
International Nuclear Information System (INIS)
Angelidis, T.D.
1977-01-01
In the light of Heisenberg's interpretation of the uncertainty formulas, the conditions necessary for the derivation of the quantitative statement or law of momentum conservation are considered. The result of such considerations is a contradiction between the formalism of quantum physics and the asserted consequences of Heisenberg's interpretation. This contradiction decides against Heisenberg's interpretation of the uncertainty formulas, provided one upholds that the formalism of quantum physics is both consistent and complete, at least insofar as the statement of momentum conservation can be proved within this formalism. A few comments are also included on Bohr's complementarity interpretation of the formalism of quantum physics. A suggestion based on a statistical mode of empirical testing of the uncertainty formulas does not give rise to any such contradiction
Crunching Numbers: What Cancer Screening Statistics Really Tell Us
Cancer screening studies have shown that more screening does not necessarily translate into fewer cancer deaths. This article explains how to interpret the statistics used to describe the results of screening studies.
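One reason screening statistics are easy to misread is that a test's accuracy and the disease's rarity interact. A small sketch of Bayes' theorem with illustrative, made-up numbers (not figures from any actual screening study):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test), by Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative numbers: a test with 90% sensitivity and 95% specificity
# applied to a population in which 0.5% actually have the cancer.
ppv = positive_predictive_value(0.90, 0.95, 0.005)
print(f"PPV = {ppv:.1%}")
```

Even with an apparently accurate test, when the disease is rare most positive results are false positives, which is one of the interpretive traps such articles warn about.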
SpaSM: A MATLAB Toolbox for Sparse Statistical Modeling
DEFF Research Database (Denmark)
Sjöstrand, Karl; Clemmensen, Line Harder; Larsen, Rasmus
2018-01-01
Applications in biotechnology such as gene expression analysis and image processing have led to a tremendous development of statistical methods with emphasis on reliable solutions to severely underdetermined systems. Furthermore, interpretations of such solutions are of importance, meaning...
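SpaSM itself is a MATLAB toolbox built around least-angle regression and related algorithms. As a language-neutral illustration of the sparse-regression idea behind such toolboxes, here is a minimal LASSO coordinate-descent sketch in Python on synthetic data (this is a generic algorithm, not SpaSM's implementation):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent.

    Minimizes (1/2n) * ||y - X b||^2 + lam * ||b||_1.
    """
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]    # partial residual excluding j
            rho = X[:, j] @ r / n
            # Soft-thresholding update: small correlations are zeroed out.
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0) / col_sq[j]
    return b

rng = np.random.default_rng(2)
n, p = 100, 10
X = rng.normal(size=(n, p))
true_b = np.zeros(p)
true_b[:3] = [3.0, -2.0, 1.5]                 # only 3 relevant predictors
y = X @ true_b + 0.1 * rng.normal(size=n)

b = lasso_cd(X, y, lam=0.2)
print("nonzero coefficients at indices:", np.flatnonzero(b))
```

The L1 penalty drives the coefficients of irrelevant predictors to exactly zero, which is what makes the fitted model interpretable in the severely underdetermined settings the abstract mentions.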
Components of the Pearson-Fisher chi-squared statistic
Directory of Open Access Journals (Sweden)
G. D. Raynery
2002-01-01
interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.
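At the most basic level, the "components" of Pearson's X^2 are the per-cell contributions (O - E)^2 / E whose sum is the statistic; the paper analyzes a more refined orthogonal decomposition, but the simple breakdown can be computed directly (die-roll counts below are hypothetical):

```python
# Goodness-of-fit example: observed die rolls vs. a fair-die expectation.
observed = [18, 22, 16, 25, 20, 19]   # hypothetical counts from 120 rolls
total = sum(observed)
expected = [total / 6] * 6

# Per-cell contributions (O - E)^2 / E; their sum is Pearson's X^2.
components = [(o - e) ** 2 / e for o, e in zip(observed, expected)]
chi2 = sum(components)

for i, (c, o) in enumerate(zip(components, observed), start=1):
    print(f"face {i}: observed {o:2d}, contribution {c:.3f}")
print(f"X^2 = {chi2:.3f} on {len(observed) - 1} degrees of freedom")
```

Inspecting the individual contributions shows which cells drive a significant overall statistic, which is the interpretive question the decomposition literature addresses.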
Default Sarcastic Interpretations: On the Priority of Nonsalient Interpretations
Giora, Rachel; Drucker, Ari; Fein, Ofer; Mendelson, Itamar
2015-01-01
Findings from five experiments support the view that negation generates sarcastic utterance-interpretations by default. When presented in isolation, novel negative constructions ("Punctuality is not his forte," "Thoroughness is not her most distinctive feature"), free of semantic anomaly or internal incongruity, were…
The Interpretive Approach to Religious Education: Challenging Thompson's Interpretation
Jackson, Robert
2012-01-01
In a recent book chapter, Matthew Thompson makes some criticisms of my work, including the interpretive approach to religious education and the research and activity of Warwick Religions and Education Research Unit. Against the background of a discussion of religious education in the public sphere, my response challenges Thompson's account,…
Interpretation and evaluation of radiograph
International Nuclear Information System (INIS)
Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail
2008-01-01
After digestion, the interpreter must interpret and evaluate the image on the film. Many radiographs get stuck at this step, although if the density is good there is no problem. This is the final stage of radiography work, and it must be done by a Level 2 or Level 3 radiographer. It is the last stage before the radiographer delivers a result to the customer for further action. A good interpreter must recognize each kind of artifact and judge whether or not it is significant. In this chapter, all of the artifacts that commonly appear are discussed briefly, with clear illustrations and pictures to help the reader understand and recognize the types of artifact that exist.
Interpretation and digestion of radiograph
International Nuclear Information System (INIS)
Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail
2008-01-01
Radiograph digestion is the final check in radiography, in which the quality of the image on the produced radiograph is inspected before it is interpreted. This is a critical stage: if a mistake is made here, all of the radiographic work done before will be rejected, which, as mentioned earlier, wastes time and money and at worst can force production to shut down. At this step a level two radiographer or interpreter must therefore evaluate the radiograph carefully. For this purpose a digestion (viewing) room and a densitometer must be used, and all procedures must follow the specifications laid down in the governing document. Several requirements must be met before the radiograph can be accepted, such as the location of the penetrameter, the number of penetrameters shown, and the density of the film. Usually there is no problem at this step, and the radiograph can go on to the interpretation and evaluation step described in the next chapter.
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.
Image analysis enhancement and interpretation
International Nuclear Information System (INIS)
Glauert, A.M.
1978-01-01
The necessary practical and mathematical background is provided for the analysis of an electron microscope image in order to extract the maximum amount of structural information. Instrumental methods of image enhancement are described, including the use of the energy-selecting electron microscope and the scanning transmission electron microscope. The problems of image interpretation are considered with particular reference to the limitations imposed by radiation damage and specimen thickness. A brief survey is given of the methods for producing a three-dimensional structure from a series of two-dimensional projections, although the emphasis is on the analysis, processing and interpretation of the two-dimensional projection of a structure. (Auth.)
Interpretative challenges in face analysis
DEFF Research Database (Denmark)
de Oliveira, Sandi Michele; Hernández-Flores, Nieves
2015-01-01
In current research on face analysis questions of who and what should be interpreted, as well as how, are of central interest. In English language research, this question has led to a debate on the concepts of P1 (laypersons, representing the “emic” perspective) and P2 (researchers, representing...... in Spanish and address forms in European Portuguese, we view P1 and P2 as being far more complex than the literature suggests, with subgroups (different types of laypersons and researchers, respectively). At the micro-level we will describe the roles each subgroup plays in the interpretative process...
Authentic interpretations of Return of the Lands Act (1839
Directory of Open Access Journals (Sweden)
Stanković Uroš
2011-01-01
Full Text Available The article sheds light on three interpretations of the Return of the Lands Act, introduced in 1839 and entitling landowners whose land had been usurped by prince Miloš Obrenović (1815-1839, 1858-1860) or by distinguished people's headmen to claim retrial of litigations over land adjudicated unjustly and the return of their lawlessly disposed property. Two main questions arose in relation to the interpretive rules: what were the legislative power's goals when interpreting the Act, and what were they due to? The author sought the answer to the first dilemma by scrutinizing the texts of the interpretations in order to determine their semantic meaning. In an attempt to provide an explanation for the second problem, he explored the social context preceding the introduction of the interpretive rules (namely, the number of litigations before the courts and the political ambiance in Serbia). The first interpretation, dating back to March 2nd 1843, is inclined towards previous owners of the land. Such a solution was caused mainly by the political situation: Russia contested Aleksandar Karađorđević's first election as prince of Serbia in 1842, and the assembly foreseen to elect the ruler anew was to be summoned in June 1843. In the meantime, the new regime embodied in the so-called constitution-defenders (a group of distinguished political leaders opposed to the Obrenović dynasty) struggled to ensure the enthronement of its candidate and therefore issued a demagogic interpretation. On the contrary, the two remaining interpretations, from the years 1844 and 1845, were aimed at retaining the status quo regarding land property by diminishing the possibilities for a new trial. The legislators opted for restrictions having learned that the number of litigations had increased greatly. Besides, the political climate was for the most part convenient for taking such measures, as several lesser rebellions incited by the followers of the Obrenović dynasty had been quelled easily, after which the opponents of the regime remained passive for a longer period.
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores
A Genealogical Interpretation of Principal Components Analysis
McVean, Gil
2009-01-01
Principal components analysis, PCA, is a statistical method commonly used in population genetics to identify structure in the distribution of genetic variation across geographical location and ethnic background. However, while the method is often used to inform about historical demographic processes, little is known about the relationship between fundamental demographic parameters and the projection of samples onto the primary axes. Here I show that for SNP data the projection of samples onto the principal components can be obtained directly from considering the average coalescent times between pairs of haploid genomes. The result provides a framework for interpreting PCA projections in terms of underlying processes, including migration, geographical isolation, and admixture. I also demonstrate a link between PCA and Wright's fst and show that SNP ascertainment has a largely simple and predictable effect on the projection of samples. Using examples from human genetics, I discuss the application of these results to empirical data and the implications for inference. PMID:19834557
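The projection of samples onto principal components that the abstract discusses can be sketched numerically. The following is a minimal sketch with a hypothetical genotype matrix for two toy populations (the data, group sizes, and allele frequencies are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical SNP matrix: 6 individuals x 50 biallelic markers (0/1/2 counts),
# with two groups drawn from different allele frequencies to mimic structure.
freqs_a = rng.uniform(0.1, 0.9, 50)
freqs_b = rng.uniform(0.1, 0.9, 50)
pop_a = rng.binomial(2, freqs_a, size=(3, 50))
pop_b = rng.binomial(2, freqs_b, size=(3, 50))
G = np.vstack([pop_a, pop_b]).astype(float)

# Centre each marker, then project samples onto the leading eigenvectors
# of the individual-by-individual covariance matrix (the principal components).
X = G - G.mean(axis=0)
cov = X @ X.T / X.shape[1]
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # leading principal component
print(pc1.round(2))
```

The individual-by-individual form of the covariance is what connects PCA to average pairwise coalescent times in the paper's argument: each entry of `cov` is a (centred) measure of genetic similarity between a pair of genomes.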
Analyzing and Interpreting Historical Sources
DEFF Research Database (Denmark)
Kipping, Matthias; Wadhwani, Dan; Bucheli, Marcelo
2014-01-01
This chapter outlines a methodology for the interpretation of historical sources, helping to realize their full potential for the study of organization, while overcoming their challenges in terms of distortions created by time, changes in context, and selective production or preservation. Drawing....... The chapter contributes to the creation of a language for describing the use of historical sources in management research....
Quantum theory needs no 'Interpretation'
International Nuclear Information System (INIS)
Fuchs, Christopher A.; Peres, Asher
2000-01-01
The purpose of this article is to stress the fact that quantum theory does not need an interpretation other than being an algorithm for computing probabilities associated with macroscopic phenomena and measurements. It does not 'describe' reality, and the wave function is not an objective entity; it only gives the evolution of our probabilities for the outcomes of potential experiments. (AIP) (c)
An Authentic Interpretation of Laws
Directory of Open Access Journals (Sweden)
Teodor Antić
2015-01-01
Full Text Available Authentic interpretation of laws is a legal institute whereby the legislator gives the authentic meaning to a specific legal norm in case of its incorrect or divergent interpretation in practice. It has the same legal force as the law. Retroactivity and influence on pending cases are its inherent characteristics. Because of these characteristics and their relation to the principles of the rule of law, legal certainty and separation of powers, it is subjected to severe criticism, not only in legal theory but also in legal practice. The author analyses the institute of authentic interpretation from a historical and comparative point of view and through the Croatian normative regulation, the practice of the Croatian Parliament and the academic debate, including opinions in favour of it as well as against it. On these grounds the author concludes that a higher-quality law-making procedure could make authentic interpretation dispensable. On the other hand, should this institute be kept in the legal order, it is essential to subject it to more effective constitutional control.
Copenhagen interpretation versus Bohm's theory
International Nuclear Information System (INIS)
Baumann, K.
1985-01-01
The objections raised against Bohm's interpretation of quantum theory are reexamined, and arguments are presented in favour of this theory. Bohm's QED is modified such as to include Dirac particles. It is pointed out that the electric field may be chosen as the 'actual' field instead of the magnetic field. Finally, the theory is reformulated in terms of an arbitrary actual field. (Author)
Interpretation of Recurrent Neural Networks
DEFF Research Database (Denmark)
Pedersen, Morten With; Larsen, Jan
1997-01-01
This paper addresses techniques for interpretation and characterization of trained recurrent nets for time series problems. In particular, we focus on assessment of effective memory and suggest an operational definition of memory. Further we discuss the evaluation of learning curves. Various nume...
Abstract Interpretation of Mobile Ambients
DEFF Research Database (Denmark)
Hansen, René Rydhof; Jensen, J. G.; Nielson, Flemming
1999-01-01
We demonstrate that abstract interpretation is useful for analysing calculi of computation such as the ambient calculus (which is based on the p-calculus); more importantly, we show that the entire development can be expressed in a constraint-based formalism that is becoming exceedingly popular...
Interpretive Reproduction in Children's Play
Corsaro, William A.
2012-01-01
The author looks at children's play from the perspective of interpretive reproduction, emphasizing the way children create their own unique peer cultures, which he defines as a set of routines, artifacts, values, and concerns that children engage in with their playmates. The article focuses on two types of routines in the peer culture of preschool…
Interpreting Data: The Hybrid Mind
Heisterkamp, Kimberly; Talanquer, Vicente
2015-01-01
The central goal of this study was to characterize major patterns of reasoning exhibited by college chemistry students when analyzing and interpreting chemical data. Using a case study approach, we investigated how a representative student used chemical models to explain patterns in the data based on structure-property relationships. Our results…
Interpretable functional principal component analysis.
Lin, Zhenhua; Wang, Liangliang; Cao, Jiguo
2016-09-01
Functional principal component analysis (FPCA) is a popular approach to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). The intervals where the values of FPCs are significant are interpreted as where sample curves have major variations. However, these intervals are often hard for naïve users to identify, because of the vague definition of "significant values". In this article, we develop a novel penalty-based method to derive FPCs that are only nonzero precisely in the intervals where the values of FPCs are significant, whence the derived FPCs possess better interpretability than the FPCs derived from existing methods. To compute the proposed FPCs, we devise an efficient algorithm based on projection deflation techniques. We show that the proposed interpretable FPCs are strongly consistent and asymptotically normal under mild conditions. Simulation studies confirm that with a competitive performance in explaining variations of sample curves, the proposed FPCs are more interpretable than the traditional counterparts. This advantage is demonstrated by analyzing two real datasets, namely, electroencephalography data and Canadian weather data. © 2015, The International Biometric Society.
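The paper's penalty-based FPCs are beyond a short sketch, but the flavour of "loadings that are exactly zero outside the intervals of significant variation" can be illustrated with a soft-thresholded power iteration on discretised curves. This is a simplification, not the authors' method; the toy curves and the penalty level `lam` are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical discretised curves: the signal lives only in the middle interval.
t = np.linspace(0, 1, 60)
bump = np.where((t > 0.4) & (t < 0.6), np.sin((t - 0.4) * np.pi / 0.2), 0.0)
X = rng.normal(0, 1, size=(40, 1)) * bump + rng.normal(0, 0.1, size=(40, 60))
X -= X.mean(axis=0)

def sparse_pc(X, lam=0.5, iters=200):
    """Leading sparse loading via power iteration with soft-thresholding."""
    v = np.ones(X.shape[1]) / np.sqrt(X.shape[1])
    for _ in range(iters):
        w = X.T @ (X @ v)
        w = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)   # soft-threshold
        if np.linalg.norm(w) == 0:
            break
        v = w / np.linalg.norm(w)
    return v

v = sparse_pc(X)
# With a suitable lam, loadings shrink towards zero outside the varying interval.
print(np.count_nonzero(np.abs(v) > 1e-8))
```

The proposed FPCs instead use a penalized criterion with projection deflation for subsequent components; the sketch only shows why thresholded loadings are easier to interpret than dense ones.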
Interpretation and the Aesthetic Dimension
Mortensen, Charles O.
1976-01-01
The author, utilizing a synthesis of philosophic comments on aesthetics, provides a discourse on the aesthetic dimension and offers examples of how interpreters can nurture the innate sense of beauty in man. Poetic forms, such as haiku, are used to relate the aesthetic relationship between man and the environment. (BT)
Abstract Interpretation Using Attribute Grammar
DEFF Research Database (Denmark)
Rosendahl, Mads
1990-01-01
This paper deals with the correctness proofs of attribute grammars using methods from abstract interpretation. The technique will be described by defining a live-variable analysis for a small flow-chart language and proving it correct with respect to a continuation style semantics. The proof...
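As a hint of what a live-variable analysis computes, here is a minimal backwards data-flow fixed point for a three-node flow chart (the toy program and node numbering are mine, not from the paper):

```python
# Live-variable analysis: live_in[n] = use[n] | (live_out[n] - def[n]),
# iterated to a fixed point. Toy program: 1: x=1; 2: y=x+2; 3: print(y)
blocks = {
    1: {"use": set(),  "def": {"x"}, "succ": [2]},
    2: {"use": {"x"},  "def": {"y"}, "succ": [3]},
    3: {"use": {"y"},  "def": set(), "succ": []},
}
live_in = {n: set() for n in blocks}
changed = True
while changed:
    changed = False
    for n, b in blocks.items():
        live_out = set().union(*(live_in[s] for s in b["succ"])) if b["succ"] else set()
        new_in = b["use"] | (live_out - b["def"])
        if new_in != live_in[n]:
            live_in[n], changed = new_in, True
print(live_in)
```

The abstract-interpretation view proves such an analysis correct by relating this finite lattice of variable sets to the continuation-style semantics of the language.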
Applied statistics for economics and business
Özdemir, Durmuş
2016-01-01
This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.
Interobserver variability in interpretation of mammogram
International Nuclear Information System (INIS)
Lee, Kyung Jae; Lee, Hae Kyung; Lee, Won Chul; Hwang, In Young; Park, Young Gyu; Jung, Sang Seol; Kim, Hoon Kyo; Kim, Mi Hye; Kim, Hak Hee
2004-01-01
The purpose of this study was to evaluate the performance of radiologists in mammographic screening, and to analyze interobserver agreement in the interpretation of mammograms. 50 women were selected as subjects from the patients who were screened with mammograms at two university hospitals. The images were analyzed by five radiologists working independently and without any knowledge of the final diagnosis. The interobserver variation was analyzed by using the kappa statistic. There was moderate agreement for the findings of parenchymal pattern (k=0.44; 95% CI 0.39-0.49), calcification type (k=0.66; 95% CI 0.60-0.72) and calcification distribution (k=0.43; 95% CI 0.38-0.48). The mean kappa values ranged from 0.66 to 0.42 for the mass findings. The mean kappa value for the final conclusion was 0.44 (95% CI 0.38-0.51). In general, moderate agreement was evident for all the categories that were evaluated. The overall agreement was moderate, but there was wide variability in some findings. To improve accuracy and reduce variability among physicians in interpretation, proper training of radiologists and standardization of criteria are essential for breast screening.
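The kappa statistic used in the study can be computed for two raters as follows (the labels are toy data, not the study's; agreement among all five radiologists at once would need a multi-rater variant such as Fleiss' kappa):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: proportion of items rated identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement under independence, from the marginal frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[c] * c2[c] for c in set(rater1) | set(rater2)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two radiologists rating 10 mammograms benign/malignant.
r1 = ["b", "b", "m", "b", "m", "b", "b", "m", "b", "b"]
r2 = ["b", "m", "m", "b", "m", "b", "b", "b", "b", "b"]
print(round(cohens_kappa(r1, r2), 3))  # 0.4-0.6 counts as "moderate" agreement
```

Values near 0 mean agreement no better than chance; the 0.44 reported above for the final conclusion therefore indicates only moderate consistency between readers.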
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
CMS Program Statistics
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Uterine Cancer Statistics
Uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations tool makes ...
Tuberculosis Data and Statistics
Data and statistics on reported tuberculosis cases, including the decrease in reported tuberculosis cases (MMWR 2010; 59).
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics topics: Mental Illness, Any Anxiety Disorder, ...
School Violence: Data & Statistics
Caregiver Statistics: Demographics
Selected long-term care statistics. Caregiver needs and services are wide-ranging and complex, so statistics may vary from study to study.
Alcohol Facts and Statistics
Facts and statistics on alcohol use in the United States, including what counts as a standard drink and defined drinking levels.
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
Statistics for finance
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Statistical black-hole thermodynamics
International Nuclear Information System (INIS)
Bekenstein, J.D.
1975-01-01
Traditional methods from statistical thermodynamics, with appropriate modifications, are used to study several problems in black-hole thermodynamics. Jaynes's maximum-uncertainty method for computing probabilities is used to show that the earlier-formulated generalized second law is respected in statistically averaged form in the process of spontaneous radiation by a Kerr black hole discovered by Hawking, and also in the case of a Schwarzschild hole immersed in a bath of black-body radiation, however cold. The generalized second law is used to motivate a maximum-entropy principle for determining the equilibrium probability distribution for a system containing a black hole. As an application we derive the distribution for the radiation in equilibrium with a Kerr hole (it is found to agree with what would be expected from Hawking's results) and the form of the associated distribution among Kerr black-hole solution states of definite mass. The same results are shown to follow from a statistical interpretation of the concept of black-hole entropy as the natural logarithm of the number of possible interior configurations that are compatible with the given exterior black-hole state. We also formulate a Jaynes-type maximum-uncertainty principle for black holes, and apply it to obtain the probability distribution among Kerr solution states for an isolated radiating Kerr hole
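Jaynes's maximum-uncertainty prescription invoked here is the standard constrained entropy maximization; in outline (the generic recipe only — the paper's actual constraints involve black-hole mass and angular momentum):

```latex
\text{Maximize } S = -\sum_i p_i \ln p_i
\quad\text{subject to}\quad \sum_i p_i = 1,\qquad \sum_i p_i E_i = \bar{E}.
\frac{\partial}{\partial p_i}\Big( S - \alpha \sum_j p_j - \beta \sum_j p_j E_j \Big) = 0
\;\Longrightarrow\; p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
```

i.e. the Gibbs distribution, with the multiplier beta fixed by the mean-energy constraint; adding further constraints (e.g. on angular momentum) adds further multipliers in the exponent.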
Nuclear material statistical accountancy system
International Nuclear Information System (INIS)
Argentest, F.; Casilli, T.; Franklin, M.
1979-01-01
The statistical accountancy system developed at JRC Ispra is referred to as 'NUMSAS', i.e. Nuclear Material Statistical Accountancy System. The principal feature of NUMSAS is that, in addition to an ordinary material balance calculation, it can calculate an estimate of the standard deviation of the measurement error accumulated in the material balance calculation. The purpose of the report is to describe in detail the statistical model on which the standard deviation calculation is based, the computational formula which is used by NUMSAS in calculating the standard deviation, and the information about nuclear material measurements and the plant measurement system which is required as data for NUMSAS. The material balance records require processing and interpretation before the material balance calculation is begun. The material balance calculation is the last of four phases of data processing undertaken by NUMSAS, each of which is implemented by a different computer program. The activities carried out in these phases can be summarised as follows: the pre-processing phase; the selection and update phase; the transformation phase; and the computation phase.
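The core of such a calculation — a material balance plus the propagated standard deviation of its measurement error — can be sketched as follows. The figures are illustrative and the error model is deliberately simple (independent errors only); NUMSAS distinguishes many more error components:

```python
import math

# Material balance: MUF = beginning inventory + receipts - shipments - ending inventory.
# Each term is a (signed value, measurement standard deviation) pair, in kg.
terms = {
    "beginning_inventory": (102.0, 0.5),
    "receipts":            (40.0, 0.3),
    "shipments":           (-35.0, 0.4),
    "ending_inventory":    (-106.5, 0.5),
}

muf = sum(value for value, _ in terms.values())
# For independent measurement errors, variances add, so the standard
# deviation of MUF is the root of the sum of squared per-term deviations.
sigma_muf = math.sqrt(sum(sd ** 2 for _, sd in terms.values()))
print(f"MUF = {muf:.2f} kg, sigma = {sigma_muf:.2f} kg")
```

Comparing MUF against a multiple of its standard deviation is what lets an accountancy system judge whether an apparent loss is explicable by measurement error alone.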
Numeric computation and statistical data analysis on the Java platform
Chekanov, Sergei V
2016-01-01
Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...
Federal Motor Vehicle Safety Standards Interpretations
Department of Transportation — NHTSA's Chief Counsel interprets the statutes that the agency administers and the regulations that it promulgates. The Chief Counsel's interpretations, issued in the...
Lineament interpretation. Short review and methodology
Energy Technology Data Exchange (ETDEWEB)
Tiren, Sven (GEOSIGMA AB (Sweden))
2010-11-15
interpretation, and the skill of the interpreter. Images and digital terrain models that display the relief of the studied area should, if possible, be illuminated in at least four directions to reduce biases regarding the orientation of structures. The resolution in the source data should be fully used and extrapolation of structures avoided in the primary interpretation of the source data. The interpretation of lineaments should be made in steps: a. Interpretation of each data set/image/terrain model is conducted separately; b. Compilation of all interpretations in a base lineament map and classification of the lineaments; and c. Construction of thematical maps, e.g. structural maps, rock block maps, and statistic presentation of lineaments. Generalisations and extrapolations of lineaments/structures may be made when producing the thematical maps. The construction of thematical maps should be supported by auxiliary information (geological and geomorphologic data and information on human impact in the area). Inferred tectonic structures should be controlled in field
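The "statistic presentation of lineaments" in step (c) typically includes orientation statistics. Because lineaments are undirected lines (a strike of 10° equals 190°), the standard directional-statistics trick is to double the angles before averaging. A minimal sketch with hypothetical azimuths:

```python
import math

# Hypothetical lineament strike azimuths in degrees (axial data: 0 deg == 180 deg).
azimuths = [10, 15, 12, 170, 8, 20, 175]

# Double the angles, sum unit vectors, then halve the mean direction.
sx = sum(math.cos(math.radians(2 * a)) for a in azimuths)
sy = sum(math.sin(math.radians(2 * a)) for a in azimuths)
mean_strike = (math.degrees(math.atan2(sy, sx)) / 2) % 180
resultant = math.hypot(sx, sy) / len(azimuths)   # 1.0 == perfectly aligned set
print(f"mean strike = {mean_strike:.1f} deg, concentration R = {resultant:.2f}")
```

The same doubled-angle sums are what a rose diagram summarizes graphically; note that 170° and 175° correctly pull the mean back towards 0° rather than towards 90°.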
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
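Thurstone modeling, in its classic Case V form for paired comparisons, reduces to probit-transforming preference proportions. The proportions below are hypothetical, and ILLMO itself fits richer likelihood-based variants; this only shows the underlying idea:

```python
from statistics import NormalDist

# p[i][j]: fraction of judges preferring stimulus i over stimulus j
# (hypothetical data for three stimuli; p[j][i] == 1 - p[i][j]).
p = [
    [0.5, 0.8, 0.9],
    [0.2, 0.5, 0.7],
    [0.1, 0.3, 0.5],
]
nd = NormalDist()
# Case V: the scale separation of i and j is the probit of p[i][j];
# each stimulus score is the mean probit across its comparisons.
z = [[nd.inv_cdf(pij) for pij in row] for row in p]
scales = [sum(row) / len(row) for row in z]
print([round(s, 3) for s in scales])
```

The resulting scores place the stimuli on an interval scale: equal differences in scale value correspond to equal discriminability, which is the "old but mostly forgotten idea" the article builds on.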
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan
2011-01-01
The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.
Stereotypic movement disorder: easily missed.
Freeman, Roger D; Soltanifar, Atefeh; Baer, Susan
2010-08-01
To expand the understanding of stereotypic movement disorder (SMD) and its differentiation from tics and autistic stereotypies. Forty-two children (31 males, mean age 6y 3mo, SD 2y 8mo; 11 females, mean age 6y 7mo, SD 1y 9mo) consecutively diagnosed with SMD, without self-injurious behavior, intellectual disability, sensory impairment, or an autistic spectrum disorder (ASD), were assessed in a neuropsychiatry clinic. A list of probe questions on the nature of the stereotypy was administered to parents (and to children if developmentally ready). Questionnaires administered included the Stereotypy Severity Scale, Short Sensory Profile, Strengths and Difficulties Questionnaire, Repetitive Behavior Scale--Revised, and the Developmental Coordination Disorder Questionnaire. The stereotyped movement patterns were directly observed and in some cases further documented by video recordings made by parents. The probe questions were used again on follow-up at a mean age of 10 years 7 months (SD 4y 4mo). Mean age at onset was 17 months. Males exceeded females by 3:1. Family history of a pattern of SMD was reported in 13 and neuropsychiatric comorbidity in 30 (attention-deficit-hyperactivity disorder in 16, tics in 18, and developmental coordination disorder in 16). Obsessive-compulsive disorder occurred in only two. The Short Sensory Profile correlated with comorbidity (p<0.001), the Stereotypy Severity Scale (p=0.009), and the Repetitive Behavior Scale (p<0.001); the last correlated with the Stereotypy Severity Scale (p=0.001). Children (but not their parents) liked their movements, which were usually associated with excitement or imaginative play. Mean length of follow-up was 4 years 8 months (SD 2y 10mo). Of the 39 children followed for longer than 6 months, the behavior stopped or was gradually shaped so as to occur primarily privately in 25. Misdiagnosis was common: 26 were initially referred as tics, 10 as ASD, five as compulsions, and one as epilepsy.
Co-occurring facial grimacing in 15 children and vocalization in 22 contributed to diagnostic confusion. SMD occurs in children without ASD or intellectual disability. The generally favorable clinical course is largely due to a gradual increase in private expression of the movements. Severity of the stereotypy is associated with sensory differences and psychopathology. Differentiation of SMD from tics and ASD is important to avoid misdiagnosis and unnecessary treatment.
Learning PowerPoint 2000 easily
Energy Technology Data Exchange (ETDEWEB)
Mon, In Su; Je, Jung Suk
2000-05-15
This book introduces PowerPoint 2000. It explains what PowerPoint is, what you can do with PowerPoint 2000, and whether it can be installed on your computer. It then covers running PowerPoint and the basics, such as creating a new presentation, entering text, using text boxes, changing font size, color, and shape, power-user tips, inserting WordArt, and creating new files. It also deals with figures, charts, graphs, making multimedia files, presentation, and PowerPoint know-how for teachers and company workers.
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Vocational students' learning preferences: the interpretability of ipsative data.
Smith, P J
2000-02-01
A number of researchers have argued that ipsative data are not suitable for statistical procedures designed for normative data. Others have argued that the interpretability of such analyses of ipsative data is little affected where the number of variables and the sample size are sufficiently large. The research reported here represents a factor analysis of the scores on the Canfield Learning Styles Inventory for 1,252 students in vocational education. The results of the factor analysis of these ipsative data were examined in the context of existing theory and research on vocational students and lend support to the argument that the factor analysis of ipsative data can provide sensibly interpretable results.
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Visual perception and radiographic interpretation
International Nuclear Information System (INIS)
Papageorges, M.
1998-01-01
Although interpretation errors are common in radiology, their causes are still debated. Perceptual mechanisms appear to be responsible for a large proportion of mistakes made by both neophytes and trained radiologists. Erroneous perception of familiar contours can be triggered by unrelated opacities. Conversely, visual information cannot induce a specific perception if the observer is not familiar with the concept represented or its radiographic appearance. Additionally, the area of acute vision is smaller than is commonly recognized. Other factors, such as the attitude, beliefs, preconceptions, and expectations of the viewer, can affect what he or she "sees" when viewing any object, including a radiograph. Familiarity with perceptual mechanisms and the limitations of the visual system as well as multiple readings may be necessary to reduce interpretation errors
Modeling and interpretation of images
Directory of Open Access Journals (Sweden)
Min Michiel
2015-01-01
Imaging protoplanetary disks is a challenging but rewarding task. It is challenging because of the glare of the central star outshining the weak signal from the disk at shorter wavelengths and because of the limited spatial resolution at longer wavelengths. It is rewarding because an image contains a wealth of information on the structure of the disks and can (directly) probe things like gaps and spiral structure. Because it is so challenging, telescopes are often pushed to their limits to get a signal. Proper interpretation of these images therefore requires intimate knowledge of the instrumentation, the detection method, and the image processing steps. In this chapter I will give some examples and stress some issues that are important when interpreting images from protoplanetary disks.
Design of interpretable fuzzy systems
Cpałka, Krzysztof
2017-01-01
This book shows that the term “interpretability” goes far beyond the concept of readability of a fuzzy set and fuzzy rules. It focuses on novel and precise operators of aggregation, inference, and defuzzification leading to flexible Mamdani-type and logical-type systems that can achieve the required accuracy using a less complex rule base. The individual chapters describe various aspects of interpretability, including appropriate selection of the structure of a fuzzy system, focusing on improving the interpretability of fuzzy systems designed using both gradient-learning and evolutionary algorithms. It also demonstrates how to eliminate various system components, such as inputs, rules and fuzzy sets, whose reduction does not adversely affect system accuracy. It illustrates the performance of the developed algorithms and methods with commonly used benchmarks. The book provides valuable tools for possible applications in many fields including expert systems, automatic control and robotics.
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, and building tools for data analysis. Statistical datasets curated by National
Quantum mechanics interpretation: the so-called debate
International Nuclear Information System (INIS)
Sanchez Gomez, J. L.
2000-01-01
This paper discusses the two main issues of the so-called quantum debate, which started in 1927 with the famous Bohr-Einstein controversy; namely, non-separability and the projection postulate. Relevant interpretations and formulations of quantum mechanics are critically analyzed in the light of the said issues. The treatment is focused chiefly on fundamental points, so that technical ones are practically not dealt with here. (Author) 20 refs
Topological interpretation of Luttinger theorem
Seki, Kazuhiro; Yunoki, Seiji
2017-01-01
Based solely on the analytical properties of the single-particle Green's function of fermions at finite temperatures, we show that the generalized Luttinger theorem inherently possesses topological aspects. The topological interpretation of the generalized Luttinger theorem can be introduced because i) the Luttinger volume is represented as the winding number of the single-particle Green's function and thus ii) the deviation of the theorem, expressed with a ratio between the interacting and n...
Operational interpretations of quantum discord
International Nuclear Information System (INIS)
Cavalcanti, D.; Modi, K.; Aolita, L.; Boixo, S.; Piani, M.; Winter, A.
2011-01-01
Quantum discord quantifies nonclassical correlations beyond the standard classification of quantum states into entangled and unentangled. Although it has received considerable attention, it still lacks any precise interpretation in terms of some protocol in which quantum features are relevant. Here we give quantum discord its first information-theoretic operational meaning in terms of entanglement consumption in an extended quantum-state-merging protocol. We further relate the asymmetry of quantum discord with the performance imbalance in quantum state merging and dense coding.
Defunctionalized Interpreters for Programming Languages
DEFF Research Database (Denmark)
Danvy, Olivier
2008-01-01
by Reynolds in ``Definitional Interpreters for Higher-Order Programming Languages'' for functional implementations of denotational semantics, natural semantics, and big-step abstract machines using closure conversion, CPS transformation, and defunctionalization. Over the last few years, the author and his...... operational semantics can be expressed as a reduction semantics: for deterministic languages, a reduction semantics is a structural operational semantics in continuation style, where the reduction context is a defunctionalized continuation. As the defunctionalized counterpart of the continuation of a one...
A Generator for Composition Interpreters
DEFF Research Database (Denmark)
Steensgaard-Madsen, Jørgen
1997-01-01
Composition of program components must be expressed in some language, and late composition can be achieved by an interpreter for the composition language. A suitable notion of component is obtained by identifying it with the semantics of a generalised structured command. Experiences from...... programming language design, specification and implementation then apply. A component can be considered as defining objects or commands according to convenience. A description language including type information provides sufficient means to describe component interaction according to the underlying abstract......
Interpreting radiographs. 4. The carpus
International Nuclear Information System (INIS)
Burguez, P.N.
1984-01-01
The complexity of the carpus, which has three major joints, seven or eight carpal bones and five adjacent bones, each of which articulates with one or more of the carpal elements, necessitates good quality radiographs. Definitive radiographic interpretation may be extremely difficult because of the disparity between radiographic changes and obvious clinical signs and, therefore, the radiographic findings must be discussed in the light of a thorough clinical assessment
Paris convention - Decisions, recommendations, interpretations
International Nuclear Information System (INIS)
1990-01-01
This booklet is published in a single edition in English and French. It contains decisions, recommendations and interpretations concerning the 1960 Paris Convention on Third Party Liability in the Field of Nuclear Energy adopted by the OECD Steering Committee and the OECD Council. All the instruments are set out according to the Article of the Convention to which they relate and explanatory notes are added where necessary [fr
Securing wide appreciation of health statistics.
do Amaral Pyrrait, A M; Aubenque, M J; Benjamin, B; de Groot, M J; Kohn, R
1954-01-01
All the authors are agreed on the need for a certain publicizing of health statistics, but do Amaral Pyrrait points out that the medical profession prefers to convince itself rather than to be convinced. While there is great utility in articles and reviews in the professional press (especially for paramedical personnel) Aubenque, de Groot, and Kohn show how appreciation can effectively be secured by making statistics more easily understandable to the non-expert by, for instance, including readable commentaries in official publications, simplifying charts and tables, and preparing simple manuals on statistical methods. Aubenque and Kohn also stress the importance of linking health statistics to other economic and social information. Benjamin suggests that the principles of market research could to advantage be applied to health statistics to determine the precise needs of the "consumers". At the same time, Aubenque points out that the value of the ultimate results must be clear to those who provide the data; for this, Kohn suggests that the enumerators must know exactly what is wanted and why. There is general agreement that some explanation of statistical methods and their uses should be given in the curricula of medical schools and that lectures and postgraduate courses should be arranged for practising physicians.
Statistical dynamics of religion evolutions
Ausloos, M.; Petroni, F.
2009-10-01
A religion affiliation can be considered as a “degree of freedom” of an agent on the human genre network. A brief review is given on the state of the art in data analysis and modelization of religious “questions” in order to suggest and if possible initiate further research, after using a “statistical physics filter”. We present a discussion of the evolution of 18 so-called religions, as measured through their number of adherents between 1900 and 2000. Some emphasis is made on a few cases presenting a minimum or a maximum in the investigated time range, thereby suggesting a competitive ingredient to be considered, besides the well accepted “at birth” attachment effect. The importance of the “external field” is still stressed through an Avrami late stage crystal growth-like parameter. The observed features and some intuitive interpretations point to opinion based models with vector, rather than scalar, like agents.
Statistical inference on residual life
Jeong, Jong-Hyeon
2014-01-01
This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.
Statistical methods for quality assurance
International Nuclear Information System (INIS)
Rinne, H.; Mittag, H.J.
1989-01-01
This is the first German-language textbook on quality assurance and the fundamental statistical methods that is suitable for private study. The material for this book has been developed from a course of Hagen Open University and is characterized by a particularly careful didactical design which is achieved and supported by numerous illustrations and photographs, more than 100 exercises with complete problem solutions, many fully displayed calculation examples, surveys fostering a comprehensive approach, bibliography with comments. The textbook has an eye to practice and applications, and great care has been taken by the authors to avoid abstraction wherever appropriate, to explain the proper conditions of application of the testing methods described, and to give guidance for suitable interpretation of results. The testing methods explained also include latest developments and research results in order to foster their adoption in practice. (orig.) [de
Statistical mechanics of program systems
International Nuclear Information System (INIS)
Neirotti, Juan P; Caticha, Nestor
2006-01-01
We discuss the collective behaviour of a set of operators and variables that constitute a program and the emergence of meaningful computational properties in the language of statistical mechanics. This is done by appropriately modifying available Monte Carlo methods to deal with hierarchical structures. The study suggests, in analogy with simulated annealing, a method to automatically design programs. Reasonable solutions can be found, at low temperatures, when the method is applied to simple toy problems such as finding an algorithm that determines the roots of a function or one that makes a nonlinear regression. Peaks in the specific heat are interpreted as signalling phase transitions which separate regions where different algorithmic strategies are used to solve the problem
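The automatic-design idea in this abstract rests on simulated annealing. The sketch below is not the authors' hierarchical Monte Carlo over program space; it is a minimal single-variable simulated-annealing loop applied to a toy task in the spirit of the abstract (finding a root of a function), with all parameter values chosen for illustration only:

```python
import math
import random

def simulated_annealing(cost, x0, t0=2.0, cooling=0.999, steps=20000, seed=1):
    """Minimise `cost` over one real variable by simulated annealing."""
    rng = random.Random(seed)
    x, t = x0, t0
    best, best_cost = x0, cost(x0)
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 0.5)  # Gaussian proposal around current state
        delta = cost(cand) - cost(x)
        # always accept improvements; accept uphill moves with Boltzmann probability
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < best_cost:
                best, best_cost = x, cost(x)
        t *= cooling  # geometric cooling schedule
    return best

# toy task: find a root of g(x) = x^2 - 2 by minimising g(x)^2
g = lambda x: x * x - 2.0
root = simulated_annealing(lambda x: g(x) ** 2, x0=0.0)
```

At low temperature only improving moves survive, which is the "freezing" the abstract's specific-heat peaks signal in the program-space setting.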
Hébert-Dufresne, Laurent; Grochow, Joshua A; Allard, Antoine
2016-08-18
We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
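The peeling procedure described above can be sketched in a few lines. This is a minimal re-implementation based on the published description of the onion decomposition (recording the stage at which each vertex is removed during k-core peeling), not the authors' code, and the example graph is invented for illustration:

```python
def onion_decomposition(adj):
    """Peel a simple undirected graph into onion layers.

    adj: dict mapping each node to an iterable of its neighbours.
    Returns (coreness, layer): the k-core number and onion layer of each node.
    """
    adj = {u: set(vs) for u, vs in adj.items()}
    coreness, layer = {}, {}
    k, current_layer = 0, 0
    while adj:
        # raise the core threshold only when no remaining node is at or below it
        k = max(k, min(len(vs) for vs in adj.values()))
        current_layer += 1
        peel = [u for u, vs in adj.items() if len(vs) <= k]
        for u in peel:
            coreness[u], layer[u] = k, current_layer
        for u in peel:
            for v in adj.pop(u):
                if v in adj:
                    adj[v].discard(u)
    return coreness, layer

# a triangle (its 2-core) with one pendant vertex attached
graph = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"}, "d": {"a"}}
coreness, layers = onion_decomposition(graph)
```

Here the pendant vertex "d" is peeled in layer 1 with coreness 1, while the triangle survives to layer 2 with coreness 2, illustrating how the onion spectrum refines the plain k-core numbers.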
Interpreter in Criminal Cases: Allrounders First!
Frid, Arthur
1974-01-01
The interpreter in criminal cases generally has had a purely linguistic training with no difference from the education received by his colleague interpreters. The position of interpreters in criminal cases is vague and their role depends to a large extent on individual interpretation of officials involved in the criminal procedure. Improvements on…
The Role of the Sampling Distribution in Understanding Statistical Inference
Lipson, Kay
2003-01-01
Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…
Statistical learning from a regression perspective
Berk, Richard A
2016-01-01
This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...
Directory of Open Access Journals (Sweden)
Yvonne Dzierma
2012-09-01
A sequence of 150 explosive eruptions recorded during the past century at the Chilean Southern Volcanic Zone (SVZ) is subjected to statistical time series analysis. The exponential, Weibull, and log-logistic distribution functions are fit to the eruption record, separately for literature-assigned volcanic explosivity indices VEI ≥ 2 and VEI ≥ 3. Since statistical tests confirm the adequacy of all the fits to describe the data, all models are used to estimate the likelihood of future eruptions. Only small differences are observed between the different distribution functions with regard to the eruption forecast, whereby the log-logistic distribution predicts the lowest probabilities. There is a 50% probability for VEI ≥ 2 eruptions to occur in the SVZ within less than a year, and 90% probability to occur within the next 2-3 years. For the larger VEI ≥ 3 eruptions, the 50% probability is reached in 3-4 years, while the 90% level is reached in 9-11 years.
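For the simplest of the three models, the exponential distribution, forecasts of the kind quoted above follow directly from the memoryless waiting-time formula P(T ≤ t) = 1 − exp(−t/μ). A minimal sketch of that calculation (the repose times below are hypothetical, not the Chilean record):

```python
import math

def exp_mle_mean(repose_times):
    # maximum-likelihood estimate of the mean repose time mu for an
    # exponential (memoryless) eruption-onset model
    return sum(repose_times) / len(repose_times)

def waiting_time(mean_repose, p):
    # time t such that P(next eruption within t) = p under that model:
    # p = 1 - exp(-t / mu)  =>  t = -mu * ln(1 - p)
    return -mean_repose * math.log(1.0 - p)

# hypothetical repose times (years) between successive eruptions
repose = [0.4, 1.1, 0.7, 1.6, 0.9, 1.3]
mu = exp_mle_mean(repose)      # 1.0 year for these illustrative data
t50 = waiting_time(mu, 0.50)   # about 0.69 years
t90 = waiting_time(mu, 0.90)   # about 2.30 years
```

The Weibull and log-logistic fits used in the study follow the same logic with different survival functions, which is why their forecasts differ only modestly.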
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Also accident statistics are included. Production statistics are presented of the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Contributions to statistics
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Wind Statistics from a Forested Landscape
DEFF Research Database (Denmark)
Arnqvist, Johan; Segalini, Antonio; Dellwik, Ebba
2015-01-01
An analysis and interpretation of measurements from a 138-m tall tower located in a forested landscape is presented. Measurement errors and statistical uncertainties are carefully evaluated to ensure high data quality. A 40° wide wind-direction sector is selected as the most representative for large-scale forest conditions, and from that sector first-, second- and third-order statistics, as well as analyses regarding the characteristic length scale, the flux-profile relationship and surface roughness, are presented for a wide range of stability conditions. The results are discussed…
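The first-, second- and third-order statistics mentioned in the abstract above correspond to the mean, variance and skewness of the wind-speed record. A minimal sketch, using synthetic data and assumed variable names (nothing here is taken from the paper itself):

```python
import numpy as np

# Synthetic stand-in for a measured wind-speed time series (m/s);
# a Weibull shape is a common assumption for wind-speed distributions.
rng = np.random.default_rng(0)
u = rng.weibull(2.0, size=10_000) * 8.0

mean_u = u.mean()                    # first-order statistic (mean)
var_u = u.var(ddof=1)                # second-order statistic (variance)
# third-order statistic (skewness): standardized third central moment
skew_u = ((u - mean_u) ** 3).mean() / u.std() ** 3

print(f"mean={mean_u:.2f} m/s  var={var_u:.2f}  skewness={skew_u:.2f}")
```

Higher-order statistics converge more slowly than the mean, which is one reason the abstract stresses careful evaluation of statistical uncertainty.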
Statistics in a nutshell
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Principles of statistics
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
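"Determining the distribution of statistics by sampling," as in the abstract above, can be illustrated by Monte Carlo: repeatedly drawing samples and recomputing the statistic to approximate its sampling distribution. A minimal sketch, with an exponential population chosen as an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 25, 5_000

# Draw `reps` independent samples of size `n` and compute the sample mean
# of each; the resulting array approximates the statistic's distribution.
means = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

# By the central limit theorem the simulated means should cluster near
# the population mean (1.0) with standard deviation about 1/sqrt(n) = 0.2.
print(means.mean(), means.std())
```

The same recipe works for any statistic (median, variance, a regression coefficient): replace `.mean(axis=1)` with the statistic of interest.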