WorldWideScience

Sample records for easily interpretable statistics

  1. Equivalent statistics and data interpretation.

    Science.gov (United States)

    Francis, Gregory

    2016-10-14

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
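The equivalence the abstract describes can be checked numerically. The sketch below is not from the paper; it assumes the equal-variance two-sample t test, where, once both sample sizes are known, Cohen's d determines the t statistic, which in turn determines the two-sided p value.

```python
from math import sqrt
from scipy import stats

def t_from_d(d, n1, n2):
    """Two-sample t statistic implied by Cohen's d (equal-variance case)."""
    return d * sqrt(n1 * n2 / (n1 + n2))

def p_from_t(t, n1, n2):
    """Two-sided p value for a two-sample t statistic."""
    df = n1 + n2 - 2
    return 2 * stats.t.sf(abs(t), df)

# With n1 and n2 known, reporting d or p conveys the same information:
n1, n2 = 20, 25
t = t_from_d(0.6, n1, n2)   # d = 0.6 maps to exactly one t
p = p_from_t(t, n1, n2)     # ...and t maps to exactly one p
```

Inverting either mapping (p back to t, t back to d) recovers the original value, which is the sense of "mathematically equivalent" used in the abstract.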

  2. The Malpractice of Statistical Interpretation

    Science.gov (United States)

    Fraas, John W.; Newman, Isadore

    1978-01-01

    Problems associated with the use of gain scores, analysis of covariance, multicollinearity, part and partial correlation, and the lack of rectilinearity in regression are discussed. Particular attention is paid to the misuse of statistical techniques. (JKS)

  3. Statistics and Data Interpretation for Social Work

    CERN Document Server

    Rosenthal, James

    2011-01-01

    "Without question, this text will be the most authoritative source of information on statistics in the human services. From my point of view, it is a definitive work that combines a rigorous pedagogy with a down to earth (commonsense) exploration of the complex and difficult issues in data analysis (statistics) and interpretation. I welcome its publication.". -Praise for the First Edition. Written by a social worker for social work students, this is a nuts and bolts guide to statistics that presents complex calculations and concepts in clear, easy-to-understand language. It includes

  4. A statistical mechanical interpretation of algorithmic information theory: Total statistical mechanical interpretation based on physical argument

    Energy Technology Data Exchange (ETDEWEB)

    Tadaki, Kohtaro, E-mail: tadaki@kc.chuo-u.ac.j [Research and Development Initiative, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551 (Japan)

    2010-12-01

The statistical mechanical interpretation of algorithmic information theory (AIT, for short) was introduced and developed in our previous works [K. Tadaki, Local Proceedings of CiE 2008, pp. 425-434, 2008] and [K. Tadaki, Proceedings of LFCS'09, Springer's LNCS, vol. 5407, pp. 422-440, 2009], where we introduced thermodynamic quantities such as the partition function Z(T), free energy F(T), energy E(T), statistical mechanical entropy S(T), and specific heat C(T) into AIT. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where partial randomness is a stronger notion of the compression rate defined by means of program-size complexity. Furthermore, we showed that the same holds for the temperature T itself, which is one of the most typical thermodynamic quantities. Namely, we showed that, for each of the thermodynamic quantities Z(T), F(T), E(T), and S(T) above, the computability of its value at temperature T is a sufficient condition for T ∈ (0,1) to have partial randomness equal to T. In this paper, based on a physical argument at the same level of mathematical rigor as normal statistical mechanics in physics, we develop a total statistical mechanical interpretation of AIT that realizes a complete correspondence to normal statistical mechanics. We do this by identifying a microcanonical ensemble within the framework of AIT. As a result, we clarify the statistical mechanical meaning of the thermodynamic quantities of AIT.

  5. Interpretation and uses of medical statistics

    CERN Document Server

    Daly, Leslie

    2008-01-01

    In 1969 the first edition of this book introduced the concepts of statistics and their medical application to readers with no formal training in this area. While retaining this basic aim, the authors have expanded the coverage in each subsequent edition to keep pace with the increasing use and sophistication of statistics in medical research. This fifth edition has undergone major restructuring, with some sections completely rewritten; it is now more logically organized and more user friendly (with the addition of 'summary boxes' throughout the text). It incorporates new statistical techniq

  6. The Statistical Interpretation of Entropy: An Activity

    Science.gov (United States)

Timberlake, Todd

    2010-01-01

    The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…

  7. Writing in Statistics Classes Encourages Students To Learn Interpretation.

    Science.gov (United States)

    Beins, Bernard C.

    A study investigated the effect of writing in statistics classes on students' interpretation skills (translating the results of data analysis into verbal interpretations that are accessible to non-statisticians). One hundred twenty-two students in three statistics classes received either low, moderate, or high instructional emphasis in…

  8. Interpretation and use of statistics in nursing research.

    Science.gov (United States)

    Giuliano, Karen K; Polanowicz, Michelle

    2008-01-01

    A working understanding of the major fundamentals of statistical analysis is required to incorporate the findings of empirical research into nursing practice. The primary focus of this article is to describe common statistical terms, present some common statistical tests, and explain the interpretation of results from inferential statistics in nursing research. An overview of major concepts in statistics, including the distinction between parametric and nonparametric statistics, different types of data, and the interpretation of statistical significance, is reviewed. Examples of some of the most common statistical techniques used in nursing research, such as the Student independent t test, analysis of variance, and regression, are also discussed. Nursing knowledge based on empirical research plays a fundamental role in the development of evidence-based nursing practice. The ability to interpret and use quantitative findings from nursing research is an essential skill for advanced practice nurses to ensure provision of the best care possible for our patients.

  9. The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes

    Science.gov (United States)

    Cartier, Stephen F.

    2011-01-01

    A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…

  10. Making Statistical Data More Easily Accessible on the Web: Results of the StatSearch Case Study

    CERN Document Server

    Rajman, M; Boynton, I M; Fridlund, B; Fyhrlund, A; Sundgren, B; Lundquist, P; Thelander, H; Wänerskär, M

    2005-01-01

    In this paper we present the results of the StatSearch case study, which aimed at providing enhanced access to statistical data available on the Web. Within this case study we developed a prototype of an information access tool combining a query-based search engine with semi-automated navigation techniques that exploit the hierarchical structuring of the available data. The tool enables better control of information retrieval, improving the quality and ease of access to statistical information. The central part of the StatSearch tool is an algorithm for automated navigation through a tree-like hierarchical document structure. The algorithm relies on the computation of query-related relevance score distributions over the available database to identify the most relevant clusters in the data structure. These clusters are then proposed to the user for navigation or, alternatively, serve as the support for the automated navigation process. Several appro...

  11. Interpretation of Statistical Data: The Importance of Affective Expressions

    Science.gov (United States)

    Queiroz, Tamires; Monteiro, Carlos; Carvalho, Liliane; François, Karen

    In recent years, research on teaching and learning of statistics emphasized that the interpretation of data is a complex process that involves cognitive and technical aspects. However, it is a human activity that involves also contextual and affective aspects. This view is in line with research on affectivity and cognition. While the affective…

  12. Pre-service teachers' challenges while interpreting statistical graphs

    Science.gov (United States)

    Wahid, Norabiatul Adawiah Abd; Rahim, Suzieleez Syrene Abdul; Zamri, Sharifah Norul Akmar Syed

    2017-05-01

    Nowadays, statistical graphs are widely used as a medium of communication. The Ministry of Education has recognized their importance and has included the topic in the national standard curriculum as early as Standard 3, which shows that this field of study matters for our students. However, pre-service teachers still face difficulties in comprehending various types of statistical graphs. Among these difficulties is relating two statistical graphs that carry related issues. This study therefore examines the types of difficulties pre-service teachers face when they need to interpret two statistical graphs concerning related issues. We focus on interview data, which gives evidence of several problems pre-service teachers encounter when they try to comprehend graphs that are related to each other. The discussion of the results may contribute to an understanding of the complexity of interpreting such graphs, and to possible solutions for the problems that arise.

  13. Quantum statistics as geometry: Conflict, Mechanism, Interpretation, and Implication

    CERN Document Server

    Galehouse, Daniel C

    2015-01-01

    The conflict between the determinism of geometry in general relativity and the essential statistics of quantum mechanics blocks the development of a unified theory. Electromagnetic radiation is essential to both fields and supplies a common meeting ground. It is proposed that a suitable mechanism to resolve these differences can be based on the use of a time-symmetric treatment for the radiation. Advanced fields of the absorber can be interpreted to supply the random character of spontaneous emission. This allows the statistics of the Born rule to come from the spontaneous emission that occurs during a physical measurement. When the absorber is included, quantum mechanics is completely deterministic. It is suggested that the peculiar properties of kaons may be induced by the advanced effects of the neutrino field. Schrödinger's cat loses its enigmatic personality and the identification of mental processes as an essential component of a measurement is no longer needed.

  14. Statistical interpretation of machine learning-based feature importance scores for biomarker discovery.

    Science.gov (United States)

    Huynh-Thu, Vân Anh; Saeys, Yvan; Wehenkel, Louis; Geurts, Pierre

    2012-07-01

    Univariate statistical tests are widely used for biomarker discovery in bioinformatics. These procedures are simple, fast and their output is easily interpretable by biologists, but they can only identify variables that provide a significant amount of information in isolation from the other variables. As biological processes are expected to involve complex interactions between variables, univariate methods thus potentially miss some informative biomarkers. Variable relevance scores provided by machine learning techniques, however, are potentially able to highlight multivariate interacting effects, but unlike the p-values returned by univariate tests, these relevance scores are usually not statistically interpretable. This lack of interpretability hampers the determination of a relevance threshold for extracting a feature subset from the rankings and also prevents the wide adoption of these methods by practitioners. We evaluated several existing and novel procedures that extract relevant features from rankings derived from machine learning approaches. These procedures replace the relevance scores with measures that can be interpreted in a statistical way, such as p-values, false discovery rates, or family-wise error rates, for which it is easier to determine a significance level. Experiments were performed on several artificial problems as well as on real microarray datasets. Although the methods differ in terms of computing times and the tradeoff they achieve between false positives and false negatives, some of them greatly help in the extraction of truly relevant biomarkers and should thus be of great practical interest for biologists and physicians. As a side conclusion, our experiments also clearly highlight that using model performance as a criterion for feature selection is often counter-productive. Python source codes of all tested methods, as well as the MATLAB scripts used for data simulation, can be found in the Supplementary Material.
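One family of procedures of the kind the abstract describes can be sketched as a label-permutation test on feature-importance scores. The data, model choice, and permutation count below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy data (hypothetical): 5 informative features out of 20, 150 samples.
X = rng.normal(size=(150, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

def importances(X, y, seed=0):
    """Relevance scores from a machine learning model (here a random forest)."""
    rf = RandomForestClassifier(n_estimators=50, random_state=seed)
    return rf.fit(X, y).feature_importances_

observed = importances(X, y)

# Null distribution: refit on permuted labels and record the *maximum*
# importance per permutation; comparing each observed score against this
# max-null yields p values that control the family-wise error rate.
n_perm = 20
null_max = np.array([importances(X, rng.permutation(y), seed=k).max()
                     for k in range(n_perm)])
pvals = (1 + (null_max[None, :] >= observed[:, None]).sum(axis=1)) / (1 + n_perm)
```

The raw importance scores have no significance scale of their own; the permutation step is what converts them into a statistically interpretable quantity.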

  15. Statistical Interpretation of Natural and Technological Hazards in China

    Science.gov (United States)

    Borthwick, Alistair, Prof.; Ni, Jinren, Prof.

    2010-05-01

    China is prone to catastrophic natural hazards from floods, droughts, earthquakes, storms, cyclones, landslides, epidemics, extreme temperatures, forest fires, avalanches, and even tsunami. This paper will list statistics related to the six worst natural disasters in China over the past 100 or so years, ranked according to number of fatalities. The corresponding data for the six worst natural disasters in China over the past decade will also be considered. [The data are abstracted from the International Disaster Database, Centre for Research on the Epidemiology of Disasters (CRED), Université Catholique de Louvain, Brussels, Belgium, http://www.cred.be/ where a disaster is defined as occurring if one of the following criteria is fulfilled: 10 or more people reported killed; 100 or more people reported affected; a call for international assistance; or declaration of a state of emergency.] The statistics include the number of occurrences of each type of natural disaster, the number of deaths, the number of people affected, and the cost in billions of US dollars. Over the past hundred years, the largest disasters may be related to the overabundance or scarcity of water, and to earthquake damage. However, there has been a substantial relative reduction in fatalities due to water-related disasters over the past decade, even though the overall numbers of people affected remain huge, as does the economic damage. This change is largely due to the efforts of China's water authorities to establish effective early warning systems, construct engineering countermeasures for flood protection, and implement water pricing and other measures to reduce excessive consumption during times of drought. It should be noted that the dreadful death toll due to the Sichuan Earthquake dominates recent data. 
Joint research has been undertaken between the Department of Environmental Engineering at Peking University and the Department of Engineering Science at Oxford

  16. A Critique of Divorce Statistics and Their Interpretation.

    Science.gov (United States)

    Crosby, John F.

    1980-01-01

    Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passe. This article contains a consideration of reasons why the divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and family. (Author)

  17. Workplace Statistical Literacy for Teachers: Interpreting Box Plots

    Science.gov (United States)

    Pierce, Robyn; Chick, Helen

    2013-01-01

    As a consequence of the increased use of data in workplace environments, there is a need to understand the demands that are placed on users to make sense of such data. In education, teachers are being increasingly expected to interpret and apply complex data about student and school performance, and, yet it is not clear that they always have the…

  18. Interpreting Statistical Findings: A Guide for Health Professionals and Students

    CERN Document Server

    Walker, Jan

    2010-01-01

    This book is aimed at those studying and working in the field of health care, including nurses and the professions allied to medicine, who have little prior knowledge of statistics but for whom critical review of research is an essential skill.

  19. Interpreting socio-economic data: a foundation of descriptive statistics

    CERN Document Server

    Winkler, Othmar

    2009-01-01

    A compendium of the concepts needed to describe and analyze empirical data in the social sciences. It aims to set the record straight about the foundations of descriptive statistics and to offer a look behind the scenes of commonly used methods, showing how to deal with ratios, probabilities, and index numbers.

  20. A statistical model for interpreting computerized dynamic posturography data

    Science.gov (United States)

    Feiveson, Alan H.; Metter, E. Jeffrey; Paloski, William H.

    2002-01-01

    Computerized dynamic posturography (CDP) is widely used for assessment of altered balance control. CDP trials are quantified using the equilibrium score (ES), which ranges from zero to 100, as a decreasing function of peak sway angle. The problem of how best to model and analyze ESs from a controlled study is considered. The ES often exhibits a skewed distribution in repeated trials, which can lead to incorrect inference when applying standard regression or analysis of variance models. Furthermore, CDP trials are terminated when a patient loses balance. In these situations, the ES is not observable, but is assigned the lowest possible score--zero. As a result, the response variable has a mixed discrete-continuous distribution, further compromising inference obtained by standard statistical methods. Here, we develop alternative methodology for analyzing ESs under a stochastic model extending the ES to a continuous latent random variable that always exists, but is unobserved in the event of a fall. Loss of balance occurs conditionally, with probability depending on the realized latent ES. After fitting the model by a form of quasi-maximum-likelihood, one may perform statistical inference to assess the effects of explanatory variables. An example is provided, using data from the NIH/NIA Baltimore Longitudinal Study on Aging.
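The censored latent-variable model can be illustrated in simulation. All parameter values below are hypothetical, chosen only to show the mechanism: a latent equilibrium score (ES) always exists, and a fall (observed ES of zero) occurs conditionally on its realized value.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical parameters, for illustration only.
mu, sigma = 70.0, 15.0   # latent equilibrium-score distribution
a, b = 6.0, 0.12         # fall probability: logistic in the latent ES

def simulate_trials(n):
    """Observed equilibrium scores under the latent-variable model:
    the latent ES is capped at 100; a fall (observed ES = 0) occurs
    with a probability that rises as the latent ES falls."""
    latent = rng.normal(mu, sigma, n).clip(max=100.0)
    p_fall = 1.0 / (1.0 + np.exp(-(a - b * latent)))
    fell = rng.random(n) < p_fall
    return np.where(fell, 0.0, latent)

scores = simulate_trials(10_000)
```

The resulting sample is mixed discrete-continuous (a point mass at zero plus a skewed continuous part), which is exactly the feature that defeats standard regression and ANOVA models in the abstract's argument.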

  1. ENTREPRENEURSHIP - BETWEEN ATTITUDE AND SUCCESS CONCEPTUAL AND STATISTICAL INTERPRETATION

    Directory of Open Access Journals (Sweden)

    CLAUDIA ISAC

    2015-10-01

    Full Text Available In Romania the term entrepreneurship became popular after 1990, when private capital companies were brought under regulation. In this context, the terms used in this paper supplement or emphasize the elements necessary to assess the entrepreneurial phenomenon. Thus, in the first part of the paper we have presented, besides an evolution of the term “entrepreneur”, a classification according to classical theories, such as the one belonging to Adam Smith, as well as current classifications reflecting changes in the business environment, namely the exponential growth of online businesses due to the Internet. At the end of the paper we have presented a statistical analysis of the entrepreneurial phenomenon in Romania and at the European or international level, according to several criteria, such as the attitude towards entrepreneurship and the development of a business, age groups and the attitude towards entrepreneurship, the reasons that influence the development of a business, etc.

  2. Inferring the statistical interpretation of quantum mechanics from the classical limit

    Science.gov (United States)

    Gottfried

    2000-06-01

    It is widely believed that the statistical interpretation of quantum mechanics cannot be inferred from the Schrödinger equation itself, and must be stated as an additional independent axiom. Here I propose that the situation is not so stark. For systems that have both continuous and discrete degrees of freedom (such as coordinates and spin, respectively), the statistical interpretation for the discrete variables is implied by requiring that the system's gross motion can be classically described under circumstances specified by the Schrödinger equation. However, this is not a full-fledged derivation of the statistical interpretation because it does not apply to the continuous variables of classical mechanics.

  3. Statistic Non-Parametric Methods of Measurement and Interpretation of Existing Statistic Connections within Seaside Hydro Tourism

    OpenAIRE

    MIRELA SECARĂ

    2008-01-01

    Tourism represents an important field of economic and social life in our country, and the main sector of the economy of Constanta County is the balneary (spa) tourism of the Romanian seaside. In order to statistically analyze hydro tourism on the Romanian seaside, we applied non-parametric methods for measuring and interpreting the statistical connections existing within seaside hydro tourism. The major objective of this research is the re-establishment of hydro tourism on the Romanian ...

  4. Easily Stated but Hard Statistical Problems

    Science.gov (United States)

    1986-05-01


  5. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research amounts to a search for a basic probability model.

  6. Statistics translated a step-by-step guide to analyzing and interpreting data

    CERN Document Server

    Terrell, Steven R

    2012-01-01

    Written in a humorous and encouraging style, this text shows how the most common statistical tools can be used to answer interesting real-world questions, presented as mysteries to be solved. Engaging research examples lead the reader through a series of six steps, from identifying a researchable problem to stating a hypothesis, identifying independent and dependent variables, and selecting and interpreting appropriate statistical tests. All techniques are demonstrated both manually and with the help of SPSS software. The book provides students and others who may need to read and interpret sta

  7. Statistical Tools for the Interpretation of Enzootic West Nile virus Transmission Dynamics.

    Science.gov (United States)

    Caillouët, Kevin A; Robertson, Suzanne

    2016-01-01

    Interpretation of enzootic West Nile virus (WNV) surveillance indicators requires little advanced mathematical skill, but greatly enhances the ability of public health officials to prescribe effective WNV management tactics. Stepwise procedures for the calculation of mosquito infection rates (IR) and vector index (VI) are presented alongside statistical tools that require additional computation. A brief review of advantages and important considerations for each statistic's use is provided.
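The two indicators named above can be sketched in a few lines; the pool counts and abundances below are made up, and the per-1000 scaling follows the common surveillance convention.

```python
def minimum_infection_rate(positive_pools, total_mosquitoes):
    """Minimum infection rate (MIR): assumes at most one infected mosquito
    per positive pool, expressed per 1000 mosquitoes tested."""
    return 1000.0 * positive_pools / total_mosquitoes

def vector_index(species_data):
    """Vector index (VI): sum over species of (average abundance per trap
    night x proportion infected). species_data maps species name to a
    (abundance, infection_rate_per_1000) pair."""
    return sum(ab * ir / 1000.0 for ab, ir in species_data.values())

# Hypothetical surveillance week: 6 positive pools out of 1500 mosquitoes.
mir = minimum_infection_rate(positive_pools=6, total_mosquitoes=1500)
vi = vector_index({"Cx. quinquefasciatus": (25.0, 4.0),
                   "Cx. pipiens": (10.0, 2.5)})
```

The VI combines abundance and infection into one number, which is why it is often preferred over either component alone when prescribing management tactics.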

  8. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  9. New physicochemical interpretations for the adsorption of food dyes on chitosan films using statistical physics treatment.

    Science.gov (United States)

    Dotto, G L; Pinto, L A A; Hachicha, M A; Knani, S

    2015-03-15

    In this work, statistical physics treatment was employed to study the adsorption of food dyes onto chitosan films, in order to obtain new physicochemical interpretations at the molecular level. Experimental equilibrium curves were obtained for the adsorption of four dyes (FD&C red 2, FD&C yellow 5, FD&C blue 2, Acid Red 51) at different temperatures (298, 313 and 328 K). A statistical physics formula was used to interpret these curves, and parameters such as the number of adsorbed dye molecules per site (n), the anchorage number (n'), the receptor site density (NM), the adsorbed quantity at saturation (Nasat), the steric hindrance (τ), the concentration at half saturation (c1/2) and the molar adsorption energy (ΔEa) were estimated. The relation of these parameters to the chemical structure of the dyes and to temperature was evaluated and interpreted.

  10. Correlation-based interpretations of paleoclimate data - where statistics meet past climates

    Science.gov (United States)

    Hu, Jun; Emile-Geay, Julien; Partin, Judson

    2017-02-01

    Correlation analysis is omnipresent in paleoclimatology, and often serves to support the proposed climatic interpretation of a given proxy record. However, this analysis presents several statistical challenges, each of which is sufficient to nullify the interpretation: the loss of degrees of freedom due to serial correlation, the test multiplicity problem in connection with a climate field, and the presence of age uncertainties. While these issues have long been known to statisticians, they are not widely appreciated by the wider paleoclimate community; yet they can have a first-order impact on scientific conclusions. Here we use three examples from the recent paleoclimate literature to highlight how spurious correlations affect the published interpretations of paleoclimate proxies, and suggest that future studies should address these issues to strengthen their conclusions. In some cases, correlations that were previously claimed to be significant are found insignificant, thereby challenging published interpretations. In other cases, minor adjustments can be made to safeguard against these concerns. Because such problems arise so commonly with paleoclimate data, we provide open-source code to address them. Ultimately, we conclude that statistics alone cannot ground-truth a proxy, and recommend establishing a mechanistic understanding of a proxy signal as a sounder basis for interpretation.
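One standard safeguard against the first issue, the loss of degrees of freedom from serial correlation, is an effective-sample-size adjustment often attributed to Bretherton et al. (1999). A minimal sketch, using lag-1 autocorrelations as an AR(1) approximation:

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a 1-D series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return (x[:-1] * x[1:]).sum() / (x * x).sum()

def effective_n(x, y):
    """Effective sample size for the correlation of two serially correlated
    series: n_eff = n * (1 - r1*r2) / (1 + r1*r2), where r1 and r2 are the
    lag-1 autocorrelations. Significance tests should use n_eff, not n."""
    r1, r2 = lag1_autocorr(x), lag1_autocorr(y)
    n = len(x)
    return n * (1 - r1 * r2) / (1 + r1 * r2)
```

For white-noise series the adjustment is negligible (n_eff close to n), but for two strongly autocorrelated records, as proxy time series typically are, n_eff can be an order of magnitude smaller than n, which is what turns apparently significant correlations into insignificant ones.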

  11. Parameters of the Menzerath-Altmann law: Statistical mechanical interpretation as applied to a linguistic organization

    CERN Document Server

    Eroglu, Sertac

    2013-01-01

    The distribution behavior dictated by the Menzerath-Altmann (MA) law is frequently encountered in linguistic and natural organizations at various structural levels. The mathematical form of this empirical law comprises three fitting parameters whose values tend to be elusive, especially in inter-organizational studies. To allow interpretation of these parameters and better understand such distribution behavior, we present a statistical mechanical approach based on an analogy between the classical particles of a statistical mechanical organization and the number of distinct words in a textual organization. With this derivation, we achieve a transformed (generalized) form of the MA model, termed the statistical mechanical Menzerath-Altmann (SMMA) model. This novel transformed model consists of four parameters, one of which is a structure-dependent input parameter, and three of which are free-fitting parameters. Using distinct word data sets from two text corpora, we verified that the SMMA model describes the sa...

  12. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    Science.gov (United States)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed as the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy associated states.

  13. Misuse of statistics in the interpretation of data on low-level radiation

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.

    1982-01-01

    Four misuses of statistics in the interpretation of data of low-level radiation are reviewed: (1) post-hoc analysis and aggregation of data leading to faulty conclusions in the reanalysis of genetic effects of the atomic bomb, and premature conclusions on the Portsmouth Naval Shipyard data; (2) inappropriate adjustment for age and ignoring differences between urban and rural areas leading to potentially spurious increase in incidence of cancer at Rocky Flats; (3) hazard of summary statistics based on ill-conditioned individual rates leading to spurious association between childhood leukemia and fallout in Utah; and (4) the danger of prematurely published preliminary work with inadequate consideration of epidemiological problems - censored data - leading to inappropriate conclusions, needless alarm at the Portsmouth Naval Shipyard, and diversion of scarce research funds.

  14. Two Easily Made Astronomical Telescopes.

    Science.gov (United States)

    Hill, M.; Jacobs, D. J.

    1991-01-01

    The directions and diagrams for making a reflecting telescope and a refracting telescope are presented. These telescopes can be made by students out of plumbing parts and easily obtainable, inexpensive, optical components. (KR)

  15. Interpretations

    Science.gov (United States)

    Bellac, Michel Le

    2014-11-01

    Although nobody can question the practical efficiency of quantum mechanics, there remains the serious question of its interpretation. As Valerio Scarani puts it, "We do not feel at ease with the indistinguishability principle (that is, the superposition principle) and some of its consequences." Indeed, this principle which pervades the quantum world is in stark contradiction with our everyday experience. From the very beginning of quantum mechanics, a number of physicists--but not the majority of them!--have asked the question of its "interpretation". One may simply deny that there is a problem: according to proponents of the minimalist interpretation, quantum mechanics is self-sufficient and needs no interpretation. The point of view held by a majority of physicists, that of the Copenhagen interpretation, will be examined in Section 10.1. The crux of the problem lies in the status of the state vector introduced in the preceding chapter to describe a quantum system, which is no more than a symbolic representation for the Copenhagen school of thought. Conversely, one may try to attribute some "external reality" to this state vector, that is, a correspondence between the mathematical description and the physical reality. In this latter case, it is the measurement problem which is brought to the fore. In 1932, von Neumann was first to propose a global approach, in an attempt to build a purely quantum theory of measurement examined in Section 10.2. This theory still underlies modern approaches, among them those grounded on decoherence theory, or on the macroscopic character of the measuring apparatus: see Section 10.3. Finally, there are non-standard interpretations such as Everett's many worlds theory or the hidden variables theory of de Broglie and Bohm (Section 10.4). Note, however, that this variety of interpretations has no bearing whatsoever on the practical use of quantum mechanics. There is no controversy on the way we should use quantum mechanics!

  16. Optimal Statistical Operations for 3-Dimensional Rotational Data: Geometric Interpretations and Application to Prosthesis Kinematics

    Directory of Open Access Journals (Sweden)

    Øyvind Stavdahl

    2005-10-01

    Rotational data in the form of measured three-dimensional rotations or orientations arise naturally in many fields of science, including biomechanics, orthopaedics and robotics. The cyclic topology of rotation spaces calls for special care and considerations when performing statistical analysis of rotational data. Relevant theory has been developed during the last three decades and has become a standard tool in some areas. In relation to the study of human kinematics and motion, however, these concepts have hardly been put to use. This paper gives an introduction to the intricacies of three-dimensional rotations and provides a thorough geometric interpretation of several approaches to averaging rotational data. A set of novel, simple operators is presented. Simulations and a prosthetics-related real-world example involving wrist kinematics illuminate important aspects of the results. Finally, generalizations and related subjects for further research are suggested.

  17. A Statistical Framework for the Interpretation of mtDNA Mixtures: Forensic and Medical Applications

    Science.gov (United States)

    Egeland, Thore; Salas, Antonio

    2011-01-01

    Background Mitochondrial DNA (mtDNA) variation is commonly analyzed in a wide range of different biomedical applications. Cases where more than one individual contribute to a stain genotyped from some biological material give rise to a mixture. Most forensic mixture cases are analyzed using autosomal markers. In rape cases, Y-chromosome markers typically add useful information. However, there are important cases where autosomal and Y-chromosome markers fail to provide useful profiles. In some instances, usually involving small amounts or degraded DNA, mtDNA may be the only useful genetic evidence available. Mitochondrial DNA mixtures also arise in studies dealing with the role of mtDNA variation in tumorigenesis. Such mixtures may be generated by the tumor, but they could also originate in vitro due to inadvertent contamination or a sample mix-up. Methods/Principal Findings We present the statistical methods needed for mixture interpretation and emphasize the modifications required for the more well-known methods based on conventional markers to generalize to mtDNA mixtures. Two scenarios are considered. Firstly, only categorical mtDNA data is assumed available, that is, the variants contributing to the mixture. Secondly, quantitative data (peak heights or areas) on the allelic variants are also accessible. In cases where quantitative information is available in addition to allele designation, it is possible to extract more precise information by using regression models. More precisely, using quantitative information may lead to a unique solution in cases where the qualitative approach points to several possibilities. Importantly, these methods also apply to clinical cases where contamination is a potential alternative explanation for the data. Conclusions/Significance We argue that clinical and forensic scientists should give greater consideration to mtDNA for mixture interpretation. The results and examples show that the analysis of mtDNA mixtures contributes

  18. Statistical and population genetics issues of two Hungarian datasets from the aspect of DNA evidence interpretation.

    Science.gov (United States)

    Szabolcsi, Zoltán; Farkas, Zsuzsa; Borbély, Andrea; Bárány, Gusztáv; Varga, Dániel; Heinrich, Attila; Völgyi, Antónia; Pamjav, Horolma

    2015-11-01

    When the DNA profile from a crime scene matches that of a suspect, the weight of the DNA evidence depends on an unbiased estimation of the match probability of the profiles. For this reason, it is necessary to establish and expand databases that reflect the actual allele frequencies in the relevant population. 21,473 complete DNA profiles from Databank samples were used to establish the allele frequency database representing the population of Hungarian suspects. We used fifteen STR loci (PowerPlex ESI16), including five new ESS loci. The aim was to calculate the statistical, forensic efficiency parameters for the Databank samples and compare the newly detected data to the earlier report. The population substructure caused by relatedness may influence the estimated profile frequencies. As our Databank profiles were considered non-random samples, possible relationships between the suspects can be assumed. Therefore, the population inbreeding effect was estimated using the FIS calculation. The overall inbreeding parameter was found to be 0.0106. Furthermore, we tested the impact of the two allele frequency datasets on 101 randomly chosen STR profiles, including full and partial profiles. The 95% confidence interval estimates for the profile frequencies (pM) resulted in a tighter range when we used the new dataset compared with the previously published one. We found that FIS had less effect on frequency values in the 21,473 samples than the application of a minimum allele frequency. No genetic substructure was detected by STRUCTURE analysis. Due to the low level of the inbreeding effect and the high number of samples, the new dataset provides unbiased and precise estimates of LR for the statistical interpretation of forensic casework and allows us to use lower allele frequencies.
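
    The θ-corrected match probabilities that such a frequency database supports can be sketched as follows. This is an illustrative implementation of the widely used NRC II Recommendation 4.1 formulas, not the exact procedure of the paper; the allele frequencies are placeholders, and θ = 0.0106 simply reuses the FIS value reported in the abstract.

```python
def genotype_match_prob(p_i, p_j, theta=0.0):
    """NRC II Recommendation 4.1 single-locus match probability with a
    coancestry/inbreeding correction theta (theta = 0 reduces to
    plain Hardy-Weinberg: p^2 for homozygotes, 2*p_i*p_j otherwise)."""
    denom = (1 + theta) * (1 + 2 * theta)
    if p_i == p_j:  # homozygote (i, i)
        return ((2 * theta + (1 - theta) * p_i)
                * (3 * theta + (1 - theta) * p_i)) / denom
    # heterozygote (i, j)
    return 2 * (theta + (1 - theta) * p_i) * (theta + (1 - theta) * p_j) / denom

def profile_frequency(genotypes, theta=0.0106):
    """Multiply single-locus probabilities across independent STR loci (pM)."""
    pm = 1.0
    for p_i, p_j in genotypes:
        pm *= genotype_match_prob(p_i, p_j, theta)
    return pm

# Hypothetical two-locus profile with placeholder allele frequencies
pm = profile_frequency([(0.10, 0.20), (0.15, 0.15)])
```

    Note that the θ correction raises homozygote probabilities relative to the naive p², which is the conservative direction for the defendant.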

  19. MS/MS spectra interpretation as a statistical-mechanics problem.

    Science.gov (United States)

    Faccin, Mauro; Bruscolini, Pierpaolo

    2013-05-21

    We describe a new method for peptide sequencing based on mapping the interpretation of tandem mass spectra onto the analysis of the equilibrium distribution of a suitably defined physical model, whose variables describe the positions of the fragmentation sites along a discrete mass index. The model is governed by a potential energy function that, at present, we derive ad hoc from the distribution of peaks in a data set of experimental spectra. The statistical-physics perspective prompts a consistent and unified approach to de novo and database-search methods, which is a distinctive feature of this approach over alternative ones: the characterization of the ground state of the model allows the de novo identification of the precursor peptide; the study of the thermodynamic variables as a function of the (fictitious) temperature gives insight into the quality of the prediction; and the probability profiles at nonzero temperature reveal, on one hand, which fragments are more reliably predicted and, on the other hand, can be used as a spectrum-adapted, a posteriori score for database search. Results obtained with two different test data sets reveal a performance similar to that of other de novo and database-search methods, which is reasonable given the lack of aggressive optimization of the energy function at this stage. An important feature of the method is that it is quite general and can be applied with different choices of the energy function: we discuss its possible improvements and generalizations.

  20. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence face growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), the exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
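
    The CPI arithmetic itself is simple enough to sketch from its textbook definition: for each locus, square the summed frequencies of every allele observed in the mixture, then multiply across loci; CPE = 1 − CPI. The frequencies below are made up for illustration.

```python
def combined_probability_of_inclusion(locus_allele_freqs):
    """CPI: product over loci of the squared sum of the frequencies
    of all alleles observed in the mixture at that locus."""
    cpi = 1.0
    for freqs in locus_allele_freqs:
        cpi *= sum(freqs) ** 2
    return cpi

# Placeholder frequencies for the alleles seen at two loci of a mixture
mixture = [[0.10, 0.20, 0.05], [0.15, 0.30]]
cpi = combined_probability_of_inclusion(mixture)
cpe = 1.0 - cpi  # combined probability of exclusion
```

    One of the protocol points the article stresses is that loci with possible allele dropout must be left out of this product, since the squared-sum formula assumes all contributor alleles were detected.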

  1. Uses and Misuses of Student Evaluations of Teaching: The Interpretation of Differences in Teaching Evaluation Means Irrespective of Statistical Information

    Science.gov (United States)

    Boysen, Guy A.

    2015-01-01

    Student evaluations of teaching are among the most accepted and important indicators of college teachers' performance. However, faculty and administrators can overinterpret small variations in mean teaching evaluations. The current research examined the effect of including statistical information on the interpretation of teaching evaluations.…

  3. On statistical interpretation of quantum mechanics. Detection chambers and cascade measurements

    Energy Technology Data Exchange (ETDEWEB)

    Benzecri, J.-P.

    1986-01-01

    Using the analysis of a concrete experimental system and a formal schema, we interpret this dictum due to Heisenberg: in quantum mechanics, the frontier between object and observer can be arbitrarily moved in the direction of the latter.

  4. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper reviews two motivations for conducting "what if" analyses in Excel and "R" to understand statistical significance tests in the context of sample size. "What if" analyses can be used to teach students what statistical significance tests really do and, in applied research, either prospectively to estimate what sample size…

  5. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall

    2016-01-01

    Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are the cervical cancer risk assessments produced by the three modeling approaches (Kaplan–Meier, Cox regression, and the dynamic Bayesian network). However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
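
    As a concrete reference point for the classical side of this comparison, the Kaplan–Meier estimator can be written in a few lines. This is a minimal, dependency-free sketch with made-up follow-up data, not the study's code; production analyses would use a tested library.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up time per subject; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] evaluated at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = total = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            total += 1
            deaths += data[i][1]
            i += 1
        if deaths:  # censored-only times do not change S(t)
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= total
    return curve

# Toy cohort: event at t=1, censoring at t=2, event at t=3
curve = kaplan_meier([1, 2, 3], [1, 0, 1])
```

    The censored subject at t=2 leaves the risk set without lowering the survival estimate, which is exactly the information a naive "fraction still alive" calculation would throw away.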

  6. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    Energy Technology Data Exchange (ETDEWEB)

    Benzecri, J.P.

    1986-01-01

    Without attempting to identify ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first 3 of these 5 theses.

  7. Statistical Interpretation of Joint Multiplicity Distributions of Neutrons and Charged Particles

    CERN Document Server

    Töke, J; Skulski, W; Schröder, W U

    2001-01-01

    Experimental joint multiplicity distributions of neutrons and charged particles emitted in complex nuclear reactions provide an important test of theoretical models. This test is applied to three different theoretical models of nuclear multi-fragmentation, two of which fail it. The measurement of neutrons is decisive in distinguishing between the Berlin and Copenhagen models of nuclear multi-fragmentation and challenges the interpretation of pseudo-Arrhenius plots.

  8. [Statistical pitfalls, or how should we interpret numbers in the evaluation of a new treatment].

    Science.gov (United States)

    Martin-Du Pan, R

    1998-06-01

    Are cholesterol lowering drugs useful? Do they increase life expectancy? Do third generation oral contraceptives increase the risk of venous thromboembolism? Is there a worldwide decline in semen quality over the last 50 years? Do vitamin supplements improve your child's IQ? Does homeopathy work better than placebo? These questions illustrate some statistical problems and some bias encountered during clinical studies, which can lead to erroneous results. Type I and II errors, surveillance, prescription or publication bias as well as the healthy user effect are described. Problems of regression to the mean, limits of meta-analysis validity and other statistical problems are discussed.
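
    One of the pitfalls listed, regression to the mean, is easy to demonstrate by simulation: subjects selected for an extreme score on a noisy first measurement score closer to the population mean on a second measurement, with no treatment at all. A minimal sketch (all numbers synthetic):

```python
import random

random.seed(1)
N = 10_000
# Each subject: a stable "true value" plus independent measurement noise
true_value = [random.gauss(0.0, 1.0) for _ in range(N)]
measure_1 = [v + random.gauss(0.0, 1.0) for v in true_value]
measure_2 = [v + random.gauss(0.0, 1.0) for v in true_value]

# Select the apparent top 10% on the first measurement...
top = sorted(range(N), key=lambda i: measure_1[i], reverse=True)[: N // 10]
mean_1 = sum(measure_1[i] for i in top) / len(top)
mean_2 = sum(measure_2[i] for i in top) / len(top)
# ...their second scores fall back toward the population mean of 0,
# which an uncontrolled before/after study would misread as an effect
```

    With half the variance coming from noise, roughly half of the selected group's apparent elevation vanishes on remeasurement.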

  9. Translating the Statistical Representation of the Effects of Education Interventions into More Readily Interpretable Forms

    Science.gov (United States)

    Lipsey, Mark W.; Puzio, Kelly; Yun, Cathy; Hebert, Michael A.; Steinka-Fry, Kasia; Cole, Mikel W.; Roberts, Megan; Anthony, Karen S.; Busick, Matthew D.

    2012-01-01

    This paper is directed to researchers who conduct and report education intervention studies. Its purpose is to stimulate and guide them to go a step beyond reporting the statistics that emerge from their analysis of the differences between experimental groups on the respective outcome variables. With what is often very minimal additional effort,…

  10. A Statistical Mechanical Interpretation of Black Hole Entropy Based on an Orthonormal Frame Action

    CERN Document Server

    Epp, R J

    1998-01-01

    Carlip has shown that the entropy of the three-dimensional black hole has its origin in the statistical mechanics of microscopic states living at the horizon. Beginning with a certain orthonormal frame action, and applying similar methods, I show that an analogous result extends to the (Euclidean) black hole in any spacetime dimension. However, this approach still faces many interesting challenges, both technical and conceptual.

  11. Statistical Model for the Interpretation of Evidence for Bio-Signatures Simulated in virtual Mars Samples.

    Science.gov (United States)

    Mani, Peter; Heuer, Markus; Hofmann, Beda A.; Milliken, Kitty L.; West, Julia M.

    This paper evaluates a mathematical model of bio-signature search processes applied to Mars samples returned to Earth and studied inside a Mars Sample Return Facility (MSRF). A simple porosity model for a returned Mars sample, based on initial observations of Mars meteorites, was stochastically simulated and the data analysed in a computer study. The resulting false positive, true negative and false negative values, a typical output of the simulations, were statistically analysed. The results were used in Bayes' statistics to update the a priori probability of the presence of a bio-signature, and the resulting posterior probability was used in turn to improve the initial assumption about the presence of life forms in Mars material. Such an iterative algorithm can lead to a better estimate of the positive predictive value for life on Mars and therefore, together with Poisson statistics for a null result, it should be possible to bound the probability of the presence of extra-terrestrial bio-signatures to an upper level.
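
    The Bayesian updating step described, feeding each posterior back in as the next prior over repeated independent assays, can be sketched as follows. The sensitivity, false-positive rate, and prior below are illustrative placeholders, not values from the study.

```python
def posterior_biosignature(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(biosignature present | assay positive)."""
    p_positive = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_positive

# Iterate over several independent positive assays, reusing each
# posterior as the next prior (the paper's iterative scheme in spirit)
p = 0.01  # assumed a priori probability of a bio-signature
for _ in range(3):
    p = posterior_biosignature(p, sensitivity=0.9, false_positive_rate=0.05)
```

    Even with a skeptical prior, a few concordant independent positives drive the posterior close to one; a single positive with a 5% false-positive rate does not.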

  12. Statistical analysis of time-resolved emission from ensembles of semiconductor quantum dots: Interpretation of exponential decay models

    DEFF Research Database (Denmark)

    Van Driel, A.F.; Nikolaev, I.S.; Vergeer, P.

    2007-01-01

    We present a statistical analysis of time-resolved spontaneous emission decay curves from ensembles of emitters, such as semiconductor quantum dots, with the aim of interpreting ubiquitous non-single-exponential decay. Contrary to what is widely assumed, the density of excited emitters and the intensity in an emission decay curve are not proportional: the density is a time integral of the intensity. The integral relation is crucial to correctly interpret non-single-exponential decay. We derive the proper normalization for both a discrete and a continuous distribution of rates. We apply the analysis to recent examples of colloidal quantum dot emission in suspensions and in photonic crystals, and we find that this important class of emitters is well described by a log-normal distribution of decay rates with a narrow and a broad distribution, respectively. Finally, we briefly discuss…
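
    The central point, that intensity is the negative time derivative of the excited-emitter density, so the two are proportional only for a single-rate ensemble, can be made concrete with a discrete distribution of decay rates. A minimal numerical sketch with arbitrary rates and weights (not the paper's log-normal fit):

```python
import math

def density_and_intensity(rates, weights, t):
    """For a weighted mixture of exponential decays:
      density   n(t) = sum_k w_k * exp(-G_k * t)
      intensity I(t) = -dn/dt = sum_k w_k * G_k * exp(-G_k * t)
    I(t)/n(t) is constant only for a single-rate (delta) distribution."""
    n = sum(w * math.exp(-g * t) for g, w in zip(rates, weights))
    i = sum(w * g * math.exp(-g * t) for g, w in zip(rates, weights))
    return n, i

# Two-rate toy ensemble: the apparent decay rate I(t)/n(t) drifts downward
# in time as the fast-decaying emitters are depleted first
rates, weights = [1.0, 3.0], [0.5, 0.5]
n0, i0 = density_and_intensity(rates, weights, 0.0)
n2, i2 = density_and_intensity(rates, weights, 2.0)
```

    Fitting I(t) as if it were n(t) therefore biases any extracted rate distribution, which is the normalization issue the paper resolves.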

  13. Tsallis Statistical Interpretation of Transverse Momentum Spectra in High-Energy pA Collisions

    Directory of Open Access Journals (Sweden)

    Bao-Chun Li

    2015-01-01

    In Tsallis statistics, we investigate charged pion and proton production in pCu and pPb interactions at 3, 8, and 15 GeV/c. Two versions of the Tsallis distribution are implemented in a multisource thermal model. A comparison with experimental data from the HARP-CDP group shows that both can reproduce the transverse momentum spectra, but the improved form gives a better description. It is also found that the difference between q and q′ is small when the temperatures T = T′ for the same incident momentum and angular interval, and the value of q is greater than q′ in most cases.
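
    One common parameterization of a Tsallis transverse-momentum spectrum can be sketched as follows. The "standard" and "improved" forms compared in the paper differ in details (power index and normalization), so treat this as an illustrative variant, not the authors' exact formula; the mass, temperature, and q values are placeholders.

```python
import math

def tsallis_pt_spectrum(pt, mass, T, q, C=1.0):
    """Illustrative Tsallis form:
      dN/dpT = C * pT * mT * [1 + (q - 1) * mT / T]^(-q / (q - 1)),
    with transverse mass mT = sqrt(pT^2 + m^2).
    As q -> 1 this reduces to the Boltzmann form C * pT * mT * exp(-mT / T)."""
    mt = math.sqrt(pt * pt + mass * mass)
    return C * pt * mt * (1.0 + (q - 1.0) * mt / T) ** (-q / (q - 1.0))

# Larger q (stronger non-extensivity) gives a fatter high-pT tail,
# here for a pion-like mass of 0.14 GeV and T = 0.1 GeV
pion_low_q = tsallis_pt_spectrum(5.0, 0.14, 0.1, 1.01)
pion_high_q = tsallis_pt_spectrum(5.0, 0.14, 0.1, 1.10)
```

    The parameter q thus interpolates between a thermal exponential spectrum and a power-law tail, which is why it absorbs the multisource fluctuations described in the model.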

  14. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    Science.gov (United States)

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations, and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computing Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To apply the correct model properly, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques

  15. Effect of normalization on statistical and biological interpretation of gene expression profiles.

    Science.gov (United States)

    Qin, Shaopu; Kim, Jinhee; Arafat, Dalia; Gibson, Greg

    2012-01-01

    An under-appreciated aspect of the genetic analysis of gene expression is the impact of post-probe level normalization on biological inference. Here we contrast nine different methods for normalization of an Illumina bead-array gene expression profiling dataset consisting of peripheral blood samples from 189 individual participants in the Center for Health Discovery and Well Being study in Atlanta, quantifying differences in the inference of global variance components and covariance of gene expression, as well as the detection of variants that affect transcript abundance (eSNPs). The normalization strategies, all relative to raw log2 measures, include simple mean centering, two modes of transcript-level linear adjustment for technical factors, and for differential immune cell counts, variance normalization by interquartile range and by quantile, fitting the first 16 Principal Components, and supervised normalization using the SNM procedure with adjustment for cell counts. Robustness of genetic associations as a consequence of Pearson and Spearman rank correlation is also reported for each method, and it is shown that the normalization strategy has a far greater impact than correlation method. We describe similarities among methods, discuss the impact on biological interpretation, and make recommendations regarding appropriate strategies.
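
    Of the normalization strategies compared, quantile normalization is the simplest to sketch from first principles: every sample (column) is forced onto a common reference distribution, taken as the mean of the sorted columns. An illustrative, dependency-free version; note it breaks ties by row order, unlike the averaged-rank handling in production implementations, so treat it as a teaching sketch.

```python
def quantile_normalize(matrix):
    """Force every column of a rows-x-cols expression matrix to share the
    same empirical distribution (the mean of all sorted columns)."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    # Sort each column, remembering which row each value came from
    cols = [sorted((matrix[r][c], r) for r in range(n_rows))
            for c in range(n_cols)]
    # Reference distribution: mean across columns at each rank
    ref = [sum(col[rank][0] for col in cols) / n_cols
           for rank in range(n_rows)]
    # Write the reference value back to each row according to its rank
    out = [[0.0] * n_cols for _ in range(n_rows)]
    for c, col in enumerate(cols):
        for rank, (_, r) in enumerate(col):
            out[r][c] = ref[rank]
    return out

# Three genes (rows) x two samples (columns), toy values
normed = quantile_normalize([[5.0, 4.0], [2.0, 1.0], [3.0, 6.0]])
```

    After normalization both columns contain exactly the same set of values, which is precisely the property that can distort variance-component and eSNP inference when applied blindly, the paper's point.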

  16. Effect of Normalization on Statistical and Biological Interpretation of Gene Expression Profiles

    Directory of Open Access Journals (Sweden)

    Shaopu Peter Qin

    2013-05-01

    A neglected aspect of the genetic analysis of gene expression is the impact of normalization on biological inference. Here we contrast nine different methods for normalization of an Illumina bead-array gene expression profiling dataset consisting of peripheral blood samples from 189 individual participants in the Center for Health Discovery and Well-Being (CHDWB) study in Atlanta, quantifying differences in the inference of global variance components and covariance of gene expression, as well as the detection of eSNPs. The normalization strategies, all relative to raw log2 measures, include simple mean centering, two modes of transcript-level linear adjustment (for technical factors and for differential immune cell counts), variance normalization by inter-quartile range and by quantile, fitting the first 16 Principal Components, and supervised normalization using the SNM procedure with adjustment for cell counts. Robustness of genetic associations under Pearson and Spearman rank correlation is also reported for each method, and it is shown that the normalization strategy has a far greater impact than the correlation method. We describe similarities among methods, discuss the impact on biological interpretation, and make recommendations regarding appropriate strategies.

  17. Statistic-mathematical interpretation of some assessment parameters of the grassland ecosystem according to soil characteristics

    Science.gov (United States)

    Samfira, Ionel; Boldea, Marius; Popescu, Cosmin

    2012-09-01

    Significant parameters of permanent grasslands are represented by the pastoral value and Shannon and Simpson biodiversity indices. The dynamics of these parameters has been studied in several plant associations in Banat Plain, Romania. From the point of view of their typology, these permanent grasslands belong to the steppe area, series Festuca pseudovina, type Festuca pseudovina-Achilea millefolium, subtype Lolium perenne. The methods used for the purpose of this research included plant cover analysis (double meter method, calculation of Shannon and Simpson indices), and statistical methods of regression and correlation. The results show that, in the permanent grasslands in the plain region, when the pastoral value is average to low, the level of interspecific biodiversity is on the increase.

  18. Interpretation of Mueller matrix images based on polar decomposition and statistical discriminators to distinguish skin cancer

    Science.gov (United States)

    Chung, Jung R.; DeLaughter, Aimee H.; Baba, Justin S.; Spiegelman, Clifford H.; Amoss, M. S.; Cote, Gerard L.

    2003-07-01

    The Mueller matrix describes all the polarizing properties of a sample, and therefore the optical differences between cancerous and non-cancerous tissue should be present within the matrix elements. We present in this paper the Mueller matrices of three types of tissue; normal, benign mole, and malignant melanoma on a Sinclair swine model. Feature extraction is done on the Mueller matrix elements resulting in the retardance images, diattenuation images, and depolarization images. These images are analyzed in an attempt to determine the important factors for the identification of cancerous lesions from their benign counterparts. In addition, the extracted features are analyzed using statistical processing to develop an accurate classification scheme and to identify the importance of each parameter in the determination of cancerous versus non-cancerous tissue.

  19. The new interpretation of support vector machines on statistical learning theory

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper is concerned with the theoretical foundation of support vector machines (SVMs). The purpose is to develop further an exact relationship between SVMs and the statistical learning theory (SLT). As a representative, the standard C-support vector classification (C-SVC) is considered here. More precisely, we show that the decision function obtained by C-SVC is just one of the decision functions obtained by solving the optimization problem derived directly from the structural risk minimization principle. In addition, an interesting meaning of the parameter C in C-SVC is given by showing that C corresponds to the size of the decision function candidate set in the structural risk minimization principle.

  20. Interpretation of delayed neutron emission using a non-statistical approach

    CERN Document Server

    Shihab-Eldin, A A; Nuh, F M; Prussin, S G

    1976-01-01

    Experimental data on several delayed-neutron emitting systems exhibit characteristics not accounted for by the normal statistical model. Using a single-particle approach, the locations of, and relative beta-strengths to, configurations in the emitter nuclides populated by allowed Gamow-Teller transitions have been calculated and are in qualitative agreement with strength-function data for 85As, 87Br, 135Sb and 137I. Calculations of Pn-values for the bromine precursors A=87 to 92 are also in good agreement with experimental data. The lack of high-energy neutrons in spectra where excited states in the final nucleus are strongly populated can be traced qualitatively to particle-hole excitations contributing to the excited states. (16 refs).

  1. Statistical interpretation of transient current power-law decay in colloidal quantum dot arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sibatov, R T, E-mail: ren_sib@bk.ru [Ulyanovsk State University, 432000, 42 Leo Tolstoy Street, Ulyanovsk (Russian Federation)

    2011-08-01

    A new statistical model of the charge transport in colloidal quantum dot arrays is proposed. It takes into account the Coulomb blockade forbidding multiple occupancy of nanocrystals and the influence of the energetic disorder of interdot space. The model explains power-law current transients and the presence of the memory effect. The fractional differential analogue of Ohm's law is found phenomenologically for nanocrystal arrays. The model combines ideas that were considered as conflicting by other authors: the Scher-Montroll idea about the power-law distribution of waiting times in localized states for disordered semiconductors is applied taking into account Coulomb blockade; Novikov's condition about the asymptotic power-law distribution of time intervals between successful current pulses in conduction channels is fulfilled; and the carrier injection blocking predicted by Ginger and Greenham (2000 J. Appl. Phys. 87 1361) takes place.
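
    The power-law transient implied by such a fractional (memory-bearing) generalization of Ohm's law can be illustrated with the simplest step-response function, I(t) ∝ t^(α−1). The dispersion parameter α ∈ (0, 1) and the prefactor below are placeholders for illustration, not fitted values from the paper.

```python
import math

def transient_current(t, alpha, scale=1.0):
    """Illustrative power-law transient
      I(t) = scale * t^(alpha - 1) / Gamma(alpha),
    the step response associated with a fractional-order conduction law.
    On a log-log plot the current decays with slope (alpha - 1)."""
    return scale * t ** (alpha - 1.0) / math.gamma(alpha)

# alpha < 1 gives a decaying transient whose log-log slope reveals
# the waiting-time exponent of the underlying trapping statistics
slope = (math.log(transient_current(10.0, 0.5))
         - math.log(transient_current(1.0, 0.5))) / math.log(10.0)
```

    Measuring this slope is how such experiments extract the dispersion parameter, and hence the width of the waiting-time distribution, from a single current trace.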

  2. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pinan Dawkrajai; Keita Yoshioka; Analis A. Romero; Ding Zhu; A.D. Hill; Larry W. Lake

    2005-10-01

    This project is motivated by the increasing use of distributed temperature sensors for real-time monitoring of complex wells (horizontal, multilateral and multi-branching wells) to infer the profiles of oil, gas, and water entry. The measured information can be used to interpret flow profiles along the wellbore, including the junction and build sections. In this second project year, we have completed a forward model to predict temperature and pressure profiles in complex wells. As a comprehensive temperature model, we have developed an analytical reservoir flow model that takes into account Joule-Thomson effects in the near-well vicinity and a multiphase non-isothermal producing-wellbore model, and we couple these models, accounting for mass and heat transfer between them. For further inferences, such as water coning or gas evaporation, we will need a numerical non-isothermal reservoir simulator; unlike existing (thermal recovery, geothermal) simulators, it should capture the subtle temperature changes occurring during normal production. We show results from the analytical coupled model (analytical reservoir solution coupled with a numerical multi-segment well model) used to infer anomalous temperature or pressure profiles under various conditions, and preliminary results from the numerical coupled reservoir model, which solves the full matrix including wellbore grids. We applied Ramey's model to the build section and used an enthalpy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases, varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section.

  3. A Comprehensive Statistically-Based Method to Interpret Real-Time Flowing Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Keita Yoshioka; Pinan Dawkrajai; Analis A. Romero; Ding Zhu; A. D. Hill; Larry W. Lake

    2007-01-15

    With the recent development of temperature measurement systems, continuous temperature profiles can be obtained with high precision. Small temperature changes can be detected by modern instruments such as the fiber optic distributed temperature sensor (DTS) in intelligent completions, and can potentially aid the diagnosis of downhole flow conditions. In vertical wells, since elevational geothermal changes make the wellbore temperature sensitive to the amount and the type of fluids produced, temperature logs can be used successfully to diagnose downhole flow conditions. In horizontal wells, however, geothermal temperature changes along the wellbore are small, and interpretation of a temperature log becomes difficult; the primary temperature differences for each phase (oil, water, and gas) are caused by frictional effects. Therefore, in developing a thermal model for a horizontal wellbore, subtle temperature changes must be accounted for. In this project, we have rigorously derived governing equations for a producing horizontal wellbore and developed a model predicting temperature and pressure by coupling the wellbore and reservoir equations. We also applied Ramey's model (1962) to the build section and used an energy balance to infer the temperature profile at the junction. The multilateral wellbore temperature model was applied to a wide range of cases, varying fluid thermal properties, absolute values of temperature and pressure, geothermal gradients, flow rates from each lateral, and the trajectories of each build section. With the prediction models developed, we present inversion studies of synthetic and field examples. These results are essential to identify water or gas entry, to guide flow control devices in intelligent completions, and to decide whether reservoir stimulation is needed in particular horizontal sections. This study will complete and validate these inversion studies.
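
    The energy balance at a lateral junction mentioned above can be sketched as a steady-state enthalpy mixing rule; the stream names and values below are illustrative, not taken from the report.

    ```python
    # Minimal sketch: mixed temperature where two streams join, assuming
    # steady state, no heat loss at the junction and constant specific heats.
    def junction_temperature(m1, cp1, T1, m2, cp2, T2):
        """Enthalpy-balance mixing temperature of two streams."""
        return (m1 * cp1 * T1 + m2 * cp2 * T2) / (m1 * cp1 + m2 * cp2)

    # Example: main-bore oil stream meeting a cooler lateral inflow
    T_mix = junction_temperature(m1=12.0, cp1=2100.0, T1=95.0,   # kg/s, J/(kg K), degC
                                 m2=4.0,  cp2=2100.0, T2=88.0)
    print(f"junction temperature: {T_mix:.2f} degC")  # -> 93.25 degC
    ```

    With equal specific heats the result is just the mass-flow-weighted mean of the two inlet temperatures.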

  4. Evaluating Structural Equation Models for Categorical Outcomes: A New Test Statistic and a Practical Challenge of Interpretation.

    Science.gov (United States)

    Monroe, Scott; Cai, Li

    2015-01-01

    This research is concerned with two topics in assessing model fit for categorical data analysis. The first topic involves the application of a limited-information overall test, introduced in the item response theory literature, to structural equation modeling (SEM) of categorical outcome variables. Most popular SEM test statistics assess how well the model reproduces estimated polychoric correlations. In contrast, limited-information test statistics assess how well the underlying categorical data are reproduced. Here, the recently introduced C2 statistic of Cai and Monroe (2014) is applied. The second topic concerns how the root mean square error of approximation (RMSEA) fit index can be affected by the number of categories in the outcome variable. This relationship creates challenges for interpreting RMSEA. While the two topics initially appear unrelated, they may conveniently be studied in tandem since RMSEA is based on an overall test statistic, such as C2. The results are illustrated with an empirical application to data from a large-scale educational survey.
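
    The dependence of RMSEA on an overall fit statistic such as C2 follows from the standard point-estimate formula; the statistic, degrees of freedom and sample size below are made up for illustration.

    ```python
    import math

    def rmsea(stat, df, n):
        """RMSEA point estimate from a chi-square-distributed overall fit
        statistic with df degrees of freedom and sample size n."""
        return math.sqrt(max(stat - df, 0.0) / (df * (n - 1)))

    print(rmsea(stat=85.0, df=40, n=1000))  # noticeable misfit
    print(rmsea(stat=38.0, df=40, n=1000))  # statistic below df -> RMSEA = 0
    ```

    Because df enters the denominator, models with different numbers of categories (and hence different df) can yield different RMSEA values for comparable misfit, which is the interpretive challenge the abstract raises.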

  5. Hysteresis model and statistical interpretation of energy losses in non-oriented steels

    Energy Technology Data Exchange (ETDEWEB)

    Mănescu, Veronica, E-mail: veronica.paltanea@upb.ro; Păltânea, Gheorghe; Gavrilă, Horia

    2016-04-01

    In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, based on the Pescetti–Biorci method for the identification of the Preisach density. The excess and total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model because in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; the hysteresis cycle was then reconstructed. Using the same pattern for the Preisach distribution, the hysteresis cycle for 1 T was computed. The classical losses were calculated using a well-known formula, and the excess energy losses were determined by means of magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.
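
    The elementary-rectangular-loop picture behind the classical Preisach model can be sketched with a toy implementation using a uniform hysteron density; this is an illustrative sketch, not the Pescetti-Biorci identification used in the paper.

    ```python
    import numpy as np

    class Preisach:
        """Toy classical Preisach model: a uniform grid of rectangular
        hysterons on the half-plane alpha >= beta (up/down thresholds)."""
        def __init__(self, n=60, h_sat=1.0):
            a = np.linspace(-h_sat, h_sat, n)
            self.alpha, self.beta = np.meshgrid(a, a, indexing="ij")
            self.mask = self.alpha >= self.beta          # valid hysterons
            self.state = -np.ones_like(self.alpha)       # start fully negative

        def apply(self, h):
            # at field h: hysterons with alpha <= h are up, beta >= h are down
            self.state[self.mask & (h >= self.alpha)] = 1.0
            self.state[self.mask & (h <= self.beta)] = -1.0
            return self.state[self.mask].mean()          # normalized magnetization

    model = Preisach()
    up = [model.apply(h) for h in np.linspace(-1.2, 1.2, 49)]    # ascending sweep
    down = [model.apply(h) for h in np.linspace(1.2, -1.2, 49)]  # descending sweep
    # The branches differ at H = 0: that gap is the hysteresis loop
    print(f"M at H=0, ascending:  {up[24]:+.2f}")
    print(f"M at H=0, descending: {down[24]:+.2f}")
    ```

    The memory of the field history is carried entirely by the hysteron states, which is what makes the Preisach construction a natural match for Barkhausen-jump magnetization.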

  6. Hysteresis model and statistical interpretation of energy losses in non-oriented steels

    Science.gov (United States)

    Mănescu (Păltânea), Veronica; Păltânea, Gheorghe; Gavrilă, Horia

    2016-04-01

    In this paper the hysteresis energy losses in two non-oriented industrial steels (M400-65A and M800-65A) were determined by means of an efficient classical Preisach model, based on the Pescetti-Biorci method for the identification of the Preisach density. The excess and total energy losses were also determined, using a statistical framework based on magnetic object theory. The hysteresis energy losses in a non-oriented steel alloy depend on the peak magnetic polarization, and they can be computed using a Preisach model because in these materials there is a direct link between the elementary rectangular loops and the discontinuous character of the magnetization process (Barkhausen jumps). To determine the Preisach density it was necessary to measure the normal magnetization curve and the saturation hysteresis cycle. A system of equations was deduced and the Preisach density was calculated for a magnetic polarization of 1.5 T; the hysteresis cycle was then reconstructed. Using the same pattern for the Preisach distribution, the hysteresis cycle for 1 T was computed. The classical losses were calculated using a well-known formula, and the excess energy losses were determined by means of magnetic object theory. The total energy losses were mathematically reconstructed and compared with those measured experimentally.

  7. Adsorption of ethanol onto activated carbon: Modeling and consequent interpretations based on statistical physics treatment

    Science.gov (United States)

    Bouzid, Mohamed; Sellaoui, Lotfi; Khalfaoui, Mohamed; Belmabrouk, Hafedh; Lamine, Abdelmottaleb Ben

    2016-02-01

    In this work, we studied the adsorption of ethanol on three types of activated carbon, namely the parent Maxsorb III and two chemically modified activated carbons (H2-Maxsorb III and KOH-H2-Maxsorb III). This investigation was conducted on the basis of the grand canonical formalism of statistical physics and on simplifying assumptions, leading to three-parameter equations describing the adsorption of ethanol onto the three types of activated carbon. There was good agreement between the experimental data and the results obtained with the newly proposed equation. The parameters characterizing the adsorption isotherm were the number of adsorbed molecules per site n, the density of receptor sites per unit mass of adsorbent Nm, and the energetic parameter p1/2. They were estimated for the studied systems by nonlinear least-squares regression. The results show that the ethanol molecules were adsorbed in a perpendicular (or at least non-parallel) orientation to the adsorbent surface. The magnitude of the calculated adsorption energies reveals that ethanol is physisorbed onto activated carbon; both van der Waals and hydrogen-bonding interactions were involved in the adsorption process. The calculated values of the specific surface area AS prove that the three types of activated carbon have a highly microporous surface.
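
    A nonlinear least-squares fit of a three-parameter monolayer isotherm of the kind described above can be sketched as follows; the functional form Q(p) = n·Nm / (1 + (p1/2 / p)^n) and the data points are illustrative assumptions, not the Maxsorb measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def isotherm(p, n, Nm, p_half):
        """Three-parameter statistical-physics monolayer isotherm (assumed form)."""
        return n * Nm / (1.0 + (p_half / p) ** n)

    p = np.array([0.5, 1, 2, 4, 8, 16, 32])             # pressure (arb. units)
    q_true = isotherm(p, n=1.4, Nm=5.0, p_half=3.0)
    rng = np.random.default_rng(1)
    q_obs = q_true + rng.normal(0, 0.05, p.size)         # noisy synthetic data

    popt, _ = curve_fit(isotherm, p, q_obs, p0=[1.0, 4.0, 2.0])
    print("fitted (n, Nm, p_half):", np.round(popt, 2))
    ```

    In this framework n > 1 would indicate multimolecular (aggregated) adsorption per site, while the saturation uptake n·Nm fixes the plateau of the isotherm.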

  8. Exploring the Gross Schoenebeck (Germany) geothermal site using a statistical joint interpretation of magnetotelluric and seismic tomography models

    Energy Technology Data Exchange (ETDEWEB)

    Munoz, Gerard; Bauer, Klaus; Moeck, Inga; Schulze, Albrecht; Ritter, Oliver [Deutsches GeoForschungsZentrum (GFZ), Telegrafenberg, 14473 Potsdam (Germany)

    2010-03-15

    Exploration for geothermal resources is often challenging because there are no geophysical techniques that provide direct images of the parameters of interest, such as porosity, permeability and fluid content. Magnetotelluric (MT) and seismic tomography methods yield information about subsurface distribution of resistivity and seismic velocity on similar scales and resolution. The lack of a fundamental law linking the two parameters, however, has limited joint interpretation to a qualitative analysis. By using a statistical approach in which the resistivity and velocity models are investigated in the joint parameter space, we are able to identify regions of high correlation and map these classes (or structures) back onto the spatial domain. This technique, applied to a seismic tomography-MT profile in the area of the Gross Schoenebeck geothermal site, allows us to identify a number of classes in accordance with the local geology. In particular, a high-velocity, low-resistivity class is interpreted as related to areas with thinner layers of evaporites; regions where these sedimentary layers are highly fractured may be of higher permeability. (author)
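
    The joint-parameter-space idea above can be sketched by clustering co-located (log resistivity, velocity) pairs and mapping the class labels back onto the model grid; the two synthetic "models" below merely stand in for the MT and seismic tomography results.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    nz, nx = 40, 80
    # two synthetic units: a conductive, low-velocity layer inside a resistive host
    log_rho = np.full((nz, nx), 3.0) + 0.1 * rng.standard_normal((nz, nx))
    vel = np.full((nz, nx), 5.5) + 0.1 * rng.standard_normal((nz, nx))
    log_rho[15:25, :] -= 2.0      # conductive anomaly ...
    vel[15:25, :] -= 1.5          # ... that is also slow

    # cluster in the joint (log resistivity, velocity) parameter space
    features = np.column_stack([log_rho.ravel(), vel.ravel()])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    class_map = labels.reshape(nz, nx)   # classes mapped back to the spatial domain
    print("cells per class:", np.bincount(labels))
    ```

    Each class in `class_map` is a region of correlated resistivity-velocity behaviour that can then be interpreted geologically, which is the essence of the statistical joint interpretation.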

  9. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  10. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    Directory of Open Access Journals (Sweden)

    Lengerich Eugene J

    2008-11-01

    Background: Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context, and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results: We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations, and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal for identifying clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. Finally, we propose a
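
    The statistic that SaTScan maximizes over candidate windows is, in the Poisson case, a simple log-likelihood ratio; the sketch below shows its computation for made-up case counts (the scoring of only elevated-risk windows follows Kulldorff's usual formulation).

    ```python
    import math

    def poisson_llr(c, e, C, E):
        """Kulldorff-style Poisson log-likelihood ratio for a window with
        c cases (e expected) out of C total cases (E total expected);
        returns 0 unless the window is elevated-risk."""
        if c <= e * C / E:
            return 0.0
        inside = c * math.log(c / e)
        outside = (C - c) * math.log((C - c) / (E - e))
        return inside + outside - C * math.log(C / E)

    C, E = 500, 500.0               # totals over the study region (illustrative)
    for c, e in [(30, 20.0), (60, 20.0)]:
        print(f"c={c}, e={e}: LLR = {poisson_llr(c, e, C, E):.2f}")
    ```

    The scan procedure evaluates this LLR over many window locations and sizes and assesses the maximum by Monte Carlo replication, which is exactly why results depend on the scaling parameters discussed above.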

  11. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  12. Easily Retrievable Objects among the NEO Population

    CERN Document Server

    Yárnoz, D García; McInnes, C R

    2013-01-01

    Asteroids and comets are of strategic importance for science in an effort to understand the formation, evolution and composition of the Solar System. Near-Earth Objects (NEOs) are of particular interest because of their accessibility from Earth, but also because of their speculated wealth of material resources. The exploitation of these resources has long been discussed as a means to lower the cost of future space endeavours. In this paper, we consider the currently known NEO population and define a family of so-called Easily Retrievable Objects (EROs), objects that can be transported from accessible heliocentric orbits into the Earth's neighbourhood at affordable costs. The asteroid retrieval transfers are sought from the continuum of low energy transfers enabled by the dynamics of invariant manifolds; specifically, the retrieval transfers target planar, vertical Lyapunov and halo orbit families associated with the collinear equilibrium points of the Sun-Earth Circular Restricted Three Body problem. The judi...

  13. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    Science.gov (United States)

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for jointly interpreting nuclear well logs (natural gamma ray, density and neutron-porosity) and electrical well logs (long- and short-normal resistivity), in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and clay, the basalt alteration product. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large well-logging data sets with many variables must be interpreted.
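
    Deriving factor "score logs" from a suite of log curves can be sketched as follows; the synthetic data below stand in for the gamma-ray, density, neutron-porosity and resistivity curves (they are not the Kodana measurements, and the loadings are invented).

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    depth_samples = 300
    latent = rng.standard_normal((depth_samples, 2))        # two hidden lithology factors
    loadings = np.array([[1.0, 0.2], [0.8, -0.3], [-0.6, 0.9],
                         [0.1, 1.1], [0.2, 0.9]])            # 5 logs x 2 factors
    logs = latent @ loadings.T + 0.2 * rng.standard_normal((depth_samples, 5))

    fa = FactorAnalysis(n_components=2, random_state=0)
    scores = fa.fit_transform(logs)    # one "score log" per factor, vs depth
    print("score-log shape:", scores.shape)
    ```

    Each column of `scores` is plotted against depth like an ordinary log curve; intervals with similar factor scores are then grouped into lithological units, as in the cross-section described above.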

  14. Side effect of acting on the world: Acquisition of action-outcome statistic relation alters visual interpretation of action outcome

    Directory of Open Access Journals (Sweden)

    Takahiro eKawabe

    2013-09-01

    Humans can acquire the statistical features of the external world and employ them to control behaviors. Some external events occur in harmony with an agent's action, and thus humans should also be able to acquire the statistical relationship between an action and its external outcome. We report that acquired action-outcome statistical features alter the visual appearance of the action outcome. Pressing either of two assigned keys triggered visual motion whose direction was statistically biased either upward or downward, and observers judged the stimulus motion direction. Points of subjective equality (PSE) for judging motion direction were shifted repulsively from the mean of the distribution associated with each key. Our Bayesian model accounted for the PSE shifts, indicating optimal acquisition of the action-outcome statistical relation. The PSE shifts were moderately attenuated when the action-outcome contingency was reduced; the Bayesian model again accounted for the attenuated shifts. On the other hand, when the action-outcome contiguity was greatly reduced, the PSE shifts were greatly attenuated, and the Bayesian model could no longer account for them. The results indicate that visual appearance can be modified by prediction based on the optimal acquisition of the action-outcome causal relation.

  15. Interview with Yves Pomeau, Boltzmann Medallist 2016: The universality of statistical physics interpretation is ever more obvious.

    Science.gov (United States)

    Pomeau, Yves; Louët, Sabine

    2016-06-01

    During the StatPhys Conference on 20th July 2016 in Lyon, France, Yves Pomeau and Daan Frenkel will be awarded the most important prize in the field of Statistical Mechanics: the 2016 Boltzmann Medal, named after the Austrian physicist and philosopher Ludwig Boltzmann. The award recognises Pomeau's key contributions to the Statistical Physics of non-equilibrium phenomena in general and, in particular, his role in developing our modern understanding of fluid mechanics, instabilities, pattern formation and chaos. He is recognised as an outstanding theorist bridging disciplines from applied mathematics to statistical physics, with a profound impact on the neighbouring fields of turbulence and mechanics. In this article Sabine Louët interviews Pomeau, who is an Editor of the European Physical Journal Special Topics. He shares his views and tells how he experienced the rise of Statistical Mechanics in the past few decades. He also touches upon the need to provide funding to people who have the rare ability to discover new things and ideas, and not just to those who are good at filling in grant application forms.

  16. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    OpenAIRE

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high-throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) dat...

  17. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    Directory of Open Access Journals (Sweden)

    Olga Julià

    2010-01-01

    We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains grown primarily in batch cultures, others in chemostat cultures, and from aquatic bacterial populations. Cytometry scatters best fit the skew-Laplace distribution, while cell size as assessed by an electronic particle analyzer exhibited only a moderate fit. Unlike the cultures, the aquatic bacterial communities clearly do not fit a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for frequency distribution analysis in tandem with flow cytometric cell sorting.
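
    Fitting a skew- (asymmetric) Laplace distribution by maximum likelihood can be sketched as follows; the parameterization (mode mu, left/right scales a and b) and the synthetic "scatter" data are illustrative assumptions, not the paper's measurements.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(params, x):
        """Negative log-likelihood of a skew-Laplace density:
        f(x) = exp((x-mu)/a)/(a+b) for x < mu, exp(-(x-mu)/b)/(a+b) otherwise."""
        mu, log_a, log_b = params
        a, b = np.exp(log_a), np.exp(log_b)
        z = x - mu
        logpdf = np.where(z < 0, z / a, -z / b) - np.log(a + b)
        return -logpdf.sum()

    rng = np.random.default_rng(4)
    # synthetic skewed data: right tail heavier than the left
    x = np.where(rng.random(2000) < 0.3,
                 5.0 - rng.exponential(0.5, 2000),
                 5.0 + rng.exponential(1.5, 2000))

    res = minimize(neg_log_lik, x0=[np.median(x), 0.0, 0.0], args=(x,),
                   method="Nelder-Mead")
    mu, a, b = res.x[0], np.exp(res.x[1]), np.exp(res.x[2])
    print(f"mode = {mu:.2f}, left scale = {a:.2f}, right scale = {b:.2f}")
    ```

    Log-transforming the scales keeps them positive during optimization; the fitted asymmetry b > a captures the heavier right tail typical of cell-size and scatter distributions.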

  18. The effect of a graphical interpretation of a statistic trend indicator (Trigg's Tracking Variable) on the detection of simulated changes.

    Science.gov (United States)

    Kennedy, R R; Merry, A F

    2011-09-01

    Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has previously been demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model and updated every second; after a period of stability, a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five-level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles; 95% confidence interval 0.4 to 2.5 cycles; P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%], without trend detection 100% [98 to 100%]; P = 0.8). We conclude that simple statistical trend detection may speed the detection of changes during routine anaesthesia, even when a graphical trend display is present.
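
    Trigg's Tracking Variable is the ratio of the exponentially smoothed forecast error to the smoothed absolute error: values near ±1 signal a sustained trend, values near 0 random variation. The sketch below uses an illustrative smoothing constant and made-up signal values.

    ```python
    def triggs_tracking(values, alpha=0.2):
        """Trigg's Tracking Variable for a sequence of observations,
        using simple exponential smoothing as the forecast."""
        forecast, sm_err, sm_abs = values[0], 0.0, 1e-9
        ttv = []
        for x in values[1:]:
            err = x - forecast
            sm_err = alpha * err + (1 - alpha) * sm_err        # smoothed error
            sm_abs = alpha * abs(err) + (1 - alpha) * sm_abs   # smoothed |error|
            ttv.append(sm_err / sm_abs)
            forecast = alpha * x + (1 - alpha) * forecast
        return ttv

    stable = [100, 101, 99, 100, 101, 99, 100, 101, 99, 100]
    step = stable + [110, 111, 110, 112, 111, 110, 112, 111]
    print("TTV after stable run:  %.2f" % triggs_tracking(stable)[-1])
    print("TTV after step change: %.2f" % triggs_tracking(step)[-1])
    ```

    Thresholding |TTV| (here rendered as the five-level graphical indicator of the study) converts the statistic into the trend alarm the anaesthetists saw.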

  19. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    Science.gov (United States)

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, that 209 of them were statistically significant (p < 0.05), and that consequently the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed, since he incorrectly assumes that the NTP uses no correction for multiple comparisons and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision-making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p < 0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors.

  20. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder

    Directory of Open Access Journals (Sweden)

    Ewelina Dziurkowska

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in the saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for the HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for the interpretation of results obtained by laboratory diagnostic methods.
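
    The two unsupervised methods can be sketched on a synthetic 97 x 16 matrix (subjects x variables) standing in for the cortisol/hospitalization data; the group structure injected below is invented purely to give the methods something to find.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    X = rng.standard_normal((97, 16))
    X[:40] += 1.5                      # pretend one treatment group responds differently

    Xs = StandardScaler().fit_transform(X)   # standardize the 16 variables

    # PCA: project subjects onto the first two principal components
    pcs = PCA(n_components=2).fit_transform(Xs)

    # HCA: Ward linkage, then cut the dendrogram into two clusters
    groups = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
    print("PCA scores shape:", pcs.shape)
    print("cluster sizes:", np.bincount(groups)[1:])
    ```

    In practice one inspects the PCA score plot and the dendrogram side by side, which is the complementary use of the two methods that the abstract advocates.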

  1. Statistical physics modeling of hydrogen desorption from LaNi{sub 4.75}Fe{sub 0.25}: Stereographic and energetic interpretations

    Energy Technology Data Exchange (ETDEWEB)

    Wjihi, Sarra [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Dhaou, Houcine [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Yahia, Manel Ben; Knani, Salah [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia); Jemni, Abdelmajid [Laboratoire des Etudes des Systèmes Thermiques et Energétiques (LESTE), ENIM, Route de Kairouan, 5019 Monastir (Tunisia); Lamine, Abdelmottaleb Ben, E-mail: abdelmottaleb.benlamine@gmail.com [Unité de Recherche de Physique Quantique, 11 ES 54, Faculté des Science de Monastir (Tunisia)

    2015-12-15

    Statistical physics treatment is used to study the desorption of hydrogen on LaNi{sub 4.75}Fe{sub 0.25}, in order to obtain new physicochemical interpretations at the molecular level. Experimental desorption isotherms of hydrogen on LaNi{sub 4.75}Fe{sub 0.25} are fitted at three temperatures (293 K, 303 K and 313 K), using a monolayer desorption model. Six parameters of the model are fitted, namely the number of molecules per site n{sub α} and n{sub β}, the receptor site densities N{sub αM} and N{sub βM}, and the energetic parameters P{sub α} and P{sub β}. The behaviors of these parameters are discussed in relationship with desorption process. A dynamic study of the α and β phases in the desorption process was then carried out. Finally, the different thermodynamical potential functions are derived by statistical physics calculations from our adopted model.

  2. Multivariate Statistical Analysis as a Supplementary Tool for Interpretation of Variations in Salivary Cortisol Level in Women with Major Depressive Disorder.

    Science.gov (United States)

    Dziurkowska, Ewelina; Wesolowski, Marek

    2015-01-01

    Multivariate statistical analysis is widely used in medical studies as a profitable tool facilitating diagnosis of some diseases, for instance, cancer, allergy, pneumonia, or Alzheimer's and psychiatric diseases. Taking this into consideration, the aim of this study was to use two multivariate techniques, hierarchical cluster analysis (HCA) and principal component analysis (PCA), to disclose the relationship between the drugs used in the therapy of major depressive disorder, the salivary cortisol level, and the period of hospitalization. The cortisol contents in the saliva of depressed women were quantified by HPLC with UV detection day-to-day during the whole period of hospitalization. A data set with 16 variables (e.g., the patients' age, multiplicity and period of hospitalization, initial and final cortisol level, highest and lowest hormone level, mean contents, and medians) characterizing 97 subjects was used for the HCA and PCA calculations. Multivariate statistical analysis reveals that the various groups of antidepressants affect the salivary cortisol level to varying degrees. The SSRIs, SNRIs, and polypragmasy reduce the hormone secretion most effectively. Thus, both unsupervised pattern recognition methods, HCA and PCA, can be used as complementary tools for the interpretation of results obtained by laboratory diagnostic methods.

  3. How to Make Nothing Out of Something: Analyses of the Impact of Study Sampling and Statistical Interpretation in Misleading Meta-Analytic Conclusions.

    Science.gov (United States)

    Cunningham, Michael R; Baumeister, Roy F

    2016-01-01

    The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.'s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria, and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim-and-fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test (PET) and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect, contrary to their title.
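
    The PEESE procedure debated above is, at its core, a weighted regression of effect sizes on squared standard errors, with the intercept read as the "bias-corrected" effect at SE = 0. The simulated studies below are illustrative only (no publication bias is injected), so the sketch shows the mechanics, not the controversy.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_studies = 40
    se = rng.uniform(0.05, 0.4, n_studies)       # per-study standard errors
    true_effect = 0.3
    d = true_effect + rng.normal(0, se)          # observed effect sizes

    # PEESE: regress d on se^2 with inverse-variance weights
    X = np.column_stack([np.ones(n_studies), se ** 2])
    w = 1.0 / se ** 2
    W = np.sqrt(w)[:, None]
    beta, *_ = np.linalg.lstsq(W * X, np.sqrt(w) * d, rcond=None)
    print(f"PEESE intercept (bias-corrected estimate): {beta[0]:.2f}")
    ```

    Whether that intercept is a trustworthy estimate of the underlying effect, especially in small, heterogeneous literatures, is precisely the point of contention in the exchange the abstract describes.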

  4. iCFD: Interpreted Computational Fluid Dynamics - Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design - The secondary clarifier.

    Science.gov (United States)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat; Plósz, Benedek Gy

    2015-10-15

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models - computationally light tools, used e.g., as sub-models in systems analysis. The objective is to develop a new 1-D framework, referred to as interpreted CFD (iCFD) models, in which statistical meta-models are used to calculate the pseudo-dispersion coefficient (D) as a function of design and flow boundary conditions. The method - presented in a straightforward and transparent way - is illustrated using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor screening study and system understanding, 50 different sets of design and flow conditions are selected using Latin Hypercube Sampling (LHS). The boundary condition sets are imposed on a 2-D axi-symmetrical CFD simulation model of the SST. In the framework, to degenerate the 2-D model structure, CFD model outputs are approximated by the 1-D model through the calibration of three different model structures for D. Correlation equations for the D parameter then are identified as a function of the selected design and flow boundary conditions (meta-models), and their accuracy is evaluated against D values estimated in each numerical experiment. The evaluation and validation of the iCFD model structure is carried out using scenario simulation results obtained with parameters sampled from the corners of the LHS experimental region. For the studied SST, additional iCFD model development was carried out in terms of (i) assessing different density current sub-models; (ii) implementation of a combined flocculation, hindered, transient and compression settling velocity function; and (iii
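
    The sampling step described above (50 boundary-condition sets drawn by Latin Hypercube Sampling after factor screening) can be sketched as follows; the factor names and ranges below are invented placeholders, not values from the paper:

```python
# Hypothetical sketch of the LHS step only: drawing 50 design/flow condition
# sets with Latin Hypercube Sampling, as the abstract describes. Factor names
# and ranges are placeholders invented for illustration.
from scipy.stats import qmc

factors = {"inlet_flow": (50.0, 400.0),    # placeholder ranges
           "feed_solids": (2.0, 6.0),
           "tank_depth": (3.0, 5.0)}

sampler = qmc.LatinHypercube(d=len(factors), seed=1)
unit_sample = sampler.random(n=50)                  # 50 points in [0, 1)^d
lows = [lo for lo, hi in factors.values()]
highs = [hi for lo, hi in factors.values()]
design = qmc.scale(unit_sample, lows, highs)        # rescale to factor ranges

print(design.shape)   # one row of boundary conditions per CFD experiment
```

    Each row would then parameterize one 2-D axi-symmetrical CFD run, whose output the 1-D model approximates by calibrating D.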

  5. INVERSE ELECTRON TRANSFER IN PEROXYOXALATE CHEMIEXCITATION USING EASILY REDUCIBLE ACTIVATORS

    NARCIS (Netherlands)

    Bartoloni, Fernando Heering; Monteiro Leite Ciscato, Luiz Francisco; Augusto, Felipe Alberto; Baader, Wilhelm Josef

    2010-01-01

    INVERSE ELECTRON TRANSFER IN PEROXYOXALATE CHEMIEXCITATION USING EASILY REDUCIBLE ACTIVATORS. Chemiluminescence properties of the peroxyoxalate reaction in the presence of activators bearing electron withdrawing substituents were studied, to evaluate the possible occurrence of an inverse electron tr

  8. Interpretation of scrape-off layer profile evolution and first-wall ion flux statistics on JET using a stochastic framework based on filamentary motion

    Science.gov (United States)

    Walkden, N. R.; Wynn, A.; Militello, F.; Lipschultz, B.; Matthews, G.; Guillemaut, C.; Harrison, J.; Moulton, D.; Contributors, JET

    2017-08-01

    This paper presents the use of a novel modelling technique based around intermittent transport due to filament motion, to interpret experimental profile and fluctuation data in the scrape-off layer (SOL) of JET during the onset and evolution of a density profile shoulder. A baseline case is established, prior to shoulder formation, and the stochastic model is shown to be capable of simultaneously matching the time averaged profile measurement as well as the PDF shape and autocorrelation function from the ion-saturation current time series at the outer wall. Aspects of the stochastic model are then varied with the aim of producing a profile shoulder with statistical measurements consistent with experiment. This is achieved through a strong localised reduction in the density sink acting on the filaments within the model. The required reduction of the density sink occurs over a highly localised region with the timescale of the density sink increased by a factor of 25. This alone is found to be insufficient to model the expansion and flattening of the shoulder region as the density increases, which requires additional changes within the stochastic model. An example is found which includes both a reduction in the density sink and filament acceleration and provides a consistent match to the experimental data as the shoulder expands, though the uniqueness of this solution can not be guaranteed. Within the context of the stochastic model, this implies that the localised reduction in the density sink can trigger shoulder formation, but additional physics is required to explain the subsequent evolution of the profile.

  9. A toolkit for analyzing nonlinear dynamic stochastic models easily

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1995-01-01

    Often, researchers wish to analyze nonlinear dynamic discrete-time stochastic models. This paper provides a toolkit for solving such models easily, building on log-linearizing the necessary equations characterizing the equilibrium and solving for the recursive equilibrium law of motion with the meth

  11. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    Science.gov (United States)

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  13. [Easily implemented cognitive behaviour techniques in primary care (part 2)].

    Science.gov (United States)

    Ibáñez-Tarín, C; Manzanera-Escartí, R

    2014-01-01

    Cognitive behavioural therapy has been shown to be very effective for treating the vast majority of mental health disorders. In this second part of the article, we continue commenting on those techniques that can be easily used in the Primary Care setting. Copyright © 2011 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España. All rights reserved.

  14. An Introduction to Statistical Concepts

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    This comprehensive, flexible text is used in both one- and two-semester courses to review introductory through intermediate statistics. Instructors select the topics that are most appropriate for their course. Its conceptual approach helps students more easily understand the concepts and interpret SPSS and research results. Key concepts are simply stated and occasionally reintroduced and related to one another for reinforcement. Numerous examples demonstrate their relevance. This edition features more explanation to increase understanding of the concepts. Only crucial equations are included. I

  15. A method for easily customizable gradient gel electrophoresis.

    Science.gov (United States)

    Miller, Andrew J; Roman, Brandon; Norstrom, Eric

    2016-09-15

    Gradient polyacrylamide gel electrophoresis is a powerful tool for the resolution of polypeptides by relative mobility. Here, we present a simplified method for generating polyacrylamide gradient gels for routine analysis without the need for specialized mixing equipment. The method allows for easily customizable gradients which can be optimized for specific polypeptide resolution requirements. Moreover, the method eliminates the possibility of buffer cross contamination in mixing equipment, and the time and resources saved with this method in place of traditional gradient mixing, or the purchase of pre-cast gels, are noteworthy given the frequency with which many labs use gradient gel SDS-PAGE. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. An easily fabricated high performance ionic polymer based sensor network

    Science.gov (United States)

    Zhu, Zicai; Wang, Yanjie; Hu, Xiaopin; Sun, Xiaofei; Chang, Longfei; Lu, Pin

    2016-08-01

    Ionic polymer materials can generate an electrical potential from ion migration under an external force. For traditional ionic polymer metal composite sensors, the output voltage is very small (a few millivolts), and the fabrication process is complex and time-consuming. This letter presents an ionic polymer based network of pressure sensors which is easily and quickly constructed, and which can generate high voltage. A 3 × 3 sensor array was prepared by casting Nafion solution directly over copper wires. Under applied pressure, two different levels of voltage response were observed among the nine nodes in the array. For the group producing the higher level, peak voltages reached as high as 25 mV. Computational stress analysis revealed the physical origin of the different responses. High voltages resulting from the stress concentration and asymmetric structure can be further utilized to modify subsequent designs to improve the performance of similar sensors.

  17. iCFD: Interpreted Computational Fluid Dynamics – Degeneration of CFD to one-dimensional advection-dispersion models using statistical experimental design – The secondary clarifier

    DEFF Research Database (Denmark)

    Guyonvarch, Estelle; Ramin, Elham; Kulahci, Murat

    2015-01-01

    The present study aims at using statistically designed computational fluid dynamics (CFD) simulations as numerical experiments for the identification of one-dimensional (1-D) advection-dispersion models – computationally light tools, used e.g., as sub-models in systems analysis. The objective...... using the example of a circular secondary settling tank (SST). First, the significant design and flow factors are screened out by applying the statistical method of two-level fractional factorial design of experiments. Second, based on the number of significant factors identified through the factor...... both in 2-D and 1-D was undertaken. Results suggest that the iCFD model developed for the SST through the proposed methodology is able to predict solid distribution with high accuracy – taking a reasonable computational effort – when compared to multi-dimensional numerical experiments, under a wide...

  18. Primer on statistical interpretation or methods report card on propensity-score matching in the cardiology literature from 2004 to 2006: a systematic review.

    Science.gov (United States)

    Austin, Peter C

    2008-09-01

    Propensity-score matching is frequently used in the cardiology literature. Recent systematic reviews have found that this method is, in general, poorly implemented in the medical literature. The study objective was to examine the quality of the implementation of propensity-score matching in the general cardiology literature. A total of 44 articles published in the American Heart Journal, the American Journal of Cardiology, Circulation, the European Heart Journal, Heart, the International Journal of Cardiology, and the Journal of the American College of Cardiology between January 1, 2004, and December 31, 2006, were examined. Twenty of the 44 studies did not provide adequate information on how the propensity-score-matched pairs were formed. Fourteen studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. Only 4 studies explicitly used statistical methods appropriate for matched studies to compare baseline characteristics between treated and untreated subjects. Only 11 (25%) of the 44 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Only 2 studies described the matching method used, assessed balance in baseline covariates by appropriate methods, and used appropriate statistical methods to estimate the treatment effect and its significance. Application of propensity-score matching was poor in the cardiology literature. Suggestions for improving the reporting and analysis of studies that use propensity-score matching are provided.
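
    The review's central complaint is that matched analyses rarely report how pairs were formed or whether matching balanced baseline covariates. A minimal, hypothetical sketch of 1:1 nearest-neighbour matching with the standardized-difference balance check (synthetic data, not from any reviewed study):

```python
# Hedged sketch: greedy 1:1 nearest-neighbour matching on a propensity score,
# followed by the balance diagnostic the review calls for -- the standardized
# difference of a baseline covariate before and after matching. The "propensity
# score" is known by construction here; in practice it is estimated.
import numpy as np

rng = np.random.default_rng(42)
n = 500
age = rng.normal(60, 10, n)                      # baseline covariate
ps = 1 / (1 + np.exp(-(age - 60) / 10))          # treatment probability
treated = rng.random(n) < ps

def std_diff(x, t):
    """Standardized difference of covariate x between treated and untreated."""
    m1, m0 = x[t].mean(), x[~t].mean()
    s = np.sqrt((x[t].var(ddof=1) + x[~t].var(ddof=1)) / 2)
    return (m1 - m0) / s

# Greedy nearest-neighbour matching on the propensity score, without replacement
controls = list(np.flatnonzero(~treated))
pairs = []
for i in np.flatnonzero(treated):
    j = min(controls, key=lambda k: abs(ps[k] - ps[i]))
    pairs.append((i, j))
    controls.remove(j)
    if not controls:
        break

idx = np.array(pairs).ravel()                    # matched-sample indices
t_matched = treated[idx]
print(std_diff(age, treated), std_diff(age[idx], t_matched))
```

    Reporting the before/after standardized differences, and then analysing outcomes with methods appropriate for matched data, addresses the two failures the systematic review found most often.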

  19. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  20. On interpretation

    Directory of Open Access Journals (Sweden)

    Michał Januszkiewicz

    2013-01-01

    Full Text Available The article entitled “On interpretation” is an attempt to formulate a viewpoint on the issue of textual interpretation. It presents different ideas related to interpretation, including especially those that are concerned with a text’s meaning and with the way in which it is interpreted by the reader. The author proposes another interpretation method which he calls transactional. The primary concern is how to possibly justify the fundamental character of interpretation and interpretative activity while at the same time preserving and respecting the relative autonomy of an interpreted text.

  1. Technical Basis Document: A Statistical Basis for Interpreting Urinary Excretion of Plutonium Based on Accelerator Mass Spectrometry (AMS) for Selected Atoll Populations in the Marshall Islands

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, K; Hamilton, T F; Brown, T A; Martinelli, R E; Marchetti, A A; Kehl, S R; Langston, R G

    2007-05-01

    We have developed refined statistical and modeling techniques to assess low-level uptake and urinary excretion of plutonium from different population groups in the northern Marshall Islands. Urinary excretion rates of plutonium from the resident population on Enewetak Atoll and from resettlement workers living on Rongelap Atoll range from <1 to 8 µBq per day and are well below action levels established under the latest U.S. Department of Energy regulation 10 CFR 835 for in vitro bioassay monitoring of ²³⁹Pu. However, our statistical analyses show that urinary excretion of plutonium-239 (²³⁹Pu) from both cohort groups is significantly positively associated with volunteer age, especially for the resident population living on Enewetak Atoll. Urinary excretion of ²³⁹Pu from the Enewetak cohort was also found to be positively associated with estimates of cumulative exposure to worldwide fallout. Consequently, the age-related trends in urinary excretion of plutonium from Marshallese populations can be described by either a long-term component from residual systemic burdens acquired from previous exposures to worldwide fallout or a prompt (and eventual long-term) component acquired from low-level systemic intakes of plutonium associated with resettlement of the northern Marshall Islands, or some combination of both.

  2. Enhancing the interpretation of statistical P values in toxicology studies: implementation of linear mixed models (LMMs) and standardized effect sizes (SESs).

    Science.gov (United States)

    Schmidt, Kerstin; Schmidtke, Jörg; Kohl, Christian; Wilhelm, Ralf; Schiemann, Joachim; van der Voet, Hilko; Steinberg, Pablo

    2016-03-01

    In this paper, we compare the traditional ANOVA approach to analysing data from 90-day toxicity studies with a more modern LMM approach, and we investigate the use of standardized effect sizes. The LMM approach is used to analyse weight or feed consumption data. When compared to the week-by-week ANOVA with multiple test results per week, this approach results in only one statement on differences in weight development between groups. Standardized effect sizes are calculated for the endpoints: weight, relative organ weights, haematology and clinical biochemistry. The endpoints are standardized, allowing different endpoints of the same study to be compared and providing an overall picture of group differences at a glance. Furthermore, in terms of standardized effect sizes, statistical significance and biological relevance are displayed simultaneously in a graph.
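
    The standardized-effect-size idea above (express each endpoint's group difference in units of pooled standard deviation so heterogeneous endpoints become comparable at a glance) can be sketched minimally; the endpoint names and data below are invented for illustration:

```python
# Minimal sketch of standardized effect sizes (SES): Cohen's d per endpoint,
# so weight, relative organ weights and clinical biochemistry can be compared
# on one scale. Endpoint names and values are invented, not study data.
import numpy as np

rng = np.random.default_rng(7)
endpoints = {
    "body_weight_g": (rng.normal(300, 20, 10), rng.normal(310, 20, 10)),
    "rel_liver_wt":  (rng.normal(3.0, 0.2, 10), rng.normal(3.0, 0.2, 10)),
    "alt_u_per_l":   (rng.normal(40, 8, 10),  rng.normal(48, 8, 10)),
}

def cohens_d(control, treated):
    """Group difference divided by the pooled standard deviation."""
    n1, n2 = len(control), len(treated)
    sp = np.sqrt(((n1 - 1) * control.var(ddof=1) +
                  (n2 - 1) * treated.var(ddof=1)) / (n1 + n2 - 2))
    return (treated.mean() - control.mean()) / sp

ses = {name: cohens_d(c, t) for name, (c, t) in endpoints.items()}
for name, d in ses.items():
    print(f"{name:14s} d = {d:+.2f}")
```

    Plotting these d values with a band for biological relevance is one way to display statistical significance and relevance simultaneously, as the abstract describes.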

  3. Can difficult intubation be easily and rapidly predicted?

    Science.gov (United States)

    Fritscherova, Sarka; Adamus, Milan; Dostalova, Katerina; Koutna, Jirina; Hrabalek, Lumir; Zapletalova, Jana; Uvizl, Radovan; Janout, Vladimir

    2011-06-01

    Failed endotracheal intubation and inadequate ventilation with subsequent insufficient oxygenation can result in serious complications potentially leading to permanent health damage. Difficult intubation may occur not only in patients with apparent pathologies in the orofacial region but also, unexpectedly, in those without abnormalities. This study aimed at finding anthropometric parameters that are easy to examine and that would aid in predicting difficult intubation. A case-control study was undertaken. Based on defined criteria, 15 parameters were examined in patients with unanticipated difficult intubation. The parameters included a previous history of difficult intubation, pathologies associated with difficult intubation, clinical symptoms of airway pathology, the Mallampati score, upper lip bite test, receding mandible, and cervical spine and temporomandibular joint movement. Thyromental, hyomental and sternomental distances and inter-incisor gap were measured. The methods were precisely defined and the measurements were carried out by a trained anesthesiologist. Statistical analysis was performed on data from 74 patients with difficult intubation and 74 control patients with easy intubation. Significant predictors of difficult intubation were inter-incisor gap (IIG), thyromental distance (TMD) and class 3 limited movement of the temporomandibular joint. The IIG and TMD cut-offs were set at 42 mm and 93 mm, respectively. The results will be used to confirm these predictors in an anesthesiology clinic along with the aid of the laryngoscopic findings to improve the prediction of unanticipated difficult intubation.
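
    The significant predictors and cut-offs reported above (inter-incisor gap < 42 mm, thyromental distance < 93 mm, class 3 temporomandibular joint movement) could be turned into a bedside screening rule. The combination scheme below is invented for illustration; the study reports individual cut-offs, not this rule:

```python
# Toy sketch applying the abstract's reported predictors as simple risk flags.
# The idea of listing flags per patient is a hypothetical illustration, not the
# study's validated scoring system.
def difficult_intubation_flags(iig_mm, tmd_mm, tmj_class):
    """Return the list of reported risk factors present for one patient."""
    flags = []
    if iig_mm < 42:
        flags.append("inter-incisor gap < 42 mm")
    if tmd_mm < 93:
        flags.append("thyromental distance < 93 mm")
    if tmj_class == 3:
        flags.append("class 3 TMJ movement")
    return flags

print(difficult_intubation_flags(38, 90, 3))   # all three flags present
print(difficult_intubation_flags(50, 100, 1))  # no flags
```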

  4. Making large amounts of meteorological plots easily accessible to users

    Science.gov (United States)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    implementation, it presents the user's products in a single interface with fast access to the original product, and possibilities of synchronous animations between them. But its functionalities are being extended to give users the freedom to collect not only ecCharts's 2D maps and graphs, but also other ECMWF Web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user to interpret the large amount of information that ECMWF is providing. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs and show the new possibilities users have gained by using the web as a medium.

  5. Statistical interpretation of chromatic indicators in correlation to phytochemical profile of a sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes.

    Science.gov (United States)

    Tchabo, William; Ma, Yongkun; Kwaw, Emmanuel; Zhang, Haining; Xiao, Lulu; Apaliya, Maurice T

    2018-01-15

    The four different methods of color measurement of wine proposed by Boulton, Giusti, Glories and Commission International de l'Eclairage (CIE) were applied to assess the statistical relationship between the phytochemical profile and chromatic characteristics of sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes. The alteration in chromatic properties and phenolic composition of non-thermal aged mulberry wine were examined, aided by the used of Pearson correlation, cluster and principal component analysis. The results revealed a positive effect of non-thermal processes on phytochemical families of wines. From Pearson correlation analysis relationships between chromatic indexes and flavonols as well as anthocyanins were established. Cluster analysis highlighted similarities between Boulton and Giusti parameters, as well as Glories and CIE parameters in the assessment of chromatic properties of wines. Finally, principal component analysis was able to discriminate wines subjected to different maturation techniques on the basis of their chromatic and phenolics characteristics. Copyright © 2017. Published by Elsevier Ltd.

  6. Interpretability formalized

    NARCIS (Netherlands)

    Joosten, Joost Johannes

    2004-01-01

    The dissertation is in the first place a treatment of mathematical interpretations. Interpretations themselves will be studied, but they shall also be used to study formal theories. Interpretations, when used in comparing theories, tell us, in a natural way, something about proof-strength of form

  7. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
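
    The complementarity of P-values and confidence intervals described above can be shown on synthetic two-sample data (a sketch, unrelated to the studies the authors cite):

```python
# Sketch illustrating the abstract's point: a P-value and a confidence interval
# computed from the same two-sample data carry complementary information, and
# for the pooled t-test, P < 0.05 corresponds exactly to the 95% CI for the
# mean difference excluding zero. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
a = rng.normal(0.0, 1.0, 40)
b = rng.normal(0.5, 1.0, 40)

t, p = stats.ttest_ind(a, b)                 # pooled-variance t-test

# 95% CI for the difference in means (same pooled standard error)
diff = b.mean() - a.mean()
n1, n2 = len(a), len(b)
sp2 = ((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
crit = stats.t.ppf(0.975, n1 + n2 - 2)
ci = (diff - crit * se, diff + crit * se)

print(p, ci)
```

    The CI additionally conveys the magnitude and imprecision of the effect, which the P-value alone does not.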

  8. Landslides triggered by the 12 January 2010 Mw 7.0 Port-au-Prince, Haiti, earthquake: visual interpretation, inventory compiling and spatial distribution statistical analysis

    Directory of Open Access Journals (Sweden)

    C. Xu

    2014-02-01

    Full Text Available The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and their erosion thicknesses with topographic factors, seismic parameters, and their distance from roads. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed in an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29 700 000 m3. These landslides are of various types, mostly belonging to shallow disrupted landslides and rock falls, but also include coherent deep-seated landslides and rock slides. These landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of size distribution and morphometric parameters of co-seismic landslides were carried out and were compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo–Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these impact parameters on co-seismic landslides shows that slope angle is the strongest impact parameter on co-seismic landslide occurrence. Our co-seismic landslide inventory is

  9. Cross cultural aspects of health interpreting

    Institute of Scientific and Technical Information of China (English)

    昝婷

    2014-01-01

    In today's society, migration occurs easily and frequently. In the host society, migrants encounter language difficulties in different sectors, for example, in the medical context. In such cases medical interpreters are needed. As is well known, medical interpretation is extremely challenging. For medical interpreters, overcoming language and cultural obstacles is very important.

  10. Phonomicrosurgery simulation: A low-cost teaching model using easily accessible materials.

    Science.gov (United States)

    Zambricki, Elizabeth A; Bergeron, Jennifer L; DiRenzo, Elizabeth E; Sung, C Kwang

    2016-11-01

    To introduce the use of a new phonomicrosurgical trainer using easily accessible materials, and to establish the effectiveness of the model. The model uses a grape embedded in gelatin, a microscope, and microlaryngeal instruments. The study was designed to test baseline differences in training levels, as well as improvement in performance after training with the simulation model. Thirty subjects enrolled in the Stanford University School of Medicine otolaryngology training program performed microlaryngeal surgery tasks on a grape. Tasks were designed to model both excision of a vocal fold lesion and vocal fold injection. Anonymized video recordings comparing presimulation and postsimulation training were collected and graded by an expert laryngologist. Both objective comparison of skills and subjective participant surveys were analyzed. Objectively, trainees in all groups made statistically significant improvements across all tested variables, including microscope positioning, creation of a linear incision, elevation of epithelial flaps, excision of a crescent of tissue, vocal fold injection, preservation of remaining tissue, and time to complete all tasks. Subjectively, 100% of participants felt that they had increased comfort with microlaryngeal instruments and decreased intimidation of microlaryngeal surgery after completing the simulation training. This appreciation of skills was most notable and statistically significant in the intern trainees. Microlaryngeal surgical simulation is a tool that can be used to train residents to prepare them for phonomicrosurgical procedures at all levels of training. Our low-cost model with accessible materials can be easily duplicated and used to introduce trainees to microlaryngeal surgery or improve skills of more senior trainees. Laryngoscope, 126:2528-2533, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  11. A Statistical Thermodynamical Interpretation of Metabolism

    Directory of Open Access Journals (Sweden)

    Pornkamol Unrean

    2010-08-01

    Full Text Available The metabolic network of a cell can be decomposed into discrete elementary modes that contribute, each with a certain probability, to the overall flux through the metabolism. These modes are cell function supporting, fundamental pathways that represent permissible ‘quantum’ states of the metabolism. For the case that cellular regulatory mechanisms for pathway fluxes evolved in an unbiased way, we demonstrate theoretically that the usage probabilities of individual elementary modes are distributed according to Boltzmann’s distribution law such that the rate of entropy production is maximized. Such distribution can be observed experimentally in highly evolved metabolic networks. Therefore, cell function has a natural tendency to operate at a maximum rate of entropy generation using preferentially efficient pathways with small reaction entropies. Ultimately, evolution of metabolic networks appears to be driven by forces that can be quantified by the distance of the current metabolic state from the state of maximum entropy generation that represents the unbiased, most probable selection of fundamental pathway choices.
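
    The Boltzmann-distributed usage probabilities described above can be illustrated schematically; the per-mode "costs" and scale factor below are invented placeholders, not quantities from the article:

```python
# Schematic illustration (assumed, not from the article): usage probabilities
# of elementary flux modes following Boltzmann's law, p_i proportional to
# exp(-E_i / kT), where E_i stands in for a mode's reaction-entropy cost and
# kT sets the scale. Values are hypothetical.
import numpy as np

entropy_cost = np.array([1.0, 2.0, 4.0, 8.0])   # hypothetical per-mode costs
kT = 2.0

weights = np.exp(-entropy_cost / kT)
p = weights / weights.sum()                     # normalized usage probabilities

# Efficient pathways (small cost) dominate the flux distribution
print(p, p.sum())
```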

  12. Detecting and interpreting statistical lensing by absorbers

    CERN Document Server

    Ménard, B

    2004-01-01

    We propose a method for detecting gravitational magnification of distant sources, like quasars, due to absorber systems detected in their spectra. We first motivate the use of metal absorption lines rather than Lyman-alpha lines, then we show how to relate the observed moments of the source magnitude distribution to the mass distribution of absorbers. In order to illustrate the feasibility of the method, we use a simple model to estimate the amplitude of the effect expected for MgII absorption lines, and show that their lensing signal might already be detectable in large surveys like the SDSS. Our model suggests that quasars behind strong MgII absorbers are on average brightened by -0.05 to -0.2 magnitudes due to magnification. One must therefore revisit the claim that, in magnitude-limited surveys, quasars with strong absorbers tend to be missed due to extinction effects. In addition to constraining the mass of absorber systems, applying our method will allow for the quantification of this bias.
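
    The quoted brightening of -0.05 to -0.2 magnitudes follows from the standard relation between magnification and magnitude change (a quick check, not the authors' model):

```python
# A magnification mu brightens a source by delta_m = -2.5 * log10(mu) mag.
# Magnifications of roughly 5% and 20% reproduce the -0.05 to -0.2 mag range
# quoted in the abstract.
import math

def delta_mag(mu):
    """Magnitude change produced by lensing magnification mu."""
    return -2.5 * math.log10(mu)

for mu in (1.047, 1.20):
    print(f"mu = {mu:5.3f} -> delta m = {delta_mag(mu):+.3f}")
```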

  13. Fractures network analysis and interpretation in carbonate rocks using a multi-criteria statistical approach. Case study of Jebal Chamsi and Jebal Belkhir, South-western part of Tunisia

    Science.gov (United States)

    Msaddek, Mohamed Haythem; Moumni, Yahya; Chenini, Ismail; Mercier, Eric; Dlala, Mahmoud

    2016-11-01

    The quantitative analysis of fractures in carbonate rocks across termination folds is important for understanding the distribution and arrangement of fracture networks. In this study, we performed a quantitative analysis and interpretation of fracture networks to identify the type of fracture network, using a multi-criteria statistical analysis. The distribution of directional families in all measured stations and their elemental distribution are first examined. Then we performed the analysis of directional criteria for each pair and triplet of neighbouring stations. Finally, the elemental analyses of fracture families crossing others were carried out. This methodology was applied to the folds of the Jebal Chamsi and Jebal Belkhir areas located in south-western Tunisia, which are characterized by simple folds of carbonate geological formations. The application of the global and elemental statistical analysis criteria of directional families shows a random arrangement of fractures. However, elemental analysis of two and three neighbouring stations for families crossing one another shows a pseudo-organization of fracture arrangements.

  14. Objective interpretation as conforming interpretation

    Directory of Open Access Journals (Sweden)

    Lidka Rodak

    2011-12-01

    Full Text Available The practical discourse willingly uses the formula of “objective interpretation”, with no regard to its controversial nature, which has been discussed in the literature. The main aim of the article is to investigate what “objective interpretation” could mean and how it could be understood in practical discourse, focusing on the understanding offered by judicature. The thesis of the article is that objective interpretation, as identified with the textualists’ position, is not possible to uphold and should rather be linked with conforming interpretation. What this actually implies is that it is not the virtues of certainty and predictability, which are usually associated with objectivity, but coherence that forms the foundation of the applicability of objectivity in law. What can be observed from the analyses is that both conforming interpretation and objective interpretation play the role of arguments in the interpretive discourse, arguments that provide justification that an interpretation is not arbitrary or subjective. With regard to an important part of the ideology of legal application, namely the conviction that decisions should be taken on the basis of law in order to exclude arbitrariness, objective interpretation can be read as the question of what kind of authority “supports” a certain interpretation, one that is almost never free of judicial creativity and judicial activism. One can say that objective and conforming interpretation are just further arguments used in legal discourse.

  15. Assessing agreement on classification tasks the kappa statistic

    CERN Document Server

    Carletta, J

    1996-01-01

    Currently, computational linguists and cognitive scientists working in the area of discourse and dialogue argue that their subjective judgments are reliable using several different statistics, none of which are easily interpretable or comparable to each other. Meanwhile, researchers in content analysis have already experienced the same difficulties and come up with a solution in the kappa statistic. We discuss what is wrong with reliability measures as they are currently used for discourse and dialogue work in computational linguistics and cognitive science, and argue that we would be better off as a field adopting techniques from content analysis.
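
    The kappa statistic corrects observed agreement for the agreement expected by chance, κ = (p_o − p_e)/(1 − p_e). A minimal sketch of Cohen's kappa for two annotators (the labels are hypothetical):

    ```python
    from collections import Counter

    def cohen_kappa(a, b):
        """Cohen's kappa for two annotators' category labels."""
        n = len(a)
        p_o = sum(x == y for x, y in zip(a, b)) / n               # observed agreement
        ca, cb = Counter(a), Counter(b)
        categories = set(a) | set(b)
        p_e = sum((ca[c] / n) * (cb[c] / n) for c in categories)  # chance agreement
        return (p_o - p_e) / (1 - p_e)

    ann1 = ["yes", "yes", "no", "no"]
    ann2 = ["yes", "yes", "no", "yes"]
    print(cohen_kappa(ann1, ann2))  # 0.5
    ```

    Unlike raw percent agreement, κ is 0 for chance-level agreement and 1 for perfect agreement, which is what makes it comparable across coding schemes.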

  16. Quantum interpretations

    Energy Technology Data Exchange (ETDEWEB)

    Goernitz, T.; Weizsaecker, C.F.V.

    1987-10-01

    Four interpretations of quantum theory are compared: the Copenhagen interpretation (C.I.) with the additional assumption that the quantum description also applies to the mental states of the observer, and three recent ones, by Kochen, Deutsch, and Cramer. Since they interpret the same mathematical structure with the same empirical predictions, it is assumed that they formulate only different linguistic expressions of one identical theory. C.I. as a theory on human knowledge rests on a phenomenological description of time. It can be reconstructed from simple assumptions on predictions. Kochen shows that mathematically every composite system can be split into an object and an observer. Deutsch, with the same decomposition, describes futuric possibilities under the Everett term worlds. Cramer, using four-dimensional action at a distance (Wheeler-Feynman), describes all future events like past facts. All three can be described in the C.I. frame. The role of abstract nonlocality is discussed.

  17. Interpreting Physics

    CERN Document Server

    MacKinnon, Edward

    2012-01-01

    This book is the first to offer a systematic account of the role of language in the development and interpretation of physics. An historical-conceptual analysis of the co-evolution of mathematical and physical concepts leads to the classical/quantum interface. Bohrian orthodoxy stresses the indispensability of classical concepts and the functional role of mathematics. This book analyses ways of extending, and then going beyond, this orthodoxy. Finally, the book analyzes how a revised interpretation of physics impacts on basic philosophical issues: conceptual revolutions, realism, and r

  18. Interpreting Evidence.

    Science.gov (United States)

    Munsart, Craig A.

    1993-01-01

    Presents an activity that allows students to experience the type of discovery process that paleontologists necessarily followed during the early dinosaur explorations. Students are read parts of a story taken from the "American Journal of Science" and interpret the evidence leading to the discovery of Triceratops and Stegosaurus. (PR)

  19. Ambulatory blood pressure monitoring during pregnancy with a new, small, easily concealed monitor.

    Science.gov (United States)

    Tape, T G; Rayburn, W F; Bremer, K D; Schnoor, T A

    1994-12-01

    Before establishing the utility of ambulatory blood pressure monitoring during pregnancy, we evaluated the accuracy of a small, easily concealed monitor. The 59 normotensive pregnant patients were between 13 and 26 gestational weeks. For each monitor reading, two trained observers independently and simultaneously recorded blood pressures using a mercury manometer connected to the monitor cuff. Seven readings in three positions (sitting upright, semirecumbent, standing) were performed on each patient. Averaged differences between the observers' and monitor readings varied from -2.2 to -0.9 mm Hg (systolic) and from -2.8 to -0.6 (fifth-phase diastolic), indicating slight but clinically unimportant overestimation by the monitor. Correlations between averaged observers' readings and the monitor ranged from 0.79 to 0.92 (systolic) and from 0.85 to 0.92 (fifth-phase diastolic). Overall, the observers agreed with the monitor within 5 mm Hg on 94% of systolic readings and 99% of fifth-phase diastolic readings. There was no statistically significant difference in accuracy with changes in body position. We conclude that this small, quiet, noninvasive device accurately determined blood pressures during pregnancy.
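
    The agreement figures reported above come from simple paired comparisons: the monitor's bias is the mean of the paired differences, and the agreement rate is the fraction of readings within 5 mm Hg. A minimal sketch with hypothetical paired readings:

    ```python
    # Hypothetical paired systolic readings (mm Hg): observer average vs. monitor.
    observer = [112, 118, 105, 121, 109, 115, 126]
    monitor  = [110, 117, 104, 122, 107, 114, 125]

    diffs = [m - o for o, m in zip(observer, monitor)]
    mean_diff = sum(diffs) / len(diffs)                       # bias of the monitor
    within_5 = sum(abs(d) <= 5 for d in diffs) / len(diffs)   # agreement rate

    print(mean_diff, within_5)  # -1.0 1.0
    ```

    A small negative mean difference, as in the study, indicates slight underreading by the monitor relative to the observers.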

  20. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics Stymied by statistics? No fear ? this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou

  1. Cross cultural aspects of health interpreting

    Institute of Scientific and Technical Information of China (English)

    昝婷

    2014-01-01

    In today’s society, migration occurs easily and frequently. In the host society, migrants meet language difficulties in different sectors, for example in the medical context. In such cases medical interpreters are needed. As is well known, medical interpretation is extremely challenging. For medical interpreters, knowing how to overcome language and cultural obstacles is therefore very important.

  2. SLAR image interpretation keys for geographic analysis

    Science.gov (United States)

    Coiner, J. C.

    1972-01-01

    A means for side-looking airborne radar (SLAR) imagery to become a more widely used data source in geoscience and agriculture is suggested by providing interpretation keys as an easily implemented interpretation model. Interpretation problems faced by the researcher wishing to employ SLAR are specifically described, and the use of various types of image interpretation keys to overcome these problems is suggested. With examples drawn from agriculture and vegetation mapping, direct and associate dichotomous image interpretation keys are discussed and methods of constructing keys are outlined. Initial testing of the keys, key-based automated decision rules, and the role of the keys in an information system for agriculture are developed.

  3. P value interpretations and considerations

    Science.gov (United States)

    Ronna, Brenden; Ott, Ulrike

    2016-01-01

    Application and interpretation of statistical evaluation of relationships is a necessary element in biomedical research. Statistical analyses rely on the P value to demonstrate relationships. The traditional level of significance, P < 0.05, may be inadequate for complex relationships such as effect modification. PMID:27747028
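
    One transparent way to see what a P value measures is an exact permutation test: the P value is the fraction of group relabelings that produce a difference at least as extreme as the one observed. A minimal sketch with hypothetical data:

    ```python
    from itertools import combinations

    group_a = [10, 11, 12, 13]
    group_b = [20, 21, 22, 23]
    pooled = group_a + group_b
    observed = abs(sum(group_b) / 4 - sum(group_a) / 4)  # observed mean difference

    # Enumerate every way to split the pooled data into two groups of four.
    count = total = 0
    for idx in combinations(range(8), 4):
        a = [pooled[i] for i in idx]
        b = [pooled[i] for i in range(8) if i not in idx]
        total += 1
        if abs(sum(a) / 4 - sum(b) / 4) >= observed:
            count += 1

    p_value = count / total
    print(p_value)  # 2/70 ≈ 0.0286
    ```

    Here only the two extreme splits reproduce the observed separation, so the P value is small and the null hypothesis of exchangeable groups is rejected at the 0.05 level.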

  4. Tips for Interpretation

    Institute of Scientific and Technical Information of China (English)

    陈鹏; 罗新平

    2015-01-01

    This article offers tips for interpreting, including interpretation techniques and ways to improve interpreting skills through the practice of listening, speaking, reading and writing, in order to achieve better interpreting performance.

  5. CT Colonography: Pitfalls in Interpretation

    Science.gov (United States)

    Pickhardt, Perry J.; Kim, David H.

    2012-01-01

    Synopsis As with any radiologic imaging test, there are a number of potential interpretive pitfalls at CT colonography (CTC) that need to be recognized and handled appropriately. Perhaps the single most important step in learning to avoid most of these diagnostic traps is simply to be aware of their existence. With a little experience, most of these potential pitfalls will be easily recognized. This review will systematically cover the key pitfalls confronting the radiologist at CTC interpretation, primarily dividing them into those related to technique and those related to underlying anatomy. Tips and pointers for how to effectively handle these potential pitfalls are included. PMID:23182508

  6. Common pitfalls in statistical analysis: Clinical versus statistical significance

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In clinical research, study results that are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects their impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754
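
    The distinction can be illustrated with a simple two-sample z-test (normal approximation, standard library only): a clinically trivial mean difference becomes statistically significant once the sample is large enough. The numbers below are hypothetical:

    ```python
    import math

    def two_sample_p(mean_diff, sd, n):
        """Two-sided P value for a mean difference between two groups of size n,
        using the normal approximation."""
        z = mean_diff / (sd * math.sqrt(2 / n))
        return math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability

    # The same tiny effect (0.1 units, SD 1) at two sample sizes:
    p_small = two_sample_p(0.1, 1.0, 10)      # not significant
    p_large = two_sample_p(0.1, 1.0, 10000)   # highly significant

    print(p_small > 0.05, p_large < 0.05)  # True True
    ```

    The effect size is identical in both cases; only its statistical detectability changes, which is exactly why statistical significance alone cannot establish clinical importance.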

  7. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  8. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  9. Statistical Inference: The Big Picture.

    Science.gov (United States)

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  10. Injury Statistics

    Science.gov (United States)


  11. How to limit clinical errors in interpretation of data.

    Science.gov (United States)

    Wright, P; Jansen, C; Wyatt, J C

    1998-11-01

    We all assume that we can understand and correctly interpret what we read. However, interpretation is a collection of subtle processes that are easily influenced by poor presentation or wording of information. This article examines how evidence-based principles of information design can be applied to medical records to enhance clinical understanding and accuracy in interpretation of the detailed data that they contain.

  12. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  13. Statistical laws in linguistics

    CERN Document Server

    Altmann, Eduardo G

    2015-01-01

    Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws...
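
    Fitting such a law typically means estimating an exponent on a log-log scale. A minimal sketch with an idealized rank-frequency table (f(r) = C/r, so the fitted slope should be −1; real corpora deviate from this, which is the paper's point):

    ```python
    import math

    # Idealized Zipfian frequencies: f(r) = 100 / r for ranks 1..50.
    ranks = range(1, 51)
    freqs = [100 / r for r in ranks]

    # Least-squares slope of log f versus log r.
    xs = [math.log(r) for r in ranks]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)

    print(round(slope, 6))  # -1.0 for exactly Zipfian input
    ```

    For real text the interesting question is not the point estimate of the slope but how large the fluctuations around it are, compared with what a generative model predicts.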

  14. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  15. Statistical Neurodynamics.

    Science.gov (United States)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  16. Application of Interpretive Theory to Business Interpretation

    Institute of Scientific and Technical Information of China (English)

    刘杰

    2014-01-01

    Interpretive theory puts forward three phases of interpretation: understanding, deverbalization and re-expression. It requires both linguistic and non-linguistic knowledge. This essay discusses the application of interpretive theory to business interpretation from the perspectives of theory and practice.

  17. Statistics For Neuroscientists

    Directory of Open Access Journals (Sweden)

    Subbakrishna D.K

    2000-01-01

    Full Text Available The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate, it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of the precious data that are gathered. The article summarises some of the basic statistical concepts, followed by illustrations from live data generated from various research projects from the Department of Neurology of this Institute.

  18. Invention Activities Support Statistical Reasoning

    Science.gov (United States)

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  20. Changes in sport and physical activity behavior after participation in easily accessible sporting programs.

    NARCIS (Netherlands)

    Ooms, L.; Veenhof, C.

    2014-01-01

    Introduction: The Dutch government stimulates sport and physical activity opportunities in the neighborhood to make it easier for people to adopt a physically active lifestyle. Seven National Sports Federations (NSFs) were funded to develop easily accessible sporting programs, targeted at groups

  1. Summary and interpretive synthesis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    This chapter summarizes the major advances made through our integrated geological studies of the Lisburne Group in northern Alaska. The depositional history of the Lisburne Group is discussed in a framework of depositional sequence stratigraphy. Although individual parasequences (small-scale carbonate cycles) of the Wahoo Limestone cannot be correlated with certainty, parasequence sets can be interpreted as different systems tracts within the large-scale depositional sequences, providing insights on the paleoenvironments, paleogeography and platform geometry. Conodont biostratigraphy precisely established the position of the Mississippian-Pennsylvanian boundary within an important reference section, where established foraminiferal biostratigraphy is inconsistent with respect to conodont-based time-rock boundaries. However, existing Carboniferous conodont zonations are not readily applicable because most zonal indicators are absent, so a local zonation scheme was developed. Diagenetic studies of the Lisburne Group recognized nineteen subaerial exposure surfaces and developed a cement stratigraphy that includes: early cements associated with subaerial exposure surfaces in the Lisburne Group; cements associated with the sub-Permian unconformity; and later burial cements. Subaerial exposure surfaces in the Alapah Limestone are easily explained, being associated with peritidal environments at the boundaries of Sequence A. The Lisburne exposed in ANWR is generally tightly cemented and supermature, but could still be a good reservoir target in the adjacent subsurface of ANWR given the appropriate diagenetic, deformational and thermal history. Our ongoing research on the Lisburne Group will hopefully provide additional insights in future publications.

  2. Statistics a complete introduction

    CERN Document Server

    Graham, Alan

    2013-01-01

    Statistics: A Complete Introduction is the most comprehensive yet easy-to-use introduction to using Statistics. Written by a leading expert, this book will help you if you are studying for an important exam or essay, or if you simply want to improve your knowledge. The book covers all the key areas of Statistics including graphs, data interpretation, spreadsheets, regression, correlation and probability. Everything you will need is here in this one book. Each chapter includes not only an explanation of the knowledge and skills you need, but also worked examples and test questions.

  3. Practical business statistics

    CERN Document Server

    Siegel, Andrew

    2011-01-01

    Practical Business Statistics, Sixth Edition, is a conceptual, realistic, and matter-of-fact approach to managerial statistics that carefully maintains-but does not overemphasize-mathematical correctness. The book offers a deep understanding of how to learn from data and how to deal with uncertainty while promoting the use of practical computer applications. This teaches present and future managers how to use and understand statistics without an overdose of technical detail, enabling them to better understand the concepts at hand and to interpret results. The text uses excellent examples with

  4. An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions.

    Science.gov (United States)

    Ruan, Guihua; Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui; Du, Fuyou

    2016-04-22

    A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-Trypsin was then immobilized on the monolithic polyHIPE. The performance of the resulting enzyme reactor was assessed by its ability to convert Nα-benzoyl-l-arginine ethyl ester to Nα-benzoyl-l-arginine and by the protein digestibility of bovine serum albumin (BSA) and cytochrome c (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverages of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated easily with 4 M HCl without any structural damage. Its multiple interconnected chambers with good permeability, fast digestion and easy regenerability indicate that the polyHIPE enzyme reactor is a promising tool for proteomics and catalysis applications.

  5. Interpreters, Interpreting, and the Study of Bilingualism.

    Science.gov (United States)

    Valdes, Guadalupe; Angelelli, Claudia

    2003-01-01

    Discusses research on interpreting focused specifically on issues raised by this literature about the nature of bilingualism. Suggests research carried out on interpreting--while primarily produced with a professional audience in mind and concerned with improving the practice of interpreting--provides valuable insights about complex aspects of…

  6. Algebraic Statistics

    OpenAIRE

    Norén, Patrik

    2013-01-01

    Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges and therefore new algebraic results need often be developed. This way of interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...

  7. Statistical Methods for Astronomy

    CERN Document Server

    Feigelson, Eric D

    2012-01-01

    This review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests and point estimation for use in the analysis of modern astronomical data. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are treated. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered. Applied statistics relevant to astronomical research are briefly discussed: nonparametric methods for use when little is known about the behavior of the astronomical populations or processes; data smoothing with kernel density estimation and nonparametric regression; unsupervised clustering and supervised classification procedures for multivariate problems; survival analysis for astronomical datasets with nondetections; time- and frequency-domain time series analysis for light curves; and spatial statistics to interpret the spati...
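
    The bootstrap mentioned above can be sketched in a few lines: resample the data with replacement and read confidence limits off the percentiles of the resampled statistic. The data and seed below are arbitrary:

    ```python
    import random

    random.seed(42)  # fixed seed for reproducibility
    data = [4.1, 5.6, 3.9, 6.2, 5.0, 4.8, 5.9, 4.4, 5.3, 6.0]

    # Bootstrap distribution of the sample mean: 2000 resamples with replacement.
    boot_means = sorted(
        sum(random.choices(data, k=len(data))) / len(data)
        for _ in range(2000)
    )
    lo, hi = boot_means[49], boot_means[1949]  # approximate 95% percentile interval

    print(round(lo, 2), round(hi, 2))
    ```

    This is exactly the situation the review describes: the sampling distribution of the statistic need not be known analytically, because the resamples approximate it empirically.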

  8. Synthesis, Characterization, to application of water soluble and easily removable cationic pressure sensitive adhesives

    Energy Technology Data Exchange (ETDEWEB)

    Institute of Paper Science Technology

    2004-01-30

    In recent years, the world has expressed an increasing interest in the recycling of waste paper to supplement the use of virgin fiber as a way to protect the environment. Statistics show that major countries are increasing their use of recycled paper. For example, from 1991 to 1996, the U.S. increased its recovered paper utilization rate from 31% to 39%, Germany went from 50% to 60%, the UK went from 60% to 70%, France increased from 46% to 49%, and China went from 32% to 35% [1]. As recycled fiber levels and water system closures both increase, recycled product quality will need to improve in order for recycled products to compete with products made from virgin fiber [2]. The use of recycled fiber has introduced an increasing level of metal, plastic, and adhesive contamination into the papermaking process, which has added to the complexity of the already overwhelming task of providing a uniform and clean recycle furnish. The most harmful of these contaminants is a mixture of adhesives and polymeric substances that are commonly known as stickies. Stickies, which enter the mill with the pulp furnish, are not easily removed from the repulper and become more difficult to remove the further down the system they get. This can be detrimental to the final product quality. Stickies are hydrophobic, tacky, polymeric materials that are introduced into the papermaking system from a mixture of recycled fiber sources. Properties of stickies are very similar to the fibers used in papermaking, viz. size, density, hydrophobicity, and electrokinetic charge. This reduces the probability of their removal by conventional separation processes, such as screening and cleaning, which are based on such properties. Also, their physical and chemical structure allows them to extrude through screens and attach to fibers, process equipment, wires and felts. Stickies can break down and then reagglomerate and appear at seemingly any place in the mill. When subjected to a number of factors including changes

  9. Statistical Reform in School Psychology Research: A Synthesis

    Science.gov (United States)

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  11. Frenchglen Interpretive Plan

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The purpose of this interpretive plan is to provide guidance for the development of the interpretive exhibits for the Frenchglen Interpretive Center, as well as the...

  12. Targeting Lexicon in Interpreting.

    Science.gov (United States)

    Farghal, Mohammed; Shakir, Abdullah

    1994-01-01

    Studies student interpreters in the Master's Translation Program at Yarmouk University in Jordan. Analyzes the difficulties of these students, particularly regarding lexical competence, when interpreting from Arabic to English, emphasizing the need to teach lexicon all through interpreting programs. (HB)

  13. An easily regenerable enzyme reactor prepared from polymerized high internal phase emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, Guihua, E-mail: guihuaruan@hotmail.com [Guangxi Key Laboratory of Electrochemical and Magnetochemical Functional Materials, College of Chemistry and Bioengineering, Guilin University of Technology, Guangxi 541004 (China); Guangxi Collaborative Innovation Center for Water Pollution Control and Water Safety in Karst Area, Guilin University of Technology, Guilin 541004 (China); Wu, Zhenwei; Huang, Yipeng; Wei, Meiping; Su, Rihui [Guangxi Key Laboratory of Electrochemical and Magnetochemical Functional Materials, College of Chemistry and Bioengineering, Guilin University of Technology, Guangxi 541004 (China); Du, Fuyou, E-mail: dufu2005@126.com [Guangxi Key Laboratory of Electrochemical and Magnetochemical Functional Materials, College of Chemistry and Bioengineering, Guilin University of Technology, Guangxi 541004 (China); Guangxi Collaborative Innovation Center for Water Pollution Control and Water Safety in Karst Area, Guilin University of Technology, Guilin 541004 (China)

    2016-04-22

    A large-scale, high-efficiency enzyme reactor based on a polymerized high internal phase emulsion monolith (polyHIPE) was prepared. First, a porous cross-linked polyHIPE monolith was prepared by in-situ thermal polymerization of a high internal phase emulsion containing styrene, divinylbenzene and polyglutaraldehyde. The enzyme TPCK-trypsin was then immobilized on the monolithic polyHIPE. The performance of the resulting enzyme reactor was assessed by its ability to convert Nα-benzoyl-L-arginine ethyl ester to Nα-benzoyl-L-arginine, and by the protein digestibility of bovine serum albumin (BSA) and cytochrome c (Cyt-C). The results showed that the prepared enzyme reactor exhibited high enzyme immobilization efficiency and fast, easily controlled protein digestion. BSA and Cyt-C could be digested in 10 min with sequence coverages of 59% and 78%, respectively. The peptides and residual protein could be easily rinsed out of the reactor, and the reactor could be regenerated easily with 4 M HCl without any structural damage. Its multiple interconnected chambers with good permeability, fast digestion, and easy regenerability indicate that the polyHIPE enzyme reactor is a promising candidate for applications in proteomics and catalysis. - Graphical abstract: Schematic illustration of the preparation of a hypercrosslinked polyHIPE immobilized-enzyme reactor for on-column protein digestion. - Highlights: • A reactor was prepared and used for enzyme immobilization and continuous on-column protein digestion. • The new polyHIPE IMER is well suited for protein digestion. • On-column digestion revealed that the IMER is easily regenerated by HCl without structural damage.

  14. An easily synthesized blue polymer for high-performance polymer solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Ergang; Hellstroem, Stefan; Zhang, Fengling; Andersson, Mats R. [Department of Chemical and Biological, Engineering/Polymer Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Hou, Lintao; Wang, Zhongqiang; Inganaes, Olle [Biomolecular and Organic Electronics, IFM, and Center of Organic Electronics, Linkoeping University, SE-581 83 Linkoeping (Sweden)

    2010-12-07

    High-performance solar cells fabricated from an easily synthesized donor-acceptor polymer show a maximum power output of up to 6.0 mW cm⁻², with an open-circuit voltage of 0.89 V, a short-circuit current density of 10.5 mA cm⁻², and a fill factor of 0.64, making this polymer a particularly promising candidate for high-efficiency, low-cost polymer solar cells. (Copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
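
    The quoted figures are internally consistent: the maximum power density of a solar cell is the product of its open-circuit voltage, short-circuit current density, and fill factor. A quick check of the arithmetic, using the values quoted in the abstract:

```python
# Maximum power density from the quoted cell parameters: P_max = Voc * Jsc * FF.
voc = 0.89    # open-circuit voltage, V
jsc = 10.5    # short-circuit current density, mA/cm^2
ff = 0.64     # fill factor, dimensionless

p_max = voc * jsc * ff    # V * mA/cm^2 = mW/cm^2
print(round(p_max, 1))    # 6.0
```

    Under standard AM1.5G illumination (100 mW/cm²) this corresponds to a power conversion efficiency of about 6%.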

  15. Cellulose with a High Fractal Dimension Is Easily Hydrolysable under Acid Catalysis

    Directory of Open Access Journals (Sweden)

    Mariana Díaz

    2017-05-01

    Full Text Available The adsorption of three diverse amino acid couples onto the surface of microcrystalline cellulose was studied. Characterisation of the modified celluloses included changes in polarity and in roughness. The amino acids partially break down the hydrogen-bonding network of the cellulose structure, leading to more reactive cellulose residues that were easily hydrolysed to glucose in the presence of hydrochloric acid or tungstophosphoric acid catalysts. The conversion of cellulose and the selectivity for glucose were highly dependent on the self-assembled amino acids adsorbed onto the cellulose and on the catalyst.

  16. Spider phobics more easily see a spider in morphed schematic pictures

    Directory of Open Access Journals (Sweden)

    Partchev Ivailo

    2007-11-01

    Full Text Available Abstract Background Individuals with social phobia are more likely to misinterpret ambiguous social situations as more threatening, i.e., they show an interpretive bias. This study investigated whether such a bias also exists in specific phobia. Methods Individuals with spider phobia or social phobia, spider aficionados and non-phobic controls saw morphed stimuli that gradually transformed from a schematic picture of a flower into a schematic picture of a spider by shifting the outlines of the petals until they turned into spider legs. Participants' task was to decide whether each stimulus was more similar to a spider, a flower, or neither object while EEG was recorded. Results An interpretive bias was found in spider phobia on the behavioral level: with the first opening of the petals of the flower anchor, spider phobics rated the stimuli as more unpleasant and arousing than the control groups and showed an elevated latent trait to classify a stimulus as a spider and a response-time advantage for spider-like stimuli. No cortical correlates of this interpretive bias could be identified at the level of ERPs. However, consistent with previous studies, social and spider phobic persons exhibited generally enhanced visual P1 amplitudes, indicative of hypervigilance in phobia. Conclusion Results suggest an interpretive bias and a generalization of phobia-specific responses in specific phobia. Similar effects have been observed in other anxiety disorders, such as social phobia and posttraumatic stress disorder.

  17. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    © 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography

  18. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  19. 75 FR 71671 - Federal Economic Statistics Advisory Committee Meeting

    Science.gov (United States)

    2010-11-24

    ... Committee will advise the Directors of the Economics and Statistics Administration's (ESA) two statistical... meeting is physically accessible to people with disabilities. Requests for sign language interpretation or...

  20. 76 FR 72160 - Federal Economic Statistics Advisory Committee Meeting

    Science.gov (United States)

    2011-11-22

    ... Committee will advise the Directors of the Economics and Statistics Administration's (ESA) two statistical... meeting is physically accessible to people with disabilities. Requests for sign language interpretation or...

  1. 78 FR 68024 - Federal Economic Statistics Advisory Committee Meeting

    Science.gov (United States)

    2013-11-13

    ... Committee will advise the Directors of the Economics and Statistics Administration's (ESA) two statistical... physically accessible to people with disabilities. Requests for sign language interpretation or other...

  2. 78 FR 30269 - Federal Economic Statistics Advisory Committee Meeting

    Science.gov (United States)

    2013-05-22

    ... Committee will advise the Directors of the Economics and Statistics Administration's (ESA) two statistical... accessible to people with disabilities. Requests for sign language interpretation or other auxiliary aids...

  3. Interpreting SUSHI-The Standardized Usage Statistics Harvesting Initiative Protocol%解读SUSHI——标准化的电子资源使用统计获取协议

    Institute of Scientific and Technical Information of China (English)

    杜莹琦; 郏琳

    2008-01-01

    SUSHI (the Standardized Usage Statistics Harvesting Initiative Protocol) is a web-service protocol based on SOAP (Simple Object Access Protocol) that provides an automated data-exchange method for retrieving usage-statistics reports for networked electronic resources. This paper analyzes SUSHI in depth in terms of its foundations, technical content, and practical applications.

  4. An AAA-DDD triply hydrogen-bonded complex easily accessible for supramolecular polymers.

    Science.gov (United States)

    Han, Yi-Fei; Chen, Wen-Qiang; Wang, Hong-Bo; Yuan, Ying-Xue; Wu, Na-Na; Song, Xiang-Zhi; Yang, Lan

    2014-12-15

    For a complementary hydrogen-bonded complex, when every hydrogen-bond acceptor is on one side and every hydrogen-bond donor is on the other, all secondary interactions are attractive and the complex is highly stable. AAA-DDD (A=acceptor, D=donor) is considered to be the most stable among triply hydrogen-bonded sequences. An easily synthesized and further derivatized AAA-DDD system is very desirable for hydrogen-bonded functional materials. In this case, AAA and DDD, starting from 4-methoxybenzaldehyde, were synthesized with the Hantzsch pyridine synthesis and the Friedländer annulation reaction. The association constant determined by fluorescence titration in chloroform at room temperature is 2.09×10⁷ M⁻¹. The AAA and DDD components are not coplanar, but form a V shape in the solid state. Supramolecular polymers based on the AAA-DDD triple hydrogen bond have also been developed. This work may make AAA-DDD triply hydrogen-bonded sequences easily accessible for stimuli-responsive materials.

  5. Morphological Characterization of a New and Easily Recognizable Nuclear Male Sterile Mutant of Sorghum (Sorghum bicolor)

    Science.gov (United States)

    Xin, Zhanguo; Huang, Jian; Smith, Ashley R.; Chen, Junping; Burke, John; Sattler, Scott E.

    2017-01-01

    Sorghum (Sorghum bicolor L. Moench) is one of the most important grain crops in the world. The nuclear male sterility (NMS) trait, which is caused by mutations in a nuclear gene, is valuable for hybrid breeding and genetic studies. Several NMS mutants have been reported previously, but none of them were well characterized. Here, we present our detailed morphological characterization of a new and easily recognizable NMS sorghum mutant, male sterile 8 (ms8), isolated from an elite inbred BTx623 mutagenized by ethyl methane sulfonate (EMS). Our results show that the ms8 mutant phenotype was caused by a mutation in a single recessive nuclear gene that is different from all available NMS loci reported in sorghum. In fertile sorghum plants, yellow anthers appear first during anthesis, while in the ms8 mutant, white hairy stigmas emerge first and only small white anthers are observed, making ms8 plants easily recognizable when flowering. Ovary development and seed production after manual pollination are normal in the ms8 mutant, indicating it is female fertile and male sterile only. We found that ms8 anthers did not produce pollen grains. Further analysis revealed that ms8 anthers were defective in tapetum development, which led to the arrest of pollen formation. As a stable male sterile mutant across different environments, greenhouses, and fields in different locations, the ms8 mutant could be a useful breeding tool. Moreover, ms8 might be important for elucidating male gametophyte development in sorghum and other plants. PMID:28052078

  6. Växjö interpretation-2003: realism of contexts

    CERN Document Server

    Khrenikov, A Yu

    2004-01-01

    We present a new variant of the Växjö interpretation: contextualistic statistical realism. Basic ideas of the Växjö interpretation-2001 are essentially clarified. We also discuss applications to biology, psychology, sociology, economics,...

  7. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
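
    The harmonic Poisson process described above can be simulated directly. The sketch below assumes an intensity λ(x) = c/x restricted to a window [a, b]; the constant c and the window are illustrative choices, not values from the paper. The expected count is c·ln(b/a), and conditionally on the count the points are log-uniform, which makes the scale invariance explicit:

```python
import numpy as np

rng = np.random.default_rng(0)
c, a, b = 2.0, 1.0, 100.0   # assumed intensity constant and window

# For intensity lambda(x) = c/x on [a, b], the count is Poisson with mean
# c * ln(b/a); given the count, points are i.i.d. with density proportional
# to 1/x, i.e., log-uniform on [a, b].
n = rng.poisson(c * np.log(b / a))
points = np.exp(rng.uniform(np.log(a), np.log(b), size=n))

# Scale invariance: rescaling the window by any k > 0 leaves the expected
# count unchanged, since c * ln(k*b / (k*a)) = c * ln(b/a).
for k in (0.1, 10.0):
    assert np.isclose(c * np.log(k * b / (k * a)), c * np.log(b / a))
```

    The log-uniform placement is one way to see the connections to Benford's law and power-law statistics mentioned in the abstract.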

  8. Latest analysis results and statistical interpretations for SUSY searches at √s = 13 TeV with two same-sign leptons, jets and E_T^miss at the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Cardillo, Fabio; Tornambe, Peter [Albert-Ludwigs Universitaet Freiburg (Germany)

    2016-07-01

    A search for supersymmetric phenomena in final states with two leptons of the same electric charge, jets, and missing transverse energy E_T^miss is presented. The production of same-sign lepton pairs or three leptons is induced only by rare Standard Model processes with very small cross-sections. The search thus benefits from low background and has good exclusion potential in compressed SUSY spectra. This analysis was already performed in Run 1 of the LHC and provided powerful exclusion limits for various SUSY scenarios. In the ongoing Run 2, the search was conducted with the full dataset of pp collisions at √s = 13 TeV recorded with the ATLAS detector in 2015, corresponding to a total integrated luminosity of 3.3 fb⁻¹. The sensitivity to a wide variety of supersymmetric models is illustrated by the interpretation of the results in the context of four different SUSY benchmark scenarios producing same-sign lepton signatures. The results can be used to set model-independent limits on new-physics signals as well as to tighten the existing limits on different supersymmetric scenarios relative to the previous Run 1 results. This talk presents the latest results of the same-sign/3L analysis published at the end of 2015. Furthermore, analysis details are addressed, and the prospects for the progressive data-taking during Run 2 are shown.

  9. Evaluation of easily measured risk factors in the prediction of osteoporotic fractures

    Directory of Open Access Journals (Sweden)

    Brown Jacques P

    2005-09-01

    Full Text Available Abstract Background Fracture represents the single most important clinical event in patients with osteoporosis, yet remains under-predicted. As few premonitory symptoms for fracture exist, it is of critical importance that physicians effectively and efficiently identify individuals at increased fracture risk. Methods Of 3426 postmenopausal women in CANDOO, 40, 158, 99, and 64 women developed a new hip, vertebral, wrist or rib fracture, respectively. Seven easily measured risk factors predictive of fracture in research trials were examined in clinical practice, including: age (, 65–69, 70–74, 75–79, 80+ years), rising from a chair with arms (yes, no), weight (≥ 57 kg), maternal history of hip fracture (yes, no), prior fracture after age 50 (yes, no), hip T-score (>-1, -1 to >-2.5, ≤-2.5), and current smoking status (yes, no). Multivariable logistic regression analysis was conducted. Results The inability to rise from a chair without the use of arms (3.58; 95% CI: 1.17, 10.93) was the most significant risk factor for new hip fracture. Notable risk factors for predicting new vertebral fractures were: low body weight (1.57; 95% CI: 1.04, 2.37), current smoking (1.95; 95% CI: 1.20, 3.18) and age between 75–79 years (1.96; 95% CI: 1.10, 3.51). New wrist fractures were significantly identified by low body weight (1.71; 95% CI: 1.01, 2.90) and prior fracture after 50 years (1.96; 95% CI: 1.19, 3.22). Predictors of new rib fractures include a maternal history of a hip fracture (2.89; 95% CI: 1.04, 8.08) and a prior fracture after 50 years (2.16; 95% CI: 1.20, 3.87). Conclusion This study has shown that there exists a variety of predictors of future fracture, besides BMD, that can be easily assessed by a physician. The significance of each variable depends on the site of incident fracture. Of greatest interest is that an inability to rise from a chair is perhaps the most readily identifiable significant risk factor for hip fracture and can be easily incorporated
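
    The abstract reports adjusted odds ratios with 95% confidence intervals from a multivariable logistic regression. As a simplified illustration of how such an interval is computed, here is the unadjusted odds ratio with a Wald 95% CI for a single risk factor from a hypothetical 2×2 table; the counts are invented for illustration and are not from CANDOO:

```python
import math

# Hypothetical 2x2 table: rows = risk factor present/absent,
# columns = fracture / no fracture (invented counts).
a, b = 12, 88     # exposed:   fractures, no fractures
c, d = 28, 872    # unexposed: fractures, no fractures

odds_ratio = (a * d) / (b * c)                  # unadjusted odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of ln(OR), Wald method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI: {lo:.2f} to {hi:.2f}")
```

    In the study itself the odds ratios are adjusted, i.e., each factor's coefficient comes from a logistic model fitted with all seven risk factors simultaneously.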

  10. The Emergent Copenhagen Interpretation of Quantum Mechanics

    CERN Document Server

    Hollowood, Timothy J

    2013-01-01

    We introduce a new and conceptually simple interpretation of quantum mechanics based on reduced density matrices of sub-systems from which the standard Copenhagen interpretation emerges as an effective description of macroscopically large systems. Wave function collapse is seen to be a useful but fundamentally unnecessary piece of prudent bookkeeping which is only valid for macro-systems. The new interpretation lies in a class of modal interpretations in that it applies to quantum systems that interact with a much larger environment. However, we show that it does not suffer from the problems that have plagued similar modal interpretations, such as macroscopic superpositions and rapid flipping between macroscopically distinct states. We describe how the interpretation fits neatly together with fully quantum formulations of statistical mechanics and how a measurement process can be viewed as a process of ergodicity breaking analogous to a phase transition. The key feature of the new interpretation is that joint p...

  11. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
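
    Several of the diagnostic-test measures named in this review follow directly from a 2×2 confusion matrix. A minimal sketch with hypothetical counts (the numbers are invented for illustration):

```python
# Hypothetical confusion-matrix counts for a diagnostic test.
tp, fn = 90, 10      # diseased patients: test positive / test negative
fp, tn = 30, 170     # healthy patients:  test positive / test negative

sensitivity = tp / (tp + fn)                # P(test+ | disease)
specificity = tn / (tn + fp)                # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)  # overall fraction correct
lr_pos = sensitivity / (1 - specificity)    # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity    # negative likelihood ratio

print(sensitivity, specificity, accuracy)
```

    With these counts the test is more sensitive (0.90) than specific (0.85); the likelihood ratios translate a positive or negative result into a shift in the odds of disease.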

  12. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. A short introduction to probabilities and statistics then lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics: owing to its general approach, the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics predominantly describes averaged properties of matter, ranging from few-particle systems and states of matter to stellar objects. Statistical thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of multiple-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  13. A 2D zinc-organic network being easily exfoliated into isolated sheets

    Science.gov (United States)

    Yu, Guihong; Li, Ruiqing; Leng, Zhihua; Gan, Shucai

    2016-08-01

    A metal-organic aggregate, namely {Zn2Cl2(BBC)}n (BBC = 4,4‧,4‧‧-(benzene-1,3,5-triyl-tris(benzene-4,1-diyl))tribenzoate), was obtained by solvothermal synthesis. Its structure features Zn2(COO)3 paddle-wheels with two chloride anions in axial positions and hexagonal pores in the layers. The exclusion of water from the precursor and the solvent plays a crucial role in the formation of the target compound. This compound can be easily dissolved in alkaline solution and exfoliated into isolated sheets, which suggests a novel route for the preparation of 2D materials.

  14. TRANSALPINA CAN EASILY BE CONSIDERED THE DIAMOND COUNTRY LANDSCAPES, ADVENTURE AND MYSTERY

    Directory of Open Access Journals (Sweden)

    Constanta ENEA

    2014-05-01

    Full Text Available If the Transfăgărăşan is the pearl of the Romanian mountains, this road can easily be considered the diamond of the country's landscapes, adventure and mystery. Hell's Kitchen has developed and evolved naturally. There was no certainty of success, and money was required to build the infrastructure first and then see whether investors would come, so we cannot blame the local authorities here. The difficulties encountered in implementing funding programs make funds hard enough to obtain. In this paper, I will briefly mention some ideas that could make the two cities, the holder of administratively to Rancière, the burgeoning tourist development area of Gorj County. I sincerely hope that there are among us other people with vision who want to stand up and take action to provide a decent future for our children.

  15. An Easily Operating Polymer 1×4 Optical Waveguide Switch Matrix Based on Vertical Couplers

    Institute of Scientific and Technical Information of China (English)

    Kaixin Chen; Pak L Chu; Hau Ping Chan; Kin S. Chiang

    2007-01-01

    A three-dimensional (3D) polymer thermo-optic (TO) 1×4 waveguide switch matrix based on vertical couplers is demonstrated. It consists of four basic 3D switch units, and because of its 3D structure its construction is compact, only 9 mm in length; moreover, the control logic of the entire switch is very simple, and the light signal can be easily switched to any output port by operating only a single switch unit. The finished devices exhibit a switching extinction ratio greater than 21 dB for all four output ports, the crosstalk between two adjacent output ports is lower than n, and the power for all switching units is about 50 mW.

  16. Solving block linear systems with low-rank off-diagonal blocks is easily parallelizable

    Energy Technology Data Exchange (ETDEWEB)

    Menkov, V. [Indiana Univ., Bloomington, IN (United States)

    1996-12-31

    An easily and efficiently parallelizable direct method is given for solving a block linear system Bx = y, where B = D + Q is the sum of a non-singular block diagonal matrix D and a matrix Q with low-rank blocks. This implicitly defines a new preconditioning method with an operation count close to the cost of calculating a matrix-vector product Qw for some w, plus at most twice the cost of calculating Qw for some w. When implemented on a parallel machine the processor utilization can be as good as that of those operations. Order estimates are given for the general case, and an implementation is compared to block SSOR preconditioning.
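
    The abstract does not spell out the algorithm, but the standard tool for systems of the form B = D + (low rank) is the Sherman-Morrison-Woodbury identity. Below is a sketch under the assumption Q = U Vᵀ with a diagonal D (the simplest instance of a block-diagonal D); the sizes and data are illustrative:

```python
import numpy as np

# Woodbury identity for B = D + U V^T with D cheap to invert and U, V of
# low rank r: x = D^-1 y - D^-1 U (I + V^T D^-1 U)^-1 V^T D^-1 y.
rng = np.random.default_rng(1)
n, r = 200, 4
d = rng.uniform(1.0, 2.0, n)            # D as a diagonal matrix (simplest case)
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
y = rng.standard_normal(n)

Dinv_y = y / d                          # D^-1 y (a D-solve; trivially parallel)
Dinv_U = U / d[:, None]                 # D^-1 U (r independent D-solves)
capacitance = np.eye(r) + V.T @ Dinv_U  # small r x r system
x = Dinv_y - Dinv_U @ np.linalg.solve(capacitance, V.T @ Dinv_y)

# Verify against a dense solve of the full system.
B = np.diag(d) + U @ V.T
assert np.allclose(B @ x, y)
```

    The dominant costs are the D-solves and the products with U and V, consistent with the operation count described above; on a parallel machine each block's D-solve proceeds independently.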

  17. Multiobjective optimization in quantitative structure-activity relationships: deriving accurate and interpretable QSARs.

    Science.gov (United States)

    Nicolotti, Orazio; Gillet, Valerie J; Fleming, Peter J; Green, Darren V S

    2002-11-07

    Deriving quantitative structure-activity relationship (QSAR) models that are accurate, reliable, and easily interpretable is a difficult task. In this study, two new methods have been developed that aim to find useful QSAR models representing an appropriate balance between model accuracy and complexity. Both methods are based on genetic programming (GP). The first method, referred to as genetic QSAR (or GPQSAR), uses a penalty function to control model complexity. GPQSAR is designed to derive a single linear model that represents an appropriate balance between the variance and the number of descriptors selected for the model. The second method, referred to as multiobjective genetic QSAR (MoQSAR), is based on multiobjective GP and represents a new way of thinking about QSAR. Specifically, QSAR is considered as a multiobjective optimization problem that comprises a number of competing objectives. Typical objectives include model fitting, the total number of terms, and the occurrence of nonlinear terms. MoQSAR results in a family of equivalent QSAR models in which each QSAR represents a different tradeoff among the objectives. A practical consideration often overlooked in QSAR studies is the need for the model to promote an understanding of the biochemical response under investigation. To accomplish this, chemically intuitive descriptors are needed but do not always give rise to statistically robust models. This problem is addressed by the addition of a further objective, called chemical desirability, that aims to reward models consisting of descriptors that are easily interpretable by chemists. GPQSAR and MoQSAR have been tested on various data sets including the Selwood data set and two different solubility data sets. The study demonstrates that the MoQSAR method is able to find models that are at least as good as models derived using standard statistical approaches and also yields models that allow a medicinal chemist to trade statistical robustness for chemical
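
    The penalty-function idea behind GPQSAR can be illustrated with a toy model selector that scores each descriptor subset by fit quality minus a complexity penalty. The data, penalty weight, and exhaustive enumeration below are illustrative assumptions; the actual method uses genetic programming rather than enumeration:

```python
import numpy as np
from itertools import combinations

# Synthetic QSAR-style data: 40 compounds, 5 hypothetical descriptors,
# with only descriptors 0 and 2 actually driving the activity.
rng = np.random.default_rng(2)
X = rng.standard_normal((40, 5))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.1 * rng.standard_normal(40)

def penalized_score(cols, lam=0.05):
    """R^2 of a linear fit on the chosen descriptors, minus a penalty
    proportional to the number of descriptors (model complexity)."""
    A = np.column_stack([X[:, list(cols)], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return r2 - lam * len(cols)

subsets = (c for k in range(1, 6) for c in combinations(range(5), k))
best = max(subsets, key=penalized_score)
print(best)   # the two truly informative descriptors should be selected
```

    MoQSAR generalizes this single scalar tradeoff into separate competing objectives, including the chemical-desirability objective described above, so that a family of models along the tradeoff surface is returned instead of one.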

  18. Surface structure, model and mechanism of an insect integument adapted to be damaged easily

    Directory of Open Access Journals (Sweden)

    Bouillard Philippe

    2004-10-01

    Full Text Available Abstract Background Several sawfly larvae of the Tenthredinidae (Hymenoptera) are called easy bleeders because their whole body integument, except the head capsule, disrupts very easily at a given spot under slight mechanical stress at that spot. The exuding haemolymph droplet acts as a feeding deterrent towards invertebrate predators. The present study aimed to describe the cuticle surface, to consider it from a mechanistic point of view, and to discuss potential consequences of the integument surface in predator-prey relationships. Results The integument surface of sawfly larvae was investigated by light microscopy (LM) and scanning electron microscopy (SEM), which revealed that the cuticle of easy bleeders is densely covered by what we call "spider-like" microstructures. Such microstructures were not detected in non-easy bleeders. A finite-element model of the cuticle layer was developed to gain insight into the potential function of the microstructures during easy bleeding. Cuticle parameters (i.e., the size of the microstructures and the thickness of the epi- versus procuticle) were measured on integument sections and used in the model. A shear force applied on the modelled cuticle surface led to higher stress values when microstructures were present, as compared to a plane surface. Furthermore, by measuring the diameter of a water droplet deposited on sawfly larvae, the integument of several sawfly species was determined to be hydrophobic (e.g., more so than Teflon®), which was related to the sawfly larvae's ability to bleed easily. Conclusion Easy bleeders show spider-like microstructures on their cuticle surface. It is suggested that these microstructures may facilitate integument disruption as well as render the integument hydrophobic. This latter property would allow the exuding haemolymph to be maintained as a droplet at the integument surface.

  19. Easily processable multimodal spectral converters based on metal oxide/organic-inorganic hybrid nanocomposites.

  20. Easily processable multimodal spectral converters based on metal oxide/organic—inorganic hybrid nanocomposites

    Science.gov (United States)

    Julián-López, Beatriz; Gonell, Francisco; Lima, Patricia P.; Freitas, Vânia T.; André, Paulo S.; Carlos, Luis D.; Ferreira, Rute A. S.

    2015-10-01

This manuscript reports the synthesis and characterization of the first organic-inorganic hybrid material exhibiting efficient multimodal spectral converting properties. The nanocomposite, made of Er3+, Yb3+ codoped zirconia nanoparticles (NPs) entrapped in a di-ureasil d-U(600) hybrid matrix, is prepared by an easy two-step sol-gel synthesis leading to homogeneous and transparent materials that can be very easily processed as monoliths or films. Extensive structural characterization reveals that zirconia nanocrystals 10-20 nm in size are efficiently dispersed in the hybrid matrix and that the local structure of the di-ureasil is not affected by the presence of the NPs. A significant enhancement in the refractive index of the di-ureasil matrix with the incorporation of the ZrO2 nanocrystals is observed. The optical study demonstrates that the luminescent properties of both constituents are perfectly preserved in the final hybrid. Thus, the material displays white-light photoluminescence from the di-ureasil component upon excitation with UV/visible radiation, as well as intense green and red emissions from the Er3+- and Yb3+-doped NPs after NIR excitation. The dynamics of the optical processes were also studied as a function of the lanthanide content and the thickness of the films. Our results indicate that these luminescent hybrids represent a low-cost, environmentally friendly, size-controlled, easily processed and chemically stable alternative material for use in light-harvesting devices such as luminescent solar concentrators, optical fibres and sensors. Furthermore, this synthetic approach can be extended to a wide variety of luminescent NPs entrapped in hybrid matrices, thus leading to multifunctional and versatile materials for efficient tuneable nonlinear optical nanodevices.

  1. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  2. Elementary Statistics Tables

    CERN Document Server

    Neave, Henry R

    2012-01-01

This book, designed for students taking a basic introductory course in statistical analysis, is far more than just a book of tables. Each table is accompanied by a careful but concise explanation and useful worked examples. Requiring little mathematical background, Elementary Statistics Tables is thus not just a reference book but a positive and user-friendly teaching and learning aid. The new edition contains a new and comprehensive "teach-yourself" section on a simple but powerful approach, now well known in parts of industry but less so in academia, to analysing and interpreting process data.

  3. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative evidence.

  4. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and

  5. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that researchers use to evaluate hypotheses and make decisions to accept or reject them. In this paper, the various statistical tools used in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST) and the advantages of using effect sizes and their respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also facilitate data interpretation and the detection of trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and to ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, attention must be paid to the teaching of statistics at the graduate level: students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.
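To make the effect-size recommendation concrete, here is a minimal sketch (not from the article; the group summaries are invented for illustration) of computing Cohen's d for two independent groups together with an approximate 95% confidence interval based on the large-sample standard error of d:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two independent groups, using the pooled SD."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def d_confidence_interval(d, n1, n2, z=1.96):
    """Approximate 95% CI for d via its large-sample standard error."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d - z * se, d + z * se

# Hypothetical group summaries: mean, SD, n for each group
d = cohens_d(105.0, 15.0, 40, 100.0, 15.0, 40)
lo, hi = d_confidence_interval(d, 40, 40)
print(round(d, 3), round(lo, 3), round(hi, 3))  # prints: 0.333 -0.108 0.775
```

A d of about 0.33 whose interval spans zero illustrates exactly the point made above: the effect may be practically relevant, yet the sample is too small to rule out a null effect, which no bare p value would convey.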

  6. Search Databases and Statistics

    DEFF Research Database (Denmark)

    Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J

    2016-01-01

the vast amounts of raw data. This task is tackled by computational tools implementing algorithms that match the experimental data to databases, providing the user with lists for downstream analysis. Several platforms for such automated interpretation of mass spectrometric data have been developed, each having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affects the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
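As one concrete example of such statistical filtering (the target-decoy strategy is a standard approach in the field, though not necessarily this chapter's exact procedure; the scores below are invented), peptide-spectrum matches can be thresholded at a chosen false discovery rate by comparing target and decoy hits:

```python
def fdr_filter(psms, max_fdr=0.01):
    """Accept the largest high-scoring set of peptide-spectrum matches
    whose estimated FDR (decoy hits / target hits) stays within max_fdr.

    psms: iterable of (score, is_decoy) pairs from a target-decoy search.
    Returns the scores of the accepted target PSMs, best first.
    """
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    targets = decoys = 0
    best_cut = -1  # index of the deepest prefix satisfying the FDR bound
    for i, (_, is_decoy) in enumerate(ranked):
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets <= max_fdr:
            best_cut = i
    return [score for score, is_decoy in ranked[:best_cut + 1] if not is_decoy]

# Invented example: 50 confident target hits, decoys only at low scores
psms = [(100.0 - i, False) for i in range(50)] + \
       [(40.0 - i, True) for i in range(50)]
print(len(fdr_filter(psms, max_fdr=0.01)))  # prints: 50
```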

  7. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  8. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced in this second edition.

  9. Histoplasmosis Statistics

    Science.gov (United States)

How common is histoplasmosis? In the United States, an estimated 60% to ...

  10. Statistical distributions

    CERN Document Server

    Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.

    2010-01-01

    A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

  11. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
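The scale invariance mentioned above can be made explicit with a short standard calculation (a textbook identity, not quoted from the paper). A harmonic Poisson process over the positive half-line has intensity and mean count

```latex
\lambda(x) = \frac{c}{x}, \qquad x \in (0,\infty),\; c > 0,
\qquad
\Lambda[a,b] = \int_a^b \frac{c}{x}\,dx = c \ln\frac{b}{a} = \Lambda[sa, sb] \quad (s > 0),
```

so the count statistics depend only on the ratio b/a and are invariant under the rescaling x -> sx. The same integral diverges at both ends of the half-line, which is why harmonic statistics, in their full scope, cannot be packaged as a single normalizable statistical law.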

  12. GoCxx: a tool to easily leverage C++ legacy code for multicore-friendly Go libraries and frameworks

    Science.gov (United States)

    Binet, Sébastien

    2012-12-01

Current HENP libraries and frameworks were written before multicore systems became widely deployed and used. From this environment, a ‘single-thread’ processing model naturally emerged, but the implicit assumptions it encouraged are greatly impairing our abilities to scale in a multicore/manycore world. Writing scalable code in C++ for multicore architectures, while doable, is no panacea. Sure, C++11 will improve on the current situation (by standardizing on std::thread, introducing lambda functions and defining a memory model) but it will do so at the price of complicating further an already quite sophisticated language. This level of sophistication has probably already strongly motivated analysis groups to migrate to CPython, hoping for its current limitations with respect to multicore scalability to be either lifted (Global Interpreter Lock removal) or for the advent of a new Python VM better tailored for this kind of environment (PyPy, Jython, …). Could HENP migrate to a language with none of the deficiencies of C++ (build time, deployment, low-level tools for concurrency) and with the fast turn-around time, simplicity and ease of coding of Python? This paper will try to make the case for Go - a young open source language with built-in facilities to easily express and expose concurrency - being such a language. We introduce GoCxx, a tool leveraging gcc-xml's output to automate the tedious work of creating Go wrappers for foreign languages, a critical task for any language wishing to leverage legacy and field-tested code. We will conclude with the first results of applying GoCxx to real C++ code.

  13. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  14. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  15. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  16. Practical Statistics

    CERN Document Server

    Lyons, L

    2016-01-01

Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  17. On Prerequisites of Interpreters

    Institute of Scientific and Technical Information of China (English)

    范文

    2006-01-01

Interpreters invariably play a crucial role in international affairs. Those who regularly read pictorials or watch TV news programs know best why interpreters are always placed between two leaders: interpreters are indispensable whenever two VIPs aim to achieve further understanding, eliminate bilateral distrust or establish an alliance with each other, a fact that may partly account for why so many students are swarming into translation schools. But are they able to become interpreters? What are the prerequisites for an interpreter? This article, taking into account as many operative factors as possible, provides a basic framework within which the prerequisites of interpreters are structured.

  18. Popper's experiment, Copenhagen Interpretation and Nonlocality

    CERN Document Server

    Qureshi, T

    2003-01-01

A thought experiment, proposed by Karl Popper, which has been experimentally realized recently, is critically examined. A basic flaw in Popper's argument, which has also persisted in subsequent debates, is pointed out. It is shown that Popper's experiment can be understood easily within the Copenhagen interpretation of quantum mechanics. An alternate experiment, based on discrete variables, is proposed, which constitutes Popper's test in a clearer way. It refutes the argument that nonlocality is absent in quantum mechanics.

  19. [Sixty years ago, cell cultures finally permitted the poliomyelitis virus to multiply easily].

    Science.gov (United States)

    Chastel, Claude

    2009-01-01

In 1949, three American virologists, John F. Enders, Thomas H. Weller and Frederick C. Robbins, from the Harvard Medical School and working at the Children's Medical Center, Boston, Mass., provoked a true revolution in virology: they succeeded in readily multiplying the three poliomyelitis viruses in vitro, in non-nervous cell cultures. A few years afterwards (1954), they were collectively honoured with the Nobel Prize in Physiology or Medicine. This discovery not only quickly led to the production of efficient poliomyelitis vaccines (J. E. Salk, 1953; A. B. Sabin, 1955) but also made it possible to easily isolate a number of already known viruses (measles, rubella, mumps, herpes simplex and herpes zoster) as well as viruses until then totally unknown (adenovirus, echovirus, cytomegalovirus). These advances have significantly contributed to improving the diagnosis, sanitary surveillance and vaccine prophylaxis of human and animal viral diseases. Moreover, cell culture techniques have also benefited other domains of fundamental biology, such as cell biology, genetics, cancerology, biology of reproduction and regenerative medicine.

  20. Estimating subsoil resistance to nitrate leaching from easily measurable pedological properties

    Directory of Open Access Journals (Sweden)

    Fábio Keiti Nakagawa

    2012-11-01

Leaching of nitrate (NO3-) can increase the groundwater concentration of this anion and reduce the agronomic effectiveness of nitrogen fertilizers. The main soil property inversely related to NO3- leaching is the anion exchange capacity (AEC), whose determination is, however, too time-consuming to be carried out in soil testing laboratories. For this reason, this study evaluated whether more easily measurable soil properties could be used to estimate the resistance of subsoils to NO3- leaching. Samples from the subsurface layer (20-40 cm) of 24 representative soils of São Paulo State were characterized for particle-size distribution and for chemical and electrochemical properties. The subsoil content of adsorbed NO3- was calculated from the difference between the NO3- contents extracted with 1 mol L-1 KCl and with water; furthermore, NO3- leaching was studied in miscible displacement experiments. The results of both the adsorption and leaching experiments were consistent with the well-known role exerted by AEC on nitrate behavior in weathered soils. Multiple regression analysis indicated that in subsoils with (i) low values of remaining phosphorus (Prem), (ii) low soil pH measured in water (pH H2O), and (iii) high pH measured in 1 mol L-1 KCl (pH KCl), the amounts of surface positive charge tend to be greater. For this reason, NO3- leaching tends to be slower in these subsoils, even under saturated flow conditions.
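The multiple-regression idea above can be sketched as follows. All numbers here are invented for illustration (they are not the study's data); the sketch only shows the expected signs of the fitted coefficients: AEC falling with Prem and pH H2O, rising with pH KCl.

```python
import numpy as np

# Hypothetical subsoil measurements (illustrative only, not from the study):
# columns: remaining P (mg L-1), pH in water, pH in 1 mol L-1 KCl
props = np.array([
    [45.0, 5.8, 4.9],
    [30.0, 5.2, 5.0],
    [20.0, 4.9, 5.1],
    [12.0, 4.6, 5.3],
    [ 8.0, 4.4, 5.5],
])
# Hypothetical anion exchange capacity (mmolc kg-1) for each sample
aec = np.array([2.15, 4.30, 5.65, 6.85, 7.55])

# Ordinary least squares fit: AEC ~ b0 + b1*Prem + b2*pH_H2O + b3*pH_KCl
design = np.column_stack([np.ones(len(aec)), props])
coef, *_ = np.linalg.lstsq(design, aec, rcond=None)
b0, b1, b2, b3 = coef

# Expected signs: b1 < 0, b2 < 0, b3 > 0 (more surface positive charge,
# hence higher AEC and slower NO3- leaching)
print(b1 < 0, b2 < 0, b3 > 0)  # prints: True True True
```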

  1. Easily recycled Bi2O3 photocatalyst coatings prepared via ball milling followed by calcination

    Science.gov (United States)

    Cheng, Lijun; Hu, Xumin; Hao, Liang

    2017-06-01

Bi2O3 photocatalyst coatings derived from Bi coatings were first prepared by a two-step method, namely ball milling followed by a calcination process. The as-prepared samples were characterized by XRD, SEM, XPS and UV-Vis spectroscopy. The results showed that monoclinic Bi2O3 coatings were obtained after sintering Bi coatings at 673 or 773 K, while mixed-phase monoclinic and triclinic Bi2O3 coatings were obtained at 873 or 973 K. The topographies of the samples were observably different, varying from flower-like, irregular and polygonal to nanosized particles with increasing calcination temperature. Photodegradation of malachite green under simulated solar irradiation for 180 min showed that the largest degradation efficiency, 86.2%, was achieved over the Bi2O3 photocatalyst coatings sintered at 873 K. The Bi2O3 photocatalyst coatings, encapsulated with Al2O3 balls with an average diameter of around 1 mm, are quite easily recycled, which provides an alternative visible-light-driven photocatalyst suitable for practical water treatment applications.

  2. Open window: when easily identifiable genomes and traits are in the public domain.

    Directory of Open Access Journals (Sweden)

    Misha Angrist

"One can't be of an enquiring and experimental nature, and still be very sensible." --Charles Fort. As the costs of personal genetic testing ("self-quantification") fall, publicly accessible databases housing people's genotypic and phenotypic information are gradually increasing in number and scope. The latest entrant is openSNP, which allows participants to upload their personal genetic/genomic and self-reported phenotypic data. I believe the emergence of such open repositories of human biological data is a natural reflection of inquisitive and digitally literate people's desire to make genomic and phenotypic information more easily available to a community beyond the research establishment. Such unfettered databases hold the promise of contributing mightily to science, science education and medicine. That said, in an age of increasingly widespread governmental and corporate surveillance, we would do well to be mindful that genomic DNA is uniquely identifying. Participants in open biological databases are engaged in a real-time experiment whose outcome is unknown.

  3. An easily-achieved time-domain beamformer for ultrafast ultrasound imaging based on compressive sensing.

    Science.gov (United States)

    Wang, Congzhi; Peng, Xi; Liang, Dong; Xiao, Yang; Qiu, Weibao; Qian, Ming; Zheng, Hairong

    2015-01-01

In ultrafast ultrasound imaging, how to maintain a high frame rate while improving image quality as far as possible has become a significant issue. Several novel beamforming methods based on compressive sensing (CS) theory have been proposed in previous literature, but all have their own limitations, such as excessively large memory consumption and the errors caused by the short-time discrete Fourier transform (STDFT). In this study, a novel CS-based time-domain beamformer for plane-wave ultrasound imaging is proposed, and its image quality has been verified to be better than that of the traditional delay-and-sum (DAS) method and even the popular coherent compounding method on several simulated phantoms. Compared to existing CS methods, the memory consumption of our method is significantly reduced, since the encoding matrix can be sparsely expressed. In addition, the time delays of the echo signals are calculated directly in the time domain with a dictionary concept, avoiding the errors induced by the short-time Fourier transform calculation in frequency-domain methods. The proposed method can be easily implemented on low-cost hardware platforms and can obtain ultrasound images with both a high frame rate and good image quality, which gives it great potential for clinical application.
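For contrast with the CS approach, the conventional delay-and-sum baseline mentioned above can be sketched in a few lines (a generic plane-wave DAS at normal incidence with invented array parameters; it is not the authors' beamformer):

```python
import numpy as np

def das_beamform(rf, elem_x, fs, c, pixels):
    """Minimal delay-and-sum beamformer for a single plane-wave transmit
    at normal incidence.

    rf     : (n_elements, n_samples) array of received RF data
    elem_x : (n_elements,) lateral element positions [m]
    fs     : sampling rate [Hz];  c : speed of sound [m/s]
    pixels : iterable of (x, z) image points [m]
    """
    n_elem, n_samp = rf.shape
    out = []
    for x, z in pixels:
        # transmit: the plane wave reaches depth z after z/c;
        # receive: the echo travels sqrt(z^2 + (x - x_e)^2) back to element e
        delays = (z + np.sqrt(z**2 + (x - elem_x) ** 2)) / c
        idx = np.clip(np.round(delays * fs).astype(int), 0, n_samp - 1)
        out.append(rf[np.arange(n_elem), idx].sum())
    return np.asarray(out)
```

Coherent compounding simply sums such DAS images over several steered plane-wave transmits; a CS formulation instead solves an inverse problem built from the same time-domain delays.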

  4. Shaft seals with an easily removable cylinder holder for low-pressure steam turbines

    Science.gov (United States)

    Zakharov, A. E.; Rodionov, D. A.; Pimenov, E. V.; Sobolev, A. S.

    2016-01-01

The article is devoted to the problems that occur during the operation of LPC shaft seals (SS) of turbines, particularly their bearings. The problems arising from the deterioration of the oil-protecting rings of SS and bearings, and the consequences they can lead to, are considered. The existing types of SS housing construction are considered and their operational features specified. A new SS construction type with an easily removable holder is presented, and the construction of its main elements is described. The sequence of operations for repair personnel during restoration of the spacings of the new SS type is proposed. A comparative analysis of the new and existing SS construction types is carried out. The assessment results for efficiency, operational convenience, and the economic effect after installation of the new type of seals are given. Conclusions about the prospects of the proposed construction are drawn from the comparative analysis and the assessment. The main advantage of this design is the possibility of restoring spacings both in the SS and in the oil-protecting rings during a short-term stop of a turbine, even without its cooling. This construction was successfully tested on a working K-300-23.5 LMP turbine, and its adaptation for other turbines is quite possible.

  5. Direct PCR: Alternative Diagnostic Method for Diagnosis of Diphtheria Rapidly, Easily and Cost Effective

    Directory of Open Access Journals (Sweden)

    Sunarno Sunarno

    2013-12-01

Some diseases, such as diphtheria, require immediate and appropriate treatment to decrease the risk of patient fatality. Time is crucial in helping patients, since a delay in therapy may increase mortality by up to 20 times. On the other hand, conventional diagnostic methods (the gold standard for diagnosis of diphtheria) are time-consuming and laborious. Therefore, an alternative diagnostic method that is rapid, easy and inexpensive is needed. In this regard, direct PCR has been shown to reduce the time and cost of laboratory examination. This study aimed to develop direct PCR as an alternative method for diagnosing diphtheria rapidly, easily and inexpensively. Fifteen samples, including 10 isolates of Corynebacterium diphtheriae (toxigenic), 3 isolates of Corynebacterium non-diphtheriae (nontoxigenic) and 2 clinical specimens (throat swabs), were examined by direct PCR, and a standard PCR method was used for optimizing the protocols. Results showed that direct PCR amplified the target genes as correctly as standard PCR. All C. diphtheriae samples showed bands at 168 bp (dtxR gene marker) and 551 bp (tox gene marker), while no band appeared in the others. Direct PCR detected at least 71 CFU/uL of bacterial cells in samples. We conclude that direct PCR can be used as an alternative diagnostic method for diphtheria that is rapid, easy and cost-effective.

  6. Easily separated silver nanoparticle-decorated magnetic graphene oxide: Synthesis and high antibacterial activity.

    Science.gov (United States)

    Zhang, Huai-Zhi; Zhang, Chang; Zeng, Guang-Ming; Gong, Ji-Lai; Ou, Xiao-Ming; Huan, Shuang-Yan

    2016-06-01

Silver nanoparticle-decorated magnetic graphene oxide (MGO-Ag) was synthesized by doping silver and Fe3O4 nanoparticles on the surface of GO and was used as an antibacterial agent. MGO-Ag was characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), X-ray diffraction (XRD), Raman spectroscopy and magnetic property tests. It was found that the magnetic iron oxide nanoparticles and nano-Ag were well dispersed on the graphene oxide, and MGO-Ag exhibited excellent antibacterial activity against Escherichia coli and Staphylococcus aureus. Several factors affecting the antibacterial effect of MGO-Ag were investigated, such as temperature, time, pH and bacterial concentration. We also found that MGO-Ag maintained high inactivation rates after six cycles of use and can be separated easily after the antibacterial process. Moreover, the antibacterial mechanism is discussed; the synergistic effect of GO, Fe3O4 nanoparticles and nano-Ag accounts for the high inactivation achieved by MGO-Ag.

  7. Easily regenerable solid adsorbents based on polyamines for carbon dioxide capture from the air.

    Science.gov (United States)

    Goeppert, Alain; Zhang, Hang; Czaun, Miklos; May, Robert B; Prakash, G K Surya; Olah, George A; Narayanan, S R

    2014-05-01

    Adsorbents prepared easily by impregnation of fumed silica with polyethylenimine (PEI) are promising candidates for the capture of CO2 directly from the air. These inexpensive adsorbents have high CO2 adsorption capacity at ambient temperature and can be regenerated in repeated cycles under mild conditions. Despite the very low CO2 concentration, they are able to scrub efficiently all CO2 out of the air in the initial hours of the experiments. The influence of parameters such as PEI loading, adsorption and desorption temperature, particle size, and PEI molecular weight on the adsorption behavior were investigated. The mild regeneration temperatures required could allow the use of waste heat available in many industrial processes as well as solar heat. CO2 adsorption from the air has a number of applications. Removal of CO2 from a closed environment, such as a submarine or space vehicles, is essential for life support. The supply of CO2-free air is also critical for alkaline fuel cells and batteries. Direct air capture of CO2 could also help mitigate the rising concerns about atmospheric CO2 concentration and associated climatic changes, while, at the same time, provide the first step for an anthropogenic carbon cycle.

  8. A comprehensive review on removal of arsenic using activated carbon prepared from easily available waste materials.

    Science.gov (United States)

    Mondal, Monoj Kumar; Garg, Ravi

    2017-05-01

Arsenic contamination of water bodies is a serious problem and causes various health problems, which is why the US Environmental Protection Agency (USEPA) set its maximum permissible limit at 10 ppb. This review article covers the removal of toxic arsenic using adsorbents prepared from easily available waste materials. Either commercial or low-cost adsorbents can be used for arsenic removal, but recent research has focused on low-cost adsorbents. The preparation and activation of various adsorbents are discussed. Adsorption capacities, surface areas, and thermodynamic and kinetic data of various adsorbents for As(III) and As(V) removal are compiled. Desorption followed by regeneration and reuse of adsorbents is an important step in adsorption and leads to an economical process. Various desorbing and regenerating agents for arsenic decontamination of the adsorbent surface are discussed; strong acids, bases, and salts are the main desorbing agents. Disposal of arsenic-contaminated adsorbents and arsenic waste is also a significant problem because of the toxicity and leaching behavior of arsenic, so arsenic waste is disposed of by a proper stabilization/solidification (S/S) technique, mixing it into Portland cement, iron, ash, etc. to reduce leaching.

  9. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2005-01-01

In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writing…

  10. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2010-01-01

    In this revised 3rd edition, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this

  11. Statistical physics

    CERN Document Server

    Wannier, Gregory H

    2010-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  12. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  13. On court interpreters' visibility

    DEFF Research Database (Denmark)

    Dubslaff, Friedel; Martinsen, Bodil

    This paper is part of the initial stage of a larger empirical research project on court interpreting seen as a complex interaction between (at least) three co-participants. The empirical material consists of recordings of interpreted interrogations in court room settings and questionnaires filled...... of the service they receive. Ultimately, the findings will be used for training purposes. Future - and, for that matter, already practising - interpreters as well as the professional users of interpreters ought to take the reality of the interpreters' work in practice into account when assessing the quality...... of the service rendered/received. The paper presents a small-scale case study based on an interpreted witness interrogation. Recent research on the interpreter's role has shown that interpreters across all settings perceive themselves as "visible" (Angelelli 2003, 2004). This has led us to focus...

  14. An easily made, low-cost phantom for ultrasound airway exam training and assessment

    Directory of Open Access Journals (Sweden)

    Kristopher M Schroeder

    2013-01-01

    Full Text Available Background: Recent manuscripts have described the use of ultrasound imaging to evaluate airway structures. Ultrasound training tools are necessary for practitioners to become proficient at obtaining and interpreting images. Few training tools exist, and those that do can often be expensive and rendered useless by repeated needle passes. Methods: We utilised inexpensive and easy-to-obtain materials to create a gel phantom model for ultrasound-guided airway examination training. Results: Following creation of the gel phantom model, images were successfully obtained of the thyroid and cricoid cartilages, cricothyroid membrane and tracheal rings in both the sagittal and transverse planes. Conclusion: The gel phantom model mimics human airway anatomy and may be used for ultrasound-guided airway assessment and intervention training. This may have important safety implications as ultrasound imaging is increasingly used for airway assessment.

  15. SEER Statistics

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  16. Cancer Statistics

    Science.gov (United States)


  17. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  18. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  19. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
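
    The idea of quantifying the information in an image can be sketched with simple per-image summary statistics. The following is an illustrative sketch (not from the report; the particular statistics chosen here are assumptions): mean, variance, and Shannon entropy of the gray-level histogram, with entropy as a rough proxy for how much there is to see.

```python
# A hedged sketch of per-image summary statistics for ranking images.
# The choice of statistics (mean, variance, histogram entropy) is an
# assumption, not the method used in the LANL report.
import math
from collections import Counter

def image_stats(pixels):
    """Return (mean, variance, entropy) for a flat list of gray levels."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    counts = Counter(pixels)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return mean, var, entropy

# Two toy "images": a flat image carries no information, a varied one does.
flat = [128] * 64
varied = list(range(64))
assert image_stats(flat)[2] == 0.0    # zero entropy: nothing to see
assert image_stats(varied)[2] == 6.0  # 64 equally likely levels -> 6 bits
```

    Images could then be sorted by any of these statistics and the extremes inspected first.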

  20. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  1. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  2. Engineering Definitional Interpreters

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford

    2013-01-01

    A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...

  4. Genre and Interpretation

    DEFF Research Database (Denmark)

    2015-01-01

    Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of the intricacies of that relationship through an analysis of the generic interpretation carrie...

  5. OntologyWidget – a reusable, embeddable widget for easily locating ontology terms

    Directory of Open Access Journals (Sweden)

    Skene JH Pate

    2007-09-01

    Full Text Available Abstract Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format.
Conclusion We have developed Ontology

  6. The Sclerotic Scatter Limbal Arc Is More Easily Elicited under Mesopic Rather Than Photopic Conditions.

    Directory of Open Access Journals (Sweden)

    Eric Denion

    Full Text Available We aimed to determine the limbal lighting illuminance thresholds (LLITs) required to trigger perception of sclerotic scatter at the opposite non-illuminated limbus (i.e. perception of a light limbal scleral arc) under different levels of ambient lighting illuminance (ALI). Twenty healthy volunteers were enrolled. The iris shade (light or dark) was graded by retrieving the median value of the pixels of a pre-determined zone of a gray-level iris photograph. Mean keratometry and central corneal pachymetry were recorded. Each subject was asked to lie down, and the ALI at eye level was set to mesopic values (10, 20, 40 lux), then photopic values (60, 80, 100, 150, 200 lux). For each ALI level, a light beam of gradually increasing illuminance was applied to the right temporal limbus until the LLIT was reached, i.e. the level required to produce the faint light arc that is characteristic of sclerotic scatter at the nasal limbus. After log-log transformation, a linear relationship between the logarithm of ALI and the logarithm of the LLIT was found (p<0.001), a 10% increase in ALI being associated with an average increase in the LLIT of 28.9%. Higher keratometry values were associated with higher LLIT values (p = 0.008) under low ALI levels, but the coefficient of the interaction was very small, representing a very limited effect. Iris shade and central corneal thickness values were not significantly associated with the LLIT. We also developed a censored linear model for ALI values ≤ 40 lux, showing a linear relationship between ALI and the LLIT, in which the LLIT value was 34.4 times greater than the ALI value. Sclerotic scatter is more easily elicited under mesopic conditions than under photopic conditions and requires the LLIT value to be much higher than the ALI value, i.e. it requires extreme contrast.
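
    The two fitted relationships reported above can be turned into a small numerical sketch. Only the constants (a 28.9% LLIT increase per 10% ALI increase, and the 34.4:1 LLIT-to-ALI ratio for ALI ≤ 40 lux) come from the abstract; the function names and the reference-point form of the prediction are illustrative assumptions.

```python
# Hedged sketch of the two models in the abstract; constants are from the
# study, everything else (names, reference-point form) is illustrative.
import math

# Log-log model: a 10% rise in ALI gives a 28.9% rise in LLIT, so the
# slope on the log-log scale is log(1.289)/log(1.1) (about 2.66).
slope = math.log(1.289) / math.log(1.1)

def llit_from_reference(ali, ali_ref, llit_ref):
    """Predict the LLIT at `ali` from one measured (ali_ref, llit_ref) pair."""
    return llit_ref * (ali / ali_ref) ** slope

# Censored linear model for mesopic levels (ALI <= 40 lux): LLIT = 34.4 * ALI.
def llit_mesopic(ali):
    if ali > 40:
        raise ValueError("linear model was only fitted for ALI <= 40 lux")
    return 34.4 * ali

print(round(slope, 2))   # the LLIT grows much faster than the ALI
print(llit_mesopic(10))  # at 10 lux ambient, ~344 lux needed at the limbus
```

    The slope above 1 on the log-log scale is what the abstract means by "extreme contrast": the required limbal illuminance outpaces the ambient level.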

  7. Threshold concentration of easily assimilable organic carbon in feedwater for biofouling of spiral-wound membranes.

    Science.gov (United States)

    Hijnen, W A M; Biraud, D; Cornelissen, E R; van der Kooij, D

    2009-07-01

    One of the major impediments in the application of spiral-wound membranes in water treatment or desalination is clogging of the feed channel by biofouling which is induced by nutrients in the feedwater. Organic carbon is, under most conditions, limiting the microbial growth. The objective of this study is to assess the relationship between the concentration of an easily assimilable organic compound such as acetate in the feedwater and the pressure drop increase in the feed channel. For this purpose the membrane fouling simulator (MFS) was used as a model for the feed channel of a spiral-wound membrane. This MFS unit was supplied with drinking water enriched with acetate at concentrations ranging from 1 to 1000 microg C x L(-1). The pressure drop (PD) in the feed channel increased at all tested concentrations but not with the blank. The PD increase could be described by a first order process based on theoretical considerations concerning biofilm formation rate and porosity decline. The relationship between the first order fouling rate constant R(f) and the acetate concentration is described with a saturation function corresponding with the growth kinetics of bacteria. Under the applied conditions the maximum R(f) (0.555 d(-1)) was reached at 25 microg acetate-C x L(-1) and the half saturation constant k(f) was estimated at 15 microg acetate-C x L(-1). This value is higher than k(s) values for suspended bacteria grown on acetate, which is attributed to substrate limited growth conditions in the biofilm. The threshold concentration for biofouling of the feed channel is about 1 microg acetate-C x L(-1).
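
    The saturation kinetics described above can be sketched numerically. The constants R(f,max) = 0.555 d(-1) and k(f) = 15 microg acetate-C x L(-1) come from the abstract; the Monod-type form R_f = R_max * S / (k_f + S) is the standard growth-kinetics saturation function the abstract alludes to, but the exact fitted expression is an assumption here.

```python
# Hedged sketch of the fouling-rate saturation function. The two constants
# are from the study; the exact Monod-type form is an assumption.
R_MAX = 0.555  # d^-1, maximum first-order fouling rate constant
K_F = 15.0     # microg acetate-C per litre, half-saturation constant

def fouling_rate(acetate_ug_per_l):
    """Monod-type first-order fouling rate constant (d^-1)."""
    return R_MAX * acetate_ug_per_l / (K_F + acetate_ug_per_l)

# At the half-saturation concentration the rate is half of R_MAX, and
# near the ~1 microg/L threshold the predicted rate is already small.
assert abs(fouling_rate(K_F) - R_MAX / 2) < 1e-12
assert fouling_rate(1.0) < 0.04
```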

  8. Superomniphobic and easily repairable coatings on copper substrates based on simple immersion or spray processes.

    Science.gov (United States)

    Rangel, Thomaz C; Michels, Alexandre F; Horowitz, Flávio; Weibel, Daniel E

    2015-03-24

    Textures that resemble typical fern or bracken plant species (dendrite structures) were fabricated for liquid repellency by dipping copper substrates in a single-step process in solutions containing AgNO3 or by a simple spray liquid application. Superhydrophobic surfaces were produced using a solution containing AgNO3 and trimethoxypropylsilane (TMPSi), and superomniphobic surfaces were produced by a two-step procedure, immersing the copper substrate in a AgNO3 solution and, after that, in a solution containing 1H,1H,2H,2H-perfluorodecyltriethoxysilane (PFDTES). The simple functionalization processes can also be used when the superomniphobic surfaces are destroyed by mechanical stress. By immersion of the wrecked surfaces in the above solutions or by the spray method and soft heating, the copper substrates could easily be repaired, regenerating the surfaces' superrepellency to liquids. The micro- and nanoroughness structures generated on copper surfaces by the deposition of silver dendrites functionalized with TMPSi presented apparent contact angles greater than 150° with a contact angle hysteresis lower than 10° when water was used as the test liquid. To avoid total wettability with very low surface tension liquids, such as rapeseed oil and hexadecane, a thin perfluorinated coating of poly(tetrafluoroethylene) (PTFE), produced by physical vapor deposition, was used. A more efficient perfluorinated coating was obtained when PFDTES was used. The superomniphobic surfaces produced apparent contact angles above 150° with all of the tested liquids, including hexadecane, although the contact angle hysteresis with this liquid was above 10°. The coupling of dendritic structures with TMPSi/PTFE or directly with PFDTES coatings was responsible for the superrepellency of the as-prepared surfaces. These simple, fast, and reliable procedures allow the large-area, cost-effective fabrication of superrepellent surfaces on copper substrates for various industrial

  9. Easily-handled method to isolate mesenchymal stem cells from coagulated human bone marrow samples

    Institute of Scientific and Technical Information of China (English)

    Wang, Heng-Xiang; Li, Zhi-Yong; Guo, Zhi-Kun; Guo, Zi-Kuan

    2015-01-01

    AIM: To establish an easily-handled method to isolate mesenchymal stem cells (MSCs) from coagulated human bone marrow samples. METHODS: Thrombin was added to aliquots of seven heparinized human bone marrow samples to mimic marrow coagulation. The clots were untreated, treated with urokinase or mechanically cut into pieces before culture for MSCs. The un-coagulated samples and the clots were also stored at 4 ℃ for 8 or 16 h before the treatment. The numbers of colony-forming unit-fibroblast (CFU-F) in the different samples were determined. The adherent cells from different groups were passaged and their surface profile was analyzed with flow cytometry. Their capacities of in vitro osteogenesis and adipogenesis were observed after the cells were exposed to specific inductive agents. RESULTS: The average CFU-F number of urokinase-treated samples (16.85 ± 11.77/10(6)) was comparable to that of un-coagulated control samples (20.22 ± 10.65/10(6), P = 0.293), which was significantly higher than those of mechanically-cut clots (6.5 ± 5.32/10(6), P < 0.01) and untreated clots (1.95 ± 1.86/10(6), P < 0.01). The CFU-F numbers decreased after the samples were stored, but those of control and urokinase-treated clots remained higher than those of the other two groups. Consistently, the numbers of attached cells at passage 0 were higher in control and urokinase-treated clots than in mechanically-cut clots and untreated clots. The attached cells were fibroblast-like in morphology and homogeneously positive for CD44, CD73 and CD90, and negative for CD31 and CD45. Also, they could be induced to differentiate into osteoblasts and adipocytes in vitro. CONCLUSION: Urokinase pretreatment is an optimal strategy to isolate MSCs from human bone marrow samples that are poorly aspirated and clotted.

  10. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  11. Topics in statistical data analysis for high-energy physics

    CERN Document Server

    Cowan, G

    2013-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of High Energy Physics (HEP) data: Bayesian statistics and multivariate methods. In the Bayesian approach we extend the interpretation of probability to cover not only the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in HEP in recent years: the boosted decision tree (BDT).
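
    The degree-of-belief idea mentioned above can be made concrete with a toy calculation (mine, not from the lectures): attach a prior probability directly to a hypothesis and update it with data via Bayes' theorem.

```python
# Toy illustration of degree-of-belief probability: the posterior P(H|D)
# for a binary hypothesis H versus its alternative, via Bayes' theorem.
def posterior(prior, like_h, like_alt):
    """P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|not H)P(not H)]."""
    evidence = like_h * prior + like_alt * (1.0 - prior)
    return like_h * prior / evidence

# Example: the observed data are four times more probable under a signal
# hypothesis than under background; start from an even prior.
p = posterior(prior=0.5, like_h=0.8, like_alt=0.2)
assert abs(p - 0.8) < 1e-12  # posterior degree of belief in the signal
```

    This is exactly the kind of question (the probability *of* a hypothesis) that frequentist methods cannot address directly.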

  12. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research present

  13. Mathematical Language / Scientific Interpretation / Theological Interpretation

    Directory of Open Access Journals (Sweden)

    Bodea Marcel Smilihon

    2015-05-01

    Full Text Available The specific languages referred to in this presentation are: scientific language, mathematical language, theological language and philosophical language. Cosmological, scientific or theological models understood as distinct interpretations of a common symbolic language do not ensure, by such a common basis, a possible or legitimate correspondence of certain units of meaning. Mathematics understood as a symbolic language used in scientific and theological interpretation does not bridge science and theology. Instead, it only allows the assertion of a rational-mathematical unity in expression. In this perspective, theology is nothing less rational than science. The activity of interpretation has an interdisciplinary character; it is a necessary condition of dialogue. We cannot speak about dialogue without communication between various fields, without passing from one specialized language to another specialized language. The present paper proposes to bring out this aspect.

  14. Fluorescence lifetimes: fundamentals and interpretations.

    Science.gov (United States)

    Noomnarm, Ulai; Clegg, Robert M

    2009-01-01

    Fluorescence measurements have been an established mainstay of photosynthesis experiments for many decades. Because in the photosynthesis literature the basics of excited states and their fates are not usually described, we have presented here an easily understandable text for biology students in the style of a chapter in a text book. In this review we give an educational overview of fundamental physical principles of fluorescence, with emphasis on the temporal response of emission. Escape from the excited state of a molecule is a dynamic event, and the fluorescence emission is in direct kinetic competition with several other pathways of de-excitation. It is essentially through a kinetic competition between all the pathways of de-excitation that we gain information about the fluorescent sample on the molecular scale. A simple probability allegory is presented that illustrates the basic ideas that are important for understanding and interpreting most fluorescence experiments. We also briefly point out challenges that confront the experimenter when interpreting time-resolved fluorescence responses.

  15. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self-sufficient, written in a lucid manner, keeping in mind the exam system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually developed in a thorough manner. All three types of statistical distribution functions are derived separately with their periphery of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain the phase transition. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  16. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  17. Basic statistics in cell biology.

    Science.gov (United States)

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.

  18. A Note on a Geometric Interpretation of the Correlation Coefficient.

    Science.gov (United States)

    Marks, Edmond

    1982-01-01

    An alternate geometric interpretation of the correlation coefficient to that given in most statistics texts for psychology and education is presented. This interpretation is considered to be more consistent with the statistical model for the data, and richer in geometric meaning. (Author)
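
    For context, the common textbook interpretation that the note proposes an alternative to takes r as the cosine of the angle between the two mean-centered data vectors. A minimal sketch of that standard reading:

```python
# Standard geometric reading of the correlation coefficient:
# r = cos(theta) between the mean-centered data vectors.
import math

def pearson_r(xs, ys):
    """Pearson r computed as the cosine of the angle between centered vectors."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cx = [x - mx for x in xs]
    cy = [y - my for y in ys]
    dot = sum(a * b for a, b in zip(cx, cy))
    return dot / math.sqrt(sum(a * a for a in cx) * sum(b * b for b in cy))

assert abs(pearson_r([1, 2, 3], [2, 4, 6]) - 1.0) < 1e-12  # colinear: angle 0
assert abs(pearson_r([1, 2, 3], [3, 2, 1]) + 1.0) < 1e-12  # opposite: angle pi
```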

  19. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  20. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  1. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  2. Statistical methods

    CERN Document Server

    Freund, Rudolf J; Wilson, William J

    2010-01-01

    Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a

  3. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  4. Interpretation biases in paranoia.

    Science.gov (United States)

    Savulich, George; Freeman, Daniel; Shergill, Sukhi; Yiend, Jenny

    2015-01-01

    Information in the environment is frequently ambiguous in meaning. Emotional ambiguity, such as the stare of a stranger, or the scream of a child, encompasses possible good or bad emotional consequences. Those with elevated vulnerability to affective disorders tend to interpret such material more negatively than those without, a phenomenon known as "negative interpretation bias." In this study we examined the relationship between vulnerability to psychosis, measured by trait paranoia, and interpretation bias. One set of material permitted broadly positive/negative (valenced) interpretations, while another allowed more or less paranoid interpretations, allowing us to also investigate the content specificity of interpretation biases associated with paranoia. Regression analyses (n=70) revealed that trait paranoia, trait anxiety, and cognitive inflexibility predicted paranoid interpretation bias, whereas trait anxiety and cognitive inflexibility predicted negative interpretation bias. In a group comparison those with high levels of trait paranoia were negatively biased in their interpretations of ambiguous information relative to those with low trait paranoia, and this effect was most pronounced for material directly related to paranoid concerns. Together these data suggest that a negative interpretation bias occurs in those with elevated vulnerability to paranoia, and that this bias may be strongest for material matching paranoid beliefs. We conclude that content-specific biases may be important in the cause and maintenance of paranoid symptoms.

  5. Statistical Approach to Diffraction of Periodic and Non-Periodic Crystals—Review

    Directory of Open Access Journals (Sweden)

    Radoslaw Strzalka

    2016-08-01

    Full Text Available In this paper, we present the fundamentals of the statistical method of structure analysis. The basic concept of the method is the average unit cell, which is a probability distribution of atomic positions with respect to some reference lattices. The distribution carries the complete structural information required for structure determination via a diffraction experiment, regardless of the inner symmetry of the diffracting medium. The shape of the envelope function that connects all diffraction maxima can be derived as the Fourier transform of the distribution function. Moreover, distributions are sensitive to any disorder introduced into the ideal structure: phonons and phasons. The latter are particularly important in the case of quasicrystals. The statistical method deals very well with phason flips and may be used to redefine the phasonic Debye-Waller correction factor. The statistical approach can also be successfully applied to the interpretation of peak profiles. It will be shown that the average unit cell can be applied equally well to the description of Bragg peaks and of the other components of a diffraction pattern, namely the continuous and singular continuous components. Calculations performed within the statistical method are equivalent to those of multidimensional analysis. The atomic surface, also called the occupation domain, which is the basic concept behind multidimensional models, acquires a physical interpretation when compared to the average unit cell. The statistical method applied to diffraction analysis is now a complete theory, which deals equally well with periodic and non-periodic crystals, including quasicrystals, and easily accommodates any structural disorder.
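The claim that the envelope of the diffraction maxima is the Fourier transform of the distribution function can be illustrated numerically. In the sketch below, a 1D lattice with Gaussian displacement disorder (an illustrative choice, far simpler than the quasicrystal cases discussed in the record) shows the expected Debye-Waller-like damping of the Bragg peaks; lattice constant, disorder width, and the set of peaks are all assumptions for demonstration:

```python
# Sketch: for a 1D crystal with Gaussian displacement disorder, the envelope
# of the Bragg peak intensities is the Fourier transform of the displacement
# distribution, i.e. a Gaussian (Debye-Waller-like) damping factor.
# Lattice constant a, disorder width sigma, and peak count are illustrative.
import numpy as np

rng = np.random.default_rng(0)
a, sigma, n_atoms = 1.0, 0.05, 2000
x = a * np.arange(n_atoms) + rng.normal(0.0, sigma, n_atoms)  # disordered lattice

k = np.array([2 * np.pi * m / a for m in range(1, 6)])        # Bragg positions
S = np.array([np.abs(np.exp(1j * kk * x).sum())**2 for kk in k]) / n_atoms**2

# |FT of the Gaussian displacement pdf|^2 = exp(-k^2 sigma^2)
envelope = np.exp(-(k * sigma)**2)
print(S)         # simulated normalized peak intensities
print(envelope)  # predicted Debye-Waller-like envelope
```

The simulated peak intensities track the analytic envelope closely, decaying with increasing scattering vector exactly as the Fourier-transform argument predicts.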

  6. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    In conventional phylogeographic studies, historical demographic processes are elucidated from the geographical distribution of individuals represented on an inferred gene tree. However, the interpretation of gene trees in this context can be difficult as the same demographic/geographical process can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods.

  7. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the years 1997 and 1998, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  8. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the years 1997 and 1998, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  9. Statistical Mechanics

    CERN Document Server

    Gallavotti, Giovanni

    2011-01-01

    C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.

  10. The Geometry of Statistical Efficiency and Matrix Statistics

    Directory of Open Access Journals (Sweden)

    K. Gustafson

    2007-01-01

    We will place certain parts of the theory of statistical efficiency into the author's operator trigonometry (1967), thereby providing new geometrical understanding of statistical efficiency. Important earlier results of Bloomfield and Watson, Durbin and Kendall, and Rao and Rao will be so interpreted. For example, worst-case relative least squares efficiency corresponds to, and is achieved by, the maximal turning antieigenvectors of the covariance matrix. Some little-known historical perspectives will also be presented. The overall view will be emphasized.

  11. Preparation and Use of an Easily Constructed, Inexpensive Chamber for Viewing Courtship Behaviors of Fruit Flies, Drosophila sp.

    Science.gov (United States)

    Christensen, Timothy J.; Labov, Jay B.

    1997-01-01

    Details the construction of a viewing chamber for fruit flies that connects to a dissecting microscope and features a design that enables students to easily move fruit flies in and out of the chamber. (DDR)

  12. Statistical analysis with Excel for dummies

    CERN Document Server

    Schmuller, Joseph

    2013-01-01

    Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro

  13. A Novel Statistical Analysis and Interpretation of Flow Cytometry Data

    Science.gov (United States)

    2013-07-05

    death processes at the population level to the observed fluorescence intensity profiles as measured by a flow cytometer (Figures 1 and 2).

  14. [Blood proteins in African trypanosomiasis: variations and statistical interpretations].

    Science.gov (United States)

    Cailliez, M; Poupin, F; Pages, J P; Savel, J

    1982-01-01

    The estimation of blood orosomucoid, haptoglobin, C-reactive protein, and immunoglobulin levels has enabled us to establish a specific protein profile in human African trypanosomiasis, as compared with that of other parasitic diseases and with a healthy African reference group. Computerized data processing by principal component analysis provides a valuable tool for epidemiological surveys.
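The principal component analysis behind the profiling described above can be sketched in a few lines of NumPy. The protein-level matrix below is entirely hypothetical (rows = subjects, columns = orosomucoid, haptoglobin, C-reactive protein, immunoglobulins, in arbitrary units), used only to show the mechanics:

```python
# Minimal principal component analysis sketch. The data matrix is a
# hypothetical stand-in, not the study's measurements.
import numpy as np

# rows = subjects; columns = orosomucoid, haptoglobin, CRP, immunoglobulins
X = np.array([
    [1.2, 0.9, 0.1, 11.0],
    [2.8, 2.1, 0.9, 18.5],
    [1.0, 1.1, 0.2, 10.2],
    [3.1, 2.4, 1.1, 19.9],
    [1.4, 0.8, 0.1, 12.1],
])

Xc = X - X.mean(axis=0)                     # center each protein variable
cov = np.cov(Xc, rowvar=False)              # 4x4 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]           # sort components by variance
explained = eigvals[order] / eigvals.sum()  # variance fraction per component
scores = Xc @ eigvecs[:, order]             # subjects projected onto the PCs
print(explained)
```

When the protein levels co-vary strongly, as in this toy matrix, the first component captures most of the variance, which is what makes a low-dimensional "profile" useful for epidemiological comparison.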

  15. Interpretation of psychophysics response curves using statistical physics.

    Science.gov (United States)

    Knani, S; Khalfaoui, M; Hachicha, M A; Mathlouthi, M; Ben Lamine, A

    2014-05-15

    Experimental gustatory curves have been fitted for four sugars (sucrose, fructose, glucose and maltitol), using a double layer adsorption model. Three parameters of the model are fitted, namely the number of molecules per site n, the maximum response RM and the concentration at half saturation C1/2. The behaviours of these parameters are discussed in relationship to each molecule's characteristics. Starting from the double layer adsorption model, we determined (in addition) the adsorption energy of each molecule on taste receptor sites. The use of the threshold expression allowed us to gain information about the adsorption occupation rate of a receptor site which fires a minimal response at a gustatory nerve. Finally, by means of this model we could calculate the configurational entropy of the adsorption system, which can describe the order and disorder of the adsorbent surface.
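A three-parameter fit of the kind described above (n, RM, C1/2) can be sketched with scipy.optimize.curve_fit. Note the assumptions: the Hill-type saturation form and the synthetic data below are illustrative stand-ins, not the paper's double layer adsorption model or its gustatory measurements:

```python
# Illustrative fit of a saturation response curve with three parameters
# (n, RM, C_half). The Hill-type form and the "measurements" are assumptions
# for demonstration, not the double layer adsorption model of the record.
import numpy as np
from scipy.optimize import curve_fit

def response(C, n, RM, C_half):
    """Hill-type response: R -> RM as C -> infinity; R = RM/2 at C = C_half."""
    return RM * C**n / (C_half**n + C**n)

C = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # concentrations (arb. units)
noise = np.array([0.5, -0.8, 1.1, -0.4, 0.9, -1.2, 0.3])
R = response(C, 1.3, 100.0, 0.5) + noise               # synthetic noisy responses

(n_fit, RM_fit, C_half_fit), _ = curve_fit(response, C, R, p0=[1.0, 90.0, 1.0])
print(n_fit, RM_fit, C_half_fit)
```

The fitted maximum response and half-saturation concentration can then be compared across molecules, which is the kind of parameter-level interpretation the record describes.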

  16. Interpreting land records

    CERN Document Server

    Wilson, Donald A

    2014-01-01

    Base retracement on solid research and historically accurate interpretation Interpreting Land Records is the industry's most complete guide to researching and understanding the historical records germane to land surveying. Coverage includes boundary retracement and the primary considerations during new boundary establishment, as well as an introduction to historical records and guidance on effective research and interpretation. This new edition includes a new chapter titled "Researching Land Records," and advice on overcoming common research problems and insight into alternative resources wh

  17. Making Tree Ensembles Interpretable

    OpenAIRE

    Hara, Satoshi; Hayashi, Kohei

    2016-01-01

    Tree ensembles, such as random forest and boosted trees, are renowned for their high prediction performance, whereas their interpretability is critically limited. In this paper, we propose a post processing method that improves the model interpretability of tree ensembles. After learning a complex tree ensembles in a standard way, we approximate it by a simpler model that is interpretable for human. To obtain the simpler model, we derive the EM algorithm minimizing the KL divergence from the ...

  18. Interpretation of cell culture phenomena.

    Science.gov (United States)

    Vierck, J L; Dodson, M V

    2000-03-01

    This paper discusses the dilemma of interpreting unusual or abnormal phenomena seen in cell cultures and is not intended to address the statistical design of experiments. Problems that can be encountered when growing cells in experimental situations include low or decreasing cell numbers, abnormal cell morphology, microbial contamination, and detachment of the cell monolayer. If any of these situations occur, it is not realistic to proceed with data analysis until the problem is corrected. The best policy is to attempt to standardize all types of cultures used for analysis and to avoid using any cultures that display atypical characteristics.

  19. Using Photoshop to lay out web pages easily

    Institute of Scientific and Technical Information of China (English)

    张楼英

    2012-01-01

    Nowadays, the network has become an important part of daily life, giving rise to a new profession: web design. Some vocational schools have opened network engineering courses to meet this profession's demand for trained designers. In training web page designers, however, some teachers lack practical experience and are not familiar with Photoshop's ability to integrate web page elements in web design, so the software does not deliver its full value. In response to this situation, the role and significance of Photoshop in web design and planning, and the problems to note when using Photoshop for web planning, are discussed in detail, with particular attention to the slice function and its operation, offering useful guidance for beginners.

  20. Genre and Interpretation

    DEFF Research Database (Denmark)

    Auken, Sune

    2015-01-01

    Despite the immensity of genre studies as well as studies in interpretation, our understanding of the relationship between genre and interpretation is sketchy at best. The article attempts to unravel some of the intricacies of that relationship through an analysis of the generic interpretation carried out... traits of an utterance will lead to a characterization of its individual, as well as its general, characteristics. The article proceeds to describe three central concepts within genre studies that are applicable to generic interpretation: "horizon of expectation," "world," and the triad "theme-form-rhetoric."

  1. Leptospirosis in the Tropics: When Prevention Doesn't Easily Sell as a Ton of Cure

    Directory of Open Access Journals (Sweden)

    Roger Lee Mendoza

    2010-01-01

    Problem statement: Human leptospirosis -- the most widespread zoonotic disease -- thrives well in tropical and subtropical climates. It is seldom addressed ex ante, or prior to an outbreak or expected outbreak, by governments of high-risk countries. Whether common post-exposure treatment with antimicrobial drugs is more cost-efficient than reducing animal reservoir populations was the overarching question that guided this study. A related question was how to establish comparability of price or cost estimates for these two treatment methods. Approach: Annualized price samples of government-approved antimicrobial therapies, particularly antibiotics, and appropriate anti-rat/anti-rodent chemical agents (rodenticides) were gathered from three leptospirosis-endemic countries: Brazil, the Philippines and Sri Lanka. Certain price data were adjusted for present value based on a linear cost accounting function. Two-tailed hypothesis testing (α = 0.05) was performed to determine any statistically significant differences in pricing antimicrobial therapies and rodenticides in each country under investigation. Results: Shared environmental issues and socio-demographic characteristics of infected populations appear to support the need for ex ante containment of rat/rodent reservoir populations in high-risk tropical and subtropical countries. In each t-tested country, we found that t_crit > t_obs > -t_crit. Therefore, the null hypothesis, μ_antimicrobial = μ_rodenticides (μ_antimicrobial - μ_rodenticides = 0), could not be rejected in favor of the alternative hypothesis, μ_antimicrobial ≠ μ_rodenticides (μ_antimicrobial - μ_rodenticides ≠ 0). Conclusion: Applications of Price Estimation (PE) methods in financial economics, such as present value, help optimize health decisions concerning zoonotic diseases. Leptospira transmission in Brazil, the Philippines and Sri Lanka illuminates the need for broad and cohesive policies that take into
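The two-tailed test reported above can be sketched with SciPy. The price figures below are invented placeholders, not the study's samples; only the testing procedure (α = 0.05, failing to reject H0 when t_crit > t_obs > -t_crit, i.e. when p ≥ α) follows the record:

```python
# Hedged sketch of a two-tailed two-sample test at alpha = 0.05.
# The annualized price figures are made-up placeholders, not the study's data.
from scipy import stats

antimicrobial = [12.4, 15.1, 13.8, 14.6, 12.9, 15.5]  # hypothetical annual prices
rodenticide   = [13.0, 14.2, 15.9, 12.5, 14.8, 13.6]  # hypothetical annual prices

# Welch's t-test (no equal-variance assumption); two-sided by default.
t_obs, p_value = stats.ttest_ind(antimicrobial, rodenticide, equal_var=False)

alpha = 0.05
# Fail to reject H0 (mu_antimicrobial == mu_rodenticides) when p >= alpha,
# equivalently when -t_crit < t_obs < t_crit.
reject_h0 = p_value < alpha
print(t_obs, p_value, reject_h0)
```

With samples whose means are this close, the test fails to reject H0, mirroring the study's finding that the two treatment methods' prices were not significantly different.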

  2. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    2005-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  3. Arc Statistics

    CERN Document Server

    Meneghetti, M; Dahle, H; Limousin, M

    2013-01-01

    The existence of an arc statistics problem has been at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool for tracing structure formation. However, given the limited size of observational and theoretical data-sets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...

  4. Misuse of statistics in surgical literature.

    Science.gov (United States)

    Thiese, Matthew S; Ronna, Brenden; Robbins, Riann B

    2016-08-01

    Statistical analyses are a key part of biomedical research. Traditionally surgical research has relied upon a few statistical methods for evaluation and interpretation of data to improve clinical practice. As research methods have increased in both rigor and complexity, statistical analyses and interpretation have fallen behind. Some evidence suggests that surgical research studies are being designed and analyzed improperly given the specific study question. The goal of this article is to discuss the complexities of surgical research analyses and interpretation, and provide some resources to aid in these processes.

  5. Measurement and statistics for teachers

    CERN Document Server

    Van Blerkom, Malcolm

    2008-01-01

    Written in a student-friendly style, Measurement and Statistics for Teachers shows teachers how to use measurement and statistics wisely in their classes. Although there is some discussion of theory, emphasis is given to the practical, everyday uses of measurement and statistics. The second part of the text provides more complete coverage of basic descriptive statistics and their use in the classroom than in any text now available. Comprehensive and accessible, Measurement and Statistics for Teachers includes: short vignettes showing concepts in action; numerous classroom examples; highlighted vocabulary; boxes summarizing related concepts; end-of-chapter exercises and problems; six full chapters devoted to the essential topic of Classroom Tests; instruction on how to carry out informal assessments, performance assessments, and portfolio assessments, and how to use and interpret standardized tests; and a five-chapter section on Descriptive Statistics, giving instructors the option of more thoroughly teaching basic measur

  6. Linguistics in Text Interpretation

    DEFF Research Database (Denmark)

    Togeby, Ole

    2011-01-01

    A model for how text interpretation proceeds from what is pronounced, through what is said, to what is communicated, and definitions of the concepts 'presupposition' and 'implicature'.

  7. Acquiring specific interpreting competence

    Directory of Open Access Journals (Sweden)

    Jana Zidar Forte

    2012-12-01

    In postgraduate interpreter training, the main objective of the course is to help trainees develop various competences, from linguistic, textual and cultural competence to professional and specific interpreting competence. For simultaneous interpreting (SI), the main focus is on mastering the SI technique and strategies as well as on developing and strengthening communicative skills, which is discussed and illustrated with examples in the present paper. First, a brief overview is given of all the necessary competences of a professional interpreter, with greater emphasis on specific interpreting competence for SI. In the second part of the paper, various approaches are described in terms of acquiring specific skills and strategies, specifically through a range of exercises. Besides interpreting entire speeches, practical courses should also consist of targeted exercises, which help trainees develop suitable coping strategies and mechanisms (later on almost automatisms), while at the same time "forcing" them to reflect on their individual learning process and interpreting performance. This provides a solid base on which trained interpreters can progress and develop their skills after joining the professional sphere.

  8. Wilhelm Wundt's Theory of Interpretation

    Directory of Open Access Journals (Sweden)

    Jochen Fahrenberg

    2008-09-01

    Wilhelm WUNDT was a pioneer in experimental and physiological psychology. However, his theory of interpretation (hermeneutics) remains virtually neglected. According to WUNDT, psychology belongs to the domain of the humanities (Geisteswissenschaften), and, throughout his books and research, he advocated two basic methodologies: experimentation (as the means of controlled self-observation) and interpretative analysis of mental processes and products. He was an experimental psychologist and a profound expert in traditional hermeneutics. Today, he still may be acknowledged as the author of the monumental Völkerpsychologie, but not for his advances in epistemology and methodology. His subsequent work, the Logik (1908/1921), contains about 120 pages on hermeneutics. In the present article a number of issues are addressed. Noteworthy was WUNDT's general intention to account for the logical constituents and the psychological process of understanding, and his reflections on quality control. In general, WUNDT demanded methodological pluralism and a complementary approach to the study of consciousness and neurophysiological processes. In the present paper WUNDT's approach is related to the continuing controversy on basic issues in methodology, e.g. experimental and statistical methods vs. qualitative (hermeneutic) methods. Varied explanations are given for the one-sided or distorted reception of WUNDT's methodology. Presently, in Germany the basic program of study in psychology lacks thorough teaching and training in qualitative (hermeneutic) methods. Appropriate courses are not included in the curricula, in contrast to the training in experimental design, observation methods, and statistics. URN: urn:nbn:de:0114-fqs0803291

  9. Statistics Poster Challenge for Schools

    Science.gov (United States)

    Payne, Brad; Freeman, Jenny; Stillman, Eleanor

    2013-01-01

    The analysis and interpretation of data are important life skills. A poster challenge for schoolchildren provides an innovative outlet for these skills and demonstrates their relevance to daily life. We discuss our Statistics Poster Challenge and the lessons we have learned.

  10. Depth statistics

    OpenAIRE

    2012-01-01

    In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
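The halfspace depth described above can be sketched directly: in one dimension the depth of z is min(#{x ≤ z}, #{x ≥ z}), and in the plane it can be approximated by minimizing that count over a grid of projection directions. The direction-grid approach below is an approximation for illustration, not an exact algorithm:

```python
# Sketch of Tukey's halfspace depth in the plane, approximated by minimizing
# over a grid of projection directions (not an exact algorithm).
import numpy as np

def halfspace_depth_2d(z, data, n_dirs=360):
    angles = np.linspace(0.0, np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])
    proj = data @ dirs.T              # every point projected on every direction
    zproj = z @ dirs.T
    # Small tolerance so exact ties count on both sides of the hyperplane.
    below = (proj <= zproj + 1e-9).sum(axis=0)
    above = (proj >= zproj - 1e-9).sum(axis=0)
    return int(np.minimum(below, above).min())

data = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
print(halfspace_depth_2d(np.array([0.5, 0.5]), data))  # central point: deepest
print(halfspace_depth_2d(np.array([0.0, 0.0]), data))  # corner: shallow
```

The center of this symmetric cloud attains the maximal depth, while each corner can be cut off by a single halfplane containing only itself, exactly the 'deepest point' intuition behind Tukey's multivariate median.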

  11. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  12. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics using SPSS (second edition published by Sage in 2005).

  13. Working With Educational Interpreters.

    Science.gov (United States)

    Seal, Brenda C

    2000-01-01

    Increasing numbers of students who are deaf or hard of hearing are being educated in their local schools. Accommodations frequently made for these students include the provision of educational interpreting services. Educational interpreters serve to equalize the source language or source communication mode (usually spoken English) with a target language or target mode (either sign language, cued speech, or oral transliterating). Educational interpreters' expertise in sign language or cued speech will likely exceed that of speech-language pathologists, whose expertise in speech and language development and in discourse demands of the classroom will likely exceed that of the educational interpreters. This article addresses the mutual needs of speech-language pathologists and educational interpreters in providing services to their students. Guidelines supported by recent research reports and survey data collected from interpreters are offered to speech-language pathologists as ways to improve the working relationships with educational interpreters in three areas: (a) evaluating a student's communication skills, (b) establishing treatment goals and intervening to meet those goals, and

  14. Interpretation as doing

    DEFF Research Database (Denmark)

    Majgaard Krarup, Jonna

    2008-01-01

    The intent of the paper is to address and discuss relationships between the aesthetic perception and interpretation of contemporary landscape architecture. I will try to do this by setting up a cross-disciplinary perspective that looks into themes from the contemporary art scene and the aesthetic concept of landscape and design in landscape architecture, and hereby address the question of how interpretation might be processed. It is also my premise that a key point in this is the interplay between different sensory experiences of both material and non-material aspects, and that it is this interplay that the individual collects into an entity – an interpretation – through an intellectual process.

  15. Customizable tool for ecological data entry, assessment, monitoring, and interpretation

    Science.gov (United States)

    The Database for Inventory, Monitoring and Assessment (DIMA) is a highly customizable tool for data entry, assessment, monitoring, and interpretation. DIMA is a Microsoft Access database that can easily be used without Access knowledge and is available at no cost. Data can be entered for common, nat...

  16. Neural network classification - A Bayesian interpretation

    Science.gov (United States)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.
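
    The core relationship reviewed here can be checked numerically: for 0/1 class targets, the function minimizing mean squared error is E[y|x] = P(y = 1|x), so a classifier trained on squared error can be read as a posterior-probability estimator. A minimal sketch (invented data, not from the paper) uses a discrete input, where the per-input MSE minimizer is simply the sample mean of the targets:

```python
import random

# Hypothetical setup: x takes two values, each with a known posterior
# P(y = 1 | x). The constant minimizing squared error for a given x is
# the sample mean of its 0/1 targets, which converges to the true
# posterior -- the property underlying the Bayesian interpretation of
# MSE-trained classifiers.
rng = random.Random(42)
true_posterior = {0: 0.2, 1: 0.8}

samples = [(x, 1 if rng.random() < true_posterior[x] else 0)
           for _ in range(20000) for x in (0, 1)]

estimates = {}
for x in (0, 1):
    ys = [y for xi, y in samples if xi == x]
    estimates[x] = sum(ys) / len(ys)  # per-x minimizer of squared error
```

    With 20,000 draws per input, the estimates land within about ±0.02 of the true posteriors 0.2 and 0.8.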

  17. Interpretation of Biosphere Reserves.

    Science.gov (United States)

    Merriman, Tim

    1994-01-01

    Introduces the Man and the Biosphere Programme (MAB) to monitor the 193 biogeographical provinces of the Earth and the creation of biosphere reserves. Highlights the need for interpreters to become familiar or involved with MAB program activities. (LZ)

  18. A New Redshift Interpretation

    CERN Document Server

    Gentry, R V

    1997-01-01

    A nonhomogeneous universe with vacuum energy, but without spacetime expansion, is utilized together with gravitational and Doppler redshifts as the basis for proposing a new interpretation of the Hubble relation and the 2.7K Cosmic Blackbody Radiation.

  19. Normative interpretations of diversity

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2009-01-01

    Normative interpretations of particular cases consist of normative principles or values coupled with social theoretical accounts of the empirical facts of the case. The article reviews the most prominent normative interpretations of the Muhammad cartoons controversy over the publication of drawings of the Prophet Muhammad in the Danish newspaper Jyllands-Posten. The controversy was seen as a case of freedom of expression, toleration, racism, (in)civility and (dis)respect, and the article notes different understandings of these principles and how the application of them to the controversy implied different social theoretical accounts of the case. In disagreements between different normative interpretations, appeals are often made to the ‘context', so it is also considered what roles ‘context' might play in debates over normative interpretations.

  20. Cytological artifacts masquerading interpretation

    Directory of Open Access Journals (Sweden)

    Khushboo Sahay

    2013-01-01

    Conclusions: In order to justify a cytosmear interpretation, a cytologist must be well acquainted with delayed fixation-induced cellular changes and microscopic appearances of common contaminants so as to implicate better prognosis and therapy.

  1. Ultrasound-guided spinal anaesthesia in obstetrics: is there an advantage over the landmark technique in patients with easily palpable spines?

    Science.gov (United States)

    Ansari, T; Yousef, A; El Gamassy, A; Fayez, M

    2014-08-01

    Data are scarce on the advantage of ultrasound-guided spinal anaesthesia in patients with easily identifiable bony landmarks. In this study, we compared the use of ultrasound to the landmark method in patients with no anticipated technical difficulty, presenting for caesarean delivery under spinal anaesthesia. A total of 150 pregnant women were recruited in this randomized, controlled study. Ultrasound examination and spinal anaesthesia were performed by three anaesthetists with experience in ultrasound-guided neuraxial block. Patients were randomized to either the Ultrasound Group (n=75) or the Landmark Group (n=75). In both groups the level of L3-4 or L4-5 was identified by ultrasound (transverse and longitudinal approach) or palpation. The primary outcome was the procedure time, measured from the time of skin puncture by the introducer to the time of viewing cerebrospinal fluid at the hub of the spinal needle. Secondary outcomes were the number of skin punctures, number of passes, and incidence of successful spinal blockade. The average procedure time, number of skin punctures and needle passes, were similar in both groups. The number of patients with successful spinal anaesthesia after one puncture was not statistically different between the groups. The present results indicate that when performed by anaesthetists experienced in both ultrasound and landmark techniques, the use of ultrasound does not appear to increase the success rate of spinal anaesthesia, or reduce the procedure time or number of attempts in obstetric patients with easily palpable spines. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
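
    The permutation approach the monograph integrates can be illustrated with a minimal two-sample test: the labels of the pooled observations are repeatedly shuffled, and the p-value is the fraction of shuffles whose test statistic is at least as extreme as the observed one. The sketch below (not from the book; the data are invented) uses the absolute difference of means as the statistic.

```python
import random

def permutation_test(x, y, n_perm=10000, seed=0):
    """Approximate two-sided permutation test on the difference of means.

    Shuffles the pooled sample n_perm times and returns the fraction of
    shuffles whose |mean(x') - mean(y')| is at least the observed value.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    n_x = len(x)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_x]) / n_x -
                   sum(pooled[n_x:]) / (len(pooled) - n_x))
        if diff >= observed:
            hits += 1
    return hits / n_perm

# Well-separated samples give a small p-value (here the exact value is
# 2/70, since only the observed split and its mirror are as extreme).
p = permutation_test([1.0, 1.1, 0.9, 1.2], [2.0, 2.1, 1.9, 2.2])
```

    No normality or equal-variance assumption is used anywhere: the reference distribution is built from the data at hand, which is precisely the point the monograph makes.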

  3. Statistical Computing in Information Society

    Directory of Open Access Journals (Sweden)

    Domański Czesław

    2015-12-01

    Full Text Available In the presence of massive data of high heterogeneity, we need to change our statistical thinking and statistical education in order to adapt both classical statistics and software developments to the new challenges. Significant developments include open data, big data and data visualisation, and they are changing the nature of the evidence that is available, the ways in which it is presented and the skills needed for its interpretation. The amount of information is not the most important issue – the real challenge is the combination of the amount and the complexity of the data. Moreover, a need arises to know how uncertain situations should be dealt with and what decisions should be taken when information is insufficient (which can also be observed for large datasets). In the paper we discuss the idea of computational statistics as a new approach to statistical teaching and we try to answer the question: how can we best prepare the next generation of statisticians?

  4. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May Be More Easily Altered than Others

    Science.gov (United States)

    Mesci, Gunkut; Schwartz, Renee' S.

    2017-01-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and…

  5. Easily denaturing nucleic acids derived from intercalating nucleic acids: thermal stability studies, dual duplex invasion and inhibition of transcription start

    DEFF Research Database (Denmark)

    Filichev, Vyacheslav V; Vester, Birte; Hansen, Lykke Haastrup;

    2005-01-01

    The bulged insertions of (R)-1-O-(pyren-1-ylmethyl)glycerol (monomer P) in two complementary 8mer DNA strands (intercalating nucleic acids) opposite to each other resulted in the formation of an easily denaturing duplex, which had lower thermal stability (21.0 degrees C) than the wild-type double...

  6. Smartphones for post-event analysis: a low-cost and easily accessible approach for mapping natural hazards

    Science.gov (United States)

    Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo

    2015-04-01

    A real opportunity and challenge for hazard mapping is offered by the use of smartphones and a low-cost, flexible photogrammetric technique ('Structure-from-Motion', SfM). Unlike traditional photogrammetric methods, SfM can reconstruct three-dimensional geometries (Digital Surface Models, DSMs) from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take numerous pictures of the same study area. In this way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help to quantify the volumes of material eroded by landslides and to recognize the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark against which to compare the SfM data. Digital Surface Models derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes, for example. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique applied through smartphones offers a fast, simple and affordable alternative to lidar.

  7. Large, Easily Deployable Structures

    Science.gov (United States)

    Agan, W. E.

    1983-01-01

    Study of concepts for large space structures will interest those designing scaffolding, radio towers, rescue equipment, and prefabricated shelters. Double-fold, double-cell module was selected for further design and for zero gravity testing. Concept is viable for deployment by humans outside space vehicle as well as by remotely operated manipulator.

  8. SOCR: Statistics Online Computational Resource

    Directory of Open Access Journals (Sweden)

    Ivo D. Dinov

    2006-10-01

    Full Text Available The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.

  10. Modification of codes NUALGAM and BREMRAD. Volume 3: Statistical considerations of the Monte Carlo method

    Science.gov (United States)

    Firstenberg, H.

    1971-01-01

    The statistics of the Monte Carlo method are considered relative to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented and the results are statistically interpreted.

  11. Knowledge Requirements Formula for Interpreters and Interpreting Training

    Institute of Scientific and Technical Information of China (English)

    高宇婷

    2009-01-01

    Based on Zhong Weihe's knowledge requirements formula for interpreters, KI = KL + EK + S(P + AP), this paper explains in detail how the different kinds of knowledge are used in the course of interpreting and provides some useful strategies for interpreting practice.

  12. Translation, Interpreting and Lexicography

    DEFF Research Database (Denmark)

    Tarp, Sven; Dam, Helle Vrønning

    2017-01-01

    Translation, interpreting and lexicography represent three separate areas of human activity, each of them with its own theories, models and methods and, hence, with its own disciplinary underpinnings. At the same time, all three disciplines are characterized by a marked interdisciplinary dimension in the sense that their practice fields are typically ‘about something else’. Translators may, for example, be called upon to translate medical texts, and interpreters may be assigned to work on medical speeches. Similarly, practical lexicography may produce medical dictionaries. In this perspective, the three disciplines frequently come into touch with each other. This chapter discusses and explores some of the basic aspects of this interrelationship, focusing on the (potential) contribution of lexicography to translation and interpreting and on explaining the basic concepts and methods of the former discipline.

  13. Conference Interpreting Explained

    Institute of Scientific and Technical Information of China (English)

    盖孟姣

    2015-01-01

    This book, written by Roderick Jones, is easy for me to read. It gives me a bit of confidence in reading a book, and this time I know a bit about how to read a book quickly. After this, I will read more books about interpreting and translating for my further study. From my perspective, every part of this book consists of three sections, that is, the theory section, the examples section and the concluding section. Through reading this book, I learned something about interpreting, such as simultaneous interpreting techniques, along with some actual examples. Anyhow, I still need a lot of practice to improve my English ability. What I have written below is the main content of the fourth part of this book, and my feelings on reading it.

  14. Integrated reservoir interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Caamano, Ed; Dickerman, Ken; Thornton, Mick (Conoco Indonesia Inc., Jakarta (Indonesia)); Corbett, Chip; Douglas, David; Schultz, Phil (GeoQuest, Houston, TX (United States)); Gir, Roopa; Nicholson, Barry (GeoQuest, Jakarta (Indonesia)); Martono, Dwi; Padmono, Joko; Novias; Kiagus; Suroso, Sigit (Pertamina Sumbagut, Brandan, North Sumatra (Indonesia)); Mathieu, Gilles (Etudes et Productions Schlumberger, Clamart (France)); Yan, Zhao (China National Petroleum Company, Beijing (China))

    1994-07-01

    Improved reservoir management often relies on linking a variety of application software that helps geoscientists handle, visualize and interpret massive amounts of diverse data. The goal is to obtain the best possible reservoir model so its behavior can be understood and optimized. But diverse application software creates specialty niches and discourages integrated interpretation. A description is given of a new reservoir management package that covers all required functionalities and encourages the geologist, geophysicist, petrophysicist and reservoir engineer to embrace the integrated approach. Case studies are included in the article. 21 figs., 13 refs.

  15. Conjunctive interpretations of disjunctions

    Directory of Open Access Journals (Sweden)

    Robert van Rooij

    2010-09-01

    Full Text Available In this extended commentary I discuss the problem of how to account for "conjunctive" readings of some sentences with embedded disjunctions for globalist analyses of conversational implicatures. Following Franke (2010, 2009), I suggest that earlier proposals failed because they did not take into account the interactive reasoning about what else the speaker could have said, and how else the hearer could have interpreted the (alternative) sentence(s). I show how Franke's idea relates to more traditional pragmatic interpretation strategies. doi:10.3765/sp.3.11

  17. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications.

    Science.gov (United States)

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-11-07

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination.

  18. The Stirling engine. Simply explained, easily constructed. 9. rev. and enl. ed.; Der Stirlingmotor. Einfach erklaert und leicht gebaut

    Energy Technology Data Exchange (ETDEWEB)

    Viebach, Dieter

    2010-07-01

    Following an easily comprehensible description of the function and characteristics of Stirling engines, the author of the book under consideration describes the construction of a model Stirling engine on the basis of clear construction drawings. A delicacy for experienced modelers: the 'amazing model', a miniature Stirling engine made of beverage cans, runs on the warmth of the human hand. Even for this technically demanding model, the construction is described accurately in detailed construction drawings.

  19. A two-stage anaerobic system for biodegrading wastewater containing terephthalic acid and high strength easily degradable pollutants

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The high-strength easily biodegradable pollutants (represented by CODE) are strong inhibitors of terephthalic acid (TA) anaerobic biodegradation. At the same time, TA can inhibit easily biodegradable pollutant removal under anaerobic conditions to a limited extent. This mutual inhibition can occur and cause a low removal efficiency of both TA and CODE when the effluent from TA workshops, containing TA and easily biodegradable pollutants, is treated by a single anaerobic reactor system. Based upon the treatment kinetics analysis of both TA degradation and CODE removal, a two-stage up-flow anaerobic sludge blanket and up-flow fixed film reactor (UASB-UAFF) system for dealing with this kind of wastewater was developed and run successfully at laboratory scale. A UASB reactor with the methanogenic consortium as the first stage removes the easily biodegradable pollutants (CODE). A UAFF reactor as the second stage is mainly in charge of TA degradation. At an HRT of 18.5 h, the CODE and TA removal rates of the system reached 89.2% and 71.6%, respectively.

  20. Screencast Tutorials Enhance Student Learning of Statistics

    Science.gov (United States)

    Lloyd, Steven A.; Robertson, Chuck L.

    2012-01-01

    Although the use of computer-assisted instruction has rapidly increased, there is little empirical research evaluating these technologies, specifically within the context of teaching statistics. The authors assessed the effect of screencast tutorials on learning outcomes, including statistical knowledge, application, and interpretation. Students…

  1. Statistical distributions of air pollution concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Georgopoulos, P.G.; Seinfeld, J.H.

    1982-07-01

    Methodologies and limitations in describing air quality through statistical distributions of pollutant concentrations are discussed, and the use of extreme statistics in the evaluation of different forms of air quality standards is explained. In addition, the interpretation of rollback calculations with regard to air quality standards is discussed. (JMT)

  2. Interpreting the Constitution.

    Science.gov (United States)

    Brennan, William J., Jr.

    1987-01-01

    Discusses constitutional interpretations relating to capital punishment and protection of human dignity. Points out the document's effectiveness in creating a new society by adapting its principles to current problems and needs. Considers two views of the Constitution that lead to controversy over the legitimacy of judicial decisions. (PS)

  3. Interpreting the Santal Rebellion

    DEFF Research Database (Denmark)

    Andersen, Peter Birkelund

    2016-01-01

    Postcolonial studies have interpreted the Santal Rebellion, the hul of 1855, as a peasant rebellion that the colonial power construed as an ethnic rebellion (R. Guha). Anthropologists and historians have stressed the near-complete mobilisation of the Santals, whereas a later colonial historian (W...

  4. Social Maladjustment: An Interpretation.

    Science.gov (United States)

    Center, David B.

    The exclusionary term "social maladjustment," found in the Public Law 94-142 (the Education for All Handicapped Children Act) definition of serious emotional disturbance, has been an enigma for special education. This paper attempts to limit the interpretation of social maladjustment in order to counter the effects of such decisions as…

  5. Interpretations of Greek Mythology

    NARCIS (Netherlands)

    Bremmer, Jan

    1987-01-01

    This collection of original studies offers new interpretations of some of the best known characters and themes of Greek mythology, reflecting the complexity and fascination of the Greek imagination. Following analyses of the concept of myth and the influence of the Orient on Greek mythology, the

  7. Conflicts in interpretation

    NARCIS (Netherlands)

    Bouma, G.; Hendriks, P.; Hoop, H. de; Krämer, I.; Swart, Henriëtte de; Zwarts, J.

    2007-01-01

    The leading hypothesis of this paper is that interpretation is a process of constraint satisfaction, conflict resolution, and optimization, along the lines of Optimality Theory. Support for this view is drawn from very different domains, and based on both experimental and theoretical research. We di

  8. Interpreting & Biomechanics. PEPNet Tipsheet

    Science.gov (United States)

    PEPNet-Northeast, 2001

    2001-01-01

    Cumulative trauma disorder (CTD) refers to a collection of disorders associated with nerves, muscles, tendons, bones, and the neurovascular (nerves and related blood vessels) system. CTD symptoms may involve the neck, back, shoulders, arms, wrists, or hands. Interpreters with CTD may experience a variety of symptoms including: pain, joint…

  9. Interpretation as conflict resolution

    NARCIS (Netherlands)

    Swart, Henriëtte de; Zwart, J.

    Semantic interpretation is not a simple process. When we want to know what a given sentence means, more is needed than just a simple ‘adding up’ of the meanings of the component words. Not only can the words in a sentence interact and conflict with each other, but also with the linguistic and

  10. The act of interpretation.

    Science.gov (United States)

    D'Abreu, Aloysio Augusto

    2006-08-01

    The author understands the interpreting act as an attempt to perceive what happens in the transference/countertransference field and not just what happens in the patient's mind. Interpretation transcends mere intellectual communication. It is also an experience in which analysts' emotions work as an important instrument in understanding their patients. Interpretation is seen to possess manifest as well as latent content; the latter would contain the analysts' feelings, emotions and personality. The unconscious content of an interpretation does not inconvenience or preclude the development of the analytic process, but, on the contrary, it allows new associative material to emerge, and it transforms the analytic session into a human relationship. Analysts' awareness of this content derived from patients' apperceptions is a significant instrument for understanding what is happening in the analytic relationship, and what transpires in these sessions provides fundamental elements for analysts' self-analysis. Some clinical examples demonstrate these occurrences in analytic sessions, and how they can be apprehended and used for a better understanding of the patient. The author also mentions the occurrence of difficulties during the analytic process. These difficulties are often the result of lapses in an analyst's perception related to unconscious elements of the relationship.

  11. Interpreting television news

    NARCIS (Netherlands)

    Schaap, G.J.

    2009-01-01

    Television news ranges among the most extensively investigated topics in communication studies. The book contributes to television news research by focusing on whether and how news viewers who watch the same news program form similar or different interpretations. The author develops a novel concept o

  12. Statistical mechanics of influence maximization with thermal noise

    Science.gov (United States)

    Lynn, Christopher W.; Lee, Daniel D.

    2017-03-01

    The problem of optimally distributing a budget of influence among individuals in a social network, known as influence maximization, has typically been studied in the context of contagion models and deterministic processes, which fail to capture stochastic interactions inherent in real-world settings. Here, we show that by introducing thermal noise into influence models, the dynamics exactly resemble spins in a heterogeneous Ising system. In this way, influence maximization in the presence of thermal noise has a natural physical interpretation as maximizing the magnetization of an Ising system given a budget of external magnetic field. Using this statistical mechanical formulation, we demonstrate analytically that for small external-field budgets, the optimal influence solutions exhibit a highly non-trivial temperature dependence, focusing on high-degree hub nodes at high temperatures and on easily influenced peripheral nodes at low temperatures. For the general problem, we present a projected gradient ascent algorithm that uses the magnetic susceptibility to calculate locally optimal external-field distributions. We apply our algorithm to synthetic and real-world networks, demonstrating that our analytic results generalize qualitatively. Our work establishes a fruitful connection with statistical mechanics and demonstrates that influence maximization depends crucially on the temperature of the system, a fact that has not been appreciated by existing research.
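
    The temperature dependence described above can be reproduced with a toy mean-field calculation. The sketch below is an illustration of the Ising mapping, not the authors' code; the star graph, couplings and field budget are invented. It solves the mean-field self-consistency m_i = tanh(β(Σ_j J_ij m_j + h_i)) and compares spending the whole field budget on the hub versus spreading it over the periphery:

```python
import math

def mean_field_magnetization(J, h, beta, iters=2000, damping=0.5):
    """Damped fixed-point iteration for m_i = tanh(beta*(sum_j J[i][j]*m[j] + h[i])).

    Returns the total magnetization sum_i m_i, the quantity the
    influence-maximization problem seeks to maximize.
    """
    n = len(h)
    m = [0.0] * n
    for _ in range(iters):
        m = [(1 - damping) * m[i] + damping * math.tanh(
                 beta * (sum(J[i][j] * m[j] for j in range(n)) + h[i]))
             for i in range(n)]
    return sum(m)

# Star graph: node 0 is the hub, nodes 1..5 are peripheral.
n = 6
J = [[0.0] * n for _ in range(n)]
for j in range(1, n):
    J[0][j] = J[j][0] = 1.0

budget = 1.0
hub_field = [budget] + [0.0] * (n - 1)               # entire budget on the hub
spread_field = [0.0] + [budget / (n - 1)] * (n - 1)  # budget spread over periphery

beta = 0.3  # high temperature: weak interactions relative to thermal noise
m_hub = mean_field_magnetization(J, hub_field, beta)
m_spread = mean_field_magnetization(J, spread_field, beta)
```

    At this high temperature the hub allocation yields the larger total magnetization, matching the linear-response behaviour the abstract describes; as β grows and the spins saturate, the advantage shrinks.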

  13. Statistical ecology comes of age.

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  14. Statistical ecology comes of age

    Science.gov (United States)

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  15. Statistical physics and ecology

    Science.gov (United States)

    Volkov, Igor

    This work addresses the applications of the methods of statistical physics to problems in population ecology. A theoretical framework based on stochastic Markov processes for the unified neutral theory of biodiversity is presented and an analytical solution for the distribution of the relative species abundance distribution both in the large meta-community and in the small local community is obtained. It is shown that the framework of the current neutral theory in ecology can be easily generalized to incorporate symmetric density dependence. An analytically tractable model is studied that provides an accurate description of beta-diversity and exhibits novel scaling behavior that leads to links between ecological measures such as relative species abundance and the species area relationship. We develop a simple framework that incorporates the Janzen-Connell, dispersal and immigration effects and leads to a description of the distribution of relative species abundance, the equilibrium species richness, beta-diversity and the species area relationship, in good accord with data. Also it is shown that an ecosystem can be mapped into an unconventional statistical ensemble and is quite generally tuned in the vicinity of a phase transition where bio-diversity and the use of resources are optimized. We also perform a detailed study of the unconventional statistical ensemble, in which, unlike in physics, the total number of particles and the energy are not fixed but bounded. We show that the temperature and the chemical potential play a dual role: they determine the average energy and the population of the levels in the system and at the same time they act as an imbalance between the energy and population ceilings and the corresponding average values. Different types of statistics (Boltzmann, Bose-Einstein, Fermi-Dirac and one corresponding to the description of a simple ecosystem) are considered. In all cases, we show that the systems may undergo a first or a second order
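
    The mean occupation numbers behind the Boltzmann, Bose-Einstein and Fermi-Dirac statistics mentioned above are standard textbook formulas; a minimal sketch (with k_B = 1, and not attempting the bounded-ensemble variant studied in this work):

```python
import math

def occupancy(e, mu, T, kind):
    """Mean occupation number of a single level with energy e,
    chemical potential mu and temperature T (units with k_B = 1)."""
    x = (e - mu) / T
    if kind == "boltzmann":
        return math.exp(-x)
    if kind == "bose":                 # requires e > mu
        return 1.0 / math.expm1(x)
    if kind == "fermi":
        return 1.0 / (math.exp(x) + 1.0)
    raise ValueError("unknown statistics: " + kind)
```

For the same positive value of (e - mu)/T, the Bose-Einstein occupation exceeds the Boltzmann one, which in turn exceeds the Fermi-Dirac one, reflecting bunching versus exclusion.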

  16. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  17. Statistics for scientists and engineers

    CERN Document Server

    Shanmugam, Ramalingam

    2015-01-01

    This book provides the theoretical framework needed to build, analyze and interpret various statistical models. It helps readers choose the correct model, distinguish among the choices that best capture the data, and solve the problem at hand. This is an introductory textbook on probability and statistics. The authors explain theoretical concepts in a step-by-step manner and provide practical examples. The introductory chapter in this book presents the basic concepts. Next, the authors discuss the measures of location, popular measures of spread, and measures of skewness and kurtosis. Prob

  18. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  19. Ranald Macdonald and statistical inference.

    Science.gov (United States)

    Smith, Philip T

    2009-05-01

    Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.

  20. Use of statistics in plant biotechnology.

    Science.gov (United States)

    Compton, Michael E

    2012-01-01

    Statistics and experimental design are important tools for the plant biotechnologist and should be used when planning and conducting experiments as well as during the analysis and interpretation of results. This chapter provides some basic concepts important to the statistical analysis of data obtained from plant tissue culture or biotechnology experiments, and illustrates the application of common statistical procedures to analyze binomial, count, and continuous data for experiments with different treatment factors as well as identifying trends of dosage treatment factors.
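
    As a minimal example of the kind of binomial-data analysis the chapter covers, a Pearson chi-square statistic for a 2x2 contingency table can be computed directly; the media-comparison scenario in the comment is hypothetical, not from the chapter:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. counts of explants regenerating shoots
    (or not) under two hypothetical culture media."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

The statistic is compared against a chi-square distribution with one degree of freedom; values above about 3.84 indicate a treatment effect at the 5% level.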

  1. Interpretation of Internet technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    2001-01-01

    Research scope: The topic of the research project is to investigate how new internet technologies such as e-trade and customer relation marketing and management are implemented in Danish food processing companies. The aim is to use Weick's (1995) sensemaking concept to analyse the strategic processes leading to the use of internet marketing technologies and to investigate how these new technologies are interpreted into the organisation. Investigating the organisational socio-cognitive processes underlying the decision making processes will give further insight into the socio-cognitive competencies of organisations (Rindova & Fombrunn, 1999). The aim is to contribute to the existing technological implementation theory complex by studying the relationships between the elements of the socio-cognitive processes and the resulting interpretations and actions when new technologies are implemented.

  2. Physical Interpretation of Antigravity

    CERN Document Server

    Bars, Itzhak

    2015-01-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl invariant Standard Model coupled to General Relativity (SM+GR), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and o...

  3. Well testing: interpretation methods

    Energy Technology Data Exchange (ETDEWEB)

    Bourdarot, G. [Elf Aquitaine, 92 - Courbevoie (France)

    1998-12-31

    This book presents the different methods to use for interpreting well tests: conventional methods, type curves, derivative, according to the type of reservoir limits (faults, channels, secant faults, constant pressure limits, closed reservoir) and the well configuration (partial penetration, inclined, fractured, horizontal or injection well). It indicates the method to be used in the case of more complex reservoirs (fissured reservoirs, two-layered reservoirs), interference tests or pulse tests and when fluid produced is gas or polyphasic. (authors) 60 refs.

  4. Well testing: interpretation methods

    Energy Technology Data Exchange (ETDEWEB)

    Bourdarot, G. (Elf Aquitaine, 92 - Courbevoie (France))

    1998-01-01

    This book presents the different methods to use for interpreting well tests: conventional methods, type curves, derivative, according to the type of reservoir limits (faults, channels, secant faults, constant pressure limits, closed reservoir) and the well configuration (partial penetration, inclined, fractured, horizontal or injection well). It indicates the method to be used in the case of more complex reservoirs (fissured reservoirs, two-layered reservoirs), interference tests or pulse tests and when fluid produced is gas or polyphasic. (authors) 60 refs.

  5. Reflections and Interpretations

    DEFF Research Database (Denmark)

    Reflections and Interpretations is an anthology on The Freedom Writers' methodology. It is an anthology for all those with a professional need for texts explaining, not only how The Freedom Writers' tools are being used, but also why they work so convincingly well. It is not an anthology of guidelines; it is an anthology of explanations based on theory. And it is an anthology written by Freedom Writer Teachers – who else could do it?

  6. Tips for Mental Health Interpretation

    Science.gov (United States)

    Whitsett, Margaret

    2008-01-01

    This paper offers tips for working with interpreters in mental health settings. These tips include: (1) Using trained interpreters, not bilingual staff or community members; (2) Explaining "interpreting procedures" to the providers and clients; (3) Addressing the stigma associated with mental health that may influence interpreters; (4) Defining…

  7. Video interpretations in Danish hospitals

    DEFF Research Database (Denmark)

    Søbjerg, Lene Mosegaard; Noesgaard, Susanne; Henriksen, Jan Erik;

    2013-01-01

    This article presents a study of an RCT comparing video interpretation with in-person interpretation at the Endocrinology Ward at Odense University Hospital.

  8. Highly concentrated synthesis of copper-zinc-tin-sulfide nanocrystals with easily decomposable capping molecules for printed photovoltaic applications

    Science.gov (United States)

    Kim, Youngwoo; Woo, Kyoohee; Kim, Inhyuk; Cho, Yong Soo; Jeong, Sunho; Moon, Jooho

    2013-10-01

    Among various candidate materials, Cu2ZnSnS4 (CZTS) is a promising earth-abundant semiconductor for low-cost thin film solar cells. We report a facile, less toxic, highly concentrated synthetic method utilizing the heretofore unrecognized, easily decomposable capping ligand of triphenylphosphate, where phase-pure, single-crystalline, and well-dispersed colloidal CZTS nanocrystals were obtained. The favorable influence of the easily decomposable capping ligand on the microstructural evolution of device-quality CZTS absorber layers was clarified based on a comparative study with commonly used oleylamine-capped CZTS nanoparticles. The resulting CZTS nanoparticles enabled us to produce a dense and crack-free absorbing layer through annealing under a N2 + H2S (4%) atmosphere, demonstrating a solar cell with an efficiency of 3.6% under AM 1.5 illumination.
Electronic supplementary information (ESI) available: Experimental methods for CZTS nanocrystal synthesis, device fabrication, and characterization; the size distribution and energy dispersive X-ray (EDX) spectra of the synthesized CZTS nanoparticles; UV-vis spectra of the

  9. Novel Terthiophene-Substituted Fullerene Derivatives as Easily Accessible Acceptor Molecules for Bulk-Heterojunction Polymer Solar Cells

    Directory of Open Access Journals (Sweden)

    Filippo Nisic

    2014-01-01

    Five fulleropyrrolidines and methanofullerenes, bearing one or two terthiophene moieties, have been prepared in a convenient way and well characterized. These novel fullerene derivatives are characterized by good solubility and by better harvesting of the solar radiation with respect to traditional PCBM. In addition, they have a relatively high LUMO level and a low band gap that can be easily tuned by an adequate design of the link between the fullerene and the terthiophene. Preliminary results show that they are potential acceptors for the creation of efficient bulk-heterojunction solar cells based on donor polymers containing thiophene units.

  10. Sorting chromatic sextupoles for easily and effectively correcting second order chromaticity in the Relativistic Heavy Ion Collider

    Energy Technology Data Exchange (ETDEWEB)

    Luo,Y.; Tepikian, S.; Fischer, W.; Robert-Demolaize, G.; Trbojevic, D.

    2009-01-02

    Based on the contributions of the chromatic sextupole families to the half-integer resonance driving terms, we discuss how to sort the chromatic sextupoles in the arcs of the Relativistic Heavy Ion Collider (RHIC) to easily and effectively correct the second order chromaticities. We propose a method with 4 knobs corresponding to 4 pairs of chromatic sextupole families to online correct the second order chromaticities. Numerical simulation justifies this method, showing that this method reduces the unbalance in the correction strengths of sextupole families and avoids the reversal of sextupole polarities. Therefore, this method yields larger dynamic apertures for the proposed RHIC 2009 100GeV polarized proton run lattices.

  11. Measurement, Interpretation and Information

    Directory of Open Access Journals (Sweden)

    Olimpia Lombardi

    2015-10-01

    During many years since the birth of quantum mechanics, instrumentalist interpretations prevailed: the meaning of the theory was expressed in terms of measurement results. However, in the last decades, several attempts to interpret it from a realist viewpoint have been proposed. Among them, modal interpretations supply a realist non-collapse account, according to which the system always has definite properties and the quantum state represents possibilities, not actualities. But the traditional modal interpretations faced some conceptual problems when addressing imperfect measurements. The modal-Hamiltonian interpretation, on the contrary, proved to be able to supply an adequate account of the measurement problem, both in its ideal and its non-ideal versions. Moreover, in the non-ideal case, it gives a precise criterion to distinguish between reliable and non-reliable measurements. Nevertheless, that criterion depends on the particular state of the measured system, and this might be considered as a shortcoming of the proposal. In fact, one could ask for a criterion of reliability that does not depend on the features of what is measured but only on the properties of the measurement device. The aim of this article is precisely to supply such a criterion: we will adopt an informational perspective for this purpose.

  12. Interpreting uncertainty terms.

    Science.gov (United States)

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  13. Development and validation of a quick easily used biochemical assay for evaluating the viability of small immobile arthropods.

    Science.gov (United States)

    Phillips, Craig B; Iline, Ilia I; Richards, Nicola K; Novoselov, Max; McNeill, Mark R

    2013-10-01

    Quickly, accurately, and easily assessing the efficacy of treatments to control sessile arthropods (e.g., scale insects) and stationary immature life stages (e.g., eggs and pupae) is problematic because it is difficult to tell whether treated organisms are alive or dead. Current approaches usually involve either maintaining organisms in the laboratory to observe them for development, gauging their response to physical stimulation, or assessing morphological characters such as turgidity and color. These can be slow, technically difficult, or subjective, and the validity of methods other than laboratory rearing has seldom been tested. Here, we describe development and validation of a quick easily used biochemical colorimetric assay for measuring the viability of arthropods that is sufficiently sensitive to test even very small organisms such as white fly eggs. The assay was adapted from a technique for staining the enzyme hexokinase to signal the presence of adenosine triphosphate in viable specimens by reducing a tetrazolium salt to formazan. Basic laboratory facilities and skills are required for production of the stain, but no specialist equipment, expertise, or facilities are needed for its use.

  14. Statistical Analysis by Statistical Physics Model for the STOCK Markets

    Science.gov (United States)

    Wang, Tiansong; Wang, Jun; Fan, Bingli

    A new stochastic stock price model of stock markets based on the contact process of statistical physics is presented in this paper. The contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and the Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and the data of the SZSE Component Index are analyzed, and the corresponding simulation is made by computer. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of skewness-kurtosis tests, the Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.
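
    A toy sketch of the kind of fat-tail diagnostics described above: the two-regime return generator below is a crude stand-in for the contact-process dynamics (it is not the authors' model), and the moment formulas are the standard sample skewness and excess kurtosis used in skewness-kurtosis tests.

```python
import math
import random

def sample_skew_kurtosis(xs):
    """Sample skewness and excess kurtosis; excess kurtosis well above
    zero is the usual fat-tail signature relative to a Gaussian."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * var ** 2) - 3.0
    return skew, kurt

def simulate_returns(n, seed=1):
    """Toy two-regime return generator: volatility occasionally jumps,
    a crude stand-in for infection-like spreading of trading activity."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        vol = 0.05 if rng.random() < 0.1 else 0.01
        out.append(rng.gauss(0.0, vol))
    return out
```

Mixing two volatility regimes already produces strong excess kurtosis even though each regime is Gaussian, which is why regime-switching and contact-type models can reproduce fat-tailed index returns.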

  15. Data Interpretation: Using Probability

    Science.gov (United States)

    Drummond, Gordon B.; Vowler, Sarah L.

    2011-01-01

    Experimental data are analysed statistically to allow researchers to draw conclusions from a limited set of measurements. The hard fact is that researchers can never be certain that measurements from a sample will exactly reflect the properties of the entire group of possible candidates available to be studied (although using a sample is often the…

  16. Working with interpreters: how student behavior affects quality of patient interaction when using interpreters

    Directory of Open Access Journals (Sweden)

    Cha-Chi Fung

    2010-06-01

    Background: Despite the prevalence of medical interpreting in the clinical environment, few medical professionals receive training in best practices when using an interpreter. We designed and implemented an educational workshop on using interpreters as part of the cultural competency curriculum for second-year medical students (MSIIs) at the David Geffen School of Medicine at UCLA. The purpose of this study is two-fold: first, to evaluate the effectiveness of the workshop and second, if deficiencies are found, to investigate whether the deficiencies affected the quality of the patient encounter when using an interpreter. Methods: A total of 152 MSIIs completed the 3-hour workshop and a 1-station objective-structured clinical examination 8 weeks later to assess skills. Descriptive statistics and independent sample t-tests were used to assess workshop effectiveness. Results: Based on a passing score of 70%, 39.4% of the class failed. Two skills seemed particularly problematic: assuring confidentiality (missed by 50%) and positioning the interpreter (missed by 70%). While addressing confidentiality did not have a significant impact on standardized patient satisfaction, interpreter position did. Conclusion: Instructing the interpreter to sit behind the patient helps sustain eye contact between clinician and patient, while assuring confidentiality is a tenet of quality clinical encounters. Teaching students and faculty to emphasize both is warranted to improve cross-language clinical encounters.

  17. FIDEA: a server for the functional interpretation of differential expression analysis.

    KAUST Repository

    D'Andrea, Daniel

    2013-06-10

    The results of differential expression analyses provide scientists with hundreds to thousands of differentially expressed genes that need to be interpreted in light of the biology of the specific system under study. This requires mapping the genes to functional classifications that can be, for example, the KEGG pathways or InterPro families they belong to, their GO Molecular Function, Biological Process or Cellular Component. A statistically significant overrepresentation of one or more category terms in the set of differentially expressed genes is an essential step for the interpretation of the biological significance of the results. Ideally, the analysis should be performed by scientists who are well acquainted with the biological problem, as they have a wealth of knowledge about the system and can, more easily than a bioinformatician, discover less obvious and, therefore, more interesting relationships. To allow experimentalists to explore their data in an easy and at the same time exhaustive fashion within a single tool and to test their hypothesis quickly and effortlessly, we developed FIDEA. The FIDEA server is located at http://www.biocomputing.it/fidea; it is free and open to all users, and there is no login requirement.
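
    The overrepresentation test at the core of tools like FIDEA is typically an upper-tail hypergeometric (one-sided Fisher) test; a minimal sketch of that statistic, not FIDEA's actual implementation:

```python
from math import comb

def enrichment_p_value(N, K, n, k):
    """Upper-tail hypergeometric probability P(X >= k): the chance of
    seeing k or more annotated genes among n differentially expressed
    genes, when K of the N genes in the background carry the annotation
    (e.g. a KEGG pathway or GO term)."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total
```

In practice this p-value is computed for every category and then corrected for multiple testing (e.g. Benjamini-Hochberg) before a term is reported as significantly overrepresented.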

  18. Interpretation of the geoid

    Science.gov (United States)

    Runcorn, S. K.

    1985-01-01

    The superposition of the first satellite geoid determined by Iszak upon Ootilla's geoid was based on surface gravity determinations. Good agreement was observed except over the Pacific area of the globe. The poor agreement over the Pacific was interpreted as the result of inadequate observations there. Many geoids were determined from satellite observations, including Doppler measurements. It is found that the geoid is the result of density differences in the mantle maintained since the primeval Earth by its finite strength. Various models based on this assumption are developed.

  19. Formalism and Interpretation in Quantum Theory

    Science.gov (United States)

    Wilce, Alexander

    2010-04-01

    Quantum Mechanics can be viewed as a linear dynamical theory having a familiar mathematical framework but a mysterious probabilistic interpretation, or as a probabilistic theory having a familiar interpretation but a mysterious formal framework. These points of view are usually taken to be somewhat in tension with one another. The first has generated a vast literature aiming at a “realistic” and “collapse-free” interpretation of quantum mechanics that will account for its statistical predictions. The second has generated an at least equally large literature aiming to derive, or at any rate motivate, the formal structure of quantum theory in probabilistically intelligible terms. In this paper I explore, in a preliminary way, the possibility that these two programmes have something to offer one another. In particular, I show that a version of the measurement problem occurs in essentially any non-classical probabilistic theory, and ask to what extent various interpretations of quantum mechanics continue to make sense in such a general setting. I make a start on answering this question in the case of a rudimentary version of the Everett interpretation.

  20. Plastic Surgery Statistics

    Science.gov (United States)

    Plastic surgery procedural statistics from the American Society of Plastic Surgeons, published by year (including 2016 and 2015 reports).

  1. MQSA National Statistics

    Science.gov (United States)

    National statistics from the Mammography Quality Standards Act (MQSA) program; annual Scorecard Statistics are available for 2015-2017, with earlier years archived.

  2. Enhancing the Teaching of Statistics: Portfolio Theory, an Application of Statistics in Finance

    Science.gov (United States)

    Christou, Nicolas

    2008-01-01

    In this paper we present an application of statistics using real stock market data. Most, if not all, students have some familiarity with the stock market (or at least they have heard about it) and therefore can understand the problem easily. It is the real data analysis that students find interesting. Here we explore the building of efficient…
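
    A worked two-asset example of the portfolio theory referred to above: the minimum-variance weight follows from setting the derivative of the portfolio variance to zero. The variances and covariance in the usage below are illustrative numbers, not the stock market data used in the paper.

```python
def min_variance_weight(var1, var2, cov12):
    """Weight on asset 1 in the two-asset minimum-variance portfolio,
    from d/dw [w^2*var1 + (1-w)^2*var2 + 2w(1-w)*cov12] = 0."""
    return (var2 - cov12) / (var1 + var2 - 2.0 * cov12)

def portfolio_variance(w, var1, var2, cov12):
    """Variance of a portfolio with weight w on asset 1, 1-w on asset 2."""
    return w * w * var1 + (1.0 - w) ** 2 * var2 + 2.0 * w * (1.0 - w) * cov12
```

For example, with variances 0.04 and 0.09 and zero covariance, the minimum-variance portfolio puts weight 0.09/0.13 (about 69%) on the lower-variance asset, and its variance is below that of either asset alone.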

  4. Physical interpretation of antigravity

    Science.gov (United States)

    Bars, Itzhak; James, Albin

    2016-02-01

    Geodesic incompleteness is a problem in both general relativity and string theory. The Weyl-invariant Standard Model coupled to general relativity (SM +GR ), and a similar treatment of string theory, are improved theories that are geodesically complete. A notable prediction of this approach is that there must be antigravity regions of spacetime connected to gravity regions through gravitational singularities such as those that occur in black holes and cosmological bang/crunch. Antigravity regions introduce apparent problems of ghosts that raise several questions of physical interpretation. It was shown that unitarity is not violated, but there may be an instability associated with negative kinetic energies in the antigravity regions. In this paper we show that the apparent problems can be resolved with the interpretation of the theory from the perspective of observers strictly in the gravity region. Such observers cannot experience the negative kinetic energy in antigravity directly, but can only detect in and out signals that interact with the antigravity region. This is no different from a spacetime black box for which the information about its interior is encoded in scattering amplitudes for in/out states at its exterior. Through examples we show that negative kinetic energy in antigravity presents no problems of principles but is an interesting topic for physical investigations of fundamental significance.

  5. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
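The comparison described above can be sketched in a few lines. The following is an illustrative stand-in using scikit-learn on synthetic data (the patient counts, feature counts, and penalty strength are all invented, and it is not the paper's NTCP dataset or code): it contrasts an L1-penalized (LASSO-style) logistic model against an effectively unpenalized baseline under repeated cross-validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n, p = 200, 20                       # invented: 200 patients, 20 candidate predictors
X = rng.normal(size=(n, p))
logit = X[:, 0] - 0.8 * X[:, 1]      # only two predictors truly drive complication risk
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
plain = LogisticRegression(C=1e6, max_iter=1000)   # effectively unpenalized baseline

auc_lasso = cross_val_score(lasso, X, y, cv=cv, scoring="roc_auc").mean()
auc_plain = cross_val_score(plain, X, y, cv=cv, scoring="roc_auc").mean()

lasso.fit(X, y)                      # the sparse coefficient vector is what makes
                                     # the L1 model easy to interpret
print(f"AUC (L1): {auc_lasso:.3f}  AUC (unpenalized): {auc_plain:.3f}")
print("non-zero coefficients:", int(np.sum(lasso.coef_ != 0)), "of", p)
```

The repeated cross-validation mirrors the evaluation scheme mentioned in the abstract: the same folds score every candidate learner, giving a fair comparison.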

  6. Application of an easily water-compatible hypercrosslinked polymeric adsorbent for efficient removal of catechol and resorcinol in aqueous solution

    Energy Technology Data Exchange (ETDEWEB)

    Huang Jianhan, E-mail: xiaomeijiangou@yahoo.com.cn [College of Chemistry and Chemical Engineering, Central South University, Changsha 410083 (China); Huang Kelong; Yan Cheng [College of Chemistry and Chemical Engineering, Central South University, Changsha 410083 (China)

    2009-08-15

An easily water-compatible hypercrosslinked resin, HJ-1, was developed in this study for adsorbing catechol and resorcinol from aqueous solution. Its adsorption performance for catechol and resorcinol was investigated in aqueous solution using the commercial Amberlite XAD-4 as a reference. The adsorption kinetic curves were measured, and the adsorption obeyed the pseudo-second-order rate equation of Boyer and Hsu. The adsorption isotherms were determined, and the Freundlich isotherm model characterized the adsorption better. The adsorption thermodynamic parameters were calculated; the adsorption was an exothermic, favorable, and more ordered process. The fact that the adsorption capacity for catechol was larger than that for resorcinol, and that the adsorption enthalpy of catechol was more negative than that of resorcinol, can be explained in terms of the solubility and polarity of the two adsorbates.
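A pseudo-second-order rate model of the kind mentioned above is commonly fitted through its linearized form t/q_t = 1/(k2*qe**2) + t/qe. The sketch below fits invented data points (not the HJ-1 measurements) by linear regression to recover the assumed parameters qe and k2.

```python
import numpy as np

# Invented "true" parameters and sampling times for illustration only
k2_true, qe_true = 0.05, 2.0
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
q = qe_true**2 * k2_true * t / (1 + qe_true * k2_true * t)  # model-generated uptake data

# Linearized form: t/q = 1/(k2*qe^2) + t/qe, so a straight-line fit of t/q vs t
# gives slope = 1/qe and intercept = 1/(k2*qe^2)
slope, intercept = np.polyfit(t, t / q, 1)
qe_fit = 1 / slope
k2_fit = slope**2 / intercept
print(qe_fit, k2_fit)   # recovers the parameters used to generate the data
```

With real kinetic data, the quality of this straight-line fit (its R²) is what justifies the pseudo-second-order description.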

  7. The effect of easily ionized elements Na and K on the performance of pulsed plasma thruster using water propellant

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

In view of the low thrust-to-power ratio caused by the high resistance of a pulsed plasma thruster using water propellant, this paper proposes adding the easily ionized elements Na and K, which have low ionization potentials, to the water propellant to improve its performance. Measurements of the discharge current and plasma emission spectrographic analysis confirm the improvement. The experiments show that the elements Na and K have a definite effect on the performance of the pulsed plasma thruster: in comparison with plain water propellant, the NaCl and KCl water propellants have a lower total resistance and a higher thrust-to-power ratio and specific impulse, with the NaCl water propellant having a slightly stronger effect on the thruster than the KCl. The plasma emission spectrographic analysis is consistent with the discharge-current measurements: the elements Na and K intensify the plasma emission spectrographic signal.

  8. Structural interpretation of seismic data and inherent uncertainties

    Science.gov (United States)

    Bond, Clare

    2013-04-01

Geoscience is perhaps unique in its reliance on incomplete datasets and in building knowledge from their interpretation. This interpretative basis is fundamental to the science at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that make individual practitioners effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive abilities in analysing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placements but also differences in whether interpreters thought the faults existed at all, or in the sense of movement they assigned to them. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations; experts are successful because of their application of these techniques. In a new set of experiments, a small number of experts were studied closely to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes.
The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with

  9. How to use and interpret hormone ratios.

    Science.gov (United States)

    Sollberger, Silja; Ehlert, Ulrike

    2016-01-01

    Hormone ratios have become increasingly popular throughout the neuroendocrine literature since they offer a straightforward way to simultaneously analyze the effects of two interdependent hormones. However, the analysis of ratios is associated with statistical and interpretational concerns which have not been sufficiently considered in the context of endocrine research. The aim of this article, therefore, is to demonstrate and discuss these issues, and to suggest suitable ways to address them. In a first step, we use exemplary testosterone and cortisol data to illustrate that one major concern of ratios lies in their distribution and inherent asymmetry. As a consequence, results of parametric statistical analyses are affected by the ultimately arbitrary decision of which way around the ratio is computed (i.e., A/B or B/A). We suggest the use of non-parametric methods as well as the log-transformation of hormone ratios as appropriate methods to deal with these statistical problems. However, in a second step, we also discuss the complicated interpretation of ratios, and propose moderation analysis as an alternative and oftentimes more insightful approach to ratio analysis. In conclusion, we suggest that researchers carefully consider which statistical approach is best suited to investigate reciprocal hormone effects. With regard to the hormone ratio method, further research is needed to specify what exactly this index reflects on the biological level and in which cases it is a meaningful variable to analyze.
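The asymmetry problem discussed above is easy to demonstrate numerically. The toy data below are simulated lognormal hormone values, not the article's testosterone and cortisol measurements: summary statistics of A/B and B/A are not simple reciprocals of each other, whereas log(A/B) = -log(B/A), so after log-transformation the direction of the ratio no longer matters.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated, roughly lognormal hormone values (arbitrary units)
testosterone = rng.lognormal(mean=2.0, sigma=0.4, size=50)
cortisol = rng.lognormal(mean=1.0, sigma=0.5, size=50)

r1 = testosterone / cortisol          # ratio computed one way around...
r2 = cortisol / testosterone          # ...and the other way around
print(np.mean(r1), 1 / np.mean(r2))   # not equal: mean(A/B) != 1 / mean(B/A)

lr1, lr2 = np.log(r1), np.log(r2)
print(np.allclose(lr1, -lr2))         # log-ratios are exactly symmetric
```

This is why the article recommends log-transforming hormone ratios (or using non-parametric methods) before parametric analysis.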

  10. Functional interpretation and inductive definitions

    CERN Document Server

    Avigad, Jeremy

    2008-01-01

Extending Gödel's Dialectica interpretation, we provide a functional interpretation of classical theories of positive arithmetic inductive definitions, reducing them to theories of finite-type functionals defined using transfinite recursion on well-founded trees.

  11. Generic interpreters and microprocessor verification

    Science.gov (United States)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  12. What Language Do Interpreters Speak?

    Science.gov (United States)

    Parks, Gerald B.

    1982-01-01

    States that both the register and variety of an interpreter's speech are quite limited and analyzes the linguistic characteristics of "International English," the English used by interpreters at international conferences. (CFM)

  13. Discussion of "interpretation and play".

    Science.gov (United States)

    Pick, Irma Brenman

    2011-01-01

    This discussion addresses the conflict in technique between play versus interpretation. It further considers how the nature of the interpretation may be affected by a consideration of what is being projected into the analyst.

  14. Confidence Intervals: From tests of statistical significance to confidence intervals, range hypotheses and substantial effects

    Directory of Open Access Journals (Sweden)

    Dominic Beaulieu-Prévost

    2006-03-01

Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles demonstrated that this method was problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to an NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
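The three-value logic described above can be sketched as a comparison between a confidence interval and a minimal substantial effect size. The function and all numbers below are invented for illustration (a one-sided positive-effect setting is assumed), not taken from the article's example.

```python
def ci_verdict(ci_low: float, ci_high: float, delta: float) -> str:
    """Classify an effect from its CI and a minimal substantial effect delta."""
    if ci_low >= delta:                          # whole CI above the threshold
        return "substantial effect probably present"
    if ci_high <= delta:                         # whole CI below the threshold
        return "substantial effect probably absent"
    return "probabilistic undetermination"       # CI straddles the threshold

print(ci_verdict(0.45, 0.90, delta=0.30))
print(ci_verdict(0.05, 0.20, delta=0.30))
print(ci_verdict(0.10, 0.60, delta=0.30))
```

The key design choice, as the abstract notes, is that delta must be fixed in advance from substantive (contextual) considerations, not from the data.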

  15. Multicomponent statistical analysis to identify flow and transport processes in a highly-complex environment

    Science.gov (United States)

    Moeck, Christian; Radny, Dirk; Borer, Paul; Rothardt, Judith; Auckenthaler, Adrian; Berg, Michael; Schirmer, Mario

    2016-11-01

A combined approach of multivariate statistical analysis, namely factor analysis (FA) and hierarchical cluster analysis (HCA), interpretation of geochemical processes, stable water isotope data, and organic micropollutants was used to assess spatial patterns of water types in a study area in Switzerland, where drinking water production lies close to several potential input pathways for contamination. To avoid drinking water contamination, artificial groundwater recharge of surface water into an aquifer is used to create a hydraulic barrier between the potential contamination pathways and the drinking water extraction wells. Inter-aquifer mixing is identified in the subsurface, where a large proportion of artificially infiltrated surface water mixes with a smaller amount of water originating from the regional flow path in the vicinity of the drinking water extraction wells. The spatial distribution of the different water types can be estimated, and a conceptual system understanding is developed. The results of the multivariate statistical analysis are consistent with the information gained from the isotope data and the organic micropollutant analyses. The integrated approach, using different kinds of observations, can easily be transferred to a variety of hydrological settings to synthesise and evaluate large hydrochemical datasets, and combining additional data with different information content enabled effective interpretation of hydrological processes. The approach leads to a sounder conceptual understanding of the system, which is the basis for developing improved water resources management practices in a sustainable way.
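The FA-plus-HCA workflow described above can be sketched schematically with scikit-learn and SciPy. Everything below is synthetic: the two "water types", the four hydrochemical variables, and their means are invented to show the mechanics, not the Swiss dataset.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Two invented water types, each described by four hydrochemical variables
infiltrated = rng.normal([10.0, 5.0, 1.0, 0.5], 0.5, size=(30, 4))
regional = rng.normal([40.0, 20.0, 8.0, 3.0], 0.5, size=(30, 4))
samples = StandardScaler().fit_transform(np.vstack([infiltrated, regional]))

# FA condenses the correlated variables into two factors; HCA (Ward linkage)
# then groups the samples by their factor scores
scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(samples)
labels = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(labels)   # the two clusters recover the two water types
```

In the study itself the cluster assignments are then cross-checked against isotope and micropollutant evidence rather than taken at face value.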

  16. Interpretation of Helioseismic Traveltimes

    CERN Document Server

    Burston, Raymond; Birch, Aaron C

    2015-01-01

Time-distance helioseismology uses cross-covariances of wave motions on the solar surface to determine the travel times of wave packets moving from one surface location to another. We review the methodology to interpret travel-time measurements in terms of small, localized perturbations to a horizontally homogeneous reference solar model. Using the first Born approximation, we derive and compute 3D travel-time sensitivity (Fréchet) kernels for perturbations in sound-speed, density, pressure, and vector flows. While kernels for sound speed and flows had been computed previously, here we extend the calculation to kernels for density and pressure, hence providing a complete description of the effects of solar dynamics and structure on travel times. We treat three thermodynamic quantities as independent and do not assume hydrostatic equilibrium. We present a convenient approach to computing damped Green's functions using a normal-mode summation. The Green's function must be computed on a wavenumber grid that ha...

  17. Interpretability in Linear Brain Decoding

    OpenAIRE

    Kia, Seyed Mostafa; Passerini, Andrea

    2016-01-01

    Improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of brain decoding models. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, we present a simple definition for interpretability of linear brain decoding models. Then, we propose to combine the...

  18. The interpretation of administrative contracts

    Directory of Open Access Journals (Sweden)

    Cătălin-Silviu SĂRARU

    2014-06-01

    Full Text Available The article analyzes the principles of interpretation for administrative contracts, in French law and in Romanian law. In the article are highlighted derogations from the rules of contract interpretation in common law. Are examined the exceptions to the principle of good faith, the principle of common intention (willingness of the parties, the principle of good administration, the principle of extensive interpretation of the administrative contract. The article highlights the importance and role of the interpretation in administrative contracts.

  19. Interpreting social enterprises

    Directory of Open Access Journals (Sweden)

    Carlo Borzaga

    2012-09-01

    Full Text Available Institutional and organizational variety is increasingly characterizing advanced economic systems. While traditional economic theories have focused almost exclusively on profit-maximizing (i.e., for-profit enterprises and on publicly-owned organizations, the increasing relevance of non-profit organizations, and especially of social enterprises, requires scientists to reflect on a new comprehensive economic approach for explaining this organizational variety. This paper examines the main limitations of the orthodox and institutional theories and asserts the need for creating and testing a new theoretical framework, which considers the way in which diverse enterprises pursue their goals, the diverse motivations driving actors and organizations, and the different learning patterns and routines within organizations. The new analytical framework proposed in the paper draws upon recent developments in the theories of the firm, mainly of an evolutionary and behavioral kind. The firm is interpreted as a coordination mechanism of economic activity, and one whose objectives need not coincide with profit maximization. On the other hand, economic agents driven by motivational complexity and intrinsic, non-monetary motivation play a crucial role in forming firm activity over and above purely monetary and financial objectives. The new framework is thought to be particularly suitable to correctly interpret the emergence and role of nontraditional organizational and ownership forms that are not driven by the profit motive (non-profit organizations, mainly recognized in the legal forms of cooperative firms, non-profit organizations and social enterprises. A continuum of organizational forms ranging from profit making activities to public benefit activities, and encompassing mutual benefit organizations as its core constituent, is envisaged and discussed.

  20. Cytological artifacts masquerading interpretation

    Science.gov (United States)

    Sahay, Khushboo; Mehendiratta, Monica; Rehani, Shweta; Kumra, Madhumani; Sharma, Rashi; Kardam, Priyanka

    2013-01-01

Background: Cytological artifacts are important to learn about, because an error in routine laboratory practice can produce an erroneous result. Aims: The aim of this study was to analyze the effects of delayed fixation, and of morphological discrepancies created by deliberate addition of extraneous factors, on the interpretation and/or diagnosis of an oral cytosmear. Materials and Methods: A prospective study was carried out using Papanicolaou and hematoxylin and eosin-stained oral smears, 6 each from 66 volunteer dental students, with deliberate variation in fixation delay timings, with and without changes in temperature, undue pressure while smear making, and intentional addition of contaminants. The fixation delay at room temperature was carried out at an interval of every 30 minutes, 1 day and 1 week and was continued till the end of 1 day, 1 week, and 1 month, respectively. The temperature variations included 60 to 70°C and 3 to 4°C. Results: Light microscopically, the effect of delayed fixation at room temperature appeared first on the cytoplasm within the first 2 hours and then on the nucleus by the 4th day, until complete cytoplasmic degeneration on the 23rd day. Delayed fixation at variable temperature brought faster degenerative changes at higher temperature than at lower temperature. The effect of extraneous factors revealed some interesting facts. Conclusions: In order to justify a cytosmear interpretation, a cytologist must be well acquainted with delayed-fixation-induced cellular changes and the microscopic appearances of common contaminants, so as to support better prognosis and therapy. PMID:24648667

  1. "All-In-One Test"(AI1): A rapid and easily applicable approach to consumer product testing

    DEFF Research Database (Denmark)

    Giacalone, Davide; Frøst, Michael Bom; Bredie, Wender Laurentius Petrus

    2013-01-01

    by the Check-All-That-Apply (CATA) technique is reported. In this exploratory "All-In-One Test"(AI1), subjects (N = 160) filled out a questionnaire with demographic and psychographic variables, and appropriateness ratings for specific sensory descriptors of beer. Subsequently, subjects gave hedonic ratings...... between appropriateness and actual hedonic response. Overall, the AI1 test provided interpretable results concerning consumer perception (sensory/hedonic) of the beers, and revealed relations with consumers' background information. Initial results with AI1 test show that it is an efficient and versatile...

  2. Predict! Teaching Statistics Using Informational Statistical Inference

    Science.gov (United States)

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  3. Pragmatics in Court Interpreting: Additions

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    2003-01-01

    Danish court interpreters are expected to follow ethical guidelines, which instruct them to deliver exact verbatim versions of source texts. However, this requirement often clashes with the reality of the interpreting situation in the courtroom. This paper presents and discusses the findings of a...... of an investigation regarding one kind of interpreter modification in particular: additions. The investigation was undertaken for a doctoral thesis....

  4. Combination and interpretation of observables in Cosmology

    Directory of Open Access Journals (Sweden)

    Virey Jean-Marc

    2010-04-01

Full Text Available The standard cosmological model has deep theoretical foundations but needs the introduction of two major unknown components, dark matter and dark energy, to be in agreement with various observations. Dark matter describes a non-relativistic collisionless fluid of (non-baryonic) matter which amounts to 25% of the total density of the universe. Dark energy is a new kind of fluid, not of matter type, representing 70% of the total density, which should explain the recent acceleration of the expansion of the universe. Alternatively, one can reject the idea of adding one or two new components and argue instead that the equations used to make the interpretation should be modified on cosmological scales. Instead of dark matter, one can invoke a failure of Newton's laws. Instead of dark energy, two approaches are proposed: either general relativity (in terms of the Einstein equation) should be modified, or the cosmological principle which fixes the metric used for cosmology should be abandoned. One of the main objectives of the community is to find the path to the relevant interpretations thanks to the next generation of experiments, which should provide large statistics of observational data. Unfortunately, cosmological information is difficult to pin down directly from the measurements, and it is mandatory to combine the various observables to get the cosmological parameters. This is not problematic from the statistical point of view, but assumptions and approximations made in the analysis may bias our interpretation of the data. Consequently, strong attention should be paid to the statistical methods used for parameter estimation and model testing. After a review of the basics of cosmology, where the cosmological parameters are introduced, we discuss the various cosmological probes and their associated observables used to extract cosmological information.
We present the results obtained from several statistical analyses combining data of different nature but
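The simplest instance of combining observables, as discussed above, is inverse-variance weighting of two independent Gaussian measurements of the same parameter. The numbers below are invented for illustration and do not correspond to any real cosmological dataset.

```python
import numpy as np

# Two hypothetical independent measurements of one cosmological parameter
est = np.array([0.70, 0.74])
sigma = np.array([0.03, 0.02])

w = 1 / sigma**2                              # inverse-variance weights
combined = np.sum(w * est) / np.sum(w)        # weighted mean
combined_sigma = np.sqrt(1 / np.sum(w))       # tighter than either input
print(combined, combined_sigma)
```

Real combinations are harder precisely for the reasons the abstract gives: the measurements are rarely independent Gaussians in the parameters of interest, so the weighting assumptions themselves can bias the interpretation.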

  5. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May be More Easily Altered than Others

    Science.gov (United States)

    Mesci, Gunkut; Schwartz, Renee'S.

    2017-04-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.

  6. Easily accessible polymer additives for tuning the crystal-growth of perovskite thin-films for highly efficient solar cells.

    Science.gov (United States)

    Dong, Qingqing; Wang, Zhaowei; Zhang, Kaicheng; Yu, Hao; Huang, Peng; Liu, Xiaodong; Zhou, Yi; Chen, Ning; Song, Bo

    2016-03-14

    For perovskite solar cells (Pero-SCs), one of the key issues with respect to the power conversion efficiency (PCE) is the morphology control of the perovskite thin-films. In this study, an easily-accessible additive polyethylenimine (PEI) is utilized to tune the morphology of CH3NH3PbI3-xClx. With addition of 1.00 wt% of PEI, the smoothness and crystallinity of the perovskite were greatly improved, which were characterized by scanning electron microscopy (SEM) and X-ray diffraction (XRD). A summit PCE of 14.07% was achieved for the p-i-n type Pero-SC, indicating a 26% increase compared to those of the devices without the additive. Both photoluminescence (PL) and alternating current impedance spectroscopy (ACIS) analyses confirm the efficiency results after the addition of PEI. This study provides a low-cost polymer additive candidate for tuning the morphology of perovskite thin-films, and might be a new clue for the mass production of Pero-SCs.

  7. Changing Preservice Science Teachers' Views of Nature of Science: Why Some Conceptions May be More Easily Altered than Others

    Science.gov (United States)

    Mesci, Gunkut; Schwartz, Renee'S.

    2016-02-01

    The purpose of this study was to assess preservice teachers' views of Nature of Science (NOS), identify aspects that were challenging for conceptual change, and explore reasons why. This study particularly focused on why and how some concepts of NOS may be more easily altered than others. Fourteen preservice science teachers enrolled in a NOS and Science Inquiry course participated in this study. Data were collected by using a pre/post format with the Views of Nature of Science questionnaire (VNOS-270), the Views of Scientific Inquiry questionnaire (VOSI-270), follow-up interviews, and classroom artifacts. The results indicated that most students initially held naïve views about certain aspects of NOS like tentativeness and subjectivity. By the end of the semester, almost all students dramatically improved their understanding about almost all aspects of NOS. However, several students still struggled with certain aspects like the differences between scientific theory and law, tentativeness, and socio-cultural embeddedness. Results suggested that instructional, motivational, and socio-cultural factors may influence if and how students changed their views about targeted NOS aspects. Students thought that classroom activities, discussions, and readings were most helpful to improve their views about NOS. The findings from the research have the potential to translate as practical advice for teachers, science educators, and future researchers.

  8. Changing interpretations of Plotinus

    DEFF Research Database (Denmark)

    Catana, Leo

    2013-01-01

’ writings relatively late, in the 18th and 19th centuries, and that it was primarily made possible by Brucker’s methodology for history of philosophy, dating from the 1740s, in which the concept system of philosophy was essential. It is observed that the concept was absent in Ficino’s commentary from the 15th century, and that it remained absent in interpretative works produced between the 15th and 18th century. It is also argued that it is erroneous to assume that Plotinus presented a system of philosophy, or intended to do so — we do not find this concept in Plotinus’ writings, and his own statements...... about method point in other directions. Eduard Zeller (active in the second half of the 19th century) is typically regarded as the first who gave a satisfying account of Plotinus’ philosophy as a whole. In this article, on the other hand, Zeller is seen as the one who finalised a tradition initiated

  9. Audiometry screening and interpretation.

    Science.gov (United States)

    Walker, Jennifer Junnila; Cleveland, Leanne M; Davis, Jenny L; Seales, Jennifer S

    2013-01-01

    The prevalence of hearing loss varies with age, affecting at least 25 percent of patients older than 50 years and more than 50 percent of those older than 80 years. Adolescents and young adults represent groups in which the prevalence of hearing loss is increasing and may therefore benefit from screening. If offered, screening can be performed periodically by asking the patient or family if there are perceived hearing problems, or by using clinical office tests such as whispered voice, finger rub, or audiometry. Audiometry in the family medicine clinic setting is a relatively simple procedure that can be interpreted by a trained health care professional. Pure-tone testing presents tones across the speech spectrum (500 to 4,000 Hz) to determine if the patient's hearing levels fall within normal limits. A quiet testing environment, calibrated audiometric equipment, and appropriately trained personnel are required for in-office testing. Pure-tone audiometry may help physicians appropriately refer patients to an audiologist or otolaryngologist. Unilateral or asymmetrical hearing loss can be symptomatic of a central nervous system lesion and requires additional evaluation.

  10. [How to interpret hypercalcitoninemia?].

    Science.gov (United States)

    Levy-Bohbot, N; Patey, M; Larbre, H; Hecart, A-C; Caron, J; Delemer, B

    2006-08-01

    Today, the calcitonin assay is used for the diagnosis of thyroid medullary cancer in the context of nodular thyroid disease. Calcitonin is an excellent marker of thyroid medullary cancer, but hypercalcitoninemia can also be related to other conditions, such as renal failure, endocrine tumors other than thyroid medullary cancer, and sometimes C-cell hyperplasia, which is a poorly defined situation. Recent studies have contributed to defining calcitoninemia thresholds, which guide decisions and avoid excessive invasive treatment. After a brief reminder of the physiological role of calcitonin and of its assays, the difficulties encountered in interpreting hypercalcitoninemia, and its potential causes other than thyroid medullary cancer, are addressed. Recent studies on large series now allow better knowledge of the specificity and sensitivity of calcitonin measurement in patients with nodular thyroid disease, and a well-argued management. In the future, calcitonin measurement will be ordered even more frequently, as some authors recommend it for the diagnosis of thyroid nodules. It is up to us to know how to use this remarkable marker, by considering all possible situations of benign hypercalcitoninemia and reserving aggressive treatments for patients who really need them.

  11. Mixed-handed persons are more easily persuaded and are more gullible: interhemispheric interaction and belief updating.

    Science.gov (United States)

    Christman, Stephen D; Henning, Bradley R; Geers, Andrew L; Propper, Ruth E; Niebauer, Christopher L

    2008-09-01

    Research has shown that persons with mixed hand preference (i.e., who report using their non-dominant hand for at least some manual activities) display an increased tendency to update beliefs in response to information inconsistent with those beliefs. This has been interpreted as reflecting the fact that the left hemisphere maintains our current beliefs while the right hemisphere evaluates and updates those beliefs when appropriate. Belief evaluation is thus dependent on interhemispheric interaction, and mixed-handedness is associated with increased interhemispheric interaction. In Experiment 1 mixed-handers exhibited higher levels of persuasion in a standard attitude-change paradigm, while in Experiment 2 mixed-handers exhibited higher levels of gullibility as measured by the Barnum Effect.

  12. Statistical mechanics of combinatorial auctions

    Science.gov (United States)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-05-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  13. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  14. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon; exact statistics are not available for this type of tumor.

  15. Blood Facts and Statistics

    Science.gov (United States)

    Topics covered: facts about blood needs, facts about the blood supply, and blood components, including whole blood and red blood cells.

  16. Algebraic statistics computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

    Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with a new application to coherent systems in reliability and two-level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  17. A New Interpretation to The Quantum Mechanics

    CERN Document Server

    Feng, Yulei

    2012-01-01

    In this paper, we try to give a new interpretation of quantum mechanics from the point of view of (non-relativistic) quantum field theory. After field quantization, we obtain the Heisenberg equations for the momentum and coordinate operators of the particles excited from the (Schrödinger) field. We then ground the probability concepts of quantum mechanics in a statistical ensemble, realizing the ensemble interpretation. With these, we make a series of conceptual modifications to standard quantum mechanics, especially the quantum measurement theory; in the end, we try to resolve the EPR paradox with the use of our new ideas. In the appendix, we also give a field-theoretical description of the double-slit interference experiment, obtaining the particle number distribution.

  18. Problems with the concept 'interpretation'.

    Science.gov (United States)

    Paniagua, Cecilio

    2003-10-01

    Consensus on the conceptualisation of 'interpretation', the most characteristic feature of psychoanalytic technique, has proven elusive. Attempts to make the meaning of this term precise are reviewed, and the roles of intuition and suggestion in interpretation are commented upon. There seem to be polarities in interpreting styles. It is the author's contention that these are mostly contingent on the practitioner's adherence to the topographical or the structural model of the mind. The tendency to interpret deeply unconscious elements corresponds to pre-structural technique, whereas the tendency to direct the patient's attention to preconscious manifestations is characteristic of the structural orientation. Clinical material is provided to illustrate the divergence of the underlying theories of technique. The topographical interpreting of Freud and his early followers differs from the interpreting used in contemporary structural technique. 'Deep' interpreting approaches continue to be used side by side with clarification-like interpretations, and the reasons for this coexistence are examined. There are powerful motivations for adherence to pre-structural interpreting: it seems to gratify the analysand's dependency wishes and the analyst's narcissism more directly, and it provides a less sublimated satisfaction of epistemophilic drives. Keeping the concept of 'interpretation' ill defined facilitates the application of the topographical technique, with its irrational gratifications.

  19. Efficient degradation of carbamazepine by easily recyclable microscaled CuFeO{sub 2} mediated heterogeneous activation of peroxymonosulfate

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yaobin, E-mail: yaobinding@mail.scuec.edu.cn [Key Laboratory of Catalysis and Materials Science of the State Ethnic Affairs Commission and Ministry of Education, College of Resources and Environmental Science, South-Central University for Nationalities, Wuhan 430074 (China); Tang, Hebin [College of Pharmacy, South-Central University for Nationalities, Wuhan 430074 (China); Zhang, Shenghua; Wang, Songbo [Key Laboratory of Catalysis and Materials Science of the State Ethnic Affairs Commission and Ministry of Education, College of Resources and Environmental Science, South-Central University for Nationalities, Wuhan 430074 (China); Tang, Heqing, E-mail: tangheqing@mail.scuec.edu.cn [Key Laboratory of Catalysis and Materials Science of the State Ethnic Affairs Commission and Ministry of Education, College of Resources and Environmental Science, South-Central University for Nationalities, Wuhan 430074 (China)

    2016-11-05

    Highlights: • CuFeO{sub 2} microparticles were prepared by a microwave-assisted hydrothermal method. • CuFeO{sub 2} microparticles efficiently catalyzed the activation of peroxymonosulfate. • Quenching experiments confirmed sulfate radicals as the major reactive radicals. • Carbamazepine was rapidly degraded by micro-CuFeO{sub 2}/peroxymonosulfate. • The feasibility of CuFeO{sub 2}/peroxymonosulfate was tested for the treatment of actual water. - Abstract: Microscaled CuFeO{sub 2} particles (micro-CuFeO{sub 2}) were rapidly prepared via a microwave-assisted hydrothermal method and characterized by scanning electron microscopy, X-ray powder diffraction and X-ray photoelectron spectroscopy. The micro-CuFeO{sub 2} was found to be phase-pure, with a rhombohedral structure and a particle size of 2.8 ± 0.6 μm. The micro-CuFeO{sub 2} efficiently catalyzed the activation of peroxymonosulfate (PMS) to generate sulfate radicals (SO{sub 4}·−), causing fast degradation of carbamazepine (CBZ). The catalytic activity of micro-CuFeO{sub 2} was observed to be 6.9 and 25.3 times that of micro-Cu{sub 2}O and micro-Fe{sub 2}O{sub 3}, respectively. The enhanced activity of micro-CuFeO{sub 2} for the activation of PMS was attributed to the synergistic effect of surface-bonded Cu(I) and Fe(III). Sulfate radical was the primary radical species responsible for CBZ degradation. As a microscaled catalyst, micro-CuFeO{sub 2} could be easily recovered by gravity settlement and exhibited improved catalytic stability compared with micro-Cu{sub 2}O over five successive degradation cycles. Oxidative degradation of CBZ by the PMS/CuFeO{sub 2} couple was effective in the actual aqueous environmental systems studied.

  20. Transport of sewage molecular markers through saturated soil column and effect of easily biodegradable primary substrate on their removal.

    Science.gov (United States)

    Foolad, Mahsa; Ong, Say Leong; Hu, Jiangyong

    2015-11-01

    Pharmaceutical and personal care products (PPCPs) and artificial sweeteners (ASs) are emerging organic contaminants (EOCs) in the aquatic environment. The presence of PPCPs and ASs in water bodies poses a potential ecological risk and a health concern. It is therefore necessary to identify pollution sources by understanding the transport behavior of sewage molecular markers in the subsurface. The aim of this study was to evaluate the transport of nine selected molecular markers through saturated soil column experiments. The selected sewage molecular markers were six PPCPs, namely acetaminophen (ACT), carbamazepine (CBZ), caffeine (CF), crotamiton (CTMT), diethyltoluamide (DEET) and salicylic acid (SA), and three ASs, namely acesulfame (ACF), cyclamate (CYC) and saccharin (SAC). Results confirmed that ACF, CBZ, CTMT, CYC and SAC were suitable for use as sewage molecular markers, since they were nearly stable against sorption and biodegradation during the soil column experiments. In contrast, transport of ACT, CF and DEET was limited by both sorption and biodegradation, and 100% removal efficiency was achieved in the biotic column. Moreover, this study also examined the effect of different acetate concentrations (0-100 mg/L), as an easily biodegradable primary substrate, on the removal of PPCPs and ASs. Results showed a negative correlation (r² > 0.75) between acetate concentration and the removal of several of the selected sewage chemical markers, including ACF, CF, ACT, CYC and SAC. CTMT removal also decreased with the addition of acetate, but increasing the acetate concentration further did not affect its removal. CBZ and DEET removal did not depend on the presence of acetate. Copyright © 2015 Elsevier Ltd. All rights reserved.
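
    The correlations reported above are of the ordinary least-squares kind. As a generic illustration (the acetate concentrations and removal efficiencies below are hypothetical, not the study's data), the coefficient of determination r² for a linear fit can be computed as:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a least-squares line y ~ a + b*x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b, a = np.polyfit(x, y, 1)            # slope, intercept
    resid = y - (a + b * x)
    ss_res = np.sum(resid ** 2)           # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical acetate doses (mg/L) and marker removal efficiencies (%):
# removal falls as acetate rises, i.e. a negative correlation
acetate = [0, 20, 40, 60, 80, 100]
removal = [95, 80, 68, 52, 40, 28]
print(round(r_squared(acetate, removal), 3))  # → 0.998
```

    A high r² with a negative slope, as in this sketch, is what the abstract's "negative correlation (r² > 0.75)" expresses.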

  1. Input of easily available organic C and N stimulates microbial decomposition of soil organic matter in arctic permafrost soil.

    Science.gov (United States)

    Wild, Birgit; Schnecker, Jörg; Alves, Ricardo J Eloy; Barsukov, Pavel; Bárta, Jiří; Capek, Petr; Gentsch, Norman; Gittel, Antje; Guggenberger, Georg; Lashchinskiy, Nikolay; Mikutta, Robert; Rusalimova, Olga; Santrůčková, Hana; Shibistova, Olga; Urich, Tim; Watzka, Margarete; Zrazhevskaya, Galina; Richter, Andreas

    2014-08-01

    Rising temperatures in the Arctic can affect soil organic matter (SOM) decomposition directly and indirectly, by increasing plant primary production and thus the allocation of plant-derived organic compounds into the soil. Such compounds, for example root exudates or decaying fine roots, are easily available for microorganisms, and can alter the decomposition of older SOM ("priming effect"). We here report on a SOM priming experiment in the active layer of a permafrost soil from the central Siberian Arctic, comparing responses of organic topsoil, mineral subsoil, and cryoturbated subsoil material (i.e., poorly decomposed topsoil material subducted into the subsoil by freeze-thaw processes) to additions of (13)C-labeled glucose, cellulose, a mixture of amino acids, and protein (added at levels corresponding to approximately 1% of soil organic carbon). SOM decomposition in the topsoil was barely affected by higher availability of organic compounds, whereas SOM decomposition in both subsoil horizons responded strongly. In the mineral subsoil, SOM decomposition increased by a factor of two to three after any substrate addition (glucose, cellulose, amino acids, protein), suggesting that the microbial decomposer community was limited in energy to break down more complex components of SOM. In the cryoturbated horizon, SOM decomposition increased by a factor of two after addition of amino acids or protein, but was not significantly affected by glucose or cellulose, indicating nitrogen rather than energy limitation. Since the stimulation of SOM decomposition in cryoturbated material was not connected to microbial growth or to a change in microbial community composition, the additional nitrogen was likely invested in the production of extracellular enzymes required for SOM decomposition. 
Our findings provide a first mechanistic understanding of priming in permafrost soils and suggest that an increase in the availability of organic carbon or nitrogen, e.g., by increased plant

  2. Semantic interpretation of Compositional Logic in Instantiation Space

    Institute of Scientific and Technical Information of China (English)

    SU Kaile; XIAO Yinyin; CHEN Qingliang; LIN Han

    2007-01-01

    Formal methods for security protocols guarantee the security properties of protocols. Instantiation Space Logic is a new security protocol logic with strong expressive power. Compositional Logic is also a useful security protocol logic. This paper analyzes the relationship between these two logics and interprets the semantics of Compositional Logic in the Instantiation Space model. Through our work, the interpreted Compositional Logic can be extended more easily. Moreover, security protocols described in Compositional Logic can be automatically verified by the verifier of Instantiation Space. The paper also proves that the expressive power of Instantiation Space Logic, which cannot be completely interpreted by Compositional Logic, is stronger than that of Compositional Logic.

  3. Orientalismi: nuove prospettive interpretative

    Directory of Open Access Journals (Sweden)

    Gabriele Proglio

    2012-11-01

    This paper aims to reconsider the concept of Orientalism from a new and multiple perspective, and to propose a different interpretation of the relationship between culture and power, starting from Edward Said's theoretical frame of reference. If Said's representational model is repositioned outside structuralist and Foucauldian frameworks and separated from the Gramscian idea of hegemony-subordination, it may indeed be possible to re-discuss the traditional profile identifying the Other in European cultures. My basic assumption here is that Orientalism should not be understood as a consensus mechanism able to produce diversified images of the Orient and the Oriental on demand. Although, of course, in most cases Orientalism is connected to the issue of power, its meanings could also be explained otherwise, as will be shown. Take Invisible Cities by Italo Calvino as an example: here the narratives are not just multiple repetitions of Venice (in Said's case, the same would hold for Europeanism), but could be strategically re-appropriated by those "others" and "alterities" whose bodies and identities are imposed by the Eurocentric discourse. In this sense, a double link may be identified with queer theories and postcolonial studies, and the notion of subordination is rethought. Finally, from the above-mentioned borders a new idea of image emerges, one that appears linear, uniform and flattened only to the European gaze, whereas in actual fact it is made of imaginaries and forms of knowledge that combine representation with the conceptualization of power relationships.

  4. Statistical Methods for Material Characterization and Qualification

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, A.K.

    2005-04-01

    This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.
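
    The lot-parameter inference described above can be illustrated with a standard calculation of this kind (a sketch only, not the AGR program's actual acceptance procedure): a one-sided Clopper-Pearson upper confidence bound on a lot's defect fraction, computed from the number of defects found in a random sample.

```python
from math import comb

def binom_cdf(x, n, p):
    """P(X <= x) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

def upper_bound_defect_rate(x, n, conf=0.95, tol=1e-9):
    """One-sided Clopper-Pearson upper confidence bound on the lot defect
    fraction, given x defects observed in a random sample of size n.
    Solves P(X <= x; n, p) = 1 - conf for p by bisection (the binomial
    CDF is decreasing in p)."""
    if x >= n:
        return 1.0
    alpha = 1 - conf
    lo, hi = x / n, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binom_cdf(x, n, mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Zero defects in a random sample of 50: with 95% confidence the lot
# defect fraction is below about 5.8% (cf. the "rule of three", 3/50 = 6%).
print(round(upper_bound_defect_rate(0, 50), 4))  # → 0.0582
```

    A lot would then be accepted if this upper bound falls below the specified defect limit; the sample sizes and limits themselves come from the program's acceptance criteria, which are not reproduced here.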

  5. Statistical methods for material characterization and qualification

    Energy Technology Data Exchange (ETDEWEB)

    Hunn, John D [ORNL; Kercher, Andrew K [ORNL

    2005-01-01

    This document describes a suite of statistical methods that can be used to infer lot parameters from the data obtained from inspection/testing of random samples taken from that lot. Some of these methods will be needed to perform the statistical acceptance tests required by the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program. Special focus has been placed on proper interpretation of acceptance criteria and unambiguous methods of reporting the statistical results. In addition, modified statistical methods are described that can provide valuable measures of quality for different lots of material. This document has been written for use as a reference and a guide for performing these statistical calculations. Examples of each method are provided. Uncertainty analysis (e.g., measurement uncertainty due to instrumental bias) is not included in this document, but should be considered when reporting statistical results.

  6. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the area of robust statistics, presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12-16 January 2015. The book explores the applicability of robust methods in non-traditional areas, including the use of new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  9. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    (*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  10. STATISTICS IN SERVICE QUALITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Dragana Gardašević

    2012-09-01

    For any quality evaluation in sports, science, education and so on, it is useful to collect data in order to construct a strategy for improving the quality of the services offered to the user. For this purpose, we use statistical software packages to process the collected data in order to increase customer satisfaction. The principle is demonstrated with the example of student satisfaction ratings at Belgrade Polytechnic, where students, as users, rate the quality of the institution. Here the emphasis is on statistical analysis as a tool for quality control aimed at improvement, rather than on the interpretation of results. The above can therefore be used as a model in sport to improve overall results.

  11. Practical statistics for nursing and health care

    CERN Document Server

    Fowler, Jim; Chevannes, Mel

    2002-01-01

    Nursing is a growing area of higher education, in which an introduction to statistics is an essential component. There is currently a gap in the market for a 'user-friendly' book which is contextualised and targeted at nursing. Practical Statistics for Nursing and Health Care introduces statistical techniques in such a way that readers will easily grasp the fundamentals, enabling them to gain the confidence and understanding to perform their own analysis. It also provides sufficient advice in areas such as clinical trials and epidemiology to enable the reader to critically appraise work published in journals such as the Lancet and British Medical Journal. * Covers all basic statistical concepts and tests * Is user-friendly - avoids excessive jargon * Includes relevant examples for nurses, including case studies and data sets * Provides information on further reading * Starts from first principles and progresses step by step * Includes 'advice on' sections for all of the tests described.

  12. Trend figures assist with untrained emergency electroencephalogram interpretation.

    Science.gov (United States)

    Kobayashi, Katsuhiro; Yunoki, Kosuke; Zensho, Kazumasa; Akiyama, Tomoyuki; Oka, Makio; Yoshinaga, Harumi

    2015-05-01

    Acute electroencephalogram (EEG) findings are important for diagnosing emergency patients with suspected neurological disorders, but they can be difficult for untrained medical staff to interpret. In this research, we developed an emergency EEG trend figure that we hypothesized would be more easily understood by untrained staff than the raw original traces. For each of several EEG patterns (wakefulness, sleep, seizure activity, and encephalopathy), trend figures incorporating information on both amplitude and frequency were built. The accuracy of untrained reviewers' interpretation of the trend figures was compared with that of their interpretation of the raw EEG traces. The rate of correct answers was significantly higher in response to the EEG trend figures than to the raw traces showing wakefulness, sleep, and encephalopathy, but there was no difference when seizure activity patterns were viewed. The rates of misjudging normal or abnormal findings were significantly lower with the trend figures in the wakefulness pattern; in the other patterns, misjudgments were equally low for the trend figures and the raw traces. EEG trend figures improved the accuracy with which untrained medical staff interpreted emergency EEGs. Emergency EEG figures that can be understood intuitively with minimal training might improve the accuracy of emergency EEG interpretation. However, additional studies are required to confirm these results because there may be many types of clinical EEGs that are difficult to interpret. Copyright © 2014 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
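
    The paper does not specify the exact construction of its trend figures; a minimal sketch of the kind of per-epoch amplitude-and-frequency summary such a figure condenses might look like the following (the epoch length, sampling rate, and test signal are illustrative assumptions, not the authors' algorithm):

```python
import numpy as np

def trend_features(signal, fs, epoch_s=1.0):
    """Per-epoch RMS amplitude and dominant frequency of one EEG channel:
    the two quantities a combined amplitude/frequency trend figure plots."""
    n = int(fs * epoch_s)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        epoch = signal[start:start + n]
        rms = np.sqrt(np.mean(epoch ** 2))               # amplitude summary
        spec = np.abs(np.fft.rfft(epoch - epoch.mean()))  # magnitude spectrum
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        dom = freqs[np.argmax(spec[1:]) + 1]             # skip the DC bin
        feats.append((rms, dom))
    return feats

# Hypothetical 10 Hz alpha-like test signal, 4 s at 256 Hz
fs = 256
t = np.arange(0, 4, 1 / fs)
sig = 50 * np.sin(2 * np.pi * 10 * t)
print(trend_features(sig, fs)[0])
```

    Plotting these two feature tracks against time yields a compressed overview in which sustained changes (e.g. the slowing of encephalopathy) stand out without reading the raw traces.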

  13. Interpretation and display of research results

    Directory of Open Access Journals (Sweden)

    Dilip Kumar Kulkarni

    2016-01-01

    It is important to properly collect, code, clean and edit data before interpreting and displaying the research results. Computers play a major role in the different phases of research, from the conceptual, design and planning phases through data collection and data analysis to publication. The main objective of data display is to summarize the characteristics of the data and to make them more comprehensible and meaningful. Data are usually presented in tables and graphs, chosen according to the type of data. This not only helps in understanding the behaviour of the data, but is also useful in choosing the statistical tests to be applied.

  14. Wind Statistics from a Forested Landscape

    DEFF Research Database (Denmark)

    Arnqvist, Johan; Segalini, Antonio; Dellwik, Ebba

    2015-01-01

    An analysis and interpretation of measurements from a 138-m tall tower located in a forested landscape is presented. Measurement errors and statistical uncertainties are carefully evaluated to ensure high data quality. A 40° wide wind-direction sector is selected as the most re

  15. A Question of Interpretation

    Science.gov (United States)

    2004-01-01

    Released July 22, 2004 The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth. Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms. It is often difficult to determine if a wind-eroded surface represents the youngest activity in a region. Wind-eroded landforms can be covered by later materials and then exhumed long after they were initially formed. This image illustrates how difficult it can be to interpret the surface of Mars. Image information: VIS instrument. Latitude -6.7, Longitude 174.7 East (185.3 West). 19 meter/pixel resolution. Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time. NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. 
Lockheed

  16. Exploring the Relationship Between Eye Movements and Electrocardiogram Interpretation Accuracy

    Science.gov (United States)

    Davies, Alan; Brown, Gavin; Vigo, Markel; Harper, Simon; Horseman, Laura; Splendiani, Bruno; Hill, Elspeth; Jay, Caroline

    2016-12-01

    Interpretation of electrocardiograms (ECGs) is a complex task involving visual inspection. This paper aims to improve understanding of how practitioners perceive ECGs, and to determine whether visual behaviour can indicate differences in interpretation accuracy. A group of healthcare practitioners (n = 31) who interpret ECGs as part of their clinical role were shown 11 commonly encountered ECGs on a computer screen. The participants' eye movement data were recorded as they viewed the ECGs and attempted interpretation. The Jensen-Shannon distance was computed between two Markov chains, constructed from the transition matrices (visual shifts from and to ECG leads) of the correct and incorrect interpretation groups for each ECG. A permutation test was then used to compare this distance against 10,000 randomly shuffled groups made up of the same participants. The results demonstrated a statistically significant (α = 0.05) result in 5 of the 11 stimuli, indicating that the gaze shift between the ECG leads differs between the groups making correct and incorrect interpretations and is therefore a factor in interpretation accuracy. The results shed further light on the relationship between visual behaviour and ECG interpretation accuracy, providing information that can be used to improve both human and automated interpretation approaches.
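
    The analysis described above (per-group transition matrices, the Jensen-Shannon distance between them, and a label-shuffling permutation test) can be sketched as follows; the fixation sequences, group labels, and helper names are hypothetical illustrations, not the study's data or code.

```python
import numpy as np

def transition_matrix(seq, n_states):
    """Row-normalised matrix of transitions between fixated ECG leads."""
    m = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        m[a, b] += 1
    rows = m.sum(axis=1, keepdims=True)
    return np.divide(m, rows, out=np.zeros_like(m), where=rows > 0)

def js_distance(p, q, eps=1e-12):
    """Jensen-Shannon distance (base 2, in [0, 1]) between two
    flattened transition matrices treated as distributions."""
    p = p.ravel() + eps
    q = q.ravel() + eps
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

def permutation_test(seqs, labels, n_states, n_perm=10000, rng=None):
    """Observed JS distance between the two groups' mean transition
    matrices, and a permutation p-value from shuffled group labels."""
    rng = np.random.default_rng(rng)
    labels = np.asarray(labels)

    def stat(lab):
        g0 = np.mean([transition_matrix(s, n_states)
                      for s, l in zip(seqs, lab) if l == 0], axis=0)
        g1 = np.mean([transition_matrix(s, n_states)
                      for s, l in zip(seqs, lab) if l == 1], axis=0)
        return js_distance(g0, g1)

    observed = stat(labels)
    count = sum(stat(rng.permutation(labels)) >= observed
                for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)
```

    Here each sequence lists the lead indices a participant fixated in order, and labels mark correct (0) versus incorrect (1) interpretations; a small p-value indicates the observed between-group distance is unlikely under random regrouping of the same participants.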

  17. Mantoux test and its interpretation

    Directory of Open Access Journals (Sweden)

    Surajit Nayak

    2012-01-01

    The tuberculin skin test is one of the few investigations dating from the 19th century that is still widely used as an important test for diagnosing tuberculosis. Though very commonly used by physicians worldwide, its interpretation remains difficult and controversial. Various factors, such as age, immunological status and coexisting illness, influence its outcome and therefore its interpretation. Utmost care is required when interpreting the result and giving an opinion. This article has been written with the purpose of elucidating the performance and interpretation of the standard tuberculin test.

  18. An easily accessible sulfated saccharide mimetic inhibits in vitro human tumor cell adhesion and angiogenesis of vascular endothelial cells

    Directory of Open Access Journals (Sweden)

    Grazia Marano

    2012-05-01

    Full Text Available Oligosaccharides aberrantly expressed on tumor cells influence processes such as cell adhesion and modulation of the cell’s microenvironment, resulting in an increased malignancy. Schmidt’s imidate strategy offers an effective method to synthesize libraries of various oligosaccharide mimetics. With the aim of perturbing interactions of tumor cells with extracellular matrix proteins and host cells, molecules with 3,4-bis(hydroxymethyl)furan as core structure were synthesized and screened in biological assays for their abilities to interfere in cell adhesion and other steps of the metastatic cascade, such as tumor-induced angiogenesis. The most active compound, (4-{[(β-D-galactopyranosyl)oxy]methyl}furan-3-yl)methyl hydrogen sulfate (GSF), inhibited the activation of matrix metalloproteinase-2 (MMP-2) as well as migration of the human melanoma cells of the lines WM-115 and WM-266-4 in a two-dimensional migration assay. GSF completely inhibited the adhesion of WM-115 cells to the extracellular matrix (ECM) proteins fibrinogen and fibronectin. In an in vitro angiogenesis assay with human endothelial cells, GSF very effectively inhibited endothelial tubule formation and sprouting of blood vessels, as well as the adhesion of endothelial cells to ECM proteins. GSF was not cytotoxic at biologically active concentrations; neither were 3,4-bis{[(β-D-galactopyranosyl)oxy]methyl}furan (BGF), methyl β-D-galactopyranoside, or 3,4-bis(hydroxymethyl)furan, which were used as controls and did not elicit comparable biological activity. In silico modeling experiments, in which binding of GSF to the extracellular domain of the integrin αvβ3 was determined, revealed specific docking of GSF to the same binding site as the natural peptidic ligands of this integrin. The sulfate in the molecule coordinated with one manganese ion in the binding site. These studies show that this chemically easily accessible molecule GSF, synthesized in three steps from 3,4-bis

  19. Developments in statistical evaluation of clinical trials

    CERN Document Server

    Oud, Johan; Ghidey, Wendimagegn

    2014-01-01

    This book describes various ways of approaching and interpreting the data produced by clinical trial studies, with a special emphasis on the essential role that biostatistics plays in clinical trials. Over the past few decades the role of statistics in the evaluation and interpretation of clinical data has become of paramount importance. As a result the standards of clinical study design, conduct and interpretation have undergone substantial improvement. The book includes 18 carefully reviewed chapters on recent developments in clinical trials and their statistical evaluation, with each chapter providing one or more examples involving typical data sets, enabling readers to apply the proposed procedures. The chapters employ a uniform style to enhance comparability between the approaches.

  20. Handbook of univariate and multivariate data analysis and interpretation with SPSS

    CERN Document Server

    Ho, Robert

    2006-01-01

    Many statistics texts tend to focus more on the theory and mathematics underlying statistical tests than on their applications and interpretation. This can leave readers with little understanding of how to apply statistical tests or how to interpret their findings. While the SPSS statistical software has done much to alleviate the frustrations of social science professionals and students who must analyze data, they still face daunting challenges in selecting the proper tests, executing the tests, and interpreting the test results.With emphasis firmly on such practical matters, this handbook se

  1. Components of simultaneous interpreting: Comparing interpreting with shadowing and paraphrasing.

    NARCIS (Netherlands)

    Christoffels, I.K.; de Groot, A.M.B.

    2004-01-01

    Simultaneous interpreting is a complex task where the interpreter is routinely involved in comprehending, translating and producing language at the same time. This study assessed two components that are likely to be major sources of complexity in SI: The simultaneity of comprehension and production,

  2. Explorations in statistics: statistical facets of reproducibility.

    Science.gov (United States)

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  3. Clinical interpretation and use of stroke scales.

    Science.gov (United States)

    Kasner, Scott E

    2006-07-01

    No single outcome measure can describe or predict all dimensions of recovery and disability after acute stroke. Several scales have proven reliability and validity in stroke trials, including the National Institutes of Health stroke scale (NIHSS), the modified Rankin scale (mRS), the Barthel index (BI), the Glasgow outcome scale (GOS), and the stroke impact scale (SIS). Several scales have been combined in stroke trials to derive a global statistic to better define the effect of acute interventions, although this composite statistic is not clinically tenable. In practice, the NIHSS is useful for early prognostication and serial assessment, whereas the BI is useful for planning rehabilitative strategies. The mRS and GOS provide summary measures of outcome and might be most relevant to clinicians and patients considering early intervention. The SIS was designed to measure the patient's perspective on the effect of stroke. Familiarity with these scales could improve clinicians' interpretation of stroke research and their clinical decision-making.

  4. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  5. Statistics using R

    CERN Document Server

    Purohit, Sudha G; Deshmukh, Shailaja R

    2015-01-01

    STATISTICS USING R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide in a variety of statistical methods for development of a suite of statistical procedures.

  6. Intercultural pragmatics and court interpreting

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    2008-01-01

    The court interpreters are all state-authorized court interpreters and thus fully competent professionals. The centrality of pragmatics in triadic speech events has been demonstrated by a number of studies (e.g. Berk-Seligson 2002, Hale 2004, Jacobsen 2002). Thus, conversational implicatures, which...

  7. Investigating a Hierarchy of Students’ Interpretations of Graphs

    Directory of Open Access Journals (Sweden)

    Kazuhiro Aoyama

    2007-10-01

    Full Text Available The ability to analyse qualitative information from quantitative information, and/or to create new information from qualitative and quantitative information is the key task of statistical literacy in the 21st century. Although several studies have focussed on critical evaluation of statistical information, this aspect of research has not been clearly conceptualised as yet. This paper presents a hierarchy of the graphical interpretation component of statistical literacy. 175 participants from different educational levels (junior high school to graduate students) responded to a questionnaire and some of them were also interviewed. The SOLO Taxonomy was used for coding the students’ responses and the Rasch model was used to clarify the construction of the hierarchy. Five different levels of interpretations of graphs were identified: Idiosyncratic, Basic graph reading, Rational/Literal, Critical, and Hypothesising and Modelling. These results will provide guidelines for teaching statistical literacy.

  8. Making sense of statistics a non-mathematical approach

    CERN Document Server

    Wood, Michael

    2003-01-01

    This text provides a thorough but accessible introduction to statistics and probability, without the distractions of mathematics. Guidance is provided on how to design investigations, analyse data and interpret results.

  9. Remote interpretation of chest roentgenograms.

    Science.gov (United States)

    Andrus, W S; Hunter, C H; Bird, K T

    1975-04-01

    A series of 98 chest films was interpreted by two physicians on the basis of monitor display of the transmitted television signal representing the roentgenographic image. The transmission path was 14 miles long, and included one active repeater station. Receiver operating characteristic curves were drawn to compare interpretations rendered on television view of the image with classic, direct view interpretations of the same films. Performance in these two viewing modes was found to be quite similar. When films containing only hazy densities, lacking internal structure or sharp margins, were removed from the sample, interpretation of the remaining films was essentially identical via the two modes. Since hazy densities are visible on retrospective examination, interpretation of roentgenograms at a distance via television appears to be a feasible route for delivery of radiologic services.

  10. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores

  11. Interpretation: bridge of cultures--A brief introduction to interpretation

    Institute of Scientific and Technical Information of China (English)

    袁立梅

    2005-01-01

    This essay gives a brief introduction to interpretation in terms of its definition, characteristics, criteria and classification, providing an overall view of this increasingly widely used communication vehicle in today's society.

  12. Applied statistics for economics and business

    CERN Document Server

    Özdemir, Durmuş

    2016-01-01

    This textbook introduces readers to practical statistical issues by presenting them within the context of real-life economics and business situations. It presents the subject in a non-threatening manner, with an emphasis on concise, easily understandable explanations. It has been designed to be accessible and student-friendly and, as an added learning feature, provides all the relevant data required to complete the accompanying exercises and computing problems, which are presented at the end of each chapter. It also discusses index numbers and inequality indices in detail, since these are of particular importance to students and commonly omitted in textbooks. Throughout the text it is assumed that the student has no prior knowledge of statistics. It is aimed primarily at business and economics undergraduates, providing them with the basic statistical skills necessary for further study of their subject. However, students of other disciplines will also find it relevant.

  13. A Contextualist Interpretation of Mathematics

    Science.gov (United States)

    Liu, Jie

    2014-03-01

    The nature of mathematics has been the subject of heated debate among mathematicians and philosophers throughout the ages. The realist and anti-realist positions have had a longstanding debate over this problem, but some of the most important recent developments have focused on interpretations; each of the above positions has its own interpretation of the nature of mathematics. I argue in this paper for a contextualist interpretation of mathematics, which elucidates the essential features of mathematical context. That is, being integral and having concrete structure, mathematical context is a recontextualizational process with a determinate boundary.

  14. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  15. Alcohol Facts and Statistics

    Science.gov (United States)


  16. Bureau of Labor Statistics

    Science.gov (United States)


  17. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  18. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  19. Mathematical and statistical analysis

    Science.gov (United States)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  20. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  1. Neuroendocrine Tumor: Statistics

    Science.gov (United States)


  2. Experiment in Elementary Statistics

    Science.gov (United States)

    Fernando, P. C. B.

    1976-01-01

    Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)

  3. Overweight and Obesity Statistics

    Science.gov (United States)


  4. Uterine Cancer Statistics

    Science.gov (United States)


  5. School Violence: Data & Statistics

    Science.gov (United States)


  6. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  7. Principles of Equilibrium Statistical Mechanics

    Science.gov (United States)

    Chowdhury, Debashish; Stauffer, Dietrich

    2000-09-01

    This modern textbook provides a complete survey of the broad field of statistical mechanics. Based on a series of lectures, it adopts a special pedagogical approach. The authors, both excellent lecturers, clearly distinguish between general principles and their applications in solving problems. Analogies between phase transitions in fluids and magnets using continuum and spin models are emphasized, leading to a better understanding. Such special features as historical notes, summaries, problems, mathematical appendix, computer programs and order of magnitude estimations distinguish this volume from competing works. Due to its ambitious level and an extensive list of references for technical details on advanced topics, this is equally a must for researchers in condensed matter physics, materials science, polymer science, solid state physics, and astrophysics. From the contents Thermostatics: phase stability, phase equilibria, phase transitions; Statistical Mechanics: calculation, correlation functions, ideal classical gases, ideal quantum gases; Interacting Systems: models, computer simulation, mean-field approximation; Interacting Systems beyond Mean-field Theory: scaling and renormalization group, foundations of statistical mechanics "The present book, however, is unique in that it both is written in a very pedagogic, easily comprehensible style, and, nevertheless, goes from the basic principles all the way to these modern topics, containing several chapters on the various approaches of mean field theory, and a chapter on computer simulation. A characteristic feature of this book is that often first some qualitative arguments are given, or a "pedestrian's approach", and then a more general and/or more rigorous derivation is presented as well. Particularly useful are also "supplementary notes", pointing out interesting applications and further developments of the subject, a detailed bibliography, problems and historical notes, and many pedagogic figures."

  8. Alternative Derivation of the Propagator in Polar Coordinates by Feynman's Physical Interpretation of the Characteristic Function

    Institute of Scientific and Technical Information of China (English)

    QIAN Shang-Wu; CHAN King-Man; GU Zhi-Yu

    2002-01-01

    This article revisits Feynman's characteristic function, and points out the insight and usefulness of his physical interpretation. As an example, the tedious and rather long derivation of the propagator in polar coordinates can be easily and clearly obtained by merely using Feynman's physical interpretation of the characteristic function and some well-known results of the central force problem.

  9. Software for Spatial Statistics

    Directory of Open Access Journals (Sweden)

    Edzer Pebesma

    2015-02-01

    Full Text Available We give an overview of the papers published in this special issue on spatial statistics of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.

  10. Software for Spatial Statistics

    OpenAIRE

    Edzer Pebesma; Roger Bivand; Paulo Justiniano Ribeiro

    2015-01-01

    We give an overview of the papers published in this special issue on spatial statistics, of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.

  11. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  12. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  13. Federal Aviation Administration Legal Interpretations

    Data.gov (United States)

    Department of Transportation — Legal Interpretations and the Chief Counsel's opinions are now available at this site. Your may choose to search by year or by text search. Please note that not all...

  14. Interpreting Sustainability for Urban Forests

    Directory of Open Access Journals (Sweden)

    Camilo Ordóñez

    2010-06-01

    Full Text Available Incisive interpretations of urban-forest sustainability are important in furthering our understanding of how to sustain the myriad values associated with urban forests. Our analysis of earlier interpretations reveals conceptual gaps. These interpretations are attached to restrictive definitions of a sustainable urban forest and limited to a rather mechanical view of maintaining the biophysical structure of trees. The probing of three conceptual domains (urban forest concepts, sustainable development, and sustainable forest management) leads to a broader interpretation of urban-forest sustainability as the process of sustaining urban forest values through time and across space. We propose that values—and not services, benefits, functions or goods—is a superior concept to refer to what is to be sustained in and by an urban forest.

  15. Psychophysical Interpretation of Quantum theory

    CERN Document Server

    Pradhan, Rajat K

    2013-01-01

    It is shown that the formalism of quantum theory naturally incorporates the psychophysical parallelism and thereby interprets itself, if the subjective aspects are taken as equal partners alongside the objective aspects as determinants of Reality as a Whole. The inevitable interplay of the subject (observer) and the object (observed) in making up Reality is brought out succinctly through a comprehensive psychophysical interpretation which includes in its bosom the truths of many of the major interpretations proposed so far as essential ingredients. At the heart of this novel approach lies the interpretation of the complex conjugate quantities such as the conjugate wave function {\Psi}*(r, t), the bra vector, and the observable A, etc., respectively. This brings out the psycho-physical parallelism lying hidden in the quantum mechanical formalism in a quite straightforward manner. The measurement process is shown to be a two-step process comprising objective interaction through the retarded waves and subjective ...

  16. Abstract Interpretation and Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard semantics where the ``meaning'' contains information about the runtime behaviour of programs. In an abstract interpretation the analysis is proved correct by relating it to the usual semantics for the language. Attribute grammars provide a method and notation to specify code generation and program analysis directly from the syntax of the programming language. They are especially used for describing compilation of programming languages and very efficient evaluators have been developed for subclasses of attribute grammars. By relating abstract interpretation and attribute grammars we obtain a closer connection...

  17. ENVIRONMENTAL PHOTOGRAPHIC INTERPRETATION CENTER (EPIC)

    Science.gov (United States)

    The Environmental Sciences Division (ESD) in the National Exposure Research Laboratory (NERL) of the Office of Research and Development provides remote sensing technical support including aerial photograph acquisition and interpretation to the EPA Program Offices, ORD Laboratorie...

  18. Dialectica Interpretation with Marked Counterexamples

    Directory of Open Access Journals (Sweden)

    Trifon Trifonov

    2011-01-01

    Full Text Available Goedel's functional "Dialectica" interpretation can be used to extract functional programs from non-constructive proofs in arithmetic by employing two sorts of higher-order witnessing terms: positive realisers and negative counterexamples. In the original interpretation decidability of atoms is required to compute the correct counterexample from a set of candidates. When combined with recursion, this choice needs to be made for every step in the extracted program, however, in some special cases the decision on negative witnesses can be calculated only once. We present a variant of the interpretation in which the time complexity of extracted programs can be improved by marking the chosen witness and thus avoiding recomputation. The achieved effect is similar to using an abortive control operator to interpret computational content of non-constructive principles.

  19. Interpretation of the Weyl Tensor

    CERN Document Server

    Hofmann, Stefan; Schneider, Robert

    2013-01-01

    According to folklore in general relativity, the Weyl tensor can be decomposed into parts corresponding to Newton-like, incoming- and outgoing wave-like field components. It is shown here that this interpretation cannot be applied to space-time geometries with cylindrical isometries. This is done by investigating some well-known exact solutions of Einstein's field equations with whole-cylindrical symmetry, for which the physical interpretation is very clear, but for which the standard Weyl interpretation would give contradictory results. For planar or spherical geometries, however, the standard interpretation works for both, static and dynamical space-times. It is argued that one reason for the failure in the cylindrical case is that for waves spreading in two spatial dimensions there is no local criterion to distinguish incoming and outgoing waves already at the linear level. It turns out that Thorne's local energy notion, subject to certain qualifications, provides an efficient diagnostic tool to extract th...

  20. Dialectica Interpretation with Marked Counterexamples

    CERN Document Server

    Trifonov, Trifon

    2011-01-01

    Goedel's functional "Dialectica" interpretation can be used to extract functional programs from non-constructive proofs in arithmetic by employing two sorts of higher-order witnessing terms: positive realisers and negative counterexamples. In the original interpretation decidability of atoms is required to compute the correct counterexample from a set of candidates. When combined with recursion, this choice needs to be made for every step in the extracted program, however, in some special cases the decision on negative witnesses can be calculated only once. We present a variant of the interpretation in which the time complexity of extracted programs can be improved by marking the chosen witness and thus avoiding recomputation. The achieved effect is similar to using an abortive control operator to interpret computational content of non-constructive principles.

  1. Selling statistics [Statistics in scientific progress]

    Energy Technology Data Exchange (ETDEWEB)

    Bridle, S. [Astrophysics Group, University College London (United Kingdom)]. E-mail: sarah@star.ucl.ac.uk

    2006-09-15

    From Cosmos to Chaos, Peter Coles, 2006, Oxford University Press, 224pp. To confirm or refute a scientific theory you have to make a measurement. Unfortunately, however, measurements are never perfect: the rest is statistics. Indeed, statistics is at the very heart of scientific progress, but it is often poorly taught and badly received; for many, the very word conjures up half-remembered nightmares of 'null hypotheses' and 'Student's t-tests'. From Cosmos to Chaos by Peter Coles, a cosmologist at Nottingham University, is an approachable antidote that places statistics in a range of catchy contexts. Using this book you will be able to calculate the probabilities in a game of bridge or in a legal trial based on DNA fingerprinting, impress friends by talking confidently about entropy, and stretch your mind thinking about quantum mechanics. (U.K.)
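    The legal-trial example the book mentions is essentially an application of Bayes' theorem: a tiny random-match probability does not by itself imply near-certain guilt, because the prior matters. A minimal sketch, with all numbers hypothetical, chosen only to illustrate the calculation:

```python
# Bayes' theorem for a DNA-match scenario (all numbers hypothetical).
# p_match_given_innocent is the random-match probability; the prior
# reflects how many people could plausibly be the source of the sample.

def posterior_guilt(prior, p_match_given_guilty=1.0,
                    p_match_given_innocent=1e-6):
    """P(guilty | DNA match) via Bayes' theorem."""
    evidence = (p_match_given_guilty * prior
                + p_match_given_innocent * (1.0 - prior))
    return p_match_given_guilty * prior / evidence

# With a suspect pool of a million people, the prior is about 1e-6,
# and a 1-in-a-million match probability leaves the posterior near 0.5:
print(posterior_guilt(prior=1e-6))
# Independent evidence that raises the prior to 0.1 changes the picture:
print(posterior_guilt(prior=0.1))
```

    Presenting only the random-match probability while ignoring the prior is the well-known prosecutor's fallacy.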

  2. COURT INTERPRETING AT DENPASAR COURT

    Directory of Open Access Journals (Sweden)

    Ida Ayu Made Puspani

    2012-11-01

    Full Text Available This is a study of interpreting (oral translation) in the court proceedings of a criminal drug-use case at Denpasar Court. The study of the interpreting is concerned with two-way rendition between Indonesian and English. The study is related to: (1) the description of the modes of interpreting applied by the interpreter; (2) the application of translation strategies: shift, addition and deletion of information; (3) the factors that underlie the application of the strategies; and (4) the impact of the application of those strategies on the quality of the interpreting. The methodology applied in this study is qualitative, based on eclectic theories (translation, syntax, semantics and pragmatics). The utilization of the theories is in accordance with the type of the data analyzed, with regard to the translation phenomena as an applied study and their complexity. The interpreting at court applied the consecutive and simultaneous modes. The strategy of shift was applied when there were differences in structure between the source and the target languages. Addition of information was used when the interpreter emphasized the message of the source language in the target language. Deletion of information was applied if the context in the target language had already been covered, and it was not necessary for the interpreter to interpret the same thing because the message of the source language was pragmatically implied in the target language. The factors underlying the application of the interpreting strategies in court interpreting were the communication factor and the differences in the language systems between the source and the target languages. The use of the strategies affected the quality of the interpreting when the message of the source language was not completely rendered into the target language. The novelties of the research are: (1) relevance theory and its

  3. Court interpreting and pragmatic meaning

    DEFF Research Database (Denmark)

    Jacobsen, Bente

    In Denmark, court interpreters are required to deliver verbatim translations of speakers' originals and to refrain from transferring pragmatic meaning. Yet, as this paper demonstrates, pragmatic meaning is central to courtroom interaction.

  4. Formal modelling of cognitive interpretation

    OpenAIRE

    Rukšenas, R.; Curzon, P; Back, J.; Blandford, A.

    2007-01-01

    We formally specify the interpretation stage in a dual state space human-computer interaction cycle. This is done by extending/reorganising our previous cognitive architecture. In particular, we focus on shape related aspects of the interpretation process associated with device input prompts. A cash-point example illustrates our approach. Using the SAL model checking environment, we show how the extended cognitive architecture facilitates detection of prompt-shape induced human error. © Sprin...

  5. Statistics Essentials For Dummies

    CERN Document Server

    Rumsey, Deborah

    2010-01-01

    Statistics Essentials For Dummies not only provides students enrolled in Statistics I with an excellent high-level overview of key concepts, but it also serves as a reference or refresher for students in upper-level statistics courses. Free of review and ramp-up material, Statistics Essentials For Dummies sticks to the point, with content focused on key course topics only. It provides discrete explanations of essential concepts taught in a typical first semester college-level statistics course, from odds and error margins to confidence intervals and conclusions. This guide is also a perfect re

  6. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  7. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  8. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  9. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  10. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  11. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  12. Estimation and inferential statistics

    CERN Document Server

    Sahu, Pradip Kumar; Das, Ajit Kumar

    2015-01-01

    This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimation of population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers to find the most suitable application. Statistical tools have been presented by using real-life examples, removing the “fear factor” usually associated with this complex subject. The book will help readers to discover diverse perspectives of statistical theory followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.

  13. Interactive comparison of hypothesis tests for statistical model checking

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; Reijsbergen, D.P.; Scheinhardt, Willem R.W.

    2015-01-01

    We present a web-based interactive comparison of hypothesis tests as are used in statistical model checking, providing users and tool developers with more insight into their characteristics. Parameters can be modified easily and their influence is visualized in real time; an integrated simulation
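    A hypothesis test commonly compared in statistical model checking is Wald's sequential probability ratio test (SPRT), which decides between two candidate values of a Bernoulli success probability from a stream of simulation outcomes. A minimal sketch, with illustrative parameters, not the tool's actual implementation:

```python
import math
import random

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 (with p1 > p0) on a
    stream of Bernoulli samples. Returns 'H0', 'H1', or 'undecided'
    if the stream is exhausted before a boundary is crossed."""
    lower = math.log(beta / (1 - alpha))    # accept H0 at or below this
    upper = math.log((1 - beta) / alpha)    # accept H1 at or above this
    llr = 0.0                               # running log-likelihood ratio
    for x in samples:
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return 'H0'
        if llr >= upper:
            return 'H1'
    return 'undecided'

random.seed(0)
draws = (random.random() < 0.9 for _ in range(10000))
print(sprt(draws, p0=0.5, p1=0.9))  # prints 'H1'
```

    The appeal of the sequential form is that sampling stops as soon as the evidence is decisive, which is exactly the behaviour such interactive comparisons visualize.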

  14. A New Statistic for Variable Selection in Questionnaire Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jun-hua; FANG Wei-wu

    2001-01-01

    In this paper, a new statistic is proposed for variable selection which is one of the important problems in analysis of questionnaire data. Contrasting to other methods, the approach introduced here can be used not only for two groups of samples but can also be easily generalized to the multi-group case.

  15. An Experimental Approach to Teaching and Learning Elementary Statistical Mechanics

    Science.gov (United States)

    Ellis, Frank B.; Ellis, David C.

    2008-01-01

    Introductory statistical mechanics is studied for a simple two-state system using an inexpensive and easily built apparatus. A large variety of demonstrations, suitable for students in high school and introductory university chemistry courses, are possible. This article details demonstrations for exothermic and endothermic reactions, the dynamic…
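    The state populations in such a two-state demonstration follow the Boltzmann distribution. A minimal sketch, with the energy gap and temperatures chosen purely for illustration (they are not taken from the article):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def excited_fraction(delta_e, temperature):
    """Fraction of a two-state ensemble in the upper state, for an
    energy gap delta_e (in joules) at the given temperature (in kelvin)."""
    boltzmann_factor = math.exp(-delta_e / (K_B * temperature))
    return boltzmann_factor / (1.0 + boltzmann_factor)

# Illustrative gap, chosen to be roughly k_B * 300 K:
gap = 4.14e-21
for t in (100.0, 300.0, 1000.0):
    print(t, excited_fraction(gap, t))
```

    As temperature grows, the upper-state fraction rises toward (but never beyond) one half, which is the qualitative behaviour such classroom demonstrations make visible.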

  16. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, build tools for data analysis. Statistical datasets curated by National Statistica

  17. The impact of working memory on interpreting

    Institute of Scientific and Technical Information of China (English)

    白云安; 张国梅

    2016-01-01

    This paper investigates the roles of working memory in the interpreting process. First of all, it gives a brief introduction to interpreting. Secondly, the paper exemplifies the role of working memory in interpreting. The result reveals that the working memory capacity of interpreters is not absolutely proportional to the quality of interpreting in real interpreting conditions. The performance of an interpreter with a well-equipped working memory capacity will be comprehensively influenced by various elements.

  18. Disability: concepts and statistical information

    Directory of Open Access Journals (Sweden)

    Giordana Baldassarre

    2008-06-01

    Full Text Available

    Background: The measurement and definition of disability is difficult due to its objective and subjective characteristics. In Italy, three different perspectives have been developed during the last 40 years. These various perspectives have had an effect, not only on how to measure disability, but also on policies to improve the social integration of people with disabilities.

    Methods: This paper examines the various conceptual models behind the definition of disability and the differences in the estimated number of persons with disabilities. In addition, it analyses, in accordance with the International Classification of Functioning, Disability and Health, the European and international initiatives undertaken to harmonize the definitions of disability.

    Discussion: There are various bodies and central government agencies that either hold management data or carry out systematic statistical surveys and disability surveys. Statistically speaking, the worst aspect of this scenario is that it creates confusion and uncertainty among the end users of this data, namely the policy makers. At the international level, the statistical data on disability is scarcely comparable among countries, despite huge efforts by international organisations to harmonize classifications and definitions of disability.

    Conclusions: Statistical and administrative surveys provide information flows using a different definition and label based on a conceptual model that reflects the time period in which they were implemented. The use of different prescriptive definitions of disability produces different counts of persons with disabilities in Italy. For this reason it is important to interpret the data correctly and choose the appropriate cross section that best represents the population on which to focus attention.

  19. Students' Misconceptions of Statistical Inference: A Review of the Empirical Evidence from Research on Statistics Education

    Science.gov (United States)

    Sotos, Ana Elisa Castro; Vanhoof, Stijn; Van den Noortgate, Wim; Onghena, Patrick

    2007-01-01

    A solid understanding of "inferential statistics" is of major importance for designing and interpreting empirical results in any scientific discipline. However, students are prone to many misconceptions regarding this topic. This article structurally summarizes and describes these misconceptions by presenting a systematic review of publications…

  20. Statistical inference on residual life

    CERN Document Server

    Jeong, Jong-Hyeon

    2014-01-01

    This is a monograph on the concept of residual life, which is an alternative summary measure of time-to-event data, or survival data. The mean residual life has been used for many years under the name of life expectancy, so it is a natural concept for summarizing survival or reliability data. It is also more interpretable than the popular hazard function, especially for communications between patients and physicians regarding the efficacy of a new drug in the medical field. This book reviews existing statistical methods to infer the residual life distribution. The review and comparison includes existing inference methods for mean and median, or quantile, residual life analysis through medical data examples. The concept of the residual life is also extended to competing risks analysis. The targeted audience includes biostatisticians, graduate students, and PhD (bio)statisticians. Knowledge in survival analysis at an introductory graduate level is advisable prior to reading this book.
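    The mean residual life at time t is m(t) = E[T - t | T > t], i.e. the expected remaining survival time among subjects still alive at t. A minimal empirical sketch on synthetic, fully observed (uncensored) data; real survival data would require the censoring-aware methods the book covers:

```python
def mean_residual_life(times, t):
    """Empirical mean residual life at time t: the average remaining
    time among subjects whose event time exceeds t.
    Assumes fully observed (uncensored) event times."""
    survivors = [x for x in times if x > t]
    if not survivors:
        return float('nan')  # no one left at risk beyond t
    return sum(x - t for x in survivors) / len(survivors)

# Synthetic event times (e.g. years of survival):
times = [2.0, 3.5, 4.0, 6.0, 8.5, 10.0]
print(mean_residual_life(times, 0.0))  # at t=0 this is the plain mean lifetime
print(mean_residual_life(times, 4.0))  # expected remaining life beyond t=4
```

    Note that m(0) is exactly the life expectancy mentioned in the record, which is what makes the measure easy to communicate.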

  1. Evaluation of the TV Series "Statistics" (SABC-ERTV1).

    Science.gov (United States)

    Stupart, J. D. C.; Duby, Aliza

    A summative evaluation of the effectiveness of the educational television series, "Statistics," that aired on South African television is presented. The two episodes chosen from the six-episode series covered pie charts, pictograms, and pictographs (episode 1); and point-of-view interpretations of statistics (episode 4). The evaluation was…

  2. Statistical tools: An overview of common applications in social sciences

    NARCIS (Netherlands)

    Grotenhuis, H.F. te; Weegen, T.M.C.M. van der

    2009-01-01

    This textbook deals with the proper use of common statistical applications and the correct interpretation of statistical results in the realm of the social sciences. Besides the text of the book, additional data (in SPSS format) is available on the internet to allow the reader to exercise by replication of the

  3. What Is the next Trend in Usage Statistics in Libraries?

    Science.gov (United States)

    King, Douglas

    2009-01-01

    In answering the question "What is the next trend in usage statistics in libraries?" an eclectic group of respondents has presented an assortment of possibilities, suggestions, complaints and, of course, questions of their own. Undoubtedly, usage statistics collection, interpretation, and application are areas of growth and increasing complexity…

  4. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition

    Science.gov (United States)

    Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine

    2016-08-01

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
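    The onion layers described above can be computed with the same peeling procedure as the k-cores, simply recording the stage at which each vertex is removed. A minimal sketch on a toy adjacency-list graph (the graph and variable names are illustrative, not the authors' code):

```python
def onion_layers(adj):
    """Peel a graph as in the standard k-core algorithm, recording for
    each vertex its onion layer (removal stage) and the core number k
    at which it was removed. adj: dict vertex -> set of neighbours."""
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    alive = set(adj)
    layer_of, core_of = {}, {}
    k, layer = 0, 0
    while alive:
        k = max(k, min(degree[v] for v in alive))
        # one layer = every remaining vertex of current degree <= k
        peel = [v for v in alive if degree[v] <= k]
        layer += 1
        for v in peel:
            layer_of[v], core_of[v] = layer, k
            alive.discard(v)
        for v in peel:          # update degrees of the survivors
            for u in adj[v]:
                if u in alive:
                    degree[u] -= 1
    return layer_of, core_of

# Toy graph: a triangle {a, b, c} with a pendant vertex d attached to a.
adj = {'a': {'b', 'c', 'd'}, 'b': {'a', 'c'}, 'c': {'a', 'b'}, 'd': {'a'}}
layers, cores = onion_layers(adj)
print(sorted(layers.items()))  # d peels first; the triangle peels next
print(sorted(cores.items()))   # d has core number 1, the triangle 2
```

    The onion spectrum of a network is then just the distribution of these layer sizes, which is why it costs no more to compute than the k-core decomposition itself.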

  5. Statistical learning from a regression perspective

    CERN Document Server

    Berk, Richard A

    2016-01-01

    This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...

  6. Intelligent Collection Environment for an Interpretation System

    Energy Technology Data Exchange (ETDEWEB)

    Maurer, W J

    2001-07-19

    An Intelligent Collection Environment for a data interpretation system is described. The environment accepts two inputs: a data model and a number between 0.0 and 1.0. The data model is as simple as a single word or as complex as a multi-level/multidimensional model. The number between 0.0 and 1.0 is a control knob to indicate the user's desire to allow loose matching of the data (things are ambiguous and unknown) versus strict matching of the data (things are precise and known). The environment produces a set of possible interpretations, a set of requirements to further strengthen or to differentiate a particular subset of the possible interpretations from the others, a set of inconsistencies, and a logic map that graphically shows the lines of reasoning used to derive the above output. The environment is comprised of a knowledge editor, model explorer, expertise server, and the World Wide Web. The Knowledge Editor is used by a subject matter expert to define Linguistic Types, Term Sets, detailed explanations, and dynamically created URIs, and to create rule bases using a straightforward hyper-matrix representation. The Model Explorer allows rapid construction and browsing of multi-level models. A multi-level model is a model whose elements may also be models themselves. The Expertise Server is an inference engine used to interpret the data submitted. It incorporates a semantic network knowledge representation, an assumption-based truth maintenance system, and a fuzzy logic calculus. It can be extended by employing any classifier (e.g. statistical/neural networks) of complex data types. The World Wide Web is an unstructured data space accessed by the URIs supplied as part of the output of the environment. By recognizing the input data model as a query, the environment serves as a deductive search engine. Applications include (but are not limited to) interpretation of geophysical phenomena, a navigation aid for very large web sites, monitoring of

  7. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  8. Laboratory test result interpretation for primary care doctors in South Africa.

    Science.gov (United States)

    Vanker, Naadira; Faull, Norman H B

    2017-01-01

    Challenges and uncertainties with test result interpretation can lead to diagnostic errors. Primary care doctors are at a higher risk than specialists of making these errors, due to the range in complexity and severity of conditions that they encounter. This study aimed to investigate the challenges that primary care doctors face with test result interpretation, and to identify potential countermeasures to address these. A survey was sent out to 7800 primary care doctors in South Africa. Questionnaire themes included doctors' uncertainty with interpreting test results, mechanisms used to overcome this uncertainty, challenges with appropriate result interpretation, and perceived solutions for interpreting results. Of the 552 responses received, challenges with result interpretation were estimated to arise in an average of 17% of diagnostic encounters. The most commonly reported challenges were not receiving test results in a timely manner (51% of respondents) and previous results not being easily available (37%). When faced with diagnostic uncertainty, 84% of respondents would either follow up and reassess the patient or discuss the case with a specialist, and 67% would contact a laboratory professional. The most useful test utilisation enablers were found to be: interpretive comments (78% of respondents), published guidelines (74%), and a dedicated laboratory phone line (72%). Primary care doctors acknowledge uncertainty with test result interpretation. Potential countermeasures include the addition of patient-specific interpretive comments, the availability of guidelines or algorithms, and a dedicated laboratory phone line. The benefit of enhanced test result interpretation would be a reduction in diagnostic error rates.

  9. The effects of test interpretation styles and the status of tests in career counseling

    Directory of Open Access Journals (Sweden)

    Nelia Frade

    2007-03-01

    Full Text Available The effects of two styles of test interpretation, namely directive and collaborative, and clients’ perceptions of the technical status of tests, namely high and low, were compared for 32 postgraduate psychology students who served as career counseling clients. Clients who received a collaborative interpretation perceived their counselor as more attractive and trustworthy than did clients who received a directive test interpretation. Interpretation style did not have an effect on session impact. Clients’ perceptions of test status had a noticeable, but statistically non-significant effect on counselor evaluations and session impact. Implications for test-interpretation practice are discussed.

  10. The Translation of Figures in Interpretation

    Institute of Scientific and Technical Information of China (English)

    杨莹

    2011-01-01

    Figures have always been a "bottleneck" of high-quality interpretation. This paper attempts a theoretical analysis of the main causes of the bottleneck and discusses the skill of interpreting figures. It also sums up the various forms involved in interpreting figures, to help interpreters achieve high-quality interpretation.

  11. Interpretation of buzzword renxing at Two Sessions

    Institute of Scientific and Technical Information of China (English)

    Ding Xin

    2015-01-01

    A buzzword from the Two Sessions, renxing, has made headlines in domestic and overseas media. The word now has an authoritative interpretation: capricious. The interpreter, Zhang Lei, has become a celebrity overnight. However, according to interpretive theory, the first interpretation theory, initiated by the French translator Danica Seleskovitch, there seems to be much more room for improving the interpretation of renxing.

  12. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

  13. Statistics at square one

    CERN Document Server

    Campbell, M J

    2011-01-01

    The new edition of this international bestseller continues to throw light on the world of statistics for health care professionals and medical students. Revised throughout, the 11th edition features new material in the areas of: relative risk, absolute risk and numbers needed to treat; diagnostic tests, sensitivity, specificity and ROC curves; free statistical software. The popular self-testing exercises at the end of every chapter are strengthened by the addition of new sections on reading and reporting statistics and formula appreciation.

  14. Contributions to industrial statistics

    OpenAIRE

    2015-01-01

    This thesis is about statistics' contributions to industry. It is an article compendium comprising four articles divided into two blocks: (i) two contributions for a water supply company, and (ii) significance of the effects in Design of Experiments. In the first block, great emphasis is placed on how research design and statistics can be applied to various real problems that a water company raises, and it aims to convince water management companies that statistics can be very useful to impr...

  15. Optimization techniques in statistics

    CERN Document Server

    Rustagi, Jagdish S

    1994-01-01

    Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza

  16. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  17. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  18. Mathematical statistics with applications

    CERN Document Server

    Wackerly, Dennis D; Scheaffer, Richard L

    2008-01-01

    In their bestselling MATHEMATICAL STATISTICS WITH APPLICATIONS, premiere authors Dennis Wackerly, William Mendenhall, and Richard L. Scheaffer present a solid foundation in statistical theory while conveying the relevance and importance of the theory in solving practical problems in the real world. The authors' use of practical applications and excellent exercises helps you discover the nature of statistics and understand its essential role in scientific research.

  19. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  20. Measuring statistical heterogeneity: The Pietra index

    Science.gov (United States)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2010-01-01

    There are various ways of quantifying the statistical heterogeneity of a given probability law: Statistics uses variance, which measures the law’s dispersion around its mean; Physics and Information Theory use entropy, which measures the law’s randomness; Economics uses the Gini index, which measures the law’s egalitarianism. In this research we explore an alternative to the Gini index, the Pietra index, which is a counterpart of the Kolmogorov-Smirnov statistic. The Pietra index is shown to be a natural and elemental measure of statistical heterogeneity, which is especially useful in the case of asymmetric and skewed probability laws, and in the case of asymptotically Paretian laws with finite mean and infinite variance. Moreover, the Pietra index is shown to have immediate and fundamental interpretations within the following applications: renewal processes and continuous time random walks; infinite-server queueing systems and shot noise processes; financial derivatives. The interpretation of the Pietra index within the context of financial derivatives implies that derivative markets, in effect, use the Pietra index as their benchmark measure of statistical heterogeneity.
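    For a finite sample, the Pietra index (also known as the Hoover index) has a simple closed form: it equals half the mean absolute deviation from the mean, divided by the mean. The sketch below illustrates this sample version; the function name `pietra_index` is our own choice, not taken from the abstract above.

    ```python
    import numpy as np

    def pietra_index(x):
        """Sample Pietra (Hoover) index: the maximum vertical distance
        between the Lorenz curve and the line of perfect equality,
        which equals E|X - mu| / (2 * mu) for a sample with mean mu."""
        x = np.asarray(x, dtype=float)
        mu = x.mean()
        return np.abs(x - mu).mean() / (2.0 * mu)

    # Perfectly equal values: no heterogeneity, index is 0
    print(pietra_index([5, 5, 5, 5]))       # 0.0
    # A highly skewed sample: most of the mass sits on one value
    print(pietra_index([1, 1, 1, 97]))      # 0.72
    ```

    Like the Gini index, the result lies between 0 (perfect equality) and values approaching 1 (extreme concentration), which is what makes it usable as a benchmark measure of heterogeneity.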

  1. Statistical discrete geometry

    CERN Document Server

    Ariwahjoedi, Seramika; Kosasih, Jusak Sali; Rovelli, Carlo; Zen, Freddy Permana

    2016-01-01

    Following our earlier work, we construct statistical discrete geometry by applying statistical mechanics to discrete (Regge) gravity. We propose a coarse-graining method for discrete geometry under the assumptions of atomism and background independence. To maintain these assumptions, restrictions are imposed on the theory by introducing cut-offs in both the ultraviolet and infrared regimes. Having a well-defined statistical picture of discrete Regge geometry, we take the infinite degrees of freedom (large n) limit. We argue that the correct limit consistent with the restrictions and the background independence concept is not the continuum limit of statistical mechanics, but the thermodynamical limit.

  2. Improved Statistics Handling

    OpenAIRE

    2009-01-01

    Ericsson is a global provider of telecommunications systems equipment and related services for mobile and fixed network operators.  3Gsim is a tool used by Ericsson in tests of the 3G RNC node. In order to validate the tests, statistics are constantly gathered within 3Gsim, and users can access them over telnet using some system-specific 3Gsim commands. The statistics can be retrieved but are unstructured for the human eye and need parsing and arranging to be readable.  The statist...

  3. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  4. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  5. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  6. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  7. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  8. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  9. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  16. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  17. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  18. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
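    The core idea described above, drawing samples from a posterior when only the (unnormalized) shape of its density is known, can be illustrated with a random-walk Metropolis sampler. This is a generic sketch of the technique, not code from the book; the function name `metropolis` and the step size are our own choices.

    ```python
    import math
    import random

    def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
        """Random-walk Metropolis: sample from a density known only up
        to a normalizing constant, via its log density."""
        rng = random.Random(seed)
        x, out = x0, []
        for _ in range(n_samples):
            prop = x + rng.gauss(0.0, step)
            # Accept with probability min(1, p(prop) / p(x));
            # the unknown normalizing constant cancels in the ratio.
            if math.log(rng.random()) < log_density(prop) - log_density(x):
                x = prop
            out.append(x)
        return out

    # Unnormalized standard normal: log p(x) = -x^2 / 2 (constant dropped)
    samples = metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
    mean = sum(samples) / len(samples)
    print(round(mean, 2))  # close to 0, the mean of the target
    ```

    Inferences (posterior means, credible intervals) are then computed directly from the drawn samples, exactly as the blurb describes.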

  19. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  20. Record Statistics and Dynamics

    DEFF Research Database (Denmark)

    Sibani, Paolo; Jensen, Henrik J.

    2009-01-01

    The term record statistics covers the statistical properties of records within an ordered series of numerical data obtained from observations or measurements. A record within such series is simply a value larger (or smaller) than all preceding values. The mathematical properties of records strongly...... fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations....
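    The definition above, a record is a value exceeding every preceding value in the series, is easy to make concrete. The sketch below (function name `record_times` is our own) returns the positions at which upper records occur:

    ```python
    def record_times(series):
        """Return the indices at which a new (upper) record is set,
        i.e. positions whose value exceeds every preceding value."""
        records, best = [], float("-inf")
        for i, x in enumerate(series):
            if x > best:       # strictly larger than all earlier values
                records.append(i)
                best = x
        return records

    print(record_times([3, 1, 4, 1, 5, 9, 2, 6]))  # [0, 2, 4, 5]
    ```

    For an i.i.d. series with a continuous distribution, the probability that position n (0-indexed) sets a record is 1/(n+1), so the expected number of records in a series of length N grows only logarithmically, which is why record events become increasingly rare as the series proceeds.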