WorldWideScience

Sample records for define history measurements

  1. Defining and Measuring User Experience

    DEFF Research Database (Denmark)

    Stage, Jan

    2006-01-01

    User experience is being used to denote what a user goes through while using a computerized system. The concept has gained momentum as a means to distinguish new types of applications such as games and entertainment software from more traditional work-related applications. This paper focuses … on the intrinsic relation between definition and measurement. In the area of usability, this relation has been developed over several years. It is described how usability is defined and measured in contemporary approaches. Based on that, it is discussed to what extent we can employ experience from the conceptual … definition of usability to develop the notion of user experience…

  2. Defining Astrology in Ancient and Classical History

    Science.gov (United States)

    Campion, Nicholas

    2015-05-01

    Astrology in the ancient and classical worlds can be partly defined by its role, and partly by the way in which scholars spoke about it. The problem is complicated by the fact that the word is Greek - it has no Babylonian or Egyptian cognates - and even in Greece it was interchangeable with its cousin, 'astronomy'. Yet if we are to understand the role of the sky, stars and planets in culture, debates about the nature of ancient astrology, by both classical and modern scholars, must be taken into account. This talk will consider modern scholars' typologies of ancient astrology, together with ancient debates from Cicero in the 1st century BC, to Plotinus (204/5-270 AD) and Isidore of Seville (c. 560 - 4 April 636). It will consider the implications for our understanding of astronomy's role in culture, and conclude that in the classical period astrology may be best understood through its diversity and allegiance to competing philosophies, and that its functions were therefore similarly varied.

  3. Defining and Measuring Academic Success

    Directory of Open Access Journals (Sweden)

    Travis T. York

    2015-03-01

    Despite, and perhaps because of, its amorphous nature, the term 'academic success' is one of the most widely used constructs in educational research and assessment within higher education. This paper conducts an analytic literature review to examine the use and operationalization of the term in multiple academic fields. Dominant definitions of the term are conceptually evaluated using Astin's I-E-O model, resulting in the proposition of a revised definition and new conceptual model of academic success. Measurements of academic success found throughout the literature are presented in accordance with this model and detailed in a user-friendly table (Appendix B). Results also indicate that grades and GPA are the most commonly used measures of academic success. Finally, recommendations are given for future research and practice to increase effective assessment of academic success.

  4. Defining and measuring environmental consciousness

    Directory of Open Access Journals (Sweden)

    Jiménez Sánchez, Manuel

    2010-09-01

    Based on a review of the main analytical approaches found in the literature, in this paper we establish a multidimensional and behaviour-oriented definition of environmental consciousness. We propose a method to operationalize this definition, with the final aim of obtaining summary measures (or indexes) of this phenomenon that can be applied to different social contexts and time periods. Data from a survey on environmental attitudes and behaviour conducted in 2004 among Andalusians (Ecobarómetro de Andalucía 2004) are used as the empirical basis for the proposed operationalization. The resulting measures are then employed to identify social groups according to the diverse forms of their environmental consciousness and to explore their basic socio-demographic profiles.

    Starting from the main analytical approaches found in the literature, this paper establishes a multidimensional, behaviour-oriented definition of environmental consciousness and proposes a method for operationalizing it, with the aim of producing summary measures of this phenomenon across different social contexts. The proposed operationalization draws its empirical basis from the results of the Ecobarómetro de Andalucía (EBA 2004). The resulting indicators are then used to identify distinct social groups according to the nature of their environmental consciousness.

  5. Twelve defining moments in the history of alcoholics anonymous.

    Science.gov (United States)

    White, William L; Kurtz, Ernest

    2008-01-01

    Misconceptions about Alcoholics Anonymous (AA) abound in spite of (or because of) the thousands of theses, dissertations, books, professional and popular articles, and Internet commentaries that have been written about AA. One of the most pervasive characterizations of AA is that it is a "treatment" for alcoholism--a characterization that distorts the meaning of both mutual aid and alcoholism treatment. This article describes 12 character-defining moments in the history of AA that highlight the differences between AA and alcoholism treatment.

  6. Defining and measuring pilot mental workload

    Science.gov (United States)

    Kantowitz, Barry H.

    1988-01-01

    A theory is sought that is general enough to help the researcher deal with a wide range of situations involving pilot mental stress. A limited capacity theory of attention forms the basis for the theory. Mental workload is then defined as an intervening variable, similar to attention, that modulates or indexes the tuning between the demands of the environment and the capacity of the organism. Two methods for measuring pilot mental workload are endorsed: (1) objective measures based on secondary tasks; and (2) psychophysiological measures, which have not yet been perfected but which will become more useful as theoretical models are refined. Secondary-task research is illustrated by simulator studies in which flying performance has been shown not to be adversely affected by adding a complex choice-reaction secondary task.

  7. Trait variation in yeast is defined by population history.

    Directory of Open Access Journals (Sweden)

    Jonas Warringer

    2011-06-01

    A fundamental goal in biology is to achieve a mechanistic understanding of how and to what extent ecological variation imposes selection for distinct traits and favors the fixation of specific genetic variants. Key to such an understanding is the detailed mapping of the natural genomic and phenomic space and a bridging of the gap that separates these worlds. Here we chart a high-resolution map of natural trait variation in one of the most important genetic model organisms, the budding yeast Saccharomyces cerevisiae, and its closest wild relatives and trace the genetic basis and timing of major phenotype-changing events in its recent history. We show that natural trait variation in S. cerevisiae exceeds that of its relatives, despite limited genetic variation, and follows the population history rather than the source environment. In particular, the West African population is phenotypically unique, with an extreme abundance of low-performance alleles, notably a premature translational termination signal in GAL3 that causes an inability to utilize galactose. Our observations suggest that many S. cerevisiae traits may be the consequence of genetic drift rather than selection, in line with the assumption that natural yeast lineages are remnants of recent population bottlenecks. Disconcertingly, the universal type strain S288C was found to be highly atypical, highlighting the danger of extrapolating gene-trait connections obtained in mosaic, lab-domesticated lineages to the species as a whole. Overall, this study represents a step towards an in-depth understanding of the causal relationship between co-variation in ecology, selection pressure, natural traits, molecular mechanism, and alleles in a key model organism.

  8. Convolution operators defined by singular measures on the motion group

    CERN Document Server

    Brandolini, Luca; Thangavelu, Sundaram; Travaglini, Giancarlo

    2010-01-01

    This paper contains an $L^{p}$ improving result for convolution operators defined by singular measures associated to hypersurfaces on the motion group. This needs only mild geometric properties of the surfaces, and it extends earlier results on Radon type transforms on $\mathbb{R}^{n}$. The proof relies on the harmonic analysis on the motion group.
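
    The record does not state the exponent pair; for orientation, an $L^p$-improving estimate for convolution with a singular measure $\mu$ has the generic form (a standard formulation, not quoted from the paper):

      \[
        \| f * \mu \|_{L^{q}} \;\le\; C \, \| f \|_{L^{p}} \qquad \text{for some } q > p,
      \]

    which is nontrivial precisely because $\mu$ is singular with respect to the ambient (Haar) measure.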

  9. What Is Violence Against Women? Defining and Measuring the Problem

    Science.gov (United States)

    Kilpatrick, Dean G.

    2004-01-01

    Violence against women (VAW) is a prevalent problem with substantial physical and mental health consequences throughout the world, and sound public policy is dependent on having good measures of VAW. This article (a) describes and contrasts criminal justice and public health approaches toward defining VAW, (b) identifies major controversies…

  10. Defining and measuring successful emergency care networks: a research agenda.

    Science.gov (United States)

    Glickman, Seth W; Kit Delgado, M; Hirshon, Jon Mark; Hollander, Judd E; Iwashyna, Theodore J; Jacobs, Alice K; Kilaru, Austin S; Lorch, Scott A; Mutter, Ryan L; Myers, Sage R; Owens, Pamela L; Phelan, Michael P; Pines, Jesse M; Seymour, Christopher W; Ewen Wang, N; Branas, Charles C

    2010-12-01

    The demands on emergency services have grown relentlessly, and the Institute of Medicine (IOM) has asserted the need for "regionalized, coordinated, and accountable emergency care systems throughout the country." There are large gaps in the evidence base needed to fix the problem of how emergency care is organized and delivered, and science is urgently needed to define and measure success in the emerging network of emergency care. In 2010, Academic Emergency Medicine convened a consensus conference entitled "Beyond Regionalization: Integrated Networks of Emergency Care." This article is a product of the conference breakout session on "Defining and Measuring Successful Networks"; it explores the concept of integrated emergency care delivery and prioritizes a research agenda for how to best define and measure successful networks of emergency care. The authors discuss five key areas: 1) the fundamental metrics that are needed to measure networks across time-sensitive and non-time-sensitive conditions; 2) how networks can be scalable and nimble and can be creative in terms of best practices; 3) the potential unintended consequences of networks of emergency care; 4) the development of large-scale, yet feasible, network data systems; and 5) the linkage of data systems across the disease course. These knowledge gaps must be filled to improve the quality and efficiency of emergency care and to fulfill the IOM's vision of regionalized, coordinated, and accountable emergency care systems. © 2010 by the Society for Academic Emergency Medicine.

  11. Defining and measuring irritability: Construct clarification and differentiation.

    Science.gov (United States)

    Toohey, Michael J; DiGiuseppe, Raymond

    2017-04-01

    Irritability is a symptom of 15 disorders in the DSM-5 and is included in Mood Disorders, Addictive Disorders, Personality Disorders, and more (American Psychiatric Association, 2013). However, the term irritability is defined and measured inconsistently in the scholarly literature. In this article, we reviewed the scholarly definitions of irritability and the item content of irritability measures. Components of definitions and items measuring irritability were divided into three categories: a) causes, b) experience, and c) consequences. We also reviewed potential causes and biomarkers of irritability. We found much overlap between definitions and measures of irritability and related constructs such as anger and aggression. Consequently, the validity of research on irritability needs to be questioned, including the role of irritability in psychopathology and the presence of irritability as a symptom in any disorder. Research on irritability's role in behavioral disorders needs to be repeated after better-defined measures are developed. We proposed a more precise definition of irritability that clearly differentiates it from related constructs. Suggested items for measuring irritability are also provided. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. It is what it eats: Chemically defined media and the history of surrounds.

    Science.gov (United States)

    Landecker, Hannah

    2016-06-01

    The cultivation of living organs, cells, animals, and embryos in the laboratory has been central to the production of biological knowledge. Over the twentieth century, the drive to variance control in the experimental setting led to systematic efforts to generate synthetic, chemically defined substitutes for complex natural foods, housing, and other substrates of life. This article takes up the history of chemically defined media with three aims in mind. First, to characterize patterns of decontextualization, tinkering, and negotiation between life and experimenter that occur across disparate histories of cultivation. Second, to highlight the paradoxical historicity of cultivated organisms generated to be freed from context, as they incorporate and embody the purified amino acids, vitamins, plastics, and other artificial supports developed in the name of experimental control. Third, to highlight the figure-ground reversal that occurs as these cells and organisms are reconsidered as accidentally good models of life in industrialized conditions of pollution and nutrient excess, due to the man-made nature of their surrounds. Methodologically, the history of surrounds is described as an epigenetic approach that focuses on the material relations between different objects and organisms previously considered quite separately, from explanted organs to bacteria to plant cells to rats to human embryos.

  13. Software-defined Radio Based Measurement Platform for Wireless Networks.

    Science.gov (United States)

    Chao, I-Chun; Lee, Kang B; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan

    2015-10-01

    End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.

  14. Notice to Readers: Special Podcast: "Defining Moments in MMWR History - E. coli O157:H7".

    Science.gov (United States)

    2017-06-02

    MMWR has released a special podcast that highlights the leading role that MMWR played in reporting on the deadly multistate Escherichia coli O157:H7 foodborne outbreak of 1993. "Defining Moments in MMWR History - E. coli O157:H7" features an interview with Dr. Beth Bell conducted by MMWR Editor-in-Chief Dr. Sonja Rasmussen. Dr. Bell, who served as director of the National Center for Emerging and Zoonotic Infectious Diseases (NCEZID) from 2010 to 2017 and as an Epidemic Intelligence Service Officer during 1992-1994, was one of the first public health responders on the scene for this landmark public health emergency.

  15. Defining and measuring autophagosome flux—concept and reality.

    Science.gov (United States)

    Loos, Ben; du Toit, André; Hofmeyr, Jan-Hendrik S

    2014-01-01

    The autophagic system is involved both in the bulk degradation of primarily long-lived cytoplasmic proteins and in the selective degradation of cytoplasmic organelles. Autophagic flux is often defined as a measure of autophagic degradation activity, and a number of methods are currently utilized to assess autophagic flux. However, despite major advances in measuring various molecular aspects of the autophagic machinery, we remain less able to express autophagic flux in a highly sensitive, robust, and well-quantifiable manner. Here, we describe a conceptual framework for defining and measuring autophagosome flux at the single-cell level. The concept discussed here is based on the theoretical framework of metabolic control analysis, which distinguishes between the pathway along which there is a flow of material and the quantitative measure of this flow. By treating the autophagic system as a multistep pathway with each step characterized by a particular rate, we are able to provide a single-cell fluorescence live-cell imaging-based approach that describes the accurate assessment of the complete autophagosome pool size, the autophagosome flux, and the transition time required to turn over the intracellular autophagosome pool. In doing so, this perspective provides clarity on whether the system is at steady state or in a transient state moving towards a new steady state. It is hoped that this theoretical account of quantitatively measuring autophagosome flux may contribute towards a new direction in the field of autophagy, a standardized approach that allows the establishment of systematic flux databases of clinically relevant cell and tissue types that serve as important model systems for human pathologies.
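
    At steady state, the three quantities described in this record are linked by simple arithmetic. Writing $n_A$ for the autophagosome pool size, $J$ for the autophagosome flux, and $\tau$ for the transition time (symbols ours, chosen to match the description):

      \[
        J \;=\; \frac{n_A}{\tau},
      \]

    so, for example, a pool of 120 autophagosomes turned over in a transition time of 30 min corresponds to a flux of 4 autophagosomes per minute.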

  16. Reproductive history and risk of three breast cancer subtypes defined by three biomarkers.

    Science.gov (United States)

    Phipps, Amanda I; Buist, Diana S M; Malone, Kathleen E; Barlow, William E; Porter, Peggy L; Kerlikowske, Karla; Li, Christopher I

    2011-03-01

    Breast cancer subtypes defined by estrogen receptor (ER), progesterone receptor (PR), and HER2 expression are biologically distinct and thus may have distinct etiologies. In particular, it is plausible that risk factors operating through hormonal mechanisms are differentially related to risk of such tumor subtypes. Using data from the Breast Cancer Surveillance Consortium, we explored associations between reproductive history and three breast cancer subtypes. Data on parity and age at first birth were collected from 743,623 women, 10,896 of whom were subsequently diagnosed with breast cancer. Cases were classified into three subtypes based on tumor marker expression: (1) ER positive (ER+, N = 8,203), (2) ER negative/PR negative/HER2 positive (ER-/PR-/HER2+, N = 288), or (3) ER-, PR-, and HER2-negative (triple-negative, N = 645). Associations with reproductive history, evaluated using Cox regression, differed significantly across tumor subtypes. Nulliparity was most strongly associated with risk of ER+ breast cancer [hazard ratio (HR) = 1.31, 95% confidence interval (CI): 1.23-1.39]; late age at first birth was most strongly associated with risk of ER-/PR-/HER2+ disease (HR = 1.83, 95% CI: 1.31-2.56). Neither parity nor age at first birth was associated with triple-negative breast cancer. In contrast to ER+ and ER-/PR-/HER2+ subtypes, reproductive history does not appear to be a risk factor for triple-negative breast cancer.

  17. Defining pharmaceutical systems strengthening: concepts to enable measurement.

    Science.gov (United States)

    Hafner, Tamara; Walkowiak, Helena; Lee, David; Aboagye-Nyame, Francis

    2017-05-01

    Pharmaceutical products are indispensable for improving health outcomes. An extensive body of work on access to and use of medicines has resulted in an assortment of tools measuring various elements of pharmaceutical systems. Until now, however, there has been little attempt to conceptualize a pharmaceutical system as an entity and define its strengthening in a way that allows for measuring systems strengthening. The narrow focus of available tools limits their value in ascertaining which interventions result in stronger, more resilient systems. We sought to address this shortcoming by revisiting the current definitions, frameworks and assessment tools related to pharmaceutical systems. We conducted a comprehensive literature review and consulted with select pharmaceutical experts. On the basis of our review, we propose that a pharmaceutical system consists of all structures, people, resources, processes, and their interactions within the broader health system that aim to ensure equitable and timely access to safe, effective, quality pharmaceutical products and related services that promote their appropriate and cost-effective use to improve health outcomes. We further propose that pharmaceutical systems strengthening is the process of identifying and implementing strategies and actions that achieve coordinated and sustainable improvements in the critical components of a pharmaceutical system to make it more responsive and resilient and to enhance its performance for achieving better health outcomes. Finally, we established that, in addition to system performance and resilience, seven components of the pharmaceutical system are critical for measuring pharmaceutical systems strengthening: pharmaceutical products and related services; policy, laws and governance; regulatory systems; innovation, research and development, manufacturing, and trade; financing; human resources; and information. This work adds clarity to the concept of pharmaceutical systems and their…

  18. Family history: impact on coronary heart disease risk assessment beyond guideline-defined factors.

    Science.gov (United States)

    Hasanaj, Q; Wilson, B J; Little, J; Montazeri, Z; Carroll, J C

    2013-01-01

    Family history (FH) provides insights into the effects of shared genomic susceptibilities, environments and behaviors, making it a potentially valuable risk assessment tool for chronic diseases. We assessed whether coronary heart disease (CHD) risk assessment is improved when FH information is added to other clinical information recommended in guidelines. We applied logistic regression analyses to cross-sectional data originally obtained from a UK study of women who delivered a live-born infant between 1951 and 1970. We developed 3 models: Model 1 included only the covariates in a guideline applicable to the population, Model 2 added FH to Model 1, and Model 3 included a fuller range of risk factors. For each model, its ability to discriminate between study subjects with and those without CHD was evaluated and its impact on risk classification examined using the net reclassification index. FH was an independent risk factor for CHD (odds ratio = 1.7, 95% confidence interval = 1.26-2.47) and improved discrimination beyond guideline-defined clinical factors; however, it added little to risk factors typically included in guidelines. © 2013 S. Karger AG, Basel.
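
    For reference, the net reclassification index used in this record is standardly defined as (textbook form, not quoted from the paper):

      \[
        \mathrm{NRI} \;=\; \big[P(\text{up}\mid\text{event}) - P(\text{down}\mid\text{event})\big] \;+\; \big[P(\text{down}\mid\text{nonevent}) - P(\text{up}\mid\text{nonevent})\big],
      \]

    where "up" and "down" denote reclassification into a higher or lower risk category once FH is added to the model.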

  19. A measure of explained variation for event history data.

    Science.gov (United States)

    Stare, Janez; Perme, Maja Pohar; Henderson, Robin

    2011-09-01

    There is no shortage of proposed measures of prognostic value of survival models in the statistical literature. They come under different names, including explained variation, correlation, explained randomness, and information gain, but their goal is common: to define something analogous to the coefficient of determination R² in linear regression. None, however, has been uniformly accepted, none has been extended to general event history data, including recurrent events, and many cannot incorporate time-varying effects or covariates. We present here a measure specifically tailored for use with general dynamic event history regression models. The measure is applicable and interpretable in discrete or continuous time; with tied data or otherwise; with time-varying, time-fixed, or dynamic covariates; with time-varying or time-constant effects; with single or multiple event times; with parametric or semiparametric models; and under general independent censoring/observation. For single-event survival data with neither censoring nor time dependency it reduces to the concordance index. We give expressions for its population value and the variance of the estimator and explore its use in simulations and applications. A web link to R software is provided. © 2010, The International Biometric Society.
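
    In the special case the authors note, single-event survival data with no censoring and no time dependency, the measure reduces to the concordance index, which is easy to compute directly. A minimal sketch, with invented function and variable names (not the authors' R software):

      from itertools import combinations

      def concordance_index(times, risk_scores):
          """Fraction of usable pairs in which the subject with the higher
          predicted risk has the earlier event. Assumes no censoring."""
          concordant, usable = 0.0, 0
          for (t_i, r_i), (t_j, r_j) in combinations(zip(times, risk_scores), 2):
              if t_i == t_j:
                  continue                 # tied times carry no ordering information
              usable += 1
              if r_i == r_j:
                  concordant += 0.5        # tied risk scores count half
              elif (t_i < t_j) == (r_i > r_j):
                  concordant += 1          # earlier event paired with higher risk
          return concordant / usable

      # toy example: predicted risk perfectly orders the event times
      print(concordance_index([2, 5, 9], [0.9, 0.4, 0.1]))  # -> 1.0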

  1. Defining and Measuring Entrepreneurship for Regional Research: A New Approach

    Science.gov (United States)

    Low, Sarah A.

    2009-01-01

    In this dissertation, I develop a definition and regional measure of entrepreneurship that will aid entrepreneurship research and economic development policy. My new indicators represent an improvement over current measures of entrepreneurship. The chief contribution of these new indicators is that they incorporate innovation, which others ignore.…

  2. Defining and Computing a Value Based Cyber-Security Measure

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T [ORNL]; Mili, Ali [New Jersey Institute of Technology]

    2012-01-01

    In earlier work, we presented a value based measure of cybersecurity that quantifies the security of a system in concrete terms, specifically, in terms of how much each system stakeholder stands to lose (in dollars per hour of operation) as a result of security threats and system vulnerabilities; our metric varies according to the stakes that each stakeholder has in meeting each security requirement. In this paper, we discuss the specification and design of a system that collects, updates, and maintains all the information that pertains to estimating our cybersecurity measure, and offers stakeholders quantitative means to make security-related decisions.
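
    A minimal sketch of this style of metric: each stakeholder's expected loss per hour is their stake in each security requirement weighted by the estimated probability that the requirement fails. The matrices below are illustrative assumptions, not values from the paper:

      import numpy as np

      # stakes[i, j]: dollars/hour stakeholder i loses if security requirement j fails
      stakes = np.array([[120.0, 40.0],    # e.g., system operator
                         [ 15.0, 90.0]])   # e.g., customer
      # p_fail[j]: estimated probability per hour of operation that requirement j
      # fails, aggregated from system vulnerabilities and threat likelihoods
      p_fail = np.array([0.002, 0.010])

      mean_loss = stakes @ p_fail   # expected dollars/hour lost, per stakeholder
      print(mean_loss)              # -> [0.64 0.93]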

  3. Defining and Computing a Value Based Cyber-Security Measure

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T [ORNL]; Mili, Ali [New Jersey Institute of Technology]

    2011-01-01

    In past work, we presented a value based measure of cybersecurity that quantifies the security of a system in concrete terms, specifically, in terms of how much each system stakeholder stands to lose (in dollars per hour of operation) as a result of security threats and system vulnerabilities; our metric varies according to the stakes that each stakeholder has in meeting each security requirement. In this paper we discuss the specification and design of a system that collects, updates and maintains all the information that pertains to estimating our cybersecurity measure, and offers stakeholders quantitative means to make security-related decisions.

  4. Teenage Nonviolence: How Do We Define and Measure It?

    Science.gov (United States)

    Mayton, Daniel M., II

    With the rise of violent teenage crime, with an alarming number of child soldiers across the globe, and with the continually increasing number of children and adolescents who are victimized by violence and war, an instrument that measures nonviolent tendencies would be very useful. The Teenage Nonviolence Test (TNT) was recently developed and…

  5. Spasticity, an impairment that is poorly defined and poorly measured

    NARCIS (Netherlands)

    Malhotra, S.; Pandyan, A.D.; Day, C.R.; Jones, V.M.; Hermens, H.J.

    2009-01-01

    Objective: To explore, following a literature review, whether there is a consistent definition and a unified assessment framework for the term 'spasticity'. The congruence between the definitions of spasticity and the corresponding methods of measurement was also explored. Data sources: The search…

  6. The measurement of water scarcity: Defining a meaningful indicator.

    Science.gov (United States)

    Damkjaer, Simon; Taylor, Richard

    2017-09-01

    Metrics of water scarcity and stress have evolved over the last three decades from simple threshold indicators to holistic measures characterising human environments and freshwater sustainability. Metrics commonly estimate renewable freshwater resources using mean annual river runoff, which masks hydrological variability, and subjectively quantify socio-economic conditions characterising adaptive capacity. There is a marked absence of research evaluating whether these metrics of water scarcity are meaningful. We argue that measurement of water scarcity (1) be redefined physically in terms of the freshwater storage required to address imbalances in intra- and inter-annual fluxes of freshwater supply and demand; (2) abandon subjective quantifications of human environments; and (3) be used to inform participatory decision-making processes that explore a wide range of options for addressing freshwater storage requirements beyond dams, including the use of renewable groundwater, soil water and trading in virtual water. Further, we outline a conceptual framework redefining water scarcity in terms of freshwater storage.
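
    The physical redefinition proposed here, storage sufficient to cover imbalances between supply and demand fluxes, can be made concrete with a running-deficit calculation in the style of a sequent-peak analysis. A sketch under our own simplifying assumptions (monthly fluxes, no evaporation or seepage losses):

      def required_storage(supply, demand):
          """Smallest store that bridges every cumulative shortfall of
          supply relative to demand (both sequences in the same units)."""
          deficit, worst = 0.0, 0.0
          for s, d in zip(supply, demand):
              deficit = max(0.0, deficit + d - s)  # carry unmet demand forward
              worst = max(worst, deficit)
          return worst

      # toy year: wet first half, dry second half, flat demand of 10 per month
      supply = [18, 16, 14, 12, 10, 8, 6, 5, 6, 8, 12, 16]
      demand = [10] * 12
      print(required_storage(supply, demand))  # -> 17.0

    For the toy year shown, the worst cumulative dry-season shortfall, and hence the required storage, is 17 units.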

  7. Defining and Computing a Value Based Cyber Security Measure

    Energy Technology Data Exchange (ETDEWEB)

    Aissa, Anis Ben [University of Tunis, Belvedere, Tunisia]; Abercrombie, Robert K [ORNL]; Sheldon, Frederick T [ORNL]; Mili, Ali [New Jersey Institute of Technology]

    2011-01-01

    In earlier works (Ben-Aissa et al. 2010; Abercrombie et al. 2008; Sheldon et al. 2009), we presented a value based measure of cybersecurity that quantifies the security of a system in concrete terms, specifically, in terms of how much each system stakeholder stands to lose (in dollars per hour of operation) as a result of security threats and system vulnerabilities; our metric varies according to the stakes that each stakeholder has in meeting each security requirement. In this paper, we discuss the specification and design of a system that collects, updates, and maintains all the information that pertains to estimating our cybersecurity measure, and offers stakeholders quantitative means to make security-related decisions.

  8. Defining and Measuring Coastal Vulnerability and Resilience to Natural Hazards

    Science.gov (United States)

    Becker, M. K.; Hoagland, P.

    2014-12-01

    Accounting for an estimated 23 percent of the world's population, coastal communities face many types of natural hazards. In particular, they may be vulnerable to the effects of tropical cyclones, flooding due to tsunamis or storm surges, erosion, saltwater intrusion, and subsidence. These coastal hazards are further exacerbated by population growth and climate change. There is a lack of consensus in the literature about what constitutes vulnerability (negative impacts) and resilience (recovery from negative impacts) and how to measure these phenomena. While some important work has focused on the long-term effects of coastal hazards on economic growth, little has been done to understand, in quantitative terms, the extent to which coastal communities may be vulnerable to such hazards and, if so, whether they can be resilient. We surveyed nine indicators of human well-being in order to determine their potential suitability as measures of coastal vulnerability or resilience. Some measures, such as the Gross Domestic Product, the Human Development Index, and the Gini coefficient, comprise economic or distributional indicators of human welfare; others, such as the Social Vulnerability Index, are more complex and difficult to interpret. We selected per capita personal income as the most viable indicator, due largely to its simplicity and its availability over several decades. We used it to examine human community vulnerability and resilience to a specific coastal hazard—significant storm surges from major coastal hurricanes—in several US coastal metropolitan areas. We compiled data on per capita personal income from the US Bureau of Economic Analysis for 15 to 20 years prior and subsequent to four major hurricanes: Hugo, which hit the Charleston, South Carolina, metropolitan area in 1989; Bob, Cape Cod, Massachusetts, in 1991; Andrew, Miami, Florida, in 1992; and Opal, Pensacola, Florida, in 1995. Intervention analysis using linear regression suggests that these…
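
    Intervention analysis of the kind described, asking whether income level or trend shifted after a hurricane, can be set up as an interrupted time-series regression. A toy sketch with invented data (numpy only; the specification is ours, not necessarily the authors'):

      import numpy as np

      # per capita income (invented), one value per year; event after year 10
      income = np.array([20, 21, 21, 22, 23, 24, 24, 25, 26, 27,   # before
                         25, 25, 26, 26, 27, 27, 28, 28, 29, 29],  # after
                        dtype=float)
      t = np.arange(len(income), dtype=float)
      after = (t >= 10).astype(float)

      # income = b0 + b1*t + b2*(level shift) + b3*(post-event trend change)
      X = np.column_stack([np.ones_like(t), t, after, after * (t - 10)])
      b, *_ = np.linalg.lstsq(X, income, rcond=None)
      print(f"level shift: {b[2]:.2f}, trend change: {b[3]:.2f} per year")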

  9. The Cubit: A History and Measurement Commentary

    Directory of Open Access Journals (Sweden)

    Mark H. Stone

    2014-01-01

    Historical dimensions for the cubit are provided by scripture and pyramid documentation. Additional dimensions from the Middle East are found in other early documents. Two major dimensions emerge from a history of the cubit. The first is the anthropological or short cubit, and the second is the architectural or long cubit. The wide geographical area and long chronological period suggest that cubit dimensions varied over time and geographic area. Greek and Roman conquests led to standardization. More recent dimensions are provided from a study by Francis Galton based upon his investigations into anthropometry. The subjects for Galton’s study and those of several other investigators lacked adequate sample descriptions for producing a satisfactory cubit/forearm dimension. This finding is not surprising given the demise of the cubit in today’s world. Contemporary dimensions from military and civilian anthropometry for the forearm and hand allow comparison to the ancient unit. Although there appears no pressing need for a forearm-hand/cubit dimension, the half-yard or half-meter unit seems a useful one that could see more application.

  10. Measuring metamorphic history of unequilibrated ordinary chondrites

    Science.gov (United States)

    Sears, D. W.; Grossman, J. N.; Melcher, C. L.; Ross, L. M.; Mills, A. A.

    1980-10-01

    Measurements, by a thermoluminescence sensitivity technique, of the degree of metamorphism experienced by unequilibrated ordinary chondrites are reported. Samples of type 3 chondrites were ground and heated to 500 C to remove their natural thermoluminescence, then irradiated with either 50 krad from a Co-60 gamma ray source or 25 krad from a Sr-90 beta source. The resulting thermoluminescence measured as a function of temperature is found to differ as much among some type 3 chondrites as between type 3 and other types, leading to the proposal of a scheme for subdividing type 3 ordinary chondrites based on their thermoluminescence sensitivity.

  11. Decoherent histories approach to the cosmological measure problem

    CERN Document Server

    Lloyd, Seth

    2016-01-01

    The method of decoherent histories allows probabilities to be assigned to sequences of quantum events in systems, such as the universe as a whole, where there is no external observer to make measurements. This paper applies the method of decoherent histories to address cosmological questions. Using a series of simple examples, beginning with the harmonic oscillator, we show that systems in a stationary state such as an energy eigenstate or thermal state can exhibit decoherent histories with non-trivial dynamics. We then examine decoherent histories in a universe that undergoes eternal inflation. Decoherent histories that assign probabilities to sequences of events in the vicinity of a timelike geodesic supply a natural cosmological measure. Under reasonable conditions, such sequences of events do not suffer from the presence of unlikely statistical fluctuations that mimic reality.
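
    For reference, the consistency condition invoked here is standardly expressed through the decoherence functional (standard formalism, not specific to this record): for class operators $C_\alpha$ representing histories and an initial state $\rho$,

      \[
        D(\alpha,\alpha') \;=\; \mathrm{Tr}\!\left[ C_{\alpha}\, \rho\, C_{\alpha'}^{\dagger} \right],
      \]

    probabilities $p(\alpha) = D(\alpha,\alpha)$ being assigned only when the interference terms are negligible, $\operatorname{Re} D(\alpha,\alpha') \approx 0$ for $\alpha \neq \alpha'$.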

  12. Shell Measuring Machine. History and Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Birchler, Wilbur D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Fresquez, Philip R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2000-06-01

    Commercialization of the Ring Rotacon Shell Measuring Machine project is a CRADA (No. LA98C10358) between The University of California (Los Alamos National Laboratory) and Moore Tool Company, Bridgeport, CT. The actual work started on this CRADA in December of 1998. Several meetings were held with the interested parties (Los Alamos, Oak Ridge, Moore Tool, and the University of North Carolina). The results of these meetings were that the original Ring Rotacon did not measure up to the requirements of the Department of Energy and private industry, and a new configuration was investigated. This new configuration (Shell Measuring Machine [SMM]) much better fits the needs of all parties. The work accomplished on the Shell Measuring Machine in FY 99 includes the following: specifications for size and weight were developed; performance error budgets were established; designs were developed; analyses were performed (stiffness and natural frequency); existing part designs were compared to the working SMM volume; peer reviews were conducted; controller requirements were studied; fixture requirements were evaluated; and machine motions were analyzed. The consensus of the Peer Review Committee was that the new configuration has the potential to satisfy the shell inspection needs of the Department of Energy as well as several commercial customers. They recommended that more analyses be performed on error budgets, structural stiffness, natural frequency, and thermal effects and that operational processes be developed. Several design issues need to be addressed. They are the type of bearings utilized to support the tables (air bearings or mechanical roller type bearings), the selection of the probes, the design of the probe sliding mechanisms, and the design of the upper table positioning mechanism. Each item has several possible solutions, and more work is required to obtain the best design. This report includes the background and technical objectives; minutes of the working…

  13. Antigen Exposure History Defines CD8 T Cell Dynamics and Protection during Localized Pulmonary Infections

    Science.gov (United States)

    Van Braeckel-Budimir, Natalija; Martin, Matthew D.; Hartwig, Stacey M.; Legge, Kevin L.; Badovinac, Vladimir P.; Harty, John T.

    2017-01-01

    Unlike systemic infections, little is known about the role of repeated localized infections in (re)shaping pathogen-specific memory CD8 T cell responses. Here, we used primary (1°) and secondary (2°) intranasal influenza virus infections of mice as a model to study intrinsic memory CD8 T cell properties. We show that secondary antigen exposure, relative to a single infection, generates memory CD8 T cell responses of superior magnitude in multiple tissue compartments including blood, spleen, draining lymph nodes, and lung. Unexpectedly, regardless of the significantly higher number of 2° memory CD8 T cells, a similar degree of protection against pulmonary challenge was observed in both groups of mice containing 1° or 2° memory CD8 T cells. Mechanistically, using pertussis toxin-induced migration block, we showed that superior antigen-driven proliferation and the ability to relocate to the site of infection allowed 1° memory CD8 T cells to accumulate in the infected lung during the first few days after challenge, compensating for the initially lower cell numbers. Taken together, the history of antigen exposures to localized pulmonary infections, through altering basic cell biology, dictates the dynamic properties of protective memory CD8 T cell responses. This knowledge has important implications for the design of novel vaccines and the improvement of existing vaccines and immunization strategies. PMID:28191007

  14. The role of fecundity and reproductive effort in defining life-history strategies of North American freshwater mussels.

    Science.gov (United States)

    Haag, Wendell R

    2013-08-01

    Selection is expected to optimize reproductive investment, resulting in characteristic trade-offs among traits such as brood size, offspring size, somatic maintenance, and lifespan; relative patterns of energy allocation to these functions are important in defining life-history strategies. Freshwater mussels are a diverse and imperiled component of aquatic ecosystems, but little is known about their life-history strategies, particularly patterns of fecundity and reproductive effort. Because mussels have an unusual life cycle in which larvae (glochidia) are obligate parasites on fishes, differences in host relationships are expected to influence patterns of reproductive output among species. I investigated fecundity and reproductive effort (RE) and their relationships to other life-history traits for a taxonomically broad cross section of North American mussel diversity. Annual fecundity of North American mussel species spans nearly four orders of magnitude, reaching more than 200,000 in the most fecund species. Estimates of RE also were highly variable, ranging among species from 0.06 to 25.4%. Median fecundity and RE differed among phylogenetic groups, but patterns for these two traits differed in several ways. For example, the tribe Anodontini had relatively low median fecundity but had the highest RE of any group. Within and among species, body size was a strong predictor of fecundity and explained a high percentage of variation in fecundity among species. Fecundity showed little relationship to other life-history traits including glochidial size, lifespan, brooding strategies, or host strategies. The only apparent trade-off evident among these traits was the extraordinarily high fecundity of Leptodea, Margaritifera, and Truncilla, which may come at a cost of greatly reduced glochidial size; there was no relationship between fecundity and glochidial size for the remaining 61 species in the dataset. In contrast to fecundity, RE showed evidence of a strong trade-off with lifespan, which was…

  15. History and progress on accurate measurements of the Planck constant

    Science.gov (United States)

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10⁻³⁴ J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred-year-old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10⁸ from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the improved…
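
    The electrical route to $h$ sketched in this record rests on two quantum-electrical relations and the watt balance's power equality (standard relations, summarized here for orientation):

      \[
        K_J = \frac{2e}{h} \quad\text{(Josephson)}, \qquad R_K = \frac{h}{e^{2}} \quad\text{(quantum Hall)}, \qquad m g v = U I,
      \]

    so that when the voltage $U$ and current $I$ are realized against Josephson and quantum-Hall standards, the balance equation reduces to $m g v \propto f_1 f_2\, h$ with $f_1, f_2$ measured frequencies, letting $h$ be determined from mechanical quantities.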

  16. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States)]; Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)]

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test insures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to that of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
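
    The abstract names the three indices but not their formulas. A plausible minimal sketch of such a triple, shape via correlation, bias via mean signed error, and magnitude via root-mean-square error (our choice of statistics, not necessarily the paper's):

      import numpy as np

      def match_quality(simulated, historical):
          sim = np.asarray(simulated, dtype=float)
          hist = np.asarray(historical, dtype=float)
          resid = sim - hist
          return {
              "shape": np.corrcoef(sim, hist)[0, 1],      # 1.0 = identical shape
              "bias": resid.mean(),                       # ~0 = no systematic error
              "deviation": np.sqrt((resid ** 2).mean()),  # small = close match
          }

      print(match_quality([10, 9, 7, 6], [10, 8, 7, 5]))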

  17. Are self-report measures able to define individuals as physically active or inactive?

    NARCIS (Netherlands)

    Steene-Johannessen, J.; Anderssen, S.A.; Ploeg, H.P. van der; Hendriksen, I.J.M.; Donnelly, A.E.; Brage, S.; Ekelund, U.

    2016-01-01

    Purpose: Assess the agreement of commonly used self-report methods with objectively measured physical activity (PA) in defining the prevalence of individuals compliant with PA recommendations. Methods: Time spent in moderate and vigorous PA (MVPA) was measured at two time points in 171…

  18. System Energy Assessment (SEA), Defining a Standard Measure of EROI for Energy Businesses as Whole Systems

    OpenAIRE

    2011-01-01

    A more objective method for measuring the energy needs of businesses, System Energy Assessment (SEA), identifies the natural boundaries of businesses as self-managing net-energy systems, of controlled and self-managing parts. The method is demonstrated using a model Wind Farm case study, and applied to defining a true physical measure of its energy productivity for society (EROI-S), the global ratio of energy produced to energy cost. The traceable needs of business technology are combined wit...

  19. How to Define the Equality of Durations in Measurement of Time

    Institute of Scientific and Technical Information of China (English)

    ZHAO Zheng; TIAN Gui-Hua; LIU Liao; GAO Si-Jie

    2006-01-01

    We develop the research on the measurement of time carried out by Poincaré, Einstein, Landau and other researchers. Based on the convention that the velocity of light is isotropic and is a constant in empty spacetime, we not only answer the question about the definition of the synchronization of the rates of clocks located at different places, but also find the solution to the issue of how to define the equality of two durations in measurement of time.
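
    The convention the authors build on is the standard light-signal procedure: a signal leaves clock A at $t_A$, is reflected at B at $t_B$, and returns to A at $t'_A$; assuming isotropic light speed one sets

      \[
        t_B \;=\; t_A + \tfrac{1}{2}\,\big(t'_A - t_A\big),
      \]

    so that equal out-and-back light travel times define distant simultaneity and, by repetition, the equality of two durations.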

  1. Incompatible multiple consistent sets of histories and measures of quantumness

    Science.gov (United States)

    Halliwell, J. J.

    2017-07-01

    In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.

  2. Toward defining and measuring social accountability in graduate medical education: a stakeholder study.

    Science.gov (United States)

    Reddy, Anjani T; Lazreg, Sonia A; Phillips, Robert L; Bazemore, Andrew W; Lucan, Sean C

    2013-09-01

    Since 1965, Medicare has publicly financed graduate medical education (GME) in the United States. Given public financing, various advisory groups have argued that GME should be more socially accountable. Several efforts are underway to develop accountability measures for GME that could be tied to Medicare payments, but it is not clear how to measure or even define social accountability. We explored how GME stakeholders perceive, define, and measure social accountability. Through purposive and snowball sampling, we completed semistructured interviews with 18 GME stakeholders from GME training sites, government agencies, and health care organizations. We analyzed interview field notes and audiorecordings using a flexible, iterative, qualitative group process to identify themes. Three themes emerged with regard to defining social accountability: (1) creating a diverse physician workforce to address regional needs and primary care and specialty shortages; (2) ensuring quality in training and care to best serve patients; and (3) providing service to surrounding communities and the general public. All but 1 stakeholder believed GME institutions have a responsibility to be socially accountable. Reported barriers to achieving social accountability included training time constraints, financial limitations, and institutional resistance. Suggestions for measuring social accountability included reviewing graduates' specialties and practice locations, evaluating curricular content, and reviewing program services to surrounding communities. Most stakeholders endorsed the concept of social accountability in GME, suggesting definitions and possible measures that could inform policy makers' calls for increased accountability despite recognized barriers.

  3. Towards sustainability of health information systems: how can we define, measure and achieve it?

    Science.gov (United States)

    Garde, Sebastian; Hullin, Carola M; Chen, Rong; Schuler, Thilo; Gränz, Jana; Knaup, Petra; Hovenga, Evelyn J S

    2007-01-01

    Health information systems (HIS) in their current form are rarely sustainable. In order to sustain our health information systems and with them our health systems, we need to focus on defining and maintaining sustainable health information system building blocks or components. These components need to be easily updatable when clinical knowledge (or anything else) changes, easily adaptable when business requirements or processes change, and easily exchangeable when technology advances. One major prerequisite for this is that we need to be able to define and measure sustainability, so that it can become one of the major business drivers in HIS development. Therefore, this paper analyses general definitions and indicators for sustainability, and analyses their applicability to HIS. We find that general 'Emergy analysis' is one possibility to measure sustainability for HIS. Based on this, we investigate major enablers and inhibitors to sustainability in a high-level framework consisting of four pillars: clinical, technical, socio-technical, and political/business.

  4. Using Discrete-time Event History Fertility Models to Simulate Total Fertility Rates and Other Fertility Measures.

    Science.gov (United States)

    Van Hook, Jennifer; Altman, Claire E

    2013-08-01

    Event history models, also known as hazard models, are commonly used in analyses of fertility. One drawback of event history models is that the conditional probabilities (hazards) they estimate do not readily translate into summary measures, particularly for models of repeatable events, like childbirth. In this paper, we describe how to translate the results of discrete-time event history models of all births into well-known summary fertility measures: simulated age- and parity-specific fertility rates, parity progression ratios (PPRs), and the total fertility rate (TFR). The method incorporates all birth intervals, but permits the hazard functions to vary across parities. It can also simulate values for groups defined by both fixed and time-varying covariates, such as marital or employment life histories. We demonstrate the method using an example from the National Survey of Family Growth (NSFG) and provide an accompanying data file and Stata program.
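
    The authors provide a Stata program; the core of the simulation step, turning fitted discrete-time hazards into a summary rate, reduces to accumulating conditional birth probabilities over simulated life courses. A toy re-expression in Python with an invented hazard function (illustration only, not the authors' code):

      import random

      def hazard(age, parity):
          """Invented conditional probability of a birth at `age` given
          `parity`; a fitted event history model would supply this."""
          base = {0: 0.12, 1: 0.10, 2: 0.05}.get(parity, 0.02)
          return base if 20 <= age <= 39 else 0.0

      def simulate_tfr(n_women=20_000, seed=1):
          rng = random.Random(seed)
          births = 0
          for _ in range(n_women):
              parity = 0
              for age in range(15, 50):    # one trial per single year of age
                  if rng.random() < hazard(age, parity):
                      parity += 1
              births += parity
          return births / n_women          # completed fertility implied by the hazards

      print(round(simulate_tfr(), 2))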

  5. Low-noise correlation measurements based on software-defined-radio receivers and cooled microwave amplifiers

    Science.gov (United States)

    Nieminen, Teemu; Lähteenmäki, Pasi; Tan, Zhenbing; Cox, Daniel; Hakonen, Pertti J.

    2016-11-01

    We present a microwave correlation measurement system based on two low-cost USB-connected software-defined-radio dongles modified to operate as coherent receivers by using a common local oscillator. Existing software is used to obtain I/Q samples from both dongles simultaneously at a software-tunable frequency. To achieve low noise, we introduce a simple solution for cryogenic amplification at 600-900 MHz based on a single discrete HEMT with 21 dB gain and 7 K noise temperature. In addition, we discuss the quantization effects in a digital correlation measurement and the determination of optimal integration time by applying Allan deviation analysis.
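
    Allan deviation, used here to choose the integration time, has a standard non-overlapping estimator. A minimal sketch (our implementation, not the authors' software):

      import numpy as np

      def allan_deviation(samples, m):
          """Non-overlapping Allan deviation at averaging length m (in samples)."""
          x = np.asarray(samples, dtype=float)
          n_blocks = len(x) // m
          means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
          return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

      rng = np.random.default_rng(0)
      noise = rng.normal(size=100_000)     # stand-in for a measured series
      for m in (1, 10, 100, 1000):
          print(m, allan_deviation(noise, m))

    For white noise the printed values fall off roughly as $1/\sqrt{m}$; the averaging length at which the curve stops falling indicates the optimal integration time.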

  6. Antenatal Ultrasonographic Anteroposterior Renal Pelvis Diameter Measurement: Is It a Reliable Way of Defining Fetal Hydronephrosis?

    Directory of Open Access Journals (Sweden)

    Alamanda Kfoury Pereira

    2011-01-01

    Purpose: To quantify the intraobserver and interobserver variability of sonographic measurements of the renal pelvis and to classify hydronephrosis severity. Methods: Two ultrasonographers evaluated 17 fetuses from 23 to 39 weeks of gestation. Renal pelvis APD measurements were taken in 50 renal units. For intraobserver error, one of them performed three sequential measurements. The mean and standard deviation of the absolute and percentage differences between measurements were calculated. Bland-Altman plots were used to visually assess the relationship between the precision of repeated measurements. Hydronephrosis was classified as mild (5.0 to 9.9 mm), moderate (10.0 to 14.9 mm), or severe (≥15.0 mm). Interrater agreement was assessed using the Kappa index. Results: Absolute intraobserver variation in APD measurements was 5.2±3.5%. Interobserver variation between ultrasonographers was 9.3±9.7%. Neither intraobserver nor interobserver error increased with increasing APD size. The overall percentage of agreement with the antenatal hydronephrosis diagnosis was 64%. Cohen's Kappa for hydronephrosis severity was 0.51 (95% CI, 0.33 to 0.69). Conclusion: Inter- and intraobserver APD measurement errors were low in this group, but agreement on hydronephrosis diagnosis and classification was fair. We suggest that standardized serial APD measurement can better define and evaluate fetal hydronephrosis.
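
    The grading thresholds in this record translate directly into a classification rule, and the reported interrater agreement is a Cohen's kappa. A compact sketch (our implementation; the APD values below are invented):

      def grade(apd_mm):
          """Classify antenatal hydronephrosis from renal pelvis APD in mm."""
          if apd_mm < 5.0:
              return "none"
          if apd_mm < 10.0:
              return "mild"
          if apd_mm < 15.0:
              return "moderate"
          return "severe"

      def cohens_kappa(rater_a, rater_b):
          labels = sorted(set(rater_a) | set(rater_b))
          n = len(rater_a)
          p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          p_exp = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)
                      for l in labels)
          return (p_obs - p_exp) / (1 - p_exp)

      a = [grade(x) for x in [4.2, 6.1, 8.0, 12.5, 16.0, 9.9]]
      b = [grade(x) for x in [4.8, 5.2, 10.1, 12.0, 15.5, 9.0]]
      print(round(cohens_kappa(a, b), 2))  # -> 0.77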

  7. The deficit is not a well-defined measure of fiscal policy.

    Science.gov (United States)

    Kotlikoff, L J

    1988-08-12

    Notwithstanding its widespread use, the government's deficit is not a well-defined measure of fiscal policy from the perspective of neoclassical economics; the equations of neoclassical models do not define the deficit. Rather than being a fundamental economic concept, the deficit is an arbitrary cash flow accounting construct with no necessary relation to the true stance of fiscal policy. Although the deficit is supposed to indicate how the burden of paying for the government's consumption is spread across different generations, actual changes in the measured deficit in the United States have had little if any relation to changes in the burden imposed by the government on different generations. The deficit's lack of definition is illustrated with a simple model, and the potential for misreading fiscal policy is discussed with U.S. fiscal policy in the 1980s as an example. The article calls for the creation of present-value generational accounts that would properly measure the intergenerational stance of fiscal policy.

  8. Moving beyond Mindfulness: Defining Equanimity as an Outcome Measure in Meditation and Contemplative Research.

    Science.gov (United States)

    Desbordes, Gaëlle; Gard, Tim; Hoge, Elizabeth A; Hölzel, Britta K; Kerr, Catherine; Lazar, Sara W; Olendzki, Andrew; Vago, David R

    2014-01-21

    In light of a growing interest in contemplative practices such as meditation, the emerging field of contemplative science has been challenged to describe and objectively measure how these practices affect health and well-being. While "mindfulness" itself has been proposed as a measurable outcome of contemplative practices, this concept encompasses multiple components, some of which, as we review here, may be better characterized as equanimity. Equanimity can be defined as an even-minded mental state or dispositional tendency toward all experiences or objects, regardless of their origin or their affective valence (pleasant, unpleasant, or neutral). In this article we propose that equanimity be used as an outcome measure in contemplative research. We first define and discuss the inter-relationship between mindfulness and equanimity from the perspectives of both classical Buddhism and modern psychology and present existing meditation techniques for cultivating equanimity. We then review psychological, physiological, and neuroimaging methods that have been used to assess equanimity, either directly or indirectly. In conclusion, we propose that equanimity captures potentially the most important psychological element in the improvement of well-being, and therefore should be a focus in future research studies.

  9. System Energy Assessment (SEA, Defining a Standard Measure of EROI for Energy Businesses as Whole Systems

    Directory of Open Access Journals (Sweden)

    Jay Zarnikau

    2011-10-01

    A more objective method for measuring the energy needs of businesses, System Energy Assessment (SEA), measures the combined impacts of material supply chains and service supply chains to assess businesses as whole self-managing net-energy systems. The method is demonstrated using a model Wind Farm, and defines a physical measure of their energy productivity for society (EROI-S), a ratio of total energy delivered to total energy expended. Energy use records for technology and proxy measures for clearly understood but not individually recorded energy uses for services are combined for a whole-system estimate of the consumption required for production. Current methods count only the energy needs of technology. Business services outsource their own energy needs to operate, leaving no traceable record. That uncounted business energy demand is often 80% of the total, an amount of "dark energy" hidden from view, discovered when the estimated energy needs of businesses fall far below the world-average energy consumed per dollar of GDP. At present, for lack of information, the energy needs of business services are counted as "0"; our default assumption is to treat them as "average". The result is a hard measure of total business demand for energy services, a "Scope 4" energy use or GHG impact assessment. Counting recorded energy uses and discounting unrecorded ones misrepresents labor-intensive work as highly energy efficient. The result confirms a similar finding by Hall et al. in 1981 [1]. We use an exhaustive search for what a business needs to operate as a whole, tracing internal business relationships rather than energy data, to locate its natural physical boundary as a working unit, and so define a business as a physical rather than statistical subject of scientific study. See also online resource materials and notes [2].
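
    The accounting idea reduces to a small calculation: add an average-intensity estimate for untraceable service energy to the traceable technology energy before forming the EROI ratio. The sketch below uses entirely hypothetical numbers; only the structure of the calculation follows the abstract.

```python
# All figures are hypothetical placeholders illustrating the structure of
# the SEA calculation, not numbers from the paper.
energy_delivered_GJ = 500_000      # lifetime energy delivered by the wind farm
technology_energy_GJ = 40_000      # traceable energy: materials, construction, O&M

service_spending_usd = 12_000_000  # business services with no traceable energy record
avg_intensity_MJ_per_usd = 8.0     # assumed world-average energy per dollar of GDP
service_energy_GJ = service_spending_usd * avg_intensity_MJ_per_usd / 1_000.0

eroi_technology_only = energy_delivered_GJ / technology_energy_GJ
eroi_s = energy_delivered_GJ / (technology_energy_GJ + service_energy_GJ)

share = service_energy_GJ / (technology_energy_GJ + service_energy_GJ)
print(f"EROI (technology only): {eroi_technology_only:.1f}")
print(f"EROI-S (whole system):  {eroi_s:.1f}")
print(f"untraceable service share of total energy demand: {share:.0%}")
```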

  10. Defining and measuring health inequality: an approach based on the distribution of health expectancy.

    Science.gov (United States)

    Gakidou, E. E.; Murray, C. J.; Frenk, J.

    2000-01-01

    This paper proposes an approach to conceptualizing and operationalizing the measurement of health inequality, defined as differences in health across individuals in the population. We propose that health is an intrinsic component of well-being and thus we should be concerned with inequality in health, whether or not it is correlated with inequality in other dimensions of well-being. In the measurement of health inequality, the complete range of fatal and non-fatal health outcomes should be incorporated. This notion is operationalized through the concept of healthy lifespan. Individual health expectancy is preferable, as a measurement, to individual healthy lifespan, since health expectancy excludes those differences in healthy lifespan that are simply due to chance. In other words, the quantity of interest for studying health inequality is the distribution of health expectancy across individuals in the population. The inequality of the distribution of health expectancy can be summarized by measures of individual/mean differences (differences between the individual and the mean of the population) or inter-individual differences. The exact form of the measure to summarize inequality depends on three normative choices. A firmer understanding of people's views on these normative choices will provide a basis for deliberating on a standard WHO measure of health inequality. PMID:10686732
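
    A minimal sketch of the two families of summary measures mentioned above, individual/mean differences and inter-individual differences, is given below. The exponents alpha and beta stand in for the normative choices the authors discuss; the functional forms and values shown are illustrative assumptions, not a WHO standard.

```python
import numpy as np

def individual_mean_difference(he, alpha=2.0, beta=1.0):
    # Inequality as differences between each individual's health expectancy
    # and the population mean, normalized by mean**beta.
    he = np.asarray(he, float)
    mu = he.mean()
    return np.mean(np.abs(he - mu) ** alpha) / mu ** beta

def inter_individual_difference(he, alpha=2.0, beta=1.0):
    # Inequality as pairwise differences between all individuals.
    he = np.asarray(he, float)
    mu = he.mean()
    pairwise = np.abs(he[:, None] - he[None, :]) ** alpha
    return pairwise.sum() / (2.0 * he.size ** 2 * mu ** beta)

he = np.array([55.0, 62.0, 68.0, 70.0, 74.0])  # illustrative health expectancies
print(f"IM(2,1) = {individual_mean_difference(he):.3f}")
print(f"II(2,1) = {inter_individual_difference(he):.3f}")
```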

  11. Defining natural history: assessment of the ability of college students to aid in characterizing clinical progression of Niemann-Pick disease, type C.

    Directory of Open Access Journals (Sweden)

    Jenny Shin

    Niemann-Pick Disease, type C (NPC) is a fatal, neurodegenerative, lysosomal storage disorder. It is a rare disease with a broad phenotypic spectrum and variable age of onset. These issues make it difficult to develop a universally accepted clinical outcome measure to assess urgently needed therapies. To this end, clinical investigators have defined emerging disease severity scales. The average time from initial symptom to diagnosis is approximately 4 years. Further, some patients may not travel to specialized clinical centers even after diagnosis. We were therefore interested in investigating whether appropriately trained, community-based assessment of patient records could assist in defining disease progression using clinical severity scores. In this study we evolved a secure, stepwise process to show that pre-existing medical records may be correctly assessed by non-clinical practitioners trained to quantify disease progression. Sixty-four undergraduate students at the University of Notre Dame were expertly trained in clinical disease assessment and recognition of major and minor symptoms of NPC. Seven clinical records, randomly selected from a total of thirty-seven used to establish a leading clinical severity scale, were correctly assessed to show the expected characteristics of linear disease progression. Student assessment of two new records donated by NPC families to our study also revealed linear progression of disease, but both showed accelerated disease progression relative to the current severity scale, especially at the later stages. Together, these data suggest that college students may be trained in the assessment of patient records, and thus provide insight into the natural history of a disease.

  12. Reduction of variance in measurements of average metabolite concentration in anatomically-defined brain regions

    Science.gov (United States)

    Larsen, Ryan J.; Newman, Michael; Nikolaidis, Aki

    2016-11-01

    Multiple methods have been proposed for using Magnetic Resonance Spectroscopy Imaging (MRSI) to measure representative metabolite concentrations of anatomically-defined brain regions. Generally these methods require spectral analysis, quantitation of the signal, and reconciliation with anatomical brain regions. However, to simplify processing pipelines, it is practical to only include those corrections that significantly improve data quality. Of particular importance for cross-sectional studies is knowledge about how much each correction lowers the inter-subject variance of the measurement, thereby increasing statistical power. Here we use a data set of 72 subjects to calculate the reduction in inter-subject variance produced by several corrections that are commonly used to process MRSI data. Our results demonstrate that significant reductions of variance can be achieved by performing water scaling, accounting for tissue type, and integrating MRSI data over anatomical regions rather than simply assigning MRSI voxels with anatomical region labels.

  13. Dispersion measurement as a method of quantifying geologic characterization and defining reservoir heterogeneity. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Menzie, D.E.

    1995-05-01

    The main objective of this research project is to investigate dispersion as a method of quantifying geological characterization and defining reservoir heterogeneity in order to enhance crude oil recovery. The dispersion of flow of a reservoir rock (dispersion coefficient and dispersivity) was identified as one of the physical properties of a reservoir rock by measuring the mixing of two miscible fluids, one displacing the other in a porous medium. A rock was 100% saturated with a resident fluid and displaced by a miscible fluid of equal viscosity and equal density. Some specific experiments were performed with unequal densities. Produced fluid was analyzed by refractometer, nuclear reaction, electrical conductivity and X-ray scan. Several physical and flow characteristics were measured on the sand rock sample in order to establish correlations with the measured dispersion property. Absolute permeability, effective porosity, relative permeability, capillary pressure, the heterogeneity factor and electrical conductivity were used to better understand the flow system. Linear, transverse, 2-D and 3-D dispersions were measured and used to characterize the rock heterogeneity of the flow system. A new system of measuring dispersion was developed using a gas displacing gas system in a porous medium. An attempt was also made to determine the dispersion property of an actual reservoir from present day well log data on a producing well. 275 refs., 102 figs., 17 tabs.
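
    As a worked illustration of how a dispersion coefficient is typically extracted from such experiments, the sketch below fits the classical 1D advection-dispersion breakthrough solution to synthetic effluent data. The solution form (second boundary term neglected) and all parameter values are assumptions, not results from this report.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

CORE_LENGTH = 0.30  # m

def breakthrough(t, v, D):
    # Approximate 1D advection-dispersion solution for effluent
    # concentration C/C0 at x = CORE_LENGTH (second term neglected).
    return 0.5 * erfc((CORE_LENGTH - v * t) / (2.0 * np.sqrt(D * t)))

# Synthetic effluent data standing in for refractometer/conductivity readings.
t = np.linspace(600.0, 7200.0, 40)  # s
rng = np.random.default_rng(0)
c = breakthrough(t, 1.0e-4, 5.0e-7) + rng.normal(0.0, 0.01, t.size)

(v_fit, D_fit), _ = curve_fit(breakthrough, t, c,
                              p0=(1e-4, 1e-7), bounds=(0.0, np.inf))
print(f"velocity               v   = {v_fit:.2e} m/s")
print(f"dispersion coefficient D   = {D_fit:.2e} m^2/s")
print(f"dispersivity           D/v = {D_fit / v_fit:.4f} m")
```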

  14. Defining Neighbourhoods as a Measure of Exposure to the Food Environment

    Directory of Open Access Journals (Sweden)

    Anders K. Lyseen

    2015-07-01

    Neighbourhoods are frequently used as a measure for individuals' exposure to the food environment. However, the definitions of neighbourhoods fluctuate and have not been applied consistently in previous studies. Neighbourhoods defined from a single fixed location fail to capture people's complete exposure in multiple locations, but measuring behaviour using traditional methods can be challenging. This study compares the traditional methods of measuring exposure to the food environment to methods that use data from GPS tracking. For each of the 187 participants, 11 different neighbourhoods were created in which the exposure to supermarkets and fast food outlets was measured. ANOVA, Tukey's Honestly Significant Difference (HSD) test and t-tests were performed to compare the neighbourhoods. Significant differences were found between area sizes and the exposure to supermarkets and fast food outlets for different neighbourhood types. Second, significant differences in exposure to food outlets were found between the urban and rural neighbourhoods. Neighbourhoods are clearly a diffuse and blurred concept that varies in meaning depending on each person's perception and the conducted study. The complexity and heterogeneity of human mobility no longer appear to correspond to the use of residential neighbourhoods, but rather emphasise the need for methods, concepts and measures of individual activity and exposure.

  15. Defining Neighbourhoods as a Measure of Exposure to the Food Environment.

    Science.gov (United States)

    Lyseen, Anders K; Hansen, Henning S; Harder, Henrik; Jensen, Anders S; Mikkelsen, Bent E

    2015-07-21

    Neighbourhoods are frequently used as a measure for individuals' exposure to the food environment. However, the definitions of neighbourhoods fluctuate and have not been applied consistently in previous studies. Neighbourhoods defined from a single fixed location fail to capture people's complete exposure in multiple locations, but measuring behaviour using traditional methods can be challenging. This study compares the traditional methods of measuring exposure to the food environment to methods that use data from GPS tracking. For each of the 187 participants, 11 different neighbourhoods were created in which the exposure to supermarkets and fast food outlets was measured. ANOVA, Tukey's Honestly Significant Difference (HSD) test and t-tests were performed to compare the neighbourhoods. Significant differences were found between area sizes and the exposure to supermarkets and fast food outlets for different neighbourhood types. Second, significant differences in exposure to food outlets were found between the urban and rural neighbourhoods. Neighbourhoods are clearly a diffuse and blurred concept that varies in meaning depending on each person's perception and the conducted study. The complexity and heterogeneity of human mobility no longer appear to correspond to the use of residential neighbourhoods, but rather emphasise the need for methods, concepts and measures of individual activity and exposure.
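
    A minimal sketch of the buffer-based exposure comparison is shown below: food outlets are counted within circular neighbourhoods of different radii around a fixed anchor point, and the definitions are compared with a one-way ANOVA. Coordinates and radii are synthetic assumptions; the study's GPS-derived activity spaces are not reproduced here.

```python
import numpy as np
from scipy.stats import f_oneway

def exposure(home, outlets, radius_m):
    # Count food outlets inside a circular buffer around a fixed anchor.
    d = np.hypot(outlets[:, 0] - home[0], outlets[:, 1] - home[1])
    return int((d <= radius_m).sum())

rng = np.random.default_rng(3)
outlets = rng.uniform(0, 10_000, size=(300, 2))   # outlet coordinates (m)
homes = rng.uniform(0, 10_000, size=(187, 2))     # one anchor per participant

# Three competing neighbourhood definitions: 500 m, 1 km and 2 km buffers.
e500  = [exposure(h, outlets, 500)  for h in homes]
e1000 = [exposure(h, outlets, 1000) for h in homes]
e2000 = [exposure(h, outlets, 2000) for h in homes]

# Do the definitions yield significantly different exposure measures?
print(f_oneway(e500, e1000, e2000))
```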

  16. Historie

    DEFF Research Database (Denmark)

    Poulsen, Jens Aage

    Historie, in this series, deals with curricula and teaching materials and their use in the school subject of history. The book contains useful tools for analysing and assessing teaching materials.

  17. Modified T-history method for measuring thermophysical properties of phase change materials at high temperature

    Science.gov (United States)

    Omaraa, Ehsan; Saman, Wasim; Bruno, Frank; Liu, Ming

    2017-06-01

    Latent heat storage using phase change materials (PCMs) can store large amounts of energy within a narrow temperature range during phase transition. The thermophysical properties of PCMs, such as latent heat, specific heat, and melting and solidification temperatures, need to be determined with high precision for designing and estimating the cost of latent heat storage systems. The existing laboratory standard methods, such as differential thermal analysis (DTA) and differential scanning calorimetry (DSC), use a small sample size (1-10 mg) to measure thermophysical properties, which makes these methods suitable for homogeneous materials. In addition, such a small sample can have thermophysical properties that differ from those of a bulk sample, and may be of limited use for evaluating the properties of mixtures. To avoid these drawbacks, the temperature-history (T-history) method can be used with bulk quantities of PCM salt mixtures to characterize PCMs. This paper presents a modified T-history setup, designed and built at the University of South Australia, to measure the melting point, heat of fusion, specific heat, degree of supercooling and phase separation of salt mixtures over a temperature range between 200 °C and 400 °C. Sodium nitrate (NaNO3) was used to verify the accuracy of the new setup.
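
    The core of the classic T-history analysis can be sketched as follows: a reference tube with known heat capacity, cooling in the same environment, calibrates the heat-loss coefficient, and the area under the PCM's phase-change plateau then yields the heat of fusion. This is a highly simplified version under the lumped-capacitance assumption, with the tube's sensible heat during the plateau neglected; it is not the authors' setup or analysis code.

```python
import numpy as np

def area_above_ambient(t, T, T_amb):
    # Trapezoidal integral of (T - T_ambient) dt; with identical tubes and
    # lumped capacitance, heat released is proportional to this area.
    t, T = np.asarray(t, float), np.asarray(T, float)
    y = T - T_amb
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def heat_loss_coefficient(t_ref, T_ref, T_amb, m_ref, c_ref, mc_tube):
    # Calibrate hA from the reference tube (e.g. water) with known c_p:
    # sensible heat released divided by the area under its cooling curve.
    sensible_J = (m_ref * c_ref + mc_tube) * (T_ref[0] - T_ref[-1])
    return sensible_J / area_above_ambient(t_ref, T_ref, T_amb)

def heat_of_fusion(hA, t_plateau, T_plateau, T_amb, m_pcm):
    # Latent heat from the area under the PCM's phase-change plateau,
    # neglecting the tube's sensible heat over the (nearly flat) plateau.
    return hA * area_above_ambient(t_plateau, T_plateau, T_amb) / m_pcm
```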

  18. Extracting fuzzy rules under uncertainty and measuring definability using rough sets

    Science.gov (United States)

    Culas, Donald E.

    1991-01-01

    Although computers have come a long way since their invention, at the hardware level they are basically able to handle only crisp values. Unfortunately, the world we live in consists of problems that fail to fall into this category, i.e., uncertainty is all too common. This work examines a problem involving uncertainty; specifically, it deals with attributes that are fuzzy sets. Under this condition, knowledge is acquired by looking at examples. In each example, a condition as well as a decision is made available. Based on the examples given, two sets of rules are extracted: certain and possible. Furthermore, measures are constructed of the degree to which these rules are believed, and finally, the decisions are defined as a function of the terms used in the conditions.
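
    The certain/possible rule distinction comes from rough set lower and upper approximations, which a short sketch can make concrete; the toy decision table below is an assumption for illustration, not data from the study.

```python
# Decision table: condition attributes -> decision. Rows 0 and 1 are
# indiscernible but decide differently, so 'yes' is only roughly definable.
table = [
    ({"temp": "high", "humid": "low"},  "yes"),
    ({"temp": "high", "humid": "low"},  "no"),
    ({"temp": "low",  "humid": "low"},  "yes"),
    ({"temp": "low",  "humid": "high"}, "no"),
]

def approximations(table, target):
    # Group objects into indiscernibility classes over condition attributes.
    blocks = {}
    for cond, dec in table:
        blocks.setdefault(tuple(sorted(cond.items())), []).append(dec)
    lower = [b for b, d in blocks.items() if all(x == target for x in d)]
    upper = [b for b, d in blocks.items() if any(x == target for x in d)]
    n_low = sum(len(blocks[b]) for b in lower)
    n_up = sum(len(blocks[b]) for b in upper)
    return lower, upper, n_low / n_up

lower, upper, accuracy = approximations(table, "yes")
print("certain rules from:", lower)    # conditions always implying 'yes'
print("possible rules from:", upper)   # conditions sometimes implying 'yes'
print("accuracy of approximation:", accuracy)  # 1.0 = crisply definable class
```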

  19. Does the Defining Issues Test measure ethical judgment ability or political position?

    Science.gov (United States)

    Bailey, Charles D

    2011-01-01

    This article addresses the construct validity of the Defining Issues Test of ethical judgment (DIT/DIT-2). Alleging a political bias in the test, Emler and colleagues (1983, 1998, 1999, 2007) show that conservatives score higher when asked to fake as liberals, implying that they understand the reasoning associated with "higher" moral development but avoid items they see as liberally biased. DIT proponents challenge the internal validity of faking studies, advocating an explained-variance validation. This study takes a new approach: adult participants complete the DIT-2, then evaluate the raw responses of others to discern political orientation and ethical development. Results show that individuals scoring higher on the DIT-2 rank others' ethical judgment in a way consistent with DIT-2-based rankings. Accuracy at assessing political orientation, however, is low. Results support the DIT-2's validity as a measure of ethical development, not an expression of political position.

  20. Impact of Nodal Centrality Measures to Robustness in Software-Defined Networking

    Directory of Open Access Journals (Sweden)

    Tomas Hegr

    2014-01-01

    The paper deals with network robustness from the perspective of nodal centrality measures and their applicability in Software-Defined Networking (SDN). Traditional graph characteristics have been evolving over the last century, and numerous less-conventional metrics have been introduced in an attempt to bring a new view to particular graph attributes. New control technologies can finally utilize these metrics, but they simultaneously present new challenges. SDN brings a fine-grained and nearly online view of the underlying network state, which allows advanced routing and forwarding to be implemented. In such a situation, sophisticated algorithms can be applied that utilize pre-computed network measures. Since recent versions of the SDN protocol OpenFlow (OF) have revived the idea of fast link failover, the authors in this paper introduce a novel metric, Quality of Alternative Paths centrality (QAP). The QAP value quantifies a node's surroundings and can be utilized to advantage in algorithms to indicate more robust paths. The centrality is evaluated using node-failure simulations on different network topologies, in combination with the Quality of Backup centrality measure.
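
    The paper's QAP definition is not reproduced in the abstract, so the sketch below (using the networkx library) implements only a simplified stand-in for the alternative-paths idea: a node's surroundings are scored by how much routes between its neighbours stretch when the node fails. The scoring rule and the test topology are assumptions, not the authors' metric.

```python
import itertools
import networkx as nx

def alternative_path_quality(G, v):
    # Simplified stand-in: average stretch of the best alternative route
    # between v's neighbours once v fails (lower = more robust surroundings).
    H = G.copy()
    H.remove_node(v)
    stretches = []
    for a, b in itertools.combinations(list(G.neighbors(v)), 2):
        base = nx.shortest_path_length(G, a, b)
        try:
            stretches.append(nx.shortest_path_length(H, a, b) / base)
        except nx.NetworkXNoPath:
            stretches.append(float("inf"))  # no alternative path at all
    return sum(stretches) / len(stretches) if stretches else 0.0

G = nx.petersen_graph()  # small, well-connected test topology
print({v: round(alternative_path_quality(G, v), 2) for v in G})
```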

  1. Use of gravity potential field methods for defining a shallow magmatic intrusion: the Mt. Amiata case history (Tuscany, Central Italy)

    Science.gov (United States)

    Girolami, Chiara; Rinaldo Barchi, Massimiliano; Pauselli, Cristina; Heyde, Ingo

    2016-04-01

    We analyzed the Bouguer gravity anomaly signal beneath the Mt. Amiata area in order to reconstruct the subsurface setting. The study area is characterized by a pronounced gravity minimum, possibly correlated with the observed anomalous heat flow and hydrothermal activity. Using different approaches, previous authors defined a low-density body (generally interpreted as a magmatic intrusion) beneath this area, which could explain the observed gravity anomaly minimum. However, the proposed geologic models show different geometries and densities for the batholith. The gravity data used in this study (kindly provided by eni) were acquired from different institutions (eni, OGS, USDMA and Servizio Geologico d'Italia) and collected into a single dataset consisting of about 50000 randomly distributed stations covering Central Italy with a spacing of less than 1 km. For each station the elevation and the Bouguer gravity anomaly are given. From this dataset, we created maps of the Bouguer gravity anomaly and the topography using the Minimum Curvature gridding method with a grid cell size of 500 m x 500 m. The Bouguer gravity anomaly was computed using a density of 2.67 g/cm3. From these maps we extracted a window of about 240 km2 (12 x 20 km) for the study area, which includes the Mt. Amiata region and the adjacent Radicofani sedimentary basin. The first part of this study focused on calculating the first-order vertical derivative of the Bouguer gravity anomaly, to enhance the effect of shallow bodies, and on power spectrum analysis, to estimate source depths. The second part focused on constructing a 3D geological density model of the subsurface setting of the studied area, implementing a forward modelling approach. The stratigraphy of the study area's upper crust schematically consists of six litho-mechanical units, whose densities were derived from velocity data collected by active seismic surveys. A preliminary
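
    The power-spectrum step mentioned above is commonly done by radially averaging the 2D spectrum of the gridded anomaly and reading an average source depth from the slope of the log-spectrum; with wavenumber in cycles per km, depth ≈ -slope/(4π). The sketch below follows that standard (Spector-Grant style) recipe and is an assumption about the workflow, not the authors' processing code.

```python
import numpy as np

def radial_log_power(grid, dx_km, n_bins=20):
    # Radially averaged log power spectrum of a square gridded anomaly.
    n = grid.shape[0]
    P = np.abs(np.fft.fftshift(np.fft.fft2(grid))) ** 2
    f = np.fft.fftshift(np.fft.fftfreq(n, d=dx_km))      # cycles/km
    kx, ky = np.meshgrid(f, f)
    k, p = np.hypot(kx, ky).ravel(), P.ravel()
    edges = np.linspace(k[k > 0].min(), k.max() / 2, n_bins)
    idx = np.digitize(k, edges)
    kc, lp = [], []
    for i in range(1, n_bins):
        sel = idx == i
        if sel.any():
            kc.append(k[sel].mean())
            lp.append(np.log(p[sel].mean()))
    return np.array(kc), np.array(lp)

def source_depth_km(kc, lp, k_lo, k_hi):
    # ln P ~ const - 4*pi*z*k for k in cycles/km, so the band-limited
    # slope of the log-spectrum gives an average source depth z.
    sel = (kc >= k_lo) & (kc <= k_hi)
    slope = np.polyfit(kc[sel], lp[sel], 1)[0]
    return -slope / (4.0 * np.pi)
```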

  2. Defining and Measuring Safety Climate: A Review of the Construction Industry Literature.

    Science.gov (United States)

    Schwatka, Natalie V; Hecker, Steven; Goldenhar, Linda M

    2016-06-01

    Safety climate measurements can be used to proactively assess an organization's effectiveness in identifying and remediating work-related hazards, thereby reducing or preventing work-related ill health and injury. This review article focuses on construction-specific articles that developed and/or measured safety climate, assessed safety climate's relationship with other safety and health performance indicators, and/or used safety climate measures to evaluate interventions targeting one or more indicators of safety climate. Fifty-six articles met our inclusion criteria, 80% of which were published after 2008. Our findings demonstrate that researchers commonly defined safety climate as perception based, but the object of those perceptions varies widely. Within the wide range of indicators used to measure safety climate, safety policies, procedures, and practices were the most common, followed by general management commitment to safety. The most frequently used indicators should and do reflect that the prevention of work-related ill health and injury depends on both organizational and employee actions. Safety climate scores were commonly compared between groups (e.g. management and workers, different trades), and often correlated with subjective measures of safety behavior rather than measures of ill health or objective safety and health outcomes. Despite the observed limitations of current research, safety climate has been promoted as a useful feature of research and practice activities to prevent work-related ill health and injury. Safety climate survey data can reveal gaps between management and employee perceptions, or between espoused and enacted policies, and trigger communication and action to narrow those gaps. The validation of safety climate with safety and health performance data offers the potential for using safety climate measures as a leading indicator of performance. We discuss these findings in relation to the related concept of safety culture and

  3. System Energy Assessment (SEA), Defining a Standard Measure of EROI for Energy Businesses as Whole Systems

    CERN Document Server

    Henshaw, Philip F; Zarnikau, Jay

    2011-01-01

    A more objective method for measuring the energy needs of businesses, System Energy Assessment (SEA), identifies the natural boundaries of businesses as self-managing net-energy systems, of controlled and self-managing parts. The method is demonstrated using a model Wind Farm case study, and applied to defining a true physical measure of its energy productivity for society (EROI-S), the global ratio of energy produced to energy cost. The traceable needs of business technology are combined with assignable energy needs for all other operating services. That serves to correct a large natural gap in energy use information. Current methods count traceable energy receipts for technology use. Self-managing services employed by businesses outsource their own energy needs to operate, and leave no records to trace. Those uncounted energy demands are often 80% of the total embodied energy of business end products. The scale of this "dark energy" was discovered from differing global accounts, and corrected so the average...

  4. The use of collaboration science to define consensus outcome measures: a telemental health case study.

    Science.gov (United States)

    Mishkind, Matthew C; Doarn, Charles R; Bernard, Jordana; Shore, Jay H

    2013-06-01

    The purpose of this document is to provide an overview of a collaboration science process used to develop recommendations for the field of telemental health (TMH) in the selection of outcome measures that best reflect programmatic impacts. A common use of group development techniques in medicine is the development of clinical guidelines, which typically occurs using one of two methods: the nominal group or the Delphi method. Both processes have been faulted for limited transparency, reliability, and sustainability. Recommendations to improve the traditional process include making goals explicit, making disagreements transparent, and publicly displaying levels of agreement. A group of 26 TMH experts convened during the American Telemedicine Association's 2012 Fall Forum in New Orleans, LA to participate in a 1-day, interactive, consensus-building workshop to initiate the development of a shared lexicon of outcomes. The workshop method was designed to improve on traditional methods of guideline development by focusing on clarity of expectations, transparency, and timeliness of group development work. Results suggest that, compared with other traditional methods, the current process involved more people, occurred more rapidly, was more transparent, and resulted in a comparable deliverable. Recommendations for further process development, both within and external to TMH, as well as an initial overview of defined outcome measures are discussed.

  5. Spiritual health scale 2011: defining and measuring 4 dimension of health.

    Science.gov (United States)

    Dhar, Neera; Chaturvedi, Sk; Nandan, Deoki

    2011-10-01

    In the midst of the physical comforts provided by unprecedented developments in all spheres of life, humanity is at a crossroads and looking for something beyond these means. Spirituality has now been identified globally as an important aspect for providing answers to many questions related to health and happiness. The World Health Organization is also keen on looking beyond the physical, mental and social dimensions of health, and the member countries are actively exploring the 4th dimension of health, i.e., spiritual health, and its impact on the overall health and happiness of an individual. The National Institute of Health and Family Welfare (NIHFW) realized this need and initiated a research study in this direction. In this study, an effort was made to define this 4th dimension of health from a common worldly person's perspective and to measure it. 3 Domains, 6 Constructs and 27 Determinants of spiritual health were identified through a scientific process. A statistically reliable and valid Spiritual Health Scale (SHS 2011) containing 114 items has been developed. Construct validity and test-retest reliability have been established for an urban, educated adult population. The scale is the first of its kind in the world to measure the spiritual health of a common worldly person, devoid of religious and cultural bias. Its items have universal applicability.

  6. Spiritual health scale 2011: Defining and measuring 4 th dimension of health

    Directory of Open Access Journals (Sweden)

    Neera Dhar

    2011-01-01

    In the midst of the physical comforts provided by unprecedented developments in all spheres of life, humanity is at a crossroads and looking for something beyond these means. Spirituality has now been identified globally as an important aspect for providing answers to many questions related to health and happiness. The World Health Organization is also keen on looking beyond the physical, mental and social dimensions of health, and the member countries are actively exploring the 4th dimension of health, i.e., spiritual health, and its impact on the overall health and happiness of an individual. The National Institute of Health and Family Welfare (NIHFW) realized this need and initiated a research study in this direction. In this study, an effort was made to define this 4th dimension of health from a common worldly person's perspective and to measure it. 3 Domains, 6 Constructs and 27 Determinants of spiritual health were identified through a scientific process. A statistically reliable and valid Spiritual Health Scale (SHS 2011) containing 114 items has been developed. Construct validity and test-retest reliability have been established for an urban, educated adult population. The scale is the first of its kind in the world to measure the spiritual health of a common worldly person, devoid of religious and cultural bias. Its items have universal applicability.

  7. AV-1451 PET imaging of tau pathology in preclinical Alzheimer disease: Defining a summary measure.

    Science.gov (United States)

    Mishra, Shruti; Gordon, Brian A; Su, Yi; Christensen, Jon; Friedrichsen, Karl; Jackson, Kelley; Hornbeck, Russ; Balota, David A; Cairns, Nigel J; Morris, John C; Ances, Beau M; Benzinger, Tammie L S

    2017-07-26

    Utilizing [18F]-AV-1451 tau positron emission tomography (PET) as an Alzheimer disease (AD) biomarker will require identification of the brain regions that are most important in detecting elevated tau pathology in preclinical AD. Here, we utilized an unsupervised, data-driven learning approach to identify brain regions whose tau PET signal is most informative in discriminating low and high levels of [18F]-AV-1451 binding. 84 cognitively normal participants who had undergone AV-1451 PET imaging were used in a sparse k-means clustering with resampling analysis to identify the regions most informative in dividing a cognitively normal population into high-tau and low-tau groups. The highest-weighted FreeSurfer regions of interest (ROIs) separating these groups were the entorhinal cortex, amygdala, lateral occipital cortex, and inferior temporal cortex, and an average SUVR in these four ROIs was used as a summary metric for AV-1451 uptake. We propose an AV-1451 SUVR cut-off of 1.25 to define imaging-based high tau. This spatial distribution of tau PET is more widespread than the pattern predicted by pathological staging schemes. Our data-derived metric was validated first in this cognitively normal cohort by correlating with early measures of cognitive dysfunction, and with disease progression as measured by β-amyloid PET imaging. We additionally validated this summary metric in a cohort of 13 Alzheimer disease patients, and showed that this measure correlates with cognitive dysfunction and β-amyloid PET imaging in a diseased population.
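
    The summary metric itself is simple to compute once ROI-level SUVRs exist: average the four listed regions and compare against the proposed 1.25 cut-off. The ROI labels and participant values below are illustrative stand-ins, not study data.

```python
import numpy as np

# Mean SUVR over the four highest-weighted FreeSurfer ROIs, thresholded at
# the proposed cut-off of 1.25. ROI labels and values are illustrative.
SUMMARY_ROIS = ("entorhinal", "amygdala", "lateraloccipital", "inferiortemporal")
CUTOFF = 1.25

def tau_status(roi_suvr):
    summary = float(np.mean([roi_suvr[r] for r in SUMMARY_ROIS]))
    return summary, ("high tau" if summary > CUTOFF else "low tau")

participant = {"entorhinal": 1.41, "amygdala": 1.30,
               "lateraloccipital": 1.18, "inferiortemporal": 1.22}
print(tau_status(participant))  # summary ~1.28 -> 'high tau'
```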

  8. Developing a questionnaire for measuring epistemological beliefs in history education

    NARCIS (Netherlands)

    Stoel, Gerhard; Logtenberg, Albert; Wansink, Bjorn; Huijgen, Timothy

    2016-01-01

    Developing pupils’ understanding of history with its own disciplinary and epistemological problems can contribute to the education of a critical and peaceful diverse society. This symposium discusses results of four studies from the Netherlands, Germany and the USA addressing theoretical,

  9. Defining and measuring health literacy: how can we profit from other literacy domains?

    Science.gov (United States)

    Frisch, Anne-Linda; Camerini, Luca; Diviani, Nicola; Schulz, Peter J

    2012-03-01

    When the antecedents of health-promoting behavior are explored, the concept of health literacy is deemed a factor of major influence. Originally defined as reading, writing and numeracy skills in the health domain, health literacy is now considered a multidimensional concept. The ongoing discussion on health literacy reveals that no agreement exists about which dimensions to include in the concept. To contribute to the development of a consistent and parsimonious concept of health literacy, we conducted a critical review of concepts in other literacy domains. Our review was guided by two research questions: (i) Which dimensions are included in the concepts of other literacy domains? (ii) How can health literacy research profit from other literacy domains? Based on articles collected from PubMed, PsycINFO, Communication & Mass Media Complete, CINAHL, SAGE Full-Text Collection, Cochrane Library and Google Scholar as well as selected monographs and editions, we identified seven distinct dimensions. Some of the dimensions recur across all reviewed literacy domains and first attempts have been made to operationalize the dimensions. Expanding upon these dimensions, the paper discusses how they can prove useful for elaborating a consistent and parsimonious concept of health literacy and foster the development of a more holistic measure.

  10. Compassionate Care: Can it be Defined and Measured? The Development of the Compassionate Care Assessment Tool

    Directory of Open Access Journals (Sweden)

    Lori Burnell

    2013-01-01

    Background: Compassion has not been universally defined or understood, but is nonetheless recognized as a component of nursing excellence. If compassionate care is routine in health care delivery models, nursing behaviors and actions that exemplify compassion ought to be easily identifiable to patients. However, a standardized scale measuring compassionate care attributes has been notably absent. Objective: To address this gap and ascertain the importance of compassionate care to patients, a Compassionate Care Assessment Tool (CCAT©) was formulated. This new tool, derived from a pilot study of two published surveys, combined the constructs of compassion and caring to generate 28 elements of compassionate care. Methodology: The CCAT© was administered to 250 hospitalized patients. Patients were asked to rate (a) the importance of these items to compassionate care and (b) the extent to which nurses made this type of care apparent to them. Results: Four categorical segments illustrated compassion from the patients' perspective: the ability to establish meaningful connections, meet expectations, exhibit caring attributes, and function as a capable practitioner. Conclusions: The provision of compassionate care requires a holistic approach. Patients value nurses forming personal connections, serving as their advocates, and responding to their individual needs.

  11. Measuring solar reflectance Part I: Defining a metric that accurately predicts solar heat gain

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    Solar reflectance can vary with the spectral and angular distributions of incident sunlight, which in turn depend on surface orientation, solar position and atmospheric conditions. A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective "cool colored" surface, because this irradiance contains a greater fraction of near-infrared light than typically found in ordinary (unconcentrated) global sunlight. At mainland U.S. latitudes, this metric R_E891BN can underestimate the annual peak solar heat gain of a typical roof or pavement (slope ≤ 5:12 [23°]) by as much as 89 W m^-2, and underestimate its peak surface temperature by up to 5 K. Using R_E891BN to characterize roofs in a building energy simulation can exaggerate the economic value N of annual cool-roof net energy savings by as much as 23%. We define the clear-sky air mass one global horizontal ("AM1GH") solar reflectance R_g,0, a simple and easily measured property that more accurately predicts solar heat gain. R_g,0 predicts the annual peak solar heat gain of a roof or pavement to within 2 W m^-2, and overestimates N by no more than 3%. R_g,0 is well suited to rating the solar reflectances of roofs, pavements and walls. We show in Part II that R_g,0 can be easily and accurately measured with a pyranometer, a solar spectrophotometer or version 6 of the Solar Spectrum Reflectometer.
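
    The dependence on the assumed irradiance spectrum can be illustrated with the basic definition: solar reflectance is the irradiance-weighted average of spectral reflectance. In the sketch below, the two toy spectra and the step-function "cool colored" reflectance are assumptions; only the weighting formula is standard.

```python
import numpy as np
from scipy.integrate import trapezoid

def solar_reflectance(wl_nm, refl, irr):
    # Irradiance-weighted average of spectral reflectance:
    # R = integral(r * i dL) / integral(i dL)
    return trapezoid(refl * irr, wl_nm) / trapezoid(irr, wl_nm)

wl = np.linspace(300, 2500, 500)         # wavelength, nm
cool = np.where(wl < 700, 0.20, 0.80)    # "cool colored": dark VIS, bright NIR

vis_weighted = np.exp(-(((wl - 550) / 300) ** 2))  # toy global-horizontal-like spectrum
nir_weighted = np.exp(-(((wl - 900) / 500) ** 2))  # toy beam-normal-like, NIR-richer

# The NIR-rich weighting reports a higher reflectance for the same surface,
# i.e. it understates heat gain under ordinary global sunlight.
print(f"R (global-like weighting):      {solar_reflectance(wl, cool, vis_weighted):.3f}")
print(f"R (beam-normal-like weighting): {solar_reflectance(wl, cool, nir_weighted):.3f}")
```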

  12. Consensus statement on defining and measuring negative effects of Internet interventions

    Directory of Open Access Journals (Sweden)

    Alexander Rozental

    2014-03-01

    Internet interventions have great potential for alleviating emotional distress, promoting mental health, and enhancing well-being. Numerous clinical trials have demonstrated their efficacy for a number of psychiatric conditions, and interventions delivered via the Internet will likely become a common alternative to face-to-face treatment. Meanwhile, research has paid little attention to the negative effects associated with treatment, warranting further investigation of the possibility that some patients might deteriorate or encounter adverse events despite receiving best available care. Evidence from research on face-to-face treatment suggests that negative effects, in terms of deterioration, afflict 5-10% of all patients undergoing treatment. However, there is currently a lack of consensus on how to define and measure negative effects in psychotherapy research in general, leaving researchers without practical guidelines for monitoring and reporting negative effects in clinical trials. The current paper therefore seeks to provide recommendations that could promote the study of negative effects in Internet interventions, with the aim of increasing knowledge of their occurrence and characteristics. Ten leading experts in the field of Internet interventions were invited to participate and share their perspectives on how to explore negative effects, using the Delphi technique to facilitate a dialog and reach an agreement. The authors discuss the importance of conducting research on negative effects in order to further the understanding of their incidence and different features. Suggestions on how to classify and measure negative effects in Internet interventions are proposed, involving methods from both quantitative and qualitative research. Potential mechanisms underlying negative effects are also discussed, differentiating common factors shared with face-to-face treatments from those unique to treatments delivered via the Internet. The authors

  13. History and current safety measures at Laguna Palcacocha, Huaraz, Peru

    Science.gov (United States)

    Salazar Checa, César; Cochachin, Alejo; Frey, Holger; Huggel, Christian; Portocarrero, César

    2017-04-01

    Laguna Palcacocha is a large glacier lake in the Cordillera Blanca, Peru, located in the Quillcay catchment above the city of Huaraz, the local capital. On 13 December 1941, the moraine dam of the lake collapsed, probably after being impacted by a large ice avalanche, and triggered a major outburst flood. This GLOF destroyed about a third of the city of Huaraz, causing about 2,000 casualties, and is therefore one of the deadliest glacier lake outbursts known in history. In 1974, the Glaciology Unit of Peru, responsible for studying, monitoring and mitigation works related to glacier hazards, installed a reinforcement of the natural moraine dam of the newly filled Laguna Palcacocha, with an artificial drainage channel 7 m below the crest of the reinforced dam. At that time, the lake had an area of 66,800 m2 and a volume of 0.5 x 10^6 m3. During the past decades, in the course of continued glacier retreat, Laguna Palcacocha has undergone extreme growth. In February 2016, the lake had an area of 514,000 m2 (7.7 times the area of 1974) and a volume of more than 17 x 10^6 m3 (more than 34 times the volume of 1974). At the same time, the city of Huaraz, located 20 km downstream of the lake, grew significantly after its almost complete destruction by the 1970 earthquake. Today, about 120,000 people live in the city. Due to the persisting possibility of large ice avalanches directly above Laguna Palcacocha, this constitutes a high-risk situation, requiring new hazard and risk mitigation measures. As an immediate temporary measure, to bridge the time until the realization of a more permanent solution, a syphoning system was installed in 2011, using about ten 700-m pipes with a 10-inch (25.4 cm) diameter. The aim of this syphoning attempt is to lower the lake level by about 7 m, thereby reducing the lake volume on the one hand and achieving a higher dam freeboard on the other. However, the system is less effective than assumed; currently the lake level

  14. Screening history of cervical cancers in Emilia-Romagna, Italy: defining priorities to improve cervical cancer screening.

    Science.gov (United States)

    Rossi, Paolo Giorgi; Caroli, Stefania; Mancini, Silvia; de' Bianchi, Priscilla Sassoli; Finarelli, Alba C; Naldoni, Carlo; Bucchi, Lauro; Falcini, Fabio

    2015-03-01

    Most invasive cervical cancers in industrialized countries are due to a lack of Pap test coverage; very few are due to screening failures. This study aimed at quantifying the proportion of invasive cancers occurring in nonscreened or underscreened women and the proportion in women with a previous negative screening, that is, screening failures, during the first two screening rounds (1996-2002) and in the following rounds (2003-2008) in the Emilia-Romagna region. All cases of invasive cancer registered in the regional cancer registry between 1996 and 2008 were classified according to screening history through a record linkage with the screening programme registry. The incidence significantly decreased from 11.6/100 000 to 8.7/100 000; this decrease is due to a reduction in squamous cell cancers (annual percentage change -6.2; confidence interval: -7.8, -4.6) and advanced cancers (annual percentage change -6.6; confidence interval: -8.8, -4.3), whereas adenocarcinomas and microinvasive cancers were essentially stable. The proportion of cancers among women not yet invited and among nonresponders decreased over the two periods, from 45.5 to 33.3%. In contrast, the proportions of women with a previous negative Pap test less than 5 years and 5 years or more before cancer incidence increased from 5.7 to 13.3% and from 0.3 to 5.5%, respectively. Although nonattendance of the screening programme remains the main barrier to cervical cancer control, the introduction of a more sensitive test, such as the human papillomavirus DNA test, could significantly reduce the burden of disease.

  15. Molecularly defined adult-type hypolactasia in school-aged children with a previous history of cow's milk allergy

    Institute of Scientific and Technical Information of China (English)

    Heli Rasinperä; Kristiina Saarinen; Anna Pelkonen; Irma Järvelä; Erkki Savilahti; Kaija-Leena Kolho

    2006-01-01

    AIM: To assess the role of lactase non-persistence/persistence in school-aged children and their milk-related symptoms. METHODS: The genotypes for the C/T-13910 variant associated with lactase non-persistence/persistence were determined using PCR minisequencing in a group of 172 children with a mean age of 8.6 years (SE = 0.02; 93 boys) participating in a follow-up study for cow's milk allergy. The parents were asked to assess their children's milk consumption and abdominal symptoms. RESULTS: The presence of allergy to cow's milk was not associated with the C/C-13910 genotype related to a decline of lactase enzyme activity during childhood (lactase non-persistence). The frequency of the C/C-13910 genotype (16%) was similar to published figures for the prevalence of adult-type hypolactasia in Finland. The majority of the children (90%) in this series consumed milk, but 26% of their families suspected that their children had milk-related symptoms. Forty-eight percent of the children with the C/C-13910 genotype did not drink milk at all or consumed a low-lactose diet prior to the genotyping (P<0.004 when compared to the other genotypes). CONCLUSION: Analysis of the C/T-13910 polymorphism is an easy and reliable method for excluding adult-type hypolactasia in children with milk-related symptoms. Genotyping for this variant can be used to guide dietary advice for children with a previous history of cow's milk allergy.

  16. Defining, Designing for, and Measuring "Social Constructivist Digital Literacy" Development in Learners: A Proposed Framework

    Science.gov (United States)

    Reynolds, Rebecca

    2016-01-01

    This paper offers a newly conceptualized modular framework for digital literacy that defines this concept as a task-driven "social constructivist digital literacy," comprising 6 practice domains grounded in Constructionism and social constructivism: Create, Manage, Publish, Socialize, Research, Surf. The framework articulates possible…

  18. Defining and Measuring the Success of Services Contracts in the United States Navy

    Science.gov (United States)

    2012-12-06

    Acronyms: PMBOK = Project Management Body of Knowledge; PMI = Project Management Institute; R&D = Research and Development; SPAWAR = Space and Naval Warfare... The Project Management Institute (PMI) Project Management Body of Knowledge (PMBOK; 2008) defines project life cycle as a collection of generally sequential... a systematic program management approach and are vital to project success. The PMI PMBOK (2008) identifies five project management process groups.

  19. Improvement of Traceability of Widely-Defined Measurements in the Field of Humanities

    Science.gov (United States)

    Sapozhnikova, K.; Taymanov, R.

    2010-01-01

    In recent decades, a tendency to extend the domain of "fuzzy" measurements of multiparametric quantities to the field of humanities has been observed. In the measurement process, such "fuzzy" measurements should meet the requirements of metrological traceability. The paper deals with an approach proposed for developing a measurement model for "fuzzy" measurements. The suggested approach is illustrated by the example of a model for measuring the emotions contained in musical fragments. The model is based on a hypothesis that explains the origination of emotions in the process of evolution.

  20. Defining Cigarette Smoking Status in Young Adults: A Comparison of Adolescent vs Adult Measures

    Science.gov (United States)

    Delnevo, Cristine D.; Lewis, M. Jane; Kaufman, Ira; Abatemarco, Diane J.

    2004-01-01

    Objective: To determine the agreement between 2 measures (adult vs adolescent) of current cigarette smoking among young adults. Methods: We examined data from 1007 young adults from the New Jersey Adult Tobacco Survey. The adult measure incorporates lifetime and present use, whereas the adolescent measure assesses past 30-day use. The kappa…

  2. Relationship Between Surface-Based Brain Morphometric Measures and Intelligence in Autism Spectrum Disorders: Influence of History of Language Delay.

    Science.gov (United States)

    Balardin, Joana Bisol; Sato, João Ricardo; Vieira, Gilson; Feng, Yeu; Daly, Eileen; Murphy, Clodagh; Murphy, Declan; Ecker, Christine

    2015-10-01

    Autism spectrum disorders (ASD) are a group of conditions that show abnormalities in the neuroanatomy of multiple brain regions. The variability in the development of intelligence and language among individuals on the autism spectrum has long been acknowledged, but it remains unknown whether these differences impact the neuropathology of ASD. In this study, we aimed to compare associations between surface-based regional brain measures and general intelligence (IQ) scores in ASD individuals with and without a history of language delay. We included 64 ASD adults of normal intelligence (37 without and 27 with a history of language delay) and 80 neurotypicals. Regions with a significant association between verbal and nonverbal IQ and measures of cortical thickness (CT), surface area, and cortical volume were first identified in the combined sample of individuals with ASD and controls. Thicker dorsal frontal and temporal cortices, and thinner lateral orbital frontal and parieto-occipital cortices, were associated with greater and lower verbal IQ scores, respectively. Correlations between cortical volume and verbal IQ were observed in similar regions as revealed by the CT analysis. A significant difference between ASD individuals with and without a history of language delay in the association between CT and verbal IQ was evident in the parieto-occipital region. These results indicate that ASD subgroups defined on the basis of differential language trajectories in childhood can have different associations between verbal IQ and brain measures in adulthood, despite achieving similar levels of cognitive performance.

  3. The Measurement Process in the Generalized Contexts Formalism for Quantum Histories

    Science.gov (United States)

    Losada, Marcelo; Vanni, Leonardo; Laura, Roberto

    2016-02-01

    In interpretations of quantum mechanics involving quantum histories there is no collapse postulate, and measurement is considered as a quantum interaction between the measured system and the measuring instrument. For two consecutive non-ideal measurements on the same system, we prove that both pointer indications at the end of each measurement are compatible properties in our generalized context formalism for quantum histories. Immediately after the first measurement, an effective state for the measured system is deduced from the formalism, generalizing the state that would be obtained by applying the state collapse postulate.

  4. Design of a Neutron Temporal Diagnostic for measuring DD or DT burn histories at the NIF

    Science.gov (United States)

    Lahmann, B.; Frenje, J. A.; Sio, H.; Petrasso, R. D.; Bradley, D. K.; Le Pape, S.; MacKinnon, A. J.; Isumi, N.; Macphee, A.; Zayas, C.; Spears, B. K.; Hermann, H.; Hilsabeck, T. J.; Kilkenny, J. D.

    2015-11-01

    The DD or DT burn history in Inertial Confinement Fusion (ICF) implosions provides essential information about implosion performance and helps to constrain numerical modeling. The capability of measuring this burn history is thus important for the NIF in its pursuit of ignition. Currently, the Gamma Reaction History (GRH) diagnostic is the only system capable of measuring the burn history for DT implosions with yields greater than ~10^14. To complement GRH, a new NIF Neutron Temporal Diagnostic (NTD) is being designed for measuring the DD or DT burn history at yields greater than ~10^10. A traditional scintillator-based design and a pulse-dilation-based design are being considered. Using MCNPX simulations, both designs have been optimized, validated and contrasted for various types of implosions at the NIF. This work was supported in part by the U.S. DOE, LLNL and LLE.

  5. DEFINING THE 'BLIND SPOT' OF HINODE EIS AND XRT TEMPERATURE MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Winebarger, Amy R.; Cirtain, Jonathan; Mulu-Moore, Fana [NASA Marshall Space Flight Center, VP 62, Huntsville, AL 35812 (United States); Warren, Harry P. [Space Science Division, Naval Research Laboratory, Washington, DC 20375 (United States); Schmelz, Joan T. [Physics Department, University of Memphis, Memphis, TN 38152 (United States); Golub, Leon [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Kobayashi, Ken, E-mail: amy.r.winebarger@nasa.gov [Center for Space Plasma and Aeronomic Research, 320 Sparkman Dr, Huntsville, AL 35805 (United States)

    2012-02-20

    Observing high-temperature, low emission measure plasma is key to unlocking the coronal heating problem. With current instrumentation, a combination of EUV spectral data from the Hinode Extreme-ultraviolet Imaging Spectrometer (EIS; sensitive to temperatures up to 4 MK) and broadband filter data from the Hinode X-ray Telescope (XRT; sensitive to higher temperatures) is typically used to diagnose the temperature structure of the observed plasma. In this Letter, we demonstrate that a 'blind spot' exists in temperature-emission measure space for combined Hinode EIS and XRT observations. For a typical active region core with significant emission at 3-4 MK, Hinode EIS and XRT are insensitive to plasma with temperatures greater than ~6 MK and emission measures less than ~10^27 cm^-5. We then demonstrate that the temperature and emission measure limits of this blind spot depend upon the temperature distribution of the plasma along the line of sight by considering a hypothetical emission measure distribution sharply peaked at 1 MK. For this emission measure distribution, we find that EIS and XRT are insensitive to plasma with emission measures less than ~10^26 cm^-5. We suggest that a spatially and spectrally resolved 6-24 Å spectrum would improve the sensitivity to this high-temperature, low emission measure plasma.

  6. Issues Related to Defining and Measuring Violence Against Women: Response to Kilpatrick

    Science.gov (United States)

    Saltzman, Linda E.

    2004-01-01

    This paper asserts that although there is considerable agreement in the U.S. and internationally about the importance of uniform terminology and measurement related to violence against women, we need a strategy for choosing standardized definitions and measures. Responding to Kilpatrick's comments at the October 2003 national research conference…

  7. Managing Swedish forestry's impact on mercury in fish: Defining the impact and mitigation measures.

    Science.gov (United States)

    Eklöf, Karin; Lidskog, Rolf; Bishop, Kevin

    2016-02-01

    Inputs of anthropogenic mercury (Hg) to the environment have led to accumulation of Hg in terrestrial and aquatic ecosystems, contributing to fish Hg concentrations well above the European Union standards in large parts of Fennoscandia. Forestry operations have been reported to increase the concentrations and loads of Hg to surface waters by mobilizing Hg from the soil. This summary of available forestry effect studies reveals considerable variation in treatment effects on total Hg (THg) and methylmercury (MeHg) at different sites, varying from no effect up to manifold concentration increases, especially for the bioavailable MeHg fraction. Since Hg biomagnification depends on trophic structures, forestry impacts on nutrient flows will also influence the Hg in fish. From this, we conclude that recommendations for best management practices in Swedish forestry operations are appropriate from the perspective of mercury contamination. However, the complexity of defining effective policies needs to be recognized.

  8. Defining and measuring cyberbullying within the larger context of bullying victimization.

    Science.gov (United States)

    Ybarra, Michele L; Boyd, Danah; Korchmaros, Josephine D; Oppenheim, Jay Koby

    2012-07-01

    This study aimed to inform the scientific debate about the measurement of bullying, including cyberbullying. Two split-form surveys were conducted online among 6-17-year-olds (n = 1,200 each) to inform recommendations for cyberbullying measurement. Measures that use the word "bully" result in prevalence rates similar to each other, irrespective of whether a definition is included, whereas measures not using the word "bully" are similar to each other, irrespective of whether a definition is included. A behavioral list of bullying experiences without either a definition or the word "bully" results in higher prevalence rates and likely measures experiences that are beyond the definition of "bullying." Follow-up questions querying differential power, repetition, and bullying over time were used to examine misclassification. The measure using a definition but not the word "bully" appeared to have the highest rate of false positives and, therefore, the highest rate of misclassification. Across two studies, an average of 25% reported being bullied at least monthly in person compared with an average of 10% bullied online, 7% via telephone (cell or landline), and 8% via text messaging. Measures of bullying among English-speaking individuals in the United States should include the word "bully" when possible. The definition may be a useful tool for researchers, but results suggest that it does not necessarily yield a more rigorous measure of bullying victimization. Directly measuring aspects of bullying (i.e., differential power, repetition, over time) reduces misclassification. To prevent double counting across domains, we suggest the following distinctions: mode (e.g., online, in-person), type (e.g., verbal, relational), and environment (e.g., school, home). We conceptualize cyberbullying as bullying communicated through the online mode.

  9. Defining, Measuring, and Incentivizing Sustainable Land Use to Meet Human Needs

    Science.gov (United States)

    Nicholas, K. A.; Brady, M. V.; Olin, S.; Ekroos, J.; Hall, M.; Seaquist, J. W.; Lehsten, V.; Smith, H.

    2016-12-01

    Land is a natural capital that supports the flow of an enormous amount of ecosystem services critical to human welfare. Sustainable land use, which we define as land use that meets both current and future human needs for ecosystem services, is essential to meet global goals for climate mitigation and sustainable development, while maintaining natural capital. However, it is not clear what governance is needed to achieve sustainable land use under multiple goals (as defined by the values of relevant decision-makers and land managers), particularly under climate change. Here we develop a conceptual model for examining the interactions and tradeoffs among multiple goals, as well as their spatial interactions (teleconnections), in research developed using Design Thinking principles. We have selected five metrics for provisioning (food production, and fiber production for wood and energy), regulating and maintenance (climate mitigation and biodiversity conservation), and cultural (heritage) ecosystem services. Using the case of Sweden, we estimate indicators for these metrics using a combination of existing data synthesis and process-based simulation modeling. We also develop and analyze new indicators (e.g., combining data on land use, bird conservation status, and habitat specificity to make a predictive model of bird diversity changes on agricultural or forested land). Our results highlight both expected tradeoffs (e.g., between food production and biodiversity conservation) as well as unexpected opportunities for synergies under different land management scenarios and strategies. Our model also provides a practical way to make decision-maker values explicit by comparing both quantity and preferences for bundles of ecosystem services under various scenarios. We hope our model will help in considering competing interests and shaping economic incentives and governance structures to meet national targets in support of global goals for sustainable management of land

  10. The defined daily dose as a measure of drug consumption in South ...

    African Journals Online (AJOL)

    drugs. The traditional measures, viz. cost and volume studies, have inherent shortcomings when drug consumption … drug utilisation in South Africa. The DDD methodology has … Drug Utilization in Norway during the 1970s - Increases.

  11. Defining and measuring cyberbullying within the larger context of bullying victimization

    Science.gov (United States)

    Ybarra, Michele; boyd, danah; Korchmaros, Josephine; Oppenheim, Jay (Koby)

    2012-01-01

    Methods: Two split-form surveys were conducted online among 6–17 year olds (n=1,200 each) to inform recommendations for cyberbullying measurement. Results: Measures that use the word ‘bully’ result in prevalence rates similar to each other whether or not a definition is included, whereas measures not using the word ‘bully’ are similar to each other whether or not a definition is included. A behavioral list of bullying experiences without either a definition or the word ‘bully’ results in higher prevalence rates and likely measures experiences that are beyond the definition of ‘bullying’. Follow-up questions querying differential power, repetition, and bullying over time were used to examine misclassification. The measure using a definition but not the word ‘bully’ appeared to have the highest rate of false positives and, therefore, the highest rate of misclassification. Across two studies, an average of 25% reported being bullied at least monthly in person compared with an average of 10% bullied online, 7% via telephone (cell or landline), and 8% via text messaging. Conclusions: Measures of bullying among English-speaking samples in the US should include the word ‘bully’ when possible. The definition may be a useful tool for researchers, but results suggest that it does not necessarily yield a more rigorous measure of bullying victimization. Directly measuring aspects of bullying (i.e., differential power, repetition, over time) reduces misclassification. To prevent double counting across categories, we conceptualize cyberbullying as bullying communicated through the online mode; type (e.g., verbal, relational) and environment (e.g., school, home) are additional domains of bullying. PMID:22727077

  12. The History of Rabies in Trinidad: Epidemiology and Control Measures

    Directory of Open Access Journals (Sweden)

    Janine F. R. Seetahal

    2017-07-01

    Vampire bat-transmitted rabies was first recognized in Trinidad during a major outbreak reported in 1925. Trinidad is the only Caribbean island with vampire bat-transmitted rabies. We conducted a literature review to describe the changing epidemiology of rabies in Trinidad and give a historical perspective to rabies prevention and control measures on the island. The last human case of rabies occurred in 1937, and although no case of canine-transmitted rabies has been reported since 1914, sporadic outbreaks of bat-transmitted rabies still occur in livestock. Over the last century, seven notable epidemics were recorded in Trinidad with the loss of over 3000 animals. During the 1950s, several measures were effectively adopted for the prevention and control of the disease, which led to a significant reduction in the number of cases. These measures included vampire bat population control, livestock vaccination, and animal surveillance. However, due to lapses in these measures over the years (e.g., periods of limited vampire bat control and incomplete herd vaccination), epidemics have occurred. In light of the significant negative impact of rabies on animal production and human health, rabies surveillance in Trinidad should be enhanced and cases evaluated towards the design and implementation of more evidence-based prevention and control programs.

  13. Defining allowable physical property variations for high accurate measurements on polymer parts

    Science.gov (United States)

    Mohammadi, A.; Sonne, M. R.; Madruga, D. G.; De Chiffre, L.; Hattel, J. H.

    2016-06-01

    Measurement conditions and material properties have a significant impact on the dimensions of a part, especially for polymer parts. Temperature variation causes part deformations that increase the uncertainty of the measurement process. Current industrial tolerances of a few micrometres demand highly accurate measurements in a non-controlled ambient. Most polymer parts are manufactured by injection moulding, and their inspection is carried out after stabilization, around 200 hours. The overall goal of this work is to reach a measurement uncertainty of ±5 μm on polymer products, which is a challenge in today's production and metrology environments. The residual deformations in polymer products at room temperature after injection molding are important when micrometer accuracy needs to be achieved. Numerical modelling can give valuable insight into what is happening in the polymer during cooling down after injection molding. In order to obtain accurate simulations, accurate inputs to the model are crucial. In reality, however, the material and physical properties will have some variations. Although these variations may be small, they can act as a source of uncertainty for the measurement. In this paper, we investigated how large the variations in material and physical properties are allowed to be in order to reach the 5 μm uncertainty target.

  14. Defining Allowable Physical Property Variations for High Accurate Measurements on Polymer Parts

    DEFF Research Database (Denmark)

    Mohammadi, Ali; Sonne, Mads Rostgaard; Madruga, Daniel González;

    2015-01-01

    cooling down after injection molding. In order to obtain accurate simulations, accurate inputs to the model are crucial. In reality however, the material and physical properties will have some variations. Although these variations may be small, they can act as a source of uncertainty for the measurement...... high accurate measurements in non-controlled ambient. Most of polymer parts are manufactured by injection moulding and their inspection is carried out after stabilization, around 200 hours. The overall goal of this work is to reach ±5μm in uncertainty measurements a polymer products which...... is a challenge in today‘s production and metrology environments. The residual deformations in polymer products at room temperature after injection molding are important when micrometer accuracy needs to be achieved. Numerical modelling can give a valuable insight to what is happening in the polymer during...

  15. Measured and perceived environmental characteristics are related to accelerometer defined physical activity in older adults

    Directory of Open Access Journals (Sweden)

    Strath Scott J

    2012-04-01

    Background: Few studies have investigated both the self-perceived and measured environment with objectively determined physical activity in older adults. Accordingly, the aim of this study was to examine measured and perceived environmental associations with physical activity of older adults residing across different neighborhood types. Methods: One hundred and forty-eight older individuals, mean age 64.3 ± 8.4 years, were randomly recruited from one of four neighborhoods pre-determined as having either high- or low-walkable characteristics. Individual residences were geocoded and 200 m network buffers established. Both objective environment audit and self-perceived environmental measures were collected, in conjunction with accelerometer-derived physical activity behavior. Using both perceived and objective environment data, analysis consisted of a macro-level comparison of physical activity levels across neighborhoods, and a micro-level analysis of individual environmental predictors of physical activity levels. Results: Individuals residing in high-walkable neighborhoods on average engaged in 11 more minutes of moderate to vigorous physical activity per day than individuals residing in low-walkable neighborhoods. Measured access to non-residential destinations (b = .11, p = .031) was a significant predictor of time spent in moderate to vigorous physical activity. Other environmental variables significantly predicting components of physical activity behavior included presence of measured neighborhood crime signage (b = .4785, p = .031), measured street safety (b = 26.8, p = .006), and perceived neighborhood satisfaction (b = 5.8, p = .003). Conclusions: Older adult residents who live in high-walkable neighborhoods, have easy and close access to non-residential destinations, experience lower social dysfunction pertinent to crime, and generally perceive the neighborhood with higher overall satisfaction are likely to engage in higher levels of physical activity.

  16. Measurement of (23)Na(n,2n) cross section in well-defined reactor spectra.

    Science.gov (United States)

    Košťál, Michal; Švadlenková, Marie; Baroň, Petr; Milčák, Ján; Mareček, Martin; Uhlíř, Jan

    2016-05-01

    The present paper aims to compare the calculated and experimental reaction rates of (23)Na(n,2n)(22)Na in a well-defined reactor spectrum of a special core assembled in the LR-0 reactor. The experimentally determined reaction rate, derived using gamma spectroscopy of an irradiated NaF sample, is used for average cross section determination. The resulting spectrum-averaged value is 0.91 ± 0.02 µb. This cross section is important as it is included in the International Reactor Dosimetry and Fusion File and is also relevant to the correct estimation of the long-term activity of Na coolant in Sodium Fast Reactors. The calculations were performed with the MCNP6 code using the ENDF/B-VII.0, JEFF-3.1, JEFF-3.2, JENDL-3.3, JENDL-4, ROSFOND-2010 and CENDL-3.1 nuclear data libraries. Generally the best C/E agreement, within 2%, was found using the ROSFOND-2010 data set, whereas the worst, as high as 40%, was found using ENDF/B-VII.0.
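
    The quantity reported here is a spectrum-averaged cross section, i.e. the flux-weighted integral of the pointwise cross section, compared against experiment as a C/E ratio. A toy numerical version follows; the energy grid, cross-section shape and spectrum are synthetic placeholders, not LR-0 or ENDF data.

```python
import numpy as np

# Spectrum-averaged cross section:
#   sigma_avg = integral(sigma(E) * phi(E) dE) / integral(phi(E) dE)
# A real evaluation would use pointwise ENDF data and the LR-0 spectrum;
# everything below is a synthetic placeholder.
energy = np.linspace(12.0, 20.0, 200)                           # MeV
sigma = np.where(energy > 12.4, 0.04 * (energy - 12.4), 0.0)    # barns, toy threshold shape
phi = np.exp(-energy / 1.4)                                     # toy high-energy spectrum tail

sigma_avg = np.trapz(sigma * phi, energy) / np.trapz(phi, energy)

# C/E comparison as used in the paper: calculated over experimental value.
experimental = sigma_avg * 1.02    # pretend measurement
print(f"spectrum-averaged sigma = {sigma_avg:.4f} b, C/E = {sigma_avg / experimental:.3f}")
```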

  17. The Valued Living Questionnaire: Defining and Measuring Valued Action within a Behavioral Framework

    Science.gov (United States)

    Wilson, Kelly G.; Sandoz, Emily K.; Kitchens, Jennifer; Roberts, Miguel

    2010-01-01

    A number of cognitive-behavior therapies now strongly emphasize particular behavioral processes as mediators of clinical change specific to that therapy. This shift in emphasis calls for the development of measures sensitive to changes in the therapies' processes. Among these is acceptance and commitment therapy (ACT), which posits valued living…

  18. Defining and measuring service awareness among elders and caregivers of Mexican descent.

    Science.gov (United States)

    Crist, Janice D; Michaels, Cathleen; Gelfand, Donald E; Phillips, Linda R

    2007-01-01

    Mexican American elders' and their caregivers' awareness of available home care services is one of nine factors hypothesized to be associated with underuse of home care services. Previous instruments did not fully measure service awareness. The objective of this study was to explore the conceptual foundation of service awareness, generate items, and establish language equivalence in Spanish and English for the Service Awareness Scale. A hybrid of literature review and fieldwork was used to develop the concept and generate items. The team used back-translation and community collaboration to test for language equivalence. Concept development and language equivalence were achieved for the Service Awareness Scale. Teaching/learning theories contributed to the definition and inductive validity of service awareness and to item generation, and can shape future interventions. Bicultural/bilingual community and research team partners refined the measure. The scale will be usable in research and practice designed to promote equity in health care use.

  19. Anatomic guidelines defined by reformatting images on MRI for volume measurement of amygdala and hippocampus

    Energy Technology Data Exchange (ETDEWEB)

    Hoshida, Tohru; Sakaki, Toshisuke [Nara Medical Univ., Kashihara (Japan); Uematsu, Sumio

    1995-03-01

    Twelve patients with intractable partial epilepsy underwent MR scans at the Epilepsy Center of the Johns Hopkins Hospital. There were five women and seven men, ranging in age from five to 51 years (mean age: 26 years). Coronal images were obtained using a 3-D SPGR sequence, transferred to an Allegro 5.1 workstation, and reformatted along the cardinal axes (axial and sagittal) from multiple viewpoints. The anterior end of the amygdala was measured at the level just posterior to the disappearance of the temporal stem. The semilunar gyrus of the amygdala was separated from the ambient gyrus by the semianular sulcus, which forms the boundary between the amygdala and the entorhinal cortex. The delineation of the hippocampal formation included the subicular complex, hippocampus proper, dentate gyrus, alveus, and fimbria. The uncal cleft separated the uncus above from the parahippocampal gyrus below; the roof of this cleft was formed by the hippocampus and the dentate gyrus, and the floor by the presubiculum and subiculum. Even with these guidelines, strictly separating the hippocampal head from the posterior part of the amygdala was not feasible, as previously reported, because of the isointensity on MRI between the cortex of the amygdala and the hippocampus. The most posterior portion of the hippocampus was measured at the level of the subsplenial gyri, just below the splenium of the corpus callosum, to measure the hippocampal volume in its near totality. It is therefore reliable, and clinically useful, to measure the combined total volume of the amygdala and the hippocampus when comparing results with those of other centers.

  20. Defining Allowable Physical Property Variations for High Accurate Measurements on Polymer Parts

    DEFF Research Database (Denmark)

    Mohammadi, Ali; Sonne, Mads Rostgaard; Madruga, Daniel González

    2015-01-01

    cooling down after injection molding. In order to obtain accurate simulations, accurate inputs to the model are crucial. In reality however, the material and physical properties will have some variations. Although these variations may be small, they can act as a source of uncertainty for the measurement....... In this paper, we investigated how big the variation in material and physical properties are allowed in order to reach the 5 μm target on the uncertainty....

  1. Defining and measuring the costs of the HIV epidemic to business firms.

    OpenAIRE

    Farnham, P G

    1994-01-01

    Most published estimates of the costs of the epidemic of human immunodeficiency virus (HIV) infection and acquired immunodeficiency syndrome (AIDS) have been developed from the societal perspective, attempting to measure the burden of the epidemic to society in this country. Although societal cost analysis is well-developed, relatively little is known about many of the factors influencing the costs of the epidemic to business firms. The business community may bear a substantial portion of tho...

  2. Learning and cognitive fatigue trajectories in multiple sclerosis defined using a burst measurement design.

    Science.gov (United States)

    Holtzer, Roee; Foley, Frederick; D'Orio, Vanessa; Spat, Jessica; Shuman, Melissa; Wang, Cuiling

    2013-10-01

    Compromised learning and cognitive fatigue are critical clinical features in multiple sclerosis. This study was designed to determine the effect of repeated exposures within and across study visits on performance measures of learning and cognitive fatigue in relapsing-remitting multiple sclerosis (RRMS). Thirty patients with RRMS and 30 controls were recruited. Using a burst measurement design (i.e. repeated assessments within and across study visits) the oral version of the Symbol Digit Modalities Test (SDMT) was administered three times during the baseline and two consecutive monthly follow-up visits for a total of nine test administrations. Learning was assessed within and across study visits whereas cognitive fatigue was assessed during the course of each test administration that was divided into three 30-second intervals. Linear mixed-effect models revealed compromised learning within (95% CI: 2.6355 to 3.9867) and across (95% CI: 1.3250 to 3.1861) visits and worse cognitive fatigue (95% CI: -2.1761 to -0.1720) in patients with RRMS compared with controls. Among patients with RRMS, worse self-rated cognitive dysfunction predicted poor learning within (95% CI: -0.1112 to -0.0020) and across (95% CI: -0.0724 to -0.0106) visits. Burst design is optimal to study learning and cognitive fatigue. This methodology, using the SDMT or other time-efficient tests as outcome measures, can be successfully implemented in longitudinal studies and clinical trials.

  3. Measuring concurrency using a joint multistate and point process model for retrospective sexual history data.

    Science.gov (United States)

    Aralis, Hilary J; Gorbach, Pamina M; Brookmeyer, Ron

    2016-10-30

    Understanding the impact of concurrency, defined as overlapping sexual partnerships, on the spread of HIV within various communities has been complicated by difficulties in measuring concurrency. Retrospective sexual history data consisting of first and last dates of sexual intercourse for each previous and ongoing partnership is often obtained through use of cross-sectional surveys. Previous attempts to empirically estimate the magnitude and extent of concurrency among these surveyed populations have inadequately accounted for the dependence between partnerships and used only a snapshot of the available data. We introduce a joint multistate and point process model in which states are defined as the number of ongoing partnerships an individual is engaged in at a given time. Sexual partnerships starting and ending on the same date are referred to as one-offs and modeled as discrete events. The proposed method treats each individual's continuation in and transition through various numbers of ongoing partnerships as a separate stochastic process and allows the occurrence of one-offs to impact subsequent rates of partnership formation and dissolution. Estimators for the concurrent partnership distribution and mean sojourn times during which a person has k ongoing partnerships are presented. We demonstrate this modeling approach using epidemiological data collected from a sample of men having sex with men and seeking HIV testing at a Los Angeles clinic. Among this sample, the estimated point prevalence of concurrency was higher among men later diagnosed HIV positive. One-offs were associated with increased rates of subsequent partnership dissolution.
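
    The counting step behind a point prevalence of concurrency can be illustrated directly from retrospective interval data: count the partnerships ongoing on a given date, treating same-day start/end pairs as discrete one-offs. This is a simplified sketch of that step only, not the authors' joint multistate and point process model.

```python
from datetime import date

# Each partnership is (first_sex, last_sex); same-day pairs are "one-offs"
# and are treated as discrete events, not ongoing partnerships.
partnerships = [
    (date(2015, 1, 10), date(2015, 6, 1)),
    (date(2015, 5, 15), date(2015, 5, 15)),   # one-off
    (date(2015, 4, 20), date(2015, 9, 30)),
]

def ongoing_count(history, t):
    """Number of partnerships ongoing at time t, excluding one-offs."""
    return sum(start <= t <= end and start != end for start, end in history)

t = date(2015, 5, 20)
k = ongoing_count(partnerships, t)
print(f"{k} ongoing partnerships on {t}; concurrent: {k >= 2}")
```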

  4. Survivor-Defined Practice in Domestic Violence Work: Measure Development and Preliminary Evidence of Link to Empowerment.

    Science.gov (United States)

    Goodman, Lisa A; Thomas, Kristie; Cattaneo, Lauren Bennett; Heimel, Deborah; Woulfe, Julie; Chong, Siu Kwan

    2016-01-01

    Survivor-defined practice, characterized by an emphasis on client choice, partnership, and sensitivity to the unique needs, contexts, and coping strategies of individual survivors, is an aspirational goal of the domestic violence (DV) movement, assumed to be a key contributor to empowerment and other positive outcomes among survivors. Despite its central role in DV program philosophy, training, and practice, however, our ability to assess its presence and its presumed link to well-being has been hampered by the absence of a way to measure it from survivors' perspectives. As part of a larger university-community collaboration, this study had two aims: (a) to develop a measure of survivor-defined practice from the perspective of participants, and (b) to assess its relationship to safety-related empowerment after controlling for other contributors to survivor well-being (e.g., financial stability and social support). Results supported the reliability and validity of the Survivor-Defined Practice Scale (SDPS), a nine-item measure that assesses participants' perception of the degree to which their advocates help them achieve goals they set for themselves, facilitate a spirit of partnership, and show sensitivity to their individual needs and styles. The items combined to form one factor indicating that the three theoretical aspects of survivor-defined practice may be different manifestations of one underlying construct. Results also support the hypothesized link between survivor-defined practice and safety-related empowerment. The SDPS offers DV programs a mechanism for process evaluation that is rigorous and rooted in the feminist empowerment philosophy that so many programs espouse.
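
    Reliability of a short multi-item scale such as the nine-item SDPS is conventionally summarized with Cronbach's alpha. A self-contained computation on simulated item responses follows; the data are illustrative, and the paper does not publish this exact procedure.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of scale totals
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                      # shared underlying construct
responses = latent + 0.6 * rng.normal(size=(100, 9))    # nine correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```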

  5. Defining the Spacecraft Angular Position in the Orbital Stabilization Mode from the Angular Rate Sensor Measurements

    Directory of Open Access Journals (Sweden)

    A. S. Oleynik

    2015-01-01

    The spacecraft orientation mode relative to a given reference point is implemented using sensor equipment that measures the spacecraft's angular position relative to a given reference coordinate system and its angular rate relative to inertial space. If one or another piece of sensor equipment fails, implementing the orientation mode requires a method for estimating the missing component of the state vector. This work states the task of estimating the spacecraft's angular position vector relative to the orbital coordinate system using the measurement readings of an angular rate sensor. This formulation is typically used to build reserve algorithms for orbital stabilization. The solution uses an approach based on decomposition of the linearized model of the spacecraft's angular motion and identification theory with exact pole placement. To maximize the convergence rate of the algorithm, the eigenvalues of the discrete system are placed at zero. The simulation results proved the efficiency of the resulting algorithm for estimating the angular position of the spacecraft relative to the orbital coordinate system. Analysis of the computational cost of the algorithm indicates that real-time implementation is possible, which is of great importance for implementing this algorithm in the onboard computer of the spacecraft. This work is a logical continuation of the authors' previous publications. The earlier works investigated the possibility of constructing linear observers for nonlinear dynamic systems, and solved the problems of building orbital and inertial orientation from measurements of the spacecraft's position relative to specified reference points (Earth, stars) when the angular velocity sensor failed.
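
    The flavor of the approach can be sketched on a toy single-axis model: estimate pitch angle from rate-gyro readings for gravity-gradient pitch dynamics, with the discrete observer eigenvalues placed at zero (deadbeat), matching the pole choice the abstract describes. The dynamics, sample time and gains below are illustrative assumptions, not the authors' model.

```python
import numpy as np
from scipy.linalg import expm

# Linearized single-axis pitch dynamics in orbit (gravity gradient):
#   theta_ddot = -3*w0^2*theta, states x = [theta, theta_dot], measured y = theta_dot.
w0 = 1.1e-3                                  # orbital rate, rad/s (illustrative)
A = np.array([[0.0, 1.0], [-3 * w0**2, 0.0]])
C = np.array([[0.0, 1.0]])

dt = 1.0
Ad = expm(A * dt)                            # exact discretization

# Deadbeat observer gain via Ackermann's formula for the pair (Ad, C):
# desired characteristic polynomial q(z) = z^2, so q(Ad) = Ad @ Ad.
O = np.vstack([C, C @ Ad])                   # observability matrix
L = Ad @ Ad @ np.linalg.inv(O) @ np.array([[0.0], [1.0]])

# Check: both eigenvalues of the error dynamics (Ad - L C) are ~0.
print(np.linalg.eigvals(Ad - L @ C))

# Run the observer on simulated rate measurements.
x = np.array([[0.05], [0.0]])                # true state: 0.05 rad offset
xhat = np.zeros((2, 1))                      # observer starts ignorant
for _ in range(3):
    y = C @ x                                # rate-sensor reading
    xhat = Ad @ xhat + L @ (y - C @ xhat)    # predictor-corrector update
    x = Ad @ x
print(f"estimated theta = {xhat[0, 0]:.4f}, true theta = {x[0, 0]:.4f}")
```

    Because the rate measurement couples to the angle only weakly (through the small orbital rate), the deadbeat gain comes out large; the convergence-speed versus noise trade-off this implies is exactly what such reserve algorithms must balance in practice.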

  6. The Criterion A problem revisited: controversies and challenges in defining and measuring psychological trauma.

    Science.gov (United States)

    Weathers, Frank W; Keane, Terence M

    2007-04-01

    The Criterion A problem in the field of traumatic stress refers to the stressor criterion for posttraumatic stress disorder (PTSD) and involves a number of fundamental issues regarding the definition and measurement of psychological trauma. These issues first emerged with the introduction of PTSD as a diagnostic category in the Diagnostic and Statistical Manual of Mental Disorders, Third Edition (DSM-III; American Psychiatric Association, 1980) and continue to generate considerable controversy. In this article, the authors provide an update on the Criterion A problem, with particular emphasis on the evolution of the DSM definition of the stressor criterion and the ongoing debate regarding broad versus narrow conceptualizations of traumatic events.

  7. A Brief History of Attempts to Measure Sexual Motives

    Directory of Open Access Journals (Sweden)

    Elaine Hatfield

    2012-12-01

    Artists, creative writers, and musicians have long been interested in the complex motives that spark passionate love, sexual desire, and sexual behavior. Recently, scholars from a variety of disciplines have begun to investigate two questions: “Why do men and women choose to engage in sexual liaisons?” “Why do they avoid such encounters?” Theories abound. Many theorists have complained that there exists a paucity of scales designed to measure the plethora of motives that prompt people to seek out or to avoid sexual activities. In fact, this observation is incorrect. Many such scales of documented reliability and validity do exist. The reason that few scholars are familiar with these scales is that they were developed by psychometricians from a variety of disciplines and are scattered about in an assortment of journals, college libraries, and researchers’ desk drawers, thus making them difficult to identify and locate. This paper will attempt to provide a compendium of all known sexual motives scales, hoping that this will encourage scholars to take a multidisciplinary approach in developing typologies of sexual motives and/or in conducting their own research into the nature of sexual motives.

  8. Gas permeation measurement under defined humidity via constant volume/variable pressure method

    KAUST Repository

    Jan Roman, Pauls

    2012-02-01

    Many industrial gas separations in which membrane processes are feasible entail high water vapour contents, as in CO2 separation from flue gas in carbon capture and storage (CCS), or in biogas/natural gas processing. Studying the effect of water vapour on gas permeability through polymeric membranes is essential for materials design and optimization of these membrane applications. In particular, for amine-based CO2-selective facilitated transport membranes, water vapour is necessary for carrier-complex formation (Matsuyama et al., 1996; Deng and Hägg, 2010; Liu et al., 2008; Shishatskiy et al., 2010) [1-4]. But conventional polymeric membrane materials can also vary in their permeation behaviour due to water-induced swelling (Potreck, 2009) [5]. Here we describe a simple approach to gas permeability measurement in the presence of water vapour, in the form of a modified constant volume/variable pressure method (pressure increase method).
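
    In the constant volume/variable pressure method, permeability follows from the steady-state pressure-rise slope in a known downstream volume. A hedged arithmetic sketch with invented numbers follows; the paper's modified humid-gas setup additionally controls water partial pressures, which is not modeled here.

```python
# Constant volume / variable pressure method: the permeate accumulates in a
# known downstream volume V, and the steady pressure-rise slope dp/dt gives
# the molar flux via the ideal gas law.  All numbers are illustrative.

R = 8.314           # J / (mol K)
T = 303.0           # K
V = 30e-6           # downstream volume, m^3
A = 2.0e-4          # membrane area, m^2
l = 100e-6          # membrane thickness, m
p_feed = 2.0e5      # upstream partial pressure of the test gas, Pa
dp_dt = 0.05        # measured downstream pressure rise, Pa/s

flux = dp_dt * V / (R * T * A)          # mol / (m^2 s)
permeability = flux * l / p_feed        # mol m / (m^2 s Pa)
print(f"flux = {flux:.3e} mol/(m^2 s), permeability = {permeability:.3e} mol m/(m^2 s Pa)")
```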

  9. Defining and measuring blood donor altruism: a theoretical approach from biology, economics and psychology.

    Science.gov (United States)

    Evans, R; Ferguson, E

    2014-02-01

    While blood donation is traditionally described as a behaviour motivated by pure altruism, the assessment of altruism in the blood donation literature has not been theoretically informed. Drawing on theories of altruism from psychology, economics and evolutionary biology, it is argued that a theoretically derived psychometric assessment of altruism is needed. Such a measure is developed in this study that can be used to help inform both our understanding of the altruistic motives of blood donors and recruitment intervention strategies. A cross-sectional survey (N = 414), with a 1-month behavioural follow-up (time 2, N = 77), was designed to assess theoretically derived constructs from psychological, economic and evolutionary biological theories of altruism. Theory of planned behaviour (TPB) variables and co-operation were also assessed at time 1, and a measure of behavioural co-operation at time 2. Five theoretical dimensions (impure altruism, kinship, self-regarding motives, reluctant altruism and egalitarian warm glow) of altruism were identified through factor analyses. These five altruistic motives differentiated blood donors from non-donors (donors scored higher on impure altruism and reluctant altruism), showed incremental validity over TPB constructs in predicting donor intention, and predicted future co-operative behaviour. These findings show that altruism in the context of blood donation is multifaceted and complex and does not reflect pure altruism. This has implications for recruitment campaigns that focus solely on pure altruism.

  10. Assessment and measurement in neuropsychiatry: a conceptual history.

    Science.gov (United States)

    Berrios, German E; Marková, Ivana S

    2002-01-01

    Since the time the parent discipline of psychiatry became organized as a profession, one of its ludi saeculares (neuropsychiatry) has enjoyed at least four vogues. On each, neuropsychiatry has been known to ally itself to a cause: currently it is the big business of neurobiology. This move can be seen as scientific progress or as a side-effect of the (professional rather than scientific) infighting that affected neuromedicine during the late 19th century and which led to the construction of the notion of "neurological disease." Alienists responded to this variously: some, like Kahlbaum and Kraepelin, accepted the split and returned to the more botanico approach; others, like Ziehen, chose psychology; yet others, like Freud, delved in hermeneutics; lastly, there were those, like Meynert, Wernicke, Von Monakow, and Liepmann, who sought an accommodation with neurology. Born out of this compromise, neuropsychiatry has remained a blurred activity (whose definitions range from "psychiatry of neurology" to a crusade for the "naturalization of the mind"). Neuropsychiatric assessment is a methodology designed to collect information about patients whose mental symptoms are thought to be caused by brain disease. When it first appeared, it was torn by the debate between "nomothetic versus idiographic" science. For a time, neuropsychiatric assessment techniques stuck to the old personalized narratives characteristic of 19th-century "casenotes" (trying to meet their descriptive, explanatory, therapeutic, legal, and ethical obligations). But during the late 19th century, measurement and quantification became part of the new rhetoric of science. Soon enough this affected psychology in general and neuropsychology in particular, and neuropsychiatric assessment followed suit. It has changed little since, except that now and again old tests and markers are replaced by more "reliable" ones and phenomenological data are squeezed out further. Its laudable enthusiasm for objectivity and…

  11. ANALYSIS OF MUTATIONS OF TUBERCULOUS MYCOBACTERIA DEFINING DRUG RESISTANCE IN HIV POSITIVE AND HIV NEGATIVE TUBERCULOSIS PATIENTS WITHOUT PRIOR HISTORY OF TREATMENT IN SVERDLOVSK REGION

    Directory of Open Access Journals (Sweden)

    G. V. Panov

    2017-01-01

    Goal of the study: to identify the profile of mutations of tuberculous mycobacteria responsible for resistance to anti-tuberculosis drugs in HIV positive and HIV negative tuberculosis patients without prior history of treatment. Materials and methods: 165 strains of tuberculous mycobacteria from HIV positive patients and 166 strains from HIV negative patients were studied in Sverdlovsk Region (TB Dispensary, Yekaterinburg). Mutations in genes were identified using TB-BIOCHIP® and TB-BIOCHIP®-2 microchips in compliance with the manufacturer's guidelines (OOO Biochip-IMB, Moscow). Results: 85/165 (51.52%) strains isolated from HIV positive tuberculosis patients and 58/166 (34.94%) strains isolated from tuberculosis patients not associated with HIV possessed the MDR genotype (p < 0.01). The majority of MDR strains had mutations in the 531st codon of rpoB (Ser→Leu) and the 315th codon of katG (Ser→Thr) (64/85, 75.29% and 38/58, 65.52% in the respective groups), resulting in a high level of resistance to rifampicin and isoniazid. Each group also contained an approximately equal share (11/165, 6.67% and 12/166, 7.23% in the respective groups) of strains with genomic mutations conferring resistance to isoniazid, rifampicin and fluoroquinolones. No significant difference was found in the mutation patterns of tuberculous mycobacteria isolated from HIV positive and HIV negative tuberculosis patients.

  12. What is culture in «cultural economy»? Defining culture to create measurable models in cultural economy

    Directory of Open Access Journals (Sweden)

    Aníbal Monasterio Astobiza

    2017-07-01

    The idea of culture is somewhat vague and ambiguous for the formal goals of economics. The aim of this paper is to define the notion of culture more precisely so as to help build economic explanations based on culture and thereby measure its impact on the activities and beliefs associated with culture. According to the canonical evolutionary definition, culture is any kind of ritualised behaviour that becomes meaningful for a group, remains more or less constant, and is transmitted down through the generations. Economic institutions are founded, implicitly or explicitly, on a worldview of how humans function; culture is an essential part of understanding ourselves as humans, making it necessary to describe correctly what we understand by culture. In this paper we review the literature on evolutionary anthropology and psychology dealing with the concept of culture and warn that economic modelling ignores the intangible benefits of culture, rendering economics unable to measure certain cultural items in the digital consumer society.

  13. A Model of the Dynamic Error as a Measurement Result of Instruments Defining the Parameters of Moving Objects

    Science.gov (United States)

    Dichev, D.; Koev, H.; Bakalova, T.; Louda, P.

    2014-08-01

    The present paper considers a new model for the formation of the inertial component of dynamic error. It is effective in the analysis and synthesis of measuring instruments that are positioned on moving objects and measure their movement parameters. The block diagram developed within this paper is used as a basis for defining the mathematical model. The block diagram is based on a set-theoretic description of the measuring system, its input and output quantities, and the process of dynamic error formation. The model reflects the specific nature of the formation of the inertial component of dynamic error and follows the logical interrelation and sequence of the physical processes that form it. The effectiveness, usefulness and advantages of the proposed model are rooted in the wide range of possibilities it provides for the analysis and synthesis of such measuring instruments, the formulation of algorithms and optimization criteria, and the development of new intelligent measuring systems with improved accuracy characteristics in dynamic mode.

  14. AN EXPERIMENTAL TECHNIQUE TO MEASURE PROJECTILE DECELERATION HISTORY DURING NORMAL PENETRATION INTO PLAIN CONCRETE

    Institute of Scientific and Technical Information of China (English)

    Liu Xiaohu; Liu Ji; Wang Cheng

    2000-01-01

    The present paper presents a new experimental method to measure the deceleration time history of projectiles penetrating into concrete in full-scale tests. The experiment is carried out by using an onboard accelerometer to measure the projectile deceleration history, with the data transmitted to a ground recording system. With this experimental method, a series of tests on hemisphere-nose steel projectiles penetrating normally into plain concrete at velocities of 150-400 m/s has been executed and the deceleration histories obtained. The high-frequency portion of the deceleration data has been investigated and shown to be the structural response of the projectile. The characteristics of the deceleration history have also been analyzed and discussed.
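
    Once a deceleration history is in hand, the velocity and penetration-depth histories follow by time integration. A short sketch with a synthetic half-sine pulse standing in for the accelerometer record; the pulse shape and numbers are invented for illustration.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Synthetic deceleration pulse (half-sine, ~1 ms) standing in for the
# onboard accelerometer record; real data would come from the telemetry.
t = np.linspace(0.0, 1.5e-3, 1500)                        # s
a = 8.0e4 * np.sin(np.pi * t / 1.0e-3) * (t <= 1.0e-3)    # m/s^2, deceleration

v0 = 300.0                                                # striking velocity, m/s
v = v0 - cumulative_trapezoid(a, t, initial=0.0)          # velocity history
x = cumulative_trapezoid(v, t, initial=0.0)               # penetration depth

print(f"residual velocity = {v[-1]:.1f} m/s, depth at end = {x[-1]*100:.1f} cm")
```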

  15. The Star Formation History of Galaxies Measured from Individual Pixels. I. The Hubble Deep Field North

    CERN Document Server

    Conti, A; Hopkins, A M; Budavari, T; Szalay, A S; Csabai, I; Schmidt, S J; Adams, C; Petrovic, N D; Conti, Alberto; Connolly, Andrew J.; Hopkins, Andrew M.; Szalay, Alex S.; Csabai, Istvan; Schmidt, Samuel J.; Adams, Carla; Petrovic, Nada

    2003-01-01

    We analyze the photometric information contained in individual pixels of galaxies in the Hubble Deep Field North (HDFN) using a new technique, _pixel-z_, that combines predictions of evolutionary synthesis models with photometric redshift template fitting. Each spectral energy distribution template is the result of modeling the detailed physical processes affecting gas properties and star formation efficiency. The criterion chosen to generate the SED templates is that of sampling a wide range of physical characteristics such as age, star formation rate, obscuration and metallicity. A key feature of our method is the sophisticated use of error analysis to generate error maps that define the reliability of the template fitting on pixel scales and allow for the separation of the interplay among dust, metallicity and star formation histories. This technique offers a number of advantages over traditional integrated color studies. As a first application, we derive the star formation and metallicity histories of gal...

  16. Measurement and calculation of fast neutron and gamma spectra in well defined cores in LR-0 reactor.

    Science.gov (United States)

    Košťál, Michal; Matěj, Zdeněk; Cvachovec, František; Rypar, Vojtěch; Losa, Evžen; Rejchrt, Jiří; Mravec, Filip; Veškrna, Martin

    2017-02-01

    A well-defined neutron spectrum is essential for many types of experiments and is also important for both calibration and testing of spectrometric and dosimetric detectors. Provided it is well described, such a spectrum can also be employed as a reference neutron field suitable for validating selected cross sections. The present paper aims to compare calculations and measurements of such well-defined spectra in geometrically similar cores of the LR-0 reactor with fuel of slightly different enrichments (2%, 3.3% and 3.6%). The feature common to all cores is a centrally located dry channel which can be used for the insertion of studied materials. The calculation of neutron and gamma spectra was realized with the MCNP6 code using the ENDF/B-VII.0, JEFF-3.1, JENDL-3.3, ROSFOND-2010 and CENDL-3.1 nuclear data libraries. Only minor differences in neutron and gamma spectra were found in the comparison of the presented reactor cores with different fuel enrichments. One exception is the gamma spectrum in the higher energy region (above 8 MeV), where more pronounced variations could be observed.

  17. Surface Charge Measurement of SonoVue, Definity and Optison: A Comparison of Laser Doppler Electrophoresis and Micro-Electrophoresis.

    Science.gov (United States)

    Ja'afar, Fairuzeta; Leow, Chee Hau; Garbin, Valeria; Sennoga, Charles A; Tang, Meng-Xing; Seddon, John M

    2015-11-01

    Microbubble (MB) contrast-enhanced ultrasonography is a promising tool for targeted molecular imaging. It is important to determine the MB surface charge accurately as it affects the MB interactions with cell membranes. In this article, we report the surface charge measurement of SonoVue, Definity and Optison. We compare the performance of the widely used laser Doppler electrophoresis with an in-house micro-electrophoresis system. By optically tracking MB electrophoretic velocity in a microchannel, we determined the zeta potentials of MB samples. Using micro-electrophoresis, we obtained zeta potential values for SonoVue, Definity and Optison of -28.3, -4.2 and -9.5 mV, with relative standard deviations of 5%, 48% and 8%, respectively. In comparison, laser Doppler electrophoresis gave -8.7, +0.7 and +15.8 mV with relative standard deviations of 330%, 29,000% and 130%, respectively. We found that the reliability of laser Doppler electrophoresis is compromised by MB buoyancy. Micro-electrophoresis determined zeta potential values with a 10-fold improvement in relative standard deviation.
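
    Micro-electrophoresis recovers the zeta potential from the optically tracked velocity through an electrokinetic relation such as the Helmholtz-Smoluchowski equation. A worked sketch with invented tracking numbers follows; this is not the authors' code, and the most appropriate electrokinetic model for buoyant microbubbles may differ.

```python
# Helmholtz-Smoluchowski relation: zeta = mu * eta / epsilon, with
# electrophoretic mobility mu = v / E.  Numbers are invented for illustration.

EPS0 = 8.854e-12        # vacuum permittivity, F/m
eps_r = 78.5            # relative permittivity of water
eta = 1.0e-3            # dynamic viscosity of water, Pa s

v = -2.0e-6             # tracked microbubble velocity in the channel, m/s
E = 1.0e3               # applied electric field, V/m

mu = v / E                              # mobility, m^2/(V s)
zeta = mu * eta / (EPS0 * eps_r)        # zeta potential, V
print(f"mobility = {mu:.2e} m^2/(V s), zeta = {zeta * 1e3:.1f} mV")
```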

  18. History Culture and Tradition in Helon Habila’s Measuring Time

    Directory of Open Access Journals (Sweden)

    Juliet Tenshak

    2014-09-01

    Nigerian literature has evolved over the past fifty years and no longer looks as it did when first-generation writers Chinua Achebe, Wole Soyinka and their contemporaries first started to write in the late 1950s. Nigeria itself has changed greatly since the time of colonialism and nationalism. But present-generation Nigerian literary artists, even though swamped by globalization and neo-colonialism, continue to tread the path of the writers before them by reiterating in their works the need to engage with and confront the distorted and sometimes untold histories of their societies. Helon Habila in Measuring Time (2007) presents the simple statement that a society's present is better understood if its history is better known in all its glory and shame. He goes on to show a concern with the need for the people to be the ones to voice or relate that history. Writing on issues that are not only relevant but also timely, he shows that the more fully we understand our past, the better we are likely to understand ourselves. Against this background, the present paper explores Helon Habila's concern with history, culture and tradition in Measuring Time, with the intention of highlighting his presentation and reassessment of these within the threshold of governance in Keti in particular and in Nigeria in general.

  19. A Bayesian Retrieval of Greenland Ice Sheet Internal Temperature from Ultra-wideband Software-defined Microwave Radiometer (UWBRAD) Measurements

    Science.gov (United States)

    Duan, Y.; Durand, M. T.; Jezek, K. C.; Yardim, C.; Bringer, A.; Aksoy, M.; Johnson, J.

    2015-12-01

    The ultra-wideband software-defined microwave radiometer (UWBRAD) is designed to provide an ice sheet internal temperature product by measuring low-frequency microwave emission. Twelve channels ranging from 0.5 to 2.0 GHz are covered by the instrument. A Bayesian framework was designed to retrieve the ice sheet internal temperature from UWBRAD brightness temperature (Tb) measurements for the Greenland airborne demonstration scheduled for summer 2016. Several parameters affect the ice sheet physical temperature; the effective surface temperature, the geothermal heat flux and the variance of the upper-layer ice density were treated as unknown random variables within the retrieval framework. Synthetic brightness temperatures were calculated with snow radiative transfer models as a function of ice temperature, ice density, and an estimate of snow grain size in the upper layers. An incoherent model, the Microwave Emission Model of Layered Snowpacks (MEMLS), and a coherent model were used to estimate the influence of coherent effects. The inputs to the radiative transfer model were generated from the 1-D heat-flow equation developed by Robin and an exponential fit of ice density variation from borehole measurements. The simulated Tb was corrupted with white noise and served as the UWBRAD observation in the retrieval. A look-up table was developed between the parameters and the corresponding Tb. In the Bayesian retrieval process, each parameter was assigned a possible range and assumed to be uniformly distributed. The Markov Chain Monte Carlo (MCMC) approach was applied to let the unknown parameters randomly walk in the parameter space. Experimental results were examined against science goals on three levels: estimation of the 10-m firn temperature, the average temperature integrated with depth, and the entire temperature profile. The 10-m temperature was estimated to within 0.77 K, with a bias of 0.6 K, across the 47 locations on the ice sheet; the 10-m "synthetic true…
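
    The retrieval loop (uniform priors on a few physical parameters, a forward model evaluated against the 12-channel Tb vector, and a Metropolis random walk) can be sketched generically. The toy forward model below is a stand-in for the radiative transfer code, and all parameter values and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the radiative-transfer forward model: maps (surface
# temperature K, geothermal flux W/m^2, density variance) to 12 channel Tbs.
freqs = np.linspace(0.5, 2.0, 12)      # GHz, UWBRAD channels
def forward(theta):
    ts, ghf, dvar = theta
    return ts - 40.0 * ghf * freqs - 5.0 * dvar * freqs**2

bounds = np.array([[230.0, 260.0], [0.03, 0.07], [0.0, 1.0]])   # uniform priors
truth = np.array([245.0, 0.05, 0.4])
obs = forward(truth) + rng.normal(0.0, 0.5, freqs.size)         # noisy "measurement"

def log_post(theta):
    if np.any(theta < bounds[:, 0]) or np.any(theta > bounds[:, 1]):
        return -np.inf                 # outside the prior box
    return -0.5 * np.sum((forward(theta) - obs) ** 2 / 0.5**2)

# Metropolis random walk over the parameter box.
theta = bounds.mean(axis=1)
step = 0.02 * (bounds[:, 1] - bounds[:, 0])
samples = []
for _ in range(20000):
    prop = theta + step * rng.normal(size=3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5000:])     # drop burn-in
print("posterior mean:", samples.mean(axis=0), "truth:", truth)
```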

  20. Love as a subjective correlate of interpersonal relationships: attempts at defining the concepts and methods of measurement

    Directory of Open Access Journals (Sweden)

    O. P. Zolotnyik

    2015-04-01

    This article provides an overview of the scientific study of the phenomenon of love. Attempts at scientific understanding are presented through the theories of love developed by sociologists and psychologists, which define, classify and measure this phenomenon. The paper reviews the most popular theories in the study of love: Robert J. Sternberg's triangular theory of love, John Alan Lee's classification of love styles, and A. Giddens' transformational concept. The importance of studying this subject is explained by respondents' subjective definition of the role of love as a correlate of interpersonal relationships. Love is considered a factor that acts as a motive for marriage and one of its components, ensuring its durability. The complexity of the scientific understanding of love lies in the absence of clear empirical referents for its fixation. The examined theories reaffirm their scientific hypotheses through the use of specific methods of measurement. Offered for review are: Z. Rubin's scales of love and sympathy, the Love Attitude Scale by Hendrick C. and Hendrick S., and the Munro-Adams scale of romantic relationships. These methodologies are widely used in modern scientific research and have undergone modification and adaptation depending on the cultural characteristics of the respondents. The phenomenon of love needs further scientific study with the aim of further categorization, requires careful selection of techniques, and should be included as a component in sociological surveys of interpersonal relationships.

  1. Decoherent histories and measurement of temporal correlation functions for Leggett-Garg inequalities

    Science.gov (United States)

    Halliwell, J. J.

    2016-11-01

    We consider two protocols for the measurement of the temporal correlation functions of a dichotomic variable Q appearing in Leggett-Garg-type inequalities. The protocols measure solely whether Q has the same or a different sign at the end of a given time interval, thereby measuring no more than is required for determination of the correlation function. They are inspired, in part, by a decoherent histories analysis of the two-time histories of Q , which yields a number of useful insights, although the protocols are ultimately expressed in macrorealistic form independent of quantum theory. The first type involves an ancilla coupled to the system with two sequential controlled-not (cnot) gates, and the two-time histories of the system (whose probabilities yield the correlation function) are determined in a single final time measurement of the ancilla. It is noninvasive for special choices of initial system states and partially invasive for more general choices. Modified Leggett-Garg-type inequalities which accommodate the partial invasiveness are discussed. The quantum picture of the protocol shows that for certain choices of the primary system initial state, the final state is unaffected by the two cnot gate interactions, hence the protocol is undetectable with respect to final system-state measurements, although it is still invasive at intermediate times. This invasiveness can be reduced with different choices of ancilla states and the protocol is then similar in flavor to a weak measurement. The second type of protocol is based on the fact that the behavior of Q over a time interval can be determined from knowledge of the dynamics together with a measurement of certain initial (or final) data. Its quantum version corresponds to the known fact that when sets of histories are decoherent, their probabilities may be expressed in terms of a record projector, hence the two-time histories in which Q has the same or a different sign can be determined by a single projective
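
    For orientation, the two-time correlator of a dichotomic variable that such protocols target is easy to compute in a minimal quantum model: a precessing spin-1/2 with Q = sigma_z and a maximally mixed initial state. This is an illustrative model, not the paper's ancilla protocol itself.

```python
import numpy as np
from scipy.linalg import expm

# Two-time correlator C(t1,t2) = Re Tr[rho Q(t1) Q(t2)] for Q = sigma_z,
# H = (omega/2) sigma_x, rho maximally mixed.  For this model the standard
# result is C(t1,t2) = cos(omega*(t2 - t1)).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
omega = 1.0
rho = np.eye(2) / 2

def Q_heis(t):
    """Q in the Heisenberg picture at time t."""
    U = expm(-1j * (omega / 2) * sx * t)
    return U.conj().T @ sz @ U

def corr(t1, t2):
    return np.real(np.trace(rho @ Q_heis(t1) @ Q_heis(t2)))

t1, t2, t3 = 0.0, np.pi / 3, 2 * np.pi / 3
K = corr(t1, t2) + corr(t2, t3) - corr(t1, t3)   # Leggett-Garg combination
print(f"C12={corr(t1, t2):.3f}  C23={corr(t2, t3):.3f}  C13={corr(t1, t3):.3f}  K={K:.3f}")
# K = 1.5 > 1: violates the macrorealist bound K <= 1 at these times.
```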

  2. Multi-species time-history measurements during high-temperature acetone and 2-butanone pyrolysis

    KAUST Repository

    Lam, Kingyiu

    2013-01-01

    High-temperature acetone and 2-butanone pyrolysis studies were conducted behind reflected shock waves using five species time-history measurements (ketone, CO, CH3, CH4 and C2H4). Experimental conditions covered temperatures of 1100-1600 K at 1.6 atm, for mixtures of 0.25-1.5% ketone in argon. During acetone pyrolysis, the CO concentration time-history was found to be strongly sensitive to the acetone dissociation rate constant κ1 (CH3COCH3 → CH3 + CH3CO), and this could be directly determined from the CO time-histories, yielding κ1(1.6 atm) = 2.46 × 10^14 exp(-69.3 [kcal/mol]/RT) s^-1 with an uncertainty of ±25%. This rate constant is in good agreement with previous shock tube studies from Sato and Hidaka (2000) [3] and Saxena et al. (2009) [4] (within 30%) at temperatures above 1450 K, but is at least three times faster than the evaluation from Sato and Hidaka at temperatures below 1250 K. Using this revised κ1 value with the recent mechanism of Pichon et al. (2009) [5], the simulated profiles during acetone pyrolysis show excellent agreement with all five species time-history measurements. Similarly, the overall 2-butanone decomposition rate constant κtot was inferred from measured 2-butanone time-histories, yielding κtot(1.5 atm) = 6.08 × 10^13 exp(-63.1 [kcal/mol]/RT) s^-1 with an uncertainty of ±35%. This rate constant is approximately 30% faster than that proposed by Serinyel et al. (2010) [11] at 1119 K, and approximately 100% faster at 1412 K. Using the measured 2-butanone and CO time-histories and an O-atom balance analysis, a missing removal pathway for methyl ketene was identified. The rate constant for the decomposition of methyl ketene was assumed to be the same as the value for the ketene decomposition reaction. Using the revised κtot value and adding the methyl ketene decomposition reaction to the Serinyel et al. mechanism, the simulated profiles during 2-butanone pyrolysis show good agreement with the measurements for all five species.
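
    The reported rate expressions are ordinary Arrhenius fits and can be evaluated directly; a quick check of κ1 across the studied temperature range, with units as given in the abstract (A in s^-1, Ea in kcal/mol).

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol K)

def k1(T):
    """kappa_1(1.6 atm) = 2.46e14 * exp(-69.3 [kcal/mol] / RT), in s^-1."""
    return 2.46e14 * math.exp(-69.3 / (R * T))

# Evaluate at the ends and middle of the studied 1100-1600 K range.
for T in (1100.0, 1350.0, 1600.0):
    print(f"T = {T:.0f} K: k1 = {k1(T):.3e} 1/s")
```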

  3. History and challenges of national examination as a quality measurement for high school students in Indonesia

    Science.gov (United States)

    Rozamuri, Arif Murti; Suradi, Nur Riza Mohd

    2015-02-01

    Education in Indonesia was established before the Indonesian state; its history is therefore quite long. Education has existed since ancient times, continuing through the era of Hindu and Buddhist religious influence, then the era of Islamic religious influence, and education in the colonial era up to education in the independence era. In Indonesia, the quality of senior high school students is measured by a national examination. The national examination has a long history, full of pros and cons, in determining the quality of students. With differing socio-economic status and teacher quality across schools, student quality is assessed within a period of three years. The interesting question is whether the national examination is able to measure the quality of students. Can the quality of students be measured only through the national exam? Various instances of fraud have also tainted education, particularly in the implementation of the national examination. This research explains the long history of the national examination and the various problems that occur in it.

  4. Defining the ultrasound longitudinal natural history of newly diagnosed pediatric small bowel Crohn disease treated with infliximab and infliximab-azathioprine combination therapy.

    Science.gov (United States)

    Dillman, Jonathan R; Dehkordy, Soudabeh Fazeli; Smith, Ethan A; DiPietro, Michael A; Sanchez, Ramon; DeMatos-Maillard, Vera; Adler, Jeremy; Zhang, Bin; Trout, Andrew T

    2017-07-01

    Little is known about changes in the imaging appearances of the bowel and mesentery over time in either pediatric or adult patients with newly diagnosed small bowel Crohn disease treated with anti-tumor necrosis factor-alpha (anti-TNF-α) therapy. To define how bowel ultrasound findings change over time and correlate with laboratory inflammatory markers in children who have been newly diagnosed with pediatric small bowel Crohn disease and treated with infliximab. We included 28 pediatric patients treated with infliximab for newly diagnosed ileal Crohn disease who underwent bowel sonography prior to medical therapy and at approximately 2 weeks, 1 month, 3 months and 6 months after treatment initiation; these patients also had laboratory testing at baseline, 1 month and 6 months. We used linear mixed models to compare mean results between visits and evaluate whether ultrasound measurements changed over time. We used Spearman rank correlation to assess bivariate relationships. Mean subject age was 15.3±2.2 years; 11 subjects were girls (39%). We observed decreases in mean length of disease involvement (12.0±5.4 vs. 9.1±5.3 cm, P=0.02), maximum bowel wall thickness (5.6±1.8 vs. 4.7±1.7 mm, P=0.02), bowel wall color Doppler signal (1.7±0.9 vs. 1.2±0.8, P=0.002) and mesenteric color Doppler signal (1.1±0.9 vs. 0.6±0.6, P=0.005) at approximately 2 weeks following the initiation of infliximab compared to baseline. All laboratory inflammatory markers decreased at 1 month (P-values < …) … length of disease involvement (P=0.0002) and bowel wall color Doppler signal (P < …) … Crohn's disease activity index score. The ultrasound appearance of the bowel changes as early as 2 weeks after the initiation of infliximab therapy. There is strong correlation between bowel wall color Doppler signal and fecal calprotectin.

  5. Behavioral manifestations of audiometrically-defined "slight" or "hidden" hearing loss revealed by measures of binaural detection.

    Science.gov (United States)

    Bernstein, Leslie R; Trahiotis, Constantine

    2016-11-01

    This study assessed whether audiometrically-defined "slight" or "hidden" hearing losses might be associated with degradations in binaural processing as measured in binaural detection experiments employing interaurally delayed signals and maskers. Thirty-one listeners participated, all having no greater than slight hearing losses (i.e., no thresholds greater than 25 dB HL). Across the 31 listeners, and consistent with the findings of Bernstein and Trahiotis [(2015). J. Acoust. Soc. Am. 138, EL474-EL479], binaural detection thresholds at 500 Hz and 4 kHz increased with increasing magnitude of interaural delay, suggesting a loss of precision of coding with increasing magnitude of interaural delay. Binaural detection thresholds were consistently found to be elevated for listeners whose absolute thresholds at 4 kHz exceeded 7.5 dB HL. No such elevations were observed in conditions having no binaural cues available to aid detection (i.e., "monaural" conditions). Partitioning and analyses of the data revealed that those elevated thresholds (1) were more attributable to hearing level than to age and (2) resulted from increased levels of internal noise. The data suggest that listeners whose high-frequency monaural hearing status would be classified audiometrically as being normal or "slight loss" may exhibit substantial and perceptually meaningful losses of binaural processing.

  6. Defining excellence.

    Science.gov (United States)

    Mehl, B

    1993-05-01

    Excellence in the pharmacy profession, particularly pharmacy management, is defined. Several factors have a significant effect on the ability to reach a given level of excellence. The first is the economic and political climate in which pharmacists practice. Stricter controls, reduced resources, and the velocity of change all necessitate nurturing of values and a work ethic to maintain excellence. Excellence must be measured by the services provided with regard to the resources available; thus, the ability to achieve excellence is a true test of leadership and innovation. Excellence is also time dependent, and today's innovation becomes tomorrow's standard. Programs that raise the level of patient care, not those that aggrandize the profession, are the most important. In addition, basic services must be practiced at a level of excellence. Quality assessment is a way to improve care and bring medical treatment to a higher plane of excellence. For such assessment to be effective and not punitive, the philosophy of the program must be known, and the goal must be clear. Excellence in practice is dependent on factors such as political and social norms, standards of practice, available resources, perceptions, time, the motivation to progress to a higher level, and the continuous innovation required to reshape the profession to meet the needs of society.

  7. Defining and evaluating a novel outcome measure representing end-stage knee osteoarthritis: data from the Osteoarthritis Initiative.

    Science.gov (United States)

    Driban, Jeffrey B; Price, Lori Lyn; Lynch, John; Nevitt, Michael; Lo, Grace H; Eaton, Charles B; McAlindon, Timothy E

    2016-10-01

    We described a definition of end-stage knee osteoarthritis (esKOA) and evaluated its association with health outcomes and osteoarthritis risk factors. We included Osteoarthritis Initiative participants with or at risk for knee osteoarthritis who had complete baseline data. We defined esKOA by adapting a validated appropriateness algorithm for total knee replacement based on data from baseline and the first four follow-up visits. We performed person-based analyses, including both knees from all participants. Participants met the definition of esKOA at the visit at which ≥1 knee reached the esKOA criteria. We assessed differences in individual characteristics between groups at baseline and over time and tested if incident esKOA (outcome) was associated with osteoarthritis risk factors (e.g., age, maximum adult weight, and quadriceps strength). The cohort consisted of 3916 participants with mean age of 61 (SD = 9) years and mean body mass index of 28.4 (4.7) kg/m²; 59 % were female and 9.7 % developed incident esKOA. Those with incident esKOA had poorer health outcomes at baseline and greater declines in health outcomes, with the exception of SF-12 mental health score. Five out of nine tested risk factors were associated with incident esKOA in unadjusted analyses, with older age (≥65 years; odds ratio = 1.44, 95 % confidence interval = 1.19 to 1.83) and quadriceps weakness (odds ratio = 0.78, 95 % confidence interval = 0.71 to 0.86) remaining significant in adjusted models. Older age and quadriceps weakness predicted esKOA. esKOA is also characterized by poor health-related outcomes. This definition of esKOA could be a new clinically relevant outcome measure for osteoarthritis research.

  8. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    Science.gov (United States)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide the possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how strongly we believe each rule is constructed. From this, the degree to which a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
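
    As a rough illustration of the certain/possible split described above, the sketch below (Python, with a hypothetical toy decision table; not the authors' implementation) computes the rough-set lower and upper approximations from which certain and possible rules are extracted.

        # Rough-set lower/upper approximations: equivalence classes under the
        # condition attributes that fall wholly inside the target concept
        # support "certain" rules; classes that merely overlap it support
        # only "possible" rules. Toy data; attribute names are hypothetical.
        from collections import defaultdict

        def approximations(objects, condition, decision, target):
            classes = defaultdict(set)
            for x in objects:
                classes[condition(x)].add(x)
            target_set = {x for x in objects if decision(x) == target}
            lower, upper = set(), set()
            for eq_class in classes.values():
                if eq_class <= target_set:   # wholly inside: certain rule
                    lower |= eq_class
                if eq_class & target_set:    # overlaps: possible rule
                    upper |= eq_class
            return lower, upper

        rows = [("high", "yes", "flu"), ("high", "no", "cold"),
                ("low", "no", "cold"), ("high", "no", "flu")]
        lower, upper = approximations(rows, lambda r: r[:2], lambda r: r[2], "flu")
        print(lower)          # supports a certain 'flu' rule
        print(upper - lower)  # boundary region: possible rules only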

  9. Defining Darwinism.

    Science.gov (United States)

    Hull, David L

    2011-03-01

    Evolutionary theory seems to lend itself to all sorts of misunderstanding. In this paper I strive to decrease such confusions, for example, between Darwinism and Darwinians, propositions and people, organisms and individuals, species as individuals versus species as classes, homologies and homoplasies, and finally essences versus histories.

  10. Stability of clinical outcome measures in rheumatoid arthritis patients with stable disease defined on the basis of the EULAR response criteria

    DEFF Research Database (Denmark)

    Madsen, Ole Rintek

    2016-01-01

    Natural variation, also known as measurement error, is assessed in individuals in "steady state." The study aimed to examine inter-visit variations in clinical outcome measures in rheumatoid arthritis (RA) patients with stable disease defined on the basis of the EULAR response criteria. Two hundred...

  11. An imaging flow cytometric method for measuring cell division history and molecular symmetry during mitosis.

    Science.gov (United States)

    Filby, Andrew; Perucha, Esperanza; Summers, Huw; Rees, Paul; Chana, Prabhjoat; Heck, Susanne; Lord, Graham M; Davies, Derek

    2011-07-01

    Asymmetric cell division is an important mechanism for generating cellular diversity; however, techniques for measuring the distribution of fate-regulating molecules during mitosis have been hampered by a lack of objectivity, quantitation, and statistical robustness. Here we describe a novel imaging flow cytometric approach that is able to report a cell's proliferative history and cell cycle position using dye dilution, pH3, and PI staining, and then to measure the spatial distribution of fluorescent signals during mitosis using CCD-derived imagery. Using Jurkat cells, resolution of the fluorescently labeled populations was comparable to traditional PMT based cytometers thus eliminating the need to sort cells with specific division histories for microscopy. Subdividing mitotic stages by morphology allowed us to determine the time spent in each cell cycle phase using mathematical modeling approaches. Furthermore high sample throughput allowed us to collect statistically relevant numbers of cells without the need to use blocking agents that artificially enrich for mitotic events. The fluorescent imagery was used to measure PKCζ protein and EEA-1+ endosome distribution during different mitotic phases in Jurkat cells. While telophase cells represented the favorable population for measuring asymmetry, asynchronously dividing cells spent approximately 43 seconds in this stage, explaining why they were present at such low frequencies. This necessitated the acquisition of large cell numbers. Interestingly we found that PKCζ was inherited asymmetrically in 2.5% of all telophasic events whereas endosome inheritance was significantly more symmetrical. Furthermore, molecular polarity at early mitotic phases was a poor indicator of asymmetry during telophase, highlighting that, though rare, telophasic events represented the best candidates for asymmetry studies. In summary, this technique combines the spatial information afforded by fluorescence microscopy with the statistical

  12. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

    "Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally...... organized, agile projects. Based on the proposed definition popular existing definitions are discussed....

  13. Triangles in ROC space: History and theory of "nonparametric" measures of sensitivity and response bias.

    Science.gov (United States)

    Macmillan, N A; Creelman, C D

    1996-06-01

    Can accuracy and response bias in two-stimulus, two-response recognition or detection experiments be measured nonparametrically? Pollack and Norman (1964) answered this question affirmatively for sensitivity, Hodos (1970) for bias: Both proposed measures based on triangular areas in receiver-operating characteristic space. Their papers, and especially a paper by Grier (1971) that provided computing formulas for the measures, continue to be heavily cited in a wide range of content areas. In our sample of articles, most authors described triangle-based measures as making fewer assumptions than measures associated with detection theory. However, we show that statistics based on products or ratios of right triangle areas, including a recently proposed bias index and a not-yet-proposed but apparently plausible sensitivity index, are consistent with a decision process based on logistic distributions. Even the Pollack and Norman measure, which is based on non-right triangles, is approximately logistic for low values of sensitivity. Simple geometric models for sensitivity and bias are not nonparametric, even if their implications are not acknowledged in the defining publications.
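
    For reference, the triangle-based indices discussed above have simple closed forms. The sketch below implements the standard computing formulas popularized by Grier (1971), assuming hit rate H is at least the false-alarm rate F.

        # A' (Pollack & Norman's area-based sensitivity index) and
        # B'' (Grier's bias index), valid for H >= F.
        def a_prime(H, F):
            return 0.5 + ((H - F) * (1 + H - F)) / (4 * H * (1 - F))

        def b_double_prime(H, F):
            return (H * (1 - H) - F * (1 - F)) / (H * (1 - H) + F * (1 - F))

        print(a_prime(0.8, 0.2))         # 0.875
        print(b_double_prime(0.8, 0.2))  # 0.0: symmetric rates imply no bias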

  14. Serrated polyposis associated with a family history of colorectal cancer and/or polyps: The preferential location of polyps in the colon and rectum defines two molecular entities.

    Science.gov (United States)

    Silva, Patrícia; Albuquerque, Cristina; Lage, Pedro; Fontes, Vanessa; Fonseca, Ricardo; Vitoriano, Inês; Filipe, Bruno; Rodrigues, Paula; Moita, Susana; Ferreira, Sara; Sousa, Rita; Claro, Isabel; Nobre Leitão, Carlos; Chaves, Paula; Dias Pereira, António

    2016-09-01

    Serrated polyposis (SPP) is characterized by the development of multiple serrated polyps and an increased predisposition to colorectal cancer (CRC). In the present study, we aimed to characterize, at a clinical and molecular level, a cohort of SPP patients with or without a family history of SPP and/or polyps/CRC (SPP-FHP/CRC). Sixty-two lesions from 12 patients with SPP-FHP/CRC and 6 patients with sporadic SPP were included. The patients with SPP-FHP/CRC presented with an older mean age at diagnosis (p=0.027) and a more heterogeneous histological pattern of lesions (p=0.032) than the patients with sporadic SPP. We identified two molecular forms of SPP-FHP/CRC, according to the preferential location of the lesions: proximal/whole-colon or distal colon. Mismatch repair (MMR) gene methylation [mutS homolog 6 (MSH6)/mutS homolog 3 (MSH3)] or loss of heterozygosity (LOH) of D2S123 (flanking MSH6) were detected exclusively in the former (p=3.0×10^-7), in most early lesions. Proximal/whole-colon SPP-FHP/CRC presented a higher frequency of O-6-methylguanine-DNA methyltransferase (MGMT) methylation/LOH, microsatellite instability (MSI) and Wnt mutations (19/29 vs. 7/17; 16/23 vs. 1/14, p=2.2×10^-4; 15/26 vs. 2/15, p=0.006; 14/26 vs. 4/20, p=0.02) but a lower frequency of B-raf proto-oncogene, serine/threonine kinase (BRAF) mutations (7/30 vs. 12/20, p=0.0089) than the distal form. CRC was more frequent in cases of Kirsten rat sarcoma viral oncogene homolog (KRAS)-associated proximal/whole-colon SPP-FHP/CRC than in the remaining cases (4/4 vs. 1/8, p=0.01). Thus, SPP-FHP/CRC appears to be a specific entity, presenting two forms, proximal/whole-colon and distal, which differ in the underlying tumor initiation pathways. Early MGMT and MMR gene deficiency in the former may underlie an inherited susceptibility to genotoxic stress.

  15. Accuracy improvement of T-history method for measuring heat of fusion of various materials

    Energy Technology Data Exchange (ETDEWEB)

    Hiki Hong [KyungHee University (Korea). School of Mechanical and Industrial Systems Engineering; Sun Kuk Kim [KyungHee University (Korea). School of Architecture and Civil Engineering; Yong-Shik Kim [University of Incheon (Korea). Dept. of Architectural Engineering

    2004-06-01

    The T-history method, developed for measuring the heat of fusion of phase change materials (PCMs) in sealed tubes, has the advantages of a simple experimental device and convenience, with no sampling process. However, some improper assumptions in the original method, such as using the degree of supercooling as the end of the latent heat period and neglecting sensible heat during phase change, can cause significant errors in determining the heat of fusion. We have addressed this problem in order to obtain better predictions. The present study shows that the modified T-history method is successfully applied to a variety of PCMs such as paraffin and lauric acid having no or a low degree of supercooling. It also turned out that the periods selected for sensible and latent heat do not significantly affect the accuracy of the heat of fusion. As a result, the method can provide an appropriate means to assess a newly developed PCM by a cycle test even if a very accurate value cannot be obtained. (author)
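
    The energy balance underlying a T-history analysis can be sketched as follows: under a lumped-capacitance assumption, (mc) dT/dt = -hA(T - T_amb), so the water reference calibrates the convective loss term hA, and the latent heat follows from the area under the PCM's excess-temperature curve minus the sensible contribution. This is a simplified illustration with hypothetical variable names, not the authors' exact procedure.

        # T-history energy balance sketch: hA from the water reference curve,
        # then heat of fusion from the PCM curve over the phase-change window.
        import numpy as np

        def calibrate_hA(t, T_ref, T_amb, m_w, c_w, m_t, c_t):
            # (m_w c_w + m_t c_t)(T0 - T1) = hA * integral of (T - T_amb) dt
            area = np.trapz(T_ref - T_amb, t)
            return (m_w * c_w + m_t * c_t) * (T_ref[0] - T_ref[-1]) / area

        def heat_of_fusion(t, T_pcm, T_amb, hA, m_p, c_p, m_t, c_t):
            # total heat lost = sensible (PCM + tube) + latent
            area = np.trapz(T_pcm - T_amb, t)
            sensible = (m_p * c_p + m_t * c_t) * (T_pcm[0] - T_pcm[-1])
            return (hA * area - sensible) / m_p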

  16. Friedreich Ataxia Clinical Outcome Measures: Natural History Evaluation in 410 Participants

    Science.gov (United States)

    Regner, Sean R.; Wilcox, Nicholas; Friedman, Lisa S.; Seyer, Lauren; Schadt, Kim; Brigatti, Karlla W.; Perlman, Susan; Delatycki, Martin; Wilmot, George R.; Gomez, Christopher M.; Bushara, Khalaf O.; Mathews, Katherine D.; Subramony, S.H.; Ashizawa, Tetsuo; Ravina, Bernard; Brocht, Alicia; Farmer, Jennifer M.; Lynch, David R.

    2013-01-01

    Friedreich ataxia is an autosomal recessive neurodegenerative disorder characterized by ataxia, dysarthria, and areflexia. We report the progress of a large international non-interventional cohort (n = 410), tracking the natural history of disease progression using the neurological exam-based Friedreich Ataxia Rating Scale. We analyzed the rate of progression with cross-sectional analysis and longitudinal analysis over a 2-year period. The Friedreich Ataxia Rating Scale captured disease progression when used at 1 and 2 years following initial evaluation, with a lower ratio of standard deviation of change to mean change over 2 years of evaluation. However, modeling of disease progression identified substantial ceiling effects in the Friedreich Ataxia Rating Scale, suggesting this measure is most useful in patients before maximal deficit is approached. PMID:22752494

  17. On the Incorporation of Metallicity Data into Star Formation History Measurements from Resolved Stellar Populations

    CERN Document Server

    Dolphin, Andrew E

    2016-01-01

    The combination of spectroscopic stellar metallicities and resolved star color-magnitude diagrams (CMDs) has the potential to constrain the entire star formation and chemical enrichment history (SFH) of a galaxy better than fitting CMDs alone (as is most common in SFH studies using resolved stellar populations). In this paper, two approaches for incorporating external metallicity information into color-magnitude diagram fitting techniques are presented. Overall, the joint fitting of metallicity and CMD information can increase the precision on measured age-metallicity relationships and star formation rates by ~10% over CMD fitting alone. However, systematics in stellar isochrones and mismatches between spectroscopic and photometric metallicity determinations can reduce the accuracy of the recovered SFHs. I present a simple mitigation that can reduce the amplitude of these systematics to the level obtained from CMD fitting alone, while ensuring the age-metallicity relationship is consisten...
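
    One way to picture the joint fit is as a sum of log-likelihood terms sharing the same SFH parameters; the sketch below is schematic (the Poisson CMD term is standard in SFH fitting, while the Gaussian metallicity term and the weight knob are illustrative assumptions, not the paper's implementation).

        import numpy as np

        def ln_like_cmd(n_obs, n_model):
            # Poisson likelihood over CMD bins; n_model comes from the trial SFH
            n_model = np.clip(n_model, 1e-30, None)
            return np.sum(n_obs * np.log(n_model) - n_model)

        def ln_like_feh(feh_obs, sigma_obs, feh_model):
            # Gaussian term comparing spectroscopic [Fe/H] with the trial
            # age-metallicity relation, star by star
            return -0.5 * np.sum(((feh_obs - feh_model) / sigma_obs) ** 2)

        def ln_like_joint(n_obs, n_model, feh_obs, sigma_obs, feh_model, w=1.0):
            return ln_like_cmd(n_obs, n_model) + w * ln_like_feh(feh_obs, sigma_obs, feh_model)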

  18. How Not to Evaluate a Psychological Measure: Rebuttal to Criticism of the Defining Issues Test of Moral Judgment Development by Curzer and Colleagues

    Science.gov (United States)

    Thoma, Stephen J.; Bebeau, Muriel J.; Narvaez, Darcia

    2016-01-01

    In a 2014 paper in "Theory and Research in Education," Howard Curzer and colleagues critique the Defining Issues Test of moral judgment development according to eight criteria that are described as difficulties any measure of educational outcomes must address. This article highlights how Curzer et al. do not consult existing empirical…

  19. Unstable work histories and fertility in France: An adaptation of sequence complexity measures to employment trajectories

    Directory of Open Access Journals (Sweden)

    Daniel Ciganda

    2015-04-01

    Full Text Available Background: The emergence of new evidence suggesting a sign shift in the long-standing negative correlation between prosperity and fertility levels has sparked a renewed interest in understanding the relationship between economic conditions and fertility decisions. In this context, the notion of uncertainty has gained relevance in analyses of low fertility. So far, most studies have approached this notion using snapshot indicators such as type of contract or employment situation. However, these types of measures seem to be falling short in capturing what is intrinsically a dynamic process. Objective: Our first objective is to analyze to what extent employment trajectories have become less stable over time, and the second, to determine whether or not employment instability has an impact on the timing and quantum of fertility in France. Additionally, we present a new indicator of employment instability that takes into account both the frequency and duration of unemployment, with the objective of comparing its performance against other, more commonly used indicators of economic uncertainty. Methods: Our study combines exploratory (Sequence Analysis) with confirmatory (Event History, Logistic Regression) methods to understand the relationship between early life-course uncertainty and the timing and intensity of fertility. We use employment histories from the three available waves of the Etude des relations familiales et intergenerationnelles (ERFI), a panel survey carried out by INED and INSEE which constitutes the base of the Generations and Gender Survey (GGS) in France. Results: Although France is characterized by strong family policies and high and stable fertility levels, we find that employment instability not only has a strong and persistent negative effect on the final number of children for both men and women, but also contributes to fertility postponement in the case of men. Regarding the timing of the transition to motherhood, we show how
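
    For concreteness, a standard composite complexity index from the sequence-analysis literature takes the geometric mean of the normalized transition count and the normalized state entropy; the paper's own indicator additionally weights the frequency and duration of unemployment spells, so the sketch below is an illustration rather than their exact measure.

        import math
        from collections import Counter

        def complexity(seq, alphabet_size):
            # seq: list of monthly states, e.g. ['EMP', 'EMP', 'UNE', ...]
            n = len(seq)
            transitions = sum(1 for a, b in zip(seq, seq[1:]) if a != b)
            trans_norm = transitions / (n - 1) if n > 1 else 0.0
            entropy = -sum((c / n) * math.log(c / n) for c in Counter(seq).values())
            ent_norm = entropy / math.log(alphabet_size) if alphabet_size > 1 else 0.0
            return math.sqrt(trans_norm * ent_norm)

        print(complexity(['EMP'] * 12, 3))        # 0.0: a fully stable year
        print(complexity(['EMP', 'UNE'] * 6, 3))  # ~0.79: constant churn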

  20. Measuring the orbital history of the ultra-faint dwarf galaxy Hercules with GSAOI

    Science.gov (United States)

    Do, Tuan; Lu, Jessica; Simon, Josh; Peter, Annika; Boylan-Kolchin, Mike

    2014-02-01

    The Milky Way ultra-faint dwarf galaxies are the most dark matter dominated systems known to date. Their low masses, low luminosities, and extremely low metallicities offer a glimpse into galaxy formation at the earliest epochs, while their high inferred dark matter densities and proximity make them ideal candidates for indirect dark matter detection experiments. However, significant uncertainties remain in key observables such as the mass and infall history of these extreme objects. For example, without knowledge of their orbits, it is difficult to determine whether their masses are overestimated because their radial velocity dispersions have been inflated by past tidal encounters with the Milky Way. We propose to measure the proper motion of the Hercules ultra-faint dwarf galaxy using GSAOI to understand its orbital history. Hercules is a particularly intriguing target because structural and kinematic studies have motivated claims that it is tidally disrupting despite its relatively large present distance from the Milky Way and apparently high mass-to-light ratio. It also has a very old stellar population, making it a prime candidate for a "fossil galaxy" whose star formation was shut off by reionization. With observations in 2014A, 2015A, and 2017A in conjunction with HST data taken in 2011, we will be able to achieve 35 km/s proper motion precision, making it possible to reconstruct the orbit of Hercules. Knowledge of its perigalacticon will allow us to quantify the tidal effects of the Milky Way on its internal stellar dynamics, while its eccentricities and orbital energies will constrain the initial infall time into the Milky Way dark matter halo. This proposal aims to extend the technique, developed for HST data, of using background galaxies as absolute reference sources to ground-based MCAO observations, which will be important for astrometry work after HST and for future extremely large telescopes.

  1. Developing a patient-centered outcome measure for complementary and alternative medicine therapies I: defining content and format

    Directory of Open Access Journals (Sweden)

    Ritenbaugh Cheryl

    2011-12-01

    Full Text Available Abstract Background Patients receiving complementary and alternative medicine (CAM therapies often report shifts in well-being that go beyond resolution of the original presenting symptoms. We undertook a research program to develop and evaluate a patient-centered outcome measure to assess the multidimensional impacts of CAM therapies, utilizing a novel mixed methods approach that relied upon techniques from the fields of anthropology and psychometrics. This tool would have broad applicability, both for CAM practitioners to measure shifts in patients' states following treatments, and conventional clinical trial researchers needing validated outcome measures. The US Food and Drug Administration has highlighted the importance of valid and reliable measurement of patient-reported outcomes in the evaluation of conventional medical products. Here we describe Phase I of our research program, the iterative process of content identification, item development and refinement, and response format selection. Cognitive interviews and psychometric evaluation are reported separately. Methods From a database of patient interviews (n = 177 from six diverse CAM studies, 150 interviews were identified for secondary analysis in which individuals spontaneously discussed unexpected changes associated with CAM. Using ATLAS.ti, we identified common themes and language to inform questionnaire item content and wording. Respondents' language was often richly textured, but item development required a stripping down of language to extract essential meaning and minimize potential comprehension barriers across populations. Through an evocative card sort interview process, we identified those items most widely applicable and covering standard psychometric domains. We developed, pilot-tested, and refined the format, yielding a questionnaire for cognitive interviews and psychometric evaluation. Results The resulting questionnaire contained 18 items, in visual analog scale format

  2. Developing a patient-centered outcome measure for complementary and alternative medicine therapies I: defining content and format

    Science.gov (United States)

    2011-01-01

    Background Patients receiving complementary and alternative medicine (CAM) therapies often report shifts in well-being that go beyond resolution of the original presenting symptoms. We undertook a research program to develop and evaluate a patient-centered outcome measure to assess the multidimensional impacts of CAM therapies, utilizing a novel mixed methods approach that relied upon techniques from the fields of anthropology and psychometrics. This tool would have broad applicability, both for CAM practitioners to measure shifts in patients' states following treatments, and conventional clinical trial researchers needing validated outcome measures. The US Food and Drug Administration has highlighted the importance of valid and reliable measurement of patient-reported outcomes in the evaluation of conventional medical products. Here we describe Phase I of our research program, the iterative process of content identification, item development and refinement, and response format selection. Cognitive interviews and psychometric evaluation are reported separately. Methods From a database of patient interviews (n = 177) from six diverse CAM studies, 150 interviews were identified for secondary analysis in which individuals spontaneously discussed unexpected changes associated with CAM. Using ATLAS.ti, we identified common themes and language to inform questionnaire item content and wording. Respondents' language was often richly textured, but item development required a stripping down of language to extract essential meaning and minimize potential comprehension barriers across populations. Through an evocative card sort interview process, we identified those items most widely applicable and covering standard psychometric domains. We developed, pilot-tested, and refined the format, yielding a questionnaire for cognitive interviews and psychometric evaluation. Results The resulting questionnaire contained 18 items, in visual analog scale format, in which each line was

  3. Cultural Distance: How is it defined, how is it measured, and what is its relevance to international marketers?

    Institute of Scientific and Technical Information of China (English)

    杨柳

    2015-01-01

    This essay analyses the meaning of culture and in particular aims at reviewing different tools to measure differences between cultures—the so-called cultural distance. Two major tools are considered in detail: Hall’s High vs. Low context culture (1977) and Hofstede’s Five Cultural Dimensions (1991). The conclusion of this essay draws on the weaknesses of existing systems and suggests the introduction of a ‘cultural distance segmentation’ that would change global companies’ tendency of uniformity in their messages to a more adaptive message amongst different cultures.

  4. Definably amenable NIP groups

    OpenAIRE

    Chernikov, Artem; Simon, Pierre

    2015-01-01

    We study definably amenable NIP groups. We develop a theory of generics, showing that various definitions considered previously coincide, and study invariant measures. Applications include: characterization of regular ergodic measures, a proof of the conjecture of Petrykowski connecting existence of bounded orbits with definable amenability in the NIP case, and the Ellis group conjecture of Newelski and Pillay connecting the model-theoretic connected component of an NIP group with the ideal s...

  5. Understanding service user-defined continuity of care and its relationship to health and social measures: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Sweeney Angela

    2012-06-01

    Full Text Available Abstract Background Despite the importance of continuity of care [COC] in contemporary mental health service provision, COC lacks a clearly agreed definition. Furthermore, whilst there is broad agreement that definitions should include service users’ experiences, little is known about this. This paper aims to explore a new construct of service user-defined COC and its relationship to a range of health and social outcomes. Methods In a cross sectional study design, 167 people who experience psychosis participated in structured interviews, including a service user-generated COC measure (CONTINU-UM) and health and social assessments. Constructs underlying CONTINU-UM were explored using factor analysis in order to understand service user-defined COC. The relationships between the total/factor CONTINU-UM scores and the health and social measures were then explored through linear regression and an examination of quartile results in order to assess whether service user-defined COC is related to outcome. Results Service user-defined COC is underpinned by three sub-constructs: preconditions, staff-related continuity and care contacts, although internal consistency of some sub-scales was low. High COC as assessed via CONTINU-UM, including preconditions and staff-related COC, was related to having needs met and better therapeutic alliances. Preconditions for COC were additionally related to symptoms and quality of life. COC was unrelated to empowerment and care contacts unrelated to outcomes. Service users who had experienced a hospital admission experienced higher levels of COC. A minority of service users with the poorest continuity of care also had high BPRS scores and poor quality of life. Conclusions Service-user defined continuity of care is a measurable construct underpinned by three sub-constructs (preconditions, staff-related and care contacts). COC and its sub-constructs demonstrate a range of relationships with health and social measures

  6. Understanding service user-defined continuity of care and its relationship to health and social measures: a cross-sectional study.

    Science.gov (United States)

    Sweeney, Angela; Rose, Diana; Clement, Sarah; Jichi, Fatima; Jones, Ian Rees; Burns, Tom; Catty, Jocelyn; Mclaren, Susan; Wykes, Til

    2012-06-08

    Despite the importance of continuity of care [COC] in contemporary mental health service provision, COC lacks a clearly agreed definition. Furthermore, whilst there is broad agreement that definitions should include service users' experiences, little is known about this. This paper aims to explore a new construct of service user-defined COC and its relationship to a range of health and social outcomes. In a cross sectional study design, 167 people who experience psychosis participated in structured interviews, including a service user-generated COC measure (CONTINU-UM) and health and social assessments. Constructs underlying CONTINU-UM were explored using factor analysis in order to understand service user-defined COC. The relationships between the total/factor CONTINU-UM scores and the health and social measures were then explored through linear regression and an examination of quartile results in order to assess whether service user-defined COC is related to outcome. Service user-defined COC is underpinned by three sub-constructs: preconditions, staff-related continuity and care contacts, although internal consistency of some sub-scales was low. High COC as assessed via CONTINU-UM, including preconditions and staff-related COC, was related to having needs met and better therapeutic alliances. Preconditions for COC were additionally related to symptoms and quality of life. COC was unrelated to empowerment and care contacts unrelated to outcomes. Service users who had experienced a hospital admission experienced higher levels of COC. A minority of service users with the poorest continuity of care also had high BPRS scores and poor quality of life. Service-user defined continuity of care is a measurable construct underpinned by three sub-constructs (preconditions, staff-related and care contacts). COC and its sub-constructs demonstrate a range of relationships with health and social measures. Clinicians have an important role to play in supporting service users to

  7. Defining "intermittent UVR exposure"

    DEFF Research Database (Denmark)

    Bodekær, Mette; Philipsen, Peter Alshede; Petersen, Bibi Øager;

    2016-01-01

    to define and quantify “intermittent UVR exposure” by an objective measure. Methods: A broad study population of adults and children had data collected during a summer period. Data were personal UVR dosimetry measurements, from which the number of “intermittent days” was derived, and sun behaviour diaries... (p < 0.001). The corresponding numbers for prediction of nevi and lentigo density by retrospective questionnaire data were lower (R2 = 0.11, R2 = 0.26, p < …). … a defined objective measure of intermittent UVR exposure. This measure may provide a better prediction of solar skin damage and CMM

  8. Measurability of matter: history of ozone measurements; La mesurabilite de la matiere: histoire de la mesure de l'ozone

    Energy Technology Data Exchange (ETDEWEB)

    Callens, S. [Clerse Ifresi Fu 3 CNRS, 59 - Lille (France)

    1998-03-01

    BACHELARD wrote: 'when the genuine nature of the ozone molecule is known, it becomes clear that sound ideas are made despite history'. The history of ozone is punctuated by abrupt breaks, more epistemic than historical, which point back to a general history of measurement. Bachelard also bases his argument on an underlying lower order in which simplicity lies. The history of ozone has repeatedly shown that this lower order keeps revealing surprises, and that the world is not a house kept tidy by the list of the simple chemical elements. It is a mixed, varied and short-lived world, with uncertain accounts of productive processes combining precursors and processes of natural and anthropic origin. (author)

  9. An approach to define potential radon emission level maps using indoor radon concentration measurements and radiogeochemical data positive proportion relationships.

    Science.gov (United States)

    Drolet, Jean-Philippe; Martel, Richard; Poulin, Patrick; Dessau, Jean-Claude; Lavoie, Denis; Parent, Michel; Lévesque, Benoît

    2013-10-01

    The aim of this paper is to present the first step of a new approach to make a map of radon-prone areas showing different potential radon emission levels in the Quebec province. This map is a tool intended to assist the Quebec government in identifying populations with a higher risk of indoor radon gas exposure. This map of radon-prone areas used available radiogeochemical information for the province of Quebec: (1) Equivalent uranium (eU) concentration from airborne surface gamma-ray surveys; (2) uranium concentration measurements in sediments; and (3) bedrock and surficial geology. Positive proportion relationships (PPR) between each individual criterion and the 1417 available basement radon concentrations were demonstrated. It was also shown that those criteria were reliable indicators of radon-prone areas. The three criteria were discretized into 3, 2 and 2 statistically significant different classes respectively. For each class, statistical heterogeneity was validated by Kruskal-Wallis one way analyses of variance on ranks. Maps of radon-prone areas were traced down for each criterion. Based on this statistical study and on the maps of radon-prone areas in Quebec, 18% of the dwellings located in areas with an equivalent uranium (eU) concentration from airborne surface gamma-ray surveys under 0.75 ppm showed indoor radon concentrations above 150 Bq/m3. This percentage increases to 33% when eU concentrations are between 0.75 ppm and 1.25 ppm and exceeds 40% when eU concentrations are above 1.25 ppm. A uranium concentration in sediments above 20 ppm showed an indoor radon concentration geometric mean of 215 Bq/m3 with more than 69% of the dwellings exceeding 150 Bq/m3 or more than 50% of dwellings exceeding the Canadian radon guideline of 200 Bq/m3. It is also shown that the radon emission potential is higher where a uranium-rich bedrock unit is not covered by a low permeability (silt/clay) surficial deposit.
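
    The class-validation step described above can be sketched as follows (the bin edges and threshold are taken from the abstract for illustration; this is not the authors' code): discretize the eU criterion into classes, test class heterogeneity with a Kruskal-Wallis ANOVA on ranks, and report the fraction of dwellings exceeding the radon threshold in each class.

        import numpy as np
        from scipy.stats import kruskal

        def validate_classes(eU, radon, threshold=150.0):
            eU, radon = np.asarray(eU), np.asarray(radon)
            labels = np.digitize(eU, bins=[0.75, 1.25])  # <0.75, 0.75-1.25, >1.25 ppm
            groups = [radon[labels == k] for k in np.unique(labels)]
            H, p = kruskal(*groups)                      # heterogeneity across classes
            exceedance = {int(k): float(np.mean(radon[labels == k] > threshold))
                          for k in np.unique(labels)}
            return H, p, exceedance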

  10. "Galaxy," Defined

    CERN Document Server

    Willman, Beth

    2012-01-01

    A growing number of low luminosity and low surface brightness astronomical objects challenge traditional notions of both galaxies and star clusters. To address this challenge, we propose a definition of galaxy that does not depend on a cold dark matter model of the universe: A galaxy is a gravitationally bound collection of stars whose properties cannot be explained by a combination of baryons and Newton's laws of gravity. We use this definition to critically examine the classification of ultra-faint dwarfs, globular clusters, ultra-compact dwarfs, and tidal dwarfs. While kinematic studies provide an effective diagnostic of the definition in many regimes, they can be less useful for compact or very faint systems. To explore the utility of using the [Fe/H] spread as a diagnostic, we use published spectroscopic [Fe/H] measurements of 16 Milky Way dwarfs and 24 globular clusters to uniformly calculate their [Fe/H] spreads and associated uncertainties. Our principal results are: (i) no known, old star cluster wit...

  11. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

    Computing displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g., in the determination of forces applied
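
    As a generic illustration of the filtering route that such studies evaluate (not the specific curve fitters compared in the paper; the cutoff frequency and filter order are arbitrary choices here), displacement can be reconstructed by high-pass filtering the acceleration and integrating twice:

        import numpy as np
        from scipy.signal import butter, filtfilt

        def displacement_from_acceleration(acc, fs, f_cut=0.5, order=4):
            b, a = butter(order, f_cut / (fs / 2), btype='highpass')
            acc_f = filtfilt(b, a, acc)          # zero-phase high-pass
            dt = 1.0 / fs
            vel = np.cumsum((acc_f[:-1] + acc_f[1:]) / 2) * dt  # trapezoidal
            vel = filtfilt(b, a, vel)            # suppress integration drift
            disp = np.cumsum((vel[:-1] + vel[1:]) / 2) * dt
            return disp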

  12. Defining and Measuring Chronic Conditions

    Centers for Disease Control (CDC) Podcasts

    2013-05-20

    This podcast is an interview with Dr. Anand Parekh, U.S. Department of Health and Human Services Deputy Assistant Secretary for Health, and Dr. Samuel Posner, Preventing Chronic Disease Editor in Chief, about the definition and burden of multiple chronic conditions in the United States.  Created: 5/20/2013 by Preventing Chronic Disease (PCD), National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP).   Date Released: 5/20/2013.

  13. Defining and Measuring Academic Success

    Science.gov (United States)

    York, Travis T.; Gibson, Charles; Rankin, Susan

    2015-01-01

    Despite, and perhaps because of its amorphous nature, the term "academic success" is one of the most widely used constructs in educational research and assessment within higher education. This paper conducts an analytic literature review to examine the use and operationalization of the term in multiple academic fields. Dominant…

  14. Defining and measuring transnational fields

    OpenAIRE

    Molina, J.; Petermann, S.; Herz, A.

    2012-01-01

    Transnational social fields and transnational social spaces are concepts used interchangeably in the transnational literature. Although both refer to the complex of connections spanning borders, each represents a different – and complementary – perspective. In this paper, it will be argued that in adopting the social networks approach, transnational studies actually inherited two different traditions for studying relational phenomena: the anthropological egocentric or persona...

  15. Quantum Histories

    CERN Document Server

    Kent, A

    1998-01-01

    There are good motivations for considering some type of quantum histories formalism. Several possible formalisms are known, defined by different definitions of event and by different selection criteria for sets of histories. These formalisms have a natural interpretation, according to which nature somehow chooses one set of histories from among those allowed, and then randomly chooses to realise one history from that set; other interpretations are possible, but their scientific implications are essentially the same. The selection criteria proposed to date are reasonably natural, and certainly raise new questions. For example, the validity of ordering inferences which we normally take for granted --- such as that a particle in one region is necessarily in a larger region containing it --- depends on whether or not our history respects the criterion of ordered consistency, or merely consistency. However, the known selection criteria, including consistency and medium decoherence, are very weak. It is not possibl...

  16. Educational Testing as an Accountability Measure: Drawing on Twentieth-Century Danish History of Education Experiences

    Science.gov (United States)

    Ydesen, Christian

    2013-01-01

    This article reveals perspectives based on experiences from twentieth-century Danish educational history by outlining contemporary, test-based accountability regime characteristics and their implications for education policy. The article introduces one such characteristic, followed by an empirical analysis of the origins and impacts of test-based…

  17. An exploration of how to define and measure the evolution of behavior, learning, memory and mind across the full phylogenetic tree of life.

    Science.gov (United States)

    Eisenstein, E M; Eisenstein, D L; Sarma, J S M

    2016-01-01

    There are probably few terms in evolutionary studies regarding neuroscience issues that are used more frequently than 'behavior', 'learning', 'memory', and 'mind'. Yet there are probably as many different meanings of these terms as there are users of them. Further, investigators in such studies, while recognizing the full phylogenetic spectrum of life and the evolution of these phenomena, rarely go beyond mammals and other vertebrates in their investigations; invertebrates are sometimes included. What is rarely taken into consideration, though, is that to fully understand the evolution and significance for survival of these phenomena across phylogeny, it is essential that they be measured and compared in the same units of measurement across the full phylogenetic spectrum from aneural bacteria and protozoa to humans. This paper explores how these terms are generally used as well as how they might be operationally defined and measured to facilitate uniform examination and comparisons across the full phylogenetic spectrum of life. This paper has 2 goals: (1) to provide models for measuring the evolution of 'behavior' and its changes across the full phylogenetic spectrum, and (2) to explain why 'mind phenomena' cannot be measured scientifically at the present time.

  18. Substance Abuse among High-Risk Sexual Offenders: Do Measures of Lifetime History of Substance Abuse Add to the Prediction of Recidivism over Actuarial Risk Assessment Instruments?

    Science.gov (United States)

    Looman, Jan; Abracen, Jeffrey

    2011-01-01

    There has been relatively little research on the degree to which measures of lifetime history of substance abuse add to the prediction of risk based on actuarial measures alone among sexual offenders. This issue is of relevance in that a history of substance abuse is related to relapse into substance-using behavior. Furthermore, substance use has…

  19. Tectonic history of continental crustal wedge constrained by EBSD measurements of garnet inclusion trails and thermodynamic modeling

    Science.gov (United States)

    Skrzypek, E.; Schulmann, K.; Lexa, O.; Haloda, J.

    2009-04-01

    Inclusion trails in garnets represent an important but underused tool of structural geology to examine non-coaxial or polyphase coaxial deformation histories of orogens. Garnet growth with respect to deformation during prograde and retrograde orogenic evolution of a continental crustal wedge was constrained by EBSD measurements of internal garnet fabrics and petrological record from mid-crustal rocks of the Śnieżnik Massif (Western Sudetes). Textural position of metamorphic minerals and thermodynamic modeling document three main stages in the tectonic evolution. Few garnet cores show prograde MnO zoning and growth coeval with the formation of the earliest metamorphic foliation which is only rarely observed in the field. The major garnet growth occurs synchronously with the second steep S2 fabric under still prograde conditions as shown by garnet zoning and appearance of staurolite and kyanite (peak at 6.5 kbar/600°C). Conversely, garnet retrogression associated with the development of sillimanite and later andalusite indicates a pressure decrease of ca. 3 kbar for the late flat and pervasive S3 fabric associated with macroscopic recumbent folding of steep S2 foliation. Electron back-scatter diffraction measurements on ilmenite platelets included in garnets help determine their crystallographic preferred orientation. Ilmenite a[100] axes define planar structures that are interpreted as included foliations. Consequently, microscopic observations and foliation intersection axes (FIA) allow us to distinguish between two different records. Only a few (prograde) garnet cores yield information on the orientation of the presumed first metamorphic fabric whereas most of the internal garnet foliations are straight, steep and correspond to relics of originally steep S2 fabric. Importantly, this steep attitude of internal garnet foliations is persistent in both F3 fold hinge and limb zones as well as in zones of complete transposition of S2 into flat S3. Therefore, these

  20. Entangled histories

    Science.gov (United States)

    Cotler, Jordan; Wilczek, Frank

    2016-12-01

    We introduce quantum history states and their mathematical framework, thereby reinterpreting and extending the consistent histories approach to quantum theory. Through thought experiments, we demonstrate that our formalism allows us to analyze a quantum version of history in which we reconstruct the past by observations. In particular, we can pass from measurements to inferences about ‘what happened’ in a way that is sensible and free of paradox. Our framework allows for a richer understanding of the temporal structure of quantum theory, and we construct history states that embody peculiar, non-classical correlations in time.

  1. Measurement of Antibiotic Consumption: A Practical Guide to the Use of the Anatomical Therapeutic Chemical Classification and Defined Daily Dose System Methodology in Canada

    Directory of Open Access Journals (Sweden)

    James M Hutchinson

    2004-01-01

    Full Text Available Despite the global public health importance of resistance of microorganisms to the effects of antibiotics, and the direct relationship of consumption to resistance, little information is available concerning levels of consumption in Canadian hospitals and out-patient settings. The present paper provides practical advice on the use of administrative pharmacy data to address this need. The focus is on the use of the Anatomical Therapeutic Chemical classification and Defined Daily Dose system. Examples of consumption data from Canadian community and hospital settings, with comparisons to international data, are used to stimulate interest and to propose uses of this information. It is hoped that all persons responsible for policy decisions regarding licensing, reimbursement, prescribing guidelines, formulary controls or any other structure pertaining to antimicrobial use become conversant with the concepts of population antibiotic consumption and that this paper provides them with the impetus and direction to begin accurately measuring and comparing antibiotic use in their jurisdictions.
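
    The core ATC/DDD arithmetic is simple: convert the mass dispensed into a count of Defined Daily Doses, then normalize per 1000 inhabitants per day. The sketch below uses illustrative numbers; the DDD value shown is an assumption for the example, and the official figure should always be taken from the current WHO ATC/DDD index.

        def ddd_per_1000_inhabitant_days(total_grams, ddd_grams, population, days):
            ddds = total_grams / ddd_grams
            return ddds * 1000 / (population * days)

        # e.g., 12 kg of an antibiotic with an assumed DDD of 1.5 g,
        # dispensed to a population of 50,000 over one year:
        print(ddd_per_1000_inhabitant_days(12_000, 1.5, 50_000, 365))  # ~0.44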

  2. 20 °C—A Short History of the Standard Reference Temperature for Industrial Dimensional Measurements

    Science.gov (United States)

    Doiron, Ted

    2007-01-01

    One of the basic principles of dimensional metrology is that a part dimension changes with temperature because of thermal expansion. Since 1931 industrial lengths have been defined as the size at 20 °C. This paper discusses the variety of standard temperatures that were in use before that date, the efforts of C.E. Johansson to meet these variations, and the effort by the National Bureau of Standards to bring the United States to the eventual world standard. PMID:27110451
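
    The correction that motivates a standard reference temperature is the ordinary linear-expansion formula, L20 = Lt / (1 + alpha * (t - 20)); a minimal sketch, using a typical textbook expansion coefficient for steel:

        def length_at_20C(measured_length, t_celsius, alpha=11.5e-6):
            # alpha in 1/°C; ~11.5e-6 is a typical value for steel
            return measured_length / (1 + alpha * (t_celsius - 20.0))

        # A 100 mm steel gauge block measured at 25 °C:
        print(length_at_20C(100.0, 25.0))  # ~99.99425 mm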

  3. 20 °C—A Short History of the Standard Reference Temperature for Industrial Dimensional Measurements

    OpenAIRE

    Doiron, Ted

    2007-01-01

    One of the basic principles of dimensional metrology is that a part dimension changes with temperature because of thermal expansion. Since 1931 industrial lengths have been defined as the size at 20 °C. This paper discusses the variety of standard temperatures that were in use before that date, the efforts of C.E. Johansson to meet these variations, and the effort by the National Bureau of Standards to bring the United States to the eventual world standard.

  4. Galaxy Formation as a Cosmological Tool. I: The Galaxy Merger History as a Measure of Cosmological Parameters

    CERN Document Server

    Conselice, Christopher J; Mortlock, Alice; Palamara, David; Benson, Andrew J

    2014-01-01

    As galaxy formation and evolution over long cosmic time-scales depend to a large degree on the structure of the universe, the assembly history of galaxies is potentially a powerful approach for learning about the universe itself. In this paper we examine the merger history of dark matter halos based on the Extended Press-Schechter formalism as a function of cosmological parameters, redshift and halo mass. We calculate how major halo mergers are influenced by changes in the cosmological values of $\Omega_{\rm m}$, $\Omega_{\Lambda}$, $\sigma_{8}$, the dark matter particle temperature (warm vs. cold dark matter), and the value of a constant and evolving equation of state parameter $w(z)$. We find that the merger fraction at a given halo mass varies by up to a factor of three for halos forming under the assumption of Cold Dark Matter, within different underlying cosmological parameters. We find that the current measurements of the merger history, as measured through observed galaxy pairs as well as through struc...
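
    A common way merger histories are summarized observationally in this literature is a power-law pair-fraction parametrization, f(z) = f0 * (1 + z)^m; the sketch below uses illustrative values for f0 and m, not the paper's fitted ones.

        def merger_fraction(z, f0=0.03, m=2.5):
            # illustrative power-law evolution of the pair/merger fraction
            return f0 * (1 + z) ** m

        for z in (0.5, 1.0, 2.0):
            print(z, round(merger_fraction(z), 3))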

  5. In situ calibration of the Gamma Reaction History instrument using reference samples ("pucks") for areal density measurements

    Science.gov (United States)

    Hoffman, N. M.; Herrmann, H. W.; Kim, Y. H.; Hsu, H. H.; Horsfield, C. J.; Rubery, M. S.; Wilson, D. C.; Stoeffl, W. W.; Young, C. S.; Mack, J. M.; Miller, E. K.; Grafil, E.; Evans, S. C.; Sedillo, T. J.; Glebov, V. Yu.; Duffy, T.

    2013-11-01

    The introduction of a sample of carbon, for example a disk or "puck", near an imploding DT-filled capsule creates a source of 12C gamma rays that can serve as a reference for calibrating the response of the Gamma Reaction History (GRH) detector [1]. Such calibration is important in the measurement of ablator areal density ⟨ρR⟩abl in plastic-ablator DT-filled capsules at OMEGA [2], by allowing ⟨ρR⟩abl to be inferred as a function of ratios of signals rather than from absolute measurements of signal magnitudes. Systematic uncertainties in signal measurements and detector responses therefore cancel, permitting more accurate measurements of ⟨ρR⟩abl.
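
    The ratio idea can be stated in one line: with the puck fixing a calibration constant, the ablator areal density follows from the ratio of the 12C gamma signal to the DT fusion gamma signal, so the absolute detector response cancels. A minimal sketch with hypothetical symbols:

        def rhoR_ablator(signal_carbon_gamma, signal_dt_gamma, k_calibration):
            # <rhoR>_abl ~ k * (12C gamma signal / DT fusion gamma signal);
            # k is fixed by the known-areal-density reference puck
            return k_calibration * signal_carbon_gamma / signal_dt_gamma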

  6. In situ calibration of the Gamma Reaction History instrument using reference samples (“pucks”) for areal density measurements

    Directory of Open Access Journals (Sweden)

    Hoffman N.M.

    2013-11-01

    Full Text Available The introduction of a sample of carbon, for example a disk or “puck”, near an imploding DT-filled capsule creates a source of 12C gamma rays that can serve as a reference for calibrating the response of the Gamma Reaction History (GRH) detector [1]. Such calibration is important in the measurement of ablator areal density ⟨ρR⟩abl in plastic-ablator DT-filled capsules at OMEGA [2], by allowing ⟨ρR⟩abl to be inferred as a function of ratios of signals rather than from absolute measurements of signal magnitudes. Systematic uncertainties in signal measurements and detector responses therefore cancel, permitting more accurate measurements of ⟨ρR⟩abl.

  7. Mach-Zehnder fiber-optic links for reaction history measurements at the National Ignition Facility

    Science.gov (United States)

    Miller, E. Kirk; Herrmann, H. W.; Stoeffl, W.; Horsfield, C. J.

    2010-08-01

    We present the details of the analog fiber-optic data link that will be used in the chamber-mounted Gamma Reaction History (GRH) diagnostic at the National Ignition Facility (NIF) located at Lawrence Livermore National Laboratory in Livermore, California. The system is based on Mach-Zehnder (MZ) modulators integrated into the diagnostic, with the source lasers and bias control electronics located remotely to protect the active electronics. A complete recording system for a single GRH channel comprises two MZ modulators, with the fiber signals split onto four channels on a single digitizer. By carefully selecting the attenuation, the photoreceiver, and the digitizer settings, the dynamic range achievable is greater than 1000:1 at the full system bandwidth of greater than 10 GHz. The system is designed to minimize electrical reflections and mitigate the effects of transient radiation darkening on the fibers.

  8. Thermal history sensors for non-destructive temperature measurements in harsh environments

    Science.gov (United States)

    Pilgrim, C. C.; Heyes, A. L.; Feist, J. P.

    2014-02-01

    The operating temperature is a critical physical parameter in many engineering applications; however, it can be very challenging to measure in certain environments, particularly when access is limited or on rotating components. A new quantitative non-destructive temperature measurement technique has been proposed which relies on thermally induced permanent changes in ceramic phosphors. This technique has several distinct advantages over current methods for many different applications. The robust ceramic material stores the temperature information, allowing long-term thermal exposures in harsh environments to be measured at a convenient time. Additionally, rare earth dopants make the ceramic phosphorescent so that the temperature information can be interpreted by automated interrogation of the phosphorescent light. This technique has been demonstrated by application of YAG doped with dysprosium and europium as coatings through the air-plasma spray process. Either material can be used to measure temperature over a wide range, namely between 300°C and 900°C. Furthermore, results show that the material records the peak exposure temperature and that prolonged exposure at lower temperatures has no effect on the temperature measurement. This indicates that these materials could be used to measure peak operating temperatures in long-term testing.

  9. A better method to define electrical chargeability from laboratory measurements of spectral impedance using a parallel Cole-Cole equivalent circuit

    Science.gov (United States)

    Enkin, R. J.

    2014-12-01

    Induced polarization (IP) is a successful electric method to identify drill targets for mineral exploration at the property scale. The Paleomagnetism and Petrophysics Laboratory at the Geological Survey of Canada makes petrophysical measurements on cylindrical rock samples, 2.5 cm diameter and 2.2 cm long. This small size has advantages, including allowing measurement of magnetic remanence with standard paleomagnetism equipment, but it is too small to allow a 4-contact electrical impedance measurement. The samples are impregnated with distilled water under vacuum and allowed 24 hours for solutes to dissolve off pore walls, in order to approximate original groundwater ionic conductivity. We use graphite electrodes on the flat surfaces and measure the complex impedance at 5 frequencies per decade from 1 MHz down to 25 mHz. Typical responses on a Cole-Cole plot (i.e., real vs. imaginary components displayed parametrically as a function of frequency) look like two overlapping circular arcs followed by a constant-phase diffusive response at lowest frequencies. The impedance frequency response is fit with a circuit in which the rock is modelled as a set of parallel resistor and constant-phase-element pathways, connected in series through a modified constant-phase-element representing the low frequency sample-holder response. The program "ZarcFit", written in LabView, allows the operator to tune parameters of an equivalent but far more intuitive series circuit with a set of 13 sliders, and then perform a least-squares optimization. Time domain chargeability is defined by removing the effect of the sample holder, taking the Fourier transform to convert the frequency response to its time-domain equivalent and then integrating under the resulting voltage-decay curve. Time domain measurements using two-electrode sample holders are necessarily contaminated by the low-frequency response of ionic diffusion at the electrodes. Results are compiled in the Canadian Rock Physical
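
    The equivalent circuit described above can be sketched directly: each rock pathway is a ZARC element (a resistor in parallel with a constant-phase element), Z = R / (1 + (j*omega*tau)^alpha), with a series constant-phase element for the electrode/sample-holder response. The parameter values below are illustrative, not ZarcFit's.

        import numpy as np

        def z_zarc(omega, R, tau, alpha):
            # resistor in parallel with a constant-phase element
            return R / (1 + (1j * omega * tau) ** alpha)

        def z_model(omega, p):
            # two overlapping arcs plus a series electrode CPE, Z = 1/(Q (jw)^n)
            return (z_zarc(omega, p['R1'], p['tau1'], p['a1'])
                    + z_zarc(omega, p['R2'], p['tau2'], p['a2'])
                    + 1 / (p['Q'] * (1j * omega) ** p['n']))

        omega = 2 * np.pi * np.logspace(-1.6, 6, 39)  # ~25 mHz to 1 MHz
        p = dict(R1=1e4, tau1=1e-4, a1=0.8, R2=5e3, tau2=1e-1, a2=0.7, Q=1e-6, n=0.5)
        Z = z_model(omega, p)  # Cole-Cole plot: Z.real vs. -Z.imag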

  10. Diamond growth history from in situ measurement of Pb and S isotopic compositions of sulfide inclusions

    Science.gov (United States)

    Rudnick, Roberta L.; Eldridge, C. Stewart; Bulanova, Galina P.

    1993-01-01

    In a continuing effort to understand crust-mantle dynamics, we have determined the S and Pb isotopic compositions of mantle sulfides encapsulated within diamonds from under the Siberian craton and compared these results to those of previously investigated African counterparts. Because diamond inclusions are isolated from exchange with surrounding mantle, they may preserve the history of diamond growth and act as direct tracers of the origins of mantle materials. Study of these inclusions may thus offer the best chance of recognizing global-scale interaction between Earth's crust and mantle. Although δ34S values of the Siberian sulfides do not deviate significantly from the mantle value of 0‰ ± 3‰, Pb isotopic compositions are highly variable. Pb isotopic compositions of sulfides from peridotitic suite diamonds generally plot near the terrestrial Pb growth curve, with model ages ranging between 0 and 2 Ga, whereas sulfides from eclogitic suite diamonds have radiogenic compositions, plotting beyond the growth curve. These results, which are similar to those for sulfides in African diamonds, suggest that the sulfides from eclogitic suite diamonds were derived from a source with an unusually high U/Pb ratio and may indicate a common process (such as subduction of crustal materials into the mantle) operating beneath Africa and Siberia. The absence of extremely radiogenic Pb in sulfides from eclogite xenoliths suggests that the radiogenic material from which eclogitic suite diamonds grew was a transient feature of the mantle, associated with diamond growth. The ultimate origin of this high U/Pb signature, however, remains enigmatic. Large variations in Pb isotopic composition of sulfides from different zones in a single peridotitic suite diamond document (1) crystallization of the diamond's core near 2.0 Ga, (2) growth of its outer zone in an environment with a high U/Pb ratio similar to the growth environment of eclogitic suite diamonds, and (3) growth of the

  11. Daily ratings measures of alcohol craving during an inpatient stay define subtypes of alcohol addiction that predict subsequent risk for resumption of drinking.

    Science.gov (United States)

    Oslin, David W; Cary, Mark; Slaymaker, Valarie; Colleran, Carol; Blow, Frederic C

    2009-08-01

    Both depressive symptoms and alcohol craving have been postulated as important predictors of relapse in patients with addictive disorders. The purpose of this study was to examine the course of affective symptoms and cravings for alcohol during the initial 25 days of residential treatment for middle-aged and older adults addicted to alcohol, and the relationship between these symptoms and recovery outcomes. Ninety-five alcohol-dependent subjects were enrolled in this observational study. Participants completed a daily diary of alcohol craving, positive affect, and negative affect during residential treatment. Participants were interviewed 1 and 6 months after discharge to assess clinical symptoms of relapse and functioning. Latent class analysis identified three groups of individuals for each of the three daily measures. For alcohol craving, 17 subjects reported elevated cravings during the entire treatment stay, 37 subjects reported initially elevated craving followed by slight improvement, and 41 subjects reported relatively low craving from the time of admission to the end of residential treatment. Alcohol craving class was associated with negative affect but not positive affect. Alcohol craving class, but not affective class, was predictive of time to relapse to any drinking in the 6 months after residential treatment. Sustained craving may define a subtype of alcohol dependence that is less responsive to treatment and may explain heterogeneity in treatment outcomes. These results also may suggest a role for differential treatment programming to address high states of craving for alcohol.

  12. Mn-Cr relative sensitivity factor in ferromagnesian olivines defined for SIMS measurements with a Cameca ims-1280 ion microprobe: Implications for dating secondary fayalite

    Science.gov (United States)

    Doyle, Patricia M.; Jogo, Kaori; Nagashima, Kazuhide; Huss, Gary R.; Krot, Alexander N.

    2016-02-01

    The short-lived radionuclide 53Mn, which decays to 53Cr with a half-life of ∼3.7 Myr, is useful for sequencing objects that formed within the first 20 Myr of Solar System evolution. 53Mn-53Cr relative chronology enables aqueously formed secondary minerals, such as fayalite and various carbonates in ordinary and carbonaceous chondrites, to be dated, thereby providing chronological constraints on aqueous alteration processes. In situ measurements of Mn-Cr isotope systematics in fayalite by secondary ion mass spectrometry (SIMS) require consideration of the relative sensitivities of the 55Mn+ and 52Cr+ ions, for which a relative sensitivity factor [RSF = (55Mn+/52Cr+)SIMS/(55Mn/52Cr)true] is defined using appropriate standards. In the past, San Carlos olivine (Fa∼10) was commonly used for this purpose, but a growing body of evidence suggests that it is an unsuitable standard for meteoritic fayalite (Fa>90). Natural fayalite also cannot be used as a standard because it contains only trace amounts of chromium, which makes determining a true 55Mn/52Cr ratio and its degree of heterogeneity very difficult. To investigate the dependence of the Mn-Cr RSF on ferromagnesian olivine compositions, we synthesized a suite of compositionally homogeneous Mn,Cr-bearing liquidus-phase ferromagnesian olivines (Fa31-99). Manganese-chromium isotopic measurements of San Carlos olivine and the synthesized ferromagnesian olivines using the University of Hawai'i Cameca ims-1280 SIMS show that the RSF for Fa10 is ∼0.9; it increases rapidly between Fa10 and Fa31 and reaches a plateau value of ∼1.5 ± 0.1 for Fa>34. The RSF is time-dependent: it increases during the measurements of olivines with fayalite contents below Fa50. The RSF measured on ferroan olivine (Fa>90) is influenced by pit shape, whereas the RSF measured on magnesian olivine (Fa10) is less sensitive to changes in pit shape. For these reasons, 53Mn-53Cr systematics of chondritic fayalite (Fa>90) should be determined using standards of comparable fayalite content.
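
    The two relations at the heart of this record, the RSF definition and the 53Mn-53Cr relative chronology, reduce to one-line formulas. A hedged sketch follows; the numbers are illustrative and are not the paper's calibration values:

```python
# Illustrative sketch of the two relations used above: the Mn-Cr relative
# sensitivity factor and a 53Mn-53Cr relative age. Numbers are made up for
# demonstration; they are not the paper's calibration data.
import math

def rsf(mn55_cr52_sims, mn55_cr52_true):
    """RSF = (55Mn+/52Cr+)_SIMS / (55Mn/52Cr)_true."""
    return mn55_cr52_sims / mn55_cr52_true

HALF_LIFE_MN53_MYR = 3.7
LAMBDA = math.log(2) / HALF_LIFE_MN53_MYR  # decay constant, 1/Myr

def relative_age_myr(ratio_a, ratio_b):
    """Time difference implied by two initial 53Mn/55Mn ratios.

    A positive result means object A formed earlier than object B.
    """
    return math.log(ratio_a / ratio_b) / LAMBDA

# Example: correct a measured ion ratio with an RSF of 1.5 (the plateau value
# reported above for Fa>34 olivine), then compare two hypothetical ratios.
print(rsf(1.2, 0.8))                     # -> 1.5
print(relative_age_myr(2.0e-6, 1.0e-6))  # ~3.7 Myr earlier (one half-life)
```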

  13. Oxygen consumption rate v. rate of energy utilization of fishes: a comparison and brief history of the two measurements.

    Science.gov (United States)

    Nelson, J A

    2016-01-01

    Accounting for energy use by fishes has been taking place for over 200 years. The original, and continuing, gold standard for measuring energy use in terrestrial animals is to account for the waste heat produced by all reactions of metabolism, a process referred to as direct calorimetry. Direct calorimetry is not easy or convenient in terrestrial animals and is extremely difficult in aquatic animals. Thus, the original and most subsequent measurements of metabolic activity in fishes have been made via indirect calorimetry. Indirect calorimetry takes advantage of the fact that oxygen is consumed and carbon dioxide is produced during the catabolic conversion of foodstuffs or energy reserves to useful ATP energy. As measuring [CO2] in water is more challenging than measuring [O2], most indirect calorimetric studies on fishes have used the rate of O2 consumption. Relating measurements of O2 consumption back to actual energy usage requires knowledge of the substrate being oxidized. Many contemporary studies of O2 consumption by fishes do not attempt to relate this measurement back to actual energy usage. Thus, the rate of oxygen consumption (ṀO2) has become a measurement in its own right that is not necessarily synonymous with metabolic rate. Because all extant fishes are obligate aerobes (many fishes engage in substantial net anaerobiosis, but all require oxygen to complete their life cycle), this discrepancy does not appear to be of great concern to the fish biology community, and reports of fish oxygen consumption, without being related to energy, have proliferated. Unfortunately, under some circumstances, these measures can be quite different from one another. A review of the methodological history of the two measurements and a look towards the future are included.
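
    The conversion the abstract alludes to, from oxygen consumption back to energy, is a multiplication by an oxycalorific coefficient that depends on the substrate being oxidized. A sketch with commonly cited approximate coefficients (indicative values, not taken from this paper):

```python
# Sketch of indirect calorimetry: converting an oxygen consumption rate into
# a rate of energy use. The oxycalorific coefficients below are commonly
# cited approximations that depend on the substrate oxidized; they are
# illustrative, not values taken from the paper.
OXYCALORIFIC_J_PER_MG_O2 = {
    "carbohydrate": 14.8,
    "lipid": 13.7,
    "protein": 13.4,  # assuming ammonotelic excretion, typical for fishes
}

def metabolic_rate_watts(mo2_mg_per_h, substrate="lipid"):
    """Convert MO2 (mg O2 per hour) to watts for an assumed substrate."""
    joules_per_hour = mo2_mg_per_h * OXYCALORIFIC_J_PER_MG_O2[substrate]
    return joules_per_hour / 3600.0

# A fish consuming 100 mg O2/h while oxidizing mostly lipid:
print(round(metabolic_rate_watts(100.0, "lipid"), 3), "W")
```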

  14. Measurement of circulating transcripts and gene cluster analysis predicts and defines therapeutic efficacy of peptide receptor radionuclide therapy (PRRT) in neuroendocrine tumors

    Energy Technology Data Exchange (ETDEWEB)

    Bodei, L. [European Institute of Oncology, Division of Nuclear Medicine, Milan (Italy); LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Kidd, M. [Wren Laboratories, Branford, CT (United States); Modlin, I.M. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Yale School of Medicine, New Haven, CT (United States); Severi, S.; Nicolini, S.; Paganelli, G. [Istituto Scientifico Romagnolo per lo Studio e la Cura dei Tumori (IRST) IRCCS, Nuclear Medicine and Radiometabolic Units, Meldola (Italy); Drozdov, I. [Bering Limited, London (United Kingdom); Kwekkeboom, D.J.; Krenning, E.P. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Erasmus Medical Center, Nuclear Medicine Department, Rotterdam (Netherlands); Baum, R.P. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Zentralklinik Bad Berka, Theranostics Center for Molecular Radiotherapy and Imaging, Bad Berka (Germany)

    2016-05-15

    Peptide receptor radionuclide therapy (PRRT) is an effective method for treating neuroendocrine tumors (NETs). It is limited, however, in the prediction of individual tumor response and the precise and early identification of changes in tumor size. Currently, response prediction is based on somatostatin receptor expression and efficacy by morphological imaging and/or chromogranin A (CgA) measurement. The aim of this study was to assess the accuracy of circulating NET transcripts as a measure of PRRT efficacy, and moreover to identify prognostic gene clusters in pretreatment blood that could be interpolated with relevant clinical features in order to define a biological index for the tumor and a predictive quotient for PRRT efficacy. NET patients (n = 54), M:F 37:17, median age 66, bronchial: n = 13, GEP-NET: n = 35, CUP: n = 6, were treated with ¹⁷⁷Lu-based PRRT (cumulative activity: 6.5-27.8 GBq, median 18.5). At baseline: 47/54 low-grade (G1/G2; bronchial typical/atypical), 31/49 ¹⁸FDG-positive and 39/54 progressive. Disease status was assessed by RECIST 1.1. Transcripts were measured by real-time quantitative reverse transcription PCR (qRT-PCR) and multianalyte algorithmic analysis (NETest); CgA by enzyme-linked immunosorbent assay (ELISA). Gene cluster (GC) derivations: regulatory network and protein:protein interactome analyses. Statistical analyses: chi-square, non-parametric measurements, multiple regression, receiver operating characteristic and Kaplan-Meier survival. The disease control rate was 72%. Median PFS was not achieved (follow-up: 1-33 months, median: 16). Only grading was associated with response (p < 0.01). At baseline, 94% of patients were NETest-positive, while CgA was elevated in 59%. NETest accurately (89%, χ² = 27.4; p = 1.2 × 10⁻⁷) correlated with treatment response, while CgA was 24% accurate. Gene cluster expression (growth-factor signalome and metabolome) had an AUC of 0.74 ± 0.08 (z-statistic = 2.92, p < 0

  15. A study of accurate latent heat measurement for a PCM with a low melting temperature using T-history method

    Energy Technology Data Exchange (ETDEWEB)

    Peck, Jong Hyeon [Korea Institute of Industrial Technology (KITECH), Energy System Team, 35-3 Ipjang-myeon, Chonan 330-820 (Korea, Republic of); Kim, Jae-Jun [College of Architecture, Hanyang University, Seoul 133-791 (Korea, Republic of); Kang, Chaedong [Department of Mechanical Engineering, Chonbuk National University, Jeonju 561-756 (Korea, Republic of); Hong, Hiki [School of Mechanical and Industrial System Engineering, KyungHee University, Yongin 449-701 (Korea, Republic of)

    2006-11-15

    When the latent heat of a phase change material (PCM) with a melting point lower than ambient temperature was assessed according to the standard T-history method using a vertically oriented test tube, a temperature gradient occurred in the longitudinal direction of the tube due to natural convection. This led to a decrease in the accuracy of the latent heat of fusion measurement. In this study, the accuracy of the measurement with the original T-history method was improved, without sacrificing the test's simplicity and convenience, by setting the test tube horizontally. The heat transfer to the vapor layer of the tube under volume change during melting was assumed to be negligible, and the results were calculated using the two inflection points of temperature as the start and end of the latent heat period. Under these assumptions, the results agree closely with other reference data. In addition, the new method proposed in this study showed a remarkable reduction in data scattering. (author)

  16. Identification of fall risk predictors in daily life measurements: gait characteristics' reliability and association with self-reported fall history.

    Science.gov (United States)

    Rispens, Sietse M; van Schooten, Kimberley S; Pijnappels, Mirjam; Daffertshofer, Andreas; Beek, Peter J; van Dieën, Jaap H

    2015-01-01

    Background. Gait characteristics extracted from trunk accelerations during daily life locomotion are complementary to questionnaire- or laboratory-based gait and balance assessments and may help to improve fall risk prediction. Objective. The aim of this study was to identify gait characteristics that are associated with self-reported fall history and that can be reliably assessed based on ambulatory data collected during a single week. Methods. We analyzed 2 weeks of trunk acceleration data (DynaPort MoveMonitor, McRoberts) collected among 113 older adults (age range, 65-97 years). During episodes of locomotion, various gait characteristics were determined, including local dynamic stability, interstride variability, and several spectral features. For each characteristic, we performed a negative binomial regression analysis with the participants' self-reported number of falls in the preceding year as outcome. Reliability of gait characteristics was assessed in terms of intraclass correlations between both measurement weeks. Results. The percentages of spectral power below 0.7 Hz along the vertical and anteroposterior axes and below 10 Hz along the mediolateral axis, as well as local dynamic stability, local dynamic stability per stride, gait smoothness, and the amplitude and slope of the dominant frequency along the vertical axis, were associated with the number of falls in the preceding year and could be reliably assessed (all statistically significant, with intraclass correlations above 0.75). Conclusions. Daily life gait characteristics are associated with fall history in older adults and can be reliably estimated from a week of ambulatory trunk acceleration measurements.
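
    The per-characteristic analysis described above can be sketched as a negative binomial regression of fall counts on a single gait feature. The data and variable names below are hypothetical; this illustrates the model class, not the study's analysis code:

```python
# Minimal sketch of the per-characteristic analysis described above: a
# negative binomial regression of self-reported fall counts on one gait
# characteristic. Data and variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 113  # cohort size in the study
local_dynamic_stability = rng.normal(0.0, 1.0, n)   # standardized feature
lam = np.exp(-0.3 + 0.4 * local_dynamic_stability)  # synthetic fall rate
falls_last_year = rng.poisson(lam)                  # synthetic counts

X = sm.add_constant(local_dynamic_stability)
model = sm.GLM(falls_last_year, X,
               family=sm.families.NegativeBinomial(alpha=1.0))
result = model.fit()
print(result.summary())
```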

  17. Statistical measures of genetic differentiation of populations:Rationales, history and current states

    Institute of Scientific and Technical Information of China (English)

    Liang MA; Ya-Jie JI; De-Xing ZHANG

    2015-01-01

    Population differentiation is a fundamental process of evolution, and many evolutionary studies, such as population genetics, phylogeography and conservation biology, all require the inference of population differentiation. Recently, there has been a lot of debate over the validity of FST (and its analogue GST) as a measure of population genetic differentiation, notably since the proposal of the new index D in 2008. Although several papers have reviewed or explored specific features of these statistical measures, a succinct account of this bewildering issue with an overall update appears to be desirable. This is the purpose of the present review. The available statistics generally fall into two categories, represented by FST and D, respectively. None of them is perfect in measuring population genetic differentiation. Nevertheless, they each have advantages and are valuable for current research. In practice, both indices should be calculated, and a comparison of them can generate useful insights into the evolutionary processes that influence population differentiation. FST (GST) has some unique, irreplaceable characteristics assuring its standing as the default measure for the foreseeable future. It will also continue to serve as the standard for any alternative measures to be contrasted with. Instead of being anxious about choosing between these indices, one should pay due attention to the equilibrium status and the level of diversity (especially HS) of the populations, since these largely sway the power of a given statistic to address a specific question. We provide a multi-faceted comparative summary of the various statistics, which can serve as a basic reference for readers to guide their applications [Current Zoology 61 (5): 886–897, 2015].

  18. The history and assessment of effectiveness of soil erosion control measures deployed in Russia

    Directory of Open Access Journals (Sweden)

    Valentin Golosov

    2013-09-01

    Research activities aimed at the design and application of soil conservation measures for reducing soil losses from cultivated fields started in Russia in the last quarter of the 19th century. A network of "zonal agroforestry melioration experimental stations" was organized in the different landscape zones of Russia in the first half of the 20th century. The main task of the experiments was to develop effective soil conservation measures for Russian climatic, soil and land use conditions. The most widespread and large-scale introduction of countermeasures to cope with soil erosion by water and wind into agricultural practice, supported by serious governmental investments, took place during the Soviet period after the Second World War. After the collapse of the Soviet Union in 1991, general deterioration of the agricultural sector of the economy and the absence of investments resulted in the cessation of organized application of soil conservation measures at the nation-wide level. However, some of the long-term erosion control measures, such as forest shelter belts, artificial slope terracing and water diversion dams above formerly active gully heads, have survived until the present. In the case study of sediment redistribution within a small cultivated catchment presented in this paper, an attempt was made to evaluate average annual erosion rates on arable slopes with and without soil conservation measures for two time intervals. It was found that the application of conservation measures on cultivated slopes within the experimental part of the case study catchment led to a decrease of average soil loss rates by at least 2.5-2.8 times. The figures obtained are in good agreement with previously published results of direct monitoring of snowmelt erosion rates, reporting approximately a 3-fold decrease of average snowmelt erosion rates in the experimental sub-catchment compared to a traditionally cultivated control sub-catchment. A substantial decrease of soil

  19. Birth Measurements, Family History, and Environmental Factors Associated With Later-Life Hypertensive Status

    OpenAIRE

    Chen, Xia; Zhang, Zhen-Xin; George, Linda K.; Wang, Zi-Shi; Fan, Zhong-jie; Xu, Tao; Zhou, Xiao-Lin; Han, Shao-Mei; Wen, Hong-Bo; Zeng, Yi

    2012-01-01

    Background: This birth cohort study was conducted to investigate the contribution of prenatal and antenatal environmental exposures to later-life hypertensive status. Methods: Two thousand five hundred and three individuals born in 1921–1954 at the Peking Union Medical College Hospital (PUMCH) were targeted; 2,081 (83.1%) participated. Clinical examinations included an interview, blood pressure (BP) measurements, and laboratory assays. Statistical analyses were performed using ordinal regressio...

  20. A History of In Vivo Neutron Activation Analysis in Measurement of Aluminum in Human Subjects.

    Science.gov (United States)

    Mohseni, Hedieh K; Chettle, David R

    2016-01-01

    Aluminum, as an abundant metal, has gained widespread use in human life, entering the body predominantly as an additive to various foods and drinking water. Other major sources of exposure to aluminum include medical, cosmetic, and occupational routes. As a common environmental toxin, with well-known roles in several medical conditions such as dialysis encephalopathy, aluminum is considered a potential candidate in the causality of Alzheimer's disease. Aluminum mostly accumulates in the bone, which makes bone an indicator of the body burden of aluminum and an ideal organ as a proxy for the brain. Most of the techniques developed for measuring aluminum include bone biopsy, which requires invasive measures, causing inconvenience for the patients. There has been a considerable effort in developing non-invasive approaches, which allow for monitoring aluminum levels for medical and occupational purposes in larger populations. In vivo neutron activation analysis, a method based on nuclear activation of isotopes of elements in the body and their subsequent detection, has proven to be an invaluable tool for this purpose. There are definite challenges in developing in vivo non-invasive techniques capable of detecting low levels of aluminum in healthy individuals and aluminum-exposed populations. The following review examines the method of in vivo neutron activation analysis in the context of aluminum measurement in humans focusing on different neutron sources, interference from other activation products, and the improvements made in minimum detectable limits and patient dose over the past few decades.

  1. World History Of Radon Research And Measurement From The Early 1900's To Today

    Science.gov (United States)

    George, A. C.

    2008-08-01

    In 1900, Dorn discovered the emanation in the uranium series that eventually became the well-known gas 222Rn. From 1900 through 1908, it was demonstrated that 222Rn is a radioactive gas found in tap water, highly condensable at low temperatures, with a half-life of approximately 3.7 days, and that it can be collected on charcoal by adsorption. Although radon was discovered in 1900, the effects of prolonged exposure had been suspected and noted 300 years earlier among underground miners who developed lung cancer. During the period from 1924-1932, it was suggested that radon was the cause of the high lung cancer incidence. In 1951, researchers at the University of Rochester, NY, pointed out that the lung cancer health hazard was from the alpha radiation dose delivered by the radon decay products deposited in the respiratory tract. The findings of the BEIR Committee Report VI, which was based on epidemiological studies in different groups of mines in the 1950's and 1960's and on laboratory studies, showed that, of 60,000 miners, over 2,600 developed lung cancer, whereas only 750 cases were expected. Since 1998, the epidemiological study conducted in Iowa, US, has shown beyond any reasonable doubt that radon decay products cause lung cancer among women who lived at least twenty years in their homes. This paper will cover early radon measurements in soil, building material, ground water and in different air environments, such as the atmosphere, caves, spas, underground mines and the residential indoor air environment. Radon measurements were conducted in many areas for diagnostic purposes. Radon was used as a natural tracer to study air masses and vertical diffusion, in atmospheric studies, in earthquake prediction, and as a geological indicator for radium and uranium. In the early radon measurements, electroscopes, electrometers and primitive ionization chambers were used for many years. In the 1940's, fast pulse ionization chambers replaced total ionization chambers. From the mid 1950's

  2. Atmospheric refraction: a history

    Science.gov (United States)

    Lehn, Waldemar H.; van der Werf, Siebren

    2005-09-01

    We trace the history of atmospheric refraction from the ancient Greeks up to the time of Kepler. The concept that the atmosphere could refract light entered Western science in the second century B.C. Ptolemy, 300 years later, produced the first clearly defined atmospheric model, containing air of uniform density up to a sharp upper transition to the ether, at which the refraction occurred. Alhazen and Witelo transmitted his knowledge to medieval Europe. The first accurate measurements were made by Tycho Brahe in the 16th century. Finally, Kepler, who was aware of unusually strong refractions, used the Ptolemaic model to explain the first documented and recognized mirage (the Novaya Zemlya effect).

  3. A multi-decadal history of biomass burning plume heights identified using aerosol index measurements

    Directory of Open Access Journals (Sweden)

    H. Guan

    2010-07-01

    We have quantified the relationship between Aerosol Index (AI) measurements and plume height for young biomass burning plumes using coincident Ozone Monitoring Instrument (OMI) and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) measurements. This linear relationship allows the determination of high-altitude plumes wherever AI data are available, and it provides a data set for validating global fire plume heights in chemistry transport models. We find that all plumes detected from June 2006 to February 2009 with an AI value ≥9 are located at altitudes higher than 5 km. Older high-altitude plumes have lower AI values than young plumes at similar altitudes. We have examined available AI data from the OMI and TOMS instruments (1978–2009) and find that large AI plumes occur more frequently over North America than over Australia or Russia/Northeast Asia. According to the derived relationship, during this time interval, 181 plumes, in various stages of their evolution, reached altitudes above 8 km.
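
    The AI-to-plume-height mapping described above rests on a linear fit to matched OMI/CALIPSO observations. A toy sketch follows; the matchup data and fitted coefficients are invented, not the paper's regression results:

```python
# Toy sketch of the approach described above: fit a linear relationship
# between Aerosol Index and plume height from matched observations, then use
# it to flag high-altitude plumes. The matchups and coefficients below are
# invented; they are not the OMI/CALIPSO regression results.
import numpy as np

# Hypothetical matchups: (AI, CALIPSO plume height in km)
ai = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 15.0])
height_km = np.array([1.8, 3.0, 4.1, 5.2, 6.9, 8.4])

slope, intercept = np.polyfit(ai, height_km, 1)

def predicted_height_km(aerosol_index):
    return slope * aerosol_index + intercept

# The record reports that all plumes with AI >= 9 sat above 5 km.
print(round(predicted_height_km(9.0), 2), "km")
```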

  4. Defining the Anthropocene

    Science.gov (United States)

    Lewis, Simon; Maslin, Mark

    2016-04-01

    Time is divided by geologists according to marked shifts in Earth's state. Recent global environmental changes suggest that Earth may have entered a new human-dominated geological epoch, the Anthropocene. Should the Anthropocene - the idea that human activity is a force acting upon the Earth system in ways that mean that Earth will be altered for millions of years - be defined as a geological time-unit at the level of an Epoch? Here we appraise the data to assess such claims, first in terms of changes to the Earth system, with particular focus on very long-lived impacts, as Epochs typically last millions of years. Can Earth really be said to be in transition from one state to another? Secondly, we then consider the formal criteria used to define geological time-units and move forward through time examining whether currently available evidence passes typical geological time-unit evidence thresholds. We suggest two time periods likely fit the criteria (1) the aftermath of the interlinking of the Old and New Worlds, which moved species across continents and ocean basins worldwide, a geologically unprecedented and permanent change, which is also the globally synchronous coolest part of the Little Ice Age (in Earth system terms), and the beginning of global trade and a new socio-economic "world system" (in historical terms), marked as a golden spike by a temporary drop in atmospheric CO2, centred on 1610 CE; and (2) the aftermath of the Second World War, when many global environmental changes accelerated and novel long-lived materials were increasingly manufactured, known as the Great Acceleration (in Earth system terms) and the beginning of the Cold War (in historical terms), marked as a golden spike by the peak in radionuclide fallout in 1964. We finish by noting that the Anthropocene debate is politically loaded, thus transparency in the presentation of evidence is essential if a formal definition of the Anthropocene is to avoid becoming a debate about bias. The

  5. Prediction of solar energetic particle event histories using real-time particle and solar wind measurements

    Science.gov (United States)

    Roelof, E. C.; Gold, R. E.

    1978-01-01

    The comparatively well-ordered magnetic structure in the solar corona during the decline of Solar Cycle 20 revealed a characteristic dependence of solar energetic particle injection upon heliographic longitude. When analyzed using solar wind mapping of the large scale interplanetary magnetic field line connection from the corona to the Earth, particle fluxes display an approximately exponential dependence on heliographic longitude. Since variations in the solar wind velocity (and hence the coronal connection longitude) can severely distort the simple coronal injection profile, the use of real-time solar wind velocity measurements can be of great aid in predicting the decay of solar particle events. Although such exponential injection profiles are commonplace during 1973-1975, they have also been identified earlier in Solar Cycle 20, and hence this structure may be present during the rise and maximum of the cycle, but somewhat obscured by greater temporal variations in particle injection.

  6. Constraining cosmology and ionization history with combined 21 cm power spectrum and global signal measurements

    CERN Document Server

    Liu, Adrian

    2015-01-01

    Improvements in current instruments and the advent of next-generation instruments will soon push observational 21 cm cosmology into a new era, with high significance measurements of both the power spectrum and the mean ("global") signal of the 21 cm brightness temperature. In this paper we use the recently commenced Hydrogen Epoch of Reionization Array as a worked example to provide forecasts on astrophysical and cosmological parameter constraints. In doing so we improve upon previous forecasts in a number of ways. First, we provide updated forecasts using the latest best-fit cosmological parameters from the Planck satellite, exploring the impact of different Planck datasets on 21 cm experiments. We also show that despite the exquisite constraints that other probes have placed on cosmological parameters, the remaining uncertainties are still large enough to have a non-negligible impact on upcoming 21 cm data analyses. While this complicates high-precision constraints on reionization models, it provides an ave...

  7. A brief history of lipid and lipoprotein measurements and their contribution to clinical chemistry.

    Science.gov (United States)

    McNamara, Judith R; Warnick, G Russell; Cooper, Gerald R

    2006-07-23

    The study of modern lipid chemistry began in the 17th and 18th centuries with early observations by Robert Boyle, Poulletier de la Salle, Antoine François de Fourcroy and others. The 19th century chemist, Chevreul, identified several fatty acids, suggested the name 'cholesterine' for the fatty substance in gallstones, coined the word 'glycerine', and showed that fats were comprised of glycerol and fatty acids. The 20th century brought many advances in the understanding of lipoprotein structure and function, and explored relationships between lipoproteins and disease states. The development of the ultracentrifuge and other lipoprotein separation techniques, and reagents for accurate, standardized quantitative measurement have steadily increased our understanding of the important role of lipoprotein metabolism in both healthy and disease states.

  8. The WiggleZ Dark Energy Survey: Joint measurements of the expansion and growth history at z < 1

    CERN Document Server

    Blake, Chris; Colless, Matthew; Contreras, Carlos; Couch, Warrick; Croom, Scott; Croton, Darren; Davis, Tamara; Drinkwater, Michael J; Forster, Karl; Gilbank, David; Gladders, Mike; Glazebrook, Karl; Jelliffe, Ben; Jurek, Russell J; Li, I-hui; Madore, Barry; Martin, Chris; Pimbblet, Kevin; Poole, Gregory B; Pracy, Michael; Sharp, Rob; Wisnioski, Emily; Woods, David; Wyder, Ted; Yee, Howard

    2012-01-01

    We perform a joint determination of the distance-redshift relation and cosmic expansion rate at redshifts z = 0.44, 0.6 and 0.73 by combining measurements of the baryon acoustic peak and Alcock-Paczynski distortion from galaxy clustering in the WiggleZ Dark Energy Survey, using a large ensemble of mock catalogues to calculate the covariance between the measurements. Further combining our results with other baryon acoustic oscillation and distant supernovae datasets, we use a Monte Carlo Markov Chain technique to determine the evolution of the Hubble parameter H(z) as a stepwise function in 9 redshift bins of width dz = 0.1, also marginalizing over the spatial curvature. Our measurements of H(z), which have precision better than 7% in most redshift bins, are consistent with the expansion history predicted by a cosmological-constant dark-energy model, in which the expansion accelerates at redshift z < 0.7. We also measure the normalized cosmic growth rate at z = 0.44, 0.6 and 0.73, together with its covarian...

  9. A brief history of the solar diameter measurements: a critical quality assessment of the existing data

    CERN Document Server

    Rozelot, Jean Pierre; Kilcik, Ali

    2016-01-01

    The size of the diameter of the Sun has been debated for a very long time. First tackled by the Greek astronomers from a geometric point of view, an estimate, although incorrect, was determined and not truly called into question for several centuries. The French school of astronomy, under the impetus of Mouton and Picard in the XVIIth century, can be considered a pioneer in this issue. It was followed by the German school at the end of the XIXth century, whose works led to a canonical value established at 959".63 (seconds of arc). A number of ground-based observations have been made in the second half of the XIXth century, leading to controversial results, mainly due to the difficulty of disentangling solar and atmospheric effects. Dedicated space measurements yield a very faint dependence of the solar diameter on time. New studies over the entire radiation spectrum lead to a clear relationship between the solar diameter and the wavelength, reflecting the height at which the lines are formed. Th...

  10. Aqueous history of Mars as inferred from landed mission measurements of rocks, soils, and water ice

    Science.gov (United States)

    Arvidson, Raymond E.

    2016-09-01

    The missions that have operated on the surface of Mars acquired data that complement observations from orbit and provide information that could not have been obtained without surface measurements. Data from the Viking Landers demonstrated that soils have basaltic compositions, containing minor amounts of salts and one or more strong oxidants. Pathfinder with its rover confirmed that the distal portion of Ares Vallis is the site of flood-deposited boulders. Spirit found evidence for hydrothermal deposits surrounding the Home Plate volcaniclastic feature. Opportunity discovered that the hematite signature on Meridiani Planum as seen from orbit is due to hematitic concretions concentrated on the surface as winds eroded the sulfate-rich sandstones that dominate the Burns formation. The sandstones originated as playa muds that were subsequently reworked by wind and rising groundwater. Opportunity also found evidence on the rim of the Noachian Endeavour Crater for smectites, with extensive leaching along fractures. Curiosity acquired data at the base of Mount Sharp in Gale Crater that allow reconstruction of a sustained fluvial-deltaic-lacustrine system prograding into the crater. Smectites and low concentrations of chlorinated hydrocarbons have been identified in the lacustrine deposits. Phoenix, landing above the Arctic Circle, found icy soils along with low concentrations of perchlorate salt. Perchlorate is considered a strong candidate for the oxidant found by the Viking Landers. It is also a freezing-point depressant and may play a role in allowing brines to exist at and beneath the surface in more recent periods on Mars.

  11. [The history of adverse drug reactions, relief for the related health damage, and safety measures in Japan].

    Science.gov (United States)

    Takahashi, Haruo

    2009-01-01

    The first remarkable adverse drug reaction (ADR) reported in Japan was anaphylactic shock caused by penicillin. Although intradermal testing for antibiotics had long been used as a method of predicting anaphylactic shock, it was discontinued in 2004 because of a lack of predictive evidence. The malformation of limbs, etc., caused by thalidomide was a global problem, and thalidomide was withdrawn from the market. Teratogenicity testing during new drug development has been implemented since 1963. Chinoform (clioquinol)-iron chelate was detected in the green tongue and green urine of patients with subacute myelo-optic neuropathy (SMON) and identified as a causal material of SMON in 1970. Chinoform was withdrawn from the market, and a fund for relief of the health damage caused by ADRs was established in 1979. The co-administration of sorivudine and fluorouracil anticancer agents induced fatal agranulocytosis, and sorivudine was withdrawn from the market after being on sale for one month in 1993. The guidelines for package inserts were corrected on this occasion, and early-phase pharmacovigilance of new drugs was introduced later. Since acquired immune deficiency syndrome and hepatitis B and C were transmitted via virus-contaminated blood products, the Ministry of Health, Labour and Welfare tightened regulations regarding biological products in 2003, and a fund for relief of health damage caused by infections derived from biological products was established in 2004. The other remarkable ADRs were quadriceps contracture induced by the repeated administration of intramuscular injection products and Creutzfeldt-Jakob disease caused by the transplantation of human dry cranial dura mater, etc. The significance of safety measures for drugs based on experiences related to ADRs is worthy of notice. New drugs are approved based on a benefit-risk assessment, if the expected therapeutic benefits outweigh the possible risks associated with treatment. Since unexpected, rare and serious

  12. Spall measurements in shock-loaded hemispherical shells from free-surface velocity histories. [Dynamic fracture of hemishells of copper and tantalum]

    Energy Technology Data Exchange (ETDEWEB)

    Cagliostro, D.J.; Warnes, R.H.; Johnson, N.L.; Fujita, R.K.

    1987-01-01

    Copper and tantalum hemishells are externally loaded by a hemishell of PBX 9501 detonated at its pole. Free-surface velocity histories of the metal hemishells are measured at the pole and at 50° from the pole with a Fabry-Perot interferometer. These histories are used to determine spall strengths and depths by simple wave-interaction analyses and are compared with hydrocode (CAVEAT) predictions using simple and void-growth spall models. 8 refs., 4 figs., 1 tab.

  13. Calibration of a T-History calorimeter to measure enthalpy curves of phase change materials in the temperature range from 40 to 200 °C

    Science.gov (United States)

    Rathgeber, Christoph; Schmit, Henri; Hennemann, Peter; Hiebler, Stefan

    2014-03-01

    Thermal energy storage using phase change materials (PCMs) provides high storage capacities in small temperature ranges. For the design of efficient latent heat storage, the enthalpy curve of a PCM has to be measured with high precision. Measurements are most commonly performed with differential scanning calorimetry (DSC). The T-History method, however, has proved favourable for the characterization of typical PCMs due to its large samples and a measuring procedure close to the conditions found in applications. As T-History calorimeters are usually individual constructions, a careful calibration procedure is essential to ensure optimal measuring accuracy. We report in this paper on the calibration of a T-History calorimeter with a working range from 40 to 200 °C that was designed and built at our institute. A three-part procedure, consisting of an indium calibration, a measurement of the specific heat of copper, and measurements of three solid-liquid PCMs (stearic acid, dimethyl terephthalate and D-mannitol), was performed, and an advanced procedure for the correction of enthalpy curves was developed. When comparing T-History enthalpy curves to literature data and DSC step measurements, good agreement within the uncertainty limits demanded by RAL testing specifications was obtained. Thus, our design of a T-History calorimeter together with the developed calibration procedure provides the measuring accuracy required to identify the most suitable PCM for a given application. In addition, the dependence of the enthalpy curve on sample size can be analysed by comparing results obtained with T-History and DSC, and the behaviour of the bulk material in real applications can be predicted.
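
    For orientation, the core of a T-History evaluation, inferring the PCM enthalpy curve by comparing its cooling curve with that of a reference of known heat capacity, can be sketched as below. This is a simplified toy version (single lumped heat-loss coefficient, tube heat capacity neglected, hypothetical names and data), not the calibrated procedure of the paper:

```python
# Simplified sketch of the T-History evaluation principle. A PCM tube and a
# reference tube cool in the same ambient; assuming both share the same
# lumped heat-loss coefficient U*A (and neglecting tube heat capacity), the
# reference's known sensible heat calibrates U*A, which then converts the
# PCM's cooling curve into an enthalpy curve. Names and data are hypothetical.
import numpy as np

def enthalpy_curve(t, T_pcm, T_ref, T_amb, cp_ref, m_ref, m_pcm):
    """Return cumulative specific enthalpy released by the PCM (J/g)."""
    dt = np.diff(t)
    # Calibrate U*A from the reference: m*cp*(T_start - T_end) = U*A * area
    area_ref = np.sum((T_ref[:-1] - T_amb) * dt)                # K*s
    ua = m_ref * cp_ref * (T_ref[0] - T_ref[-1]) / area_ref     # W/K
    # Heat released by the PCM in each time step, then accumulated
    dq = ua * (T_pcm[:-1] - T_amb) * dt                         # J per step
    return np.concatenate(([0.0], np.cumsum(dq))) / m_pcm       # J/g

# Tiny synthetic demo: exponential cooling of a 20 g water reference and a
# 20 g PCM sample from 80 degC towards a 25 degC ambient.
t = np.linspace(0.0, 3600.0, 361)
T_ref = 25.0 + 55.0 * np.exp(-t / 900.0)
T_pcm = 25.0 + 55.0 * np.exp(-t / 1500.0)  # slower decay mimics latent heat
h = enthalpy_curve(t, T_pcm, T_ref, 25.0, cp_ref=4.18, m_ref=20.0, m_pcm=20.0)
print(round(h[-1], 1), "J/g released")
```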

  14. Validation of dietary history method in a group of elderly women using measurements of total energy expenditure.

    NARCIS (Netherlands)

    Visser, M.; Groot, de C.P.G.M.; Deurenberg, P.; Staveren, van W.A.

    1995-01-01

    The objective of the present study was to validate energy intake data, obtained by dietary history, in twelve elderly women aged 69–82 years. Energy and protein intakes were obtained using the dietary history method with a reference period of 30 d. Reported energy intake was compared with total

  15. Simple measurements reveal the feeding history, the onset of reproduction, and energy conversion efficiencies in captive bluefin tuna

    Science.gov (United States)

    Jusup, Marko; Klanjšček, Tin; Matsuda, Hiroyuki

    2014-11-01

    We present a numerical approach that, in conjunction with a fully set up Dynamic Energy Budget (DEB) model, aims at consistently approximating the feeding history of cultivated fish from commonly measured aquaculture data (body length, body mass, or the condition factor). We demonstrate the usefulness of the approach by validating a DEB-based model for Pacific bluefin tuna (Thunnus orientalis) on an independent dataset and exploring the implied bioenergetics of this species in captivity. In the context of validation, the results indicate that the model successfully accounts for more than 75% of the variance in actual fish feed. At the 5% significance level, predictions neither underestimate nor overestimate observations, and there is no bias. The overall model accuracy of 87.6% is satisfactory. In the context of tuna bioenergetics, we offer an explanation as to why the first reproduction in the examined case occurred only after the fish reached seven years of age, whereas it takes five years in the wild and sometimes as little as three years in captivity. Finally, we calculate energy conversion efficiencies and the supply stress throughout the entire lifetime to theoretically underpin the relatively low contribution of growth to aerobic metabolism implied by respirometry and the high feed conversion ratio observed in bluefin tuna aquaculture.

  16. Issues in defining and measuring veteran community reintegration: Proceedings of the Working Group on Community Reintegration, VA Rehabilitation Outcomes Conference, Miami, Florida

    Directory of Open Access Journals (Sweden)

    Linda Resnik, PT, PhD

    2012-02-01

    In January 2010, the Department of Veterans Affairs (VA) Rehabilitation Research and Development Service convened a State of the Art (SOTA) conference to advance the field of outcome measurement for rehabilitation-related studies. This article reports on the proceedings of the SOTA Working Group on Community Reintegration. We explored the use of the International Classification of Functioning, Disability and Health as a theoretical framework for measuring community reintegration; identified key dimensions of community reintegration that could and/or should be measured; discussed challenges in measuring community reintegration; suggested steps to enhance community reintegration measurement; proposed future research that focuses on outcomes measures for community reintegration and the study of community reintegration outcomes; and made policy recommendations that would facilitate community reintegration research within the VA.

  17. Issues in defining and measuring veteran community reintegration: proceedings of the Working Group on Community Reintegration, VA Rehabilitation Outcomes Conference, Miami, Florida.

    Science.gov (United States)

    Resnik, Linda; Bradford, Daniel W; Glynn, Shirley M; Jette, Alan M; Johnson Hernandez, Caitlin; Wills, Sharon

    2012-01-01

    In January 2010, the Department of Veterans Affairs (VA) Rehabilitation Research and Development Service convened a State of the Art (SOTA) conference to advance the field of outcome measurement for rehabilitation-related studies. This article reports on the proceedings of the SOTA Working Group on Community Reintegration. We explored the use of the International Classification of Functioning, Disability and Health as a theoretical framework for measuring community reintegration; identified key dimensions of community reintegration that could and/or should be measured; discussed challenges in measuring community reintegration; suggested steps to enhance community reintegration measurement; proposed future research that focuses on outcomes measures for community reintegration and the study of community reintegration outcomes; and made policy recommendations that would facilitate community reintegration research within the VA.

  18. IgH-V(D)J NGS-MRD measurement pre- and early post-allotransplant defines very low- and very high-risk ALL patients.

    Science.gov (United States)

    Pulsipher, Michael A; Carlson, Chris; Langholz, Bryan; Wall, Donna A; Schultz, Kirk R; Bunin, Nancy; Kirsch, Ilan; Gastier-Foster, Julie M; Borowitz, Michael; Desmarais, Cindy; Williamson, David; Kalos, Michael; Grupp, Stephan A

    2015-05-28

    Positive detection of minimal residual disease (MRD) by multichannel flow cytometry (MFC) prior to hematopoietic cell transplantation (HCT) of patients with acute lymphoblastic leukemia (ALL) identifies patients at high risk for relapse, but many pre-HCT MFC-MRD-negative patients also relapse, and the predictive power of MFC-MRD early post-HCT is poor. To test whether the increased sensitivity of next-generation sequencing (NGS)-MRD better identifies pre- and post-HCT relapse risk, we performed immunoglobulin heavy chain (IgH) variable, diversity, and joining (V[D]J) DNA sequence-based NGS-MRD testing on 56 patients with B-cell ALL enrolled in Children's Oncology Group trial ASCT0431. NGS-MRD predicted relapse and survival more accurately than MFC-MRD, and pre-HCT NGS-MRD detection was better at predicting relapse than MFC-MRD (NGS-MRD-positive relapse rate, 67%; P = .004). Any post-HCT NGS positivity resulted in an increase in relapse risk by multivariate analysis (hazard ratio, 7.7; P = .05). Absence of detectable IgH-V(D)J NGS-MRD pre-HCT defines good-risk patients potentially eligible for less intense treatment approaches. Post-HCT NGS-MRD is highly predictive of relapse and survival, suggesting a role for this technique in defining patients early who would be eligible for post-HCT interventions. The trial was registered at www.clinicaltrials.gov as #NCT00382109.

  19. Definable deduction relation

    Institute of Scientific and Technical Information of China (English)

    张玉平

    1999-01-01

    The nonmonotonic deduction relation in default reasoning is defined in fixed-point style and has the many-extension property, which classical logic does not possess. Both kinds of deduction have the boolean definability property, that is, their extensions or deductive closures can be defined by boolean formulas. A generalized form of the fixed-point method is employed to define a class of deduction relations, all of which have the above property. Theorems on definability and atomless boolean algebras in model theory are essential in dealing with this assertion.

  20. Defining the Most Accurate Measurable Dimension(s) of the Liver in Predicting Liver Volume Based on CT Volumetry and Reconstruction

    Directory of Open Access Journals (Sweden)

    Reza Saadat Mostafavi

    2010-05-01

    Background/Objective: Liver volume has a great effect on the diagnosis and management of different diseases, such as lymphoproliferative conditions. Patients and Methods: Abdominal CT scans of 100 patients without any findings of liver disease (in history and imaging) were subjected to volumetry and reconstruction. Along with the liver volume, in the axial series, the AP diameter of the left lobe (in the midline) and right lobe (mid-clavicular line), the lateral maximum diameter of the liver in the mid-axillary line, and the maximum diameter to the IVC were calculated. In the coronal mid-axillary and sagittal mid-clavicular planes, maximum superior-inferior dimensions were calculated, along with their various combinations (multiplying). Regression analyses between dimensions and volume were performed. Results: The most accurate combination was the superior-inferior sagittal dimension multiplied by the AP diameter of the right lobe (R² = 0.78, P < 0.001), and the most accurate solitary dimension was the lateral dimension to the IVC in the axial plane (R² = 0.57, P < 0.001), with an interval of 9-11 cm covering 68% of normal subjects. Conclusion: We recommend the lateral maximum diameter of the liver from the surface to the IVC in the axial plane, in ultrasound, for liver volume prediction, with an interval of 9-11 cm for 68% of normal subjects; values outside this range are regarded as abnormal.

  1. Intake of ruminant trans-fatty acids, assessed by diet history interview, and changes in measured body size, shape and composition

    DEFF Research Database (Denmark)

    Hansen, Camilla P; Heitmann, Berit L; Sørensen, Thorkild IA

    2016-01-01

    composition (body fat percentage). DESIGN: A 6-year follow-up study. Information on dietary intake was collected through diet history interviews, and anthropometric and bioelectrical impedance measurements were obtained by trained technicians at baseline (1987-1988) and at follow-up (1993-1994). Multiple...

  2. Measuring children's self-reported sport participation, risk perception and injury history: development and validation of a survey instrument.

    Science.gov (United States)

    Siesmaa, Emma J; Blitvich, Jennifer D; White, Peta E; Finch, Caroline F

    2011-01-01

    Despite the health benefits associated with children's sport participation, the occurrence of injury in this context is common. The extent to which sport injuries impact children's ongoing involvement in sport is largely unknown. Surveys have been shown to be useful for collecting children's injury and sport participation data; however, there are currently no published instruments which investigate the impact of injury on children's sport participation. This study describes the processes undertaken to assess the validity of two survey instruments for collecting self-reported information about child cricket and netball related participation, injury history and injury risk perceptions, as well as the reliability of the cricket-specific version. Face and content validity were assessed through expert feedback from primary and secondary level teachers and from representatives of peak sporting bodies for cricket and netball. Test-retest reliability was measured using a sample of 59 child cricketers who completed the survey on two occasions, 3-4 weeks apart. Based on expert feedback relating to face and content validity, modification and/or deletion of some survey items was undertaken. Survey items with low test-retest reliability (κ≤0.40) were modified or deleted, items with moderate reliability (κ=0.41-0.60) were modified slightly and items with higher reliability (κ≥0.61) were retained, with some undergoing minor modifications. This is the first survey of its kind which has been successfully administered to cricketers aged 10-16 years to collect information about injury risk perceptions and intentions for continued sport participation. Implications for its generalisation to other child sport participants are discussed.
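
    The kappa-based item screening described above is straightforward to express in code. A sketch using scikit-learn's Cohen's kappa, with invented item names and responses:

```python
# Sketch of the test-retest item screening described above: compute Cohen's
# kappa for each survey item across the two administrations and sort items
# into the stated action bands (modify or delete if kappa <= 0.40, modify
# slightly if 0.41-0.60, retain if >= 0.61). All responses are invented.
from sklearn.metrics import cohen_kappa_score

def triage(kappa):
    if kappa <= 0.40:
        return "modify or delete"
    elif kappa <= 0.60:
        return "modify slightly"
    return "retain"

# item -> (responses at time 1, responses at time 2), hypothetical data
items = {
    "injury_last_season": ([1, 0, 1, 1, 0, 1], [1, 0, 1, 0, 0, 1]),
    "intends_to_continue": ([1, 1, 1, 0, 1, 1], [1, 1, 1, 0, 1, 1]),
}

for name, (t1, t2) in items.items():
    k = cohen_kappa_score(t1, t2)
    print(f"{name}: kappa={k:.2f} -> {triage(k)}")
```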

  3. Identification of cut-points in commonly used hip osteoarthritis-related outcome measures that define the patient acceptable symptom state (PASS).

    Science.gov (United States)

    Emerson Kavchak, Alicia J; Cook, Chad; Hegedus, Eric J; Wright, Alexis A

    2013-11-01

    To determine patient acceptable symptom state (PASS) estimates for outcome measures commonly used in hip osteoarthritis (OA), that is, to identify cut-points on these measures associated with patient satisfaction with their current state of health. As part of a randomized controlled trial, 70 patients with a clinical diagnosis of hip OA undergoing a 9-session physiotherapy treatment program completed four physical performance measures and three self-report measures at 9 weeks and 1 year. Upon completion of treatment, patients assessed their current health status according to the PASS question. Cut-points were estimated using receiver operating characteristic curves (anchor-based method), based on the patient's response to the PASS question. At 9 weeks and 1 year, identified cut-points were, respectively, ≤10 and ≤11 for the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain subscale; ≤35 and ≤40 on the WOMAC physical function subscale; ≥+5 and ≥+6 on the global rating of change score; ≤6.05 and ≤5.30 s for the timed-up-and-go; ≤28.3 and ≤24.9 for the 40-m self-paced walk test; ≥11 and ≥12 repetitions for the 30-s chair stand test; and ≥46 repetitions for the 20-cm step test. Initial target cut-points signaling patient satisfaction with their current symptom state following physiotherapy in patients with hip osteoarthritis were determined for seven outcome measures over 1 year.
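
    The anchor-based cut-point estimation described above can be sketched with an ROC curve and the Youden index; whether Youden's index was this study's exact criterion is an assumption here, and the data below are invented:

```python
# Sketch of anchor-based cut-point estimation: find the outcome-measure value
# that best separates patients answering "yes" vs "no" to the PASS anchor
# question, using an ROC curve and the Youden index. The use of the Youden
# index is an assumption for illustration, and the data are invented.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
satisfied = rng.integers(0, 2, 70)                 # PASS yes/no anchor
womac_pain = np.where(satisfied == 1,
                      rng.normal(8, 3, 70),        # satisfied: lower pain
                      rng.normal(14, 3, 70))

# Lower WOMAC pain indicates a better state, so score with the negated value.
fpr, tpr, thresholds = roc_curve(satisfied, -womac_pain)
youden = tpr - fpr
cut = -thresholds[np.argmax(youden)]
print(f"estimated PASS cut-point: WOMAC pain <= {cut:.1f}")
```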

  4. Historically defined autobiographical periods

    DEFF Research Database (Denmark)

    Brown, Norman R.; Hansen, Tia G. B.; Lee, Peter J.;

    2012-01-01

    The chapter reviews a research programme that has demonstrated the existence of historically defined autobiographical periods and identified the conditions that bring them about. Data from four samples of World War II-generation adults show that historically defined autobiographical periods endure over time. Theoretical implications are discussed, notably by introducing a new approach to autobiographical memory, Transition Theory, which assumes that autobiographical memory is organized by transitional events that can be self-initiated or externally imposed - historically defined...

  5. Defining the effect and mediators of two knowledge translation strategies designed to alter knowledge, intent and clinical utilization of rehabilitation outcome measures: a study protocol [NCT00298727]

    Directory of Open Access Journals (Sweden)

    Law Mary

    2006-07-01

    Background: A substantial number of valid outcome measures have been developed to measure health in adult musculoskeletal and childhood disability. Regrettably, national initiatives have merely resulted in changes in attitude, while utilization remains unacceptably low. This study will compare the effectiveness and mediators of two different knowledge transfer (KT) interventions in terms of their impact on changing knowledge and behavior (utilization and clinical reasoning) related to health outcome measures. Method/Design: Physical and occupational therapists (n = 144) will be recruited in partnership with the national professional associations to evaluate two different KT interventions with the same curriculum: 1) a Stakeholder-Hosted Interactive Problem-Based Seminar (SHIPS), and 2) an Online Problem-Based course (e-PBL). SHIPS will consist of face-to-face problem-based learning (PBL) for 2 1/2 days with outcome measure developers as facilitators, using six problems generated in consultation with participants. The e-PBL will consist of a 6-week web-based course with six generic problems developed by content experts. SHIPS will be conducted in three urban centers in Canada. Participants will be block-allocated by a minimization procedure to either of the two interventions to minimize any prognostic differences. Trained evaluators at each site will conduct chart audits and chart-stimulated recall. Trained interviewers will conduct semi-structured interviews focused on identifying critical elements in KT and implementing practice changes. Interviews will be transcribed verbatim. Baseline predictors including demographics, knowledge, attitudes/barriers regarding outcome measures, and Readiness to Change will be assessed by self-report. Immediately post-intervention and 6 months later, these will be re-administered. Primary qualitative and quantitative evaluations will be conducted 6 months post-intervention to assess the relative effectiveness of KT

  6. Association of adiposity, measured by skinfold thickness, with parental history of diabetes in a South Indian population: data from CURES-114.

    Science.gov (United States)

    Surendar, J; Indulekha, K; Deepa, M; Mohan, V; Pradeepa, R

    2016-07-01

    To look at the association of central and peripheral skinfold thickness with parental history of diabetes in subjects without diabetes. Subjects with no parental history of diabetes (n=1132), subjects with one parent with diabetes (n=271) and subjects with both parents with diabetes (n=51) were recruited from the Chennai Urban Rural Epidemiological Study (CURES) conducted between 2001 and 2003. Biceps, triceps, medial calf, mid-thigh, chest, abdomen, mid-axillary, suprailiac and subscapular sites were measured with Lange skinfold callipers. Trunk fat measurements, such as the chest (p=0.020), mid-axillary (p=0.005), suprailiac (p=0.014) and subscapular (p = …) skinfolds, were highest in subjects with both parents with diabetes, followed by those with one parent with diabetes, and lowest in those with no parental history of diabetes. However, the peripheral fat measurements, ie, biceps, triceps, medial calf and mid-thigh, were not significantly different between the study groups. Total truncal and peripheral fat skinfold thicknesses showed a significant positive association with other indices of obesity such as body mass index (BMI) and waist circumference (in relation to trunk fat, BMI: r=0.748, p = …). In conclusion, there was a significant association between trunk fat, measured by skinfold thickness, and parental history of diabetes among subjects without diabetes in this urban South Indian population. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  7. Defining adolescent common mental disorders using electronic primary care data: a comparison with outcomes measured using the CIS-R

    OpenAIRE

    Cornish, Rosie; John, Ann; Boyd, Andrew; Tilling, Kate; Macleod, John

    2016-01-01

    Objective To compare the prevalence of common mental disorders (CMD) derived from data held in primary care records to that measured using the revised Clinical Interview Schedule (CIS-R), in order to assess the potential robustness of findings based only on routinely collected data. Design and setting Comparison study using linkage between the Avon Longitudinal Study of Parents and Children (ALSPAC) and electronic primary care records. Participants We studied 1,562 adolescents who had completed th...

  8. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that “For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where “morally wrong” is not conceptually equivalent...

  9. Body mass index and measures of body fat for defining obesity and underweight: a cross-sectional, population-based study

    OpenAIRE

    Pasco, Julie A; Holloway, Kara L; Dobbins, Amelia G; Kotowicz, Mark A; Williams, Lana J; Brennan, Sharon L

    2014-01-01

    Background The body mass index (BMI) is commonly used as a surrogate marker for adiposity. However, the BMI indicates weight-for-height without considering differences in body composition and the contribution of body fat to overall body weight. The aim of this cross-sectional study was to identify sex-and-age-specific values for percentage body fat (%BF), measured using whole body dual energy x-ray absorptiometry (DXA), that correspond to BMI 18.5 kg/m2 (threshold for underweight), 25.0 kg/m2...
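    One way to make the stated aim concrete is to regress DXA-measured %BF on BMI within a sex/age stratum and read off the %BF values corresponding to the BMI thresholds. The sketch below does this on synthetic data; the linear form and all numbers are assumptions, not the study's results.

```python
# Sketch: map BMI thresholds to corresponding %BF values by linear
# regression of %BF on BMI within one (synthetic) stratum.
import numpy as np

rng = np.random.default_rng(5)
bmi = rng.uniform(17, 35, 300)
pbf = 1.4 * bmi - 5 + rng.normal(0, 3, 300)   # toy %BF-BMI relation

slope, intercept = np.polyfit(bmi, pbf, 1)
for threshold in (18.5, 25.0, 30.0):          # underweight/overweight/obesity
    print(f"BMI {threshold}: %BF ~ {slope * threshold + intercept:.1f}")
```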

  10. Longitudinal measures of cognition in the Ts65Dn mouse: Refining windows and defining modalities for therapeutic intervention in Down syndrome.

    Science.gov (United States)

    Olmos-Serrano, J Luis; Tyler, William A; Cabral, Howard J; Haydar, Tarik F

    2016-05-01

    Mouse models have provided insights into adult changes in learning and memory in Down syndrome, but an in-depth assessment of how these abnormalities develop over time has never been conducted. To address this shortcoming, we conducted a longitudinal behavioral study from birth until late adulthood in the Ts65Dn mouse model to measure the emergence and continuity of learning and memory deficits in individuals with a broad array of tests. Our results demonstrate for the first time that the pace at which neonatal and perinatal milestones are acquired is correlated with later cognitive performance as an adult. In addition, we find that life-long behavioral indexing stratifies mice within each genotype. Our expanded assessment reveals that diminished cognitive flexibility, as measured by reversal learning, is the most robust learning and memory impairment in both young and old Ts65Dn mice. Moreover, we find that reversal learning degrades with age and is therefore a useful biomarker for studying age-related decline in cognitive ability. Altogether, our results indicate that preclinical studies aiming to restore cognitive function in Ts65Dn should target both neonatal milestones and reversal learning in adulthood. Here we provide the quantitative framework for this type of approach.

  11. Sleep Health: Can We Define It? Does It Matter?

    Science.gov (United States)

    Buysse, Daniel J.

    2014-01-01

    Good sleep is essential to good health. Yet for most of its history, sleep medicine has focused on the definition, identification, and treatment of sleep problems. Sleep health is a term that is infrequently used and even less frequently defined. It is time for us to change this. Indeed, pressures in the research, clinical, and regulatory environments require that we do so. The health of populations is increasingly defined by positive attributes such as wellness, performance, and adaptation, and not merely by the absence of disease. Sleep health can be defined in such terms. Empirical data demonstrate several dimensions of sleep that are related to health outcomes, and that can be measured with self-report and objective methods. One suggested definition of sleep health and a description of self-report items for measuring it are provided as examples. The concept of sleep health synergizes with other health care agendas, such as empowering individuals and communities, improving population health, and reducing health care costs. Promoting sleep health also offers the field of sleep medicine new research and clinical opportunities. In this sense, defining sleep health is vital not only to the health of populations and individuals, but also to the health of sleep medicine itself. Citation: Buysse DJ. Sleep health: can we define it? Does it matter? SLEEP 2014;37(1):9-17. PMID:24470692

  12. Contrasting neogene denudation histories of different structural regions in the transantarctic mountains rift flank constrained by cosmogenic isotope measurements

    NARCIS (Netherlands)

    Wateren, F.M. van der; Dunai, T.J.; Balen, R.T. van; Klas, W.; Verbers, A.L.L.M.; Passchier, S.; Herpers, U.

    1999-01-01

    Separate regions within the Transantarctic Mountains, the uplifted flank of the West Antarctic rift system, appear to have distinct Neogene histories of glaciation and valley downcutting. Incision of deep glacial outlet valleys occurred at different times throughout central and northern Victoria Land...

  13. Book History and Ideological Hierarchies

    Science.gov (United States)

    Dilevko, Juris; Dali, Keren

    2006-01-01

    The evolving field of Book History has had difficulty in integrating the experiences of immigrant culture. In explaining the origins of print culture in North America, Book History has a tendency to associate lowbrow with immigrants and their struggles to establish a foothold in a new land. Book History therefore symbolically defines immigrant…

  14. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

    A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception - and a proposal of a positive list of essential, but non-exclusive characteristics of documentary film.

  15. FDG-PET Response Prediction in Pediatric Hodgkin’s Lymphoma: Impact of Metabolically Defined Tumor Volumes and Individualized SUV Measurements on the Positive Predictive Value

    Energy Technology Data Exchange (ETDEWEB)

    Hussien, Amr Elsayed M. [Department of Nuclear Medicine (KME), Forschungszentrum Jülich, Medical Faculty, Heinrich-Heine-University Düsseldorf, Jülich, 52426 (Germany); Department of Nuclear Medicine, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany); Furth, Christian [Department of Radiology and Nuclear Medicine, Medical School, Otto-von-Guericke University Magdeburg, Magdeburg, 39120 (Germany); Schönberger, Stefan [Department of Pediatric Oncology, Hematology and Clinical Immunology, University Children’s Hospital, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany); Hundsdoerfer, Patrick [Department of Pediatric Oncology and Hematology, Charité Campus Virchow, Humboldt-University Berlin, Berlin, 13353 (Germany); Steffen, Ingo G.; Amthauer, Holger [Department of Radiology and Nuclear Medicine, Medical School, Otto-von-Guericke University Magdeburg, Magdeburg, 39120 (Germany); Müller, Hans-Wilhelm; Hautzel, Hubertus, E-mail: h.hautzel@fz-juelich.de [Department of Nuclear Medicine (KME), Forschungszentrum Jülich, Medical Faculty, Heinrich-Heine-University Düsseldorf, Jülich, 52426 (Germany); Department of Nuclear Medicine, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany)

    2015-01-28

    Background: In pediatric Hodgkin’s lymphoma (pHL) early response-to-therapy prediction is metabolically assessed by (18)F-FDG PET, carrying an excellent negative predictive value (NPV) but an impaired positive predictive value (PPV). The aim of this study was to improve the PPV while keeping the optimal NPV. A comparison of different PET data analyses was performed applying individualized standardized uptake values (SUV), PET-derived metabolic tumor volume (MTV) and the product of both parameters, termed total lesion glycolysis (TLG); Methods: One-hundred-eight PET datasets (PET1, n = 54; PET2, n = 54) of 54 children were analysed by visual and semi-quantitative means. SUVmax, SUVmean, MTV and TLG were obtained; the results of both PETs and the relative change from PET1 to PET2 (Δ in %) were compared for their capability of identifying responders and non-responders using receiver operating characteristic (ROC) curves. In consideration of individual variations in noise and contrast levels, all parameters were additionally obtained after threshold correction to lean body mass and background; Results: All semi-quantitative SUV estimates obtained at PET2 were significantly superior to the visual PET2 analysis. However, ΔSUVmax revealed the best results (area under the curve, 0.92; p < 0.001; sensitivity 100%; specificity 85.4%; PPV 46.2%; NPV 100%; accuracy, 87.0%) but was not significantly superior to SUVmax estimation at PET2 and ΔTLGmax. Likewise, the lean body mass and background individualization of the datasets did not improve the results of the ROC analyses; Conclusions: Sophisticated semi-quantitative PET measures in early response assessment of pHL patients do not perform significantly better than the previously proposed ΔSUVmax. All analytical strategies failed to improve the impaired PPV to a clinically acceptable level while preserving the excellent NPV.
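    For readers unfamiliar with the Δ metric evaluated in this record, the sketch below computes the relative change in SUVmax from PET1 to PET2 and scores it with an ROC AUC against a response label; the five-patient dataset is invented for illustration.

```python
# Sketch of the relative-change metric and its ROC evaluation.
import numpy as np
from sklearn.metrics import roc_auc_score

suv1 = np.array([8.2, 6.5, 10.1, 7.4, 9.0])   # SUVmax at PET1 (synthetic)
suv2 = np.array([1.1, 2.9, 1.4, 5.8, 1.2])    # SUVmax at PET2
responder = np.array([1, 1, 1, 0, 1])         # 1 = response to therapy

delta_suvmax = (suv1 - suv2) / suv1 * 100.0   # Δ in % from PET1 to PET2
print("ΔSUVmax (%):", np.round(delta_suvmax, 1))
print("AUC:", roc_auc_score(responder, delta_suvmax))
```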

  16. How to consistently link extraversion and intelligence to the catechol-O-methyltransferase (COMT) gene: on defining and measuring psychological phenotypes in neurogenetic research.

    Science.gov (United States)

    Wacker, Jan; Mueller, Erik M; Hennig, Jürgen; Stemmler, Gerhard

    2012-02-01

    The evidence for associations between genetic polymorphisms and complex behavioral/psychological phenotypes (traits) has thus far been weak and inconsistent. Using the well-studied Val158Met polymorphism of the catechol-O-methyltransferase (COMT) gene as an example, we demonstrate that using theoretical models to guide phenotype definition and measuring the phenotypes of interest with a high degree of specificity reveals strong gene-behavior associations that are consistent with prior work and that would have otherwise gone unnoticed. Only after statistically controlling for irrelevant portions of phenotype variance did we observe strong (Cohen's d = 0.33-0.70) and significant associations between COMT Val158Met and both cognitive and affective traits in a healthy male sample (N = 201) in Study 1: Carriers of the Met allele scored higher in fluid intelligence (reasoning) but lower in both crystallized intelligence (general knowledge) and the agency facet of extraversion. In Study 2, we conceptually replicated the association of COMT Val158Met with the agency facet of extraversion after partialing irrelevant phenotype variance in a female sample (N = 565). Finally, through reanalysis of a large published data set we showed that Met allele carriers also scored higher in indicators of fluid intelligence after partialing verbal fluency. Because the Met allele codes for a less efficient variant of the enzyme COMT, resulting in higher levels of extrasynaptic prefrontal dopamine, these observations provide further support for a role for dopamine in both intelligence and extraversion. More importantly, the present findings have important implications for the definition of psychological phenotypes in neurogenetic research.
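    A rough sketch of the statistical move described here: partial irrelevant variance out of the target trait by regressing it on a nuisance measure, then compare genotype groups on the residuals with Cohen's d. The data, the Met-carrier coding, and the least-squares residualization are illustrative assumptions.

```python
# Sketch: residualize a trait on a nuisance variable, then test a
# genotype association on the residuals (all data synthetic).
import numpy as np

rng = np.random.default_rng(3)
n = 201
met_carrier = rng.integers(0, 2, n)           # hypothetical genotype coding
verbal_fluency = rng.normal(size=n)           # nuisance phenotype
fluid_iq = 0.6 * verbal_fluency + 0.35 * met_carrier + rng.normal(size=n)

beta = np.polyfit(verbal_fluency, fluid_iq, 1)        # least-squares fit
resid = fluid_iq - np.polyval(beta, verbal_fluency)   # partialed trait

g1, g0 = resid[met_carrier == 1], resid[met_carrier == 0]
pooled_sd = np.sqrt((g1.var(ddof=1) + g0.var(ddof=1)) / 2)
print("Cohen's d:", round((g1.mean() - g0.mean()) / pooled_sd, 2))
```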

  17. FDG-PET Response Prediction in Pediatric Hodgkin’s Lymphoma: Impact of Metabolically Defined Tumor Volumes and Individualized SUV Measurements on the Positive Predictive Value

    Directory of Open Access Journals (Sweden)

    Amr Elsayed M. Hussien

    2015-01-01

    Full Text Available Background: In pediatric Hodgkin’s lymphoma (pHL) early response-to-therapy prediction is metabolically assessed by (18)F-FDG PET, carrying an excellent negative predictive value (NPV) but an impaired positive predictive value (PPV). The aim of this study was to improve the PPV while keeping the optimal NPV. A comparison of different PET data analyses was performed applying individualized standardized uptake values (SUV), PET-derived metabolic tumor volume (MTV) and the product of both parameters, termed total lesion glycolysis (TLG); Methods: One-hundred-eight PET datasets (PET1, n = 54; PET2, n = 54) of 54 children were analysed by visual and semi-quantitative means. SUVmax, SUVmean, MTV and TLG were obtained; the results of both PETs and the relative change from PET1 to PET2 (Δ in %) were compared for their capability of identifying responders and non-responders using receiver operating characteristic (ROC) curves. In consideration of individual variations in noise and contrast levels, all parameters were additionally obtained after threshold correction to lean body mass and background; Results: All semi-quantitative SUV estimates obtained at PET2 were significantly superior to the visual PET2 analysis. However, ΔSUVmax revealed the best results (area under the curve, 0.92; p < 0.001; sensitivity 100%; specificity 85.4%; PPV 46.2%; NPV 100%; accuracy, 87.0%) but was not significantly superior to SUVmax estimation at PET2 and ΔTLGmax. Likewise, the lean body mass and background individualization of the datasets did not improve the results of the ROC analyses; Conclusions: Sophisticated semi-quantitative PET measures in early response assessment of pHL patients do not perform significantly better than the previously proposed ΔSUVmax. All analytical strategies failed to improve the impaired PPV to a clinically acceptable level while preserving the excellent NPV.

  18. Counting coalescent histories.

    Science.gov (United States)

    Rosenberg, Noah A

    2007-04-01

    Given a species tree and a gene tree, a valid coalescent history is a list of the branches of the species tree on which coalescences in the gene tree take place. I develop a recursion for the number of valid coalescent histories that exist for an arbitrary gene tree/species tree pair, when one gene lineage is studied per species. The result is obtained by defining a concept of m-extended coalescent histories, enumerating and counting these histories, and taking the special case of m = 1. As a sum over valid coalescent histories appears in a formula for the probability that a random gene tree evolving along the branches of a fixed species tree has a specified labeled topology, the enumeration of valid coalescent histories can considerably reduce the effort required for evaluating this formula.
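    The paper's general recursion is not reproduced in this record, but a frequently quoted special case gives a feel for the growth: for matching caterpillar gene and species trees with n species, the number of coalescent histories is (taken here as an assumption) the (n-1)th Catalan number.

```python
# Sketch of the matching-caterpillar special case: coalescent-history
# counts assumed to follow the Catalan numbers.
from math import comb

def catalan(k: int) -> int:
    """k-th Catalan number, C(2k, k) / (k + 1)."""
    return comb(2 * k, k) // (k + 1)

for n in range(2, 9):
    print(f"n={n} species -> {catalan(n - 1)} coalescent histories")
```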

  19. On-line measurement of the supplied wind capacity for a defined supply area; Online-Ermittlung der eingespeisten Windleistung fuer definierte Versorgungsgebiete

    Energy Technology Data Exchange (ETDEWEB)

    Ensslin, C.; Hoppe-Kilpper, M.; Rohrig, K. [Institut fuer Solare Energieversorgungstechnik (ISET), Kassel (Germany)

    1999-07-01

    Apart from knowing exactly the statistical performance of the supplied wind capacity, the projection of the capacity to be expected in the short and medium term is of increasing importance for the utilisation planning of power plants and the load management of public utilities. Various approaches and methods already exist for the projection of wind capacity. These approaches are based on the capability of artificial neuronal networks to approximate non-linear interrelations and to use noisy, incomplete or even contradictory data. The physical and meteorological interrelations of the problem do not need to be known during the modelling. Based upon the on-line measurement of wind energy supply, the short-term projection for periods of one to four hours, e.g. by means of artificial neuronal networks, is an important first step towards an improved integration of wind energy into the load control and power plant utilisation planning of public utilities with a high share of wind energy. A first study carried out by the ISET (Institute for Solar Energy Supply Technology) showed the basic possibility of predicting the overall capacity of widely distributed joint systems of wind power plants by means of artificial neuronal networks. (orig.)
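    A minimal sketch of the kind of short-horizon projection described above, with a small feed-forward network predicting four steps ahead from lagged power values. The synthetic series, lag count, and network size are assumptions; scikit-learn's MLPRegressor stands in for the artificial neural network.

```python
# Sketch: short-term wind power projection with a small feed-forward net.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
power = np.abs(np.sin(np.linspace(0, 20, 500))) + 0.1 * rng.normal(size=500)

lags, horizon = 6, 4    # use 6 past values to predict 4 steps ahead
X = np.array([power[i:i + lags] for i in range(len(power) - lags - horizon)])
y = power[lags + horizon:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])               # train on the first 400 windows
print("Held-out R^2:", round(model.score(X[400:], y[400:]), 3))
```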

  20. Defining Art Appreciation.

    Science.gov (United States)

    Seabolt, Betty Oliver

    2001-01-01

    Discusses the differences and goals of four areas: (1) art appreciation; (2) art history; (3) art aesthetics; and (4) art criticism. Offers a definition of art appreciation and information on how the view of art appreciation in education has changed over time. (CMK)

  1. History of Science and History of Philologies.

    Science.gov (United States)

    Daston, Lorraine; Most, Glenn W

    2015-06-01

    While both the sciences and the humanities, as currently defined, may be too heterogeneous to be encompassed within a unified historical framework, there is good reason to believe that the history of science and the history of philologies both have much to gain by joining forces. This collaboration has already yielded striking results in the case of the history of science and humanist learning in early modern Europe. This essay argues that first, philology and at least some of the sciences (e.g., astronomy) remained intertwined in consequential ways well into the modern period in Western cultures; and second, widening the scope of inquiry to include other philological traditions in non-Western cultures offers rich possibilities for a comparative history of learned practices. The focus on practices is key; by shifting the emphasis from what is studied to how it is studied, deep commonalities emerge among disciplines--and intellectual traditions--now classified as disparate.

  2. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

    Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon - also in humanist studies - should first of all be defined, i.e. de-lineated and by neat lines limited to a “little box” that can be handled. The following chapter develops t... Human beings can very well understand play - or whatever phenomenon in human life - without defining it... The academic imperative of definition seems to be linked to positivistic attempts - and sometimes produces monstrous definitions. Have they any philosophical value for our knowledge of what play is? Definition is not a universal instrument of knowledge-building, but a culturally specific construction...

  3. Nouns to Define Homophobia

    Directory of Open Access Journals (Sweden)

    Adalberto Campo Arias

    2013-09-01

    Full Text Available Background. The term ‘homophobia’ was introduced in the academic context more than 40 years ago. However, its meaning has changed over time. Objective. To review the nouns used in the last twelve years to define homophobia. Methodology. The authors conducted a systematic search in Medline through Pubmed that included editorials, letters to editors, comments and narrative reviews, in English and Spanish. A qualitative analysis (grounded theory) was applied to analyze nouns used to define homophobia from 2001 through 2012. Results. The authors reviewed three papers including ten nouns used to define homophobia; the most common noun was fear. The terms were grouped into two domains: negative attitude and discomfort with homosexuality. Conclusion. Fear is the most used word to describe homophobia. The terms were grouped into two domains: negative attitude and discomfort toward homosexuality.

  4. On Defining Mass

    Science.gov (United States)

    Hecht, Eugene

    2011-01-01

    Though central to any pedagogical development of physics, the concept of mass is still not well understood. Properly defining mass has proven to be far more daunting than contemporary textbooks would have us believe. And yet today the origin of mass is one of the most aggressively pursued areas of research in all of physics. Much of the excitement…

  5. Defining Data Science

    OpenAIRE

    Zhu, Yangyong; Xiong, Yun

    2015-01-01

    Data science is gaining more and more widespread attention, but no consensus viewpoint on what data science is has emerged. As a new science, its objects of study and scientific issues should not be covered by established sciences. Data in cyberspace have formed what we call datanature. In the present paper, data science is defined as the science of exploring datanature.

  6. Defining Mathematical Giftedness

    Science.gov (United States)

    Parish, Linda

    2014-01-01

    This theoretical paper outlines the process of defining "mathematical giftedness" for the present study on how primary school teaching shapes the mindsets of children who are mathematically gifted. Mathematical giftedness is not a badge of honour or some special value attributed to a child who has achieved something exceptional.…

  7. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    …resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data...

  8. Defining Effective Teaching

    Science.gov (United States)

    Layne, L.

    2012-01-01

    The author looks at the meaning of specific terminology commonly used in student surveys: "effective teaching." The research seeks to determine if there is a difference in how "effective teaching" is defined by those taking student surveys and those interpreting the results. To investigate this difference, a sample group of professors and students…

  9. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

    This article defins game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance...... of a formal definition of game mechanics. Udgivelsesdato: Dec 2008...

  10. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
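    As a toy illustration of the if-trigger-then-action idea, the sketch below binds trigger predicates on file events to actions. The event shape, rule encoding, and actions are invented for illustration and are not the paper's actual notation.

```python
# Sketch of an if-trigger-then-action (IFTA) style rule dispatcher.
from dataclasses import dataclass
from typing import Callable

@dataclass
class FileEvent:
    path: str
    kind: str    # "created" or "modified"

Rule = tuple[Callable[[FileEvent], bool], Callable[[FileEvent], None]]

rules: list[Rule] = [
    (lambda e: e.kind == "created" and e.path.endswith(".h5"),
     lambda e: print(f"index + publish {e.path}")),
    (lambda e: e.kind == "modified",
     lambda e: print(f"re-run analysis on {e.path}")),
]

def dispatch(event: FileEvent) -> None:
    """Fire every rule whose trigger matches the event."""
    for trigger, action in rules:
        if trigger(event):
            action(event)

dispatch(FileEvent("/data/scan_042.h5", "created"))
```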

  11. Defining in Classroom Activities.

    Science.gov (United States)

    Mariotti, Maria Alessandra; Fischbein, Efraim

    1997-01-01

    Discusses some aspects of the defining process in geometrical context in the reference frame of the theory of "figural concepts." Presents analysis of some examples taken from a teaching experiment at the sixth-grade level. Contains 30 references. (Author/ASK)

  12. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

    This article defins game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance...... of a formal definition of game mechanics. Udgivelsesdato: Dec 2008...

  13. Defining, Measuring, and Comparing Organisational Cultures

    NARCIS (Netherlands)

    van den Berg, Peter T.; Wilderom, Celeste P.M.

    2004-01-01

    The literature on organisational culture suffers from an obvious lack of extensive surveys leading to comparative studies. In order to make organisational cultures more comparable, we propose a definition and a set of dimensions. Organisational culture...

  14. Defining and Measuring Dysphagia Following Stroke

    Science.gov (United States)

    Daniels, Stephanie K.; Schroeder, Mae Fern; DeGeorge, Pamela C.; Corey, David M.; Foundas, Anne L.; Rosenbek, John C.

    2009-01-01

    Purpose: To continue the development of a quantified, standard method to differentiate individuals with stroke and dysphagia from individuals without dysphagia. Method: Videofluoroscopic swallowing studies (VFSS) were completed on a group of participants with acute stroke (n = 42) and healthy age-matched individuals (n = 25). Calibrated liquid…

  15. Defining, Measuring, and Comparing Organisational Cultures

    NARCIS (Netherlands)

    Berg, van den Peter T.; Wilderom, Celeste P.M.

    2004-01-01

    The literature on organisational culture suffers from an obvious lack of extensive surveys leading to comparative studies. In order to make organisational cultures more comparable, we propose a definition and a set of dimensions. Organisational culture...

  16. The History of Star Formation in Galaxy Disks in the Local Volume as Measured by the ACS Nearby Galaxy Survey Treasury

    CERN Document Server

    Williams, Benjamin F; Johnson, L C; Weisz, Daniel R; Seth, Anil C; Dolphin, Andrew; Gilbert, Karoline M; Skillman, Evan; Rosema, Keith; Gogarten, Stephanie M; Holtzman, Jon; de Jong, Roelof S

    2011-01-01

    We present a measurement of the age distribution of stars residing in spiral disks and dwarf galaxies. We derive a complete star formation history of the ~140 Mpc^3 covered by the volume-limited sample of galaxies in the Advanced Camera for Surveys (ACS) Nearby Galaxy Survey Treasury (ANGST). The total star formation rate density history is dominated by the large spirals in the volume, although the sample consists mainly of dwarf galaxies. Our measurement shows a factor of ~3 drop at z~2, in approximate agreement with results from other measurement techniques. While our results show that the overall star formation rate density has decreased since z~1, the measured rates during this epoch are higher than those obtained from other measurement techniques. This enhanced recent star formation rate appears to be largely due to an increase in the fraction of star formation contained in low-mass disks at recent times. Finally, our results indicate that despite the differences at recent times, the epoch of formation o...

  17. GBM Accreting Pulsar Histories

    Data.gov (United States)

    National Aeronautics and Space Administration — For each source we plot the history of pulse frequency and pulsed flux measured using the Fermi Gamma-Ray Burst Monitor (GBM) NaI detectors. For these measurements...

  18. Defining the non-profit sector: some lessons from history

    OpenAIRE

    Morris, Susannah

    2000-01-01

    This paper seeks to establish whether the structural-operational definition of the sector, used by the John Hopkins Comparative Non-profit Sector Project (JHCNSP), is universal in its applicability. Historical case studies of primary health care and social housing provision in nineteenth-century England demonstrate that the definition cannot accommodate the institutional diversity of earlier periods and does not produce meaningful sectoral distinctions. The structural-operational definition r...

  19. Canadian History and Cultural History: Thoughts and Notes on a New Departure.

    Science.gov (United States)

    Smith, Allan

    1990-01-01

    Seeks to define the study of Canadian cultural history, tracing the development of cultural history from the Enlightenment to the present. Discusses books on cultural history that had an impact on theories of culture and society. Ties this general discussion of cultural history and its roots to Canadian cultural history. (RW)

  20. Defining the fascial system.

    Science.gov (United States)

    Adstrum, Sue; Hedley, Gil; Schleip, Robert; Stecco, Carla; Yucesoy, Can A

    2017-01-01

    Fascia is a widely used yet indistinctly defined anatomical term that is concurrently applied to the description of soft collagenous connective tissue, distinct sections of membranous tissue, and a body pervading soft connective tissue system. Inconsistent use of this term is causing concern due to its potential to confuse technical communication about fascia in global, multiple discipline- and multiple profession-spanning discourse environments. The Fascia Research Society acted to address this issue by establishing a Fascia Nomenclature Committee (FNC) whose purpose was to clarify the terminology relating to fascia. This committee has since developed and defined the terms a fascia, and, more recently, the fascial system. This article reports on the FNC's proposed definition of the fascial system. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that “For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where “morally wrong” is not conceptually equivalent...... to “harmful”)”. Furthermore, a distinction between six types of legal moralism is made. The six types are grouped according to whether they are concerned with the enforcement of positive or critical morality, and whether they are concerned with criminalising, legally restricting, or refraining from legally...... protecting morally wrong behaviour. This is interesting because not all types of legal moralism are equally vulnerable to the different critiques of legal moralism that have been put forth. Indeed, I show that some interesting types of legal moralism have not been criticised at all....

  2. Define Digital Vernacular

    Institute of Scientific and Technical Information of China (English)

    刘佳; 李海英; James Stevens; Rough Nelson

    2014-01-01

    As science and technology developed, the tools of humans developed from humans’ hands to mechanical and digital technologies. These tools influence almost everything in the humans’ world, and so does the vernacular. The digital vernacular could be understood as applying digital technology to the vernacular, where digital means technologies. It could also be understood as doing the vernacular in a digital way, where digital means data and information; in other words, seeking truth from facts. Defining the digital vernacular is not only about what the digital vernacular is, but also about how to do it and what attitude we should hold toward it. Define digital vernacular as both thinking and doing.

  3. Defining local food

    DEFF Research Database (Denmark)

    Eriksen, Safania Normann

    2013-01-01

    Despite evolving local food research, there is no consistent definition of “local food.” Various understandings are utilized, which have resulted in a diverse landscape of meaning. The main purpose of this paper is to examine how researchers within the local food systems literature define local...... food, and how these definitions can be used as a starting point to identify a new taxonomy of local food based on three domains of proximity....

  4. [To define internet addiction].

    Science.gov (United States)

    Tonioni, Federico

    2013-01-01

    Internet addiction is a new behavioral disorder that is difficult to define, especially when referring to young teenagers who make great use of web-mediated relationships. It is necessary to separate the cases of overt dependency from those in which the abuse of the internet seems to have a different value, offering the only available way to achieve a possible relationship. The internet is mediating a new way of communicating and thinking, and this may favor the onset of clinical phenomena intended to surprise.

  5. Decidability of definability

    CERN Document Server

    Bodirsky, Manuel; Pinsker, Michael; Tsankov, Todor

    2010-01-01

    For a fixed infinite structure $\Gamma$ with finite signature $\tau$, we study the following computational problem: input are quantifier-free first-order $\tau$-formulas $\phi_0,\phi_1,\dots,\phi_n$ that define relations $R_0,R_1,\dots,R_n$ over $\Gamma$. The question is whether the relation $R_0$ is primitive positive definable from $R_1,\ldots,R_n$, i.e., definable by a first-order formula that uses only relation symbols for $R_1,\dots,R_n$, equality, conjunctions, and existential quantification (disjunction, negation, and universal quantification are forbidden). We show decidability of this problem for all structures $\Gamma$ that have a first-order definition in an ordered homogeneous structure $\Delta$ with a finite language whose age is a Ramsey class and determined by finitely many forbidden substructures. Examples of structures $\Gamma$ with this property are the order of the rationals, the random graph, the homogeneous universal poset, the random tournament, all homogeneous universal $C$-relations...

  6. Probing the Expansion history of the Universe by Model-Independent Reconstruction from Supernovae and Gamma-Ray Bursts Measurements

    CERN Document Server

    Feng, Chao-Jun

    2016-01-01

    To probe the late evolution history of the Universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis (PCA) and the other is built by taking the multidimensional scaling (MDS) approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These bases are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt (LM) technique and the Markov Chain Monte Carlo (MCMC) method, we perform an ...

  7. Defining Overweight and Obesity

    Science.gov (United States)

    ... more direct measures of body fat obtained from skinfold thickness measurements, bioelectrical impedance, underwater weighing, and dual energy x-ray absorptiometry ... Berenson, G.S., 2013. A comparison of the Slaughter skinfold-thickness equations and BMI in predicting body fatness and ...

  8. The Relationship between Content Area General Outcome Measurement and Statewide Testing in Sixth-Grade World History

    Science.gov (United States)

    Mooney, Paul; McCarter, Kevin S.; Schraven, Jodie; Haydel, Beth

    2010-01-01

    The purpose of the present study was to extend validity research on a content area general outcome measurement tool known as vocabulary matching. Previous research has reported moderately strong to strong correlations between the group-administered vocabulary-matching measure and a standardized assessment instrument. The present study extended the…

  9. Postural Control Characteristics during Single Leg Standing of Individuals with a History of Ankle Sprain: Measurements Obtained Using a Gravicorder and Head and Foot Accelerometry.

    Science.gov (United States)

    Abe, Yota; Sugaya, Tomoaki; Sakamoto, Masaaki

    2014-03-01

    [Purpose] This study aimed to validate the postural control characteristics of individuals with a history of ankle sprain during single leg standing by using a gravicorder and head and foot accelerometry. [Subjects] Twenty subjects with and 23 subjects without a history of ankle sprain (sprain and control groups, respectively) participated. [Methods] The anteroposterior, mediolateral, and total path lengths, as well as root mean square (RMS) of each length, were calculated using the gravicorder. The anteroposterior, mediolateral, and resultant acceleration of the head and foot were measured using accelerometers and were evaluated as the ratio of the acceleration of the head to the foot. [Results] There was no significant difference between the two groups in path length or RMS acceleration of the head and foot. However, the ratios of the mediolateral and resultant components were significantly higher in the sprain group than in the control group. [Conclusion] Our findings suggest that individuals with a history of ankle sprain have a higher head-to-foot acceleration ratio and different postural control characteristics than those of control subjects.
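    To make the reported quantities concrete, the sketch below computes the RMS of a mediolateral acceleration component and the head-to-foot RMS ratio; the synthetic signals and their scales are assumptions.

```python
# Sketch: RMS sway amplitude and the head-to-foot acceleration ratio.
import numpy as np

rng = np.random.default_rng(2)
head_ml = 0.08 * rng.normal(size=1000)   # mediolateral head acceleration
foot_ml = 0.05 * rng.normal(size=1000)   # mediolateral foot acceleration

def rms(x: np.ndarray) -> float:
    return float(np.sqrt(np.mean(x ** 2)))

ratio_ml = rms(head_ml) / rms(foot_ml)   # > 1: sway grows toward the head
print(f"RMS head {rms(head_ml):.3f}, RMS foot {rms(foot_ml):.3f}, "
      f"ratio {ratio_ml:.2f}")
```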

  10. The intergalactic medium thermal history at redshift z=1.7--3.2 from the Lyman alpha forest: a comparison of measurements using wavelets and the flux distribution

    CERN Document Server

    Garzilli, A; Kim, T -S; Leach, S; Viel, M

    2012-01-01

    We investigate the thermal history of the intergalactic medium (IGM) in the redshift interval z=1.7--3.2 by studying the small-scale fluctuations in the Lyman alpha forest transmitted flux. We apply a wavelet filtering technique to eighteen high resolution quasar spectra obtained with the Ultraviolet and Visual Echelle Spectrograph (UVES), and compare these data to synthetic spectra drawn from a suite of hydrodynamical simulations in which the IGM thermal state and cosmological parameters are varied. From the wavelet analysis we obtain estimates of the IGM thermal state that are in good agreement with other recent, independent wavelet-based measurements. We also perform a reanalysis of the same data set using the Lyman alpha forest flux probability distribution function (PDF), which has previously been used to measure the IGM temperature-density relation. This provides an important consistency test for measurements of the IGM thermal state, as it enables a direct comparison of the constraints obtained using t...
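    A toy sketch of isolating small-scale flux fluctuations with a discrete wavelet transform, in the spirit of the filtering technique described above; the mock flux, the Daubechies wavelet, and the decomposition level are assumptions (PyWavelets provides the transform).

```python
# Sketch: small-scale power in a mock transmitted-flux signal via a DWT.
import numpy as np
import pywt

rng = np.random.default_rng(4)
x = np.linspace(-5, 5, 1024)
flux = 1.0 - 0.3 * np.exp(-x ** 2) + 0.02 * rng.normal(size=1024)

coeffs = pywt.wavedec(flux, "db4", level=4)
small_scale_power = float(np.mean(coeffs[-1] ** 2))  # finest detail scale
print("small-scale wavelet power:", small_scale_power)
```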

  11. Defining Z in Q

    CERN Document Server

    Koenigsmann, Jochen

    2010-01-01

    We show that $\mathbb{Z}$ is definable in $\mathbb{Q}$ by a universal first-order formula in the language of rings. We also present an $\forall\exists$-formula for $\mathbb{Z}$ in $\mathbb{Q}$ with just one universal quantifier. We exhibit new diophantine subsets of $\mathbb{Q}$ like the set of non-squares or the complement of the image of the norm map under a quadratic extension. Finally, we show that there is no existential formula for $\mathbb{Z}$ in $\mathbb{Q}$, provided one assumes a strong variant of the Bombieri-Lang Conjecture for varieties over $\mathbb{Q}$ with many $\mathbb{Q}$-rational points.

  12. Atmospheric refraction : a history

    NARCIS (Netherlands)

    Lehn, WH; van der Werf, S

    2005-01-01

    We trace the history of atmospheric refraction from the ancient Greeks up to the time of Kepler. The concept that the atmosphere could refract light entered Western science in the second century B.C. Ptolemy, 300 years later, produced the first clearly defined atmospheric model, containing air of un

  13. Psychology and History.

    Science.gov (United States)

    Munsterburg, Hugo

    1994-01-01

    This essay considers the discipline of psychology as distinct from history, defining it as a science within philosophy dedicated to the study of the causal structure of the human mind. Although Hugo Munsterburg was considered an important figure in applied psychology, this essay represents an earlier epistemology. (SLD)

  14. Atmospheric refraction : a history

    NARCIS (Netherlands)

    Lehn, WH; van der Werf, S

    2005-01-01

    We trace the history of atmospheric refraction from the ancient Greeks up to the time of Kepler. The concept that the atmosphere could refract light entered Western science in the second century B.C. Ptolemy, 300 years later, produced the first clearly defined atmospheric model, containing air of

  15. Software-Defined Cluster

    Institute of Scientific and Technical Information of China (English)

    聂华; 杨晓君; 刘淘英

    2015-01-01

    The cluster architecture has played an important role in high-end computing for the past 20 years. With the advent of Internet services, big data, and cloud computing, traditional clusters face three challenges: 1) providing flexible system balance among computing, memory, and I/O capabilities; 2) reducing resource pooling overheads; and 3) addressing low performance-power efficiency. This position paper proposes a software-defined cluster (SDC) architecture to deal with these challenges. The SDC architecture inherits two features of traditional cluster: its architecture is multicomputer and it has loosely-coupled interconnect. SDC provides two new mechanisms: global I/O space (GIO) and hardware-supported native access (HNA) to remote devices. Application software can define a virtual cluster best suited to its needs from resource pools provided by a physical cluster, and traditional cluster ecosystems need no modification. We also discuss a prototype design and implementation of a 32-processor cloud server utilizing the SDC architecture.

  16. Probing the Expansion History of the Universe by Model-independent Reconstruction from Supernovae and Gamma-Ray Burst Measurements

    Science.gov (United States)

    Feng, Chao-Jun; Li, Xin-Zhou

    2016-04-01

    To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
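    A compact sketch of the basis-construction step: generate luminosity-distance curves for a family of toy flat cosmologies, then extract a principal-component basis by singular value decomposition. The model family, redshift grid, and crude distance integral are assumptions made for illustration.

```python
# Sketch: PCA basis for luminosity-distance curves over a model family.
import numpy as np

z = np.linspace(0.01, 2.0, 50)
models = []
for om in np.linspace(0.1, 0.5, 40):            # vary the matter density
    E = np.sqrt(om * (1 + z) ** 3 + (1 - om))   # flat-universe H(z)/H0
    dc = np.cumsum(1.0 / E) * (z[1] - z[0])     # crude comoving distance, c/H0 = 1
    models.append((1 + z) * dc)                 # luminosity distance

M = np.array(models)
M -= M.mean(axis=0)                             # center before PCA
U, s, Vt = np.linalg.svd(M, full_matrices=False)
print("variance captured by 3 PCs:",
      round(float(np.sum(s[:3] ** 2) / np.sum(s ** 2)), 4))
```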

  17. PROBING THE EXPANSION HISTORY OF THE UNIVERSE BY MODEL-INDEPENDENT RECONSTRUCTION FROM SUPERNOVAE AND GAMMA-RAY BURST MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Chao-Jun; Li, Xin-Zhou, E-mail: fengcj@shnu.edu.cn, E-mail: kychz@shnu.edu.cn [Shanghai United Center for Astrophysics (SUCA), Shanghai Normal University, 100 Guilin Road, Shanghai 200234 (China)

    2016-04-10

    To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.

  18. Natural history of malignant bone disease in breast cancer and the use of cumulative mean functions to measure skeletal morbidity

    Directory of Open Access Journals (Sweden)

    Smith Matthew R

    2009-08-01

    Full Text Available Abstract Background Bone metastases are a common cause of skeletal morbidity in patients with advanced cancer. The pattern of skeletal morbidity is complex, and the number of skeletal complications is influenced by the duration of survival. Because many patients with cancer die before trial completion, there is a need for survival-adjusted methods to accurately assess the effects of treatment on skeletal morbidity. Methods Recently, a survival-adjusted cumulative mean function model has been generated that can provide an intuitive graphic representation of skeletal morbidity throughout a study. This model was applied to the placebo-control arm of a pamidronate study in patients with malignant bone disease from breast cancer. Results Analysis by bone lesion location showed that spinal metastases were associated with the highest cumulative mean incidence of skeletal-related events (SREs, followed by chest and pelvic metastases. Metastases located in the extremities were associated with an intermediate incidence of SREs, and those in the skull were associated with the lowest incidence of SREs. Conclusion Application of this model to data from the placebo arm of this trial revealed important insight into the natural history of skeletal morbidity in patients with bone metastases. Based on these observations, treatment for the prevention of SREs is warranted regardless of lesion location except for metastases on the skull.
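    A minimal sketch of a survival-adjusted cumulative mean function of the kind applied in this study: at each event time, the mean count increments by the number of events divided by the number of patients still under observation. The per-patient layout and values are invented for illustration.

```python
# Sketch: mean cumulative function (MCF) for recurrent skeletal events.
import numpy as np

followup = np.array([12.0, 8.0, 15.0, 6.0])           # months observed
events = [[2.0, 5.0, 11.0], [3.0], [1.0, 9.0], []]    # SRE times per patient

times = np.unique(np.concatenate([t for t in events if t]))
cum = 0.0
for t in times:
    at_risk = int(np.sum(followup >= t))              # patients still observed
    n_events = sum(1 for pt in events for e in pt if e == t)
    cum += n_events / at_risk                         # survival-adjusted step
    print(f"t={t:4.1f} months  MCF={cum:.2f} events/patient")
```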

  19. Community History.

    Science.gov (United States)

    Lewis, Helen M.

    1997-01-01

    Recounts the experience of researching community history in Ivanhoe, Virginia, between 1987 and 1990. The Ivanhoe History Project involved community members in collecting photographs, memorabilia, and oral histories of their town. Subsequent published volumes won the W. D. Weatherford Award and inspired a quilt exhibit and a theatrical production.…

  20. The Peabody Treatment Progress Battery: history and methods for developing a comprehensive measurement battery for youth mental health.

    Science.gov (United States)

    Riemer, Manuel; Athay, M Michele; Bickman, Leonard; Breda, Carolyn; Kelley, Susan Douglas; Vides de Andrade, Ana R

    2012-03-01

    There is increased need for comprehensive, flexible, and evidence-based approaches to measuring the process and outcomes of youth mental health treatment. This paper introduces a special issue dedicated to the Peabody Treatment Progress Battery (PTPB), a battery of measures created to meet this need. The PTPB is an integrated set of brief, reliable, and valid instruments that can be administered efficiently at low cost and can provide systematic feedback for use in treatment planning. It includes eleven measures completed by youth, caregivers, and/or clinicians that assess clinically-relevant constructs such as symptom severity, therapeutic alliance, life satisfaction, motivation for treatment, hope, treatment expectations, caregiver strain, and service satisfaction. This introductory article describes the rationale for the PTPB and its development and evaluation, detailing the specific analytic approaches utilized by the different papers in the special issue and a description of the study and samples from which the participants were taken.

  1. The history, development and the present status of the radon measurement programme in the United States of America.

    Science.gov (United States)

    George, A C

    2015-11-01

    The US radon measurement programme was begun in the late 1950s by the US Public Health Service in Colorado, New Mexico and Utah during the uranium frenzy. After the 1967 Congressional Hearings on the working conditions in uranium mines, the US Atomic Energy Commission (AEC) was asked to conduct studies in active uranium mines to assess the exposure of the miners on the Colorado Plateau and in New Mexico. From 1967 to 1972, the Health and Safety Laboratory of the US AEC in New York investigated more than 20 uranium mines for radon and radon decay product concentrations, and particle size in 4 large uranium mines in New Mexico. In 1970, the US Environmental Protection Agency (EPA) was established and took over some of the AEC radon measurement activities. Between 1975 and 1978, the Environmental Measurements Laboratory of the US Department of Energy conducted the first detailed indoor radon survey in the USA. Later, in 1984, the very high concentrations of radon found in Pennsylvania homes set the wheels in motion and gave birth to the US Radon Industry. The US EPA expanded its involvement in radon issues and assumed an active role by establishing the National Radon Proficiency Program to evaluate the effectiveness of radon measurement and mitigation methods. In 1998, due to limited resources, the EPA privatised the radon programme. This paper presents a personal perspective of past events and the current status of the US radon programme. It will present an update on radon health effects, the incidence rate of lung cancer in the USA and the number of radon measurements made from 1988 to 2013 using short-term test methods. More than 23 million measurements were made in the last 25 y and as a result more than 1.24 million homes were mitigated successfully. It is estimated that … of radon measurements performed in the USA are made using long-term testing devices. The number of homes above the US action level of 148 Bq m(-3) (4 pCi l(-1)) may be ∼8.5 million because ∼50 million homes

  2. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: choosing the right architecture for the market – laboratory, military, or commercial; hardware platforms – FPGAs, GPPs, specialized and hybrid devices; standardization efforts to ens...

  3. Defining cyber warfare

    Directory of Open Access Journals (Sweden)

    Dragan D. Mladenović

    2012-04-01

    Full Text Available Cyber conflicts represent a new kind of warfare that is technologically developing very rapidly. Such development results in more frequent and more intensive cyber attacks undertaken by states against adversary targets, with a wide range of diverse operations, from information operations to physical destruction of targets. Nevertheless, cyber warfare is waged through the application of the same means, techniques and methods as those used in cyber criminal, terrorism and intelligence activities. Moreover, it has a very specific nature that enables states to covertly initiate attacks against their adversaries. The starting point in defining doctrines, procedures and standards in the area of cyber warfare is determining its true nature. In this paper, a contribution to this effort was made through the analysis of the existing state doctrines and international practice in the area of cyber warfare towards the determination of its nationally acceptable definition.

  4. Ranking Economic History Journals

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Weisdorf, Jacob Louis

    This study ranks - for the first time - 12 international academic journals that have economic history as their main topic. The ranking is based on data collected for the year 2007. Journals are ranked using standard citation analysis where we adjust for age, size and self-citation of journals. We also compare the leading economic history journals with the leading journals in economics in order to measure the influence on economics of economic history, and vice versa. With a few exceptions, our results confirm the general idea about which economic history journals are the most influential for economic history, and that, although economic history is quite independent from economics as a whole, knowledge exchange between the two fields is indeed going on.

  6. DEFINING SPATIAL VIOLENCE. BUCHAREST AS A STUDY CASE

    Directory of Open Access Journals (Sweden)

    Celia GHYKA

    2015-05-01

    Full Text Available The paper looks at the spatial manifestations of violence, aiming to define the category of spatial violence by focusing on the recent urban history of Bucharest; it establishes links with the longer history of natural and inflicted disasters that defined the city, and it explores the spatial, urban, social and symbolic conflicts that occurred during the last 25 years, pointing at their consequences on the social and urban substance of the city.

  8. History and Legacy

    Science.gov (United States)

    Mason, Diana S.

    2004-01-01

    The history of computer usage in high school laboratories is discussed. Students learned scientific methods by acknowledging measurement errors, using significant digits and questioning their own results, and they undoubtedly benefited from applying skills learned in mathematics classes.

  9. Self-defining memories and self-defining future projections in hypomania-prone individuals.

    Science.gov (United States)

    Lardi Robyn, Claudia; Ghisletta, Paolo; Van der Linden, Martial

    2012-06-01

    Mania and hypomania involve dysfunctional beliefs about the self, others, and the world, as well as about affect regulation. The present study explored the impact of these beliefs on the self-defining memories and self-defining future projections of individuals with a history of hypomanic symptoms. The main findings showed that a history of hypomanic symptoms was related to enhanced retrieval of memories describing positive relationships and to reduced future projections about relationships, suggesting both a need for social bonding and a striving for autonomy. Moreover, hypomania-prone individuals tended to describe more recent events and to produce self-defining memories with references to tension that were more integrated in their self-structure. All of these findings support the presence of conflicting dysfunctional beliefs and the importance of memories containing references to tension in hypomania.

  10. Comparison of elemental carbon in lake sediments measured by three different methods and 150-year pollution history in Eastern China.

    Science.gov (United States)

    Han, Y M; Cao, J J; Yan, B Z; Kenna, T C; Jin, Z D; Cheng, Y; Chow, Judith C; An, Z S

    2011-06-15

    Concentrations of elemental carbon (EC) were measured in a 150 yr sediment record collected from Lake Chaohu in Anhui Province, eastern China, using three different thermal analytical methods: IMPROVE_A thermal optical reflectance (TOR), STN_thermal optical transmittance (TOT), and chemothermal oxidation (CTO). Distribution patterns for EC concentrations differ among the three methods, most likely due to the operational definition of EC and the different temperature treatments prescribed for each method. However, similar profiles were found for high-temperature EC fractions among the different methods. Historical soot(TOR) (high-temperature EC fractions measured by the IMPROVE_A TOR method) from Lake Chaohu exhibited stable low concentrations prior to the late 1970s and a sharp increase thereafter, corresponding well with the rapid industrialization of China in the last three decades. This may suggest that high-temperature thermal protocols are suitable for differentiating between soot and other carbon fractions. A similar soot(TOR) record was also obtained from Lake Taihu (~200 km away), suggesting a regional source of soot. The ratio of char(TOR) (low-temperature EC fraction measured by the IMPROVE_A TOR method, after correction for pyrolysis) to soot(TOR) in Lake Chaohu shows an overall decreasing trend, consistent with gradual changes in fuel use from wood burning to increasing fossil fuel combustion. A higher average char(TOR)/soot(TOR) ratio was observed in Lake Taihu than in Lake Chaohu over the past 150 years, consistent with the longer and more extensive industrialization around the Taihu region.

  11. Defining Student Engagement

    Science.gov (United States)

    Axelson, Rick D.; Flick, Arend

    2011-01-01

    Few terms in the lexicon of higher education today are invoked more frequently, and in more varied ways, than "engagement". The phrase "student engagement" has come to refer to how "involved" or "interested" students appear to be in their learning and how "connected" they are to their classes, their institutions, and each other. As measured by…

  12. Mars Atmospheric History Derived from Upper-Atmospheric Structure of 38Ar/36Ar Measured From MAVEN

    Science.gov (United States)

    Jakosky, Bruce; Slipski, Marek; Benna, Mehdi; Mahaffy, Paul; Elrod, Meredith K.; Yelle, Roger; Stone, Shane; Alsaeed, Noora

    2016-10-01

    Measurements of the structure of the Martian upper atmosphere made from MAVEN observations allow us to derive homopause and exobase altitudes in the Mars upper atmosphere and to determine the isotopic fractionation that occurs between them. Fractionation in the ratio of 38Ar/36Ar occurs between the homopause and exobase due to diffusive separation. This fractionation, combined with measurements of the bulk atmospheric ratio, is used to determine the total amount of argon lost to space by pick-up-ion sputtering. Our analysis is based on Rayleigh distillation, modified by replenishment of gas to the atmosphere by outgassing, impact, and crustal weathering. Approximately 80 % of the 36Ar that was ever in the atmosphere has been removed through time. This high value requires that a major fraction of Mars atmospheric gas has been lost to space. It points strongly to loss to space as having been the dominant mechanism driving the transition in Martian climate from an early, warm, wet environment to today's cold, dry, thin atmosphere.
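
    The Rayleigh-distillation bookkeeping behind this kind of estimate is compact enough to sketch. In the Python fragment below, the fractionation factor and the initial and present-day 38Ar/36Ar ratios are illustrative placeholders rather than the MAVEN-derived values, and the replenishment terms (outgassing, impact, crustal weathering) that the full analysis includes are omitted.

      # Toy Rayleigh-distillation estimate of atmospheric escape from an
      # isotope ratio. alpha is the ratio of 38Ar/36Ar in the escaping gas
      # to that in the atmosphere (< 1, since the lighter isotope escapes
      # preferentially); all numbers are illustrative placeholders.

      def fraction_remaining(r_now, r_initial, alpha):
          # Rayleigh distillation: R/R0 = f**(alpha - 1), solved for f,
          # the fraction of 36Ar still present in the atmosphere.
          return (r_now / r_initial) ** (1.0 / (alpha - 1.0))

      f = fraction_remaining(r_now=0.24, r_initial=0.182, alpha=0.8)
      print(f"fraction of 36Ar remaining: {f:.2f}; fraction lost: {1 - f:.2f}")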

  13. What can we learn on the thermal history of the Universe from future CMB spectrum measures at long wavelengths?

    CERN Document Server

    Burigana, C

    2003-01-01

    We analyse the implications of future observations of the CMB absolute temperature at centimeter and decimeter wavelengths, necessary to complement the accurate COBE/FIRAS data. Our analysis shows that forthcoming ground and balloon measurements will allow a better understanding of free-free distortions but will not be able to significantly improve the constraints already provided by the FIRAS data on the possible energy exchanges in the primeval plasma. The same holds even when improving the sensitivity up to ~10 times. Thus, we have studied the impact of very high quality data, such as those in principle achievable with a space experiment like DIMES, planned to measure the CMB absolute temperature at 0.5 - 15 cm with a sensitivity of ~0.1 mK, close to that of FIRAS. Such high quality data would improve by a factor of ~50 the FIRAS results on the fractional energy exchanges associated with dissipation processes possibly occurring in a wide range of cosmic epochs, at intermediate and high redshifts (y_h > 1). The energy dissipa...

  14. Otolith oxygen isotopes measured by high-precision secondary ion mass spectrometry reflect life history of a yellowfin sole (Limanda aspera).

    Science.gov (United States)

    Matta, Mary Elizabeth; Orland, Ian J; Ushikubo, Takayuki; Helser, Thomas E; Black, Bryan A; Valley, John W

    2013-03-30

    The oxygen isotope ratio (δ(18)O value) of aragonite fish otoliths is dependent on the temperature and the δ(18)O value of the ambient water and can thus reflect the environmental history of a fish. Secondary ion mass spectrometry (SIMS) offers a spatial-resolution advantage over conventional acid-digestion techniques for stable isotope analysis of otoliths, especially given their compact nature. High-precision otolith δ(18)O analysis was conducted with an IMS-1280 ion microprobe to investigate the life history of a yellowfin sole (Limanda aspera), a Bering Sea species known to migrate ontogenetically. The otolith was cut transversely through its core and one half was roasted to eliminate organic contaminants. Values of δ(18)O were measured in 10-µm spots along three transects (two in the roasted half, one in the unroasted half) from the core toward the edge. Otolith annual growth zones were dated using the dendrochronology technique of crossdating. Measured values of δ(18)O ranged from 29.0 to 34.1‰ (relative to Vienna Standard Mean Ocean Water). Ontogenetic migration from shallow to deeper waters was reflected in generally increasing δ(18)O values from age-0 to approximately age-7 and subsequent stabilization after the expected onset of maturity at age-7. Cyclical variations of δ(18)O values within juvenile otolith growth zones, up to 3.9‰ in magnitude, were caused by a combination of seasonal changes in the temperature and the δ(18)O value of the ambient water. The ion microprobe produced a high-precision and high-resolution record of the relative environmental conditions experienced by a yellowfin sole that was consistent with population-level studies of ontogeny. Furthermore, this study represents the first time that crossdating has been used to ensure the dating accuracy of δ(18)O measurements in otoliths. Copyright © 2013 John Wiley & Sons, Ltd.
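
    For orientation, the δ(18)O-to-temperature step can be sketched as follows. The coefficients implement one published aragonite-water calibration of the form 1000 ln α = a(1000/T) − b; calibrations differ between studies, so both the coefficients and the example values below are placeholders rather than the ones used in this paper.

      import math

      def temperature_from_d18o(d18o_otolith, d18o_water, a=17.88, b=31.14):
          # Invert 1000 * ln(alpha) = a * (1000 / T) - b (T in kelvin), with
          # alpha = (1000 + d18o_otolith) / (1000 + d18o_water); both delta
          # values on the VSMOW scale. Coefficients are one aragonite
          # calibration and should be swapped for the study's own choice.
          alpha = (1000.0 + d18o_otolith) / (1000.0 + d18o_water)
          t_kelvin = 1000.0 * a / (1000.0 * math.log(alpha) + b)
          return t_kelvin - 273.15

      # Illustrative only: a 32.5 permil otolith grown in -0.5 permil seawater.
      print(f"{temperature_from_d18o(32.5, -0.5):.1f} degC")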

  15. On defining dietary fibre.

    Science.gov (United States)

    DeVries, Jonathan W

    2003-02-01

    Establishing a definition for dietary fibre has historically been a balance between nutrition knowledge and analytical method capabilities. While the most widely accepted physiologically-based definitions have generally been accurate in defining the dietary fibre in foods, scientists and regulators have tended, in practice, to rely on analytical procedures as the definitional basis in fact. As a result, incongruities between theory and practice have resulted in confusion regarding the components that make up dietary fibre. In November 1998 the president of the American Association of Cereal Chemists (AACC) appointed an expert scientific review committee and charged it with the task of reviewing and, if necessary, updating the definition of dietary fibre. The committee was further charged with assessing the state of analytical methodology and making recommendations relevant to the updated definition. After due deliberation, an updated definition of dietary fibre was delivered to the AACC Board of Directors for consideration and adoption (Anon, 2000; Jones 2000b). The updated definition includes the same food components as the historical working definition used for approximately 30 years (a very important point, considering that the majority of the research of the past 30 years delineating the positive health effects of dietary fibre is based on that working definition). However, the updated definition more clearly delineates the make-up of dietary fibre and its physiological functionality. As a result, relatively few changes will be necessary in analytical methodology. Current methodologies, in particular AACC-approved method of analysis 32-05 (Grami, 2000)/Association of Official Analytical Chemists' official method of analysis 985.29 (Horwitz, 2000a) and AACC 32-07 (Grami, 2000)/Association of Official Analytical Chemists 991.43 (Horwitz, 2000a), will continue to be sufficient and used for most foods. A small number of additional methods will be necessary to

  16. Disparities in bone density measurement history and osteoporosis medication utilisation in Swiss women: results from the Swiss Health Survey 2007

    Directory of Open Access Journals (Sweden)

    Born Rita

    2013-01-01

    Full Text Available Background: Although factors associated with the utilisation of bone density measurement (BDM) and osteoporosis treatment have been regularly assessed in the US and Canada, they have not been effectively analysed in European countries. This study assessed factors associated with the utilisation of BDM and osteoporosis medication (OM) in Switzerland. Methods: The Swiss Health Survey 2007 data included self-reported information on BDM and OM for women aged 40 years and older who were living in private households. Multivariable logistic regression analysis was used to identify sociodemographic, socioeconomic, healthcare-related and osteoporosis risk factors associated with BDM and OM utilisation. Results: The lifetime prevalence of BDM was 25.6% (95% CI: 24.3-26.9%) for women aged 40 years and older. BDM utilisation was associated with most sociodemographic factors, all the socioeconomic and healthcare-related factors, and the major osteoporosis risk factors analysed. The prevalence of current OM was 7.8% (95% CI: 7.0-8.6%), and it was associated with some sociodemographic and most healthcare-related factors, but with only one socioeconomic factor. Conclusions: In Swiss women, lifetime BDM and current OM utilisation were low, and utilisation disparities exist according to sociodemographic, socioeconomic and healthcare-related factors. This might foster further health inequalities. The reasons for these findings should be addressed in further studies of elderly women, including those living in institutions.
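
    The modelling step described under Methods can be sketched as below; the variable names and synthetic data are hypothetical stand-ins for the survey file, and the point is simply that the exponentiated coefficients of a multivariable logistic regression read as odds ratios.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic stand-in for the survey data (all variable names hypothetical).
      rng = np.random.default_rng(0)
      n = 500
      survey = pd.DataFrame({
          "age": rng.integers(40, 90, n),
          "education": rng.choice(["basic", "secondary", "tertiary"], n),
          "has_regular_doctor": rng.integers(0, 2, n),
      })
      # Fake outcome loosely tied to age and healthcare contact.
      logit_p = -4.0 + 0.04 * survey["age"] + 0.8 * survey["has_regular_doctor"]
      survey["bdm_ever"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

      model = smf.logit("bdm_ever ~ age + C(education) + has_regular_doctor",
                        data=survey).fit()
      print(np.exp(model.params))  # odds ratios for each factor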

  17. Time, money, and history.

    Science.gov (United States)

    Edgerton, David

    2012-06-01

    This essay argues that taking the economy seriously in histories of science could not only extend the range of activities studied but also change--often quite radically--our understanding of well-known cases and instances in twentieth-century science. It shows how scientific intellectuals and historians of science have followed the money as a means of critique of particular forms of science and of particular conceptions of science. It suggests the need to go further, to a much broader implicit definition of what constitutes science--one that implies a criticism of much history of twentieth-century science for defining it implicitly and inappropriately in very restrictive ways.

  18. A history of the 2014 Minute 319 environmental pulse flow as documented by field measurements and satellite imagery

    Science.gov (United States)

    Nelson, Steven M.; Ramirez-Hernandez, Jorge; Rodriguez-Burgeueno, J. Eliana; Milliken, Jeff; Kennedy, Jeffrey R.; Zamora-Arroyo, Francisco; Schlatter, Karen; Santiago-Serrano, Edith; Carrera-Villa, Edgar

    2016-01-01

    As provided in Minute 319 of the U.S.-Mexico Water Treaty of 1944, a pulse flow of approximately 132 million cubic meters (mcm) was released to the riparian corridor of the Colorado River Delta over an eight-week period that began March 23, 2014 and ended May 18, 2014. Peak flows were released in the early part of the pulse to simulate a spring flood, with approximately 101.7 mcm released at Morelos Dam on the U.S.-Mexico border. The remainder of the pulse flow water was released to the riparian corridor via Mexicali Valley irrigation spillway canals, with 20.9 mcm released at Km 27 Spillway (41 km below Morelos Dam) and 9.3 mcm released at Km 18 Spillway (78 km below Morelos Dam). We used sequential satellite images, overflights, ground observations, water discharge measurements, and automated temperature, river stage and water quality loggers to document and describe the progression of pulse flow water through the study area. The rate of advance of the wetted front was slowed by infiltration and high channel roughness as the pulse flow crossed more than 40 km of dry channel which was disconnected from underlying groundwater and partially overgrown with salt cedar. High lag time and significant attenuation of flow resulted in a changing hydrograph as the pulse flow progressed to the downstream delivery points; two peak flows occurred in some lower reaches. The pulse flow advanced more than 120 km downstream from Morelos Dam to reach the Colorado River estuary at the northern end of the Gulf of California.

  19. Intellectual History

    DEFF Research Database (Denmark)

    In the 5 Questions book series, this volume presents a range of leading scholars in Intellectual History and the History of Ideas through their answers to a brief questionnaire. Respondents include Michael Friedman, Jacques le Goff, Hans Ulrich Gumbrecht, Jonathan Israel, Philip Pettit, John Pocock...

  1. Romerrigets historie

    DEFF Research Database (Denmark)

    Christiansen, Erik

    The history of the Roman Empire from the legendary founding of Rome in 753 BCE to the accession of Heraclius in 610 CE.

  2. Assimilating airborne gas and aerosol measurements into HYSPLIT: a visualization tool for simultaneous assessment of air mass history and back trajectory reliability

    Directory of Open Access Journals (Sweden)

    S. Freitag

    2013-06-01

    Full Text Available Backward trajectories are commonly used to gain knowledge about the history of airborne observations in terms of possible processes along their path as well as feasible source regions. Here, we describe a refined approach that incorporates airborne gas, aerosol, and environmental data into back trajectories and show how this technique allows for simultaneous assessment of air mass history and back trajectory reliability without the need to calculate trajectory errors. We use the HYbrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT model and add a simple semi-automated computing routine to facilitate high-frequency coverage of back trajectories initiated along the flight track every 10 s. We integrate our in-situ physiochemical data by color-coding each of these trajectories with its corresponding in-situ tracer values measured at the back trajectory start points along the flight path. The unique color for each trajectory aids assessment of trajectory reliability through the visual clustering of air mass pathways of similar coloration. Moreover, marked changes in trajectories associated with marked changes evident in measured physiochemical or thermodynamic properties of an air mass add credence to trajectories, particularly when these air mass properties are linked to trajectory features characteristic of recognized sources or processes. This visual clustering of air mass pathways is of particular value for large-scale 3-D flight tracks common to aircraft experiments where air mass features of interest are often spatially distributed and temporally separated. The cluster-visualization tool used here reveals that most back trajectories with pollution signatures measured in the Central Equatorial Pacific reach back to sources on the South American continent over 10 000 km away and 12 days back in time, e.g. the Amazonian basin. We also demonstrate the distinctions in air mass properties between these and trajectories that penetrate deep
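
    A minimal sketch of the colour-coding idea follows, assuming each back trajectory has already been parsed into an array of (lon, lat) points and that tracer[i] holds the in-situ value (say, CO) measured at trajectory i's start point on the flight track; the random-walk "trajectories" merely stand in for parsed HYSPLIT endpoint files.

      import numpy as np
      import matplotlib.pyplot as plt
      from matplotlib import cm, colors

      rng = np.random.default_rng(1)
      # Fake 10-day back trajectories (hourly steps) and start-point tracer values.
      trajectories = [np.cumsum(rng.normal(0.0, 0.5, (240, 2)), axis=0)
                      for _ in range(25)]
      tracer = rng.uniform(50.0, 150.0, len(trajectories))

      norm = colors.Normalize(tracer.min(), tracer.max())
      for path, value in zip(trajectories, tracer):
          plt.plot(path[:, 0], path[:, 1], color=cm.viridis(norm(value)), lw=0.8)
      plt.colorbar(cm.ScalarMappable(norm=norm, cmap="viridis"), ax=plt.gca(),
                   label="tracer value at trajectory start")
      plt.xlabel("lon offset (deg)")
      plt.ylabel("lat offset (deg)")
      plt.show()

    Trajectories of similar colour that also cluster in space lend the ensemble credibility, which is the visual consistency check described above.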

  3. Family History

    Science.gov (United States)

    Your family history includes health information about you and your close relatives. Families have many factors in common, including their genes, ... as heart disease, stroke, and cancer. Having a family member with a disease raises your risk, but ...

  4. Transient, three-dimensional heat transfer model for the laser assisted machining of silicon nitride: 1. Comparison of predictions with measured surface temperature histories

    Energy Technology Data Exchange (ETDEWEB)

    Rozzi, J.C.; Pfefferkorn, F.E.; Shin, Y.C. [Purdue University, (United States). Laser Assisted Materials Processing Laboratory, School of Mechanical Engineering; Incropera, F.P. [University of Notre Dame, (United States). Aerospace and Mechanical Engineering Department

    2000-04-01

    Laser assisted machining (LAM), in which the material is locally heated by an intense laser source prior to material removal, provides an alternative machining process with the potential to yield higher material removal rates, as well as improved control of workpiece properties and geometry, for difficult-to-machine materials such as structural ceramics. To assess the feasibility of the LAM process and to obtain an improved understanding of governing physical phenomena, experiments have been performed to determine the thermal response of a rotating silicon nitride workpiece undergoing heating by a translating CO{sub 2} laser and material removal by a cutting tool. Using a focused laser pyrometer, surface temperature histories were measured to determine the effect of the rotational and translational speeds, the depth of cut, the laser-tool lead distance, and the laser beam diameter and power on thermal conditions. The measurements are in excellent agreement with predictions based on a transient, three-dimensional numerical solution of the heating and material removal processes. The temperature distribution within the unmachined workpiece is most strongly influenced by the laser power and laser-tool lead distance, as well as by the laser/tool translational velocity. A minimum allowable operating temperature in the material removal region corresponds to the YSiAlON glass transition temperature, below which tool fracture may occur. In a companion paper, the numerical model is used to further elucidate thermal conditions associated with laser assisted machining. (author)
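
    As a much-reduced illustration of the class of model being validated here, the sketch below marches a one-dimensional transient conduction problem forward with an explicit finite-difference scheme. The real model is three-dimensional and includes rotation, the moving laser flux and material removal; the property value is a placeholder rather than the paper's silicon nitride data.

      import numpy as np

      alpha = 8e-6                 # thermal diffusivity, m^2/s (placeholder)
      length, nx = 0.01, 101       # 10 mm domain, 101 nodes
      dx = length / (nx - 1)
      dt = 0.4 * dx**2 / alpha     # within the FTCS stability limit dx^2/(2*alpha)

      T = np.full(nx, 300.0)       # initial temperature, K
      T[0] = 1600.0                # crude fixed-temperature laser-heated surface

      for _ in range(2000):        # roughly 1 s of heating
          T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
          T[-1] = T[-2]            # insulated far boundary

      print(f"temperature 1 mm below the surface: {T[10]:.0f} K")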

  5. The Bursty Star Formation Histories of Low-mass Galaxies at $0.4 < z < 1$ Revealed by Star Formation Rates Measured from FUV and H$\beta$

    CERN Document Server

    Guo, Yicheng; Faber, S M; Koo, David C; Krumholz, Mark R; Trump, Jonathan R; Willner, S P; Amorín, Ricardo; Barro, Guillermo; Bell, Eric F; Gardner, Jonathan P; Gawiser, Eric; Hathi, Nimish P; Koekemoer, Anton M; Pacifici, Camilla; Pérez-González, Pablo G; Ravindranath, Swara; Reddy, Naveen; Teplitz, Harry I; Yesuf, Hassen

    2016-01-01

    We investigate the burstiness of star formation histories (SFHs) of galaxies at $0.4 < z < 1$ by using the ratio of star formation rates (SFRs) measured from FUV (1500 \AA) and H$\beta$ (FUV--to--H$\beta$ ratio). Our sample contains 164 galaxies down to stellar mass (M*) of $10^{8.5} M_\odot$ in the CANDELS GOODS-N region, where TKRS Keck/DEIMOS spectroscopy and HST/WFC3 F275W images from CANDELS and HDUV are available. When the ratio of FUV- and H$\beta$-derived SFRs is measured, dust extinction correction is negligible (except for very dusty galaxies) with the Calzetti attenuation curve. The FUV--to--H$\beta$ ratio of our sample increases with decreasing M* and SFR. The median ratio is $\sim$1 at M* $\sim 10^{10} M_\odot$ (or SFR = 20 $M_\odot$/yr) and increases to $\sim$1.6 at M* $\sim 10^{8.5} M_\odot$ (or SFR $\sim 0.5 M_\odot$/yr). At M* $< 10^{9.5} M_\odot$, our median FUV--to--H$\beta$ ratio is higher than that of local galaxies at the same M*, implying a redshift evolution. Bursty SFH on a ...
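
    To make the diagnostic concrete: with widely used Kennicutt-style calibrations (the paper's exact conversion factors may differ) and the Case-B ratio L(Hα) ≈ 2.86 L(Hβ), the FUV-to-Hβ SFR ratio works out as in this sketch; the luminosities are invented for illustration.

      # FUV traces SFR over ~100 Myr, Hbeta over ~10 Myr, so a ratio away
      # from 1 signals SFR changing on those timescales. The calibrations
      # below are commonly quoted values, not necessarily the paper's.

      def sfr_fuv(l_nu_fuv):               # erg/s/Hz near 1500 A
          return 1.4e-28 * l_nu_fuv        # Msun/yr

      def sfr_hbeta(l_hbeta):              # erg/s
          return 7.9e-42 * 2.86 * l_hbeta  # Msun/yr, via the Halpha calibration

      l_nu_fuv, l_hbeta = 2.0e27, 8.0e39   # hypothetical dust-corrected values
      print(f"FUV-to-Hbeta SFR ratio: {sfr_fuv(l_nu_fuv) / sfr_hbeta(l_hbeta):.2f}")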

  6. Mathematics and history: history and analysis epistemology: from exhaustion method to defined integral

    Directory of Open Access Journals (Sweden)

    Mario Mandrone

    2015-06-01

    Full Text Available The creation of the calculus (differential in the terminology of Leibniz, fluxional in that of Newton) is the event that, in the second half of the seventeenth century, marked, in a sense, the transition from classical to modern mathematics. The aim of this work is a historical and epistemological analysis of the question of rigour and of the "metaphysics" of the calculus, one that takes account of the methods of the ancients (e.g. Archimedes' method of exhaustion) as well as of the interpretations of Leibniz and Newton and their successors. The problem of searching for a sure foundation on which to base the calculus, glimpsed by D'Alembert in the theory of limits and taken up by Lagrange with the theory of infinite series and that of derivative functions, found in Cauchy the pioneer of a new way of seeking rigour in analysis. Cauchy's approach was made rigorous by Weierstrass in the second half of the 1800s with the epsilon-delta definition of limit, which in turn rests on definitions concerning the real numbers. In this sense one speaks of the "arithmetisation" of analysis. Keywords: method of exhaustion; method of mechanical theorems; higher calculus; fluents and fluxions; theory of integration; hyperreal numbers; non-standard analysis.

  7. Intake of ruminant trans-fatty acids, assessed by diet history interview, and changes in measured body size, shape and composition.

    Science.gov (United States)

    Hansen, Camilla P; Heitmann, Berit L; Sørensen, Thorkild Ia; Overvad, Kim; Jakobsen, Marianne U

    2016-02-01

    Objective: Studies have suggested that total intake of trans-fatty acids (TFA) is positively associated with changes in body weight and waist circumference, whereas intake of TFA from ruminant dairy and meat products (R-TFA) has not been associated with weight gain. However, these previous studies are limited by self-reported measures of body weight and waist circumference or by a cross-sectional design. The objective of the present study was to investigate whether R-TFA intake was associated with subsequent changes in anthropometry (body weight, waist and hip circumference), measured by technicians, and body composition (body fat percentage). Design: A 6-year follow-up study. Information on dietary intake was collected through diet history interviews, and anthropometric and bioelectrical impedance measurements were obtained by trained technicians at baseline (1987-1988) and at follow-up (1993-1994). Multiple regression with cubic spline modelling was used to analyse the data. Setting: Copenhagen County, Denmark. Subjects: Two hundred and sixty-seven men and women aged 35-65 years from the Danish MONICA (MONItoring of trends and determinants in CArdiovascular diseases) cohort. Results: The median R-TFA intake was 1.3 g/d (5th, 95th percentile: 0.4, 2.7 g/d) or 0.6% of the total energy intake (5th, 95th percentile: 0.2, 1.1%). No significant associations were observed between R-TFA intake and changes in body weight, waist and hip circumference or body fat percentage. Conclusions: R-TFA intake within the range present in the Danish population was not significantly associated with subsequent changes in body size, shape or composition and the 95% confidence intervals indicate that any relevant associations are unlikely to have produced these observations.

  8. Response measurement of single-crystal chemical vapor deposition diamond radiation detector for intense X-rays aiming at neutron bang-time and neutron burn-history measurement on an inertial confinement fusion with fast ignition

    Energy Technology Data Exchange (ETDEWEB)

    Shimaoka, T., E-mail: t.shimaoka@eng.hokudai.ac.jp; Kaneko, J. H.; Tsubota, M. [Graduate School of Engineering, Hokkaido University, Sapporo 060-8628 (Japan); Arikawa, Y.; Nagai, T.; Kojima, S.; Abe, Y.; Sakata, S.; Fujioka, S.; Nakai, M.; Shiraga, H.; Azechi, H. [Osaka University, 2-6 Yamada-Oka, Suita, Osaka 565-0871 (Japan); Isobe, M. [National Institute for Fusion Science, 322-6 Oroshi-cho, Toki 509-5292 (Japan); Sato, Y. [The Institute of Physical and Chemical Research (RIKEN), 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Chayahara, A.; Umezawa, H.; Shikata, S. [Diamond Research Laboratory, National Institute of Advanced Industrial Science and Technology (AIST), 1-8-31 Midorigaoka, Ikeda, Osaka 563-8577 (Japan)

    2015-05-15

    Neutron bang-time and burn-history monitors are necessary for plasma diagnostics in inertial confinement fusion with fast ignition. In the FIREX project, however, no detector had attained those capabilities, because high-intensity X-rays accompany the fast electrons used for plasma heating. To solve this problem, single-crystal CVD diamond was grown and fabricated into a radiation detector. The detector, which had excellent charge-transport properties, was tested to obtain its response function for intense X-rays, and its applicability as a neutron bang-time and burn-history monitor was verified experimentally. Charge collection efficiencies of 99.5% ± 0.8% and 97.1% ± 1.4% for holes and electrons were obtained using 5.486 MeV alpha particles. The drift velocity at the electric field which saturates the charge collection efficiency was 1.1 ± 0.4 × 10⁷ cm/s for holes and 1.0 ± 0.3 × 10⁷ cm/s for electrons. A fast response with a pulse width of several ns to intense X-rays was obtained at the GEKKO XII experiment, which is sufficiently fast for ToF measurements to obtain a neutron signal separately from X-rays. Based on these results, we confirmed that the single-crystal CVD diamond detector obtains a neutron signal with good S/N at an ion temperature of 0.5–1 keV and a neutron yield of more than 10⁹ neutrons/shot.

  9. Measuring $\nu_{\mu}$ disappearance with the MINOS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters, sampling the long baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the ‘atmospheric’ neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations in the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a νμ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum likelihood method to extract the best fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, and any combination of energy dependent and independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset if the true mixing angle was non-maximal was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best fit oscillation parameters, obtained by the fitting software and incorporating resolution information, were: |Δm²| = 2.32 +0.12/-0.08 × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world best measurement of the atmospheric neutrino mass
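
    The two-flavour survival probability at the core of such a disappearance fit is easy to write down. This sketch evaluates it at the MINOS baseline with the best-fit values quoted above, and deliberately omits the thesis's actual machinery (extrapolation between detectors, energy resolution, systematics).

      import numpy as np

      def p_survive(e_gev, l_km=735.0, dm2=2.32e-3, sin2_2theta=1.0):
          # P(numu -> numu) = 1 - sin^2(2*theta) * sin^2(1.267 * dm^2 * L / E),
          # with dm2 in eV^2, L in km and E in GeV.
          return 1.0 - sin2_2theta * np.sin(1.267 * dm2 * l_km / e_gev) ** 2

      energies = np.array([1.0, 1.5, 2.0, 3.0, 5.0])
      print(p_survive(energies))  # deepest deficit near ~1.4 GeV for these values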

  10. The history of a lesson

    DEFF Research Database (Denmark)

    Rasmussen, Mikkel Vedby

    2003-01-01

    and emphasises the need to study the history of lessons rather than the lessons of history. This approach shows that Munich is the end point of a constitutive history that begins in the failure of the Versailles treaty to create a durable European order following the First World War. The Munich lesson is thus...... one element of the lesson of Versailles, which is a praxeology that defines how the West is to make peace, and against whom peace must be defended. The lesson of Versailles has been, at least in part, constitutive of the outbreak of the Cold War, and it continues to define the Western conception...... of what defines peace and security even in the 'war against terrorism'....

  11. A consistent measure of the merger histories of massive galaxies using close-pair statistics - I. Major mergers at z < 3.5

    Science.gov (United States)

    Mundy, Carl J.; Conselice, Christopher J.; Duncan, Kenneth J.; Almaini, Omar; Häußler, Boris; Hartley, William G.

    2017-09-01

    We use a large sample of ∼350 000 galaxies constructed by combining the UKIDSS UDS, VIDEO/CFHT-LS, UltraVISTA/COSMOS and GAMA survey regions to probe the major (1:4 stellar mass ratio) merging histories of massive galaxies (>10^10 M⊙) at 0.005 < z < 3.5. We use a method which incorporates the full photometric redshift probability distributions to measure pair fractions of flux-limited, stellar mass selected galaxy samples using close-pair statistics. The pair fraction is found to evolve weakly as ∝ (1 + z)^0.8, with no dependence on stellar mass. We subsequently derive major merger rates for galaxies at >10^10 M⊙ and at a constant number density of n > 10^-4 Mpc^-3, and find rates a factor of 2-3 smaller than previous works, although this depends strongly on the assumed merger time-scale and the likelihood of a close pair merging. Galaxies undergo approximately 0.5 major mergers at z < 3.5, and the most massive (>10^11 M⊙) galaxies have experienced a steady supply of stellar mass via major mergers throughout their evolution. While pair fractions are found to agree with those predicted by the Henriques et al. semi-analytic model, the Illustris hydrodynamical simulation fails to quantitatively reproduce derived merger rates. Furthermore, we find that major mergers become a comparable source of stellar mass growth compared to star formation at z < 1, but this contribution is 10-100 times smaller than the star formation rate density at higher redshifts.
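
    The conversion the abstract flags as the dominant uncertainty can be sketched directly: a merger rate follows from the pair fraction once a merger probability and an observability timescale are assumed. Both constants and the pair-fraction normalisation below are placeholders; only the (1 + z)^0.8 slope is taken from the text.

      def pair_fraction(z, f0=0.025, slope=0.8):
          # Weakly evolving pair fraction, f_pair = f0 * (1 + z)**slope;
          # f0 is a hypothetical normalisation.
          return f0 * (1.0 + z) ** slope

      def merger_rate(f_pair, c_merge=0.6, tau_gyr=0.6):
          # Mergers per galaxy per Gyr: R = C * f_pair / tau, where C is the
          # chance that a close pair actually merges and tau is the pair
          # visibility timescale; both are assumed, as discussed above.
          return c_merge * f_pair / tau_gyr

      for z in (0.5, 1.0, 2.0, 3.0):
          print(f"z = {z}: {merger_rate(pair_fraction(z)):.3f} mergers/Gyr")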

  12. Matematikkens historie

    DEFF Research Database (Denmark)

    Hansen, Vagn Lundsgaard

    2009-01-01

    The history of mathematics in seven chapters: 1. Mathematics in the making; 2. Mathematics' Greek heritage; 3. The golden age of the Hindus and the Arabs; 4. Mathematics in China; 5. Renaissance mathematics; 6. Calculation with infinitesimals sees the light of day; 7. Mathematics in the twentieth century.

  13. Linjefaget historie

    DEFF Research Database (Denmark)

    Rasch-Christensen, Andreas

    The dissertation is a study of history as a subject specialisation in Danish teacher education. Focusing on the subject perspective, it points to a number of decisive development perspectives for teacher education, the education of history teachers, and history teaching in the Danish Folkeskole.

  15. Environmental history

    DEFF Research Database (Denmark)

    Pawson, Eric; Christensen, Andreas Aagaard

    2016-01-01

    Environmental history is an interdisciplinary pursuit that has developed as a form of conscience to counter an increasingly powerful, forward-looking liberal theory of the environment. It deals with the relations between environmental ideas and materialities, from the work of the geographers George...... risks”. These are exposed by environmental history’s focus on long-run analysis and its narrative form that identifies the stories that we tell ourselves about nature. How a better understanding of past environmental transformations helps to analyse society and agency, and what this can mean...... for solutions and policies, is the agenda for an engaged environmental history from now on....

  16. The History of Nuclidic Masses and of their Evaluation

    CERN Document Server

    Audi, G

    2006-01-01

    This paper is centered on some historical aspects of nuclear masses, and their relations to major discoveries. Besides nuclear reactions and decays, the heart of mass measurements lies in mass spectrometry, the early history of which will be reviewed first. I shall then give a short history of the mass unit which has not always been defined as one twelfth of the carbon-12 mass. When combining inertial masses from mass spectrometry with energy differences obtained in reactions and decays, the conversion factor between the two is essential. The history of the evaluation of the nuclear masses (actually atomic masses) is only slightly younger than that of the mass measurements themselves. In their modern form, mass evaluations can be traced back to 1955. Prior to 1955, several tables were established, the oldest one in 1935.

  17. Cultural history as polyphonic history

    Directory of Open Access Journals (Sweden)

    Burke, Peter

    2010-06-01

    Full Text Available This text offers a reflection on the origins and current development of the field of cultural history through a comparison with the term that has served as the title for this seminar: “polyphonic history”. The author provides an overview of the themes that have structured the seminar (the history of representations, the history of the body and the cultural history of science) with the aim of making explicit and clarifying this plurality of voices in the field of history as well as its pervasiveness in other research areas.

  18. Why History?

    Science.gov (United States)

    Duffy, Robert E.

    1988-01-01

    Examines the way in which studying history contributes to intellectual development. Identifies five mental attributes it enhances: perspective--gained from placing people, events, institutions against larger background; encounter--confronting great ideas, personalities, etc.; relativism in a pluralistic world--developed from immersion in other…

  19. Bulletproof History.

    Science.gov (United States)

    Roy, R. H.

    1994-01-01

    Asserts that the writers and producers of the television documentary, "The Valour and the Horror," provided a false impression of an event to fit preconceived and erroneous interpretations of history. Points out specific examples of inaccurate historical presentations and provides contradictory historical interpretations. (CFR)

  20. Business History

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2012-01-01

    This article argues that a cultural and narrative perspective can enrich the business history field, encourage new and different questions and answers, and provide new ways of thinking about methods and empirical material. It discusses what culture is and how it relates to narratives. Taking...

  1. Potted history

    NARCIS (Netherlands)

    Van Dijk, T.

    2010-01-01

    The Jordan Valley was once populated by a people, now almost forgotten by historians, with whom the pharaoh of Egypt sought favour. That is the conclusion reached by Niels Groot, the first researcher to take a PhD at the Delft-Leiden Centre for Archaeology, Art History and Science.

  2. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  4. The Bursty Star Formation Histories of Low-mass Galaxies at 0.4 < z < 1 Revealed by Star Formation Rates Measured From Hβ and FUV

    Science.gov (United States)

    Guo, Yicheng; Rafelski, Marc; Faber, S. M.; Koo, David C.; Krumholz, Mark R.; Trump, Jonathan R.; Willner, S. P.; Amorín, Ricardo; Barro, Guillermo; Bell, Eric F.; Gardner, Jonathan P.; Gawiser, Eric; Hathi, Nimish P.; Koekemoer, Anton M.; Pacifici, Camilla; Pérez-González, Pablo G.; Ravindranath, Swara; Reddy, Naveen; Teplitz, Harry I.; Yesuf, Hassen

    2016-12-01

    We investigate the burstiness of star formation histories (SFHs) of galaxies at 0.4 < z < 1 by using the ratio of star formation rates (SFRs) measured from Hβ and FUV (1500 Å). Other models, e.g., a non-universal initial mass function or stochastic star formation on star cluster scales, are unable to plausibly explain our results.

  5. Defining viability in mammalian cell cultures

    OpenAIRE

    Browne, Susan M.; Al-Rubeai, Mohamed

    2011-01-01

    A large number of assays are available to monitor viability in mammalian cell cultures, with most defining loss of viability as a loss of plasma membrane integrity, a characteristic of necrotic cell death. However, the majority of cultured cells die by apoptosis, and early apoptotic cells, although non-viable, maintain an intact plasma membrane and are thus ignored. Here we measure the viability of cultures of a number of common mammalian cell lines by assays that measure me...

  6. Long-term measurements of particle number size distributions and the relationships with air mass history and source apportionment in the summer of Beijing

    Science.gov (United States)

    Wang, Z. B.; Hu, M.; Wu, Z. J.; Yue, D. L.; He, L. Y.; Huang, X. F.; Liu, X. G.; Wiedensohler, A.

    2013-10-01

    A series of long-term and temporary measurements was conducted to study the improvement of air quality in Beijing during the Olympic Games period (8-24 August 2008). To evaluate the actions taken to improve the air quality, particle number and volume size distributions of August 2008 were compared with those of August 2004-2007. The total particle number and volume concentrations were 14 000 cm⁻³ and 37 μm³ cm⁻³ in August 2008, respectively, reductions of 41% and 35% compared with the mean values of August 2004-2007. A cluster analysis of air mass history and a source apportionment were performed to explore the reasons for the reduction in particle concentrations. Back trajectories were classified into five major clusters. Air masses from the south are always associated with pollution events during the summertime in Beijing. In August 2008, the frequency of air masses arriving from the south was 1.3 times higher than the average of the previous years; this, however, did not result in elevated particle volume concentrations in Beijing. The reduced particle number and volume concentrations during the 2008 Beijing Olympic Games therefore cannot be explained by meteorological conditions alone. Four factors influencing particle concentrations were identified using a positive matrix factorization (PMF) model: local traffic emissions, remote traffic emissions, combustion sources and secondary transformation. The reductions of the four sources were calculated to be 47%, 44%, 43% and 30%, respectively. The significant reductions in particle number and volume concentrations may be attributed to the actions taken, which focused on primary emissions, especially those related to the traffic and combustion sources.
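
    As a sketch of the factorisation step: PMF decomposes a samples-by-species matrix into non-negative source contributions and profiles. Scikit-learn's NMF is used here as a stand-in (true PMF additionally weights residuals by measurement uncertainty, which NMF does not), and the data are synthetic.

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      profiles_true = rng.random((4, 12))        # 4 sources x 12 size bins
      contributions_true = rng.random((200, 4))  # 200 samples x 4 sources
      X = contributions_true @ profiles_true + 0.01 * rng.random((200, 12))

      model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
      G = model.fit_transform(X)   # estimated source contributions per sample
      F = model.components_        # estimated source profiles
      print(G.shape, F.shape)      # (200, 4), (4, 12)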

  7. Business History as Cultural History

    DEFF Research Database (Denmark)

    Lunde Jørgensen, Ida

    The paper engages with the larger question of how cultural heritage becomes taken for granted and offers a complementary view to the anthropological 'Copenhagen School' of business history, one that draws attention to the way corporate wealth directly and indirectly influences the culture available...

  8. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam;

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged...... in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample...

  9. Sommerferiens historie

    DEFF Research Database (Denmark)

    Lützen, Karin

    2011-01-01

    favourite ways for common people to spend their holidays, and with the introduction of holiday pay in the 1930s almost everybody could take a couple of weeks off work in the summer. With the introduction of charter tourism many people went off to Southern Europe to spend their holidays on the same pattern. Finally, the history of the special holiday camps is told, which were established by American Jews because they were excluded from many hotels.

  10. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

    Full Text Available Theoretical approaches to defining elections develop the nature, essence and content of elections, and help to determine their place and role as one of the major institutions of national law in a democratic system.

  11. Defining Modules, Modularity and Modularization

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    The paper describes the evolution of the concept of modularity in a historical perspective. The main reasons for modularity are: create variety, utilize similarities, and reduce complexity. The paper defines the terms: module, modularity, and modularization.

  12. Philosophy and its History

    Directory of Open Access Journals (Sweden)

    Carlos B. Gutiérrez

    2008-08-01

    Full Text Available Socratic irony already stated that philosophy did not progress because it always devoted itself to the same matters; philosophical knowledge, besieged by the question of legitimacy, shows its regenerative capacity by always going back to its historical-conceptual foundations. Hence, whoever tries to define the essence of philosophy leaving history aside is in danger of narrow dogmatism. In its factical situation, philosophical reflection deals with issues referred to previous knowledge and opinions, which determine the limits of the rationality claims of its knowledge. Philosophical truth is temporal, like any other human truth.

  13. Uncovering History for Future History Teachers

    Science.gov (United States)

    Fischer, Fritz

    2010-01-01

    The art of history teaching is at a crossroads. Recent scholarship focuses on the need to change the teaching of history so students can better learn history, and insists that history teachers must move beyond traditional structures and methods of teaching in order to improve their students' abilities to think with history. This article presents…

  14. Ildens historier

    DEFF Research Database (Denmark)

    Lassen, Henrik Roesgaard

    In December 2012 a manuscript entitled "Tællelyset" ['The Tallow Candle'] was discovered in an archive. The story was subsequently presented to the world as Hans Christian Andersen's first fairy tale and rather bombastically celebrated as such. In this book it is demonstrated that the text cannot...... from a point-by-point tracing of 'the origins and history' of Hans Christian Andersen's famous fairy tales. Where did they come from? How did they become the iconic texts that we know today? On this background it becomes quite clear that "Tællelyset" is a modern pastiche and not a genuine Hans Christian Andersen fairy tale.

  15. Quantum Histories and their Implications

    CERN Document Server

    Kent, A

    1996-01-01

    It was recently pointed out that, using two different sets in the consistent histories formalism, one can conclude that the system state is definitely in a given subspace $A$ of the state space at a given time and that the system state is definitely not in a larger subspace $B \\supset A$. This raises the question as to whether, if standard quantum theory applies to the macroscopic realm, we should necessarily expect the quasiclassical physics we actually observe to respect subspace implications. I give here a new criterion, ordered consistency, with the property that inferences made by ordered consistent sets do not violate subspace relations. The criterion allows a precise version of the question to be formulated: do the operators defining our observations form an ordered consistent history? It also defines a version of quantum theory which has greater predictive power than the standard consistent histories formalism.

  16. Natural history of cerebral saccular aneurysms

    African Journals Online (AJOL)

    Keywords: natural history, cerebral saccular aneurysm, aneurysmal rupture. ... as autosomal dominant polycystic kidney disease and Ehlers-Danlos syndrome ... the method of defining 'acute' hypertension was not reported. Juvela et al. [25] ...

  17. Defining the states of consciousness.

    Science.gov (United States)

    Tassi, P; Muzet, A

    2001-03-01

    Consciousness remains an elusive concept due to the difficulty of defining what has for many years been regarded as a subjective experience, and therefore irrelevant for scientific study. Recent developments in this field of research have provided some new insight into possible ways to define consciousness. Drawing on the extensive literature in this domain, several perspectives are proposed for defining this concept. (1) Consciousness and attention may not reflect the same process. (2) Consciousness during wake and sleep may not involve the same mechanisms. (3) Besides physiological states of consciousness, human beings can experience modified states of consciousness either by self-training (transcendental meditation, hypnosis, etc.) or by drug intake (hallucinogens, anaesthetics, etc.). Altogether, we address the question of a more precise terminology, given the theoretical weight words can convey. In this respect, we propose different definitions for concepts like consciousness, vigilance, arousal and alertness as candidates to separate functional entities.

  18. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worthwhile to reconsider the concept from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
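
    The software model named here (an irregular, connected, directed acyclic graph with random node weights and random edges) can be generated in a few lines, e.g. with networkx. The construction below, a backbone plus random forward edges, is one simple way to guarantee connectivity and acyclicity; it is not the paper's own generator, and all parameters are illustrative.

      import random
      import networkx as nx

      random.seed(0)
      n = 8
      g = nx.DiGraph()
      g.add_nodes_from((i, {"cost": random.uniform(1.0, 10.0)}) for i in range(n))
      for i in range(n - 1):
          g.add_edge(i, i + 1)          # backbone edges keep the graph connected
      for _ in range(6):
          a, b = sorted(random.sample(range(n), 2))
          g.add_edge(a, b)              # forward edges only, so it stays acyclic

      assert nx.is_directed_acyclic_graph(g)
      print(list(nx.topological_sort(g)))  # one valid module execution order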

  19. Defining the Internet of Things

    OpenAIRE

    Benghozi, Pierre-Jean; Bureau, Sylvain; Massit-Folléa, Françoise

    2012-01-01

    How can a definition be given to what does not yet exist? The Internet of Things, as it is conceptualized by researchers or imagined by science-fiction writers such as Bruce Sterling, is not yet reality, and if we try to define it accurately we risk rash predictions. In order to better comprehend this notion, let us first define the main principles of the IoT as given in research papers and reports on the subject. Definitions gradually established: almost all agree that the Internet of Things...

  1. Associations among Measures of Sequential Processing in Motor and Linguistics Tasks in Adults with and without a Family History of Childhood Apraxia of Speech: A Replication Study

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H.

    2013-01-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically…

  2. Procrastination as a Fast Life History Strategy

    Directory of Open Access Journals (Sweden)

    Bin-Bin Chen

    2016-02-01

    Full Text Available Research has revealed that procrastination—the purposive delay of an intended course of action—is a maladaptive behavior. However, by drawing on an evolutionary life history (LH) approach, the present study proposes that procrastination may be an adaptive fast LH strategy characterized by prioritizing immediate benefits with little regard to long-term consequences. A total of 199 undergraduate students completed measures of procrastination and future orientation and the Mini-K scale, which measures the slow LH strategy. Structural equation modeling revealed that, as predicted, procrastination was negatively associated with a slow LH strategy both directly and indirectly through the mediation of future orientation. These results point to the fast LH origin of procrastination.

  3. Defined medium for Moraxella bovis.

    OpenAIRE

    Juni, E; Heym, G A

    1986-01-01

    A defined medium (medium MB) for Moraxella bovis was formulated. Nineteen strains grew well on medium MB. One strain was auxotrophic for asparagine, and another was auxotrophic for methionine. Strains of M. equi and M. lacunata also grew on medium MB. All strains had an absolute requirement for thiamine and were stimulated by or actually required the other growth factors in the medium.

  4. Indico CONFERENCE: Define the Programme

    CERN Document Server

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to define the programme of a conference in Indico. The programme of your conference is divided into different “tracks”. Tracks represent the subject matter of the conference, such as “Online Computing”, “Offline Computing”, and so on.

  5. Defining sphincter of oddi dysfunction

    DEFF Research Database (Denmark)

    Funch-Jensen, P

    1996-01-01

    Sphincter of Oddi (SO) dysmotility may give rise to pain. The gold standard for demonstrating SO dysfunction is endoscopic manometry. A number of abnormalities are observed in patients with postcholecystectomy pain and in patients with idiopathic recurrent pancreatitis. Criteria for defining SO dysfunction and the possible mechanisms for the precipitation of pain are discussed.

  6. Defined medium for Moraxella bovis.

    Science.gov (United States)

    Juni, E; Heym, G A

    1986-10-01

    A defined medium (medium MB) for Moraxella bovis was formulated. Nineteen strains grew well on medium MB. One strain was auxotrophic for asparagine, and another was auxotrophic for methionine. Strains of M. equi and M. lacunata also grew on medium MB. All strains had an absolute requirement for thiamine and were stimulated by or actually required the other growth factors in the medium.

  8. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered considerable attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows large flexibility not only for routing data, but also to manage buffe...

  9. Defining and Differentiating the Makerspace

    Science.gov (United States)

    Dousay, Tonia A.

    2017-01-01

    Many resources now punctuate the maker movement landscape. However, some schools and communities still struggle to understand this burgeoning movement. How do we define these spaces and differentiate them from previous labs and shops? Through a multidimensional framework, stakeholders should consider how the structure, access, staffing, and tools…

  10. Quantum Thought Experiments Can Define Nature

    CERN Document Server

    McCartor, D

    2004-01-01

    One would not think that thought experiments could matter to nature, for they are a humble human device. Yet quantum mechanics very naturally frames thought experiments (as distinct from precisely defining what exists). They exemplify the informing powers of radiation. Though based on wave functions that have time symmetry, these tableaux inevitably tell of irreversible behavior by nature. The paper sketches how John von Neumann's measurement theory fits into this and retells N. David Mermin's baseball story.

  11. Nonadditive Set Functions Defined by Aumann Fuzzy Integrals

    Institute of Scientific and Technical Information of China (English)

    刘彦奎; 刘宝碇

    2003-01-01

    A novel concept, called nonadditive set-valued measure, is first defined as a monotone and continuous set function. Then the interconnections between nonadditive set-valued measure and the additive set-valued measure as well as the fuzzy measure are discussed. Finally, an approach to construct a nonadditive compact set-valued measure is presented via Aumann fuzzy integral.
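
    The monotone, continuous set functions referred to here are, in the standard fuzzy-measure literature following Sugeno, set functions $g$ on a $\sigma$-algebra $\mathcal{A}$ over $X$ satisfying (these are the textbook axioms, not notation from the article itself):

        g(\emptyset) = 0; \qquad
        A \subseteq B \;\Rightarrow\; g(A) \le g(B); \qquad
        A_n \uparrow A \;\Rightarrow\; g(A_n) \to g(A),

    with the dual continuity condition for decreasing sequences $A_n \downarrow A$ required when $g(A_1) < \infty$. Additivity is not assumed, which is what "nonadditive" signals above.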

  12. Public History

    Directory of Open Access Journals (Sweden)

    Marta Gouveia de Oliveira Rovai

    2017-04-01

    Full Text Available This article presents the concept and practices of Public History as a new stance of historical science in dialogue with communication professionals, in the sense of producing and disseminating human experiences. To this end, it discusses the origin of the concept of Public History and the different forms of historical education that the use of new technologies (among them the internet) can provide. In this sense, the reader is invited to reflect on the possibilities of publicizing and democratizing historical knowledge and culture, expanding the opportunities for production, dissemination and public access to different forms of experience in time. The article also intends to draw the attention of professionals who deal with History and Communication to the dangers of productions exclusively subject to the market, which turn the popularization of History into the reinforcement of cultural stigmas.   KEYWORDS: Public History; historical education and Communication; democratization and stigmatization.

  13. Profiling of Humoral Response to Influenza A(H1N1)pdm09 Infection and Vaccination Measured by a Protein Microarray in Persons with and without History of Seasonal Vaccination

    OpenAIRE

    Huijskens, Elisabeth G. W.; Johan Reimerink; Mulder, Paul G H; Janko van Beek; Adam Meijer; Erwin de Bruin; Ingrid Friesema; de Jong, Menno D.; Rimmelzwaan, Guus F.; Peeters, Marcel F.; Rossen, John W. A.; Marion Koopmans

    2013-01-01

    Background: The influence of prior seasonal influenza vaccination on the antibody response produced by natural infection or vaccination is not well understood. Methods: We compared the profiles of antibody responses of 32 naturally infected subjects and 98 subjects vaccinated with a 2009 influenza A(H1N1) monovalent MF59-adjuvanted vaccine (Focetria®, Novartis), with and without a history of seasonal influenza vaccination. Antibodies were measured by hemagglutination inhibition (H...

  14. Software defined radio architectures evaluation

    OpenAIRE

    Palomo, Alvaro; Villing, Rudi; Farrell, Ronan

    2008-01-01

    This paper presents a performance evaluation of GNU Radio and OSSIE, two open source Software Defined Radio (SDR) architectures. The two architectures were compared by running implementations of a BPSK waveform utilising a software loopback channel on each. The upper bound on full duplex throughput was found to be around 700 kbps in both cases, though OSSIE was slightly faster than GNU Radio. CPU and memory loads did not differ significantly.

  15. AIDS defining disease: Disseminated cryptococcosis

    Directory of Open Access Journals (Sweden)

    Roshan Anupama

    2006-01-01

    Full Text Available Disseminated cryptococcosis is one of the acquired immune deficiency syndrome defining criteria and the most common cause of life-threatening meningitis. Disseminated lesions in the skin manifest as papules or nodules that mimic molluscum contagiosum (MC). We report here a human immunodeficiency virus positive patient who presented with MC-like lesions. Disseminated cryptococcosis was confirmed by India ink preparation and histopathology. The condition of the patient improved with amphotericin B.

  16. The Future of History and History Teaching.

    Science.gov (United States)

    Commager, Henry Steele

    1983-01-01

    Technical history, a quantitative record of history strengthened by new techniques in mathematics, computer science, and other fields has advantages over former approaches to history--history as philosophy and historical theology. For example, it makes available more source materials. However, it has drawbacks, e.g., it directs research to highly…

  18. How to define green adjuvants.

    Science.gov (United States)

    Beck, Bert; Steurbaut, Walter; Spanoghe, Pieter

    2012-08-01

    The concept 'green adjuvants' is difficult to define. This paper formulates an answer based on two approaches. Starting from the Organisation for Economic Cooperation and Development (OECD) definition for green chemistry, production-based and environmental-impact-based definitions for green adjuvants are proposed. According to the production-based approach, adjuvants are defined as green if they are manufactured using renewable raw materials as much as possible while making efficient use of energy, preferably renewable energy. According to the environmental impact approach, adjuvants are defined as green (1) if they have a low human and environmental impact, (2) if they do not increase active ingredient environmental mobility and/or toxicity to humans and non-target organisms, (3) if they do not increase the exposure to these active substances and (4) if they lower the impact of formulated pesticides by enhancing the performance of active ingredients, thus potentially lowering the required dosage of active ingredients. Based on both approaches, a tentative definition for 'green adjuvants' is given, and future research and legislation directions are set out.

  19. Profiling of humoral response to influenza A(H1N1)pdm09 infection and vaccination measured by a protein microarray in persons with and without history of seasonal vaccination.

    Directory of Open Access Journals (Sweden)

    Elisabeth G W Huijskens

    Full Text Available BACKGROUND: The influence of prior seasonal influenza vaccination on the antibody response produced by natural infection or vaccination is not well understood. METHODS: We compared the profiles of antibody responses of 32 naturally infected subjects and 98 subjects vaccinated with a 2009 influenza A(H1N1) monovalent MF59-adjuvanted vaccine (Focetria®, Novartis), with and without a history of seasonal influenza vaccination. Antibodies were measured by hemagglutination inhibition (HI) assay for influenza A(H1N1)pdm09 and by protein microarray (PA) using the HA1 subunit for seven recent and historic H1, H2 and H3 influenza viruses, and three avian influenza viruses. Serum samples for the infection group were taken at the moment of collection of the diagnostic sample, 10 days and 30 days after onset of influenza symptoms. For the vaccination group, samples were drawn at baseline, 3 weeks after the first vaccination and 5 weeks after the second vaccination. RESULTS: We showed that subjects with a history of seasonal vaccination generally exhibited higher baseline titers for the various HA1 antigens than subjects without a seasonal vaccination history. Infection and pandemic influenza vaccination responses in persons with a history of seasonal vaccination were skewed towards historic antigens. CONCLUSIONS: Seasonal vaccination has a significant influence on the antibody response to subsequent infection and vaccination, and further research is needed to understand the effect of annual vaccination on protective immunity.

  20. Substance, History, and Politics

    Directory of Open Access Journals (Sweden)

    Candace J. Black

    2017-02-01

    Full Text Available The aim of this article is to examine the relations between two approaches to the measurement of life history (LH) strategies: a traditional approach, termed here the biodemographic approach, which measures developmental characteristics like birthweight, gestation length, interbirth intervals, pubertal timing, and sexual debut, and a psychological approach, which measures a suite of cognitive and behavioral traits such as altruism, sociosexual orientation, personality, mutualism, familial relationships, and religiosity. The biodemographic approach also tends not to invoke latent variables, whereas the psychological approach typically relies heavily upon them. Although a large body of literature supports both approaches, they are largely separate. This review examines the history of, and relations between, biodemographic and psychological measures of LH, which remain murky at best. In doing so, we consider basic questions about the nature of LH strategies: What constitutes LH strategy (or, perhaps more importantly, what does not constitute LH strategy)? What is gained or lost by including psychological measures in LH research? Must these measures remain independent, or should they be used in conjunction as complementary tools to test tenets of LH theory? Although definitive answers will have to wait, we hope to catalyze an explicit discussion among LH researchers and to provoke novel research avenues that combine the strengths each approach brings to this burgeoning field.

  1. UNIQLO, Define Your Own Fashion

    Institute of Scientific and Technical Information of China (English)

    Wang Ting

    2009-01-01

    Yes, women like and enjoy shopping. They always want to buy well-designed clothes with the most 'in' factors; and, more importantly, they would like to hear the words: "Wow! That fits you well!" However, the right piece is not always waiting there for you, and you cannot help complaining about trends that change so fast, day by day. At such times, why not seek some delight in basic, classic collections? UNIQLO may be a choice for you to define your own fashion.

  2. Defining Life: The Virus Viewpoint

    Science.gov (United States)

    Forterre, Patrick

    2010-04-01

    Are viruses alive? Until very recently, answering this question was often negative and viruses were not considered in discussions on the origin and definition of life. This situation is rapidly changing, following several discoveries that have modified our vision of viruses. It has been recognized that viruses have played (and still play) a major innovative role in the evolution of cellular organisms. New definitions of viruses have been proposed and their position in the universal tree of life is actively discussed. Viruses are no more confused with their virions, but can be viewed as complex living entities that transform the infected cell into a novel organism—the virus—producing virions. I suggest here to define life (an historical process) as the mode of existence of ribosome encoding organisms (cells) and capsid encoding organisms (viruses) and their ancestors. I propose to define an organism as an ensemble of integrated organs (molecular or cellular) producing individuals evolving through natural selection. The origin of life on our planet would correspond to the establishment of the first organism corresponding to this definition.

  4. The Discovery of the Tau Lepton: Part 1, The Early History Through 1975; Part 2, Confirmation of the Discovery and Measurement of Major Properties, 1976--1982

    Science.gov (United States)

    Perl, M. L.

    1994-08-01

    Several previous papers have given the history of the discovery of the τ lepton at the Stanford Linear Accelerator Center (SLAC). These papers emphasized (a) the experiments which led to our 1975 publication of the first evidence for the existence of the τ, (b) the subsequent experiments which confirmed the existence of the τ, and (c) the experiments which elucidated the major properties of the τ. That history is summarized in Part 2 of this talk. In this Part 1, I describe the earlier thoughts and work of myself and my colleagues at SLAC in the 1960s and early 1970s which led to the discovery. I also describe the theoretical and experimental events in particle physics in the 1960s in which our work was immersed. I will also try to describe, for the younger generations of particle physicists, the atmosphere of the 1960s. That was before the elucidation of the quark model of hadrons and before the development of the concept of particle generations. The experimental paths to progress were not as clear as they are today, and we had to cast a wide experimental net.

  5. Recent history of atmospheric trace gas concentrations deduced from measurements in the deep sea: application to sulfur hexafluoride and carbon tetrachloride

    Energy Technology Data Exchange (ETDEWEB)

    Watson, A.J.; Liddicoat, M.I.

    1985-01-01

    On a time scale of several decades, an increase in the atmospheric burden of certain stable trace gases results in a characteristic oceanic depth profile for the concentration of the dissolved gas. If the atmosphere is the only source of the gas to the sea, the time delay inherent in its downward penetration from the surface results in a profile which decreases with depth. By referencing to compounds such as Freon 11 or Freon 12, the atmospheric histories of which are relatively well known, limits can be placed on the increase of a trace gas whose history is unknown. The method may be particularly valuable in distinguishing the contributions of natural and anthropogenic sources of gases such as CCl₄ and CF₄, which may have both. The method is here applied to estimate the concentration of atmospheric SF₆ since 1970. Both exponential and linear fits are investigated, but the best fit is a linear increase, C = 0.34 + 0.084 (Yr − 1970), where Yr is the calendar year and C is the concentration in pptv. A preliminary look at two CCl₄ profiles suggests that at least 50% of the atmospheric burden is of recent anthropogenic origin.
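
    The reported linear fit is simple enough to evaluate directly. A minimal sketch (the function name and the 1970 cutoff are assumptions of this sketch; the coefficients are those quoted above):

        def sf6_pptv(year):
            """Atmospheric SF6 concentration (pptv) under the linear fit
            C = 0.34 + 0.084 * (Yr - 1970) reported in the abstract."""
            if year < 1970:
                raise ValueError("fit applies from 1970 onward")
            return 0.34 + 0.084 * (year - 1970)

        print(f"SF6 in 1985: {sf6_pptv(1985):.2f} pptv")   # -> 1.60 pptv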

  6. Rorschach variables and dysfunctional attitudes as measures of depressive vulnerability: a 9-year follow-up study of individuals with different histories of major depressive episodes.

    Science.gov (United States)

    Hartmann, Ellen; Halvorsen, Marianne; Wang, Catharina E A

    2013-01-01

    Forty-six individuals with different histories of major depressive episodes (MDEs) completed the Rorschach (Exner, 2003) and the Dysfunctional Attitude Scale (DAS; Weissman & Beck, 1978) at 2 assessment points (T1, T2) over a 9-year follow-up. At T1, history of MDE and the Rorschach variable MOR (associated with negative self-image) emerged as significant predictors of number of MDEs over the follow-up. At T2, Rorschach markers of depressive vulnerability and scars were identified (i.e., WSum6, related to illogical thinking; X+%, related to conventional perception and social adjustment; X-%, linked to erroneous judgments; MQ-, associated with impaired social relations; and MOR). Test-retest analyses displayed significant temporal stability, with r ranging from .34 to .67 for the Rorschach variables and r = .42 for the DAS. Our findings highlight MDE as a recurrent and serious disorder, number of MDEs as a risk factor for future depressions, and Rorschach variables as markers of depressive vulnerability and scars.

  7. The action operator for continuous time histories

    CERN Document Server

    Savvidou, K N

    1999-01-01

    We define the action operator for the History Projection Operator consistent histories theory, as the quantum analogue of the classical action functional, for the simple harmonic oscillator in one dimension. We conclude that the action operator is the generator of time transformations and is associated with the two types of time evolution of standard quantum theory: wave-packet reduction and Heisenberg time evolution. We construct the corresponding classical histories and demonstrate their relevance to the quantum histories. Finally, we show how the action operator appears in the expression for the decoherence functional.

  8. Oscillator metrology with software defined radio

    CERN Document Server

    Sherman, Jeff A

    2016-01-01

    Analog electrical elements such as mixers, filters, transfer oscillators, isolating buffers, dividers, and even transmission lines contribute technical noise and unwanted environmental coupling in time and frequency measurements. Software defined radio (SDR) techniques replace many of these analog components with digital signal processing (DSP) on rapidly sampled signals. We demonstrate that, generically, commercially available multi-channel SDRs are capable of time and frequency metrology, outperforming purpose-built devices by as much as an order of magnitude. For example, for signals at 10 MHz and 6 GHz, we observe SDR time deviation noise floors of about 20 fs and 1 fs, respectively, in under 10 ms of averaging. Examining the other complex signal component, we find a relative amplitude measurement instability of 3 × 10⁻⁷ at 5 MHz. We discuss the scalability of an SDR-based system for simultaneous measurement of many clocks. SDR's frequency agility allows for comparison of oscillators at widely different frequencies...
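
    The deviation floors quoted above rest on standard two-sample statistics computed from sampled phase. A minimal sketch of the overlapping Allan deviation from evenly sampled phase data (this is the generic estimator from the time-and-frequency literature, not the authors' code; the test signal is a made-up white-phase-noise example):

        import numpy as np

        def oadev(x, tau0, m):
            """Overlapping Allan deviation from phase samples x (seconds),
            sample interval tau0 (s), averaging factor m."""
            x = np.asarray(x, dtype=float)
            n = len(x) - 2 * m
            if n < 1:
                raise ValueError("record too short for this averaging factor")
            d2 = x[2 * m : 2 * m + n] - 2 * x[m : m + n] + x[:n]   # second differences
            avar = np.sum(d2 ** 2) / (2 * n * (m * tau0) ** 2)
            return np.sqrt(avar)

        rng = np.random.default_rng(0)
        x = 100e-12 * rng.standard_normal(10_000)   # 100 ps RMS white phase noise
        print(oadev(x, tau0=1e-3, m=10))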

  9. Interrupting Life History: The Evolution of Relationship within Research

    Science.gov (United States)

    Hallett, Ronald E.

    2013-01-01

    In this paper the author explores how relationships are defined within the context of constructing a life history. The life history of Benjamin, a homeless young man transitioning to adulthood, is used to illustrate how difficult it is to define the parameters of the research environment. During an "ethically important moment" in the research…

  10. Defining Life: Synthesis and Conclusions

    Science.gov (United States)

    Gayon, Jean

    2010-04-01

    The first part of the paper offers philosophical landmarks on the general issue of defining life. §1 defends that the recognition of “life” has always been and remains primarily an intuitive process, for the scientist as for the layperson. However we should not expect, then, to be able to draw a definition from this original experience, because our cognitive apparatus has not been primarily designed for this. §2 is about definitions in general. Two kinds of definition should be carefully distinguished: lexical definitions (based upon current uses of a word), and stipulative or legislative definitions, which deliberately assign a meaning to a word, for the purpose of clarifying scientific or philosophical arguments. The present volume provides examples of these two kinds of definitions. §3 examines three traditional philosophical definitions of life, all of which have been elaborated prior to the emergence of biology as a specific scientific discipline: life as animation (Aristotle), life as mechanism, and life as organization (Kant). All three concepts constitute a common heritage that structures in depth a good deal of our cultural intuitions and vocabulary any time we try to think about “life”. The present volume offers examples of these three concepts in contemporary scientific discourse. The second part of the paper proposes a synthesis of the major debates developed in this volume. Three major questions have been discussed. A first issue (§4) is whether we should define life or not, and why. Most authors are skeptical about the possibility of defining life in a strong way, although all admit that criteria are useful in contexts such as exobiology, artificial life and the origins of life. §5 examines the possible kinds of definitions of life presented in the volume. Those authors who have explicitly defended that a definition of life is needed, can be classified into two categories. The first category (or standard view) refers to two conditions

  12. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant...... for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the- art store and forward Internet paradigm....... The inherent flexibility of both SDN and NC provides fertile ground to envision more efficient, robust, and secure networking designs, which may also incorporate content caching and storage, all of which are key challenges of the upcoming 5G networks. This article not only proposes the fundamentals...

  13. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani;

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage...... buffering, scheduling, and processing over the network. On the other hand, NC has shown great potential for increasing robustness and performance when deployed on intermediate nodes in the network. This new paradigm changes the dynamics of network protocols, requiring new designs that exploit its potential....... This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  14. Defining biocultural approaches to conservation.

    Science.gov (United States)

    Gavin, Michael C; McCarter, Joe; Mead, Aroha; Berkes, Fikret; Stepp, John Richard; Peterson, Debora; Tang, Ruifei

    2015-03-01

    We contend that biocultural approaches to conservation can achieve effective and just conservation outcomes while addressing erosion of both cultural and biological diversity. Here, we propose a set of guidelines for the adoption of biocultural approaches to conservation. First, we draw lessons from work on biocultural diversity and heritage, social-ecological systems theory, integrated conservation and development, co-management, and community-based conservation to define biocultural approaches to conservation. Second, we describe eight principles that characterize such approaches. Third, we discuss reasons for adopting biocultural approaches and challenges. If used well, biocultural approaches to conservation can be a powerful tool for reducing the global loss of both biological and cultural diversity.

  15. Miniature EVA Software Defined Radio

    Science.gov (United States)

    Pozhidaev, Aleksey

    2012-01-01

    As NASA embarks upon developing the Next-Generation Extra Vehicular Activity (EVA) Radio for deep space exploration, the demands on EVA battery life will substantially increase. The number of modes and frequency bands required will continue to grow in order to enable efficient and complex multi-mode operations including communications, navigation, and tracking applications. Whether supporting astronaut excursions, communications with soldiers, or first responders addressing emergency hazards, NASA has developed an innovative, affordable, miniaturized, power-efficient software defined radio that offers unprecedented flexibility. This lightweight, programmable, S-band, multi-service, frequency-agile EVA software defined radio (SDR) supports data, telemetry, voice, and both standard and high-definition video. Features include a modular design and an easily scalable architecture, and the EVA SDR allows for both stationary and mobile battery-powered handheld operations. Currently, the radio is equipped with an S-band RF section. However, its scalable architecture can accommodate multiple RF sections simultaneously to cover multiple frequency bands. The EVA SDR also supports multiple network protocols. It currently implements a hybrid mesh network based on the 802.11s open standard protocol. The radio targets RF channel data rates up to 20 Mbps and can be equipped with a real-time operating system (RTOS) that can be switched off for power-aware applications. The EVA SDR's modular design permits implementation of the same hardware at all network nodes. This approach assures the portability of the same software into any radio in the system. It also brings several benefits to the entire system, including reduced system maintenance, system complexity, and development cost.

  16. Asymptomatic Alzheimer disease: Defining resilience.

    Science.gov (United States)

    Hohman, Timothy J; McLaren, Donald G; Mormino, Elizabeth C; Gifford, Katherine A; Libon, David J; Jefferson, Angela L

    2016-12-06

    To define robust resilience metrics by leveraging CSF biomarkers of Alzheimer disease (AD) pathology within a latent variable framework, and to demonstrate the ability of such metrics to predict slower rates of cognitive decline and protection against diagnostic conversion. Participants with normal cognition (n = 297) and mild cognitive impairment (n = 432) were drawn from the Alzheimer's Disease Neuroimaging Initiative. Resilience metrics were defined at baseline by examining the residuals when regressing brain aging outcomes (hippocampal volume and cognition) on CSF biomarkers. A positive residual reflected better outcomes than expected for a given level of pathology (high resilience). Residuals were integrated into a latent variable model of resilience and validated by testing their ability to independently predict diagnostic conversion, cognitive decline, and the rate of ventricular dilation. Latent variables of resilience predicted a decreased risk of conversion (hazard ratio 0.02, p < 0.001) and slower rates of ventricular dilation (β < −4.7, p < 2 × 10⁻¹⁵). These results were significant even when analyses were restricted to clinically normal individuals. Furthermore, resilience metrics interacted with biomarker status such that biomarker-positive individuals with low resilience showed the greatest risk of subsequent decline. Robust phenotypes of resilience calculated by leveraging AD biomarkers and baseline brain aging outcomes provide insight into which individuals are at greatest risk of short-term decline. Such comprehensive definitions of resilience are needed to further our understanding of the mechanisms that protect individuals from the clinical manifestation of AD dementia, especially among biomarker-positive individuals.
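
    The residual construction described above is easy to sketch. The following is illustrative only (variable names and the simulated data are hypothetical, and the paper's full latent-variable model is not reproduced): regress a brain-aging outcome on CSF biomarkers and keep the residual as the resilience score.

        import numpy as np

        def resilience_scores(biomarkers, outcome):
            """Residual-based resilience: regress outcome on biomarkers by
            ordinary least squares; a positive residual means a better outcome
            than expected for that pathology load (higher resilience)."""
            X = np.column_stack([np.ones(len(outcome)), biomarkers])   # intercept + predictors
            beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
            return outcome - X @ beta                                   # residuals

        rng = np.random.default_rng(42)
        csf = rng.normal(size=(200, 2))                # hypothetical amyloid/tau columns
        volume = 1.0 - 0.3 * csf[:, 0] + 0.2 * csf[:, 1] + rng.normal(0, 0.1, 200)
        scores = resilience_scores(csf, volume)
        print("high-resilience subjects:", int(np.sum(scores > 0)))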

  17. Defining Starch Binding by Glucan Phosphatases

    DEFF Research Database (Denmark)

    Auger, Kyle; Raththagala, Madushi; Wilkens, Casper

    2015-01-01

    ...phosphatases. The main objective of this study was to quantify the binding affinity of the different enzymes involved in this cyclic process. We established a protocol to quickly, reproducibly, and quantitatively measure the binding of the enzymes to glucans utilizing Affinity Gel Electrophoresis (AGE). ... The glucan phosphatases showed similar affinities for the short oligosaccharide β-cyclodextrin. We performed structure-guided mutagenesis to define the mechanism of these differences. We found that the carbohydrate binding module (CBM) domain provided a stronger binding affinity compared to surface binding...

  18. Associations among measures of sequential processing in motor and linguistics tasks in adults with and without a family history of childhood apraxia of speech: a replication study.

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H

    2013-03-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically related adults from a family with familial CAS showed motor sequencing deficits in an alternating motor speech task. Compared with the other adults, these three participants showed deficits in tasks requiring high loads of sequential processing, including nonword imitation, nonword reading and spelling. Qualitative error analyses in real word and nonword imitations revealed group differences in phoneme sequencing errors. Motor sequencing ability was correlated with phoneme sequencing errors during real word and nonword imitation, reading and spelling. Correlations were characterized by extremely high scores in one family and extremely low scores in another. Results are consistent with a central deficit in sequential processing in CAS of familial origin.

  19. Defining Acquisition and Contracting Terms Associated with Contract Administration

    Science.gov (United States)

    1990-09-01

    scope is already defined. Sidney Landau, in his book Dictionaries: The Art and Craft of Lexicography, mentions that the hard part is to determine the... good as the amount of time and money available. James Sledd, an authority on the history of lexicography, says that "useful things in lexicography... word being described and that word alone. Isolating cognitive thought is difficult to accomplish but can be very effective when successful. A more...

  20. History and Imagination: Reenactments for Elementary Social Studies

    Science.gov (United States)

    Morris, Ronald Vaughan

    2012-01-01

    In "History and Imagination," elementary school social studies teachers will learn how to help their students break down the walls of their schools, more personally engage with history, and define democratic citizenship. By collaborating together in meaningful investigations into the past and reenacting history, students will become…

  1. Defining a Maturity Scale for Governing Operational Resilience

    Science.gov (United States)

    2015-03-01

    of technology and innovation continues to accelerate. Sponsorship, strategic planning, and oversight of operational resilience are the most crucial... identify shortfalls across these defined activities, make incremental improvements, and measure improvement against a defined, accepted maturity scale. The... the granularity needs for organizations committed to making incremental improvements in governing operational resilience. To achieve a more...

  2. Defining Child Neglect Based on Child Protective Services Data

    Science.gov (United States)

    Dubowitz, H.; Pitts, S.C.; Litrownik, A.J.; Cox, C.E.; Runyan, D.; Black, M.M.

    2005-01-01

    Objectives: To compare neglect defined by Child Protective Services official codes with neglect defined by a review of CPS narrative data, and to examine the validity of the different neglect measures using children's functioning at age 8 years. Methods: Data are from 740 children participating in a consortium of longitudinal studies on child…

  3. Defining and Assessing Team Skills of Business and Accountancy Students

    Science.gov (United States)

    Alghalith, Nabil; Blum, Michael; Medlock, Amanda; Weber, Sandy

    2004-01-01

    The objectives of the project are (1) to define the skills necessary for students to work effectively with others to achieve common goals, and (2) to develop an assessment instrument to measure student progress toward achieving these skills. The defined skill set will form a basis for common expectations related to team skills that will be shared…

  4. Software Defined Common Processing System (SDCPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated proposes the Software Defined Common Processing System (SDCPS) program to facilitate the development of a Software Defined Radio...

  5. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  6. Popular history magazines and history education

    Directory of Open Access Journals (Sweden)

    Robert Thorp

    2016-05-01

    Full Text Available This paper argues that popular history magazines may be a welcome complement to other forms of historical media in history teaching. By outlining a theoretical framework that captures uses of history, the paper analyses popular history magazine articles from five European countries all dealing with the outbreak of World War I. The study finds that while the studied articles provide a rather heterogeneous view of the causes of the Great War, they can be used to discuss and analyse the importance of perspective in history, thus offering an opportunity to further a more disciplinary historical understanding.

  7. Using experimental design to define boundary manikins.

    Science.gov (United States)

    Bertilsson, Erik; Högberg, Dan; Hanson, Lars

    2012-01-01

    When evaluating human-machine interaction it is essential to consider anthropometric diversity to ensure intended accommodation levels. A well-known method is the use of boundary cases, where manikins with extreme but likely measurement combinations are derived by mathematical treatment of anthropometric data. The supposition of that method is that the use of these manikins will facilitate accommodation of the expected part of the total, less extreme, population. Literature sources differ in how many such manikins should be defined and in what way. A field similar to the boundary case method is experimental design, in which the relationships between the factors affecting a process are studied by a systematic approach. This paper examines the possibilities of adopting methodology used in experimental design to define a group of manikins. Different experimental designs were adopted to be used together with a confidence region and its axes. The results from the study show that it is possible to adapt the methodology of experimental design when creating groups of manikins. The size of these groups depends heavily on the number of key measurements but also on the type of chosen experimental design.
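
    One common mathematical treatment behind such boundary cases can be sketched as follows (a sketch under assumed details, not the paper's exact procedure): place manikins at the ends of the principal axes of a confidence ellipsoid fitted to the anthropometric data, scaled so the ellipsoid encloses the target accommodation level.

        import numpy as np
        from scipy.stats import chi2

        def boundary_manikins(data, accommodation=0.95):
            """Boundary cases on the principal axes of the confidence ellipsoid
            of multivariate anthropometric data (rows = people, cols = measures)."""
            mean = data.mean(axis=0)
            cov = np.cov(data, rowvar=False)
            evals, evecs = np.linalg.eigh(cov)                  # principal axes
            r = np.sqrt(chi2.ppf(accommodation, df=data.shape[1]))
            manikins = []
            for lam, axis in zip(evals, evecs.T):
                step = r * np.sqrt(lam) * axis                  # half-axis of the ellipsoid
                manikins += [mean + step, mean - step]
            return np.array(manikins)

        rng = np.random.default_rng(0)                          # fake stature/mass data
        data = rng.multivariate_normal([1750, 75], [[4900, 300], [300, 144]], size=500)
        print(boundary_manikins(data).round(1))                 # 2 measures -> 4 manikins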

  8. Defining and identifying Sleeping Beauties in science

    CERN Document Server

    Ke, Qing; Radicchi, Filippo; Flammini, Alessandro

    2015-01-01

    A Sleeping Beauty (SB) in science refers to a paper whose importance is not recognized for several years after publication. Its citation history exhibits a long hibernation period followed by a sudden spike of popularity. Previous studies suggest a relative scarcity of SBs. The reliability of this conclusion is, however, heavily dependent on identification methods based on arbitrary threshold parameters for sleeping time and number of citations, applied to small or monodisciplinary bibliographic datasets. Here we present a systematic, large-scale, and multidisciplinary analysis of the SB phenomenon in science. We introduce a parameter-free measure that quantifies the extent to which a specific paper can be considered an SB. We apply our method to 22 million scientific papers published in all disciplines of natural and social sciences over a time span longer than a century. Our results reveal that the SB phenomenon is not exceptional. There is a continuous spectrum of delayed recognition where both the hiberna...
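
    For readers who want the flavor of such a parameter-free measure, the sketch below implements the "beauty coefficient" B as it is usually stated in discussions of this paper: each year's citations are compared against the straight line joining the citations at publication to the citations at the peak year. The formula is reproduced from memory and should be checked against the original before any serious use.

        def beauty_coefficient(c):
            """Beauty coefficient B for a yearly citation history c[0..];
            larger B = deeper, longer hibernation before the citation peak."""
            tm = max(range(len(c)), key=lambda t: c[t])     # year of citation peak
            if tm == 0:
                return 0.0
            slope = (c[tm] - c[0]) / tm                     # reference line c[0] -> peak
            return sum((slope * t + c[0] - c[t]) / max(1, c[t]) for t in range(tm + 1))

        # a paper dormant for eight years, then suddenly popular
        print(beauty_coefficient([0, 0, 1, 0, 1, 0, 0, 1, 2, 30]))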

  9. İnovasyon Süreci Performansı Ölçüm Kriterlerini Nitel Bir Araştırma İle Belirleme: Bilişim Sektöründen Bulgular - Defining Innovation Process Performance Measurement Criteria with a Qualitative Research: Findings from IT Sector

    Directory of Open Access Journals (Sweden)

    Yunus Emre TAŞGİT

    2016-06-01

    Full Text Available The aim of this study is to define innovation process performance measurement criteria for firms and to measure their performance against these criteria. IT firms in technoparks in the TR42 East Marmara Region are included in the study, and a qualitative research method is used. Data were collected through interviews conducted with managers of IT firms and were analyzed with descriptive and content analysis techniques. After the analysis, measurement criteria are introduced to measure innovation process performance. Results show that the “Idea Generation” stage is not taken seriously by these firms, while the performance of the “Beta Version Development” and “Full Version Development” stages is high. Firms need to analyze the “Sale” stage carefully.

  10. Defining Tobacco Regulatory Science Competencies.

    Science.gov (United States)

    Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger

    2017-02-01

    In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research.

  11. Defining Platelet Function During Polytrauma

    Science.gov (United States)

    2013-02-01

    using calibrated automated thrombography (CAT). 3. Platelet-induced clot contraction, using viscoelastic measures such as TEG with Platelet Mapping... using calibrated automated thrombography (CAT) in platelet-rich plasma. 3. Platelet-induced clot contraction and effect on clot structure by platelet... if injury with stable vital signs on initial evaluation. - Pregnancy (confirmed with urine pregnancy testing) - Documented do-not-resuscitate order

  12. History, legislation and offense: deprivation of liberty and socio-educational measures aimed at children and adolescents in the 20th century

    Directory of Open Access Journals (Sweden)

    Camila Serafim Daminelli

    2017-08-01

    Full Text Available During the twentieth century, the Brazilian State sought to reeducate minor offenders by placing them in centers built for this purpose. First under the Minor's Rights doctrine [Direito do Menor] (1927), then through the Doctrine of the Irregular Situation [Doutrina da Situação Irregular] (1979), offenders were a priority target for internment, because of their perceived potential for public disorder and because they were seen as feeding adult crime. Since the enactment of the Child and Teenager Statute [Estatuto da Criança e do Adolescente] (1990), socio-educational measures in an open regime have been established aiming at the reintegration of the offender into social life, with internment presented as a last resort. This article analyzes the measures provided by law for the accountability of children and young people in Brazil throughout the twentieth century, and offers some considerations about the socio-educational measures prescribed by the current law.

  13. Oscillator metrology with software defined radio.

    Science.gov (United States)

    Sherman, Jeff A; Jördens, Robert

    2016-05-01

    Analog electrical elements such as mixers, filters, transfer oscillators, isolating buffers, dividers, and even transmission lines contribute technical noise and unwanted environmental coupling in time and frequency measurements. Software defined radio (SDR) techniques replace many of these analog components with digital signal processing (DSP) on rapidly sampled signals. We demonstrate that, generically, commercially available multi-channel SDRs are capable of time and frequency metrology, outperforming purpose-built devices by as much as an order of magnitude. For example, for signals at 10 MHz and 6 GHz, we observe SDR time deviation noise floors of about 20 fs and 1 fs, respectively, in under 10 ms of averaging. Examining the other complex signal component, we find a relative amplitude measurement instability of 3 × 10⁻⁷ at 5 MHz. We discuss the scalability of an SDR-based system for simultaneous measurement of many clocks. SDR's frequency agility allows for comparison of oscillators at widely different frequencies. We demonstrate a novel and extreme example with optical clock frequencies differing by many terahertz: using a femtosecond-laser frequency comb and SDR, we show femtosecond-level time comparisons of ultra-stable lasers with zero measurement dead-time.

  14. Kiropraktikkens historie i Danmark

    DEFF Research Database (Denmark)

    Jørgensen, Per

    The book is the first comprehensive, research-based account of the history of chiropractic in Denmark, with an outlook to the history of chiropractic in the USA.

  15. Microforms and Sport History.

    Science.gov (United States)

    Levine, Peter

    1986-01-01

    Explores the importance of sport history as it reflects the social and cultural history of the United States. Discussion covers the various sport history materials that are available in microform, including the Spalding Collection, twentieth-century microfilm sources, and sports and social history (Sports Periodicals microfilm series). (EJS)

  16. What Is Literary "History"?

    Science.gov (United States)

    Harris, Wendell V.

    1994-01-01

    Examines the meaning of the word "history" as used in the common phrase "literary history" by critics and scholars. Asserts the differences between historical scholarship and literary history. Argues that the grounding activity of literary history is insulated from the relativism insisted upon by poststructuralist theorizing.…

  17. (Sample) Size Matters: Defining Error in Planktic Foraminiferal Isotope Measurement

    Science.gov (United States)

    Lowery, C.; Fraass, A. J.

    2015-12-01

    Planktic foraminifera have been used as carriers of stable isotopic signals since the pioneering work of Urey and Emiliani. In those heady days, instrumental limitations required hundreds of individual foraminiferal tests to return a usable value. This had the fortunate side-effect of smoothing any seasonal to decadal changes within the planktic foram population, which generally turns over monthly, removing that potential noise from each sample. With the advent of more sensitive mass spectrometers, smaller sample sizes have now become standard. This has been a tremendous advantage, allowing longer time series with the same investment of time and energy. Unfortunately, the use of smaller numbers of individuals to generate a data point has lessened the amount of time averaging in the isotopic analysis and decreased precision in paleoceanographic datasets. With fewer individuals per sample, the differences between individual specimens will result in larger variation, and therefore error, and less precise values for each sample. Unfortunately, most workers (the authors included) do not make a habit of reporting the error associated with their sample size. We have created an open-source model in R to quantify the effect of sample sizes under various realistic and highly modifiable parameters (calcification depth, diagenesis in a subset of the population, improper identification, vital effects, mass, etc.). For example, a sample in which only 1 in 10 specimens is diagenetically altered can be off by >0.3‰ δ18O VPDB, or ~1°C. Additionally, and perhaps more importantly, we show that under unrealistically ideal conditions (perfect preservation, etc.) it takes ~5 individuals from the mixed layer to achieve an error of less than 0.1‰. Including just the unavoidable vital effects inflates that number to ~10 individuals to achieve ~0.1‰. Combining these errors with the typical machine error inherent in mass spectrometers makes this a vital consideration moving forward.
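
    The authors' model is in R; the toy Monte Carlo below (Python, with illustrative parameters that are not the authors') reproduces the qualitative point: the error of the sample mean shrinks roughly as 1/sqrt(n), while a small diagenetically altered fraction adds a bias that extra individuals do not remove.

        import numpy as np

        rng = np.random.default_rng(7)

        def mean_error(n, pop_sd=0.5, diag_frac=0.0, diag_offset=1.0, trials=10_000):
            """RMS error of the n-individual sample mean of d18O (per mil).
            pop_sd: between-individual (seasonal) spread;
            diag_frac: fraction of tests shifted by diag_offset by diagenesis."""
            vals = rng.normal(0.0, pop_sd, size=(trials, n))
            vals += (rng.random((trials, n)) < diag_frac) * diag_offset
            means = vals.mean(axis=1)                  # true population mean is 0
            return np.sqrt(np.mean(means ** 2))

        for n in (1, 5, 10, 30):
            print(f"n={n:2d}: ideal {mean_error(n):.2f} per mil, "
                  f"10% diagenesis {mean_error(n, diag_frac=0.1):.2f} per mil")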

  18. Toward Defining, Measuring, and Evaluating LGBT Cultural Competence for Psychologists

    Science.gov (United States)

    Boroughs, Michael S.; Andres Bedoya, C.; O'Cleirigh, Conall; Safren, Steven A.

    2015-01-01

    A central part of providing evidence-based practice is appropriate cultural competence to facilitate psychological assessment and intervention with diverse clients. At a minimum, cultural competence with lesbian, gay, bisexual, and transgender (LGBT) people involves adequate scientific and supervised practical training, with increasing depth and complexity across training levels. In order to further this goal, we offer 28 recommendations of minimum standards moving toward ideal training for LGBT-specific cultural competence. We review and synthesize the relevant literature to achieve and assess competence across the various levels of training (doctoral, internship, post-doctoral, and beyond) in order to guide the field towards best practices. These recommendations are aligned with educational and practice guidelines set forth by the field and informed by other allied professions in order to provide a roadmap for programs, faculty, and trainees in improving the training of psychologists to work with LGBT individuals. PMID:26279609

  19. Defining Neighborhood Boundaries for Social Measurement: Advancing Social Work Research

    Science.gov (United States)

    Foster, Kirk A.; Hipp, J. Aaron

    2011-01-01

    Much of the current neighborhood-based research uses variables aggregated on administrative boundaries such as zip codes, census tracts, and block groups. However, other methods using current technological advances in geographic sciences may broaden our ability to explore the spatial concentration of neighborhood factors affecting individuals and…

  20. Defining and Measuring the Success of Service Contracts

    Science.gov (United States)

    2012-06-01

    ...how to meet the objectives of the service contract. Pertinent activities include conducting advertising to identify new sources and compiling a list...

  1. Using Defined Processes as a Context for Resilience Measures

    Science.gov (United States)

    2011-12-01

    processes. 1 "W. Edwards Deming." BrainyQuote.com. Xplore Inc, 2010. Accessed September 22, 2011. http://www.brainyquote.com/quotes/quotes/w...process owner of this process element or another related organizational home page, e.g., Software Engineering Institute, IEEE or a government regulatory

  2. Defining the toxicology of aging.

    Science.gov (United States)

    Sorrentino, Jessica A; Sanoff, Hanna K; Sharpless, Norman E

    2014-07-01

    Mammalian aging is complex and incompletely understood. Although significant effort has been spent addressing the genetics or, more recently, the pharmacology of aging, the toxicology of aging has been relatively understudied. Just as an understanding of 'carcinogens' has proven crucial to modern cancer biology, an understanding of environmental toxicants that accelerate aging ('gerontogens') will inform gerontology. In this review, we discuss the evidence for the existence of mammalian gerontogens, as well as describe the biomarkers needed to measure the age-promoting activity of a given toxicant. We focus on the effects of putative gerontogens on the in vivo accumulation of senescent cells, a characteristic feature of aging that has a causal role in some age-associated phenotypes.

  3. Radioactivity and health: A history

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, J.N.; Baalman, R.W. Jr. (ed.)

    1988-10-01

    This book is designed to be primarily a history of research facts, measurements, and ideas and the people who developed them. "Research" is defined very broadly to include everything from bench-top laboratory experiments to worldwide environmental investigations. The book is not a monograph or a critical review. The findings and conclusions are presented largely as the investigators saw and reported them. Frequently, the discussion utilizes the terminology and units of the time, unless they are truly antiquated or potentially unclear. It is only when the work being reported is markedly iconoclastic or obviously wrong that I chose to make special note of it or to correct it. Nevertheless, except for direct quotations, the language is mine, and I take full responsibility for it. The working materials for this volume included published papers in scientific journals, books, published conferences and symposia, personal interviews with over 100 individuals, some of them more than once (see Appendix A), and, particularly for the 1940-1950 decade and for the large government-supported laboratories to the present day, "in-house" reports. These reports frequently represent the only comprehensive archive of what was done and why. Unfortunately, this source is drying up because of storage problems and must be retrieved by ever more complex and inconvenient means. For this reason, special efforts have been taken to review and document these sources, though even now some sections of the field are partially inaccessible. Nevertheless, the volume of all materials available for this review was surprisingly large and the quality much better than might have been expected for so complex and disparate a field approached under conditions of considerable urgency.

  4. A brief history of stratospheric ozone research

    Directory of Open Access Journals (Sweden)

    Rolf Müller

    2009-03-01

    Full Text Available Ozone is one of the most important trace species in the atmosphere. Therefore, the history of research on ozone has also received a good deal of attention. Here a short overview of ozone research (with a focus on the stratosphere) is given, starting from the first atmospheric measurements and ending with current developments. It is valuable to study the history of ozone research, because much can be learned for current research from an understanding of how previous discoveries were made. Moreover, since the 1970s, the history of ozone research has also encompassed the history of the human impact on the ozone layer and thus the history of policy measures taken to protect the ozone layer, notably the Montreal Protocol and its amendments and adjustments. The history of this development is particularly important because it may serve as a prototype for the development of policy measures for the protection of the Earth's climate.

  5. Defining professional pharmacy services in community pharmacy.

    Science.gov (United States)

    Moullin, Joanna C; Sabater-Hernández, Daniel; Fernandez-Llimos, Fernando; Benrimoj, Shalom I

    2013-01-01

    Multiple terms and definitions exist to describe specific aspects of pharmacy practice and service provision, yet none encompass the full range of professional services delivered by community pharmacy. The majority of current pharmacy service definitions and nomenclature refer to either the professional philosophy of pharmaceutical care or to specific professional pharmacy services; particularly pharmaceutical services provided by pharmacists with a focus on drug safety, effectiveness and health outcomes. The objective of this paper is therefore to define a professional pharmacy service within the context of the community pharmacy model of service provision. A professional pharmacy service is defined as "an action or set of actions undertaken in or organised by a pharmacy, delivered by a pharmacist or other health practitioner, who applies their specialised health knowledge personally or via an intermediary, with a patient/client, population or other health professional, to optimise the process of care, with the aim to improve health outcomes and the value of healthcare." Based on Donabedian's framework, the professional pharmacy service definition incorporates the concepts of organizational structure, process indicators and outcome measures. The definition will assist in many areas including recognition of the full range of services provided by community pharmacy and facilitating the identification of indicators of professional pharmacy service implementation and sustainable provision. A simple conceptual model for incorporating all services provided by community pharmacy is proposed. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturb permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been done. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PC analysis is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching). The delineation of domains requires calculation of the Hessian, which could be computationally costly and restricts the current approach to
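
    The delineation step described above, PCA of the (symmetric) Hessian followed by grouping grid blocks according to their loadings on the leading eigenvectors, can be sketched as follows; the random matrix merely stands in for a real history-matching Hessian, and the grouping rule is a simplified assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the history-matching Hessian: symmetric, one row/column per
# grid block (illustrative random matrix, not real reservoir sensitivities).
n_blocks = 200
A = rng.normal(size=(n_blocks, n_blocks))
H = (A + A.T) / 2.0

# PCA of a symmetric matrix reduces to its eigendecomposition.
eigvals, eigvecs = np.linalg.eigh(H)
order = np.argsort(np.abs(eigvals))[::-1]  # most sensitive directions first
leading = eigvecs[:, order[:3]]            # keep a few leading components

# Group grid blocks by the leading eigenvector on which each loads most
# strongly; each group is a candidate domain with its own r_D parameter.
domains = np.argmax(np.abs(leading), axis=1)
for d in range(leading.shape[1]):
    print(f"domain {d}: {(domains == d).sum()} grid blocks")
```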

  7. Association between obesity and magnetic resonance imaging defined patellar tendinopathy in community-based adults: a cross-sectional study.

    Science.gov (United States)

    Fairley, Jessica; Toppi, Jason; Cicuttini, Flavia M; Wluka, Anita E; Giles, Graham G; Cook, Jill; O'Sullivan, Richard; Wang, Yuanyuan

    2014-08-07

    Patellar tendinopathy is a common cause of activity-related anterior knee pain. Evidence is conflicting as to whether obesity is a risk factor for this condition. The aim of this study was to determine the relationship between obesity and prevalence of magnetic resonance imaging (MRI) defined patellar tendinopathy in community-based adults. 297 participants aged 50-79 years with no history of knee pain or injury were recruited from an existing community-based cohort. Measures of obesity included measured weight and body mass index (BMI), self-reported weight at age of 18-21 years and heaviest lifetime weight. Fat-free mass and fat mass were measured using bioelectrical impedance. Participants underwent MRI of the dominant knee. Patellar tendinopathy was defined on both T1- and T2-weighted images. The prevalence of MRI defined patellar tendinopathy was 28.3%. Current weight (OR per kg = 1.04, 95% CI 1.01-1.06, P = 0.002), BMI (OR per kg/m2 = 1.10, 95% CI 1.04-1.17, P = 0.002), heaviest lifetime weight (OR per kg = 1.03, 95% CI 1.01-1.05, P = 0.007) and weight at age of 18-21 years (OR per kg = 1.03, 95% CI 1.00-1.07, P = 0.05) were all positively associated with the prevalence of patellar tendinopathy. Neither fat mass nor fat-free mass was associated with patellar tendinopathy. MRI defined patellar tendinopathy is common in community-based adults and is associated with current and past history of obesity assessed by BMI or body weight, but not fat mass. The findings suggest a mechanical pathogenesis of patellar tendinopathy and patellar tendinopathy may be one mechanism for obesity related anterior knee pain.

  8. [Natural history of HBV in dialysis population].

    Science.gov (United States)

    Fabrizi, F; Martin, P; Lunghi, G; Ponticelli, C

    2004-01-01

    Dialysis patients remain at risk of acquiring hepatitis B virus (HBV) infection. The issue of the natural history of HBV among patients undergoing long-term dialysis remains unclear. Assessing the natural history of hepatitis B in patients on maintenance dialysis is problematic because of the unique characteristics of this population: serum aminotransferase activity is lower in dialysis patients compared with patients without renal disease; also, chronic hepatitis B has an insidious and prolonged natural history, and the competing mortality from complications of end-stage renal disease may obscure the long-term consequences of hepatitis B. HBV-related liver disease frequently runs an asymptomatic course in dialysis patients and the liver-related mortality in this population is very low; thus, the prognosis for chronic HBV infection in dialysis patients has been reported as benign. However, the frequency of liver cancer in dialysis patients appears higher than that observed in the general population; this has been related to a greater exposure to HBV/HCV. Cirrhosis is not a frequent comorbid condition in the dialysis population of industrialised countries, but the death rate for dialysis patients with cirrhosis is 35% higher than for those without it. In addition, it has been observed that liver disease remains a significant cause of mortality among HBsAg-positive carriers on dialysis in developing countries. The low viral load measured in dialysis patients with persistent HBsAg carriage could account for the relatively benign course of HBV-related liver disease in this population. Prospective clinical trials are under way to better define the virological features of HBV in the dialysis population.

  9. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Document Server

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  10. Quantum computing. Defining and detecting quantum speedup.

    Science.gov (United States)

    Rønnow, Troels F; Wang, Zhihui; Job, Joshua; Boixo, Sergio; Isakov, Sergei V; Wecker, David; Martinis, John M; Lidar, Daniel A; Troyer, Matthias

    2014-07-25

    The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here, we show how to define and measure quantum speedup and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. By using random spin glass instances as a benchmark, we found no evidence of quantum speedup when the entire data set is considered and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question.

  11. Defining Starch Binding by Glucan Phosphatases

    DEFF Research Database (Denmark)

    Auger, Kyle; Raththagala, Madushi; Wilkens, Casper;

    2015-01-01

    Starch is a vital energy molecule in plants that has a wide variety of uses in industry, such as feedstock for biomaterial processing and biofuel production. Plants employ a three-enzyme cyclic process utilizing kinases, amylases, and phosphatases to degrade starch in a diurnal manner. Starch is composed of the branched glucan amylopectin and the more linear glucan amylose. Our lab has determined the first structures of these glucan phosphatases and we have defined their enzymatic action. Despite this progress, we lacked a means to quickly and efficiently quantify starch binding to glucan phosphatases. The main objective of this study was to quantify the binding affinity of different enzymes that are involved in this cyclic process. We established a protocol to quickly, reproducibly, and quantitatively measure the binding of the enzymes to glucans utilizing Affinity Gel Electrophoresis (AGE).

  12. Defining and Distinguishing Secular and Religious Terrorism

    Directory of Open Access Journals (Sweden)

    Heather S. Gregg

    2014-04-01

    Full Text Available Religious terrorism is typically characterised as acts of unrestrained, irrational and indiscriminate violence, thus offering few if any policy options for counterterrorism measures. This assumption about religious terrorism stems from two challenges in the literature: disproportionate attention to apocalyptic terrorism, and a lack of distinction between religious terrorism and its secular counterpart. This article, therefore, aims to do four things: define and differentiate religiously motivated terrorism from traditional terrorism; investigate three goals of religious terrorism (fomenting the apocalypse, creating a religious government, and establishing a religiously pure state); consider the role of leadership and target selection of religious terrorists; and, finally, suggest a range of counterterrorism strategies based on these observations.

  13. Nurse leader resilience: career defining moments.

    Science.gov (United States)

    Cline, Susan

    2015-01-01

    Resilience is an essential component of effective nursing leadership. It is defined as the ability to survive and thrive in the face of adversity. Resilience can be developed and internalized as a measure to improve retention and reduce burnout. Nurse leaders at all levels should develop these competencies to survive and thrive in an increasingly complex health care environment. Building positive relationships, maintaining positivity, developing emotional insight, creating work-life balance, and reflecting on successes and challenges are effective strategies for resilience building. Nurse leaders have a professional obligation to develop resilience in themselves, the teams they supervise, and the organization as a whole. Additional benefits include reduced turnover, reduced cost, and improved quality outcomes through organizational mindfulness.

  14. Book History and African American Studies

    Directory of Open Access Journals (Sweden)

    Claire Parfait

    2009-06-01

    Full Text Available In the fifty years that have elapsed since the publication of Lucien Febvre and Henri-Jean Martin’s L’Apparition du livre, the field of book history has expanded tremendously, on both sides of the Atlantic. As defined by the Society for the History of Authorship, Reading and Publishing (SHARP), “The history of the book is not only about books per se: broadly speaking, it concerns the creation, dissemination, and reception of script and print, including newspapers, periodicals, and ephemera. B...

  15. Music and physics: a cultural, interdisciplinary history.

    Science.gov (United States)

    Jackson, Myles W

    2008-06-01

    This essay investigates the triangular exchange among physicists, musicians, and instrument makers in nineteenth-century Germany by proffering a material, cultural, and interdisciplinary history. It does so by analyzing four concrete examples of such exchanges: the relationship between musical automata and virtuosi, the reed pipe as an object of music and scientific measurement, the history of standardizing performance pitch, and the attempts to measure musical virtuosity. The goal of the essay is to suggest ways in which interdisciplinary cultural histories, which take the scientific content seriously, can be an improvement upon purely disciplinary histories.

  16. Poisson Bracket on the Space of Histories

    CERN Document Server

    Marolf, D

    1994-01-01

    We extend the Poisson bracket from a Lie bracket of phase space functions to a Lie bracket of functions on the space of canonical histories and investigate the resulting algebras. Typically, such extensions define corresponding Lie algebras on the space of Lagrangian histories via pull-back to a space of partial solutions. Such spaces of histories are familiar from path integration and some studies of decoherence. For gauge systems, we extend both the canonical and reduced Poisson brackets to the full space of histories. We then comment on the use of such algebras in time reparameterization invariant systems and systems with a Gribov ambiguity, though our main goal is to introduce concepts and techniques for use in a companion paper.

  17. Equivalence of History and Generator Epsilon-Machines

    CERN Document Server

    Travers, Nicholas F

    2011-01-01

    Epsilon-machines are minimal, unifilar representations of stationary stochastic processes. They were originally defined in the history machine sense, as machines whose states are the equivalence classes of infinite histories with the same probability distribution over futures. In analyzing synchronization, though, an alternative generator definition was given: unifilar edge-label hidden Markov models with probabilistically distinct states. The key difference is that history epsilon-machines are defined by a process, whereas generator epsilon-machines define a process. We show here that these two definitions are equivalent.

  18. Comparison of Hipparcos Trigonometric and Mount Wilson Spectroscopic Parallaxes for 90 Subgiants that Defined the Class in 1935

    CERN Document Server

    Sandage, Allan; Majewski, Steven R

    2015-01-01

    A history is given of the discovery between 1914 and 1935 of stars of intermediate luminosity between giants and dwarfs with spectral types between G0 and K3. The Mt Wilson spectroscopists identified about 90 such stars in their 1935 summary paper of spectroscopic absolute magnitudes for 4179 stars. Called "subgiants" by Strömberg, these 90 stars defined the group at the time. The position of the Mt Wilson subgiants in the HR diagram caused difficulties in comparisons with the high-weight trigonometric parallaxes being measured and with Russell's prevailing evolution proposal, and critics questioned the reality of the Mt Wilson subgiants. We compare, star-by-star, the Mt Wilson spectroscopic absolute magnitudes of the 90 stars defining their sample against those absolute magnitudes derived from Hipparcos (HIP) trigonometric parallaxes. We address concerns over biases in the Mt Wilson calibration sample and biases created by the adopted methodology for calibration. Historically, these concerns were sufficient to di...

  19. On defining semantics of extended attribute grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1980-01-01

    Knuth has introduced attribute grammars (AGs) as a tool to define the semantics of context-free languages. The use of AGs in connection with programming language definitions has mostly been to define the context-sensitive syntax of the language and to define a translation in code for a hypothetical...

  20. Defining moments in leadership character development.

    Science.gov (United States)

    Bleich, Michael R

    2015-06-01

    Critical moments in life define one's character and clarify true values. Reflective leadership is espoused as an important practice for transformational leaders. Professional development educators can help surface and explore defining moments, strengthen leadership behavior with defining moments as a catalyst for change, and create safe spaces for leaders to expand their leadership capacity.

  1. A brief history of Etymology

    Directory of Open Access Journals (Sweden)

    Mário Eduardo Viaro

    2013-12-01

    Full Text Available Etymological studies were never independent of linguistics, although they have a method of their own, which is presented here from a historiographical point of view. From Plato's thought, through the creation of the Phonetic Laws during the Renaissance, to their development from the historical and comparative point of view by Gyarmathi, Rask and the authors from Germany, new assumptions were defined in the first half of the twentieth century, when the field paused, to be resumed only at the end of that century. Knowledge of the history of etymology is particularly important to the future development of etymological studies of the Portuguese language.

  2. Defining Ecosystem Assets for Natural Capital Accounting.

    Science.gov (United States)

    Hein, Lars; Bagstad, Ken; Edens, Bram; Obst, Carl; de Jong, Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present value of the expected flow of ecosystem services. In this paper we argue that several additional conceptualisations of ecosystem assets are needed to understand ecosystems as assets, in support of ecosystem assessments, ecosystem accounting and ecosystem management. In particular, we define ecosystems' capacity and capability to supply ecosystem services, as well as the potential supply of ecosystem services. Capacity relates to sustainable use levels of multiple ecosystem services, capability involves prioritising the use of one ecosystem service over a basket of services, and potential supply considers the ability of ecosystems to generate services regardless of demand for these services. We ground our definitions in the ecosystem services and accounting literature, and illustrate and compare the concepts of flow, capacity, capability, and potential supply with a range of conceptual and real-world examples drawn from case studies in Europe and North America. Our paper contributes to the development of measurement frameworks for natural capital to support environmental accounting and other assessment frameworks.
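
    As a minimal illustration of the SEEA-style valuation mentioned above (asset value as the net present value of an expected service flow), with made-up numbers:

```python
# Net present value of an expected ecosystem-service flow (illustrative).
def npv(flows, rate):
    """flows[t-1] = expected service value in year t; rate = discount rate."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(flows, start=1))

expected_flow = [100.0] * 50               # assumed constant flow over 50 years
print(round(npv(expected_flow, 0.03), 1))  # asset value at a 3% discount rate
```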

  3. Defining nodes in complex brain networks

    Directory of Open Access Journals (Sweden)

    Matthew Lawrence Stanley

    2013-11-01

    Full Text Available Network science holds great promise for expanding our understanding of the human brain in health, disease, development, and aging. Network analyses are quickly becoming the method of choice for analyzing functional MRI data. However, many technical issues have yet to be confronted in order to optimize results. One particular issue that remains controversial in functional brain network analyses is the definition of a network node. In functional brain networks a node represents some predefined collection of brain tissue, and an edge measures the functional connectivity between pairs of nodes. The characteristics of a node, chosen by the researcher, vary considerably in the literature. This manuscript reviews the current state of the art based on published manuscripts and highlights the strengths and weaknesses of three main methods for defining nodes. Voxel-wise networks are constructed by assigning a node to each equally sized brain area (voxel). The fMRI time-series recorded from each voxel is then used to create the functional network. Anatomical methods utilize atlases to define the nodes based on brain structure. The fMRI time-series from all voxels within the anatomical area are averaged and subsequently used to generate the network. Functional activation methods rely on data from traditional fMRI activation studies, often from databases, to identify network nodes. Such methods identify the peaks or centers of mass from activation maps to determine the location of the nodes. Small (~10-20 millimeter diameter) spheres located at the coordinates of the activation foci are then applied to the data being used in the network analysis. The fMRI time-series from all voxels in the sphere are then averaged, and the resultant time series is used to generate the network. We attempt to clarify the discussion and move the study of complex brain networks forward. While the correct method to be used remains an open, possibly unsolvable question that
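
    A minimal sketch of the third (functional activation) approach, assuming synthetic data in place of a real scan: average the time series of all voxels inside a small sphere at each activation focus, then correlate the node series to obtain the network:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 4-D fMRI data: x, y, z, time (stand-in for a real scan).
data = rng.normal(size=(40, 40, 40, 120))
foci = [(10, 12, 20), (25, 30, 15), (18, 8, 33)]  # assumed activation foci
radius = 3  # in voxels; ~10-20 mm depending on voxel size

grid = np.indices(data.shape[:3])

def node_series(center):
    """Mean time series over all voxels within `radius` of `center`."""
    dist2 = sum((g - c) ** 2 for g, c in zip(grid, center))
    return data[dist2 <= radius ** 2].mean(axis=0)

nodes = np.array([node_series(c) for c in foci])
network = np.corrcoef(nodes)  # edge weights = functional connectivity
print(network.round(2))
```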

  4. Defining and identifying Sleeping Beauties in science.

    Science.gov (United States)

    Ke, Qing; Ferrara, Emilio; Radicchi, Filippo; Flammini, Alessandro

    2015-06-16

    A Sleeping Beauty (SB) in science refers to a paper whose importance is not recognized for several years after publication. Its citation history exhibits a long hibernation period followed by a sudden spike of popularity. Previous studies suggest a relative scarcity of SBs. The reliability of this conclusion is, however, heavily dependent on identification methods based on arbitrary threshold parameters for sleeping time and number of citations, applied to small or monodisciplinary bibliographic datasets. Here we present a systematic, large-scale, and multidisciplinary analysis of the SB phenomenon in science. We introduce a parameter-free measure that quantifies the extent to which a specific paper can be considered an SB. We apply our method to 22 million scientific papers published in all disciplines of natural and social sciences over a time span longer than a century. Our results reveal that the SB phenomenon is not exceptional. There is a continuous spectrum of delayed recognition where both the hibernation period and the awakening intensity are taken into account. Although many cases of SBs can be identified by looking at monodisciplinary bibliographic data, the SB phenomenon becomes much more apparent with the analysis of multidisciplinary datasets, where we can observe many examples of papers achieving delayed yet exceptional importance in disciplines different from those where they were originally published. Our analysis emphasizes a complex feature of citation dynamics that so far has received little attention, and also provides empirical evidence against the use of short-term citation metrics in the quantification of scientific impact.
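
    The parameter-free measure introduced by Ke et al. (the "beauty coefficient") compares a paper's citation curve with the straight line drawn from its publication-year citations to its citation peak. A short sketch based on our reading of that definition:

```python
def beauty_coefficient(c):
    """B from Ke et al. (2015): c[t] = citations t years after publication."""
    t_m = max(range(len(c)), key=lambda t: c[t])  # year of the citation peak
    if t_m == 0:
        return 0.0
    line = lambda t: (c[t_m] - c[0]) / t_m * t + c[0]  # reference line
    return sum((line(t) - c[t]) / max(1, c[t]) for t in range(t_m + 1))

# A long hibernation followed by a spike yields a large B:
sleeper = [0, 1, 0, 1, 2, 1, 1, 2, 1, 2, 30, 80]
steady = [5, 8, 12, 15, 18, 20, 22, 25]
print(round(beauty_coefficient(sleeper), 1),
      round(beauty_coefficient(steady), 1))
```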

  5. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available ... Education What's New Emergency Preparedness and You Video: "The History of Bioterrorism" Recommend on Facebook Tweet Share ... or can be used as bioterrorist weapons. Watch the Complete Program "The History of Bioterroism" (26 min ...

  6. "Hillary - en god historie"

    DEFF Research Database (Denmark)

    Bjerre, Thomas Ærvold

    2007-01-01

    Review of Carl Bernstein's Hillary Rodham Clinton and Michael Ehrenreich's Hillary - En amerikansk historie. Publication date: 15 November.

  7. History of Bioterrorism: Botulism

    Science.gov (United States)

    Video page: "The History of Bioterrorism", on agents that are or can be used as bioterrorist weapons. The complete program runs 26 min 38 sec.

  8. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available ... What's New Emergency Preparedness and You Video: "The History of Bioterrorism" Recommend on Facebook Tweet Share Compartir ... as bioterrorist weapons. Watch the Complete Program "The History of Bioterroism" (26 min 38 sec) Watch Specific ...

  10. The history of COSPAR

    Science.gov (United States)

    Willmore, Peter

    The Space Age started with the launch of Sputnik 1 in 1957, during the International Geophysical Year and at the height of the Cold War. The International Geophysical Year showed the power and spirit of which international collaboration in science was capable in a world just emerging from the Second World War, yet once again deeply riven by the Cold War. COSPAR was born out of a determination to harness the former in spite of the latter. By the 1980s, the moderation of the Cold War meant this was no longer a reason for COSPAR's continued existence, and new forms and objectives needed to be formulated. That debate continued right up to the Paris Assembly of 2004, and we now see COSPAR revitalized and, by objective measures, once more growing. This history will be reviewed, for the early years from a rather personal viewpoint.

  11. The History of Astrometry

    CERN Document Server

    Perryman, Michael

    2012-01-01

    The history of astrometry, the branch of astronomy dealing with the positions of celestial objects, is a lengthy and complex chronicle, having its origins in the earliest records of astronomical observations more than two thousand years ago, and extending to the high accuracy observations being made from space today. Improved star positions progressively opened up and advanced fundamental fields of scientific enquiry, including our understanding of the scale of the solar system, the details of the Earth's motion through space, and the comprehension and acceptance of Newtonianism. They also proved crucial to the practical task of maritime navigation. Over the past 400 years, during which positional accuracy has improved roughly logarithmically with time, the distances to the nearest stars were triangulated, making use of the extended measurement baseline given by the Earth's orbit around the Sun. This led to quantifying the extravagantly vast scale of the Universe, to a determination of the physical properties...

  12. Cartooning History: Canada's Stories in Graphic Novels

    Science.gov (United States)

    King, Alyson E.

    2012-01-01

    In recent years, historical events, issues, and characters have been portrayed in an increasing number of non-fiction graphic texts. Similar to comics and graphic novels, graphic texts are defined as fully developed, non-fiction narratives told through panels of sequential art. Such non-fiction graphic texts are being used to teach history in…

  14. HAD Oral History Project

    Science.gov (United States)

    Holbrook, Jarita

    2014-01-01

    The Historical Astronomy Division is the recipient of an American Institute of Physics Neils Bohr Library Grant for Oral History. HAD has assembled a team of volunteers to conduct oral history interviews since May 2013. Each oral history interview varies in length between two and six hours. This presentation is an introduction to the HAD Oral History Project and the activities of the team during the first six months of the grant.

  15. Canadian petroleum history bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Cass, D.

    2003-09-27

    The Petroleum History Bibliography includes a list of more than 2,000 publications that record the history of the Canadian petroleum industry. The list includes books, theses, films, audio tapes, published articles, company histories, biographies, autobiographies, fiction, poetry, humour, and an author index. It was created over a period of several years to help with projects at the Petroleum History Society. It is an ongoing piece of work, and as such, invites comments and additions.

  16. Reconstructing the Chronology of Supernovae: Determining Major Variations in the History of the Cosmic-ray Flux Incident on the Earth's Surface by Measuring the Concentration of 22Ne in Halite

    Science.gov (United States)

    Nahill, N. D.; Giegengack, R.; Lande, K.; Omar, G.

    2008-12-01

    We plan to measure the inventory of cosmogenically produced 22Ne atoms preserved in the mineral lattice of halite in deposits of rock salt, and to use that inventory to measure variations in the cosmic-ray flux to enable us to reconstruct the history of supernovae. Bedded rock salt consists almost entirely of the mineral halite (NaCl). Any neon trapped in the halite crystals during precipitation is primarily 20Ne, with a 22Ne concentration of 9% or less. Any neon resulting from cosmic-ray interactions with 23Na is solely 22Ne; therefore, 22Ne atoms in excess of 9% of the total neon are cosmogenic in origin. Measurement of the 22Ne inventory in halite from deposits covering a range of geologic ages may enable us to document the systematic growth of 22Ne through geologic time and, thus, establish the cosmic-ray flux and a chronology of supernovae. The cosmic-ray flux is attenuated in direct proportion to the mass of material overlying a halite deposit. To adjust the 22Ne inventory to account for that attenuation, we must reconstruct the post-depositional history of accumulation and removal of superjacent sediment for each halite deposit we study. As an example of our procedure, we reconstruct here the shielding history of the Permian halite deposit, the Salado Formation, Delaware Basin, New Mexico. The stratigraphy of the Delaware Basin has been well documented via exploration and production wells drilled in search of oil and gas, exploration boreholes associated with potash mining, and comprehensive geologic site assessment of the DOE Waste Isolation Pilot Plant (WIPP). WIPP is a subsurface repository for the permanent disposal of transuranic wastes, located in southeastern New Mexico, 42 km east of Carlsbad and approximately 655 m beneath the surface in the Salado Fm. The Salado Fm is part of the Late Permian Ochoan Series, and consists of 1) a lower member, 2) the McNutt Potash Zone, and 3) an upper member. WIPP lies between marker bed (MB)139 and MB136 in the

  17. Towards a European History

    NARCIS (Netherlands)

    H. van Dijk (Henk)

    2000-01-01

    Although historical writing is a profession with a long tradition, history as an academic discipline is strongly related to the development of the nation state in the nineteenth century. Notwithstanding specialisations such as cultural history and social and economic history put less e...

  18. Modern History of Tibet

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Authored by Xu Guangzhi, this book is a subsidiary project of Research Into Traditional Culture and History (of the PRC Ministry of Education) conducted by the China Tibetology Research Institute of Tibet University. The book combines the modern history of Tibet with the modern history of China as a whole. It describes the close ties between the various members of the Chinese nation.

  19. Conducting the Medical History

    Science.gov (United States)

    Finkel, Martin A.; Alexander, Randell A.

    2011-01-01

    A key portion of the medical evaluation of child sexual abuse is the medical history. This differs from interviews or histories obtained by other professionals in that it focuses more on the health and well-being of the child. Careful questions should be asked about all aspects of the child's medical history by a skilled, compassionate,…

  20. Defining food literacy: A scoping review.

    Science.gov (United States)

    Truman, Emily; Lane, Daniel; Elliott, Charlene

    2017-09-01

    The term "food literacy" describes the idea of proficiency in food related skills and knowledge. This prevalent term is broadly applied, although its core elements vary from initiative to initiative. In light of its ubiquitous use-but varying definitions-this article establishes the scope of food literacy research by identifying all articles that define 'food literacy', analysing its key conceptualizations, and reporting outcomes/measures of this concept. A scoping review was conducted to identify all articles (academic and grey literature) using the term "food literacy". Databases included Medline, Pubmed, Embase, CAB Abstracts, CINAHL, Scopus, JSTOR, and Web of Science, and Google Scholar. Of 1049 abstracts, 67 studies were included. From these, data was extracted on country of origin, study type (methodological approach), primary target population, and the primary outcomes relating to food literacy. The majority of definitions of food literacy emphasize the acquisition of critical knowledge (information and understanding) (55%) over functional knowledge (skills, abilities and choices) (8%), although some incorporate both (37%). Thematic analysis of 38 novel definitions of food literacy reveals the prevalence of six themes: skills and behaviours, food/health choices, culture, knowledge, emotions, and food systems. Study outcomes largely focus on knowledge generating measures, with very few focusing on health related outcome measures. Current definitions of food literacy incorporate components of six key themes or domains and attributes of both critical and functional knowledge. Despite this broad definition of the term, most studies aiming to improve food literacy focus on knowledge related outcomes. Few articles address health outcomes, leaving an important gap (and opportunity) for future research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Diet History Questionnaire: Database Revision History

    Science.gov (United States)

    The following details all additions and revisions made to the DHQ nutrient and food database. This revision history is provided as a reference for investigators who may have performed analyses with a previous release of the database.

  3. History of mathematics and history of science

    OpenAIRE

    Mann, Tony

    2011-01-01

    This essay argues that the diversity of the history of mathematics community in the United Kingdom has influenced the development of the subject and is a significant factor behind the different concerns often evident in work on the history of mathematics when compared with that of historians of science. The heterogeneous nature of the community, which includes many who are not specialist historians, and the limited opportunities for academic careers open to practitioners have had a profound...

  4. What quantum measurements measure

    Science.gov (United States)

    Griffiths, Robert B.

    2017-09-01

    A solution to the second measurement problem, determining what prior microscopic properties can be inferred from measurement outcomes ("pointer positions"), is worked out for projective and generalized (POVM) measurements, using consistent histories. The result supports the idea that equipment properly designed and calibrated reveals the properties it was designed to measure. Applications include Einstein's hemisphere and Wheeler's delayed choice paradoxes, and a method for analyzing weak measurements without recourse to weak values. Quantum measurements are noncontextual in the original sense employed by Bell and Mermin: if [A,B] = [A,C] = 0 and [B,C] ≠ 0, the outcome of an A measurement does not depend on whether it is measured with B or with C. An application to Bohm's model of the Einstein-Podolsky-Rosen situation suggests that a faulty understanding of quantum measurements is at the root of this paradox.

  5. GAMA/H-ATLAS: a meta-analysis of SFR indicators - comprehensive measures of the SFR-M* relation and cosmic star formation history at z < 0.4

    Science.gov (United States)

    Davies, L. J. M.; Driver, S. P.; Robotham, A. S. G.; Grootes, M. W.; Popescu, C. C.; Tuffs, R. J.; Hopkins, A.; Alpaslan, M.; Andrews, S. K.; Bland-Hawthorn, J.; Bremer, M. N.; Brough, S.; Brown, M. J. I.; Cluver, M. E.; Croom, S.; da Cunha, E.; Dunne, L.; Lara-López, M. A.; Liske, J.; Loveday, J.; Moffett, A. J.; Owers, M.; Phillipps, S.; Sansom, A. E.; Taylor, E. N.; Michalowski, M. J.; Ibar, E.; Smith, M.; Bourne, N.

    2016-09-01

    We present a meta-analysis of star formation rate (SFR) indicators in the Galaxy And Mass Assembly (GAMA) survey, producing 12 different SFR metrics and determining the SFR-M* relation for each. We compare and contrast published methods to extract the SFR from each indicator, using a well-defined local sample of morphologically selected spiral galaxies, which excludes sources which potentially have large recent changes to their SFR. The different methods are found to yield SFR-M* relations with inconsistent slopes and normalizations, suggesting differences between calibration methods. The recovered SFR-M* relations also have a large range in scatter which, as SFRs of the targets may be considered constant over the different time-scales, suggests differences in the accuracy by which methods correct for attenuation in individual targets. We then recalibrate all SFR indicators to provide new, robust and consistent luminosity-to-SFR calibrations, finding that the most consistent slopes and normalizations of the SFR-M* relations are obtained when recalibrated using the radiation transfer method of Popescu et al. These new calibrations can be used to directly compare SFRs across different observations, epochs and galaxy populations. We then apply our calibrations to the GAMA II equatorial data set and explore the evolution of star formation in the local Universe. We determine the evolution of the normalization to the SFR-M* relation over 0 < z < 0.35, finding consistent trends with previous estimates at 0.3 < z < 1.2. We then provide the definitive z < 0.35 cosmic star formation history, SFR-M* relation and its evolution over the last 3 billion years.
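
    The luminosity-to-SFR calibrations being recalibrated here are linear conversions of the form SFR = C_x L_x for each indicator x; schematically (the constant below is a placeholder, not one of the paper's values):

```python
# Schematic use of a linear luminosity-to-SFR calibration, SFR = C * L.
C_UV = 1.0e-28  # placeholder constant: Msun/yr per (erg/s/Hz), assumed

def sfr_from_luminosity(L_nu, C=C_UV):
    """Star formation rate in Msun/yr from a monochromatic luminosity."""
    return C * L_nu

print(f"{sfr_from_luminosity(2.5e28):.2f} Msun/yr")
```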

  6. The fall and rise of the history of recent chemistry.

    Science.gov (United States)

    Morris, Peter J T

    2011-11-01

    This paper defines the history of recent chemistry, and then charts the disappearance of the history of recent chemistry ("how we got here" history) from general histories of chemistry by the late 1930s. It is also shown how the history of recent chemistry in the early decades of the twentieth century was very much the history of physical chemistry. The revival of the history of recent chemistry is attributed to Eduard Farber and Aaron Ihde. Several attempts have been made since the early 1980s to promote the history of recent chemistry, with mixed results. The current situation is assessed, and the paper concludes with a proposal for the entrenchment of the subject.

  7. The LOLITA User-Definable Template Interface

    OpenAIRE

    Košmelj, Katarina

    2001-01-01

    The development of user-definable template interfaces, which allow the user to design new template definitions in a user-friendly way, is a new issue in the field of information extraction. The LOLITA user-definable template interface allows the user to define new templates using sentences in natural language text with a few restrictions and formal elements. This approach is rather different from previous approaches to information extraction, which require developers to code the template definitions...

  8. Portraying User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2008-01-01

    The user interface is coming of age. Papers addressing UI history have appeared in fair amounts in the last 25 years. Most of them address particular aspects such as an innovative interface paradigm or the contribution of a visionary or a research lab. Contrasting this, papers addressing UI history at large have been sparse. However, a small spate of publications appeared recently, so a reasonable number of papers are available. Hence this work-in-progress paints a portrait of the current history of user interfaces at large. The paper first describes a theoretical framework recruited from ... in that they largely address prevailing UI technologies, and thirdly history from above in that they focus on the great deeds of the visionaries. The paper then compares this state-of-the-art in UI history to the much more mature fields of the history of computing and the history of technology. Based hereon, some speculations...

  9. Software Defined Common Processing System (SDCPS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated (CLX) proposes the development of a Software Defined Common Processing System (SDCPS) that leverages the inherent advantages of an...

  10. Externally definable sets and dependent pairs II

    CERN Document Server

    Chernikov, Artem

    2012-01-01

    We continue investigating the structure of externally definable sets in NIP theories and preservation of NIP after expanding by new predicates. Most importantly: types over finite sets are uniformly definable; over a model, a family of non-forking instances of a formula (with parameters ranging over a type-definable set) can be covered with finitely many invariant types; we give some criteria for the boundedness of an expansion by a new predicate in a distal theory; naming an arbitrary small indiscernible sequence preserves NIP, while naming a large one doesn't; there are models of NIP theories over which all 1-types are definable, but not all n-types.

  11. High Resolution Software Defined Radar System for Target Detection

    Directory of Open Access Journals (Sweden)

    S. Costanzo

    2013-01-01

    Full Text Available The Universal Software Radio Peripheral USRP NI2920, a software defined transceiver so far mainly used in Software Defined Radio applications, is adopted in this work to design a high resolution L-Band Software Defined Radar system. The enhanced available bandwidth, due to the Gigabit Ethernet interface, is exploited to obtain a higher slant-range resolution with respect to the existing Software Defined Radar implementations. A specific LabVIEW application, performing radar operations, is discussed, and successful validations are presented to demonstrate the accurate target detection capability of the proposed software radar architecture. In particular, outdoor and indoor tests are performed by adopting a metal plate as a reference structure located at different distances from the designed radar system, and results obtained from the measured echo are successfully processed to accurately reveal the correct target position, with the predicted slant-range resolution equal to 6 m.
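
    The quoted 6 m slant-range resolution is what the standard pulse-compression relation ΔR = c/(2B) gives for roughly 25 MHz of usable bandwidth, which we assume is what the Gigabit Ethernet link of the USRP NI2920 sustains:

```python
# Slant-range resolution of a pulse-compression radar: dR = c / (2 * B).
c = 299_792_458.0  # speed of light, m/s
B = 25e6           # assumed usable bandwidth, Hz
print(f"slant-range resolution = {c / (2 * B):.1f} m")  # -> ~6.0 m
```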

  12. Defining fitness in an uncertain world.

    Science.gov (United States)

    Crewe, Paul; Gratwick, Richard; Grafen, Alan

    2017-07-29

    The recently elucidated definition of fitness employed by Fisher in his fundamental theorem of natural selection is combined with reproductive values as appropriately defined in the context of both random environments and continuing fluctuations in the distribution over classes in a class-structured population. We obtain astonishingly simple results, generalisations of the Price Equation and the fundamental theorem, that show natural selection acting only through the arithmetic expectation of fitness over all uncertainties, in contrast to previous studies with fluctuating demography, in which natural selection looks rather complicated. Furthermore, our setting permits each class to have its characteristic ploidy, thus covering haploidy, diploidy and haplodiploidy at the same time; and allows arbitrary classes, including continuous variables such as condition. The simplicity is achieved by focussing just on the effects of natural selection on genotype frequencies: while other causes are present in the model, and the effect of natural selection is assessed in their presence, these causes will have their own further effects on genotype frequencies that are not assessed here. Also, Fisher's uses of reproductive value are shown to have two ambivalences, and a new axiomatic foundation for reproductive value is endorsed. The results continue the formal darwinism project, and extend support for the individual-as-maximising-agent analogy to finite populations with random environments and fluctuating class-distributions. The model may also lead to improved ways to measure fitness in real populations.
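
    For orientation, the classic Price equation that these results generalise, in standard notation (w_i individual fitness, z_i trait value, bars denoting population means):

```latex
% Classic Price equation: the change in the mean trait value \bar{z}
% decomposes into a selection (covariance) term and a transmission term.
\bar{w}\,\Delta\bar{z}
  = \operatorname{Cov}(w_i, z_i) + \operatorname{E}\!\left(w_i\,\Delta z_i\right)
```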

  13. Defining and resolving current systems in geospace

    Science.gov (United States)

    Ganushkina, N. Y.; Liemohn, M. W.; Dubyagin, S.; Daglis, I. A.; Dandouras, I.; De Zeeuw, D. L.; Ebihara, Y.; Ilie, R.; Katus, R.; Kubyshkina, M.; Milan, S. E.; Ohtani, S.; Ostgaard, N.; Reistad, J. P.; Tenfjord, P.; Toffoletto, F.; Zaharia, S.; Amariutei, O.

    2015-11-01

    Electric currents flowing through near-Earth space (R ≤ 12 RE) can support a highly distorted magnetic field topology, changing particle drift paths and therefore having a nonlinear feedback on the currents themselves. A number of current systems exist in the magnetosphere, most commonly defined as (1) the dayside magnetopause Chapman-Ferraro currents, (2) the Birkeland field-aligned currents with high-latitude "region 1" and lower-latitude "region 2" currents connected to the partial ring current, (3) the magnetotail currents, and (4) the symmetric ring current. In the near-Earth nightside region, however, several of these current systems flow in close proximity to each other. Moreover, the existence of other temporal current systems, such as the substorm current wedge or "banana" current, has been reported. It is very difficult to identify a local measurement as belonging to a specific system. Such identification is important, however, because how the current closes and how these loops change in space and time governs the magnetic topology of the magnetosphere and therefore controls the physical processes of geospace. Furthermore, many methods exist for identifying the regions of near-Earth space carrying each type of current. This study presents a robust collection of these definitions of current systems in geospace, particularly in the near-Earth nightside magnetosphere, as viewed from a variety of observational and computational analysis techniques. The influence of definitional choice on the resulting interpretation of physical processes governing geospace dynamics is presented and discussed.

  14. A Soundtrack to Mongolian History

    Directory of Open Access Journals (Sweden)

    Franck Billé

    2016-06-01

    Full Text Available Lucy M. Rees, Mongolian Film Music: Tradition, Revolution and Propaganda. London: Routledge, 2015. 210 pp. $110 (cloth). In her recently published study, ethnomusicologist Lucy M. Rees recounts the evolution of Mongolian film music, from the establishment of the country’s film industry as a vehicle of propaganda in the early socialist era to the release of the latest international productions, such as Khadak (2006), The Story of the Weeping Camel (2003), and The Cave of the Yellow Dog (2005). An in-depth analysis of the genres, structures, and melodies of Mongolia’s filmic landscape, Rees’s book also extends to the historical context and social reception of the most important films in that country’s history and is thus more than a mere compendium of cinematic works. Rees presents a narrative of Mongolian history from the perspective of film music, with each introduction of instruments, techniques, and harmonies representing a particular turn in the cultural transformation experienced by Mongolia over the course of the twentieth century. Each chapter is dedicated to a specific period of the country’s history and is constructed around a particular case study (one personality or one film) that played a defining role in that period...

  15. A Brief History of Linguistics before 18th Century

    Institute of Scientific and Technical Information of China (English)

    李亦松

    2015-01-01

    Introduction: Linguistics can be simply defined as the scientific study of language. Therefore, a history of linguistics is closely related to the origin of human language. This paper, in a rough way, classifies the history of linguistics into three periods: Linguistics in Ancient Times; Linguistics in the Middle Ages; and Linguistics in the…

  16. Highlights in the History of Oral Teacher Preparation in America

    Science.gov (United States)

    Marvelli, Alan L.

    2010-01-01

    The history of oral teacher preparation in America is both significant and diverse. There are numerous individuals and events that shifted and defined the professional practices of individuals who promote the listening and spoken language development of children with hearing loss. This article provides an overview of this rich history and offers a…

  17. A Brief Introduction of the History of Linguistics

    Institute of Scientific and Technical Information of China (English)

    王非男

    2015-01-01

    Linguistics is generally defined as the scientific study of language. When it comes to the history of linguistics, it must be related to the origin of human language. This paper will give a brief introduction of the history of linguistics in Ancient Times and in the Middle Ages.

  18. The Educational Promise of Public History Museum Exhibits

    Science.gov (United States)

    Trofanenko, Brenda M.

    2010-01-01

    Public history museums play a critical role in validating a nation's history. The museum's institutional strategies of object display are used to define a particular representation of past events. Museum displays of war are of particular interest not only because they provide evidence of past wars, but also because they serve to advance national…

  19. ICF gamma-ray reaction history diagnostics

    Science.gov (United States)

    Herrmann, H. W.; Young, C. S.; Mack, J. M.; Kim, Y. H.; McEvoy, A.; Evans, S.; Sedillo, T.; Batha, S.; Schmitt, M.; Wilson, D. C.; Langenbrunner, J. R.; Malone, R.; Kaufman, M. I.; Cox, B. C.; Frogget, B.; Miller, E. K.; Ali, Z. A.; Tunnell, T. W.; Stoeffl, W.; Horsfield, C. J.; Rubery, M.

    2010-08-01

    Reaction history measurements, such as nuclear bang time and burn width, are fundamental components of diagnosing ICF implosions and will be employed to help steer the National Ignition Facility (NIF) towards ignition. Fusion gammas provide a direct measure of nuclear interaction rate (unlike x-rays) without being compromised by Doppler spreading (unlike neutrons). Gas Cherenkov Detectors that convert fusion gamma rays to UV/visible Cherenkov photons for collection by fast optical recording systems have established their usefulness in illuminating ICF physics in several experimental campaigns at OMEGA. In particular, bang time precision better than 25 ps has been demonstrated, well below the 50 ps accuracy requirement defined by the NIF. NIF Gamma Reaction History (GRH) diagnostics are being developed based on optimization of sensitivity, bandwidth, dynamic range, cost, and NIF-specific logistics, requirements and extreme radiation environment. Implementation will occur in two phases. The first phase consists of four channels mounted to the outside of the target chamber at ~6 m from target chamber center (GRH-6m) coupled to ultra-fast photo-multiplier tubes (PMT). This system is intended to operate in the 10^13-10^17 neutron yield range expected during the early THD campaign. It will have high enough bandwidth to provide accurate bang times and burn widths for the expected THD reaction histories (>80 ps FWHM). Successful operation of the first GRH-6m channel has been demonstrated at OMEGA, allowing a verification of instrument sensitivity, timing and EMI/background suppression. The second phase will consist of several channels located just inside the target bay shield wall at 15 m from target chamber center (GRH-15m) with optical paths leading through the cement shield wall to well-shielded streak cameras and PMTs. This system is intended to operate in the 10^16-10^20 yield range expected during the DT ignition campaign, providing higher temporal resolution for the…

  20. Normativity in Russian History Education: Political Patterns and National History Textbooks

    Directory of Open Access Journals (Sweden)

    Natalia Potapova

    2014-10-01

    Full Text Available My current research concerns the politics of Russian history education. In this paper I discuss some of the issues raised by the study of national history textbooks. I analyze the normative implications of sentences and statements about the past and try to define contrary ideological assumptions. How do the authors construct the aim of historical education? In what kinds of activities do the typical patterns of textbook questions and instructions try to engage pupils? How do the different textbooks construct the political subject? The article aims to explore the media construction of political action in Russian school history textbooks.

  1. Defining nodes in complex brain networks.

    Science.gov (United States)

    Stanley, Matthew L; Moussa, Malaak N; Paolini, Brielle M; Lyday, Robert G; Burdette, Jonathan H; Laurienti, Paul J

    2013-11-22

    Network science holds great promise for expanding our understanding of the human brain in health, disease, development, and aging. Network analyses are quickly becoming the method of choice for analyzing functional MRI data. However, many technical issues have yet to be confronted in order to optimize results. One particular issue that remains controversial in functional brain network analyses is the definition of a network node. In functional brain networks a node represents some predefined collection of brain tissue, and an edge measures the functional connectivity between pairs of nodes. The characteristics of a node, chosen by the researcher, vary considerably in the literature. This manuscript reviews the current state of the art based on published manuscripts and highlights the strengths and weaknesses of three main methods for defining nodes. Voxel-wise networks are constructed by assigning a node to each equally sized brain area (voxel). The fMRI time-series recorded from each voxel is then used to create the functional network. Anatomical methods utilize atlases to define the nodes based on brain structure. The fMRI time-series from all voxels within the anatomical area are averaged and subsequently used to generate the network. Functional activation methods rely on data from traditional fMRI activation studies, often from databases, to identify network nodes. Such methods identify the peaks or centers of mass from activation maps to determine the location of the nodes. Small (~10-20 millimeter diameter) spheres located at the coordinates of the activation foci are then applied to the data being used in the network analysis. The fMRI time-series from all voxels in the sphere are then averaged, and the resultant time series is used to generate the network. We attempt to clarify the discussion and move the study of complex brain networks forward. While the "correct" method to be used remains an open, possibly unsolvable question that deserves extensive…
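
    As a concrete illustration of the three node-definition strategies compared above, the sketch below builds a correlation-based functional network from fMRI time series under each scheme. This is a minimal sketch: the array shapes, atlas labels, and helper names are illustrative assumptions, not code from the paper.

```python
import numpy as np

def network_from_timeseries(ts):
    """ts: (n_nodes, n_timepoints) array -> correlation adjacency matrix."""
    adj = np.corrcoef(ts)
    np.fill_diagonal(adj, 0.0)            # drop self-connections
    return adj

def voxelwise_nodes(fmri):
    """Voxel-wise scheme: every voxel is its own node."""
    return network_from_timeseries(fmri)  # fmri: (n_voxels, n_timepoints)

def atlas_nodes(fmri, labels):
    """Anatomical scheme: average voxel time series within each atlas region."""
    regions = np.unique(labels)           # labels: (n_voxels,) region ids
    ts = np.vstack([fmri[labels == r].mean(axis=0) for r in regions])
    return network_from_timeseries(ts)

def sphere_nodes(fmri, coords, foci, radius=5.0):
    """Functional-activation scheme: average voxels inside small spheres
    centred on activation foci (foci given in the same space as coords)."""
    ts = []
    for focus in foci:
        inside = np.linalg.norm(coords - np.asarray(focus), axis=1) <= radius
        ts.append(fmri[inside].mean(axis=0))
    return network_from_timeseries(np.vstack(ts))
```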

  2. A definability theorem for first order logic

    NARCIS (Netherlands)

    Butz, C.; Moerdijk, I.

    2001-01-01

    In this paper we will present a definability theorem for first order logic. This theorem is very easy to state, and its proof only uses elementary tools. To explain the theorem, let us first observe that if M is a model of a theory T in a language L, then clearly any definable subset S ⊆ M (i.e., a subset S…

  3. Defining Dynamic Graphics by a Graphical Language

    Institute of Scientific and Technical Information of China (English)

    毛其昌; 戴汝为

    1991-01-01

    A graphical language which can be used for defining dynamic pictures and applying control actions to them is defined with an expanded attributed grammar. Based on this, a system is built for developing the presentation of application data of a user interface. This system provides user interface designers with a friendly and highly efficient programming environment.

  4. 7 CFR 29.12 - Terms defined.

    Science.gov (United States)

    2010-01-01

    7 Agriculture 2 (2010-01-01): Terms defined. 29.12 Section 29.12, Agriculture Regulations of the Department of Agriculture, AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing...), INSPECTION Regulations, Definitions. § 29.12 Terms defined. As used in this subpart and in all...

  5. Solar History An Introduction

    CERN Document Server

    Vita-Finzi, Claudio

    2013-01-01

    Beyond the four centuries of sunspot observation and the five decades during which artificial satellites have monitored the Sun – that is to say for 99.99999% of the Sun’s existence – our knowledge of solar history depends largely on analogy with kindred main sequence stars, on the outcome of various kinds of modelling, and on indirect measures of solar activity. They include the analysis of lunar rocks and meteorites for evidence of solar flares and other components of the solar cosmic-ray (SCR) flux, and the measurement of cosmogenic isotopes in wood, stratified ice and marine sediments to evaluate changes in the galactic cosmic-ray (GCR) flux and thus infer changes in the sheltering magnetic fields of the solar wind. In addition, shifts in the global atmospheric circulation which appear to result from cyclic fluctuations in solar irradiance have left their mark in river sediments and in the isotopic composition of cave deposits. In this volume the results these sources have already produced have been…

  6. Exploring global history through the lens of history of Chemistry: Materials, identities and governance.

    Science.gov (United States)

    Roberts, Lissa

    2016-12-01

    As global history continues to take shape as an important field of research, its interactive relationships with the history of science, technology, and medicine are recognized and being investigated as significant areas of concern. Strangely, despite the fact that it is key to understanding so many of the subjects that are central to global history and would itself benefit from a broader geographical perspective, the history of chemistry has largely been left out of this process - particularly for the modern historical period. This article argues for the value of integrating the history of chemistry with global history, not only for understanding the past, but also for thinking about our shared present and future. Toward this end, it (1) explores the various ways in which 'chemistry' has and can be defined, with special attention to discussions of 'indigenous knowledge systems'; (2) examines the benefits of organizing historical inquiry around the evolving sociomaterial identities of substances; (3) considers ways in which the concepts of 'chemical governance' and 'chemical expertise' can be expanded to match the complexities of global history, especially in relation to environmental issues, climate change, and pollution; and (4) seeks to sketch the various geographies entailed in bringing the history of chemistry together with global histories.

  7. History of mathematics and history of science.

    Science.gov (United States)

    Mann, Tony

    2011-09-01

    This essay argues that the diversity of the history of mathematics community in the United Kingdom has influenced the development of the subject and is a significant factor behind the different concerns often evident in work on the history of mathematics when compared with that of historians of science. The heterogeneous nature of the community, which includes many who are not specialist historians, and the limited opportunities for academic careers open to practitioners have had a profound effect on the discipline, leading to a focus on elite mathematics and great mathematicians. More recently, reflecting earlier developments in the history of science, an increased interest in the context and culture of the practice of mathematics has become evident.

  8. Defining Moments in MMWR History: 1993 E. coli O157:H7 Hamburger Outbreak

    Centers for Disease Control (CDC) Podcasts

    2017-05-31

    During the 1993 E. coli O157 outbreak, four children died, and approximately 700 persons in four states became ill with severe and often bloody diarrhea after eating hamburgers from fast food restaurants. The first reports of CDC’s investigation into this deadly outbreak were published in MMWR. In this podcast, Dr. Beth Bell shares what it was like to serve as one of CDC’s lead investigators – a boots-on-the-ground disease detective -- for the historic outbreak.  Created: 5/31/2017 by MMWR.   Date Released: 5/31/2017.

  9. Coalescent histories for caterpillar-like families.

    Science.gov (United States)

    Rosenberg, Noah A

    2013-01-01

    A coalescent history is an assignment of branches of a gene tree to branches of a species tree on which coalescences in the gene tree occur. The number of coalescent histories for a pair consisting of a labeled gene tree topology and a labeled species tree topology is important in gene tree probability computations, and more generally, in studying evolutionary possibilities for gene trees on species trees. Defining the T_r-caterpillar-like family as a sequence of n-taxon trees constructed by replacing the r-taxon subtree of n-taxon caterpillars by a specific r-taxon labeled topology T_r, we examine the number of coalescent histories for caterpillar-like families with matching gene tree and species tree labeled topologies. For each T_r with size r ≤ 8, we compute the number of coalescent histories for n-taxon trees in the T_r-caterpillar-like family. Next, as n → ∞, we find that the limiting ratio of the numbers of coalescent histories for the T_r family and caterpillars themselves is correlated with the number of labeled histories for T_r. The results support a view that large numbers of coalescent histories occur when a tree has both a relatively balanced subtree and a high tree depth, contributing to deeper understanding of the combinatorics of gene trees and species trees.
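
    For the plain caterpillar case that these families generalise, the number of coalescent histories for matching n-taxon gene and species trees is known to equal the Catalan number C_{n-1}; the snippet below computes that reference sequence. It is offered as background for the limiting ratios discussed above, not as the paper's own code.

```python
from math import comb

def catalan(m):
    """m-th Catalan number: C_m = binom(2m, m) / (m + 1)."""
    return comb(2 * m, m) // (m + 1)

def caterpillar_histories(n):
    """Coalescent histories for matching n-taxon caterpillar trees: C_{n-1}."""
    return catalan(n - 1)

print([caterpillar_histories(n) for n in range(2, 9)])
# -> [1, 2, 5, 14, 42, 132, 429]
```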

  10. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

    This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernible. The essay argues that so-called material marine environmental history has as its main focus the attempt to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Palaeolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...

  11. Defining and Assessing Affective Outcomes in Undergraduate Pediatric Dentistry.

    Science.gov (United States)

    Cullen, Claire L.

    1990-01-01

    The affective aspect of the curriculum is defined as the development of appropriate and measurable values such as ethical behavior, honesty, tolerance, and becoming a life-long learner. In outcome assessment of the affective category, the goal is to evaluate the transition of the student to a professional. (MLW)

  12. 75 FR 10439 - Cognitive Radio Technologies and Software Defined Radios

    Science.gov (United States)

    2010-03-08

    ... clarification filed by Cisco Systems, Inc. ("Cisco") requesting that the Commission clarify: (1) The... of software that controls security measures in software defined radios. 3. In responding to the Cisco... response to the Cisco petition for reconsideration that raised the issue of using open source software...

  13. Family Health History and Diabetes

    Science.gov (United States)

    Family health history is an ... health problems. Four Questions You Should Ask Your Family About Diabetes & Family Health History: Knowing your family ...

  14. Cooperative Station History Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Various forms, photographs and correspondence documenting the history of Cooperative station instrumentation, location changes, inspections, and...

  15. Public and popular history

    CERN Document Server

    De Groot, Jerome

    2013-01-01

    This interdisciplinary collection considers public and popular history within a global framework, seeking to understand considerations of local, domestic histories and the ways they interact with broader discourses. Grounded in particular local and national situations, the book addresses the issues associated with popular history in a globalised cultural world, such as: how the study of popular history might work in the future; new ways in which the terms 'popular' and 'public' might inform one another and nuance scholarship; transnational, intercultural models of 'pastness'; cultural translat

  16. Generalizing the Nagel line to circumscribed polygons by analogy and constructive defining

    Directory of Open Access Journals (Sweden)

    Michael de Villiers

    2008-10-01

    Full Text Available This paper first discusses the genetic approach and the relevance of the history of mathematics for teaching, reasoning by analogy, and the role of constructive defining in the creation of new mathematical content. It then uses constructive defining to generate a new generalization of the Nagel line of a triangle to polygons circumscribed around a circle, based on an analogy between the Nagel line and the Euler line of a triangle.

  18. Chemically defined medium and Caenorhabditis elegans

    Science.gov (United States)

    Szewczyk, Nathaniel J.; Kozak, Elena; Conley, Catharine A.

    2003-01-01

    BACKGROUND: C. elegans has been established as a powerful genetic system. Use of a chemically defined medium (C. elegans Maintenance Medium (CeMM)) now allows standardization and systematic manipulation of the nutrients that animals receive. Liquid cultivation allows automated culturing and experimentation and should be of use in large-scale growth and screening of animals. RESULTS: We find that CeMM is versatile and culturing is simple. CeMM can be used in a solid or liquid state, it can be stored unused for at least a year, unattended actively growing cultures may be maintained longer than with standard techniques, and standard C. elegans protocols work well with animals grown in defined medium. We also find that there are caveats to using defined medium. Animals in defined medium grow more slowly than on standard medium, appear to display adaptation to the defined medium, and display altered growth rates as they change the composition of the defined medium. CONCLUSIONS: As was suggested with the introduction of C. elegans as a potential genetic system, use of defined medium with C. elegans should prove a powerful tool.

  19. Shaping Sexual Knowledge: A Cultural History of Sex Education in Twentieth Century Europe. Routledge Studies in the Social History of Medicine

    Science.gov (United States)

    Sauerteig, Lutz, Ed.; Davidson, Roger, Ed.

    2012-01-01

    The history of sex education enables us to gain valuable insights into the cultural constructions of what different societies have defined as 'normal' sexuality and sexual health. Yet, the history of sex education has only recently attracted the full attention of historians of modern sexuality. "Shaping Sexual Knowledge: A Cultural History of…

  1. Freud and history before 1905: from defending to questioning the theory of a glorious past.

    Science.gov (United States)

    Cotti, Patricia

    2008-01-01

    By sticking closely to Freud's use of the German term Geschichte (history, story) between 1894 and 1905, I will reveal two conceptions of history. The first is the theory of the glorious past and its archaeological metaphor, which accompanied and sustained the seduction theory of cultural history. I will define how this change was determined by an evolution in Freud's conceptions of childhood prehistory and original history. I will also question how the history problem interfered with Freud's auto-analysis.

  2. Statistical analysis of life history calendar data.

    Science.gov (United States)

    Eerola, Mervi; Helske, Satu

    2016-04-01

    The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with the model-free data-mining method of sequence analysis. In event history analysis, we estimate, instead of transition hazards, the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns while event history analysis is needed for causal inquiries.
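
    As a minimal illustration of the sequence-analysis side (a dissimilarity between event sequences under user-defined substitution costs), the sketch below implements a plain edit distance. The yearly state codes and the cost table are invented for the example and are not taken from the paper.

```python
def sequence_distance(a, b, sub_cost, indel=1.0):
    """Edit distance between state sequences a and b, with substitution
    costs from the sub_cost table and a fixed insertion/deletion cost."""
    n, m = len(a), len(b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = 0.0 if a[i - 1] == b[j - 1] else sub_cost[(a[i - 1], b[j - 1])]
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + s)      # substitution / match
    return d[n][m]

# Hypothetical yearly states: E = education, W = work, P = parenthood.
costs = {(x, y): 2.0 for x in "EWP" for y in "EWP" if x != y}
costs[("E", "W")] = costs[("W", "E")] = 1.0   # treat E <-> W as closer states
print(sequence_distance("EEWWP", "EEEWW", costs))
```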

  3. History of Cardiology in India

    OpenAIRE

    Das, Mrinal Kanti; Kumar, Soumitra; Deb, Pradip Kumar; Mishra, Sundeep

    2015-01-01

    History as a science revolves around memories, travellers' tales, fables and chroniclers' stories, gossip and trans-telephonic conversations. Medicine itself as per the puritan's definition is a non-exact science because of the probability-predictability-sensitivity-specificity factors. Howsoever, the chronicles of Cardiology in India is quite interesting and intriguing. Heart and circulation was known to humankind from pre-Vedic era. Various therapeutics measures including the role of Yoga a...

  4. Measuring Savings

    OpenAIRE

    Mark Schreiner

    2001-01-01

    Development depends on saving. But what exactly is saving, and how is it measured? This paper defines saving and describes several measures of financial savings. The measures account for the passage of time and for the three stages of saving: putting in (depositing), keeping in (maintaining a balance), and taking out (withdrawing). Together, the different measures capture how people move financial resources through time.
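
    To make the three stages concrete, here is a toy calculation of two simple financial-savings measures over dated transactions: net savings (putting in minus taking out) and a time-weighted average balance (keeping in). The transaction format and function name are invented for illustration and are not the paper's definitions.

```python
def savings_measures(transactions, horizon_days):
    """transactions: (day, amount) pairs; deposits positive, withdrawals
    negative. Returns net savings and the time-weighted average balance."""
    txs = sorted(transactions)
    net = sum(amount for _, amount in txs)
    balance, last_day, weighted = 0.0, 0, 0.0
    for day, amount in txs:
        weighted += balance * (day - last_day)   # balance held over interval
        balance, last_day = balance + amount, day
    weighted += balance * (horizon_days - last_day)
    return {"net_savings": net,
            "avg_daily_balance": weighted / horizon_days}

print(savings_measures([(0, 100.0), (30, 50.0), (60, -80.0)], 90))
# net_savings = 70.0; avg_daily_balance ~ 106.67 reflects how long the
# funds actually stayed on deposit, not just the final amount.
```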

  5. South Pole-Aitken Basin (SPA) Units Delineated by Measures of Surface Roughness: Implications for the History and Evolution of the Basin as Seen by Data from the Lunar Reconnaissance Orbiter (LRO)

    Science.gov (United States)

    Petro, N. E.; Jolliff, B. L.; Cahill, J. T.; Whelley, P.

    2015-12-01

    The interior of SPA contains a range of morphologic units, from smooth plains and mare basalts to rough, ancient terrains. Recent data, particularly from LRO, provide unique measures of SPA surface properties. With each new dataset, the differences between the interior of SPA and its surroundings become more, or in some cases less, clearly defined. Here we explore recent datasets that offer insight into surface roughness at a variety of scales and assess implications for the origins of units across SPA. Identifying the origin of units in SPA is critical for selecting future sampling sites that address the science goal of determining the age of SPA. The unique interior of SPA relative to the rest of the Moon is demonstrated by Mini-RF and LOLA derived products. Mini-RF data show that the interior of SPA has a slightly higher average Circular Polarization Ratio (CPR) than nearly any other terrain on the Moon, with the exception of the interior of the Orientale Basin. Cahill et al. [2014, Icarus] note that the average interior CPR value of SPA is similar to but slightly higher than that of the mid-latitude farside highlands, suggesting that both are enhanced in blocks at the surface and near subsurface (to depths ...). CPR and other high-resolution measures of surface roughness within SPA will be used to infer and delineate morphologic terrains, and to distinguish volcanic and impact-generated units.

  6. Bilayer graphene quantum dot defined by topgates

    Energy Technology Data Exchange (ETDEWEB)

    Müller, André; Kaestner, Bernd; Hohls, Frank; Weimann, Thomas; Pierz, Klaus; Schumacher, Hans W., E-mail: hans.w.schumacher@ptb.de [Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig (Germany)

    2014-06-21

    We investigate the application of nanoscale topgates on exfoliated bilayer graphene to define quantum dot devices. At temperatures below 500 mK, the conductance underneath the grounded gates is suppressed, which we attribute to nearest neighbour hopping and strain-induced piezoelectric fields. The gate-layout can thus be used to define resistive regions by tuning into the corresponding temperature range. We use this method to define a quantum dot structure in bilayer graphene showing Coulomb blockade oscillations consistent with the gate layout.

  7. Teaching Sport as History, History through Sport

    Science.gov (United States)

    Wheeler, Robert F.

    1978-01-01

    Describes an undergraduate history course based on two themes: sport as a reflection of society and sport as a socializing agent affecting society. The course focuses on sports and industrialization, traditional and modern sports, political and economic aspects of sport, and inequality and discrimination in sports. (Author/JK)

  8. A history of the histories of econometrics

    NARCIS (Netherlands)

    Boumans, M.; Dupont-Kieffer, A.

    2011-01-01

    Econometricians have from the start considered historical knowledge of their own discipline as reflexive knowledge useful for delineating their discipline, that is, for setting its disciplinary boundaries with respect to its aims, its methods, and its scientific values. As such, the histories written…

  9. Charles E. Rosenberg and the multifaceted promise of medical history.

    Science.gov (United States)

    Stevens, Rosemary A

    2008-10-01

    Charles E. Rosenberg has had a major influence in defining the history of medicine as a field. However, critics who focus on his leadership or "school" in terms of defined scholarly perspectives, including those of social history and the framing of disease, offer inadequate descriptions of the messages, breadth, and scope of his scholarly work as a whole. Shoehorning the history of medicine into prescribed patterns in order to build a more unitary discipline would weaken rather than strengthen the field and is not in the Rosenberg tradition.

  10. Defining orthologs and pangenome size metrics.

    Science.gov (United States)

    Bosi, Emanuele; Fani, Renato; Fondi, Marco

    2015-01-01

    Since the advent of ultra-massive sequencing techniques, the consequent drop in both the price and the time required has made it feasible to sequence ever more genomes from microbes belonging to the same taxonomic unit. Eventually, this led to the concept of the pangenome, that is, the entire set of genes present in a group of representatives of the same genus/species. It can, in turn, be divided into the core genome, defined as the set of genes present in all the genomes under study, and the dispensable genome, the set of genes possessed by only one or a subset of the organisms. When analyzing a pangenome, an interesting task is to measure its size, thus estimating the gene repertoire of a given taxonomic group. This is usually done by counting the novel genes added to the overall pangenome as new genomes are sequenced and annotated. A pangenome can also be classified as open or closed: an open pangenome grows indefinitely as new genomes are added, so sequencing additional strains will likely yield novel genes; conversely, in a closed pangenome, adding new genomes will not lead to the discovery of new coding capabilities. A central point in pangenomics is the definition of homology relationships between genes belonging to different genomes. This amounts to searching for genes with similar sequences in different organisms (including both paralogous and orthologous genes). In this chapter, methods for finding groups of orthologs between genomes and for estimating pangenome size are discussed. Also, working code to address these tasks is provided.
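
    A minimal sketch of the size-counting step described in this abstract: given per-genome gene-family sets (orthology assumed to be resolved upstream, e.g. by reciprocal-best-hit searches), it tracks pangenome and core-genome sizes as genomes are added, which is the curve used to judge whether a pangenome looks open or closed. The gene-family sets are toy data.

```python
# Toy gene-family sets, one per sequenced genome.
genomes = [
    {"gA", "gB", "gC", "gD"},
    {"gA", "gB", "gC", "gE"},
    {"gA", "gB", "gF"},
]

pan, core = set(), None
for i, families in enumerate(genomes, start=1):
    novel = len(families - pan)              # families this genome adds
    pan |= families                          # pangenome: union so far
    core = families if core is None else core & families  # intersection
    print(f"after {i} genomes: pangenome={len(pan)}, "
          f"core={len(core)}, novel={novel}")

# Heuristically, a pangenome is 'open' if the number of novel families
# per added genome does not tend to zero as more genomes are sequenced.
```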

  11. Family history of type 2 diabetes and prevalence of metabolic syndrome in adult Asian Indians.

    Science.gov (United States)

    Das, Mithun; Pal, Susil; Ghosh, Arnab

    2012-04-01

    Our objective was to test the association between familial risk of type 2 diabetes mellitus (T2DM) and the prevalence of metabolic syndrome (MS) in adult Asian Indians. A total of 448 adult (>30 years) individuals (257 males and 191 females) participated in the study. Familial risk of T2DM was classified into three groups, viz., 1 = both parents affected; 2 = parent and/or siblings affected; and 3 = no family history of T2DM. Anthropometric measures, blood pressure, fasting blood glucose and metabolic profiles were studied using standard techniques. MS was defined accordingly. The prevalence of MS phenotypes was estimated and compared among the three familial risk strata. Individuals with a history of both parents affected by diabetes had significantly higher systolic blood pressure (SBP), diastolic blood pressure (DBP) and fasting blood glucose (FBG; P=0.035) than individuals with no family history of T2DM. A significant difference was also noticed between individuals with and without MS according to family history of diabetes (P<0.001). Differences were evident between individuals who fulfilled all the MS criteria (P=0.001) and individuals with only one or two criteria (phenotypes) according to family history of T2DM. Family history of T2DM had a significant effect on individuals with MS compared to their counterparts (individuals with no family history of T2DM). It therefore seems reasonable to argue that family history of T2DM could be useful as a predictive tool for early diagnosis and prevention of MS in the Asian Indian population.

  12. Branch dependence in the "consistent histories" approach to quantum mechanics

    CERN Document Server

    Müller, T

    2005-01-01

    In the consistent histories formalism one specifies a family of histories as an exhaustive set of pairwise exclusive descriptions of the dynamics of a quantum system. We define branching families of histories, which strike a middle ground between the two available mathematically precise definitions of families of histories, viz., product families and Isham's history projector operator formalism. The former are too narrow for applications, and the latter's generality comes at a certain cost, barring an intuitive reading of the "histories". Branching families retain the intuitiveness of product families, they allow for the interpretation of a history's weight as a probability, and they allow one to distinguish two kinds of coarse-graining. It is shown that for branching families, the "consistency condition" is not a precondition for assigning probabilities, but for a specific kind of coarse-graining.
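
    For context, the "consistency condition" referred to here is usually stated through the decoherence functional of the history-projector formalism; a standard textbook form (not the paper's branching-family notation) is:

```latex
% Class operator for history \alpha (time-ordered projectors) and the
% decoherence functional with initial state \rho:
\[
  C_\alpha = P^{(n)}_{\alpha_n} \cdots P^{(1)}_{\alpha_1},
  \qquad
  D(\alpha, \alpha') = \operatorname{Tr}\bigl( C_\alpha \, \rho \, C_{\alpha'}^{\dagger} \bigr)
\]
% Consistency requires Re D(\alpha, \alpha') = 0 for \alpha \neq \alpha',
% after which p(\alpha) = D(\alpha, \alpha) can be read as a probability.
```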

  13. Reconfigurable, Cognitive Software Defined Radio Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc. (IAI) is currently developing a software defined radio (SDR) platform that can adaptively switch between different modes of operation for...

  14. Reconfigurable, Cognitive Software Defined Radio Project

    Data.gov (United States)

    National Aeronautics and Space Administration — IAI is actively developing Software Defined Radio platforms that can adaptively switch between different modes of operation by modifying both transmit waveforms and...

  15. Convolutional Goppa codes defined on fibrations

    CERN Document Server

    Curto, J I Iglesias; Martín, F J Plaza; Sotelo, G Serrano

    2010-01-01

    We define a new class of Convolutional Codes in terms of fibrations of algebraic varieties, generalizing our previous constructions of Convolutional Goppa Codes. Using this general construction we can give several examples of Maximum Distance Separable (MDS) Convolutional Codes.

  16. Radiation Tolerant Software Defined Video Processor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — MaXentric is proposing a radiation tolerant Software Defined Video Processor, codenamed SDVP, for the problem of advanced motion imaging in the space environment....

  17. Towards a Southern African English Defining Vocabulary

    African Journals Online (AJOL)

    user

    In my experience, defining vocabularies compiled for English dictionaries for a British or ... Oxford 3000, which contains 3 540 entries, and is available on the Internet. ... One thing that became apparent was a lack of consistency within lexical...

  18. Software Defined Multiband EVA Radio Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  19. Software Defined Multiband EVA Radio Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of Phase 2 is to build a reliable, lightweight, programmable, multi-mode, miniaturized EVA Software Defined Radio (SDR) that supports data telemetry,...

  20. Optimum Criteria for Developing Defined Structures

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available Basic aspects concerning distributed applications are presented: definition, particularities and importance. For distributed applications, linear, arborescent, and graph structures are defined, with different versions and aggregation methods. Distributed applications have associated structures which, through their characteristics, influence the costs of the stages in the development cycle and the exploitation costs transferred to each user. The complexity of the defined structures is analyzed. Minimum and maximum criteria are enumerated for optimizing distributed application structures.

  1. History of Science

    Science.gov (United States)

    Oversby, John

    2010-01-01

    In this article, the author discusses why the history of science should be included in the science curriculum in schools. He also presents some opportunities that can come out of using historical contexts, and findings from a study assessing the place of history of science in readily available textbooks.

  3. History of Mathematics

    DEFF Research Database (Denmark)

    Hansen, Vagn Lundsgaard; Gray, Jeremy

    Volume 1 in Theme on "History of Mathematics", in "Encyclopedia of Life Support Systems" (EOLSS), developed under the auspices of the UNESCO.

  4. Writing American Indian History

    Science.gov (United States)

    Noley, Grayson B.

    2008-01-01

    The purpose of this paper is to critique the manner in which history about American Indians has been written and propose a rationale for the rethinking of what we know about this subject. In particular, histories of education as regards the participation of American Indians are a subject that has been given scant attention over the years, and when…

  5. Aggersborg through history

    DEFF Research Database (Denmark)

    2014-01-01

    Aggersborg's history from the time of the end of the circular fortress till the present day, with a focus on the late Viking Age and the Middle Ages.

  7. The history of tuberculosis.

    Science.gov (United States)

    Daniel, Thomas M

    2006-11-01

    Tuberculosis has claimed its victims throughout much of known human history. It reached epidemic proportions in Europe and North America during the 18th and 19th centuries, earning the sobriquet, "Captain Among these Men of Death." Then it began to decline. Understanding of the pathogenesis of tuberculosis began with the work of Théophile Laennec at the beginning of the 19th century and was further advanced by the demonstration of the transmissibility of Mycobacterium tuberculosis infection by Jean-Antoine Villemin in 1865 and the identification of the tubercle bacillus as the etiologic agent by Robert Koch in 1882. Clemens von Pirquet developed the tuberculin skin test in 1907 and 3 years later used it to demonstrate latent tuberculous infection in asymptomatic children. In the late 19th and early 20th centuries sanatoria developed for the treatment of patients with tuberculosis. The rest provided there was supplemented with pulmonary collapse procedures designed to rest infected parts of lungs and to close cavities. Public Health measures to combat the spread of tuberculosis emerged following the discovery of its bacterial cause. BCG vaccination was widely employed following World War I. The modern era of tuberculosis treatment and control was heralded by the discovery of streptomycin in 1944 and isoniazid in 1952.

  8. NIF Gamma Reaction History

    Science.gov (United States)

    Herrmann, H. W.; Kim, Y.; Young, C. S.; Mack, J. M.; McEvoy, A. M.; Hoffman, N. M.; Wilson, D. C.; Langenbrunner, J. R.; Evans, S.; Batha, S. H.; Stoeffl, W.; Lee, A.; Horsfield, C. J.; Rubery, M.; Miller, E. K.; Malone, R. M.; Kaufman, M. I.

    2010-11-01

    The primary objective of the NIF Gamma Reaction History (GRH) diagnostics is to provide bang time and burn width information based upon measurement of fusion gamma-rays. This is accomplished with energy-thresholded Gas Cherenkov detectors that convert MeV gamma-rays into UV/visible photons for high-bandwidth optical detection. In addition, the GRH detectors can perform γ-ray spectroscopy to explore other nuclear processes from which additional significant implosion parameters may be inferred (e.g., plastic ablator areal density). Implementation is occurring in 2 phases: 1) four PMT-based channels mounted to the outside of the NIF target chamber at ~6 m from TCC (GRH-6m) for the 3e13-3e16 DT neutron yield range expected during the early ignition-tuning campaigns; and 2) several channels located just inside the target bay shield wall at ~15 m from TCC (GRH-15m) with optical paths leading through the wall into well-shielded streak cameras and PMTs for the 1e16-1e20 yield range expected during the DT ignition campaign. This suite of diagnostics will allow exploration of interesting γ-ray physics well beyond the ignition campaign. Recent data from OMEGA and NIF will be shown.
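
    As background on why these gas Cherenkov channels are energy-thresholded: a gamma-ray is first converted to an electron (e.g. by Compton scattering), and that electron radiates Cherenkov light only above a kinetic-energy threshold set by the refractive index n of the gas. The relation below is standard detector physics, not a formula quoted from the GRH papers:

```latex
% Cherenkov emission requires electron speed \beta > 1/n (gas index n),
% giving the kinetic-energy threshold
\[
  T_{\mathrm{th}} = m_e c^2 \left( \frac{1}{\sqrt{1 - 1/n^{2}}} - 1 \right)
\]
% Tuning the gas pressure changes n, and hence the minimum gamma-ray
% energy that can produce detectable Cherenkov light in a given channel.
```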

  9. Thucydides was Right: Defining the Future Threat

    Science.gov (United States)

    2015-04-01

    ... matters of policy and strategy was revealed in painful form in the political and strategic errors that succeeded September 11, 2001 in the counterinsur... visceral disdain for some cultural explanation of strategic history. ... I explain this process fully in my book, The Future of Strategy, Cambridge, UK...

  10. Lack of reproducibility of linkage results in serially measured blood pressure data

    NARCIS (Netherlands)

    Patel, [No Value; Celedon, JC; Weiss, ST; Palmer, LJ

    2003-01-01

    Background: Using the longitudinal Framingham Heart Study data on blood pressure, we analyzed the reproducibility of linkage measures from serial cross-sectional surveys of a defined population by performing genome-wide model-free linkage analyses to systolic blood pressure (SBP) and history of hypertension…

  11. Conceptualizing time preference: a life-history analysis.

    Science.gov (United States)

    Copping, Lee T; Campbell, Anne; Muncer, Steven

    2014-09-29

    Life-history theory (LHT) has drawn upon the concept of "time preference" as a psychological mechanism for the development of fast and slow strategies. However, the conceptual and empirical nature of this mechanism is ill-defined. This study compared four traits commonly used as measures of "time preference" (impulsivity, sensation seeking, future orientation and delay discounting) and evaluated their relationship to variables associated with life-history strategies (aggressive behavior and mating attitudes, biological sex, pubertal timing, victimization, and exposure to aggression in the environment). Results indicated that only sensation seeking consistently showed all the predicted associations, although impulsivity, future orientation, and delay discounting showed some significant associations. A unidimensional higher-order factor of "time preference" did not adequately fit the data and lacked structural invariance across age and sex, suggesting that personality traits associated with LHT do not represent a global trait. We discuss the use of personality traits as measures in LHT and suggest that greater caution and clarity is required when conceptualizing this construct in future work.
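
    Of the four traits compared above, delay discounting is the one with a conventional quantitative model: choice data are typically fit with a hyperbolic discount function. The form below is standard background from the behavioral literature, not this study's model:

```latex
% Hyperbolic discounting: subjective value V of an amount A delayed by D,
% where k is the individual's discount rate (larger k = steeper
% devaluation of delayed rewards, i.e. stronger "present orientation").
\[
  V = \frac{A}{1 + kD}
\]
```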

  12. Defining hormesis: evaluation of a complex concentration response phenomenon.

    Science.gov (United States)

    Kendig, Eric L; Le, Hoa H; Belcher, Scott M

    2010-01-01

    Hormesis describes dose-response relationships characterized by a reversal of response between low and high doses of chemicals, biological molecules, physical stressors, or other initiators of a response. Acceptance of hormesis as a viable dose-response theory has been limited until recently, in part, because of poor conceptual understanding, ad hoc and inappropriate use, and lack of a defined mechanism. By examining the history of this dose-response theory, it is clear that both pharmacological and toxicological studies provide evidence for hormetic dose responses, but retrospective examination of studies can be problematic at best. Limited scientific evidence and lack of a common lexicon with which to describe these responses have left hormesis open to inappropriate application to unrelated dose-response relationships. Future studies should examine low-dose effects using unbiased, descriptive criteria to further the scientific understanding of this dose response. A clear, concise definition is required to further the limited scientific evidence for hormetic dose responses.

  13. A history of mathematics

    CERN Document Server

    Boyer, Carl B

    2011-01-01

    The updated new edition of the classic and comprehensive guide to the history of mathematics. For more than forty years, A History of Mathematics has been the reference of choice for those looking to learn about the fascinating history of humankind's relationship with numbers, shapes, and patterns. This revised edition features up-to-date coverage of topics such as Fermat's Last Theorem and the Poincaré Conjecture, in addition to recent advances in areas such as finite group theory and computer-aided proofs. Distills thousands of years of mathematics into a single, approachable volume; covers…

  14. Science A history

    CERN Document Server

    Gribbin, John

    2002-01-01

    From award-winning science writer John Gribbin, "Science: A History" is the enthralling story of the men and women who changed the way we see the world, and the turbulent times they lived in. From Galileo, tried by the Inquisition for his ideas, to Newton, who wrote his rivals out of the history books; from Marie Curie, forced to work apart from male students for fear she might excite them, to Louis Agassiz, who marched his colleagues up a mountain to prove that the ice ages had occurred. Filled with pioneers, visionaries, eccentrics and madmen, this is the history of science as it has never been told before.

  15. Software-Defined Cellular Mobile Network Solutions

    Institute of Scientific and Technical Information of China (English)

    Jiandong Li; Peng Liu; Hongyan Li

    2014-01-01

    The emergence of software-defined networking (SDN), especially in terms of the prototype associated with OpenFlow, provides new possibilities for innovating on network design. Researchers have started to extend SDN to cellular networks. Such a new programmable architecture is beneficial to the evolution of mobile networks and allows operators to provide better services. The typical cellular network comprises radio access network (RAN) and core network (CN); hence, the technique roadmap diverges in two ways. In this paper, we investigate SoftRAN, the latest SDN solution for RAN, and SoftCell and MobileFlow, the latest solutions for CN. We also define a series of control functions for CROWD. Unlike the other literature, we emphasize only software-defined cellular network solutions and specifications in order to provide possible research directions.

  16. Defining resilience within a risk-informed assessment framework

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Unwin, Stephen D.; Holter, Gregory M.; Bass, Robert B.; Dagle, Jeffery E.

    2011-08-01

    The concept of resilience is the subject of considerable discussion in academic, business, and governmental circles. The United States Department of Homeland Security, for one, has emphasised the need to consider resilience in safeguarding critical infrastructure and key resources. The concept of resilience is complex, multidimensional, and defined differently by different stakeholders. The authors contend that there is benefit in moving from discussing resilience as an abstraction to defining resilience as a measurable characteristic of a system. This paper proposes defining resilience measures using elements of a traditional risk assessment framework, both to help clarify the concept of resilience and to provide non-traditional risk information. The authors show that various, diverse dimensions of resilience can be quantitatively defined in a common risk assessment framework based on the concept of loss of service. This allows comparison of options for improving the resilience of infrastructure and provides a means to perform cost-benefit analysis. This paper discusses definitions and key aspects of resilience, presents equations for the risk of loss of infrastructure function that incorporate four key aspects of resilience that could prevent or mitigate that loss, describes proposed resilience factor definitions based on those risk impacts, and provides an example that illustrates how resilience factors would be calculated using a hypothetical scenario.
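
    A toy version of the kind of calculation described here (risk of loss of service, with resilience factors that prevent or mitigate that loss); the scenario structure, factor names, and numbers are invented for illustration and are not the authors' equations:

```python
# Expected annual loss of service (e.g., customer-hours) over hazard
# scenarios; each resilience factor scales down part of the loss.
scenarios = [
    # (annual frequency, unmitigated loss of service)
    (0.10, 1_000.0),    # e.g., severe storm
    (0.01, 20_000.0),   # e.g., major substation failure
]

def expected_annual_loss(scenarios, absorption, recovery):
    """absorption, recovery in [0, 1]: fraction of loss avoided by
    hardening and by faster restoration, respectively."""
    mitigated = (1.0 - absorption) * (1.0 - recovery)
    return sum(freq * loss * mitigated for freq, loss in scenarios)

baseline  = expected_annual_loss(scenarios, 0.0, 0.0)
resilient = expected_annual_loss(scenarios, absorption=0.3, recovery=0.5)
print(f"baseline {baseline:.0f} vs resilient {resilient:.0f} per year")
# Comparing the two numbers is what supports the cost-benefit framing.
```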

  17. BIOMARKERS TO DEFINE OPTIMAL PROTEIN REQUIREMENT

    OpenAIRE

    Di Girolamo, Filippo Giorgio

    2015-01-01

    Dietary proteins are the source of the amino acids required by the body for tissue growth and maintenance. The Population Reference Intake (PRI) for proteins, as defined by the European Food Safety Authority (EFSA) for healthy adults, including the elderly, is 0.83 g/kg body weight/day. This amount is defined on the net balance of body protein (or “nitrogen balance”, given by the difference between dietary nitrogen intake and losses) equivalent to 0.66 g/kg/day plus a safety factor for interp...

  18. GNU Based Security in Software Defined Radio

    Directory of Open Access Journals (Sweden)

    H. B. Bhadka

    2012-11-01

    Full Text Available Various new technologies are being explored for radio communication in the 21st century. Among them, the technology of "software defined radio" attracts large attention. Software Defined Radio (SDR) technology implements some of the functional modules of a radio system in software, enabling highly flexible handsets. SDR devices may be reconfigured dynamically via the download of new software modules. Malicious or malfunctioning downloaded software presents serious security risks to SDR devices and the networks in which they operate. Together with the use of software downloading, future terminals will become a platform to support the deployment of yet unspecified services and applications.

  19. Medical abortion. defining success and categorizing failures

    DEFF Research Database (Denmark)

    Rørbye, Christina; Nørgaard, Mogens; Vestermark, Vibeke;

    2003-01-01

    Medical abortion was performed in 461 consecutive women with gestational age ≤ 63 days using a regimen of mifepristone 600 mg followed 2 days later by gemeprost 1 mg vaginally. Success, defined as no surgical intervention, declined from 98.7% after 2 weeks to 94.6% after 15 weeks. The differe...

  20. What Defines a Separate Hydrothermal System

    Energy Technology Data Exchange (ETDEWEB)

    Lawless, J.V.; Bogie, I.; Bignall, G.

    1995-01-01

    Separate hydrothermal systems can be defined in a variety of ways. Criteria which have been applied include separation of heat source, upflow, economic resource and geophysical anomaly. Alternatively, connections have been defined by the effects of withdrawal of economically useful fluid and subsidence, effects of reinjection, changes in thermal features, or by a hydrological connection of groundwaters. It is proposed here that: "A separate hydrothermal system is one that is fed by a separate convective upflow of fluid, at a depth above the brittle-ductile transition for the host rocks, while acknowledging that separate hydrothermal systems can be hydrologically interconnected at shallower levels".