WorldWideScience

Sample records for define history measurements

  1. Defining Astrology in Ancient and Classical History

    Science.gov (United States)

    Campion, Nicholas

    2015-05-01

    Astrology in the ancient and classical worlds can be partly defined by its role, and partly by the way in which scholars spoke about it. The problem is complicated by the fact that the word is Greek - it has no Babylonian or Egyptian cognates - and even in Greece it was interchangeable with its cousin, 'astronomy'. Yet if we are to understand the role of the sky, stars and planets in culture, debates about the nature of ancient astrology, by both classical and modern scholars, must be taken into account. This talk will consider modern scholars' typologies of ancient astrology, together with ancient debates from Cicero in the 1st century BC, to Plotinus (204/5-270 AD) and Isidore of Seville (c. 560 - 4 April 636). It will consider the implications for our understanding of astronomy's role in culture, and conclude that in the classical period astrology may be best understood through its diversity and allegiance to competing philosophies, and that its functions were therefore similarly varied.

  2. Defining and Measuring Safeguards Culture

    International Nuclear Information System (INIS)

    Frazar, Sarah L.; Mladineo, Stephen V.

    2010-01-01

    In light of the shift toward State Level Evaluations and information driven safeguards, this paper offers a refined definition of safeguards culture and a set of metrics for measuring the extent to which a safeguards culture exists in a state. Where the IAEA is able to use the definition and metrics to come to a positive conclusion about the country, it may help reduce the burden on the Agency and the state.

  3. Use of plant operating history to define transient loads

    International Nuclear Information System (INIS)

    Dwivedy, K.K.

    1996-01-01

Fatigue and crack growth analyses of components subjected to transient loads have been under continuous development during the recent past to include the effects of environment on the components. The accuracy of the evaluation method, and its effect on the predicted reliability of components in the operating environment, has become a focus of attention. Methods have integrated available material/component test data to improve evaluation techniques. In the definition of thermal transient loads, however, the analyst still has to remain conservative, because no realistic guidelines have been developed to define thermal transients and their sequences. Fatigue re-evaluations of components are becoming increasingly necessary in aging operating plants for two reasons: (1) components show age-related degradation and cannot be repaired or replaced for economic or logistic reasons; (2) components experience transient conditions that were not considered in the original design. In either case, the evaluation of the remaining life of components requires a definition of transients and their sequence from the time the component was put in service until the end of life. As common practice, initial plant design transients are used in a conservative definition of sequences, yielding results that are unrealistic for the situation and can lead to inaccurate estimates of the remaining life of components. The objective of this paper is to use plant operating history and plant monitoring data to provide procedures and techniques for defining realistic transients for evaluation.

  4. Defining Multiple Chronic Conditions for Quality Measurement.

    Science.gov (United States)

    Drye, Elizabeth E; Altaf, Faseeha K; Lipska, Kasia J; Spatz, Erica S; Montague, Julia A; Bao, Haikun; Parzynski, Craig S; Ross, Joseph S; Bernheim, Susannah M; Krumholz, Harlan M; Lin, Zhenqiu

    2018-02-01

Patients with multiple chronic conditions (MCCs) are a critical but undefined group for quality measurement. We present a generally applicable systematic approach to defining an MCC cohort of Medicare fee-for-service beneficiaries that we developed for a national quality measure, risk-standardized rates of unplanned admissions for Accountable Care Organizations. To define the MCC cohort we: (1) identified potential chronic conditions; (2) set criteria for cohort conditions based on the MCC framework and measure concept; (3) applied the criteria, informed by empirical analysis, experts, and the public; (4) described "broader" and "narrower" cohorts; and (5) selected the final cohort with stakeholder input. Subjects were patients with chronic conditions. Participants included 21.8 million Medicare fee-for-service beneficiaries in 2012 aged 65 years and above with at least 1 of 27 Medicare Chronic Condition Warehouse conditions. In total, 10 chronic conditions were identified based on our criteria; 8 of these 10 were associated with notably increased admission risk when co-occurring. A broader cohort (2+ of the 8 conditions) included 4.9 million beneficiaries (23% of the total cohort) with an admission rate of 70 per 100 person-years; it captured 53% of total admissions. The narrower cohort (3+ conditions) had 2.2 million beneficiaries (10%) with 100 admissions per 100 person-years and captured 32% of admissions. Most stakeholders viewed the broader cohort as best aligned with the measure concept. By systematically narrowing chronic conditions to those most relevant to the outcome and incorporating stakeholder input, we defined an MCC admission measure cohort supported by stakeholders. This approach can be used as a model for other MCC outcome measures.
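The broader (2+ of 8 conditions) and narrower (3+) cohort definitions can be pictured as a simple threshold over per-beneficiary condition sets. This is only an illustrative sketch; the condition names below are hypothetical placeholders, not the measure's actual 8-condition list.

```python
# Illustrative sketch of the cohort thresholds described above: "broader"
# means 2 or more of the 8 cohort conditions, "narrower" means 3 or more.
# Condition names are hypothetical placeholders.

def in_cohort(conditions: set, threshold: int) -> bool:
    """True if a beneficiary carries at least `threshold` cohort conditions."""
    return len(conditions) >= threshold

beneficiaries = {
    "bene_a": {"diabetes", "heart_failure"},
    "bene_b": {"diabetes"},
    "bene_c": {"diabetes", "heart_failure", "ckd"},
}

broader = {b for b, c in beneficiaries.items() if in_cohort(c, 2)}
narrower = {b for b, c in beneficiaries.items() if in_cohort(c, 3)}
```

Every narrower-cohort member is by construction also a broader-cohort member, which matches the nested "broader"/"narrower" framing in the abstract.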

  5. Alpha emitters activity measurement using the defined solid angle method

    International Nuclear Information System (INIS)

    Blanchis, P.

    1983-01-01

The defined solid angle counting method can reach very high accuracy, especially for heavy particles such as the alpha particles emitted by a radioactive source. The activity measurement of such sources with a relative uncertainty of the order of 0.01% is investigated. Such accuracy is attainable only under suitable conditions: the radiation emitted by the source must be isotropic, and all particles emitted into the effective solid angle must be detected. The detection efficiency must be equal to unity, and phenomena such as absorption or scattering must be negligible. It is shown that corrections often become necessary. All parameters which can influence the measurements are studied.
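The principle can be sketched numerically. For the common geometry of a point source on the axis of a circular diaphragm (radius r, at distance d), the fraction of the full sphere subtended has a closed form, and activity follows from the count rate; this sketch assumes unit detection efficiency and ignores the absorption and scattering corrections the abstract warns about.

```python
import math

def geometric_factor(d: float, r: float) -> float:
    """Fraction of 4*pi sr subtended by a coaxial circular aperture:
    G = Omega / (4*pi) = 0.5 * (1 - d / sqrt(d**2 + r**2))."""
    return 0.5 * (1.0 - d / math.sqrt(d * d + r * r))

def activity(counts: float, live_time_s: float, d: float, r: float) -> float:
    """Source activity in decays per second, assuming isotropic emission
    and unit detection efficiency (no absorption/scattering corrections)."""
    return counts / (live_time_s * geometric_factor(d, r))
```

For example, with the aperture radius equal to the source distance (r = d), G is about 0.1464, so roughly one in seven emitted particles enters the defined solid angle.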

  6. Performance considerations of ultrasonic distance measurement with well defined properties

    International Nuclear Information System (INIS)

    Elmer, Hannes; Schweinzer, Herbert

    2005-01-01

Conventional ultrasonic distance measurement systems based on narrow-bandwidth ultrasonic bursts and amplitude detection are widely used because of their low cost and easy implementation. However, the achievable results depend strongly on the environment in which the system is deployed: for well-defined objects located near the measurement axis of the system, good results are generally obtained. If arbitrary objects in arbitrary positions in front of the sensor are expected, however, one must account for strongly object-dependent detection areas in which objects are detected with decreasing accuracy towards the borders. In previous work we developed an ultrasonic measurement system that provides accurate distance measurements within a well-defined detection area, independent of the reflection properties of the objects. This measurement system is based on the One Bit Correlation method, described in the following. To minimise its implementation effort, it is necessary to examine the influence of system parameters, such as the correlation length, on the results expected for different signal-to-noise ratios of the received signal. These examinations are presented below, and the results discussed; they allow a well-conditioned system to be obtained that makes best use of the given system resources.
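The abstract does not spell out the One Bit Correlation method; a minimal, hypothetical sketch of the general idea is to quantize both the transmitted template and the received signal to their signs (one bit per sample) and cross-correlate, taking the best-scoring lag as the echo's time of flight in samples.

```python
import math

def one_bit(signal):
    """Quantize samples to +1/-1 (the 'one bit' of the method's name)."""
    return [1 if v >= 0 else -1 for v in signal]

def echo_lag(template, received):
    """Lag (in samples) at which the one-bit cross-correlation peaks."""
    t, r = one_bit(template), one_bit(received)
    n = len(t)
    scores = [sum(t[i] * r[k + i] for i in range(n))
              for k in range(len(r) - n + 1)]
    return scores.index(max(scores))

# Toy example: a 4-cycle burst echoed back after a 25-sample delay.
burst = [math.sin(2 * math.pi * i / 10) for i in range(40)]
received = [0.0] * 25 + burst + [0.0] * 20
```

Distance would then follow as lag / sample_rate * c / 2 for sound speed c. The correlation length (here, the template length n) trades computation against robustness at low signal-to-noise ratios, which is the parameter study the abstract describes.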

  7. Defining, Developing, and Measuring "Proclivities for Teaching Mathematics"

    Science.gov (United States)

    Lewis, Jennifer M.; Fischman, Davida; Riggs, Matt

    2015-01-01

    This article presents a form of teacher reasoning that we call "proclivities for teaching mathematics." We define proclivities for teaching mathematics as the beliefs, knowledge, and dispositions that are actionable in the flow of instruction, and we argue that growth in this area contributes to positive change in mathematics…

  8. Defining and Measuring Job Vacancies in a Dynamic Perspective

    NARCIS (Netherlands)

    P.A. Donker van Heel (Peter)

    2015-01-01

What is the best definition for job vacancies, what is the best method to measure job vacancies, and what further research is needed to gain a better insight into job vacancies in a dynamic perspective?

  9. Evaluating measurement of dynamic constructs: defining a measurement model of derivatives.

    Science.gov (United States)

    Estabrook, Ryne

    2015-03-01

While measurement evaluation has been embraced as an important step in psychological research, evaluating measurement structures with longitudinal data is fraught with limitations. This article defines and tests a measurement model of derivatives (MMOD), which is designed to assess the measurement structure of latent constructs both for analyses of between-person differences and for the analysis of change. Simulation results indicate that MMOD outperforms existing models for multivariate analysis and provides equivalent fit to data generation models. Additional simulations show MMOD capable of detecting differences in between-person and within-person factor structures. Model features, applications, and future directions are discussed.

  10. Defining the research agenda to measure and reduce tuberculosis stigmas.

    Science.gov (United States)

    Macintyre, K; Bakker, M I; Bergson, S; Bhavaraju, R; Bond, V; Chikovore, J; Colvin, C; Craig, G M; Cremers, A L; Daftary, A; Engel, N; France, N Ferris; Jaramillo, E; Kimerling, M; Kipp, A; Krishnaratne, S; Mergenthaler, C; Ngicho, M; Redwood, L; Rood, E J J; Sommerland, N; Stangl, A; van Rie, A; van Brakel, W; Wouters, E; Zwerling, A; Mitchell, E M H

    2017-11-01

TB stigma, a crucial barrier to finding and treating the 4 million tuberculosis (TB) patients currently missed by national TB programmes, is receiving well-deserved and long-delayed attention at the global level. However, the ability to measure and evaluate the success of TB stigma-reduction efforts is limited by the need for additional tools. At a 2016 TB stigma-measurement meeting held in The Hague, The Netherlands, stigma experts discussed and proposed a research agenda around four themes: 1) drivers: what are the main drivers and domains of TB stigma(s)?; 2) consequences: how consequential are TB stigmas and how are negative impacts most felt?; 3) burden: what is the global prevalence and distribution of TB stigma(s) and what explains any variation?; 4) intervention: what can be done to reduce the extent and impact of TB stigma(s)? Each theme was further subdivided into research topics to be addressed to move the agenda forward. These include greater clarity on what causes TB stigmas to emerge and thrive, the difficulty of measuring the complexity of stigma, and the improbability of a universal stigma 'cure'. Nevertheless, these challenges should not hinder investments in the measurement and reduction of TB stigma. We believe it is time to focus on how, and not whether, the global community should measure and reduce TB stigma.

  11. The Challenges of Defining and Measuring Student Engagement in Science

    Science.gov (United States)

    Sinatra, Gale M.; Heddy, Benjamin C.; Lombardi, Doug

    2015-01-01

    Engagement is one of the hottest research topics in the field of educational psychology. Research shows that multifarious benefits occur when students are engaged in their own learning, including increased motivation and achievement. However, there is little agreement on a concrete definition and effective measurement of engagement. This special…

  12. Spasticity, an impairment that is poorly defined and poorly measured

    NARCIS (Netherlands)

    Malhotra, S.; Malhotra, S.; Pandyan, A.D.; Day, C.R.; Jones, Valerie M.; Hermens, Hermanus J.

Objective: To explore, following a literature review, whether there is a consistent definition and a unified assessment framework for the term 'spasticity'. The congruence between the definitions of spasticity and the corresponding methods of measurement was also explored. Data sources: The search

  13. The measurement of water scarcity: Defining a meaningful indicator.

    Science.gov (United States)

    Damkjaer, Simon; Taylor, Richard

    2017-09-01

Metrics of water scarcity and stress have evolved over the last three decades from simple threshold indicators to holistic measures characterising human environments and freshwater sustainability. Metrics commonly estimate renewable freshwater resources using mean annual river runoff, which masks hydrological variability, and subjectively quantify the socio-economic conditions characterising adaptive capacity. There is a marked absence of research evaluating whether these metrics of water scarcity are meaningful. We argue that the measurement of water scarcity should (1) be redefined physically in terms of the freshwater storage required to address imbalances in intra- and inter-annual fluxes of freshwater supply and demand; (2) abandon subjective quantifications of human environments; and (3) be used to inform participatory decision-making processes that explore a wide range of options for addressing freshwater storage requirements beyond dams, including the use of renewable groundwater, soil water, and trading in virtual water. Further, we outline a conceptual framework redefining water scarcity in terms of freshwater storage.

  14. Defining Tsunami Magnitude as Measure of Potential Impact

    Science.gov (United States)

    Titov, V. V.; Tang, L.

    2016-12-01

The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from available measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of estimating tsunami magnitude, which requires collecting a vast amount of coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations, and the uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, earthquake magnitude has been used as a proxy for tsunami impact, since real-time seismic data are available for real-time processing and ample seismic data are available for elaborate post-event analysis. This proxy carries significant uncertainties in quantitative tsunami impact estimates, since the relation between earthquake energy and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful in tsunami warning as a quick estimate of tsunami impact and in post-event analysis as a universal scale for tsunami inter-comparison. We present a method for estimating tsunami magnitude based on tsunami energy and apply the magnitude analysis to several historical events for inter-comparison with existing methods.

  15. History and background of quality measurement.

    Science.gov (United States)

    Chun, Jonathan; Bafford, Andrea Chao

    2014-03-01

    Health care quality measurement has become increasingly emphasized, as providers and administrators respond to public and government demands for improved patient care. This article will review the evolution of surgical quality measurement and improvement from its infancy in the 1850s to the vast efforts being undertaken today.

  16. Defining SOD1 ALS natural history to guide therapeutic clinical trial design.

    Science.gov (United States)

    Bali, Taha; Self, Wade; Liu, Jingxia; Siddique, Teepu; Wang, Leo H; Bird, Thomas D; Ratti, Elena; Atassi, Nazem; Boylan, Kevin B; Glass, Jonathan D; Maragakis, Nicholas J; Caress, James B; McCluskey, Leo F; Appel, Stanley H; Wymer, James P; Gibson, Summer; Zinman, Lorne; Mozaffar, Tahseen; Callaghan, Brian; McVey, April L; Jockel-Balsarotti, Jennifer; Allred, Peggy; Fisher, Elena R; Lopate, Glenn; Pestronk, Alan; Cudkowicz, Merit E; Miller, Timothy M

    2017-02-01

Understanding the natural history of familial amyotrophic lateral sclerosis (ALS) caused by SOD1 mutations (SOD1-ALS) will provide key information for optimising clinical trials in this patient population. Our objective was to establish an updated natural history of SOD1-ALS. This retrospective cohort study from 15 medical centres in North America evaluated records from 175 patients with ALS with genetically confirmed SOD1 mutations, cared for after the year 2000. Age of onset, survival, ALS Functional Rating Scale (ALS-FRS) scores, and respiratory function were analysed. Patients with the A4V (Ala-Val) SOD1 mutation (SOD1-A4V), the largest mutation population in North America and one with an aggressive disease progression, were distinguished from other SOD1 mutation patients (SOD1-non-A4V) for analysis. Mean age of disease onset was 49.7±12.3 years (mean±SD) for all SOD1 patients, with no statistically significant difference between SOD1-A4V and SOD1-non-A4V (p=0.72, Kruskal-Wallis). Median survival for all SOD1 patients was 2.7 years. Mean disease duration was 4.6±6.0 years for all SOD1 patients and 1.4±0.7 years for SOD1-A4V. SOD1-A4V survival probability (median survival 1.2 years) was significantly decreased compared with SOD1-non-A4V (median survival 6.8 years; p<0.0001, log-rank). A statistically significant increase in ALS-FRS decline in SOD1-A4V compared with SOD1-non-A4V participants (p=0.02) was observed, as well as a statistically significant increase in forced vital capacity decline in SOD1-A4V compared with SOD1-non-A4V (p=0.02). SOD1-A4V is an aggressive, but relatively homogeneous, form of ALS. These SOD1-specific ALS natural history data will be important for the design and implementation of clinical trials in the SOD1-ALS patient population.

  17. Measuring metamorphic history of unequilibrated ordinary chondrites

    International Nuclear Information System (INIS)

    Sears, D.W.; Grossman, J.N.; Melcher, C.L.; Ross, L.M.; Mills, A.A.

    1980-01-01

    A thermoluminescence sensitivity technique is used to give a new measurement of the degree of metamorphism of unequilibrated ordinary chondrites. Consequently the petrological assignment of these meteorites is modified. (author)

  18. Environmental drivers defining linkages among life-history traits: mechanistic insights from a semiterrestrial amphipod subjected to macroscale gradients.

    Science.gov (United States)

    Gómez, Julio; Barboza, Francisco R; Defeo, Omar

    2013-10-01

Determining the existence of interconnected responses among life-history traits and identifying the underlying environmental drivers are recognized as key goals for understanding the basis of phenotypic variability. We studied potentially interconnected responses among senescence, fecundity, embryo size, weight of brooding females, size at maturity, and sex ratio in a semiterrestrial amphipod affected by macroscale gradients in beach morphodynamics and salinity. To this end, multiple modelling processes based on generalized additive mixed models were used to deal with the spatio-temporal structure of the data obtained at 10 beaches during 22 months. Salinity was the only nexus among life-history traits, suggesting that this physiological stressor influences the energy balance of organisms. Different salinity scenarios determined shifts in the weight of brooding females and size at maturity, with consequences for the number and size of embryos, which in turn affected sex determination and sex ratio at the population level. Our work highlights the importance of analysing field data to find the variables and potential mechanisms that define concerted responses among traits, thereby defining life-history strategies.

  19. Understanding unmet need: history, theory, and measurement.

    Science.gov (United States)

    Bradley, Sarah E K; Casterline, John B

    2014-06-01

During the past two decades, estimates of unmet need have become an influential measure for assessing population policies and programs. This article recounts the evolution of the concept of unmet need, describes how demographic survey data have been used to generate estimates of its prevalence, and tests the sensitivity of these estimates to various assumptions in the unmet need algorithm. The algorithm uses a complex set of assumptions to identify women who are sexually active, who are infecund, whose most recent pregnancy was unwanted, who wish to postpone their next birth, and who are postpartum amenorrheic. The sensitivity tests suggest that defensible alternative criteria for identifying four of these five subgroups of women would increase the estimated prevalence of unmet need. The exception is identification of married women who are sexually active; more accurate measurement of this subgroup would reduce the estimated prevalence of unmet need in most settings.
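The unmet-need algorithm combines such subgroup criteria into a single classification. As a heavily simplified, hypothetical sketch (the real algorithm applies many more criteria, including pregnancy wantedness and postpartum amenorrhea, and survey-specific definitions of each flag):

```python
# Hypothetical simplification of an unmet-need style classification.
# The boolean flags below stand in for the algorithm's actual
# survey-derived criteria.

def has_unmet_need(woman: dict) -> bool:
    """True if the woman is exposed to pregnancy risk, wants to avoid or
    delay a birth, and is not using contraception."""
    exposed = woman["sexually_active"] and not woman["infecund"]
    wants_to_avoid = woman["wants_no_more"] or woman["wants_to_postpone"]
    return exposed and wants_to_avoid and not woman["using_contraception"]

example = {
    "sexually_active": True,
    "infecund": False,
    "wants_no_more": False,
    "wants_to_postpone": True,
    "using_contraception": False,
}
```

The sensitivity tests described above amount to varying how each of these flags is defined (e.g., the recency window for "sexually active") and observing how the estimated prevalence shifts.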

  20. Shell Measuring Machine. History and Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Birchler, Wilbur D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fresquez, Philip R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2000-06-01

Commercialization of the Ring Rotacon Shell Measuring Machine project is a CRADA (No. LA98C10358) between the University of California (Los Alamos National Laboratory) and Moore Tool Company, Bridgeport, CT. Actual work on this CRADA started in December of 1998. Several meetings were held with the interested parties (Los Alamos, Oak Ridge, Moore Tool, and the University of North Carolina). The outcome of these meetings was that the original Ring Rotacon did not measure up to the requirements of the Department of Energy and private industry, and a new configuration was investigated. This new configuration, the Shell Measuring Machine (SMM), much better fits the needs of all parties. The work accomplished on the Shell Measuring Machine in FY 99 includes the following: specifications for size and weight were developed; performance error budgets were established; designs were developed; analyses were performed (stiffness and natural frequency); existing part designs were compared to the working SMM volume; peer reviews were conducted; controller requirements were studied; fixture requirements were evaluated; and machine motions were analyzed. The consensus of the Peer Review Committee was that the new configuration has the potential to satisfy the shell inspection needs of the Department of Energy as well as several commercial customers. They recommended that more analyses be performed on error budgets, structural stiffness, natural frequency, and thermal effects, and that operational processes be developed. Several design issues need to be addressed: the type of bearings used to support the tables (air bearings or mechanical roller-type bearings), the selection of the probes, the design of the probe sliding mechanisms, and the design of the upper table positioning mechanism. Each item has several possible solutions, and more work is required to obtain the best design. This report includes the background and technical objectives; minutes of the working

  1. Cuticular Drusen: Clinical Phenotypes and Natural History Defined Using Multimodal Imaging.

    Science.gov (United States)

    Balaratnasingam, Chandrakumar; Cherepanoff, Svetlana; Dolz-Marco, Rosa; Killingsworth, Murray; Chen, Fred K; Mendis, Randev; Mrejen, Sarah; Too, Lay Khoon; Gal-Or, Orly; Curcio, Christine A; Freund, K Bailey; Yannuzzi, Lawrence A

    2018-01-01

    To define the range and life cycles of cuticular drusen phenotypes using multimodal imaging and to review the histologic characteristics of cuticular drusen. Retrospective, observational cohort study and experimental laboratory study. Two hundred forty eyes of 120 clinic patients with a cuticular drusen phenotype and 4 human donor eyes with cuticular drusen (n = 2), soft drusen (n = 1), and hard drusen (n = 1). We performed a retrospective review of clinical and multimodal imaging data of patients with a cuticular drusen phenotype. Patients had undergone imaging with various combinations of color photography, fluorescein angiography, indocyanine green angiography, near-infrared reflectance, fundus autofluorescence, high-resolution OCT, and ultrawide-field imaging. Human donor eyes underwent processing for high-resolution light and electron microscopy. Appearance of cuticular drusen in multimodal imaging and the topography of a cuticular drusen distribution; age-dependent variations in cuticular drusen phenotypes, including the occurrence of retinal pigment epithelium (RPE) abnormalities, choroidal neovascularization, acquired vitelliform lesions (AVLs), and geographic atrophy (GA); and ultrastructural and staining characteristics of druse subtypes. The mean age of patients at the first visit was 57.9±13.4 years. Drusen and RPE changes were seen in the peripheral retina, anterior to the vortex veins, in 21.8% of eyes. Of eyes with more than 5 years of follow-up, cuticular drusen disappeared from view in 58.3% of eyes, drusen coalescence was seen in 70.8% of eyes, and new RPE pigmentary changes developed in 56.2% of eyes. Retinal pigment epithelium abnormalities, AVLs, neovascularization, and GA occurred at a frequency of 47.5%, 24.2%, 12.5%, and 25%, respectively, and were significantly more common in patients older than 60 years of age (all P < 0.015). Occurrence of GA and neovascularization were important determinants of final visual acuity in eyes with the

  2. Pressure History Measurement in a Microwave Beaming Thruster

    International Nuclear Information System (INIS)

    Oda, Yasuhisa; Ushio, Masato; Komurasaki, Kimiya; Takahashi, Koji; Kasugai, Atsushi; Sakamoto, Keishi

    2006-01-01

In a microwave beaming thruster with a 1-dimensional nozzle, plasma and a shock wave propagate in the nozzle, absorbing microwave power. In this study, pressure histories in the thruster are measured using pressure gauges. The measured pressure history at the thruster wall shows constant pressure during plasma propagation in the nozzle. Measurement of the propagation velocities of the shock wave and the plasma shows that both propagate at the same velocity. These results show that the pulse-detonation-engine analogy is a successful model of thrust production for the 1D thruster.

  3. The role of fecundity and reproductive effort in defining life-history strategies of North American freshwater mussels.

    Science.gov (United States)

    Haag, Wendell R

    2013-08-01

Selection is expected to optimize reproductive investment, resulting in characteristic trade-offs among traits such as brood size, offspring size, somatic maintenance, and lifespan; relative patterns of energy allocation to these functions are important in defining life-history strategies. Freshwater mussels are a diverse and imperiled component of aquatic ecosystems, but little is known about their life-history strategies, particularly patterns of fecundity and reproductive effort. Because mussels have an unusual life cycle in which larvae (glochidia) are obligate parasites on fishes, differences in host relationships are expected to influence patterns of reproductive output among species. I investigated fecundity and reproductive effort (RE) and their relationships to other life-history traits for a taxonomically broad cross section of North American mussel diversity. Annual fecundity of North American mussel species spans nearly four orders of magnitude, ranging among species up to more than 200,000. Estimates of RE also were highly variable, ranging among species from 0.06 to 25.4%. Median fecundity and RE differed among phylogenetic groups, but patterns for these two traits differed in several ways. For example, the tribe Anodontini had relatively low median fecundity but the highest RE of any group. Within and among species, body size was a strong predictor of fecundity and explained a high percentage of variation in fecundity among species. Fecundity showed little relationship to other life-history traits, including glochidial size, lifespan, brooding strategies, or host strategies. The only apparent trade-off evident among these traits was the extraordinarily high fecundity of Leptodea, Margaritifera, and Truncilla, which may come at a cost of greatly reduced glochidial size; there was no relationship between fecundity and glochidial size for the remaining 61 species in the dataset. In contrast to fecundity, RE showed evidence of a strong trade-off with lifespan, which was

  4. Forgotten marriages? Measuring the reliability of marriage histories

    Science.gov (United States)

    Chae, Sophia

    2016-01-01

BACKGROUND Marriage histories are a valuable data source for investigating nuptiality. While researchers typically acknowledge the problems associated with their use, it is unknown to what extent these problems occur and how marriage analyses are affected. OBJECTIVE This paper seeks to investigate the quality of marriage histories by measuring levels of misreporting, examining the characteristics associated with misreporting, and assessing whether misreporting biases marriage indicators. METHODS Using data from the Malawi Longitudinal Study of Families and Health (MLSFH), I compare marriage histories reported by the same respondents at two different points in time. I investigate whether respondents consistently report their spouses (by name), status of marriage, and dates of marriage. I use multivariate regression models to investigate the characteristics associated with misreporting. Finally, I examine whether misreporting marriages and marriage dates affects marriage indicators. RESULTS Results indicate that 28.3% of men and 17.9% of women omitted at least one marriage in one of the survey waves. Multivariate regression models show that misreporting is not random: marriage, individual, interviewer, and survey characteristics are associated with marriage omission and marriage date inconsistencies. Misreporting also affects marriage indicators. CONCLUSIONS This is the first study of its kind to examine the reliability of marriage histories collected in the context of Sub-Saharan Africa. Although marriage histories are frequently used to study marriage dynamics, until now no knowledge has existed on the degree of misreporting. Misreporting in marriage histories is shown to be non-negligible and could potentially affect analyses. PMID:27152090

  5. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
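    The abstract names the three indices only by purpose, not by formula. As a hedged sketch, common statistical stand-ins (Pearson correlation for shape conformity, mean error for bias, root-mean-square error for magnitude of deviation) could be computed like this; the function and data below are illustrative, not the paper's:

```python
import math

def match_quality(simulated, historical):
    """Score a history match with three summary statistics:
    shape conformity (Pearson r), bias (mean error), and magnitude
    of deviation (RMSE). Generic stand-ins for the paper's indices."""
    n = len(simulated)
    mean_s = sum(simulated) / n
    mean_h = sum(historical) / n
    cov = sum((s - mean_s) * (h - mean_h) for s, h in zip(simulated, historical))
    var_s = sum((s - mean_s) ** 2 for s in simulated)
    var_h = sum((h - mean_h) ** 2 for h in historical)
    shape = cov / math.sqrt(var_s * var_h)   # 1.0 = identical curve shape
    bias = mean_s - mean_h                   # 0.0 = no systematic offset
    rmse = math.sqrt(sum((s - h) ** 2 for s, h in zip(simulated, historical)) / n)
    return shape, bias, rmse

# Simulated run vs. historical production rates (made-up values):
hist = [100.0, 95.0, 88.0, 80.0, 71.0]
run = [102.0, 97.0, 90.0, 82.0, 73.0]   # right shape, constant +2 offset
shape, bias, rmse = match_quality(run, hist)
```

    A run with perfect shape conformity can still carry a bias, which is exactly why a single goodness-of-fit number cannot separate a "good" match from a "better" one.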

  6. Laboratory Measured Behavioral Impulsivity Relates to Suicide Attempt History

    Science.gov (United States)

    Dougherty, Donald M.; Mathias, Charles W.; Marsh, Dawn M.; Papageorgiou, T. Dorina; Swann, Alan C.; Moeller, F. Gerard

    2004-01-01

    The purpose of this study was to examine the relationship between laboratory-measured behavioral impulsivity (using the Immediate and Delayed Memory Tasks) and suicide attempt histories. Three groups of adults were recruited, those with either: no previous suicide attempts (Control, n = 20), only a single suicide attempt (Single, n = 20), or…

  7. Measurement Development in Reflective Supervision: History, Methods, and Next Steps

    Science.gov (United States)

    Tomlin, Angela M.; Heller, Sherryl Scott

    2016-01-01

    This issue of the "ZERO TO THREE" journal provides a snapshot of the current state of measurement of reflective supervision within the infant-family field. In this article, the authors introduce the issue by providing a brief history of the development of reflective supervision in the field of infant mental health, with a specific focus…

  8. Are self-report measures able to define individuals as physically active or inactive?

    NARCIS (Netherlands)

    Steene-Johannessen, J.; Anderssen, S.A.; Ploeg, H.P. van der; Hendriksen, I.J.M.; Donnelly, A.E.; Brage, S.; Ekelund, U.

    2016-01-01

    Purpose: Assess the agreement between commonly used self-report methods compared with objectively measured physical activity (PA) in defining the prevalence of individuals compliant with PA recommendations. Methods: Time spent in moderate and vigorous PA (MVPA) was measured at two time points in

  9. History and measurement of the base and derived units

    CERN Document Server

    Treese, Steven A

    2018-01-01

    This book discusses how and why historical measurement units developed, and reviews useful methods for making conversions as well as situations in which dimensional analysis can be used. It starts from the history of length measurement, which is one of the oldest measures used by humans. It highlights the importance of area measurement, briefly discussing the methods for determining areas mathematically and by measurement. The book continues on to detail the development of measures for volume, mass, weight, time, temperature, angle, electrical units, amounts of substances, and light intensity. The seven SI/metric base units are highlighted, as well as a number of other units that have historically been used as base units. Providing a comprehensive reference for interconversion among the commonly measured quantities in the different measurement systems with engineering accuracy, it also examines the relationships among base units in fields such as mechanical/thermal, electromagnetic and physical flow rates and...

  10. Guidelines for defining and documenting data on costs of possible environmental protection measures

    Energy Technology Data Exchange (ETDEWEB)

    Marlowe, I.; King, K.; Boyd, R.; Bouscaren, R.; Pacyna, J. [AEA Technology Environment, Harwell (United Kingdom)

    1999-07-01

    The Guidelines are intended to promote good practice in the documenting and use of data on the costs of possible environmental protection measures in the context of international data comparisons. The minimum information needed to describe the cost of an environmental protection measure is: details of pollution source; details of the environmental protection measure and its performance characteristics; how costs are defined; the year to which data apply; indications of data uncertainty; how pollutants are defined; and reference to data sources. Guidelines are given for these seven items. These are followed by descriptions of various methods of data processing - dealing with information; calculating annual costs; discount/interest rates; and additional issues relating to the implementation of cost data. 16 refs., 5 tabs., 6 apps.
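    The annual-cost and discount-rate step can be illustrated with the standard capital-recovery formula. This is a generic textbook sketch, not the Guidelines' prescribed procedure, and all figures are invented:

```python
def equivalent_annual_cost(capital, annual_operating, rate, lifetime_years):
    """Annualise the capital cost of an environmental protection measure
    with the capital recovery factor r(1+r)^n / ((1+r)^n - 1), then add
    the yearly operating cost."""
    growth = (1 + rate) ** lifetime_years
    crf = rate * growth / (growth - 1)
    return capital * crf + annual_operating

# Hypothetical abatement measure: 1,000,000 capital, 50,000/yr to operate,
# 4% discount rate, 15-year lifetime (all figures invented).
cost = equivalent_annual_cost(1_000_000, 50_000, 0.04, 15)
```

    Reporting the discount rate and lifetime alongside such a figure is precisely the kind of documentation the Guidelines call for, since the annualised cost is meaningless for international comparison without them.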

  11. Toward defining and measuring social accountability in graduate medical education: a stakeholder study.

    Science.gov (United States)

    Reddy, Anjani T; Lazreg, Sonia A; Phillips, Robert L; Bazemore, Andrew W; Lucan, Sean C

    2013-09-01

    Since 1965, Medicare has publicly financed graduate medical education (GME) in the United States. Given public financing, various advisory groups have argued that GME should be more socially accountable. Several efforts are underway to develop accountability measures for GME that could be tied to Medicare payments, but it is not clear how to measure or even define social accountability. We explored how GME stakeholders perceive, define, and measure social accountability. Through purposive and snowball sampling, we completed semistructured interviews with 18 GME stakeholders from GME training sites, government agencies, and health care organizations. We analyzed interview field notes and audiorecordings using a flexible, iterative, qualitative group process to identify themes. Three themes emerged in regard to defining social accountability: (1) creating a diverse physician workforce to address regional needs and primary care and specialty shortages; (2) ensuring quality in training and care to best serve patients; and (3) providing service to surrounding communities and the general public. All but 1 stakeholder believed GME institutions have a responsibility to be socially accountable. Reported barriers to achieving social accountability included training time constraints, financial limitations, and institutional resistance. Suggestions for measuring social accountability included reviewing graduates' specialties and practice locations, evaluating curricular content, and reviewing program services to surrounding communities. Most stakeholders endorsed the concept of social accountability in GME, suggesting definitions and possible measures that could inform policy makers' calls for increased accountability despite recognized barriers.

  12. History and status of atomic mass measurement and evaluation

    International Nuclear Information System (INIS)

    Huang Wenxue; Zhu Zhichao; Wang Meng; Wang Yue; Tian Yulin; Xu Hushan; Xiao Guoqing

    2010-01-01

    Mass is one of the most fundamental properties that can be measured for an atomic nucleus. High-accuracy mass values for atoms let us study the atomic and nuclear binding energies that represent the sum of all the atomic and nucleonic interactions. The history of nuclear mass measurement is almost as old as that of nuclear physics itself. The experimental methods and their results are so varied that evaluation is needed to check the consistency among the various results and to obtain more reliable data. Atomic mass evaluation is a careful and complicated process. This paper briefly introduces the history and status of atomic mass measurement and evaluation. (authors)

  13. A discussion of techniques used in defining the Interactive Measurement Evaluation and Control System at Rocky Flats

    International Nuclear Information System (INIS)

    Greer, B.K.; Hunt, V.; Schweitzer, M.F.

    1983-01-01

    This paper describes both the general methodology used to study the current needs for a measurement control and evaluation system at the Rocky Flats Plant and the recommendations for implementation in the Interactive Measurement Evaluation and Control System (IMECS). The study resulted in a clear assessment of the current system and recommendations for the system which will be its replacement. To arrive at the recommendations, the authors used a formal analysis approach based on an in-depth study of the measurement evaluation and control problems and user needs. The problems and needs were defined by interviews with present and potential users of this kind of system throughout the nuclear industry. Some of the recommendations are to provide: timely sample measurement feedback; representative measurement error estimates; and a history database of sample measurements. To meet the user needs, the new system will: be interactive with user selection menus; use standards which cover the range of application; and facilitate historical analysis of sample data and bookkeeping. The implementation of this program is projected to be more cost effective than the current program. Also included are the authors' recommendations to those involved in the design of a system of similarly large magnitude.

  14. Defining and Measuring Cognitive-Entropy and Cognitive Self-Synchronization

    Science.gov (United States)

    2011-06-01

    16th ICCRTS: “Collective C2 in Multinational Civil-Military Operations” Defining and Measuring Cognitive-Entropy and Cognitive Self-Synchronization ...shared awareness and enabling self-synchronization across the range of participating entities (Alberts and Hayes 2009, pp.106). We consider the...aspect of self-synchronization (Alberts and Hayes, 2006) a key one in the context of modern operations and in performing C2 assessments. Based on (Manso

  15. DNA Methylation and Somatic Mutations Converge on the Cell Cycle and Define Similar Evolutionary Histories in Brain Tumors

    NARCIS (Netherlands)

    T. Mazor (Tali); A. Pankov (Aleksandr); B.E. Johnson (Brett E.); C. Hong (Chibo); E.G. Hamilton (Emily G.); R.J.A. Bell (Robert J.A.); I.V. Smirnov (Ivan V.); G.F. Reis (Gerald F.); J.J. Phillips (Joanna J.); M.J. Barnes (Michael); A. Idbaih (Ahmed); A. Alentorn (Agusti); J.J. Kloezeman (Jenneke); M.L.M. Lamfers (Martine); A.W. Bollen (Andrew W.); B.S. Taylor (Barry S.); A.M. Molinaro (Annette M.); A. Olshen (Adam); S.M. Chang (Susan); J.S. Song (Jun S.); J.F. Costello (Joseph F.)

    2015-01-01

    textabstractThe evolutionary history of tumor cell populations can be reconstructed from patterns of genetic alterations. In contrast to stable genetic events, epigenetic states are reversible and sensitive to the microenvironment, prompting the question whether epigenetic information can similarly

  16. System Energy Assessment (SEA), Defining a Standard Measure of EROI for Energy Businesses as Whole Systems

    Directory of Open Access Journals (Sweden)

    Jay Zarnikau

    2011-10-01

    A more objective method for measuring the energy needs of businesses, System Energy Assessment (SEA), measures the combined impacts of material supply chains and service supply chains, to assess businesses as whole self-managing net-energy systems. The method is demonstrated using a model Wind Farm, and defines a physical measure of its energy productivity for society (EROI-S), a ratio of total energy delivered to total energy expended. Energy use records for technology and proxy measures for clearly understood but not individually recorded energy uses for services are combined for a whole system estimate of consumption required for production. Current methods count only energy needs for technology. Business services outsource their own energy needs to operate, leaving no traceable record. That uncounted business energy demand is often 80% of the total, an amount of “dark energy” hidden from view, discovered by finding the average energy estimated needs for businesses far below the world average energy consumed per dollar of GDP. Presently, for lack of information, the energy needs of business services are counted to be “0”. Our default assumption is to treat them as “average”. The result is a hard measure of total business demand for energy services, a “Scope 4” energy use or GHG impact assessment. Counting recorded energy uses and discounting unrecorded ones misrepresents labor-intensive work as highly energy efficient. The result confirms a similar finding by Hall et al. in 1981 [1]. We use exhaustive search for what a business needs to operate as a whole, tracing internal business relationships rather than energy data, to locate its natural physical boundary as a working unit, and so define a business as a physical rather than statistical subject of scientific study. See also online resource materials and notes [2].

  17. Outcome Measurement in Nursing: Imperatives, Ideals, History, and Challenges

    Science.gov (United States)

    Jones, Terry L

    2016-05-31

    Nurses have a social responsibility to evaluate the effect of nursing practice on patient outcomes in the areas of health promotion; injury and illness prevention; and alleviation of suffering. Quality assessment initiatives are hindered by the paucity of available data related to nursing processes and patient outcomes across these three domains of practice. Direct care nurses are integral to self-regulation for the discipline as they are the best source of information about nursing practice and patient outcomes. Evidence supports the assumption that nurses do contribute to prevention of adverse events but there is insufficient evidence to explain how nurses contribute to these and/or other patient outcomes. The purposes of this article are to examine the imperatives, ideal conditions, history, and challenges related to effective outcome measurement in nursing. The article concludes with recommendations for action to move quality assessment forward, such as substantial investment to support adequate documentation of nursing practice and patient outcomes.

  18. Defining natural history: assessment of the ability of college students to aid in characterizing clinical progression of Niemann-Pick disease, type C.

    Directory of Open Access Journals (Sweden)

    Jenny Shin

    Niemann-Pick Disease, type C (NPC) is a fatal, neurodegenerative, lysosomal storage disorder. It is a rare disease with a broad phenotypic spectrum and variable age of onset. These issues make it difficult to develop a universally accepted clinical outcome measure to assess urgently needed therapies. To this end, clinical investigators have defined emerging disease severity scales. The average time from initial symptom to diagnosis is approximately 4 years. Further, some patients may not travel to specialized clinical centers even after diagnosis. We were therefore interested in investigating whether appropriately trained, community-based assessment of patient records could assist in defining disease progression using clinical severity scores. In this study we evolved a secure, stepwise process to show that pre-existing medical records may be correctly assessed by non-clinical practitioners trained to quantify disease progression. Sixty-four undergraduate students at the University of Notre Dame were expertly trained in clinical disease assessment and recognition of major and minor symptoms of NPC. Seven clinical records, randomly selected from a total of thirty-seven used to establish a leading clinical severity scale, were correctly assessed to show expected characteristics of linear disease progression. Student assessment of two new records donated by NPC families to our study also revealed linear progression of disease, but both showed accelerated disease progression relative to the current severity scale, especially at the later stages. Together, these data suggest that college students may be trained in assessment of patient records, and thus provide insight into the natural history of a disease.

  19. Measuring replication competent HIV-1: advances and challenges in defining the latent reservoir.

    Science.gov (United States)

    Wang, Zheng; Simonetti, Francesco R; Siliciano, Robert F; Laird, Gregory M

    2018-02-13

    Antiretroviral therapy cannot cure HIV-1 infection due to the persistence of a small number of latently infected cells harboring replication-competent proviruses. Measuring persistent HIV-1 is challenging, as it consists of a mosaic population of defective and intact proviruses that can shift from a state of latency to active HIV-1 transcription. Due to this complexity, most of the current assays detect multiple categories of persistent HIV-1, leading to an overestimate of the true size of the latent reservoir. Here, we review the development of the viral outgrowth assay, the gold-standard quantification of replication-competent proviruses, and discuss the insights provided by full-length HIV-1 genome sequencing methods, which allowed us to unravel the composition of the proviral landscape. In this review, we provide a dissection of what defines HIV-1 persistence and we examine the unmet needs to measure the efficacy of interventions aimed at eliminating the HIV-1 reservoir.

  20. Measuring the temperature history of isochorically heated warm dense metals

    Science.gov (United States)

    McGuffey, Chris; Kim, J.; Park, J.; Moody, J.; Emig, J.; Heeter, B.; Dozieres, M.; Beg, Fn; McLean, Hs

    2017-10-01

    A pump-probe platform has been designed for soft X-ray absorption spectroscopy near edge structure measurements in isochorically heated Al or Cu samples with temperatures of 10s to 100s of eV. The method is compatible with dual picosecond-class laser systems and may be used to measure the temperature of the sample heated directly by the pump laser or by a laser-driven proton beam. Knowledge of the temperature history of warm dense samples will aid equation of state measurements. First, various low- to mid-Z targets were evaluated for their suitability as continuum X-ray backlighters over the range 200-1800 eV using a 10 J picosecond-class laser with relativistic peak intensity. Alloys were found to be more suitable than single-element backlighters. Second, the heated sample package was designed with consideration of target thickness and tamp layers using atomic physics codes. The results of the first demonstration attempts will be presented. This work was supported by the U.S. DOE under Contract No. DE-SC0014600.

  1. Defining Neighbourhoods as a Measure of Exposure to the Food Environment

    Directory of Open Access Journals (Sweden)

    Anders K. Lyseen

    2015-07-01

    Neighbourhoods are frequently used as a measure for individuals’ exposure to the food environment. However, the definitions of neighbourhoods fluctuate and have not been applied consistently in previous studies. Neighbourhoods defined from a single fixed location fail to capture people’s complete exposure in multiple locations, but measuring behaviour using traditional methods can be challenging. This study compares the traditional methods of measuring exposure to the food environment to methods that use data from GPS tracking. For each of the 187 participants, 11 different neighbourhoods were created in which the exposure to supermarkets and fast food outlets was measured. ANOVA, Tukey’s Honestly Significant Difference (HSD) test and t-tests were performed to compare the neighbourhoods. Significant differences were found between area sizes and the exposure to supermarkets and fast food outlets for different neighbourhood types. Second, significant differences in exposure to food outlets were found between the urban and rural neighbourhoods. Neighbourhoods are clearly a diffused and blurred concept that varies in meaning depending on each person’s perception and the conducted study. Complexity and heterogeneity of human mobility no longer appear to correspond to the use of residential neighbourhoods but rather emphasise the need for methods, concepts and measures of individual activity and exposure.

  2. Using blood cytokine measures to define high inflammatory biotype of schizophrenia and schizoaffective disorder.

    Science.gov (United States)

    Boerrigter, Danny; Weickert, Thomas W; Lenroot, Rhoshel; O'Donnell, Maryanne; Galletly, Cherrie; Liu, Dennis; Burgess, Martin; Cadiz, Roxanne; Jacomb, Isabella; Catts, Vibeke S; Fillman, Stu G; Weickert, Cynthia Shannon

    2017-09-18

    Increases in pro-inflammatory cytokines are found in the brain and blood of people with schizophrenia. However, increased cytokines are not evident in all people with schizophrenia, but are found in a subset. The cytokine changes that best define this subset, termed the "elevated inflammatory biotype", are still being identified. Using quantitative RT-PCR, we measured five cytokine mRNAs (IL-1β, IL-2, IL-6, IL-8 and IL-18) from peripheral blood of healthy controls and of people with schizophrenia or schizoaffective disorder (n = 165). We used a cluster analysis of the transcript levels to define those with low and those with elevated levels of cytokine expression. From the same cohort, eight cytokine proteins (IL-1β, IL-2, IL-6, IL-8, IL-10, IL-12, IFNγ and TNFα) were measured in serum and plasma using a Luminex Magpix-based assay. We compared peripheral mRNA and protein levels across diagnostic groups and between those with low and elevated levels of cytokine expression according to our transcription-based cluster analysis. We found an overall decrease in the anti-inflammatory IL-2 mRNA (p = 0.006) and an increase in three serum cytokines, IL-6 (p = 0.010), IL-8 (p = 0.024) and TNFα (p schizophrenia compared to healthy controls. A greater percentage of people with schizophrenia (48%) were categorised into the elevated inflammatory biotype compared to healthy controls (33%). The magnitude of increase in IL-1β, IL-6, IL-8 and IL-10 mRNAs in people in the elevated inflammation biotype ranged from 100 to 220% of those in the non-elevated inflammatory biotype and was comparable between control and schizophrenia groups. Blood cytokine protein levels did not correlate with cytokine mRNA levels, and plasma levels of only two cytokines distinguished the elevated and low inflammatory biotypes, with IL-1β significantly increased in the elevated cytokine control group and IL-8 significantly increased in the elevated cytokine schizophrenia group. Our results
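    The abstract says only that "a cluster analysis of the transcript levels" defined the low and elevated biotypes. As a hedged illustration of that kind of split, not the study's actual method, a two-centroid 1-D k-means over an invented composite cytokine score might look like:

```python
def two_means(scores, iterations=50):
    """Split 1-D composite cytokine scores into 'low' and 'elevated'
    groups with a basic two-centroid k-means. Illustrative only: the
    study's clustering variables and algorithm are not detailed in
    the abstract."""
    lo, hi = min(scores), max(scores)
    for _ in range(iterations):
        low = [s for s in scores if abs(s - lo) <= abs(s - hi)]
        high = [s for s in scores if abs(s - lo) > abs(s - hi)]
        lo, hi = sum(low) / len(low), sum(high) / len(high)
    return low, high

# Hypothetical composite z-scores of cytokine mRNA levels (invented data):
scores = [-0.9, -0.6, -0.4, -0.2, 0.1, 1.2, 1.5, 1.9, 2.3]
low, high = two_means(scores)
```

    The biotype label then follows from group membership, and the fraction of each diagnostic group falling in the "elevated" cluster can be compared, as the study does with its 48% vs. 33% figures.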

  3. Defining Primary Care Shortage Areas: Do GIS-based Measures Yield Different Results?

    Science.gov (United States)

    Daly, Michael R; Mellor, Jennifer M; Millones, Marco

    2018-02-12

    To examine whether geographic information systems (GIS)-based physician-to-population ratios (PPRs) yield determinations of geographic primary care shortage areas that differ from those based on bounded-area PPRs like those used in the Health Professional Shortage Area (HPSA) designation process. We used geocoded data on primary care physician (PCP) locations and census block population counts from 1 US state to construct 2 shortage area indicators. The first is a bounded-area shortage indicator defined without GIS methods; the second is a GIS-based measure that measures the populations' spatial proximity to PCP locations. We examined agreement and disagreement between bounded shortage areas and GIS-based shortage areas. Bounded shortage area indicators and GIS-based shortage area indicators agree for the census blocks where the vast majority of our study populations reside. Specifically, 95% and 98% of the populations in our full and urban samples, respectively, reside in census blocks where the 2 indicators agree. Although agreement is generally high in rural areas (ie, 87% of the rural population reside in census blocks where the 2 indicators agree), agreement is significantly lower compared to urban areas. One source of disagreement suggests that bounded-area measures may "overlook" some shortages in rural areas; however, other aspects of the HPSA designation process likely mitigate this concern. Another source of disagreement arises from the border-crossing problem, and it is more prevalent. The GIS-based PPRs we employed would yield shortage area determinations that are similar to those based on bounded-area PPRs defined for Primary Care Service Areas. Disagreement rates were lower than previous studies have found. © 2018 National Rural Health Association.
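    As a rough sketch of the distinction the study draws (coordinates, populations, and radius are invented, and the authors' actual GIS procedure is more involved), a proximity-based PPR counts physicians within some radius of each census block rather than within an administrative boundary:

```python
import math

def gis_ppr(block, blocks, pops, physicians, radius):
    """Physician-to-population ratio for one census block, counting
    physicians and population within `radius` of the block centroid.
    A toy, planar-distance simplification of a GIS-based
    spatial-proximity measure."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    docs = sum(1 for p in physicians if dist(block, p) <= radius)
    pop = sum(n for b, n in zip(blocks, pops) if dist(block, b) <= radius)
    return docs / pop if pop else 0.0

# A physician just across an administrative boundary is still counted
# for nearby blocks by the proximity measure (the border-crossing issue):
blocks = [(0.0, 0.0), (1.0, 0.0), (9.0, 9.0)]
pops = [3000, 2000, 1000]
physicians = [(1.2, 0.1)]
ppr_near = gis_ppr(blocks[0], blocks, pops, physicians, radius=2.0)
ppr_far = gis_ppr(blocks[2], blocks, pops, physicians, radius=2.0)
```

    A bounded-area PPR computed for a county containing only the first two blocks could miss that physician entirely if the point (1.2, 0.1) fell outside the boundary, which is one source of disagreement the study describes.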

  4. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    Science.gov (United States)

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC, an acronym for Define, Measure, Analyze, Improve, and Control, was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) a prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  5. Are Self-report Measures Able to Define Individuals as Physically Active or Inactive?

    Science.gov (United States)

    Steene-Johannessen, Jostein; Anderssen, Sigmund A; van der Ploeg, Hidde P; Hendriksen, Ingrid J M; Donnelly, Alan E; Brage, Søren; Ekelund, Ulf

    2016-02-01

    Assess the agreement between commonly used self-report methods compared with objectively measured physical activity (PA) in defining the prevalence of individuals compliant with PA recommendations. Time spent in moderate and vigorous PA (MVPA) was measured at two time points in 1713 healthy individuals from nine European countries using individually calibrated combined heart rate and movement sensing. Participants also completed the Recent Physical Activity Questionnaire (RPAQ), short form of the International Physical Activity Questionnaire (IPAQ), and short European Prospective Investigation into Cancer and Nutrition Physical Activity Questionnaire (EPIC-PAQ). Individuals were categorized as active (e.g., reporting ≥150 min of MVPA per week) or inactive, based on the information derived from the different measures. Sensitivity and specificity analyses and Kappa statistics were performed to evaluate the ability of the three PA questionnaires to correctly categorize individuals as active or inactive. Prevalence estimates of being sufficiently active varied significantly (P for all PAQ 39.9% [95% CI, 37.5-42.1] and objective measure 48.5% [95% CI, 41.6-50.9]. All self-report methods showed low or moderate sensitivity (IPAQ 20.0%, RPAQ 18.7%, and EPIC-PAQ 69.8%) to correctly classify inactive people and the agreement between objective and self-reported PA was low (κ = 0.07 [95% CI, 0.02-0.12], 0.12 [95% CI, 0.06-0.18], and 0.19 [95% CI, 0.13-0.24] for IPAQ, RPAQ, and EPIC-PAQ, respectively). The modest agreement between self-reported and objectively measured PA suggests that population levels of PA derived from self-report should be interpreted cautiously. Implementation of objective measures in large-scale cohort studies and surveillance systems is recommended.
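    The agreement statistics used here can all be reproduced from a 2x2 classification table. A minimal sketch with invented counts (not the study's data), treating the objective measure as the reference standard:

```python
def agreement(pairs):
    """Sensitivity, specificity, and Cohen's kappa for paired
    active/inactive classifications. Each pair is (objective,
    self_report) booleans, where True = classified as active."""
    n = len(pairs)
    tp = sum(1 for o, s in pairs if o and s)
    tn = sum(1 for o, s in pairs if not o and not s)
    fp = sum(1 for o, s in pairs if not o and s)
    fn = sum(1 for o, s in pairs if o and not s)
    sensitivity = tp / (tp + fn)   # objectively active, self-reported active
    specificity = tn / (tn + fp)   # objectively inactive, self-reported inactive
    po = (tp + tn) / n                                              # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2   # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, kappa

# Invented 2x2 counts for 100 participants:
pairs = ([(True, True)] * 30 + [(True, False)] * 20 +
         [(False, True)] * 25 + [(False, False)] * 25)
sens, spec, kappa = agreement(pairs)
```

    Kappa corrects raw percent agreement for the agreement expected by chance, which is why it can be near zero even when the two measures agree on half the sample, as in the study's reported values of 0.07-0.19.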

  6. Modified T-history method for measuring thermophysical properties of phase change materials at high temperature

    Science.gov (United States)

    Omaraa, Ehsan; Saman, Wasim; Bruno, Frank; Liu, Ming

    2017-06-01

    Latent heat storage using phase change materials (PCMs) can be used to store large amounts of energy within a narrow temperature difference during phase transition. The thermophysical properties of PCMs, such as latent heat, specific heat and melting and solidification temperature, need to be determined with high precision for the design and cost estimation of latent heat storage systems. The existing laboratory standard methods, such as differential thermal analysis (DTA) and differential scanning calorimetry (DSC), use a small sample size (1-10 mg) to measure thermophysical properties, which makes these methods suitable only for homogeneous materials. In addition, this small amount of sample has different thermophysical properties when compared with the bulk sample and may have limitations for evaluating the properties of mixtures. To avoid the drawbacks of existing methods, the temperature-history (T-history) method can be used with bulk quantities of PCM salt mixtures to characterize PCMs. This paper presents a modified T-history setup, which was designed and built at the University of South Australia to measure the melting point, heat of fusion, specific heat, degree of supercooling and phase separation of salt mixtures for a temperature range between 200 °C and 400 °C. Sodium nitrate (NaNO3) was used to verify the accuracy of the new setup.
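    The T-history calculation itself is not spelled out in the abstract. A heavily simplified sketch of the underlying energy balance (identical tubes assumed, tube heat capacity neglected, all numbers invented, not the authors' data) is:

```python
def latent_heat(area_pcm, area_ref, m_ref, c_ref, dT_ref, m_pcm):
    """T-history estimate of latent heat (J/kg), heavily simplified.
    The reference cooling segment (known sensible heat m_ref*c_ref*dT_ref
    over integrated temperature-excess area area_ref, in K*s) calibrates
    the overall heat loss coefficient hA; the temperature-excess area
    under the PCM solidification plateau then gives the released heat."""
    hA = m_ref * c_ref * dT_ref / area_ref   # W/K, from the reference run
    return hA * area_pcm / m_pcm             # J/kg

# Invented plateau and reference areas sized roughly for a NaNO3-scale
# result (published latent heat of NaNO3 is on the order of 1.8e5 J/kg):
H = latent_heat(area_pcm=9.0e4, area_ref=3.0e4,
                m_ref=0.05, c_ref=4186.0, dT_ref=20.0, m_pcm=0.07)
```

    Because both sample and reference cool under the same conditions, the unknown heat transfer coefficient cancels out of the ratio, which is what lets the method work with bulk samples instead of milligram quantities.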

  7. Measuring the Star Formation History Of Omega Centauri

    Science.gov (United States)

    Weisz, Daniel

    2011-10-01

    We propose to apply the technique of color-magnitude diagram (CMD) fitting to archival HST/ACS and WFC3 imaging of Omega Centauri in order to measure its star formation history (SFH). As the remnant of a captured satellite galaxy, the SFH of Omega Cen will provide key insights into its formation and evolution before and after its incorporation into the Milky Way. The derivation of SFHs from CMD analysis has been well established in the Local Group and nearby galaxies, but has never been applied within our Galaxy. Archival HST imaging of Omega Cen provides for exquisitely deep CMDs with broad wavelength coverage (near-UV through I-band), which allows for clear separation of age-sensitive CMD features, and can be leveraged to highly constrain its star formation rate as a function of time. In addition, the CMD fitting technique also allows us to test for consistency in recovered SFHs using different stellar models, and quantitatively tie the UV characteristics of ancient stellar populations to a SFH.

  8. Defining and Measuring Safety Climate: A Review of the Construction Industry Literature.

    Science.gov (United States)

    Schwatka, Natalie V; Hecker, Steven; Goldenhar, Linda M

    2016-06-01

    Safety climate measurements can be used to proactively assess an organization's effectiveness in identifying and remediating work-related hazards, thereby reducing or preventing work-related ill health and injury. This review article focuses on construction-specific articles that developed and/or measured safety climate, assessed safety climate's relationship with other safety and health performance indicators, and/or used safety climate measures to evaluate interventions targeting one or more indicators of safety climate. Fifty-six articles met our inclusion criteria, 80% of which were published after 2008. Our findings demonstrate that researchers commonly defined safety climate as perception-based, but the object of those perceptions varies widely. Within the wide range of indicators used to measure safety climate, safety policies, procedures, and practices were the most common, followed by general management commitment to safety. The most frequently used indicators should and do reflect that the prevention of work-related ill health and injury depends on both organizational and employee actions. Safety climate scores were commonly compared between groups (e.g. management and workers, different trades), and often correlated with subjective measures of safety behavior rather than measures of ill health or objective safety and health outcomes. Despite the observed limitations of current research, safety climate holds promise as a useful focus of research and practice activities to prevent work-related ill health and injury. Safety climate survey data can reveal gaps between management and employee perceptions, or between espoused and enacted policies, and trigger communication and action to narrow those gaps. The validation of safety climate with safety and health performance data offers the potential for using safety climate measures as a leading indicator of performance. We discuss these findings in relation to the related concept of safety culture and

  9. Stratospheric ozone measurements at Arosa (Switzerland): history and scientific relevance

    Science.gov (United States)

    Staehelin, Johannes; Viatte, Pierre; Stübi, Rene; Tummon, Fiona; Peter, Thomas

    2018-05-01

    to the unique length of the observational record. This paper presents the evolution of the ozone layer, the history of international ozone research, and discusses the justification for the measurements in the past, present and into future.

  10. Precautionary measures to prevent damage, as defined in the Atomic Energy Law

    International Nuclear Information System (INIS)

    Marburger, P.

    1983-01-01

    The requirement to take every 'precaution which is necessary in the light of existing scientific knowledge and technology to prevent damage' (section 7, sub-section (2), no. 3 Atomic Energy Act) is not restricted to conventional (preventive) measures but is to be understood as a duty to actively provide for appropriate protection from conceivable damage. Below the level of legally binding laws and regulations, there is the level of scientific-technical codes and standards, which are of great significance to the licensing procedure under atomic energy law. As these codes and standards do not form part of the law but nevertheless represent the essence of the scientific knowledge needed to fulfill the duty defined by the law, they gain full effect only through the licensing procedure, being transformed there into concrete legal requirements. Hence one can say that the legal situation in atomic energy law relating to the licensing requirements laid down in section 7, sub-section (2), no. 3 is presently characterised by a regulatory deficit. This regulatory deficit cannot be overcome by the means and tools offered by the current law. One possibility to fill the gap is to give a legally binding status to the safety guides defined by the deterministic safety concept, by listing the conceivable accidents to be mastered. This recommendable procedure could lead to an ordinance on the safety of nuclear installations. Such an ordinance could be kept abreast of technical progress and scientific knowledge by creating a referring legal instrument, pointing to, e.g., the KTA Safety Guide. (orig./HSCH) [de

  11. Developing a questionnaire for measuring epistemological beliefs in history education

    NARCIS (Netherlands)

    Stoel, Gerhard; Logtenberg, Albert; Wansink, Bjorn; Huijgen, Timothy

    2016-01-01

    Developing pupils’ understanding of history with its own disciplinary and epistemological problems can contribute to the education of a critical and peaceful diverse society. This symposium discusses results of four studies from the Netherlands, Germany and the USA addressing theoretical,

  12. History and current safety measures at Laguna Palcacocha, Huaraz, Peru

    Science.gov (United States)

    Salazar Checa, César; Cochachin, Alejo; Frey, Holger; Huggel, Christian; Portocarrero, César

    2017-04-01

    Laguna Palcacocha is a large glacier lake in the Cordillera Blanca, Peru, located in the Quillcay catchment above the city of Huaraz, the local capital. On 13 December 1941, the moraine dam of the lake collapsed, probably after having been impacted by a large ice avalanche, and triggered a major outburst flood. This GLOF destroyed about a third of the city of Huaraz and caused about 2,000 casualties, making it one of the deadliest glacier lake outburst floods known in history. In 1974, the Glaciology Unit of Peru, responsible for the study, monitoring, and mitigation of glacier hazards, installed a reinforcement of the natural moraine dam of the newly filled Laguna Palcacocha, with an artificial drainage channel 7 m below the crest of the reinforced dam. At that time, the lake had an area of 66,800 m² and a volume of 0.5 × 10⁶ m³. During the past decades, in the course of continued glacier retreat, Laguna Palcacocha has undergone extreme growth. In February 2016, the lake had an area of 514,000 m² (7.7 times the area of 1974) and a volume of more than 17 × 10⁶ m³ (more than 34 times the volume of 1974). At the same time, the city of Huaraz, located 20 km downstream of the lake, grew significantly after its almost complete destruction by the 1970 earthquake. Today, about 120,000 people live in the city. Due to the persisting possibility of large ice avalanches directly above the Palcacocha lake, this constitutes a high-risk situation, requiring new hazard and risk mitigation measures. As an immediate temporary measure, to bridge the time until a more permanent measure is realized, a syphoning system was installed in 2011, using about ten 700-m pipes with a 10-inch (25.4 cm) diameter. The aim of this syphoning is to lower the lake level by about 7 m, thereby reducing the lake volume and increasing the dam freeboard. However, the system is less effective than assumed; currently the lake level

  13. Measuring solar reflectance - Part I: Defining a metric that accurately predicts solar heat gain

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

    2010-09-15

    Solar reflectance can vary with the spectral and angular distributions of incident sunlight, which in turn depend on surface orientation, solar position and atmospheric conditions. A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective "cool colored" surface because this irradiance contains a greater fraction of near-infrared light than typically found in ordinary (unconcentrated) global sunlight. At mainland US latitudes, this metric R_E891BN can underestimate the annual peak solar heat gain of a typical roof or pavement (slope ≤ 5:12 [23°]) by as much as 89 W m⁻², and underestimate its peak surface temperature by up to 5 K. Using R_E891BN to characterize roofs in a building energy simulation can exaggerate the economic value N of annual cool roof net energy savings by as much as 23%. We define clear sky air mass one global horizontal ("AM1GH") solar reflectance R_g,0, a simple and easily measured property that more accurately predicts solar heat gain. R_g,0 predicts the annual peak solar heat gain of a roof or pavement to within 2 W m⁻², and overestimates N by no more than 3%. R_g,0 is well suited to rating the solar reflectances of roofs, pavements and walls. We show in Part II that R_g,0 can be easily and accurately measured with a pyranometer, a solar spectrophotometer or version 6 of the Solar Spectrum Reflectometer. (author)
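The size of the reported bias can be illustrated from the relation Q = (1 − R)·I for an opaque surface under peak irradiance I. The irradiance and the two reflectance values below are assumptions chosen for illustration, not the paper's measured data:

```python
# Illustrative only: peak solar heat gain of an opaque surface scales as
# Q = (1 - R) * I, so a reflectance metric biased high understates heat gain.
I_peak = 1000.0     # W/m^2, assumed peak global horizontal irradiance

# Hypothetical reflectances for a spectrally selective "cool colored" surface:
R_e891bn = 0.45     # beam-normal metric (biased high for such surfaces)
R_am1gh = 0.36      # AM1GH-style metric (closer to real-sky conditions)

Q_e891bn = (1 - R_e891bn) * I_peak   # heat gain predicted by the biased metric
Q_am1gh = (1 - R_am1gh) * I_peak     # heat gain predicted by the AM1GH metric
print(f"heat gain underestimate: {Q_am1gh - Q_e891bn:.0f} W/m^2")
```

A reflectance overstatement of about 0.09 at ~1000 W/m² peak irradiance thus corresponds to the ~89 W m⁻² underestimate quoted above.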

  14. Measuring solar reflectance Part I: Defining a metric that accurately predicts solar heat gain

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    Solar reflectance can vary with the spectral and angular distributions of incident sunlight, which in turn depend on surface orientation, solar position and atmospheric conditions. A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective "cool colored" surface because this irradiance contains a greater fraction of near-infrared light than typically found in ordinary (unconcentrated) global sunlight. At mainland U.S. latitudes, this metric R_E891BN can underestimate the annual peak solar heat gain of a typical roof or pavement (slope ≤ 5:12 [23°]) by as much as 89 W m⁻², and underestimate its peak surface temperature by up to 5 K. Using R_E891BN to characterize roofs in a building energy simulation can exaggerate the economic value N of annual cool-roof net energy savings by as much as 23%. We define clear-sky air mass one global horizontal ("AM1GH") solar reflectance R_g,0, a simple and easily measured property that more accurately predicts solar heat gain. R_g,0 predicts the annual peak solar heat gain of a roof or pavement to within 2 W m⁻², and overestimates N by no more than 3%. R_g,0 is well suited to rating the solar reflectances of roofs, pavements and walls. We show in Part II that R_g,0 can be easily and accurately measured with a pyranometer, a solar spectrophotometer or version 6 of the Solar Spectrum Reflectometer.

  15. Consensus statement on defining and measuring negative effects of Internet interventions

    Directory of Open Access Journals (Sweden)

    Alexander Rozental

    2014-03-01

    Full Text Available Internet interventions have great potential for alleviating emotional distress, promoting mental health, and enhancing well-being. Numerous clinical trials have demonstrated their efficacy for a number of psychiatric conditions, and interventions delivered via the Internet will likely become a common alternative to face-to-face treatment. Meanwhile, research has paid little attention to the negative effects associated with treatment, warranting further investigation of the possibility that some patients might deteriorate or encounter adverse events despite receiving best available care. Evidence from research on face-to-face treatment suggests that negative effects afflict 5–10% of all patients undergoing treatment in terms of deterioration. However, there is currently a lack of consensus on how to define and measure negative effects in psychotherapy research in general, leaving researchers without practical guidelines for monitoring and reporting negative effects in clinical trials. The current paper therefore seeks to provide recommendations that could promote the study of negative effects in Internet interventions, with the aim of increasing knowledge of their occurrence and characteristics. Ten leading experts in the field of Internet interventions were invited to participate and share their perspectives on how to explore negative effects, using the Delphi technique to facilitate a dialog and reach an agreement. The authors discuss the importance of conducting research on negative effects in order to further the understanding of their incidence and different features. Suggestions on how to classify and measure negative effects in Internet interventions are proposed, involving methods from both quantitative and qualitative research. Potential mechanisms underlying negative effects are also discussed, differentiating common factors shared with face-to-face treatments from those unique to treatments delivered via the Internet. The authors

  16. Perimetric measurements with flicker-defined form stimulation in comparison with conventional perimetry and retinal nerve fiber measurements.

    Science.gov (United States)

    Horn, Folkert K; Tornow, Ralf P; Jünemann, Anselm G; Laemmer, Robert; Kremers, Jan

    2014-04-11

    We compared the results of flicker-defined form (FDF) perimetry with standard automated perimetry (SAP) and retinal nerve fiber layer (RNFL) thickness measurements using spectral domain optical coherence tomography (OCT). A total of 64 healthy subjects, 45 ocular hypertensive patients, and 97 "early" open-angle glaucoma (OAG) patients participated in this study. Definition of glaucoma was based exclusively on glaucomatous optic disc appearance. All subjects underwent FDF perimetry, SAP, and peripapillary measurements of the RNFL thickness. The FDF perimetry and SAP were performed at identical test locations (G1 protocol). Exclusion criteria were subjects younger than 34 years, SAP mean defect (SAP MD) > 5 dB, eye diseases other than glaucoma, or nonreliable FDF measurements. The correlations between the perimetric data on one hand and RNFL thicknesses on the other hand were analyzed statistically. The age-corrected sensitivity values and the local results from the controls were used to determine FDF mean defect (FDF MD). The FDF perimetry and SAP showed high concordance in this cohort of experienced patients (MD values, R = -0.69, P < 0.001). Of a total of 42 OAG patients with abnormal SAP MD, 38 also displayed abnormal FDF MD. However, FDF MD was abnormal in 28 of 55 OAG patients with normal SAP MD. The FDF MD was significantly (R = -0.61, P < 0.001) correlated with RNFL thickness with a (nonsignificantly) larger correlation coefficient than conventional SAP MD (R = -0.48, P < 0.001). The FDF perimetry is able to uncover functional changes concurrent with the changes in RNFL thickness. The FDF perimetry may be an efficient functional test to detect early glaucomatous nerve atrophy. (ClinicalTrials.gov number, NCT00494923.).

  17. Defining and Measuring the Affordability of New Medicines: A Systematic Review.

    Science.gov (United States)

    Antoñanzas, Fernando; Terkola, Robert; Overton, Paul M; Shalet, Natalie; Postma, Maarten

    2017-08-01

    In many healthcare systems, affordability concerns can lead to restrictions on the use of expensive efficacious therapies. However, there does not appear to be any consensus as to the terminology used to describe affordability, or the thresholds used to determine whether new drugs are affordable. The aim of this systematic review was to investigate how affordability is defined and measured in healthcare. MEDLINE, EMBASE and EconLit databases (2005-July 2016) were searched using terms covering affordability and budget impact, combined with definitions, thresholds and restrictions, to identify articles describing a definition of affordability with respect to new medicines. Additional definitions were identified through citation searching, and through manual searches of European health technology assessment body websites. In total, 27 definitions were included in the review. Of these, five definitions described affordability in terms of the value of a product; seven considered affordability within the context of healthcare system budgets; and 15 addressed whether products are affordable in a given country based on economic factors. However, there was little in the literature to indicate that the price of medicines is considered alongside both their value to individual patients and their budget impact at a population level. Current methods of assessing affordability in healthcare may be limited by their focus on budget impact. A more effective approach may involve a broader perspective than is currently described in the literature, to consider the long-term benefits of a therapy and cost savings elsewhere in the healthcare system, as well as cooperation between healthcare payers and the pharmaceutical industry to develop financing models that support sustainability as well as innovation.

  18. Monodisperse measurement of the biotin-streptavidin interaction strength in a well-defined pulling geometry.

    Directory of Open Access Journals (Sweden)

    Steffen M Sedlak

    Full Text Available The widely used interaction of the homotetramer streptavidin with the small molecule biotin has been intensively studied by force spectroscopy and has become a model system for receptor-ligand interaction. However, streptavidin's tetravalency results in diverse force propagation pathways through the different binding interfaces. This multiplicity gives rise to polydisperse force spectroscopy data. Here, we present an engineered monovalent streptavidin tetramer with a single cysteine in its functional subunit that allows for site-specific immobilization of the molecule, orthogonal to biotin binding. Functionality of streptavidin and its binding properties for biotin remain unaffected. We thus created a stable and reliable molecular anchor with a unique high-affinity binding site for biotinylated molecules or nanoparticles, which we expect to be useful for many single-molecule applications. To characterize the mechanical properties of the bond between biotin and our monovalent streptavidin, we performed force spectroscopy experiments using an atomic force microscope. We were able to conduct measurements at the single-molecule level with 1:1 stoichiometry and a well-defined geometry, in which force exclusively propagates through a single subunit of the streptavidin tetramer. For different force loading rates, we obtained narrow distributions of bond rupture forces ranging from 200 pN at 1,500 pN/s to 230 pN at 110,000 pN/s. The data are in very good agreement with the standard Bell-Evans model with a single potential barrier at Δx0 = 0.38 nm and a zero-force off-rate k_off,0 in the 10⁻⁶ s⁻¹ range.

  19. Monodisperse measurement of the biotin-streptavidin interaction strength in a well-defined pulling geometry.

    Science.gov (United States)

    Sedlak, Steffen M; Bauer, Magnus S; Kluger, Carleen; Schendel, Leonard C; Milles, Lukas F; Pippig, Diana A; Gaub, Hermann E

    2017-01-01

    The widely used interaction of the homotetramer streptavidin with the small molecule biotin has been intensively studied by force spectroscopy and has become a model system for receptor-ligand interaction. However, streptavidin's tetravalency results in diverse force propagation pathways through the different binding interfaces. This multiplicity gives rise to polydisperse force spectroscopy data. Here, we present an engineered monovalent streptavidin tetramer with a single cysteine in its functional subunit that allows for site-specific immobilization of the molecule, orthogonal to biotin binding. Functionality of streptavidin and its binding properties for biotin remain unaffected. We thus created a stable and reliable molecular anchor with a unique high-affinity binding site for biotinylated molecules or nanoparticles, which we expect to be useful for many single-molecule applications. To characterize the mechanical properties of the bond between biotin and our monovalent streptavidin, we performed force spectroscopy experiments using an atomic force microscope. We were able to conduct measurements at the single-molecule level with 1:1 stoichiometry and a well-defined geometry, in which force exclusively propagates through a single subunit of the streptavidin tetramer. For different force loading rates, we obtained narrow distributions of bond rupture forces ranging from 200 pN at 1,500 pN/s to 230 pN at 110,000 pN/s. The data are in very good agreement with the standard Bell-Evans model with a single potential barrier at Δx0 = 0.38 nm and a zero-force off-rate k_off,0 in the 10⁻⁶ s⁻¹ range.
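As a rough check of these numbers, the standard Bell-Evans model gives a most probable rupture force F* = (kBT/Δx)·ln(r·Δx/(k_off,0·kBT)) at loading rate r. The sketch below uses the barrier distance Δx0 = 0.38 nm quoted above but assumes k_off,0 = 10⁻⁶ s⁻¹ exactly (the abstract gives only the order of magnitude), so the printed forces are indicative, not the study's fitted values:

```python
import math

kBT = 4.11     # thermal energy at ~298 K, pN*nm
dx = 0.38      # potential barrier distance from the abstract, nm
koff0 = 1e-6   # zero-force off-rate, s^-1 (order of magnitude only; assumed exact here)

def rupture_force(rate):
    """Most probable rupture force (pN) in the standard Bell-Evans model
    for a force loading rate given in pN/s."""
    return (kBT / dx) * math.log(rate * dx / (koff0 * kBT))

for r in (1500.0, 110000.0):
    print(f"{r:>9.0f} pN/s -> {rupture_force(r):.0f} pN")
```

With these assumptions the model reproduces the ~200 pN scale at 1,500 pN/s; exact agreement at both loading rates would require the study's actual fitted parameters.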

  20. Maslach Burnout Inventory and a Self-Defined, Single-Item Burnout Measure Produce Different Clinician and Staff Burnout Estimates.

    Science.gov (United States)

    Knox, Margae; Willard-Grace, Rachel; Huang, Beatrice; Grumbach, Kevin

    2018-06-04

    Clinicians and healthcare staff report high levels of burnout. Two common burnout assessments are the Maslach Burnout Inventory (MBI) and a single-item, self-defined burnout measure. Relatively little is known about how the measures compare. To identify the sensitivity, specificity, and concurrent validity of the self-defined burnout measure compared to the more established MBI measure. Cross-sectional survey (November 2016-January 2017). Four hundred forty-four primary care clinicians and 606 staff from three San Francisco area healthcare systems. The MBI measure, calculated from a high score on either the emotional exhaustion or cynicism subscale, and a single-item measure of self-defined burnout. Concurrent validity was assessed using a validated, 7-item team culture scale as reported by Willard-Grace et al. (J Am Board Fam Med 27(2):229-38, 2014) and a standard question about workplace atmosphere as reported by Rassolian et al. (JAMA Intern Med 177(7):1036-8, 2017) and Linzer et al. (Ann Intern Med 151(1):28-36, 2009). Similar to other nationally representative burnout estimates, 52% of clinicians (95% CI: 47-57%) and 46% of staff (95% CI: 42-50%) reported high MBI emotional exhaustion or high MBI cynicism. In contrast, 29% of clinicians (95% CI: 25-33%) and 31% of staff (95% CI: 28-35%) reported "definitely burning out" or more severe symptoms on the self-defined burnout measure. The self-defined measure's sensitivity to correctly identify MBI-assessed burnout was 50.4% for clinicians and 58.6% for staff; specificity was 94.7% for clinicians and 92.3% for staff. Area under the receiver operator curve was 0.82 for clinicians and 0.81 for staff. Team culture and atmosphere were significantly associated with both self-defined burnout and the MBI, confirming concurrent validity. Point estimates of burnout notably differ between the self-defined and MBI measures. 
Compared to the MBI, the self-defined burnout measure misses half of high-burnout clinicians and more
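Sensitivity and specificity of a screening item against a reference standard come straight from a 2×2 table. The counts below are hypothetical, chosen only to be roughly consistent with the clinician percentages reported above (52% MBI-positive, 50.4% sensitivity, ~95% specificity), and are not the study's actual data:

```python
def screen_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity of a screening measure (e.g. a single-item
    burnout question) against a reference standard (e.g. the MBI)."""
    sensitivity = tp / (tp + fn)   # fraction of reference-positives caught
    specificity = tn / (tn + fp)   # fraction of reference-negatives cleared
    return sensitivity, specificity

# Hypothetical 2x2 counts for 444 clinicians: 230 MBI-positive (~52%), of whom
# the single-item measure flags 116; 214 MBI-negative, of whom it clears 203.
sens, spec = screen_metrics(tp=116, fn=114, tn=203, fp=11)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

This makes concrete why the two instruments yield such different point estimates: a highly specific but only moderately sensitive item flags far fewer people than the reference measure.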

  1. Associations of anatomical measures from MRI with radiographically defined knee osteoarthritis score, pain, and physical functioning.

    Science.gov (United States)

    Sowers, Maryfran; Karvonen-Gutierrez, Carrie A; Jacobson, Jon A; Jiang, Yebin; Yosef, Matheos

    2011-02-02

    The prevalence of knee osteoarthritis is traditionally based on radiographic findings, but magnetic resonance imaging is now being used to provide better visualization of bone, cartilage, and soft tissues as well as the patellar compartment. The goal of this study was to estimate the prevalences of knee features defined on magnetic resonance imaging in a population and to relate these abnormalities to knee osteoarthritis severity scores based on radiographic findings, physical functioning, and reported knee pain in middle-aged women. Magnetic resonance images of the knee were evaluated for the location and severity of cartilage defects, bone marrow lesions, osteophytes, subchondral cysts, meniscal and/or ligamentous tears, effusion, and synovitis among 363 middle-aged women (724 knees) from the Michigan Study of Women's Health Across the Nation. These findings were related to Kellgren-Lawrence osteoarthritis severity scores from radiographs, self-reported knee pain, self-reported knee injury, perception of physical functioning, and physical performance measures to assess mobility. Radiographs, physical performance assessment, and interviews were undertaken at the 1996 study baseline and again (with the addition of magnetic resonance imaging assessment) at the follow-up visit during 2007 to 2008. The prevalence of moderate-to-severe knee osteoarthritis changed from 3.7% at the baseline assessment to 26.7% at the follow-up visit eleven years later. Full-thickness cartilage defects of the medial, lateral, and patellofemoral compartments were present in 14.5% (105 knees), 4.6% (thirty-three knees), and 26.2% (190 knees), respectively. Synovitis was identified in 24.7% (179) of the knees, and joint effusions were observed in 70% (507 knees); 21.7% (157) of the knees had complex or macerated meniscal tears. 
Large osteophytes, marked synovitis, macerated meniscal tears, and full-thickness tibial cartilage defects were associated with increased odds of knee pain and with

  2. Defining, Designing for, and Measuring "Social Constructivist Digital Literacy" Development in Learners: A Proposed Framework

    Science.gov (United States)

    Reynolds, Rebecca

    2016-01-01

    This paper offers a newly conceptualized modular framework for digital literacy that defines this concept as a task-driven "social constructivist digital literacy," comprising 6 practice domains grounded in Constructionism and social constructivism: Create, Manage, Publish, Socialize, Research, Surf. The framework articulates possible…

  3. Defining Allowable Physical Property Variations for High Accurate Measurements on Polymer Parts

    DEFF Research Database (Denmark)

    Mohammadi, Ali; Sonne, Mads Rostgaard; Madruga, Daniel González

    2015-01-01

    Measurement conditions and material properties have a significant impact on the measured dimensions of a part, especially for polymer parts. Temperature variation causes part deformations that increase the uncertainty of the measurement process. Current industrial tolerances of a few micrometres demand highly accurate measurements in non-controlled ambient conditions. Most polymer parts are manufactured by injection moulding, and their inspection is carried out after stabilization, around 200 hours. The overall goal of this work is to reach a measurement uncertainty of ±5 μm on polymer products which…

  4. Defining and measuring the mean residence time of lateral surface transient storage zones in small streams

    Science.gov (United States)

    T.R. Jackson; R. Haggerty; S.V. Apte; A. Coleman; K.J. Drost

    2012-01-01

    Surface transient storage (STS) has functional significance in stream ecosystems because it increases solute interaction with sediments. After volume, mean residence time is the most important metric of STS, but it is unclear how this can be measured accurately or related to other timescales and field-measureable parameters. We studied mean residence time of lateral...
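Mean residence time is commonly estimated as the first temporal moment of a residence-time distribution obtained from a tracer breakthrough curve. Below is a minimal sketch under assumed data (a synthetic exponential washout; the time series, interval, and true residence time are all hypothetical):

```python
import numpy as np

# Synthetic tracer concentration at the outlet of a storage zone
# (exponential washout; illustrative only, not field data).
dt = 1.0                                 # sampling interval, s
t = np.arange(0.0, 2000.0, dt)           # time since tracer release, s
tau_true = 300.0                         # assumed true mean residence time, s
C = np.exp(-t / tau_true)                # normalized concentration

# Mean residence time as the first temporal moment of the RTD:
#   tau = integral(t * C dt) / integral(C dt)
tau = (np.sum(t * C) * dt) / (np.sum(C) * dt)
print(f"estimated mean residence time: {tau:.0f} s")  # close to tau_true
```

Truncating the record at a finite time biases the moment slightly low, which is one reason field estimates of STS residence time are sensitive to how long the breakthrough tail is sampled.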

  5. Defining and measuring cyberbullying within the larger context of bullying victimization.

    Science.gov (United States)

    Ybarra, Michele L; Boyd, Danah; Korchmaros, Josephine D; Oppenheim, Jay Koby

    2012-07-01

    To inform the scientific debate about the measurement of bullying, including cyberbullying. Two split-form surveys were conducted online among 6-17-year-olds (n = 1,200 each) to inform recommendations for cyberbullying measurement. Measures that use the word "bully" result in prevalence rates similar to each other, irrespective of whether a definition is included, whereas measures not using the word "bully" are similar to each other, irrespective of whether a definition is included. A behavioral list of bullying experiences without either a definition or the word "bully" results in higher prevalence rates and likely measures experiences that are beyond the definition of "bullying." Follow-up questions querying differential power, repetition, and bullying over time were used to examine misclassification. The measure using a definition but not the word "bully" appeared to have the highest rate of false positives and, therefore, the highest rate of misclassification. Across two studies, an average of 25% reported being bullied at least monthly in person compared with an average of 10% bullied online, 7% via telephone (cell or landline), and 8% via text messaging. Measures of bullying among English-speaking individuals in the United States should include the word "bully" when possible. The definition may be a useful tool for researchers, but results suggest that it does not necessarily yield a more rigorous measure of bullying victimization. Directly measuring aspects of bullying (i.e., differential power, repetition, over time) reduces misclassification. To prevent double counting across domains, we suggest the following distinctions: mode (e.g., online, in-person), type (e.g., verbal, relational), and environment (e.g., school, home). We conceptualize cyberbullying as bullying communicated through the online mode. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  6. Premises and Limitations in Defining and Measuring Synergy from M&As

    Directory of Open Access Journals (Sweden)

    Aevoae George Marian

    2017-01-01

    Full Text Available Mergers and acquisitions are performed worldwide mainly because of synergy. Although many invoke the term synergy as the key motivation for engaging in M&As, research has led us to understand that it is not very clear what it actually is. In the scientific literature, synergy is mostly defined as being “2+2=5”. Thus, we first thought that it could only be a positive effect. But, later on, we found out that synergy is not only positive; it can be negative as well, known as negative synergy or dyssynergy. The purpose of this paper is to shed some light on what synergy is, how we can quantify and classify it, and why acquiring firms tend to pay more for the target firm. We believe that there is a link between the amount of premium paid for a target firm and the expectations for synergy.

  7. Defining and Measuring Decision-Making for the Management of Trauma Patients.

    Science.gov (United States)

    Madani, Amin; Gips, Amanda; Razek, Tarek; Deckelbaum, Dan L; Mulder, David S; Grushka, Jeremy R

    Effective management of trauma patients is heavily dependent on sound judgment and decision-making. Yet, current methods for training and assessing these advanced cognitive skills are subjective, lack standardization, and are prone to error. This qualitative study aims to define and characterize the cognitive and interpersonal competencies required to optimally manage injured patients. Cognitive and hierarchical task analyses for managing unstable trauma patients were performed using qualitative methods to map the thoughts, behaviors, and practices that characterize expert performance. Trauma team leaders and board-certified trauma surgeons participated in semistructured interviews that were transcribed verbatim. Data were supplemented with content from published literature and prospectively collected field notes from observations of the trauma team during trauma activations. The data were coded and analyzed using grounded theory by 2 independent reviewers. A framework was created based on 14 interviews with experts (lasting 1-2 hours each), 35 field observations (20 [57%] blunt; 15 [43%] penetrating; median Injury Severity Score 20 [13-25]), and 15 literary sources. Experts included 11 trauma surgeons and 3 emergency physicians from 7 Level 1 academic institutions in North America (median years in practice: 12 [8-17]). Twenty-nine competencies were identified, including 17 (59%) related to situation awareness, 6 (21%) involving decision-making, and 6 (21%) requiring interpersonal skills. Of 40 potential errors that were identified, root causes were mapped to errors in situation awareness (20 [50%]), decision-making (10 [25%]), or interpersonal skills (10 [25%]). This study defines cognitive and interpersonal competencies that are essential for the management of trauma patients. This framework may serve as the basis for novel curricula to train and assess decision-making skills, and to develop quality-control metrics to improve team and individual performance

  8. Defining, Measuring, and Incentivizing Sustainable Land Use to Meet Human Needs

    Science.gov (United States)

    Nicholas, K. A.; Brady, M. V.; Olin, S.; Ekroos, J.; Hall, M.; Seaquist, J. W.; Lehsten, V.; Smith, H.

    2016-12-01

    Land is a natural capital that supports the flow of an enormous amount of ecosystem services critical to human welfare. Sustainable land use, which we define as land use that meets both current and future human needs for ecosystem services, is essential to meet global goals for climate mitigation and sustainable development, while maintaining natural capital. However, it is not clear what governance is needed to achieve sustainable land use under multiple goals (as defined by the values of relevant decision-makers and land managers), particularly under climate change. Here we develop a conceptual model for examining the interactions and tradeoffs among multiple goals, as well as their spatial interactions (teleconnections), in research developed using Design Thinking principles. We have selected five metrics for provisioning (food production, and fiber production for wood and energy), regulating and maintenance (climate mitigation and biodiversity conservation), and cultural (heritage) ecosystem services. Using the case of Sweden, we estimate indicators for these metrics using a combination of existing data synthesis and process-based simulation modeling. We also develop and analyze new indicators (e.g., combining data on land use, bird conservation status, and habitat specificity to make a predictive model of bird diversity changes on agricultural or forested land). Our results highlight both expected tradeoffs (e.g., between food production and biodiversity conservation) as well as unexpected opportunities for synergies under different land management scenarios and strategies. Our model also provides a practical way to make decision-maker values explicit by comparing both quantity and preferences for bundles of ecosystem services under various scenarios. We hope our model will help in considering competing interests and shaping economic incentives and governance structures to meet national targets in support of global goals for sustainable management of land

  9. On successive measurements. A comparison between the orthodox view and the consistent history proposal

    International Nuclear Information System (INIS)

    Prvanovic, S.; Maric, Z.

    1999-01-01

    The consistent history approach in quantum mechanics is analysed from the viewpoint of the standard quantum theory of successive measurements. It is shown that in almost all respects the standard theory is superior. The problems that appear in the consistent history approach disappear when a subtle algorithm of the standard theory is applied

  10. The History of Rabies in Trinidad: Epidemiology and Control Measures

    Directory of Open Access Journals (Sweden)

    Janine F. R. Seetahal

    2017-07-01

    Full Text Available Vampire bat-transmitted rabies was first recognized in Trinidad during a major outbreak reported in 1925. Trinidad is the only Caribbean island with vampire bat-transmitted rabies. We conducted a literature review to describe the changing epidemiology of rabies in Trinidad and give a historical perspective to rabies prevention and control measures on the island. The last human case of rabies occurred in 1937 and although no case of canine-transmitted rabies has been reported since 1914, sporadic outbreaks of bat-transmitted rabies still occur in livestock to date. Over the last century, seven notable epidemics were recorded in Trinidad with the loss of over 3000 animals. During the 1950s, several measures were effectively adopted for the prevention and control of the disease which led to a significant reduction in the number of cases. These measures included vampire bat population control, livestock vaccination, and animal surveillance. However, due to lapses in these measures over the years (e.g., periods of limited vampire control and incomplete herd vaccination), epidemics have occurred. In light of the significant negative impact of rabies on animal production and human health, rabies surveillance in Trinidad should be enhanced and cases evaluated towards the design and implementation of more evidence-based prevention and control programs.

  11. Measured and perceived environmental characteristics are related to accelerometer defined physical activity in older adults

    Directory of Open Access Journals (Sweden)

    Strath Scott J

    2012-04-01

    Full Text Available Abstract Background Few studies have investigated both the self-perceived and measured environment with objectively determined physical activity in older adults. Accordingly, the aim of this study was to examine measured and perceived environmental associations with physical activity of older adults residing across different neighborhood types. Methods One hundred and forty-eight older individuals, mean age 64.3 ± 8.4, were randomly recruited from one of four neighborhoods that were pre-determined as having either high- or low-walkable characteristics. Individual residences were geocoded and 200 m network buffers established. Both objective environment audit and self-perceived environmental measures were collected, in conjunction with accelerometer-derived physical activity behavior. Using both perceived and objective environment data, analysis consisted of a macro-level comparison of physical activity levels across neighborhoods, and a micro-level analysis of individual environmental predictors of physical activity levels. Results Individuals residing in high-walkable neighborhoods on average engaged in 11 min of moderate to vigorous physical activity per day more than individuals residing in low-walkable neighborhoods. Measured access to non-residential destinations (b = .11, p = .031) was a significant predictor of time spent in moderate to vigorous physical activity. Other environmental variables significantly predicting components of physical activity behavior included presence of measured neighborhood crime signage (b = .4785, p = .031), measured street safety (b = 26.8, p = .006), and perceived neighborhood satisfaction (b = 5.8, p = .003). Conclusions Older adult residents who live in high-walkable neighborhoods, who have easy and close access to non-residential destinations, have lower social dysfunction pertinent to crime, and generally perceive the neighborhood to a higher overall satisfaction are likely to engage in higher levels

  12. Software-Defined Network Solutions for Science Scenarios: Performance Testing Framework and Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Settlemyer, Bradley [Los Alamos National Laboratory (LANL); Kettimuthu, R. [Argonne National Laboratory (ANL); Boley, Josh [Argonne National Laboratory (ANL); Katramatos, Dimitrios [Brookhaven National Laboratory (BNL); Rao, Nageswara S. [ORNL; Sen, Satyabrata [ORNL; Liu, Qiang [ORNL

    2018-01-01

    High-performance scientific workflows utilize supercomputers, scientific instruments, and large storage systems. Their executions require fast setup of a small number of dedicated network connections across the geographically distributed facility sites. We present Software-Defined Network (SDN) solutions consisting of site daemons that use dpctl, Floodlight, ONOS, or OpenDaylight controllers to set up these connections. The development of these SDN solutions could be quite disruptive to the infrastructure, while requiring close coordination among multiple sites; in addition, the large number of possible controller and device combinations to investigate could make the infrastructure unavailable to regular users for extended periods of time. In response, we develop a Virtual Science Network Environment (VSNE) using virtual machines, Mininet, and custom scripts that support the development, testing, and evaluation of SDN solutions, without the constraints and expenses of multi-site physical infrastructures; furthermore, the chosen solutions can be directly transferred to production deployments. By complementing VSNE with a physical testbed, we conduct targeted performance tests of various SDN solutions to help choose the best candidates. In addition, we propose a switching response method to assess the setup times and throughput performance of different SDN solutions, and present experimental results that show their advantages and limitations.

  13. Defining and measuring suspicion of sepsis: an analysis of routine data.

    Science.gov (United States)

    Inada-Kim, Matthew; Page, Bethan; Maqsood, Imran; Vincent, Charles

    2017-06-09

    To define the target population of patients who have suspicion of sepsis (SOS) and to provide a basis for assessing the burden of SOS, and the evaluation of sepsis guidelines and improvement programmes. Retrospective analysis of routinely collected hospital administrative data. Secondary care, eight National Health Service (NHS) Acute Trusts. Hospital Episode Statistics data for 2013-2014 was used to identify all admissions with a primary diagnosis listed in the 'suspicion of sepsis' (SOS) coding set. The SOS coding set consists of all bacterial infective diagnoses. We identified 47 475 admissions with SOS, equivalent to a rate of 17 admissions per 1000 adults in a given year. The mortality for this group was 7.2% during their acute hospital admission. Urinary tract infection was the most common diagnosis and lobar pneumonia was associated with the most deaths. A short list of 10 diagnoses can account for 85% of the deaths. Patients with SOS can be identified in routine administrative data. It is these patients who should be screened for sepsis and are the target of programmes to improve the detection and treatment of sepsis. The effectiveness of such programmes can be evaluated by examining the outcomes of patients with SOS. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
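The cohort-identification step the authors describe — flag any admission whose primary diagnosis falls in the SOS coding set, then report in-hospital mortality for that cohort — can be sketched in a few lines. The diagnosis codes and records below are illustrative stand-ins, not the actual SOS coding set from the paper:

```python
# Sketch of SOS cohort identification from routine admission records.
# The ICD-10 codes listed are examples only, not the published SOS set.
SOS_CODES = {"J18.1", "N39.0", "A41.9"}

admissions = [
    {"id": 1, "primary_dx": "J18.1", "died": True},   # lobar pneumonia
    {"id": 2, "primary_dx": "N39.0", "died": False},  # urinary tract infection
    {"id": 3, "primary_dx": "I21.0", "died": False},  # acute MI: not in SOS set
    {"id": 4, "primary_dx": "A41.9", "died": False},  # sepsis, unspecified
]

# Keep only admissions whose primary diagnosis is in the SOS set,
# then compute the cohort's in-hospital mortality.
sos = [a for a in admissions if a["primary_dx"] in SOS_CODES]
mortality = sum(a["died"] for a in sos) / len(sos)
```

The same filter applied to full Hospital Episode Statistics extracts is what yields the paper's 47 475 admissions and 7.2% mortality figures.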

  14. Comparison of measured and calculated doses for narrow MLC defined fields

    International Nuclear Information System (INIS)

    Lydon, J.; Rozenfeld, A.; Lerch, M.

    2002-01-01

    Full text: The introduction of Intensity Modulated Radiotherapy (IMRT) has led to the use of narrow fields in the delivery of radiation doses to patients. Such fields are not well characterized by calculation methods commonly used in radiotherapy treatment planning systems. The accuracy of the dose calculation algorithm must therefore be investigated prior to clinical use. This study looked at symmetrical and asymmetrical 0.1 to 3 cm wide fields delivered with a Varian CL2100C 6 MV photon beam. Measured doses were compared to doses calculated using Pinnacle, the ADAC radiotherapy treatment planning system. Two high-resolution methods of measuring dose were used: a MOSFET detector in a water phantom, and radiographic film in a solid water phantom, with spatial resolutions of 10 and 89 μm respectively. Dose calculations were performed using the collapsed cone convolution algorithm in Pinnacle with a 0.1 cm dose calculation grid in the MLC direction. The effect of Pinnacle not taking into account the rounded leaf ends was simulated by offsetting the leaves by 0.1 cm in the dose calculation. Agreement between measurement and calculation is good for fields of 1 cm and wider. However, fields of less than 1 cm width can show a significant difference between measurement and calculation

  15. Impedance measurements and high-resolution manometry help to better define rumination episodes

    NARCIS (Netherlands)

    Kessing, Boudewijn F.; Govaert, Frank; Masclee, Ad A. M.; Conchillo, José M.

    2011-01-01

    Rumination syndrome is a disorder of unknown etiology characterized by regurgitation of recently ingested food. We aimed to improve the diagnosis of rumination syndrome by classification of separate rumination symptoms using (1) an ambulatory manometry/impedance (AMIM) measurement and (2) a

  16. Design of a fusion reaction-history measurement system with high temporal resolution

    International Nuclear Information System (INIS)

    Peng Xiaoshi; Wang Feng; Liu Shenye; Jiang Xiaohua; Tang Qi

    2010-01-01

    In order to accurately measure the fusion reaction history in experimental studies of inertial confinement fusion, we present the design of a fusion reaction-history measurement system with high temporal resolution. The diagnostic system is composed of a plastic scintillator with nose cone, an optical imaging system, and an optical streak camera. Analysis of the system's capability indicates that the instrument measures fusion reaction history with a temporal resolution as low as 55 ps for 2.45 MeV DD neutrons and 40 ps for 14.03 MeV DT neutrons. The instrument is able to measure the fusion reaction history at yields of 1.5 x 10^9 DD neutrons; about 4 x 10^8 DT neutrons are required for a similar quality signal. (authors)

  17. Defining and measuring integrated patient care: promoting the next frontier in health care delivery.

    Science.gov (United States)

    Singer, Sara J; Burgers, Jako; Friedberg, Mark; Rosenthal, Meredith B; Leape, Lucian; Schneider, Eric

    2011-02-01

    Integration of care is emerging as a central challenge of health care delivery, particularly for patients with multiple, complex chronic conditions. The authors argue that the concept of "integrated patient care" would benefit from further clarification regarding (a) the object of integration and (b) its essential components, particularly when constructing measures.To address these issues, the authors propose a definition of integrated patient care that distinguishes it from integrated delivery organizations, acknowledging that integrated organizational structures and processes may fail to produce integrated patient care. The definition emphasizes patients' central role as active participants in managing their own health by including patient centeredness as a key element of integrated patient care. Measures based on the proposed definition will enable empirical assessment of the potential relationships between the integration of organizations, the integration of patient care, and patient outcomes, providing valuable guidance to health systems reformers.

  18. Measuring replication competent HIV-1: advances and challenges in defining the latent reservoir

    OpenAIRE

    Wang, Zheng; Simonetti, Francesco R.; Siliciano, Robert F.; Laird, Gregory M.

    2018-01-01

    Antiretroviral therapy cannot cure HIV-1 infection due to the persistence of a small number of latently infected cells harboring replication-competent proviruses. Measuring persistent HIV-1 is challenging, as it consists of a mosaic population of defective and intact proviruses that can shift from a state of latency to active HIV-1 transcription. Due to this complexity, most of the current assays detect multiple categories of persistent HIV-1, leading to an overestimate of the true size of th...

  19. Anatomic guidelines defined by reformatting images on MRI for volume measurement of amygdala and hippocampus

    International Nuclear Information System (INIS)

    Hoshida, Tohru; Sakaki, Toshisuke; Uematsu, Sumio.

    1995-01-01

    Twelve patients with intractable partial epilepsy underwent MR scans at the Epilepsy Center of the Johns Hopkins Hospital. There were five women and seven men, ranging in age from five to 51 years (mean age: 26 years). Coronal images were obtained using a 3-D SPGR sequence. The coronal images were transferred to an Allegro 5.1 workstation and reformatted along the cardinal axes (axial and sagittal) from multiple viewpoints. The anterior end of the amygdala was measured at the level just posterior to the disappearance of the temporal stem. The semilunar gyrus of the amygdala was separated from the ambient gyrus by the semianular sulcus that forms the boundary between the amygdala and the entorhinal cortex. The delineation of the hippocampal formation included the subicular complex, hippocampus proper, dentate gyrus, alveus, and fimbria. The uncal cleft separated the uncus above from the parahippocampal gyrus below. The roof of this cleft was formed by the hippocampus and the dentate gyrus, and the floor by the presubiculum and subiculum. Even with these guidelines, strictly separating the hippocampal head from the posterior part of the amygdala was not feasible, as previously reported, because of the isointensity on MRI between the cortex of the amygdala and the hippocampus. The most posterior portion of the hippocampus was measured at the level of the subsplenial gyri, just below the splenium of the corpus callosum, to measure the hippocampal volume in its near totality. It is therefore reliable, and clinically useful, to measure the combined total volume of the amygdala and the hippocampus when comparing results with those of other centers. (S.Y.)

  20. Infectious bronchitis virus variants ? History, current situation and control measures

    OpenAIRE

    2011-01-01

    Abstract Infectious bronchitis virus (IBV) is ubiquitous in most parts of the world where poultry are reared and is able to spread very rapidly in non-protected birds. It is shed via both the respiratory tract and the faeces and can persist in the birds and the intestinal tract for several weeks or months. Outdoors, survival of IBV for 56 days in litter has been reported. Although strict biosecurity and working with a one-age system are essential control measures, normally vaccinat...

  1. The Criterion A problem revisited: controversies and challenges in defining and measuring psychological trauma.

    Science.gov (United States)

    Weathers, Frank W; Keane, Terence M

    2007-04-01

    The Criterion A problem in the field of traumatic stress refers to the stressor criterion for posttraumatic stress disorder (PTSD) and involves a number of fundamental issues regarding the definition and measurement of psychological trauma. These issues first emerged with the introduction of PTSD as a diagnostic category in the Diagnostic and Statistical Manual of Mental Disorders, Third Edition (DSM-III; American Psychiatric Association, 1980) and continue to generate considerable controversy. In this article, the authors provide an update on the Criterion A problem, with particular emphasis on the evolution of the DSM definition of the stressor criterion and the ongoing debate regarding broad versus narrow conceptualizations of traumatic events.

  2. Using a management perspective to define and measure changes in nursing technology.

    Science.gov (United States)

    Alexander, J W; Kroposki, M

    2001-09-01

    The aims of this paper are to discuss the uses of the concept of technology from the medical science and management perspectives; to propose a clear definition of nursing technology; and to present a study applying the concept of nursing technology on nursing units. Nurse managers must use management terms correctly, and the term technology may be misleading for some. A review of the nursing literature shows varied uses of the concept of technology; thus, the dimensions, attributes, consequences, and definitions of nursing technology from the management perspective are discussed. A longitudinal study measured the dimensions of nursing technology on nursing units 10 years apart. The findings suggest that the dimensions of nursing technology change over time and support the need for nurse managers to periodically assess nursing technology before making management changes at the level of the nursing unit. This study helps health care providers understand the unique role of nurses as healthcare professionals by identifying and measuring nursing technology on the nursing unit.

  3. DEFINING THE RELEVANT OUTCOME MEASURES IN MEDICAL DEVICE ASSESSMENTS: AN ANALYSIS OF THE DEFINITION PROCESS IN HEALTH TECHNOLOGY ASSESSMENT.

    Science.gov (United States)

    Jacobs, Esther; Antoine, Sunya-Lee; Prediger, Barbara; Neugebauer, Edmund; Eikermann, Michaela

    2017-01-01

    Defining relevant outcome measures for clinical trials on medical devices (MD) is complex, as there is a large variety of potentially relevant outcomes. The chosen outcomes vary widely across clinical trials making the assessment in evidence syntheses very challenging. The objective is to provide an overview on the current common procedures of health technology assessment (HTA) institutions in defining outcome measures in MD trials. In 2012-14, the Web pages of 126 institutions involved in HTA were searched for methodological manuals written in English or German that describe methods for the predefinition process of outcome measures. Additionally, the institutions were contacted by email. Relevant information was extracted. All process steps were performed independently by two reviewers. Twenty-four manuals and ten responses from the email request were included in the analysis. Overall, 88.5 percent of the institutions describe the type of outcomes that should be considered in detail and 84.6 percent agree that the main focus should be on patient relevant outcomes. Specifically related to MD, information could be obtained in 26 percent of the included manuals and email responses. Eleven percent of the institutions report a particular consideration of MD related outcomes. This detailed analysis on common procedures of HTA institutions in the context of defining relevant outcome measures for the assessment of MD shows that standardized procedures for MD from the perspective of HTA institutions are not widespread. This leads to the question if a homogenous approach should be implemented in the field of HTA on MD.

  4. Survivor-Defined Practice in Domestic Violence Work: Measure Development and Preliminary Evidence of Link to Empowerment.

    Science.gov (United States)

    Goodman, Lisa A; Thomas, Kristie; Cattaneo, Lauren Bennett; Heimel, Deborah; Woulfe, Julie; Chong, Siu Kwan

    2016-01-01

    Survivor-defined practice, characterized by an emphasis on client choice, partnership, and sensitivity to the unique needs, contexts, and coping strategies of individual survivors, is an aspirational goal of the domestic violence (DV) movement, assumed to be a key contributor to empowerment and other positive outcomes among survivors. Despite its central role in DV program philosophy, training, and practice, however, our ability to assess its presence and its presumed link to well-being has been hampered by the absence of a way to measure it from survivors' perspectives. As part of a larger university-community collaboration, this study had two aims: (a) to develop a measure of survivor-defined practice from the perspective of participants, and (b) to assess its relationship to safety-related empowerment after controlling for other contributors to survivor well-being (e.g., financial stability and social support). Results supported the reliability and validity of the Survivor-Defined Practice Scale (SDPS), a nine-item measure that assesses participants' perception of the degree to which their advocates help them achieve goals they set for themselves, facilitate a spirit of partnership, and show sensitivity to their individual needs and styles. The items combined to form one factor indicating that the three theoretical aspects of survivor-defined practice may be different manifestations of one underlying construct. Results also support the hypothesized link between survivor-defined practice and safety-related empowerment. The SDPS offers DV programs a mechanism for process evaluation that is rigorous and rooted in the feminist empowerment philosophy that so many programs espouse. © The Author(s) 2014.

  5. Gas permeation measurement under defined humidity via constant volume/variable pressure method

    KAUST Repository

    Jan Roman, Pauls

    2012-02-01

    Many industrial gas separations in which membrane processes are feasible entail high water vapour contents, as in CO 2-separation from flue gas in carbon capture and storage (CCS), or in biogas/natural gas processing. Studying the effect of water vapour on gas permeability through polymeric membranes is essential for materials design and optimization of these membrane applications. In particular, for amine-based CO 2 selective facilitated transport membranes, water vapour is necessary for carrier-complex formation (Matsuyama et al., 1996; Deng and Hägg, 2010; Liu et al., 2008; Shishatskiy et al., 2010) [1-4]. But also conventional polymeric membrane materials can vary their permeation behaviour due to water-induced swelling (Potreck, 2009) [5]. Here we describe a simple approach to gas permeability measurement in the presence of water vapour, in the form of a modified constant volume/variable pressure method (pressure increase method). © 2011 Elsevier B.V.
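In outline, the pressure-increase method computes permeability from the steady-state slope of the downstream pressure in a fixed volume. A minimal sketch follows; the cell dimensions, pressure trace, and variable names are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of the constant volume / variable pressure (pressure-
# increase) permeability calculation. All numbers are illustrative.
R = 8.314  # gas constant, J/(mol*K)

def slope(ts, ps):
    """Least-squares slope dp/dt of the steady-state pressure rise."""
    n = len(ts)
    mt, mp = sum(ts) / n, sum(ps) / n
    return sum((t - mt) * (p - mp) for t, p in zip(ts, ps)) / sum(
        (t - mt) ** 2 for t in ts)

def permeability(dpdt, V_d, thickness, area, p_up, T):
    """Permeability in mol*m/(m^2*s*Pa) from the downstream pressure rise
    dpdt (Pa/s) in a fixed volume V_d (m^3), for a membrane of given
    thickness (m) and area (m^2), upstream pressure p_up (Pa), and T (K)."""
    return V_d * thickness * dpdt / (R * T * area * p_up)

# Synthetic steady-state trace rising at 0.5 Pa/s
ts = list(range(0, 50, 10))
ps = [100 + 0.5 * t for t in ts]
dpdt = slope(ts, ps)
P = permeability(dpdt, V_d=30e-6, thickness=100e-6, area=2e-4, p_up=1e5, T=298)
```

Under defined humidity, the same arithmetic applies once the water vapour partial pressure is subtracted from the measured upstream driving pressure.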

  6. Defining and measuring blood donor altruism: a theoretical approach from biology, economics and psychology.

    Science.gov (United States)

    Evans, R; Ferguson, E

    2014-02-01

    While blood donation is traditionally described as a behaviour motivated by pure altruism, the assessment of altruism in the blood donation literature has not been theoretically informed. Drawing on theories of altruism from psychology, economics and evolutionary biology, it is argued that a theoretically derived psychometric assessment of altruism is needed. Such a measure is developed in this study that can be used to help inform both our understanding of the altruistic motives of blood donors and recruitment intervention strategies. A cross-sectional survey (N = 414), with a 1-month behavioural follow-up (time 2, N = 77), was designed to assess theoretically derived constructs from psychological, economic and evolutionary biological theories of altruism. Theory of planned behaviour (TPB) variables and co-operation were also assessed at time 1, and a measure of behavioural co-operation at time 2. Five theoretical dimensions of altruism (impure altruism, kinship, self-regarding motives, reluctant altruism and egalitarian warm glow) were identified through factor analyses. These five altruistic motives differentiated blood donors from non-donors (donors scored higher on impure altruism and reluctant altruism), showed incremental validity over TPB constructs to predict donor intention, and predicted future co-operative behaviour. These findings show that altruism in the context of blood donation is multifaceted and complex and does not reflect pure altruism. This has implications for recruitment campaigns that focus solely on pure altruism. © 2013 The Authors. Vox Sanguinis published by John Wiley & Sons Ltd. on behalf of International Society of Blood Transfusion.

  7. A Brief History of Attempts to Measure Sexual Motives

    Directory of Open Access Journals (Sweden)

    Elaine Hatfield

    2012-12-01

    Full Text Available Artists, creative writers, and musicians have long been interested in the complex motives that spark passionate love, sexual desire, and sexual behavior. Recently, scholars from a variety of disciplines have begun to investigate two questions: “Why do men and women choose to engage in sexual liaisons?” “Why do they avoid such encounters?” Theories abound. Many theorists have complained that there exists a paucity of scales designed to measure the plethora of motives that prompt people to seek out or to avoid sexual activities. In fact, this observation is incorrect. Many such scales of documented reliability and validity do exist. The reason that few scholars are familiar with these scales is that they were developed by psychometricians from a variety of disciplines and are scattered about in an assortment of journals, college libraries, and researchers’ desk drawers, thus making them difficult to identify and locate. This paper will attempt to provide a compendium of all known sexual motives scales, hoping that this will encourage scholars to take a multidisciplinary approach in developing typologies of sexual motives and/or in conducting their own research into the nature of sexual motives.

  8. Assessment and measurement in neuropsychiatry: a conceptual history.

    Science.gov (United States)

    Berrios, German E; Marková, Ivana S

    2002-01-01

    Since the time the parent discipline of psychiatry became organized as a profession, one of its ludi saeculares (neuropsychiatry) has enjoyed at least 4 vogues. On each, neuropsychiatry has been known to ally itself to a cause: currently it is the big business of neurobiology. This move can be seen as scientific progress or as a side-effect of the (professional rather than scientific) infighting that affected neuromedicine during the late 19(th) century and which led to the construction of the notion of "neurological disease." Alienists responded to this variously: some, like Kahlbaum and Kraepelin accepted the split and returned to the more botanico approach; others, like Ziehen chose psychology; yet others, like Freud, delved in hermeneutics; lastly, there were those, like Meynert, Wernicke, Von Monakow, and Liepmann who sought an accommodation with neurology. Born out of this compromise, neuropsychiatry has remained a blurred activity (whose definitions range from "psychiatry of neurology" to a crusade for the "naturalization of the mind"). Neuropsychiatric assessment is a methodology designed to collect information about patients whose mental symptoms are thought to be caused by brain disease. When it first appeared, it was torn by the debate between "nomothetic versus idiographic" science. For a time, the neuropsychiatry assessment techniques stuck to the old personalized narratives characteristic of 19(th) century "casenotes" (trying to meet its descriptive, explanatory, therapeutic, legal, and ethical obligations). But during the late 19(th) century, measurement and quantification became part of the new rhetoric of science. Soon enough this affected psychology in general and neuropsychology in particular and neuropsychiatric assessment followed suit. It has changed little since except that now and again old tests and markers are replaced by more "reliable" ones and phenomenological data are squeezed out further. Its laudable enthusiasm for objectivity and

  9. Defining the endpoints: how to measure the efficacy of drugs that are active against central nervous system metastases

    OpenAIRE

    Fabi, Alessandra; Vidiri, Antonello

    2016-01-01

    Brain metastases (BMs) are the most common cause of malignant central nervous system (CNS) tumors in adults. In the recent past, patients with BMs were excluded from clinical trials, but now, with the advent of new biological and immunological drugs, their inclusion is more common. In the last era response and progression criteria used across clinical trials have defined the importance to consider not only measurement changes of brain lesions but also the modification of parameters related to...

  10. Measuring the construct of executive control in schizophrenia: defining and validating translational animal paradigms for discovery research.

    Science.gov (United States)

    Gilmour, Gary; Arguello, Alexander; Bari, Andrea; Brown, Verity J; Carter, Cameron; Floresco, Stan B; Jentsch, David J; Tait, David S; Young, Jared W; Robbins, Trevor W

    2013-11-01

    Executive control is an aspect of cognitive function known to be impaired in schizophrenia. Previous meetings of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) group have more precisely defined executive control in terms of two constructs: "rule generation and selection", and "dynamic adjustments of control". Next, human cognitive tasks that may effectively measure performance with regard to these constructs were identified to be developed into practical and reliable measures for use in treatment development. The aim of this round of CNTRICS meetings was to define animal paradigms that have sufficient promise to warrant further investigation for their utility in measuring these constructs. Accordingly, "reversal learning" and the "attentional set-shifting task" were nominated to assess the construct of rule generation and selection, and the "stop signal task" for the construct of dynamic adjustments of control. These tasks are described in more detail here, with a particular focus on their utility for drug discovery efforts. Presently, each assay has strengths and weaknesses with regard to this point and increased emphasis on improving practical aspects of testing, understanding predictive validity, and defining biomarkers of performance represent important objectives in attaining confidence in translational validity here. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  11. A Case Study of Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) Methodology in Garment Sector

    Directory of Open Access Journals (Sweden)

    Abdur Rahman

    2017-12-01

    This paper demonstrates the empirical application of Six Sigma and the Define-Measure-Analyze-Improve-Control (DMAIC) methodology to reduce product defects within a garment manufacturing organization in Bangladesh, following the DMAIC methodology to investigate defects and root causes and to provide a solution for eliminating those defects. The analysis from employing Six Sigma and DMAIC indicated that broken stitches and open seams influenced the number of defective products. Design of experiments (DOE) and analysis of variance (ANOVA) techniques were combined to statistically determine the correlation of broken stitches and open seams with defects, as well as to define the optimum values needed to eliminate them. A reduction of about 35% in garment defects was achieved, which helped the organization studied to reduce its defects and thus improve its Sigma level from 1.7 to 3.4.
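The ANOVA step the case study applies can be sketched in a few lines. This is a minimal illustration, not the study's actual experiment: the factor (machine-tension setting), group sizes, and defect counts below are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily defect counts under three machine-tension settings
# (the case study's actual DOE factors and data are not reproduced here).
groups = [rng.poisson(12, 20), rng.poisson(9, 20), rng.poisson(5, 20)]

# One-way ANOVA by hand: partition variability into between-group and
# within-group sums of squares and form the F ratio.
all_obs = np.concatenate(groups)
grand = all_obs.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between = len(groups) - 1                  # 3 groups -> 2
df_within = all_obs.size - len(groups)        # 60 obs  -> 57
f_stat = (ss_between / df_between) / (ss_within / df_within)
# An f_stat far above the F(2, 57) critical value (~3.16 at alpha = 0.05)
# indicates the setting affects the mean defect count.
```

In the study's setting, the groups would instead be defect counts at candidate levels of the stitch and seam process parameters, and the significant factors would feed the Improve phase.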

  12. ANALYSIS OF MUTATIONS OF TUBERCULOUS MYCOBACTERIA DEFINING DRUG RESISTANCE IN HIV POSITIVE AND HIV NEGATIVE TUBERCULOSIS PATIENTS WITHOUT PRIOR HISTORY OF TREATMENT IN SVERDLOVSK REGION

    Directory of Open Access Journals (Sweden)

    G. V. Panov

    2017-01-01

    Goal of the study: to identify the profile of mutations of tuberculous mycobacteria responsible for resistance to anti-tuberculosis drugs in HIV-positive and HIV-negative tuberculosis patients without prior history of treatment. Materials and methods. 165 strains of tuberculous mycobacteria from HIV-positive patients and 166 strains of tuberculous mycobacteria from HIV-negative patients were studied in Sverdlovsk Region (TB Dispensary, Yekaterinburg). Mutations in genes were identified using TB-BIOCHIP® and TB-BIOCHIP®-2 microchips in compliance with the manufacturer's guidelines (OOO Biochip-IMB, Moscow). Results. 85/165 (51.52%) of the strains isolated from HIV-positive tuberculosis patients and 58/166 (34.94%) of the strains isolated from tuberculosis patients without HIV possessed the MDR genotype (p < 0.01). The majority of MDR strains had mutations in codon 531 of rpoB (Ser→Leu) and codon 315 of katG (Ser→Thr) (64/85, 75.29% and 38/58, 65.52% in the respective groups), resulting in high-level resistance to rifampicin and isoniazid. Each group also had an approximately equal proportion (11/165, 6.67% and 12/166, 7.23%, respectively) of strains with genomic mutations conferring resistance to isoniazid, rifampicin and fluoroquinolones. No significant difference was found in the mutation patterns of the genomes of tuberculous mycobacteria isolated from HIV-positive and HIV-negative tuberculosis patients.

  13. Inverse analysis of inner surface temperature history from outer surface temperature measurement of a pipe

    International Nuclear Information System (INIS)

    Kubo, S; Ioka, S; Onchi, S; Matsumoto, Y

    2010-01-01

    When slug flow runs through a pipe, nonuniform and time-varying thermal stresses develop, and there is a possibility that thermal fatigue occurs. It is therefore necessary to know the temperature and stress distributions in the pipe for its integrity assessment. It is, however, difficult to measure the inner surface temperature directly, so a method for estimating the temperature history on the inner surface of the pipe is needed. As a basic study toward estimating the inner-surface temperature history of a pipe with slug flow, this paper presents a method for estimating the temperature on the inner surface of a plate from the temperature on the outer surface. The relationship between the temperature histories on the outer and inner surfaces is obtained analytically. Using the results of this mathematical analysis, an inverse analysis method for estimating the inner-surface temperature history from the outer-surface temperature history is proposed. It is found that the inner-surface temperature history can be estimated from the outer-surface history by applying the inverse analysis method, even when it contains multiple frequency components.
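The frequency-domain idea behind such an inverse analysis can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes a 1-D wall in which each periodic component of angular frequency ω is damped by exp(−L·√(ω/2α)) and phase-lagged by the same amount across a wall of thickness L and thermal diffusivity α, so the inverse estimate amplifies and phase-advances each Fourier component of the measured outer-surface history, with a frequency cutoff as crude regularization (the inverse problem is ill-posed). All numbers (wall thickness, diffusivity, oscillation frequency) are illustrative assumptions.

```python
import numpy as np

def estimate_inner_history(t_outer, dt, thickness, alpha, f_cut):
    """Frequency-domain inverse estimate of inner-surface temperature.

    Assumes the quasi-steady 1-D damping relation: a component of
    angular frequency w loses amplitude exp(-L*sqrt(w/(2*alpha))) and
    gains an equal phase lag across the wall. Components above f_cut
    are discarded so measurement noise is not amplified without bound.
    """
    n = len(t_outer)
    spec = np.fft.rfft(t_outer)
    freqs = np.fft.rfftfreq(n, d=dt)           # Hz
    w = 2 * np.pi * freqs
    k = np.sqrt(w / (2 * alpha))               # damping wavenumber
    h = np.exp(thickness * k) * np.exp(1j * thickness * k)  # undo decay + lag
    h[freqs > f_cut] = 0.0                     # cutoff regularization
    return np.fft.irfft(spec * h, n=n)

# Synthetic check: damp a known inner history to the outer surface,
# then invert it back.
alpha, L_wall, dt = 4e-6, 0.01, 0.1            # steel-like diffusivity, 10 mm
t = np.arange(0, 60, dt)
w0 = 2 * np.pi * 0.2                           # 0.2 Hz slug-flow oscillation
k0 = np.sqrt(w0 / (2 * alpha))
inner = 10.0 * np.sin(w0 * t)
outer = 10.0 * np.exp(-L_wall * k0) * np.sin(w0 * t - L_wall * k0)
recovered = estimate_inner_history(outer, dt, L_wall, alpha, f_cut=1.0)
```

With the exactly periodic synthetic signal the 10 K inner oscillation is recovered from the heavily damped outer trace; real data would require a more careful regularization choice.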

  14. The vexing problem of defining the meaning, role and measurement of values in treatment decision-making.

    Science.gov (United States)

    Charles, Cathy; Gafni, Amiram

    2014-03-01

    Two international movements, evidence-based medicine (EBM) and shared decision-making (SDM) have grappled for some time with issues related to defining the meaning, role and measurement of values/preferences in their respective models of treatment decision-making. In this article, we identify and describe unresolved problems in the way that each movement addresses these issues. The starting point for this discussion is that at least two essential ingredients are needed for treatment decision-making: research information about treatment options and their potential benefits and risks; and the values/preferences of participants in the decision-making process. Both the EBM and SDM movements have encountered difficulties in defining the meaning, role and measurement of values/preferences in treatment decision-making. In the EBM model of practice, there is no clear and consistent definition of patient values/preferences and no guidance is provided on how to integrate these into an EBM model of practice. Methods advocated to measure patient values are also problematic. Within the SDM movement, patient values/preferences tend to be defined and measured in a restrictive and reductionist way as patient preferences for treatment options or attributes of options, while broader underlying value structures are ignored. In both models of practice, the meaning and expected role of physician values in decision-making are unclear. Values clarification exercises embedded in patient decision aids are suggested by SDM advocates to identify and communicate patient values/preferences for different treatment outcomes. Such exercises have the potential to impose a particular decision-making theory and/or process onto patients, which can change the way they think about and process information, potentially impeding them from making decisions that are consistent with their true values. The tasks of clarifying the meaning, role and measurement of values/preferences in treatment decision

  15. What is culture in «cultural economy»? Defining culture to create measurable models in cultural economy

    Directory of Open Access Journals (Sweden)

    Aníbal Monasterio Astobiza

    2017-07-01

    The idea of culture is somewhat vague and ambiguous for the formal goals of economics. The aim of this paper is to better define the notion of culture so as to help build economic explanations based on culture and thereby measure its impact on the activities and beliefs associated with it. According to the canonical evolutionary definition, culture is any kind of ritualised behaviour that becomes meaningful for a group, remains more or less constant, and is transmitted down through the generations. Economic institutions are founded, implicitly or explicitly, on a worldview of how humans function; culture is an essential part of understanding ourselves as humans, making it necessary to describe correctly what we understand by culture. In this paper we review the literature on evolutionary anthropology and psychology dealing with the concept of culture and warn that economic modelling ignores the intangible benefits of culture, rendering economics unable to measure certain cultural items in the digital consumer society.

  16. Multi-species time-history measurements during high-temperature acetone and 2-butanone pyrolysis

    KAUST Repository

    Lam, Kingyiu; Ren, Wei; Pyun, Sunghyun; Farooq, Aamir; Davidson, David Frank; Hanson, Ronald Kenneth

    2013-01-01

    High-temperature acetone and 2-butanone pyrolysis studies were conducted behind reflected shock waves using five species time-history measurements (ketone, CO, CH3, CH4 and C2H4). Experimental conditions covered temperatures of 1100-1600 K at 1.6 atm

  17. Measuring epistemological beliefs in history education : An exploration of naïve and nuanced beliefs

    NARCIS (Netherlands)

    Stoel, G.; Logtenberg, A.; Wansink, B.; Huijgen, T.; van Boxtel, C.; van Drie, J.

    2017-01-01

    This study investigates a questionnaire that measures epistemological beliefs in history. Participants were 922 exam students. A basic division between naïve and nuanced ideas underpins the questionnaire. However, results show this division oversimplifies the underlying structure. Exploratory factor

  18. A Bayesian Retrieval of Greenland Ice Sheet Internal Temperature from Ultra-wideband Software-defined Microwave Radiometer (UWBRAD) Measurements

    Science.gov (United States)

    Duan, Y.; Durand, M. T.; Jezek, K. C.; Yardim, C.; Bringer, A.; Aksoy, M.; Johnson, J. T.

    2017-12-01

    The ultra-wideband software-defined microwave radiometer (UWBRAD) is designed to provide an ice sheet internal temperature product by measuring low-frequency microwave emission. Twelve channels ranging from 0.5 to 2.0 GHz are covered by the instrument. A Greenland airborne campaign in September 2016 provided the first demonstration of ultra-wideband radiometer observations of geophysical scenes, including ice sheets. Another flight was planned for September 2017 to acquire measurements over the central ice sheet. A Bayesian framework is designed to retrieve the ice sheet internal temperature from simulated UWBRAD brightness temperature (Tb) measurements over the Greenland flight path with limited prior information about the ground. A 1-D heat-flow model, the Robin model, was used to model the ice sheet internal temperature profile. Synthetic UWBRAD Tb observations were generated via the partially coherent radiative transfer model, which uses the Robin model temperature profile and an exponential fit of ice density from borehole measurements as input, and were corrupted with noise. The effective surface temperature, the geothermal heat flux, the variance of upper-layer ice density, and the variance of fine-scale density variation in the deeper ice sheet were treated as unknown variables within the retrieval framework. Each parameter is defined over its possible range and assumed to be uniformly distributed. The Markov chain Monte Carlo (MCMC) approach is applied to let the unknown parameters randomly walk in the parameter space. We investigate whether the estimates of these variables can be improved over their priors using the MCMC approach and thereby contribute to the temperature retrieval. UWBRAD measurements acquired near Camp Century in 2016 were also treated with the MCMC approach to examine the framework in the presence of scattering effects. The fine-scale density fluctuation is an important parameter: it is the most sensitive yet most poorly known parameter in the estimation framework.
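The random-walk MCMC retrieval described above can be sketched generically. The forward model below is a hypothetical linear stand-in (the actual UWBRAD retrieval uses a partially coherent radiative transfer model), and the parameters, prior ranges, and noise level are assumed purely for illustration of the Metropolis mechanics with uniform (box) priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in forward model: maps (surface temperature T_s,
# geothermal-flux-like parameter q) to brightness temperatures at the
# twelve 0.5-2.0 GHz channels. Purely illustrative.
chan = np.linspace(0.5, 2.0, 12)               # GHz
def forward(theta):
    t_s, q = theta
    return t_s + 40.0 * q * np.exp(-chan)      # arbitrary smooth response

# Synthetic "observations" with 1 K radiometric noise.
truth = np.array([245.0, 1.2])
sigma = 1.0
obs = forward(truth) + rng.normal(0.0, sigma, chan.size)

lo, hi = np.array([230.0, 0.0]), np.array([260.0, 3.0])   # uniform priors

def log_post(theta):
    if np.any(theta < lo) or np.any(theta > hi):
        return -np.inf                          # outside the prior box
    r = obs - forward(theta)
    return -0.5 * np.sum(r**2) / sigma**2       # Gaussian likelihood

# Metropolis random walk in the parameter space.
theta = np.array([240.0, 1.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.5, 0.05])  # proposal step sizes
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])                   # discard burn-in
est = chain.mean(axis=0)                         # posterior-mean estimate
```

The posterior samples also give the parameter spreads, which is how one judges whether the data actually improve on the uniform priors for each variable.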

  19. Surface Charge Measurement of SonoVue, Definity and Optison: A Comparison of Laser Doppler Electrophoresis and Micro-Electrophoresis.

    Science.gov (United States)

    Ja'afar, Fairuzeta; Leow, Chee Hau; Garbin, Valeria; Sennoga, Charles A; Tang, Meng-Xing; Seddon, John M

    2015-11-01

    Microbubble (MB) contrast-enhanced ultrasonography is a promising tool for targeted molecular imaging. It is important to determine the MB surface charge accurately as it affects the MB interactions with cell membranes. In this article, we report the surface charge measurement of SonoVue, Definity and Optison. We compare the performance of the widely used laser Doppler electrophoresis with an in-house micro-electrophoresis system. By optically tracking MB electrophoretic velocity in a microchannel, we determined the zeta potentials of MB samples. Using micro-electrophoresis, we obtained zeta potential values for SonoVue, Definity and Optison of -28.3, -4.2 and -9.5 mV, with relative standard deviations of 5%, 48% and 8%, respectively. In comparison, laser Doppler electrophoresis gave -8.7, +0.7 and +15.8 mV with relative standard deviations of 330%, 29,000% and 130%, respectively. We found that the reliability of laser Doppler electrophoresis is compromised by MB buoyancy. Micro-electrophoresis determined zeta potential values with a 10-fold improvement in relative standard deviation. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  20. History of satellite missions and measurements of the Earth Radiation Budget (1957-1984)

    Science.gov (United States)

    House, F. B.; Gruber, A.; Hunt, G. E.; Mecherikunnel, A. T.

    1986-01-01

    The history of satellite missions and their measurements of the earth radiation budget from the beginning of the space age until the present time are reviewed. The survey emphasizes the early struggle to develop instrument systems to monitor reflected shortwave and emitted long-wave exitances from the earth, and the problems associated with the interpretation of these observations from space. In some instances, valuable data sets were developed from satellite measurements whose instruments were not specifically designed for earth radiation budget observations.

  1. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    Science.gov (United States)

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research, owing to the lack of an experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related to inaccurate measurement technique or to the validity of measurements, seem not to be well known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated to reproduction are almost impossible to quantify with precision; thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach in which the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
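The attenuation effect the abstract describes, underestimation of an effect size when measurement error in the predictor is ignored, can be demonstrated with a small simulation. The variable names, effect size, and noise levels below are invented for illustration, and averaging several indicators stands in (crudely) for a full multiple-indicator latent-variable SEM fit:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# True latent "lifetime reproductive effort" and its effect on survival.
effort = rng.normal(0.0, 1.0, n)
beta = -0.5                                    # true survival cost
survival = beta * effort + rng.normal(0.0, 1.0, n)

# Three noisy indicators of the latent effort (stand-ins for proxies
# such as parity or lifetime reproductive success), each with error.
noise_sd = 1.0
indicators = effort[:, None] + rng.normal(0.0, noise_sd, (n, 3))

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x)

# Single noisy proxy: the slope is attenuated by the reliability
# var(effort) / (var(effort) + var(error)) = 1 / (1 + 1) = 0.5,
# so we expect roughly -0.25 instead of -0.5.
b_single = slope(indicators[:, 0], survival)

# Averaging the indicators shrinks the error variance to noise_sd**2/3,
# raising reliability to 1 / (1 + 1/3) = 0.75 and reducing the bias.
b_mean = slope(indicators.mean(axis=1), survival)
```

A proper SEM fit estimates the latent variable's loadings and error variances jointly and recovers the unattenuated coefficient; the averaged-indicator estimate here merely shows the direction of the improvement.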

  2. Love as a subjective correlate of interpersonal relationships: attempts at defining the concept and methods of measurement

    Directory of Open Access Journals (Sweden)

    O. P. Zolotnyik

    2015-04-01

    This article reviews the scientific study of the phenomenon of love. Attempts at scientific understanding are represented by the theories of love developed by sociologists and psychologists, which define, classify and measure this phenomenon. The paper reviews the most popular theories of love: the triangular theory of love of Robert J. Sternberg, the classification of love styles of John Alan Lee, and the transformational concept of A. Giddens. The importance of studying this subject is explained by respondents' subjective assessment of the role of love as a correlate of interpersonal relationships. Love is considered as a factor that acts as a motive for marriage and as a component that ensures its durability. The difficulty for a scientific understanding of love is the absence of clear empirical referents to fix upon. The theories examined test their hypotheses through specific methods of measurement, which are reviewed here: the love and liking scales of Z. Rubin, the Love Attitude Scale of Hendrick C. and Hendrick S., and the romantic relationship scale of Munro-Adams. These methodologies are widely used in modern research and have undergone modification and adaptation depending on the cultural characteristics of respondents. The phenomenon of love needs further scientific study with the aim of categorization, selection of a range of techniques, and inclusion as a component in sociological surveys of interpersonal relationships.

  3. Multi-species time-history measurements during high-temperature acetone and 2-butanone pyrolysis

    KAUST Repository

    Lam, Kingyiu

    2013-01-01

    High-temperature acetone and 2-butanone pyrolysis studies were conducted behind reflected shock waves using five species time-history measurements (ketone, CO, CH3, CH4 and C2H4). Experimental conditions covered temperatures of 1100-1600 K at 1.6 atm, for mixtures of 0.25-1.5% ketone in argon. During acetone pyrolysis, the CO concentration time-history was found to be strongly sensitive to the acetone dissociation rate constant κ1 (CH3COCH3 → CH3 + CH3CO), and this could be directly determined from the CO time-histories, yielding κ1(1.6 atm) = 2.46 × 10^14 exp(−69.3 [kcal/mol]/RT) s^−1 with an uncertainty of ±25%. This rate constant is in good agreement with previous shock tube studies from Sato and Hidaka (2000) [3] and Saxena et al. (2009) [4] (within 30%) at temperatures above 1450 K, but is at least three times faster than the evaluation from Sato and Hidaka at temperatures below 1250 K. Using this revised κ1 value with the recent mechanism of Pichon et al. (2009) [5], the simulated profiles during acetone pyrolysis show excellent agreement with all five species time-history measurements. Similarly, the overall 2-butanone decomposition rate constant κtot was inferred from measured 2-butanone time-histories, yielding κtot(1.5 atm) = 6.08 × 10^13 exp(−63.1 [kcal/mol]/RT) s^−1 with an uncertainty of ±35%. This rate constant is approximately 30% faster than that proposed by Serinyel et al. (2010) [11] at 1119 K, and approximately 100% faster at 1412 K. Using the measured 2-butanone and CO time-histories and an O-atom balance analysis, a missing removal pathway for methyl ketene was identified. The rate constant for the decomposition of methyl ketene was assumed to be the same as the value for the ketene decomposition reaction. Using the revised κtot value and adding the methyl ketene decomposition reaction to the Serinyel et al. mechanism, the simulated profiles during 2-butanone pyrolysis show good agreement with the measurements for all five species.
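The reported Arrhenius expression is easy to evaluate at the conditions quoted in the abstract; a worked example (the spot temperatures are chosen for illustration, and R is the gas constant in cal/(mol·K) to match the kcal/mol activation energy):

```python
import math

# Arrhenius evaluation of the reported acetone dissociation rate:
# k1(1.6 atm) = 2.46e14 * exp(-69.3 [kcal/mol] / RT) s^-1.
R = 1.987                      # gas constant, cal/(mol*K)
Ea = 69.3e3                    # activation energy, cal/mol
A = 2.46e14                    # pre-exponential factor, 1/s

def k1(temp_k):
    return A * math.exp(-Ea / (R * temp_k))

rates = {t: k1(t) for t in (1100, 1250, 1450, 1600)}
# The steep temperature dependence (several orders of magnitude between
# 1100 K and 1600 K) is why the CO time-histories constrain k1 so directly.
```

At 1450 K this gives k1 on the order of 10^4 s^-1, i.e. sub-millisecond dissociation timescales within the shock-tube test time.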

  4. Measuring the Measuring Rod: Bible and Parabiblical Texts within the History of Medieval Literature

    Directory of Open Access Journals (Sweden)

    Lucie Doležalová

    2018-01-01

    In spite of the acknowledged crucial role they played in forming medieval written culture, the Bible and a wide range of parabiblical texts still remain largely ignored by histories of medieval literatures. The reason for this striking omission of an important group of medieval texts from the 'canonical' narratives is, as I argue, the strong bias in favour of national, secular, fictional, and original texts which shapes literary studies – an inheritance from the nineteenth-century nationalising approaches discussed in the first issue of the Interfaces journal. Of course, the discipline of literary studies, and therefore selection, hierarchization, and interpretation, are complex social, cultural and political processes where almost anything is possible. It is the environment, the interpretive community, in which the interpretation takes place that has a decisive role. And that, too, is constantly being transformed. Thus, there are no final categories and answers, because as long as there are interpretive communities, meanings are generated and operate in new ways. That is why the present discussion does not aim to claim that many of the parabiblical texts are literature and should have been included in the canon of medieval literature. Rather, I examine what the nineteenth-century notion of canon did to these texts and how the current questioning and substantial reshaping of notions of canon can transform our understanding of parabiblical texts.

  5. Single-cell quantitative HER2 measurement identifies heterogeneity and distinct subgroups within traditionally defined HER2-positive patients.

    Science.gov (United States)

    Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S

    2013-11-01

    Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  6. Informational measuring system of Baikal-1 stand. History and tendency of development

    International Nuclear Information System (INIS)

    Gorbanenko, O.A.; Dzalbo, V.V.; Inkov, A.F.

    1995-01-01

    The existing informational and measuring system has the following main characteristics: input and processing of 768 analog signals with a minimum interrogation period of 0.1 s; registration of 768 analog parameters on magnetic tape at the designed frequency (maximum frequency 10 Hz); constant cyclic registration of measured information at 10 Hz on magnetic disk, with a depth of process-history preservation of up to 10 min; input and registration of 1024 discrete parameters at 10 Hz; display of experimental and technical information on a colour terminal; and graphic display of the main control points on a monochromatic gas-discharge panel screen with a refresh period of 1 s. 2 figs

  7. A comparison of measured wind park load histories with the WISPER and WISPERX load spectra

    Science.gov (United States)

    Kelley, N. D.

    1995-01-01

    The blade-loading histories from two adjacent Micon 65/13 wind turbines are compared with the variable-amplitude test-loading histories known as the WISPER and WISPERX spectra. These standardized loading sequences were developed from blade flapwise load histories taken from nine different horizontal-axis wind turbines operating under a wide range of conditions in Europe. The subject turbines covered a broad spectrum of rotor diameters, materials, and operating environments. The final loading sequences were developed as a joint effort of thirteen different European organizations. The goal was to develop a meaningful loading standard for horizontal-axis wind turbine blades that represents common interaction effects seen in service. In 1990, NREL made extensive load measurements on two adjacent Micon 65/13 wind turbines in simultaneous operation in the very turbulent environment of a large wind park. Further, before and during the collection of the loads data, comprehensive measurements of the statistics of the turbulent environment were obtained at both the turbines under test and at two other locations within the park. The trend to larger but lighter wind turbine structures has made an understanding of the expected lifetime loading history of paramount importance. Experience in the US has shown that the turbulence-induced loads associated with multi-row wind parks in general are much more severe than for turbines operating individually or within widely spaced environments. Multi-row wind parks are much more common in the US than in Europe. In this paper we report on our results in applying the methodology utilized to develop the WISPER and WISPERX standardized loading sequences using the available data from the Micon turbines. While the intended purpose of the WISPER sequences were not to represent a specific operating environment, we believe the exercise is useful, especially when a turbine design is likely to be installed in a multi-row wind park.

  8. SPME-Based Ca-History Method for Measuring SVOC Diffusion Coefficients in Clothing Material.

    Science.gov (United States)

    Cao, Jianping; Liu, Ningrui; Zhang, Yinping

    2017-08-15

    Clothes play an important role in dermal exposure to indoor semivolatile organic compounds (SVOCs). The diffusion coefficient of SVOCs in clothing material (Dm) is essential for estimating SVOC sorption by clothing material and subsequent dermal exposure to SVOCs. However, few studies have reported measured Dm values for clothing materials. In this paper, we present the solid-phase microextraction (SPME) based Ca-history method. To the best of our knowledge, this is the first attempt to measure Dm with a known relative standard deviation (RSD). A thin sealed chamber is formed by a circular ring and two pieces of flat SVOC source material that are tightly covered by the targeted clothing materials. Dm is obtained by applying an SVOC mass transfer model in the chamber to the history of gas-phase SVOC concentrations (Ca) in the chamber measured by SPME. Dm values for three SVOCs, di-iso-butyl phthalate (DiBP), di-n-butyl phthalate (DnBP), and tris(1-chloro-2-propyl) phosphate (TCPP), in a cotton T-shirt can be obtained within 16 days, with RSD less than 3%. This study should prove useful for measuring SVOC Dm in various sink materials. Further studies are expected to facilitate application of this method and to investigate the effects of temperature, relative humidity, and clothing material on Dm.

  9. Defining Darwinism.

    Science.gov (United States)

    Hull, David L

    2011-03-01

    Evolutionary theory seems to lend itself to all sorts of misunderstanding. In this paper I strive to decrease such confusions, for example, between Darwinism and Darwinians, propositions and people, organisms and individuals, species as individuals versus species as classes, homologies and homoplasies, and finally essences versus histories. Copyright © 2010. Published by Elsevier Ltd.

  10. In-reactor measurement of clad strain: effect of power history

    International Nuclear Information System (INIS)

    Fehrenbach, P.J.; Morel, P.A.

    1980-01-01

    A series of experimental irradiations has been undertaken at CRNL to measure directly the in-reactor deformation of fuel elements while they are operating at power. Power histories have been chosen to allow investigation of power, time at power and burnup on pellet-clad interaction for element linear powers to 60kW/m. Results are presented which indicate that irradiation of a fresh fuel element at high power is effective in minimizing clad hoop stresses during subsequent ramps or cycles to that power. The effectiveness of this preconditioning appears to be due primarily to fuel densification rather than stress relaxation in the clad. (auth)

  11. Fiber scintillator/streak camera detector for burn history measurement in inertial confinement fusion experiment

    International Nuclear Information System (INIS)

    Miyanaga, N.; Ohba, N.; Fujimoto, K.

    1997-01-01

    To measure the burn history in an inertial confinement fusion experiment, we have developed a new neutron detector based on plastic scintillation fibers. Twenty-five fiber scintillators were arranged in a geometry compensation configuration by which the time-of-flight difference of the neutrons is compensated by the transit time difference of light passing through the fibers. Each fiber scintillator is spliced individually to an ultraviolet optical fiber that is coupled to a streak camera. We have demonstrated a significant improvement of sensitivity compared with the usual bulk scintillator coupled to a bundle of the same ultraviolet fibers. copyright 1997 American Institute of Physics

  12. Characterization of memory and measurement history in photoconductivity of nanocrystal arrays

    Science.gov (United States)

    Fairfield, Jessamyn A.; Dadosh, Tali; Drndic, Marija

    2010-10-01

    Photoconductivity in nanocrystal films has been previously characterized, but memory effects have received little attention despite their importance for device applications. We show that the magnitude and temperature dependence of the photocurrent in CdSe/ZnS core-shell nanocrystal arrays depends on the illumination and electric field history. Changes in photoconductivity occur on a few-hour timescale, and subband gap illumination of nanocrystals prior to measurements modifies the photocurrent more than band gap illumination. The observed effects can be explained by charge traps within the band gap that are filled or emptied, which may alter nonradiative recombination processes and affect photocurrent.

  13. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

    "Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally...... organized, agile projects. Based on the proposed definition popular existing definitions are discussed....

  14. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

    The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. To investigate how dermatologists define the term "dermatitis" and determine if a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n  =  122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  15. HISTORY AND ACCOMPLISHMENTS OF THE US EPA'S SUPERFUND INNOVATIVE TECHNOLOGY EVALUATION (SITE) MONITORING AND MEASUREMENT (MMT) PROGRAM

    Science.gov (United States)

    This manuscript presents the history and evolution of the U.S. Environmental Protection Agency's (EPA) Superfund Innovative Technology Evaluation (SITE) Monitoring and Measurement Technology (MMT) Program. This includes a discussion of how the fundamental concepts of a performanc...

  16. Triangles in ROC space: History and theory of "nonparametric" measures of sensitivity and response bias.

    Science.gov (United States)

    Macmillan, N A; Creelman, C D

    1996-06-01

    Can accuracy and response bias in two-stimulus, two-response recognition or detection experiments be measured nonparametrically? Pollack and Norman (1964) answered this question affirmatively for sensitivity, Hodos (1970) for bias: Both proposed measures based on triangular areas in receiver-operating characteristic space. Their papers, and especially a paper by Grier (1971) that provided computing formulas for the measures, continue to be heavily cited in a wide range of content areas. In our sample of articles, most authors described triangle-based measures as making fewer assumptions than measures associated with detection theory. However, we show that statistics based on products or ratios of right triangle areas, including a recently proposed bias index and a not-yet-proposed but apparently plausible sensitivity index, are consistent with a decision process based on logistic distributions. Even the Pollack and Norman measure, which is based on non-right triangles, is approximately logistic for low values of sensitivity. Simple geometric models for sensitivity and bias are not nonparametric, even if their implications are not acknowledged in the defining publications.
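    The triangle-based computing formulas popularized by Grier (1971) can be sketched as follows. H and F denote hit and false-alarm rates; the function names are illustrative, not taken from the cited papers.

```python
def a_prime(hit_rate, fa_rate):
    """Pollack and Norman's (1964) sensitivity index A', in the computing
    form given by Grier (1971); valid for hit_rate >= fa_rate."""
    h, f = hit_rate, fa_rate
    return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

def b_double_prime(hit_rate, fa_rate):
    """Grier's (1971) response-bias index B''; 0 indicates no bias."""
    h, f = hit_rate, fa_rate
    return (h * (1 - h) - f * (1 - f)) / (h * (1 - h) + f * (1 - f))

print(a_prime(0.8, 0.2))         # approximately 0.875
print(b_double_prime(0.8, 0.2))  # approximately 0 (symmetric, unbiased)
```

    As the abstract notes, despite their "nonparametric" reputation these indices are consistent with a logistic decision model, so they carry distributional assumptions of their own.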

  17. Defining chaos.

    Science.gov (United States)

    Hunt, Brian R; Ott, Edward

    2015-09-01

    In this paper, we propose, discuss, and illustrate a computationally feasible definition of chaos which can be applied very generally to situations that are commonly encountered, including attractors, repellers, and non-periodically forced systems. This definition is based on an entropy-like quantity, which we call "expansion entropy," and we define chaos as occurring when this quantity is positive. We relate and compare expansion entropy to the well-known concept of topological entropy to which it is equivalent under appropriate conditions. We also present example illustrations, discuss computational implementations, and point out issues arising from attempts at giving definitions of chaos that are not entropy-based.

  18. Neutron Measurements for Radiation Protection in Low Earth Orbit - History and Future

    Science.gov (United States)

    Golightly, M. J.; Semones, E.

    2003-01-01

    The neutron environment inside spacecraft has been of interest from a scientific and radiation protection perspective since early in the history of manned spaceflight. With the exception of a few missions which carried plutonium-fueled radioisotope thermoelectric generators, all of the neutrons inside the spacecraft are secondary radiations resulting from interactions of high-energy charged particles with nuclei in the Earth's atmosphere, spacecraft structural materials, and the astronauts' own bodies. Although of great interest, definitive measurements of the spacecraft neutron field have been difficult due to the wide particle energy range and the limited available volume and power for traditional techniques involving Bonner spheres. A multitude of measurements, however, have been made of the neutron environment inside spacecraft. The majority of measurements were made using passive techniques including metal activation foils, fission foils, nuclear photoemulsions, plastic track detectors, and thermoluminescent detectors. Active measurements have utilized proton recoil spectrometers (stilbene), Bonner spheres (3He proportional counter based), and LiI(Eu) phoswich scintillation detectors. For the International Space Station (ISS), only the plastic track/thermoluminescent detectors are used with any regularity. A monitoring program utilizing a set of active Bonner spheres was carried out in the ISS Lab module from March - December 2001. These measurements provide a very limited look at the crew neutron exposure, both in time coverage and neutron energy coverage. A review of the currently published data from past flights will be made and compared with the more recent results from the ISS. Future measurement efforts using currently available techniques and those in development will also be discussed.

  19. Defining Cyberbullying.

    Science.gov (United States)

    Englander, Elizabeth; Donnerstein, Edward; Kowalski, Robin; Lin, Carolyn A; Parti, Katalin

    2017-11-01

    Is cyberbullying essentially the same as bullying, or is it a qualitatively different activity? The lack of a consensual, nuanced definition has limited the field's ability to examine these issues. Evidence suggests that being a perpetrator of one is related to being a perpetrator of the other; furthermore, strong relationships can also be noted between being a victim of either type of attack. It also seems that both types of social cruelty have a psychological impact, although the effects of being cyberbullied may be worse than those of being bullied in a traditional sense (evidence here is by no means definitive). A complicating factor is that the 3 characteristics that define bullying (intent, repetition, and power imbalance) do not always translate well into digital behaviors. Qualities specific to digital environments often render cyberbullying and bullying different in circumstances, motivations, and outcomes. To make significant progress in addressing cyberbullying, certain key research questions need to be addressed. These are as follows: How can we define, distinguish between, and understand the nature of cyberbullying and other forms of digital conflict and cruelty, including online harassment and sexual harassment? Once we have a functional taxonomy of the different types of digital cruelty, what are the short- and long-term effects of exposure to or participation in these social behaviors? What are the idiosyncratic characteristics of digital communication that users can be taught? Finally, how can we apply this information to develop and evaluate effective prevention programs? Copyright © 2017 by the American Academy of Pediatrics.

  20. How Not to Evaluate a Psychological Measure: Rebuttal to Criticism of the Defining Issues Test of Moral Judgment Development by Curzer and Colleagues

    Science.gov (United States)

    Thoma, Stephen J.; Bebeau, Muriel J.; Narvaez, Darcia

    2016-01-01

    In a 2014 paper in "Theory and Research in Education," Howard Curzer and colleagues critique the Defining Issues Test of moral judgment development according to eight criteria that are described as difficulties any measure of educational outcomes must address. This article highlights how Curzer et al. do not consult existing empirical…

  1. Defining Glaucomatous Optic Neuropathy from a Continuous Measure of Optic Nerve Damage - The Optimal Cut-off Point for Risk-factor Analysis in Population-based Epidemiology

    NARCIS (Netherlands)

    Ramdas, Wishal D.; Rizopoulos, Dimitris; Wolfs, Roger C. W.; Hofman, Albert; de Jong, Paulus T. V. M.; Vingerling, Johannes R.; Jansonius, Nomdo M.

    2011-01-01

    Purpose: Diseases characterized by a continuous trait can be defined by setting a cut-off point for the disease measure in question, accepting some misclassification. The 97.5th percentile is commonly used as a cut-off point. However, it is unclear whether this percentile is the optimal cut-off

  2. Unstable work histories and fertility in France: An adaptation of sequence complexity measures to employment trajectories

    Directory of Open Access Journals (Sweden)

    Daniel Ciganda

    2015-04-01

    Full Text Available Background: The emergence of new evidence suggesting a sign shift in the long-standing negative correlation between prosperity and fertility levels has sparked a renewed interest in understanding the relationship between economic conditions and fertility decisions. In this context, the notion of uncertainty has gained relevance in analyses of low fertility. So far, most studies have approached this notion using snapshot indicators such as type of contract or employment situation. However, these types of measures seem to be falling short in capturing what is intrinsically a dynamic process. Objective: Our first objective is to analyze to what extent employment trajectories have become less stable over time, and the second, to determine whether or not employment instability has an impact on the timing and quantum of fertility in France. Additionally, we present a new indicator of employment instability that takes into account both the frequency and duration of unemployment, with the objective of comparing its performance against other, more commonly used indicators of economic uncertainty. Methods: Our study combines exploratory (Sequence Analysis) with confirmatory (Event History, Logistic Regression) methods to understand the relationship between early life-course uncertainty and the timing and intensity of fertility. We use employment histories from the three available waves of the Etude des relations familiales et intergenerationnelles (ERFI), a panel survey carried out by INED and INSEE which constitutes the base of the Generations and Gender Survey (GGS) in France. Results: Although France is characterized by strong family policies and high and stable fertility levels, we find that employment instability not only has a strong and persistent negative effect on the final number of children for both men and women, but also contributes to fertility postponement in the case of men. Regarding the timing of the transition to motherhood, we show how

  3. Four Years of Chemical Measurements from the Deepwater Horizon Oil Spill Define the Deep Sea Sediment footprint and Subsequent Recovery

    Science.gov (United States)

    Boehm, P.

    2016-02-01

    Chemical data acquired during and after the DWHOS showed that several mechanisms were responsible for transport of oil from the water column to the sediments in the deep sea off the continental shelf. Three primary pathways were identified: sorption onto and sinking of drilling mud particles during "Top Kill" response activity, highly scattered deposition of residues from in situ burns, and deposition of oil combined with microbial organic matter from diffuse oil plumes ("marine snow"). Data collected during 2010, 2011 and 2014 were used to define the oil footprint and estimate time to recovery. More than 1200 stations were sampled. Of these, 27 stations were visited all three years, providing a time series from which recovery rates were calculated using the loss of total polycyclic aromatic hydrocarbons (TPAH) over time fit to first order kinetics. Results showed that the footprint of the oil was limited to the area around the wellhead and in patches to the southwest. Most samples had returned to background levels by 2015, with some exceptions close to the wellhead. Deposition to the northeast (DeSoto Canyon) was minor as evidenced by the absence of oil in sediments in that area. Samples with the longest recovery times were within 2 nautical miles of the wellhead, and often contained drilling mud, as shown by olefin signatures on the GC/FID chromatogram. Detailed chemistry data evaluation and chemical fingerprinting provided evidence that oil was being degraded in situ.
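    The recovery-rate calculation described above (TPAH loss over time fit to first-order kinetics) reduces to estimating a decay constant. A minimal sketch, using hypothetical concentrations and times rather than values from the study:

```python
import math

def first_order_rate(c0, ct, t):
    """Rate constant k from first-order decay C(t) = C0 * exp(-k * t),
    given concentrations at time 0 and time t."""
    return math.log(c0 / ct) / t

def half_life(k):
    """Time for the concentration (e.g., TPAH) to fall by half."""
    return math.log(2) / k

# Hypothetical TPAH concentrations (ng/g) at one station, 4 years apart:
k = first_order_rate(c0=100.0, ct=25.0, t=4.0)  # per year
print(round(half_life(k), 2))                   # prints 2.0 (years)
```

    With more than two sampling dates, as in the 27 revisited stations, k would instead be obtained by a linear fit of ln(C) against time.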

  4. Bohmian histories and decoherent histories

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    The predictions of the Bohmian and the decoherent (or consistent) histories formulations of the quantum mechanics of a closed system are compared for histories--sequences of alternatives at a series of times. For certain kinds of histories, Bohmian mechanics and decoherent histories may both be formulated in the same mathematical framework within which they can be compared. In that framework, Bohmian mechanics and decoherent histories represent a given history by different operators. Their predictions for the probabilities of histories of a closed system therefore generally differ. However, in an idealized model of measurement, the predictions of Bohmian mechanics and decoherent histories coincide for the probabilities of records of measurement outcomes. The formulations are thus difficult to distinguish experimentally. They may differ in their accounts of the past history of the Universe in quantum cosmology

  5. Reliability and Validity of Measures for Investigating the Determinants of Health Behaviors among Women with a History of Gestational Diabetes

    Science.gov (United States)

    Smith, Ben J.; Cheung, N. Wah; Najnin, Nusrat; Bauman, Adrian; Razee, Husna; Blignault, Ilse; van der Ploeg, Hidde P.

    2018-01-01

    Aim: Assisting women with a history of gestational diabetes mellitus (GDM) to adopt healthy lifestyles is a priority for diabetes prevention. The aim of this study was to develop and evaluate measures that can be used to assess the efficacy of behavior change interventions in this group. Method: Measures of psychosocial influences on physical…

  6. Improved measurements of scant hydrogen peroxide enable experiments that define its threshold of toxicity for Escherichia coli.

    Science.gov (United States)

    Li, Xin; Imlay, James A

    2018-03-14

    Escherichia coli is a model organism that has been exploited to reveal key details of hydrogen peroxide stress: the biomolecules that H₂O₂ most rapidly damages and the defensive tactics that organisms use to fend it off. Much less clear is the amount of exogenous H₂O₂ that is sufficient to injure the bacterium and/or to trigger its stress response. To fill this gap, we need to study the behavior of cells when they are exposed to defined amounts of H₂O₂ on an hours-long time scale. Such experiments are difficult because bacteria rapidly consume H₂O₂ that is added to test cultures. Further, lab media itself can generate H₂O₂, and media components interfere with the quantification of H₂O₂ levels. In this study we describe mechanisms by which media components interfere with H₂O₂ determinations, and we identify simple ways to minimize and correct for this interference. Using these techniques, it was shown that standard media generate so much H₂O₂ that most intracellular H₂O₂ derives from the medium rather than from endogenous metabolism. Indeed, bacteria spread on plates must induce their stress response or else perish. Finally, two straightforward methods were used to sustain low-micromolar steady-state concentrations of H₂O₂. In this way we determined that > 2 μM extracellular H₂O₂ is sufficient to trigger the intracellular OxyR stress response, and > 5 μM begins to impair cell growth in a minimal medium. These concentrations are orders of magnitude lower than the doses that have typically been used in lab experiments. The new approaches should enable workers to study how various organisms cope with natural levels of H₂O₂ stress. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Value for money: Defining and measuring 'value' in MoD's acquisition policy of obtaining best value for money

    OpenAIRE

    Weiss, A.

    2006-01-01

    Obtaining value for money is a keystone of UK Ministry of Defence (MoD) acquisition strategy embedded in its Smart Acquisition policy. This thesis examines how best to measure the relative value of competing tender submissions for major projects. There is a comprehensive discussion of a wide range of relevant definitions and over three dozen documents are scrutinised including just some sixteen published by the Government. Commercially available models, algorithms and software are examined as...

  8. Developing a patient-centered outcome measure for complementary and alternative medicine therapies I: defining content and format

    Directory of Open Access Journals (Sweden)

    Ritenbaugh Cheryl

    2011-12-01

    Full Text Available Abstract Background Patients receiving complementary and alternative medicine (CAM therapies often report shifts in well-being that go beyond resolution of the original presenting symptoms. We undertook a research program to develop and evaluate a patient-centered outcome measure to assess the multidimensional impacts of CAM therapies, utilizing a novel mixed methods approach that relied upon techniques from the fields of anthropology and psychometrics. This tool would have broad applicability, both for CAM practitioners to measure shifts in patients' states following treatments, and conventional clinical trial researchers needing validated outcome measures. The US Food and Drug Administration has highlighted the importance of valid and reliable measurement of patient-reported outcomes in the evaluation of conventional medical products. Here we describe Phase I of our research program, the iterative process of content identification, item development and refinement, and response format selection. Cognitive interviews and psychometric evaluation are reported separately. Methods From a database of patient interviews (n = 177 from six diverse CAM studies, 150 interviews were identified for secondary analysis in which individuals spontaneously discussed unexpected changes associated with CAM. Using ATLAS.ti, we identified common themes and language to inform questionnaire item content and wording. Respondents' language was often richly textured, but item development required a stripping down of language to extract essential meaning and minimize potential comprehension barriers across populations. Through an evocative card sort interview process, we identified those items most widely applicable and covering standard psychometric domains. We developed, pilot-tested, and refined the format, yielding a questionnaire for cognitive interviews and psychometric evaluation. 
Results The resulting questionnaire contained 18 items, in visual analog scale format

  9. Experimental test of entangled histories

    Science.gov (United States)

    Cotler, Jordan; Duan, Lu-Ming; Hou, Pan-Yu; Wilczek, Frank; Xu, Da; Yin, Zhang-Qi; Zu, Chong

    2017-12-01

    Entangled histories arise when a system partially decoheres in such a way that its past cannot be described by a sequence of states, but rather a superposition of sequences of states. Such entangled histories have not been previously observed. We propose and demonstrate the first experimental scheme to create entangled history states of the Greenberger-Horne-Zeilinger (GHZ) type. In our experiment, the polarization states of a single photon at three different times are prepared as a GHZ entangled history state. We define a GHZ functional which attains a maximum value 1 on the ideal GHZ entangled history state and is bounded above by 1/16 for any three-time history state lacking tripartite entanglement. We have measured the GHZ functional on a state we have prepared experimentally, yielding a value of 0.656 ± 0.005, clearly demonstrating the contribution of entangled histories.

  10. Ultrasensitive prostate specific antigen assay following laparoscopic radical prostatectomy--an outcome measure for defining the learning curve.

    Science.gov (United States)

    Viney, R; Gommersall, L; Zeif, J; Hayne, D; Shah, Z H; Doherty, A

    2009-07-01

    Radical retropubic prostatectomy (RRP) performed laparoscopically is a popular treatment with curative intent for organ-confined prostate cancer. After surgery, prostate specific antigen (PSA) levels drop to low levels which can be measured with ultrasensitive assays. This has been described in the literature for open RRP but not for laparoscopic RRP. This paper describes PSA changes in the first 300 consecutive patients undergoing non-robotic laparoscopic RRP by a single surgeon. To use ultrasensitive PSA (uPSA) assays to measure a PSA nadir in patients having laparoscopic radical prostatectomy below levels recorded by standard assays. The aim was to use uPSA nadir at 3 months' post-prostatectomy as an early surrogate end-point of oncological outcome. In so doing, laparoscopic oncological outcomes could then be compared with published results from other open radical prostatectomy series with similar end-points. Furthermore, this end-point could be used in the assessment of the surgeon's learning curve. Prospective, comprehensive, demographic, clinical, biochemical and operative data were collected from all patients undergoing non-robotic laparoscopic RRP. We present data from the first 300 consecutive patients undergoing laparoscopic RRP by a single surgeon. uPSA was measured every 3 months post surgery. Median follow-up was 29 months (minimum 3 months). The likelihood of reaching a uPSA nadir of < or = 0.01 ng/ml provides a means of bench-marking performance. With experience, a surgeon can achieve in excess of an 80% chance of obtaining a uPSA nadir of < or = 0.01 ng/ml at 3 months after laparoscopic RRP for a British population. This is equivalent to most published open series.

  11. Sweeping the Floor or Putting a Man on the Moon: How to Define and Measure Meaningful Work.

    Science.gov (United States)

    Both-Nwabuwe, Jitske M C; Dijkstra, Maria T M; Beersma, Bianca

    2017-01-01

    Meaningful work is integral to well-being and a flourishing life. The construct of "meaningful work" is, however, consistently affected by conceptual ambiguity. Although there is substantial support for arguments to maintain the status of conceptual ambiguity, we make a case for the benefits of having consensus on a definition and scale of meaningful work in the context of paid work. The objective of this article, therefore, was twofold. Firstly, we wanted to develop a more integrative definition of meaningful work. Secondly, we wanted to establish a corresponding operationalization. We reviewed the literature on the existing definitions of meaningful work and the scales designed to measure it. We found 14 definitions of meaningful work. Based on these definitions, we identified four categories of definitions, which led us to propose an integrative and comprehensive definition of meaningful work. We identified two validated scales that were partly aligned with the proposed definition. Based on our review, we conclude that scholars in this field should coalesce rather than diverge their efforts to conceptualize and measure meaningful work.

  12. Measurements for the production of aluminium oxide ceramics with defined microstructure parameters by using colloidal-chemical processings

    International Nuclear Information System (INIS)

    Baer, D.; Foerthmann, R.; Naoumidis, A.; Nickel, H.

    1992-04-01

    The aim of this work is to verify the influences of the different single procedure steps on the microstructure of sintered alumina and to obtain a correlation between the product characteristics and the characteristic data. The powder production was carried out by using the sol-gel process followed by freeze-drying of the gel. From the boehmite powder, a porous and inhomogeneous microstructure of the sintered pellets was obtained. The unfavourable morphology of the hydroxide powder could be eliminated by pre-calcination followed by powder milling. During wet-milling after the pre-calcination, the powder was doped with α-Al₂O₃ by abrasion of the milling jar and balls, and therefore the calcination temperature could be reduced to 1050°C. Two batches of the colloidal-chemically produced powder and four commercial powders with different characteristics with regard to purity or doping and particle size and distribution were compared with one another. These powders were cold-isostatically pressed and sintered under different conditions. It could be shown that the influence of the impurities on the microstructure is greater than the influence of the grain size distribution. Impurities lead to a discontinuous grain size distribution and intracrystalline pores in sintered bodies, even when using powders with a narrow grain size distribution. Measurements on the slip-cast samples yielded different relationships between viscosity and pH for each powder. There was no visible influence of different pH values on the microstructure (pH always measured at the minimum of the viscosity). Here the influence of purity and grain size distribution on the microstructure was less pronounced compared with the isostatically pressed ceramics. (orig.)

  13. ON THE INCORPORATION OF METALLICITY DATA INTO MEASUREMENTS OF STAR FORMATION HISTORY FROM RESOLVED STELLAR POPULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Dolphin, Andrew E., E-mail: adolphin@raytheon.com [Raytheon Company, Tucson, AZ 85734 (United States)

    2016-07-10

    The combination of spectroscopic stellar metallicities and resolved star color–magnitude diagrams (CMDs) has the potential to constrain the entire star formation history (SFH) of a galaxy better than fitting CMDs alone (as is most common in SFH studies using resolved stellar populations). In this paper, two approaches to incorporating external metallicity information into CMD-fitting techniques are presented. Overall, the joint fitting of metallicity and CMD information can increase the precision of measured age–metallicity relationships (AMRs) and star formation rates by 10% over CMD fitting alone. However, systematics in stellar isochrones and mismatches between spectroscopic and photometric determinations of metallicity can reduce the accuracy of the recovered SFHs. I present a simple mitigation of these systematics that can reduce their amplitude to the level obtained from CMD fitting alone, while ensuring that the AMR is consistent with spectroscopic metallicities. As is the case in CMD-fitting analysis, improved stellar models and calibrations between spectroscopic and photometric metallicities are currently the primary impediment to gains in SFH precision from jointly fitting stellar metallicities and CMDs.

  14. ON THE INCORPORATION OF METALLICITY DATA INTO MEASUREMENTS OF STAR FORMATION HISTORY FROM RESOLVED STELLAR POPULATIONS

    International Nuclear Information System (INIS)

    Dolphin, Andrew E.

    2016-01-01

    The combination of spectroscopic stellar metallicities and resolved star color–magnitude diagrams (CMDs) has the potential to constrain the entire star formation history (SFH) of a galaxy better than fitting CMDs alone (as is most common in SFH studies using resolved stellar populations). In this paper, two approaches to incorporating external metallicity information into CMD-fitting techniques are presented. Overall, the joint fitting of metallicity and CMD information can increase the precision of measured age–metallicity relationships (AMRs) and star formation rates by 10% over CMD fitting alone. However, systematics in stellar isochrones and mismatches between spectroscopic and photometric determinations of metallicity can reduce the accuracy of the recovered SFHs. I present a simple mitigation of these systematics that can reduce their amplitude to the level obtained from CMD fitting alone, while ensuring that the AMR is consistent with spectroscopic metallicities. As is the case in CMD-fitting analysis, improved stellar models and calibrations between spectroscopic and photometric metallicities are currently the primary impediment to gains in SFH precision from jointly fitting stellar metallicities and CMDs.

  15. Defining adolescent common mental disorders using electronic primary care data: a comparison with outcomes measured using the CIS-R.

    Science.gov (United States)

    Cornish, Rosie P; John, Ann; Boyd, Andy; Tilling, Kate; Macleod, John

    2016-12-01

    To compare the prevalence of common mental disorders (CMDs) derived from data held in primary care records with that measured using the revised Clinical Interview Schedule (CIS-R) in order to assess the potential robustness of findings based only on routinely collected data. Comparison study using linkage between the Avon Longitudinal Study of Parents and Children (ALSPAC) and electronic primary care records. We studied 1562 adolescents who had completed the CIS-R in ALSPAC at age 17-18 years and had linkage established to their primary care records. Outcome measures from ALSPAC were whether or not an individual met International Classification of Diseases-10 criteria for a diagnosis of (1) a CMD or, specifically, (2) depression. Lists of Read codes corresponding to diagnoses, symptoms and treatments were used to create 12 definitions of CMD and depression alone using the primary care data. We calculated sensitivities and specificities of these, using CIS-R definitions as the reference standard. Sensitivities ranged from 5.2% to 24.3% for depression and from 3.8% to 19.2% for CMD. The specificities of all definitions were above 98% for depression and above 96% for CMD. For both outcomes, the definition that included current diagnosis, treatment or symptoms identified the highest proportion of CIS-R cases. Most individuals meeting case definitions for CMD based on primary care data also met CIS-R case definitions. Conversely, many individuals identified as cases using the CIS-R had no evidence of CMD in their clinical records. This suggests that clinical databases are likely to yield underestimates of the burden of CMD in the population. However, clinical records appear to yield valid diagnoses which may be useful for studying risk factors and consequences of CMD. The greatest epidemiological value may be obtained when information is available from survey and clinical records. Published by the BMJ Publishing Group Limited.
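    The sensitivity and specificity calculations against the CIS-R reference standard follow the standard 2×2 contingency formulas. A minimal sketch; the counts below are hypothetical, chosen only to mirror the low-sensitivity, high-specificity pattern reported:

```python
def sensitivity(true_pos, false_neg):
    """Proportion of reference-standard (CIS-R) cases that the
    primary-care case definition also flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of reference-standard non-cases correctly left unflagged."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts for one primary-care case definition:
print(sensitivity(true_pos=20, false_neg=80))     # prints 0.2 (low)
print(specificity(true_neg=1440, false_pos=22))   # high, near 0.98
```

    The asymmetry illustrates the abstract's conclusion: definitions built from clinical codes miss many true cases (low sensitivity) but rarely mislabel non-cases (high specificity), so prevalence estimates from routine records alone are biased downward.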

  16. Defining greed.

    Science.gov (United States)

    Seuntjens, Terri G; Zeelenberg, Marcel; Breugelmans, Seger M; van de Ven, Niels

    2015-08-01

    Although greed is both hailed as the motor of economic growth and blamed as the cause of economic crises, very little is known about its psychological underpinnings. Five studies explored lay conceptualizations of greed among U.S. and Dutch participants using a prototype analysis. Study 1 identified features related to greed. Study 2 determined the importance of these features; the most important features were classified as central (e.g., self-interested, never satisfied), whereas less important features were classified as peripheral (e.g., ambition, addiction). Subsequently, we found that, compared to peripheral features, participants recalled central features better (Study 3), faster (Study 4), and these central features were more present in real-life episodes of greed (Study 5). These findings provide a better understanding of the elements that make up the experience of greed and provide insights into how greed can be manipulated and measured in future research. © 2014 The British Psychological Society.

  17. Initial steps in defining the environment of the prepuce of the bull by measuring pH and temperature.

    Science.gov (United States)

    Koziol, J H; Fraser, N S; Passler, T; Wolfe, D F

    2017-12-01

    To determine the baseline pH and temperature of the preputial cavity of bulls. We enrolled 55 bulls ranging in age from 15 to 84 months. The preputial temperature and pH were measured by insertion of temperature and pH probes, respectively, into the preputial orifice prior to routine breeding soundness examinations. Information was obtained from owners regarding the diet of each bull and categorised as one of three categories: forage only, grain supplemented or silage supplemented. The average temperature of the prepuce was 37.81°C ± 1.76 and the median pH of the prepuce was 8.45 (6.35-9.46). Preputial temperatures of the bull weakly correlated with ambient temperatures (r_s = -0.29, P = 0.028). The preputial pH of silage-fed bulls was significantly lower than that of bulls fed forage only (P = 0.025) or grain-supplemented diets (P = 0.002). The median preputial pH of bulls fed a silage-based diet was 7.6 (6.3-8.9) compared with a median pH 8.7 (7.8-9.1) for bulls fed forage-based diets or a median of 8.5 (7.7-9.4) for those given grain-supplemented diets. Diet and ambient temperature can, respectively, affect pH and the temperature in the prepuce. Further studies to describe and understand the microbiota of the prepuce and penis may assist in developing treatments for diseases of the genital tract in bulls. © 2017 Australian Veterinary Association.

  18. Defining and Measuring Teacher Legitimacy

    Science.gov (United States)

    Drake, Douglass Martin

    2013-01-01

    Power and authority exist in every relationship. The relationship between teacher and student is no exception. Legitimacy is the cornerstone of authority, yet there is a dearth of research into how teacher legitimacy affects the teacher/student relationship. In the current study, I sought to identify characteristics and behaviors teachers exhibit…

  19. Defining and Measuring User Experience

    DEFF Research Database (Denmark)

    Stage, Jan

    2006-01-01

    User experience is being used to denote what a user goes through while using a computerized system. The concept has gained momentum as a means to distinguish new types of applications such as games and entertainment software from more traditional work-related applications. This paper focuses on t...

  20. Defining and Measuring Chronic Conditions

    Centers for Disease Control (CDC) Podcasts

    This podcast is an interview with Dr. Anand Parekh, U.S. Department of Health and Human Services Deputy Assistant Secretary for Health, and Dr. Samuel Posner, Preventing Chronic Disease Editor in Chief, about the definition and burden of multiple chronic conditions in the United States.

  1. Defining and Measuring Chronic Conditions

    Centers for Disease Control (CDC) Podcasts

    2013-05-20

    This podcast is an interview with Dr. Anand Parekh, U.S. Department of Health and Human Services Deputy Assistant Secretary for Health, and Dr. Samuel Posner, Preventing Chronic Disease Editor in Chief, about the definition and burden of multiple chronic conditions in the United States.  Created: 5/20/2013 by Preventing Chronic Disease (PCD), National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP).   Date Released: 5/20/2013.

  2. 25-ps neutron detector for measuring ICF-target burn history

    International Nuclear Information System (INIS)

    Lerche, R.A.; Phillion, D.W.; Tietbohl, G.L.

    1994-01-01

We have developed a fast, sensitive neutron detector for recording the fusion reaction-rate history of inertial-confinement fusion (ICF) experiments. The detector is based on the fast rise-time of a commercial plastic scintillator (BC-422) and has a 25-ps response; it has been used on experiments with yields between 10⁸ and 2 × 10¹³ neutrons

  3. The Collin dynamometer: History of the development of an instrument for measuring physical and mental strength

    Czech Academy of Sciences Publication Activity Database

    Nicolas, S.; Vobořil, Dalibor

    2017-01-01

Roč. 117, č. 2 (2017), s. 173-219 ISSN 0003-5033 Institutional support: RVO:68081740 Keywords: Collin dynamometer * experimental psychology * history of psychology Subject RIV: AN - Psychology OBOR OECD: Psychology (including human-machine relations) Impact factor: 0.358, year: 2016

  5. A comparison of time-history elastic plastic piping analysis with measurement

    International Nuclear Information System (INIS)

    Scavuzzo, R.J.; Sansalone, K.H.

    1992-01-01

The GE/ETEC Green piping system was subjected to high seismic inputs from hydraulic sleds at each pipe foundation. These inputs were high enough to force bending stresses into the plastic regime. Strain gages recorded the pipe response at various positions within the system. The ABAQUS finite element code was used to model this piping system and the dynamic input. Problems associated with the dynamic input are discussed. Various types of finite elements were evaluated for accuracy. Both an elastic time-history analysis and an elastic-plastic time-history analysis of the system were conducted. Results of these analyses are compared to each other and to the experimental data. These comparisons indicate that elastically analysed dynamic strains are conservative at all points of comparison and that there is good agreement between the nonlinear elastic-plastic analysis and the experimental data. (orig.)

  6. Reinventing Entrepreneurial History

    DEFF Research Database (Denmark)

    Wadhwani, R. Daniel; Lubinski, Christina

    2017-01-01

    Research on entrepreneurship remains fragmented in business history. A lack of conceptual clarity inhibits comparisons between studies and dialogue among scholars. To address these issues, we propose to reinvent entrepreneurial history as a research field. We define “new entrepreneurial history...... and reconfiguring resources, and legitimizing novelty. The article elaborates on the historiography, premises, and potential contributions of new entrepreneurial history....

  7. Entangled histories

    International Nuclear Information System (INIS)

    Cotler, Jordan; Wilczek, Frank

    2016-01-01

    We introduce quantum history states and their mathematical framework, thereby reinterpreting and extending the consistent histories approach to quantum theory. Through thought experiments, we demonstrate that our formalism allows us to analyze a quantum version of history in which we reconstruct the past by observations. In particular, we can pass from measurements to inferences about ‘what happened’ in a way that is sensible and free of paradox. Our framework allows for a richer understanding of the temporal structure of quantum theory, and we construct history states that embody peculiar, non-classical correlations in time. (paper)

  8. Substance Abuse among High-Risk Sexual Offenders: Do Measures of Lifetime History of Substance Abuse Add to the Prediction of Recidivism over Actuarial Risk Assessment Instruments?

    Science.gov (United States)

    Looman, Jan; Abracen, Jeffrey

    2011-01-01

    There has been relatively little research on the degree to which measures of lifetime history of substance abuse add to the prediction of risk based on actuarial measures alone among sexual offenders. This issue is of relevance in that a history of substance abuse is related to relapse to substance using behavior. Furthermore, substance use has…

  9. Do rivers really obey power-laws? Using continuous high resolution measurements to define bankfull channel and evaluate downstream hydraulic-scaling over large changes in drainage area

    Science.gov (United States)

    Scher, C.; Tennant, C.; Larsen, L.; Bellugi, D. G.

    2016-12-01

    Advances in remote-sensing technology allow for cost-effective, accurate, high-resolution mapping of river-channel topography and shallow aquatic bathymetry over large spatial scales. A combination of near-infrared and green spectra airborne laser swath mapping was used to map river channel bathymetry and watershed geometry over 90+ river-kilometers (75-1175 km2) of the Greys River in Wyoming. The day of flight wetted channel was identified from green LiDAR returns, and more than 1800 valley-bottom cross-sections were extracted at regular 50-m intervals. The bankfull channel geometry was identified using a "watershed-based" algorithm that incrementally filled local minima to a "spill" point, thereby constraining areas of local convergence and delineating all the potential channels along the cross-section for each distinct "spill stage." Multiple potential channels in alluvial floodplains and lack of clearly defined channel banks in bedrock reaches challenge identification of the bankfull channel based on topology alone. Here we combine a variety of topological measures, geometrical considerations, and stage levels to define a stage-dependent bankfull channel geometry, and compare the results with day of flight wetted channel data. Initial results suggest that channel hydraulic geometry and basin hydrology power-law scaling may not accurately capture downstream channel adjustments for rivers draining complex mountain topography.
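
The "watershed-based" delineation described above can be illustrated with a minimal sketch (hypothetical data and function name; not the authors' actual implementation): contiguous segments of a cross-section lying below a trial stage are candidate channels, and raising the stage past a local high point (the "spill" stage) merges adjacent channels into one.

```python
import numpy as np

def channels_at_stage(elev, stage):
    """Return (start, stop) index pairs of contiguous wetted segments
    where the cross-section elevation lies below the given water stage."""
    wet = elev < stage
    # Edges of wetted runs: +1 marks a dry-to-wet transition, -1 wet-to-dry.
    edges = np.diff(wet.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    stops = list(np.where(edges == -1)[0] + 1)
    if wet[0]:
        starts.insert(0, 0)
    if wet[-1]:
        stops.append(len(elev))
    return list(zip(starts, stops))

# Hypothetical cross-section: two pits separated by a bar at elevation 2.0.
profile = np.array([5.0, 1.0, 0.5, 1.5, 2.0, 1.2, 0.8, 1.8, 5.0])

low = channels_at_stage(profile, 1.9)   # below the bar: two distinct channels
high = channels_at_stage(profile, 2.5)  # above the spill stage: one channel
```

Sweeping the stage upward and recording where segment counts change recovers the distinct "spill stages" the abstract describes; choosing which of them is bankfull is the additional geometric and topological problem the study addresses.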

  10. History of plutonium composition of fallout in the northeastern U.S. from contemporary measurements

    International Nuclear Information System (INIS)

    Krey, P.W.; Heit, M.; Miller, K.M.; Livingston, H.D.

    1990-01-01

The analyses of lake sediments from the northeastern US provide depositional histories of ¹³⁷Cs and ²³⁹⁺²⁴⁰Pu from both global fallout and fallout from the Nevada Test Site detonations in the 1950's. These results provide an independent verification and extension of the temporal trend of the ²⁴⁰Pu/²³⁹Pu atom ratio of global fallout to earlier times. These data support the findings of other studies of fallout in the atmospheric and marine environment. (author) 26 refs.; 8 figs.

  11. Defining a set of standardised outcome measures for newly diagnosed patients with multiple myeloma using the Delphi consensus method: the IMPORTA project.

    Science.gov (United States)

    Blade, Joan; Calleja, Miguel Ángel; Lahuerta, Juan José; Poveda, José Luis; de Paz, Héctor David; Lizán, Luis

    2018-02-22

    To define a standard set of outcomes and the most appropriate instruments to measure them for managing newly diagnosed patients with multiple myeloma (MM). A literature review and five discussion groups facilitated the design of two-round Delphi questionnaire. Delphi panellists (haematologists, hospital pharmacists and patients) were identified by the scientific committee, the Spanish Program of Haematology Treatments Foundation, the Spanish Society of Hospital Pharmacies and the Spanish Community of Patients with MM. Panellist's perception about outcomes' suitability and feasibility of use was assessed on a seven-point Likert scale. Consensus was reached when at least 75% of the respondents reached agreement or disagreement. A scientific committee led the project. Fifty-one and 45 panellists participated in the first and second Delphi rounds, respectively. Consensus was reached to use overall survival, progression-free survival, minimal residual disease and treatment response to assess survival and disease control. Panellists agreed to measure health-related quality of life, pain, performance status, fatigue, psychosocial status, symptoms, self-perception on body image, sexuality and preferences/satisfaction. However, panellist did not reach consensus about the feasibility of assessing in routine practice psychosocial status, symptoms, self-perception on body image and sexuality. Consensus was reached to collect patient-reported outcomes through the European Organisation for the Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire (QLQ) Core questionnaire 30 (C30), three items from EORTC-QLQ-Multiple Myeloma (MY20) and EORTC-QLQ-Breast Cancer (BR23), pain Visual Analogue Scale, Morisky-Green and ad hoc questions about patients' preferences/satisfaction. A consensual standard set of outcomes for managing newly diagnosed patients with MM has been defined. 
The feasibility of its implementation in routine practice will be assessed in a future pilot study.

  12. Mach-Zehnder Fiber-Optic Links for Reaction History Measurements at the National Ignition Facility

    International Nuclear Information System (INIS)

    Miller, E. Kirk; Herrmann, H.W.; Stoeffl, W.; Horsfield, C.J.

    2009-01-01

    We present the details of the analog fiber-optic data link that will be used in the chamber-mounted Gamma Reaction History (GRH) diagnostic at the National Ignition Facility (NIF) located at the Lawrence Livermore Laboratory in Livermore, California. The system is based on Mach-Zehnder (MZ) modulators integrated into the diagnostic, with the source lasers and bias control electronics located remotely to protect the active electronics. A complete recording system for a single GRH channel comprises two MZ modulators, with the fiber signals split onto four channels on a single digitizer. By carefully selecting the attenuation, the photoreceiver, and the digitizer settings, the dynamic range achievable is greater than 1000:1 at the full system bandwidth of greater than 10 GHz. The system is designed to minimize electrical reflections and mitigate the effects of transient radiation darkening on the fibers.

  13. Development of temperature history device for measurement of FBR's irradiation environment

    CERN Document Server

    Abe, K; Satou, M; Tobita, K

    2002-01-01

A new surface modification and machining process that controls swelling and exfoliation using rare-gas ion beams has been developed. An attempt to make a memory device for temperature history was carried out using this process. As a prototype of the memory, an array of temperature monitors consisting of thousands of small bulges was made on the surface of a silicon carbide substrate. Baseline properties that would be needed for the temperature-monitor materials were examined. Microstructural observation of the swelling region was carried out by transmission electron microscopy, and the mechanism of the change in surface morphology due to heating is discussed. It is proposed that, depending on the temperature region, two mechanisms could be utilized for the temperature memory device: the first is surface exfoliation due to the internal pressure of the implanted gas, and the second is surface exfoliation due to internal stress caused by volume shrinkage during the thermal recovery process.

  14. Repeatable aversion across threat types is linked with life-history traits but is dependent on how aversion is measured.

    Science.gov (United States)

    Davidson, Gabrielle L; Reichert, Michael S; Crane, Jodie M S; O'Shea, William; Quinn, John L

    2018-02-01

Personality research suggests that individual differences in risk aversion may be explained by links with life-history variation. However, few empirical studies examine whether repeatable differences in risk avoidance behaviour covary with life-history traits among individuals in natural populations, or how these links vary depending on the context and the way risk aversion is measured. We measured two different risk avoidance behaviours (latency to enter the nest and inspection time) in wild great tits (Parus major) in two different contexts (response to a novel object and to a predator cue placed at the nest-box during incubation) and related these behaviours to female reproductive success and condition. Females responded equally strongly to both stimuli, and although both behaviours were repeatable, they did not correlate. Latency to enter was negatively related to body condition and the number of offspring fledged. By contrast, inspection time was directly explained by whether incubating females had been flushed from the nest before the trial began. Thus, our inferences on the relationship between risk aversion and fitness depend on how risk aversion was measured. Our results highlight the limitations of drawing conclusions about the relevance of single measures of a personality trait such as risk aversion.

  15. Defining depth of anesthesia.

    Science.gov (United States)

    Shafer, S L; Stanski, D R

    2008-01-01

    In this chapter, drawn largely from the synthesis of material that we first presented in the sixth edition of Miller's Anesthesia, Chap 31 (Stanski and Shafer 2005; used by permission of the publisher), we have defined anesthetic depth as the probability of non-response to stimulation, calibrated against the strength of the stimulus, the difficulty of suppressing the response, and the drug-induced probability of non-responsiveness at defined effect site concentrations. This definition requires measurement of multiple different stimuli and responses at well-defined drug concentrations. There is no one stimulus and response measurement that will capture depth of anesthesia in a clinically or scientifically meaningful manner. The "clinical art" of anesthesia requires calibration of these observations of stimuli and responses (verbal responses, movement, tachycardia) against the dose and concentration of anesthetic drugs used to reduce the probability of response, constantly adjusting the administered dose to achieve the desired anesthetic depth. In our definition of "depth of anesthesia" we define the need for two components to create the anesthetic state: hypnosis created with drugs such as propofol or the inhalational anesthetics and analgesia created with the opioids or nitrous oxide. We demonstrate the scientific evidence that profound degrees of hypnosis in the absence of analgesia will not prevent the hemodynamic responses to profoundly noxious stimuli. Also, profound degrees of analgesia do not guarantee unconsciousness. However, the combination of hypnosis and analgesia suppresses hemodynamic response to noxious stimuli and guarantees unconsciousness.

  16. The relationship between subconcussive impacts and concussion history on clinical measures of neurologic function in collegiate football players.

    Science.gov (United States)

    Gysland, Sonia M; Mihalik, Jason P; Register-Mihalik, Johna K; Trulock, Scott C; Shields, Edgar W; Guskiewicz, Kevin M

    2012-01-01

Concussions sustained during college and professional football careers have been associated with both acute and chronic neurologic impairment. The contribution of subconcussive impacts to this impairment has not been adequately studied. Therefore, we investigated the relationship between subconcussive impacts and concussion history on clinical measures of neurologic function. Forty-six collegiate football players completed five clinical measures of neurologic function commonly employed in the evaluation of concussion before and after a single season. These tests included the Automated Neuropsychological Assessment Metrics, Sensory Organization Test, Standardized Assessment of Concussion, Balance Error Scoring System, and Graded Symptom Checklist. The Head Impact Telemetry (HIT) System recorded head impact data including the frequency, magnitude, and location of impacts. College football players sustain approximately 1,000 subconcussive impacts to the head over the course of a season but, for the most part, do not demonstrate any clinically meaningful changes from preseason to postseason on measures of neurologic function. Changes in performance were mostly independent of prior concussion history and of the total number, magnitude and location of impacts sustained over one season (observed R² values ranged between 0.30 and 0.35). Repetitive subconcussive head impacts over a single season do not appear to result in short-term neurologic impairment, but these relationships should be further investigated for a potential dose-response over a player's career.

  17. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

Computing displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied...... on a structure. In brief, the major problem that accompanies reconstruction of the true displacement from an acceleration record is the unreal drift observed in the double-integrated acceleration. The purpose of the present work is to address the source of the problem, introduce its treatments, show how they work and compare...
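
The drift problem this abstract describes is easy to reproduce: even a small constant bias in a measured acceleration integrates into a quadratic drift in the reconstructed displacement. The sketch below uses a synthetic signal; the bias value and the polynomial-detrend treatment are illustrative assumptions, not the authors' method.

```python
import numpy as np

fs = 1000.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 5, 1 / fs)
f0 = 2.0                          # vibration frequency, Hz (assumed)
true_disp = 0.01 * np.sin(2 * np.pi * f0 * t)
accel = -(2 * np.pi * f0) ** 2 * true_disp   # exact second derivative
accel_meas = accel + 0.02                    # small sensor bias (assumed)

def integrate(x, dt):
    # Simple running-sum integration of a sampled signal.
    return np.cumsum(x) * dt

# Naive double integration: the bias grows into a quadratic drift.
disp_naive = integrate(integrate(accel_meas, 1 / fs), 1 / fs)

def detrend(x, deg=2):
    # Remove a low-order polynomial trend (one common rectification step).
    n = np.arange(len(x))
    return x - np.polyval(np.polyfit(n, x, deg), n)

# Detrend after each integration pass to suppress the drift.
disp_fixed = detrend(integrate(detrend(integrate(accel_meas, 1 / fs)), 1 / fs))

drift_naive = abs(disp_naive[-1] - true_disp[-1])   # large: ~metres of drift
err_fixed = np.max(np.abs(disp_fixed - true_disp))  # small residual error
```

Detrending is only one of several treatments (high-pass filtering and baseline-correction schemes are common alternatives, and the paper's curve-fitter comparison addresses exactly this choice); the point of the sketch is the failure mode, not a definitive fix.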

  18. Predicting Falls in People with Multiple Sclerosis: Fall History Is as Accurate as More Complex Measures

    Directory of Open Access Journals (Sweden)

    Michelle H. Cameron

    2013-01-01

Full Text Available Background. Many people with MS fall, but the best method for identifying those at increased fall risk is not known. Objective. To compare how accurately fall history, questionnaires, and physical tests predict future falls and injurious falls in people with MS. Methods. 52 people with MS were asked if they had fallen in the past 2 months and the past year. Subjects were also assessed with the Activities-specific Balance Confidence, Falls Efficacy Scale-International, and Multiple Sclerosis Walking Scale-12 questionnaires, the Expanded Disability Status Scale, Timed 25-Foot Walk, and computerized dynamic posturography and recorded their falls daily for the following 6 months with calendars. The ability of baseline assessments to predict future falls was compared using receiver operator curves and logistic regression. Results. All tests individually provided similar fall prediction (area under the curve (AUC) 0.60–0.75). A fall in the past year was the best predictor of falls (AUC 0.75, sensitivity 0.89, specificity 0.56) or injurious falls (AUC 0.69, sensitivity 0.96, specificity 0.41) in the following 6 months. Conclusion. Simply asking people with MS if they have fallen in the past year predicts future falls and injurious falls as well as more complex, expensive, or time-consuming approaches.
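
As a sketch of why a single yes/no question can match more complex instruments: a binary predictor's ROC curve has only one operating point, so its AUC reduces to the mean of sensitivity and specificity. The data and function name below are hypothetical, not the study's.

```python
def binary_auc(predictor, outcome):
    """predictor, outcome: sequences of 0/1.
    Returns (sensitivity, specificity, AUC) for a binary test."""
    tp = sum(p and o for p, o in zip(predictor, outcome))
    fn = sum((not p) and o for p, o in zip(predictor, outcome))
    tn = sum((not p) and (not o) for p, o in zip(predictor, outcome))
    fp = sum(p and (not o) for p, o in zip(predictor, outcome))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    # One operating point -> ROC is two line segments; area = (sens + spec) / 2.
    return sens, spec, (sens + spec) / 2

# Hypothetical data: "fell in the past year" as predictor of a follow-up fall.
fell_last_year = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
fell_follow_up = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]

sens, spec, auc = binary_auc(fell_last_year, fell_follow_up)
```

With the reported sensitivity 0.89 and specificity 0.56, this formula gives an AUC of about 0.73, consistent with the 0.75 the authors obtained from the full ROC analysis.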

  19. Analysis of Time-History data of Forces and Motions Measured at Towing Facilities

    National Research Council Canada - National Science Library

    Hong, Young S; Fullerton, Anne

    2007-01-01

    .... A lowpass filter is used to eliminate the noise of the measured data. The methods of zero crossing, sine function and spectral analysis are applied to compute the amplitudes, periods and phase angles...

  20. Thermal history sensors for non-destructive temperature measurements in harsh environments

    Energy Technology Data Exchange (ETDEWEB)

    Pilgrim, C. C. [Mechanical Engineering, Imperial College London, London, SW7 2AZ, UK and Sensor Coating Systems, Imperial Incubator, Bessemer Building, Level 1 and 2, Imperial College London, London SW7 2AZ (United Kingdom); Heyes, A. L. [Energy Technology and Innovation Initiative, University of Leeds, Leeds, LS2 9JT (United Kingdom); Feist, J. P. [Sensor Coating Systems, Imperial Incubator, Bessemer Building, Level 1 and 2, Imperial College London, London SW7 2AZ (United Kingdom)

    2014-02-18

The operating temperature is a critical physical parameter in many engineering applications; however, it can be very challenging to measure in certain environments, particularly when access is limited or on rotating components. A new quantitative non-destructive temperature measurement technique has been proposed which relies on thermally induced permanent changes in ceramic phosphors. This technique has several distinct advantages over current methods for many different applications. The robust ceramic material stores the temperature information, allowing long-term thermal exposures in harsh environments to be measured at a convenient time. Additionally, rare-earth dopants make the ceramic phosphorescent so that the temperature information can be interpreted by automated interrogation of the phosphorescent light. This technique has been demonstrated by applying YAG doped with dysprosium or europium as coatings through the air-plasma spray process. Either material can be used to measure temperature over a wide range, namely between 300°C and 900°C. Furthermore, results show that the material records the peak exposure temperature and that prolonged exposure at lower temperatures would have no effect on the temperature measurement. This indicates that these materials could be used to measure peak operating temperatures in long-term testing.

  1. Development of a Lifespan-Based Novel Composite Person-Reported Outcome Measure Using Data from the CINRG Duchenne Natural History Study

    Science.gov (United States)

    2017-10-01

PRINCIPAL INVESTIGATOR: McDonald, Craig M. CONTRACTING ORGANIZATION: University of California, Davis, Davis, CA 95618. ABSTRACT: Development of novel technologies and therapeutic agents to treat Duchenne muscular dystrophy (DMD) have increased

  2. ICF burn-history measurements using 17-MeV fusion gamma rays

    International Nuclear Information System (INIS)

    Lerche, R.A.; Cable, M.D.; Dendooven, P.G.

    1995-01-01

    Fusion reaction rate for inertial-confinement fusion (ICF) experiments at the Nova Laser Facility is measured with 30-ps resolution using a high-speed neutron detector. We are investigating a measurement technique based on the 16.7-MeV gamma rays that are released in deuterium-tritium fusion. Our concept is to convert gamma-ray energy into a fast burst of Cerenkov light that can be recorded with a high-speed optical detector. We have detected fusion gamma rays in preliminary experiments conducted at Nova where we used a tungsten/aerogel converter to generate Cerenkov light and an optical streak camera to record the signal

  3. Measurement of circulating transcripts and gene cluster analysis predicts and defines therapeutic efficacy of peptide receptor radionuclide therapy (PRRT) in neuroendocrine tumors

    International Nuclear Information System (INIS)

    Bodei, L.; Kidd, M.; Modlin, I.M.; Severi, S.; Nicolini, S.; Paganelli, G.; Drozdov, I.; Kwekkeboom, D.J.; Krenning, E.P.; Baum, R.P.

    2016-01-01

Peptide receptor radionuclide therapy (PRRT) is an effective method for treating neuroendocrine tumors (NETs). It is limited, however, in the prediction of individual tumor response and the precise and early identification of changes in tumor size. Currently, response prediction is based on somatostatin receptor expression and efficacy by morphological imaging and/or chromogranin A (CgA) measurement. The aim of this study was to assess the accuracy of circulating NET transcripts as a measure of PRRT efficacy, and moreover to identify prognostic gene clusters in pretreatment blood that could be interpolated with relevant clinical features in order to define a biological index for the tumor and a predictive quotient for PRRT efficacy. NET patients (n = 54), M:F 37:17, median age 66, bronchial: n = 13, GEP-NET: n = 35, CUP: n = 6 were treated with ¹⁷⁷Lu-based PRRT (cumulative activity: 6.5-27.8 GBq, median 18.5). At baseline: 47/54 low-grade (G1/G2; bronchial typical/atypical), 31/49 ¹⁸FDG positive and 39/54 progressive. Disease status was assessed by RECIST1.1. Transcripts were measured by real-time quantitative reverse transcription PCR (qRT-PCR) and multianalyte algorithmic analysis (NETest); CgA by enzyme-linked immunosorbent assay (ELISA). Gene cluster (GC) derivations: regulatory network, protein:protein interactome analyses. Statistical analyses: chi-square, non-parametric measurements, multiple regression, receiver operating characteristic and Kaplan-Meier survival. The disease control rate was 72 %. Median PFS was not achieved (follow-up: 1-33 months, median: 16). Only grading was associated with response (p < 0.01). At baseline, 94 % of patients were NETest-positive, while CgA was elevated in 59 %. NETest accurately (89 %, χ² = 27.4; p = 1.2 × 10⁻⁷) correlated with treatment response, while CgA was 24 % accurate. Gene cluster expression (growth-factor signalome and metabolome) had an AUC of 0.74 ± 0.08 (z-statistic = 2.92, p < 0.004) for predicting

  4. Measurement of circulating transcripts and gene cluster analysis predicts and defines therapeutic efficacy of peptide receptor radionuclide therapy (PRRT) in neuroendocrine tumors

    Energy Technology Data Exchange (ETDEWEB)

    Bodei, L. [European Institute of Oncology, Division of Nuclear Medicine, Milan (Italy); LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Kidd, M. [Wren Laboratories, Branford, CT (United States); Modlin, I.M. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Yale School of Medicine, New Haven, CT (United States); Severi, S.; Nicolini, S.; Paganelli, G. [Istituto Scientifico Romagnolo per lo Studio e la Cura dei Tumori (IRST) IRCCS, Nuclear Medicine and Radiometabolic Units, Meldola (Italy); Drozdov, I. [Bering Limited, London (United Kingdom); Kwekkeboom, D.J.; Krenning, E.P. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Erasmus Medical Center, Nuclear Medicine Department, Rotterdam (Netherlands); Baum, R.P. [LuGenIum Consortium, Milan, Rotterdam, Bad Berka, London, Italy, Netherlands, Germany (Country Unknown); Zentralklinik Bad Berka, Theranostics Center for Molecular Radiotherapy and Imaging, Bad Berka (Germany)

    2016-05-15

Peptide receptor radionuclide therapy (PRRT) is an effective method for treating neuroendocrine tumors (NETs). It is limited, however, in the prediction of individual tumor response and the precise and early identification of changes in tumor size. Currently, response prediction is based on somatostatin receptor expression and efficacy by morphological imaging and/or chromogranin A (CgA) measurement. The aim of this study was to assess the accuracy of circulating NET transcripts as a measure of PRRT efficacy, and moreover to identify prognostic gene clusters in pretreatment blood that could be interpolated with relevant clinical features in order to define a biological index for the tumor and a predictive quotient for PRRT efficacy. NET patients (n = 54), M:F 37:17, median age 66, bronchial: n = 13, GEP-NET: n = 35, CUP: n = 6 were treated with ¹⁷⁷Lu-based PRRT (cumulative activity: 6.5-27.8 GBq, median 18.5). At baseline: 47/54 low-grade (G1/G2; bronchial typical/atypical), 31/49 ¹⁸FDG positive and 39/54 progressive. Disease status was assessed by RECIST1.1. Transcripts were measured by real-time quantitative reverse transcription PCR (qRT-PCR) and multianalyte algorithmic analysis (NETest); CgA by enzyme-linked immunosorbent assay (ELISA). Gene cluster (GC) derivations: regulatory network, protein:protein interactome analyses. Statistical analyses: chi-square, non-parametric measurements, multiple regression, receiver operating characteristic and Kaplan-Meier survival. The disease control rate was 72 %. Median PFS was not achieved (follow-up: 1-33 months, median: 16). Only grading was associated with response (p < 0.01). At baseline, 94 % of patients were NETest-positive, while CgA was elevated in 59 %. NETest accurately (89 %, χ² = 27.4; p = 1.2 × 10⁻⁷) correlated with treatment response, while CgA was 24 % accurate. Gene cluster expression (growth-factor signalome and metabolome) had an AUC of 0.74 ± 0.08 (z-statistic = 2.92, p < 0

  5. Oxygen consumption rate v. rate of energy utilization of fishes: a comparison and brief history of the two measurements.

    Science.gov (United States)

    Nelson, J A

    2016-01-01

Accounting for energy use by fishes has been taking place for over 200 years. The original, and continuing gold standard for measuring energy use in terrestrial animals, is to account for the waste heat produced by all reactions of metabolism, a process referred to as direct calorimetry. Direct calorimetry is not easy or convenient in terrestrial animals and is extremely difficult in aquatic animals. Thus, the original and most subsequent measurements of metabolic activity in fishes have been measured via indirect calorimetry. Indirect calorimetry takes advantage of the fact that oxygen is consumed and carbon dioxide is produced during the catabolic conversion of foodstuffs or energy reserves to useful ATP energy. As measuring [CO2] in water is more challenging than measuring [O2], most indirect calorimetric studies on fishes have used the rate of O2 consumption. To relate measurements of O2 consumption back to actual energy usage requires knowledge of the substrate being oxidized. Many contemporary studies of O2 consumption by fishes do not attempt to relate this measurement back to actual energy usage. Thus, the rate of oxygen consumption (ṀO2) has become a measurement in its own right that is not necessarily synonymous with metabolic rate. Because all extant fishes are obligate aerobes (many fishes engage in substantial net anaerobiosis, but all require oxygen to complete their life cycle), this discrepancy does not appear to be of great concern to the fish biology community, and reports of fish oxygen consumption, without being related to energy, have proliferated. Unfortunately, under some circumstances, these measures can be quite different from one another. A review of the methodological history of the two measurements and a look towards the future are included. © 2016 The Fisheries Society of the British Isles.
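
Relating an oxygen-consumption rate back to an energy-use rate, as the abstract notes, requires knowing the substrate being oxidized. A minimal sketch follows; the oxycalorific coefficients are approximate values of the kind commonly cited in the comparative-physiology literature and are assumptions here, not figures from this paper.

```python
# Approximate oxycalorific coefficients (J of energy per mg of O2 consumed),
# assumed here; real values vary with the exact substrate mix.
OXYCALORIFIC = {
    "carbohydrate": 14.8,
    "lipid": 13.7,
    "protein": 13.4,
}

def energy_rate(mo2_mg_per_h, substrate):
    """Convert an M-dot-O2 value (mg O2 / h) to a metabolic rate in J / h."""
    return mo2_mg_per_h * OXYCALORIFIC[substrate]

# A hypothetical fish consuming 100 mg O2 per hour:
e_lipid = energy_rate(100.0, "lipid")
e_carb = energy_rate(100.0, "carbohydrate")
```

The roughly 10 % spread between substrates is exactly why an ṀO2 number alone, without knowledge of the fuel being burned, is not synonymous with metabolic rate.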

  6. The history and assessment of effectiveness of soil erosion control measures deployed in Russia

    Directory of Open Access Journals (Sweden)

    Valentin Golosov

    2013-09-01

    Full Text Available Research activities aimed at design and application of soil conservation measures for reduction of soil losses from cultivated fields started in Russia in the last quarter of the 19th century. A network of "zonal agroforestry melioration experimental stations" was organized in the different landscape zones of Russia in the first half of the 20th century. The main task of the experiments was to develop effective soil conservation measures for Russian climatic, soil and land use conditions. The most widespread and large-scale introduction of countermeasures to cope with soil erosion by water and wind into agricultural practice, supported by serious governmental investments, took place during the Soviet Union period after the Second World War. After the Soviet Union collapse in 1991, general deterioration of the agricultural economy sector and the absence of investments resulted in cessation of organized soil conservation measures application at the nation-wide level. However, some of the long-term erosion control measures such as forest shelter belts, artificial slope terracing, and water diversion dams above formerly active gully heads survived until the present. In the case study of sediment redistribution within the small cultivated catchment presented in this paper, an attempt was made to evaluate average annual erosion rates on arable slopes with and without soil conservation measures for two time intervals. It has been found that application of conservation measures on cultivated slopes within the experimental part of the case study catchment has led to a decrease of average soil loss rates by at least 2.5-2.8 times. The figures obtained are in good agreement with previously published results of direct monitoring of snowmelt erosion rates, reporting approximately a 3-fold decrease of average snowmelt erosion rates in the experimental sub-catchment compared to a traditionally cultivated control sub-catchment. A substantial decrease of soil

  7. History of measures taken to reduce radiation exposure at Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    Kondou, Masashi; Takagi, Nobuyuki; Yabushita, Kazuo; Dekijima, Makoto

    2009-01-01

    Hamaoka Nuclear Power Station currently has five reactors, Units 1 to 5. Units 1 and 2 halted commercial operation in January 2009 and are now being prepared for decommissioning. Units 3 to 5 are operating at the rated thermal output with the gross electrical output of 3504 MWe. Hamaoka Nuclear Power Station has been operating for about 30 years since Unit 1 started up in 1976. Various measures have been taken to control water chemistry: for controlling SCC in the core internals and structural materials, hydrogen injection and noble metal injection were implemented; and to reduce radiation exposure for workers, condensate filter demineralizers were added, hollow fiber filters and pleated filters were installed in the condensate cleanup system, and zinc injection was performed. This paper describes measures taken at Hamaoka to reduce exposure in terms of water chemistry and techniques to monitor ion impurities in the reactor water. (author)

  8. Measurement and prediction of thermochemical history effects on sensitization development in austenitic stainless steels

    International Nuclear Information System (INIS)

    Bruemmer, S.M.; Charlot, L.A.

    1985-11-01

    The effects of thermal and thermomechanical treatments on sensitization development in Type 304 and 316 stainless steels have been measured and compared to model predictions. Sensitization development resulting from isothermal, continuous cooling and pipe welding treatments has been evaluated. An empirically modified, theoretically based model is shown to accurately predict material degree of sensitization (DOS) as expressed by the electrochemical potentiokinetic reactivation (EPR) test after both simple and complex treatments. Material DOS is also examined using analytical electron microscopy to document grain boundary chromium depletion and is compared to EPR test results

  9. Reliable micro-measurement of strontium is the key to cracking the life-history code in the fish otolith

    International Nuclear Information System (INIS)

    Markwitz, A.; Grambole, D.; Herrmann, F.; Trompetter, W.J.; Dioses, T.; Gauldie, R.W.

    2000-01-01

    The fish otolith is a calcium carbonate (usually aragonite) crystal that grows continuously by accretion over the life of the fish and, unlike bone, is not continuously re-metabolised. Consequently, the otolith has long been regarded as a potential store of information about the life history of an individual fish, and this information is encoded in the deposition pattern of trace elements in the otolith. The code has been difficult to crack. However, recent developments have shown that: (1) Sr is one of the few non-mobile trace elements in the otolith; and (2) the pattern of Sr deposition summarises the effects of environment changes that affect the growth rate of the otolith crystal. The remaining difficulties in cracking the chemical code in the otolith have hinged on making reliable micro-measurements of the stable Sr content at spatial resolutions of 10 μm or less; this interval represents about 4-6 days of otolith growth in most species of fish. This paper describes high beam resolution 2 μm linear measurements, and 6 μm square measurements over narrow windows of about 300 μm square, and links these micro-measures to macro-measures of 2D maps of the entire surface of sections of otoliths up to 5 mm square at beam resolutions of 25 μm square. The otoliths used in this study are from the Jurel, or Peruvian Jack mackerel, Trachurus murphyi (Carangidae: Teleostei)

  10. Defining the Anthropocene

    Science.gov (United States)

    Lewis, Simon; Maslin, Mark

    2016-04-01

    Time is divided by geologists according to marked shifts in Earth's state. Recent global environmental changes suggest that Earth may have entered a new human-dominated geological epoch, the Anthropocene. Should the Anthropocene - the idea that human activity is a force acting upon the Earth system in ways that mean that Earth will be altered for millions of years - be defined as a geological time-unit at the level of an Epoch? Here we appraise the data to assess such claims, first in terms of changes to the Earth system, with particular focus on very long-lived impacts, as Epochs typically last millions of years. Can Earth really be said to be in transition from one state to another? Secondly, we then consider the formal criteria used to define geological time-units and move forward through time examining whether currently available evidence passes typical geological time-unit evidence thresholds. We suggest two time periods likely fit the criteria (1) the aftermath of the interlinking of the Old and New Worlds, which moved species across continents and ocean basins worldwide, a geologically unprecedented and permanent change, which is also the globally synchronous coolest part of the Little Ice Age (in Earth system terms), and the beginning of global trade and a new socio-economic "world system" (in historical terms), marked as a golden spike by a temporary drop in atmospheric CO2, centred on 1610 CE; and (2) the aftermath of the Second World War, when many global environmental changes accelerated and novel long-lived materials were increasingly manufactured, known as the Great Acceleration (in Earth system terms) and the beginning of the Cold War (in historical terms), marked as a golden spike by the peak in radionuclide fallout in 1964. We finish by noting that the Anthropocene debate is politically loaded, thus transparency in the presentation of evidence is essential if a formal definition of the Anthropocene is to avoid becoming a debate about bias. The

  11. Spatial cluster detection for repeatedly measured outcomes while accounting for residential history.

    Science.gov (United States)

    Cook, Andrea J; Gold, Diane R; Li, Yi

    2009-10-01

    Spatial cluster detection has become an important methodology in quantifying the effect of hazardous exposures. Previous methods have focused on cross-sectional outcomes that are binary or continuous. There are virtually no spatial cluster detection methods proposed for longitudinal outcomes. This paper proposes a new spatial cluster detection method for repeated outcomes using cumulative geographic residuals. A major advantage of this method is its ability to readily incorporate information on study participants' relocation, which most cluster detection statistics cannot. Application of these methods is illustrated using the Home Allergens and Asthma prospective cohort study, analyzing the relationship between environmental exposures and a repeatedly measured outcome, occurrence of wheeze in the last 6 months, while taking into account mobile locations.

  12. Bipolar affective disorder and borderline personality disorder: Differentiation based on the history of early life stress and psychoneuroendocrine measures.

    Science.gov (United States)

    Mazer, Angela Kaline; Cleare, Anthony J; Young, Allan H; Juruena, Mario F

    2018-04-24

    Borderline Personality Disorder (BPD) and Bipolar Affective Disorder (BD) have clinical characteristics in common which often make their differential diagnosis difficult. The history of early life stress (ELS) may be a differentiating factor between BPD and BD, as well as its association with clinical manifestations and specific neuroendocrine responses in each of these diagnoses. Assessing and comparing patients with BD and BPD for factors related to symptomatology, etiopathogenesis and neuroendocrine markers. The study sample consisted of 51 women, divided into 3 groups: patients with a clinical diagnosis of BPD (n = 20) and BD (n = 16) and healthy controls (HC, n = 15). Standardized instruments were used for the clinical evaluation, while the history of ELS was quantified with the Childhood Trauma Questionnaire (CTQ), and classified according to the subtypes: emotional abuse, physical abuse, sexual abuse, emotional neglect and physical neglect. The functioning of the hypothalamic-pituitary-adrenal (HPA) axis was evaluated by measuring a single plasma cortisol sample. Patients with BPD presented with more severe psychiatric symptoms of anxiety, impulsivity, depression, hopelessness and suicidal ideation than those with BD. The history of ELS was identified as significantly more prevalent and more severe in patients (BPD and BD) than in HC. Emotional abuse, emotional neglect and physical neglect also showed differences and were higher in BPD than BD patients. BPD patients had greater severity of ELS overall and in the subtypes of emotional abuse, emotional neglect and physical neglect than BD patients. The presence of ELS in patients with BPD and BD was associated with significantly lower cortisol levels when compared to HC. The endocrine evaluation showed no significant differences between the diagnoses of BPD and BD. Cortisol measured in patients with BPD was significantly lower compared to HC in the presence of emotional neglect and physical

  13. The predictive power of family history measures of alcohol and drug problems and internalizing disorders in a college population.

    Science.gov (United States)

    Kendler, Kenneth S; Edwards, Alexis; Myers, John; Cho, Seung Bin; Adkins, Amy; Dick, Danielle

    2015-07-01

    A family history (FH) of psychiatric and substance use problems is a potent risk factor for common internalizing and externalizing disorders. In a large web-based assessment of mental health in college students, we developed a brief set of screening questions for a FH of alcohol problems (AP), drug problems (DP) and depression-anxiety in four classes of relatives (father, mother, aunts/uncles/grandparents, and siblings) as reported by the student. Positive reports of a history of AP, DP, and depression-anxiety were substantially correlated within relatives. These FH measures predicted in the student, in an expected pattern, dimensions of personality and impulsivity, alcohol consumption and problems, smoking and nicotine dependence, use of illicit drugs, and symptoms of depression and anxiety. Using the mean score from the four classes of relatives was more predictive than using a familial/sporadic dichotomy. Interactions were seen between the FH of AP, DP, and depression-anxiety and peer deviance in predicting symptoms of alcohol and tobacco dependence. As the students aged, the FH of AP became a stronger predictor of alcohol problems. While we cannot directly assess the validity of these FH reports, the pattern of findings suggests that our brief screening items were able to assess, with some accuracy, the FH of substance misuse and internalizing psychiatric disorders in relatives. If correct, these measures can play an important role in the creation of developmental etiologic models for substance and internalizing psychiatric disorders, which constitute one of the central goals of the overall project. © 2015 Wiley Periodicals, Inc.

  14. Identification of fall risk predictors in daily life measurements: gait characteristics' reliability and association with self-reported fall history.

    Science.gov (United States)

    Rispens, Sietse M; van Schooten, Kimberley S; Pijnappels, Mirjam; Daffertshofer, Andreas; Beek, Peter J; van Dieën, Jaap H

    2015-01-01

    Background. Gait characteristics extracted from trunk accelerations during daily life locomotion are complementary to questionnaire- or laboratory-based gait and balance assessments and may help to improve fall risk prediction. Objective. The aim of this study was to identify gait characteristics that are associated with self-reported fall history and that can be reliably assessed based on ambulatory data collected during a single week. Methods. We analyzed 2 weeks of trunk acceleration data (DynaPort MoveMonitor, McRoberts) collected among 113 older adults (age range, 65-97 years). During episodes of locomotion, various gait characteristics were determined, including local dynamic stability, interstride variability, and several spectral features. For each characteristic, we performed a negative binomial regression analysis with the participants' self-reported number of falls in the preceding year as outcome. Reliability of gait characteristics was assessed in terms of intraclass correlations between both measurement weeks. Results. The percentages of spectral power below 0.7 Hz along the vertical and anteroposterior axes and below 10 Hz along the mediolateral axis, as well as local dynamic stability, local dynamic stability per stride, gait smoothness, and the amplitude and slope of the dominant frequency along the vertical axis, were associated with the number of falls in the preceding year and could be reliably assessed (all P < 0.05; intraclass correlations > 0.75). Conclusions. Daily life gait characteristics are associated with fall history in older adults and can be reliably estimated from a week of ambulatory trunk acceleration measurements. © The Author(s) 2014.
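
The reliability step in this abstract (intraclass correlations between the two measurement weeks) can be sketched as follows. The exact ICC variant used by the authors is not stated, so this illustration assumes the common two-way random-effects ICC(2,1), implemented in plain Python with hypothetical week-1/week-2 values:

```python
# Sketch of week-to-week reliability as ICC(2,1): two-way random effects,
# absolute agreement, single measurement. Data are made up for illustration.

def icc_2_1(week1, week2):
    """ICC(2,1) between two repeated measurements of the same subjects."""
    n, k = len(week1), 2
    rows = list(zip(week1, week2))
    grand = sum(week1 + week2) / (n * k)
    row_means = [sum(r) / k for r in rows]
    col_means = [sum(week1) / n, sum(week2) / n]
    ss_total = sum((x - grand) ** 2 for r in rows for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between weeks
    ss_error = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_error / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical gait-characteristic estimates from two measurement weeks:
w1 = [0.61, 0.55, 0.72, 0.48, 0.66, 0.59]
w2 = [0.63, 0.52, 0.70, 0.50, 0.68, 0.57]
print(round(icc_2_1(w1, w2), 3))
```

With week-to-week differences this small relative to the between-subject spread, the ICC is high; a characteristic falling below a chosen threshold (the paper uses 0.75) would be considered unreliable for single-week assessment.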

  15. Neutrinos in the holographic dark energy model: constraints from latest measurements of expansion history and growth of structure

    International Nuclear Information System (INIS)

    Zhang, Jing-Fei; Zhao, Ming-Ming; Li, Yun-He; Zhang, Xin

    2015-01-01

    The model of holographic dark energy (HDE) with massive neutrinos and/or dark radiation is investigated in detail. The background and perturbation evolutions in the HDE model are calculated. We employ the PPF approach to overcome the gravity instability difficulty (perturbation divergence of dark energy) caused by the equation-of-state parameter w evolving across the phantom divide w = −1 in the HDE model with c < 1. We thus derive the evolutions of density perturbations of various components and metric fluctuations in the HDE model. The impacts of massive neutrinos and dark radiation on the CMB anisotropy power spectrum and the matter power spectrum in the HDE scenario are discussed. Furthermore, we constrain the models of HDE with massive neutrinos and/or dark radiation by using the latest measurements of expansion history and growth of structure, including the Planck CMB temperature data, the baryon acoustic oscillation data, the JLA supernova data, the Hubble constant direct measurement, the cosmic shear data of weak lensing, the Planck CMB lensing data, and the redshift space distortions data. We find that ∑m_ν < 0.186 eV (95% CL) and N_eff = 3.75 +0.28/−0.32 in the HDE model from the constraints of these data.
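
For context, the HDE model referred to above is conventionally defined by the following energy density (standard notation from the holographic dark energy literature; the abstract itself does not display it):

```latex
% Holographic dark energy density, with R_h the future event horizon,
% M_p the reduced Planck mass, and c the dimensionless model parameter:
\rho_{\mathrm{de}} = 3 c^{2} M_{p}^{2} R_{h}^{-2},
\qquad
R_h = a \int_{t}^{\infty} \frac{\mathrm{d}t'}{a(t')}.
% The resulting equation of state,
%   w = -\tfrac{1}{3} - \tfrac{2}{3c}\sqrt{\Omega_{\mathrm{de}}},
% drops below -1 at late times when c < 1, which is why the abstract's
% PPF treatment of the phantom-divide crossing is needed.
```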

  16. Calibration of a T-History calorimeter to measure enthalpy curves of phase change materials in the temperature range from 40 to 200 °C

    International Nuclear Information System (INIS)

    Rathgeber, Christoph; Schmit, Henri; Hennemann, Peter; Hiebler, Stefan

    2014-01-01

    Thermal energy storage using phase change materials (PCMs) provides high storage capacities in small temperature ranges. For the design of efficient latent heat storage, the enthalpy curve of a PCM has to be measured with high precision. Measurements are most commonly performed with differential scanning calorimetry (DSC). The T-History method, however, proved to be favourable for the characterization of typical PCMs due to large samples and a measuring procedure close to conditions found in applications. As T-History calorimeters are usually individual constructions, performing a careful calibration procedure is decisive to ensure optimal measuring accuracy. We report in this paper on the calibration of a T-History calorimeter with a working range from 40 to 200 °C that was designed and built at our institute. A three-part procedure, consisting of an indium calibration, a measurement of the specific heat of copper and measurements of three solid–liquid PCMs (stearic acid, dimethyl terephthalate and d-mannitol), was performed and an advanced procedure for the correction of enthalpy curves was developed. When comparing T-History enthalpy curves to literature data and DSC step measurements, good agreement within the uncertainty limits demanded by RAL testing specifications was obtained. Thus, our design of a T-History calorimeter together with the developed calibration procedure provides the measuring accuracy that is required to identify the most suitable PCM for a given application. In addition, the dependence of the enthalpy curve on the sample size can be analysed by comparing results obtained with T-History and DSC and the behaviour of the bulk material in real applications can be predicted. (paper)

  17. What Makes Difficult History Difficult?

    Science.gov (United States)

    Gross, Magdalena H.; Terra, Luke

    2018-01-01

    All modern nation-states have periods of difficult history that teachers fail to address or address inadequately. The authors present a framework for defining difficult histories and understanding what makes them difficult. These events 1) are central to a nation's history, 2) contradict accepted histories or values, 3) connect with present…

  18. SFCOMPO 2.0. Database of measured isotopic concentrations of spent nuclear fuel, with operational histories and design data

    International Nuclear Information System (INIS)

    2017-06-01

    SFCOMPO 2.0 (Spent Fuel Isotopic Composition) is a relational database designed to facilitate the search and visualisation of experimental assay data of spent nuclear fuel. It allows the user to access, plot and export isotopic composition data, reactor operational histories and relevant design data relating to spent fuel samples. The database can be queried using different search criteria. The data in SFCOMPO come from fuel samples irradiated in power reactors which have been experimentally measured in the past 50 years. SFCOMPO 2.0 offers a consistent and standardised approach to store, retrieve and compare different datasets from different post-irradiation experimental campaigns. Whenever it has been possible, original experimental lab reports or original publications have either been linked directly from the application, or are referenced. The aim of SFCOMPO 2.0 is to present the user of assay data with a referenced, standardised, cross-checked source of published experimental data for the use of assay data evaluators. SFCOMPO 2.0 was publicly released in June 2017. It contains experimental data coming from 44 different reactors of 8 different reactor types, currently representing 750 fuel samples. The database currently contains more than 24,000 measurement entries. The data in SFCOMPO 2.0 have been independently reviewed for consistency with the experimental reports but have not been formally evaluated. Any errors in measurements, omissions, or inconsistencies in the original reported data may be reproduced in the database. Therefore it is important that any user of the data for code validation consider and assess the potential data deficiencies. The evaluation of assay data will provide a more complete assessment and may result in the development of benchmark specifications and measurement data in cases of high quality experiments. Evaluations are a multi-disciplinary effort involving reactor specialists, modeling and simulation experts, and

  19. A theoretical study on the accuracy of the T-history method for enthalpy–temperature curve measurement: analysis of the influence of thermal gradients inside T-history samples

    International Nuclear Information System (INIS)

    Mazo, Javier; Delgado, Mónica; Lázaro, Ana; Dolado, Pablo; Peñalosa, Conchita; Marín, José María; Zalba, Belén

    2015-01-01

    The present work analyses the effect of radial thermal gradients inside T-history samples on the enthalpy temperature curve measurement. A conduction heat transfer model has been utilized for this purpose. Some expressions have been obtained that relate the main dimensionless numbers of the experiments with the deviations in specific heat capacity, phase change enthalpy and phase change temperature estimations. Although these relations can only be strictly applied to solid materials (e.g. measurements of shape stabilized phase change materials), they can provide some useful and conservative bounds for the deviations of the T-history method. Biot numbers emerge as the most relevant dimensionless parameters in the accuracy of the specific heat capacity and phase change enthalpy estimation whereas this model predicts a negligible influence of the temperature levels used for the experiments or the Stefan number. (paper)
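
A quick, purely illustrative calculation shows why the Biot number governs the radial gradients discussed above (the convection coefficient, sample radius, and conductivity below are hypothetical values, not taken from the paper):

```python
# Illustrative Biot-number check for a T-History sample tube. All numbers
# are hypothetical, chosen only to demonstrate the calculation; the paper
# derives deviation bounds as functions of Bi, not these specific values.

def biot_number(h_w_m2k, radius_m, k_w_mk):
    """Bi = h * Lc / k, taking the tube radius as characteristic length."""
    return h_w_m2k * radius_m / k_w_mk

# Natural convection (h ~ 8 W/m2K), 7.5 mm sample radius, k = 0.2 W/mK:
bi = biot_number(8.0, 0.0075, 0.2)
print(round(bi, 2))
```

A Biot number well below 1 means a near-uniform sample temperature; values approaching 1, as in this hypothetical case, signal the radial gradients whose effect on the specific heat and enthalpy estimates the paper bounds.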

  20. Measuring children's self-reported sport participation, risk perception and injury history: development and validation of a survey instrument.

    Science.gov (United States)

    Siesmaa, Emma J; Blitvich, Jennifer D; White, Peta E; Finch, Caroline F

    2011-01-01

    Despite the health benefits associated with children's sport participation, the occurrence of injury in this context is common. The extent to which sport injuries impact children's ongoing involvement in sport is largely unknown. Surveys have been shown to be useful for collecting children's injury and sport participation data; however, there are currently no published instruments which investigate the impact of injury on children's sport participation. This study describes the processes undertaken to assess the validity of two survey instruments for collecting self-reported information about child cricket and netball related participation, injury history and injury risk perceptions, as well as the reliability of the cricket-specific version. Face and content validity were assessed through expert feedback from primary and secondary level teachers and from representatives of peak sporting bodies for cricket and netball. Test-retest reliability was measured using a sample of 59 child cricketers who completed the survey on two occasions, 3-4 weeks apart. Based on expert feedback relating to face and content validity, modification and/or deletion of some survey items was undertaken. Survey items with low test-retest reliability (κ≤0.40) were modified or deleted, items with moderate reliability (κ=0.41-0.60) were modified slightly and items with higher reliability (κ≥0.61) were retained, with some undergoing minor modifications. This is the first survey of its kind which has been successfully administered to cricketers aged 10-16 years to collect information about injury risk perceptions and intentions for continued sport participation. Implications for its generalisation to other child sport participants are discussed. Copyright © 2010 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
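
The test-retest criterion described above can be illustrated with a minimal Cohen's kappa computation. This is a plain-Python sketch with made-up responses (the paper's actual item data are not reproduced here), applying the revision thresholds stated in the abstract:

```python
# Sketch: unweighted Cohen's kappa for one survey item administered twice,
# plus the revision rule from the abstract (kappa <= 0.40 modify/delete,
# 0.41-0.60 modify slightly, >= 0.61 retain). Responses are hypothetical.

def cohens_kappa(t1, t2):
    """Unweighted Cohen's kappa between two administrations of one item."""
    n = len(t1)
    categories = sorted(set(t1) | set(t2))
    p_observed = sum(a == b for a, b in zip(t1, t2)) / n
    p_expected = sum((t1.count(c) / n) * (t2.count(c) / n) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

def retest_decision(kappa):
    """Map a kappa value to the survey-revision action used in the study."""
    if kappa <= 0.40:
        return "modify or delete"
    if kappa <= 0.60:
        return "modify slightly"
    return "retain (minor modifications allowed)"

# Hypothetical yes/no answers from the same child, 3-4 weeks apart:
week1 = ["yes", "no", "yes", "yes", "no"]
week2 = ["yes", "no", "no", "yes", "no"]
k = cohens_kappa(week1, week2)
print(round(k, 3), "->", retest_decision(k))
```

Kappa corrects the raw agreement (4/5 here) for agreement expected by chance, which is why it is preferred over simple percent agreement for test-retest checks of categorical survey items.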

  1. Defining Quantum Control Flow

    OpenAIRE

    Ying, Mingsheng; Yu, Nengkun; Feng, Yuan

    2012-01-01

    A remarkable difference between quantum and classical programs is that the control flow of the former can be either classical or quantum. One of the key issues in the theory of quantum programming languages is defining and understanding quantum control flow. A functional language with quantum control flow was defined by Altenkirch and Grattage [Proc. LICS'05, pp. 249-258]. This paper extends their work, and we introduce a general quantum control structure by defining three new quantu...

  2. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

    Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon – also in humanist studies – should first of all be defined, i.e. delineated and by neat lines limited to a “little box” that can be handled. The following chapter develops... Human beings can very well understand play – or whatever phenomenon in human life – without defining it.

  3. Defining the effect and mediators of two knowledge translation strategies designed to alter knowledge, intent and clinical utilization of rehabilitation outcome measures: a study protocol [NCT00298727

    Directory of Open Access Journals (Sweden)

    Law Mary

    2006-07-01

    Full Text Available Abstract Background A substantial number of valid outcome measures have been developed to measure health in adult musculoskeletal and childhood disability. Regrettably, national initiatives have merely resulted in changes in attitude, while utilization remains unacceptably low. This study will compare the effectiveness and mediators of two different knowledge transfer (KT) interventions in terms of their impact on changing knowledge and behavior (utilization and clinical reasoning) related to health outcome measures. Method/Design Physical and occupational therapists (n = 144) will be recruited in partnership with the national professional associations to evaluate two different KT interventions with the same curriculum: 1) Stakeholder-Hosted Interactive Problem-Based Seminar (SHIPS), and 2) Online Problem-Based course (e-PBL). SHIPS will consist of face-to-face problem-based learning (PBL) for 2 1/2 days with outcome measure developers as facilitators, using six problems generated in consultation with participants. The e-PBL will consist of a 6-week web-based course with six generic problems developed by content experts. SHIPS will be conducted in three urban centers in Canada. Participants will be block-allocated by a minimization procedure to either of the two interventions to minimize any prognostic differences. Trained evaluators at each site will conduct chart audits and chart-stimulated recall. Trained interviewers will conduct semi-structured interviews focused on identifying critical elements in KT and implementing practice changes. Interviews will be transcribed verbatim. Baseline predictors including demographics, knowledge, attitudes/barriers regarding outcome measures, and Readiness to Change will be assessed by self-report. Immediately post-intervention and 6 months later, these will be re-administered. Primary qualitative and quantitative evaluations will be conducted 6-months post-intervention to assess the relative effectiveness of KT

  4. How Do Women Entrepreneurs Define Success? A Qualitative Study of Differences Among Women Entrepreneurs in Ethiopia

    OpenAIRE

    Atsede Tesfaye Hailemariam; Brigitte Kroon

    2014-01-01

    This paper describes how women entrepreneurs in Ethiopia define success in their own terms. Semi-structured in-depth interviews were conducted with 24 women entrepreneurs from various sectors in Addis Ababa. The interview format allowed the women to tell their life history and define success in their own terms. A common stereotype is that women entrepreneurs in Ethiopia operate businesses out of necessity and therefore measure success in terms of financial rather than personal rewards...

  5. Stability of clinical outcome measures in rheumatoid arthritis patients with stable disease defined on the basis of the EULAR response criteria

    DEFF Research Database (Denmark)

    Madsen, Ole Rintek

    2016-01-01

    patient. Using the Bland-Altman method, lower and upper 95 % limits of agreement (LLoA; ULoA) between the consecutive assessments and the bias were calculated for each measure. Associations were characterized by Pearson's r-values and standard errors of estimation (SEE). The mean change in DAS28-CRP was 0...

  6. Round robin test for defining an accurate protocol to measure the pore fluid pH of low-pH cementitious materials

    International Nuclear Information System (INIS)

    Alonso, M.C.; Garcia Calvo, J.L.; Pettersson, S.; Puigdomenech, I.; Cunado, M.A.; Vuorio, M.; Weber, H.; Ueda, H.; Naito, M.; Walker, C.; Takeshi, Y.; Cau Dit Coumes, C.

    2012-01-01

    The present research belongs to an international project in which several of the main nuclear waste management agencies have been involved. The main objective is the development of agreed procedures or protocols for measuring the pH value of low-pH cementitious products (LopHC). The Pore Fluid Expression (PFE) method has been identified as the reference method, and Ex-situ Leaching methods (ELS), with two variants (filtering and without filtering the obtained suspension), have been identified as routine methods. Both methodologies are based on the extraction of the pore solution of the concrete before pH determination. The protocols employed were based on a broad literature review and on fitting the most critical parameters, such as the sample size, the effect of carbonation, the leaching of cement hydrates during the measurement, etc. Moreover, the routine methods were validated with respect to the pore fluid expression results. It appears that the repeatability of the 3 pH measurement protocols is very good and that the results obtained with both ELS procedures agree well with the results given by the PFE technique in the case of low-pH cementitious materials and are acceptable in the case of cementitious materials with high pore fluid pH values; in that case, some corrections considering the Ca content of the solution may be needed

  7. On the consistent effect histories approach to quantum mechanics

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    A formulation of the consistent histories approach to quantum mechanics in terms of generalized observables (POV measures) and effect operators is provided. The usual notion of "history" is generalized to the notion of "effect history". The space of effect histories carries the structure of a D-poset. Recent results of J. D. Maitland Wright imply that every decoherence functional defined for ordinary histories can be uniquely extended to a bi-additive decoherence functional on the space of effect histories. Omnès' logical interpretation is generalized to the present context. The result of this work considerably generalizes and simplifies the earlier formulation of the consistent effect histories approach to quantum mechanics communicated in a previous work of this author. copyright 1996 American Institute of Physics
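
For orientation, the decoherence functional for ordinary histories (the object that Wright's result extends to effect histories) has the following standard form in consistent-histories notation; the abstract itself does not display it:

```latex
% Class operator for a history \alpha given by projections P at
% times t_1 < ... < t_n, and the decoherence functional on a state \rho:
C_{\alpha} = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1),
\qquad
d(\alpha, \beta) = \operatorname{Tr}\!\left[ C_{\alpha}\, \rho\, C_{\beta}^{\dagger} \right].
% Consistency of a set of histories requires the off-diagonal terms
% d(\alpha, \beta), \alpha \neq \beta, to (at least) have vanishing real part.
```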

  8. PERFORMANCE OF HIGH SCHOOL FOOTBALL PLAYERS ON CLINICAL MEASURES OF DEEP CERVICAL FLEXOR ENDURANCE AND CERVICAL ACTIVE RANGE OF MOTION: IS HISTORY OF CONCUSSION A FACTOR?

    Science.gov (United States)

    Smith, Laura; Ruediger, Thomas; Alsalaheen, Bara; Bean, Ryan

    2016-04-01

    More than one million adolescent athletes participated in organized high school sanctioned football during the 2014-15 season. These athletes are at risk for sustaining concussion. Although cervical spine active range of motion (AROM) and deep neck flexor endurance may serve a preventative role in concussion, and despite widespread clinical use of measurements of these variables, reference values are not available for this population. Cost-effective, clinically relevant methods for measuring neck endurance are also not well established for adolescent athletes. The purpose of this study was to report reference values for deep cervical flexor endurance and cervical AROM in adolescent football players and to examine whether differences in these measures exist between high school football players with and without a history of concussion. Concussion history, cervical AROM, and deep neck flexor endurance were measured in 122 high school football players. Reference values were calculated for AROM and endurance measures; associations were examined between various descriptive variables and concussion. No statistically significant differences were found between athletes with a history of concussion and those without. A modest inverse correlation was seen between body mass and AROM in the sagittal and transverse planes, indicating that participants with larger body mass had less cervical AROM in some directions. Cervical AROM and endurance measurements may not be adequate to identify adolescents with a history of previous concussion among high school football players. However, if a concussion is sustained, these measures can offer a baseline against which to examine whether cervical AROM is affected as compared to healthy adolescents. Level of evidence: 2c.

  9. Patient-provider communication styles in HIV treatment programs in Bamako, Mali: A mixed-methods study to define dimensions and measure patient preferences

    Directory of Open Access Journals (Sweden)

    Emily A. Hurley

    2017-12-01

    Full Text Available Effective patient-provider communication (PPC) promotes patient adherence and retention in long-term care. Sub-Saharan Africa faces unprecedented demand for chronic care for HIV patients on antiretroviral therapy (ART), yet adherence and retention remain challenging. In high-income countries, research describing patient preferences for different PPC styles has guided interventions to improve PPC and patient outcomes. However, research on PPC preferences in sub-Saharan Africa is limited. We sought to define PPC dimensions relevant to ART programs in Bamako, Mali through recordings of clinical interactions, in-depth interviews and focus-group discussions with 69 patients and 17 providers. To assess preferences toward contrasting PPC styles within dimensions, we conducted a vignette-based survey with 141 patients across five ART facilities. Qualitative analysis revealed two PPC dimensions similar to those described in the literature on patient-centered communication (level of psychosocial regard, balance of power) and one unique dimension that emerged from the data (guiding patient behavior: easy/tough/sharp). Significantly more survey participants chose the vignette demonstrating high psychosocial regard (52.2%) compared to a biomedical style (22.5%) (p<0.001). Within balance of power, a statistically similar proportion of participants chose the vignette demonstrating shared power (40.2%) compared to a provider-dominated style (35.8%). In guiding patient behavior, a similar proportion of participants preferred the vignette depicting the "easy" (38.4%) and/or "tough" style (40.6%), but significantly fewer preferred the "sharp" style (14.5%) (p<0.001). Highly educated participants chose biomedical and shared power styles more frequently, while less educated participants more frequently indicated "no preference". Working to understand, develop, and tailor PPC styles to patients in chronic care may help support patient retention and ultimately

  10. A Bayesian account of quantum histories

    International Nuclear Information System (INIS)

    Marlow, Thomas

    2006-01-01

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory

  11. Defining energy vulnerability in mobility. Measuring energy vulnerability in mobility. Acting against energy vulnerability in mobility. Discussing energy vulnerability in mobility. Task no. 4

    International Nuclear Information System (INIS)

    Jouffe, Yves; Massot, Marie-Helene; Noble, Cyprien

    2015-01-01

    Extensive expansion of urban areas generates transportation needs and energy expenses for mobility. Households already impacted by fuel poverty also suffer from energy vulnerability in their mobility. This report was prepared in the framework of the study of fuel poverty in France in the light of several indicators from existing inquiries, databases and modeling tools. The report is organised in 4 parts dealing with: the definition of energy vulnerability in mobility, its measurement, the possible remedial actions, and the discussions about energy vulnerability in mobility through working group meetings, respectively

  12. Delirium superimposed on dementia: defining disease states and course from longitudinal measurements of a multivariate index using latent class analysis and hidden Markov chains.

    Science.gov (United States)

    Ciampi, Antonio; Dyachenko, Alina; Cole, Martin; McCusker, Jane

    2011-12-01

    The study of mental disorders in the elderly presents substantial challenges due to population heterogeneity, coexistence of different mental disorders, and diagnostic uncertainty. While reliable tools have been developed to collect relevant data, new approaches to study design and analysis are needed. We focus on a new analytic approach. Our framework is based on latent class analysis and hidden Markov chains. From repeated measurements of a multivariate disease index, we extract the notion of underlying state of a patient at a time point. The course of the disorder is then a sequence of transitions among states. States and transitions are not observable; however, the probability of being in a state at a time point, and the transition probabilities from one state to another over time can be estimated. Data from 444 patients with and without diagnosis of delirium and dementia were available from a previous study. The Delirium Index was measured at diagnosis, and at 2 and 6 months from diagnosis. Four latent classes were identified: fairly healthy, moderately ill, clearly sick, and very sick. Dementia and delirium could not be separated on the basis of these data alone. Indeed, as the probability of delirium increased, so did the probability of decline of mental functions. Eight most probable courses were identified, including good and poor stable courses, and courses exhibiting various patterns of improvement. Latent class analysis and hidden Markov chains offer a promising tool for studying mental disorders in the elderly. Its use may show its full potential as new data become available.
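    The latent-state machinery described above can be sketched in a few lines: states are never observed directly, but filtered state probabilities at each visit follow from assumed transition and emission probabilities. All numbers below are illustrative toy values, not estimates from the study.

```python
import numpy as np

# Toy two-state sketch of the latent-state idea: states are hidden, but
# the probability of occupying each state at a visit, and the transition
# probabilities between visits, can be estimated. Illustrative numbers only.
init = np.array([0.7, 0.3])                    # P(state at diagnosis)
trans = np.array([[0.8, 0.2],                  # P(next state | current state)
                  [0.3, 0.7]])
emit = np.array([[0.9, 0.1],                   # P(observed index level | state)
                 [0.2, 0.8]])

def forward_filter(obs):
    """Forward algorithm: P(state_t | obs_1..t) at each visit t."""
    belief = init * emit[:, obs[0]]
    belief /= belief.sum()
    beliefs = [belief]
    for o in obs[1:]:
        belief = (trans.T @ belief) * emit[:, o]   # predict, then update
        belief /= belief.sum()
        beliefs.append(belief)
    return np.array(beliefs)

# Delirium Index dichotomized at three visits (0 = low, 1 = high).
print(forward_filter([0, 1, 1]))
```

In practice the study also estimated the parameters themselves (via latent class analysis) rather than assuming them, but the filtering step above is the core of how a hidden Markov chain turns repeated measurements into a probable course.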

  13. An expanded framework to define and measure shared decision-making in dialogue: A 'top-down' and 'bottom-up' approach.

    Science.gov (United States)

    Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F

    2018-03-11

    We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
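    The interrater reliability figure quoted above is simple percent agreement between two coders, which can be computed directly; the binary codings below are invented for illustration.

```python
def percent_agreement(coder_a, coder_b):
    """Percentage of items on which two raters assign the same code."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical presence/absence codes for one behavior across 8 visits.
a = [1, 0, 1, 1, 0, 1, 1, 0]
b = [1, 0, 1, 0, 0, 1, 1, 1]
print(percent_agreement(a, b))  # 75.0
```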

  14. Reliability and Validity of a Measure of Sexual and Physical Abuse Histories among Women with Serious Mental Illness.

    Science.gov (United States)

    Meyer, Ilan H.; And Others

    1996-01-01

    Structured clinical interviews concerning childhood histories of physical and sexual abuse with 70 mentally ill women at 2 times found test-retest reliability of .63 for physical abuse and .82 for sexual abuse. Validity, assessed as consistency with an independent clinical assessment, showed 75% agreement for physical abuse and 93% agreement for…

  15. Contrasting neogene denudation histories of different structural regions in the transantarctic mountains rift flank constrained by cosmogenic isotope measurements

    NARCIS (Netherlands)

    Wateren, F.M. van der; Dunai, T.J.; Balen, R.T. van; Klas, W.; Verbers, A.L.L.M.; Passchier, S.; Herpers, U.

    1999-01-01

    Separate regions within the Transantarctic Mountains, the uplifted flank of the West Antarctic rift system, appear to have distinct Neogene histories of glaciation and valley downcutting. Incision of deep glacial outlet valleys occurred at different times throughout central and northern Victoria

  16. Defining Overweight and Obesity

    Science.gov (United States)

    A person whose weight is higher than what is considered a healthy weight for a given height is described as overweight or obese. Body Mass Index, or BMI, is ...

  17. Drinking Levels Defined

    Science.gov (United States)

    Definition of Drinking at Low Risk for Developing Alcohol Use Disorder (AUD): For women, low-risk drinking is defined ...

  18. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

    A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception, and a proposal of a positive list of essential, but non-exclusive, characteristics of documentary film.

  19. Tuberculosis and mass gatherings-opportunities for defining burden, transmission risk, and the optimal surveillance, prevention, and control measures at the annual Hajj pilgrimage.

    Science.gov (United States)

    Zumla, Alimuddin; Saeed, Abdulaziz Bin; Alotaibi, Badriah; Yezli, Saber; Dar, Osman; Bieh, Kingsley; Bates, Matthew; Tayeb, Tamara; Mwaba, Peter; Shafi, Shuja; McCloskey, Brian; Petersen, Eskild; Azhar, Esam I

    2016-06-01

    Tuberculosis (TB) is now the most common infectious cause of death worldwide. In 2014, an estimated 9.6 million people developed active TB. There were an estimated three million people with active TB, including 360,000 with multidrug-resistant TB (MDR-TB), who were not diagnosed, and such people continue to fuel TB transmission in the community. Accurate data on the actual burden of TB and the transmission risk associated with mass gatherings are scarce and unreliable due to the small numbers studied and methodological issues. Every year, an estimated 10 million pilgrims from 184 countries travel to the Kingdom of Saudi Arabia (KSA) to perform the Hajj and Umrah pilgrimages. A large majority of pilgrims come from high TB burden and MDR-TB endemic areas and thus many may have undiagnosed active TB, sub-clinical TB, and latent TB infection. The Hajj pilgrimage provides unique opportunities for the KSA and the 184 countries from which pilgrims originate to conduct high quality priority research studies on TB under the remit of the Global Centre for Mass Gatherings Medicine. Research opportunities are discussed, including those related to the definition of the TB burden, transmission risk, and the optimal surveillance, prevention, and control measures at the annual Hajj pilgrimage. The associated data are required to develop international recommendations and guidelines for TB management and control at mass gathering events. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. [Development of an attitude-measurement questionnaire using the semantic differential technique: defining the attitudes of radiological technology students toward X-ray examination].

    Science.gov (United States)

    Tamura, Naomi; Terashita, Takayoshi; Ogasawara, Katsuhiko

    2014-03-01

    In general, it is difficult to objectively evaluate the results of an educational program. The semantic differential (SeD) technique, a methodology used to measure the connotative meaning of objects, words, and concepts, can, however, be applied to the evaluation of students' attitudes. In this study, we aimed to achieve an objective evaluation of the effects of radiological technology education. We therefore investigated the attitude of radiological students using the SeD technique. We focused on X-ray examinations in the field of radiological technology science. Bipolar adjective scales were used for the SeD questionnaire. To create the questionnaire, appropriate adjectives were selected from past reports of X-ray examination practice. The participants were 32 senior students at Hokkaido University at the Division of Radiological Technology at the School of Medicine's Department of Health Sciences. All the participants completed the questionnaire. The study was conducted in early June 2012. Attitudes toward X-ray examination were identified using a factor analysis of 11 adjectives. The factor analysis revealed the following three attitudes: feelings of expectation, responsibility, and resistance. Knowledge regarding the attitudes that students have toward X-ray examination will prove useful for evaluating the effects of educational intervention. In this study, a sampling bias may have occurred due to the small sample size; however, no other biases were observed.

  1. Defining a minimal clinically important difference for endometriosis-associated pelvic pain measured on a visual analog scale: analyses of two placebo-controlled, randomized trials

    Directory of Open Access Journals (Sweden)

    Schmitz Heinz

    2010-11-01

    Full Text Available Abstract Background When comparing active treatments, a non-inferiority (or one-sided equivalence) study design is often used. This design requires the definition of a non-inferiority margin, the threshold value of clinical relevance. In recent studies, a non-inferiority margin of 15 mm has been used for the change in endometriosis-associated pelvic pain (EAPP) on a visual analog scale (VAS). However, this value was derived from other chronic painful conditions and its validation in EAPP was lacking. Methods Data were analyzed from two placebo-controlled studies of active treatments in endometriosis, including 281 patients with laparoscopically-confirmed endometriosis and moderate-to-severe EAPP. Patients recorded EAPP on a VAS at baseline and the end of treatment. Patients also assessed their satisfaction with treatment on a modified Clinical Global Impression scale. Changes in VAS score were compared with patients' self-assessments to derive an empirically validated non-inferiority margin. This anchor-based value was compared to a non-inferiority margin derived using the conventional half-standard-deviation rule for minimal clinically important difference (MCID) in patient-reported outcomes. Results Anchor-based and distribution-based MCIDs were −7.8 mm and −8.6 mm, respectively. Conclusions An empirically validated non-inferiority margin of 10 mm for EAPP measured on a VAS is appropriate to compare treatments in endometriosis.
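    The distribution-based MCID mentioned above follows the conventional half-standard-deviation rule (MCID ≈ 0.5 × SD of baseline scores); a minimal sketch, with invented baseline VAS scores, is:

```python
import statistics

def distribution_based_mcid(baseline_scores):
    """Half-standard-deviation rule: MCID ~ 0.5 * sample SD of baseline scores."""
    return 0.5 * statistics.stdev(baseline_scores)

# Hypothetical baseline VAS scores (mm), for illustration only.
baseline_vas = [62, 55, 71, 48, 66, 59, 74, 52, 68, 61]
print(round(distribution_based_mcid(baseline_vas), 1))
```

The anchor-based value in the study came instead from comparing VAS changes against patients' own satisfaction ratings; the half-SD rule is the purely statistical fallback it was checked against.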

  2. FDG-PET Response Prediction in Pediatric Hodgkin’s Lymphoma: Impact of Metabolically Defined Tumor Volumes and Individualized SUV Measurements on the Positive Predictive Value

    Energy Technology Data Exchange (ETDEWEB)

    Hussien, Amr Elsayed M. [Department of Nuclear Medicine (KME), Forschungszentrum Jülich, Medical Faculty, Heinrich-Heine-University Düsseldorf, Jülich, 52426 (Germany); Department of Nuclear Medicine, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany); Furth, Christian [Department of Radiology and Nuclear Medicine, Medical School, Otto-von-Guericke University Magdeburg, Magdeburg, 39120 (Germany); Schönberger, Stefan [Department of Pediatric Oncology, Hematology and Clinical Immunology, University Children’s Hospital, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany); Hundsdoerfer, Patrick [Department of Pediatric Oncology and Hematology, Charité Campus Virchow, Humboldt-University Berlin, Berlin, 13353 (Germany); Steffen, Ingo G.; Amthauer, Holger [Department of Radiology and Nuclear Medicine, Medical School, Otto-von-Guericke University Magdeburg, Magdeburg, 39120 (Germany); Müller, Hans-Wilhelm; Hautzel, Hubertus, E-mail: h.hautzel@fz-juelich.de [Department of Nuclear Medicine (KME), Forschungszentrum Jülich, Medical Faculty, Heinrich-Heine-University Düsseldorf, Jülich, 52426 (Germany); Department of Nuclear Medicine, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, 40225 (Germany)

    2015-01-28

    Background: In pediatric Hodgkin’s lymphoma (pHL) early response-to-therapy prediction is metabolically assessed by (18)F-FDG PET carrying an excellent negative predictive value (NPV) but an impaired positive predictive value (PPV). Aim of this study was to improve the PPV while keeping the optimal NPV. A comparison of different PET data analyses was performed applying individualized standardized uptake values (SUV), PET-derived metabolic tumor volume (MTV) and the product of both parameters, termed total lesion glycolysis (TLG); Methods: One-hundred-eight PET datasets (PET1, n = 54; PET2, n = 54) of 54 children were analysed by visual and semi-quantitative means. SUVmax, SUVmean, MTV and TLG were obtained; the results of both PETs and the relative change from PET1 to PET2 (Δ, in %) were compared for their capability of identifying responders and non-responders using receiver operating characteristics (ROC)-curves. In consideration of individual variations in noise and contrast levels, all parameters were additionally obtained after threshold correction to lean body mass and background; Results: All semi-quantitative SUV estimates obtained at PET2 were significantly superior to the visual PET2 analysis. However, ΔSUVmax revealed the best results (area under the curve, 0.92; p < 0.001; sensitivity 100%; specificity 85.4%; PPV 46.2%; NPV 100%; accuracy, 87.0%) but was not significantly superior to SUVmax-estimation at PET2 and ΔTLGmax. Likewise, the lean body mass and background individualization of the datasets did not improve the results of the ROC analyses; Conclusions: Sophisticated semi-quantitative PET measures in early response assessment of pHL patients do not perform significantly better than the previously proposed ΔSUVmax. All analytical strategies failed to improve the impaired PPV to a clinically acceptable level while preserving the excellent NPV.
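    The semi-quantitative measures compared in this record reduce to simple voxel arithmetic: MTV is the volume of voxels above an uptake threshold, TLG is SUVmean over that volume times MTV, and Δ is the relative change between scans. The sketch below uses an assumed fixed SUV cutoff and invented voxel values for simplicity, whereas the study individualized thresholds to lean body mass and background.

```python
import numpy as np

def pet_metrics(suv, voxel_volume_ml, threshold=2.5):
    """SUVmax, SUVmean, metabolic tumor volume (MTV) and total lesion
    glycolysis (TLG) from a voxel-wise SUV array; threshold is an assumed
    fixed SUV cutoff used here for illustration."""
    tumor = suv[suv >= threshold]                # voxels counted as tumor
    mtv = tumor.size * voxel_volume_ml           # ml
    suv_max = float(tumor.max())
    suv_mean = float(tumor.mean())
    tlg = suv_mean * mtv                         # SUV * ml
    return suv_max, suv_mean, mtv, tlg

def delta_pct(baseline, interim):
    """Relative change from PET1 to PET2 in percent."""
    return 100.0 * (interim - baseline) / baseline

# Invented SUV values for a handful of voxels at the two time points.
suv_pet1 = np.array([1.0, 3.0, 4.0, 6.0, 2.8, 5.0])
suv_pet2 = np.array([1.0, 2.6, 2.9, 3.2, 1.5, 2.7])
m1 = pet_metrics(suv_pet1, voxel_volume_ml=0.1)
m2 = pet_metrics(suv_pet2, voxel_volume_ml=0.1)
print(delta_pct(m1[0], m2[0]))  # ΔSUVmax between the two scans
```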

  3. FDG-PET Response Prediction in Pediatric Hodgkin’s Lymphoma: Impact of Metabolically Defined Tumor Volumes and Individualized SUV Measurements on the Positive Predictive Value

    International Nuclear Information System (INIS)

    Hussien, Amr Elsayed M.; Furth, Christian; Schönberger, Stefan; Hundsdoerfer, Patrick; Steffen, Ingo G.; Amthauer, Holger; Müller, Hans-Wilhelm; Hautzel, Hubertus

    2015-01-01

    Background: In pediatric Hodgkin’s lymphoma (pHL) early response-to-therapy prediction is metabolically assessed by (18)F-FDG PET carrying an excellent negative predictive value (NPV) but an impaired positive predictive value (PPV). Aim of this study was to improve the PPV while keeping the optimal NPV. A comparison of different PET data analyses was performed applying individualized standardized uptake values (SUV), PET-derived metabolic tumor volume (MTV) and the product of both parameters, termed total lesion glycolysis (TLG); Methods: One-hundred-eight PET datasets (PET1, n = 54; PET2, n = 54) of 54 children were analysed by visual and semi-quantitative means. SUVmax, SUVmean, MTV and TLG were obtained; the results of both PETs and the relative change from PET1 to PET2 (Δ, in %) were compared for their capability of identifying responders and non-responders using receiver operating characteristics (ROC)-curves. In consideration of individual variations in noise and contrast levels, all parameters were additionally obtained after threshold correction to lean body mass and background; Results: All semi-quantitative SUV estimates obtained at PET2 were significantly superior to the visual PET2 analysis. However, ΔSUVmax revealed the best results (area under the curve, 0.92; p < 0.001; sensitivity 100%; specificity 85.4%; PPV 46.2%; NPV 100%; accuracy, 87.0%) but was not significantly superior to SUVmax-estimation at PET2 and ΔTLGmax. Likewise, the lean body mass and background individualization of the datasets did not improve the results of the ROC analyses; Conclusions: Sophisticated semi-quantitative PET measures in early response assessment of pHL patients do not perform significantly better than the previously proposed ΔSUVmax. All analytical strategies failed to improve the impaired PPV to a clinically acceptable level while preserving the excellent NPV

  4. Sleep Health: Can We Define It? Does It Matter?

    Science.gov (United States)

    Buysse, Daniel J.

    2014-01-01

    Good sleep is essential to good health. Yet for most of its history, sleep medicine has focused on the definition, identification, and treatment of sleep problems. Sleep health is a term that is infrequently used and even less frequently defined. It is time for us to change this. Indeed, pressures in the research, clinical, and regulatory environments require that we do so. The health of populations is increasingly defined by positive attributes such as wellness, performance, and adaptation, and not merely by the absence of disease. Sleep health can be defined in such terms. Empirical data demonstrate several dimensions of sleep that are related to health outcomes, and that can be measured with self-report and objective methods. One suggested definition of sleep health and a description of self-report items for measuring it are provided as examples. The concept of sleep health synergizes with other health care agendas, such as empowering individuals and communities, improving population health, and reducing health care costs. Promoting sleep health also offers the field of sleep medicine new research and clinical opportunities. In this sense, defining sleep health is vital not only to the health of populations and individuals, but also to the health of sleep medicine itself. Citation: Buysse DJ. Sleep health: can we define it? Does it matter? SLEEP 2014;37(1):9-17. PMID:24470692

  5. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  6. History of Science and History of Philologies.

    Science.gov (United States)

    Daston, Lorraine; Most, Glenn W

    2015-06-01

    While both the sciences and the humanities, as currently defined, may be too heterogeneous to be encompassed within a unified historical framework, there is good reason to believe that the history of science and the history of philologies both have much to gain by joining forces. This collaboration has already yielded striking results in the case of the history of science and humanist learning in early modern Europe. This essay argues that first, philology and at least some of the sciences (e.g., astronomy) remained intertwined in consequential ways well into the modern period in Western cultures; and second, widening the scope of inquiry to include other philological traditions in non-Western cultures offers rich possibilities for a comparative history of learned practices. The focus on practices is key; by shifting the emphasis from what is studied to how it is studied, deep commonalities emerge among disciplines--and intellectual traditions--now classified as disparate.

  7. Defining Game Mechanics

    DEFF Research Database (Denmark)

    Sicart (Vila), Miguel Angel

    2008-01-01

    This article defines game mechanics in relation to rules and challenges. Game mechanics are methods invoked by agents for interacting with the game world. I apply this definition to a comparative analysis of the games Rez, Every Extend Extra and Shadow of the Colossus that will show the relevance of a formal definition of game mechanics. Publication date: Dec 2008.

  8. Modal Logics and Definability

    OpenAIRE

    Kuusisto, Antti

    2013-01-01

    In recent years, research into the mathematical foundations of modal logic has become increasingly popular. One of the main reasons for this is the fact that modal logic seems to adapt well to the requirements of a wide range of different fields of application. This paper is a summary of some of the author’s contributions to the understanding of modal definability theory.

  9. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
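    The if-trigger-then-action (IFTA) idea described above can be illustrated with a toy rule engine: rules bound to a storage system fire actions when file events match their trigger. The event shape and policy below are invented for illustration, not the project's actual API.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    trigger: Callable[[dict], bool]   # predicate over a file event
    action: Callable[[dict], None]    # what to do when it matches

@dataclass
class StorageSystem:
    rules: list = field(default_factory=list)

    def deploy(self, rule):
        """Users deploy rules to a storage system, as in the paper's model."""
        self.rules.append(rule)

    def on_event(self, event):
        """Run every matching rule when data are created or modified."""
        for rule in self.rules:
            if rule.trigger(event):
                rule.action(event)

log = []
store = StorageSystem()
# "When a new .csv lands, index it" -- a hypothetical ingest policy.
store.deploy(Rule(
    trigger=lambda e: e["type"] == "created" and e["path"].endswith(".csv"),
    action=lambda e: log.append(("index", e["path"])),
))
store.on_event({"type": "created", "path": "run42.csv"})
store.on_event({"type": "modified", "path": "notes.txt"})
print(log)  # only the csv creation matched the deployed rule
```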

  10. Defining Abnormally Low Tenders

    DEFF Research Database (Denmark)

    Ølykke, Grith Skovgaard; Nyström, Johan

    2017-01-01

    The concept of an abnormally low tender is not defined in EU public procurement law. This article takes an interdisciplinary law and economics approach to examine a dataset consisting of Swedish and Danish judgments and verdicts concerning the concept of an abnormally low tender. The purpose...

  11. Software Defined Coded Networking

    DEFF Research Database (Denmark)

    Di Paola, Carla; Roetter, Daniel Enrique Lucani; Palazzo, Sergio

    2017-01-01

    the quality of each link and even across neighbouring links and using simulations to show that an additional reduction of packet transmission in the order of 40% is possible. Second, to advocate for the use of network coding (NC) jointly with software defined networking (SDN) providing an implementation...

  12. Defining and classifying syncope

    NARCIS (Netherlands)

    Thijs, Roland D.; Wieling, Wouter; Kaufmann, Horacio; van Dijk, Gert

    2004-01-01

    There is no widely adopted definition or classification of syncope and related disorders. This lack of uniformity harms patient care, research, and medical education. In this article, syncope is defined as a form of transient loss of consciousness (TLOC) due to cerebral hypoperfusion. Differences

  13. Defining and Measuring Dysphagia Following Stroke

    Science.gov (United States)

    Daniels, Stephanie K.; Schroeder, Mae Fern; DeGeorge, Pamela C.; Corey, David M.; Foundas, Anne L.; Rosenbek, John C.

    2009-01-01

    Purpose: To continue the development of a quantified, standard method to differentiate individuals with stroke and dysphagia from individuals without dysphagia. Method: Videofluoroscopic swallowing studies (VFSS) were completed on a group of participants with acute stroke (n = 42) and healthy age-matched individuals (n = 25). Calibrated liquid…

  14. Defining, Measuring, and Comparing Organisational Cultures

    NARCIS (Netherlands)

    van den Berg, Peter T.; Wilderom, Celeste P.M.

    2004-01-01

    The literature on organizational culture suffers from a clear lack of extensive surveys enabling comparative studies. To make organizational cultures more comparable, we propose a definition and a set of dimensions. Organizational culture

  16. Defining Moments in MMWR History: Toxic-Shock Syndrome -- 1980

    Centers for Disease Control (CDC) Podcasts

    2017-11-03

    In the late 1970s and early 1980s, an outbreak of a disease called Toxic Shock Syndrome made healthy women sick. CDC’s disease detectives helped unravel the link between Toxic Shock Syndrome and high-absorbency tampons. MMWR was the first scientific publication to break the news of these cases. In this podcast, Dr. Kathy Shands, former chief of CDC’s Toxic Shock Syndrome Task Force, recalls her experience working with state epidemiologists to identify the link between toxic shock syndrome and tampon use.  Created: 11/3/2017 by MMWR.   Date Released: 11/3/2017.

  17. Quantum histories and their implications

    International Nuclear Information System (INIS)

    Kent, A.

    2000-01-01

    Classical mechanics and standard Copenhagen quantum mechanics respect subspace implications. For example, if a particle is confined in a particular region R of space, then in these theories we can deduce that it is confined in regions containing R. However, subspace implications are generally violated by versions of quantum theory that assign probabilities to histories, such as the consistent histories approach. I define here a new criterion, ordered consistency, which refines the criterion of consistency and has the property that inferences made by ordered consistent sets do not violate subspace relations. This raises the question: do the operators defining our observations form an ordered consistent history? If so, ordered consistency defines a version of quantum theory with greater predictive power than the consistent histories formalism. If not, and our observations are defined by a non-ordered consistent quantum history, then subspace implications are not generally valid. (orig.)

  18. Defining Legal Moralism

    DEFF Research Database (Denmark)

    Thaysen, Jens Damgaard

    2015-01-01

    This paper discusses how legal moralism should be defined. It is argued that legal moralism should be defined as the position that “For any X, it is always a pro tanto reason for justifiably imposing legal regulation on X that X is morally wrong (where “morally wrong” is not conceptually equivalent to “harmful”)”. Furthermore, a distinction between six types of legal moralism is made. The six types are grouped according to whether they are concerned with the enforcement of positive or critical morality, and whether they are concerned with criminalising, legally restricting, or refraining from legally protecting morally wrong behaviour. This is interesting because not all types of legal moralism are equally vulnerable to the different critiques of legal moralism that have been put forth. Indeed, I show that some interesting types of legal moralism have not been criticised at all.

  19. Defining local food

    DEFF Research Database (Denmark)

    Eriksen, Safania Normann

    2013-01-01

    Despite evolving local food research, there is no consistent definition of “local food.” Various understandings are utilized, which have resulted in a diverse landscape of meaning. The main purpose of this paper is to examine how researchers within the local food systems literature define local food, and how these definitions can be used as a starting point to identify a new taxonomy of local food based on three domains of proximity.

  20. Postural Control Characteristics during Single Leg Standing of Individuals with a History of Ankle Sprain: Measurements Obtained Using a Gravicorder and Head and Foot Accelerometry.

    Science.gov (United States)

    Abe, Yota; Sugaya, Tomoaki; Sakamoto, Masaaki

    2014-03-01

    [Purpose] This study aimed to validate the postural control characteristics of individuals with a history of ankle sprain during single leg standing by using a gravicorder and head and foot accelerometry. [Subjects] Twenty subjects with and 23 subjects without a history of ankle sprain (sprain and control groups, respectively) participated. [Methods] The anteroposterior, mediolateral, and total path lengths, as well as root mean square (RMS) of each length, were calculated using the gravicorder. The anteroposterior, mediolateral, and resultant acceleration of the head and foot were measured using accelerometers and were evaluated as the ratio of the acceleration of the head to the foot. [Results] There was no significant difference between the two groups in path length or RMS acceleration of the head and foot. However, the ratios of the mediolateral and resultant components were significantly higher in the sprain group than in the control group. [Conclusion] Our findings suggest that individuals with a history of ankle sprain have a higher head-to-foot acceleration ratio and different postural control characteristics than those of control subjects.

  1. [History and psychoanalysis: the stakes of history].

    Science.gov (United States)

    Chertok, L; Stengers, I

    1993-01-01

    Freud's definition of the relationship between hypnosis and psychoanalysis is a political one that even then pointed to the paradigmatical sciences as defined by Kuhn. Nevertheless, the historian who applies to psychoanalysis the technique of symmetry elaborated for such sciences runs up against a set of singularities that risk bringing him to the position of denouncing a "fake science". We emphasize that, if the historian does not limit himself to the positivist position or to the history of ideas, he will inevitably find himself engaged in the history that he is analyzing, but with the responsibility of his mode of engagement. We propose to define hypnosis and psychoanalysis as fields inhabited by the question of science in the modern sense of the term, and raising the issue of the pertinence, as far as they are concerned, of the theoretical-experimental model that guided them.

  2. Feasible Histories, Maximum Entropy

    International Nuclear Information System (INIS)

    Pitowsky, I.

    1999-01-01

    We consider the broadest possible consistency condition for a family of histories, which extends all previous proposals. A family that satisfies this condition is called feasible. On each feasible family of histories we choose a probability measure by maximizing entropy, while keeping the probabilities of commuting histories to their quantum mechanical values. This procedure is justified by the assumption that decoherence increases entropy. Finally, a criterion for identifying the nearly classical families is proposed

  3. History Matters

    Institute of Scientific and Technical Information of China (English)

    2017-01-01

    In 2002, she began working as a lecturer at Minzu University of China. Now, she teaches English, historical literature, ancient Chinese history, historical theory and method, ancient social history of China, ancient palace political history of China, and the history of the Sui and Tang dynasties and the Period of Five Dynasties.

  4. Multivariate assessment of subjective and objective measures of social and family satisfaction in Veterans with history of traumatic brain injury.

    Science.gov (United States)

    Orff, Henry J; Hays, Chelsea C; Twamley, Elizabeth W

    2016-01-01

    Approximately 20% of current-era Veterans have sustained a traumatic brain injury (TBI), which can result in persistent postconcussive symptoms. These symptoms may disrupt family and social functioning. We explored psychiatric, postconcussive, and cognitive factors as correlates of objective functioning and subjective satisfaction in family and social relationships. At entry into a supported employment study, 50 unemployed Veterans with a history of mild to moderate TBI and current cognitive impairment were administered baseline assessments. Multivariate stepwise regressions determined that higher levels of depressive symptomatology were strongly associated with less frequent social contact, as well as lower subjective satisfaction with family and social relationships. Worse verbal fluency predicted less frequent social contact, whereas worse processing speed and switching predicted higher levels of subjective satisfaction with family relationships. The pattern of results remained similar when examining those Veterans with only mild TBI. Depressive symptoms and cognitive functioning may impact Veterans' social contact and satisfaction with family and social relationships. Evidence-based interventions addressing depression and cognition may therefore aid in improving community reintegration and satisfaction with social and family relationships.

  5. Family history assessment of personality disorders: II. Association with measures of psychosocial functioning in direct evaluations with relatives.

    Science.gov (United States)

    Lara, M E; Ferro, T; Klein, D N

    1997-01-01

    To test the convergent validity of the Family History Interview for Personality Disorders (FHIPD), as well as the general utility of informants' reports of personality disorders, we explored the relationship between proband informant reports of Axis II diagnoses on the FHIPD and relative reports of various indices of psychosocial adjustment. Subjects were the first-degree relatives (n = 454) of 224 probands participating in a family study of mood and personality disorders. Relatives provided information on the Structured Clinical Interview for DSM-III-R (SCID), the Personality Disorder Examination (PDE), and other variables reflecting aspects of psychosocial dysfunction that are common in personality disorders. Proband informants were interviewed about their relatives using the FHIPD. Proband informant reports of personality disorders on the FHIPD were associated with a variety of forms of psychosocial dysfunction as determined in direct assessments with the relatives, even for those with no diagnosable Axis II psychopathology on direct interview. These results support the convergent validity of the FHIPD, and suggest that informants may provide important information on Axis II psychopathology that is not obtained from direct interviews with the subjects themselves.

  6. Defined contribution health benefits.

    Science.gov (United States)

    Fronstin, P

    2001-03-01

    This Issue Brief discusses the emerging issue of "defined contribution" (DC) health benefits. The term "defined contribution" is used to describe a wide variety of approaches to the provision of health benefits, all of which have in common a shift in the responsibility for payment and selection of health care services from employers to employees. DC health benefits often are mentioned in the context of enabling employers to control their outlay for health benefits by avoiding increases in health care costs. DC health benefits may also shift responsibility for choosing a health plan and the associated risks of choosing a plan from employers to employees. There are three primary reasons why some employers currently are considering some sort of DC approach. First, they are once again looking for ways to keep their health care cost increases in line with overall inflation. Second, some employers are concerned that the public "backlash" against managed care will result in new legislation, regulations, and litigation that will further increase their health care costs if they do not distance themselves from health care decisions. Third, employers have modified not only most employee benefit plans, but labor market practices in general, by giving workers more choice, control, and flexibility. DC-type health benefits have existed as cafeteria plans since the 1980s. A cafeteria plan gives each employee the opportunity to determine the allocation of his or her total compensation (within employer-defined limits) among various employee benefits (primarily retirement or health). Most types of DC health benefits currently being discussed could be provided within the existing employment-based health insurance system, with or without the use of cafeteria plans. They could also allow employees to purchase health insurance directly from insurers, or they could drive new technologies and new forms of risk pooling through which health care services are provided and financed. 

  7. Atmospheric refraction : a history

    NARCIS (Netherlands)

    Lehn, WH; van der Werf, S

    2005-01-01

    We trace the history of atmospheric refraction from the ancient Greeks up to the time of Kepler. The concept that the atmosphere could refract light entered Western science in the second century B.C. Ptolemy, 300 years later, produced the first clearly defined atmospheric model, containing air of

  8. Defining the "normal" postejaculate urinalysis.

    Science.gov (United States)

    Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark

    2012-01-01

    Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL) and higher sperm concentrations (112 million vs 56.2 million, P = .011) than infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia and higher sperm counts in PEU. Although infertile men had more sperm in the urine than controls, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.

  9. On Defining Mass

    Science.gov (United States)

    Hecht, Eugene

    2011-01-01

    Though central to any pedagogical development of physics, the concept of mass is still not well understood. Properly defining mass has proven to be far more daunting than contemporary textbooks would have us believe. And yet today the origin of mass is one of the most aggressively pursued areas of research in all of physics. Much of the excitement surrounding the Large Hadron Collider at CERN is associated with discovering the mechanism responsible for the masses of the elementary particles. This paper will first briefly examine the leading definitions, pointing out their shortcomings. Then, utilizing relativity theory, it will propose—for consideration by the community of physicists—a conceptual definition of mass predicated on the more fundamental concept of energy, more fundamental in that everything that has mass has energy, yet not everything that has energy has mass.

  10. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  11. Defining cyber warfare

    Directory of Open Access Journals (Sweden)

    Dragan D. Mladenović

    2012-04-01

    Cyber conflicts represent a new kind of warfare that is technologically developing very rapidly. Such development results in more frequent and more intensive cyber attacks undertaken by states against adversary targets, with a wide range of diverse operations, from information operations to physical destruction of targets. Nevertheless, cyber warfare is waged through the application of the same means, techniques and methods as those used in cyber criminal, terrorism and intelligence activities. Moreover, it has a very specific nature that enables states to covertly initiate attacks against their adversaries. The starting point in defining doctrines, procedures and standards in the area of cyber warfare is determining its true nature. In this paper, a contribution to this effort was made through the analysis of the existing state doctrines and international practice in the area of cyber warfare towards the determination of its nationally acceptable definition.

  12. Defining the mobilome.

    Science.gov (United States)

    Siefert, Janet L

    2009-01-01

    This chapter defines the agents that provide for the movement of genetic material which fuels the adaptive potential of life on our planet. The chapter has been structured to be broadly comprehensive, arbitrarily categorizing the mobilome into four classes: (1) transposons, (2) plasmids, (3) bacteriophage, and (4) self-splicing molecular parasites. Our increasing understanding of the mobilome is as dynamic as the mobilome itself. With continuing discovery, it is clear that nature has not confined these genomic agents of change to neat categories, but rather the classification categories overlap and intertwine. Massive sequencing efforts and their published analyses are continuing to refine our understanding of the extent of the mobilome. This chapter provides a framework to describe our current understanding of the mobilome and a foundation on which appreciation of its impact on genome evolution can be understood.

  13. Software Defined Networking

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius

    Network Service Providers (NSP) often choose to overprovision their networks instead of deploying proper Quality of Service (QoS) mechanisms that allow for traffic differentiation and predictable quality. This tendency of overprovisioning is not sustainable for the simple reason that network resources are limited. Hence, to counteract this trend, current QoS mechanisms must become simpler to deploy and operate, in order to motivate NSPs to employ QoS techniques instead of overprovisioning. Software Defined Networking (SDN) represents a paradigm shift in the way telecommunication and data … generic perspective (e.g. service provisioning speed, resources availability). As a result, new mechanisms for providing QoS are proposed, solutions for SDN-specific QoS challenges are designed and tested, and new network management concepts are prototyped, all aiming to improve QoS for network services…

  14. Histories electromagnetism

    International Nuclear Information System (INIS)

    Burch, Aidan

    2004-01-01

    Working within the HPO (History Projection Operator) Consistent Histories formalism, we follow the work of Savvidou on (scalar) field theory [J. Math. Phys. 43, 3053 (2002)] and that of Savvidou and Anastopoulos on (first-class) constrained systems [Class. Quantum Grav. 17, 2463 (2000)] to write a histories theory (both classical and quantum) of Electromagnetism. We focus particularly on the foliation-dependence of the histories phase space/Hilbert space and the action thereon of the two Poincaré groups that arise in histories field theory. We quantize in the spirit of the Dirac scheme for constrained systems

  15. The history, development and the present status of the radon measurement programme in the United States of America

    International Nuclear Information System (INIS)

    George, A.C.

    2015-01-01

    The US radon measurement programme was begun in the late 1950s by the US Public Health Service in Colorado, New Mexico and Utah during the uranium frenzy. After the 1967 Congressional Hearings on the working conditions in uranium mines, the US Atomic Energy Commission (AEC) was asked to conduct studies in active uranium mines to assess the exposure of the miners on the Colorado Plateau and in New Mexico. From 1967 to 1972, the Health and Safety Laboratory of the US AEC in New York investigated more than 20 uranium mines for radon and radon decay product concentrations and particle size in 4 large uranium mines in New Mexico. In 1970, the US Environmental Protection Agency (EPA) was established and took over some of the AEC radon measurement activities. Between 1975 and 1978, the Environmental Measurements Laboratory of the US Department of Energy conducted the first detailed indoor radon survey in the USA. Later, in 1984, the very high concentrations of radon found in Pennsylvania homes set the wheels in motion and gave birth to the US Radon Industry. The US EPA expanded its involvement in radon issues and assumed an active role by establishing the National Radon Proficiency Program to evaluate the effectiveness of radon measurement and mitigation methods. In 1998, due to limited resources, EPA privatised the radon programme. This paper presents a personal perspective of past events and the current status of the US radon programme. It will present an update on radon health effects, the incidence rate of lung cancer in the USA and the number of radon measurements made from 1988 to 2013 using short-term test methods. More than 23 million measurements were made in the last 25 y, and as a result more than 1.24 million homes were mitigated successfully. It is estimated that <2 % of the radon measurements performed in the USA are made using long-term testing devices. The number of homes above the US action level of 148 Bq m⁻³ (4 pCi l⁻¹) may be ∼8.5 million, because ∼50 million homes were added since 1990.

  16. (Re)Defining Salesperson Motivation

    DEFF Research Database (Denmark)

    Khusainova, Rushana; de Jong, Ad; Lee, Nick

    2018-01-01

    The construct of motivation is one of the central themes in selling and sales management research. Yet, to date no review article exists that surveys the construct (both from an extrinsic and intrinsic motivation context), critically evaluates its current status, examines various key challenges apparent from the extant research, and suggests new research opportunities based on a thorough review of past work. The authors explore how motivation is defined, major theories underpinning motivation, how motivation has historically been measured, and key methodologies used over time. In addition, attention is given to principal drivers and outcomes of salesperson motivation. A summarizing appendix of key articles in salesperson motivation is provided.

  17. The history, development and the present status of the radon measurement programme in the United States of America.

    Science.gov (United States)

    George, A C

    2015-11-01

    The US radon measurement programme was begun in the late 1950s by the US Public Health Service in Colorado, New Mexico and Utah during the uranium frenzy. After the 1967 Congressional Hearings on the working conditions in uranium mines, the US Atomic Energy Commission (AEC) was asked to conduct studies in active uranium mines to assess the exposure of the miners on the Colorado Plateau and in New Mexico. From 1967 to 1972, the Health and Safety Laboratory of the US AEC in New York investigated more than 20 uranium mines for radon and radon decay product concentrations and particle size in 4 large uranium mines in New Mexico. In 1970, the US Environmental Protection Agency (EPA) was established and took over some of the AEC radon measurement activities. Between 1975 and 1978, the Environmental Measurements Laboratory of the US Department of Energy conducted the first detailed indoor radon survey in the USA. Later, in 1984, the very high concentrations of radon found in Pennsylvania homes set the wheels in motion and gave birth to the US Radon Industry. The US EPA expanded its involvement in radon issues and assumed an active role by establishing the National Radon Proficiency Program to evaluate the effectiveness of radon measurement and mitigation methods. In 1998, due to limited resources, EPA privatised the radon programme. This paper presents a personal perspective of past events and the current status of the US radon programme. It will present an update on radon health effects, the incidence rate of lung cancer in the USA and the number of radon measurements made from 1988 to 2013 using short-term test methods. More than 23 million measurements were made in the last 25 y, and as a result more than 1.24 million homes were mitigated successfully. It is estimated that <2 % of the radon measurements performed in the USA are made using long-term testing devices. The number of homes above the US action level of 148 Bq m⁻³ (4 pCi l⁻¹) may be ∼8.5 million, because ∼50 million homes were added since 1990.

  18. 'Just give me the best quality of life questionnaire': the Karnofsky scale and the history of quality of life measurements in cancer trials.

    Science.gov (United States)

    Timmermann, Carsten

    2013-09-01

    To use the history of the Karnofsky Performance Scale as a case study illustrating the emergence of interest in the measurement and standardisation of quality of life; to understand the origins of current-day practices. Articles referring to the Karnofsky scale and quality of life measurements published from the 1940s to the 1990s were identified by searching databases and screening journals, and analysed using close-reading techniques. Secondary literature was consulted to understand the context in which articles were written. The Karnofsky scale was devised for a different purpose than measuring quality of life: as a standardisation device that helped quantify effects of chemotherapeutic agents less easily measurable than survival time. Interest in measuring quality of life only emerged around 1970. When quality of life measurements were increasingly widely discussed in the medical press from the late 1970s onwards, a consensus emerged that the Karnofsky scale was not a very good tool. More sophisticated approaches were developed, but Karnofsky continued to be used. I argue that the scale provided a quick and simple, approximate assessment of the 'soft' effects of treatment by physicians, overlapping but not identical with quality of life.

  19. ‘Just give me the best quality of life questionnaire’: the Karnofsky scale and the history of quality of life measurements in cancer trials

    Science.gov (United States)

    Timmermann, Carsten

    2013-01-01

    Objectives: To use the history of the Karnofsky Performance Scale as a case study illustrating the emergence of interest in the measurement and standardisation of quality of life; to understand the origins of current-day practices. Methods: Articles referring to the Karnofsky scale and quality of life measurements published from the 1940s to the 1990s were identified by searching databases and screening journals, and analysed using close-reading techniques. Secondary literature was consulted to understand the context in which articles were written. Results: The Karnofsky scale was devised for a different purpose than measuring quality of life: as a standardisation device that helped quantify effects of chemotherapeutic agents less easily measurable than survival time. Interest in measuring quality of life only emerged around 1970. Discussion: When quality of life measurements were increasingly widely discussed in the medical press from the late 1970s onwards, a consensus emerged that the Karnofsky scale was not a very good tool. More sophisticated approaches were developed, but Karnofsky continued to be used. I argue that the scale provided a quick and simple, approximate assessment of the ‘soft’ effects of treatment by physicians, overlapping but not identical with quality of life. PMID:23239756

  20. Teleology and Defining Sex.

    Science.gov (United States)

    Gamble, Nathan K; Pruski, Michal

    2018-07-01

    Disorders of sexual differentiation lead to what is often referred to as an intersex state. This state has medical, as well as some legal, recognition. Nevertheless, the question remains whether intersex persons occupy a state in between maleness and femaleness or whether they are truly men or women. To answer this question, another important conundrum needs to be first solved: what defines sex? The answer seems rather simple to most people, yet when morphology does not coincide with haplotypes, and genetics might not correlate with physiology the issue becomes more complex. This paper tackles both issues by establishing where the essence of sex is located and by superimposing that framework onto the issue of the intersex. This is achieved through giving due consideration to the biology of sexual development, as well as through the use of a teleological framework of the meaning of sex. Using a range of examples, the paper establishes that sex cannot be pinpointed to one biological variable but is rather determined by how the totality of one's biology is oriented towards biological reproduction. A brief consideration is also given to the way this situation could be comprehended from a Christian understanding of sex and suffering.

  1. Ranking economic history journals

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Weisdorf, Jacob Louis

    2010-01-01

    This study ranks, for the first time, 12 international academic journals that have economic history as their main topic. The ranking is based on data collected for the year 2007. Journals are ranked using standard citation analysis where we adjust for age, size and self-citation of journals. We also compare the leading economic history journals with the leading journals in economics in order to measure the influence on economics of economic history, and vice versa. With a few exceptions, our results confirm the general idea about which economic history journals are the most influential for economic history, and that, although economic history is quite independent from economics as a whole, knowledge exchange between the two fields is indeed going on.

  2. Ranking Economic History Journals

    DEFF Research Database (Denmark)

    Di Vaio, Gianfranco; Weisdorf, Jacob Louis

    This study ranks - for the first time - 12 international academic journals that have economic history as their main topic. The ranking is based on data collected for the year 2007. Journals are ranked using standard citation analysis where we adjust for age, size and self-citation of journals. We...... also compare the leading economic history journals with the leading journals in economics in order to measure the influence on economics of economic history, and vice versa. With a few exceptions, our results confirm the general idea about what economic history journals are the most influential...... for economic history, and that, although economic history is quite independent from economics as a whole, knowledge exchange between the two fields is indeed going on....

  3. DEFINING SPATIAL VIOLENCE. BUCHAREST AS A STUDY CASE

    Directory of Open Access Journals (Sweden)

    Celia GHYKA

    2015-05-01

    Full Text Available The paper looks at the spatial manifestations of violence, aiming to define the category of spatial violence by focusing on the recent urban history of Bucharest; it establishes links with the longer history of natural and inflicted disasters that defined the city, and it explores the spatial, urban, social and symbolical conflicts that occurred during the last 25 years, pointing at their consequences on the social and urban substance of the city.

  4. SFCOMPO 2.0 - A relational database of spent fuel isotopic measurements, reactor operational histories, and design data

    Science.gov (United States)

    Michel-Sendis, Franco; Martinez-González, Jesus; Gauld, Ian

    2017-09-01

    SFCOMPO-2.0 is a database of experimental isotopic concentrations measured in destructive radiochemical analysis of spent nuclear fuel (SNF) samples. The database includes corresponding design description of the fuel rods and assemblies, relevant operating conditions and characteristics of the host reactors necessary for modelling and simulation. Aimed at establishing a thorough, reliable, and publicly available resource for code and data validation of safety-related applications, SFCOMPO-2.0 is developed and maintained by the OECD Nuclear Energy Agency (NEA). The SFCOMPO-2.0 database is a Java application which is downloadable from the NEA website.

  5. SFCOMPO 2.0 – A relational database of spent fuel isotopic measurements, reactor operational histories, and design data

    Directory of Open Access Journals (Sweden)

    Michel-Sendis Franco

    2017-01-01

    Full Text Available SFCOMPO-2.0 is a database of experimental isotopic concentrations measured in destructive radiochemical analysis of spent nuclear fuel (SNF) samples. The database includes corresponding design description of the fuel rods and assemblies, relevant operating conditions and characteristics of the host reactors necessary for modelling and simulation. Aimed at establishing a thorough, reliable, and publicly available resource for code and data validation of safety-related applications, SFCOMPO-2.0 is developed and maintained by the OECD Nuclear Energy Agency (NEA). The SFCOMPO-2.0 database is a Java application which is downloadable from the NEA website.

  6. History in a quantum world

    International Nuclear Information System (INIS)

    Squires, E.J.

    1992-01-01

    The difficulty of defining history in the context of orthodox quantum theory is discussed. A possible definition, involving the concept of conscious awareness, and using backwards unitary evolution, is described. Reasons are given why a demonstration that particular things happened with 100% certainty does not always give a history that is Lorentz invariant. (author). 16 refs

  7. Defining an emerging disease.

    Science.gov (United States)

    Moutou, F; Pastoret, P-P

    2015-04-01

    Defining an emerging disease is not straightforward, as there are several different types of disease emergence. For example, there can be a 'real' emergence of a brand new disease, such as the emergence of bovine spongiform encephalopathy in the 1980s, or a geographic emergence in an area not previously affected, such as the emergence of bluetongue in northern Europe in 2006. In addition, disease can emerge in species formerly not considered affected, e.g. the emergence of bovine tuberculosis in wildlife species since 2000 in France. There can also be an unexpected increase of disease incidence in a known area and a known species, or there may simply be an increase in our knowledge or awareness of a particular disease. What all these emerging diseases have in common is that human activity frequently has a role to play in their emergence. For example, bovine spongiform encephalopathy very probably emerged as a result of changes in the manufacturing of meat-and-bone meal, bluetongue was able to spread to cooler climes as a result of uncontrolled trade in animals, and a relaxation of screening and surveillance for bovine tuberculosis enabled the disease to re-emerge in areas that had been able to drastically reduce the number of cases. Globalisation and population growth will continue to affect the epidemiology of diseases in years to come and ecosystems will continue to evolve. Furthermore, new technologies such as metagenomics and high-throughput sequencing are identifying new microorganisms all the time. Change is the one constant, and diseases will continue to emerge, and we must consider the causes and different types of emergence as we deal with these diseases in the future.

  8. Long-Term Deflection Prediction from Computer Vision-Measured Data History for High-Speed Railway Bridges

    Directory of Open Access Journals (Sweden)

    Jaebeom Lee

    2018-05-01

    Full Text Available Management of the vertical long-term deflection of a high-speed railway bridge is a crucial factor to guarantee traffic safety and passenger comfort. Therefore, there have been efforts to predict the vertical deflection of a railway bridge based on physics-based models representing various influential factors to vertical deflection such as concrete creep and shrinkage. However, it is not an easy task because the vertical deflection of a railway bridge generally involves several sources of uncertainty. This paper proposes a probabilistic method that employs a Gaussian process to construct a model to predict the vertical deflection of a railway bridge based on actual vision-based measurement and temperature. To deal with the sources of uncertainty which may cause prediction errors, a Gaussian process is modeled with multiple kernels and hyperparameters. Once the hyperparameters are identified through the Gaussian process regression using training data, the proposed method provides a 95% prediction interval as well as a predictive mean about the vertical deflection of the bridge. The proposed method is applied to an arch bridge under operation for high-speed trains in South Korea. The analysis results obtained from the proposed method show good agreement with the actual measurement data on the vertical deflection of the example bridge, and the prediction results can be utilized for decision-making on railway bridge maintenance.
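    The modelling approach this record describes (a Gaussian process whose hyperparameters are identified from training data, yielding a predictive mean and a 95% prediction interval) can be sketched compactly. The code below is a generic illustration on synthetic temperature/deflection data, not the paper's model or measurements; the RBF kernel, length scale, and noise level are assumptions chosen for the sketch.

```python
import numpy as np

def rbf(a, b, length_scale):
    """Squared-exponential (RBF) kernel matrix between 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, length_scale=8.0, noise=0.05):
    """GP regression with an RBF kernel plus white measurement noise.
    Returns the predictive mean and a 95% prediction interval."""
    K = rbf(x_train, x_train, length_scale) + noise**2 * np.eye(x_train.size)
    Ks = rbf(x_test, x_train, length_scale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v**2, axis=0) + noise**2  # RBF prior variance is 1
    std = np.sqrt(np.maximum(var, 1e-12))
    return mean, mean - 1.96 * std, mean + 1.96 * std

# Synthetic stand-in for vision-measured deflection driven by temperature.
rng = np.random.default_rng(1)
temp = np.linspace(-5.0, 30.0, 80)
defl = 0.4 * np.sin(temp / 8.0) + rng.normal(0.0, 0.05, temp.size)

mean, lo, hi = gp_predict(temp, defl, temp)
coverage = np.mean((defl >= lo) & (defl <= hi))  # should be near 0.95
```

In a real application the length scale and noise would be fitted by maximising the marginal likelihood, as the paper does for its multiple kernels and hyperparameters.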

  9. Long-Term Deflection Prediction from Computer Vision-Measured Data History for High-Speed Railway Bridges.

    Science.gov (United States)

    Lee, Jaebeom; Lee, Kyoung-Chan; Lee, Young-Joo

    2018-05-09

    Management of the vertical long-term deflection of a high-speed railway bridge is a crucial factor to guarantee traffic safety and passenger comfort. Therefore, there have been efforts to predict the vertical deflection of a railway bridge based on physics-based models representing various influential factors to vertical deflection such as concrete creep and shrinkage. However, it is not an easy task because the vertical deflection of a railway bridge generally involves several sources of uncertainty. This paper proposes a probabilistic method that employs a Gaussian process to construct a model to predict the vertical deflection of a railway bridge based on actual vision-based measurement and temperature. To deal with the sources of uncertainty which may cause prediction errors, a Gaussian process is modeled with multiple kernels and hyperparameters. Once the hyperparameters are identified through the Gaussian process regression using training data, the proposed method provides a 95% prediction interval as well as a predictive mean about the vertical deflection of the bridge. The proposed method is applied to an arch bridge under operation for high-speed trains in South Korea. The analysis results obtained from the proposed method show good agreement with the actual measurement data on the vertical deflection of the example bridge, and the prediction results can be utilized for decision-making on railway bridge maintenance.

  10. Otolith oxygen isotopes measured by high-precision secondary ion mass spectrometry reflect life history of a yellowfin sole (Limanda aspera).

    Science.gov (United States)

    Matta, Mary Elizabeth; Orland, Ian J; Ushikubo, Takayuki; Helser, Thomas E; Black, Bryan A; Valley, John W

    2013-03-30

    The oxygen isotope ratio (δ¹⁸O value) of aragonite fish otoliths is dependent on the temperature and the δ¹⁸O value of the ambient water and can thus reflect the environmental history of a fish. Secondary ion mass spectrometry (SIMS) offers a spatial-resolution advantage over conventional acid-digestion techniques for stable isotope analysis of otoliths, especially given their compact nature. High-precision otolith δ¹⁸O analysis was conducted with an IMS-1280 ion microprobe to investigate the life history of a yellowfin sole (Limanda aspera), a Bering Sea species known to migrate ontogenetically. The otolith was cut transversely through its core and one half was roasted to eliminate organic contaminants. Values of δ¹⁸O were measured in 10-µm spots along three transects (two in the roasted half, one in the unroasted half) from the core toward the edge. Otolith annual growth zones were dated using the dendrochronology technique of crossdating. Measured values of δ¹⁸O ranged from 29.0 to 34.1‰ (relative to Vienna Standard Mean Ocean Water). Ontogenetic migration from shallow to deeper waters was reflected in generally increasing δ¹⁸O values from age-0 to approximately age-7 and subsequent stabilization after the expected onset of maturity at age-7. Cyclical variations of δ¹⁸O values within juvenile otolith growth zones, up to 3.9‰ in magnitude, were caused by a combination of seasonal changes in the temperature and the δ¹⁸O value of the ambient water. The ion microprobe produced a high-precision and high-resolution record of the relative environmental conditions experienced by a yellowfin sole that was consistent with population-level studies of ontogeny. Furthermore, this study represents the first time that crossdating has been used to ensure the dating accuracy of δ¹⁸O measurements in otoliths. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Bayesian importance parameter modeling of misaligned predictors: soil metal measures related to residential history and intellectual disability in children.

    Science.gov (United States)

    Onicescu, Georgiana; Lawson, Andrew B; McDermott, Suzanne; Aelion, C Marjorie; Cai, Bo

    2014-09-01

    In this paper, we propose a novel spatial importance parameter hierarchical logistic regression modeling approach that includes measurement error from misalignment. We apply this model to study the relationship between the estimated concentration of soil metals at the residence of mothers and the development of intellectual disability (ID) in their children. The data consist of monthly computerized claims data about the prenatal experience of pregnant women living in nine areas within South Carolina and insured by Medicaid during January 1, 1996 and December 31, 2001 and the outcome of ID in their children during early childhood. We excluded mother-child pairs if the mother moved to an unknown location during pregnancy. We identified an association of the ID outcome with arsenic (As) and mercury (Hg) concentration in soil during pregnancy, controlling for infant sex, maternal race, mother's age, and gestational weeks at delivery. There is some indication that Hg has a slightly higher importance in the third and fourth months of pregnancy, while As has a more uniform effect over all the months with a suggestion of a slight increase in risk in later months.

  12. Intellectual History

    DEFF Research Database (Denmark)

    In the 5 Questions book series, this volume presents a range of leading scholars in Intellectual History and the History of Ideas through their answers to a brief questionnaire. Respondents include Michael Friedman, Jacques le Goff, Hans Ulrich Gumbrecht, Jonathan Israel, Phiip Pettit, John Pocock...

  13. Well-defined critical association concentration and rapid adsorption at the air/water interface of a short amphiphilic polymer, amphipol A8-35: a study by Förster resonance energy transfer and dynamic surface tension measurements.

    Science.gov (United States)

    Giusti, Fabrice; Popot, Jean-Luc; Tribet, Christophe

    2012-07-17

    Amphipols (APols) are short amphiphilic polymers designed to handle membrane proteins (MPs) in aqueous solutions as an alternative to small surfactants (detergents). APols adsorb onto the transmembrane, hydrophobic surface of MPs, forming small, water-soluble complexes, in which the protein is biochemically stabilized. At variance with MP/detergent complexes, MP/APol ones remain stable even at extreme dilutions. Pure APol solutions self-associate into well-defined micelle-like globules comprising a few APol molecules, a rather unusual behavior for amphiphilic polymers, which typically form ill-defined assemblies. The best characterized APol to date, A8-35, is a random copolymer of acrylic acid, isopropylacrylamide, and octylacrylamide. In the present work, the concentration threshold for self-association of A8-35 in salty buffer (NaCl 100 mM, Tris/HCl 20 mM, pH 8.0) has been studied by Förster resonance energy transfer (FRET) measurements and tensiometry. In a 1:1 mol/mol mixture of APols grafted with either rhodamine or 7-nitro-1,2,3-benzoxadiazole, the FRET signal as a function of A8-35 concentration is essentially zero below a threshold concentration of 0.002 g·L⁻¹ and increases linearly with concentration above this threshold. This indicates that assembly takes place in a narrow concentration interval around 0.002 g·L⁻¹. The surface tension decreases regularly with concentration until a threshold of ca. 0.004 g·L⁻¹, beyond which it reaches a plateau at ca. 30 mN·m⁻¹. Within experimental uncertainties, the two techniques thus yield a comparable estimate of the critical self-assembly concentration. The kinetics of variation of the surface tension was analyzed by dynamic surface tension measurements in the time window 10 ms-100 s. The rate of surface tension decrease was similar in solutions of A8-35 and of the anionic surfactant sodium dodecylsulfate when both compounds were at a similar molar concentration of n-alkyl moieties. Overall, the
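    The threshold analysis described in this record (a FRET signal that is essentially zero below the critical association concentration and rises linearly above it) amounts to fitting a hinge model and locating its break point. The sketch below illustrates the idea on synthetic data; the grid, slope, and noise values are illustrative assumptions, not the paper's measurements.

```python
import numpy as np

def fit_cac(conc, signal, candidates):
    """Fit a hinge model -- signal = 0 below the CAC, linear above it --
    and return the candidate break point with the smallest squared error."""
    best_cac, best_sse = None, np.inf
    for cac in candidates:
        x = np.maximum(conc - cac, 0.0)  # hinge predictor
        denom = np.dot(x, x)
        slope = np.dot(x, signal) / denom if denom > 0 else 0.0
        sse = np.sum((signal - slope * x) ** 2)
        if sse < best_sse:
            best_cac, best_sse = cac, sse
    return best_cac

# Synthetic FRET-style data: flat at zero below 0.002 g/L, linear above.
rng = np.random.default_rng(0)
conc = np.linspace(0.0, 0.01, 50)          # concentration, g/L
signal = np.maximum(conc - 0.002, 0) * 400 + rng.normal(0, 0.02, conc.size)

cac = fit_cac(conc, signal, candidates=np.linspace(0.0005, 0.005, 91))
```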

  14. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H(R)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  15. A history of the 2014 Minute 319 environmental pulse flow as documented by field measurements and satellite imagery

    Science.gov (United States)

    Nelson, Steven M.; Ramirez-Hernandez, Jorge; Rodriguez-Burgeueno, J. Eliana; Milliken, Jeff; Kennedy, Jeffrey R.; Zamora-Arroyo, Francisco; Schlatter, Karen; Santiago-Serrano, Edith; Carrera-Villa, Edgar

    2017-01-01

    As provided in Minute 319 of the U.S.-Mexico Water Treaty of 1944, a pulse flow of approximately 132 million cubic meters (mcm) was released to the riparian corridor of the Colorado River Delta over an eight-week period that began March 23, 2014 and ended May 18, 2014. Peak flows were released in the early part of the pulse to simulate a spring flood, with approximately 101.7 mcm released at Morelos Dam on the U.S.-Mexico border. The remainder of the pulse flow water was released to the riparian corridor via Mexicali Valley irrigation spillway canals, with 20.9 mcm released at Km 27 Spillway (41 km below Morelos Dam) and 9.3 mcm released at Km 18 Spillway (78 km below Morelos Dam). We used sequential satellite images, overflights, ground observations, water discharge measurements, and automated temperature, river stage and water quality loggers to document and describe the progression of pulse flow water through the study area. The rate of advance of the wetted front was slowed by infiltration and high channel roughness as the pulse flow crossed more than 40 km of dry channel which was disconnected from underlying groundwater and partially overgrown with salt cedar. High lag time and significant attenuation of flow resulted in a changing hydrograph as the pulse flow progressed to the downstream delivery points; two peak flows occurred in some lower reaches. The pulse flow advanced more than 120 km downstream from Morelos Dam to reach the Colorado River estuary at the northern end of the Gulf of California.

  16. Family History

    Science.gov (United States)

    Your family history includes health information about you and your close relatives. Families have many factors in common, including their genes, ... as heart disease, stroke, and cancer. Having a family member with a disease raises your risk, but ...

  17. Selection History Modulates Working Memory Capacity

    Directory of Open Access Journals (Sweden)

    Bo-Cheng Kuo

    2016-10-01

    Full Text Available Recent studies have shown that past selection history affects the allocation of attention on target selection. However, it is unclear whether context-driven selection history can modulate the efficacy of attention allocation on working memory (WM) representations. This study tests the influences of selection history on WM capacity. A display of one item (low load) or three/four items (high load) was shown for the participants to hold in WM in a delayed response task. Participants then judged whether a probe item was in the memory display or not. Selection history was defined as the number of items attended across trials in the task context within a block, manipulated by the stimulus set-size in the contexts with fewer possible stimuli (4-item or 5-item context) or more possible stimuli (8-item or 9-item context) from which the memorized content was selected. The capacity measure (i.e. the K parameter) was estimated to reflect the number of items that can be held in WM. Across four behavioral experiments, the results revealed that the capacity was significantly reduced in the context with more possible stimuli relative to the context with fewer possible stimuli. Moreover, the reduction in capacity was significant for high WM load and not observed when the focus was on only a single item. Together, these findings indicate that context-driven selection history and focused attention influence WM capacity.
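    The K parameter mentioned in this record is commonly estimated with Cowan's formula for single-probe change-detection tasks, K = N × (hit rate − false-alarm rate). The abstract does not give the authors' exact estimator, so the snippet below is a standard-formula sketch rather than their code.

```python
def cowan_k(hit_rate, false_alarm_rate, set_size):
    """Cowan's K estimate of working-memory capacity for a single-probe
    change-detection task: K = N * (H - FA)."""
    return set_size * (hit_rate - false_alarm_rate)

# Example: 90% hits and 10% false alarms at set size 4 gives K = 3.2,
# i.e. roughly three items held in working memory.
k = cowan_k(hit_rate=0.9, false_alarm_rate=0.1, set_size=4)
```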

  18. Mathematics and history: history and analysis epistemology: from exhaustion method to defined integral

    Directory of Open Access Journals (Sweden)

    Mario Mandrone

    2015-06-01

    Full Text Available The creation of the calculus (differential in the terminology of Leibniz, fluxional in that of Newton) is the event that, in the second half of the seventeenth century, marked, in a sense, the transition from classical to modern mathematics. The aim of this work is a historical and epistemological analysis of the question of rigour and of the "metaphysics" of the infinitesimal calculus, one that takes account of the methods of the ancients (e.g. Archimedes' method of exhaustion) as well as of the interpretations of Leibniz and Newton and of their successors. The problem of searching for a sure foundation on which to base the calculus, glimpsed by D'Alembert in the theory of limits and taken up by Lagrange with the theory of infinite series and that of derived functions, found in Cauchy the pioneer of a new way of seeking rigour in analysis. Cauchy's approach was made rigorous by Weierstrass in the second half of the 1800s with the epsilon-delta definition of limit, which in turn rests on definitions concerning the real numbers. In this sense one speaks of the "arithmetisation" of analysis. Keywords: method of exhaustion; method of mechanical theorems; sublime calculus; fluents and fluxions; theory of integration; hyperreal numbers; non-standard analysis.

  19. Journal of East African Natural History

    African Journals Online (AJOL)

    The Journal of East African Natural History is published jointly by the East Africa Natural History Society and the National Museums of Kenya. The Journal publishes papers and notes in the field of natural history, broadly defined as the study of organisms in their natural state, relevant to the eastern African region.

  20. Environmental history

    DEFF Research Database (Denmark)

    Pawson, Eric; Christensen, Andreas Aagaard

    2017-01-01

    Environmental history is an interdisciplinary pursuit that has developed as a form of conscience to counter an increasingly powerful, forward-looking liberal theory of the environment. It deals with the relations between environmental ideas and materialities, from the work of the geographers George...... risks”. These are exposed by environmental history’s focus on long-run analysis and its narrative form that identifies the stories that we tell ourselves about nature. How a better understanding of past environmental transformations helps to analyse society and agency, and what this can mean...... for solutions and policies, is the agenda for an engaged environmental history from now on....

  1. Ildens historier

    DEFF Research Database (Denmark)

    Lassen, Henrik Roesgaard

    have been written by Andersen. In several chapters the curiously forgotten history of fire-lighting technology is outlined, and it is demonstrated that "Tællelyset" is written by a person with a modern perspective on how to light a candle - among other things. The central argument in the book springs...... from a point-by-point tracing of 'the origins and history' of Hans Christian Andersen's famous fairy tales. Where did the come from? How did they become the iconic texts that we know today? On this background it becomes quite clear that "Tællelyset" is a modern pastiche and not a genuine Hans Christian...

  2. Business History

    DEFF Research Database (Denmark)

    Hansen, Per H.

    2012-01-01

    This article argues that a cultural and narrative perspective can enrich the business history field, encourage new and different questions and answers, and provide new ways of thinking about methods and empirical material. It discusses what culture is and how it relates to narratives. Taking...

  3. LCA History

    DEFF Research Database (Denmark)

    Bjørn, Anders; Owsianiak, Mikołaj; Molin, Christine

    2018-01-01

    The idea of LCA was conceived in the 1960s when environmental degradation and in particular the limited access to resources started becoming a concern. This chapter gives a brief summary of the history of LCA since then with a focus on the fields of methodological development, application...

  4. Rewriting History.

    Science.gov (United States)

    Ramirez, Catherine Clark

    1994-01-01

    Suggests that the telling of vivid stories can help engage elementary students' emotions and increase the chances of fostering an interest in Texas history. Suggests that incorporating elements of the process approach to writing can merge with social studies objectives in creating a curriculum for wisdom. (RS)

  5. Measuring $\

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters, sampling the long baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations in the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a νμ disappearance analysis, and presents a new sophisticated fitting software framework, which employs a maximum likelihood method to extract the best fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors, and is capable of fitting multiple event samples (defined by the selections applied) in parallel, and any combination of energy dependent and independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset if the true mixing angle was non-maximal was also investigated, with an improved neutrino energy reconstruction for very low energy events. The best fit oscillation parameters, obtained by the fitting software and incorporating resolution information, were: |Δm²| = 2.32 +0.12/−0.08 × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world best measurement of the atmospheric neutrino mass
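    The disappearance measurement rests on the standard two-flavour survival probability P(νμ→νμ) = 1 − sin²(2θ)·sin²(1.267·Δm²·L/E), with Δm² in eV², L in km and E in GeV. The sketch below evaluates it with the record's best-fit |Δm²| and the MINOS 735 km baseline; maximal mixing is assumed purely for illustration.

```python
import math

def p_survival(energy_gev, l_km=735.0, dm2_ev2=2.32e-3, sin2_2theta=1.0):
    """Two-flavour muon-neutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.267 * dm2 * L / E)."""
    arg = 1.267 * dm2_ev2 * l_km / energy_gev
    return 1.0 - sin2_2theta * math.sin(arg) ** 2

# First oscillation maximum: the phase equals pi/2, so survival is minimal.
e_max = 1.267 * 2.32e-3 * 735.0 / (math.pi / 2)  # roughly 1.4 GeV
```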

  6. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  7. QUALITY CONTROL ANALYSIS OF S2HD 9C BARBER BOGIE PRODUCTION USING THE LEAN SIX SIGMA METHOD WITH THE DMAIC (DEFINE, MEASURE, ANALYZE, IMPROVE, CONTROL) APPROACH: A CASE STUDY AT PT BARATA INDONESIA (PERSERO) GRESIK

    Directory of Open Access Journals (Sweden)

    Suparno Suparno

    2017-01-01

    Full Text Available This study aims to analyze the quality control and the waste occurring in the production of the S2HD 9C side frame using the lean six sigma method with the DMAIC approach at PT Barata Indonesia (Persero) Gresik. The S2HD 9C side frame is part of the S2HD 9C barber bogie product. The six sigma stage of the study focuses on defect analysis and the sigma level achieved, while the lean six sigma stage focuses on waste analysis and the sigma level achieved. Two types of data are used, primary and secondary, both qualitative and quantitative. Primary data were obtained from field observation, while secondary data were obtained from document review. The study was carried out following the DMAIC (define, measure, analyze, improve, control) approach. The six sigma analysis identified five defect types in the period from January to April 2016: trapped gas 58.18%, broken core 21.82%, sand drop 12.73%, brake mould 5.46%, and misplaced core 1.82%. The sigma levels achieved for each defect are as follows: trapped gas 1235.71 DPMO with a sigma level of 4.52σ, broken core 463.392 DPMO with a sigma level of 4.81σ, sand drop 270.312 DPMO with a sigma level of 4.96σ, brake mould 115.848 DPMO with a sigma level of 5.18σ, and misplaced core 38.616 DPMO with a sigma level of 5.45σ. The lean six sigma stage identified four waste types: defective product, waiting time (delay), transportation, and excess processing. Their values are as follows: waste from defective product 6890 DPMO with a sigma level of 3.96σ and a process capability of 1.31 with a sigma level of 3.94σ; waste from waiting time (delay) a process capability of 1 with a sigma level of 3σ; waste from transportation a process capability of 1.31 with a sigma level of
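    The DPMO figures and sigma levels quoted in this record are related by the conventional conversion σ = Φ⁻¹(1 − DPMO/10⁶) + 1.5, where the 1.5σ term is the customary long-term process shift. A small sketch assuming this standard convention, which reproduces the record's numbers:

```python
from statistics import NormalDist

def sigma_level(dpmo, shift=1.5):
    """Convert defects-per-million-opportunities (DPMO) to a short-term
    sigma level using the conventional 1.5-sigma long-term shift."""
    return NormalDist().inv_cdf(1.0 - dpmo / 1_000_000) + shift

# Trapped-gas defects: 1235.71 DPMO corresponds to about 4.52 sigma,
# matching the value reported in the abstract.
trapped_gas_sigma = sigma_level(1235.71)
```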

  8. Business History as Cultural History

    DEFF Research Database (Denmark)

    Lunde Jørgensen, Ida

    The paper engages with the larger question of how cultural heritage becomes taken for granted and offers a complementary view to the anthropological 'Copenhagen School' of business history, one that draws attention to the way corporate wealth directly and indirectly influences the culture available...

  9. River history.

    Science.gov (United States)

    Vita-Finzi, Claudio

    2012-05-13

    During the last half century, advances in geomorphology-abetted by conceptual and technical developments in geophysics, geochemistry, remote sensing, geodesy, computing and ecology-have enhanced the potential value of fluvial history for reconstructing erosional and depositional sequences on the Earth and on Mars and for evaluating climatic and tectonic changes, the impact of fluvial processes on human settlement and health, and the problems faced in managing unstable fluvial systems. This journal is © 2012 The Royal Society

  10. Environmental History

    OpenAIRE

    Kearns, Gerard

    2004-01-01

    There was a time when almost all Western geography could be termed environmental history. In the late nineteenth century, physical geographers explained landscapes by describing how they had evolved. Likewise, human geographers saw society as shaped by the directing hands of the environment. By the 1960s this had very much changed. Process studies shortened the temporal framework in geographical explanation and cut the cord between nature and society. Now, physical and human...

  11. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

    Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  12. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

    Full Text Available Theoretical approaches to elections defining develop the nature, essence and content of elections, help to determine their place and a role as one of the major national law institutions in democratic system.

  13. Defining Modules, Modularity and Modularization

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth; Pedersen, Per Erik Elgård

    The paper describes the evolution of the concept of modularity in a historical perspective. The main reasons for modularity are: create variety, utilize similarities, and reduce complexity. The paper defines the terms: module, modularity, and modularization.

  14. THE BURSTY STAR FORMATION HISTORIES OF LOW-MASS GALAXIES AT 0.4 < z < 1 REVEALED BY STAR FORMATION RATES MEASURED FROM H β AND FUV

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Yicheng; Faber, S. M.; Koo, David C.; Krumholz, Mark R.; Barro, Guillermo; Yesuf, Hassen [UCO/Lick Observatory, Department of Astronomy and Astrophysics, University of California, Santa Cruz, CA (United States); Rafelski, Marc; Gardner, Jonathan P.; Pacifici, Camilla [Goddard Space Flight Center, Code 665, Greenbelt, MD (United States); Trump, Jonathan R. [Department of Astronomy and Astrophysics and Institute for Gravitation and the Cosmos, Pennsylvania State University, University Park, PA (United States); Willner, S. P. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA (United States); Amorín, Ricardo [INAF-Osservatorio Astronomico di Roma, Monte Porzio Catone (Italy); Bell, Eric F. [Department of Astronomy, University of Michigan, Ann Arbor, MI (United States); Gawiser, Eric [Department of Physics and Astronomy, Rutgers University, New Brunswick, NJ (United States); Hathi, Nimish P. [Aix Marseille Université, CNRS, LAM (Laboratoire d’Astrophysique de Marseille) UMR 7326, Marseille (France); Koekemoer, Anton M.; Ravindranath, Swara [Space Telescope Science Institute, Baltimore, MD (United States); Pérez-González, Pablo G. [Departamento de Astrofísica, Facultad de CC. Físicas, Universidad Complutense de Madrid, E-28040 Madrid (Spain); Reddy, Naveen [Department of Physics and Astronomy, University of California, Riverside, CA (United States); Teplitz, Harry I., E-mail: ycguo@ucolick.org [Infrared Processing and Analysis Center, Caltech, Pasadena, CA 91125 (United States)

    2016-12-10

    We investigate the burstiness of star formation histories (SFHs) of galaxies at 0.4 < z < 1 by using the ratio of star formation rates (SFRs) measured from Hβ and FUV (1500 Å) (the Hβ-to-FUV ratio). Our sample contains 164 galaxies down to a stellar mass (M*) of 10^8.5 M☉ in the CANDELS GOODS-N region, where Team Keck Redshift Survey Keck/DEIMOS spectroscopy and Hubble Space Telescope/WFC3 F275W images from CANDELS and the Hubble Deep UV Legacy Survey are available. When the ratio of Hβ- and FUV-derived SFRs is measured, dust extinction correction is negligible (except for very dusty galaxies) with the Calzetti attenuation curve. The Hβ-to-FUV ratio of our sample increases with M* and SFR. The median ratio is ~0.7 at M* ~ 10^8.5 M☉ (or SFR ~ 0.5 M☉ yr^-1) and increases to ~1 at M* ~ 10^10 M☉ (or SFR ~ 10 M☉ yr^-1). At M* < 10^9.5 M☉, our median Hβ-to-FUV ratio is lower than that of local galaxies at the same M*, implying a redshift evolution. A bursty SFH on a timescale of a few tens of megayears on galactic scales provides a plausible explanation for our results, and the importance of the burstiness increases as M* decreases. Due to sample selection effects, our Hβ-to-FUV ratio may be an upper limit on the true value for a complete sample, which strengthens our conclusions. Other models, e.g., a non-universal initial mass function or stochastic star formation on star-cluster scales, are unable to plausibly explain our results.

  15. DEFINED CONTRIBUTION PLANS, DEFINED BENEFIT PLANS, AND THE ACCUMULATION OF RETIREMENT WEALTH

    Science.gov (United States)

    Poterba, James; Rauh, Joshua; Venti, Steven; Wise, David

    2010-01-01

    The private pension structure in the United States, once dominated by defined benefit (DB) plans, is currently divided between defined contribution (DC) and DB plans. Wealth accumulation in DC plans depends on the participant's contribution behavior and on financial market returns, while accumulation in DB plans is sensitive to a participant's labor market experience and to plan parameters. This paper simulates the distribution of retirement wealth under representative DB and DC plans. It uses data from the Health and Retirement Study (HRS) to explore how asset returns, earnings histories, and retirement plan characteristics contribute to the variation in retirement wealth outcomes. We simulate DC plan accumulation by randomly assigning individuals a share of wages that they and their employer contribute to the plan. We consider several possible asset allocation strategies, with asset returns drawn from the historical return distribution. Our DB plan simulations draw earnings histories from the HRS, and randomly assign each individual a pension plan drawn from a sample of large private and public defined benefit plans. The simulations yield distributions of both DC and DB wealth at retirement. Average retirement wealth accruals under current DC plans exceed average accruals under private sector DB plans, although DC plans are also more likely to generate very low retirement wealth outcomes. The comparison of current DC plans with more generous public sector DB plans is less definitive, because public sector DB plans are more generous on average than their private sector counterparts. PMID:21057597
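    The DC-plan simulation the abstract describes (contribute a share of each year's wage, then compound at returns drawn from a distribution) can be sketched as a small Monte Carlo. This is an illustrative stand-in, not the authors' method: the flat wage path, 9% contribution rate, and normal return parameters are placeholder assumptions, not the HRS data or historical return draws used in the paper.

    ```python
    import random

    def simulate_dc_wealth(wages, contribution_rate, mean_return, sd_return, rng):
        """Accumulate DC-plan wealth: contribute a share of each year's wage,
        then grow the balance at a randomly drawn annual return."""
        balance = 0.0
        for wage in wages:
            balance += contribution_rate * wage
            balance *= 1.0 + rng.gauss(mean_return, sd_return)
        return balance

    rng = random.Random(42)
    wages = [50_000] * 30  # placeholder: flat 30-year earnings history
    outcomes = [simulate_dc_wealth(wages, 0.09, 0.06, 0.17, rng)
                for _ in range(10_000)]
    outcomes.sort()
    # The sorted outcomes give the simulated distribution of retirement wealth,
    # including the low tail the abstract highlights for DC plans.
    print(f"median:  {outcomes[len(outcomes) // 2]:,.0f}")
    print(f"5th pct: {outcomes[len(outcomes) // 20]:,.0f}")
    ```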

  16. Uncovering History for Future History Teachers

    Science.gov (United States)

    Fischer, Fritz

    2010-01-01

    The art of history teaching is at a crossroads. Recent scholarship focuses on the need to change the teaching of history so students can better learn history, and insists that history teachers must move beyond traditional structures and methods of teaching in order to improve their students' abilities to think with history. This article presents…

  17. Defining Plagiarism: A Literature Review

    Directory of Open Access Journals (Sweden)

    Akbar Akbar

    2018-02-01

    Full Text Available Plagiarism has repeatedly occurred in Indonesia, resulting in a focus on such academic misbehavior as a "central issue" in Indonesian higher education. One of the issues in addressing plagiarism in higher education is that there is confusion about how to define it; Indonesian academics seem to have different perceptions when defining plagiarism. This article aims at exploring the issue of plagiarism by helping define it, to address confusion among Indonesian academics. The article applies a literature review, first identifying databases for literature searching and then finding relevant articles. After the required articles were collected, they were synthesized before the findings were presented. This study explores the definition of plagiarism in the context of higher education. It found that plagiarism is defined in relation to criminal acts: the large number of discursive features used positions plagiaristic acts as illegal deeds. The study also found that cultural backgrounds and exposure to plagiarism were influential in defining plagiarism.

  18. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worth reconsidering the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.
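    The software model described above (directed acyclic graphs with random node weights and random edges) can be sketched with a small generator, and scheduling the nodes in topological order is one simple stand-in for a pipelined execution order. This is an illustrative sketch under assumed parameters, not the paper's model or mapping algorithm:

    ```python
    import random
    from collections import deque

    def random_dag(n, edge_prob, rng):
        """Random DAG: edges only go from lower- to higher-numbered nodes,
        which guarantees acyclicity. Node weights model processing cost."""
        weights = [rng.uniform(1.0, 10.0) for _ in range(n)]
        edges = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if rng.random() < edge_prob]
        return weights, edges

    def topological_order(n, edges):
        """Kahn's algorithm: a dependency-respecting (pipeline) order."""
        indegree = [0] * n
        successors = [[] for _ in range(n)]
        for u, v in edges:
            indegree[v] += 1
            successors[u].append(v)
        queue = deque(i for i in range(n) if indegree[i] == 0)
        order = []
        while queue:
            u = queue.popleft()
            order.append(u)
            for v in successors[u]:
                indegree[v] -= 1
                if indegree[v] == 0:
                    queue.append(v)
        return order

    rng = random.Random(0)
    weights, edges = random_dag(8, 0.4, rng)  # 8 nodes, 40% edge density (assumed)
    order = topological_order(8, edges)
    print(order)
    ```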

  19. Defining and Selecting Independent Directors

    Directory of Open Access Journals (Sweden)

    Eric Pichet

    2017-10-01

    Full Text Available Drawing from the Enlightened Shareholder Theory that the author first developed in 2011, this theoretical paper with practical and normative ambitions achieves a better definition of independent director, while improving the understanding of the roles he fulfils on boards of directors. The first part defines constructs like firms, Governance system and Corporate governance, offering a clear distinction between the latter two concepts before explaining the four main missions of a board. The second part defines the ideal independent director by outlining the objective qualities that are necessary and adding those subjective aspects that have turned this into a veritable profession. The third part defines the ideal process for selecting independent directors, based on nominating committees that should themselves be independent. It also includes ways of assessing directors who are currently in function, as well as modalities for renewing their mandates. The paper’s conclusion presents the Paradox of the Independent Director.

  20. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample … in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than in those based on a behavioral definition of lobbying.

  1. The history of a lesson

    DEFF Research Database (Denmark)

    Rasmussen, Mikkel Vedby

    2003-01-01

    The article investigates the concept of lessons in IR. By means of a constructivist critique of the 'lessons literature', the article analyses one of the most important of IR lessons: that of Munich. Examining how the Munich lesson came about, the article shows the praxeological nature of lessons and emphasises the need to study the history of lessons rather than the lessons of history. This approach shows that Munich is the end point of a constitutive history that begins in the failure of the Versailles treaty to create a durable European order following the First World War. The Munich lesson is thus one element of the lesson of Versailles, which is a praxeology that defines how the West is to make peace, and against whom peace must be defended. The lesson of Versailles has been, at least in part, constitutive of the outbreak of the Cold War, and it continues to define the Western conception...

  2. ON DEFINING S-SPACES

    Directory of Open Access Journals (Sweden)

    Francesco Strati

    2013-05-01

    Full Text Available The present work is intended to be an introduction to the Superposition Theory of David Carfì. In particular, I shall depict the meaning of his brand new theory, on the one hand in an informal fashion and on the other hand by giving a formal approach to the algebraic structure of the theory: the S-linear algebra. This kind of structure underpins the notion of S-spaces (or Carfì spaces) by defining both their properties and their nature. Thus I shall define the S-triple as the fundamental principle upon which the S-linear algebra is built up.

  3. Associations among Measures of Sequential Processing in Motor and Linguistics Tasks in Adults with and without a Family History of Childhood Apraxia of Speech: A Replication Study

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H.

    2013-01-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically…

  4. Cygnus History

    International Nuclear Information System (INIS)

    Henderson, David J.; Gignac, Raymond E.; Good, Douglas E.; Hansen, Mark D.; Mitton, Charles V.; Nelson, Daniel S.; Ormond, Eugene C.; Cordova, Steve R.; Molina, Isidro; Smith, John R.; Rose, Evan A.

    2009-01-01

    The Cygnus Dual Beam Radiographic Facility consists of two identical radiographic sources: Cygnus 1 and Cygnus 2. This Radiographic Facility is located in an underground tunnel test area at the Nevada Test Site. The sources were developed to produce high-resolution images for dynamic plutonium experiments. This work will recount and discuss salient maintenance and operational issues encountered during the history of Cygnus. A brief description of Cygnus systems and the rationale for design selections will set the stage for this historical narrative. It is intended to highlight the team-derived solutions for technical problems encountered during extended periods of maintenance and operation. While many of the issues are typical of pulsed power systems, some of the solutions are unique. It is hoped that other source teams will benefit from this presentation, as well as other necessary disciplines (e.g., source users, system architects, facility designers and managers, funding managers, and team leaders)

  5. Environmental history

    DEFF Research Database (Denmark)

    Pawson, Eric; Christensen, Andreas Aagaard

    2017-01-01

    Environmental history is an interdisciplinary pursuit that has developed as a form of conscience to counter an increasingly powerful, forward-looking liberal theory of the environment. It deals with the relations between environmental ideas and materialities, from the work of the geographers George Perkins Marsh, Carl Sauer, and Clarence Glacken, to more recent global-scale assessments of the impact of the "great acceleration" since 1950. Today's "runaway world" paradoxically embraces risk management in an attempt to determine its own future whilst generating a whole new category of "manufactured risks". These are exposed by environmental history's focus on long-run analysis and its narrative form that identifies the stories that we tell ourselves about nature. How a better understanding of past environmental transformations helps to analyse society and agency, and what this can mean...

  6. Defining and Differentiating the Makerspace

    Science.gov (United States)

    Dousay, Tonia A.

    2017-01-01

    Many resources now punctuate the maker movement landscape. However, some schools and communities still struggle to understand this burgeoning movement. How do we define these spaces and differentiate them from previous labs and shops? Through a multidimensional framework, stakeholders should consider how the structure, access, staffing, and tools…

  7. Indico CONFERENCE: Define the Programme

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial you are going to learn how to define the programme of a conference in Indico. The programme of your conference is divided into different "tracks". Tracks represent the subject matter of the conference, such as "Online Computing", "Offline Computing", and so on.

  8. Public History

    Directory of Open Access Journals (Sweden)

    Marta Gouveia de Oliveira Rovai

    2017-04-01

    Full Text Available This article presents the concept and practices of Public History as a new positioning of historical science in dialogue with communication professionals, in the sense of producing and disseminating human experiences. It discusses the origin of the concept of Public History and the different forms of historical education that the use of new technologies (among them the Internet) can provide. The reader is invited to reflect on the possibilities of publicizing and democratizing historical knowledge and culture, expanding the opportunities for production, dissemination, and public access to different forms of experience in time. The article also draws the attention of professionals dealing with History and Communication to the dangers of productions submitted exclusively to the market, which turn the popularization of History into a reinforcement of cultural stigmas. KEYWORDS: Public History; historical education and Communication; democratization and stigmatization.

  9. Defining and Distinguishing Traditional and Religious Terrorism

    OpenAIRE

    Gregg, Heather S.

    2014-01-01

    The article of record may be found at: http://dx.doi.org/10.1080/23296151.2016.1239978 … thus offering few if any policy options for counterterrorism measures. This assumption about religious terrorism stems from two challenges in the literature: disproportionate attention to apocalyptic terrorism, and a lack of distinction between religious terrorism and its secular counterpart. This article, therefore, aims to do four things: define and differentiate religiously motivated terrorism from tr...

  10. Procrastination as a Fast Life History Strategy

    Directory of Open Access Journals (Sweden)

    Bin-Bin Chen

    2016-02-01

    Full Text Available Research has revealed that procrastination, the purposive delay of an intended course of action, is a maladaptive behavior. However, by drawing on an evolutionary life history (LF) approach, the present study proposes that procrastination may be an adaptive fast LF strategy characterized by prioritizing immediate benefits with little regard for long-term consequences. A total of 199 undergraduate students completed measures of procrastination and future orientation and the Mini-K scale, which measures the slow LF strategy. Structural equation modeling revealed that, as predicted, procrastination was negatively associated with a slow LF strategy both directly and indirectly through the mediation of future orientation. These results define the fast LF origin of procrastination.

  11. Life histories in occupational therapy clinical practice.

    Science.gov (United States)

    Frank, G

    1996-04-01

    This article defines and compares several narrative methods used to describe and interpret patients' lives. The biographical methods presented are case histories, life-charts, life histories, life stories, assisted autobiography, hermeneutic case reconstruction, therapeutic emplotment, volitional narratives, and occupational storytelling and story making. Emphasis is placed on the clinician as a collaborator in and interpreter of the patient's life through ongoing interactions and dialogue.

  12. Joint stability characteristics of the ankle complex in female athletes with histories of lateral ankle sprain, part II: clinical experience using arthrometric measurement.

    Science.gov (United States)

    Kovaleski, John E; Heitman, Robert J; Gurchiek, Larry R; Hollis, J M; Liu, Wei; Pearsall, Albert W

    2014-01-01

    This is part II of a 2-part series discussing stability characteristics of the ankle complex. In part I, we used a cadaver model to examine the effects of sectioning the lateral ankle ligaments on anterior and inversion motion and stiffness of the ankle complex. In part II, we wanted to build on and apply these findings to the clinical assessment of ankle-complex motion and stiffness in a group of athletes with a history of unilateral ankle sprain. To examine ankle-complex motion and stiffness in a group of athletes with a reported history of lateral ankle sprain. Cross-sectional study. University research laboratory. Twenty-five female college athletes (age = 19.4 ± 1.4 years, height = 170.2 ± 7.4 cm, mass = 67.3 ± 10.0 kg) with histories of unilateral ankle sprain. All ankles underwent loading with an ankle arthrometer. Ankles were tested bilaterally. The dependent variables were anterior displacement, anterior end-range stiffness, inversion rotation, and inversion end-range stiffness. Anterior displacement of the ankle complex did not differ between the uninjured and sprained ankles (P = .37), whereas ankle-complex rotation was greater for the sprained ankles (P = .03). The sprained ankles had less anterior and inversion end-range stiffness than the uninjured ankles. Altered ankle-complex laxity and end-range stiffness were detected in ankles with histories of sprain. These results indicate the presence of altered mechanical characteristics in the soft tissues of the sprained ankles.

  13. AIDS defining disease: Disseminated cryptococcosis

    Directory of Open Access Journals (Sweden)

    Roshan Anupama

    2006-01-01

    Full Text Available Disseminated cryptococcosis is one of the acquired immune deficiency syndrome defining criteria and the most common cause of life threatening meningitis. Disseminated lesions in the skin manifest as papules or nodules that mimic molluscum contagiosum (MC. We report here a human immunodeficiency virus positive patient who presented with MC like lesions. Disseminated cryptococcosis was confirmed by India ink preparation and histopathology. The condition of the patient improved with amphotercin B.

  14. Balance ability measured with the Berg balance scale: a determinant of fall history in community-dwelling adults with leg amputation.

    Science.gov (United States)

    Wong, Christopher Kevin; Chen, Christine C; Blackwell, Wren M; Rahal, Rana T; Benoy, Stephany A

    2015-01-01

    Falls are common among adults with leg amputations and are associated with balance confidence. But subjective confidence is not equivalent to physical ability. This multivariate analysis of community-dwelling adults with leg amputations examined relationships among individual characteristics, falls, balance ability, and balance confidence. Cross-sectional study. Community-dwelling adults with leg amputations recruited from a support group and a prosthetic clinic. Subjects provided self-reported medical/fall history, prosthetic functional use, and Activities-specific Balance Confidence (ABC) questionnaire data. Balance ability was assessed with the Berg Balance Scale (BBS). Fall incidence was categorized as any fall (one or more) and recurrent falls (more than one). Multivariate logistic regression analyzed relationships within the two fall categories. Cross tabulations and ANOVA analyzed differences among subcategories. Fifty-four subjects (mean age 56.8) with various etiologies, amputation levels, and balance abilities participated. 53.7% had any fall; 25.9% had recurrent falls. Models for both fall categories correctly classified fall history in > 70% of subjects with combinations of the variables ABC, BBS, body mass index, and amputation level. Falls occurred regardless of clinical characteristics. Total BBS and select item scores were independent determinants of fall history. Unlike other balance-impaired populations, adults with leg amputation and better balance ability had greater odds of falling.

  15. How do people define moderation?

    Science.gov (United States)

    vanDellen, Michelle R; Isherwood, Jennifer C; Delose, Julie E

    2016-06-01

    Eating in moderation is considered to be sound and practical advice for weight maintenance or prevention of weight gain. However, the concept of moderation is ambiguous, and the effect of moderation messages on consumption has yet to be empirically examined. The present manuscript examines how people define moderate consumption. We expected that people would define moderate consumption in ways that justified their current or desired consumption rather than view moderation as an objective standard. In Studies 1 and 2, moderate consumption was perceived to involve greater quantities of an unhealthy food (chocolate chip cookies, gummy candies) than perceptions of how much one should consume. In Study 3, participants generally perceived themselves to eat in moderation and defined moderate consumption as greater than their personal consumption. Furthermore, definitions of moderate consumption were related to personal consumption behaviors. Results suggest that the endorsement of moderation messages allows for a wide range of interpretations of moderate consumption. Thus, we conclude that moderation messages are unlikely to be effective messages for helping people maintain or lose weight. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Decoding Galactic Merger Histories

    Directory of Open Access Journals (Sweden)

    Eric F. Bell

    2017-12-01

    Full Text Available Galaxy mergers are expected to influence galaxy properties, yet measurements of individual merger histories are lacking. Models predict that merger histories can be measured using stellar halos and that these halos can be quantified using observations of resolved stars along their minor axis. Such observations reveal that Milky Way-mass galaxies have a wide range of stellar halo properties and show a correlation between their stellar halo masses and metallicities. This correlation agrees with merger-driven models where stellar halos are formed by satellite galaxy disruption. In these models, the largest accreted satellite dominates the stellar halo properties. Consequently, the observed diversity in the stellar halos of Milky Way-mass galaxies implies a large range in the masses of their largest merger partners. In particular, the Milky Way’s low mass halo implies an unusually quiet merger history. We used these measurements to seek predicted correlations between the bulge and central black hole (BH mass and the mass of the largest merger partner. We found no significant correlations: while some galaxies with large bulges and BHs have large stellar halos and thus experienced a major or minor merger, half have small stellar halos and never experienced a significant merger event. These results indicate that bulge and BH growth is not solely driven by merger-related processes.

  17. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Krigslund, Jeppe; Hansen, Jonas; Roetter, Daniel Enrique Lucani

    2015-01-01

    Software Defined Networking (SDN) and Network Coding (NC) are two key concepts in networking that have garnered a large attention in recent years. On the one hand, SDN's potential to virtualize services in the Internet allows a large flexibility not only for routing data, but also to manage … This paper advocates for the use of SDN to bring about future Internet and 5G network services by incorporating network coding (NC) functionalities. The inherent flexibility of both SDN and NC provides a fertile ground to envision more efficient, robust, and secure networking designs, that may also...

  18. Network Coded Software Defined Networking

    DEFF Research Database (Denmark)

    Hansen, Jonas; Roetter, Daniel Enrique Lucani; Krigslund, Jeppe

    2015-01-01

    Software defined networking has garnered large attention due to its potential to virtualize services in the Internet, introducing flexibility in the buffering, scheduling, processing, and routing of data in network routers. SDN breaks the deadlock that has kept Internet network protocols stagnant for decades, while applications and physical links have evolved. This article advocates for the use of SDN to bring about 5G network services by incorporating network coding (NC) functionalities. The latter constitutes a major leap forward compared to the state-of-the-art store and forward Internet paradigm...

  19. Defining Usability of PN Services

    DEFF Research Database (Denmark)

    Nicolajsen, Hanne Westh; Ahola, Titta; Fleury, Alexandre

    In this deliverable usability and user experience are defined in relation to MAGNET Beyond technologies, and it is described how the main MAGNET Beyond concepts can be evaluated through the involvement of users. The concepts include the new "Activity based communication approach" for interacting...... with the MAGNET Beyond system, as well as the core concepts: Personal Network, Personal Network-Federation, Service Discovery, User Profile Management, Personal Network Management, Privacy and Security and Context Awareness. The overall plans for the final usability evaluation are documented based on the present...

  20. Why and how to measure stock market fluctuations? The early history of stock market indices, with special reference to the French case

    OpenAIRE

    Pierre-Cyrille Hautcoeur

    2006-01-01

    Stock market indices are today a vital and daily tool for both economists and actors in the financial world. The multiplication and the very importance given to these indices raise the question of their accuracy and of the reliability of the methods that are used to construct them. We begin an investigation on these questions by studying the early history of these indices. We show that stock market indices appeared in the daily press in the United States at the end of the 19th century; that a...

  1. Expressiveness and definability in circumscription

    Directory of Open Access Journals (Sweden)

    Francicleber Martins Ferreira

    2011-06-01

    Full Text Available We investigate expressiveness and definability issues with respect to minimal models, particularly in the scope of Circumscription. First, we give a proof of the failure of the Löwenheim-Skolem Theorem for Circumscription. Then we show that, if the class of P; Z-minimal models of a first-order sentence is Δ-elementary, then it is elementary. That is, whenever the circumscription of a first-order sentence is equivalent to a first-order theory, then it is equivalent to a finitely axiomatizable one. This means that classes of models of circumscribed theories are either elementary or not Δ-elementary. Finally, using the previous result, we prove that, whenever a relation Pi is defined in the class of P; Z-minimal models of a first-order sentence Φ and whenever such class of P; Z-minimal models is Δ-elementary, then there is an explicit definition ψ for Pi such that the class of P; Z-minimal models of Φ is the class of models of Φ ∧ ψ. In other words, the circumscription of P in Φ with Z varied can be replaced by Φ plus this explicit definition ψ for Pi.

  2. Defining Quality in Undergraduate Education

    Directory of Open Access Journals (Sweden)

    Alison W. Bowers

    2018-01-01

    Full Text Available Objectives: This research brief explores the literature addressing quality in undergraduate education to identify what previous research has said about quality and to offer future directions for research on quality in undergraduate education. Method: We conducted a scoping review to provide a broad overview of existing research. Using targeted search terms in academic databases, we identified and reviewed relevant academic literature to develop emergent themes and implications for future research. Results: The exploratory review of the literature revealed a range of thoughtful discussions and empirical studies attempting to define quality in undergraduate education. Many publications highlighted the importance of including different stakeholder perspectives and presented some of the varying perceptions of quality among different stakeholders. Conclusions: While a number of researchers have explored and written about how to define quality in undergraduate education, there is not a general consensus regarding a definition of quality in undergraduate education. Past research offers a range of insights, models, and data to inform future research. Implication for Theory and/or Practice: We provide four recommendations for future research to contribute to a high quality undergraduate educational experience. We suggest more comprehensive systematic reviews of the literature as a next step.

  3. The Discovery of the Tau Lepton: Part 1, The Early History Through 1975; Part 2, Confirmation of the Discovery and Measurement of Major Properties, 1976--1982

    Science.gov (United States)

    Perl, M. L.

    1994-08-01

    Several previous papers have given the history of the discovery of the {tau} lepton at the Stanford Linear Accelerator Center (SLAC). These papers emphasized (a) the experiments which led to our 1975 publication of the first evidence for the existence of the {tau}, (b) the subsequent experiments which confirmed the existence of the {tau}, and (c) the experiments which elucidated the major properties of the {tau}. That history will be summarized in Part 2 of this talk. In this Part 1, I describe the earlier thoughts and work of myself and my colleagues at SLAC in the 1960's and early 1970's which led to the discovery. I also describe the theoretical and experimental events in particle physics in the 1960's in which our work was immersed. I will also try to describe, for the younger generations of particle physicists, the atmosphere in the 1960's. That was before the elucidation of the quark model of hadrons and before the development of the concept of particle generations. The experimental paths to progress were not as clear as they are today, and we had to cast a wide experimental net.

  4. The discovery of the tau lepton: Part 1, The early history through 1975; Part 2, Confirmation of the discovery and measurement of major properties, 1976--1982

    International Nuclear Information System (INIS)

    Perl, M.L.

    1994-08-01

    Several previous papers have given the history of the discovery of the τ lepton at the Stanford Linear Accelerator Center (SLAC). These papers emphasized (a) the experiments which led to our 1975 publication of the first evidence for the existence of the τ, (b) the subsequent experiments which confirmed the existence of the τ, and (c) the experiments which elucidated the major properties of the τ. That history will be summarized in Part 2 of this talk. In this Part 1, I describe the earlier thoughts and work of myself and my colleagues at SLAC in the 1960's and early 1970's which led to the discovery. I also describe the theoretical and experimental events in particle physics in the 1960's in which our work was immersed. I will also try to describe, for the younger generations of particle physicists, the atmosphere in the 1960's. That was before the elucidation of the quark model of hadrons and before the development of the concept of particle generations. The experimental paths to progress were not as clear as they are today, and we had to cast a wide experimental net.

  5. Miniature EVA Software Defined Radio

    Science.gov (United States)

    Pozhidaev, Aleksey

    2012-01-01

    As NASA embarks upon developing the Next-Generation Extra Vehicular Activity (EVA) Radio for deep space exploration, the demands on EVA battery life will substantially increase. The number of modes and frequency bands required will continue to grow in order to enable efficient and complex multi-mode operations including communications, navigation, and tracking applications. Whether conducting astronaut excursions, communicating to soldiers, or supporting first responders responding to emergency hazards, NASA has developed an innovative, affordable, miniaturized, power-efficient software defined radio that offers unprecedented flexibility. This lightweight, programmable, S-band, multi-service, frequency-agile EVA software defined radio (SDR) supports data, telemetry, voice, and both standard and high-definition video. Features include a modular design and an easily scalable architecture, and the EVA SDR allows for both stationary and mobile battery-powered handheld operations. Currently, the radio is equipped with an S-band RF section. However, its scalable architecture can accommodate multiple RF sections simultaneously to cover multiple frequency bands. The EVA SDR also supports multiple network protocols. It currently implements a Hybrid Mesh Network based on the 802.11s open standard protocol. The radio targets RF channel data rates up to 20 Mbps and can be equipped with a real-time operating system (RTOS) that can be switched off for power-aware applications. The EVA SDR's modular design permits implementation of the same hardware at all network nodes. This approach assures the portability of the same software into any radio in the system. It also brings several benefits to the entire system, including reduced system maintenance, system complexity, and development cost.

  6. Associations among measures of sequential processing in motor and linguistics tasks in adults with and without a family history of childhood apraxia of speech: a replication study.

    Science.gov (United States)

    Button, Le; Peter, Beate; Stoel-Gammon, Carol; Raskind, Wendy H

    2013-03-01

    The purpose of this study was to address the hypothesis that childhood apraxia of speech (CAS) is influenced by an underlying deficit in sequential processing that is also expressed in other modalities. In a sample of 21 adults from five multigenerational families, 11 with histories of various familial speech sound disorders, 3 biologically related adults from a family with familial CAS showed motor sequencing deficits in an alternating motor speech task. Compared with the other adults, these three participants showed deficits in tasks requiring high loads of sequential processing, including nonword imitation, nonword reading and spelling. Qualitative error analyses in real word and nonword imitations revealed group differences in phoneme sequencing errors. Motor sequencing ability was correlated with phoneme sequencing errors during real word and nonword imitation, reading and spelling. Correlations were characterized by extremely high scores in one family and extremely low scores in another. Results are consistent with a central deficit in sequential processing in CAS of familial origin.

  7. Celebrate Women's History.

    Science.gov (United States)

    Leonard, Carolyn M.; Baradar, Mariam

    This teachers' guide to activities celebrating Women's History Month focuses on women whose important contributions have been omitted from history textbooks. Women's History Month grew from a 1977 celebration of Women's History Week and is intended to bring women's history into the school curriculum. International Women's Day, celebrated on March…

  8. Measurement

    NARCIS (Netherlands)

    Boumans, M.; Durlauf, S.N.; Blume, L.E.

    2008-01-01

    Measurement theory takes measurement as the assignment of numbers to properties of an empirical system so that a homomorphism between the system and a numerical system is established. To avoid operationalism, two approaches can be distinguished. In the axiomatic approach it is asserted that if the

  9. Interrupting Life History: The Evolution of Relationship within Research

    Science.gov (United States)

    Hallett, Ronald E.

    2013-01-01

    In this paper the author explores how relationships are defined within the context of constructing a life history. The life history of Benjamin, a homeless young man transitioning to adulthood, is used to illustrate how difficult it is to define the parameters of the research environment. During an "ethically important moment" in the…

  10. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Journal of East African Natural History: Editorial Policies

    African Journals Online (AJOL)

    Focus and Scope. The Journal of East African Natural History is published jointly by the East Africa Natural History Society and the National Museums of Kenya. The Journal publishes papers and notes in the field of natural history, broadly defined as the study of organisms in their natural state, relevant to the eastern African ...

  12. Defining and Assessing Team Skills of Business and Accountancy Students

    Science.gov (United States)

    Alghalith, Nabil; Blum, Michael; Medlock, Amanda; Weber, Sandy

    2004-01-01

    The objectives of the project are (1) to define the skills necessary for students to work effectively with others to achieve common goals, and (2) to develop an assessment instrument to measure student progress toward achieving these skills. The defined skill set will form a basis for common expectations related to team skills that will be shared…

  13. Defining safety goals. 2. Basic Consideration on Defining Safety Goals

    International Nuclear Information System (INIS)

    Hakata, T.

    2001-01-01

    cancer and severe hereditary effects are 10×10⁻²/Sv and 1.3×10⁻²/Sv, respectively. The basic safety goals can be expressed by the complementary cumulative distribution function (CCDF) of dose versus frequencies of events: Pc(C > Cp) ≤ (Cp/Co)^(−α). The aversion factor α is here expressed by the following arbitrary equation, which gives a polynomial curve of order m on a logarithmic plane: α = a + b(log(Cp/Co))^m, where Pc = CCDF frequency for Cp (/yr), Cp = dose (mSv), Co = Cp for Pc = 1, and a, b, m = constants. Figure 1 shows a typical tolerable risk profile (risk limit curve), drawn so that all the points obtained in the previous discussions lie above the curve (Co = 1, a = 1, b = 0.0772, and m = 2). Safety criteria by ANS (Ref. 2) and SHE (Ref. 3) are shown in Fig. 1 for comparison. An aversion of a factor of 2 results between 1 mSv and 1 Sv. No ALARA is included, which must be considered in defining specific safety goals. The frequency of a single class of events must be lower than the CCDF profile, and a curve lower by a factor of 10 is drawn in Fig. 1. The doses referenced in the current Japanese safety guidelines and site criteria are shown in Fig. 1. The referenced doses seem reasonable, considering the conservatism in the analysis of design-basis accidents. Specific safety goals for each sort of facility can be defined based on the basic safety goals, reflecting the characteristics of the facilities and considering ALARA. Indexes in engineering terms, such as CMF and LERF, are preferable for nuclear power plants, although interpretation from dose to the engineering terms is needed. Other indexes may be used (such as the frequency of criticality accidents) for facilities other than power plants. The applicability of safety goals will thus be improved. Figure 2 shows the relative risk factors (1, 1%, and 0.1%) versus the severity of radiation effects. This might indicate the adequacy of the risk factors.
The absolute risk limits, which
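The risk-limit curve in this record can be evaluated directly. A minimal sketch, assuming the CCDF bound takes the form Pc(C > Cp) ≤ (Cp/Co)^(−α) with α = a + b(log10(Cp/Co))^m and the quoted constants Co = 1 mSv, a = 1, b = 0.0772, m = 2 (the function name is illustrative):

```python
import math

def risk_limit(dose_msv, co=1.0, a=1.0, b=0.0772, m=2):
    """Tolerable CCDF frequency (/yr) for a given dose Cp (mSv):
    Pc(C > Cp) <= (Cp/Co)**(-alpha), alpha = a + b*(log10(Cp/Co))**m."""
    x = math.log10(dose_msv / co)
    alpha = a + b * x ** m
    return (dose_msv / co) ** (-alpha)
```

With these constants the curve passes through Pc = 1/yr at 1 mSv and falls off ever more steeply at higher doses, which is the dose-aversion behavior the record describes.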

  14. Family history of type 2 diabetes and prevalence of metabolic syndrome in adult Asian Indians.

    Science.gov (United States)

    Das, Mithun; Pal, Susil; Ghosh, Arnab

    2012-04-01

    Our objective was to test the association between familial risk of type 2 diabetes mellitus (T2DM) and the prevalence of metabolic syndrome (MS) in adult Asian Indians. A total of 448 adults (>30 years; 257 males and 191 females) participated in the study. Familial risk of T2DM was classified into three groups: 1 = both parents affected; 2 = parent and/or siblings affected; and 3 = no family history of T2DM. Anthropometric measures, blood pressure, fasting blood glucose and metabolic profiles were studied using standard techniques, and MS was defined accordingly. The prevalence of MS phenotypes was estimated and compared among the three familial risk strata. Individuals with a history of both parents affected by diabetes had a significantly higher prevalence of MS than individuals with no family history of T2DM. A significant difference was also noticed between individuals with and without MS according to family history of diabetes. Family history of T2DM had a significant effect on individuals with MS as compared to their counterparts with no family history of T2DM. It therefore seems reasonable to argue that family history of T2DM could be useful as a predictive tool for early diagnosis and prevention of MS in the Asian Indian population.

  15. Defining Tobacco Regulatory Science Competencies.

    Science.gov (United States)

    Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger

    2017-02-01

    In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Adults with an epilepsy history fare significantly worse on positive mental and physical health than adults with other common chronic conditions-Estimates from the 2010 National Health Interview Survey and Patient Reported Outcome Measurement System (PROMIS) Global Health Scale.

    Science.gov (United States)

    Kobau, Rosemarie; Cui, Wanjun; Zack, Matthew M

    2017-07-01

    Healthy People 2020, a national health promotion initiative, calls for increasing the proportion of U.S. adults who self-report good or better health. The Patient-Reported Outcomes Measurement Information System (PROMIS) Global Health Scale (GHS) was identified as a reliable and valid set of items of self-reported physical and mental health to monitor these two domains across the decade. The purpose of this study was to examine the percentage of adults with an epilepsy history who met the Healthy People 2020 target for self-reported good or better health and to compare these percentages to adults with history of other common chronic conditions. Using the 2010 National Health Interview Survey, we compared and estimated the age-standardized prevalence of reporting good or better physical and mental health among adults with five selected chronic conditions including epilepsy, diabetes, heart disease, cancer, and hypertension. We examined response patterns for physical and mental health scale among adults with these five conditions. The percentages of adults with epilepsy who reported good or better physical health (52%) or mental health (54%) were significantly below the Healthy People 2020 target estimate of 80% for both outcomes. Significantly smaller percentages of adults with an epilepsy history reported good or better physical health than adults with heart disease, cancer, or hypertension. Significantly smaller percentages of adults with an epilepsy history reported good or better mental health than adults with all other four conditions. Health and social service providers can implement and enhance existing evidence-based clinical interventions and public health programs and strategies shown to improve outcomes in epilepsy. These estimates can be used to assess improvements in the Healthy People 2020 Health-Related Quality of Life and Well-Being Objective throughout the decade. Published by Elsevier Inc.

  17. Strain measurement of wheel by a super-small size strain history recorder and its application to fatigue design; Chokogata jitsudo hizumi keisoku sochi ni yoru wheel no hizumi keisoku to hiro kyodo sekkei eno oyo

    Energy Technology Data Exchange (ETDEWEB)

    Murakami, Y [Kyushu University, Fukuoka (Japan); Mineki, K; Wakamatsu, K [Central Motor Wheel, Tokyo (Japan); Morita, T

    1997-10-01

    A very small strain history recorder based on the rainflow method has been developed and applied to strain measurement of car wheels in several road tests. Strain-amplitude histogram data for mountain roads, city roads and highways were acquired by the recorder for various types of wheels. The data were studied from the viewpoint of random fatigue, and the fatigue damage was evaluated by Miner's rule. The results of the damage evaluation were used to improve the shapes of the wheels. 2 refs., 6 figs., 1 tab.
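Miner's rule, used above to turn the rainflow-counted strain histogram into a damage estimate, sums the ratio of counted cycles to cycles-to-failure over the amplitude bins; failure is predicted when the sum reaches 1. A minimal sketch (the power-law S-N curve and its constants are purely illustrative assumptions, not taken from the paper):

```python
def miner_damage(histogram, cycles_to_failure):
    """Cumulative fatigue damage D = sum(n_i / N_i) per Miner's rule.

    histogram: dict mapping strain amplitude -> counted cycles n_i,
               as produced by a rainflow-counting recorder.
    cycles_to_failure: function amplitude -> cycles to failure N_i
                       from the material's S-N curve.
    """
    return sum(n / cycles_to_failure(amp) for amp, n in histogram.items())

# Hypothetical power-law S-N curve: 1e6 cycles to failure at amplitude 100,
# with life falling as the cube of the amplitude ratio.
sn_curve = lambda amp: 1e6 * (100.0 / amp) ** 3
```

For example, a histogram of 1000 cycles at amplitude 100 and 100 cycles at amplitude 200 under this curve accumulates a damage fraction of 0.0018, far below the failure threshold of 1.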

  18. Defining clogging potential for permeable concrete.

    Science.gov (United States)

    Kia, Alalea; Wong, Hong S; Cheeseman, Christopher R

    2018-08-15

    Permeable concrete is used to reduce urban flooding as it allows water to flow through normally impermeable infrastructure. It is prone to clogging by particulate matter, and predicting the long-term performance of permeable concrete is challenging as there is currently no reliable means of characterising clogging potential. This paper reports on the performance of a range of laboratory-prepared and commercial permeable concretes, close-packed glass spheres and aggregate particles of varying size, exposed to different clogging methods to understand this phenomenon. New methods were developed to study clogging and define clogging potential. The tests involved applying flowing water containing sand and/or clay in cycles, and measuring the change in permeability. Substantial permeability reductions were observed in all samples, particularly when exposed to sand and clay simultaneously. Three methods were used to define clogging potential based on measuring the initial permeability decay, half-life cycle and number of cycles to full clogging. We show for the first time strong linear correlations between these parameters for a wide range of samples, indicating their use for service-life prediction. Copyright © 2018 Elsevier Ltd. All rights reserved.
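The three clogging-potential indicators named above (initial permeability decay, half-life cycle, and cycles to full clogging) can be read off a permeability-versus-cycle series. A minimal sketch under assumed definitions; the abstract does not give the paper's exact thresholds, so the "full clogging" fraction and the per-cycle reading of "initial decay" here are hypothetical:

```python
def clogging_metrics(permeability, full_fraction=0.05):
    """Derive three clogging-potential indicators from permeability
    measured after each clogging cycle (permeability[0] = unclogged).

    Returns (fractional drop over the first cycle,
             first cycle index at or below half the initial permeability,
             first cycle index at or below full_fraction of it);
    the latter two are None if the series never reaches the threshold.
    """
    k0 = permeability[0]
    initial_decay = (permeability[0] - permeability[1]) / k0
    half_life = next((i for i, k in enumerate(permeability) if k <= 0.5 * k0), None)
    full_clog = next((i for i, k in enumerate(permeability) if k <= full_fraction * k0), None)
    return initial_decay, half_life, full_clog
```

A sample that clogs quickly scores high on the first indicator and low on the other two, which is consistent with the linear correlations between the parameters reported above.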

  19. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  20. History, legislation and offense: deprivation of liberty and socio-educational measures aimed at children and adolescents in the 20th century

    Directory of Open Access Journals (Sweden)

    Camila Serafim Daminelli

    2017-12-01

    Full Text Available During the twentieth century the Brazilian State sought to re-educate minor offenders by placing them in centers built for this purpose. First under the Minor's Rights [Direito do Menor] (1927), then under the Doctrine of the Irregular Situation [Doutrina da Situação Irregular] (1979), offenders were priority subjects for internment because of their perceived potential for public disorder and as a source of adult crime. Since the enactment of the Child and Teenager Statute [Estatuto da Criança e do Adolescente] (1990), educational measures in an open regime have been established aiming at the reintegration of the offender into social life, with internment presented as a last resort. This article analyzes the measures provided by law for the accountability of children and young people throughout the twentieth century in Brazil and offers some considerations on the educational measures prescribed by the current law.

  1. İnovasyon Süreci Performansı Ölçüm Kriterlerini Nitel Bir Araştırma İle Belirleme: Bilişim Sektöründen Bulgular - Defining Innovation Process Performance Measurement Criteria with a Qualitative Research: Findings from IT Sector

    Directory of Open Access Journals (Sweden)

    Yunus Emre TAŞGİT

    2016-06-01

    Full Text Available The aim of this study is to define innovation performance measurement criteria for firms and to measure their performance against these criteria. IT firms located in technoparks in the TR42 East Marmara Region are included in the study, and a qualitative research method is used. Data were collected through interviews conducted with managers of the IT firms and analyzed with descriptive and content analysis techniques. From the analysis, a set of measurement criteria is introduced to measure innovation performance. Results show that the "Idea Generation" stage is not taken seriously by these firms, while the performance of the "Beta Version Development" and "Full Version Development" stages is high. Firms need to analyze the "Sale" stage more carefully.

  2. Becchi-Rouet-Stora-Tyutin quantization of histories electrodynamics

    International Nuclear Information System (INIS)

    Noltingk, Duncan

    2002-01-01

    This article is a continuation of earlier work where a classical history theory of pure electrodynamics was developed in which the history fields have five components. The extra component is associated with an extra constraint, thus enlarging the gauge group of histories electrodynamics. In this article we quantize the classical theory developed previously by two methods. First we quantize the reduced classical history space to obtain a reduced quantum history theory. Second we quantize the classical BRST-extended history space, and use the Becchi-Rouet-Stora-Tyutin charge to define a 'cohomological' quantum history theory. Finally, we show that the reduced history theory is isomorphic (as a history theory) to the cohomological history theory

  3. NOAA History - Main Page

    Science.gov (United States)

    NOAA History: America's science and service. NOAA Legacy, 1807-2007. The site collects NOAA history from around the nation.

  4. Kiropraktikkens historie i Danmark

    DEFF Research Database (Denmark)

    Jørgensen, Per

    The book is the first comprehensive, research-based account of the history of chiropractic in Denmark, with an outlook to the history of chiropractic in the USA.

  5. Using Defined Processes as a Context for Resilience Measures

    Science.gov (United States)

    2011-12-01


  6. Adolescents Define Sexual Orientation and Suggest Ways to Measure It

    Science.gov (United States)

    Friedman, M. S. Mark S.; Silvestre, Anthony J.; Gold, Melanie A.; Markovic, Nina; Savin-Williams, Ritch C.; Huggins, James; Sell, Randal L.

    2004-01-01

    Researchers disagree on how to assess adolescent sexual orientation. The relative importance of various dimensions (e.g. attraction, relationships, behavior, self-labeling) is unknown, which calls into question the validity of studies assessing adolescent sexual orientation. To address this issue, 50 male and female adolescents of varied sexual…

  7. Defining Neighborhood Boundaries for Social Measurement: Advancing Social Work Research

    Science.gov (United States)

    Foster, Kirk A.; Hipp, J. Aaron

    2011-01-01

    Much of the current neighborhood-based research uses variables aggregated on administrative boundaries such as zip codes, census tracts, and block groups. However, other methods using current technological advances in geographic sciences may broaden our ability to explore the spatial concentration of neighborhood factors affecting individuals and…

  8. Toward Defining, Measuring, and Evaluating LGBT Cultural Competence for Psychologists

    Science.gov (United States)

    Boroughs, Michael S.; Andres Bedoya, C.; O'Cleirigh, Conall; Safren, Steven A.

    2015-01-01

    A central part of providing evidence-based practice is appropriate cultural competence to facilitate psychological assessment and intervention with diverse clients. At a minimum, cultural competence with lesbian, gay, bisexual, and transgender (LGBT) people involves adequate scientific and supervised practical training, with increasing depth and complexity across training levels. In order to further this goal, we offer 28 recommendations of minimum standards moving toward ideal training for LGBT-specific cultural competence. We review and synthesize the relevant literature to achieve and assess competence across the various levels of training (doctoral, internship, post-doctoral, and beyond) in order to guide the field towards best practices. These recommendations are aligned with educational and practice guidelines set forth by the field and informed by other allied professions in order to provide a roadmap for programs, faculty, and trainees in improving the training of psychologists to work with LGBT individuals. PMID:26279609

  9. Dependency between removal characteristics and defined measurement categories of pellets

    Science.gov (United States)

    Vogt, C.; Rohrbacher, M.; Rascher, R.; Sinzinger, S.

    2015-09-01

    Optical surfaces are usually machined by grinding and polishing. To achieve short polishing times it is necessary to grind with the best possible form accuracy and minimal subsurface damage. This is possible by using very fine-grained grinding tools for the finishing process. These, however, often show time-dependent cutting ability in conjunction with tool wear. Fine grinding tools in optics are often pellet tools. For a successful grinding process the tools must show a constant self-sharpening performance; a constant, or at least predictable, wear and cutting behavior is crucial for deterministic machining. This work describes a method to determine the characteristics of pellet grinding tools through tests conducted with a single pellet. We investigate the determination of the effective material removal rate and the derivation of the G-ratio, and in particular describe the change from the newly dressed via the quasi-stationary to the worn status of the tool. By recording the roughness achieved with the single pellet it is possible to derive the roughness expected from a serial pellet tool made of pellets with the same specification. From the results of these tests the usability of a pellet grinding tool for a specific grinding task can be determined without testing a comparably expensive serial tool. The results are verified by a production test with a serial tool under series conditions. The collected data can be stored in an appropriate database of tool characteristics and combined with useful applications.

  10. Cosmic growth history and expansion history

    International Nuclear Information System (INIS)

    Linder, Eric V.

    2005-01-01

    The cosmic expansion history tests the dynamics of the global evolution of the universe and its energy density contents, while the cosmic growth history tests the evolution of the inhomogeneous part of the energy density. Precision comparison of the two histories can distinguish the nature of the physics responsible for the accelerating cosmic expansion: an additional smooth component--dark energy--or a modification of the gravitational field equations. With the aid of a new fitting formula for linear perturbation growth accurate to 0.05%-0.2%, we separate out the growth dependence on the expansion history and introduce a new growth index parameter γ that quantifies the gravitational modification
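
    The growth index parameterisation referred to above is commonly written f(a) ≈ Ω_m(a)^γ, with γ ≈ 0.55 for general relativity with a cosmological constant. A hedged sketch of that relation for flat ΛCDM (the parameter values are assumptions for illustration):

```python
# Sketch of the growth-index parameterisation f(a) ~ Omega_m(a)**gamma,
# assuming flat LambdaCDM; gamma ~ 0.55 corresponds to general relativity.
def omega_m(a, omega_m0=0.3):
    """Matter density fraction at scale factor a for flat LambdaCDM."""
    return omega_m0 * a**-3 / (omega_m0 * a**-3 + (1.0 - omega_m0))

def growth_rate(a, gamma=0.55, omega_m0=0.3):
    """Linear growth rate f = dlnD/dlna approximated as Omega_m(a)**gamma."""
    return omega_m(a, omega_m0) ** gamma

# Today (a = 1) growth is suppressed by dark energy; at early times
# (a << 1) the universe is matter dominated and f approaches 1.
print(round(growth_rate(1.0), 3))  # -> 0.516
print(round(growth_rate(0.1), 3))  # -> 0.999
```

    A measured deviation of γ from its general-relativistic value would signal modified gravity rather than a smooth dark energy component, which is the distinction the abstract draws.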

  11. Defining and certifying green power

    International Nuclear Information System (INIS)

    1998-02-01

    Studies have shown that as electric utilities restructure from monopolistic utilities to competitive open access retailers, there is an increasing demand by individual and institutional customers for green power. In the United States, 17 electricity suppliers have offered customers the opportunity to buy energy generated from renewable sources such as photovoltaic panels, wind turbines and biomass. Twenty other utilities are conducting market research in preparation for offering a similar program. It was suggested that in order to help the customers make their choice based on accurate information, generating facilities should be obligated to provide credible information about the environmental performance of electricity supply through standardized environmental profile labels. A list of agreed upon environmental indicators and performance levels must be established so that the 'environmental friendliness' of different generating facilities can be measured. One of the problems in tackling this issue is that there is disagreement about what constitutes green power. Opinions range from wind and solar generation being the only two forms of green power, to including even natural gas and nuclear energy (i.e. under the right conditions). The two programs that are used for the certification of green power in Canada and the United States are Canada's Environmental Choice Program and California's Green-e Renewable Electricity Branding Program. This report describes the two programs and summarizes the results of interviews conducted on the definition and certification of green power. 15 refs

  12. Defining an Open Source Strategy for NASA

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Lindsay, F.; Berrick, S. W.; Marshall, J. J.; Downs, R. R.

    2011-12-01

    Over the course of the past year, we have worked to help frame a strategy for NASA and open source software. This includes defining information processes to understand open source licensing, attribution, commerciality, redistribution, communities, architectures, and interactions within the agency. Specifically, we held a training session at the NASA Earth Science Data Systems Working Group meeting on open source software as it relates to the NASA Earth Science data systems enterprise, including EOSDIS, the Distributed Active Archive Centers (DAACs), ACCESS proposals, and the MEASURES communities, and we worked to understand how open source software can be both consumed and produced within that ecosystem. In addition, we presented at the 1st NASA Open Source Summit (OSS) and helped to define an agency-level strategy, a set of recommendations, and paths forward: how to identify healthy open source communities, how to deal with issues such as contributions originating from other agencies, and how to search out talent with the right skills to develop software for NASA in the modern age. This talk will review our current recommendations for open source at NASA, covering the set of thirteen recommendations output from the NASA Open Source Summit, and discuss some of their implications for the agency.

  13. Radioactivity and health: A history

    International Nuclear Information System (INIS)

    Stannard, J.N.; Baalman, R.W. Jr.

    1988-10-01

    This book is designed to be primarily a history of research facts, measurements, and ideas and the people who developed them. ''Research'' is defined very broadly, ranging from bench-top laboratory experiments to worldwide environmental investigations. The book is not a monograph or a critical review. The findings and conclusions are presented largely as the investigators saw and reported them. Frequently, the discussion utilizes the terminology and units of the time, unless they are truly antiquated or potentially unclear. It is only when the work being reported is markedly iconoclastic or obviously wrong that I chose to make special note of it or to correct it. Nevertheless, except for direct quotations, the language is mine, and I take full responsibility for it. The working materials for this volume included published papers in scientific journals, books, published conferences and symposia, personal interviews with over 100 individuals, some of them more than once (see Appendix A), and, particularly for the 1940--1950 decade and for the large government-supported laboratories to the present day, ''in-house'' reports. These reports frequently represent the only comprehensive archive of what was done and why. Unfortunately, this source is drying up because of storage problems and must be retrieved by ever more complex and inconvenient means. For this reason, special efforts have been taken to review and document these sources, though even now some sections of the field are partially inaccessible. Nevertheless, the volume of all materials available for this review was surprisingly large and the quality much better than might have been expected for so complex and disparate a field, approached under conditions of considerable urgency.

  15. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturb permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been done. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PC analysis is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching).
    The delineation of domains requires calculation of the Hessian, which can be computationally costly and which restricts the current approach to
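
    The PCA step described above can be sketched with a synthetic sensitivity matrix; the Hessian, grid size, and two-domain split below are illustrative stand-ins, not the study's data:

```python
import numpy as np

# Illustrative sketch: eigendecomposition of a Hessian-like sensitivity
# matrix to group reservoir grid blocks into domains. The matrix here is
# synthetic; in the study the Hessian summarises the sensitivity of the
# history-matching objective function to the model parameters.
rng = np.random.default_rng(0)
n_blocks = 6
A = rng.standard_normal((n_blocks, n_blocks))
hessian = A @ A.T  # symmetric positive semi-definite stand-in

# Eigenvectors with the largest eigenvalues identify the most sensitive,
# least correlated directions in parameter space.
eigvals, eigvecs = np.linalg.eigh(hessian)  # ascending eigenvalues
order = np.argsort(eigvals)[::-1]           # descending order
leading = eigvecs[:, order[0]]

# Assign each grid block to a domain by the sign of its loading on the
# leading eigenvector (a crude two-domain split for illustration).
domains = (leading > 0).astype(int)
print(domains)
```

    A real implementation would scale the eigenvectors and apply grid-block volume cut-offs, as the abstract notes, to keep domains neither too large nor too small.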

  16. Nurse leader resilience: career defining moments.

    Science.gov (United States)

    Cline, Susan

    2015-01-01

    Resilience is an essential component of effective nursing leadership. It is defined as the ability to survive and thrive in the face of adversity. Resilience can be developed and internalized as a measure to improve retention and reduce burnout. Nurse leaders at all levels should develop these competencies to survive and thrive in an increasingly complex health care environment. Building positive relationships, maintaining positivity, developing emotional insight, creating work-life balance, and reflecting on successes and challenges are effective strategies for resilience building. Nurse leaders have a professional obligation to develop resilience in themselves, the teams they supervise, and the organization as a whole. Additional benefits include reduced turnover, reduced cost, and improved quality outcomes through organizational mindfulness.

  17. Defining and Distinguishing Secular and Religious Terrorism

    Directory of Open Access Journals (Sweden)

    Heather S. Gregg

    2014-04-01

    Full Text Available Religious terrorism is typically characterised as acts of unrestrained, irrational and indiscriminate violence, thus offering few if any policy options for counterterrorism measures. This assumption about religious terrorism stems from two challenges in the literature: disproportionate attention to apocalyptic terrorism, and a lack of distinction between religious terrorism and its secular counterpart. This article, therefore, aims to do four things: define and differentiate religiously motivated terrorism from traditional terrorism; investigate three goals of religious terrorism (fomenting the apocalypse, creating a religious government, and establishing a religiously pure state); consider the role of leadership and target selection of religious terrorists; and, finally, suggest a range of counterterrorism strategies based on these observations.

  18. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  19. Quantum computing. Defining and detecting quantum speedup.

    Science.gov (United States)

    Rønnow, Troels F; Wang, Zhihui; Job, Joshua; Boixo, Sergio; Isakov, Sergei V; Wecker, David; Martinis, John M; Lidar, Daniel A; Troyer, Matthias

    2014-07-25

    The development of small-scale quantum devices raises the question of how to fairly assess and detect quantum speedup. Here, we show how to define and measure quantum speedup and how to avoid pitfalls that might mask or fake such a speedup. We illustrate our discussion with data from tests run on a D-Wave Two device with up to 503 qubits. By using random spin glass instances as a benchmark, we found no evidence of quantum speedup when the entire data set is considered and obtained inconclusive results when comparing subsets of instances on an instance-by-instance basis. Our results do not rule out the possibility of speedup for other classes of problems and illustrate the subtle nature of the quantum speedup question. Copyright © 2014, American Association for the Advancement of Science.

  20. Indico CONFERENCE: Define the Call for Abstracts

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial, you will learn how to define and open a call for abstracts. When defining a call for abstracts, you will be able to define settings related to the type of questions asked during a review of an abstract, select the users who will review the abstracts, decide when to open the call for abstracts, and more.

  1. On defining semantics of extended attribute grammars

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1980-01-01

    Knuth has introduced attribute grammars (AGs) as a tool to define the semantics of context-free languages. The use of AGs in connection with programming language definitions has mostly been to define the context-sensitive syntax of the language and to define a translation in code for a hypothetic...

  2. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Software-Defined Networks (SDN) are poised to change this. Recent years, however, have seen growing interest in software-defined networks (SDNs), in which a logically-centralized controller manages the packet-processing...

  3. Life History Trade-offs

    NARCIS (Netherlands)

    Smallegange, I.M.; Kliman, R.M.

    2016-01-01

    Trade-offs play a central role in life history theory. This article explains why they exist, how they arise, how they can be measured, and briefly discusses their evolution. Three important trade-offs are discussed in detail: the trade-off between current reproduction and survival, between current

  4. The Examination of Patient-Reported Outcomes and Postural Control Measures in Patients With and Without a History of ACL Reconstruction: A Case Control Study.

    Science.gov (United States)

    Hoch, Johanna M; Sinnott, Cori W; Robinson, Kendall P; Perkins, William O; Hartman, Jonathan W

    2018-03-01

    There is a lack of literature to support the diagnostic accuracy and cut-off scores of commonly used patient-reported outcome measures (PROMs) and clinician-oriented outcomes such as postural-control assessments (PCAs) when treating post-ACL reconstruction (ACLR) patients. These scores could help tailor treatments, enhance patient-centered care, and identify individuals in need of additional rehabilitation. To determine if differences in 4 PROMs and 3 PCAs exist between post-ACLR and healthy participants, and to determine the diagnostic accuracy and cut-off scores of these outcomes. Case control. Laboratory. A total of 20 post-ACLR and 40 healthy control participants. The participants completed 4 PROMs (the Disablement in the Physically Active Scale [DPA], the Fear-Avoidance Belief Questionnaire [FABQ], the Knee Osteoarthritis Outcomes Score [KOOS] subscales, and the Tampa Scale of Kinesiophobia [TSK-11]) and 3 PCAs (the Balance Error Scoring System [BESS], the modified Star Excursion Balance Test [SEBT], and static balance on an instrumented force plate). Mann-Whitney U tests examined differences between groups. Receiver operating characteristic (ROC) curves were employed to determine sensitivity and specificity. The Area Under the Curve (AUC) was calculated to determine the diagnostic accuracy of each instrument. The Youden index was used to determine cut-off scores. Alpha was set a priori at P < 0.05. There were significant differences between groups for all PROMs (P < 0.05). There were no differences in PCAs between groups. The cut-off scores should be interpreted with caution for some instruments, as the scores may not be clinically applicable. Post-ACLR participants have decreased self-reported function and health-related quality of life. The PROMs are capable of discriminating between groups. Clinicians should consider using the cut-off scores in clinical practice. Further use of the instruments to examine detriments after completion of standard
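
    Cut-off selection with the Youden index (J = sensitivity + specificity - 1) can be sketched on synthetic scores; the study's instruments and data are not reproduced here, and the values below are invented for illustration:

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Return the threshold maximising J = sensitivity + specificity - 1."""
    best_j, best_t = -1.0, None
    for t in np.unique(scores):
        pred = scores >= t                      # classify "positive" at or above t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Synthetic instrument scores and group labels (1 = post-ACLR, 0 = control).
scores = np.array([0.1, 0.2, 0.35, 0.4, 0.6, 0.7, 0.8, 0.9])
labels = np.array([0,   0,   0,    1,   0,   1,   1,   1])
t, j = youden_cutoff(scores, labels)
print(t, j)  # -> 0.4 0.75
```

    The threshold maximising J is the point on the ROC curve farthest above the chance diagonal, which is how cut-off scores of this kind are typically derived from the AUC analysis the abstract describes.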

  5. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    Defining a Progress Metric for CERT-RMM Improvement. Gregory Crabb, Nader Mehravari, David Tobar. September 2017. Technical report. ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware...). An implementation metric would be the percentage of users who have received anti-phishing training. Effectiveness/efficiency metrics measure whether

  6. Defining Ecosystem Assets for Natural Capital Accounting.

    Science.gov (United States)

    Hein, Lars; Bagstad, Ken; Edens, Bram; Obst, Carl; de Jong, Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present value of the expected flow of ecosystem services. In this paper we argue that several additional conceptualisations of ecosystem assets are needed to understand ecosystems as assets, in support of ecosystem assessments, ecosystem accounting and ecosystem management. In particular, we define ecosystems' capacity and capability to supply ecosystem services, as well as the potential supply of ecosystem services. Capacity relates to sustainable use levels of multiple ecosystem services, capability involves prioritising the use of one ecosystem service over a basket of services, and potential supply considers the ability of ecosystems to generate services regardless of demand for these services. We ground our definitions in the ecosystem services and accounting literature, and illustrate and compare the concepts of flow, capacity, capability, and potential supply with a range of conceptual and real-world examples drawn from case studies in Europe and North America. Our paper contributes to the development of measurement frameworks for natural capital to support environmental accounting and other assessment frameworks.

  9. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available Video: "The History of Bioterrorism" ... as bioterrorist weapons. Watch the complete program (26 min 38 sec).

  11. "Hillary - en god historie"

    DEFF Research Database (Denmark)

    Bjerre, Thomas Ærvold

    2007-01-01

    Review of Carl Bernstein's Hillary Rodham Clinton and Michael Ehrenreich's Hillary - En amerikansk historie. Publication date: 15 November.

  12. Defining nodes in complex brain networks

    Directory of Open Access Journals (Sweden)

    Matthew Lawrence Stanley

    2013-11-01

    Full Text Available Network science holds great promise for expanding our understanding of the human brain in health, disease, development, and aging. Network analyses are quickly becoming the method of choice for analyzing functional MRI data. However, many technical issues have yet to be confronted in order to optimize results. One particular issue that remains controversial in functional brain network analyses is the definition of a network node. In functional brain networks a node represents some predefined collection of brain tissue, and an edge measures the functional connectivity between pairs of nodes. The characteristics of a node, chosen by the researcher, vary considerably in the literature. This manuscript reviews the current state of the art based on published manuscripts and highlights the strengths and weaknesses of three main methods for defining nodes. Voxel-wise networks are constructed by assigning a node to each equally sized brain area (voxel). The fMRI time-series recorded from each voxel is then used to create the functional network. Anatomical methods utilize atlases to define the nodes based on brain structure. The fMRI time-series from all voxels within the anatomical area are averaged and subsequently used to generate the network. Functional activation methods rely on data from traditional fMRI activation studies, often from databases, to identify network nodes. Such methods identify the peaks or centers of mass from activation maps to determine the location of the nodes. Small (~10-20 millimeter diameter) spheres located at the coordinates of the activation foci are then applied to the data being used in the network analysis. The fMRI time-series from all voxels in the sphere are then averaged, and the resultant time series is used to generate the network. We attempt to clarify the discussion and move the study of complex brain networks forward. While the correct method to be used remains an open, possibly unsolvable question that
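
    The sphere-based node definition described above can be sketched as follows; the voxel grid, foci coordinates, and time series are synthetic, and the function names are illustrative:

```python
import numpy as np

# Sketch of the functional-activation node definition: average the time
# series of all voxels inside a small sphere centred on an activation
# focus, then correlate the node time series to form a network edge.
rng = np.random.default_rng(1)
shape = (10, 10, 10)   # toy voxel grid
n_t = 50               # number of time points
data = rng.standard_normal(shape + (n_t,))

def sphere_mean_timeseries(data, center, radius):
    """Mean fMRI time series over voxels within `radius` of `center`."""
    grid = np.indices(data.shape[:3]).transpose(1, 2, 3, 0)
    dist = np.linalg.norm(grid - np.array(center), axis=-1)
    return data[dist <= radius].mean(axis=0)

# Two hypothetical activation foci become two network nodes.
ts_a = sphere_mean_timeseries(data, (3, 3, 3), radius=2)
ts_b = sphere_mean_timeseries(data, (7, 7, 7), radius=2)

# The edge weight is the functional connectivity (correlation) between nodes.
edge = np.corrcoef(ts_a, ts_b)[0, 1]
print(ts_a.shape, float(edge))
```

    The same averaging step applies to the anatomical method, with an atlas region replacing the sphere mask.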

  13. Canadian petroleum history bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Cass, D.

    2003-09-27

    The Petroleum History Bibliography includes a list of more than 2,000 publications that record the history of the Canadian petroleum industry. The list includes books, theses, films, audio tapes, published articles, company histories, biographies, autobiographies, fiction, poetry, humour, and an author index. It was created over a period of several years to help with projects at the Petroleum History Society. It is an ongoing piece of work, and as such, invites comments and additions.

  14. Atomic policies: history, problems

    International Nuclear Information System (INIS)

    Galvan, Cesare Giuseppe.

    1993-01-01

    Two kinds of problems follow from the development of nuclear technology: its use in (diversion to) armaments, and its dangers for the population. Both arise as social phenomena: technology can be diverted to military aims, and installations require specific measures so as not to expose human life to danger. The diffusion of this technology required a series of tentative solutions for such problems. Its history constitutes our first part. The second part aims at understanding the dynamics which led to the diffusion of this technology in the capitalist world. The concept of subsumption (especially of its realization) is suited to interpreting the meanings of the social interests which lent content to this diffusion. Subsumption is found between labor and capital, but also between society and state. At both levels, it shows that there was some social meaning in the diffusion of nuclear technology notwithstanding its problems. 590 refs

  15. About the science-theoretical measuring of history of revolutionary shocks in Russia (to the 100 year of February and October, 1917

    Directory of Open Access Journals (Sweden)

    Andrey V. Ishin

    2017-01-01

    Full Text Available The article is devoted to the scientific-theoretical tools for the study of the revolutionary upheavals in Russia. The 100th anniversary of those upheavals places before researchers the vital task of comprehensively understanding the causes, character and consequences of the revolution. The scientific tasks facing scholars of the Soviet epoch lay mainly in presenting the events of the revolution and Civil War as manifestations of the struggle of the "leading revolutionary class", the proletariat, headed by the Bolshevik communist party, against the "regressive classes": the bourgeoisie, the landowners, the clergy and the "kulaks". Within this framework, researchers succeeded in forming a fairly integral scientific picture of the social and political conflict of 1917-1922, in exposing its driving forces and leading political actors, and in tracing the dynamics of events. Even now, however, the task of a comprehensive scientific analysis of the structural-functional features of the formation and evolution of the organs of power retains its relevance, including those organs that functioned within the various political regimes, among them regimes of anti-Bolshevik orientation. An important element of this research is revealing the specifics of the mutual relations of public institutions, the basic directions of policy, and the historical factors that conditioned the adoption and practical realization of important administrative decisions. The institutional approach must organically complement the historical-event approach that still dominates the scientific literature. The institutional approach consists in viewing the social and political process not "from outside" (as in the historical-event perspective) but "from within". Under this approach, research attention is directed above all to the foundations and organization of administrative mechanisms and to the internal logic of the adoption of both key and, at first sight, second-rate decisions

  16. Diet History Questionnaire: Database Revision History

    Science.gov (United States)

    The following details all additions and revisions made to the DHQ nutrient and food database. This revision history is provided as a reference for investigators who may have performed analyses with a previous release of the database.

  17. Modern History of Tibet

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Authored by Xu Guangzhi, this book is a subsidiary project of Research Into Traditional Culture and History (of the PRC Ministry of Education) conducted by the China Tibetology Research Institute of Tibet University. The book combines the modern history of Tibet with the modern history of China as a whole and describes the close ties among the various members of the Chinese nation.

  18. History of Particle Physics

    Science.gov (United States)

    Particle Physics Timeline. For over two thousand years people have thought ... the Standard Model. We invite you to explore this history of particle physics with a focus on the ... Quantum Theory; 1964 - Present: The Modern View (the Standard Model).

  19. Teaching Women's History.

    Science.gov (United States)

    Fain, George

    1995-01-01

    Argues that women's history should stress the broad sociological view of women's roles not only in politics but in mundane, day-to-day life throughout all of history, rather than reducing women's history to a few token figures. Notes that many college and secondary texts and testing materials have recognized the trend toward the inclusion of…

  20. Towards Household History

    NARCIS (Netherlands)

    van Rappard, J.F.H.

    1998-01-01

    It is maintained that in contradistinction to the natural sciences, in psychology (and other human sciences) ‘history is not past tense’. This is borne out by the contemporary relevance of a specific part of the history of psychology, which focuses on the internal-theoretical significance of history

  1. Film and History.

    Science.gov (United States)

    Schaber, Robin L.

    2002-01-01

    Provides an annotated bibliography of Web sites that focus on using film to teach history. Includes Web sites in five areas: (1) film and education; (2) history of cinema; (3) film and history resources; (4) film and women; and (5) film organizations. (CMK)

  2. History of mathematics and history of science

    OpenAIRE

    Mann, Tony

    2011-01-01

    This essay argues that the diversity of the history of mathematics community in the United Kingdom has influenced the development of the subject and is a significant factor behind the different concerns often evident in work on the history of mathematics when compared with that of historians of science. The heterogeneous nature of the community, which includes many who are not specialist historians, and the limited opportunities for academic careers open to practitioners have had a profound...

  3. Defining food literacy: A scoping review.

    Science.gov (United States)

    Truman, Emily; Lane, Daniel; Elliott, Charlene

    2017-09-01

    The term "food literacy" describes the idea of proficiency in food related skills and knowledge. This prevalent term is broadly applied, although its core elements vary from initiative to initiative. In light of its ubiquitous use, but varying definitions, this article establishes the scope of food literacy research by identifying all articles that define 'food literacy', analysing its key conceptualizations, and reporting outcomes/measures of this concept. A scoping review was conducted to identify all articles (academic and grey literature) using the term "food literacy". Databases included Medline, Pubmed, Embase, CAB Abstracts, CINAHL, Scopus, JSTOR, Web of Science, and Google Scholar. Of 1049 abstracts, 67 studies were included. From these, data were extracted on country of origin, study type (methodological approach), primary target population, and the primary outcomes relating to food literacy. The majority of definitions of food literacy emphasize the acquisition of critical knowledge (information and understanding) (55%) over functional knowledge (skills, abilities and choices) (8%), although some incorporate both (37%). Thematic analysis of 38 novel definitions of food literacy reveals the prevalence of six themes: skills and behaviours, food/health choices, culture, knowledge, emotions, and food systems. Study outcomes largely focus on knowledge generating measures, with very few focusing on health related outcome measures. Current definitions of food literacy incorporate components of six key themes or domains and attributes of both critical and functional knowledge. Despite this broad definition of the term, most studies aiming to improve food literacy focus on knowledge related outcomes. Few articles address health outcomes, leaving an important gap (and opportunity) for future research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. From the handbooks for the History to the history of the History as a school discipline

    Directory of Open Access Journals (Sweden)

    Rafael VALLS MONTES

    2013-11-01

    Full Text Available In recent years a very important change has taken place in studies related to history as a school discipline. A wide range of factors lie at the base of this change. They have allowed us to better know its constitutive characteristics and its later evolution, strongly defined by inertia and routine. The intention of these studies is genealogical, not archaeological or erudite, so they reflect a will for educational renovation that goes further than idealistic projects, which have turned out to be insufficient in overcoming known deficiencies.

  5. Cartooning History: Canada's Stories in Graphic Novels

    Science.gov (United States)

    King, Alyson E.

    2012-01-01

    In recent years, historical events, issues, and characters have been portrayed in an increasing number of non-fiction graphic texts. Similar to comics and graphic novels, graphic texts are defined as fully developed, non-fiction narratives told through panels of sequential art. Such non-fiction graphic texts are being used to teach history in…

  6. World History Workshop (1983).

    Science.gov (United States)

    1984-10-01

    history appeared tenuous. While the study of American history was viewed as necessary to "indoctrinate kids," world history is unable to make such a... world" which is hard to avoid in world history, where one examines China in 1500, China in 1800, and so on. A pedagogical goal in the new course was to... the historian to make intelligent decisions about what information he is going to talk about. Viewing world history as a scenario also has a pedagogic

  7. Mellem historie- og krigsvidenskab

    DEFF Research Database (Denmark)

    Hansen Schøning, Anna Sofie

    2016-01-01

    The article investigates how military history was taught as part of the Danish higher officer education from 1830 to 1920 and how the subject was affected by developments in academic history and the science of war. It argues that military history, as it was taught in the formal officer education, could not be seen solely as a historic subject but also as a subject under the influence of the discipline of military science. Three very different understandings of how military history can contribute to higher officer education are shown through the analysis of textbooks. In the 1830s military history was used to establish national and organisational identity. In the 1880s, military history was used as a means to find, explain and apply universal principles of war and, in the 1910s, military history should be used as a means to gain general insight that could potentially lead to a better...

  8. Three concepts of history

    Directory of Open Access Journals (Sweden)

    Antonio Campillo

    2016-05-01

    Full Text Available The aim of this article is twofold. On the one hand, I will outline the diverse usages that the concept of history has taken on throughout Western history. These different usages may be grouped together in three semantic fields (history as a way of knowing, as a way of being and as a way of doing), which correspond to three ways of understanding the Philosophy of History: as Epistemology of History, as Ontology of historicity and as ethical-political Critique of the present. On the other hand, I will show that these three concepts of history (and, accordingly, the three ways of understanding the Philosophy of History) refer mutually to each other and, thus, are inseparable from each other.

  9. Abortion: a history.

    Science.gov (United States)

    Hovey, G

    1985-01-01

    labor, and greater consumerism. The legal history of abortion in the US illustrates dramatically that it was doctors, not women, who defined the morality surrounding abortion. Women continue to have to cope with the legacy of this fact. The seemingly benign 2-sphere family of the 19th century cut a deep wound in the human community. Men had public power and authority and were encouraged to be sexual. Women were offered the alternative of being powerful only as sexual beings who could thus enforce a domestic moral order. The legacy of the 2-sphere family continues, but much has changed. By 1973 pressure for reform had led 14 states to liberalize their existing abortion laws, and the US Supreme Court finally ruled that abortion is a private matter between a woman and her doctor. The current problem is that despite new laws and new attitudes toward women and abortion, male dominated and male defined institutions still determine what is possible. Women's right to abortion will never be safe and secure as long as this situation continues.

  10. Quality of cancer family history and referral for genetic counseling and testing among oncology practices: a pilot test of quality measures as part of the American Society of Clinical Oncology Quality Oncology Practice Initiative.

    Science.gov (United States)

    Wood, Marie E; Kadlubek, Pamela; Pham, Trang H; Wollins, Dana S; Lu, Karen H; Weitzel, Jeffrey N; Neuss, Michael N; Hughes, Kevin S

    2014-03-10

    Cancer family history (CFH) is important for identifying individuals to receive genetic counseling/testing (GC/GT). Prior studies have demonstrated low rates of family history documentation and referral for GC/GT. CFH quality and GC/GT practices for patients with breast (BC) or colon cancer (CRC) were assessed in 271 practices participating in the American Society of Clinical Oncology Quality Oncology Practice Initiative in fall 2011. A total of 212 practices completed measures regarding CFH and GC/GT practices for 10,466 patients; 77.4% of all medical records reviewed documented presence or absence of CFH in first-degree relatives, and 61.5% of medical records documented presence or absence of CFH in second-degree relatives, with significantly higher documentation for patients with BC compared with CRC. Age at diagnosis was documented for all relatives with cancer in 30.7% of medical records (BC, 45.2%; CRC, 35.4%; P ≤ .001). Referral for GC/GT occurred in 22.1% of all patients with BC or CRC. Of patients with increased risk for hereditary cancer, 52.2% of patients with BC and 26.4% of those with CRC were referred for GC/GT. When genetic testing was performed, consent was documented 77.7% of the time, and discussion of results was documented 78.8% of the time. We identified low rates of complete CFH documentation and low rates of referral for those with BC or CRC meeting guidelines for referral among US oncologists. Documentation and referral were greater for patients with BC compared with CRC. Education and support regarding the importance of accurate CFH and the benefits of proactive high-risk patient management are clearly needed.

  11. Prediction of the time-dependent failure rate for normally operating components taking into account the operational history

    International Nuclear Information System (INIS)

    Vrbanic, I.; Simic, Z.; Sljivac, D.

    2008-01-01

    The prediction of the time-dependent failure rate has been studied, taking into account the operational history of a component used in applications such as system modeling in a probabilistic safety analysis in order to evaluate the impact of equipment aging and maintenance strategies on the risk measures considered. We have selected a time-dependent model for the failure rate which is based on the Weibull distribution and the principles of proportional age reduction by equipment overhauls. Estimation of the parameters that determine the failure rate is considered, including the definition of the operational history model and likelihood function for the Bayesian analysis of parameters for normally operating repairable components. The operational history is provided as a time axis with defined times of overhauls and failures. An example for demonstration is described with prediction of the future behavior for seven different operational histories. (orig.)
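The model described above, a Weibull hazard whose effective age is reduced by overhauls, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the names `weibull_hazard` and `effective_age` and the parameters `alpha` (scale), `beta` (shape) and `eps` (overhaul effectiveness) are illustrative.

```python
import math

def weibull_hazard(age, alpha, beta):
    # Weibull hazard rate: h(t) = (beta/alpha) * (t/alpha)**(beta - 1)
    return (beta / alpha) * (age / alpha) ** (beta - 1)

def effective_age(t, overhauls, eps):
    """Effective component age at calendar time t under proportional age
    reduction: each overhaul at time s removes the fraction eps of the
    age accumulated up to s."""
    age = 0.0
    prev = 0.0
    for s in sorted(overhauls):
        if s > t:
            break
        age += s - prev        # ageing since the previous overhaul
        age *= (1.0 - eps)     # rejuvenation by the overhaul
        prev = s
    age += t - prev            # ageing since the last overhaul
    return age

# Failure rate at time t given an operational history of overhauls:
def failure_rate(t, overhauls, alpha, beta, eps):
    return weibull_hazard(effective_age(t, overhauls, eps), alpha, beta)
```

With `eps = 1` an overhaul renews the component completely ("as good as new"), with `eps = 0` it has no effect ("as bad as old"), and intermediate values interpolate, which is the proportional age reduction principle the abstract refers to.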

  12. Towards defining restlessness in individuals with dementia.

    Science.gov (United States)

    Regier, Natalie G; Gitlin, Laura N

    2017-05-01

    Most individuals with dementia develop significant behavioral problems. Restlessness is a behavioral symptom frequently endorsed by caregivers as distressing, yet is variably defined and measured. Lack of conceptual and operational clarity hinders an understanding of this common behavioral type, its prevalence, and development of effective interventions. We advance a systematic definition and understanding of restlessness from which to enhance reporting and intervention development. We reviewed the literature for existing definitions and measures of restlessness, identified common elements across existing definitions, assessed fit with relevant theoretical frameworks, and explored the relationship between restlessness and other behavioral symptoms in a data set of 272 community-dwelling persons with dementia. Twenty-five scales assessing restlessness were identified. Shared components included motor/neurological, psychiatric, and needs-based features. Exploratory analyses suggest that restlessness may co-occur primarily with argumentation, anxiety, waking the caregiver, delusions/hallucinations, and wandering. We propose that restlessness consists of three key attributes: diffuse motor activity or motion subject to limited control, non-productive or disorganized behavior, and subjective distress. Restlessness should be differentiated from and not confused with wandering or elopement, pharmacological side effects, a (non-dementia) mental or movement disorder, or behaviors occurring in the context of a delirium or at end-of-life. Restlessness appears to denote a distinct set of behaviors that have overlapping but non-equivalent features with other behavioral symptoms. We propose that it reflects a complex behavior involving three key characteristics. Understanding its specific manifestations and which components are present can enhance tailoring interventions to specific contexts of this multicomponent behavioral type.

  13. What quantum measurements measure

    Science.gov (United States)

    Griffiths, Robert B.

    2017-09-01

    A solution to the second measurement problem, determining what prior microscopic properties can be inferred from measurement outcomes ("pointer positions"), is worked out for projective and generalized (POVM) measurements, using consistent histories. The result supports the idea that equipment properly designed and calibrated reveals the properties it was designed to measure. Applications include Einstein's hemisphere and Wheeler's delayed choice paradoxes, and a method for analyzing weak measurements without recourse to weak values. Quantum measurements are noncontextual in the original sense employed by Bell and Mermin: if [A,B] = [A,C] = 0 and [B,C] ≠ 0, the outcome of an A measurement does not depend on whether it is measured with B or with C. An application to Bohm's model of the Einstein-Podolsky-Rosen situation suggests that a faulty understanding of quantum measurements is at the root of this paradox.

  14. User defined function for transformation of ellipsoidal coordinates

    Directory of Open Access Journals (Sweden)

    Ganić Aleksandar

    2014-01-01

    Full Text Available The topographic surface of the Earth has an irregular shape, and for the purpose of mathematical definition it is approximated by a rotational ellipsoid. Rotational ellipsoids of various sizes are used around the world as local geodetic datums. The wider use of GPS in surveying tasks has created the need for a global geodetic datum that best approximates the entire Earth. For this purpose, the geocentric rotational ellipsoid WGS84 was defined, and the results of GPS measurements are expressed in relation to it. By applying the appropriate equations, the ellipsoidal coordinates are transformed from WGS84 into coordinates on the local rotational ellipsoid, i.e. into the projection plane. The paper presents a User Defined Function created for Excel, by which coordinates in the territory of Belgrade are transformed from the WGS84 rotational ellipsoid into the Gauss-Krüger projection plane.
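The first step of such a datum transformation, converting geodetic coordinates (latitude, longitude, height) on the WGS84 ellipsoid into geocentric Cartesian (ECEF) coordinates, can be sketched as follows. Only the WGS84 defining constants are taken from the standard definition; the function name is illustrative, and the paper's own Excel UDF is not reproduced here.

```python
import math

# WGS84 defining parameters
A = 6378137.0            # semi-major axis [m]
F = 1.0 / 298.257223563  # flattening
E2 = F * (2.0 - F)       # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic coordinates on WGS84 to geocentric (ECEF) X, Y, Z [m]."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # prime vertical radius of curvature
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z
```

From ECEF, a seven-parameter (Helmert) transformation would carry the point onto the local datum before the final projection into the Gauss-Krüger plane; a point on the equator at longitude 0 maps simply to (a, 0, 0).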

  15. A Soundtrack to Mongolian History

    Directory of Open Access Journals (Sweden)

    Franck Billé

    2016-06-01

    Full Text Available Lucy M. Rees, Mongolian Film Music: Tradition, Revolution and Propaganda. London: Routledge, 2015. 210 pp. $110 (cloth). In her recently published study, ethnomusicologist Lucy M. Rees recounts the evolution of Mongolian film music, from the establishment of the country’s film industry as a vehicle of propaganda in the early socialist era to the release of the latest international productions, such as Khadak (2006), The Story of the Weeping Camel (2003), and The Cave of the Yellow Dog (2005). An in-depth analysis of the genres, structures, and melodies of Mongolia’s filmic landscape, Rees’s book also extends to the historical context and social reception of the most important films in that country’s history and is thus more than a mere compendium of cinematic works. Rees presents a narrative of Mongolian history from the perspective of film music, with each introduction of instruments, techniques, and harmonies representing a particular turn in the cultural transformation experienced by Mongolia over the course of the twentieth century. Each chapter is dedicated to a specific period of the country’s history and is constructed around a particular case study—one personality or one film—that played a defining role in that period...

  16. Defining Moments in MMWR History: 1993 E. coli O157:H7 Hamburger Outbreak

    Centers for Disease Control (CDC) Podcasts

    During the 1993 E. coli O157 outbreak, four children died, and approximately 700 persons in four states became ill with severe and often bloody diarrhea after eating hamburgers from fast food restaurants. The first reports of CDC's investigation into this deadly outbreak were published in MMWR. In this podcast, Dr. Beth Bell shares what it was like to serve as one of CDC's lead investigators - a boots-on-the-ground disease detective -- for the historic outbreak.

  17. Defining a Moment in History: Parent Communication with Adolescents about September 11, 2001

    Science.gov (United States)

    Stoppa, Tara M.; Wray-Lake, Laura; Syvertsen, Amy K.; Flanagan, Constance

    2011-01-01

    Parents play an important role in helping their children process and interpret significant sociohistorical events. However, little is known about how parents frame these experiences or the specific social, cultural, and civic messages they may communicate about the event. In this study, we examined self-reported communication of parents from six…

  18. Defining Moments in MMWR History: The AIDS Epidemic, Pneumocystis Pneumonia --- Los Angeles 1981

    Centers for Disease Control (CDC) Podcasts

    2017-12-01

    On June 5, 1981, MMWR published a report of Pneumocystis pneumonia in five previously healthy young gay men in Los Angeles, California. This report was later acknowledged as the first published account of what would become known as human immunodeficiency virus, or HIV, and acquired immunodeficiency syndrome, or AIDS. It was the first of many MMWR reports that led to a better understanding of this new condition. In this podcast, Dr. Harold Jaffe recalls CDC’s investigation and response to the AIDS Epidemic.  Created: 12/1/2017 by MMWR.   Date Released: 12/1/2017.

  19. Defining Moments in MMWR History: CDC's Response to Intentional Release of Anthrax - 2001

    Centers for Disease Control (CDC) Podcasts

    On October 4, 2001, shortly after the September 11 attacks in New York City and Washington, DC, the Palm Beach County Health Department, the Florida State Department of Health, and CDC reported a case of anthrax in a 63-year-old man from Florida. This case was first reported in MMWR and marked the beginning of a series of anthrax cases that resulted from intentional delivery of Bacillus anthracis spores sent through the mail. In this podcast, Dr. Sherif Zaki recalls CDC's investigation and response to the anthrax attacks.

  1. Defining Moments in MMWR History: 1993 E. coli O157:H7 Hamburger Outbreak

    Centers for Disease Control (CDC) Podcasts

    2017-05-31

    During the 1993 E. coli O157 outbreak, four children died, and approximately 700 persons in four states became ill with severe and often bloody diarrhea after eating hamburgers from fast food restaurants. The first reports of CDC’s investigation into this deadly outbreak were published in MMWR. In this podcast, Dr. Beth Bell shares what it was like to serve as one of CDC’s lead investigators – a boots-on-the-ground disease detective -- for the historic outbreak.  Created: 5/31/2017 by MMWR.   Date Released: 5/31/2017.

  2. Solar History An Introduction

    CERN Document Server

    Vita-Finzi, Claudio

    2013-01-01

    Beyond the four centuries of sunspot observation and the five decades during which artificial satellites have monitored the Sun – that is to say for 99.99999% of the Sun’s existence – our knowledge of solar history depends largely on analogy with kindred main sequence stars, on the outcome of various kinds of modelling, and on indirect measures of solar activity. They include the analysis of lunar rocks and meteorites for evidence of solar flares and other components of the solar cosmic-ray (SCR) flux, and the measurement of cosmogenic isotopes in wood, stratified ice and marine sediments to evaluate changes in the galactic cosmic-ray (GCR) flux and thus infer changes in the sheltering magnetic fields of the solar wind. In addition, shifts in the global atmospheric circulation which appear to result from cyclic fluctuations in solar irradiance have left their mark in river sediments and in the isotopic composition of cave deposits. In this volume the results these sources have already produced have been...

  3. History of mathematics and history of science.

    Science.gov (United States)

    Mann, Tony

    2011-09-01

    This essay argues that the diversity of the history of mathematics community in the United Kingdom has influenced the development of the subject and is a significant factor behind the different concerns often evident in work on the history of mathematics when compared with that of historians of science. The heterogeneous nature of the community, which includes many who are not specialist historians, and the limited opportunities for academic careers open to practitioners have had a profound effect on the discipline, leading to a focus on elite mathematics and great mathematicians. More recently, reflecting earlier developments in the history of science, an increased interest in the context and culture of the practice of mathematics has become evident.

  4. The teaching of history through histories

    Directory of Open Access Journals (Sweden)

    María Gabriela Calvas-Ojeda

    2017-12-01

    Full Text Available Comic strips have been introduced into the world of history teaching as a didactic resource for learning; however, there are still shortcomings in their use by teachers, often owing to lack of knowledge and insufficient methodological preparation. The purpose of this work is to socialize knowledge related to these didactic resources in order to contribute to the didactic-methodological enrichment of the teacher and thereby change this attitude. The methodological strategy responds to the quantitative-qualitative paradigm; in collecting the information, a participant observation guide was applied to the history classes, together with interviews of a randomly selected sample of 9 Third Grade teachers from schools in the city of Machala. We recorded observations of the knowledge acquired by the 98 students who received the classes mediated by comic strips, which allowed us to conclude that comics constitute a powerful didactic resource for the teaching and learning of History.

  5. Increased fear-potentiated startle in major depressive disorder patients with lifetime history of suicide attempt.

    Science.gov (United States)

    Ballard, Elizabeth D; Ionescu, Dawn F; Vande Voort, Jennifer L; Slonena, Elizabeth E; Franco-Chaves, Jose A; Zarate, Carlos A; Grillon, Christian

    2014-06-01

    Suicide is a common reason for psychiatric emergency and morbidity, with few effective treatments. Anxiety symptoms have emerged as potential modifiable risk factors in the time before a suicide attempt, but few studies have been conducted using laboratory measures of fear and anxiety. We operationally defined fear and anxiety as increased startle reactivity during anticipation of predictable (fear-potentiated startle) and unpredictable (anxiety-potentiated startle) shock. We hypothesized that a lifetime history of suicide attempt (as compared to history of no suicide attempt) would be associated with increased fear-potentiated startle. A post-hoc analysis of fear- and anxiety-potentiated startle was conducted in 28 medication-free patients with Major Depressive Disorder (MDD) divided according to suicide attempt history. The magnitude of fear-potentiated startle was increased in depressed patients with lifetime suicide attempts compared to those without a lifetime history of suicide attempt (F(1,26)=5.629, p=.025). There was no difference in anxiety-potentiated startle by suicide attempt history. This is a post-hoc analysis of previously analyzed patient data from a study of depressed inpatients. Further replication of the finding with a larger patient sample is indicated. Increased fear-potentiated startle in suicide attempters suggests the role of amygdala in depressed patients with a suicide attempt history. Findings highlight the importance of anxiety symptoms in the treatment of patients at increased suicide risk. Published by Elsevier B.V.

  6. 22 CFR 92.36 - Authentication defined.

    Science.gov (United States)

    2010-04-01

    22 CFR Foreign Relations, § 92.36 (2010-04-01), Notarial Acts: Authentication defined. An authentication is a certification of the genuineness of... recognized in another jurisdiction. Documents which may require authentication include legal instruments...

  7. A definability theorem for first order logic

    NARCIS (Netherlands)

    Butz, C.; Moerdijk, I.

    1997-01-01

    In this paper we will present a definability theorem for first order logic. This theorem is very easy to state, and its proof only uses elementary tools. To explain the theorem, let us first observe that if M is a model of a theory T in a language L, then clearly any definable subset S ⊆ M, i.e. a subset S

  8. Dilution Confusion: Conventions for Defining a Dilution

    Science.gov (United States)

    Fishel, Laurence A.

    2010-01-01

    Two conventions for preparing dilutions are used in clinical laboratories. The first convention defines an "a:b" dilution as "a" volumes of solution A plus "b" volumes of solution B. The second convention defines an "a:b" dilution as "a" volumes of solution A diluted into a final volume of "b". Use of the incorrect dilution convention could affect…

  9. Defining Hardwood Veneer Log Quality Attributes

    Science.gov (United States)

    Jan Wiedenbeck; Michael Wiemann; Delton Alderman; John Baumgras; William Luppold

    2004-01-01

    This publication provides a broad spectrum of information on the hardwood veneer industry in North America. Veneer manufacturers and their customers impose guidelines in specifying wood quality attributes that are very discriminating but poorly defined (e.g., exceptional color, texture, and/or figure characteristics). To better understand and begin to define the most...

  10. ICF gamma-ray reaction history diagnostics

    International Nuclear Information System (INIS)

    Herrmann, H W; Young, C S; Mack, J M; Kim, Y H; McEvoy, A; Evans, S; Sedillo, T; Batha, S; Schmitt, M; Wilson, D C; Langenbrunner, J R; Malone, R; Kaufman, M I; Cox, B C; Frogget, B; Tunnell, T W; Miller, E K; Ali, Z A; Stoeffl, W; Horsfield, C J

    2010-01-01

    Reaction history measurements, such as nuclear bang time and burn width, are fundamental components of diagnosing ICF implosions and will be employed to help steer the National Ignition Facility (NIF) towards ignition. Fusion gammas provide a direct measure of nuclear interaction rate (unlike x-rays) without being compromised by Doppler spreading (unlike neutrons). Gas Cherenkov Detectors that convert fusion gamma rays to UV/visible Cherenkov photons for collection by fast optical recording systems have established their usefulness in illuminating ICF physics in several experimental campaigns at OMEGA. In particular, bang time precision better than 25 ps has been demonstrated, well below the 50 ps accuracy requirement defined by the NIF. NIF Gamma Reaction History (GRH) diagnostics are being developed based on optimization of sensitivity, bandwidth, dynamic range, cost, and NIF-specific logistics, requirements and extreme radiation environment. Implementation will occur in two phases. The first phase consists of four channels mounted to the outside of the target chamber at ∼6 m from target chamber center (GRH-6m) coupled to ultra-fast photo-multiplier tubes (PMT). This system is intended to operate in the 10^13-10^17 neutron yield range expected during the early THD campaign. It will have high enough bandwidth to provide accurate bang times and burn widths for the expected THD reaction histories (> 80 ps fwhm). Successful operation of the first GRH-6m channel has been demonstrated at OMEGA, allowing a verification of instrument sensitivity, timing and EMI/background suppression. The second phase will consist of several channels located just inside the target bay shield wall at 15 m from target chamber center (GRH-15m) with optical paths leading through the cement shield wall to well-shielded streak cameras and PMTs. This system is intended to operate in the 10^16-10^20 yield range expected during the DT ignition campaign, providing higher temporal resolution

  11. ICF gamma-ray reaction history diagnostics

    Science.gov (United States)

    Herrmann, H. W.; Young, C. S.; Mack, J. M.; Kim, Y. H.; McEvoy, A.; Evans, S.; Sedillo, T.; Batha, S.; Schmitt, M.; Wilson, D. C.; Langenbrunner, J. R.; Malone, R.; Kaufman, M. I.; Cox, B. C.; Frogget, B.; Miller, E. K.; Ali, Z. A.; Tunnell, T. W.; Stoeffl, W.; Horsfield, C. J.; Rubery, M.

    2010-08-01

    Reaction history measurements, such as nuclear bang time and burn width, are fundamental components of diagnosing ICF implosions and will be employed to help steer the National Ignition Facility (NIF) towards ignition. Fusion gammas provide a direct measure of nuclear interaction rate (unlike x-rays) without being compromised by Doppler spreading (unlike neutrons). Gas Cherenkov Detectors that convert fusion gamma rays to UV/visible Cherenkov photons for collection by fast optical recording systems have established their usefulness in illuminating ICF physics in several experimental campaigns at OMEGA. In particular, bang time precision better than 25 ps has been demonstrated, well below the 50 ps accuracy requirement defined by the NIF. NIF Gamma Reaction History (GRH) diagnostics are being developed based on optimization of sensitivity, bandwidth, dynamic range, cost, and NIF-specific logistics, requirements and extreme radiation environment. Implementation will occur in two phases. The first phase consists of four channels mounted to the outside of the target chamber at ~6 m from target chamber center (GRH-6m) coupled to ultra-fast photo-multiplier tubes (PMT). This system is intended to operate in the 10^13-10^17 neutron yield range expected during the early THD campaign. It will have high enough bandwidth to provide accurate bang times and burn widths for the expected THD reaction histories (> 80 ps fwhm). Successful operation of the first GRH-6m channel has been demonstrated at OMEGA, allowing a verification of instrument sensitivity, timing and EMI/background suppression. The second phase will consist of several channels located just inside the target bay shield wall at 15 m from target chamber center (GRH-15m) with optical paths leading through the cement shield wall to well-shielded streak cameras and PMTs. This system is intended to operate in the 10^16-10^20 yield range expected during the DT ignition campaign, providing higher temporal resolution for the

  12. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

    This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernible. The essay argues that the so-called material marine environmental history has its main focus on trying to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Paleolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...

  13. Mortality of Inherited Arrhythmia Syndromes Insight Into Their Natural History

    NARCIS (Netherlands)

    Nannenberg, Eline A.; Sijbrands, Eric J. G.; Dijksman, Lea M.; Alders, Marielle; van Tintelen, J. Peter; Birnie, Martijn; van Langen, Irene M.; Wilde, Arthur A. M.

    Background-For most arrhythmia syndromes, the risk of sudden cardiac death for asymptomatic mutation carriers is ill defined. Data on the natural history of these diseases, therefore, are essential. The family tree mortality ratio method offers the unique possibility to study the natural history at

  14. Gender, Technology, and the History of Technical Communication.

    Science.gov (United States)

    Durack, Katherine T.

    1997-01-01

    Considers why women have been absent from the history of technical communication. Discusses research from the history of technology suggesting that notions of "technology,""work," and "workplace" may be gendered terms. Concludes with several suggestions for defining technical communication so that significant works of…

  15. Teaching History in a Post-Industrial Age

    Science.gov (United States)

    Bianchetti, Ann

    2004-01-01

    As a social studies teacher, the author emphasizes the story of history (sticking to the facts as much as they are known) and the human qualities of the players. Middle school kids are in the throes of exploring self-identity and attempting to define their worlds. They love drama, and history provides plenty of it. The author finds that teaching…

  16. Exploring global history through the lens of history of Chemistry: Materials, identities and governance.

    Science.gov (United States)

    Roberts, Lissa

    2016-12-01

    As global history continues to take shape as an important field of research, its interactive relationships with the history of science, technology, and medicine are recognized and being investigated as significant areas of concern. Strangely, despite the fact that it is key to understanding so many of the subjects that are central to global history and would itself benefit from a broader geographical perspective, the history of chemistry has largely been left out of this process - particularly for the modern historical period. This article argues for the value of integrating the history of chemistry with global history, not only for understanding the past, but also for thinking about our shared present and future. Toward this end, it (1) explores the various ways in which 'chemistry' has and can be defined, with special attention to discussions of 'indigenous knowledge systems'; (2) examines the benefits of organizing historical inquiry around the evolving sociomaterial identities of substances; (3) considers ways in which the concepts of 'chemical governance' and 'chemical expertise' can be expanded to match the complexities of global history, especially in relation to environmental issues, climate change, and pollution; and (4) seeks to sketch the various geographies entailed in bringing the history of chemistry together with global histories.

  17. The International Feast of the History. A Concrete Project for the Dissemination of History and Heritage

    Directory of Open Access Journals (Sweden)

    Filippo Galletti

    2017-12-01

    Full Text Available In this paper, I present a series of educational projects, new challenges and perspectives that the International Centre of methodology for teaching history and heritage (DiPaSt) of the University of Bologna has undertaken in recent years regarding the teaching of history and heritage education. I would like to start by asking a question: can historical and cultural heritage act as a tool to compensate for the gaps, shortcomings and sense of loss which afflict and define the society in which we live? This in turn leads us to another question: which tools and which methodologies can we use? Every time a professor starts a new course on Medieval History or on methodologies of teaching history, most of the students say that they do not like history. Therefore, the professor usually spends half of the course explaining why it is important to study and to teach history: history is not merely the textbook, or a sequence of dates, wars and battles. History is us; we are history. For these reasons, fifteen years ago, a group of professors at the University of Bologna got together and created the "Feast of History". Nowadays, the Feast is widely recognized as one of the most important such events in Europe.

  18. Cooperative Station History Forms

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Various forms, photographs and correspondence documenting the history of Cooperative station instrumentation, location changes, inspections, and...

  19. Public and popular history

    CERN Document Server

    De Groot, Jerome

    2013-01-01

    This interdisciplinary collection considers public and popular history within a global framework, seeking to understand considerations of local, domestic histories and the ways they interact with broader discourses. Grounded in particular local and national situations, the book addresses the issues associated with popular history in a globalised cultural world, such as: how the study of popular history might work in the future; new ways in which the terms 'popular' and 'public' might inform one another and nuance scholarship; transnational, intercultural models of 'pastness'; cultural translat

  20. A history of the histories of econometrics

    NARCIS (Netherlands)

    Boumans, Marcel; Dupont-Kieffer, Ariane

    2011-01-01

    Econometricians have from the start considered historical knowledge of their own discipline as reflexive knowledge useful for delineating their discipline, that is, for setting its disciplinary boundaries with respect to its aims, its methods, and its scientific values. As such, the histories

  1. Precariousness and discontinuous work history in association with health.

    Science.gov (United States)

    Sirviö, Anitta; Ek, Ellen; Jokelainen, Jari; Koiranen, Markku; Järvikoski, Timo; Taanila, Anja

    2012-06-01

    A precarious type of employment may have a negative impact on health, notably low psychological wellbeing. This relationship rests on how precariousness is defined and operationalised. In this research, we first experimented with a construct of work history in the operationalisation of precariousness, and second studied the relationship between precariousness and health. The research data originated from a large population-based birth cohort (NFBC 1966). The study sample consists of 3449 respondents to the postal questionnaire at the age of 31, supplemented by register data from the Finnish Centre for Pensions. Health was measured by self-reports of doctor-diagnosed/treated illnesses and by the HSCL-25 for mental symptoms. Our operationalisation with a construct of discontinuous work history captured the precarious, insecure relation to work. The precarious workers were found to have proportionally more mental symptoms than permanent workers. The perception of distress was stronger among precarious workers who perceived high job insecurity. However, there were no differences in doctor-diagnosed/treated illnesses between precarious and permanent workers. The study suggests that the construct of work history is a useful element in defining precariousness. It also illustrates the association of precariousness, perceived job insecurity and mental distress, and suggests further research on disadvantages experienced by precarious workers.

  2. Bilayer graphene quantum dot defined by topgates

    Energy Technology Data Exchange (ETDEWEB)

    Müller, André; Kaestner, Bernd; Hohls, Frank; Weimann, Thomas; Pierz, Klaus; Schumacher, Hans W., E-mail: hans.w.schumacher@ptb.de [Physikalisch-Technische Bundesanstalt, Bundesallee 100, 38116 Braunschweig (Germany)

    2014-06-21

    We investigate the application of nanoscale topgates on exfoliated bilayer graphene to define quantum dot devices. At temperatures below 500 mK, the conductance underneath the grounded gates is suppressed, which we attribute to nearest neighbour hopping and strain-induced piezoelectric fields. The gate-layout can thus be used to define resistive regions by tuning into the corresponding temperature range. We use this method to define a quantum dot structure in bilayer graphene showing Coulomb blockade oscillations consistent with the gate layout.

  3. INTRODUCTION Dental care utilization can be defined as the ...

    African Journals Online (AJOL)

    INTRODUCTION. Dental care utilization can be defined as the percentage of the population who access dental services over a specified period of time [1]. Measures of actual dental care utilization describe the percentage of the population who have seen a dentist at different time intervals. Dental disease is a serious public ...

  4. History of Cardiology in India

    OpenAIRE

    Das, Mrinal Kanti; Kumar, Soumitra; Deb, Pradip Kumar; Mishra, Sundeep

    2015-01-01

    History as a science revolves around memories, travellers' tales, fables and chroniclers' stories, gossip and trans-telephonic conversations. Medicine itself, by the purist's definition, is a non-exact science because of probability-predictability-sensitivity-specificity factors. However, the chronicle of Cardiology in India is quite interesting and intriguing. The heart and circulation were known to humankind from the pre-Vedic era. Various therapeutic measures, including the role of Yoga a...

  5. Software defined network inference with evolutionary optimal observation matrices

    OpenAIRE

    Malboubi, M; Gong, Y; Yang, Z; Wang, X; Chuah, CN; Sharma, P

    2017-01-01

    © 2017 Elsevier B.V. A key requirement for network management is the accurate and reliable monitoring of relevant network characteristics. In today's large-scale networks, this is a challenging task due to the scarcity of network measurement resources and the hard constraints that this imposes. This paper proposes a new framework, called SNIPER, which leverages the flexibility provided by Software-Defined Networking (SDN) to design the optimal observation or measurement matrix that can lead t...

  6. Application-Defined Decentralized Access Control

    Science.gov (United States)

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493

  7. Software Defined Multiband EVA Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to propose a reliable, lightweight, programmable, multi-band, multi-mode, miniaturized frequency-agile EVA software defined radio...

  8. Reconfigurable, Cognitive Software Defined Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — IAI is actively developing Software Defined Radio platforms that can adaptively switch between different modes of operation by modifying both transmit waveforms and...

  9. Software Defined Multiband EVA Radio, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of Phase 2 is to build a reliable, lightweight, programmable, multi-mode, miniaturized EVA Software Defined Radio (SDR) that supports data telemetry,...

  10. Reconfigurable, Cognitive Software Defined Radio, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Intelligent Automation Inc, (IAI) is currently developing a software defined radio (SDR) platform that can adaptively switch between different modes of operation for...

  11. History of Mathematics

    DEFF Research Database (Denmark)

    Hansen, Vagn Lundsgaard; Gray, Jeremy

    Volume 1 in Theme on "History of Mathematics", in "Encyclopedia of Life Support Systems" (EOLSS), developed under the auspices of UNESCO.

  12. Business history and risk

    OpenAIRE

    Terry Gourvish

    2003-01-01

    CARR, in association with the Centre for Business History, University of Leeds, held a successful workshop on 'Business History and Risk' on 20 February 2002. The workshop, which was sponsored by the ESRC, brought together business historians, economists, accountants and risk analysts to develop an interdisciplinary discussion on understandings of risk by employers, workers and governments in different historical settings.

  13. Aggersborg through history

    DEFF Research Database (Denmark)

    Roesdahl, Else

    2014-01-01

    Aggersborg's history from the time of the end of the circular fortress till the present day, with a focus on the late Viking Age and the Middle Ages.

  14. The Two World Histories

    Science.gov (United States)

    Dunn, Ross E.

    2008-01-01

    In the arenas where the two world histories have taken shape, educators vigorously debate among themselves intellectual, pedagogical, and policy issues surrounding world history as a school subject. The people in each arena tend to share, despite internal disagreements, a common set of premises and assumptions for ordering the discussion of world…

  15. History of Science

    Science.gov (United States)

    Oversby, John

    2010-01-01

    In this article, the author discusses why the history of science should be included in the science curriculum in schools. He also presents some opportunities that can come out of using historical contexts, and findings from a study assessing the place of history of science in readily available textbooks.

  16. Optimum Criteria for Developing Defined Structures

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available Basic aspects concerning distributed applications are presented: definition, particularities and importance. For distributed applications, linear, arborescent and graph structures are defined, with different versions and aggregation methods. Distributed applications have associated structures which, through their characteristics, influence the costs of the stages in the development cycle and the exploitation costs transferred to each user. The complexity of the defined structures is analyzed. The minimum and maximum criteria for optimizing distributed application structures are enumerated.

  17. Shaping Sexual Knowledge: A Cultural History of Sex Education in Twentieth Century Europe. Routledge Studies in the Social History of Medicine

    Science.gov (United States)

    Sauerteig, Lutz, Ed.; Davidson, Roger, Ed.

    2012-01-01

    The history of sex education enables us to gain valuable insights into the cultural constructions of what different societies have defined as 'normal' sexuality and sexual health. Yet, the history of sex education has only recently attracted the full attention of historians of modern sexuality. "Shaping Sexual Knowledge: A Cultural History of…

  18. Freud and history before 1905: from defending to questioning the theory of a glorious past.

    Science.gov (United States)

    Cotti, Patricia

    2008-01-01

    By sticking closely to Freud's use of the German term Geschichte (history, story) between 1894 and 1905, I will reveal two conceptions of history. The first one, the theory of the glorious past and its archaeological metaphor, accompanied and sustained the seduction theory of cultural history. I will define how this change was determined by an evolution in Freud's conceptions of childhood prehistory and original history. I will also question how the history problem interfered with Freud's auto-analysis.

  19. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  20. Portraying User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2008-01-01

    The user interface is coming of age. Papers addressing UI history have appeared in fair amounts in the last 25 years. Most of them address particular aspects, such as an innovative interface paradigm or the contribution of a visionary or a research lab. Contrasting this, the paper analyses a selected sample of papers on UI history at large. The analysis shows that the current state of the art is featured by three aspects: firstly, internalism, in that the papers address the technologies in their own right with little contextualization; secondly, whiggism, in that they largely address prevailing UI technologies; and thirdly, history from above, in that they focus on the great deeds of the visionaries. The paper then compares this state of the art in UI history to the much more mature fields of history of computing and history of technology. Based hereon, some speculations...

  1. War in European history

    International Nuclear Information System (INIS)

    Howard, M.

    1981-01-01

    War history as a modern historical discipline is no longer a mere history of arms technology or a chronicle of battles. It deals with the change of warfare, shows how the wars of the various ages determined society, and, vice versa, investigates the influence of social, economic and mentality-historical changes on war. With this survey, which covers the period between the Middle Ages and the recent past, the author has presented a small masterpiece of the history of war. A book like this is particularly important and instructive in a time when everything depends on the prevention of wars. (orig.) [de]

  2. Science A history

    CERN Document Server

    Gribbin, John

    2002-01-01

    From award-winning science writer John Gribbin, "Science: A History" is the enthralling story of the men and women who changed the way we see the world, and the turbulent times they lived in. From Galileo, tried by the Inquisition for his ideas, to Newton, who wrote his rivals out of the history books; from Marie Curie, forced to work apart from male students for fear she might excite them, to Louis Agassiz, who marched his colleagues up a mountain to prove that the ice ages had occurred. Filled with pioneers, visionaries, eccentrics and madmen, this is the history of science as it has never been told before.

  3. Defining operational taxonomic units using DNA barcode data.

    Science.gov (United States)

    Blaxter, Mark; Mann, Jenna; Chapman, Tom; Thomas, Fran; Whitton, Claire; Floyd, Robin; Abebe, Eyualem

    2005-10-29

    The scale of diversity of life on this planet is a significant challenge for any scientific programme hoping to produce a complete catalogue, whatever means is used. For DNA barcoding studies, this difficulty is compounded by the realization that any chosen barcode sequence is not the gene 'for' speciation and that taxa have evolutionary histories. How are we to disentangle the confounding effects of reticulate population genetic processes? Using the DNA barcode data from meiofaunal surveys, here we discuss the benefits of treating the taxa defined by barcodes without reference to their correspondence to 'species', and suggest that using this non-idealist approach facilitates access to taxon groups that are not accessible to other methods of enumeration and classification. Major issues remain, in particular the methodologies for taxon discrimination in DNA barcode data.

  4. Thermodynamics of quantum spacetime histories

    Science.gov (United States)

    Smolin, Lee

    2017-11-01

    We show that the simplicity constraints, which define the dynamics of spin foam models, imply, and are implied by, the first law of thermodynamics, when the latter is applied to causal diamonds in the quantum spacetime. This result reveals an intimate connection between the holographic nature of gravity, as reflected by the Bekenstein entropy, and the fact that general relativity and other gravitational theories can be understood as constrained topological field theories. To state and derive this correspondence we describe causal diamonds in the causal structure of spin foam histories and generalize arguments given for the near horizon region of black holes by Frodden, Gosh and Perez [Phys. Rev. D 87, 121503 (2013); Phys. Rev. D 89, 084069 (2014); Phys. Rev. Lett. 107, 241301 (2011); Phys. Rev. Lett. 108, 169901(E) (2012)] and Bianchi [arXiv:1204.5122]. This allows us to apply a recent argument of Jacobson [Phys. Rev. Lett. 116, 201101 (2016)] to show that if a spin foam history has a semiclassical limit described in terms of a smooth metric geometry, that geometry satisfies the Einstein equations. These results also suggest a proposal for a quantum equivalence principle.

  5. History of CERN. V. 2

    International Nuclear Information System (INIS)

    Hermann, A.; Krige, J.; Mersits, U.; Pestre, D.; Weiss, L.

    1990-01-01

    This volume of the History of CERN starts on 8 October 1954, when the Council of the new organization met for the first time, and takes the history through the mid-1960s, when it was decided to equip the laboratory with a second generation of accelerators and a new Director-General was nominated. It covers the building and the running of the laboratory during these dozen years: it studies the construction and exploitation of the 600 MeV Synchro-cyclotron and the 28 GeV Proton Synchrotron, it considers the setting up of the material and organizational infrastructure which made this possible, and it covers the reigns of four Directors-General, Felix Bloch, Cornelis Bakker, John Adams and Victor Weisskopf. Part I describes the various aspects which together constitute the history of CERN and aims to offer a synchronic, year-by-year account of CERN's main activities. Part II deals primarily with technological achievements and scientific results, and it includes the most technical chapters in the volume. Part III defines how the CERN 'system' functioned: how this science-based organization worked, how it chose, planned and concretely realized its experimental programme on the shop-floor, and how it identified the equipment it would need in the long term and organized its relations with the outside world, notably the political world. The concluding Part IV brings out the specificity of CERN, to identify the ways in which it differed from other big science laboratories in the 1950s and 1960s, and to try to understand where its uniqueness and originality lay. (author). refs.; figs.; tabs

  6. Arizona transportation history.

    Science.gov (United States)

    2011-12-01

    The Arizona transportation history project was conceived in anticipation of Arizona's centennial, which will be celebrated in 2012. Following approval of the Arizona Centennial Plan in 2007, the Arizona Department of Transportation (ADOT) recog...

  7. History of quantum theory

    International Nuclear Information System (INIS)

    Hund, F.

    1980-01-01

    The history of quantum theory, from the first quantum concepts (1900) to the formation of quantum mechanics, is systematically presented in the monograph. Special attention is paid to the development of the ideas of quantum physics, and schemes of this development are given. Quantum theory is presented abstractly as the teaching about the role which the value h, characterizing the elementary quantum of action, plays in nature: in statistics, as a unit for counting the number of possible states; in corpuscular-wave dualism for light, as a value determining the interaction of light and matter and as a component of atom dynamics; and in corpuscular-wave dualism for matter. Accordingly, the history of the development of quantum theory is considered in the following sequence: the discovery of h; the history of quantum statistics; the history of light quanta and initial atom dynamics; the crisis of this dynamics and its resolution; matter waves; and, in conclusion, the completion of quantum mechanics, including its applications and further development.

  8. Personal history, beyond narrative

    DEFF Research Database (Denmark)

    Køster, Allan

    2017-01-01

    Narrative theories currently dominate our understanding of how selfhood is constituted and concretely individuated throughout personal history. Despite this success, the narrative perspective has recently been exposed to a range of critiques. Whilst these critiques have been effective in pointing out the shortcomings of narrative theories of selfhood, they have been less willing and able to suggest alternative ways of understanding personal history. In this article, I assess the criticisms and argue that an adequate phenomenology of personal history must also go beyond narrative. Drawing on a distinction between history and narrative, I outline an account of historical becoming through a process of sedimentation and a rich notion of what I call historical selfhood on an embodied level. Five embodied existentials are suggested, sketching a preliminary understanding of how selves are concretely...

  9. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available Video: "The History of Bioterrorism"

  10. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available Video: "The History of Bioterrorism"

  11. Water Level Station History

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Images contain station history information for 175 stations in the National Water Level Observation Network (NWLON). The NWLON is a network of long-term,...

  12. IUTAM a short history

    CERN Document Server

    Juhasz, Stephen

    2016-01-01

    This book presents extensive information related to the history of IUTAM. The initial chapters focus on IUTAM's history and selected organizational aspects. Subsequent chapters provide extensive data and statistics, while the closing section showcases photos from all periods of the Union's history. The history of IUTAM, the International Union of Theoretical and Applied Mechanics, began at a conference in 1922 in Innsbruck, Austria, where von Kármán put forward the idea of an international congress covering the whole domain of applied mechanics. In 1946 IUTAM was then formally launched in Paris, France. IUTAM has since organized more than 24 world congresses and 380 symposia, representing all fields of mechanics and highlighting advances by prominent international researchers. The efforts of IUTAM and its about 50 member countries serve to promote the mechanical sciences and the advancement of human society, addressing many key challenges. In this context, IUTAM preserves important traditions while...

  13. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available ... as bioterrorist weapons. Watch the Complete Program "The History of Bioterrorism" (26 min 38 sec) Watch ... Institute of Infectious Diseases (USAMRIID), the Food and Drug Administration (FDA), and the Centers for Disease Control ...

  14. Life History Approach

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2015-01-01

    ... as in everyday life. Life histories represent lived lives past, present and anticipated future. As such they are interpretations of individuals' experiences of the way in which societal dynamics take place in the individual body and mind, either by the individual him/herself or by another biographer. The Life History approach developed from interpreting autobiographical, and later certain other forms of language-interactive, material as moments of life history; i.e., it is basically a hermeneutic approach. Talking about a psycho-societal approach indicates the ambition of attacking the dichotomy of the social and the psychic, both in the interpretation procedure and in some main theoretical understandings of language, body and mind. My article will present reflections on the use of life-history-based methodology in learning and education research as a kind of learning story of research work.

  15. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available ... Video: "The History of Bioterrorism" ... This video describes the Category ...

  16. Oral history database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Separately, each history provides an in-depth view into the professional and personal lives of individual participants. Together, they have the power to illuminate...

  17. History of Bioterrorism: Botulism

    Medline Plus

    Full Text Available ... Blog: Public Health Matters Video: "The History of Bioterrorism" ...

  18. Mercury's Early Geologic History

    Science.gov (United States)

    Denevi, B. W.; Ernst, C. M.; Klima, R. L.; Robinson, M. S.

    2018-05-01

    A combination of geologic mapping, compositional information, and geochemical models are providing a better understanding of Mercury's early geologic history, and allow us to place it in the context of the Moon and the terrestrial planets.

  19. Transformation of History textbooks

    DEFF Research Database (Denmark)

    Haue, Harry

    2013-01-01

    The article examines Danish and German history textbooks from the past two centuries with regard to their weighting of national and global content.

  20. History, Passion, and Performance.

    Science.gov (United States)

    Campbell, Kay N

    2017-04-01

    History, Passion, and Performance was chosen as the theme for the 75th anniversary of the American Association of Occupational Health Nurses (AAOHN) kickoff. The American Association of Occupational Health Nurses has a long history created by passionate, dedicated members. This article highlights historical foundations of the Association, describes the occupational health nurse's passion to drive quality care for workers and discusses future professional and organizational challenges.

  1. An attempt to specify thermal history in CZ silicon wafers and possibilities for its modification

    International Nuclear Information System (INIS)

    Kissinger, G.; Sattler, A.; Mueller, T.; Ammon, W. von

    2007-01-01

    The term thermal history of silicon wafers represents the whole variety of process parameters of crystal growth. The aim of this contribution is an attempt to specify thermal history by one parameter that is directly correlated to the bulk microdefect density. The parameter that reflects thermal history and correlates it with nucleation of oxide precipitates is the concentration of VO2 complexes. The VO2 concentration in silicon wafers is too low to be measured by FTIR, but it can be obtained from the loss of interstitial oxygen during a standardized thermal treatment. Based on this, the vacancy concentration frozen in during crystal cooling in the ingot can be calculated. RTA treatments above 1150 °C create a well defined level of the VO2 concentration in silicon wafers. This means that a well controlled modification of the thermal history is possible. We also investigated the kinetics of reduction of the as-grown excess VO2 concentration during RTA treatments at 950 °C and 1050 °C and the effectiveness of this approach to totally delete the thermal history

  2. The natural history of Perthes' disease

    OpenAIRE

    Terjesen, Terje; Wiig, Ola; Svenningsen, Svein

    2010-01-01

    Background The prognosis in Perthes' disease varies considerably according to certain risk factors, but there is no consensus regarding the relative importance of these factors. We assessed the natural history of the disease and defined prognostic factors of value in deciding the proper treatment. Patients and methods During the 5-year period 1996–2000, a nationwide study on Perthes' disease was performed in Norway. 425 patients were registered. The present study involved the 212 children (me...

  3. Quantum arrival time formula from decoherent histories

    International Nuclear Information System (INIS)

    Halliwell, J.J.; Yearsley, J.M.

    2009-01-01

    We use the decoherent histories approach to quantum mechanics to compute the probability for a wave packet to cross the origin during a given time interval. We define class operators (sums of strings of projectors) characterizing quantum-mechanical crossing and simplify them using a semiclassical approximation. Using these class operators we find that histories crossing the origin during different time intervals are approximately decoherent for a variety of initial states. Probabilities may therefore be assigned and coincide with the flux of the wave packet (the standard semiclassical formula), and are positive. The known initial states for which the flux is negative (backflow states) are shown to correspond to non-decoherent sets of histories, so probabilities may not be assigned.

  4. History of infrared detectors

    Science.gov (United States)

    Rogalski, A.

    2012-09-01

    This paper overviews the history of infrared detector materials, starting with Herschel's experiment with a thermometer on February 11th, 1800. Infrared detectors are in general used to detect, image, and measure patterns of the thermal heat radiation which all objects emit. At the beginning, their development was connected with thermal detectors, such as thermocouples and bolometers, which are still used today and which are generally sensitive to all infrared wavelengths and operate at room temperature. The second kind of detectors, called photon detectors, was mainly developed during the 20th century to improve sensitivity and response time. These detectors have been extensively developed since the 1940s. Lead sulphide (PbS) was the first practical IR detector, with sensitivity to infrared wavelengths up to ˜3 μm. After World War II, infrared detector technology development was, and continues to be, primarily driven by military applications. The discovery of the variable band gap HgCdTe ternary alloy by Lawson and co-workers in 1959 opened a new area in IR detector technology and has provided an unprecedented degree of freedom in infrared detector design. Many of these advances were transferred to IR astronomy from Departments of Defence research. Later on, civilian applications of infrared technology came to be frequently called "dual-use technology applications." One should point out the growing utilisation of IR technologies in the civilian sphere based on the use of new materials and technologies, as well as the noticeable price decrease in these high cost technologies. In the last four decades, different types of detectors have been combined with electronic readouts to make detector focal plane arrays (FPAs). Development in FPA technology has revolutionized infrared imaging. Progress in integrated circuit design and fabrication techniques has resulted in continued rapid growth in the size and performance of these solid state arrays.

  5. Reconstitution of Low Bandwidth Reaction History

    International Nuclear Information System (INIS)

    May, M.; Clancy, T.; Fittinghoff, D.; Gennaro, P.; Hagans, K.; Halvorson, G.; Lowry, M.; Perry, T.; Roberson, P.; Smith, D.; Teruya, A.; Blair, J.; Davis, B.; Hunt, E.; Emkeit, B.; Galbraith, J.; Kelly, B.; Montoya, R.; Nickel, G.; Ogle, J.; Wilson, K.; Wood, M.

    2004-01-01

    The goal of the Test Readiness Program is to transition to a 24-month test readiness posture and, if approved, move to an 18-month posture. One of the key components of the Test Readiness Program necessary to meet this goal is the reconstitution of the important diagnostics. Since the end of nuclear testing, the ability to field diagnostics on a nuclear test has deteriorated. Reconstitution of diagnostics before those who had experience in nuclear testing either retire or leave is essential to achieving a shorter test readiness posture. Also, the data recording systems have not been used since the end of testing. This report documents the reconstitution of one vital diagnostic for FY04: the low bandwidth reaction history diagnostic. Reaction history is one of the major diagnostics that has been used on all LLNL and LANL tests since the early days of nuclear testing. Reaction history refers to measuring the time history of the gamma and neutron output from a nuclear test. This gives direct information on the nuclear reactions taking place in the device. The reaction history measurements are one of the prime measurements the nuclear weapon scientists use to validate their models of device performance. All tests currently under consideration require the reaction history diagnostic. Thus moving to a shorter test readiness posture requires the reconstitution of the ability to make reaction history measurements. Reconstitution of reaction history was planned in two steps. Reaction history measurements that have been used in the past can be broadly placed into two categories. The most common type of reaction history, and the one that has been performed on virtually all nuclear tests, is termed low bandwidth reaction history. This measurement has a time response that is limited by the bandpass of kilometer length coaxial cables. When higher bandwidth has been required for specific measurements, fiber optic techniques have been used. This is referred to as high

  6. Hanford defined waste model limitations and improvements

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    Recommendation 93-5 Implementation Plan, Milestone 5.6.3.1.i requires issuance of this report, which addresses "updates to the tank contents model". This report summarizes the review of the Hanford Defined Waste, Revision 4, model limitations and provides conclusions and recommendations for potential updates to the model

  7. Parallel Education and Defining the Fourth Sector.

    Science.gov (United States)

    Chessell, Diana

    1996-01-01

    Parallel to the primary, secondary, postsecondary, and adult/community education sectors is education not associated with formal programs--learning in arts and cultural sites. The emergence of cultural and educational tourism is an opportunity for adult/community education to define itself by extending lifelong learning opportunities into parallel…

  8. Bruxism defined and graded: an international consensus

    NARCIS (Netherlands)

    Lobbezoo, F.; Ahlberg, J.; Glaros, A.G.; Kato, T.; Koyano, K.; Lavigne, G.J.; de Leeuw, R.; Manfredini, D.; Svensson, P.; Winocur, E.

    2013-01-01

    To date, there is no consensus about the definition and diagnostic grading of bruxism. A written consensus discussion was held among an international group of bruxism experts so as to formulate a definition of bruxism and to suggest a grading system for its operationalisation. The expert group defined

  9. 7 CFR 28.950 - Terms defined.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing..., TESTING, AND STANDARDS Cotton Fiber and Processing Tests Definitions § 28.950 Terms defined. As used... Agricultural Marketing Service of the U.S. Department of Agriculture. (c) Administrator. The Administrator of...

  10. 47 CFR 54.401 - Lifeline defined.

    Science.gov (United States)

    2010-10-01

    ... SERVICE Universal Service Support for Low-Income Consumers § 54.401 Lifeline defined. (a) As used in this subpart, Lifeline means a retail local service offering: (1) That is available only to qualifying low-income consumers; (2) For which qualifying low-income consumers pay reduced charges as a result of...

  11. How Should Energy Be Defined throughout Schooling?

    Science.gov (United States)

    Bächtold, Manuel

    2018-01-01

    The question of how to teach energy has been renewed by recent studies focusing on the learning and teaching progressions for this concept. In this context, one question has been, for the most part, overlooked: how should energy be defined throughout schooling? This paper addresses this question in three steps. We first identify and discuss two…

  12. Big data and software defined networks

    CERN Document Server

    Taheri, Javid

    2018-01-01

    Big Data Analytics and Software Defined Networking (SDN) are helping to manage the extraordinary increase in data usage and computer processing power provided by Cloud Data Centres (CDCs). This new book investigates areas where Big Data and SDN can help each other in delivering more efficient services.

  13. Delta Semantics Defined By Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt; Kyng, Morten; Madsen, Ole Lehrmann

    and the possibility of using predicates to specify state changes. In this paper a formal semantics for Delta is defined and analysed using Petri nets. Petri nets were chosen because the ideas behind Petri nets and Delta coincide on several points. A number of proposals for changes in Delta, which resulted from...

  14. Towards a Southern African English Defining Vocabulary

    African Journals Online (AJOL)

    user

    of parameters, such as avoiding synonyms and antonyms, to determine which words are necessary to write definitions in a concise and simple way. It has been found that existing defining vocabularies lack certain words that would make definitions more accessible to southern African learners, and therefore there is a need ...

  15. Spaces defined by the Paley function

    Energy Technology Data Exchange (ETDEWEB)

    Astashkin, S V [Samara State University, Samara (Russian Federation); Semenov, E M [Voronezh State University, Faculty of Mathematics, Voronezh (Russian Federation)

    2013-07-31

    The paper is concerned with Haar and Rademacher series in symmetric spaces, and also with the properties of spaces defined by the Paley function. In particular, the symmetric hull of the space of functions with uniformly bounded Paley function is found. Bibliography: 27 titles.

  16. Pointwise extensions of GSOS-defined operations

    NARCIS (Netherlands)

    Hansen, H.H.; Klin, B.

    2011-01-01

    Final coalgebras capture system behaviours such as streams, infinite trees and processes. Algebraic operations on a final coalgebra can be defined by distributive laws (of a syntax functor S over a behaviour functor F). Such distributive laws correspond to abstract specification formats. One such

  17. Pointwise Extensions of GSOS-Defined Operations

    NARCIS (Netherlands)

    H.H. Hansen (Helle); B. Klin

    2011-01-01

    Final coalgebras capture system behaviours such as streams, infinite trees and processes. Algebraic operations on a final coalgebra can be defined by distributive laws (of a syntax functor $\\FSig$ over a behaviour functor $F$). Such distributive laws correspond to abstract specification

  18. Defining Virtual Reality: Dimensions Determining Telepresence.

    Science.gov (United States)

    Steuer, Jonathan

    1992-01-01

    Defines virtual reality as a particular type of experience (in terms of "presence" and "telepresence") rather than as a collection of hardware. Maintains that media technologies can be classified and studied in terms of vividness and interactivity, two attributes on which virtual reality ranks very high. (SR)

  19. A self-defining hierarchical data system

    Science.gov (United States)

    Bailey, J.

    1992-01-01

    The Self-Defining Data System (SDS) is a system which allows the creation of self-defining hierarchical data structures in a form which allows the data to be moved between different machine architectures. Because the structures are self-defining they can be used for communication between independent modules in a distributed system. Unlike disk-based hierarchical data systems such as Starlink's HDS, SDS works entirely in memory and is very fast. Data structures are created and manipulated as internal dynamic structures in memory managed by SDS itself. A structure may then be exported into a caller supplied memory buffer in a defined external format. This structure can be written as a file or sent as a message to another machine. It remains static in structure until it is reimported into SDS. SDS is written in portable C and has been run on a number of different machine architectures. Structures are portable between machines with SDS looking after conversion of byte order, floating point format, and alignment. A Fortran callable version is also available for some machines.
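    The export/import cycle this abstract describes — building a hierarchical structure in memory, flattening it into a self-describing buffer in a defined external format, and reimporting it on a machine with a different architecture — can be sketched as follows. This is a minimal illustration of the idea only: the type tags, field layout, and function names are assumptions made for the example, not the actual SDS interface.

```python
import struct

# Each node is written as (type tag, name, payload), so a reader needs no
# external schema -- the data is self-defining. Tags and layout are
# illustrative assumptions, not the real SDS wire format.
TAG_INT = 1
TAG_STRUCT = 2

def export_node(name, value):
    """Serialize a nested dict/int tree into self-describing big-endian bytes."""
    name_b = name.encode("ascii")
    if isinstance(value, dict):
        body = b"".join(export_node(k, v) for k, v in value.items())
        header = struct.pack(">BB", TAG_STRUCT, len(name_b)) + name_b
        return header + struct.pack(">I", len(body)) + body
    # A fixed byte order (">") keeps the buffer portable across architectures.
    return struct.pack(">BB", TAG_INT, len(name_b)) + name_b + struct.pack(">q", value)

def import_node(buf, offset=0):
    """Rebuild one node from the buffer; returns (name, value, next_offset)."""
    tag, name_len = struct.unpack_from(">BB", buf, offset)
    offset += 2
    name = buf[offset:offset + name_len].decode("ascii")
    offset += name_len
    if tag == TAG_STRUCT:
        (body_len,) = struct.unpack_from(">I", buf, offset)
        offset += 4
        end = offset + body_len
        children = {}
        while offset < end:
            child_name, child_value, offset = import_node(buf, offset)
            children[child_name] = child_value
        return name, children, offset
    (value,) = struct.unpack_from(">q", buf, offset)
    return name, value, offset + 8
```

    Because every node carries its own type tag and name, the exported buffer can be written to a file or sent as a message and decoded without a schema; the fixed big-endian layout sidesteps byte-order differences between machines (floating-point format and alignment, which SDS also handles, are omitted here for brevity).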

  20. Experience with a pharmacy technician medication history program.

    Science.gov (United States)

    Cooper, Julie B; Lilliston, Michelle; Brooks, DeAnne; Swords, Bruce

    2014-09-15

    The implementation and outcomes of a pharmacy technician medication history program are described. An interprofessional medication reconciliation team, led by a clinical pharmacist and a clinical nurse specialist, was charged with implementing a new electronic medication reconciliation system to improve compliance with medication reconciliation at discharge and capture compliance-linked reimbursement. The team recommended that the pharmacy department be allocated new pharmacy technician full-time-equivalent positions to assume ownership of the medication history process. Concurrent with the implementation of this program, a medication history standard was developed to define rules for documentation of what a patient reports he or she is actually taking. The standard requires a structured interview with the patient or caregiver and validation with outside sources as indicated to determine which medications to document in the medication history. The standard is based on four medication administration category rules: scheduled, as-needed, short-term, and discontinued medications. The medication history standard forms the core of the medication history technician training and accountability program. Pharmacy technicians are supervised by pharmacists, using a defined accountability plan based on a set of medical staff approved rules for what medications comprise a best possible medication history. Medication history accuracy and completeness rates have been consistently over 90% and rates of provider compliance with medication reconciliation rose from under 20% to 100% since program implementation. A defined medication history based on a medication history standard served as an effective foundation for a pharmacy technician medication history program, which helped improve provider compliance with discharge medication reconciliation. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
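    The four medication administration category rules at the core of the standard (scheduled, as-needed, short-term, and discontinued) lend themselves to a simple data model. The sketch below is hypothetical: the field names and the filtering rule are illustrative assumptions, not the hospital's actual schema or accountability plan.

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative encoding of the four medication administration categories
# described in the standard; names here are assumptions for the example.
class Category(Enum):
    SCHEDULED = "scheduled"
    AS_NEEDED = "as-needed"
    SHORT_TERM = "short-term"
    DISCONTINUED = "discontinued"

@dataclass
class HistoryEntry:
    drug: str
    category: Category
    validated: bool  # confirmed with an outside source when indicated

def active_medication_list(entries):
    """Toy filter: keep validated entries the patient actually takes;
    discontinued drugs are documented separately, not carried as active."""
    return [e for e in entries
            if e.validated and e.category is not Category.DISCONTINUED]
```

    The point of such a structure is that the technician's interview output becomes checkable: every entry must land in exactly one category, and supervising pharmacists can audit the list against the rule set rather than against free-text notes.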

  1. [History of viral hepatitis].

    Science.gov (United States)

    Fonseca, José Carlos Ferraz da

    2010-01-01

    The history of viral hepatitis goes back thousands of years and is a fascinating one. When humans were first infected by such agents, a natural repetitive cycle began, with the capacity to infect billions of humans, thus decimating the population and causing sequelae in thousands of lives. This article reviews the available scientific information on the history of viral hepatitis. All the information was obtained through extensive bibliographic review, including original and review articles and consultations on the internet. There are reports on outbreaks of jaundice epidemics in China 5,000 years ago and in Babylon more than 2,500 years ago. The catastrophic history of great jaundice epidemics and pandemics is well known and generally associated with major wars. In the American Civil War, 40,000 cases occurred among Union troops. In 1885, an outbreak of catarrhal jaundice affected 191 workers at the Bremen shipyard (Germany) after vaccination against smallpox. In 1942, 28,585 soldiers became infected with hepatitis after inoculation with the yellow fever vaccine. The number of cases of hepatitis during the Second World War was estimated to be 16 million. Only in the twentieth century were the main agents causing viral hepatitis identified. The hepatitis B virus was the first to be discovered. In this paper, through reviewing the history of major epidemics caused by hepatitis viruses and the history of discovery of these agents, singular peculiarities were revealed. Examples of this include the accidental or chance discovery of the hepatitis B and D viruses.

  2. Making history critical.

    Science.gov (United States)

    Learmonth, Mark

    2017-08-21

    Purpose: The purpose of this paper is to explore a possible discursive history of National Health Service (NHS) "management" (with management, for reasons that will become evident, very much in scare quotes). Such a history is offered as a complement, as well as a counterpoint, to the more traditional approaches that have already been taken to the history of the issue. Design/methodology/approach: Document analysis and interviews with UK NHS trust chief executives. Findings: After explicating the assumptions of the method, the paper suggests, through a range of empirical sources, that the NHS has undergone an era of administration, an era of management and an era of leadership. Research limitations/implications: The paper enables a recasting of the history of the NHS; in particular, the potential for such a discursive history to highlight the interests supported and denied by different representational practices. Practical implications: Today's so-called leaders are leaders because of conventional representational practices - not because of some essence about what they really are. Social implications: New ideas about the nature of management. Originality/value: The value of thinking in terms of what language does - rather than what it might represent.

  3. Defining Tiger Parenting in Chinese Americans.

    Science.gov (United States)

    Kim, Su Yeong

    2013-09-01

    "Tiger" parenting, as described by Amy Chua [2011], has instigated scholarly discourse on this phenomenon and its possible effects on families. Our eight-year longitudinal study, published in the Asian American Journal of Psychology [Kim, Wang, Orozco-Lapray, Shen, & Murtuza, 2013b], demonstrates that tiger parenting is not a common parenting profile in a sample of 444 Chinese American families. Tiger parenting also does not relate to superior academic performance in children. In fact, the best developmental outcomes were found among children of supportive parents. We examine the complexities around defining tiger parenting by reviewing classical literature on parenting styles and scholarship on Asian American parenting, along with Amy Chua's own description of her parenting method, to develop, define, and categorize variability in parenting in a sample of Chinese American families. We also provide evidence that supportive parenting is important for the optimal development of Chinese American adolescents.

  4. Defining enthesitis in spondyloarthritis by ultrasound

    DEFF Research Database (Denmark)

    Terslev, Lene; Naredo, E; Iagnocco, A

    2014-01-01

    Objective: To standardize ultrasound (US) in enthesitis. Methods: An Initial Delphi exercise was undertaken to define US detected enthesitis and its core components. These definitions were subsequently tested on static images taken from Spondyloarthritis (SpA) patients in order to evaluate...... elementary component. On static images the intra-observer reliability showed a high degree of variability for the detection of elementary lesions with kappa coefficients ranging from 0.14 - 1. The inter-observer kappa value was variable with the lowest kappa for enthesophytes (0.24) and the best for Doppler...... activity at the enthesis (0.63). Conclusion: This is the first consensus based definition of US enthesitis and its elementary components and the first step performed to ensure a higher degree of homogeneity and comparability of results between studies and in daily clinical work. Defining Enthesitis...

  5. Control of System with Defined Risk Level

    Directory of Open Access Journals (Sweden)

    Pavol Tomasov

    2002-01-01

    Full Text Available In the following paper the basic requirements for system control with a defined risk level are presented. The paper is intended as an introduction to the theoretical apparatus developed over several years of research work in the Department of Information and Safety Systems in this area. This means modifying or creating new parts of information theory, system theory, and control theory. These parts are necessary for the analysis and synthesis tasks in systems where the dominant attribute of control is a defined risk level. The basic problem is the creation of protection mechanisms against threats from inside the controlled system and from its environment. Each risk reduction mechanism requires some redundancy, which must be incorporated into the control algorithm in an exactly determined way.

  6. FINANCIAL ACCOUNTING QUALITY AND ITS DEFINING CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Andra M. ACHIM

    2014-11-01

    Full Text Available The importance of high-quality financial statements is highlighted by the main standard-setting institutions active in the field of accounting and reporting. These have issued Conceptual Frameworks which state and describe the qualitative characteristics of accounting information. In this qualitative study, the research methodology consists of reviewing the literature related to the definition of accounting quality and striving to understand how it can be explained. The main objective of the study is to identify the characteristics information should possess in order to be of high quality. These characteristics also contribute to the way of defining financial accounting quality. The main conclusions that arise from this research are that financial accounting quality cannot be uniquely defined and that financial information is of good quality when it possesses the qualitative characteristics incorporated in the conceptual frameworks issued by both the International Accounting Standards Board and the Financial Accounting Standards Board.

  7. Exploring self-defining memories in schizophrenia.

    Science.gov (United States)

    Raffard, Stéphane; D'Argembeau, Arnaud; Lardi, Claudia; Bayard, Sophie; Boulenger, Jean-Philippe; Van Der Linden, Martial

    2009-01-01

    Previous studies have shown that patients with schizophrenia are impaired in recalling specific events from their personal past. However, the relationship between autobiographical memory impairments and disturbance of the sense of identity in schizophrenia has not been investigated in detail. In this study the authors investigated schizophrenic patients' ability to recall self-defining memories; that is, memories that play an important role in building and maintaining the self-concept. Results showed that patients recalled as many specific self-defining memories as healthy participants. However, patients with schizophrenia exhibited an abnormal reminiscence bump and reported different types of thematic content (i.e., they recalled fewer memories about past achievements and more memories regarding hospitalisation and stigmatisation of illness). Furthermore, the findings suggest that impairments in extracting meaning from personal memories could represent a core disturbance of autobiographical memory in patients with schizophrenia.

  8. The history of ZED-2

    International Nuclear Information System (INIS)

    Jones, R.

    2010-01-01

    This presentation gives the history of ZED-2 reactor at the Chalk River Laboratories. It traces the genealogy from Fermi Pile (1942) and ZEEP (1945) to the present ZED-2 reactor. ZED-2 is larger than ZEEP to allow larger critical lattices of larger CANDU type channels. There are two basic types of measurements made in ZED-2: critical size (buckling) of lattice and detailed reaction rate distribution in a cell. The presentation goes on to discuss research activities on ZED-2 from 1960 to the present.

  9. The history of ZED-2

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2010-07-01

    This presentation gives the history of ZED-2 reactor at the Chalk River Laboratories. It traces the genealogy from Fermi Pile (1942) and ZEEP (1945) to the present ZED-2 reactor. ZED-2 is larger than ZEEP to allow larger critical lattices of larger CANDU type channels. There are two basic types of measurements made in ZED-2: critical size (buckling) of lattice and detailed reaction rate distribution in a cell. The presentation goes on to discuss research activities on ZED-2 from 1960 to the present.

  10. Construction history and construction management

    International Nuclear Information System (INIS)

    Agh, S.

    1999-01-01

    The process of pre-design and design preparation of the Mochovce NPP as well as the construction history of the plant is highlighted, including the financing aspect and problems arising from changes in the technological and other conditions of start-up of the reactor units. The results of international audits performed to improve the level of nuclear safety and implementation of the measures suggested are also described. The milestones of the whole construction process and start-up process, the control and quality system, and the methods of control and management of the complex construction project are outlined. (author)

  11. Basic technical parameters of magnetometers with ferromagnetic transducers and a method to define them

    International Nuclear Information System (INIS)

    Nagiello, Z.

    1980-01-01

    The basic technical parameters of magnetometers with ferromagnetic transducers, and measuring methods to define these parameters, are discussed. Special attention is paid to factors which significantly affect the accuracy of these measuring instruments. (author)

  12. Improving network management with Software Defined Networking

    International Nuclear Information System (INIS)

    Dzhunev, Pavel

    2013-01-01

    Software-defined networking (SDN) was developed as an alternative to closed networks in data processing centers by providing a means to separate the control layer from the data layer in switches and routers. SDN introduces new possibilities for network management and configuration methods. In this article, we identify problems with the current state-of-the-art network configuration and management mechanisms and introduce mechanisms to improve various aspects of network management

  13. Stateless multicast switching in software defined networks

    OpenAIRE

    Reed, Martin J.; Al-Naday, Mays; Thomos, Nikolaos; Trossen, Dirk; Petropoulos, George; Spirou, Spiros

    2016-01-01

    Multicast data delivery can significantly reduce traffic in operators' networks, but has been limited in deployment due to concerns such as the scalability of state management. This paper shows how multicast can be implemented in contemporary software defined networking (SDN) switches, with less state than existing unicast switching strategies, by utilising a Bloom Filter (BF) based switching technique. Furthermore, the proposed mechanism uses only proactive rule insertion, and thus, is not l...
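    The stateless forwarding idea in this abstract — encoding a multicast delivery tree as an in-packet Bloom filter over link identifiers, so switches keep no per-group state — can be sketched as below. The filter size, hash construction, and link naming are assumptions for illustration, not the paper's exact scheme.

```python
import hashlib

# In-packet Bloom filter sketch: the sender encodes every link of the
# multicast tree into one filter; each switch tests its outgoing links
# against the filter carried in the packet header. Parameters are
# illustrative assumptions.
M = 256  # filter size in bits
K = 4    # number of hash functions

def _bit_positions(link_id):
    """Derive K bit positions from independent slices of a SHA-256 digest."""
    digest = hashlib.sha256(link_id.encode()).digest()
    return [int.from_bytes(digest[4 * i:4 * i + 4], "big") % M for i in range(K)]

def encode_tree(link_ids):
    """Build the in-packet Bloom filter covering every link of the tree."""
    bf = 0
    for link in link_ids:
        for pos in _bit_positions(link):
            bf |= 1 << pos
    return bf

def should_forward(bf, link_id):
    """Stateless per-link test a switch runs; false positives are possible."""
    return all(bf >> pos & 1 for pos in _bit_positions(link_id))
```

    A switch forwards on each outgoing link whose identifier tests positive. Bloom filters admit occasional false positives (spurious extra forwarding on a link outside the tree) but never false negatives, so every intended receiver is reached without any multicast group state in the switch.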

  14. Defining Trust Using Expected Utility Theory

    OpenAIRE

    Arai, Kazuhiro

    2009-01-01

    Trust has been discussed in many social sciences, including economics, psychology, and sociology. However, there is no widely accepted definition of trust. In particular, there is no definition that can be used for economic analysis. This paper regards trust as expectation and defines it using expected utility theory together with concepts such as betrayal premium. In doing so, it rejects the widely accepted black-and-white view that (un)trustworthy people are always (un)trustworthy. This pape...
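    The abstract's core move — treating trust as an expectation and making it operational with expected utility — can be illustrated with a toy model. The functions and payoff structure below are assumptions made for the example, not Arai's actual formalisation.

```python
# Toy model: trust is the subjective probability p that a partner honours
# an agreement, and trusting is rational when the expected utility of
# relying on the partner clears the outside option. All names and payoffs
# here are illustrative assumptions.

def expected_utility_of_trusting(p, payoff_honoured, payoff_betrayed):
    """E[U] of relying on the partner: p * gain + (1 - p) * loss."""
    return p * payoff_honoured + (1 - p) * payoff_betrayed

def trusts(p, payoff_honoured, payoff_betrayed, outside_option=0.0):
    """In this toy model, 'trust' means the expectation clears the fallback."""
    return expected_utility_of_trusting(p, payoff_honoured, payoff_betrayed) >= outside_option
```

    A "betrayal premium" could be modelled as an extra penalty folded into the betrayed payoff; the point is only that trust becomes a graded, computable threshold condition rather than the binary trait the black-and-white view assumes.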

  15. On Undefined and Meaningless in Lambda Definability

    OpenAIRE

    de Vries, Fer-Jan

    2016-01-01

    We distinguish between undefined terms as used in lambda definability of partial recursive functions and meaningless terms as used in infinite lambda calculus for the infinitary term models that generalise the Böhm model. While there are uncountably many known sets of meaningless terms, there are four known sets of undefined terms. Two of these four are sets of meaningless terms. In this paper we first present a set of sufficient conditions for a set of lambda terms to se...

  16. How Should Energy Be Defined Throughout Schooling?

    Science.gov (United States)

    Bächtold, Manuel

    2017-02-01

    The question of how to teach energy has been renewed by recent studies focusing on the learning and teaching progressions for this concept. In this context, one question has been, for the most part, overlooked: how should energy be defined throughout schooling? This paper addresses this question in three steps. We first identify and discuss two main approaches in physics concerning the definition of energy, one claiming there is no satisfactory definition and taking conservation as a fundamental property, and the other based on Rankine's definition of energy as the capacity of a system to produce changes. We then present a study concerning how energy is actually defined throughout schooling in the case of France by analyzing national programs, physics textbooks, and the answers of teachers to a questionnaire. This study brings to light a consistency problem in the way energy is defined across school years: in primary school, an adapted version of Rankine's definition is introduced and conservation is ignored; in high school, conservation is introduced and Rankine's definition is ignored. Finally, we address this consistency problem by discussing possible teaching progressions. We argue in favor of the use of Rankine's definition throughout schooling: at primary school, it is a possible substitute to students' erroneous conceptions; at secondary school, it might help students become aware of the unifying role of energy and thereby overcome the compartmentalization problem.

  17. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Abstract Background A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e. the Gene Ontology, GO. However, functional metrics can overcome the problems in comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to comparing GO terms considered linkage within the ontology, weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of Interpro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a Metric Space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion The method proposed provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.
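The co-occurrence idea can be sketched with a simple Jaccard-style distance (an illustration only — the paper's Df may differ in detail, and the GO/Interpro identifiers below are toy examples): two GO terms are close when they annotate largely the same set of entries.

```python
def cooccurrence_distance(term_a, term_b, annotations):
    """1 minus the Jaccard index of the two terms' entry sets: 0 when the
    terms always co-occur, 1 when they never annotate the same entry."""
    a, b = annotations[term_a], annotations[term_b]
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

# Toy mapping of GO terms to the Interpro entries they annotate.
annotations = {
    "GO:kinase":    {"IPR001", "IPR002"},
    "GO:phosph":    {"IPR001", "IPR002", "IPR003"},
    "GO:transport": {"IPR009"},
}

d_related = cooccurrence_distance("GO:kinase", "GO:phosph", annotations)
d_unrelated = cooccurrence_distance("GO:kinase", "GO:transport", annotations)
```

Jaccard distance satisfies the metric-space properties (non-negativity, symmetry, triangle inequality) that the abstract requires of Df, which is why it serves as a reasonable stand-in here.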

  18. History of psychiatry

    Science.gov (United States)

    Shorter, Edward

    2013-01-01

    Purpose of review The present review examines recent contributions to the evolving field of historical writing in psychiatry. Recent findings Interest in the history of psychiatry continues to grow, with an increasing emphasis on topics of current interest such as the history of psychopharmacology, electroconvulsive therapy, and the interplay between psychiatry and society. The scope of historical writing in psychiatry as of 2007 is as broad and varied as the discipline itself. Summary More than in other medical specialties such as cardiology or nephrology, treatment and diagnosis in psychiatry are affected by trends in the surrounding culture and society. Studying the history of the discipline provides insights into possible alternatives to the current crop of patent-protected remedies and trend-driven diagnoses. PMID:18852567

  19. Life-history interviews

    DEFF Research Database (Denmark)

    Adriansen, Hanne Kirstine

    2010-01-01

    My first encounter with life history research was during my Ph.D. research. This concerned a multi-method study of nomadic mobility in Senegal. One method stood out as yielding the most interesting and in-depth data: life story interviews using a time line. I made interviews with the heads of the nomadic households, and during these I came to understand the use of mobility in a complex context of continuity and change, identity and belonging in the Fulani community. Time line interviews became one of my favourite tools in the years to follow, a tool used both for my research in various settings and in qualitative interviews. I first presented the paper at a conference on life history research at Karlstad University in November 2010. My main purpose was to establish whether a paper discussing the use of time line interviews should be placed in the context of life history research. The valuable comments...

  20. To betray art history

    Directory of Open Access Journals (Sweden)

    Jae Emerling

    2016-12-01

    Full Text Available The work of Donald Preziosi represents one of the most sustained and often brilliant attempts to betray the modern discipline of art history by exposing its skillful shell game: precisely how and why it substitutes artifice, poetry, and representational schemes for putative facticity and objectivity (that desirous and yet ever elusive Kunstwissenschaft that art historians prattle on about). This attempt is inseparable from a sinuous, witty, involutive writing style that meanders between steely insight and coy suggestions of how art history could be performed otherwise. Preziosi writes art history. In doing so he betrays its disciplinary desires. It is this event of betrayal that has made his work so exciting to some, so troubling to others.

  1. Criminal Justice History

    Directory of Open Access Journals (Sweden)

    Thomas Krause

    2005-01-01

    Full Text Available This review article discusses studies on the history of crime and the criminal law in England and Ireland published during the last few years. These reflect the ›history of crime and punishment‹ as a more or less established sub-discipline of social history, at least in England, whereas it only really began to flourish in the German-speaking world from the 1990s onwards. By contrast, the legal history of the criminal law and its procedure has a strong, recently revived academic tradition in Germany that does not really have a parallel in the British Isles, whose legal scholars still evidence their traditional reluctance to confront penal subjects.

  2. Metabolic syndrome and parental history of cardiovascular disease in young adults in urban Ghana.

    Science.gov (United States)

    Yeboah, Kwame; Dodam, Kennedy Konlan; Affrim, Patrick Kormla; Adu-Gyamfi, Linda; Bado, Anormah Rashid; Owusu Mensah, Richard N A; Adjei, Afua Bontu; Gyan, Ben

    2017-08-03

    Metabolic syndrome (MetS) in young adults poses significant cardiovascular disease (CVD) risk for later years. Parental history of CVDs is known to affect the prevalence of CVD risk in adulthood. In sub-Saharan Africa, the burden of MetS in young adults and its relationship with parental CVDs is largely unknown. We studied the gender-specific prevalence of MetS and its association with parental history of diabetes, hypertension and CVDs in young adults resident in urban Ghana. In a cross-sectional design, 364 young adults aged 20-30 years were randomly recruited from students of University of Ghana. A structured questionnaire was used to collect data on demography, lifestyle, medical and parental medical history. Anthropometric indices and blood pressures were measured. Fasting blood samples were collected to measure plasma levels of glucose, lipid profile, urea and creatinine. MetS was defined according to the Joint Scientific Statement criteria. The prevalence of MetS was 12.4%, higher in females than in male participants (18.4% vs 5.7%, p = 0.019). Female participants had higher levels of all the components of MetS than the male participants. Compared to participants with no history of parental CVDs, participants with parental CVDs had a higher proportion of abdominal obesity. A positive history of parental CVDs was associated with increased odds of MetS [OR (95% CI): 1.23 (1.12-3.04), p = 0.037]. In our study population, there is a relatively high prevalence of MetS, higher in females than in males. Parental history of CVDs was associated with MetS.
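The odds-ratio association reported above is computed from a 2x2 exposure-outcome table; a generic sketch using the Woolf method for the confidence interval (the counts below are made up for illustration, not the study's data, and the study's estimate was likely covariate-adjusted):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: MetS vs no MetS, by parental CVD history.
or_, lo, hi = odds_ratio_ci(a=20, b=80, c=10, d=90)
```

With these toy counts the odds ratio is (20*90)/(80*10) = 2.25, and the interval is symmetric around it on the log scale.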

  3. History and National Development | Oyeranmi | Journal of History ...

    African Journals Online (AJOL)

    Volumes of works have been written on the subject of the relevance of history to national development in Nigeria. To "non-historians", history teaches no particular skill, "since the primary focus of history is the past"... Does history still serve any purpose, especially in the 21st century? What are those values embedded in ...

  4. History of psychology.

    Science.gov (United States)

    Weidman, Nadine

    2016-02-01

    The editor of History of Psychology discusses her plan to vary the journal's content and expand its scope in specific ways. The first is to introduce a "Spotlight" feature, a relatively brief, provocative thought piece that might take one of several forms. Along with this new feature, she hopes further to broaden the journal's coverage and its range of contributors. She encourages submissions on the history of the psy-sciences off the beaten path. Finally, she plans to continue the journal's tradition of special issues, special sections, and essay reviews of two or more important recently published books in the field. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Our nuclear history

    International Nuclear Information System (INIS)

    Marx, G.

    1986-01-01

    The article on nuclear history is contained in a booklet on the Revised Nuffield Advanced Physics Course. The author shows how the difficult decisions about energy supplies at the end of the twentieth century can be seen as a consequence of the history and evolution of the Universe and of life, and mankind's activities on earth. The topics discussed include: the origin of the Universe, formation of light elements, formation of carbon and oxygen, supernovae and nuclear equilibrium, formation of planets, development of life on earth, mankind and the use of fuels, and the nuclear valley. (UK)

  6. Diagnosing ignition with DT reaction history

    International Nuclear Information System (INIS)

    Wilson, D. C.; Bradley, P. A.; Herrmann, H. W.; Cerjan, C. J.; Salmonson, J. D.; Spears, B. K.; Hatchet, S. P. II; Glebov, V. Yu.

    2008-01-01

    A full range DT reaction history of an ignition capsule, from 10^9 to 10^20 neutrons/ns, offers the opportunity to diagnose fuel conditions hundreds of picoseconds before and during burn. The burn history begins with a sharp rise when the first shock reaches the center of the capsule. The level of this jump reflects the combined shock strength and the adiabat of DT fuel. Changes to the four laser pulses driving the capsule implosion which are large enough to degrade the yield make measurable changes to the reaction history. Low mode asymmetries grow during convergence but change the reaction history during the final ∼100 ps. High mode asymmetry or turbulence mixing affects only the reaction history within ∼50 ps of peak burn rate. A capsule with a tritium fuel layer containing a small amount of deuterium (∼1%) creates a reaction history similar to the ignition capsule, but without the final ignition burn. A combination of gas Cerenkov detectors and the neutron temporal diagnostic could be capable of diagnosing the full history of ignition and tritium rich capsules.

  7. Precise Dimensions: A History of Units from 1791-2018

    Science.gov (United States)

    Cooper, Malcolm; Grozier, Jim

    2017-11-01

    Units are the foundation for all measurement of the natural world, the standards from which our understanding develops. This book, stemming from a conference on the history of units organised by the editors, provides a detailed and discursive examination of the history of units within physics, in advance of the proposed redefinition of the SI base units at the General Conference on Weights and Measures in 2018. It features contributions from leading researchers in metrology and history.

  8. Interpretation of Spirometry: Selection of Predicted Values and Defining Abnormality.

    Science.gov (United States)

    Chhabra, S K

    2015-01-01

    Spirometry is the most frequently performed investigation to evaluate pulmonary function. It provides clinically useful information on the mechanical properties of the lung and the thoracic cage and aids in taking management-related decisions in a wide spectrum of diseases and disorders. Few measurements in medicine are so dependent on factors related to equipment, operator and the patient. Good spirometry requires quality-assured measurements and a systematic approach to interpretation. Standard guidelines on the technical aspects of equipment and their calibration as well as the test procedure have been developed and revised from time to time. Strict compliance with standardisation guidelines ensures quality control. Interpretation of spirometry data is based only on two basic measurements--the forced vital capacity (FVC) and the forced expiratory volume in 1 second (FEV1)--and their ratio, FEV1/FVC. A meaningful and clinically useful interpretation of the measured data requires a systematic approach and consideration of several important issues. Central to interpretation is the understanding of the development and application of prediction equations. Selection of prediction equations that are appropriate for the ethnic origin of the patient is vital to avoid erroneous interpretation. Defining abnormal values is a debatable but critical aspect of spirometry. A statistically valid definition of the lower limits of normal has been advocated as the better method over the more commonly used approach of defining abnormality as a fixed percentage of the predicted value. Spirometry rarely provides a specific diagnosis. Examination of the flow-volume curve and the measured data provides information to define patterns of ventilatory impairment. Spirometry must be interpreted in conjunction with clinical information including results of other investigations.
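The two competing ways of defining abnormality that the author contrasts can be sketched numerically (the predicted value and residual standard deviation below are hypothetical; real prediction equations depend on age, sex, height, and ethnicity):

```python
def percent_predicted(measured, predicted):
    """Measured value as a percentage of predicted; the 'fixed percentage'
    approach flags values below e.g. 80% of predicted as abnormal."""
    return 100.0 * measured / predicted

def lower_limit_of_normal(predicted, rsd, z=1.645):
    """Statistically defined LLN: the 5th percentile of the reference
    population, i.e. predicted minus 1.645 residual standard deviations,
    assuming normally distributed residuals."""
    return predicted - z * rsd

# Hypothetical patient: FEV1 measured at 3.2 L against a predicted 4.0 L,
# with a residual SD of 0.5 L in the reference equation.
pct = percent_predicted(3.2, 4.0)      # 80% of predicted
lln = lower_limit_of_normal(4.0, 0.5)  # 3.1775 L
```

Note that the two criteria can disagree: here the measured 3.2 L sits exactly at the 80%-of-predicted cut-off yet remains above the statistical LLN, which is precisely the kind of discordance behind the debate the abstract mentions.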

  9. Family History Is a Risk Factor for COPD

    Science.gov (United States)

    Hokanson, John E.; Lynch, David A.; Washko, George R.; Make, Barry J.; Crapo, James D.; Silverman, Edwin K.

    2011-01-01

    Background: Studies have shown that family history is a risk factor for COPD, but have not accounted for family history of smoking. Therefore, we sought to identify the effects of family history of smoking and family history of COPD on COPD susceptibility. Methods: We compared 821 patients with COPD to 776 control smokers from the Genetic Epidemiology of COPD (COPDGene) Study. Questionnaires captured parental histories of smoking and COPD, as well as childhood environmental tobacco smoke (ETS) exposure. Socioeconomic status was defined by educational achievement. Results: Parental history of smoking (85.5% case patients, 82.9% control subjects) was more common than parental history of COPD (43.0% case patients, 30.8% control subjects). In a logistic regression model, parental history of COPD (OR, 1.73; P < .0001) and educational level (OR, 0.48 for some college vs no college; P < .0001) were significant predictors of COPD, but parental history of smoking and childhood ETS exposure were not significant. The population-attributable risk from COPD family history was 18.6%. Patients with COPD with a parental history had more severe disease, with lower lung function, worse quality of life, and more frequent exacerbations. There were nonsignificant trends for more severe emphysema and airway disease on quantitative chest CT scans. Conclusions: Family history of COPD is a strong risk factor for COPD, independent of family history of smoking, personal lifetime smoking, or childhood ETS exposure. Although further studies are required to identify genetic variants that influence COPD susceptibility, clinicians should question all smokers, especially those with known or suspected COPD, regarding COPD family history. PMID:21310839
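The population-attributable risk quoted above follows from Levin's formula; a sketch using the abstract's control-group exposure prevalence (30.8%) and treating the odds ratio of 1.73 as an approximation of the relative risk (the study's exact adjusted calculation may differ):

```python
def population_attributable_risk(p_exposed, relative_risk):
    """Levin's formula: the fraction of cases in the population
    attributable to the exposure, PAR = p(RR-1) / (1 + p(RR-1))."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Exposure = parental history of COPD; prevalence taken from controls.
par = population_attributable_risk(p_exposed=0.308, relative_risk=1.73)
```

These inputs give a PAR of roughly 18.4%, in line with the 18.6% reported in the abstract.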

  10. A Church History of Denmark

    DEFF Research Database (Denmark)

    Lausten, Martin Schwarz

    A Church History of Denmark, from the missionary period through the Middle Ages, the Lutheran Reformation, Orthodoxy, Pietism, and the Enlightenment, to the history of the 19th and 20th centuries.

  11. How do pediatric anesthesiologists define intraoperative hypotension?

    Science.gov (United States)

    Nafiu, Olubukola O; Voepel-Lewis, Terri; Morris, Michelle; Chimbira, Wilson T; Malviya, Shobha; Reynolds, Paul I; Tremper, Kevin K

    2009-11-01

    Although blood pressure (BP) monitoring is a recommended standard of care by the ASA, and pediatric anesthesiologists routinely monitor the BP of their patients and, when appropriate, treat deviations from 'normal', there is no robust definition of hypotension in any of the pediatric anesthesia texts or journals. Consequently, what constitutes hypotension in pediatric anesthesia is currently unknown. We designed a questionnaire-based survey of pediatric anesthesiologists to determine the BP ranges and thresholds used to define intraoperative hypotension (IOH). Members of the Society of Pediatric Anesthesia (SPA) and the Association of Paediatric Anaesthetists (APA) of Great Britain and Ireland were contacted through e-mail to participate in this survey. We asked a few demographic questions and five questions about specific definitions of hypotension for different age groups of patients undergoing inguinal herniorraphy, a common pediatric surgical procedure. The overall response rate was 56% (483/860), of which 76% were SPA members. The majority of respondents (72%) work in academic institutions, while 8.9% work in institutions with an annual pediatric surgical caseload of fewer than 1000. About 76% of respondents indicated that a 20-30% reduction in baseline systolic blood pressure (SBP) indicates significant hypotension in children under anesthesia. Most respondents (86.7%) indicated that they use mean arterial pressure or SBP (72%) to define IOH. The mean SBP values for hypotension quoted by SPA members were about 5-7% lower across all pediatric age groups compared to values quoted by APA members (P = 0.001 for all age groups). There is great variability in the BP parameters and thresholds used for defining and treating IOH among pediatric anesthesiologists. The majority of respondents considered a 20-30% reduction from baseline in SBP as indicative of significant hypotension. Lack of a consensus definition for a common clinical condition like IOH could have
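The survey's most common working definition, a relative drop from baseline systolic pressure, is easy to state as a rule (a sketch only; the 20% default below is one point in the 20-30% range respondents quoted, not a consensus standard):

```python
def sbp_drop_percent(baseline_sbp, current_sbp):
    """Percentage fall in systolic blood pressure from baseline."""
    return 100.0 * (baseline_sbp - current_sbp) / baseline_sbp

def is_hypotensive(baseline_sbp, current_sbp, threshold_pct=20.0):
    """Flags intraoperative hypotension as a relative SBP drop at or
    beyond the chosen threshold; the threshold itself is a clinical
    judgment, which is exactly the variability the survey documents."""
    return sbp_drop_percent(baseline_sbp, current_sbp) >= threshold_pct

# Baseline SBP 100 mmHg: 75 mmHg is a 25% drop (flagged at the 20% rule),
# while 85 mmHg is a 15% drop (not flagged).
```

Shifting the threshold from 20% to 30% changes which readings count as hypotension, which is the practical consequence of the missing consensus definition.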

  12. Using greenhouse gas fluxes to define soil functional types

    Energy Technology Data Exchange (ETDEWEB)

    Petrakis, Sandra; Barba, Josep; Bond-Lamberty, Ben; Vargas, Rodrigo

    2017-12-04

    Soils provide key ecosystem services and directly control ecosystem functions; thus, there is a need to define the reference state of soil functionality. Most common functional classifications of ecosystems are vegetation-centered and neglect soil characteristics and processes. We propose Soil Functional Types (SFTs) as a conceptual approach to represent and describe the functionality of soils based on characteristics of their greenhouse gas (GHG) flux dynamics. We used automated measurements of CO2, CH4 and N2O in a forested area to define SFTs following a simple statistical framework. This study supports the hypothesis that SFTs provide additional insights on the spatial variability of soil functionality beyond information represented by commonly measured soil parameters (e.g., soil moisture, soil temperature, litter biomass). We discuss the implications of this framework at the plot-scale and the potential of this approach at larger scales. This approach is a first step to provide a framework to define SFTs, but a community effort is necessary to harmonize any global classification for soil functionality. A global application of the proposed SFT framework will only be possible if there is a community-wide effort to share data and create a global database of GHG emissions from soils.
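One way the "simple statistical framework" could look (the abstract does not give the method's details, so the flux values, units, and the choice of plain k-means below are assumptions): cluster the plots by their characteristic greenhouse gas fluxes and treat each cluster as a candidate SFT.

```python
def kmeans(points, k, iters=50):
    """Tiny k-means (Lloyd's algorithm) with naive initialization from
    the first k points -- adequate for a well-separated toy example."""
    centroids = [tuple(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        centroids = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                     else centroids[j] for j, cl in enumerate(clusters)]
    return centroids, clusters

# Hypothetical plot-level mean fluxes: (CO2, CH4) in arbitrary units.
fluxes = [(2.0, 0.10), (6.0, 1.00), (2.2, 0.12),
          (1.9, 0.09), (6.3, 1.10), (5.8, 0.95)]
centroids, clusters = kmeans(fluxes, k=2)
# The two clusters separate low-flux from high-flux plots: two candidate SFTs.
```

In a real analysis the features would be richer (e.g. flux means, variances, and temporal dynamics for all three gases), but the grouping step is the same: soils that behave alike functionally fall into the same type regardless of vegetation cover.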

  13. Making Invisible Histories Visible

    Science.gov (United States)

    Hanssen, Ana Maria

    2012-01-01

    This article features Omaha Public Schools' "Making Invisible Histories Visible" program, or MIHV. Omaha's schools have a low failure rate among 8th graders but a high one among high school freshmen. MIHV was created to help at-risk students "adjust to the increased demands of high school." By working alongside teachers and…

  14. History and Advancements

    Indian Academy of Sciences (India)

    Colour: History and Advancements. Vinod R Kanetkar. General Article, Resonance – Journal of Science Education, Volume 15, Issue 9, September 2010, pp 794-803. Permanent link: https://www.ias.ac.in/article/fulltext/reso/015/09/0794-0803

  15. Nanostructures-History

    Indian Academy of Sciences (India)

    Inspiration to Nanotechnology: the Japanese scientist Norio Taniguchi of the Tokyo University of Science used the term "nano-technology" at a 1974 conference to describe semiconductor processes such as thin-film processing. His definition was, ...

  16. Storytelling and History.

    Science.gov (United States)

    Henegar, Steven

    1998-01-01

    Draws a connection between the techniques of storytelling and the content knowledge of history. Notes the many fables, tall tales, and legends that have historical incidents as their inspiration. Outlines some specific functions and steps of a story and provides an exercise for students or teachers to develop their own stories. (MJP)

  17. Natural history of COPD

    DEFF Research Database (Denmark)

    Vestbo, Jørgen; Lange, Peter

    2016-01-01

    The natural history of chronic obstructive pulmonary disease (COPD) is usually described with a focus on change in forced expiratory volume in 1 s (FEV1 ) over time as this allows for exploration of risk factors for an accelerated decline-and thus of developing COPD. From epidemiological studies we...

  18. American History (an introduction)

    DEFF Research Database (Denmark)

    Nye, David Edwin

    In easily understood English, Professor David Nye gives a captivating presentation of American history from the early colonial period to President Obama. The book offers a comprehensive portrait of the periods and includes, for each period, a concise presentation of its cultural and literary history.

  19. Didactics of History

    DEFF Research Database (Denmark)

    Haue, Harry

    The book consists of five chapters about formation and education in Denmark over the last two centuries. The development of history teaching is especially stressed. The guiding concept of upper secondary education has since 1850 been 'general character formation'. The book is an edited...

  20. History in the Flesh

    DEFF Research Database (Denmark)

    Bencard, Adam

    driven by a historicization, a will to place history where before there was biology. This dissertation examines this interest in the body through an analysis of what I call the historicized body as a discursive figure. The historicized body is not a clearly delineated concept or a sharply demarcated...