WorldWideScience

Sample records for invalid lnt model

  1. Radiation, Ecology and the Invalid LNT Model: The Evolutionary Imperative

    OpenAIRE

    Parsons, Peter A.

    2006-01-01

Metabolic and energetic efficiency, and hence fitness of organisms to survive, should be maximal in their habitats. This tenet of evolutionary biology invalidates the linear-no-threshold (LNT) model for the risk consequences of environmental agents. Hormesis in response to selection for maximum metabolic and energetic efficiency, or minimum metabolic imbalance, to adapt to a stressed world dominated by oxidative stress should therefore be universal. Radiation hormetic zones extending substanti...

  2. Radiation, ecology and the invalid LNT model: the evolutionary imperative.

    Science.gov (United States)

    Parsons, Peter A

    2006-09-27

Metabolic and energetic efficiency, and hence fitness of organisms to survive, should be maximal in their habitats. This tenet of evolutionary biology invalidates the linear-no-threshold (LNT) model for the risk consequences of environmental agents. Hormesis in response to selection for maximum metabolic and energetic efficiency, or minimum metabolic imbalance, to adapt to a stressed world dominated by oxidative stress should therefore be universal. Radiation hormetic zones extending substantially beyond common background levels can be explained by metabolic interactions among multiple abiotic stresses. Demographic and experimental data are mainly in accord with this expectation. Therefore, non-linearity becomes the primary model for assessing risks from low-dose ionizing radiation. This is the evolutionary imperative upon which risk assessment for radiation should be based.

  3. Linear non-threshold (LNT) radiation hazards model and its evaluation

    International Nuclear Information System (INIS)

    Min Rui

    2011-01-01

In order to introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate its application, a comprehensive analysis of the literature was made. The results show that the LNT model describes biological effects more accurately at high doses than at low doses. The repairable-conditionally repairable model of cell radiation effects accounts well for cell survival curves across the high, medium and low absorbed dose ranges. Many uncertainties remain in the assessment model of effective dose from internal radiation based on the LNT assumption and individual mean organ equivalent dose, and it is necessary to establish gender-specific voxel human models that take gender differences into account. In summary, the advantages and disadvantages of the various models coexist. Until a new theory and model are established, retaining the LNT model remains the most scientific attitude. (author)
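The contrast between the LNT and threshold dose-response models discussed throughout these records can be sketched in a few lines of Python. This is illustrative only; the slope and threshold values below are hypothetical placeholders, not estimates from any of the cited studies:

```python
def lnt_excess_risk(dose_sv, err_per_sv=0.5):
    """LNT model: excess relative risk is proportional to dose, with no threshold.

    err_per_sv is a hypothetical excess-relative-risk slope per sievert.
    """
    return err_per_sv * dose_sv

def threshold_excess_risk(dose_sv, threshold_sv=0.1, err_per_sv=0.5):
    """Threshold model: zero excess risk below the threshold dose."""
    if dose_sv <= threshold_sv:
        return 0.0
    return err_per_sv * (dose_sv - threshold_sv)
```

The practical dispute in these records reduces to the low-dose region: for a small dose such as 0.01 Sv, the LNT model assigns a small but nonzero excess risk, while the threshold model assigns exactly zero.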

  4. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2017-01-01

This paper reveals that nearly 25 years after the US National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. -- Highlights: • The BEAR I Genetics Panel made an error in denying a dose-rate effect for mutation. • The BEIR I Genetics Subcommittee attempted to correct this dose-rate error. • The control group used for risk assessment by BEIR I is now known to be in error. • Correcting this error contradicts the LNT, supporting a threshold model.

  5. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu [Department of Environmental Health Sciences, School of Public Health and Health Sciences, Morrill I, N344, University of Massachusetts, Amherst, MA 01003 (United States)

    2017-04-15

This paper reveals that nearly 25 years after the US National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. -- Highlights: • The BEAR I Genetics Panel made an error in denying a dose-rate effect for mutation. • The BEIR I Genetics Subcommittee attempted to correct this dose-rate error. • The control group used for risk assessment by BEIR I is now known to be in error. • Correcting this error contradicts the LNT, supporting a threshold model.

  6. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT.

    Science.gov (United States)

    Calabrese, Edward J

    2017-04-01

    This paper reveals that nearly 25 years after the National Academy of Sciences (NAS), Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, which was a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Model Uncertainty via the Integration of Hormesis and LNT as the Default in Cancer Risk Assessment.

    Science.gov (United States)

    Calabrese, Edward J

    2015-01-01

On June 23, 2015, the US Nuclear Regulatory Commission (NRC) issued a formal notice in the Federal Register that it would consider whether "it should amend its 'Standards for Protection Against Radiation' regulations from the linear non-threshold (LNT) model of radiation protection to the hormesis model." The present commentary supports this recommendation based on (1) the flawed and deceptive history of the adoption of LNT by the US National Academy of Sciences (NAS) in 1956; (2) the documented capacity of hormesis to make more accurate predictions of biological responses for diverse biological end points in the low-dose zone; (3) the occurrence of extensive hormetic data from the peer-reviewed biomedical literature that revealed hormetic responses are highly generalizable, being independent of biological model, end point measured, inducing agent, level of biological organization, and mechanism; and (4) the integration of hormesis and LNT models via a model uncertainty methodology that optimizes public health responses at 10⁻⁴. Thus, both LNT and hormesis can be integratively used for risk assessment purposes, and this integration defines the so-called "regulatory sweet spot."

  8. The linear nonthreshold (LNT) model as used in radiation protection: an NCRP update.

    Science.gov (United States)

    Boice, John D

    2017-10-01

The linear nonthreshold (LNT) model has been used in radiation protection for over 40 years and has been hotly debated. It relies heavily on human epidemiology, with support from radiobiology. The scientific underpinnings include NCRP Report No. 136 ('Evaluation of the Linear-Nonthreshold Dose-Response Model for Ionizing Radiation'), UNSCEAR 2000, ICRP Publication 99 (2004) and the National Academies BEIR VII Report (2006). NCRP Scientific Committee 1-25 is reviewing recent epidemiologic studies focusing on dose-response models, including threshold, and the relevance to radiation protection. Recent studies after the BEIR VII Report are being critically reviewed and include atomic-bomb survivors, Mayak workers, atomic veterans, populations on the Techa River, U.S. radiological technologists, the U.S. Million Person Study, international workers (INWORKS), Chernobyl cleanup workers, children given computerized tomography scans, and tuberculosis-fluoroscopy patients. Methodologic limitations, dose uncertainties and statistical approaches (and modeling assumptions) are being systematically evaluated. The review of studies continues and will be published as an NCRP commentary in 2017. Most studies reviewed to date are consistent with a straight-line dose response, but there are a few exceptions. In the past, the scientific consensus process has worked in providing practical and prudent guidance, so pragmatic judgment is anticipated. The evaluations are ongoing and the extensive NCRP review process has just begun, so no decisions or recommendations are set in stone. The march of science requires a constant assessment of emerging evidence to provide an optimum, though not necessarily perfect, approach to radiation protection. Alternatives to the LNT model may be forthcoming, e.g. an approach that couples the best epidemiology with biologically-based models of carcinogenesis, focusing on chronic (not acute) exposure circumstances. Currently for the practical purposes of

  9. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 1. The Russell-Muller debate

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu

    2017-04-15

This paper assesses the discovery of the dose-rate effect in radiation genetics and how it challenged fundamental tenets of the linear non-threshold (LNT) dose response model, including the assumptions that all mutational damage is cumulative and irreversible and that the dose-response is linear at low doses. Newly uncovered historical information also describes how a key 1964 report by the International Commission for Radiological Protection (ICRP) addressed the effects of dose rate in the assessment of genetic risk. This unique story involves assessments by two leading radiation geneticists, Hermann J. Muller and William L. Russell, who independently argued that the report's Genetic Summary Section on dose rate was incorrect while simultaneously offering vastly different views as to what the report's summary should have contained. This paper reveals occurrences of scientific disagreements, how conflicts were resolved, which view(s) prevailed and why. During this process the Nobel Laureate, Muller, provided incorrect information to the ICRP in what appears to have been an attempt to manipulate the decision-making process and to prevent the dose-rate concept from being adopted into risk assessment practices. - Highlights: • The discovery of the radiation dose-rate effect challenged the scientific basis of LNT. • Radiation dose-rate effects occurred in males and females. • The dose-rate concept supported a threshold dose-response for radiation.

  10. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 1. The Russell-Muller debate

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2017-01-01

This paper assesses the discovery of the dose-rate effect in radiation genetics and how it challenged fundamental tenets of the linear non-threshold (LNT) dose response model, including the assumptions that all mutational damage is cumulative and irreversible and that the dose-response is linear at low doses. Newly uncovered historical information also describes how a key 1964 report by the International Commission for Radiological Protection (ICRP) addressed the effects of dose rate in the assessment of genetic risk. This unique story involves assessments by two leading radiation geneticists, Hermann J. Muller and William L. Russell, who independently argued that the report's Genetic Summary Section on dose rate was incorrect while simultaneously offering vastly different views as to what the report's summary should have contained. This paper reveals occurrences of scientific disagreements, how conflicts were resolved, which view(s) prevailed and why. During this process the Nobel Laureate, Muller, provided incorrect information to the ICRP in what appears to have been an attempt to manipulate the decision-making process and to prevent the dose-rate concept from being adopted into risk assessment practices. - Highlights: • The discovery of the radiation dose-rate effect challenged the scientific basis of LNT. • Radiation dose-rate effects occurred in males and females. • The dose-rate concept supported a threshold dose-response for radiation.

  11. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/
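The invalidation idea that ADMIT automates can be illustrated with a minimal sketch (this is not the toolbox's actual API, and the linear model family and interval data are hypothetical): interval measurements constrain the feasible parameter set of a bounded model family, and an empty feasible set is a guaranteed certificate that the model hypothesis is invalid.

```python
def invalidate_linear_model(data, a_lo, a_hi):
    """Guaranteed invalidation for the model family y = a*x with a in [a_lo, a_hi].

    data: list of (x, y_lo, y_hi) interval measurements, x > 0.
    Returns (invalidated, feasible_interval). An empty feasible set means no
    admissible parameter can explain the data: a certificate of invalidity.
    """
    lo, hi = a_lo, a_hi
    for x, y_lo, y_hi in data:
        # Each interval measurement constrains a to [y_lo/x, y_hi/x];
        # intersect all constraints with the a-priori parameter bounds.
        lo = max(lo, y_lo / x)
        hi = min(hi, y_hi / x)
        if lo > hi:
            return True, None  # empty feasible set: model invalidated
    return False, (lo, hi)
```

For nonlinear dynamic networks the feasible set cannot be intersected this directly, which is why ADMIT reformulates the problem as a constraint satisfaction problem and applies convex relaxations; the guarantee, however, has the same character as in this toy case.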

  12. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MATLAB™-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270

  13. Whither LNT?

    International Nuclear Information System (INIS)

    Higson, D.J.

    2015-01-01

UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  14. Whither LNT?

    International Nuclear Information System (INIS)

    Higson, D.J.

    2014-01-01

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  15. Whither LNT?

    Energy Technology Data Exchange (ETDEWEB)

    Higson, D.J.

    2015-03-15

UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  16. Whither LNT?

    Energy Technology Data Exchange (ETDEWEB)

    Higson, D.J. [Australian Nuclear Association, Paddington, NSW (Australia)

    2014-07-01

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  17. Combining Generated Data Models with Formal Invalidation for Insider Threat Analysis

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2014-01-01

In this paper we revisit the advances made on invalidation policies to explore attack possibilities in organizational models. One aspect that has so far eluded systematic analysis of insider threat is the integration of data into attack scenarios and its exploitation for analyzing the models. We draw from recent insights into the generation of insider data to complement a logic-based mechanical approach. We show how insider analysis can be traced back to the early days of security verification and the Lowe attack on NSPK. The invalidation of policies allows modelchecking organizational structures to detect insider attacks. Integration of higher-order logic specification techniques allows the use of data refinement to explore attack possibilities beyond the initial system specification. We illustrate this combined invalidation technique on the classical example of the naughty lottery fairy. Data…

  18. Observations on the Chernobyl Disaster and LNT.

    Science.gov (United States)

    Jaworowski, Zbigniew

    2010-01-28

The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold (LNT) hypothesis. It was a historic event that provided invaluable lessons for the nuclear industry and risk philosophy. One of them is the demonstration that, counted per unit of electricity produced, early Chernobyl fatalities amounted to 0.86 deaths/GWe-year, 47 times lower than from hydroelectric stations (approximately 40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit of solid cancer incidence among the population of the most contaminated areas.

  19. Observations on the Chernobyl Disaster and LNT

    Science.gov (United States)

    Jaworowski, Zbigniew

    2010-01-01

The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold (LNT) hypothesis. It was a historic event that provided invaluable lessons for the nuclear industry and risk philosophy. One of them is the demonstration that, counted per unit of electricity produced, early Chernobyl fatalities amounted to 0.86 deaths/GWe-year, 47 times lower than from hydroelectric stations (∼40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit of solid cancer incidence among the population of the most contaminated areas. PMID:20585443

  20. Univariate time series modeling and an application to future claims amount in SOCSO's invalidity pension scheme

    Science.gov (United States)

    Chek, Mohd Zaki Awang; Ahmad, Abu Bakar; Ridzwan, Ahmad Nur Azam Ahmad; Jelas, Imran Md.; Jamal, Nur Faezah; Ismail, Isma Liana; Zulkifli, Faiz; Noor, Syamsul Ikram Mohd

    2012-09-01

The main objective of this study is to forecast the future claims amount of the Invalidity Pension Scheme (IPS). All data were derived from SOCSO annual reports for the years 1972-2010. The data comprise claim amounts from the seven benefits offered by SOCSO: Invalidity Pension, Invalidity Grant, Survivors Pension, Constant Attendance Allowance, Rehabilitation, Funeral and Education. Univariate forecasting models are used to predict future claims of the Invalidity Pension Scheme among the workforce in Malaysia.
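As a minimal sketch of univariate forecasting of the kind this record describes (the smoothing constant and claim figures below are hypothetical; the study does not specify which model family it selected), simple exponential smoothing produces a one-step-ahead forecast from a single claims series:

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: one-step-ahead forecast.

    The smoothed level is updated as level = alpha*y + (1 - alpha)*level,
    and the final level is the forecast for the next period.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical annual claims amounts (in millions), most recent last.
claims = [12.0, 13.5, 14.1, 15.8, 16.2, 17.9]
next_year = ses_forecast(claims, alpha=0.4)
```

With alpha = 1 the forecast is just the last observation (a naive forecast); smaller alpha gives more weight to the series' history, which is the usual bias-variance trade-off in choosing the smoothing constant.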

  1. Response to, "On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith.".

    Science.gov (United States)

    Beyea, Jan

    2016-07-01

It is not true that successive groups of researchers from academia and research institutions-scientists who served on panels of the US National Academy of Sciences (NAS)-were duped into supporting a linear no-threshold model (LNT) by the opinions expressed in the genetic panel section of the 1956 "BEAR I" report. Successor reports had their own views of the LNT model, relying on mouse and human data, not fruit fly data. Nor was the 1956 report biased and corrupted, as has been charged in an article by Edward J. Calabrese in this journal. With or without BEAR I, the LNT model would likely have been accepted in the US for radiation protection purposes in the 1950s. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Invalidating Policies using Structural Information

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2013-01-01

…by invalidating policies using structural information of the organisational model. Based on this structural information and a description of the organisation's policies, our approach invalidates the policies and identifies exemplary sequences of actions that lead to a violation of the policy in question. Based on these examples, the organisation can identify real attack vectors that might result in an insider attack. This information can be used to refine access control systems or policies.

  3. LNT-an apparent rather than a real controversy?

    Energy Technology Data Exchange (ETDEWEB)

    Charles, M W [School of Physics and Astronomy, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom)

    2006-09-15

Can the carcinogenic risks of radiation that are observed at high doses be extrapolated to low doses? This question has been debated through the whole professional life of the author, now nearing four decades. In its extreme form the question relates to a particular hypothesis (LNT) used widely by the international community for radiological protection applications. The linear no-threshold (LNT) hypothesis propounds that the extrapolation is linear and that it extends down to zero dose. The debate on the validity of LNT has increased dramatically in recent years. This is in no small part due to concern that exaggerated risks at low doses lead to undue amounts of societal resources being used to reduce man-made human exposure, and because of the related growing public aversion to diagnostic and therapeutic medical exposures. The debate appears to be entering a new phase. There is a growing realisation of the limitations of fundamental data and of the scientific approach to addressing this question at low doses. There also appears to be an increasing awareness that the assumptions necessary for a workable and acceptable system of radiological protection at low doses must necessarily be based on considerable pragmatism. Recent developments are reviewed and a historical perspective is given on the general nature of controversies in radiation protection over the years. All the protagonists in the debate will at the end of the day probably be able to claim that they were right! (opinion)

  4. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  5. Invalidating Policies using Structural Information

    DEFF Research Database (Denmark)

    Kammuller, Florian; Probst, Christian W.

    2014-01-01

…by invalidating policies using structural information of the organisational model. Based on this structural information and a description of the organisation's policies, our approach invalidates the policies and identifies exemplary sequences of actions that lead to a violation of the policy in question. Based on these examples, the organisation can identify real attack vectors that might result in an insider attack. This information can be used to refine access control systems or policies. We provide case studies showing how mechanical verification tools, i.e. modelchecking with MCMAS and interactive theorem proving…

  6. Racial identity invalidation with multiracial individuals: An instrument development study.

    Science.gov (United States)

    Franco, Marisa G; O'Brien, Karen M

    2018-01-01

    Racial identity invalidation, others' denial of an individual's racial identity, is a salient racial stressor with harmful effects on the mental health and well-being of Multiracial individuals. The purpose of this study was to create a psychometrically sound measure to assess racial identity invalidation for use with Multiracial individuals (N = 497). The present sample was mostly female (75%) with a mean age of 26.52 years (SD = 9.60). The most common racial backgrounds represented were Asian/White (33.4%) and Black/White (23.7%). Participants completed several online measures via Qualtrics. Exploratory factor analyses revealed 3 racial identity invalidation factors: behavior invalidation, phenotype invalidation, and identity incongruent discrimination. A confirmatory factor analysis provided support for the initial factor structure. Alternative model testing indicated that the bifactor model was superior to the 3-factor model. Thus, a total score and/or 3 subscale scores can be used when administering this instrument. Support was found for the reliability and validity of the total scale and subscales. In line with the minority stress theory, challenges with racial identity mediated relationships between racial identity invalidation and mental health and well-being outcomes. The findings highlight the different dimensions of racial identity invalidation and indicate their negative associations with connectedness and psychological well-being. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Growth of non-toxigenic Clostridium botulinum mutant LNT01 in cooked beef: One-step kinetic analysis and comparison with C. sporogenes and C. perfringens.

    Science.gov (United States)

    Huang, Lihan

    2018-05-01

The objective of this study was to investigate the growth kinetics of Clostridium botulinum LNT01, a non-toxigenic mutant of C. botulinum 62A, in cooked ground beef. The spores of C. botulinum LNT01 were inoculated into ground beef and incubated anaerobically under different temperature conditions to observe growth and develop growth curves. A one-step kinetic analysis method was used to analyze the growth curves simultaneously to minimize the global residual error. The data analysis was performed using the USDA IPMP-Global Fit, with the Huang model as the primary model and the cardinal parameters model as the secondary model. The results of data analysis showed that the minimum, optimum, and maximum growth temperatures of this mutant are 11.5, 36.4, and 44.3 °C, and the estimated optimum specific growth rate is 0.633 ln CFU/g per h, or 0.275 log CFU/g per h. The maximum cell density is 7.84 log CFU/g. The models and kinetic parameters were validated using additional isothermal and dynamic growth curves. The resulting residual errors of validation followed a Laplace distribution, with about 60% of the residual errors within ±0.5 log CFU/g of experimental observations, suggesting that the models could predict the growth of C. botulinum LNT01 in ground beef with reasonable accuracy. Compared with C. perfringens, C. botulinum LNT01 grows at much slower rates and with much longer lag times. Its growth kinetics are also very similar to those of C. sporogenes in ground beef. The results of computer simulation using kinetic models showed that, while prolific growth of C. perfringens may occur in ground beef during cooling, no growth of C. botulinum LNT01 or C. sporogenes would occur under the same cooling conditions. The models developed in this study may be used for prediction of the growth and risk assessments of proteolytic C. botulinum in cooked meats. Published by Elsevier Ltd.
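The cardinal parameters secondary model in this record can be sketched with the reported estimates (T_min = 11.5 °C, T_opt = 36.4 °C, T_max = 44.3 °C, µ_opt = 0.633 ln CFU/g per h). The Rosso-type formula below is a common form of the cardinal temperature model and is assumed here for illustration; the paper's exact equation may differ:

```python
def cardinal_growth_rate(T, t_min=11.5, t_opt=36.4, t_max=44.3, mu_opt=0.633):
    """Rosso-type cardinal temperature model for specific growth rate.

    Returns the specific growth rate (ln CFU/g per h) at temperature T (°C);
    zero outside the (t_min, t_max) growth range. Parameter defaults are the
    estimates reported for C. botulinum LNT01 in cooked ground beef.
    """
    if T <= t_min or T >= t_max:
        return 0.0
    num = (T - t_max) * (T - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (T - t_opt)
                             - (t_opt - t_max) * (t_opt + t_min - 2 * T))
    return mu_opt * num / den
```

By construction the rate equals µ_opt at T_opt and falls to zero at the minimum and maximum growth temperatures, which is what makes the three cardinal temperatures directly interpretable parameters rather than regression coefficients.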

  8. The Perceived Invalidation of Emotion Scale (PIES): Development and psychometric properties of a novel measure of current emotion invalidation.

    Science.gov (United States)

    Zielinski, Melissa J; Veilleux, Jennifer C

    2018-05-24

    Emotion invalidation is theoretically and empirically associated with mental and physical health problems. However, existing measures of invalidation focus on past (e.g., childhood) invalidation and/or do not specifically emphasize invalidation of emotion. In this article, the authors articulate a clarified operational definition of emotion invalidation and use that definition as the foundation for development of a new measure of current perceived emotion invalidation across a series of five studies. Study 1 was a qualitative investigation of people's experiences with emotional invalidation from which we generated items. An initial item pool was vetted by expert reviewers in Study 2 and examined via exploratory factor analysis in Study 3 within both college student and online samples. The scale was reduced to 10 items via confirmatory factor analysis in Study 4, resulting in a brief but psychometrically promising measure, the Perceived Invalidation of Emotion Scale (PIES). A short-term longitudinal investigation (Study 5) revealed that PIES scores had strong test-retest reliability, and that greater perceived emotion invalidation was associated with greater emotion dysregulation, borderline features and symptoms of emotional distress. In addition, the PIES predicted changes in relational health and psychological health over a 1-month period. The current set of studies thus presents a psychometrically promising and practical measure of perceived emotion invalidation that can provide a foundation for future research in this burgeoning area. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. The Use of Lexical Neighborhood Test (LNT) in the Assessment of Speech Recognition Performance of Cochlear Implantees with Normal and Malformed Cochlea.

    Science.gov (United States)

    Kant, Anjali R; Banik, Arun A

    2017-09-01

    The present study aims to use the model-based Lexical Neighborhood Test (LNT) to assess speech recognition performance in early and late implanted hearing impaired children with normal and malformed cochleae. The LNT was administered to 46 children with congenital (prelingual) bilateral severe-profound sensorineural hearing loss, using the Nucleus 24 cochlear implant. The children were grouped into Group 1 (early implantees with normal cochlea, EI); n = 15, 3½-6½ years of age; mean age at implantation, 3½ years. Group 2 (late implantees with normal cochlea, LI); n = 15, 6-12 years of age; mean age at implantation, 5 years. Group 3 (early implantees with malformed cochlea, EIMC); n = 9, 4.9-10.6 years of age; mean age at implantation, 3.10 years. Group 4 (late implantees with malformed cochlea, LIMC); n = 7, 7-12.6 years of age; mean age at implantation, 6.3 years. The following were the malformations: dysplastic cochlea, common cavity, Mondini's, incomplete partition-1 and 2 (IP-1 and 2), and enlarged IAC. The children were instructed to repeat the words on hearing them. Means of the word and phoneme scores were computed. The LNT can also be used to assess the speech recognition performance of hearing impaired children with malformed cochleae. When both the easy and hard lists of the LNT are considered, late implantees (with or without a normal cochlea) achieved higher word scores than early implantees, but the differences are not statistically significant. Using the LNT for assessing speech recognition enables a quantitative as well as descriptive report of phonological processes used by the children.

  10. Invalidity of contract: legislative regulation and types

    Directory of Open Access Journals (Sweden)

    Василь Іванович Крат

    2017-09-01

    The invalidity of contracts has always attracted researchers' attention. Nevertheless, many contemporary problems related to contract invalidity still require both doctrinal and practical analysis. This article examines the legislative regulation and types of contract invalidity through the prism of judicial practice. In the Civil Code of Ukraine, voidability of a contract is the general rule. Voidability is embodied in so-called «virtual» invalidity, in which only the most typical grounds are enumerated. Even this approach, however, cannot cover all the cases that arise in practice, such as contracts concluded by a debtor for the purpose of defeating the enforcement of a claim against the debtor's property; general rules on the voidability of a debtor's contracts should therefore be established. Nullity of a contract, by contrast, arises only where the law directly qualifies a given contract as null; nullity in the Civil Code of Ukraine is thus constructed by means of «textual» invalidity. Attempts in judicial practice to declare contracts null by way of «virtual» invalidity, in the absence of a direct statutory provision, are impermissible. Finally, it is methodologically incorrect to equate the invalidity of a contract with the invalidity of an obligation in order to justify applying norms with different content.

  11. Consequences and detection of invalid exogeneity conditions

    NARCIS (Netherlands)

    Niemczyk, J.

    2009-01-01

    Estimators for econometric relationships require observations on at least as many exogenous variables as the model has unknown coefficients. This thesis examines techniques to classify variables as being either exogenous or endogenous, and investigates the consequences of invalid classifications.

  12. Association among self-compassion, childhood invalidation, and borderline personality disorder symptomatology in a Singaporean sample.

    Science.gov (United States)

    Keng, Shian-Ling; Wong, Yun Yi

    2017-01-01

    Linehan's biosocial theory posits that parental invalidation during childhood plays a role in the development of borderline personality disorder symptoms later in life. However, little research has examined components of the biosocial model in an Asian context, or the variables that may influence the relationship between childhood invalidation and borderline symptoms. Self-compassion is increasingly regarded as an adaptive way to regulate one's emotions and to relate to oneself, and may serve to moderate the association between invalidation and borderline symptoms. The present study investigated the association among childhood invalidation, self-compassion, and borderline personality disorder symptoms in a sample of Singaporean undergraduate students. Two hundred and ninety undergraduate students from a large Singaporean university were recruited and completed measures assessing childhood invalidation, self-compassion, and borderline personality disorder symptoms. Analyses using multiple regression indicated that both childhood invalidation and self-compassion significantly predicted borderline personality disorder symptomatology. Results from moderation analyses indicated that the relationship between childhood invalidation and borderline personality disorder symptomatology did not vary as a function of self-compassion. This study provides evidence in support of aspects of the biosocial model in an Asian context, and demonstrates a strong association between self-compassion and borderline personality disorder symptoms, independent of one's history of parental invalidation during childhood.

  13. Parental Invalidation and the Development of Narcissism.

    Science.gov (United States)

    Huxley, Elizabeth; Bizumic, Boris

    2017-02-17

    Parenting behaviors and childhood experiences have played a central role in theoretical approaches to the etiology of narcissism. Research has suggested an association between parenting and narcissism; however, it has been limited in its examination of different narcissism subtypes and individual differences in parenting behaviors. This study investigates the influence of perceptions of parental invalidation, an important aspect of parenting behavior theoretically associated with narcissism. Correlational and hierarchical regression analyses were conducted using a sample of 442 Australian participants to examine the relationship between invalidating behavior from mothers and fathers, and grandiose and vulnerable narcissism. Results indicate that stronger recollections of invalidating behavior from either mothers or fathers are associated with higher levels of grandiose and vulnerable narcissism when controlling for age, gender, and the related parenting behaviors of rejection, coldness, and overprotection. The lowest levels of narcissism were found in individuals who reported low levels of invalidation in both parents. These findings support the idea that parental invalidation is associated with narcissism.

  14. [Assessment of invalidity as a result of infectious diseases].

    Science.gov (United States)

    Čeledová, L; Čevela, R; Bosák, M

    2016-01-01

    The article features the new medical assessment paradigm for invalidity as a result of infectious disease, applied since 1 January 2010. The invalidity assessment criteria are regulated specifically by Regulation No. 359/2009. Chapter I of the Annexe to the invalidity assessment regulation addresses the area of infectious diseases with respect to functional impairment and its impact on the quality of life. Since 2010, invalidity has also been newly categorized into three groups. The new assessment approach makes it possible to evaluate a person's functional capacity, type of disability, and eligibility for compensation for reduced capacity for work. In 2010, a total of 170 375 invalidity cases were assessed, and in 2014, 147 121 invalidity assessments were made. Invalidity as a result of infectious disease was assessed in 177 persons in 2010, and 128 such assessments were made in 2014. The most common causes of invalidity as a result of infectious disease are chronic viral hepatitis, other spirochetal infections, tuberculosis of the respiratory tract, tick-borne viral encephalitis, and HIV/AIDS. The number of assessments of invalidity as a result of infectious disease showed a declining trend between 2010 and 2014, as did the total number of invalidity assessments. In spite of this fact, cases of invalidity as a result of infectious disease account for approximately half a percent of all invalidity assessments made in the above-mentioned period.

  15. The (un)clear effects of invalid retro-cues.

    Directory of Open Access Journals (Sweden)

    Marcel Gressmann

    2016-03-01

    Studies with the retro-cue paradigm have shown that validly cueing objects in visual working memory long after encoding can still benefit performance on subsequent change detection tasks. With regard to the effects of invalid cues, the literature is less clear. Some studies reported costs, others did not. We here revisit two recent studies that made interesting suggestions concerning invalid retro-cues: one study suggested that costs only occur for larger set sizes, and another suggested that the inclusion of invalid retro-cues diminishes the retro-cue benefit. New data from one experiment and a reanalysis of published data are provided to address these conclusions. The new data clearly show costs (and benefits) that were independent of set size, and the reanalysis suggests no influence of the inclusion of invalid retro-cues on the retro-cue benefit. Thus, previous interpretations should be treated with some caution at present.

  16. Development of Optimal Catalyst Designs and Operating Strategies for Lean NOx Reduction in Coupled LNT-SCR Systems

    Energy Technology Data Exchange (ETDEWEB)

    Harold, Michael [Univ. of Houston, TX (United States); Crocker, Mark [Univ. of Kentucky, Lexington, KY (United States); Balakotaiah, Vemuri [Univ. of Houston, TX (United States); Luss, Dan [Univ. of Houston, TX (United States); Choi, Jae-Soon [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dearth, Mark [Ford Motor Company, Dearborn, MI (United States); McCabe, Bob [Ford Motor Company, Dearborn, MI (United States); Theis, Joe [Ford Motor Company, Dearborn, MI (United States)

    2013-09-30

    Oxides of nitrogen in the form of nitric oxide (NO) and nitrogen dioxide (NO2), commonly referred to as NOx, are one of the two chemical precursors that lead to ground-level ozone, a ubiquitous air pollutant in urban areas. A major source of NOx is equipment and vehicles powered by diesel engines, whose combustion exhaust contains NOx in the presence of excess O2. Catalytic abatement measures that are effective for gasoline-fueled engines, such as the precious-metal-containing three-way catalytic converter (TWC), cannot be used to treat O2-laden exhaust containing NOx. Two catalytic technologies that have emerged as effective for NOx abatement are NOx storage and reduction (NSR) and selective catalytic reduction (SCR). NSR is similar to TWC but requires much larger quantities of expensive precious metals and sophisticated periodic switching operation, while SCR requires an on-board source of ammonia which serves as the chemical reductant of the NOx. The fact that NSR produces ammonia as a byproduct while SCR requires ammonia to work has led to interest in combining the two, to avoid the need for the cumbersome ammonia generation system. In this project a comprehensive study was carried out of the fundamental aspects and application feasibility of combined NSR/SCR. The project team, which included university, industry, and national lab researchers, investigated the kinetics and mechanistic features of the underlying chemistry in the lean NOx trap (LNT) wherein NSR was carried out, with particular focus on identifying the operating conditions, such as temperature and catalytic properties, which lead to the production of ammonia in the LNT. The performance features of SCR on both model and commercial catalysts focused on the synergy between the LNT and SCR converters in terms of utilizing the upstream-generated ammonia and

  17. Attack Tree Generation by Policy Invalidation

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, Rene Rydhof

    2015-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering, complicating their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified through brainstorming of experts. In this work we formalize attack tree generation including human factors; based on recent advances in system models we develop a technique to identify possible attacks analytically, including technical and human factors. Our systematic attack generation is based on invalidating policies in the system model by identifying possible sequences of actions that lead to an attack. The generated attacks are precise enough to illustrate the threat, and they are general enough to hide the details of individual steps.

  19. Set-base dynamical parameter estimation and model invalidation for biochemical reaction networks.

    Science.gov (United States)

    Rumschinski, Philipp; Borchers, Steffen; Bosio, Sandro; Weismantel, Robert; Findeisen, Rolf

    2010-05-25

    Mathematical modeling and analysis have become, for the study of biological and cellular processes, an important complement to experimental research. However, the structural and quantitative knowledge available for such processes is frequently limited, and measurements are often subject to inherent and possibly large uncertainties. This results in competing model hypotheses, whose kinetic parameters may not be experimentally determinable. Discriminating among these alternatives and estimating their kinetic parameters is crucial to improve the understanding of the considered process, and to benefit from the analytical tools at hand. In this work we present a set-based framework that allows one to discriminate between competing model hypotheses and to provide guaranteed outer estimates on the model parameters that are consistent with the (possibly sparse and uncertain) experimental measurements. This is obtained by means of exact proofs of model invalidity that exploit the polynomial/rational structure of biochemical reaction networks, and by making use of an efficient strategy to balance solution accuracy and computational effort. The practicability of our approach is illustrated with two case studies. The first study shows that our approach allows one to conclusively rule out wrong model hypotheses. The second study focuses on parameter estimation, and shows that the proposed method allows one to evaluate the global influence of measurement sparsity, uncertainty, and prior knowledge on the parameter estimates. This can help in designing further experiments leading to improved parameter estimates.
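
    The core idea, declaring a model hypothesis invalid when no admissible parameter can reproduce the bounded measurements, can be illustrated crudely as follows. The paper derives exact infeasibility certificates via relaxations of the polynomial constraints; the grid-based check below is only a hedged toy illustration of that feasibility question, and every name and number in it is invented for the example:

```python
def is_invalidated(step, p_lo, p_hi, x0, bounds, n_grid=1000):
    """Return True if no parameter p in [p_lo, p_hi] keeps the trajectory
    x_{k+1} = step(p, x_k) inside every measurement interval in `bounds`.

    `bounds` is a list of (lo, hi) intervals, one per time step after x0.
    A True result means the model hypothesis is inconsistent with the data.
    """
    for i in range(n_grid + 1):
        p = p_lo + (p_hi - p_lo) * i / n_grid
        x, feasible = x0, True
        for lo, hi in bounds:
            x = step(p, x)
            if not (lo <= x <= hi):
                feasible = False
                break
        if feasible:
            return False  # found a consistent parameter: not invalidated
    return True  # no admissible parameter fits the data

# Toy data generated by x_{k+1} = 0.9 * x_k, with +/-5% measurement bounds:
data = [0.9, 0.81, 0.729]
bounds = [(x * 0.95, x * 1.05) for x in data]
decay = lambda p, x: p * x

print(is_invalidated(decay, 0.4, 0.6, 1.0, bounds))  # True: [0.4, 0.6] ruled out
print(is_invalidated(decay, 0.8, 1.0, 1.0, bounds))  # False: contains p = 0.9
```

    Unlike this grid search, the set-based method in the abstract proves infeasibility for the whole parameter set at once, which is what makes the invalidation guaranteed rather than heuristic.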

  20. On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2015-01-01

    This paper is an historical assessment of how prominent radiation geneticists in the United States during the 1940s and 1950s successfully worked to build acceptance for the linear no-threshold (LNT) dose–response model in risk assessment, significantly impacting environmental, occupational and medical exposure standards and practices to the present time. Detailed documentation indicates that actions taken in support of this policy revolution were ideologically driven and deliberately and deceptively misleading; that scientific records were artfully misrepresented; and that people and organizations in positions of public trust failed to perform the duties expected of them. Key activities are described and the roles of specific individuals are documented. These actions culminated in a 1956 report by a Genetics Panel of the U.S. National Academy of Sciences (NAS) on Biological Effects of Atomic Radiation (BEAR). In this report the Genetics Panel recommended that a linear dose response model be adopted for the purpose of risk assessment, a recommendation that was rapidly and widely promulgated. The paper argues that current international cancer risk assessment policies are based on fraudulent actions of the U.S. NAS BEAR I Committee, Genetics Panel and on the uncritical, unquestioning and blind-faith acceptance by regulatory agencies and the scientific community. - Highlights: • The 1956 recommendation of the US NAS to use the LNT for risk assessment was adopted worldwide. • This recommendation is based on a falsification of the research record and represents scientific misconduct. • The record misrepresented the magnitude of panelist disagreement of genetic risk from radiation. • These actions enhanced public acceptance of their risk assessment policy recommendations.

  2. An improved cooperative adaptive cruise control (CACC) algorithm considering invalid communication

    Science.gov (United States)

    Wang, Pangwei; Wang, Yunpeng; Yu, Guizhen; Tang, Tieqiao

    2014-05-01

    For Cooperative Adaptive Cruise Control (CACC) algorithms, existing research has mainly focused on how inter-vehicle communication can be used to design the CACC controller and on the influence of communication delays and actuator lags on string stability. Whether string stability can be guaranteed when inter-vehicle communication is partially invalid, however, has hardly been considered. This paper presents an improved CACC algorithm based on sliding mode control theory and analyses the range of CACC controller parameters that maintains string stability. A dynamic model of vehicle spacing deviation in a platoon is then established, and the string stability conditions under the improved CACC are analyzed. Unlike traditional CACC algorithms, the proposed algorithm can ensure the functionality of the CACC system even if inter-vehicle communication is partially invalid. Finally, this paper establishes a platoon of five vehicles to simulate the improved CACC algorithm in MATLAB/Simulink, and the simulation results demonstrate that the improved CACC algorithm can maintain the string stability of a CACC platoon by adjusting the controller parameters and enlarging the spacing to prevent accidents. With guaranteed string stability, the proposed CACC algorithm can prevent oscillation of vehicle spacing and reduce chain collision accidents under real-world circumstances.
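
    The degradation path described above, falling back to sensor-only (ACC-style) feedback for any follower whose communication link is down, can be sketched with a toy platoon simulation. This is not the paper's sliding-mode controller; it is a hedged illustration with invented gains and a simple constant-time-gap spacing policy:

```python
def simulate_platoon(comm_ok, n=5, dt=0.1, steps=600):
    """Simulate a platoon of n vehicles; the leader slows from 20 to 15 m/s.

    Followers track a constant-time-gap spacing policy. When comm_ok[i] is
    True, follower i adds a feedforward of its predecessor's acceleration
    (the CACC term); otherwise it degrades to pure feedback (ACC-style).
    Returns (min_gap_seen, final_speeds).
    """
    s0, h = 5.0, 1.0          # standstill gap (m), time gap (s)
    kp, kd = 0.2, 0.7         # feedback gains (invented for this sketch)
    pos = [-(s0 + 20.0 * h) * i for i in range(n)]
    vel = [20.0] * n
    acc = [0.0] * n
    min_gap = float("inf")
    for _ in range(steps):
        new_acc = [0.0] * n
        new_acc[0] = -1.0 if vel[0] > 15.0 else 0.0   # leader braking phase
        for i in range(1, n):
            gap = pos[i - 1] - pos[i]
            min_gap = min(min_gap, gap)
            err = gap - (s0 + h * vel[i])             # spacing deviation
            u = kp * err + kd * (vel[i - 1] - vel[i])
            if comm_ok[i]:
                u += acc[i - 1]                       # CACC feedforward term
            new_acc[i] = u
        for i in range(n):                            # Euler integration
            pos[i] += vel[i] * dt
            vel[i] += new_acc[i] * dt
        acc = new_acc
    return min_gap, vel

# Even with the links of vehicles 2 and 4 down, the toy platoon stays
# collision-free and settles near the leader's new speed:
min_gap, speeds = simulate_platoon([True, True, False, True, False])
print(min_gap > 0.0, max(abs(v - 15.0) for v in speeds) < 1.0)
```

    With these gains the per-vehicle spacing dynamics are overdamped, so the followers close up smoothly instead of oscillating; the paper's contribution is choosing such parameters (and enlarged spacing) so that this remains true across the whole platoon when some links fail.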

  3. Invalid-point removal based on epipolar constraint in the structured-light method

    Science.gov (United States)

    Qi, Zhaoshuai; Wang, Zhao; Huang, Junhui; Xing, Chao; Gao, Jianmin

    2018-06-01

    In structured-light measurement, there unavoidably exist many invalid points caused by shadows, image noise and ambient light. According to the property of the epipolar constraint, because the retrieved phase of the invalid point is inaccurate, the corresponding projector image coordinate (PIC) will not satisfy the epipolar constraint. Based on this fact, a new invalid-point removal method based on the epipolar constraint is proposed in this paper. First, the fundamental matrix of the measurement system is calculated, which will be used for calculating the epipolar line. Then, according to the retrieved phase map of the captured fringes, the PICs of each pixel are retrieved. Subsequently, the epipolar line in the projector image plane of each pixel is obtained using the fundamental matrix. The distance between the corresponding PIC and the epipolar line of a pixel is defined as the invalidation criterion, which quantifies the satisfaction degree of the epipolar constraint. Finally, all pixels with a distance larger than a certain threshold are removed as invalid points. Experiments verified that the method is easy to implement and demonstrates better performance than state-of-the-art measurement systems.
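
    The invalidation criterion described above is the standard point-to-epipolar-line distance: for a camera pixel x and its retrieved projector image coordinate x', the epipolar line is l' = F·x, and the distance is |x'·l'| / sqrt(l'1² + l'2²). A hedged sketch follows; the fundamental matrix and threshold below are made up for illustration, whereas the paper's F comes from calibrating the measurement system:

```python
import math

def epipolar_distance(F, x_cam, x_proj):
    """Distance from projector point x_proj to the epipolar line F @ x_cam.

    F is a 3x3 fundamental matrix (nested lists); points are (u, v) pixels.
    """
    xc = (x_cam[0], x_cam[1], 1.0)
    l = [sum(F[i][j] * xc[j] for j in range(3)) for i in range(3)]  # l' = F x
    num = abs(x_proj[0] * l[0] + x_proj[1] * l[1] + l[2])
    return num / math.hypot(l[0], l[1])

# For a rig translated purely along the x axis, F takes this canonical form
# and epipolar lines are horizontal: valid correspondences share their v.
F = [[0.0, 0.0, 0.0],
     [0.0, 0.0, -1.0],
     [0.0, 1.0, 0.0]]

good = epipolar_distance(F, (100.0, 50.0), (140.0, 50.0))  # on the line
bad = epipolar_distance(F, (100.0, 50.0), (140.0, 53.0))   # phase error
print(good, bad)  # 0.0 3.0
threshold = 2.0
print(bad > threshold)  # True -> flagged as an invalid point
```

    A pixel whose retrieved phase is corrupted by shadow or noise maps to a projector coordinate off its epipolar line, so thresholding this distance removes it, which is exactly the filtering step the abstract describes.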

  4. Self-compassion and emotional invalidation mediate the effects of parental indifference on psychopathology.

    Science.gov (United States)

    Westphal, Maren; Leahy, Robert L; Pala, Andrea Norcini; Wupperman, Peggilee

    2016-08-30

    This study investigated whether self-compassion and emotional invalidation (perceiving others as indifferent to one's emotions) may explain the relationship of childhood exposure to adverse parenting and adult psychopathology in psychiatric outpatients (N=326). Path analysis was used to investigate associations between exposure to adverse parenting (abuse and indifference), self-compassion, emotional invalidation, and mental health when controlling for gender and age. Self-compassion was strongly inversely associated with emotional invalidation, suggesting that a schema that others will be unsympathetic or indifferent toward one's emotions may affect self-compassion and vice versa. Both self-compassion and emotional invalidation mediated the relationship between parental indifference and mental health outcomes. These preliminary findings suggest the potential utility of self-compassion and emotional schemas as transdiagnostic treatment targets. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Has the prevalence of invalidating musculoskeletal pain changed over the last 15 years (1993-2006)? A Spanish population-based survey.

    Science.gov (United States)

    Jiménez-Sánchez, Silvia; Jiménez-García, Rodrigo; Hernández-Barrera, Valentín; Villanueva-Martínez, Manuel; Ríos-Luna, Antonio; Fernández-de-las-Peñas, César

    2010-07-01

    The aim of the current study was to estimate the prevalence and time trend of invalidating musculoskeletal pain in the Spanish population and its association with socio-demographic factors, lifestyle habits, self-reported health status, and comorbidity with other diseases, analyzing data from the 1993-2006 Spanish National Health Surveys (SNHS). We analyzed individualized data taken from the SNHS conducted in 1993 (n = 20,707), 2001 (n = 21,058), 2003 (n = 21,650) and 2006 (n = 29,478). Invalidating musculoskeletal pain was defined as pain suffered during the preceding 2 weeks that decreased the main working activity or free-time activity by at least half a day. We analyzed socio-demographic characteristics, self-perceived health status, lifestyle habits, and comorbid conditions using multivariate logistic regression models. Overall, the prevalence of invalidating musculoskeletal pain in Spanish adults was 6.1% (95% CI, 5.7-6.4) in 1993, 7.3% (95% CI, 6.9-7.7) in 2001, 5.5% (95% CI, 5.1-5.9) in 2003 and 6.4% (95% CI, 6-6.8) in 2006. The prevalence of invalidating musculoskeletal pain among women was almost twice that of men in every year. Education on postural hygiene, physical exercise, and the prevention of obesity and sedentary lifestyle habits should be provided by Public Health Services. This population-based study indicates that invalidating musculoskeletal pain that reduces the main working activity is a public health problem in Spain. The prevalence of invalidating musculoskeletal pain was higher in women than in men and was associated with lower income, poor sleep, worse self-reported health status, and other comorbid conditions. Further, the prevalence of invalidating musculoskeletal pain increased from 1993 to 2001 but remained stable in the later years (2001 to 2006).

  6. 30 CFR 253.50 - How can MMS refuse or invalidate my OSFR evidence?

    Science.gov (United States)

    2010-07-01

    How can MMS refuse or invalidate my OSFR evidence? (a) If MMS determines that any OSFR evidence you submit... acceptable evidence without being subject to civil penalty under § 253.51. (b) MMS may immediately and...

  7. Automatically repairing invalid polygons with a constrained triangulation

    NARCIS (Netherlands)

    Ledoux, H.; Arroyo Ohori, K.; Meijers, M.

    2012-01-01

    Although the validation of single polygons has received considerable attention, the automatic repair of invalid polygons has not. Automated repair methods can be considered as interpreting ambiguous or ill-defined polygons and giving a coherent and clearly defined output. At this moment, automatic

  8. Putting aside the LNT dilemma in the controllable dose concept

    International Nuclear Information System (INIS)

    Koblinger, Laszlo

    2000-01-01

    Recently, Professor R. Clarke, ICRP Chairman, published his proposal for a renewal of the basic radiation protection concept. The two main points of his proposed system are: (1) the term Controllable Dose is introduced, and (2) the protection philosophy is based on the individual. For practical use, terms like 'Action Level', 'Investigation Level' etc. are introduced. The outline of the new system promises a considerably less complex framework: no distinction between practices and interventions, and a unified treatment of occupational, medical and public exposures. There is, however, an inconsistency within the new system: though linearity is not assumed, the relations between the definitions of the new terms of the system of protection and the doses assigned to them are still based on the LNT hypothesis. To avoid this discrepancy, a new definition of the Action Level is recommended, as a conservative estimate of the lowest dose at which harmful effects have ever been demonstrated. Other levels should be defined from the Action Level by safety factors applied to the doses. (author)

  9. Cancer and low dose responses In Vivo: implications for radiation protection

    Energy Technology Data Exchange (ETDEWEB)

    Mitchel, R.E.J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2006-12-15

    This paper discusses the linear no-threshold (LNT) hypothesis, risk prediction and radiation protection. The summary implications for the radiation protection system are that, at low doses, the conceptual basis of the present system appears to be incorrect. The belief that the current system embodies the precautionary principle and that the LNT assumption is cautious appears incorrect. The concept of dose additivity appears incorrect. Effective dose (Sievert) and the weighting factors on which it is based appear to be invalid. There may be no constant and appropriate value of DDREF for radiological protection dosimetry. The use of dose as a predictor of risk needs to be re-examined. The use of dose limits as a means of limiting risk needs to be re-evaluated.

  10. Intriguing legacy of Einstein, Fermi, Jordan, and others: The possible invalidation of quark conjectures

    International Nuclear Information System (INIS)

    Santilli, R.M.

    1981-01-01

    The objective of this paper is to present an outline of a number of criticisms of the quark models of hadron structure which have been present in the community of basic research for some time. The hope is that quark supporters will consider these criticisms and present possible counterarguments for a scientifically effective resolution of the issues. In particular, it is submitted that the problem of whether quarks exist as physical particles necessarily calls for the prior theoretical and experimental resolution of the question of the validity or invalidity, for hadronic structure, of the relativity and quantum mechanical laws established for atomic structure. The current theoretical studies leading to the conclusion that they are invalid are considered, together with the experimental situation. We also recall the doubts by Einstein, Fermi, Jordan, and others on the final character of contemporary physical knowledge. Most of all, this paper is an appeal to young minds of all ages. The possible invalidity for the strong interactions of the physical laws of the electromagnetic interactions, rather than constituting a scientific drawback, represents instead an invaluable impetus toward the search for covering laws specifically conceived for hadronic structure and strong interactions in general, a program which has already been initiated by a number of researchers. In turn, this situation appears to have all the ingredients for a new scientific renaissance, perhaps comparable to that of the early part of this century.

  12. Invalid Permutation Tests

    Directory of Open Access Journals (Sweden)

    Mikel Aickin

    2010-01-01

    Permutation tests are often presented in a rather casual manner, in both introductory and advanced statistics textbooks. The appeal of the cleverness of the procedure seems to replace the need for a rigorous argument that it produces valid hypothesis tests. The consequence of this educational failing has been a widespread belief in a “permutation principle”, which is supposed invariably to give tests that are valid by construction, under an absolute minimum of statistical assumptions. Several lines of argument are presented here to show that the permutation principle itself can be invalid, concentrating on the Fisher-Pitman permutation test for two means. A simple counterfactual example illustrates the general problem, and a slightly more elaborate counterfactual argument is used to explain why the main mathematical proof of the validity of permutation tests is mistaken. Two modifications of the permutation test are suggested, and shown to be valid in a very modest simulation. In instances where simulation software is readily available, investigating the validity of a specific permutation test can be done easily, requiring only a minimum understanding of statistical technicalities.
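    The Fisher-Pitman test discussed above is easy to probe by simulation, as the abstract recommends. A minimal Monte Carlo sketch of the two-sample permutation test on a difference in means (the function name and parameter choices are illustrative, not from Aickin's paper):

```python
import random

def permutation_test(x, y, n_perm=10000, seed=0):
    """Two-sample Fisher-Pitman permutation test for a difference in means.

    Pools the observations, repeatedly reassigns them at random to the two
    groups, and returns a Monte Carlo two-sided p-value: the fraction of
    permutations whose absolute mean difference meets or exceeds the
    observed one (with an add-one correction so p stays strictly positive).
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    n_x = len(x)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_x]) / n_x
                   - sum(pooled[n_x:]) / (len(pooled) - n_x))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Overlapping groups: the p-value should be unremarkable
p_null = permutation_test([1.0, 2.0, 3.0, 4.0], [1.5, 2.5, 3.5, 4.5])
# Clearly separated groups: the p-value should be small
p_alt = permutation_test([1.0, 1.1, 0.9, 1.2], [5.0, 5.1, 4.9, 5.2])
```

    Feeding this function data simulated under a null where the two groups share a mean but differ in some other respect (e.g. variance) is exactly the kind of validity check the abstract describes.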

  13. Does overprotection cause cardiac invalidism after acute myocardial infarction?

    Science.gov (United States)

    Riegel, B J; Dracup, K A

    1992-01-01

    To determine if overprotection on the part of the patient's family and friends contributes to the development of cardiac invalidism after acute myocardial infarction. Longitudinal survey. Nine hospitals in the southwestern United States. One hundred eleven patients who had experienced a first acute myocardial infarction. Subjects were predominantly male, older, married, Caucasian, and in functional class I. Eighty-one patients characterized themselves as being overprotected (i.e., receiving more social support from family and friends than desired), and 28 reported receiving inadequate support. Only two patients reported receiving as much support as they desired. Self-esteem, emotional distress, health perceptions, interpersonal dependency, return to work. Overprotected patients experienced less anxiety, depression, anger and confusion, and more vigor and higher self-esteem, than inadequately supported patients 1 month after myocardial infarction. Overprotection on the part of family and friends may facilitate psychosocial adjustment in the early months after an acute myocardial infarction rather than lead to cardiac invalidism.

  14. Penis invalidating cicatricial outcomes in an enlargement phalloplasty case with polyacrylamide gel (Formacryl).

    Science.gov (United States)

    Parodi, P C; Dominici, M; Moro, U

    2006-01-01

    The present article reports the case of a patient subjected to cutaneous infiltration of a polyacrylamide-polymer gel in the penis for cosmetic purposes, resulting in severe invalidating outcomes. A significant tissue reaction to the subcutaneous injection of polyacrylamide gel for penis enlargement resulted in permanent and invalidating scars on both the esthetic and the functional level. Such a result must be taken into account, both on its own and in the light of the international literature, to exclude this method from standard uro-andrologic practice.

  15. [Structure of childhood and adolescent invalidity in persons with chronic somatic diseases].

    Science.gov (United States)

    Korenev, N M; Bogmat, L F; Tolmacheva, S R; Timofeeva, O N

    2002-01-01

    Based on the analysis of statistical data, the prevalence of disorders leading to invalidism was estimated among children and young adults under 40 years of age presenting with chronic somatic diseases in Kharkov. In both children (52.4%) and young adults (43.9%), diseases of the nervous system held the most prominent place. Invalidity due to established somatic disorders was identified in 10.9% of children and 24.3% of persons under 40 years of age, with diseases of the circulatory organs predominating. The necessity of carrying out rehabilitation of children with somatic disorders to prevent their disability is substantiated.

  16. The footprints of visual attention during search with 100% valid and 100% invalid cues.

    Science.gov (United States)

    Eckstein, Miguel P; Pham, Binh T; Shimozaki, Steven S

    2004-06-01

    Human performance during visual search typically improves when spatial cues indicate the possible target locations. In many instances, the performance improvement is quantitatively predicted by a Bayesian or quasi-Bayesian observer in which visual attention simply selects the information at the cued locations without changing the quality of processing or sensitivity, and ignores the information at the uncued locations. Aside from the general good agreement between the effect of the cue on model and human performance, there has been little independent confirmation that humans are effectively selecting the relevant information. In this study, we used the classification image technique to assess the effectiveness of spatial cues in the attentional selection of relevant locations and suppression of irrelevant locations indicated by spatial cues. Observers searched for a bright target among dimmer distractors that might appear (with 50% probability) in one of eight locations in visual white noise. The possible target location was indicated using a 100% valid box cue or seven 100% invalid box cues, in which case the only potential target location was uncued. For both conditions, we found statistically significant perceptual templates shaped as differences of Gaussians at the relevant locations, with no perceptual templates at the irrelevant locations. We did not find statistically significant differences between the shapes of the inferred perceptual templates for the 100% valid and 100% invalid cue conditions. The results confirm the idea that during search visual attention allows the observer to effectively select relevant information and ignore irrelevant information. The results for the 100% invalid cue condition suggest that the selection process is not drawn automatically to the cue but can be under the observer's voluntary control.

  17. The role of television in dissatisfaction with politics: an analysis of the content of the weekly analytical programmes of LTV1, LNT and TV3, and of their producers' and experts' assessments (October 2008 - March 2009)

    OpenAIRE

    Novodvorskis, Vladimirs

    2009-01-01

    The master's thesis "The role of television in dissatisfaction with politics: an analysis of the content of the weekly analytical programmes of LTV1, LNT and TV3, and of their producers' and experts' assessments (October 2008 - March 2009)" was written by Vladimirs Novodvorskis, a student of the Department of Communication Studies at the University of Latvia. The thesis is devoted to the study of the formation of a negative audience attitude toward pol... in the informative-analytical television programmes Panorāma, De facto (LTV1), LNT Top 10 (LNT) and Nekā personīga (TV3).

  18. Leukemia and ionizing radiation revisited

    Energy Technology Data Exchange (ETDEWEB)

    Cuttler, J.M. [Cuttler & Associates Inc., Vaughan, Ontario (Canada); Welsh, J.S. [Loyola University-Chicago, Dept. of Radiation Oncology, Stritch School of Medicine, Maywood, Illinois (United States)

    2016-03-15

    A world-wide radiation health scare was created in the late 1950s to stop the testing of atomic bombs and block the development of nuclear energy. In spite of the large amount of evidence that contradicts the cancer predictions, this fear continues. It impairs the use of low radiation doses in medical diagnostic imaging and radiation therapy. This brief article revisits the second of two key studies, which revolutionized radiation protection, and identifies a serious error that was missed. This error in analyzing the leukemia incidence among the 195,000 survivors in the combined exposed populations of Hiroshima and Nagasaki invalidates use of the LNT model for assessing the risk of cancer from ionizing radiation. The threshold acute dose for radiation-induced leukemia, based on about 96,800 humans, is identified to be about 50 rem, or 0.5 Sv. It is reasonable to expect that the thresholds for other cancer types are higher than this level. No predictions or hints of excess cancer risk (or any other health risk) should be made for an acute exposure below this value until there is scientific evidence to support the LNT hypothesis. (author)

  19. Sociodemographic characteristics and diabetes predict invalid self-reported non-smoking in a population-based study of U.S. adults

    Directory of Open Access Journals (Sweden)

    Shelton Brent J

    2007-03-01

    Background Nearly all studies reporting smoking status collect self-reported data. The objective of this study was to assess sociodemographic characteristics and selected, common smoking-related diseases as predictors of invalid reporting of non-smoking. Valid self-reported smoking may be related to the degree to which smoking is a behavior that is not tolerated by the smoker's social group. Methods True smoking was defined as having serum cotinine of 15+ ng/ml. 1483 "true" smokers 45+ years of age with self-reported smoking and serum cotinine data from the Mobile Examination Center were identified in the third National Health and Nutrition Examination Survey. Invalid non-smoking was defined as "true" smokers self-reporting non-smoking. To assess predictors of invalid self-reported non-smoking, odds ratios (OR) and 95% confidence intervals (CI) were calculated for age, race/ethnicity-gender categories, education, income, diabetes, hypertension, and myocardial infarction. Multiple logistic regression modeling took into account the complex survey design and sample weights. Results Among smokers with diabetes, invalid non-smoking status was 15%, ranging from 0% for Mexican-American (MA) males to 22%–25% for Non-Hispanic White (NHW) males and Non-Hispanic Black (NHB) females. Among smokers without diabetes, invalid non-smoking status was 5%, ranging from 3% for MA females to 10% for NHB females. After simultaneously taking into account diabetes, education, race/ethnicity and gender, smokers with diabetes (ORAdj = 3.15; 95% CI: 1.35–7.34), who did not graduate from high school (ORAdj = 2.05; 95% CI: 1.30–3.22) and who were NHB females (ORAdj = 5.12; 95% CI: 1.41–18.58) were more likely to self-report as non-smokers than smokers without diabetes, who were high school graduates, and MA females, respectively. Having a history of myocardial infarction or hypertension did not predict invalid reporting of non-smoking. Conclusion Validity of self…
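    The odds-ratio-with-confidence-interval reporting used in this study can be sketched for a single 2x2 table using the standard Woolf (log-OR) method. The counts below are hypothetical illustration values, not the study's data, and an unadjusted table omits the survey weighting and covariate adjustment the study actually used:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
        a = exposed cases,   b = exposed non-cases
        c = unexposed cases, d = unexposed non-cases
    The CI is computed on the log scale using
    SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d), then exponentiated.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: invalid non-smoking among smokers with vs. without diabetes
or_, lo, hi = odds_ratio_ci(30, 170, 60, 1140)
```

    A CI lying entirely above 1, as in the study's adjusted estimate for diabetes, indicates the exposed group is significantly more likely to misreport.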

  20. PROCEDURAL REASONS FOR INVALIDITY OF DECISIONS MADE BY THE ASSEMBLY IN LIMITED LIABILITY COMPANY - de lege lata vs. de lege ferenda

    Directory of Open Access Journals (Sweden)

    Lidija Šimunović

    2017-01-01

    Procedural reasons, unlike other reasons for invalidity of decisions made by the Assembly in a limited liability company (hereinafter: Ltd.), raise the greatest number of legal questions in judicial and business practice. These are "mistakes in the steps" that lead to invalidity of decisions made by the Assembly in an Ltd., which have not been systematically discussed in the domestic legal literature. The starting point for the elaboration of this issue is the circumstance that Article 448 of the Companies Act stipulates that the provisions on the invalidity of decisions made by the General Assembly in a PLC (public limited company) apply appropriately to the invalidity of decisions made by the Assembly in an Ltd. The procedural differences in the working of the General Assembly in a PLC and the Assembly in an Ltd. are among the fundamental differences between these two types of capital companies, and this form of positive legal regulation leads to legal uncertainty and misinterpretation. The first part of this paper gives a chronological overview of domestic law with regard to the invalidity of decisions made by the Assembly in an Ltd. Invalid decisions are then doctrinally distinguished from other defective decisions. Next, each provision on the invalidity of decisions made by the General Assembly in a PLC is tested, and a provision valid specifically within the context of an Ltd. is explicitly formulated. Apart from domestic law, solutions from comparative law (especially German law) are analyzed, because domestic law largely overlaps with them. In conclusion, the findings of the analysis are used as guidelines for a more practical de lege ferenda regulation in the Companies Act of the invalidity of decisions made by the Assembly in an Ltd.

  1. An experimental pilot study of response to invalidation in young women with features of borderline personality disorder.

    Science.gov (United States)

    Woodberry, Kristen A; Gallo, Kaitlin P; Nock, Matthew K

    2008-01-15

    One of the leading biosocial theories of borderline personality disorder (BPD) suggests that individuals with BPD have biologically based abnormalities in emotion regulation contributing to more intense and rapid responses to emotional stimuli, in particular, invalidation [Linehan, M.M., 1993. Cognitive-Behavioral Treatment of Borderline Personality Disorder. Guilford, New York.]. This study used a 2 by 2 experimental design to test whether young women with features of BPD actually show increased physiological arousal in response to invalidation. Twenty-three women ages 18 to 29 who endorsed high levels of BPD symptoms and 18 healthy controls were randomly assigned to hear either a validating or invalidating comment during a frustrating task. Although we found preliminary support for differential response to these stimuli in self-report of valence, we found neither self-report nor physiological evidence of hyperarousal in the BPD features group, either at baseline or in response to invalidation. Interestingly, the BPD features group reported significantly lower comfort with emotion, and comfort was significantly associated with affective valence but not arousal. We discuss implications for understanding and responding to the affective intensity of this population.

  2. A Novel Cache Invalidation Scheme for Mobile Networks

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In this paper, we propose a strategy for maintaining cache consistency in wireless mobile environments. The strategy adds a validation server (VS) to the GPRS network, utilizes the mobile-terminal location information held in the SGSN at the GPRS backbone, sends invalidation information only to online mobile terminals in accordance with the data they cache, and reduces the amount of information sent in asynchronous transmission. This strategy enables a mobile terminal to access cached data with very little computation, low delay and arbitrary disconnection intervals, and outperforms the synchronous invalidation report (IR) and asynchronous state (AS) schemes in overall performance.
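    The core selective-invalidation idea, pushing invalidation messages only to online terminals that actually cache the updated item, can be sketched as follows. Class and method names are illustrative; the paper's actual VS/SGSN protocol and message formats are not reproduced here:

```python
class ValidationServer:
    """Minimal sketch of selective cache invalidation: the server records
    which online terminals cache which data items and notifies only the
    online terminals that actually hold an updated item."""

    def __init__(self):
        self.cache_map = {}   # terminal id -> set of cached item ids
        self.online = set()   # terminals currently reachable

    def register(self, terminal, items):
        """A terminal comes online and reports its cached items."""
        self.cache_map[terminal] = set(items)
        self.online.add(terminal)

    def disconnect(self, terminal):
        """Arbitrary disconnection intervals are allowed; state is kept."""
        self.online.discard(terminal)

    def reconnect(self, terminal):
        self.online.add(terminal)

    def invalidate(self, item):
        """Return the terminals notified to drop `item` from their cache."""
        targets = [t for t in sorted(self.online)
                   if item in self.cache_map.get(t, ())]
        for t in targets:
            self.cache_map[t].discard(item)  # stand-in for a push message
        return targets

vs = ValidationServer()
vs.register("mt1", {"a", "b"})
vs.register("mt2", {"b"})
vs.disconnect("mt2")
notified = vs.invalidate("b")  # only the online terminal caching "b"
```

    A disconnected terminal such as "mt2" would, on reconnection, still need to reconcile its stale entries; handling that catch-up step is the part of the scheme this sketch omits.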

  3. The Role of Maternal Emotional Validation and Invalidation on Children's Emotional Awareness

    Science.gov (United States)

    Lambie, John A.; Lindberg, Anja

    2016-01-01

    Emotional awareness--that is, accurate emotional self-report--has been linked to positive well-being and mental health. However, it is still unclear how emotional awareness is socialized in young children. This observational study examined how a particular parenting communicative style--emotional validation versus emotional invalidation--was…

  4. Sequential and base rate analysis of emotional validation and invalidation in chronic pain couples: patient gender matters.

    Science.gov (United States)

    Leong, Laura E M; Cano, Annmarie; Johansen, Ayna B

    2011-11-01

    The purpose of this study was to examine the extent to which communication patterns that foster or hinder intimacy and emotion regulation in couples were related to pain, marital satisfaction, and depression in 78 chronic pain couples attempting to problem-solve an area of disagreement in their marriage. Sequences and base rates of validation and invalidation communication patterns were almost uniformly unrelated to adjustment variables unless patient gender was taken into account. Male patient couples' reciprocal invalidation was related to worse pain, but this was not found in female patient couples. In addition, spouses' validation was related to poorer patient pain and marital satisfaction, but only in couples with a male patient. It was not only the presence or absence of invalidation and validation that mattered (base rates), but the context and timing of these events (sequences) that affected patients' adjustment. This research demonstrates that sequences of interaction behaviors that foster and hinder emotion regulation should be attended to when assessing and treating pain patients and their spouses. This article presents analyses of both sequences and base rates of chronic pain couples' communication patterns, focusing on validation and invalidation. These results may potentially improve psychosocial treatments for these couples, by addressing sequential interactions of intimacy and empathy. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  5. Educating Jurors about Forensic Evidence: Using an Expert Witness and Judicial Instructions to Mitigate the Impact of Invalid Forensic Science Testimony.

    Science.gov (United States)

    Eastwood, Joseph; Caldwell, Jiana

    2015-11-01

    Invalid expert witness testimony that overstated the precision and accuracy of forensic science procedures has been highlighted as a common factor in many wrongful conviction cases. This study assessed the ability of an opposing expert witness and judicial instructions to mitigate the impact of invalid forensic science testimony. Participants (N = 155) acted as mock jurors in a sexual assault trial that contained both invalid forensic testimony regarding hair comparison evidence, and countering testimony from either a defense expert witness or judicial instructions. Results showed that the defense expert witness was successful in educating jurors regarding limitations in the initial expert's conclusions, leading to a greater number of not-guilty verdicts. The judicial instructions were shown to have no impact on verdict decisions. These findings suggest that providing opposing expert witnesses may be an effective safeguard against invalid forensic testimony in criminal trials. © 2015 American Academy of Forensic Sciences.

  6. [Physicians as Experts of the Integration of war invalids of WWI and WWII].

    Science.gov (United States)

    Wolters, Christine

    2015-12-01

    After the First World War the large number of war invalids posed a medical as well as a socio-political problem. This needed to be addressed, at least to some extent, through the war-victims' welfare authorities (Versorgungsbehörden) and reintegration into the labour market. Due to the demilitarization of Germany, this task was taken on by the civil administration, which was dissolved during the time of National Socialism. In 1950, the Federal Republic of Germany enacted the Federal War Victims Relief Act (Bundesversorgungsgesetz), which created a privileged group of civil and military war invalids, whereas other disabled people and victims of National Socialist persecution were initially excluded. This article examines the continuities and discontinuities of the institutions following the First World War. A particular focus lies on the groups of doctors who structured this field. How did doctors become experts, and what was their expertise?

  7. Simulation of deposition and activity distribution of radionuclides in human airways

    International Nuclear Information System (INIS)

    Farkas, A.; Balashazy, I.; Szoke, I.; Hofmann, W.; Golser, R.

    2002-01-01

    The aim of our research activities is the modelling of the biological processes related to the development of lung cancer in the large central airways, observed in uranium miners and caused by the inhalation of radionuclides (especially alpha-emitting radon decay products). Statistical data show that in uranium miners lung cancer developed mainly in the 3rd to 5th airway generations, and especially in the right upper lobe. Therefore, it is rather important to study the physical and biological effects in this section of the human airways to find relations between the radiation dose and the adverse health effects. These results may provide useful information about the validity or invalidity of the currently used LNT (linear no-threshold) dose-effect hypothesis at low doses.

  8. Validity of the linear no-threshold (LNT) hypothesis in setting radiation protection regulations for the inhabitants in high level natural radiation areas of Ramsar, Iran

    International Nuclear Information System (INIS)

    Mortazavi, S.M.J.; Atefi, M.; Razi, Z.; Mortazavi Gh

    2010-01-01

    Some areas in Ramsar, a city in northern Iran, have long been known as inhabited areas with the highest levels of natural radiation. Despite the fact that the health effects of high doses of ionizing radiation are well documented, the biological effects of above-background levels of natural radiation are still controversial, and the validity of the LNT hypothesis in this area has been criticized by many investigators around the world. The study of the health effects of high levels of natural radiation in areas such as Ramsar helps scientists investigate the biological effects without the need to extrapolate observations either from high doses of radiation to the low-dose region or from laboratory animals to humans. Considering the importance of these studies, the National Radiation Protection Department (NRPD) of the Iranian Nuclear Regulatory Authority has started an integrative research project on the health effects of long-term exposure to high levels of natural radiation. This paper reviews the findings of studies conducted on plants and humans living, or laboratory animals kept, in the high-level natural radiation areas of Ramsar. In the human studies, different end points such as DNA damage, chromosome aberrations, and blood-cell and immunological alterations are discussed. This review comes to the conclusion that no reproducible detrimental health effect has been reported so far. The validity of the LNT hypothesis in the assessment of the health effects of high levels of natural radiation is discussed. (author)

  9. Express attentional re-engagement but delayed entry into consciousness following invalid spatial cues in visual search.

    Directory of Open Access Journals (Sweden)

    Benoit Brisson

    BACKGROUND: In predictive spatial cueing studies, reaction times (RT) are shorter for targets appearing at cued locations (valid trials) than at other locations (invalid trials). An increase in the amplitude of early P1 and/or N1 event-related potential (ERP) components is also present for items appearing at cued locations, reflecting early attentional sensory gain control mechanisms. However, it is still unknown at which stage in the processing stream these early amplitude effects are translated into latency effects. METHODOLOGY/PRINCIPAL FINDINGS: Here, we measured the latency of two ERP components, the N2pc and the sustained posterior contralateral negativity (SPCN), to evaluate whether visual selection (as indexed by the N2pc) and visual short-term memory processes (as indexed by the SPCN) are delayed in invalid trials compared to valid trials. The P1 was larger contralateral to the cued side, indicating that attention was deployed to the cued location prior to target onset. Despite these early amplitude effects, N2pc onset latency was unaffected by cue validity, indicating an express, quasi-instantaneous re-engagement of attention in invalid trials. In contrast, latency effects were observed for the SPCN, and these were correlated with the RT effect. CONCLUSIONS/SIGNIFICANCE: The results show that latency differences that could explain the RT cueing effects must occur after the visual selection processes giving rise to the N2pc, but at or before transfer into visual short-term memory, as reflected by the SPCN, at least in discrimination tasks in which the target is presented concurrently with at least one distractor. Given that the SPCN has previously been associated with conscious report, these results further show that entry into consciousness is delayed following invalid cues.

  10. Mechanisms of Contextual Risk for Adolescent Self-Injury: Invalidation and Conflict Escalation in Mother-Child Interactions

    Science.gov (United States)

    Crowell, Sheila E.; Baucom, Brian R.; McCauley, Elizabeth; Potapova, Natalia V.; Fitelson, Martha; Barth, Heather; Smith, Cindy J.; Beauchaine, Theodore P.

    2013-01-01

    OBJECTIVE According to developmental theories of self-injury, both child characteristics and environmental contexts shape and maintain problematic behaviors. Although progress has been made toward identifying biological vulnerabilities to self-injury, mechanisms underlying psychosocial risk have received less attention. METHOD In the present study, we compared self-injuring adolescents (n=17) with typical controls (n=20) during a mother-child conflict discussion. Dyadic interactions were coded using both global and microanalytic systems, allowing for a highly detailed characterization of mother-child interactions. We also assessed resting state psychophysiological regulation, as indexed by respiratory sinus arrhythmia (RSA). RESULTS Global coding revealed that maternal invalidation was associated with adolescent anger. Furthermore, maternal invalidation and coerciveness were both related to adolescent opposition/defiance. Results from the microanalytic system indicated that self-injuring dyads were more likely to escalate conflict, suggesting a potential mechanism through which emotion dysregulation is shaped and maintained over time. Finally, mother and teen aversiveness interacted to predict adolescent resting RSA. Low-aversive teens with highly aversive mothers had the highest RSA, whereas teens in high-high dyads showed the lowest RSA. CONCLUSIONS These findings are consistent with theories that emotion invalidation and conflict escalation are possible contextual risk factors for self-injury. PMID:23581508

  11. Evidence for beneficial low level radiation effects and radiation hormesis

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    2005-01-01

    Low doses in the mGy range cause a dual effect on cellular DNA. One effect concerns a relatively low probability of DNA damage per energy deposition event; it increases in proportion to dose, with possible bystander effects operating. This damage at background radiation exposure is orders of magnitude lower than that from endogenous sources, such as reactive oxygen species (ROS). The other effect at comparable doses brings an easily observable adaptive protection against DNA damage from any, mainly endogenous, sources, depending on cell type, species, and metabolism. Protective responses express adaptive responses to metabolic perturbations and also mimic oxygen stress responses. Adaptive protection operates in terms of DNA damage prevention and repair, and of immune stimulation. It develops with a delay of hours, may last for days to months, and increasingly disappears at doses beyond about 100 to 200 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses, damage reduction by adaptive protection against damage from endogenous sources predictably outweighs radiogenic damage induction. The analysis of the consequences of this particular low-dose scenario shows that the linear no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favor of a threshold or hormesis. This is consistent with data both from animal studies and from human epidemiological observations on low-dose-induced cancer. The LNT hypothesis should be abandoned and replaced by a hypothesis that is scientifically justified. The appropriate model should include terms for both linear and non-linear response probabilities. Maintaining the LNT hypothesis as the basis for radiation protection causes unreasonable fear and expense. (author)
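    The contrast the author draws between a purely linear model and one with both linear and non-linear response terms can be illustrated with a toy calculation. All parameter values and functional forms below are arbitrary illustration choices, not fitted to any data or taken from the paper:

```python
import math

def lnt_risk(dose, alpha=0.05):
    """Linear no-threshold: excess risk strictly proportional to dose."""
    return alpha * dose

def hormetic_risk(dose, alpha=0.05, beta=0.4):
    """Toy linear-plus-nonlinear response: a protective term rises quickly
    at low dose and decays away at higher dose, so net excess risk is
    negative below a crossover dose and approaches the linear term at
    high dose. Parameters are arbitrary, purely for illustration."""
    protection = beta * (1 - math.exp(-dose / 0.1)) * math.exp(-dose / 0.2)
    return alpha * dose - protection * dose

# At a low dose the toy model predicts net benefit; at a high dose, net harm
low = hormetic_risk(0.05)
high = hormetic_risk(2.0)
```

    The sign change at low dose is what distinguishes a hormetic curve from the LNT line, which is non-negative everywhere; the disagreement between the two models is largest precisely in the low-dose region where epidemiological data are weakest.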

  12. Evaluating the accuracy of the Wechsler Memory Scale-Fourth Edition (WMS-IV) logical memory embedded validity index for detecting invalid test performance.

    Science.gov (United States)

    Soble, Jason R; Bain, Kathleen M; Bailey, K Chase; Kirton, Joshua W; Marceaux, Janice C; Critchfield, Edan A; McCoy, Karin J M; O'Rourke, Justin J F

    2018-01-08

    Embedded performance validity tests (PVTs) allow for continuous assessment of invalid performance throughout neuropsychological test batteries. This study evaluated the utility of the Wechsler Memory Scale-Fourth Edition (WMS-IV) Logical Memory (LM) Recognition score as an embedded PVT using the Advanced Clinical Solutions (ACS) for WAIS-IV/WMS-IV Effort System. This mixed clinical sample comprised 97 participants, 71 of whom were classified as valid and 26 as invalid based on three well-validated, freestanding criterion PVTs. Overall, the LM embedded PVT demonstrated poor concordance with the criterion PVTs and unacceptable psychometric properties using ACS validity base rates (42% sensitivity/79% specificity). Moreover, 15-39% of participants obtained an invalid ACS base rate despite having a normatively intact age-corrected LM Recognition total score. Receiver operating characteristic curve analysis revealed that a Recognition total score cutoff of < 61% correct improved specificity (92%) while sensitivity remained weak (31%). Thus, results indicated the LM Recognition embedded PVT is not appropriate for use from an evidence-based perspective, and that clinicians may be faced with reconciling how a normatively intact cognitive performance on the Recognition subtest could simultaneously reflect invalid performance validity.
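
    As an illustration of the cutoff statistics reported above, the sketch below recomputes sensitivity and specificity from a confusion matrix. The raw counts are hypothetical, chosen only to be consistent with the study's 26-invalid/71-valid split and its reported ~31%/92% figures:

```python
def classification_stats(tp, fn, tn, fp):
    """Sensitivity: proportion of truly invalid profiles that are flagged.
    Specificity: proportion of truly valid profiles that are passed."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts (not the study's raw data): 8 of 26 invalid cases
# flagged, 65 of 71 valid cases passed at the <61%-correct cutoff.
sens, spec = classification_stats(tp=8, fn=18, tn=65, fp=6)
```

    With these assumed counts, sensitivity is about 0.31 and specificity about 0.92, matching the cutoff figures quoted in the abstract.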

  13. LNTgate: How scientific misconduct by the U.S. NAS led to governments adopting LNT for cancer risk assessment

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2016-01-01

    This paper provides a detailed rebuttal to the letter of Beyea (2016) which offered a series of alternative interpretations to those offered in my article in Environmental Research (Calabrese, 2015a) concerning the role of the U.S. National Academy of Sciences (NAS) Biological Effects of Atomic Radiation (BEAR) I Committee Genetics Panel in the adoption of the linear dose response model for cancer risk assessment. Significant newly uncovered evidence is presented which supports and extends the findings of Calabrese (2015a), reaffirming the conclusion that the Genetics Panel should be evaluated for scientific misconduct for deliberate misrepresentation of the research record in order to enhance an ideological agenda. This critique documents numerous factual errors along with extensive and deliberate filtering of information in the Beyea letter (2016) that leads to consistently incorrect conclusions and an invalid general perspective.

  14. LNTgate: How scientific misconduct by the U.S. NAS led to governments adopting LNT for cancer risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu

    2016-07-15

    This paper provides a detailed rebuttal to the letter of Beyea (2016) which offered a series of alternative interpretations to those offered in my article in Environmental Research (Calabrese, 2015a) concerning the role of the U.S. National Academy of Sciences (NAS) Biological Effects of Atomic Radiation (BEAR) I Committee Genetics Panel in the adoption of the linear dose response model for cancer risk assessment. Significant newly uncovered evidence is presented which supports and extends the findings of Calabrese (2015a), reaffirming the conclusion that the Genetics Panel should be evaluated for scientific misconduct for deliberate misrepresentation of the research record in order to enhance an ideological agenda. This critique documents numerous factual errors along with extensive and deliberate filtering of information in the Beyea letter (2016) that leads to consistently incorrect conclusions and an invalid general perspective.

  15. 20 CFR 655.1132 - When will the Department suspend or invalidate an approved Attestation?

    Science.gov (United States)

    2010-04-01

    ... Requirements Must a Facility Meet to Employ H-1C Nonimmigrant Workers as Registered Nurses? § 655.1132 When... Administrator, where that penalty or remedy assessment has become the final agency action. If an Attestation is... is suspended, invalidated or expired, as long as any H-1C nurse is at the facility, unless the...

  16. A proposed strategy for the validation of ground-water flow and solute transport models

    International Nuclear Information System (INIS)

    Davis, P.A.; Goodrich, M.T.

    1991-01-01

    Ground-water flow and transport models can be thought of as a combination of conceptual and mathematical models and the data that characterize a given system. The judgment of the validity or invalidity of a model depends both on the adequacy of the data and the model structure (i.e., the conceptual and mathematical model). This report proposes a validation strategy for testing both components independently. The strategy is based on the philosophy that a model cannot be proven valid, only invalid or not invalid. In addition, the authors believe that a model should not be judged in absence of its intended purpose. Hence, a flow and transport model may be invalid for one purpose but not invalid for another. 9 refs
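
    The report's "invalid or not invalid" philosophy can be illustrated with a toy comparison of model predictions against observations. The function, data, and tolerance values below are my own construction for illustration, not the report's procedure:

```python
def judge_model(predicted, observed, tolerance):
    """Return 'invalid' if any residual exceeds the purpose-specific
    tolerance, else 'not invalid' -- never 'valid', since validity
    cannot be proven, only falsified."""
    residuals = [abs(p - o) for p, o in zip(predicted, observed)]
    return "invalid" if max(residuals) > tolerance else "not invalid"

# The same model can be invalid for one purpose but not for another:
predicted_heads = [10.2, 9.8, 10.5]   # hypothetical simulated heads (m)
observed_heads = [10.0, 10.1, 10.3]   # hypothetical field measurements (m)
coarse = judge_model(predicted_heads, observed_heads, tolerance=0.5)
strict = judge_model(predicted_heads, observed_heads, tolerance=0.1)
```

    Here the coarse-purpose test yields "not invalid" while the stricter purpose yields "invalid", mirroring the report's point that a judgment is meaningless without the model's intended purpose.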

  17. Validation of Measures of Biosocial Precursors to Borderline Personality Disorder: Childhood Emotional Vulnerability and Environmental Invalidation

    Science.gov (United States)

    Sauer, Shannon E.; Baer, Ruth A.

    2010-01-01

    Linehan's biosocial theory suggests that borderline personality disorder (BPD) results from a transaction of two childhood precursors: emotional vulnerability and an invalidating environment. Until recently, few empirical studies have explored relationships between these theoretical precursors and symptoms of the disorder. Psychometrically sound…

  18. Linear, no threshold response at low doses of ionizing radiation: ideology, prejudice and science

    International Nuclear Information System (INIS)

    Kesavan, P.C.

    2014-01-01

    The linear, no threshold (LNT) response model assumes that there is no threshold dose for the radiation-induced genetic effects (heritable mutations and cancer), and it forms the current basis for radiation protection standards for radiation workers and the general public. The LNT model is, however, based more on ideology than valid radiobiological data. Further, phenomena such as 'radiation hormesis', 'radioadaptive response', 'bystander effects' and 'genomic instability' are now demonstrated to be radioprotective and beneficial. More importantly, the 'differential gene expression' reveals that qualitatively different proteins are induced by low and high doses. This finding negates the LNT model which assumes that qualitatively similar proteins are formed at all doses. Thus, all available scientific data challenge the LNT hypothesis. (author)

  19. [The loss of work fitness and the course of invalidism in patients with limb vessel lesions].

    Science.gov (United States)

    Chernenko, V F; Goncharenko, A G; Shuvalov, A Iu; Chernenko, V V; Tarasov, I V

    2005-01-01

    The growing incidence of peripheral limb vessel disease, with its severe outcomes (trophic ulcers, amputation), appreciably lowers patients' quality of life. This manifests in prolonged loss of work fitness, change of habitual occupation, and establishment of disability. Objective analytical information on this problem helps delineate trends and potential approaches to the prevention of social losses. The present work is based on an analysis of 2,115 statements of medicosocial expert evaluation (MSEE) of invalids suffering from diseases of limb vessels, performed over the past 8 years in the Altai region. The decisions made by the MSEE were based on the results of clinical examination of patients using current diagnostic modalities (ultrasonography, duplex scanning, angiography, etc.). It was established that among persons who had undergone MSEE, over half (64.1%) were under 60 years of age, i.e. of working age. Notably, the overwhelming majority of invalids were men (83%) and workers (84.2%). As for specific vascular pathologies, most patients presented with obliterative arterial diseases (OAD) of the lower limbs, accounting for 76.3%, whereas patients with venous pathology ranked second (15.9%). The highest severity of invalidism (groups I and II) was also recorded in OAD (77.5%), especially in atherosclerosis obliterans (AO), which accounted for 84%. Of note, these diseases showed no tendency toward reduced incidence. Periods of temporary disability (from 3 to 9 months) were also most frequently recorded in OAD of the limbs. In OAD, the temporary or persistent loss of work fitness was caused by critical ischemia and amputations, whereas in venous pathology, namely varicosity and post-thrombophlebitic syndrome, the cause was progressing CVI complicated by trophic ulcers. On the whole, the lack of changes in

  20. Einstein's Equivalence Principle and Invalidity of Thorne's Theory for LIGO

    Directory of Open Access Journals (Sweden)

    Lo C. Y.

    2006-04-01

    The theoretical foundation of LIGO's design is based on the equation of motion derived by Thorne. His formula, motivated by Einstein's theory of measurement, shows that the gravitational wave-induced displacement of a mass with respect to an object is proportional to the distance from the object. On the other hand, based on the observed bending of light and Einstein's equivalence principle, it is concluded that such induced displacement has nothing to do with the distance from another object. It is shown that the derivation of Thorne's formula rests on invalid assumptions that make it inapplicable to LIGO. This is a good counterexample for those who claim that Einstein's equivalence principle is not important or even irrelevant.

  1. An exploration of the impact of invalid MMPI-2 protocols on collateral self-report measure scores.

    Science.gov (United States)

    Forbey, Johnathan D; Lee, Tayla T C

    2011-11-01

    Although a number of studies have examined the impact of invalid MMPI-2 (Butcher et al., 2001) response styles on MMPI-2 scale scores, limited research has specifically explored the effects that such response styles might have on conjointly administered collateral self-report measures. This study explored the potential impact of 2 invalidating response styles detected by the Validity scales of the MMPI-2, overreporting and underreporting, on scores of collateral self-report measures administered conjointly with the MMPI-2. The final group of participants included in analyses was 1,112 college students from a Midwestern university who completed all measures as part of a larger study. Results of t-test analyses suggested that if either over- or underreporting was indicated by the MMPI-2 Validity scales, the scores of most conjointly administered collateral measures were also significantly impacted. Overall, it appeared that test-takers who were identified as either over- or underreporting relied on such a response style across measures. Limitations and suggestions for future study are discussed.

  2. Motivated reflection on attitude-inconsistent information: an exploration of the role of fear of invalidity in self-persuasion.

    Science.gov (United States)

    Clarkson, Joshua J; Valente, Matthew J; Leone, Christopher; Tormala, Zakary L

    2013-12-01

    The mere thought effect is defined in part by the tendency of self-reflective thought to heighten the generation of and reflection on attitude-consistent thoughts. By focusing on individuals' fears of invalidity, we explored the possibility that the mere opportunity for thought sometimes motivates reflection on attitude-inconsistent thoughts. Across three experiments, dispositional and situational fear of invalidity was shown to heighten reflection on attitude-inconsistent thoughts. This heightened reflection, in turn, interacted with individuals' thought confidence to determine whether attitude-inconsistent thoughts were assimilated or refuted and consequently whether individuals' attitudes and behavioral intentions depolarized or polarized following a sufficient opportunity for thought, respectively. These findings emphasize the impact of motivational influences on thought reflection and generation, the importance of thought confidence in the assimilation and refutation of self-generated thought, and the dynamic means by which the mere thought bias can impact self-persuasion.

  3. 22 CFR 51.63 - Passports invalid for travel into or through restricted areas; prohibition on passports valid...

    Science.gov (United States)

    2010-04-01

    ... restricted areas; prohibition on passports valid only for travel to Israel. 51.63 Section 51.63 Foreign... Passports § 51.63 Passports invalid for travel into or through restricted areas; prohibition on passports... use in a country or area which the Secretary has determined is: (1) A country with which the United...

  4. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations).

    Science.gov (United States)

    Beyea, Jan

    2017-04-01

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Avoidance of Affect Mediates the Effect of Invalidating Childhood Environments on Borderline Personality Symptomatology in a Non-Clinical Sample

    Science.gov (United States)

    Sturrock, Bonnie A.; Francis, Andrew; Carr, Steven

    2009-01-01

    The aim of this study was to test the Linehan (1993) proposal regarding associations between invalidating childhood environments, distress tolerance (e.g., avoidance of affect), and borderline personality disorder (BPD) symptoms. The sample consisted of 141 non-clinical participants (51 men, 89 women, one gender unknown), ranging in age from 18 to…

  6. Sulfur Deactivation of NOx Storage Catalysts: A Multiscale Modeling Approach

    Directory of Open Access Journals (Sweden)

    Rankovic N.

    2013-09-01

    Lean NOx Trap (LNT) catalysts, a promising solution for reducing noxious nitrogen oxide emissions from lean-burn and Diesel engines, are technologically limited by the presence of sulfur in the exhaust gas stream. Sulfur stemming from both fuels and lubricating oils is oxidized during the combustion event and mainly exists as SOx (SO2 and SO3) in the exhaust. Sulfur oxides interact strongly with the NOx trapping material of an LNT to form thermodynamically favored sulfate species, consequently blocking NOx sorption sites and altering catalyst operation. Molecular and kinetic modeling represents a valuable tool for predicting system behavior and evaluating catalytic performance. The present paper demonstrates how fundamental ab initio calculations can be used as a valuable source for designing kinetic models developed in the IFP Exhaust library, intended for vehicle simulations. The concrete example chosen to illustrate this approach is SO3 adsorption on the model NOx storage material, BaO. SO3 adsorption was described for various sites (terraces, surface steps and kinks) and the bulk, for a closer description of a real storage material. Additional rate and sensitivity analyses provided a deeper understanding of the poisoning phenomena.
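
    To illustrate how site-resolved ab initio results can feed a kinetic model of the kind described, the sketch below assigns each BaO site type an assumed activation energy (values invented for illustration, not taken from the paper) and derives site-specific Arrhenius rate constants:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(prefactor, ea_j_mol, temp_k):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return prefactor * math.exp(-ea_j_mol / (R * temp_k))

# Hypothetical activation energies for SO3 uptake per site type (J/mol);
# lower-coordination sites are assumed to bind more strongly.
site_ea = {"terrace": 80e3, "step": 60e3, "kink": 45e3}

# Site-specific rate constants at an assumed exhaust temperature of 650 K.
rates = {site: arrhenius(1e13, ea, 650.0) for site, ea in site_ea.items()}
```

    With these assumptions, kinks sulfate fastest and terraces slowest, which is the qualitative behavior a site-resolved poisoning model is meant to capture.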

  7. The unwanted heroes: war invalids in Poland after World War I.

    Science.gov (United States)

    Magowska, Anita

    2014-04-01

    This article focuses on the unique and hitherto unknown history of disabled ex-servicemen and civilians in interwar Poland. In 1914, thousands of Poles were conscripted into the Austrian, Prussian, and Russian armies and forced to fight against each other. When the war ended and Poland regained independence after more than one hundred years of partition, the fledgling government was unable to provide support for the more than three hundred thousand disabled war victims, not to mention the many civilians left injured or orphaned by the war. The vast majority of these victims were ex-servicemen of foreign armies, and were deprived of any war compensation. Neither the Polish government nor the impoverished society could meet the disabled ex-servicemen's medical and material needs; therefore, these men had to take responsibility for themselves and started cooperatives and war-invalids-owned enterprises. A social collaboration between Poland and America, rare in Europe at that time, was initiated by the Polish community in the United States to help blind ex-servicemen in Poland.

  8. Exogenous calcium alleviates low night temperature stress on the photosynthetic apparatus of tomato leaves.

    Directory of Open Access Journals (Sweden)

    Guoxian Zhang

    The effect of exogenous CaCl2 on photosystem I and II (PSI and PSII) activities, cyclic electron flow (CEF), and proton motive force of tomato leaves under low night temperature (LNT) was investigated. LNT stress decreased the net photosynthetic rate (Pn), effective quantum yield of PSII [Y(II)], and photochemical quenching (qP), whereas CaCl2 pretreatment improved Pn, Y(II), and qP under LNT stress. LNT stress significantly increased the non-regulatory quantum yield of energy dissipation [Y(NO)], whereas CaCl2 alleviated this increase. Exogenous Ca2+ enhanced the stimulation of CEF by LNT stress. Inhibition of oxidized PQ pools caused by LNT stress was alleviated by CaCl2 pretreatment. LNT stress reduced zeaxanthin formation and ATPase activity, but CaCl2 pretreatment reversed both of these effects. LNT stress caused excess formation of a proton gradient across the thylakoid membrane, whereas CaCl2 pretreatment reduced it. Thus, our results showed that photoinhibition of LNT-stressed plants can be alleviated by CaCl2 pretreatment. Our findings further revealed that this alleviation was mediated in part by improvements in carbon fixation capacity, PQ pools, linear and cyclic electron transport, xanthophyll cycles, and ATPase activity.

  9. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations)

    International Nuclear Information System (INIS)

    Beyea, Jan

    2017-01-01

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. - Highlights: • Edward J Calabrese has made a contentious challenge to mainstream radiobiological science. • Such challenges should not be neglected, lest they enter the political arena without review. • Key genetic studies from the 1940s, challenged by Calabrese, were

  10. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations)

    Energy Technology Data Exchange (ETDEWEB)

    Beyea, Jan, E-mail: jbeyea@cipi.com

    2017-04-15

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. - Highlights: • Edward J Calabrese has made a contentious challenge to mainstream radiobiological science. • Such challenges should not be neglected, lest they enter the political arena without review. • Key genetic studies from the 1940s, challenged by Calabrese, were

  11. Sulfated lentinan induced mitochondrial dysfunction leads to programmed cell death of tobacco BY-2 cells.

    Science.gov (United States)

    Wang, Jie; Wang, Yaofeng; Shen, Lili; Qian, Yumei; Yang, Jinguang; Wang, Fenglong

    2017-04-01

    Sulfated lentinan (sLNT) is known to act as a resistance inducer by causing programmed cell death (PCD) in tobacco suspension cells. However, the underlying mechanism of this effect is largely unknown. Using the tobacco BY-2 cell model, morphological and biochemical studies revealed that mitochondrial reactive oxygen species (ROS) production and mitochondrial dysfunction contribute to sLNT-induced PCD. Cell viability, HO/PI fluorescence imaging, and TUNEL assays confirmed a typical cell death process caused by sLNT. Acetylsalicylic acid (an ROS scavenger), diphenylene iodonium (an inhibitor of NADPH oxidases) and carbonyl cyanide p-trifluoromethoxyphenyl hydrazone (a protonophore and an uncoupler of mitochondrial oxidative phosphorylation) inhibited sLNT-induced H2O2 generation and cell death, suggesting that ROS generation is linked, at least partly, to mitochondrial dysfunction and caspase-like activation. This conclusion was further confirmed by double-staining cells with the mitochondria-specific marker MitoTracker Red CMXRos and the ROS probe H2DCFDA. Moreover, the sLNT-induced PCD of BY-2 cells required cellular metabolism, as up-regulation of AOX family gene transcripts and induction of SA biosynthesis, TCA cycle, and miETC related genes were observed. It is concluded that mitochondria play an essential role in the signaling pathway of sLNT-induced ROS generation, which possibly provides new insight into the sLNT-mediated antiviral response, including PCD. Copyright © 2016. Published by Elsevier Inc.

  12. The cooperative effect of p53 and Rb in local nanotherapy in a rabbit VX2 model of hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Dong S

    2013-10-01

    Shengli Dong,1 Qibin Tang,2 Miaoyun Long,3 Jian Guan,4 Lu Ye,5 Gaopeng Li6 1Department of General Surgery, The Second Hospital of Shanxi Medical University, Shanxi Medical University, Taiyuan, Shanxi Province, 2Department of Hepatobiliopancreatic Surgery, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 3Department of Thyroid and Vascular Surgery, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 4Department of Radiology, First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 5Infection Department, Guangzhou No 8 Hospital, Guangzhou, Guangdong Province, 6Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, People's Republic of China Background/aim: A local nanotherapy (LNT) combining the therapeutic efficacy of trans-arterial embolization, nanoparticles, and p53 gene therapy has been previously presented. The study presented here aimed to further improve the incomplete tumor eradication and limited survival enhancement and to elucidate the molecular mechanism of the LNT. Methods: In a tumor-targeting manner, recombinant expressing plasmids harboring wild-type p53 and Rb were either co-transferred or transferred separately to rabbit hepatic VX2 tumors in a poly-L-lysine-modified hydroxyapatite nanoparticle nanoplex and Lipiodol® (Guerbet, Villepinte, France) emulsion via the hepatic artery. Subsequent co-expression of p53 and Rb proteins within the treated tumors was investigated by Western blotting and in situ analysis by laser-scanning confocal microscopy. The therapeutic effect was evaluated by the tumor growth velocity, apoptosis and necrosis rates, their sensitivity to Adriamycin® (ADM), mitomycin C, and fluorouracil, the microvessel density of tumor tissue, and the survival time of animals. Eventually, real-time polymerase chain reaction and enhanced chemiluminescence Western blotting

  13. On the invalidity of Bragg's rule in stopping cross sections of molecules for swift Li ions

    International Nuclear Information System (INIS)

    Neuwirth, W.; Pietsch, W.; Richter, K.; Hauser, U.

    1975-01-01

    We discuss the invalidity of Bragg's rule for stopping cross sections of molecules for Li ions in the velocity range 1.5 × 10^8 cm/s to 4.8 × 10^8 cm/s. Here the influence of chemical bonding in a molecule normally leads to strong deviations from Bragg's additivity rule. In our boron compounds the measured cross section of the molecule is smaller than the sum of the stopping cross sections of the single constituents. This can be explained in a first-order description by the transfer of electrons in the bonding. With this description it is possible to determine, from the measured molecular stopping cross sections, the charge transfer in certain compounds. (orig.)

  14. An examination of adaptive cellular protective mechanisms using a multi-stage carcinogenesis model

    International Nuclear Information System (INIS)

    Schollnberger, H.; Stewart, R. D.; Mitchel, R. E. J.; Hofmann, W.

    2004-01-01

    A multi-stage cancer model that describes the putative rate-limiting steps in carcinogenesis was developed and used to investigate the potential impact on lung cancer incidence of the hormesis mechanisms suggested by Feinendegen and Pollycove. In this deterministic cancer model, radiation and endogenous processes damage the DNA of target cells in the lung. Some fraction of the misrepaired or unrepaired DNA damage induces genomic instability and, ultimately, leads to the accumulation of malignant cells. The model accounts for cell birth and death processes. It also includes a rate of malignant transformation and a lag period for tumour formation. Cellular defence mechanisms are incorporated into the model by postulating dose and dose rate dependent radical scavenging. The accuracy of DNA damage repair also depends on dose and dose rate. Sensitivity studies were conducted to identify critical model inputs and to help define the shapes of the cumulative lung cancer incidence curves that may arise when dose and dose rate dependent cellular defence mechanisms are incorporated into a multi-stage cancer model. For lung cancer, both linear no-threshold (LNT) and non-LNT shaped responses can be obtained. The reported studies clearly show that it is critical to know whether, and to what extent, multiply damaged DNA sites are formed by endogenous processes. Model inputs that give rise to U-shaped responses are consistent with an effective cumulative lung cancer incidence threshold that may be as high as 300 mGy (4 mGy per year for 75 years). (Author) 11 refs
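
    A minimal numerical sketch (my own construction, not the authors' model code) shows how a dose-dependent protection term acting on both endogenous and radiogenic damage can turn a linear hazard into a U-shaped response with an effective threshold of a few hundred mGy. All parameter values are illustrative assumptions:

```python
import math

def lung_incidence(dose_mgy, endogenous=1.0, radiogenic=2e-3,
                   scavenging=0.5, d_peak=150.0):
    """Relative incidence ~ total unrepaired DNA damage. The
    radiation-induced protection term rises to `scavenging` at d_peak
    and then decays, so at low doses it reduces damage from BOTH
    endogenous and radiogenic sources (illustrative parameters only)."""
    protection = scavenging * (dose_mgy / d_peak) * math.exp(1.0 - dose_mgy / d_peak)
    return (endogenous + radiogenic * dose_mgy) * (1.0 - protection)
```

    With these assumed parameters, incidence dips below the zero-dose baseline at low doses and only exceeds it again near ~300 mGy, qualitatively matching the U-shaped responses and effective threshold discussed in the abstract.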

  15. Magazines as wilderness information sources: assessing users' general wilderness knowledge and specific leave no trace knowledge

    Science.gov (United States)

    John J. Confer; Andrew J. Mowen; Alan K. Graefe; James D. Absher

    2000-01-01

    The Leave No Trace (LNT) educational program has the potential to provide wilderness users with useful minimum impact information. For LNT to be effective, managers need to understand who is most/least aware of minimum impact practices and how to expose users to LNT messages. This study examined LNT knowledge among various user groups at an Eastern wilderness area and...

  16. Lipoproteins of slow-growing Mycobacteria carry three fatty acids and are N-acylated by apolipoprotein N-acyltransferase BCG_2070c.

    Science.gov (United States)

    Brülle, Juliane K; Tschumi, Andreas; Sander, Peter

    2013-10-05

    Lipoproteins are virulence factors of Mycobacterium tuberculosis. Bacterial lipoproteins are modified by the consecutive action of preprolipoprotein diacylglyceryl transferase (Lgt), prolipoprotein signal peptidase (LspA) and apolipoprotein N-acyltransferase (Lnt), leading to the formation of mature triacylated lipoproteins. Lnt homologues are found in Gram-negative and high-GC Gram-positive bacteria, but not in low-GC Gram-positive bacteria, although N-acylation is observed. In fast-growing Mycobacterium smegmatis, the molecular structure of the lipid modification of lipoproteins was recently resolved as a diacylglyceryl residue carrying ester-bound palmitic acid and ester-bound tuberculostearic acid, with an additional amide-bound palmitic acid. We exploited the vaccine strain Mycobacterium bovis BCG as a model organism to investigate lipoprotein modifications in slow-growing mycobacteria. Using Escherichia coli Lnt as a query in a BLASTp search, we identified BCG_2070c and BCG_2279c as putative lnt genes in M. bovis BCG. The lipoproteins LprF, LpqH, LpqL and LppX were expressed in M. bovis BCG and in a BCG_2070c lnt knock-out mutant, and lipid modifications were analyzed at the molecular level by matrix-assisted laser desorption ionization time-of-flight/time-of-flight analysis. Lipoprotein N-acylation, with both palmitoyl and tuberculostearyl residues, was observed in the wild type but not in the BCG_2070c mutant. Thus lipoproteins are triacylated in slow-growing mycobacteria, and BCG_2070c encodes a functional Lnt in M. bovis BCG. We identified mycobacteria-specific tuberculostearic acid as a further substrate for N-acylation in slow-growing mycobacteria.

  17. Implications of invalidity of Data Retention Directive to telecom operators

    Directory of Open Access Journals (Sweden)

    Darja LONČAR DUŠANOVIĆ

    2014-12-01

    Full Text Available The obligation for telecom operators to retain traffic and location data for crime-fighting purposes has been controversial ever since the adoption of the Data Retention Directive in 2006, because of its inherent negative impact on the fundamental rights to privacy and personal data protection. However, the awaited judgment of the CJEU in April this year, which declared the Directive invalid, has not so far resolved the ambiguity of the issue. Namely, given that half a year later some countries have not (yet) amended their national data retention legislation to comply with the aforementioned CJEU judgment, telecom operators, as addressees of this obligation, are in an uncertain legal situation that could be called a "lose-lose" situation. Also, the emphasis has shifted from the question of proportionality between data privacy and public security to the question of the existence of a valid legal basis for data processing (retaining data and providing them to authorities) in the new legal environment, in which national and EU law are still not in compliance. In this paper the author examines the implications of the CJEU judgment for national EU legislation, telecom operators and data subjects, providing a comparative analysis of the status of national data retention legislation in EU member states. The existence of a valid legal basis for data processing is examined within EU law sources, including the proposed EU General Data Protection Regulation and opinions of the relevant data protection bodies (e.g. the Article 29 Working Party).

  18. Molecular alterations in childhood thyroid cancer after Chernobyl accident and low-dose radiation risk

    International Nuclear Information System (INIS)

    Suzuki, Keiji; Mitsutake, Norisato; Yamashita, Shunichi

    2012-01-01

    The linear no-threshold (LNT) model of radiation carcinogenesis has been used for evaluating the risk from radiation exposure. While epidemiological studies have supported the LNT model at doses above 100 mGy, considerable uncertainty remains in the LNT model at low doses below 100 mGy. Thus, there is an urgent need to clarify the molecular mechanisms underlying radiation carcinogenesis. After the Chernobyl accident in 1986, a significant number of thyroid cancer cases emerged among children living in the contaminated area. As the incidence of sporadic childhood thyroid cancer is very low, it is quite evident that those cancer cases were induced by radiation exposure, caused mainly by the intake of contaminated foods such as milk. Because genetic alterations in childhood thyroid cancers have been studied extensively, they provide a unique chance to understand the molecular mechanisms of radiation carcinogenesis. In the current review, molecular signatures obtained from molecular studies of childhood thyroid cancer after the Chernobyl accident are overviewed, and new roles of radiation exposure in thyroid carcinogenesis are discussed. (author)

  19. Health Physics Society Comments to U.S. Environmental Protection Agency Regulatory Reform Task Force.

    Science.gov (United States)

    Ring, Joseph; Tupin, Edward; Elder, Deirdre; Hiatt, Jerry; Sheetz, Michael; Kirner, Nancy; Little, Craig

    2018-05-01

    The Health Physics Society (HPS) provided comments to the U.S. Environmental Protection Agency (EPA) on options to consider when developing an action plan for President Trump's Executive Order to evaluate regulations for repeal, replacement, or modification. The HPS recommended that the EPA reconsider its adherence to the linear no-threshold (LNT) model for radiation risk calculations and improve several documents by better addressing uncertainties in low-dose, low dose-rate (LDDR) radiation exposure environments. The authors point out that use of the LNT model near background levels cannot provide reliable risk projections, that use of the LNT model and collective-dose calculations in some EPA documents is inconsistent with the recommendations of international organizations, and that some EPA documents have not been subjected to the public-comment rule-making process. To assist in establishing a better scientific basis for the risks of low-dose, low dose-rate radiation exposure, the EPA should continue to support the "Million Worker Study," led by the National Council on Radiation Protection and Measurements.

  20. CRADA Final Report for CRADA Number ORNL00-0605: Advanced Engine/Aftertreatment System R&D

    Energy Technology Data Exchange (ETDEWEB)

    Pihl, Josh A [ORNL; West, Brian H [ORNL; Toops, Todd J [ORNL; Adelman, Brad [Navistar; Derybowski, Edward [Navistar

    2011-10-01

    compound experiments confirmed the previous results regarding hydrocarbon reactivity: 1-pentene was the most efficient LNT reductant, followed by toluene. Injection location had minimal impact on the reactivity of these two compounds. Iso-octane was an ineffective LNT reductant, requiring high doses (resulting in high HC emissions) to achieve reasonable NOx conversions. Diesel fuel reactivity was sensitive to injection location, with the best performance achieved through fuel injection downstream of the DOC. This configuration generated large LNT temperature excursions, which probably improved the efficiency of the NOx storage/reduction process, but also resulted in very high HC emissions. The ORNL team demonstrated an LNT desulfation under 'road load' conditions using throttling, EGR, and in-pipe injection of diesel fuel. Flow reactor characterization of core samples cut from the front and rear of the engine-aged LNT revealed complex spatially dependent degradation mechanisms. The front of the catalyst contained residual sulfates, which impacted NOx storage and conversion efficiencies at high temperatures. The rear of the catalyst showed significant sintering of the washcoat and precious metal particles, resulting in lower NOx conversion efficiencies at low temperatures. Further flow reactor characterization of engine-aged LNT core samples established that low temperature performance was limited by slow release and reduction of stored NOx during regeneration. Carbon monoxide was only effective at regenerating the LNT at temperatures above 200 °C; propene was unreactive even at 250 °C. Low temperature operation also resulted in unselective NOx reduction, resulting in high emissions of both N₂O and NH₃. During the latter years of the CRADA, the focus was shifted from LNTs to other aftertreatment devices. Two years of the CRADA were spent developing detailed ammonia SCR device models with sufficient accuracy and computational efficiency to be used in

  1. Investigations on low temperature thermoluminescence centres in quartz

    International Nuclear Information System (INIS)

    Bernhardt, H.

    1984-01-01

    The present paper helps to clarify the often-investigated and highly complex process of thermoluminescence in quartz. Many traps exist in quartz crystals, which compete with each other for the trapping of charge carriers during X-ray treatment. As a result, a variety of processes takes place after X-irradiation of quartz at liquid nitrogen temperature (LNT), which complicates the phenomenology of low temperature thermoluminescence. This competition in the trapping process leads to the so-called 'sensitization' or 'desensitization' effects of thermoluminescence, which are described in this paper for the first time. The effect consists in the dependence of the LNT thermoluminescence intensity on a pre-irradiation dose applied at room temperature (RT). The influence of this pre-irradiation is understood by assuming saturation of the competing traps, which favours enhanced trapping of charge carriers at the shallow (LNT) traps instead of the preferential trapping at deep traps that occurs when the as-grown crystal is X-rayed at LNT. To arrive at the aforementioned model we take into account not only thermoluminescence but also coloration, IR and VUV absorption measurements. (author)

  2. Biological responses to low dose rate gamma radiation

    International Nuclear Information System (INIS)

    Magae, Junji; Ogata, Hiromitsu

    2003-01-01

    Linear non-threshold (LNT) theory is a basic theory for radioprotection. While LNT does not consider irradiation time or dose rate, biological responses to radiation are complex processes dependent on irradiation time as well as total dose. Moreover, experimental and epidemiological studies that can evaluate LNT at low dose/low dose-rate have not been sufficiently accumulated. Here we analyzed the quantitative relationship among dose, dose rate and irradiation time, using chromosomal breakage and proliferation inhibition of human cells as indicators of biological responses. We also acquired quantitative data at low doses that can evaluate the validity of LNT with statistically sufficient accuracy. Our results demonstrate that biological responses at low dose rate are remarkably affected by exposure time, and that they depend on dose rate rather than total dose in long-term irradiation. We also found that the change of biological responses at low dose was not linearly correlated with dose. These results suggest that a new model is needed which sufficiently incorporates dose-rate effects and correctly fits actual experimental and epidemiological results, in order to evaluate the risk of radiation at low dose/low dose-rate. (author)

  3. Prevalence of Invalid Performance on Baseline Testing for Sport-Related Concussion by Age and Validity Indicator.

    Science.gov (United States)

    Abeare, Christopher A; Messa, Isabelle; Zuccato, Brandon G; Merker, Bradley; Erdodi, Laszlo

    2018-03-12

    Estimated base rates of invalid performance on baseline testing (base rates of failure) for the management of sport-related concussion range from 6.1% to 40.0%, depending on the validity indicator used. The instability of this key measure represents a challenge in the clinical interpretation of test results that could undermine the utility of baseline testing. To determine the prevalence of invalid performance on baseline testing and to assess whether the prevalence varies as a function of age and validity indicator. This retrospective, cross-sectional study included data collected between January 1, 2012, and December 31, 2016, from a clinical referral center in the Midwestern United States. Participants included 7897 consecutively tested, equivalently proportioned male and female athletes aged 10 to 21 years, who completed baseline neurocognitive testing for the purpose of concussion management. Baseline assessment was conducted with the Immediate Postconcussion Assessment and Cognitive Testing (ImPACT), a computerized neurocognitive test designed for assessment of concussion. Base rates of failure on published ImPACT validity indicators were compared within and across age groups. Hypotheses were developed after data collection but prior to analyses. Of the 7897 study participants, 4086 (51.7%) were male, mean (SD) age was 14.71 (1.78) years, 7820 (99.0%) were primarily English speaking, and the mean (SD) educational level was 8.79 (1.68) years. The base rate of failure ranged from 6.4% to 47.6% across individual indicators. Most of the sample (55.7%) failed at least 1 of 4 validity indicators. The base rate of failure varied considerably across age groups (117 of 140 [83.6%] for those aged 10 years to 14 of 48 [29.2%] for those aged 21 years), representing a risk ratio of 2.86 (95% CI, 2.60-3.16; P indicator and the age of the examinee. The strong age association, with 3 of 4 participants aged 10 to 12 years failing validity indicators, suggests that the

  4. Energy Efficient Thermal Management for Natural Gas Engine Aftertreatment via Active Flow Control

    Energy Technology Data Exchange (ETDEWEB)

    David K. Irick; Ke Nguyen; Vitacheslav Naoumov; Doug Ferguson

    2006-04-01

    The project is focused on the development of an energy efficient aftertreatment system capable of reducing NOx and methane by 90% from lean-burn natural gas engines by applying active exhaust flow control. Compared to conventional passive flow-through reactors, the proposed scheme cuts supplemental energy by 50%-70%. The system consists of a Lean NOx Trap (LNT) system and an oxidation catalyst. Through alternating flow control, a major amount of engine exhaust flows through a large portion of the LNT system in the absorption mode, while a small amount of exhaust goes through a small portion of the LNT system in the regeneration or desulfurization mode. By periodically reversing the exhaust gas flow through the oxidation catalyst, a higher temperature profile is maintained in the catalyst bed resulting in greater efficiency of the oxidation catalyst at lower exhaust temperatures. The project involves conceptual design, theoretical analysis, computer simulation, prototype fabrication, and empirical studies. This report details the progress during the first twelve months of the project. The primary activities have been to develop the bench flow reactor system, develop the computer simulation and modeling of the reverse-flow oxidation catalyst, install the engine into the test cell, and begin design of the LNT system.

  5. Differential-difference model for textile engineering

    International Nuclear Information System (INIS)

    Wu Guocheng; Zhao Ling; He Jihuan

    2009-01-01

    Woven fabric is manifestly not a continuum, and therefore Darcy's law, its modifications, and other differential models are theoretically invalid. A differential-difference model for air transport in discontinuous media is introduced using conservation of mass, conservation of energy, and the equation of state in discrete space and continuous time; capillary pressure is obtained by dimensional analysis.
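    The core idea above — quantities defined only at discrete sites, evolving in continuous time under conservation of mass — can be sketched as follows. The chain of pores, the transport coefficient `kappa`, and the Euler time-stepping are illustrative assumptions for this sketch, not the paper's actual derivation.

```python
# Differential-difference sketch: pressure is defined only at discrete pores
# (no continuum assumption), but evolves in continuous time. Flux between
# neighbouring pores is proportional to their pressure difference, so total
# mass is conserved. kappa and dt are illustrative values.

def euler_step(p, kappa=0.5, dt=0.01):
    """One Euler step of dp_i/dt = kappa*(p_{i-1} - 2*p_i + p_{i+1}),
    with no-flux (mass-conserving) boundaries."""
    nxt = list(p)
    for i in range(len(p)):
        left = p[i - 1] if i > 0 else p[i]
        right = p[i + 1] if i < len(p) - 1 else p[i]
        nxt[i] = p[i] + kappa * (left - 2.0 * p[i] + right) * dt
    return nxt

pressure = [1.0, 0.0, 0.0, 0.0]   # air injected at the first pore
for _ in range(5000):
    pressure = euler_step(pressure)
# Total mass (sum of p) is conserved while the profile relaxes to uniformity.
```

    The differential term lives in time and the difference term in space, which is exactly what distinguishes this class of models from a purely differential (continuum) description such as Darcy's law.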

  6. Microkinetic Modeling of Lean NOx Trap Sulfation and Desulfation

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Richard S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2011-08-01

    A microkinetic reaction sub-mechanism designed to account for the sulfation and desulfation of a commercial lean NOx trap (LNT) is presented. This set of reactions is appended to a previously developed mechanism for the normal storage and regeneration processes in an LNT in order to provide a comprehensive modeling tool. The reactions describing the storage, release, and reduction of sulfur oxides are patterned after those involving NOx, but the number of reactions is kept to the minimum necessary to give an adequate simulation of the experimental observations. Values for the kinetic constants are estimated by fitting semi-quantitatively the somewhat limited experimental data, using a transient plug flow reactor code to model the processes occurring in a single monolith channel. Rigorous thermodynamic constraints are imposed in order to ensure that the overall mechanism is consistent both internally and with the known properties of all gas-phase species. The final mechanism is shown to be capable of reproducing the principal aspects of sulfation/desulfation behavior, most notably (a) the essentially complete trapping of SO2 during normal cycling; (b) the preferential sulfation of NOx storage sites over oxygen storage sites and the consequent plug-like and diffuse sulfation profiles; (c) the degradation of NOx storage and reduction (NSR) capability with increasing sulfation level; and (d) the mix of H2S and SO2 evolved during desulfation by temperature-programmed reduction.
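    The plug-like sulfation profile noted in (b) falls out of any single-channel model in which SO2 uptake is fast and the storage sites are finite. The toy sketch below illustrates that behaviour only; the cell count, uptake fraction `k`, capacity, and feed are invented numbers, not the report's fitted kinetic constants or its rigorous plug-flow mechanism.

```python
# Toy plug-flow sulfation: gas sweeps down a channel of n_cells, and each
# cell irreversibly captures a fraction of the incoming SO2 proportional to
# its remaining storage capacity. All constants are illustrative.

def sulfation_profile(n_cells=20, steps=1000, k=0.3, capacity=1.0, feed=0.01):
    stored = [0.0] * n_cells
    for _ in range(steps):
        c = feed                              # SO2 entering the channel
        for i in range(n_cells):
            captured = min(c, k * c * (capacity - stored[i]) / capacity)
            stored[i] += captured
            c -= captured                     # remainder moves downstream
    return stored

profile = sulfation_profile()
# Upstream cells saturate first while downstream cells stay nearly clean:
# a sharp, plug-like sulfation front, as observed on the aged LNT.
```

    A diffuse rather than plug-like profile emerges in such models when uptake is slow relative to transport, which is the qualitative distinction drawn between the NOx storage sites and the oxygen storage sites above.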

  7. Pros and cons of the revolution in radiation protection

    International Nuclear Information System (INIS)

    Latek, Stanislav

    2001-01-01

    In 1959, the International Commission on Radiological Protection (ICRP) chose the LNT (linear no-threshold) model as an assumption to form the basis for regulating radiation protection. During the 1999 UNSCEAR session, held in April in Vienna, the linear no-threshold (LNT) hypothesis was discussed. Among other LNT-related subjects, the Committee discussed the problem of collective dose and dose commitment. These concepts were introduced in the early 1960s as offspring of the linear no-threshold assumption. At the time they reflected a deep concern about the induction of hereditary effects by fallout from nuclear tests. Almost four decades later, collective dose and dose commitment are still widely used, although by now both the concepts and the concern should have faded into oblivion. It seems that the principles and concepts of radiation protection have gone astray and have led to exceedingly prohibitive standards and impractical recommendations. Revision of these principles and concepts is now being proposed by an increasing number of scientists and several organisations.

  8. Test of the linear-no threshold theory of radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1998-01-01

    It is shown that testing the linear-no-threshold theory (L-NT) of radiation carcinogenesis is extremely important and that lung cancer resulting from exposure to radon in homes is the best tool for doing this. A study of lung cancer rates versus radon exposure in US counties, reported in 1995, is reviewed. It shows, with extremely powerful statistics, that lung cancer rates decrease with increasing radon exposure, in sharp contrast to the prediction of L-NT, with a discrepancy of over 20 standard deviations. Very extensive efforts were made to explain an appreciable part of this discrepancy consistently with L-NT, with no success; it was concluded that L-NT fails, grossly exaggerating the cancer risk of low-level radiation. Two updating studies reported in 1996 are also reviewed. New updating studies utilizing more recent lung cancer statistics and considering 450 new potential confounding factors are reported. All updates reinforce the previous conclusion, and the discrepancy with L-NT is increased. (author)

  9. Distribution of shortest path lengths in a class of node duplication network models

    Science.gov (United States)

    Steinbock, Chanania; Biham, Ofer; Katzav, Eytan

    2017-09-01

    We present analytical results for the distribution of shortest path lengths (DSPL) in a network growth model which evolves by node duplication (ND). The model captures essential properties of the structure and growth dynamics of social networks, acquaintance networks, and scientific citation networks, where duplication mechanisms play a major role. Starting from an initial seed network, at each time step a random node, referred to as a mother node, is selected for duplication. Its daughter node is added to the network, forming a link to the mother node, and with probability p to each one of its neighbors. The degree distribution of the resulting network turns out to follow a power-law distribution, thus the ND network is a scale-free network. To calculate the DSPL we derive a master equation for the time evolution of the probability P_t(L = ℓ), ℓ = 1, 2, …, where L is the distance between a pair of nodes and t is the time. Finding an exact analytical solution of the master equation, we obtain a closed-form expression for P_t(L = ℓ). The mean distance ⟨L⟩_t and the diameter Δ_t are found to scale like ln t; namely, the ND network is a small-world network. The variance of the DSPL is also found to scale like ln t. Interestingly, the mean distance and the diameter exhibit properties of a small-world network, rather than the ultrasmall-world behavior observed in other scale-free networks, in which ⟨L⟩_t ~ ln ln t.
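    The ND growth rule described above is easy to simulate; the Monte Carlo sketch below grows such a network and measures the mean shortest path length by breadth-first search. The two-node seed, the value of p, and the network sizes are arbitrary illustrative choices.

```python
import random
from collections import deque

# Node-duplication (ND) growth: each new "daughter" node links to a randomly
# chosen mother node and, with probability p, to each of the mother's
# neighbours. The two-node seed and parameter values are illustrative.

def grow_nd_network(n, p, seed=0):
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}                 # seed: two connected nodes
    for v in range(2, n):
        mother = rng.randrange(v)
        links = {mother} | {u for u in adj[mother] if rng.random() < p}
        adj[v] = set(links)
        for u in links:
            adj[u].add(v)
    return adj

def mean_distance(adj):
    """Average shortest-path length over all ordered pairs, via BFS."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Small-world behaviour: <L> grows only slowly (like ln t) with size.
l_small = mean_distance(grow_nd_network(100, 0.3))
l_large = mean_distance(grow_nd_network(400, 0.3))
```

    Since every daughter always links to its mother, the simulated network stays connected by construction, matching the model's assumption that the DSPL is defined over all node pairs.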

  10. Communicating Leave No Trace ethics and practices: Efficacy of two-day trainer courses

    Science.gov (United States)

    Daniels, M.L.; Marion, J.L.

    2005-01-01

    Heavy recreational visitation within protected natural areas has resulted in many ecological impacts. Many of these impacts may be avoided or minimized through adoption of low-impact hiking and camping practices. Although 'No Trace' messages have been promoted in public lands since the 1970s, few studies have documented the reception and effectiveness of these messages. The U.S. Leave No Trace Center for Outdoor Ethics develops and promotes two-day Trainer courses that teach Leave No Trace (LNT) skills and ethics to outdoor professionals, groups, and interested individuals. This study examined the change in knowledge, ethics, and behavior of LNT Trainer course participants. The respondents were a convenience sample of participants in Trainer courses offered from April through August 2003. Trainer course instructors administered pre-course and post-course questionnaires to their participants, and we contacted participants individually with a followup questionnaire 4 months after completion of their course. Scores for each of the sections increased immediately following the course, and decreased slightly over the 4 months following the course. Overall, more than half of the knowledge and behavior items, and half of the ethics items, showed significant improvement from pre-course measures to the follow-up. Age, reported LNT experience, and backpacking experience affected the participants' pre-course knowledge and behavior scores. Younger, less experienced respondents also showed a greater improvement in behavior following the course. Trainer course participants also shared their LNT skills and ethics with others both formally and informally. In summary, the LNT Trainer course was successful in increasing participants' knowledge, ethics, and behavior, which they then shared with others. Since many low impact skills taught in the LNT curriculum are supported by scientific research, LNT educational programs have the potential to effectively minimize the environmental

  11. Revisiting the Gram-negative lipoprotein paradigm.

    Science.gov (United States)

    LoVullo, Eric D; Wright, Lori F; Isabella, Vincent; Huntley, Jason F; Pavelka, Martin S

    2015-05-01

    The processing of lipoproteins (Lpps) in Gram-negative bacteria is generally considered an essential pathway. Mature lipoproteins in these bacteria are triacylated, with the final fatty acid addition performed by Lnt, an apolipoprotein N-acyltransferase. The mature lipoproteins are then sorted by the Lol system, with most Lpps inserted into the outer membrane (OM). We demonstrate here that the lnt gene is not essential to the Gram-negative pathogen Francisella tularensis subsp. tularensis strain Schu or to the live vaccine strain LVS. An LVS Δlnt mutant has a small-colony phenotype on sucrose medium and increased susceptibility to globomycin and rifampin. We provide data indicating that the OM lipoprotein Tul4A (LpnA) is diacylated but that it, and its paralog Tul4B (LpnB), still sort to the OM in the Δlnt mutant. We present a model in which the Lol sorting pathway of Francisella has a modified ABC transporter system that is capable of recognizing and sorting both triacylated and diacylated lipoproteins, and we show that this modified system is present in many other Gram-negative bacteria. We examined this model using Neisseria gonorrhoeae, which has the same Lol architecture as that of Francisella, and found that the lnt gene is not essential in this organism. This work suggests that Gram-negative bacteria fall into two groups, one in which full lipoprotein processing is essential and one in which the final acylation step is not essential, potentially due to the ability of the Lol sorting pathway in these bacteria to sort immature apolipoproteins to the OM. This paper describes the novel finding that the final stage in lipoprotein processing (normally considered an essential process) is not required by Francisella tularensis or Neisseria gonorrhoeae. The paper provides a potential reason for this and shows that it may be widespread in other Gram-negative bacteria. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  12. Cooling as a method of finding topological dislocations in lattice models

    International Nuclear Information System (INIS)

    Gomberoff, K.

    1989-01-01

    It is well known that the O(3) two-dimensional model has configurations with topological charge Q=1 and action S_min=6.69. Since the exponent characterizing the renormalization-group behavior of this model is 4π, such configurations invalidate the standard scaling behavior of the topological susceptibility. The analogous exponent for the four-dimensional lattice SU(2) gauge model is 10.77. If configurations with Q=1 and S<10.77 existed in this model, they would invalidate the standard scaling behavior of its topological susceptibility. Kremer et al. have calculated the action of different configurations during cooling runs. They report that they do not find any configuration with S<12.7 and Q=1. I show that in the O(3) two-dimensional model cooling runs fail to uncover the well-known configurations with S<8. We conclude that the cooling method is not effective in uncovering the smallest-action configurations in the Q=1 sector.

  13. Photo- and electro-luminescence of rare earth doped ZnO electroluminors at liquid nitrogen temperature

    International Nuclear Information System (INIS)

    Bhushan, S.; Kaza, B.R.; Pandey, A.N.

    1981-01-01

    Photoluminescence (PL) and electroluminescence (EL) spectra of some rare-earth (La, Gd, Er or Dy) doped ZnO electroluminors have been investigated at liquid nitrogen temperature (LNT) and compared with the corresponding results at room temperature (RT). In addition to the three bands observed at RT, one more band on the higher-wavelength side appears in the EL spectra. The spectral shift with exciting intensity at LNT supports the donor-acceptor (DA) model, in which the rare earths form the donor levels. From temperature-dependent studies of PL and EL brightness, the EL phenomenon is found to be more susceptible to traps. (author)

  14. Droplet-model electric dipole moments

    International Nuclear Information System (INIS)

    Myers, W.D.; Swiatecki, W.J.

    1991-01-01

    Denisov's recent criticism of the droplet-model formula for the dipole moment of a deformed nucleus, as derived by Dorso et al., is shown to be invalid. This helps to clarify the relation of theory to the measured dipole moments, as discussed in the review article by Aberg et al. (orig.)

  15. Dose and Dose-Rate Effectiveness Factor (DDREF); Der Dosis- und Dosisleistungs-Effektivitaetsfaktor (DDREF)

    Energy Technology Data Exchange (ETDEWEB)

    Breckow, Joachim [Fachhochschule Giessen-Friedberg, Giessen (Germany). Inst. fuer Medizinische Physik und Strahlenschutz

    2016-08-01

    For practical radiation protection purposes it is assumed that stochastic radiation effects are determined by a linear, no-threshold dose relation (LNT). Radiobiological and radiation-epidemiological studies have indicated that in the low-dose range a dependence on dose rate might exist. This would lead to an overestimation of radiation risks based on the LNT model. The ICRP had recommended a concept that combines all such effects in a single factor, the DDREF (dose and dose-rate effectiveness factor). There is still too little information on the cellular mechanisms of low-dose irradiation, including possible repair and other processes. The Strahlenschutzkommission cannot identify a sufficient scientific justification for the DDREF and recommends an adaptation to the current state of science.

  16. The potential for bias in Cohen's ecological analysis of lung cancer and residential radon

    International Nuclear Information System (INIS)

    Lubin, Jay H.

    2002-01-01

    Cohen's ecological analysis of US lung cancer mortality rates and mean county radon concentration shows decreasing mortality rates with increasing radon concentration (Cohen 1995 Health Phys. 68 157-74). The results prompted his rejection of the linear-no-threshold (LNT) model for radon and lung cancer. Although several authors have demonstrated that risk patterns in ecological analyses provide no inferential value for assessment of risk to individuals, Cohen advances two arguments in a recent response to Darby and Doll (2000 J. Radiol. Prot. 20 221-2) who suggest Cohen's results are and will always be burdened by the ecological fallacy. Cohen asserts that the ecological fallacy does not apply when testing the LNT model, for which average exposure determines average risk, and that the influence of confounding factors is obviated by the use of large numbers of stratification variables. These assertions are erroneous. Average dose determines average risk only for models which are linear in all covariates, in which case ecological analyses are valid. However, lung cancer risk and radon exposure, while linear in the relative risk, are not linearly related to the scale of absolute risk, and thus Cohen's rejection of the LNT model is based on a false premise of linearity. In addition, it is demonstrated that the deleterious association for radon and lung cancer observed in residential and miner studies is consistent with negative trends from ecological studies, of the type described by Cohen. (author)

  17. Upgrading from the Dicon Wiring Management system to IntEC at the Gentilly 2 station

    International Nuclear Information System (INIS)

    Theoret, P.A.

    1995-01-01

    The General Electric DICON Wiring Management system supplied to HQ during the construction of G2 is currently being replaced by the stand-alone version of the IntEC software developed by AECL. The reasons for replacing DICON and choosing IntEC are discussed. The different aspects of the two-year DICON data conversion project are presented, with the problems encountered and the means that were taken to resolve them. IntEC has shown our DICON data to be considerably more deficient than we had thought. This has increased the cost and the duration of the conversion process. However, correcting the errors during the conversion process provides us with much more accurate data. This should be viewed as an investment in configuration management. Many potential causes of future errors and potentially critical-path delays have been removed. We have chosen to document the detailed procedures for the use of IntEC in our plant using a Windows Help File compiler. This has also been found to be extremely useful as a training tool, as well as providing on-line help. The DICON data conversion into IntEC will not be completed until 1996. IntEC is not perfect. However, from what we have seen up to now, we are satisfied with the user-friendliness and efficiency of IntEC and with AECL's diligence in constantly striving to make it a better product. (author)

  18. Brain transcriptional stability upon prion protein-encoding gene invalidation in zygotic or adult mouse

    Directory of Open Access Journals (Sweden)

    Béringue Vincent

    2010-07-01

    Abstract Background: The physiological function of the prion protein remains largely elusive, while its key role in prion infection has been extensively documented. To address this conundrum, we performed a comparative transcriptomic analysis of the brains of wild-type mice and of transgenic mice invalidated at this locus either at the zygotic or at the adult stage. Results: Beside Prnp itself, only subtle transcriptomic differences resulting from the Prnp knockout could be evidenced in the analyzed adult brains, following microarray analysis of 24,109 mouse genes and RT-qPCR assessment of some of the putatively marginally modulated loci. When performed at the adult stage, neuronal Prnp disruption appeared to sequentially induce a response to oxidative stress and a remodeling of the nervous system. However, these events involved only a limited number of genes, whose expression levels were only slightly modified and not always confirmed by RT-qPCR; where confirmation failed, the qPCR data suggested even less pronounced differences. Conclusions: These results suggest that the physiological function of PrP is redundant at the adult stage, or important for only a small subset of the brain cell population under classical breeding conditions. Given its reported regulation during early embryonic development, this lack of response could also imply that PrP plays a more determinant role during mouse embryogenesis, and that potential transient compensatory mechanisms should be sought at the time this locus becomes transcriptionally activated.

  19. Electric moulding of dispersed lipid nanotubes into a nanofluidic device.

    Science.gov (United States)

    Frusawa, Hiroshi; Manabe, Tatsuhiko; Kagiyama, Eri; Hirano, Ken; Kameta, Naohiro; Masuda, Mitsutoshi; Shimizu, Toshimi

    2013-01-01

    Hydrophilic nanotubes formed by lipid molecules have potential applications as platforms for chemical or biological events occurring in an attolitre volume inside a hollow cylinder. Here, we have integrated lipid nanotubes (LNTs) by applying an AC electric field via plug-in electrode needles placed above a substrate. This off-chip assembly method offers on-demand adjustability of the electrode configuration, enabling dispersed LNTs to be electrically moulded in one step into a separate film of parallel LNT arrays. Fluorescence resonance energy transfer as well as digital microscopy visualised the filling of gold nanoparticles, by capillary action, up to the inner capacity of an LNT film, demonstrating the potential of this flexible film as a high-throughput nanofluidic device in which not only are the endo-signalling and products in each LNT multiplied, but the encapsulated objects are also efficiently transported and reacted.

  20. Validity of the linear no-threshold theory of radiation carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1999-01-01

    A great deal is known about the cancer risk of high radiation doses from studies of Japanese A-bomb survivors, patients exposed for medical therapy, occupational exposures, etc. But the vast majority of important applications deal with much lower doses, usually accumulated at much lower dose rates, referred to as 'low-level radiation' (LLR). Conventionally, the cancer risk from LLR has been estimated by use of the linear no-threshold theory (LNT). For example, it is assumed that the cancer risk from 0.01 Sv (1 rem) of dose is 0.01 times the risk from 1 Sv (100 rem). In recent years, these risk estimates have often been reduced by a 'dose and dose-rate reduction factor', taken to be a factor of 2. But otherwise, the LNT is frequently applied to doses as low as one hundred-thousandth of those for which there is direct evidence of cancer induction by radiation. It is the origin of the commonly used expression 'no level of radiation is safe' and of the consequent public fear of LLR. The importance of this use of the LNT cannot be exaggerated, and it enters into many applications in the nuclear industry. The LNT paradigm has also been carried over to chemical carcinogens, leading to severe restrictions on the use of cleaning fluids, organic chemicals, pesticides, etc. If the LNT were abandoned for radiation, it would probably also be abandoned for chemical carcinogens. In view of these facts, it is important to consider the validity of the LNT. That is the purpose of this paper. (author)
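The proportional scaling described above is a one-line calculation. As a minimal sketch (the 0.05 per sievert coefficient below is an assumed nominal value for illustration, not taken from this paper):

```python
def lnt_excess_risk(dose_sv, risk_per_sv=0.05, ddref=2.0):
    """Excess cancer risk under linear no-threshold scaling.

    risk_per_sv: assumed nominal high-dose risk coefficient (per Sv);
    ddref: the 'dose and dose-rate reduction factor' of 2 noted above.
    """
    return dose_sv * risk_per_sv / ddref

# Strict proportionality: 0.01 Sv (1 rem) carries exactly 1% of the
# risk attributed to 1 Sv (100 rem), however low the dose or dose rate.
ratio = lnt_excess_risk(0.01) / lnt_excess_risk(1.0)
print(ratio)  # ≈ 0.01
```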

  1. Evaluation of mechanical properties of Dy123 bulk superconductors by 3-point bending tests

    International Nuclear Information System (INIS)

    Katagiri, K.; Hatakeyama, Y.; Sato, T.; Kasaba, K.; Shoji, Y.; Murakami, A.; Teshima, H.; Hirano, H.

    2006-01-01

    In order to evaluate the mechanical properties, such as Young's modulus and strength, of Dy123 bulk superconductors, with and without 10 wt.% Ag2O, we performed 3-point bending tests at room temperature (RT) and liquid nitrogen temperature (LNT) using specimens cut from the bulks. Both the Young's modulus and the bending strength increased with decreasing temperature. In tests with loading along the c-axis and perpendicular to it, the Young's moduli were almost comparable at both RT and LNT. Although the strengths for both orientations were also comparable at LNT, they differed at RT. For the Ag2O-added bulk specimens loaded along the c-axis, the Young's moduli averaged 127 GPa at RT, almost comparable to those without Ag2O, and 134 GPa at LNT, slightly lower than those without Ag2O. On the other hand, the strengths at both RT and LNT were enhanced by 20% by the Ag addition. The mechanical properties of the Dy123 bulks without Ag2O were also compared with those obtained previously for Y123 bulks: the Young's modulus for loading along the c-axis was slightly lower, while the strength was comparable.

  2. β-Glucan from Lentinus edodes inhibits nitric oxide and tumor necrosis factor-α production and phosphorylation of mitogen-activated protein kinases in lipopolysaccharide-stimulated murine RAW 264.7 macrophages.

    Science.gov (United States)

    Xu, Xiaojuan; Yasuda, Michiko; Nakamura-Tsuruta, Sachiko; Mizuno, Masashi; Ashida, Hitoshi

    2012-01-06

    Lentinan (LNT), a β-glucan from the fruiting bodies of Lentinus edodes, is well known to have immunomodulatory activity. NO and TNF-α are associated with many inflammatory diseases. In this study, we investigated the effects of LNT extracted by sonication (LNT-S) on NO and TNF-α production in LPS-stimulated murine RAW 264.7 macrophages. Treatment with LNT-S resulted not only in striking inhibition of TNF-α and NO production in LPS-activated RAW 264.7 macrophages, but also of the protein expression of inducible NOS (iNOS) and the expression of iNOS and TNF-α mRNA. Surprisingly, LNT-S enhanced LPS-induced NF-κB p65 nuclear translocation and NF-κB luciferase activity, yet severely inhibited the phosphorylation of JNK1/2 and ERK1/2. Neutralizing anti-Dectin-1 and anti-TLR2 antibodies hardly affected the inhibition of NO production. All of these results suggest that the suppression of LPS-induced NO and TNF-α production is at least partially attributable to the inhibition of JNK1/2 and ERK1/2 activation. This work identifies a promising molecule for controlling diseases associated with overproduction of NO and TNF-α.

  3. Equivalent Dynamic Models.

    Science.gov (United States)

    Molenaar, Peter C M

    2017-01-01

    Equivalences of two classes of dynamic models for weakly stationary multivariate time series are discussed: dynamic factor models and autoregressive models. It is shown that exploratory dynamic factor models can be rotated, yielding an infinite set of equivalent solutions for any observed series. It is also shown that dynamic factor models with lagged factor loadings are not equivalent to the currently popular state-space models, and that restriction of attention to the latter type of models may yield invalid results. The known equivalent vector autoregressive model types, standard and structural, are given a new interpretation in which they are conceived of as the extremes of an innovative type of hybrid vector autoregressive models. It is shown that consideration of hybrid models solves many problems, in particular with Granger causality testing.

  4. Gene-Environment Interplay in Twin Models

    OpenAIRE

    Verhulst, Brad; Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases...

  5. Experimental Assessment of NOx Emissions from 73 Euro 6 Diesel Passenger Cars.

    Science.gov (United States)

    Yang, Liuhanzi; Franco, Vicente; Mock, Peter; Kolke, Reinhard; Zhang, Shaojun; Wu, Ye; German, John

    2015-12-15

    Controlling nitrogen oxides (NOx) emissions from diesel passenger cars during real-world driving is one of the major technical challenges facing diesel auto manufacturers. Three main technologies are available for this purpose: exhaust gas recirculation (EGR), lean-burn NOx traps (LNT), and selective catalytic reduction (SCR). Seventy-three Euro 6 diesel passenger cars (8 EGR only, 40 LNT, and 25 SCR) were tested on a chassis dynamometer over both the European type-approval cycle (NEDC, cold engine start) and the more realistic Worldwide harmonized light-duty test cycle (WLTC version 2.0, hot start) between 2012 and 2015. Most vehicles met the legislative limit of 0.08 g/km of NOx over NEDC (average emission factors by technology: EGR-only 0.07 g/km, LNT 0.04 g/km, and SCR 0.05 g/km), but the average emission factors rose dramatically over WLTC (EGR-only 0.17 g/km, LNT 0.21 g/km, and SCR 0.13 g/km). Five LNT-equipped vehicles exhibited very poor performance over the WLTC, emitting 7-15 times the regulated limit. These results illustrate how diesel NOx emissions are not properly controlled under the current, NEDC-based homologation framework. The upcoming real-driving emissions (RDE) regulation, which mandates an additional on-road emissions test for EU type approvals, could be a step in the right direction to address this problem.

  6. Using EEG and stimulus context to probe the modelling of auditory-visual speech.

    Science.gov (United States)

    Paris, Tim; Kim, Jeesun; Davis, Chris

    2016-02-01

    We investigated whether internal models of the relationship between lip movements and corresponding speech sounds [Auditory-Visual (AV) speech] could be updated via experience. AV associations were indexed by early and late event-related potentials (ERPs) and by oscillatory power and phase locking. Different AV experience was produced via a context manipulation. Participants were presented with valid (the conventional pairing) and invalid AV speech items in either a 'reliable' context (80% AVvalid items) or an 'unreliable' context (80% AVinvalid items). The results showed that for the reliable context there was N1 facilitation for AV compared to auditory-only speech. This N1 facilitation was not affected by AV validity. Later ERPs showed a difference in amplitude between valid and invalid AV speech, and there was significant enhancement of power for valid versus invalid AV speech. These response patterns did not change across the context manipulation, suggesting that the internal models of AV speech were not updated by experience. The results also showed that the facilitation of N1 responses did not vary as a function of the salience of visual speech (as previously reported); post-hoc analyses instead indicated that N1 facilitation varied according to the relative time of the acoustic onset, suggesting that for AV events the N1 may be more sensitive to AV timing than to form. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  7. Exercise training attenuates experimental autoimmune encephalomyelitis by peripheral immunomodulation rather than direct neuroprotection.

    Science.gov (United States)

    Einstein, Ofira; Fainstein, Nina; Touloumi, Olga; Lagoudaki, Roza; Hanya, Ester; Grigoriadis, Nikolaos; Katz, Abram; Ben-Hur, Tamir

    2018-01-01

    Conflicting results exist on the effects of exercise training (ET) on Experimental Autoimmune Encephalomyelitis (EAE), and it is not known how exercise impacts disease progression. We examined whether ET ameliorates the development of EAE by modulating the systemic immune system or by exerting direct neuroprotective effects on the CNS. Healthy mice were subjected to 6 weeks of motorized treadmill running. The proteolipid protein (PLP)-induced transfer EAE model in mice was utilized. To assess the effects of ET on systemic autoimmunity, lymph-node (LN) T cells from trained vs. sedentary donor mice were transferred to naïve recipients. To assess direct neuroprotective effects of ET, PLP-reactive LN-T cells were transferred into recipient mice that had been trained prior to EAE transfer or into sedentary mice. EAE severity was assessed in vivo, and the characteristics of encephalitogenic LN-T cells derived from PLP-immunized mice were evaluated in vitro. LN-T cells obtained from trained mice induced attenuated clinical and pathological EAE in recipient mice compared with cells derived from sedentary animals. Training inhibited the activation, proliferation and cytokine gene expression of PLP-reactive T cells in response to CNS-derived autoantigen, but strongly enhanced their proliferation in response to Concanavalin A, a non-specific stimulus. However, there was no difference in EAE severity when autoreactive encephalitogenic T cells were transferred to trained vs. sedentary recipient mice. Thus, ET attenuates EAE by inhibiting immune responses to an auto-antigen, rather than by generally suppressing the immune system, and does not induce a direct neuroprotective effect against EAE. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. In modelling effects of global warming, invalid assumptions lead to unrealistic projections.

    Science.gov (United States)

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2018-02-01

    In their recent Opinion, Pauly and Cheung () provide new projections of future maximum fish weight (W∞). Based on criticism by Lefevre et al. (2017), they changed the scaling exponent for anabolism, dG. Here we find that changing both dG and the scaling exponent for catabolism, b, leads to the projection that fish may become as much as 98% smaller with a 1°C increase in temperature. This unrealistic outcome indicates that the current W∞ is unlikely to be explained by the Gill-Oxygen Limitation Theory (GOLT) and that, therefore, GOLT cannot be used as a mechanistic basis for model projections about fish size in a warmer world. © 2017 John Wiley & Sons Ltd.

  9. Measuring the educational impact of Promoting Environmental Awareness in Kids (PEAK): The development and implementation of a new scale

    Science.gov (United States)

    Jennifer Miller; Lindsey Brown; Eddie Hill; Amy Shellman; Ron Ramsing; Edwin. Gómez

    2012-01-01

    The Leave No Trace Center for Outdoor Ethics (LNT) is a nonprofit educational organization that teaches skills and values for recreating responsibly in the out-of-doors. LNT developed Promoting Environmental Awareness in Kids (PEAK), based on seven ethical principles. The PEAK program provides a pack that contains several interactive activities specifically designed to...

  10. Liquid Nitrogen Temperature Operation of a Switching Power Converter

    Science.gov (United States)

    Ray, Biswajit; Gerber, Scott S.; Patterson, Richard L.; Myers, Ira T.

    1995-01-01

    The performance of a 42/28 V, 175 W, 50 kHz pulse-width-modulated buck dc/dc switching power converter at liquid nitrogen temperature (LNT) is compared with room-temperature operation. Both the power circuit and the control circuit of the converter, designed with commercially available components, were operated at LNT, resulting in a slight improvement in converter efficiency. The improvement in power MOSFET operation was offset by the deteriorating performance of the output diode rectifier at LNT. Converter performance could be further improved at low temperature by using only power MOSFETs as switches. The use of a resonant topology would further improve the circuit performance by reducing switching noise and loss.

  11. Invalidity of the Fermi liquid theory and magnetic phase transition in quasi-1D dopant-induced armchair-edged graphene nanoribbons

    Science.gov (United States)

    Hoi, Bui Dinh; Davoudiniya, Masoumeh; Yarmohammadi, Mohsen

    2018-04-01

    Based on theoretical tight-binding calculations considering nearest neighbors and the Green's function technique, we show that a magnetic phase transition in both semiconducting and metallic armchair graphene nanoribbons, with widths ranging from 9.83 Å to 69.3 Å, would be observed in the presence of electrons injected by doping. This transition is explained by the temperature-dependent static charge susceptibility, calculated through the correlation function of charge-density operators. This work shows that the charge concentration of dopants in such systems plays a crucial role in determining the magnetic phase. A variety of multicritical points, such as transition temperatures and maximum susceptibilities, are compared for the undoped and doped cases. Our findings show that in doped structures there exist two different transition temperatures and maximum susceptibilities depending on the ribbon width. Another remarkable point is the invalidity (validity) of the Fermi liquid theory in nanoribbon-based systems at weak (strong) concentrations of dopants. These results on the magnetic phase transition open new potential for magnetic graphene nanoribbon-based devices.

  12. Non-Fermi-liquid theory of a compactified Anderson single-impurity model

    International Nuclear Information System (INIS)

    Zhang, G.; Hewson, A.C.

    1996-01-01

    We consider a version of the symmetric Anderson impurity model (compactified) which has a non-Fermi-liquid weak-coupling regime. We find that in the Majorana fermion representation the perturbation theory can be conveniently developed in terms of Pfaffian determinants, and we use this formalism to calculate the impurity free energy, self-energies, and vertex functions. We derive expressions for the impurity and the local conduction-electron charge and spin dynamical susceptibilities in terms of the impurity self-energies and vertex functions. In second-order perturbation theory, a linear temperature dependence of the electrical resistivity is obtained, and the leading corrections to the impurity specific heat are found to behave as T ln T. The impurity static susceptibilities have terms in ln T to zero, first, and second order, and corrections of ln² T to second order as well. The conduction-electron static susceptibilities, and the singlet superconducting paired static susceptibility at the impurity site, have second-order corrections in ln T, which indicate that a singlet conduction-electron pairing resonance forms at the Fermi level (the chemical potential). When the perturbation theory is extended to third order, logarithmic divergences are found in the only vertex function, Γ_{0,1,2,3}(0,0,0,0), which is nonvanishing in the zero-frequency limit. We use the multiplicative renormalization-group (RG) method to sum all the leading-order logarithmic contributions. This gives a weak-coupling low-temperature energy scale T_c = Δ exp[-(1/9)(πΔ/U)²], a combination of the two independent coupling parameters. The RG scaling equation is derived and shows that the dimensionless coupling constant Ū = U/πΔ increases as the high-energy scale Δ is reduced, so our perturbational results can be justified in the regime T ≳ T_c

  13. Scientific foundation of regulating ionizing radiation: application of metrics for evaluation of regulatory science information.

    Science.gov (United States)

    Moghissi, A Alan; Gerraa, Vikrham Kumar; McBride, Dennis K; Swetnam, Michael

    2014-11-01

    This paper starts by describing the historical evolution of the assessment of biological effects of ionizing radiation, leading to the linear no-threshold (LNT) system currently used to regulate exposure to ionizing radiation. The paper briefly describes the concept of Best Available Science (BAS) and the Metrics for Evaluation of Scientific Claims (MESC) derived for BAS. It identifies three phases of regulatory science: the initial phase, when regulators had to develop regulations without the needed scientific information; the exploratory phase, when relevant tools were developed; and the standard operating phase, when those tools were applied to regulations. Subsequently, an attempt is made to apply the BAS/MESC system to the various stages of LNT. The paper then compares the exposure limits imposed by regulatory agencies with each other and with naturally occurring radiation levels in several cities. Controversies about LNT are addressed, including the judgments of the U.S. National Academies and their French counterpart. The paper concludes that, based on the BAS/MESC system, there is no disagreement between the two academies on the scientific foundation of LNT; instead, the disagreement is based on their judgment or speculation.

  14. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on

  15. Regulatory Initiatives for Control and Release of Technologically Enhanced Naturally-Occurring Radioactive Material

    International Nuclear Information System (INIS)

    Egidi, P.V.

    1999-01-01

    Current drafts of proposed standards and suggested State regulations for control and release of technologically-enhanced naturally-occurring radioactive material (TENORM), and standards for release of volumetrically-contaminated material in the US, are reviewed. These are compared to the recommendations of the International Atomic Energy Agency (IAEA) Safety Series and the European Commission (EC) proposals. Past regulatory efforts with respect to TENORM in the US dealt primarily with oil-field related wastes. Currently, nine states (AK, GA, LA, MS, NM, OH, OR, SC, TX) have specific regulations pertaining to TENORM, mostly based on uranium mill tailings cleanup criteria. The new US proposals are dose- or risk-based, as are the IAEA and EC recommendations, and are grounded in the linear no-threshold hypothesis (LNT). TENORM wastes involve extremely large volumes, particularly scrap metal and mine wastes, and the costs to control and dispose of these wastes can be considerable. The current debate over the validity of LNT at low doses and low dose rates is particularly germane to this discussion. Most standards-setting organizations and regulatory agencies base their recommendations on the LNT. The US Environmental Protection Agency has released a draft Federal Guidance Report that recommends calculating health risks from low-level exposure to radionuclides based on the LNT. However, some scientific and professional organizations are openly questioning the validity of LNT and its basis for regulations, practices, and costs to society in general. It is not clear at this time how a non-linear regulatory scheme would be implemented.

  16. Regulatory Initiatives for Control and Release of Technologically Enhanced Naturally-Occurring Radioactive Materials

    Energy Technology Data Exchange (ETDEWEB)

    Egidi, P.V.

    1999-03-02

    Current drafts of proposed standards and suggested State regulations for control and release of technologically-enhanced naturally-occurring radioactive material (TENORM), and standards for release of volumetrically-contaminated material in the US, are reviewed. These are compared to the recommendations of the International Atomic Energy Agency (IAEA) Safety Series and the European Commission (EC) proposals. Past regulatory efforts with respect to TENORM in the US dealt primarily with oil-field related wastes. Currently, nine states (AK, GA, LA, MS, NM, OH, OR, SC, TX) have specific regulations pertaining to TENORM, mostly based on uranium mill tailings cleanup criteria. The new US proposals are dose- or risk-based, as are the IAEA and EC recommendations, and are grounded in the linear no-threshold hypothesis (LNT). TENORM wastes involve extremely large volumes, particularly scrap metal and mine wastes, and the costs to control and dispose of these wastes can be considerable. The current debate over the validity of LNT at low doses and low dose rates is particularly germane to this discussion. Most standards-setting organizations and regulatory agencies base their recommendations on the LNT. The US Environmental Protection Agency has released a draft Federal Guidance Report that recommends calculating health risks from low-level exposure to radionuclides based on the LNT. However, some scientific and professional organizations are openly questioning the validity of LNT and its basis for regulations, practices, and costs to society in general. It is not clear at this time how a non-linear regulatory scheme would be implemented.

  17. Lipoproteins of slow-growing Mycobacteria carry three fatty acids and are N-acylated by Apolipoprotein N-Acyltransferase BCG_2070c.

    OpenAIRE

    Brülle Juliane K; Tschumi Andreas; Sander Peter

    2013-01-01

    BACKGROUND: Lipoproteins are virulence factors of Mycobacterium tuberculosis. Bacterial lipoproteins are modified by the consecutive action of preprolipoprotein diacylglyceryl transferase (Lgt), prolipoprotein signal peptidase (LspA) and apolipoprotein N-acyltransferase (Lnt), leading to the formation of mature triacylated lipoproteins. Lnt homologues are found in Gram-negative and high GC-rich Gram-positive bacteria, but not in low GC-rich Gram-positive bacteria, although N-acylation is observed. In ...

  18. Social psychological approach to the problem of threshold

    International Nuclear Information System (INIS)

    Nakayachi, Kazuya

    1999-01-01

    This paper discusses the threshold of carcinogen risk from the viewpoint of social psychology. First, results of a survey are reported which suggest that renouncing the Linear No-Threshold (LNT) hypothesis would have no influence on the public acceptance (PA) of nuclear power plants. Second, the relationship between the adoption of the LNT hypothesis and the standardization of management of various risks is discussed. (author)

  19. Statistical challenges in modelling the health consequences of social mobility: the need for diagonal reference models.

    Science.gov (United States)

    van der Waal, Jeroen; Daenekindt, Stijn; de Koster, Willem

    2017-12-01

    Various studies on the health consequences of socio-economic position address social mobility. They aim to uncover whether health outcomes are affected by (1) social mobility, in addition to (2) social origin and (3) social destination. Conventional methods do not, however, estimate these three effects separately, which may produce invalid conclusions. We highlight that diagonal reference models (DRMs) overcome this problem, which we illustrate by focusing on overweight/obesity (OWOB). Using conventional methods (logistic-regression analyses with dummy variables) and DRMs, we examine the effects of intergenerational educational mobility on OWOB (BMI ≥ 25 kg/m²) using survey data representative of the Dutch population aged 18-45 (1569 males, 1771 females). Conventional methods suggest that mobility effects on OWOB are present; analyses with DRMs, however, indicate that no such effects exist. Conventional analyses of the health consequences of social mobility may thus produce invalid results. We therefore recommend the use of DRMs. DRMs also validly estimate the health consequences of other types of social mobility (e.g. intra- and intergenerational occupational and income mobility) and of status inconsistency (e.g. in educational or occupational attainment between partners).
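For readers unfamiliar with DRMs, a minimal sketch of their mean structure follows; the prevalence values and salience weight below are invented for illustration and are not estimates from this study:

```python
# Minimal sketch of a diagonal reference model (DRM) mean structure.
# All numbers are hypothetical, not estimates from the study above.
diag_means = {"low": 0.55, "mid": 0.45, "high": 0.35}  # P(OWOB) in immobile groups

def drm_expected(origin, destination, p_origin=0.6):
    """Expected outcome for a mobile person as a weighted mixture of the
    diagonal (immobile) reference groups; p_origin is the salience weight."""
    return p_origin * diag_means[origin] + (1 - p_origin) * diag_means[destination]

# Upward mobility (low -> high): the expectation is pulled toward both
# reference groups. Origin and destination effects are identified via
# p_origin, and a genuine mobility effect would enter as an extra term
# rather than being confounded with origin/destination, as happens in
# dummy-variable regression.
print(round(drm_expected("low", "high"), 2))  # 0.47
```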

  20. Time-dependent local-to-normal mode transition in triatomic molecules

    Science.gov (United States)

    Cruz, Hans; Bermúdez-Montaña, Marisol; Lemus, Renato

    2018-01-01

    The time evolution of the vibrational states of two interacting harmonic oscillators in the local mode scheme is presented. A local-to-normal mode transition (LNT) is identified and studied from a temporal perspective through the time-dependent frequencies of the oscillators. The LNT is established as a polyad-breaking phenomenon from the local standpoint for the stretching degrees of freedom in a triatomic molecule. The study is carried out in the algebraic representation of bosonic operators. The dynamics of the states are determined via solutions of the corresponding nonlinear Ermakov equation, and a local time-dependent polyad is obtained as a tool to identify the LNT. Applications of this formalism to the H2O, CO2, O3 and NO2 molecules in the adiabatic, sudden and linear regimes are considered.

  1. The brittle basis of linearity

    International Nuclear Information System (INIS)

    Roth, E.

    1997-01-01

    The LNT theory of cancer generation by ionizing radiation is commonly justified by three arguments: the stochastic character of irradiation hits to cells, the monoclonality of cancer generation, and the error-proneness of DNA repair. It is shown that this conclusion is logically inadmissible. Equally, the rescue attempts made by some LNT supporters are not successful. Excluding threshold and hormesis in this way contradicts the laws of logic. (author)

  2. A Capacity-Restraint Transit Assignment Model When a Predetermination Method Indicates the Invalidity of Time Independence

    Directory of Open Access Journals (Sweden)

    Haoyang Ding

    2015-01-01

    The statistical independence of the travel times of every two adjacent bus links plays a crucial role in deciding whether many mathematical models can be used to analyze urban transit networks. Traditional research generally ignores this time independence, even though it underlies the models in question: the assumption is usually made that time independence holds for every two adjacent links. This assumption is, however, actually groundless and can lead to problematic conclusions from the corresponding models. Many transit assignment models, such as multinomial probit-based models, lose their validity when time independence does not hold. In this paper, a simple method to predetermine time independence is proposed. Based on this predetermination method, a modified capacity-restraint transit assignment method aimed at engineering practice is put forward and tested on a small contrived network and in a case study in Nanjing, China. It is found that the slope of the regression equation between the mean and standard deviation of the normal distribution also serves as an indicator of time independence. Moreover, the modified assignment method performs better than the traditional one, yielding more reasonable results while remaining simple.
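
    The independence question itself can be illustrated with a quick numerical check that is simpler than, and not identical to, the paper's regression-based predetermination method: for independent link times, the variance of the sum equals the sum of the variances, so a large relative gap flags dependence. All travel-time distributions below are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# Hypothetical travel times (minutes) on two adjacent bus links.
t1 = rng.normal(10.0, 2.0, n)
shared = rng.normal(0.0, 1.5, n)           # common congestion shock on both links
t2_indep = rng.normal(8.0, 1.5, n)         # independent second link
t1_dep = t1 + shared                       # dependent pair: both feel the shock
t2_dep = rng.normal(8.0, 1.0, n) + shared

def variance_gap(a, b):
    """Relative gap between Var(a+b) and Var(a)+Var(b); ~0 under independence."""
    return (np.var(a + b) - (np.var(a) + np.var(b))) / (np.var(a) + np.var(b))

gap_indep = variance_gap(t1, t2_indep)     # hovers near zero
gap_dep = variance_gap(t1_dep, t2_dep)     # large and positive
```

    With a shared congestion shock on both links the gap is large and positive, so summing per-link normal distributions would understate the variance of the through trip; for truly independent links the gap stays near zero.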

  3. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique.

    Science.gov (United States)

    Bogen, Kenneth T

    2016-03-01

    To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints should be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically-induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear MOA carcinogens are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
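
    The mathematical point can be illustrated numerically: a sum of two quasi-threshold (sigmoidal) components can track a straight line across an observed dose range while falling far below that line at lower doses, so a linear fit extrapolated downward overstates risk by a factor that grows as the dose shrinks. The component shapes, weights, and dose range below are invented for the illustration, not taken from the paper:

```python
import numpy as np

def hill(d, K, h):
    """Quasi-threshold (sigmoidal) dose-response component."""
    return d**h / (K**h + d**h)

def mixture(d):
    # Hypothetical two-component quasi-threshold mixture (weights/EC50s assumed).
    return 0.6 * hill(d, K=3.0, h=3) + 0.4 * hill(d, K=8.0, h=3)

obs_doses = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # the "observed" dose range
obs_resp = mixture(obs_doses)

# LNT-style straight-line fit over the observed range only.
slope, intercept = np.polyfit(obs_doses, obs_resp, 1)
line = lambda d: slope * d + intercept

max_resid = np.max(np.abs(line(obs_doses) - obs_resp))   # fit quality in range
overest_05 = line(0.5) / mixture(0.5)     # relative overestimate at low dose
overest_025 = line(0.25) / mixture(0.25)  # grows further as dose shrinks
```

    Here the straight line matches the observed points to within about 0.09 response units, yet at a dose of 0.5 it overestimates the mixture's true response by more than an order of magnitude, and by still more at 0.25.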

  4. Theoretical model of granular compaction

    Energy Technology Data Exchange (ETDEWEB)

    Ben-Naim, E. [Los Alamos National Lab., NM (United States); Knight, J.B. [Princeton Univ., NJ (United States). Dept. of Physics; Nowak, E.R. [Univ. of Illinois, Urbana, IL (United States). Dept. of Physics]|[Univ. of Chicago, IL (United States). James Franck Inst.; Jaeger, H.M.; Nagel, S.R. [Univ. of Chicago, IL (United States). James Franck Inst.

    1997-11-01

    Experimental studies show that the density of a vibrated granular material evolves from a low density initial state into a higher density final steady state. The relaxation towards the final density follows an inverse logarithmic law. As the system approaches its final state, a growing number of beads have to be rearranged to enable a local density increase. A free volume argument shows that this number grows as N = ρ/(1 − ρ). The time scale associated with such events increases exponentially, as e^N (i.e., the event rate falls as e^(−N)), and as a result a logarithmically slow approach to the final state is found: ρ_∞ − ρ(t) ≈ 1/ln t.
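
    The free volume argument is easy to check numerically: integrating a density whose growth rate is suppressed as e^(−N), with N = ρ/(1 − ρ), reproduces the inverse-logarithmic creep. The rate prefactor, density step, and initial density below are arbitrary choices for the sketch:

```python
import numpy as np

# Toy integration: each small density gain requires a cooperative rearrangement
# of N = rho/(1 - rho) beads, an event whose rate is suppressed as exp(-N).
r0 = 1e-3                               # assumed rate prefactor
rho, t = 0.5, 0.0
ts, rhos = [], []
for _ in range(450):
    n_beads = rho / (1.0 - rho)
    rate = r0 * np.exp(-n_beads)        # rate of the cooperative event
    d_rho = 1e-3
    t += d_rho / rate                   # waiting time for that density gain
    rho += d_rho
    ts.append(t)
    rhos.append(rho)

ts, rhos = np.array(ts), np.array(rhos)
late = rhos > 0.8                       # keep only the slow, late-time regime

def r_squared(x, y):
    """R^2 of an ordinary least-squares line y ~ a + b*x."""
    b, a = np.polyfit(x, y, 1)
    resid = y - (a + b * x)
    return 1.0 - np.sum(resid**2) / np.sum((y - np.mean(y))**2)

r2_invlog = r_squared(1.0 / np.log(ts[late]), rhos[late])  # rho ~ a - b/ln t
r2_invt = r_squared(1.0 / ts[late], rhos[late])            # algebraic rival
```

    In the late-time window the density is described far better by a line in 1/ln t than by a line in 1/t, in line with the inverse logarithmic law quoted above.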

  5. Atmospheric radionuclide transport model with radon postprocessor and SBG module. Model description version 2.8.0; ARTM. Atmosphaerisches Radionuklid-Transport-Modell mit Radon Postprozessor und SBG-Modul. Modellbeschreibung zu Version 2.8.0

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Cornelia; Sogalla, Martin; Thielen, Harald; Martens, Reinhard

    2015-04-20

    The study on the atmospheric radionuclide transport model with radon postprocessor and SBG module (model description version 2.8.0) covers the following issues: determination of emissions, radioactive decay, atmospheric dispersion calculation for radioactive gases, atmospheric dispersion calculation for radioactive dusts, determination of the gamma cloud radiation (gamma submersion), terrain roughness, effective source height, calculation area and model points, geographic reference systems and coordinate transformations, meteorological data, use of invalid meteorological data sets, consideration of statistical uncertainties, consideration of housings, consideration of bumpiness, consideration of terrain roughness, use of frequency distributions of the hourly dispersion situation, consideration of the vegetation period (summer), the radon post processor radon.exe, the SBG module, modeling of wind fields, shading settings.

  6. Item bias detection in the Hospital Anxiety and Depression Scale using structural equation modeling: comparison with other item bias detection methods

    NARCIS (Netherlands)

    Verdam, M.G.E.; Oort, F.J.; Sprangers, M.A.G.

    Purpose Comparison of patient-reported outcomes may be invalidated by the occurrence of item bias, also known as differential item functioning. We show two ways of using structural equation modeling (SEM) to detect item bias: (1) multigroup SEM, which enables the detection of both uniform and

  7. Comparisons of patch-use models for wintering American tree sparrows

    Science.gov (United States)

    Tome, M.W.

    1990-01-01

    Optimal foraging theory has stimulated numerous theoretical and empirical studies of foraging behavior for >20 years. These models provide a valuable tool for studying the foraging behavior of an organism. As with any other tool, the models are most effective when properly used. For example, to obtain a robust test of a foraging model, Stephens and Krebs (1986) recommend experimental designs in which four questions are answered in the affirmative. First, do the foragers play the same "game" as the model? Second, are the assumptions of the model met? Third, does the test rule out alternative possibilities? Finally, are the appropriate variables measured? Negative answers to any of these questions could invalidate the model and lead to confusion over the usefulness of foraging theory in conducting ecological studies. Gaines (1989) attempted to determine whether American Tree Sparrows (Spizella arborea) foraged by a time (Krebs 1973) or number expectation rule (Gibb 1962), or in a manner consistent with the predictions of Charnov's (1976) marginal value theorem (MVT). Gaines (1989: 118) noted appropriately that field tests of foraging models frequently involve uncontrollable circumstances; thus, it is often difficult to meet the assumptions of the models. Gaines also states (1989: 118) that "violations of the assumptions are also informative but do not constitute robust tests of predicted hypotheses," and that "the problem can be avoided by experimental analyses which concurrently test mutually exclusive hypotheses so that alternative predictions will be eliminated if falsified." There is a problem with this approach because, when major assumptions of models are not satisfied, it is not justifiable to compare a predator's foraging behavior with the model's predictions. I submit that failing to follow the advice offered by Stephens and Krebs (1986) can invalidate tests of foraging models.

  8. “Protective Bystander Effects Simulated with the State-Vector Model”—HeLa x Skin Exposure to 137Cs Not Protective Bystander Response But Mammogram and Diagnostic X-Rays Are

    Science.gov (United States)

    Leonard, Bobby E.

    2008-01-01

    The recent Dose Response journal article "Protective Bystander Effects Simulated with the State-Vector Model" (Schollnberger and Eckl 2007) identified the suppressive (below the naturally occurring, zero-primer-dose, spontaneous level) dose response for HeLa x skin exposure to 137Cs gamma rays (Redpath et al 2001) as a protective Bystander Effect (BE) behavior. I had previously analyzed the Redpath et al (2001) data with a Microdose Model and conclusively showed that the suppressive response was from Adaptive Response (AR) radio-protection (Leonard 2005, 2007a). The significance of my microdose analysis has been that low LET radiation induced single (i.e. only one) charged particle traversals through a cell can initiate a Poisson distributed activation of AR radio-protection. The purpose of this correspondence is to clarify the distinctions relative to the BE and the AR behaviors for the Redpath group's 137Cs data, to show conversely that the Redpath group data for mammography (Ko et al 2004) and diagnostic (Redpath et al 2003) X-rays do conclusively reflect protective bystander behavior, and also to emphasize the need for radiobiologists to apply microdosimetry in planning and analyzing their experiments for BE and AR. Whether we are adamantly pro-LNT, adamantly anti-LNT or, like most of us, just simple scientists searching for the truth in radiobiology, it is important that we accurately identify our results, especially when related to the LNT hypothesis controversy. PMID:18846260

  9. Epidemiology Without Biology: False Paradigms, Unfounded Assumptions, and Specious Statistics in Radiation Science (with Commentaries by Inge Schmitz-Feuerhake and Christopher Busby and a Reply by the Authors)

    OpenAIRE

    Sacks, Bill; Meyerson, Gregory; Siegel, Jeffry A.

    2016-01-01

    Radiation science is dominated by a paradigm based on an assumption without empirical foundation. Known as the linear no-threshold (LNT) hypothesis, it holds that all ionizing radiation is harmful no matter how low the dose or dose rate. Epidemiological studies that claim to confirm LNT either neglect experimental and/or observational discoveries at the cellular, tissue, and organismal levels, or mention them only to distort or dismiss them. The appearance of validity in these studies rests o...

  10. A dynamic random effects multinomial logit model of household car ownership

    DEFF Research Database (Denmark)

    Bue Bjørner, Thomas; Leth-Petersen, Søren

    2007-01-01

    Using a large household panel we estimate demand for car ownership by means of a dynamic multinomial model with correlated random effects. Results suggest that the persistence in car ownership observed in the data should be attributed to both true state dependence and to unobserved heterogeneity...... (random effects). It also appears that random effects related to single and multiple car ownership are correlated, suggesting that the IIA assumption employed in simple multinomial models of car ownership is invalid. Relatively small elasticities with respect to income and car costs are estimated...

  11. Stability of the thermodynamic equilibrium - A test of the validity of dynamic models as applied to gyroviscous perpendicular magnetohydrodynamics

    Science.gov (United States)

    Faghihi, Mustafa; Scheffel, Jan; Spies, Guenther O.

    1988-05-01

    Stability of the thermodynamic equilibrium is put forward as a simple test of the validity of dynamic equations, and is applied to perpendicular gyroviscous magnetohydrodynamics (i.e., perpendicular magnetohydrodynamics with gyroviscosity added). This model turns out to be invalid because it predicts exponentially growing Alfven waves in a spatially homogeneous static equilibrium with scalar pressure.

  12. Stability of the thermodynamic equilibrium: A test of the validity of dynamic models as applied to gyroviscous perpendicular magnetohydrodynamics

    International Nuclear Information System (INIS)

    Faghihi, M.; Scheffel, J.; Spies, G.O.

    1988-01-01

    Stability of the thermodynamic equilibrium is put forward as a simple test of the validity of dynamic equations, and is applied to perpendicular gyroviscous magnetohydrodynamics (i.e., perpendicular magnetohydrodynamics with gyroviscosity added). This model turns out to be invalid because it predicts exponentially growing Alfven waves in a spatially homogeneous static equilibrium with scalar pressure

  13. Advanced Engine/Aftertreatment System R&D

    Energy Technology Data Exchange (ETDEWEB)

    Pihl, J.; West, B.; Toops, T.; Adelman, B. (Navistar, Inc.); Derybowski, E. (Navistar, Inc.)

    2011-09-30

    Navistar and ORNL established this CRADA to develop diesel engine aftertreatment configurations and control strategies that could meet emissions regulations while maintaining or improving vehicle efficiency. The early years of the project focused on reducing the fuel penalty associated with lean NOx trap (LNT, also known as NOx adsorber catalyst) regeneration and desulfation. While Navistar pursued engine-based (in-cylinder) approaches to LNT regeneration, complementary experiments at ORNL focused on in-exhaust fuel injection. ORNL developed a PC-based controller for transient electronic control of EGR valve position, intake throttle position, and actuation of fuel injectors in the exhaust system of a Navistar engine installed at Oak Ridge. Aftertreatment systems consisting of different diesel oxidation catalysts (DOCs) in conjunction with a diesel particle filter and LNT were evaluated under quasi-steady-state conditions. Hydrocarbon (HC) species were measured at multiple locations in the exhaust system with gas chromatography-mass spectrometry (GC-MS) and Fourier transform infrared (FTIR) spectroscopy.

  14. Topics on study of low dose-effect relationship

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Takeshi [Toho Univ., School of Medicine, Tokyo (Japan); Ohyama, Harumi

    1999-09-01

    It is not exceptional but rather commonly observed that a dose-effect relationship in a biosystem is not linear. Sometimes, the low dose-effect relationship is entirely contrary to what would be expected from the high-dose effect. This is called a 'hormesis' phenomenon. A high-dose irradiation certainly inflicts an injury on a biosystem. According to the Linear Non-Threshold hypothesis (LNT), an irradiation inflicts some injury on a biosystem no matter how low the dose may be. Contrary to that expectation, a low-dose irradiation stimulates the immune system and promotes cell proliferation. This is called 'radiation hormesis'. Studies of radiation hormesis are made from four points of view: (1) radiation adaptive response, (2) revitalization caused by a low-dose stimulation, (3) low-dose responses unexpected from the LNT hypothesis, (4) negation of the LNT hypothesis. Various empirical proofs of radiation hormesis are introduced in the report. (M. Suetake)

  15. Topics on study of low dose-effect relationship

    International Nuclear Information System (INIS)

    Yamada, Takeshi; Ohyama, Harumi

    1999-01-01

    It is not exceptional but rather commonly observed that a dose-effect relationship in a biosystem is not linear. Sometimes, the low dose-effect relationship is entirely contrary to what would be expected from the high-dose effect. This is called a 'hormesis' phenomenon. A high-dose irradiation certainly inflicts an injury on a biosystem. According to the Linear Non-Threshold hypothesis (LNT), an irradiation inflicts some injury on a biosystem no matter how low the dose may be. Contrary to that expectation, a low-dose irradiation stimulates the immune system and promotes cell proliferation. This is called 'radiation hormesis'. Studies of radiation hormesis are made from four points of view: (1) radiation adaptive response, (2) revitalization caused by a low-dose stimulation, (3) low-dose responses unexpected from the LNT hypothesis, (4) negation of the LNT hypothesis. Various empirical proofs of radiation hormesis are introduced in the report. (M. Suetake)

  16. Microkinetic Modeling of Lean NOx Trap Storage and Regeneration

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Richard S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Chakravarthy, V. Kalyana [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pihl, Josh A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daw, C. Stuart [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2011-12-01

    A microkinetic chemical reaction mechanism capable of describing both the storage and regeneration processes in a fully formulated lean NOx trap (LNT) is presented. The mechanism includes steps occurring on the precious metal, barium oxide (NOx storage), and cerium oxide (oxygen storage) sites of the catalyst. The complete reaction set is used in conjunction with a transient plug flow reactor code (including boundary layer mass transfer) to simulate not only a set of long storage/regeneration cycles with a CO/H2 reductant, but also a series of steady flow temperature sweep experiments that were previously analyzed with just a precious metal mechanism and a steady state code neglecting mass transfer. The results show that, while mass transfer effects are generally minor, NOx storage is not negligible during some of the temperature ramps, necessitating a re-evaluation of the precious metal kinetic parameters. The parameters for the entire mechanism are inferred by finding the best overall fit to the complete set of experiments. Rigorous thermodynamic consistency is enforced for parallel reaction pathways and with respect to known data for all of the gas phase species involved. It is found that, with a few minor exceptions, all of the basic experimental observations can be reproduced with the transient simulations. In addition to accounting for normal cycling behavior, the final mechanism should provide a starting point for the description of further LNT phenomena such as desulfation and the role of alternative reductants.

  17. Modeling aerodynamic discontinuities and onset of chaos in flight dynamical systems

    Science.gov (United States)

    Tobak, M.; Chapman, G. T.; Unal, A.

    1987-01-01

    Various representations of the aerodynamic contribution to the aircraft's equation of motion are shown to be compatible within the common assumption of their Frechet differentiability. Three forms of invalidating Frechet differentiability are identified, and the mathematical model is amended to accommodate their occurrence. Some of the ways in which chaotic behavior may emerge are discussed, first at the level of the aerodynamic contribution to the equations of motion, and then at the level of the equations of motion themselves.

  18. Galilean invariance in the exponential model of atomic collisions

    International Nuclear Information System (INIS)

    del Pozo, A.; Riera, A.; Yáñez, M.

    1986-01-01

    Using the X^(n+)(1s²) + He^(2+) colliding systems as specific examples, we study the origin dependence of results in the application of the two-state exponential model, and we show the relevance of polarization effects in that study. Our analysis shows that polarization effects of the He^+(1s) orbital due to interaction with the X^((n+1)+) ion in the exit channel yield a very small contribution to the energy difference and render the dynamical coupling so strongly origin dependent that it invalidates the basic premises of the model. Further study, incorporating translation factors in the formalism, is needed.

  19. Galilean invariance in the exponential model of atomic collisions

    Energy Technology Data Exchange (ETDEWEB)

    del Pozo, A.; Riera, A.; Yáñez, M.

    1986-11-01

    Using the X^(n+)(1s²) + He^(2+) colliding systems as specific examples, we study the origin dependence of results in the application of the two-state exponential model, and we show the relevance of polarization effects in that study. Our analysis shows that polarization effects of the He^+(1s) orbital due to interaction with the X^((n+1)+) ion in the exit channel yield a very small contribution to the energy difference and render the dynamical coupling so strongly origin dependent that it invalidates the basic premises of the model. Further study, incorporating translation factors in the formalism, is needed.

  20. Linear versus non-linear: a perspective from health physics and radiobiology

    International Nuclear Information System (INIS)

    Gentner, N.E.; Osborne, R.V.

    1998-01-01

    There is a vigorous debate about whether or not there may be a 'threshold' for radiation-induced adverse health effects. A linear-no-threshold (LNT) model allows radiation protection practitioners to manage putative risk consistently, because different types of exposure, exposures at different times, and exposures to different organs may be summed. If we are to argue to regulators and the public that low doses are less dangerous than we presently assume, it is incumbent on us to prove this. The question is, therefore, whether any consonant body of evidence exists that the risk of low doses has been over-estimated. From the perspectives of both health physics and radiobiology, we conclude that the evidence for linearity at high doses (and arguably for fairly small total doses if delivered at high dose rate) is strong. For low doses (or in fact, even for fairly high doses) delivered at low dose rate, the evidence is much less compelling. Since statistical limitations at low doses are almost always going to prevent a definitive answer, one way or the other, from human data, we need a way out of this epistemological dilemma of 'LNT or not LNT, that is the question'. To our minds, the path forward is to exploit (1) radiobiological studies which address directly the question of what the dose and dose rate effectiveness factor is in actual human bodies exposed to low-level radiation, in concert with (2) epidemiological studies of human populations exposed to fairly high doses (to obtain statistical power) but where exposure was protracted over some years. (author)

  1. Modeling aerodynamic discontinuities and the onset of chaos in flight dynamical systems

    Science.gov (United States)

    Tobak, M.; Chapman, G. T.; Uenal, A.

    1986-01-01

    Various representations of the aerodynamic contribution to the aircraft's equation of motion are shown to be compatible within the common assumption of their Frechet differentiability. Three forms of invalidating Frechet differentiability are identified, and the mathematical model is amended to accommodate their occurrence. Some of the ways in which chaotic behavior may emerge are discussed, first at the level of the aerodynamic contribution to the equation of motion, and then at the level of the equations of motion themselves.

  2. Using numerical simulations to extract parameters of toroidal electron plasmas from experimental data

    DEFF Research Database (Denmark)

    Ha, B. N.; Stoneking,, M. R.; Marler, Joan

    2009-01-01

    Measurements of the image charge induced on electrodes provide the primary means of diagnosing plasmas in the Lawrence Non-neutral Torus II (LNT II) [Phys. Rev. Lett. 100, 155001 (2008)]. Therefore, it is necessary to develop techniques that determine characteristics of the electron plasma from......, as in the cylindrical case. In the toroidal case, additional information about the m=1 motion of the plasma can be obtained by analysis of the image charge signal amplitude and shape. Finally, results from the numerical simulations are compared to experimental data from the LNT II and plasma characteristics...

  3. Sulfur impact on NOx storage, oxygen storage, and ammonia breakthrough during cyclic lean/rich operation of a commercial lean NOx trap

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae-Soon; Partridge, William P.; Daw, C. Stuart [Fuels, Engines, and Emissions Research Center, Oak Ridge National Laboratory, P.O. Box 2008, MS-6472, Oak Ridge, TN 37831-6472 (United States)

    2007-11-30

    The objective of the present study was to develop an improved understanding of how sulfur affects the spatiotemporal distribution of reactions and temperature inside a monolithic lean NOx trap (LNT). These spatiotemporal distributions are believed to be major factors in LNT function, and thus, we expect that a better understanding of these phenomena can benefit the design and operation of commercial LNTs. In our study, we experimentally evaluated a commercial LNT monolith installed in a bench-flow reactor with simulated engine exhaust. The reactor feed gas composition was cycled to simulate fast lean/rich LNT operation at 325 °C, and spatiotemporal species and temperature profiles were monitored along the LNT axis at different sulfur loadings. Reactor outlet NOx, NO, N2O, and NH3 were also measured. Sulfur tended to accumulate in a plug-like fashion in the reactor and progressively inhibited NOx storage capacity along the axis. The NOx storage/reduction (NSR) reactions occurred over a relatively short portion of the reactor (NSR zone) under the conditions used in this study, and thus, net NOx conversion was only significantly reduced at high sulfur loading. Oxygen storage capacity (OSC) was also poisoned by sulfur in a progressive manner, but to a lesser extent than the NOx storage capacity. Global selectivity for N2O remained low at all sulfur loadings, but NH3 selectivity increased significantly with sulfur loading. We conjecture that NH3 breakthrough increased because of decreasing oxidation, by downstream stored oxygen, of the NH3 slipping from the NSR zone. The NSR and oxygen storage/reduction (OSR) generated distinctive exotherms during the rich phase and at the rich/lean transition. Exotherm locations shifted downstream with sulfur accumulation in a manner consistent with the progressive poisoning of NSR and OSR sites. (author)

  4. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to misspecification of the cluster size model in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.

  5. Non-Linear Adaptive Phenomena Which Decrease The Risk of Infection After Pre-Exposure to Radiofrequency Radiation

    OpenAIRE

    Mortazavi, S.M.J.; Motamedifar, M.; Namdari, G.; Taheri, M.; Mortazavi, A.R.; Shokrpour, N.

    2013-01-01

    Substantial evidence indicates that adaptive response induced by low doses of ionizing radiation can result in resistance to the damage caused by a subsequently high-dose radiation or cause cross-resistance to other non-radiation stressors. Adaptive response contradicts the linear-non-threshold (LNT) dose-response model for ionizing radiation. We have previously reported that exposure of laboratory animals to radiofrequency radiation can induce a survival adaptive response. Furthermore, we ha...

  6. Conformal invariance in the long-range Ising model

    Directory of Open Access Journals (Sweden)

    Miguel F. Paulos

    2016-01-01

    We consider the question of conformal invariance of the long-range Ising model at the critical point. The continuum description is given in terms of a nonlocal field theory, and the absence of a stress tensor invalidates all of the standard arguments for the enhancement of scale invariance to conformal invariance. We however show that several correlation functions, computed to second order in the epsilon expansion, are nontrivially consistent with conformal invariance. We proceed to give a proof of conformal invariance to all orders in the epsilon expansion, based on the description of the long-range Ising model as a defect theory in an auxiliary higher-dimensional space. A detailed review of conformal invariance in the d-dimensional short-range Ising model is also included and may be of independent interest.

  7. Conformal Invariance in the Long-Range Ising Model

    CERN Document Server

    Paulos, Miguel F; van Rees, Balt C; Zan, Bernardo

    2016-01-01

    We consider the question of conformal invariance of the long-range Ising model at the critical point. The continuum description is given in terms of a nonlocal field theory, and the absence of a stress tensor invalidates all of the standard arguments for the enhancement of scale invariance to conformal invariance. We however show that several correlation functions, computed to second order in the epsilon expansion, are nontrivially consistent with conformal invariance. We proceed to give a proof of conformal invariance to all orders in the epsilon expansion, based on the description of the long-range Ising model as a defect theory in an auxiliary higher-dimensional space. A detailed review of conformal invariance in the d-dimensional short-range Ising model is also included and may be of independent interest.

  8. Conformal invariance in the long-range Ising model

    Energy Technology Data Exchange (ETDEWEB)

    Paulos, Miguel F. [CERN, Theory Group, Geneva (Switzerland); Rychkov, Slava, E-mail: slava.rychkov@lpt.ens.fr [CERN, Theory Group, Geneva (Switzerland); Laboratoire de Physique Théorique de l' École Normale Supérieure (LPTENS), Paris (France); Faculté de Physique, Université Pierre et Marie Curie (UPMC), Paris (France); Rees, Balt C. van [CERN, Theory Group, Geneva (Switzerland); Zan, Bernardo [Institute of Physics, Universiteit van Amsterdam, Amsterdam (Netherlands)

    2016-01-15

    We consider the question of conformal invariance of the long-range Ising model at the critical point. The continuum description is given in terms of a nonlocal field theory, and the absence of a stress tensor invalidates all of the standard arguments for the enhancement of scale invariance to conformal invariance. We however show that several correlation functions, computed to second order in the epsilon expansion, are nontrivially consistent with conformal invariance. We proceed to give a proof of conformal invariance to all orders in the epsilon expansion, based on the description of the long-range Ising model as a defect theory in an auxiliary higher-dimensional space. A detailed review of conformal invariance in the d-dimensional short-range Ising model is also included and may be of independent interest.

  9. Lipid nanotechnologies for structural studies of membrane-associated proteins.

    Science.gov (United States)

    Stoilova-McPhie, Svetla; Grushin, Kirill; Dalm, Daniela; Miller, Jaimy

    2014-11-01

    We present a methodology of lipid nanotube (LNT) and nanodisk technologies optimized in our laboratory for structural studies of membrane-associated proteins at close to physiological conditions. The application of these lipid nanotechnologies for structure determination by cryo-electron microscopy (cryo-EM) is fundamental for understanding and modulating their function. The LNTs in our studies are single-bilayer galactosylceramide-based nanotubes of ∼20 nm inner diameter and a few microns in length that self-assemble in aqueous solutions. The lipid nanodisks (NDs) are self-assembled discoid lipid bilayers of ∼10 nm diameter, which are stabilized in aqueous solutions by a belt of amphipathic helical scaffold proteins. By combining LNT and ND technologies, we can examine structurally how membrane curvature and lipid composition modulate the function of membrane-associated proteins. As proof of principle, we have engineered these lipid nanotechnologies to mimic the activated platelet's phosphatidylserine-rich membrane and have successfully assembled functional membrane-bound coagulation factor VIII in vitro for structure determination by cryo-EM. The macromolecular organization of the proteins bound to ND and LNT is further defined by fitting the known atomic structures within the calculated three-dimensional maps. The combination of LNT and ND technologies offers a means to control the design and assembly of a wide range of functional membrane-associated proteins and complexes for structural studies by cryo-EM. The presented results confirm the suitability of the developed methodology for studying the functional structure of membrane-associated proteins, such as the coagulation factors, in a close-to-physiological environment. © 2014 Wiley Periodicals, Inc.

  10. modeling workflow management in a distributed computing system

    African Journals Online (AJOL)

    Dr Obe

    communication system, which allows for computerized support. ... Keywords: Distributed computing system; Petri nets; Workflow management. ... A distributed operating system usually ... the questionnaire is returned with invalid data, ...

  11. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology for the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm takes a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, although they are aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with

  12. Simplified Model of Nonlinear Landau Damping

    International Nuclear Information System (INIS)

    Yampolsky, N.A.; Fisch, N.J.

    2009-01-01

    The nonlinear interaction of a plasma wave with resonant electrons results in a plateau in the electron distribution function close to the phase velocity of the plasma wave. As a result, Landau damping of the plasma wave vanishes and the resonant frequency of the plasma wave downshifts. However, this simple picture is invalid when the external driving force changes the plasma wave fast enough that the plateau cannot fully develop. A new model to describe amplification of the plasma wave, including the saturation of Landau damping and the nonlinear frequency shift, is proposed. The proposed model takes into account the change of the plasma wave amplitude and describes saturation of the Landau damping rate in terms of a single fluid equation, which simplifies the description of the inherently kinetic nature of Landau damping. The proposed fluid model, incorporating these simplifications, is verified numerically using a kinetic Vlasov code.

  13. [Consolidating the medical model of disability: on poliomyelitis and constitution of orthopedic surgery and orthopaedics as a speciality in Spain (1930-1950)].

    Science.gov (United States)

    Martínez-Pérez, José

    2009-01-01

    At the beginning of the 1930s, various factors made it necessary to transform one of the institutions which was renowned for its work regarding the social reinsertion of the disabled, that is, the Instituto de Reeducación Profesional de Inválidos del Trabajo (Institute for Occupational Retraining of Invalids of Work). The economic crisis of 1929 and the legislative reform aimed at regulating occupational accidents highlighted the failings of this institution in fulfilling its objectives. After a time of uncertainty, the centre was renamed the Instituto Nacional de Reeducación de Inválidos (National Institute for Retraining of Invalids). This was done to take advantage of its work in championing the recovery of all people with disabilities. This work aims to study the role played in this process by the poliomyelitis epidemics in Spain at this time. It aims to highlight how this disease justified the need to continue the work of a group of professionals and how it helped to reorient the previous programme to re-educate the "invalids." Thus we shall see the way in which, from 1930 to 1950, a specific medical technology helped to consolidate an "individual model" of disability and how a certain cultural stereotype of those affected developed as a result. Lastly, this work discusses the way in which all this took place in the midst of a process of professional development of orthopaedic surgeons.

  14. Illusory inferences from a disjunction of conditionals: a new mental models account.

    Science.gov (United States)

    Barrouillet, P; Lecas, J F

    2000-08-14

    Johnson-Laird and Savary (Johnson-Laird, P.N., & Savary, F. (1999). Illusory inferences: a novel class of erroneous deductions. Cognition, 71, 191-229) have recently presented a mental models account, based on the so-called principle of truth, for the occurrence of inferences that are compelling but invalid. This article presents an alternative account of the illusory inferences resulting from a disjunction of conditionals. In accordance with our modified theory of mental models of the conditional, we show that the way individuals represent conditionals leads them to misinterpret the locus of the disjunction and prevents them from drawing conclusions from a false conditional, thus accounting for the compelling character of the illusory inference.

  15. Exposures at low doses and biological effects of ionizing radiations

    International Nuclear Information System (INIS)

    Masse, R.

    2000-01-01

    Everyone is exposed to radiation from natural, man-made and medical sources, and the world-wide average annual exposure can be put at about 3.5 mSv. Exposure to natural sources is characterised by very large fluctuations, spanning up to two orders of magnitude. Millions of inhabitants are continuously exposed to external doses as high as 10 mSv per year, delivered at low dose rates; very few workers are exposed above the legal limit of 50 mSv/year; and as for accidental exposures, only 5% of the 116 000 people evacuated following the Chernobyl disaster encountered doses above 100 mSv. Epidemiological surveys of accidentally, occupationally or medically exposed groups have revealed radio-induced cancers, mostly following high dose-rate exposures, only above 100 mSv. Risk coefficients were derived from these studies and projected into linear models of risk (the linear no-threshold hypothesis: LNT) for the purpose of risk management following exposures at low doses and low dose rates. The legitimacy of this approach has been questioned by the Academy of sciences and the Academy of medicine in France, arguing: that LNT was not supported by the Hiroshima and Nagasaki studies when the neutron dose was revisited; that linear modelling failed to explain why so many site-related cancers were obviously nonlinearly related to dose, especially when theory predicted they ought to be; that no evidence could be found of radio-induced cancers related to natural exposures or to low exposures at the work place; and that no evidence of genetic disease could be shown in any of the exposed groups. Arguments helping to solve this issue were provided by cellular and molecular biology, all resulting in dismissal of the LNT hypothesis. These arguments included: different mechanisms of DNA repair at high and low dose rate; the influence of inducible stress responses modifying mutagenesis and lethality; bystander effects allowing it to be considered that individual

  16. Biological effect and tumor risk of diagnostic x-rays. The "war of the theories"; Biologische Wirkung und Tumorrisiko diagnostischer Roentgenstrahlen. Der "Krieg der Modelle"

    Energy Technology Data Exchange (ETDEWEB)

    Selzer, E.; Hebar, A. [Medizinische Universitaet Wien, Abteilung fuer Strahlenbiologie, Klinik fuer Strahlentherapie, Wien (Austria)

    2012-10-15

    Since the introduction of ionizing radiation as a treatment and diagnostic tool in humans, scientists have been trying to estimate its side effects and potential health risks. There is now ample evidence for the existence in principle of a direct relationship between higher doses and the risk of side effects. Most of the uncertainties lie in the field of low-dose effects, especially with respect to the risk of cancer induction. Low-dose effects are usually of relevance in diagnostic medicine, while high-dose radiation effects are typically observed after radiotherapeutic treatment for cancer or after nuclear accidents. The current state of the "war of theories" may be summarized as follows: one group of scientists and health regulatory officials favors the hypothesis that there is no threshold dose of radiation that can be regarded as safe, i.e. the linear no-threshold (LNT) hypothesis. On the contrary, critics of this hypothesis suggest that the risks of doses below 50 mSv are not measurable, let alone of clinical relevance, and are not adequately described by a linear dose-response relationship. The aim of this article is to summarize the major unresolved issues in this field. Arguments are presented why the validity of the LNT model in the low-dose range should be regarded as at least inconsistent and thus questionable. (orig.)

  17. Targeted and non-targeted effects of ionizing radiation

    OpenAIRE

    Omar Desouky; Nan Ding; Guangming Zhou

    2015-01-01

    For a long time it was generally accepted that effects of ionizing radiation such as cell death, chromosomal aberrations, DNA damage, mutagenesis, and carcinogenesis result from direct ionization of cell structures, particularly DNA, or from indirect damage through reactive oxygen species produced by radiolysis of water, and these biological effects were attributed to irreparable or misrepaired DNA damage in cells directly hit by radiation. Using linear non-threshold model (LNT), possible ris...

  18. A test of inflated zeros for Poisson regression models.

    Science.gov (United States)

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require fitting a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
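
Since the abstract turns on how excess zeros distort a Poisson fit, a small stdlib-only sketch (not the paper's test; λ and the zero-inflation rate are invented) showing the mismatch between the observed zero fraction and the zero fraction a plain Poisson model with the same mean would imply:

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication algorithm for Poisson draws (fine for small lambda).
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
lam, n, zero_prob = 2.0, 10000, 0.3  # 30% structural zeros (illustrative)

# Zero-inflated Poisson: with probability zero_prob emit a structural zero.
data = [0 if rng.random() < zero_prob else poisson_sample(lam, rng) for _ in range(n)]

observed_zero_frac = sum(x == 0 for x in data) / n
# Zero fraction a plain Poisson model with the same mean would predict: exp(-mean).
mean = sum(data) / n
expected_zero_frac = math.exp(-mean)

print(f"observed zeros:  {observed_zero_frac:.3f}")
print(f"Poisson-implied: {expected_zero_frac:.3f}")
```

A test of inflated zeros can be built from exactly this gap, which is the general idea behind comparing a Poisson fit against the observed zero count.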

  19. An elastic-plastic contact model for line contact structures

    Science.gov (United States)

    Zhu, Haibin; Zhao, Yingtao; He, Zhifeng; Zhang, Ruinan; Ma, Shaopeng

    2018-06-01

    Although numerical simulation tools are now very powerful, the development of analytical models remains important for predicting the mechanical behaviour of line contact structures, both for a deeper understanding of contact problems and for engineering applications. For the line contact structures widely used in the engineering field, few analytical models are available for predicting the mechanical behaviour when the structures deform plastically, as classical Hertz theory becomes invalid. Thus, the present study proposed an elastic-plastic model for line contact structures based on an understanding of the yield mechanism. A mathematical expression describing the global relationship between load history and contact-width evolution of line contact structures was obtained. The proposed model was verified through an actual line contact test and a corresponding numerical simulation. The results confirmed that this model can be used to accurately predict the elastic-plastic mechanical behaviour of a line contact structure.
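
For context on the elastic baseline that such models extend: the textbook Hertz line-contact solution gives contact half-width b = sqrt(4wR/(πE*)) and peak pressure p0 = 2w/(πb). A sketch with illustrative steel-on-steel numbers (not taken from the paper, which covers the plastic regime Hertz theory cannot):

```python
import math

def hertz_line_contact(w, R, E1, nu1, E2, nu2):
    """Elastic Hertz solution for a cylinder-on-plane line contact.

    w : load per unit contact length [N/m]
    R : effective radius [m]
    Returns (contact half-width [m], peak pressure [Pa]).
    Valid only while both bodies remain elastic.
    """
    # Effective contact modulus: 1/E* = (1-nu1^2)/E1 + (1-nu2^2)/E2
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    b = math.sqrt(4 * w * R / (math.pi * E_star))
    p0 = 2 * w / (math.pi * b)
    return b, p0

# Illustrative: steel cylinder (R = 10 mm) on a steel plane, 100 kN per metre.
b, p0 = hertz_line_contact(w=1e5, R=0.01, E1=210e9, nu1=0.3, E2=210e9, nu2=0.3)
print(f"contact half-width: {b*1e6:.1f} um, peak pressure: {p0/1e9:.2f} GPa")
```

Once p0 approaches the yield strength of the softer body, this elastic solution no longer applies, which is the regime the abstract's elastic-plastic model addresses.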

  20. Stability of the electroweak ground state in the Standard Model and its extensions

    International Nuclear Information System (INIS)

    Di Luzio, Luca; Isidori, Gino; Ridolfi, Giovanni

    2016-01-01

    We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  1. Stability of the electroweak ground state in the Standard Model and its extensions

    Energy Technology Data Exchange (ETDEWEB)

    Di Luzio, Luca, E-mail: diluzio@ge.infn.it [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy); Isidori, Gino [Department of Physics, University of Zürich, Winterthurerstrasse 190, CH-8057 Zürich (Switzerland); Ridolfi, Giovanni [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy)

    2016-02-10

    We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  2. Epidemiological studies on the effects of low-level ionizing radiation on cancer risk

    International Nuclear Information System (INIS)

    Akiba, Suminori

    2010-01-01

    The health effects of low-level ionizing radiation are as yet unclear. As pointed out by Upton in his review (Upton, 1989), low-level ionizing radiation seems to have biological effects different from those of high-level radiation. If so, the hazard identification of ionizing radiation should be conducted separately for low- and high-level ionizing radiation; the hazard identification of low-level radiation is yet to be completed. What makes hazard identification of ionizing radiation difficult, particularly for the carcinogenic effect, is the difficulty in distinguishing radiation-induced cancer from other cancers with respect to clinicopathological features and molecular biological characteristics. Indeed, radiation-induced carcinogenesis is suspected to involve mechanisms not specific to radiation, such as oxidative stress. Excess risk per dose in medium-to-high dose ranges can be extrapolated to the low-dose range if the dose response can be described by the linear no-threshold model. The cancer risk data of atomic-bomb survivors describe leukemia risk with a linear-quadratic (LQ) model and solid-cancer risk with a linear no-threshold (LNT) model. The LQ model for leukemia and the LNT model for solid cancer correspond to the two-hit model and the one-hit model, respectively. Although the one-hit model is an unlikely dose response for carcinogenesis, there is no convincing epidemiological evidence supporting the LQ model or a non-threshold model for solid cancer. It should be pointed out, however, that even if the true dose response is non-linear, the various sources of noise in epidemiological data may mask the truth. In this paper, the potential contribution of epidemiological studies on nuclear workers and residents in high background radiation areas will be discussed. (author)
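
The two dose-response shapes contrasted in the abstract are easy to state: in a common parameterisation the LNT model has excess relative risk ERR(D) = αD, while the LQ model adds a quadratic term, ERR(D) = αD + βD². A toy comparison with invented coefficients (these are not fitted values from the atomic-bomb survivor data):

```python
def err_lnt(dose, alpha):
    # Linear no-threshold: excess risk strictly proportional to dose.
    return alpha * dose

def err_lq(dose, alpha, beta):
    # Linear-quadratic: the linear term dominates at low dose,
    # the quadratic term at high dose (the form used for leukemia).
    return alpha * dose + beta * dose**2

alpha, beta = 0.5, 1.0  # illustrative coefficients per Gy and per Gy^2
for dose in (0.01, 0.1, 1.0):
    ratio = err_lq(dose, alpha, beta) / err_lnt(dose, alpha)
    print(f"D = {dose:4} Gy: LQ/LNT excess-risk ratio = {ratio:.2f}")
```

The ratio 1 + (β/α)D shows why the two models are nearly indistinguishable at low doses, where the epidemiological noise the abstract mentions is largest, yet diverge at high doses.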

  3. Molecular biology, epidemiology, and the demise of the linear no-threshold hypothesis

    International Nuclear Information System (INIS)

    Pollycove, M.

    1998-01-01

    The LNT hypothesis is the basic principle of all radiation protection policy. This theory assumes that all radiation doses, even those close to zero, are harmful in linear proportion to dose and that all doses produce a proportionate number of harmful mutations, i.e., mis- or unrepaired DNA alterations. The LNT theory is used to generate collective dose calculations of the number of deaths produced by minute fractions of background radiation. Current molecular biology reveals an enormous amount of relentless metabolic oxidative free radical damage with mis/unrepaired alterations of DNA. The corresponding mis/unrepaired DNA alterations produced by background radiation are negligible. These DNA alterations are effectively disposed of by the DNA damage-control biosystem of antioxidant prevention, enzymatic repair, and mutation removal. High-dose radiation injures this biosystem with associated risk increments of mortality and cancer mortality. Low-dose radiation stimulates DNA damage-control with associated epidemiologic observations of risk decrements of mortality and cancer mortality, i.e., hormesis. How can this 40-year-old LNT paradigm continue to be the operative principle of radiation protection policy despite the contradictory scientific observations of both molecular biology and epidemiology and the lack of any supportive human data? The increase of public fear through repeated statements of deaths caused by 'deadly' radiation has engendered an enormous increase in expenditures now required to 'protect' the public from all applications of nuclear technology: medical, research, energy, disposal, and cleanup remediation. Government funds are allocated to appointed committees, the research they support, and to multiple environmental and regulatory agencies. The LNT theory and multibillion dollar radiation activities have now become a symbiotic self-sustaining powerful political and economic force. (author)

  4. Physical examination tests and imaging studies based on arthroscopic assessment of the long head of biceps tendon are invalid.

    Science.gov (United States)

    Jordan, Robert W; Saithna, Adnan

    2017-10-01

    The aim of this study was to evaluate whether glenohumeral arthroscopy is an appropriate gold standard for the diagnosis of long head of biceps (LHB) tendon pathology. The objectives were to evaluate whether the length of tendon that can be seen at arthroscopy allows visualisation of areas of predilection of pathology and also to determine the rates of missed diagnoses at arthroscopy when compared to an open approach. A systematic review of cadaveric and clinical studies was performed. The search strategy was applied to MEDLINE, PubMed and Google Scholar databases. All relevant articles were included. Critical appraisal of clinical studies was performed using a validated quality assessment scale. Five articles were identified for inclusion in the review. This included both clinical and cadaveric studies. The overall population comprised 18 cadaveric specimens and 575 patients. Out of the five included studies, three reported the length of LHB tendon visualised during arthroscopy and four reported the rate of missed LHB diagnosis. Cadaveric studies showed that the use of a hook probe allowed arthroscopic visualisation of between 34 and 48 % of the overall length of the LHB. In the clinical series, the rate of missed diagnoses at arthroscopy when compared to open exploration ranged between 33 and 49 %. Arthroscopy allows visualisation of only a small part of the extra-articular LHB tendon. This leads to a high rate of missed pathology in the distal part of the tendon. Published figures for sensitivities and specificities of common physical examination and imaging tests for LHB pathology that are based on arthroscopy as the gold standard are therefore invalid. In clinical practice, it is important to note that a "negative" arthroscopic assessment does not exclude a lesion of the LHB tendon as this technique does not allow visualisation of common sites of distal pathology. IV.

  5. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection
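
The mixture underlying a cure model can be written as S(t) = π + (1 − π)S₀(t), where π is the nonsusceptible (cured) fraction and S₀ the survival function of susceptible individuals. A hypothetical sketch with exponential endurance (the parameters are invented, not the paper's):

```python
import math

def cure_survival(t, cured_frac, hazard):
    """Population survival under a simple mixture cure model.

    cured_frac : probability an individual is nonsusceptible (never dies of
                 the challenge) -- the fraction a classical survival model
                 wrongly forces to zero.
    hazard     : per-time-unit risk of dying for susceptible individuals
                 (the 'endurance' trait in the abstract's terms).
    """
    susceptible_survival = math.exp(-hazard * t)
    return cured_frac + (1 - cured_frac) * susceptible_survival

# With a 30% cured fraction and hazard 0.2 per day, survival plateaus at
# 0.30 instead of decaying to zero as a classical model would assume.
for t in (0, 10, 50):
    print(f"t = {t:3d}: S(t) = {cure_survival(t, 0.3, 0.2):.3f}")
```

The plateau is the signature of the cured fraction: fitting a classical model to such data misattributes the flat tail to a vanishing hazard.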

  6. Health effects of low level exposure to ionizing radiation: origin and development of a controversy

    International Nuclear Information System (INIS)

    Masse, Roland

    2014-06-01

    Health hazard assessment for doses of ionizing radiation lower than 100-200 mSv is a matter of controversy, all the more acutely when a transition towards a new energy paradigm is being chosen. Neither epidemiological nor experimental data can determine the shape of the dose-effect relationship from 0 to 100 mSv. Recently, however, long-term follow-up of children and young adults exposed to CT scans showed that doses of 50 to 60 mGy delivered at high dose rate were associated with a significant increase in leukemias and cancers, including brain cancer. On the basis of the available data, this article raises questions about the plausibility of the linear no-threshold hypothesis (LNT) used by radiological protection bodies to control overexposure of members of the public and workers. It concludes that although the plausibility of LNT is fairly weak, using LNT helps to situate the order of magnitude of the health risks associated with the development of nuclear power plants and to compare them with those resulting from burning fossil fuels and biomass; the comparison shows that, for the same quantity of energy produced, nuclear power is the option that spares the most human lives. (author)

  7. Dose-responses for mortality from cerebrovascular and heart diseases in atomic bomb survivors: 1950-2003

    Energy Technology Data Exchange (ETDEWEB)

    Schoellnberger, Helmut [Helmholtz Zentrum Muenchen, Department of Radiation Sciences, Institute of Radiation Protection, Neuherberg (Germany); Federal Office for Radiation Protection, Department of Radiation Protection and the Environment, Neuherberg (Germany); Eidemueller, Markus; Simonetto, Cristoforo; Kaiser, Jan Christian [Helmholtz Zentrum Muenchen, Department of Radiation Sciences, Institute of Radiation Protection, Neuherberg (Germany); Cullings, Harry M. [Radiation Effects Research Foundation, Department of Statistics, Hiroshima (Japan); Neff, Frauke [Staedtisches Klinikum Muenchen and Technical University of Munich, Institute of Pathology, Munich (Germany)

    2018-03-15

    The scientific community faces important discussions on the validity of the linear no-threshold (LNT) model for radiation-associated cardiovascular diseases at low and moderate doses. In the present study, mortalities from cerebrovascular diseases (CeVD) and heart diseases from the latest data on atomic bomb survivors were analyzed. The analysis was performed with several radio-biologically motivated linear and nonlinear dose-response models. For each detrimental health outcome one set of models was identified that all fitted the data about equally well. This set was used for multi-model inference (MMI), a statistical method of superposing different models to allow risk estimates to be based on several plausible dose-response models rather than just relying on a single model of choice. MMI provides a more accurate determination of the dose response and a more comprehensive characterization of uncertainties. It was found that for CeVD, the dose-response curve from MMI is located below the linear no-threshold model at low and medium doses (0-1.4 Gy). At higher doses MMI predicts a higher risk compared to the LNT model. A sublinear dose-response was also found for heart diseases (0-3 Gy). The analyses provide no conclusive answer to the question whether there is a radiation risk below 0.75 Gy for CeVD and 2.6 Gy for heart diseases. MMI suggests that the dose-response curves for CeVD and heart diseases in the Lifespan Study are sublinear at low and moderate doses. This has relevance for radiotherapy treatment planning and for international radiation protection practices in general. (orig.)
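
Multi-model inference superposes several plausible dose-response models rather than committing to one. A common recipe for such superposition is Akaike weighting, where each model i receives weight proportional to exp(−Δᵢ/2), with Δᵢ its AIC distance from the best-fitting model; this is a generic sketch, not the paper's actual fits (model names, AICs and risks are invented):

```python
import math

def akaike_weights(aics):
    # Models with lower AIC receive exponentially larger weight;
    # weights are normalised to sum to one.
    best = min(aics)
    raw = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical fits: (name, AIC, predicted excess risk at some fixed dose)
models = [("linear", 102.0, 0.10), ("threshold", 100.0, 0.04), ("sigmoid", 101.0, 0.06)]
weights = akaike_weights([aic for _, aic, _ in models])

# Model-averaged (MMI) prediction: weighted sum over the model set.
mmi_risk = sum(w * risk for w, (_, _, risk) in zip(weights, models))

for (name, _, risk), w in zip(models, weights):
    print(f"{name:9s} weight = {w:.3f}  risk = {risk:.2f}")
print(f"MMI-averaged risk = {mmi_risk:.3f}")
```

Averaging over the weighted set is what lets MMI produce a dose-response curve that can sit below or above any single model, as the abstract reports for CeVD.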

  8. KWIK Smoke Obscuration Model: User’s Guide.

    Science.gov (United States)

    1982-09-01

  9. Neural networks for tracking of unknown SISO discrete-time nonlinear dynamic systems.

    Science.gov (United States)

    Aftab, Muhammad Saleheen; Shafiq, Muhammad

    2015-11-01

    This article presents a Lyapunov function based neural network tracking (LNT) strategy for single-input, single-output (SISO) discrete-time nonlinear dynamic systems. The proposed LNT architecture is composed of two feedforward neural networks operating as controller and estimator. A Lyapunov function based back propagation learning algorithm is used for online adjustment of the controller and estimator parameters. The controller and estimator error convergence and closed-loop system stability analysis is performed by Lyapunov stability theory. Moreover, two simulation examples and one real-time experiment are investigated as case studies. The achieved results successfully validate the controller performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  10. The issue of statistical power for overall model fit in evaluating structural equation models

    Directory of Open Access Journals (Sweden)

    Richard HERMIDA

    2015-06-01

    Statistical power is an important concept for psychological research. However, examining the power of a structural equation model (SEM) is rare in practice. This article provides an accessible review of the concept of statistical power for the Root Mean Square Error of Approximation (RMSEA) index of overall model fit in structural equation modeling. By way of example, we examine the current state of power in the literature by reviewing studies in top Industrial-Organizational (I/O) Psychology journals using SEMs. Results indicate that in many studies, power is very low, which implies acceptance of invalid models. Additionally, we examined methodological situations which may have an influence on the statistical power of SEMs. Results showed that power varies significantly as a function of model type and whether or not the model is the main model for the study. Finally, results indicated that power is significantly related to the model fit statistics used in evaluating SEMs. The results from this quantitative review imply that researchers should be more vigilant with respect to power in structural equation modeling. We therefore conclude by offering methodological best practices to increase confidence in the interpretation of structural equation modeling results with respect to statistical power issues.
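
Power for the RMSEA test of close fit is conventionally computed from noncentral chi-square distributions (the MacCallum-Browne-Sugawara approach). A sketch, assuming SciPy is available; the df, sample sizes and RMSEA values below are illustrative, not drawn from the reviewed studies:

```python
from scipy.stats import ncx2

def rmsea_power(n, df, eps0=0.05, eps_a=0.08, alpha=0.05):
    """Power of the test of close fit (H0: RMSEA <= eps0) when the
    true RMSEA is eps_a, following the noncentral chi-square approach."""
    ncp0 = (n - 1) * df * eps0**2    # noncentrality under H0
    ncp_a = (n - 1) * df * eps_a**2  # noncentrality under the alternative
    crit = ncx2.ppf(1 - alpha, df, ncp0)
    return ncx2.sf(crit, df, ncp_a)

# Power grows with sample size for a fixed model (df = 50 here).
for n in (200, 500, 1000):
    print(f"N = {n:4d}: power = {rmsea_power(n, df=50):.3f}")
```

Running such a calculation before interpreting fit is the kind of vigilance the article recommends: with small N, an invalid model can easily pass the close-fit test.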

  11. Standard-Chinese Lexical Neighborhood Test in normal-hearing young children.

    Science.gov (United States)

    Liu, Chang; Liu, Sha; Zhang, Ning; Yang, Yilin; Kong, Ying; Zhang, Luo

    2011-06-01

    The purposes of the present study were to establish the Standard-Chinese version of the Lexical Neighborhood Test (LNT) and to examine lexical and age effects on spoken-word recognition in normal-hearing children. Six lists of monosyllabic and six lists of disyllabic words (20 words/list) were selected from a database of daily speech materials for normal-hearing (NH) children aged 3-5 years. The lists were further divided into "easy" and "hard" halves according to word frequency and neighborhood density in the database, based on the Neighborhood Activation Model (NAM). Ninety-six NH children (ages 4.0-7.0 years) were divided into three age groups at 1-year intervals. Speech-perception tests were conducted using the Standard-Chinese monosyllabic and disyllabic LNT. Inter-list performance was found to be equivalent, and inter-rater reliability was high, with 92.5-95% consistency. Word-recognition scores showed that the lexical effects were all significant: children scored higher with disyllabic words than with monosyllabic words, and "easy" words scored higher than "hard" words. Word-recognition performance also increased with age in each lexical category. A multiple linear regression analysis showed that neighborhood density, age, and word frequency made increasingly large contributions to Chinese word recognition. The results of the present study indicate that Chinese word recognition is influenced by word frequency, age, and neighborhood density, with word frequency playing a major role. These results are consistent with those in other languages, supporting the application of the NAM to the Chinese language. The development of the Standard-Chinese version of the LNT and the establishment of a database for children aged 4-6 years provide a reliable means of testing spoken-word recognition in children with hearing impairment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  12. A More Flexible Lipoprotein Sorting Pathway

    Science.gov (United States)

    Chahales, Peter

    2015-01-01

    Lipoprotein biogenesis in Gram-negative bacteria occurs by a conserved pathway, each step of which is considered essential. In contrast to this model, LoVullo and colleagues demonstrate that the N-acyl transferase Lnt is not required in Francisella tularensis or Neisseria gonorrhoeae. This suggests the existence of a more flexible lipoprotein pathway, likely due to a modified Lol transporter complex, and raises the possibility that pathogens may regulate lipoprotein processing to modulate interactions with the host. PMID:25755190

  13. Extremely rare collapse and build-up of turbulence in stochastic models of transitional wall flows

    Science.gov (United States)

    Rolland, Joran

    2018-02-01

    This paper presents a numerical and theoretical study of multistability in two stochastic models of transitional wall flows. An algorithm dedicated to the computation of rare events is adapted to these two stochastic models. The main focus is placed on a stochastic partial differential equation model proposed by Barkley. Three types of events are computed in a systematic and reproducible manner: (i) the collapse of isolated puffs and of domains initially containing their steady turbulent fraction; (ii) puff splitting; (iii) the build-up of turbulence from the laminar base flow under a noise perturbation of vanishing variance. For build-up events, an extreme realization of the vanishing-variance noise pushes the state from the laminar base flow to the most probable germ of turbulence, which in turn develops into a full-blown puff. For collapse events, the Reynolds number and length ranges of the two regimes of collapse of laminar-turbulent pipes, independent collapse or global collapse of puffs, are determined. The mean first passage time before each event is then systematically computed as a function of the Reynolds number r and pipe length L in the laminar-turbulent coexistence range of Reynolds number. In the case of isolated puffs, the faster-than-linear growth with Reynolds number of the logarithm of the mean first passage time T before collapse is separated into two parts. One finds that ln(T) = A_p r - B_p, with A_p and B_p positive. Moreover, A_p and B_p are affine in the spatial integral of the turbulence intensity of the puff, with the same slope. In the case of pipes initially containing the steady turbulent fraction, the length L and Reynolds number r dependence of the mean first passage time T before collapse is also separated. The author finds that T ≍ exp[L(A r - B)], with A and B positive. The length and Reynolds number dependence of T are then discussed in view of large-deviation theoretical approaches to the study of mean first passage times and multistability.
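    The quoted scaling laws make parameter extraction a linear regression of ln(T) on r. A minimal sketch on synthetic data; L_pipe, A_true and B_true are invented for illustration, not values from the paper:

```python
import numpy as np

# Hypothetical mean first passage times obeying T = exp[L*(A*r - B)]
L_pipe, A_true, B_true = 50.0, 0.12, 1.3
r = np.linspace(20.0, 40.0, 30)        # Reynolds-number-like control parameter
lnT = L_pipe * (A_true * r - B_true)   # ln(T), exact for this sketch

# Recover A and B from the affine dependence of ln(T) on r
slope, intercept = np.polyfit(r, lnT, 1)
A_fit, B_fit = slope / L_pipe, -intercept / L_pipe
```

In practice lnT would come from averaging rare-event simulations, and the fit residuals indicate how well the exponential scaling holds.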

  14. A large enhancement of photoinduced second harmonic generation in CdI2--Cu layered nanocrystals.

    Science.gov (United States)

    Miah, M Idrish

    2009-02-12

    Photoinduced second harmonic generation (PISHG) in undoped as well as in various Cu-doped (0.05-1.2% Cu) CdI2 nanocrystals was measured at liquid nitrogen temperature (LNT). It was found that the PISHG increases with increasing Cu doping up to approximately 0.6% and then decreases almost to that for the undoped CdI2 for doping higher than approximately 1%. The values of the second-order susceptibility ranged from 0.50 to 0.67 pm V^-1 for the Cu-doped nanocrystals with a thickness of 0.5 nm. The parabolic Cu-doping dependence suggests a crucial role of the Cu agglomerates in the observed effects. The PISHG in crystals with various nanosizes was also measured at LNT. The size dependence demonstrated the quantum-confinement effect, with a maximum PISHG for 0.5 nm and a clear increase in the PISHG with decreasing thickness of the nanocrystal. The Raman scattering spectra at different pumping powers were taken for thin nanocrystals, and phonon modes originating from interlayer phonons were observed in the spectra. The results were discussed within a model of photoinduced electron-phonon anharmonicity.

  15. Stability of the electroweak ground state in the Standard Model and its extensions

    Directory of Open Access Journals (Sweden)

    Luca Di Luzio

    2016-02-01

    Full Text Available We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  16. Analytical models approximating individual processes: a validation method.

    Science.gov (United States)

    Favier, C; Degallier, N; Menkès, C E

    2010-12-01

    Upscaling population models from fine to coarse resolutions, in space, time and/or level of description, allows the derivation of fast and tractable models based on a thorough knowledge of individual processes. The validity of such approximations is generally tested only on a limited range of parameter sets. A more general validation test, over a range of parameters, is proposed; it estimates the error induced by the approximation, using the original model's stochastic variability as a reference. The method is illustrated by three examples from the field of epidemics transmitted by vectors that bite in a temporally cyclical pattern, showing how it can be used: to estimate whether an approximation over- or under-fits the original model; to invalidate an approximation; and to rank candidate approximations by quality. As a result, the application of the validation method to this field emphasizes the need to account for the vectors' biology in epidemic prediction models and to validate these against finer-scale models. Copyright © 2010 Elsevier Inc. All rights reserved.
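    The proposed criterion, judging an approximation's error against the stochastic variability of the original model, can be sketched as follows. The arrays below are synthetic stand-ins, not the vector-borne epidemic models of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# For each parameter value p: an ensemble of stochastic-model runs,
# and a deterministic approximation carrying a small bias.
p = np.linspace(0.1, 1.0, 10)
runs = rng.normal(loc=p[:, None], scale=0.05, size=(10, 200))
approx = p + 0.02

# Error of the approximation measured in units of the stochastic spread
z = (approx - runs.mean(axis=1)) / runs.std(axis=1)
acceptable = np.abs(z) < 1.0   # within the model's own variability
```

Values of |z| well above 1 over part of the parameter range would invalidate the approximation there; comparing |z| profiles ranks competing approximations.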

  17. Towards product design automation based on parameterized standard model with diversiform knowledge

    Science.gov (United States)

    Liu, Wei; Zhang, Xiaobing

    2017-04-01

    Product standardization based on CAD software is an effective way to improve design efficiency. In the past, research and development on standardization mainly focused on the component level, and standardization of the entire product as a whole was rarely considered. In this paper, the size and structure of 3D product models are both driven by Excel datasheets, on the basis of which a parameterized model library is established. Diversiform knowledge, including associated parameters and default properties, is embedded into the templates in advance to simplify their reuse. Through simple operations, the correct product can be obtained as finished 3D models, whether single parts or complex assemblies. Two examples are presented to validate the approach, which greatly improves design efficiency.

  18. Formal Specification and Verification of Fully Asynchronous Implementations of the Data Encryption Standard

    Directory of Open Access Journals (Sweden)

    Wendelin Serwe

    2015-11-01

    Full Text Available This paper presents two formal models of the Data Encryption Standard (DES, a first using the international standard LOTOS, and a second using the more recent process calculus LNT. Both models encode the DES in the style of asynchronous circuits, i.e., the data-flow blocks of the DES algorithm are represented by processes communicating via rendezvous. To ensure correctness of the models, several techniques have been applied, including model checking, equivalence checking, and comparing the results produced by a prototype automatically generated from the formal model with those of existing implementations of the DES. The complete code of the models is provided as appendices and also available on the website of the CADP verification toolbox.

  19. Problems in the radon versus lung cancer test of the linear no-threshold theory and a procedure for resolving them

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1996-01-01

    It has been shown that lung cancer rates in U.S. counties, with or without correction for smoking, decrease with increasing radon exposure, in sharp contrast to the increase predicted by the linear-no-threshold (LNT) theory. The discrepancy is by 20 standard deviations, and very extensive efforts to explain it were not successful. Unless a plausible explanation for this discrepancy (or conflicting evidence) can be found, continued use of the LNT theory is a violation of "the scientific method." Nevertheless, LNT continues to be accepted and used by all official and governmental organizations, such as the International Commission on Radiological Protection, the National Council on Radiation Protection and Measurements, the National Academy of Sciences' Board on Radiation Effects Research, the U.S. Nuclear Regulatory Commission, the Environmental Protection Agency, etc., and there has been no move by any of these bodies to discontinue or limit its use. Assuming that they rely on the scientific method, this clearly implies that they have a plausible explanation for the discrepancy. The author has made great efforts to discover these 'plausible explanations' through inquiries via various channels, and the purpose of this paper is to describe and discuss them.

  20. Optical properties of CsI single crystals irradiated with neutrons at low temperature

    International Nuclear Information System (INIS)

    Okada, M.; Atobe, K.; Itatani, N.; Ozawa, K.

    1998-01-01

    Optical properties of the irradiation-induced defects in neutron-irradiated CsI single crystals have been investigated. The nominally pure CsI crystals are irradiated by reactor fast neutrons (E > 0.1 MeV) with a fluence of 1.4 x 10^15 n/cm^2 at 20 K and by γ-rays from a ^60Co source to a dose of 1.5 x 10^4 Gy at liquid nitrogen temperature (LNT). After the irradiations, isochronal annealings are performed to investigate the thermal behavior of the defects. The glow peaks of the thermoluminescence (TL) in each sample irradiated with neutrons at 20 K and with γ-rays at LNT are observed at about 100, 160 and 220 K. In the neutron-irradiated samples at 20 K, the emission band at 338 nm is observed at LNT. It is supposed that this emission band is excited by γ-rays from ^134Cs, which is radioactivated by thermal neutrons among the reactor radiations. It is confirmed that the temperature dependence of the 338 nm band is similar to that of the emission band due to the self-trapped exciton which is introduced into the non-irradiated samples illuminated by higher-energy photons. (orig.)

  1. Optical properties of CsI single crystals irradiated with neutrons at low temperature

    Energy Technology Data Exchange (ETDEWEB)

    Okada, M. [Kyoto Univ., Kumatori, Osaka (Japan). Research Reactor Inst.; Nakagawa, M. [Faculty of Education, Kagawa Univ., Takamatsu, Kagawa (Japan); Atobe, K. [Faculty of Science, Naruto Univ. of Education, Naruto, Tokushima (Japan); Itatani, N.; Ozawa, K. [Horiba Ltd., Minamiku, Kyoto (Japan)

    1998-05-01

    Optical properties of the irradiation-induced defects in neutron-irradiated CsI single crystals have been investigated. The nominally pure CsI crystals are irradiated by reactor fast neutrons (E > 0.1 MeV) with a fluence of 1.4 x 10^15 n/cm^2 at 20 K and by γ-rays from a ^60Co source to a dose of 1.5 x 10^4 Gy at liquid nitrogen temperature (LNT). After the irradiations, isochronal annealings are performed to investigate the thermal behavior of the defects. The glow peaks of the thermoluminescence (TL) in each sample irradiated with neutrons at 20 K and with γ-rays at LNT are observed at about 100, 160 and 220 K. In the neutron-irradiated samples at 20 K, the emission band at 338 nm is observed at LNT. It is supposed that this emission band is excited by γ-rays from ^134Cs, which is radioactivated by thermal neutrons among the reactor radiations. It is confirmed that the temperature dependence of the 338 nm band is similar to that of the emission band due to the self-trapped exciton which is introduced into the non-irradiated samples illuminated by higher-energy photons. (orig.) 13 refs.

  2. Damage formation and annealing in InP due to swift heavy ions

    International Nuclear Information System (INIS)

    Kamarou, A.; Wesch, W.; Wendler, E.; Klaumuenzer, S.

    2004-01-01

    Virgin and pre-damaged InP samples were irradiated at room temperature (RT) and at liquid nitrogen temperature (LNT) with different fluences of 140 MeV Kr, 390 MeV Xe and 600 MeV Au ions. The pre-damaging was performed with 600 keV Ge ions at LNT to obtain different damage levels. The samples were analysed by means of Rutherford backscattering spectrometry (RBS) in random and channelling geometry. A relatively weak damage accumulation in virgin InP and a very significant defect annealing in pre-damaged InP occurs due to 140 MeV Kr irradiation. The damaging of virgin InP with 390 MeV Xe and 600 MeV Au is much more efficient in comparison with that of 140 MeV Kr. Further, annealing of the pre-damaged InP due to 390 MeV Xe irradiation is hardly visible. At LNT InP appears to be much more radiation-resistant to swift heavy ion (SHI) irradiation than at RT. Our results show that during SHI irradiation of InP both damage formation and damage annealing occur simultaneously. Whether the first or the second one plays a more important role depends on the SHI mass and energy

  3. Numerical modelling of local deposition patterns, activity distributions and cellular hit probabilities of inhaled radon progenies in human airways

    International Nuclear Information System (INIS)

    Farkas, A.; Balashazy, I.; Szoeke, I.

    2003-01-01

    The general objective of our research is modelling the biophysical processes of the effects of inhaled radon progenies. This effort is related to the rejection or support of the linear no-threshold (LNT) dose-effect hypothesis, which seems to be one of the most challenging tasks of current radiation protection. Our approximation and results may also serve as a useful tool for lung cancer models. In this study, deposition patterns, activity distributions and alpha-hit probabilities of inhaled radon progenies in the large airways of the human tracheobronchial tree are computed. The airflow fields and related particle deposition patterns strongly depend on the shape of the airway geometry and the breathing pattern. Computed deposition patterns of attached and unattached radon progenies are strongly inhomogeneous, creating hot spots at the carinal regions and downstream of the inner sides of the daughter airways. The results suggest that in the vicinity of the carinal regions the multiple-hit probabilities are quite high even at low average doses and increase exponentially in the low-dose range. Thus, even the so-called low doses may present high doses for large clusters of cells. The cell transformation probabilities are much higher in these regions, and this phenomenon cannot be modeled with average burdens. (authors)
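    The multiple-hit probabilities discussed above are commonly computed by treating the number of alpha-particle traversals per cell nucleus as Poisson-distributed; that assumption belongs to this sketch, not to the paper. At low mean hit numbers, P(at least 2 hits) grows roughly quadratically with the mean, illustrating the strongly non-linear low-dose behavior:

```python
import math

def multi_hit_prob(mean_hits, k=2):
    """P(at least k hits) when the hit number per cell is Poisson(mean_hits)."""
    return 1.0 - sum(math.exp(-mean_hits) * mean_hits**i / math.factorial(i)
                     for i in range(k))

# Doubling a small mean hit number nearly quadruples the two-hit probability.
p_low, p_double = multi_hit_prob(0.05), multi_hit_prob(0.10)
```

For small means, P(>=2) ≈ mean²/2, so hot spots with locally elevated deposition see disproportionately many multi-hit cells even when the average dose is low.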

  4. Possible roles of Peccei-Quinn symmetry in an effective low energy model

    Science.gov (United States)

    Suematsu, Daijiro

    2017-12-01

    The strong CP problem is known to be solved by imposing Peccei-Quinn (PQ) symmetry. However, the domain wall problem caused by the spontaneous breaking of its remnant discrete subgroup could make models invalid in many cases. We propose a model in which the PQ charge is assigned to quarks so as to escape this problem without introducing any extra colored fermions. In the low energy effective model resulting after the PQ symmetry breaking, both the quark mass hierarchy and the CKM mixing could be explained through the Froggatt-Nielsen mechanism. If the model is combined with a lepton sector supplemented by an inert doublet scalar and right-handed neutrinos, the effective model reduces to the scotogenic neutrino mass model, in which both the origin of neutrino masses and dark matter are closely related. The strong CP problem could thus be related to the quark mass hierarchy, neutrino masses, and dark matter through the PQ symmetry.

  5. A more flexible lipoprotein sorting pathway.

    Science.gov (United States)

    Chahales, Peter; Thanassi, David G

    2015-05-01

    Lipoprotein biogenesis in Gram-negative bacteria occurs by a conserved pathway, each step of which is considered essential. In contrast to this model, LoVullo and colleagues demonstrate that the N-acyl transferase Lnt is not required in Francisella tularensis or Neisseria gonorrhoeae. This suggests the existence of a more flexible lipoprotein pathway, likely due to a modified Lol transporter complex, and raises the possibility that pathogens may regulate lipoprotein processing to modulate interactions with the host. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  6. Hormesis: Fact or fiction?

    International Nuclear Information System (INIS)

    Holzman, D.

    1995-01-01

    Bernard Cohen had not intended to foment revolution. To be sure, he had hoped that the linear, no-threshold (LNT) model of ionizing radiation's effects on humans would prove to be an exaggeration of reality at the low levels of radiation that one can measure in humans throughout the United States. His surprising conclusion, however, was that within the low dose ranges of radiation one receives in the home, the higher the dose, the less chance one had of contracting lung cancer. 1 fig., 1 tab

  7. A study of finite mixture model: Bayesian approach on financial time series data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions used in modeling a statistical distribution, while the Bayesian method is a statistical approach used to fit the mixture model. The Bayesian method is widely used because its asymptotic properties provide remarkable results. In addition, the Bayesian method shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is selected using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to invalid results. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative relationship between rubber prices and stock market prices for all selected countries.
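    Component selection by the Bayesian Information Criterion, as described above, can be sketched with scikit-learn's GaussianMixture. The two-component data below are synthetic illustrations, not the rubber-price series of the paper:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic two-component data standing in for the financial series
x = np.concatenate([rng.normal(-2, 0.5, 300),
                    rng.normal(2, 0.5, 300)]).reshape(-1, 1)

# Fit k = 1..4 components and keep the k with the lowest BIC
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(x).bic(x)
        for k in range(1, 5)}
best_k = min(bics, key=bics.get)
```

BIC penalizes extra components by (number of parameters) x ln(n), so it tends to pick the smallest mixture that explains the data, which is exactly the guard against the invalid results mentioned above.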

  8. Non-linearity of dose-effect relationship at low level exposure on the example of cytogenetic effects in plant cells

    International Nuclear Information System (INIS)

    Oudalova, A.A.; Geras'kin, S.A.; Dikarev, V.G.; Dikareva, N.S.; Chernonog, E.V.

    2007-01-01

    Complete text of publication follows. There is increasing concern in the scientific community and among the public about the need to protect the environment in order to maintain ecosystem sustainability and the future well-being of man. The linear non-threshold (LNT) hypothesis, the most officially acknowledged concept of the biological effect of radiation, fails to explain many facts on effects at low level exposures (LLE) accumulated lately. Available information on the dose-effect relationship at low doses is scarce and incomplete for non-human species, despite the fact that, under conditions of increased radiation exposure, some biota species are at risk of higher impact than humans because of differences in the ecological niches they occupy. Dose-effect relationships for cytogenetic damage in the range of LLE were studied in a series of experiments with plant (Hordeum vulgare L.) meristem cells. The dose-effect dependences obtained show clearly non-linear behavior in the LLE region. A piecewise linear model (PLM) for the dose-cytogenetic effect relationship, which assumes the existence of a dose-independent part at LLE (a 'plateau'), is developed and fitted to the data obtained. An advantage of the PLM over the linear model in approximating the frequency of cytogenetic disturbances is demonstrated. From an empirical probability distribution analysis, it is shown that the increase in cytogenetic damage level is tightly connected with changes in the process of absorbed energy distribution between target volumes, in terms of the fraction of cells experiencing a radiation hit event. The appropriateness of the LNT hypothesis for describing the yield of cytogenetic disturbances in plant meristem cells in the LLE region is discussed. The results support a conclusion about an indirect mechanism of mutagenesis induced by low doses. The new data obtained concern the understanding of fundamental mechanisms governing cell response to LLE. These findings are of general biological interest, since

  9. Principles and interest of GOF tests for multistate capture-recapture models

    Directory of Open Access Journals (Sweden)

    Pradel, R.

    2005-12-01

    Full Text Available Optimal goodness–of–fit procedures for multistate models are new. Drawing a parallel with the corresponding single–state procedures, we present their singularities and show how the overall test can be decomposed into interpretable components. All theoretical developments are illustrated with an application to the now classical study of movements of Canada geese between wintering sites. Through this application, we exemplify how the interpretable components give insight into the data, leading eventually to the choice of an appropriate general model but also sometimes to the invalidation of the multistate models as a whole. The method for computing a corrective overdispersion factor is then mentioned. We also take the opportunity to try to demystify some statistical notions like that of Minimal Sufficient Statistics by introducing them intuitively. We conclude that these tests should be considered an important part of the analysis itself, contributing in ways that the parametric modelling cannot always do to the understanding of the data.

  10. Spatio-temporal precipitation climatology over complex terrain using a censored additive regression model.

    Science.gov (United States)

    Stauffer, Reto; Mayr, Georg J; Messner, Jakob W; Umlauf, Nikolaus; Zeileis, Achim

    2017-06-15

    Flexible spatio-temporal models are widely used to create reliable and accurate estimates for precipitation climatologies. Most models are based on square-root-transformed monthly or annual means, for which a normal distribution seems to be appropriate. This assumption becomes invalid on a daily time scale, as the observations involve a large fraction of zeros and are limited to non-negative values. We develop a novel spatio-temporal model to estimate the full climatological distribution of precipitation on a daily time scale over complex terrain using a left-censored normal distribution. The results demonstrate that the new method is able to account for the non-normal distribution and the large fraction of zero observations. The new climatology provides the full climatological distribution at a very high spatial and temporal resolution, and is competitive with, or even outperforms, existing methods, even for arbitrary locations.
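    The left-censored normal likelihood underlying such a model can be sketched as follows: zero observations contribute the probability mass below the censoring point, positive observations the usual density. This is a generic maximum-likelihood illustration on synthetic data, not the authors' spatio-temporal implementation:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def censored_nll(params, y, left=0.0):
    """Negative log-likelihood of a left-censored normal: observations at
    the censoring point contribute P(latent <= left), the rest the density."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)           # parameterize sigma > 0
    cens = y <= left
    ll = norm.logcdf(left, mu, sigma) * cens.sum()
    ll += norm.logpdf(y[~cens], mu, sigma).sum()
    return -ll

rng = np.random.default_rng(1)
latent = rng.normal(0.5, 1.0, 5000)     # latent "precipitation potential"
y = np.maximum(latent, 0.0)             # observed: zeros where latent <= 0

res = minimize(censored_nll, x0=[0.0, 0.0], args=(y,))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Fitting the censored likelihood recovers the latent mean and spread despite roughly a third of the observations being exact zeros, which a plain normal fit would badly distort.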

  11. Inefficient Metabolism of the Human Milk Oligosaccharides Lacto-N-tetraose and Lacto-N-neotetraose Shifts Bifidobacterium longum subsp. infantis Physiology

    Directory of Open Access Journals (Sweden)

    Ezgi Özcan

    2018-05-01

    Full Text Available Human milk contains a high concentration of indigestible oligosaccharides, which likely mediated the coevolution of the nursing infant with its gut microbiome. Specifically, Bifidobacterium longum subsp. infantis (B. infantis) often colonizes the infant gut and utilizes these human milk oligosaccharides (HMOs) to enrich its abundance. In this study, the physiology and mechanisms underlying B. infantis utilization of two HMO isomers, lacto-N-tetraose (LNT) and lacto-N-neotetraose (LNnT), were investigated, in addition to their carbohydrate constituents. Both LNT and LNnT utilization induced a significant shift in the ratio of secreted acetate to lactate (1.7-2.0), in contrast to the catabolism of their component carbohydrates (~1.5). Inefficient metabolism of LNnT prompts B. infantis to shunt carbon toward formic acid and ethanol secretion. The global transcriptome reveals genomic features differentially expressed to catabolize these two HMO species, which vary by a single glycosidic linkage. Furthermore, a measure of strain-level variation exists between B. infantis isolates. Regardless of strain, inefficient HMO metabolism induces the metabolic shift toward formic acid and ethanol production. Furthermore, bifidobacterial metabolites reduced LPS-induced inflammation in a cell culture model. Thus, differential metabolism of milk glycans potentially drives the emergent physiology of host-microbial interactions to impact infant health.

  12. Honoring Identity Through Mealtimes in Chinese Canadian Immigrants.

    Science.gov (United States)

    Lam, Ivy T Y; Keller, Heather H

    2015-11-01

    Mealtimes are opportunities for social interactions and expressions of individual and family identity, and serve as a microcosm of the broader lives of families living with dementia. The Eating Together study and its resulting Life Nourishment Theory (LNT) explicated the importance of mealtimes for honouring individual and family identities in the context of dementia. This sub-study examined a specific ethnocultural group with cultural food-ways and caring expectations, to determine if the concept of honouring identity needed to be modified or extended. Using active interview techniques, two Cantonese speaking researchers completed dyad/triad family and individual interviews with six Chinese Canadian immigrant families, recruited from two service providers in a large, urban, multicultural city. This sub-study provided insight into the challenges and rewards of mealtimes for Chinese immigrant families with dementia in the community and specifically provided further insights into the honouring identity concept. Although LNT and specifically the honouring identity concept was generally confirmed in this group, some culturally-specific themes were also identified. This work serves as a basis for future studies examining the meaning and experience of mealtimes in specific cultural groups living with dementia. Such work would confirm if the LNT can be applied to specific ethnocultural groups as well as the general population living with dementia. © The Author(s) 2012.

  13. Dynamic Computation of Change Operations in Version Management of Business Process Models

    Science.gov (United States)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable to the model, and dependencies and conflicts of change operations must be taken into account, because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations whose parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account during change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.

  14. Bifidobacterium breve UCC2003 metabolises the human milk oligosaccharides lacto-N-tetraose and lacto-N-neo-tetraose through overlapping, yet distinct pathways

    Science.gov (United States)

    James, Kieran; Motherway, Mary O’Connell; Bottacini, Francesca; van Sinderen, Douwe

    2016-01-01

    In this study, we demonstrate that the prototype B. breve strain UCC2003 possesses specific metabolic pathways for the utilisation of lacto-N-tetraose (LNT) and lacto-N-neotetraose (LNnT), which represent the central moieties of Type I and Type II human milk oligosaccharides (HMOs), respectively. Using a combination of experimental approaches, the enzymatic machinery involved in the metabolism of LNT and LNnT was identified and characterised. Homologs of the key genetic loci involved in the utilisation of these HMO substrates were identified in B. breve, B. bifidum, B. longum subsp. infantis and B. longum subsp. longum using bioinformatic analyses, and were shown to be variably present among other members of the Bifidobacterium genus, with a distinct pattern of conservation among human-associated bifidobacterial species. PMID:27929046

  15. Thermoluminescence and recovery processes in pure and doped NaCl after 20 K irradiation

    International Nuclear Information System (INIS)

    Lopez, F.J.; Aguilar, M.; Jaque, F.; Agullo-Lopez, F.

    1980-01-01

    The thermoluminescence (TL) spectra after X-ray irradiation at 20 K have been investigated for pure as well as divalent cation doped NaCl. The F-centre decay has also been determined in pure and Ca and Mg doped NaCl for comparison purposes. A clear decrease in F-centre concentration appears to correlate with glow peaks at 44 and 50 K for pure and Ca-doped samples. The main glow peak, appearing at 69 K, is not associated with any appreciable F-centre decay step. Below liquid nitrogen temperature (LNT) all peaks show both σ and π exciton emission bands. Above LNT, the glow peaks for doped samples show the σ emission together with another band at 410 nm, whereas pure samples still present the intrinsic emission bands. (author)

  16. Stochastic population oscillations in spatial predator-prey models

    International Nuclear Information System (INIS)

    Taeuber, Uwe C

    2011-01-01

    It is well-established that including spatial structure and stochastic noise in models for predator-prey interactions invalidates the classical deterministic Lotka-Volterra picture of neutral population cycles. In contrast, stochastic models yield long-lived, but ultimately decaying erratic population oscillations, which can be understood through a resonant amplification mechanism for density fluctuations. In Monte Carlo simulations of spatial stochastic predator-prey systems, one observes striking complex spatio-temporal structures. These spreading activity fronts induce persistent correlations between predators and prey. In the presence of local particle density restrictions (finite prey carrying capacity), there exists an extinction threshold for the predator population. The accompanying continuous non-equilibrium phase transition is governed by the directed-percolation universality class. We employ field-theoretic methods based on the Doi-Peliti representation of the master equation for stochastic particle interaction models to (i) map the ensuing action in the vicinity of the absorbing state phase transition to Reggeon field theory, and (ii) to quantitatively address fluctuation-induced renormalizations of the population oscillation frequency, damping, and diffusion coefficients in the species coexistence phase.
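    The erratic, ultimately decaying oscillations described above can be illustrated with a minimal non-spatial stochastic simulation of the underlying Lotka-Volterra reactions. This is a Gillespie-algorithm sketch only: the lattice structure, carrying-capacity restriction, and parameter values of the actual study are not reproduced, and all numbers here are illustrative.

```python
import random

def gillespie_lv(prey0, pred0, sigma, mu, lam, t_max, seed=1):
    """Stochastic Lotka-Volterra via the Gillespie algorithm.
    Reactions: prey birth B -> 2B (rate sigma*B), predator death
    A -> 0 (rate mu*A), predation A + B -> 2A (rate lam*A*B)."""
    random.seed(seed)
    t, prey, pred = 0.0, prey0, pred0
    traj = [(t, prey, pred)]
    while t < t_max and (prey > 0 or pred > 0):
        rates = [sigma * prey, mu * pred, lam * prey * pred]
        total = sum(rates)
        if total == 0:
            break  # absorbing state: nothing can happen any more
        t += random.expovariate(total)   # time to next reaction
        r = random.uniform(0, total)     # pick which reaction fires
        if r < rates[0]:
            prey += 1                    # prey reproduction
        elif r < rates[0] + rates[1]:
            pred -= 1                    # predator death
        else:
            prey -= 1                    # predation event
            pred += 1
        traj.append((t, prey, pred))
    return traj

# noisy population cycles rather than the neutral deterministic ones
traj = gillespie_lv(prey0=200, pred0=100, sigma=1.0, mu=1.0,
                    lam=0.005, t_max=20.0)
```

Plotting the prey and predator columns of `traj` against time shows noisy oscillations whose amplitude drifts, in contrast to the closed deterministic cycles.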

  17. Paradigm lost, paradigm found: The re-emergence of hormesis as a fundamental dose response model in the toxicological sciences

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2005-01-01

    This paper provides an assessment of the toxicological basis of the hormetic dose-response relationship including issues relating to its reproducibility, frequency, and generalizability across biological models, endpoints measured and chemical class/physical stressors and implications for risk assessment. The quantitative features of the hormetic dose response are described and placed within toxicological context that considers study design, temporal assessment, mechanism, and experimental model/population heterogeneity. Particular emphasis is placed on an historical evaluation of why the field of toxicology rejected hormesis in favor of dose response models such as the threshold model for assessing non-carcinogens and linear no threshold (LNT) models for assessing carcinogens. The paper argues that such decisions were principally based on complex historical factors that emerged from the intense and protracted conflict between what is now called traditional medicine and homeopathy and the overly dominating influence of regulatory agencies on the toxicological intellectual agenda. Such regulatory agency influence emphasized hazard/risk assessment goals such as the derivation of no observed adverse effect levels (NOAELs) and the lowest observed adverse effect levels (LOAELs) which were derived principally from high dose studies using few doses, a feature which restricted perceptions and distorted judgments of several generations of toxicologists concerning the nature of the dose-response continuum. Such historical and technical blind spots led the field of toxicology to not only reject an established dose-response model (hormesis), but also the model that was more common and fundamental than those that the field accepted. - The quantitative features of the hormetic dose/response are described and placed within the context of toxicology

  18. Paradigm lost, paradigm found: The re-emergence of hormesis as a fundamental dose response model in the toxicological sciences

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J. [Environmental Health Sciences, School of Public Health, Morrill I, N344, University of Massachusetts, Amherst, MA 01003 (United States)]. E-mail: edwardc@schoolph.umass.edu

    2005-12-15

    This paper provides an assessment of the toxicological basis of the hormetic dose-response relationship including issues relating to its reproducibility, frequency, and generalizability across biological models, endpoints measured and chemical class/physical stressors and implications for risk assessment. The quantitative features of the hormetic dose response are described and placed within toxicological context that considers study design, temporal assessment, mechanism, and experimental model/population heterogeneity. Particular emphasis is placed on an historical evaluation of why the field of toxicology rejected hormesis in favor of dose response models such as the threshold model for assessing non-carcinogens and linear no threshold (LNT) models for assessing carcinogens. The paper argues that such decisions were principally based on complex historical factors that emerged from the intense and protracted conflict between what is now called traditional medicine and homeopathy and the overly dominating influence of regulatory agencies on the toxicological intellectual agenda. Such regulatory agency influence emphasized hazard/risk assessment goals such as the derivation of no observed adverse effect levels (NOAELs) and the lowest observed adverse effect levels (LOAELs) which were derived principally from high dose studies using few doses, a feature which restricted perceptions and distorted judgments of several generations of toxicologists concerning the nature of the dose-response continuum. Such historical and technical blind spots led the field of toxicology to not only reject an established dose-response model (hormesis), but also the model that was more common and fundamental than those that the field accepted. - The quantitative features of the hormetic dose/response are described and placed within the context of toxicology.

  19. Adaptive Modeling of the International Space Station Electrical Power System

    Science.gov (United States)

    Thomas, Justin Ray

    2007-01-01

    Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.

  20. A Novel OBDD-Based Reliability Evaluation Algorithm for Wireless Sensor Networks on the Multicast Model

    Directory of Open Access Journals (Sweden)

    Zongshuai Yan

    2015-01-01

    Full Text Available The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. The reliability calculation of WSNs on the multicast model provides an even worse combinatorial explosion of node states with respect to the calculation of WSNs on the unicast model; many real WSNs require the multicast model to deliver information. This research first provides a formal definition for the WSN on the multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs on the multicast model. Furthermore, our research on OBDD_Multicast construction avoids the problem of invalid expansion, which reduces the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that OBDD_Multicast both reduces the complexity of the WSN reliability analysis and has a lower running time than Xing's OBDD-based (ordered binary decision diagram) algorithm.
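    For intuition on why two-terminal reliability is #P-hard and why symbolic OBDD methods are needed, here is a brute-force sketch (not the OBDD_Multicast algorithm) that enumerates every combination of working/failed edges; its cost is exponential in the number of edges. The edge list and probability are hypothetical.

```python
from itertools import product

def two_terminal_reliability(edges, p, s, t):
    """Exact two-terminal reliability by enumerating all edge states.
    edges: list of (u, v) pairs; p: probability each edge works.
    Exponential in len(edges), which is what OBDD methods avoid."""
    reliability = 0.0
    for states in product([True, False], repeat=len(edges)):
        prob = 1.0
        alive = []
        for works, e in zip(states, edges):
            prob *= p if works else (1 - p)
            if works:
                alive.append(e)
        # graph search over working edges to test s-t connectivity
        seen, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for a, b in alive:
                for x, y in ((a, b), (b, a)):
                    if x == u and y not in seen:
                        seen.add(y)
                        frontier.append(y)
        if t in seen:
            reliability += prob
    return reliability

# series network s-1-t: both edges must work, 0.9 * 0.9 = 0.81
print(two_terminal_reliability([(0, 1), (1, 2)], 0.9, 0, 2))
```

For a two-edge parallel network with p = 0.5 the same routine gives 1 - 0.25 = 0.75, matching the inclusion-exclusion result.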

  1. Accuracy and user-acceptability of HIV self-testing using an oral fluid-based HIV rapid test.

    Directory of Open Access Journals (Sweden)

    Oon Tek Ng

    Full Text Available BACKGROUND: The United States FDA approved an over-the-counter HIV self-test to facilitate increased HIV testing and earlier linkage to care. We assessed the accuracy of self-testing by untrained participants compared to healthcare worker (HCW) testing, participants' ability to interpret sample results, and user-acceptability of self-tests in Singapore. METHODOLOGY/PRINCIPAL FINDINGS: A cross-sectional study involving 200 known HIV-positive patients and 794 at-risk participants of unknown HIV status was conducted. Participants (all without prior self-test experience) performed self-testing guided solely by visual instructions, followed by HCW testing, both using the OraQuick ADVANCE Rapid HIV 1/2 Antibody Test, with both results interpreted by the HCW. To assess ability to interpret results, participants were provided 3 sample results (positive, negative, and invalid) to interpret. Of 192 participants who tested positive on HCW testing, self-testing was positive in 186 (96.9%), negative in 5 (2.6%), and invalid in 1 (0.5%). Of 794 participants who tested negative on HCW testing, self-testing was negative in 791 (99.6%), positive in 1 (0.1%), and invalid in 2 (0.3%). Excluding invalid tests, self-testing had a sensitivity of 97.4% (95% CI 95.1% to 99.7%) and a specificity of 99.9% (95% CI 99.6% to 100%). When interpreting sample results, 96%, 93.1% and 95.2% correctly read the positive, negative and invalid results, respectively. There were no significant demographic predictors for false-negative self-testing or for wrongly interpreting positive or invalid sample results as negative. Eighty-seven percent would purchase the kit over-the-counter; 89% preferred to take HIV tests in private. 72.5% and 74.9% felt the need for pre- and post-test counseling, respectively. Only 28% would pay at least USD15 for the test. CONCLUSIONS/SIGNIFICANCE: Self-testing was associated with high specificity, and a small but significant number of false negatives.
Incorrectly identifying model results as
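    The reported sensitivity and specificity follow directly from the counts in the abstract (invalid self-tests excluded); a quick arithmetic check:

```python
# Counts reported above, with invalid self-tests excluded
tp, fn = 186, 5    # HIV-positive on HCW testing: self-test positive / negative
tn, fp = 791, 1    # HIV-negative on HCW testing: self-test negative / positive

sensitivity = tp / (tp + fn)   # 186 / 191
specificity = tn / (tn + fp)   # 791 / 792

print(round(100 * sensitivity, 1))  # 97.4
print(round(100 * specificity, 1))  # 99.9
```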

  2. Is the Bifactor Model a Better Model or Is It Just Better at Modeling Implausible Responses? Application of Iteratively Reweighted Least Squares to the Rosenberg Self-Esteem Scale.

    Science.gov (United States)

    Reise, Steven P; Kim, Dale S; Mansolf, Maxwell; Widaman, Keith F

    2016-01-01

    Although the structure of the Rosenberg Self-Esteem Scale (RSES) has been exhaustively evaluated, questions regarding dimensionality and direction of wording effects continue to be debated. To shed new light on these issues, we ask (a) for what percentage of individuals is a unidimensional model adequate, (b) what additional percentage of individuals can be modeled with multidimensional specifications, and (c) what percentage of individuals respond so inconsistently that they cannot be well modeled? To estimate these percentages, we applied iteratively reweighted least squares (IRLS) to examine the structure of the RSES in a large, publicly available data set. A distance measure, d_s, reflecting a distance between a response pattern and an estimated model, was used for case weighting. We found that a bifactor model provided the best overall model fit, with one general factor and two wording-related group factors. However, on the basis of d_r values, a distance measure based on individual residuals, we concluded that approximately 86% of cases were adequately modeled through a unidimensional structure, and only an additional 3% required a bifactor model. Roughly 11% of cases were judged as "unmodelable" due to their significant residuals in all models considered. Finally, analysis of d_s revealed that some, but not all, of the superior fit of the bifactor model is owed to that model's ability to better accommodate implausible and possibly invalid response patterns, and not necessarily because it better accounts for the effects of direction of wording.
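    The case-weighting idea behind IRLS, down-weighting cases that sit far from the fitted model, can be sketched in miniature with a robust location estimate. This is an illustrative analogue only, not the authors' d_s-based procedure, and the data values are hypothetical.

```python
def irls_location(xs, iters=100, delta=1e-8):
    """Robust location via iteratively reweighted least squares:
    each case is weighted by the inverse of its distance to the
    current estimate, so poorly fitting (distant) cases contribute
    less at every refit -- case weighting in miniature."""
    m = sum(xs) / len(xs)            # start at the ordinary mean
    for _ in range(iters):
        # weight = 1 / distance, guarded against division by zero
        w = [1.0 / max(abs(x - m), delta) for x in xs]
        m = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return m

scores = [9.8, 10.2, 10.0, 9.9, 10.1, 50.0]   # one implausible case
print(sum(scores) / len(scores))  # plain mean, pulled up toward ~16.7
print(irls_location(scores))      # robust estimate stays near 10
```

The plain mean is dragged by the implausible case; the reweighted estimate effectively ignores it, which is the same mechanism that lets an IRLS fit separate well-modeled from "unmodelable" response patterns.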

  3. ORIGINAL ARTICLES

    African Journals Online (AJOL)

    2000-06-27

    Jun 27, 2000 ... Batanero E, Villalba M, Monsalve RI, Rodriguez R. Cross-reactivity between the major ... pollen and vegetable foods. Int Arch Allergy Immunol 1992; 98: 97-104. ... Allergy and Clinical Immunology International 1994; 6: 80-.

  4. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of a system's correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumptions, which in turn may invalidate the correctness proof. The goal of this paper is to show how combining

  5. Crisis Decision Making Through a Shared Integrative Negotiation Mental Model

    NARCIS (Netherlands)

    Van Santen, W.; Jonker, C.M.; Wijngaards, N.

    2009-01-01

    Decision making during crises takes place in (multi-agency) teams, in a bureaucratic political context. As a result, the common notion that during crises decision making should be done in line with a Command & Control structure is invalid. This paper shows that the best way for crisis decision

  6. Application of dynamic slip wall modeling to a turbine nozzle guide vane

    Science.gov (United States)

    Bose, Sanjeeb; Talnikar, Chaitanya; Blonigan, Patrick; Wang, Qiqi

    2015-11-01

    Resolution of near-wall turbulent structures is computationally prohibitive, necessitating wall-modeled large-eddy simulation approaches. Standard wall models are often based on assumptions of equilibrium boundary layers, which do not necessarily account for the dissimilarity of the momentum and thermal boundary layers. We investigate the use of the dynamic slip wall boundary condition (Bose and Moin, 2014) for the prediction of surface heat transfer on a turbine nozzle guide vane (Arts and de Rouvroit, 1992). The heat transfer coefficient is well predicted by the slip wall model, including capturing the transition to turbulence. The sensitivity of the heat transfer coefficient to the incident turbulence intensity will additionally be discussed. Lastly, the behavior of the thermal and momentum slip lengths will be contrasted between regions where the strong Reynolds analogy is invalid (near transition on the suction side) and an isothermal, zero pressure gradient flat plate boundary layer (Wu and Moin, 2010).

  7. Use of nonlinear dose-effect models to predict consequences

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1996-01-01

    The linear dose-effect relationship was introduced as a model for the induction of cancer from exposure to nuclear radiation. Subsequently, it has been used by analogy to assess the risk of chemical carcinogens also. Recently, however, the model for radiation carcinogenesis has come increasingly under attack because its calculations contradict the epidemiological data, such as cancer in atomic bomb survivors. Even so, its proponents vigorously defend it, often using arguments that are not so much scientific as a mix of scientific, societal, and often political arguments. At least in part, the resilience of the linear model is due to two convenient properties that are exclusive to linearity: First, the risk of an event is determined solely by the event dose; second, the total risk of a population group depends only on the total population dose. In reality, the linear model has been conclusively falsified; i.e., it has been shown to make wrong predictions, and once this fact is generally realized, the scientific method calls for a new paradigm model. As all alternative models are by necessity nonlinear, all the convenient properties of the linear model are invalid, and calculational procedures have to be used that are appropriate for nonlinear models
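    The two "convenient properties" exclusive to linearity can be checked numerically: under a linear model the total population risk depends only on the total population dose, while any nonlinear model breaks that property. A quadratic is used here purely for illustration, with an arbitrary coefficient.

```python
def linear_risk(dose, k=0.001):      # k is an arbitrary illustrative coefficient
    return k * dose

def quadratic_risk(dose, k=0.001):   # one possible nonlinear alternative
    return k * dose ** 2

# Two distributions of the same total population dose (100 units)
concentrated = [100, 0, 0, 0]
spread = [25, 25, 25, 25]

lin_c = sum(linear_risk(d) for d in concentrated)
lin_s = sum(linear_risk(d) for d in spread)       # equals lin_c: under
                                                  # linearity only the total
                                                  # population dose matters
quad_c = sum(quadratic_risk(d) for d in concentrated)  # ~10.0
quad_s = sum(quadratic_risk(d) for d in spread)        # ~2.5: same total dose,
                                                       # different total risk
```

Once linearity is dropped, how a given collective dose is distributed over individuals changes the total risk, which is why nonlinear models require the different calculational procedures the abstract refers to.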

  8. Controlled synthesis and photocatalytic investigation of different-shaped one-dimensional titanic acid nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Li, Qiuye [State Key Laboratory for Oxo Synthesis and Selective Oxidation, Lanzhou Institute of Chemical Physics, Chinese Academy of Sciences, Lanzhou 730000 (China); The Graduate School of the Chinese Academy of Sciences, Chinese Academy of Sciences, Beijing 10080 (China); Key Laboratory of Special Functional Materials, Henan University, KaiFeng 475001 (China); Lu, Gongxuan [State Key Laboratory for Oxo Synthesis and Selective Oxidation, Lanzhou Institute of Chemical Physics, Chinese Academy of Sciences, Lanzhou 730000 (China)

    2008-10-15

    Different-shaped one-dimensional (1D) titanic acid nanomaterials (TANs) were prepared by hydrothermal synthesis. By changing the reaction temperature (120, 170 and 200 °C), three kinds of 1D TAN, short-nanotubes (SNT), long-nanotubes (LNT), and nanorods (NR), were obtained. The obtained TANs were characterized by transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), powder X-ray diffraction (XRD), and solid-state diffuse reflectance UV-vis spectra (UV-vis DRS) techniques. Based on these 1D TAN, Eosin Y-sensitized Pt-loaded TAN were prepared by the in situ impregnation and photo-reduction method. Their photocatalytic activity for hydrogen generation was evaluated in triethanolamine (TEOA) aqueous solution under visible light irradiation (λ ≥ 420 nm). The results indicated that the morphology difference led to a significant variation of photocatalytic performance for hydrogen generation, with the activity order as follows: Eosin Y-sensitized Pt-loaded LNT > Eosin Y-sensitized Pt-loaded NR > Eosin Y-sensitized Pt-loaded SNT. The experimental conditions for photocatalytic hydrogen generation such as Pt loading content, the mass ratio of Eosin Y to TAN, and so on, were optimized. As a result, the highest apparent quantum yields of hydrogen generation for Eosin Y-sensitized Pt-loaded SNT, LNT, and NR were 6.65, 17.36, and 15.04%, respectively. The stability of these photocatalysts and the reaction mechanism of the photocatalytic hydrogen generation are also discussed in detail. (author)

  9. Controlled synthesis and photocatalytic investigation of different-shaped one-dimensional titanic acid nanomaterials

    Science.gov (United States)

    Li, Qiuye; Lu, Gongxuan

    Different-shaped one-dimensional (1D) titanic acid nanomaterials (TANs) were prepared by hydrothermal synthesis. By changing the reaction temperature (120, 170 and 200 °C), three kinds of 1D TAN, short-nanotubes (SNT), long-nanotubes (LNT), and nanorods (NR), were obtained. The obtained TANs were characterized by transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), powder X-ray diffraction (XRD), and solid-state diffuse reflectance UV-vis spectra (UV-vis DRS) techniques. Based on these 1D TAN, Eosin Y-sensitized Pt-loaded TAN were prepared by the in situ impregnation and photo-reduction method. Their photocatalytic activity for hydrogen generation was evaluated in triethanolamine (TEOA) aqueous solution under visible light irradiation (λ ≥ 420 nm). The results indicated that the morphology difference led to a significant variation of photocatalytic performance for hydrogen generation, with the activity order as follows: Eosin Y-sensitized Pt-loaded LNT > Eosin Y-sensitized Pt-loaded NR > Eosin Y-sensitized Pt-loaded SNT. The experimental conditions for photocatalytic hydrogen generation such as Pt loading content, the mass ratio of Eosin Y to TAN, and so on, were optimized. As a result, the highest apparent quantum yields of hydrogen generation for Eosin Y-sensitized Pt-loaded SNT, LNT, and NR were 6.65, 17.36, and 15.04%, respectively. The stability of these photocatalysts and the reaction mechanism of the photocatalytic hydrogen generation are also discussed in detail.

  10. Epidemiology Without Biology: False Paradigms, Unfounded Assumptions, and Specious Statistics in Radiation Science (with Commentaries by Inge Schmitz-Feuerhake and Christopher Busby and a Reply by the Authors).

    Science.gov (United States)

    Sacks, Bill; Meyerson, Gregory; Siegel, Jeffry A

    Radiation science is dominated by a paradigm based on an assumption without empirical foundation. Known as the linear no-threshold (LNT) hypothesis, it holds that all ionizing radiation is harmful no matter how low the dose or dose rate. Epidemiological studies that claim to confirm LNT either neglect experimental and/or observational discoveries at the cellular, tissue, and organismal levels, or mention them only to distort or dismiss them. The appearance of validity in these studies rests on circular reasoning, cherry picking, faulty experimental design, and/or misleading inferences from weak statistical evidence. In contrast, studies based on biological discoveries demonstrate the reality of hormesis: the stimulation of biological responses that defend the organism against damage from environmental agents. Normal metabolic processes are far more damaging than all but the most extreme exposures to radiation. However, evolution has provided all extant plants and animals with defenses that repair such damage or remove the damaged cells, conferring on the organism even greater ability to defend against subsequent damage. Editors of medical journals now admit that perhaps half of the scientific literature may be untrue. Radiation science falls into that category. Belief in LNT informs the practice of radiology, radiation regulatory policies, and popular culture through the media. The result is mass radiophobia and harmful outcomes, including forced relocations of populations near nuclear power plant accidents, reluctance to avail oneself of needed medical imaging studies, and aversion to nuclear energy; all unwarranted and all harmful to millions of people.

  11. An abuse of risk assessment: how regulatory agencies improperly adopted LNT for cancer risk assessment.

    Science.gov (United States)

    Calabrese, Edward J

    2015-04-01

    The Genetics Panel of the National Academy of Sciences' Committee on Biological Effects of Atomic Radiation (BEAR) recommended the adoption of the linear dose-response model in 1956, abandoning the threshold dose-response for genetic risk assessments. This recommendation was quickly generalized to include somatic cells for cancer risk assessment and later was instrumental in the adoption of linearity for carcinogen risk assessment by the Environmental Protection Agency. The Genetics Panel failed to provide any scientific assessment to support this recommendation and refused to do so when later challenged by other leading scientists. Thus, the linearity model used in cancer risk assessment was based on ideology rather than science and originated with the recommendation of the NAS BEAR Committee Genetics Panel. Historical documentation in support of these conclusions is provided in the transcripts of the Panel meetings and in previously unexamined correspondence among Panel members.

  12. Stress induction in the bacteria Shewanella oneidensis and Deinococcus radiodurans in response to below-background ionizing radiation.

    Science.gov (United States)

    Castillo, Hugo; Schoderbek, Donald; Dulal, Santosh; Escobar, Gabriela; Wood, Jeffrey; Nelson, Roger; Smith, Geoffrey

    2015-01-01

    The 'Linear no-threshold' (LNT) model predicts that any amount of radiation increases the risk of organisms to accumulate negative effects. Several studies at below background radiation levels (4.5-11.4 nGy h⁻¹) show decreased growth rates and an increased susceptibility to oxidative stress. The purpose of our study is to obtain molecular evidence of a stress response in Shewanella oneidensis and Deinococcus radiodurans grown at a gamma dose rate of 0.16 nGy h⁻¹, about 400 times less than normal background radiation. Bacteria cultures were grown at a dose rate of 0.16 or 71.3 nGy h⁻¹ gamma irradiation. Total RNA was extracted from samples at early-exponential and stationary phases for the rt-PCR relative quantification (radiation-deprived treatment/background radiation control) of the stress-related genes katB (catalase), recA (recombinase), oxyR (oxidative stress transcriptional regulator), lexA (SOS regulon transcriptional repressor), dnaK (heat shock protein 70) and SOA0154 (putative heavy metal efflux pump). Deprivation of normal levels of radiation caused a reduction in growth of both bacterial species, accompanied by the upregulation of katB, recA, SOA0154 genes in S. oneidensis and the upregulation of dnaK in D. radiodurans. When cells were returned to background radiation levels, growth rates recovered and the stress response dissipated. Our results indicate that below-background levels of radiation inhibited growth and elicited a stress response in two species of bacteria, contrary to the LNT model prediction.
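    Relative quantification of the kind described (treatment over control, normalized to a reference gene) is commonly computed with the 2^-ΔΔCt (Livak) method; a sketch with hypothetical Ct values, not the study's data:

```python
def fold_change(ct_target_treat, ct_ref_treat, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt (Livak) method:
    treatment/control ratio of target-gene expression, normalized
    to a reference gene measured in the same samples."""
    d_ct_treat = ct_target_treat - ct_ref_treat   # normalize treatment
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl      # normalize control
    dd_ct = d_ct_treat - d_ct_ctrl
    return 2 ** (-dd_ct)

# Hypothetical Ct values: target amplifies 2 cycles earlier (relative to
# the reference gene) under radiation deprivation -> 4-fold upregulation
print(fold_change(22.0, 18.0, 24.0, 18.0))  # 4.0
```

Because PCR product doubles each cycle, a ΔΔCt of -2 corresponds to a 2² = 4-fold increase in the treatment relative to the control.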

  13. The Defense of Patent Invalidity in the Intellectual Property Litigation Special Report

    Directory of Open Access Journals (Sweden)

    Chun-Hsien Chen

    2007-06-01

    Full Text Available In Taiwan's civil litigation over intellectual property rights, the former "dual system of public- and private-law litigation" meant that a defendant who believed the asserted IP right to be invalid could only pursue that claim through administrative remedies, and could not raise invalidity directly as a defense in the civil action, delaying the civil proceedings. Taiwan planned to establish an Intellectual Property Court in 2007, a development with a large and direct impact on IP litigation. The two bills on which the court's success hinges, the Intellectual Property Court Organization Act and the Intellectual Property Case Adjudication Act, were submitted to the Legislative Yuan for review; the Adjudication Act passed its third reading on 9 January 2007, and the Organization Act followed on 5 March 2007. An epoch-making reform in the Adjudication Act is Article 16, paragraph 1, which provides that where a party claims or defends that an intellectual property right is subject to revocation or cancellation, the court shall itself decide whether that claim or defense has merit. In other words, this provision directly changes the current dual-system arrangement and has major consequences for the parties to patent litigation. Will the act as enacted actually achieve the legislature's aims, and are complementary measures needed? This article reviews the evolution of the defense of patent invalidity in Taiwan's IP litigation, offers analytical comments, and draws on the design of patent litigation systems in other countries to provide analysis and recommendations.

  14. Validation of regression models for nitrate concentrations in the upper groundwater in sandy soils

    International Nuclear Information System (INIS)

    Sonneveld, M.P.W.; Brus, D.J.; Roelsma, J.

    2010-01-01

    For Dutch sandy regions, linear regression models have been developed that predict nitrate concentrations in the upper groundwater on the basis of residual nitrate contents in the soil in autumn. The objective of our study was to validate these regression models for one particular sandy region dominated by dairy farming. No data from this area were used for calibrating the regression models. The model was validated by additional probability sampling. This sample was used to estimate errors in 1) the predicted areal fractions where the EU standard of 50 mg l⁻¹ is exceeded for farms with low N surpluses (ALT) and farms with higher N surpluses (REF); 2) predicted cumulative frequency distributions of nitrate concentration for both groups of farms. Both the errors in the predicted areal fractions as well as the errors in the predicted cumulative frequency distributions indicate that the regression models are invalid for the sandy soils of this study area. - This study indicates that linear regression models that predict nitrate concentrations in the upper groundwater using residual soil N contents should be applied with care.
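    The first validation quantity, the error in the predicted areal fraction where the 50 mg/l standard is exceeded, can be sketched for a simple random (probability) sample; all concentration values below are hypothetical, not the study's data.

```python
def exceedance_fraction(values, threshold=50.0):
    """Estimate the areal fraction where nitrate exceeds the standard
    from a simple random sample: the sample fraction of exceeding
    locations is a design-unbiased estimator of the areal fraction."""
    return sum(v > threshold for v in values) / len(values)

# Hypothetical nitrate concentrations (mg/l) at the sampled locations:
# model predictions versus measured values at the same points
predicted = [32, 41, 55, 62, 38, 47, 51, 29, 66, 44]
observed  = [45, 58, 49, 71, 52, 60, 48, 33, 80, 57]

# A nonzero error here is evidence against the regression model
error = exceedance_fraction(predicted) - exceedance_fraction(observed)
```

In this made-up sample the model predicts exceedance on 40% of the area while the observations give 60%, an underprediction of the kind that would invalidate the model for the new region.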

  15. Thermodynamical aspects of modeling the mechanical response of granular materials

    International Nuclear Information System (INIS)

    Elata, D.

    1995-01-01

    In many applications in rock physics, the material is treated as a continuum. By supplementing the related conservation laws with constitutive equations such as stress-strain relations, a well-posed problem can be formulated and solved. The stress-strain relations may be based on a combination of experimental data and a phenomenological or micromechanical model. If the model is physically sound and its parameters have a physical meaning, it can serve to predict the stress response of the material to unmeasured deformations, predict the stress response of other materials, and perhaps predict other categories of the mechanical response such as failure, permeability, and conductivity. However, it is essential that the model be consistent with all conservation laws and consistent with the second law of thermodynamics. Specifically, some models of the mechanical response of granular materials proposed in literature, are based on intergranular contact force-displacement laws that violate the second law of thermodynamics by permitting energy generation at no cost. This diminishes the usefulness of these models as it invalidates their predictive capabilities. [This work was performed under the auspices of the U.S. DOE by Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48.

  16. Modeling time-series count data: the unique challenges facing political communication studies.

    Science.gov (United States)

    Fogarty, Brian J; Monogan, James E

    2014-05-01

    This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher. Copyright © 2013 Elsevier Inc. All rights reserved.
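    A minimal data-generating sketch of a count series with an autoregressive conditional mean shows the count features such models respect (non-negative integers, Poisson-distributed observations). This is an illustration in the spirit of Poisson autoregressive models, not the Brandt and Williams estimator; all parameter values are hypothetical.

```python
import math
import random

def simulate_pois_ar(n, omega=0.5, rho=0.6, y0=3, seed=7):
    """Simulate a count series whose observations are Poisson with a
    conditional mean depending linearly on the previous count."""
    random.seed(seed)

    def poisson(lam):
        # Knuth's multiplication method, adequate for small means
        l, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= l:
                return k
            k += 1

    ys = [y0]
    for _ in range(n - 1):
        lam = omega + rho * ys[-1]   # conditional mean given the past
        ys.append(poisson(lam))
    return ys

series = simulate_pois_ar(200)
```

Fitting a Gaussian time-series model to such data ignores that the outcomes are bounded below by zero and heteroskedastic (variance equals the mean for a Poisson), which is the source of the invalid inferences the paper documents for low-count series.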

  17. Water in the Gas Phase.

    Science.gov (United States)

    1998-06-01

    Conference proceedings index (garbled by text extraction). Recoverable entries include: "The TCPE many-body model for water" by M. Masella and J.-P. Flament; contributions by A.D. Bykov, N.N. Lavrentieva, V.N. Saveliev and L.N. Sinitsa; and the Laboratoire de Physique Moleculaire et Applications, CNRS, Universite Pierre et Marie Curie, Paris, France. The remaining author-index fragments are not recoverable.

  18. Commentary on inhaled {sup 239}PuO{sub 2} in dogs - a prophylaxis against lung cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Cuttler, J.M., E-mail: jerrycuttler@rogers.com [Cuttler and Associates, Vaughan, ON (Canada); Feinendegen, L. [Brookhaven National Laboratories, Upton, NY (United States)

    2015-07-01

    Several studies on the effect of inhaled plutonium-dioxide particulates and the incidence of lung tumors in dogs reveal beneficial effects when the cumulative alpha-radiation dose is low. There is a threshold at an exposure level of about 100 cGy for excess tumor incidence and reduced lifespan. The observations conform to the expectations of the radiation hormesis dose-response model and contradict the predictions of the Linear No-Threshold (LNT) hypothesis. These studies suggest investigating the possibility of employing low-dose alpha-radiation, such as from {sup 239}PuO{sub 2} inhalation, as a prophylaxis against lung cancer. (author)

  19. Commentary on inhaled {sup 239}PuO{sub 2} in dogs - a prophylaxis against lung cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Cuttler, J.M. [Cuttler and Assoc., Vaughan, Ontario (Canada); Feinendegen, L. [Brookhaven National Laboratories, Upton, New York (United States)

    2015-06-15

    Several studies on the effect of inhaled plutonium-dioxide particulates and the incidence of lung tumors in dogs reveal beneficial effects when the cumulative alpha-radiation dose is low. There is a threshold at an exposure level of about 100 cGy for excess tumor incidence and reduced lifespan. The observations conform to the expectations of the radiation hormesis dose-response model and contradict the predictions of the Linear No-Threshold (LNT) hypothesis. These studies suggest investigating the possibility of employing low-dose alpha-radiation, such as from {sup 239}PuO{sub 2} inhalation, as a prophylaxis against lung cancer. (author)

  20. Nuclear disaster in Fukushima. Based on the WHO data between 22.000 and 66.000 carcinoma deaths are expected in Japan; Atomkatastrophe in Fukushima. Auf der Grundlage der WHO-Daten sind in Japan zwischen 22.000 und 66.000 Krebserkrankungen zu erwarten

    Energy Technology Data Exchange (ETDEWEB)

    Paulitz, Henrik; Eisenberg, Winfrid; Thiel, Reinhold

    2013-03-14

    The authors show that, based on the data and assumptions of the WHO, about 22,000 cancer deaths are expected in Japan as a consequence of the nuclear disaster in Fukushima in March 2011. The following data are used: the radiation exposure of the Japanese public in the first year after the nuclear catastrophe, the linear no-threshold model (LNT), and the mortality risk factor (EAR, excess absolute risk). When the lifetime dose is instead determined on the basis of the Chernobyl experience (UNSCEAR calculations) and the most recent scientific research, the number of expected cancer cases rises to 66,000.

  1. The separatrix response of diverted TCV plasmas compared to the CREATE-L model

    International Nuclear Information System (INIS)

    Vyas, P.; Lister, J.B.; Villone, F.; Albanese, R.

    1997-11-01

    The response of Ohmic, single-null diverted, non-centred plasmas in TCV to poloidal field coil stimulation has been compared to the linear CREATE-L MHD equilibrium response model. The closed loop responses of directly measured quantities, reconstructed parameters, and the reconstructed plasma contour were all examined. Provided that the plasma position and shape perturbation were small enough for the linearity assumption to hold, the model-experiment agreement was good. For some stimulations the open loop vertical position instability growth rate changed significantly, illustrating the limitations of a linear model. A different model was developed with the assumption that the flux at the plasma boundary is frozen and was also compared with experimental results. It proved not to be as reliable as the CREATE-L model for some simulation parameters showing that the experiments were able to discriminate between different plasma response models. The closed loop response was also found to be sensitive to changes in the modelled plasma shape. It was not possible to invalidate the CREATE-L model despite the extensive range of responses excited by the experiments. (author) figs., tabs., 5 refs

  2. Multilevel Models: Conceptual Framework and Applicability

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia Hrițcu

    2015-10-01

    Full Text Available Individuals and the social or organizational groups they belong to can be viewed as a hierarchical system situated on different levels. Individuals are situated on the first level of the hierarchy and they are nested together on the higher levels. Individuals interact with the social groups they belong to and are influenced by these groups. Traditional methods that study the relationships between data, like simple regression, do not take into account the hierarchical structure of the data and the effects of a group membership and, hence, results may be invalidated. Unlike standard regression modelling, the multilevel approach takes into account the individuals as well as the groups to which they belong. To take advantage of the multilevel analysis it is important that we recognize the multilevel characteristics of the data. In this article we introduce the outlines of multilevel data and we describe the models that work with such data. We introduce the basic multilevel model, the two-level model: students can be nested into classes, individuals into countries and the general two-level model can be extended very easily to several levels. Multilevel analysis has begun to be extensively used in many research areas. We present the most frequent study areas where multilevel models are used, such as sociological studies, education, psychological research, health studies, demography, epidemiology, biology, environmental studies and entrepreneurship. We support the idea that since hierarchies exist everywhere, multilevel data should be recognized and analyzed properly by using multilevel modelling.
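
    The group-membership effect the article describes is commonly quantified by the intraclass correlation (ICC), the share of total variance that sits between groups. A small simulated two-level example (all numbers hypothetical) using the classical ANOVA moment estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate two-level data: students (level 1) nested in classes (level 2).
n_groups, n_per = 50, 20
group_effect = rng.normal(0.0, 1.0, n_groups)  # between-class sd = 1
y = group_effect[:, None] + rng.normal(0.0, 2.0, (n_groups, n_per))  # within sd = 2

# ANOVA (moment) estimator of the two variance components.
group_means = y.mean(axis=1)
msb = n_per * group_means.var(ddof=1)  # between-group mean square
msw = ((y - group_means[:, None]) ** 2).sum() / (n_groups * (n_per - 1))
var_between = (msb - msw) / n_per
icc = var_between / (var_between + msw)
# True ICC here is 1 / (1 + 4) = 0.2. A nonzero ICC is precisely the
# clustering that plain single-level regression ignores, understating
# standard errors and potentially invalidating inference.
```
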

  3. Forgotten but not gone: Retro-cue costs and benefits in a double-cueing paradigm suggest multiple states in visual short-term memory.

    Science.gov (United States)

    van Moorselaar, Dirk; Olivers, Christian N L; Theeuwes, Jan; Lamme, Victor A F; Sligte, Ilja G

    2015-11-01

    Visual short-term memory (VSTM) performance is enhanced when the to-be-tested item is cued after encoding. This so-called retro-cue benefit is typically accompanied by a cost for the noncued items, suggesting that information is lost from VSTM upon presentation of a retrospective cue. Here we assessed whether noncued items can be restored to VSTM when made relevant again by a subsequent second cue. We presented either 1 or 2 consecutive retro-cues (80% valid) during the retention interval of a change-detection task. Relative to no cue, a valid cue increased VSTM capacity by 2 items, while an invalid cue decreased capacity by 2. Importantly, when a second, valid cue followed an invalid cue, capacity regained 2 items, so that performance was back on par. In addition, when the second cue was also invalid, there was no extra loss of information from VSTM, suggesting that those items that survived a first invalid cue automatically also survived a second. We conclude that these results are in support of a very versatile VSTM system, in which memoranda adopt different representational states depending on whether they are deemed relevant now, in the future, or not at all. We discuss a neural model that is consistent with this conclusion. (c) 2015 APA, all rights reserved.

  4. Revisiting the gram-negative lipoprotein paradigm

    Science.gov (United States)

    The processing of lipoproteins (lpps) in Gram-negative bacteria is generally considered to be an essential pathway. Mature lipoproteins in these bacteria are triacylated, with the final fatty acid addition performed by Lnt, an apolipoprotein n-acyltransferase. The mature lipoproteins are then sorted...

  5. Cathodoluminescence microscopy and spectroscopy of micro- and nanodiamonds: an implication for laboratory astrophysics.

    Science.gov (United States)

    Gucsik, Arnold; Nishido, Hirotsugu; Ninagawa, Kiyotaka; Ott, Ulrich; Tsuchiyama, Akira; Kayama, Masahiro; Simonia, Irakli; Boudou, Jean-Paul

    2012-12-01

    Color centers in selected micro- and nanodiamond samples were investigated by cathodoluminescence (CL) microscopy and spectroscopy at 298 K [room temperature (RT)] and 77 K [liquid-nitrogen temperature (LNT)] to assess the value of the technique for astrophysics. Nanodiamonds from meteorites were compared with synthetic diamonds made with different processes involving distinct synthesis mechanisms (chemical vapor deposition, static high pressure high temperature, detonation). A CL emission peak centered at around 540 nm at 77 K was observed in almost all of the selected diamond samples and is assigned to the dislocation defect with nitrogen atoms. Additional peaks were identified at 387 and 452 nm, which are related to the vacancy defect. In general, peak intensities at LNT were increased in comparison to RT. The results indicate a clear temperature dependence of the spectroscopic properties of diamond. This suggests the method is a useful tool in laboratory astrophysics.

  6. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

    © 2015 Elsevier Inc. The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
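
    The propagation step described above can be sketched generically: draw parameters from the calibrated posterior (assumed Gaussian here) and push them through the model to obtain a distribution for the QoI. The surrogate model and all numbers below are illustrative stand-ins, not the EAST radiance models:

```python
import numpy as np

rng = np.random.default_rng(2)

def model_qoi(a, b):
    """Toy surrogate for a quantity of interest (a stand-in for, e.g., a
    radiative heat flux prediction) as a function of uncertain parameters."""
    return a * np.exp(-b)

# Assumed Gaussian posterior for the two calibrated parameters.
post_mean = np.array([2.0, 0.5])
post_cov = np.diag([0.1 ** 2, 0.05 ** 2])

samples = rng.multivariate_normal(post_mean, post_cov, size=20000)
qoi = model_qoi(samples[:, 0], samples[:, 1])
qoi_mean, qoi_std = qoi.mean(), qoi.std()
# The model is then judged against the whole QoI distribution (its mean
# and spread), not a single point prediction; "unacceptably large
# uncertainties in the QoI" refers to an overly wide qoi_std.
```
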

  7. Identification of transmissivity fields using a Bayesian strategy and perturbative approach

    Science.gov (United States)

    Zanini, Andrea; Tanda, Maria Giovanna; Woodbury, Allan D.

    2017-10-01

    The paper deals with the crucial problem of groundwater parameter estimation, which is the basis for efficient modeling and reclamation activities. A hierarchical Bayesian approach is developed: it uses Akaike's Bayesian Information Criterion to estimate the hyperparameters (related to the covariance model chosen) and to quantify the unknown noise variance. The transmissivity identification proceeds in two steps: the first, called empirical Bayesian interpolation, uses Y* (Y = lnT) observations to interpolate Y values on a specified grid; the second, called empirical Bayesian update, improves the previous Y estimate through the addition of hydraulic head observations. The relationship between the head and lnT has been linearized through a perturbative solution of the flow equation. In order to test the proposed approach, synthetic aquifers from the literature have been considered. The aquifers in question contain a variety of boundary conditions (both Dirichlet and Neumann type) and scales of heterogeneity (σY² = 1.0 and σY² = 5.3). The estimated transmissivity fields were compared to the true ones. The joint use of Y* and head measurements improves the estimation of Y for both degrees of heterogeneity. Even if the variance of the strongly heterogeneous transmissivity field can be considered high for the application of the perturbative approach, the results show the same order of approximation as the non-linear methods proposed in the literature. The procedure allows computing the posterior probability distribution of the target quantities and quantifying the uncertainty in the model prediction. Bayesian updating has advantages with respect to both Monte Carlo (MC) and non-MC approaches: like MC methods, it computes the posterior probability distribution of the target quantities directly, and like non-MC methods it has computational times on the order of seconds.
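
    At its core, the empirical Bayesian update step is Gaussian conditioning of the gridded Y = lnT values on noisy observations. A minimal sketch of that conditioning (all covariances and numbers are hypothetical, not from the paper):

```python
import numpy as np

def bayes_update(mu_g, mu_o, C_gg, C_go, C_oo, y_obs, noise_var):
    """Posterior mean/covariance of grid values given noisy observations,
    assuming a joint Gaussian for grid values and observation points."""
    S = C_oo + noise_var * np.eye(len(y_obs))  # observation cov + noise
    K = C_go @ np.linalg.inv(S)                # "Kalman gain"
    post_mean = mu_g + K @ (y_obs - mu_o)
    post_cov = C_gg - K @ C_go.T
    return post_mean, post_cov

# Hypothetical one-grid-point, one-observation example: prior variance 1,
# grid/observation covariance 0.8, observation noise variance 0.1.
m, C = bayes_update(np.zeros(1), np.zeros(1),
                    np.array([[1.0]]), np.array([[0.8]]), np.array([[1.0]]),
                    np.array([0.5]), noise_var=0.1)
# The posterior variance in C drops below the prior variance of 1.0:
# the observation reduces the uncertainty in the lnT estimate.
```
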

  8. A message to Fukushima: nothing to fear but fear itself.

    Science.gov (United States)

    Sutou, Shizuyo

    2016-01-01

    The linear no-threshold model (LNT) has been the basis for radiation protection policies worldwide for 60 years. LNT was fabricated without correct data. The lifespan study of atomic bomb survivors (LSS) has provided the fundamental data to support the LNT. In the LSS, exposure doses were underestimated and cancer risk was overestimated; LSS data no longer support the LNT. In light of these findings, radiation levels and cancer risk in Fukushima are reexamined. Soon after the Fukushima accident, the International Commission on Radiological Protection issued an emergency recommendation that national authorities set reference levels at the highest levels in the band of 20-100 mSv and, when the radiation source is under control, reference levels in the band of 1-20 mSv/y. The Japanese government set the limit dose as low as 1 mSv for the public and stirred up radiophobia, which continues to cause tremendous human, social, and economic losses. Estimated doses in three areas of Fukushima were 0.6-2.3 mSv/y in Tamura City, 1.1-5.5 mSv/y in Kawauchi Village, and 3.8-17 mSv/y in Iitate Village. Even after acute irradiation, no significant differences are found below 200 mSv for leukemia or below 100 mSv for solid cancers. These data indicate that cancer risk is negligible in Fukushima. Moreover, beneficial effects (lessened cancer incidence) were observed at 400-600 mSv in the LSS. Living organisms, which have established efficient defense mechanisms against radiation through 3.8 billion years of evolutionary history, can tolerate 1000 mSv/y if radiation dose rates are low. In fact, people have lived for generations without adverse health effects in high background radiation areas such as Kerala (35 mSv/y), India, and Ramsar (260 mSv/y), Iran. Low dose radiation itself is harmless, but fear of radiation is vitally harmful. When people return to the evacuation zones in Fukushima now and in the future, they will be exposed to such low radiation doses as to cause no physical

  9. Estimation of sexual behavior in the 18-to-24-years-old Iranian youth based on a crosswise model study.

    Science.gov (United States)

    Vakilian, Katayon; Mousavi, Seyed Abbas; Keramat, Afsaneh

    2014-01-13

    In many countries, negative social attitudes towards sensitive issues such as sexual behavior have resulted in false and invalid data concerning this issue. This is an analytical cross-sectional study, in which a total of 1500 single students from universities of Shahroud City were sampled using a multistage technique. The students were assured that the information disclosed to the researcher would be treated as private and confidential. The results were analyzed using the crosswise model, crosswise regression, t-tests and chi-square tests. It appears that the prevalence of sexual behavior among Iranian youth is 41% (CI = 36-53). Findings showed that the estimated prevalence of sexual relationships among single Iranian youth is high. Thus, devising training models according to Islamic-Iranian culture is necessary in order to prevent risky sexual behavior.
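
    The crosswise model used here protects respondents by asking only whether the answers to the sensitive question and an innocuous question of known prevalence p coincide; the sensitive prevalence is then recovered as π̂ = (λ̂ + p − 1)/(2p − 1), where λ̂ is the observed share of "same" answers. A sketch with hypothetical counts (not the study's raw data):

```python
import math

def crosswise_estimate(n_same, n_total, p):
    """Crosswise-model prevalence estimator (Yu, Tian and Tang, 2008).

    p is the known prevalence of the innocuous statement (p != 0.5);
    respondents reveal only whether their two answers coincide, so the
    sensitive answer is never directly disclosed.
    """
    lam = n_same / n_total                           # share answering "same"
    pi_hat = (lam + p - 1.0) / (2.0 * p - 1.0)
    se = math.sqrt(lam * (1.0 - lam) / n_total) / abs(2.0 * p - 1.0)
    return pi_hat, se

# Hypothetical counts chosen to reproduce a ~41% prevalence with n = 1500
# respondents and an innocuous statement of known prevalence p = 0.25.
pi_hat, se = crosswise_estimate(818, 1500, 0.25)     # pi_hat ≈ 0.409
```
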

  10. The Psychology of Career Theory--A New Perspective?

    Science.gov (United States)

    Woodd, Maureen

    2000-01-01

    New perspectives on human behavior have invalidated some assumptions of career theories such as personality type, career stages, and life-cycle models. Other theories, such as Driver's Objective Career Patterns, Schein's Temporal Development Model, and Nicholson's Transition Cycle, are compatible with current psychological understanding. (SK)

  11. Improvement of biological decontamination, protective and repair activity against radiation injury

    International Nuclear Information System (INIS)

    Kagawa, Yasuo

    2013-01-01

    Because the protection of human subjects from late radiation injury is the final goal of remediation of radioactive contamination by 137Cs in the environment, improvement of DNA-repair ability and of 137Cs removal from the human body is important. In order to reduce environmental radioactivity in areas exceeding 5 mSv/year in Fukushima prefecture, the cost is estimated at 118 trillion yen, and there are difficulties in finding places to store 137Cs-contaminated soils and in avoiding 137Cs recontamination. Radiation damage to the DNA molecule takes place stochastically following the linear no-threshold model (LNT), but the cancer risk and other late radiation injury from long-term low-dose radiation do not follow the LNT model if we improve the DNA repair and cell regeneration systems. Indirect effects of radiation damage on DNA mediated by reactive oxygen species (ROS) are prevented by vitamins C and E, carotenoids including lycopene, and phytochemicals. ROS are also removed by superoxide dismutases containing Cu, Mn and Zn. Direct effects of radiation damage on DNA are repaired by enzyme systems using folic acid and vitamins B6 and B12. In addition, before radiation injury occurs, absorption of 137Cs can be prevented by taking pectin and similar agents, and excretion of 137Cs can be accelerated by ingesting more K. Finally, early detection of cancer and its removal through detailed health checks of radiation-exposed people is needed. A radiation-protective diet developed to protect astronauts, who receive about 1 mSv per day, will be useful for many atomic power plant workers as well as for people living in the 137Cs-contaminated areas. (author)

  12. Cytogenetic Low-Dose Hyperradiosensitivity Is Observed in Human Peripheral Blood Lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Seth, Isheeta [Department of Biological Sciences, Wayne State University, Detroit, Michigan (United States); Joiner, Michael C. [Department of Radiation Oncology, Wayne State University, Detroit, Michigan (United States); Tucker, James D., E-mail: jtucker@biology.biosci.wayne.edu [Department of Biological Sciences, Wayne State University, Detroit, Michigan (United States)

    2015-01-01

    Purpose: The shape of the ionizing radiation response curve at very low doses has been the subject of considerable debate. Linear-no-threshold (LNT) models are widely used to estimate risks associated with low-dose exposures. However, the low-dose hyperradiosensitivity (HRS) phenomenon, in which cells are especially sensitive at low doses but then show increased radioresistance at higher doses, provides evidence of nonlinearity in the low-dose region. HRS is more prominent in the G2 phase of the cell cycle than in the G0/G1 or S phases. Here we provide the first cytogenetic mechanistic evidence of low-dose HRS in human peripheral blood lymphocytes using structural chromosomal aberrations. Methods and Materials: Human peripheral blood lymphocytes from 2 normal healthy female donors were acutely exposed to cobalt 60 γ rays in either G0 or G2 using closely spaced doses ranging from 0 to 1.5 Gy. Structural chromosomal aberrations were enumerated, and the slopes of the regression lines at low doses (0-0.4 Gy) were compared with doses of 0.5 Gy and above. Results: HRS was clearly evident in both donors for cells irradiated in G2. No HRS was observed in cells irradiated in G0. The radiation effect per unit dose was 2.5- to 3.5-fold higher for doses ≤0.4 Gy than for doses >0.5 Gy. Conclusions: These data provide the first cytogenetic evidence for the existence of HRS in human cells irradiated in G2 and suggest that LNT models may not always be optimal for making radiation risk assessments at low doses.
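
    The slope comparison reported in the Results can be reproduced in outline: fit separate linear regressions to the low-dose (≤0.4 Gy) and high-dose (>0.5 Gy) segments of the dose-response data and take the ratio of slopes. The data below are synthetic, constructed only to illustrate the analysis, not the authors' measurements:

```python
import numpy as np

# Synthetic aberration yields per cell with an HRS-like shape:
# steep response below ~0.4 Gy, shallower response above 0.5 Gy.
low_dose = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
low_yield = np.array([0.01, 0.05, 0.09, 0.13, 0.17])   # slope ~0.40 per Gy
high_dose = np.array([0.5, 0.75, 1.0, 1.25, 1.5])
high_yield = np.array([0.18, 0.22, 0.25, 0.29, 0.32])  # slope ~0.14 per Gy

slope_low = np.polyfit(low_dose, low_yield, 1)[0]
slope_high = np.polyfit(high_dose, high_yield, 1)[0]
ratio = slope_low / slope_high
# A ratio well above 1 (here ~2.9, in the 2.5-3.5 range the paper reports)
# is the signature of low-dose hyperradiosensitivity: a single LNT line
# cannot fit both segments at once.
```
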

  13. Enzyme catalysed production of sialylated human milk oligosaccharides and galactooligosaccharides by Trypanosoma cruzi trans-sialidase

    DEFF Research Database (Denmark)

    Holck, Jesper; Larsen, Dorte Møller; Michalak, Malwina

    2014-01-01

    Bifidobacterium strains in single culture fermentations. The trans-sialidase also catalysed the transfer of sialic acid from CGMP to galacto-oligosaccharides (GOS) and to the human milk oligosaccharide (HMO) backbone lacto-N-tetraose (LNT) to produce 3′-sialyl-GOS, including doubly sialylated GOS products, and 3...

  14. Effect of sulfur loading on the desulfation chemistry of a commercial lean NOx trap catalyst

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Heui; Yezerets, Aleksey; Li, Junhui; Currier, Neal; Chen, Haiying; Hess, Howard; Engelhard, Mark H.; Muntean, George G.; Peden, Charles HF

    2012-12-15

    We investigate the effects of initial sulfur loadings on the desulfation chemistry and the subsequent final activity of a commercial LNT catalyst. Identical total amounts of SO2 are applied to the samples, albeit with the frequency of desulfation varied. The results indicate that performance is better with less frequent desulfations. The greater the amount of sulfur deposited before desulfation, the greater the amount of SO2 evolved before H2S is observed during desulfation, which can be explained by two sequential reactions: initial conversion of sulfate to SO2, followed by the reduction of SO2 to H2S. After completing all sulfation/desulfation steps, the sample with only a single desulfation shows a fairly uniform sulfur distribution along the z-axis inside the monolith. We expect that the results obtained in this study will provide useful information for optimizing regeneration strategies in vehicles that utilize LNT technology.

  15. Review of the controversy on risks from low levels of radiation

    International Nuclear Information System (INIS)

    Higson, D.

    2001-01-01

    The need for regulation of low levels of radiation exposure, and the estimation of risks from such exposures, are based on the assumption that risk is proportional to dose without a threshold, the 'linear no-threshold (LNT) hypothesis'. This assumption is not supported by scientific data. There is no clear evidence of harm from low levels of exposure, up to at least 20 mSv (acute dose) or total dose rates of at least 50 mSv per year. Even allowing for reasonable extrapolation from radiation levels at which harmful effects have been observed, the LNT assumption should not be used to estimate risks from doses less than 100 mSv. Laboratory and epidemiological evidence, and evolutionary expectations of biological effects from low level radiation, suggest that beneficial health effects (sometimes called 'radiation hormesis') are at least as likely as harmful effects from such exposures. Controversy on this matter strikes at the basis of radiation protection practice.

  16. Radiation protection. Basic concepts of ICRP

    International Nuclear Information System (INIS)

    Saito, Tsutomu; Hirata, Hideki

    2014-01-01

    The title subject is explained in simple terms. The main international organizations for radiation protection are the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), the International Commission on Radiological Protection (ICRP) and the International Atomic Energy Agency (IAEA). UNSCEAR objectively summarizes and publishes scientific findings; ICRP, an NGO, recommends radiological protection measures from the expert perspective; and IAEA, a UN agency, aims at the peaceful use of atomic power. These organizations underpin the legal regulations and standards of nations. The purpose of the ICRP recommendation (Pub. 103, 2007) is to contribute to appropriate protection against the hazardous effects of radiation, which are assumed to be linearly proportional to dose with no threshold (the linear no-threshold model, LNT), so that some radiation risk exists even at the lowest dose. When a change in a single cell results in a hazardous alteration, the causative effects are called stochastic effects, which include mutations leading to cancer formation and genetic effects in offspring (not observed in humans). ICRP maintains the validity of the LNT for stochastic effects essentially from the protective standpoint, although epidemiological data support it only at >100 mSv exposure. Deterministic effects are caused by the loss of cells themselves or of their function, and their threshold is defined as the dose causing disorder or death in >1% of those exposed. The radiation protection system is organized by exposure situation (planned, emergency and existing), category (occupational, public and medical) and the three principles of justification, optimization and application of dose limits. (T.T.)

  17. Some environmental challenges which the uranium production industry faces in the 21st century

    International Nuclear Information System (INIS)

    Zhang Lisheng

    2004-01-01

    Some of the environmental challenges which the uranium production industry faces in the 21st century have been discussed in the paper. They are: the use of the linear non-threshold (LNT) model for radiation protection, the concept of 'controllable dose' as an alternative to the current International Commission on Radiological Protection (ICRP) system of dose limitation, the future of collective dose and the ALARA (As low As Reasonably Achievable) principle and the application of a risk-based framework for managing hazards. The author proposes that, the risk assessment/risk management framework could be used for managing the environmental, safety and decommissioning issues associated with the uranium fuel cycle. (author)

  18. Similarity of the leading contributions to the self-energy and the thermodynamics in two- and three-dimensional Fermi Liquids

    International Nuclear Information System (INIS)

    Coffey, D.; Bedell, K.S.

    1993-01-01

    We compare the self-energy and entropy of two- and three-dimensional Fermi liquids (FLs) using a model with a contact interaction between fermions. For a two-dimensional (2D) FL we find that there are T² contributions to the entropy from interactions separate from those due to the collective modes. These T² contributions arise from nonanalytic corrections to the real part of the self-energy and are analogous to the T³ lnT contributions present in the entropy of a three-dimensional (3D) FL. The difference between the 2D and 3D results arises solely from the different phase space factors.

  19. Phase-response curves and synchronized neural networks.

    Science.gov (United States)

    Smeal, Roy M; Ermentrout, G Bard; White, John A

    2010-08-12

    We review the principal assumptions underlying the application of phase-response curves (PRCs) to synchronization in neuronal networks. The PRC measures how much a given synaptic input perturbs spike timing in a neural oscillator. Among other applications, PRCs make explicit predictions about whether a given network of interconnected neurons will synchronize, as is often observed in cortical structures. Regarding the assumptions of the PRC theory, we conclude: (i) The assumption of noise-tolerant cellular oscillations at or near the network frequency holds in some but not all cases. (ii) Reduced models for PRC-based analysis can be formally related to more realistic models. (iii) Spike-rate adaptation limits PRC-based analysis but does not invalidate it. (iv) The dependence of PRCs on synaptic location emphasizes the importance of improving methods of synaptic stimulation. (v) New methods can distinguish between oscillations that derive from mutual connections and those arising from common drive. (vi) It is helpful to assume linear summation of effects of synaptic inputs; experiments with trains of inputs call this assumption into question. (vii) Relatively subtle changes in network structure can invalidate PRC-based predictions. (viii) Heterogeneity in the preferred frequencies of component neurons does not invalidate PRC analysis, but can annihilate synchronous activity.
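
    The core PRC prediction about synchrony can be seen in a toy computation: under a weak-coupling assumption with a sinusoidal PRC, the phase difference ψ between two identical oscillators approximately obeys dψ/dt = −2ε sin ψ (an assumed reduction chosen for illustration, not a result from the review), so ψ decays to the stable fixed point 0 and the pair synchronizes:

```python
import math

# Toy weak-coupling sketch: Euler-integrate the phase-difference dynamics
# d(psi)/dt = -2 * eps * sin(psi). The stable fixed point psi = 0
# corresponds to the two oscillators firing in synchrony.
eps, dt = 0.1, 0.01
psi = 2.0                      # initial phase difference (radians)
for _ in range(20000):         # integrate 200 time units
    psi += dt * (-2.0 * eps * math.sin(psi))
# psi has decayed essentially to 0: the PRC-based prediction is synchrony.
```

    Point (vii) of the review is visible even in this sketch: flipping the sign of the coupling term turns ψ = 0 into an unstable fixed point, and the same pair instead locks in anti-phase.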

  20. Detection of Common Problems in Real-Time and Multicore Systems Using Model-Based Constraints

    Directory of Open Access Journals (Sweden)

    Raphaël Beamonte

    2016-01-01

    Full Text Available Multicore systems are complex in that multiple processes are running concurrently and can interfere with each other. Real-time systems add on top of that time constraints, making results invalid as soon as a deadline has been missed. Tracing is often the most reliable and accurate tool available to study and understand those systems. However, tracing requires that users understand the kernel events and their meaning. It is therefore not very accessible. Using modeling to generate source code or represent applications’ workflow is handy for developers and has emerged as part of the model-driven development methodology. In this paper, we propose a new approach to system analysis using model-based constraints, on top of userspace and kernel traces. We introduce the constraints representation and how traces can be used to follow the application’s workflow and check the constraints we set on the model. We then present a number of common problems that we encountered in real-time and multicore systems and describe how our model-based constraints could have helped to save time by automatically identifying the unwanted behavior.

  1. Gene-Environment Interplay in Twin Models

    Science.gov (United States)

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718

  2. Non-Linearity of dose-effect relationship on the example of cytogenetic effects in plant cells at low level exposure to ionising radiation

    International Nuclear Information System (INIS)

    Oudalova, Alla; Geras'kin, Stanislav; Dikarev, Vladimir; Dikareva, Nina; Chernonog, Elena; Copplestone, David; Evseeva, Tatyana

    2006-01-01

    Over several decades, modelling the effects of ionizing radiation on biological systems has relied on the target principle [Timofeeff-Ressovsky et al., 1935], which assumes that cell damage or modification to genes appears as a direct consequence of the exposure of biological macromolecules to charged particles. Furthermore, it is assumed that there is no threshold for the induction of biological damage and that the effects observed are proportional to the energy absorbed. Following this principle, the average number of hits per target should increase linearly with dose, and the yield of mutations per unit of dose is assumed to be the same at both low and high doses (linearity of response). This principle has served as the scientific background for the linear no-threshold (LNT) concept that forms the basis of radiological protection for the public and the environment [ICRP, 1990]. It follows from the LNT concept that there is an additional risk to human health from exposure at any radiation level, even below natural background. Since the mid-1950s, however, the scientific basis for the LNT concept has been challenged as experimental data have shown a non-linear dose-response relationship at low doses. Luchnik and Timofeeff-Ressovsky were the first to show a non-linear response to low dose exposure [Luchnik, 1957; Timofeeff-Ressovsky and Luchnik, 1960]. Since then, many data have been accumulated which contradict the LNT model at low doses and dose rates. However, the hit-effect paradigm has become such a strong and indissoluble fact that it has persisted even under the growing pressure of scientific evidence for phenomena at low dose exposure that cannot be successfully accounted for by the LNT concept.
In recent years, additional information on non-targeted effects of radiation has been accumulated following the first reports of an adaptive response in human lymphocytes [Olivieri et al., 1984] as well as bystander mutagenic effect of alpha
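The linearity argument of the target principle can be made concrete with a small numerical sketch (the constant k below is arbitrary and illustrative, not from the paper): if the mean number of hits per target is proportional to dose, the fraction of targets hit at least once is 1 − exp(−kD), which is approximately linear in D at low doses and falls below linearity at high doses.

```python
import math

# Target-theory toy: mean hits per target = k * dose (Poisson), so the
# fraction of targets hit at least once is 1 - exp(-k * dose).
# k is an arbitrary illustrative constant.
def fraction_hit(dose, k=0.01):
    return 1.0 - math.exp(-k * dose)

for dose in (0.01, 1.0, 100.0):
    # Ratio to the purely linear prediction k * dose:
    # close to 1 at low dose, clearly below 1 at high dose.
    print(dose, fraction_hit(dose) / (0.01 * dose))
```

This is exactly the sense in which "yield per unit dose" is dose-independent only in the low-dose limit of the hit model.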

  3. Non-Linearity of dose-effect relationship on the example of cytogenetic effects in plant cells at low level exposure to ionising radiation

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, Alla; Geras' kin, Stanislav; Dikarev, Vladimir; Dikareva, Nina; Chernonog, Elena [Russian Institute of Agricultural Radiology and Agroecology, RIARAE, 249032 Obninsk (Russian Federation); Copplestone, David [Environment Agency, Millbank Tower, 25th. Floor, 21/24 Millbank, London, SW1P 4XL (United Kingdom); Evseeva, Tatyana [Institute of Biology, Kommunisticheskaya st., 28 Syktyvkar 167610, Komi Republic (Russian Federation)

    2006-07-01

    Over several decades, modelling the effects of ionizing radiation on biological systems has relied on the target principle [Timofeeff-Ressovsky et al., 1935], which assumes that cell damage or modification to genes appears as a direct consequence of the exposure of biological macromolecules to charged particles. Furthermore, it is assumed that there is no threshold for the induction of biological damage and that the effects observed are proportional to the energy absorbed. Following this principle, the average number of hits per target should increase linearly with dose, and the yield of mutations per unit of dose is assumed to be the same at both low and high doses (linearity of response). This principle has served as the scientific background for the linear no-threshold (LNT) concept that forms the basis of radiological protection for the public and the environment [ICRP, 1990]. It follows from the LNT concept that there is an additional risk to human health from exposure at any radiation level, even below natural background. Since the mid-1950s, however, the scientific basis for the LNT concept has been challenged as experimental data have shown a non-linear dose-response relationship at low doses. Luchnik and Timofeeff-Ressovsky were the first to show a non-linear response to low dose exposure [Luchnik, 1957; Timofeeff-Ressovsky and Luchnik, 1960]. Since then, many data have been accumulated which contradict the LNT model at low doses and dose rates. However, the hit-effect paradigm has become such a strong and indissoluble fact that it has persisted even under the growing pressure of scientific evidence for phenomena at low dose exposure that cannot be successfully accounted for by the LNT concept. In recent years, additional information on non-targeted effects of radiation has been accumulated following the first reports of an adaptive response in human lymphocytes [Olivieri et al., 1984] as well as the bystander mutagenic effect of

  4. Weinberg disequilibrium and association study of insertion/deletion ...

    African Journals Online (AJOL)

    Omayma M. Hassanin

    2014-09-10

    Sep 10, 2014 ... In the case of a reduced number of observed heterozygous patients, as may occur ... examined for a variety of genotyping errors and pseudo-SNP models. For the majority of genotyping models ... selected rather than a random sample, invalidating direct com- parisons with other populations. Therefore, we ...

  5. When do latent class models overstate accuracy for diagnostic and other classifiers in the absence of a gold standard?

    Science.gov (United States)

    Spencer, Bruce D

    2012-06-01

    Latent class models are increasingly used to assess the accuracy of medical diagnostic tests and other classifications when no gold standard is available and the true state is unknown. When the latent class is treated as the true class, the latent class models provide measures of components of accuracy including specificity and sensitivity and their complements, type I and type II error rates. The error rates according to the latent class model differ from the true error rates, however, and empirical comparisons with a gold standard suggest the true error rates often are larger. We investigate conditions under which the true type I and type II error rates are larger than those provided by the latent class models. Results from Uebersax (1988, Psychological Bulletin 104, 405-416) are extended to accommodate random effects and covariates affecting the responses. The results are important for interpreting the results of latent class analyses. An error decomposition is presented that incorporates an error component from invalidity of the latent class model. © 2011, The International Biometric Society.

  6. Leveraging the BPEL Event Model to Support QoS-aware Process Execution

    Science.gov (United States)

    Zaid, Farid; Berbner, Rainer; Steinmetz, Ralf

    Business processes executed using compositions of distributed Web Services are susceptible to different fault types. The Web Services Business Process Execution Language (BPEL) is widely used to execute such processes. While BPEL provides fault handling mechanisms to handle functional faults like invalid message types, it still lacks a flexible native mechanism to handle non-functional exceptions associated with violations of QoS levels that are typically specified in a governing Service Level Agreement (SLA). In this paper, we present an approach to complement BPEL's fault handling, where expected QoS levels and necessary recovery actions are specified declaratively in the form of Event-Condition-Action (ECA) rules. Our main contribution is leveraging BPEL's standard event model, which we use as an event space for the created ECA rules. We validate our approach by an extension to an open source BPEL engine.
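The Event-Condition-Action pattern the abstract refers to can be sketched minimally as follows; the event name, payload fields, and recovery action below are invented for illustration and are not the authors' implementation or BPEL syntax.

```python
# Minimal Event-Condition-Action (ECA) rule sketch: a rule fires its
# action when a named event arrives and its condition holds.
class EcaRule:
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

    def fire(self, evt_name, payload):
        if evt_name == self.event and self.condition(payload):
            self.action(payload)

recoveries = []
rule = EcaRule(
    event="invoke_completed",                               # hypothetical event
    condition=lambda p: p["latency_ms"] > p["sla_ms"],      # QoS violation?
    action=lambda p: recoveries.append(("rebind", p["service"])),
)

# A slow service invocation triggers the declared recovery action.
rule.fire("invoke_completed", {"service": "quote", "latency_ms": 900, "sla_ms": 500})
print(recoveries)  # [('rebind', 'quote')]
```

In the paper's setting, the events would come from the BPEL engine's standard event model rather than direct method calls.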

  7. A numerical cloud model to interpret the isotope content of hailstones

    International Nuclear Information System (INIS)

    Jouzel, J.; Brichet, N.; Thalmann, B.; Federer, B.

    1980-07-01

    Measurements of the isotope content of hailstones are frequently used to deduce their trajectories and updraft speeds within severe storms. The interpretation was made in the past on the basis of an adiabatic equilibrium model in which the stones grew exclusively by interaction with droplets and vapor. Using the 1D steady-state model of Hirsch with parametrized cloud physics, these unrealistic assumptions were dropped and the effects of interactions between droplets, drops, ice crystals and graupel on the concentrations of stable isotopes in hydrometeors were taken into account. The construction of the model is briefly discussed. The resulting height profiles of D and 18O in hailstones deviate substantially from the equilibrium case, rendering most earlier trajectory calculations invalid. It is also seen that in the lower cloud layers the ice of the stones is richer due to relaxation effects, but at higher cloud layers (T(a) < 0 °C) the ice is much poorer in isotopes. This yields a broader spread of the isotope values in the interval 0 > T(a) > -35 °C or, alternatively, it means that hailstones with a very large range of measured isotope concentrations grow in a smaller and therefore more realistic temperature interval. The use of the model in practice will be demonstrated.

  8. Improved Correction of Misclassification Bias With Bootstrap Imputation.

    Science.gov (United States)

    van Walraven, Carl

    2018-07-01

    Diagnostic codes used in administrative database research can create bias due to misclassification. Quantitative bias analysis (QBA) can correct for this bias and requires only code sensitivity and specificity, but it may return invalid results. Bootstrap imputation (BI) can also address misclassification bias but traditionally requires multivariate models to accurately estimate disease probability. This study compared misclassification bias correction using QBA and BI. Serum creatinine measures were used to determine severe renal failure status in 100,000 hospitalized patients. Prevalence of severe renal failure in 86 patient strata and its association with 43 covariates was determined and compared with results in which renal failure status was determined using diagnostic codes (sensitivity 71.3%, specificity 96.2%). Differences in results (misclassification bias) were then corrected with QBA or BI (using progressively more complex methods to estimate disease probability). In total, 7.4% of patients had severe renal failure. Imputing disease status with diagnostic codes exaggerated prevalence estimates [median relative change (range), 16.6% (0.8%-74.5%)] and its association with covariates [median (range) exponentiated absolute parameter estimate difference, 1.16 (1.01-2.04)]. QBA produced invalid results 9.3% of the time and increased bias in estimates of both disease prevalence and covariate associations. BI decreased misclassification bias with increasingly accurate disease probability estimates. QBA can produce invalid results and increase misclassification bias. BI avoids invalid results and can importantly decrease misclassification bias when accurate disease probability estimates are used.
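The prevalence-level correction underlying this kind of bias analysis is the standard Rogan-Gladen identity. The sketch below (not the study's code) round-trips the sensitivity, specificity and prevalence figures reported in the abstract; note that when the observed prevalence is inconsistent with the assumed accuracy, the raw estimate falls outside [0, 1], which is one way QBA "returns invalid results".

```python
# Rogan-Gladen-style correction of a misclassified prevalence estimate:
#   observed = sens * p + (1 - spec) * (1 - p)
# solved for the true prevalence p.
def corrected_prevalence(observed, sensitivity, specificity):
    p = (observed + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(p, 0.0), 1.0)  # clamp: a value outside [0,1] signals invalid inputs

# Round trip with the study's code accuracy (sens 71.3%, spec 96.2%)
# and 7.4% true prevalence:
true_p, sens, spec = 0.074, 0.713, 0.962
observed = sens * true_p + (1 - spec) * (1 - true_p)  # apparent prevalence from codes
print(round(corrected_prevalence(observed, sens, spec), 6))  # → 0.074
```

The study's point is that this algebra is only as good as the sensitivity/specificity assumptions fed into it, whereas bootstrap imputation leans on per-patient disease probability estimates instead.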

  9. A hybrid hydrostatic and non-hydrostatic numerical model for shallow flow simulations

    Science.gov (United States)

    Zhang, Jingxin; Liang, Dongfang; Liu, Hua

    2018-05-01

    Hydrodynamics of geophysical flows in oceanic shelves, estuaries, and rivers, are often studied by solving shallow water model equations. Although hydrostatic models are accurate and cost efficient for many natural flows, there are situations where the hydrostatic assumption is invalid, whereby a fully hydrodynamic model is necessary to increase simulation accuracy. There is a growing concern about the decrease of the computational cost of non-hydrostatic pressure models to improve the range of their applications in large-scale flows with complex geometries. This study describes a hybrid hydrostatic and non-hydrostatic model to increase the efficiency of simulating shallow water flows. The basic numerical model is a three-dimensional hydrostatic model solved by the finite volume method (FVM) applied to unstructured grids. Herein, a second-order total variation diminishing (TVD) scheme is adopted. Using a predictor-corrector method to calculate the non-hydrostatic pressure, we extended the hydrostatic model to a fully hydrodynamic model. By localising the computational domain in the corrector step for non-hydrostatic pressure calculations, a hybrid model was developed. There was no prior special treatment on mode switching, and the developed numerical codes were highly efficient and robust. The hybrid model is applicable to the simulation of shallow flows when non-hydrostatic pressure is predominant only in the local domain. Beyond the non-hydrostatic domain, the hydrostatic model is still accurate. The applicability of the hybrid method was validated using several study cases.

  10. Rad-by-rad (bit-by-bit): triumph of evidence over activities fostering fear of radiogenic cancers at low doses

    International Nuclear Information System (INIS)

    Strzelczyk, J.; Potter, W.; Zdrojewicz, Z.

    2006-01-01

    Full text: Large segments of the Western population hold the sciences in low esteem. This trend became particularly pervasive in the field of radiation sciences in recent decades. The resulting lack of knowledge, easily filled with fear that feeds on itself, makes people susceptible to prevailing dogmas. A decades-long moratorium on nuclear power in the US, resentment of 'anything nuclear', and delay or refusal to obtain medical radiation procedures are some of the societal consequences. The problem has been exacerbated by promulgation of the linear-no-threshold (LNT) dose response model by advisory bodies such as the ICRP, NCRP and others. This model assumes no safe level of radiation and implies that response is the same per unit dose regardless of the total dose or dose rate. The most recent (June 2005) report from the National Research Council, BEIR VII (Biological Effects of Ionizing Radiation), continues this approach and quantifies potential cancer risks at low doses by linear extrapolation of risk values obtained from epidemiological observations of populations exposed to high doses, 0.2 to 3 Sv. It minimizes the significance of the lack of evidence of adverse effects in populations exposed to low doses and discounts documented beneficial effects of low dose exposures on the human immune system. The LNT doctrine is in direct conflict with current findings of radiobiology and important features of modern radiation oncology. Fortunately, these aspects are addressed in depth in another major report, issued jointly in March 2005 by two French Academies, of Sciences and of Medicine. The latter report is much less publicized, thus it is the responsibility of radiation professionals, physicists, nuclear engineers, and physicians to become familiar with its content and relevant studies, and to widely disseminate this information. To counteract biased media, we need to be creative in developing means of sharing good news about radiation with co-workers, patients, and the general public

  11. Implications for human and environmental health of low doses of ionising radiation

    International Nuclear Information System (INIS)

    Mothersill, Carmel; Seymour, Colin

    2014-01-01

    The last 20 years have seen a major paradigm shift in radiation biology. Several discoveries challenge the DNA centric view which holds that DNA damage is the critical effect of radiation irrespective of dose. This theory leads to the assumption that dose and effect are simply linked – the more energy deposition, the more DNA damage and the greater the biological effect. This is embodied in radiation protection (RP) regulations as the linear-non-threshold (LNT) model. However the science underlying the LNT model is being challenged particularly in relation to the environment because it is now clear that at low doses of concern in RP, cells, tissues and organisms respond to radiation by inducing responses which are not readily predictable by dose. These include adaptive responses, bystander effects, genomic instability and low dose hypersensitivity, and are commonly described as stress responses, while recognizing that “stress” can be good as well as bad. The phenomena contribute to observed radiation responses and appear to be influenced by genetic, epigenetic and environmental factors, meaning that dose and response are not simply related. The question is whether our discovery of these phenomena means that we need to re-evaluate RP approaches. The so-called “non-targeted” mechanisms mean that low dose radiobiology is very complex and supra linear or sub-linear (even hormetic) responses are possible but their occurrence is unpredictable for any given system level. Issues which may need consideration are synergistic or antagonistic effects of other pollutants. RP, at present, only looks at radiation dose but the new (NTE) radiobiology means that chemical or physical agents, which interfere with tissue responses to low doses of radiation, could critically modulate the predicted risk. Similarly, the “health” of the organism could determine the effect of a given low dose by enabling or disabling a critical response. These issues will be discussed

  12. Patient choice modelling: how do patients choose their hospitals?

    Science.gov (United States)

    Smith, Honora; Currie, Christine; Chaiwuttisak, Pornpimol; Kyprianou, Andreas

    2018-06-01

    As an aid to predicting future hospital admissions, we compare use of the Multinomial Logit and the Utility Maximising Nested Logit models to describe how patients choose their hospitals. The models are fitted to real data from Derbyshire, United Kingdom, which lists the postcodes of more than 200,000 admissions to six different local hospitals. Both elective and emergency admissions are analysed for this mixed urban/rural area. As characteristics that may affect a patient's choice of hospital, we consider the distance of the patient from the hospital, the number of beds at the hospital and the number of car parking spaces available at the hospital, as well as several statistics publicly available on National Health Service (NHS) websites: an average waiting time, the patient survey score for ward cleanliness, the patient safety score and the inpatient survey score for overall care. The Multinomial Logit model is successfully fitted to the data. Results obtained with the Utility Maximising Nested Logit model show that nesting according to city or town may be invalid for these data; in other words, the choice of hospital does not appear to be preceded by choice of city. In all of the analysis carried out, distance appears to be one of the main influences on a patient's choice of hospital rather than statistics available on the Internet.
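A Multinomial Logit model assigns each alternative a choice probability via a softmax over utilities. The sketch below uses an invented negative distance coefficient, not the fitted Derbyshire values, to show how "nearer hospital, higher probability" falls out of the model.

```python
import math

# Multinomial Logit choice probabilities: P_i = exp(U_i) / sum_j exp(U_j).
def choice_probabilities(utilities):
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Utility falls with distance (km); beta_distance is illustrative, not fitted.
beta_distance = -0.3
distances = [2.0, 8.0, 15.0]          # three hypothetical hospitals
probs = choice_probabilities([beta_distance * d for d in distances])
print([round(p, 3) for p in probs])   # nearest hospital gets the largest share
```

A fitted model would add further utility terms (beds, parking, published NHS scores) with estimated coefficients; the softmax structure is unchanged.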

  13. Modeling of the Earth's gravity field using the New Global Earth Model (NEWGEM)

    Science.gov (United States)

    Kim, Yeong E.; Braswell, W. Danny

    1989-01-01

    Traditionally, the global gravity field was described by representations based on the spherical harmonics (SH) expansion of the geopotential. The SH expansion coefficients were determined by fitting the Earth's gravity data as measured by many different methods including the use of artificial satellites. As gravity data have accumulated with increasingly better accuracies, more of the higher order SH expansion coefficients were determined. The SH representation is useful for describing the gravity field exterior to the Earth but is theoretically invalid on the Earth's surface and in the Earth's interior. A new global Earth model (NEWGEM) (KIM, 1987 and 1988a) was recently proposed to provide a unified description of the Earth's gravity field inside, on, and outside the Earth's surface using the Earth's mass density profile as deduced from seismic studies, elevation and bathymetric information, and local and global gravity data. Using NEWGEM, it is possible to determine the constraints on the mass distribution of the Earth imposed by gravity, topography, and seismic data. NEWGEM is useful in investigating a variety of geophysical phenomena. It is currently being utilized to develop a geophysical interpretation of Kaula's rule. The zeroth order NEWGEM is being used to numerically integrate spherical harmonic expansion coefficients and simultaneously determine the contribution of each layer in the model to a given coefficient. The numerically determined SH expansion coefficients are also being used to test the validity of SH expansions at the surface of the Earth by comparing the resulting SH expansion gravity model with exact calculations of the gravity at the Earth's surface.

  14. Dose Response Model of Biological Reaction to Low Dose Rate Gamma Radiation

    International Nuclear Information System (INIS)

    Magae, J.; Furikawa, C.; Hoshi, Y.; Kawakami, Y.; Ogata, H.

    2004-01-01

    It is necessary to use reproducible and stable indicators to evaluate biological responses to long-term irradiation at low dose rate. They should be simple and quantitative enough to make the results statistically accurate, because we have to analyze the subtle changes of biological responses around background level at low dose. For these purposes we chose micronucleus formation in U2OS, a human osteosarcoma cell line, as an indicator of biological responses. Cells were exposed to gamma rays in an irradiation room bearing a 50,000 Ci 60Co source. After irradiation, they were cultured for 24 h in the presence of cytochalasin B to block cytokinesis, and cytoplasm and nucleus were stained with DAPI and propidium iodide, respectively. The number of binuclear cells bearing micronuclei was counted under a fluorescence microscope. Dose rate in the irradiation room was measured with PLD. Dose response of PLD is linear between 1 mGy and 10 Gy, and the standard deviation of triplicate counts was several percent of the mean value. We statistically fitted dose-response curves to the data, plotted on linearly scaled response-versus-dose coordinates. The results followed a straight line passing through the origin of the coordinate axes between 0.1-5 Gy, and the dose and dose-rate effectiveness factor (DDREF) was less than 2 when cells were irradiated for 1-10 min. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was not statistically significant at doses above 0.1 Gy when 5,000 binuclear cells were analyzed. In contrast, dose-response curves never followed the LNT when cells were irradiated for 7 to 124 days. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was not statistically significant at doses below 6 Gy when cells were continuously irradiated for 124 days. These results suggest that the dose-response curve of biological reactions is remarkably affected by exposure
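Fitting "a straight line passing through the origin", as in the short-irradiation data above, reduces to a one-parameter least-squares problem. The sketch below uses invented perfectly proportional data, not the paper's measurements.

```python
# Least-squares slope for a line constrained through the origin:
#   minimize sum (r_i - b * d_i)^2  =>  b = sum(d_i * r_i) / sum(d_i^2)
def slope_through_origin(doses, responses):
    return sum(d * r for d, r in zip(doses, responses)) / sum(d * d for d in doses)

doses = [0.1, 0.5, 1.0, 2.0, 5.0]          # Gy (illustrative)
responses = [0.21, 1.05, 2.1, 4.2, 10.5]   # e.g. % micronucleated cells (illustrative)
print(slope_through_origin(doses, responses))  # ≈ 2.1 for this proportional data
```

Departures of low-dose data from this fitted line are what the long-duration experiments above report as a failure of the LNT prediction.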

  15. Statistical mechanics of normal grain growth in one dimension: A partial integro-differential equation model

    International Nuclear Information System (INIS)

    Ng, Felix S.L.

    2016-01-01

    We develop a statistical-mechanical model of one-dimensional normal grain growth that does not require any drift-velocity parameterization for grain size, such as used in the continuity equation of traditional mean-field theories. The model tracks the population by considering grain sizes in neighbour pairs; the probability of a pair having neighbours of certain sizes is determined by the size-frequency distribution of all pairs. Accordingly, the evolution obeys a partial integro-differential equation (PIDE) over ‘grain size versus neighbour grain size’ space, so that the grain-size distribution is a projection of the PIDE's solution. This model, which is applicable before as well as after statistically self-similar grain growth has been reached, shows that the traditional continuity equation is invalid outside this state. During statistically self-similar growth, the PIDE correctly predicts the coarsening rate, invariant grain-size distribution and spatial grain size correlations observed in direct simulations. The PIDE is then reducible to the standard continuity equation, and we derive an explicit expression for the drift velocity. It should be possible to formulate similar parameterization-free models of normal grain growth in two and three dimensions.

  16. A hybrid model for computing nonthermal ion distributions in a long mean-free-path plasma

    Science.gov (United States)

    Tang, Xianzhu; McDevitt, Chris; Guo, Zehua; Berk, Herb

    2014-10-01

    Non-thermal ions, especially the suprathermal ones, are known to make a dominant contribution to a number of important physical processes, such as the fusion reactivity in controlled fusion, the ion heat flux, and, in the case of a tokamak, the ion bootstrap current. Evaluating the deviation from a local Maxwellian distribution of these non-thermal ions can be a challenging task in the context of a global plasma fluid model that evolves the plasma density, flow, and temperature. Here we describe a hybrid model for coupling such constrained kinetic calculations to global plasma fluid models. The key ingredient is a non-perturbative treatment of the tail ions where the ion Knudsen number approaches or surpasses order unity. This can be sharply contrasted with the standard Chapman-Enskog approach, which relies on a perturbative treatment that is frequently invalidated. The accuracy of our coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space. Although our specific application examples will be drawn from laboratory controlled fusion experiments, the general approach is applicable to space and astrophysical plasmas as well. Work supported by DOE.

  17. Knowledge on radiation dose-rate for risk communication on nuclear power plants

    International Nuclear Information System (INIS)

    Sugiyama, Ken-ichiro

    2013-01-01

    The sense of anxiety on radiation after Fukushima Dai-ichi accident has not disappeared because of the nightmare scenario on radiation cultivated through the Cold War era starting at the atomic bomb dropping at Hiroshima and Nagasaki. In the present paper, from the viewpoint of establishing the social acceptance of nuclear power plants as well as new reasonable regulation, biological defense in depth (production of anti-oxidants, DNA repair, cell death/apoptosis, and immune defense mechanisms) found in a few decades are presented in comparison with the linear no-threshold (LNT) model for the induction of cancer in the range up to 100 mSv (as single or annual doses) applied for the present regulation. (author)
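The LNT arithmetic referred to in this record is simply proportionality of excess risk to dose with no threshold. The sketch below uses ICRP's nominal whole-population risk coefficient of roughly 5.5% per Sv; the dose is illustrative, and the point of the abstract is precisely that the biological defenses listed above are not captured by this linear rule.

```python
# Linear no-threshold (LNT) excess-risk arithmetic: risk = coefficient * dose,
# with no threshold. 0.055/Sv is ICRP's nominal whole-population detriment
# coefficient; the 100 mSv dose below is illustrative.
RISK_PER_SV = 0.055

def lnt_excess_risk(dose_msv):
    return RISK_PER_SV * (dose_msv / 1000.0)

print(round(lnt_excess_risk(100), 6))  # → 0.0055 (0.55% excess lifetime risk)
```

Under LNT the answer scales linearly, so halving the dose exactly halves the computed risk regardless of dose rate, which is the regulatory assumption the cited biological mechanisms call into question.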

  18. Thermoluminescence analysis of co-doped NaCl at low temperature irradiations

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Zaragoza, E., E-mail: ecruz@nucleares.unam.m [Instituto de Ciencias Nucleares, Universidad Nacional Autonoma de Mexico, A.P. 70-543, 04510 Mexico D.F. (Mexico); Ortiz, A. [Instituto de Ciencias Nucleares, Universidad Nacional Autonoma de Mexico, A.P. 70-543, 04510 Mexico D.F. (Mexico); Unidad Profesional Interdisciplinaria de Ingenieria y Tecnologias Avanzadas, IPN, Av. Instituto Politecnico Nacional 2580, Col. La Laguna Ticoman, 07340 Mexico D.F. (Mexico); Furetta, C. [Touro University Rome, Circne Gianicolense 15-17, 00153 Rome (Italy); Flores J, C.; Hernandez A, J.; Murrieta S, H. [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, A.P. 20-364, 01000 Mexico D.F. (Mexico)

    2011-02-15

    The thermoluminescent response and kinetics parameters of NaCl, doubly activated by Ca-Mn and Cd-Mn ions, exposed to gamma radiation are analyzed. The doped NaCl samples were irradiated at relative low temperature, i.e. at the liquid nitrogen temperature (LNT) and at dry ice temperature (DIT), and the glow curves obtained after 2 Gy of gamma irradiation were analyzed using the computerized glow curve deconvolution (CGCD). An evident variation in the glow curve structure after LNT and DIT was observed. It seems that different kinds of trapping levels are activated at relative low temperature. The original two prominent peaks in compositions A (Ca,Mn) and B (Ca,Mn) have been changed in only one main peak with satellites in the low temperature side of the glow curves. In compositions C (Cd,Mn) and D (Cd,Mn), low temperature peaks become stronger and prominent than the high temperature peaks; this effect could be explained considering that the trapping probability for low temperature traps, the one very close to the conduction band, is enhanced by low temperatures during irradiation.

  19. Treatments of intrinsic viscosity and glass transition temperature data of poly(2,6-dimethylphenylmethacrylate)

    International Nuclear Information System (INIS)

    Hamidi, Nasrollah; Massoudi, Ruhullah

    2003-01-01

    A useful relationship, ln(Tg) = ln(Tg,∞) − m[η]^(−ν), between intrinsic viscosity and glass transition temperature for a series of homologous polymers was obtained by combining the Mark-Houwink-Kuhn-Sakurada (MHKS) relation for intrinsic viscosity and molecular mass with the Fox-Flory equation for glass transition temperature and number-average molecular mass. This relationship was applied to poly(2,6-dimethylphenylmethacrylate) (PDMPh) in a variety of solvents (ideal to good) such as toluene, tetrahydrofuran/water, tetrahydrofuran, and chlorobenzene systems. The parameter α estimated by this procedure in the toluene, tetrahydrofuran/water, tetrahydrofuran, and chlorobenzene systems is 0.506, 0.511, 0.567, and 0.673, respectively, which are in agreement with the Mark-Houwink-Kuhn-Sakurada values to within 5%. The Tg,∞ quantity estimated from this equation is also within the standard deviation of that obtained from the Fox-Flory method. The m quantity increases as the thermodynamic quality of the solvent improves; therefore, m may be considered an indicator of coil conformations in a given solvent
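The quoted relationship can be reconstructed by combining the two named equations; the derivation below is our sketch (not reproduced from the paper), and it assumes K'/(T_g,∞ M_n) is small so that ln(1 − x) ≈ −x.

```latex
% MHKS relation:       [\eta] = K M^{a} \;\Rightarrow\; M_n = ([\eta]/K)^{1/a}
% Fox-Flory equation:  T_g = T_{g,\infty} - K'/M_n
\ln T_g
  = \ln\!\left[\, T_{g,\infty}\left(1 - \frac{K'}{T_{g,\infty} M_n}\right)\right]
  \approx \ln T_{g,\infty} - \frac{K'}{T_{g,\infty}}\left(\frac{K}{[\eta]}\right)^{1/a}
  = \ln T_{g,\infty} - m\,[\eta]^{-\nu},
\qquad m = \frac{K'}{T_{g,\infty}}\,K^{1/a}, \quad \nu = \frac{1}{a}.
```

With ν = 1/a, estimating ν from intrinsic viscosity and T_g data recovers the MHKS exponent, which is consistent with the α values the abstract reports agreeing with MHKS to within 5%.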

  20. Attack Tree Generation by Policy Invalidation

    NARCIS (Netherlands)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof; Kammüller, Florian; Naeem Akram, R.; Jajodia, S.

    2015-01-01

    Attacks on systems and organisations increasingly exploit human actors, for example through social engineering, complicating their formal treatment and automatic identification. Formalisation of human behaviour is difficult at best, and attacks on socio-technical systems are still mostly identified

  1. Uncertainty representation, quantification and evaluation for data and information fusion

    CSIR Research Space (South Africa)

    De Villiers, Johan P

    2015-07-01

    Full Text Available When uncertainties are not or are incorrectly accounted for, fusion processes may provide under- or overconfident results, or in some cases incorrect results. This is often owing to incorrect or invalid simplifying assumptions made during the modelling process. The authors investigate the sources...

  2. Mathematical modelling of complex contagion on clustered networks

    Science.gov (United States)

    O'Sullivan, David J.; O'Keeffe, Gary; Fennell, Peter; Gleeson, James

    2015-09-01

    The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease-spread which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010), to study disease-spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe, as in Centola’s experiments, faster diffusion on clustered topologies than on random networks.
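The social-reinforcement mechanism described here can be sketched as a simple threshold update: a node adopts once at least two neighbours have adopted. The graphs below are invented toy examples (not the networks studied in the paper); a chain of edge-sharing triangles relays pairs of adopted neighbours forward, while a tree with the same seeds stalls immediately.

```python
# Threshold ("complex contagion") toy: a node adopts once at least
# `theta` of its neighbours have adopted. Iterate to a fixed point.
def spread(adj, seeds, theta=2):
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in adopted and sum(n in adopted for n in nbrs) >= theta:
                adopted.add(node)
                changed = True
    return adopted

# Chain of edge-sharing triangles: (0,1,2), (1,2,3), (2,3,4), (3,4,5).
clustered = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3, 4],
             3: [1, 2, 4, 5], 4: [2, 3, 5], 5: [3, 4]}
# Tree seeded at the same pair: no triangles, so no double reinforcement.
tree = {0: [1, 2], 1: [0, 3], 2: [0, 4], 3: [1], 4: [2]}

print(len(spread(clustered, {0, 1})))  # → 6: the whole clustered graph adopts
print(len(spread(tree, {0, 1})))       # → 2: spread stalls at the seeds
```

This is the qualitative effect Centola observed and that mean-field tree approximations miss: triangles, not short path lengths, carry complex contagion.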

  3. Mathematical modelling of complex contagion on clustered networks

    Directory of Open Access Journals (Sweden)

    David J. P. O'Sullivan

    2015-09-01

    Full Text Available The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease-spread which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010), to study disease-spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe—as in Centola’s experiments—faster diffusion on clustered topologies than on random networks.

  4. ON THE LAMPPOST MODEL OF ACCRETING BLACK HOLES

    Energy Technology Data Exchange (ETDEWEB)

    Niedźwiecki, Andrzej; Szanecki, Michał [Łódź University, Department of Physics, Pomorska 149/153, 90-236 Łódź (Poland); Zdziarski, Andrzej A. [Centrum Astronomiczne im. M. Kopernika, Bartycka 18, 00-716 Warszawa (Poland)

    2016-04-10

    We study the lamppost model, in which the X-ray source in accreting black hole (BH) systems is located on the rotation axis close to the horizon. We point out a number of inconsistencies in the widely used lamppost model relxilllp, e.g., neglecting the redshift of the photons emitted by the lamppost that are directly observed. They appear to invalidate those model fitting results for which the source distances from the horizon are within several gravitational radii. Furthermore, if those results were correct, most of the photons produced in the lamppost would be trapped by the BH, and the luminosity generated in the source as measured at infinity would be much larger than that observed. This appears to be in conflict with the observed smooth state transitions between the hard and soft states of X-ray binaries. The required increase of the accretion rate and the associated efficiency reduction also present a problem for active galactic nuclei. Then, those models imply the luminosity measured in the local frame is much higher than that produced in the source and measured at infinity, due to the additional effects of time dilation and redshift, and the electron temperature is significantly higher than that observed. We show that these conditions imply that the fitted sources would be out of the e± pair equilibrium. On the other hand, the above issues pose relatively minor problems for sources at large distances from the BH, where relxilllp can still be used.

  5. KGB methods suspected in eavesdropping affair

    Index Scriptorium Estoniae

    2011-01-01

    Ainars Slesers, chairman of Latvia's First Party/Latvian Way, said on the 5 May broadcast of TV channel LNT's programme "900 sekundit" that the calls of foreign officials have probably been wiretapped at the Radisson Blu Ridzene hotel in Latvia. Hotel representative Aiga Lapina confirmed that the hotel knows nothing about this and is willing to look into the matter.

  6. Relationship between thermoluminescence and X-ray induced luminescence in alkali halides

    International Nuclear Information System (INIS)

    Aguilar, M.; Lopez, F.J.; Jaque, F.

    1978-01-01

    The wavelength spectra of thermoluminescence and of X-ray induced luminescence in pure and divalent-cation-doped alkali halides have been studied in the temperature range LNT-RT. The most important conclusion is that the wavelength spectra in both cases are very similar, which allows a new point of view on thermoluminescence mechanisms to be presented. (author)

  7. Integration of Heterogeneous Bibliographic Information through Data Abstractions.

    Science.gov (United States)

    1986-01-01

    11-12 [COMPENDEX] a) Electronics v 56 n 7 Apr 7 1983 p 155-157. b) IEEE Trans Magn v Mag-14 n 5 Sep 1978, INTERMAG (Int Magn) Conf, Florence, Italy ...developed geographically distributed information systems such as DOE/ PECaN . DOD/OROLS. NASA/RECON. CAS On-Line. OARC (France) and DECHEMA (West Germany

  8. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data. Such automatically recorded data can, however, be manually edited or invalidated by clinicians. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters, as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.
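The study's finding that invalidation reduces parameter variance can be illustrated with a toy calculation. The numbers below are hypothetical, not from the study: removing a single artifactual extreme reading from a short heart-rate series shrinks its variance.

```python
# Hypothetical illustration: a clinician invalidates one artifactual
# extreme heart-rate reading, which reduces the variance of the
# recorded parameter (the pattern the study reports).
from statistics import pvariance

heart_rate = [72, 75, 74, 73, 180, 74, 72]  # 180 = motion artifact (invented)
valid_only = [v for v in heart_rate if v < 150]  # artifact invalidated

print(pvariance(heart_rate), pvariance(valid_only))
assert pvariance(valid_only) < pvariance(heart_rate)
```

The same mechanism explains the paper's observation in aggregate: edited records cluster more tightly around physiologically plausible values.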

  9. Restoration of dimensional reduction in the random-field Ising model at five dimensions

    Science.gov (United States)

    Fytas, Nikolaos G.; Martín-Mayor, Víctor; Picco, Marco; Sourlas, Nicolas

    2017-04-01

    The random-field Ising model is one of the few disordered systems where the perturbative renormalization group can be carried out to all orders of perturbation theory. This analysis predicts dimensional reduction, i.e., that the critical properties of the random-field Ising model in D dimensions are identical to those of the pure Ising ferromagnet in D − 2 dimensions. It is well known that dimensional reduction is not true in three dimensions, thus invalidating the perturbative renormalization group prediction. Here, we report high-precision numerical simulations of the 5D random-field Ising model at zero temperature. We illustrate universality by comparing different probability distributions for the random fields. We compute all the relevant critical exponents (including the critical slowing down exponent for the ground-state finding algorithm), as well as several other renormalization-group invariants. The estimated values of the critical exponents of the 5D random-field Ising model are statistically compatible with those of the pure 3D Ising ferromagnet. These results support the restoration of dimensional reduction at D = 5. We thus conclude that the failure of the perturbative renormalization group is a low-dimensional phenomenon. We close our contribution by comparing universal quantities for the random-field problem at dimensions 3 ≤ D < 6 to those of the pure Ising model at D − 2 dimensions, and we provide a clear verification of the Rushbrooke equality at all studied dimensions.

  10. Implications of protein polymorphism on protein phase behaviour

    NARCIS (Netherlands)

    Stegen, J.; Schoot, van der P.P.A.M.

    2015-01-01

    The phase behaviour of small globular proteins is often modeled by approximating them as spherical particles with fixed internal structure. However, changes in the local environment of a protein can lead to changes in its conformation rendering this approximation invalid. We present a simple

  11. A set-valued approach to FDI and FTC: Theory and implementation issues

    DEFF Research Database (Denmark)

    Rosa, Paulo Andre Nobre; Casau, Pedro; Silvestre, Carlos

    2012-01-01

    A complete methodology to design robust Fault Detection and Isolation (FDI) filters and Fault Tolerant Control (FTC) schemes for Linear Time-Varying (LTV) systems is proposed. The paper takes advantage of the recent advances in model invalidation using Set-Valued Observers (SVOs) that led...

  12. Scenario and parameter studies on global deposition of radioactivity using the computer model GLODEP2

    International Nuclear Information System (INIS)

    Shapiro, C.S.

    1984-08-01

    The GLODEP2 computer code was utilized to determine biological impact to humans on a global scale using up-to-date estimates of biological risk. These risk factors use varied biological damage models for assessing effects. All the doses reported are the unsheltered, unweathered, smooth terrain, external gamma dose. We assume the unperturbed atmosphere in determining injection and deposition. Effects due to "nuclear winter" may invalidate this assumption. The calculations also include scenarios that attempt to assess the impact of the changing nature of the nuclear stockpile. In particular, the shift from larger to smaller yield nuclear devices significantly changes the injection pattern into the atmosphere, and hence significantly affects the radiation doses that ensue. We have also looked at injections into the equatorial atmosphere. In total, we report here the results for 8 scenarios. 10 refs., 6 figs., 11 tabs

  13. On the (In)Validity of Tests of Simple Mediation: Threats and Solutions

    OpenAIRE

    Pek, Jolynn; Hoyle, Rick H.

    2016-01-01

    Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that are inherent in simple mediation. The extent to which results are meaningful stem directly from choi...

  14. Criticisms and defences of the balance-of-payments constrained growth model: some old, some new

    Directory of Open Access Journals (Sweden)

    John S.L. McCombie

    2011-12-01

    Full Text Available This paper assesses various critiques that have been levelled over the years against Thirlwall’s Law and the balance-of-payments constrained growth model. It starts by assessing the criticisms that the law is largely capturing an identity; that the law of one price renders the model incoherent; and that statistical testing using cross-country data rejects the hypothesis that the actual and the balance-of-payments equilibrium growth rates are the same. It goes on to consider the argument that calculations of the “constant-market-shares” income elasticities of demand for exports demonstrate that the UK (and by implication other advanced countries) could not have been balance-of-payments constrained in the early postwar period. Next, Krugman’s interpretation of the law (or what he terms the “45-degree rule”), which is at variance with the usual demand-oriented explanation, is examined. The paper next assesses attempts to reconcile the demand and supply side of the model and examines whether or not the balance-of-payments constrained growth model is subject to the fallacy of composition. It concludes that none of these criticisms invalidate the model, which remains a powerful explanation of why growth rates differ.
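Thirlwall's Law itself is compact enough to state numerically. A minimal sketch, with illustrative elasticity values rather than estimates for any country: the balance-of-payments constrained growth rate equals world income growth scaled by the ratio of the income elasticities of demand for exports and imports, y_B = ε·z/π.

```python
# Minimal numerical sketch of Thirlwall's Law, y_B = (epsilon * z) / pi:
# epsilon = income elasticity of demand for the country's exports,
# pi = income elasticity of demand for its imports, z = world income
# growth (% per year). Values below are purely illustrative.
def thirlwall_growth(epsilon, pi, z):
    """Balance-of-payments equilibrium growth rate."""
    return epsilon * z / pi

# A country whose exports are less income-elastic than its imports
# grows more slowly than the world economy:
y_b = thirlwall_growth(epsilon=1.0, pi=1.5, z=3.0)
print(y_b)  # 2.0
```

The asymmetry in the elasticities, not the level of world growth, is what constrains relative growth rates in the model, which is why the critiques above focus on how those elasticities are estimated.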

  15. Risk Assessments for Workers and the Population Following the Chernobyl Accident. Annex XI of Technical Volume 4

    International Nuclear Information System (INIS)

    2015-01-01

    The doses received by emergency workers at the Fukushima Daiichi NPP were much lower than those of the Chernobyl emergency workers (referred to as ‘liquidators’) and there is no evidence of an increased risk for Chernobyl workers below an equivalent dose of 150 mGy, so the inferred risks are expected to be small. Nevertheless, it is useful to adopt a similar modelling approach. The main estimates of radiation risks for the cohort of Chernobyl emergency workers who received moderate doses are in good quantitative agreement with the results for atomic bomb survivors if the linear non-threshold (LNT) model is used. The minimum latency period for radiation related solid cancers in the Russian cohort was estimated as four years. No statistically significant relationship was found between the thyroid cancer incidence and external radiation for the Russian cohort of liquidators
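The linear non-threshold model invoked above has a simple functional form. A minimal sketch follows; the slope is an illustrative number, not a fitted coefficient from the Chernobyl or atomic bomb survivor cohorts.

```python
# Hedged sketch of the LNT dose-response shape: relative risk rises
# linearly with dose and has no threshold. `beta_per_gy` is an
# illustrative slope, not an epidemiological estimate.
def lnt_relative_risk(dose_gy, beta_per_gy):
    """RR(D) = 1 + beta * D under the linear non-threshold model."""
    return 1.0 + beta_per_gy * dose_gy

print(lnt_relative_risk(0.0, 0.5))   # 1.0 (no excess risk at zero dose)
# Under LNT even a modest dose, e.g. 150 mGy, implies some excess risk:
print(lnt_relative_risk(0.15, 0.5))
```

The "no threshold" property is simply that the excess relative risk is positive for any non-zero dose, which is why the 150 mGy figure mentioned above matters for inferring small risks.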

  16. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters under changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
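The ANOVA-based segregation of uncertainty can be sketched for a toy two-factor design (GCM × emission scenario, one projection per cell). The numbers are invented, and the decomposition shown is the textbook two-way sum-of-squares split, not the paper's full five-source analysis.

```python
# Sketch of variance partitioning: total sum of squares of hypothetical
# streamflow projections splits into GCM and scenario main effects plus
# an interaction/residual term. All values are illustrative.
flows = {  # (GCM, scenario) -> projected mean flow (arbitrary units)
    ("gcm1", "rcp4.5"): 100.0, ("gcm1", "rcp8.5"): 90.0,
    ("gcm2", "rcp4.5"): 120.0, ("gcm2", "rcp8.5"): 104.0,
}
gcms = sorted({g for g, _ in flows})
scens = sorted({s for _, s in flows})
grand = sum(flows.values()) / len(flows)
mean_g = {g: sum(flows[g, s] for s in scens) / len(scens) for g in gcms}
mean_s = {s: sum(flows[g, s] for g in gcms) / len(gcms) for s in scens}

ss_gcm = len(scens) * sum((mean_g[g] - grand) ** 2 for g in gcms)
ss_scen = len(gcms) * sum((mean_s[s] - grand) ** 2 for s in scens)
ss_tot = sum((v - grand) ** 2 for v in flows.values())
ss_int = ss_tot - ss_gcm - ss_scen  # interaction + unexplained variability

print(ss_gcm, ss_scen, ss_int)  # 289.0 169.0 9.0
```

Here the GCM factor dominates, mirroring the paper's qualitative finding that GCMs (with their scenario interactions) are a leading uncertainty source.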

  17. [How valid are student self-reports of bullying in schools?].

    Science.gov (United States)

    Morbitzer, Petra; Spröber, Nina; Hautzinger, Martin

    2009-01-01

    In this study we examine the reliability and validity of students' self-reports about bullying and victimization in schools. 208 5th-class students of four "middle schools" in Southern Germany filled in the Bully-Victim Questionnaire (Olweus, 1989, adapted by Lösel, Bliesener, & Averbeck, 1997) and the School Climate Survey (Brockenborough, 2001) to assess the prevalence of bullying/victimization and to evaluate attitudes towards aggression and support for victims. By using reliability and validity criteria, one third (31%) of the questionnaires was classified as "unreliable/invalid". Mean comparisons of the "unreliable/invalid" group and the "valid" group on the subscales concerning bullying/victimization found significant differences. The "unreliable/invalid" group stated higher values of bullying and victimization. Based on the "unreliable/invalid" questionnaires, more students could be identified as bullies, victims, or bully-victims. The prevalence of bullying/victimization in the whole sample was reduced if "unreliable/invalid" questionnaires were excluded. The results are discussed in the framework of theories about the presentation of the self ("impression management", "social desirability") and systematic response patterns ("extreme response bias").

  18. Validity of two-phase polymer electrolyte membrane fuel cell models with respect to the gas diffusion layer

    Science.gov (United States)

    Ziegler, C.; Gerteisen, D.

    A dynamic two-phase model of a proton exchange membrane fuel cell with respect to the gas diffusion layer (GDL) is presented and compared with chronoamperometric experiments. Very good agreement between experiment and simulation is achieved for potential step voltammetry (PSV) and sine wave testing (SWT). Homogenized two-phase models can be categorized into unsaturated flow theory (UFT) and multiphase mixture (M²) models. Both model approaches use the continuum hypothesis as their fundamental assumption. Cyclic voltammetry experiments show that there is a deterministic and a stochastic liquid transport mode depending on the fraction of hydrophilic pores of the GDL. ESEM imaging is used to investigate the morphology of the liquid water accumulation in the pores of two different media (unteflonated Toray-TGP-H-090 and hydrophobic Freudenberg H2315 I3). The morphology of the liquid water accumulation is related to the cell behavior. The results show that UFT and M² two-phase models are a valid approach for diffusion media with a large fraction of hydrophilic pores, such as unteflonated Toray-TGP-H paper. However, the use of the homogenized UFT and M² models appears to be invalid for GDLs with a large fraction of hydrophobic pores, which corresponds to a high average contact angle of the GDL.

  19. Neural correlates of the spatial and expectancy components of endogenous and stimulus-driven orienting of attention in the Posner task.

    Science.gov (United States)

    Doricchi, Fabrizio; Macci, Enrica; Silvetti, Massimo; Macaluso, Emiliano

    2010-07-01

    Voluntary orienting of visual attention is conventionally measured in tasks with predictive central cues followed by frequent valid targets at the cued location and by infrequent invalid targets at the uncued location. This implies that invalid targets entail both spatial reorienting of attention and breaching of the expected spatial congruency between cues and targets. Here, we used event-related functional magnetic resonance imaging (fMRI) to separate the neural correlates of the spatial and expectancy components of both endogenous orienting and stimulus-driven reorienting of attention. We found that during endogenous orienting with predictive cues, there was a significant deactivation of the right Temporal-Parietal Junction (TPJ). We also discovered that the lack of an equivalent deactivation with nonpredictive cues was matched by a drop in attentional costs and preservation of attentional benefits. The right TPJ showed equivalent responses to invalid targets following predictive and nonpredictive cues. On the contrary, infrequent-unexpected invalid targets following predictive cues specifically activated the right Middle and Inferior Frontal Gyrus (MFG-IFG). Additional comparisons with spatially neutral trials demonstrated that, independently of cue predictiveness, valid targets activate the left TPJ, whereas invalid targets activate both the left and right TPJs. These findings show that the selective right TPJ activation that is found in the comparison between invalid and valid trials results from the reciprocal cancelling of the different activations that in the left TPJ are related to the processing of valid and invalid targets. We propose that left and right TPJs provide "matching and mismatching to attentional template" signals. These signals enable reorienting of attention and play a crucial role in the updating of the statistical contingency between cues and targets.
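The attentional "costs" and "benefits" mentioned above are defined by simple reaction-time contrasts. The following arithmetic uses hypothetical reaction times (ms), invented only to match the qualitative pattern the study reports.

```python
# Illustrative arithmetic only: hypothetical mean reaction times (ms).
# Benefits compare valid against neutral trials; costs compare invalid
# against neutral trials (the standard Posner-task contrasts).
def costs_and_benefits(rt_valid, rt_neutral, rt_invalid):
    benefit = rt_neutral - rt_valid   # speed-up at the validly cued location
    cost = rt_invalid - rt_neutral    # slow-down from reorienting attention
    return benefit, cost

# Predictive cues: both components present.
print(costs_and_benefits(350, 380, 430))  # (30, 50)
# Nonpredictive cues: costs drop while benefits are preserved
# (numbers invented to match that reported pattern).
print(costs_and_benefits(355, 380, 385))  # (25, 5)
```

Separating the two contrasts is what lets the fMRI analysis attribute reorienting to one component and expectancy violation to the other.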

  20. Semiparametric Theory and Missing Data

    CERN Document Server

    Tsiatis, Anastasios A

    2006-01-01

    Missing data arise in almost all scientific disciplines. In many cases, missing data in an analysis are treated in a casual and ad hoc manner, leading to invalid inferences and erroneous conclusions. This book summarizes knowledge regarding the theory of estimation for semiparametric models with missing data.
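One idea from that theory can be shown in miniature: when data are missing at random with known observation probabilities, weighting each observed value by the inverse of its observation probability corrects the bias of the naive observed-case mean. The numbers below are synthetic and deliberately constructed so that the realized observations match their expected frequencies, making the weighted estimate land exactly on the true mean.

```python
# Sketch of inverse-probability weighting (IPW) for missing data.
# Synthetic population: eight low values always observed, two high
# values each observed with probability 0.5 (one happened to be seen).
values = [10.0] * 8 + [20.0] * 2      # true population mean = 12.0
probs = [1.0] * 8 + [0.5] * 2         # observation probabilities
observed = list(range(8)) + [8]       # indices actually observed

# Naive observed-case mean under-represents the rarely observed highs:
naive = sum(values[i] for i in observed) / len(observed)
# IPW (Horvitz-Thompson) mean reweights each observation by 1/prob:
ipw = sum(values[i] / probs[i] for i in observed) / len(values)
print(naive, ipw)  # ipw recovers the true mean, 12.0
```

In general the IPW estimator is unbiased in expectation rather than exact in every sample; semiparametric theory characterizes how efficient such estimators can be.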

  1. A novel mouse model of creatine transporter deficiency [v2; ref status: indexed, http://f1000r.es/4zb

    Directory of Open Access Journals (Sweden)

    Laura Baroncelli

    2015-01-01

    Full Text Available Mutations in the creatine (Cr) transporter (CrT) gene lead to cerebral creatine deficiency syndrome-1 (CCDS1), an X-linked metabolic disorder characterized by cerebral Cr deficiency causing intellectual disability, seizures, movement and behavioral disturbances, and language and speech impairment (OMIM #300352). CCDS1 is still an untreatable pathology that can be very invalidating for patients and caregivers. Only two murine models of CCDS1, one of which is a ubiquitous knockout mouse, are currently available to study the possible mechanisms underlying the pathologic phenotype of CCDS1 and to develop therapeutic strategies. Given the importance of validating phenotypes and the efficacy of promising treatments in more than one mouse model, we have generated a new murine model of CCDS1 obtained by ubiquitous deletion of exons 5-7 of the Slc6a8 gene. We showed a remarkable Cr depletion in the murine brain tissues and cognitive defects, thus resembling the key features of human CCDS1. These results confirm that CCDS1 can be well modeled in mice. This CrT−/y murine model will provide a new tool for increasing the relevance of preclinical studies to the human disease.

  2. Microwave-based investigation of electrochemical processes in catalysts and related systems; Mikrowellengestuetzte Aufklaerung elektronischer Vorgaenge in Katalysatoren und verwandten Systemen

    Energy Technology Data Exchange (ETDEWEB)

    Fischerauer, Gerhard; Spoerl, Matthias; Reiss, Sebastian; Moos, Ralf [Bayreuth Univ. (DE). Bayreuth Engine Research Center (BERC)

    2010-07-01

    Technically important electrochemical reactions often occur at high temperatures and inside bulky structures. The difficulties associated with their direct observation are usually circumvented by indirect measurement strategies. This contribution reports on a microwave-based direct measurement method and the results obtained when it was applied to systems such as three-way catalysts (TWC), lean NOx traps (LNT), and diesel particulate filters (DPF). (orig.)

  3. Chronic 5-HT4 receptor agonist treatment restores learning and memory deficits in a neuroendocrine mouse model of anxiety/depression.

    Science.gov (United States)

    Darcet, Flavie; Gardier, Alain M; David, Denis J; Guilloux, Jean-Philippe

    2016-03-11

    Cognitive disturbances are often reported as serious invalidating symptoms in patients suffering from major depressive disorder (MDD) and are not fully corrected by classical monoaminergic antidepressant drugs. While the role of 5-HT4 receptor agonists as cognitive enhancers is well established in naïve animals and in animal models of cognitive impairment, their cognitive effects in the context of stress remain to be examined. Using a mouse model of anxiety/depression (the CORT model), we report that chronic 5-HT4 agonist treatment (RS67333, 1.5 mg/kg/day) restored chronic corticosterone-induced cognitive deficits, including episodic-like, associative, and spatial learning and memory impairments. By contrast, chronic treatment with the monoaminergic antidepressant fluoxetine (18 mg/kg/day) only partially restored spatial learning and memory deficits and had no effect in the associative/contextual task. These results suggest differential mechanisms underlying the cognitive effects of these drugs. Finally, the present study highlights 5-HT4 receptor stimulation as a promising therapeutic mechanism to alleviate cognitive symptoms related to MDD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. A model for electron currents near a field null

    International Nuclear Information System (INIS)

    Stark, R.A.; Miley, G.H.

    1987-01-01

    The fluid approximation is invalid near a field null, since the local electron orbit size and the magnetic scale length are comparable. To model the electron currents in this region we propose a single equation of motion describing the bulk electron dynamics. The equation applies to the plasma within one thermal orbit size of the null. The region is treated as unmagnetized; electrons are accelerated by the inductive electric field and drag on ions; damping is provided by viscosity due to electrons and collisions with ions. Through variational calculations and a particle tracking code for electrons, the size of the terms in the equation of motion have been estimated. The resulting equation of motion combines with Faraday's Law to produce a governing equation which implicitly contains the self inductive field of the electrons. This governing equation predicts that viscosity prevents complete cancellation of the ion current density by the electrons in the null region. Thus electron dynamics near the field null should not prevent the formation and deepening of field reversal using neutral-beam injection

  5. Do non-targeted effects increase or decrease low dose risk in relation to the linear-non-threshold (LNT) model?

    International Nuclear Information System (INIS)

    Little, M.P.

    2010-01-01

    In this paper we review the evidence for departure from linearity for malignant and non-malignant disease and in the light of this assess likely mechanisms, and in particular the potential role for non-targeted effects. Excess cancer risks observed in the Japanese atomic bomb survivors and in many medically and occupationally exposed groups exposed at low or moderate doses are generally statistically compatible. For most cancer sites the dose-response in these groups is compatible with linearity over the range observed. The available data on biological mechanisms do not provide general support for the idea of a low dose threshold or hormesis. This large body of evidence does not suggest, indeed is not statistically compatible with, any very large threshold in dose for cancer, or with possible hormetic effects, and there is little evidence of the sorts of non-linearity in response implied by non-DNA-targeted effects. There are also excess risks of various types of non-malignant disease in the Japanese atomic bomb survivors and in other groups. In particular, elevated risks of cardiovascular disease, respiratory disease and digestive disease are observed in the A-bomb data. In contrast with cancer, there is much less consistency in the patterns of risk between the various exposed groups; for example, radiation-associated respiratory and digestive diseases have not been seen in these other (non-A-bomb) groups. Cardiovascular risks have been seen in many exposed populations, particularly in medically exposed groups, but in contrast with cancer there is much less consistency in risk between studies: risks per unit dose in epidemiological studies vary over at least two orders of magnitude, possibly a result of confounding and effect modification by well known (but unobserved) risk factors. 
In the absence of a convincing mechanistic explanation of epidemiological evidence that is, at present, less than persuasive, a cause-and-effect interpretation of the reported statistical associations for cardiovascular disease is unreliable but cannot be excluded. Inflammatory processes are the most likely mechanism by which radiation could modify the atherosclerotic disease process. If there is to be modification by low doses of ionizing radiation of cardiovascular disease through this mechanism, a role for non-DNA-targeted effects cannot be excluded.

  6. A novel mouse model of creatine transporter deficiency [v1; ref status: indexed, http://f1000r.es/4f8

    Directory of Open Access Journals (Sweden)

    Laura Baroncelli

    2014-09-01

    Full Text Available Mutations in the creatine (Cr) transporter (CrT) gene lead to cerebral creatine deficiency syndrome-1 (CCDS1), an X-linked metabolic disorder characterized by cerebral Cr deficiency causing intellectual disability, seizures, movement and behavioral disturbances, and language and speech impairment (OMIM #300352). CCDS1 is still an untreatable pathology that can be very invalidating for patients and caregivers. Only two murine models of CCDS1, one of which is a ubiquitous knockout mouse, are currently available to study the possible mechanisms underlying the pathologic phenotype of CCDS1 and to develop therapeutic strategies. Given the importance of validating phenotypes and the efficacy of promising treatments in more than one mouse model, we have generated a new murine model of CCDS1 obtained by ubiquitous deletion of exons 5-7 of the Slc6a8 gene. We showed a remarkable Cr depletion in the murine brain tissues and cognitive defects, thus resembling the key features of human CCDS1. These results confirm that CCDS1 can be well modeled in mice. This CrT−/y murine model will provide a new tool for increasing the relevance of preclinical studies to the human disease.

  7. The effect of differential motivation on IRT linking

    NARCIS (Netherlands)

    Mittelhaëuser, M.A.; Béguin, A.A.; Sijtsma, K.

    2015-01-01

    The purpose of this study was to investigate whether simulated differential motivation between the stakes for operational tests and anchor items produces an invalid linking result if the Rasch model is used to link the operational tests. This was done for an external anchor design and a variation of

  8. Precise generation of systems biology models from KEGG pathways.

    Science.gov (United States)

    Wrzodek, Clemens; Büchel, Finja; Ruff, Manuel; Dräger, Andreas; Zell, Andreas

    2013-02-21

    The KEGG PATHWAY database provides a plethora of pathways for a diversity of organisms. All pathway components are directly linked to other KEGG databases, such as KEGG COMPOUND or KEGG REACTION. Therefore, the pathways can be extended with an enormous amount of information and provide a foundation for initial structural modeling approaches. As a drawback, KGML-formatted KEGG pathways are primarily designed for visualization purposes and often omit important details for the sake of a clear arrangement of their entries. Thus, a direct conversion into systems biology models would produce incomplete and erroneous models. Here, we present a precise method for processing and converting KEGG pathways into initial metabolic and signaling models encoded in the standardized community pathway formats SBML (Levels 2 and 3) and BioPAX (Levels 2 and 3). This method involves correcting invalid or incomplete KGML content, creating complete and valid stoichiometric reactions, translating relations into signaling models, and augmenting the pathway content with various information, such as cross-references to Entrez Gene, OMIM, UniProt, ChEBI, and many more. Finally, we compare several existing conversion tools for KEGG pathways and show that the conversion from KEGG to BioPAX does not involve a loss of information, whilst lossless translations to SBML can only be performed using SBML Level 3, including its recently proposed qualitative models and groups extension packages. Building correct BioPAX and SBML signaling models from the KEGG database is a unique characteristic of the proposed method. Further, there is no other approach that is able to appropriately construct metabolic models from KEGG pathways, including correct reactions with stoichiometry. The resulting initial models, which contain valid and comprehensive SBML or BioPAX code and a multitude of cross-references, lay the foundation to facilitate further modeling steps.

  9. Why we need new approaches to low-dose risk modeling

    International Nuclear Information System (INIS)

    Alvarez, J.L.; Seiler, F.A.

    1996-01-01

    The linear no-threshold model for radiation effects was introduced as a conservative model for the design of radiation protection programs. The model has persisted not only as the basis for such programs, but has come to be treated as a dogma and is often confused with scientific fact. In this examination a number of serious problems with the linear no-threshold model of radiation carcinogenesis were demonstrated, many of them invalidating the hypothesis. It was shown that the relative risk formalism did not approach 1 as the dose approaches zero. When mortality ratios were used instead, the data in the region below 0.3 Sv were systematically below the predictions of the linear model. It was also shown that the data above 0.3 Sv were of little use in formulating a model at low doses. In addition, these data are valid only for doses accumulated at high dose rates, and there is no scientific justification for using the model in low-dose, low-dose-rate extrapolations for purposes of radiation protection. Further examination of model fits to the Japanese survivor data was undertaken. Several such models were fit to the data, including an unconstrained linear, a linear-square root, and a Weibull model, all of which fit the data better than the relative risk, linear no-threshold model. These fits were used to demonstrate that the linear model systematically overestimates the risk at low doses in the Japanese survivor data set. It is recommended here that an unbiased re-analysis of the data be undertaken and the results used to construct a new model, based on all pertinent data. This model could then form the basis for managing radiation risks in the appropriate regions of dose and dose rate.
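The model comparison described above can be sketched numerically. The following Python snippet uses entirely synthetic numbers (not the Japanese survivor data; the 0.3 Sv figure is borrowed from the abstract purely for illustration) to fit an LNT model and a threshold model by least squares and compare their residuals:

```python
import numpy as np

# Synthetic excess-relative-risk data generated from a threshold model.
rng = np.random.default_rng(0)
dose = np.linspace(0.0, 2.0, 21)                    # Sv
threshold = 0.3                                      # assumed threshold, Sv
true_err = 0.5 * np.clip(dose - threshold, 0.0, None)
err = true_err + rng.normal(0.0, 0.02, dose.size)    # add measurement noise

def rss_lnt(d, y):
    # LNT: ERR = beta * d, least-squares slope through the origin
    beta = d @ y / (d @ d)
    return float(np.sum((y - beta * d) ** 2))

def rss_threshold(d, y, t):
    # Threshold model: ERR = beta * max(d - t, 0)
    x = np.clip(d - t, 0.0, None)
    beta = x @ y / (x @ x)
    return float(np.sum((y - beta * x) ** 2))

print(rss_lnt(dose, err), rss_threshold(dose, err, threshold))
```

On data generated with a threshold, the LNT fit leaves a systematically larger residual, the same qualitative pattern the abstract reports for the low-dose region.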

  10. Incidents Prediction in Road Junctions Using Artificial Neural Networks

    Science.gov (United States)

    Hajji, Tarik; Alami Hassani, Aicha; Ouazzani Jamil, Mohammed

    2018-05-01

    The implementation of an incident detection system (IDS) is an indispensable operation in the analysis of road traffic. However, an IDS can in no case replace the classical monitoring system controlled by the human eye. The aim of this work is to increase the probability of detecting and predicting incidents in camera-monitored areas, given that these areas are monitored by multiple cameras but few supervisors. Our solution is to use Artificial Neural Networks (ANN) to analyze the trajectories of moving objects on captured images. We first propose a model of the trajectories and their characteristics, then develop a learning database of valid and invalid trajectories, and finally carry out a comparative study to find the artificial neural network architecture that maximizes the recognition rate of valid and invalid trajectories.
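As a rough illustration of the idea (not the authors' architecture or data), even a single logistic neuron trained on two invented trajectory summary features separates synthetic "valid" and "invalid" clusters:

```python
import numpy as np

# Invented features: (mean speed, turning-angle variance), one row per trajectory.
rng = np.random.default_rng(42)
n = 200
valid = rng.normal([1.0, 0.2], 0.1, (n, 2))      # hypothetical "valid" cluster
invalid = rng.normal([2.0, 1.0], 0.1, (n, 2))    # hypothetical "invalid" cluster
X = np.vstack([valid, invalid])
y = np.concatenate([np.zeros(n), np.ones(n)])    # 0 = valid, 1 = invalid

w = np.zeros(2)
b = 0.0
for _ in range(500):                             # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))       # sigmoid activation
    grad = p - y                                 # cross-entropy gradient
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
accuracy = float(np.mean(pred == (y == 1)))
print(accuracy)
```

A real comparative study, as in the paper, would vary the network architecture and measure recognition rates on held-out trajectories.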

  11. On the (In)Validity of Tests of Simple Mediation: Threats and Solutions

    Science.gov (United States)

    Pek, Jolynn; Hoyle, Rick H.

    2015-01-01

    Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that is inherent in simple mediation. The extent to which results are meaningful stems directly from choices regarding these three facets of mediation analysis. We conclude by discussing how mediation analysis can be better applied to examine causal processes, highlight the limits of simple mediation, and make recommendations for better practice. PMID:26985234

  12. Aminoacylation of the N-terminal cysteine is essential for Lol-dependent release of lipoproteins from membranes but does not depend on lipoprotein sorting signals.

    Science.gov (United States)

    Fukuda, Ayumu; Matsuyama, Shin-Ichi; Hara, Takashi; Nakayama, Jiro; Nagasawa, Hiromichi; Tokuda, Hajime

    2002-11-08

    Lipoproteins are present in a wide variety of bacteria and are anchored to membranes through lipids attached to the N-terminal cysteine. The Lol system of Escherichia coli mediates the membrane-specific localization of lipoproteins. Aspartate at position 2 functions as a Lol avoidance signal and causes the retention of lipoproteins in the inner membrane, whereas lipoproteins having residues other than aspartate at position 2 are released from the inner membrane and localized to the outer membrane by the Lol system. Phospholipid:apolipoprotein transacylase, Lnt, catalyzes the last step of lipoprotein modification, converting apolipoprotein into mature lipoprotein. To reveal the importance of this aminoacylation for the Lol-dependent membrane localization, apolipoproteins were prepared by inhibiting lipoprotein maturation. Lnt was also purified and used to convert apolipoprotein into mature lipoprotein in vitro. The release of these lipoproteins was examined in proteoliposomes. We show here that the aminoacylation is essential for the Lol-dependent release of lipoproteins from membranes. Furthermore, lipoproteins with aspartate at position 2 were found to be aminoacylated both in vivo and in vitro, indicating that the lipoprotein-sorting signal does not affect lipid modification.

  13. The dose makes the poison. Even for radiation; Die Dosis macht das Gift. Auch bei Strahlenbelastung

    Energy Technology Data Exchange (ETDEWEB)

    Langeheine, Juergen

    2014-11-15

    "The dose makes the poison", a quote from Paracelsus, a physician who lived half a millennium ago, is still valid today. Nevertheless, this generally accepted fact is set aside when it comes to ionizing radiation, which is often wrongly referred to as radioactive radiation. Here the LNT hypothesis (Linear No Threshold) applies, a dose-effect relationship agreed on by the ICRP, the International Commission on Radiological Protection, which underlies the EU directives and the German Radiation Protection Ordinance. The LNT hypothesis states that even the smallest dose of radiation already carries a potential for harm. It was introduced as a precaution, on the assumption that self-healing mechanisms of cells damaged even by weak radiation can be excluded and that every radiation-induced damage inevitably leads to cell mutation and with it to cancer development. Without further knowledge, it was assumed that the same mechanism of cancer development applies to high and to small doses. This assumption has turned out to be wrong, as findings are increasingly reported which show that small doses of ionizing radiation demonstrably cause no damage and, on the contrary, can even be beneficial.

  14. The conceptualization model problem—surprise

    Science.gov (United States)

    Bredehoeft, John

    2005-03-01

    The foundation of model analysis is the conceptual model. Surprise is defined as new data that renders the prevailing conceptual model invalid; as defined here it represents a paradigm shift. Limited empirical data indicate that surprises occur in 20-30% of model analyses. These data suggest that groundwater analysts have difficulty selecting the appropriate conceptual model. There is no ready remedy to the conceptual model problem other than (1) to collect as much data as is feasible, using all applicable methods (a complementary data collection methodology can lead to new information that changes the prevailing conceptual model), and (2) for the analyst to remain open to the fact that the conceptual model can change dramatically as more information is collected. In the final analysis, the hydrogeologist makes a subjective decision on the appropriate conceptual model. The conceptualization problem does not render models unusable. The problem introduces an uncertainty that often is not widely recognized. Conceptual model uncertainty is exacerbated in making long-term predictions of system performance.

  15. Analysis of correlations between sites in models of protein sequences

    International Nuclear Information System (INIS)

    Giraud, B.G.; Lapedes, A.; Liu, L.C.

    1998-01-01

    A criterion based on conditional probabilities, related to the concept of algorithmic distance, is used to detect correlated mutations at noncontiguous sites on sequences. We apply this criterion to the problem of analyzing correlations between sites in protein sequences; however, the analysis applies generally to networks of interacting sites with discrete states at each site. Elementary models, where explicit results can be derived easily, are introduced. The number of states per site considered ranges from 2, illustrating the relation to familiar classical spin systems, to 20 states, suitable for representing amino acids. Numerical simulations show that the criterion remains valid even when the genetic history of the data samples (e.g., protein sequences), as represented by a phylogenetic tree, introduces nonindependence between samples. Statistical fluctuations due to finite sampling are also investigated and do not invalidate the criterion. A subsidiary result is found: The more homogeneous a population, the more easily its average properties can drift from the properties of its ancestor. copyright 1998 The American Physical Society
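A common, simple instance of such a site-correlation criterion is the mutual information between two positions; the sketch below is the author's illustration of that general idea, not the paper's exact conditional-probability criterion:

```python
import math
from collections import Counter

def mutual_information(col_a, col_b):
    """Mutual information (in bits) between two sequence positions.

    col_a, col_b: equal-length lists of residues, one entry per sequence.
    """
    n = len(col_a)
    pa = Counter(col_a)                    # marginal counts at site A
    pb = Counter(col_b)                    # marginal counts at site B
    pab = Counter(zip(col_a, col_b))       # joint counts
    mi = 0.0
    for (a, b), c in pab.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab * n * n / (pa[a] * pb[b]))
    return mi

# Perfectly covarying sites carry maximal information about each other:
print(mutual_information(list("AAGG"), list("TTCC")))  # 1.0 bit
```

As the abstract notes, finite sampling and phylogenetic relatedness inflate such statistics, so in practice the raw value is compared against a null distribution rather than used directly.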

  16. The Impacts of Tuition Rate Changes on College Undergraduate Headcounts and Credit Hours Over Time--A Case Study.

    Science.gov (United States)

    Chressanthis, George A.

    1986-01-01

    Using 1964-1983 enrollment data for a small Michigan state college, this paper charts tuition rate change impacts on college undergraduate headcounts and credit hours over time. Results indicate that student behavior follows the law of demand, varies with class standing, corroborates human capital investment models, and invalidates uniform tuition…

  17. Soviet Dissident Scientists, 1966-78: A Study.

    Science.gov (United States)

    1979-06-01

    could be meaningfully determined. Other factors, such as marital status, career aspirations, or previous military service, might have been as relevant…

  18. Effect of Experience of Internal Medicine Residents during Infectious Disease Elective on Future Infectious Disease Fellowship Application

    Science.gov (United States)

    2017-10-04

    Experience of Internal Medicine Residents during Infectious Disease Elective on Future Infectious Disease Fellowship Application. Since 2008 at our institution, internal medicine (IM) residents have been required to do a four-week inpatient ID rotation as an intern…

  19. Ramsar hot springs: how safe is to live in an environment with high level of natural radiation

    International Nuclear Information System (INIS)

    Mortazavi, S.M.J.

    2005-01-01

    Ramsar in northern Iran is among the world's well-known areas with the highest levels of natural radiation. Annual exposure levels in areas with elevated levels of natural radiation in Ramsar are up to 260 mGy y -1 and average exposure rates are about 10 mGy y -1 for a population of about 2000 residents. Due to the local geology, which includes high levels of radium in rocks, soils, and groundwater, Ramsar residents are also exposed to high levels of alpha activity in the form of ingested radium and radium decay progeny as well as very high radon levels (over 1000 MBq m -3 ) in their dwellings. In some cases, the inhabitants of these areas receive doses much higher than the current ICRP-60 dose limit of 20 mSv y -1 . As the biological effects of low doses of radiation are not fully understood, the current radiation protection recommendations are based on the assumption of a linear, no-threshold (LNT) relationship between radiation dose and carcinogenic effects. Considering LNT, areas having such levels of natural radiation must be evacuated or at least require immediate remedial actions. Inhabitants of the high level natural radiation areas (HLNRAs) of Ramsar are largely unaware of natural radiation, radon, or its possible health effects, and they have not encountered any harmful effects due to living in their paternal houses. In this regard, it is often difficult to ask the inhabitants of the HLNRAs of Ramsar to carry out remedial actions. Despite the fact that, considering LNT and ALARA, public health in HLNRAs like Ramsar would be best served by relocating the inhabitants, the residents' health seems unaffected and relocation is upsetting to the residents. Based on the findings obtained by studies on the health effects of high levels of natural radiation in Ramsar, as well as other HLNRAs, no consistent detrimental effect has been detected so far. However, more research is needed to clarify if the regulatory authorities should set limiting

  20. Variation in consumption of human milk oligosaccharides by infant gut-associated strains of Bifidobacterium breve.

    Science.gov (United States)

    Ruiz-Moyano, Santiago; Totten, Sarah M; Garrido, Daniel A; Smilowitz, Jennifer T; German, J Bruce; Lebrilla, Carlito B; Mills, David A

    2013-10-01

    Human milk contains a high concentration of complex oligosaccharides that influence the composition of the intestinal microbiota in breast-fed infants. Previous studies have indicated that select species such as Bifidobacterium longum subsp. infantis and Bifidobacterium bifidum can utilize human milk oligosaccharides (HMO) in vitro as the sole carbon source, while the relatively few B. longum subsp. longum and Bifidobacterium breve isolates tested appear less adapted to these substrates. Considering the high frequency at which B. breve is isolated from breast-fed infant feces, we postulated that some B. breve strains can more vigorously consume HMO and thus are enriched in the breast-fed infant gastrointestinal tract. To examine this, a number of B. breve isolates from breast-fed infant feces were characterized for the presence of different glycosyl hydrolases that participate in HMO utilization, as well as by their ability to grow on HMO or specific HMO species such as lacto-N-tetraose (LNT) and fucosyllactose. All B. breve strains showed high levels of growth on LNT and lacto-N-neotetraose (LNnT), and, in general, growth on total HMO was moderate for most of the strains, with several strain differences. Growth and consumption of fucosylated HMO were strain dependent, mostly in isolates possessing a glycosyl hydrolase family 29 α-fucosidase. Glycoprofiling of the spent supernatant after HMO fermentation by select strains revealed that all B. breve strains can utilize sialylated HMO to a certain extent, especially sialyl-lacto-N-tetraose. Interestingly, this specific oligosaccharide was depleted before neutral LNT by strain SC95. In aggregate, this work indicates that the HMO consumption phenotype in B. breve is variable; however, some strains display specific adaptations to these substrates, enabling more vigorous consumption of fucosylated and sialylated HMO. These results provide a rationale for the predominance of this species in breast-fed infant feces and

  1. Distributed Generation in European Electricity Markets

    DEFF Research Database (Denmark)

    Ropenus, Stephanie

    ) shows that the problem mainly comes from the assumptions of the eddy-viscosity concept, which are deeply invalidated in the wind turbine wake region. Different models that intend to correct the k-ε model’s issues are investigated, none of which is found to be adequate. The mixing of the wake...

  2. Primary disability of the Chernobyl Accident consequences liquidators

    International Nuclear Information System (INIS)

    Zubritskij, M.K.; Plakhotya, L.P.; Kalinina, T.V.; Zhilinskaya, E.I.

    1994-01-01

    The structure of the causes of primary disability among the liquidators of the Chernobyl accident consequences is studied. The main causes of the loss of capacity for work are blood circulation diseases (41.9%), neoplasms (19.9%), diseases of the nervous system and sense organs (9.7%), mental disorders (5.9%) and endocrine diseases (5.5%). The distribution of invalids across the different regions and age groups according to disease form is analysed. The average duration of the diseases resulting in primary disability is about 2.8 years; on average, the illnesses began within 3.1 years. 6 refs

  3. Vortex ring state by full-field actuator disc model

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, J.N.; Shen, W.Z.; Munduate, X. [DTU, Dept. of Energy Engineering, Lyngby (Denmark)

    1997-08-01

    One-dimensional momentum theory provides a simple analytical tool for analysing the gross flow behavior of lifting propellers and rotors. Combined with a blade-element strip-theory approach, it has for many years been the most popular model for load and performance predictions of wind turbines. The model works well at moderate and high wind velocities, but is not reliable at small wind velocities, where the expansion of the wake is large and the flow field behind the rotor is dominated by turbulent mixing. This is normally referred to as the turbulent wake state or the vortex ring state. In the vortex ring state, momentum theory predicts a decrease of thrust whereas the opposite is found from experiments. The reason for the disagreement is that recirculation takes place behind the rotor, with the consequence that the stream tubes past the rotor become effectively choked. This represents a condition at which streamlines no longer carry fluid elements from far upstream to far downstream, hence one-dimensional momentum theory is invalid and empirical corrections have to be introduced. More sophisticated analytical or semi-analytical rotor models have been used to describe stationary flow fields for heavily loaded propellers. In recent years generalized actuator disc models have been developed, but up to now no detailed computations of the turbulent wake state or the vortex ring state have been performed. In the present work the phenomenon is simulated by direct simulation of the Navier-Stokes equations, where the influence of the rotor on the flow field is modelled simply by replacing the blades by an actuator disc with a constant normal load. (EG) 13 refs.

  4. Have the consequences of reactor accidents for the population been well assessed? Six questions to the experts in the field

    Energy Technology Data Exchange (ETDEWEB)

    Pohl, Peter

    2016-07-15

    Six questions to the experts in the field are posed: (1) Why is the assessment of accident consequences not separated into long-term and peak exposure? (2) Why is the exposure due to I-131 seen as critical mainly in regard to the thyroid? (3) Do you have any reliable relations of health risk versus peak exposure? (4) Why do you not abolish the LNT assumption and replace it with a threshold model? (5) Why do you include indirect, psycho-somatic effects in assessing the consequences of reactor accidents when this is not customary with accidents with often more casualties? (6) How can the number of Chernobyl-assigned thyroid cancers have risen from about 600 to some 4,000 today, when the latency period is in the range of 4 to 5 years?

  5. Estimation of lower-bound KJc on pressure vessel steels from invalid data

    International Nuclear Information System (INIS)

    McCable, D.E.; Merkle, J.G.

    1996-01-01

    Statistical methods are currently being introduced into the transition temperature characterization of ferritic steels. The objective is to replace imprecise correlations between empirical impact test methods and universal K Ic or K Ia lower-bound curves with the direct use of material-specific fracture mechanics data. This paper introduces a computational procedure that couples order statistics, weakest-link statistical theory, and a constraint model to arrive at estimates of lower-bound K Jc values. All of the above concepts have been used before to meet various objectives. In the present case, the scheme is to make a best estimate of the lower-bound fracture toughness when the available K Jc data are too few for conventional statistical analyses. The utility of the procedure is greatest in the middle-to-high toughness part of the transition range, where loss of specimen constraint and elevated lower-bound toughness interfere with conventional statistical analysis methods.

  6. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    Science.gov (United States)

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial distribution and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model after correction for overdispersion and the standard negative binomial distribution model provided a better description of the probability distribution of seven out of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate the normal distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
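The information-criterion comparison can be sketched as follows, on synthetic overdispersed counts rather than the insect datasets; the Poisson model is fitted by maximum likelihood and the negative binomial by the method of moments for simplicity:

```python
import numpy as np
from scipy import stats

# Synthetic overdispersed counts (variance >> mean), standing in for field data.
rng = np.random.default_rng(1)
counts = rng.negative_binomial(n=2, p=0.2, size=500)

m, v = counts.mean(), counts.var()

# Poisson MLE: lambda = sample mean (1 free parameter).
ll_pois = stats.poisson.logpmf(counts, m).sum()
aic_pois = 2 * 1 - 2 * ll_pois

# Negative binomial by method of moments (2 free parameters):
# mean = r(1-p)/p and var = mean/p give r = m^2/(v-m), p = m/v.
r = m * m / (v - m)
p = m / v
ll_nb = stats.nbinom.logpmf(counts, r, p).sum()
aic_nb = 2 * 2 - 2 * ll_nb

print(aic_pois, aic_nb)
```

On data like these, the negative binomial attains the lower AIC despite its extra parameter, which is the kind of objective selection criterion the paper recommends.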

  7. Water Quality Assessment of DoD Installations/Facilities in the Chesapeake Bay Region. Phase 3. Volume 2. Overall Approach, Findings and Recommendations.

    Science.gov (United States)

    1987-11-01

    Where no NPDES permit is required, NRL must submit compliance reports to EPA. Current compliance status is unknown. d. Radioactive materials … and treats large quantities of radioactive material. e. HDL personnel have reported an occasional oily sheen at the stormwater outfall weir of Paint

  8. Limiting values for radioactive materials in food

    International Nuclear Information System (INIS)

    Steiner, Martin

    2014-01-01

    The contribution describes the fundamentals of radiation protection: the LNT (linear, no threshold) hypothesis, ALARA (as low as reasonably achievable), and limiting values. Using the example of the Chernobyl nuclear accident, the different contamination developments in different foodstuffs in Germany are demonstrated, including recommended limiting values and the radiation exposure after 30 years due to consumption of contaminated food. The natural radioactivity is about 0.3 mSv/year.

  9. Characterization and Modeling of High Power Microwave Effects in CMOS Microelectronics

    Science.gov (United States)

    2010-01-01

    Any voltage above the line marked VIH is considered a valid logic high on the input of the gate. VIH and VIL are defined … can handle any voltage noise level at the input up to VIL without changing state. The region in between VIL and VIH is considered an invalid logic level. Table 2.2: Intrinsic device characteristics derived from SPECTRE simulations: VIH (V), VIL (V), High Noise Margin (V), Low Noise Margin (V).

  10. World high background natural radiation areas: Need to protect public from radiation exposure

    International Nuclear Information System (INIS)

    Sohrabi, Mehdi

    2013-01-01

    Highlights of findings on radiological measurements, radiobiological and epidemiological studies in some main world high background natural radiation (HBNR) areas such as in Brazil, China, India and Iran are presented and discussed with special regard to remediation of radiation exposure of inhabitants in such areas. The current radiation protection philosophy and recommendations applied to workers and public from operation of radiation and nuclear applications are based on the linear non-threshold (LNT) model. The inhabitants of HBNR and radon prone areas receive relatively high radiation doses. Therefore, according to the LNT concept, the inhabitants in HBNR areas and in particular those in Ramsar are considered at risk and their exposure should be regulated. The HBNR areas in the world have different conditions in terms of dose and population. In particular, the inhabitants in HBNR areas of Ramsar receive very high internal and external exposures. This author believes that the public in such areas should be protected and proposes a plan to remedy high exposure of the inhabitants of the HBNR areas of Ramsar, while maintaining these areas as they stand to establish a national environmental radioactivity park which can be provisionally called “Ramsar Research Natural Radioactivity Park” (RRNRP). The major HBNR areas, the public exposure and the need to remedy exposures of inhabitants are reviewed and discussed. - Highlights: ► Highlights of findings on studies in HBNR areas are reviewed and discussed. ► The need to protect HBNR area inhabitants and remedy public exposure is emphasized. ► A collective approach is proposed to remedy exposure of Ramsar HBNR area inhabitants. ► Relocation of HBNR area inhabitants and establishing a park at the location is proposed. ► The advantages and disadvantages of the methods are discussed and recommendations are made

  11. A parsimonious approach to modeling animal movement data.

    Directory of Open Access Journals (Sweden)

    Yann Tremblay

    Full Text Available Animal tracking is a growing field in ecology and previous work has shown that simple speed filtering of tracking data is not sufficient and that improvement of tracking location estimates is possible. To date, this has required methods that are complicated and often time-consuming (state-space models), resulting in limited application of this technique and the potential for analysis errors due to poor understanding of the fundamental framework behind the approach. We describe and test an alternative and intuitive approach consisting of bootstrapping random walks biased by forward particles. The model uses recorded data accuracy estimates, and can assimilate other sources of data such as sea-surface temperature, bathymetry and/or physical boundaries. We tested our model using ARGOS and geolocation tracks of elephant seals that also carried GPS tags in addition to PTTs, enabling true validation. Among pinnipeds, elephant seals are extreme divers that spend little time at the surface, which considerably impacts the quality of both ARGOS and light-based geolocation tracks. Despite such low overall track quality, our model provided location estimates within 4.0, 5.5 and 12.0 km of the true location 50% of the time, and within 9, 10.5 and 20.0 km 90% of the time, for above average, average or below average elephant seal ARGOS track qualities, respectively. With geolocation data, 50% of errors were less than 104.8 km (<0.94 degrees), and 90% were less than 199.8 km (<1.80 degrees). Larger errors were due to lack of sea-surface temperature gradients. In addition we show that our model is flexible enough to solve the obstacle avoidance problem by assimilating high resolution coastline data. This reduced the number of invalid on-land locations by almost an order of magnitude. The method is intuitive, flexible and efficient, promising extensive utilization in future research.
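For contrast with the paper's bootstrap approach, the simple speed filtering it improves upon can be sketched in a few lines; planar coordinates and the 10 km/h threshold below are invented for illustration, and a real track would need great-circle distances:

```python
import math

def speed_filter(times, xs, ys, vmax):
    """Keep fixes whose implied speed from the last kept fix is <= vmax.

    times in hours, xs/ys planar coordinates in km (illustrative units only).
    Returns the indices of the retained fixes.
    """
    keep = [0]                                   # always trust the first fix
    for i in range(1, len(times)):
        j = keep[-1]
        dt = times[i] - times[j]
        dist = math.hypot(xs[i] - xs[j], ys[i] - ys[j])
        if dt > 0 and dist / dt <= vmax:
            keep.append(i)
    return keep

# One physically impossible fix (index 2) gets dropped at vmax = 10 km/h:
t = [0, 1, 2, 3]
x = [0.0, 5.0, 80.0, 15.0]
y = [0.0, 0.0, 0.0, 0.0]
print(speed_filter(t, x, y, vmax=10))  # [0, 1, 3]
```

Such a filter only rejects points; unlike the particle-based model in the abstract, it cannot improve the retained location estimates or exploit auxiliary data.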

  12. Inter-rater reliability of healthcare professional skills' portfolio assessments: The Andalusian Agency for Healthcare Quality model

    Directory of Open Access Journals (Sweden)

    Antonio Almuedo-Paz

    2014-07-01

    Full Text Available This study aims to determine the reliability of the assessment criteria used for a portfolio at the Andalusian Agency for Healthcare Quality (ACSA). Data: all competences certification processes, regardless of discipline. Period: 2010-2011. Three types of tests are used: 368 certificates, 17,895 reports and 22,642 clinical practice reports (N = 3,010 candidates). The tests were evaluated in pairs by the ACSA team of raters using two categories: valid and invalid. Results: The percentage agreement in assessments of certificates was 89.9%, while for reports it was 85.1% and for clinical practice reports 81.7%. The inter-rater agreement coefficients (kappa) ranged from 0.468 to 0.711. Discussion: The results of this study show that the inter-rater reliability of the assessments varies from fair to good. Compared with other similar studies, the results put the reliability of the model in a comfortable position. Among the improvements incorporated, the progressive automation of evaluations must be highlighted.
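The kappa statistic reported above can be computed directly from paired valid/invalid ratings; a minimal sketch with invented example ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same items."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[c] * cb[c] for c in ca) / (n * n)            # chance agreement
    return (po - pe) / (1 - pe)

# Invented ratings: 75% raw agreement corrects down to kappa = 0.5.
a = ["valid", "valid", "valid", "invalid"]
b = ["valid", "valid", "invalid", "invalid"]
print(cohens_kappa(a, b))  # 0.5
```

This correction for chance agreement is why the paper's kappa values (0.468 to 0.711) sit well below its raw percentage agreements (around 82% to 90%).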

  13. Mental retirement and schooling

    DEFF Research Database (Denmark)

    Bingley, Paul; Martinello, Alessandro

    2013-01-01

    , which affect cognitive functioning at old ages, they are invalid as instruments without controlling for schooling. We show by means of simulation and a replication study that unless the model incorporates schooling, the estimated effect of retirement is negatively biased. This explains a large part...... of the “mental retirement” effects which have recently been found...

  14. Targeted and non-targeted effects of ionizing radiation

    Directory of Open Access Journals (Sweden)

    Omar Desouky

    2015-04-01

    Full Text Available For a long time it was generally accepted that effects of ionizing radiation such as cell death, chromosomal aberrations, DNA damage, mutagenesis, and carcinogenesis result from direct ionization of cell structures, particularly DNA, or from indirect damage through reactive oxygen species produced by radiolysis of water, and these biological effects were attributed to irreparable or misrepaired DNA damage in cells directly hit by radiation. Using the linear non-threshold (LNT) model, possible risks from exposure to low dose ionizing radiation (below 100 mSv) are estimated by extrapolating from data obtained after exposure to higher doses of radiation. This model has been challenged by numerous observations, in which cells that were not directly traversed by the ionizing radiation exhibited responses similar to those of the directly irradiated cells. Therefore, it is nowadays accepted that the detrimental effects of ionizing radiation are not restricted to the irradiated cells, but extend to non-irradiated bystander or even distant cells, manifesting various biological effects.

  15. The thermodynamic stability induced by solute co-segregation in nanocrystalline ternary alloys

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Tao; Chen, Zheng; Zhang, Jinyong; Zhang, Ping [China Univ. of Mining and Technology, Xuzhou (China). School of Material Science and Engineering; Yang, Xiaoqin [China Univ. of Mining and Technology, Xuzhou (China). School of Chemical Engineering and Technology]

    2017-06-15

    The grain growth and thermodynamic stability induced by solute co-segregation in ternary alloys are presented. The grain growth behavior of single-phase supersaturated grains prepared in Ni-Fe-Pb alloy melt at different undercoolings was investigated by performing isothermal annealings at T = 400 °C-800 °C. Combining the multicomponent Gibbs adsorption equation and Guttmann's grain boundary segregation model, an empirical relation for isothermal grain growth was derived. By applying the model to grain growth in Ni-Fe-Pb, Fe-Cr-Zr and Fe-Ni-Zr alloys, it was predicted that driving the grain boundary energy to zero is possible in these alloys due to the co-segregation induced by the interactive effect between the solute pairs Fe/Pb, Zr/Ni and Zr/Cr. A non-linear, rather than a simple linear, relationship between 1/D* (D* being the metastable equilibrium grain size) and ln(T) was predicted due to this interactive effect.

  16. On the "Size" of Einstein's Universe

    Directory of Open Access Journals (Sweden)

    Crothers S. J.

    2007-10-01

    It is alleged by the Standard Cosmological Model that Einstein’s Universe is finite but unbounded. Although this is a longstanding and widespread allegation, it is nonetheless incorrect. It is also alleged by this Model that the Universe is expanding and that it began with a Big Bang. These are also longstanding and widespread claims that are demonstrably false. The FRW models for an expanding, finite, unbounded Universe are inconsistent with General Relativity and are therefore invalid.

  17. Pooled Bayesian analysis of 28 studies on radon induced lung cancers

    International Nuclear Information System (INIS)

    Fornalski, K.W.; Dobrzyński, L.

    2010-01-01

    The influence of ionizing radiation from radon-222 and its daughters on lung cancer incidence and mortality published in 28 papers was reanalyzed for two ranges of low annual radiation dose: below 70 mSv per year (391 Bq m⁻³) and below 150 mSv per year (838 Bq m⁻³). Seven popular models of the dose-effect relationship were tested. Assumption-free Bayesian statistical methods were used for all curve fittings, and a model selection algorithm was used to assess the relative probability of the seven models. The results of the analysis demonstrate that in these dose ranges (below 70 and 150 mSv/year) the published data do not show a risk of lung cancer induction. The most probable dose-effect relationship is a constant one (risk ratio, RR = 1). The statistical analysis shows no basis for an increased risk of lung cancer in the low-dose region. This final conclusion follows from the fact that the model assuming no dependence of lung cancer induction on radiation dose is at least 100 times more likely than the six other models tested, including the linear no-threshold (LNT) model.
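The model-selection step described above can be sketched with a far simpler stand-in: compare a constant model (RR = 1) against an LNT model (RR = 1 + b·dose) by BIC on synthetic risk-ratio data. The paper uses full Bayesian curve fitting; BIC and the data below are illustrative assumptions only.

```python
import math

# Hypothetical annual doses (mSv) and risk ratios scattered around 1,
# not the pooled study data.
doses = [10, 20, 30, 40, 50, 60, 70]
rr    = [1.02, 0.99, 1.01, 0.98, 1.00, 1.01, 0.99]
n = len(doses)

# Constant model: RR fixed at 1, zero free parameters.
rss_const = sum((y - 1.0) ** 2 for y in rr)

# LNT model: RR = 1 + b*dose; slope by least squares through the point (0, 1).
b = sum(d * (y - 1.0) for d, y in zip(doses, rr)) / sum(d * d for d in doses)
rss_lnt = sum((y - 1.0 - b * d) ** 2 for d, y in zip(doses, rr))

# BIC = n*ln(RSS/n) + k*ln(n), penalizing the LNT model's extra parameter.
bic_const = n * math.log(rss_const / n)
bic_lnt   = n * math.log(rss_lnt / n) + math.log(n)
print("constant model preferred:", bic_const < bic_lnt)
```

With flat data the tiny improvement in fit from the slope does not pay for the extra parameter, which is the same logic (in cruder form) behind the paper's "at least 100 times more likely" statement.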

  18. Persistence of information on the web: Analyzing citations contained in research articles

    DEFF Research Database (Denmark)

    Lawrance, S.; Coetzee, F.; Flake, G.

    2000-01-01

    that a significant percentage of URLs are now invalid, ranging from 23% for 1999 articles, to 53% for 1994. We also found that for almost all of the invalid URLs, it was possible to locate the information (or highly related information) in an alternate location, primarily with the use of search engines. However...

  19. Total Risk Management for Low Dose Radiation Exposures

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Sterc, D.

    2012-01-01

    Our civilization has witnessed about a century of the nuclear age, mixed with enormous promises and cataclysmic threats. Nuclear energy seems to encapsulate the potential for both pure good and evil, or at least we humans are able to perceive it that way. These images are continuously with us, and they both help and distract us from making the best of nuclear potential for civilization. Today, with nuclear use significantly present and with huge potential to further improve our lives through energy and medical applications, it is enormously important to take a calm, rational and objective view of the potential risks and certain benefits. Because all uses of nuclear energy have shown that their immediate risks are negligible (i.e., Three Mile Island and Fukushima) or much smaller than those of the alternatives (i.e., Chernobyl), the most important issue seems to be the amount of risk from the long-term effects on people exposed to small doses of radiation. A similar issue is present in the increasing use of modern computed tomography and other radiation sources in medicine for examination and therapy. Finally, extreme natural exposures are a third such potential risk source. The definition of low doses varies depending on the mode of delivery (i.e., single, multiple or continuous exposures); for this paper, a dose of 100 mSv is selected as the yearly upper bound. There are three very different, scientifically supported views on the potential risks from low-dose exposure. The most conservative theory is that all radiation is harmful, and that even small increments above background levels (i.e., 2-3 mSv) present additional risk. This view is called the linear no-threshold theory (LNT), and it is accepted as a conservative, simple regulatory approach which guarantees safety. The risk is derived from extrapolation of the measured effects of high levels of radiation. The theory opposite to LNT is hormesis, which assumes that small doses of radiation are in fact helpful and that they are improving our

  20. The Teddy Bear Clinic Constitutional Court case: Sexual conduct ...

    African Journals Online (AJOL)

    The Constitutional Court in the Teddy Bear Clinic appeal case held that the sections of the Sexual Offences Act that impose criminal liability for sexual offences on adolescent children under 16 years of age are invalid. The invalidity was suspended for 18 months to allow Parliament to correct the Act's defects. A moratorium ...

  1. The practice of quality-associated costing: application to transfusion manufacturing processes.

    Science.gov (United States)

    Trenchard, P M; Dixon, R

    1997-01-01

    This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.

  2. Trpm4 gene invalidation leads to cardiac hypertrophy and electrophysiological alterations.

    Directory of Open Access Journals (Sweden)

    Marie Demion

    RATIONALE: TRPM4 is a non-selective Ca2+-activated cation channel expressed in the heart, particularly in the atria and conduction tissue. Mutations in the Trpm4 gene were recently associated with several human conduction disorders such as Brugada syndrome. The TRPM4 channel has also been implicated at the ventricular level, in inotropism or in arrhythmia genesis due to stresses such as ß-adrenergic stimulation, ischemia-reperfusion, and hypoxia re-oxygenation. However, the physiological role of the TRPM4 channel in the healthy heart remains unclear. OBJECTIVES: We aimed to investigate the role of the TRPM4 channel in whole cardiac function with a Trpm4 gene knock-out mouse (Trpm4-/-) model. METHODS AND RESULTS: Morpho-functional analysis revealed left ventricular (LV) eccentric hypertrophy in Trpm4-/- mice, with an increase in both wall thickness and chamber size in the adult mouse (aged 32 weeks) when compared to Trpm4+/+ littermate controls. Immunofluorescence on frozen heart cryosections and qPCR analysis showed no fibrosis or cellular hypertrophy. Instead, cardiomyocytes in Trpm4-/- mice were smaller than in Trpm4+/+, with a higher density. Immunofluorescent labeling for phospho-histone H3, a mitosis marker, showed that the number of mitotic myocytes was increased 3-fold at the Trpm4-/- neonatal stage, suggesting hyperplasia. Adult Trpm4-/- mice presented multilevel conduction blocks, as attested by PR and QRS lengthening in surface ECGs and confirmed by intracardiac exploration. Trpm4-/- mice also exhibited Luciani-Wenckebach atrioventricular blocks, which were reduced following atropine infusion, suggesting paroxysmal parasympathetic overdrive. In addition, Trpm4-/- mice exhibited shorter action potentials in atrial cells. This shortening was unrelated to modifications of the voltage-gated Ca2+ or K+ currents involved in the repolarizing phase. CONCLUSIONS: TRPM4 has pleiotropic roles in the heart, including the regulation of conduction and cellular

  3. A model-based design and validation approach with OMEGA-UML and the IF toolset

    Science.gov (United States)

    Ben-hafaiedh, Imene; Constant, Olivier; Graf, Susanne; Robbana, Riadh

    2009-03-01

    Intelligent embedded systems such as autonomous robots and other industrial systems are becoming increasingly heterogeneous with respect to the platforms on which they are implemented, and their software architectures are thus more complex to design and analyse. In this context, it is important to have well-defined design methodologies, supported by (1) high-level design concepts for mastering design complexity, (2) concepts for expressing non-functional requirements, and (3) analysis tools for verifying, or invalidating, that the system under development will be able to conform to its requirements. We illustrate such an approach for the design of complex embedded systems using a small case study as a running example. We briefly present the important concepts of the OMEGA-RT UML profile, show how we use this profile in a modelling approach, and explain how these concepts are used in the IFx verification toolbox to integrate validation into the design flow and make scalable verification possible.

  4. Real-time emergency forecasting technique for situation management systems

    Science.gov (United States)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by an extended Brown's method, which applies fractal dimension to forecast the short time series received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering out invalid sensed data using methods of correlation analysis.
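Brown's method is the classical double (linear) exponential smoothing forecaster. A baseline sketch is below; the article's improved variant additionally uses a fractal-dimension estimate, which is not specified here, and the example series and smoothing constant are illustrative.

```python
def brown_forecast(series, alpha, horizon):
    """Brown's double exponential smoothing: smooth twice with the same alpha,
    then extrapolate the implied level and trend `horizon` steps ahead."""
    s1 = s2 = series[0]
    for x in series:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend * horizon

# A short, linearly growing "sensor" series: the one-step forecast should
# continue the trend, landing close to 11.
print(brown_forecast([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], alpha=0.5, horizon=1))
```

The appeal for emergency forecasting is that the method needs only a handful of recent samples and constant memory, which suits short time series streamed from sensors.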

  5. Universal time-dependence of the mean-square displacement in extremely rugged energy landscapes with equal minima

    DEFF Research Database (Denmark)

    Dyre, Jeppe; Jacobsen, Jacob M.

    1995-01-01

    This paper presents a calculation of the time dependence of the mean-square displacement for symmetric random energy barrier hopping models at low temperatures, where the frequency dependence of the normalized diffusion constant D̃ becomes universal, i.e., independent of the energy barrier...... probability distribution [J. C. Dyre, Phys. Rev. B 49, 11 709 (1994)]. The universal time dependence of the mean-square displacement is calculated from the effective medium approximation (EMA) universality equation, D̃ ln D̃ = s̃, where s̃ is the dimensionless imaginary frequency, as well...... as for the approximation to the EMA universality equation, D̃ ≈ s̃/ln(1 + s̃). At long times the universal mean-square displacement is linear in time, corresponding to ordinary diffusion, whereas the mean-square displacement at short times t, in dimensionless units, varies as 2/ln(1/t)....
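The EMA universality equation D̃ ln D̃ = s̃ quoted in the record is transcendental but easy to solve numerically; a sketch with Newton's method, printed next to the quoted closed-form approximation s̃/ln(1 + s̃), is below. The sample s̃ values are arbitrary.

```python
import math

def d_tilde(s):
    """Solve the EMA universality equation D*ln(D) = s for D, with s > 0.

    The solution satisfies D >= 1, and starting Newton's method from
    D = 1 + s keeps every iterate to the right of the root.
    """
    d = 1.0 + s
    for _ in range(60):
        f = d * math.log(d) - s
        if abs(f) < 1e-12:
            break
        d -= f / (math.log(d) + 1.0)  # f'(d) = ln(d) + 1
    return d

for s in (0.1, 1.0, 10.0):
    exact = d_tilde(s)
    approx = s / math.log(1.0 + s)  # the approximation quoted in the record
    print(s, exact, approx)
```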

  6. Luminescence emission in NaCl:Cu X-irradiated at LNT and RT

    International Nuclear Information System (INIS)

    Herreros, J.M.; Jaque, F.

    1979-01-01

    The thermoluminescence (TL) and photostimulated thermoluminescence (PTL) in NaCl:Cu have been studied in the temperature range 77-400 K. For the five peaks found, the order of the recombination kinetics, the pre-exponential factor, the activation energy and the role of F and F' centres have been analyzed. (Auth.)

  7. Analytic treatment of nuclear spin-lattice relaxation for diffusion in a cone model

    Science.gov (United States)

    Sitnitsky, A. E.

    2011-12-01

    We consider the nuclear spin-lattice relaxation rate resulting from a diffusion equation for rotational wobbling in a cone. We show that the widespread view that there are no analytical expressions for the correlation functions of the wobbling-in-a-cone model is invalid, and prove that nuclear spin-lattice relaxation in this model is exactly tractable and amenable to a full analytical description. The mechanism of relaxation is assumed to be the dipole-dipole interaction of nuclear spins and is treated within the framework of the standard Bloembergen, Purcell, Pound (BPP)-Solomon scheme. We consider the general case of arbitrary orientation of the cone axis relative to the magnetic field. The BPP-Solomon scheme is shown to remain valid for systems in which the distribution of cone axes depends only on the tilt relative to the magnetic field but is otherwise isotropic. We consider the case of random isotropic orientation of the cone axes relative to the magnetic field, as occurs in powders. We also consider the cases of predominant orientation along or opposite to the magnetic field, and of predominant orientation transverse to the magnetic field, which may be relevant for, e.g., liquid crystals. In addition, we treat in detail the model case of the cone axis directed along the magnetic field. The latter provides a direct comparison of the limiting case of our formulas with the textbook formulas for free isotropic rotational diffusion. The dependence of the spin-lattice relaxation rate on the cone half-width yields results similar to those predicted by the model-free approach.
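The BPP scheme the abstract builds on relates 1/T1 to Lorentzian spectral densities evaluated at the Larmor frequency and its double. A sketch of that standard shape is below; the dipolar prefactor is absorbed into an arbitrary constant, and the 100 MHz frequency is only an example, so the rates are relative, not absolute.

```python
import math

def bpp_rate(tau_c, omega0, k=1.0):
    """Relative BPP spin-lattice relaxation rate, 1/T1 ~ J(w0) + 4*J(2*w0),
    with the Lorentzian spectral density J(w) = tau_c / (1 + (w*tau_c)**2).
    Dipolar coupling prefactors are absorbed into k (conventions vary)."""
    j = lambda w: tau_c / (1.0 + (w * tau_c) ** 2)
    return k * (j(omega0) + 4.0 * j(2.0 * omega0))

omega0 = 2 * math.pi * 100e6  # e.g. a 100 MHz Larmor frequency
# The rate passes through a maximum near omega0 * tau_c ~ 1 (the T1 minimum),
# falling off in both the fast- and slow-motion limits.
for tau in (1e-12, 1.0 / omega0, 1e-6):
    print(tau, bpp_rate(tau, omega0))
```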

  8. Importance and pitfalls of molecular analysis to parasite epidemiology.

    Science.gov (United States)

    Constantine, Clare C

    2003-08-01

    Molecular tools are increasingly being used to address questions about parasite epidemiology. Parasites represent a diverse group and they might not fit traditional population genetic models. Testing hypotheses depends equally on correct sampling, appropriate tool and/or marker choice, appropriate analysis and careful interpretation. All methods of analysis make assumptions which, if violated, make the results invalid. Some guidelines to avoid common pitfalls are offered here.

  9. The Application of Cyber Physical System for Thermal Power Plants: Data-Driven Modeling

    Directory of Open Access Journals (Sweden)

    Yongping Yang

    2018-03-01

    Optimal operation of energy systems plays an important role in enhancing their lifetime security and efficiency. The determination of optimal operating strategies requires intelligent utilization of the massive data accumulated during operation or prediction. Investigating these data alone, without combining them with physical models, runs the risk that the established input-output relationships, i.e. the models which reproduce the behavior of the considered system/component over a wide range of boundary conditions, are invalid for boundary conditions that never occur in the database employed. Therefore, combining big data with physical models via cyber physical systems (CPS) is of great importance for deriving highly reliable and accurate models, and is becoming more and more popular in practical applications. In this paper, we focus on the description of a systematic method for applying CPS to the performance analysis and decision making of thermal power plants. We propose a general CPS procedure with both offline and online phases for application to thermal power plants, and discuss the methods employed to support each sub-procedure. As an example, a data-driven model of the turbine island of an existing air-cooling based thermal power plant is established with the proposed procedure, demonstrating its practicality, validity and flexibility. To establish this model, historical operating data are employed in the cyber layer for modeling and linking each physical component. The decision-making procedure for the optimal frequency of the air-cooled condenser is also illustrated to show the model's applicability for online use. It is concluded that a cyber physical system with data mining techniques is effective and promising for facilitating the real-time analysis and control of thermal power plants.

  10. Influence of ionizing radiation on human body

    Directory of Open Access Journals (Sweden)

    Zygmunt Zdrojewicz

    2016-06-01

    This article describes positive and negative aspects of ionizing radiation and its effects on the human body. As part of various medical procedures, ionizing radiation has become an important subject for both medical practitioners and patients. Commonly used in treatment, diagnostics and interventional radiology, its medical usage follows numerous rules designed to reduce excessive exposure. Its widespread use makes it extremely important to research and confirm the effects of various doses of radiation on patients of all ages. Two scientific theories explaining radiation effects on the human organism stand in contrast: the commonly accepted LNT hypothesis and the yet-to-be-proven hormesis theory. Although current radiation protection standards are based on the linear theory (the LNT hypothesis), the hormesis theory arouses more and more interest, and numerous attempts are being made to prove its validity. Further research expanding our knowledge of radiation hormesis could change the face of the future. Perhaps such research will open up new possibilities for the use of ionizing radiation, and enable the calculation of an optimal, personalised radiation dose for each patient, allowing us to find a new “golden mean”. The authors are therefore cautious: they believe these methods have a promising future, but the patient's good should above all be kept in mind.

  11. Understanding lack of understanding : Invalidation in rheumatic diseases

    NARCIS (Netherlands)

    Kool, M.B.

    2012-01-01

    The quality of life of patients with chronic rheumatic diseases is negatively influenced by symptoms such as pain, fatigue, and stiffness, and secondary symptoms such as physical limitations and depressive mood. On top of this burden, some patients experience negative responses from others, such as

  12. Spatial Interpretation of Tower, Chamber and Modelled Terrestrial Fluxes in a Tropical Forest Plantation

    Science.gov (United States)

    Whidden, E.; Roulet, N.

    2003-04-01

    Interpretation of a site-average terrestrial flux may be complicated in the presence of inhomogeneities. Inhomogeneity may invalidate the basic assumptions of aerodynamic flux measurement; chamber measurements may miss or misinterpret important temporal or spatial anomalies; and models may smooth over important nonlinearities depending on the scale of application. Although inhomogeneity is usually seen as a design problem, many sites have spatial variance that may have a large impact on net flux, and in many cases a large homogeneous surface is unrealistic. The sensitivity and validity of a site-average flux are investigated for an inhomogeneous site. Directional differences are used to evaluate the validity of aerodynamic methods and the computation of a site-average tower flux. Empirical and modelling methods are used to interpret the spatial controls on flux. An ecosystem model, Ecosys, is used to assess spatial length scales appropriate to the ecophysiological controls. A diffusion model is used to compare tower, chamber, and model data by spatially weighting contributions within the tower footprint. Diffusion-model weighting is also used to improve tower flux estimates by producing footprint-averaged ecological parameters (soil moisture, soil temperature, etc.). Although uncertainty remains in the validity of the measurement methods and the accuracy of diffusion models, a detailed spatial interpretation is required at an inhomogeneous site. Flux estimation between methods improves with spatial interpretation, showing its importance for estimating a site-average flux. Small-scale temporal and spatial anomalies may be relatively unimportant to the overall flux, but accounting for medium-scale differences in ecophysiological controls is necessary. A combination of measurements and modelling can be used to define the appropriate time and length scales of significant non-linearity due to inhomogeneity.

  13. Comportamiento productivo y reproductivo del ganado holstein rojo, holstein negro y pardo suizo en Palmira, Valle del Cauca

    Directory of Open Access Journals (Sweden)

    Zapata Oscar

    1989-06-01

    Three breeds of dairy cattle (26 Red Holstein RH, 97 Black Holstein BH and 29 Brown Swiss BS) were evaluated for reproductive efficiency and milk production for the years 1979-1987 at the Instituto Colombiano Agropecuario, Palmira. Days of milking, milk production and fat production were, for BH, 324.0 days, 2545.9 kg and 91.1 kg; for RH, 300.0 days, 2243.7 kg and 81.0 kg; and for BS, 298.2 days, 1886.6 kg and 66.9 kg. Average fat content was 3.6% in RH and 3.7% in BH and BS. The lactation curves for the three breeds were best adjusted with the model Y = A + B(ln t) + C(ln t)². Greater persistency was observed in BS, followed by RH. BS presented better reproductive efficiency than RH and BH, with means of 169.0, 177.4 and 195.6 days from calving to conception and 2.0, 2.2 and 2.2 services per conception. The RH breed had a calving age of 3.4 years and a calving weight of 470.5 kg, compared with 3.5 years and 440.0 kg for BH and 3.8 years and 458.1 kg for BS. For average birth weight (males and females together), BS showed the greatest weight (37.0 kg), while BH and RH were similar (36.0 and 36.2 kg). The incidence of calving problems was highest in RH (10.9%) and of puerperium problems in BS (21.6%). No significant effect of calving weight or rainfall on milk yield or the calving-to-conception interval was found, and in most cases there was no correlation between milk yield and the calving-to-conception interval. There was a very close relation between the calving-to-conception interval and the number of services per conception.
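The lactation model Y = A + B(ln t) + C(ln t)² is linear in its coefficients, so it can be fitted by ordinary least squares on the basis {1, ln t, (ln t)²}. A self-contained sketch is below; the days-in-milk and yield values are synthetic, generated from assumed coefficients rather than taken from the study.

```python
import math

def fit_lactation(days, yields):
    """Least-squares fit of Y = A + B*ln(t) + C*(ln t)^2 via normal equations."""
    rows = [(1.0, math.log(t), math.log(t) ** 2) for t in days]
    # Normal equations M p = v with M = X^T X and v = X^T y
    m = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * y for r, y in zip(rows, yields)) for i in range(3)]

    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

    d = det(m)
    params = []
    for i in range(3):  # Cramer's rule, adequate for a fixed 3x3 system
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        params.append(det(mi) / d)
    return params  # [A, B, C]

# Synthetic check: data generated from assumed coefficients are recovered.
days = [7, 30, 60, 90, 150, 210, 270, 305]
yields = [10.0 + 4.0 * math.log(t) - 0.5 * math.log(t) ** 2 for t in days]
print(fit_lactation(days, yields))
```

Because the model is a quadratic in u = ln t, the fitted curve rises quickly, peaks, and declines slowly, which is the persistency pattern the study compares across breeds.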

  14. Comparative risk assessment: an element for a more rational and ethical approach to radiation risk

    International Nuclear Information System (INIS)

    Danesi, P.R.

    2006-01-01

    Peaceful nuclear technologies are still perceived by a large fraction of the general public, the media, and some decision makers as more risky than many other 'conventional' technologies. One approach that can help bring more rationality and ethics into the picture is to present the risk associated with radiation and nuclear technologies within the frame of correctly conducted comparative risk assessments. However, comparing different risks is not so straightforward, because quantifying different risks on a comparable scale requires overcoming both conceptual and practical difficulties. Risk (R) can be expressed as the product of the probability (P) that a given undesired event will occur and the consequences of this event (C), i.e. R = P x C. Although in principle risks could be compared by simply ranking them according to their values of R, this simplistic approach is not always possible, because to correctly compare risks all factors, circumstances and assumptions should be mutually equivalent and quantified, and the (often large) uncertainties must be taken into proper account. In the case of radiation risk, the ICRP has assumed the LNT model to be valid (a probability coefficient of 5% per sievert for attributable death from cancer) and selected the present equivalent dose limit of 1 mSv per year for public exposure. This dose corresponds to 50 lethal cancers per 1 million people exposed and is approximately equivalent (in terms of probability of death) to the risk of bicycling for 600 km, driving for 3200 km, crossing a busy road twice a day for one year, smoking 2.5 packets of cigarettes, or being X-rayed once for kidney metabolism. However, according to many scientists, on the basis of both epidemiological and biological results and considerations, the actual risk is far lower than that predicted by the LNT model. Nevertheless, the policies and myths that were created about half a century ago still persist and have led the general
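The record's arithmetic under R = P x C can be checked directly: the ICRP nominal coefficient of 5% attributable cancer deaths per sievert, applied to the 1 mSv/year public dose limit, gives the quoted 50 lethal cancers per million people exposed.

```python
# R = P x C under the LNT assumption, using the figures quoted in the record.
risk_per_sv = 0.05       # 5% attributable cancer deaths per sievert (ICRP)
dose_sv = 0.001          # the 1 mSv/year public equivalent dose limit
population = 1_000_000

# Expected deaths scale linearly with dose under LNT: ~50 per million exposed.
expected_deaths = risk_per_sv * dose_sv * population
print(expected_deaths)
```

The comparative-risk argument in the abstract rests on this number being of the same order as everyday risks such as 3200 km of driving.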

  15. [Effects of radiation exposure on human body].

    Science.gov (United States)

    Kamiya, Kenji; Sasatani, Megumi

    2012-03-01

    There are two types of radiation health effects: acute disorders and late-onset disorders. An acute disorder is a deterministic effect whose symptoms appear after exposure above a threshold. The tissues and cells that compose the human body differ in radiation sensitivity, and symptoms appear in order, starting with the most radiosensitive tissues. The clinical symptoms of acute disorders begin with a decrease in lymphocytes; with increasing radiation dose, symptoms such as alopecia, skin erythema, hematopoietic damage, gastrointestinal damage and central nervous system damage then appear. Among the late-onset disorders, which include cancer, non-cancer disease and genetic effects, cancer is the predominant health effect. Cancer and genetic effects are recognized as stochastic effects without a threshold. At radiation doses equal to or greater than 100 mSv, the cancer risk from radiation exposure is observed to increase linearly with dose. On the other hand, the risk of developing cancer through low-dose radiation exposure, below 100 mSv, has not yet been clarified scientifically. Although uncertainty still remains in low-dose risk estimation, the ICRP propounds the LNT model and conducts radiation protection in accordance with it for low-dose and low-dose-rate radiation, from the standpoint of radiation protection. Meanwhile, the mechanism of radiation damage has gradually been clarified. The initial event in radiation-induced disease is thought to be damage to the genome, such as radiation-induced DNA double-strand breaks. Recently, it has become clear that our cells can recognize genome damage and induce diverse cellular responses to maintain genome integrity. This phenomenon, called the DNA damage response, induces cell cycle arrest, DNA repair, apoptosis, cellular senescence and so on. These responses act in the direction of maintaining genome integrity against genome damage; however, the death of large number of

  16. The annealing behavior of hydrogen implanted into Al-Si alloy

    Energy Technology Data Exchange (ETDEWEB)

    Ogura, Masahiko; Yamaji, Norisuke; Imai, Makoto; Itoh, Akio; Imanishi, Nobutsugu [Kyoto Univ. (Japan). Faculty of Engineering

    1997-03-01

    We have studied the effects not only of defects but also of added elements on the trap sites of hydrogen in metals. For this purpose, we observed depth profiles and the thermal behavior of hydrogen implanted into Al-1.5at.%Si alloy samples over an implantation-temperature range from liquid nitrogen temperature (LNT) to 373 K at different doses. The results were compared with those for pure aluminum samples. It was found that hydrogen is trapped as molecules in Al/Si grain boundaries. (author)

  17. Experimental Research Regarding New Models of Organizational Communication in the Romanian Tourism

    Directory of Open Access Journals (Sweden)

    Cristina STATE

    2015-12-01

    Of interest to the most varied sciences (cybernetics, economics, ethnology, philosophy, history, psycho-sociology etc.), the complex process of communication has triggered many opinions, often far from complementary and sometimes raised to the level of passions generating contradictions. The result was the conceptualization of the content and functions of communication in different forms, called models by their creators. In time, with their evolution, communication models have included, besides the basic elements (sender, message, means of communication, receiver and effect), a range of detail elements essential to streamlining the process itself: the noise source, the codec and feedback, the interaction of the fields of experience of transmitter and receiver, the organizational context of communication, and communication skills, including how these are produced and interpreted. Finally, any model's functions are heuristic (to explain), organizational (to order) or predictive (to make assumptions). A model is worth only its degree of probability: it remains valid so long as it is not invalidated by practice, and it is one way of describing reality, not reality itself. This is the context in which our work, the first of its kind in Romania, proposes, with a view to improving organizational management, two new models of communication at both the micro- and macro-economic level, models through which, using crowdsourcing, the units in the tourism, hospitality and leisure industry (THLI) will be able to communicate more effectively, based not on their own insights and/or perceptions but, firstly, on the views of management and experts in the field and especially on customers' feedback.

  18. An interface tracking model for droplet electrocoalescence.

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, Lindsay Crowl

    2013-09-01

    This report describes an Early Career Laboratory Directed Research and Development (LDRD) project to develop an interface tracking model for droplet electrocoalescence. Many fluid-based technologies rely on electrical fields to control the motion of droplets, e.g. microfluidic devices for high-speed droplet sorting, solution separation for chemical detectors, and purification of biodiesel fuel. Precise control over droplets is crucial to these applications. However, electric fields can induce complex and unpredictable fluid dynamics. Recent experiments (Ristenpart et al. 2009) have demonstrated that oppositely charged droplets bounce rather than coalesce in the presence of strong electric fields. A transient aqueous bridge forms between approaching drops prior to pinch-off. This observation applies to many types of fluids, but neither theory nor experiments have been able to offer a satisfactory explanation. Analytic hydrodynamic approximations for interfaces become invalid near coalescence, and therefore detailed numerical simulations are necessary. This is a computationally challenging problem that involves tracking a moving interface and solving complex multi-physics and multi-scale dynamics, which are beyond the capabilities of most state-of-the-art simulations. An interface-tracking model for electro-coalescence can provide a new perspective to a variety of applications in which interfacial physics are coupled with electrodynamics, including electro-osmosis, fabrication of microelectronics, fuel atomization, oil dehydration, nuclear waste reprocessing and solution separation for chemical detectors. We present a conformal decomposition finite element (CDFEM) interface-tracking method for the electrohydrodynamics of two-phase flow to demonstrate electro-coalescence. CDFEM is a sharp interface method that decomposes elements along fluid-fluid boundaries and uses a level set function to represent the interface.

  19. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    International Nuclear Information System (INIS)

    Feinendegen, L.E.; Sondhaus, C.A.; Altman, K.I.

    1998-01-01

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive
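The deviation from linearity at low doses discussed in this abstract can be visualized with toy dose-response curves. The sketch below (illustrative functional forms and parameters, not taken from the paper) contrasts a linear no-threshold response with threshold and J-shaped (hormetic) alternatives:

```python
import math

def lnt(dose, slope=0.05):
    """Linear no-threshold: excess risk strictly proportional to dose."""
    return slope * dose

def threshold(dose, thr=100.0, slope=0.05):
    """Threshold model: no excess risk below a threshold dose."""
    return slope * (dose - thr) if dose > thr else 0.0

def hormetic(dose, slope=0.05, a=5.0, d0=50.0):
    """J-shaped (hormetic) curve: a saturating protective term makes the
    net risk negative at low doses and approximately linear at high doses."""
    return slope * dose - a * (1.0 - math.exp(-dose / d0))

for d in (0, 10, 50, 100, 500):
    print(d, lnt(d), threshold(d), hormetic(d))
```

The hormetic curve dips below zero at low doses (net protective effect) and rejoins the linear trend at high doses, which is the qualitative shape the abstract's "complex adaptive system" argument predicts.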

  20. Low doses of ionizing radiation to mammalian cells may rather control than cause DNA damage

    International Nuclear Information System (INIS)

    Feinendegen, L.E.; Sondhaus, C.A.; Altman, K.I.

    1998-01-01

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced metabolic changes that induce mechanisms of DNA damage mitigation, which do not operate at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. This paper aims at demonstrating tissue effects as an expression of cellular responses, both damaging and defensive, in relation to the energy deposited in cell mass, by use of microdosimetric concepts

  1. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    Energy Technology Data Exchange (ETDEWEB)

    Feinendegen, L.E. [Brookhaven National Lab., Upton, NY (United States). Medical Dept.; Bond, V.P. [Washington State Univ., Richland, WA (United States); Sondhaus, C.A. [Univ. of Arizona, Tucson, AZ (United States). Dept. of Radiology and Radiation Control Office; Altman, K.I. [Univ. of Rochester Medical Center, NY (United States). Dept. of Biochemistry and Biophysics

    1998-12-31

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive.

  2. Low doses of ionizing radiation to mammalian cells may rather control than cause DNA damage

    Energy Technology Data Exchange (ETDEWEB)

    Feinendegen, L.E. [Brookhaven National Lab., Upton, NY (United States). Medical Dept.; Bond, V.P. [Washington State Univ., Richland, WA (United States); Sondhaus, C.A. [Univ. of Arizona, Tucson, AZ (United States). Dept. of Radiology and Radiation Control Office; Altman, K.I. [Univ. of Rochester Medical Center, NY (United States). Dept. of Biochemistry and Biophysics

    1998-12-31

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced metabolic changes that induce mechanisms of DNA damage mitigation, which do not operate at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. This paper aims at demonstrating tissue effects as an expression of cellular responses, both damaging and defensive, in relation to the energy deposited in cell mass, by use of microdosimetric concepts.

  3. Taxonomic analysis of perceived risk: modeling individual and group perceptions within homogeneous hazard domains

    International Nuclear Information System (INIS)

    Kraus, N.N.; Slovic, P.

    1988-01-01

    Previous studies of risk perception have typically focused on the mean judgments of a group of people regarding the riskiness (or safety) of a diverse set of hazardous activities, substances, and technologies. This paper reports the results of two studies that take a different path. Study 1 investigated whether models within a single technological domain were similar to previous models based on group means and diverse hazards. Study 2 created a group taxonomy of perceived risk for only one technological domain, railroads, and examined whether the structure of that taxonomy corresponded with taxonomies derived from prior studies of diverse hazards. Results from Study 1 indicated that the importance of various risk characteristics in determining perceived risk differed across individuals and across hazards, but not so much as to invalidate the results of earlier studies based on group means and diverse hazards. In Study 2, the detailed analysis of railroad hazards produced a structure that had both important similarities to, and dissimilarities from, the structure obtained in prior research with diverse hazard domains. The data also indicated that railroad hazards are really quite diverse, with some approaching nuclear reactors in their perceived seriousness. These results suggest that information about the diversity of perceptions within a single domain of hazards could provide valuable input to risk-management decisions

  4. What happens at very low levels of radiation exposure ? Are the low dose exposures beneficial ?

    International Nuclear Information System (INIS)

    Deniz, Dalji

    2006-01-01

    Full text: Radiation is naturally present in our environment and has been since the birth of this planet. The human population is constantly exposed to low levels of natural background radiation, primarily from environmental sources, and to higher levels from occupational sources, medical therapy, and other human-mediated events. Radiation is one of the best-investigated hazardous agents. The biological effects of ionizing radiation considered for radiation protection are grouped into two categories: the deterministic and the stochastic ones. Deterministic radiation effects can be clinically diagnosed in the exposed individual and occur when, above a certain threshold, an appropriately high dose is absorbed in the tissues and organs, causing the death of a large number of cells and consequently impairing tissue or organ functions early after exposure. A clinically observable biological effect (Acute Radiation Syndromes, ARS) occurs days to months after an acute radiation dose. Stochastic radiation effects are the chronic effects of radiation that result from relatively low exposure levels delivered over long periods of time. These are the sort of effects that might result from occupational exposure, or from background exposure levels. Such late effects might be the development of malignant (cancerous) disease and of hereditary consequences. These effects may be observed many years after the radiation exposure. There is a latent period between the initial radiation exposure and the development of the biological effect. For this reason, a stochastic effect is described by a Linear No-Threshold (LNT) dose-response relation. There is a stochastic correlation between the number of cases of cancers or genetic defects developed inside a population and the dose received by the population at relatively large levels of radiation. Changes in gene activation seem to be able to modify the response of cells to subsequent radiation exposure, termed the adaptive response

  5. 3D finite element model of the diabetic neuropathic foot: a gait analysis driven approach.

    Science.gov (United States)

    Guiotto, Annamaria; Sawacha, Zimi; Guarneri, Gabriella; Avogaro, Angelo; Cobelli, Claudio

    2014-09-22

    Diabetic foot is an invalidating complication of diabetes that can lead to foot ulcers. Three-dimensional (3D) finite element analysis (FEA) allows characterizing the loads developed in the different anatomical structures of the foot in dynamic conditions. The aim of this study was to develop a subject-specific 3D foot FE model (FEM) of a diabetic neuropathic (DNS) and a healthy (HS) subject, whose subject specificity can be found in terms of foot geometry and boundary conditions. Kinematics, kinetics and plantar pressure (PP) data were extracted from the gait analysis trials of the two subjects for this purpose. The FEMs were developed by segmenting bones, cartilage and skin from MRI and drawing a horizontal plate as ground support. Material properties were adopted from previous literature. FE simulations were run with the kinematics and kinetics data of four different phases of the stance phase of gait (heel strike, loading response, midstance and push off). FEMs were then driven by group gait data of 10 neuropathic and 10 healthy subjects. Model validation focused on agreement between FEM-simulated and experimental PP. The peak values and the total distribution of the pressures were compared for this purpose. Results showed that the models were less robust when driven from group data and underestimated the PP in each foot subarea. In particular, in the case of the neuropathic subject's model the mean errors between experimental and simulated data were around 20% of the peak values. This knowledge is crucial in understanding the aetiology of diabetic foot. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Guilt by Association: On Iffy Propositions and the Proper Treatment of Mental-Models Theory

    OpenAIRE

    Schroyens, Walter; Schaeken, Walter

    2004-01-01

    It is shown that the failure of Evans, Handley and Over (2003) to distinguish between logical and psychological predictions has given rise to pseudo-arguments that render the mental-models theory (MMT) of Johnson-Laird & Byrne (2002) guilty by association with hypotheses known to be psychologically invalid. It is also shown that mental-models theory permits different interpretations of "if", among them the conjunctive interpretation ('p and q'...

  7. AKIBAT HUKUM PRODUK YANG CACAT

    Directory of Open Access Journals (Sweden)

    Indrati Rini

    1999-07-01

    Full Text Available The object of a transaction between producers and consumers usually relates to goods. The producer is the party that manufactures and distributes goods, and goods cannot be separated from product liability. Consumers must be guaranteed that the goods they purchase are not defective. The producer's accountability for defective products, that is, products that cannot be used according to their function, is based both on agreement and on unlawful acts.

  8. Long-range epidemic spreading in a random environment.

    Science.gov (United States)

    Juhász, Róbert; Kovács, István A; Iglói, Ferenc

    2015-03-01

    Modeling long-range epidemic spreading in a random environment, we consider a quenched, disordered, d-dimensional contact process with infection rates decaying with distance as 1/r^(d+σ). We study the dynamical behavior of the model at and below the epidemic threshold by a variant of the strong-disorder renormalization-group method and by Monte Carlo simulations in one and two spatial dimensions. Starting from a single infected site, the average survival probability is found to decay as P(t) ∼ t^(-d/z) up to multiplicative logarithmic corrections. Below the epidemic threshold, a Griffiths phase emerges, where the dynamical exponent z varies continuously with the control parameter and tends to z_c = d+σ as the threshold is approached. At the threshold, the spatial extension of the infected cluster (in surviving trials) is found to grow as R(t) ∼ t^(1/z_c) with a multiplicative logarithmic correction, and the average number of infected sites in surviving trials is found to increase as N_s(t) ∼ (ln t)^χ with χ = 2 in one dimension.
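The scaling laws reported in this abstract (z_c = d + σ at threshold, P(t) ∼ t^(-d/z), R(t) ∼ t^(1/z_c), and N_s(t) ∼ (ln t)^χ with χ = 2 in one dimension) can be evaluated numerically. The sketch below sets all prefactors to one and ignores the logarithmic corrections:

```python
import math

def critical_exponent(d, sigma):
    """At the epidemic threshold the dynamical exponent tends to z_c = d + sigma."""
    return d + sigma

def survival_probability(t, d, z):
    """Average survival probability P(t) ~ t^(-d/z), prefactor set to 1."""
    return t ** (-d / z)

def cluster_radius(t, d, sigma):
    """Spatial extent of surviving clusters at threshold, R(t) ~ t^(1/z_c)."""
    return t ** (1.0 / critical_exponent(d, sigma))

def infected_sites(t, chi=2):
    """Average number of infected sites in surviving trials, N_s(t) ~ (ln t)^chi."""
    return math.log(t) ** chi

d, sigma = 1, 1.5
zc = critical_exponent(d, sigma)   # 2.5
print(zc, survival_probability(100, d, zc), cluster_radius(100, d, sigma))
```

Note the slow, logarithmic growth of N_s(t) compared with the power-law growth of R(t): surviving clusters become spatially extended but remain sparsely infected.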

  9. Hypoxia in the St. Lawrence Estuary: How a Coding Error Led to the Belief that "Physics Controls Spatial Patterns".

    Directory of Open Access Journals (Sweden)

    Daniel Bourgault

    Full Text Available Two fundamental sign errors were found in a computer code used for studying the oxygen minimum zone (OMZ and hypoxia in the Estuary and Gulf of St. Lawrence. These errors invalidate the conclusions drawn from the model, and call into question a proposed mechanism for generating OMZ that challenges classical understanding. The study in question is being cited frequently, leading the discipline in the wrong direction.

  10. Some implications of excess soft X-ray emission from Seyfert 1 galaxies

    International Nuclear Information System (INIS)

    Fabian, A.C.; Guilbert, P.W.; Arnaud, K.A.; Shafer, R.A.; Tennant, A.F.; Ward, M.J.

    1986-01-01

    The X-ray spectrum of Seyfert 1 galaxies is characterized by a hard power-law spectrum. It is often postulated that this maintains a Compton-heated two-phase Broad-Line Region (BLR) around the central source. It is shown here that the strong excess soft X-ray emission observed in MKN 841 and other Seyfert galaxies invalidates this model if the BLR is spherically symmetric. Alternatives are proposed. (author)

  11. Accurate market price formation model with both supply-demand and trend-following for global food prices providing policy recommendations.

    Science.gov (United States)

    Lagi, Marco; Bar-Yam, Yavni; Bertrand, Karla Z; Bar-Yam, Yaneer

    2015-11-10

    Recent increases in basic food prices are severely affecting vulnerable populations worldwide. Proposed causes such as shortages of grain due to adverse weather, increasing meat consumption in China and India, conversion of corn to ethanol in the United States, and investor speculation on commodity markets lead to widely differing implications for policy. A lack of clarity about which factors are responsible reinforces policy inaction. Here, for the first time to our knowledge, we construct a dynamic model that quantitatively agrees with food prices. The results show that the dominant causes of price increases are investor speculation and ethanol conversion. Models that just treat supply and demand are not consistent with the actual price dynamics. The two sharp peaks in 2007/2008 and 2010/2011 are specifically due to investor speculation, whereas an underlying upward trend is due to increasing demand from ethanol conversion. The model includes investor trend following as well as shifting between commodities, equities, and bonds to take advantage of increased expected returns. Claims that speculators cannot influence grain prices are shown to be invalid by direct analysis of price-setting practices of granaries. Both causes of price increase, speculative investment and ethanol conversion, are promoted by recent regulatory changes: deregulation of the commodity markets, and policies promoting the conversion of corn to ethanol. Rapid action is needed to reduce the impacts of the price increases on global hunger.
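The abstract's central contrast, that supply-demand dynamics alone relax prices to equilibrium while trend-following speculators sustain large excursions, can be caricatured in a few lines. The model below is a hypothetical toy, not the authors' quantitative model; all parameter names and values are illustrative:

```python
def simulate_price(p0, p_eq=100.0, k_sd=0.1, k_tf=0.0, steps=200):
    """Toy price dynamics: a supply-demand force pulls the price toward its
    equilibrium value, while a trend-following term pushes it further in the
    direction of the most recent move."""
    prices = [p0, p0]
    for _ in range(steps):
        p, p_prev = prices[-1], prices[-2]
        prices.append(p + k_sd * (p_eq - p) + k_tf * (p - p_prev))
    return prices

def mean_abs_dev(prices, p_eq=100.0):
    """Average distance of the price trajectory from equilibrium."""
    return sum(abs(p - p_eq) for p in prices) / len(prices)

fundamental = simulate_price(120.0)            # supply-demand only: relaxes
speculative = simulate_price(120.0, k_tf=1.0)  # strong trend following: persistent swings
print(mean_abs_dev(fundamental), mean_abs_dev(speculative))
```

With the trend-following coefficient at zero the price decays monotonically to equilibrium; with a strong trend-following term the same initial shock produces long-lived oscillations around equilibrium, a crude analogue of the speculative peaks the paper attributes to investors.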

  12. Optimal harvesting policy of a stochastic two-species competitive model with Lévy noise in a polluted environment

    Science.gov (United States)

    Zhao, Yu; Yuan, Sanling

    2017-07-01

    As well known that the sudden environmental shocks and toxicant can affect the population dynamics of fish species, a mechanistic understanding of how sudden environmental change and toxicant influence the optimal harvesting policy requires development. This paper presents the optimal harvesting of a stochastic two-species competitive model with Lévy noise in a polluted environment, where the Lévy noise is used to describe the sudden climate change. Due to the discontinuity of the Lévy noise, the classical optimal harvesting methods based on the explicit solution of the corresponding Fokker-Planck equation are invalid. The object of this paper is to fill up this gap and establish the optimal harvesting policy. By using of aggregation and ergodic methods, the approximation of the optimal harvesting effort and maximum expectation of sustainable yields are obtained. Numerical simulations are carried out to support these theoretical results. Our analysis shows that the Lévy noise and the mean stress measure of toxicant in organism may affect the optimal harvesting policy significantly.
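A minimal sketch of the harvesting problem, assuming a single logistic population with multiplicative noise and occasional multiplicative jumps (a deliberately simplified stand-in for the paper's two-species Lévy-noise model): in the deterministic limit the sustainable yield E·X*(E) = E(r - E)/a is maximized at effort E = r/2, which a grid search recovers.

```python
import math
import random

def mean_population(r, a, E, sigma=0.0, jump_prob=0.0, jump_size=0.0,
                    dt=0.01, steps=20000, x0=1.0, seed=0):
    """Euler scheme for a harvested logistic population
    dX = X(r - aX - E) dt + sigma X dW + occasional multiplicative jumps.
    Returns the time-averaged population size."""
    rng = random.Random(seed)
    x, total = x0, 0.0
    for _ in range(steps):
        x += x * (r - a * x - E) * dt + sigma * x * rng.gauss(0.0, math.sqrt(dt))
        if rng.random() < jump_prob * dt:
            x *= 1.0 + jump_size   # sudden environmental shock
        x = max(x, 1e-9)           # keep the population non-negative
        total += x * dt
    return total / (steps * dt)

def sustainable_yield(E, **kw):
    """Long-run harvest rate: effort times average population."""
    return E * mean_population(E=E, **kw)

# Deterministic limit (sigma = 0, no jumps): yield E(r - E)/a peaks at E = r/2.
r, a = 1.0, 0.5
efforts = [i * 0.05 for i in range(1, 20)]
best = max(efforts, key=lambda E: sustainable_yield(E, r=r, a=a))
print(best)  # grid search recovers an effort close to r/2 = 0.5
```

Turning on sigma and the jump terms lets one probe numerically how noise shifts the optimal effort, which is the regime where, as the abstract notes, closed-form Fokker-Planck arguments break down.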

  13. Correlates and consequences of the disclosure of pain-related distress to one’s spouse

    Science.gov (United States)

    Cano, Annmarie; Leong, Laura E. M.; Williams, Amy M.; May, Dana K. K.; Lutz, Jillian R.

    2012-01-01

    The communication of pain has received a great deal of attention in the pain literature; however, one form of pain communication—emotional disclosure of pain-related distress (e.g., sadness, worry, anger about pain)—has not been studied extensively. The current study examined the extent to which this form of pain communication occurred during an observed conversation with one’s spouse and also investigated the correlates and consequences of disclosure. Individuals with chronic pain (ICPs) and their spouses (N = 95 couples) completed several questionnaires regarding pain, psychological distress, and relationship distress as well as video recorded interactions about the impact of pain on their lives. Approximately two-thirds of ICPs (n = 65) disclosed their pain-related distress to their spouses. ICPs who reported greater pain severity, ruminative catastrophizing and affective distress about pain, and depressive and anxiety symptoms were more likely to disclose their distress to their spouses. Spouses of ICPs who disclosed only once or twice were significantly less likely to invalidate their partners whereas spouses of ICPs who disclosed at a higher rate were significantly more likely to validate their partners. Furthermore, spouses were more likely to engage in invalidation after attempting more neutral or validating responses, suggesting an erosion of support when ICPs engaged in high rates of disclosure. Correlates of spousal invalidation included both spouses’ helplessness catastrophizing, ICPs’ affective distress about pain, and spouses’ anxiety, suggesting that both partners’ distress are implicated in maladaptive disclosure-response patterns. Findings are discussed in light of pain communication and empathy models of pain. PMID:23059054

  14. Vanguards of paradigm shift in radiation biology. Radiation-induced adaptive and bystander responses

    International Nuclear Information System (INIS)

    Matsumoto, Hideki; Hamada, Nobuyuki; Kobayashi, Yasuhiko; Takahashi, Akihisa; Ohnishi, Takeo

    2007-01-01

    The risks of exposure to low dose ionizing radiation (below 100 mSv) are estimated by extrapolating from data obtained after exposure to high dose radiation, using a linear no-threshold model (LNT model). However, the validity of using this dose-response model is controversial because evidence accumulated over the past decade has indicated that living organisms, including humans, respond differently to low dose/low dose-rate radiation than they do to high dose/high dose-rate radiation. In other words, there are accumulated findings which cannot be explained by the classical "target theory" of radiation biology. The radioadaptive response, radiation-induced bystander effects, low-dose radio-hypersensitivity, and genomic instability are specifically observed in response to low dose/low dose-rate radiation, and the mechanisms underlying these responses often involve biochemical/molecular signals that respond to targeted and non-targeted events. Recently, correlations between the radioadaptive and bystander responses have been increasingly reported. The present review focuses on the latter two phenomena by summarizing observations supporting their existence, and discussing the linkage between them from the aspect of production of reactive oxygen and nitrogen species. (author)

  15. Wind Turbine Wake in Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Réthoré, Pierre-Elouan Mikael

    to calibrate faster and simpler engineering wind farm wake models. The most attractive solution was the actuator disc method with the steady state k-ε turbulence model. The first step to design such a tool is the treatment of the forces. This thesis presents a computationally inexpensive method to apply......) shows that the problem mainly comes from the assumptions of the eddy-viscosity concept, which are deeply invalidated in the wind turbine wake region. Different models that intend to correct the k-ε model’s issues are investigated, none of which is found to be adequate. The mixing of the wake...

  16. Fault Detection and Isolation and Fault Tolerant Control of Wind Turbines Using Set-Valued Observers

    DEFF Research Database (Denmark)

    Casau, Pedro; Rosa, Paulo Andre Nobre; Tabatabaeipour, Seyed Mojtaba

    2012-01-01

    Research on wind turbine Operations & Maintenance (O&M) procedures is critical to the expansion of Wind Energy Conversion systems (WEC). In order to reduce O&M costs and increase the lifespan of the turbine, we study the application of Set-Valued Observers (SVO) to the problem of Fault Detection...... and Isolation (FDI) and Fault Tolerant Control (FTC) of wind turbines, by taking advantage of the recent advances in SVO theory for model invalidation. A simple wind turbine model is presented along with possible faulty scenarios. The FDI algorithm is built on top of the described model, taking into account...

  17. EPR and optical studies of Cu{sup 2+} ions doped in magnesium potassium phosphate hexahydrate single crystals

    Energy Technology Data Exchange (ETDEWEB)

    Kripal, Ram; Shukla, Santwana, E-mail: ram_kripal2001@rediffmail.com, E-mail: shukla.santwana@gmail.com [EPR Laboratory, Department of Physics, University of Allahabad, Allahabad 211002 (India)

    2011-03-15

    An electron paramagnetic resonance (EPR) study of Cu{sup 2+}-doped magnesium potassium phosphate is performed at liquid nitrogen temperature (LNT; 77 K). Two magnetically non-equivalent sites for Cu{sup 2+} are observed. The spin Hamiltonian parameters are determined with the fitting of spectra to a rhombic symmetry crystalline field. The ground state wavefunction is also determined. The g-anisotropy is evaluated and compared with the experimental value. With the help of an optical study, the nature of the bonding in the complex is discussed.

  18. EPR and optical studies of Cu2+ ions doped in magnesium potassium phosphate hexahydrate single crystals

    International Nuclear Information System (INIS)

    Kripal, Ram; Shukla, Santwana

    2011-01-01

    An electron paramagnetic resonance (EPR) study of Cu2+-doped magnesium potassium phosphate is performed at liquid nitrogen temperature (LNT; 77 K). Two magnetically non-equivalent sites for Cu2+ are observed. The spin Hamiltonian parameters are determined with the fitting of spectra to a rhombic symmetry crystalline field. The ground state wavefunction is also determined. The g-anisotropy is evaluated and compared with the experimental value. With the help of an optical study, the nature of the bonding in the complex is discussed.

  19. Low-Level Radiation: Are Chemical Officers Adequately Trained

    Science.gov (United States)

    2004-06-17

    (Search snippet; only fragments of a risk-comparison table survive, apparently listing a dose and an LNT-derived risk per 1,000,000 for each source of exposure:) Smoking 1 pack of cigarettes/day (polonium-210): 8,000; 200,000 per 1,000,000. Sleeping next to one's partner: 2; 50 per 1,000,000. Cosmic rays (at sea level): 30; 1,100 per 1,000,000. Cosmic rays (Denver at 5000 ft elevation): 55; 2,000 per 1,000,000. Human body (from food we eat... The snippet also describes cobalt-60: used for treating cancer, food and material irradiation, gamma radiography, and industrial measurement gauges; half-life 5.27 years.

  20. Attention and executive functions in a rat model of chronic epilepsy.

    Science.gov (United States)

    Faure, Jean-Baptiste; Marques-Carneiro, José E; Akimana, Gladys; Cosquer, Brigitte; Ferrandon, Arielle; Herbeaux, Karine; Koning, Estelle; Barbelivien, Alexandra; Nehlig, Astrid; Cassel, Jean-Christophe

    2014-05-01

    Temporal lobe epilepsy is a relatively frequent, invalidating, and often refractory neurologic disorder. It is associated with cognitive impairments that affect memory and executive functions. In the rat lithium-pilocarpine temporal lobe epilepsy model, memory impairment and anxiety disorder are classically reported. Here we evaluated sustained visual attention in this model of epilepsy, a function not frequently explored. Thirty-five Sprague-Dawley rats were subjected to lithium-pilocarpine status epilepticus. Twenty of them received a carisbamate treatment for 7 days, starting 1 h after status epilepticus onset. Twelve controls received lithium and saline. Five months later, attention was assessed in the five-choice serial reaction time task, a task that tests visual attention and inhibitory control (impulsivity/compulsivity). Neuronal counting was performed in brain regions of interest to the functions studied (hippocampus, prefrontal cortex, nucleus basalis magnocellularis, and pedunculopontine tegmental nucleus). Lithium-pilocarpine rats developed motor seizures. When they were able to learn the task, they exhibited attention impairment and a tendency toward impulsivity and compulsivity. These disturbances occurred in the absence of neuronal loss in structures classically related to attentional performance, although they seemed to better correlate with neuronal loss in hippocampus. Globally, rats that received carisbamate and developed motor seizures were as impaired as untreated rats, whereas those that did not develop overt motor seizures performed like controls, despite evidence for hippocampal damage. This study shows that attention deficits reported by patients with temporal lobe epilepsy can be observed in the lithium-pilocarpine model. Carisbamate prevents the occurrence of motor seizures, attention impairment, impulsivity, and compulsivity in a subpopulation of neuroprotected rats. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.

  1. Comments on the Dutton-Puls model: Temperature and yield stress dependences of crack growth rate in zirconium alloys

    International Nuclear Information System (INIS)

    Kim, Young S.

    2010-01-01

    Research highlights: → This study shows first that temperature and yield stress dependences of crack growth rate in zirconium alloys can analytically be understood not by the Dutton-Puls model but by Kim's new DHC model. → It is demonstrated that the driving force for DHC is ΔC, not the stress gradient, which is the core of Kim's DHC model. → The Dutton-Puls model reveals the invalidity of Puls' claim that the crack tip solubility would increase to the cooling solvus. - Abstract: This work was prompted by the publication of Puls's recent papers claiming that the Dutton-Puls model is valid enough to explain the stress and temperature dependences of the crack growth rate (CGR) in zirconium alloys. The first version of the Dutton-Puls model shows that the CGR has positive dependences on the concentration difference ΔC, hydrogen diffusivity D_H, and the yield strength, and a negative dependence on the applied stress intensity factor K_I, which is one of its critical defects. Thus, the Dutton-Puls model claiming that the temperature dependence of CGR is determined by D_H·C_H turns out to be incorrect. Given that ΔC is independent of the stress, it is evident that the driving force for DHC is ΔC, not the stress gradient, corroborating the validity of Kim's model. Furthermore, the predicted activation energy for CGR in a cold-worked Zr-2.5Nb tube disagrees with the measured one for the Zr-2.5Nb tube, showing that the Dutton-Puls model is too defective to explain the temperature dependence of CGR. It is demonstrated that the revised Dutton-Puls model also cannot explain the yield stress dependence of CGR.

  2. Test Collections for Patent-to-Patent Retrieval and Patent Map Generation in NTCIR-4 Workshop

    OpenAIRE

    Fujii, Atsushi; Iwayama, Makoto; Kando, Noriko

    2004-01-01

    This paper describes the Patent Retrieval Task in the Fourth NTCIR Workshop, and the test collections produced in this task. We perform the invalidity search task, in which each participant group searches a patent collection for the patents that can invalidate the demand in an existing claim. We also perform the automatic patent map generation task, in which the patents associated with a specific topic are organized in a multi-dimensional matrix.

  3. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fang, Yilin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  4. Simulation of lean NOx trap performance with microkinetic chemistry and without mass transfer.

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Rich; Daw, C. Stuart (Oak Ridge National Laboratory, Knoxville, TN); Pihl, Josh A. (Oak Ridge National Laboratory, Knoxville, TN); Chakravarthy, V. Kalyana (Oak Ridge National Laboratory, Knoxville, TN)

    2011-08-01

    A microkinetic chemical reaction mechanism capable of describing both the storage and regeneration processes in a fully formulated lean NO{sub x} trap (LNT) is presented. The mechanism includes steps occurring on the precious metal, barium oxide (NO{sub x} storage), and cerium oxide (oxygen storage) sites of the catalyst. The complete reaction set is used in conjunction with a transient plug flow reactor code to simulate not only conventional storage/regeneration cycles with a CO/H{sub 2} reductant, but also steady flow temperature sweep experiments that were previously analyzed with just a precious metal mechanism and a steady state code. The results show that NO{sub x} storage is not negligible during some of the temperature ramps, necessitating a re-evaluation of the precious metal kinetic parameters. The parameters for the entire mechanism are inferred by finding the best overall fit to the complete set of experiments. Rigorous thermodynamic consistency is enforced for parallel reaction pathways and with respect to known data for all of the gas phase species involved. It is found that, with a few minor exceptions, all of the basic experimental observations can be reproduced with these purely kinetic simulations, i.e., without including mass-transfer limitations. In addition to accounting for normal cycling behavior, the final mechanism should provide a starting point for the description of further LNT phenomena such as desulfation and the role of alternative reductants.
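The storage/regeneration cycling simulated in this record can be caricatured with a single storage-site balance. The rate constants and gas concentrations below are hypothetical placeholders, far simpler than the paper's full microkinetic mechanism with precious metal, barium oxide, and cerium oxide sites:

```python
def lnt_cycle(theta0=0.0, k_store=2.0, k_release=20.0,
              lean_time=60.0, rich_time=5.0, dt=0.01):
    """One lean/rich cycle for the fractional NOx storage-site coverage theta.
    Lean phase:  d(theta)/dt =  k_store   * (1 - theta) * c_nox   (storage)
    Rich phase:  d(theta)/dt = -k_release * theta * c_red         (regeneration)"""
    c_nox, c_red = 0.05, 0.5   # hypothetical normalized gas concentrations
    theta, trace = theta0, []
    for phase, duration in (("lean", lean_time), ("rich", rich_time)):
        for _ in range(int(duration / dt)):
            if phase == "lean":
                theta += k_store * (1.0 - theta) * c_nox * dt
            else:
                theta -= k_release * theta * c_red * dt
            trace.append(theta)
    return trace

trace = lnt_cycle()
peak = max(trace)
print(peak, trace[-1])   # coverage builds during lean flow, empties when rich
```

The asymmetry between the long lean phase and the short, fast rich phase mirrors typical LNT operation; fitted mechanisms like the one in the report replace the two lumped rates with elementary steps on each catalyst site.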

  5. Checking the foundation: recent radiobiology and the linear no-threshold theory.

    Science.gov (United States)

    Ulsh, Brant A

    2010-12-01

    The linear no-threshold (LNT) theory has been adopted as the foundation of radiation protection standards and risk estimation for several decades. The "microdosimetric argument" has been offered in support of the LNT theory. This argument postulates that energy is deposited in critical cellular targets by radiation in a linear fashion across all doses down to zero, and that this in turn implies a linear relationship between dose and biological effect across all doses. This paper examines whether the microdosimetric argument holds at the lowest levels of biological organization following low dose, low dose-rate exposures to ionizing radiation. The assumptions of the microdosimetric argument are evaluated in light of recent radiobiological studies on radiation damage in biological molecules and cellular and tissue level responses to radiation damage. There is strong evidence that radiation initially deposits energy in biological molecules (e.g., DNA) in a linear fashion, and that this energy deposition results in various forms of prompt DNA damage that may be produced in a pattern that is distinct from endogenous (e.g., oxidative) damage. However, a large and rapidly growing body of radiobiological evidence indicates that cell and tissue level responses to this damage, particularly at low doses and/or dose-rates, are nonlinear and may exhibit thresholds. To the extent that responses observed at lower levels of biological organization in vitro are predictive of carcinogenesis observed in vivo, this evidence directly contradicts the assumptions upon which the microdosimetric argument is based.

  6. The risk of low doses of ionising radiation and the linear no threshold relationship debate

    International Nuclear Information System (INIS)

    Tubiana, M.; Masse, R.; Vathaire, F. de; Averbeck, D.; Aurengo, A.

    2007-01-01

    The ICRP and BEIR VII reports recommend a linear no-threshold (L.N.T.) relationship for estimating the excess cancer risk induced by ionising radiation (IR), but the 2005 report of the French Academies of Medicine and Science concludes that it leads to an overestimate of risk at low and very low doses. The bases of L.N.T. are challenged by recent biological and animal experimental studies which show that the defence against IR involves the cell microenvironment and the immunologic system. The defence mechanisms against low doses are different and comparatively more effective than those against high doses. Against low doses, cell death is the predominant response; against high doses, DNA repair is activated in order to preserve tissue functions. These mechanisms provide multicellular organisms with an effective and low-cost defence system. The differences between low- and high-dose defence mechanisms are obvious for alpha emitters, which show threshold effects at several grays. These differences undermine epidemiological studies which, for reasons of statistical power, amalgamate high- and low-dose exposure data, since doing so implies that cancer induction and defence mechanisms are similar in both cases. Low-dose IR risk estimates should rely on specific epidemiological studies restricted to low-dose exposures that take potential confounding factors precisely into account. A preliminary synthesis of the cohort studies for which low-dose data (< 100 mSv) were available shows no significant excess risk, either for solid cancers or for leukemias. (authors)

  7. Effect of physiotherapy on arm functions of patients with rheumatoid arthritis

    OpenAIRE

    Kruopienė, Joana

    2006-01-01

    Rheumatoid arthritis is an inflammatory disease of the connective tissue which causes progressive inflammation of the joints. Rheumatoid arthritis is among the leading causes of disability by number of patients affected. The growing number of people with disabilities in Lithuania is becoming not only a medical problem but a social problem as well. Everything is done to quell the activity of the pathological process and its progression, and to restore and maintain the functions of the musculoskeletal system with the help of prophylaxis, therapy and...

  8. Withdrawn: Tau polarization as CP diagnostic for a light Higgs boson at a photon collider

    CERN Document Server

    Godbole, Rohini M.; Kraml, Sabine

    Unfortunately, we have discovered a bug in the computer program which invalidates the numerical results of our paper. While our formulae and the dependencies explained in the paper are correct, the numerical values of the asymmetries presented in the paper go down by more than two orders of magnitude. The plots and the conclusions of the paper as they stand are hence invalid and will have to be changed. We withdraw this paper until we finish this reformulation.

  9. Magnetic relaxation in sintered Tl2Ca2Ba2Cu3O/sub x/ and YBa2Cu3O/sub 7-//sub x/ superconductors

    International Nuclear Information System (INIS)

    McHenry, M.E.; Maley, M.P.; Venturini, E.L.; Ginley, D.L.

    1989-01-01

    We have characterized the time dependence of the zero-field-cooled magnetization for sintered pellets of the Tl 2:2:2:3 and Y 1:2:3 superconductors. The magnetic relaxation in both cases is large and exhibits a logarithmic time dependence. The temperature dependence of the relaxation rate A = dM/d ln(t) has been characterized for both materials for applied fields of 1, 2, 3, and 10 kG. The relaxation rate for the Y 1:2:3 sintered material is comparable to that observed in similar sintered materials and in single crystals. The Tl 2:2:2:3 material exhibits similar relaxation spectra with a weaker temperature dependence at a given field, consistent with stronger pinning in this material. The temperature dependence of the relaxation is analyzed using a phenomenological relaxation model to yield an average pinning energy (0.33 eV at H = 1 kG) and its field dependence.
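    The quoted relaxation rate A = dM/d ln(t) can be extracted from magnetization data by a straight-line fit in ln(t). The sketch below does this on synthetic data and applies one common Anderson-Kim-style pinning-energy estimate; the temperature, magnetization scale, and the U ~ kB*T*M0/A form are illustrative assumptions, not the paper's analysis:

```python
import numpy as np

# Synthetic zero-field-cooled data obeying the logarithmic law
# M(t) = M0 - A*ln(t); M0, A, and T are assumed values.
t = np.logspace(0, 4, 50)            # time, s
M0, A_true = 10.0, 0.4               # arbitrary magnetization units
M = M0 - A_true * np.log(t)

# Relaxation rate A = -dM/d ln(t) from a straight-line fit in ln(t)
slope, intercept = np.polyfit(np.log(t), M, 1)
A_fit = -slope

# One common Anderson-Kim-style estimate of the mean pinning energy,
# U ~ kB*T*M0/A (the paper's phenomenological model may differ)
kB = 8.617e-5                        # eV/K
T = 20.0                             # K, assumed
U = kB * T * M0 / A_fit              # eV
```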

  10. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an...

  11. Tunneling in a self-consistent dynamic image potential

    International Nuclear Information System (INIS)

    Rudberg, B.G.R.; Jonson, M.

    1991-01-01

    We have calculated the self-consistent effective potential for an electron tunneling through a square barrier while interacting with surface plasmons. This potential reduces to the classical image potential in the static limit. In the opposite limit, when the ''velocity'' of the tunneling electron is large, it reduces to the unperturbed square-barrier potential. For a wide variety of parameters the dynamic effects on the transmission coefficient T = |t|^2 can, for instance, be related to the Buettiker-Landauer traversal time for tunneling, given by τ_BL = ħ|d ln t/dV|.
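    The quoted traversal-time relation can be checked numerically for the bare square barrier: below, the transmission amplitude magnitude is differentiated with respect to barrier height and compared with the opaque-barrier estimate τ ≈ ma/(ħκ). Natural units (ħ = m = 1) and all parameter values are assumptions for illustration, and the modulus |t| stands in for the full complex amplitude, which is adequate in the opaque limit:

```python
import numpy as np

# Square-barrier transmission and the Buettiker-Landauer traversal
# time tau_BL = hbar*|d ln t / dV|, in natural units hbar = m = 1.
E, V0, a = 1.0, 4.0, 3.0   # energy, barrier height, barrier width

def t_mag(V):
    """Magnitude of the transmission amplitude for E < V."""
    kappa = np.sqrt(2.0 * (V - E))
    return 1.0 / np.sqrt(1.0 + V**2 * np.sinh(kappa * a)**2
                         / (4.0 * E * (V - E)))

# Central difference for d ln|t| / dV at the barrier height V0
dV = 1e-5
tau_bl = abs(np.log(t_mag(V0 + dV)) - np.log(t_mag(V0 - dV))) / (2 * dV)

# Opaque-barrier estimate tau ~ m*a/(hbar*kappa) for comparison
tau_opaque = a / np.sqrt(2.0 * (V0 - E))
```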

  12. A Monte Carlo-adjusted goodness-of-fit test for parametric models describing spatial point patterns

    KAUST Repository

    Dao, Ngocanh

    2014-04-03

    Assessing the goodness-of-fit (GOF) for intricate parametric spatial point process models is important for many application fields. When the probability density of the statistic of the GOF test is intractable, a commonly used procedure is the Monte Carlo GOF test. Additionally, if the data comprise a single dataset, a popular version of the test plugs a parameter estimate in the hypothesized parametric model to generate data for the Monte Carlo GOF test. In this case, the test is invalid because the resulting empirical level does not reach the nominal level. In this article, we propose a method consisting of nested Monte Carlo simulations which has the following advantages: the bias of the resulting empirical level of the test is eliminated, hence the empirical levels can always reach the nominal level, and information about inhomogeneity of the data can be provided. We theoretically justify our testing procedure using Taylor expansions and demonstrate that it is correctly sized through various simulation studies. In our first data application, we discover, in agreement with Illian et al., that Phlebocarya filifolia plants near Perth, Australia, can follow a homogeneous Poisson clustered process that provides insight into the propagation mechanism of these plants. In our second data application, we find, in contrast to Diggle, that a pairwise interaction model provides a good fit to the micro-anatomy data of amacrine cells designed for analyzing the developmental growth of immature retina cells in rabbits. This article has supplementary material online. © 2013 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
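    The nested Monte Carlo idea can be illustrated on a much simpler plug-in GOF problem than spatial point patterns (an exponential model with an estimated rate): the outer loop re-runs the whole plug-in test on data simulated from the fitted model, calibrating the naive p-value. Sample sizes and simulation counts below are arbitrary:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def ks_exp(x):
    """KS distance to Exp(scale) with the scale estimated from the
    same data (the plug-in step that biases the naive test)."""
    return stats.kstest(x, 'expon', args=(0, x.mean())).statistic

def mc_pvalue(x, n_sim=99):
    """Naive plug-in Monte Carlo GOF p-value."""
    d_obs = ks_exp(x)
    d_sim = [ks_exp(rng.exponential(x.mean(), x.size))
             for _ in range(n_sim)]
    return (1 + sum(d >= d_obs for d in d_sim)) / (n_sim + 1)

def nested_mc_pvalue(x, n_outer=39, n_sim=39):
    """Nested Monte Carlo: re-run the whole plug-in test on datasets
    drawn from the fitted model to calibrate the naive p-value."""
    p_obs = mc_pvalue(x, n_sim)
    p_sim = [mc_pvalue(rng.exponential(x.mean(), x.size), n_sim)
             for _ in range(n_outer)]
    return (1 + sum(p <= p_obs for p in p_sim)) / (n_outer + 1)

x = rng.exponential(2.0, 100)
p_naive = mc_pvalue(x)
p_adjusted = nested_mc_pvalue(x)
```

    For point process models the same two-level structure applies, with the KS statistic replaced by a summary such as the L-function discrepancy and the exponential sampler replaced by simulation from the fitted process.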

  13. Effects of attention bias modification with short and long stimulus-duration: A randomized experiment with individuals with subclinical social anxiety.

    Science.gov (United States)

    Liang, Chi-Wen; Hsu, Wen-Yau

    2016-06-30

    This study investigated the differential effects of two attention bias modification (ABM) procedures with different stimulus durations. Seventy-two undergraduates with subclinical social anxiety were randomly assigned to one of four conditions: an ABM condition with either a 100-ms or a 500-ms stimulus duration (ABM-100/ABM-500) or an attention placebo (AP) condition with either a 100-ms or a 500-ms stimulus duration (AP-100/AP-500). Participants completed the pre-assessments, eight attentional training sessions, and post-assessments. A modified Posner paradigm was used to assess changes in attentional processing. After completion of attentional training, the ABM-100 group significantly speeded up their responses to 100-ms invalid trials, regardless of word type. The ABM-100 group also exhibited significantly reduced latencies to 500-ms invalid social-threat trials and marginally significantly reduced latencies to 500-ms invalid neutral trials. The ABM-500 group showed significantly reduced latencies to 500-ms invalid social-threat trials. Both ABMs significantly reduced participants' fear of negative evaluations and interactional anxiousness relative to their comparative AP. The effects on social anxiety did not differ between the two ABMs. This study suggests that although ABMs using short and long stimulus durations both reduce some aspects of social anxiety, they influence participants' attentional disengagement in different ways. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Android platform based smartphones for a logistical remote association repair framework.

    Science.gov (United States)

    Lien, Shao-Fan; Wang, Chun-Chieh; Su, Juhng-Perng; Chen, Hong-Ming; Wu, Chein-Hsing

    2014-06-25

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.
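    The Linear Projective Transform step amounts to estimating a planar homography from four detected corner points. Below is a minimal sketch using the standard direct linear transform; the corner coordinates and target square are made-up values that a real QR detector would supply:

```python
import numpy as np

def homography(src, dst):
    """3x3 projective transform H (h33 = 1) mapping four src points
    to four dst points via the standard direct linear transform."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_h(H, pt):
    """Apply H to a 2-D point in homogeneous coordinates."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Rectify a tilted QR-code quadrilateral onto a 100x100 square
# (corner coordinates are made-up stand-ins for detector output).
corners = [(10, 12), (88, 20), (95, 90), (5, 80)]
square = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = homography(corners, square)
```

    In production code a library routine such as OpenCV's findHomography plays the same role, with robust estimation over more than four points.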

  15. Android Platform Based Smartphones for a Logistical Remote Association Repair Framework

    Directory of Open Access Journals (Sweden)

    Shao-Fan Lien

    2014-06-01

    The maintenance of large-scale systems is an important issue for logistics support planning. In this paper, we developed a Logistical Remote Association Repair Framework (LRARF) to aid repairmen in keeping the system available. LRARF includes four subsystems: smart mobile phones, a Database Management System (DBMS), a Maintenance Support Center (MSC) and wireless networks. The repairman uses smart mobile phones to capture QR-codes and the images of faulty circuit boards. The captured QR-codes and images are transmitted to the DBMS so the invalid modules can be recognized via the proposed algorithm. In this paper, the Linear Projective Transform (LPT) is employed for fast QR-code calibration. Moreover, the ANFIS-based data mining system is used for module identification and searching automatically for the maintenance manual corresponding to the invalid modules. The inputs of the ANFIS-based data mining system are the QR-codes and image features; the output is the module ID. DBMS also transmits the maintenance manual back to the maintenance staff. If modules are not recognizable, the repairmen and center engineers can obtain the relevant information about the invalid modules through live video. The experimental results validate the applicability of the Android-based platform in the recognition of invalid modules. In addition, the live video can also be recorded synchronously on the MSC for later use.

  16. High Performance Programming Using Explicit Shared Memory Model on Cray T3D1

    Science.gov (United States)

    Simon, Horst D.; Saini, Subhash; Grassi, Charles

    1994-01-01

    The Cray T3D system is the first-phase system in Cray Research, Inc.'s (CRI) three-phase massively parallel processing (MPP) program. This system features a heterogeneous architecture that closely couples DEC's Alpha microprocessors and CRI's parallel-vector technology, i.e., the Cray Y-MP and Cray C90. An overview of the Cray T3D hardware and available programming models is presented. Under the Cray Research adaptive Fortran (CRAFT) model, four programming methods (data parallel, work sharing, message-passing using PVM, and the explicit shared memory model) are available to users. However, at this time the data parallel and work sharing programming models are not available to the user community. The differences between standard PVM and CRI's PVM are highlighted with performance measurements such as latencies and communication bandwidths. We have found that neither standard PVM nor CRI's PVM exploits the hardware capabilities of the T3D. The reasons for the poor performance of PVM as a native message-passing library are presented. This is illustrated by the performance of the NAS Parallel Benchmarks (NPB) programmed in the explicit shared memory model on the Cray T3D. In general, the performance of standard PVM is about 4 to 5 times lower than that obtained using the explicit shared memory model. This degradation in performance is also seen on the CM-5, where the performance of applications using the native message-passing library CMMD is likewise about 4 to 5 times lower than using data parallel methods. The issues involved in programming in the explicit shared memory model (such as barriers, synchronization, invalidating the data cache, aligning the data cache, etc.) are discussed. Comparative performance of the NPB using the explicit shared memory programming model on the Cray T3D and other highly parallel systems such as the TMC CM-5, Intel Paragon, Cray C90, IBM-SP1, etc. is presented.

  17. LETTERS AND COMMENTS: Comment on 'Reinterpreting the famous train/embankment experiment of relativity'

    Science.gov (United States)

    Rowland, David R.

    2004-09-01

    Nelson (2003 Eur. J. Phys. 24 379) recently claimed on logical grounds that Einstein's train and embankment thought experiment cannot be used to prove the relativity of simultaneity prior to knowledge of the Lorentz transformations as it is purported to do. It is argued in this comment that Nelson's claim is based on premises which are incorrect, thus invalidating his conclusions. It is also argued that Nelson's article furnishes a 'proof by contradiction' of the desired result, thus also invalidating his claim.

  18. The Role of Political and Economic Factors in Thailand’s Last Two Coups D’ Etat

    Science.gov (United States)

    2007-09-01

    [Flattened table excerpt: election statistics by year (registered voters, invalid votes, total population, voting-age population, vote/VAP turnout, and Freedom House PR/CL/status ratings) for 1975 and 1976.] Figures for the voting-age population and registered voters are approximate. Terms are as follows: VAP = voting age population; PR = political rights; CL = civil liberties; Invalid = the number of...existed. However, by 1996, two parties were emerging as being dominant over the smaller parties—the Democrats and New Aspiration. This was about

  19. Complementarity of flux- and biometric-based data to constrain parameters in a terrestrial carbon model

    Directory of Open Access Journals (Sweden)

    Zhenggang Du

    2015-03-01

    To improve models for accurate projections, data assimilation, an emerging statistical approach to combining models with data, has recently been developed to probe initial conditions, parameters, data content, response functions and model uncertainties. Quantifying how much information is contained in different data streams is essential to predicting future states of ecosystems and the climate. This study uses a data assimilation approach to examine the information content of flux- and biometric-based data used to constrain parameters in a terrestrial carbon (C) model, which includes canopy photosynthesis and vegetation-soil C transfer submodels. Three assimilation experiments were constructed with either net ecosystem exchange (NEE) data only, biometric data only [including foliage and woody biomass, litterfall, soil organic C (SOC) and soil respiration], or both NEE and biometric data to constrain model parameters by a probabilistic inversion application. The results showed that NEE data mainly constrained parameters associated with gross primary production (GPP) and ecosystem respiration (RE) but were almost invalid for C transfer coefficients, while biometric data were more effective in constraining C transfer coefficients than other parameters. NEE and biometric data constrained about 26% (6) and 30% (7) of a total of 23 parameters, respectively, but their combined application constrained about 61% (14) of all parameters. The complementarity of NEE and biometric data was obvious in constraining most of the parameters. The poor constraint by NEE or biometric data alone was probably attributable to either the lack of long-term C dynamics data or measurement errors. Overall, our results suggest that flux- and biometric-based data, reflecting different processes in ecosystem C dynamics, have different capacities to constrain parameters related to photosynthesis and C transfer coefficients, respectively. Multiple data sources could also
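    The value of combining data streams can be illustrated with a toy Gaussian version of such an inversion, where posterior precision is the prior precision plus a contribution from each stream. The design matrices below (encoding which parameter each stream mostly informs) and all variances are invented for illustration, not taken from the study:

```python
import numpy as np

# Toy Gaussian inversion: two parameters (a GPP-related rate and a
# C-transfer coefficient) observed through two linear data streams.
prior_prec = np.eye(2) * 0.01             # weak prior precision
H_nee = np.array([[1.0, 0.1]])            # NEE mostly sees parameter 1
H_bio = np.array([[0.1, 1.0]])            # biometric mostly parameter 2
R_inv = np.array([[2.0]])                 # inverse observation variance

def posterior_cov(prior_prec, Hs):
    """Gaussian posterior covariance from stacked linear observations."""
    prec = prior_prec.copy()
    for H in Hs:
        prec = prec + H.T @ R_inv @ H
    return np.linalg.inv(prec)

cov_nee = posterior_cov(prior_prec, [H_nee])
cov_bio = posterior_cov(prior_prec, [H_bio])
cov_both = posterior_cov(prior_prec, [H_nee, H_bio])
```

    Each stream alone leaves one parameter nearly unconstrained; stacking both shrinks both posterior variances, mirroring the complementarity reported above.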

  20. Correlates and consequences of the disclosure of pain-related distress to one's spouse.

    Science.gov (United States)

    Cano, Annmarie; Leong, Laura E M; Williams, Amy M; May, Dana K K; Lutz, Jillian R

    2012-12-01

    The communication of pain has received a great deal of attention in the pain literature; however, one form of pain communication--emotional disclosure of pain-related distress (e.g., sadness, worry, anger about pain)--has not been studied extensively. This study examined the extent to which this form of pain communication occurred during an observed conversation with one's spouse and also investigated the correlates and consequences of disclosure. Individuals with chronic pain (ICP) and their spouses (N=95 couples) completed several questionnaires regarding pain, psychological distress, and relationship distress as well as video recorded interactions about the impact of pain on their lives. Approximately two-thirds of ICPs (n=65) disclosed their pain-related distress to their spouses. ICPs who reported greater pain severity, ruminative catastrophizing and affective distress about pain, and depressive and anxiety symptoms were more likely to disclose their distress to their spouses. Spouses of ICPs who disclosed only once or twice were significantly less likely to invalidate their partners whereas spouses of ICPs who disclosed at a higher rate were significantly more likely to validate their partners. Furthermore, spouses were more likely to engage in invalidation after attempting more neutral or validating responses, suggesting an erosion of support when ICPs engaged in high rates of disclosure. Correlates of spousal invalidation included both spouses' helplessness catastrophizing, ICPs' affective distress about pain, and spouses' anxiety, suggesting that both partners' distress are implicated in maladaptive disclosure-response patterns. Findings are discussed in light of pain communication and empathy models of pain. Copyright © 2012 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  1. Rural household fuel production and consumption in Ethiopia: A case study

    International Nuclear Information System (INIS)

    Mekonnen, A.

    1997-01-01

    Community forestry in Ethiopia has been implemented using a top-down approach, which may have contributed to the failure of most of these projects. The community plantations practically belonged to the government, and the labour contribution of the local communities in establishing the plantations was mainly in exchange for wages paid in kind (food-for-work), largely financed by the United Nations/World Food Program (UN-WFP). We use the contingent valuation method to examine the determinants of the value of community forestry in rural Ethiopia when the plantations are established, managed and used by the communities themselves. The value elicitation format used is a discrete question with an open-ended follow-up, which is closer to the market scenario our respondents are familiar with than, for example, the single discrete choice format. Unlike most other studies, we use a Tobit model with sample selection in the empirical analysis of the bid function to look into the effect of excluding invalid responses (protest zeros, outliers and missing bids) from the analysis. We find that exclusion of invalid responses would lead to sample selection bias. One implication of such a bias is that mean WTP values computed using data that exclude households with invalid responses should be adjusted downwards before they are used for benefit aggregation. The analysis of the bid function shows that household size, household income, distance of homestead to the proposed place of plantation, number of trees owned and sex of household head are significant variables explaining willingness to pay. We also find that there are significant differences in willingness to pay across sites. 50 refs, 4 tabs
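    A sample-selection correction of the kind invoked above can be sketched on synthetic data: the classic two-step recipe adds the inverse Mills ratio from a first-stage selection equation to the outcome regression. All coefficients below are made up, and the first stage uses the known selection index rather than an estimated probit, purely to keep the sketch short:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 5000

# Synthetic WTP data with selection: a bid is observed only when a
# latent selection index is positive; correlated errors make naive
# OLS on the selected sample biased. All coefficients are invented.
z = rng.normal(size=n)                       # selection covariate
x = rng.normal(size=n)                       # bid covariate
u = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
select = (0.5 + z + u[:, 0]) > 0             # valid-response indicator
wtp = 2.0 + 1.5 * x + u[:, 1]                # latent WTP equation

# Step 1: inverse Mills ratio from the selection index. For brevity
# the known index is used; a real analysis estimates it by probit MLE.
index = 0.5 + z
imr = norm.pdf(index) / norm.cdf(index)

# Step 2: OLS of observed WTP on x plus the inverse Mills ratio
X = np.column_stack([np.ones(select.sum()), x[select], imr[select]])
beta = np.linalg.lstsq(X, wtp[select], rcond=None)[0]

# Naive OLS without the correction, for comparison
Xn = X[:, :2]
beta_naive = np.linalg.lstsq(Xn, wtp[select], rcond=None)[0]
```

    In practice the first stage would be fitted by probit maximum likelihood and standard errors would be adjusted for the generated regressor; the comparison of intercepts shows the direction of the selection bias the paper warns about.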

  2. Rural household fuel production and consumption in Ethiopia: A case study

    Energy Technology Data Exchange (ETDEWEB)

    Mekonnen, A.

    1997-11-01

    Community forestry in Ethiopia has been implemented using a top-down approach, which may have contributed to the failure of most of these projects. The community plantations practically belonged to the government, and the labour contribution of the local communities in establishing the plantations was mainly in exchange for wages paid in kind (food-for-work), largely financed by the United Nations/World Food Program (UN-WFP). We use the contingent valuation method to examine the determinants of the value of community forestry in rural Ethiopia when the plantations are established, managed and used by the communities themselves. The value elicitation format used is a discrete question with an open-ended follow-up, which is closer to the market scenario our respondents are familiar with than, for example, the single discrete choice format. Unlike most other studies, we use a Tobit model with sample selection in the empirical analysis of the bid function to look into the effect of excluding invalid responses (protest zeros, outliers and missing bids) from the analysis. We find that exclusion of invalid responses would lead to sample selection bias. One implication of such a bias is that mean WTP values computed using data that exclude households with invalid responses should be adjusted downwards before they are used for benefit aggregation. The analysis of the bid function shows that household size, household income, distance of homestead to the proposed place of plantation, number of trees owned and sex of household head are significant variables explaining willingness to pay. We also find that there are significant differences in willingness to pay across sites. 50 refs, 4 tabs

  3. CFD modelling of convective heat transfer from a window with adjacent venetian blinds

    Energy Technology Data Exchange (ETDEWEB)

    Marjanovic, L. [Belgrade Univ., Belgrade (Yugoslavia). Faculty of Mechanical Engineering]|[DeMontfort Univ. (United Kingdom). Inst. of Energy and Sustainable Development; Cook, M; Hanby, V.; Rees, S. [DeMontfort Univ. (United Kingdom). Inst. of Energy and Sustainable Development

    2005-07-01

    There is a limited amount of 3-dimensional modeling information on the performance of glazing systems with blinds. Two-dimensional flow modeling has indicated that 1-dimensional heat transfer can lead to invalid results where 2- and 3-dimensional effects are present. In this study, a 3-dimensional numerical solution was obtained on the effect of a venetian blind on the conjugate heat transfer from an indoor window glazing system. The solution was obtained for the coupled laminar free convection and radiation heat transfer problem, including conduction along the blind slats. Continuity, momentum and energy equations for buoyant flow were solved using Computational Fluid Dynamics (CFD) software. Grey diffuse radiation exchange between the window, blind and air was considered using the Monte Carlo method. All thermophysical properties of air were assumed to be constant with the exception of density, which was modeled using the Boussinesq approximation. Both winter and summer conditions were considered. In the computational domain, the window represented an isothermal type boundary condition with no slip. The height of the domain was extended beyond the blinds to allow for inflow and outflow regions. Fluid was allowed to entrain into the domain at an ambient temperature in a direction perpendicular to the window. The results indicated that heat transfer between window and indoor air is influenced both quantitatively and qualitatively by the presence of an aluminium venetian blind, and that the cellular flow between the blind slats can have a significant effect on the convective heat transfer from the window surface, an effect that is more fully recognized and analyzed in 3 dimensions. refs., 2 tabs., 13 figs.
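    Before a full CFD run, the buoyant-flow regime assumed under the Boussinesq approximation can be sanity-checked with a Rayleigh number estimate. The air properties below are standard values near 300 K, while the cavity height and temperature difference are assumptions, not the paper's geometry:

```python
# Rayleigh number check for buoyant flow at a window surface under
# the Boussinesq approximation. Air properties are ~300 K values;
# the geometry and temperature difference are illustrative.
g = 9.81             # m/s^2
beta = 1.0 / 300.0   # 1/K, ideal-gas thermal expansion coefficient
nu = 1.6e-5          # m^2/s, kinematic viscosity of air
alpha = 2.2e-5       # m^2/s, thermal diffusivity of air
dT = 5.0             # K, window-to-air temperature difference (assumed)
H = 0.5              # m, characteristic height (assumed)

Ra = g * beta * dT * H**3 / (nu * alpha)
laminar = Ra < 1e9   # usual transition criterion for a vertical plate
```

    For these assumed values Ra is of order 10^7 to 10^8, consistent with the laminar free-convection treatment used in the study.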

  4. Forward-Looking Beta Estimates:Evidence from an Emerging Market

    OpenAIRE

    Onour, Ibrahim

    2008-01-01

    Results in this paper support evidence of time-varying beta coefficients for five sectors in the Kuwait Stock Market. The paper indicates that the banks, food, and service sectors exhibit a relatively wider range of variation compared to the industry and real estate sectors. The results of time-varying betas invalidate the standard application of the Capital Asset Pricing Model, which assumes a constant beta. In terms of risk exposure, the banks and industrial sectors reflect higher risk as their average betas exceed the mark...
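    A time-varying beta of the sort described is often proxied by a rolling-window OLS estimate. The sketch below recovers a drifting beta from simulated sector and market returns; all data and the 60-observation window are invented, and the paper's forward-looking estimator differs:

```python
import numpy as np

rng = np.random.default_rng(7)
T = 500

# Simulated market and sector returns with a beta that drifts over
# time (purely illustrative data).
market = rng.normal(0, 0.01, T)
true_beta = np.linspace(0.8, 1.6, T)          # slowly rising exposure
sector = true_beta * market + rng.normal(0, 0.004, T)

def rolling_beta(r_asset, r_mkt, window=60):
    """OLS beta over a trailing window, one estimate per time step."""
    betas = np.full(r_asset.size, np.nan)
    for t in range(window, r_asset.size):
        x = r_mkt[t - window:t]
        y = r_asset[t - window:t]
        betas[t] = np.cov(y, x, ddof=1)[0, 1] / np.var(x, ddof=1)
    return betas

betas = rolling_beta(sector, market)
```

    The estimated series tracks the upward drift in exposure, which is exactly the behavior a constant-beta CAPM application would miss.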

  5. Measuring work engagement among community health workers in Sierra Leone: Validating the Utrecht Work Engagement Scale

    OpenAIRE

    Vallières, Frédérique; McAuliffe, Eilish; Hyland, Philip; Galligan, Marie; Ghee, Annette

    2017-01-01

    This study examines the concept of volunteer work engagement in a sample of 334 community health workers in Bonthe District, Sierra Leone. Structural equation modelling was used to validate both the 9-item and the 17-item Utrecht Work Engagement Scale (UWES-9 and UWES-17, respectively). Results assessing the UWES-17 invalidated the three-factor structure within this cohort of community health workers, as high correlations were found between latent factors. Findings for the validity of the UWE...

  6. Restoration of supersymmetric Slavnov-Taylor and Ward identities in the presence of soft and spontaneous symmetry breaking

    International Nuclear Information System (INIS)

    Fischer, I.; Hollik, W.; Roth, M.; Stoeckinger, D.

    2004-01-01

    Supersymmetric Slavnov-Taylor and Ward identities are investigated in the presence of soft and spontaneous symmetry breaking. We consider an Abelian model where soft supersymmetry breaking yields a mass splitting between electron and selectron and triggers spontaneous symmetry breaking, and we derive the corresponding identities that relate the electron and selectron masses to the Yukawa coupling. We demonstrate that the identities are valid in dimensional reduction and invalid in dimensional regularization and compute the necessary symmetry-restoring counterterms

  7. Restoration of supersymmetric Slavnov-Taylor and Ward identities in presence of soft and spontaneous symmetry breaking

    International Nuclear Information System (INIS)

    Fischer, I.; Hollik, W.; Roth, M.; Stoeckinger, D.

    2003-12-01

    Supersymmetric Slavnov-Taylor and Ward identities are investigated in the presence of soft and spontaneous symmetry breaking. We consider an abelian model where soft supersymmetry breaking yields a mass splitting between electron and selectron and triggers spontaneous symmetry breaking, and we derive corresponding identities that relate the electron and selectron masses to the Yukawa coupling. We demonstrate that the identities are valid in dimensional reduction and invalid in dimensional regularization, and compute the necessary symmetry-restoring counterterms. (orig.)

  8. Age-dependent impairment of auditory processing under spatially focused and divided attention: an electrophysiological study.

    Science.gov (United States)

    Wild-Wall, Nele; Falkenstein, Michael

    2010-01-01

    By using event-related potentials (ERPs) the present study examines if age-related differences in preparation and processing especially emerge during divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues as was reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to fulfill task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely as was reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues especially under divided attention and latent difficulties to suppress irrelevant information.

  9. Identifying the Safety Factors over Traffic Signs in State Roads using a Panel Quantile Regression Approach.

    Science.gov (United States)

    Šarić, Željko; Xu, Xuecai; Duan, Li; Babić, Darko

    2018-06-20

    This study investigated the interactions between accident rate and traffic signs on state roads in Croatia while accommodating the heterogeneity attributed to unobserved factors. Data on 130 state roads between 2012 and 2016 were collected from the Traffic Accident Database System maintained by the Republic of Croatia Ministry of the Interior. To address the heterogeneity, a panel quantile regression model was proposed: the quantile regression component offers a more complete view and a highly comprehensive analysis of the relationship between accident rate and traffic signs, while the panel data component accommodates the heterogeneity attributed to unobserved factors. Results revealed that (1) low visibility of material damage (MD) and death or injury (DI) accidents increased the accident rate; (2) the number of mandatory signs and the number of warning signs were more likely to reduce the accident rate; and (3) average speed limit and the number of invalid traffic signs per km were associated with a high accident rate. To our knowledge, this is the first attempt to analyze the interactions between accident consequences and traffic signs with a panel quantile regression model. By incorporating visibility, the study demonstrates that low visibility carries a relatively higher risk of MD and DI, and that average speed limit corresponds positively with accident rate. The number of invalid traffic signs per km is also significant for accident rate, so regular maintenance should be carried out to keep the roadway environment safe.
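The core idea of quantile regression invoked in this abstract can be illustrated with the pinball (quantile) loss. The sketch below is illustrative only and uses synthetic data, not the Croatian road data: for an intercept-only model, minimizing the pinball loss recovers the sample tau-quantile, which is why the method characterizes the whole conditional distribution of accident rates rather than just its mean.

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Quantile ('pinball') loss minimized by quantile regression:
    residuals above q are weighted by tau, those below by (1 - tau)."""
    r = y - q
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

rng = np.random.default_rng(1)
y = rng.normal(size=2001)          # synthetic stand-in for accident rates

# Grid-search the best constant predictor under the pinball loss.
# Its minimizer is the sample tau-quantile, not the sample mean.
tau = 0.9
grid = np.linspace(y.min(), y.max(), 4001)
losses = [pinball_loss(y, q, tau) for q in grid]
q_hat = grid[np.argmin(losses)]
```

With covariates, the same loss is minimized over regression coefficients, yielding a separate fit for each quantile of the accident-rate distribution.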

  10. Model Reduction in Biomechanics

    Science.gov (United States)

    Feng, Yan

    mechanical parameters from experimental results. However, in the real biological world, these homogeneous and isotropic assumptions are usually invalid. Thus, instead of using a hypothesized model, a specific continuum model at the mesoscopic scale can be introduced based upon data reduction of the results from molecular simulations at the atomistic level. Once a continuum model is established, it can provide details on the distribution of stresses and strains induced within the biomolecular system, which is useful in determining the distribution and transmission of these forces to the cytoskeletal and sub-cellular components, and helps us gain a better understanding of cell mechanics. A data-driven model reduction approach to the problem of microtubule mechanics is presented as an application: a beam element is constructed for microtubules based upon data reduction of the results from molecular simulation of the carbon backbone chain of alpha-beta tubulin dimers. The database of mechanical responses to various types of loads from molecular simulation is reduced to dominant modes. The dominant modes are subsequently used to construct the stiffness matrix of a beam element that captures the anisotropic behavior and deformation mode coupling that arise from a microtubule's spiral structure. In contrast to standard Euler-Bernoulli or Timoshenko beam elements, the link between forces and node displacements results not from hypothesized deformation behavior but directly from the data obtained by molecular-scale simulation. Differences between the resulting microtubule data-driven beam model (MTDDBM) and standard beam elements are presented, with a focus on the coupling of bending, stretch, and shear deformations. The MTDDBM is just as economical to use as a standard beam element, and allows accurate reconstruction of the mechanical behavior of structures within a cell, as exemplified in a simple model of a component element of the mitotic spindle.
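For readers unfamiliar with the baseline the MTDDBM is compared against, here is a minimal sketch of the standard Euler-Bernoulli beam element stiffness matrix (bending degrees of freedom only; the numeric values of E, I and L are arbitrary illustrations, not taken from the study). Its fixed analytic form, with no bending-stretch-shear coupling, is precisely what the data-driven element replaces.

```python
import numpy as np

def euler_bernoulli_stiffness(E, I, L):
    """Standard 4x4 Euler-Bernoulli beam element stiffness matrix for
    DOFs [w1, theta1, w2, theta2] (transverse displacement and rotation
    at each node). Pure bending -- no coupling to stretch or shear,
    which is the limitation the data-driven beam element addresses."""
    k = E * I / L**3
    return k * np.array([
        [ 12,     6*L,  -12,     6*L],
        [6*L,  4*L**2, -6*L,  2*L**2],
        [-12,    -6*L,   12,    -6*L],
        [6*L,  2*L**2, -6*L,  4*L**2],
    ])

# Illustrative values only (consistent units assumed).
K = euler_bernoulli_stiffness(E=1.0, I=1.0, L=2.0)
```

In the MTDDBM, a matrix playing this role is instead assembled from the dominant response modes extracted from molecular simulation, so off-diagonal coupling terms appear that this analytic form cannot represent.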

  11. Secure Environments for Collaboration among Ubiquitous Roaming Entities

    DEFF Research Database (Denmark)

    Jensen, Christian D.

    2002-01-01

    SECURE is a newly started IST project, which addresses secure collaboration among computational entities in emerging global computing systems. The properties of these systems introduce new security challenges that are not adequately addressed by existing security models and mechanisms. The scale and uncertainty of this global computing environment invalidate existing security models. Instead, new security models have to be developed along with new security mechanisms that control access to protected resources.

  12. Harnessing the theoretical foundations of the exponential and beta-Poisson dose-response models to quantify parameter uncertainty using Markov Chain Monte Carlo.

    Science.gov (United States)

    Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward

    2013-09-01

    Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decision-makers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
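The Bayesian machinery described above can be sketched for the simpler exponential dose-response model, P(inf | d) = 1 - exp(-r d), using a hand-rolled random-walk Metropolis sampler in place of OpenBUGS. The dose-response data below are hypothetical, not the paper's case-study data set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial data: dose, subjects exposed, subjects infected.
doses    = np.array([1e1, 1e2, 1e3, 1e4])
exposed  = np.array([10, 10, 10, 10])
infected = np.array([0, 2, 6, 9])

def log_posterior(log_r):
    """Binomial log-likelihood of the exponential dose-response model
    with a flat prior on log r."""
    r = np.exp(log_r)
    p = 1.0 - np.exp(-r * doses)
    p = np.clip(p, 1e-12, 1 - 1e-12)       # guard against log(0)
    return np.sum(infected * np.log(p) + (exposed - infected) * np.log(1 - p))

# Random-walk Metropolis on log r.
chain = np.empty(5000)
cur = np.log(1e-3)
cur_lp = log_posterior(cur)
for i in range(chain.size):
    prop = cur + rng.normal(scale=0.3)
    if np.log(rng.uniform()) < log_posterior(prop) - cur_lp:
        cur, cur_lp = prop, log_posterior(prop)
    chain[i] = cur

r_samples = np.exp(chain[1000:])           # discard burn-in
```

The spread of `r_samples` is exactly the parameter uncertainty a second-order risk characterization propagates into computed risks; the beta-Poisson case adds a second parameter but follows the same pattern.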

  13. Comment on 'Reinterpreting the famous train/embankment experiment of relativity'

    International Nuclear Information System (INIS)

    Rowland, David R

    2004-01-01

    Nelson (2003 Eur. J. Phys. 24 379) recently claimed on logical grounds that Einstein's train and embankment thought experiment cannot be used to prove the relativity of simultaneity prior to knowledge of the Lorentz transformations as it is purported to do. It is argued in this comment that Nelson's claim is based on premises which are incorrect, thus invalidating his conclusions. It is also argued that Nelson's article furnishes a 'proof by contradiction' of the desired result, thus also invalidating his claim. (letters and comments)

  14. Faster than light motion does not imply time travel

    International Nuclear Information System (INIS)

    Andréka, Hajnal; Madarász, Judit X; Németi, István; Székely, Gergely; Stannett, Mike

    2014-01-01

    Seeing the many examples in the literature of causality violations based on faster than light (FTL) signals one naturally thinks that FTL motion leads inevitably to the possibility of time travel. We show that this logical inference is invalid by demonstrating a model, based on (3+1)-dimensional Minkowski spacetime, in which FTL motion is permitted (in every direction without any limitation on speed) yet which does not admit time travel. Moreover, the Principle of Relativity is true in this model in the sense that all observers are equivalent. In short, FTL motion does not imply time travel after all. (paper)

  15. Medical-legal issues in headache: penal and civil Italian legislation, working claims, social security, off-label prescription.

    Science.gov (United States)

    Aguggia, M; Cavallini, M; Varetto, L

    2006-05-01

    Primary headaches can be considered simultaneously as symptom and disease, while secondary headaches are expressions of a pathological process that may be systemic or locoregional. Because of its subjective features, headache is often difficult to assess and quantify by severity, frequency and invalidity rate, and for these reasons it has often been implicated in legal controversies. Headache has seldom been considered in criminal law, except when it represents a typical symptom of a disease whose existence can be objectively assessed (i.e., raised intracranial pressure). In civil legislation, moreover, headache is not yet codified as a basis for invalidity compensation claims. In particular, one of the most debated medical-legal questions is represented by headaches occurring after head injury. Headache is often the principal symptom at the onset of several chronic toxic syndromes, with many implications, especially in working claims; more recently, it has been reported as one of the most frequent symptoms by victims of mobbing (i.e., psychological harassment in the workplace). The National Institute for Industrial Accident Insurance (INAIL) scales (instituted by law 38/2000) mention the "subjective cranial trauma syndrome" and provide an invalidity rate evaluation. For other headache forms no legislation currently exists, and headache is considered only as a symptom of a coded disease. Requests for invalidity social pensions and the question of off-label prescriptions (prescription of a drug for a disease without formal indication for it) are other controversial matters.

  16. Toxicological awakenings: the rebirth of hormesis as a central pillar of toxicology

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2005-01-01

    This paper assesses historical reasons that may account for the marginalization of hormesis as a dose-response model in the biomedical sciences in general and toxicology in particular. The most significant and enduring explanatory factors are the early and close association of the concept of hormesis with the highly controversial medical practice of homeopathy, and the difficulty of assessing hormesis with the high-dose testing protocols which have dominated the discipline of toxicology, especially regulatory toxicology. The long-standing and intensely acrimonious conflict between homeopathy and 'traditional' medicine (allopathy) led to the exclusion of the hormesis concept from a vast array of medical- and public-health-related activities including research, teaching, grant funding, publishing, professional society meetings, and regulatory initiatives of governmental agencies and their advisory bodies. Recent publications indicate that the hormetic dose-response is far more common and fundamental than the dose-response models [threshold/linear no-threshold (LNT)] used in toxicology and risk assessment, and by governmental regulatory agencies in the establishment of exposure standards for workers and the general public. Acceptance of the possibility of hormesis has the potential to profoundly affect the practice of toxicology and risk assessment, especially with respect to carcinogen assessment.

  17. Does infant cognition research undermine sociological theory?

    DEFF Research Database (Denmark)

    Bjerre, Jørn

    2012-01-01

    This article discusses how the results of infant research challenge the assumptions of the classical sciences of social behaviour. According to A.J. Bergesen, the findings of infant research invalidate Durkheim's theory of mental categories, thus requiring a re-theorizing of sociology. This article argues that Bergesen's reading of Emile Durkheim is incorrect, and that his review of the infant research in fact invalidates his own argument. Reviewing the assumptions of sociology in the light of the findings of infant research, it is argued that the real challenge is to formulate a research strategy

  18. Application of X-ray computed micro-tomography to the study of damage and oxidation kinetics of thermostructural composites

    Energy Technology Data Exchange (ETDEWEB)

    Caty, Olivier, E-mail: caty@lcts.u-bordeaux1.fr [Laboratory of Thermostructural Composites (LCTS), Université de Bordeaux, CNRS, SAFRAN, CEA, 3 Allée La Boétie, 33600 Pessac (France); Ibarroule, Philippe; Herbreteau, Mathieu; Rebillat, Francis [Laboratory of Thermostructural Composites (LCTS), Université de Bordeaux, CNRS, SAFRAN, CEA, 3 Allée La Boétie, 33600 Pessac (France); Maire, Eric [MATEIS Laboratory, INSA Lyon, 7 Avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Vignoles, Gérard L. [Laboratory of Thermostructural Composites (LCTS), Université de Bordeaux, CNRS, SAFRAN, CEA, 3 Allée La Boétie, 33600 Pessac (France)

    2014-04-01

    Thermostructural composites are three-dimensionally (3D) structured materials. Weakening phenomena (mechanical and chemical) develop inside the material following its 3D structure and are thus hard to describe accurately. X-ray computed micro-tomography (μCT) is a recent technique that allows their experimental investigation. The technique is applied here to the study of failure under tensile loading and of the self-healing processes during oxidation. The results provide useful data for verifying or invalidating hypotheses and estimates made in current models.

  19. Mechanistic Investigation of the Reduction of NOx over Pt- and Rh-Based LNT Catalysts

    Directory of Open Access Journals (Sweden)

    Lukasz Kubiak

    2016-03-01

    The influence of the noble metal (Pt vs. Rh) on the NOx storage-reduction performance of lean NOx trap catalysts is investigated here by transient micro-reactor flow experiments. The study indicates a different behavior during storage: the Rh-based catalyst showed higher storage capacity at high temperature than the Pt-containing sample, while the opposite is seen at low temperatures. It is suggested that the higher storage capacity of the Rh-containing sample at high temperature is related to the higher dispersion of Rh compared to Pt, while the lower storage capacity of Rh-Ba/Al2O3 at low temperature is related to its poor oxidizing properties. The noble metals also affect the catalyst behavior upon reduction of the stored NOx by decreasing the threshold temperature for the reduction of the stored NOx. The Pt-based catalyst promotes the reduction of the adsorbed NOx at lower temperatures than the Rh-containing sample, due to its superior reducibility. However, the Rh-based material shows higher reactivity in NH3 decomposition, significantly enhancing N2 selectivity. Moreover, formation of small amounts of N2O is observed on both Pt- and Rh-based catalyst samples only during the reduction of highly reactive NOx stored at 150 °C, where NOx is likely in the form of nitrites.

  20. Non-targeted effects of radiation: applications for radiation protection and contribution to LNT discussion

    International Nuclear Information System (INIS)

    Belyakov, O.V.; Folkard, M.; Prise, K.M.; Michael, B.D.; Mothersill, C.

    2002-01-01

    According to the target theory of radiation induced effects (Lea, 1946), which forms a central core of radiation biology, DNA damage occurs during or very shortly after irradiation of the nuclei in targeted cells and the potential for biological consequences can be expressed within one or two cell generations. A range of evidence has now emerged that challenges the classical effects resulting from targeted damage to DNA. These effects have also been termed non-(DNA)-targeted (Ward, 1999) and include radiation-induced bystander effects (Iyer and Lehnert, 2000a), genomic instability (Wright, 2000), adaptive response (Wolff, 1998), low dose hyper-radiosensitivity (HRS) (Joiner, et al., 2001), delayed reproductive death (Seymour, et al., 1986) and induction of genes by radiation (Hickman, et al., 1994). An essential feature of non-targeted effects is that they do not require a direct nuclear exposure by irradiation to be expressed and they are particularly significant at low doses. This new evidence suggests a new paradigm for radiation biology that challenges the universality of target theory. In this paper we will concentrate on the radiation-induced bystander effects because of its particular importance for radiation protection

  1. Implicit assumptions underlying simple harvest models of marine bird populations can mislead environmental management decisions.

    Science.gov (United States)

    O'Brien, Susan H; Cook, Aonghais S C P; Robinson, Robert A

    2017-10-01

    Assessing the potential impact of additional mortality from anthropogenic causes on animal populations requires detailed demographic information. However, these data are frequently lacking, making simple algorithms, which require little data, appealing. Because of their simplicity, these algorithms often rely on implicit assumptions, some of which may be quite restrictive. Potential Biological Removal (PBR) is a simple harvest model that estimates the number of additional mortalities that a population can theoretically sustain without causing population extinction. However, PBR relies on a number of implicit assumptions, particularly around density dependence and population trajectory that limit its applicability in many situations. Among several uses, it has been widely employed in Europe in Environmental Impact Assessments (EIA), to examine the acceptability of potential effects of offshore wind farms on marine bird populations. As a case study, we use PBR to estimate the number of additional mortalities that a population with characteristics typical of a seabird population can theoretically sustain. We incorporated this level of additional mortality within Leslie matrix models to test assumptions within the PBR algorithm about density dependence and current population trajectory. Our analyses suggest that the PBR algorithm identifies levels of mortality which cause population declines for most population trajectories and forms of population regulation. Consequently, we recommend that practitioners do not use PBR in an EIA context for offshore wind energy developments. Rather than using simple algorithms that rely on potentially invalid implicit assumptions, we recommend use of Leslie matrix models for assessing the impact of additional mortality on a population, enabling the user to explicitly define assumptions and test their importance. Copyright © 2017 Elsevier Ltd. All rights reserved.
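The Leslie-matrix check recommended above can be sketched in a few lines: project a hypothetical seabird-like population with and without additional mortality and compare the asymptotic growth rate (the dominant eigenvalue of the projection matrix). All demographic rates below are invented for illustration, not taken from the study.

```python
import numpy as np

# Hypothetical seabird-like demography: three age classes, breeding
# only in the adult class, high adult survival (illustrative values).
fecundity = np.array([0.0, 0.0, 0.35])
survival  = np.array([0.7, 0.85])      # juvenile->subadult, subadult->adult
s_adult   = 0.92                       # adult annual survival

def leslie(extra_mortality=0.0):
    """Leslie projection matrix with a uniform additional mortality
    (e.g. collisions) applied to every survival term."""
    m = 1.0 - extra_mortality
    L = np.zeros((3, 3))
    L[0, :] = fecundity
    L[1, 0] = survival[0] * m
    L[2, 1] = survival[1] * m
    L[2, 2] = s_adult * m
    return L

def growth_rate(L):
    """Asymptotic annual growth rate lambda (dominant eigenvalue)."""
    return np.max(np.abs(np.linalg.eigvals(L)))

lam0 = growth_rate(leslie(0.0))        # baseline
lam1 = growth_rate(leslie(0.05))       # with 5% additional mortality
```

Unlike PBR, every assumption here (density independence, which rates the mortality hits, the baseline trajectory) is explicit in the matrix and can be varied directly.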

  2. ERP evidence for selective drop in attentional costs in uncertain environments: challenging a purely premotor account of covert orienting of attention.

    Science.gov (United States)

    Lasaponara, Stefano; Chica, Ana B; Lecce, Francesca; Lupianez, Juan; Doricchi, Fabrizio

    2011-07-01

    Several studies have shown that the reliability of endogenous spatial cues linearly modulates the reaction-time advantage in the processing of targets at validly cued vs. invalidly cued locations, i.e. the "validity effect". This would imply that with non-predictive cues no "validity effect" should be observed. Contrary to this prediction, however, one could hypothesize that attentional benefits of valid cuing (i.e. the RT advantage for validly vs. neutrally cued targets) can still be maintained with non-predictive cues, if the brain were endowed with mechanisms allowing a selective reduction in the costs of reorienting from invalidly cued locations (i.e. a reduction of the RT disadvantage for invalidly vs. neutrally cued targets). This separate modulation of attentional benefits and costs would be adaptive in uncertain contexts where cues predict the location of targets at chance level. Through the joint recording of manual reaction times and event-related cerebral potentials (ERPs), we found that this is the case: relying on non-predictive endogenous cues results in abatement of attentional costs and of the difference in the amplitude of the P1 brain responses evoked by invalidly vs. neutrally cued targets. In contrast, the use of non-predictive cues leaves unaffected the attentional benefits and the difference in the amplitude of the N1 responses evoked by validly vs. neutrally cued targets. At the individual level, the drop in costs with non-predictive cues was matched with equivalent lateral biases in RTs to neutrally and invalidly cued targets presented in the left and right visual field. During the cue period, the drop in costs with non-predictive cues was preceded by a reduction of the Early Directing Attention Negativity (EDAN) at posterior occipital sites and by an enhancement of the frontal Anterior Directing Attention Negativity (ADAN) correlated with preparatory voluntary orienting. These findings demonstrate, for the first time, that the segregation

  3. Combinatorial DNA Damage Pairing Model Based on X-Ray-Induced Foci Predicts the Dose and LET Dependence of Cell Death in Human Breast Cells

    Energy Technology Data Exchange (ETDEWEB)

    Vadhavkar, Nikhil [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Pham, Christopher [University of Texas, Houston, TX (United States). MD Anderson Cancer Center; Georgescu, Walter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Life Sciences Div.; Deschamps, Thomas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Life Sciences Div.; Heuskin, Anne-Catherine [Univ. of Namur (Belgium). Namur Research inst. for Life Sciences (NARILIS), Research Center for the Physics of Matter and Radiation (PMR); Tang, Jonathan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Life Sciences Div.; Costes, Sylvain V. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Life Sciences Div.

    2014-09-01

    are based on experimental RIF and are three times larger than the hypothetical LEM voxel used to fit survival curves. Our model is therefore an alternative to previous approaches that provides a testable biological mechanism (i.e., RIF). In addition, we propose that DSB pairing will help develop more accurate alternatives to the linear cancer risk model (LNT) currently used for regulating exposure to very low levels of ionizing radiation.

  4. Factors contributing to speech perception scores in long-term pediatric cochlear implant users.

    Science.gov (United States)

    Davidson, Lisa S; Geers, Ann E; Blamey, Peter J; Tobey, Emily A; Brenner, Christine A

    2011-02-01

    The objectives of this report are to (1) describe the speech perception abilities of long-term pediatric cochlear implant (CI) recipients by comparing scores obtained at elementary school (CI-E, 8 to 9 yrs) with scores obtained at high school (CI-HS, 15 to 18 yrs); (2) evaluate speech perception abilities in demanding listening conditions (i.e., noise and lower intensity levels) at adolescence; and (3) examine the relation of speech perception scores to speech and language development over this longitudinal timeframe. All 112 teenagers were part of a previous nationwide study of 8- and 9-yr-olds (N = 181) who received a CI between 2 and 5 yrs of age. The test battery included (1) the Lexical Neighborhood Test (LNT; hard and easy word lists); (2) the Bamford Kowal Bench sentence test; (3) the Children's Auditory-Visual Enhancement Test; (4) the Test of Auditory Comprehension of Language at CI-E; (5) the Peabody Picture Vocabulary Test at CI-HS; and (6) the McGarr sentences (consonants correct) at CI-E and CI-HS. CI-HS speech perception was measured in both optimal and demanding listening conditions (i.e., background noise and low-intensity level). Speech perception scores were compared based on age at test, lexical difficulty of stimuli, listening environment (optimal and demanding), input mode (visual and auditory-visual), and language age. All group mean scores significantly increased with age across the two test sessions. Scores of adolescents significantly decreased in demanding listening conditions. The effect of lexical difficulty on the LNT scores, as evidenced by the difference in performance between easy versus hard lists, increased with age and decreased for adolescents in challenging listening conditions. Calculated curves for percent correct speech perception scores (LNT and Bamford Kowal Bench) and consonants correct on the McGarr sentences plotted against age-equivalent language scores on the Test of Auditory Comprehension of Language and Peabody

  5. Sub-discretized surface model with application to contact mechanics in multi-body simulation

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S; Williams, J

    2008-02-28

    The mechanics of contact between rough and imperfectly spherical adhesive powder grains are often complicated by a variety of factors, several of which vary over sub-grain length scales. These include traction factors that vary spatially over the surface of the individual grains: high-energy electron and acceptor sites (electrostatic), hydrophobic and hydrophilic sites (electrostatic and capillary), surface energy (general adhesion), geometry (van der Waals and mechanical), and elasto-plastic deformation (mechanical). For mechanical deformation and reaction, coupled motions, such as twisting with bending and sliding, as well as surface roughness, add an asymmetry to the contact force which invalidates the assumptions of popular contact models, such as the Hertzian model and its derivatives for the non-adhesive case, and the JKR and DMT models for adhesive contacts. Though several contact laws have been offered to ameliorate these drawbacks, they are often constrained to particular loading paths (most often normal loading) and are relatively complicated to implement computationally. This paper offers a simple and general computational method for augmenting contact-law predictions in multi-body simulations through characterization of the contact surfaces using a hierarchically defined surface sub-discretization. For the case of adhesive contact between powder grains in low-stress regimes, this technique allows a variety of existing contact laws to be resolved across scales, so that moments and torques about the contact area, as well as normal and tangential tractions, can be resolved. This is especially useful for multi-body simulation applications where the modeler desires statistical distributions and calibration for parameters in contact laws commonly used for resolving near-surface contact mechanics. The approach is verified against analytical results for the case of rough, elastic spheres.
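As a reference point for the contact laws discussed above, here is a sketch of the classical Hertzian normal-force law for two elastic spheres, whose symmetry assumptions the sub-discretization approach is designed to relax. Parameter values are arbitrary illustrations.

```python
import numpy as np

def hertz_force(delta, R1, R2, E1, E2, nu1, nu2):
    """Classical Hertzian normal contact force between two elastic
    spheres: F = (4/3) * E_eff * sqrt(R_eff) * delta**1.5, where delta
    is the normal overlap. Purely normal and symmetric -- the baseline
    that roughness and coupled twisting/bending/sliding invalidate."""
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)
    E_eff = 1.0 / ((1.0 - nu1**2) / E1 + (1.0 - nu2**2) / E2)
    return (4.0 / 3.0) * E_eff * np.sqrt(R_eff) * delta**1.5

# Illustrative grains: 1 mm radii, 1 GPa moduli, Poisson ratio 0.3.
F1 = hertz_force(1e-6, 1e-3, 1e-3, 1e9, 1e9, 0.3, 0.3)
F2 = hertz_force(2e-6, 1e-3, 1e-3, 1e9, 1e9, 0.3, 0.3)
# Doubling the overlap scales the force by 2**1.5, not linearly.
```

In the sub-discretized scheme, a law like this is evaluated per surface patch rather than once per grain pair, which is what lets asymmetric tractions and contact moments emerge.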

  6. Analyzing subsurface drain network performance in an agricultural monitoring site with a three-dimensional hydrological model

    Science.gov (United States)

    Nousiainen, Riikka; Warsta, Lassi; Turunen, Mika; Huitu, Hanna; Koivusalo, Harri; Pesonen, Liisa

    2015-10-01

    The effectiveness of a subsurface drainage system decreases with time, leading to a need to restore drainage efficiency by installing new drain pipes in problem areas. The drainage performance of the resulting system varies spatially and complicates runoff and nutrient load generation within the fields. We present a method to estimate the drainage performance of a heterogeneous subsurface drainage system by simulating the area with the three-dimensional hydrological FLUSH model. A GIS analysis was used to delineate the surface-runoff contributing area in the field. We applied the method to reproduce the water balance and to investigate the effectiveness of the subsurface drainage network of a clayey field in southern Finland. The subsurface drainage system was originally installed in 1971, and the drainage efficiency was improved in 1995 and 2005 by installing new drains. FLUSH was calibrated against total runoff and drain discharge data from 2010 to 2011 and validated against total runoff in 2012. The model supported quantification of runoff fractions via the three installed drainage networks. Model realisations were produced to investigate the extent of the runoff contributing areas and the effect of the drainage parameters on subsurface drain discharge. The analysis showed that better model performance was achieved when the efficiency of the oldest drainage network (installed in 1971) was decreased. Our method can reveal the drainage system's performance but not the reason for its deterioration. Tillage layer runoff from the field was originally computed by subtracting drain discharge from the total runoff. The drains installed in 1995 bypass the measurement system, which renders this tillage layer runoff calculation procedure invalid after 1995. This article therefore suggests using a simulation-based local correction coefficient in further research utilizing data from the study area.

  7. The effects of musical training on movement pre-programming and re-programming abilities: an event-related potential investigation.

    Science.gov (United States)

    Anatürk, Melis; Jentzsch, Ines

    2015-03-01

    Two response-precuing experiments were conducted to investigate the effects of musical skill level on the ability to pre-programme and re-programme simple movements. Participants successfully used advance information to prepare forthcoming responses and showed response slowing when precue information was invalid rather than valid. This slowing was, however, only observed for partially invalid, not fully invalid, precues. Musicians were generally faster than non-musicians, but no group differences in the efficiency of movement pre-programming or re-programming were observed. Interestingly, only musicians exhibited a significant foreperiod lateralized readiness potential (LRP) when the response hand was pre-specified or full advance information was provided. These LRP findings suggest greater effector-specific motor preparation in musicians than in non-musicians. However, the levels of effector-specific preparation did not predict the preparatory advantages observed in behaviour. In sum, combining the response-precuing and ERP paradigms serves as a valuable tool for examining influences of musical training on movement pre- and re-programming processes. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    Science.gov (United States)

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole-genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high-resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of
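The role of the 1D Ising model can be sketched by enumerating the joint distribution over a handful of methylation sites and computing its Shannon entropy: a nearest-neighbour coupling J > 0 encodes the correlations that marginal methods ignore and lowers the entropy relative to the independent (J = 0) case. This toy enumeration is illustrative only; it is not the paper's genome-scale implementation.

```python
import numpy as np
from itertools import product

def ising_joint(n, h, J):
    """Joint PMF of n binary methylation states under a 1D Ising model:
    P(s) proportional to exp(h * sum_i s_i + J * sum_i s_i * s_{i+1}),
    with s_i = +/-1 (methylated/unmethylated)."""
    states = list(product([-1, 1], repeat=n))
    energies = [h * sum(s) + J * sum(s[i] * s[i + 1] for i in range(n - 1))
                for s in states]
    w = np.exp(np.array(energies))
    return w / w.sum()

def shannon_entropy(p):
    """Entropy in bits; quantifies methylation stochasticity."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_indep   = shannon_entropy(ising_joint(4, 0.0, 0.0))  # no correlation
H_coupled = shannon_entropy(ising_joint(4, 0.0, 1.0))  # correlated neighbours
```

A marginal (per-site) model would assign both cases the same entropy, which is exactly why it becomes invalid when neighbouring sites are correlated.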

  9. TU-C-18A-01: Models of Risk From Low-Dose Radiation Exposures: What Does the Evidence Say?

    International Nuclear Information System (INIS)

    Bushberg, J; Boreham, D; Ulsh, B

    2014-01-01

    At dose levels of approximately 500 mSv or more, increased cancer incidence and mortality have been clearly demonstrated. However, at the low doses of radiation used in medical imaging, the relationship between dose and cancer risk is not well established. As such, assumptions about the shape of the dose-response curve are made. These assumptions, or risk models, are used to estimate potential long-term effects. Common models include 1) the linear non-threshold (LNT) model, 2) threshold models with either a linear or curvilinear dose response above the threshold, and 3) a hormetic model, where the risk is initially decreased below background levels before increasing. The choice of model used when making radiation risk or protection calculations and decisions can have significant implications for public policy and health care decisions. However, the ongoing debate about which risk model best describes the dose-response relationship at low doses of radiation makes informed decision making difficult. This symposium will review the two fundamental approaches to determining the risk associated with low doses of ionizing radiation, namely radiation epidemiology and radiation biology. The strengths and limitations of each approach will be reviewed, the results of recent studies presented, and the appropriateness of different risk models for various real world scenarios discussed. Examples of well-designed and poorly-designed studies will be provided to assist medical physicists in 1) critically evaluating publications in the field and 2) communicating accurate information to medical professionals, patients, and members of the general public. 
Equipped with the best information that radiation epidemiology and radiation biology can currently provide, and an understanding of the limitations of such information, individuals and organizations will be able to make more informed decisions regarding questions such as 1) how much shielding to install at medical facilities, 2) at
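The three risk models listed in this abstract can be contrasted with toy functions. The parameter values below (slope, threshold dose, hormetic dip) are arbitrary illustrative assumptions chosen for the sketch, not fitted estimates from any study; risk is excess risk relative to background, so negative values indicate a protective effect.

```python
def lnt_risk(dose, slope=1e-4):
    """Linear no-threshold: excess risk proportional to dose, down to zero."""
    return slope * dose

def threshold_risk(dose, threshold=100.0, slope=1e-4):
    """No excess risk below the threshold dose; linear above it."""
    return slope * max(dose - threshold, 0.0)

def hormetic_risk(dose, dip=100.0, slope=1e-4):
    """Toy J-shaped curve: risk dips below background before rising."""
    return slope * dose * (dose - dip) / dip

for dose in (10, 50, 100, 500):  # mSv
    print(dose, lnt_risk(dose), threshold_risk(dose), hormetic_risk(dose))
```

The sketch makes the policy stakes concrete: at 50 mSv the LNT model assigns a positive excess risk, the threshold model assigns none, and the hormetic model assigns a negative (protective) one, while all three agree that risk rises well above the threshold region.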

  10. TU-C-18A-01: Models of Risk From Low-Dose Radiation Exposures: What Does the Evidence Say?

    Energy Technology Data Exchange (ETDEWEB)

    Bushberg, J [UC Davis Medical Center, Sacramento, CA (United States); Boreham, D [McMaster University, Ontario, CA (Canada); Ulsh, B

    2014-06-15

    At dose levels of approximately 500 mSv or more, increased cancer incidence and mortality have been clearly demonstrated. However, at the low doses of radiation used in medical imaging, the relationship between dose and cancer risk is not well established. As such, assumptions about the shape of the dose-response curve are made. These assumptions, or risk models, are used to estimate potential long-term effects. Common models include 1) the linear non-threshold (LNT) model, 2) threshold models with either a linear or curvilinear dose response above the threshold, and 3) a hormetic model, where the risk is initially decreased below background levels before increasing. The choice of model used when making radiation risk or protection calculations and decisions can have significant implications for public policy and health care decisions. However, the ongoing debate about which risk model best describes the dose-response relationship at low doses of radiation makes informed decision making difficult. This symposium will review the two fundamental approaches to determining the risk associated with low doses of ionizing radiation, namely radiation epidemiology and radiation biology. The strengths and limitations of each approach will be reviewed, the results of recent studies presented, and the appropriateness of different risk models for various real world scenarios discussed. Examples of well-designed and poorly-designed studies will be provided to assist medical physicists in 1) critically evaluating publications in the field and 2) communicating accurate information to medical professionals, patients, and members of the general public. 
Equipped with the best information that radiation epidemiology and radiation biology can currently provide, and an understanding of the limitations of such information, individuals and organizations will be able to make more informed decisions regarding questions such as 1) how much shielding to install at medical facilities, 2) at

  11. Inhibitory mechanism of the matching heuristic in syllogistic reasoning.

    Science.gov (United States)

    Tse, Ping Ping; Moreno Ríos, Sergio; García-Madruga, Juan Antonio; Bajo Molina, María Teresa

    2014-11-01

    A number of heuristic-based hypotheses have been proposed to explain how people solve syllogisms with automatic processes. In particular, the matching heuristic employs the congruency of the quantifiers in a syllogism—by matching the quantifier of the conclusion with those of the two premises. When the heuristic leads to an invalid conclusion, successful solving of these conflict problems requires the inhibition of automatic heuristic processing. Accordingly, if the automatic processing were based on processing the set of quantifiers, no semantic contents would be inhibited. The mental model theory, however, suggests that people reason using mental models, which always involves semantic processing. Therefore, whatever inhibition occurs in the processing implies the inhibition of the semantic contents. We manipulated the validity of the syllogism and the congruency of the quantifier of its conclusion with those of the two premises according to the matching heuristic. A subsequent lexical decision task (LDT) with related words in the conclusion was used to test any inhibition of the semantic contents after each syllogistic evaluation trial. In the LDT, the facilitation effect of semantic priming diminished after correctly solved conflict syllogisms (match-invalid or mismatch-valid), but was intact after no-conflict syllogisms. The results suggest the involvement of an inhibitory mechanism of semantic contents in syllogistic reasoning when there is a conflict between the output of the syntactic heuristic and actual validity. Our results do not support a uniquely syntactic process of syllogistic reasoning but fit with the predictions based on mental model theory. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Feedback control architecture and the bacterial chemotaxis network.

    Directory of Open Access Journals (Sweden)

    Abdullah Hamadeh

    2011-05-01

    Bacteria move towards favourable and away from toxic environments by changing their swimming pattern. This response is regulated by the chemotaxis signalling pathway, which has an important feature: it uses feedback to 'reset' (adapt) the bacterial sensing ability, which allows the bacteria to sense a range of background environmental changes. The role of this feedback has been studied extensively in the simple chemotaxis pathway of Escherichia coli. However, it has recently been found that the majority of bacteria have multiple chemotaxis homologues of the E. coli proteins, resulting in more complex pathways. In this paper we investigate the configuration and role of feedback in Rhodobacter sphaeroides, a bacterium containing multiple homologues of the chemotaxis proteins found in E. coli. Multiple proteins could produce different possible feedback configurations, each having different chemotactic performance qualities and levels of robustness to variations and uncertainties in biological parameters and to intracellular noise. We develop four models corresponding to different feedback configurations. Using a series of carefully designed experiments we discriminate between these models and invalidate three of them. When these models are examined in terms of robustness to noise and parametric uncertainties, we find that the non-invalidated model is superior to the others. Moreover, it has a 'cascade control' feedback architecture which is used extensively in engineering to improve system performance, including robustness. Given that the majority of bacteria are known to have multiple chemotaxis pathways, in this paper we show that some feedback architectures allow them to have better performance than others. In particular, cascade control may be an important feature in achieving robust functionality in more complex signalling pathways and in improving their performance.
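    Cascade control, mentioned above as a standard engineering architecture, nests a fast inner loop inside a slower outer loop so that disturbances entering the inner subsystem are corrected before they propagate outward. The toy plant (velocity driven by an actuator with a constant disturbance, position integrating velocity) and the gains below are illustrative assumptions for the sketch, unrelated to the chemotaxis models themselves.

```python
def simulate_cascade(x_ref=1.0, d=1.0, kp_outer=2.0, kp_inner=50.0,
                     dt=0.01, steps=5000):
    """Toy cascade control: an outer loop on position x sets the reference
    for a fast inner loop on velocity v; a constant disturbance d enters
    the inner subsystem."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        v_ref = kp_outer * (x_ref - x)   # outer (slow) loop: position error
        u = kp_inner * (v_ref - v)       # inner (fast) loop: velocity error
        v += dt * (-v + u + d)           # inner dynamics, disturbed
        x += dt * v                      # outer dynamics: x integrates v
    return x

print(simulate_cascade())  # settles close to x_ref despite the disturbance
```

    Because the high-gain inner loop reacts to the disturbance directly, the residual position error scales as d / (kp_inner * kp_outer), which is the robustness benefit the abstract attributes to the cascade architecture.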

  13. Urine specimen validity test for drug abuse testing in workplace and court settings.

    Science.gov (United States)

    Lin, Shin-Yu; Lee, Hei-Hwa; Lee, Jong-Feng; Chen, Bai-Hsiun

    2018-01-01

    In recent decades, urine drug testing in the workplace has become common in many countries in the world. There have been several studies concerning the use of the urine specimen validity test (SVT) for drug abuse testing administered in the workplace. However, very little data exists concerning the urine SVT on drug abuse tests from court specimens, including dilute, substituted, adulterated, and invalid tests. We investigated 21,696 submitted urine drug test samples for SVT from workplace and court settings in southern Taiwan over 5 years. All immunoassay screen-positive urine specimen drug tests were confirmed by gas chromatography/mass spectrometry. We found that the mean 5-year prevalence of tampering (dilute, substituted, or invalid tests) in urine specimens from the workplace and court settings were 1.09% and 3.81%, respectively. The mean 5-year percentage of dilute, substituted, and invalid urine specimens from the workplace were 89.2%, 6.8%, and 4.1%, respectively. The mean 5-year percentage of dilute, substituted, and invalid urine specimens from the court were 94.8%, 1.4%, and 3.8%, respectively. No adulterated cases were found among the workplace or court samples. The most common drug identified from the workplace specimens was amphetamine, followed by opiates. The most common drug identified from the court specimens was ketamine, followed by amphetamine. We suggest that all urine specimens taken for drug testing from both the workplace and court settings need to be tested for validity. Copyright © 2017. Published by Elsevier B.V.

  14. Field Flows of Dark Energy

    Energy Technology Data Exchange (ETDEWEB)

    Cahn, Robert N.; de Putter, Roland; Linder, Eric V.

    2008-07-08

    Scalar field dark energy evolving from a long radiation- or matter-dominated epoch has characteristic dynamics. While slow-roll approximations are invalid, a well defined field expansion captures the key aspects of the dark energy evolution during much of the matter-dominated epoch. Since this behavior is determined, it is not faithfully represented if priors for dynamical quantities are chosen at random. We demonstrate these features for both thawing and freezing fields, and for some modified gravity models, and unify several special cases in the literature.

  15. Specific heat of Ginzburg-Landau fields in the n-1 expansion

    International Nuclear Information System (INIS)

    Bray, A.J.

    1975-01-01

    The 1/n expansion for the specific heat C_v of the n-component Ginzburg-Landau model is discussed in terms of a 1/n expansion for the irreducible polarization. In the low-temperature limit, each successive term of the latter expansion diverges more strongly than the last, invalidating a truncation of this series at any finite order in 1/n. The most divergent terms in each order are identified and summed. The results provide justification for the usual truncated expansions for C_v.

  16. Resurrecting Equilibria Through Cycles

    DEFF Research Database (Denmark)

    Barnett, Richard C.; Bhattacharya, Joydeep; Bunzel, Helle

    equilibria because they asymptotically violate some economic restriction of the model. The literature has always ruled out such paths. This paper studies a pure-exchange monetary overlapping generations economy in which real balances cycle forever between momentary equilibrium points. The novelty is to show that segments of the offer curve that have been previously ignored can in fact be used to produce asymptotically valid cyclical paths. Indeed, a cycle can bestow dynamic validity on momentary equilibrium points that had erstwhile been classified as dynamically invalid.

  17. Topological organization of (low-dimensional) chaos

    International Nuclear Information System (INIS)

    Tufillaro, N.B.

    1992-01-01

    Recent progress toward classifying low-dimensional chaos measured from time series data is described. This classification theory assigns a template to the time series once the time series is embedded in three dimensions. The template describes the primary folding and stretching mechanisms of phase space responsible for the chaotic motion. Topological invariants of the unstable periodic orbits in the closure of the strange set are calculated from the (reconstructed) template. These topological invariants must be consistent with any model put forth to describe the time series data, and are useful in invalidating (or gaining confidence in) any model intended to describe the dynamical system generating the time series.

  18. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, all independent of the ADCP manufacturer, are being developed in a software program that can process moving-boat discharge measurements regardless of the ADCP used to collect the data.
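    As a generic illustration of how a discharge is assembled from measured pieces, the classic velocity-area (midsection) method sums the product of velocity, depth, and station width over the verticals of a cross section. This mirrors the general principle behind such computations, not any particular manufacturer's algorithm, and the cross-section numbers below are hypothetical.

```python
def midsection_discharge(widths, depths, velocities):
    """Velocity-area method: sum q_i = v_i * d_i * w_i over verticals.

    widths, depths in metres; velocities in m/s; returns discharge in m^3/s.
    """
    if not (len(widths) == len(depths) == len(velocities)):
        raise ValueError("inputs must have the same length")
    return sum(v * d * w for v, d, w in zip(velocities, depths, widths))

# Hypothetical cross-section: five verticals, each 2 m wide
widths = [2.0] * 5                        # m
depths = [1.0, 2.0, 2.5, 2.0, 1.0]        # m
velocities = [0.3, 0.6, 0.8, 0.6, 0.3]    # m/s
print(midsection_discharge(widths, depths, velocities), "m^3/s")
```

    Manufacturer differences of the kind the abstract describes enter through how the depths, edge segments, and invalid-data estimates feeding this sum are computed.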

  19. Radiation and health: low-level-ionizing radiation exposure and effects

    International Nuclear Information System (INIS)

    Kant, Krishan

    2013-01-01

    In the present paper, a brief review of the available literature, data, and reports of various radiation exposure and protection studies is presented. An in-depth analysis of the available reports suggests that the possible beneficial outcomes of exposure to LLIR are: increased growth rate, development, neurogenesis, memory, fecundity (fertility), immunity (resistance to diseases due to large doses of radiation), and lifespan (longevity); and decreased cancer deaths, cardiovascular deaths, respiratory deaths, neonatal deaths, sterility, infection, and premature deaths. The findings also suggest that the LNT theory overstates carcinogenic risks at low doses. It is not scientifically justified and should be abandoned, as it creates radiophobia, thereby blocking efforts to supply reliable, environmentally friendly nuclear energy and important medical therapies. There is no need for anyone to live in fear of serious health consequences from the radioactivity that comes out of nuclear installations or from exposures in the range of background radiation. A linear-quadratic model is given illustrating the validity of radiation hormesis, along with a comparison of the dose rates arising from natural and man-made sources to the Indian population.

  20. Consolidating the medical model of disability: on poliomyelitis and the constitution of orthopedic surgery and orthopaedics as a speciality in Spain (1930-1950

    Directory of Open Access Journals (Sweden)

    Martínez-Pérez, José

    2009-06-01

    At the beginning of the 1930s, various factors made it necessary to transform one of the institutions which was renowned for its work regarding the social reinsertion of the disabled, that is, the Instituto de Reeducación Profesional de Inválidos del Trabajo (Institute for Occupational Retraining of Invalids of Work). The economic crisis of 1929 and the legislative reform aimed at regulating occupational accidents highlighted the failings of this institution to fulfil its objectives. After a time of uncertainty, the centre was renamed the Instituto Nacional de Reeducación de Inválidos (National Institute for Retraining of Invalids). This was done to take advantage of its work in championing the recovery of all people with disabilities.

    This work aims to study the role played in this process by the poliomyelitis epidemics in Spain at this time. It aims to highlight how this disease justified the need to continue the work of a group of professionals and how it helped to reorient the previous programme to re-educate the «invalids». Thus we shall see the way in which, from 1930 to 1950, a specific medical technology helped to consolidate an «individual model» of disability and how a certain cultural stereotype of those affected developed as a result. Lastly, this work discusses the way in which all this took place in the midst of a process of professional development of orthopaedic surgeons.

    At the beginning of the 1930s, a series of factors forced the transformation of one of the institutions in Spain that had most distinguished itself in the task of achieving the social reintegration of people with disabilities: the Instituto de Reeducación de Inválidos del Trabajo. The economic crisis of 1929 and the legislative reforms intended to regulate occupational accidents revealed, among other factors, the limitations of that institution in fulfilling its objectives. After a period of some uncertainty, the

  1. Invalid Cookery, Nursing and Domestic Medicine in Ireland, c. 1900.

    Science.gov (United States)

    Adelman, Juliana

    2018-04-01

    This article uses a 1903 text by the Irish cookery instructress Kathleen Ferguson to examine the intersections between food, medicine and domestic work. Sick Room Cookery, and numerous texts like it, drew on traditions of domestic medicine and Anglo-Irish gastronomy while also seeking to establish female expertise informed by modern science and medicine. Placing the text in its broader cultural context, the article examines how it fit into the tradition of domestic medicine and the emerging profession of domestic science. Giving equal weight to the history of food and of medicine, and seeing each as shaped by historical context, helps us to see the practice of feeding the sick in a different way.

  2. Leaf arrangements are invalid in the taxonomy of orchid species

    Directory of Open Access Journals (Sweden)

    Anna Jakubska-Busse

    2017-07-01

    The selection and validation of proper distinguishing characters are of crucial importance in taxonomic revisions. The modern classifications of orchids utilize molecular tools, but the selection and identification of the material used in these studies is still for the most part based on general species morphology. One of the vegetative characters quoted in orchid manuals is leaf arrangement. However, phyllotactic diversity and ontogenetic changeability have not been analysed in detail in reference to particular taxonomic groups. Therefore, we evaluated the usefulness of leaf arrangements in the taxonomy of the genus Epipactis Zinn, 1757. Typical leaf arrangements in shoots of this genus are described as distichous or spiral. However, in the course of field research and screening of herbarium materials, we indisputably disproved the presence of distichous phyllotaxis in the species Epipactis purpurata Sm. and confirmed the spiral Fibonacci pattern as the dominant leaf arrangement. In addition, detailed analyses revealed the presence of atypical decussate phyllotaxis in this species, as well as demonstrated the ontogenetic formation of pseudowhorls. These findings confirm ontogenetic variability and plasticity in E. purpurata. Our results are discussed in the context of their significance in delimitations of complex taxa within the genus Epipactis.

  3. 32 CFR 538.5 - Conversion of invalidated military payment certificates.

    Science.gov (United States)

    2010-07-01

    ... be forwarded by the summary court officer to the U.S. Army Finance and Accounting Center for decision... up in the accounts of the finance and accounting officer. Such certificates will be held in... claimant. In the event these certificates are again received by the finance and accounting officer as...

  4. OPERA and MINOS Experimental Result Prove Big Bang Theory Invalid

    Science.gov (United States)

    Pressler, David E.

    2012-03-01

    The greatest error in the history of science is the misinterpretation of the Michelson-Morley Experiment. Light was measured to travel at the same speed in all three directions (x, y, z axes) in one's own inertial reference system; however, c will always be measured as having an absolutely different speed in all other inertial frames at different energy levels. Time slows down due to motion or a gravity field. Time is the rate of physical process. Speed = Distance/Time. If the time changes, the distance must change. Therefore, BOTH mirrors must move towards the center of the interferometer and space must contract in all three directions; C-Space. Gravity is a C-Space condition, and is the cause of redshift in our universe--not motion. The universe is not expanding. OPERA results are directly indicated; at the surface of earth, the strength of the gravity field is at maximum--below the earth's surface, time and space are less distorted, C-Space; therefore, c is faster. Newtonian mechanics dictate that a spherical shell of matter at greater radii, with uniform density, produces no net force on an observer located centrally. An observer located on the surface of a sphere, like our Earth or a large sphere located in a remote galaxy, will construct a picture centered on himself to be identical to the one centered inside the spherical shell of mass. Both observers will view the incoming radiation, emitted by the other observer, as redshifted, because they lie on each other's radial line. The Universe is static and very old.

  5. Genetic invalidation of Lp-PLA2 as a therapeutic target

    DEFF Research Database (Denmark)

    Gregson, John M; Freitag, Daniel F; Surendran, Praveen

    2017-01-01

    AIMS: Darapladib, a potent inhibitor of lipoprotein-associated phospholipase A2 (Lp-PLA2), has not reduced risk of cardiovascular disease outcomes in recent randomized trials. We aimed to test whether Lp-PLA2 enzyme activity is causally relevant to coronary heart disease. METHODS: In 72...... (Val379Ala (rs1051931)) in PLA2G7, the gene encoding Lp-PLA2. We supplemented de-novo genotyping with information on a further 45,823 coronary heart disease patients and 88,680 controls in publicly available databases and other previous studies. We conducted a systematic review of randomized trials...... to compare effects of darapladib treatment on soluble Lp-PLA2 activity, conventional cardiovascular risk factors, and coronary heart disease risk with corresponding effects of Lp-PLA2-lowering alleles. RESULTS: Lp-PLA2 activity was decreased by 64% (p = 2.4 × 10(-25)) with carriage of any of the four loss...

  6. A comparison of zero-order, first-order, and Monod biotransformation models

    International Nuclear Information System (INIS)

    Bekins, B.A.; Warren, E.; Godsy, E.M.

    1998-01-01

    Under some conditions, a first-order kinetic model is a poor representation of biodegradation in contaminated aquifers. Although it is well known that the assumption of first-order kinetics is valid only when the substrate concentration, S, is much less than the half-saturation constant, K_S, this assumption is often made without verification of this condition. The authors present a formal error analysis showing that the relative error in the first-order approximation is S/K_S, and in the zero-order approximation the error is K_S/S. They then examine the problems that arise when the first-order approximation is used outside the range for which it is valid. A series of numerical simulations comparing results of first- and zero-order rate approximations to Monod kinetics for a real data set illustrates that if concentrations observed in the field are higher than K_S, it may be better to model degradation using a zero-order rate expression. Compared with Monod kinetics, extrapolation of a first-order rate to lower concentrations under-predicts the biotransformation potential, while extrapolation to higher concentrations may grossly over-predict the transformation rate. A summary of solubilities and Monod parameters for aerobic benzene, toluene, and xylene (BTX) degradation shows that the a priori assumption of first-order degradation kinetics at sites contaminated with these compounds is not valid. In particular, out of six published values of K_S for toluene, only one is greater than 2 mg/L, indicating that when toluene is present in concentrations greater than about a part per million, the assumption of first-order kinetics may be invalid. Finally, the authors apply an existing analytical solution for steady-state one-dimensional advective transport with Monod degradation kinetics to a field data set.
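    The error bounds quoted in this abstract are easy to verify numerically. With the Monod rate k*S/(K_S + S), the first-order approximation (k/K_S)*S overshoots it by exactly a factor (1 + S/K_S), so its relative error is S/K_S, and the zero-order approximation k overshoots by (1 + K_S/S), giving relative error K_S/S. The rate constants in the sketch are arbitrary illustrative values.

```python
def monod_rate(s, k=1.0, ks=1.0):
    """Monod kinetics: rate = k * S / (K_S + S)."""
    return k * s / (ks + s)

def first_order_rate(s, k=1.0, ks=1.0):
    """First-order approximation, valid for S << K_S."""
    return (k / ks) * s

def zero_order_rate(s, k=1.0, ks=1.0):
    """Zero-order approximation, valid for S >> K_S."""
    return k

# Relative errors of each approximation across three concentration regimes
for s in (0.01, 1.0, 100.0):
    m = monod_rate(s)
    print(s, (first_order_rate(s) - m) / m, (zero_order_rate(s) - m) / m)
```

    The printout shows the crossover the authors describe: the first-order error grows linearly in S/K_S, while the zero-order error shrinks as K_S/S, so neither truncation is safe on both sides of S = K_S.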

  7. Collective dose: Dogma, tool, or trauma?

    International Nuclear Information System (INIS)

    Becker, K.

    1996-01-01

    In Europe as well as in the United States, the argument continues in the radiation protection community between the "fundamentalists," who firmly believe in the linear no-threshold (LNT) hypothesis and the closely related concept of collective dose, and the "pragmatists," who have serious doubts about these concepts for both radiobiological and socioeconomic reasons. The latter view is reflected in many recent compilations in the scientific literature, in particular in the books by G. Walinder and S. Kondo. The fundamentalist view has been expressed in other recent publications. What has been described as the good old boys' network of the establishment threatens nonbelievers with excommunication: in his 1996 Sievert lecture, D. Beninson described doubts about the LNT hypothesis as "arrogant ignorance"; A. Gonzales, in a 1995 letter to the author, described them as "intellectual laziness"; and R. Clarke, chairman of the International Commission on Radiological Protection (ICRP), during the 1996 IRPA Congress, described them as "seriously misguided." Threshold or no threshold, that is the question - the most important one for this and the next generation of health physicists. Corrections of the current dogmas are needed soon and should be initiated not only by the nuclear community (which may be blamed by opinion makers for a biased view) but by those members of the radiation protection community who are not interested in 'keeping the hazard alive' for reasons of publicity, research funding, and sales of instruments and services. The International Atomic Energy Agency will devote a symposium to these questions next year. The results should be interesting.

  8. Relationship between dose and risk, and assessment of carcinogenic risks associated with low doses of ionizing radiation

    International Nuclear Information System (INIS)

    Tubiana, M.; Aurengo, A.

    2005-01-01

    This report raises doubts on the validity of using the LNT (linear no-threshold) relationship for evaluating the carcinogenic risk of low doses (< 100 mSv), and even more so for very low doses (< 10 mSv). The LNT concept can be a useful pragmatic tool for setting radioprotection rules for doses above 10 mSv; however, since it is not based on the biological concepts of our current knowledge, it should not be used without precaution for assessing by extrapolation the risks associated with low and, even more so, very low doses (< 10 mSv), especially for the benefit-risk assessments imposed on radiologists by the European directive 97-43. The biological mechanisms are different for doses lower than a few dozen mSv and for higher doses. The eventual risks in the dose range of radiological examinations (0.1 to 5 mSv, up to 20 mSv for some examinations) must be estimated taking into account radiobiological and experimental data. An empirical relationship that has been validated only for doses higher than 200 mSv may lead to an overestimation of the risks associated with doses one hundred times lower, and this overestimation could discourage patients from undergoing useful examinations and introduce a bias in radioprotection measures against very low doses (< 10 mSv). Decision makers confronted with problems of radioactive waste or risk of contamination should re-examine the methodology used for the evaluation of risks associated with very low doses and with doses delivered at a very low dose rate. This report confirms the inappropriateness of the collective dose concept for evaluating population irradiation risks.

  9. Closed loop identification of a piezoelectrically controlled radial gas bearing: Theory and experiment

    DEFF Research Database (Denmark)

    Sekunda, André Krabdrup; Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2018-01-01

    Gas bearing systems have extremely small damping properties. Feedback control is thus employed to increase the damping of gas bearings. Such a feedback loop correlates the input with the measurement noise, which in turn makes the assumptions for direct identification invalid. The originality of this article lies in the investigation of the impact of using different identification methods to identify a rotor-bearing system's dynamic model when a feedback loop is active. Two different identification methods are employed. The first method is the open-loop Prediction Error Method, while the other method

  10. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  11. The new ICRP general recommendations

    International Nuclear Information System (INIS)

    Sasaki, Y.; Lochard, J.; Holm, L.E.; Niwa, O.; Ishigure, N.; Kosako, T.; Kai, M.

    2007-01-01

    The new draft ICRP recommendations were presented by the ICRP chair, Professor Lars-Eric Holm. His presentation was followed by presentations from Japanese members of the various ICRP committees, discussing their views of the draft recommendations based on their own technical experience. After these presentations, questions from the floor raised many of the key issues of the conference: dose constraints, the LNT hypothesis, dose bands, etc. This showed that the conference participants had carefully and completely read the draft, and were very interested in building a final ICRP recommendation that appropriately addresses all their concerns. These issues were also discussed throughout the entire conference. (author)

  12. Surface-illuminant ambiguity and color constancy: effects of scene complexity and depth cues.

    Science.gov (United States)

    Kraft, James M; Maloney, Shannon I; Brainard, David H

    2002-01-01

    Two experiments were conducted to study how scene complexity and cues to depth affect human color constancy. Specifically, two levels of scene complexity were compared. The low-complexity scene contained two walls with the same surface reflectance and a test patch which provided no information about the illuminant. In addition to the surfaces visible in the low-complexity scene, the high-complexity scene contained two rectangular solid objects and 24 paper samples with diverse surface reflectances. Observers viewed illuminated objects in an experimental chamber and adjusted the test patch until it appeared achromatic. Achromatic settings made under two different illuminants were used to compute an index that quantified the degree of constancy. Two experiments were conducted: one in which observers viewed the stimuli directly, and one in which they viewed the scenes through an optical system that reduced cues to depth. In each experiment, constancy was assessed for two conditions. In the valid-cue condition, many cues provided valid information about the illuminant change. In the invalid-cue condition, some image cues provided invalid information. Four broad conclusions are drawn from the data: (a) constancy is generally better in the valid-cue condition than in the invalid-cue condition; (b) for the stimulus configuration used, increasing image complexity has little effect in the valid-cue condition but leads to increased constancy in the invalid-cue condition; (c) for the stimulus configuration used, reducing cues to depth has little effect for either constancy condition; and (d) there is moderate individual variation in the degree of constancy exhibited, particularly in the degree to which the complexity manipulation affects performance.

  13. Quantitative analysis of dynamic fault trees using improved Sequential Binary Decision Diagrams

    International Nuclear Information System (INIS)

    Ge, Daochuan; Lin, Meng; Yang, Yanhua; Zhang, Ruoxing; Chou, Qiang

    2015-01-01

    Dynamic fault trees (DFTs) are powerful in modeling systems with sequence- and function-dependent failure behaviors. The key point lies in how to quantify complex DFTs analytically and efficiently. Unfortunately, the existing methods for analyzing DFTs all have their own disadvantages. They either suffer from the problem of combinatorial explosion or need a long computation time to obtain an accurate solution. Sequential Binary Decision Diagrams (SBDDs) are regarded as novel and efficient approaches to deal with DFTs, but their two apparent shortcomings remain to be handled: SBDDs may generate invalid nodes when given an unfavorable variable index, and the scale of the resultant cut sequences greatly depends on the chosen variable index. An improved SBDD method is proposed in this paper to deal with the two mentioned problems. It uses an improved ite (If-Then-Else) algorithm to avoid generating invalid nodes when building SBDDs, and a heuristic variable index to keep the scale of resultant cut sequences as small as possible. To confirm the applicability and merits of the proposed method, several benchmark examples are demonstrated, and the results indicate this approach is efficient as well as reasonable. - Highlights: • New ITE method. • Linear complexity-based finding algorithm. • Heuristic variable index
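    The ite operator at the core of BDD/SBDD construction can be illustrated with a generic reduced ordered BDD sketch; this is the textbook algorithm with hash-consing and memoization, not the improved variant proposed in the paper, and all names are illustrative:

    ```python
    # Minimal reduced ordered BDD with the classic ite(f, g, h) operator.
    # Generic textbook sketch, not the paper's improved SBDD ite.

    class BDD:
        def __init__(self):
            self.T = True     # terminal 1
            self.F = False    # terminal 0
            self.unique = {}  # (var, id(low), id(high)) -> node; enforces canonicity
            self.memo = {}    # memoization table for ite

        def node(self, var, low, high):
            if low is high:               # redundant test: both branches equal
                return low
            key = (var, id(low), id(high))
            if key not in self.unique:
                self.unique[key] = (var, low, high)
            return self.unique[key]

        def var(self, v):
            return self.node(v, self.F, self.T)

        def top(self, *nodes):
            # smallest (i.e. topmost) variable index among non-terminal operands
            return min(n[0] for n in nodes if isinstance(n, tuple))

        def cofactor(self, f, v, val):
            if not isinstance(f, tuple) or f[0] != v:
                return f                  # terminal, or independent of v
            return f[2] if val else f[1]

        def ite(self, f, g, h):
            if f is self.T: return g                      # terminal cases
            if f is self.F: return h
            if g is self.T and h is self.F: return f
            if g is h: return g
            key = (id(f), id(g), id(h))
            if key in self.memo:
                return self.memo[key]
            v = self.top(f, g, h)
            low = self.ite(self.cofactor(f, v, 0),
                           self.cofactor(g, v, 0),
                           self.cofactor(h, v, 0))
            high = self.ite(self.cofactor(f, v, 1),
                            self.cofactor(g, v, 1),
                            self.cofactor(h, v, 1))
            r = self.node(v, low, high)
            self.memo[key] = r
            return r
    ```

    Conjunction is then ite(f, g, 0) and disjunction is ite(f, 1, g); because nodes are hash-consed, logically equal functions come out as the identical object, which is what makes equivalence checks cheap.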

  14. Inconsistent Responding in a Criminal Forensic Setting: An Evaluation of the VRIN-r and TRIN-r Scales of the MMPI-2-RF.

    Science.gov (United States)

    Gu, Wen; Reddy, Hima B; Green, Debbie; Belfi, Brian; Einzig, Shanah

    2017-01-01

    Criminal forensic evaluations are complicated by the risk that examinees will respond in an unreliable manner. Unreliable responding could occur due to lack of personal investment in the evaluation, severe mental illness, and low cognitive abilities. In this study, 31% of Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF; Ben-Porath & Tellegen, 2008/2011) profiles were invalid due to random or fixed-responding (T score ≥ 80 on the VRIN-r or TRIN-r scales) in a sample of pretrial criminal defendants evaluated in the context of treatment for competency restoration. Hierarchical regression models showed that symptom exaggeration variables, as measured by inconsistently reported psychiatric symptoms, contributed over and above education and intellectual functioning in their prediction of both random responding and fixed responding. Psychopathology variables, as measured by mood disturbance, better predicted fixed responding after controlling for estimates of cognitive abilities, but did not improve the prediction for random responding. These findings suggest that random responding and fixed responding are not only affected by education and intellectual functioning, but also by intentional exaggeration and aspects of psychopathology. Measures of intellectual functioning and effort and response style should be considered for administration in conjunction with self-report personality measures to rule out rival hypotheses of invalid profiles.

  15. Health risk in nuclear power plants

    International Nuclear Information System (INIS)

    Bliznakov, V.

    1997-01-01

    Worked out are the health risk indices for NPP personnel that could be used in normal operation and in case of accident. These indices concern temporary incapacity for work, invalidity, lethality, cancer, etc. Risk estimation is based on the energy produced in the NPP or on the collective dose of personnel exposure. Assessed are the specific risk values for the Kozloduy NPP, which show that the risk in normal operation is very low (of the order of 2.3-7.2 x 10^-4 for invalidity, lethality and cancer). Health risk indices can be used when comparing various alternative energy sources, as well as for determination of the power strategy of a country. (author)

  16. Interdependent demands, regulatory constraint, and peak-load pricing. [Assessment of Bailey's model

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, D T; Macgregor-Reid, G J

    1977-06-01

    A model of a regulated firm which includes an analysis of peak-load pricing has been formulated by E. E. Bailey in which three alternative modes of regulation on a profit-maximizing firm are considered. The main conclusion reached is that under a regulation limiting the rate of return on capital investment, price reductions are received solely by peak-users and that when regulation limiting the profit per unit of output or the return on costs is imposed, there are price reductions for all users. Bailey has expressly assumed that the demands in different periods are interdependent but has somehow failed to derive the correct price and welfare implications of this empirically highly relevant assumption. Her conclusions would have been perfectly correct for marginal revenues but are quite incorrect for prices, even if her assumption that price exceeds marginal revenues in every period holds. This present paper derives fully and rigorously the implications of regulation for prices, outputs, capacity, and social welfare for a profit-maximizing firm with interdependent demands. In section II, Bailey's model is reproduced and the optimal conditions are given. In section III, it is demonstrated that under the conditions of interdependent demands assumed by Bailey herself, her often-quoted conclusion concerning the effects of the return-on-investment regulation on the off-peak price is invalid. In section IV, the effects of the return-on-investment regulation on the optimal prices, outputs, capacity, and social welfare both for the case in which the demands in different periods are substitutes and for the case in which they are complements are examined. In section V, the pricing and welfare implications of the return-on-investment regulation are compared with the two other modes of regulation considered by Bailey. Section VI is a summary of all sections. (MCW)

  17. Extremely rare collapse and build-up of turbulence in stochastic models of transitional wall flows.

    Science.gov (United States)

    Rolland, Joran

    2018-02-01

    multistability, where ln(T) in the limit of small-variance noise is studied. Two points of view, local noise of small variance and large length, can be used to discuss the exponential dependence of T on L. In particular, it is shown how a passage time T ≍ exp[L(A′R − B′)] can be derived in a conceptual two-degrees-of-freedom model of a transitional wall flow proposed by Dauchot and Manneville. This is done by identifying a quasipotential in the low-variance-noise, large-length limit. This pinpoints the physical effects controlling collapse and build-up trajectories and the corresponding passage times, with an emphasis on the saddle points between laminar and turbulent states. This analytical analysis also shows that these effects lead to the asymmetric probability density function of the kinetic energy of turbulence.

  18. Consecutive Short-Scan CT for Geological Structure Analog Models with Large Size on In-Situ Stage.

    Science.gov (United States)

    Yang, Min; Zhang, Wen; Wu, Xiaojun; Wei, Dongtao; Zhao, Yixin; Zhao, Gang; Han, Xu; Zhang, Shunli

    2016-01-01

    For the analysis of interior geometry and property changes of a large-sized analog model during loading or injection of another medium (water or oil) in a non-destructive way, a consecutive X-ray computed tomography (XCT) short-scan method is developed to realize in-situ tomographic imaging. With this method, the X-ray tube and detector rotate 270° around the center of the guide rail synchronously, switching between positive and negative directions alternately along the translation, until all the needed cross-sectional slices are obtained. Compared with traditional industrial XCTs, this method effectively avoids the winding of high-voltage cables and oil-cooling service pipes during rotation, and also simplifies the installation of the high-voltage generator and cooling system. Furthermore, hardware costs are significantly decreased. This kind of scanner has higher spatial resolution and penetrating ability than medical XCTs. To obtain an effective sinogram which matches rotation angles accurately, a structural-similarity-based method is applied to eliminate invalid projection data which do not contribute to the image reconstruction. Finally, on the basis of the geometrical symmetry property of fan-beam CT scanning, a whole sinogram filling a full 360° range is produced and a standard filtered back-projection (FBP) algorithm is performed to reconstruct artifact-free images.
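    The fan-beam symmetry being exploited is that the ray at source angle β and fan angle γ coincides with the conjugate ray at (β + 180° + 2γ, −γ). A minimal sketch of how a 270° short scan can be completed to 360° with this identity (nearest-view lookup on a 1° grid; all sampling choices here are illustrative, not the authors' setup):

    ```python
    # Sketch of the sinogram-completion idea: views missing from a 270-degree
    # short scan are copied from measured conjugate views using the fan-beam
    # identity p(beta, gamma) = p(beta + 180 + 2*gamma, -gamma).
    # Nearest-view lookup on a 1-degree grid; a real implementation interpolates.

    def fill_full_sinogram(short_sino, n_full=360):
        """short_sino: one view per degree (0 .. len-1), each view a list of
        detector readings sampled symmetrically in gamma (in degrees)."""
        n_det = len(short_sino[0])
        half = n_det // 2                   # detector index of gamma = 0
        full = [list(v) for v in short_sino] + \
               [[None] * n_det for _ in range(n_full - len(short_sino))]
        for b in range(len(short_sino), n_full):   # unmeasured views only
            for j in range(n_det):
                gamma = j - half                    # fan angle in degrees
                bc = (b + 180 + 2 * gamma) % n_full # conjugate view angle
                jc = half - gamma                   # detector index of -gamma
                if bc < len(short_sino):            # conjugate was measured
                    full[b][j] = short_sino[bc][jc]
        return full
    ```

    With a modest fan angle, every view in the missing 90° arc has its conjugate inside the measured 270°, so the full sinogram is complete and ready for standard FBP.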

  19. Simultaneous fitting of real-time PCR data with efficiency of amplification modeled as Gaussian function of target fluorescence

    Directory of Open Access Journals (Sweden)

    Lazar Andreas

    2008-02-01

    Background: In real-time PCR, it is necessary to consider the efficiency of amplification (EA) of amplicons in order to determine initial target levels properly. EAs can be deduced from standard curves, but these involve extra effort and cost and may yield invalid EAs. Alternatively, EA can be extracted from individual fluorescence curves; unfortunately, this is not reliable enough. Results: Here we introduce simultaneous non-linear fitting to determine – without standard curves – an optimal common EA for all samples of a group. In order to adjust EA as a function of target fluorescence, and still to describe fluorescence as a function of cycle number, we use an iterative algorithm that increases fluorescence cycle by cycle and thus simulates the PCR process. A Gauss peak function is used to model the decrease of EA with increasing amplicon accumulation. Our approach was validated experimentally with hydrolysis probe or SYBR green detection with dilution series of 5 different targets. It performed distinctly better in terms of accuracy than the standard curve, DART-PCR, and LinRegPCR approaches. Based on reliable EAs, it was possible to detect that for some amplicons, extraordinary fluorescence (EA > 2.00) was generated with locked nucleic acid hydrolysis probes, but not with SYBR green. Conclusion: In comparison to previously reported approaches that are based on the separate analysis of each curve and on modelling EA as a function of cycle number, our approach yields more accurate and precise estimates of relative initial target levels.
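    The iterative cycle-by-cycle simulation can be sketched as follows; the exact parameterization of the Gauss peak here is an assumption for illustration, not the authors' fitted model:

    ```python
    # Sketch of a cycle-by-cycle PCR simulation: fluorescence grows each cycle
    # by the current efficiency of amplification (EA), and EA decays from its
    # maximum toward 1.0 as amplicon (fluorescence) accumulates, modeled with a
    # Gauss peak function of accumulated fluorescence. Parameter names and the
    # exact functional form are illustrative, not the paper's fitted model.

    import math

    def simulate_pcr(f0, ea_max, sigma, cycles):
        """Return the fluorescence after each cycle.

        f0     -- fluorescence equivalent of the initial target amount
        ea_max -- maximal efficiency (2.0 = perfect doubling per cycle)
        sigma  -- width of the Gauss peak controlling how fast EA drops
        """
        f = f0
        trace = []
        for _ in range(cycles):
            ea = 1.0 + (ea_max - 1.0) * math.exp(-(f / sigma) ** 2)
            f = f * ea            # one amplification cycle
            trace.append(f)
        return trace

    trace = simulate_pcr(f0=1e-6, ea_max=2.0, sigma=0.5, cycles=40)
    ```

    Early cycles, where fluorescence is far below sigma, amplify at nearly ea_max (exponential phase); as fluorescence approaches sigma the efficiency falls toward 1.0 and the curve plateaus, reproducing the familiar sigmoidal qPCR shape.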

  20. Theoretical Issues of Validity in the Measurement of Aided Speech Reception Threshold in Noise for Comparing Nonlinear Hearing Aid Systems.

    Science.gov (United States)

    Naylor, Graham

    2016-07-01

    Adaptive Speech Reception Threshold in noise (SRTn) measurements are often used to make comparisons between alternative hearing aid (HA) systems. Such measurements usually do not constrain the signal-to-noise ratio (SNR) at which testing takes place. Meanwhile, HA systems increasingly include nonlinear features that operate differently in different SNRs, and listeners differ in their inherent SNR requirements. To show that SRTn measurements, as commonly used in comparisons of alternative HA systems, suffer from threats to their validity, to illustrate these threats with examples of potentially invalid conclusions in the research literature, and to propose ways to tackle these threats. An examination of the nature of SRTn measurements in the context of test theory, modern nonlinear HAs, and listener diversity. Examples from the audiological research literature were used to estimate typical interparticipant variation in SRTn and to illustrate cases where validity may have been compromised. There can be no doubt that SRTn measurements, when used to compare nonlinear HA systems, in principle, suffer from threats to their internal and external/ecological validity. Interactions between HA nonlinearities and SNR, and interparticipant differences in inherent SNR requirements, can act to generate misleading results. In addition, SRTn may lie at an SNR outside the range for which the HA system is designed or expected to operate in. Although the extent of invalid conclusions in the literature is difficult to evaluate, examples of studies were nevertheless identified where the risk of each form of invalidity is significant. Reliable data on ecological SNRs are becoming available, so that ecological validity can be assessed.
Methodological developments that can reduce the risk of invalid conclusions include variations on the SRTn measurement procedure itself, manipulations of stimulus or scoring conditions to place SRTn in an ecologically relevant range, and design and analysis

  1. Cardiorespiratory Fitness Is Associated with Selective Attention in Healthy Male High-School Students.

    Science.gov (United States)

    Wengaard, Eivind; Kristoffersen, Morten; Harris, Anette; Gundersen, Hilde

    2017-01-01

    Background: Previous studies have shown associations of physical fitness and cognition in children and in younger and older adults. However, knowledge about associations in high-school adolescents and young adults is sparse. Thus, the aim of this study was to evaluate the association of physical fitness, measured as maximal oxygen uptake (VO2max), muscle mass, weekly training, and cognitive function in the executive domains of selective attention and inhibitory control, in healthy male high-school students. Methods: Fifty-four males (17.9 ± 0.9 years, 72 ± 11 kg and 182 ± 7 cm) completed a VO2max test, a body composition test and a visual cognitive task based on the Posner cue paradigm with three types of stimuli with different attentional demands (i.e., stimuli presentation following no cue, valid cue or invalid cue presentations). The task consisted of 336 target stimuli, where 56 (17%) of the target stimuli appeared without a cue (no cue), 224 (67%) appeared in the same rectangle as the cue (valid cue) and 56 (17%) appeared in the rectangle opposite to the cue (invalid cue). Mean reaction time (RT) and corresponding errors were calculated for each stimuli type. Total task duration was 9 min and 20 s. In addition, relevant background information was obtained in a questionnaire. Results: Linear mixed model analyses showed that higher VO2max was associated with faster RT for stimuli following invalid cue (Estimate = -2.69, SE = 1.03, p = 0.011), and for stimuli following valid cue (Estimate = -2.08, SE = 1.03, p = 0.048). There was no association of muscle mass and stimuli (F = 1.01, p = 0.397) or of weekly training and stimuli (F = 0.99, p = 0.405). Conclusion: The results suggest that cardiorespiratory fitness is associated with cognitive performance in healthy male high-school students in the executive domains of selective attention.

  2. Cardiorespiratory Fitness Is Associated with Selective Attention in Healthy Male High-School Students

    Directory of Open Access Journals (Sweden)

    Eivind Wengaard

    2017-06-01

    Background: Previous studies have shown associations of physical fitness and cognition in children and in younger and older adults. However, knowledge about associations in high-school adolescents and young adults is sparse. Thus, the aim of this study was to evaluate the association of physical fitness, measured as maximal oxygen uptake (V·O2max), muscle mass, weekly training, and cognitive function in the executive domains of selective attention and inhibitory control, in healthy male high-school students. Methods: Fifty-four males (17.9 ± 0.9 years, 72 ± 11 kg and 182 ± 7 cm) completed a V·O2max test, a body composition test and a visual cognitive task based on the Posner cue paradigm with three types of stimuli with different attentional demands (i.e., stimuli presentation following no cue, valid cue or invalid cue presentations). The task consisted of 336 target stimuli, where 56 (17%) of the target stimuli appeared without a cue (no cue), 224 (67%) appeared in the same rectangle as the cue (valid cue) and 56 (17%) appeared in the rectangle opposite to the cue (invalid cue). Mean reaction time (RT) and corresponding errors were calculated for each stimuli type. Total task duration was 9 min and 20 s. In addition, relevant background information was obtained in a questionnaire. Results: Linear mixed model analyses showed that higher V·O2max was associated with faster RT for stimuli following invalid cue (Estimate = −2.69, SE = 1.03, p = 0.011), and for stimuli following valid cue (Estimate = −2.08, SE = 1.03, p = 0.048). There was no association of muscle mass and stimuli (F = 1.01, p = 0.397) or of weekly training and stimuli (F = 0.99, p = 0.405). Conclusion: The results suggest that cardiorespiratory fitness is associated with cognitive performance in healthy male high-school students in the executive domains of selective attention.

  3. Quintessential inflation on the brane and the relic gravity wave background

    International Nuclear Information System (INIS)

    Sami, M.; Sahni, V.

    2004-01-01

    Quintessential inflation describes a scenario in which both inflation and dark energy (quintessence) are described by the same scalar field. In conventional braneworld models of quintessential inflation gravitational particle-production is used to reheat the universe. This reheating mechanism is very inefficient and results in an excessive production of gravity waves which violate nucleosynthesis constraints and invalidate the model. We describe a new method of realizing quintessential inflation on the brane in which inflation is followed by 'instant preheating' (Felder, Kofman and Linde 1999). The larger reheating temperature in this model results in a smaller amplitude of relic gravity waves which is consistent with nucleosynthesis bounds. The relic gravity wave background has a 'blue' spectrum at high frequencies and is a generic byproduct of successful quintessential inflation on the brane

  4. Struggles for medical legitimacy among women experiencing sexual pain: A qualitative study.

    Science.gov (United States)

    Braksmajer, Amy

    2018-04-01

    Given the prominent role of medical institutions in defining what is "healthy" and "normal," many women turn to medicine when experiencing pain during intercourse (dyspareunia). The medical encounter can become a contest between patients and providers when physicians do not grant legitimacy to patients' claims of illness. Drawing on interviews conducted from 2007 to 2008 and 2011 to 2012 with 32 women experiencing dyspareunia (ages 18-60 years) and living in New York City and its surrounding areas, this study examined women's and their physicians' claims regarding bodily expertise, particularly women's perceptions of physician invalidation, their understanding of this invalidation as gendered, and the consequences for women's pursuit of medicalization. Women overwhelmingly sought a medical diagnosis for their dyspareunia, in which they believed that providers would relieve uncertainty about its origin, give treatment alternatives, and permit them to avoid sexual activity. When providers did not give diagnoses, women reported feeling that their bodily self-knowledge was dismissed and their symptoms were attributed to psychosomatic causes. Furthermore, some women linked their perceptions of invalidation to both historical and contemporary forms of gender bias. Exploration of women's struggles for medical legitimacy may lead to a better understanding of the processes by which medicalization of female sexuality takes place.

  5. Development and necessary norms of reasoning

    Science.gov (United States)

    Markovits, Henry

    2014-01-01

    The question of whether reasoning can, or should, be described by a single normative model is an important one. In the following, I combine epistemological considerations taken from Piaget’s notion of genetic epistemology, a hypothesis about the role of reasoning in communication and developmental data to argue that some basic logical principles are in fact highly normative. I argue here that explicit, analytic human reasoning, in contrast to intuitive reasoning, uniformly relies on a form of validity that allows distinguishing between valid and invalid arguments based on the existence of counterexamples to conclusions. PMID:24904501

  6. In defence of collective dose

    International Nuclear Information System (INIS)

    Fairlie, I.; Sumner, D.

    2000-01-01

    Recent proposals for a new scheme of radiation protection leave little room for collective dose estimations. This article discusses the history and present use of collective doses for occupational, ALARA, EIS and other purposes with reference to practical industry papers and government reports. The linear no-threshold (LNT) hypothesis suggests that collective doses which consist of very small doses added together should be used. Moral and ethical questions are discussed, particularly the emphasis on individual doses to the exclusion of societal risks, uncertainty over effects into the distant future and hesitation over calculating collective detriments. It is concluded that for moral, practical and legal reasons, collective dose is a valid parameter which should continue to be used. (author)
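    The additivity at issue can be made concrete with a few lines of arithmetic; the risk coefficient used is purely illustrative:

    ```python
    # Illustrative arithmetic only: under the LNT hypothesis, collective dose is
    # the plain sum of individual doses (person-sieverts), and the expected
    # number of attributable effects scales linearly with it, regardless of how
    # the dose is distributed among individuals. The risk coefficient below is
    # an illustrative placeholder, not an endorsed regulatory value.

    individual_doses_sv = [0.001] * 10_000        # 10,000 workers at 1 mSv each
    collective_dose = sum(individual_doses_sv)    # person-Sv

    RISK_PER_SV = 0.05     # illustrative nominal risk coefficient per sievert
    expected_effects = collective_dose * RISK_PER_SV

    # Under LNT, the same collective dose concentrated in one person receiving
    # 10 Sv yields the same expected detriment -- exactly the additivity that
    # makes collective dose meaningful to its defenders and contested by critics.
    ```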

  7. Epidemiological and radio-biological studies in high background radiation areas of Kerala coast: implications in radiation protection science and human health

    International Nuclear Information System (INIS)

    Das, Birajalaxmi

    2018-01-01

    To date, the linear no-threshold (LNT) hypothesis is well accepted in radiation protection science in spite of its limitations. However, dose-response studies using multiple biological end points from high-background radiation areas have challenged this linearity. Radio-biological and epidemiological studies from the high-level natural radiation areas (HLNRA) of the Kerala coast showed non-linearity as well as efficient repair of DNA damage, indicating that dose limits for public exposure need to be revisited, which may have implications for radiation protection science, human health and low-dose radiation biology. However, further studies using high-throughput approaches are required to identify chronic radiation signatures in human populations exposed to elevated levels of natural background radiation

  8. Homeopathy with radiation?

    International Nuclear Information System (INIS)

    Kiefer, Juergen

    2017-01-01

    There are no reliable data to estimate radiation risk to humans below doses of about 100 mSv. The ICRP adopts the LNT (linear no-threshold) concept for protection purposes. As there is no evidence for its general validity, there is room for other ideas. One is "radiation hormesis", i.e. the notion that low radiation doses are not harmful but rather beneficial to human health. This view is critically discussed, and it is shown that there is no evidence to support this conception, either from epidemiological studies or, convincingly, from animal experiments or mechanistic investigations. There is, therefore, no reason to abandon the prudent approach of the ALARA principle.

  9. Electrical resistivity measurements in superconducting ceramics

    International Nuclear Information System (INIS)

    Muccillo, R.; Bressiani, A.H.A.; Muccillo, E.N.S.; Bressiani, J.C.

    1988-01-01

    Electrical resistivity measurements have been done in (Y, Ba, Cu, O)- and (Y, Al, Ba, Cu, O)-based superconducting ceramics. The sintered specimens were prepared by applying gold electrodes and winding on the non-metalized part with a copper strip to be immersed in liquid nitrogen for cooling. The resistivity measurements have been done by the four-probe method. A copper-constantan or chromel-alumel thermocouple inserted between the specimen and the copper cold finger has been used for the determination of the critical temperature Tc. Details of the experimental set-up and resistivity versus temperature plots in the LNT-RT range for the superconducting ceramics are the major contributions of this communication. (author) [pt

  10. Electrical resistivity measurements in superconducting ceramics

    International Nuclear Information System (INIS)

    Muccillo, R.; Bressiani, A.H.A.; Muccillo, E.N.S.; Bressian, J.C.

    1988-01-01

    Electrical resistivity measurements have been done in (Y,Ba,Cu,O)- and (Y,Al,Ba,Cu,O)-based superconducting ceramics. The sintered specimens were prepared by applying gold electrodes and winding on the non-metalized part with a copper strip to be immersed in liquid nitrogen for cooling. The resistivity measurements have been done by the four-probe method. A copper-constantan or chromel-alumel thermocouple inserted between the specimen and the copper cold finger has been used for the determination of the critical temperature Tc. Details of the experimental set-up and resistivity versus temperature plots in the LNT-RT range for the superconducting ceramics are the major contributions of this communication. (author) [pt

  11. Homeopathy with radiation?; Homoeopathie mit Strahlen?

    Energy Technology Data Exchange (ETDEWEB)

    Kiefer, Juergen

    2017-07-01

    There are no reliable data to estimate radiation risk to humans below doses of about 100 mSv. The ICRP adopts the LNT (linear no-threshold) concept for protection purposes. As there is no evidence for its general validity, there is room for other ideas. One is "radiation hormesis", i.e. the notion that low radiation doses are not harmful but rather beneficial to human health. This view is critically discussed, and it is shown that there is no evidence to support this conception, either from epidemiological studies or, convincingly, from animal experiments or mechanistic investigations. There is, therefore, no reason to abandon the prudent approach of the ALARA principle.

  12. Inkjet printing technology and conductive inks synthesis for microfabrication techniques

    International Nuclear Information System (INIS)

    Dang, Mau Chien; Dung Dang, Thi My; Fribourg-Blanc, Eric

    2013-01-01

    Inkjet printing is an advanced technique which reliably reproduces text, images and photos on paper and some other substrates by desktop printers and is now used in the field of materials deposition. This interest in maskless materials deposition is coupled with the development of microfabrication techniques for the realization of circuits or patterns on flexible substrates for which printing techniques are of primary interest. This paper is a review of some results obtained in inkjet printing technology to develop microfabrication techniques at Laboratory for Nanotechnology (LNT). Ink development, in particular conductive ink, study of printed patterns, as well as application of these to the realization of radio-frequency identification (RFID) tags on flexible substrates, are presented. (paper)

  13. Biological effects of low doses of ionizing radiation: Conflict between assumptions and observations

    International Nuclear Information System (INIS)

    Kesavan, P.C.; Devasagayam, T.P.A.

    1997-01-01

    Recent epidemiological data on cancer incidence among the A-bomb survivors and, more importantly, experimental studies in cell and molecular radiobiology do not lend unequivocal support to the "linear, no-threshold" (LNT) hypothesis; in fact, the discernible evidence that low and high doses of ionizing radiation induce qualitatively different/opposite effects cannot be summarily rejected. The time has come to examine the mechanistic aspects of "radiation hormesis" and "radioadaptive response" seriously rather than proclaiming one's profound disbelief in these phenomena. To put the discussion in a serious scientific mode, we briefly catalogue here reports in the literature on gene expression differentially influenced by low and high doses. These are not explicable in terms of the current radiation paradigm. (author)

  14. Identification and location tasks rely on different mental processes: a diffusion model account of validity effects in spatial cueing paradigms with emotional stimuli.

    Science.gov (United States)

    Imhoff, Roland; Lange, Jens; Germar, Markus

    2018-02-22

    Spatial cueing paradigms are popular tools to assess human attention to emotional stimuli, but different variants of these paradigms differ in what participants' primary task is. In one variant, participants indicate the location of the target (location task), whereas in the other they indicate the shape of the target (identification task). In the present paper we test the idea that although these two variants produce seemingly comparable cue validity effects on response times, they rest on different underlying processes. Across four studies (total N = 397; two in the supplement) using both variants and manipulating the motivational relevance of cue content, diffusion model analyses revealed that cue validity effects in location tasks are primarily driven by response biases, whereas the same effect rests on delay due to attention to the cue in identification tasks. Based on this, we predict and empirically support that a symmetrical distribution of valid and invalid cues would reduce cue validity effects in location tasks to a greater extent than in identification tasks. Across all variants of the task, we fail to replicate the effect of greater cue validity effects for arousing (vs. neutral) stimuli. We discuss the implications of these findings for best practice in spatial cueing research.

  15. A comparative study and a phylogenetic exploration of the compositional architectures of mammalian nuclear genomes.

    Directory of Open Access Journals (Sweden)

    Eran Elhaik

    2014-11-01

    For the past four decades the compositional organization of the mammalian genome posed a formidable challenge to molecular evolutionists attempting to explain it from an evolutionary perspective. Unfortunately, most of the explanations adhered to the "isochore theory," which has long been rebutted. Recently, an alternative compositional domain model was proposed depicting the human and cow genomes as composed mostly of short compositionally homogeneous and nonhomogeneous domains and a few long ones. We test the validity of this model through a rigorous sequence-based analysis of eleven completely sequenced mammalian and avian genomes. Seven attributes of compositional domains are used in the analyses: (1) the number of compositional domains, (2) compositional domain-length distribution, (3) density of compositional domains, (4) genome coverage by the different domain types, (5) degree of fit to a power-law distribution, (6) compositional domain GC content, and (7) the joint distribution of GC content and length of the different domain types. We discuss the evolution of these attributes in light of two competing phylogenetic hypotheses that differ from each other in the validity of clade Euarchontoglires. If valid, the murid genome compositional organization would be a derived state and exhibit a high similarity to that of other mammals. If invalid, the murid genome compositional organization would be closer to an ancestral state. We demonstrate that the compositional organization of the murid genome differs from those of primates and laurasiatherians, a phenomenon previously termed the "murid shift," and in many ways resembles the genome of opossum. We find no support for the "isochore theory." Instead, our findings depict the mammalian genome as a tapestry of mostly short homogeneous and nonhomogeneous domains and few long ones, thus providing strong evidence in favor of the compositional domain model, and seem to invalidate clade Euarchontoglires.

  16. Why Current Statistics of Complementary Alternative Medicine Clinical Trials is Invalid.

    Science.gov (United States)

    Pandolfi, Maurizio; Carreras, Giulia

    2018-06-07

    It is not sufficiently known that frequentist statistics cannot provide direct information on the probability that the research hypothesis tested is correct. The error resulting from this misunderstanding is compounded when the hypotheses under scrutiny have precarious scientific bases, as those of complementary alternative medicine (CAM) generally do. In such cases, it is mandatory to use inferential statistics that consider the prior probability that the hypothesis tested is true, such as Bayesian statistics. The authors show that, under such circumstances, no real statistical significance can be achieved in CAM clinical trials. In this respect, CAM trials involving human material are also hardly defensible from an ethical viewpoint.
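    The core argument can be sketched with Bayes' rule; the significance level, power, and prior probabilities below are conventional illustrative assumptions, not values from the paper.

```python
# Posterior probability that a tested hypothesis is true given a
# "significant" result: with a low prior (as for many CAM hypotheses),
# most significant findings are false positives.
def posterior_true(prior, alpha=0.05, power=0.8):
    # P(H | significant) by Bayes' rule
    p_sig = prior * power + (1 - prior) * alpha
    return prior * power / p_sig

# A plausible 50% prior makes a significant result fairly convincing:
print(round(posterior_true(0.5), 3))   # -> 0.941
# A 1% prior leaves the hypothesis probably false despite p < 0.05:
print(round(posterior_true(0.01), 3))  # -> 0.139
```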

  17. 20 CFR 656.30 - Validity of and invalidation of labor certifications.

    Science.gov (United States)

    2010-04-01

    ... Immigration Officer. (2) The filing date, established under § 656.17(c), of an approved labor certification... particular job opportunity, the alien named on the original application (unless a substitution was approved... the written request of a Consular or Immigration Officer. The Certifying Officer shall issue such...

  18. Simulations of Micro Gas Flows by the DS-BGK Method

    KAUST Repository

    Li, Jun

    2011-01-01

    For gas flows in micro devices, the molecular mean free path is of the same order as the characteristic scale, making the Navier-Stokes equation invalid. Recently, some micro gas flows have been simulated by the DS-BGK method, which is convergent to the BGK equation and very efficient for low-velocity cases. As molecular reflection at the boundary is the dominant effect compared to intermolecular collisions in micro gas flows, a more realistic boundary condition, namely the CLL reflection model, is employed in the DS-BGK simulation, and the influence of the accommodation coefficients used in the molecular reflection model on the results is discussed. The simulation results are verified by comparison with those of the DSMC method, taken as the reference. Copyright © 2011 by ASME.

  19. Predictions for heat transfer characteristics in a natural draft reactor cooling system using a second moment closure turbulence model

    International Nuclear Information System (INIS)

    Nishimura, M.; Maekawa, I.

    2004-01-01

    A numerical study is performed on the natural draft reactor cavity cooling system (RCCS). In the cooling system, the buoyancy-driven heated upward flow can be in the mixed convection regime, which is accompanied by heat transfer impairment. Also, the heating wall condition is asymmetric with respect to the channel cross section. This flow regime and these thermal boundary conditions may invalidate the use of design correlations. To precisely simulate the flow and thermal fields within the RCCS, a second moment closure turbulence model is applied. Two types of RCCS channel geometry are selected for comparison: an annular duct with fins on the outer surface of the inner circular wall, and a multi-rectangular duct. The prediction shows that the local heat transfer coefficient in the RCCS with the finned annular duct is less than 1/6 of that estimated with the Dittus-Boelter correlation. A large portion of the natural draft airflow does not contribute to cooling at all because the mainstream escapes through the narrow gaps between the fins. This result, and thus the finned annulus design, is unacceptable from the viewpoint of structural integrity of the RCCS wall boundary. The performance of the multi-rectangular duct design is acceptable: the RCCS maximum temperature is less than 400 degrees centigrade even when the flow rate is halved from the design condition. (author)

  20. Moral Dilemmas and Existential Issues Encountered Both in Psychotherapy and Philosophical Counseling Practices

    Directory of Open Access Journals (Sweden)

    Beatrice A. Popescu

    2015-08-01

    Full Text Available This paper stems from clinical observations and empirical data collected in the therapy room over six years. It investigates the relationship between psychotherapy and philosophical counseling, proposing an integrative model of counseling. During cognitive behavior therapy sessions with clients who turn to therapy in order to solve their clinical issues, the author noticed that behind most of the invalidating symptoms classified by the DSM-5 as depression, anxiety, and hypochondriac and phobic complaints usually lies a lack of existential meaning or existential scope, and that clients are also tormented by moral dilemmas. Following the anamnestic interview and the psychological evaluation, the depression or anxiety diagnosed on Axis I is rarely just a sum of invalidating symptoms that may disappear if treated symptomatically. When given the Sentence Completion Test, an 80-item test of psychodynamic origin and high face validity, most clients report an entire plethora of conscious or unconscious motivations, distorted cognitions, or irrational thinking, but also grave existential themes such as the scope or meaning of life, professional identity, fear of death, solitude and loneliness, and freedom of choice and liberty. The same issues are approached in philosophical counseling practice, but no systematic research has been done yet in the field. Future research and investigation is needed in order to assess the importance of moral dilemmas and existential issues in both practices.

  1. Moral Dilemmas and Existential Issues Encountered Both in Psychotherapy and Philosophical Counseling Practices.

    Science.gov (United States)

    Popescu, Beatrice A

    2015-08-01

    This paper stems from clinical observations and empirical data collected in the therapy room over six years. It investigates the relationship between psychotherapy and philosophical counseling, proposing an integrative model of counseling. During cognitive behavior therapy sessions with clients who turn to therapy in order to solve their clinical issues, the author noticed that behind most of the invalidating symptoms classified by the DSM-5 as depression, anxiety, and hypochondriac and phobic complaints usually lies a lack of existential meaning or existential scope, and that clients are also tormented by moral dilemmas. Following the anamnestic interview and the psychological evaluation, the depression or anxiety diagnosed on Axis I is rarely just a sum of invalidating symptoms that may disappear if treated symptomatically. When given the Sentence Completion Test, an 80-item test of psychodynamic origin and high face validity, most clients report an entire plethora of conscious or unconscious motivations, distorted cognitions, or irrational thinking, but also grave existential themes such as the scope or meaning of life, professional identity, fear of death, solitude and loneliness, and freedom of choice and liberty. The same issues are approached in philosophical counseling practice, but no systematic research has been done yet in the field. Future research and investigation is needed in order to assess the importance of moral dilemmas and existential issues in both practices.

  2. A study of thermal deformation in the carriage of a permanent magnet direct drive linear motor stage

    International Nuclear Information System (INIS)

    Chow, J.H.; Zhong, Z.W.; Lin, W.; Khoo, L.P.

    2012-01-01

    Carriage deformation due to temperature gradients within the materials of the carriage affects the accuracy of precision machines. This is largely due to the indeterminate temperature distribution in the carriage's material caused by the non-linearity of heat transfer. The joule heat from the motor coil forms the main heat source. When coupled with the heat loss through convection and radiation, the temperature variation in the motor's carriage also increases. In this study, Finite Element Analysis was used together with a set of boundary conditions, obtained empirically, to analyze the distortion of the motor's carriage. The simulated results were compared with those obtained through experiments. The study shows that it is important to know, rather than to assume, the thermal boundary conditions of the motor's carriage of a precision machine in order to accurately estimate the thermal deformation of the carriage in precision machining. - Highlights: ► Deformation occurs in carriages mounted with linear motors. ► The convective coefficient, assumed to be 10 W mm−2, is shown to be invalid. ► The perfect contact conductance is shown to be invalid too. ► To have an accurate thermal model, boundary conditions have to be realistic. ► Boundary conditions are the heat source, convective and conductance values.

  3. Developing self-concept instrument for pre-service mathematics teachers

    Science.gov (United States)

    Afgani, M. W.; Suryadi, D.; Dahlan, J. A.

    2018-01-01

    This study aimed to develop a self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia. It was development research on a non-test instrument in questionnaire form. Construct validity of the instrument was tested using the Pearson product-moment correlation and factor analysis, while reliability was tested using Cronbach's alpha. The instrument was tested on 65 undergraduate students of mathematics education at one of the universities in Palembang, Indonesia. The instrument consisted of 43 items covering 7 aspects of self-concept: individual concern, social identity, individual personality, view of the future, the influence of others who become role models, the influence of the environment inside or outside the classroom, and view of mathematics. The validity test showed one invalid item, because its Pearson's r value of 0.107 was less than the critical value (0.244; α = 0.05). The item belonged to the social identity aspect. After the invalid item was removed, construct validity testing with factor analysis generated only one factor. The Kaiser-Meyer-Olkin (KMO) coefficient was 0.846 and the reliability coefficient was 0.91. From these results, we concluded that the self-concept instrument for undergraduate students of mathematics education in Palembang, Indonesia was valid and reliable with 42 items.
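    As a rough illustration of the validity and reliability computations described, item-total Pearson correlation and Cronbach's alpha can be computed directly; this is a generic sketch with toy scores, not the study's data.

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation between two score lists
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def cronbach_alpha(items):
    # items: one inner list of respondent scores per questionnaire item
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy data: 3 items answered by 5 respondents (hypothetical scores)
items = [[4, 5, 3, 4, 2], [4, 4, 3, 5, 2], [3, 5, 3, 4, 1]]
totals = [sum(s) for s in zip(*items)]
# An item is kept when its item-total r exceeds the critical value (0.244 here)
for it in items:
    print(pearson_r(it, totals) > 0.244)  # -> True for all three toy items
print(round(cronbach_alpha(items), 2))    # -> 0.94
```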

  4. Modelling, simulation, and optimisation of a downflow entrained flow reactor for black liquor gasification

    Energy Technology Data Exchange (ETDEWEB)

    Marklund, Magnus [ETC Energitekniskt Centrum, Piteaa (Sweden)

    2003-12-01

    of heat flux to the reactor wall. By using a model based on coal combustion it was concluded that the gas flow field is relatively insensitive to the burner spray angle. Partial verification of an advanced PBLG model for a simplified case showed very good agreement. By studying the influence from uncertainties in some model parameter inputs, it was found that all the studied main effects are relatively small and the uncertainties in the examined model parameters would not invalidate the results from a design optimisation with the developed PBLG reactor model.

  5. Recent international regulations: low dose-low rate radiation protection and the demise of reason.

    Science.gov (United States)

    Okkalides, Demetrios

    2008-01-01

    The radiation protection measures suggested by the International Commission on Radiological Protection (ICRP), national regulating bodies and experts have been becoming ever more strict despite the decrease of any information supporting the existence of the linear no-threshold model (LNT) and of any adverse effects of low dose low rate (LDLR) irradiation. This tendency arises from the disproportionate response of human society to hazards that are currently in fashion, and it is unreasonable. The 1 mSv/year dose limit for the public suggested by the ICRP corresponds to a 1/18,181 detriment-adjusted cancer risk and is much lower than other hazards faced by modern societies, e.g. driving and smoking, which carry corresponding risks of 1/2,100 and 1/2,000. Even the worldwide rate of deadly work accidents is higher, at 1/8,065. Such excessive safety measures against minimal risks from man-made radiation sources divert resources from very real and much greater hazards. In addition, they undermine research and development of radiation technology and tend to subjugate science and the quest for understanding nature to phobic practices.

  6. Adequacy of authors' replies to criticism raised in electronic letters to the editor: cohort study

    DEFF Research Database (Denmark)

    Gøtzsche, Peter C; Delamothe, Tony; Godlee, Fiona

    2010-01-01

    To investigate whether substantive criticism in electronic letters to the editor, defined as a problem that could invalidate the research or reduce its reliability, is adequately addressed by the authors....

  7. THE HIGH BACKGROUND RADIATION AREA IN RAMSAR IRAN: GEOLOGY, NORM, BIOLOGY, LNT, AND POSSIBLE REGULATORY FUN

    Energy Technology Data Exchange (ETDEWEB)

    Karam, P. A.

    2002-02-25

    The city of Ramsar, Iran hosts some of the highest natural radiation levels on earth, and over 2000 people are exposed to radiation doses ranging from 1 to 26 rem per year. Curiously, inhabitants of this region seem to have no greater incidence of cancer than those in neighboring areas of normal background radiation levels, and preliminary studies suggest their blood cells experience fewer induced chromosomal abnormalities when exposed to 150 rem "challenge" doses of radiation than do the blood cells of their neighbors. This paper will briefly describe the unique geology that gives Ramsar its extraordinarily high background radiation levels. It will then summarize the studies performed to date and will conclude by suggesting ways to incorporate these findings (if they are borne out by further testing) into future radiation protection standards.

  8. Hidden Markov event sequence models: toward unsupervised functional MRI brain mapping.

    Science.gov (United States)

    Faisan, Sylvain; Thoraval, Laurent; Armspach, Jean-Paul; Foucher, Jack R; Metz-Lutz, Marie-Noëlle; Heitz, Fabrice

    2005-01-01

    Most methods used in functional MRI (fMRI) brain mapping require restrictive assumptions about the shape and timing of the fMRI signal in activated voxels. Consequently, fMRI data may be partially and misleadingly characterized, leading to suboptimal or invalid inference. To limit these assumptions and to capture the broad range of possible activation patterns, a novel statistical fMRI brain mapping method is proposed. It relies on hidden semi-Markov event sequence models (HSMESMs), a special class of hidden Markov models (HMMs) dedicated to the modeling and analysis of event-based random processes. Activation detection is formulated in terms of time coupling between (1) the observed sequence of hemodynamic response onset (HRO) events detected in the voxel's fMRI signal and (2) the "hidden" sequence of task-induced neural activation onset (NAO) events underlying the HROs. Both event sequences are modeled within a single HSMESM. The resulting brain activation model is trained to automatically detect neural activity embedded in the input fMRI data set under analysis. The data sets considered in this article are threefold: synthetic epoch-related, real epoch-related (auditory lexical processing task), and real event-related (oddball detection task) fMRI data sets. Synthetic data: Activation detection results demonstrate the superiority of the HSMESM mapping method with respect to a standard implementation of the statistical parametric mapping (SPM) approach. They are also very close, sometimes equivalent, to those obtained with an "ideal" implementation of SPM in which the activation patterns synthesized are reused for analysis. The HSMESM method appears clearly insensitive to timing variations of the hemodynamic response and exhibits low sensitivity to fluctuations of its shape (unsustained activation during task). 
Real epoch-related data: HSMESM activation detection results compete with those obtained with SPM, without requiring any prior definition of the expected

  9. Traditions, Paradigms and Basic Concepts in Islamic Psychology.

    Science.gov (United States)

    Skinner, Rasjid

    2018-03-23

    The conceptual tools of psychology aim to explain the complexity of phenomena that psychotherapists observe in their patients and within themselves, as well as to predict the outcome of therapy. Naturally, Muslim psychologists have sought satisfaction in the conceptual tools of their trade and in what has been written in Islamic psychology, notably by Badri (The dilemma of Muslim psychologists, MWH London, London, 1979), who critiqued Western psychology from an Islamic perspective, arguing the need to filter out from Western psychology that which was cross-culturally invalid or in conflict with Islamic precepts. In this paper, I advocate an extension of Badri's (1979) approach and present a working model of the self derived from traditional Islamic thought. This model, though rudimentary and incomplete, I believe makes better sense of my perceptions as a clinician than any other psychological model within my knowledge.

  10. Augmented Reality in Scientific Publications-Taking the Visualization of 3D Structures to the Next Level.

    Science.gov (United States)

    Wolle, Patrik; Müller, Matthias P; Rauh, Daniel

    2018-03-16

    The examination of three-dimensional structural models in scientific publications allows the reader to validate or invalidate conclusions drawn by the authors. However, whether due to a (temporary) lack of access to proper visualization software or a lack of proficiency, this information is not necessarily accessible to every reader. As the digital revolution quickly progresses, technologies that overcome these limitations have become widely available, offering everyone the opportunity to appreciate models not only in 2D but also in 3D. Additionally, mobile devices such as smartphones and tablets allow access to this information almost anywhere, at any time. Since access to such information has only recently become feasible, we want to outline straightforward ways to incorporate 3D models in augmented reality into scientific publications, books, posters, and presentations, and suggest that this should become general practice.

  11. P2X7 Receptors Drive Spine Synapse Plasticity in the Learned Helplessness Model of Depression.

    Science.gov (United States)

    Otrokocsi, Lilla; Kittel, Ágnes; Sperlágh, Beáta

    2017-10-01

    Major depressive disorder is characterized by structural and functional abnormalities of cortical and limbic brain areas, including a decrease in spine synapse number in the dentate gyrus of the hippocampus. Recent studies have highlighted that both genetic and pharmacological invalidation of the purinergic P2X7 receptor (P2rx7) leads to an antidepressant-like phenotype in animal experiments; however, the impact of P2rx7 on depression-related structural changes in the hippocampus has not yet been clarified. Effects of genetic deletion of P2rx7s on depressive-like behavior and spine synapse density in the dentate gyrus were investigated using the learned helplessness mouse model of depression. We demonstrate that in wild-type animals, inescapable footshocks lead to learned helplessness behavior, reflected in increased latency and number of escape failures to subsequent escapable footshocks. This behavior is accompanied by downregulation of mRNA encoding P2rx7 and a decrease in spine synapse density in the dentate gyrus, as determined by electron microscopic stereology. In addition, a decrease in synaptopodin, but not in PSD95 or NR2B/GluN2B protein level, was also observed under these conditions. Whereas the absence of P2rx7 was characterized by an escape deficit, no learned helplessness behavior was observed in these animals. Likewise, no decrease in spine synapse number or synaptopodin protein levels was detected in response to inescapable footshocks in P2rx7-deficient animals. Our findings suggest endogenous activation of P2rx7s in the learned helplessness model of depression, and the decreased plasticity of spine synapses in P2rx7-deficient mice might explain the resistance of these animals to repeated stressful stimuli. © The Author 2017. Published by Oxford University Press on behalf of CINP.

  12. Assessment of First- and Second-Order Wave-Excitation Load Models for Cylindrical Substructures: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Pereyra, Brandon; Wendt, Fabian; Robertson, Amy; Jonkman, Jason

    2017-03-09

    The hydrodynamic loads on an offshore wind turbine's support structure present unique engineering challenges for offshore wind. Two typical approaches used for modeling these hydrodynamic loads are potential flow (PF) and strip theory (ST), the latter via Morison's equation. This study examines the first- and second-order wave-excitation surge forces on a fixed cylinder in regular waves computed by the PF and ST approaches to (1) verify their numerical implementations in HydroDyn and (2) understand when the ST approach breaks down. The numerical implementation of PF and ST in HydroDyn, a hydrodynamic time-domain solver implemented as a module in the FAST wind turbine engineering tool, was verified by showing the consistency in the first- and second-order force output between the two methods across a range of wave frequencies. ST is known to be invalid at high frequencies, and this study investigates where the ST solution diverges from the PF solution. Regular waves across a range of frequencies were run in HydroDyn for a monopile substructure. As expected, the solutions for the first-order (linear) wave-excitation loads resulting from these regular waves are similar for PF and ST when the diameter of the cylinder is small compared to the length of the waves (generally when the diameter-to-wavelength ratio is less than 0.2). The same finding applies to the solutions for second-order wave-excitation loads, but for much smaller diameter-to-wavelength ratios (based on wavelengths of first-order waves).
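    The diameter-to-wavelength criterion can be checked numerically; the deep-water linear dispersion relation λ = gT²/(2π) and the monopile diameter below are illustrative assumptions, not HydroDyn code.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def deepwater_wavelength(period_s):
    # Linear (Airy) deep-water dispersion: lambda = g * T^2 / (2 * pi)
    return G * period_s ** 2 / (2 * math.pi)

def strip_theory_ok(diameter_m, period_s, limit=0.2):
    # Strip theory (Morison) is generally acceptable when D / lambda < ~0.2
    return diameter_m / deepwater_wavelength(period_s) < limit

# Hypothetical 6 m monopile: fine for 8 s swell, questionable for 3 s chop
print(round(deepwater_wavelength(8.0), 1))                    # -> 99.9
print(strip_theory_ok(6.0, 8.0), strip_theory_ok(6.0, 3.0))   # -> True False
```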

  13. Satisfaction Determinants of Women during Childbirth in Health ...

    African Journals Online (AJOL)

    USER

    thesis, we are interested in measuring the effect of women's satisfaction level ... value: confirmation/invalidation paradigm,. (iv). Satisfaction is the perceived competence of professionals by customers, the quality of information received, the ...

  14. A new paradigm in radioadaptive response developing from microbeam research

    International Nuclear Information System (INIS)

    Matsumoto, Hideki; Tomita, Masanori; Otsuka, Kensuke; Hatashita, Masanori

    2009-01-01

    A classic paradigm in radiation biology asserts that all radiation effects on cells, tissues and organisms are due to the direct action of radiation on living tissue. Using this model, possible risks from exposure to low dose ionizing radiation (below 100 mSv) are estimated by extrapolating from data obtained after exposure to higher doses of radiation, using a linear non-threshold model (LNT model). However, the validity of using this dose-response model is controversial because evidence accumulated over the past decade has indicated that living organisms, including humans, respond differently to low dose/low dose-rate radiation than they do to high dose/high dose-rate radiation. These important responses to low dose/low dose-rate radiation are the radiation-induced adaptive response, the bystander response, low-dose hypersensitivity, and genomic instability. The mechanisms underlying these responses often involve biochemical and molecular signals generated in response to targeted and non-targeted events. In order to define and understand the bystander response, to provide a basis for understanding non-targeted events, and to elucidate the mechanisms involved, recent sophisticated research has been conducted with X-ray microbeams and charged heavy particle microbeams, and these studies have produced many new observations. Based on these observations, associations have been suggested to exist between the radio-adaptive and bystander responses. The present review focuses on these two phenomena, summarizes observations supporting their existence, and discusses the linkage between them in light of recent results obtained from experiments utilizing microbeams. (author)

  15. Vortex pinning in layered organic superconductors: κ-(BEDT-TTF)2Cu[N(CN)2]Br

    International Nuclear Information System (INIS)

    Khizroev, S.; Zuo, F.; Alexandrakis, G.C.; Schlueter, J.A.; Geiser, U.; Williams, J.M.

    1996-01-01

    Magnetization studies on organic single-crystal superconductors of κ-(BEDT-TTF)2Cu[N(CN)2]Br with the field H parallel to the b axis (perpendicular to the conducting plane) show anomalous field and temperature dependence of vortex pinning in the mixed state. At high temperatures, the magnetization M decays with increasing field with a power-law dependence. The normalized relaxation rate S = d(lnM)/d(lnt) decreases monotonically with H. At low temperatures, the field H_rev shows a universal power-law dependence on (1 - T/T_c) in the temperature range investigated. We suggest that the magnetic anomaly observed is due to a dimensional crossover in the nature of vortex pinning. copyright 1996 American Institute of Physics
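    For reference, the normalized relaxation rate S = d(lnM)/d(lnt) can be estimated as a slope in log-log space; the sketch below uses a hypothetical power-law decay, not the measured data.

```python
import math

def normalized_relaxation_rate(times, magnetization):
    # S = d(ln M)/d(ln t), estimated as a least-squares slope in log-log space
    lx = [math.log(t) for t in times]
    ly = [math.log(m) for m in magnetization]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# A pure power-law decay M ~ t^(-0.05) gives a constant S of -0.05
ts = [10.0, 100.0, 1000.0, 10000.0]
ms = [2.0 * t ** -0.05 for t in ts]
print(round(normalized_relaxation_rate(ts, ms), 3))  # -> -0.05
```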

  16. Irradiation effect of the insulating materials for fusion superconducting magnets at cryogenic temperature

    Science.gov (United States)

    Kobayashi, Koji; Akiyama, Yoko; Nishijima, Shigehiro

    2017-09-01

    In ITER, superconducting magnets must be used in a severe environment of high fast-neutron fluence, cryogenic temperature and large electromagnetic forces. Insulating material is one of the components most sensitive to radiation, so radiation resistance of the mechanical properties at cryogenic temperature is required for insulating materials. The purpose of this study is to evaluate the irradiation effect on insulating material at cryogenic temperature by gamma-ray irradiation. First, glass fiber reinforced plastic (GFRP) and a hybrid composite were prepared. After irradiation at room temperature (RT) or liquid nitrogen temperature (LNT, 77 K), interlaminar shear strength (ILSS) and glass-transition temperature (Tg) measurements were conducted. It was shown that insulating materials irradiated at room temperature were degraded much more than those irradiated at cryogenic temperature.

  17. Nanofoams Response to Radiation Damage

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Engang [Los Alamos National Laboratory; Serrano De Caro, Magdalena [Los Alamos National Laboratory; Wang, Yongqiang [Los Alamos National Laboratory; Nastasi, Michael [Nebraska Center for Energy Sciences Research, University of Nebraska-Lincoln, NE 68508; Zepeda-Ruiz, Luis [PLS, Lawrence Livermore National Laboratory, Livermore, CA 94551; Bringa, Eduardo M. [CONICET and Inst. Ciencias Basicas, Universidad Nacional de Cuyo, Mendoza, 5500 Argentina; Baldwin, Jon K. [Los Alamos National Laboratory; Caro, Jose A. [Los Alamos National Laboratory

    2012-07-30

    Conclusions of this presentation are: (1) np-Au foams were successfully synthesized by a de-alloying process; (2) np-Au foams retain their porous structure after Ne ion irradiation to 1 dpa; (3) SFTs were observed in np-Au foams irradiated at the highest and intermediate fluxes, while no SFTs were observed at the lowest flux; (4) SFTs were observed in np-Au foams irradiated at RT, whereas no SFTs were observed with LNT irradiation; (5) the diffusivity of vacancies in Au at RT is high enough that the vacancies have time to agglomerate and thus collapse, so SFTs were formed; (6) the high flux created much more damage per unit time, so vacancies did not have enough time to diffuse or recombine, and as a result SFTs were formed.

  18. Isolating lattice from electronic contributions in thermal transport measurements of metals and alloys above ambient temperature and an adiabatic model

    Science.gov (United States)

    Criss, Everett M.; Hofmeister, Anne M.

    2017-06-01

    From femtosecond spectroscopy (fs-spectroscopy) of metals, electrons and phonons reequilibrate nearly independently, which contrasts with models of heat transfer at ordinary temperatures (T > 100 K). These electronic transfer models only agree with thermal conductivity (k) data at a single temperature, but do not agree with thermal diffusivity (D) data. To address the discrepancies, which are important to problems in solid state physics, we separately measured electronic (ele) and phononic (lat) components of D in many metals and alloys over ~290-1100 K by varying measurement duration and sample length in laser-flash experiments. These mechanisms produce distinct diffusive responses in temperature versus time acquisitions because carrier speeds (u) and heat capacities (C) differ greatly. Electronic transport of heat only operates for a brief time after heat is applied because u is high. High D_ele is associated with moderate T, long lengths, low electrical resistivity, and loss of ferromagnetism. Relationships of D_ele and D_lat with physical properties support our assignments. Although k_ele reaches ~20 × k_lat near 470 K, it is transient. Combining previous data on u with each D provides mean free paths and lifetimes that are consistent with ~298 K fs-spectroscopy, and new values at high T. Our findings are consistent with nearly-free electrons absorbing and transmitting a small fraction of the incoming heat, whereas phonons absorb and transmit the majority. We model time-dependent, parallel heat transfer under adiabatic conditions which is one-dimensional in solids, as required by thermodynamic law. For noninteracting mechanisms, k ≅ (ΣC_i k_i)(ΣC_i)/(ΣC_i²). For metals, this reduces to k = k_lat above ~20 K, consistent with our measurements, and shows that Meissner's equation (k ≅ k_lat + k_ele) is invalid above ~20 K. For one mechanism with multiple, interacting carriers, k ≅ ΣC_i k_i/(ΣC_i). Thus, certain dynamic behaviors of electrons and phonons in metals have been
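    The two combination rules quoted in the abstract can be written out directly; this is a sketch with hypothetical heat capacities and conductivities, not the authors' data.

```python
def k_noninteracting(C, k):
    # k ≅ (Σ C_i k_i)(Σ C_i) / (Σ C_i²): parallel, noninteracting mechanisms
    return sum(c * ki for c, ki in zip(C, k)) * sum(C) / sum(c * c for c in C)

def k_interacting(C, k):
    # k ≅ Σ C_i k_i / Σ C_i: one mechanism with multiple interacting carriers
    return sum(c * ki for c, ki in zip(C, k)) / sum(C)

# Hypothetical lattice and electronic channels with C_lat >> C_ele, as in
# metals at high T; the combined k stays close to the lattice value even
# though the electronic channel conductivity is much larger.
C = [1.0, 0.001]     # heat capacities (arbitrary units): lattice, electronic
k = [10.0, 200.0]    # channel conductivities (arbitrary units)
print(round(k_noninteracting(C, k), 2))  # -> 10.21, close to k_lat = 10
print(round(k_interacting(C, k), 2))     # -> 10.19
```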

  19. Evidence for impairments in using static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism.

    Science.gov (United States)

    Goldberg, Melissa C; Mostow, Allison J; Vecera, Shaun P; Larson, Jennifer C Gidley; Mostofsky, Stewart H; Mahone, E Mark; Denckla, Martha B

    2008-09-01

    We examined the ability to use static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism (HFA) compared to typically developing children (TD). The task was organized such that on valid trials, gaze cues were directed toward the same spatial location as the appearance of an upcoming target, while on invalid trials gaze cues were directed to an opposite location. Unlike TD children, children with HFA showed no advantage in reaction time (RT) on valid trials compared to invalid trials (i.e., no significant validity effect). The two stimulus onset asynchronies (200 ms, 700 ms) did not differentially affect these findings. The results suggest that children with HFA show impairments in utilizing static line drawings of gaze cues to orient visual-spatial attention.

  20. The importance of assessing for validity of symptom report and performance in attention deficit/hyperactivity disorder (ADHD): Introduction to the special section on noncredible presentation in ADHD.

    Science.gov (United States)

    Suhr, Julie A; Berry, David T R

    2017-12-01

    Invalid self-report and invalid performance occur with high base rates in attention deficit/hyperactivity disorder (ADHD; Harrison, 2006; Musso & Gouvier, 2014). Although much research has focused on the development and validation of symptom validity tests (SVTs) and performance validity tests (PVTs) for psychiatric and neurological presentations, less attention has been given to the use of SVTs and PVTs in ADHD evaluation. This introduction to the special section describes a series of studies examining the use of SVTs and PVTs in adult ADHD evaluation. We present the series of studies in the context of prior research on noncredible presentation and call for future research using improved research methods and with a focus on assessment issues specific to ADHD evaluation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).