WorldWideScience

Sample records for linear no-threshold lnt

  1. Linear non-threshold (LNT) radiation hazards model and its evaluation

    International Nuclear Information System (INIS)

    Min Rui

    2011-01-01

    To introduce the linear non-threshold (LNT) model used in studies of the dose effect of radiation hazards and to evaluate its application, a comprehensive literature analysis was made. The results show that the LNT model describes biological effects more accurately at high doses than at low doses. The repairable-conditionally repairable model of cell radiation effects can account well for the cell survival curve across the high, medium and low absorbed-dose ranges. Many uncertainties remain in the assessment model for the effective dose of internal radiation, which is based on the LNT assumption and individual mean organ equivalent dose; it is necessary to establish gender-specific voxel human models that take gender differences into account. In summary, the advantages and disadvantages of the various models coexist. Until a new theory and model are established, retaining the LNT model remains the most scientifically defensible position. (author)

  2. Response to, "On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith.".

    Science.gov (United States)

    Beyea, Jan

    2016-07-01

    It is not true that successive groups of researchers from academia and research institutions, scientists who served on panels of the US National Academy of Sciences (NAS), were duped into supporting a linear no-threshold model (LNT) by the opinions expressed in the genetic panel section of the 1956 "BEAR I" report. Successor reports had their own views of the LNT model, relying on mouse and human data, not fruit fly data. Nor was the 1956 report biased and corrupted, as has been charged in an article by Edward J. Calabrese in this journal. With or without BEAR I, the LNT model would likely have been accepted in the US for radiation protection purposes in the 1950s. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2017-01-01

    This paper reveals that nearly 25 years after the National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, a decision that changed the course of risk assessment for radiation and chemicals to the present day. -- Highlights: • The BEAR I Genetics Panel made an error in denying dose rate for mutation. • The BEIR I Genetics Subcommittee attempted to correct this dose rate error. • The control group used for risk assessment by BEIR I is now known to be in error. • Correcting this error contradicts the LNT, supporting a threshold model.

  4. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu [Department of Environmental Health Sciences, School of Public Health and Health Sciences, Morrill I, N344, University of Massachusetts, Amherst, MA 01003 (United States)

    2017-04-15

    This paper reveals that nearly 25 years after the National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, a decision that changed the course of risk assessment for radiation and chemicals to the present day. -- Highlights: • The BEAR I Genetics Panel made an error in denying dose rate for mutation. • The BEIR I Genetics Subcommittee attempted to correct this dose rate error. • The control group used for risk assessment by BEIR I is now known to be in error. • Correcting this error contradicts the LNT, supporting a threshold model.

  5. Linear, no threshold response at low doses of ionizing radiation: ideology, prejudice and science

    International Nuclear Information System (INIS)

    Kesavan, P.C.

    2014-01-01

    The linear, no threshold (LNT) response model assumes that there is no threshold dose for the radiation-induced genetic effects (heritable mutations and cancer), and it forms the current basis for radiation protection standards for radiation workers and the general public. The LNT model is, however, based more on ideology than valid radiobiological data. Further, phenomena such as 'radiation hormesis', 'radioadaptive response', 'bystander effects' and 'genomic instability' are now demonstrated to be radioprotective and beneficial. More importantly, the 'differential gene expression' reveals that qualitatively different proteins are induced by low and high doses. This finding negates the LNT model which assumes that qualitatively similar proteins are formed at all doses. Thus, all available scientific data challenge the LNT hypothesis. (author)

  6. Test of the linear-no threshold theory of radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1998-01-01

    It is shown that testing the linear-no threshold theory (L-NT) of radiation carcinogenesis is extremely important and that lung cancer resulting from exposure to radon in homes is the best tool for doing this. A study of lung cancer rates vs radon exposure in U.S. counties, reported in 1995, is reviewed. It shows, with extremely powerful statistics, that lung cancer rates decrease with increasing radon exposure, in sharp contrast to the prediction of L-NT, with a discrepancy of over 20 standard deviations. Very extensive efforts were made to explain an appreciable part of this discrepancy consistently with L-NT, with no success; it was concluded that L-NT fails, grossly exaggerating the cancer risk of low-level radiation. Two updating studies reported in 1996 are also reviewed. New updating studies utilizing more recent lung cancer statistics and considering 450 new potential confounding factors are reported. All updates reinforce the previous conclusion, and the discrepancy with L-NT is increased. (author)

  7. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT.

    Science.gov (United States)

    Calabrese, Edward J

    2017-04-01

    This paper reveals that nearly 25 years after the National Academy of Sciences (NAS), Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, which was a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2015-01-01

    This paper is an historical assessment of how prominent radiation geneticists in the United States during the 1940s and 1950s successfully worked to build acceptance for the linear no-threshold (LNT) dose–response model in risk assessment, significantly impacting environmental, occupational and medical exposure standards and practices to the present time. Detailed documentation indicates that actions taken in support of this policy revolution were ideologically driven and deliberately and deceptively misleading; that scientific records were artfully misrepresented; and that people and organizations in positions of public trust failed to perform the duties expected of them. Key activities are described and the roles of specific individuals are documented. These actions culminated in a 1956 report by a Genetics Panel of the U.S. National Academy of Sciences (NAS) on Biological Effects of Atomic Radiation (BEAR). In this report the Genetics Panel recommended that a linear dose response model be adopted for the purpose of risk assessment, a recommendation that was rapidly and widely promulgated. The paper argues that current international cancer risk assessment policies are based on fraudulent actions of the U.S. NAS BEAR I Committee, Genetics Panel and on the uncritical, unquestioning and blind-faith acceptance by regulatory agencies and the scientific community. - Highlights: • The 1956 recommendation of the US NAS to use the LNT for risk assessment was adopted worldwide. • This recommendation is based on a falsification of the research record and represents scientific misconduct. • The record misrepresented the magnitude of panelist disagreement of genetic risk from radiation. • These actions enhanced public acceptance of their risk assessment policy recommendations.

  9. On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu

    2015-10-15

    This paper is an historical assessment of how prominent radiation geneticists in the United States during the 1940s and 1950s successfully worked to build acceptance for the linear no-threshold (LNT) dose–response model in risk assessment, significantly impacting environmental, occupational and medical exposure standards and practices to the present time. Detailed documentation indicates that actions taken in support of this policy revolution were ideologically driven and deliberately and deceptively misleading; that scientific records were artfully misrepresented; and that people and organizations in positions of public trust failed to perform the duties expected of them. Key activities are described and the roles of specific individuals are documented. These actions culminated in a 1956 report by a Genetics Panel of the U.S. National Academy of Sciences (NAS) on Biological Effects of Atomic Radiation (BEAR). In this report the Genetics Panel recommended that a linear dose response model be adopted for the purpose of risk assessment, a recommendation that was rapidly and widely promulgated. The paper argues that current international cancer risk assessment policies are based on fraudulent actions of the U.S. NAS BEAR I Committee, Genetics Panel and on the uncritical, unquestioning and blind-faith acceptance by regulatory agencies and the scientific community. - Highlights: • The 1956 recommendation of the US NAS to use the LNT for risk assessment was adopted worldwide. • This recommendation is based on a falsification of the research record and represents scientific misconduct. • The record misrepresented the magnitude of panelist disagreement of genetic risk from radiation. • These actions enhanced public acceptance of their risk assessment policy recommendations.

  10. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  11. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique.

    Science.gov (United States)

    Bogen, Kenneth T

    2016-03-01

    To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints should be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear MOA carcinogens are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
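    Bogen's central mathematical point, that a response which looks linear over the observed dose range may actually be a mixture of quasi-threshold components whose low-dose risk a linear fit through the origin then overstates by growing relative margins, can be sketched numerically. The Hill-function components and all parameter values below are purely illustrative assumptions, not taken from the paper:

```python
def hill(d, k, n=3):
    """Quasi-threshold (S-shaped) dose-response component: ~d**n at low dose."""
    return d**n / (k**n + d**n)

def mixture(d):
    """Additive mixture of two quasi-threshold components.

    The midpoints (3 and 7) and equal weights are hypothetical values
    chosen only so the sum looks roughly linear over doses 2-8.
    """
    return 0.5 * hill(d, 3.0) + 0.5 * hill(d, 7.0)

# Fit a straight line through the origin (an LNT model) by least squares
# over an "observed" dose range where the mixture appears linear.
observed = [2, 3, 4, 5, 6, 7, 8]
slope = sum(d * mixture(d) for d in observed) / sum(d * d for d in observed)

# Below the observed range, the LNT line overestimates the mixture's
# response, and the relative overestimate grows as the dose shrinks.
for d in (1.0, 0.5, 0.1):
    print(f"dose {d}: LNT/actual = {slope * d / mixture(d):.1f}")
```

With these assumed parameters the overestimation ratio increases steadily as the above-background dose decreases, which is the qualitative behavior the abstract describes.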

  12. Checking the foundation: recent radiobiology and the linear no-threshold theory.

    Science.gov (United States)

    Ulsh, Brant A

    2010-12-01

    The linear no-threshold (LNT) theory has been adopted as the foundation of radiation protection standards and risk estimation for several decades. The "microdosimetric argument" has been offered in support of the LNT theory. This argument postulates that energy is deposited in critical cellular targets by radiation in a linear fashion across all doses down to zero, and that this in turn implies a linear relationship between dose and biological effect across all doses. This paper examines whether the microdosimetric argument holds at the lowest levels of biological organization following low dose, low dose-rate exposures to ionizing radiation. The assumptions of the microdosimetric argument are evaluated in light of recent radiobiological studies on radiation damage in biological molecules and cellular and tissue level responses to radiation damage. There is strong evidence that radiation initially deposits energy in biological molecules (e.g., DNA) in a linear fashion, and that this energy deposition results in various forms of prompt DNA damage that may be produced in a pattern that is distinct from endogenous (e.g., oxidative) damage. However, a large and rapidly growing body of radiobiological evidence indicates that cell and tissue level responses to this damage, particularly at low doses and/or dose-rates, are nonlinear and may exhibit thresholds. To the extent that responses observed at lower levels of biological organization in vitro are predictive of carcinogenesis observed in vivo, this evidence directly contradicts the assumptions upon which the microdosimetric argument is based.

  13. Validity of the linear no-threshold theory of radiation carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1999-01-01

    A great deal is known about the cancer risk of high radiation doses from studies of Japanese A-bomb survivors, patients exposed for medical therapy, occupational exposures, etc. But the vast majority of important applications deal with much lower doses, usually accumulated at much lower dose rates, referred to as 'low-level radiation' (LLR). Conventionally, the cancer risk from LLR has been estimated by the use of linear no-threshold theory (LNT). For example, it is assumed that the cancer risk from 0.001 Sv (100 mrem) of dose is 0.001 times the risk from 1 Sv (100 rem). In recent years, these risk estimates have often been reduced by a 'dose and dose rate reduction factor', which is taken to be a factor of 2. But otherwise, the LNT is frequently used for doses as low as one hundred-thousandth of those for which there is direct evidence of cancer induction by radiation. It is the origin of the commonly used expression 'no level of radiation is safe' and the consequent public fear of LLR. The importance of this use of the LNT cannot be exaggerated; it underlies many applications in the nuclear industry. The LNT paradigm has also been carried over to chemical carcinogens, leading to severe restrictions on the use of cleaning fluids, organic chemicals, pesticides, etc. If the LNT were abandoned for radiation, it would probably also be abandoned for chemical carcinogens. In view of these facts, it is important to consider the validity of the LNT. That is the purpose of this paper. (author)
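    The LNT arithmetic the abstract describes, strict proportionality between dose and excess risk, optionally reduced by a dose and dose-rate reduction factor of 2, can be written out in a few lines. The risk coefficient below is an illustrative placeholder, not a value from the paper:

```python
def lnt_excess_risk(dose_sv, risk_per_sv=0.05, ddref=2.0):
    """Excess cancer risk under the LNT assumption.

    risk_per_sv: nominal high-dose risk coefficient (illustrative value only);
    ddref: dose and dose-rate reduction factor, taken as 2 as in the text.
    """
    return dose_sv * risk_per_sv / ddref

# Proportionality: 0.001 Sv (100 mrem) carries exactly 0.001 times the
# risk attributed to 1 Sv (100 rem), whatever coefficient is assumed.
ratio = lnt_excess_risk(0.001) / lnt_excess_risk(1.0)
print(ratio)  # 0.001
```

The point of the sketch is that under LNT the ratio of risks depends only on the ratio of doses, which is exactly why the model extrapolates unchanged down to arbitrarily small doses.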

  14. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 1. The Russell-Muller debate

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu

    2017-04-15

    This paper assesses the discovery of the dose-rate effect in radiation genetics and how it challenged fundamental tenets of the linear non-threshold (LNT) dose response model, including the assumptions that all mutational damage is cumulative and irreversible and that the dose-response is linear at low doses. Newly uncovered historical information also describes how a key 1964 report by the International Commission for Radiological Protection (ICRP) addressed the effects of dose rate in the assessment of genetic risk. This unique story involves assessments by two leading radiation geneticists, Hermann J. Muller and William L. Russell, who independently argued that the report's Genetic Summary Section on dose rate was incorrect while simultaneously offering vastly different views as to what the report's summary should have contained. This paper reveals occurrences of scientific disagreements, how conflicts were resolved, which view(s) prevailed and why. During this process the Nobel Laureate, Muller, provided incorrect information to the ICRP in what appears to have been an attempt to manipulate the decision-making process and to prevent the dose-rate concept from being adopted into risk assessment practices. - Highlights: • The discovery of radiation dose rate challenged the scientific basis of LNT. • Radiation dose rate occurred in males and females. • The dose rate concept supported a threshold dose-response for radiation.

  15. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 1. The Russell-Muller debate

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2017-01-01

    This paper assesses the discovery of the dose-rate effect in radiation genetics and how it challenged fundamental tenets of the linear non-threshold (LNT) dose response model, including the assumptions that all mutational damage is cumulative and irreversible and that the dose-response is linear at low doses. Newly uncovered historical information also describes how a key 1964 report by the International Commission for Radiological Protection (ICRP) addressed the effects of dose rate in the assessment of genetic risk. This unique story involves assessments by two leading radiation geneticists, Hermann J. Muller and William L. Russell, who independently argued that the report's Genetic Summary Section on dose rate was incorrect while simultaneously offering vastly different views as to what the report's summary should have contained. This paper reveals occurrences of scientific disagreements, how conflicts were resolved, which view(s) prevailed and why. During this process the Nobel Laureate, Muller, provided incorrect information to the ICRP in what appears to have been an attempt to manipulate the decision-making process and to prevent the dose-rate concept from being adopted into risk assessment practices. - Highlights: • The discovery of radiation dose rate challenged the scientific basis of LNT. • Radiation dose rate occurred in males and females. • The dose rate concept supported a threshold dose-response for radiation.

  16. The linear nonthreshold (LNT) model as used in radiation protection: an NCRP update.

    Science.gov (United States)

    Boice, John D

    2017-10-01

    The linear nonthreshold (LNT) model has been used in radiation protection for over 40 years and has been hotly debated. It relies heavily on human epidemiology, with support from radiobiology. The scientific underpinnings include NCRP Report No. 136 ('Evaluation of the Linear-Nonthreshold Dose-Response Model for Ionizing Radiation'), UNSCEAR 2000, ICRP Publication 99 (2004) and the National Academies BEIR VII Report (2006). NCRP Scientific Committee 1-25 is reviewing recent epidemiologic studies focusing on dose-response models, including threshold, and the relevance to radiation protection. Recent studies after the BEIR VII Report are being critically reviewed and include atomic-bomb survivors, Mayak workers, atomic veterans, populations on the Techa River, U.S. radiological technologists, the U.S. Million Person Study, international workers (INWORKS), Chernobyl cleanup workers, children given computerized tomography scans, and tuberculosis-fluoroscopy patients. Methodologic limitations, dose uncertainties and statistical approaches (and modeling assumptions) are being systematically evaluated. The review of studies continues and will be published as an NCRP commentary in 2017. Most studies reviewed to date are consistent with a straight-line dose response, but there are a few exceptions. In the past, the scientific consensus process has worked in providing practical and prudent guidance, so pragmatic judgment is anticipated. The evaluations are ongoing and the extensive NCRP review process has just begun, so no decisions or recommendations are set in stone. The march of science requires a constant assessment of emerging evidence to provide an optimum, though not necessarily perfect, approach to radiation protection. Alternatives to the LNT model may be forthcoming, e.g. an approach that couples the best epidemiology with biologically-based models of carcinogenesis, focusing on chronic (not acute) exposure circumstances. Currently for the practical purposes of

  17. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations)

    International Nuclear Information System (INIS)

    Beyea, Jan

    2017-01-01

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. - Highlights: • Edward J Calabrese has made a contentious challenge to mainstream radiobiological science. • Such challenges should not be neglected, lest they enter the political arena without review. • Key genetic studies from the 1940s, challenged by Calabrese, were

  18. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations)

    Energy Technology Data Exchange (ETDEWEB)

    Beyea, Jan, E-mail: jbeyea@cipi.com

    2017-04-15

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. - Highlights: • Edward J Calabrese has made a contentious challenge to mainstream radiobiological science. • Such challenges should not be neglected, lest they enter the political arena without review. • Key genetic studies from the 1940s, challenged by Calabrese, were

  19. The risk of low doses of ionising radiation and the linear no threshold relationship debate

    International Nuclear Information System (INIS)

    Tubiana, M.; Masse, R.; Vathaire, F. de; Averbeck, D.; Aurengo, A.

    2007-01-01

    The ICRP and the BEIR VII reports recommend a linear no-threshold (LNT) relationship for estimating the excess cancer risk induced by ionising radiation (IR), but the 2005 report of the French Academies of Medicine and Science concludes that this leads to an overestimation of risk at low and very low doses. The bases of the LNT are challenged by recent biological and animal experimental studies which show that the defence against IR involves the cell microenvironment and the immune system. The defence mechanisms against low doses are different from, and comparatively more effective than, those against high doses. Against low doses, cell death is predominant; against high doses, DNA repair is activated in order to preserve tissue function. These mechanisms provide multicellular organisms with an effective and low-cost defence system. The differences between low- and high-dose defence mechanisms are obvious for alpha emitters, which show threshold effects at doses of several gray. These differences undermine epidemiological studies which, for statistical power purposes, amalgamate high- and low-dose exposure data, since doing so implies that cancer induction by IR and the defence mechanisms are similar in both cases. Risk estimates for low IR doses should rely on specific epidemiological studies restricted to low-dose exposures and taking potential confounding factors precisely into account. A preliminary synthesis of cohort studies for which low-dose data (< 100 mSv) were available shows no significant excess risk, either for solid cancers or for leukaemias. (authors)

  20. Social psychological approach to the problem of threshold

    International Nuclear Information System (INIS)

    Nakayachi, Kazuya

    1999-01-01

    This paper discusses the threshold of carcinogen risk from the viewpoint of social psychology. First, the results of a survey suggesting that renunciation of the Linear No-Threshold (LNT) hypothesis would have no influence on the public acceptance (PA) of nuclear power plants are reported. Second, the relationship between the adoption of the LNT hypothesis and the standardization of management for various risks are discussed. (author)

  1. Problems in the radon versus lung cancer test of the linear no-threshold theory and a procedure for resolving them

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1996-01-01

    It has been shown that lung cancer rates in U.S. counties, with or without correction for smoking, decrease with increasing radon exposure, in sharp contrast to the increase predicted by the linear no-threshold (LNT) theory. The discrepancy is by 20 standard deviations, and very extensive efforts to explain it were not successful. Unless a plausible explanation for this discrepancy (or conflicting evidence) can be found, continued use of the LNT theory is a violation of "the scientific method." Nevertheless, LNT continues to be accepted and used by all official and governmental organizations, such as the International Commission on Radiological Protection, the National Council on Radiation Protection and Measurements, the National Academy of Sciences Board on Radiation Effects Research, the U.S. Nuclear Regulatory Commission, the Environmental Protection Agency, etc., and there has been no move by any of these bodies to discontinue or limit its use. Assuming that they rely on the scientific method, this clearly implies that they have a plausible explanation for the discrepancy. The author has made great efforts to discover these 'plausible explanations' by inquiries through various channels, and the purpose of this paper is to describe and discuss them.

  2. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations).

    Science.gov (United States)

    Beyea, Jan

    2017-04-01

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. LNT-an apparent rather than a real controversy?

    Energy Technology Data Exchange (ETDEWEB)

    Charles, M W [School of Physics and Astronomy, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom)

    2006-09-15

    Can the carcinogenic risks of radiation that are observed at high doses be extrapolated to low doses? This question has been debated through the whole professional life of the author-now nearing four decades. In its extreme form the question relates to a particular hypothesis (LNT) used widely by the international community for radiological protection applications. The linear no-threshold (LNT) hypothesis propounds that the extrapolation is linear and that it extends down to zero dose. The debate on the validity of LNT has increased dramatically in recent years. This is in no small part due to concern that exaggerated risk estimates at low doses lead to undue amounts of societal resources being used to reduce man-made human exposure and because of the related growing public aversion to diagnostic and therapeutic medical exposures. The debate appears to be entering a new phase. There is a growing realisation of the limitations of fundamental data and the scientific approach to address this question at low doses. There also appears to be an increasing awareness that the assumptions necessary for a workable and acceptable system of radiological protection at low doses must necessarily be based on considerable pragmatism. Recent developments are reviewed and a historical perspective is given on the general nature of controversies in radiation protection over the years. All the protagonists in the debate will at the end of the day probably be able to claim that they were right! (opinion)

  4. Radiation, ecology and the invalid LNT model: the evolutionary imperative.

    Science.gov (United States)

    Parsons, Peter A

    2006-09-27

    Metabolic and energetic efficiency, and hence fitness of organisms to survive, should be maximal in their habitats. This tenet of evolutionary biology invalidates the linear no-threshold (LNT) model for the risk consequences of environmental agents. Hormesis in response to selection for maximum metabolic and energetic efficiency, or minimum metabolic imbalance, to adapt to a stressed world dominated by oxidative stress should therefore be universal. Radiation hormetic zones extending substantially beyond common background levels can be explained by metabolic interactions among multiple abiotic stresses. Demographic and experimental data are mainly in accord with this expectation. Therefore, non-linearity becomes the primary model for assessing risks from low-dose ionizing radiation. This is the evolutionary imperative upon which risk assessment for radiation should be based.

  5. An experimental test of the linear no-threshold theory of radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1990-01-01

    There is a substantial body of quantitative information on radiation-induced cancer at high dose, but there are no data at low dose. The usual method for estimating effects of low-level radiation is to assume a linear no-threshold dependence. If this linear no-threshold assumption were not used, essentially all fears about radiation would disappear. Since these fears are costing tens of billions of dollars, it is most important that the linear no-threshold theory be tested at low dose. An opportunity for possibly testing the linear no-threshold concept is now available at low dose due to radon in homes. The purpose of this paper is to attempt to use these data to test the linear no-threshold theory.

  6. Observations on the Chernobyl Disaster and LNT.

    Science.gov (United States)

    Jaworowski, Zbigniew

    2010-01-28

    The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold hypothesis (LNT) assumption. It was a historic event that provided invaluable lessons for nuclear industry and risk philosophy. One of them is the demonstration that, counted per unit of electricity produced, early Chernobyl fatalities amounted to 0.86 deaths/GWe-year, 47 times lower than from hydroelectric stations (approximately 40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit in solid cancer incidence among the population of the most contaminated areas.

  7. Observations on the Chernobyl Disaster and LNT

    Science.gov (United States)

    Jaworowski, Zbigniew

    2010-01-01

    The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold hypothesis (LNT) assumption. It was a historic event that provided invaluable lessons for nuclear industry and risk philosophy. One of them is the demonstration that, counted per unit of electricity produced, early Chernobyl fatalities amounted to 0.86 deaths/GWe-year, 47 times lower than from hydroelectric stations (∼40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit in solid cancer incidence among the population of the most contaminated areas. PMID:20585443
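    The per-unit-energy comparison quoted in the two records above is simple arithmetic. The sketch below reproduces it; the cumulative output of about 36 GWe-year is an assumed figure chosen for illustration so that 31 early deaths yield the quoted 0.86 deaths/GWe-year, and is not stated in the abstract.

    ```python
    # Hedged sketch of the abstract's deaths-per-unit-energy comparison.
    # The ~36 GWe-year cumulative output is an illustrative assumption,
    # back-calculated from the quoted 0.86 deaths/GWe-year figure.
    early_deaths = 31
    assumed_output_gwe_yr = 36.0          # assumption, not from the abstract

    chernobyl_rate = early_deaths / assumed_output_gwe_yr  # deaths per GWe-year
    hydro_rate = 40.0                     # quoted ~40 deaths/GWe-year for hydro

    ratio = hydro_rate / chernobyl_rate
    print(f"Chernobyl: {chernobyl_rate:.2f} deaths/GWe-year")
    print(f"Hydro rate is about {ratio:.0f}x higher")  # abstract quotes ~47x
    ```

    With these assumed inputs the ratio comes out near the abstract's "47 times lower"; small differences reflect rounding in the quoted 0.86 figure.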

  8. Molecular biology, epidemiology, and the demise of the linear no-threshold hypothesis

    International Nuclear Information System (INIS)

    Pollycove, M.

    1998-01-01

    The LNT hypothesis is the basic principle of all radiation protection policy. This theory assumes that all radiation doses, even those close to zero, are harmful in linear proportion to dose and that all doses produce a proportionate number of harmful mutations, i.e., mis- or unrepaired DNA alterations. The LNT theory is used to generate collective dose calculations of the number of deaths produced by minute fractions of background radiation. Current molecular biology reveals an enormous amount of relentless metabolic oxidative free radical damage with mis/unrepaired alterations of DNA. The corresponding mis/unrepaired DNA alterations produced by background radiation are negligible. These DNA alterations are effectively disposed of by the DNA damage-control biosystem of antioxidant prevention, enzymatic repair, and mutation removal. High-dose radiation injures this biosystem with associated risk increments of mortality and cancer mortality. Low-dose radiation stimulates DNA damage-control with associated epidemiologic observations of risk decrements of mortality and cancer mortality, i.e., hormesis. How can this 40-year-old LNT paradigm continue to be the operative principle of radiation protection policy despite the contradictory scientific observations of both molecular biology and epidemiology and the lack of any supportive human data? The increase of public fear through repeated statements of deaths caused by 'deadly' radiation has engendered an enormous increase in expenditures now required to 'protect' the public from all applications of nuclear technology: medical, research, energy, disposal, and cleanup remediation. Government funds are allocated to appointed committees, the research they support, and to multiple environmental and regulatory agencies. The LNT theory and multibillion dollar radiation activities have now become a symbiotic self-sustaining powerful political and economic force. (author)

  9. Validity of the linear no-threshold (LNT) hypothesis in setting radiation protection regulations for the inhabitants in high level natural radiation areas of Ramsar, Iran

    International Nuclear Information System (INIS)

    Mortazavi, S.M.J.; Atefi, M.; Razi, Z.; Mortazavi Gh

    2010-01-01

    Some areas in Ramsar, a city in northern Iran, have long been known as inhabited areas with the highest levels of natural radiation. Despite the fact that the health effects of high doses of ionizing radiation are well documented, biological effects of above-background levels of natural radiation are still controversial, and the validity of the LNT hypothesis in this area has been criticized by many investigators around the world. The study of the health effects of high levels of natural radiation in areas such as Ramsar helps scientists to investigate the biological effects without the need for extrapolating the observations either from high doses of radiation to the low dose region or from laboratory animals to humans. Considering the importance of these studies, the National Radiation Protection Department (NRPD) of the Iranian Nuclear Regulatory Authority has started an integrative research project on the health effects of long-term exposure to high levels of natural radiation. This paper reviews findings of the studies conducted on the plants and humans living, or laboratory animals kept, in high level natural radiation areas of Ramsar. In human studies, different end points such as DNA damage, chromosome aberrations, blood cells and immunological alterations are discussed. This review comes to the conclusion that no reproducible detrimental health effect has been reported so far. In this paper the validity of the LNT hypothesis in the assessment of the health effects of high levels of natural radiation is discussed. (author)

  10. Model Uncertainty via the Integration of Hormesis and LNT as the Default in Cancer Risk Assessment.

    Science.gov (United States)

    Calabrese, Edward J

    2015-01-01

    On June 23, 2015, the US Nuclear Regulatory Commission (NRC) issued a formal notice in the Federal Register that it would consider whether "it should amend its 'Standards for Protection Against Radiation' regulations from the linear non-threshold (LNT) model of radiation protection to the hormesis model." The present commentary supports this recommendation based on (1) the flawed and deceptive history of the adoption of LNT by the US National Academy of Sciences (NAS) in 1956; (2) the documented capacity of hormesis to make more accurate predictions of biological responses for diverse biological end points in the low-dose zone; (3) the occurrence of extensive hormetic data from the peer-reviewed biomedical literature that revealed hormetic responses are highly generalizable, being independent of biological model, end point measured, inducing agent, level of biological organization, and mechanism; and (4) the integration of hormesis and LNT models via a model uncertainty methodology that optimizes public health responses at 10^-4. Thus, both LNT and hormesis can be integratively used for risk assessment purposes, and this integration defines the so-called "regulatory sweet spot."

  11. Linear versus non-linear: a perspective from health physics and radiobiology

    International Nuclear Information System (INIS)

    Gentner, N.E.; Osborne, R.V.

    1998-01-01

    There is a vigorous debate about whether or not there may be a 'threshold' for radiation-induced adverse health effects. A linear-no threshold (LNT) model allows radiation protection practitioners to manage putative risk consistently, because different types of exposure, exposures at different times, and exposures to different organs may be summed. If we are to argue to regulators and the public that low doses are less dangerous than we presently assume, it is incumbent on us to prove this. The question is, therefore, whether any consonant body of evidence exists that the risk of low doses has been over-estimated. From the perspectives of both health physics and radiobiology, we conclude that the evidence for linearity at high doses (and arguably of fairly small total doses if delivered at high dose rate) is strong. For low doses (or in fact, even for fairly high doses) delivered at low dose rate, the evidence is much less compelling. Since statistical limitations at low doses are almost always going to prevent a definitive answer, one way or the other, from human data, we need a way out of this epistemological dilemma of 'LNT or not LNT, that is the question'. To our minds, the path forward is to exploit (1) radiobiological studies which address directly the question of what the dose and dose rate effectiveness factor is in actual human bodies exposed to low-level radiation, in concert with (2) epidemiological studies of human populations exposed to fairly high doses (to obtain statistical power) but where exposure was protracted over some years. (author)

  12. Whither LNT?

    International Nuclear Information System (INIS)

    Higson, D.J.

    2015-01-01

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  13. Whither LNT?

    International Nuclear Information System (INIS)

    Higson, D.J.

    2014-01-01

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  14. Whither LNT?

    Energy Technology Data Exchange (ETDEWEB)

    Higson, D.J.

    2015-03-15

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  15. Whither LNT?

    Energy Technology Data Exchange (ETDEWEB)

    Higson, D.J. [Australian Nuclear Association, Paddington, NSW (Australia)

    2014-07-01

    UNSCEAR and ICRP have reported that no health effects have been attributed to radiation exposure at Fukushima. As at Chernobyl, however, fear that there is no safe dose of radiation has caused enormous psychological damage to health; and evacuation to protect the public from exposure to radiation appears to have done more harm than good. UNSCEAR and ICRP both stress that collective doses, aggregated from the exposure of large numbers of individuals to very low doses, should not be used to estimate numbers of radiation-induced health effects. This is incompatible with the LNT assumption recommended by the ICRP. (author)

  16. Test of the linear-no threshold theory of radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1994-01-01

    We recently completed a compilation of radon measurements from available sources which gives the average radon level in homes for 1,730 counties, well over half of all U.S. counties, comprising about 90% of the total U.S. population. Epidemiologists normally study the relationship between mortality risks to individuals, m, vs their personal exposure, r, whereas an ecological study like ours deals with the relationship between the average risk to groups of individuals (populations of counties) and their average exposure. It is well known to epidemiologists that, in general, the average dose does not determine the average risk, and to assume otherwise is called 'the ecological fallacy'. However, it is easy to show that, in testing a linear-no threshold theory, 'the ecological fallacy' does not apply; in that theory, the average dose does determine the average risk. This is widely recognized from the fact that 'person-rem' determines the number of deaths. Dividing person-rem by population gives average dose, and dividing number of deaths by population gives mortality rate. Because of the 'ecological fallacy', epidemiology textbooks often state that an ecological study cannot determine a causal relationship between risk and exposure. That may be true, but it is irrelevant here because the purpose of our study is not to determine a causal relationship; it is rather to test the linear-no threshold dependence of m on r. (author)
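    The statistical point in the record above — that under a strictly linear model the mean dose determines the mean risk, while under a non-linear (e.g. threshold) model it does not — can be demonstrated with a minimal simulation. All numbers and the threshold value below are illustrative assumptions, not data from the study.

    ```python
    # Sketch: why an ecological (group-average) comparison is a valid test of
    # a linear no-threshold model. If risk(d) = a*d, mean risk depends only on
    # mean dose; a non-linear model can give different mean risks for the same
    # mean dose. All numbers here are illustrative assumptions.

    def mean(xs):
        return sum(xs) / len(xs)

    a = 1e-3                                       # assumed slope (risk per dose unit)
    linear = lambda d: a * d                       # LNT-style response
    threshold = lambda d: a * max(0.0, d - 50.0)   # assumed threshold at dose 50

    county_A = [10, 90, 110, 190]    # wide spread of individual doses, mean 100
    county_B = [100, 100, 100, 100]  # uniform doses, same mean 100

    for risk, name in [(linear, "linear"), (threshold, "threshold")]:
        rA = mean([risk(d) for d in county_A])
        rB = mean([risk(d) for d in county_B])
        print(f"{name}: mean risk A={rA:.4f}, B={rB:.4f}")
    ```

    The linear model gives identical mean risks for both counties (only the mean dose matters), whereas the threshold model gives different mean risks despite equal mean doses — which is exactly why the 'ecological fallacy' objection does not apply when the hypothesis under test is linearity itself.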

  17. A test of the linear-no threshold theory of radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1990-01-01

    It has been pointed out that, while an ecological study cannot determine whether radon causes lung cancer, it can test the validity of a linear-no threshold relationship between them. The linear-no threshold theory predicts a substantial positive correlation between the average radon exposure in various counties and their lung cancer mortality rates. Data on living areas of houses in 411 counties from all parts of the United States exhibit, rather, a substantial negative correlation, with the slopes of the lines of regression differing from zero by 10 and 7 standard deviations for males and females, respectively, and from the positive slope predicted by the theory by at least 16 and 12 standard deviations. When the data are segmented into 23 groups of states or into 7 regions of the country, the predominantly negative slopes and correlations persist, applying to 18 of the 23 state groups and 6 of the 7 regions. Five state-sponsored studies are analyzed, and four of these give a strong negative slope (the other gives a weak positive slope, in agreement with our data for that state). A strong negative slope is also obtained in our data on basements in 253 counties. A random selection-no charge study of 39 high and low lung cancer counties (+4 low population states) gives a much stronger negative correlation. When nine potential confounding factors are included in a multiple linear regression analysis, the discrepancy with theory is reduced only to 12 and 8.5 standard deviations for males and females, respectively. When the data are segmented into four groups by population, the multiple regression vs radon level gives a strong negative slope for each of the four groups. Other considerations are introduced to reduce the discrepancy, but it remains very substantial.

  18. Pros and cons of the revolution in radiation protection

    International Nuclear Information System (INIS)

    Latek, Stanislav

    2001-01-01

    In 1959, the International Commission on Radiological Protection (ICRP) chose the LNT (Linear No-Threshold) model as an assumption to form the basis for regulating radiation protection. During the 1999 UNSCEAR session, held in April in Vienna, the linear no-threshold (LNT) hypothesis was discussed. Among other LNT-related subjects, the Committee discussed the problem of collective dose and dose commitment. These concepts were introduced in the early 1960s as the offspring of the linear no-threshold assumption. At the time they reflected a deep concern about the induction of hereditary effects by nuclear test fallout. Almost four decades later, collective dose and dose commitment are still widely used, although by now both the concepts and the concern should have faded into oblivion. It seems that the principles and concepts of radiation protection have gone astray and have led to exceedingly prohibitive standards and impractical recommendations. Revision of these principles and concepts is now being proposed by an increasing number of scientists and several organisations.

  19. A practical threshold concept for simple and reasonable radiation protection

    International Nuclear Information System (INIS)

    Kaneko, Masahito

    2002-01-01

    A half century ago it was assumed for the purpose of protection that radiation risks are linearly proportional at all levels of dose. Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public, while it has brought about 'radiophobia' and unnecessary over-regulation. Now that the existence of bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response are well recognized, the linearity assumption can be said 'unscientific'. Evidences increasingly imply that there are threshold effects in risk of radiation. A concept of 'practical' thresholds is proposed and the classification of 'stochastic' and 'deterministic' radiation effects should be abandoned. 'Practical' thresholds are dose levels below which induction of detectable radiogenic cancers or hereditary effects are not expected. There seems to be no evidence of deleterious health effects from radiation exposures at the current dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which have been adopted worldwide in the latter half of the 20th century. Those limits are assumed to have been set below certain 'practical' thresholds. As any workers and members of the public do not gain benefits from being exposed, excepting intentional irradiation for medical purposes, their radiation exposures should be kept below 'practical' thresholds. There is no use of 'justification' and 'optimization' (ALARA) principles, because there are no 'radiation detriments' as far as exposures are maintained below 'practical' thresholds. Accordingly the ethical issue of 'justification' to allow benefit to society to offset radiation detriments to individuals can be resolved. And also the ethical issue of 'optimization' to exchange health or safety for economical gain can be resolved. The ALARA principle should be applied to the probability (risk) of exceeding relevant dose limits instead of applying to normal exposures

  20. Test of the linear-no threshold theory of radiation carcinogenesis for inhaled radon decay products

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1995-01-01

    Data on lung cancer mortality rates vs. average radon concentration in homes for 1,601 U.S. counties are used to test the linear-no threshold theory. The widely recognized problems with ecological studies, as applied to this work, are addressed extensively. With or without corrections for variations in smoking prevalence, there is a strong tendency for lung cancer rates to decrease with increasing radon exposure, in sharp contrast to the increase expected from the theory. The discrepancy in slope is about 20 standard deviations. It is shown that uncertainties in lung cancer rates, radon exposures, and smoking prevalence are not important and that confounding by 54 socioeconomic factors, by geography, and by altitude and climate can explain only a small fraction of the discrepancy. Effects of known radon-smoking prevalence correlations - rural people have higher radon levels and smoke less than urban people, and smokers are exposed to less radon than non-smokers - are calculated and found to be trivial. In spite of extensive efforts, no potential explanation for the discrepancy other than failure of the linear-no threshold theory for carcinogenesis from inhaled radon decay products could be found. (author)

  1. Radiation, Ecology and the Invalid LNT Model: The Evolutionary Imperative

    OpenAIRE

    Parsons, Peter A.

    2006-01-01

    Metabolic and energetic efficiency, and hence fitness of organisms to survive, should be maximal in their habitats. This tenet of evolutionary biology invalidates the linear no-threshold (LNT) model for the risk consequences of environmental agents. Hormesis in response to selection for maximum metabolic and energetic efficiency, or minimum metabolic imbalance, to adapt to a stressed world dominated by oxidative stress should therefore be universal. Radiation hormetic zones extending substanti...

  2. Do non-targeted effects increase or decrease low dose risk in relation to the linear-non-threshold (LNT) model?

    International Nuclear Information System (INIS)

    Little, M.P.

    2010-01-01

    In this paper we review the evidence for departure from linearity for malignant and non-malignant disease and in the light of this assess likely mechanisms, and in particular the potential role for non-targeted effects. Excess cancer risks observed in the Japanese atomic bomb survivors and in many medically and occupationally exposed groups exposed at low or moderate doses are generally statistically compatible. For most cancer sites the dose-response in these groups is compatible with linearity over the range observed. The available data on biological mechanisms do not provide general support for the idea of a low dose threshold or hormesis. This large body of evidence does not suggest, indeed is not statistically compatible with, any very large threshold in dose for cancer, or with possible hormetic effects, and there is little evidence of the sorts of non-linearity in response implied by non-DNA-targeted effects. There are also excess risks of various types of non-malignant disease in the Japanese atomic bomb survivors and in other groups. In particular, elevated risks of cardiovascular disease, respiratory disease and digestive disease are observed in the A-bomb data. In contrast with cancer, there is much less consistency in the patterns of risk between the various exposed groups; for example, radiation-associated respiratory and digestive diseases have not been seen in these other (non-A-bomb) groups. Cardiovascular risks have been seen in many exposed populations, particularly in medically exposed groups, but in contrast with cancer there is much less consistency in risk between studies: risks per unit dose in epidemiological studies vary over at least two orders of magnitude, possibly a result of confounding and effect modification by well known (but unobserved) risk factors. In the absence of a convincing mechanistic explanation of epidemiological evidence that is, at present, less than persuasive, a cause-and-effect interpretation of the reported

  3. The brittle basis of linearity

    International Nuclear Information System (INIS)

    Roth, E.

    1997-01-01

    The LNT-theory of cancer generation by ionizing radiation is commonly vindicated by 3 arguments: The stochastic character of irradiation hits to cells, the monoclonality of cancer generation, and the error proneness of DNA-repair. It is shown that this conclusion is logically inadmissible. Equally, the rescuing attempts tried by some LNT-supporters are not successful. It contradicts the laws of thinking to exclude threshold and hormesis in this way. (author)

  4. Epidemiology Without Biology: False Paradigms, Unfounded Assumptions, and Specious Statistics in Radiation Science (with Commentaries by Inge Schmitz-Feuerhake and Christopher Busby and a Reply by the Authors)

    OpenAIRE

    Sacks, Bill; Meyerson, Gregory; Siegel, Jeffry A.

    2016-01-01

    Radiation science is dominated by a paradigm based on an assumption without empirical foundation. Known as the linear no-threshold (LNT) hypothesis, it holds that all ionizing radiation is harmful no matter how low the dose or dose rate. Epidemiological studies that claim to confirm LNT either neglect experimental and/or observational discoveries at the cellular, tissue, and organismal levels, or mention them only to distort or dismiss them. The appearance of validity in these studies rests o...

  5. Multi-stratified multiple regression tests of the linear/no-threshold theory of radon-induced lung cancer

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1992-01-01

    A plot of lung-cancer rates versus radon exposures in 965 US counties, or in all US states, has a strong negative slope, b, in sharp contrast to the strong positive slope predicted by linear/no-threshold theory. The discrepancy between these slopes exceeds 20 standard deviations (SD). Including smoking frequency in the analysis substantially improves fits to a linear relationship but has little effect on the discrepancy in b, because correlations between smoking frequency and radon levels are quite weak. Including 17 socioeconomic variables (SEV) in multiple regression analysis reduces the discrepancy to 15 SD. Data were divided into segments by stratifying on each SEV in turn, and on geography, and on both simultaneously, giving over 300 data sets to be analyzed individually, but negative slopes predominated. The slope is negative whether one considers only the most urban counties or only the most rural; only the richest or only the poorest; only the richest in the South Atlantic region or only the poorest in that region; and for all the strata in between. Since this is an ecological study, the well-known problems with ecological studies were investigated and found not to be applicable here. The 'ecological fallacy' was shown not to apply in testing a linear/no-threshold theory, and the vulnerability to confounding is greatly reduced when confounding factors are only weakly correlated with radon levels, as is generally the case here. All confounding factors known to correlate with radon and with lung cancer were investigated quantitatively and found to have little effect on the discrepancy.
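    The "discrepancy of N standard deviations" that recurs in these records is a z-statistic comparing the fitted regression slope with the slope the theory predicts. A minimal sketch, with purely illustrative slope and standard-error values (not the study's actual estimates):

    ```python
    # Hedged sketch: the "discrepancy in standard deviations" quoted in the
    # abstracts is z = (b_predicted - b_observed) / SE(b_observed).
    # The numeric inputs below are illustrative assumptions, not study values.

    def slope_discrepancy(b_obs, se_obs, b_pred):
        """Distance between observed and theory-predicted slopes, in SE units."""
        return (b_pred - b_obs) / se_obs

    # e.g. a negative fitted slope against a positive predicted one
    z = slope_discrepancy(b_obs=-7.3, se_obs=0.56, b_pred=4.0)
    print(f"discrepancy: {z:.1f} standard deviations")  # ~20 SD with these inputs
    ```

    A large z means the observed slope is far outside the sampling uncertainty of the predicted one; the sign disagreement (negative observed vs positive predicted) is what drives the large discrepancies reported here.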

  6. Non-Linear Adaptive Phenomena Which Decrease The Risk of Infection After Pre-Exposure to Radiofrequency Radiation

    OpenAIRE

    Mortazavi, S.M.J.; Motamedifar, M.; Namdari, G.; Taheri, M.; Mortazavi, A.R.; Shokrpour, N.

    2013-01-01

    Substantial evidence indicates that adaptive response induced by low doses of ionizing radiation can result in resistance to the damage caused by a subsequently high-dose radiation or cause cross-resistance to other non-radiation stressors. Adaptive response contradicts the linear-non-threshold (LNT) dose-response model for ionizing radiation. We have previously reported that exposure of laboratory animals to radiofrequency radiation can induce a survival adaptive response. Furthermore, we ha...

  7. Putting aside the LNT dilemma in the controllable dose concept

    International Nuclear Information System (INIS)

    Koblinger, Laszlo

    2000-01-01

    Recently, Professor R. Clarke, ICRP Chairman, published his proposal for a renewal of the basic radiation protection concept. The two main points of his proposed system are: (1) the term Controllable Dose is introduced, and (2) the protection philosophy is based on the individual. For practical use, terms like 'Action Level', 'Investigation Level', etc. are introduced. The outline of the new system promises a considerably less complex framework: no distinction between practices and interventions, and unified treatment of occupational, medical and public exposures. There is, however, an inconsistency within the new system: though linearity is not assumed, the relations between the definitions of the new terms of the system of protection and the doses assigned to them are still based on the LNT hypothesis. To avoid this discrepancy, a new definition of Action Level is recommended as a conservative estimate of the lowest dose at which harmful effects have ever been demonstrated. Other levels should be defined from the Action Level and Safety Factors applied to the doses. (author)

  8. Thresholding projection estimators in functional linear models

    OpenAIRE

    Cardot, Hervé; Johannes, Jan

    2010-01-01

    We consider the problem of estimating the regression function in functional linear regression models by proposing a new type of projection estimator which combines dimension reduction and thresholding. The introduction of a threshold rule allows one to obtain consistency under broad assumptions as well as minimax rates of convergence under additional regularity hypotheses. We also consider the particular case of Sobolev spaces generated by the trigonometric basis, which permits one to easily obtain mean squ...

  9. Topics on study of low dose-effect relationship

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Takeshi [Toho Univ., School of Medicine, Tokyo (Japan); Ohyama, Harumi

    1999-09-01

    It is not exceptional but rather common that a dose-effect relationship in a biosystem is not linear. Sometimes the low-dose effect appears entirely contrary to what would be expected from the high-dose effect; this is called a 'hormesis' phenomenon. A high-dose irradiation certainly inflicts injury on a biosystem, and according to the Linear Non-Threshold (LNT) hypothesis an irradiation inflicts some injury no matter how low the dose may be. Contrary to that expectation, a low-dose irradiation stimulates the immune system and promotes cell proliferation. This is called 'radiation hormesis'. Studies of radiation hormesis are made from four points of view: (1) radiation adaptive response, (2) revitalization caused by a low-dose stimulation, (3) a low-dose response unexpected from the LNT hypothesis, (4) negation of the LNT hypothesis. Various empirical proofs of radiation hormesis are introduced in the report. (M. Suetake)

  10. Topics on study of low dose-effect relationship

    International Nuclear Information System (INIS)

    Yamada, Takeshi; Ohyama, Harumi

    1999-01-01

    It is not exceptional but rather common that a dose-effect relationship in a biosystem is not linear. Sometimes the low-dose effect appears entirely contrary to what would be expected from the high-dose effect; this is called a 'hormesis' phenomenon. A high-dose irradiation certainly inflicts injury on a biosystem, and according to the Linear Non-Threshold (LNT) hypothesis an irradiation inflicts some injury no matter how low the dose may be. Contrary to that expectation, a low-dose irradiation stimulates the immune system and promotes cell proliferation. This is called 'radiation hormesis'. Studies of radiation hormesis are made from four points of view: (1) radiation adaptive response, (2) revitalization caused by a low-dose stimulation, (3) a low-dose response unexpected from the LNT hypothesis, (4) negation of the LNT hypothesis. Various empirical proofs of radiation hormesis are introduced in the report. (M. Suetake)

  11. Non-Linearity of dose-effect relationship on the example of cytogenetic effects in plant cells at low level exposure to ionising radiation

    International Nuclear Information System (INIS)

    Oudalova, Alla; Geras'kin, Stanislav; Dikarev, Vladimir; Dikareva, Nina; Chernonog, Elena; Copplestone, David; Evseeva, Tatyana

    2006-01-01

    Over several decades, modelling the effects of ionizing radiation on biological systems has relied on the target principle [Timofeeff-Ressovsky et al., 1935], which assumes that cell damage or modification to genes appears as a direct consequence of the exposure of biological macromolecules to charged particles. Furthermore, it is assumed that there is no threshold for the induction of biological damage and that the effects observed are proportional to the energy absorbed. Following this principle, the average number of hits per target should increase linearly with dose, and the yield of mutations per unit of dose is assumed to be the same at both low and high doses (linearity of response). This principle has served as the scientific background for the linear no-threshold (LNT) concept that forms the basis of radiological protection for the public and the environment [ICRP, 1990]. It follows from the LNT that there is an additional risk to human health from exposure at any radiation level, even below natural background. Since the mid 1950s, however, the scientific basis of the LNT concept has been challenged as experimental data have shown a non-linear dose response at low doses. Luchnik and Timofeeff-Ressovsky were the first to show a non-linear response to a low-dose exposure [Luchnik, 1957; Timofeeff-Ressovsky and Luchnik, 1960]. Since then, many data have accumulated which contradict the LNT model at low doses and dose rates. However, the hit-effect paradigm has become such a strong and indissoluble fact that it has persisted even under the growing pressure of scientific evidence for phenomena at low-dose exposure that cannot be accounted for by the LNT concept. In recent years, additional information on non-targeted effects of radiation has accumulated following the first reports of an adaptive response in human lymphocytes [Olivieri et al., 1984] as well as a bystander mutagenic effect of alpha

  12. Non-Linearity of dose-effect relationship on the example of cytogenetic effects in plant cells at low level exposure to ionising radiation

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, Alla; Geras'kin, Stanislav; Dikarev, Vladimir; Dikareva, Nina; Chernonog, Elena [Russian Institute of Agricultural Radiology and Agroecology, RIARAE, 249032 Obninsk (Russian Federation); Copplestone, David [Environment Agency, Millbank Tower, 25th. Floor, 21/24 Millbank, London, SW1P 4XL (United Kingdom); Evseeva, Tatyana [Institute of Biology, Kommunisticheskaya st., 28 Syktyvkar 167610, Komi Republic (Russian Federation)

    2006-07-01

    Over several decades, modelling the effects of ionizing radiation on biological systems has relied on the target principle [Timofeeff-Ressovsky et al., 1935], which assumes that cell damage or modification to genes appears as a direct consequence of the exposure of biological macromolecules to charged particles. Furthermore, it is assumed that there is no threshold for the induction of biological damage and that the effects observed are proportional to the energy absorbed. Following this principle, the average number of hits per target should increase linearly with dose, and the yield of mutations per unit of dose is assumed to be the same at both low and high doses (linearity of response). This principle has served as the scientific background for the linear no-threshold (LNT) concept that forms the basis of radiological protection for the public and the environment [ICRP, 1990]. It follows from the LNT that there is an additional risk to human health from exposure at any radiation level, even below natural background. Since the mid 1950s, however, the scientific basis of the LNT concept has been challenged as experimental data have shown a non-linear dose response at low doses. Luchnik and Timofeeff-Ressovsky were the first to show a non-linear response to a low-dose exposure [Luchnik, 1957; Timofeeff-Ressovsky and Luchnik, 1960]. Since then, many data have accumulated which contradict the LNT model at low doses and dose rates. However, the hit-effect paradigm has become such a strong and indissoluble fact that it has persisted even under the growing pressure of scientific evidence for phenomena at low-dose exposure that cannot be accounted for by the LNT concept. In recent years, additional information on non-targeted effects of radiation has accumulated following the first reports of an adaptive response in human lymphocytes [Olivieri et al., 1984] as well as a bystander mutagenic effect of

  13. Permitted and forbidden sets in symmetric threshold-linear networks.

    Science.gov (United States)

    Hahnloser, Richard H R; Seung, H Sebastian; Slotine, Jean-Jacques

    2003-03-01

    The richness and complexity of recurrent cortical circuits is an inexhaustible source of inspiration for thinking about high-level biological computation. In past theoretical studies, constraints on the synaptic connection patterns of threshold-linear networks were found that guaranteed bounded network dynamics, convergence to attractive fixed points, and multistability, all fundamental aspects of cortical information processing. However, these conditions were only sufficient, and it remained unclear which were the minimal (necessary) conditions for convergence and multistability. We show that symmetric threshold-linear networks converge to a set of attractive fixed points if and only if the network matrix is copositive. Furthermore, the set of attractive fixed points is nonconnected (the network is multiattractive) if and only if the network matrix is not positive semidefinite. There are permitted sets of neurons that can be coactive at a stable steady state and forbidden sets that cannot. Permitted sets are clustered in the sense that subsets of permitted sets are permitted and supersets of forbidden sets are forbidden. By viewing permitted sets as memories stored in the synaptic connections, we provide a formulation of long-term memory that is more general than the traditional perspective of fixed-point attractor networks. There is a close correspondence between threshold-linear networks and networks defined by the generalized Lotka-Volterra equations.
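The dynamics discussed above can be explored numerically. A minimal sketch (the network size, weights, and Euler integration scheme are illustrative choices, not taken from the paper): the standard threshold-linear rate equation dx/dt = -x + [Wx + b]_+ with a symmetric W chosen so that I - W is positive definite, a sufficient condition for relaxation to a single attractive fixed point in which both neurons form a permitted coactive set.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, steps=20000):
    """Euler-integrate the threshold-linear dynamics dx/dt = -x + [W x + b]_+ ."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        x += dt * (-x + np.maximum(W @ x + b, 0.0))
    return x

# Symmetric mutual inhibition; I - W is positive definite here,
# which guarantees convergence to a unique fixed point.
W = np.array([[0.0, -0.5],
              [-0.5, 0.0]])
b = np.array([1.0, 1.0])

x_final = simulate_tln(W, b, x0=[2.0, 0.1])
# A fixed point satisfies x = [W x + b]_+ ; check the residual.
residual = float(np.linalg.norm(x_final - np.maximum(W @ x_final + b, 0.0)))
print(x_final, residual)  # both rates approach 2/3
```

Solving x = [Wx + b]_+ by hand for this W and b gives x1 = x2 = 2/3; with a stronger inhibitory weight the positive-semidefiniteness condition fails and the network instead becomes multiattractive, with one neuron suppressing the other.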

  14. Cancer and low dose responses In Vivo: implications for radiation protection

    Energy Technology Data Exchange (ETDEWEB)

    Mitchel, R.E.J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2006-12-15

    This paper discusses the linear no-threshold (LNT) hypothesis, risk prediction and radiation protection. The summary implications for the radiation protection system are that at low doses the conceptual basis of the present system appears to be incorrect. The belief that the current system embodies the precautionary principle and that the LNT assumption is cautious appears incorrect. The concept of dose additivity appears incorrect. Effective dose (sievert) and the weighting factors on which it is based appear to be invalid. There may be no constant and appropriate value of DDREF for radiological protection dosimetry. The use of dose as a predictor of risk needs to be re-examined. The use of dose limits as a means of limiting risk needs to be re-evaluated.

  15. Review of the controversy on risks from low levels of radiation

    International Nuclear Information System (INIS)

    Higson, D.

    2001-01-01

    The need for regulation of low levels of radiation exposure, and the estimation of risks from such exposures, are based on the assumption that risk is proportional to dose without a threshold, the 'linear no-threshold (LNT) hypothesis'. This assumption is not supported by scientific data. There is no clear evidence of harm from low levels of exposure, up to at least 20 mSv (acute dose) or total dose rates of at least 50 mSv per year. Even allowing for reasonable extrapolation from radiation levels at which harmful effects have been observed, the LNT assumption should not be used to estimate risks from doses less than 100 mSv. Laboratory and epidemiological evidence, and evolutionary expectations of biological effects from low level radiation, suggest that beneficial health effects (sometimes called 'radiation hormesis') are at least as likely as harmful effects from such exposures. Controversy on this matter strikes at the basis of radiation protection practice.

  16. Development of Optimal Catalyst Designs and Operating Strategies for Lean NOx Reduction in Coupled LNT-SCR Systems

    Energy Technology Data Exchange (ETDEWEB)

    Harold, Michael [Univ. of Houston, TX (United States); Crocker, Mark [Univ. of Kentucky, Lexington, KY (United States); Balakotaiah, Vemuri [Univ. of Houston, TX (United States); Luss, Dan [Univ. of Houston, TX (United States); Choi, Jae-Soon [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dearth, Mark [Ford Motor Company, Dearborn, MI (United States); McCabe, Bob [Ford Motor Company, Dearborn, MI (United States); Theis, Joe [Ford Motor Company, Dearborn, MI (United States)

    2013-09-30

    Oxides of nitrogen in the form of nitric oxide (NO) and nitrogen dioxide (NO2), commonly referred to as NOx, are one of the two chemical precursors that lead to ground-level ozone, a ubiquitous air pollutant in urban areas. A major source of NOx is equipment and vehicles powered by diesel engines, whose combustion exhaust contains NOx in the presence of excess O2. Catalytic abatement measures that are effective for gasoline-fueled engines, such as the precious-metal-containing three-way catalytic converter (TWC), cannot be used to treat O2-laden exhaust containing NOx. Two catalytic technologies that have emerged as effective for NOx abatement are NOx storage and reduction (NSR) and selective catalytic reduction (SCR). NSR is similar to TWC but requires much larger quantities of expensive precious metals and sophisticated periodic switching operation, while SCR requires an on-board source of ammonia which serves as the chemical reductant of the NOx. The fact that NSR produces ammonia as a byproduct while SCR requires ammonia to work has led to interest in combining the two together to avoid the need for the cumbersome ammonia generation system. In this project a comprehensive study was carried out of the fundamental aspects and application feasibility of combined NSR/SCR. The project team, which included university, industry, and national lab researchers, investigated the kinetics and mechanistic features of the underlying chemistry in the lean NOx trap (LNT) wherein NSR was carried out, with particular focus on identifying the operating conditions, such as temperature, and catalytic properties which lead to the production of ammonia in the LNT. The performance features of SCR on both model and commercial catalysts focused on the synergy between the LNT and SCR converters in terms of utilizing the upstream-generated ammonia and

  17. The potential for bias in Cohen's ecological analysis of lung cancer and residential radon

    International Nuclear Information System (INIS)

    Lubin, Jay H.

    2002-01-01

    Cohen's ecological analysis of US lung cancer mortality rates and mean county radon concentration shows decreasing mortality rates with increasing radon concentration (Cohen 1995 Health Phys. 68 157-74). The results prompted his rejection of the linear-no-threshold (LNT) model for radon and lung cancer. Although several authors have demonstrated that risk patterns in ecological analyses provide no inferential value for assessment of risk to individuals, Cohen advances two arguments in a recent response to Darby and Doll (2000 J. Radiol. Prot. 20 221-2) who suggest Cohen's results are and will always be burdened by the ecological fallacy. Cohen asserts that the ecological fallacy does not apply when testing the LNT model, for which average exposure determines average risk, and that the influence of confounding factors is obviated by the use of large numbers of stratification variables. These assertions are erroneous. Average dose determines average risk only for models which are linear in all covariates, in which case ecological analyses are valid. However, lung cancer risk and radon exposure, while linear in the relative risk, are not linearly related to the scale of absolute risk, and thus Cohen's rejection of the LNT model is based on a false premise of linearity. In addition, it is demonstrated that the deleterious association for radon and lung cancer observed in residential and miner studies is consistent with negative trends from ecological studies, of the type described by Cohen. (author)

  18. Limiting values for radioactive materials in food

    International Nuclear Information System (INIS)

    Steiner, Martin

    2014-01-01

    The contribution describes the fundamentals of radiation protection: the LNT (linear no-threshold) hypothesis, ALARA (as low as reasonably achievable), and limiting values. Using the example of the nuclear accident in Chernobyl, the differences in contamination development in different foodstuffs in Germany are demonstrated, including recommended limiting values and the radiation exposures after 30 years due to consumption of contaminated food. The natural radioactivity is about 0.3 mSv/year.

  19. Growth of non-toxigenic Clostridium botulinum mutant LNT01 in cooked beef: One-step kinetic analysis and comparison with C. sporogenes and C. perfringens.

    Science.gov (United States)

    Huang, Lihan

    2018-05-01

    The objective of this study was to investigate the growth kinetics of Clostridium botulinum LNT01, a non-toxigenic mutant of C. botulinum 62A, in cooked ground beef. The spores of C. botulinum LNT01 were inoculated into ground beef and incubated anaerobically under different temperature conditions to observe growth and develop growth curves. A one-step kinetic analysis method was used to analyze the growth curves simultaneously to minimize the global residual error. The data analysis was performed using the USDA IPMP-Global Fit, with the Huang model as the primary model and the cardinal parameters model as the secondary model. The results of data analysis showed that the minimum, optimum, and maximum growth temperatures of this mutant are 11.5, 36.4, and 44.3 °C, and the estimated optimum specific growth rate is 0.633 ln CFU/g per h, or 0.275 log CFU/g per h. The maximum cell density is 7.84 log CFU/g. The models and kinetic parameters were validated using additional isothermal and dynamic growth curves. The resulting residual errors of validation followed a Laplace distribution, with about 60% of the residual errors within ±0.5 log CFU/g of experimental observations, suggesting that the models could predict the growth of C. botulinum LNT01 in ground beef with reasonable accuracy. Compared with C. perfringens, C. botulinum LNT01 grows at much slower rates and with much longer lag times. Its growth kinetics are also very similar to those of C. sporogenes in ground beef. The results of computer simulation using kinetic models showed that, while prolific growth of C. perfringens may occur in ground beef during cooling, no growth of C. botulinum LNT01 or C. sporogenes would occur under the same cooling conditions. The models developed in this study may be used for prediction of the growth and risk assessments of proteolytic C. botulinum in cooked meats. Published by Elsevier Ltd.
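The "cardinal parameters model" used as the secondary model here is commonly the Rosso-type cardinal temperature model with inflection (CTMI); assuming that functional form (an assumption, since the abstract does not name it), the fitted values reported above (Tmin = 11.5 °C, Topt = 36.4 °C, Tmax = 44.3 °C, μopt = 0.633 ln CFU/g per h) can be sketched as:

```python
def mu_ctmi(T, mu_opt=0.633, T_min=11.5, T_opt=36.4, T_max=44.3):
    """Specific growth rate (ln units per h) vs. temperature (deg C).

    Parameter values are the fitted estimates reported in the abstract;
    the Rosso CTMI functional form itself is an assumption.
    """
    if T <= T_min or T >= T_max:
        return 0.0  # no growth outside the cardinal temperature range
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * (
        (T_opt - T_min) * (T - T_opt)
        - (T_opt - T_max) * (T_opt + T_min - 2.0 * T)
    )
    return mu_opt * num / den

for T in (12.0, 20.0, 30.0, 36.4, 40.0, 44.0):
    print(T, round(mu_ctmi(T), 3))
```

By construction the rate equals μopt at Topt and falls to zero at Tmin and Tmax, matching the reported cardinal temperatures.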

  20. Biological effect and tumor risk of diagnostic x-rays. The ''war of the theories''; Biologische Wirkung und Tumorrisiko diagnostischer Roentgenstrahlen. Der ''Krieg der Modelle''

    Energy Technology Data Exchange (ETDEWEB)

    Selzer, E.; Hebar, A. [Medizinische Universitaet Wien, Abteilung fuer Strahlenbiologie, Klinik fuer Strahlentherapie, Wien (Austria)

    2012-10-15

    Since the introduction of ionizing radiation as a treatment and diagnostic tool in humans, scientists have been trying to estimate its side effects and potential health risks. There is now ample evidence for the principal existence of a direct relationship between higher doses and the risks of side effects. Most of the uncertainties lie in the field of low-dose effects, especially with respect to the risk of cancer induction. Low-dose effects are usually of relevance in diagnostic medicine, while high-dose radiation effects are typically observed after radiotherapeutic treatment for cancer or after nuclear accidents. The current state of the ''war of theories'' may be summarized as follows: one group of scientists and health regulatory officials favors the hypothesis that there is no threshold dose below which radiation can be regarded as safe, i.e. the linear no-threshold (LNT) hypothesis. On the contrary, the critics of this hypothesis suggest that the risks of doses below 50 mSv are not measurable or even of clinical relevance and are not adequately described by a linear dose-response relationship. The aim of this article is to summarize the major unresolved issues in this field. Arguments are presented why the validity of the LNT model in the low-dose range should be regarded as at least inconsistent and thus questionable. (orig.)

  1. Regulatory Initiatives for Control and Release of Technologically Enhanced Naturally-Occurring Radioactive Material

    International Nuclear Information System (INIS)

    Egidi, P.V.

    1999-01-01

    Current drafts of proposed standards and suggested State regulations for control and release of technologically-enhanced naturally-occurring radioactive material (TENORM), and standards for release of volumetrically-contaminated material in the US are reviewed. These are compared to the recommendations of the International Atomic Energy Agency (IAEA) Safety Series and the European Commission (EC) proposals. Past regulatory efforts with respect to TENORM in the US dealt primarily with oil-field related wastes. Currently, nine states (AK, GA, LA, MS, NM, OH, OR, SC, TX) have specific regulations pertaining to TENORM, mostly based on uranium mill tailings cleanup criteria. The new US proposals are dose- or risk-based, as are the IAEA and EC recommendations, and are grounded in the linear no-threshold hypothesis (LNT). TENORM wastes involve extremely large volumes, particularly scrap metal and mine wastes. Costs to control and dispose of these wastes can be considerable. The current debate over the validity of LNT at low doses and low dose rates is particularly germane to this discussion. Most standards-setting organizations and regulatory agencies base their recommendations on the LNT. The US Environmental Protection Agency has released a draft Federal Guidance Report that recommends calculating health risks from low-level exposure to radionuclides based on the LNT. However, some scientific and professional organizations are openly questioning the validity of LNT and its basis for regulations, practices, and costs to society in general. It is not clear at this time how a non-linear regulatory scheme would be implemented

  2. Regulatory Initiatives for Control and Release of Technologically Enhanced Naturally-Occurring Radioactive Materials

    Energy Technology Data Exchange (ETDEWEB)

    Egidi, P.V.

    1999-03-02

    Current drafts of proposed standards and suggested State regulations for control and release of technologically-enhanced naturally-occurring radioactive material (TENORM), and standards for release of volumetrically-contaminated material in the US are reviewed. These are compared to the recommendations of the International Atomic Energy Agency (IAEA) Safety Series and the European Commission (EC) proposals. Past regulatory efforts with respect to TENORM in the US dealt primarily with oil-field related wastes. Currently, nine states (AK, GA, LA, MS, NM, OH, OR, SC, TX) have specific regulations pertaining to TENORM, mostly based on uranium mill tailings cleanup criteria. The new US proposals are dose- or risk-based, as are the IAEA and EC recommendations, and are grounded in the linear no-threshold hypothesis (LNT). TENORM wastes involve extremely large volumes, particularly scrap metal and mine wastes. Costs to control and dispose of these wastes can be considerable. The current debate over the validity of LNT at low doses and low dose rates is particularly germane to this discussion. Most standards-setting organizations and regulatory agencies base their recommendations on the LNT. The US Environmental Protection Agency has released a draft Federal Guidance Report that recommends calculating health risks from low-level exposure to radionuclides based on the LNT. However, some scientific and professional organizations are openly questioning the validity of LNT and its basis for regulations, practices, and costs to society in general. It is not clear at this time how a non-linear regulatory scheme would be implemented.

  3. Scientific foundation of regulating ionizing radiation: application of metrics for evaluation of regulatory science information.

    Science.gov (United States)

    Moghissi, A Alan; Gerraa, Vikrham Kumar; McBride, Dennis K; Swetnam, Michael

    2014-11-01

    This paper starts by describing the historical evolution of assessment of biologic effects of ionizing radiation leading to the linear non-threshold (LNT) system currently used to regulate exposure to ionizing radiation. The paper describes briefly the concept of Best Available Science (BAS) and Metrics for Evaluation of Scientific Claims (MESC) derived for BAS. It identifies three phases of regulatory science consisting of the initial phase, when the regulators had to develop regulations without having the needed scientific information; the exploratory phase, when relevant tools were developed; and the standard operating phase, when the tools were applied to regulations. Subsequently, an attempt is made to apply the BAS/MESC system to various stages of LNT. This paper then compares the exposure limits imposed by regulatory agencies and also compares them with naturally occurring radiation at several cities. Controversies about LNT are addressed, including judgments of the U.S. National Academies and their French counterpart. The paper concludes that, based on the BAS/MESC system, there is no disagreement between the two academies on the scientific foundation of LNT; instead, the disagreement is based on their judgment or speculation.

  4. Non-linearity of dose-effect relationship at low level exposure on the example of cytogenetic effects in plant cells

    International Nuclear Information System (INIS)

    Oudalova, A.A.; Geras'kin, S.A.; Dikarev, V.G.; Dikareva, N.S.; Chernonog, E.V.

    2007-01-01

    Complete text of publication follows. There has been increasing concern in the current scientific community and among the public about the need to protect the environment in order to maintain ecosystem sustainability and the future well-being of man. The linear non-threshold (LNT) hypothesis, as the most officially acknowledged concept of the biological effect of radiation, fails to explain many facts on effects at low level exposures (LLE) accumulated lately. Available information on the dose-effect relationship at low doses is scarce and incomplete for non-human species, despite the fact that, under conditions of increased radiation exposure, some biota species are at risk of higher impact than humans because of differences in the ecological niches occupied. Dose-effect relationships for cytogenetic damage in the range of LLE were studied in a series of experiments with plant (Hordeum vulgare L.) meristem cells. The dose-effect dependences obtained show an obvious non-linear behavior in the LLE region. A piecewise linear model (PLM) for the dose-cytogenetic effect relationship, which allows for a dose-independent part at LLE ('plateau'), is developed and fitted to the data obtained. An advantage of the PLM over the linear model in approximating the frequency of cytogenetic disturbances is demonstrated. From an empirical probability distribution analysis, it is shown that the increase in cytogenetic damage level is tightly connected with changes in the process of absorbed energy distribution between target volumes, in terms of the fraction of cells that experienced a radiation hit event. The appropriateness of the LNT hypothesis for describing the yield of cytogenetic disturbances in plant meristem cells in the LLE region is discussed. The results support a conclusion about an indirect mechanism of mutagenesis induced by low doses. The new data obtained concern the understanding of fundamental mechanisms governing cell response to LLE. These findings are of general biological interest, since

  5. Magazines as wilderness information sources: assessing users' general wilderness knowledge and specific leave no trace knowledge

    Science.gov (United States)

    John J. Confer; Andrew J. Mowen; Alan K. Graefe; James D. Absher

    2000-01-01

    The Leave No Trace (LNT) educational program has the potential to provide wilderness users with useful minimum impact information. For LNT to be effective, managers need to understand who is most/least aware of minimum impact practices and how to expose users to LNT messages. This study examined LNT knowledge among various user groups at an Eastern wilderness area and...

  6. Commentary on inhaled {sup 239}PuO{sub 2} in dogs - a prophylaxis against lung cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Cuttler, J.M., E-mail: jerrycuttler@rogers.com [Cuttler and Associates, Vaughan, ON (Canada); Feinendegen, L. [Brookhaven National Laboratories, Upton, NY (United States)

    2015-07-01

    Several studies on the effect of inhaled plutonium-dioxide particulates and the incidence of lung tumors in dogs reveal beneficial effects when the cumulative alpha-radiation dose is low. There is a threshold at an exposure level of about 100 cGy for excess tumor incidence and reduced lifespan. The observations conform to the expectations of the radiation hormesis dose-response model and contradict the predictions of the Linear No-Threshold (LNT) hypothesis. These studies suggest investigating the possibility of employing low-dose alpha-radiation, such as from {sup 239}PuO{sub 2} inhalation, as a prophylaxis against lung cancer. (author)

  8. The Use of Lexical Neighborhood Test (LNT) in the Assessment of Speech Recognition Performance of Cochlear Implantees with Normal and Malformed Cochlea.

    Science.gov (United States)

    Kant, Anjali R; Banik, Arun A

    2017-09-01

    The present study aims to use the model-based Lexical Neighborhood Test (LNT) to assess speech recognition performance in early- and late-implanted hearing-impaired children with normal and malformed cochleae. The LNT was administered to 46 children with congenital (prelingual) bilateral severe-to-profound sensorineural hearing loss using the Nucleus 24 cochlear implant. The children were grouped as follows: Group 1 (early implantees with normal cochlea, EI): n = 15, 3½-6½ years of age, mean age at implantation 3½ years. Group 2 (late implantees with normal cochlea, LI): n = 15, 6-12 years of age, mean age at implantation 5 years. Group 3 (early implantees with malformed cochlea, EIMC): n = 9, 4.9-10.6 years of age, mean age at implantation 3.10 years. Group 4 (late implantees with malformed cochlea, LIMC): n = 7, 7-12.6 years of age, mean age at implantation 6.3 years. The malformations were: dysplastic cochlea, common cavity, Mondini's, incomplete partition 1 and 2 (IP-1 and 2), and enlarged IAC. The children were instructed to repeat the words on hearing them. Means of the word and phoneme scores were computed. The LNT can also be used to assess the speech recognition performance of hearing-impaired children with malformed cochleae. When both easy and hard lists of the LNT are considered, although late implantees (with or without normal cochleae) achieved higher word scores than early implantees, the differences are not statistically significant. Using the LNT for assessing speech recognition enables a quantitative as well as descriptive report of the phonological processes used by the children.

  9. Molecular alterations in childhood thyroid cancer after Chernobyl accident and low-dose radiation risk

    International Nuclear Information System (INIS)

    Suzuki, Keiji; Mitsutake, Norisato; Yamashita, Shunichi

    2012-01-01

    The linear no-threshold (LNT) model of radiation carcinogenesis has been used for evaluating the risk from radiation exposure. While epidemiological studies have supported the LNT model at doses above 100 mGy, many uncertainties remain in the LNT model at doses below 100 mGy. It is therefore urgent to clarify the molecular mechanisms underlying radiation carcinogenesis. After the Chernobyl accident in 1986, a significant number of childhood thyroid cancers emerged among children living in the contaminated area. As the incidence of sporadic childhood thyroid cancer is very low, it is quite evident that those cancer cases were induced by radiation exposure, caused mainly by the intake of contaminated foods such as milk. Because the genetic alterations in childhood thyroid cancers have been studied extensively, they provide a unique chance to understand the molecular mechanisms of radiation carcinogenesis. In the current review, molecular signatures obtained from studies of childhood thyroid cancer after the Chernobyl accident are overviewed, and new roles of radiation exposure in thyroid carcinogenesis are discussed. (author)

  10. Health Physics Society Comments to U.S. Environmental Protection Agency Regulatory Reform Task Force.

    Science.gov (United States)

    Ring, Joseph; Tupin, Edward; Elder, Deirdre; Hiatt, Jerry; Sheetz, Michael; Kirner, Nancy; Little, Craig

    2018-05-01

    The Health Physics Society (HPS) provided comment to the U.S. Environmental Protection Agency (EPA) on options to consider when developing an action plan for President Trump's Executive Order to evaluate regulations for repeal, replacement, or modification. The HPS recommended that the EPA reconsider its adherence to the linear no-threshold (LNT) model for radiation risk calculations and improve several documents by better addressing uncertainties in low-dose, low dose-rate (LDDR) radiation exposure environments. The authors point out that use of the LNT model near background levels cannot provide reliable risk projections, that use of the LNT model and collective-dose calculations in some EPA documents is inconsistent with the recommendations of international organizations, and that some EPA documents have not been exposed to the public-comment rule-making process. To assist in establishing a better scientific basis for the risks of low dose-rate and low-dose radiation exposure, the EPA should continue to support the "Million Worker Study," led by the National Council on Radiation Protection and Measurements.

  11. Low-level radiation: how the linear no-threshold model ensures the safety of Canadians

    International Nuclear Information System (INIS)

    Anon.

    2010-01-01

    The linear no-threshold model is a risk model used worldwide by most nuclear regulatory and health organizations to establish dose limits for workers and the public. It is at the heart of the approach adopted by the Canadian Nuclear Safety Commission (C.C.S.N.) in matters of radiation protection. The linear no-threshold model reasonably presumes a direct link between radiation exposure and cancer rate. There is no scientific evidence that chronic exposure to radiation doses below 100 millisievert (mSv) leads to harmful health effects. Several scientific reports have highlighted evidence that seems to indicate that low-level radiation is less harmful than the linear no-threshold model predicts. Since the linear no-threshold model presumes that any radiation exposure carries risk, the ALARA principle obliges licensees to keep radiation exposure at the lowest level reasonably achievable, social and economic factors taken into account. The ALARA principle is a basic principle of the C.C.S.N. approach to radiation protection. In radiation protection matters, the C.C.S.N. takes a prudent approach that protects the health and safety of Canadians and their environment. (N.C.)

  12. Communicating Leave No Trace ethics and practices: Efficacy of two-day trainer courses

    Science.gov (United States)

    Daniels, M.L.; Marion, J.L.

    2005-01-01

    Heavy recreational visitation within protected natural areas has resulted in many ecological impacts. Many of these impacts may be avoided or minimized through adoption of low-impact hiking and camping practices. Although 'No Trace' messages have been promoted in public lands since the 1970s, few studies have documented the reception and effectiveness of these messages. The U.S. Leave No Trace Center for Outdoor Ethics develops and promotes two-day Trainer courses that teach Leave No Trace (LNT) skills and ethics to outdoor professionals, groups, and interested individuals. This study examined the change in knowledge, ethics, and behavior of LNT Trainer course participants. The respondents were a convenience sample of participants in Trainer courses offered from April through August 2003. Trainer course instructors administered pre-course and post-course questionnaires to their participants, and we contacted participants individually with a follow-up questionnaire 4 months after completion of their course. Scores for each of the sections increased immediately following the course, and decreased slightly over the 4 months following the course. Overall, more than half of the knowledge and behavior items, and half of the ethics items, showed significant improvement from pre-course measures to the follow-up. Age, reported LNT experience, and backpacking experience affected the participants' pre-course knowledge and behavior scores. Younger, less experienced respondents also showed a greater improvement in behavior following the course. Trainer course participants also shared their LNT skills and ethics with others both formally and informally. In summary, the LNT Trainer course was successful in increasing participants' knowledge, ethics, and behavior, which they then shared with others. Since many low-impact skills taught in the LNT curriculum are supported by scientific research, LNT educational programs have the potential to effectively minimize the environmental

  13. Biological responses to low dose rate gamma radiation

    International Nuclear Information System (INIS)

    Magae, Junji; Ogata, Hiromitsu

    2003-01-01

    Linear non-threshold (LNT) theory is a basic theory for radioprotection. While LNT does not consider irradiation time or dose rate, biological responses to radiation are complex processes dependent on irradiation time as well as total dose. Moreover, experimental and epidemiological studies that could evaluate LNT at low dose and low dose rate have not been sufficiently accumulated. Here we analyzed the quantitative relationship among dose, dose rate and irradiation time using chromosomal breakage and proliferation inhibition of human cells as indicators of biological response. We also acquired quantitative data at low doses that can evaluate the applicability of LNT with statistically sufficient accuracy. Our results demonstrate that biological responses at low dose rate are remarkably affected by exposure time, and that they depend on dose rate rather than total dose in long-term irradiation. We also found that the change of biological response at low dose was not linearly correlated with dose. These results suggest that a new model is needed, one that includes the dose-rate effect and correctly fits actual experimental and epidemiological results, in order to evaluate the risk of radiation at low dose and low dose rate. (author)

  14. Evidence for beneficial low level radiation effects and radiation hormesis

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    2005-01-01

    Low doses in the mGy range cause a dual effect on cellular DNA. One effect concerns a relatively low probability of DNA damage per energy deposition event, and it increases proportionally with dose, with possible bystander effects operating. This damage at background radiation exposure is orders of magnitude lower than that from endogenous sources, such as ROS. The other effect at comparable doses brings an easily observable adaptive protection against DNA damage from any, mainly endogenous, sources, depending on cell type, species, and metabolism. Protective responses express adaptive responses to metabolic perturbations and also mimic oxygen stress responses. Adaptive protection operates in terms of DNA damage prevention and repair, and of immune stimulation. It develops with a delay of hours, may last for days to months, and increasingly disappears at doses beyond about 100 to 200 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses, damage reduction by adaptive protection against damage from endogenous sources predictably outweighs radiogenic damage induction. The analysis of the consequences of this particular low-dose scenario shows that the linear no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears to be invalid in favor of a threshold or hormesis. This is consistent with data both from animal studies and from human epidemiological observations on low-dose-induced cancer. The LNT hypothesis should be abandoned and replaced by a hypothesis that is scientifically justified. The appropriate model should include terms for both linear and non-linear response probabilities. Maintaining the LNT hypothesis as the basis for radiation protection causes unreasonable fear and expense. (author)

  15. A biological basis for the linear non-threshold dose-response relationship for low-level carcinogen exposure

    International Nuclear Information System (INIS)

    Albert, R.E.

    1981-01-01

    This chapter examines low-level dose-response relationships in terms of the two-stage mouse tumorigenesis model. Analyzes the feasibility of the linear non-threshold dose-response model which was first adopted for use in the assessment of cancer risks from ionizing radiation and more recently from chemical carcinogens. Finds that both the interaction of B(a)P with epidermal DNA of the mouse skin and the dose-response relationship for the initiation stage of mouse skin tumorigenesis showed a linear non-threshold dose-response relationship. Concludes that low level exposure to environmental carcinogens has a linear non-threshold dose-response relationship with the carcinogen acting as an initiator and the promoting action being supplied by the factors that are responsible for the background cancer rate in the target tissue

  16. Epidemiological studies on the effects of low-level ionizing radiation on cancer risk

    International Nuclear Information System (INIS)

    Akiba, Suminori

    2010-01-01

    The health effects of low-level ionizing radiation are as yet unclear. As pointed out by Upton in his review (Upton, 1989), low-level ionizing radiation seems to have different biological effects from those of high-level radiation. If so, the hazard identification of ionizing radiation should be conducted separately for low- and high-level ionizing radiation; the hazard identification of low-level radiation is yet to be completed. What makes hazard identification of ionizing radiation difficult, particularly for the carcinogenic effect, is the difficulty of distinguishing radiation-induced cancer from other cancers with respect to clinicopathological features and molecular biological characteristics. Indeed, it is suspected that radiation-induced carcinogenesis involves mechanisms not specific to radiation, such as oxidative stress. Excess risk per dose in medium-to-high dose ranges can be extrapolated to the low-dose range if the dose response can be described by the linear non-threshold model. The cancer risk data of atomic-bomb survivors describe leukemia risk with a linear-quadratic (LQ) model and solid-cancer risk with a linear non-threshold (LNT) model. The LQ model for leukemia and the LNT model for solid cancer correspond to the two-hit model and the one-hit model, respectively. Although the one-hit model is an unlikely dose response for carcinogenesis, there is no convincing epidemiological evidence supporting the LQ model or a non-threshold model for solid cancer. It should be pointed out, however, that even if the true dose response is non-linear, the various noises involved in epidemiological data may mask the truth. In this paper, the potential contribution of epidemiological studies on nuclear workers and residents of high-background-radiation areas is discussed. (author)
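    The two dose-response shapes contrasted in this abstract can be written as ERR(D) = beta*D for the linear non-threshold model and ERR(D) = alpha*D + beta*D^2 for the linear-quadratic model. A minimal sketch with hypothetical coefficients, not fitted LSS values:

```python
# Sketch of the two excess-relative-risk shapes discussed above.
# Coefficients are hypothetical placeholders, not fitted LSS values.

def err_lnt(dose_gy, beta=0.5):
    # Linear no-threshold: risk proportional to dose all the way to zero.
    return beta * dose_gy

def err_lq(dose_gy, alpha=0.1, beta=1.0):
    # Linear-quadratic: the quadratic term dominates at higher doses,
    # consistent with a "two-hit" interpretation.
    return alpha * dose_gy + beta * dose_gy ** 2

for d in (0.01, 0.1, 1.0):
    print(d, err_lnt(d), err_lq(d))
```

    At low doses the LQ curve falls below the LNT line (its quadratic term vanishes faster), which is why the choice of model matters most in the low-dose range the abstract discusses.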

  17. An Evolved System of Radiological Protection

    International Nuclear Information System (INIS)

    Kaneko, M.

    2004-01-01

    The current system of radiological protection based on the Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public. However, it has brought about 'radiophobia' among people and a waste of resources due to over-regulation, because the LNT implies that radiation is harmful no matter how small the dose. The author reviewed the results of research on health effects of radiation, including major epidemiological studies on radiation workers, and found no clear evidence of deleterious health effects from radiation exposures below the current maximum dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which were adopted worldwide in the second half of the 20th century. Now that the existence of bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response is well recognized, the linearity assumption cannot be said to be 'scientific'. Evidence increasingly implies that there are threshold effects in radiation risk. A concept of practical thresholds, or virtually safe doses, will have to be introduced into the new system of radiological protection in order to resolve the low-dose issues. Practical thresholds may be defined as dose levels below which the induction of detectable radiogenic cancers or hereditary effects is not expected. If workers and members of the public do not gain benefits from being exposed, excepting intentional irradiation for medical purposes, their radiation exposures should be kept below practical thresholds. On the assumption that the current dose limits are below practical thresholds and carry no radiation detriments, there is no need for the justification and optimization (ALARA) principles for occupational and public exposures. Then the ethical issue of justification, allowing a benefit to society to offset radiation detriments to individuals, can be resolved, as can the ethical issue of optimization, which trades health or safety for

  18. On the existence of a threshold in the dose-response relationship from the epidemiological data of atomic bomb survivors

    International Nuclear Information System (INIS)

    Matsuura, Tatsuo; Sugahara, Tsutomu

    2002-01-01

    Whether or not there is a threshold dose in the dose-response relationship for radiation-induced cancer incidence is one of the most important but controversial issues in radiation protection and nuclear policy making. The epidemiological studies on the Life Span Study (LSS) group of atomic bomb survivors in Hiroshima and Nagasaki, conducted by the Radiation Effects Research Foundation (RERF), have been regarded as the most authentic, and they maintain the view that there is no evidence to reject the linear non-threshold (LNT) hypothesis. The authors have argued for a reassessment of the exposure doses of survivors, considering the contribution of the chronic dose from fallout, induced radioactivity, and early entrance near the city center. The authors have also stressed the importance of the cases of 'not-in-city' survivors, frequently reported to be fatal due to heavy chronic exposure. Recently we have noted that the appearance of acute radiation symptoms is an important index for estimating total dose. In this paper, based on Obo's statistical data (1957) for the acute symptoms observed in various categories of survivors, we present an estimation of the average chronic dose of survivors, which should be added to the instantaneous dose for the directly exposed groups. By assuming a threshold of 0.5 Sv for the appearance of an acute symptom such as epilation, an average chronic dose of 0.32 Sv was estimated for all survivors. The present dose-response relationship for cancer incidence should then be shifted to the right by this amount, and a value of about 0.32 Sv or more is suggested as the threshold for cancer incidence in the low-radiation region

  19. Hormesis: Fact or fiction?

    International Nuclear Information System (INIS)

    Holzman, D.

    1995-01-01

    Bernard Cohen had not intended to foment revolution. To be sure, he had hoped that the linear, no-threshold (LNT) model of ionizing radiation's effects on humans would prove to be an exaggeration of reality at the low levels of radiation that one can measure in humans throughout the United States. His surprising conclusion, however, was that within the low dose ranges of radiation one receives in the home, the higher the dose, the less chance one had of contracting lung cancer. 1 fig., 1 tab

  20. Epidemiological and radio-biological studies in high background radiation areas of Kerala coast: implications in radiation protection science and human health

    International Nuclear Information System (INIS)

    Das, Birajalaxmi

    2018-01-01

    To date, the Linear No-Threshold (LNT) hypothesis is well accepted in radiation protection science in spite of its limitations. However, dose-response studies using multiple biological end points in high-background radiation areas have challenged the linearity. Radio-biological and epidemiological studies from the high-level natural radiation areas (HLNRA) of the Kerala coast showed non-linearity as well as efficient repair of DNA damage in HLNRA, indicating that the dose limits for public exposure need to be revisited, which may have implications for radiation protection science, human health and low-dose radiation biology. However, further studies using a high-throughput approach are required to identify chronic radiation signatures in human populations exposed to elevated levels of natural background radiation

  1. Dose-responses for mortality from cerebrovascular and heart diseases in atomic bomb survivors: 1950-2003

    Energy Technology Data Exchange (ETDEWEB)

    Schoellnberger, Helmut [Helmholtz Zentrum Muenchen, Department of Radiation Sciences, Institute of Radiation Protection, Neuherberg (Germany); Federal Office for Radiation Protection, Department of Radiation Protection and the Environment, Neuherberg (Germany); Eidemueller, Markus; Simonetto, Cristoforo; Kaiser, Jan Christian [Helmholtz Zentrum Muenchen, Department of Radiation Sciences, Institute of Radiation Protection, Neuherberg (Germany); Cullings, Harry M. [Radiation Effects Research Foundation, Department of Statistics, Hiroshima (Japan); Neff, Frauke [Staedtisches Klinikum Muenchen and Technical University of Munich, Institute of Pathology, Munich (Germany)

    2018-03-15

    The scientific community faces important discussions on the validity of the linear no-threshold (LNT) model for radiation-associated cardiovascular diseases at low and moderate doses. In the present study, mortalities from cerebrovascular diseases (CeVD) and heart diseases from the latest data on atomic bomb survivors were analyzed. The analysis was performed with several radio-biologically motivated linear and nonlinear dose-response models. For each detrimental health outcome one set of models was identified that all fitted the data about equally well. This set was used for multi-model inference (MMI), a statistical method of superposing different models to allow risk estimates to be based on several plausible dose-response models rather than just relying on a single model of choice. MMI provides a more accurate determination of the dose response and a more comprehensive characterization of uncertainties. It was found that for CeVD, the dose-response curve from MMI is located below the linear no-threshold model at low and medium doses (0-1.4 Gy). At higher doses MMI predicts a higher risk compared to the LNT model. A sublinear dose-response was also found for heart diseases (0-3 Gy). The analyses provide no conclusive answer to the question whether there is a radiation risk below 0.75 Gy for CeVD and 2.6 Gy for heart diseases. MMI suggests that the dose-response curves for CeVD and heart diseases in the Lifespan Study are sublinear at low and moderate doses. This has relevance for radiotherapy treatment planning and for international radiation protection practices in general. (orig.)
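    Multi-model inference of the kind described in this abstract, superposing several plausible dose-response models weighted by goodness of fit, is commonly implemented with Akaike weights. A simplified sketch; the three candidate models and their AIC values are invented for illustration, not taken from the paper:

```python
import math

# Simplified multi-model inference: average the predictions of competing
# dose-response models using Akaike weights. Models and AIC values are
# hypothetical placeholders.

models = [
    ("linear",     lambda d: 0.30 * d,                   1210.0),
    ("linear-thr", lambda d: max(0.0, 0.45 * (d - 0.3)), 1209.2),
    ("quadratic",  lambda d: 0.25 * d * d,               1211.5),
]

# Akaike weight: w_i proportional to exp(-0.5 * (AIC_i - AIC_min)).
aic_min = min(aic for _, _, aic in models)
raw = [math.exp(-0.5 * (aic - aic_min)) for _, _, aic in models]
weights = [r / sum(raw) for r in raw]

def mmi_risk(dose):
    # Weighted superposition of the individual model predictions.
    return sum(w * f(dose) for w, (_, f, _) in zip(weights, models))

print([round(w, 3) for w in weights], round(mmi_risk(1.0), 3))
```

    Because the weights sum to one, the averaged curve stays inside the envelope of the individual models, which is how MMI lets the risk estimate reflect several plausible shapes rather than a single model of choice.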

  2. Health effects of low level exposure to ionizing radiation: origin and development of a controversy

    International Nuclear Information System (INIS)

    Masse, Roland

    2014-06-01

    Health hazard assessment related to doses of ionizing radiation lower than 100-200 mSv is a matter of controversy, all the more acutely when choosing a transition towards a new energy paradigm. Neither epidemiological nor experimental data can determine the shape of the dose-effect relationship from 0 to 100 mSv. Recently, however, long-term follow-up of children and young adults exposed to CT scans showed that doses of 50 to 60 mGy delivered at high dose rate were associated with a significant increase in leukemias and cancers, including brain cancer. On the basis of the available data, this article raises questions about the plausibility of the linear no-threshold hypothesis (LNT) used by radiological protection bodies to control overexposure of members of the public and workers. It concludes that although the plausibility of the LNT is fairly weak, using the LNT helps to situate the order of magnitude of the health risks associated with the development of nuclear power plants and to compare them with those resulting from burning fossil fuels and biomass; the comparison shows that, for the same quantity of energy produced, more human lives are spared with nuclear power. (author)

  3. Surveys of radon levels in homes in the United States: A test of the linear-no-threshold dose-response relationship for radiation carcinogenesis

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1987-01-01

    The University of Pittsburgh Radon Project for large-scale measurements of radon concentrations in homes is described. Its principal research aim is to test the linear no-threshold dose-response relationship for radiation carcinogenesis by determining average radon levels in the 25 U.S. counties (within certain population ranges) with the highest and lowest lung cancer rates. The theory predicts that the former should have about 3 times higher average radon levels than the latter, under the assumption that any correlation between exposure to radon and exposure to other causes of lung cancer is weak. The validity of this assumption is tested with data on average radon level versus replies to items on questionnaires; there is little correlation between radon levels in houses and smoking habits, educational attainment, or economic status of the occupants, or with urban versus rural environs, the latter being an indicator of exposure to air pollution

  4. Groundwater decline and tree change in floodplain landscapes: Identifying non-linear threshold responses in canopy condition

    Directory of Open Access Journals (Sweden)

    J. Kath

    2014-12-01

    Groundwater decline is widespread, yet its implications for natural systems are poorly understood. Previous research has revealed links between groundwater depth and tree condition; however, critical thresholds which might indicate ecological ‘tipping points’ associated with rapid and potentially irreversible change have been difficult to quantify. This study collated data for two dominant floodplain species, Eucalyptus camaldulensis (river red gum and E. populnea (poplar box from 118 sites in eastern Australia where significant groundwater decline has occurred. Boosted regression trees, quantile regression and Threshold Indicator Taxa Analysis were used to investigate the relationship between tree condition and groundwater depth. Distinct non-linear responses were found, with groundwater depth thresholds identified in the range from 12.1 m to 22.6 m for E. camaldulensis and 12.6 m to 26.6 m for E. populnea beyond which canopy condition declined abruptly. Non-linear threshold responses in canopy condition in these species may be linked to rooting depth, with chronic groundwater decline decoupling trees from deep soil moisture resources. The quantification of groundwater depth thresholds is likely to be critical for management aimed at conserving groundwater dependent biodiversity. Identifying thresholds will be important in regions where water extraction and drying climates may contribute to further groundwater decline. Keywords: Canopy condition, Dieback, Drought, Tipping point, Ecological threshold, Groundwater dependent ecosystems
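    The abrupt threshold responses reported in this abstract can be located, in the simplest case, by a grid search over candidate breakpoints in a two-segment regression. The sketch below uses synthetic depth/condition data and a flat-then-linear model; the study itself used boosted regression trees, quantile regression and Threshold Indicator Taxa Analysis:

```python
# Toy breakpoint search: canopy condition stays flat until groundwater
# depth crosses a threshold, then declines. Synthetic data, illustrative
# only -- not the study's data or methods.

def sse_for_break(xs, ys, brk):
    # Segment 1 (shallow): flat mean; segment 2 (deep): simple linear fit.
    seg1 = [y for x, y in zip(xs, ys) if x <= brk]
    seg2 = [(x, y) for x, y in zip(xs, ys) if x > brk]
    if len(seg1) < 2 or len(seg2) < 2:
        return float("inf")
    m1 = sum(seg1) / len(seg1)
    sse = sum((y - m1) ** 2 for y in seg1)
    mx = sum(x for x, _ in seg2) / len(seg2)
    my = sum(y for _, y in seg2) / len(seg2)
    b = sum((x - mx) * (y - my) for x, y in seg2) / \
        sum((x - mx) ** 2 for x, _ in seg2)
    a = my - b * mx
    sse += sum((y - (a + b * x)) ** 2 for x, y in seg2)
    return sse

# Groundwater depth (m) vs a hypothetical canopy condition index.
depth = [4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26]
cond  = [90, 91, 89, 90, 90, 89, 80, 71, 63, 55, 46, 38]

best_brk = min(range(5, 25), key=lambda b: sse_for_break(depth, cond, b))
print(best_brk)  # threshold depth (m) minimizing the two-segment error
```

    Quantile regression and TITAN, as used in the study, are more robust ways to find such change points, but the grid-search idea above conveys what an ecological 'tipping point' means statistically: the breakpoint that best separates a stable regime from a declining one.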

  5. Homeopathy with radiation?

    International Nuclear Information System (INIS)

    Kiefer, Juergen

    2017-01-01

    There are no reliable data to estimate radiation risk to humans below doses of about 100 mSv. The ICRP adopts, for protection purposes, the LNT (linear no-threshold) concept. As there is no evidence for its general validity, there is room for other ideas. One is 'radiation hormesis', i.e. the notion that low radiation doses are not harmful but rather beneficial to human health. This view is critically discussed, and it is shown that there is no evidence to prove this conception, neither from epidemiological studies nor, convincingly, from animal experiments or mechanistic investigations. There is, therefore, no reason to abandon the prudent approach of the ALARA principle.

  7. Sulfur impact on NO{sub x} storage, oxygen storage, and ammonia breakthrough during cyclic lean/rich operation of a commercial lean NO{sub x} trap

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae-Soon; Partridge, William P.; Daw, C. Stuart [Fuels, Engines, and Emissions Research Center, Oak Ridge National Laboratory, P.O. Box 2008, MS-6472, Oak Ridge, TN 37831-6472 (United States)

    2007-11-30

    The objective of the present study was to develop an improved understanding of how sulfur affects the spatiotemporal distribution of reactions and temperature inside a monolithic lean NO{sub x} trap (LNT). These spatiotemporal distributions are believed to be major factors in LNT function, and thus, we expect that a better understanding of these phenomena can benefit the design and operation of commercial LNTs. In our study, we experimentally evaluated a commercial LNT monolith installed in a bench-flow reactor with simulated engine exhaust. The reactor feed gas composition was cycled to simulate fast lean/rich LNT operation at 325 C, and spatiotemporal species and temperature profiles were monitored along the LNT axis at different sulfur loadings. Reactor outlet NO{sub x}, NO, N{sub 2}O, and NH{sub 3} were also measured. Sulfur tended to accumulate in a plug-like fashion in the reactor and progressively inhibited NO{sub x} storage capacity along the axis. The NO{sub x} storage/reduction (NSR) reactions occurred over a relatively short portion of the reactor (NSR zone) under the conditions used in this study, and thus, net NO{sub x} conversion was only significantly reduced at high sulfur loading. Oxygen storage capacity (OSC) was poisoned by sulfur also in a progressive manner but to a lesser extent than the NO{sub x} storage capacity. Global selectivity for N{sub 2}O remained low at all sulfur loadings, but NH{sub 3} selectivity increased significantly with sulfur loading. We conjecture that NH{sub 3} breakthrough increased because of decreasing oxidation of NH{sub 3}, slipping from the NSR zone, by downstream stored oxygen. The NSR and oxygen storage/reduction (OSR) generated distinctive exotherms during the rich phase and at the rich/lean transition. Exotherm locations shifted downstream with sulfur accumulation in a manner that was consistent with the progressive poisoning of NSR and OSR sites. (author)

  8. Polarization properties of below-threshold harmonics from aligned molecules H2+ in linearly polarized laser fields.

    Science.gov (United States)

    Dong, Fulong; Tian, Yiqun; Yu, Shujuan; Wang, Shang; Yang, Shiping; Chen, Yanjun

    2015-07-13

    We investigate the polarization properties of below-threshold harmonics from aligned molecules in linearly polarized laser fields numerically and analytically. We focus on lower-order harmonics (LOHs). Our simulations show that the ellipticity of below-threshold LOHs depends strongly on the orientation angle and differs significantly for different harmonic orders. Our analysis reveals that this LOH ellipticity is closely associated with resonance effects and the axis symmetry of the molecule. These results shed light on the complex generation mechanism of below-threshold harmonics from aligned molecules.

  9. Nuclear disaster in Fukushima. Based on the WHO data, between 22,000 and 66,000 cancer deaths are expected in Japan; Atomkatastrophe in Fukushima. Auf der Grundlage der WHO-Daten sind in Japan zwischen 22.000 und 66.000 Krebserkrankungen zu erwarten

    Energy Technology Data Exchange (ETDEWEB)

    Paulitz, Henrik; Eisenberg, Winfrid; Thiel, Reinhold

    2013-03-14

    The authors show that, based on the data and assumptions of the WHO, about 22,000 cancer deaths are to be expected in Japan as a consequence of the nuclear disaster at Fukushima in March 2011. The following inputs are used: the radiation exposure of the Japanese public in the first year after the nuclear catastrophe, the linear no-threshold (LNT) model, and a mortality risk factor (EAR, excess absolute risk). When the factor used to determine the lifetime dose is based on the experience of Chernobyl (UNSCEAR calculations) and the most recent scientific research, the number of expected cancer deaths rises to 66,000.
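Under LNT, the arithmetic behind such projections is a straight multiplication of collective dose by a risk coefficient. A minimal sketch, with purely illustrative numbers (not the WHO or the authors' figures):

```python
# Minimal sketch of linear no-threshold (LNT) arithmetic: under LNT,
# expected excess cancer deaths scale linearly with collective dose,
# with no dose below which the risk is zero.
# The coefficient and dose below are illustrative, not the report's values.

def lnt_excess_deaths(collective_dose_person_sv: float,
                      risk_per_person_sv: float) -> float:
    """Excess deaths = collective dose (person-Sv) x risk coefficient (per Sv)."""
    return collective_dose_person_sv * risk_per_person_sv

# Illustrative: 500,000 person-Sv at a nominal 5% fatal-cancer risk per Sv.
deaths = lnt_excess_deaths(500_000, 0.05)
```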

  10. Radiation protection. Basic concepts of ICRP

    International Nuclear Information System (INIS)

    Saito, Tsutomu; Hirata, Hideki

    2014-01-01

    The title subject is explained in simple terms. The main international organizations for radiation protection are the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), the International Commission on Radiological Protection (ICRP) and the International Atomic Energy Agency (IAEA). UNSCEAR objectively summarizes and publishes scientific findings; ICRP, a non-governmental organization, issues expert recommendations on radiological protection; and IAEA, an autonomous organization within the UN system, promotes the peaceful use of atomic energy. These organizations underpin national legal regulations and standards. The purpose of the ICRP recommendations (Pub. 103, 2007) is to contribute to appropriate protection against the harmful effects of radiation, which are assumed to be linearly proportional to dose (the linear no-threshold model, LNT), so that some radiation risk exists even at the lowest dose. When a change in a single cell results in a harmful alteration, the resulting effects are called stochastic effects, which include mutations leading to cancer and heritable effects in offspring (the latter not observed in humans). ICRP upholds the validity of LNT for stochastic effects essentially for protection purposes, although epidemiological data support it only above about 100 mSv of exposure. Deterministic effects are caused by the loss of cells themselves or of their function, and their threshold is defined as the dose causing disorder or death in more than 1% of those exposed. The system of radiological protection is organized by exposure situation (planned, emergency and existing), exposure category (occupational, public and medical) and the three principles of justification, optimization and application of dose limits. (T.T.)

  11. Genetic parameters for direct and maternal calving ease in Walloon dairy cattle based on linear and threshold models.

    Science.gov (United States)

    Vanderick, S; Troch, T; Gillon, A; Glorieux, G; Gengler, N

    2014-12-01

    Calving ease scores from Holstein dairy cattle in the Walloon Region of Belgium were analysed using univariate linear and threshold animal models. Variance components and derived genetic parameters were estimated from a data set including 33,155 calving records. Included in the models were season, herd and sex of calf × age of dam classes × group of calvings interaction as fixed effects, herd × year of calving, maternal permanent environment and animal direct and maternal additive genetic as random effects. Models were fitted with the genetic correlation between direct and maternal additive genetic effects either estimated or constrained to zero. Direct heritability for calving ease was approximately 8% with linear models and approximately 12% with threshold models. Maternal heritabilities were approximately 2 and 4%, respectively. Genetic correlation between direct and maternal additive effects was found to be not significantly different from zero. Models were compared in terms of goodness of fit and predictive ability. Criteria of comparison such as mean squared error, correlation between observed and predicted calving ease scores as well as between estimated breeding values were estimated from 85,118 calving records. The results provided few differences between linear and threshold models even though correlations between estimated breeding values from subsets of data for sires with progeny from linear model were 17 and 23% greater for direct and maternal genetic effects, respectively, than from threshold model. For the purpose of genetic evaluation for calving ease in Walloon Holstein dairy cattle, the linear animal model without covariance between direct and maternal additive effects was found to be the best choice. © 2014 Blackwell Verlag GmbH.
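The gap between the linear-model (about 8%) and threshold-model (about 12%) heritabilities is consistent with moving from the observed 0/1 scale to the underlying liability scale. A sketch of that conversion, assuming the classical Dempster-Lerner transformation and an illustrative incidence (not a value from the Walloon data set):

```python
# Sketch: converting heritability on the observed (0/1) scale to the
# underlying liability scale via the Dempster-Lerner transformation:
#   h2_liab = h2_obs * p * (1 - p) / z**2,
# where p is the incidence and z is the standard normal density at the
# threshold Phi^-1(1 - p). Numbers below are illustrative only.
from statistics import NormalDist

def liability_heritability(h2_observed: float, incidence: float) -> float:
    nd = NormalDist()
    threshold = nd.inv_cdf(1.0 - incidence)
    z = nd.pdf(threshold)  # standard normal density at the threshold
    return h2_observed * incidence * (1.0 - incidence) / z ** 2

# E.g., an 8% observed-scale heritability at a 30% incidence of difficult
# calvings maps to a noticeably larger liability-scale value.
h2_liab = liability_heritability(0.08, 0.30)
```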

  12. Exogenous calcium alleviates low night temperature stress on the photosynthetic apparatus of tomato leaves.

    Directory of Open Access Journals (Sweden)

    Guoxian Zhang

    Full Text Available The effect of exogenous CaCl2 on photosystem I and II (PSI and PSII) activities, cyclic electron flow (CEF), and proton motive force of tomato leaves under low night temperature (LNT) was investigated. LNT stress decreased the net photosynthetic rate (Pn), the effective quantum yield of PSII [Y(II)], and photochemical quenching (qP), whereas CaCl2 pretreatment improved Pn, Y(II), and qP under LNT stress. LNT stress significantly increased the non-regulatory quantum yield of energy dissipation [Y(NO)], whereas CaCl2 alleviated this increase. Exogenous Ca2+ enhanced the stimulation of CEF by LNT stress. The inhibition of oxidized PQ pools caused by LNT stress was alleviated by CaCl2 pretreatment. LNT stress reduced zeaxanthin formation and ATPase activity, but CaCl2 pretreatment reversed both of these effects. LNT stress caused excessive formation of a proton gradient across the thylakoid membrane, whereas CaCl2 pretreatment reduced it. Thus, our results showed that photoinhibition of LNT-stressed plants can be alleviated by CaCl2 pretreatment. Our findings further revealed that this alleviation is mediated in part by improvements in carbon fixation capacity, PQ pools, linear and cyclic electron transport, xanthophyll cycling, and ATPase activity.

  13. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  14. Genetic evaluation of calf and heifer survival in Iranian Holstein cattle using linear and threshold models.

    Science.gov (United States)

    Forutan, M; Ansari Mahyari, S; Sargolzaei, M

    2015-02-01

    Calf and heifer survival are important traits in dairy cattle affecting profitability. This study was carried out to estimate genetic parameters of survival traits in female calves at different age periods, until nearly the first calving. Records of 49,583 female calves born between 1998 and 2009 were considered in five age periods: days 1-30, 31-180, 181-365, 366-760 and the full period (days 1-760). Genetic components were estimated based on linear and threshold sire models and linear animal models. The models included both fixed effects (month of birth, dam's parity number, calving ease and twin/single status) and random effects (herd-year, genetic effect of sire or animal, and residual). Rates of death were 2.21, 3.37, 1.97, 4.14 and 12.4% for the above periods, respectively. Heritability estimates were very low, ranging from 0.48 to 3.04, 0.62 to 3.51 and 0.50 to 4.24% for the linear sire model, animal model and threshold sire model, respectively. Rank correlations between random effects of sires obtained with linear and threshold sire models and with linear animal and sire models were 0.82-0.95 and 0.61-0.83, respectively. The estimated genetic correlations between the five different periods were moderate and only significant for 31-180 and 181-365 (r(g) = 0.59), 31-180 and 366-760 (r(g) = 0.52), and 181-365 and 366-760 (r(g) = 0.42). The low genetic correlations in the current study suggest that survival at different periods may be affected by the same genes with different expression or by different genes. Even though the additive genetic variances of survival traits were small, it might be possible to improve these traits by traditional or genomic selection. © 2014 Blackwell Verlag GmbH.

  15. Epidemiology Without Biology: False Paradigms, Unfounded Assumptions, and Specious Statistics in Radiation Science (with Commentaries by Inge Schmitz-Feuerhake and Christopher Busby and a Reply by the Authors).

    Science.gov (United States)

    Sacks, Bill; Meyerson, Gregory; Siegel, Jeffry A

    Radiation science is dominated by a paradigm based on an assumption without empirical foundation. Known as the linear no-threshold (LNT) hypothesis, it holds that all ionizing radiation is harmful no matter how low the dose or dose rate. Epidemiological studies that claim to confirm LNT either neglect experimental and/or observational discoveries at the cellular, tissue, and organismal levels, or mention them only to distort or dismiss them. The appearance of validity in these studies rests on circular reasoning, cherry picking, faulty experimental design, and/or misleading inferences from weak statistical evidence. In contrast, studies based on biological discoveries demonstrate the reality of hormesis: the stimulation of biological responses that defend the organism against damage from environmental agents. Normal metabolic processes are far more damaging than all but the most extreme exposures to radiation. However, evolution has provided all extant plants and animals with defenses that repair such damage or remove the damaged cells, conferring on the organism even greater ability to defend against subsequent damage. Editors of medical journals now admit that perhaps half of the scientific literature may be untrue. Radiation science falls into that category. Belief in LNT informs the practice of radiology, radiation regulatory policies, and popular culture through the media. The result is mass radiophobia and harmful outcomes, including forced relocations of populations near nuclear power plant accidents, reluctance to avail oneself of needed medical imaging studies, and aversion to nuclear energy-all unwarranted and all harmful to millions of people.

  16. A Near-Threshold Shape Resonance in the Valence-Shell Photoabsorption of Linear Alkynes

    Energy Technology Data Exchange (ETDEWEB)

    Jacovella, U.; Holland, D. M. P.; Boyé-Péronne, S.; Gans, Bérenger; de Oliveira, N.; Ito, K.; Joyeux, D.; Archer, L. E.; Lucchese, R. R.; Xu, Hong; Pratt, S. T.

    2015-12-17

    The room-temperature photoabsorption spectra of a number of linear alkynes with internal triple bonds (e.g., 2-butyne, 2-pentyne, and 2- and 3-hexyne) show similar resonances just above the lowest ionization threshold of the neutral molecules. These features result in a substantial enhancement of the photoabsorption cross sections relative to the cross sections of alkynes with terminal triple bonds (e.g., propyne, 1-butyne, 1-pentyne,...). Based on earlier work on 2-butyne [Xu et al., J. Chem. Phys. 2012, 136, 154303], these features are assigned to excitation from the neutral highest occupied molecular orbital (HOMO) to a shape resonance with g (l = 4) character and approximate pi symmetry. This generic behavior results from the similarity of the HOMOs in all internal alkynes, as well as the similarity of the corresponding g pi virtual orbital in the continuum. Theoretical calculations of the absorption spectrum above the ionization threshold for the 2- and 3-alkynes show the presence of a shape resonance when the coupling between the two degenerate or nearly degenerate pi channels is included, with a dominant contribution from l = 4. These calculations thus confirm the qualitative arguments for the importance of the l = 4 continuum near threshold for internal alkynes, which should also apply to other linear internal alkynes and alkynyl radicals. The 1-alkynes do not have such high partial waves present in the shape resonance. The lower l partial waves in these systems are consistent with the broader features observed in the corresponding spectra.

  17. What happens at very low levels of radiation exposure ? Are the low dose exposures beneficial ?

    International Nuclear Information System (INIS)

    Deniz, Dalji

    2006-01-01

    This adaptive response seems to be the manifestation of a protective effect that may reduce risk at very low doses. Current knowledge in molecular biology shows no evidence of a threshold for stochastic effects; therefore, any level of radiation may be considered to cause them. Conversely, some studies show that low levels of irradiation are in fact beneficial to health (radiation hormesis). However, in the absence of clear scientific evidence, regulators adopted a conservative approach and consider all levels of radiation as potentially damaging to the human body (the LNT theory). According to the LNT theory, the effects of low doses of ionizing radiation can be estimated by linear extrapolation from effects observed at high doses. There is no safe dose, because even very low doses of ionizing radiation produce some biological effect. The results of many investigations do not support the LNT theory. Furthermore, the relationship between environmental radon concentrations and lung cancer even contradicts this theory and clearly suggests a hormetic effect, i.e. radiation hormesis. Although data are still incomplete, extensive epidemiological studies have indicated that radiation hormesis really exists. In this review, the contradictory evidence concerning the linear no-threshold theory and the radiation hormesis effect is discussed.

  18. Análise genética de escores de avaliação visual de bovinos com modelos bayesianos de limiar e linear Genetic analysis for visual scores of bovines with the linear and threshold bayesian models

    Directory of Open Access Journals (Sweden)

    Carina Ubirajara de Faria

    2008-07-01

    Full Text Available The objective of this work was to compare estimates of genetic parameters obtained in single-trait and two-trait Bayesian analyses, under linear and threshold animal models, for categorical morphological traits of bovines of the Nelore breed. Data on musculature, physical structure and conformation were obtained between 2000 and 2005 from 3,864 animals of 13 farms participating in the Nelore Brazil Program. Single-trait and two-trait Bayesian analyses were performed under threshold and linear animal models. In general, the threshold and linear models were both efficient at estimating genetic parameters for visual scores in the single-trait Bayesian analyses. In the two-trait analyses, when continuous and categorical data were used together, the threshold model yielded genetic correlation estimates of higher magnitude than those from the linear model; with categorical data alone, the heritability estimates were similar. The advantage of the linear model was the shorter time required to run the analyses. In the genetic evaluation of animals for visual scores, the use of a threshold or linear model did not influence the ranking of animals by predicted breeding values, indicating that both models can be used in breeding programs.

  19. Collective dose: Dogma, tool, or trauma?

    International Nuclear Information System (INIS)

    Becker, K.

    1996-01-01

    In Europe as well as in the United States, the argument continues in the radiation protection community between the "fundamentalists," who firmly believe in the linear no-threshold (LNT) hypothesis and the closely related concept of collective dose, and the "pragmatists," who have serious doubts about these concepts for both radiobiological and socioeconomic reasons. The latter view is reflected in many recent compilations in the scientific literature, in particular in the books by G. Walinder and S. Kondo. The fundamentalist view has been expressed in other recent publications. What has been described as the good old boys' network of the establishment threatens nonbelievers with excommunication: in his 1996 Sievert lecture, D. Beninson described doubts about the LNT hypothesis as "arrogant ignorance"; A. Gonzales, in a 1995 letter to the author, described them as "intellectual laziness"; and R. Clarke, chairman of the International Commission on Radiological Protection (ICRP), during the 1996 IRPA Congress, described them as "seriously misguided." Threshold or no threshold, that is the question - the most important one for this and the next generation of health physicists. Corrections of the current dogmas are needed soon and should be initiated not only by the nuclear community (which may be blamed by opinion makers for a biased view) but also by those members of the radiation protection community who are not merely interested in "keeping the hazard alive" for reasons of publicity, research funding, and sales of instruments and services. The International Atomic Energy Agency will devote a symposium to these questions next year. The results should be interesting

  20. HERITABILITY AND BREEDING VALUE OF SHEEP FERTILITY ESTIMATED BY MEANS OF THE GIBBS SAMPLING METHOD USING THE LINEAR AND THRESHOLD MODELS

    Directory of Open Access Journals (Sweden)

    DARIUSZ Piwczynski

    2013-03-01

    Full Text Available The research was carried out on 4,030 Polish Merino ewes born in the years 1991-2001, kept in 15 flocks from the Pomorze and Kujawy region. Fertility of ewes in subsequent reproduction seasons was analysed with the use of multiple logistic regression. The research showed a statistically significant influence of flock, year of birth, age of dam, and the flock × year of birth interaction on ewe fertility. In order to estimate the genetic parameters, the Gibbs sampling method was applied, using univariate animal models, both linear and threshold. Heritability estimates of fertility, depending on the model, ranged from 0.067 to 0.104, whereas the corresponding repeatability estimates were 0.076 and 0.139, respectively. The obtained genetic parameters were then used to estimate the breeding values of the animals for the controlled trait (Best Linear Unbiased Prediction method) using linear and threshold models. The obtained breeding value rankings for the same trait under the linear and threshold models were strongly correlated with each other (rs = 0.972). Negative genetic trends of fertility (0.01-0.08% per year) were found.
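The Gibbs sampling machinery named above can be illustrated on a much smaller problem. The sketch below alternates draws from the full conditionals of a normal mean and variance; the priors, data, and seed are illustrative, not the sheep data set or the animal model itself:

```python
# Minimal Gibbs sampler sketch for a normal model with unknown mean and
# variance -- the same alternating-conditional-draws idea used (on a far
# larger scale) to estimate variance components in linear/threshold
# animal models. Flat prior on mu, reference prior on the variance.
import math
import random

random.seed(42)
data = [random.gauss(2.0, 1.0) for _ in range(200)]  # synthetic records
n, ybar = len(data), sum(data) / len(data)

mu, var = 0.0, 1.0
mu_draws, var_draws = [], []
for it in range(2000):
    # Draw mu | var, y ~ Normal(ybar, var/n)
    mu = random.gauss(ybar, math.sqrt(var / n))
    # Draw var | mu, y ~ scaled inverse chi-square with n degrees of freedom:
    # var = SSE / chi2_n, where chi2_n is a chi-square draw with n d.f.
    sse = sum((y - mu) ** 2 for y in data)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))
    var = sse / chi2
    if it >= 500:  # discard burn-in
        mu_draws.append(mu)
        var_draws.append(var)

post_mu = sum(mu_draws) / len(mu_draws)    # posterior mean of mu
post_var = sum(var_draws) / len(var_draws)  # posterior mean of var
```

The posterior means should land near the generating values (mean 2.0, variance 1.0), up to Monte Carlo and sampling noise.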

  1. The non-linear link between electricity consumption and temperature in Europe: A threshold panel approach

    Energy Technology Data Exchange (ETDEWEB)

    Bessec, Marie [CGEMP, Universite Paris-Dauphine, Place du Marechal de Lattre de Tassigny Paris (France); Fouquau, Julien [LEO, Universite d' Orleans, Faculte de Droit, d' Economie et de Gestion, Rue de Blois, BP 6739, 45067 Orleans Cedex 2 (France)

    2008-09-15

    This paper investigates the relationship between electricity demand and temperature in the European Union. We address this issue by means of a panel threshold regression model on 15 European countries over the last two decades. Our results confirm the non-linearity of the link between electricity consumption and temperature found in more limited geographical areas in previous studies. By distinguishing between North and South countries, we also find that this non-linear pattern is more pronounced in the warm countries. Finally, rolling regressions show that the sensitivity of electricity consumption to temperature in summer has increased in the recent period. (author)
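The threshold-regression idea can be sketched in miniature: fit separate linear regimes below and above a candidate temperature threshold and keep the threshold that minimizes the residual sum of squares. The data and variable names below are synthetic illustrations, not the study's panel model:

```python
# Sketch of threshold regression: electricity demand responds to
# temperature with different slopes below and above an unknown threshold,
# estimated here by grid search over the residual sum of squares (SSR).
# All data are synthetic; the "true" threshold is 18 degrees C.
import numpy as np

rng = np.random.default_rng(0)
temp = rng.uniform(-5, 35, 400)
true_thresh = 18.0
# Heating demand below the threshold, cooling demand above it, plus noise.
demand = np.where(temp < true_thresh,
                  100 - 2.0 * temp,
                  60 + 1.5 * (temp - true_thresh)) + rng.normal(0, 2, 400)

def ssr_at(threshold: float) -> float:
    """Total SSR from fitting one line per regime at this threshold."""
    total = 0.0
    for mask in (temp < threshold, temp >= threshold):
        X = np.column_stack([np.ones(mask.sum()), temp[mask]])
        beta, *_ = np.linalg.lstsq(X, demand[mask], rcond=None)
        total += float(((demand[mask] - X @ beta) ** 2).sum())
    return total

grid = np.arange(5.0, 30.0, 0.5)
best = min(grid, key=ssr_at)  # SSR-minimizing threshold estimate
```

With enough data on both sides, the grid search recovers a threshold close to the one used to generate the data.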

  2. Targeted and non-targeted effects of ionizing radiation

    OpenAIRE

    Omar Desouky; Nan Ding; Guangming Zhou

    2015-01-01

    For a long time it was generally accepted that effects of ionizing radiation such as cell death, chromosomal aberrations, DNA damage, mutagenesis, and carcinogenesis result from direct ionization of cell structures, particularly DNA, or from indirect damage through reactive oxygen species produced by radiolysis of water, and these biological effects were attributed to irreparable or misrepaired DNA damage in cells directly hit by radiation. Using linear non-threshold model (LNT), possible ris...

  3. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Bobby, R., Ph.D.

    2003-06-27

    applications of NEOTRANS2, indicate that nonlinear threshold-type dose-response relationships for excess stochastic effects (problematic nonlethal mutations, neoplastic transformation) should be expected after exposure to low linear energy transfer (LET) gamma rays or gamma rays in combination with high-LET alpha radiation. Similar thresholds are expected for low-dose-rate, low-LET beta irradiation. We attribute the thresholds to low-dose, low-LET radiation-induced protection against spontaneous mutations and neoplastic transformations. The protection is presumed mainly to involve selective elimination of problematic cells via apoptosis. Low-dose, low-LET radiation is presumed to trigger wide-area cell signaling, which in turn leads to problematic bystander cells (e.g., mutants, neoplastically transformed cells) selectively undergoing apoptosis. Thus, this protective bystander effect leads to selective elimination of problematic cells (a tissue cleansing process in vivo). However, this protective bystander effect is a different process from low-dose stimulation of the immune system. Low-dose, low-LET radiation stimulation of the immune system may explain why thresholds for inducing excess cancer appear much larger (possibly more than 100-fold larger) than thresholds for inducing excess mutations and neoplastic transformations when the dose rate is low. For ionizing radiation, the current risk assessment paradigm is such that the relative risk (RR) is always ≥ 1, no matter how small the dose. Our research results indicate that for low-dose or low-dose-rate, low-LET irradiation, RR < 1 may be more the rule than the exception. Directly tied to the current RR paradigm are the billion-dollar cleanup costs for radionuclide-contaminated DOE sites.
Our research results suggest that continued use of the current RR paradigm, in which RR ≥ 1, could cause more harm than benefit to society (e.g., by spreading unwarranted fear about phantom excess risks associated with low-dose low

  4. In defence of collective dose

    International Nuclear Information System (INIS)

    Fairlie, I.; Sumner, D.

    2000-01-01

    Recent proposals for a new scheme of radiation protection leave little room for collective dose estimations. This article discusses the history and present use of collective doses for occupational, ALARA, EIS and other purposes with reference to practical industry papers and government reports. The linear no-threshold (LNT) hypothesis suggests that collective doses which consist of very small doses added together should be used. Moral and ethical questions are discussed, particularly the emphasis on individual doses to the exclusion of societal risks, uncertainty over effects into the distant future and hesitation over calculating collective detriments. It is concluded that for moral, practical and legal reasons, collective dose is a valid parameter which should continue to be used. (author)

  5. Knowledge on radiation dose-rate for risk communication on nuclear power plants

    International Nuclear Information System (INIS)

    Sugiyama, Ken-ichiro

    2013-01-01

    The sense of anxiety about radiation after the Fukushima Dai-ichi accident has not disappeared, because of the nightmare scenario on radiation cultivated through the Cold War era, beginning with the atomic bombings of Hiroshima and Nagasaki. In the present paper, from the viewpoint of establishing the social acceptance of nuclear power plants as well as new, reasonable regulation, the biological defense in depth (production of anti-oxidants, DNA repair, cell death/apoptosis, and immune defense mechanisms) elucidated over the past few decades is presented in comparison with the linear no-threshold (LNT) model for the induction of cancer in the range up to 100 mSv (as single or annual doses) applied in the present regulation. (author)

  6. Biological effects of low doses of ionizing radiation: Conflict between assumptions and observations

    International Nuclear Information System (INIS)

    Kesavan, P.C.; Devasagayam, T.P.A.

    1997-01-01

    Recent epidemiological data on cancer incidence among the A-bomb survivors and, more importantly, experimental studies in cell and molecular radiobiology do not lend unequivocal support to the "linear, no threshold" (LNT) hypothesis; in fact, the discernible evidence that low and high doses of ionizing radiation induce qualitatively different or even opposite effects cannot be summarily rejected. The time has come to examine the mechanistic aspects of "radiation hormesis" and the "radioadaptive response" seriously, rather than proclaiming profound disbelief in these phenomena. To put the discussion in a serious scientific mode, we briefly catalogue here reports in the literature on gene expression differentially influenced by low and high doses. These are not explicable in terms of the current radiation paradigm. (author)

  7. Thresholds, switches and hysteresis in hydrology from the pedon to the catchment scale: a non-linear systems theory

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Hysteresis is a rate-independent non-linearity that is expressed through thresholds, switches, and branches. Exceedance of a threshold, or the occurrence of a turning point in the input, switches the output onto a particular output branch. Rate-independent branching on a very large set of switches with non-local memory is the central concept in the new definition of hysteresis. Hysteretic loops are a special case. A self-consistent mathematical description of hydrological systems with hysteresis demands a new non-linear systems theory of adequate generality. The goal of this paper is to establish this and to show how this may be done. Two results are presented: a conceptual model for the hysteretic soil-moisture characteristic at the pedon scale and a hysteretic linear reservoir at the catchment scale. Both are based on the Preisach model. A result of particular significance is the demonstration that the independent domain model of the soil moisture characteristic due to Childs, Poulavassilis, Mualem and others, is equivalent to the Preisach hysteresis model of non-linear systems theory, a result reminiscent of the reduction of the theory of the unit hydrograph to linear systems theory in the 1950s. A significant reduction in the number of model parameters is also achieved. The new theory implies a change in modelling paradigm.
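The Preisach construction mentioned above can be sketched as a superposition of two-state relay operators ("hysterons"), each with its own up and down switching thresholds; rate independence and branching emerge from the ensemble's memory. The thresholds and equal weights below are illustrative assumptions:

```python
# Minimal Preisach-model sketch: the output is the average of relay
# "hysterons", each switching up at threshold alpha and down at beta
# (beta < alpha) and otherwise holding its last state. Driving the input
# up and then back down traces two different branches: hysteresis.

class Hysteron:
    def __init__(self, beta: float, alpha: float):
        self.beta, self.alpha = beta, alpha  # down/up switching thresholds
        self.state = 0                       # start on the lower branch

    def step(self, u: float) -> int:
        if u >= self.alpha:
            self.state = 1  # switch up
        elif u <= self.beta:
            self.state = 0  # switch down
        # between beta and alpha the relay keeps its last state (memory)
        return self.state

# A triangular grid of thresholds with beta < alpha, equal weights.
hysterons = [Hysteron(b / 10, a / 10)
             for a in range(1, 11) for b in range(0, a)]

def output(u: float) -> float:
    return sum(h.step(u) for h in hysterons) / len(hysterons)

# Drive the input from 0 to 1 and back down, sampling the two branches.
up = [output(u / 20) for u in range(21)]            # ascending branch
down = [output(u / 20) for u in range(20, -1, -1)]  # descending branch
```

At the same input value (e.g. 0.5), the descending branch sits above the ascending one, which is exactly the branching behavior the Preisach model formalizes.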

  8. The dose makes the poison. Even for radiation; Die Dosis macht das Gift. Auch bei Strahlenbelastung

    Energy Technology Data Exchange (ETDEWEB)

    Langeheine, Juergen

    2014-11-15

    The dose makes the poison, a saying of Paracelsus, a physician who lived half a millennium ago, is still valid today. Nevertheless, this generally accepted fact is set aside for ionizing radiation, which is often wrongly referred to as "radioactive radiation". Here the LNT hypothesis (Linear No Threshold) applies, a dose-effect relationship adopted by the ICRP, the International Commission on Radiological Protection, on which the EU directives and the German Radiation Protection Ordinance are based. The LNT hypothesis states that even the smallest dose of radiation already carries a potential danger. It was introduced as a precaution, on the assumptions that self-healing mechanisms of cells damaged even by weak radiation can be excluded and that every radiation-induced damage inevitably leads to cell mutation and with it to cancer. Without further knowledge, it was assumed that the same mechanism of cancer development applies to high and to small doses. This assumption has turned out to be wrong, as findings are increasingly reported which show that small doses of ionizing radiation demonstrably cause no damage and, on the contrary, can even be beneficial to health.

  9. Direction detection thresholds of passive self-motion in artistic gymnasts.

    Science.gov (United States)

    Hartmann, Matthias; Haller, Katia; Moser, Ivan; Hossner, Ernst-Joachim; Mast, Fred W

    2014-04-01

    In this study, we compared direction detection thresholds of passive self-motion in the dark between artistic gymnasts and controls. Twenty-four professional female artistic gymnasts (ranging from 7 to 20 years) and age-matched controls were seated on a motion platform and asked to discriminate the direction of angular (yaw, pitch, roll) and linear (leftward-rightward) motion. Gymnasts showed lower thresholds for the linear leftward-rightward motion. Interestingly, there was no difference for the angular motions. These results show that the outstanding self-motion abilities in artistic gymnasts are not related to an overall higher sensitivity in self-motion perception. With respect to vestibular processing, our results suggest that gymnastic expertise is exclusively linked to superior interpretation of otolith signals when no change in canal signals is present. In addition, thresholds were overall lower for the older (14-20 years) than for the younger (7-13 years) participants, indicating the maturation of vestibular sensitivity from childhood to adolescence.

  10. No evidence for a critical salinity threshold for growth and reproduction in the freshwater snail Physa acuta

    International Nuclear Information System (INIS)

    Kefford, Ben J.; Nugegoda, Dayanthi

    2005-01-01

    The growth and reproduction of the freshwater snail Physa acuta (Gastropoda: Physidae) were measured at various salinity levels (growth: distilled water, 50, 100, 500, 1000 and 5000 μS/cm; reproduction: deionized water, 100, 500, 1000 and 3000 μS/cm) established using the artificial sea salt, Ocean Nature. This was done to examine the assumption that there is no direct effect of salinity on freshwater animals until a threshold, beyond which sub-lethal effects, such as reduction in growth and reproduction, will occur. Growth of P. acuta was maximal in terms of live and dry mass at salinity levels 500-1000 μS/cm. The number of eggs produced per snail per day was maximal between 100 and 1000 μS/cm. Results show that rather than a threshold response to salinity, small rises in salinity (from low levels) can produce increased growth and reproduction until a maximum is reached. Beyond this salinity, further increases result in a decrease in growth and reproduction. Studies on the growth of freshwater invertebrates and fish have generally shown a similar lack of a threshold response. The implications for assessing the effects of salinisation on freshwater organisms need to be further considered. - Responses of snails to increasing salinity were non-linear

  11. No-threshold dose-response curves for nongenotoxic chemicals: Findings and applications for risk assessment

    International Nuclear Information System (INIS)

    Sheehan, Daniel M.

    2006-01-01

    We tested the hypothesis that no threshold exists when estradiol acts through the same mechanism as an active endogenous estrogen. A Michaelis-Menten (MM) equation accounting for response saturation, background effects, and endogenous estrogen level fit a turtle sex-reversal data set with no threshold and estimated the endogenous dose. Additionally, 31 diverse literature dose-response data sets were analyzed by adding a term for nonhormonal background; good fits were obtained but endogenous dose estimations were not significant due to low resolving power. No thresholds were observed. Data sets were plotted using a normalized MM equation; all 178 data points were accommodated on a single graph. Response rates from ∼1% to >95% were well fit. The findings contradict the threshold assumption and low-dose safety. Calculating risk and assuming additivity of effects from multiple chemicals acting through the same mechanism rather than assuming a safe dose for nonthresholded curves is appropriate
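The Michaelis-Menten fit with a nonhormonal background and an endogenous-dose offset described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's turtle sex-reversal data set; the parameter names (`r_max`, `k`, `d_endo`, `background`) and all numbers are assumptions chosen only to show the fitting approach:

```python
import numpy as np
from scipy.optimize import curve_fit

def mm_response(dose, r_max, k, d_endo, background):
    """Saturating MM-type dose-response with a nonhormonal background
    term and an endogenous-dose offset d_endo (names illustrative).
    Note response at dose 0 is nonzero: no threshold."""
    total = dose + d_endo
    return background + r_max * total / (k + total)

# Synthetic data loosely mimicking a saturating, no-threshold response
rng = np.random.default_rng(0)
doses = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
true = dict(r_max=95.0, k=5.0, d_endo=0.5, background=2.0)
obs = mm_response(doses, **true) + rng.normal(0.0, 1.0, doses.size)

# Fit all four parameters; as the abstract notes, d_endo may be
# poorly resolved when the data have low resolving power.
popt, _ = curve_fit(mm_response, doses, obs,
                    p0=[80.0, 1.0, 0.1, 1.0],
                    bounds=(0.0, [200.0, 100.0, 10.0, 20.0]))
r_max, k, d_endo, background = popt
```

The key qualitative point reproduced here is that the fitted curve is already rising at dose zero, so no threshold appears anywhere on the curve.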

  12. Risk Assessments for Workers and the Population Following the Chernobyl Accident. Annex XI of Technical Volume 4

    International Nuclear Information System (INIS)

    2015-01-01

    The doses received by emergency workers at the Fukushima Daiichi NPP were much lower than those of the Chernobyl emergency workers (referred to as ‘liquidators’) and there is no evidence of an increased risk for Chernobyl workers below an equivalent dose of 150 mGy, so the inferred risks are expected to be small. Nevertheless, it is useful to adopt a similar modelling approach. The main estimates of radiation risks for the cohort of Chernobyl emergency workers who received moderate doses are in good quantitative agreement with the results for atomic bomb survivors if the linear non-threshold (LNT) model is used. The minimum latency period for radiation related solid cancers in the Russian cohort was estimated as four years. No statistically significant relationship was found between the thyroid cancer incidence and external radiation for the Russian cohort of liquidators

  13. Time-dependent local-to-normal mode transition in triatomic molecules

    Science.gov (United States)

    Cruz, Hans; Bermúdez-Montaña, Marisol; Lemus, Renato

    2018-01-01

    Time-evolution of the vibrational states of two interacting harmonic oscillators in the local mode scheme is presented. A local-to-normal mode transition (LNT) is identified and studied from temporal perspective through time-dependent frequencies of the oscillators. The LNT is established as a polyad-breaking phenomenon from the local standpoint for the stretching degrees of freedom in a triatomic molecule. This study is carried out in the algebraic representation of bosonic operators. The dynamics of the states are determined via the solutions of the corresponding nonlinear Ermakov equation and a local time-dependent polyad is obtained as a tool to identify the LNT. Applications of this formalism to H2O, CO2, O3 and NO2 molecules in the adiabatic, sudden and linear regime are considered.
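The nonlinear Ermakov (Pinney) equation invoked above, rho'' + omega(t)^2 rho = 1/rho^3, can be integrated numerically. This is a minimal sketch with an illustrative constant frequency, not the time-dependent molecular frequencies of the paper; for constant omega = 1 the equilibrium solution is rho(t) = 1, which gives a simple correctness check:

```python
import numpy as np
from scipy.integrate import solve_ivp

def ermakov_rhs(t, y, omega):
    """Ermakov equation rho'' + omega(t)^2 * rho = 1/rho^3,
    written as a first-order system y = (rho, rho')."""
    rho, drho = y
    return [drho, 1.0 / rho**3 - omega(t) ** 2 * rho]

# Constant frequency omega = 1: starting at the equilibrium
# rho = 1, rho' = 0, the solution should stay at rho = 1.
sol = solve_ivp(ermakov_rhs, (0.0, 20.0), [1.0, 0.0],
                args=(lambda t: 1.0,), rtol=1e-9, atol=1e-9)
rho_end = sol.y[0, -1]
```

Replacing the constant `omega` with a time-dependent frequency profile would reproduce the adiabatic, sudden, or linear regimes the abstract mentions.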

  14. Televīzijas loma neapmierinātībā ar politiku: LTV1, LNT, TV3 nedēļas analītisko raidījumu satura, to veidotāju un ekspertu vērtējumu analīze (2008.gada oktobris-2009.gada marts)

    OpenAIRE

    Novodvorskis, Vladimirs

    2009-01-01

    The master's thesis "Televīzijas loma neapmierinātībā ar politiku: LTV1, LNT, TV3 nedēļas analītisko raidījumu satura, to veidotāju un ekspertu vērtējumu analīze (2008. gada oktobris – 2009. gada marts)" ["The role of television in dissatisfaction with politics: an analysis of the content of the LTV1, LNT and TV3 weekly analytical programmes, of their producers, and of expert assessments (October 2008 – March 2009)"] was written by Vladimirs Novodvorskis, a student of the Department of Communication Studies at the University of Latvia. The thesis examines how the television news-analysis programmes Panorāma, De facto (LTV1), LNT Top 10 (LNT) and Nekā personīga (TV3) shape negative audience attitudes toward pol...

  15. Gradient-driven flux-tube simulations of ion temperature gradient turbulence close to the non-linear threshold

    Energy Technology Data Exchange (ETDEWEB)

    Peeters, A. G.; Rath, F.; Buchholz, R.; Grosshauser, S. R.; Strintzi, D.; Weikl, A. [Physics Department, University of Bayreuth, Universitätsstrasse 30, Bayreuth (Germany); Camenen, Y. [Aix Marseille Univ, CNRS, PIIM, UMR 7345, Marseille (France); Candy, J. [General Atomics, PO Box 85608, San Diego, California 92186-5608 (United States); Casson, F. J. [CCFE, Culham Science Centre, Abingdon OX14 3DB, Oxon (United Kingdom); Hornsby, W. A. [Max Planck Institut für Plasmaphysik, Boltzmannstrasse 2 85748 Garching (Germany)

    2016-08-15

    It is shown that ion temperature gradient turbulence close to the threshold exhibits a long-time behaviour, with smaller heat fluxes at later times. This reduction is connected with the slow growth of long-wavelength zonal flows, and consequently the numerical dissipation applied to these flows must be sufficiently small. Close to the nonlinear threshold for turbulence generation, a relatively small dissipation can maintain a turbulent state with a sizeable heat flux, through the damping of the zonal flow. Lowering the dissipation causes the turbulence, for temperature gradients close to the threshold, to be subdued. The heat flux then does not go smoothly to zero when the threshold is approached from above. Rather, a finite minimum heat flux is obtained below which no fully developed turbulent state exists. The threshold value of the temperature gradient length at which this finite heat flux is obtained is up to 30% larger compared with the threshold value obtained by extrapolating the heat flux to zero, and the Cyclone base case is found to be nonlinearly stable. Transport is subdued when a fully developed staircase structure in the E × B shearing rate forms. Just above the threshold, an incomplete staircase develops, and transport is mediated by avalanche structures which propagate through the marginally stable regions.

  16. An examination of adaptive cellular protective mechanisms using a multi-stage carcinogenesis model

    International Nuclear Information System (INIS)

    Schollnberger, H.; Stewart, R. D.; Mitchel, R. E. J.; Hofmann, W.

    2004-01-01

    A multi-stage cancer model that describes the putative rate-limiting steps in carcinogenesis was developed and used to investigate the potential impact on lung cancer incidence of the hormesis mechanisms suggested by Feinendegen and Pollycove. In this deterministic cancer model, radiation and endogenous processes damage the DNA of target cells in the lung. Some fraction of the misrepaired or unrepaired DNA damage induces genomic instability and, ultimately, leads to the accumulation of malignant cells. The model accounts for cell birth and death processes. It also includes a rate of malignant transformation and a lag period for tumour formation. Cellular defence mechanisms are incorporated into the model by postulating dose- and dose-rate-dependent radical scavenging. The accuracy of DNA damage repair also depends on dose and dose rate. Sensitivity studies were conducted to identify critical model inputs and to help define the shapes of the cumulative lung cancer incidence curves that may arise when dose- and dose-rate-dependent cellular defence mechanisms are incorporated into a multi-stage cancer model. For lung cancer, both linear no-threshold (LNT) and non-LNT shaped responses can be obtained. The reported studies clearly show that it is critical to know whether, and to what extent, multiply damaged DNA sites are formed by endogenous processes. Model inputs that give rise to U-shaped responses are consistent with an effective cumulative lung cancer incidence threshold that may be as high as 300 mGy (4 mGy per year for 75 years). (Author) 11 refs
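The multi-stage structure underlying such models can be illustrated with the classical Armitage-Doll hazard, of which the paper's model is an elaboration. This sketch is not the authors' model: the number of stages, the rates, and the linear dose modulation of one stage are hypothetical, chosen only to show how a multi-stage hazard turns a per-stage rate into a cumulative incidence:

```python
import math

def armitage_doll_hazard(t, rates):
    """Classical Armitage-Doll multistage hazard: with n rate-limiting
    steps of rates mu_1..mu_n, h(t) ~ (prod mu_i) * t^(n-1) / (n-1)!."""
    n = len(rates)
    return math.prod(rates) * t ** (n - 1) / math.factorial(n - 1)

def cumulative_incidence(age, rates, steps=1000):
    """Cumulative incidence 1 - exp(-integral of h) by midpoint rule."""
    dt = age / steps
    h_int = sum(armitage_doll_hazard((i + 0.5) * dt, rates) * dt
                for i in range(steps))
    return 1.0 - math.exp(-h_int)

# LNT-style assumption for illustration: one stage's rate rises
# linearly with dose (all numbers hypothetical).
base_rates = [1e-3] * 5

def incidence_at_dose(dose):
    rates = base_rates.copy()
    rates[0] *= (1.0 + 10.0 * dose)
    return cumulative_incidence(75.0, rates)
```

Making the repair or scavenging efficiency itself dose- and dose-rate-dependent, as the abstract describes, is what lets such a model depart from the LNT shape and produce U-shaped responses.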

  17. Simulation of deposition and activity distribution of radionuclides in human airways

    International Nuclear Information System (INIS)

    Farkas, A.; Balashazy, I.; Szoke, I.; Hofmann, W.; Golser, R.

    2002-01-01

    The aim of our research activities is the modelling of the biological processes related to the development of lung cancer in the large central airways, observed in uranium miners and caused by the inhalation of radionuclides (especially alpha-emitting radon decay products). Statistical data show that in uranium miners lung cancer developed mainly in the third to fifth airway generations, and especially in the right upper lobe. It is therefore important to study the physical and biological effects in this section of the human airways to find relations between radiation dose and adverse health effects. These results may provide useful information about the validity or invalidity of the currently used LNT (Linear-No-Threshold) dose-effect hypothesis at low doses

  18. Whole body exposure to low-dose γ-radiation enhances the antioxidant defense system

    International Nuclear Information System (INIS)

    Pathak, C.M.; Avti, P.K.; Khanduja, K.L.; Sharma, S.C.

    2008-01-01

    It is believed, as per the Linear-No-Threshold (LNT) hypothesis, that the extent of cellular damage at low radiation doses is proportional to the effects observed at high doses. However, this notion may not hold for low-dose radiation exposure in living systems. Recent evidence suggests that living organisms do not respond to ionizing radiation in a linear manner in the low dose range of 0.01-0.5 Gy, but rather restore homeostasis, both in vivo and in vitro, through normal physiological mechanisms such as cellular and DNA repair processes, immune reactions, antioxidant defense, adaptive responses, and stimulation of growth. In this study, we have attempted to find the critical radiation dose range and the post-irradiation period during which the antioxidant defense systems in the lungs, liver and kidneys remain stimulated after whole-body exposure of the animals to low-dose radiation

  19. An abuse of risk assessment: how regulatory agencies improperly adopted LNT for cancer risk assessment.

    Science.gov (United States)

    Calabrese, Edward J

    2015-04-01

    The Genetics Panel of the National Academy of Sciences' Committee on Biological Effects of Atomic Radiation (BEAR) recommended the adoption of the linear dose-response model in 1956, abandoning the threshold dose-response for genetic risk assessments. This recommendation was quickly generalized to include somatic cells for cancer risk assessment and later was instrumental in the adoption of linearity for carcinogen risk assessment by the Environmental Protection Agency. The Genetics Panel failed to provide any scientific assessment to support this recommendation and refused to do so when later challenged by other leading scientists. Thus, the linearity model used in cancer risk assessment was based on ideology rather than science and originated with the recommendation of the NAS BEAR Committee Genetics Panel. Historical documentation in support of these conclusions is provided in the transcripts of the Panel meetings and in previously unexamined correspondence among Panel members.

  20. Exposures at low doses and biological effects of ionizing radiations

    International Nuclear Information System (INIS)

    Masse, R.

    2000-01-01

    Everyone is exposed to radiation from natural, man-made and medical sources, and the world-wide average annual exposure can be set at about 3.5 mSv. Exposure to natural sources is characterised by very large fluctuations, spanning up to two orders of magnitude. Millions of inhabitants are continuously exposed to external doses as high as 10 mSv per year, delivered at low dose rates; very few workers are exposed above the legal limit of 50 mSv/year; and, regarding accidental exposures, only 5% of the 116 000 people evacuated following the Chernobyl disaster received doses above 100 mSv. Epidemiological surveys of accidentally, occupationally or medically exposed groups have revealed radio-induced cancers, mostly following high dose-rate exposures, only above 100 mSv. Risk coefficients were derived from these studies and projected into linear models of risk (the linear non-threshold hypothesis: LNT) for the purpose of risk management following exposures at low doses and low dose rates. The legitimacy of this approach has been questioned by the Academy of Sciences and the Academy of Medicine in France, arguing: that LNT was not supported by the Hiroshima and Nagasaki studies when the neutron dose was revisited; that linear modelling failed to explain why so many site-related cancers were obviously nonlinearly related to the dose, especially when theory predicted they ought to be; that no evidence could be found of radio-induced cancers related to natural exposures or to low exposures at the workplace; and that no evidence of genetic disease could be shown in any of the exposed groups. Arguments were provided from cellular and molecular biology helping to solve this issue, all resulting in dismissing the LNT hypothesis. These arguments included: different mechanisms of DNA repair at high and low dose rate; the influence of inducible stress responses modifying mutagenesis and lethality; bystander effects allowing it to be considered that individual

  1. Modelos lineares e não lineares da curva de Phillips para previsão da taxa de inflação no Brasil

    Directory of Open Access Journals (Sweden)

    Elano Ferreira Arruda

    2011-09-01

    Full Text Available This paper compares forecasts of the Brazilian monthly inflation rate produced by different linear and nonlinear time-series models and Phillips curve specifications. In general, the nonlinear models showed better predictive performance. A VAR model produced the smallest mean squared forecast error (MSE) among the linear models, while the best forecasts of all were generated by the threshold-augmented Phillips curve, whose MSE was 20% lower than that of the VAR model. This difference is significant according to the Diebold and Mariano (1995) test.
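The Diebold-Mariano comparison invoked above can be sketched as follows, in its simplest form under squared-error loss and without the small-sample or autocorrelation (HAC) corrections of the full 1995 test. The two error series below are synthetic stand-ins, not the paper's VAR and threshold Phillips curve forecast errors:

```python
import numpy as np

def diebold_mariano(e1, e2):
    """Simplest DM statistic for equal predictive accuracy under
    squared-error loss: mean loss differential over its standard
    error. A sketch, not the full HAC-corrected test."""
    d = np.asarray(e1) ** 2 - np.asarray(e2) ** 2
    return d.mean() / np.sqrt(d.var(ddof=1) / d.size)

# Synthetic forecast errors: the second "model" is more accurate
# (smaller error variance), so the statistic should be positive.
rng = np.random.default_rng(1)
n = 200
errors_model_a = rng.normal(0.0, 1.0, n)   # e.g. a linear VAR
errors_model_b = rng.normal(0.0, 0.8, n)   # e.g. a threshold model
dm = diebold_mariano(errors_model_a, errors_model_b)
```

Under the null of equal accuracy the statistic is asymptotically standard normal, so values well beyond ±1.96 indicate a significant difference, which is the sense in which the paper's 20% MSE gap is tested.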

  2. The linear hypothesis - an idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed

  3. Rad-by-rad (bit-by-bit): triumph of evidence over activities fostering fear of radiogenic cancers at low doses

    International Nuclear Information System (INIS)

    Strzelczyk, J.; Potter, W.; Zdrojewicz, Z.

    2006-01-01

    Full text: Large segments of Western population hold sciences in low esteem. This trend became particularly pervasive in the field of radiation sciences in recent decades. The resulting lack of knowledge, easily filled with fear that feeds on itself, makes people susceptible to prevailing dogmas. Decades-long moratorium on nuclear power in the US, resentment of "anything nuclear", delay/refusal to obtain medical radiation procedures are some of the societal consequences. The problem has been exacerbated by promulgation of the linear-no-threshold (LNT) dose response model by advisory bodies such as the ICRP, NCRP and others. This model assumes no safe level of radiation and implies that response is the same per unit dose regardless of the total dose or dose rate. The most recent (June 2005) report from the National Research Council, BEIR VII (Biological Effects of Ionizing Radiation) continues this approach and quantifies potential cancer risks at low doses by linear extrapolation of risk values obtained from epidemiological observations of populations exposed to high doses, 0.2 to 3 Sv. It minimizes significance of lack of evidence of adverse effects in populations exposed to low doses and discounts documented beneficial effects of low dose exposures on the human immune system. The LNT doctrine is in direct conflict with current findings of radiobiology and important features of modern radiation oncology. Fortunately, these aspects are addressed in-depth in another major report - issued jointly in March 2005 by two French Academies, of Sciences and of Medicine. The latter report is much less publicized, thus it is a responsibility of radiation professionals, physicists, nuclear engineers, and physicians to become familiar with its content and relevant studies, and to widely disseminate this information. To counteract biased media, we need to be creative in developing means of sharing good news about radiation with co-workers, patients, and the general public

  4. Evaluating the "Threshold Theory": Can Head Impact Indicators Help?

    Science.gov (United States)

    Mihalik, Jason P; Lynall, Robert C; Wasserman, Erin B; Guskiewicz, Kevin M; Marshall, Stephen W

    2017-02-01

    This study aimed to determine the clinical utility of biomechanical head impact indicators by measuring the sensitivity, specificity, positive predictive value (PV+), and negative predictive value (PV-) of multiple thresholds. Head impact biomechanics (n = 283,348) from 185 football players in one Division I program were collected. A multidisciplinary clinical team independently made concussion diagnoses (n = 24). We dichotomized each impact using diagnosis (yes = 24, no = 283,324) and across a range of plausible impact indicator thresholds (10g increments beginning with a resultant linear head acceleration of 50g and ending with 120g). Some thresholds had adequate sensitivity, specificity, and PV-. All thresholds had low PV+, with the best recorded PV+ less than 0.4% when accounting for all head impacts sustained by our sample. Even when conservatively adjusting the frequency of diagnosed concussions by a factor of 5 to account for unreported/undiagnosed injuries, the PV+ of head impact indicators at any threshold was no greater than 1.94%. Although specificity and PV- appear high, the low PV+ would generate many unnecessary evaluations if these indicators were the sole diagnostic criteria. The clinical diagnostic value of head impact indicators is considerably questioned by these data. Notwithstanding, valid sensor technologies continue to offer objective data that have been used to improve player safety and reduce injury risk.
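The metrics above follow directly from a 2x2 table of (impact above threshold) x (concussion diagnosed). The counts below are illustrative assumptions, not the study's per-threshold results (only the totals of 24 diagnoses among 283,348 impacts come from the abstract); they are chosen to show how a rare outcome drives PV+ down even when sensitivity and specificity look adequate:

```python
def indicator_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PV+ (PPV) and PV- (NPV) from a 2x2
    table of (impact above threshold) x (concussion diagnosed)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

# Hypothetical threshold performance: it flags 20 of the 24 diagnosed
# concussions plus 5,000 of the 283,324 non-injurious impacts.
tp, fn = 20, 4
fp = 5000
tn = 283324 - fp
sens, spec, ppv, npv = indicator_metrics(tp, fp, fn, tn)
```

Even with sensitivity above 0.8 and specificity near 0.98, PV+ here is about 0.4%, echoing the abstract's point: with 24 true positives in 283,348 impacts, almost every above-threshold impact is a false alarm.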

  5. Ramsar hot springs: how safe is to live in an environment with high level of natural radiation

    International Nuclear Information System (INIS)

    Mortazavi, S.M.J.

    2005-01-01

    Ramsar in northern Iran is among the world's well-known areas with highest levels of natural radiation. Annual exposure levels in areas with elevated levels of natural radiation in Ramsar are up to 260 mGy y⁻¹ and average exposure rates are about 10 mGy y⁻¹ for a population of about 2000 residents. Due to the local geology, which includes high levels of radium in rocks, soils, and groundwater, Ramsar residents are also exposed to high levels of alpha activity in the form of ingested radium and radium decay progeny as well as very high radon levels (over 1000 MBq m⁻³) in their dwellings. In some cases, the inhabitants of these areas receive doses much higher than the current ICRP-60 dose limit of 20 mSv y⁻¹. As the biological effects of low doses of radiation are not fully understood, the current radiation protection recommendations are based on the assumption of a linear, no-threshold (LNT) relationship between radiation dose and carcinogenic effects. Under LNT, areas having such levels of natural radiation must be evacuated or at least require immediate remedial actions. Inhabitants of the high level natural radiation areas (HLNRAs) of Ramsar are largely unaware of natural radiation, radon, or its possible health effects, and they have not encountered any harmful effects from living in their paternal houses. In this regard, it is often difficult to ask the inhabitants of HLNRAs of Ramsar to carry out remedial actions. Despite the fact that, considering LNT and ALARA, public health in HLNRAs like Ramsar is best served by relocating the inhabitants, the residents' health seems unaffected and relocation is upsetting to the residents. Based on the findings obtained by studies on the health effect of high levels of natural radiation in Ramsar, as well as other HLNRAs, no consistent detrimental effect has been detected so far. However, more research is needed to clarify if the regulatory authorities should set limiting

  6. Competitive inhibition can linearize dose-response and generate a linear rectifier.

    Science.gov (United States)

    Savir, Yonatan; Tu, Benjamin P; Springer, Michael

    2015-09-23

    Many biological responses require a dynamic range that is larger than standard bi-molecular interactions allow, yet also the ability to remain off at low input. Here we mathematically show that an enzyme reaction system involving a combination of competitive inhibition, conservation of the total level of substrate and inhibitor, and positive feedback can behave like a linear rectifier - that is, a network motif with an input-output relationship that is linearly sensitive to substrate above a threshold but unresponsive below the threshold. We propose that the evolutionarily conserved yeast SAGA histone acetylation complex may possess the proper physiological response characteristics and molecular interactions needed to perform as a linear rectifier, and we suggest potential experiments to test this hypothesis. One implication of this work is that linear responses and linear rectifiers might be easier to evolve or synthetically construct than is currently appreciated.
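The core of this rectifier behaviour, tight sequestration under mass conservation, can be sketched without the feedback loop. This is a simplified 1:1 molecular-titration calculation, not the paper's full enzyme model; the totals and dissociation constant are illustrative:

```python
import math

def free_substrate(s_total, i_total, kd):
    """Free (active) substrate when a tight-binding competitor of
    total amount i_total sequesters it: exact solution of the 1:1
    binding equilibrium with mass conservation."""
    b = s_total - i_total - kd
    return 0.5 * (b + math.sqrt(b * b + 4.0 * kd * s_total))

# For kd much smaller than the totals, this approaches the linear
# rectifier max(0, s_total - i_total): flat near zero below the
# threshold set by i_total, linearly increasing above it.
i_total, kd = 10.0, 1e-3
below = free_substrate(5.0, i_total, kd)    # below threshold: ~0
above = free_substrate(15.0, i_total, kd)   # above threshold: ~5
```

Sweeping `s_total` across the threshold traces out the characteristic "off then linear" input-output curve; the sharpness of the corner is set by how small `kd` is relative to the totals.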

  7. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding
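The classical ingredient of this protocol, distributing a secret so that only a quorum larger than the threshold can reconstruct it, is commonly realized with Shamir's scheme; the sketch below illustrates that classical part only (the quantum collaborative transformation is not modeled, and the prime and parameters are illustrative):

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def make_shares(secret, threshold, n_parties, rng=random.Random(42)):
    """Shamir (t, n) sharing: evaluate a random degree-(t-1)
    polynomial with constant term `secret` at points 1..n."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(threshold - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, poly(x)) for x in range(1, n_parties + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, threshold=3, n_parties=5)
recovered = reconstruct(shares[:3])   # any 3 of the 5 shares suffice
```

Any subset of at least `threshold` shares reconstructs the secret, while smaller subsets reveal nothing about it, which is the reusability-with-cheating-detection setting the abstract builds on.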

  8. Leukemia and ionizing radiation revisited

    Energy Technology Data Exchange (ETDEWEB)

    Cuttler, J.M. [Cuttler & Associates Inc., Vaughan, Ontario (Canada); Welsh, J.S. [Loyola University-Chicago, Dept. or Radiation Oncology, Stritch School of Medicine, Maywood, Illinois (United States)

    2016-03-15

    A world-wide radiation health scare was created in the late 1950s to stop the testing of atomic bombs and block the development of nuclear energy. In spite of the large amount of evidence that contradicts the cancer predictions, this fear continues. It impairs the use of low radiation doses in medical diagnostic imaging and radiation therapy. This brief article revisits the second of two key studies, which revolutionized radiation protection, and identifies a serious error that was missed. This error in analyzing the leukemia incidence among the 195,000 survivors, in the combined exposed populations of Hiroshima and Nagasaki, invalidates use of the LNT model for assessing the risk of cancer from ionizing radiation. The threshold acute dose for radiation-induced leukemia, based on about 96,800 humans, is identified to be about 50 rem, or 0.5 Sv. It is reasonable to expect that the thresholds for other cancer types are higher than this level. No predictions or hints of excess cancer risk (or any other health risk) should be made for an acute exposure below this value until there is scientific evidence to support the LNT hypothesis. (author)

  9. Energy conserving, linear scaling Born-Oppenheimer molecular dynamics.

    Science.gov (United States)

    Cawkwell, M J; Niklasson, Anders M N

    2012-10-07

    Born-Oppenheimer molecular dynamics simulations with long-term conservation of the total energy and a computational cost that scales linearly with system size have been obtained simultaneously. Linear scaling with a low pre-factor is achieved using density matrix purification with sparse matrix algebra and a numerical threshold on matrix elements. The extended Lagrangian Born-Oppenheimer molecular dynamics formalism [A. M. N. Niklasson, Phys. Rev. Lett. 100, 123004 (2008)] yields microcanonical trajectories with the approximate forces obtained from the linear scaling method that exhibit no systematic drift over hundreds of picoseconds and which are indistinguishable from trajectories computed using exact forces.
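The purification-with-threshold ingredient mentioned above can be sketched in dense form. This is a toy numpy illustration of McWeeny purification (P <- 3P^2 - 2P^3) with a numerical cutoff on small matrix elements, the step that enables sparse linear-scaling implementations; it is not the paper's code, and the Hamiltonian, chemical potential, and tolerances are assumptions:

```python
import numpy as np

def purify(h, mu, tol_element=1e-8, iters=40):
    """McWeeny density matrix purification. The initial guess maps the
    spectrum of h into [0, 1] with occupied states above 1/2; each
    iteration pushes eigenvalues toward 0 or 1. Elements below
    tol_element are zeroed, mimicking the sparsity threshold."""
    e_min, e_max = np.linalg.eigvalsh(h)[[0, -1]]
    lam = min(0.5 / (e_max - mu), 0.5 / (mu - e_min))
    p = lam * (mu * np.eye(len(h)) - h) + 0.5 * np.eye(len(h))
    for _ in range(iters):
        p2 = p @ p
        p = 3.0 * p2 - 2.0 * p2 @ p
        p[np.abs(p) < tol_element] = 0.0   # enforce sparsity
    return p

rng = np.random.default_rng(3)
a = rng.normal(size=(20, 20))
h = 0.5 * (a + a.T)     # toy symmetric "Hamiltonian"
mu = 0.0                # chemical potential inside the spectrum
p = purify(h, mu)
n_occ = np.trace(p)     # ~ number of eigenvalues below mu
```

At convergence P is idempotent (P^2 = P) and its trace counts the occupied states; in a real linear-scaling code the matrices are stored sparsely and the element threshold is what keeps the cost O(N).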

  10. Some environmental challenges which the uranium production industry faces in the 21st century

    International Nuclear Information System (INIS)

    Zhang Lisheng

    2004-01-01

    Some of the environmental challenges which the uranium production industry faces in the 21st century have been discussed in the paper. They are: the use of the linear non-threshold (LNT) model for radiation protection, the concept of 'controllable dose' as an alternative to the current International Commission on Radiological Protection (ICRP) system of dose limitation, the future of collective dose and the ALARA (As low As Reasonably Achievable) principle and the application of a risk-based framework for managing hazards. The author proposes that, the risk assessment/risk management framework could be used for managing the environmental, safety and decommissioning issues associated with the uranium fuel cycle. (author)

  11. Point of no return: experimental determination of the lethal hydraulic threshold during drought for loblolly pine (Pinus taeda)

    Science.gov (United States)

    Hammond, W.; Yu, K.; Wilson, L. A.; Will, R.; Anderegg, W.; Adams, H. D.

    2017-12-01

    The strength of the terrestrial carbon sink—dominated by forests—remains one of the greatest uncertainties in climate change modelling. How forests will respond to increased variability in temperature and precipitation is poorly understood, and experimental study to better inform global vegetation models in this area is needed. Necessary for achieving this goal is an understanding of how increased temperatures and drought will affect landscape level distributions of plant species. Quantifying physiological thresholds representing a point of no return from drought stress, including thresholds in hydraulic function, is critical to this end. Recent theoretical, observational, and modelling research has converged upon a threshold of 60 percent loss of hydraulic conductivity at mortality (PLClethal). However, direct experimental determination of lethal points in conductivity and cavitation during drought is lacking. We quantified thresholds in hydraulic function in loblolly pine, Pinus taeda, a commercially important timber species. In a greenhouse experiment, we exposed saplings (n = 96 total) to drought and rewatered treatment groups at variable levels of increasing water stress determined by pre-selected targets in pre-dawn water potential. Treatments also included a watered control with no drought, and drought with no rewatering. We measured physiological responses to water stress, including hydraulic conductivity, native PLC, water potential, foliar color, canopy die-back, and dark-adapted chlorophyll fluorescence. Following the rewatering treatment, we observed saplings for at least two months to determine which survived and which died. Using these data we calculated lethal physiological thresholds in water potential, directly measured PLC, and PLC inferred from water potential using a hydraulic vulnerability curve. We found that PLClethal inferred from water potential agreed with the 60% threshold suggested by previous research. However, directly

  12. A message to Fukushima: nothing to fear but fear itself.

    Science.gov (United States)

    Sutou, Shizuyo

    2016-01-01

    The linear no-threshold model (LNT) has been the basis for radiation protection policies worldwide for 60 years. LNT was fabricated without correct data. The Life Span Study of atomic bomb survivors (LSS) has provided fundamental data to support the LNT. In the LSS, exposure doses were underestimated and cancer risk was overestimated; LSS data do not support LNT anymore. In light of these findings, radiation levels and cancer risk in Fukushima are reexamined. Soon after the Fukushima accident, the International Commission on Radiological Protection issued an emergency recommendation that national authorities set the highest reference levels in the band of 20-100 mSv and, when the radiation source is under control, reference levels in the band of 1-20 mSv/y. The Japanese government set the limit dose as low as 1 mSv for the public and stirred up radiophobia, which continues to cause tremendous human, social, and economic losses. Estimated doses in three areas of Fukushima were 0.6-2.3 mSv/y in Tamura City, 1.1-5.5 mSv/y in Kawauchi Village, and 3.8-17 mSv/y in Iitate Village. Even after acute irradiation, no significant differences are found below 200 mSv for leukemia and below 100 mSv for solid cancers. These data indicate that cancer risk is negligible in Fukushima. Moreover, beneficial effects (lessened cancer incidence) were observed at 400-600 mSv in the LSS. Living organisms, which have established efficient defense mechanisms against radiation through 3.8 billion years of evolutionary history, can tolerate 1000 mSv/y if radiation dose rates are low. In fact, people have lived for generations without adverse health effects in high background radiation areas such as Kerala (35 mSv/y), India, and Ramsar (260 mSv/y), Iran. Low dose radiation itself is harmless, but fear of radiation is vitally harmful. When people return to the evacuation zones in Fukushima now and in the future, they will be exposed to such low radiation doses as to cause no physical

  13. Toxicological awakenings: the rebirth of hormesis as a central pillar of toxicology

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2005-01-01

    This paper assesses historical reasons that may account for the marginalization of hormesis as a dose-response model in the biomedical sciences in general and toxicology in particular. The most significant and enduring explanatory factors are the early and close association of the concept of hormesis with the highly controversial medical practice of homeopathy and the difficulty of assessing hormesis with the high-dose testing protocols which have dominated the discipline of toxicology, especially regulatory toxicology. The long-standing and intensely acrimonious conflict between homeopathy and 'traditional' medicine (allopathy) led to the exclusion of the hormesis concept from a vast array of medical- and public health-related activities including research, teaching, grant funding, publishing, professional society meetings, and regulatory initiatives of governmental agencies and their advisory bodies. Recent publications indicate that the hormetic dose-response is far more common and fundamental than the dose-response models [threshold/linear no threshold (LNT)] used in toxicology and risk assessment, and by governmental regulatory agencies in the establishment of exposure standards for workers and the general public. Acceptance of the possibility of hormesis has the potential to profoundly affect the practice of toxicology and risk assessment, especially with respect to carcinogen assessment

  14. Stability Analysis of Continuous-Time and Discrete-Time Quaternion-Valued Neural Networks With Linear Threshold Neurons.

    Science.gov (United States)

    Chen, Xiaofeng; Song, Qiankun; Li, Zhongshan; Zhao, Zhenjiang; Liu, Yurong

    2018-07-01

    This paper addresses the problem of stability for continuous-time and discrete-time quaternion-valued neural networks (QVNNs) with linear threshold neurons. Applying the semidiscretization technique to the continuous-time QVNNs, the discrete-time analogs are obtained, which preserve the dynamical characteristics of their continuous-time counterparts. Via the plural decomposition method of quaternion, homeomorphic mapping theorem, as well as Lyapunov theorem, some sufficient conditions on the existence, uniqueness, and global asymptotical stability of the equilibrium point are derived for the continuous-time QVNNs and their discrete-time analogs, respectively. Furthermore, a uniform sufficient condition on the existence, uniqueness, and global asymptotical stability of the equilibrium point is obtained for both continuous-time QVNNs and their discrete-time version. Finally, two numerical examples are provided to substantiate the effectiveness of the proposed results.

  15. Modeling DPOAE input/output function compression: comparisons with hearing thresholds.

    Science.gov (United States)

    Bhagat, Shaum P

    2014-09-01

    Basilar membrane input/output (I/O) functions in mammalian animal models are characterized by linear and compressed segments when measured near the location corresponding to the characteristic frequency. A method of studying basilar membrane compression indirectly in humans involves measuring distortion-product otoacoustic emission (DPOAE) I/O functions. Previous research has linked compression estimates from behavioral growth-of-masking functions to hearing thresholds. The aim of this study was to compare compression estimates from DPOAE I/O functions and hearing thresholds at 1 and 2 kHz. A prospective correlational research design was performed. The relationship between DPOAE I/O function compression estimates and hearing thresholds was evaluated with Pearson product-moment correlations. Normal-hearing adults (n = 16) aged 22-42 yr were recruited. DPOAE I/O functions (L₂ = 45-70 dB SPL) and two-interval forced-choice hearing thresholds were measured in normal-hearing adults. A three-segment linear regression model applied to DPOAE I/O functions supplied estimates of compression thresholds, defined as breakpoints between linear and compressed segments and the slopes of the compressed segments. Pearson product-moment correlations between DPOAE compression estimates and hearing thresholds were evaluated. A high correlation between DPOAE compression thresholds and hearing thresholds was observed at 2 kHz, but not at 1 kHz. Compression slopes also correlated highly with hearing thresholds only at 2 kHz. The derivation of cochlear compression estimates from DPOAE I/O functions provides a means to characterize basilar membrane mechanics in humans and elucidates the role of compression in tone detection in the 1-2 kHz frequency range. American Academy of Audiology.
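The breakpoint idea behind such I/O-function fits can be illustrated with a crude two-segment simplification of the regression model (the study itself used a three-segment model; the data, function names, and grid-search approach below are illustrative assumptions, not the authors' implementation):

```python
def lin_fit(xs, ys):
    """Ordinary least-squares slope and intercept for one segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def fit_two_segment(levels, amps):
    """Grid-search a single breakpoint separating a steep (linear) segment
    from a shallow (compressed) segment; each side is fit by least squares."""
    best = None
    for i in range(3, len(levels) - 1):        # >= 3 points left, >= 2 right
        s1, b1 = lin_fit(levels[:i], amps[:i])
        s2, b2 = lin_fit(levels[i:], amps[i:])
        sse = (sum((s1 * x + b1 - y) ** 2 for x, y in zip(levels[:i], amps[:i]))
               + sum((s2 * x + b2 - y) ** 2 for x, y in zip(levels[i:], amps[i:])))
        if best is None or sse < best[0]:
            # breakpoint = last stimulus level on the linear segment
            best = (sse, levels[i - 1], s1, s2)
    return best[1:]

# Synthetic I/O function: slope 1 dB/dB below 55 dB SPL, 0.3 dB/dB above.
L2 = [45.0, 50.0, 55.0, 60.0, 65.0, 70.0]
dpoae = [x - 40 if x <= 55 else 15 + 0.3 * (x - 55) for x in L2]
bp, s1, s2 = fit_two_segment(L2, dpoae)      # recovers breakpoint and slopes
```

On this synthetic input the search recovers the 55 dB SPL breakpoint, the unit slope of the linear segment, and the shallow slope of the compressed segment.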

  16. The influence of thresholds on the risk assessment of carcinogens in food.

    Science.gov (United States)

    Pratt, Iona; Barlow, Susan; Kleiner, Juliane; Larsen, John Christian

    2009-08-01

    The risks from exposure to chemical contaminants in food must be scientifically assessed in order to safeguard the health of consumers. Risk assessment of chemical contaminants that are both genotoxic and carcinogenic presents particular difficulties, since the effects of such substances are normally regarded as being without a threshold. No safe level can therefore be defined, and this has implications for both risk management and risk communication. Risk management of these substances in food has traditionally involved application of the ALARA (As Low As Reasonably Achievable) principle; however, ALARA does not enable risk managers to assess the urgency and extent of the risk reduction measures needed. A more refined approach is needed, and several such approaches have been developed. Low-dose linear extrapolation from animal carcinogenicity studies or epidemiological studies to estimate risks for humans at low exposure levels has been applied by a number of regulatory bodies, while more recently the Margin of Exposure (MOE) approach has been applied by both the European Food Safety Authority and the Joint FAO/WHO Expert Committee on Food Additives. A further approach is the Threshold of Toxicological Concern (TTC), which establishes structure-dependent exposure thresholds for chemicals present in food. Recent experimental evidence that genotoxic responses may be thresholded has significant implications for the risk assessment of chemicals that are both genotoxic and carcinogenic. In relation to existing approaches such as linear extrapolation, MOE and TTC, the existence of a threshold reduces the uncertainties inherent in such methodology and improves confidence in the risk assessment. However, for the foreseeable future, regulatory decisions based on the concept of thresholds for genotoxic carcinogens are likely to be taken case-by-case, based on convincing data on the Mode of Action indicating that the rate limiting variable for the development of cancer

  17. Thresholds and criteria for evaluating and communicating impact significance in environmental statements: 'See no evil, hear no evil, speak no evil'?

    International Nuclear Information System (INIS)

    Wood, Graham

    2008-01-01

    The evaluation and communication of the significance of environmental effects remains a critical yet poorly understood component of EIA theory and practice. Following a conceptual overview of the generic dimensions of impact significance in EIA, this paper reports upon the findings of an empirical study of recent environmental impact statements that considers the treatment of significance for impacts concerning landscape ('see no evil') and noise ('hear no evil'), focussing specifically upon the evaluation and communication of impact significance ('speak no evil') in UK practice. Particular attention is given to the use of significance criteria and thresholds, including the development of a typology of approaches applied within the context of noise and landscape/visual impacts. Following a broader discussion of issues surrounding the formulation, application and interpretation of significance criteria, conclusions and recommendations relevant to wider EIA practice are suggested

  18. Cytogenetic Low-Dose Hyperradiosensitivity Is Observed in Human Peripheral Blood Lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Seth, Isheeta [Department of Biological Sciences, Wayne State University, Detroit, Michigan (United States); Joiner, Michael C. [Department of Radiation Oncology, Wayne State University, Detroit, Michigan (United States); Tucker, James D., E-mail: jtucker@biology.biosci.wayne.edu [Department of Biological Sciences, Wayne State University, Detroit, Michigan (United States)

    2015-01-01

    Purpose: The shape of the ionizing radiation response curve at very low doses has been the subject of considerable debate. Linear-no-threshold (LNT) models are widely used to estimate risks associated with low-dose exposures. However, the low-dose hyperradiosensitivity (HRS) phenomenon, in which cells are especially sensitive at low doses but then show increased radioresistance at higher doses, provides evidence of nonlinearity in the low-dose region. HRS is more prominent in the G2 phase of the cell cycle than in the G0/G1 or S phases. Here we provide the first cytogenetic mechanistic evidence of low-dose HRS in human peripheral blood lymphocytes using structural chromosomal aberrations. Methods and Materials: Human peripheral blood lymphocytes from 2 normal healthy female donors were acutely exposed to cobalt 60 γ rays in either G0 or G2 using closely spaced doses ranging from 0 to 1.5 Gy. Structural chromosomal aberrations were enumerated, and the slopes of the regression lines at low doses (0-0.4 Gy) were compared with doses of 0.5 Gy and above. Results: HRS was clearly evident in both donors for cells irradiated in G2. No HRS was observed in cells irradiated in G0. The radiation effect per unit dose was 2.5- to 3.5-fold higher for doses ≤0.4 Gy than for doses >0.5 Gy. Conclusions: These data provide the first cytogenetic evidence for the existence of HRS in human cells irradiated in G2 and suggest that LNT models may not always be optimal for making radiation risk assessments at low doses.

  19. Demystifying nuclear power: the linear non-threshold model and its use for evaluating radiation effects on living organisms

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alexandre F.; Vasconcelos, Miguel F.; Vergueiro, Sophia M. C.; Lima, Suzylaine S., E-mail: alex.ramos@usp.br [Universidade de São Paulo (USP), SP (Brazil). Núcleo Interdisciplinar de Modelagem de Sistemas Complexos

    2017-07-01

    Recently, a new variable has been introduced into nuclear power expansion policy: public opinion. That variable challenges the nuclear community to develop new programs aimed at educating society sectors interested in energy generation and not necessarily familiar with concepts of the nuclear field. Here we approach this challenge by discussing how a misconception about the use of theories in science has misled the interpretation of the consequences of the Chernobyl accident. That discussion has been presented to students from fields related to the Environmental Sciences and Humanities and has helped to elucidate that an extrapolation such as the Linear Non-Threshold model is a hypothesis to be tested experimentally rather than a theoretical tool with predictive power. (author)

  20. Demystifying nuclear power: the linear non-threshold model and its use for evaluating radiation effects on living organisms

    International Nuclear Information System (INIS)

    Ramos, Alexandre F.; Vasconcelos, Miguel F.; Vergueiro, Sophia M. C.; Lima, Suzylaine S.

    2017-01-01

    Recently, a new variable has been introduced into nuclear power expansion policy: public opinion. That variable challenges the nuclear community to develop new programs aimed at educating society sectors interested in energy generation and not necessarily familiar with concepts of the nuclear field. Here we approach this challenge by discussing how a misconception about the use of theories in science has misled the interpretation of the consequences of the Chernobyl accident. That discussion has been presented to students from fields related to the Environmental Sciences and Humanities and has helped to elucidate that an extrapolation such as the Linear Non-Threshold model is a hypothesis to be tested experimentally rather than a theoretical tool with predictive power. (author)

  1. The oscillatory behavior of heated channels: an analysis of the density effect. Part I. The mechanism (non linear analysis). Part II. The oscillations thresholds (linearized analysis)

    International Nuclear Information System (INIS)

    Boure, J.

    1967-01-01

    The problem of the oscillatory behavior of heated channels is presented in terms of delay-times, and a density effect model is proposed to explain the behavior. The density effect is the consequence of the physical relationship between the enthalpy and density of the fluid. In the first part, non-linear equations are derived from the model in dimensionless form. A description of the mechanism of oscillations is given, based on the analysis of the equations, and an inventory of the governing parameters is established. At this point of the study, some facts in agreement with the experiments can be pointed out. In the second part, the onset of the oscillatory behavior of heated channels is studied in terms of the density effect. The threshold equations are derived after linearization of the equations obtained in Part I. They can be solved rigorously by numerical methods to yield: (1) a relation between the describing parameters at the onset of oscillations, and (2) the frequency of the oscillations. By comparing the results predicted by the model to the experimental behavior of actual systems, the density effect is very often shown to be the actual cause of oscillatory behaviors. (author)

  2. Top quark threshold scan and study of detectors for highly granular hadron calorimeters at future linear colliders

    International Nuclear Information System (INIS)

    Tesar, Michal

    2014-01-01

    Two major projects for future linear electron-positron colliders, the International Linear Collider (ILC) and the Compact Linear Collider (CLIC), are currently under development. These projects can be seen as complementary machines to the Large Hadron Collider (LHC) which permit further progress in high energy physics research. They overlap considerably and share the same technological approaches. To meet the ambitious goals of precise measurements, new detector concepts such as very finely segmented calorimeters are required. We study the precision of the top quark mass measurement achievable at CLIC and the ILC. The method employed was a t anti-t pair production threshold scan. In this technique, simulated measurement points of the t anti-t production cross section around the threshold are fitted with theoretical curves calculated at next-to-next-to-leading order. Detector effects, the influence of the beam energy spectrum and initial state radiation of the colliding particles are taken into account. Assuming a total integrated luminosity of 100 fb⁻¹, our results show that the top quark mass in a theoretically well-defined 1S mass scheme can be extracted with a combined statistical and systematic uncertainty of less than 50 MeV. The other part of this work concerns experimental studies of highly granular hadron calorimeter (HCAL) elements. To meet the required high jet energy resolution at the future linear colliders, a large and finely segmented detector is needed. One option is to assemble a sandwich calorimeter out of many low-cost scintillators read out by silicon photomultipliers (SiPM). We characterize the areal homogeneity of the SiPM response with the help of a highly collimated beam of pulsed visible light. The spatial resolution of the experiment reaches the order of 1 μm and allows the study of active area structures within single SiPM microcells. Several SiPM models are characterized in terms of relative photon detection efficiency and crosstalk probability.

  3. Detecting fatigue thresholds from electromyographic signals: A systematic review on approaches and methodologies.

    Science.gov (United States)

    Ertl, Peter; Kruse, Annika; Tilp, Markus

    2016-10-01

    The aim of the current paper was to systematically review the relevant existing electromyographic threshold concepts within the literature. The electronic databases MEDLINE and SCOPUS were screened for papers published between January 1980 and April 2015 including the keywords: neuromuscular fatigue threshold, anaerobic threshold, electromyographic threshold, muscular fatigue, aerobic-anaerobic transition, ventilatory threshold, exercise testing, and cycle-ergometer. 32 articles were assessed with regard to their electromyographic methodologies, description of results, statistical analysis and test protocols. Only one article was of very good quality, 21 were of good quality, and two articles were of very low quality. The review process revealed that: (i) there is consistent evidence of one or two non-linear increases of EMG that might reflect the additional recruitment of motor units (MU) or different fiber types during fatiguing cycle ergometer exercise, (ii) most studies reported no statistically significant difference between electromyographic and metabolic thresholds, (iii) one-minute protocols with increments between 10 and 25 W appear most appropriate to detect muscular thresholds, (iv) threshold detection from the vastus medialis, vastus lateralis, and rectus femoris is recommended, and (v) there is great variety in study protocols, measurement techniques, and data processing. Therefore, we recommend further research and standardization in the detection of EMGTs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Differential stimulation of antioxidant defense in various organs of mice after whole body exposure to low-dose gamma radiation

    International Nuclear Information System (INIS)

    Pathak, C.M.; Avti, P.K.; Khanduja, K.L.; Sharma, S.C.

    2007-01-01

    It has been generally considered that any dose of ionizing radiation is detrimental to living organisms, however low the radiation dose may be. The much relied upon 'Linear-No-Threshold' (LNT) hypothesis does not have any convincing experimental evidence regarding damaging effects at very low doses and low dose rates. Generally, the deleterious biological effects have been inferred theoretically by extrapolating the known effects of high radiation doses to the low-dose range. Recently, it has been reported that living organisms do not respond to ionizing radiation in a linear manner in the low-dose range 0.01-0.50 Gy and rather restore homeostasis both in vivo and in vitro by normal physiological mechanisms such as cellular and DNA repair processes, immune reactions, antioxidant defense, adaptive responses, activation of immune functions, and stimulation of growth. In this study, we have attempted to find: (i) the critical radiation dose range and the post-irradiation period during which the antioxidant defense systems in the lungs, liver and kidneys remain stimulated; and (ii) to evaluate the degree to which these defense mechanisms remain stimulated in these organs after whole body exposure of the animal to low-dose radiation

  5. Effect of threshold quantization in opportunistic splitting algorithm

    KAUST Repository

    Nam, Haewoon

    2011-12-01

    This paper discusses algorithms to find the optimal threshold and also investigates the impact of threshold quantization on the scheduling outage performance of the opportunistic splitting scheduling algorithm. Since this algorithm aims at finding the user with the highest channel quality within the minimal number of mini-slots by adjusting the threshold every mini-slot, optimizing the threshold is of paramount importance. Hence, in this paper we first discuss how to compute the optimal threshold along with two tight approximations for the optimal threshold. Closed-form expressions are provided for those approximations for simple calculations. Then, we consider linear quantization of the threshold to take the limited number of bits for signaling messages in practical systems into consideration. Due to the limited granularity for the quantized threshold value, an irreducible scheduling outage floor is observed. The numerical results show that the two approximations offer lower scheduling outage probability floors compared to the conventional algorithm when the threshold is quantized. © 2006 IEEE.
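The linear quantization step described above can be sketched in toy form: with b signaling bits the threshold must snap to one of 2^b uniformly spaced levels, so the residual error is bounded by half a quantization step, and that irreducible gap is what produces the outage floor. The function name and numeric values below are illustrative assumptions, not the paper's implementation:

```python
def quantize_threshold(eta, lo, hi, bits):
    """Snap a threshold to the nearest of 2**bits uniformly spaced levels
    in [lo, hi] (linear quantization); also return the step size."""
    levels = 2 ** bits
    step = (hi - lo) / (levels - 1)
    idx = round((eta - lo) / step)
    idx = max(0, min(levels - 1, idx))       # clamp to the representable range
    return lo + idx * step, step

eta_opt = 0.62                               # hypothetical optimal threshold
q8, step8 = quantize_threshold(eta_opt, 0.0, 1.0, 8)   # fine granularity
q2, step2 = quantize_threshold(eta_opt, 0.0, 1.0, 2)   # coarse granularity
```

However many mini-slots are used, the scheduler can never operate closer to the optimal threshold than step/2, which is why a coarse quantizer leaves a scheduling outage floor that finer quantization lowers but never removes.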

  6. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s² for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.

  7. No-signaling quantum key distribution: solution by linear programming

    Science.gov (United States)

    Hwang, Won-Young; Bae, Joonwoo; Killoran, Nathan

    2015-02-01

    We outline a straightforward approach for obtaining a secret key rate using only no-signaling constraints and linear programming. Assuming an individual attack, we consider all possible joint probabilities. Initially, we study only the case where Eve has binary outcomes, and we impose constraints due to the no-signaling principle and the given measurement outcomes. Within the remaining space of joint probabilities, using linear programming, we obtain a bound on the probability of Eve correctly guessing Bob's bit. We then make use of an inequality that relates this guessing probability to the mutual information between Bob and a more general Eve, who is not binary-restricted. Putting our computed bound together with the Csiszár-Körner formula, we obtain a positive key generation rate. The optimal value of this rate agrees with known results, but was calculated in a more straightforward way, offering the potential of generalization to different scenarios.
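The final step, feeding an information bound into the Csiszár-Körner formula r ≥ I(A;B) − I(A;E), can be illustrated for a binary symmetric channel; the QBER and Eve-information numbers below are illustrative assumptions, and the linear-programming step that would actually bound Eve's information is omitted:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def ck_rate(qber, eve_info_bits):
    """Csiszar-Korner lower bound r >= I(A;B) - I(A;E) for a binary
    symmetric channel with error rate qber; eve_info_bits would come
    from the linear-programming bound on Eve's guessing probability."""
    return (1.0 - h2(qber)) - eve_info_bits

r = ck_rate(0.05, 0.2)   # hypothetical QBER of 5% and 0.2 bits leaked to Eve
```

A positive r indicates that a secret key can be distilled; the rate vanishes once Eve's information catches up with the Alice-Bob mutual information.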

  8. β-Glucan from Lentinus edodes inhibits nitric oxide and tumor necrosis factor-α production and phosphorylation of mitogen-activated protein kinases in lipopolysaccharide-stimulated murine RAW 264.7 macrophages.

    Science.gov (United States)

    Xu, Xiaojuan; Yasuda, Michiko; Nakamura-Tsuruta, Sachiko; Mizuno, Masashi; Ashida, Hitoshi

    2012-01-06

    Lentinan (LNT), a β-glucan from the fruiting bodies of Lentinus edodes, is well known to have immunomodulatory activity. NO and TNF-α are associated with many inflammatory diseases. In this study, we investigated the effects of LNT extracted by sonication (LNT-S) on the NO and TNF-α production in LPS-stimulated murine RAW 264.7 macrophages. The results suggested that treatment with LNT-S not only resulted in the striking inhibition of TNF-α and NO production in LPS-activated macrophage RAW 264.7 cells, but also the protein expression of inducible NOS (iNOS) and the gene expression of iNOS mRNA and TNF-α mRNA. It is surprising that LNT-S enhanced LPS-induced NF-κB p65 nuclear translocation and NF-κB luciferase activity, but severely inhibited the phosphorylation of JNK1/2 and ERK1/2. The neutralizing antibodies of anti-Dectin-1 and anti-TLR2 hardly affected the inhibition of NO production. All of these results suggested that the suppression of LPS-induced NO and TNF-α production was at least partially attributable to the inhibition of JNK1/2 and ERK1/2 activation. This work discovered a promising molecule to control the diseases associated with overproduction of NO and TNF-α.

  9. Pooled Bayesian analysis of 28 studies on radon induced lung cancers

    International Nuclear Information System (INIS)

    Fornalski, K.W.; Dobrzyński, L.

    2010-01-01

    The influence of ionizing radiation from radon-222 and its daughters on lung cancer incidence and mortality published in 28 papers was reanalyzed, for two ranges of low annual radiation dose: below 70 mSv per year (391 Bq m⁻³) and below 150 mSv per year (838 Bq m⁻³). Seven popular models of the dose-effect relationship were tested. Assumption-free Bayesian statistical methods were used for all curve fittings, and a Model Selection algorithm was used to assess the relative probability of all seven models. The results of the analysis demonstrate that in these dose ranges (below 70 and 150 mSv/year) the published data do not show the presence of a risk of lung cancer induction. The most probable dose-effect relationship is a constant one (risk ratio, RR = 1). The statistical analysis shows that there is no basis for an increased risk of lung cancer in the low-dose region. This conclusion follows from the fact that the model assuming no dependence of lung cancer induction on radiation dose is at least 100 times more likely than the six other models tested, including the Linear No-Threshold (LNT) model
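The model-comparison step can be caricatured as a likelihood ratio between a constant-risk model (RR = 1) and a linear LNT-like model on synthetic risk-ratio data. This toy sketch ignores the priors and parameter marginalization that a full Bayesian Model Selection would include, and all numbers below are invented for illustration:

```python
import math

def log_likelihood(obs, pred, sigma):
    """Gaussian log-likelihood of observed risk ratios given model predictions."""
    return sum(-0.5 * ((o - p) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for o, p in zip(obs, pred))

doses = [10.0, 30.0, 50.0, 70.0]        # annual doses in mSv (hypothetical)
rr_obs = [1.01, 0.98, 1.02, 0.99]       # synthetic RRs scattered around 1

# Constant model: RR = 1 everywhere. Linear model: RR = 1 + 0.005 * dose.
ll_const = log_likelihood(rr_obs, [1.0] * len(doses), 0.05)
ll_lnt = log_likelihood(rr_obs, [1.0 + 0.005 * d for d in doses], 0.05)
log10_ratio = (ll_const - ll_lnt) / math.log(10)   # orders of magnitude in favor
```

On data that cluster around RR = 1, the constant model wins by many orders of magnitude, which mirrors the kind of relative-probability statement quoted in the abstract.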

  10. Stress induction in the bacteria Shewanella oneidensis and Deinococcus radiodurans in response to below-background ionizing radiation.

    Science.gov (United States)

    Castillo, Hugo; Schoderbek, Donald; Dulal, Santosh; Escobar, Gabriela; Wood, Jeffrey; Nelson, Roger; Smith, Geoffrey

    2015-01-01

    The 'Linear no-threshold' (LNT) model predicts that any amount of radiation increases the risk of organisms accumulating negative effects. Several studies at below-background radiation levels (4.5-11.4 nGy h⁻¹) show decreased growth rates and an increased susceptibility to oxidative stress. The purpose of our study is to obtain molecular evidence of a stress response in Shewanella oneidensis and Deinococcus radiodurans grown at a gamma dose rate of 0.16 nGy h⁻¹, about 400 times less than normal background radiation. Bacterial cultures were grown at a dose rate of 0.16 or 71.3 nGy h⁻¹ gamma irradiation. Total RNA was extracted from samples at early-exponential and stationary phases for RT-PCR relative quantification (radiation-deprived treatment/background radiation control) of the stress-related genes katB (catalase), recA (recombinase), oxyR (oxidative stress transcriptional regulator), lexA (SOS regulon transcriptional repressor), dnaK (heat shock protein 70) and SOA0154 (putative heavy metal efflux pump). Deprivation of normal levels of radiation caused a reduction in growth of both bacterial species, accompanied by the upregulation of the katB, recA, and SOA0154 genes in S. oneidensis and the upregulation of dnaK in D. radiodurans. When cells were returned to background radiation levels, growth rates recovered and the stress response dissipated. Our results indicate that below-background levels of radiation inhibited growth and elicited a stress response in two species of bacteria, contrary to the LNT model prediction.

  11. On the derivation of the ionisation threshold law

    International Nuclear Information System (INIS)

    Peterkop, R.

    1983-01-01

    The different procedures for derivation of the electron-atom ionisation threshold law have been analysed and the reasons for discrepancies in the results are pointed out. It is shown that if the wavefunction has a linear node at equal electron distances (r₁ = r₂), then the threshold law for the total cross section has the form σ ∝ E^(3m), where σ ∝ E^m is the Wannier law. The distribution of energy between escaping electrons is non-uniform and has a parabolic node at equal energies (ε₁ = ε₂). The linear node at opposite directions of the electrons (θ = π) does not change the Wannier law but leads to a parabolic node in the angular distribution at θ = π. The existence of both nodes leads to the threshold law σ ∝ E^(3m) and to parabolic nodes in the energy and angular distributions. (author)

  12. Thresholds of ion turbulence in tokamaks

    International Nuclear Information System (INIS)

    Garbet, X.; Laurent, L.; Mourgues, F.; Roubin, J.P.; Samain, A.; Zou, X.L.

    1991-01-01

    The linear thresholds of ionic turbulence are numerically calculated for the tokamaks JET and TORE SUPRA. It is proved that the stability domain at η_i > 0 is determined by trapped ion modes and is characterized by η_i ≥ 1 and a threshold L_Ti/R of order (0.2-0.3)/(1 + T_i/T_e). The latter value is significantly smaller than what has been previously predicted. Experimental temperature profiles in heated discharges are usually marginal with respect to this criterion. It is also shown that the eigenmodes are low frequency, low wavenumber ballooned modes, which may produce very large transport once the threshold ion temperature gradient is reached

  13. Threshold current for fireball generation

    Science.gov (United States)

    Dijkhuis, Geert C.

    1982-05-01

    Fireball generation from a high-intensity circuit breaker arc is interpreted here as a quantum-mechanical phenomenon caused by severe cooling of electrode material evaporating from contact surfaces. According to the proposed mechanism, quantum effects appear in the arc plasma when the radius of one magnetic flux quantum inside the solid electrode material has shrunk to one London penetration length. A formula derived for the threshold discharge current preceding fireball generation is found to be compatible with data reported by Silberg. This formula predicts linear scaling of the threshold current with the circuit breaker's electrode radius and with the concentration of conduction electrons.

  14. Construction of Protograph LDPC Codes with Linear Minimum Distance

    Science.gov (United States)

    Divsalar, Dariush; Dolinar, Sam; Jones, Christopher

    2006-01-01

    A construction method for protograph-based LDPC codes that simultaneously achieve low iterative decoding threshold and linear minimum distance is proposed. We start with a high-rate protograph LDPC code with variable node degrees of at least 3. Lower rate codes are obtained by splitting check nodes and connecting them by degree-2 nodes. This guarantees the linear minimum distance property for the lower-rate codes. Excluding checks connected to degree-1 nodes, we show that the number of degree-2 nodes should be at most one less than the number of checks for the protograph LDPC code to have linear minimum distance. Iterative decoding thresholds are obtained by using the reciprocal channel approximation. Thresholds are lowered by using either precoding or at least one very high-degree node in the base protograph. A family of high- to low-rate codes with minimum distance linearly increasing in block size and with capacity-approaching performance thresholds is presented. FPGA simulation results for a few example codes show that the proposed codes perform as predicted.
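The degree-2 count condition stated above can be checked mechanically on a protograph base matrix (rows = checks, columns = variable nodes, entries = edge multiplicities). The helper and example matrices below are illustrative assumptions, and for simplicity the sketch does not exclude checks connected to degree-1 nodes as the paper does:

```python
def satisfies_degree2_bound(base_matrix):
    """True if the number of degree-2 variable nodes is at most one less
    than the number of checks (a simplification of the linear-minimum-
    distance condition quoted above)."""
    num_checks = len(base_matrix)
    column_degrees = [sum(col) for col in zip(*base_matrix)]
    num_degree2 = sum(1 for d in column_degrees if d == 2)
    return num_degree2 <= num_checks - 1

# Toy protographs (entries > 1 denote parallel edges).
proto_ok = [[1, 1, 2, 0],
            [1, 1, 0, 2],
            [1, 1, 1, 1]]   # column degrees 3,3,3,3: no degree-2 nodes
proto_bad = [[2, 0, 1],
             [0, 2, 1]]     # column degrees 2,2,2: three degree-2 nodes, two checks
```

A base matrix failing the check signals that repeated check splitting has introduced too many degree-2 nodes for the minimum distance to keep growing linearly with block size.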

  15. Effects of polarization and absorption on laser induced optical breakdown threshold for skin rejuvenation

    Science.gov (United States)

    Varghese, Babu; Bonito, Valentina; Turco, Simona; Verhagen, Rieko

    2016-03-01

    Laser-induced optical breakdown (LIOB) is a non-linear absorption process leading to plasma formation at locations where the threshold irradiance for breakdown is surpassed. In this paper we experimentally demonstrate the influence of polarization and absorption on the laser-induced breakdown threshold in transparent, absorbing and scattering phantoms made from water suspensions of polystyrene microspheres. We demonstrate that radially polarized light yields a lower irradiance threshold for creating optical breakdown than linearly polarized light. We also demonstrate that the thermal initiation pathway used for generating seed electrons results in a lower irradiance threshold than the multiphoton initiation pathway.

  16. Mirror structures above and below the linear instability threshold: Cluster observations, fluid model and hybrid simulations

    Directory of Open Access Journals (Sweden)

    V. Génot

    2009-02-01

    Full Text Available Using 5 years of Cluster data, we present a detailed statistical analysis of magnetic fluctuations associated with mirror structures in the magnetosheath. We especially focus on the shape of these fluctuations which, in addition to quasi-sinusoidal forms, also display deep holes and high peaks. The occurrence frequency and the most probable location of the various types of structures are discussed, together with their relation to local plasma parameters. While these properties have previously been correlated to the β of the plasma, we emphasize here the influence of the distance to the linear mirror instability threshold. This enables us to interpret the observations of mirror structures in a stable plasma in terms of bistability and subcritical bifurcation. The data analysis is supplemented by the prediction of a quasi-static anisotropic MHD model and hybrid numerical simulations in an expanding box aimed at mimicking the magnetosheath plasma. This leads us to suggest a scenario for the formation and evolution of mirror structures.

  17. Common misinterpretations of the 'linear, no-threshold' relationship used in radiation protection

    International Nuclear Information System (INIS)

    Bond, V.P.; Sondhaus, C.A.

    1987-01-01

    Absorbed dose D is shown to be a composite variable, the product of the fraction of cells hit (I_H) and the mean "dose" (hit size) z̄ delivered to those cells. D is suitable for use with high-level exposure (HLE) to radiation and its resulting acute organ effects because, since I_H = 1.0, it approximates closely enough the mean energy density in the cell as well as in the organ. With low-level exposure (LLE), however, and its consequent probability of cancer induction from a single cell, the stochastic delivery of energy to cells results in a wide distribution of hit sizes z, and the expected mean value z̄ is constant with exposure. Thus, with LLE, only I_H varies with D, so the apparent proportionality between "dose" and the fraction of cells transformed is misleading. This proportionality therefore does not mean that any (cell) dose, no matter how small, can be lethal. Rather, it means that, in the exposure of a population of individual organisms consisting of the constituent relevant cells, there is a small probability of particle-cell interactions which transfer energy. The probability of a cell transforming and initiating a cancer can only be greater than zero if the hit size ("dose") to the cell is large enough. Otherwise stated, if the "dose" is defined at the proper level of biological organization, namely the cell and not the organ, only a large dose z to that cell is effective. (orig.)

  18. Threshold law for positron-atom impact ionisation

    International Nuclear Information System (INIS)

    Temkin, A.

    1982-01-01

    The threshold law for ionisation of atoms by positron impact is adduced in analogy with the author's approach to electron-atom ionisation. It is concluded that the Coulomb-dipole region of the potential gives the essential part of the interaction in both cases and leads to the same kind of result: a modulated linear law. An additional process which enters positron ionisation is positronium formation in the continuum, but this will not dominate the threshold yield. The result is in sharp contrast to the positron threshold law recently derived by Klar (J. Phys. B; 14:4165 (1981)) on the basis of a Wannier-type (Phys. Rev.; 90:817 (1953)) analysis. (author)

  19. Microkinetic Modeling of Lean NOx Trap Sulfation and Desulfation

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Richard S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2011-08-01

    A microkinetic reaction sub-mechanism designed to account for the sulfation and desulfation of a commercial lean NOx trap (LNT) is presented. This set of reactions is appended to a previously developed mechanism for the normal storage and regeneration processes in an LNT in order to provide a comprehensive modeling tool. The reactions describing the storage, release, and reduction of sulfur oxides are patterned after those involving NOx, but the number of reactions is kept to the minimum necessary to give an adequate simulation of the experimental observations. Values for the kinetic constants are estimated by fitting semi-quantitatively the somewhat limited experimental data, using a transient plug flow reactor code to model the processes occurring in a single monolith channel. Rigorous thermodynamic constraints are imposed in order to ensure that the overall mechanism is consistent both internally and with the known properties of all gas-phase species. The final mechanism is shown to be capable of reproducing the principal aspects of sulfation/desulfation behavior, most notably (a) the essentially complete trapping of SO2 during normal cycling; (b) the preferential sulfation of NOx storage sites over oxygen storage sites and the consequent plug-like and diffuse sulfation profiles; (c) the degradation of NOx storage and reduction (NSR) capability with increasing sulfation level; and (d) the mix of H2S and SO2 evolved during desulfation by temperature-programmed reduction.

  20. High-damage-threshold static laser beam shaping using optically patterned liquid-crystal devices.

    Science.gov (United States)

    Dorrer, C; Wei, S K-H; Leung, P; Vargas, M; Wegman, K; Boulé, J; Zhao, Z; Marshall, K L; Chen, S H

    2011-10-15

    Beam shaping of coherent laser beams is demonstrated using liquid crystal (LC) cells with optically patterned pixels. The twist angle of a nematic LC is locally set to either 0° or 90° by an alignment layer prepared via exposure to polarized UV light. The two distinct pixel types induce either no polarization rotation or a 90° polarization rotation, respectively, on a linearly polarized optical field. An LC device placed between polarizers functions as a binary transmission beam shaper with a highly improved damage threshold compared to metal beam shapers. Using a coumarin-based photoalignment layer, various devices have been fabricated and tested, with a measured single-shot nanosecond damage threshold higher than 30 J/cm².

  1. Improving sensitivity of linear regression-based cell type-specific differential expression deconvolution with per-gene vs. global significance threshold.

    Science.gov (United States)

    Glass, Edmund R; Dozmorov, Mikhail G

    2016-10-06

    The goal of many human disease-oriented studies is to detect molecular mechanisms different between healthy controls and patients. Yet, commonly used gene expression measurements from blood samples suffer from variability of cell composition. This variability hinders the detection of differentially expressed genes and is often ignored. Combined with cell counts, heterogeneous gene expression may provide deeper insights into the gene expression differences on the cell type-specific level. Published computational methods use linear regression to estimate cell type-specific differential expression, and a global cutoff to judge significance, such as False Discovery Rate (FDR). Yet, they do not consider many artifacts hidden in high-dimensional gene expression data that may negatively affect linear regression. In this paper we quantify the parameter space affecting the performance of linear regression (sensitivity of cell type-specific differential expression detection) on a per-gene basis. We evaluated the effect of sample sizes, cell type-specific proportion variability, and mean squared error on sensitivity of cell type-specific differential expression detection using linear regression. Each parameter affected variability of cell type-specific expression estimates and, subsequently, the sensitivity of differential expression detection. We provide the R package, LRCDE, which performs linear regression-based cell type-specific differential expression (deconvolution) detection on a gene-by-gene basis. Accounting for variability around cell type-specific gene expression estimates, it computes per-gene t-statistics of differential detection, p-values, t-statistic-based sensitivity, group-specific mean squared error, and several gene-specific diagnostic metrics. 
The sensitivity of linear regression-based cell type-specific differential expression detection differed for each gene as a function of mean squared error, per-group sample sizes, and variability of the proportions.
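The core per-gene idea — regress bulk expression on cell-type proportions within each group, then compare one cell type's coefficient across groups using group-specific standard errors — can be sketched outside R. A simplified illustration on synthetic data, not the LRCDE implementation:

```python
import numpy as np

def cell_type_tstat(expr_a, props_a, expr_b, props_b, cell=0):
    """Regress bulk expression on cell-type proportions within each group,
    then compare one cell type's coefficient across groups with a
    Welch-style t-statistic built from group-specific standard errors."""
    def fit(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        mse = np.sum((y - X @ beta) ** 2) / (len(y) - X.shape[1])
        se = np.sqrt(np.diag(mse * np.linalg.inv(X.T @ X)))
        return beta, se
    beta_a, se_a = fit(expr_a, props_a)
    beta_b, se_b = fit(expr_b, props_b)
    return (beta_a[cell] - beta_b[cell]) / np.hypot(se_a[cell], se_b[cell])

rng = np.random.default_rng(0)
n, k = 40, 3                                   # samples per group, cell types
P1 = rng.dirichlet(np.ones(k), n)              # measured cell proportions
P2 = rng.dirichlet(np.ones(k), n)
mu1 = np.array([5.0, 2.0, 1.0])                # cell 0 expresses differently
mu2 = np.array([8.0, 2.0, 1.0])
y1 = P1 @ mu1 + rng.normal(0, 0.2, n)          # synthetic bulk expression
y2 = P2 @ mu2 + rng.normal(0, 0.2, n)
t0 = cell_type_tstat(y1, P1, y2, P2, cell=0)
print(t0)  # strongly negative: cell 0 is up-regulated in group 2
```

As the abstract notes, the resulting t-statistic depends on the group mean squared errors, sample sizes, and proportion variability, which is why sensitivity varies gene by gene.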

  2. Analysis of the non-linear effect of net asset value on the pricing of equity investment funds

    Directory of Open Access Journals (Sweden)

    Paulo Rogério Faustino Matos

    2012-01-01

    Full Text Available This article uses the Capital Asset Pricing Model (CAPM), in its canonical version and with non-linear extensions, to price a panel of 75 equity investment funds in Brazil over the last 11 years. The results suggest that the linear version of this framework is unable to price or predict the real returns of funds with high net asset value (NAV) and high outperformance relative to the São Paulo Stock Exchange index (Ibovespa), corroborating earlier evidence. The non-linear version with NAV-based thresholds appears to handle the issue of significant Jensen's alphas better, although it is statistically indicated only for a few funds with high NAV but low outperformance. This is evidence that, although size influences the management and possibly the performance of a fund, the pricing of this effect should be modeled linearly.

  3. Melanin microcavitation threshold in the near infrared

    Science.gov (United States)

    Schmidt, Morgan S.; Kennedy, Paul K.; Vincelette, Rebecca L.; Schuster, Kurt J.; Noojin, Gary D.; Wharmby, Andrew W.; Thomas, Robert J.; Rockwell, Benjamin A.

    2014-02-01

    Thresholds for microcavitation of isolated bovine and porcine melanosomes were determined using single nanosecond (ns) laser pulses in the NIR (1000 - 1319 nm) wavelength regime. Average fluence thresholds for microcavitation increased non-linearly with increasing wavelength. Average fluence thresholds were also measured for 10-ns pulses at 532 nm, and found to be comparable to visible ns pulse values published in previous reports. Fluence thresholds were used to calculate melanosome absorption coefficients, which decreased with increasing wavelength. This trend was found to be comparable to the decrease in retinal pigmented epithelial (RPE) layer absorption coefficients reported over the same wavelength region. Estimated corneal total intraocular energy (TIE) values were determined and compared to the current and proposed maximum permissible exposure (MPE) safe exposure levels. Results from this study support the proposed changes to the MPE levels.

  4. Introducing Linear Functions: An Alternative Statistical Approach

    Science.gov (United States)

    Nolan, Caroline; Herbert, Sandra

    2015-01-01

    The introduction of linear functions is the turning point where many students decide if mathematics is useful or not. This means the role of parameters and variables in linear functions could be considered to be "threshold concepts". There is recognition that linear functions can be taught in context through the exploration of linear…

  5. Disaggregated energy consumption and GDP in Taiwan: A threshold co-integration analysis

    International Nuclear Information System (INIS)

    Hu, J.-L.; Lin, C.-H.

    2008-01-01

    Energy consumption growth has been much higher than economic growth for Taiwan in recent years, worsening its energy efficiency. This paper provides a solid explanation by examining the equilibrium relationship between GDP and disaggregated energy consumption under a non-linear framework. The threshold co-integration test with asymmetric dynamic adjustment processes proposed by Hansen and Seo [Hansen, B.E., Seo, B., 2002. Testing for two-regime threshold cointegration in vector error-correction models. Journal of Econometrics 110, 293-318.] is applied. Non-linear co-integration between GDP and disaggregated energy consumption is confirmed except for oil consumption. The two-regime vector error-correction models (VECM) show that the adjustment of energy consumption toward equilibrium is highly persistent until an appropriate threshold is reached; once the threshold is reached there is mean-reverting behavior, making aggregate and disaggregated energy consumption grow faster than GDP in Taiwan.
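The regime-switching idea — adjustment toward equilibrium only outside a threshold band, with the threshold located by grid search over residual sums of squares — can be sketched on synthetic data. This is a deliberate simplification of the Hansen–Seo procedure, not a reimplementation of it:

```python
import numpy as np

rng = np.random.default_rng(1)
T, gamma_true = 500, 1.0
z = np.zeros(T)                     # synthetic error-correction term
for t in range(1, T):
    # strong adjustment only outside the threshold band (two regimes)
    alpha = -0.5 if abs(z[t - 1]) > gamma_true else -0.02
    z[t] = z[t - 1] + alpha * z[t - 1] + rng.normal(0, 0.3)

def fit_threshold(z, grid):
    """Grid-search the threshold: regress dz on lagged z separately in the
    outer and inner regimes, keep the split with the smallest pooled SSR."""
    dz, zl = np.diff(z), z[:-1]
    best_g, best_ssr = None, np.inf
    for g in grid:
        ssr = 0.0
        for mask in (np.abs(zl) > g, np.abs(zl) <= g):
            if mask.sum() < 10:     # require enough points in each regime
                ssr = np.inf
                break
            a = (dz[mask] @ zl[mask]) / (zl[mask] @ zl[mask])
            ssr += np.sum((dz[mask] - a * zl[mask]) ** 2)
        if ssr < best_ssr:
            best_g, best_ssr = g, ssr
    return best_g

gamma_hat = fit_threshold(z, np.linspace(0.2, 2.0, 50))
print(gamma_hat)  # should land near gamma_true
```

The "highly persistent inside the band, mean-reverting outside" pattern described in the abstract corresponds to the small inner-regime and large outer-regime adjustment coefficients here.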

  6. The H-mode power threshold in JET

    Energy Technology Data Exchange (ETDEWEB)

    Start, D F.H.; Bhatnagar, V P; Campbell, D J; Cordey, J G; Esch, H P.L. de; Gormezano, C; Hawkes, N; Horton, L; Jones, T T.C.; Lomas, P J; Lowry, C; Righi, E; Rimini, F G; Saibene, G; Sartori, R; Sips, G; Stork, D; Thomas, P; Thomsen, K; Tubbing, B J.D.; Von Hellermann, M; Ward, D J [Commission of the European Communities, Abingdon (United Kingdom). JET Joint Undertaking

    1994-07-01

    New H-mode threshold data over a range of toroidal field and density values have been obtained from the present campaign. The scaling with n_e·B_t is almost identical with that of the 91/92 period for the same discharge conditions. The scaling with toroidal field alone gives somewhat higher thresholds than the older data. The 1991/2 database shows a scaling of P_th (power threshold) with n_e·B_t which is approximately linear and agrees well with that observed on other tokamaks. For NBI and carbon target tiles the threshold power is a factor of two higher with the ion ∇B drift away from the target compared with the value found with the drift towards the target. The combination of ICRH and beryllium tiles appears to be beneficial for reducing P_th. The power threshold is largely insensitive to plasma current, X-point height and distance between the last closed flux surface and the limiter, at least for values greater than 2 cm. (authors). 3 refs., 6 figs.

  7. Relationship between dose and risk, and assessment of carcinogenic risks associated with low doses of ionizing radiation

    International Nuclear Information System (INIS)

    Tubiana, M.; Aurengo, A.

    2005-01-01

    This report raises doubts about the validity of using the LNT (linear no-threshold) relationship for evaluating the carcinogenic risk of low doses (< 100 mSv), and even more so of very low doses (< 10 mSv). The LNT concept can be a useful pragmatic tool for setting radioprotection rules for doses above 10 mSv; however, since it is not based on the biological concepts of our current knowledge, it should not be used without precaution for assessing by extrapolation the risks associated with low and, even more so, very low doses, especially for the benefit-risk assessments imposed on radiologists by European directive 97-43. The biological mechanisms are different for doses lower than a few dozen mSv and for higher doses. The possible risks in the dose range of radiological examinations (0.1 to 5 mSv, up to 20 mSv for some examinations) must be estimated taking into account radiobiological and experimental data. An empirical relationship validated only for doses higher than 200 mSv may lead to an overestimation of the risks associated with doses one hundred-fold lower, and this overestimation could discourage patients from undergoing useful examinations and introduce a bias in radioprotection measures against very low doses. Decision makers confronted with problems of radioactive waste or risk of contamination should re-examine the methodology used for the evaluation of risks associated with very low doses and with doses delivered at a very low dose rate. This report confirms the inappropriateness of the collective dose concept for evaluating population irradiation risks.

  8. Improvement of biological decontamination, protective and repair activity against radiation injury

    International Nuclear Information System (INIS)

    Kagawa, Yasuo

    2013-01-01

    Because the protection of human subjects from late radiation injury is the final goal of remediation of environmental contamination with radioactive ¹³⁷Cs, improving DNA-repair ability and ¹³⁷Cs removal from the human body is important. In order to reduce environmental radioactivity in areas exceeding 5 mSv/year in Fukushima prefecture, the cost is estimated at 118 trillion yen, and there are difficulties in finding places to store ¹³⁷Cs-contaminated soil and in preventing ¹³⁷Cs recontamination. Radiation damage to the DNA molecule takes place stochastically, following the linear no-threshold (LNT) model, but the cancer risk and other late radiation injury from long-term low-dose radiation do not follow the LNT model if we improve the DNA repair and cell regeneration systems. Indirect radiation damage to DNA, mediated by reactive oxygen species (ROS), is prevented by vitamins C and E, carotenoids including lycopene, and phytochemicals. ROS is also removed by superoxide dismutases containing Cu, Mn and Zn. Direct radiation damage to DNA is repaired by enzyme systems using folic acid and vitamins B6 and B12. In addition, before radiation injury occurs, absorption of ¹³⁷Cs can be prevented by taking pectin and similar agents, and excretion of ¹³⁷Cs can be accelerated by ingesting more K. Finally, early detection of cancer and its removal through detailed health checks of radiation-exposed people is needed. A radiation-protecting diet, developed to protect astronauts from about 1 mSv per day, will be useful for many atomic power plant workers as well as people living in ¹³⁷Cs-contaminated areas. (author)

  9. Protograph based LDPC codes with minimum distance linearly growing with block size

    Science.gov (United States)

    Divsalar, Dariush; Jones, Christopher; Dolinar, Sam; Thorpe, Jeremy

    2005-01-01

    We propose several LDPC code constructions that simultaneously achieve good threshold and error floor performance. Minimum distance is shown to grow linearly with block size (similar to regular codes of variable degree at least 3) by considering ensemble average weight enumerators. Our constructions are based on projected graph, or protograph, structures that support high-speed decoder implementations. As with irregular ensembles, our constructions are sensitive to the proportion of degree-2 variable nodes. A code with too few such nodes tends to have an iterative decoding threshold that is far from the capacity threshold. A code with too many such nodes tends not to exhibit a minimum distance that grows linearly in block length. In this paper we also show that precoding can be used to lower the threshold of regular LDPC codes. The decoding thresholds of the proposed codes, which have minimum distance increasing linearly in block size, outperform those of regular LDPC codes. Furthermore, a family of low- to high-rate codes, with thresholds that adhere closely to their respective channel capacity thresholds, is presented. Simulation results for a few example codes show that the proposed codes have low error floors as well as good threshold SNR performance.

  10. Bioclimatic Thresholds, Thermal Constants and Survival of Mealybug, Phenacoccus solenopsis (Hemiptera: Pseudococcidae) in Response to Constant Temperatures on Hibiscus

    Science.gov (United States)

    Sreedevi, Gudapati; Prasad, Yenumula Gerard; Prabhakar, Mathyam; Rao, Gubbala Ramachandra; Vennila, Sengottaiyan; Venkateswarlu, Bandi

    2013-01-01

    Temperature-driven development and survival rates of the mealybug, Phenacoccus solenopsis Tinsley (Hemiptera: Pseudococcidae), were examined at nine constant temperatures (15, 20, 25, 27, 30, 32, 35 and 40°C) on hibiscus (Hibiscus rosa-sinensis L.). Crawlers successfully completed development to the adult stage between 15 and 35°C, although their survival was affected at low temperatures. Two linear and four nonlinear models were fitted to describe developmental rates of P. solenopsis as a function of temperature, and for estimating thermal constants and bioclimatic thresholds (lower, optimum and upper temperature thresholds for development: Tmin, Topt and Tmax, respectively). Estimated thresholds between the two linear models were statistically similar. Ikemoto and Takai's linear model permitted testing the equivalence of lower developmental thresholds for life stages of P. solenopsis reared on two hosts, hibiscus and cotton. Thermal constants required for completion of cumulative development of female and male nymphs and for the whole generation were significantly lower on hibiscus (222.2, 237.0 and 308.6 degree-days, respectively) compared to cotton. Three nonlinear models performed better in describing the developmental rate for immature instars and cumulative life stages of female and male and for the generation, based on goodness-of-fit criteria. The simplified β type distribution function estimated Topt values closer to the observed maximum rates. The thermodynamic SSI model indicated no significant differences in the intrinsic optimum temperature estimates for different geographical populations of P. solenopsis. The estimated bioclimatic thresholds and the observed survival rates of P. solenopsis indicate the species to be high-temperature adaptive, and explained the field abundance of P. solenopsis on its host plants. PMID:24086597
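The lower developmental threshold (Tmin) and thermal constant (degree-days) reported above follow from a standard linear rate–temperature regression: fit rate = a + b·T over the linear range, then Tmin = -a/b and K = 1/b. A sketch with hypothetical rates, not the paper's data:

```python
import numpy as np

# Hypothetical development rates (1/days) at constant temperatures (°C);
# the linear degree-day model assumes rate = a + b*T over the linear range.
temps = np.array([15.0, 20.0, 25.0, 27.0, 30.0])
rates = np.array([0.010, 0.033, 0.055, 0.064, 0.077])

b, a = np.polyfit(temps, rates, 1)  # slope, intercept
t_min = -a / b                      # lower developmental threshold (x-intercept)
thermal_constant = 1.0 / b          # degree-days to complete development
print(t_min, thermal_constant)
```

With these illustrative numbers the fit gives a threshold near 13°C and a thermal constant near 224 degree-days, the same order as the hibiscus values quoted in the abstract.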

  12. Performance Evaluation of Linear (ARMA) and Threshold Nonlinear (TAR) Time Series Models in Daily River Flow Modeling (Case Study: Upstream Basin Rivers of Zarrineh Roud Dam)

    Directory of Open Access Journals (Sweden)

    Farshad Fathian

    2017-01-01

    Full Text Available Introduction: Time series models are generally categorized as either data-driven or mathematically based methods. They are known as some of the most important tools in the modeling and forecasting of hydrological processes, and are used in the design and scientific management of water resources projects. A better understanding of the river flow process is vital for appropriate streamflow modeling and forecasting. One of the main concerns of hydrological time series modeling is whether the hydrologic variable is governed by linear or nonlinear models through time. Although linear time series models have been widely applied in hydrology research, there has been some recent increase in interest in the application of nonlinear time series approaches. The threshold autoregressive (TAR) method is frequently applied in modeling the mean (first-order moment) of financial and economic time series. This type of model has not yet received considerable attention from the hydrological community. The main purposes of this paper are to analyze and discuss stochastic modeling of daily river flow time series of the study area using linear (ARMA: autoregressive moving average) and non-linear (two- and three-regime TAR) models. Material and Methods: The study area comprises four sub-basins, namely Saghez Chai, Jighato Chai, Khorkhoreh Chai and Sarogh Chai from west to east, which discharge water into the Zarrineh Roud dam reservoir. River flow time series of 6 hydro-gauge stations located on the upstream basin rivers of Zarrineh Roud dam (located in the southern part of the Urmia Lake basin) were considered for modeling purposes. All data series start on January 1, 1997 and end on December 31, 2011.
In this study, the daily river flow data from January 1, 1997 to December 31, 2009 (13 years) were chosen for calibration, and data from January 1, 2010 to December 31, 2011
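The contrast between a single-regime linear AR fit and a two-regime TAR fit can be illustrated on synthetic data. A minimal sketch (grid search over candidate thresholds, ordinary least squares within each regime; illustrative only, not the study's calibration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r_true = 800, 0.0
x = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if x[t - 1] <= r_true else 0.2   # two-regime TAR(1) process
    x[t] = phi * x[t - 1] + rng.normal(0, 1.0)

def ssr_ar1(y, ylag):
    """Residual sum of squares of a no-intercept AR(1) least-squares fit."""
    phi = (y @ ylag) / (ylag @ ylag)
    return np.sum((y - phi * ylag) ** 2)

y, ylag = x[1:], x[:-1]
ssr_linear = ssr_ar1(y, ylag)                  # single-regime linear fit
# TAR fit: grid-search the threshold over middle quantiles of the lagged
# series, fitting each regime separately and keeping the best split
best_ssr, best_r = min(
    (ssr_ar1(y[ylag <= r], ylag[ylag <= r]) + ssr_ar1(y[ylag > r], ylag[ylag > r]), r)
    for r in np.quantile(ylag, np.linspace(0.15, 0.85, 71))
)
print(best_r, ssr_linear - best_ssr)  # estimated threshold and SSR gain
```

Because the linear fit is the constrained case of equal regime coefficients, the two-regime fit can never do worse in-sample; model selection in practice must therefore penalize the extra parameters.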

  13. Visuo-manual tracking: does intermittent control with aperiodic sampling explain linear power and non-linear remnant without sensorimotor noise?

    Science.gov (United States)

    Gollee, Henrik; Gawthrop, Peter J; Lakie, Martin; Loram, Ian D

    2017-11-01

    A human controlling an external system is described most easily and conventionally as linearly and continuously translating sensory input to motor output, with the inevitable output remnant, non-linearly related to the input, attributed to sensorimotor noise. Recent experiments show sustained manual tracking involves repeated refractoriness (insensitivity to sensory information for a certain duration), with the temporary 200-500 ms periods of irresponsiveness to sensory input making the control process intrinsically non-linear. This evidence calls for re-examination of the extent to which random sensorimotor noise is required to explain the non-linear remnant. This investigation of manual tracking shows how the full motor output (linear component and remnant) can be explained mechanistically by aperiodic sampling triggered by prediction error thresholds. Whereas broadband physiological noise is general to all processes, aperiodic sampling is associated with sensorimotor decision making within specific frontal, striatal and parietal networks; we conclude that manual tracking utilises such slow serial decision making pathways up to several times per second. The human operator is described adequately by linear translation of sensory input to motor output. Motor output also always includes a non-linear remnant resulting from random sensorimotor noise from multiple sources, and non-linear input transformations, for example thresholds or refractory periods. Recent evidence showed that manual tracking incurs substantial, serial, refractoriness (insensitivity to sensory information of 350 and 550 ms for 1st and 2nd order systems respectively). Our two questions are: (i) What are the comparative merits of explaining the non-linear remnant using noise or non-linear transformations? (ii) Can non-linear transformations represent serial motor decision making within the sensorimotor feedback loop intrinsic to tracking? 
Twelve participants (instructed to act in three prescribed
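The proposed mechanism — resample sensory input and update the motor command only when the internal prediction error exceeds a threshold, subject to a refractory period — can be caricatured in a few lines. A toy simulation in which all plant and controller parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, steps = 0.01, 5000                 # 50 s of simulated tracking
err_threshold, refractory = 0.05, 0.35 # trigger level; refractory period (s)
x = x_pred = u = 0.0
last_sample = -1e9
trace = []
for k in range(steps):
    t = k * dt
    x += u * dt + 0.05 * np.sqrt(dt) * rng.normal()  # integrator plant + noise
    x_pred += u * dt                   # internal forward model (noise-free)
    # aperiodic sampling: re-observe the state and update the command only
    # when prediction error exceeds threshold AND refractoriness has lapsed
    if abs(x - x_pred) > err_threshold and t - last_sample >= refractory:
        x_pred = x
        u = -1.0 * x                   # intermittent proportional correction
        last_sample = t
    trace.append(x)
max_abs = max(abs(v) for v in trace)
print(max_abs)  # output stays bounded despite intermittent feedback
```

Even though corrections arrive irregularly and no more often than the refractory period allows, the error-triggered updates keep the output bounded, which is the intuition behind explaining both the linear component and the remnant without appealing only to sensorimotor noise.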

  14. Implications for human and environmental health of low doses of ionising radiation

    International Nuclear Information System (INIS)

    Mothersill, Carmel; Seymour, Colin

    2014-01-01

    The last 20 years have seen a major paradigm shift in radiation biology. Several discoveries challenge the DNA-centric view which holds that DNA damage is the critical effect of radiation irrespective of dose. This theory leads to the assumption that dose and effect are simply linked: the more energy deposition, the more DNA damage and the greater the biological effect. This is embodied in radiation protection (RP) regulations as the linear no-threshold (LNT) model. However, the science underlying the LNT model is being challenged, particularly in relation to the environment, because it is now clear that at the low doses of concern in RP, cells, tissues and organisms respond to radiation by inducing responses which are not readily predictable by dose. These include adaptive responses, bystander effects, genomic instability and low-dose hypersensitivity, and are commonly described as stress responses, while recognizing that "stress" can be good as well as bad. These phenomena contribute to observed radiation responses and appear to be influenced by genetic, epigenetic and environmental factors, meaning that dose and response are not simply related. The question is whether our discovery of these phenomena means that we need to re-evaluate RP approaches. The so-called "non-targeted" mechanisms mean that low-dose radiobiology is very complex, and supra-linear or sub-linear (even hormetic) responses are possible, but their occurrence is unpredictable for any given system level. Issues which may need consideration are synergistic or antagonistic effects of other pollutants. RP at present only looks at radiation dose, but the new non-targeted effects (NTE) radiobiology means that chemical or physical agents which interfere with tissue responses to low doses of radiation could critically modulate the predicted risk. Similarly, the "health" of the organism could determine the effect of a given low dose by enabling or disabling a critical response. These issues will be discussed.

  15. Effects of programming threshold and maplaw settings on acoustic thresholds and speech discrimination with the MED-EL COMBI 40+ cochlear implant.

    Science.gov (United States)

    Boyd, Paul J

    2006-12-01

    The principal task in the programming of a cochlear implant (CI) speech processor is the setting of the electrical dynamic range (output) for each electrode, to ensure that a comfortable loudness percept is obtained for a range of input levels. This typically involves separate psychophysical measurement of the electrical threshold (θe) and upper tolerance levels using short current bursts generated by the fitting software. Anecdotal clinical experience and some experimental studies suggest that the measurement of θe is relatively unimportant and that the setting of upper tolerance limits is more critical for processor programming. The present study aims to test this hypothesis and examines in detail how acoustic thresholds and speech recognition are affected by the setting of the lower limit of the output ("programming threshold" or "PT"), to better understand the influence of this parameter and how it interacts with certain other programming parameters. Test programs (maps) were generated with PT set to artificially high and low values and tested on users of the MED-EL COMBI 40+ CI system. Acoustic thresholds and speech recognition scores (sentence tests) were measured for each of the test maps. Acoustic thresholds were also measured using maps with a range of output compression functions ("maplaws"). In addition, subjective reports were recorded regarding the presence of "background threshold stimulation", which is occasionally reported by CI users if PT is set to relatively high values when using the CIS strategy. Manipulation of PT was found to have very little effect. Setting PT to minimum produced a mean 5 dB (S.D. = 6.25) increase in acoustic thresholds, relative to thresholds with PT set normally, and had no statistically significant effect on speech recognition scores on a sentence test.
On the other hand, maplaw setting was found to have a significant effect on acoustic thresholds (raised as maplaw is made more linear), which provides some theoretical

  16. Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm

    Science.gov (United States)

    Elahi, Sana; kaleem, Muhammad; Omer, Hammad

    2018-01-01

Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced because of the undersampling in k-space. This paper introduces an improved iterative algorithm based on a p-thresholding technique for CS-MRI image reconstruction. The use of a p-thresholding function promotes sparsity in the image, which is a key factor for CS-based image reconstruction. The p-thresholding based iterative algorithm is a modification of ISTA, and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover the fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log thresholding, soft thresholding and hard thresholding techniques at different reduction factors.
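The iterative thresholding scheme described in this abstract can be sketched in a few lines. This is a minimal illustration of ISTA with a generalized p-thresholding proximal step; the operator names, λ, p and the step size are all illustrative assumptions, not the authors' implementation (which operates on undersampled k-space data rather than a random matrix).

```python
import numpy as np

def p_threshold(x, lam, p=0.5):
    # Generalized p-thresholding: the shrinkage grows like |x|^(p-1),
    # so small coefficients are suppressed more aggressively than with
    # classical soft thresholding (p = 1).
    mag = np.abs(x)
    shrink = np.maximum(mag - lam * np.power(mag + 1e-12, p - 1), 0.0)
    return np.sign(x) * shrink

def ista_p(y, A, At, lam=0.01, p=0.5, n_iter=300, step=0.2):
    """Iterative p-thresholding for y = A(x) with a sparse x.
    A / At are the forward and adjoint operators (in CS-MRI these would
    be the undersampled Fourier transform and its adjoint)."""
    x = At(y)
    for _ in range(n_iter):
        x = x + step * At(y - A(x))        # gradient step on the data term
        x = p_threshold(x, lam * step, p)  # proximal (thresholding) step
    return x
```

As a sanity check, the sketch recovers a sparse vector from underdetermined random measurements, the standard toy setting for CS solvers.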

  17. Genotoxic thresholds, DNA repair, and susceptibility in human populations

    International Nuclear Information System (INIS)

    Jenkins, Gareth J.S.; Zair, Zoulikha; Johnson, George E.; Doak, Shareen H.

    2010-01-01

It has long been assumed that DNA damage is induced in a linear manner with respect to the dose of a direct-acting genotoxin. Thus, it is implied that direct-acting genotoxic agents induce DNA damage at even the lowest of concentrations and that no 'safe' dose range exists. The linear (non-threshold) paradigm has led to the development of the one-hit model. This 'one-hit' scenario can be interpreted such that a single DNA-damaging event in a cell has the capability to induce a single point mutation in that cell, which could (if positioned in a key growth-controlling gene) lead to increased proliferation, ultimately leading to the formation of a tumour. Many groups (including our own) have argued for a decade or more that low-dose exposures to direct-acting genotoxins may be tolerated by cells through homeostatic mechanisms such as DNA repair. This argument stems from the existence of evolutionary adaptive mechanisms that allow organisms to adapt to low levels of exogenous sources of genotoxins. We have been particularly interested in the genotoxic effects of known mutagens at low-dose exposures in human cells and have identified, for the first time, in vitro genotoxic thresholds for several mutagenic alkylating agents (Doak et al., 2007). Our working hypothesis is that DNA repair is primarily responsible for these thresholded effects at low doses, removing low levels of DNA damage but becoming saturated at higher doses. We are currently assessing the roles of base excision repair (BER) and methylguanine-DNA methyltransferase (MGMT) in the identified thresholds (Doak et al., 2008). This research area is currently important as it assesses whether 'safe' exposure levels to mutagenic chemicals can exist and allows risk assessment using appropriate safety factors to define such exposure levels. Given human variation, the mechanistic basis for genotoxic thresholds (e.g.
DNA repair) has to be well defined in order that susceptible individuals are

  18. Regional rainfall thresholds for landslide occurrence using a centenary database

    Science.gov (United States)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Quaresma, Ivânia

    2017-04-01

considered as the critical rainfall combination responsible for triggering the landslide event. Only events whose critical rainfall combinations have a return period above 3 years were included. This criterion reduces the likelihood of including events whose triggering factor was other than rainfall. The rainfall quantity-duration threshold for the Lisbon region was first defined using linear and potential regression. Considering that this threshold allows the existence of false negatives (i.e. events below the threshold), the lower-limit and upper-limit rainfall thresholds were also identified. These limits were defined empirically by establishing the quantity-duration combinations below which no landslides were recorded (lower limit) and the quantity-duration combinations above which only landslides were recorded, without any false positive occurrence (upper limit). The zone between the lower-limit and upper-limit rainfall thresholds was analysed using a probabilistic approach, defining the uncertainties of each critical rainfall condition in the triggering of landslides. Finally, the performance of the thresholds obtained in this study was assessed using ROC metrics. This work was supported by the project FORLAND - Hydrogeomorphologic risk in Portugal: driving forces and application for land use planning [grant number PTDC/ATPGEO/1660/2014] funded by the Portuguese Foundation for Science and Technology (FCT), Portugal. Sérgio Cruz Oliveira is a post-doc fellow of the FCT [grant number SFRH/BPD/85827/2012].

  19. The Glare Effect Test and the Impact of Age on Luminosity Thresholds

    Directory of Open Access Journals (Sweden)

    Alessio Facchin

    2017-06-01

The glare effect (GE) is an illusion in which a white region appears self-luminous when surrounded by linearly decreasing luminance ramps. It has been shown that the magnitude of the luminosity effect can be modulated by manipulating the luminance range of the gradients. In the present study we tested the thresholds for the GE in two groups of adults: young (20–30 years old) and elderly (60–75 years old). The purpose of our study was to test the possibility of transforming the GE into a test that could easily measure thresholds for luminosity and discomfort glare. The Glare Effect Test (GET) consisted of 101 printed cards that differed from each other in the range of luminance ramps. Participants were assessed with the GET and a battery of visual tests: visual acuity, contrast sensitivity, illusion of length perception, and the Ishihara test. Specifically, in the GET participants were required to classify cards on the basis of two reference cards (solid black, no gradient; full-range black-to-white gradient). PSEs of the GE show no correlation with the other visual tests, revealing divergent validity. A significant difference between young and elderly was found: contrary to our original expectations, luminosity thresholds of the GE for the elderly were higher than those for the young, suggesting a non-direct relationship between luminosity perception and discomfort glare.

  20. Low dose effects of ionizing radiations in in vitro and in vivo biological systems: a multi-scale approach study

    International Nuclear Information System (INIS)

    Antoccia, A.; Berardinelli, F.; Argazzi, E.; Balata, M.; Bedogni, R.

    2011-01-01

Long-term biological effects of low-dose radiation are little known at present, and carcinogenic risk is estimated on the assumption that risk remains linearly proportional to the radiation dose down to low-dose levels. However, over the last 20 years this hypothesis has gradually come to appear inconsistent with a large body of experimental evidence, which has shown a plethora of non-linear phenomena (including hypersensitivity and induced radioresistance, adaptive response, and non-targeted phenomena such as the bystander effect and genomic instability) occurring after low-dose irradiation. These phenomena might imply a non-linear behaviour of cancer risk curves in the low-dose region and question the validity of the Linear No-Threshold (LNT) model currently used for cancer risk assessment through extrapolation from existing high-dose data. Moreover, only limited information is available regarding the effects induced in cryopreserved cells by multi-year background radiation exposure, which might cause radiation-damage accumulation due to the inhibition of cellular repair mechanisms. In this framework, the multi-year Excalibur (Exposure effects at low doses of ionizing radiation in biological culture) experiment, funded by INFN-CNS5, has undertaken a multi-scale investigation of the biological effects induced in in vitro and in vivo biological systems, in culture and cryopreserved conditions, as a function of radiation quality (X/γ-rays, protons, He-4 ions of various energies) and dose, with particular emphasis on the low-dose region and non-linear phenomena, in terms of different biological endpoints.

  1. Psychophysical thresholds of face visibility during infancy

    DEFF Research Database (Denmark)

    Gelskov, Sofie; Kouider, Sid

    2010-01-01

The ability to detect and focus on faces is a fundamental prerequisite for developing social skills. But how well can infants detect faces? Here, we address this question by studying the minimum duration at which faces must appear to trigger a behavioral response in infants. We used a preferential looking method in conjunction with masking and brief presentations (300 ms and below) to establish the temporal thresholds of visibility at different stages of development. We found that 5- and 10-month-old infants have remarkably similar visibility thresholds, about three times higher than those of adults. By contrast, 15-month-olds not only revealed adult-like thresholds, but also improved their performance through memory-based strategies. Our results imply that the development of face visibility follows a non-linear course and is determined by a radical improvement occurring between 10 and 15 months.

  2. Simulation study on single event burnout in linear doping buffer layer engineered power VDMOSFET

    Science.gov (United States)

    Yunpeng, Jia; Hongyuan, Su; Rui, Jin; Dongqing, Hu; Yu, Wu

    2016-02-01

The addition of a buffer layer can improve a device's secondary breakdown voltage, thus improving the single event burnout (SEB) threshold voltage. In this paper, an N-type linear doping buffer layer is proposed. Quasi-stationary avalanche simulation and heavy ion beam simulation show that an optimized linear doping buffer layer is critical. When SEB is induced by heavy-ion impact, the electric field in a device with an optimized linear doping buffer is much lower than in one with an optimized constant doping buffer layer at a given buffer layer thickness and the same biasing voltages. The secondary breakdown voltage and the parasitic bipolar turn-on current are much higher than those with the optimized constant doping buffer layer. The linear buffer layer is therefore more advantageous for improving the device's SEB performance. Project supported by the National Natural Science Foundation of China (No. 61176071), the Doctoral Fund of Ministry of Education of China (No. 20111103120016), and the Science and Technology Program of State Grid Corporation of China (No. SGRI-WD-71-13-006).

  3. Recent results on the linearity of the dose-response relationship for radiation-induced mutations in human cells by low dose levels

    International Nuclear Information System (INIS)

    Traut, H.

    1987-01-01

Five studies made by various authors in recent years are discussed, which are significant in that the response of human cells to low-dose irradiation is determined directly and not by extrapolation, and which also provide information on the mutagenic effects of low radiation doses. The results of these studies do not indicate any response other than a linear one for the induction of mutations by low-dose irradiation, nor is there any observable reason for assuming the existence of a threshold dose. It is therefore very likely that cancer initiation at low dose levels is also characterized by a linear relationship. Although threshold dose levels cannot generally be excluded, and may simply be too low to be detected by experiment, there is no plausible biophysical argument for assuming the existence of such a microdose threshold. (orig./MG) [de

  4. Existence and control of Go/No-Go decision transition threshold in the striatum.

    Directory of Open Access Journals (Sweden)

    Jyotika Bahuguna

    2015-04-01

A typical Go/No-Go decision is suggested to be implemented in the brain via the activation of the direct or indirect pathway in the basal ganglia. Medium spiny neurons (MSNs) in the striatum, receiving input from cortex and projecting to the direct and indirect pathways, express D1 and D2 type dopamine receptors, respectively. Recently, it has become clear that the two types of MSNs markedly differ in their mutual and recurrent connectivities as well as in feedforward inhibition from FSIs. Therefore, to understand striatal function in action selection, it is of key importance to identify the role of the distinct connectivities within and between the two types of MSNs on the balance of their activity. Here, we used both a reduced firing rate model and numerical simulations of a spiking network model of the striatum to analyze the dynamic balance of spiking activities in D1 and D2 MSNs. We show that the asymmetric connectivity of the two types of MSNs renders the striatum into a threshold device, indicating the state of cortical input rates and correlations by the relative activity rates of D1 and D2 MSNs. Next, we describe how this striatal threshold can be effectively modulated by the activity of fast spiking interneurons, by the dopamine level, and by the activity of the GPe via pallidostriatal backprojections. We show that multiple mechanisms exist in the basal ganglia for biasing striatal output in favour of either the 'Go' or the 'No-Go' pathway. This new understanding of striatal network dynamics provides novel insights into the putative role of the striatum in various behavioral deficits in patients with Parkinson's disease, including increased reaction times, L-Dopa-induced dyskinesia, and deep brain stimulation-induced impulsivity.

  5. Existence and control of Go/No-Go decision transition threshold in the striatum.

    Science.gov (United States)

    Bahuguna, Jyotika; Aertsen, Ad; Kumar, Arvind

    2015-04-01

A typical Go/No-Go decision is suggested to be implemented in the brain via the activation of the direct or indirect pathway in the basal ganglia. Medium spiny neurons (MSNs) in the striatum, receiving input from cortex and projecting to the direct and indirect pathways express D1 and D2 type dopamine receptors, respectively. Recently, it has become clear that the two types of MSNs markedly differ in their mutual and recurrent connectivities as well as feedforward inhibition from FSIs. Therefore, to understand striatal function in action selection, it is of key importance to identify the role of the distinct connectivities within and between the two types of MSNs on the balance of their activity. Here, we used both a reduced firing rate model and numerical simulations of a spiking network model of the striatum to analyze the dynamic balance of spiking activities in D1 and D2 MSNs. We show that the asymmetric connectivity of the two types of MSNs renders the striatum into a threshold device, indicating the state of cortical input rates and correlations by the relative activity rates of D1 and D2 MSNs. Next, we describe how this striatal threshold can be effectively modulated by the activity of fast spiking interneurons, by the dopamine level, and by the activity of the GPe via pallidostriatal backprojections. We show that multiple mechanisms exist in the basal ganglia for biasing striatal output in favour of either the 'Go' or the 'No-Go' pathway. This new understanding of striatal network dynamics provides novel insights into the putative role of the striatum in various behavioral deficits in patients with Parkinson's disease, including increased reaction times, L-Dopa-induced dyskinesia, and deep brain stimulation-induced impulsivity.

  6. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    International Nuclear Information System (INIS)

    Feinendegen, L.E.; Sondhaus, C.A.; Altman, K.I.

    1998-01-01

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive

  7. Protecting effects specifically from low doses of ionizing radiation to mammalian cells challenge the concept of linearity

    Energy Technology Data Exchange (ETDEWEB)

    Feinendegen, L.E. [Brookhaven National Lab., Upton, NY (United States). Medical Dept.; Bond, V.P. [Washington State Univ., Richland, WA (United States); Sondhaus, C.A. [Univ. of Arizona, Tucson, AZ (United States). Dept. of Radiology and Radiation Control Office; Altman, K.I. [Univ. of Rochester Medical Center, NY (United States). Dept. of Biochemistry and Biophysics

    1998-12-31

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced changes in intracellular signaling that induce mechanisms of DNA damage control different from those operating at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. The aim of this paper is to demonstrate that by use of microdosimetric concepts, the energy deposited in cell mass can be related to the occurrence of cellular responses, both damaging and defensive.

  8. No evidence for a critical salinity threshold for growth and reproduction in the freshwater snail Physa acuta.

    Science.gov (United States)

    Kefford, Ben J; Nugegoda, Dayanthi

    2005-04-01

The growth and reproduction of the freshwater snail Physa acuta (Gastropoda: Physidae) were measured at various salinity levels (growth: distilled water, 50, 100, 500, 1000 and 5000 µS/cm; reproduction: deionized water, 100, 500, 1000 and 3000 µS/cm) established using the artificial sea salt Ocean Nature. This was done to examine the assumption that there is no direct effect of salinity on freshwater animals until a threshold, beyond which sub-lethal effects, such as reductions in growth and reproduction, occur. Growth of P. acuta was maximal in terms of live and dry mass at salinity levels of 500-1000 µS/cm. The number of eggs produced per snail per day was maximal between 100 and 1000 µS/cm. The results show that rather than a threshold response to salinity, small rises in salinity (from low levels) can produce increased growth and reproduction until a maximum is reached. Beyond this salinity, further increases result in a decrease in growth and reproduction. Studies on the growth of freshwater invertebrates and fish have generally shown a similar lack of a threshold response. The implications for assessing the effects of salinisation on freshwater organisms need to be further considered.

  9. Analysis of linear measurements on 3D surface models using CBCT data segmentation obtained by automatic standard pre-set thresholds in two segmentation software programs: an in vitro study.

    Science.gov (United States)

    Poleti, Marcelo Lupion; Fernandes, Thais Maria Freire; Pagin, Otávio; Moretti, Marcela Rodrigues; Rubira-Bullen, Izabel Regina Fischer

    2016-01-01

The aim of this in vitro study was to evaluate the reliability and accuracy of linear measurements on three-dimensional (3D) surface models obtained by standard pre-set thresholds in two segmentation software programs. Ten mandibles with 17 silica markers were scanned at a 0.3-mm voxel size in the i-CAT Classic (Imaging Sciences International, Hatfield, PA, USA). Twenty linear measurements were carried out twice by two observers on the 3D surface models: in Dolphin Imaging 11.5 (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA), using two filters (Translucent and Solid-1), and in InVesalius 3.0.0 (Centre for Information Technology Renato Archer, Campinas, SP, Brazil). The physical measurements were made twice by another observer using a digital caliper on the dry mandibles. Excellent intra- and inter-observer reliability for the markers, physical measurements, and 3D surface models was found (intra-class correlation coefficient (ICC) and Pearson's r ≥ 0.91). The linear measurements on 3D surface models by the Dolphin and InVesalius software programs were accurate (Dolphin Solid-1 > InVesalius > Dolphin Translucent). The highest absolute and percentage errors were obtained for the variables R1-R1 (1.37 mm) and MF-AC (2.53 %) in the Dolphin Translucent and InVesalius software, respectively. Linear measurements on 3D surface models obtained by standard pre-set thresholds in the Dolphin and InVesalius software programs are reliable and accurate compared with physical measurements. Studies that evaluate the reliability and accuracy of the 3D models are necessary to ensure error predictability and to establish diagnosis, treatment plan, and prognosis in a more realistic way.

  10. Regional rainfall thresholds for landslide occurrence using a centenary database

    Science.gov (United States)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.
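The quantity-duration threshold fitting and ROC scoring described above can be sketched on synthetic data. The power-law form E = aD^b fitted by regression in log-log space over triggering events is the standard convention in this literature; every number below (rates, the toy triggering rule, the grid) is invented for illustration, not taken from the Lisbon database.

```python
import numpy as np

# Hypothetical (duration, cumulated rainfall, landslide?) records standing in
# for the centenary database used in the paper.
rng = np.random.default_rng(1)
D = rng.uniform(1, 90, 200)                       # rainfall duration (days)
E = 10 * D ** 0.45 * rng.lognormal(0, 0.4, 200)   # cumulated rainfall (mm)
landslide = E > 12 * D ** 0.45                    # toy triggering rule

# Fit a power-law threshold E = a * D^b by linear regression in log-log
# space, using only the landslide-triggering combinations.
b, log_a = np.polyfit(np.log(D[landslide]), np.log(E[landslide]), 1)
a = np.exp(log_a)

# ROC-style skill of the fitted threshold over all records.
pred = E > a * D ** b
tp = np.sum(pred & landslide); fp = np.sum(pred & ~landslide)
fn = np.sum(~pred & landslide); tn = np.sum(~pred & ~landslide)
tpr = tp / (tp + fn)   # hit rate
fpr = fp / (fp + tn)   # false alarm rate
```

A threshold fitted through the centre of the triggering cloud misses roughly half the events (many false negatives), which is exactly why the paper complements it with lower-limit, upper-limit, and probabilistic intermediate thresholds.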

  11. Small-threshold behaviour of two-loop self-energy diagrams: two-particle thresholds

    International Nuclear Information System (INIS)

    Berends, F.A.; Davydychev, A.I.; Moskovskij Gosudarstvennyj Univ., Moscow; Smirnov, V.A.; Moskovskij Gosudarstvennyj Univ., Moscow

    1996-01-01

    The behaviour of two-loop two-point diagrams at non-zero thresholds corresponding to two-particle cuts is analyzed. The masses involved in a cut and the external momentum are assumed to be small as compared to some of the other masses of the diagram. By employing general formulae of asymptotic expansions of Feynman diagrams in momenta and masses, we construct an algorithm to derive analytic approximations to the diagrams. In such a way, we calculate several first coefficients of the expansion. Since no conditions on relative values of the small masses and the external momentum are imposed, the threshold irregularities are described analytically. Numerical examples, using diagrams occurring in the standard model, illustrate the convergence of the expansion below the first large threshold. (orig.)

  12. Synchronization of low- and high-threshold motor units.

    Science.gov (United States)

    Defreitas, Jason M; Beck, Travis W; Ye, Xin; Stock, Matt S

    2014-04-01

We examined the degree of synchronization for both low- and high-threshold motor unit (MU) pairs at high force levels. MU spike trains were recorded from the quadriceps during high-force isometric leg extensions. Short-term synchronization (between -6 and 6 ms) was calculated for every unique MU pair for each contraction. At high force levels, earlier-recruited (low-threshold) motor unit pairs demonstrated relatively low levels of short-term synchronization (approximately 7.3% more firings than would be expected by chance). However, the magnitude of synchronization increased significantly and linearly with mean recruitment threshold (reaching 22.1% extra firings for motor unit pairs recruited above 70% MVC). Three potential mechanisms that could explain the observed differences in synchronization across motor unit types are proposed and discussed. Copyright © 2013 Wiley Periodicals, Inc.
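A "percent extra firings" synchronization index of the kind reported above can be computed from the count of spike pairs inside the ±6 ms window, corrected for the count expected by chance for independent trains. The ±6 ms window matches the abstract; the chance correction and the spike trains themselves are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def sync_index(train_a, train_b, window=0.006, duration=60.0):
    """Percent extra firings of train_b within +/- `window` seconds of a
    train_a spike, relative to the number expected by chance if the two
    trains were independent (a simplified short-term synchrony index)."""
    diffs = train_b[None, :] - train_a[:, None]
    observed = np.sum(np.abs(diffs) <= window)
    # Chance level for independent stationary trains over `duration` seconds.
    expected = len(train_a) * len(train_b) * (2 * window) / duration
    return 100.0 * (observed - expected) / max(expected, 1e-9)
```

Two independent trains should score near 0%, while a train that echoes another with a couple of milliseconds of jitter scores far above it.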

  13. DNA repair by MGMT, but not AAG, causes a threshold in alkylation-induced colorectal carcinogenesis.

    Science.gov (United States)

    Fahrer, Jörg; Frisch, Janina; Nagel, Georg; Kraus, Alexander; Dörsam, Bastian; Thomas, Adam D; Reißig, Sonja; Waisman, Ari; Kaina, Bernd

    2015-10-01

Epidemiological studies indicate that N-nitroso compounds (NOC) are causally linked to colorectal cancer (CRC). NOC induce DNA alkylations, including O⁶-methylguanine (O⁶-MeG) and N-methylated purines, which are repaired by O⁶-MeG-DNA methyltransferase (MGMT) and N-alkyladenine-DNA glycosylase (AAG)-initiated base excision repair, respectively. In view of recent evidence of nonlinear mutagenicity for NOC-like compounds, the question arises as to the existence of threshold doses in CRC formation. Here, we set out to determine the impact of DNA repair on the dose-response of alkylation-induced CRC. DNA repair proficient (WT) and deficient (Mgmt-/-, Aag-/- and Mgmt-/-/Aag-/-) mice were treated with azoxymethane (AOM) and dextran sodium sulfate to trigger CRC. Tumors were quantified by non-invasive mini-endoscopy. A non-linear increase in CRC formation was observed in WT and Aag-/- mice. In contrast, a linear dose-dependent increase in tumor frequency was found in Mgmt-/- and Mgmt-/-/Aag-/- mice. The data were corroborated by hockey stick modeling, yielding similar carcinogenic thresholds for WT and Aag-/- mice and no threshold for mice lacking MGMT. O⁶-MeG levels and depletion of MGMT correlated well with the observed dose-response in CRC formation. AOM dose-dependently induced DNA double-strand breaks in colon crypts, including Lgr5-positive colon stem cells, which coincided with ATR-Chk1-p53 signaling. Intriguingly, Mgmt-/- mice displayed significantly enhanced levels of γ-H2AX, suggesting the usefulness of γ-H2AX as an early genotoxicity marker in the colorectum. This study demonstrates for the first time a non-linear dose-response for alkylation-induced colorectal carcinogenesis and reveals DNA repair by MGMT, but not AAG, as a key node in determining a carcinogenic threshold. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
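The hockey-stick modeling mentioned above fits a flat background plus a linear rise above an unknown breakpoint; the breakpoint estimate is the carcinogenic threshold. Here is a minimal sketch, assuming a grid search over candidate thresholds with ordinary least squares at each one (the paper's statistical procedure may differ in detail).

```python
import numpy as np

def fit_hockey_stick(dose, response):
    """Least-squares fit of response = b0 + s * max(dose - t, 0),
    scanning candidate breakpoints t over the dose range.
    Returns the best threshold t and the (intercept, slope) pair."""
    best = None
    for t in np.linspace(dose.min(), dose.max(), 200):
        X = np.column_stack([np.ones_like(dose), np.maximum(dose - t, 0.0)])
        coef, *_ = np.linalg.lstsq(X, response, rcond=None)
        sse = np.sum((X @ coef - response) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t, coef)
    return best[1], best[2]
```

On synthetic dose-response data with a true breakpoint, the grid search recovers both the threshold and the slope; a model with no threshold would correspond to the fit preferring t near zero.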

  14. Data Compression with Linear Algebra

    OpenAIRE

    Etler, David

    2015-01-01

    A presentation on the applications of linear algebra to image compression. Covers entropy, the discrete cosine transform, thresholding, quantization, and examples of images compressed with DCT. Given in Spring 2015 at Ocean County College as part of the honors program.
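The DCT-plus-thresholding pipeline the presentation covers can be shown end to end. This sketch builds an orthonormal DCT-II matrix by hand and zeroes all but the largest coefficients; a real implementation would use a library transform (e.g. `scipy.fft.dct`) and block-wise processing as in JPEG.

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix: C @ C.T == I, so the inverse transform
    # is just the transpose.
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def dct_compress(img, keep=0.1):
    """2-D DCT of a square image, keep only the largest `keep` fraction
    of coefficients (thresholding), then invert the transform."""
    C = dct_matrix(img.shape[0])
    coeffs = C @ img @ C.T
    cutoff = np.quantile(np.abs(coeffs), 1.0 - keep)
    coeffs[np.abs(coeffs) < cutoff] = 0.0
    return C.T @ coeffs @ C
```

Because smooth images concentrate their energy in a few low-frequency DCT coefficients, discarding 80% of the coefficients changes such an image very little.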

  15. Phi photoproduction near threshold with Okubo-Zweig-Iizuka evading phi NN interactions

    CERN Document Server

    William, R A

    1998-01-01

Existing intermediate- and high-energy phi-photoproduction data are consistent with purely diffractive production (i.e., Pomeron exchange). However, near threshold (1.574 GeV K⁺K⁻ decay angular distribution. We stress the importance of measurements with linearly polarized photons near the phi threshold to separate natural and unnatural parity exchange mechanisms. Approved and planned phi photoproduction and electroproduction experiments at Jefferson Lab will help establish the relative dynamical contributions near threshold and clarify outstanding theoretical issues related to apparent Okubo-Zweig-Iizuka violations.

  16. Threshold stoichiometry for beam induced nitrogen depletion of SiN

    International Nuclear Information System (INIS)

    Timmers, H.; Weijers, T.D.M.; Elliman, R.G.; Uribasterra, J.; Whitlow, H.J.; Sarwe, E.-L.

    2002-01-01

Measurements of the stoichiometry of silicon nitride films as a function of the number of incident ions using heavy ion elastic recoil detection (ERD) show that beam-induced nitrogen depletion depends on the projectile species, the beam energy, and the initial stoichiometry. A threshold stoichiometry exists in the range 1.3 > N/Si ≥ 1, below which the films are stable against nitrogen depletion. Above this threshold, depletion is essentially linear with incident fluence. The depletion rate correlates non-linearly with the electronic energy loss of the projectile ion in the film. Sufficiently long exposure of nitrogen-rich films renders the mechanism, which prevents depletion of nitrogen-poor films, ineffective. Compromising depth resolution, nitrogen depletion from SiN films during ERD analysis can be reduced significantly by using projectile beams with low atomic numbers

  17. Comparison of Classical and Robust Estimates of Threshold Auto-regression Parameters

    Directory of Open Access Journals (Sweden)

    V. B. Goryainov

    2017-01-01

The object of study is the first-order threshold autoregression model with a single threshold located at zero. The model describes a stochastic time series with discrete time by means of a piecewise linear equation consisting of two classical linear first-order autoregressive equations. One of these equations is used to calculate the current value of the series; the control variable that determines the choice between the two equations is the sign of the previous value of the same series. The first-order threshold autoregressive model with a single threshold depends on two real parameters, which coincide with the coefficients of the piecewise linear threshold equation. These parameters are assumed to be unknown. The paper studies the least-squares estimate, the least-modules estimate, and M-estimates of these parameters. The aim of the paper is a comparative study of the accuracy of these estimates for the main probability distributions of the updating process of the threshold autoregressive equation: the normal, contaminated normal, logistic, and double-exponential distributions, Student's distributions with various numbers of degrees of freedom, and the Cauchy distribution. As a measure of the accuracy of each estimate, its variance was chosen, which characterizes the scattering of the estimate around the estimated parameter; of two estimates, the one with the smaller variance was considered the better. The variance was estimated by computer simulation. The least-modules estimate was computed by an iterative weighted least-squares method, the M-estimates by the deformable-polyhedron (Nelder-Mead) method, and the least-squares estimate from an explicit analytic expression. It turned out that the least-squares estimate is best only for a normally distributed updating process.
For the logistic distribution and the Student's distribution with the
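The piecewise structure of this model is easy to reproduce numerically. Below is a minimal sketch, assuming hypothetical regime coefficients (0.5 and -0.3, not taken from the paper): it simulates a first-order TAR process with a zero-located threshold and recovers both coefficients with the explicit least-squares expression, since the two regimes are fitted on disjoint observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical regime coefficients (illustrative, not from the paper).
a_pos, a_neg = 0.5, -0.3

# First-order TAR with a single zero-located threshold:
#   y_t = a_pos * y_{t-1} + e_t   if y_{t-1} >= 0
#   y_t = a_neg * y_{t-1} + e_t   if y_{t-1} <  0
n = 20000
y = np.zeros(n)
e = rng.normal(size=n)
for t in range(1, n):
    y[t] = (a_pos if y[t - 1] >= 0 else a_neg) * y[t - 1] + e[t]

# Least-squares estimates: the regimes use disjoint observations, so each
# coefficient is an ordinary AR(1) OLS estimate over its own regime:
#   b_hat = sum(y_t * y_{t-1}) / sum(y_{t-1}^2)
prev, curr = y[:-1], y[1:]
pos = prev >= 0
a_pos_hat = np.sum(curr[pos] * prev[pos]) / np.sum(prev[pos] ** 2)
a_neg_hat = np.sum(curr[~pos] * prev[~pos]) / np.sum(prev[~pos] ** 2)
print(a_pos_hat, a_neg_hat)
```

With heavier-tailed noise in place of the normal draws (e.g. Cauchy), this least-squares estimator degrades sharply, which is what motivates the robust alternatives compared in the paper.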

  18. Measuring the educational impact of Promoting Environmental Awareness in Kids (PEAK): The development and implementation of a new scale

    Science.gov (United States)

    Jennifer Miller; Lindsey Brown; Eddie Hill; Amy Shellman; Ron Ramsing; Edwin. Gómez

    2012-01-01

    The Leave No Trace Center for Outdoor Ethics (LNT) is a nonprofit educational organization that teaches skills and values for recreating responsibly in the out-of-doors. LNT developed Promoting Environmental Awareness in Kids (PEAK), based on seven ethical principles. The PEAK program provides a pack that contains several interactive activities specifically designed to...

  19. Thresholding of auditory cortical representation by background noise

    Science.gov (United States)

    Liang, Feixue; Bai, Lin; Tao, Huizhong W.; Zhang, Li I.; Xiao, Zhongju

    2014-01-01

    It is generally thought that background noise can mask auditory information. However, how the noise specifically transforms neuronal auditory processing in a level-dependent manner remains to be carefully determined. Here, with in vivo loose-patch cell-attached recordings in layer 4 of the rat primary auditory cortex (A1), we systematically examined how continuous wideband noise of different levels affected receptive field properties of individual neurons. We found that the background noise, when above a certain critical/effective level, resulted in an elevation of intensity threshold for tone-evoked responses. This increase of threshold was linearly dependent on the noise intensity above the critical level. As such, the tonal receptive field (TRF) of individual neurons was translated upward as an entirety toward high intensities along the intensity domain. This resulted in preserved preferred characteristic frequency (CF) and the overall shape of TRF, but reduced frequency responding range and an enhanced frequency selectivity for the same stimulus intensity. Such translational effects on intensity threshold were observed in both excitatory and fast-spiking inhibitory neurons, as well as in both monotonic and nonmonotonic (intensity-tuned) A1 neurons. Our results suggest that in a noise background, fundamental auditory representations are modulated through a background level-dependent linear shifting along intensity domain, which is equivalent to reducing stimulus intensity. PMID:25426029

  1. Effects of fatigue on motor unit firing rate versus recruitment threshold relationships.

    Science.gov (United States)

    Stock, Matt S; Beck, Travis W; Defreitas, Jason M

    2012-01-01

    The purpose of this study was to examine the influence of fatigue on the average firing rate versus recruitment threshold relationships for the vastus lateralis (VL) and vastus medialis. Nineteen subjects performed ten maximum voluntary contractions of the dominant leg extensors. Before and after this fatiguing protocol, the subjects performed a trapezoid isometric muscle action of the leg extensors, and bipolar surface electromyographic signals were detected from both muscles. These signals were then decomposed into individual motor unit action potential trains. For each subject and muscle, the relationship between average firing rate and recruitment threshold was examined using linear regression analyses. For the VL, the linear slope coefficients and y-intercepts for these relationships increased and decreased, respectively, after fatigue. For both muscles, many of the motor units decreased their firing rates. With fatigue, recruitment of higher threshold motor units resulted in an increase in slope for the VL. Copyright © 2011 Wiley Periodicals, Inc.
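The analysis described is an ordinary least-squares fit of average firing rate against recruitment threshold for each subject and muscle. A minimal sketch with made-up motor-unit values (illustrative only, not data from the study):

```python
import numpy as np

# Hypothetical motor-unit data: recruitment threshold (% of maximum voluntary
# contraction) and average firing rate (pulses per second). Higher-threshold
# units typically fire more slowly, so the slope is expected to be negative.
threshold = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 50.0])
firing_rate = np.array([22.0, 20.0, 17.0, 14.0, 12.0, 9.0])

# Ordinary least-squares line: firing_rate ~ slope * threshold + intercept.
slope, intercept = np.polyfit(threshold, firing_rate, 1)
print(f"slope = {slope:.3f} pps per %MVC, intercept = {intercept:.1f} pps")
```

Fatigue effects of the kind reported would show up as a change in `slope` and `intercept` between the pre- and post-fatigue fits.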

  2. Paradigm lost, paradigm found: The re-emergence of hormesis as a fundamental dose response model in the toxicological sciences

    International Nuclear Information System (INIS)

    Calabrese, Edward J.

    2005-01-01

This paper provides an assessment of the toxicological basis of the hormetic dose-response relationship including issues relating to its reproducibility, frequency, and generalizability across biological models, endpoints measured and chemical class/physical stressors and implications for risk assessment. The quantitative features of the hormetic dose response are described and placed within a toxicological context that considers study design, temporal assessment, mechanism, and experimental model/population heterogeneity. Particular emphasis is placed on an historical evaluation of why the field of toxicology rejected hormesis in favor of dose response models such as the threshold model for assessing non-carcinogens and linear no threshold (LNT) models for assessing carcinogens. The paper argues that such decisions were principally based on complex historical factors that emerged from the intense and protracted conflict between what is now called traditional medicine and homeopathy and the overly dominating influence of regulatory agencies on the toxicological intellectual agenda. Such regulatory agency influence emphasized hazard/risk assessment goals such as the derivation of no observed adverse effect levels (NOAELs) and the lowest observed adverse effect levels (LOAELs) which were derived principally from high dose studies using few doses, a feature which restricted perceptions and distorted judgments of several generations of toxicologists concerning the nature of the dose-response continuum. Such historical and technical blind spots led the field of toxicology not only to reject an established dose-response model (hormesis), but also the model that was more common and fundamental than those that the field accepted. - The quantitative features of the hormetic dose/response are described and placed within the context of toxicology

  3. Paradigm lost, paradigm found: The re-emergence of hormesis as a fundamental dose response model in the toxicological sciences

    Energy Technology Data Exchange (ETDEWEB)

    Calabrese, Edward J. [Environmental Health Sciences, School of Public Health, Morrill I, N344, University of Massachusetts, Amherst, MA 01003 (United States)]. E-mail: edwardc@schoolph.umass.edu

    2005-12-15

This paper provides an assessment of the toxicological basis of the hormetic dose-response relationship including issues relating to its reproducibility, frequency, and generalizability across biological models, endpoints measured and chemical class/physical stressors and implications for risk assessment. The quantitative features of the hormetic dose response are described and placed within a toxicological context that considers study design, temporal assessment, mechanism, and experimental model/population heterogeneity. Particular emphasis is placed on an historical evaluation of why the field of toxicology rejected hormesis in favor of dose response models such as the threshold model for assessing non-carcinogens and linear no threshold (LNT) models for assessing carcinogens. The paper argues that such decisions were principally based on complex historical factors that emerged from the intense and protracted conflict between what is now called traditional medicine and homeopathy and the overly dominating influence of regulatory agencies on the toxicological intellectual agenda. Such regulatory agency influence emphasized hazard/risk assessment goals such as the derivation of no observed adverse effect levels (NOAELs) and the lowest observed adverse effect levels (LOAELs) which were derived principally from high dose studies using few doses, a feature which restricted perceptions and distorted judgments of several generations of toxicologists concerning the nature of the dose-response continuum. Such historical and technical blind spots led the field of toxicology not only to reject an established dose-response model (hormesis), but also the model that was more common and fundamental than those that the field accepted. - The quantitative features of the hormetic dose/response are described and placed within the context of toxicology.

  4. The biological effects of exposure to ionising radiation

    International Nuclear Information System (INIS)

    Higson, D.J.

    2016-01-01

    Scenarios for exposure to ionising radiation range from natural background radiation (chronic) to the explosions of atomic bombs (acute), with some medical, industrial and research exposures lying between these extremes. Biological responses to radiation that predominate at high doses incurred at high dose rates are different from those that predominate at low doses and low dose rates. Single doses from bomb explosions ranged up to many thousand mGy. Acute doses greater than about 1000 mGy cause acute radiation syndrome (ARS). Below this threshold, radiation has a variety of potential latent health effects: Change to the incidence of cancer is the most usual subject of attention but change to longevity may be the best overall measure because decreased incidences of non-cancer mortality have been observed to coincide with increased incidence of cancer mortality. Acute doses greater than 500 mGy cause increased risks of cancer and decreased life expectancy. For doses less than 100 mGy, beneficial overall health effects ('radiation hormesis') have been observed. At the other end of the spectrum, chronic exposure to natural radiation has occurred throughout evolution and is necessary for the normal life and health of current species. Dose rates greater than the present global average of about 2 mGy per year have either no discernible health effect or beneficial health effects up to several hundred mGy per year. It is clearly not credible that a single health effects model -- such as the linear no-threshold (LNT) model of risk estimation -- could fit all latent health effects. A more realistic model is suggested.

  5. Ionizing radiation in 21st century

    International Nuclear Information System (INIS)

    Jaworowski, Zbigniew

    2005-01-01

The paper begins with the author's personal experience in Poland at the time of the Chernobyl nuclear accident, followed by the main lessons that the author drew from the accident. After the discovery of ionizing radiation at the end of the 19th century, social perception has alternated between acceptance and rejection, stemming from recognition of its basic aspects: usefulness for medical applications and for technical and scientific aims, beneficial effects at low levels, and harmful effects at high levels. The author explains how the linear no-threshold (LNT) assumption, according to which even the lowest, near-zero doses of radiation may cause cancer and genetic harm, became established. Comparing the natural radioactivity of the earth's crust with the activity of the much shorter-lived radioactive wastes from the nuclear power cycle, it is concluded that no man-made component of the radioactive wastes has higher toxicity than natural Th-232. The paper concludes by stating that one century has not been long enough to adapt mentally to ionizing radiation and radioactivity, and that perhaps the 21st century will suffice for this adaptation. (S. Ohno)

  6. Recent international regulations: low dose-low rate radiation protection and the demise of reason.

    Science.gov (United States)

    Okkalides, Demetrios

    2008-01-01

The radiation protection measures suggested by the International Commission on Radiological Protection (ICRP), national regulating bodies and experts have become ever more strict despite the absence of information supporting the existence of the linear no-threshold (LNT) model or of any adverse effects of low dose, low rate (LDLR) irradiation. This tendency arises from the disproportionate response of human society to hazards that are currently in fashion, and is unreasonable. The 1 mSv/year dose limit for the public suggested by the ICRP corresponds to a 1/18,181 detriment-adjusted cancer risk and is much lower than other hazards faced by modern societies, such as driving and smoking, which carry corresponding risks of 1/2,100 and 1/2,000. Even the worldwide rate of deadly work accidents is higher, at 1/8,065. Such excessive safety measures against minimal risks from man-made radiation sources divert resources from very real and much greater hazards. In addition, they undermine research and development of radiation technology and tend to subjugate science and the quest for understanding nature to phobic practices.

  7. Vanguards of paradigm shift in radiation biology. Radiation-induced adaptive and bystander responses

    International Nuclear Information System (INIS)

    Matsumoto, Hideki; Hamada, Nobuyuki; Kobayashi, Yasuhiko; Takahashi, Akihisa; Ohnishi, Takeo

    2007-01-01

The risks of exposure to low dose ionizing radiation (below 100 mSv) are estimated by extrapolating from data obtained after exposure to high dose radiation, using a linear no-threshold model (LNT model). However, the validity of using this dose-response model is controversial because evidence accumulated over the past decade has indicated that living organisms, including humans, respond differently to low dose/low dose-rate radiation than they do to high dose/high dose-rate radiation. In other words, there are accumulated findings which cannot be explained by the classical "target theory" of radiation biology. The radioadaptive response, radiation-induced bystander effects, low-dose radio-hypersensitivity, and genomic instability are specifically observed in response to low dose/low dose-rate radiation, and the mechanisms underlying these responses often involve biochemical/molecular signals that respond to targeted and non-targeted events. Recently, correlations between the radioadaptive and bystander responses have been increasingly reported. The present review focuses on the latter two phenomena by summarizing observations supporting their existence, and discussing the linkage between them from the aspect of production of reactive oxygen and nitrogen species. (author)

  8. Low doses of ionizing radiation: Biological effects and regulatory control. Invited papers and discussions. Proceedings of an international conference

    International Nuclear Information System (INIS)

    1998-01-01

    The levels and biological effects resulting from exposure to ionizing radiation are continuously reviewed by the United Nations Committee on the Effects of Atomic Radiation (UNSCEAR). Since its creation in 1928, the International Commission on Radiological Protection (ICRP) has issued recommendations on protection against ionizing radiation. The UNSCEAR estimates and the ICRP recommendations have served as the basis for national and international safety standards on radiation safety, including those developed by the International Atomic Energy Agency (IAEA) and the World Health Organization (WHO). Concerning health effects of low doses of ionizing radiation, the international standards are based on the plausible assumption that, above the unavoidable background radiation dose, the probability of effects increases linearly with dose, i.e. on a 'linear, no threshold' (LNT) assumption. However, in recent years the biological estimates of health effects of low doses of ionizing radiation and the regulatory approach to the control of low level radiation exposure have been much debated. To foster information exchange on the relevant issues, an International Conference on Low Doses of Ionizing Radiation: Biological Effects and Regulatory Control, jointly sponsored by the IAEA and WHO in co-operation with UNSCEAR, was held from 17-21 November 1997 at Seville, Spain. These Proceedings contain the invited special reports, keynote papers, summaries of discussions, session summaries and addresses presented at the opening and closing of the Conference

  9. Ecological thresholds: The key to successful environmental management or an important concept with no practical application?

    Science.gov (United States)

    Groffman, P.M.; Baron, Jill S.; Blett, T.; Gold, A.J.; Goodman, I.; Gunderson, L.H.; Levinson, B.M.; Palmer, Margaret A.; Paerl, H.W.; Peterson, G.D.; Poff, N.L.; Rejeski, D.W.; Reynolds, J.F.; Turner, M.G.; Weathers, K.C.; Wiens, J.

    2006-01-01

    An ecological threshold is the point at which there is an abrupt change in an ecosystem quality, property or phenomenon, or where small changes in an environmental driver produce large responses in the ecosystem. Analysis of thresholds is complicated by nonlinear dynamics and by multiple factor controls that operate at diverse spatial and temporal scales. These complexities have challenged the use and utility of threshold concepts in environmental management despite great concern about preventing dramatic state changes in valued ecosystems, the need for determining critical pollutant loads and the ubiquity of other threshold-based environmental problems. In this paper we define the scope of the thresholds concept in ecological science and discuss methods for identifying and investigating thresholds using a variety of examples from terrestrial and aquatic environments, at ecosystem, landscape and regional scales. We end with a discussion of key research needs in this area.

  10. Multipole surface solitons supported by the interface between linear media and nonlocal nonlinear media

    International Nuclear Information System (INIS)

    Shi, Zhiwei; Li, Huagang; Guo, Qi

    2012-01-01

We address multipole surface solitons occurring at the interface between a linear medium and a nonlocal nonlinear medium. We show the impact of nonlocality, the propagation constant, and the linear index difference of the two media on the properties of the surface solitons. We find that there exists a threshold value of the degree of nonlocality for a given linear index difference of the two media; only when the degree of nonlocality exceeds this value can the multipole surface solitons be stable. -- Highlights: ► We show the impact of nonlocality and the linear index difference of the two media on the properties of the surface solitons. ► The surface solitons can be stable only when the degree of nonlocality exceeds a threshold value. ► Both the number of poles and the index difference of the two media influence the threshold value.

  11. Fault tolerance in parity-state linear optical quantum computing

    International Nuclear Information System (INIS)

    Hayes, A. J. F.; Ralph, T. C.; Haselgrove, H. L.; Gilchrist, Alexei

    2010-01-01

    We use a combination of analytical and numerical techniques to calculate the noise threshold and resource requirements for a linear optical quantum computing scheme based on parity-state encoding. Parity-state encoding is used at the lowest level of code concatenation in order to efficiently correct errors arising from the inherent nondeterminism of two-qubit linear-optical gates. When combined with teleported error-correction (using either a Steane or Golay code) at higher levels of concatenation, the parity-state scheme is found to achieve a saving of approximately three orders of magnitude in resources when compared to the cluster state scheme, at a cost of a somewhat reduced noise threshold.

  12. Interlocking-induced stiffness in stochastically microcracked materials beyond the transport percolation threshold

    Science.gov (United States)

    Picu, R. C.; Pal, A.; Lupulescu, M. V.

    2016-04-01

    We study the mechanical behavior of two-dimensional, stochastically microcracked continua in the range of crack densities close to, and above, the transport percolation threshold. We show that these materials retain stiffness up to crack densities much larger than the transport percolation threshold due to topological interlocking of sample subdomains. Even with a linear constitutive law for the continuum, the mechanical behavior becomes nonlinear in the range of crack densities bounded by the transport and stiffness percolation thresholds. The effect is due to the fractal nature of the fragmentation process and is not linked to the roughness of individual cracks.

  13. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments; in each experiment, thirty-four participants were randomly divided into two groups: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2°C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room-temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  14. Analysis of ecological thresholds in a temperate forest undergoing dieback.

    Directory of Open Access Journals (Sweden)

    Philip Martin

Positive feedbacks in drivers of degradation can cause threshold responses in natural ecosystems. Though threshold responses have received much attention in studies of aquatic ecosystems, they have been neglected in terrestrial systems, such as forests, where the long time-scales required for monitoring have impeded research. In this study we explored the role of positive feedbacks in a temperate forest that has been monitored for 50 years and is undergoing dieback, largely as a result of the death of the canopy dominant species (Fagus sylvatica, beech). Statistical analyses showed strong non-linear losses in basal area for some plots, while others showed relatively gradual change. Beech seedling density was positively related to canopy openness, but a similar relationship was not observed for saplings, suggesting a feedback whereby mortality in areas with high canopy openness was elevated. We combined this observation with empirical data on size- and growth-mediated mortality of trees to produce an individual-based model of forest dynamics. We used this model to simulate changes in the structure of the forest over 100 years under scenarios with different juvenile and mature mortality probabilities, as well as a positive feedback between seedling and mature tree mortality. This model produced declines in forest basal area when critical juvenile and mature mortality probabilities were exceeded. Feedbacks in juvenile mortality caused a greater reduction in basal area relative to scenarios with no feedback. Non-linear, concave declines of basal area occurred only when mature tree mortality was 3-5 times higher than rates observed in the field. Our results indicate that the longevity of trees may help to buffer forests against environmental change and that the maintenance of old, large trees may aid the resilience of forest stands.
In addition, our work suggests that dieback of forests may be avoidable providing pressures on mature and juvenile trees do

  15. Threshold Studies of the Microwave Instability in Electron Storage Rings

    International Nuclear Information System (INIS)

    Bane, Karl

    2010-01-01

We use a Vlasov-Fokker-Planck program and a linearized Vlasov solver to study the microwave instability threshold of impedance models: (1) a Q = 1 resonator and (2) shielded coherent synchrotron radiation (CSR), and find the results of the two programs agree well. For shielded CSR we show that only two dimensionless parameters, the shielding parameter Π and the strength parameter S_csr, are needed to describe the system. We further show that there is a strong instability associated with CSR, and that the threshold, to good approximation, is given by (S_csr)_th = 0.5 + 0.12Π. In particular, this means that shielding has little effect in stabilizing the beam for Π ∼ -3/2. We, in addition, find another instability in the vicinity of Π = 0.7 with a lower threshold, (S_csr)_th ∼ 0.2. We find that the threshold to this instability depends strongly on damping time, (S_csr)_th ∼ τ_p^(-1/2), and that the tune spread at threshold is small - both hallmarks of a weak instability.
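The reported threshold approximation for shielded CSR can be written down directly. Only the formula (S_csr)_th ≈ 0.5 + 0.12Π comes from the abstract; the parameter values passed in below are arbitrary illustrations:

```python
# Approximate microwave-instability threshold for shielded CSR, as reported
# in the abstract: (S_csr)_th = 0.5 + 0.12 * Pi, where Pi is the shielding
# parameter and S_csr the strength parameter.
def csr_threshold(shielding_parameter: float) -> float:
    return 0.5 + 0.12 * shielding_parameter

# A beam is predicted unstable when its strength parameter exceeds the threshold.
def is_unstable(s_csr: float, shielding_parameter: float) -> bool:
    return s_csr > csr_threshold(shielding_parameter)

print(csr_threshold(2.0))     # -> 0.74
print(is_unstable(0.9, 2.0))  # -> True
```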

  16. Justifying threshold voltage definition for undoped body transistors through 'crossover point' concept

    International Nuclear Information System (INIS)

    Baruah, Ratul Kumar; Mahapatra, Santanu

    2009-01-01

Two different definitions, one potential based and the other charge based, are used in the literature to define the threshold voltage of undoped body symmetric double gate transistors. This paper, by introducing the novel concept of a crossover point, proves that the charge based definition is more accurate than the potential based definition. It is shown that for a given channel length the potential based definition predicts an anomalous change in threshold voltage with body thickness variation, while the charge based definition yields a monotonic change. The threshold voltage is then extracted from drain current versus gate voltage characteristics using the linear extrapolation, transconductance and match-point methods. In all three cases it is found that the trend of threshold voltage variation supports the charge based definition.

  17. Testing for a Debt-Threshold Effect on Output Growth.

    Science.gov (United States)

    Lee, Sokbae; Park, Hyunmin; Seo, Myung Hwan; Shin, Youngki

    2017-12-01

    Using the Reinhart-Rogoff dataset, we find a debt threshold not around 90 per cent but around 30 per cent, above which the median real gross domestic product (GDP) growth falls abruptly. Our work is the first to formally test for threshold effects in the relationship between public debt and median real GDP growth. The null hypothesis of no threshold effect is rejected at the 5 per cent significance level for most cases. While we find no evidence of a threshold around 90 per cent, our findings from the post-war sample suggest that the debt threshold for economic growth may exist around a relatively small debt-to-GDP ratio of 30 per cent. Furthermore, countries with debt-to-GDP ratios above 30 per cent have GDP growth that is 1 percentage point lower at the median.
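A debt threshold of this kind is typically estimated by grid search over candidate split points. The sketch below uses synthetic data with a break placed at a debt ratio of 30 per cent, and fits regime means rather than the medians used in the paper, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic country-year observations: debt-to-GDP ratio (%) and GDP growth
# (%), with growth 1 percentage point lower above a 30% debt ratio. These
# numbers are illustrative, not the Reinhart-Rogoff data.
debt = rng.uniform(0, 120, size=500)
growth = np.where(debt > 30.0, 1.5, 2.5) + rng.normal(0, 0.5, size=500)

# Grid-search threshold estimator: choose the split point that minimizes the
# sum of squared residuals of a two-regime constant-mean model.
def estimate_threshold(x, y, grid):
    best_tau, best_ssr = None, np.inf
    for tau in grid:
        lo, hi = y[x <= tau], y[x > tau]
        if len(lo) < 10 or len(hi) < 10:  # keep both regimes populated
            continue
        ssr = np.sum((lo - lo.mean()) ** 2) + np.sum((hi - hi.mean()) ** 2)
        if ssr < best_ssr:
            best_tau, best_ssr = tau, ssr
    return best_tau

tau_hat = estimate_threshold(debt, growth, np.arange(5.0, 115.0, 1.0))
print(tau_hat)
```

The formal inference step in the paper (testing the null of no threshold effect) additionally requires a bootstrap or simulated null distribution, since the threshold parameter is absent under the null.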

  18. Bedding material affects mechanical thresholds, heat thresholds and texture preference

    Science.gov (United States)

    Moehring, Francie; O’Hara, Crystal L.; Stucky, Cheryl L.

    2015-01-01

    It has long been known that the bedding type animals are housed on can affect breeding behavior and cage environment. Yet little is known about its effects on evoked behavior responses or non-reflexive behaviors. C57BL/6 mice were housed for two weeks on one of five bedding types: Aspen Sani Chips® (standard bedding for our institute), ALPHA-Dri®, Cellu-Dri™, Pure-o’Cel™ or TEK-Fresh. Mice housed on Aspen exhibited the lowest (most sensitive) mechanical thresholds while those on TEK-Fresh exhibited 3-fold higher thresholds. While bedding type had no effect on responses to punctate or dynamic light touch stimuli, TEK-Fresh housed animals exhibited greater responsiveness in a noxious needle assay, than those housed on the other bedding types. Heat sensitivity was also affected by bedding as animals housed on Aspen exhibited the shortest (most sensitive) latencies to withdrawal whereas those housed on TEK-Fresh had the longest (least sensitive) latencies to response. Slight differences between bedding types were also seen in a moderate cold temperature preference assay. A modified tactile conditioned place preference chamber assay revealed that animals preferred TEK-Fresh to Aspen bedding. Bedding type had no effect in a non-reflexive wheel running assay. In both acute (two day) and chronic (5 week) inflammation induced by injection of Complete Freund’s Adjuvant in the hindpaw, mechanical thresholds were reduced in all groups regardless of bedding type, but TEK-Fresh and Pure-o’Cel™ groups exhibited a greater dynamic range between controls and inflamed cohorts than Aspen housed mice. PMID:26456764

  19. Modeling of Volatility with Non-linear Time Series Model

    OpenAIRE

    Kim Song Yon; Kim Mun Chol

    2013-01-01

    In this paper, non-linear time series models are used to describe volatility in financial time series data. To describe volatility, two non-linear time series models are combined to form a TAR (Threshold Auto-Regressive) model with an AARCH (Asymmetric Auto-Regressive Conditional Heteroskedasticity) error term, and its parameter estimation is studied.
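    A minimal sketch of the TAR idea on synthetic data: the autoregressive coefficient switches when the lagged value crosses a threshold, and regime-wise least squares recovers the two coefficients. The coefficients and threshold below are assumed for illustration; the paper's AARCH error term is not modeled.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-regime TAR(1): the AR coefficient switches depending on
# whether the previous value is above or below the threshold r = 0.
n, r = 2000, 0.0
phi_low, phi_high = 0.8, -0.4   # assumed regime coefficients
y = np.zeros(n)
for t in range(1, n):
    phi = phi_low if y[t-1] <= r else phi_high
    y[t] = phi * y[t-1] + rng.normal()

# Regime-wise least squares recovers the two coefficients.
prev, curr = y[:-1], y[1:]
low = prev <= r
phi_low_hat = (prev[low] * curr[low]).sum() / (prev[low]**2).sum()
phi_high_hat = (prev[~low] * curr[~low]).sum() / (prev[~low]**2).sum()
print(round(phi_low_hat, 2), round(phi_high_hat, 2))
```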

  20. Study of the Hearing Threshold of Dance Teachers

    Directory of Open Access Journals (Sweden)

    Nehring, Cristiane

    2015-03-01

    Full Text Available Introduction High sound pressure levels can cause hearing loss, beginning at high frequencies. Objective To analyze the hearing thresholds of dance teachers. Methods This study had a cross-sectional, observational, prospective, and descriptive design. Conventional and high-frequency hearing evaluations were performed with dance teachers and subjects in the control group. Results In all, 64 individuals were assessed, 32 in the research group and 32 in the control group. Results showed that individuals in the research group had hearing loss at frequencies between 4 and 8 kHz, but no significant difference was found between groups. Frequency analysis showed that individuals in the control group had higher thresholds than individuals in the research group at the frequency of 0.25 kHz. In the control group, men showed higher thresholds than women at the frequency of 9 kHz. Conclusion A low prevalence of hearing loss was found, with no difference between teachers and subjects from the control group. No difference was found for hearing thresholds at high frequencies between groups. Results have been partially affected by sex.

  1. Research on linear driving of wave maker; Zoha sochi no linear drive ka kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, I; Taniguchi, S; Nohara, T [Mitsubishi Heavy Industries, Ltd., Tokyo (Japan)

    1997-10-01

    The water tank test of marine structures or submarine structures uses a wave maker to generate waves. A typical flap wave maker uses a wave making flap penetrating the water surface; the bottom of the flap is fixed to the tank bottom through a hinge, and the top is connected with a rod driven by a rotating servomotor for reciprocating motion of the flap. However, this driving gear using a rotating servomotor and a ball screw has some defects, such as noise caused by ball rotation, backlash due to wear, and limited driving speed. A linear motor with fewer friction mechanisms was thus applied to the driving gear. The performance test of the prototype driving gear using a linear motor showed the feasibility of a linear driven wave maker, which also achieved low noise and a simple mechanism. The sufficient durability and applicability of the linear driven wave maker mechanism were confirmed through the strength calculations needed to improve the prototype wave maker. 1 ref., 5 figs., 2 tabs.

  2. Alternative method for determining anaerobic threshold in rowers

    Directory of Open Access Journals (Sweden)

    Giovani Dos Santos Cunha

    2008-01-01

    Full Text Available http://dx.doi.org/10.5007/1980-0037.2008v10n4p367 In rowing, the standard breathing pattern that athletes are trained to use makes it difficult, or even impossible, to detect ventilatory limits, due to the coupling of the breath with the technical movement. For this reason, some authors have proposed determining the anaerobic threshold from the respiratory exchange ratio (RER), but there is not yet consensus on what value of RER should be used. The objective of this study was to test which value of RER corresponds to the anaerobic threshold and whether this value can be used as an independent parameter for determining the anaerobic threshold of rowers. The sample comprised 23 male rowers. They were submitted to a maximal cardiorespiratory test on a rowing ergometer with concurrent ergospirometry in order to determine VO2max and the physiological variables corresponding to their anaerobic threshold. The anaerobic threshold was determined using the Dmax (maximal distance) method. The physiological variables were classified into maximum values and anaerobic threshold values. At maximal effort, the rowers reached VO2 of 58.2±4.4 ml.kg-1.min-1, lactate of 8.2±2.1 mmol.L-1, power of 384±54.3 W and RER of 1.26±0.1. At the anaerobic threshold they reached VO2 of 46.9±7.5 ml.kg-1.min-1, lactate of 4.6±1.3 mmol.L-1, power of 300±37.8 W and RER of 0.99±0.1. Conclusions: the RER can be used as an independent method for determining the anaerobic threshold of rowers, adopting a value of 0.99; however, RER should exhibit a non-linear increase above this figure.
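    The Dmax method mentioned in the abstract can be sketched as follows: it picks the observation farthest, in perpendicular distance, from the chord joining the first and last points of the intensity-response curve. The lactate and power values below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical lactate (mmol/L) measured at increasing power (W);
# illustrative values only.
power   = np.array([100, 150, 200, 250, 300, 350, 384], float)
lactate = np.array([1.2, 1.4, 1.8, 2.6, 4.4, 6.5, 8.2], float)

def dmax_point(x, y):
    """Dmax: the observation with maximal perpendicular distance from the
    chord joining the first and last points of the curve."""
    p0, p1 = np.array([x[0], y[0]]), np.array([x[-1], y[-1]])
    chord = p1 - p0
    chord /= np.linalg.norm(chord)
    pts = np.stack([x, y], axis=1) - p0
    # perpendicular distance = |2-D cross product| with the unit chord vector
    dist = np.abs(pts[:, 0] * chord[1] - pts[:, 1] * chord[0])
    return x[np.argmax(dist)]

print(dmax_point(power, lactate))  # power at the Dmax threshold estimate
```

    In practice the curve is usually fitted with a polynomial before applying Dmax; the raw-data version above keeps the sketch short.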

  3. Pathway to a paradigm: the linear nonthreshold dose-response model in historical context. The American Academy of Health Physics 1995 Radiology Centennial Hartman Oration.

    Science.gov (United States)

    Kathren, R L

    1996-05-01

    This paper traces the evolution of the linear nonthreshold dose-response model and its acceptance as a paradigm in radiation protection practice and risk analysis. Deterministic effects such as skin burns and even deep tissue trauma were associated with excessive exposure to x rays shortly after their discovery, and carcinogenicity was observed as early as 1902. Still, it was not until 1925 that the first protective limits were suggested. For three decades these limits were based on the concept of a tolerance dose which, if not exceeded, would result in no demonstrable harm to the individual and implicitly assumed a threshold dose below which radiation effects would be absent. After World War II, largely because of genetic concerns related to atmospheric weapons testing, radiation protection dose limits were expressed in terms of a risk based maximum permissible dose which clearly implied no threshold. The 1927 discovery by Muller of x-ray induced genetic mutations in fruit flies, linear with dose and with no apparent threshold, was an important underpinning of the standards. The linear nonthreshold dose-response model was originally used to provide an upper limit estimate of the risk, with zero being the lower limit, of low level irradiation since the dose-response curve could not be determined at low dose levels. Evidence to the contrary such as hormesis and the classic studies of the radium dial painters notwithstanding, the linear nonthreshold model gained greater acceptance and in the centennial year of the discovery of x rays stands as a paradigm although serious questions are beginning to be raised regarding its general applicability. The work includes a brief digression describing the work of x-ray protection pioneer William Rollins and concludes with a recommendation for application of a de minimis dose level in radiation protection.

  4. Sparse signals recovered by non-convex penalty in quasi-linear systems.

    Science.gov (United States)

    Cui, Angang; Li, Haiyang; Wen, Meng; Peng, Jigen

    2018-01-01

    The goal of compressed sensing is to reconstruct a sparse signal from a few linear measurements, far fewer than the dimension of the ambient space of the signal. However, many real-life applications in physics and the biomedical sciences carry strongly nonlinear structures, and the linear model is no longer suitable. Compared with compressed sensing in the linear setting, this nonlinear compressed sensing is much more difficult, in fact an NP-hard combinatorial problem, because of the discrete and discontinuous nature of the [Formula: see text]-norm and the nonlinearity. To make sparse signal recovery tractable, we assume in this paper that the nonlinear models have a smooth quasi-linear nature, and we study a non-convex fraction function [Formula: see text] in this quasi-linear compressed sensing. We propose an iterative fraction thresholding algorithm to solve the regularization problem [Formula: see text] for all [Formula: see text]. With the change of the parameter [Formula: see text], our algorithm can obtain a promising result, which is one of its advantages compared with some state-of-the-art algorithms. Numerical experiments show that our method performs much better than some state-of-the-art methods.

  5. How well can we reconstruct the t anti t system near its threshold at future e sup + e sup - linear colliders?

    CERN Document Server

    Ikematsu, K; Hioki, Z; Sumino, Y; Takahashi, T

    2003-01-01

    We developed a new method for the full kinematical reconstruction of the t anti t system near its threshold at future linear e sup + e sup - colliders. In the core of the method lies likelihood fitting which is designed to improve measurement accuracies of the kinematical variables that specify the final states resulting from t anti t decays. The improvement is demonstrated by applying this method to a Monte Carlo t anti t sample generated with various experimental effects including beamstrahlung, finite acceptance and resolution of the detector system, etc. In most cases the fit takes a broad non-Gaussian distribution of a given kinematical variable to a nearly Gaussian shape, thereby justifying phenomenological analyses based on simple Gaussian smearing of the parton-level momenta. The standard deviations of the resultant distributions of various kinematical variables are given in order to facilitate such phenomenological analyses. A possible application of the kinematical fitting method and its expected im...

  6. Investigation of excimer laser ablation threshold of polymers using a microphone

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, Joerg; Niino, Hiroyuki; Yabe, Akira

    2002-09-30

    KrF excimer laser ablation of polyethylene terephthalate (PET), polyimide (PI) and polycarbonate (PC) in air was studied by an in situ monitoring technique using a microphone. The microphone signal generated by a short acoustic pulse represented the etch rate of laser ablation depending on the laser fluence, i.e., the ablation 'strength'. From a linear relationship between the microphone output voltage and the laser fluence, the single-pulse ablation thresholds were found to be 30 mJ cm{sup -2} for PET, 37 mJ cm{sup -2} for PI and 51 mJ cm{sup -2} for PC (20-pulses threshold). The ablation thresholds of PET and PI were not influenced by the number of pulses per spot, while PC showed an incubation phenomenon. A microphone technique provides a simple method to determine the excimer laser ablation threshold of polymer films.
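    The threshold determination described (a linear relationship between microphone output voltage and laser fluence, extrapolated to zero signal) can be sketched with hypothetical readings; the numbers below are illustrative, not the paper's measurements.

```python
import numpy as np

# Hypothetical microphone readings (V) at several laser fluences (mJ/cm^2);
# above threshold the acoustic signal grows roughly linearly with fluence.
fluence = np.array([40., 60., 80., 100., 120.])
signal  = np.array([0.21, 0.62, 1.05, 1.40, 1.85])

# Fit signal = a * fluence + b and extrapolate the line to zero signal:
a, b = np.polyfit(fluence, signal, 1)
threshold = -b / a
print(round(threshold, 1))  # fluence where the fitted line crosses zero
```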

  7. Removing Malmquist bias from linear regressions

    Science.gov (United States)

    Verter, Frances

    1993-01-01

    Malmquist bias is present in all astronomical surveys where sources are observed above an apparent brightness threshold. Those sources which can be detected at progressively larger distances are progressively more limited to the intrinsically luminous portion of the true distribution. This bias does not distort any of the measurements, but distorts the sample composition. We have developed the first treatment to correct for Malmquist bias in linear regressions of astronomical data. A demonstration of the corrected linear regression that is computed in four steps is presented.

  8. Threshold Theory Tested in an Organizational Setting

    DEFF Research Database (Denmark)

    Christensen, Bo T.; Hartmann, Peter V. W.; Hedegaard Rasmussen, Thomas

    2017-01-01

    A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no correlation. Support for the threshold theory of creativity was found, in that the correlation between IQ and innovativeness was positive and significant below a cutoff point of IQ 120. Above the cutoff, no significant relation was identified, and the two correlations differed significantly. The finding was stable across distinct parts of the sample, providing support for the theory, although the correlations in all subsamples were small. The findings lend support to the existence of threshold effects using perceptual measures of behavior in real...
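    The threshold hypothesis can be illustrated on synthetic data: generate an outcome that tracks IQ only up to a cutoff of 120, then compare the correlations below and above the cutoff. All data below are simulated, not the study's sample.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated illustration of the threshold theory: innovativeness rises with
# IQ up to a cutoff of 120, then plateaus (plus noise).
iq = rng.uniform(80, 160, 4000)
innovativeness = np.minimum(iq, 120) / 10 + rng.normal(0, 1.0, 4000)

below, above = iq < 120, iq >= 120
r_below = np.corrcoef(iq[below], innovativeness[below])[0, 1]
r_above = np.corrcoef(iq[above], innovativeness[above])[0, 1]
print(round(r_below, 2), round(r_above, 2))  # positive below, near zero above
```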

  9. No-go theorem for passive single-rail linear optical quantum computing.

    Science.gov (United States)

    Wu, Lian-Ao; Walther, Philip; Lidar, Daniel A

    2013-01-01

    Photonic quantum systems are among the most promising architectures for quantum computers. It is well known that for dual-rail photons effective non-linearities and near-deterministic non-trivial two-qubit gates can be achieved via the measurement process and by introducing ancillary photons. While in principle this opens a legitimate path to scalable linear optical quantum computing, the technical requirements are still very challenging and thus other optical encodings are being actively investigated. One of the alternatives is to use single-rail encoded photons, where entangled states can be deterministically generated. Here we prove that even for such systems universal optical quantum computing using only passive optical elements such as beam splitters and phase shifters is not possible. This no-go theorem proves that photon bunching cannot be passively suppressed even when extra ancilla modes and arbitrary number of photons are used. Our result provides useful guidance for the design of optical quantum computers.

  10. Mass spectrometric determination of partial electron impact ionization cross sections of No, No2, and N2O from threshold up to 180 eV

    International Nuclear Information System (INIS)

    Kim, Y. B.

    1982-01-01

    Electron impact ionization of nitric oxide (NO), nitrogen dioxide (NO₂) and nitrous oxide (N₂O) has been studied as a function of electron energy up to 180 eV with a double focussing mass spectrometer Varian MAT CH5 and an improved Nier type electron impact ion source. Relative partial ionization cross sections were measured for the processes NO + e → NO⁺ + 2e, NO + e → NO²⁺ + 3e, and NO₂ + e → NO₂⁺ + 2e, NO₂ + e → NO₂²⁺ + 3e, and N₂O + e → N₂O⁺ + 2e. An accurate measurement of the cross section ratios q(NO²⁺/NO)/q(NO⁺/NO) and q(NO₂²⁺/NO₂)/q(NO₂⁺/NO₂) has been made. Relative cross section functions were calibrated absolutely with two different normalization methods. Moreover, both metastable and collision induced dissociations of N₂O⁺ were studied quantitatively using the technique of decoupling the acceleration and deflection electric fields. Using the n-th root extrapolation, the following ionization potentials have been derived from the cross section functions near threshold: NO⁺(X¹Σ⁺); NO²⁺; NO₂⁺; NO₂²⁺; N₂O⁺(X²Π). These results are compared with previous measurements and theoretical calculations, where available. Part of the results presented have already been published in seven papers by the author. (Author)

  11. No-Impact Threshold Values for NRAP's Reduced Order Models

    Energy Technology Data Exchange (ETDEWEB)

    Last, George V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Murray, Christopher J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Christopher F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, Preston D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sharma, Maneesh [West Virginia Univ., and National Energy Technlogy Lab., Morgantown, WV (United States)

    2013-02-01

    The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and predicted concentrations that would be used to represent a contamination plume in the Gen II models being developed by NRAP's Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow-unconfined aquifer system of the Edwards-Trinity Aquifer System (being used to develop the ROM for carbonate-rock aquifers), and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, being used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA's "Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities" (US Environmental Protection Agency 2009). Results from this effort can be used to inform a "no change" scenario with respect to groundwater impacts, rather than the use of an MCL that could be significantly higher than existing concentrations in the aquifer.

  12. White Light Generation and Anisotropic Damage in Gold Films near Percolation Threshold

    DEFF Research Database (Denmark)

    Novikov, Sergey M.; Frydendahl, Christian; Beermann, Jonas

    2017-01-01

    in vanishingly small gaps between gold islands in thin films near the electrically determined percolation threshold. Optical explorations using two-photon luminescence (TPL) and near-field microscopies reveal supercubic TPL power dependencies with white-light spectra, establishing unequivocally that the strongest TPL signals are generated close to the percolation threshold films, and the occurrence of extremely confined (similar to 30 nm) and strongly enhanced (similar to 100 times) fields at the illumination wavelength. For linearly polarized and sufficiently powerful light, we observe pronounced optical...

  13. Compartmentalization in environmental science and the perversion of multiple thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Burkart, W. [Institute of Radiation Hygiene of the Federal Office for Radiation Protection, Ingolstaedter Landstr. 1, D 85716 Oberschleissheim, Muenchen (Germany)

    2000-04-17

    Nature and living organisms are separated into compartments. The self-assembly of phospholipid micelles was as fundamental to the emergence of life and evolution as the formation of DNA precursors and their self-replication. Also, modern science owes much of its success to the study of single compartments, the dissection of complex structures and event chains into smaller study objects which can be manipulated with a set of more and more sophisticated equipment. However, in environmental science, these insights are obtained at a price: firstly, it is difficult to recognize, let alone to take into account what is lost during fragmentation and dissection; and secondly, artificial compartments such as scientific disciplines become self-sustaining, leading to new and unnecessary boundaries, subtly framing scientific culture and impeding progress in holistic understanding. The long-standing but fruitless quest to define dose-effect relationships and thresholds for single toxic agents in our environment is a central part of the problem. Debating single-agent toxicity in splendid isolation is deeply flawed in view of a modern world where people are exposed to low levels of a multitude of genotoxic and non-genotoxic agents. Its potential danger lies in the unwarranted postulation of separate thresholds for agents with similar action. A unifying concept involving toxicology and radiation biology is needed for a full mechanistic assessment of environmental health risks. The threat of synergism may be less than expected, but this may also hold for the safety margin commonly thought to be a consequence of linear no-threshold dose-effect relationship assumptions.

  14. Calibration of the neutron scintillation counter threshold

    International Nuclear Information System (INIS)

    Noga, V.I.; Ranyuk, Yu.N.; Telegin, Yu.N.

    1978-01-01

    A method for calibrating the threshold of a neutron counter in the form of a 10x10x40 cm plastic scintillator is described. The method is based on evaluating the Compton edge of the γ-spectrum from the discrimination curve of counter loading. The results of calibration using ⁶⁰Co and ²⁴Na γ-sources are given. In order to evaluate the Compton edge rapidly, linear extrapolation of the linear part of the discrimination curve to its intersection with the X axis is recommended. Special measurements have shown that the calibration results do not practically depend on the distance between the cathode of the photomultiplier and the place where the collimated γ-radiation of the calibration source reaches the scintillator

  15. Influence of arousal threshold and depth of sleep on respiratory stability in man: analysis using a mathematical model.

    Science.gov (United States)

    Longobardo, G S; Evangelisti, C J; Cherniack, N S

    2009-12-01

    We examined the effect of arousals (shifts from sleep to wakefulness) on breathing during sleep using a mathematical model. The model consisted of a description of the fluid dynamics and mechanical properties of the upper airways and lungs, as well as a controller sensitive to arterial and brain changes in CO(2), changes in arterial oxygen, and a neural input, alertness. The body was divided into multiple gas store compartments connected by the circulation. Cardiac output was constant, and cerebral blood flows were sensitive to changes in O(2) and CO(2) levels. Arousal was considered to occur instantaneously when afferent respiratory chemical and neural stimulation reached a threshold value, while sleep occurred when stimulation fell below that value. In the case of rigid and nearly incompressible upper airways, lowering arousal threshold decreased the stability of breathing and led to the occurrence of repeated apnoeas. In more compressible upper airways, to maintain stability, increasing arousal thresholds and decreasing elasticity were linked approximately linearly, until at low elastances arousal thresholds had no effect on stability. Increased controller gain promoted instability. The architecture of apnoeas during unstable sleep changed with the arousal threshold and decreases in elasticity. With rigid airways, apnoeas were central. With lower elastances, apnoeas were mixed even with higher arousal thresholds. With very low elastances and still higher arousal thresholds, sleep consisted totally of obstructed apnoeas. Cycle lengths shortened as the sleep architecture changed from mixed apnoeas to total obstruction. Deeper sleep also tended to promote instability by increasing plant gain. These instabilities could be countered by arousal threshold increases which were tied to deeper sleep or accumulated aroused time, or by decreased controller gains.

  16. Evapotranspiration patterns in complex upland forests reveal contrasting topographic thresholds of non-linearity

    Science.gov (United States)

    Metzen, D.; Sheridan, G. J.; Benyon, R. G.; Bolstad, P. V.; Nyman, P.; Lane, P. N. J.

    2017-12-01

    Large areas of forest are often treated as being homogeneous just because they fall in a single climate category. However, we observe strong vegetation patterns in relation to topography in SE Australian forests and thus hypothesise that ET will vary spatially as well. Spatial heterogeneity evolves over different temporal scales in response to climatic forcing with increasing time lag from soil moisture (sub-yearly), to vegetation (10s -100s of years) to soil properties and topography (>100s of years). Most importantly, these processes and time scales are not independent, creating feedbacks that result in "co-evolved stable states" which yield the current spatial terrain, vegetation and ET patterns. We used up-scaled sap flux and understory ET measurements from water-balance plots, as well as LiDAR derived terrain and vegetation information, to infer links between spatio-temporal energy and water fluxes, topography and vegetation patterns at small catchment scale. Topography caused variations in aridity index between polar and equatorial-facing slopes (1.3 vs 1.8), which in turn manifested in significant differences in sapwood area index (6.9 vs 5.8), overstory LAI (3.0 vs 2.3), understory LAI (0.5 vs 0.4), sub-canopy radiation load (4.6 vs 6.8 MJ m-2 d-1), overstory transpiration (501 vs 347 mm a-1) and understory ET (79 vs 155 mm a-1). Large spatial variation in overstory transpiration (195 to 891 mm a-1) was observed over very short distances (100s m); a range representative of diverse forests such as arid open woodlands and wet mountain ash forests. Contrasting, non-linear overstory and understory ET patterns were unveiled between aspects, and topographic thresholds were lower for overstory than understory ET. While ET partitioning remained stable on polar-facing slopes regardless of slope position, overstory contribution gradually decreased with increasing slope inclination on equatorial aspects. 
Further, we show that ET patterns and controls underlie strong

  17. Non-linear leak currents affect mammalian neuron physiology

    Directory of Open Access Journals (Sweden)

    Shiwei Huang

    2015-11-01

    Full Text Available In their seminal works on squid giant axons, Hodgkin and Huxley approximated the membrane leak current as Ohmic, i.e. linear, since in their preparation, sub-threshold current rectification due to the influence of ionic concentration is negligible. Most studies on mammalian neurons have made the same, largely untested, assumption. Here we show that the membrane time constant and input resistance of mammalian neurons (when other major voltage-sensitive and ligand-gated ionic currents are discounted varies non-linearly with membrane voltage, following the prediction of a Goldman-Hodgkin-Katz-based passive membrane model. The model predicts that under such conditions, the time constant/input resistance-voltage relationship will linearize if the concentration differences across the cell membrane are reduced. These properties were observed in patch-clamp recordings of cerebellar Purkinje neurons (in the presence of pharmacological blockers of other background ionic currents and were more prominent in the sub-threshold region of the membrane potential. Model simulations showed that the non-linear leak affects voltage-clamp recordings and reduces temporal summation of excitatory synaptic input. Together, our results demonstrate the importance of trans-membrane ionic concentration in defining the functional properties of the passive membrane in mammalian neurons as well as other excitable cells.

  18. A Robust Threshold for Iterative Channel Estimation in OFDM Systems

    Directory of Open Access Journals (Sweden)

    A. Kalaycioglu

    2010-04-01

    Full Text Available A novel threshold computation method for pilot symbol assisted iterative channel estimation in OFDM systems is considered. As the bits are transmitted in packets, the proposed technique is based on calculating a particular threshold for each data packet in order to select the reliable decoder output symbols to improve the channel estimation performance. Iteratively, additional pilot symbols are established according to the threshold and the channel is re-estimated with the new pilots inserted to the known channel estimation pilot set. The proposed threshold calculation method for selecting additional pilots performs better than non-iterative channel estimation, no threshold and fixed threshold techniques in poor HF channel simulations.
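    One plausible reading of the per-packet threshold idea can be sketched as follows. The quantities are assumed: exponentially distributed reliability magnitudes stand in for decoder soft outputs, and a quantile rule stands in for the paper's threshold computation; the function name is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical decoder soft-output magnitudes (|LLR|) for one 256-symbol
# packet; illustrative stand-in data only.
llr_mag = rng.exponential(scale=2.0, size=256)

def select_extra_pilots(reliability, quantile=0.8):
    """Return indices of symbols reliable enough to reuse as extra pilots,
    using a threshold computed from this packet's own statistics (an assumed
    stand-in for the paper's per-packet threshold computation)."""
    threshold = np.quantile(reliability, quantile)
    return np.flatnonzero(reliability >= threshold)

extra = select_extra_pilots(llr_mag)
print(len(extra))  # roughly a fifth of the packet
```

    The selected symbols would then be appended to the known pilot set before re-estimating the channel in the next iteration.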

  19. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    Science.gov (United States)

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data, but that by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed and the application of TTC for different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than levels where hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on toxic effects of chemicals provide us with the opportunity to set TTC for many hazards and substance classes and thus offer a precautionary second tier for risk assessments if hazard cannot be excluded. This allows focusing testing efforts better on relevant exposures to chemicals.
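    The TTC construction described above (the 5th percentile of a distribution of no-effect doses, lowered by an uncertainty factor of 100) can be sketched on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical no-effect doses (mg/kg bw/day) for a class of chemicals,
# roughly log-normally distributed; illustrative values only.
noel = rng.lognormal(mean=2.0, sigma=1.0, size=500)

# TTC as described in the text: 5th percentile of the distribution,
# lowered by an uncertainty factor of 100.
p5 = np.percentile(noel, 5)
ttc = p5 / 100.0
print(p5, ttc)
```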

  20. Threshold responses of Amazonian stream fishes to timing and extent of deforestation.

    Science.gov (United States)

    Brejão, Gabriel L; Hoeinghaus, David J; Pérez-Mayorga, María Angélica; Ferraz, Silvio F B; Casatti, Lilian

    2017-12-06

    Deforestation is a primary driver of biodiversity change through habitat loss and fragmentation. Stream biodiversity may not respond to deforestation in a simple linear relationship. Rather, threshold responses to extent and timing of deforestation may occur. Identification of critical deforestation thresholds is needed for effective conservation and management. We tested for threshold responses of fish species and functional groups to degree of watershed and riparian zone deforestation and time since impact in 75 streams in the western Brazilian Amazon. We used remote sensing to assess deforestation from 1984 to 2011. Fish assemblages were sampled with seines and dip nets in a standardized manner. Fish species (n = 84) were classified into 20 functional groups based on ecomorphological traits associated with habitat use, feeding, and locomotion. Threshold responses were quantified using threshold indicator taxa analysis. Negative threshold responses to deforestation were common and consistently occurred at very low levels of deforestation (70% deforestation and >10 years after impact. Findings were similar at the community level for both taxonomic and functional analyses. Because most negative threshold responses occurred at low levels of deforestation and soon after impact, even minimal change is expected to negatively affect biodiversity. Delayed positive threshold responses to extreme deforestation by a few species do not offset the loss of sensitive taxa and likely contribute to biotic homogenization. © 2017 Society for Conservation Biology.

  1. Linear gate with prescaled window

    Energy Technology Data Exchange (ETDEWEB)

    Koch, J; Bissem, H H; Krause, H; Scobel, W [Hamburg Univ. (Germany, F.R.). 1. Inst. fuer Experimentalphysik

    1978-07-15

    An electronic circuit is described that combines the features of a linear gate, a single channel analyzer and a prescaler. It allows selection of a pulse height region between two adjustable thresholds and scales the intensity of the spectrum within this window down by a factor of 2^N (0 ≤ N ≤ 9), whereas the complementary part of the spectrum is transmitted without being affected.
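    A behavioral sketch of the described circuit, with idealized pulse heights and a hypothetical function name: pulses inside the window are scaled down by 2^N, while the complementary part of the spectrum passes unchanged.

```python
def gate_with_prescaler(pulses, lower, upper, n):
    """Forward every 2**n-th pulse whose height falls inside the window;
    pulses outside the window are transmitted unaffected."""
    accepted = 0
    out = []
    for p in pulses:
        if lower <= p <= upper:
            accepted += 1
            if accepted % (2 ** n) == 0:   # prescale the window by 2**n
                out.append(p)
        else:
            out.append(p)                  # complementary spectrum unaffected
    return out

# Window [4, 6], N = 2: every 4th in-window pulse is kept.
pulses = [1, 5, 5, 5, 5, 9, 5, 5, 5, 5]
print(gate_with_prescaler(pulses, 4, 6, 2))  # -> [1, 5, 9, 5]
```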

  2. Non-linearity consideration when analyzing reactor noise statistical characteristics. [BWR

    Energy Technology Data Exchange (ETDEWEB)

    Kebadze, B V; Adamovski, L A

    1975-06-01

    Statistical characteristics of boiling water reactor noise in the vicinity of the stability threshold are studied. The reactor is considered as a non-linear system affected by random perturbations. To solve the non-linear problem, the principle of statistical linearization is used. It is shown that the half-width of the resonance peak in the neutron power noise spectral density, as well as the reciprocal of the noise dispersion, both of which are used in predicting the stable-operation threshold, differ from zero both within and beyond the stability boundary determined by linear criteria.

  3. Linearization Method and Linear Complexity

    Science.gov (United States)

    Tanaka, Hidema

    We focus on the relationship between the linearization method and linear complexity and show that the linearization method is another effective technique for calculating linear complexity. We analyze its effectiveness by comparing it with the logic circuit method. We compare the relevant conditions and necessary computational cost with those of the Berlekamp-Massey algorithm and the Games-Chan algorithm. The significant property of the linearization method is that it needs no output sequence from a pseudo-random number generator (PRNG) because it calculates linear complexity using the algebraic expression of its algorithm. When a PRNG has n [bit] stages (registers or internal states), the necessary computational cost is smaller than O(2^n). On the other hand, the Berlekamp-Massey algorithm needs O(N^2), where N (≈ 2^n) denotes the period. Since existing methods calculate from the output sequence, the initial value of the PRNG influences the resultant value of linear complexity; therefore, the linear complexity is generally given as an estimate. The linearization method, in contrast, calculates from the algorithm of the PRNG, so it can determine the lower bound of the linear complexity.
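
    For comparison, the Berlekamp-Massey algorithm mentioned above computes the linear complexity from an observed output sequence. A compact GF(2) version:

```python
def berlekamp_massey(bits):
    # Linear complexity of a binary sequence over GF(2): the length L of
    # the shortest LFSR that generates it (classic Berlekamp-Massey).
    if not bits:
        return 0
    n = len(bits)
    c = [0] * n  # current connection polynomial C(x)
    b = [0] * n  # previous connection polynomial B(x)
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # Discrepancy between the LFSR prediction and the actual bit.
        d = bits[i]
        for j in range(1, L + 1):
            d ^= c[j] & bits[i - j]
        if d:
            t = c[:]
            shift = i - m
            for j in range(n - shift):
                c[j + shift] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L
```

    For example, `berlekamp_massey([0, 1, 0, 1, 0, 1])` returns 2, the order of the shortest recurrence (s_i = s_{i-2}) generating the sequence. As the abstract points out, the result depends on the observed output rather than on the generator's algebraic description.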

  4. Consistent deformations of dual formulations of linearized gravity: A no-go result

    International Nuclear Information System (INIS)

    Bekaert, Xavier; Boulanger, Nicolas; Henneaux, Marc

    2003-01-01

    The consistent, local, smooth deformations of the dual formulation of linearized gravity involving a tensor field in the exotic representation of the Lorentz group with Young symmetry type (D-3,1) (one column of length D-3 and one column of length 1) are systematically investigated. The rigidity of the Abelian gauge algebra is first established. We next prove a no-go theorem for interactions involving at most two derivatives of the fields.

  5. Modelling female fertility traits in beef cattle using linear and non-linear models.

    Science.gov (United States)

    Naya, H; Peñagaricano, F; Urioste, J I

    2017-06-01

    Female fertility traits are key components of the profitability of beef cattle production. However, these traits are difficult and expensive to measure, particularly under extensive pastoral conditions, and consequently fertility records are in general scarce and somewhat incomplete. Moreover, fertility traits are usually dominated by the effects of herd-year environment, and it is generally assumed that relatively small margins are left for genetic improvement. New ways of modelling genetic variation in these traits are needed. Inspired by the methodological developments made by Prof. Daniel Gianola and co-workers, we assayed linear (Gaussian), Poisson, probit (threshold), censored Poisson and censored Gaussian models on three different kinds of endpoints, namely calving success (CS), number of days from first calving (CD) and number of failed oestrus (FE). For models involving FE and CS, non-linear models outperformed their linear counterparts. For models derived from CD, linear versions displayed better adjustment than the non-linear counterparts. Non-linear models showed consistently higher estimates of heritability and repeatability in all cases (h² > 0.23 and r > 0.24 for the non-linear models). While additive and permanent environment effects showed highly favourable correlations between all models (>0.789), consistency in selecting the 10% best sires showed important differences, mainly amongst the considered endpoints (FE, CS and CD). In consequence, the endpoints should be considered as modelling different underlying genetic effects, with linear models more appropriate to describe CD and non-linear models better for FE and CS. © 2017 Blackwell Verlag GmbH.

  6. Linear dose-response of acentric chromosome fragments down to 1 R of x-rays in grasshopper neuroblasts, a potential mutagen-test system

    International Nuclear Information System (INIS)

    Gaulden, M.E.; Read, C.B.

    1978-01-01

    Grasshopper-embryo neuroblasts have no spontaneous chromosome breakage; therefore they permit easy detection of agents that break chromosomes. An X-ray exposure of 1 R induces in them a detectable number of chromosome fragments. The dose-response of acentric fragment frequency fits a linear model between 0 and 128 R. Thus another cell type is added to those previously demonstrated to have no threshold dose for the induction of chromosome or gene mutations.

  7. Hardware, software and strategies for radiation safety awareness

    International Nuclear Information System (INIS)

    Iyer, M.R.

    2016-01-01

    The various components that must be in place for a successful radiation safety awareness programme call for in-depth, multidisciplinary R&D, a need that is not often appreciated. There has indeed been public suspicion about the safety of radiation and nuclear power, and nuclear agencies have covered a lot of ground to remove those suspicions. The reasons for the suspicion are not far to seek. The concepts of LNT (Linear No-Threshold) and the resulting ALARA (As Low As Reasonably Achievable) principle used for radiological protection have been stumbling blocks for public acceptability of nuclear power. We cannot blame the public if people often get confused and are easily exploited by interested parties. The lack of a clear-cut definition of what is safe and what is not needs to be removed from the public mind, and the message needs to be conveyed forcefully. The nomenclature of radiation protection hinges on basic scientific assumptions that have not been proved in the last half-century or so, and this is partly responsible. A formal language and a software-based communication medium that are easily understandable to the public need to be developed

  8. Non-targeted effects of ionising radiation - Implications for radiation protection

    International Nuclear Information System (INIS)

    Sisko Salomaa

    2006-01-01

    The universality of the target theory of radiation-induced effects is challenged by observations on non-targeted effects such as bystander effects, genomic instability and adaptive response. Essential features of non-targeted effects are that they do not require direct nuclear exposure by radiation and that they are particularly significant at low doses. This new evidence suggests a need for a new paradigm in radiation biology, one that covers both the classical (targeted) and the non-targeted effects. New aspects include the role of cellular communication and tissue-level responses. A better understanding of non-targeted effects may have important consequences for health risk assessment and, consequently, for radiation protection. Non-targeted effects may contribute to the estimation of cancer risk from occupational, medical and environmental exposures. In particular, they may have implications for the applicability of the Linear No-Threshold (LNT) model in extrapolating radiation risk data into the low-dose region. This also means that the adequacy of the concept of dose for estimating risk is challenged by these findings. Moreover, these effects may provide new mechanistic explanations for the development of non-cancer diseases. Further research is required to determine whether these effects, typically measured in cell cultures, are applicable at the tissue level, in whole animals, and ultimately in humans. (author)

  9. Non-targeted effects of ionising radiation

    International Nuclear Information System (INIS)

    Belyakov, O.V.

    2008-01-01

    The universality of the target theory of radiation-induced effects is challenged by observations on non-targeted effects such as bystander effects and genomic instability. Essential features of non-targeted effects are that they do not require direct nuclear exposure by radiation and that they are particularly significant at low doses. This new evidence suggests a need for a new paradigm in radiation biology, one that would cover both the classical (targeted) and the non-targeted effects. New aspects include the role of cellular communication and tissue-level responses. A better understanding of non-targeted effects may have important consequences for health risk assessment and, consequently, for radiation protection. Non-targeted effects may contribute to the estimation of cancer risk from occupational, medical and environmental exposures. In particular, they may have implications for the applicability of the Linear No-Threshold (LNT) model in extrapolating radiation risk data into the low-dose region. This also means that the adequacy of the concept of dose for estimating risk is challenged by these findings. Moreover, these effects may provide new mechanistic explanations for the development of non-cancer diseases. Further research is required to determine whether these effects, typically measured in cell cultures, are applicable at the tissue level, in whole animals, and ultimately in humans. (orig.)

  11. Numerical modelling of local deposition patterns, activity distributions and cellular hit probabilities of inhaled radon progenies in human airways

    International Nuclear Information System (INIS)

    Farkas, A.; Balashazy, I.; Szoeke, I.

    2003-01-01

    The general objective of our research is modelling the biophysical processes of the effects of inhaled radon progenies. This effort is related to the rejection or support of the linear no-threshold (LNT) dose-effect hypothesis, which seems to be one of the most challenging tasks of current radiation protection. Our approximation and results may also serve as a useful tool for lung cancer models. In this study, deposition patterns, activity distributions and alpha-hit probabilities of inhaled radon progenies in the large airways of the human tracheobronchial tree are computed. The airflow fields and related particle deposition patterns strongly depend on the shape of the airway geometry and the breathing pattern. Computed deposition patterns of attached and unattached radon progenies are strongly inhomogeneous, creating hot spots at the carinal regions and downstream of the inner sides of the daughter airways. The results suggest that in the vicinity of the carinal regions the multiple-hit probabilities are quite high even at low average doses and increase exponentially in the low-dose range. Thus, even the so-called low doses may present high doses for large clusters of cells. The cell transformation probabilities are much higher in these regions, and this phenomenon cannot be modelled with average burdens. (authors)

  12. Targeted and non-targeted effects of ionizing radiation

    Directory of Open Access Journals (Sweden)

    Omar Desouky

    2015-04-01

    Full Text Available For a long time it was generally accepted that effects of ionizing radiation such as cell death, chromosomal aberrations, DNA damage, mutagenesis, and carcinogenesis result from direct ionization of cell structures, particularly DNA, or from indirect damage through reactive oxygen species produced by radiolysis of water, and these biological effects were attributed to irreparable or misrepaired DNA damage in cells directly hit by radiation. Using the linear non-threshold (LNT) model, possible risks from exposure to low-dose ionizing radiation (below 100 mSv) are estimated by extrapolating from data obtained after exposure to higher doses of radiation. This model has been challenged by numerous observations in which cells that were not directly traversed by the ionizing radiation exhibited responses similar to those of the directly irradiated cells. Therefore, it is nowadays accepted that the detrimental effects of ionizing radiation are not restricted to the irradiated cells but extend to non-irradiated bystander or even distant cells, manifesting various biological effects.
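
    The LNT extrapolation described here is a simple proportionality between dose and excess risk. A minimal sketch, with an illustrative slope of the order of the ICRP nominal risk coefficient (the value is an assumption, not taken from this article):

```python
def lnt_excess_risk(dose_sv, slope_per_sv=0.055):
    # Linear no-threshold sketch: excess risk is proportional to dose,
    # with no dose below which the excess risk is zero. The slope is an
    # illustrative nominal value, not one estimated in the article.
    if dose_sv < 0:
        raise ValueError("dose must be non-negative")
    return slope_per_sv * dose_sv
```

    Under this model, halving the dose exactly halves the estimated excess risk, which is precisely the low-dose assumption that the non-targeted effects discussed above call into question.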

  13. Proposal of radiation exposure index, REXI

    International Nuclear Information System (INIS)

    Katoh, Kazuaki

    2002-01-01

    As a measure of the harmful effect of radiation, a radiation exposure index (REXI) is proposed. It is an integer expression of the logarithmic ratio of a radiation dose to a standard value. REXI is a dimensionless quantity and is free from the requirement of additivity, in contrast with dose. More than a few kinds of doses are used in the field of radiation protection, and among them the effective dose plays the main role, since the main target of radiation control is the so-called stochastic effect and the effective dose is used as the controlling quantity. Effective dose is a radiation dose, namely a quantity on the cause side used to describe the effect, but it cannot be a representation of the effect itself; it is nothing but a measure of the possibility of the effect. In addition, the LNT (linear no-threshold) postulate adopted by the ICRP makes it difficult to understand the foreseen associated effect quantitatively. (author)
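
    A sketch of such an index, assuming a base-10 logarithm and nearest-integer rounding (the abstract specifies neither convention):

```python
import math

def rexi(dose, standard):
    # Radiation exposure index sketch: an integer expression of the
    # logarithmic ratio of a dose to a standard value. The base-10
    # logarithm and nearest-integer rounding are assumptions made for
    # illustration; both arguments must be in the same unit.
    if dose <= 0 or standard <= 0:
        raise ValueError("dose and standard must be positive")
    return round(math.log10(dose / standard))
```

    Because only the ratio of the two doses enters, the index is dimensionless, and unlike dose it is not additive.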

  14. Quantifying the Arousal Threshold Using Polysomnography in Obstructive Sleep Apnea.

    Science.gov (United States)

    Sands, Scott A; Terrill, Philip I; Edwards, Bradley A; Taranto Montemurro, Luigi; Azarbarzin, Ali; Marques, Melania; de Melo, Camila M; Loring, Stephen H; Butler, James P; White, David P; Wellman, Andrew

    2018-01-01

    Precision medicine for obstructive sleep apnea (OSA) requires noninvasive estimates of each patient's pathophysiological "traits." Here, we provide the first automated technique to quantify the respiratory arousal threshold-defined as the level of ventilatory drive triggering arousal from sleep-using diagnostic polysomnographic signals in patients with OSA. Ventilatory drive preceding clinically scored arousals was estimated from polysomnographic studies by fitting a respiratory control model (Terrill et al.) to the pattern of ventilation during spontaneous respiratory events. Conceptually, the magnitude of the airflow signal immediately after arousal onset reveals information on the underlying ventilatory drive that triggered the arousal. Polysomnographic arousal threshold measures were compared with gold standard values taken from esophageal pressure and intraoesophageal diaphragm electromyography recorded simultaneously (N = 29). Comparisons were also made to arousal threshold measures using continuous positive airway pressure (CPAP) dial-downs (N = 28). The validity of using (linearized) nasal pressure rather than pneumotachograph ventilation was also assessed (N = 11). Polysomnographic arousal threshold values were correlated with those measured using esophageal pressure and diaphragm EMG (R = 0.79, p < .0001; R = 0.73, p = .0001), as well as CPAP manipulation (R = 0.73, p < .0001). Arousal threshold estimates were similar using nasal pressure and pneumotachograph ventilation (R = 0.96, p < .0001). The arousal threshold in patients with OSA can be estimated using polysomnographic signals and may enable more personalized therapeutic interventions for patients with a low arousal threshold. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  15. Nonlinearity and thresholds in dose-response relationships for carcinogenicity due to sampling variation, logarithmic dose scaling, or small differences in individual susceptibility

    International Nuclear Information System (INIS)

    Lutz, W.K.; Gaylor, D.W.; Conolly, R.B.; Lutz, R.W.

    2005-01-01

    Nonlinear and threshold-like shapes of dose-response curves are often observed in tests for carcinogenicity. Here, we present three examples where an apparent threshold is spurious and can be misleading for low dose extrapolation and human cancer risk assessment. Case 1: For experiments that are not replicated, such as rodent bioassays for carcinogenicity, random variation can lead to misinterpretation of the result. This situation was simulated by 20 random binomial samplings of 50 animals per group, assuming a true linear dose response from 5% to 25% tumor incidence at arbitrary dose levels 0, 0.5, 1, 2, and 4. Linearity was suggested by only 8 of the 20 simulations. Four simulations did not reveal the carcinogenicity at all. Three exhibited thresholds, two showed a nonmonotonic behavior with a decrease at low dose, followed by a significant increase at high dose ('hormesis'). Case 2: Logarithmic representation of the dose axis transforms a straight line into a sublinear (up-bent) curve, which can be misinterpreted to indicate a threshold. This is most pronounced if the dose scale includes a wide low dose range. Linear regression of net tumor incidences and intersection with the dose axis results in an apparent threshold, even with an underlying true linear dose-incidence relationship. Case 3: Nonlinear shapes of dose-cancer incidence curves are rarely seen with epidemiological data in humans. The discrepancy with data in rodents may in part be explained by a wider span of individual susceptibilities for tumor induction in humans due to more diverse genetic background and modulation by co-carcinogenic lifestyle factors. Linear extrapolation of a human cancer risk could therefore be appropriate even if animal bioassays show nonlinearity.
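
    Case 1 is straightforward to reproduce. A sketch of one simulated bioassay under a true linear dose-response (incidence rising from 5% at dose 0 to 25% at the top dose), with dose levels and group size as in the text:

```python
import random

def simulate_bioassay(seed, n_animals=50, doses=(0, 0.5, 1, 2, 4),
                      p_low=0.05, p_high=0.25):
    # True tumor incidence rises linearly with dose from p_low to p_high;
    # each dose group is an independent binomial sample of n_animals.
    rng = random.Random(seed)
    d_max = max(doses)
    groups = []
    for d in doses:
        p = p_low + (p_high - p_low) * d / d_max
        tumors = sum(rng.random() < p for _ in range(n_animals))
        groups.append((d, tumors))
    return groups
```

    Running this for 20 different seeds and inspecting the incidence curves shows how often apparent thresholds, or even non-monotonic "hormesis-like" patterns, arise from sampling variation alone.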

  16. The influence of gender and bruxism on human minimum interdental threshold ability

    Directory of Open Access Journals (Sweden)

    Patrícia dos Santos Calderon

    2009-06-01

    Full Text Available OBJECTIVE: To evaluate the influence of gender and bruxism on the ability to discriminate a minimum interdental threshold. MATERIAL AND METHODS: One hundred and fifteen individuals of both genders, bruxers and non-bruxers, with a mean age of 23.64 years, were selected for this study. For group allocation, every individual was subjected to a specific physical examination to detect bruxism (performed by three different examiners). The ability to discriminate a minimum interdental threshold was evaluated using industrialized 0.010 mm-, 0.024 mm-, 0.030 mm-, 0.050 mm-, 0.080 mm- and 0.094 mm-thick aluminum foils placed between the upper and lower premolars. Data were analyzed statistically by multiple linear regression analysis at a 5% significance level. RESULTS: Neither gender nor bruxism influenced the ability to discriminate the minimum interdental threshold (p > 0.05). CONCLUSIONS: Gender and the presence of bruxism do not play a role in the minimum interdental threshold.

  17. Automatic Threshold Setting and Its Uncertainty Quantification in Wind Turbine Condition Monitoring System

    DEFF Research Database (Denmark)

    Marhadi, Kun Saptohartyadi; Skrimpas, Georgios Alexandros

    2015-01-01

    Setting optimal alarm thresholds in a vibration-based condition monitoring system is inherently difficult. There are no established thresholds for many vibration-based measurements. Most of the time, the thresholds are set based on statistics of the collected data available, and often the underlying probability distribution that describes the data is not known. Choosing an incorrect distribution to describe the data and then setting up thresholds based on the chosen distribution can result in sub-optimal thresholds. Moreover, in wind turbine applications the collected data available may not represent the whole range of operating conditions of a turbine, which results in uncertainty in the parameters of the fitted probability distribution and in the thresholds calculated. In this study, Johnson, Normal, and Weibull distributions are investigated to determine which distribution best fits the collected vibration data.
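
    A minimal sketch of the data-driven thresholding described above, assuming a Normal fit with the alarm threshold placed at a chosen quantile of the fitted distribution (the study also examines Johnson and Weibull fits):

```python
from statistics import NormalDist

def alarm_threshold(vibration_samples, quantile=0.99):
    # Fit a Normal distribution to the collected condition-monitoring data
    # and set the alarm threshold at the given quantile of the fit. If the
    # true distribution is not Normal, the resulting threshold can be
    # sub-optimal, which is the uncertainty the study quantifies.
    fitted = NormalDist.from_samples(vibration_samples)
    return fitted.inv_cdf(quantile)
```

    The quantile is a design choice trading false alarms against missed faults; the fitted parameters carry sampling uncertainty when the data cover only part of the turbine's operating conditions.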

  18. Near-threshold fatigue crack growth behavior of AISI 316 stainless steel

    International Nuclear Information System (INIS)

    Tobler, R.L.

    1986-01-01

    The near-threshold fatigue behavior of an AISI 316 alloy was characterized using a newly developed, fully automatic fatigue test apparatus. Significant differences in the near-threshold behavior at temperatures of 295 and 4 K are observed. At 295 K, where the operationally defined threshold at 10^-10 m/cycle is insensitive to stress ratio and strongly affected by crack closure, the effective threshold stress intensity factor (ΔK_Th)_eff is about 4.65 MPa·m^1/2 at R = 0.3. At 4 K, the threshold is higher, crack closure is less pronounced, and there is a stress-ratio dependency: (ΔK_Th)_eff is 5.1 MPa·m^1/2 at R = 0.3 and 6.1 MPa·m^1/2 at R = 0.1. There is also a significant difference in the form of the da/dN-versus-ΔK curves on log-log coordinates: at 4 K the curve has the expected sigmoidal shape, but at 295 K the trend is linear over the region of da/dN from 10^-7 to 10^-10 m/cycle. Other results suggest that the near-threshold measurements of a 6.4-mm-thick specimen of this alloy are insensitive to cyclic test frequency below 40 Hz

  19. Near-Threshold Ionization of Argon by Positron Impact

    Science.gov (United States)

    Babij, T. J.; Machacek, J. R.; Murtagh, D. J.; Buckman, S. J.; Sullivan, J. P.

    2018-03-01

    The direct single-ionization cross section for Ar by positron impact has been measured in the region above the first ionization threshold. These measurements are compared to semiclassical calculations which give rise to a power law variation of the cross section in the threshold region. The experimental results appear to be in disagreement with extensions to the Wannier theory applied to positron impact ionization, with a smaller exponent than that calculated by most previous works. In fact, in this work, we see no difference in threshold behavior between the positron and electron cases. Possible reasons for this discrepancy are discussed.

  20. Measurements of NN → dπ near threshold

    International Nuclear Information System (INIS)

    Hutcheon, D.A.

    1990-09-01

    New, precise measurements of the differential cross sections for np → dπ⁰ and π⁺d → pp and of analyzing powers for pp → dπ⁺ have been made at energies within 10 MeV (c.m.) of threshold. They allow the pion s-wave and p-wave parts of the production strength to be distinguished unambiguously, yielding an s-wave strength at threshold which is significantly smaller than the previously accepted value. There is no evidence for charge-independence breaking nor for πNN resonances near threshold. (Author) (17 refs., 17 figs., tab.)

  1. The top-antitop threshold at the ILC. NNLL QCD uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, Andre H. [Vienna Univ. (Austria). Faculty of Physics; Stahlhofen, Maximilian [DESY Hamburg (Germany). Theory Group

    2013-09-15

    We study combined variations of the renormalization and matching scales in order to reliably estimate the perturbative error of the latest NNLL vNRQCD prediction for top-antitop threshold production at a future linear collider. We present results for the total cross section and for the experimentally more relevant case, when moderate cuts are imposed on the reconstructed top and antitop invariant masses.

  3. Genomic analysis of cow mortality and milk production using a threshold-linear model.

    Science.gov (United States)

    Tsuruta, S; Lourenco, D A L; Misztal, I; Lawlor, T J

    2017-09-01

    The objective of this study was to investigate the feasibility of genomic evaluation for cow mortality and milk production using a single-step methodology. Genomic relationships between cow mortality and milk production were also analyzed. Data included 883,887 (866,700) first-parity, 733,904 (711,211) second-parity, and 516,256 (492,026) third-parity records on cow mortality (305-d milk yields) of Holsteins from Northeast states in the United States. The pedigree consisted of up to 1,690,481 animals, including 34,481 bulls genotyped with 36,951 SNP markers. Analyses were conducted with a bivariate threshold-linear model for each parity separately. Genomic information was incorporated as a genomic relationship matrix in the single-step BLUP. Traditional and genomic estimated breeding values (GEBV) were obtained with Gibbs sampling using fixed variances, whereas reliabilities were calculated from variances of GEBV samples. Genomic EBV were then converted into single nucleotide polymorphism (SNP) marker effects. Those SNP effects were categorized according to values corresponding to 1 to 4 standard deviations. Moving averages and variances of SNP effects were calculated for windows of 30 adjacent SNP, and Manhattan plots were created for SNP variances with the same window size. Using Gibbs sampling, the reliability for genotyped bulls for cow mortality was 28 to 30% in EBV and 70 to 72% in GEBV. The reliability for genotyped bulls for 305-d milk yields was 53 to 65% in EBV and 81 to 85% in GEBV. Correlations of SNP effects between mortality and 305-d milk yields within categories were highest for the largest SNP effects and reached >0.7 at 4 standard deviations. All SNP regions explained less than 0.6% of the genetic variance for both traits, except regions close to the DGAT1 gene, which explained up to 2.5% for cow mortality and 4% for 305-d milk yields. Reliability for GEBV with a moderate number of genotyped animals can be calculated from Gibbs samples.

  4. Hydrometeorological threshold conditions for debris flow initiation in Norway

    Directory of Open Access Journals (Sweden)

    N. K. Meyer

    2012-10-01

    Full Text Available Debris flows, triggered by extreme precipitation events and rapid snow melt, cause considerable damage to the Norwegian infrastructure every year. To define intensity-duration (ID) thresholds for debris flow initiation, critical water supply conditions arising from intensive rainfall or snow melt were assessed on the basis of daily hydro-meteorological information for 502 documented debris flow events. Two threshold types were computed: one based on absolute ID relationships and one using ID relationships normalized by the local precipitation day normal (PDN). For each threshold type, minimum, medium and maximum threshold values were defined by fitting power-law curves along the 10th, 50th and 90th percentiles of the data population. Depending on the duration of the event, the absolute threshold intensities needed for debris flow initiation vary between 15 and 107 mm day⁻¹. Since the PDN changes locally, the normalized thresholds show spatial variations. Depending on location, duration and threshold level, the normalized threshold intensities vary between 6 and 250 mm day⁻¹. The thresholds obtained were used for a frequency analysis of over-threshold events, giving an estimation of the exceedance probability and thus the potential for debris flow events in different parts of Norway. The absolute thresholds are most often exceeded along the west coast, while the normalized thresholds are most frequently exceeded on the west-facing slopes of the Norwegian mountain ranges. The minimum thresholds derived in this study are in the range of other thresholds obtained for regions with a climate comparable to that of Norway. Statistics reveal that the normalized threshold is more reliable than the absolute threshold, as the former shows no spatial clustering of debris flows related to water supply events captured by the threshold.
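
    Such intensity-duration thresholds are power laws of the form I = a * D^b. A sketch with illustrative coefficients (the values fitted along the 10th, 50th and 90th percentiles in the study are not reproduced here):

```python
def id_threshold(duration_days, a=15.0, b=-0.5):
    # Power-law intensity-duration threshold I = a * D**b, in mm/day.
    # The coefficients a and b are illustrative placeholders, not the
    # values fitted to the Norwegian debris flow inventory.
    return a * duration_days ** b

def over_threshold(intensity_mm_day, duration_days, a=15.0, b=-0.5):
    # A water supply event exceeds the threshold when its mean intensity
    # is at least the threshold intensity for its duration.
    return intensity_mm_day >= id_threshold(duration_days, a, b)
```

    The negative exponent captures the usual pattern that long-duration events can trigger debris flows at lower mean intensities than short bursts.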

  5. Radio-over-fiber linearization with optimized genetic algorithm CPWL model.

    Science.gov (United States)

    Mateo, Carlos; Carro, Pedro L; García-Dúcar, Paloma; De Mingo, Jesús; Salinas, Íñigo

    2017-02-20

    This article proposes an optimized version of a canonical piece-wise-linear (CPWL) digital predistorter in order to enhance the linearity of a radio-over-fiber (RoF) LTE mobile fronthaul. In this work, we propose a threshold allocation optimization process carried out by a genetic algorithm (GA) in order to optimize the CPWL model (GA-CPWL). Firstly, experiments show how the CPWL model outperforms the classical memory polynomial DPD in an intensity modulation/direct detection (IM/DD) RoF link. Then, the GA-CPWL predistorter is compared with the CPWL model in several scenarios, in order to verify that the proposed DPD offers better performance in different optical transmission conditions. Experimental results reveal that with a proper threshold allocation, the GA-CPWL predistorter offers very promising outcomes.
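
    The canonical piecewise-linear form that underlies such a predistorter is y = a0 + a1*x + sum_k c_k * |x - b_k|, where the breakpoints b_k are the thresholds the genetic algorithm allocates. A memoryless sketch (the published model also handles memory effects):

```python
def cpwl(x, a0, a1, coeffs, thresholds):
    # Canonical piecewise-linear (CPWL) function:
    #     y = a0 + a1 * x + sum_k c_k * |x - b_k|
    # The breakpoints b_k are the free parameters a genetic algorithm
    # would place before the linear coefficients are fitted.
    y = a0 + a1 * x
    for c, b in zip(coeffs, thresholds):
        y += c * abs(x - b)
    return y
```

    Each |x - b_k| term adds one kink to the curve, so optimizing where the kinks sit, rather than spacing them uniformly, is what the GA contributes.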

  6. Laser-damage thresholds of thin-film optical coatings at 248 nm

    International Nuclear Information System (INIS)

    Milam, D.; Rainer, F.; Lowdermilk, W.H.

    1981-01-01

    We have measured the laser-induced damage thresholds at 248 nm wavelength of over 100 optical coatings from commercial vendors and research institutions. All samples were irradiated once per damage site with temporally multi-lobed, 20-ns pulses generated by a KrF laser. The survey included high, partial, and dichroic reflectors, anti-reflective coatings, and single-layer films. The samples were supplied by ten vendors. The majority of samples tested were high reflectors and anti-reflective coatings; the highest damage thresholds for these were 8.5 and 9.4 J/cm², respectively. Although these represent the extremes of what has been tested so far, several vendors have produced coatings of both types with thresholds which consistently exceed 6 J/cm². Repeated irradiations of some sites were made on a few samples. These yielded no degradation in threshold but, in fact, some improvement in damage resistance. These same samples also exhibited no change in threshold after being retested seven months later

  7. Alternative method for determining anaerobic threshold in rowers

    Directory of Open Access Journals (Sweden)

    Giovani dos Santos Cunha

    2008-12-01

    Full Text Available In rowing, the standard breathing that athletes are trained to use makes it difficult, or even impossible, to detect ventilatory limits, due to the coupling of the breath with the technical movement. For this reason, some authors have proposed determining the anaerobic threshold from the respiratory exchange ratio (RER), but there is not yet consensus on what value of RER should be used. The objective of this study was to test what value of RER corresponds to the anaerobic threshold and whether this value can be used as an independent parameter for determining the anaerobic threshold of rowers. The sample comprised 23 male rowers. They were submitted to a maximal cardiorespiratory test on a rowing ergometer with concurrent ergospirometry in order to determine VO2máx and the physiological variables corresponding to their anaerobic threshold. The anaerobic threshold was determined using the Dmax (maximal distance) method. The physiological variables were classified into maximum values and anaerobic threshold values. At maximal state these rowers reached VO2 (58.2±4.4 ml.kg-1.min-1), lactate (8.2±2.1 mmol.L-1), power (384±54.3 W) and RER (1.26±0.1). At the anaerobic threshold they reached VO2 (46.9±7.5 ml.kg-1.min-1), lactate (4.6±1.3 mmol.L-1), power (300±37.8 W) and RER (0.99±0.1). Conclusions: the RER can be used as an independent method for determining the anaerobic threshold of rowers, adopting a value of 0.99; however, RER should exhibit a non-linear increase above this figure.
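    The Dmax method mentioned above has a simple geometric definition: fit a curve to the lactate (or RER) response, then take the point on that curve farthest from the straight line joining its endpoints. A minimal sketch, assuming a polynomial fit (the degree and grid resolution are arbitrary implementation choices, not taken from the study):

```python
import numpy as np

def dmax_threshold(intensity, lactate, degree=3):
    """Estimate the anaerobic threshold by the Dmax method: the point on
    the fitted lactate curve with maximal perpendicular distance from the
    chord connecting the curve's endpoints."""
    coeffs = np.polyfit(intensity, lactate, degree)
    xs = np.linspace(intensity[0], intensity[-1], 1000)
    ys = np.polyval(coeffs, xs)
    p0 = np.array([xs[0], ys[0]])    # first point of fitted curve
    p1 = np.array([xs[-1], ys[-1]])  # last point of fitted curve
    d = p1 - p0
    # perpendicular distance of each curve point from the chord
    dist = np.abs(d[0] * (p0[1] - ys) - d[1] * (p0[0] - xs)) / np.hypot(*d)
    i = np.argmax(dist)
    return xs[i], ys[i]
```

    For a convex lactate curve this picks out the intensity at which the curve bends away from the chord most sharply, which is the usual interpretation of the threshold.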

  8. CARA Risk Assessment Thresholds

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).
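    The tiered thresholds above amount to mapping a collision probability Pc onto an action. A minimal sketch of that mapping follows; the numeric levels are placeholders for illustration only (the abstract gives no values, and actual CARA levels are mission-dependent):

```python
def cara_action(pc, red=1e-4, yellow=1e-7):
    """Classify a conjunction event by collision probability Pc.

    Threshold values are hypothetical illustrations, not official
    CARA settings.
    """
    if pc >= red:
        return "remediate"  # warning (Red) threshold: consider/execute maneuver
    if pc >= yellow:
        return "analyze"    # analysis (Green-to-Yellow) threshold: gather data
    return "monitor"        # below both thresholds: routine monitoring
```

    The post-remediation and maneuver-screening thresholds described above would be additional, stricter levels applied when sizing a maneuver or screening a planned one.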

  9. On the Appearance of Thresholds in the Dynamical Model of Star Formation

    Science.gov (United States)

    Elmegreen, Bruce G.

    2018-02-01

    The Kennicutt–Schmidt (KS) relationship between the surface density of the star formation rate (SFR) and the gas surface density has three distinct power laws that may result from one model in which gas collapses at a fixed fraction of the dynamical rate. The power-law slope is 1 when the observed gas has a characteristic density for detection, 1.5 for total gas when the thickness is about constant as in the main disks of galaxies, and 2 for total gas when the thickness is regulated by self-gravity and the velocity dispersion is about constant, as in the outer parts of spirals, dwarf irregulars, and giant molecular clouds. The observed scaling of the star formation efficiency (SFR per unit CO) with the dense gas fraction (HCN/CO) is derived from the KS relationship when one tracer (HCN) is on the linear part and the other (CO) is on the 1.5 part. Observations of a threshold density or column density with a constant SFR per unit gas mass above the threshold are proposed to be selection effects, as are observations of star formation in only the dense parts of clouds. The model allows a derivation of all three KS relations using the probability distribution function of density with no thresholds for star formation. Failed galaxies and systems with sub-KS SFRs are predicted to have gas that is dominated by an equilibrium warm phase where the thermal Jeans length exceeds the Toomre length. A squared relation is predicted for molecular gas-dominated young galaxies.

  10. Acute effects of dynamic exercises on the relationship between the motor unit firing rate and the recruitment threshold.

    Science.gov (United States)

    Ye, Xin; Beck, Travis W; DeFreitas, Jason M; Wages, Nathan P

    2015-04-01

    The aim of this study was to compare the acute effects of concentric versus eccentric exercise on motor control strategies. Fifteen men performed six sets of 10 repetitions of maximal concentric or eccentric isokinetic exercise with their dominant elbow flexors on separate experimental visits. Before and after the exercise, maximal strength testing and submaximal trapezoid isometric contractions (40% of the maximal force) were performed. Both exercise conditions caused significant strength loss in the elbow flexors, but the loss was greater following the eccentric exercise (t=2.401, P=.031). The surface electromyographic signals obtained from the submaximal trapezoid isometric contractions were decomposed into individual motor unit action potential trains. For each submaximal trapezoid isometric contraction, the relationship between the average motor unit firing rate and the recruitment threshold was examined using linear regression analysis. In contrast to the concentric exercise, which did not cause significant changes in the mean linear slope coefficient and y-intercept of the linear regression line, the eccentric exercise resulted in a lower mean linear slope and an increased mean y-intercept, indicating that increasing the firing rates of low-threshold motor units may be more important than recruiting high-threshold motor units to compensate for eccentric exercise-induced strength loss. Copyright © 2014 Elsevier B.V. All rights reserved.
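    The analysis described above reduces to an ordinary least-squares line fit of mean firing rate on recruitment threshold, done once per contraction; a minimal sketch (variable names are assumptions, not the authors'):

```python
import numpy as np

def firing_rate_regression(recruitment_thresholds, mean_firing_rates):
    """Linear regression of mean motor-unit firing rate on recruitment
    threshold; returns (slope, y-intercept).

    In the study summarized above, eccentric exercise lowered the slope
    and raised the intercept of this line."""
    slope, intercept = np.polyfit(recruitment_thresholds, mean_firing_rates, 1)
    return slope, intercept
```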

  11. Do multiple body modifications alter pain threshold?

    Science.gov (United States)

    Yamamotová, A; Hrabák, P; Hříbek, P; Rokyta, R

    2017-12-30

    In recent years, epidemiological data has shown an increasing number of young people who deliberately self-injure. There have also been parallel increases in the number of people with tattoos and those who voluntarily undergo painful procedures associated with piercing, scarification, and tattooing. People with self-injury behaviors often say that they do not feel the pain. However, there is no information regarding pain perception in those who visit tattoo parlors and piercing studios compared to those who do not. The aim of this study was to compare nociceptive sensitivity in four groups of subjects (n=105, mean age 26 years, 48 women and 57 men) with different motivations to experience pain (i.e., with and without multiple body modifications) in two different situations: (1) in controlled, emotionally neutral conditions, and (2) at a "Hell Party" (HP), an event organized by a piercing and tattoo parlor, with a main event featuring a public demonstration of painful techniques (burn scars, hanging on hooks, etc.). Pain thresholds of the fingers of the hand were measured using a thermal stimulator and mechanical algometer. In HP participants, information about alcohol intake, self-harming behavior, and psychiatric history were used in the analysis as intervening variables. Individuals both with and without body modifications had higher thermal pain thresholds at the Hell Party, compared to thresholds measured at control neutral conditions. No such differences were found relative to mechanical pain thresholds. The increased pain threshold in all HP participants, irrespective of body modification, cannot be simply explained by a decrease in the sensory component of pain; instead, we found that the environment significantly influenced the cognitive and affective component of pain.

  12. SOA thresholds for the perception of discrete/continuous tactile stimulation

    DEFF Research Database (Denmark)

    Eid, Mohamad; Korres, Georgios; Jensen, Camilla Birgitte Falk

    In this paper we present an experiment to measure the upper and lower thresholds of the Stimulus Onset Asynchrony (SOA) for continuous/discrete apparent haptic motion. We focus on three stimulation parameters: the burst duration, the SOA time, and the inter-actuator distance (between successive......-discrete boundary at lower SOA. Furthermore, the larger the inter-actuator distance, the more linear the relationship between the burst duration and the SOA timing. Finally, the large range between lower and upper thresholds for SOA can be utilized to create continuous movement stimulation on the skin at “varying...... speeds”. The results are discussed in reference to designing a tactile interface for providing continuous haptic motion with a desired speed of continuous tactile stimulation....

  13. Perspective: Uses and misuses of thresholds in diagnostic decision making.

    Science.gov (United States)

    Warner, Jeremy L; Najarian, Robert M; Tierney, Lawrence M

    2010-03-01

    The concept of thresholds plays a vital role in decisions involving the initiation, continuation, and completion of diagnostic testing. Much research has focused on the development of explicit thresholds, in the form of practice guidelines and decision analyses. However, these tools are used infrequently; most medical decisions are made at the bedside, using implicit thresholds. Study of these thresholds can lead to a deeper understanding of clinical decision making. The authors examine some factors constituting individual clinicians' implicit thresholds. They propose a model for static thresholds using the concept of situational gravity to explain why some thresholds are high, and some low. Next, they consider the hypothetical effects of incorrect placement of thresholds (miscalibration) and changes to thresholds during diagnosis (manipulation). They demonstrate these concepts using common clinical scenarios. Through analysis of miscalibration of thresholds, the authors demonstrate some common maladaptive clinical behaviors, which are nevertheless internally consistent. They then explain how manipulation of thresholds gives rise to common cognitive heuristics including premature closure and anchoring. They also discuss the case where no threshold has been exceeded despite exhaustive collection of data, which commonly leads to application of the availability or representativeness heuristics. Awareness of implicit thresholds allows for a more effective understanding of the processes of medical decision making and, possibly, to the avoidance of detrimental heuristics and their associated medical errors. Research toward accurately defining these thresholds for individual physicians and toward determining their dynamic properties during the diagnostic process may yield valuable insights.

  14. A study on the temperature dependence of the threshold switching characteristics of Ge2Sb2Te5

    Science.gov (United States)

    Lee, Suyoun; Jeong, Doo Seok; Jeong, Jeung-hyun; Zhe, Wu; Park, Young-Wook; Ahn, Hyung-Woo; Cheong, Byung-ki

    2010-01-01

    We investigated the temperature dependence of the threshold switching characteristics of a memory-type chalcogenide material, Ge2Sb2Te5. We found that the threshold voltage (Vth) decreased linearly with temperature, implying the existence of a critical conductivity of Ge2Sb2Te5 for its threshold switching. In addition, we investigated the effect of bias voltage and temperature on the delay time (tdel) of the threshold switching of Ge2Sb2Te5 and described the measured relationship by an analytic expression which we derived based on a physical model where thermally activated hopping is a dominant transport mechanism in the material.

  15. Model for fitting longitudinal traits subject to threshold response applied to genetic evaluation for heat tolerance

    Directory of Open Access Journals (Sweden)

    Misztal Ignacy

    2009-01-01

    Full Text Available Abstract A semi-parametric non-linear longitudinal hierarchical model is presented. The model assumes that individual variation exists both in the degree of the linear change of performance (slope) beyond a particular threshold of the independent variable scale and in the magnitude of the threshold itself; these individual variations are attributed to genetic and environmental components. During implementation via a Bayesian MCMC approach, threshold levels were sampled using a Metropolis step because their fully conditional posterior distributions do not have a closed form. The model was tested by simulation following designs similar to previous studies on genetics of heat stress. Posterior means of parameters of interest, under all simulation scenarios, were close to their true values with the latter always being included in the uncertain regions, indicating an absence of bias. The proposed models provide flexible tools for studying genotype by environmental interaction as well as for fitting other longitudinal traits subject to abrupt changes in the performance at particular points on the independent variable scale.
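    A Metropolis step like the one described above (used because the threshold's full conditional has no closed form) can be sketched generically. The random-walk proposal, step size, and toy log-posterior below are illustrative assumptions, not the authors' actual sampler:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_step(tau, log_post, step_sd=0.5):
    """One random-walk Metropolis update for a threshold parameter tau
    whose full conditional log_post has no closed form."""
    prop = tau + rng.normal(0.0, step_sd)
    log_ratio = log_post(prop) - log_post(tau)
    if np.log(rng.uniform()) < log_ratio:
        return prop, True   # accept the proposed threshold
    return tau, False       # keep the current threshold
```

    Embedded in a Gibbs sampler, this step would be applied to each animal's threshold in turn, with `log_post` formed from the likelihood of that animal's records plus its genetic and environmental prior terms.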

  16. Determining lower threshold concentrations for synergistic effects

    DEFF Research Database (Denmark)

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas

    2017-01-01

    which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test-setups to evaluate which approach gives the most conservative estimate for the lower threshold for synergy for three known azole synergists. We focus...... on synergistic interactions between the pyrethroid insecticide, alpha-cypermethrin, and one of the three azole fungicides prochloraz, propiconazole or epoxiconazole measured on Daphnia magna immobilization. Three different experimental setups were applied: A standard 48h acute toxicity test, an adapted 48h test...... of immobile organisms increased more than two-fold above what was predicted by independent action (vertical assessment). All three tests confirmed the hypothesis of the existence of a lower azole threshold concentration below which no synergistic interaction was observed. The lower threshold concentration...

  17. Reading for Integration, Identifying Complementary Threshold Concepts: The ACRL Framework in Conversation with Naming What We Know: Threshold Concepts of Writing Studies

    Directory of Open Access Journals (Sweden)

    Brittney Johnson

    2016-12-01

    Full Text Available In 2015, threshold concepts formed the foundation of two disciplinary documents: the ACRL Framework for Information Literacy (2015 and Naming What We Know: Threshold Concepts of Writing Studies (2015. While there is no consensus in the fields about the value of threshold concepts in teaching, reading the six Frames in the ACRL document alongside the threshold concepts of writing studies illuminates overlapping elements that may empower faculty in both fields to advocate collectively against skills-focused writing and research instruction through cross-disciplinary integrations. To facilitate cross-disciplinary conversations around the documents, the authors propose an order for reading the Frames, identify the associated writing concepts, and explain how the shared concepts reveal an internal complexity which may have implications for teaching the ACRL Framework.

  18. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  19. Non-linear, connectivity and threshold-dominated runoff-generation controls DOC and heavy metal export in a small peat catchment

    Science.gov (United States)

    Birkel, Christian; Broder, Tanja; Biester, Harald

    2017-04-01

    Peat soils act as important carbon sinks, but they also release large amounts of dissolved organic carbon (DOC) to the aquatic system. The DOC export is strongly tied to the export of soluble heavy metals. The accumulation of potentially toxic substances due to anthropogenic activities, and their natural export from peat soils to the aquatic system is an important health and environmental issue. However, limited knowledge exists as to how much of these substances are mobilized, how they are mobilized in terms of flow pathways and under which hydrometeorological conditions. In this study, we report from a combined experimental and modelling effort to provide greater process understanding from a small, lead (Pb) and arsenic (As) contaminated upland peat catchment in northwestern Germany. We developed a minimally parameterized, but process-based, coupled hydrology-biogeochemistry model applied to simulate detailed hydrometric and biogeochemical data. The model was based on an initial data mining analysis, in combination with regression relationships of discharge, DOC and element export. We assessed the internal model DOC-processing based on stream-DOC hysteresis patterns and 3-hourly time step groundwater level and soil DOC data (not used for calibration as an independent model test) for two consecutive summer periods in 2013 and 2014. We found that Pb and As mobilization can be efficiently predicted from DOC transport alone, but Pb showed a significant non-linear relationship with DOC, while As was linearly related to DOC. The relatively parsimonious model (nine calibrated parameters in total) showed the importance of non-linear and rapid near-surface runoff-generation mechanisms that caused around 60% of simulated DOC load. The total load was high even though these pathways were only activated during storm events on average 30% of the monitoring time - as also shown by the experimental data. Overall, the drier period 2013 resulted in increased nonlinearity, but

  20. Particles near threshold

    International Nuclear Information System (INIS)

    Bhattacharya, T.; Willenbrock, S.

    1993-01-01

    We propose returning to the definition of the width of a particle in terms of the pole in the particle's propagator. Away from thresholds, this definition of width is equivalent to the standard perturbative definition, up to next-to-leading order; however, near a threshold, the two definitions differ significantly. The width as defined by the pole position provides more information in the threshold region than the standard perturbative definition and, in contrast with the perturbative definition, does not vanish when a two-particle s-wave threshold is approached from below

  1. Primordial black holes in linear and non-linear regimes

    Energy Technology Data Exchange (ETDEWEB)

    Allahyari, Alireza; Abolhasani, Ali Akbar [Department of Physics, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Firouzjaee, Javad T., E-mail: allahyari@physics.sharif.edu, E-mail: j.taghizadeh.f@ipm.ir [School of Astronomy, Institute for Research in Fundamental Sciences (IPM), P.O. Box 19395-5531, Tehran (Iran, Islamic Republic of)

    2017-06-01

    We revisit the formation of primordial black holes (PBHs) in the radiation-dominated era for both linear and non-linear regimes, elaborating on the concept of an apparent horizon. Contrary to the expectation from vacuum models, we argue that in a cosmological setting a density fluctuation with a high density does not always collapse to a black hole. To this end, we first elaborate on the perturbation theory for spherically symmetric spacetimes in the linear regime. Thereby, we introduce two gauges. This allows us to introduce a well-defined gauge-invariant quantity for the expansion of null geodesics. Using this quantity, we argue that PBHs do not form in the linear regime irrespective of the density of the background. Finally, we consider the formation of PBHs in non-linear regimes, adopting the spherical collapse picture. In this picture, over-densities are modeled by closed FRW models in the radiation-dominated era. The difference in our approach is that we start by finding an exact solution for a closed radiation-dominated universe. This yields exact results for the turn-around time and radius. It is important that we take the initial conditions from the linear perturbation theory. Additionally, instead of using the uniform Hubble gauge condition, both density and velocity perturbations are admitted in this approach. Thereby, the matching condition imposes an important constraint on the initial velocity perturbations: δ^h_0 = −δ_0/2. This can be extended to higher orders. Using this constraint, we find that the apparent horizon of a PBH forms when δ > 3 at the turn-around time. Corrections also appear from the third order. Moreover, a PBH forms when its apparent horizon is outside the sound horizon at the re-entry time. Applying this condition, we infer that the threshold value of the density perturbations at horizon re-entry should be larger than δ_th > 0.7.

  2. A summary of the lateral cutoff analysis and results from NASA's Farfield Investigation of No-boom Thresholds

    Science.gov (United States)

    Cliatt, Larry J.; Hill, Michael A.; Haering, Edward A.; Arnac, Sarah R.

    2015-10-01

    In support of the ongoing effort by the National Aeronautics and Space Administration (NASA) to bring supersonic commercial travel to the public, NASA, in partnership with other industry organizations, conducted a flight research experiment to analyze acoustic propagation at the lateral edge of the sonic boom carpet. The name of the effort was the Farfield Investigation of No-boom Thresholds (FaINT). The research from FaINT determined an appropriate metric for sonic boom waveforms in the transition and shadow zones called Perceived Sound Exposure Level, established a value of 65 dB as a limit for the acoustic lateral extent of a sonic boom's noise region, analyzed change in sonic boom levels near lateral cutoff, and compared between real sonic boom measurements and numerical predictions.

  3. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold.
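    A brute-force baseline for the problem described above (the iterative method itself is not detailed here) is direct Monte Carlo: simulate the white-noise-driven oscillator many times and count the paths whose displacement crosses the critical threshold. The parameter values and Euler-Maruyama discretization below are arbitrary illustrations, not the paper's method:

```python
import numpy as np

def failure_probability(omega=2 * np.pi, zeta=0.05, sigma=1.0,
                        barrier=0.5, T=5.0, dt=1e-3, n_paths=2000, seed=1):
    """Direct Monte Carlo estimate of the probability that the displacement
    of a linear oscillator driven by white noise exceeds a critical
    threshold within [0, T].

    Oscillator: x'' + 2*zeta*omega*x' + omega^2 * x = sigma * w(t).
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    x = np.zeros(n_paths)
    v = np.zeros(n_paths)
    failed = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        a = -2 * zeta * omega * v - omega ** 2 * x
        x = x + v * dt                 # Euler-Maruyama position update
        v = v + a * dt + sigma * dW    # velocity update with noise increment
        failed |= np.abs(x) > barrier  # record first-passage events
    return failed.mean()
```

    For the small failure probabilities that motivate iterative or importance-sampling schemes, this direct estimator needs very many paths, which is precisely why more efficient methods are of interest.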

  4. Studies on the post-ictal rise in seizure threshold. [Rats

    Energy Technology Data Exchange (ETDEWEB)

    Nutt, D.J.; Cowen, P.J.; Green, A.R.

    1981-05-08

    Seizure thresholds were determined by timed infusion of a convulsant drug. Following an electroconvulsive shock (ECS), rats exhibited a raised seizure threshold to infusion of the GABA antagonist drugs pentylenetetrazol, bicuculline and isopropyl-bicyclophosphate, but not to the glycine antagonist strychnine or the 5-HT agonist quipazine. The increase in threshold was seen following a bicuculline-induced seizure and 30 min following the last of a course of ECS given once daily for 10 days. The rise in seizure threshold still occurred when animals were pretreated with alpha-methyl-p-tyrosine (200 mg . kg-1), p-chlorophenylalanine (200 mg . kg-1), naloxone (1 mg . kg-1) or indomethacin (20 mg . kg-1). Diazepam (2 mg . kg-1), flurazepam (10 mg . kg-1) and sodium valproate (400 mg . kg-1) elevated basal seizure threshold and a further rise followed the ECS. Phenytoin (40 mg . kg-1) and carbamazepine (40 mg . kg-1) had no effect on basal seizure threshold or the ECS-induced rise.

  5. Application of pentacene thin-film transistors with controlled threshold voltages to enhancement/depletion inverters

    Science.gov (United States)

    Takahashi, Hajime; Hanafusa, Yuki; Kimura, Yoshinari; Kitamura, Masatoshi

    2018-03-01

    Oxygen plasma treatment has been carried out to control the threshold voltage in organic thin-film transistors (TFTs) having a SiO2 gate dielectric prepared by rf sputtering. The threshold voltage linearly changed in the range of -3.7 to 3.1 V with the increase in plasma treatment time. Although the amount of change is smaller than that for organic TFTs having thermally grown SiO2, the tendency of the change was similar to that for thermally grown SiO2. To realize different plasma treatment times on the same substrate, a certain region on the SiO2 surface was selected using a shadow mask, and was treated with oxygen plasma. Using the process, organic TFTs with negative threshold voltages and those with positive threshold voltages were fabricated on the same substrate. As a result, enhancement/depletion inverters consisting of the organic TFTs operated at supply voltages of 5 to 15 V.

  6. Adjustments differ among low-threshold motor units during intermittent, isometric contractions.

    Science.gov (United States)

    Farina, Dario; Holobar, Ales; Gazzoni, Marco; Zazula, Damjan; Merletti, Roberto; Enoka, Roger M

    2009-01-01

    We investigated the changes in muscle fiber conduction velocity, recruitment and derecruitment thresholds, and discharge rate of low-threshold motor units during a series of ramp contractions. The aim was to compare the adjustments in motor unit activity relative to the duration that each motor unit was active during the task. Multichannel surface electromyographic (EMG) signals were recorded from the abductor pollicis brevis muscle of eight healthy men during 12-s contractions (n = 25) in which the force increased and decreased linearly from 0 to 10% of the maximum. The maximal force exhibited a modest decline (8.5 +/- 9.3%; P < 0.05). Motor units that were active for 16-98% of the time during the first five contractions were identified throughout the task by decomposition of the EMG signals. Action potential conduction velocity decreased during the task by a greater amount for motor units that were initially active for >70% of the time compared with that of less active motor units. Moreover, recruitment and derecruitment thresholds increased for these most active motor units, whereas the thresholds decreased for the less active motor units. Another 18 motor units were recruited at an average of 171 +/- 32 s after the beginning of the task. The recruitment and derecruitment thresholds of these units decreased during the task, but muscle fiber conduction velocity did not change. These results indicate that low-threshold motor units exhibit individual adjustments in muscle fiber conduction velocity and motor neuron activation that depended on the relative duration of activity during intermittent contractions.

  7. Total Risk Management for Low Dose Radiation Exposures

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Sterc, D.

    2012-01-01

    Our civilization has witnessed about a century of the nuclear age, mixed with enormous promises and cataclysmic threats. Nuclear energy seems to encapsulate the potential for both pure good and evil, or at least we humans are able to perceive it that way. These images are continuously with us, and they both help and distract from making the best of nuclear energy's potential for civilization. Today, with nuclear technology in significant use and holding huge potential to further improve our lives through energy and medical applications, it is enormously important to take a calm, rational, and objective view of the potential risks and certain benefits. Because all uses of nuclear energy have proved that their immediate risks are negligible (i.e., Three Mile Island and Fukushima) or much smaller than those of the alternatives (i.e., Chernobyl), the most important issue appears to be the amount of risk of long-term effects on people exposed to small doses of radiation. A similar issue is present in the increasing use of modern computed tomography and other radiation sources in medicine for examination and therapy. Finally, extreme natural exposures are a third such potential source of risk. The definition of low doses varies depending on the mode of delivery (i.e., single, multiple or continuous exposures); for this paper the usual dose of 100 mSv is selected as the yearly upper amount. There are three very different scientifically supported views on the potential risks from low-dose exposure. The most conservative theory is that all radiation is harmful, and even small increments above background levels (i.e., 2-3 mSv) present additional risk. This view is called the linear no-threshold theory (LNT), and it is accepted as a conservatively simple regulatory approach which guarantees safety. Risk is derived from the extrapolation of the measured effects of high levels of radiation.
The theory opposite to LNT is hormesis, which assumes that small doses of radiation are in fact beneficial, improving our

  8. Doubly excited 1,3Po resonances of helium below the N=2–9 ionisation thresholds

    International Nuclear Information System (INIS)

    Dieng, M.; Sakho, I.; Biaye, M.; Wagué, A.

    2014-01-01

    A novel approach is used to evaluate energies of singlet and triplet resonance states of helium below the N=2–9 hydrogenic thresholds. We have combined the variational method with the non-linear parameters of Hylleraas and the β-parameters of the screening constant by unit nuclear charge. Comparison with various available theoretical and experimental literature values indicates a good agreement. - Highlights: • A simple approach to calculate inter-shell n(K,T)N^A 1,3Po states. • The calculations combine two methods. • A simple expression is used to calculate inter-shell 1,3Po states. • Satisfactory agreement between theoretical and experimental literature values up to Z=10

  9. Impacto não Linear do Marketing Mix no Desempenho em Vendas de Marcas

    Directory of Open Access Journals (Sweden)

    Rafael Barreiros Porto

    2015-01-01

    Full Text Available The pattern of impact that marketing activities exert on sales has not been made evident in the literature. Many studies adopt restrictive linear perspectives, disregarding the empirical evidence. This work investigated the non-linear impact of the marketing mix on sales volume, on the number of consumers, and on purchases per consumer. A longitudinal panel study of brands and their consumers was carried out simultaneously. A total of 121 brands were analyzed over 13 months, with 793 purchases/month made by consumers, by means of three generalized estimating equations. The results show that the marketing mix, especially branding and pricing, strongly impacts all dependent variables in a non-linear fashion, with good parameter fits. The joint effect generates economies of scale for brands, while, for each consumer, the joint effect gradually stimulates the purchase of larger quantities. The research demonstrates eight impact patterns of the marketing mix on the investigated indicators, with changes in their order and weight for brands and consumers.

  10. Threshold responses of songbirds to long-term timber management on an active industrial forest

    Science.gov (United States)

    Becker, Douglas A.; Wood, Petra Bohall; Keyser, Patrick D.; Wigley, T. Bently; Dellinger, Rachel; Weakland, Cathy A.

    2011-01-01

    Forest managers often seek to balance economic benefits from timber harvesting with maintenance of habitat for wildlife, ecosystem function, and human uses. Most research on the relationship between avian abundance and active timber management has been short-term, lasting one to two years, creating the need to investigate long-term avian responses and to identify harvest thresholds when a small change in habitat results in a disproportionate response in relative abundance and nest success. Our objectives were to identify trends in relative abundance and nest success and to identify landscape-scale disturbance thresholds for avian species and habitat guilds in response to a variety of harvest treatments (clear-cuts, heavy and light partial harvests) over 14 years. We conducted point counts and monitored nests at an industrial forest in the central Appalachians of West Virginia during 1996–1998, 2001–2003, and 2007–2009. Early successional species increased in relative abundance across all three time periods, whereas interior-edge and forest-interior guilds peaked in relative abundance mid-study after which the forest-interior guild declined. Of 41 species with >10 detections, four (10%) declined significantly, 13 (32%) increased significantly (only three species among all periods), and 9 (22%) peaked in abundance mid-study (over the entire study period, four species had no significant change in abundance, four declined, and one increased). Based on piecewise linear models, forest-interior and interior-edge guilds’ relative abundance harvest thresholds were 28% total harvests (all harvests combined), 10% clear-cut harvests, and 18% light partial harvests, after which abundances declined. Harvest thresholds for the early successional guild were 42% total harvests, 11% clear-cut harvest, and 10% light partial harvests, and relative abundances increased after surpassing thresholds albeit at a reduced rate of increase after the clear-cut threshold. Threshold
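Piecewise linear (broken-stick) threshold models of the kind described above can be fitted by scanning candidate breakpoints and keeping the one with the lowest residual sum of squares. The sketch below is illustrative only, using synthetic data rather than the study's bird counts, and assumes a single continuous breakpoint:

```python
import numpy as np

def fit_piecewise_linear(x, y, candidates):
    """Fit y = b0 + b1*x + b2*max(x - c, 0) for each candidate breakpoint c
    and return the breakpoint (threshold) with the lowest residual sum of squares."""
    best = None
    for c in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0.0)])
        beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if best is None or rss < best[0]:
            best = (rss, c, beta)
    return best[1], best[2]

# Synthetic example: abundance rises until ~28% harvest, then declines.
rng = np.random.default_rng(0)
harvest = rng.uniform(0, 60, 200)
abundance = (5 + 0.10 * harvest
             - 0.25 * np.maximum(harvest - 28, 0)
             + rng.normal(0, 0.3, 200))
threshold, coef = fit_piecewise_linear(harvest, abundance, np.arange(5, 55, 1.0))
```

The same grid-search idea extends to model-selection criteria (AIC, cross-validation) when the breakpoint location is uncertain.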

  11. Weighted-noise threshold based channel estimation for OFDM ...

    Indian Academy of Sciences (India)

    Existing optimal time-domain thresholds exhibit suboptimal behavior for completely unavailable KCS ... Compared with no truncation case, truncation improved the MSE ... channel estimation errors has been studied.

  12. Damage threshold of lithium niobate crystal under single and multiple femtosecond laser pulses: theoretical and experimental study

    International Nuclear Information System (INIS)

    Meng, Qinglong; Zhang, Bin; Zhong, Sencheng; Zhu, Liguo

    2016-01-01

    The damage threshold of lithium niobate crystal under single and multiple femtosecond laser pulses has been studied theoretically and experimentally. First, a model for predicting the damage threshold of crystal materials, based on an improved rate equation, is proposed. The experimental method for measuring the damage threshold of crystal materials is then described in detail. On this basis, the variation of the damage threshold of lithium niobate crystal with pulse duration is analyzed quantitatively. Finally, the damage threshold of lithium niobate crystal under multiple laser pulses is measured and compared with the theoretical results. The results show that the transmittance of lithium niobate crystal is almost constant when the laser pulse fluence is relatively low, whereas it decreases linearly with increasing laser pulse fluence below the damage threshold. The damage threshold of lithium niobate crystal increases with the duration of the femtosecond laser pulse, and the damage threshold under multiple laser pulses is clearly lower than that for irradiation by a single laser pulse. The theoretical data are in good agreement with the experimental results. (orig.)
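The paper's rate-equation model is not reproduced in the abstract, but the reported decrease of threshold with pulse number is commonly summarized by the empirical incubation law F_th(N) = F_th(1)·N^(S−1) with incubation coefficient 0 < S < 1. The sketch below uses that generic law with illustrative numbers, not the authors' fitted parameters:

```python
import numpy as np

def multi_pulse_threshold(F1, N, S=0.85):
    """Empirical incubation law: F_th(N) = F_th(1) * N**(S - 1).
    S < 1 makes the damage threshold decrease with pulse number N."""
    return F1 * np.asarray(N, dtype=float) ** (S - 1.0)

pulses = np.array([1, 10, 100, 1000])
# F_th(1) = 2 J/cm^2 is an illustrative single-pulse threshold.
thresholds = multi_pulse_threshold(2.0, pulses)
```

Fitting S from measured (N, F_th) pairs is a one-parameter regression on log-log axes.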

  13. Threshold Evaluation of Emergency Risk Communication for Health Risks Related to Hazardous Ambient Temperature.

    Science.gov (United States)

    Liu, Yang; Hoppe, Brenda O; Convertino, Matteo

    2018-04-10

    Emergency risk communication (ERC) programs that activate when the ambient temperature is expected to cross certain extreme thresholds are widely used to manage relevant public health risks. In practice, however, the effectiveness of these thresholds has rarely been examined. The goal of this study is to test if the activation criteria based on extreme temperature thresholds, both cold and heat, capture elevated health risks for all-cause and cause-specific mortality and morbidity in the Minneapolis-St. Paul Metropolitan Area. A distributed lag nonlinear model (DLNM) combined with a quasi-Poisson generalized linear model is used to derive the exposure-response functions between daily maximum heat index and mortality (1998-2014) and morbidity (emergency department visits; 2007-2014). Specific causes considered include cardiovascular, respiratory, renal diseases, and diabetes. Six extreme temperature thresholds, corresponding to 1st-3rd and 97th-99th percentiles of local exposure history, are examined. All six extreme temperature thresholds capture significantly increased relative risks for all-cause mortality and morbidity. However, the cause-specific analyses reveal heterogeneity. Extreme cold thresholds capture increased mortality and morbidity risks for cardiovascular and respiratory diseases and extreme heat thresholds for renal disease. Percentile-based extreme temperature thresholds are appropriate for initiating ERC targeting the general population. Tailoring ERC by specific causes may protect some but not all individuals with health conditions exacerbated by hazardous ambient temperature exposure. © 2018 Society for Risk Analysis.
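The percentile-based activation thresholds examined above are straightforward to compute from a local exposure history; a minimal sketch with a synthetic distribution (illustrative, not the Minneapolis-St. Paul heat-index record):

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative daily maximum heat index history, stand-in for local exposure data.
heat_index = rng.normal(70, 20, 10_000)

# Percentile-based activation thresholds as in the study design:
cold_thresholds = np.percentile(heat_index, [1, 2, 3])    # 1st-3rd percentiles
heat_thresholds = np.percentile(heat_index, [97, 98, 99])  # 97th-99th percentiles
```

In practice the exposure history would come from station observations, and the exposure-response function at each threshold would be estimated with a DLNM.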

  14. Effect of transcranial direct current stimulation on vestibular-ocular and vestibulo-perceptual thresholds.

    Science.gov (United States)

    Kyriakareli, Artemis; Cousins, Sian; Pettorossi, Vito E; Bronstein, Adolfo M

    2013-10-02

    Transcranial direct current stimulation (tDCS) was used in 17 normal individuals to modulate vestibulo-ocular reflex (VOR) and self-motion perception rotational thresholds. The electrodes were applied over the temporoparietal junction bilaterally. Both vestibular nystagmic and perceptual thresholds were increased during as well as after tDCS stimulation. Body rotation was labeled as ipsilateral or contralateral to the anode side, but no difference was observed depending on the direction of rotation or hemisphere polarity. Threshold increase during tDCS was greater for VOR than for motion perception. 'Sham' stimulation had no effect on thresholds. We conclude that tDCS produces an immediate and sustained depression of cortical regions controlling VOR and movement perception. Temporoparietal areas appear to be involved in vestibular threshold modulation but the differential effects observed between VOR and perception suggest a partial dissociation between cortical processing of reflexive and perceptual responses.

  15. Cyclone–anticyclone vortex asymmetry mechanism and linear Ekman friction

    Energy Technology Data Exchange (ETDEWEB)

    Chefranov, S. G., E-mail: schefranov@mail.ru [Russian Academy of Sciences, Obukhov Institute of Atmospheric Physics (Russian Federation)

    2016-04-15

    Allowance for the linear Ekman friction has been found to ensure a threshold (in rotation frequency) realization of the linear dissipative–centrifugal instability and the related chiral symmetry breaking in the dynamics of Lagrangian particles, which leads to the cyclone–anticyclone vortex asymmetry. An excess of the fluid rotation rate ω{sub 0} over some threshold value determined by the fluid eigenfrequency ω (i.e., ω{sub 0} > ω) is shown to be a condition for the realization of such an instability. A new generalization of the solution of the Karman problem to determine the steady-state velocity field in a viscous incompressible fluid above a rotating solid disk of large radius, in which the linear Ekman friction was additionally taken into account, has been obtained. A correspondence of this solution and the conditions for the realization of the dissipative–centrifugal instability of a chiral-symmetric vortex state and the corresponding cyclone–anticyclone vortex asymmetry has been shown. A generalization of the well-known spiral velocity distribution in an “Ekman layer” near a solid surface has been established for the case where the fluid rotation frequency far from the disk ω differs from the disk rotation frequency ω{sub 0}.

  16. Linearity of bulk-controlled inverter ring VCO in weak and strong inversion

    DEFF Research Database (Denmark)

    Wismar, Ulrik Sørensen; Wisland, D.; Andreani, Pietro

    2007-01-01

    In this paper, the linearity of frequency modulation in voltage-controlled inverter ring oscillators for non-feedback sigma-delta converter applications is studied. The linearity is studied through theoretical models of the oscillator operating at supply voltages above and below the threshold voltage......, process variations and temperature variations have also been simulated to indicate the advantages of having the soft rail bias transistor in the VCO.

  17. Risk thresholds for alcohol consumption

    DEFF Research Database (Denmark)

    Wood, Angela M; Kaptoge, Stephen; Butterworth, Adam S

    2018-01-01

    BACKGROUND: Low-risk limits recommended for alcohol consumption vary substantially across different national guidelines. To define thresholds associated with lowest risk for all-cause mortality and cardiovascular disease, we studied individual-participant data from 599 912 current drinkers without previous cardiovascular disease. METHODS: We did a combined analysis of individual-participant data from three large-scale data sources in 19 high-income countries (the Emerging Risk Factors Collaboration, EPIC-CVD, and the UK Biobank). We characterised dose-response associations and calculated hazard...... ...·4 million person-years of follow-up. For all-cause mortality, we recorded a positive and curvilinear association with the level of alcohol consumption, with the minimum mortality risk around or below 100 g per week. Alcohol consumption was roughly linearly associated with a higher risk of stroke (HR per 100...

  18. The rubber hand illusion increases heat pain threshold.

    Science.gov (United States)

    Hegedüs, G; Darnai, G; Szolcsányi, T; Feldmann, Á; Janszky, J; Kállai, J

    2014-09-01

    Accumulating evidence shows that manipulations of cortical body representation, for example, by simply viewing one's own body, can relieve pain in healthy subjects. Despite the widespread use of the rubber hand illusion (RHI) as an effective experimental tool for the manipulation of bodily awareness, previous studies examining the analgesic effect of the RHI have produced conflicting results. We used noxious heat stimuli to induce finger pain in 29 healthy subjects, and we recorded the participants' pain thresholds and subjective pain ratings during the RHI and during the control conditions. Two control conditions were included in our experiment - a standard one with reduced illusion strength (asynchronous stroking control) and an additional one in which the participants viewed their own hand. Raw data showed that both the RHI and the vision of the own hand resulted in slightly higher pain thresholds than the asynchronous stroking control (illusion: 47.79 °C; own-hand: 47.99 °C; asynchronous: 47.52 °C). After logarithmic transformation to achieve normality, paired t-tests revealed that both increases in pain threshold were significant (illusion/asynchronous: p = 0.036; own-hand/asynchronous: p = 0.007). In contrast, there was no significant difference in pain threshold between the illusion and the own-hand conditions (p = 0.656). Pain rating scores were not log-normal, and Wilcoxon signed-rank tests found no significant differences in pain ratings between the study conditions. The RHI increases heat pain threshold and the analgesic effect of the RHI is comparable with that of seeing one's own hand. The latter finding may have clinical implications. © 2014 European Pain Federation - EFIC®
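The analysis path described (log-transform to approach normality, then a paired t-test on the two conditions) can be sketched as follows; the data are simulated stand-ins with an exaggerated effect size, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical paired pain thresholds (deg C): illusion vs asynchronous control,
# for 29 participants, with a small consistent increase under the illusion.
asynchronous = rng.normal(47.5, 0.5, 29)
illusion = asynchronous + rng.normal(0.3, 0.2, 29)

# Log-transform (as in the study) to approach normality, then paired t-test.
t, p = stats.ttest_rel(np.log(illusion), np.log(asynchronous))
```

A paired test is appropriate here because each participant contributes one threshold per condition.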

  19. Algorithmic detectability threshold of the stochastic block model

    Science.gov (United States)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.
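For context, the classical detectability threshold under the Nishimori condition (known parameters) for the symmetric two-group SBM is the Kesten-Stigum condition; the algorithmic threshold derived in the paper is a different quantity, so the check below is only the textbook baseline:

```python
import math

def detectable_ks(c_in, c_out):
    """Kesten-Stigum detectability condition for the symmetric two-group SBM:
    with mean intra-/inter-group degrees c_in and c_out, the planted structure
    is detectable (parameters known) iff |c_in - c_out| > 2 * sqrt(c),
    where c = (c_in + c_out) / 2 is the average degree."""
    c = (c_in + c_out) / 2.0
    return abs(c_in - c_out) > 2.0 * math.sqrt(c)

strong = detectable_ks(8.0, 2.0)   # well-separated assortative structure
weak = detectable_ks(5.5, 4.5)     # structure too weak to detect
```

The paper's point is precisely that when parameters must be learned by EM+BP, the achievable threshold can differ from this condition.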

  20. Deep Brain Stimulation of the Subthalamic Nucleus Does Not Affect the Decrease of Decision Threshold during the Choice Process When There Is No Conflict, Time Pressure, or Reward.

    Science.gov (United States)

    Leimbach, Friederike; Georgiev, Dejan; Litvak, Vladimir; Antoniades, Chrystalina; Limousin, Patricia; Jahanshahi, Marjan; Bogacz, Rafal

    2018-06-01

    During a decision process, the evidence supporting alternative options is integrated over time, and the choice is made when the accumulated evidence for one of the options reaches a decision threshold. Humans and animals have an ability to control the decision threshold, that is, the amount of evidence that needs to be gathered to commit to a choice, and it has been proposed that the subthalamic nucleus (STN) is important for this control. Recent behavioral and neurophysiological data suggest that, in some circumstances, the decision threshold decreases with time during choice trials, allowing overcoming of indecision during difficult choices. Here we asked whether this within-trial decrease of the decision threshold is mediated by the STN and if it is affected by disrupting information processing in the STN through deep brain stimulation (DBS). We assessed 13 patients with Parkinson disease receiving bilateral STN DBS six or more months after the surgery, 11 age-matched controls, and 12 young healthy controls. All participants completed a series of decision trials, in which the evidence was presented in discrete time points, which allowed more direct estimation of the decision threshold. The participants differed widely in the slope of their decision threshold, ranging from constant threshold within a trial to steeply decreasing. However, the slope of the decision threshold did not depend on whether STN DBS was switched on or off and did not differ between the patients and controls. Furthermore, there was no difference in accuracy and RT between the patients in the on and off stimulation conditions and healthy controls. Previous studies that have reported modulation of the decision threshold by STN DBS or unilateral subthalamotomy in Parkinson disease have involved either fast decision-making under conflict or time pressure or in anticipation of high reward. Our findings suggest that, in the absence of reward, decision conflict, or time pressure for decision

  1. Spectral singularities, threshold gain, and output intensity for a slab laser with mirrors

    Science.gov (United States)

    Doğan, Keremcan; Mostafazadeh, Ali; Sarısaman, Mustafa

    2018-05-01

    We explore the consequences of the emergence of linear and nonlinear spectral singularities in TE modes of a homogeneous slab of active optical material that is placed between two mirrors. We use the results together with two basic postulates regarding the behavior of laser light emission to derive explicit expressions for the laser threshold condition and output intensity for these modes of the slab and discuss their physical implications. In particular, we reveal the details of the dependence of the threshold gain and output intensity on the position and properties of the mirrors and on the real part of the refractive index of the gain material.

  2. Unstable volatility functions: the break preserving local linear estimator

    DEFF Research Database (Denmark)

    Casas, Isabel; Gijbels, Irene

    The objective of this paper is to introduce the break preserving local linear (BPLL) estimator for the estimation of unstable volatility functions. Breaks in the structure of the conditional mean and/or the volatility functions are common in Finance. Markov switching models (Hamilton, 1989......) and threshold models (Lin and Terasvirta, 1994) are amongst the most popular models to describe the behaviour of data with structural breaks. The local linear (LL) estimator is not consistent at points where the volatility function has a break and it may even report negative values for finite samples...
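The standard local linear (LL) estimator that the BPLL estimator modifies can be sketched as a kernel-weighted least-squares fit; the code below is the plain LL baseline with an illustrative Gaussian kernel and bandwidth, not the break-preserving variant:

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate of m(x0): weighted least squares of y on
    (1, x - x0) with Gaussian kernel weights; the fitted intercept is m(x0)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 400))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 400)
est = local_linear(0.25, x, y, h=0.05)  # true m(0.25) = sin(pi/2) = 1
```

Near a break in m(·), this estimator smooths across the discontinuity, which is exactly the inconsistency the BPLL construction is designed to avoid.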

  3. Sentence recognition thresholds in silence in free field versus pure-tone thresholds under earphones in individuals with cochlear hearing loss

    Directory of Open Access Journals (Sweden)

    Nilvia Herondina Soares Aurélio

    2008-01-01

    Full Text Available PURPOSE: to investigate the correlation between pure-tone thresholds and Sentence Recognition Thresholds in Silence (SRTS) and to verify whether it is possible, through the audiogram, to establish a prognosis of the patient's ability to recognize speech. METHODS: 42 individuals with moderate cochlear hearing loss were studied, 18 female and 24 male, aged 41 to 76 years. First, a basic audiological evaluation was performed, followed by measurement of the Sentence Recognition Thresholds in Silence, in free field, using the Portuguese Sentence List Test (Costa, 1998. RESULTS: the statistical analysis showed a significant correlation between the sentence recognition threshold in silence and the average of the frequencies of 0.5, 1 and 2 kHz. In turn, when correlating the Sentence Recognition Thresholds in Silence with the average of the frequencies of 3, 4 and 6 kHz, there was no significant correlation. CONCLUSION: a likely prognosis of speech recognition ability in silence can be made based only on the thresholds at the frequencies of 0.5, 1 and 2 kHz in cochlear hearing losses.

  4. Music effect on pain threshold evaluated with current perception threshold

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    AIM: Music relieves anxiety and psychotic tension. This effect of music is applied to surgical operations in hospitals and dental offices. It is still unclear whether this effect of music is limited to the psychological aspect rather than the physical aspect, and whether it is influenced by the mood or emotion of the listener. To elucidate these issues, we evaluated the effect of music on pain threshold by current perception threshold (CPT) and the Profile of Mood States (POMS) test. METHODS: 30 healthy subjects (12 men, 18 women, 25-49 years old, mean age 34.9) were tested. (1) After the POMS test, all subjects were evaluated for pain threshold with CPT by Neurometer (Radionics, USA) under 6 conditions: silence, and listening to slow-tempo classical music, nursery music, hard rock music, classical piano music and relaxation music, with 30-second intervals. (2) After a Stroop color-word test as the stressor, pain threshold was evaluated with CPT under 2 conditions: silence and listening to slow-tempo classical music. RESULTS: While listening to music, CPT scores increased, especially at the 2 000 Hz level, which is related to compression, warmth and pain sensation. Type of music, preference for the music and stress also affected CPT scores. CONCLUSION: The present study demonstrated that concentration on the music raises the pain threshold, and that stress and mood influence the effect of music on pain threshold.

  5. Comparison of skin sensory thresholds using pre-programmed or single-frequency transcutaneous electrical nerve stimulation.

    Science.gov (United States)

    Kang, Jong Ho

    2015-12-01

    [Purpose] The purpose of the present study was to compare the sensory thresholds of healthy subjects using pre-programmed or single-frequency transcutaneous electrical nerve stimulation. [Subjects] Ninety healthy adult subjects were randomly assigned to pre-programmed or single-frequency stimulation groups, each consisting of 45 participants. [Methods] Sensory thresholds were measured in the participants' forearms using von Frey filaments before and after pre-programmed or single-frequency transcutaneous electrical nerve stimulation, and the resulting values were analyzed. [Results] Significant increases in sensory threshold after stimulation were observed in both groups. However, there were no significant differences between the two groups in sensory thresholds after stimulation or in the magnitude of threshold increases following stimulation. [Conclusion] Our results show that there are no differences between sensory threshold increases induced by pre-programmed and single-frequency transcutaneous electrical nerve stimulation.

  6. Local hyperspectral data multisharpening based on linear/linear-quadratic nonnegative matrix factorization by integrating lidar data

    Science.gov (United States)

    Benhalouche, Fatima Zohra; Karoui, Moussa Sofiane; Deville, Yannick; Ouamri, Abdelaziz

    2015-10-01

    In this paper, a new Spectral-Unmixing-based approach, using Nonnegative Matrix Factorization (NMF), is proposed to locally multi-sharpen hyperspectral data by integrating a Digital Surface Model (DSM) obtained from LIDAR data. In this new approach, the nature of the local mixing model is detected by using the local variance of the object elevations. The hyper/multispectral images are explored using small zones. In each zone, the variance of the object elevations is calculated from the DSM data in this zone. This variance is compared to a threshold value and the adequate linear/linear-quadratic spectral unmixing technique is used in the considered zone to independently unmix hyperspectral and multispectral data, using an adequate linear/linear-quadratic NMF-based approach. The obtained spectral and spatial information thus respectively extracted from the hyper/multispectral images are then recombined in the considered zone, according to the selected mixing model. Experiments based on synthetic hyper/multispectral data are carried out to evaluate the performance of the proposed multi-sharpening approach and literature linear/linear-quadratic approaches used on the whole hyper/multispectral data. In these experiments, real DSM data are used to generate synthetic data containing linear and linear-quadratic mixed pixel zones. The DSM data are also used for locally detecting the nature of the mixing model in the proposed approach. Globally, the proposed approach yields good spatial and spectral fidelities for the multi-sharpened data and significantly outperforms the used literature methods.
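At the core of the approach is NMF; a minimal linear-mixing sketch with Lee-Seung multiplicative updates (the basic factorization only, not the paper's linear-quadratic variant or its fusion step) illustrates the idea:

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    """Basic NMF with Lee-Seung multiplicative updates: V ≈ W @ H,
    with W (endmember-like) and H (abundance-like) kept nonnegative."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 1e-3
    H = rng.random((r, m)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Synthetic nonnegative data with exact rank 3 (a pure linear mixture).
rng = np.random.default_rng(4)
V = rng.random((30, 3)) @ rng.random((3, 40))
W, H = nmf(V, 3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The multiplicative form guarantees nonnegativity of both factors at every iteration, which is why it is a common building block in spectral unmixing.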

  7. Introduction to Special Feature on Catastrophic Thresholds, Perspectives, Definitions, and Applications

    Directory of Open Access Journals (Sweden)

    Robert A. Washington-Allen

    2010-09-01

    Full Text Available The contributions to this special feature focus on several conceptual and operational applications for understanding non-linear behavior of complex systems with various ecological criteria at unique levels of organization. The organizing theme of the feature emphasizes alternative stable states or regimes and intervening thresholds that possess great relevance to ecology and natural resource management. The authors within this special feature address the conceptual models of catastrophe theory, self-organization, cross-scale interactions and time-scale calculus; develop operational definitions and procedures for understanding the occurrence of dynamic regimes or multiple stable states and thresholds; suggest diagnostic tools for detection of states and thresholds and contribute to the development of scaling laws; and finally, demonstrate applications that promote both greater ecological understanding and management prescriptions for insect and disease outbreaks, resource island formation, and characterization of ecological resilience. This Special Feature concludes with a synthesis of the commonalities and disparities of concepts and interpretations among the contributed papers to identify issues and approaches that merit further research emphasis.

  8. Plenary panel 1: The scientific bases of radiation protection. Non-targeted effects of ionising radiation - Implications for radiation protection

    International Nuclear Information System (INIS)

    Salomaa, S.

    2006-01-01

    The universality of the target theory of radiation-induced effects is challenged by observations on non-targeted effects such as bystander effects, genomic instability and adaptive response. Essential features of non-targeted effects are that they do not require direct nuclear exposure by radiation and they are particularly significant at low doses. This new evidence suggests a need for a new paradigm in radiation biology. The new paradigm should cover both the classical (targeted) and the non-targeted effects. New aspects include the role of cellular communication and tissue-level responses. A better understanding of non-targeted effects may have important consequences for health risk assessment and, consequently, on radiation protection. Non-targeted effects may contribute to the estimation of cancer risk from occupational, medical and environmental exposures. In particular, they may have implications for the applicability of the Linear-No-Threshold (L.N.T.) model in extrapolating radiation risk data into the low-dose region. This also means that the adequacy of the concept of dose to estimate risk is challenged by these findings. Moreover, these effects may provide new mechanistic explanations for the development of non-cancer diseases. Further research is required to determine if these effects, typically measured in cell cultures, are applicable in tissue level, whole animals, and ultimately in humans. (authors)

  10. Impact of sub and supra-threshold adaptation currents in networks of spiking neurons.

    Science.gov (United States)

    Colliaux, David; Yger, Pierre; Kaneko, Kunihiko

    2015-12-01

    Neuronal adaptation is the intrinsic capacity of the brain to change, by various mechanisms, its dynamical responses as a function of the context. Such phenomena, widely observed in vivo and in vitro, are known to be crucial in homeostatic regulation of activity and gain control. The effects of adaptation have already been studied at the single-cell level, resulting from either voltage- or calcium-gated channels, both activated by spiking activity and modulating the dynamical responses of the neurons. In this study, by disentangling those effects into a linear (sub-threshold) and a non-linear (supra-threshold) part, we focus on the functional role of these two distinct components of adaptation in neuronal activity at various scales, starting from single-cell responses up to recurrent network dynamics, and under stationary or non-stationary stimulation. The effects of slow currents on collective dynamics, like modulation of population oscillation and reliability of spike patterns, are quantified for various types of adaptation in sparse recurrent networks.
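The two components of adaptation (a sub-threshold, voltage-coupled term and a supra-threshold, spike-triggered jump) can be illustrated with a leaky integrate-and-fire neuron carrying an adaptation current w; the parameters below are illustrative textbook-style values, not taken from the paper:

```python
# Euler integration of a leaky integrate-and-fire neuron with an adaptation
# current w: sub-threshold coupling a*(V - EL) plus a spike-triggered jump b.
def simulate(I, a=2.0, b=60.0, T=2000.0, dt=0.1):
    C, gL, EL, VT, Vr = 200.0, 10.0, -70.0, -50.0, -65.0  # pF, nS, mV, mV, mV
    tau_w = 300.0                                          # adaptation time, ms
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) - w + I) / C   # membrane equation (mV/ms)
        dw = (a * (V - EL) - w) / tau_w     # sub-threshold adaptation dynamics
        V += dt * dV
        w += dt * dw
        if V >= VT:      # threshold crossing: spike, reset, adaptation jump
            V = Vr
            w += b       # supra-threshold (spike-triggered) component
            spikes += 1
    return spikes

spikes_adapting = simulate(500.0)              # both adaptation components on
spikes_no_adapt = simulate(500.0, a=0.0, b=0.0)  # adaptation switched off
```

With adaptation on, the accumulated current w opposes the drive and lowers the firing rate, the basic spike-frequency-adaptation effect the study builds on.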

  11. [Relationship between Occlusal Discomfort Syndrome and Occlusal Threshold].

    Science.gov (United States)

    Munakata, Motohiro; Ono, Yumie; Hayama, Rika; Kataoka, Kanako; Ikuta, Ryuhei; Tamaki, Katsushi

    2016-03-01

    Occlusal dysesthesia has been defined as persistent uncomfortable feelings of intercuspal position continuing for more than 6 months without evidence of physical occlusal discrepancy. The problem often occurs after occlusal intervention by dental care. Although various dental treatments (e.g. occlusal adjustment, orthodontic treatment and prosthetic reconstruction) are attempted to solve occlusal dysesthesia, they rarely reach a satisfactory result for either patients or dentists. In Japan, these symptoms are defined by the term "Occlusal discomfort syndrome" (ODS). The aim of this study was to investigate the characteristics of ODS with a simple occlusal sensory perceptive and discriminative test. Twenty-one female dental patients with ODS (mean age 55.8 ± 19.2 years) and 21 age- and gender-matched dental patients without ODS (mean age 53.1 ± 16.8 years) participated in the study. Upon grinding occlusal registration foils that were stacked to different thicknesses, participants reported the thicknesses at which they recognized the foils (recognition threshold) and felt discomfort (discomfort threshold). Although there was no significant difference in occlusal recognition thresholds between the two patient groups, the discomfort threshold was significantly smaller in the patients with ODS than in those without ODS. Moreover, the recognition threshold showed an age-dependent increase in patients without ODS, whereas it remained comparable between the younger and older patient subgroups with ODS. These results suggest that occlusal discomfort threshold rather than recognition threshold is an issue in ODS. The foil grinding procedure is a simple and useful method to evaluate occlusal perceptive and discriminative abilities in patients with ODS.

  12. Modelos linear e não linear em análises genéticas para sobrevivência de crias de ovinos da raça Santa Inês Linear and nonlinear models in genetic analyses of lamb survival in the Santa Inês hair sheep breed

    Directory of Open Access Journals (Sweden)

    W.H. Sousa

    1999-06-01

Full Text Available Registros de sobrevivência do nascimento ao desmame de 3846 crias de ovinos da raça Santa Inês foram analisados por modelos de reprodutor linear e não linear (modelo de limiar), para estimar componentes de variância e herdabilidade. Os modelos usados para sobrevivência, analisada como característica da cria, incluíram os efeitos fixos de sexo, da combinação tipo de nascimento-criação da cria e da idade da ovelha ao parto, efeito da covariável peso da cria ao nascer e efeitos aleatórios de reprodutor, da classe rebanho-ano-estação e do resíduo. Componentes de variância para o modelo linear foram estimados pelo método da máxima verossimilhança restrita (REML) e para o modelo não linear por uma aproximação da máxima verossimilhança marginal (MML), pelo programa CMMAT2. O coeficiente de herdabilidade (h²) estimado pelo modelo de limiar foi de 0,29, e pelo modelo linear, 0,14. A correlação de ordem de Spearman entre as capacidades de transmissão dos reprodutores, com base nos dois modelos, foi de 0,96. As estimativas de h² obtidas indicam a possibilidade de se obter, por seleção, ganho genético para sobrevivência. Records of 3,846 lambs survival from birth to weaning of Santa Inês hair sheep breed, were analyzed by linear and non linear sire models (threshold model) to estimate variance components and heritability (h²). The models that were used to analyze survival, considered in this study as a lamb trait, included the fixed effects of sex of the lamb, combination of type of birth-rearing of lamb, and age of ewe, birth weight of lamb as covariate, and random effects of sire, herd-year-season and residual. Variance components were obtained using restricted maximum likelihood (REML), in linear model and marginal maximum likelihood in threshold model through CMMAT2 program. Estimate of heritability (h²) obtained by threshold model was 0.29 and by linear model was 0.14. Rank correlation of Spearman, between sire solutions
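The abstract's headline numbers can be reproduced in miniature. A minimal sketch (Python): the paternal half-sib relation h² = 4σ²_sire/(σ²_sire + σ²_residual) implied by a sire model, plus a pure-Python Spearman rank correlation like the one used to compare sire solutions. The variance components below are hypothetical values chosen to yield the reported h² = 0.29.

```python
def sire_heritability(var_sire, var_residual):
    """Paternal half-sib (sire) model: sires account for 1/4 of the
    additive genetic variance, so h2 = 4 * var_sire / total variance."""
    return 4.0 * var_sire / (var_sire + var_residual)

def spearman_rank_correlation(x, y):
    """Spearman's rho = Pearson correlation of the ranks (no-ties case,
    as with distinct sire transmitting abilities)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n + 1) / 2.0  # mean rank
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)
    return cov / var

# Hypothetical liability-scale variance components giving h2 = 0.29:
h2 = sire_heritability(0.0725, 0.9275)
```

Applied to the sire solutions from the two models, `spearman_rank_correlation` is the statistic behind the reported 0.96 rank agreement.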

  13. Pairing based threshold cryptography improving on Libert-Quisquater and Baek-Zheng

    DEFF Research Database (Denmark)

    Desmedt, Yvo; Lange, Tanja

    2006-01-01

    In this paper we apply techniques from secret sharing and threshold decryption to show how to properly design an ID-based threshold system in which one assumes no trust in any party. In our scheme: We avoid that any single machine ever knew the master secret s of the trusted authority (TA). Inste...

  14. Managing ecological thresholds in coupled environmental–human systems

    Science.gov (United States)

    Horan, Richard D.; Fenichel, Eli P.; Drury, Kevin L. S.; Lodge, David M.

    2011-01-01

    Many ecosystems appear subject to regime shifts—abrupt changes from one state to another after crossing a threshold or tipping point. Thresholds and their associated stability landscapes are determined within a coupled socioeconomic–ecological system (SES) where human choices, including those of managers, are feedback responses. Prior work has made one of two assumptions about managers: that they face no institutional constraints, in which case the SES may be managed to be fairly robust to shocks and tipping points are of little importance, or that managers are rigidly constrained with no flexibility to adapt, in which case the inferred thresholds may poorly reflect actual managerial flexibility. We model a multidimensional SES to investigate how alternative institutions affect SES stability landscapes and alter tipping points. With institutionally dependent human feedbacks, the stability landscape depends on institutional arrangements. Strong institutions that account for feedback responses create the possibility for desirable states of the world and can cause undesirable states to cease to exist. Intermediate institutions interact with ecological relationships to determine the existence and nature of tipping points. Finally, weak institutions can eliminate tipping points so that only undesirable states of the world remain. PMID:21502517
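As a toy illustration of the fold-type tipping point discussed above (not the authors' SES model), consider a harvested logistic population dx/dt = r·x·(1 − x/K) − h: its equilibria vanish in a saddle-node bifurcation once the pressure parameter h exceeds rK/4.

```python
def equilibria(r, K, h):
    """Steady states of the harvested logistic dx/dt = r*x*(1 - x/K) - h.
    Solving r*x*(1 - x/K) = h gives x = K*(1 -/+ sqrt(1 - 4*h/(r*K)))/2.
    For h > r*K/4 the stable and unstable branches have collided in a
    fold (saddle-node) bifurcation: the tipping point."""
    disc = 1.0 - 4.0 * h / (r * K)
    if disc < 0:
        return []  # past the tipping point: no equilibrium survives
    root = disc ** 0.5
    return [K * (1 - root) / 2.0, K * (1 + root) / 2.0]  # unstable, stable
```

Sweeping h across rK/4 shows how a smoothly varied pressure parameter can make a state of the system abruptly cease to exist, which is the qualitative behaviour the institutional feedbacks above either create, shift, or eliminate.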

  15. The dynamic time-over-threshold method for multi-channel APD based gamma-ray detectors

    Energy Technology Data Exchange (ETDEWEB)

    Orita, T., E-mail: orita.tadashi@jaea.go.jp [Japan Atomic Energy Agency, Fukushima (Japan); Shimazoe, K.; Takahashi, H. [Department of Nuclear Management and Engineering, The University of Tokyo, Bunkyō (Japan)

    2015-03-01

Recent advances in manufacturing technology have enabled the use of multi-channel pixelated detectors in gamma-ray imaging applications. When obtaining gamma-ray measurements, it is important to obtain pulse height information in order to avoid unnecessary events such as scattering. However, as the number of channels increases, more electronics are needed to process each channel's signal, and the corresponding increases in circuit size and power consumption can result in practical problems. The time-over-threshold (ToT) method, which has recently become popular in the medical field, is a signal processing technique that can effectively avoid such problems. However, ToT suffers from poor linearity and its dynamic range is limited. We therefore propose a new ToT technique called the dynamic time-over-threshold (dToT) method [4]. A new signal processing system using dToT and CR-RC shaping demonstrated much better linearity than that of a conventional ToT. Using a test circuit with a new Gd{sub 3}Al{sub 2}Ga{sub 3}O{sub 12} (GAGG) scintillator and an avalanche photodiode, the pulse height spectra of {sup 137}Cs and {sup 22}Na sources were measured with high linearity. Based on these results, we designed a new application-specific integrated circuit (ASIC) for this multi-channel dToT system, measured the spectra of a {sup 22}Na source, and investigated the linearity of the system.
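The linearity problem of conventional fixed-threshold ToT can be seen numerically. A sketch (Python) using the standard idealized CR-RC pulse shape A·(t/τ)·e^(1−t/τ), which is an assumption here, not a waveform taken from the paper:

```python
import math

def crrc_pulse(t, A, tau=1.0):
    """Idealized unipolar CR-RC shaped pulse, peak amplitude A at t = tau."""
    return A * (t / tau) * math.exp(1.0 - t / tau) if t >= 0 else 0.0

def time_over_threshold(A, thr, tau=1.0, dt=1e-3, t_max=20.0):
    """Conventional fixed-threshold ToT, measured by brute-force sampling.
    For this pulse shape the ToT grows far more slowly than the amplitude,
    which is the poor linearity the dToT method is designed to fix."""
    return sum(dt for i in range(int(t_max / dt))
               if crrc_pulse(i * dt, A, tau) > thr)

tot_2 = time_over_threshold(2.0, 0.5)
tot_4 = time_over_threshold(4.0, 0.5)  # doubling A far from doubles the ToT
```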

  16. Wheel slip control with torque blending using linear and nonlinear model predictive control

    Science.gov (United States)

    Basrah, M. Sofian; Siampis, Efstathios; Velenis, Efstathios; Cao, Dongpu; Longo, Stefano

    2017-11-01

    Modern hybrid electric vehicles employ electric braking to recuperate energy during deceleration. However, currently anti-lock braking system (ABS) functionality is delivered solely by friction brakes. Hence regenerative braking is typically deactivated at a low deceleration threshold in case high slip develops at the wheels and ABS activation is required. If blending of friction and electric braking can be achieved during ABS events, there would be no need to impose conservative thresholds for deactivation of regenerative braking and the recuperation capacity of the vehicle would increase significantly. In addition, electric actuators are typically significantly faster responding and would deliver better control of wheel slip than friction brakes. In this work we present a control strategy for ABS on a fully electric vehicle with each wheel independently driven by an electric machine and friction brake independently applied at each wheel. In particular we develop linear and nonlinear model predictive control strategies for optimal performance and enforcement of critical control and state constraints. The capability for real-time implementation of these controllers is assessed and their performance is validated in high fidelity simulation.
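For orientation, the quantities being controlled can be sketched as follows (Python). The slip definition is the standard one; the torque-blending rule is a deliberately naive placeholder, since the paper instead solves the allocation inside linear and nonlinear MPC with slip constraints.

```python
def longitudinal_slip(v, omega, r):
    """Braking slip ratio: lambda = (v - omega*r) / v for vehicle speed
    v > 0 (m/s), wheel speed omega (rad/s) and rolling radius r (m).
    0 means free rolling, 1 a locked wheel; ABS keeps lambda small."""
    return (v - omega * r) / v

def blend_braking_torque(T_demand, T_electric_max):
    """Naive blending rule: satisfy the braking demand with the faster,
    energy-recuperating electric machine first, then top up with the
    friction brakes. A real controller would do this allocation inside
    the MPC, subject to actuator and slip constraints."""
    T_electric = min(T_demand, T_electric_max)
    T_friction = T_demand - T_electric
    return T_electric, T_friction
```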

  17. Linear dose response curves in fungi and tradescantia

    International Nuclear Information System (INIS)

    Unrau, P.

    1999-07-01

'pink' loss of heterozygosity (LOH) events occur because Clone 02 repairs both DSB and LCD by recombination. Clone 02 has a linear dose response for high LET radiation. Starting from the same initial yield frequency, wild-types have a sublinear response. The sublinear response reflects a smoothly decreasing probability that 'pinks' are generated as a function of increasing high LET dose for wild-type but not Clone 02. This smoothly decreasing response would be expected for LOH in 'wild-type' humans. It reflects an increasing proportion of DNA damage being repaired by non-recombinational pathways and/or an increasing probability of cell death with increasing dose. Clone 02 at low doses and low dose rates of low LET radiation has a linear dose response, reflecting a 1/16 probability of a lesion leading to LOH, relative to high LET lesions. This differential is held to reflect: microdosimetric differences in energy deposition and, therefore, DNA damage by low and high LET radiations; the effects of lesion clustering after high LET on the probability of generating the end wild-types. While no observations have been made at very low doses and dose rates in wild-types, there is no reason to suppose that the low LET linear non-threshold dose response of Clone 02 is abnormal. The importance of the LOH somatic genetic end-point is that it reflects cancer risk in humans. The linear non-threshold low dose low LET response curve reflects either the probability that recombinational Holliday junctions are occasionally cleaved in a rare orientation to generate LOH, or the probability that low LET lesions include a small proportion of clustered events similar to high LET ionization, or both. Calculations of the Poisson probability that two or more low LET lesions will be induced in the same target suggest that dose rate effects depend upon the coincidence of DNA lesions in the same target, and that the probability of LOH depends upon lesion and repair factors. But the slope of LOH in Clone 02 and all other
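The Poisson calculation mentioned at the end can be made explicit. Assuming the number of lesions per target is Poisson-distributed with mean λ (proportional to dose), the probability of a two-lesion coincidence falls off quadratically at low dose, which is why single-lesion (linear) behaviour dominates there:

```python
import math

def prob_two_or_more(lam):
    """Poisson probability that a target sustains >= 2 lesions when the
    mean number of lesions per target is lam (proportional to dose):
    P(k >= 2) = 1 - exp(-lam) * (1 + lam) ~ lam**2 / 2 for small lam."""
    return 1.0 - math.exp(-lam) * (1.0 + lam)
```

At low dose and dose rate the coincidence term vanishes much faster than the linear single-lesion response, consistent with dose-rate effects depending on lesion coincidence in the same target.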

  18. Linear dose response curves in fungi and tradescantia

    Energy Technology Data Exchange (ETDEWEB)

    Unrau, P. [Atomic Energy of Canada Ltd., Chalk River, Ontario (Canada)

    1999-07-15

'pink' loss of heterozygosity (LOH) events occur because Clone 02 repairs both DSB and LCD by recombination. Clone 02 has a linear dose response for high LET radiation. Starting from the same initial yield frequency, wild-types have a sublinear response. The sublinear response reflects a smoothly decreasing probability that 'pinks' are generated as a function of increasing high LET dose for wild-type but not Clone 02. This smoothly decreasing response would be expected for LOH in 'wild-type' humans. It reflects an increasing proportion of DNA damage being repaired by non-recombinational pathways and/or an increasing probability of cell death with increasing dose. Clone 02 at low doses and low dose rates of low LET radiation has a linear dose response, reflecting a 1/16 probability of a lesion leading to LOH, relative to high LET lesions. This differential is held to reflect: microdosimetric differences in energy deposition and, therefore, DNA damage by low and high LET radiations; the effects of lesion clustering after high LET on the probability of generating the end wild-types. While no observations have been made at very low doses and dose rates in wild-types, there is no reason to suppose that the low LET linear non-threshold dose response of Clone 02 is abnormal. The importance of the LOH somatic genetic end-point is that it reflects cancer risk in humans. The linear non-threshold low dose low LET response curve reflects either the probability that recombinational Holliday junctions are occasionally cleaved in a rare orientation to generate LOH, or the probability that low LET lesions include a small proportion of clustered events similar to high LET ionization, or both. Calculations of the Poisson probability that two or more low LET lesions will be induced in the same target suggest that dose rate effects depend upon the coincidence of DNA lesions in the same target, and that the probability of LOH depends upon lesion and repair factors. But the

  19. The cooperative effect of p53 and Rb in local nanotherapy in a rabbit VX2 model of hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Dong S

    2013-10-01

Full Text Available Shengli Dong,1 Qibin Tang,2 Miaoyun Long,3 Jian Guan,4 Lu Ye,5 Gaopeng Li6 1Department of General Surgery, The Second Hospital of Shanxi Medical University, Shanxi Medical University, Taiyuan, Shanxi Province, 2Department of Hepatobiliopancreatic Surgery, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 3Department of Thyroid and Vascular Surgery, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 4Department of Radiology, First Affiliated Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, 5Infection Department, Guangzhou No 8 Hospital, Guangzhou, Guangdong Province, 6Department of Ultrasound, Sun Yat-sen Memorial Hospital, Sun Yat-sen University, Guangzhou, Guangdong Province, People's Republic of China Background/aim: A local nanotherapy (LNT) combining the therapeutic efficacy of trans-arterial embolization, nanoparticles, and p53 gene therapy has been previously presented. The study presented here aimed to further improve the incomplete tumor eradication and limited survival enhancement and to elucidate the molecular mechanism of the LNT. Methods: In a tumor-targeting manner, recombinant expressing plasmids harboring wild-type p53 and Rb were either co-transferred or transferred separately to rabbit hepatic VX2 tumors in a poly-L-lysine-modified hydroxyapatite nanoparticle nanoplex and Lipiodol® (Guerbet, Villepinte, France) emulsion via the hepatic artery. Subsequent co-expression of p53 and Rb proteins within the treated tumors was investigated by Western blotting and in situ analysis by laser-scanning confocal microscopy. The therapeutic effect was evaluated by the tumor growth velocity, apoptosis and necrosis rates, their sensitivity to Adriamycin® (ADM), mitomycin C, and fluorouracil, the microvessel density of tumor tissue, and the survival time of animals. Eventually, real-time polymerase chain reaction and enhanced chemiluminescence Western blotting

  20. Conduction in rectangular quasi-one-dimensional and two-dimensional random resistor networks away from the percolation threshold.

    Science.gov (United States)

    Kiefer, Thomas; Villanueva, Guillermo; Brugger, Jürgen

    2009-08-01

In this study we investigate electrical conduction in finite rectangular random resistor networks in quasi-one and two dimensions far away from the percolation threshold p(c) by the use of a bond percolation model. Various topologies such as parallel linear chains in one dimension, as well as square and triangular lattices in two dimensions, are compared as a function of the geometrical aspect ratio. In particular we propose a linear approximation for conduction in two-dimensional systems far from p(c), which is useful for engineering purposes. We find that the same scaling function, which can be used for finite-size scaling of percolation thresholds, also applies to describe conduction away from p(c). This is in contrast to the quasi-one-dimensional case, which is highly nonlinear. The qualitative analysis of the range within which the linear approximation is legitimate is given. A brief link to real applications is made by taking into account a statistical distribution of the resistors in the network. Our results are of potential interest in fields such as nanostructured or composite materials and sensing applications.
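The contrast between the quasi-one-dimensional and two-dimensional cases can be illustrated with the simplest quasi-1D topology, parallel linear chains. In this idealization (independent chains with no cross-links, an assumption, not the paper's full network model) the expected conductance scales as p^L, hence the strong nonlinearity in p noted above:

```python
def parallel_chain_conductance(n_chains, length, p, g_bond=1.0):
    """Expected conductance of n_chains independent parallel chains, each
    a series string of `length` bonds present with probability p and
    conductance g_bond. A chain conducts only if every one of its bonds
    is present (probability p**length), so the mean conductance
    n_chains * p**length * g_bond / length is highly non-linear in p."""
    return n_chains * (p ** length) * g_bond / length
```

Halving p in a chain of 10 bonds cuts the expected conductance by a factor of 2^10, whereas a 2D lattice far from p(c) responds roughly linearly.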

  1. Predicting hearing thresholds and occupational hearing loss with multiple-frequency auditory steady-state responses.

    Science.gov (United States)

    Hsu, Ruey-Fen; Ho, Chi-Kung; Lu, Sheng-Nan; Chen, Shun-Sheng

    2010-10-01

An objective investigation is needed to verify the existence and severity of hearing impairments resulting from work-related, noise-induced hearing loss in arbitration of medicolegal aspects. We investigated the accuracy of multiple-frequency auditory steady-state responses (Mf-ASSRs) between subjects with sensorineural hearing loss (SNHL) with and without occupational noise exposure. Cross-sectional study. Tertiary referral medical centre. Pure-tone audiometry and Mf-ASSRs were recorded in 88 subjects (34 patients had occupational noise-induced hearing loss [NIHL], 36 patients had SNHL without noise exposure, and 18 volunteers were normal controls). Inter- and intragroup comparisons were made. A predicting equation was derived using multiple linear regression analysis. ASSRs and pure-tone thresholds (PTTs) showed a strong correlation for all subjects (r = .77–.94). The relationship is demonstrated by the derived equation. The differences between the ASSR and PTT were significantly higher for the NIHL group than for the subjects with non-noise-induced SNHL. The Mf-ASSR is a tool for objectively evaluating hearing thresholds. Predictive value may be lower in subjects with occupational hearing loss. Regardless of carrier frequencies, the severity of hearing loss affects the steady-state response. Moreover, the ASSR may assist in detecting noise-induced injury of the auditory pathway. A multiple linear regression equation to accurately predict thresholds was shown that takes into consideration all effect factors.

  2. An examination of neuromuscular and metabolic fatigue thresholds

    International Nuclear Information System (INIS)

    Bergstrom, Haley C; Housh, Terry J; Cochrane, Kristen C; Jenkins, Nathaniel D M; Lewis, Robert W Jr; Traylor, Daniel A; Schmidt, Richard J; Johnson, Glen O; Cramer, Joel T; Zuniga, Jorge M

    2013-01-01

This study examined the relationships among the physical working capacity at the fatigue threshold (PWC FT ), the power outputs associated with the gas exchange threshold (PGET) and the respiratory compensation point (PRCP), and critical power (CP) to identify possible physiological mechanisms underlying the onset of neuromuscular fatigue. Ten participants (mean ± SD age: 20 ± 1 years) performed a maximal incremental cycle ergometer test to determine the PWC FT , PGET, and PRCP. CP was determined from the 3 min all-out test. The PWC FT (197 ± 55 W), PRCP (212 ± 50 W), and CP (208 ± 63 W) were significantly greater than the PGET (168 ± 40 W), but there were no significant differences among the PWC FT , PRCP, and CP. All thresholds were significantly interrelated (r = 0.794–0.958). The 17% greater estimates for the PWC FT than PGET were likely related to differences in the physiological mechanisms that underlie these fatigue thresholds, while the non-significant difference and high correlation between the PWC FT and the PRCP suggested that hyperkalemia may underlie both thresholds. Furthermore, it is possible that the 5% lower estimate of the PWC FT than CP could more accurately reflect the demarcation of the heavy from severe exercise intensity domains. (paper)

  3. Electron Cloud Effect in the Linear Colliders

    International Nuclear Information System (INIS)

    Pivi, M

    2004-01-01

Beam induced multipacting, driven by the electric field of successive positively charged bunches, may arise from a resonant motion of electrons, generated by secondary emission, bouncing back and forth between opposite walls of the vacuum chamber. The electron-cloud effect (ECE) has been observed or is expected at many storage rings [1]. In the beam pipe of the Damping Ring (DR) of a linear collider, an electron cloud is produced initially by ionization of the residual gas and photoelectrons from the synchrotron radiation. The cloud is then sustained by secondary electron emission. This electron cloud can reach equilibrium after the passage of only a few bunches. The electron-cloud effect may be responsible for collective effects such as fast coupled-bunch and single-bunch instabilities, emittance blow-up or incoherent tune shift when the bunch current exceeds a certain threshold, accompanied by a large number of electrons in the vacuum chamber. The ECE was identified as one of the most important R and D topics in the International Linear Collider Report [2]. Systematic studies on the possible electron-cloud effect have been initiated at SLAC for the GLC/NLC and TESLA linear colliders, with particular attention to the effect in the positron main damping ring (MDR) and the positron Low Emittance Transport which includes the bunch compressor system (BCS), the main linac, and the beam delivery system (BDS). We present recent computer simulation results for the main features of the electron cloud generation in both machine designs. Thus, single and coupled-bunch instability thresholds are estimated for the GLC/NLC design

  4. A threshold in the dose-response relationship for X-ray induced somatic mutation frequency in drosophila melanogaster

    International Nuclear Information System (INIS)

    Koana, Takao; Sakai, Kazuo; Okada, M.O.

    2004-01-01

    The dose-response relationship of ionizing radiation and its stochastic effects has been thought to be linear without any thresholds for a long time. The basic data for this model was obtained from mutational assays using germ cells of male fruit fly Drosophila melanogaster. However, cancer-causing activity should be examined more appropriately in somatic cells than in germ cells. In this paper, we examined the dose-response relationship of X-ray irradiation and somatic mutation in drosophila, and found a threshold at approximately 1 Gy in the DNA repair proficient flies. In the repair deficient siblings, the threshold was smaller and the inclination of the dose-response curve was five times steeper. These results suggest that the dose-response relationship between X-ray irradiation and somatic mutation has a threshold, and that the DNA repair function contributes to its formation. (author)

  5. Introducing linear functions: an alternative statistical approach

    Science.gov (United States)

    Nolan, Caroline; Herbert, Sandra

    2015-12-01

    The introduction of linear functions is the turning point where many students decide if mathematics is useful or not. This means the role of parameters and variables in linear functions could be considered to be `threshold concepts'. There is recognition that linear functions can be taught in context through the exploration of linear modelling examples, but this has its limitations. Currently, statistical data is easily attainable, and graphics or computer algebra system (CAS) calculators are common in many classrooms. The use of this technology provides ease of access to different representations of linear functions as well as the ability to fit a least-squares line for real-life data. This means these calculators could support a possible alternative approach to the introduction of linear functions. This study compares the results of an end-of-topic test for two classes of Australian middle secondary students at a regional school to determine if such an alternative approach is feasible. In this study, test questions were grouped by concept and subjected to concept by concept analysis of the means of test results of the two classes. This analysis revealed that the students following the alternative approach demonstrated greater competence with non-standard questions.
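The least-squares fit such calculators perform is itself a compact linear-functions example. A sketch with made-up classroom data (Python; the dataset is hypothetical and chosen to be exactly linear):

```python
def least_squares_line(x, y):
    """Ordinary least-squares line y = a + b*x, the fit a graphics/CAS
    calculator produces when students model real-life data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Made-up classroom data: hours of phone use vs. remaining battery (%).
hours = [0, 1, 2, 3, 4]
charge = [100, 88, 76, 64, 52]
a, b = least_squares_line(hours, charge)  # a = 100.0, b = -12.0
```

Here the intercept a and slope b play exactly the parameter roles identified above as threshold concepts.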

  6. Thresholds in the sliding resistance of simulated basal ice

    Directory of Open Access Journals (Sweden)

    L. F. Emerson

    2007-10-01

Full Text Available We report laboratory determinations of the shear resistance to the sliding of melting ice with entrained particles over a hard, impermeable surface. With higher particle concentrations and larger particle sizes, Coulomb friction at particle-bed contacts dominates and the shear stress increases linearly with normal load. We term this the sandy regime. When either particle concentration or particle size is reduced below a threshold, the dependence of shear resistance on normal load is no longer statistically significant. We term this regime slippery. We use force and mass balance considerations to examine the flow of melt water beneath the simulated basal ice. At high particle concentrations, the transition from sandy to slippery behavior occurs when the particle size is comparable to the thickness of the melt film that separates the sliding ice from its bed. For larger particle sizes, a transition from sandy to slippery behavior occurs when the particle concentration drops sufficiently that the normal load is no longer transferred completely to the particle-bed contacts. We estimate that the melt films separating the particles from the ice are approximately 0.1 µm thick at this transition. Our laboratory results suggest the potential for abrupt transitions in the shear resistance beneath hard-bedded glaciers with changes in either the thickness of melt layers or the particle loading.

  7. Calculation of femtosecond pulse laser induced damage threshold for broadband antireflective microstructure arrays.

    Science.gov (United States)

    Jing, Xufeng; Shao, Jianda; Zhang, Junchao; Jin, Yunxia; He, Hongbo; Fan, Zhengxiu

    2009-12-21

In order to more exactly predict the femtosecond pulse laser induced damage threshold, an accurate theoretical model taking into account photoionization, avalanche ionization and decay of electrons is proposed by comparing several combined ionization models with the published experimental measurements. In addition, the transmittance property and the near-field distribution of the 'moth eye' broadband antireflective microstructure directly patterned into the substrate material as a function of the surface structure period and groove depth are computed by a rigorous Fourier model method. It is found that the near-field distribution is strongly dependent on the periodicity of the surface structure for TE polarization, but for the TM wave it is insensitive to the period. What's more, the dependence of the femtosecond pulse laser damage threshold of the surface microstructure on the pulse duration was calculated using the proposed ionization model, taking into account the local maximum electric field enhancement. For the longer incident wavelength of 1064 nm, the damage threshold shows a weak linear dependence on the pulse duration, but for the shorter incident wavelength of 532 nm there is a surprising oscillation peak of the breakdown threshold as a function of the pulse duration.

  8. Study on low-energy sputtering near the threshold energy by molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    C. Yan

    2012-09-01

Full Text Available Using molecular dynamics simulation, we have studied the low-energy sputtering at the energies near the sputtering threshold. Different projectile-target combinations of noble metal atoms (Cu, Ag, Au, Ni, Pd, and Pt) are simulated in the range of incident energy from 0.1 to 200 eV. It is found that the threshold energies for sputtering are different for the cases of M1 < M2 and M1 ≥ M2, where M1 and M2 are atomic mass of projectile and target atoms, respectively. The sputtering yields are found to have a linear dependence on the reduced incident energy, but the dependence behaviors are different for the two cases. Two new formulas are suggested to describe the energy dependences of both cases by fitting the simulation results with the determined threshold energies. With the study on the energy dependences of sticking probabilities and traces of the projectiles and recoils, we propose two different mechanisms to describe the sputtering behavior of low-energy atoms near the threshold energy for the cases of M1 < M2 and M1 ≥ M2, respectively.
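The reported linear dependence near threshold can be idealized as Y = k·(E − E_th) above threshold and zero below, with the fitted k and E_th differing between the M1 < M2 and M1 ≥ M2 cases. A hedged sketch (Python; the constants in the test are hypothetical, not the paper's fitted values):

```python
def sputtering_yield(E, E_th, k):
    """Near-threshold sputtering yield idealized from the reported linear
    dependence on the reduced incident energy:
    Y = k * (E - E_th) for E > E_th, else 0.
    Both k and E_th are fitted constants that differ between the
    M1 < M2 and M1 >= M2 projectile/target mass cases."""
    return k * (E - E_th) if E > E_th else 0.0
```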

  9. How extreme is enough to cause a threshold response of ecosystem

    Science.gov (United States)

    Niu, S.; Zhang, F.; Yang, Q.; Song, B.; Sun, J.

    2017-12-01

Precipitation is a primary determinant of terrestrial ecosystem productivity over much of the globe. Recent studies have shown asymmetric or threshold responses of ecosystem productivity to precipitation gradient. However, it's not clear how extreme is enough to cause a threshold response of ecosystem. We conducted a global meta-analysis of precipitation experiments, a site level precipitation gradient experiment, and a remote sensing data mining on the relationship between precipitation extreme vs NDVI extreme. The meta-analysis shows that ANPP, BNPP, NEE, and other carbon cycle variables, showed similar response magnitudes to either precipitation increase or decrease when precipitation levels were normalized to the medium value of treatments (40%) across all the studies. Overall, the response ratios of these variables were linearly correlated with changes in precipitation amounts and soil water content. In the field gradient study with treatments of 1/12, 1/8, 1/4, 1/2, control, and 5/4 of ambient precipitation, the threshold of NPP, SR, NEE occurred when precipitation was reduced to the level of 1/8-1/12 of ambient precipitation. This means that only extreme drought can induce a threshold response of ecosystem. The regional remote sensing data showed that climate extremes with yearly low precipitation from 1982 to 2013 rarely cause extreme responses of vegetation, further suggesting that it is very difficult to detect threshold responses to natural climatic fluctuation. Our three studies together indicate that asymmetrical responses of vegetation to precipitation are likely detected, but only in very extreme precipitation events.

  10. [Effects of radiation exposure on human body].

    Science.gov (United States)

    Kamiya, Kenji; Sasatani, Megumi

    2012-03-01

There are two types of radiation health effects: acute disorders and late-onset disorders. Acute disorders are deterministic effects whose symptoms appear after exposure above a threshold. Tissues and cells that compose the human body have different radiation sensitivities, and the symptoms appear in order, from highly radiosensitive tissues. The clinical symptoms of acute disorders begin with a decrease in lymphocytes, and then symptoms such as alopecia, skin erythema, hematopoietic damage, gastrointestinal damage and central nervous system damage appear with increasing radiation dose. Among the late-onset disorders (cancer, non-cancer disease and genetic effects), the predominant health effect is cancer. Cancer and genetic effects are recognized as stochastic effects without a threshold. When the radiation dose is equal to or more than 100 mSv, it is observed that the cancer risk from radiation exposure increases linearly with dose. On the other hand, the risk of developing cancer through low-dose radiation exposure, less than 100 mSv, has not yet been clarified scientifically. Although uncertainty still remains in low-level risk estimation, the ICRP propounds the LNT model and conducts radiation protection in accordance with it for low-dose and low-dose-rate radiation. Meanwhile, the mechanism of radiation damage has been gradually clarified. The initial event of radiation-induced disease is thought to be damage to the genome, such as radiation-induced DNA double-strand breaks. Recently, it has been clarified that our cells can recognize genome damage and induce diverse cell responses to maintain genome integrity. This phenomenon is called the DNA damage response, which induces cell-cycle arrest, DNA repair, apoptosis, cell senescence and so on. These responses act to maintain genome integrity against genome damage; however, the death of a large number of

  11. Application of NCRP REPORT No.151 for evaluating the radiation level at the ambience of megavoltage medical electron linear accelerator treatment room

    International Nuclear Information System (INIS)

    Yang Haiyou; Yu Shui

    2011-01-01

Objective: The estimation model, on radiation level at the ambience of medical electron linear accelerator treatment rooms, is derived on the basis of NCRP REPORT No. 151, which presents the calculation model of shielding design about barrier thicknesses of megavoltage medical electron linear accelerator treatment rooms. Methods: The estimation model comes from NCRP REPORT No. 151, "Structural Shielding Design and Evaluation for Megavoltage X- and Gamma-Ray Radiotherapy Facilities", which presents the calculation model of shielding design about megavoltage medical electron linear accelerator treatment rooms, and the dose rate at isocenter replaces the workload, and the occupancy factor and the use factor are forsaken, then the converse deduction is done according to barrier thicknesses of shielding materials. Ultimately, the estimation model, on radiation level at the ambience of medical electron linear accelerator treatment rooms, is derived. Results: It can be regarded as a systematic estimation model for calculating the radiation level at the ambience of medical electron linear accelerator treatment room. Conclusion: The estimation model has certain practical value to evaluate the radiation level at the ambience of medical electron linear accelerator treatment room. (authors)
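The converse deduction described above can be sketched with the standard tenth-value-layer (TVL) transmission relation B = 10^(−t/TVL) and inverse-square fall-off from the isocenter (Python; the TVL value and dose rates in the example are hypothetical, and the workload, use and occupancy factors are omitted as in the abstract):

```python
def ambient_dose_rate(dose_rate_iso, distance_m, thickness_cm, tvl_cm):
    """Rough NCRP-151-style estimate of the dose rate behind a barrier:
    inverse-square fall-off of the isocenter dose rate (assumed here to
    be specified at 1 m from the target) times the barrier transmission
    B = 10**(-thickness/TVL). The tenth-value layer (TVL) depends on
    beam energy and barrier material; workload, use and occupancy
    factors are deliberately dropped, per the abstract's method."""
    transmission = 10.0 ** (-thickness_cm / tvl_cm)
    return dose_rate_iso * transmission / distance_m ** 2
```

For example, three TVLs of barrier at 5 m attenuate an isocenter dose rate of 1000 (arbitrary units) to 1000 × 10⁻³ / 25 = 0.04.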

  12. Climate change and critical thresholds in China's food security

    International Nuclear Information System (INIS)

    Xiong, Wei; Lin, Erda; Ju, Hui; Xu, Yinlong

    2007-01-01

    Identification of 'critical thresholds' of temperature increase is essential to inform policy decisions on establishing greenhouse gas (GHG) emission targets. We use the A2 (medium-high GHG emission pathway) and B2 (medium-low) climate change scenarios produced by the regional climate model PRECIS, the CERES crop model, and the socio-economic scenarios described by the IPCC SRES to simulate average per-hectare yield changes of three main grain crops (rice, wheat, and maize) at a 50 km x 50 km scale. The threshold of food production with respect to temperature increase was analyzed from the relationship between yield changes and temperature rise, and food security was then discussed for each IPCC SRES scenario. The results show that without the CO2 fertilization effect, per-hectare yields of the three crops would fall consistently as temperature rises beyond 2.5°C; with the CO2 fertilization effect included in the simulation, there were no adverse impacts on China's food production over the projected range of temperature rise (0.9-3.9°C), and no critical threshold of temperature increase was found for food production. When the socio-economic scenarios, agricultural technology development and international trade were incorporated in the analysis, without CO2 fertilization China's internal food production would meet the critical threshold of basic demand (300 kg/capita) under B2 but not under A2; with CO2 fertilization, basic food demand would be satisfied under both A2 and B2, and under B2 production would even meet the higher food-demand threshold required to sustain economic growth (400 kg/capita)

  13. World high background natural radiation areas: Need to protect public from radiation exposure

    International Nuclear Information System (INIS)

    Sohrabi, Mehdi

    2013-01-01

    Highlights of findings on radiological measurements, radiobiological and epidemiological studies in some main world high background natural radiation (HBNR) areas such as in Brazil, China, India and Iran are presented and discussed with special regard to remediation of radiation exposure of inhabitants in such areas. The current radiation protection philosophy and recommendations applied to workers and public from operation of radiation and nuclear applications are based on the linear non-threshold (LNT) model. The inhabitants of HBNR and radon prone areas receive relatively high radiation doses. Therefore, according to the LNT concept, the inhabitants in HBNR areas and in particular those in Ramsar are considered at risk and their exposure should be regulated. The HBNR areas in the world have different conditions in terms of dose and population. In particular, the inhabitants in HBNR areas of Ramsar receive very high internal and external exposures. This author believes that the public in such areas should be protected and proposes a plan to remedy high exposure of the inhabitants of the HBNR areas of Ramsar, while maintaining these areas as they stand to establish a national environmental radioactivity park which can be provisionally called “Ramsar Research Natural Radioactivity Park” (RRNRP). The major HBNR areas, the public exposure and the need to remedy exposures of inhabitants are reviewed and discussed. - Highlights: ► Highlights of findings on studies in HBNR areas are reviewed and discussed. ► The need to protect HBNR area inhabitants and remedy public exposure is emphasized. ► A collective approach is proposed to remedy exposure of Ramsar HBNR area inhabitants. ► Relocation of HBNR area inhabitants and establishing a park at the location is proposed. ► The advantages and disadvantages of the methods are discussed and recommendations are made

  14. Treatment of threshold retinopathy of prematurity

    Directory of Open Access Journals (Sweden)

    Deshpande Dhanashree

    1998-01-01

    Full Text Available This report deals with our experience in the management of threshold retinopathy of prematurity (ROP). A total of 45 eyes of 23 infants were treated for threshold ROP; 26.1% of these infants had a birth weight of >1,500 gm. The preferred modality of treatment was laser indirect photocoagulation, facilitated by scleral depression. Cryopexy was done in cases with nondilating pupils or media haze, and was always performed under general anaesthesia. Retreatment with either modality, covering the skip areas, was needed in 42.2% of eyes. Total regression of disease was achieved in 91.1% of eyes with no sequelae. All 4 eyes that progressed to stage 5 despite treatment had zone 1 disease. No major treatment-induced complications occurred in this series. This study underscores the importance of routine screening for ROP in infants up to 2,000 gm birth weight, and the excellent response achieved with laser photocoagulation in inducing regression of threshold ROP. Laser is the preferred method of treatment in view of the absence of treatment-related morbidity in premature infants.

  15. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generating and verifying threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation were given, which could reduce the level of counterfeit electronic documents signed by a group of users.
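The Lagrange-interpolation construction mentioned in this abstract underlies Shamir-style (t, n) threshold schemes: the secret is the constant term of a degree-(t-1) polynomial over a prime field, and any t shares reconstruct it while fewer reveal nothing. A minimal sketch, with a toy prime and fixed coefficients for determinism (real schemes draw the coefficients at random, and threshold *signatures* add considerably more machinery on top):

```python
P = 2087  # small prime, for illustration only


def make_shares(secret, t, n, coeffs):
    """Shares (x, f(x)) of f(x) = secret + c1*x + ... + c_{t-1}*x^(t-1) mod P.

    Coefficients are passed in here for determinism; a real scheme
    draws them uniformly at random from the field.
    """
    poly = [secret] + coeffs[:t - 1]
    return [(x, sum(c * x ** k for k, c in enumerate(poly)) % P)
            for x in range(1, n + 1)]


def reconstruct(shares):
    """Lagrange interpolation evaluated at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret


shares = make_shares(1234, t=3, n=5, coeffs=[166, 94])
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 1234
```

The modular inverse via three-argument `pow` requires Python 3.8+. Threshold signature schemes replace "reconstruct the secret" with "combine partial signatures" so the key itself is never assembled in one place.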

  16. Thresholds in radiobiology

    International Nuclear Information System (INIS)

    Katz, R.; Hofmann, W.

    1982-01-01

    Interpretations of biological radiation effects frequently use the word 'threshold'. The meaning of this word is explored together with its relationship to the fundamental character of radiation effects and to the question of perception. It is emphasised that although the existence of either a dose or an LET threshold can never be settled by experimental radiobiological investigations, it may be argued on fundamental statistical grounds that for all statistical processes, and especially where the number of observed events is small, the concept of a threshold is logically invalid. (U.K.)

  17. ‘Soglitude’- introducing a method of thinking thresholds

    Directory of Open Access Journals (Sweden)

    Tatjana Barazon

    2010-04-01

    Full Text Available ‘Soglitude’ is an invitation to acknowledge the existence of thresholds in thought. A threshold in thought designates the indetermination, the passage, the evolution of every state the world is in. Between the creation we add to the world and the objectivity we suppose of it lies our perceptive threshold. No state is ever permanent, and in order to stress the temporary, fluid character of the world and of our perception of it, we introduce a method suited to thinking change and transformation, one that acknowledges our own threshold nature. The contributions gathered in this special issue come from various disciplines: anthropology, philosophy, critical theory, film studies, political science, literature and history. The variety of these insights shows the resonance of the idea of the threshold in every category of thought. We hope to enlarge the notion in further issues on physics and chemistry, as well as mathematics. The articles in this issue introduce the method of threshold thinking by showing the importance of the in-between, of the changing of perspective, in their respective domains. The ‘Documents’ section, named INTERSTICES, includes a selection of poems, two essays, a philosophical-artistic project called ‘infraphysique’, a performance on thresholds in the soul, and a dialogue with Israel Rosenfield. This issue presents a kaleidoscope of possible threshold thinking and hopes to initiate new ways of looking at things. For every change that occurs in reality there is a subjective counterpart in our perception, and this needs to be acknowledged as such. What we name objective is reflected in our own personal perception in its own personal manner, in such a way that the objectivity of an event might altogether be questioned. The absolute point of view, the view from “nowhere”, could well be the projection that causes dogmatism. By introducing the method of thinking thresholds into a system, be it

  18. Ecosystem impacts of hypoxia: thresholds of hypoxia and pathways to recovery

    International Nuclear Information System (INIS)

    Steckbauer, A; Duarte, C M; Vaquer-Sunyer, R; Carstensen, J; Conley, D J

    2011-01-01

    Coastal hypoxia is increasing in the global coastal zone, where it is recognized as a major threat to biota. Managerial efforts to prevent hypoxia and achieve recovery of ecosystems already affected by hypoxia are largely based on nutrient reduction plans. However, these managerial efforts need to be informed by predictions on the thresholds of hypoxia (i.e. the oxygen levels required to conserve biodiversity) as well as the timescales for the recovery of ecosystems already affected by hypoxia. The thresholds for hypoxia in coastal ecosystems are higher than previously thought and are not static, but regulated by local and global processes, being particularly sensitive to warming. The examination of recovery processes in a number of coastal areas managed for reducing nutrient inputs and, thus, hypoxia (Northern Adriatic; Black Sea; Baltic Sea; Delaware Bay; and Danish Coastal Areas) reveals that recovery timescales following the return to normal oxygen conditions are much longer than those of loss following the onset of hypoxia, and typically involve decadal timescales. The extended lag time for ecosystem recovery from hypoxia results in non-linear pathways of recovery due to hysteresis and the shift in baselines, affecting the oxygen thresholds for hypoxia through time.

  19. Linear collider: a preview

    Energy Technology Data Exchange (ETDEWEB)

    Wiedemann, H.

    1981-11-01

    Since no linear colliders have been built yet it is difficult to know at what energy the linear cost scaling of linear colliders drops below the quadratic scaling of storage rings. There is, however, no doubt that a linear collider facility for a center of mass energy above say 500 GeV is significantly cheaper than an equivalent storage ring. In order to make the linear collider principle feasible at very high energies a number of problems have to be solved. There are two kinds of problems: one which is related to the feasibility of the principle and the other kind of problems is associated with minimizing the cost of constructing and operating such a facility. This lecture series describes the problems and possible solutions. Since the real test of a principle requires the construction of a prototype I will in the last chapter describe the SLC project at the Stanford Linear Accelerator Center.

  20. Linear collider: a preview

    International Nuclear Information System (INIS)

    Wiedemann, H.

    1981-11-01

    Since no linear colliders have been built yet it is difficult to know at what energy the linear cost scaling of linear colliders drops below the quadratic scaling of storage rings. There is, however, no doubt that a linear collider facility for a center of mass energy above say 500 GeV is significantly cheaper than an equivalent storage ring. In order to make the linear collider principle feasible at very high energies a number of problems have to be solved. There are two kinds of problems: one which is related to the feasibility of the principle and the other kind of problems is associated with minimizing the cost of constructing and operating such a facility. This lecture series describes the problems and possible solutions. Since the real test of a principle requires the construction of a prototype I will in the last chapter describe the SLC project at the Stanford Linear Accelerator Center

  1. Possible factors determining the non-linearity in the VO2-power output relationship in humans: theoretical studies.

    Science.gov (United States)

    Korzeniewski, Bernard; Zoladz, Jerzy A

    2003-08-01

    At low power output exercise (below lactate threshold), the oxygen uptake increases linearly with power output, but at high power output exercise (above lactate threshold) some additional oxygen consumption causes a non-linearity in the overall VO(2) (oxygen uptake rate)-power output relationship. The functional significance of this phenomenon for human exercise tolerance is very important, but the mechanisms underlying it remain unknown. In the present work, a computer model of oxidative phosphorylation in intact skeletal muscle developed previously is used to examine the background of this relationship in different modes of exercise. Our simulations demonstrate that the non-linearity in the VO(2)-power output relationship and the difference in the magnitude of this non-linearity between incremental exercise mode and square-wave exercise mode (constant power output exercise) can be generated by introducing into the model some hypothetical factor F (group of associated factors) that accumulate(s) in time during exercise. The performed computer simulations, based on this assumption, give proper time courses of changes in VO(2) and [PCr] after an onset of work of different intensities, including the slow component in VO(2), well matching the experimental results. Moreover, if it is assumed that the exercise terminates because of fatigue when the amount/intensity of F exceed some threshold value, the model allows the generation of a proper shape of the well-known power-duration curve. This fact suggests that the phenomenon of the non-linearity of the VO(2)-power output relationship and the magnitude of this non-linearity in different modes of exercise is determined by some factor(s) responsible for muscle fatigue.

  2. Menstrual Disruption with Exercise Is Not Linked to an Energy Availability Threshold.

    Science.gov (United States)

    Lieberman, Jay L; DE Souza, Mary Jane; Wagstaff, David A; Williams, Nancy I

    2018-03-01

    Chronic reductions in energy availability (EA) suppress reproductive function. A particular calculation of EA quantifies the dietary energy remaining after exercise for all physiological functions. Reductions in luteinizing hormone pulse frequency have been demonstrated when EA by this calculation is <30 kcal/kg fat-free mass (FFM)/day. We determined whether menstrual disturbances (MD) are induced when EA is <30 kcal/kg FFM/day. Thirty-five sedentary, ovulatory women aged 18 to 24 yr (weight, 59.0 ± 0.8 kg; body mass index, 21.8 ± 0.4 kg/m2) completed a diet and exercise intervention over three menstrual cycles. Participants were randomized to groups that varied in the magnitude of negative energy balance created by the combination of exercise and energy restriction. Menstrual disturbances were determined using daily urinary estrone-1-glucuronide and pregnanediol glucuronide, midcycle luteinizing hormone, and menstrual calendars. In a secondary analysis, we calculated EA from energy balance data and tested the association of EA with MD. A generalized linear mixed-effects model showed that the likelihood of a MD decreased by 9% for each unit increase in EA (odds ratio, 0.91; 95% confidence interval, 0.84-0.98; P = 0.010). No specific value of EA emerged as a threshold below which MD were induced. When participants were partitioned into EA tertile groups (low EA, 23.4-34.1; n = 11; moderate EA, 34.9-40.7; n = 12; and high EA, 41.2-50.1; n = 12 [kcal/kg FFM/day]), estrone-1-glucuronide (P < 0.001), pregnanediol glucuronide (P < 0.001), and luteal phase length (P = 0.031) decreased significantly, independent of tertile. These findings do not support the existence of an EA threshold below which MD are induced, but do suggest that MD increase linearly as EA decreases. Menstrual disturbances can likely be prevented by monitoring EA using a simplified assessment of metabolic status.
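The EA calculation this study refers to is dietary energy intake minus exercise energy expenditure, normalized to fat-free mass. A minimal sketch with hypothetical numbers (the function and all values are illustrative, not the study's data):

```python
def energy_availability(intake_kcal, exercise_kcal, ffm_kg):
    """Energy availability: dietary energy left after exercise, per kg of
    fat-free mass per day (kcal/kg FFM/day)."""
    return (intake_kcal - exercise_kcal) / ffm_kg

# Hypothetical day: 2000 kcal intake, 600 kcal exercise expenditure,
# 46 kg fat-free mass.
ea = energy_availability(2000, 600, 46)
print(round(ea, 1))  # ~30.4, right around the proposed 30 kcal/kg FFM/day cutoff
```

The study's point is that risk of menstrual disturbance changes continuously with this quantity rather than flipping at the 30 kcal/kg FFM/day value.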

  3. Intrinsic suppression of turbulence in linear plasma devices

    Science.gov (United States)

    Leddy, J.; Dudson, B.

    2017-12-01

    Plasma turbulence is the dominant transport mechanism for heat and particles in magnetised plasmas in linear devices and tokamaks, so the study of turbulence is important for limiting and controlling this transport. Linear devices provide an axial magnetic field that confines a plasma in cylindrical geometry as it travels along the magnetic field from the source to the strike point. Due to perpendicular transport, the plasma density and temperature have roughly Gaussian radial profiles with gradients that drive instabilities, such as resistive drift-waves and Kelvin-Helmholtz. If unstable, these instabilities cause perturbations to grow, resulting in saturated turbulence that increases the cross-field transport of heat and particles. When the plasma emerges from the source, there is a time, τ∥, that describes the lifetime of the plasma based on the parallel velocity and the length of the device. As the plasma moves down the device, it also moves azimuthally according to the E × B and diamagnetic velocities. A balance point between these parallel and perpendicular times sets the stabilisation threshold. We simulate plasmas with a variety of parallel lengths and magnetic fields to vary the parallel and perpendicular lifetimes, respectively, and find a clear correlation between the saturated RMS density perturbation level and the balance between these lifetimes. The threshold of marginal stability is seen to lie where τ∥ ≈ 11τ⊥. This is also associated with the product τ∥γ*, where γ* is the drift-wave linear growth rate, indicating that the instability must exist for roughly 100 times the growth time in order to enter the nonlinear growth phase. We explore the root of this correlation and the implications for linear device design.
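The marginal-stability criterion reported here, τ∥ ≈ 11τ⊥, can be turned into a simple design check. The sketch below assumes τ∥ = L/v∥ for a device of length L and parallel flow speed v∥; the function name and all numerical values are illustrative, not taken from the paper's simulations.

```python
def is_turbulence_suppressed(length_m, v_parallel_m_s, tau_perp_s,
                             ratio_threshold=11.0):
    """Check the empirical stability criterion from the abstract.

    tau_parallel = L / v_parallel is the plasma lifetime along the
    device; turbulence is intrinsically suppressed when the plasma is
    exhausted before instabilities can saturate, i.e. when
    tau_parallel < ~11 * tau_perp.
    """
    tau_parallel = length_m / v_parallel_m_s
    return tau_parallel < ratio_threshold * tau_perp_s

# Illustrative: a short device (small tau_parallel) sits below the
# threshold and suppresses turbulence; a 10x longer one does not.
print(is_turbulence_suppressed(2.0, 2.0e4, 2.0e-5))   # tau_par = 1e-4 s
print(is_turbulence_suppressed(20.0, 2.0e4, 2.0e-5))  # tau_par = 1e-3 s
```

The same criterion, via τ∥γ* ≈ 100, says the instability needs roughly a hundred linear growth times before it reaches the nonlinear phase.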

  4. No evidence of a threshold in traffic volume affecting road-kill mortality at a large spatio-temporal scale

    Energy Technology Data Exchange (ETDEWEB)

    Grilo, Clara, E-mail: clarabentesgrilo@gmail.com [Departamento de Biología de la Conservación, Estación Biológica de Doñana (EBD-CSIC), Calle Américo Vespucio s/n, E-41092 Sevilla (Spain); Centro Brasileiro de Estudos em Ecologia de Estradas, Departamento de Biologia, Universidade Federal de Lavras, Campus Universitário, 37200-000 Lavras, Minas Gerais (Brazil); Ferreira, Flavio Zanchetta; Revilla, Eloy [Departamento de Biología de la Conservación, Estación Biológica de Doñana (EBD-CSIC), Calle Américo Vespucio s/n, E-41092 Sevilla (Spain)

    2015-11-15

    Previous studies have found that the relationship between wildlife road mortality and traffic volume follows a threshold effect on low traffic volume roads. We aimed at evaluating the response of several species to increasing traffic intensity on highways over a large geographic area and temporal period. We used data of four terrestrial vertebrate species with different biological and ecological features known by their high road-kill rates: the barn owl (Tyto alba), hedgehog (Erinaceus europaeus), red fox (Vulpes vulpes) and European rabbit (Oryctolagus cuniculus). Additionally, we checked whether road-kill likelihood varies when traffic patterns depart from the average. We used annual average daily traffic (AADT) and road-kill records observed along 1000 km of highways in Portugal over seven consecutive years (2003–2009). We fitted candidate models using Generalized Linear Models with a binomial distribution through a sample unit of 1 km segments to describe the effect of traffic on the probability of finding at least one victim in each segment during the study. We also assigned for each road-kill record the traffic of that day and the AADT on that year to test for differences using Paired Student's t-test. Mortality risk declined significantly with traffic volume but varied among species: the probability of finding road-killed red foxes and rabbits occurs up to moderate traffic volumes (< 20,000 AADT) whereas barn owls and hedgehogs occurred up to higher traffic volumes (40,000 AADT). Perception of risk may explain differences in responses towards high traffic highway segments. Road-kill rates did not vary significantly when traffic intensity departed from the average. In summary, we did not find evidence of traffic thresholds for the analysed species and traffic intensities. We suggest mitigation measures to reduce mortality be applied in particular on low traffic roads (< 5000 AADT) while additional measures to reduce barrier effects should take into
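The binomial GLM used in this study models, per 1-km segment, the probability of recording at least one road-kill as a logistic function of covariates such as AADT. A minimal sketch with made-up coefficients, chosen only to reproduce the qualitative decline in mortality risk with traffic volume reported for red fox (the function and both coefficients are assumptions, not fitted values from the paper):

```python
import math


def p_roadkill(aadt, b0=-0.5, b1=-1.2e-4):
    """Binomial GLM with a logit link: probability that a 1-km highway
    segment records at least one road-kill, as a function of annual
    average daily traffic (AADT). Coefficients are illustrative only."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * aadt)))


# Probability declines smoothly with traffic volume -- no threshold.
for aadt in (5_000, 20_000, 40_000):
    print(aadt, round(p_roadkill(aadt), 3))
```

A smooth monotonic decline like this is exactly what the authors report instead of the threshold effect found in earlier low-traffic studies: higher traffic deters crossings (or removes animals), so per-segment kill probability falls rather than plateauing at a cutoff.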

  5. No evidence of a threshold in traffic volume affecting road-kill mortality at a large spatio-temporal scale

    International Nuclear Information System (INIS)

    Grilo, Clara; Ferreira, Flavio Zanchetta; Revilla, Eloy

    2015-01-01

    Previous studies have found that the relationship between wildlife road mortality and traffic volume follows a threshold effect on low traffic volume roads. We aimed at evaluating the response of several species to increasing traffic intensity on highways over a large geographic area and temporal period. We used data of four terrestrial vertebrate species with different biological and ecological features known by their high road-kill rates: the barn owl (Tyto alba), hedgehog (Erinaceus europaeus), red fox (Vulpes vulpes) and European rabbit (Oryctolagus cuniculus). Additionally, we checked whether road-kill likelihood varies when traffic patterns depart from the average. We used annual average daily traffic (AADT) and road-kill records observed along 1000 km of highways in Portugal over seven consecutive years (2003–2009). We fitted candidate models using Generalized Linear Models with a binomial distribution through a sample unit of 1 km segments to describe the effect of traffic on the probability of finding at least one victim in each segment during the study. We also assigned for each road-kill record the traffic of that day and the AADT on that year to test for differences using Paired Student's t-test. Mortality risk declined significantly with traffic volume but varied among species: the probability of finding road-killed red foxes and rabbits occurs up to moderate traffic volumes (< 20,000 AADT) whereas barn owls and hedgehogs occurred up to higher traffic volumes (40,000 AADT). Perception of risk may explain differences in responses towards high traffic highway segments. Road-kill rates did not vary significantly when traffic intensity departed from the average. In summary, we did not find evidence of traffic thresholds for the analysed species and traffic intensities. We suggest mitigation measures to reduce mortality be applied in particular on low traffic roads (< 5000 AADT) while additional measures to reduce barrier effects should take into

  6. Mitigating of modal instabilities in linearly-polarized fiber amplifiers by shifting pump wavelength

    International Nuclear Information System (INIS)

    Tao, Rumao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Liu, Zejin

    2015-01-01

    We investigated the effect of pump wavelength on modal instabilities (MI) in high-power linearly polarized Yb-doped fiber amplifiers. We built a novel semi-analytical model to determine the frequency-coupling characteristics and power threshold of MI, which indicates promising MI suppression through pumping at an appropriate wavelength. By pumping at 915 nm, the threshold can be enhanced by a factor of 2.1 relative to pumping at 976 nm. Based on a high-power linearly polarized fiber amplifier platform, we studied the influence of pump wavelength experimentally. A maximal enhancement factor of 1.9 was achieved when pumping at 915 nm, which agrees with the theoretical calculation and verifies our model. Furthermore, we show that MI suppression by detuning the pump wavelength is weakened for fiber with a large core-to-cladding ratio. (paper)

  7. Shifts in the relationship between motor unit recruitment thresholds versus derecruitment thresholds during fatigue.

    Science.gov (United States)

    Stock, Matt S; Mota, Jacob A

    2017-12-01

    Muscle fatigue is associated with diminished twitch force amplitude. We examined changes in the motor unit recruitment versus derecruitment threshold relationship during fatigue. Nine men (mean age = 26 years) performed repeated isometric contractions at 50% maximal voluntary contraction (MVC) knee extensor force until exhaustion. Surface electromyographic signals were detected from the vastus lateralis, and were decomposed into their constituent motor unit action potential trains. Motor unit recruitment and derecruitment thresholds and firing rates at recruitment and derecruitment were evaluated at the beginning, middle, and end of the protocol. On average, 15 motor units were studied per contraction. For the initial contraction, three subjects showed greater recruitment thresholds than derecruitment thresholds for all motor units. Five subjects showed greater recruitment thresholds than derecruitment thresholds for only low-threshold motor units at the beginning, with a mean cross-over of 31.6% MVC. As the muscle fatigued, many motor units were derecruited at progressively higher forces. In turn, decreased slopes and increased y-intercepts were observed. These shifts were complemented by increased firing rates at derecruitment relative to recruitment. As the vastus lateralis fatigued, the central nervous system's compensatory adjustments resulted in a shift of the regression line of the recruitment versus derecruitment threshold relationship. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. The relationship between intelligence and creativity: New support for the threshold hypothesis by means of empirical breakpoint detection

    Science.gov (United States)

    Jauk, Emanuel; Benedek, Mathias; Dunst, Beate; Neubauer, Aljoscha C.

    2013-01-01

    The relationship between intelligence and creativity has been subject to empirical research for decades. Nevertheless, there is yet no consensus on how these constructs are related. One of the most prominent notions concerning the interplay between intelligence and creativity is the threshold hypothesis, which assumes that above-average intelligence represents a necessary condition for high-level creativity. While earlier research mostly supported the threshold hypothesis, it has come under fire in recent investigations. The threshold hypothesis is commonly investigated by splitting a sample at a given threshold (e.g., at 120 IQ points) and estimating separate correlations for lower and upper IQ ranges. However, there is no compelling reason why the threshold should be fixed at an IQ of 120, and to date, no attempts have been made to detect the threshold empirically. Therefore, this study examined the relationship between intelligence and different indicators of creative potential and of creative achievement by means of segmented regression analysis in a sample of 297 participants. Segmented regression allows for the detection of a threshold in continuous data by means of iterative computational algorithms. We found thresholds only for measures of creative potential but not for creative achievement. For the former the thresholds varied as a function of criteria: When investigating a liberal criterion of ideational originality (i.e., two original ideas), a threshold was detected at around 100 IQ points. In contrast, a threshold of 120 IQ points emerged when the criterion was more demanding (i.e., many original ideas). Moreover, an IQ of around 85 IQ points was found to form the threshold for a purely quantitative measure of creative potential (i.e., ideational fluency). These results confirm the threshold hypothesis for qualitative indicators of creative potential and may explain some of the observed discrepancies in previous research. In addition, we obtained
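Segmented regression of the kind used in this study can be emulated with an exhaustive breakpoint search: for each candidate split point, fit ordinary least-squares lines to the two halves and keep the split with the smallest total squared error. A minimal pure-Python sketch on synthetic data (the data, the `min_seg` choice, and the function names are assumptions for illustration, not the study's procedure or sample):

```python
def fit_line(pts):
    """Ordinary least squares through (x, y) points: slope, intercept, SSE."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    b = (sum((x - mx) * (y - my) for x, y in pts) / sxx) if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in pts)
    return b, a, sse


def find_breakpoint(pts, min_seg=3):
    """Exhaustive breakpoint search: fit two lines per candidate split and
    keep the split whose combined SSE is smallest."""
    pts = sorted(pts)
    best = None
    for i in range(min_seg, len(pts) - min_seg + 1):
        sse = fit_line(pts[:i])[2] + fit_line(pts[i:])[2]
        if best is None or sse < best[0]:
            best = (sse, pts[i][0])
    return best[1]


# Synthetic scores: the outcome rises with IQ up to 120, then flattens,
# mimicking a threshold relationship.
data = [(iq, iq if iq < 120 else 120) for iq in range(100, 141, 5)]
print(find_breakpoint(data))  # recovers the breakpoint at 120
```

Production implementations add significance testing and confidence intervals for the breakpoint, but the core idea is this error-minimizing search over candidate thresholds rather than a threshold fixed a priori at IQ 120.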

  9. A Fast Method for Measuring Psychophysical Thresholds Across the Cochlear Implant Array

    Directory of Open Access Journals (Sweden)

    Julie A. Bierer

    2015-02-01

    Full Text Available A rapid threshold measurement procedure, based on Bekesy tracking, is proposed and evaluated for use with cochlear implants (CIs. Fifteen postlingually deafened adult CI users participated. Absolute thresholds for 200-ms trains of biphasic pulses were measured using the new tracking procedure and were compared with thresholds obtained with a traditional forced-choice adaptive procedure under both monopolar and quadrupolar stimulation. Virtual spectral sweeps across the electrode array were implemented in the tracking procedure via current steering, which divides the current between two adjacent electrodes and varies the proportion of current directed to each electrode. Overall, no systematic differences were found between threshold estimates with the new channel sweep procedure and estimates using the adaptive forced-choice procedure. Test–retest reliability for the thresholds from the sweep procedure was somewhat poorer than for thresholds from the forced-choice procedure. However, the new method was about 4 times faster for the same number of repetitions. Overall the reliability and speed of the new tracking procedure provides it with the potential to estimate thresholds in a clinical setting. Rapid methods for estimating thresholds could be of particular clinical importance in combination with focused stimulation techniques that result in larger threshold variations between electrodes.
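Bekesy tracking, on which the proposed sweep procedure is based, lowers the stimulus level while the listener detects it and raises the level while they do not, then estimates threshold from the levels at which the direction reverses. A minimal simulation with an assumed noisy ideal-observer response model (all parameters, and the observer model itself, are illustrative assumptions, not the paper's implementation, which additionally sweeps across electrodes via current steering):

```python
import random


def bekesy_track(true_threshold, step=1.0, n_reversals=8, start=20.0, seed=1):
    """Simulated Bekesy tracking.

    The level decreases while the simulated listener detects the
    stimulus and increases while they do not; the threshold estimate is
    the mean of the levels at which the track direction reverses.
    """
    rng = random.Random(seed)
    level, direction = start, -1  # start well above threshold, descending
    reversals = []
    while len(reversals) < n_reversals:
        # Assumed observer: detects when level exceeds threshold, with
        # a little Gaussian response noise.
        heard = level + rng.gauss(0, 0.5) > true_threshold
        new_direction = -1 if heard else 1
        if new_direction != direction:
            reversals.append(level)
            direction = new_direction
        level += direction * step

    return sum(reversals) / len(reversals)


print(bekesy_track(10.0))  # estimate oscillates around the true threshold of 10
```

Because the track spends almost all its time bracketing the threshold, only a handful of reversals is needed, which is why tracking methods are several times faster than forced-choice adaptive staircases for the same number of stimulus presentations.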

  10. Tactile, thermal, and electrical thresholds in patients with and without phantom limb pain after traumatic lower limb amputation

    Directory of Open Access Journals (Sweden)

    Li S

    2015-04-01

    Full Text Available Shengai Li,1,2 Danielle H Melton,1,2 Sheng Li1,2 1Department of Physical Medicine and Rehabilitation, University of Texas Health Science Center at Houston, Houston, TX, USA; 2Neurorehabilitation Research Laboratory, TIRR Memorial Hermann Research Center, Houston, TX, USA Purpose: To examine whether there is central sensitization in patients with phantom limb pain (PLP) after traumatic limb amputation. Methods: Seventeen patients with unilateral lower limb amputation secondary to trauma were enrolled. Ten patients had chronic PLP, while the other seven had no PLP. Tactile-sensation threshold, cold- and warm-sensation thresholds, cold- and heat-pain thresholds, electrical-sensation threshold (EST), and electrical-pain threshold on the distal residual limb and the symmetrical site on the sound limb were measured in all tested patients. Thresholds were compared within the PLP and non-PLP groups, and between the groups. Results: The novel findings were: (1) electrical-pain threshold was decreased only in the sound limb in the PLP group, with no difference between the two limbs in the non-PLP group, suggesting central sensitization in patients with PLP; and (2) EST was increased on the affected limb compared with the sound limb within the PLP group, but there were no significant differences in EST between the PLP and non-PLP groups. There were in general no significant differences in the other tested thresholds within or between groups. Conclusion: Our results demonstrate central sensitization in patients with PLP after traumatic limb amputation. Keywords: central sensitization, pain threshold, human

  11. Thresholds of parametric instabilities near the lower hybrid frequency

    International Nuclear Information System (INIS)

    Berger, R.L.; Perkins, F.W.

    1975-06-01

Resonant decay instabilities of a pump wave with frequency ω₀ near the lower-hybrid frequency ω_LH are analyzed with respect to the wavenumber k of the decay waves and the ratio ω₀/ω_LH to determine the decay process with the minimum threshold. It was found that the lowest thresholds are for decay into an electron plasma (lower hybrid) wave plus either a backward ion-cyclotron wave, an ion Bernstein wave, or a low-frequency sound wave. For ω₀ less than √2·ω_LH, it was found that these decay processes can occur and have faster growth than ion quasimodes provided the drift velocity (cE₀/B₀) is much less than the sound speed. In many cases of interest, electromagnetic corrections to the lower-hybrid wave rule out decay into all but short-wavelength (kρᵢ > 1) waves. The experimental results are consistent with the linear theory of parametric instabilities in a homogeneous plasma. (U.S.)

  12. Threshold factorization redux

    Science.gov (United States)

    Chay, Junegone; Kim, Chul

    2018-05-01

    We reanalyze the factorization theorems for the Drell-Yan process and for deep inelastic scattering near threshold, as constructed in the framework of the soft-collinear effective theory (SCET), from a new, consistent perspective. In order to formulate the factorization near threshold in SCET, we should include an additional degree of freedom with small energy, collinear to the beam direction. The corresponding collinear-soft mode is included to describe the parton distribution function (PDF) near threshold. The soft function is modified by subtracting the contribution of the collinear-soft modes in order to avoid double counting on the overlap region. As a result, the proper soft function becomes infrared finite, and all the factorized parts are free of rapidity divergence. Furthermore, the separation of the relevant scales in each factorized part becomes manifest. We apply the same idea to the dihadron production in e+e- annihilation near threshold, and show that the resultant soft function is also free of infrared and rapidity divergences.

  13. Effects of whole body vibration on motor unit recruitment and threshold.

    Science.gov (United States)

    Pollock, Ross D; Woledge, Roger C; Martin, Finbarr C; Newham, Di J

    2012-02-01

Whole body vibration (WBV) has been suggested to elicit reflex muscle contractions but this has never been verified. We recorded from 32 single motor units (MUs) in the vastus lateralis of 7 healthy subjects (34 ± 15.4 yr) during five 1-min bouts of WBV (30 Hz, 3 mm peak to peak), and the vibration waveform was also recorded. Recruitment thresholds were recorded from 38 MUs before and after WBV. The phase angle distribution of all MUs during WBV was significantly nonuniform. The change in recruitment threshold after WBV depended on the initial recruitment threshold: the lowest-threshold MUs increased their recruitment threshold (P = 0.008) while reductions were observed in the higher-threshold units (P = 0.031). We investigated one possible cause of the changed thresholds. Presynaptic inhibition in the soleus was measured in 8 healthy subjects (29 ± 4.6 yr). A total of 30 H-reflexes (stimulation intensity 30% Mmax) were recorded before and after WBV: 15 conditioned by prior stimulation (60 ms) of the antagonist and 15 unconditioned. There were no significant changes in the relationship between the conditioned and unconditioned responses. The consistent phase angle at which each MU fired during WBV indicates the presence of reflex muscle activity similar to the tonic vibration reflex. The varying response in high- and low-threshold MUs may be due to the different contributions of the mono- and polysynaptic pathways but not presynaptic inhibition.

  14. Comparison between intensity-duration thresholds and cumulative rainfall thresholds for the forecasting of landslides

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km². The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km² where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively unexplored research topic.
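The SIGMA idea (multiples of the standard deviation of the rainfall record used as anomaly thresholds) can be sketched as below; the function names and toy rainfall record are illustrative assumptions, not the published implementation:

```python
import statistics

def sigma_thresholds(rain_series, multiples=(1, 2, 3)):
    """Thresholds at mean + k*sigma of a historical rainfall record,
    one per standard-deviation multiple k (sketch of the SIGMA idea)."""
    mu = statistics.mean(rain_series)
    sigma = statistics.stdev(rain_series)   # sample standard deviation
    return {k: mu + k * sigma for k in multiples}

def exceeded(value, thresholds):
    """Largest sigma multiple exceeded by `value` (0 if none):
    the higher the multiple, the more 'extraordinary' the rainfall."""
    return max((k for k, t in thresholds.items() if value > t), default=0)

history = [10, 12, 8, 11, 9, 13, 10, 12, 9, 11]   # mm per event, toy record
th = sigma_thresholds(history)                     # mean 10.5, sigma ~1.58
```

A new rainfall total is then classified by the highest threshold it crosses; for example `exceeded(14, th)` flags a 2σ anomaly for this toy record.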

  15. Comparison of different threshold 18FDG PET with computer tomography for defining gross tumor volume in non-small cell lung carcinoma

    International Nuclear Information System (INIS)

    Chen Shaoqing; Yu Jinming; Xing Ligang; Gong Heyi; Fu Zheng; Yang Guoren

    2006-01-01

Objective: To assess gross tumor volume (GTV) definition for non-small cell lung cancer (NSCLC) with 18-fluoro-deoxy-glucose positron emission tomography (¹⁸FDG PET) under different standardized uptake values (SUV), using both a fixed threshold (42% of maximum SUV) and a variable relative threshold (threshold SUV / maximum SUV) derived from the linear regression threshold SUV = 0.307 × (mean target SUV) + 0.588, with computed tomography (CT) as reference. Methods: In 20 patients with non-small cell lung cancer, the CT GTV (GTV_CT), the PET GTV with the 42% threshold (GTV_42%), and the PET GTV with the relative threshold (GTV_relate) were obtained and compared. Results: The mean GTV_42%, mean GTV_relate, and mean GTV_CT were (13 812.5 ± 13 841.4), (24 325.3 ± 22 454.7), and (28 350.9 ± 26 079.8) mm³, respectively; the difference in mean GTV among the three methods was significant. GTV_42% was smaller than GTV_relate and GTV_CT, whereas the difference between GTV_relate and GTV_CT was not significant (P = 0.125). Conclusion: The relative threshold is more suitable than the fixed threshold for defining the gross tumor volume. (authors)
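The variable-threshold rule quoted above (threshold SUV = 0.307 × mean target SUV + 0.588) amounts to a simple voxel filter. A minimal sketch, where the 1-D voxel list and function names are illustrative assumptions rather than anything from the study:

```python
def relative_threshold(mean_target_suv):
    """Threshold SUV from the linear regression quoted in the abstract:
    threshold = 0.307 * (mean target SUV) + 0.588."""
    return 0.307 * mean_target_suv + 0.588

def segment_gtv(voxel_suvs, mean_target_suv):
    """Keep voxels at or above the relative threshold
    (a toy 1-D stand-in for 3-D volume delineation)."""
    t = relative_threshold(mean_target_suv)
    return [v for v in voxel_suvs if v >= t]

voxels = [0.5, 1.2, 2.8, 4.1, 6.3, 7.0]          # hypothetical SUVs
kept = segment_gtv(voxels, mean_target_suv=5.0)   # threshold = 2.123
```

In contrast, the fixed 42% rule would cut at 0.42 × max SUV regardless of the lesion's mean uptake, which is the difference the study quantifies.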

  16. Can we set a global threshold age to define mature forests?

    DEFF Research Database (Denmark)

    Martin, Philip; Jung, Martin; Brearley, Francis Q.

    2016-01-01

Globally, mature forests appear to be increasing in biomass density (BD). There is disagreement over whether these increases are the result of rising atmospheric CO2 concentrations or a legacy effect of previous land use. Recently, it was suggested that a threshold of 450 years should be used to define mature forests and that many forests increasing in BD may be younger than this. However, the study making these suggestions failed to account for the interactions between forest age and climate. Here we revisit the issue to identify: (1) how climate and forest age control global forest BD and (2) whether we can set a threshold age for mature forests. Using data from previously published studies, we modelled the impacts of forest age and climate on BD using linear mixed-effects models. We examined the potential biases in the dataset by comparing how representative it was of global mature forests…

  17. Oracle Inequalities for Convex Loss Functions with Non-Linear Targets

    DEFF Research Database (Denmark)

    Caner, Mehmet; Kock, Anders Bredahl

This paper considers penalized empirical loss minimization of convex loss functions with unknown non-linear target functions. Using the elastic net penalty, we establish a finite-sample oracle inequality which bounds the loss of our estimator from above with high probability. If the unknown target … of the same order as that of the oracle. If the target is linear, we give sufficient conditions for consistency of the estimated parameter vector. Next, we briefly discuss how a thresholded version of our estimator can be used to perform consistent variable selection. We give two examples of loss functions…

  18. Efficient Market Hypothesis in South Africa: Evidence from Linear and Nonlinear Unit Root Tests

    Directory of Open Access Journals (Sweden)

    Andrew Phiri

    2015-12-01

This study investigates the weak-form efficient market hypothesis (EMH) for five generalized stock indices on the Johannesburg Stock Exchange (JSE) using weekly data collected from 31 January 2000 to 16 December 2014. In particular, we test for weak-form market efficiency using a battery of linear and nonlinear unit root testing procedures comprising the classical augmented Dickey-Fuller (ADF) tests, the two-regime threshold autoregressive (TAR) unit root tests described in Enders and Granger (1998), as well as the three-regime unit root tests described in Bec, Salem, and Carrasco (2004). Based on our empirical analysis, we are able to demonstrate that whilst the linear unit root tests advocate for unit roots within the time series, the nonlinear unit root tests suggest that most stock indices are threshold stationary processes. These results bridge two opposing contentions obtained from previous studies by concluding that under a linear framework the JSE stock indices offer support in favour of weak-form market efficiency, whereas when nonlinearity is accounted for, a majority of the indices violate the weak-form EMH.
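A two-regime TAR specification of the kind referenced above splits the autoregressive adjustment according to whether the lagged level sits above or below a threshold. A minimal sketch, assuming plain no-intercept OLS per regime; this illustrates the regime split only, not the full Enders-Granger F-test machinery:

```python
def tar_rhos(y, tau=0.0):
    """Two-regime TAR regression in the spirit of Enders & Granger (1998):
    Delta y_t = rho1*y_{t-1} if y_{t-1} >= tau, else rho2*y_{t-1}.
    Because the two regressors never overlap, each rho is a simple
    no-intercept OLS slope within its regime."""
    above_num = above_den = below_num = below_den = 0.0
    for t in range(1, len(y)):
        lag, dy = y[t - 1], y[t] - y[t - 1]
        if lag >= tau:
            above_num += lag * dy
            above_den += lag * lag
        else:
            below_num += lag * dy
            below_den += lag * lag
    rho1 = above_num / above_den if above_den else float("nan")
    rho2 = below_num / below_den if below_den else float("nan")
    return rho1, rho2

# Noise-free series that follows Delta y = -0.3*y while positive
y_up = [1.0]
for _ in range(6):
    y_up.append(y_up[-1] * 0.7)
rho1, rho2 = tar_rhos(y_up)   # recovers rho1 = -0.3; rho2 is nan (regime unvisited)
```

A rho significantly below zero in a regime indicates mean reversion there, which is the sense in which the indices above are "threshold stationary".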

  19. The health effects of low-dose ionizing radiation

    International Nuclear Information System (INIS)

    Dixit, A.N.; Dixit, Nishant

    2012-01-01

It has been established through numerous studies that high doses of ionizing radiation are harmful to health. There is substantial controversy regarding the effects of low doses of ionizing radiation despite the large amount of work carried out (both laboratory and epidemiological). Exposure to high levels of radiation can cause radiation injury, and these injuries can be relatively severe with sufficiently high radiation doses. Prolonged exposure to low levels of radiation may lead to cancer, although the nature of our response to very low radiation levels is not well known at this time. Many of our radiation safety regulations and procedures are designed to protect the health of those exposed to radiation occupationally or as members of the public. According to the linear no-threshold (LNT) hypothesis, any amount of radiation, however small, is potentially harmful, even down to zero levels. The threshold hypothesis, on the other hand, holds that below a certain threshold level of radiation exposure, deleterious effects are absent. At the same time, there are strong arguments, both experimental and epidemiological, which support radiation hormesis (beneficial effects of low-level ionizing radiation). These effects cannot be anticipated by extrapolating from harmful effects noted at high doses. Evidence indicates an inverse relationship between chronic low-dose radiation levels and cancer incidence and/or mortality rates. Examples are drawn from: 1) state surveys for more than 200 million people in the United States; 2) state cancer hospitals for 200 million people in India; 3) 10,000 residents of Taipei who lived in cobalt-60 contaminated homes; 4) high-radiation areas of Ramsar, Iran; 5) 12 million person-years of exposed and carefully selected control nuclear workers; 6) almost 300,000 radon measurements of homes in the United States; and 7) non-smokers in high-radon areas of early Saxony, Germany. This evidence conforms to the hypothesis that…

  20. Cancer risk of low dose/low dose rate radiation: a meta-analysis of cancer data of mammals exposed to low doses of radiation

    International Nuclear Information System (INIS)

    Ogata, Hiromitsu; Magae, Junji

    2008-01-01

The linear no-threshold (LNT) model is a basic theory for radioprotection, but the applicability of this hypothesis to biological responses at low doses or at low dose rates has not been sufficiently investigated. Simultaneous consideration of the cumulative dose and the dose rate is necessary for evaluating the risk of long-term exposure to ionizing radiation at low dose. This study examines several numerical relationships between doses and dose rates in biological responses to gamma radiation. Collected datasets on the relationship between dose and the incidence of cancer in mammals exposed to low doses of radiation were analysed using meta-regression models and a modified exponential (MOE) model, which we previously published, that predicts irradiation-time-dependent biological response at low dose rates of ionizing radiation. Minimum doses of observable risk and effective doses for a variety of dose rates were calculated using parameters estimated by fitting meta-regression models to the data, and compared with other statistical models that find values corresponding to 'threshold limits'. By fitting a weighted regression model (fixed-effects meta-regression model) to the data on the risk of all cancers, it was found that the log relative risk [log(RR)] increased as the total exposure dose increased. The intersection of this regression line with the x-axis denotes the minimum dose of observable risk. These estimated minimum doses and effective doses increased with decreasing dose rate. The goodness of fit of the MOE model depended on cancer type, but the total cancer risk is reduced when dose rates are very low. The results suggest that the dose-response curve for cancer risk is remarkably affected by dose rate and that the dose rate effect changes as a function of dose rate. For scientific discussion of low-dose exposure risk and its uncertainty, the term 'threshold' should be statistically defined, and dose rate effects should be included in the risk…
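The fixed-effects step described above (an inverse-variance-weighted regression of log(RR) on dose, with the x-intercept read off as the minimum dose of observable risk) can be sketched as follows. The dose, log(RR), and weight values are made-up toy numbers, not data from the meta-analysis:

```python
def weighted_linreg(x, y, w):
    """Weighted least squares fit y ~ a + b*x with weights w
    (inverse-variance weights, as in a fixed-effects meta-regression)."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    a = ybar - b * xbar
    return a, b

# Toy data: log relative risk rising with cumulative dose (hypothetical numbers)
dose = [0.1, 0.5, 1.0, 2.0]         # Gy
log_rr = [-0.02, 0.02, 0.07, 0.17]  # lies exactly on log(RR) = 0.1*dose - 0.03
weights = [10.0, 8.0, 6.0, 4.0]     # inverse variances
a, b = weighted_linreg(dose, log_rr, weights)
min_observable_dose = -a / b        # x-intercept: where log(RR) crosses zero
```

For this toy line the intercept falls at 0.3 Gy; in the study this quantity shifts upward as the dose rate decreases.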

  1. Threshold guidance update

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1986-01-01

The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. The previous year's activities (1984) included the development of a threshold guidance dose, the development of threshold concentrations corresponding to the guidance dose, the development of supporting documentation, review by a technical peer review committee, and review by the DOE community. As a result of the comments, areas have been identified for more extensive analysis, including an alternative basis for selection of the guidance dose and the development of quality assurance guidelines. Development of quality assurance guidelines will provide a reasonable basis for determining that a given waste stream qualifies as a threshold waste stream and can then be the basis for a more extensive cost-benefit analysis. The threshold guidance and supporting documentation will be revised, based on the comments received. The revised documents will be provided to DOE by early November. DOE-HQ has indicated that the revised documents will be available for review by DOE field offices and their contractors.

  2. Linear study of the precessional fishbone instability

    Science.gov (United States)

    Idouakass, M.; Faganello, M.; Berk, H. L.; Garbet, X.; Benkadda, S.

    2016-10-01

    The precessional fishbone instability is an m = n = 1 internal kink mode destabilized by a population of trapped energetic particles. The linear phase of this instability is studied here, analytically and numerically, with a simplified model. This model uses the reduced magneto-hydrodynamics equations for the bulk plasma and the Vlasov equation for a population of energetic particles with a radially decreasing density. A threshold condition for the instability is found, as well as a linear growth rate and frequency. It is shown that the mode frequency is given by the precession frequency of the deeply trapped energetic particles at the position of strongest radial gradient. The growth rate is shown to scale with the energetic particle density and particle energy while it is decreased by continuum damping.

  3. A Threshold Exists in the Dose-response Relationship for Somatic Mutation Frequency Induced by X-ray Irradiation of Drosophila

    International Nuclear Information System (INIS)

    Koana, T.; Takashima, Y.; Okada, M. O.; Ikehata, M.; Miyakoshi, J.; Sakai, K.

    2004-01-01

The dose-response relationship of ionizing radiation and its stochastic effects has been thought to be linear, without any threshold. The basic data for this model were obtained from mutational assays in the male germ cells of the fruit fly Drosophila melanogaster. However, carcinogenic activity is more appropriately examined in somatic cells than in germ cells. Here, the dose-response relationship of X-ray irradiation and somatic mutation is examined in Drosophila. A threshold at approximately 1 Gy was observed in the DNA-repair-proficient flies. In the repair-deficient siblings, the threshold was smaller and the inclination of the dose-response curve was much steeper. These results suggest that the dose-response relationship between X-ray irradiation and somatic mutation has a threshold, and that the DNA repair function contributes to its formation. (Author) 35 refs

  4. Two-dimensional linear and nonlinear Talbot effect from rogue waves.

    Science.gov (United States)

    Zhang, Yiqi; Belić, Milivoj R; Petrović, Milan S; Zheng, Huaibin; Chen, Haixia; Li, Changbiao; Lu, Keqing; Zhang, Yanpeng

    2015-03-01

    We introduce two-dimensional (2D) linear and nonlinear Talbot effects. They are produced by propagating periodic 2D diffraction patterns and can be visualized as 3D stacks of Talbot carpets. The nonlinear Talbot effect originates from 2D rogue waves and forms in a bulk 3D nonlinear medium. The recurrences of an input rogue wave are observed at the Talbot length and at the half-Talbot length, with a π phase shift; no other recurrences are observed. Differing from the nonlinear Talbot effect, the linear effect displays the usual fractional Talbot images as well. We also find that the smaller the period of incident rogue waves, the shorter the Talbot length. Increasing the beam intensity increases the Talbot length, but above a threshold this leads to a catastrophic self-focusing phenomenon which destroys the effect. We also find that the Talbot recurrence can be viewed as a self-Fourier transform of the initial periodic beam that is automatically performed during propagation. In particular, linear Talbot effect can be viewed as a fractional self-Fourier transform, whereas the nonlinear Talbot effect can be viewed as the regular self-Fourier transform. Numerical simulations demonstrate that the rogue-wave initial condition is sufficient but not necessary for the observation of the effect. It may also be observed from other periodic inputs, provided they are set on a finite background. The 2D effect may find utility in the production of 3D photonic crystals.

  5. Definition of percolation thresholds on self-affine surfaces

    NARCIS (Netherlands)

    Marrink, S.J.; Paterson, Lincoln; Knackstedt, Mark A.

    2000-01-01

    We study the percolation transition on a two-dimensional substrate with long-range self-affine correlations. We find that the position of the percolation threshold on a correlated lattice is no longer unique and depends on the spanning rule employed. Numerical results are provided for spanning

  6. Spasticity Measurement Based on Tonic Stretch Reflex Threshold in Children with Cerebral Palsy Using the PediAnklebot

    Directory of Open Access Journals (Sweden)

    Marco Germanotta

    2017-05-01

Nowadays, objective measures are becoming prominent in spasticity assessment to overcome the limitations of clinical scales. Among others, the Tonic Stretch Reflex Threshold (TSRT) has shown promising results. Previous studies demonstrated the validity and reliability of TSRT for spasticity assessment at the elbow and ankle joints in adults. The purposes of the present study were to assess: (i) the feasibility of measuring TSRT to evaluate spasticity at the ankle joint in children with Cerebral Palsy (CP), and (ii) the correlation between objective measures and clinical scores. A mechatronic device, the pediAnklebot, was used to impose 50 passive stretches to the ankle of 10 children with CP and 3 healthy children, eliciting muscle responses at 5 different velocities. Surface electromyography, angles, and angular velocities were recorded to compute the dynamic stretch reflex threshold; TSRT was computed by linear regression over angles and angular velocities. TSRTs for the most affected side of children with CP fell within the biomechanical range (95.7 ± 12.9° and 86.7 ± 17.4° for the Medial and Lateral Gastrocnemius, and 75.9 ± 12.5° for the Tibialis Anterior). In three patients, the stretch reflex was not elicited on the less affected side. TSRTs were outside the biomechanical range in healthy children. However, no correlation was found between clinical scores and TSRT values. Here, we demonstrated the capability of TSRT to discriminate between spastic and non-spastic muscles, while no significant outcomes were found for the dorsiflexor muscle.
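The regression step described above (extrapolating the dynamic stretch reflex threshold angles back to zero stretch velocity) reduces to taking the intercept of a simple linear fit. A minimal sketch with made-up angle/velocity pairs; the numbers are illustrative, not study data:

```python
def tsrt_from_dsrt(velocities, angles):
    """Tonic stretch reflex threshold as the zero-velocity intercept of a
    linear regression of dynamic stretch reflex threshold (DSRT) angles
    on stretch velocity (a sketch of the procedure in the abstract)."""
    n = len(velocities)
    vbar = sum(velocities) / n
    abar = sum(angles) / n
    slope = (sum((v - vbar) * (a - abar) for v, a in zip(velocities, angles))
             / sum((v - vbar) ** 2 for v in velocities))
    return abar - slope * vbar   # predicted threshold angle at v = 0

# Hypothetical DSRT measurements at five stretch velocities (deg/s, deg)
v = [10, 20, 40, 60, 80]
ang = [88, 86, 82, 78, 74]      # threshold angle falls as velocity rises
tsrt = tsrt_from_dsrt(v, ang)
```

For these toy pairs (which lie exactly on ang = 90 − 0.2·v) the TSRT comes out at 90°; whether that lands inside the joint's biomechanical range is what distinguishes spastic from non-spastic muscles in the study.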

  7. Evaluation of the threshold trimming method for micro inertial fluidic switch based on electrowetting technology

    Directory of Open Access Journals (Sweden)

    Tingting Liu

    2014-03-01

The switch based on electrowetting technology has the advantages of no moving parts, low contact resistance, long life, and an adjustable acceleration threshold. The acceleration threshold of the switch can be fine-tuned by adjusting the applied voltage. This paper focuses on the electrowetting properties of the switch and the influence of microchannel structural parameters, applied voltage, and droplet volume on the acceleration threshold. In the presence of process errors in the micro inertial fluidic switch and measuring errors in droplet volume, there is a deviation between the test acceleration threshold and the target acceleration threshold. Considering the process errors and measuring errors, worst-case analysis is used to analyze the influence of parameter tolerances on the acceleration threshold. Under worst-case conditions the total acceleration threshold tolerance caused by the various errors is 9.95%. The target acceleration threshold can be achieved by fine-tuning the applied voltage. The acceleration threshold trimming method of the micro inertial fluidic switch is thereby verified.
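Worst-case tolerance analysis of the kind used above stacks every parameter's contribution in the unfavourable direction. A generic sketch; the sensitivities and tolerance values are invented for illustration and are not those of the switch:

```python
def worst_case_tolerance(sensitivities, tolerances):
    """Worst-case stack-up: every parameter deviates to its limit in the
    unfavourable direction, so fractional contributions add in magnitude.
    `sensitivities` are normalised d(threshold)/d(param) terms,
    `tolerances` are fractional parameter tolerances."""
    return sum(abs(s) * t for s, t in zip(sensitivities, tolerances))

def rss_tolerance(sensitivities, tolerances):
    """Statistical (root-sum-square) alternative for independent errors;
    always at or below the worst case."""
    return sum((s * t) ** 2 for s, t in zip(sensitivities, tolerances)) ** 0.5

sens = [1.0, 0.5, 2.0]     # hypothetical normalised sensitivities
tol = [0.02, 0.05, 0.01]   # 2%, 5%, 1% parameter tolerances
wc = worst_case_tolerance(sens, tol)   # 0.02 + 0.025 + 0.02 = 6.5%
```

The worst-case figure bounds the threshold deviation that voltage trimming must be able to compensate.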

  8. Ultimate parameters of the photon collider at the international linear ...

    Indian Academy of Sciences (India)

    be achieved by adding more wigglers to the DRs; the incremental cost is easily ... the above emittances, the limit on the effective horizontal β-function is about 5 mm [12 .... coupling in γγ collisions just above the γγ → hh threshold [19]. .... [21] V I Telnov, talk at the ECFA Workshop on Linear Colliders, Montpellier, France, 12–.

  9. Longitudinal Single-Bunch Instability in the ILC Damping Rings: Estimate of Current Threshold

    International Nuclear Information System (INIS)

    Venturini, Marco

    2008-01-01

    Characterization of single-bunch instabilities in the International Linear Collider (ILC) damping rings (DRs) has been indicated as a high-priority activity toward completion of an engineering design. In this paper we report on a first estimate of the current thresholds for the instability using numerical and analytical models of the wake potentials associated with the various machine components. The numerical models were derived (upon appropriate scaling) from designs of the corresponding components installed in existing machines. The current thresholds for instabilities were determined by numerical solution of the Vlasov equation for the longitudinal dynamics. For the DR baseline lattice as of Feb. 2007 we find the critical current for instability to be safely above the design specifications leaving room for further optimization of the choice of the momentum compaction

  10. Bayesian methods for jointly estimating genomic breeding values of one continuous and one threshold trait.

    Directory of Open Access Journals (Sweden)

    Chonglong Wang

Genomic selection has become a useful tool for animal and plant breeding. Currently, genomic evaluation is usually carried out using a single-trait model. However, a multi-trait model has the advantage of using information on the correlated traits, leading to more accurate genomic prediction. To date, joint genomic prediction for a continuous and a threshold trait using a multi-trait model is scarce and needs more attention. Based on the previously proposed methods BayesCπ for a single continuous trait and BayesTCπ for a single threshold trait, we developed a novel method based on a linear-threshold model, i.e., LT-BayesCπ, for joint genomic prediction of a continuous trait and a threshold trait. Computing procedures of LT-BayesCπ using a Markov chain Monte Carlo algorithm were derived. A simulation study was performed to investigate the advantages of LT-BayesCπ over BayesCπ and BayesTCπ with regard to the accuracy of genomic prediction for both traits. Factors affecting the performance of LT-BayesCπ were addressed. The results showed that, in all scenarios, the accuracy of genomic prediction obtained from LT-BayesCπ was significantly increased for the threshold trait compared to that from single-trait prediction using BayesTCπ, while the accuracy for the continuous trait was comparable with that from single-trait prediction using BayesCπ. The proposed LT-BayesCπ could be a method of choice for joint genomic prediction of one continuous and one threshold trait.

  11. Performance of iPad-based threshold perimetry in glaucoma and controls.

    Science.gov (United States)

    Schulz, Angela M; Graham, Elizabeth C; You, YuYi; Klistorner, Alexander; Graham, Stuart L

    2017-10-04

Independent validation of the iPad visual field testing software Melbourne Rapid Fields (MRF). To examine the functionality of MRF and compare its performance with Humphrey SITA 24-2 (HVF). Prospective, cross-sectional validation study. Sixty glaucoma patients (MD: -5.08 ± 5.22): 17 pre-perimetric and 43 with HVF field defects; plus 25 controls. The MRF was compared with HVF for scotoma detection, global indices, regional mean threshold values, and sensitivity/specificity. Long-term test-retest variability was assessed after 6 months. Linear regression and Bland-Altman analyses of global indices; sensitivity/specificity using ROC curves; intraclass correlations. Using a cluster definition of three points at <1% or two at 0.5% to define a scotoma on HVF, MRF detected 39/54 abnormal hemifields with similar threshold-based criteria. Global indices were highly correlated between MRF and HVF: MD r² = 0.80, PSD r² = 0.77, VFI r² = 0.85 (all P < 0.0001). For manifest glaucoma patients, correlations of regional mean thresholds ranged from r² = 0.45-0.78, despite the differing array of tested points between devices. ROC analysis of global indices showed reasonable sensitivity/specificity with AUC values of MD: 0.89, PSD: 0.85 and VFI: 0.88. MRF retest variability was low, with ICC values of 0.95 (MD and VFI) and 0.94 (PSD). However, individual test point variability for mid-range thresholds was higher. MRF perimetry, despite using a completely different test paradigm, shows good performance characteristics compared to HVF for detection of defects, correlation of global indices, and regional mean threshold values. Reproducibility for individual points may limit application for monitoring change over time, and fixation monitoring needs improvement. © 2017 Royal Australian and New Zealand College of Ophthalmologists.

  12. Threshold network of a financial market using the P-value of correlation coefficients

    Science.gov (United States)

    Ha, Gyeong-Gyun; Lee, Jae Woo; Nobi, Ashadun

    2015-06-01

Threshold methods in financial networks are important tools for obtaining information about the state of a market. Previously, absolute thresholds on correlation coefficients have been used; however, these bear no relation to the length of the time window. We assign a threshold value depending on the size of the time window by using the P-value concept of statistics. We construct a threshold network (TN) at the same threshold value for two different time window sizes in the Korean Composite Stock Price Index (KOSPI). We measure network properties, such as the edge density, clustering coefficient, assortativity coefficient, and modularity. We find that a significant difference exists between the network properties of the two time windows at the same threshold, especially during crises. This implies that the market information depends on the length of the time window used when constructing the TN. We apply the same technique to the Standard and Poor's 500 (S&P 500) and observe similar results.
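The window-length dependence can be made explicit by deriving the critical correlation from the significance level: r_c = z / sqrt(z² + n − 2), where n is the number of observations in the window. A sketch under the assumption of a normal approximation to the t distribution (reasonable for windows of weekly index data); the 3×3 correlation matrix is a made-up example:

```python
from statistics import NormalDist

def correlation_threshold(n, alpha=0.01):
    """Critical |r| at two-sided significance alpha for a window of n
    observations: r_c = z / sqrt(z**2 + n - 2), with z the normal
    quantile standing in for the t critical value at n - 2 df."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return z / (z * z + n - 2) ** 0.5

def threshold_network(corr, n, alpha=0.01):
    """Adjacency matrix keeping edges whose |correlation| is significant
    for this window length; longer windows give a lower cutoff."""
    rc = correlation_threshold(n, alpha)
    return [[1 if i != j and abs(c) > rc else 0 for j, c in enumerate(row)]
            for i, row in enumerate(corr)]

corr = [[1.0, 0.35, 0.05],
        [0.35, 1.0, -0.40],
        [0.05, -0.40, 1.0]]
adj = threshold_network(corr, n=100)   # r_c ~ 0.25 for n = 100, alpha = 0.01
```

The same correlation matrix thresholded with a larger n keeps more edges, which is exactly why a fixed absolute threshold compares unlike networks across window sizes.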

  13. Alien plant invasions and native plant extinctions: a six-threshold framework

    Science.gov (United States)

    Downey, Paul O.; Richardson, David M.

    2016-01-01

Biological invasions are widely acknowledged as a major threat to global biodiversity. Species from all major taxonomic groups have become invasive. The range of impacts of invasive taxa and the overall magnitude of the threat are increasing. Plants comprise the biggest and best-studied group of invasive species. There is a growing debate, however, regarding the nature of the alien plant threat, in particular whether the outcome is likely to be the widespread extinction of native plant species. The debate has raised questions on whether the threat posed by invasive plants to native plants has been overstated. We provide a conceptual framework to guide discussion of this topic, in which the threat posed by invasive plants is considered in the context of a progression from no impact through to extinction. We define six thresholds along the 'extinction trajectory', global extinction being the final threshold. Although there are no documented examples of either 'in the wild' (Threshold 5) or global extinctions (Threshold 6) of native plants that are attributable solely to plant invasions, there is evidence that native plants have crossed or breached other thresholds along the extinction trajectory due to the impacts associated with plant invasions. Several factors may be masking where native species are on the trajectory; these include a lack of appropriate data to accurately map the position of species on the trajectory, the timeframe required to definitively state that extinctions have occurred, and management interventions. Such interventions, focussing mainly on Thresholds 1-3 (a declining population through to the local extinction of a population), are likely to alter the extinction trajectory of some species. The critical issue for conservation managers is the trend, because interventions must be implemented before extinctions occur. Thus the lack of evidence for extinctions attributable to plant invasions does not mean we should disregard the broader…

  14. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in excitation function of nuclear threshold effects. The intermediate state supporting both neutron strength function and nuclear threshold effect is a micro-giant neutron threshold state. (author)

  15. Self-pulsation threshold of Raman amplified Brillouin fiber cavities

    DEFF Research Database (Denmark)

    Ott, Johan Raunkjær; Pedersen, Martin Erland Vestergaard; Rottwitt, Karsten

    2009-01-01

    An implicit equation for the oscillation threshold of stimulated Brillouin scattering from Raman amplified signals in fibers with external feedback is derived under the assumption of no depletion. This is compared to numerical investigations of Raman amplification schemes showing good agreement...

  16. Long-range epidemic spreading in a random environment.

    Science.gov (United States)

    Juhász, Róbert; Kovács, István A; Iglói, Ferenc

    2015-03-01

    Modeling long-range epidemic spreading in a random environment, we consider a quenched, disordered, d-dimensional contact process with infection rates decaying with distance as 1/r^(d+σ). We study the dynamical behavior of the model at and below the epidemic threshold by a variant of the strong-disorder renormalization-group method and by Monte Carlo simulations in one and two spatial dimensions. Starting from a single infected site, the average survival probability is found to decay as P(t) ∼ t^(-d/z) up to multiplicative logarithmic corrections. Below the epidemic threshold, a Griffiths phase emerges, where the dynamical exponent z varies continuously with the control parameter and tends to z_c = d+σ as the threshold is approached. At the threshold, the spatial extension of the infected cluster (in surviving trials) is found to grow as R(t) ∼ t^(1/z_c) with a multiplicative logarithmic correction, and the average number of infected sites in surviving trials is found to increase as N_s(t) ∼ (ln t)^χ with χ=2 in one dimension.
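For readers unfamiliar with the contact process referenced above, a minimal Monte Carlo sketch of the *nearest-neighbour* variant (not the long-range, disordered version studied in the paper) can illustrate the basic dynamics; the lattice size, infection rate `lam`, and time horizon are illustrative choices, not values from the study:

```python
import random

def contact_process(L=200, lam=3.5, t_max=200, seed=1):
    """Simulate a 1D nearest-neighbour contact process (Gillespie algorithm).

    Sites are 0 (healthy) or 1 (infected). Each infected site recovers at
    rate 1 and attempts to infect a random nearest neighbour at rate lam.
    Returns the elapsed time and the final number of infected sites.
    """
    random.seed(seed)
    state = [0] * L
    state[L // 2] = 1            # single infected seed, as in the paper
    infected = {L // 2}
    t = 0.0
    while infected and t < t_max:
        n = len(infected)
        t += random.expovariate(n * (1 + lam))   # time to next event
        site = random.choice(list(infected))
        if random.random() < 1 / (1 + lam):
            state[site] = 0                      # recovery
            infected.discard(site)
        else:
            nbr = (site + random.choice([-1, 1])) % L
            if state[nbr] == 0:                  # infection attempt
                state[nbr] = 1
                infected.add(nbr)
    return t, len(infected)

t_end, n_inf = contact_process()
print(t_end, n_inf)
```

Tracking survival probability and cluster size over many such runs, at varying `lam`, is what reveals the power laws quoted in the abstract.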

  17. Effects of visual erotic stimulation on vibrotactile detection thresholds in men.

    Science.gov (United States)

    Jiao, Chuanshu; Knight, Peter K; Weerakoon, Patricia; Turman, A Bulent

    2007-12-01

    This study examined the effects of sexual arousal on vibration detection thresholds in the right index finger of 30 healthy, heterosexual males who reported no sexual dysfunction. Vibrotactile detection thresholds at frequencies of 30, 60, and 100 Hz were assessed before and after watching erotic and control videos using a forced-choice, staircase method. A mechanical stimulator was used to produce the vibratory stimulus. Results were analyzed using repeated measures analysis of variance. After watching the erotic video, the vibrotactile detection thresholds at 30, 60, and 100 Hz were significantly reduced following the erotic stimulus. The results show that sexual arousal resulted in an increase in vibrotactile sensitivity to low-frequency stimuli in the index finger of sexually functional men.

  18. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    One proves that a Nuclear Threshold Effect is dependent, via the Neutron Strength Function, on the Spectroscopy of the Ancestral Neutron Threshold State, with the magnitude of the effect proportional to the Neutron Strength Function in its dependence on mass number. Evidence for the relation of Nuclear Threshold Effects to Neutron Strength Functions is obtained from the Isotopic Threshold Effect and the Deuteron Stripping Threshold Anomaly; empirical and computational analysis of both demonstrates their close relationship to Neutron Strength Functions. It is thus established that Nuclear Threshold Effects depend, in addition to genuine Nuclear Reaction Mechanisms, on the Spectroscopy of the (Ancestral) Neutron Threshold State. This result also constitutes a proof that the origins of these threshold effects are Neutron Single Particle States at zero energy. (author)

  19. Modification of electrical pain threshold by voluntary breathing-controlled electrical stimulation (BreEStim) in healthy subjects.

    Directory of Open Access Journals (Sweden)

    Shengai Li

    BACKGROUND: Pain has a distinct sensory and affective (i.e., unpleasantness) component. BreEStim, during which electrical stimulation is delivered during voluntary breathing, has been shown to selectively reduce the affective component of post-amputation phantom pain. The objective was to examine whether BreEStim increases pain threshold such that subjects could have improved tolerance of painful stimuli. METHODS: Eleven pain-free healthy subjects (7 males, 4 females) participated in the study. All subjects received BreEStim (100 stimuli) and conventional electrical stimulation (EStim, 100 stimuli) to two acupuncture points (Neiguan and Weiguan) of the dominant hand in a random order. The two different treatments were provided at least three days apart. Painful, but tolerable electrical stimuli were delivered randomly during EStim, but were triggered by effortful inhalation during BreEStim. Measurements of tactile sensation threshold, electrical sensation and electrical pain thresholds, and thermal (cold sensation, warm sensation, cold pain and heat pain) thresholds were recorded from the thenar eminence of both hands. These measurements were taken pre-intervention and 10 min post-intervention. RESULTS: There was no difference in the pre-intervention baseline measurement of all thresholds between BreEStim and EStim. The electrical pain threshold significantly increased after BreEStim (27.5±6.7% for the dominant hand and 28.5±10.8% for the non-dominant hand, respectively). The electrical pain threshold significantly decreased after EStim (9.1±2.8% for the dominant hand and 10.2±4.6% for the non-dominant hand, respectively) (F[1, 10] = 30.992, p = .00024). There was no statistically significant change in other thresholds after BreEStim and EStim. The intensity of electrical stimuli was progressively increased, but no difference was found between BreEStim and EStim. CONCLUSION: Voluntary breathing controlled electrical stimulation

  20. Non-linear time series analysis on flow instability of natural circulation under rolling motion condition

    International Nuclear Information System (INIS)

    Zhang, Wenchao; Tan, Sichao; Gao, Puzhen; Wang, Zhanwei; Zhang, Liansheng; Zhang, Hong

    2014-01-01

    Highlights: • Natural circulation flow instabilities in rolling motion are studied. • The method of non-linear time series analysis is used. • Non-linear evolution characteristics of flow instability are analyzed. • Irregular complex flow oscillations are chaotic oscillations. • The effect of rolling parameters on the threshold of chaotic oscillation is studied. - Abstract: Non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were studied by the method of non-linear time series analysis. Experimental flow time series for different dimensionless powers and rolling parameters were analyzed based on phase space reconstruction theory. Attractors were reconstructed in phase space, and the geometric invariants, including correlation dimension, Kolmogorov entropy and largest Lyapunov exponent, were determined. The non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were then studied based on the results of the geometric invariant analysis. The results indicated that the values of the geometric invariants first increase and then decrease as dimensionless power increases, indicating that the non-linear characteristics of the system first strengthen and then weaken. The irregular complex flow oscillation is typical chaotic oscillation because the value of the geometric invariants is at a maximum there. The threshold of chaotic oscillation becomes larger as the rolling frequency or rolling amplitude increases. The main factors that influence the non-linear characteristics of the natural circulation system under rolling motion are the thermal driving force, the flow resistance and the additional forces caused by rolling motion. The non-linear characteristics of the natural circulation system under rolling motion change because the feedback and coupling among these influencing factors change when the dimensionless power or rolling parameters change

  1. RF power generation for future linear colliders

    International Nuclear Information System (INIS)

    Fowkes, W.R.; Allen, M.A.; Callin, R.S.; Caryotakis, G.; Eppley, K.R.; Fant, K.S.; Farkas, Z.D.; Feinstein, J.; Ko, K.; Koontz, R.F.; Kroll, N.; Lavine, T.L.; Lee, T.G.; Miller, R.H.; Pearson, C.; Spalek, G.; Vlieks, A.E.; Wilson, P.B.

    1990-06-01

    The next linear collider will require 200 MW of rf power per meter of linac structure at relatively high frequency to produce an accelerating gradient of about 100 MV/m. The higher frequencies result in a higher breakdown threshold in the accelerating structure and hence permit higher accelerating gradients per meter of linac. The lower frequencies have the advantage that high peak power rf sources can be realized. 11.42 GHz appears to be a good compromise, and the effort at the Stanford Linear Accelerator Center (SLAC) is being concentrated on rf sources operating at this frequency. The filling time of the accelerating structure for each rf feed is expected to be about 80 ns. Under serious consideration at SLAC are a conventional klystron followed by a multistage rf pulse compression system, and the Crossed-Field Amplifier. These are discussed in this paper

  2. The abundance threshold for plague as a critical percolation phenomenon

    DEFF Research Database (Denmark)

    Davis, S; Trapman, P; Leirs, H

    2008-01-01

    … However, no natural examples have been reported. The central question of interest in percolation theory [4], the possibility of an infinite connected cluster, corresponds in infectious disease to a positive probability of an epidemic. Archived records of plague (infection with Yersinia pestis) … Abundance thresholds are the theoretical basis for attempts to manage infectious disease by reducing the abundance of susceptibles, including vaccination and the culling of wildlife [6, 7, 8]. This first natural example of a percolation threshold in a disease system invites a re-appraisal of other invasion…

  3. No effect of experimental occlusal interferences on pressure pain thresholds of the masseter and temporalis muscles in healthy women.

    Science.gov (United States)

    Michelotti, A; Farella, M; Steenks, M H; Gallo, L M; Palla, S

    2006-04-01

    It has been suggested that occlusal interferences may lead to pain and tenderness of the masticatory muscles. Tender jaw muscles are more sensitive to pressure pain, as assessed by means of pressure algometry. We tested the effects of occlusal interferences on the pressure pain threshold of the jaw muscles by means of a double-blind randomized crossover experiment carried out on 11 young healthy females. Golden strips were glued either to an occlusal contact area (active interference) or to the vestibular surface of the same tooth (dummy interference) and left for 8 d each. Pressure pain thresholds of the masseter and anterior temporalis muscles were assessed under interference-free, dummy-interference and active-interference conditions. The results indicated that the application of an active occlusal interference, as used in this study, did not influence significantly the pressure pain thresholds of these muscles in healthy individuals.

  4. Pressure pain thresholds and musculoskeletal morbidity in automobile manufacturing workers.

    Science.gov (United States)

    Gold, Judith E; Punnett, Laura; Katz, Jeffrey N

    2006-02-01

    Reduced pressure pain thresholds (PPTs) have been reported in occupational groups with symptoms of upper extremity musculoskeletal disorders (UEMSDs). The purpose of this study was to determine whether automobile manufacturing workers (n=460) with signs and symptoms of UEMSDs had reduced PPTs (greater sensitivity to pain through pressure applied to the skin) when compared with unaffected members of the cohort, which served as the reference group. The association of PPTs with symptom severity and localization of physical examination (PE) findings was investigated, as was the hypothesis that reduced thresholds would be found on the affected side in those with unilateral PE findings. PPTs were measured during the workday at 12 upper extremity sites. A PE for signs of UEMSDs and a symptom questionnaire were administered. After comparison of potential covariates using t tests, linear regression multivariable models were constructed with the average of the 12 sites (avgPPT) as the outcome. Subjects with PE findings and/or symptoms had a significantly lower avgPPT than non-cases. AvgPPT was reduced in those with more widespread PE findings and in those with greater symptom severity (test for trend). No difference between side-specific avgPPT was found in those with unilateral PE findings. Reduced PPTs were associated with female gender, increasing age, and grip strength below the gender-adjusted mean. After adjusting for the above confounders, avgPPT was associated with muscle/tendon PE findings and symptom severity in multivariable models. PPTs were associated with signs and symptoms of UEMSDs, after adjusting for gender, age and grip strength. The utility of this noninvasive testing modality should be assessed in prospective large cohort studies to determine whether low PPTs are predictive of UEMSDs in asymptomatic individuals or of progression and spread of UEMSDs from localized to more diffuse disorders.

  5. A New Wavelet Threshold Function and Denoising Application

    Directory of Open Access Journals (Sweden)

    Lu Jing-yi

    2016-01-01

    In order to improve denoising performance, this paper introduces the basic principles of wavelet threshold denoising and the structure of traditional threshold functions, and proposes an improved wavelet threshold function and an improved fixed-threshold formula. First, the paper studies the problems of the traditional wavelet threshold functions and introduces adjustment factors to construct a new threshold function based on the soft threshold function. Then, it studies the fixed threshold and introduces a logarithmic function of the number of wavelet decomposition layers to design a new fixed-threshold formula. Finally, the paper uses the hard threshold, soft threshold, Garrote threshold, and improved threshold functions to denoise different signals, and calculates the signal-to-noise ratio (SNR) and mean square error (MSE) of the output of each after denoising. Theoretical analysis and experimental results showed that the proposed approach overcomes the constant-deviation problem of the soft threshold function and the discontinuity problem of the hard threshold function. The proposed approach also improves on applying the same threshold value at different decomposition scales, effectively filters the noise in the signals, and improves the SNR and reduces the MSE of the output signals.
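The hard, soft, and Garrote threshold functions named in the abstract have standard closed forms; the sketch below implements them in Python, plus a hypothetical smooth variant whose adjustment factor `a` stands in for the paper's (unspecified) improved construction:

```python
import math

def hard_threshold(w, t):
    """Hard thresholding: keep coefficients above t, zero the rest."""
    return w if abs(w) > t else 0.0

def soft_threshold(w, t):
    """Soft thresholding: shrink surviving coefficients toward zero by t."""
    return math.copysign(max(abs(w) - t, 0.0), w)

def garrote_threshold(w, t):
    """Non-negative Garrote: w - t^2/w above the threshold, zero below."""
    return w - t * t / w if abs(w) > t else 0.0

def improved_threshold(w, t, a=2.0):
    """A smooth compromise between soft and hard thresholding.

    The adjustment factor `a` is a hypothetical parameter: the shrinkage
    decays exponentially above the threshold, so the function is
    continuous at t (unlike hard) yet approaches the identity for large
    |w| (avoiding soft thresholding's constant deviation).
    """
    if abs(w) <= t:
        return 0.0
    shrink = t * math.exp(-a * (abs(w) - t))
    return math.copysign(abs(w) - shrink, w)

coeffs = [-3.0, -0.5, 0.2, 1.0, 4.0]
t = 0.8
print([round(soft_threshold(w, t), 3) for w in coeffs])
# → [-2.2, 0.0, 0.0, 0.2, 3.2]
```

In practice these maps are applied to the detail coefficients of a wavelet decomposition before reconstruction.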

  6. Rapid Estimation of Gustatory Sensitivity Thresholds with SIAM and QUEST

    Directory of Open Access Journals (Sweden)

    Richard Höchenberger

    2017-06-01

    Adaptive methods provide quick and reliable estimates of sensory sensitivity. Yet these procedures are typically developed for and applied to the non-chemical senses only, i.e., to vision, audition, and somatosensation. The relatively long inter-stimulus intervals in gustatory studies, which are required to minimize adaptation and habituation, call for time-efficient threshold estimation. We therefore tested the suitability of two adaptive yes-no methods based on SIAM and QUEST for rapid estimation of taste sensitivity by comparing test-retest reliability for sucrose, citric acid, sodium chloride, and quinine hydrochloride thresholds. We show that taste thresholds can be obtained in a time-efficient manner with both methods (within only 6.5 min on average using QUEST and ~9.5 min using SIAM). QUEST yielded higher test-retest correlations than SIAM for three of the four tastants. Either method allows for taste threshold estimation with low strain on participants, rendering them particularly advantageous for use in subjects with limited attentional or mnemonic capacities, and for time-constrained applications during cohort studies or in the testing of patients and children.
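As a rough illustration of the adaptive logic underlying such procedures, here is a toy 1-up/1-down staircase with a noiseless simulated observer. This is a deliberately simplified stand-in: the actual SIAM procedure adjusts levels from a payoff matrix that accounts for false alarms, and QUEST maintains a Bayesian posterior over threshold.

```python
def staircase_threshold(true_threshold=5.0, start=10.0, step=2.0,
                        n_reversals=8):
    """Estimate a detection threshold with a simple 1-up/1-down staircase.

    A deterministic toy observer detects the stimulus whenever its level
    exceeds `true_threshold`. The step size is halved at each reversal
    (direction change) down to a floor, and the estimate is the mean of
    the last four reversal levels.
    """
    level, last_resp, reversals = start, None, []
    while len(reversals) < n_reversals:
        detected = level > true_threshold        # observer's yes/no answer
        if last_resp is not None and detected != last_resp:
            reversals.append(level)              # direction change: a reversal
            step = max(step / 2, 0.1)            # shrink step, keep a floor
        level += -step if detected else step     # down after "yes", up after "no"
        last_resp = detected
    return sum(reversals[-4:]) / 4

print(round(staircase_threshold(), 2))          # converges near 5.0
```

The taste-testing constraint in the abstract (long inter-stimulus intervals) is precisely why such procedures matter: each loop iteration is one tasting trial.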

  7. Reaction πN → ππN near threshold

    International Nuclear Information System (INIS)

    Frlez, E.

    1993-11-01

    The LAMPF E1179 experiment used the π⁰ spectrometer and an array of charged particle range counters to detect and record π⁺π⁰, π⁰p, and π⁺π⁰p coincidences following the reaction π⁺p → π⁰π⁺p near threshold. The total cross sections for single pion production were measured at incident pion kinetic energies of 190, 200, 220, 240, and 260 MeV. Absolute normalizations were fixed by measuring π⁺p elastic scattering at 260 MeV. A detailed analysis of the π⁰ detection efficiency was performed using cosmic ray calibrations and pion single charge exchange measurements with a 30 MeV π⁻ beam. All published data on πN → ππN, including our results, are simultaneously fitted to yield a common chiral symmetry breaking parameter ξ = -0.25±0.10. The threshold matrix element |α₀(π⁰π⁺p)| determined by linear extrapolation yields the value of the s-wave isospin-2 ππ scattering length α₀²(ππ) = -0.041±0.003 m_π⁻¹, within the framework of soft-pion theory

  8. Repeated-Sprint Sequences During Female Soccer Matches Using Fixed and Individual Speed Thresholds.

    Science.gov (United States)

    Nakamura, Fábio Y; Pereira, Lucas A; Loturco, Irineu; Rosseti, Marcelo; Moura, Felipe A; Bradley, Paul S

    2017-07-01

    Nakamura, FY, Pereira, LA, Loturco, I, Rosseti, M, Moura, FA, and Bradley, PS. Repeated-sprint sequences during female soccer matches using fixed and individual speed thresholds. J Strength Cond Res 31(7): 1802-1810, 2017-The main objective of this study was to characterize the occurrence of single sprints and repeated-sprint sequences (RSS) during elite female soccer matches, using fixed (20 km·h⁻¹) and individually based speed thresholds (>90% of the mean speed from a 20-m sprint test). Eleven elite female soccer players from the same team participated in the study. All players performed a 20-m linear sprint test and were assessed in up to 10 official matches using Global Positioning System technology. Magnitude-based inferences were used to test for meaningful differences. Results revealed that, irrespective of adopting fixed or individual speed thresholds, female players produced only a few RSS during matches (2.3 ± 2.4 sequences using the fixed threshold and 3.3 ± 3.0 sequences using the individually based threshold), with most sequences comprising just 2 sprints. Additionally, central defenders performed fewer sprints (10.2 ± 4.1) than other positions (fullbacks: 28.1 ± 5.5; midfielders: 21.9 ± 10.5; forwards: 31.9 ± 11.1; differences rated likely to almost certain, with effect sizes ranging from 1.65 to 2.72), and sprinting ability declined in the second half. The data do not support the notion that RSS occur frequently during soccer matches in female players, irrespective of using fixed or individual speed thresholds to define sprint occurrence. However, repeated-sprint ability development cannot be ruled out of soccer training programs because of its association with match-related performance.

  9. Threshold-Voltage Shifts in Organic Transistors Due to Self-Assembled Monolayers at the Dielectric: Evidence for Electronic Coupling and Dipolar Effects.

    Science.gov (United States)

    Aghamohammadi, Mahdieh; Rödel, Reinhold; Zschieschang, Ute; Ocal, Carmen; Boschker, Hans; Weitz, R Thomas; Barrena, Esther; Klauk, Hagen

    2015-10-21

    The mechanisms behind the threshold-voltage shift in organic transistors due to functionalization of the gate dielectric with self-assembled monolayers (SAMs) are still under debate. We address the mechanisms by which SAMs determine the threshold voltage by analyzing whether the threshold voltage depends on the gate-dielectric capacitance. We have investigated transistors based on five oxide thicknesses and two SAMs with rather diverse chemical properties, using the benchmark organic semiconductor dinaphtho[2,3-b:2',3'-f]thieno[3,2-b]thiophene. Unlike several previous studies, we have found that the dependence of the threshold voltage on the gate-dielectric capacitance is completely different for the two SAMs. In transistors with an alkyl SAM, the threshold voltage does not depend on the gate-dielectric capacitance and is determined mainly by the dipolar character of the SAM, whereas in transistors with a fluoroalkyl SAM the threshold voltage exhibits a linear dependence on the inverse of the gate-dielectric capacitance. Kelvin probe force microscopy measurements indicate that this behavior can be attributed to an electronic coupling between the fluoroalkyl SAM and the organic semiconductor.

  10. Pressure-drop and density-wave instability thresholds in boiling channels

    International Nuclear Information System (INIS)

    Gurgenci, H.; Yildirim, T.; Kakac, S.; Veziroglu, T.N.

    1987-01-01

    In this study, a criterion for linearized stability with respect to both pressure-drop and density-wave oscillations is developed for a single-channel upflow boiling system operating between constant pressures, with upstream compressibility introduced through a surge tank. Two different two-phase flow models, namely a constant-property homogeneous flow model and a variable-property drift-flux model, have been employed. The conservation equations for both models and the equations of surge tank dynamics are first linearized for small perturbations, and the stability of the resulting set of equations for each model is examined by use of Nyquist plots. As a measure of the relative instability of the system, the amount of inlet throttling necessary to stabilize the system at particular operating points has been calculated. The results are compared with experimental findings. Comparisons show that the drift-flux formulation offers a simple and reliable way of determining the instability thresholds

  11. The risk equivalent of an exposure to-, versus a dose of radiation

    International Nuclear Information System (INIS)

    Bond, V.P.

    1986-01-01

    The long-term potential carcinogenic effects of low-level exposure (LLE) are addressed. The principal point discussed is the linear, no-threshold dose-response curve. That the linear no-threshold, or proportional, relationship is widely used is seen in the way in which values for cancer risk coefficients are expressed - in terms of new cases, per million persons exposed, per year, per unit exposure or dose. This implies that the underlying relationship is proportional, i.e., ''linear, without threshold''. 12 refs., 9 figs., 1 tab

  12. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

    Ever since the classic work of Wannier in 1953, the process of treating two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power-law behavior, akin to Wannier's law. Noteworthy aspects of this derivation are that it can be conveniently continued below threshold, giving rise to a ''cusp'' at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained

  13. Thresholds of a bunched beam longitudinal instability in proton synchrotrons

    International Nuclear Information System (INIS)

    Balbekov, V.I.; Ivanov, S.V.

    1986-01-01

    Formulas and graphs are given for calculating the instability thresholds arising from the interaction of a bunched proton beam with a narrow-band resonator. Three types of instability are considered: oscillations of a definite multipolarity, oscillations of several coupled multipoles, and microwave oscillations arising from the addition of a great number of multipoles. The analysis of these data shows that an increase of the oscillation nonlinearity is accompanied by a growth of the instability threshold only in the zone of separated and weakly coupled multipoles. An increase of the spread of synchrotron frequencies reduces the zone of separated multipoles, owing to which the microwave bunch instability can be caused by lower and lower frequency resonators. In the microwave zone there is practically no stabilizing effect of the synchrotron frequency spread. The instability threshold of the bunched beam nowhere exceeds the microwave level

  14. Microkinetic Modeling of Lean NOx Trap Storage and Regeneration

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Richard S. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Chakravarthy, V. Kalyana [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pihl, Josh A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Daw, C. Stuart [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2011-12-01

    A microkinetic chemical reaction mechanism capable of describing both the storage and regeneration processes in a fully formulated lean NOx trap (LNT) is presented. The mechanism includes steps occurring on the precious metal, barium oxide (NOx storage), and cerium oxide (oxygen storage) sites of the catalyst. The complete reaction set is used in conjunction with a transient plug flow reactor code (including boundary layer mass transfer) to simulate not only a set of long storage/regeneration cycles with a CO/H2 reductant, but also a series of steady flow temperature sweep experiments that were previously analyzed with just a precious metal mechanism and a steady state code neglecting mass transfer. The results show that, while mass transfer effects are generally minor, NOx storage is not negligible during some of the temperature ramps, necessitating a re-evaluation of the precious metal kinetic parameters. The parameters for the entire mechanism are inferred by finding the best overall fit to the complete set of experiments. Rigorous thermodynamic consistency is enforced for parallel reaction pathways and with respect to known data for all of the gas phase species involved. It is found that, with a few minor exceptions, all of the basic experimental observations can be reproduced with the transient simulations. In addition to accounting for normal cycling behavior, the final mechanism should provide a starting point for the description of further LNT phenomena such as desulfation and the role of alternative reductants.

  15. Climate change and critical thresholds in China's food security

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, Wei; Lin, Erda; Ju, Hui; Xu, Yinlong [Institute of Environment and Sustainable Development in Agriculture, Chinese Academy of Agricultural Sciences, Beijing (China)

    2007-03-15

    Identification of 'critical thresholds' of temperature increase is an essential task to inform policy decisions on establishing greenhouse gas (GHG) emission targets. We use the A2 (medium-high GHG emission pathway) and B2 (medium-low) climate change scenarios produced by the regional climate model PRECIS, the crop model CERES, and socio-economic scenarios described by IPCC SRES to simulate the average yield changes per hectare of three main grain crops (rice, wheat, and maize) at 50 km x 50 km scale. The threshold of food production to temperature increases was analyzed based on the relationship between yield changes and temperature rise, and food security was then discussed for each IPCC SRES scenario. The results show that without the CO2 fertilization effect in the analysis, the yield per hectare for the three crops would fall consistently as temperature rises beyond 2.5 °C; when the CO2 fertilization effect was included in the simulation, there were no adverse impacts on China's food production under the projected range of temperature rise (0.9-3.9 °C). A critical threshold of temperature increase was not found for food production. When the socio-economic scenarios, agricultural technology development and international trade were incorporated in the analysis, China's internal food production would meet a critical threshold of basic demand (300 kg/capita) under B2 but not under A2 (no CO2 fertilization); whereas basic food demand would be satisfied under both A2 and B2, and would even meet the higher food demand threshold required to sustain economic growth (400 kg/capita) under B2, when CO2 fertilization was considered.

  16. Double Photoionization Near Threshold

    Science.gov (United States)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  17. Lasing thresholds of helical photonic structures with different positions of a single light-amplifying helix turn

    Energy Technology Data Exchange (ETDEWEB)

    Blinov, L M; Palto, S P [A.V. Shubnikov Institute of Crystallography, Russian Academy of Sciences, Moscow (Russian Federation)

    2013-09-30

    Numerical simulation is used to assess the lasing threshold of helical structures of cholesteric liquid crystals (CLCs) in which only one turn amplifies light. This turn is located either in the centre of symmetric structures of various sizes or in an arbitrary place in asymmetric structures of preset size. In all cases, we find singularities in light amplification by a one-dimensional CLC structure for the most important band-edge modes (m1, m2 and m3) and plot the threshold gain coefficient k_th against the position of the amplifying turn. For the symmetric structures, the lasing threshold of the m1 mode is shown to vary linearly with the inverse of the square of the cavity length. Moreover, modes with a lower density of photonic states (DOS) in the cavity may have a lower lasing threshold. This can be accounted for by the dependence of the density of photonic states on the position of the amplifying turn and, accordingly, by the nonuniform electromagnetic field intensity distribution along the cavity for different modes. In the asymmetric structures, the same field energy distribution is responsible for a correlation between k_th and DOS curves. (lasers)

  18. 75 FR 75911 - Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents/Incidents for Calendar...

    Science.gov (United States)

    2010-12-07

    ..., Notice No. 3] RIN 2130-ZA04 Adjustment of Monetary Threshold for Reporting Rail Equipment Accidents... (DOT). ACTION: Final rule. SUMMARY: This rule increases the rail equipment accident/incident reporting threshold from $9,200 to $9,400 for certain railroad accidents/incidents involving property damage that...

  19. Finance-growth nexus: Insights from an application of threshold regression model to Malaysia's dual financial system

    Directory of Open Access Journals (Sweden)

    Alaa Alaabed

    2016-06-01

    Full Text Available The purpose of this paper is to test the growing and converging views regarding the destabilizing and growth-halting impact of an interest-based debt financial system, as advocated by the followers of Keynes and Hyman Minsky and by those of Islam. Islam discourages interest-based debt financing because it regards it as unconducive to productive activities and to human solidarity. Likewise, since the onset of the 2007/2008 crisis, such calls by skeptics of mainstream capitalism have been renewed. The paper applies a threshold regression model to Malaysian data and finds that the relationship between growth and financial development is non-linear: a threshold is estimated, beyond which credit expansion negatively affects GDP growth. While the post-threshold negative relationship is statistically significant, the estimated positive relationship at lower levels of financial development is insignificant. The findings support the above views and should help guide monetary authorities toward better growth-promoting policy-making.
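The single-threshold regression idea used in this record can be sketched as a grid search over candidate thresholds that minimises the sum of squared residuals of a two-regime linear model (in the spirit of Hansen-type threshold regression). The data below are synthetic and the function name is illustrative, not the paper's implementation.

```python
import numpy as np

def fit_threshold_regression(x, q, y, trim=0.15):
    """Fit y = a1 + b1*x for q <= tau and y = a2 + b2*x for q > tau,
    choosing tau over a trimmed quantile grid to minimise the total SSR."""
    candidates = np.quantile(q, np.linspace(trim, 1 - trim, 101))
    best_tau, best_ssr = None, np.inf
    for tau in candidates:
        ssr = 0.0
        for mask in (q <= tau, q > tau):
            X = np.column_stack([np.ones(mask.sum()), x[mask]])
            beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            ssr += np.sum((y[mask] - X @ beta) ** 2)
        if ssr < best_ssr:
            best_tau, best_ssr = tau, ssr
    return best_tau, best_ssr

# Synthetic illustration: the growth regime shifts at tau = 0.5, with credit
# expansion helping growth below the threshold and hurting it above.
rng = np.random.default_rng(0)
credit = rng.uniform(0, 1, 400)
growth = np.where(credit <= 0.5, 2.0 + 1.5 * credit, 3.0 - 2.0 * credit)
growth += rng.normal(0, 0.05, 400)
tau_hat, _ = fit_threshold_regression(credit, credit, growth)
print(round(tau_hat, 2))
```

The trimming keeps at least 15% of the sample in each regime, which is the usual guard against degenerate fits at the edges of the threshold-variable distribution.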

  20. Rejection thresholds in solid chocolate-flavored compound coating.

    Science.gov (United States)

    Harwood, Meriel L; Ziegler, Gregory R; Hayes, John E

    2012-10-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers compared to melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate, a bitter and generally recognized as safe additive. Paired preference tests (blank compared to spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between 2 self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (P= 0.01). Conversely, eating style did not affect group rejection thresholds (P= 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (P= 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. This work makes use of the rejection threshold method to study market segmentation, extending its use to solid foods. We believe this method has broad applicability to the sensory specialist and product developer by providing a

  1. The effect of age and gender on pressure pain thresholds and suprathreshold stimuli

    DEFF Research Database (Denmark)

    Petrini, Laura; Tomczak Matthiesen, Susan; Arendt-Nielsen, Lars

    2015-01-01

    The study investigates the impact of age and gender on (1) experimental pressure pain detection thresholds (PPDT) and pressure pain tolerance thresholds (PPTolT) and (2) participants’self-reports of pain intensity and unpleasantness at suprathreshold and subthreshold levels. Methods: twenty young...... (20–34, mean age = 24.6 ± 3.5 years, ten female) and twenty elderly (65–88, mean age = 73.7 ± 6.6 years, ten female) healthy volunteers were compared. Mini-Mental State Examination (MMSE 28–30) assessed intact cognitive functioning. Pain thresholds were assessed together with the sensory intensity...... ratings to 1.3 × PPDT (pain) and 0.2 × PPDT (no pain). Results: PPDT and PPTolT significantly decreased with age and were lower in young females as compared with young males. No gender differences were observed in the elderly group. PPDT decreased significantly with age in males but not in females...

  2. Temperature thresholds and thermal requirements for development of Nasonovia ribisnigri (Hemiptera: Aphididae).

    Science.gov (United States)

    Diaz, Beatriz Maria; Muñiz, Mariano; Barrios, Laura; Fereres, Alberto

    2007-08-01

    Early detection of Nasonovia ribisnigri (Mosley) (Hemiptera: Aphididae) on lettuce is of primary importance for its effective control. Temperature thresholds for development of this pest were estimated using developmental rates [r(T)] at different constant temperatures (8, 12, 16, 20, 24, 26, and 28 degrees C). Observed developmental rates data and temperature were fitted to two linear (Campbell and Muñiz and Gil) and a nonlinear (Lactin) models. Lower temperature threshold estimated by the Campbell model was 3.6 degrees C for apterous, 4.1 degrees C for alates, and 3.1 degrees C for both aphid adult morphs together. Similar values of the lower temperature threshold were obtained with the Muñiz and Gil model, for apterous (4.0 degrees C), alates (4.2 degrees C), and both adult morphs together (3.7 degrees C) of N. ribisnigri. Thermal requirements of N. ribisnigri to complete development were estimated by Campbell and Muñiz and Gil models for apterous in 125 and 129 DD and for both adult morphs together in 143 and 139 DD, respectively. For complete development from birth to adulthood, the alate morph needed 15-18 DD more than the apterous morph. The lower temperature threshold determined by the Lactin model was 5.3 degrees C for alates, 2.3 degrees C for apterous, and 1.9 degrees C for both adult morphs together. The optimal and upper temperature thresholds were 25.2 and 33.6 degrees C, respectively, for the alate morph, 27 and 35.9 degrees C, respectively, for the apterous morph, and 26.1 and 35.3 degrees C, respectively, for the two adult morphs together. The Campbell model provided the best fit to the observed developmental rates data of N. ribisnigri. This information could be incorporated in forecasting models of this pest.
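The Campbell-type linear model used above can be reduced to two numbers: regressing the development rate r(T) = a + bT on temperature gives the lower temperature threshold as T0 = -a/b and the thermal constant (degree-days to complete development) as K = 1/b. A minimal sketch with illustrative rates (not the paper's data):

```python
import numpy as np

# Illustrative, perfectly linear development rates (1/days) at constant
# temperatures; real data would scatter around the fitted line.
temps = np.array([8.0, 12.0, 16.0, 20.0, 24.0])
rates = np.array([0.032, 0.064, 0.096, 0.128, 0.160])

# Fit r(T) = a + b*T, then derive the two biological parameters.
b, a = np.polyfit(temps, rates, 1)
t0 = -a / b   # lower temperature threshold (degrees C)
k = 1.0 / b   # thermal constant (degree-days)
print(round(t0, 1), round(k, 1))
```

With these numbers the sketch yields T0 = 4.0 degrees C and K = 125 degree-days, of the same order as the Campbell-model estimates reported for the apterous morph.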

  3. Determination of Cost-Effectiveness Threshold for Health Care Interventions in Malaysia.

    Science.gov (United States)

    Lim, Yen Wei; Shafie, Asrul Akmal; Chua, Gin Nie; Ahmad Hassali, Mohammed Azmi

    2017-09-01

    One major challenge in prioritizing health care using cost-effectiveness (CE) information is when alternatives are more expensive but more effective than existing technology. In such a situation, an external criterion in the form of a CE threshold that reflects the willingness to pay (WTP) per quality-adjusted life-year is necessary. To determine a CE threshold for health care interventions in Malaysia. A cross-sectional, contingent valuation study was conducted using a stratified multistage cluster random sampling technique in four states in Malaysia. One thousand thirteen respondents were interviewed in person for their socioeconomic background, quality of life, and WTP for a hypothetical scenario. The CE thresholds established using the nonparametric Turnbull method ranged from MYR12,810 to MYR22,840 (~US $4,000-US $7,000), whereas those estimated with the parametric interval regression model were between MYR19,929 and MYR28,470 (~US $6,200-US $8,900). Key factors that affected the CE thresholds were education level, estimated monthly household income, and the description of health state scenarios. These findings suggest that there is no single WTP value for a quality-adjusted life-year. The CE threshold estimated for Malaysia was found to be lower than the threshold value recommended by the World Health Organization. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
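The nonparametric Turnbull estimate mentioned above can be sketched for single-bounded dichotomous-choice data: the share rejecting each bid gives an empirical CDF of WTP, which is monotonised and then integrated with each probability mass assigned to the lower bid endpoint. The bid amounts, acceptance shares, and function name below are hypothetical, and the cumulative-maximum step is a crude stand-in for full pooled-adjacent-violators pooling.

```python
import numpy as np

def turnbull_lower_bound(bids, accept_rates):
    """Turnbull lower-bound mean WTP from single-bounded dichotomous-choice
    data. bids: ascending bid amounts; accept_rates: share answering 'yes'."""
    F = 1.0 - np.asarray(accept_rates, float)   # empirical P(WTP <= bid)
    F = np.maximum.accumulate(F)                # crude monotone correction
    t = np.concatenate(([0.0], np.asarray(bids, float)))
    F_full = np.concatenate(([0.0], F, [1.0]))
    # Lower bound: probability mass between bids placed at the lower endpoint.
    return float(np.sum(t * np.diff(F_full)))

# Hypothetical bid design (MYR) and acceptance shares, for illustration only.
bids = [5000, 10000, 20000, 30000, 50000]
accept = [0.90, 0.75, 0.45, 0.25, 0.10]
print(round(turnbull_lower_bound(bids, accept)))
```

Because all mass is pushed to the lower endpoint of each interval, the estimator is conservative, which is one reason Turnbull thresholds typically come out below parametric (e.g. interval-regression) estimates, as in the record above.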

  4. Three caveats for linear stability theory: Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Greenside, H.S.

    1984-06-01

    Recent theories and experiments challenge the applicability of linear stability theory near the onset of buoyancy-driven (Rayleigh-Benard) convection. This stability theory, based on small perturbations of infinite parallel rolls, is found to miss several important features of the convective flow. The reason is that the lateral boundaries have a profound influence on the possible wave numbers and flow patterns even for the largest cells studied. Also, the nonlinear growth of incoherent unstable modes distorts the rolls, leading to a spatially disordered and sometimes temporally nonperiodic flow. Finally, the relation of the skewed varicose instability to the onset of turbulence (nonperiodic time dependence) is examined. Linear stability theory may not suffice to predict the onset of time dependence in large cells close to threshold

  5. Does more energy consumption bolster economic growth? An application of the nonlinear threshold regression model

    International Nuclear Information System (INIS)

    Huang, B.-N.; Hwang, M.J.; Yang, C.W.

    2008-01-01

    This paper separates data extending from 1971 to 2002 into the energy crisis period (1971-1980) and the post-energy crisis period (1981-2000) for 82 countries. The cross-sectional data (yearly averages) in these two periods are used to investigate the nonlinear relationships between energy consumption growth and economic growth when threshold variables are used. If threshold variables are higher than certain optimal threshold levels, there is either no significant relationship or else a significant negative relationship between energy consumption and economic growth. However, when these threshold variables are lower than certain optimal levels, there is a significant positive relationship between the two. In 48 out of the 82 countries studied, none of the four threshold variables is found to be higher than the optimal levels. It is inferred that these 48 countries should adopt a more aggressive energy policy. As for the other 34 countries, at least one threshold variable is higher than the optimal threshold level and thus these countries should adopt energy policies with varying degrees of conservation based on the number of threshold variables that are higher than the optimal threshold levels

  6. Linear-rank testing of a non-binary, responder-analysis, efficacy score to evaluate pharmacotherapies for substance use disorders.

    Science.gov (United States)

    Holmes, Tyson H; Li, Shou-Hua; McCann, David J

    2016-11-23

    The design of pharmacological trials for management of substance use disorders is shifting toward outcomes of successful individual-level behavior (abstinence or no heavy use). While binary success/failure analyses are common, McCann and Li (CNS Neurosci Ther 2012; 18: 414-418) introduced "number of beyond-threshold weeks of success" (NOBWOS) scores to avoid dichotomized outcomes. NOBWOS scoring employs an efficacy "hurdle", with values reflecting duration of success. Here, we evaluate NOBWOS scores rigorously. Formal analysis of the mathematical structure of NOBWOS scores is followed by simulation studies spanning diverse conditions to assess the operating characteristics of five linear-rank tests on NOBWOS scores. The simulations include an assessment of Fisher's exact test applied to the hurdle component. On average, statistical power was approximately equal across the five linear-rank tests. Under none of the conditions examined did Fisher's exact test exhibit greater statistical power than any of the linear-rank tests. These linear-rank tests provide good control of Type I and Type II error for comparing distributions of NOBWOS scores between groups (e.g., active vs. placebo). All methods were applied in re-analyses of data from four clinical trials of differing lengths and substances of abuse. The linear-rank tests agreed across all trials in rejecting (or not) their null hypothesis (equality of distributions) at ≤ 0.05. © The Author(s) 2016.
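The Wilcoxon rank-sum (Mann-Whitney) test is one member of the linear-rank family evaluated in this record, and it handles the hurdle-induced mass of tied zeros via midranks. The NOBWOS-like scores below are hypothetical, chosen only to illustrate the shape of such data: zero for participants who never clear the efficacy hurdle, otherwise the number of beyond-threshold weeks of success.

```python
from scipy.stats import mannwhitneyu

# Hypothetical NOBWOS-like scores for two arms of a trial.
active  = [0, 0, 0, 2, 3, 4, 5, 6, 7, 8]
placebo = [0, 0, 0, 0, 0, 0, 0, 1, 1, 2]

# Two-sided rank-sum comparison of the two score distributions; the many
# tied zeros are assigned midranks rather than being discarded.
stat, p = mannwhitneyu(active, placebo, alternative='two-sided')
print(stat, round(p, 4))
```

In contrast to Fisher's exact test on the dichotomized hurdle alone, the rank-based comparison also uses the ordering among the positive scores, which is the source of its extra power in the simulations described above.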

  7. Efeito do estímulo facilitador no limiar de reflexo acústico The facilitating stimulus effect in the acoustic reflex threshold

    Directory of Open Access Journals (Sweden)

    Renata M. M. Carvallo

    2004-04-01

    Full Text Available Auditory sensitization, a tool used in acoustic reflex research, allows the acoustic reflex threshold to be lowered by means of a facilitating stimulus, which may be presented before or simultaneously with the reflex-eliciting tone. Thresholds measured before and after exposure to the stimulus are compared, a reduction of the threshold being expected. The study of acoustic reflexes yields information about the auditory pathways, including brainstem structures, since the reflex arc involves auditory nuclei in that region; these nuclei are also involved in auditory processing, so acoustic reflex alterations could be related to deficits in auditory processing skills. AIM: to study sensitization of the acoustic reflex using a 6 kHz facilitating stimulus. STUDY DESIGN: clinical study with a cross-sectional cohort. MATERIAL AND METHOD: young women aged 20 to 25 years, with no audiological complaints and with hearing thresholds within normal limits. RESULTS: a significant reduction of the acoustic reflex threshold, between 6.71 and 17.23 dB, was found in the ears that underwent sensitization. CONCLUSION: simultaneous presentation of a high-frequency facilitating stimulus lowers the acoustic reflex threshold in people with hearing within normal limits.

  8. Alien plant invasions and native plant extinctions: a six-threshold framework.

    Science.gov (United States)

    Downey, Paul O; Richardson, David M

    2016-01-01

    Biological invasions are widely acknowledged as a major threat to global biodiversity. Species from all major taxonomic groups have become invasive. The range of impacts of invasive taxa and the overall magnitude of the threat is increasing. Plants comprise the biggest and best-studied group of invasive species. There is a growing debate, however, regarding the nature of the alien plant threat, in particular whether the outcome is likely to be the widespread extinction of native plant species. The debate has raised questions on whether the threat posed by invasive plants to native plants has been overstated. We provide a conceptual framework to guide discussion on this topic, in which the threat posed by invasive plants is considered in the context of a progression from no impact through to extinction. We define six thresholds along the 'extinction trajectory', global extinction being the final threshold. Although there are no documented examples of either 'in the wild' (Threshold 5) or global extinctions (Threshold 6) of native plants that are attributable solely to plant invasions, there is evidence that native plants have crossed or breached other thresholds along the extinction trajectory due to the impacts associated with plant invasions. Several factors may be masking where native species are on the trajectory; these include a lack of appropriate data to accurately map the position of species on the trajectory, the timeframe required to definitively state that extinctions have occurred, and management interventions. Such interventions, focussing mainly on Thresholds 1-3 (a declining population through to the local extinction of a population), are likely to alter the extinction trajectory of some species. The critical issue for conservation managers is the trend, because interventions must be implemented before extinctions occur. Thus the lack of evidence for extinctions attributable to plant invasions does not mean we should disregard the broader threat.

  9. Aeolian Erosion on Mars - a New Threshold for Saltation

    Science.gov (United States)

    Teiser, J.; Musiolik, G.; Kruss, M.; Demirci, T.; Schrinski, B.; Daerden, F.; Smith, M. D.; Neary, L.; Wurm, G.

    2017-12-01

    The Martian atmosphere shows a large variety of dust activity, ranging from local dust devils to global dust storms. Sand motion has also been observed in the form of moving dunes. The dust entrainment into the Martian atmosphere is not well understood because of the small atmospheric pressure of only a few mbar. Laboratory experiments on Earth and numerical models have been developed to understand the processes leading to dust lifting and saltation. Experiments so far suggested that large wind velocities are needed to reach the threshold shear velocity and entrain dust into the atmosphere. In global circulation models this threshold shear velocity is therefore typically reduced artificially to reproduce the observed dust activity. Although preceding experiments were designed to simulate Martian conditions, none so far could scale all parameters to Martian conditions, as either the atmospheric or the gravitational conditions were left unscaled. In this work, a first experimental study of saltation under Martian conditions is presented. Martian gravity is reached by a centrifuge on a parabolic flight, while the pressure (6 mbar) and atmospheric composition (95% CO2, 5% air) are adjusted to Martian levels. A sample of JSC 1A (grain sizes from 10-100 µm) was used to simulate Martian regolith. The experiments showed that the reduced gravity (0.38 g) not only affects the weight of the dust particles but also influences the packing density within the soil and therefore the cohesive forces. The measured threshold shear velocity of 0.82 m/s is significantly lower than the value measured at 1 g in ground experiments (1.01 m/s). Feeding the measured value into a global circulation model showed that no artificial reduction of the threshold shear velocity may be needed to reproduce the global dust distribution in the Martian atmosphere.

  10. Perceptibility and acceptability thresholds for colour differences in dentistry

    NARCIS (Netherlands)

    Khashayar, G.; Bain, P.A.; Salari, S.; Dozic, A.; Kleverlaan, C.J.; Feilzer, A.J.

    2014-01-01

    Introduction Data on acceptability (AT) and perceptibility thresholds (PT) for colour differences vary in dental literature. There is consensus that the determination of ΔE* is appropriate to define AT and PT, however there is no consensus regarding the values that should be used. The aim of this

  11. Evaluation of participants' perception and taste thresholds with a zirconia palatal plate.

    Science.gov (United States)

    Wada, Takeshi; Takano, Tomofumi; Tasaka, Akinori; Ueda, Takayuki; Sakurai, Kaoru

    2016-10-01

    Zirconia and cobalt-chromium can withstand a similar degree of loading. Therefore, using a zirconia base for removable dentures could allow the thickness of the palatal area to be reduced similarly to metal base dentures. We hypothesized that a zirconia palatal plate for removable dentures provides a high level of participants' perception without influencing taste thresholds. The purpose of this study was to evaluate participants' perception and taste thresholds with a zirconia palatal plate. Palatal plates fabricated from acrylic resin, zirconia, and cobalt-chromium alloy were inserted in healthy individuals. Taste thresholds were investigated using the whole-mouth gustatory test, and participants' perception was evaluated using a 100-mm visual analog scale assessing ease of pronunciation, ease of swallowing, sensation of temperature, metallic taste, sensation of a foreign body, subjective sensation of weight, adhesiveness of chewing gum, and general satisfaction. For the taste thresholds, no significant differences were noted in sweet, salty, sour, bitter, or umami tastes among participants wearing no plate or the resin, zirconia, and metal plates. Speech was easier and foreign body sensation was lower with the zirconia plate than with the resin plate. Evaluation of the adhesiveness of chewing gum showed that gum does not readily adhere to the zirconia plate in comparison with the metal plate. The comprehensive participants' perception of the zirconia plate was evaluated as superior to the resin plate. A zirconia palatal plate provides a high level of participants' perception without influencing taste thresholds. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  12. THRESHOLD DETERMINATION FOR LOCAL INSTANTANEOUS SEA SURFACE HEIGHT DERIVATION WITH ICEBRIDGE DATA IN BEAUFORT SEA

    Directory of Open Access Journals (Sweden)

    C. Zhu

    2018-05-01

    Full Text Available The NASA Operation IceBridge (OIB) mission, initiated in 2009, is currently the largest airborne remote sensing program in polar Earth science observation, collecting measurements to bridge the gap between NASA's ICESat and the upcoming ICESat-2 mission. This paper develops an improved method that optimizes the selection of Digital Mapping System (DMS) images and uses an optimal threshold, obtained from experiments in the Beaufort Sea, to calculate the local instantaneous sea surface height in this area. The optimal threshold was determined by comparing manual selections with the lowest 2 %, 1 %, 0.5 %, 0.2 %, 0.1 % and 0.05 % of Airborne Topographic Mapper (ATM) L1B elevations in sections A, B and C; the means of the mean differences are 0.166 m, 0.124 m, 0.083 m, 0.018 m, 0.002 m and −0.034 m, respectively. Our study shows that the lowest 0.1 % of the L1B data is the optimal threshold. The optimal threshold and the manual selections were both used to calculate the instantaneous sea surface height over images with leads, and we find that the improved method has closer agreement with the L1B manual selections. For images without leads, the local instantaneous sea surface height was estimated using linear equations, fitted between distance and sea surface height, derived over images with leads.
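The lowest-elevation threshold idea can be sketched directly: within a section of laser-altimeter shots, the lowest fraction of elevations is assumed to sample open-water leads, and their mean is taken as the local instantaneous sea surface height. The elevations below are synthetic (an arbitrary 23 m reference height with a simulated ice surface and lead returns), and `local_ssh` is an illustrative helper, not the paper's code.

```python
import numpy as np

def local_ssh(elevations, fraction=0.001):
    """Mean of the lowest `fraction` of elevations (e.g. 0.1 % of L1B shots),
    taken as the local instantaneous sea surface height."""
    elev = np.sort(np.asarray(elevations, float))
    n = max(1, int(round(fraction * elev.size)))
    return float(elev[:n].mean())

# Synthetic section: mostly ice surface ~0.4 m above the sea surface, plus a
# small population of lead returns scattered tightly around the sea surface.
rng = np.random.default_rng(1)
ice = 23.0 + rng.normal(0.4, 0.1, 99_000)
leads = 23.0 + rng.normal(0.0, 0.02, 1_000)
ssh = local_ssh(np.concatenate([ice, leads]), fraction=0.001)
print(round(ssh - 23.0, 2))
```

The choice of fraction is the trade-off the record quantifies: too large a fraction mixes ice returns into the "sea surface" sample (biasing the height upward), while too small a fraction overweights the extreme low tail of the lead returns.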

  13. Does Deep Cervical Flexor Muscle Training Affect Pain Pressure Thresholds of Myofascial Trigger Points in Patients with Chronic Neck Pain? A Prospective Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Pavlos Bobos

    2016-01-01

    Full Text Available Background. We need to understand more about how DNF training performs in different contexts and whether it affects the pain threshold over myofascial trigger points (MTrPs). Purpose. The objectives were to investigate the effect of neck muscle training on disability and pain, and on the pain threshold over MTrPs, in people with chronic neck pain. Methods. Patients with chronic neck pain were eligible for participation with a Neck Disability Index (NDI) score of over 5/50 and at least one MTrP on the levator scapulae, upper trapezius, or splenius capitis muscle. Patients were randomly assigned to either DNF training, superficial neck muscle exercise, or an advice group. A generalized linear model (GLM) was used to detect differences between treatment groups over time. Results. Out of 67 participants, 60 (47 females, mean age: 39.45 ± 12.67 years) completed the study. Neck disability and neck pain improved over time both between and within groups (p < 0.05). However, no differences were found within or between the therapeutic groups (p > 0.05) in the tested muscles' PPTs or in the cervicothoracic angle over the 7-week period. Conclusion. All three groups improved over time, which suggests that the pain pathways involved in neck pain relief are not those involved in the pain threshold.

  14. Representation of dynamical stimuli in populations of threshold neurons.

    Directory of Open Access Journals (Sweden)

    Tatjana Tchumatchenko

    2011-10-01

    Full Text Available Many sensory or cognitive events are associated with dynamic current modulations in cortical neurons. This raises an urgent demand for tractable model approaches addressing the merits and limits of potential encoding strategies. Yet, current theoretical approaches addressing the response to mean- and variance-encoded stimuli rarely provide complete response functions for both modes of encoding in the presence of correlated noise. Here, we investigate the neuronal population response to dynamical modifications of the mean or variance of the synaptic bombardment using an alternative threshold model framework. In the variance and mean channel, we provide explicit expressions for the linear and non-linear frequency response functions in the presence of correlated noise and use them to derive population rate response to step-like stimuli. For mean-encoded signals, we find that the complete response function depends only on the temporal width of the input correlation function, but not on other functional specifics. Furthermore, we show that both mean- and variance-encoded signals can relay high-frequency inputs, and in both schemes step-like changes can be detected instantaneously. Finally, we obtain the pairwise spike correlation function and the spike triggered average from the linear mean-evoked response function. These results provide a maximally tractable limiting case that complements and extends previous results obtained in the integrate and fire framework.
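A minimal static sketch of the threshold-model framework described above: each unit "fires" whenever its noisy input exceeds a fixed threshold, so the population rate is simply the fraction of units above threshold at a given instant. A step in the input mean (mean encoding) or in the noise amplitude (variance encoding) both move that fraction instantaneously, which is the tractable core of the instantaneous step detection the abstract describes. This sketch ignores temporal noise correlations, which the paper treats explicitly; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, theta = 100_000, 1.0     # population size and firing threshold
noise = rng.normal(0.0, 1.0, n_units)

# Fraction of units above threshold = instantaneous population rate.
rate_before    = np.mean(noise > theta)          # baseline: mean 0, sigma 1
rate_mean_step = np.mean(0.5 + noise > theta)    # mean-encoded step to 0.5
rate_var_step  = np.mean(2.0 * noise > theta)    # variance-encoded step to sigma 2
print(round(rate_before, 3), round(rate_mean_step, 3), round(rate_var_step, 3))
```

Both steps raise the rate in the very sample in which they occur, with no integration delay, illustrating why step-like changes can be detected instantaneously in either encoding channel.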

  15. (t, n) Threshold d-Level Quantum Secret Sharing.

    Science.gov (United States)

    Song, Xiu-Li; Liu, Yan-Bing; Deng, Hong-Yao; Xiao, Yong-Gang

    2017-07-25

    Most quantum secret sharing (QSS) schemes are (n, n) threshold 2-level schemes, in which the 2-level secret cannot be reconstructed until all n shares are collected. In this paper, we propose a (t, n) threshold d-level QSS scheme, in which the d-level secret can be reconstructed only if at least t shares are collected. Compared with (n, n) threshold 2-level QSS, the proposed QSS provides better universality, flexibility, and practicability. Moreover, in this scheme, no participant knows the other participants' shares; even the trusted reconstructor Bob 1 is no exception. The transformation of the particles involves some simple operations such as the d-level CNOT, the Quantum Fourier Transform (QFT), the Inverse Quantum Fourier Transform (IQFT), and generalized Pauli operators. The transformed particles need not be transmitted from one participant to another over the quantum channel. Security analysis shows that the proposed scheme can resist intercept-resend, entangle-measure, collusion, and forgery attacks. Performance comparison shows that it has lower computation and communication costs than other similar schemes when 2 < t < n - 1.
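The (t, n) threshold property itself is classical in origin: in Shamir's scheme over a prime field, the secret is the constant term of a random degree-(t-1) polynomial, any t shares recover it by Lagrange interpolation at x = 0, and fewer than t reveal nothing. The sketch below is this classical analogue only; it does not implement the paper's quantum scheme, and all parameter values are illustrative.

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is in GF(P)

def make_shares(secret, t, n, rng=random.Random(42)):
    """Shamir (t, n) sharing: shares are points on a random degree-(t-1)
    polynomial whose constant term is the secret."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 using any t distinct shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # Modular inverse via Fermat's little theorem: den^(P-2) mod P.
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789, reconstruct(shares[2:5]) == 123456789)
```

Any subset of 3 of the 5 shares suffices, and which subset is used does not matter; the quantum construction above transplants this reconstruction rule into d-level quantum states via QFT-based operations.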

  16. Study of 1D complex resistivity inversion using digital linear filter technique; Linear filter ho wo mochiita fukusohi teiko no gyakukaisekiho no kento

    Energy Technology Data Exchange (ETDEWEB)

    Sakurai, K; Shima, H [OYO Corp., Tokyo (Japan)

    1996-10-01

    This paper proposes a modeling method for one-dimensional complex resistivity using the linear filter technique, extended here to complex resistivity. In addition, a numerical inversion test was conducted using the modeling results, to discuss the measured frequency band. The linear filter technique is a method by which the theoretical potential can be calculated for stratified structures, and it is widely used in the one-dimensional analysis of dc electrical exploration. The modeling can be carried out using only values of complex resistivity, without using potential values. In this study, a bipolar electrode configuration was employed. The numerical test of one-dimensional complex resistivity inversion was conducted using the formulated modeling. A three-layered structure was used as the numerical model. A multi-layer structure with a thickness of 5 m was analyzed on the basis of the apparent complex resistivity calculated from the model. The numerical test showed that both the chargeability and the time constant agreed well with those of the original model. A trade-off was observed between the chargeability and the time constant at the stage of convergence. 3 refs., 9 figs., 1 tab.

  17. High pain sensitivity is distinct from high susceptibility to non-painful sensory input at threshold level.

    Science.gov (United States)

    Hummel, Thomas; Springborn, Maria; Croy, Ilona; Kaiser, Jochen; Lötsch, Jörn

    2011-04-01

    Individuals may differ considerably in their sensitivity towards various painful stimuli, supporting the notion of a person as stoical or as complaining about pain. Molecular and functional imaging research suggests that this may extend to other sensory qualities as well. Whether a person can be characterized as possessing a generally high or low sensory acuity is unknown. This was therefore assessed with thresholds to painful and non-painful stimuli, with a focus on chemical stimuli that, besides pain, may evoke clearly non-painful sensations such as taste or smell. In 36 healthy men and 78 women (ages 18 to 52 years), pain thresholds to chemo-somatosensory (intranasal gaseous CO2) and electrical stimuli (cutaneous stimulation) were significantly correlated (ρ² = 0.2268). This correlation did not extend to non-painful sensory qualities, i.e., to olfactory thresholds for the rose-like odor phenyl ethyl alcohol and gustatory thresholds for sour (citric acid) and salty (NaCl). Similarly, pain clusters showed no differences in thresholds to the other stimuli. Moreover, no clustering was obtained for thresholds to painful and non-painful stimuli together. Thus, individuals could not be characterized as highly sensitive (or insensitive) to all chemical stimuli regardless of whether they evoke pain. This suggests that pain is primarily a singular sensory perception, distinct from others such as olfaction or taste. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Impaired NaCl taste thresholds in Zn deprived rats

    International Nuclear Information System (INIS)

    Brosvic, G.M.; Slotnick, B.M.; Nelson, N.; Henkin, R.I.

    1986-01-01

    Zn deficiency is a relatively common cause of loss of taste acuity in humans. In some patients replacement with exogenous Zn results in rapid reversal of the loss, whereas in others prolonged treatment is needed to restore normal taste function. To study this, 300-gm outbred Sprague Dawley rats were given a Zn-deficient diet (< 1 ppm Zn) supplemented with Zn in drinking water (0.1 gm Zn/100 gm body weight). Rats were trained in an automated operant conditioning procedure and NaCl taste thresholds were determined. During an initial training period and over two replications, mean thresholds were 0.006% and mean plasma Zn was 90 +/- 2 μg/dl (M +/- SEM), determined by flame atomic absorption spectrophotometry. Rats were then divided into two groups; in one (3 rats) the Zn supplement was removed, while in the other (4 rats), pair-fed with the former group, the Zn supplement was continued. In 10 days NaCl thresholds in Zn-deprived rats increased significantly (0.07%, P < 0.01) and in 17 days increased 13-fold (0.08%), but thresholds for pair-fed, supplemented rats remained constant (0.006%). There was no overlap in response between any rat in the two groups. Plasma Zn at 17 days in Zn-deprived rats was significantly below that of pair-fed rats (52 +/- 13 vs 89 +/- 6 μg/dl, respectively, P < 0.01). At this time Zn-deprived rats were supplemented with Zn for 27 days without any reduction in taste thresholds. These preliminary results are consistent with previous observations in Zn-deficient patients

  19. ZnO-PVA nanocomposite films for low threshold optical limiting applications

    International Nuclear Information System (INIS)

    Viswanath, Varsha; Beenakumari, C.; Muneera, C. I.

    2014-01-01

    Zinc oxide-PVA nanocomposite films were fabricated by a simple solution-casting method, incorporating small weight percentages of ZnO (up to 7×10⁻³ M), and their structure, morphology, and linear and low-threshold nonlinear optical properties were investigated. The films were characterized as nanostructured ZnO encapsulated between the molecules/chains of the semicrystalline host polymer PVA. The samples exhibited low-threshold nonlinear absorption and negative nonlinear refraction, as studied using the Z-scan technique. A switchover from saturable absorption (SA) to reverse saturable absorption (RSA) was observed as the concentration of ZnO was increased. Optical limiting of 632.8 nm CW laser light by these nanocomposite films is also demonstrated. The estimated values of the effective coefficients of nonlinear absorption, nonlinear refraction and third-order nonlinear susceptibility, |χ⁽³⁾|, are among the highest reported for continuous-wave laser excitation. The results show that ZnO-PVA nanocomposite films have great potential for applications in future optical and photonic devices.

  20. Why GHQ threshold varies from one place to another

    NARCIS (Netherlands)

    Goldberg, DP; Oldehinkel, T; Ormel, J

    Background. No convincing explanation has been forthcoming for the variation in best threshold to adopt for the GHQ in different settings. Methods. Data dealing with the GHQ and the CIDI in 15 cities from a recent WHO study was subjected to further analysis. Results. The mean number of CIDI symptoms

  1. Vernakalant selectively prolongs atrial refractoriness with no effect on ventricular refractoriness or defibrillation threshold in pigs.

    Science.gov (United States)

    Bechard, Jeff; Gibson, John Ken; Killingsworth, Cheryl R; Wheeler, Jeffery J; Schneidkraut, Marlowe J; Huang, Jian; Ideker, Raymond E; McAfee, Donald A

    2011-03-01

    Vernakalant is a novel antiarrhythmic agent that has demonstrated clinical efficacy for the treatment of atrial fibrillation. Vernakalant blocks, to various degrees, cardiac sodium and potassium channels with a pattern that suggests atrial selectivity. We hypothesized, therefore, that vernakalant would affect atrial more than ventricular effective refractory period (ERP) and have little or no effect on ventricular defibrillation threshold (DFT). Atrial and ventricular ERP and ventricular DFT were determined before and after treatment with vernakalant or vehicle in 23 anesthetized male mixed-breed pigs. Vernakalant was infused at a rate designed to achieve stable plasma levels similar to those in human clinical trials. Atrial and ventricular ERP were determined by endocardial extrastimuli delivered to the right atria or right ventricle. Defibrillation was achieved using external biphasic shocks delivered through adhesive defibrillation patches placed on the thorax after 10 seconds of electrically induced ventricular fibrillation. The DFT was estimated using the Dixon "up-and-down" method. Vernakalant significantly increased atrial ERP compared with vehicle controls (34 ± 8 versus 9 ± 7 msec, respectively) without significantly affecting ventricular ERP or DFT. This is consistent with atrial selective actions and supports the conclusion that vernakalant does not alter the efficacy of electrical defibrillation.
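
    The Dixon "up-and-down" method cited above can be sketched as a simple staircase: lower the test energy one step after a success, raise it one step after a failure, and estimate the threshold from the crossover (reversal) energies. The energy grid, the deterministic success rule, and the averaging-over-reversals estimator below are illustrative assumptions, not details from the study:

```python
# Sketch of an up-and-down staircase for defibrillation threshold (DFT)
# estimation. The success rule is a hypothetical stand-in for the
# animal experiments.

def up_and_down(levels, start_index, succeeds, n_trials):
    """Run a staircase over discrete energy levels; return the trial log
    and a threshold estimate (mean energy at reversal points)."""
    i = start_index
    history = []  # list of (energy, success)
    for _ in range(n_trials):
        energy = levels[i]
        ok = succeeds(energy)
        history.append((energy, ok))
        if ok and i > 0:
            i -= 1            # success -> step down
        elif not ok and i < len(levels) - 1:
            i += 1            # failure -> step up
    # reversals: trials whose outcome differs from the previous trial
    reversals = [history[k][0] for k in range(1, len(history))
                 if history[k][1] != history[k - 1][1]]
    return history, sum(reversals) / len(reversals)

levels = [5, 10, 15, 20, 25, 30]   # joules (hypothetical step grid)
true_dft = 17.0                    # hypothetical deterministic threshold
history, estimate = up_and_down(levels, 5, lambda e: e >= true_dft, 20)
```

    With the deterministic rule above the staircase quickly settles into oscillating between the two levels bracketing the true threshold, so the reversal average lands between them.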

  2. Impact of PET reconstruction algorithm and threshold on dose painting of non-small cell lung cancer

    International Nuclear Information System (INIS)

    Knudtsen, Ingerid Skjei; Elmpt, Wouter van; Öllers, Michel; Malinen, Eirik

    2014-01-01

    Purpose: In the current work, we investigate the impact of PET reconstruction methods (RMs) and threshold on two types of dose painting (DP) prescription strategies for non-small cell lung cancer (NSCLC). Materials and methods: Sixteen patients with NSCLC underwent an ¹⁸F-FDG-PET/CT examination prior to radiotherapy. Six different RMs were used. For both a dose painting by contours (DPBC) and a dose painting by numbers (DPBN) strategy, the prescribed radiation dose within the gross tumor volume (GTV) was mapped according to the spatial distribution of standardized uptake values (SUVs). SUVmax and SUVpeak were used for volume thresholding in DPBC, and a linear SUV-dose scaling approach was used for DPBN. Deviations from the dose prescription as determined by the standard RM were scored by a quality factor (QF). Results: For DPBC, the mean difference in thresholded boost volume between RMs was typically within 10%. The difference in dose prescription was systematically lower for thresholding based on SUVpeak (largest mean QF 2.8 ± 2.0%) compared to SUVmax (largest mean QF 3.6 ± 3.0%). For DPBN, the resulting dose prescriptions were less dependent on RM and threshold; the largest mean QFs were 1.3 ± 0.3% for both SUVmax and SUVpeak. Conclusions: PET reconstruction algorithms influence both DPBC and DPBN, although the impact is smaller for DPBN. For some patients, the resulting variations in dose prescriptions may result in clinically different dose distributions. SUVpeak is a more robust thresholding parameter than SUVmax.
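
    SUV-based volume thresholding of the kind used in the DPBC strategy above can be sketched as follows; the 50% fraction and the voxel values are hypothetical illustrative choices, not the study's settings:

```python
# Sketch of dose-painting-by-contours volume thresholding: voxels with
# uptake at or above a fraction of SUVmax define the boost volume.

def boost_volume(suv_values, fraction=0.5):
    """Return indices of voxels whose SUV is >= fraction * SUVmax."""
    suv_max = max(suv_values)
    cut = fraction * suv_max
    return [i for i, s in enumerate(suv_values) if s >= cut]

suvs = [2.0, 3.5, 8.0, 7.1, 4.0, 1.2]   # hypothetical voxel SUVs
vox = boost_volume(suvs)                # voxels 2, 3 and 4 pass the 4.0 cut
```

    Because the cut depends on a single hot voxel (SUVmax), the boost volume inherits the reconstruction method's effect on that voxel, which is one reason the more smoothed SUVpeak behaves more robustly.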

  3. Numerical investigation of the inertial cavitation threshold under multi-frequency ultrasound.

    Science.gov (United States)

    Suo, Dingjie; Govind, Bala; Zhang, Shengqi; Jing, Yun

    2018-03-01

    Through the introduction of multi-frequency sonication in High Intensity Focused Ultrasound (HIFU), enhanced efficiency has been noted in several applications including thrombolysis, tissue ablation, sonochemistry, and sonoluminescence. One key experimental observation is that multi-frequency ultrasound can lower the inertial cavitation threshold, thereby improving the power efficiency. However, this has not been well corroborated by theory. In this paper, a numerical investigation of the inertial cavitation threshold of microbubbles (MBs) under multi-frequency ultrasound irradiation is conducted. The relationships between the cavitation threshold and MB size at various frequencies and in different media are investigated. The results for single-, dual-, and triple-frequency sonication show that introducing additional frequencies reduces the inertial cavitation threshold, which is consistent with previous experimental work. In addition, no significant difference is observed between dual-frequency sonications with different frequency separations. This study not only reaffirms the benefit of using multi-frequency ultrasound for various applications, but also provides a possible route for optimizing ultrasound excitations for initiating inertial cavitation. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Predicting hearing thresholds in occupational noise-induced hearing loss by auditory steady state responses.

    Science.gov (United States)

    Attias, Joseph; Karawani, Hanin; Shemesh, Rafi; Nageris, Ben

    2014-01-01

    Currently available behavioral tools for the assessment of noise-induced hearing loss (NIHL) depend on the reliable cooperation of the subject. Furthermore, in workers' compensation cases, there is considerable financial gain to be had from exaggerating symptoms, such that accurate assessment of true hearing threshold levels is essential. An alternative objective physiologic tool for assessing NIHL is the auditory steady state response (ASSR) test, which combines frequency specificity with a high level of auditory stimulation, making it applicable for the evaluation of subjects with a moderate to severe deficit. The primary aim of the study was to assess the value of the multifrequency ASSR test in predicting the behavioral warble-tone audiogram in a large sample of young subjects with NIHL of varying severity or with normal hearing. The secondary goal was to assess suprathreshold ASSR growth functions in these two groups. The study group included 157 subjects regularly exposed to high levels of occupational noise, who attended a university-associated audiological clinic for evaluation of NIHL from 2009 through 2011. All underwent a behavioral audiogram, and on the basis of the findings, were divided into those with NIHL (108 subjects, 216 ears) or normal hearing (49 subjects, 98 ears). The accuracy of the ASSR threshold estimations for frequencies of 500, 1000, 2000, and 4000 Hz was compared between groups, and the specificity and sensitivity of the ASSR test in differentiating ears with or without NIHL was calculated using receiver operating characteristic analysis. Linear regression analysis was used to formulate an equation to predict the behavioral warble-tone audiogram at each test frequency using ASSR thresholds. Multifrequency ASSR amplitude growth as a function of stimulus intensity was compared between the NIHL and normal-hearing groups for 1000 Hz and 4000 Hz carrier frequencies. In the subjects with NIHL, ASSR thresholds to various frequencies were

  5. A numerical study of threshold states

    International Nuclear Information System (INIS)

    Ata, M.S.; Grama, C.; Grama, N.; Hategan, C.

    1979-01-01

    There is some experimental evidence of charged-particle threshold states. On the statistical background of levels, some simple structures were observed in the excitation spectrum. They occur near the Coulomb threshold and have a large reduced width for decay into the threshold channel. These states were identified as charged-cluster threshold states. Such threshold states were observed in ¹⁵,¹⁶,¹⁷,¹⁸O, ¹⁸,¹⁹F, ¹⁹,²⁰Ne, ²⁴Mg, and ³²S. The types of clusters involved were d, t, ³He, α and even ¹²C. They were observed in heavy-ion transfer reactions as strongly excited levels in the residual nucleus. The charged-particle threshold states occur as simple structures at high excitation energy. They could be interesting from both the nuclear-structure and the reaction-mechanism points of view. They could be excited as simple structures both in the compound and in the residual nucleus. (author)

  6. Thermal, Catalytic Conversion of Alkanes to Linear Aldehydes and Linear Amines.

    Science.gov (United States)

    Tang, Xinxin; Jia, Xiangqing; Huang, Zheng

    2018-03-21

    Alkanes, the main constituents of petroleum, are attractive feedstocks for producing value-added chemicals. Linear aldehydes and amines are two of the most important building blocks in the chemical industry. To date, there have been no effective methods for directly converting n-alkanes to linear aldehydes and linear amines. Here, we report a molecular dual-catalyst system for the production of linear aldehydes via regioselective carbonylation of n-alkanes. The system comprises a pincer iridium catalyst for transfer-dehydrogenation of the alkane, using t-butylethylene or ethylene as a hydrogen acceptor, working sequentially with a rhodium catalyst for olefin isomerization-hydroformylation with syngas. The system exhibits high regioselectivity for linear aldehydes and gives high catalytic turnover numbers when using ethylene as the acceptor. In addition, the direct conversion of the light alkanes n-pentane and n-hexane to siloxy-terminated alkyl aldehydes, through a sequence of Ir/Fe-catalyzed alkane silylation and Ir/Rh-catalyzed alkane carbonylation, is described. Finally, the Ir/Rh dual-catalyst strategy has been successfully applied to regioselective alkane aminomethylation to form linear alkyl amines.

  7. Ecosystem resilience and threshold response in the Galápagos coastal zone.

    Directory of Open Access Journals (Sweden)

    Alistair W R Seddon

    Full Text Available BACKGROUND: The Intergovernmental Panel on Climate Change (IPCC) provides a conservative estimate of the rate of sea-level rise of 3.8 mm yr⁻¹ at the end of the 21st century, which may have a detrimental effect on ecologically important mangrove ecosystems. Understanding factors influencing the long-term resilience of these communities is critical but poorly understood. We investigate ecological resilience in a coastal mangrove community from the Galápagos Islands over the last 2,700 years using three research questions: What are the 'fast and slow' processes operating in the coastal zone? Is there evidence for a threshold response? How can the past inform us about the resilience of the modern system? METHODOLOGY/PRINCIPAL FINDINGS: Palaeoecological methods (AMS radiocarbon dating, stable carbon isotopes (δ¹³C)) were used to reconstruct sedimentation rates and ecological change over the past 2,700 years at Diablas lagoon, Isabela, Galápagos. Bulk geochemical analysis was also used to determine local environmental changes, and salinity was reconstructed using a diatom transfer function. Changes in relative sea level (RSL) were estimated using a glacio-isostatic adjustment model. Non-linear behaviour was observed in the Diablas mangrove ecosystem as it responded to increased salinities following exposure to tidal inundations. A negative feedback was observed which enabled the mangrove canopy to accrete vertically, but disturbances may have opened up the canopy and contributed to an erosion of resilience over time. A combination of drier climatic conditions and a slight fall in RSL then resulted in a threshold response, from a mangrove community to a microbial mat. CONCLUSIONS/SIGNIFICANCE: Palaeoecological records can provide important information on the nature of non-linear behaviour by identifying thresholds within ecological systems, and in outlining responses to 'fast' and 'slow' environmental change between alternative stable states. This study

  8. Neuronal spike-train responses in the presence of threshold noise.

    Science.gov (United States)

    Coombes, S; Thul, R; Laudanski, J; Palmer, A R; Sumner, C J

    2011-03-01

    The variability of neuronal firing has been an intense topic of study for many years. From a modelling perspective it has often been studied in conductance based spiking models with the use of additive or multiplicative noise terms to represent channel fluctuations or the stochastic nature of neurotransmitter release. Here we propose an alternative approach using a simple leaky integrate-and-fire model with a noisy threshold. Initially, we develop a mathematical treatment of the neuronal response to periodic forcing using tools from linear response theory and use this to highlight how a noisy threshold can enhance downstream signal reconstruction. We further develop a more general framework for understanding the responses to large amplitude forcing based on a calculation of first passage times. This is ideally suited to understanding stochastic mode-locking, for which we numerically determine the Arnol'd tongue structure. An examination of data from regularly firing stellate neurons within the ventral cochlear nucleus, responding to sinusoidally amplitude modulated pure tones, shows tongue structures consistent with these predictions and highlights that stochastic, as opposed to deterministic, mode-locking is utilised at the level of the single stellate cell to faithfully encode periodic stimuli.
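
    The model class described above, a leaky integrate-and-fire neuron whose threshold (rather than its input) is noisy, can be sketched in a few lines. The parameter values are illustrative, not fitted to stellate-cell data:

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron with a
# noisy firing threshold: a fresh threshold is drawn from a Gaussian
# after each spike, instead of adding noise to the input current.
import random

def lif_noisy_threshold(current, dt=0.001, t_max=1.0,
                        tau=0.02, v_reset=0.0,
                        theta_mean=1.0, theta_sd=0.05, seed=0):
    """Euler-integrate dv/dt = (-v + I)/tau; spike when v crosses a
    threshold drawn from N(theta_mean, theta_sd). Returns spike times."""
    rng = random.Random(seed)
    v = v_reset
    theta = rng.gauss(theta_mean, theta_sd)
    spikes = []
    t = 0.0
    while t < t_max:
        v += dt * (-v + current) / tau
        if v >= theta:
            spikes.append(t)
            v = v_reset
            theta = rng.gauss(theta_mean, theta_sd)  # redraw noisy threshold
        t += dt
    return spikes

spikes = lif_noisy_threshold(current=1.5)  # suprathreshold drive -> regular firing
```

    With a constant suprathreshold input the model fires quasi-regularly, with interspike-interval jitter set entirely by the threshold distribution; driving it with a sinusoidally modulated `current` is the setting in which the stochastic mode-locking discussed above appears.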

  9. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusion would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among the states remain real. Among the military or national-security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives of all of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  10. Linear and quadrature models for data from threshold measurements of the transient visual system

    NARCIS (Netherlands)

    Brinker, den A.C.

    1986-01-01

    In this paper two models are considered for the transient visual system at threshold. One is a linear model and the other a model containing a quadrature element. Both models are commonly used, on evidence from different experimental sources. It is shown that both models act in a similar fashion.

  11. Measurement of inclusive eta production in e+e- interactions near charm threshold

    International Nuclear Information System (INIS)

    Partridge, R.; Peck, C.; Porter, F.C.; Gu, Y.F.; Kollmann, W.; Richardson, M.; Strauch, K.; Wacker, K.; Aschman, D.; Bagger, J.; Burnett, T.; Cavalli-Sforza, M.; Coyne, D.; Joy, M.; Sadrozinski, H.F.W.; Hofstadter, R.; Horisberger, R.; Kirkbride, I.; Kolanoski, H.; Koenigsmann, K.; Liberman, A.; O'Reilly, J.; Osterheld, A.; Tompkins, J.; Bloom, E.; Bulos, F.; Chestnut, R.; Gaiser, J.; Godfrey, G.; Kiesling, C.; Lockman, W.; Oreglia, M.

    1981-01-01

    We have measured the inclusive cross section for η production in e⁺e⁻ interactions near charm threshold using the Crystal Ball detector. No pronounced structure in the energy dependence is observed. By comparing cross sections above and below charm threshold we obtain the limits (90% confidence limit): R(e⁺e⁻→FF̄X)·Br(F→ηX) < 0.15–0.32 (for E(c.m.) from 4.0 to 4.5 GeV), and Br(D→ηX) < 0.13 [averaged over the charged and neutral D components of the ψ''(3770) decays]. Our results are inconsistent with a previous report of a large energy dependence of the η cross section ascribed to the crossing of the FF̄* and F*F̄* production thresholds.

  12. 2.43 kW narrow linewidth linearly polarized all-fiber amplifier based on mode instability suppression

    Science.gov (United States)

    Su, Rongtao; Tao, Rumao; Wang, Xiaolin; Zhang, Hanwei; Ma, Pengfei; Zhou, Pu; Xu, Xiaojun

    2017-08-01

    We demonstrate an experimental study on scaling the mode instability (MI) threshold in fiber amplifiers based on fiber coiling. The experimental results show that coiling the active fiber in a cylindrical spiral shape is superior to coiling in a plane spiral shape. When the polarization-maintaining Yb-doped fiber (PM YDF, with a core/inner-cladding diameter of 20/400 µm) is coiled on an aluminium plate with a bend diameter of 9-16 cm, the MI threshold is ~1.55 kW. When such a PM YDF is coiled on an aluminium cylinder with a diameter of 9 cm, no MI is observed at an output power of 2.43 kW, which is limited by the available pump power. The spectral width and polarization extinction ratio are 0.255 nm and 18.3 dB, respectively, at 2.43 kW. To the best of our knowledge, this is the highest output power from a linearly polarized, narrow-linewidth, all-fiberized amplifier. Using a theoretical model, the potential MI-free power-scaling capability of such an amplifier is estimated to be 3.5 kW.

  13. Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting

    Science.gov (United States)

    Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto

    2015-04-01

    Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds, obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early-warning purposes. The validation is based on a contingency table, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. The choice is made by selecting the weights appropriately, and by searching for the optimal (maximum) value of the index. We discuss weaknesses in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence that is typical of large study areas. When working at the regional scale, landslides may have occurred and not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation of the number of landslides can result in a significant decrease in the performance of a threshold as measured by the skill scores, and that the variations in the skill scores differ depending on whether the uncertain events lie above or below the threshold. This
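
    A contingency-table validation with a weighted linear combination of skill scores, in the spirit of the index described above, might look like this. The particular scores (POD, POFD, Hanssen-Kuipers) and the weights are illustrative assumptions, not the paper's exact index:

```python
# Sketch: score a rainfall threshold from its contingency counts
# (rainfall events above/below threshold vs. landslides reported or
# not), then combine skill scores into a single weighted index.

def skill_scores(tp, fn, fp, tn):
    pod = tp / (tp + fn)     # probability of detection (hit rate)
    pofd = fp / (fp + tn)    # probability of false detection
    hk = pod - pofd          # Hanssen-Kuipers skill statistic
    return pod, pofd, hk

def weighted_index(tp, fn, fp, tn, w=(0.5, 0.25, 0.25)):
    """Linear combination of skill scores; higher = better threshold.
    The weights encode the operational priorities of the warning system."""
    pod, pofd, hk = skill_scores(tp, fn, fp, tn)
    return w[0] * pod + w[1] * (1.0 - pofd) + w[2] * hk

# Hypothetical contingency counts for one candidate threshold:
idx = weighted_index(tp=40, fn=10, fp=20, tn=130)
```

    Sweeping the candidate threshold and keeping the one that maximizes the index is the selection step; changing the weights shifts the optimum toward fewer missed alarms or fewer false alarms, which is exactly why the choice is tied to the warning system's scope.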

  14. Measuring Input Thresholds on an Existing Board

    Science.gov (United States)

    Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.

    2011-01-01

    temperatures to show that the interface had voltage margin under all worst case conditions. Gate input thresholds are normally measured at the manufacturer when the device is on a chip tester. A key function of this machine was duplicated on an existing flight board with no modifications to the nets to be tested, with the exception of changes in the FPGA program.

  15. Improved bounds on the epidemic threshold of exact SIS models on complex networks

    KAUST Repository

    Ruhi, Navid Azizan; Thrampoulidis, Christos; Hassibi, Babak

    2017-01-01

    The SIS (susceptible-infected-susceptible) epidemic model on an arbitrary network, without making approximations, is a 2ⁿ-state Markov chain with a unique absorbing state (the all-healthy state). This makes analysis of the SIS model and, in particular, determining the threshold of epidemic spread quite challenging. It has been shown that the exact marginal probabilities of infection can be upper bounded by an n-dimensional linear time-invariant system, a consequence of which is that the Markov chain is “fast-mixing” when the LTI system is stable, i.e., when βλmax(A) < δ, where β is the infection rate per link, δ is the recovery rate, and λmax(A) is the largest eigenvalue of the network's adjacency matrix. This well-known threshold has recently been shown not to be tight in several cases, such as in a star network. In this paper, we provide tighter upper bounds on the exact marginal probabilities of infection, by also taking pairwise infection probabilities into account. Based on this improved bound, we derive tighter eigenvalue conditions that guarantee fast mixing (i.e., logarithmic mixing time) of the chain. We demonstrate the improvement of the threshold condition by comparing the new bound with the known one on various networks with various epidemic parameters.
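
    The well-known condition discussed above, fast mixing when βλmax(A) < δ, is easy to check numerically. The sketch below estimates λmax by power iteration, run on A + I to avoid the ± eigenvalue tie of bipartite graphs such as the star network mentioned in the abstract; the network and rates are illustrative:

```python
# Sketch: check the classical SIS fast-mixing condition
# beta * lambda_max(A) < delta on a given adjacency matrix.

def lambda_max(adj, iters=500):
    """Largest eigenvalue of a symmetric nonnegative adjacency matrix,
    via power iteration on A + I (shift removes the +/- tie that makes
    plain power iteration oscillate on bipartite graphs)."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [v[i] + sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)                 # entries stay positive
        v = [x / lam for x in w]
    return lam - 1.0                 # undo the +I shift

def below_epidemic_threshold(adj, beta, delta):
    return beta * lambda_max(adj) < delta

# Star network: one hub, 4 leaves; lambda_max = sqrt(4) = 2.
star = [[0, 1, 1, 1, 1],
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0],
        [1, 0, 0, 0, 0]]
```

    For the star, `below_epidemic_threshold(star, beta=0.2, delta=0.5)` holds (0.2 × 2 < 0.5) while β = 0.3 violates it; the paper's contribution is precisely that this classical condition is conservative on such networks.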

  17. Can we set a global threshold age to define mature forests?

    Directory of Open Access Journals (Sweden)

    Philip Martin

    2016-02-01

    Full Text Available Globally, mature forests appear to be increasing in biomass density (BD). There is disagreement over whether these increases are the result of increases in atmospheric CO₂ concentrations or a legacy effect of previous land use. Recently, it was suggested that a threshold of 450 years should be used to define mature forests and that many forests increasing in BD may be younger than this. However, the study making these suggestions failed to account for the interactions between forest age and climate. Here we revisit the issue to identify: (1) how climate and forest age control global forest BD; and (2) whether we can set a threshold age for mature forests. Using data from previously published studies, we modelled the impacts of forest age and climate on BD using linear mixed-effects models. We examined the potential biases in the dataset by comparing how representative it was of global mature forests in terms of its distribution, the climate space it occupied, and the ages of the forests used. BD increased with forest age, mean annual temperature and annual precipitation. Importantly, the effect of forest age increased with increasing temperature, but the effect of precipitation decreased with increasing temperature. The dataset was biased towards northern-hemisphere forests in relatively dry, cold climates. The dataset was also clearly biased towards forests <250 years of age. Our analysis suggests that there is not a single threshold age for forest maturity. Since climate interacts with forest age to determine BD, a threshold age at which they reach equilibrium can only be determined locally. We caution against using BD as the only determinant of forest maturity, since this ignores forest biodiversity and tree size structure, which may take longer to recover. Future research should address the utility and cost-effectiveness of different methods for determining whether forests should be classified as mature.

  18. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  19. Thresholding magnetic resonance images of human brain

    Institute of Scientific and Technical Information of China (English)

    Qing-mao HU; Wieslaw L NOWINSKI

    2005-01-01

    In this paper, methods are proposed and validated to determine the low and high thresholds needed to segment out gray matter and white matter in MR images of different pulse sequences of the human brain. First, a two-dimensional reference image is determined to represent the intensity characteristics of the original three-dimensional data. Then a region of interest of the reference image is determined where brain tissues are present. Non-supervised fuzzy c-means clustering is employed to determine: the threshold for obtaining the head mask, the low threshold for T2-weighted and PD-weighted images, and the high threshold for T1-weighted, SPGR and FLAIR images. Supervised range-constrained thresholding is employed to determine the low threshold for T1-weighted, SPGR and FLAIR images. Thresholding based on pairs of boundary pixels is proposed to determine the high threshold for T2- and PD-weighted images. Quantification against public data sets with various noise and inhomogeneity levels shows that the proposed methods can yield segmentation robust to noise and intensity inhomogeneity. Qualitatively, the proposed methods work well with real clinical data.
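
    Unsupervised fuzzy c-means on a 1-D intensity distribution, as used above for threshold selection, can be sketched with a two-cluster toy version in which the threshold is taken as the midpoint between the cluster centres. This is only one simple way to turn the clustering into a threshold; the paper's actual procedure is more elaborate:

```python
# Sketch: two-cluster fuzzy c-means (FCM, fuzzifier m = 2) on scalar
# intensities; the returned threshold separates the two clusters.

def fcm_threshold(intensities, m=2.0, iters=100):
    lo, hi = min(intensities), max(intensities)
    c = [lo, hi]                      # initial cluster centres
    for _ in range(iters):
        num = [0.0, 0.0]
        den = [0.0, 0.0]
        for x in intensities:
            d = [abs(x - c[0]) + 1e-12, abs(x - c[1]) + 1e-12]
            # standard FCM membership for m = 2: u0 = d1^2 / (d0^2 + d1^2)
            u0 = 1.0 / (1.0 + (d[0] / d[1]) ** 2)
            for k, u in enumerate((u0, 1.0 - u0)):
                w = u ** m            # membership-weighted contribution
                num[k] += w * x
                den[k] += w
        c = [num[k] / den[k] for k in (0, 1)]
    return (c[0] + c[1]) / 2.0

gray = [10, 12, 11, 13, 90, 95, 92, 88]   # hypothetical intensities
t = fcm_threshold(gray)
```

    On well-separated intensity groups the centres converge near the group means and the midpoint threshold cleanly splits them; on real MR histograms the fuzzy memberships are what make the method tolerant of noise and partial-volume voxels.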

  20. Near threshold fatigue testing

    Science.gov (United States)

    Freeman, D. C.; Strum, M. J.

    1993-01-01

    Measurement of the near-threshold fatigue crack growth rate (FCGR) behavior provides a basis for the design and evaluation of components subjected to high cycle fatigue. Typically, the near-threshold fatigue regime describes crack growth rates below approximately 10⁻⁵ mm/cycle (4 × 10⁻⁷ inch/cycle). One such evaluation was recently performed for the binary alloy U-6Nb. The procedures developed for this evaluation are described in detail to provide a general test method for near-threshold FCGR testing. In particular, techniques for high-resolution measurements of crack length performed in-situ through a direct current, potential drop (DCPD) apparatus, and a method which eliminates crack closure effects through the use of loading cycles with constant maximum stress intensity are described.

  1. Poisson versus threshold models for genetic analysis of clinical mastitis in US Holsteins.

    Science.gov (United States)

    Vazquez, A I; Weigel, K A; Gianola, D; Bates, D M; Perez-Cabal, M A; Rosa, G J M; Chang, Y M

    2009-10-01

Typically, clinical mastitis is coded as the presence or absence of disease in a given lactation, and records are analyzed with either linear models or binary threshold models. Because the presence of mastitis may include cows with multiple episodes, there is a loss of information when counts are treated as binary responses. Poisson models are appropriate for random variables measured as the number of events, and although these models are used extensively in studying the epidemiology of mastitis, they have rarely been used for studying the genetic aspects of mastitis. Ordinal threshold models are pertinent for ordered categorical responses; although one can hypothesize that the number of clinical mastitis episodes per animal reflects a continuous underlying increase in mastitis susceptibility, these models have rarely been used in genetic analysis of mastitis. The objective of this study was to compare probit, Poisson, and ordinal threshold models for the genetic evaluation of US Holstein sires for clinical mastitis. Mastitis was measured as a binary trait or as the number of mastitis cases. Data from 44,908 first-parity cows recorded in on-farm herd management software were gathered, edited, and processed for the present study. The cows were daughters of 1,861 sires, distributed over 94 herds. Predictive ability was assessed via a 5-fold cross-validation using 2 loss functions: mean squared error of prediction (MSEP) as the end point and a cost difference function. The heritability estimates were 0.061 for mastitis measured as a binary trait in the probit model and 0.085 and 0.132 for the number of mastitis cases in the ordinal threshold and Poisson models, respectively; because of scale differences, only the probit and ordinal threshold models are directly comparable. Among healthy animals, MSEP was smallest for the probit model, and the cost function was smallest for the ordinal threshold model. Among diseased animals, MSEP and the cost function were smallest
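The information loss described in this record (episode counts collapsed to presence/absence) can be sketched in a few lines; the function name and herd data are invented for illustration:

```python
import math

def poisson_vs_binary(counts):
    """Contrast two codings of clinical-mastitis records (illustrative sketch).

    Binary coding keeps only presence/absence per lactation; a Poisson model
    uses the full number of episodes.
    """
    n = len(counts)
    # Binary coding: fraction of cows with at least one case
    p_any = sum(1 for c in counts if c > 0) / n
    # Poisson coding: the MLE of the rate is the sample mean of the counts
    lam = sum(counts) / n
    # Probability of >=1 case implied by the Poisson model
    p_any_poisson = 1.0 - math.exp(-lam)
    return p_any, lam, p_any_poisson

# Two hypothetical herds with the same share of affected cows but
# different numbers of episodes per affected cow:
herd_single = [0, 0, 0, 1, 1]  # affected cows had one episode each
herd_repeat = [0, 0, 0, 3, 3]  # affected cows had repeated episodes

print(poisson_vs_binary(herd_single))  # binary coding sees both herds alike...
print(poisson_vs_binary(herd_repeat))  # ...the Poisson rate tells them apart
```

Binary coding rates both herds identically (40% of cows affected), while the Poisson rate (0.4 vs. 1.2 episodes per lactation) separates them.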

  2. International program on linear electric motors. CIGGT report No. 92-1

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, G.E.; Eastham, A.R.; Parker, J.H.

    1992-12-31

    The International Program for Linear Electric Motors (LEM) was begun in April 1989 to communicate and coordinate activities with centers of expertise in Germany, Canada, and Japan; to provide for the assessment and support of the planning of technological developments and for dissemination of information to researchers, service operators, and policy makers; and to ensure that full advantage can be taken if opportunities for technology transfer occur. This report documents the work done under the program, including standardizing linear induction motor (LIM) design characteristics; test procedures and measurement methods; rating; database for design data; criteria for evaluation of designs; computer programs for modelling performance; and a design study for an agreed application.

  3. Sulfated lentinan induced mitochondrial dysfunction leads to programmed cell death of tobacco BY-2 cells.

    Science.gov (United States)

    Wang, Jie; Wang, Yaofeng; Shen, Lili; Qian, Yumei; Yang, Jinguang; Wang, Fenglong

    2017-04-01

Sulphated lentinan (sLNT) is known to act as a resistance inducer by causing programmed cell death (PCD) in tobacco suspension cells. However, the underlying mechanism of this effect is largely unknown. Using the tobacco BY-2 cell model, morphological and biochemical studies revealed that mitochondrial reactive oxygen species (ROS) production and mitochondrial dysfunction contribute to sLNT-induced PCD. Cell viability assays, HO/PI fluorescence imaging and TUNEL assays confirmed a typical cell death process caused by sLNT. Acetylsalicylic acid (an ROS scavenger), diphenylene iodonium (an inhibitor of NADPH oxidases) and carbonyl cyanide p-trifluoromethoxyphenyl hydrazone (a protonophore and an uncoupler of mitochondrial oxidative phosphorylation) inhibited sLNT-induced H2O2 generation and cell death, suggesting that ROS generation is linked, at least partly, to mitochondrial dysfunction and caspase-like activation. This conclusion was further confirmed by double-staining cells with the mitochondria-specific marker MitoTracker Red CMXRos and the ROS probe H2DCFDA. Moreover, the sLNT-induced PCD of BY-2 cells required cellular metabolism, as up-regulation of AOX family gene transcripts and induction of SA biosynthesis, TCA cycle, and miETC related genes were observed. It is concluded that mitochondria play an essential role in the signaling pathway of sLNT-induced ROS generation, which possibly provides new insight into the sLNT-mediated antiviral response, including PCD. Copyright © 2016. Published by Elsevier Inc.

  4. Upgrading from the Dicon Wiring Management system to IntEC at the Gentilly 2 station

    International Nuclear Information System (INIS)

    Theoret, P.A.

    1995-01-01

The General Electric DICON Wiring Management system supplied to HQ during the construction of G2 is currently being replaced by the stand-alone version of the IntEC software developed by AECL. The reasons for replacing DICON and choosing IntEC are discussed. The different aspects of the two-year DICON data conversion project are presented, along with the problems encountered and the means taken to resolve them. IntEC has shown our DICON data to be considerably more deficient than we had thought. This has increased the cost and the duration of the conversion process. However, correcting the errors during the conversion process provides us with much more accurate data. This should be viewed as an investment in configuration management. Many potential causes of future errors and potentially critical path delays have been removed. We have chosen to document the detailed procedures for the use of IntEC in our plant using a Windows Help File compiler. This has also been found to be extremely useful as a training tool, as well as providing on-line help. The DICON data conversion into IntEC will not be completed until 1996. IntEC is not perfect. However, from what we have seen so far, we are satisfied with the user-friendliness and efficiency of IntEC and with AECL's diligence in constantly striving to make it a better product. (author)

  5. Effect of butorphanol on thermal nociceptive threshold in healthy pony foals.

    Science.gov (United States)

    McGowan, K T; Elfenbein, J R; Robertson, S A; Sanchez, L C

    2013-07-01

    Pain management is an important component of foal nursing care, and no objective data currently exist regarding the analgesic efficacy of opioids in foals. To evaluate the somatic antinociceptive effects of 2 commonly used doses of intravenous (i.v.) butorphanol in healthy foals. Our hypothesis was that thermal nociceptive threshold would increase following i.v. butorphanol in a dose-dependent manner in both neonatal and older pony foals. Seven healthy neonatal pony foals (age 1-2 weeks), and 11 healthy older pony foals (age 4-8 weeks). Five foals were used during both age periods. Treatments, which included saline (0.5 ml), butorphanol (0.05 mg/kg bwt) and butorphanol (0.1 mg/kg bwt), were administered i.v. in a randomised crossover design with at least 2 days between treatments. Response variables included thermal nociceptive threshold, skin temperature and behaviour score. Data within each age period were analysed using a 2-way repeated measures ANOVA, followed by a Holm-Sidak multiple comparison procedure if warranted. There was a significant (P<0.05) increase in thermal threshold, relative to Time 0, following butorphanol (0.1 mg/kg bwt) administration in both age groups. No significant time or treatment effects were apparent for skin temperature. Significant time, but not treatment, effects were evident for behaviour score in both age groups. Butorphanol (0.1 mg/kg bwt, but not 0.05 mg/kg bwt) significantly increased thermal nociceptive threshold in neonatal and older foals without apparent adverse behavioural effects. Butorphanol shows analgesic potential in foals for management of somatic painful conditions. © 2012 EVJ Ltd.

  6. Preservation of auditory brainstem response thresholds after cochleostomy and titanium microactuator implantation in the lateral wall of cat scala tympani.

    Science.gov (United States)

    Lesinski, S George; Prewitt, Jessica; Bray, Victor; Aravamudhan, Radhika; Bermeo Blanco, Oscar A; Farmer-Fedor, Brenda L; Ward, Jonette A

    2014-04-01

The safety of implanting a titanium microactuator into the lateral wall of cat scala tympani was assessed by comparing preoperative and postoperative auditory brainstem response (ABR) thresholds for 1 to 3 months. The safety of directly stimulating cochlear perilymph with an implantable hearing system requires maintaining preoperative hearing levels. This cat study is an essential step in the development of the next generation of fully implantable hearing devices for humans. Following GLP surgical standards, a 1-mm cochleostomy was drilled into the lateral wall of the scala tympani, and a nonfunctioning titanium anchor/microactuator assembly was inserted in 8 cats. The scala media was damaged in 1 cat. ABR thresholds with click and 4- and 8-kHz stimuli were measured preoperatively and compared with postoperative thresholds at 1, 2, and 3 months. Nonimplanted ear thresholds were also measured to establish statistical significance for threshold shifts (>28.4 dB). Two audiologists independently interpreted thresholds. Postoperatively, 7 cats implanted in the scala tympani demonstrated no significant ABR threshold shift for click stimulus; one shifted ABR thresholds to 4- and 8-kHz stimuli. The eighth cat, with surgical damage to the scala media, maintained a stable click threshold but had a significant shift to 4- and 8-kHz stimuli. This cat study provides no evidence of worsening hearing thresholds after fenestration of the scala tympani and insertion of a titanium anchor/microactuator, provided there is no surgical trauma to the scala media and the implanted device is securely anchored in the cochleostomy. These 2 issues have been resolved in the development of a fully implantable hearing system for humans. The long-term hearing stability (combined with histologic studies) reaffirms that the microactuator is well tolerated by the cat cochlea.

  7. Alignment dependence in above-threshold ionization of H2+: role of intermediate resonances

    DEFF Research Database (Denmark)

    Hernández, Jorge Fernández; Madsen, Lars Bojer

    2009-01-01

We report a 3D ab initio investigation of the dependence of above-threshold ionization of the H2+ molecule on the orientation of a linearly polarized intense femtosecond laser pulse with respect to the molecular axis. The calculations were performed in the frozen nuclei approximation for the 2Σ+g(1sσg) ground and the 2Σ+u(2pσu) first excited electronic states, in laser pulses of seven optical cycles (19 fs) with a wavelength of 800 nm and for different intensities. The numerical procedure combines two different techniques, a grid-based split-step method to propagate the wave packet during the pulse, and a bound and scattering states B-spline basis set calculation to extract the information from the former. We show that the orientation dependence of the above-threshold ionization spectra is very sensitive to the intensity of the field and to the final electron energy. For some intensities...

  8. Elements of linear space

    CERN Document Server

    Amir-Moez, A R; Sneddon, I N

    1962-01-01

    Elements of Linear Space is a detailed treatment of the elements of linear spaces, including real spaces with no more than three dimensions and complex n-dimensional spaces. The geometry of conic sections and quadric surfaces is considered, along with algebraic structures, especially vector spaces and transformations. Problems drawn from various branches of geometry are given.Comprised of 12 chapters, this volume begins with an introduction to real Euclidean space, followed by a discussion on linear transformations and matrices. The addition and multiplication of transformations and matrices a

  9. At-Risk-of-Poverty Threshold

    Directory of Open Access Journals (Sweden)

    Táňa Dvornáková

    2012-06-01

Full Text Available European Statistics on Income and Living Conditions (EU-SILC) is a survey on households' living conditions. The main aim of the survey is to get long-term comparable data on the social and economic situation of households. Data collected in the survey are used mainly in connection with the evaluation of income poverty and determination of the at-risk-of-poverty rate. This article deals with the calculation of the at-risk-of-poverty threshold based on data from EU-SILC 2009. The main task is to compare two approaches to the computation of the at-risk-of-poverty threshold. The first approach is based on the calculation of the threshold for each country separately, while the second one is based on the calculation of the threshold for all states together. The introduction summarizes common attributes in the calculation of the at-risk-of-poverty threshold, such as disposable household income and equivalised household income. Further, the different approaches to both calculations are introduced and the advantages and disadvantages of these approaches are stated. Finally, the at-risk-of-poverty rate calculation is described and a comparison of the at-risk-of-poverty rates based on these two different approaches is made.
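As a sketch of the computation being compared, here is the standard Eurostat convention (threshold at 60% of the national median equivalised disposable income, OECD-modified equivalence scale). The convention is general EU-SILC methodology rather than something stated in this abstract, and the household incomes are invented:

```python
def equivalised_income(household_income, n_adults, n_children):
    """OECD-modified scale: 1.0 for the first adult, 0.5 per further adult,
    0.3 per child under 14."""
    scale = 1.0 + 0.5 * (n_adults - 1) + 0.3 * n_children
    return household_income / scale

def poverty_threshold(equivalised_incomes, share=0.6):
    """At-risk-of-poverty threshold as a share of median equivalised income."""
    xs = sorted(equivalised_incomes)
    n = len(xs)
    median = xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2
    return share * median

# Per-country thresholds (first approach) vs one pooled threshold (second):
country_a = [equivalised_income(30000, 2, 1), equivalised_income(18000, 1, 0),
             equivalised_income(42000, 2, 2)]
country_b = [equivalised_income(12000, 1, 0), equivalised_income(20000, 2, 0),
             equivalised_income(9000, 1, 1)]
print(poverty_threshold(country_a), poverty_threshold(country_b))
print(poverty_threshold(country_a + country_b))  # pooled threshold
```

The richer country's households fall below the pooled threshold less often than below their national one, which is exactly the trade-off the article examines.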

  10. Logical Qubit in a Linear Array of Semiconductor Quantum Dots

    Directory of Open Access Journals (Sweden)

    Cody Jones

    2018-06-01

    Full Text Available We design a logical qubit consisting of a linear array of quantum dots, we analyze error correction for this linear architecture, and we propose a sequence of experiments to demonstrate components of the logical qubit on near-term devices. To avoid the difficulty of fully controlling a two-dimensional array of dots, we adapt spin control and error correction to a one-dimensional line of silicon quantum dots. Control speed and efficiency are maintained via a scheme in which electron spin states are controlled globally using broadband microwave pulses for magnetic resonance, while two-qubit gates are provided by local electrical control of the exchange interaction between neighboring dots. Error correction with two-, three-, and four-qubit codes is adapted to a linear chain of qubits with nearest-neighbor gates. We estimate an error correction threshold of 10^{-4}. Furthermore, we describe a sequence of experiments to validate the methods on near-term devices starting from four coupled dots.

  11. Noise reduction in Lidar signal using correlation-based EMD combined with soft thresholding and roughness penalty

    Science.gov (United States)

    Chang, Jianhua; Zhu, Lingyan; Li, Hongxu; Xu, Fan; Liu, Binggang; Yang, Zhenbo

    2018-01-01

    Empirical mode decomposition (EMD) is widely used to analyze the non-linear and non-stationary signals for noise reduction. In this study, a novel EMD-based denoising method, referred to as EMD with soft thresholding and roughness penalty (EMD-STRP), is proposed for the Lidar signal denoising. With the proposed method, the relevant and irrelevant intrinsic mode functions are first distinguished via a correlation coefficient. Then, the soft thresholding technique is applied to the irrelevant modes, and the roughness penalty technique is applied to the relevant modes to extract as much information as possible. The effectiveness of the proposed method was evaluated using three typical signals contaminated by white Gaussian noise. The denoising performance was then compared to the denoising capabilities of other techniques, such as correlation-based EMD partial reconstruction, correlation-based EMD hard thresholding, and wavelet transform. The use of EMD-STRP on the measured Lidar signal resulted in the noise being efficiently suppressed, with an improved signal to noise ratio of 22.25 dB and an extended detection range of 11 km.
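A minimal sketch of the mode-splitting idea in EMD-STRP, assuming the intrinsic mode functions (IMFs) have already been produced by an EMD routine. The correlation cutoff, the median-based noise estimate, and the moving-average smoother standing in for the roughness penalty are all assumptions for illustration, not the paper's exact rules:

```python
import numpy as np

def soft_threshold(x, t):
    """Shrink toward zero: sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def emd_strp(imfs, noisy, corr_cut=0.2, win=5):
    """Sketch of the EMD-STRP idea on precomputed IMFs (not a full EMD).

    Modes weakly correlated with the noisy signal are treated as irrelevant
    and soft-thresholded; strongly correlated modes get a simple smoothing
    as a stand-in for the roughness-penalty step described in the abstract.
    """
    out = []
    for imf in imfs:
        r = np.corrcoef(imf, noisy)[0, 1]
        if abs(r) < corr_cut:                    # irrelevant mode
            t = np.median(np.abs(imf)) / 0.6745  # robust noise-level estimate
            out.append(soft_threshold(imf, t))
        else:                                    # relevant mode: smooth it
            kernel = np.ones(win) / win
            out.append(np.convolve(imf, kernel, mode="same"))
    return np.sum(out, axis=0)
```

The denoised signal is simply the sum of the processed modes, mirroring EMD's additive reconstruction.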

  12. Recirculating beam-breakup thresholds for polarized higher-order modes with optical coupling

    Directory of Open Access Journals (Sweden)

    Georg H. Hoffstaetter

    2007-04-01

Full Text Available Here we will derive the general theory of the beam-breakup (BBU) instability in recirculating linear accelerators with coupled beam optics and with polarized higher-order dipole modes. The bunches do not have to be at the same radio-frequency phase during each recirculation turn. This is important for the description of energy recovery linacs (ERLs), where beam currents become very large and coupled optics are used on purpose to increase the threshold current. This theory can be used for the analysis of phase errors of recirculated bunches, and of errors in the optical coupling arrangement. It is shown how the threshold current for a given linac can be computed, and a remarkable agreement with tracking data is demonstrated. General formulas are then analyzed for several analytically solvable problems: (a) Why can different higher-order modes (HOMs) in one cavity couple, and why can they then not be considered individually, even when their frequencies are separated by much more than the resonance widths of the HOMs? For the Cornell ERL as an example, it is noted that optimum advantage is taken of coupled optics when the cavities are designed with an x-y HOM frequency splitting of above 50 MHz. The simulated threshold current is then far above the design current of this accelerator. To justify that the simulation can represent an actual accelerator, we simulate cavities with 1 to 8 modes and show that using a limited number of modes is reasonable. (b) How does the x-y coupling in the particle optics determine when modes can be considered separately? (c) How much of an increase in threshold current can be obtained by coupled optics, and why does the threshold current for polarized modes diminish roughly with the square root of the HOMs’ quality factors? Because of this square root scaling, polarized modes with coupled optics increase the threshold current more effectively for cavities that have rather large HOM quality factors, e.g. those without very

  13. Response threshold variance as a basis of collective rationality.

    Science.gov (United States)

    Yamamoto, Tatsuhiro; Hasegawa, Eisuke

    2017-04-01

    Determining the optimal choice among multiple options is necessary in various situations, and the collective rationality of groups has recently become a major topic of interest. Social insects are thought to make such optimal choices by collecting individuals' responses relating to an option's value (=a quality-graded response). However, this behaviour cannot explain the collective rationality of brains because neurons can make only 'yes/no' responses on the basis of the response threshold. Here, we elucidate the basic mechanism underlying the collective rationality of such simple units and show that an ant species uses this mechanism. A larger number of units respond 'yes' to the best option available to a collective decision-maker using only the yes/no mechanism; thus, the best option is always selected by majority decision. Colonies of the ant Myrmica kotokui preferred the better option in a binary choice experiment. The preference of a colony was demonstrated by the workers, which exhibited variable thresholds between two options' qualities. Our results demonstrate how a collective decision-maker comprising simple yes/no judgement units achieves collective rationality without using quality-graded responses. This mechanism has broad applicability to collective decision-making in brain neurons, swarm robotics and human societies.
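The yes/no majority mechanism described here can be sketched directly: each unit compares an option's value against its own threshold, and thresholds drawn from a common range stand in for the response-threshold variance observed in the ant workers. All names and numbers are illustrative:

```python
import random

def collective_choice(option_values, thresholds):
    """Each unit says 'yes' to an option iff the option's value exceeds the
    unit's personal threshold; the collective picks the option with the most
    'yes' votes.

    With thresholds spread across the range between two options' values, the
    better option always gathers at least as many votes, so plain majority
    decision selects it without any quality-graded responses.
    """
    votes = [sum(1 for t in thresholds if v > t) for v in option_values]
    return max(range(len(option_values)), key=lambda i: votes[i]), votes

random.seed(1)
# Units with variable thresholds -- the key ingredient in the abstract:
thresholds = [random.uniform(0.0, 1.0) for _ in range(101)]
best, votes = collective_choice([0.4, 0.7], thresholds)
print(best, votes)  # the higher-valued option (index 1) wins the vote
```

Any unit whose threshold lies below 0.4 votes for both options, but only units with thresholds between 0.4 and 0.7 discriminate, and they all favor the better option.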

  14. Summary of DOE threshold limits efforts

    International Nuclear Information System (INIS)

    Wickham, L.E.; Smith, C.F.; Cohen, J.J.

    1987-01-01

    The Department of Energy (DOE) has been developing the concept of threshold quantities for use in determining which waste materials may be disposed of as nonradioactive waste in DOE sanitary landfills. Waste above a threshold level could be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. After extensive review of a draft threshold guidance document in 1985, a second draft threshold background document was produced in March 1986. The second draft included a preliminary cost-benefit analysis and quality assurance considerations. The review of the second draft has been completed. Final changes to be incorporated include an in-depth cost-benefit analysis of two example sites and recommendations of how to further pursue (i.e. employ) the concept of threshold quantities within the DOE. 3 references

  15. A Fiber Bragg Grating Interrogation System with Self-Adaption Threshold Peak Detection Algorithm.

    Science.gov (United States)

    Zhang, Weifang; Li, Yingwu; Jin, Bo; Ren, Feifei; Wang, Hongxun; Dai, Wei

    2018-04-08

A Fiber Bragg Grating (FBG) interrogation system with a self-adaption threshold peak detection algorithm is proposed and experimentally demonstrated in this study. The system is composed of a field programmable gate array (FPGA) and advanced RISC machine (ARM) platform, a tunable Fabry-Perot (F-P) filter and an optical switch. To improve system resolution, the F-P filter was employed. Because the filter's tuning is non-linear, the central wavelengths shift; this deviation is compensated by parts of the circuit. Time-division multiplexing (TDM) of FBG sensors is achieved by an optical switch, allowing the system to combine up to 256 FBG sensors. A wavelength scanning speed of 800 Hz is achieved by the FPGA+ARM platform. In addition, a peak detection algorithm based on a self-adaption threshold is designed, and the peak recognition rate is 100%. Experiments at different temperatures were conducted to demonstrate the effectiveness of the system. Four FBG sensors were examined in a thermal chamber without stress. When the temperature changed from 0 °C to 100 °C, the degree of linearity between central wavelengths and temperature was about 0.999, with a temperature sensitivity of 10 pm/°C. The static interrogation precision was able to reach 0.5 pm. Through the comparison of different peak detection algorithms and interrogation approaches, the system was verified to have an optimum comprehensive performance in terms of precision, capacity and speed.
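A minimal sketch of adaptive-threshold peak picking on a filter scan: the threshold is derived from the scan itself rather than fixed in advance. The mean-plus-k-sigma rule and the synthetic scan are assumptions, since the abstract does not give the algorithm's exact form:

```python
def adaptive_peak_detect(spectrum, k=3.0):
    """Find local maxima above a self-adapting threshold (illustrative sketch).

    The threshold tracks the data (mean + k standard deviations), so it
    adapts to changing baselines and noise floors across scans.
    """
    n = len(spectrum)
    mean = sum(spectrum) / n
    var = sum((x - mean) ** 2 for x in spectrum) / n
    thr = mean + k * var ** 0.5
    return [i for i in range(1, n - 1)
            if spectrum[i] > thr
            and spectrum[i] >= spectrum[i - 1]
            and spectrum[i] >= spectrum[i + 1]]

# A flat baseline with two FBG reflection peaks (synthetic data):
scan = [0.1] * 50 + [0.2, 5.0, 0.2] + [0.1] * 50 + [0.3, 7.0, 0.3] + [0.1] * 50
print(adaptive_peak_detect(scan))  # indices of the two peak centres
```

Because the threshold is recomputed from each scan, a uniform drop in reflected power lowers the threshold too, and the peaks are still found.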

  17. A flash flood early warning system based on rainfall thresholds and daily soil moisture indexes

    Science.gov (United States)

    Brigandì, Giuseppina; Tito Aronica, Giuseppe

    2015-04-01

The main focus of the paper is to present a flash flood early warning system, developed for the Civil Protection Agency of the Sicily Region, for alerting extreme hydrometeorological events using a methodology based on the combined use of rainfall thresholds and soil moisture indexes. Flash flood warning is a key element in improving the capacity of Civil Protection to mitigate damages and safeguard the security of people. It is a rather complicated task, particularly in catchments with flashy response, where even brief anticipations are important and welcome. In this context, hydrological precursors can be considered to improve the effectiveness of the emergency actions (i.e. early flood warning). It is well known that soil moisture is an important factor in flood formation, because runoff generation is strongly influenced by the antecedent soil moisture conditions of the catchment. The basic idea of the work presented here is to use soil moisture indexes, derived in a continuous form, to define a first alert phase in a flash flood forecasting chain, and then to define a unique rainfall threshold for a given day for the activation of the subsequent alarm phases, derived as a function of the soil moisture conditions at the beginning of the day. Daily soil moisture indexes, representative of the moisture condition of the catchment, were derived using a parsimonious and simple-to-use approach based on the IHACRES model in a modified form developed by the authors. It is a simple, spatially-lumped rainfall-streamflow model, based on the SCS-CN method and on the unit hydrograph approach, that requires only rainfall, streamflow and air temperature data. It consists of two modules. In the first, a non-linear loss model, based on the SCS-CN method, was used to transform total rainfall into effective rainfall. In the second, a linear convolution of effective rainfall was performed using a total unit hydrograph with a configuration of
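The SCS-CN loss model named in this record has a standard closed form (metric units). The sketch below uses that textbook form, not the authors' modified implementation, and the curve numbers are invented:

```python
def scs_cn_effective_rainfall(p_mm, cn, lambda_ia=0.2):
    """SCS-CN runoff (effective rainfall) in mm.

    S is the potential maximum retention for curve number CN (metric form),
    and Ia = lambda * S is the initial abstraction (0.2 is the classic
    default). Runoff is zero until rainfall exceeds Ia.
    """
    s = 25400.0 / cn - 254.0
    ia = lambda_ia * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Wetter antecedent conditions -> higher CN -> more runoff for the same storm:
print(scs_cn_effective_rainfall(50.0, 60))  # drier catchment
print(scs_cn_effective_rainfall(50.0, 85))  # wetter catchment
```

This is why the daily soil moisture index matters: the same rainfall threshold corresponds to very different runoff depending on the antecedent wetness of the catchment.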

  18. Estimation of the LET threshold of single event upset of microelectronics in experiments with Cf-252

    International Nuclear Information System (INIS)

    Kuznetsov, N.V.; Nymmik, R.A.

    1996-01-01

A method is proposed for analyzing single event upsets (SEU) in large scale integration circuits of random access memory (RAM) when exposed to Cf-252 fission fragments. The method makes it possible to find the RAM linear energy transfer (LET) threshold to be used for estimations of RAM SEU rates in space. The method is illustrated by analyzing experimental data for the 2 x 8 kbit CMOS/bulk RAM. (author)

  19. Implications of the spatial dependence of the single-event-upset threshold in SRAMs measured with a pulsed laser

    International Nuclear Information System (INIS)

    Buchner, S.; Langworthy, J.B.; Stapor, W.J.; Campbell, A.B.; Rivet, S.

    1994-01-01

    Pulsed laser light was used to measure single event upset (SEU) thresholds for a large number of memory cells in both CMOS and bipolar SRAMs. Results showed that small variations in intercell upset threshold could not explain the gradual rise in the curve of cross section versus linear energy transfer (LET). The memory cells exhibited greater intracell variations implying that the charge collection efficiency within a memory cell varies spatially and contributes substantially to the shape of the curve of cross section versus LET. The results also suggest that the pulsed laser can be used for hardness-assurance measurements on devices with sensitive areas larger than the diameter of the laser beam

  20. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  1. Non-linear effects in electron cyclotron current drive applied for the stabilization of neoclassical tearing modes

    NARCIS (Netherlands)

    Ayten, B.; Westerhof, E.; ASDEX Upgrade team,

    2014-01-01

    Due to the smallness of the volumes associated with the flux surfaces around the O-point of a magnetic island, the electron cyclotron power density applied inside the island for the stabilization of neoclassical tearing modes (NTMs) can exceed the threshold for non-linear effects as derived

  2. Simulation study on single event burnout in linear doping buffer layer engineered power VDMOSFET

    International Nuclear Information System (INIS)

    Jia Yunpeng; Su Hongyuan; Hu Dongqing; Wu Yu; Jin Rui

    2016-01-01

The addition of a buffer layer can improve the device's secondary breakdown voltage and, thus, its single event burnout (SEB) threshold voltage. In this paper, an N-type linear doping buffer layer is proposed. Quasi-stationary avalanche simulations and heavy ion beam simulations show that an optimized linear doping buffer layer is critical. When SEB is induced by heavy-ion impact, the electric field in a device with an optimized linear doping buffer is much lower than in one with an optimized constant doping buffer layer, at a given buffer layer thickness and the same biasing voltages. The secondary breakdown voltage and the parasitic bipolar turn-on current are much higher than those with the optimized constant doping buffer layer. The linear buffer layer is therefore more advantageous for improving the device's SEB performance. (paper)

  3. A review of linear compressors for refrigeration

    OpenAIRE

    Liang, Kun

    2017-01-01

A linear compressor has no crank mechanism, unlike a conventional reciprocating compressor. This allows higher efficiency, oil-free operation, lower cost and smaller size when linear compressors are used in vapour compression refrigeration (VCR) systems. Typically, a linear compressor consists of a linear motor (connected to a piston) and suspension springs, operated at resonant frequency. This paper presents a review of linear compressors for refrigeration systems. Different designs and mod...
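The resonant operation mentioned here follows from treating the piston and suspension springs as a mass-spring oscillator. This sketch uses the textbook natural-frequency formula with invented numbers; the gas-spring term is an assumption about how such systems are usually modelled, not a result from the review:

```python
import math

def resonant_frequency(moving_mass_kg, spring_k_n_per_m, gas_k_n_per_m=0.0):
    """Natural frequency (Hz) of the piston/spring oscillator a linear
    compressor is driven at: f = (1 / 2*pi) * sqrt((k_spring + k_gas) / m).

    The compressed gas acts as an extra, pressure-dependent spring, so the
    operating resonance shifts with load.
    """
    k_total = spring_k_n_per_m + gas_k_n_per_m
    return math.sqrt(k_total / moving_mass_kg) / (2.0 * math.pi)

print(resonant_frequency(0.5, 20000.0, 4000.0))  # Hz, illustrative numbers
```

Driving the motor at this frequency minimizes the force needed to sustain the stroke, which is where the efficiency advantage over crank-driven compressors comes from.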

  4. A threshold model of investor psychology

    Science.gov (United States)

    Cross, Rod; Grinfeld, Michael; Lamba, Harbir; Seaman, Tim

    2005-08-01

    We introduce a class of agent-based market models founded upon simple descriptions of investor psychology. Agents are subject to various psychological tensions induced by market conditions and endowed with a minimal ‘personality’. This personality consists of a threshold level for each of the tensions being modeled, and the agent reacts whenever a tension threshold is reached. This paper considers an elementary model including just two such tensions. The first is ‘cowardice’, which is the stress caused by remaining in a minority position with respect to overall market sentiment and leads to herding-type behavior. The second is ‘inaction’, which is the increasing desire to act or re-evaluate one's investment position. There is no inductive learning by agents and they are only coupled via the global market price and overall market sentiment. Even incorporating just these two psychological tensions, important stylized facts of real market data, including fat-tails, excess kurtosis, uncorrelated price returns and clustered volatility over the timescale of a few days are reproduced. By then introducing an additional parameter that amplifies the effect of externally generated market noise during times of extreme market sentiment, long-time volatility correlations can also be recovered.
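A minimal sketch of the two-tension mechanism described in this record, with invented threshold ranges and update rules; it illustrates the structure of the model (tensions accumulate until a personal threshold triggers an action), not the paper's calibrated version:

```python
import random

def simulate(n_agents=100, n_steps=200, seed=3):
    """Toy two-tension threshold market (all parameters invented).

    Each agent holds a +1/-1 position and two tensions: 'cowardice' grows
    while the agent disagrees with overall market sentiment, and 'inaction'
    grows with time since the agent last acted. Crossing a personal threshold
    triggers herding or a random re-evaluation.
    """
    random.seed(seed)
    pos = [random.choice([-1, 1]) for _ in range(n_agents)]
    coward_thr = [random.uniform(1.0, 5.0) for _ in range(n_agents)]
    inact_thr = [random.uniform(5.0, 20.0) for _ in range(n_agents)]
    coward = [0.0] * n_agents
    inact = [0.0] * n_agents
    prices = [0.0]
    for _ in range(n_steps):
        sentiment = sum(pos) / n_agents
        for i in range(n_agents):
            if pos[i] * sentiment < 0:          # in the minority: stress grows
                coward[i] += 1.0
            inact[i] += 1.0
            if coward[i] > coward_thr[i]:       # herd: join the majority
                pos[i] = 1 if sentiment >= 0 else -1
                coward[i] = inact[i] = 0.0
            elif inact[i] > inact_thr[i]:       # re-evaluate at random
                pos[i] = random.choice([-1, 1])
                inact[i] = 0.0
        prices.append(prices[-1] + sentiment + 0.1 * random.gauss(0.0, 1.0))
    return prices

prices = simulate()
print(len(prices), prices[-1])
```

Even this caricature exhibits the herding bursts that, in the full model, produce fat tails and clustered volatility.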

  5. Threshold velocity for environmentally-assisted cracking in low alloy steels

    International Nuclear Information System (INIS)

    Wire, G.L.; Kandra, J.T.

    1997-01-01

Environmentally Assisted Cracking (EAC) in low alloy steels is generally believed to be activated by dissolution of MnS inclusions at the crack tip in high-temperature LWR environments. EAC is an increase in the fatigue crack growth rate, to as much as 40 to 100 times the rate in air, that occurs in high-temperature LWR environments. A steady-state theory developed by Combrade suggested that EAC will initiate only above a critical crack velocity and cease below this same velocity. A range of about twenty in critical crack tip velocities was invoked by Combrade et al. to describe the data available at that time. This range was attributed to the exposure of additional sulfides above and below the crack plane. However, direct measurements of exposed sulfide densities on cracked specimens were performed herein, and the results rule out significant additional sulfide exposure as a plausible explanation. Alternatively, it is proposed herein that localized EAC starting at large sulfide clusters reduces the calculated threshold velocity from the value predicted for a uniform distribution of sulfides. Calculations are compared with experimental results where the threshold velocity has been measured, and the predicted wide range of threshold values for steels of similar sulfur content but varying sulfide morphology is observed. The threshold velocity decreases with increasing maximum sulfide particle size, qualitatively consistent with the theory. The calculation provides a basis for a conservative minimum velocity threshold tied directly to the steel sulfur level, in cases where no details of the sulfide distribution are known.

  6. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.
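    In standard notation (the symbols below are conventional, not quoted from the paper), the global invariant described in the abstract is:

```latex
% lct(X,D): log canonical threshold of an effective Q-divisor D on X,
%   lct(X,D) = sup { c > 0 : the pair (X, cD) is log canonical }.
% Global log canonical threshold (= Tian's alpha-invariant for smooth Fano X):
\[
  \operatorname{lct}(X)
  \;=\;
  \inf\left\{ \operatorname{lct}(X,D) \;\middle|\;
    D \ \text{an effective } \mathbb{Q}\text{-divisor on } X,\
    D \equiv -K_X \right\}
\]
```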

  7. Comparison of linear and non-linear models for predicting energy expenditure from raw accelerometer data.

    Science.gov (United States)

    Montoye, Alexander H K; Begum, Munni; Henning, Zachary; Pfeiffer, Karin A

    2017-02-01

    This study had three purposes, all related to evaluating energy expenditure (EE) prediction accuracy from body-worn accelerometers: (1) compare linear regression to linear mixed models, (2) compare linear models to artificial neural network (ANN) models, and (3) compare accuracy of accelerometers placed on the hip, thigh, and wrists. Forty individuals performed 13 activities in a 90 min semi-structured, laboratory-based protocol. Participants wore accelerometers on the right hip, right thigh, and both wrists, and a portable metabolic analyzer (EE criterion). Four EE prediction models were developed for each accelerometer: linear regression, linear mixed, and two ANN models. EE prediction accuracy was assessed using correlations, root mean square error (RMSE), and bias, and was compared across models and accelerometers using repeated-measures analysis of variance. For all accelerometer placements, there were no significant differences in correlations or RMSE between linear regression and linear mixed models (correlations: r = 0.71-0.88, RMSE: 1.11-1.61 METs; p > 0.05). For the thigh-worn accelerometer, there were no differences in correlations or RMSE between linear and ANN models (ANN-correlations: r = 0.89, RMSE: 1.07-1.08 METs. Linear models-correlations: r = 0.88, RMSE: 1.10-1.11 METs; p > 0.05). Conversely, one ANN had higher correlations and lower RMSE than both linear models for the hip (ANN-correlation: r = 0.88, RMSE: 1.12 METs. Linear models-correlations: r = 0.86, RMSE: 1.18-1.19 METs; p < 0.05), and the ANN models had higher correlations and lower RMSE than the linear models for the wrist-worn accelerometers (ANN-correlations: r = 0.82-0.84, RMSE: 1.26-1.32 METs. Linear models-correlations: r = 0.71-0.73, RMSE: 1.55-1.61 METs; p < 0.05). For wrist-worn accelerometers, ANN models offer a significant improvement in EE prediction accuracy over linear models. Conversely, linear models showed similar EE prediction accuracy to machine learning models for hip- and thigh-worn accelerometers.
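    The accuracy metrics used above (Pearson correlation and RMSE between criterion and predicted EE) are straightforward to compute; a small sketch with an ordinary least-squares linear model on synthetic stand-in data (all data values here are invented for illustration):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, the accuracy metric used in the study."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def pearson_r(y_true, y_pred):
    """Pearson correlation between criterion and predicted EE."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

# Synthetic stand-in data: one accelerometer feature vs. EE in METs
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)                   # e.g. mean acceleration counts
ee = 1.0 + 0.8 * x + rng.normal(0, 1.2, 200)  # criterion EE with noise

# Ordinary least-squares linear regression via the normal equations
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, ee, rcond=None)
pred = X @ beta

print(rmse(ee, pred), pearson_r(ee, pred))
```

    The same two metrics could then be compared across model families (linear vs. ANN) as the study does with repeated-measures ANOVA.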

  8. A meta-analysis of cambium phenology and growth: linear and non-linear patterns in conifers of the northern hemisphere.

    Science.gov (United States)

    Rossi, Sergio; Anfodillo, Tommaso; Cufar, Katarina; Cuny, Henri E; Deslauriers, Annie; Fonti, Patrick; Frank, David; Gricar, Jozica; Gruber, Andreas; King, Gregory M; Krause, Cornelia; Morin, Hubert; Oberhuber, Walter; Prislan, Peter; Rathgeber, Cyrille B K

    2013-12-01

    Ongoing global warming has been implicated in shifting phenological patterns such as the timing and duration of the growing season across a wide variety of ecosystems. Linear models are routinely used to extrapolate these observed shifts in phenology into the future and to estimate changes in associated ecosystem properties such as net primary productivity. Yet, in nature, linear relationships may be special cases. Biological processes frequently follow more complex, non-linear patterns according to limiting factors that generate shifts and discontinuities, or contain thresholds beyond which responses change abruptly. This study investigates to what extent cambium phenology is associated with xylem growth and differentiation across conifer species of the northern hemisphere. Xylem cell production is compared with the periods of cambial activity and cell differentiation assessed on a weekly time scale on histological sections of cambium and wood tissue collected from the stems of nine species in Canada and Europe over 1-9 years per site from 1998 to 2011. The dynamics of xylogenesis were surprisingly homogeneous among conifer species, although deviations from the average were observed. Within the range analysed, the relationships between the phenological timings were linear, with several slopes showing values close to or not statistically different from 1. The relationships between the phenological timings and cell production were distinctly non-linear, and involved an exponential pattern. The trees adjust their phenological timings according to linear patterns. Thus, shifts of one phenological phase are associated with synchronous and comparable shifts of the successive phases. However, small increases in the duration of xylogenesis could correspond to a substantial increase in cell production. The findings suggest that the length of the growing season and the resulting amount of growth could respond differently to changes in environmental conditions.

  9. Electric moulding of dispersed lipid nanotubes into a nanofluidic device.

    Science.gov (United States)

    Frusawa, Hiroshi; Manabe, Tatsuhiko; Kagiyama, Eri; Hirano, Ken; Kameta, Naohiro; Masuda, Mitsutoshi; Shimizu, Toshimi

    2013-01-01

    Hydrophilic nanotubes formed by lipid molecules have potential applications as platforms for chemical or biological events occurring in an attolitre volume inside a hollow cylinder. Here, we have integrated lipid nanotubes (LNTs) by applying an AC electric field via plug-in electrode needles placed above a substrate. The off-chip assembly method has the on-demand adjustability of an electrode configuration, enabling the dispersed LNTs to be electrically moulded into a separate film of parallel LNT arrays in one step. Fluorescence resonance energy transfer as well as digital microscopy visualised the filling of gold nanoparticles up to the inner capacity of an LNT film by capillary action, thereby showing the potential of this flexible film for use as a high-throughput nanofluidic device in which not only are the endo-signalling and products in each LNT multiplied, but the encapsulated objects are also efficiently transported and reacted.

  10. Magnetic x-ray linear dichroism in resonant and non-resonant Gd 4f photoemission

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, S.; Gammon, W.J.; Pappas, D.P. [Virginia Commonwealth Univ., Richmond, VA (United States)] [and others]

    1997-04-01

    The enhancement of the magnetic linear dichroism in resonant 4f photoemission (MLDRPE) is studied from a 50 monolayer film of Gd/Y(0001). The ALS at beamline 7.0.1 provided the source of linearly polarized x-rays used in this study. The polarized light was incident at an angle of 30 degrees relative to the film plane, and the sample magnetization was perpendicular to the photon polarization. The linear dichroism of the 4f core levels is measured as the photon energy is tuned through the 4d-4f resonance. The authors find that the MLDRPE asymmetry is strongest at the resonance. Near the threshold the asymmetry has several features which are out of phase with the fine structure of the total yield.

  11. Magnetic x-ray linear dichroism in resonant and non-resonant Gd 4f photoemission

    International Nuclear Information System (INIS)

    Mishra, S.; Gammon, W.J.; Pappas, D.P.

    1997-01-01

    The enhancement of the magnetic linear dichroism in resonant 4f photoemission (MLDRPE) is studied from a 50 monolayer film of Gd/Y(0001). The ALS at beamline 7.0.1 provided the source of linearly polarized x-rays used in this study. The polarized light was incident at an angle of 30 degrees relative to the film plane, and the sample magnetization was perpendicular to the photon polarization. The linear dichroism of the 4f core levels is measured as the photon energy is tuned through the 4d-4f resonance. The authors find that the MLDRPE asymmetry is strongest at the resonance. Near the threshold the asymmetry has several features which are out of phase with the fine structure of the total yield

  12. Reaction πN → ππN near threshold

    Energy Technology Data Exchange (ETDEWEB)

    Frlez, Emil [Univ. of Virginia, Charlottesville, VA (United States)]

    1993-11-01

    The LAMPF E1179 experiment used the π0 spectrometer and an array of charged particle range counters to detect and record π+π0, π0p, and π+π0p coincidences following the reaction π+p → π0π+p near threshold. The total cross sections for single pion production were measured at the incident pion kinetic energies 190, 200, 220, 240, and 260 MeV. Absolute normalizations were fixed by measuring π+p elastic scattering at 260 MeV. A detailed analysis of the π0 detection efficiency was performed using cosmic ray calibrations and pion single charge exchange measurements with a 30 MeV π- beam. All published data on πN → ππN, including our results, are simultaneously fitted to yield a common chiral symmetry breaking parameter ξ = -0.25 ± 0.10. The threshold matrix element |a0(π0π+p)| determined by linear extrapolation yields the value of the s-wave isospin-2 ππ scattering length a₀²(ππ) = -0.041 ± 0.003 mπ⁻¹, within the framework of soft-pion theory.

  13. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  14. The effect of power change on the PCI failure threshold

    Energy Technology Data Exchange (ETDEWEB)

    Sipush, P J; Kaiser, R S [Westinghouse Nuclear Fuel Division, Pittsburg, PA (United States)

    1983-06-01

    Investigations of the PCI mechanism have led to the conclusion that the failure threshold is best defined by the power change ({delta}P) during the ramp, rather than the final power achieved at the end of the ramp. The data base studied was comprehensive and includes a wide variety of water reactor systems and fuel designs. It has also been found that operating parameters have a more significant effect on failure susceptibility than fuel rod design variables. The most significant operating variable affecting the failure threshold was found to be the base irradiation history, indicating that fission product release and migration prior to the ramp (during base irradiation) is an important consideration. It can be shown that fuel irradiated at relatively higher linear heat ratings tends to fail at lower {delta}P. This effect has also been independently verified by statistical analyses which will also be discussed. Industry out-of-pile internal gas pressurization tests with irradiated tubing in the absence of simulated fission product species and at low stress levels also tend to indicate the importance of the prior irradiation history on PCI performance. Other parameters that affect the power ramping performance are the initial ramping power and the pellet power distribution, which is a function of fuel enrichment and burnup. (author)

  15. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined the threshold of motion as the point at which he observed 100% of the bed in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and a novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures, and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro- to macro-scale applications. For example, a geologic-timescale application corresponds to a threshold at which 100% of the bed is in motion, whereas a sub-second application corresponds to a threshold at which a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for re-evaluating sand transport thresholds on Earth, Mars and Titan.

  16. Lipoproteins of slow-growing Mycobacteria carry three fatty acids and are N-acylated by apolipoprotein N-acyltransferase BCG_2070c.

    Science.gov (United States)

    Brülle, Juliane K; Tschumi, Andreas; Sander, Peter

    2013-10-05

    Lipoproteins are virulence factors of Mycobacterium tuberculosis. Bacterial lipoproteins are modified by the consecutive action of preprolipoprotein diacylglyceryl transferase (Lgt), prolipoprotein signal peptidase (LspA) and apolipoprotein N-acyltransferase (Lnt) leading to the formation of mature triacylated lipoproteins. Lnt homologues are found in Gram-negative and high GC-rich Gram-positive, but not in low GC-rich Gram-positive bacteria, although N-acylation is observed. In fast-growing Mycobacterium smegmatis, the molecular structure of the lipid modification of lipoproteins was resolved recently as a diacylglyceryl residue carrying ester-bound palmitic acid and ester-bound tuberculostearic acid and an additional amide-bound palmitic acid. We exploit the vaccine strain Mycobacterium bovis BCG as model organism to investigate lipoprotein modifications in slow-growing mycobacteria. Using Escherichia coli Lnt as a query in BLASTp search, we identified BCG_2070c and BCG_2279c as putative lnt genes in M. bovis BCG. Lipoproteins LprF, LpqH, LpqL and LppX were expressed in M. bovis BCG and the BCG_2070c lnt knock-out mutant, and lipid modifications were analyzed at the molecular level by matrix-assisted laser desorption ionization time-of-flight/time-of-flight analysis. Lipoprotein N-acylation was observed in wildtype but not in BCG_2070c mutants. Lipoprotein N-acylation with palmitoyl and tuberculostearyl residues was observed. Lipoproteins are triacylated in slow-growing mycobacteria. BCG_2070c encodes a functional Lnt in M. bovis BCG. We identified mycobacteria-specific tuberculostearic acid as further substrate for N-acylation in slow-growing mycobacteria.

  17. An energy-based body temperature threshold between torpor and normothermia for small mammals.

    Science.gov (United States)

    Willis, Craig K R

    2007-01-01

    Field studies of use of torpor by heterothermic endotherms suffer from the lack of a standardized threshold differentiating torpid body temperatures (T(b)) from normothermic T(b)'s. This threshold can be more readily observed if metabolic rate (MR) is measured in the laboratory. I digitized figures from the literature that depicted simultaneous traces of MR and T(b) from 32 respirometry runs for 14 mammal species. For each graph, I quantified the T(b) measured when MR first began to drop at the onset of torpor (T(b-onset)). I used a general linear model to quantify the effect of ambient temperature (T(a)) and body mass (BM) on T(b-onset). For species lighter than 70 g, the model was highly significant and was described by the equation T(b-onset) = (0.055 ± 0.014)BM + (0.071 ± 0.031)T(a) + (31.823 ± 0.740). To be conservative, I recommend use of these model parameters minus 1 standard error, which modifies the equation to T(b-onset) - 1 SE = (0.041)BM + (0.040)T(a) + 31.083. This approach provides a standardized threshold for differentiating torpor from normothermia that is based on use of energy, the actual currency of interest for studies of torpor in the wild. Few laboratory studies have presented the time-course data required to quantify T(b-onset), so more data are needed to validate this relationship.
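    The conservative threshold equation in the abstract can be evaluated directly; the function name below is hypothetical, and the coefficients are taken from the study's reduced (minus 1 SE) model:

```python
def torpor_onset_threshold(body_mass_g, ambient_temp_c):
    """Conservative torpor-onset body temperature (deg C):
    T(b-onset) - 1 SE = 0.041*BM + 0.040*Ta + 31.083,
    fitted for mammal species lighter than 70 g."""
    if not 0 < body_mass_g < 70:
        raise ValueError("model was fitted for species lighter than 70 g")
    return 0.041 * body_mass_g + 0.040 * ambient_temp_c + 31.083

# A 30 g mammal at 10 deg C ambient temperature:
print(round(torpor_onset_threshold(30, 10), 3))  # 32.713
```

    Body temperatures below this value would be classified as torpor under the proposed standard.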

  18. Sub-threshold Post Traumatic Stress Disorder in the WHO World Mental Health Surveys

    Science.gov (United States)

    McLaughlin, Katie A.; Koenen, Karestan C.; Friedman, Matthew J.; Ruscio, Ayelet Meron; Karam, Elie G.; Shahly, Victoria; Stein, Dan J.; Hill, Eric D.; Petukhova, Maria; Alonso, Jordi; Andrade, Laura Helena; Angermeyer, Matthias C.; Borges, Guilherme; de Girolamo, Giovanni; de Graaf, Ron; Demyttenaere, Koen; Florescu, Silvia E.; Mladenova, Maya; Posada-Villa, Jose; Scott, Kate M.; Takeshima, Tadashi; Kessler, Ronald C.

    2014-01-01

    Background: Although only a minority of people exposed to a traumatic event (TE) develops PTSD, symptoms not meeting full PTSD criteria are common and often clinically significant. Individuals with these symptoms have sometimes been characterized as having sub-threshold PTSD, but no consensus exists on the optimal definition of this term. Data from a large cross-national epidemiological survey are used to provide a principled basis for such a definition. Methods: The WHO World Mental Health (WMH) Surveys administered fully-structured psychiatric diagnostic interviews to community samples in 13 countries containing assessments of PTSD associated with randomly selected TEs. Focusing on the 23,936 respondents reporting lifetime TE exposure, associations of approximated DSM-5 PTSD symptom profiles with six outcomes (distress-impairment, suicidality, comorbid fear-distress disorders, PTSD symptom duration) were examined to investigate the implications of different sub-threshold definitions. Results: Although the highest distress-impairment, suicidality, comorbidity, and symptom duration were consistently observed among the 3.0% of respondents with DSM-5 PTSD, the additional 3.6% of respondents meeting two or three of DSM-5 Criteria B-E also had significantly elevated scores for most outcomes. The proportion of cases with threshold versus sub-threshold PTSD varied depending on TE type, with threshold PTSD more common following interpersonal violence and sub-threshold PTSD more common following events happening to loved ones. Conclusions: Sub-threshold DSM-5 PTSD is most usefully defined as meeting two or three of the DSM-5 Criteria B-E. Use of a consistent definition is critical to advance understanding of the prevalence, predictors, and clinical significance of sub-threshold PTSD. PMID:24842116

  19. Electrophysiological gap detection thresholds: effects of age and comparison with a behavioral measure.

    Science.gov (United States)

    Palmer, Shannon B; Musiek, Frank E

    2014-01-01

    Temporal processing ability has been linked to speech understanding ability and older adults often complain of difficulty understanding speech in difficult listening situations. Temporal processing can be evaluated using gap detection procedures. There is some research showing that gap detection can be evaluated using an electrophysiological procedure. However, there is currently no research establishing gap detection threshold using the N1-P2 response. The purposes of the current study were to 1) determine gap detection thresholds in younger and older normal-hearing adults using an electrophysiological measure, 2) compare the electrophysiological gap detection threshold and behavioral gap detection threshold within each group, and 3) investigate the effect of age on each gap detection measure. This study utilized an older adult group and younger adult group to compare performance on an electrophysiological and behavioral gap detection procedure. The subjects in this study were 11 younger, normal-hearing adults (mean = 22 yrs) and 11 older, normal-hearing adults (mean = 64.36 yrs). All subjects completed an adaptive behavioral gap detection procedure in order to determine their behavioral gap detection threshold (BGDT). Subjects also completed an electrophysiologic gap detection procedure to determine their electrophysiologic gap detection threshold (EGDT). Older adults demonstrated significantly larger gap detection thresholds than the younger adults. However, EGDT and BGDT were not significantly different in either group. The mean difference between EGDT and BGDT for all subjects was 0.43 msec. Older adults show poorer gap detection ability when compared to younger adults. However, this study shows that gap detection thresholds can be measured using evoked potential recordings and yield results similar to a behavioral measure. American Academy of Audiology.

  20. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine if the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have on U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  1. Non-linear absorption for concentrated solar energy transport

    Energy Technology Data Exchange (ETDEWEB)

    Jaramillo, O. A.; Del Rio, J.A.; Huelsz, G. [Centro de Investigacion de Energia, UNAM, Temixco, Morelos (Mexico)]

    2000-07-01

    In order to determine the maximum solar energy that can be transported using SiO{sub 2} optical fibers, an analysis of non-linear absorption is required. In this work, we model the interaction between solar radiation and the SiO{sub 2} optical fiber core to determine the dependence of the absorption on the radiative intensity. Using Maxwell's equations we obtain the relation between the refractive index and the electric susceptibility up to second order in terms of the electric field intensity. This is not enough to obtain an explicit expression for the non-linear absorption. Thus, to obtain the non-linear optical response, we develop a microscopic model of damped, driven harmonic oscillators based on the Drude-Lorentz theory. We solve this model using experimental information for the SiO{sub 2} optical fiber, and we determine the frequency dependence of the non-linear absorption and the non-linear extinction of SiO{sub 2} optical fibers. Our results estimate that the average value over the solar spectrum of the non-linear extinction coefficient for SiO{sub 2} is k{sub 2}=10{sup -29} m{sup 2}V{sup -2}. With this result we conclude that the non-linear part of the absorption coefficient of SiO{sub 2} optical fibers during the transport of concentrated solar energy achieved by a circular concentrator is negligible, and therefore the use of optical fibers for solar applications is a viable option.

  2. Investigando a hipótese da paridade do poder de compra: um enfoque não linear Testing the long-run purchasing power parity hypothesis: a nonlinear approach

    Directory of Open Access Journals (Sweden)

    André M. Marques

    2011-08-01

    The Purchasing Power Parity (PPP) hypothesis is investigated by analysing the long-run dynamics of the effective real exchange rate in Brazil. Using monthly data, the main objective of the study is to test the PPP hypothesis with a nonlinear approach. The methodology applies general nonlinearity tests (Keenan, 1985; Tsay, 1986) and tests specific to threshold-type nonlinearity (Chan, 1990). Following Hansen's (1999) methodology, the next stage consists of testing the number of regimes needed to describe the nonlinear dynamics of the real exchange rate. A Self-Exciting Threshold Autoregressive (SETAR) model is then fitted to identify thresholds at which the exchange rate dynamics switch regime. All the tests suggest that the Brazilian real exchange rate is highly nonlinear, and the estimations indicate only two distinct regimes, with different persistence and volatility. The skeleton of the fitted SETAR models shows that the PPP hypothesis is supported in the long run, despite short-run deviations.
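    A SETAR model of the kind used above switches AR dynamics when a lagged value crosses a threshold. The sketch below is an illustrative grid-search estimator on simulated data, not the authors' procedure; all parameter values are assumptions:

```python
import numpy as np

def fit_setar(y, delay=1, ar_order=1):
    """Grid-search fit of a 2-regime SETAR model: choose the threshold on
    y[t-delay] that minimizes the total SSE of separate AR fits per regime."""
    y = np.asarray(y, dtype=float)
    p = max(delay, ar_order)
    Y = y[p:]
    lags = np.column_stack([y[p - k:-k] for k in range(1, ar_order + 1)])
    z = y[p - delay:-delay]                        # threshold variable
    candidates = np.quantile(z, np.linspace(0.15, 0.85, 50))
    best = None
    for thr in candidates:
        sse = 0.0
        for mask in (z <= thr, z > thr):
            if mask.sum() <= ar_order + 1:         # regime too small to fit
                break
            X = np.column_stack([np.ones(mask.sum()), lags[mask]])
            beta, *_ = np.linalg.lstsq(X, Y[mask], rcond=None)
            sse += float(np.sum((Y[mask] - X @ beta) ** 2))
        else:
            if best is None or sse < best[1]:
                best = (float(thr), sse)
    return best                                    # (estimated threshold, SSE)

# Simulate a 2-regime SETAR(1) series whose true threshold is 0
rng = np.random.default_rng(42)
y = np.zeros(2000)
for t in range(1, 2000):
    phi = 0.9 if y[t - 1] <= 0.0 else -0.5         # persistent vs. mean-reverting
    y[t] = phi * y[t - 1] + rng.normal(0, 0.5)

thr, _ = fit_setar(y)
print(thr)
```

    With two clearly different AR coefficients, the recovered threshold lands near the true switch point; in the paper, the threshold marks the level at which the real exchange rate changes persistence and volatility.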

  3. Threshold pump intensity effect on the refractive index changes in InGaN SQD: Internal constitution and size effects

    Energy Technology Data Exchange (ETDEWEB)

    El Ghazi, Haddou, E-mail: hadghazi@gmail.com [Special Mathematics, CPGE Rabat (Morocco); LPS, Faculty of Science, Dhar El Mehrez, BP 1796 Fes-Atlas (Morocco)]; A John Peter [P.G. & Research Department of Physics, Government Arts and Science College, Melur 625106, Madurai (India)]

    2015-04-01

    In the present paper, internal composition and size-dependent threshold pump intensity effects on on-center impurity-related linear, third-order nonlinear and total refractive index changes are investigated in wurtzite (In,Ga)N/GaN unstrained spherical quantum dot. The calculation is performed within the framework of parabolic band and single band effective-mass approximations using a combination of Quantum Genetic Algorithm (QGA) and Hartree–Fock–Roothaan (HFR) method. According to the results obtained, (i) a significant red-shift (blue shift) is obtained as the dot size (potential barrier) increases and (ii) a threshold optical pump intensity depending strongly on the size and the internal composition is obtained which constitutes the limit between two behaviors.

  4. Linear theory on temporal instability of megahertz faraday waves for monodisperse microdroplet ejection.

    Science.gov (United States)

    Tsai, Shirley C; Tsai, Chen S

    2013-08-01

    A linear theory on temporal instability of megahertz Faraday waves for monodisperse microdroplet ejection based on mass conservation and linearized Navier-Stokes equations is presented using the most recently observed micrometer-sized droplet ejection from a millimeter-sized spherical water ball as a specific example. The theory is verified in the experiments utilizing silicon-based multiple-Fourier horn ultrasonic nozzles at megahertz frequency to facilitate temporal instability of the Faraday waves. Specifically, the linear theory not only correctly predicted the Faraday wave frequency and onset threshold of Faraday instability, the effect of viscosity, and the dynamics of droplet ejection, but also established the first theoretical formula for the size of the ejected droplets, namely, the droplet diameter equals four-tenths of the Faraday wavelength involved. The high rate of increase in Faraday wave amplitude at megahertz drive frequency subsequent to onset threshold, together with enhanced excitation displacement on the nozzle end face, facilitated by the megahertz multiple Fourier horns in resonance, led to high-rate ejection of micrometer-sized monodisperse droplets (>10^7 droplets/s) at low electrical drive power (<1 W) with short initiation time (<0.05 s). This is in stark contrast to the Rayleigh-Plateau instability of a liquid jet, which ejects one droplet at a time. The measured diameters of the droplets ranging from 2.2 to 4.6 μm at 2 to 1 MHz drive frequency fall within the optimum particle size range for pulmonary drug delivery.
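    The "diameter = four-tenths of the Faraday wavelength" result can be turned into a rough size estimate. Two assumptions are made here that are not stated in the abstract: the Faraday wave responds subharmonically at half the drive frequency, and its wavelength follows the deep-water capillary dispersion relation ω² = σk³/ρ:

```python
import math

def droplet_diameter(drive_freq_hz, surface_tension=0.072, density=1000.0):
    """Estimate ejected droplet diameter as 0.4 x Faraday wavelength.
    Assumptions (for illustration): subharmonic Faraday response at half the
    drive frequency; capillary-wave dispersion w^2 = sigma*k^3/rho (water)."""
    omega = 2 * math.pi * (drive_freq_hz / 2)            # Faraday angular frequency
    k = (density * omega**2 / surface_tension) ** (1 / 3)
    wavelength = 2 * math.pi / k
    return 0.4 * wavelength

# Water at a 1 MHz drive: diameter in meters (order of a few micrometers)
print(droplet_diameter(1e6))
```

    Under these assumptions a 1 MHz drive gives a diameter of roughly 5 μm, the same order as the 2.2-4.6 μm droplets measured in the study, and the predicted size shrinks as the drive frequency rises.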

  5. Evaluation of mechanical properties of Dy123 bulk superconductors by 3-point bending tests

    International Nuclear Information System (INIS)

    Katagiri, K.; Hatakeyama, Y.; Sato, T.; Kasaba, K.; Shoji, Y.; Murakami, A.; Teshima, H.; Hirano, H.

    2006-01-01

    In order to evaluate the mechanical properties, such as Young's modulus and strength, of Dy123 bulk superconductors and those with 10 wt.% Ag2O, we performed 3-point bending tests at room (RT) and liquid nitrogen temperatures (LNT) using specimens cut from the bulks. The Young's modulus and the bending strength increased with decrease in temperature. In the tests loading in the direction of the c-axis and in those perpendicular to it, Young's moduli were almost comparable at both RT and LNT. Although the strengths for both orientations were also comparable at LNT, those at RT were different. Young's moduli loaded in the direction of the c-axis for Ag2O-added bulk specimens were, at 127 GPa on average at RT, almost comparable to those without Ag2O, and, at 134 GPa at LNT, slightly lower than those without Ag2O. On the other hand, the strengths at both RT and LNT were enhanced by 20% by the Ag addition. The mechanical properties of Dy123 bulks without Ag2O were compared with those of Y123 bulks obtained previously. The Young's modulus for loading in the direction of the c-axis was slightly lower than, and the strength comparable to, those of the Y123 bulks.

  6. Software thresholds alter the bias of actigraphy for monitoring sleep in team-sport athletes.

    Science.gov (United States)

    Fuller, Kate L; Juliff, Laura; Gore, Christopher J; Peiffer, Jeremiah J; Halson, Shona L

    2017-08-01

    Actical® actigraphy is commonly used to monitor athlete sleep. The proprietary software, called Actiware®, processes data with three different sleep-wake thresholds (Low, Medium or High), but there is no standardisation regarding their use. The purpose of this study was to examine the validity and bias of the sleep-wake thresholds for processing Actical® sleep data in team-sport athletes. Validation study comparing the actigraph against the accepted gold standard, polysomnography (PSG). Sixty-seven nights of sleep were recorded simultaneously with polysomnography and Actical® devices. Individual-night data were compared across five sleep measures for each sleep-wake threshold using Actiware® software. Accuracy of each sleep-wake threshold compared with PSG was evaluated from mean bias with 95% confidence limits, Pearson product-moment correlation and associated standard error of estimate. The Medium threshold generated the smallest mean bias compared with polysomnography for total sleep time (8.5 min), sleep efficiency (1.8%) and wake after sleep onset (-4.1 min), whereas the Low threshold had the smallest bias (7.5 min) for wake bouts. Bias in sleep onset latency was the same across thresholds (-9.5 min). The standard error of the estimate was similar across all thresholds: total sleep time ~25 min, sleep efficiency ~4.5%, wake after sleep onset ~21 min, and wake bouts ~8 counts. Sleep parameters measured by the Actical® device are greatly influenced by the sleep-wake threshold applied. In the present study the Medium threshold produced the smallest bias for most parameters compared with PSG. Given the magnitude of measurement variability, confidence limits should be employed when interpreting changes in sleep parameters. Copyright © 2017 Sports Medicine Australia. All rights reserved.
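    The bias-with-confidence-limits comparison used above is easy to reproduce for any device-versus-criterion data. This is an illustrative Bland-Altman-style calculation (the study's exact procedure may differ), and the night-by-night values below are invented for the example:

```python
import numpy as np

def bias_with_limits(device, criterion):
    """Mean bias and 95% limits of agreement between a device measure
    (e.g. actigraphy) and a criterion measure (e.g. PSG)."""
    d = np.asarray(device, float) - np.asarray(criterion, float)
    mean_bias = float(d.mean())
    half_width = 1.96 * float(d.std(ddof=1)) if d.size > 1 else 0.0
    return mean_bias, (mean_bias - half_width, mean_bias + half_width)

# Hypothetical total-sleep-time values (minutes) for 6 nights
actigraphy = [420, 455, 390, 410, 470, 432]
psg        = [410, 450, 385, 400, 455, 430]
bias, (lo, hi) = bias_with_limits(actigraphy, psg)
print(round(bias, 1), round(lo, 1), round(hi, 1))
```

    A positive bias means the device overestimates the criterion; the width of the limits conveys the measurement variability the study warns about.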

  7. 11 CFR 9036.1 - Threshold submission.

    Science.gov (United States)

    2010-01-01

    ... credit or debit card, including one made over the Internet, the candidate shall provide sufficient... section shall not count toward the threshold amount. (c) Threshold certification by Commission. (1) After...

  8. 18F-FDG PET/CT-based gross tumor volume definition for radiotherapy in head and neck Cancer: a correlation study between suitable uptake value threshold and tumor parameters

    International Nuclear Information System (INIS)

    Kao, Chia-Hung; Hsieh, Te-Chun; Yu, Chun-Yen; Yen, Kuo-Yang; Yang, Shih-Neng; Wang, Yao-Ching; Liang, Ji-An; Chien, Chun-Ru; Chen, Shang-Wen

    2010-01-01

To define a suitable threshold setting for gross tumor volume (GTV) when using 18F-fluorodeoxyglucose positron emission tomography/computed tomography (PET/CT) for radiotherapy planning in head and neck cancer (HNC). Fifteen HNC patients prospectively received PET/CT simulation for their radiation treatment planning. Biological target volume (BTV) was derived from the PET/CT-based GTV of the primary tumor. The BTVs were defined as the isodensity volumes obtained by adjusting the percentage of the maximal standardized uptake value (SUVmax), excluding any artifact from surrounding normal tissues. The CT-based primary GTV (C-pGTV), previously defined by radiation oncologists, was compared with the BTV. The suitable threshold level (sTL) was determined as the threshold level at which the BTV value and morphology best fit the C-pGTV. The suitable standardized uptake value (sSUV) was calculated as the sTL multiplied by the SUVmax. Our results demonstrated that no single sTL or sSUV could achieve an optimized volumetric match with the C-pGTV. The sTL ranged from 13% to 27% (mean, 19%), whereas the sSUV ranged from 1.64 to 3.98 (mean, 2.46). The sTL was inversely correlated with the SUVmax [sTL = -0.1004 Ln(SUVmax) + 0.4464; R² = 0.81]. The sSUV showed a linear correlation with the SUVmax (sSUV = 0.0842 SUVmax + 1.248; R² = 0.89). The sTL was not associated with the value of the C-pGTVs. In PET/CT-based BTV definition for HNC, a suitable threshold or SUV level can be established by correlating with the SUVmax rather than by using a fixed threshold.
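The two regression fits reported in the abstract can be applied directly to a measured SUVmax. The equations are taken from the abstract; the example SUVmax values are arbitrary illustrations, not patient data.

```python
# SUVmax-adaptive thresholding for PET/CT biological target volumes,
# using the regressions reported in the abstract:
#   sTL  = -0.1004 * ln(SUVmax) + 0.4464   (R^2 = 0.81)
#   sSUV =  0.0842 * SUVmax + 1.248        (R^2 = 0.89)
import math

def suitable_threshold_level(suv_max: float) -> float:
    """Fractional threshold of SUVmax for contouring (study's fitted curve)."""
    return -0.1004 * math.log(suv_max) + 0.4464

def suitable_suv(suv_max: float) -> float:
    """Absolute SUV cutoff for contouring (study's fitted line)."""
    return 0.0842 * suv_max + 1.248

for suv_max in (8.0, 15.0, 25.0):   # arbitrary example tumors
    print(f"SUVmax={suv_max:4.1f}  sTL={suitable_threshold_level(suv_max):.2f}"
          f"  sSUV={suitable_suv(suv_max):.2f}")
```

This illustrates the paper's main point: the contouring threshold should fall as SUVmax rises, rather than staying at a fixed percentage.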

  9. Threshold-linear analysis of measures of fertility in artificial insemination data and days to calving in beef cattle.

    Science.gov (United States)

    Donoghue, K A; Rekaya, R; Bertrand, J K; Misztal, I

    2004-04-01

Mating and calving records for 47,533 first-calf heifers in Australian Angus herds were used to examine the relationship between days to calving (DC) and two measures of fertility in AI data: 1) calving to first insemination (CFI) and 2) calving success (CS). Calving to first insemination and calving success were defined as binary traits. A threshold-linear Bayesian model was employed for both analyses: 1) DC and CFI and 2) DC and CS. Posterior means (SD) of the additive covariance and the corresponding genetic correlation between DC and CFI were -0.62 d (0.19 d) and -0.66 (0.12), respectively. The corresponding point estimates between DC and CS were -0.70 d (0.14 d) and -0.73 (0.06), respectively. These genetic correlations indicate a strong, negative relationship between DC and both measures of fertility in AI data. Selecting for animals with genetically shorter DC intervals will lead to correlated increases in both CS and CFI. Posterior means (SD) for the additive variance, residual variance and heritability of DC in the DC-CFI analysis were 23.5 d² (4.1 d²), 363.2 d² (4.8 d²), and 0.06 (0.01), respectively. The corresponding parameter estimates for the DC-CS analysis were very similar. Posterior means (SD) for the additive, herd-year and service-sire variance and heritability for CFI were 0.04 (0.01), 0.06 (0.06), 0.14 (0.16), and 0.03 (0.01), respectively. Posterior means (SD) for the additive, herd-year, and service-sire variance and heritability for CS were 0.04 (0.01), 0.07 (0.07), 0.14 (0.16), and 0.03 (0.01), respectively. The similarity of the parameter estimates for CFI and CS suggests that either trait could be used as a measure of fertility in AI data. However, the definition of CFI allows the identification of animals that not only record a calving event but calve to their first insemination, and the value of this trait would be even greater in a more complete dataset than that used in this study.
The magnitude of the correlations between DC and CS-CFI suggest that

  10. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way of determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m⁻² (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of
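The link between applied flux density and induced retinal field that the FEM models compute can be order-of-magnitude checked with Faraday's law for a uniform sinusoidal field: on a circular loop of radius r, the peak induced electric field is E = π·f·r·B. This is a crude analytical sketch, not the paper's anatomical computation, and the loop radius and flux density below are illustrative assumptions.

```python
# Order-of-magnitude estimate of the induced electric field from a uniform
# sinusoidal magnetic flux density (Faraday's law on a circular loop).
import math

def induced_e_field(f_hz: float, r_m: float, b_tesla: float) -> float:
    """Peak induced E (V/m) on a loop of radius r in a uniform field B at f."""
    return math.pi * f_hz * r_m * b_tesla

# Illustrative values: 20 Hz stimulus, ~7 cm loop radius in the head, 10 mT.
e = induced_e_field(20, 0.07, 0.010)
print(f"E ~ {e * 1000:.1f} mV/m")
```

Multiplying such a field by a tissue conductivity gives a current density, which is the quantity the study compares against the measured phosphene thresholds.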

  11. Exercise training attenuates experimental autoimmune encephalomyelitis by peripheral immunomodulation rather than direct neuroprotection.

    Science.gov (United States)

    Einstein, Ofira; Fainstein, Nina; Touloumi, Olga; Lagoudaki, Roza; Hanya, Ester; Grigoriadis, Nikolaos; Katz, Abram; Ben-Hur, Tamir

    2018-01-01

Conflicting results exist on the effects of exercise training (ET) on experimental autoimmune encephalomyelitis (EAE), and it is not known how exercise affects disease progression. We examined whether ET ameliorates the development of EAE by modulating the systemic immune system or by exerting direct neuroprotective effects on the CNS. Healthy mice were subjected to 6 weeks of motorized treadmill running. The proteolipid protein (PLP)-induced transfer EAE model in mice was utilized. To assess the effects of ET on systemic autoimmunity, lymph-node (LN) T cells from trained vs. sedentary donor mice were transferred to naïve recipients. To assess direct neuroprotective effects of ET, PLP-reactive LN T cells were transferred into recipient mice that were trained prior to EAE transfer or into sedentary mice. EAE severity was assessed in vivo, and the characteristics of encephalitogenic LN T cells derived from PLP-immunized mice were evaluated in vitro. LN T cells obtained from trained mice induced attenuated clinical and pathological EAE in recipient mice vs. cells derived from sedentary animals. Training inhibited the activation, proliferation and cytokine gene expression of PLP-reactive T cells in response to CNS-derived autoantigen, but strongly enhanced their proliferation in response to concanavalin A, a non-specific stimulus. However, there was no difference in EAE severity when autoreactive encephalitogenic T cells were transferred to trained vs. sedentary recipient mice. ET inhibits immune-system responses to an autoantigen to attenuate EAE, rather than generally suppressing the immune system, but does not induce a direct neuroprotective effect against EAE. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Linear algebra done right

    CERN Document Server

    Axler, Sheldon

    2015-01-01

    This best-selling textbook for a second course in linear algebra is aimed at undergrad math majors and graduate students. The novel approach taken here banishes determinants to the end of the book. The text focuses on the central goal of linear algebra: understanding the structure of linear operators on finite-dimensional vector spaces. The author has taken unusual care to motivate concepts and to simplify proofs. A variety of interesting exercises in each chapter helps students understand and manipulate the objects of linear algebra. The third edition contains major improvements and revisions throughout the book. More than 300 new exercises have been added since the previous edition. Many new examples have been added to illustrate the key ideas of linear algebra. New topics covered in the book include product spaces, quotient spaces, and dual spaces. Beautiful new formatting creates pages with an unusually pleasant appearance in both print and electronic versions. No prerequisites are assumed other than the ...

  13. High-frequency (8 to 16 kHz) reference thresholds and intrasubject threshold variability relative to ototoxicity criteria using a Sennheiser HDA 200 earphone.

    Science.gov (United States)

    Frank, T

    2001-04-01

The first purpose of this study was to determine high-frequency (8 to 16 kHz) thresholds for standardizing reference equivalent threshold sound pressure levels (RETSPLs) for a Sennheiser HDA 200 earphone. The second and perhaps more important purpose of this study was to determine whether repeated high-frequency thresholds using a Sennheiser HDA 200 earphone had a lower intrasubject threshold variability than the ASHA 1994 significant threshold shift criteria for ototoxicity. High-frequency thresholds (8 to 16 kHz) were obtained for 100 (50 male, 50 female) normally hearing (0.25 to 8 kHz) young adults (mean age of 21.2 yr) in four separate test sessions using a Sennheiser HDA 200 earphone. The mean and median high-frequency thresholds were similar for each test session and increased as frequency increased. At each frequency, the high-frequency thresholds were not significantly (p > 0.05) different for gender, test ear, or test session. The median thresholds at each frequency were similar to the 1998 interim ISO RETSPLs; however, large standard deviations and wide threshold distributions indicated very high intersubject threshold variability, especially at 14 and 16 kHz. Threshold repeatability was determined by finding the threshold differences between each possible test session comparison (N = 6). About 98% of all of the threshold differences were within a clinically acceptable range of ±10 dB from 8 to 14 kHz. The threshold differences between each subject's second, third, and fourth minus their first test session were also found to determine whether intrasubject threshold variability was less than the ASHA 1994 criteria for determining a significant threshold shift due to ototoxicity. The results indicated a false-positive rate of 0% for a threshold shift ≥20 dB at any frequency and a false-positive rate of 2% for a threshold shift >10 dB at two consecutive frequencies.
This study verified that the output of high-frequency audiometers at 0 dB HL using

  14. Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.

    Science.gov (United States)

    Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq

    2017-06-01

The aim of this study was to demonstrate and explore the ability of a novel game-based perimetry test to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using the game-based visual field test 'Caspar's Castle' at four retinal locations 12.7° from fixation (N = 118). Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m², duration 200 ms, background luminance 10 cd/m²). Relationships between threshold and age were determined, along with measures of intra- and intersubject variability. The game-based visual field test was able to establish threshold estimates in the full range of children tested. Threshold size decreased with increasing age. Intrasubject and intersubject variability were inversely related to age. Normal visual field thresholds were established for specific locations in children using a novel game-based visual field test. These could be used as a foundation for developing a game-based perimetry screening test for children.
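The up/down staircase procedure named above can be sketched with a simulated observer. This is a generic 1-up/1-down staircase, not the study's exact algorithm; the threshold, step size, noise level and reversal rule are all illustrative assumptions.

```python
# Minimal 1-up/1-down staircase for estimating a stimulus-size threshold,
# driven by a simulated noisy observer. All parameters are illustrative.
import random

def run_staircase(true_threshold, start=2.0, step=0.2,
                  reversals_needed=8, seed=1):
    rng = random.Random(seed)
    size, direction = start, -1      # start large; -1 means "shrinking"
    reversals = []
    while len(reversals) < reversals_needed:
        # Simulated observer: detects the stimulus if its size exceeds the
        # true threshold plus a little response noise.
        seen = size + rng.gauss(0, 0.05) > true_threshold
        new_direction = -1 if seen else +1   # shrink if seen, grow if missed
        if new_direction != direction:
            reversals.append(size)           # record the turnaround point
            direction = new_direction
        size = max(step, size + direction * step)
    # Threshold estimate: mean of the last six reversal sizes.
    return sum(reversals[-6:]) / 6

est = run_staircase(true_threshold=1.0)
print(f"estimated threshold ~ {est:.2f}")
```

The staircase homes in from above and then oscillates around the detection boundary, so averaging the reversal points recovers the threshold without testing every stimulus size.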

  15. Thermotactile perception thresholds measurement conditions.

    Science.gov (United States)

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture were examined with the straight hand and forearm placed on a support, with the same posture without a support, and with the fingers and hand flexed at the wrist and the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 °C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated-measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change, and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the measurement posture with the straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1 °C/s temperature-change rate and a 0.5 N push force. Measurement posture, push force and rate of temperature change can affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature-change rate of 1.0 °C/s, with the straight hand and forearm laid on a support, are recommended for warm and cold threshold measurements.

  16. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

Wickham, L.E.; Kluk, A.F. (Department of Energy, Washington, DC)

    1985-01-01

The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling. It must also be low enough that waste at the threshold quantity poses a very small health risk and that multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste-stream characteristics, maximum exposed individual, population considerations, and site conditions such as rainfall). A guidance dose of between 0.001 and 1.0 mSv/y (0.1 to 100 mrem/y) was recommended, with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid site and a selected arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized, and the relationship of this effort to related developments at other agencies is discussed.

  17. Does sensory stimulation threshold affect lumbar facet radiofrequency denervation outcomes? A prospective clinical correlational study.

    Science.gov (United States)

    Cohen, Steven P; Strassels, Scott A; Kurihara, Connie; Lesnick, Ivan K; Hanling, Steven R; Griffith, Scott R; Buckenmaier, Chester C; Nguyen, Conner

    2011-11-01

Radiofrequency facet denervation is one of the most frequently performed procedures for chronic low back pain. Although sensory stimulation is generally used as a surrogate measure to denote sufficient proximity of the electrode to the nerve, no study has examined whether stimulation threshold influences outcome. We prospectively recorded data in 61 consecutive patients undergoing lumbar facet radiofrequency denervation who experienced significant pain relief after medial branch blocks. For each nerve lesioned, multiple attempts were made to maximize the sensory stimulation threshold (SST). Mean SST was calculated on the basis of the lowest stimulation perceived at 0.1-V increments for each medial branch. A positive outcome was defined as a ≥50% reduction in back pain coupled with a positive satisfaction score lasting ≥3 months. The relationship between mean SST and denervation outcomes was evaluated via a receiver operating characteristic (ROC) curve, and by stratifying outcomes on the basis of various cutoff values. No correlation was noted between mean SST and pain relief at rest (Pearson's r=-0.01, 95% confidence interval [CI]: -0.24 to 0.23, P=0.97), with activity (r=-0.17, 95% CI: -0.40 to 0.07, P=0.20), or a successful outcome. No optimal SST could be identified. There is no significant relationship between mean SST during lumbar facet radiofrequency denervation and treatment outcome, which may be due to differences in general sensory perception. Because stimulation threshold was optimized for each patient, these data cannot be interpreted to suggest that sensory testing should not be performed, or that high sensory stimulation thresholds obtained on the first attempt should be deemed acceptable.
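Confidence intervals for correlations like those quoted above (e.g. r = -0.17, 95% CI -0.40 to 0.07, n = 61) can be reproduced approximately with the standard Fisher z-transform. The sample size comes from the abstract; the choice of method is an assumption, since the paper does not state how its CIs were computed.

```python
# Approximate 95% CI for a Pearson correlation via the Fisher z-transform.
import math

def pearson_ci(r: float, n: int, z_crit: float = 1.96):
    z = math.atanh(r)                 # Fisher transform of r
    se = 1.0 / math.sqrt(n - 3)       # standard error of z
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)  # back-transform to r scale

lo, hi = pearson_ci(-0.17, 61)
print(f"r = -0.17, 95% CI = ({lo:.2f}, {hi:.2f})")
```

The interval straddles zero, which is why the abstract reports no significant relationship between stimulation threshold and outcome.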

  18. Doubler system quench detection threshold

    International Nuclear Information System (INIS)

    Kuepke, K.; Kuchnir, M.; Martin, P.

    1983-01-01

The experimental study leading to the determination of the sensitivity needed to protect the Fermilab Doubler from damage during quenches is presented. The quench voltage thresholds involved were obtained from measurements made on Doubler cable of resistance vs. temperature and voltage vs. time during quenches at several currents, and from data collected during operation of the Doubler Quench Protection System as implemented in the B-12 string of 20 magnets. At 4 kA, a quench voltage threshold in excess of 5.0 V will limit the peak Doubler cable temperature to 452 K for quenches originating in the magnet coils, whereas a threshold of 0.5 V is required for quenches originating outside of coils.

  19. Dose Response Model of Biological Reaction to Low Dose Rate Gamma Radiation

    International Nuclear Information System (INIS)

    Magae, J.; Furikawa, C.; Hoshi, Y.; Kawakami, Y.; Ogata, H.

    2004-01-01

It is necessary to use reproducible and stable indicators to evaluate biological responses to long-term irradiation at low dose rate. They should be simple and quantitative enough to produce statistically accurate results, because we have to analyze the subtle changes of biological responses around background level at low dose. For these purposes we chose micronucleus formation in U2OS, a human osteosarcoma cell line, as an indicator of biological response. Cells were exposed to gamma rays in an irradiation room bearing a 50,000 Ci 60Co source. After irradiation, they were cultured for 24 h in the presence of cytochalasin B to block cytokinesis, and cytoplasm and nucleus were stained with DAPI and propidium iodide, respectively. The number of binuclear cells bearing micronuclei was counted under a fluorescence microscope. Dose rate in the irradiation room was measured with PLD. The dose response of PLD is linear from 1 mGy to 10 Gy, and the standard deviation of triplicate counts was several percent of the mean value. We fitted dose-response curves to the data statistically, and they were plotted on linearly scaled response-versus-dose axes. The results followed a straight line passing through the origin between 0.1 and 5 Gy, and the dose and dose-rate effectiveness factor (DDREF) was less than 2 when cells were irradiated for 1-10 min. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was statistically significant at doses above 0.1 Gy when 5,000 binuclear cells were analyzed. In contrast, dose-response curves never followed LNT when cells were irradiated for 7 to 124 days. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was not statistically significant at doses below 6 Gy when cells were continuously irradiated for 124 days.
These results suggest that dose response curve of biological reaction is remarkably affected by exposure
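The emphasis above on scoring 5,000 binucleate cells per dose point reflects simple counting statistics: the standard error of an estimated micronucleus frequency shrinks with the square root of the number of cells scored. A sketch, where the 2% background frequency is an assumed illustrative value, not a figure from the study:

```python
# Standard error of a proportion (micronucleus frequency) estimated from
# n binucleate cells. The baseline frequency p is an illustrative assumption.
import math

def prop_se(p: float, n: int) -> float:
    """Standard error of a binomial proportion p estimated from n cells."""
    return math.sqrt(p * (1 - p) / n)

p = 0.02                             # assumed background micronucleus frequency
for n in (500, 5000, 50000):
    se = prop_se(p, n)
    print(f"n={n:6d}  SE={se:.4f}  ~95% half-width={1.96 * se:.4f}")
```

At n = 5,000 the 95% half-width is a few tenths of a percentage point, which sets the smallest dose-induced change in micronucleus frequency that can be distinguished from background.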

  20. The linear hypothesis: An idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

This paper attempts to present a clear idea of what the linear (no-threshold) hypothesis (LH) is, how it was corrupted, what happened to the nuclear industry as a result, and one possible solution to this major problem for the nuclear industry. The corruption lies in the change of the LH from ''a little radiation MAY produce harm'' to ''low doses of radiation WILL KILL you.'' The result has been the retardation of the nuclear industry in the United States, although the industry is one of the safest, if not the safest, of industries. The author suggests replacing the LH with two sets of standards: one having to do with human and environmental health and safety, and another (more stringent) for protection of manufactured items and premises. The safety standard could be some dose such as 5 rem/year. This would do away with the ALARA concept below the annual limit and with the collective dose at low doses. Benefits of the two-tier radiation standards system would be the alleviation of public fear of radiation and the health of the nuclear industry.