WorldWideScience

Sample records for acceptance threshold hypothesis

  1. Acceptance threshold hypothesis is supported by chemical similarity of cuticular hydrocarbons in a stingless bee, Melipona asilvai.

    Science.gov (United States)

    Nascimento, D L; Nascimento, F S

    2012-11-01

The ability to discriminate nestmates from non-nestmates in insect societies is essential to protect colonies from conspecific invaders. The acceptance threshold hypothesis predicts that organisms whose recognition systems cannot classify recipients without errors should optimize the balance between acceptance and rejection. In this process, cuticular hydrocarbons play an important role as recognition cues in social insects. The aims of this study were to determine whether guards maintain a restrictive rejection level towards chemically distinct individuals and whether they become more permissive during encounters with nestmate or non-nestmate individuals bearing chemically similar profiles. The study demonstrates that Melipona asilvai (Hymenoptera: Apidae: Meliponini) guards exhibit a flexible system of nestmate recognition that depends on the degree of chemical similarity between the incoming forager's cuticular hydrocarbon profile and their own. Guards became less restrictive in their acceptance rates when they encountered non-nestmates with highly similar chemical profiles, which they probably mistook for nestmates, hence broadening their acceptance level.

  2. Energy Threshold Hypothesis for Household Consumption

    International Nuclear Information System (INIS)

    Ortiz, Samira; Castro-Sitiriche, Marcel; Amador, Isamar

    2017-01-01

Many studies find a strong positive relationship between quality of life and electricity consumption in impoverished countries. However, previous work has shown that this positive relationship does not hold beyond a certain electricity consumption threshold. Consequently, there is a need to explore whether communities can live at a sustainable level of energy consumption without sacrificing their quality of life. The Gallup-Healthways Report measures the wellbeing of citizens globally. This paper provides a new outlook, using these elements to explore the relationship between the actual percentage of the population thriving in most countries and their energy consumption. A measure of efficiency is computed to determine an adjusted relative social value of energy, considering the variability in happy life years as a function of electric power consumption. The adjustment is performed so that no single component dominates the measurement. It is interesting to note that the countries with the highest relative social value of energy are among the top 10 countries of the Gallup report.

  3. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.; Carr, R.

    1995-01-01

We introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but we find that the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long free-electron lasers (FELs). Our application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but significantly faster. We present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules.
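
    The threshold accepting rule described above (Dueck and Scheuer's variant of simulated annealing) can be sketched in a few lines: any swap whose cost increase stays below a decaying threshold is accepted deterministically, with no Metropolis coin-flip. The sketch below is illustrative only; the toy `field_cost` (a running-sum penalty on per-magnet strength errors) merely stands in for the authors' actual field-error model, and the schedule parameters are invented.

```python
import random

def threshold_accepting(start, cost, sweeps=2000, t0=0.5, decay=0.995, seed=1):
    """Minimise cost(order) by random pairwise swaps; unlike simulated
    annealing, any move whose cost increase is below the current threshold
    is accepted deterministically, and the threshold decays each sweep."""
    rng = random.Random(seed)
    current = list(start)
    c_cur = cost(current)
    best, c_best = list(current), c_cur
    threshold = t0
    for _ in range(sweeps):
        i, j = rng.sample(range(len(current)), 2)
        cand = list(current)
        cand[i], cand[j] = cand[j], cand[i]
        c_new = cost(cand)
        if c_new - c_cur < threshold:   # deterministic acceptance rule
            current, c_cur = cand, c_new
            if c_cur < c_best:
                best, c_best = list(current), c_cur
        threshold *= decay              # "temperature" schedule
    return best, c_best

# Toy stand-in for the field-error cost: penalise the running sum of
# per-magnet strength errors (a proxy for accumulated trajectory error).
errors = [0.30, -0.12, 0.25, -0.31, 0.05, -0.22, 0.18, -0.09]
def field_cost(order):
    s = total = 0.0
    for e in order:
        s += e
        total += s * s
    return total

sorted_order, final_cost = threshold_accepting(errors, field_cost)
```

    Because the acceptance test is a simple comparison rather than an exponential draw, each sweep is cheaper than in simulated annealing, which is consistent with the faster convergence the abstract reports.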

  4. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.

    1994-08-01

The authors introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but they find that the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long free-electron lasers (FELs). Their application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but significantly faster. They present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules.

  5. Perceptibility and acceptability thresholds for colour differences in dentistry

    NARCIS (Netherlands)

    Khashayar, G.; Bain, P.A.; Salari, S.; Dozic, A.; Kleverlaan, C.J.; Feilzer, A.J.

    2014-01-01

Introduction: Data on acceptability thresholds (AT) and perceptibility thresholds (PT) for colour differences vary in the dental literature. There is consensus that the determination of ΔE* is appropriate for defining AT and PT; however, there is no consensus regarding the values that should be used. The aim of this

  6. The Threshold Hypothesis Applied to Spatial Skill and Mathematics

    Science.gov (United States)

    Freer, Daniel

    2017-01-01

    This cross-sectional study assessed the relation between spatial skills and mathematics in 854 participants across kindergarten, third grade, and sixth grade. Specifically, the study probed for a threshold for spatial skills when performing mathematics, above which spatial scores and mathematics scores would be significantly less related. This…

  7. Raison d’être of insulin resistance: the adjustable threshold hypothesis

    OpenAIRE

    Wang, Guanyu

    2014-01-01

The epidemics of obesity and diabetes demand a deeper understanding of insulin resistance, for which the adjustable threshold hypothesis is formulated in this paper. To test the hypothesis, mathematical modelling was used to analyse clinical data and to simulate biological processes at both the molecular and organismal levels. I found that insulin resistance is rooted in the thresholds of the cell's bistable response. By assuming heterogeneity of the thresholds, single cells' all-or-none response can col...

  8. Confirmation of Maslow's Hypothesis of Synergy: Developing an Acceptance of Selfishness at the Workplace Scale.

    Science.gov (United States)

    Takaki, Jiro; Taniguchi, Toshiyo; Fujii, Yasuhito

    2016-04-30

This study aimed to develop a new Acceptance of Selfishness at the Workplace Scale (ASWS) and to confirm Maslow's hypothesis of synergy: if both a sense of contribution and acceptance of selfishness at the workplace are high, workers are psychologically healthy. In a cross-sectional study with employees of three Japanese companies, 656 workers answered a self-administered paper questionnaire completely (response rate = 66.8%). Each questionnaire was submitted to us in a sealed envelope and analyzed. The ASWS indicated high internal consistency (Cronbach's alpha = 0.86). Significant (p < 0.001) positive moderate correlations between ASWS scores and job control scores supported the ASWS's convergent and discriminant validity, and significant (p < 0.001) associations of ASWS scores with psychological distress and work engagement supported its criterion validity. Significant (p < 0.05) interactions between a sense of contribution and acceptance of selfishness at the workplace showed that when both factors are low, psychological distress is high, and when both are high, work engagement is also high. Thus, Maslow's hypothesis of synergy was confirmed.

  9. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standard ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standard ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, it suggests using the hypergeometric distribution to calculate the parameters of sampling plans, avoiding unnecessary approximations such as the binomial or Poisson distributions. We show that, under usual conditions, the discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, we elaborate the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing in the NP tradition can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be preoccupied with a value of AQL, as should the producer with LTPD. Furthermore, we can also question why type I error is always uniquely associated with the producer as producer risk, and likewise why consumer risk is necessarily associated with type II error. The resolution of these questions is new to the literature. The
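
    The exact-versus-approximate comparison the abstract describes is easy to reproduce. A minimal sketch (the function names and the small-lot plan N=20, n=10, c=1 are illustrative choices, not the paper's examples): the probability of accepting a lot is the chance of seeing at most c defectives in the sample, computed exactly with the hypergeometric distribution (sampling without replacement) and approximately with the binomial.

```python
from math import comb

def p_accept_hypergeom(N, D, n, c):
    """Exact acceptance probability: P(at most c defectives in a sample of
    n drawn WITHOUT replacement from a lot of N containing D defectives).
    math.comb returns 0 when k > n, so out-of-range terms vanish."""
    return sum(comb(D, k) * comb(N - D, n - k) for k in range(c + 1)) / comb(N, n)

def p_accept_binom(N, D, n, c):
    """Binomial approximation (sampling WITH replacement), p = D / N."""
    p = D / N
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Small lot, large sampling fraction: the discrepancy the paper warns
# about is clearly visible (illustrative plan, not from the paper).
exact = p_accept_hypergeom(N=20, D=4, n=10, c=1)
approx = p_accept_binom(N=20, D=4, n=10, c=1)
```

    When the sample is a large fraction of the lot, sampling without replacement depletes the defectives, so the exact and approximate acceptance probabilities diverge; for large lots and small sampling fractions the two converge.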

  10. Beyond the fragmentation threshold hypothesis: regime shifts in biodiversity across fragmented landscapes.

    Directory of Open Access Journals (Sweden)

    Renata Pardini

Ecological systems are vulnerable to irreversible change when key system properties are pushed over thresholds, resulting in the loss of resilience and the precipitation of a regime shift. Perhaps the most important of such properties in human-modified landscapes is the total amount of remnant native vegetation. In a seminal study, Andrén proposed the existence of a fragmentation threshold in the total amount of remnant vegetation, below which landscape-scale connectivity is eroded and local species richness and abundance become dependent on patch size. Although species patch-area effects have been a mainstay of conservation science, there has yet to be a robust empirical evaluation of this hypothesis. Here we present and test a new conceptual model describing the mechanisms and consequences of biodiversity change in fragmented landscapes, identifying the fragmentation threshold as the first step in a positive feedback mechanism that has the capacity to impair ecological resilience and drive a regime shift in biodiversity. The model considers that local extinction risk is defined by patch size, that immigration rates are defined by landscape vegetation cover, and that recovery from local species losses depends upon the landscape species pool. Using a unique dataset on the distribution of non-volant small mammals across replicate landscapes in the Atlantic forest of Brazil, we found strong evidence for our model predictions: patch-area effects are evident only at intermediate levels of total forest cover, where landscape diversity is still high and opportunities for enhancing biodiversity through local management are greatest. Furthermore, high levels of forest loss can push the native biota through an extinction filter and result in the abrupt, landscape-wide loss of forest-specialist taxa, ecological resilience and management effectiveness. The proposed model links hitherto distinct theoretical approaches within a single framework.

  11. The threshold hypothesis: solving the equation of nurture vs nature in type 1 diabetes.

    Science.gov (United States)

    Wasserfall, C; Nead, K; Mathews, C; Atkinson, M A

    2011-09-01

    For more than 40 years, the contributions of nurture (i.e. the environment) and nature (i.e. genetics) have been touted for their aetiological importance in type 1 diabetes. Disappointingly, knowledge gains in these areas, while individually successful, have to a large extent occurred in isolation from each other. One reason underlying this divide is the lack of a testable model that simultaneously considers the contributions of genetic and environmental determinants in the formation of this and potentially other disorders that are subject to these variables. To address this void, we have designed a model based on the hypothesis that the aetiological influences of genetics and environment, when evaluated as intersecting and reciprocal trend lines based on odds ratios, result in a method of concurrently evaluating both facets and defining the attributable risk of clinical onset of type 1 diabetes. The model, which we have elected to term the 'threshold hypothesis', also provides a novel means of conceptualising the complex interactions of nurture with nature in type 1 diabetes across various geographical populations.

  12. The relationship between intelligence and creativity: New support for the threshold hypothesis by means of empirical breakpoint detection

    Science.gov (United States)

    Jauk, Emanuel; Benedek, Mathias; Dunst, Beate; Neubauer, Aljoscha C.

    2013-01-01

    The relationship between intelligence and creativity has been subject to empirical research for decades. Nevertheless, there is yet no consensus on how these constructs are related. One of the most prominent notions concerning the interplay between intelligence and creativity is the threshold hypothesis, which assumes that above-average intelligence represents a necessary condition for high-level creativity. While earlier research mostly supported the threshold hypothesis, it has come under fire in recent investigations. The threshold hypothesis is commonly investigated by splitting a sample at a given threshold (e.g., at 120 IQ points) and estimating separate correlations for lower and upper IQ ranges. However, there is no compelling reason why the threshold should be fixed at an IQ of 120, and to date, no attempts have been made to detect the threshold empirically. Therefore, this study examined the relationship between intelligence and different indicators of creative potential and of creative achievement by means of segmented regression analysis in a sample of 297 participants. Segmented regression allows for the detection of a threshold in continuous data by means of iterative computational algorithms. We found thresholds only for measures of creative potential but not for creative achievement. For the former the thresholds varied as a function of criteria: When investigating a liberal criterion of ideational originality (i.e., two original ideas), a threshold was detected at around 100 IQ points. In contrast, a threshold of 120 IQ points emerged when the criterion was more demanding (i.e., many original ideas). Moreover, an IQ of around 85 IQ points was found to form the threshold for a purely quantitative measure of creative potential (i.e., ideational fluency). These results confirm the threshold hypothesis for qualitative indicators of creative potential and may explain some of the observed discrepancies in previous research. In addition, we obtained
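
    The breakpoint-detection idea described above can be sketched as a grid search over candidate thresholds, fitting a hinge (piecewise-linear) model at each and keeping the breakpoint with the lowest squared error. The data below are synthetic, and the grid search is an assumed stand-in for the iterative segmented-regression algorithms the study actually used:

```python
import numpy as np

def fit_segmented(x, y, candidates):
    """Single-breakpoint model y ~ b0 + b1*x + b2*max(0, x - bp), fitted by
    least squares at each candidate breakpoint bp; returns the bp with the
    smallest sum of squared errors, that SSE, and the coefficients."""
    best_bp, best_sse, best_beta = None, np.inf, None
    for bp in candidates:
        X = np.column_stack([np.ones_like(x), x, np.maximum(0.0, x - bp)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = float(np.sum((y - X @ beta) ** 2))
        if sse < best_sse:
            best_bp, best_sse, best_beta = bp, sse, beta
    return best_bp, best_sse, best_beta

# Synthetic illustration: a creativity-like score rises with IQ up to a
# built-in break at 120, then flattens (slope change 0.6 -> 0.1).
rng = np.random.default_rng(0)
iq = np.linspace(80, 150, 200)
score = 0.6 * iq - 0.5 * np.maximum(0.0, iq - 120) + rng.normal(0, 0.1, iq.size)
bp, sse, beta = fit_segmented(iq, score, range(100, 141, 5))
```

    With a clear slope change the recovered breakpoint lands on the true value; with noisier data (as in real creativity scores), confidence intervals around the estimated breakpoint become essential.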

  13. Rationality, practice variation and person-centred health policy: a threshold hypothesis.

    Science.gov (United States)

    Djulbegovic, Benjamin; Hamm, Robert M; Mayrhofer, Thomas; Hozo, Iztok; Van den Ende, Jef

    2015-12-01

Variation in the practice of medicine is one of the major health policy issues of today. Ultimately, it is related to physicians' decision making. Similar patients with a similar likelihood of having disease are often managed differently by different doctors: some doctors may elect to observe the patient, others decide to act based on diagnostic testing, and yet others may elect to treat without testing. We explain these differences in practice by differences in the disease probability thresholds at which physicians decide to act: contextual social and clinical factors, and emotions such as regret, affect the threshold by influencing the way doctors integrate objective data related to treatment and testing. However, depending on the theoretical construct, each physician's behaviour can be considered rational. In fact, we showed that current regulatory policies lead to predictably low thresholds for most decisions in contemporary practice. As a result, we may expect continuing motivation for the overuse of treatments and diagnostic tests. We argue that rationality should take into account both formal principles of rationality and human intuitions about good decisions, along the lines of Rawls' 'reflective equilibrium/considered judgment'. In turn, this can help define a threshold model that is empirically testable. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
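
    The threshold model this line of work builds on is commonly attributed to Pauker and Kassirer: acting (treating or testing) is rational once the probability of disease exceeds harm/(harm + benefit) of the action. A worked sketch, with invented harm/benefit numbers purely for illustration:

```python
def action_threshold(harm, benefit):
    """Disease probability above which acting beats observing, given the
    expected harm of acting on the healthy and the expected benefit of
    acting on the diseased (same utility units)."""
    return harm / (harm + benefit)

# When perceived harm is small relative to benefit, the threshold is low,
# so action is rational even at low disease probabilities; this is the
# mechanism behind the abstract's point that predictably low thresholds
# encourage overuse of treatment and testing.
low_harm = action_threshold(harm=1, benefit=9)    # -> 0.1
high_harm = action_threshold(harm=5, benefit=5)   # -> 0.5
```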

  14. Automated backbone assignment of labeled proteins using the threshold accepting algorithm

    International Nuclear Information System (INIS)

    Leutner, Michael; Gschwind, Ruth M.; Liermann, Jens; Schwarz, Christian; Gemmecker, Gerd; Kessler, Horst

    1998-01-01

The sequential assignment of backbone resonances is the first step in the structure determination of proteins by heteronuclear NMR. For larger proteins, an assignment strategy based on proton side-chain information is no longer suitable for use in an automated procedure. Our program PASTA (Protein ASsignment by Threshold Accepting) is therefore designed to partially or fully automate the sequential assignment of proteins, based on the analysis of NMR backbone resonances plus Cβ information. In order to overcome the problems caused by peak overlap and missing signals in an automated assignment process, PASTA uses threshold accepting, a combinatorial optimization strategy that is superior to simulated annealing due to generally faster convergence and better solutions. The reliability of this algorithm is shown by reproducing the complete sequential backbone assignment of several proteins from published NMR data. The robustness of the algorithm against misassigned signals, noise, spectral overlap and missing peaks is shown by repeating the assignment with reduced sequential information and increased chemical shift tolerances. The performance of the program on real data is finally demonstrated with automatically picked peak lists of human non-pancreatic synovial phospholipase A2, a protein with 124 residues.

  15. Rethinking avian response to Tamarix on the lower Colorado River: A threshold hypothesis

    Science.gov (United States)

    van Riper, Charles; Paxton, K.L.; O'brien, C.; Shafroth, P.B.; McGrath, L.J.

    2008-01-01

Many of the world's large river systems have been greatly altered in the past century due to river regulation, agriculture, and invasion of introduced Tamarix spp. (saltcedar, tamarisk). These riverine ecosystems are known to provide important habitat for avian communities, but information on the responses of birds to differing levels of Tamarix has been lacking. Past research on birds along the Colorado River has shown that avian abundance in general is greater in native than in non-native habitat. In this article, we address habitat restoration on the lower Colorado River by comparing the abundance and diversity of avian communities across a matrix of different amounts of native and non-native habitat at National Wildlife Refuges in Arizona. Two major patterns emerged from this study: (1) Not all bird species responded to Tamarix in a similar fashion, and for many bird species abundance was highest at intermediate Tamarix levels (40-60%), suggesting a response threshold. (2) In Tamarix-dominated habitats, the greatest increase in bird abundance occurred when small amounts of native vegetation were present as a component of that habitat. In fact, Tamarix was the best vegetation predictor of avian abundance when compared to vegetation density and canopy cover. Our results suggest that one cost-effective way to rehabilitate large monoculture Tamarix stands for avian abundance and diversity would be to add relatively low levels of native vegetation (~20-40%) within homogeneous Tamarix habitat. This could be much more cost-effective and feasible than attempting to replace all Tamarix with native vegetation. © 2008 Society for Ecological Restoration International.

  16. Molecular biology, epidemiology, and the demise of the linear no-threshold hypothesis

    International Nuclear Information System (INIS)

    Pollycove, M.

    1998-01-01

The LNT hypothesis is the basic principle of all radiation protection policy. This theory assumes that all radiation doses, even those close to zero, are harmful in linear proportion to dose, and that all doses produce a proportionate number of harmful mutations, i.e., mis- or unrepaired DNA alterations. The LNT theory is used to generate collective-dose calculations of the number of deaths produced by minute fractions of background radiation. Current molecular biology reveals an enormous amount of relentless metabolic oxidative free-radical damage, with mis- or unrepaired alterations of DNA. The corresponding mis- or unrepaired DNA alterations produced by background radiation are negligible. These DNA alterations are effectively disposed of by the DNA damage-control biosystem of antioxidant prevention, enzymatic repair, and mutation removal. High-dose radiation injures this biosystem, with associated risk increments of mortality and cancer mortality. Low-dose radiation stimulates DNA damage-control, with associated epidemiologic observations of risk decrements of mortality and cancer mortality, i.e., hormesis. How can this 40-year-old LNT paradigm continue to be the operative principle of radiation protection policy despite the contradictory scientific observations of both molecular biology and epidemiology, and the lack of any supportive human data? The increase of public fear through repeated statements of deaths caused by 'deadly' radiation has engendered an enormous increase in the expenditures now required to 'protect' the public from all applications of nuclear technology: medical, research, energy, disposal, and cleanup remediation. Government funds are allocated to appointed committees, to the research they support, and to multiple environmental and regulatory agencies. The LNT theory and multibillion-dollar radiation activities have now become a symbiotic, self-sustaining, powerful political and economic force. (author)

  17. Air Traffic Controller Acceptability of Unmanned Aircraft System Detect-and-Avoid Thresholds

    Science.gov (United States)

    Mueller, Eric R.; Isaacson, Douglas R.; Stevens, Derek

    2016-01-01

    A human-in-the-loop experiment was conducted with 15 retired air traffic controllers to investigate two research questions: (a) what procedures are appropriate for the use of unmanned aircraft system (UAS) detect-and-avoid systems, and (b) how long in advance of a predicted close encounter should pilots request or execute a separation maneuver. The controller participants managed a busy Oakland air route traffic control sector with mixed commercial/general aviation and manned/UAS traffic, providing separation services, miles-in-trail restrictions and issuing traffic advisories. Controllers filled out post-scenario and post-simulation questionnaires, and metrics were collected on the acceptability of procedural options and temporal thresholds. The states of aircraft were also recorded when controllers issued traffic advisories. Subjective feedback indicated a strong preference for pilots to request maneuvers to remain well clear from intruder aircraft rather than deviate from their IFR clearance. Controllers also reported that maneuvering at 120 seconds until closest point of approach (CPA) was too early; maneuvers executed with less than 90 seconds until CPA were more acceptable. The magnitudes of the requested maneuvers were frequently judged to be too large, indicating a possible discrepancy between the quantitative UAS well clear standard and the one employed subjectively by manned pilots. The ranges between pairs of aircraft and the times to CPA at which traffic advisories were issued were used to construct empirical probability distributions of those metrics. Given these distributions, we propose that UAS pilots wait until an intruder aircraft is approximately 80 seconds to CPA or 6 nmi away before requesting a maneuver, and maneuver immediately if the intruder is within 60 seconds and 4 nmi. These thresholds should make the use of UAS detect and avoid systems compatible with current airspace procedures and controller expectations.
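
    The proposed thresholds read naturally as a layered decision rule. The sketch below is only a paraphrase of the numbers in the abstract (80 s / 6 nmi to request, 60 s / 4 nmi to maneuver immediately); the function name and return labels are invented, and this is in no way certified detect-and-avoid logic:

```python
def daa_advice(time_to_cpa_s, range_nmi):
    """Decision rule paraphrasing the study's proposed thresholds:
    maneuver immediately inside 60 s to CPA AND 4 nmi; otherwise request
    a maneuver once within ~80 s to CPA OR ~6 nmi; else keep monitoring."""
    if time_to_cpa_s <= 60 and range_nmi <= 4:
        return "maneuver immediately"
    if time_to_cpa_s <= 80 or range_nmi <= 6:
        return "request maneuver"
    return "monitor"
```

    Note the asymmetry: the immediate-maneuver condition requires both thresholds to be crossed, while the request condition fires on either, matching the abstract's "80 seconds to CPA or 6 nmi" versus "within 60 seconds and 4 nmi".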

  18. Panel discussion on health effects of low-dose ionizing radiation. Scientific findings and non-threshold hypothesis

    International Nuclear Information System (INIS)

    1995-06-01

This is a record of a panel discussion held during the IAEA Interregional Training Course. In current radiation work, protection measures are taken on the assumption that any amount of radiation, however small, entails a risk of deleterious effects. This so-called non-threshold assumption of radiation effects creates public distrust of radiation use. However, because the health effects of low-dose ionizing radiation are difficult to verify, views ranging from the non-threshold hypothesis to one which sees small amounts of radiation as rather useful and necessary have been presented. In this panel discussion, participants considered how the health effects of low-dose ionizing radiation should be viewed from the standpoint of radiation protection. Panelists included such eminent scientists as Dr. Sugahara and Dr. Okada, who are deeply interested in this field and play leading roles in radiobiology research in Japan, and Dr. Stather, Deputy Director of the NRPB, UK, who, in UNSCEAR and ICRP, is actively participating in the international review of radiation effects and the preparation of radiation protection recommendations. They agreed that although it is reasonable, under current scientific understanding, to follow the recommendations of the ICRP, research in this area should be strongly promoted hereafter in order to base radiation protection on firm scientific grounds. Many participants actively asked about and discussed problems in their own fields. (author)

  19. Confirmation of Maslow’s Hypothesis of Synergy: Developing an Acceptance of Selfishness at the Workplace Scale

    Directory of Open Access Journals (Sweden)

    Jiro Takaki

    2016-04-01

This study aimed to develop a new Acceptance of Selfishness at the Workplace Scale (ASWS) and to confirm Maslow's hypothesis of synergy: if both a sense of contribution and acceptance of selfishness at the workplace are high, workers are psychologically healthy. In a cross-sectional study with employees of three Japanese companies, 656 workers answered a self-administered questionnaire on paper completely (response rate = 66.8%). Each questionnaire was submitted to us in a sealed envelope and analyzed. The ASWS indicated high internal consistency (Cronbach's alpha = 0.86). Significant (p < 0.001) positive moderate correlations between ASWS scores and job control scores support the ASWS's convergent and discriminant validity. Significant (p < 0.001) associations of ASWS scores with psychological distress and work engagement supported the ASWS's criterion validity. In short, the ASWS was a psychometrically satisfactory measure. Significant (p < 0.05) interactions between a sense of contribution and acceptance of selfishness at the workplace in linear regression models showed that when those two factors are low, psychological distress becomes high. However, when a sense of contribution and acceptance of selfishness are high, work engagement also becomes high. Thus, Maslow's hypothesis of synergy was confirmed.

  20. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

A technique is presented for calculating Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage, both to limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
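
    The scheme resembles what is now usually called the ε-constraint method: minimise one objective while capping the others at an acceptability threshold, then sweep the threshold. A discrete toy sketch (the candidate points and function name are invented; the paper's continuous formulation with convergence guarantees is more elaborate):

```python
def pareto_front_via_thresholds(points, thresholds):
    """For each acceptability threshold eps on objective f2, solve a
    single-objective problem: minimise f1 over points with f2 <= eps.
    Sweeping eps traces out candidate Pareto-optimal solutions.
    Each point is an (f1, f2) pair, both objectives minimised."""
    front = []
    for eps in thresholds:
        feasible = [p for p in points if p[1] <= eps]
        if feasible:
            winner = min(feasible, key=lambda p: p[0])
            if winner not in front:
                front.append(winner)
    return front

candidates = [(1, 5), (2, 3), (3, 2), (4, 4), (5, 1)]
front = pareto_front_via_thresholds(candidates, thresholds=[1, 2, 3, 4, 5])
```

    Note that constraining only one objective can in general admit weakly Pareto-optimal points; thresholding all objectives at each stage, as the paper does, is what provides the stated mathematical guarantee.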

  1. Threshold Assessment: Definition of Acceptable Sites as Part of Site Selection for the Japanese HLW Program

    International Nuclear Information System (INIS)

    McKenna, S.A.; Wakasugi, Keiichiro; Webb, E.K.; Makino, Hitoshi; Ishihara, Yoshinao; Ijiri, Yuji; Sawada, Atsushi; Baba, Tomoko; Ishiguro, Katsuhiko; Umeki, Hiroyuki

    2000-01-01

For the last ten years, the Japanese High-Level Nuclear Waste (HLW) repository program has focused on assessing the feasibility of a basic repository concept, an effort that resulted in the recently published H12 Report. As Japan enters the implementation phase, a new organization must identify, screen and choose potential repository sites. Thus, a rapid mechanism for determining the likelihood of site suitability is critical. The threshold approach described here is a simple mechanism for estimating the likelihood that a site is suitable, given estimates of several critical parameters. We rely on the results of a companion paper, which described a probabilistic performance assessment simulation of the HLW reference case in the H12 Report. The two or three most critical input parameters are plotted against each other and treated as spatial variables. Geostatistics is used to interpret the spatial correlation, which in turn is used to simulate multiple realizations of the parameter-value maps. By combining an array of realizations, we can examine the probability that a given site, as represented by estimates of this combination of parameters, would be a good host for a repository.

  2. A hypothesis linking sub-optimal seawater pCO2 conditions for cnidarian-Symbiodinium symbioses with the exceedence of the interglacial threshold (>260 ppmv)

    Directory of Open Access Journals (Sweden)

    S. A. Wooldridge

    2012-05-01

Most scleractinian corals and many other cnidarians host intracellular photosynthetic dinoflagellate symbionts ("zooxanthellae"). The zooxanthellae contribute to host metabolism and skeletogenesis to such an extent that this symbiosis is well recognised for its contribution in creating the coral reef ecosystem. The stable functioning of cnidarian symbioses is however dependent upon the host's ability to maintain demographic control of its algal partner. In this review, I explain how the modern envelope of seawater conditions found within many coral reef ecosystems (characterised by elevated temperatures, rising pCO2, and enriched nutrient levels) is antagonistic toward the dominant host processes that restrict excessive symbiont proliferation. Moreover, I outline a new hypothesis and initial evidence base, which support the suggestion that the additional "excess" zooxanthellae fraction permitted by seawater pCO2 levels beyond 260 ppmv significantly increases the propensity for symbiosis breakdown ("bleaching") in response to temperature and irradiance extremes. The relevance of this biological threshold is discussed in terms of historical reef extinction events, glacial-interglacial climate cycles and the modern decline of coral reef ecosystems.

  3. A hypothesis linking sub-optimal seawater pCO2 conditions for cnidarian-Symbiodinium symbioses with the exceedence of the interglacial threshold (>260 ppmv)

    Science.gov (United States)

    Wooldridge, S. A.

    2012-05-01

Most scleractinian corals and many other cnidarians host intracellular photosynthetic dinoflagellate symbionts ("zooxanthellae"). The zooxanthellae contribute to host metabolism and skeletogenesis to such an extent that this symbiosis is well recognised for its contribution in creating the coral reef ecosystem. The stable functioning of cnidarian symbioses is however dependent upon the host's ability to maintain demographic control of its algal partner. In this review, I explain how the modern envelope of seawater conditions found within many coral reef ecosystems (characterised by elevated temperatures, rising pCO2, and enriched nutrient levels) is antagonistic toward the dominant host processes that restrict excessive symbiont proliferation. Moreover, I outline a new hypothesis and initial evidence base, which support the suggestion that the additional "excess" zooxanthellae fraction permitted by seawater pCO2 levels beyond 260 ppmv significantly increases the propensity for symbiosis breakdown ("bleaching") in response to temperature and irradiance extremes. The relevance of this biological threshold is discussed in terms of historical reef extinction events, glacial-interglacial climate cycles and the modern decline of coral reef ecosystems.

  4. Social psychological approach to the problem of threshold

    International Nuclear Information System (INIS)

    Nakayachi, Kazuya

    1999-01-01

    This paper discusses the threshold of carcinogen risk from the viewpoint of social psychology. First, the results of a survey suggesting that renunciation of the Linear No-Threshold (LNT) hypothesis would have no influence on the public acceptance (PA) of nuclear power plants are reported. Second, the relationship between the adoption of the LNT hypothesis and the standardization of management for various risks are discussed. (author)

  5. Honeybee (Apis cerana) foraging responses to the toxic honey of Tripterygium hypoglaucum (Celastraceae): changing threshold of nectar acceptability.

    Science.gov (United States)

    Tan, K; Guo, Y H; Nicolson, S W; Radloff, S E; Song, Q S; Hepburn, H R

    2007-12-01

To investigate honeybee foraging responses to toxic nectar, honey was collected from Apis cerana colonies in the Yaoan county of Yunnan Province, China, during June, when flowers of Tripterygium hypoglaucum were the main nectar source available. Pollen analysis confirmed the origin of the honey, and high-performance liquid chromatography showed the prominent component triptolide to be present at a concentration of 0.61 μg/g ± 0.11 (SD). In cage tests that used young adult worker bees, significantly more of those provided with a diet of T. hypoglaucum honey mixed with sugar powder (1:1) died within 6 d (68.3%) compared to control groups provided with normal honey mixed with sugar powder (15.8%). Honeybees were trained to visit feeders that contained honey of T. hypoglaucum (toxic honey) as the test group and honey of Vicia sativa or Elsholtzia ciliata as control groups (all honeys diluted 1:3 with water). Bees preferred the feeders with normal honey to those with toxic honey, as shown by significantly higher visiting frequencies and longer imbibition times. However, when the feeder of normal honey was removed, leaving only honey of T. hypoglaucum, the foraging bees returned to the toxic honey after a few seconds of hesitation, and both visiting frequency and imbibition time increased to values previously recorded for normal honey. Toxic honey thus became acceptable to the bees in the absence of other nectar sources.

  6. Validity of the linear no-threshold (LNT) hypothesis in setting radiation protection regulations for the inhabitants in high level natural radiation areas of Ramsar, Iran

    International Nuclear Information System (INIS)

    Mortazavi, S.M.J.; Atefi, M.; Razi, Z.; Mortazavi Gh

    2010-01-01

Some areas in Ramsar, a city in northern Iran, have long been known as inhabited areas with the highest levels of natural radiation. Although the health effects of high doses of ionizing radiation are well documented, the biological effects of radiation levels above natural background are still controversial, and the validity of the LNT hypothesis in this area has been criticized by many investigators around the world. The study of the health effects of high levels of natural radiation in areas such as Ramsar helps scientists investigate biological effects without the need to extrapolate observations either from high doses of radiation to the low dose region or from laboratory animals to humans. Considering the importance of these studies, the National Radiation Protection Department (NRPD) of the Iranian Nuclear Regulatory Authority has started an integrative research project on the health effects of long-term exposure to high levels of natural radiation. This paper reviews findings of the studies conducted on plants and humans living in, and laboratory animals kept in, the high level natural radiation areas of Ramsar. In the human studies, different end points such as DNA damage, chromosome aberrations, blood cell counts and immunological alterations are discussed. This review comes to the conclusion that no reproducible detrimental health effect has been reported so far. The validity of the LNT hypothesis in assessing the health effects of high levels of natural radiation is then discussed. (author)

  7. On setting NRC alarm thresholds for inventory differences and process unit loss estimators: Clarifying their statistical basis with hypothesis testing methods and error propagation models from Jaech, Bowen and Bennett and IAEA

    International Nuclear Information System (INIS)

    Ong, L.

    1995-01-01

Major fuel cycle facilities in the US private sector are required to respond, at predetermined alarm levels, to various special nuclear material loss estimators in the material control and accounting (MC and A) area. This paper presents US Nuclear Regulatory Commission (NRC) policy, along with the underlying statistical rationale, for establishing and inspecting the application of thresholds to detect excessive inventory differences (ID). Accordingly, escalating responsive action must be taken to satisfy NRC's MC and A regulations for low-enriched uranium (LEU) fuel conversion/fabrication plants and LEU enrichment facilities. The establishment of appropriate ID detection thresholds depends on a site-specific goal quantity, a specified probability of detection, and the standard error of the ID. Regulatory guidelines for ID significance tests and process control tests conducted by licensees with highly enriched uranium are similarly rationalized in terms of definitive hypothesis testing, including null and alternative hypotheses; statistical errors of the first, second, third, and fourth kinds; and suitable test statistics, uncertainty estimates, prevailing assumptions, and critical values for comparisons. Conceptual approaches are described in the context of significance test considerations and measurement error models, including the treatment of so-called "systematic error variance" effects as observations of random variables in the statistical sense.

  8. Comparison of Threshold Saccadic Vector Optokinetic Perimetry (SVOP) and Standard Automated Perimetry (SAP) in Glaucoma. Part II: Patterns of Visual Field Loss and Acceptability.

    Science.gov (United States)

    McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A

    2017-09-01

    We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.
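Sensitivity and specificity figures like those reported above (97.7% and 77.9% for SVOP v2) come directly from a 2×2 classification table of test result versus true status. A minimal sketch, using hypothetical counts for illustration (not the study's actual table):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 86 of 88 abnormal fields flagged abnormal,
# 18 of 23 normal fields passed as normal.
sens, spec = sensitivity_specificity(tp=86, fn=2, tn=18, fp=5)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")
```

With these counts, sensitivity is 86/88 and specificity is 18/23; the published figures imply a similar table for the actual cohort.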

  9. P value and the theory of hypothesis testing: an explanation for new researchers.

    Science.gov (United States)

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect at least as extreme as the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining a difference at least as extreme as the one observed given that the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
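The two frameworks described in this abstract can be combined in a minimal sketch: a two-sided one-sample z-test (a textbook simplification assuming a known standard deviation) that reports Fisher's p value alongside the Neyman-Pearson accept/reject decision at a chosen Type I error level α. The sample values are hypothetical.

```python
import math

def one_sample_z_test(xbar, mu0, sigma, n):
    """Two-sided one-sample z-test with known sigma.
    Returns the z statistic and its p value under H0: mean == mu0."""
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    # p value: probability under H0 of a statistic at least this extreme,
    # from the standard normal CDF (Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

z, p = one_sample_z_test(xbar=10.4, mu0=10.0, sigma=1.0, n=25)
alpha = 0.05  # chosen Type I error rate (Neyman-Pearson)
print(f"z={z:.2f}, p={p:.4f}, reject H0: {p < alpha}")
```

Here z = 2.0 and p ≈ 0.0455, so H0 is rejected at α = 0.05 but would not be at α = 0.01, illustrating that the decision depends on the chosen error level, while the p value itself is a continuous measure of evidence.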

  10. The problem of the detection threshold in radiation measurement

    International Nuclear Information System (INIS)

    Rose, E.; Wueneke, C.D.

    1983-01-01

In all cases encountered in practical radiation measurement, the basic problem is to differentiate the lowest measured value from the zero value (background, natural background radiation, etc.). For this purpose, on the mathematical side, hypothesis tests are to be applied. These show the probability of differentiating between two values having the same random spread. By means of these tests and the corresponding error theory, a uniform treatment of the subject, applicable to all measurement problems alike, can be found. Two basic concepts arise in this process, which have to be defined in terms of semantics and nomenclature: the decision threshold and the detection threshold, or 'minimum detectable mean value'. At the decision threshold, one has to decide (with a given statistical error probability) whether a measured value is to be attributed to the background radiation, accepting the null hypothesis, or whether this value differs significantly from the background radiation (error of the first kind). The minimum detectable mean value is the value which, with a given decision threshold, can be determined with sufficient significance to be a true measured value and thus cannot be mistaken for background radiation (alternative hypothesis, error of the second kind). Normally, the two error types are of equal importance. It may happen, however, that one type of error gains more importance, depending on the approach. (orig.) [de]
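For Poisson counting with a paired background measurement, the decision threshold and minimum detectable value described above are commonly formalized along the lines of Currie's well-known expressions. The sketch below uses that standard formulation (not necessarily this paper's exact formulas) with k = 1.645, i.e. 5% for both the error of the first kind and the error of the second kind.

```python
import math

def counting_limits(background_counts, k=1.645):
    """Decision threshold L_C and detection limit L_D (net counts) for a
    paired-blank Poisson measurement, in the widely used Currie form:
      L_C = k * sqrt(2 * B)   # controls the error of the first kind
      L_D = k**2 + 2 * L_C    # controls the error of the second kind
    """
    l_c = k * math.sqrt(2 * background_counts)
    l_d = k**2 + 2 * l_c
    return l_c, l_d

l_c, l_d = counting_limits(background_counts=100.0)
print(f"decision threshold = {l_c:.1f} counts, detection limit = {l_d:.1f} counts")
```

A net count above L_C is declared "detected" (rejecting the background-only hypothesis); L_D is the smallest true net signal that will reliably exceed L_C, which is why L_D is always larger than L_C.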

  11. THE FRACTAL MARKET HYPOTHESIS

    OpenAIRE

    FELICIA RAMONA BIRAU

    2012-01-01

In this article, the concept of the capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and...

  12. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  13. THE FRACTAL MARKET HYPOTHESIS

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRAU

    2012-05-01

Full Text Available In this article, the concept of the capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and, of course, the manner in which they interpret that information may differ. Also, the Fractal Market Hypothesis refers to the way that liquidity and investment horizons influence the behaviour of financial investors.

  14. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

A series of questions is asked concerning this condition, including its name, the consensus about the histopathological findings, physiological hypotheses, and treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite and the technique employed are fundamental to success. PMID:19756187

  15. Life Origination Hydrate Hypothesis (LOH-Hypothesis)

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs, which are N-bases, riboses, nucleosides, nucleotides, DNA- and RNA-like molecules, and amino-acids) and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix, filled with LMSEs almost completely in its final state, accounts for the size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  16. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  17. On the Keyhole Hypothesis

    DEFF Research Database (Denmark)

    Mikkelsen, Kaare B.; Kidmose, Preben; Hansen, Lars Kai

    2017-01-01

We propose and test the keyhole hypothesis: that measurements from low-dimensional EEG, such as ear-EEG, reflect a broadly distributed set of neural processes. We formulate the keyhole hypothesis in information-theoretical terms. The experimental investigation is based on legacy data consisting of 10... simultaneously recorded scalp EEG. A cross-validation procedure was employed to ensure unbiased estimates. We present several pieces of evidence in support of the keyhole hypothesis: there is a high mutual information between data acquired at scalp electrodes and through the ear-EEG "keyhole"; furthermore we...

  18. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  19. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  20. Revisiting the Dutch hypothesis

    NARCIS (Netherlands)

    Postma, Dirkje S.; Weiss, Scott T.; van den Berge, Maarten; Kerstjens, Huib A. M.; Koppelman, Gerard H.

    The Dutch hypothesis was first articulated in 1961, when many novel and advanced scientific techniques were not available, such as genomics techniques for pinpointing genes, gene expression, lipid and protein profiles, and the microbiome. In addition, computed tomographic scans and advanced analysis

  1. The Lehman Sisters Hypothesis

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2014-01-01

This article explores the Lehman Sisters Hypothesis. It reviews empirical literature about gender differences in behavioral, experimental, and neuro-economics as well as in other fields of behavioral research. It discusses gender differences along three dimensions of

  2. CARA Risk Assessment Thresholds

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).

  3. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
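The slides' specific setup (the parameter τ² and the posterior marginal) is not reproduced here, but the underlying object, a Bayes factor as a ratio of marginal likelihoods, can be illustrated with a textbook example: testing H0: p = 0.5 against H1: p ~ Uniform(0, 1) for a binomial count.

```python
import math

def bayes_factor_coin(heads, n):
    """Bayes factor BF01 for H0: p = 0.5 vs H1: p ~ Uniform(0, 1),
    given `heads` successes in n Bernoulli trials.
    Under H1 the marginal likelihood of the count is 1/(n+1)
    (the Beta(k+1, n-k+1) normalization integrates the binomial out)."""
    p_h0 = math.comb(n, heads) * 0.5**n   # likelihood of the count under H0
    p_h1 = 1.0 / (n + 1)                  # marginal likelihood under H1
    return p_h0 / p_h1

bf = bayes_factor_coin(heads=60, n=100)
print(f"BF01 = {bf:.3f}")  # values > 1 favor H0, values < 1 favor H1
```

Notably, 60 heads in 100 flips yields a BF01 close to 1, even though the corresponding frequentist p value is below 0.05, a standard illustration of how Bayes factors and p values can disagree near a significance threshold.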

  4. The Drift Burst Hypothesis

    OpenAIRE

    Christensen, Kim; Oomen, Roel; Renò, Roberto

    2016-01-01

    The Drift Burst Hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and Treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms such as feedback trading. At a theoretical level, we show how to build drift bursts into the...

  5. Hypothesis in research

    Directory of Open Access Journals (Sweden)

    Eudaldo Enrique Espinoza Freire

    2018-01-01

Full Text Available This work aims to provide material covering the fundamental content that enables the university professor to formulate hypotheses for the development of an investigation, taking into account the problem to be solved. For its elaboration, information was sought in primary documents, such as degree theses and reports of research results, selected on the basis of their relevance to the subject analyzed, currency, and reliability, and in secondary documents, such as scientific articles published in journals of recognized prestige, selected with the same criteria as the previous documents. It presents an updated conceptualization of the hypothesis, its characterization, and an analysis of the structure of the hypothesis in which the determination of the variables is examined in depth. The involvement of the university professor in the teaching-research process currently faces some difficulties, manifested, among other aspects, in an unstable balance between teaching and research, which leads to a separation between them.

  6. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding
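The classical primitive being generalized here is (t, n) threshold secret sharing, in which any t of n parties can reconstruct a secret while fewer learn nothing. As background only (this is the classical analogue, not the quantum protocol of the paper), a minimal Shamir sharing sketch over a prime field:

```python
import random

P = 2**61 - 1  # a Mersenne prime; shares live in GF(P)

def make_shares(secret, threshold, n):
    """Shamir (threshold, n) sharing: a random polynomial of degree
    threshold-1 with constant term = secret, evaluated at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, threshold=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret
```

The quantum version described in the abstract replaces "reconstruct the secret" with "collaboratively apply a quantum cryptographic function", while keeping each classical share private.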

  7. The linear hypothesis and radiation carcinogenesis

    International Nuclear Information System (INIS)

    Roberts, P.B.

    1981-10-01

    An assumption central to most estimations of the carcinogenic potential of low levels of ionising radiation is that the risk always increases in direct proportion to the dose received. This assumption (the linear hypothesis) has been both strongly defended and attacked on several counts. It appears unlikely that conclusive, direct evidence on the validity of the hypothesis will be forthcoming. We review the major indirect arguments used in the debate. All of them are subject to objections that can seriously weaken their case. In the present situation, retention of the linear hypothesis as the basis of extrapolations from high to low dose levels can lead to excessive fears, over-regulation and unnecessarily expensive protection measures. To offset these possibilities, support is given to suggestions urging a cut-off dose, probably some fraction of natural background, below which risks can be deemed acceptable

  8. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

Theory of Threshold Phenomena in Quantum Scattering is developed in terms of the Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. The magnitude of a threshold effect is related to the spectroscopic factor of the zero-energy neutron state. The Theory of Threshold Phenomena, based on the Reduced Scattering Matrix, establishes relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, the s-wave threshold anomaly and compound nucleus resonant scattering, and the p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi-resonant scattering is enhanced provided the neutron threshold state has a large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reaction models such as the Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  9. Ideal Standards, Acceptance, and Relationship Satisfaction: Latitudes of Differential Effects

    Directory of Open Access Journals (Sweden)

    Asuman Buyukcan-Tetik

    2017-09-01

    Full Text Available We examined whether the relations of consistency between ideal standards and perceptions of a current romantic partner with partner acceptance and relationship satisfaction level off, or decelerate, above a threshold. We tested our hypothesis using a 3-year longitudinal data set collected from heterosexual newlywed couples. We used two indicators of consistency: pattern correspondence (within-person correlation between ideal standards and perceived partner ratings and mean-level match (difference between ideal standards score and perceived partner score. Our results revealed that pattern correspondence had no relation with partner acceptance, but a positive linear/exponential association with relationship satisfaction. Mean-level match had a significant positive association with actor’s acceptance and relationship satisfaction up to the point where perceived partner score equaled ideal standards score. Partner effects did not show a consistent pattern. The results suggest that the consistency between ideal standards and perceived partner attributes has a non-linear association with acceptance and relationship satisfaction, although the results were more conclusive for mean-level match.
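The two consistency indicators in this abstract can be computed directly from a respondent's ratings: pattern correspondence is the within-person Pearson correlation across attributes, and mean-level match is the difference between the perceived-partner and ideal-standards means. A sketch with hypothetical 1-7 ratings on five attributes (illustration only, not the study's data):

```python
from statistics import mean

def pattern_correspondence(ideal, perceived):
    """Within-person Pearson correlation across attribute ratings."""
    mi, mp = mean(ideal), mean(perceived)
    num = sum((a - mi) * (b - mp) for a, b in zip(ideal, perceived))
    den = (sum((a - mi) ** 2 for a in ideal) *
           sum((b - mp) ** 2 for b in perceived)) ** 0.5
    return num / den

def mean_level_match(ideal, perceived):
    """Perceived-partner mean minus ideal-standards mean; 0 = perfect match,
    negative = the partner falls short of the standard."""
    return mean(perceived) - mean(ideal)

ideal     = [6, 7, 5, 6, 4]  # hypothetical ideal-standards ratings
perceived = [5, 6, 5, 7, 3]  # hypothetical perceived-partner ratings
r = pattern_correspondence(ideal, perceived)
d = mean_level_match(ideal, perceived)
print(f"pattern correspondence = {r:.2f}, mean-level match = {d:.1f}")
```

The reported non-linearity means the positive association of the mean-level indicator with acceptance and satisfaction holds while d is below zero and levels off once the perceived-partner score meets or exceeds the ideal.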

  10. Tissue misrepair hypothesis for radiation carcinogenesis

    International Nuclear Information System (INIS)

    Kondo, Sohei

    1991-01-01

Dose-response curves for chronic leukemia in A-bomb survivors and liver tumors in patients given Thorotrast (colloidal thorium dioxide) show large threshold effects. The existence of these threshold effects can be explained by the following hypothesis. A high dose of radiation causes a persistent wound in a cell-renewable tissue. Disorder of the injured cell society partly frees the component cells from territorial restraints on their proliferation, enabling them to continue development of their cellular functions toward advanced autonomy. This progression might be achieved by continued epigenetic and genetic changes as a result of occasional errors in the otherwise concerted healing action of various endogenous factors recruited for tissue repair. Carcinogenesis is not simply a single-cell problem but a cell-society problem. Therefore, it is not warranted to estimate risk at low doses by linear extrapolation from cancer data at high doses without knowledge of the mechanism of radiation carcinogenesis. (author) 57 refs

  11. [Dilemma of null hypothesis in ecological hypothesis's experiment test].

    Science.gov (United States)

    Li, Ji

    2016-06-01

Experimental testing is one of the major methods of testing ecological hypotheses, though there are many arguments over the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis deduction model from Platt (1964) and stated that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent a statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those of classical physics, the ecological null hypothesis likewise cannot be strictly tested experimentally. These dilemmas of the null hypothesis could be relieved via the reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the test of causal logic in an ecological hypothesis. Hence, findings and conclusions about methodological studies and experimental tests based on NHST are not always logically reliable.

  12. The oxidative hypothesis of senescence

    Directory of Open Access Journals (Sweden)

    Gilca M

    2007-01-01

Full Text Available The oxidative hypothesis of senescence, since its origin in 1956, has garnered significant evidence and growing support among scientists for the notion that free radicals play an important role in ageing, either as "damaging" molecules or as signaling molecules. Age-increasing oxidative injuries induced by free radicals, higher susceptibility to oxidative stress in short-lived organisms, genetic manipulations that alter both oxidative resistance and longevity, and the anti-ageing effect of caloric restriction and intermittent fasting are a few examples of accepted scientific facts that support the oxidative theory of senescence. Though not completely understood due to the complex "network" of redox regulatory systems, the implication of oxidative stress in the ageing process is now well documented. Moreover, it is compatible with other current ageing theories (e.g., those implicating mitochondrial damage/the mitochondrial-lysosomal axis, stress-induced premature senescence, biological "garbage" accumulation, etc.). This review is intended to summarize and critically discuss the redox mechanisms involved in the ageing process: sources of oxidant agents in ageing (mitochondrial: the electron transport chain and the nitric oxide synthase reaction; non-mitochondrial: the Fenton reaction, microsomal cytochrome P450 enzymes, peroxisomal β-oxidation and the respiratory burst of phagocytic cells); antioxidant changes in ageing (enzymatic: superoxide dismutase, glutathione reductase, glutathione peroxidase, catalase; non-enzymatic: glutathione, ascorbate, urate, bilirubin, melatonin, tocopherols, carotenoids, ubiquinol); alteration of oxidative damage repair mechanisms; and the role of free radicals as signaling molecules in ageing.

  13. The Bergschrund Hypothesis Revisited

    Science.gov (United States)

    Sanders, J. W.; Cuffey, K. M.; MacGregor, K. R.

    2009-12-01

    After Willard Johnson descended into the Lyell Glacier bergschrund nearly 140 years ago, he proposed that the presence of the bergschrund modulated daily air temperature fluctuations and enhanced freeze-thaw processes. He posited that glaciers, through their ability to birth bergschrunds, are thus able to induce rapid cirque headwall retreat. In subsequent years, many researchers challenged the bergschrund hypothesis on the grounds that freeze-thaw events did not occur at depth in bergschrunds. We propose a modified version of Johnson's original hypothesis: that bergschrunds maintain subfreezing temperatures at values that encourage rock fracture via ice lensing because they act as a cold air trap in areas that would otherwise be held near zero by temperate glacial ice. In support of this claim we investigated three sections of the bergschrund at the West Washmawapta Glacier, British Columbia, Canada, which sits in an east-facing cirque. During our bergschrund reconnaissance we installed temperature sensors at multiple elevations, installed light sensors at depth in 2 of the 3 locations, and painted two 1 m² sections of the headwall. We first emphasize that bergschrunds are not wanting for ice: verglas covers significant fractions of the headwall and icicles dangle from the base of bödens or overhanging rocks. If temperature, rather than water availability, is the limiting factor governing ice-lensing rates, our temperature records demonstrate that the bergschrund provides a suitable environment for considerable rock fracture. At the three sites (north, west, and south walls), the average temperature at depth from 9/3/2006 to 8/6/2007 was -3.6, -3.6, and -2.0 °C, respectively. During spring, when we observed vast amounts of snow melt trickle into the bergschrund, temperatures averaged -3.7, -3.8, and -2.2 °C, respectively. Winter temperatures are even lower: -8.5, -7.3, and -2.4 °C, respectively. Values during the following year were similar. During the fall, diurnal

  14. Einstein's Revolutionary Light-Quantum Hypothesis

    Science.gov (United States)

    Stuewer, Roger H.

    2005-05-01

    The paper in which Albert Einstein proposed his light-quantum hypothesis was the only one of his great papers of 1905 that he himself termed "revolutionary." Contrary to widespread belief, Einstein did not propose his light-quantum hypothesis "to explain the photoelectric effect." Instead, he based his argument for light quanta on the statistical interpretation of the second law of thermodynamics, with the photoelectric effect being only one of three phenomena that he offered as possible experimental support for it. I will discuss Einstein's light-quantum hypothesis of 1905 and his introduction of the wave-particle duality in 1909 and then turn to the reception of his work on light quanta by his contemporaries. We will examine the reasons that prominent physicists advanced to reject Einstein's light-quantum hypothesis in succeeding years. Those physicists included Robert A. Millikan, even though he provided convincing experimental proof of the validity of Einstein's equation of the photoelectric effect in 1915. The turning point came after Arthur Holly Compton discovered the Compton effect in late 1922, but even then Compton's discovery was contested both on experimental and on theoretical grounds. Niels Bohr, in particular, had never accepted the reality of light quanta and now, in 1924, proposed a theory, the Bohr-Kramers-Slater theory, which assumed that energy and momentum were conserved only statistically in microscopic interactions. Only after that theory was disproved experimentally in 1925 was Einstein's revolutionary light-quantum hypothesis generally accepted by physicists, a full two decades after Einstein had proposed it.

  15. The linear hypothesis - an idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed

  16. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generating and verifying threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation were given, which could reduce the level of counterfeit electronic documents signed by a group of users.
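As a concrete illustration of the Lagrange-interpolation construction the survey examines, here is a minimal Shamir-style (t, n) threshold sketch over a prime field; the modulus, parameters, and function names are illustrative choices, not the schemes systematized in the paper.

```python
import random

P = 2**127 - 1  # Mersenne prime used as the field modulus (illustrative)

def make_shares(secret, t, n, seed=0):
    """Split `secret` into n shares so that any t of them reconstruct it."""
    rng = random.Random(seed)
    # random degree-(t-1) polynomial with f(0) = secret
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's rule, mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        # pow(den, -1, P) is the modular inverse (Python 3.8+)
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares suffice -> 123456789
```

A threshold signature scheme builds on the same interpolation idea, but combines partial signatures rather than reconstructing the key itself.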

  17. Particles near threshold

    International Nuclear Information System (INIS)

    Bhattacharya, T.; Willenbrock, S.

    1993-01-01

    We propose returning to the definition of the width of a particle in terms of the pole in the particle's propagator. Away from thresholds, this definition of width is equivalent to the standard perturbative definition, up to next-to-leading order; however, near a threshold, the two definitions differ significantly. The width as defined by the pole position provides more information in the threshold region than the standard perturbative definition and, in contrast with the perturbative definition, does not vanish when a two-particle s-wave threshold is approached from below
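The distinction described above can be summarized in standard notation (a sketch of the usual conventions, not the paper's own formulas): near the pole the full propagator behaves as

```latex
D(s) \;\propto\; \frac{1}{s - s_p}, \qquad s_p \equiv m_p^2 - i\, m_p \Gamma_p ,
\qquad
\Gamma_{\text{pert}} = -\frac{\operatorname{Im}\Sigma(m^2)}{m},
```

so the complex pole position $s_p$ defines mass and width simultaneously, whereas the perturbative width $\Gamma_{\text{pert}}$ evaluates the imaginary part of the self-energy at the real on-shell point; the two agree away from thresholds up to next-to-leading order but differ near one.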

  18. Acceptability, acceptance and decision making

    International Nuclear Information System (INIS)

    Ackerschott, H.

    2002-01-01

    There is a fundamental difference between the acceptability of a civilizatory or societal risk and the acceptability of the decision-making process that leads to such a risk. The analysis of individual risk decisions - who engages in which indisputably hazardous, unhealthy or dangerous behaviour, when, and under which circumstances - is not helpful in finding solutions for the political decisions at hand in Germany concerning nuclear energy in particular or energy in general. The debt of implementation for any technology, in the sense of making the technology a success in terms of broad acceptance and general utilisation, lies with the particular industry involved. Regardless of the technology, innovation research identifies the implementation phase as the most critical to the success of any innovation. In this sense, nuclear technology is at best still an innovation, because its implementation has not yet been completed. Fear of and opposition to innovation are ubiquitous. Even the economy - often described as 'rational' - is full of this resistance. Innovation acts on the pivotal point between stability, the precondition for the successful execution of decisions already taken, and instability, which includes insecurity but is also necessary for the success of further development. By definition, innovations lie beyond our sphere of experience and have not yet reached the level of reliability and trust. Yet they are evaluated via the simplifying decision heuristics that have proven not only necessary and useful, but also accurate, in the familiar. The 'settlement of the debt of implementation', the accompanying communication, the decision-making procedures concerning the regulation of adverse effects of the technology, and the tailoring of the new technology or service itself must be directed to appropriate target groups. But the group often aimed at in the nuclear debate, the group which largely determines political

  19. The distribution choice for the threshold of solid state relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    Either a normal distribution or a Weibull distribution can be accepted as the sampling distribution of the threshold of a solid state relay. Based on goodness-of-fit, bootstrap and Bayesian methods, the Weibull distribution was ultimately chosen. (authors)

  20. The conscious access hypothesis: Explaining the consciousness.

    Science.gov (United States)

    Prakash, Ravi

    2008-01-01

    The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, the exploration of consciousness from a scientific perspective is not very old. Among the myriad theories regarding the nature, functions and mechanism of consciousness, cognitive theories have of late received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis," based on the "global workspace model of consciousness." It underscores an important property of consciousness: the global access of information in the cerebral cortex. The present article reviews the "conscious access hypothesis" in terms of its theoretical underpinnings as well as the experimental support it has received.

  1. Decision-making when data and inferences are not conclusive: risk-benefit and acceptable regret approach.

    Science.gov (United States)

    Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin

    2008-07-01

    The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. Errors can occur at the level of conclusions, which aim to discern the truthfulness of a research hypothesis based on the accuracy of the research evidence and hypothesis, and at the level of decisions, whose goal is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, a synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk-benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on a decision-theoretic expected utility framework and the concept of "acceptable regret" to calculate the threshold probability of the "truth" above which the benefit of accepting a research hypothesis outweighs its risks.
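The threshold probability described in the abstract follows from an expected-utility indifference condition; the sketch below uses the classical risk/(risk + benefit) form with illustrative numbers (the chapter's own regret-based derivation is more elaborate).

```python
def acceptance_threshold(benefit, risk):
    """Probability that the hypothesis is true above which accepting
    it maximizes expected utility.

    Derived from the indifference condition
        p * benefit = (1 - p) * risk   =>   p = risk / (risk + benefit)
    (the classical decision-analytic threshold; an illustration of
    the R:B reasoning described in the abstract, not its exact model).
    """
    return risk / (risk + benefit)

# If accepting a true hypothesis gains 3 units of utility and
# accepting a false one costs 1 unit, accept whenever P(true) > 0.25.
print(acceptance_threshold(benefit=3.0, risk=1.0))  # 0.25
```

The larger the benefit relative to the risk, the lower the probability of "truth" needed before acceptance becomes rational, which is exactly why such thresholds differ between research conclusions and clinical decisions.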

  2. Is the Aluminum Hypothesis Dead?

    Science.gov (United States)

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  3. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated distribution. ... a microscope, and we show how the method can handle transparent particles with significant glare points. The method generalizes to other problems. This is illustrated by applying the method to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation.
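A minimal sketch of the segmentation-as-outlier-detection idea: fit a Gaussian to background intensities and flag pixels whose upper-tail probability under that background model falls below a significance level. The data, the α level, and the Gaussian background assumption are illustrative; the paper's method is more general.

```python
import math

def segment_as_outliers(pixels, background, alpha=0.001):
    """Flag pixels that are improbable under a Gaussian model of the
    background (a one-sided upper-tail hypothesis test per pixel)."""
    n = len(background)
    mu = sum(background) / n
    var = sum((v - mu) ** 2 for v in background) / (n - 1)
    sd = math.sqrt(var)

    def upper_tail_p(v):
        # P(X >= v) under N(mu, sd^2), via the standard normal CDF
        z = (v - mu) / sd
        return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

    return [upper_tail_p(v) < alpha for v in pixels]

# Illustrative intensities: a tight background plus two bright outliers.
background = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8, 10.1, 10.4]
pixels = [10.3, 25.0, 9.9, 30.0]
print(segment_as_outliers(pixels, background))  # [False, True, False, True]
```

Choosing α fixes the expected false-positive rate per pixel, which is the "consistent method for selecting an appropriate threshold" role that the abstract describes.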

  4. Double Photoionization Near Threshold

    Science.gov (United States)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  5. Thresholds in radiobiology

    International Nuclear Information System (INIS)

    Katz, R.; Hofmann, W.

    1982-01-01

    Interpretations of biological radiation effects frequently use the word 'threshold'. The meaning of this word is explored together with its relationship to the fundamental character of radiation effects and to the question of perception. It is emphasised that although the existence of either a dose or an LET threshold can never be settled by experimental radiobiological investigations, it may be argued on fundamental statistical grounds that for all statistical processes, and especially where the number of observed events is small, the concept of a threshold is logically invalid. (U.K.)

  6. The Variability Hypothesis: The History of a Biological Model of Sex Differences in Intelligence.

    Science.gov (United States)

    Shields, Stephanie A.

    1982-01-01

    Describes the origin and development of the variability hypothesis as applied to the study of social and psychological sex differences. Explores changes in the hypothesis over time, social and scientific factors that fostered its acceptance, and possible parallels between the variability hypothesis and contemporary theories of sex differences.…

  7. Determining lower threshold concentrations for synergistic effects

    DEFF Research Database (Denmark)

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas

    2017-01-01

    which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test-setups to evaluate which approach gives the most conservative estimate for the lower threshold for synergy for three known azole synergists. We focus...... on synergistic interactions between the pyrethroid insecticide, alpha-cypermethrin, and one of the three azole fungicides prochloraz, propiconazole or epoxiconazole measured on Daphnia magna immobilization. Three different experimental setups were applied: A standard 48h acute toxicity test, an adapted 48h test...... of immobile organisms increased more than two-fold above what was predicted by independent action (vertical assessment). All three tests confirmed the hypothesis of the existence of a lower azole threshold concentration below which no synergistic interaction was observed. The lower threshold concentration...

  8. Color difference thresholds in dentistry.

    Science.gov (United States)

    Paravina, Rade D; Ghinea, Razvan; Herrera, Luis J; Bona, Alvaro D; Igiel, Christopher; Linninger, Mercedes; Sakai, Maiko; Takahashi, Hidekazu; Tashkandi, Esam; Perez, Maria del Mar

    2015-01-01

    The aim of this prospective multicenter study was to determine the 50:50% perceptibility threshold (PT) and 50:50% acceptability threshold (AT) of dental ceramic under simulated clinical settings. The spectral radiance of 63 monochromatic ceramic specimens was determined using a non-contact spectroradiometer. A total of 60 specimen pairs, divided into 3 sets of 20 specimen pairs (medium to light shades, medium to dark shades, and dark shades), were selected for the psychophysical experiment. The coordinating center and seven research sites obtained Institutional Review Board (IRB) approvals prior to the beginning of the experiment. Each research site had 25 observers, divided into five groups of five observers: dentists-D, dental students-S, dental auxiliaries-A, dental technicians-T, and lay persons-L. There were 35 observers per group (five observers per group at each site × 7 sites), for a total of 175 observers. Visual color comparisons were performed using a viewing booth. Takagi-Sugeno-Kang (TSK) fuzzy approximation was used for fitting the data points. The 50:50% PT and 50:50% AT were determined in CIELAB and CIEDE2000. The t-test was used to evaluate the statistical significance of threshold differences. The CIELAB 50:50% PT was ΔEab = 1.2, whereas the 50:50% AT was ΔEab = 2.7. Corresponding CIEDE2000 (ΔE00) values were 0.8 and 1.8, respectively. The 50:50% PT by observer group revealed differences among groups D, A, T, and L as compared with the 50:50% PT for all observers. The 50:50% AT for all observers was statistically different from the 50:50% AT in groups T and L. The 50:50% perceptibility and acceptability thresholds were significantly different, as were the values from the two color difference formulas (ΔE00/ΔEab). Observer groups and sites showed a high level of statistical difference in all thresholds. Visual color difference thresholds can serve as a quality control tool to guide the selection of esthetic dental materials, evaluate clinical performance, and
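With the study's CIELAB thresholds (50:50% PT = 1.2, 50:50% AT = 2.7), a shade comparison can be screened as follows. ΔE*ab is the standard Euclidean CIELAB distance; the Lab values in the example are made up, and the more involved CIEDE2000 formula is omitted for brevity.

```python
import math

PT_AB = 1.2  # 50:50% perceptibility threshold (CIELAB), from the study
AT_AB = 2.7  # 50:50% acceptability threshold (CIELAB), from the study

def delta_e_ab(lab1, lab2):
    """Classic CIELAB color difference: Euclidean distance in (L*, a*, b*)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def judge(lab1, lab2):
    """Classify a color difference against the study's PT and AT."""
    de = delta_e_ab(lab1, lab2)
    if de <= PT_AB:
        return de, "imperceptible to the average observer"
    if de <= AT_AB:
        return de, "perceptible but acceptable"
    return de, "unacceptable mismatch"

# Illustrative shade comparison (the L*, a*, b* values are made up):
print(judge((72.0, 1.5, 18.0), (71.0, 1.0, 19.5)))
```

This two-threshold scheme is what makes the thresholds usable as a quality control tool: differences between PT and AT are visible yet tolerable, while anything above AT calls for remake or reselection.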

  9. Influence of a threshold existence on the sanitary consequences of a nuclear accident

    International Nuclear Information System (INIS)

    Nifenecker, H.

    2001-11-01

    The justification for applying the no-threshold dose-response relationship to the calculation of the number of fatal cancers in the case of an accidental irradiation of a population is discussed. The hypothesis of a low threshold below which doses are harmless is examined. The existence of even a low threshold significantly reduces the predicted number of victims. A simulated case is studied. (N.C.)
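The effect of a threshold on the predicted casualty count can be sketched with a toy linear model; the risk coefficient, doses, and population size below are illustrative assumptions, not figures from the report.

```python
def expected_cancers(doses_sv, risk_per_sv=0.05, threshold_sv=0.0):
    """Expected number of radiation-induced fatal cancers in a
    population, under a linear model with an optional dose threshold.

    risk_per_sv and the doses are illustrative numbers only: with
    threshold_sv=0 this is the linear no-threshold (LNT) assumption,
    while a positive threshold discounts all doses below it.
    """
    return sum(risk_per_sv * d for d in doses_sv if d > threshold_sv)

# 100,000 people at 10 mSv each: LNT predicts ~50 cases, while a
# 100 mSv threshold predicts none from this exposure.
doses = [0.010] * 100_000
print(round(expected_cancers(doses, threshold_sv=0.0), 2))    # 50.0
print(round(expected_cancers(doses, threshold_sv=0.100), 2))  # 0.0
```

The example shows why the threshold question dominates accident assessments: widely distributed small doses contribute the bulk of LNT-predicted casualties, and a threshold removes them entirely.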

  10. Regional Seismic Threshold Monitoring

    National Research Council Canada - National Science Library

    Kvaerna, Tormod

    2006-01-01

    ... model to be used for predicting the travel times of regional phases. We have applied these attenuation relations to develop and assess a regional threshold monitoring scheme for selected subregions of the European Arctic...

  11. The thresholds for statistical and clinical significance

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does...... not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore...... of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance...
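The ratio described in point (2) of the abstract, the probability of the trial result under a 'null' effect divided by its probability under the intervention effect hypothesized in the sample size calculation, can be sketched with normal likelihoods; the estimate, standard error, and effect size below are illustrative.

```python
import math

def normal_pdf(x, mu, sd):
    """Density of N(mu, sd^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def null_to_effect_ratio(estimate, se, effect):
    """Likelihood of the trial estimate under 'no effect' divided by
    its likelihood under the hypothesized effect size (a sketch of
    the ratio the abstract proposes as a supplement to the P-value;
    the numbers used below are illustrative)."""
    return normal_pdf(estimate, 0.0, se) / normal_pdf(estimate, effect, se)

# Observed effect 0.3 with SE 0.1, in a trial powered for an effect of 0.4:
r = null_to_effect_ratio(estimate=0.3, se=0.1, effect=0.4)
print(round(r, 3))  # 0.018: the data favor the hypothesized effect
```

A ratio far below 1 says the data are much better explained by the planned effect than by the null, information that a bare P-value does not convey.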

  12. Hypothesis Designs for Three-Hypothesis Test Problems

    OpenAIRE

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  13. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
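The core of the SIGMA idea, thresholds placed at multiples of the standard deviation of the rainfall series, can be sketched as follows; the rainfall values and the choice of k are illustrative, and the published model works on cumulative rainfall series rather than this simplified version.

```python
import math

def sigma_thresholds(rainfall_mm, multiples=(1, 2, 3)):
    """Thresholds at mean + k*sigma of a rainfall series, in the spirit
    of the SIGMA model (an illustrative sketch, not the published
    implementation, which uses cumulative rainfall)."""
    n = len(rainfall_mm)
    mean = sum(rainfall_mm) / n
    sd = math.sqrt(sum((r - mean) ** 2 for r in rainfall_mm) / (n - 1))
    return {k: mean + k * sd for k in multiples}

def is_extraordinary(event_mm, thresholds, k=2):
    """Flag an event exceeding mean + k*sigma as extraordinary."""
    return event_mm > thresholds[k]

# Illustrative daily rainfall record (mm):
rain = [12.0, 8.0, 15.0, 10.0, 9.0, 14.0, 11.0, 13.0]
th = sigma_thresholds(rain)
print(is_extraordinary(30.0, th), is_extraordinary(12.5, th))  # True False
```

Because only the rainfall record is needed to place the σ multiples, this captures why SIGMA can be implemented without a dated landslide inventory, unlike the intensity-duration approach.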

  14. Tests of the lunar hypothesis

    Science.gov (United States)

    Taylor, S. R.

    1984-01-01

    The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.

  15. Evaluating the Stage Learning Hypothesis.

    Science.gov (United States)

    Thomas, Hoben

    1980-01-01

    A procedure for evaluating the Genevan stage learning hypothesis is illustrated by analyzing Inhelder, Sinclair, and Bovet's guided learning experiments (in "Learning and the Development of Cognition." Cambridge: Harvard University Press, 1974). (Author/MP)

  16. The Purchasing Power Parity Hypothesis:

    African Journals Online (AJOL)

    2011-10-02

    Oct 2, 2011 ... reject the unit root hypothesis in real exchange rates may simply be due to the shortness ..... Violations of Purchasing Power Parity and Their Implications for Efficient ... Official Intervention in the Foreign Exchange Market:.

  17. The equilibrium-point hypothesis--past, present and future.

    Science.gov (United States)

    Feldman, Anatol G; Levin, Mindy F

    2009-01-01

    This chapter is a brief account of the fundamentals of the equilibrium-point hypothesis, more adequately called the threshold control theory (TCT). It also compares the TCT with other approaches to motor control. The basic notions of the TCT are reviewed with a major focus on solutions to the problems of multi-muscle and multi-degree-of-freedom redundancy. The TCT incorporates cognitive aspects by explaining how neurons recognize that internal (neural) and external (environmental) events match each other. These aspects, as well as how motor learning occurs, are subjects of further development of the TCT hypothesis.

  18. Threshold guidance update

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1986-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Last year's activities (1984) included the development of a threshold guidance dose, the development of threshold concentrations corresponding to the guidance dose, the development of supporting documentation, review by a technical peer review committee, and review by the DOE community. As a result of the comments, areas have been identified for more extensive analysis, including an alternative basis for selection of the guidance dose and the development of quality assurance guidelines. Development of quality assurance guidelines will provide a reasonable basis for determining that a given waste stream qualifies as a threshold waste stream, which can then be the basis for a more extensive cost-benefit analysis. The threshold guidance and supporting documentation will be revised, based on the comments received. The revised documents will be provided to DOE by early November. DOE-HQ has indicated that the revised documents will be available for review by DOE field offices and their contractors

  19. Near threshold fatigue testing

    Science.gov (United States)

    Freeman, D. C.; Strum, M. J.

    1993-01-01

    Measurement of the near-threshold fatigue crack growth rate (FCGR) behavior provides a basis for the design and evaluation of components subjected to high cycle fatigue. Typically, the near-threshold fatigue regime describes crack growth rates below approximately 10(exp -5) mm/cycle (4 x 10(exp -7) inch/cycle). One such evaluation was recently performed for the binary alloy U-6Nb. The procedures developed for this evaluation are described in detail to provide a general test method for near-threshold FCGR testing. In particular, techniques for high-resolution measurements of crack length performed in-situ through a direct current, potential drop (DCPD) apparatus, and a method which eliminates crack closure effects through the use of loading cycles with constant maximum stress intensity are described.

  20. Multimodal distribution of human cold pain thresholds.

    Science.gov (United States)

    Lötsch, Jörn; Dimova, Violeta; Lieb, Isabel; Zimmermann, Michael; Oertel, Bruno G; Ultsch, Alfred

    2015-01-01

    It is assumed that different pain phenotypes are based on varying molecular pathomechanisms. Distinct ion channels seem to be associated with the perception of cold pain; in particular, TRPM8 and TRPA1 have been highlighted previously. The present study analyzed the distribution of cold pain thresholds with a focus on describing its multimodality, based on the hypothesis that it reflects the contribution of distinct ion channels. Cold pain thresholds (CPT) were available from 329 healthy volunteers (aged 18-37 years; 159 men) enrolled in previous studies. The distribution of the pooled and log-transformed threshold data was described using a kernel density estimation (Pareto density estimation, PDE), and subsequently the log data were modeled as a mixture of Gaussian distributions using the expectation maximization (EM) algorithm to optimize the fit. CPTs were clearly multimodally distributed. Fitting a Gaussian mixture model (GMM) to the log-transformed threshold data revealed that the best fit is obtained when applying a three-component distribution pattern. The modes of the identified three Gaussian distributions, retransformed from the log domain to the mean stimulation temperatures at which the subjects had indicated pain thresholds, were obtained at 23.7 °C, 13.2 °C and 1.5 °C for Gaussians #1, #2 and #3, respectively. The localization of the first and second Gaussians was interpreted as reflecting the contribution of two different cold sensors. From the calculated localization of the modes of the first two Gaussians, the hypothesis of an involvement of TRPM8, sensing temperatures from 25-24 °C, and TRPA1, sensing cold from 17 °C, can be derived. In that case, subjects belonging to either Gaussian would possess a dominance of the one or the other receptor at the skin area where the cold stimuli had been applied. The findings therefore support the suitability of complex analytical approaches for detecting mechanistically determined patterns in pain phenotype data.
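The model-fitting step, a Gaussian mixture estimated by expectation maximization on (log-transformed) threshold data, can be sketched in miniature; the EM routine and the synthetic bimodal data below are illustrative, not the study's PDE-guided three-component fit.

```python
import math
import random

def em_gmm_1d(data, k=2, iters=100):
    """Fit a k-component 1-D Gaussian mixture by expectation
    maximization (a sketch of the approach the study applies to
    log-transformed cold pain thresholds)."""
    # Deterministic initialization: spread the means across the data range.
    lo, hi = min(data), max(data)
    mus = [lo + (hi - lo) * j / (k - 1) for j in range(k)] if k > 1 else [lo]
    sds = [max((hi - lo) / (2 * k), 1e-3)] * k
    ws = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        # (the constant 1/sqrt(2*pi) cancels in the normalization).
        resp = []
        for x in data:
            dens = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                    for w, m, s in zip(ws, mus, sds)]
            tot = sum(dens)
            resp.append([d / tot for d in dens])
        # M-step: re-estimate weights, means and standard deviations.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            ws[j] = nj / len(data)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sds[j] = max(math.sqrt(var), 1e-3)
    return sorted(zip(ws, mus, sds), key=lambda t: t[1])

# Synthetic bimodal "log-threshold" data with modes near 1.0 and 3.0:
rng = random.Random(0)
data = ([rng.gauss(1.0, 0.2) for _ in range(200)]
        + [rng.gauss(3.0, 0.2) for _ in range(200)])
fit = em_gmm_1d(data, k=2)
print([(round(w, 2), round(m, 2)) for w, m, _ in fit])
```

The recovered component means land near the two planted modes; in the study, the analogous modes (retransformed from the log domain) are what get matched against known sensor ranges such as TRPM8 and TRPA1.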

  1. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Science.gov (United States)

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate the new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.

  2. Threshold factorization redux

    Science.gov (United States)

    Chay, Junegone; Kim, Chul

    2018-05-01

    We reanalyze the factorization theorems for the Drell-Yan process and for deep inelastic scattering near threshold, as constructed in the framework of the soft-collinear effective theory (SCET), from a new, consistent perspective. In order to formulate the factorization near threshold in SCET, we should include an additional degree of freedom with small energy, collinear to the beam direction. The corresponding collinear-soft mode is included to describe the parton distribution function (PDF) near threshold. The soft function is modified by subtracting the contribution of the collinear-soft modes in order to avoid double counting in the overlap region. As a result, the proper soft function becomes infrared finite, and all the factorized parts are free of rapidity divergence. Furthermore, the separation of the relevant scales in each factorized part becomes manifest. We apply the same idea to dihadron production in e+e- annihilation near threshold, and show that the resultant soft function is also free of infrared and rapidity divergences.

  3. Elaborating on Threshold Concepts

    Science.gov (United States)

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-01-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account…

  4. The atomic hypothesis: physical consequences

    International Nuclear Information System (INIS)

    Rivas, Martin

    2008-01-01

    The hypothesis that matter is made of some ultimate and indivisible objects, together with the restricted relativity principle, establishes a constraint on the kind of variables we are allowed to use for the variational description of elementary particles. We consider that the atomic hypothesis not only states the indivisibility of elementary particles, but also that these ultimate objects, if not annihilated, cannot be modified by any interaction so that all allowed states of an elementary particle are only kinematical modifications of any one of them. Therefore, an elementary particle cannot have excited states. In this way, the kinematical group of spacetime symmetries not only defines the symmetries of the system, but also the variables in terms of which the mathematical description of the elementary particles can be expressed in either the classical or the quantum mechanical description. When considering the interaction of two Dirac particles, the atomic hypothesis restricts the interaction Lagrangian to a kind of minimal coupling interaction

  5. Extra dimensions hypothesis in high energy physics

    Directory of Open Access Journals (Sweden)

    Volobuev Igor

    2017-01-01

    We discuss the history of the extra dimensions hypothesis and the physics and phenomenology of models with large extra dimensions, with an emphasis on the Randall-Sundrum (RS) model with two branes. We argue that the Standard Model extension based on the RS model with two branes is phenomenologically acceptable only if the inter-brane distance is stabilized. Within such an extension of the Standard Model, we study the influence of the infinite Kaluza-Klein (KK) towers of the bulk fields on collider processes. In particular, we discuss the modification of the scalar sector of the theory, the Higgs-radion mixing due to the coupling of the Higgs boson to the radion and its KK tower, and the experimental restrictions on the mass of the radion-dominated states.

  6. Multiple sclerosis: a geographical hypothesis.

    Science.gov (United States)

    Carlyle, I P

    1997-12-01

    Multiple sclerosis remains a rare neurological disease of unknown aetiology, with a unique distribution, both geographically and historically. Rare in equatorial regions, it becomes increasingly common in higher latitudes; historically, it was first clinically recognized in the early nineteenth century. A hypothesis, based on geographical reasoning, is here proposed: that the disease is the result of a specific vitamin deficiency. Different individuals suffer the deficiency in separate and often unique ways. Evidence to support the hypothesis exists in cultural considerations, in the global distribution of the disease, and in its historical prevalence.

  7. Discussion of the Porter hypothesis

    International Nuclear Information System (INIS)

    1999-11-01

    In reaction to the long-range vision of RMNO, published in 1996, the Dutch government posed the question of whether a far-reaching and progressive modernization policy would lead to competitive advantages for high-quality products on partly new markets. This question is connected to the so-called Porter hypothesis: 'By stimulating innovation, strict environmental regulations can actually enhance competitiveness', from which it can be concluded that environment and economy can work together quite well. A literature study has been carried out to determine under which conditions this hypothesis is endorsed in the scientific literature and policy documents. Recommendations are given for further studies. refs

  8. The thrifty phenotype hypothesis revisited

    DEFF Research Database (Denmark)

    Vaag, A A; Grunnet, L G; Arora, G P

    2012-01-01

    Twenty years ago, Hales and Barker along with their co-workers published some of their pioneering papers proposing the 'thrifty phenotype hypothesis' in Diabetologia (4;35:595-601 and 3;36:62-67). Their postulate that fetal programming could represent an important player in the origin of type 2...... of the underlying molecular mechanisms. Type 2 diabetes is a multiple-organ disease, and developmental programming, with its idea of organ plasticity, is a plausible hypothesis for a common basis for the widespread organ dysfunctions in type 2 diabetes and the metabolic syndrome. Only two among the 45 known type 2...

  9. Athlete's Heart: Is the Morganroth Hypothesis Obsolete?

    Science.gov (United States)

    Haykowsky, Mark J; Samuel, T Jake; Nelson, Michael D; La Gerche, Andre

    2018-05-01

    In 1975, Morganroth and colleagues reported that the increased left ventricular (LV) mass in highly trained endurance athletes versus nonathletes was primarily due to increased end-diastolic volume, while the increased LV mass in resistance-trained athletes was solely due to an increased LV wall thickness. Based on the divergent remodelling patterns observed, Morganroth and colleagues hypothesised that the increased "volume" load during endurance exercise may be similar to that which occurs in patients with mitral or aortic regurgitation, while the "pressure" load associated with performing a Valsalva manoeuvre (VM) during resistance exercise may mimic the stress imposed on the heart by systemic hypertension or aortic stenosis. Despite widespread acceptance of the four-decade-old Morganroth hypothesis in sports cardiology, some investigators have questioned whether such a divergent "athlete's heart" phenotype exists. Given this uncertainty, the purpose of this brief review is to re-evaluate the Morganroth hypothesis regarding: i) the acute effects of resistance exercise performed with a brief VM on LV wall stress, and the patterns of LV remodelling in resistance-trained athletes; ii) the acute effects of endurance exercise on biventricular wall stress, and the time course and pattern of LV and right ventricular (RV) remodelling with endurance training; and iii) the value of comparing "loading" conditions between athletes and patients with cardiac pathology. Copyright © 2018. Published by Elsevier B.V.

  10. Application of threshold concepts to ecological management problems: occupancy of Golden Eagles in Denali National Park, Alaska: Chapter 5

    Science.gov (United States)

    Eaton, Mitchell J.; Martin, Julien; Nichols, James D.; McIntyre, Carol; McCluskie, Maggie C.; Schmutz, Joel A.; Lubow, Bruce L.; Runge, Michael C.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    In this chapter, we demonstrate the application of the various classes of thresholds, detailed in earlier chapters and elsewhere, via an actual but simplified natural resource management case study. We intend our example to provide the reader with the ability to recognize and apply the theoretical concepts of utility, ecological and decision thresholds to management problems through a formalized decision-analytic process. Our case study concerns the management of human recreational activities in Alaska’s Denali National Park, USA, and the possible impacts of such activities on nesting Golden Eagles, Aquila chrysaetos. Managers desire to allow visitors the greatest amount of access to park lands, provided that eagle nesting-site occupancy is maintained at a level determined to be acceptable by the managers themselves. As these two management objectives are potentially at odds, we treat minimum desired occupancy level as a utility threshold which, then, serves to guide the selection of annual management alternatives in the decision process. As human disturbance is not the only factor influencing eagle occupancy, we model nesting-site dynamics as a function of both disturbance and prey availability. We incorporate uncertainty in these dynamics by considering several hypotheses, including a hypothesis that site occupancy is affected only at a threshold level of prey abundance (i.e., an ecological threshold effect). By considering competing management objectives and accounting for two forms of thresholds in the decision process, we are able to determine the optimal number of annual nesting-site restrictions that will produce the greatest long-term benefits for both eagles and humans. Setting a utility threshold of 75 occupied sites, out of a total of 90 potential nesting sites, the optimization specified a decision threshold at approximately 80 occupied sites. At the point that current occupancy falls below 80 sites, the recommended decision is to begin restricting

  11. Regional rainfall thresholds for landslide occurrence using a centenary database

    Science.gov (United States)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

    This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes the rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using the receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and potential regression perform well in ROC metrics. However, the intermediate thresholds based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold are much more informative as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.
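    The ROC-based threshold calibration described above can be sketched as follows. The rainfall amounts, event counts and distribution parameters are synthetic stand-ins (not the Lisbon data), and the threshold is chosen here by maximizing Youden's J = TPR − FPR, one common ROC criterion:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stand-in data: cumulated rainfall (mm) on days with and
# without a recorded landslide event; landslide days tend to be wetter.
rain_no_event = rng.gamma(2.0, 10.0, 500)
rain_event    = rng.gamma(6.0, 12.0, 60)
rain  = np.concatenate([rain_no_event, rain_event])
event = np.concatenate([np.zeros(500, bool), np.ones(60, bool)])

# Sweep candidate thresholds and score each with ROC metrics; keep the
# candidate maximizing Youden's J = TPR - FPR.
candidates = np.linspace(rain.min(), rain.max(), 200)
tpr = np.array([(rain[event] >= t).mean() for t in candidates])
fpr = np.array([(rain[~event] >= t).mean() for t in candidates])
j = tpr - fpr
best = candidates[j.argmax()]
print(f"calibrated threshold: {best:.1f} mm, "
      f"TPR={tpr[j.argmax()]:.2f}, FPR={fpr[j.argmax()]:.2f}")
```

    Intermediate probabilistic thresholds, as in the abstract, would follow by estimating the landslide probability for rainfall between the lower- and upper-limit thresholds.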

  12. Testing for a Debt-Threshold Effect on Output Growth.

    Science.gov (United States)

    Lee, Sokbae; Park, Hyunmin; Seo, Myung Hwan; Shin, Youngki

    2017-12-01

    Using the Reinhart-Rogoff dataset, we find a debt threshold not around 90 per cent but around 30 per cent, above which the median real gross domestic product (GDP) growth falls abruptly. Our work is the first to formally test for threshold effects in the relationship between public debt and median real GDP growth. The null hypothesis of no threshold effect is rejected at the 5 per cent significance level for most cases. While we find no evidence of a threshold around 90 per cent, our findings from the post-war sample suggest that the debt threshold for economic growth may exist around a relatively small debt-to-GDP ratio of 30 per cent. Furthermore, countries with debt-to-GDP ratios above 30 per cent have GDP growth that is 1 percentage point lower at the median.
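    A minimal sketch of the threshold-estimation idea: grid-search the break point that minimizes the residual sum of squares of a piecewise-constant fit. The data below are synthetic (a hypothetical mean growth shift at 30% debt-to-GDP), not the Reinhart-Rogoff dataset, and the simple two-regime mean model stands in for the authors' full threshold-regression test:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic debt-to-GDP (%) and median growth with a true break at 30%:
# growth averages 3.0 below the threshold and 2.0 above it.
debt = rng.uniform(0, 120, 400)
growth = np.where(debt < 30, 3.0, 2.0) + rng.normal(0, 0.7, 400)

def ssr_at(tau):
    """Residual sum of squares when fitting separate means on each side of tau."""
    lo, hi = growth[debt < tau], growth[debt >= tau]
    return ((lo - lo.mean()) ** 2).sum() + ((hi - hi.mean()) ** 2).sum()

# Grid-search the candidate threshold minimizing the SSR.
grid = np.linspace(10, 110, 201)
tau_hat = grid[np.argmin([ssr_at(t) for t in grid])]
print(f"estimated debt threshold: {tau_hat:.1f}% of GDP")
```

    A formal test of "no threshold effect", as in the abstract, would additionally compare the thresholded fit against a single-regime fit via a sup-type test statistic.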

  13. Low heat pain thresholds in migraineurs between attacks.

    Science.gov (United States)

    Schwedt, Todd J; Zuniga, Leslie; Chong, Catherine D

    2015-06-01

    Between attacks, migraine is associated with hypersensitivities to sensory stimuli. The objective of this study was to investigate hypersensitivity to pain in migraineurs between attacks. Cutaneous heat pain thresholds were measured in 112 migraineurs, migraine free for ≥ 48 hours, and 75 healthy controls. Pain thresholds at the head and at the arm were compared between migraineurs and controls using two-tailed t-tests. Among migraineurs, correlations between heat pain thresholds and headache frequency, allodynia symptom severity, and time interval until the next headache were calculated. Migraineurs had lower pain thresholds than controls at the head (43.9 ℃ ± 3.2 ℃ vs. 45.1 ℃ ± 3.0 ℃, p = 0.015) and at the arm (43.2 ℃ ± 3.4 ℃ vs. 44.8 ℃ ± 3.3 ℃). There were no correlations between pain thresholds and headache frequency or allodynia symptom severity. For the 41 migraineurs for whom time to next headache was known, there were positive correlations between time to next headache and pain thresholds at the head (r = 0.352, p = 0.024) and arm (r = 0.312, p = 0.047). This study provides evidence that migraineurs have low heat pain thresholds between migraine attacks. The mechanisms underlying these lower pain thresholds could also predispose migraineurs to their next migraine attack, a hypothesis supported by the positive correlations between pain thresholds and time to next migraine attack. © International Headache Society 2014.
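    The two analyses named in the abstract, a two-tailed two-sample t-test and Pearson correlations, can be sketched as below. The samples are synthetic draws mirroring only the reported head-threshold group means/SDs and group sizes; the correlation data are invented purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical heat pain thresholds (deg C) drawn to mirror the reported
# head values: migraineurs 43.9 +/- 3.2 (n=112), controls 45.1 +/- 3.0 (n=75).
migraineurs = rng.normal(43.9, 3.2, 112)
controls    = rng.normal(45.1, 3.0, 75)

# Two-tailed two-sample t-test comparing the groups.
t, p = stats.ttest_ind(migraineurs, controls)
print(f"t = {t:.2f}, p = {p:.4f}")

# Pearson correlation of threshold vs. time to next headache, on
# synthetic positively related data (n=41, as in the abstract).
days_to_next = 5 + 2.0 * (migraineurs[:41] - 43.9) + rng.normal(0, 3, 41)
r, p_r = stats.pearsonr(migraineurs[:41], days_to_next)
print(f"r = {r:.3f}, p = {p_r:.4f}")
```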

  14. Hadron production near threshold

    Indian Academy of Sciences (India)

    Abstract. Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and η and 3He are described rigorously. The Λ production is ...

  15. Casualties and threshold effects

    International Nuclear Information System (INIS)

    Mays, C.W.; National Cancer Inst., Bethesda

    1988-01-01

    Radiation effects like cancer are denoted as casualties. Other radiation effects occur in almost everyone when the radiation dose is sufficiently high; one then speaks of radiation effects with a threshold dose. In this article the author casts doubt on this classification of radiation effects, arguing that some effects of exposure to radiation do not fit into it. (H.W.). 19 refs.; 2 figs.; 1 tab

  16. Resonance phenomena near thresholds

    International Nuclear Information System (INIS)

    Persson, E.; Mueller, M.; Rotter, I.; Technische Univ. Dresden

    1995-12-01

    The trapping effect is investigated close to the elastic threshold. The nucleus is described as an open quantum mechanical many-body system embedded in the continuum of decay channels. An ensemble of compound nucleus states with both discrete and resonance states is investigated in an energy-dependent formalism. It is shown that the discrete states can trap the resonance ones and also that the discrete states can directly influence the scattering cross section. (orig.)

  17. Questioning the social intelligence hypothesis.

    Science.gov (United States)

    Holekamp, Kay E

    2007-02-01

    The social intelligence hypothesis posits that complex cognition and enlarged "executive brains" evolved in response to challenges that are associated with social complexity. This hypothesis has been well supported, but some recent data are inconsistent with its predictions. It is becoming increasingly clear that multiple selective agents, and non-selective constraints, must have acted to shape cognitive abilities in humans and other animals. The task now is to develop a larger theoretical framework that takes into account both inter-specific differences and similarities in cognition. This new framework should facilitate consideration of how selection pressures that are associated with sociality interact with those that are imposed by non-social forms of environmental complexity, and how both types of functional demands interact with phylogenetic and developmental constraints.

  18. Whiplash and the compensation hypothesis.

    Science.gov (United States)

    Spearing, Natalie M; Connelly, Luke B

    2011-12-01

    Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.

  19. Measurements of NN → dπ near threshold

    International Nuclear Information System (INIS)

    Hutcheon, D.A.

    1990-09-01

    New, precise measurements of the differential cross sections for np → dπ⁰ and π⁺d → pp and of analyzing powers for pp → dπ⁺ have been made at energies within 10 MeV (c.m.) of threshold. They allow the pion s-wave and p-wave parts of the production strength to be distinguished unambiguously, yielding an s-wave strength at threshold which is significantly smaller than the previously accepted value. There is no evidence for charge independence breaking nor for πNN resonances near threshold. (Author) (17 refs., 17 figs., tab.)

  20. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in excitation function of nuclear threshold effects. The intermediate state supporting both neutron strength function and nuclear threshold effect is a micro-giant neutron threshold state. (author)

  1. Coloring geographical threshold graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Percus, Allon [Los Alamos National Laboratory; Muller, Tobias [EINDHOVEN UNIV. OF TECH

    2008-01-01

    We propose a coloring algorithm for sparse random graphs generated by the geographical threshold graph (GTG) model, a generalization of random geometric graphs (RGG). In a GTG, nodes are distributed in a Euclidean space, and edges are assigned according to a threshold function involving the distance between nodes as well as randomly chosen node weights. The motivation for analyzing this model is that many real networks (e.g., wireless networks, the Internet, etc.) need to be studied by using a 'richer' stochastic model (which in this case includes both a distance between nodes and weights on the nodes). Here, we analyze the GTG coloring algorithm together with the graph's clique number, showing formally that in spite of the differences in structure between GTG and RGG, the asymptotic behavior of the chromatic number is identical: χ ∼ ln n / ln ln n (1 + o(1)). Finally, we consider the leading corrections to this expression, again using the coloring algorithm and clique number to provide bounds on the chromatic number. We show that the gap between the lower and upper bound is within C ln n / (ln ln n)², and specify the constant C.
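    A minimal sketch of the GTG construction and a greedy coloring. The edge rule below, connecting i and j when (w_i + w_j)/dist(i,j)² ≥ θ, is one common variant of the threshold function; the size n, the threshold θ, the exponential weight distribution and the descending-weight coloring order are all illustrative assumptions, not the paper's algorithm:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n, theta = 60, 40.0  # hypothetical graph size and threshold parameter

pos = rng.uniform(0, 1, (n, 2))  # node positions in the unit square
w = rng.exponential(1.0, n)      # random node weights

# GTG edge rule (one common variant): connect i, j when
# (w_i + w_j) / dist(i, j)^2 >= theta.
adj = {i: set() for i in range(n)}
for i, j in combinations(range(n), 2):
    d2 = ((pos[i] - pos[j]) ** 2).sum()
    if d2 > 0 and (w[i] + w[j]) / d2 >= theta:
        adj[i].add(j)
        adj[j].add(i)

# Greedy coloring in descending-weight order (a simple heuristic):
# give each node the smallest color unused by its neighbors.
color = {}
for v in sorted(range(n), key=lambda v: -w[v]):
    used = {color[u] for u in adj[v] if u in color}
    color[v] = next(c for c in range(n) if c not in used)

assert all(color[i] != color[j] for i in adj for j in adj[i])  # proper coloring
print("colors used:", max(color.values()) + 1)
```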

  2. A Molecular–Structure Hypothesis

    Directory of Open Access Journals (Sweden)

    Jan C. A. Boeyens

    2010-11-01

    The self-similar symmetry that occurs between atomic nuclei, biological growth structures, the solar system, globular clusters and spiral galaxies suggests that a similar pattern should characterize atomic and molecular structures. This possibility is explored in terms of the current molecular structure hypothesis and its extension into four-dimensional space-time. It is concluded that a quantum molecule only has structure in four dimensions and that classical (Newtonian) structure, which occurs in three dimensions, cannot be simulated by quantum-chemical computation.

  3. Antiaging therapy: a prospective hypothesis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: This hypothesis proposes a new prospective approach to slowing the aging process in older humans. The hypothesis could lead to new treatments for age-related illnesses and help humans live longer. This hypothesis has no previous documentation in the scientific media and has no protocol. Scientists have presented evidence that systemic aging is influenced by peculiar molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered an elevated titer of aging-related molecules (ARMs) in blood, which trigger a cascade of aging processes in mice; they also indicated that the process can be reduced or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translating these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective "antiaging blood filtration column" (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on the contact surfaces of the reaction platforms inside the AABFC till near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  4. Is PMI the Hypothesis or the Null Hypothesis?

    Science.gov (United States)

    Tarone, Aaron M; Sanford, Michelle R

    2017-09-01

    Over the past several decades, there have been several strident exchanges regarding whether forensic entomologists estimate the postmortem interval (PMI), minimum PMI, or something else. During that time, there has been a proliferation of terminology reflecting this concern regarding "what we do." This has been a frustrating conversation for some in the community because much of this debate appears to be centered on what assumptions are acknowledged directly and which are embedded within a list of assumptions (or ignored altogether) in the literature and in case reports. An additional component of the conversation centers on a concern that moving away from the use of certain terminology like PMI acknowledges limitations and problems that would make the application of entomology appear less useful in court, a problem for lawyers, but one that should not be problematic for scientists in the forensic entomology community, as uncertainty is part of science that should and can be presented effectively in the courtroom (e.g., population genetic concepts in forensics). Unfortunately, a consequence of the way this conversation is conducted is that even as all involved in the debate acknowledge the concerns of their colleagues, parties continue to talk past one another advocating their preferred terminology. Progress will not be made until the community recognizes that all of the terms under consideration take the form of null hypothesis statements and that thinking about "what we do" as a null hypothesis has useful legal and scientific ramifications that transcend arguments over the usage of preferred terminology. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved.

  5. The Stoichiometric Divisome: A Hypothesis

    Directory of Open Access Journals (Sweden)

    Waldemar eVollmer

    2015-05-01

    Dividing Escherichia coli cells simultaneously constrict the inner membrane, peptidoglycan layer and outer membrane to synthesize the new poles of the daughter cells. For this, more than 30 proteins localize to mid-cell where they form a large, ring-like assembly, the divisome, facilitating division. Although the precise function of most divisome proteins is unknown, it became apparent in recent years that dynamic protein-protein interactions are essential for divisome assembly and function. However, little is known about the nature of the interactions involved and the stoichiometry of the proteins within the divisome. A recent study (Li et al., 2014) used ribosome profiling to measure the absolute protein synthesis rates in E. coli. Interestingly, it observed that most proteins which participate in known multiprotein complexes are synthesized in proportion to their stoichiometry. Based on this principle we present a hypothesis for the stoichiometry of the core of the divisome, taking into account known protein-protein interactions. From this hypothesis we infer a possible mechanism for PG synthesis during division.

  6. Why women apologize more than men: gender differences in thresholds for perceiving offensive behavior.

    Science.gov (United States)

    Schumann, Karina; Ross, Michael

    2010-11-01

    Despite wide acceptance of the stereotype that women apologize more readily than men, there is little systematic evidence to support this stereotype or its supposed bases (e.g., men's fragile egos). We designed two studies to examine whether gender differences in apology behavior exist and, if so, why. In Study 1, participants reported in daily diaries all offenses they committed or experienced and whether an apology had been offered. Women reported offering more apologies than men, but they also reported committing more offenses. There was no gender difference in the proportion of offenses that prompted apologies. This finding suggests that men apologize less frequently than women because they have a higher threshold for what constitutes offensive behavior. In Study 2, we tested this threshold hypothesis by asking participants to evaluate both imaginary and recalled offenses. As predicted, men rated the offenses as less severe than women did. These different ratings of severity predicted both judgments of whether an apology was deserved and actual apology behavior.

  7. Crossing the Petawatt threshold

    International Nuclear Information System (INIS)

    Perry, M.

    1996-01-01

    A revolutionary new laser called the Petawatt, developed by Lawrence Livermore researchers after an intensive three-year development effort, has produced more than 1,000 trillion ("peta") watts of power, a world record. By crossing the petawatt threshold, the extraordinarily powerful laser heralds a new age in laser research. Lasers that provide a petawatt of power or more in a picosecond may make it possible to achieve fusion using significantly less energy than currently envisioned, through a novel Livermore concept called "fast ignition." The petawatt laser will also enable researchers to study the fundamental properties of matter, thereby aiding the Department of Energy's Stockpile Stewardship efforts and opening entirely new physical regimes to study. The technology developed for the Petawatt has also provided several spinoff technologies, including a new approach to laser material processing

  8. Responsible technology acceptance

    DEFF Research Database (Denmark)

    Toft, Madeleine Broman; Schuitema, Geertje; Thøgersen, John

    2014-01-01

    As a response to climate change and the desire to gain independence from imported fossil fuels, there is a pressure to increase the proportion of electricity from renewable sources which is one of the reasons why electricity grids are currently being turned into Smart Grids. In this paper, we focus...... on private consumers’ acceptance of having Smart Grid technology installed in their home. We analyse acceptance in a combined framework of the Technology Acceptance Model and the Norm Activation Model. We propose that individuals are only likely to accept Smart Grid technology if they assess usefulness...... in terms of a positive impact for society and the environment. Therefore, we expect that Smart Grid technology acceptance can be better explained when the well-known technology acceptance parameters included in the Technology Acceptance Model are supplemented by moral norms as suggested by the Norm...

  9. Rationality, practice variation and person-centred health policy: a threshold hypothesis

    OpenAIRE

    Djulbegovic, Benjamin; Hamm, Robert M.; Mayrhofer, Thomas; Hozo, Iztok; Van den Ende, Jef

    2015-01-01

    Abstract Variation in the practice of medicine is one of the major health policy issues of today. Ultimately, it is related to physicians' decision making. Similar patients with similar likelihoods of having disease are often managed by different doctors differently: some doctors may elect to observe the patient, others decide to act based on diagnostic testing, and yet others may elect to treat without testing. We explain these differences in practice by differences in disease probability threshol...

  10. Should the Equilibrium Point Hypothesis (EPH) be Considered a Scientific Theory?

    OpenAIRE

    Sainburg, Robert L.

    2014-01-01

    The purpose of this commentary is to discuss factors that limit consideration of the equilibrium point hypothesis as a scientific theory. The EPH describes control of motor neuron threshold through the variable lambda, which corresponds to a unique referent configuration for a muscle, joint, or combination of joints. One of the most compelling features of the equilibrium point hypothesis is the integration of posture and movement control into a single mechanism. While the essential core of th...

  11. The Stem Cell Hypothesis of Aging

    Directory of Open Access Journals (Sweden)

    Anna Meiliana

    2010-04-01

    BACKGROUND: There is probably no single way to age. Indeed, so far there is no single accepted explanation or mechanism of aging (although more than 300 theories have been proposed). There is an overall decline in tissue regenerative potential with age, and the question arises as to whether this is due to the intrinsic aging of stem cells or rather to the impairment of stem cell function in the aged tissue environment. CONTENT: Recent data suggest that we age, in part, because our self-renewing stem cells grow old as a result of heritable intrinsic events, such as DNA damage, as well as extrinsic forces, such as changes in their supporting niches. Mechanisms that suppress the development of cancer, such as senescence and apoptosis, which rely on telomere shortening and the activities of p53 and p16INK4a, may also induce an unwanted consequence: a decline in the replicative function of certain stem cell types with advancing age. This decreased regenerative capacity appears to point to the stem cell hypothesis of aging. SUMMARY: Recent evidence suggests that we grow old partly because our stem cells grow old as a result of mechanisms that suppress the development of cancer over a lifetime. We believe that a further, more precise mechanistic understanding of this process will be required before this knowledge can be translated into human anti-aging therapies. KEYWORDS: stem cells, senescence, telomere, DNA damage, epigenetic, aging.

  12. Hemispheric lateralization of motor thresholds in relation to stuttering.

    Directory of Open Access Journals (Sweden)

    Per A Alm

    Stuttering is a complex speech disorder. Previous studies indicate a tendency towards elevated motor threshold for the left hemisphere, as measured using transcranial magnetic stimulation (TMS). This may reflect a monohemispheric motor system impairment. The purpose of the study was to investigate the relative side-to-side difference (asymmetry) and the absolute levels of motor threshold for the hand area, using TMS in adults who stutter (n = 15) and in controls (n = 15). In accordance with the hypothesis, the groups differed significantly regarding the relative side-to-side difference of finger motor threshold (p = 0.0026), with the stuttering group showing higher motor threshold of the left hemisphere in relation to the right. Also the absolute level of the finger motor threshold for the left hemisphere differed between the groups (p = 0.049). The obtained results, together with previous investigations, provide support for the hypothesis that stuttering tends to be related to left hemisphere motor impairment, and possibly to a dysfunctional state of bilateral speech motor control.

  13. Hemispheric Lateralization of Motor Thresholds in Relation to Stuttering

    Science.gov (United States)

    Alm, Per A.; Karlsson, Ragnhild; Sundberg, Madeleine; Axelson, Hans W.

    2013-01-01

    Stuttering is a complex speech disorder. Previous studies indicate a tendency towards elevated motor threshold for the left hemisphere, as measured using transcranial magnetic stimulation (TMS). This may reflect a monohemispheric motor system impairment. The purpose of the study was to investigate the relative side-to-side difference (asymmetry) and the absolute levels of motor threshold for the hand area, using TMS in adults who stutter (n = 15) and in controls (n = 15). In accordance with the hypothesis, the groups differed significantly regarding the relative side-to-side difference of finger motor threshold (p = 0.0026), with the stuttering group showing higher motor threshold of the left hemisphere in relation to the right. Also the absolute level of the finger motor threshold for the left hemisphere differed between the groups (p = 0.049). The obtained results, together with previous investigations, provide support for the hypothesis that stuttering tends to be related to left hemisphere motor impairment, and possibly to a dysfunctional state of bilateral speech motor control. PMID:24146930

  14. Crossing the threshold

    Science.gov (United States)

    Bush, John; Tambasco, Lucas

    2017-11-01

    First, we summarize the circumstances in which chaotic pilot-wave dynamics gives rise to quantum-like statistical behavior. For "closed" systems, in which the droplet is confined to a finite domain either by boundaries or applied forces, quantum-like features arise when the persistence time of the waves exceeds the time required for the droplet to cross its domain. Second, motivated by the similarities between this hydrodynamic system and stochastic electrodynamics, we examine the behavior of a bouncing droplet above the Faraday threshold, where a stochastic element is introduced into the drop dynamics by virtue of its interaction with a background Faraday wave field. With a view to extending the dynamical range of pilot-wave systems to capture more quantum-like features, we consider a generalized theoretical framework for stochastic pilot-wave dynamics in which the relative magnitudes of the drop-generated pilot-wave field and a stochastic background field may be varied continuously. We gratefully acknowledge the financial support of the NSF through their CMMI and DMS divisions.

  15. Albania - Thresholds I and II

    Data.gov (United States)

    Millennium Challenge Corporation — From 2006 to 2011, the government of Albania (GOA) received two Millennium Challenge Corporation (MCC) Threshold Programs totaling $29.6 million. Albania received...

  16. Memory in astrocytes: a hypothesis

    Directory of Open Access Journals (Sweden)

    Caudle Robert M

    2006-01-01

    Background: Recent work has indicated an increasingly complex role for astrocytes in the central nervous system. Astrocytes are now known to exchange information with neurons at synaptic junctions and to alter the information processing capabilities of the neurons. As an extension of this trend a hypothesis was proposed that astrocytes function to store information. To explore this idea the ion channels in biological membranes were compared to models known as cellular automata. These comparisons were made to test the hypothesis that ion channels in the membranes of astrocytes form a dynamic information storage device. Results: Two dimensional cellular automata were found to behave similarly to ion channels in a membrane when they function at the boundary between order and chaos. The length of time information is stored in this class of cellular automata is exponentially related to the number of units. Therefore the length of time biological ion channels store information was plotted versus the estimated number of ion channels in the tissue. This analysis indicates that there is an exponential relationship between memory and the number of ion channels. Extrapolation of this relationship to the estimated number of ion channels in the astrocytes of a human brain indicates that memory can be stored in this system for an entire life span. Interestingly, this information is not affixed to any physical structure, but is stored as an organization of the activity of the ion channels. Further analysis of two dimensional cellular automata also demonstrates that these systems have both associative and temporal memory capabilities. Conclusion: It is concluded that astrocytes may serve as a dynamic information sink for neurons. The memory in the astrocytes is stored by organizing the activity of ion channels and is not associated with a physical location such as a synapse. In order for this form of memory to be of significant duration it is necessary
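    The record's central claim is that a two-dimensional cellular automaton can hold an imprint of its input for a long time. The toy experiment below (an illustrative sketch, not the authors' model; Conway's Game of Life merely stands in for an automaton near the order/chaos boundary, and both function names are invented here) flips one cell of a random initial state and counts how many steps the perturbed and unperturbed automata stay distinguishable:

```python
import random

def life_step(grid):
    """One synchronous update of Conway's Game of Life on a toroidal grid,
    used here only as a convenient two-dimensional cellular automaton."""
    n = len(grid)
    nxt = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            live = sum(grid[(i + di) % n][(j + dj) % n]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1)
                       if (di, dj) != (0, 0))
            nxt[i][j] = 1 if live == 3 or (grid[i][j] and live == 2) else 0
    return nxt

def remembered_for(n=16, steps=200, seed=7):
    """Proxy for the abstract's storage time: flip a single cell (the stored
    'bit') and count the steps until the two automata become identical."""
    rng = random.Random(seed)
    a = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
    b = [row[:] for row in a]
    b[0][0] ^= 1                       # the single stored "bit"
    for t in range(1, steps + 1):
        a, b = life_step(a), life_step(b)
        if a == b:
            return t                   # imprint has been erased
    return steps                       # still distinguishable at the horizon
```

    Repeating the measurement over grid sizes is how one would probe the exponential size/duration relationship the abstract describes.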

  17. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation and the theory of robust distributed detection is extended to classes of distributions, which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...
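    The sequential probability ratio tests mentioned in the abstract can be illustrated with Wald's classical (non-robust) SPRT for a Bernoulli parameter. This is a textbook sketch, not taken from the book (whose robust variants replace the likelihood ratio with minimax-robust versions); the function name and default parameters are illustrative:

```python
import math

def sprt_bernoulli(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on 0/1 observations.
    Accumulates the log-likelihood ratio and stops at the first boundary
    crossing; returns ('H0'|'H1'|'undecided', samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # per-sample log-likelihood ratio log(P(x|H1) / P(x|H0))
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# A run of successes drives the test to H1 well before all data are seen.
decision, n_used = sprt_bernoulli([1, 1, 1, 1, 1, 1, 1])
```

    The appeal of the sequential form, which the book's robustness results build on, is that the expected sample size at a decision is typically much smaller than that of a fixed-sample test with the same error rates.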

  18. The venom optimization hypothesis revisited.

    Science.gov (United States)

    Morgenstern, David; King, Glenn F

    2013-03-01

    Animal venoms are complex chemical mixtures that typically contain hundreds of proteins and non-proteinaceous compounds, resulting in a potent weapon for prey immobilization and predator deterrence. However, because venoms are protein-rich, they come with a high metabolic price tag. The metabolic cost of venom is sufficiently high to result in secondary loss of venom whenever its use becomes non-essential to survival of the animal. The high metabolic cost of venom leads to the prediction that venomous animals may have evolved strategies for minimizing venom expenditure. Indeed, various behaviors have been identified that appear consistent with frugality of venom use. This has led to formulation of the "venom optimization hypothesis" (Wigger et al. (2002) Toxicon 40, 749-752), also known as "venom metering", which postulates that venom is metabolically expensive and therefore used frugally through behavioral control. Here, we review the available data concerning economy of venom use by animals with either ancient or more recently evolved venom systems. We conclude that the convergent nature of the evidence in multiple taxa strongly suggests the existence of evolutionary pressures favoring frugal use of venom. However, there remains an unresolved dichotomy between this economy of venom use and the lavish biochemical complexity of venom, which includes a high degree of functional redundancy. We discuss the evidence for biochemical optimization of venom as a means of resolving this conundrum. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Alien abduction: a medical hypothesis.

    Science.gov (United States)

    Forrest, David V

    2008-01-01

    In response to a new psychological study of persons who believe they have been abducted by space aliens that found that sleep paralysis, a history of being hypnotized, and preoccupation with the paranormal and extraterrestrial were predisposing experiences, I noted that many of the frequently reported particulars of the abduction experience bear more than a passing resemblance to medical-surgical procedures and propose that experience with these may also be contributory. There is the altered state of consciousness, uniformly colored figures with prominent eyes, in a high-tech room under a round bright saucerlike object; there is nakedness, pain and a loss of control while the body's boundaries are being probed; and yet the figures are thought benevolent. No medical-surgical history was apparently taken in the above mentioned study, but psychological laboratory work evaluated false memory formation. I discuss problems in assessing intraoperative awareness and ways in which the medical hypothesis could be elaborated and tested. If physicians are causing this syndrome in a percentage of patients, we should know about it; and persons who feel they have been abducted should be encouraged to inform their surgeons and anesthesiologists without challenging their beliefs.

  20. Extraction of airway trees using multiple hypothesis tracking and template matching

    DEFF Research Database (Denmark)

    Raghavendra, Selvan; Petersen, Jens; Pedersen, Jesper Johannes Holst

    2016-01-01

    Knowledge of airway tree morphology has important clinical applications in diagnosis of chronic obstructive pulmonary disease. We present an automatic tree extraction method based on multiple hypothesis tracking and template matching for this purpose and evaluate its performance on chest CT images... used in constructing a multiple hypothesis tree, which is then traversed to reach decisions. The proposed modifications remove the need for local thresholding of hypotheses as decisions are made entirely based on statistical comparisons involving the hypothesis tree. The results show improvements...

  1. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  2. Developing thresholds of potential concern for invasive alien species: Hypotheses and concepts

    Directory of Open Access Journals (Sweden)

    Llewellyn C. Foxcroft

    2009-03-01

    Conservation implication: In accepting that species and systems are variable, and that flux is inevitable and desirable, these TPCs, developed specifically for invasive alien species, provide end points against which monitoring can be assessed. Once a threshold is reached, the cause of the threshold being exceeded is examined and management interventions are recommended.

  3. [Psychodynamic hypothesis about suicidality in elderly men].

    Science.gov (United States)

    Lindner, Reinhard

    2010-08-01

    Old men are overrepresented among suicides as a whole. In contrast, only very few elderly men find their way to specialised treatment facilities, and the elderly accept psychotherapy more rarely than younger persons. Presentations on the psychodynamics of suicidality in old men are therefore rare and mostly casuistical. By means of a stepwise reconstructable qualitative case comparison of five randomly chosen elderly suicidal men with ideal types of suicidal (younger) men concerning biography, suicidal symptoms and transference, psychodynamic hypotheses about suicidality in elderly men are developed. All patients came into psychotherapy in a specialised academic out-patient clinic for psychodynamic treatment of acute and chronic suicidality. The five elderly suicidal men predominantly were living in long-term, conflict-laden sexual relationships and also had ambivalent relationships with their children. Suicidality in old age refers to lifelong intrapsychic conflicts concerning (male) identity, self-esteem and a core conflict between fusion and separation wishes. The body takes a central role in suicidal experience, being a defensive instance modified by age and/or physical illness, which brings to consciousness aggressive and envious impulses, but also feelings of emptiness and insecurity, which have to be warded off again by projection into the body. In the transference relationship there is, on the one hand, the regular transference and, on the other, an age-specific reversed transference, each with its countertransference reactions. The chosen methodological approach serves the systematic generation of hypotheses with a higher degree of evidence than hypotheses generated from single case studies. Georg Thieme Verlag KG Stuttgart - New York.

  4. High-frequency (8 to 16 kHz) reference thresholds and intrasubject threshold variability relative to ototoxicity criteria using a Sennheiser HDA 200 earphone.

    Science.gov (United States)

    Frank, T

    2001-04-01

    The first purpose of this study was to determine high-frequency (8 to 16 kHz) thresholds for standardizing reference equivalent threshold sound pressure levels (RETSPLs) for a Sennheiser HDA 200 earphone. The second and perhaps more important purpose of this study was to determine whether repeated high-frequency thresholds using a Sennheiser HDA 200 earphone had a lower intrasubject threshold variability than the ASHA 1994 significant threshold shift criteria for ototoxicity. High-frequency thresholds (8 to 16 kHz) were obtained for 100 (50 male, 50 female) normally hearing (0.25 to 8 kHz) young adults (mean age of 21.2 yr) in four separate test sessions using a Sennheiser HDA 200 earphone. The mean and median high-frequency thresholds were similar for each test session and increased as frequency increased. At each frequency, the high-frequency thresholds were not significantly (p > 0.05) different for gender, test ear, or test session. The median thresholds at each frequency were similar to the 1998 interim ISO RETSPLs; however, large standard deviations and wide threshold distributions indicated very high intersubject threshold variability, especially at 14 and 16 kHz. Threshold repeatability was determined by finding the threshold differences between each possible test session comparison (N = 6). About 98% of all of the threshold differences were within a clinically acceptable range of +/-10 dB from 8 to 14 kHz. The threshold differences between each subject's second, third, and fourth minus their first test session were also found to determine whether intrasubject threshold variability was less than the ASHA 1994 criteria for determining a significant threshold shift due to ototoxicity. The results indicated a false-positive rate of 0% for a threshold shift > or = 20 dB at any frequency and a false-positive rate of 2% for a threshold shift >10 dB at two consecutive frequencies. This study verified that the output of high-frequency audiometers at 0 dB HL using
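    The two shift criteria the abstract attributes to ASHA (1994) can be encoded directly. The sketch below is illustrative (the function name and the example frequency grid are assumptions, not taken from the study): a retest is flagged when any frequency shifts by 20 dB or more, or when two consecutive frequencies each shift by more than 10 dB.

```python
def significant_shift(baseline, retest):
    """Flag a significant ototoxic threshold shift per the two ASHA (1994)
    criteria cited in the abstract. `baseline` and `retest` are threshold
    lists in dB HL, ordered by test frequency (e.g. 8, 10, 12.5, 14 kHz);
    positive shifts mean worse hearing at retest."""
    shifts = [r - b for b, r in zip(baseline, retest)]
    if any(s >= 20 for s in shifts):                 # >= 20 dB at any frequency
        return True
    # > 10 dB at two consecutive frequencies
    return any(a > 10 and b > 10 for a, b in zip(shifts, shifts[1:]))

# A 15 dB shift at two consecutive frequencies is flagged; the same 15 dB
# shift at an isolated frequency is not.
flagged = significant_shift([10, 10, 15, 20], [12, 25, 30, 22])    # True
not_flagged = significant_shift([10, 10, 15, 20], [12, 25, 15, 22])  # False
```

    The study's reported false-positive rates (0% and 2%) are simply the fraction of normal-hearing retests that such a rule would flag despite no ototoxic exposure.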

  5. Testing the gravitational instability hypothesis?

    Science.gov (United States)

    Babul, Arif; Weinberg, David H.; Dekel, Avishai; Ostriker, Jeremiah P.

    1994-01-01

    We challenge a widely accepted assumption of observational cosmology: that successful reconstruction of observed galaxy density fields from measured galaxy velocity fields (or vice versa), using the methods of gravitational instability theory, implies that the observed large-scale structures and large-scale flows were produced by the action of gravity. This assumption is false, in that there exist nongravitational theories that pass the reconstruction tests and gravitational theories with certain forms of biased galaxy formation that fail them. Gravitational instability theory predicts specific correlations between large-scale velocity and mass density fields, but the same correlations arise in any model where (a) structures in the galaxy distribution grow from homogeneous initial conditions in a way that satisfies the continuity equation, and (b) the present-day velocity field is irrotational and proportional to the time-averaged velocity field. We demonstrate these assertions using analytical arguments and N-body simulations. If large-scale structure is formed by gravitational instability, then the ratio of the galaxy density contrast to the divergence of the velocity field yields an estimate of the density parameter Omega (or, more generally, an estimate of beta ≡ Omega^0.6/b, where b is an assumed constant of proportionality between galaxy and mass density fluctuations). In nongravitational scenarios, the values of Omega or beta estimated in this way may fail to represent the true cosmological values. However, even if nongravitational forces initiate and shape the growth of structure, gravitationally induced accelerations can dominate the velocity field at late times, long after the action of any nongravitational impulses. The estimated beta approaches the true value in such cases, and in our numerical simulations the estimated beta values are reasonably accurate for both gravitational and nongravitational models. Reconstruction tests

  6. ‘Soglitude’- introducing a method of thinking thresholds

    Directory of Open Access Journals (Sweden)

    Tatjana Barazon

    2010-04-01

    philosophical, artistic or scientific, it tends to free itself from rigid or fixed models and accepts change and development as the fundamental nature of things. Thinking thresholds as a method of thought progress cannot be done in a single process and therefore asks for participation in its proper nature. The soglitude springs namely from the acceptance of a multitude of points of view, as it is shown by the numerous contributions we present in this issue ‘Seuils, Thresholds, Soglitudes’ of Conserveries mémorielles.

  7. Music effect on pain threshold evaluated with current perception threshold

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    AIM: Music relieves anxiety and psychotic tension. This effect of music is applied to surgical operations in the hospital and dental office. It is still unclear whether this music effect is limited to the psychological aspect or extends to the physical aspect, and whether it is influenced by the mood or emotion of the audience. To elucidate these issues, we evaluated the music effect on pain threshold by current perception threshold (CPT) and the profile of mood states (POMS) test. METHODS: Thirty healthy subjects (12 men, 18 women, 25-49 years old, mean age 34.9) were tested. (1) After the POMS test, all subjects were evaluated for pain threshold with CPT by Neurometer (Radionics, USA) under 6 conditions: silence, listening to slow tempo classical music, nursery music, hard rock music, classical piano music and relaxation music, with 30 second intervals. (2) After the Stroop color word test as the stressor, pain threshold was evaluated with CPT under 2 conditions: silence and listening to slow tempo classical music. RESULTS: While listening to music, CPT scores increased, especially at the 2,000 Hz level related to compression, warmth and pain sensation. Type of music, preference for music and stress also affected CPT scores. CONCLUSION: The present study demonstrated that concentration on music raises the pain threshold and that stress and mood influence the music effect on pain threshold.

  8. Acceptance procedures: Microfilm printer

    Science.gov (United States)

    Lockwood, H. E.

    1973-01-01

    Acceptance tests were made for a special order automatic additive color microfilm printer. Tests include film capacity, film transport, resolution, illumination uniformity, exposure range checks, and color cuing considerations.

  9. On risks and acceptability

    International Nuclear Information System (INIS)

    Watson, S.R.

    1981-01-01

    A very attractive notion is that it should be possible not only to determine how much risk is associated with any particular activity, but also to determine if that risk is acceptable. Stated boldly this seems an entirely unobjectionable and indeed a very acceptable notion. There is, however, underlying this idea, a mistaken view of risk which we might refer to as the "phlogiston" theory of risk. In this paper, presented at the SRP meeting on Ethical and Legal Aspects of Radiological Protection, the phlogiston theory of risk is described; secondly, it will be argued that it is too simple a theory to be realistic or useful; and thirdly, the management of risk will be placed in a wider decision framework. Acceptability, it will be argued, is highly dependent on context, and it is not possible, therefore, to lay down generally applicable notions of acceptability. (author)

  10. Acceptable noise level

    DEFF Research Database (Denmark)

    Olsen, Steen Østergaard; Nielsen, Lars Holme; Lantz, Johannes

    2012-01-01

    The acceptable noise level (ANL) is used to quantify the amount of background noise that subjects can accept while listening to speech, and is suggested for prediction of individual hearing-aid use. The aim of this study was to assess the repeatability of the ANL measured in normal-hearing subjects...... using running Danish and non-semantic speech materials as stimuli and modulated speech-spectrum and multi-talker babble noises as competing stimuli....

  11. Operations Acceptance Management

    OpenAIRE

    Suchá, Ivana

    2010-01-01

    This paper examines the process of Operations Acceptance Management, whose main task is to control Operations Acceptance Tests (OAT). In the first part the author focuses on the theoretical ground for the problem in the context of ITSM best practices framework ITIL. Benefits, process pitfalls and possibilities for automation are discussed in this part. The second part contains a case study of DHL IT Services (Prague), where a solution optimizing the overall workflow was implemented using simp...

  13. Cochlear neuropathy and the coding of supra-threshold sound.

    Science.gov (United States)

    Bharadwaj, Hari M; Verhulst, Sarah; Shaheen, Luke; Liberman, M Charles; Shinn-Cunningham, Barbara G

    2014-01-01

    Many listeners with hearing thresholds within the clinically normal range nonetheless complain of difficulty hearing in everyday settings and understanding speech in noise. Converging evidence from human and animal studies points to one potential source of such difficulties: differences in the fidelity with which supra-threshold sound is encoded in the early portions of the auditory pathway. Measures of auditory subcortical steady-state responses (SSSRs) in humans and animals support the idea that the temporal precision of the early auditory representation can be poor even when hearing thresholds are normal. In humans with normal hearing thresholds (NHTs), paradigms that require listeners to make use of the detailed spectro-temporal structure of supra-threshold sound, such as selective attention and discrimination of frequency modulation (FM), reveal individual differences that correlate with subcortical temporal coding precision. Animal studies show that noise exposure and aging can cause a loss of a large percentage of auditory nerve fibers (ANFs) without any significant change in measured audiograms. Here, we argue that cochlear neuropathy may reduce encoding precision of supra-threshold sound, and that this manifests both behaviorally and in SSSRs in humans. Furthermore, recent studies suggest that noise-induced neuropathy may be selective for higher-threshold, lower-spontaneous-rate nerve fibers. Based on our hypothesis, we suggest some approaches that may yield particularly sensitive, objective measures of supra-threshold coding deficits that arise due to neuropathy. Finally, we comment on the potential clinical significance of these ideas and identify areas for future investigation.

  14. Cochlear Neuropathy and the Coding of Supra-threshold Sound

    Directory of Open Access Journals (Sweden)

    Hari M Bharadwaj

    2014-02-01

    Many listeners with hearing thresholds within the clinically normal range nonetheless complain of difficulty hearing in everyday settings and understanding speech in noise. Converging evidence from human and animal studies points to one potential source of such difficulties: differences in the fidelity with which supra-threshold sound is encoded in the early portions of the auditory pathway. Measures of auditory subcortical steady-state responses in humans and animals support the idea that the temporal precision of the early auditory representation can be poor even when hearing thresholds are normal. In humans with normal hearing thresholds, behavioral ability in paradigms that require listeners to make use of the detailed spectro-temporal structure of supra-threshold sound, such as selective attention and discrimination of frequency modulation, correlate with subcortical temporal coding precision. Animal studies show that noise exposure and aging can cause a loss of a large percentage of auditory nerve fibers without any significant change in measured audiograms. Here, we argue that cochlear neuropathy may reduce encoding precision of supra-threshold sound, and that this manifests both behaviorally and in subcortical steady-state responses in humans. Furthermore, recent studies suggest that noise-induced neuropathy may be selective for higher-threshold, lower-spontaneous-rate nerve fibers. Based on our hypothesis, we suggest some approaches that may yield particularly sensitive, objective measures of supra-threshold coding deficits that arise due to neuropathy. Finally, we comment on the potential clinical significance of these ideas and identify areas for future investigation.

  15. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  16. Validity of Linder Hypothesis in Bric Countries

    Directory of Open Access Journals (Sweden)

    Rana Atabay

    2016-03-01

    In this study, the theory of similarity in preferences (Linder hypothesis) is introduced, and trade among the BRIC countries is examined to determine whether it is consistent with this hypothesis. Using data for the period 1996-2010, the study applies panel data analysis in order to provide evidence regarding the empirical validity of the Linder hypothesis for BRIC countries' international trade. Empirical findings show that the trade between BRIC countries supports the Linder hypothesis.

  17. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusions would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among the states remain real. Among the military or national security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives for all of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  18. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  19. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  20. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based...

  1. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  2. 1991 Acceptance priority ranking

    International Nuclear Information System (INIS)

    1991-12-01

    The Standard Contract for Disposal of Spent Nuclear Fuel and/or High-Level Radioactive Waste (10 CFR Part 961) that the Department of Energy (DOE) has executed with the owners and generators of civilian spent nuclear fuel requires annual publication of the Acceptance Priority Ranking (APR). The 1991 APR details the order in which DOE will allocate Federal waste acceptance capacity. As required by the Standard Contract, the ranking is based on the age of permanently discharged spent nuclear fuel (SNF), with the owners of the oldest SNF, on an industry-wide basis, given the highest priority. The 1991 APR will be the basis for the annual allocation of waste acceptance capacity to the Purchasers in the 1991 Annual Capacity Report (ACR), to be issued later this year. This document is based on SNF discharges as of December 31, 1990, and reflects Purchaser comments and corrections, as appropriate, to the draft APR issued on May 15, 1991.

  3. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
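    The abstract's reduced test (the likelihood ratio test collapsing to a check on the empirical variance of the predictive random variable) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the variance threshold, and the Gaussian toy samples are assumptions introduced here.

```python
import random
import statistics

def large_error_flag(samples, var_threshold):
    """Flag a potentially large prediction error when the empirical
    variance of the predictive random variable exceeds a threshold,
    mirroring the reduced LRT described in the abstract."""
    return statistics.pvariance(samples) > var_threshold

# Toy predictive distributions for the future tumour position (mm):
rng = random.Random(0)
tight = [rng.gauss(10.0, 0.1) for _ in range(500)]  # low uncertainty
loose = [rng.gauss(10.0, 5.0) for _ in range(500)]  # high uncertainty

print(large_error_flag(tight, var_threshold=1.0))  # False: confident prediction
print(large_error_flag(loose, var_threshold=1.0))  # True: flag possible large error
```

    In practice the samples would come from the kernel-density-based predictive distribution, and the threshold would be tuned on retrospective traces (e.g. via the ROC analysis the abstract mentions).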

  4. The influence of olfactory concept on the probability of detecting sub- and peri-threshold odorants in a complex mixture

    NARCIS (Netherlands)

    Bult, J.H.F.; Schifferstein, H.N.J.; Roozen, J.P.; Voragen, A.G.J.; Kroeze, J.H.A.

    2001-01-01

    The headspace of apple juice was analysed to obtain an ecologically relevant stimulus model mixture of apple volatiles. Two sets of volatiles were made up: a set of eight supra-threshold volatiles (MIX) and a set of three sub-threshold volatiles. These sets were used to test the hypothesis that

  5. The heritability of acceptability in South African Merino sheep ...

    African Journals Online (AJOL)

    Selection for production and reproduction in South African Merino sheep is always combined with selection based on visual appraisal and will, in all probability, remain so for many years to come. Heritabilities for acceptability were estimated using a threshold model to analyse data from two parent Merino studs. Effects ...

  6. The Acceptance of Background Noise in Adult Cochlear Implant Users

    Science.gov (United States)

    Plyler, Patrick N.; Bahng, Junghwa; von Hapsburg, Deborah

    2008-01-01

    Purpose: The purpose of this study was to determine (a) if acceptable noise levels (ANLs) are different in cochlear implant (CI) users than in listeners with normal hearing, (b) if ANLs are related to sentence reception thresholds in noise in CI users, and (c) if ANLs and subjective outcome measures are related in CI users. Method: ANLs and the…

  7. A practical threshold concept for simple and reasonable radiation protection

    International Nuclear Information System (INIS)

    Kaneko, Masahito

    2002-01-01

    A half century ago it was assumed for the purpose of protection that radiation risks are linearly proportional at all levels of dose. The Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public, while it has brought about 'radiophobia' and unnecessary over-regulation. Now that the existence of bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response is well recognized, the linearity assumption can be called 'unscientific'. Evidence increasingly implies that there are threshold effects in radiation risk. A concept of 'practical' thresholds is proposed, and the classification of 'stochastic' and 'deterministic' radiation effects should be abandoned. 'Practical' thresholds are dose levels below which induction of detectable radiogenic cancers or hereditary effects is not expected. There seems to be no evidence of deleterious health effects from radiation exposures at the current dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which have been adopted worldwide in the latter half of the 20th century. Those limits are assumed to have been set below certain 'practical' thresholds. Since workers and members of the public gain no benefit from being exposed, except for intentional irradiation for medical purposes, their radiation exposures should be kept below 'practical' thresholds. There is no need for the 'justification' and 'optimization' (ALARA) principles, because there are no 'radiation detriments' as long as exposures are maintained below 'practical' thresholds. Accordingly, the ethical issue of 'justification', allowing benefit to society to offset radiation detriments to individuals, can be resolved. So too can the ethical issue of 'optimization', exchanging health or safety for economic gain. The ALARA principle should be applied to the probability (risk) of exceeding relevant dose limits instead of to normal exposures.

  8. Testing the hypothesis that treatment can eliminate HIV

    DEFF Research Database (Denmark)

    Okano, Justin T; Robbins, Danielle; Palk, Laurence

    2016-01-01

    BACKGROUND: Worldwide, approximately 35 million individuals are infected with HIV; about 25 million of these live in sub-Saharan Africa. WHO proposes using treatment as prevention (TasP) to eliminate HIV. Treatment suppresses viral load, decreasing the probability an individual transmits HIV. The elimination threshold is one new HIV infection per 1000 individuals. Here, we test the hypothesis that TasP can substantially reduce epidemics and eliminate HIV. We estimate the impact of TasP, between 1996 and 2013, on the Danish HIV epidemic in men who have sex with men (MSM), an epidemic UNAIDS has identified as a priority for elimination. METHODS: We use a CD4-staged Bayesian back-calculation approach to estimate incidence, and the hidden epidemic (the number of HIV-infected undiagnosed MSM). To develop the back-calculation model, we use data from an ongoing nationwide population-based study

  9. Doubler system quench detection threshold

    International Nuclear Information System (INIS)

    Kuepke, K.; Kuchnir, M.; Martin, P.

    1983-01-01

    The experimental study leading to the determination of the sensitivity needed to protect the Fermilab Doubler from damage during quenches is presented. The quench voltage thresholds involved were obtained from measurements of resistance versus temperature and voltage versus time made on Doubler cable during quenches at several currents, and from data collected during operation of the Doubler Quench Protection System as implemented in the B-12 string of 20 magnets. At 4 kA, a quench voltage threshold in excess of 5.0 V will limit the peak Doubler cable temperature to 452 K for quenches originating in the magnet coils, whereas a threshold of 0.5 V is required for quenches originating outside of coils.

  10. Thermotactile perception thresholds measurement conditions.

    Science.gov (United States)

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

    The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds, and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture were examined with a straight hand and forearm placed on a support, the same posture without a support, and the fingers and hand flexed at the wrist with the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 degrees C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change, and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the measurement posture with a straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1.0 degrees C/s rate of temperature change and a 0.5 N push force. Measurement posture, push force and rate of temperature change can affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature change of 1.0 degrees C/s in the posture with the straight hand and forearm laid on a support are recommended for warm and cold threshold measurements.

  11. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

    Wickham, L.E.; Kluk, A.F.; Department of Energy, Washington, DC)

    1985-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling. It must also be low enough so that waste at the threshold quantity poses a very small health risk and multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste stream characteristics, maximum exposed individual, population considerations, and site-specific parameters such as rainfall, etc.). A guidance dose of between 0.001 and 1.0 mSv/y (0.1 to 100 mrem/y) was recommended, with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid and arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized, and the relationship of this effort to related developments at other agencies is discussed.

  12. Displacement compressors - acceptance tests

    CERN Document Server

    International Organization for Standardization. Geneva

    1996-01-01

    ISO 1217:2009 specifies methods for acceptance tests regarding volume rate of flow and power requirements of displacement compressors. It also specifies methods for testing liquid-ring type compressors and the operating and testing conditions which apply when a full performance test is specified.

  13. From motivation to acceptability

    DEFF Research Database (Denmark)

    Nordfalk, Francisca; Olejaz, Maria; Jensen, Anja M. B.

    2016-01-01

    Background: Over the past three decades, public attitudes to organ donation have been a subject of numerous studies focusing on donor motivation. Here, we present a fresh approach. We suggest focusing on public acceptability instead of motivation. The point is to understand public attitudes well...

  14. Approaches to acceptable risk

    International Nuclear Information System (INIS)

    Whipple, C.

    1997-01-01

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made

  15. A threshold for dissipative fission

    International Nuclear Information System (INIS)

    Thoennessen, M.; Bertsch, G.F.

    1993-01-01

    The empirical domain of validity of statistical theory is examined as applied to fission data on pre-fission neutron, charged particle, and γ-ray multiplicities. Systematics are found of the threshold excitation energy for the appearance of nonstatistical fission. From the data on systems with not too high fissility, the relevant phenomenological parameter is the ratio of the threshold temperature T_thresh to the (temperature-dependent) fission barrier height E_Bar(T). The statistical model reproduces the data for low values of T_thresh/E_Bar(T), with T_thresh/E_Bar(T) independent of mass and fissility of the systems.

  16. Thresholds in chemical respiratory sensitisation.

    Science.gov (United States)

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease, chemical respiratory allergy develops in two phases. In the first (induction) phase, exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is exposed subsequently to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract, resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article attention has focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have a threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available or clinical/epidemiological data that permit the identification of such thresholds. This document briefly reviews relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  17. Optimization Problems on Threshold Graphs

    Directory of Open Access Journals (Sweden)

    Elena Nechita

    2010-06-01

    Full Text Available During the last three decades, different types of decompositions have been studied in the field of graph theory. Among these we mention: decompositions based on the additivity of some characteristics of the graph, decompositions where the adjacency law between the subsets of the partition is known, decompositions where the subgraph induced by every subset of the partition must have predetermined properties, as well as combinations of such decompositions. In this paper we characterize threshold graphs using the weak decomposition, and determine the density, the stability number, the Wiener index and the Wiener polynomial of threshold graphs.
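    As a rough illustration of the objects this record deals with, a threshold graph can be built from a 0/1 creation sequence and its Wiener index computed. The sketch below uses the standard creation-sequence construction and the fact that a connected threshold graph has diameter at most 2; it does not reproduce the paper's weak-decomposition method, and all names are illustrative.

```python
def threshold_graph(seq):
    """Build a threshold graph from a 0/1 creation sequence:
    each 0 adds an isolated vertex, each 1 adds a dominating vertex
    (connected to every vertex added so far)."""
    adj = {0: set()}                 # the first vertex is always isolated
    for i, bit in enumerate(seq, start=1):
        adj[i] = set()
        if bit == 1:                 # dominating vertex
            for j in range(i):
                adj[i].add(j)
                adj[j].add(i)
    return adj

def wiener_index(adj):
    """Wiener index = sum of shortest-path distances over all vertex pairs.
    A connected threshold graph (last creation bit 1) has diameter <= 2,
    so each adjacent pair contributes 1 and each non-adjacent pair 2."""
    n = len(adj)
    m = sum(len(nbrs) for nbrs in adj.values()) // 2   # edge count
    return m + 2 * (n * (n - 1) // 2 - m)

g = threshold_graph([0, 1, 0, 1])    # 5 vertices, 6 edges, connected
print(wiener_index(g))               # 6*1 + 4*2 = 14
```

    The diameter-2 shortcut avoids an all-pairs shortest-path computation; for disconnected threshold graphs (last bit 0) a BFS-based Wiener index would be needed instead.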

  18. Threshold current for fireball generation

    Science.gov (United States)

    Dijkhuis, Geert C.

    1982-05-01

    Fireball generation from a high-intensity circuit breaker arc is interpreted here as a quantum-mechanical phenomenon caused by severe cooling of electrode material evaporating from contact surfaces. According to the proposed mechanism, quantum effects appear in the arc plasma when the radius of one magnetic flux quantum inside solid electrode material has shrunk to one London penetration length. A formula derived for the threshold discharge current preceding fireball generation is found compatible with data reported by Silberg. This formula predicts linear scaling of the threshold current with the circuit breaker's electrode radius and concentration of conduction electrons.

  19. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    One proves that a Nuclear Threshold Effect is dependent, via Neutron Strength Function, on Spectroscopy of Ancestral Neutron Threshold State. The magnitude of the Nuclear Threshold Effect is proportional to the Neutron Strength Function. Evidence for relation of Nuclear Threshold Effects to Neutron Strength Functions is obtained from Isotopic Threshold Effect and Deuteron Stripping Threshold Anomaly. The empirical and computational analysis of the Isotopic Threshold Effect and of the Deuteron Stripping Threshold Anomaly demonstrate their close relationship to Neutron Strength Functions. It was established that the Nuclear Threshold Effects depend, in addition to genuine Nuclear Reaction Mechanisms, on Spectroscopy of (Ancestral) Neutron Threshold State. The magnitude of the effect is proportional to the Neutron Strength Function, in their dependence on mass number. This result constitutes also a proof that the origins of these threshold effects are Neutron Single Particle States at zero energy. (author)

  20. Threshold effect under nonlinear limitation of the intensity of high-power light

    International Nuclear Information System (INIS)

    Tereshchenko, S A; Podgaetskii, V M; Gerasimenko, A Yu; Savel'ev, M S

    2015-01-01

    A model is proposed to describe the properties of limiters of high-power laser radiation, which takes into account the threshold character of nonlinear interaction of radiation with the working medium of the limiter. The generally accepted non-threshold model is a particular case of the threshold model if the threshold radiation intensity is zero. Experimental z-scan data are used to determine the nonlinear optical characteristics of media with carbon nanotubes, polymethine and pyran dyes, zinc selenide, porphyrin-graphene and fullerene-graphene. A threshold effect of nonlinear interaction between laser radiation and some of investigated working media of limiters is revealed. It is shown that the threshold model more adequately describes experimental z-scan data. (nonlinear optical phenomena)

  1. Implications of the Bohm-Aharonov hypothesis

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Rimini, A.; Weber, T.

    1976-01-01

    It is proved that the Bohm-Aharonov hypothesis concerning largely separated subsystems of composite quantum systems implies that it is impossible to express the dynamical evolution in terms of the density operator

  2. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  3. Percolation Threshold Parameters of Fluids

    Czech Academy of Sciences Publication Activity Database

    Škvor, J.; Nezbeda, Ivo

    2009-01-01

    Roč. 79, č. 4 (2009), 041141-041147 ISSN 1539-3755 Institutional research plan: CEZ:AV0Z40720504 Keywords : percolation threshold * universality * infinite cluster Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.400, year: 2009

  4. Threshold analyses and Lorentz violation

    International Nuclear Information System (INIS)

    Lehnert, Ralf

    2003-01-01

    In the context of threshold investigations of Lorentz violation, we discuss the fundamental principle of coordinate independence, the role of an effective dynamical framework, and the conditions of positivity and causality. Our analysis excludes a variety of previously considered Lorentz-breaking parameters and opens an avenue for viable dispersion-relation investigations of Lorentz violation

  5. Threshold enhancement of diphoton resonances

    Directory of Open Access Journals (Sweden)

    Aoife Bharucha

    2016-10-01

    Full Text Available We revisit a mechanism to enhance the decay width of (pseudo)scalar resonances to photon pairs when the process is mediated by loops of charged fermions produced near threshold. Motivated by the recent LHC data, indicating the presence of an excess in the diphoton spectrum at approximately 750 GeV, we illustrate this threshold enhancement mechanism in the case of a 750 GeV pseudoscalar boson A with a two-photon decay mediated by a charged and uncolored fermion having a mass at the MA/2 threshold and a small decay width, <1 MeV. The implications of such a threshold enhancement are discussed in two explicit scenarios: (i) the Minimal Supersymmetric Standard Model, in which the A state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through loops of charginos with masses close to MA/2, and (ii) a two Higgs doublet model, in which A is again produced by gluon fusion but decays into photons through loops of vector-like charged heavy leptons. In both these scenarios, while the mass of the charged fermion has to be adjusted to be extremely close to half of the A resonance mass, the small total widths are naturally obtained if only suppressed three-body decay channels occur. Finally, the implications of some of these scenarios for dark matter are discussed.

  6. The (not so) Immortal Strand Hypothesis

    OpenAIRE

    Tomasetti, Cristian; Bozic, Ivana

    2015-01-01

    Background: Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an “immortal” DNA strand is passed to the stem cell daughter and not the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both in favor and against the hypothesis has been presented. Principal...

  7. Waste acceptance and logistics

    International Nuclear Information System (INIS)

    Carlson, James H.

    1992-01-01

    There are three major components which are normally highlighted when the Civilian Radioactive Waste Management Program is discussed - the repository, the monitored retrievable storage facility, and the transportation system. These are clearly the major physical system elements and they receive the greatest external attention. However, there will not be a successful, operative waste management system without fully operational waste acceptance plans and logistics arrangements. This paper will discuss the importance of developing, on a parallel basis to the normally considered waste management system elements, the waste acceptance and logistics arrangements to enable the timely transfer of spent nuclear fuel from more than one hundred and twenty waste generators to the Federal government. The paper will also describe the specific activities the Program has underway to make the necessary arrangements. (author)

  8. Environment and public acceptance

    International Nuclear Information System (INIS)

    Gauvenet; Bresson; Braillard; Ertaud; Ladonchamps, de; Toureau

    1976-01-01

    The problems involved in the siting of nuclear power stations at a local level are of a political, economic, social or ecological order. The acceptance of a nuclear station mostly depends on its interest for the local population. In order to avoid negative reactions, those responsible must make the harmonious integration of the station within the existing economic and social context their first priority [fr

  9. Nuclear power and acceptation

    International Nuclear Information System (INIS)

    Speelman, J.E.

    1990-01-01

    In 1989 a workshop organized by the IAEA and the Argonne National Laboratory was held. Its purpose was to investigate under which circumstances a large-scale expansion of nuclear power could be accepted. Besides the important technical information, care for the environment set the tone of the workshop. The dominant opinion was that nuclear power can contribute to tackling environmental problems, but that the social and political climate makes this almost impossible. (author). 7 refs.; 1 fig.; 1 tab.

  10. ANALYSING ACCEPTANCE SAMPLING PLANS BY MARKOV CHAINS

    Directory of Open Access Journals (Sweden)

    Mohammad Mirabi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: In this research, a Markov analysis of acceptance sampling plans in a single stage and in two stages is proposed, based on the quality of the items inspected. At each stage of this policy, if the number of defective items in a sample of inspected items is more than the upper threshold, the batch is rejected; the batch is accepted if the number of defective items is less than the lower threshold. When the number of defective items falls between the upper and lower thresholds, the decision-making process continues and further samples are collected and inspected. The primary objective is to determine the optimal values of the upper and lower thresholds using a Markov process, so as to minimise the total cost associated with the batch acceptance policy. A solution method is presented, along with a numerical demonstration of the application of the proposed methodology.


  11. Why Does REM Sleep Occur? A Wake-up Hypothesis

    Directory of Open Access Journals (Sweden)

    W. R. Klemm

    2011-09-01

    Full Text Available Brain activity differs in the various sleep stages and in conscious wakefulness. Awakening from sleep requires restoration of the complex nerve impulse patterns in neuronal network assemblies necessary to re-create and sustain conscious wakefulness. Herein I propose that the brain uses REM to help wake itself up after it has had a sufficient amount of sleep. Evidence suggesting this hypothesis includes the facts that (1) when first going to sleep, the brain plunges into Stage N3 (formerly called Stage IV), a deep abyss of sleep, and, as the night progresses, the sleep is punctuated by episodes of REM that become longer and more frequent toward morning; (2) conscious-like dreams are a reliable component of the REM state, in which the dreamer is an active mental observer or agent in the dream; (3) the last awakening during a night’s sleep usually occurs in a REM episode during or at the end of a dream; (4) both REM and awake consciousness seem to arise out of a similar brainstem ascending arousal system; (5) N3 is a functionally perturbed state that eventually must be corrected so that the embodied brain can direct adaptive behavior; and (6) corticofugal projections to brainstem arousal areas provide a way to trigger increased cortical activity in REM to progressively raise the sleeping brain to the threshold required for wakefulness. This paper shows how the hypothesis conforms to common experience and has substantial predictive and explanatory power regarding the phenomenology of sleep in terms of ontogeny, aging, phylogeny, abnormal/disease states, cognition, and behavioral physiology. That broad range of consistency is not matched by competing theories, which are summarized herein. Specific ways to test this wake-up hypothesis are suggested. Such research could lead to a better understanding of awake consciousness.

  12. Why does rem sleep occur? A wake-up hypothesis.

    Science.gov (United States)

    Klemm, W R

    2011-01-01

    Brain activity differs in the various sleep stages and in conscious wakefulness. Awakening from sleep requires restoration of the complex nerve impulse patterns in neuronal network assemblies necessary to re-create and sustain conscious wakefulness. Herein I propose that the brain uses rapid eye movement (REM) to help wake itself up after it has had a sufficient amount of sleep. Evidence suggesting this hypothesis includes the facts that, (1) when first going to sleep, the brain plunges into Stage N3 (formerly called Stage IV), a deep abyss of sleep, and, as the night progresses, the sleep is punctuated by episodes of REM that become longer and more frequent toward morning, (2) conscious-like dreams are a reliable component of the REM state in which the dreamer is an active mental observer or agent in the dream, (3) the last awakening during a night's sleep usually occurs in a REM episode during or at the end of a dream, (4) both REM and awake consciousness seem to arise out of a similar brainstem ascending arousal system, (5) N3 is a functionally perturbed state that eventually must be corrected so that the embodied brain can direct adaptive behavior, and (6) cortico-fugal projections to brainstem arousal areas provide a way to trigger increased cortical activity in REM to progressively raise the sleeping brain to the threshold required for wakefulness. This paper shows how the hypothesis conforms to common experience and has substantial predictive and explanatory power regarding the phenomenology of sleep in terms of ontogeny, aging, phylogeny, abnormal/disease states, cognition, and behavioral physiology. That broad range of consistency is not matched by competing theories, which are summarized herein. Specific ways to test this wake-up hypothesis are suggested. Such research could lead to a better understanding of awake consciousness.

  13. A two-stage cognitive theory of the positive symptoms of psychosis. Highlighting the role of lowered decision thresholds.

    Science.gov (United States)

    Moritz, Steffen; Pfuhl, Gerit; Lüdtke, Thies; Menon, Mahesh; Balzan, Ryan P; Andreou, Christina

    2017-09-01

    We outline a two-stage heuristic account for the pathogenesis of the positive symptoms of psychosis. A narrative review on the empirical evidence of the liberal acceptance (LA) account of positive symptoms is presented. At the heart of our theory is the idea that psychosis is characterized by a lowered decision threshold, which results in the premature acceptance of hypotheses that a nonpsychotic individual would reject. Once the hypothesis is judged as valid, counterevidence is not sought anymore due to a bias against disconfirmatory evidence as well as confirmation biases, consolidating the false hypothesis. As a result of LA, confidence in errors is enhanced relative to controls. Subjective probabilities are initially low for hypotheses in individuals with delusions, and delusional ideas at stage 1 (belief formation) are often fragile. In the course of the second stage (belief maintenance), fleeting delusional ideas evolve into fixed false beliefs, particularly if the delusional idea is congruent with the emotional state and provides "meaning". LA may also contribute to hallucinations through a misattribution of (partially) normal sensory phenomena. Interventions such as metacognitive training that aim to "plant the seeds of doubt" decrease positive symptoms by encouraging individuals to seek more information and to attenuate confidence. The effect of antipsychotic medication is explained by its doubt-inducing properties. The model needs to be confirmed by longitudinal designs that allow an examination of causal relationships. Evidence is currently weak for hallucinations. The theory may account for positive symptoms in a subgroup of patients. Future directions are outlined. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Age and Acceptance of Euthanasia.

    Science.gov (United States)

    Ward, Russell A.

    1980-01-01

    Study explores relationship between age (and sex and race) and acceptance of euthanasia. Women and non-Whites were less accepting because of religiosity. Among older people less acceptance was attributable to their lesser education and greater religiosity. Results suggest that quality of life in old age affects acceptability of euthanasia. (Author)

  15. Appropriate threshold levels of cardiac beat-to-beat variation in semi-automatic analysis of equine ECG recordings

    DEFF Research Database (Denmark)

    Madsen, Mette Flethøj; Kanters, Jørgen K.; Pedersen, Philip Juul

    2016-01-01

    considerably with heart rate (HR), and an adaptable model consisting of three different HR ranges with separate threshold levels of maximum acceptable RR deviation was consequently defined. For resting HRs

  16. Rational expectations, psychology and inductive learning via moving thresholds

    Science.gov (United States)

    Lamba, H.; Seaman, T.

    2008-06-01

    This paper modifies a previously introduced class of heterogeneous agent models in a way that allows for the inclusion of different types of agent motivations and behaviours in a consistent manner. The agents operate within a highly simplified environment where they are only able to be long or short one unit of the asset. The price of the asset is influenced by both an external information stream and the demand of the agents. The current strategy of each agent is defined by a pair of moving thresholds straddling the current price. When the price crosses either of the thresholds for a particular agent, that agent switches position and a new pair of thresholds is generated. The threshold dynamics can mimic different sources of investor motivation, running the gamut from purely rational information-processing, through rational (but often undesirable) behaviour induced by perverse incentives and moral hazards, to purely psychological effects. The simplest model of this kind precisely conforms to the Efficient Market Hypothesis (EMH) and this allows causal relationships to be established between actions at the agent level and violations of EMH price statistics at the global level. In particular, the effects of herding behaviour and perverse incentives are examined.
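The agent mechanics described above (each agent long or short one unit, a threshold pair straddling the price, switch-and-redraw on crossing) can be sketched as a minimal simulation. The price-impact rule, noise magnitudes, and threshold widths below are illustrative assumptions, not the paper's calibration.

```python
import random

def simulate(n_agents=100, n_steps=500, width=0.1, seed=1):
    """Minimal sketch of the moving-threshold agent model: when the
    price crosses either of an agent's thresholds, that agent switches
    position (+1 long / -1 short) and a new threshold pair straddling
    the current price is drawn.  The price update combines an external
    information stream with aggregate agent demand; both terms and the
    threshold widths are assumed values for illustration."""
    rng = random.Random(seed)
    price = 0.0
    positions = [rng.choice([-1, 1]) for _ in range(n_agents)]
    thresholds = [(price - rng.uniform(0, width), price + rng.uniform(0, width))
                  for _ in range(n_agents)]
    prices = [price]
    for _ in range(n_steps):
        info = rng.gauss(0, 0.01)            # external information stream
        demand = sum(positions) / n_agents   # aggregate agent demand
        price += info + 0.05 * demand        # assumed price-impact rule
        for i, (lo, hi) in enumerate(thresholds):
            if price <= lo or price >= hi:   # threshold crossed: switch
                positions[i] *= -1
                thresholds[i] = (price - rng.uniform(0, width),
                                 price + rng.uniform(0, width))
        prices.append(price)
    return prices

series = simulate()
returns = [b - a for a, b in zip(series, series[1:])]
```

The `returns` series is where violations of EMH price statistics (fat tails, volatility clustering) would show up once herding or incentive effects are added to the threshold dynamics.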

  17. Nano-material size dependent laser-plasma thresholds

    Science.gov (United States)

    EL Sherbini, Ashraf M.; Parigger, Christian G.

    2016-10-01

    The reduction of laser fluence for initiation of plasma was measured for zinc monoxide nanoparticles with diameters ranging from 100 down to 20 nm. In a previous work by EL Sherbini and Parigger [Wavelength Dependency and Threshold Measurements for Nanoparticle-enhanced Laser-induced Breakdown Spectroscopy, Spectrochim. Acta Part B 116 (2016) 8-15], the hypothesis of threshold dependence on particle size led to the interpretation of experiments with varying excitation wavelengths and fixed, 30 nm nanomaterial. The experimental results presented in this work were obtained with 1064 nm Nd:YAG radiation and confirm the suspected reduction, attributed to quenching of the thermal conduction length to the respective sizes of the nanoparticles.

  18. Social laughter is correlated with an elevated pain threshold.

    Science.gov (United States)

    Dunbar, R I M; Baron, Rebecca; Frangou, Anna; Pearce, Eiluned; van Leeuwen, Edwin J C; Stow, Julie; Partridge, Giselle; MacDonald, Ian; Barra, Vincent; van Vugt, Mark

    2012-03-22

    Although laughter forms an important part of human non-verbal communication, it has received rather less attention than it deserves in both the experimental and the observational literatures. Relaxed social (Duchenne) laughter is associated with feelings of wellbeing and heightened affect, a proximate explanation for which might be the release of endorphins. We tested this hypothesis in a series of six experimental studies in both the laboratory (watching videos) and naturalistic contexts (watching stage performances), using change in pain threshold as an assay for endorphin release. The results show that pain thresholds are significantly higher after laughter than in the control condition. This pain-tolerance effect is due to laughter itself and not simply due to a change in positive affect. We suggest that laughter, through an endorphin-mediated opiate effect, may play a crucial role in social bonding.

  19. The issue of threshold states

    International Nuclear Information System (INIS)

    Luck, L.

    1994-01-01

    The states which have not joined the Non-Proliferation Treaty, nor undertaken any other internationally binding commitment not to develop or otherwise acquire nuclear weapons, are considered threshold states. Their nuclear status is rendered opaque as a conscious policy. Nuclear threshold status remains a key disarmament issue. For the few states, such as India, Pakistan and Israel, that have put themselves in this position, the security returns have been transitory and largely illusory. The cost to them, and to the international community committed to the norm of non-proliferation, has been huge. The decisions which could lead to recovery from the situation in which they find themselves are essentially in their own hands. Whatever assistance the rest of the international community is able to extend will need to be accompanied by a vital political signal

  20. Multiscalar production amplitudes beyond threshold

    CERN Document Server

    Argyres, E N; Kleiss, R H

    1993-01-01

    We present exact tree-order amplitudes for $H^* \to n~H$, for final states containing one or two particles with non-zero three-momentum, for various interaction potentials. We show that there are potentials leading to tree amplitudes that satisfy unitarity, not only at threshold but also in the above kinematical configurations and probably beyond. As a by-product, we also calculate $2\to n$ tree amplitudes at threshold and show that for the unbroken $\phi^4$ theory they vanish for $n>4$, for the Standard Model Higgs they vanish for $n\ge 3$, and for a model potential respecting tree-order unitarity, for $n$ even and $n>4$. Finally, we calculate the imaginary part of the one-loop $1\to n$ amplitude in both the symmetric and spontaneously broken $\phi^4$ theory.

  1. Multiple hypothesis tracking for the cyber domain

    Science.gov (United States)

    Schwoegler, Stefan; Blackman, Sam; Holsopple, Jared; Hirsch, Michael J.

    2011-09-01

    This paper discusses how methods used for conventional multiple hypothesis tracking (MHT) can be extended to domain-agnostic tracking of entities from non-kinematic constraints, such as those imposed by cyber attacks, in a potentially dense false-alarm background. MHT is widely recognized as the premier method for avoiding the corruption of tracks with spurious data in the kinematic domain, but it has not been extensively applied to other problem domains. The traditional approach is to tightly couple track maintenance (prediction, gating, filtering, probabilistic pruning, and target confirmation) with hypothesis management (clustering, incompatibility maintenance, hypothesis formation, and N-association pruning). However, by separating the domain-specific track maintenance portion from the domain-agnostic hypothesis management piece, we can begin to apply the wealth of knowledge gained from ground and air tracking solutions to the cyber (and other) domains. These realizations led to the creation of Raytheon's Multiple Hypothesis Extensible Tracking Architecture (MHETA). In this paper, we showcase MHETA for the cyber domain, plugging in a well-established method, CUBRC's INFormation Engine for Real-time Decision making (INFERD), for the association portion of the MHT. The result is a CyberMHT. We demonstrate the power of MHETA-INFERD using simulated data. Using metrics from both the tracking and cyber domains, we show that while no tracker is perfect, applying MHETA-INFERD captures advanced non-kinematic tracks in an automated way, performs better than non-MHT approaches, and decreases analyst response time to cyber threats.

  2. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

    Full Text Available Abstract Background Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting a premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  3. Realistic Realizations Of Threshold Circuits

    Science.gov (United States)

    Razavi, Hassan M.

    1987-08-01

    Threshold logic, in which each input is weighted, has many theoretical advantages over standard gate realizations, such as reducing the number of gates, interconnections, and power dissipation. However, because of the difficult synthesis procedure and complicated circuit implementation, its use in the design of digital systems is almost nonexistent. In this study, three methods of NMOS realization are discussed, and their advantages and shortcomings are explored. The possibility of using these methods to realize multi-valued logic is also examined.
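As an illustration of the gate-count advantage mentioned above, a single threshold element with unit weights realizes a 3-input majority function that would otherwise need several standard AND/OR gates. The weights and threshold here are illustrative, not taken from the paper's NMOS designs.

```python
def threshold_gate(inputs, weights, threshold):
    """A threshold logic element: outputs 1 when the weighted sum of
    its binary inputs reaches the threshold, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# One threshold element (all weights 1, threshold 2) implements
# 3-input majority, i.e. (a AND b) OR (b AND c) OR (a AND c).
majority = lambda a, b, c: threshold_gate([a, b, c], [1, 1, 1], 2)
```

Negative weights are also allowed, which is what lets a single element realize functions (such as inhibition) that need inverters in a standard gate realization.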

  4. Root finding with threshold circuits

    Czech Academy of Sciences Publication Activity Database

    Jeřábek, Emil

    2012-01-01

    Roč. 462, Nov 30 (2012), s. 59-69 ISSN 0304-3975 R&D Projects: GA AV ČR IAA100190902; GA MŠk(CZ) 1M0545 Institutional support: RVO:67985840 Keywords : root finding * threshold circuit * power series Subject RIV: BA - General Mathematics Impact factor: 0.489, year: 2012 http://www.sciencedirect.com/science/article/pii/S0304397512008006#

  5. Den betingede accept

    DEFF Research Database (Denmark)

    Kolind, Torsten

    1999-01-01

    The article focuses on aspects of identity and social order in the interaction between 'normals' and ex-prisoners, that is, ex-prisoners who want to live a normal life without criminality. It is argued that this interaction, and the normality that the ex-prisoner is granted, can often look rather unproblematic on the surface, but that it is nonetheless ruled by what the author calls the conditioned accept: the ex-prisoner should see himself as normal, at the same time that he withdraws from those situations, practices and attitudes where the normals would have difficulties...

  6. Baby-Crying Acceptance

    Science.gov (United States)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    A baby's crying is its most important means of communication. The crying monitoring performed by existing devices does not ensure the complete safety of the child. These technological resources need to be joined with means of communicating the results to those responsible, which would involve digital processing of the information available from the crying. A survey carried out in the continental territory of Portugal assessed the level of adoption of a technology able to perform such digital processing, using the Technology Acceptance Model (TAM) as the theoretical framework. The statistical analysis showed a good probability of acceptance of such a system.

  7. Acceptance, Tolerance, Participation

    International Nuclear Information System (INIS)

    1993-01-01

    The problem of radioactive waste management from an ethical and societal viewpoint was treated in this seminar, which had participants from universities (social, theological, philosophical and science institutes), the waste management industry, and regulatory and controlling authorities. After initial reviews of repository technology, policies and schedules, knowledge gaps, and ethical aspects of decision making under uncertainty, four subjects were treated in lectures and discussions: democratic collective responsibility; handling threats in democratic decision making; waste management as a technological operation with a social dimension; and acceptance and legitimacy. The lectures, with comments and discussions, are collected in this report

  8. Marketing for Acceptance

    Directory of Open Access Journals (Sweden)

    Tina L. Johnston, Ph.D.

    2009-11-01

    Full Text Available Becoming a researcher comes with the credentialing pressure to publish articles in peer-reviewed journals (Glaser, 1992; Glaser, 2007; Glaser, 2008). The work-intensive process is exacerbated when the author’s research method is grounded theory. This study investigated the concerns of early and experienced grounded theorists to discover how they worked towards publishing research projects that applied grounded theory as a methodology. The result was a grounded theory of marketing for acceptance that provides the reader with insight into ways that classic grounded theorists have published their works. This is followed by a discussion of ideas for normalizing classic grounded theory research methods in our substantive fields.

  9. Design proposal for door thresholds

    Directory of Open Access Journals (Sweden)

    Smolka Radim

    2017-01-01

    Full Text Available Panels for openings in structures have always been an essential and integral part of buildings, though their importance for a building's functionality was long unrecognised. The general view of this issue has, however, shifted from big planar segments and critical details to the sub-elements of these structures: not only the forms of connecting joints, but also the supporting systems that keep the panels in the right position and ensure they function properly. One of the most strained segments is the threshold structure, especially the entrance door threshold, where substantial construction defects occur in waterproofing as well as in static, thermal and technical function. In conventional buildings this problem is addressed by pulling the floor structure under the entrance door structure and subsequently covering it with waterproofing material. This system cannot work effectively over the long term, so local defects occur. A proposal is put forward to solve this problem by installing a sub-threshold door coupler made of composite materials. The coupler is designed so that its variability complies with the required parameters for most door structures on the European market.

  10. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  11. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical...... ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship...... that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: Internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land...

  12. Rejecting the equilibrium-point hypothesis.

    Science.gov (United States)

    Gottlieb, G L

    1998-01-01

    The lambda version of the equilibrium-point (EP) hypothesis as developed by Feldman and colleagues has been widely used and cited with insufficient critical understanding. This article offers a small antidote to that lack. First, the hypothesis implicitly, unrealistically assumes identical transformations of lambda into muscle tension for antagonist muscles. Without that assumption, its definitions of command variables R, C, and lambda are incompatible and an EP is not defined exclusively by R nor is it unaffected by C. Second, the model assumes unrealistic and unphysiological parameters for the damping properties of the muscles and reflexes. Finally, the theory lacks rules for two of its three command variables. A theory of movement should offer insight into why we make movements the way we do and why we activate muscles in particular patterns. The EP hypothesis offers no unique ideas that are helpful in addressing either of these questions.

  13. Rayleigh's hypothesis and the geometrical optics limit.

    Science.gov (United States)

    Elfouhaily, Tanos; Hahn, Thomas

    2006-09-22

    The Rayleigh hypothesis (RH) is often invoked in the theoretical and numerical treatment of rough surface scattering in order to decouple the analytical form of the scattered field. The hypothesis stipulates that the scattered field away from the surface can be extended down onto the rough surface even though it is formed by solely up-going waves. Traditionally this hypothesis is systematically used to derive the Volterra series under the small perturbation method, which is equivalent to the low-frequency limit. In this Letter we demonstrate that the RH also carries the high-frequency or geometrical optics limit, at least to first order. This finding has never been explicitly derived in the literature. Our result supports the idea that the RH might be an exact solution under some constraints in the general case of random rough surfaces, and not only in the case of small-slope deterministic periodic gratings.

  14. A PC microsimulation of a gap acceptance model for turning left at a T-junction

    NARCIS (Netherlands)

    Schaap, Nina; Dijck, T.; van Arem, Bart; Morsink, Peter L.J.

    2009-01-01

    In a microsimulation model, vehicles are controlled by sub-behavioural models, including the gap acceptance model, where the decision about how to cross a junction is made. The critical gap in these models serves as a threshold value for accepting or rejecting the space between two successive vehicles
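In its simplest form, the critical-gap rule described above reduces to a single threshold comparison. The 6 s critical gap below is an illustrative value for a left turn at a T-junction, not one reported by the authors.

```python
def accepts_gap(gap_s, critical_gap_s=6.0):
    """Basic gap acceptance rule: the driver accepts the gap between two
    successive major-road vehicles only if it is at least the critical
    gap.  The 6 s default is an assumed illustrative value."""
    return gap_s >= critical_gap_s

# A left-turning driver rejects 4.2 s and 5.9 s gaps, then accepts 7.5 s.
decisions = [accepts_gap(g) for g in (4.2, 5.9, 7.5)]
```

Microsimulation models typically refine this fixed threshold with driver-specific or situation-dependent critical gaps, which is the behaviour the cited study examines.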

  15. Bedding material affects mechanical thresholds, heat thresholds and texture preference

    Science.gov (United States)

    Moehring, Francie; O’Hara, Crystal L.; Stucky, Cheryl L.

    2015-01-01

    It has long been known that the bedding type animals are housed on can affect breeding behavior and cage environment. Yet little is known about its effects on evoked behavior responses or non-reflexive behaviors. C57BL/6 mice were housed for two weeks on one of five bedding types: Aspen Sani Chips® (standard bedding for our institute), ALPHA-Dri®, Cellu-Dri™, Pure-o’Cel™ or TEK-Fresh. Mice housed on Aspen exhibited the lowest (most sensitive) mechanical thresholds while those on TEK-Fresh exhibited 3-fold higher thresholds. While bedding type had no effect on responses to punctate or dynamic light touch stimuli, TEK-Fresh housed animals exhibited greater responsiveness in a noxious needle assay than those housed on the other bedding types. Heat sensitivity was also affected by bedding as animals housed on Aspen exhibited the shortest (most sensitive) latencies to withdrawal whereas those housed on TEK-Fresh had the longest (least sensitive) latencies to response. Slight differences between bedding types were also seen in a moderate cold temperature preference assay. A modified tactile conditioned place preference chamber assay revealed that animals preferred TEK-Fresh to Aspen bedding. Bedding type had no effect in a non-reflexive wheel running assay. In both acute (two day) and chronic (5 week) inflammation induced by injection of Complete Freund’s Adjuvant in the hindpaw, mechanical thresholds were reduced in all groups regardless of bedding type, but TEK-Fresh and Pure-o’Cel™ groups exhibited a greater dynamic range between controls and inflamed cohorts than Aspen housed mice. PMID:26456764

  16. [Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].

    Science.gov (United States)

    Simmer, H H

    1980-07-01

    Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary with an increase of intraovarian pressure by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women by bimanual compression from the outside and the vagina. His results were not convincing. Six years later, Strassmann injected fluids into the ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prognosis derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.

  17. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
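The Bayes-factor idea at the core of the methodology can be illustrated with a toy comparison of model prediction against observation. The normal likelihoods and both scatter values below are assumptions made for this sketch and differ from the paper's formulation.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor(observed, predicted, sigma=1.0, alt_sigma=3.0):
    """Toy Bayes factor for model acceptance: likelihood of the data
    under H0 (the model prediction is correct, with measurement scatter
    `sigma`) divided by the likelihood under a vaguer alternative H1
    (scatter `alt_sigma` about the same prediction).  Both scatter
    values and the likelihood forms are illustrative assumptions."""
    l0 = math.prod(normal_pdf(o, p, sigma) for o, p in zip(observed, predicted))
    l1 = math.prod(normal_pdf(o, p, alt_sigma) for o, p in zip(observed, predicted))
    return l0 / l1

# Bayes factor > 1 favours accepting the computational model.
b = bayes_factor(observed=[9.8, 10.1, 10.3], predicted=[10.0, 10.0, 10.0])
```

The paper goes further by treating the Bayes factor itself as a random variable when the model carries statistical uncertainty, and computing the probability that it exceeds a specified value.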

  18. Adiabatic theory of Wannier threshold laws and ionization cross sections

    International Nuclear Information System (INIS)

    Macek, J.H.; Ovchinnikov, S.Y.

    1994-01-01

    Adiabatic energy eigenvalues of H₂⁺ are computed for complex values of the internuclear distance R. The infinite number of bound-state eigenenergies is represented by a function ε(R) that is single-valued on a multisheeted Riemann surface. A region is found where ε(R) and the corresponding eigenfunctions exhibit the harmonic-oscillator structure characteristic of electron motion on a potential saddle. The Schroedinger equation is solved in the adiabatic approximation along a path in the complex R plane to compute ionization cross sections. The cross section thus obtained joins the Wannier threshold region with the keV energy region, but the exponent near the ionization threshold disagrees with well-accepted values. Accepted values are obtained when a lowest-order diabatic correction is employed, indicating that adiabatic approximations do not give the correct zero-velocity limit for ionization cross sections. Semiclassical eigenvalues for general top-of-barrier motion are given and the theory is applied to the ionization of atomic hydrogen by electron impact. The theory with a first diabatic correction gives the Wannier threshold law even for this case

  19. Determining color difference thresholds in denture base acrylic resin.

    Science.gov (United States)

    Ren, Jiabao; Lin, Hong; Huang, Qingmei; Zheng, Gang

    2015-11-01

    In restorative prostheses, color is important, but the choice of color difference formula used to quantify color change in acrylic resins is not straightforward. The purpose of this in vitro study was to choose a color difference formula that best represented differences between the calculated color and the observed imperceptible to unacceptable color and to determine the corresponding perceptibility and acceptability thresholds of color stability for denture base acrylic resins. A total of 291 acrylic resin denture base plates were fabricated and subjected to radiation tests from zero to 42 hours in accordance with ISO 7491:2000. Color was measured with a portable spectrophotometer, and color differences were calculated with 3 International Commission on Illumination (CIE) formulas: CIELab, CMC(1:1), and CIEDE2000. Thirty-four observers with no deficiencies in color perception participated in psychophysical perceptibility and acceptability assessments under controlled conditions in vitro. These 2 types of assessments were regressed for each observer and each formula to generate receiver operator characteristic (ROC) curves. Areas under the curves (AUCs) were then calculated and analyzed to exclude observers with poor color discrimination. AUCs were subjected to 1-way ANOVA (α=.05) to determine the statistical significance of discriminability among the 3 formulas in terms of perceptibility and acceptability judgments. Student-Newman-Keuls tests (α=.05) were used for post hoc comparison. CMC(1:1) and CIEDE2000 formulas performed better for imperceptible to unacceptable color differences, with corresponding CMC(1:1) and CIEDE2000 values for perceptibility of 2.52 and 1.72, respectively, and acceptability thresholds of 6.21 and 4.08, respectively. Formulas CMC(1:1) and CIEDE2000 possess higher discriminability than that of CIELab in the assessment of perceptible color difference threshold of denture base acrylic resin. A statistically significant difference exists
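Of the three formulas compared, CIELab (CIE76) is the simplest: the Euclidean distance in L*a*b* space. The sketch below computes it for a pair of hypothetical plate readings; the L*a*b* values and the threshold numbers are placeholders for illustration (the study's reported thresholds apply to CMC(1:1) and CIEDE2000, not to CIELab):

```python
import math

def delta_e_cielab(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

# Hypothetical L*a*b* readings for a plate before and after irradiation.
before = (78.0, 2.5, 18.0)
after = (76.5, 3.0, 21.0)

dE = delta_e_cielab(before, after)

# Placeholder perceptibility/acceptability cutoffs for illustration only.
perceptible = dE > 1.0
acceptable = dE < 4.0
print(round(dE, 2), perceptible, acceptable)
```

CMC(1:1) and CIEDE2000 add hue- and chroma-dependent weighting terms, which is why they track human judgments of perceptibility and acceptability more closely than this plain Euclidean distance.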

  20. Rejection thresholds in solid chocolate-flavored compound coating.

    Science.gov (United States)

    Harwood, Meriel L; Ziegler, Gregory R; Hayes, John E

    2012-10-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers compared to melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate, a bitter and generally recognized as safe additive. Paired preference tests (blank compared to spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between 2 self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (P= 0.01). Conversely, eating style did not affect group rejection thresholds (P= 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (P= 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. This work makes use of the rejection threshold method to study market segmentation, extending its use to solid foods. We believe this method has broad applicability to the sensory specialist and product developer by providing a
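The ascending paired-preference procedure described above yields a dose-response curve of the proportion preferring the blank; the group rejection threshold is read off that curve. The sketch below interpolates it from hypothetical data (neither the numbers nor the 0.75 criterion, chosen here as the midpoint between chance and unanimity, are taken from the study):

```python
import numpy as np

# Hypothetical ascending-series data: spiked concentration (% sucrose
# octaacetate) and the proportion of panelists preferring the unspiked
# blank in each paired preference test.
conc = np.array([0.01, 0.03, 0.10, 0.30, 1.00])
prop_blank = np.array([0.48, 0.55, 0.62, 0.78, 0.93])

# Assumed criterion: the concentration at which the curve crosses 0.75,
# midway between chance (0.5) and unanimous rejection (1.0).
criterion = 0.75

# Interpolate on log-concentration, as is usual for dose-response data.
rt = 10 ** np.interp(criterion, prop_blank, np.log10(conc))
print(round(rt, 3))
```

Group thresholds for two market segments (here, dark- vs. milk-chocolate preferrers) can then be compared by fitting the curve separately per segment.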

  1. A new glaucoma hypothesis: a role of glymphatic system dysfunction.

    Science.gov (United States)

    Wostyn, Peter; Van Dam, Debby; Audenaert, Kurt; Killer, Hanspeter Esriel; De Deyn, Peter Paul; De Groot, Veva

    2015-06-29

    In a recent review article titled "A new look at cerebrospinal fluid circulation", Brinker et al. comprehensively described novel insights from molecular and cellular biology as well as neuroimaging research, which indicate that cerebrospinal fluid (CSF) physiology is much more complex than previously believed. The glymphatic system is a recently defined brain-wide paravascular pathway for CSF and interstitial fluid exchange that facilitates efficient clearance of interstitial solutes, including amyloid-β, from the brain. Although further studies are needed to substantiate the functional significance of the glymphatic concept, one implication is that glymphatic pathway dysfunction may contribute to the deficient amyloid-β clearance in Alzheimer's disease. In this paper, we review several lines of evidence suggesting that the glymphatic system may also have potential clinical relevance for the understanding of glaucoma. As a clinically acceptable MRI-based approach to evaluate glymphatic pathway function in humans has recently been developed, a unique opportunity now exists to investigate whether suppression of the glymphatic system contributes to the development of glaucoma. The observation of a dysfunctional glymphatic system in patients with glaucoma would provide support for the hypothesis recently proposed by our group that CSF circulatory dysfunction may play a contributory role in the pathogenesis of glaucomatous damage. This would suggest a new hypothesis for glaucoma, which, just like Alzheimer's disease, might be considered then as an imbalance between production and clearance of neurotoxins, including amyloid-β.

  2. The delphic oracle and the ethylene-intoxication hypothesis.

    Science.gov (United States)

    Foster, J; Lehoux, D

    2007-01-01

    An interdisciplinary team of scientists--including an archeologist, a geologist, a chemist, and a toxicologist--has argued that ethylene intoxication was the probable cause of the High Priestess of Delphi's divinatory (mantic) trances. The claim that the High Priestess of Delphi entered a mantic state because of ethylene intoxication enjoyed widespread reception in specialist academic journals, science magazines, and newspapers. This article uses a similar interdisciplinary approach to show that this hypothesis is implausible since it is based on problematic scientific and textual evidence, as well as a fallacious argument. The main issue raised by this counterargument is not that a particular scientific hypothesis or conjecture turned out to be false. (This is expected in scientific investigation.) Rather, the main issue is that it was a positivist disposition that originally led readers to associate the evidence presented in such a way that it seemed to point to the conclusion, even when the evidence did not support the conclusion. We conclude by observing that positivist dispositions can lead to the acceptance of claims because they have a scientific form, not because they are grounded in robust evidence and sound argument.

  3. On the generalized gravi-magnetic hypothesis

    International Nuclear Information System (INIS)

    Massa, C.

    1989-01-01

    According to a generalization of the gravi-magnetic hypothesis (GMH) any neutral mass moving in a curvilinear path with respect to an inertial frame creates a magnetic field, dependent on the curvature radius of the path. A simple astrophysical consequence of the generalized GMH is suggested considering the special cases of binary pulsars and binary neutron stars

  4. Remarks about the hypothesis of limiting fragmentation

    International Nuclear Information System (INIS)

    Chou, T.T.; Yang, C.N.

    1987-01-01

    Remarks are made about the hypothesis of limiting fragmentation. In particular, the concept of favored and disfavored fragment distribution is introduced. Also, a sum rule is proved leading to a useful quantity called energy-fragmentation fraction. (author). 11 refs, 1 fig., 2 tabs

  5. Multiple hypothesis clustering in radar plot extraction

    NARCIS (Netherlands)

    Huizing, A.G.; Theil, A.; Dorp, Ph. van; Ligthart, L.P.

    1995-01-01

    False plots and plots with inaccurate range and Doppler estimates may severely degrade the performance of tracking algorithms in radar systems. This paper describes how a multiple hypothesis clustering technique can be applied to mitigate the problems involved in plot extraction. The measures of

  6. The (not so) immortal strand hypothesis

    Directory of Open Access Journals (Sweden)

    Cristian Tomasetti

    2015-03-01

    Significance: Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provide new insight into the mechanism of DNA replication in stem cells.

  7. A Developmental Study of the Infrahumanization Hypothesis

    Science.gov (United States)

    Martin, John; Bennett, Mark; Murray, Wayne S.

    2008-01-01

    Intergroup attitudes in children were examined based on Leyen's "infrahumanization hypothesis". This suggests that some uniquely human emotions, such as shame and guilt (secondary emotions), are reserved for the in-group, whilst other emotions that are not uniquely human and shared with animals, such as anger and pleasure (primary…

  8. Morbidity and Infant Development: A Hypothesis.

    Science.gov (United States)

    Pollitt, Ernesto

    1983-01-01

    Results of a study conducted in 14 villages of Sui Lin Township, Taiwan, suggest the hypothesis that, under conditions of extreme economic impoverishment and among children within populations where energy protein malnutrition is endemic, there is an inverse relationship between incidence of morbidity in infancy and measures of motor and mental…

  9. Diagnostic Hypothesis Generation and Human Judgment

    Science.gov (United States)

    Thomas, Rick P.; Dougherty, Michael R.; Sprenger, Amber M.; Harbison, J. Isaiah

    2008-01-01

    Diagnostic hypothesis-generation processes are ubiquitous in human reasoning. For example, clinicians generate disease hypotheses to explain symptoms and help guide treatment, auditors generate hypotheses for identifying sources of accounting errors, and laypeople generate hypotheses to explain patterns of information (i.e., data) in the…

  10. Multi-hypothesis distributed stereo video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    for stereo sequences, exploiting an interpolated intra-view SI and two inter-view SIs. The quality of the SI has a major impact on the DVC Rate-Distortion (RD) performance. As the inter-view SIs individually present lower RD performance compared with the intra-view SI, we propose multi-hypothesis decoding...

  11. [Resonance hypothesis of heart rate variability origin].

    Science.gov (United States)

    Sheĭkh-Zade, Iu R; Mukhambetaliev, G Kh; Cherednik, I L

    2009-09-01

    A hypothesis is advanced that heart rate variability arises from beat-to-beat regulation of cardiac cycle duration, which ensures resonant interaction between respiratory fluctuations and the arterial system's own volume fluctuations so as to minimize the energy expenses of the cardiorespiratory system. Myogenic, parasympathetic and sympathetic mechanisms of heart rate variability are described.

  12. In Defense of Chi's Ontological Incompatibility Hypothesis

    Science.gov (United States)

    Slotta, James D.

    2011-01-01

    This article responds to an article by A. Gupta, D. Hammer, and E. F. Redish (2010) that asserts that M. T. H. Chi's (1992, 2005) hypothesis of an "ontological commitment" in conceptual development is fundamentally flawed. In this article, I argue that Chi's theoretical perspective is still very much intact and that the critique offered by Gupta…

  13. Vacuum counterexamples to the cosmic censorship hypothesis

    International Nuclear Information System (INIS)

    Miller, B.D.

    1981-01-01

    In cylindrically symmetric vacuum spacetimes it is possible to specify nonsingular initial conditions such that timelike singularities will (necessarily) evolve from these conditions. Examples are given; the spacetimes are somewhat analogous to one of the spherically symmetric counterexamples to the cosmic censorship hypothesis

  14. From heresy to dogma in accounts of opposition to Howard Temin's DNA provirus hypothesis.

    Science.gov (United States)

    Marcum, James A

    2002-01-01

    In 1964 the Wisconsin virologist Howard Temin proposed the DNA provirus hypothesis to explain the mechanism by which a cancer-producing virus containing only RNA infects and transforms cells. His hypothesis reversed the flow of genetic information, as ordained by the central dogma of molecular biology. Although there was initial opposition to his hypothesis, it was widely accepted after the discovery of reverse transcriptase in 1970. Most accounts of Temin's hypothesis after the discovery portray the hypothesis as heretical, because it challenged the central dogma. Temin himself in his Nobel Prize speech of 1975 narrates a similar story about its reception. But are these accounts warranted? I argue that members of the virology community opposed Temin's provirus hypothesis not simply because it was a counterexample to the central dogma, but more importantly because his experimental evidence for supporting it was inconclusive. Furthermore, I propose that these accounts of opposition to the DNA provirus hypothesis as heretical, written by Temin and others after the discovery of reverse transcriptase, played a significant role in establishing retrovirology as a specialized field.

  15. Sensory Acceptability of Squash (Cucurbita Maxima) in Making Ice Cream

    Directory of Open Access Journals (Sweden)

    Raymund B. Moreno

    2015-02-01

    Full Text Available - This experimental research was conducted to determine the sensory acceptability of mashed squash (Cucurbita Maxima) of different proportions in making ice cream in terms of appearance, aroma, texture, taste and general acceptability. Five treatments were formulated in the study—four of which utilized mashed squash at various proportions and one treatment was used as the control variable which contains no mashed squash at all. The respondents of the study were the 20 Food Technology students and 10 faculty members of West Visayas State University Calinog Campus who were selected through random sampling. The respondents evaluated the finished products using a modified sensory evaluation score sheet based on a Six Point Hedonic Scale. The statistical tools used were means, standard deviations, and the Wilcoxon signed-rank test. The 0.01 alpha level was used as the criterion for acceptance or rejection of the null hypothesis. The result of the study led to the conclusion that a significant difference existed in the level of acceptability of mashed squash in making ice cream in terms of appearance, aroma, and general acceptability; therefore the null hypothesis is rejected. However, there was no significant difference in the level of acceptability of using mashed squash in making ice cream in terms of taste and texture.
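The Wilcoxon signed-rank comparison used above can be sketched as follows; the paired 6-point hedonic scores below are hypothetical stand-ins for the panel data, not the study's numbers:

```python
from scipy.stats import wilcoxon

# Hypothetical hedonic scores (6-point scale) from 30 panelists rating
# the control ice cream and one squash treatment; pairs share a rater.
control = [5, 4, 5, 6, 4, 5, 3, 5, 4, 6, 5, 4, 5, 5, 4, 6, 5, 4, 3, 5,
           4, 5, 6, 5, 4, 5, 4, 5, 6, 4]
treatment = [4, 4, 3, 5, 4, 4, 3, 4, 3, 5, 4, 3, 4, 4, 3, 5, 4, 3, 3, 4,
             3, 4, 5, 4, 3, 4, 3, 4, 5, 3]

stat, p = wilcoxon(control, treatment)

# Reject the null hypothesis of no difference at alpha = 0.01,
# mirroring the criterion used in the study.
reject_null = p < 0.01
print(round(p, 6), reject_null)
```

The nonparametric test is appropriate here because hedonic-scale scores are ordinal and paired by rater, so a paired t-test's normality assumption need not hold.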

  16. A novel hypothesis splitting method implementation for multi-hypothesis filters

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free1. The multi-hypothesis filters commonly approximate the distribution tran...

  17. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individual’s health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression,

  18. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...
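The trade-off described above, raising thresholds suppresses false positives but also lowers detection probability, can be sketched with a toy Lagrangian search. The Gaussian signal model, the per-sensor signal strengths, and the budget are all assumptions for illustration, not the report's actual formulation:

```python
import numpy as np
from scipy.stats import norm

# Toy model: each sensor sees standard normal noise, or a mean shift
# mu_i when a significant event occurs. Threshold t_i gives detection
# probability 1 - Phi(t_i - mu_i) and false-positive rate 1 - Phi(t_i).
mu = np.array([2.0, 2.5, 3.0])   # assumed per-sensor signal strengths
fp_budget = 0.05                 # cap on total expected false positives

tgrid = np.linspace(0.0, 6.0, 1201)

def solve(lam):
    """For multiplier lam, each sensor independently picks the threshold
    maximizing PD - lam * PFA; return thresholds, total PD, total PFA."""
    pd = norm.sf(tgrid[None, :] - mu[:, None])   # shape (3, len(tgrid))
    pfa = norm.sf(tgrid)[None, :]
    t = tgrid[np.argmax(pd - lam * pfa, axis=1)]
    return t, norm.sf(t - mu).sum(), norm.sf(t).sum()

# Bisect on lam until the false-positive budget is just met.
lo, hi = 0.0, 1e4
for _ in range(60):
    mid = 0.5 * (lo + hi)
    _, _, fpa = solve(mid)
    lo, hi = (mid, hi) if fpa > fp_budget else (lo, mid)

thresholds, total_pd, total_fpa = solve(hi)
print(thresholds, round(total_pd, 3), round(total_fpa, 4))
```

Note that the optimum sets unequal thresholds: sensors with stronger signals can afford higher cutoffs, so the false-positive budget is spent where it buys the most detection probability.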

  19. 11 CFR 9036.1 - Threshold submission.

    Science.gov (United States)

    2010-01-01

    ... credit or debit card, including one made over the Internet, the candidate shall provide sufficient... section shall not count toward the threshold amount. (c) Threshold certification by Commission. (1) After...

  20. A Dopamine Hypothesis of Autism Spectrum Disorder.

    Science.gov (United States)

    Pavăl, Denis

    2017-01-01

    Autism spectrum disorder (ASD) comprises a group of neurodevelopmental disorders characterized by social deficits and stereotyped behaviors. While several theories have emerged, the pathogenesis of ASD remains unknown. Although studies report dopamine signaling abnormalities in autistic patients, a coherent dopamine hypothesis which could link neurobiology to behavior in ASD is currently lacking. In this paper, we present such a hypothesis by proposing that autistic behavior arises from dysfunctions in the midbrain dopaminergic system. We hypothesize that a dysfunction of the mesocorticolimbic circuit leads to social deficits, while a dysfunction of the nigrostriatal circuit leads to stereotyped behaviors. Furthermore, we discuss 2 key predictions of our hypothesis, with emphasis on clinical and therapeutic aspects. First, we argue that dopaminergic dysfunctions in the same circuits should associate with autistic-like behavior in nonautistic subjects. Concerning this, we discuss the case of PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections) which displays behaviors similar to those of ASD, presumed to arise from dopaminergic dysfunctions. Second, we argue that providing dopamine modulators to autistic subjects should lead to a behavioral improvement. Regarding this, we present clinical studies of dopamine antagonists which seem to have improving effects on autistic behavior. Furthermore, we explore the means of testing our hypothesis by using neuroreceptor imaging, which could provide comprehensive evidence for dopamine signaling dysfunctions in autistic subjects. Lastly, we discuss the limitations of our hypothesis. Along these lines, we aim to provide a dopaminergic model of ASD which might lead to a better understanding of the ASD pathogenesis. © 2017 S. Karger AG, Basel.

  1. Nuclear thermodynamics below particle threshold

    International Nuclear Information System (INIS)

    Schiller, A.; Agvaanluvsan, U.; Algin, E.; Bagheri, A.; Chankova, R.; Guttormsen, M.; Hjorth-Jensen, M.; Rekstad, J.; Siem, S.; Sunde, A. C.; Voinov, A.

    2005-01-01

    From a starting point of experimentally measured nuclear level densities, we discuss thermodynamical properties of nuclei below the particle emission threshold. Since nuclei are essentially mesoscopic systems, a straightforward generalization of macroscopic ensemble theory often yields unphysical results. A careful critique of traditional thermodynamical concepts reveals problems commonly encountered in mesoscopic systems. One is that microcanonical and canonical ensemble theory yield different results; another concerns the introduction of temperature for small, closed systems. Finally, the concept of phase transitions is investigated for mesoscopic systems

  2. Public acceptance of nuclear power

    International Nuclear Information System (INIS)

    Wildgruber, O.H.

    1990-01-01

    The lecture addresses the question why we need public acceptance work and provides some clues to it. It explains various human behaviour patterns which determine the basics for public acceptance. To some extent, the opposition to nuclear energy and the role the media play are described. Public acceptance efforts of industry are critically reviewed. Some hints on difficulties with polling are provided. The lecture concludes with recommendations for further public acceptance work. (author)

  3. Color-discrimination threshold determination using pseudoisochromatic test plates

    Directory of Open Access Journals (Sweden)

    Kaiva Jurasevska

    2014-11-01

    Full Text Available We produced a set of pseudoisochromatic plates for determining individual color-difference thresholds to assess test performance and test properties, and analyzed the results. We report a high test validity and classification ability for the deficiency type and severity level (comparable to that of the fourth edition of the Hardy–Rand–Rittler (HRR test. We discuss changes of the acceptable chromatic shifts from the protan and deutan confusion lines along the CIE xy diagram, and the high correlation of individual color-difference thresholds and the red–green discrimination index. Color vision was tested using an Oculus HMC anomaloscope, a Farnsworth D15, and an HRR test on 273 schoolchildren, and 57 other subjects with previously diagnosed red–green color-vision deficiency.

  4. Photoproduction of the φ(1020) near threshold in CLAS

    International Nuclear Information System (INIS)

    Tedeschi, D.J.

    2002-01-01

    The differential cross section for the photoproduction of the φ (1020) near threshold (E_γ = 1.57 GeV) is predicted to be sensitive to production mechanisms other than diffraction. However, the existing low-energy data are of limited statistics and kinematical coverage. Complete measurements of φ meson production on the proton have been performed at The Thomas Jefferson National Accelerator Facility using a liquid hydrogen target and the CEBAF Large Acceptance Spectrometer (CLAS). The φ was identified by missing mass using a proton and positive kaon detected by CLAS in coincidence with an electron in the photon tagger. The energy of the tagged, bremsstrahlung photons ranged from φ-threshold to 2.4 GeV. A description of the data set and the differential cross section at E_γ = 2.0 GeV will be presented and compared with present theoretical calculations. (author)

  5. The Nebular Hypothesis - A False Paradigm Misleading Scientists

    Science.gov (United States)

    Myers, L. S.

    2005-05-01

    ignored in the belief this comparatively small volume is insignificant relative to Earth's total mass and gravity. This misconception led to outdated gravitational constants and trajectories for "slingshotted" space missions that approached Earth closer than anticipated because the daily increase in mass increases Earth's gravitational pull. Today's philosophy assumes comets, meteoroids, asteroids and planets are different types of objects because of their varied sizes and appearances, but when all solar bodies are arranged by size they form a continuum from irregular meteoroids (remnants of comets) to spherical asteroids and planets. When meteoroids reach diameters of 500-600 kilometers, they become spherical-the critical threshold at which gravity can focus total molecular weight of any body omnidirectionally onto its exact center to initiate compressive heating and melting of originally cold rock core, producing magma, H2O and other gases. The Accreation concept assumes all solar bodies are different-sized objects of the same species, each having reached its present size and chemical composition by amalgamation and accretion. Each is at a different stage of growth but destined to become larger until it reaches the size of another sun (star). This is universal planetary growth controlled by gravity, but initiated by the trajectory imparted at its supernova birth and chance capture by some larger body elsewhere in the Universe. Like the paradigm shift from geocentrism to heliocentrism sparked by Copernicus in 1543, the time has come for a new paradigm to put scientific research on a more productive course toward TRUTH. The new concept of Accreation (creation by accretion) is offered as a replacement for the now defunct nebular hypothesis.

  6. Compositional threshold for Nuclear Waste Glass Durability

    International Nuclear Information System (INIS)

    Kruger, Albert A.; Farooqi, Rahmatullah; Hrma, Pavel R.

    2013-01-01

    Within the composition space of glasses, a distinct threshold appears to exist that separates 'good' glasses, i.e., those which are sufficiently durable, from 'bad' glasses of a low durability. The objective of our research is to clarify the origin of this threshold by exploring the relationship between glass composition, glass structure and chemical durability around the threshold region

  7. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  8. Epidemic threshold in directed networks

    Science.gov (United States)

    Li, Cong; Wang, Huijuan; Van Mieghem, Piet

    2013-12-01

    Epidemics have so far been mostly studied in undirected networks. However, many real-world networks, such as the online social network Twitter and the world wide web, on which information, emotion, or malware spreads, are directed networks, composed of both unidirectional links and bidirectional links. We define the directionality ξ as the percentage of unidirectional links. The epidemic threshold τc for the susceptible-infected-susceptible (SIS) epidemic is lower bounded by 1/λ1 in directed networks, where λ1, also called the spectral radius, is the largest eigenvalue of the adjacency matrix. In this work, we propose two algorithms to generate directed networks with a given directionality ξ. The effect of ξ on the spectral radius λ1, principal eigenvector x1, spectral gap (λ1-λ2), and algebraic connectivity μN-1 is studied. Important findings are that the spectral radius λ1 decreases with the directionality ξ, whereas the spectral gap and the algebraic connectivity increase with the directionality ξ. The extent of the decrease of the spectral radius depends on both the degree distribution and the degree-degree correlation ρD. Hence, in directed networks, the epidemic threshold is larger and a random walk converges to its steady state faster than that in undirected networks with the same degree distribution.
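The abstract's central quantities, the directionality ξ, the spectral radius λ1, and the SIS threshold bound τc ≥ 1/λ1, can be illustrated numerically. The construction below (orienting a fraction ξ of the links of an Erdős-Rényi graph) is an assumed stand-in for the paper's two generation algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 0.05

def directed_network(xi):
    """Undirected G(n, p) graph with a fraction xi of links made
    unidirectional (one of the two directed entries removed at random)."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    a = (upper | upper.T).astype(float)          # undirected base graph
    i, j = np.where(np.triu(a, k=1) > 0)
    make_uni = rng.random(i.size) < xi           # links to orient
    drop_ij = make_uni & (rng.random(i.size) < 0.5)
    keep_ij = make_uni & ~drop_ij
    a[i[drop_ij], j[drop_ij]] = 0.0              # keep only j -> i
    a[j[keep_ij], i[keep_ij]] = 0.0              # keep only i -> j
    return a

def spectral_radius(a):
    # For a nonnegative matrix the Perron root is the largest real eigenvalue.
    return float(np.max(np.linalg.eigvals(a).real))

lam = {xi: spectral_radius(directed_network(xi)) for xi in (0.0, 0.5, 1.0)}
for xi, l1 in lam.items():
    print(f"xi={xi}: lambda_1={l1:.3f}, threshold bound 1/lambda_1={1/l1:.4f}")
```

Consistent with the paper's finding, λ1 decreases as ξ grows, so the epidemic threshold bound 1/λ1 is larger in more strongly directed networks.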

  9. Computational gestalts and perception thresholds.

    Science.gov (United States)

    Desolneux, Agnès; Moisan, Lionel; Morel, Jean-Michel

    2003-01-01

    In 1923, Max Wertheimer proposed a research programme and method in visual perception. He conjectured the existence of a small set of geometric grouping laws governing the perceptual synthesis of phenomenal objects, or "gestalt" from the atomic retina input. In this paper, we review this set of geometric grouping laws, using the works of Metzger, Kanizsa and their schools. In continuation, we explain why the Gestalt theory research programme can be translated into a Computer Vision programme. This translation is not straightforward, since Gestalt theory never addressed two fundamental matters: image sampling and image information measurements. Using these advances, we shall show that gestalt grouping laws can be translated into quantitative laws allowing the automatic computation of gestalts in digital images. From the psychophysical viewpoint, a main issue is raised: the computer vision gestalt detection methods deliver predictable perception thresholds. Thus, we are set in a position where we can build artificial images and check whether some kind of agreement can be found between the computationally predicted thresholds and the psychophysical ones. We describe and discuss two preliminary sets of experiments, where we compared the gestalt detection performance of several subjects with the predictable detection curve. In our opinion, the results of this experimental comparison support the idea of a much more systematic interaction between computational predictions in Computer Vision and psychophysical experiments.

  10. Threshold enhancement of diphoton resonances

    CERN Document Server

    Bharucha, Aoife; Goudelis, Andreas

    2016-10-10

    The data collected by the LHC collaborations at an energy of 13 TeV indicates the presence of an excess in the diphoton spectrum that would correspond to a resonance of a 750 GeV mass. The apparently large production cross section is nevertheless very difficult to explain in minimal models. We consider the possibility that the resonance is a pseudoscalar boson $A$ with a two--photon decay mediated by a charged and uncolored fermion having a mass at the $\frac{1}{2}M_A$ threshold and a very small decay width, $\ll 1$ MeV; one can then generate a large enhancement of the $A\gamma\gamma$ amplitude which explains the excess without invoking a large multiplicity of particles propagating in the loop, large electric charges and/or very strong Yukawa couplings. The implications of such a threshold enhancement are discussed in two explicit scenarios: i) the Minimal Supersymmetric Standard Model in which the $A$ state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through...

  11. HPS simulation and acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Luiz Martins [UERJ, Rio de Janeiro, RJ (Brazil); Pol, Maria Elena [CBPF, Rio de Janeiro, RJ (Brazil)

    2013-07-01

    Full text: The High Precision Spectrometer (HPS) is a proposal of sub-detector to be installed in the region of 200-240m from each side of CMS along the LHC beam-line to measure scattered protons from exclusive centrally produced processes, pp → p + X + p. In order to study the protons that reach the detectors, the beam-line of the LHC accelerator has to be taken into account, as the particles are deflected by dipoles and suffer the influence of quadrupoles and other beam devices. The LHC team provides a detailed description of these elements, currents, energies, magnetic fields, and all the information needed to study the propagation of the protons. The program HECTOR, developed at the University of Louvain, uses the information from LHC to calculate at any point along the beam-line the kinematic quantities that characterize the scattered protons. A simple-minded program was initially developed for the preliminary studies of acceptances varying the position and size of the foreseen detectors. Also, it took into account vertex and position smearing, to simulate a realistic resolution of the tracking detectors. These studies were performed using a particle gun generator which shot protons from the IP within reasonable ranges of possible t and ξ (the square of the four-momentum transfer and the fractional energy loss of the outgoing proton in a diffractive collision), and propagated them to the position of the tracking detectors. These kinematic quantities were reconstructed back at the IP using the transport equations from HECTOR. This simplified simulation was afterwards interfaced with the full software of CMS, CMSSW, in such a way that when a diffractive event was fully simulated and reconstructed in the central detector, the outgoing protons were treated by the HPS software and then the complete (CMS+HPS) event was output. The ExHuME generator was used to produce Monte Carlo simulations to study the mass acceptance of the HPS detector, and central and

  12. Bridging the Gap between Social Acceptance and Ethical Acceptability.

    Science.gov (United States)

    Taebi, Behnam

    2017-10-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological risk, particularly when we evaluate technologies with transnational and intergenerational risks. I argue that good governance of risky technology requires analyzing both social acceptance and ethical acceptability. Conceptually, these two notions are mostly complementary. Social acceptance studies are not capable of sufficiently capturing all the morally relevant features of risky technologies; ethical analyses do not typically include stakeholders' opinions, and they therefore lack the relevant empirical input for a thorough ethical evaluation. Only when carried out in conjunction are these two types of analysis relevant to national and international governance of risky technology. I discuss the Rawlsian wide reflective equilibrium as a method for marrying social acceptance and ethical acceptability. Although the rationale of my argument is broadly applicable, I will examine the case of multinational nuclear waste repositories in particular. This example will show how ethical issues may be overlooked if we focus only on social acceptance, and will provide a test case for demonstrating how the wide reflective equilibrium can help to bridge the proverbial acceptance-acceptability gap. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  13. Tests of the Giant Impact Hypothesis

    Science.gov (United States)

    Jones, J. H.

    1998-01-01

The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, which means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  14. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offers...... an interpretation and explanation of biases which entails that the stated preference methods need not be completely written off. In this paper we conduct a test for the validity and relevance of the DPH interpretation of biases. In a choice experiment concerning preferences for protection of Danish nature areas...... as respondents evaluate more and more choice sets. This finding supports the Discovered Preference Hypothesis interpretation and explanation of starting point bias....

  15. The Hypothesis-Driven Physical Examination.

    Science.gov (United States)

    Garibaldi, Brian T; Olson, Andrew P J

    2018-05-01

The physical examination remains a vital part of the clinical encounter. However, physical examination skills have declined in recent years, in part because of decreased time at the bedside. Many clinicians question the relevance of physical examinations in the age of technology. A hypothesis-driven approach to teaching and practicing the physical examination emphasizes the performance of maneuvers that can alter the likelihood of disease. Likelihood ratios are diagnostic weights that allow clinicians to estimate the post-test probability of disease. This hypothesis-driven approach to the physical examination increases its value and efficiency, while preserving its cultural role in the patient-physician relationship. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. MOLIERE: Automatic Biomedical Hypothesis Generation System.

    Science.gov (United States)

    Sybrandt, Justin; Shtutman, Michael; Safro, Ilya

    2017-08-01

    Hypothesis generation is becoming a crucial time-saving technique which allows biomedical researchers to quickly discover implicit connections between important concepts. Typically, these systems operate on domain-specific fractions of public medical data. MOLIERE, in contrast, utilizes information from over 24.5 million documents. At the heart of our approach lies a multi-modal and multi-relational network of biomedical objects extracted from several heterogeneous datasets from the National Center for Biotechnology Information (NCBI). These objects include but are not limited to scientific papers, keywords, genes, proteins, diseases, and diagnoses. We model hypotheses using Latent Dirichlet Allocation applied on abstracts found near shortest paths discovered within this network, and demonstrate the effectiveness of MOLIERE by performing hypothesis generation on historical data. Our network, implementation, and resulting data are all publicly available for the broad scientific community.
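The core idea of generating hypotheses from shortest paths between concepts can be illustrated with a plain breadth-first search over a toy network. The node names and edges below are invented for illustration and have no relation to MOLIERE's actual data or implementation.

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search for a shortest path in an unweighted graph."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(path + [nbr])
    return None  # no connection between the two concepts

# Toy multi-modal network mixing papers, genes, keywords, and diseases
# (hypothetical data, purely for illustration).
network = {
    "gene:TP53": ["paper:A", "paper:B"],
    "paper:A": ["gene:TP53", "disease:glioma"],
    "paper:B": ["gene:TP53", "keyword:apoptosis"],
    "keyword:apoptosis": ["paper:B", "disease:glioma"],
    "disease:glioma": ["paper:A", "keyword:apoptosis"],
}

path = shortest_path(network, "gene:TP53", "disease:glioma")
print(path)  # ['gene:TP53', 'paper:A', 'disease:glioma']
```

In a system like the one described, the documents found near such a path would then feed a topic model to characterize the putative connection.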

  17. The Method of Hypothesis in Plato's Philosophy

    Directory of Open Access Journals (Sweden)

    Malihe Aboie Mehrizi

    2016-09-01

Full Text Available The article examines the method of hypothesis in Plato's philosophy. The method is examined in the three dialogues Meno, Phaedo and Republic, in which it is explicitly indicated, tracing the change in Plato's attitude towards the position and use of the method of hypothesis within his philosophy. In Meno, drawing on geometry, Plato attempts to introduce a method that can be used in the realm of philosophy. But ultimately, in Republic, Plato's special attention to the method and its importance in philosophical investigations leads him to revise it. Here, finally, Plato introduces the particular method of philosophy, i.e., the dialectic.

  18. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  19. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
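The single-agent building block that multi-agent frameworks like this one generalize is Wald's sequential probability ratio test. Below is a minimal stand-alone sketch for Bernoulli observations; the hypothesized probabilities, error rates, and data stream are illustrative values, not taken from the paper.

```python
import math

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for Bernoulli observations.

    H0: success probability p0; H1: success probability p1 (p1 > p0).
    alpha/beta are the target type-I/type-II error rates.
    Returns ("H0" | "H1" | "undecided", number of samples consumed).
    """
    upper = math.log((1 - beta) / alpha)   # crossing this accepts H1
    lower = math.log(beta / (1 - alpha))   # crossing this accepts H0
    llr = 0.0                              # running log-likelihood ratio
    for n, x in enumerate(samples, start=1):
        if x:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

# A stream of mostly-1 observations leads to an early H1 declaration:
decision, n_used = sprt([1, 1, 1, 1, 1, 1, 0, 1, 1, 1], p0=0.2, p1=0.8)
print(decision, n_used)  # H1 after only 3 observations
```

The multi-agent setting adds costs for measurements, delay, and disagreement on top of this basic stopping rule.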

  20. Hypothesis testing of scientific Monte Carlo calculations

    Science.gov (United States)

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
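A minimal version of such a test treats the Monte Carlo estimate as a sample mean and checks it against a known expectation with a z-test. The samplers and significance level below are illustrative; real test suites would also control the family-wise error rate across many such checks.

```python
import math
import random
import statistics

def mc_mean_test(sample, true_mean, alpha=0.01):
    """Two-sided z-test: is the Monte Carlo sample mean consistent
    with the analytically known value at significance level alpha?"""
    n = len(sample)
    mean = statistics.fmean(sample)
    stderr = statistics.stdev(sample) / math.sqrt(n)
    z = (mean - true_mean) / stderr
    # For alpha = 0.01, the two-sided critical value is about 2.576.
    return abs(z) < 2.576

rng = random.Random(42)
good = [rng.uniform(0.0, 1.0) for _ in range(10_000)]   # correct sampler, E[X] = 0.5
buggy = [rng.uniform(0.0, 1.1) for _ in range(10_000)]  # bugged upper bound, E[X] = 0.55

print(mc_mean_test(good, 0.5))   # a correct sampler should (almost always) pass
print(mc_mean_test(buggy, 0.5))  # False: the biased sampler is flagged
```

By construction the test wrongly flags a correct sampler with probability alpha, which is the price of a statistical rather than deterministic check.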

  1. Reverse hypothesis machine learning a practitioner's perspective

    CERN Document Server

    Kulkarni, Parag

    2017-01-01

This book introduces a paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming; hence knowledge-innovation-based learning is needed. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  2. Exploring heterogeneous market hypothesis using realized volatility

    Science.gov (United States)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

This study investigates the heterogeneous market hypothesis using high frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supported the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocated two power-variation realized volatilities for forecast evaluation and risk measurement in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining the market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
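The heterogeneous autoregressive (HAR) framework regresses tomorrow's realized volatility on daily, weekly, and monthly RV averages, one per trader horizon. The sketch below only builds those regressors from a toy RV series (invented numbers); fitting the coefficients would be an ordinary least-squares step on top of it.

```python
def har_features(rv, week=5, month=22):
    """Build HAR-RV regressors from a daily realized-volatility series.

    Each feature row is [RV_daily, RV_weekly, RV_monthly] (the last day,
    the 5-day mean, and the 22-day mean); the target is the next day's RV.
    """
    feats, target = [], []
    for t in range(month - 1, len(rv) - 1):
        daily = rv[t]
        weekly = sum(rv[t - week + 1 : t + 1]) / week
        monthly = sum(rv[t - month + 1 : t + 1]) / month
        feats.append([daily, weekly, monthly])
        target.append(rv[t + 1])
    return feats, target

# 30 days of illustrative realized-volatility values (hypothetical data).
rv = [0.8 + 0.1 * (i % 5) for i in range(30)]
X, y = har_features(rv)
print(len(X), X[0], y[0])
```

The three horizons are what encode the "cascade" of heterogeneous traders in a single linear model.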

  3. Water Taxation and the Double Dividend Hypothesis

    OpenAIRE

    Nicholas Kilimani

    2014-01-01

    The double dividend hypothesis contends that environmental taxes have the potential to yield multiple benefits for the economy. However, empirical evidence of the potential impacts of environmental taxation in developing countries is still limited. This paper seeks to contribute to the literature by exploring the impact of a water tax in a developing country context, with Uganda as a case study. Policy makers in Uganda are exploring ways of raising revenue by taxing environmental goods such a...

  4. [Working memory, phonological awareness and spelling hypothesis].

    Science.gov (United States)

    Gindri, Gigiane; Keske-Soares, Márcia; Mota, Helena Bolli

    2007-01-01

To verify the relationship between working memory, phonological awareness and spelling hypothesis in pre-school children and first graders. Participants of this study were 90 students from state schools who presented typical linguistic development. Forty students were preschoolers, with an average age of six, and 50 were first graders, with an average age of seven. Participants were submitted to an evaluation of working memory abilities based on the Working Memory Model (Baddeley, 2000), involving the phonological loop. The phonological loop was evaluated using the Auditory Sequential Test, subtest 5 of the Illinois Test of Psycholinguistic Abilities (ITPA), Brazilian version (Bogossian & Santos, 1977), and the Meaningless Words Memory Test (Kessler, 1997). Phonological awareness abilities were investigated using the Phonological Awareness: Instrument of Sequential Assessment (CONFIAS - Moojen et al., 2003), involving syllabic and phonemic awareness tasks. Writing was characterized according to Ferreiro & Teberosky (1999). Preschoolers were able to repeat sequences of 4.80 digits and 4.30 syllables on average. Regarding phonological awareness, their performance was 19.68 at the syllabic level and 8.58 at the phonemic level. Most of the preschoolers demonstrated a pre-syllabic writing hypothesis. First graders repeated, on average, sequences of 5.06 digits and 4.56 syllables. These children presented phonological awareness scores of 31.12 at the syllabic level and 16.18 at the phonemic level, and demonstrated an alphabetic writing hypothesis. Working memory performance, phonological awareness and spelling level are inter-related, as well as being related to chronological age, development and schooling.

  5. Privacy on Hypothesis Testing in Smart Grids

    OpenAIRE

    Li, Zuxing; Oechtering, Tobias

    2015-01-01

In this paper, we study the problem of privacy information leakage in a smart grid. The privacy risk is assumed to be caused by an unauthorized binary hypothesis test of the consumer's behaviour based on the smart meter readings of energy supplies from the energy provider. Additional energy supplies are produced by an alternative energy source. A controller equipped with an energy storage device manages the energy inflows to satisfy the energy demand of the consumer. We study the optimal ener...

  6. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  7. Quantum effects and hypothesis of cosmic censorship

    International Nuclear Information System (INIS)

    Parnovskij, S.L.

    1989-01-01

It is shown that filamentary singularities with a linear mass of less than 10^25 g/cm only slightly distort space-time at distances exceeding the Planck length. Their formation does not change the vacuum energy and does not lead to strong quantum radiation. Therefore, the problem of their occurrence can be considered within the framework of classical collapse. Quantum effects can be ignored when considering the validity of the cosmic censorship hypothesis

  8. The (not so) immortal strand hypothesis.

    Science.gov (United States)

    Tomasetti, Cristian; Bozic, Ivana

    2015-03-01

    Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an "immortal" DNA strand is passed to the stem cell daughter and not the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both in favor and against the hypothesis has been presented. Using a novel methodology that utilizes cancer sequencing data we are able to estimate the rate of accumulation of mutations in healthy stem cells of the colon, blood and head and neck tissues. We find that in these tissues mutations in stem cells accumulate at rates strikingly similar to those expected without the protection from the immortal strand mechanism. Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provides new insight into the mechanism of DNA replication in stem cells. Copyright © 2015. Published by Elsevier B.V.
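The quantitative contrast at stake can be made explicit with a toy expected-value model: if each division deposits mu new replication errors on the newly synthesized strand, an immortal-strand stem cell retains none of them, while under random segregation each error is kept with probability 1/2. The function and numbers below are a didactic sketch, not the cancer-sequencing estimation method used in the paper.

```python
def expected_stem_mutations(divisions, mu, immortal):
    """Expected mutations retained in the stem lineage after `divisions` rounds.

    mu: expected replication errors per division (all on the new strand).
    immortal: if True, the stem cell always keeps the old template strand,
    so no replication errors ever accumulate in the stem lineage.
    """
    if immortal:
        return 0.0
    return divisions * mu / 2.0  # each mutated copy kept with probability 1/2

# Hypothetical rate: 0.04 errors per division over 100 divisions.
print(expected_stem_mutations(100, 0.04, immortal=True))   # 0.0
print(expected_stem_mutations(100, 0.04, immortal=False))  # 2.0
```

The paper's finding, that observed stem-cell mutation rates match the non-protected expectation, corresponds to the second regime.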

  9. A test of the orthographic recoding hypothesis

    Science.gov (United States)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  10. Consumer health information seeking as hypothesis testing.

    Science.gov (United States)

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  11. Risk thresholds for alcohol consumption

    DEFF Research Database (Denmark)

    Wood, Angela M; Kaptoge, Stephen; Butterworth, Adam S

    2018-01-01

BACKGROUND: Low-risk limits recommended for alcohol consumption vary substantially across different national guidelines. To define thresholds associated with lowest risk for all-cause mortality and cardiovascular disease, we studied individual-participant data from 599 912 current drinkers without previous cardiovascular disease. METHODS: We did a combined analysis of individual-participant data from three large-scale data sources in 19 high-income countries (the Emerging Risk Factors Collaboration, EPIC-CVD, and the UK Biobank). We characterised dose-response associations and calculated hazard......·4 million person-years of follow-up. For all-cause mortality, we recorded a positive and curvilinear association with the level of alcohol consumption, with the minimum mortality risk around or below 100 g per week. Alcohol consumption was roughly linearly associated with a higher risk of stroke (HR per 100...

  12. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s² for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.
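The reported link between threshold and neuronal gain follows directly from signal detection theory: if response grows linearly with stimulus and variability is fixed, the detection threshold is the stimulus at which d' reaches some criterion. The sketch below uses invented gain and variability numbers chosen only to echo the order of magnitude of the thresholds quoted above; they are not the paper's fitted values.

```python
def detection_threshold(gain, sd, criterion=1.0):
    """Stimulus amplitude at which sensitivity d' reaches `criterion`.

    Assumes a linear rate code: mean response = gain * stimulus, with
    Gaussian trial-to-trial variability sd, so d' = gain * stimulus / sd.
    """
    return criterion * sd / gain

# Illustrative numbers: a higher-gain "irregular" afferent reaches d' = 1
# at a smaller acceleration than a lower-gain "regular" afferent.
regular = detection_threshold(gain=0.4, sd=3.0)    # gain in spikes/s per cm/s^2
irregular = detection_threshold(gain=1.0, sd=3.0)
print(regular, irregular)  # approximately 7.5 and 3.0 cm/s^2
```

The inverse dependence on gain, with no dependence on spike-count variance beyond sd, mirrors the variability pattern the study reports.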

  13. The linear hypothesis: An idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

This paper attempts to present a clear idea of what the linear (no-threshold) hypothesis (LH) is, how it was corrupted and what happened to the nuclear industry as a result, and one possible solution to this major problem for the nuclear industry. The corruption lies in the change of the LH from ''a little radiation MAY produce harm'' to ''low doses of radiation WILL KILL you.'' The result has been the retardation of the nuclear industry in the United States, although it is one of the safest industries, if not the safest. The author suggests replacing the LH with two sets of standards: one having to do with human and environmental health and safety, and the other (more stringent) for protection of manufactured items and premises. The safety standard could be some dose such as 5 rem/year. This would do away with the ALARA concept below the annual limit and with the collective dose at low doses. Benefits of the two-tier radiation standards system would be the alleviation of public fear of radiation and the health of the nuclear industry

  14. No-threshold dose-response curves for nongenotoxic chemicals: Findings and applications for risk assessment

    International Nuclear Information System (INIS)

    Sheehan, Daniel M.

    2006-01-01

    We tested the hypothesis that no threshold exists when estradiol acts through the same mechanism as an active endogenous estrogen. A Michaelis-Menten (MM) equation accounting for response saturation, background effects, and endogenous estrogen level fit a turtle sex-reversal data set with no threshold and estimated the endogenous dose. Additionally, 31 diverse literature dose-response data sets were analyzed by adding a term for nonhormonal background; good fits were obtained but endogenous dose estimations were not significant due to low resolving power. No thresholds were observed. Data sets were plotted using a normalized MM equation; all 178 data points were accommodated on a single graph. Response rates from ∼1% to >95% were well fit. The findings contradict the threshold assumption and low-dose safety. Calculating risk and assuming additivity of effects from multiple chemicals acting through the same mechanism rather than assuming a safe dose for nonthresholded curves is appropriate
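The no-threshold argument can be reproduced in a few lines: in a saturating Michaelis-Menten response, an endogenous dose already places the system on the rising limb of the curve, so any added exogenous dose raises the response above baseline. The parameter values below are arbitrary illustrations, not fitted values from the paper.

```python
def mm_response(dose, rmax, k, endogenous, background=0.0):
    """Saturating Michaelis-Menten dose-response with an endogenous-dose term.

    Because the endogenous hormone already sits on the rising part of the
    curve, the slope at zero added dose is strictly positive: no threshold.
    """
    d = dose + endogenous
    return background + rmax * d / (k + d)

r0 = mm_response(0.0, rmax=100.0, k=10.0, endogenous=1.0)  # baseline response
r1 = mm_response(0.1, rmax=100.0, k=10.0, endogenous=1.0)  # tiny added dose
print(r0, r1)
assert r1 > r0  # even a very small added dose increases the response
```

This is also why effects of multiple chemicals acting through the same mechanism add rather than each enjoying an independent "safe dose".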

  15. Should the Equilibrium Point Hypothesis (EPH) be Considered a Scientific Theory?

    Science.gov (United States)

    Sainburg, Robert L

    2015-04-01

    The purpose of this commentary is to discuss factors that limit consideration of the equilibrium point hypothesis as a scientific theory. The EPH describes control of motor neuron threshold through the variable lambda, which corresponds to a unique referent configuration for a muscle, joint, or combination of joints. One of the most compelling features of the equilibrium point hypothesis is the integration of posture and movement control into a single mechanism. While the essential core of the hypothesis is based upon spinal circuitry interacting with peripheral mechanics, the proponents have extended the theory to include the higher-level processes that generate lambda, and in doing so, imposed an injunction against the supraspinal nervous system modeling, computing, or predicting dynamics. This limitation contradicts evidence that humans take account of body and environmental dynamics in motor selection, motor control, and motor adaptation processes. A number of unresolved limitations to the EPH have been debated in the literature for many years, including whether muscle resistance to displacement, measured during movement, is adequate to support this form of control, violations in equifinality predictions, spinal circuits that alter the proposed invariant characteristic for muscles, and limitations in the description of how the complexity of spinal circuitry might be integrated to yield a unique and stable equilibrium position for a given motor neuron threshold. In addition, an important empirical limitation of EPH is the measurement of the invariant characteristic, which needs to be done under a constant central state. While there is no question that the EPH is an elegant and generative hypothesis for motor control research, the claim that this hypothesis has reached the status of a scientific theory is premature.

  16. American acceptance of nuclear power

    International Nuclear Information System (INIS)

    Barrett, W.

    1980-01-01

The characteristic adventurous spirit that built American technology will eventually lead to American acceptance of nuclear power unless an overpowering loss of nerve causes us to reject both nuclear technology and world leadership. The acceptance of new technology by society has always been accompanied by activist opposition to industrialization. To resolve the debate between environmental and exploitive extremists, we must accept with humility the basic premise that human accomplishment is a finite part of nature

  17. Tacit acceptance of the succession

    Directory of Open Access Journals (Sweden)

    Ioana NICOLAE

    2012-01-01

Full Text Available This paper examines some essential and contested aspects of the tacit acceptance of succession, in particular the distinction between acts that amount to tacit acceptance of the succession and acts that do not justify such a conclusion. The documents expressly indicated by the legislator as having tacit-acceptance value, as well as those which do not, are presented, and their most important legal effects are examined and discussed.

  18. Updating the lamellar hypothesis of hippocampal organization

    Directory of Open Access Journals (Sweden)

    Robert S Sloviter

    2012-12-01

    Full Text Available In 1971, Andersen and colleagues proposed that excitatory activity in the entorhinal cortex propagates topographically to the dentate gyrus, and on through a trisynaptic circuit lying within transverse hippocampal slices or lamellae [Andersen, Bliss, and Skrede. 1971. Lamellar organization of hippocampal pathways. Exp Brain Res 13, 222-238]. In this way, a relatively simple structure might mediate complex functions in a manner analogous to the way independent piano keys can produce a nearly infinite variety of unique outputs. The lamellar hypothesis derives primary support from the lamellar distribution of dentate granule cell axons (the mossy fibers, which innervate dentate hilar neurons and area CA3 pyramidal cells and interneurons within the confines of a thin transverse hippocampal segment. Following the initial formulation of the lamellar hypothesis, anatomical studies revealed that unlike granule cells, hilar mossy cells, CA3 pyramidal cells, and Layer II entorhinal cells all form axonal projections that are more divergent along the longitudinal axis than the clearly lamellar mossy fiber pathway. The existence of pathways with translamellar distribution patterns has been interpreted, incorrectly in our view, as justifying outright rejection of the lamellar hypothesis [Amaral and Witter. 1989. The three-dimensional organization of the hippocampal formation: a review of anatomical data. Neuroscience 31, 571-591]. We suggest that the functional implications of longitudinally-projecting axons depend not on whether they exist, but on what they do. The observation that focal granule cell layer discharges normally inhibit, rather than excite, distant granule cells suggests that longitudinal axons in the dentate gyrus may mediate "lateral" inhibition and define lamellar function, rather than undermine it. In this review, we attempt a reconsideration of the evidence that most directly impacts the physiological concept of hippocampal lamellar

  19. Hypothesis Testing as an Act of Rationality

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely, the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculitic logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. It also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.
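    The abstract's central move - treating the evolution of epistemic states under data as an information-theoretic process - can be illustrated with a minimal sketch (an illustration only, not Nearing's actual method): a discretized Bayesian update of a coin's bias, with the information gained measured as the Kullback-Leibler divergence from prior to posterior. All names and numbers below are illustrative assumptions.

    ```python
    import math

    def posterior_over_bias(flips, grid_size=101):
        """Discretized Bayesian update for a coin's heads-probability.

        Starts from a uniform prior over a grid of candidate biases and
        updates it with each observed flip (1 = heads, 0 = tails).
        """
        grid = [i / (grid_size - 1) for i in range(grid_size)]
        prior = [1.0 / grid_size] * grid_size
        post = prior[:]
        for flip in flips:
            # Multiply by the likelihood of this flip under each candidate bias.
            post = [p * (theta if flip else 1 - theta) for p, theta in zip(post, grid)]
            total = sum(post)
            post = [p / total for p in post]
        return grid, prior, post

    def kl_divergence(p, q):
        """Information gained in moving from distribution q to p (in bits)."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Seven heads out of eight flips: the posterior concentrates above 0.5,
    # and the KL divergence quantifies how much the data moved our beliefs.
    grid, prior, post = posterior_over_bias([1, 1, 1, 0, 1, 1, 1, 1])
    gain = kl_divergence(post, prior)
    ```

    The KL divergence here plays the role the abstract assigns to information theory: a non-arbitrary measure of how far the data have shifted the epistemic state, rather than a hand-set rejection criterion.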

  20. Bridging the Gap between Social Acceptance and Ethical Acceptability

    NARCIS (Netherlands)

    Taebi, B.

    2016-01-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological

  1. The conscious access hypothesis: Explaining the consciousness

    OpenAIRE

    Prakash, Ravi

    2008-01-01

    The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, exploration of consciousness from scientific perspectives is not very old. Among the myriad theories regarding the nature, functions and mechanism of consciousness, of late, cognitive theories have received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis" based on the "global workspac...

  2. The atom - from hypothesis to certainty

    International Nuclear Information System (INIS)

    Lacina, A.

    1998-01-01

    Modern history of the atomistic conception is described. The article concentrates on and discusses in detail mainly the most important hundred years of gradual development of this idea - from setting a basis of chemical atomism in the late 18th century up to the theoretical and experimental analysis of the Brownian movement, realized in the early 20th century. This analysis was generally accepted as the first indisputable proof of the particle structure of matter. (Z.J.)

  3. Interstellar colonization and the zoo hypothesis

    International Nuclear Information System (INIS)

    Jones, E.M.

    1978-01-01

    Michael Hart and others have pointed out that current estimates of the number of technological civilizations to have arisen in the Galaxy since its formation are in fundamental conflict with the expectation that such a civilization could colonize and utilize the entire Galaxy in 10 to 20 million years. This dilemma can be called Hart's paradox. Resolution of the paradox requires that one or more of the following are true: we are the Galaxy's first technical civilization; interstellar travel is immensely impractical or simply impossible; technological civilizations are very short-lived; or we inhabit a wilderness preserve. The latter is the zoo hypothesis.

  4. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Directory of Open Access Journals (Sweden)

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory but are simply random. If one is trying to predict a random set of data, one should first test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trusted. There are several methods for testing this hypothesis, and the computational power provided by the R environment makes the researcher's work easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
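    One standard randomness check of the kind the abstract alludes to is the Wald-Wolfowitz runs test on the signs of returns. The sketch below is a hedged illustration in Python (the paper itself works in R, and this is only one of several possible tests, not necessarily the authors' choice):

    ```python
    import random

    def runs_test(returns):
        """Wald-Wolfowitz runs test on the signs of a return series.

        Under the random-walk hypothesis, positive and negative returns
        alternate randomly; far too few or too many runs is evidence
        against randomness. Returns the approximate z-statistic.
        """
        signs = [r > 0 for r in returns if r != 0]
        n1 = sum(signs)                      # number of positive returns
        n2 = len(signs) - n1                 # number of negative returns
        n = n1 + n2
        runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
        mu = 2 * n1 * n2 / n + 1             # expected number of runs
        var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
        return (runs - mu) / var ** 0.5

    # A simulated i.i.d. Gaussian return series should look random (small |z|),
    # while a strongly trending series produces very few runs (large negative z).
    random.seed(0)
    walk_returns = [random.gauss(0, 1) for _ in range(500)]
    z_random = runs_test(walk_returns)
    z_trending = runs_test([1.0] * 50 + [-1.0] * 50)
    ```

    A |z| well beyond about 2 would lead one to reject randomness at conventional significance levels; the trending series above has only two runs and fails decisively.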

  5. Confluence Model or Resource Dilution Hypothesis?

    DEFF Research Database (Denmark)

    Jæger, Mads

    Studies on family background often explain the negative effect of sibship size on educational attainment by one of two theories: the Confluence Model (CM) or the Resource Dilution Hypothesis (RDH). However, as both theories – for substantively different reasons – predict that sibship size should have a negative effect on educational attainment, most studies cannot distinguish empirically between the CM and the RDH. In this paper, I use the different theoretical predictions in the CM and the RDH on the role of cognitive ability as a partial or complete mediator of the sibship size effect...

  6. Set theory and the continuum hypothesis

    CERN Document Server

    Cohen, Paul J

    2008-01-01

    This exploration of a notorious mathematical problem is the work of the man who discovered the solution. The independence of the continuum hypothesis is the focus of this study by Paul J. Cohen. It presents not only an accessible technical explanation of the author's landmark proof but also a fine introduction to mathematical logic. An emeritus professor of mathematics at Stanford University, Dr. Cohen won two of the most prestigious awards in mathematics: in 1964, he was awarded the American Mathematical Society's Bôcher Prize for analysis; and in 1966, he received the Fields Medal for Logic.

  7. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R When analyzing datasets the following questions often arise:Is there a short hand procedure for a statistical test available in SAS or R?If so, how do I use it?If not, how do I program the test myself? This book answers these questions and provides an overview of the most commonstatistical test problems in a comprehensive way, making it easy to find and performan appropriate statistical test. A general summary of statistical test theory is presented, along with a basicdescription for each test, including the

  8. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

    Ever since the classic work of Wannier in 1953, the process of treating two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power law behavior, akin to Wannier's law. Some noteworthy aspects of this derivation are that it can be conveniently continued below threshold, giving rise to a "cusp" at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained.
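    The power-law-with-cusp shape described above is easy to visualize with a toy function (an illustration only, assuming the classic Wannier exponent of roughly 1.127 for two-electron escape from a singly charged ion; the paper's Landau-Zener derivation is not reproduced here):

    ```python
    def threshold_cross_section(E, E_th=0.0, alpha=1.127, scale=1.0):
        """Toy Wannier-type threshold law: sigma ∝ (E - E_th)**alpha.

        Above threshold the cross section rises as a power law in the
        excess energy; below threshold it is zero, so the two branches
        meet in a cusp-like kink at E = E_th.
        """
        excess = E - E_th
        return scale * excess ** alpha if excess > 0 else 0.0

    sigma_below = threshold_cross_section(-0.5)   # zero below threshold
    sigma_at_one = threshold_cross_section(1.0)   # equals scale at unit excess
    sigma_at_two = threshold_cross_section(2.0)   # grows monotonically above
    ```

    Because alpha exceeds 1, the slope vanishes as E approaches the threshold from above, which is what gives the curve its characteristic kink rather than a jump.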

  9. A numerical study of threshold states

    International Nuclear Information System (INIS)

    Ata, M.S.; Grama, C.; Grama, N.; Hategan, C.

    1979-01-01

    There is some experimental evidence of charged-particle threshold states. On the statistical background of levels, some simple structures were observed in the excitation spectrum. They occur near the Coulomb threshold and have a large reduced width for decay in the threshold channel. These states were identified as charged-cluster threshold states. Such threshold states were observed in 15,16,17,18O, 18,19F, 19,20Ne, 24Mg, and 32S. The types of clusters involved were d, t, 3He, α and even 12C. They were observed in heavy-ion transfer reactions as strongly excited levels in the residual nucleus. The charged-particle threshold states occur as simple structures at high excitation energy. They could be interesting both from the nuclear structure and the nuclear reaction mechanism points of view. They could be excited as simple structures in both the compound and the residual nucleus. (author)

  10. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine if the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have on U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  11. Genotoxic thresholds, DNA repair, and susceptibility in human populations

    International Nuclear Information System (INIS)

    Jenkins, Gareth J.S.; Zair, Zoulikha; Johnson, George E.; Doak, Shareen H.

    2010-01-01

    It has been long assumed that DNA damage is induced in a linear manner with respect to the dose of a direct acting genotoxin. Thus, it is implied that direct acting genotoxic agents induce DNA damage at even the lowest of concentrations and that no 'safe' dose range exists. The linear (non-threshold) paradigm has led to the one-hit model being developed. This 'one hit' scenario can be interpreted such that a single DNA damaging event in a cell has the capability to induce a single point mutation in that cell which could (if positioned in a key growth controlling gene) lead to increased proliferation, leading ultimately to the formation of a tumour. There are many groups (including our own) who, for a decade or more, have argued that low dose exposures to direct acting genotoxins may be tolerated by cells through homeostatic mechanisms such as DNA repair. This argument stems from the existence of evolutionary adaptive mechanisms that allow organisms to adapt to low levels of exogenous sources of genotoxins. We have been particularly interested in the genotoxic effects of known mutagens at low dose exposures in human cells and have identified for the first time, in vitro, genotoxic thresholds for several mutagenic alkylating agents (Doak et al., 2007). Our working hypothesis is that DNA repair is primarily responsible for these thresholded effects at low doses by removing low levels of DNA damage but becoming saturated at higher doses. We are currently assessing the roles of base excision repair (BER) and methylguanine-DNA methyltransferase (MGMT) in the identified thresholds (Doak et al., 2008). This research area is currently important as it assesses whether 'safe' exposure levels to mutagenic chemicals can exist and allows risk assessment using appropriate safety factors to define such exposure levels. Given human variation, the mechanistic basis for genotoxic thresholds (e.g. DNA repair) has to be well defined in order that susceptible individuals are

  12. Hypothesis-driven physical examination curriculum.

    Science.gov (United States)

    Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James

    2017-12-01

    Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of the physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases, each of which included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology training performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  13. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
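    For orientation, the quantity at stake in any mediation test is the indirect effect, the product a*b of two regression slopes. The sketch below shows the standard frequentist product-of-coefficients estimate with simulated data (an illustrative assumption throughout; it is not the BayesMed/JAGS Bayesian test described in the abstract):

    ```python
    import random

    def centered_cross_sum(u, v):
        """Sum of products of deviations from the mean (unnormalized covariance)."""
        mu, mv = sum(u) / len(u), sum(v) / len(v)
        return sum((a - mu) * (b - mv) for a, b in zip(u, v))

    def indirect_effect(x, m, y):
        """Product-of-coefficients mediation estimate a*b.

        a: slope of the mediator on the independent variable (M ~ X);
        b: partial slope of the mediator in Y ~ X + M, obtained from the
        2x2 normal equations of the two-predictor regression.
        """
        sxx, smm = centered_cross_sum(x, x), centered_cross_sum(m, m)
        sxm = centered_cross_sum(x, m)
        sxy, smy = centered_cross_sum(x, y), centered_cross_sum(m, y)
        a = sxm / sxx
        b = (sxx * smy - sxm * sxy) / (sxx * smm - sxm ** 2)
        return a * b

    # Simulated data where X affects Y only through M, with true slopes
    # a = 0.5 and b = 0.8, so the true indirect effect is 0.4.
    random.seed(1)
    x = [random.gauss(0, 1) for _ in range(2000)]
    m = [0.5 * xi + random.gauss(0, 0.1) for xi in x]
    y = [0.8 * mi + random.gauss(0, 0.1) for mi in m]
    est = indirect_effect(x, m, y)
    ```

    A Bayesian test of the kind the abstract specifies would place priors on a and b and report a Bayes factor for a*b = 0 rather than a point estimate, but the estimand is the same product.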

  14. Gaussian Hypothesis Testing and Quantum Illumination.

    Science.gov (United States)

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.

  15. Inoculation stress hypothesis of environmental enrichment.

    Science.gov (United States)

    Crofton, Elizabeth J; Zhang, Yafang; Green, Thomas A

    2015-02-01

    One hallmark of psychiatric conditions is the vast continuum of individual differences in susceptibility vs. resilience resulting from the interaction of genetic and environmental factors. The environmental enrichment paradigm is an animal model that is useful for studying a range of psychiatric conditions, including protective phenotypes in addiction and depression models. The major question is how environmental enrichment, a non-drug and non-surgical manipulation, can produce such robust individual differences in such a wide range of behaviors. This paper draws from a variety of published sources to outline a coherent hypothesis of inoculation stress as a factor producing the protective enrichment phenotypes. The basic tenet suggests that chronic mild stress from living in a complex environment and interacting non-aggressively with conspecifics can inoculate enriched rats against subsequent stressors and/or drugs of abuse. This paper reviews the enrichment phenotypes, mulls the fundamental nature of environmental enrichment vs. isolation, discusses the most appropriate control for environmental enrichment, and challenges the idea that cortisol/corticosterone equals stress. The intent of the inoculation stress hypothesis of environmental enrichment is to provide a scaffold with which to build testable hypotheses for the elucidation of the molecular mechanisms underlying these protective phenotypes and thus provide new therapeutic targets to treat psychiatric/neurological conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. The Debt Overhang Hypothesis: Evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Shah Muhammad Imran

    2016-04-01

    Full Text Available This study investigates the debt overhang hypothesis for Pakistan over the period 1960-2007. The study examines empirically the dynamic behaviour of GDP, debt services, the employed labour force and investment using the time series concepts of unit roots, cointegration, error correction and causality. Our findings suggest that debt servicing has a negative impact on the productivity of both labour and capital, and that this in turn has adversely affected economic growth. By severely constraining the ability of the country to service debt, this lends support to the debt-overhang hypothesis in Pakistan. The long-run relation between debt services and economic growth implies that future increases in output will drain away in the form of high debt service payments to lender countries, as external debt acts like a tax on output. More specifically, foreign creditors will benefit more from the rise in productivity than will domestic producers and labour. This suggests that domestic labour and capital are the ultimate losers from this heavy debt burden.

  17. Roots and Route of the Artification Hypothesis

    Directory of Open Access Journals (Sweden)

    Ellen Dissanayake

    2017-08-01

    Full Text Available Over four decades, my ideas about the arts in human evolution have themselves evolved, from an original notion of art as a human behaviour of “making special” to a full-fledged hypothesis of artification. A summary of the gradual developmental path (or route) of the hypothesis, based on ethological principles and concepts, is given, and an argument is presented in which artification is described as an exaptation whose roots lie in adaptive features of ancestral mother–infant interaction that contributed to infant survival and maternal reproductive success. I show how the interaction displays features of a ritualised behaviour whose operations (formalization, repetition, exaggeration, and elaboration) can be regarded as characteristic elements of human ritual ceremonies as well as of art (including song, dance, performance, literary language, altered surroundings, and other examples of making ordinary sounds, movement, language, environments, objects, and bodies extraordinary). Participation in these behaviours in ritual practices served adaptive ends in early Homo by coordinating brain and body states, and thereby emotionally bonding members of a group in common cause as well as reducing existential anxiety in individuals. A final section situates artification within contemporary philosophical and popular ideas of art, claiming that artifying is not a synonym for or definition of art but is foundational to any evolutionary discussion of artistic/aesthetic behaviour.

  18. Hypothesis: does ochratoxin A cause testicular cancer?

    Science.gov (United States)

    Schwartz, Gary G

    2002-02-01

    Little is known about the etiology of testicular cancer, which is the most common cancer among young men. Epidemiologic data point to a carcinogenic exposure in early life or in utero, but the nature of the exposure is unknown. We hypothesize that the mycotoxin, ochratoxin A, is a cause of testicular cancer. Ochratoxin A is a naturally occurring contaminant of cereals, pigmeat, and other foods and is a known genotoxic carcinogen in animals. The major features of the descriptive epidemiology of testicular cancer (a high incidence in northern Europe, increasing incidence over time, and associations with high socioeconomic status, and with poor semen quality) are all associated with exposure to ochratoxin A. Exposure of animals to ochratoxin A via the diet or via in utero transfer induces adducts in testicular DNA. We hypothesize that consumption of foods contaminated with ochratoxin A during pregnancy and/or childhood induces lesions in testicular DNA and that puberty promotes these lesions to testicular cancer. We tested the ochratoxin A hypothesis using ecologic data on the per-capita consumption of cereals, coffee, and pigmeat, the principal dietary sources of ochratoxin A. Incidence rates for testicular cancer in 20 countries were significantly correlated with the per-capita consumption of coffee and pigmeat (r = 0.49 and 0.54, p = 0.03 and 0.01). The ochratoxin A hypothesis offers a coherent explanation for much of the descriptive epidemiology of testicular cancer and suggests new avenues for analytic research.

  19. Urbanization and the more-individuals hypothesis.

    Science.gov (United States)

    Chiari, Claudia; Dinetti, Marco; Licciardello, Cinzia; Licitra, Gaetano; Pautasso, Marco

    2010-03-01

    1. Urbanization is a landscape process affecting biodiversity world-wide. Despite many urban-rural studies of bird assemblages, it is still unclear whether more species-rich communities have more individuals, regardless of the level of urbanization. The more-individuals hypothesis assumes that species-rich communities have larger populations, thus reducing the chance of local extinctions. 2. Using newly collated avian distribution data for 1 km(2) grid cells across Florence, Italy, we show a significantly positive relationship between species richness and assemblage abundance for the whole urban area. This richness-abundance relationship persists for the 1 km(2) grid cells with less than 50% of urbanized territory, as well as for the remaining grid cells, with no significant difference in the slope of the relationship. These results support the more-individuals hypothesis as an explanation of patterns in species richness, also in human modified and fragmented habitats. 3. However, the intercept of the species richness-abundance relationship is significantly lower for highly urbanized grid cells. Our study confirms that urban communities have lower species richness but counters the common notion that assemblages in densely urbanized ecosystems have more individuals. In Florence, highly inhabited areas show fewer species and lower assemblage abundance. 4. Urbanized ecosystems are an ongoing large-scale natural experiment which can be used to test ecological theories empirically.

  20. Dynamical thresholds for complete fusion

    International Nuclear Information System (INIS)

    Davies, K.T.R.; Sierk, A.J.; Nix, J.R.

    1983-01-01

    It is our purpose here to study the effect of nuclear dissipation and shape parametrization on dynamical thresholds for compound-nucleus formation in symmetric heavy-ion reactions. This is done by solving numerically the classical equations of motion for head-on collisions to determine whether the dynamical trajectory in a multidimensional deformation space passes inside the fission saddle point and forms a compound nucleus, or whether it passes outside the fission saddle point and reseparates in a fast-fission or deep-inelastic reaction. Specifying the nuclear shape in terms of smoothly joined portions of three quadratic surfaces of revolution, we take into account three symmetric deformation coordinates. However, in some cases we reduce the number of coordinates to two by requiring the ends of the fusing system to be spherical in shape. The nuclear potential energy of deformation is determined in terms of a Coulomb energy and a double volume energy of a Yukawa-plus-exponential folding function. The collective kinetic energy is calculated for incompressible, nearly irrotational flow by means of the Werner-Wheeler approximation. Four possibilities are studied for the transfer of collective kinetic energy into internal single-particle excitation energy: zero dissipation, ordinary two-body viscosity, one-body wall-formula dissipation, and one-body wall-and-window dissipation
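    The capture-versus-reseparation logic described above can be caricatured in one dimension: integrate a trajectory toward a barrier with a friction term and check whether it passes the saddle. This is a toy sketch under stated assumptions (unit mass, a Gaussian barrier, and a linear friction force are all illustrative choices; the paper's three-quadratic-surface shapes, Yukawa-plus-exponential potential, and Werner-Wheeler kinetic energy are not modeled):

    ```python
    import math

    def reaches_saddle(e_kin, friction, v_b=1.0, dt=1e-3, max_steps=200_000):
        """Toy 1D capture model with dissipation.

        A unit-mass particle starts at x = 3 moving toward a Gaussian
        barrier V(x) = v_b * exp(-x**2), subject to a friction force
        -friction * v. Returns True if the trajectory passes the saddle
        at x = 0 (compound-nucleus formation), False if it reseparates
        or stalls without capture.
        """
        x, v = 3.0, -math.sqrt(2 * e_kin)          # incoming velocity
        for _ in range(max_steps):
            force = 2 * v_b * x * math.exp(-x * x) - friction * v
            v += force * dt                         # semi-implicit Euler step
            x += v * dt
            if x <= 0:
                return True                         # inside the saddle: captured
            if v >= 0 and x > 3.0:
                return False                        # reflected back out: reseparation
        return False

    # With zero dissipation, kinetic energy just above the barrier height
    # (v_b = 1.0) suffices for capture; with dissipation the same energy
    # is drained en route, so an "extra push" would be needed.
    captured_no_friction = reaches_saddle(1.05, friction=0.0)
    captured_with_friction = reaches_saddle(1.05, friction=0.5)
    ```

    The qualitative point mirrors the paper's question: dissipation raises the dynamical threshold above the static barrier, so whether a trajectory forms a compound nucleus depends on the assumed dissipation mechanism, not on the potential energy surface alone.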

  1. A new temperature threshold detector - Application to missile monitoring

    Science.gov (United States)

    Coston, C. J.; Higgins, E. V.

    Comprehensive thermal surveys within the case of solid propellant ballistic missile flight motors are highly desirable. For example, a problem involving motor failures due to insulator cracking at motor ignition, which took several years to solve, could have been identified immediately on the basis of a suitable thermal survey. Using conventional point measurements, such as those utilizing typical thermocouples, for such a survey on a full-scale motor is not feasible because of the great number of sensors and measurements required. An alternative approach recognizes that temperatures below a threshold (which depends on the material being monitored) are acceptable, but higher temperatures exceed design margins. In this case hot spots can be located by a grid of wire-like sensors, each sensitive to temperatures above the threshold anywhere along its length. A new type of temperature threshold detector is being developed for flight missile use. The device consists of KNO3 separating copper and Constantan metals. Above the melting point of KNO3, galvanic action provides a voltage output of a few tenths of a volt.

  2. The Younger Dryas impact hypothesis: A requiem

    Science.gov (United States)

    Pinter, Nicholas; Scott, Andrew C.; Daulton, Tyrone L.; Podoll, Andrew; Koeberl, Christian; Anderson, R. Scott; Ishman, Scott E.

    2011-06-01

    The Younger Dryas (YD) impact hypothesis is a recent theory that suggests that a cometary or meteoritic body or bodies hit and/or exploded over North America 12,900 years ago, causing the YD climate episode, extinction of Pleistocene megafauna, demise of the Clovis archeological culture, and a range of other effects. Since gaining widespread attention in 2007, substantial research has focused on testing the 12 main signatures presented as evidence of a catastrophic extraterrestrial event 12,900 years ago. Here we present a review of the impact hypothesis, including its evolution and current variants, and of efforts to test and corroborate the hypothesis. The physical evidence interpreted as signatures of an impact event can be separated into two groups. The first group consists of evidence that has been largely rejected by the scientific community and is no longer in widespread discussion, including: particle tracks in archeological chert; magnetic nodules in Pleistocene bones; impact origin of the Carolina Bays; and elevated concentrations of radioactivity, iridium, and fullerenes enriched in 3He. The second group consists of evidence that has been active in recent research and discussions: carbon spheres and elongates, magnetic grains and magnetic spherules, byproducts of catastrophic wildfire, and nanodiamonds. Over time, however, these signatures have also seen contrary evidence rather than support. Recent studies have shown that carbon spheres and elongates represent neither extraterrestrial carbon nor impact-induced megafires, but are indistinguishable from fungal sclerotia and arthropod fecal material that are a small but common component of many terrestrial deposits. Magnetic grains and spherules are heterogeneously distributed in sediments, but reported measurements of unique peaks in concentrations at the YD onset have yet to be reproduced. The magnetic grains are certainly just iron-rich detrital grains, whereas reported YD magnetic spherules are

  3. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    Science.gov (United States)

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  4. Some problems in the acceptability of implementing radiation protection programs

    International Nuclear Information System (INIS)

    Neill, R.H.

    1997-01-01

    The three fundamentals upon which radiation protection programs are based are: 1) establishing a quantitative correlation between radiation exposure and biological effects in people; 2) determining a level of acceptable risk of exposure; and 3) establishing systems to measure the radiation dose to ensure compliance with the regulations or criteria. The paper discusses the interrelationship of these fundamentals and the difficulties in obtaining a consensus on acceptable risk, and gives some examples of problems in identifying the most critical population-at-risk and in measuring dose. Despite such problems, it is recommended that we proceed with the existing conservative structure of radiation protection programs based upon a linear no-threshold model for low radiation doses to ensure public acceptability of various potential radiation risks. Voluntary compliance as well as regulatory requirements should continue to be pursued to maintain minimal exposure to ionizing radiation. (author)

  5. Cone penetrometer acceptance test report

    Energy Technology Data Exchange (ETDEWEB)

    Boechler, G.N.

    1996-09-19

    This Acceptance Test Report (ATR) documents the results of acceptance test procedure WHC-SD-WM-ATR-151. Included in this report are a summary of the tests, the results and issues, the signature and sign-off ATP pages, and a table mapping each specification to the ATP section that satisfied it.

  6. Acceptance conditions in automated negotiation

    NARCIS (Netherlands)

    Baarslag, T.; Hindriks, K.V.; Jonker, C.M.

    2011-01-01

    In every negotiation with a deadline, one of the negotiating parties has to accept an offer to avoid a break off. A break off is usually an undesirable outcome for both parties, therefore it is important that a negotiator employs a proficient mechanism to decide under which conditions to accept.

  7. Consumer Acceptance of Novel Foods

    NARCIS (Netherlands)

    Fischer, A.R.H.; Reinders, M.J.

    2016-01-01

    The success of novel foods depends to a considerable extent on whether consumers accept those innovations. This chapter provides an overview of current knowledge relevant to consumer acceptance of innovations in food. A broad range of theories and approaches to assess consumer response to

  8. Worldwide nuclear revival and acceptance

    International Nuclear Information System (INIS)

    Geraets, Luc H.; Crommelynck, Yves A.

    2009-01-01

    The current status and trends of the nuclear revival in Europe and abroad are outlined. The development of public opinion in the last decade is playing an important part. This has turned from clear rejection to careful acceptance. Transparency and open communication will be important aspects in the further development of nuclear acceptance. (orig.)

  9. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  10. Increased intensity discrimination thresholds in tinnitus subjects with a normal audiogram

    DEFF Research Database (Denmark)

    Epp, Bastian; Hots, J.; Verhey, J. L.

    2012-01-01

    Recent auditory brain stem response measurements in tinnitus subjects with normal audiograms indicate the presence of hidden hearing loss that manifests as reduced neural output from the cochlea at high sound intensities, and results from mice suggest a link to deafferentation of auditory nerve fibers. As deafferentation would lead to deficits in hearing performance, the present study investigates whether tinnitus patients with normal hearing thresholds show impairment in intensity discrimination compared to an audiometrically matched control group. Intensity discrimination thresholds were significantly increased in the tinnitus frequency range, consistent with the hypothesis that auditory nerve fiber deafferentation is associated with tinnitus.

  11. Using local multiplicity to improve effect estimation from a hypothesis-generating pharmacogenetics study.

    Science.gov (United States)

    Zou, W; Ouyang, H

    2016-02-01

    We propose a multiple estimation adjustment (MEA) method to correct effect overestimation due to selection bias from a hypothesis-generating study (HGS) in pharmacogenetics. MEA uses a hierarchical Bayesian approach to model individual effect estimates from maximum likelihood estimation (MLE) in a region jointly and shrinks them toward the regional effect. Unlike many methods that model a fixed selection scheme, MEA capitalizes on local multiplicity independent of selection. We compared mean square errors (MSEs) in simulated HGSs from naive MLE, MEA and a conditional likelihood adjustment (CLA) method that models threshold selection bias. We observed that MEA effectively reduced the MSE of MLE on null effects with or without selection, and had a clear advantage over CLA on extreme MLE estimates from null effects under lenient threshold selection in small samples, which are common among 'top' associations from a pharmacogenetics HGS.
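    The region-wise shrinkage idea behind MEA can be illustrated with a minimal empirical-Bayes sketch. This is not the authors' implementation: the data, the shared standard errors, and the method-of-moments estimator for the between-variant variance are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical per-variant effect estimates from one genomic region, all
# drawn from a null (zero-effect) truth with known standard errors.
m = 30
se = np.full(m, 0.10)
betas = rng.normal(0.0, se)

# Hierarchical model: beta_i ~ N(mu, tau^2 + se_i^2). Estimate the regional
# mean and between-variant variance by the method of moments, then shrink
# each per-variant estimate toward the regional mean.
mu = betas.mean()
tau2 = max(betas.var(ddof=1) - np.mean(se ** 2), 0.0)
weights = tau2 / (tau2 + se ** 2)   # 0 => full shrinkage, 1 => no shrinkage
shrunk = mu + weights * (betas - mu)
```

    Because the simulated region is null, tau2 is estimated near zero and the extreme estimates are pulled strongly toward the regional mean, which is the sense in which joint regional modeling reduces the MSE of 'top' associations.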

  12. Alternatives to the linear risk hypothesis

    International Nuclear Information System (INIS)

    Craig, A.G.

    1976-01-01

    A theoretical argument is presented which suggests that in using the linear hypothesis for all values of LET the low dose risk is overestimated for low LET but that it is underestimated for very high LET. The argument is based upon the idea that cell lesions which do not lead to cell death may in fact lead to a malignant cell. Expressions for the Surviving Fraction and the Cancer Risk based on this argument are given. An advantage of this very general approach is that it expresses cell survival and cancer risk entirely in terms of the cell lesions and avoids the rather contentious argument as to how the average number of lesions should be related to the dose. (U.K.)

  13. Large numbers hypothesis. II - Electromagnetic radiation

    Science.gov (United States)

    Adams, P. J.

    1983-01-01

    This paper develops the theory of electromagnetic radiation in the units covariant formalism incorporating Dirac's large numbers hypothesis (LNH). A direct field-to-particle technique is used to obtain the photon propagation equation, which explicitly involves the photon replication rate. This replication rate is fixed uniquely by requiring that the form of a free-photon distribution function be preserved, as required by the 2.7 K cosmic radiation. One finds that with this particular photon replication rate the units covariant formalism developed in Paper I actually predicts that the ratio of photon number to proton number in the universe varies as t to the 1/4, precisely in accord with LNH. The cosmological red-shift law is also derived and is shown to differ considerably from the standard form νR = const.

  14. Artistic talent in dyslexia--a hypothesis.

    Science.gov (United States)

    Chakravarty, Ambar

    2009-10-01

    The present article hints at a curious neurocognitive phenomenon of development of artistic talents in some children with dyslexia. The article also takes note of the phenomenon of creativity in the midst of language disability, as observed in the lives of creative people like Leonardo da Vinci and Albert Einstein, who were most probably affected by developmental learning disorders. It has been hypothesised that a developmental delay in the dominant hemisphere most likely 'disinhibits' the non-dominant parietal lobe, unmasking talents, artistic or otherwise, in some such individuals. The present hypothesis follows the phenomenon of paradoxical functional facilitation described earlier. It has been suggested that children with learning disorders be encouraged to develop such hidden talents to full capacity, rather than being subjected to an overemphasis on correcting the disturbed coded symbol operations in remedial training.

  15. Statistical hypothesis tests of some micrometeorological observations

    International Nuclear Information System (INIS)

    SethuRaman, S.; Tichler, J.

    1977-01-01

    Chi-square goodness-of-fit is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.43 were approximately normal. Intermittency associated with the formation and breaking of internal gravity waves in surface-based inversions over water is thought to be the reason for the non-normality.
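    The normality screening described above can be sketched as follows. The data are synthetic, and the bin count and the use of scipy's fitted-normal quantiles are my assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)   # stand-in for a surface-layer turbulence record

# Sample coefficients of skewness (g1) and excess (g2)
g1 = stats.skew(x)
g2 = stats.kurtosis(x)           # Fisher convention: 0 for a normal distribution

# Chi-square goodness-of-fit against a fitted normal, using equal-probability
# bins so every expected count is identical.
k = 20
edges = stats.norm.ppf(np.linspace(0.0, 1.0, k + 1),
                       loc=x.mean(), scale=x.std(ddof=1))
edges[0], edges[-1] = x.min() - 1.0, x.max() + 1.0   # replace +/- inf endpoints
observed, _ = np.histogram(x, bins=edges)
expected = np.full(k, x.size / k)
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = k - 1 - 2                  # two parameters (mean, std) fitted from the data
p_value = stats.chi2.sf(chi2, dof)
```

    A sample passing the |g1| < 0.43 screen and yielding a large p-value would be classed as approximately normal; otherwise the Gram-Charlier correction would be applied.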

  16. The hexagon hypothesis: Six disruptive scenarios.

    Science.gov (United States)

    Burtles, Jim

    2015-01-01

    This paper aims to bring a simple but effective and comprehensive approach to the development, delivery and monitoring of business continuity solutions. To ensure that the arguments and principles apply across the board, the paper sticks to basic underlying concepts rather than sophisticated interpretations. First, the paper explores what exactly people are defending themselves against. Second, the paper looks at how defences should be set up. Disruptive events tend to unfold in phases, each of which invites a particular style of protection, ranging from risk management through business continuity to insurance cover. Their impact upon any business operation will fall into one of six basic scenarios. The hexagon hypothesis suggests that everyone should be prepared to deal with each of these six disruptive scenarios and it provides them with a useful benchmark for business continuity.

  17. Novae, supernovae, and the island universe hypothesis

    International Nuclear Information System (INIS)

    Van Den Bergh, S.

    1988-01-01

    Arguments in Curtis's (1917) paper related to the island universe hypothesis and the existence of novae in spiral nebulae are considered. It is noted that the maximum magnitude versus rate-of-decline relation for novae may be the best tool presently available for the calibration of the extragalactic distance scale. Light curve observations of six novae are used to determine a distance of 18.6 ± 3.5 Mpc to the Virgo cluster. Results suggest that Type Ia supernovae cannot easily be used as standard candles, and that Type II supernovae are unsuitable as distance indicators. Factors other than precursor mass are probably responsible for determining the ultimate fate of evolving stars. 83 references

  18. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  19. Software thresholds alter the bias of actigraphy for monitoring sleep in team-sport athletes.

    Science.gov (United States)

    Fuller, Kate L; Juliff, Laura; Gore, Christopher J; Peiffer, Jeremiah J; Halson, Shona L

    2017-08-01

    Actical® actigraphy is commonly used to monitor athlete sleep. The proprietary software, called Actiware®, processes data with three different sleep-wake thresholds (Low, Medium or High), but there is no standardisation regarding their use. The purpose of this study was to examine the validity and bias of the sleep-wake thresholds for processing Actical® sleep data in team-sport athletes. Validation study comparing the actigraph against the accepted gold standard, polysomnography (PSG). Sixty-seven nights of sleep were recorded simultaneously with polysomnography and Actical® devices. Individual-night data were compared across five sleep measures for each sleep-wake threshold using Actiware® software. Accuracy of each sleep-wake threshold compared with PSG was evaluated from mean bias with 95% confidence limits, Pearson product-moment correlation and associated standard error of estimate. The Medium threshold generated the smallest mean bias compared with polysomnography for total sleep time (8.5 min), sleep efficiency (1.8%) and wake after sleep onset (-4.1 min); whereas the Low threshold had the smallest bias (7.5 min) for wake bouts. Bias in sleep onset latency was the same across thresholds (-9.5 min). The standard error of the estimate was similar across all thresholds: total sleep time ∼25 min, sleep efficiency ∼4.5%, wake after sleep onset ∼21 min, and wake bouts ∼8 counts. Sleep parameters measured by the Actical® device are greatly influenced by the sleep-wake threshold applied. In the present study the Medium threshold produced the smallest bias for most parameters compared with PSG. Given the magnitude of measurement variability, confidence limits should be employed when interpreting changes in sleep parameters. Copyright © 2017 Sports Medicine Australia. All rights reserved.
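    The mean-bias-with-confidence-limits comparison used in the study can be sketched on hypothetical paired data. All numbers below, including the built-in 8.5 min device bias, are illustrative stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical total-sleep-time estimates (minutes) for 67 nights
psg = rng.normal(420.0, 40.0, 67)                   # polysomnography reference
actigraph = psg + 8.5 + rng.normal(0.0, 25.0, 67)   # device with a built-in bias

diff = actigraph - psg
bias = diff.mean()                                  # mean bias vs. gold standard
sem = diff.std(ddof=1) / np.sqrt(diff.size)
lower, upper = bias - 1.96 * sem, bias + 1.96 * sem # 95% confidence limits
```

    If the interval (lower, upper) excludes zero, the device bias is statistically distinguishable from the reference, which is the kind of per-threshold comparison the study reports.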

  20. Problems with the Younger Dryas Boundary (YDB) Impact Hypothesis

    Science.gov (United States)

    Boslough, M.

    2009-12-01

    One breakthrough of 20th-century Earth science was the recognition of impacts as an important geologic process. The most obvious result is a crater. There are more than 170 confirmed terrestrial impact structures with a non-uniform spatial distribution suggesting more to be found. Many have been erased by tectonics and erosion. Deep water impacts do not form craters, and craters in ice sheets disappear when the ice melts. There is growing speculation that such hidden impacts have caused frequent major environmental events of the Holocene, but this is inconsistent with the astronomically-constrained population of Earth-crossing asteroids. Impacts can have consequences much more significant than excavation of a crater. The K/T boundary mass extinction is attributed to the environmental effects of a major impact, and some researchers argue that other extinctions, abrupt climate changes, and even civilization collapses have resulted from impacts. Nuclear winter models suggest that 2-km diameter asteroids exceed a "global catastrophe threshold" by injecting sufficient dust into the stratosphere to cause short-term climate changes, but would not necessarily collapse most natural ecosystems or cause mass extinctions. Globally-catastrophic impacts recur on timescales of about one million years. The 1994 collision of Comet Shoemaker-Levy 9 with Jupiter led us to recognize the significance of terrestrial airbursts caused by objects exploding violently in Earth’s atmosphere. We have invoked airbursts to explain rare forms of non-volcanic glasses and melts by using high-resolution computational models to improve our understanding of atmospheric explosions, and have suggested that multiple airbursts from fragmented impactors could be responsible for regional effects. Our models have been cited in support of the widely-publicized YDB impact hypothesis. Proponents claim that a broken comet exploded over North America, with some fragments cratering the Laurentide Ice Sheet. They

  1. On the immunostimulatory hypothesis of cancer

    Directory of Open Access Journals (Sweden)

    Juan Bruzzo

    2011-12-01

    There is a rather generalized belief that the worst possible outcome for the application of immunological therapies against cancer is a null effect on tumor growth. However, a significant body of evidence summarized in the immunostimulatory hypothesis of cancer suggests that, under certain circumstances, the growth of incipient and established tumors can be accelerated rather than inhibited by the immune response supposedly mounted to limit tumor growth. In order to provide more compelling evidence of this proposition, we have explored the growth behavior of twelve murine tumors (most of them of spontaneous origin) arisen in the colony of our laboratory, in putatively immunized and control mice. Using classical immunization procedures, 8 out of 12 tumors were actually stimulated in "immunized" mice while the remaining 4 were neither inhibited nor stimulated. Further, even these apparently non-antigenic tumors could reveal some antigenicity if more stringent than classical immunization procedures were used. This possibility was suggested by the results obtained with one of these four apparently non-antigenic tumors: the LB lymphoma. In effect, upon these stringent immunization pretreatments, LB was slightly inhibited or stimulated, depending on the titer of the immune reaction mounted against the tumor, with higher titers rendering inhibition and lower titers rendering tumor stimulation. All the above results are consistent with the immunostimulatory hypothesis, which entails the important therapeutic implications, contrary to the orthodoxy, that anti-tumor vaccines may run a real risk of doing harm if the vaccine-induced immunity is too weak to move the reaction into the inhibitory part of the immune response curve, and that a slight and prolonged immunodepression, rather than an immunostimulation, might interfere with the progression of some tumors and thus be an aid to cytotoxic therapies.

  2. The Stress Acceleration Hypothesis of Nightmares

    Directory of Open Access Journals (Sweden)

    Tore Nielsen

    2017-06-01

    Adverse childhood experiences can deleteriously affect future physical and mental health, increasing risk for many illnesses, including psychiatric problems, sleep disorders, and, according to the present hypothesis, idiopathic nightmares. Much like post-traumatic nightmares, which are triggered by trauma and lead to recurrent emotional dreaming about the trauma, idiopathic nightmares are hypothesized to originate in early adverse experiences that lead in later life to the expression of early memories and emotions in dream content. Accordingly, the objectives of this paper are to (1) review the existing literature on sleep, dreaming and nightmares in relation to early adverse experiences, drawing upon both empirical studies of dreaming and nightmares and books and chapters by recognized nightmare experts, and (2) propose a new approach to explaining nightmares that is based upon the Stress Acceleration Hypothesis of mental illness. The latter stipulates that susceptibility to mental illness is increased by adversity occurring during a developmentally sensitive window for emotional maturation—the infantile amnesia period—that ends around age 3½. Early adversity accelerates the neural and behavioral maturation of emotional systems governing the expression, learning, and extinction of fear memories and may afford short-term adaptive value. But it also engenders long-term dysfunctional consequences including an increased risk for nightmares. Two mechanisms are proposed: (1) disruption of infantile amnesia allows normally forgotten early childhood memories to influence later emotions, cognitions and behavior, including the common expression of threats in nightmares; (2) alterations of normal emotion regulation processes of both waking and sleep lead to increased fear sensitivity and less effective fear extinction. These changes influence an affect network previously hypothesized to regulate fear extinction during REM sleep, disruption of which leads to

  3. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    International Nuclear Information System (INIS)

    Khanna, Neha; Plassmann, Florenz

    2004-01-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed

  4. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Khanna, Neha [Department of Economics and Environmental Studies Program, Binghamton, University (LT 1004), P.O. Box 6000, Binghamton, NY 13902-6000 (United States); Plassmann, Florenz [Department of Economics, Binghamton University (LT 904), P.O. Box 6000, Binghamton, NY 13902-6000 (United States)

    2004-12-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed.

  5. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.

  6. Thresholding magnetic resonance images of human brain

    Institute of Scientific and Technical Information of China (English)

    Qing-mao HU; Wieslaw L NOWINSKI

    2005-01-01

    In this paper, methods are proposed and validated to determine the low and high thresholds needed to segment out gray matter and white matter in MR images of different pulse sequences of the human brain. First, a two-dimensional reference image is determined to represent the intensity characteristics of the original three-dimensional data. Then a region of interest of the reference image is determined where brain tissues are present. Non-supervised fuzzy c-means clustering is employed to determine the threshold for obtaining the head mask, the low threshold for T2-weighted and PD-weighted images, and the high threshold for T1-weighted, SPGR and FLAIR images. Supervised range-constrained thresholding is employed to determine the low threshold for T1-weighted, SPGR and FLAIR images. Thresholding based on pairs of boundary pixels is proposed to determine the high threshold for T2- and PD-weighted images. Quantification against public data sets with various noise and inhomogeneity levels shows that the proposed methods can yield segmentation robust to noise and intensity inhomogeneity. Qualitatively, the proposed methods work well with real clinical data.
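    A much-simplified stand-in for this kind of clustering-based threshold selection is two-class 1-D k-means on the intensity histogram; the paper itself uses fuzzy c-means and range-constrained thresholding, and the synthetic intensities below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic intensity sample with two classes, e.g. background vs. brain tissue
intensities = np.concatenate([rng.normal(40.0, 8.0, 5000),
                              rng.normal(120.0, 15.0, 5000)])

def two_means_threshold(x, iters=50):
    """1-D k-means with k = 2; the threshold is the midpoint of the centroids."""
    c_lo, c_hi = x.min(), x.max()
    for _ in range(iters):
        thr = (c_lo + c_hi) / 2.0
        c_lo = x[x <= thr].mean()   # centroid of the darker class
        c_hi = x[x > thr].mean()    # centroid of the brighter class
    return (c_lo + c_hi) / 2.0

threshold = two_means_threshold(intensities)
mask = intensities > threshold      # crude "head mask" analogue
```

    Fuzzy c-means replaces the hard assignments above with membership weights, which makes the resulting thresholds less sensitive to noise and partial-volume voxels.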

  7. Time-efficient multidimensional threshold tracking method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Kowalewski, Borys; Dau, Torsten

    2015-01-01

    Traditionally, adaptive methods have been used to reduce the time it takes to estimate psychoacoustic thresholds. However, even with adaptive methods, there are many cases where the testing time is too long to be clinically feasible, particularly when estimating thresholds as a function of another...

  8. 40 CFR 68.115 - Threshold determination.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Regulated Substances for Accidental Release Prevention... process exceeds the threshold. (b) For the purposes of determining whether more than a threshold quantity... portion of the process is less than 10 millimeters of mercury (mm Hg), the amount of the substance in the...

  9. Applying Threshold Concepts to Finance Education

    Science.gov (United States)

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  10. The adaptive value of gluttony: predators mediate the life history trade-offs of satiation threshold.

    Science.gov (United States)

    Pruitt, J N; Krauel, J J

    2010-10-01

    Animals vary greatly in their tendency to consume large meals. Yet, whether or how meal size influences fitness in wild populations is infrequently considered. Using a predator exclusion, mark-recapture experiment, we estimated selection on the amount of food accepted during an ad libitum feeding bout (hereafter termed 'satiation threshold') in the wolf spider Schizocosa ocreata. Individually marked, size-matched females of known satiation threshold were assigned to predator exclusion and predator inclusion treatments and tracked for a 40-day period. We also estimated the narrow-sense heritability of satiation threshold using dam-on-female-offspring regression. In the absence of predation, high satiation threshold was positively associated with larger and faster egg case production. However, these selective advantages were lost when predators were present. We estimated the heritability of satiation threshold to be 0.56. Taken together, our results suggest that satiation threshold can respond to selection and begets a life history trade-off in this system: high satiation threshold individuals tend to produce larger egg cases but also suffer increased susceptibility to predation. © 2010 The Authors. Journal Compilation © 2010 European Society For Evolutionary Biology.
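    The heritability estimate of 0.56 comes from a single-parent (dam) regression, where the narrow-sense heritability is twice the regression slope. A simulated sketch under a simple additive model (the simulation parameters are illustrative; only the 2 × slope rule comes from standard quantitative genetics):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 200
h2_true = 0.56   # heritability reported for satiation threshold in the abstract

# Additive model: the expected offspring phenotype regresses on a single
# parent's phenotype with slope h2 / 2 (phenotypes standardized).
dam = rng.normal(0.0, 1.0, n)
resid_sd = np.sqrt(1.0 - (h2_true / 2.0) ** 2)
offspring = (h2_true / 2.0) * dam + rng.normal(0.0, resid_sd, n)

# Regression of offspring on one parent estimates h2 / 2, so double the slope
slope = np.polyfit(dam, offspring, 1)[0]
h2_hat = 2.0 * slope
```

    With midparent values (both parents averaged) the slope itself would estimate h2; the doubling is specific to the single-parent design used here.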

  11. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    Science.gov (United States)

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on Akaike's Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, implementing our hypothesis-testing method to analyze Mini-Mental State Exam (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief of the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
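    The bilinear alternative can be profiled by grid search over the knot location; a minimal sketch on simulated decline data. The parametric-bootstrap calibration of the null distribution described in the abstract is omitted, and all simulation numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

t = np.linspace(0.0, 10.0, 60)
# Simulated score trajectory whose decline accelerates after a change point at t = 6
y = 30.0 - 1.0 * t - 2.0 * np.clip(t - 6.0, 0.0, None) + rng.normal(0.0, 0.5, t.size)

def sse_bilinear(t, y, tau):
    """Residual sum of squares of a broken-stick (bilinear) model with knot tau."""
    X = np.column_stack([np.ones_like(t), t, np.clip(t - tau, 0.0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(((y - X @ beta) ** 2).sum())

# Profile the change point over a grid of candidate locations
taus = np.linspace(1.0, 9.0, 81)
sses = np.array([sse_bilinear(t, y, tau) for tau in taus])
tau_hat = taus[sses.argmin()]

# Null model: a single straight line (constant rate of decline)
X0 = np.column_stack([np.ones_like(t), t])
beta0, *_ = np.linalg.lstsq(X0, y, rcond=None)
sse_null = float(((y - X0 @ beta0) ** 2).sum())
```

    The test statistic compares sse_null with the minimized bilinear SSE; because the knot is profiled, its null distribution is nonstandard, which is why the paper resorts to parametric bootstrap.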

  12. Effects of programming threshold and maplaw settings on acoustic thresholds and speech discrimination with the MED-EL COMBI 40+ cochlear implant.

    Science.gov (United States)

    Boyd, Paul J

    2006-12-01

    The principal task in the programming of a cochlear implant (CI) speech processor is the setting of the electrical dynamic range (output) for each electrode, to ensure that a comfortable loudness percept is obtained for a range of input levels. This typically involves separate psychophysical measurement of electrical threshold (θe) and upper tolerance levels using short current bursts generated by the fitting software. Anecdotal clinical experience and some experimental studies suggest that the measurement of θe is relatively unimportant and that the setting of upper tolerance limits is more critical for processor programming. The present study aims to test this hypothesis and examines in detail how acoustic thresholds and speech recognition are affected by the setting of the lower limit of the output ("Programming threshold" or "PT"), to understand better the influence of this parameter and how it interacts with certain other programming parameters. Test programs (maps) were generated with PT set to artificially high and low values and tested on users of the MED-EL COMBI 40+ CI system. Acoustic thresholds and speech recognition scores (sentence tests) were measured for each of the test maps. Acoustic thresholds were also measured using maps with a range of output compression functions ("maplaws"). In addition, subjective reports were recorded regarding the presence of "background threshold stimulation" which is occasionally reported by CI users if PT is set to relatively high values when using the CIS strategy. Manipulation of PT was found to have very little effect. Setting PT to minimum produced a mean 5 dB (S.D. = 6.25) increase in acoustic thresholds, relative to thresholds with PT set normally, and had no statistically significant effect on speech recognition scores on a sentence test. On the other hand, maplaw setting was found to have a significant effect on acoustic thresholds (raised as maplaw is made more linear), which provides some theoretical

  13. Consumer acceptance of functional foods

    DEFF Research Database (Denmark)

    Frewer, Lynn J.; Scholderer, Joachim; Lambert, Nigel

    2003-01-01

    In the past, it has been assumed that consumers would accept novel foods if there is a concrete and tangible consumer benefit associated with them, which implies that those functional foods would quickly be accepted. However, there is evidence that individuals are likely to differ in the extent...... to which they are likely to buy products with particular functional properties. Various cross-cultural and demographic differences in acceptance found in the literature are reviewed, as well as barriers to dietary change. In conclusion, it is argued that understanding consumer's risk perceptions...

  14. Summary of DOE threshold limits efforts

    International Nuclear Information System (INIS)

    Wickham, L.E.; Smith, C.F.; Cohen, J.J.

    1987-01-01

    The Department of Energy (DOE) has been developing the concept of threshold quantities for use in determining which waste materials may be disposed of as nonradioactive waste in DOE sanitary landfills. Waste above a threshold level could be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. After extensive review of a draft threshold guidance document in 1985, a second draft threshold background document was produced in March 1986. The second draft included a preliminary cost-benefit analysis and quality assurance considerations. The review of the second draft has been completed. Final changes to be incorporated include an in-depth cost-benefit analysis of two example sites and recommendations of how to further pursue (i.e. employ) the concept of threshold quantities within the DOE. 3 references

  15. Stylized facts from a threshold-based heterogeneous agent model

    Science.gov (United States)

    Cross, R.; Grinfeld, M.; Lamba, H.; Seaman, T.

    2007-05-01

    A class of heterogeneous agent models is investigated where investors switch trading position whenever their motivation to do so exceeds some critical threshold. These motivations can be psychological in nature or reflect behaviour suggested by the efficient market hypothesis (EMH). By introducing different propensities into a baseline model that displays EMH behaviour, one can attempt to isolate their effects upon the market dynamics. The simulation results indicate that the introduction of a herding propensity results in excess kurtosis and power-law decay consistent with those observed in actual return distributions, but not in significant long-term volatility correlations. Possible alternatives for introducing such long-term volatility correlations are then identified and discussed.
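A toy version of such a threshold-switching model can make the mechanism concrete. The sketch below is illustrative only; the switching rule, noise scale, and return definition are assumptions, not the authors' specification:

```python
import random
import statistics

def simulate(n_agents=100, n_steps=2000, herding=0.3, seed=1):
    """Minimal threshold-based heterogeneous agent model (illustrative).
    Each agent holds a trading position in {-1, +1} and switches whenever
    its accumulated motivation crosses a fixed threshold; the herding term
    pulls agents toward the current majority position."""
    rng = random.Random(seed)
    pos = [rng.choice([-1, 1]) for _ in range(n_agents)]
    motivation = [0.0] * n_agents
    threshold = 1.0
    returns = []
    for _ in range(n_steps):
        sentiment = sum(pos) / n_agents
        for i in range(n_agents):
            motivation[i] += rng.gauss(0.0, 0.1)           # idiosyncratic news
            if pos[i] * sentiment < 0:                     # agent opposes majority
                motivation[i] += herding * abs(sentiment)  # pressure to conform
            if abs(motivation[i]) >= threshold:
                pos[i] = -pos[i]                           # switch trading position
                motivation[i] = 0.0
        new_sentiment = sum(pos) / n_agents
        # return driven by the net change in positions plus small noise
        returns.append(new_sentiment - sentiment + rng.gauss(0.0, 0.01))
    return returns

def excess_kurtosis(xs):
    """Sample excess kurtosis; values > 0 indicate fat tails vs. a Gaussian."""
    m = statistics.fmean(xs)
    m2 = sum((x - m) ** 2 for x in xs) / len(xs)
    m4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return m4 / (m2 * m2) - 3.0
```

Comparing `excess_kurtosis(simulate(herding=0.0))` against runs with a positive herding propensity is the kind of experiment the abstract describes for isolating the effect of each behavioural ingredient.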

  16. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined the threshold of motion as the point at which he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and a novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures, and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold for micro- to macro-scale applications. For example, a geologic-timescale application corresponds to a threshold at which 100% of the bed is in motion, whereas a sub-second application corresponds to a threshold at which a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for re-evaluating sand transport thresholds on Earth, Mars and Titan.
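One way to operationalize such a threshold continuum is to model the fraction of the bed in motion as a smooth function of wind speed and read off a scale-dependent threshold as its inverse. The logistic form and parameter values below are purely hypothetical, not the paper's fitted distributions:

```python
import math

def fraction_in_motion(u, u50=7.0, k=1.5):
    """Illustrative logistic model of the fraction of bed grains in motion
    at wind speed u (m/s); u50 is the speed at which half the bed moves and
    k sets the transition width. Both are hypothetical fit parameters."""
    return 1.0 / (1.0 + math.exp(-(u - u50) / k))

def threshold_speed(fraction, u50=7.0, k=1.5):
    """Wind speed at which a chosen fraction of the bed is in motion, i.e.
    the inverse of the logistic curve. Choosing fraction near 0 gives a
    single-particle (sub-second) threshold; fraction near 1 gives a
    whole-bed (geologic-timescale) threshold."""
    return u50 + k * math.log(fraction / (1.0 - fraction))
```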

  17. Near death experiences: a multidisciplinary hypothesis.

    Science.gov (United States)

    Bókkon, István; Mallick, Birendra N; Tuszynski, Jack A

    2013-01-01

    Recently, we proposed a novel biophysical concept regarding the appearance of brilliant lights during near death experiences (NDEs) (Bókkon and Salari, 2012). Specifically, perceiving brilliant light in NDEs has been proposed to arise due to the reperfusion that produces unregulated overproduction of free radicals and energetically excited molecules that can generate a transient enhancement of bioluminescent biophotons in different areas of the brain, including retinotopic visual areas. If this excess of bioluminescent photon emission exceeds a threshold in retinotopic visual areas, this can appear as (phosphene) lights because the brain interprets these intrinsic retinotopic bioluminescent photons as if they originated from the external physical world. Here, we review relevant literature that reported experimental studies (Imaizumi et al., 1984; Suzuki et al., 1985) that essentially support our previously published concept, i.e., that seeing lights in NDEs may be due to the transient enhancement of bioluminescent biophotons. Next, we briefly describe our biophysical visual representation model that may explain, through biophotons, the brilliant lights experienced during NDEs (as phosphenes) and the REM-sleep-associated dream-like intrinsic visual imagery of NDEs. Finally, we link our biophysical visual representation notion to self-consciousness that may involve extremely low-energy quantum entanglements. This article is intended to introduce novel concepts for discussion and does not pretend to give the ultimate explanation for the currently unanswerable questions about matter, life and soul; their creation and their interrelationship.

  18. Market Acceptance of Smart Growth

    Science.gov (United States)

    This report finds that smart growth developments enjoy market acceptance because of stability in prices over time. Housing resales in smart growth developments often have greater appreciation than their conventional suburban counterparts.

  19. L-286, Acceptance Test Record

    International Nuclear Information System (INIS)

    HARMON, B.C.

    2000-01-01

    This document provides a detailed account of how the acceptance testing was conducted for Project L-286, ''200E Area Sanitary Water Plant Effluent Stream Reduction''. The testing of the L-286 instrumentation system was conducted under the direct supervision

  20. Standards regulations and public acceptance

    International Nuclear Information System (INIS)

    Fernandez, E.C.

    1977-01-01

    Spanish nuclear legislation and the associated procedure for the authorization of installations is summarized. Public acceptance is discussed in the context of the needs for and hazards of nuclear energy. (U.K.)

  1. Ritz, Einstein, and the Emission Hypothesis

    Science.gov (United States)

    Martínez, Alberto A.

    Just as Albert Einstein's special theory of relativity was gaining acceptance around 1908, the young Swiss physicist Walter Ritz advanced a competing though preliminary emission theory that sought to explain the phenomena of electrodynamics on the assumption that the speed of light depends on the motion of its source. I survey Ritz's unfinished work in this area and review the reasons why Einstein and other physicists rejected Ritz's and other emission theories. Since Ritz's emission theory attracted renewed attention in the 1960s, I discuss how the earlier observational evidence was misconstrued as telling against it more conclusively than actually was the case. Finally, I contrast the role played by evidence against Ritz's theory with other factors that led to the early rejection of his approach.

  2. The Matter-Gravity Entanglement Hypothesis

    Science.gov (United States)

    Kay, Bernard S.

    2018-03-01

    I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system's matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind-Horowitz-Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong-coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, which concerns the likely state of a quantum system when it is weakly coupled to an energy-bath and the total state is a random pure state with a given energy.
This result generalizes Goldstein et al.'s `canonical typicality' result to systems which are not necessarily small.

  4. Public acceptance of small reactors

    International Nuclear Information System (INIS)

    McDougall, D.S.

    1997-01-01

    The success of any nuclear program requires acceptance by the local public and all levels of government involved in the decision to initiate a reactor program. Public acceptance of a nuclear energy source is a major challenge in the successful initiation of a small reactor program. In AECL's experience, public acceptance will not be obtained until the public is convinced that the specific nuclear program is needed, safe, and economic, and of environmental benefit to the community. The term 'public acceptance' is misleading: the objective of the program is a fully informed public. The program proponent cannot force public acceptance, which is beyond his control. He can, however, ensure that the public is informed. Once information has begun to flow to the public by various means, as will be explained later, the proponent is responsible for ensuring that the information provided by him and by others is accurate. Most importantly, and perhaps most difficult to accomplish, the proponent must develop a consultative process that allows the proponent and the public to agree on actions that are acceptable to both the proponent and the community.

  5. Acceptance is in the eye of the beholder: self-esteem and motivated perceptions of acceptance from the opposite sex.

    Science.gov (United States)

    Cameron, Jessica J; Stinson, Danu Anthony; Gaetz, Roslyn; Balchen, Stacey

    2010-09-01

    Social risk elicits self-esteem differences in signature social motivations and behaviors during the relationship-initiation process. In particular, the present research tested the hypothesis that lower self-esteem individuals' (LSEs) motivation to avoid rejection leads them to self-protectively underestimate acceptance from potential romantic partners, whereas higher self-esteem individuals' (HSEs) motivation to promote new relationships leads them to overestimate acceptance. The results of 5 experiments supported these predictions. Social risk increased activation of avoidance goals for LSEs on a word-recall task but increased activation of approach goals for HSEs, as evidenced by their increased use of likeable behaviors. Consistent with these patterns of goal activation, even though actual acceptance cues were held constant across all participants, social risk decreased the amount of acceptance that LSEs perceived from their interaction partner but increased the amount of acceptance that HSEs perceived from their interaction partner. It is important to note that such self-esteem differences in avoidance goals, approach behaviors, and perceptions of acceptance were completely eliminated when social risk was removed. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  6. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

    Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2 °C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  7. Hypothesis test for synchronization: twin surrogates revisited.

    Science.gov (United States)

    Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf

    2009-03-01

    The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.

  8. Marginal contrasts and the Contrastivist Hypothesis

    Directory of Open Access Journals (Sweden)

    Daniel Currie Hall

    2016-12-01

    The Contrastivist Hypothesis (CH; Hall 2007; Dresher 2009) holds that the only features that can be phonologically active in any language are those that serve to distinguish phonemes, which presupposes that phonemic status is categorical. Many researchers, however, demonstrate the existence of gradient relations. For instance, Hall (2009) quantifies these using the information-theoretic measure of entropy (unpredictability of distribution) and shows that a pair of sounds may have an entropy between 0 (totally predictable) and 1 (totally unpredictable). We argue that the existence of such intermediate degrees of contrastiveness does not make the CH untenable, but rather offers insight into contrastive hierarchies. The existence of a continuum does not preclude categorical distinctions: a categorical line can be drawn between zero entropy (entirely predictable, and thus by the CH phonologically inactive) and non-zero entropy (at least partially contrastive, and thus potentially phonologically active). But this does not mean that intermediate degrees of surface contrastiveness are entirely irrelevant to the CH; rather, we argue, they can shed light on how deeply ingrained a phonemic distinction is in the phonological system. As an example, we provide a case study from Pulaar [ATR] harmony, which has previously been claimed to be problematic for the CH.
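The entropy measure referred to above can be computed directly from occurrence counts of the two sounds in a given environment; a minimal sketch, assuming a standard two-outcome Shannon entropy in bits:

```python
import math

def pair_entropy(count_a, count_b):
    """Shannon entropy (bits) of the choice between two sounds in one
    environment: 0 means fully predictable (complementary distribution),
    1 means fully unpredictable (full contrast). Intermediate values
    quantify marginal contrast."""
    total = count_a + count_b
    if total == 0:
        return 0.0
    h = 0.0
    for c in (count_a, count_b):
        if c:
            p = c / total
            h -= p * math.log2(p)
    return h
```

For example, a sound pair attested 10 times in one environment and never in the other scores 0 (predictable), an even split scores 1 (fully contrastive), and a 9:1 split falls strictly in between.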

  9. Confabulation: Developing the 'emotion dysregulation' hypothesis.

    Science.gov (United States)

    Turnbull, Oliver H; Salas, Christian E

    2017-02-01

    Confabulations offer unique opportunities for establishing the neurobiological basis of delusional thinking. As regards causal factors, a review of the confabulation literature suggests that neither amnesia nor executive impairment can be the sole (or perhaps even the primary) cause of all delusional beliefs - though they may act in concert with other factors. A key perspective in the modern literature is that many delusions have an emotionally positive or 'wishful' element, that may serve to modulate or manage emotional experience. Some authors have referred to this perspective as the 'emotion dysregulation' hypothesis. In this article we review the theoretical underpinnings of this approach, and develop the idea by suggesting that the positive aspects of confabulatory states may have a role in perpetuating the imbalance between cognitive control and emotion. We draw on existing evidence from fields outside neuropsychology, to argue for three main causal factors: that positive emotions are related to more global or schematic forms of cognitive processing; that positive emotions influence the accuracy of memory recollection; and that positive emotions make people more susceptible to false memories. These findings suggest that the emotions that we want to feel (or do not want to feel) can influence the way we reconstruct past experiences and generate a sense of self - a proposition that bears on a unified theory of delusional belief states. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  10. Evolutionary hypothesis for Chiari type I malformation.

    Science.gov (United States)

    Fernandes, Yvens Barbosa; Ramina, Ricardo; Campos-Herrera, Cynthia Resende; Borges, Guilherme

    2013-10-01

    Chiari I malformation (CM-I) is classically defined as a cerebellar tonsillar herniation (≥5 mm) through the foramen magnum. A decreased posterior fossa volume, mainly due to basioccipital hypoplasia and sometimes platybasia, leads to posterior fossa overcrowding and consequently cerebellar herniation. Regardless of radiological findings, embryological genetic hypotheses, or any other postulations, the real cause behind this malformation has not yet been well elucidated and remains largely unknown. The aim of this paper is to approach CM-I from a broader and new perspective, conjoining anthropology, genetics, and neurosurgery, with special focus on the substantial changes that have occurred in the posterior cranial base through human evolution. Important evolutionary allometric changes occurred during brain expansion, and genetic studies of human evolution have demonstrated an unexpectedly high rate of gene flow interchange and possibly interbreeding during this process. Based upon this review, we hypothesize that CM-I may be the result of an evolutionary anthropological imprint, caused by evolving species populations that eventually met each other and mingled in the last 1.7 million years. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Environmental Kuznets Curve Hypothesis. A Survey

    International Nuclear Information System (INIS)

    Dinda, Soumyananda

    2004-01-01

    The Environmental Kuznets Curve (EKC) hypothesis postulates an inverted-U-shaped relationship between different pollutants and per capita income, i.e., environmental pressure increases up to a certain level as income goes up; after that, it decreases. An EKC actually reveals how a technically specified measurement of environmental quality changes as the fortunes of a country change. A sizeable literature on EKC has grown in the recent period. The common point of all the studies is the assertion that the environmental quality deteriorates at the early stages of economic development/growth and subsequently improves at the later stages. In other words, environmental pressure increases faster than income at early stages of development and slows down relative to GDP growth at higher income levels. This paper reviews some theoretical developments and empirical studies dealing with the EKC phenomenon. Possible explanations for this EKC are seen in (1) the progress of economic development, from clean agrarian economy to polluting industrial economy to clean service economy; (2) the tendency of people with higher income to have a higher preference for environmental quality, etc. Evidence of the existence of the EKC has been questioned from several corners. Only some air quality indicators, especially local pollutants, show the evidence of an EKC. However, even where an EKC is empirically observed, there is still no agreement in the literature on the income level at which environmental degradation starts declining. This paper provides an overview of the EKC literature, background history, conceptual insights, policy implications, and the conceptual and methodological critique.
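Empirical EKC studies typically fit a quadratic in income and check for an inverted U (a negative quadratic coefficient), with the turning point at x* = -b/(2c). A self-contained sketch on synthetic data follows; the fitting routine and the data are illustrative, not taken from the survey:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the normal equations,
    solved by Gaussian elimination with partial pivoting. An EKC-like
    inverted U corresponds to c < 0, peaking at x* = -b / (2c)."""
    S = [sum(x ** k for x in xs) for k in range(5)]  # sums of x^0 .. x^4
    A = [[S[0], S[1], S[2]],
         [S[1], S[2], S[3]],
         [S[2], S[3], S[4]]]
    rhs = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for col in range(3):                       # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for k in range(col, 3):
                A[r][k] -= f * A[col][k]
            rhs[r] -= f * rhs[col]
    beta = [0.0, 0.0, 0.0]                     # back substitution
    for r in (2, 1, 0):
        beta[r] = (rhs[r] - sum(A[r][k] * beta[k] for k in range(r + 1, 3))) / A[r][r]
    return beta  # (a, b, c)

xs = [i * 0.5 for i in range(1, 21)]           # hypothetical per-capita income
ys = [10 + 4 * x - 0.4 * x * x for x in xs]    # synthetic inverted-U emissions
a, b1, c = fit_quadratic(xs, ys)
turning_point = -b1 / (2 * c)                  # income level where pressure peaks
```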

  12. DAMPs, ageing, and cancer: The 'DAMP Hypothesis'.

    Science.gov (United States)

    Huang, Jin; Xie, Yangchun; Sun, Xiaofang; Zeh, Herbert J; Kang, Rui; Lotze, Michael T; Tang, Daolin

    2015-11-01

    Ageing is a complex and multifactorial process characterized by the accumulation of many forms of damage at the molecular, cellular, and tissue level with advancing age. Ageing increases the risk of the onset of chronic inflammation-associated diseases such as cancer, diabetes, stroke, and neurodegenerative disease. In particular, ageing and cancer share some common origins and hallmarks such as genomic instability, epigenetic alteration, aberrant telomeres, inflammation and immune injury, reprogrammed metabolism, and degradation system impairment (including within the ubiquitin-proteasome system and the autophagic machinery). Recent advances indicate that damage-associated molecular pattern molecules (DAMPs) such as high mobility group box 1, histones, S100, and heat shock proteins play location-dependent roles inside and outside the cell. These provide interaction platforms at molecular levels linked to common hallmarks of ageing and cancer. They can act as inducers, sensors, and mediators of stress through individual plasma membrane receptors, intracellular recognition receptors (e.g., advanced glycosylation end product-specific receptors, AIM2-like receptors, RIG-I-like receptors, and NOD1-like receptors, and toll-like receptors), or following endocytic uptake. Thus, the DAMP Hypothesis is novel and complements other theories that explain the features of ageing. DAMPs represent ideal biomarkers of ageing and provide an attractive target for interventions in ageing and age-associated diseases. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Identity of Particles and Continuum Hypothesis

    Science.gov (United States)

    Berezin, Alexander A.

    2001-04-01

    Why are all electrons the same? Unlike other objects, particles and atoms (of the same isotope) are forbidden to have individuality or personal history (or to reveal their hidden variables, even if they do have them). Or at least, what we commonly call physics has so far been unable to disprove particles' sameness (Berezin and Nakhmanson, Physics Essays, 1990). Consider two opposing hypotheses: (A) particles are indeed absolutely the same, or (B) they do have individuality, but it is beyond our capacity to demonstrate it. This dilemma sounds akin to the undecidability of the Continuum Hypothesis on the existence (or not) of intermediate cardinalities between the integers and the reals (P. Cohen): both its yes and its no are consistent. Thus, the (alleged) sameness of electrons and atoms may be a physical translation (embodiment) of this fundamental Goedelian undecidability. Experiments are unlikely to help: even if we find that all electrons are the same to within 30 decimal digits, could their masses (or charges) still differ in the 100-th digit? Within (B), personalized, informationally rich (infinitely rich?) digital tails (starting at, say, the 100-th decimal) may carry an individual record of each particle's history. Within (A), the parameters (m, q) are indeed exactly the same in all digits, and their sameness is based on some inherent (meta)physical principle akin to Platonism or Eddington-type numerology.

  14. Environmental Kuznets Curve Hypothesis. A Survey

    Energy Technology Data Exchange (ETDEWEB)

    Dinda, Soumyananda [Economic Research Unit, Indian Statistical Institute, 203, B.T. Road, Kolkata-108 (India)

    2004-08-01

    The Environmental Kuznets Curve (EKC) hypothesis postulates an inverted-U-shaped relationship between different pollutants and per capita income, i.e., environmental pressure increases up to a certain level as income goes up; after that, it decreases. An EKC actually reveals how a technically specified measurement of environmental quality changes as the fortunes of a country change. A sizeable literature on EKC has grown in the recent period. The common point of all the studies is the assertion that the environmental quality deteriorates at the early stages of economic development/growth and subsequently improves at the later stages. In other words, environmental pressure increases faster than income at early stages of development and slows down relative to GDP growth at higher income levels. This paper reviews some theoretical developments and empirical studies dealing with the EKC phenomenon. Possible explanations for this EKC are seen in (1) the progress of economic development, from clean agrarian economy to polluting industrial economy to clean service economy; (2) the tendency of people with higher income to have a higher preference for environmental quality, etc. Evidence of the existence of the EKC has been questioned from several corners. Only some air quality indicators, especially local pollutants, show the evidence of an EKC. However, even where an EKC is empirically observed, there is still no agreement in the literature on the income level at which environmental degradation starts declining. This paper provides an overview of the EKC literature, background history, conceptual insights, policy implications, and the conceptual and methodological critique.

  15. A threshold concentration of anti-merozoite antibodies is required for protection from clinical episodes of malaria

    DEFF Research Database (Denmark)

    Murungi, Linda M; Kamuyu, Gathoni; Lowe, Brett

    2013-01-01

    Antibodies to selected Plasmodium falciparum merozoite antigens are often reported to be associated with protection from malaria in one epidemiological cohort, but not in another. Here, we sought to understand this paradox by exploring the hypothesis that a threshold concentration of antibodies i...

  16. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an already existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.
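The general bootstrap recipe — compute an observed distance statistic, resample under the null, and count how often the resampled statistic is at least as extreme — can be sketched with a simpler statistic (a difference in means rather than the paper's kernel-regression distance):

```python
import random
import statistics

def bootstrap_mean_test(x, y, n_boot=2000, seed=0):
    """Generic bootstrap hypothesis test sketch, not the paper's kernel
    statistic: the observed distance is |mean(x) - mean(y)|, the null of
    'no difference' is imposed by resampling both groups from the pooled
    sample, and the returned value is an approximate p-value."""
    rng = random.Random(seed)
    observed = abs(statistics.fmean(x) - statistics.fmean(y))
    pooled = list(x) + list(y)
    extreme = 0
    for _ in range(n_boot):
        bx = [rng.choice(pooled) for _ in range(len(x))]
        by = [rng.choice(pooled) for _ in range(len(y))]
        if abs(statistics.fmean(bx) - statistics.fmean(by)) >= observed:
            extreme += 1
    # +1 correction keeps the estimate strictly positive
    return (extreme + 1) / (n_boot + 1)
```

Replacing the mean-difference statistic with a kernel-regression distance between fitted models, while keeping the same resampling loop, recovers the structure of the test described in the abstract.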

  17. NEUTRON SPECTRUM MEASUREMENTS USING MULTIPLE THRESHOLD DETECTORS

    Energy Technology Data Exchange (ETDEWEB)

    Gerken, William W.; Duffey, Dick

    1963-11-15

    From American Nuclear Society Meeting, New York, Nov. 1963. The use of threshold detectors, which simultaneously undergo reactions with thermal neutrons and two or more fast-neutron threshold reactions, was applied to measurements of the neutron spectrum in a reactor. A number of different materials were irradiated to determine the most practical ones for use as multiple threshold detectors. These results, as well as counting techniques and corrections, are presented. Some materials used include aluminum, Al-Ni alloys, aluminum-nickel oxides, and magnesium orthophosphates. (auth)

  18. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  19. Radar rainfall estimation for the identification of debris-flow precipitation thresholds

    Science.gov (United States)

    Marra, Francesco; Nikolopoulos, Efthymios I.; Creutin, Jean-Dominique; Borga, Marco

    2014-05-01

    variogram) of the triggering rainfall. These results show that weather radar has the potential to effectively increase the accuracy of rainfall thresholds for debris-flow occurrence. However, these benefits may only be achieved if the same monitoring instrumentation is used both to derive the rainfall thresholds and to apply them for real-time identification of debris-flow occurrence. References Nikolopoulos, E.I., Borga, M., Crema, S., Marchi, L., Marra, F. & Guzzetti, F., 2014. Impact of uncertainty in rainfall estimation on the identification of rainfall thresholds for debris-flow occurrence. Geomorphology (conditionally accepted). Peruccacci, S., Brunetti, M.T., Luciani, S., Vennari, C., and Guzzetti, F., 2012. Lithological and seasonal control of rainfall thresholds for the possible initiation of landslides in central Italy, Geomorphology, 139-140, 79-90.

  20. Updating the mild encephalitis hypothesis of schizophrenia.

    Science.gov (United States)

    Bechter, K

    2013-04-05

    Schizophrenia seems to be a heterogeneous disorder. Emerging evidence indicates that low level neuroinflammation (LLNI) may not occur infrequently. Many infectious agents with low overall pathogenicity are risk factors for psychoses including schizophrenia and for autoimmune disorders. According to the mild encephalitis (ME) hypothesis, LLNI represents the core pathogenetic mechanism in a schizophrenia subgroup that has syndromal overlap with other psychiatric disorders. ME may be triggered by infections, autoimmunity, toxicity, or trauma. A 'late hit' and gene-environment interaction are required to explain major findings about schizophrenia, and both aspects would be consistent with the ME hypothesis. Schizophrenia risk genes stay rather constant within populations despite a resulting low number of progeny; this may result from advantages associated with risk genes, e.g., an improved immune response, which may act protectively within changing environments, although they are associated with the disadvantage of increased susceptibility to psychotic disorders. Specific schizophrenic symptoms may arise with instances of LLNI when certain brain functional systems are involved, in addition to being shaped by pre-existing liability factors. Prodrome phase and the transition to a diseased status may be related to LLNI processes emerging and varying over time. The variability in the course of schizophrenia resembles the varying courses of autoimmune disorders, which result from three required factors: genes, the environment, and the immune system. Preliminary criteria for subgrouping neurodevelopmental, genetic, ME, and other types of schizophrenias are provided. A rare example of ME schizophrenia may be observed in Borna disease virus infection. Neurodevelopmental schizophrenia due to early infections has been estimated by others to explain approximately 30% of cases, but the underlying pathomechanisms of transition to disease remain in question. LLNI (e.g. from

  1. Atopic dermatitis and the hygiene hypothesis revisited.

    Science.gov (United States)

    Flohr, Carsten; Yeo, Lindsey

    2011-01-01

    We published a systematic review on atopic dermatitis (AD) and the hygiene hypothesis in 2005. Since then, the body of literature has grown significantly. We therefore repeated our systematic review to examine the evidence from population-based studies for an association between AD risk and specific infections, childhood immunizations, the use of antibiotics and environmental exposures that lead to a change in microbial burden. Medline was searched from 1966 until June 2010 to identify relevant studies. We found an additional 49 papers suitable for inclusion. There is evidence to support an inverse relationship between AD and endotoxin, early day care, farm animal and dog exposure in early life. Cat exposure in the presence of skin barrier impairment is positively associated with AD. Helminth infection at least partially protects against AD. This is not the case for viral and bacterial infections, but consumption of unpasteurized farm milk seems protective. Routine childhood vaccinations have no effect on AD risk. The positive association between viral infections and AD found in some studies appears confounded by antibiotic prescription, which has been consistently associated with an increase in AD risk. There is convincing evidence for an inverse relationship between helminth infections and AD but no other pathogens. The protective effect seen with early day care, endotoxin, unpasteurized farm milk and animal exposure is likely to be due to a general increase in exposure to non-pathogenic microbes. This would also explain the risk increase associated with the use of broad-spectrum antibiotics. Future studies should assess skin barrier gene mutation carriage and phenotypic skin barrier impairment, as gene-environment interactions are likely to impact on AD risk. Copyright © 2011 S. Karger AG, Basel.

  2. Comparisons between detection threshold and loudness perception for individual cochlear implant channels

    Science.gov (United States)

    Bierer, Julie Arenberg; Nye, Amberly D

    2014-01-01

    Objective The objective of the present study, performed in cochlear implant listeners, was to examine how the level of current required to detect single-channel electrical pulse trains relates to loudness perception on the same channel. The working hypothesis was that channels with relatively high thresholds, when measured with a focused current pattern, interface poorly to the auditory nerve. For such channels a smaller dynamic range between perceptual threshold and the most comfortable loudness would result, in part, from a greater sensitivity to changes in electrical field spread compared to low-threshold channels. The narrower range of comfortable listening levels may have important implications for speech perception. Design Data were collected from eight, adult cochlear implant listeners implanted with the HiRes90k cochlear implant (Advanced Bionics Corp.). The partial tripolar (pTP) electrode configuration, consisting of one intracochlear active electrode, two flanking electrodes carrying a fraction (σ) of the return current, and an extracochlear ground, was used for stimulation. Single-channel detection thresholds and most comfortable listening levels were acquired using the most focused pTP configuration possible (σ ≥ 0.8) to identify three channels for further testing – those with the highest, median, and lowest thresholds – for each subject. Threshold, equal-loudness contours (at 50% of the monopolar dynamic range), and loudness growth functions were measured for each of these three test channels using various partial tripolar fractions. Results For all test channels, thresholds increased as the electrode configuration became more focused. The rate of increase with the focusing parameter σ was greatest for the high-threshold channel compared to the median- and low-threshold channels. The 50% equal-loudness contours exhibited similar rates of increase in level across test channels and subjects. Additionally, test channels with the highest

  3. Approach to DOE threshold guidance limits

    International Nuclear Information System (INIS)

    Shuman, R.D.; Wickham, L.E.

    1984-01-01

    The need for less restrictive criteria governing disposal of extremely low-level radioactive waste has long been recognized. The Low-Level Waste Management Program has been directed by the Department of Energy (DOE) to aid in the development of a threshold guidance limit for DOE low-level waste facilities. Project objectives are concerned with the definition of a threshold limit dose and pathway analysis of radionuclide transport within selected exposure scenarios at DOE sites. Results of the pathway analysis will be used to determine waste radionuclide concentration guidelines that meet the defined threshold limit dose. Methods of measurement and verification of concentration limits round out the project's goals. Work on defining a threshold limit dose is nearing completion. Pathway analysis of sanitary landfill operations at the Savannah River Plant and the Idaho National Engineering Laboratory is in progress using the DOSTOMAN computer code. Concentration limit calculations and determination of implementation procedures shall follow completion of the pathways work. 4 references

  4. Pion photoproduction on the nucleon at threshold

    International Nuclear Information System (INIS)

    Cheon, I.T.; Jeong, M.T.

    1989-08-01

    Electric dipole amplitudes of pion photoproduction on the nucleon at threshold have been calculated in the framework of the chiral bag model. Our results are in good agreement with the existing experimental data

  5. Effect of dissipation on dynamical fusion thresholds

    International Nuclear Information System (INIS)

    Sierk, A.J.

    1986-01-01

    The existence of dynamical thresholds to fusion in heavy nuclei (A greater than or equal to 200), due to the nature of the potential-energy surface, is shown. These thresholds exist even in the absence of dissipative forces, due to the coupling between the various collective deformation degrees of freedom. Using a macroscopic model of nuclear shape dynamics, it is shown how three different suggested dissipation mechanisms increase by varying amounts the excitation energy over the one-dimensional barrier required to cause compound-nucleus formation. The recently introduced surface-plus-window dissipation may give a reasonable representation of experimental data on fusion thresholds, in addition to properly describing fission-fragment kinetic energies and isoscalar giant multipole widths. Scaling of threshold results to asymmetric systems is discussed. 48 refs., 10 figs

  6. 40 CFR 98.411 - Reporting threshold.

    Science.gov (United States)

    2010-07-01

    ...) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.411 Reporting threshold. Any supplier of industrial greenhouse gases who meets the requirements of § 98.2(a)(4) must report GHG...

  7. Melanin microcavitation threshold in the near infrared

    Science.gov (United States)

    Schmidt, Morgan S.; Kennedy, Paul K.; Vincelette, Rebecca L.; Schuster, Kurt J.; Noojin, Gary D.; Wharmby, Andrew W.; Thomas, Robert J.; Rockwell, Benjamin A.

    2014-02-01

    Thresholds for microcavitation of isolated bovine and porcine melanosomes were determined using single nanosecond (ns) laser pulses in the NIR (1000 - 1319 nm) wavelength regime. Average fluence thresholds for microcavitation increased non-linearly with increasing wavelength. Average fluence thresholds were also measured for 10-ns pulses at 532 nm, and found to be comparable to visible ns pulse values published in previous reports. Fluence thresholds were used to calculate melanosome absorption coefficients, which decreased with increasing wavelength. This trend was found to be comparable to the decrease in retinal pigmented epithelial (RPE) layer absorption coefficients reported over the same wavelength region. Estimated corneal total intraocular energy (TIE) values were determined and compared to the current and proposed maximum permissible exposure (MPE) safe exposure levels. Results from this study support the proposed changes to the MPE levels.

  8. Secure information management using linguistic threshold approach

    CERN Document Server

    Ogiela, Marek R

    2013-01-01

    This book details linguistic threshold schemes for information sharing. It examines the opportunities of using these techniques to create new models of managing strategic information shared within a commercial organisation or a state institution.

  9. Robust Adaptive Thresholder For Document Scanning Applications

    Science.gov (United States)

    Hsing, To R.

    1982-12-01

    In document scanning applications, thresholding is used to obtain binary data from a scanner. However, due to: (1) a wide range of different color backgrounds; (2) density variations of printed text information; and (3) the shading effect caused by the optical systems, the use of adaptive thresholding to enhance the useful information is highly desired. This paper describes a new robust adaptive thresholder for obtaining valid binary images. It is basically a memory-type algorithm which can dynamically update the black and white reference levels to optimize a local adaptive threshold function. High-quality binary images can be obtained by this algorithm from different types of simulated test patterns. The software algorithm is described, and experimental results are presented to illustrate the procedures. Results also show that the techniques described here can be used for real-time signal processing in varied applications.
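
    The memory-type updating described in this record can be sketched as follows. The smoothing factor, the initialization from the scanline extremes, and the midpoint threshold rule are all assumptions made for the illustration, not details from the paper.

```python
def adaptive_threshold(scanline, alpha=0.1):
    """Binarize a scanline while tracking black/white reference levels.

    The threshold sits midway between the two running references, and
    each new pixel nudges the matching reference via exponential smoothing.
    """
    white = max(scanline)  # initial white reference
    black = min(scanline)  # initial black reference
    bits = []
    for px in scanline:
        threshold = (white + black) / 2.0
        bit = 1 if px >= threshold else 0
        bits.append(bit)
        # update only the reference level that this pixel matched
        if bit:
            white = (1 - alpha) * white + alpha * px
        else:
            black = (1 - alpha) * black + alpha * px
    return bits

bits = adaptive_threshold([200, 210, 50, 60, 220])  # bright, bright, dark, dark, bright
```

    Because the references decay toward recent pixel values, the threshold adapts to slow shading drifts across the page while remaining cheap enough for real-time use.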

  10. Recent progress in understanding climate thresholds

    NARCIS (Netherlands)

    Good, Peter; Bamber, Jonathan; Halladay, Kate; Harper, Anna B.; Jackson, Laura C.; Kay, Gillian; Kruijt, Bart; Lowe, Jason A.; Phillips, Oliver L.; Ridley, Jeff; Srokosz, Meric; Turley, Carol; Williamson, Phillip

    2018-01-01

    This article reviews recent scientific progress, relating to four major systems that could exhibit threshold behaviour: ice sheets, the Atlantic meridional overturning circulation (AMOC), tropical forests and ecosystem responses to ocean acidification. The focus is on advances since the

  11. Verifiable Secret Redistribution for Threshold Sharing Schemes

    National Research Council Canada - National Science Library

    Wong, Theodore M; Wang, Chenxi; Wing, Jeannette M

    2002-01-01

    .... Our protocol guards against dynamic adversaries. We observe that existing protocols either cannot be readily extended to allow redistribution between different threshold schemes, or have vulnerabilities that allow faulty old shareholders...

  12. Thresholding projection estimators in functional linear models

    OpenAIRE

    Cardot, Hervé; Johannes, Jan

    2010-01-01

    We consider the problem of estimating the regression function in functional linear regression models by proposing a new type of projection estimators which combine dimension reduction and thresholding. The introduction of a threshold rule allows to get consistency under broad assumptions as well as minimax rates of convergence under additional regularity hypotheses. We also consider the particular case of Sobolev spaces generated by the trigonometric basis which permits to get easily mean squ...

  13. Noise thresholds for optical quantum computers.

    Science.gov (United States)

    Dawson, Christopher M; Haselgrove, Henry L; Nielsen, Michael A

    2006-01-20

    In this Letter we numerically investigate the fault-tolerant threshold for optical cluster-state quantum computing. We allow both photon loss noise and depolarizing noise (as a general proxy for all local noise), and obtain a threshold region of allowed pairs of values for the two types of noise. Roughly speaking, our results show that scalable optical quantum computing is possible for photon loss probabilities < 3 × 10⁻³, and for depolarization probabilities < 10⁻⁴.

  14. Design of Threshold Controller Based Chaotic Circuits

    DEFF Research Database (Denmark)

    Mohamed, I. Raja; Murali, K.; Sinha, Sudeshna

    2010-01-01

    We propose a very simple implementation of a second-order nonautonomous chaotic oscillator, using a threshold controller as the only source of nonlinearity. We demonstrate the efficacy and simplicity of our design through numerical and experimental results. Further, we show that this approach...... of using a threshold controller as a nonlinear element, can be extended to obtain autonomous and multiscroll chaotic attractor circuits as well....

  15. A New Wavelet Threshold Function and Denoising Application

    Directory of Open Access Journals (Sweden)

    Lu Jing-yi

    2016-01-01

    Full Text Available In order to improve denoising performance, this paper introduces the basic principles of wavelet threshold denoising and the structure of traditional threshold functions, and proposes an improved wavelet threshold function and an improved fixed-threshold formula. First, the paper studies the problems existing in traditional wavelet threshold functions and introduces adjustment factors to construct a new threshold function based on the soft threshold function. Then, it studies the fixed threshold and introduces a logarithmic function of the wavelet decomposition level to design a new fixed-threshold formula. Finally, the paper uses the hard, soft, Garrote, and improved threshold functions to denoise different signals, and calculates the signal-to-noise ratio (SNR) and mean square error (MSE) after denoising for each. Theoretical analysis and experimental results showed that the proposed approach overcomes the constant deviation of the soft threshold function and the discontinuity of the hard threshold function, addresses the problem of applying the same threshold value across different decomposition scales, effectively filters the noise in the signals, and improves the SNR while reducing the MSE of the output signals.
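
    The hard, soft, and Garrote threshold functions compared in this paper are standard and can be written down directly; the improved function with adjustment factors is specific to the paper and is not reproduced here.

```python
import numpy as np

def hard_threshold(w, t):
    """Keep coefficients with |w| > t, zero the rest (discontinuous at ±t)."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Shrink every surviving coefficient toward zero by t (constant deviation)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    """Non-negative garrote: a compromise between hard and soft thresholding."""
    out = np.zeros_like(w, dtype=float)
    keep = np.abs(w) > t
    out[keep] = w[keep] - t**2 / w[keep]
    return out

coeffs = np.array([3.0, -3.0, 0.5])
```

    For a coefficient of 3.0 with t = 1, the hard rule keeps 3.0, the soft rule shrinks it to 2.0, and the Garrote rule yields 3 − 1/3 ≈ 2.67, approaching the hard rule as |w| grows — which is exactly the trade-off the paper's improved function tunes with its adjustment factors.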

  16. Nuclear Energy and Public Acceptance

    International Nuclear Information System (INIS)

    Daifuku, K.

    2002-01-01

    The continued use of nuclear power in the European Union and elsewhere requires an adequate level of public and political acceptance. A lack of acceptance is often mistakenly cited as a reason for the slowdown in nuclear power plant construction in Western Europe and as a justification for abandoning nuclear power. In fact, the reasons for the slowdown have more to do with the following two factors: Plentiful supplies of low-priced natural gas, making gas-fired power plants a more attractive investment choice; more than adequate supplies of electricity which have curbed the need for the construction of new plant of any kind. In general, moves towards a withdrawal from nuclear in certain Community countries have been due to party political pressures and have not been a response to public opposition to nuclear. In addition, opinion polls do not show widespread public opposition to the use of nuclear power. Figures consistently indicate that the use of nuclear power does not come high on the list of most people's main worries. Their main concerns focus on other issues such as crime and financial problems. In the main, electricity is taken for granted in the industrialised world. Electric power only becomes an issue when there is a threat of shortages. So if public acceptance is not the main obstacle, what is? Political acceptance is an integral part of the process in which nuclear becomes acceptable or not. The relationship between public and political acceptance and the role of the industry in this context, on how to foster a better trialogue, will be examined. (author)

  17. Patient acceptance of awake craniotomy.

    Science.gov (United States)

    Wrede, Karsten H; Stieglitz, Lennart H; Fiferna, Antje; Karst, Matthias; Gerganov, Venelin M; Samii, Madjid; von Gösseln, Hans-Henning; Lüdemann, Wolf O

    2011-12-01

    The aim of this study was to objectively assess patient acceptance of awake craniotomy in a group of neurosurgical patients who underwent this procedure for removal of lesions in or close to eloquent brain areas. Patient acceptance of awake craniotomy under local anesthesia and conscious sedation was assessed by a formal questionnaire (PPP33), initially developed for general surgery patients. The results are compared to a group of patients who had brain surgery under general anesthesia and to previously published data. The awake craniotomy (AC) group consisted of 37 male and 9 female patients (48 craniotomies) with ages ranging from 18 to 71 years. The general anesthesia (GA) group consisted of 26 male and 15 female patients (43 craniotomies) with ages ranging from 26 to 83 years. All patients in the study were included in the questionnaire analysis. In comparison to GA, the overall PPP33 score for AC was higher (p=0.07), suggesting better overall acceptance of AC. The subscale scores for AC were also significantly better than for GA on the two subscales "postoperative pain" (p=0.02) and "physical disorders" (p=0.01), and equal for the other 6 subscales. The results of the overall mean score and the scores for the subscales of the PPP33 questionnaire verify good patient acceptance of AC. Previous studies have shown good patient acceptance of awake craniotomy, but only a few times using formal approaches. By utilizing a formal questionnaire we could verify good patient acceptance of awake craniotomy for the treatment of brain tumors in or close to eloquent areas. This is a novel approach that substantiates previously published experiences. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. On the controlling parameters for fatigue-crack threshold at low homologous temperatures

    International Nuclear Information System (INIS)

    Yu, W.; Gerberich, W.W.

    1983-01-01

    Fatigue crack propagation phenomena near the threshold stress intensity level ΔK_TH have been a vigorously studied topic in recent years. Near threshold the crack propagates rather slowly, thus giving enough time for various physical and chemical reactions to take place. Room air, which is the most commonly encountered environment, can still supply various ingredients such as oxygen and water vapor (and thus hydrogen) to support these reactions. Much effort has been directed toward the environmental aspects of near-threshold fatigue crack growth. By conducting tests under vacuum, Suresh and coworkers found that the crack propagation rate in a 2-1/4 Cr-1Mo steel was higher in vacuum than in air. An oxide-induced closure, which served to reduce the effective stress intensity at the crack tip, seems to furnish a good explanation. Neumann and coworkers proposed that during the fatigue process, extrusion-intrusion pairs can develop as a consequence of reversed slip around the crack tip when the crack is propagated near the threshold stress intensity. Beevers demonstrated that fatigue fracture surfaces contact each other during unloading even under tension-tension cycling. Kanninen and Atkinson also reached the conclusion that the compressive stress acting at the crack tip due to residual plasticity can induce closure. Microstructural effects have also been cited as important factors in near-threshold crack growth. It is generally accepted that coarser grains have a beneficial effect on the resistance to near-threshold crack propagation

  19. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    Science.gov (United States)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other purposes where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For the detection of defects, the use of gradients is very popular for highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed-value threshold leads to unsatisfactory results. As defects can be both very small and large in size, segmentation of a gradient image based on percentile thresholding can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above some specific gray-level values of the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of their size. The developed method performs better than the Otsu method of thresholding and an adaptive thresholding method based on local properties.
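
    The general idea of adapting the percentile to the gradient content can be sketched with NumPy. The specific rule below (lowering the percentile in proportion to the fraction of strong-gradient pixels) and all parameter values are assumptions for the illustration, not the paper's formulation.

```python
import numpy as np

def percentile_segment(image, base_pct=99.0, strong_level=100.0, adjust=0.5):
    """Global adaptive percentile thresholding of a gradient image (sketch).

    The percentile used for thresholding is lowered when many pixels show
    a strong gradient response, so that large defects are not truncated.
    `base_pct`, `strong_level`, and `adjust` are illustrative parameters,
    not values taken from the paper.
    """
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)                      # gradient-magnitude image
    frac_strong = np.mean(grad > strong_level)   # share of strong-gradient pixels
    pct = max(base_pct - adjust * 100.0 * frac_strong, 0.0)
    return grad > np.percentile(grad, pct)

img = np.zeros((20, 20))
img[5:15, 5:15] = 255.0                          # a bright square standing in for a defect
mask = percentile_segment(img, base_pct=95.0)
```

    On this toy image only the square's boundary carries gradient energy, so the mask highlights the defect edges rather than the flat background.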

  20. Text recycling: acceptable or misconduct?

    Science.gov (United States)

    Harriman, Stephanie; Patel, Jigisha

    2014-08-16

    Text recycling, also referred to as self-plagiarism, is the reproduction of an author's own text from a previous publication in a new publication. Opinions on the acceptability of this practice vary, with some viewing it as acceptable and efficient, and others as misleading and unacceptable. In light of the lack of consensus, journal editors often have difficulty deciding how to act upon the discovery of text recycling. In response to these difficulties, we have created a set of guidelines for journal editors on how to deal with text recycling. In this editorial, we discuss some of the challenges of developing these guidelines, and how authors can avoid undisclosed text recycling.

  1. Toward an acceptable nuclear future

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1977-11-01

    The nuclear option is in danger of being foreclosed. The trend toward antinuclearism may be reversed if concerns about low-level radiation insult can be shown ultimately to be without foundation; evidence for this speculation is presented. Nevertheless it is suggested that the nuclear enterprise itself must propose new initiatives to increase the acceptability of nuclear energy. A key element of an acceptable nuclear future is cluster siting of reactors. This siting plan might be achieved by confining new reactors essentially to existing sites

  2. The zinc dyshomeostasis hypothesis of Alzheimer's disease.

    Directory of Open Access Journals (Sweden)

    Travis J A Craddock

    Full Text Available Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation, however cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels, which are either too low, or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of

  3. The zinc dyshomeostasis hypothesis of Alzheimer's disease.

    Science.gov (United States)

    Craddock, Travis J A; Tuszynski, Jack A; Chopra, Deepak; Casey, Noel; Goldstein, Lee E; Hameroff, Stuart R; Tanzi, Rudolph E

    2012-01-01

    Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation, however cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels, which are either too low, or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of polymerized

  4. Is skin penetration a determining factor in skin sensitization potential and potency? Refuting the notion of a LogKow threshold for Skin Sensitization

    Science.gov (United States)

    Summary: Background. It is widely accepted that substances that cannot penetrate through the skin will not be sensitisers. Thresholds based on relevant physicochemical parameters, such as a LogKow > 1 and a MW < 500, are assumed and widely accepted as self-evident truths. Objective...

  5. Evaluation of a Teen Dating Violence Social Marketing Campaign: Lessons Learned when the Null Hypothesis Was Accepted

    Science.gov (United States)

    Rothman, Emily F.; Decker, Michele R.; Silverman, Jay G.

    2006-01-01

    This chapter discusses a three-month statewide mass media campaign to prevent teen dating violence, "See It and Stop It." The Massachusetts campaign reached out--using television, radio, and print advertising--and also encouraged anti-violence activism in select high schools. The objective was to drive thirteen- to seventeen-year-olds to…

  6. Limitations of acceptability curves for presenting uncertainty in cost-effectiveness analysis

    NARCIS (Netherlands)

    Groot Koerkamp, Bas; Hunink, M. G. Myriam; Stijnen, Theo; Hammitt, James K.; Kuntz, Karen M.; Weinstein, Milton C.

    2007-01-01

    Clinical journals increasingly illustrate uncertainty about the cost and effect of health care interventions using cost-effectiveness acceptability curves (CEACs). CEACs present the probability that each competing alternative is optimal for a range of values of the cost-effectiveness threshold. The

  7. A shift from significance test to hypothesis test through power analysis in medical research.

    Science.gov (United States)

    Singh, G

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over Neyman-Pearson's hypothesis-test approach, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and reach conclusions in their own way. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in the reporting of significance-test results in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to Neyman-Pearson's hypothesis-test procedure.
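The power analysis the abstract credits with this shift can be made concrete. The sketch below (plain Python, normal approximation; the clinical numbers are purely illustrative and not from the paper) computes the per-group sample size for a two-sided two-sample z-test at chosen type I and type II error rates:

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_means(delta, sigma, alpha=0.05, power=0.80):
    """Per-group n for a two-sided two-sample z-test detecting a mean
    difference `delta` with common SD `sigma` (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # type I error control
    z_beta = NormalDist().inv_cdf(power)           # type II error control
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return ceil(n)

# Illustrative: detecting a 5 mmHg blood-pressure difference, SD 10 mmHg
n_per_group = sample_size_two_means(delta=5, sigma=10)  # 63 per group
```

Raising the requested power (say to 0.90) raises the required n, which is exactly the type-II-error consideration that distinguishes the Neyman-Pearson framing from a bare P value.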

  8. A shift from significance test to hypothesis test through power analysis in medical research

    Directory of Open Access Journals (Sweden)

    Singh Girish

    2006-01-01

    Full Text Available Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over Neyman-Pearson's hypothesis-test approach, which considers the probabilities of both type I and type II error. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and reach conclusions in their own way. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in the reporting of significance-test results in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to Neyman-Pearson's hypothesis-test procedure.

  9. Nitrogen trailer acceptance test report

    International Nuclear Information System (INIS)

    Kostelnik, A.J.

    1996-01-01

    This Acceptance Test Report documents compliance with the requirements of specification WHC-S-0249. The equipment was tested according to WHC-SD-WM-ATP-108 Rev.0. The equipment being tested is a portable contained nitrogen supply. The test was conducted at Norco's facility

  10. Consumer Acceptability Of Irradiated Foods

    International Nuclear Information System (INIS)

    Awoyinka, A.; Akingbohungbe, A.E.

    1994-01-01

    Three commonly used food items, maize, beans and smoked fish, were irradiated and consumer acceptability was tested through a questionnaire method. Subjects were residents of Ile-Ife, Nigeria. Respondents' attitudes towards the processing and tasting of the food were very positive, and the possibility of marketing the foods was suggested by them

  11. W-025, acceptance test report

    International Nuclear Information System (INIS)

    Roscha, V.

    1994-01-01

    This acceptance test report (ATR) has been prepared to establish the results of the field testing conducted on W-025 to demonstrate that the electrical/instrumentation systems functioned as intended by design. This is part of the RMW Land Disposal Facility

  12. Safety culture and public acceptance

    International Nuclear Information System (INIS)

    Mikhalevich, Alexander A.

    2002-01-01

    After the Chernobyl NPP accident, public acceptance became a key factor in nuclear power development all over the world. Therefore, nuclear safety culture should be based not only on technical principles, responsibilities, supervision, regulatory provisions and emergency preparedness, but also on public awareness of the minimal risk during the operation and decommissioning of NPPs, radioactive waste management, etc. (author)

  13. Euthanasia Acceptance: An Attitudinal Inquiry.

    Science.gov (United States)

    Klopfer, Fredrick J.; Price, William F.

    The study presented was conducted to examine potential relationships between attitudes regarding the dying process, including acceptance of euthanasia, and other attitudinal or demographic attributes. The survey data comprised responses given by 331 respondents in a door-to-door interview. Results are discussed in terms of preferred…

  14. AAL- technology acceptance through experience

    NARCIS (Netherlands)

    Huldtgren, A.; Ascencio San Pedro, G.; Pohlmeyer, A.E.; Romero Herrera, N.A.

    2014-01-01

    Despite substantial research and development of Ambient Assisted Living (AAL) technologies, their acceptance remains low. This is partially caused by a lack of accounting for users' needs and values, and the social contexts these systems are to be embedded in. Participatory design has some potential

  15. Energy justice: Participation promotes acceptance

    Science.gov (United States)

    Baxter, Jamie

    2017-08-01

    Wind turbines have been a go-to technology for addressing climate change, but they are increasingly a source of frustration for all stakeholders. While community ownership is often lauded as a panacea for maximizing turbine acceptance, a new study suggests that decision-making involvement — procedural fairness — matters most.

  16. Worldwide nuclear revival and acceptance

    International Nuclear Information System (INIS)

    Geraets, Luc H.; Crommelynck, Yves A.

    2010-01-01

    The paper outlines the current status and trends of the nuclear revival in Europe and abroad, the evolution of public opinion in the last decade, and the interaction between the former and the latter. It emphasises the absolute priority of professional communication and exchange in gaining public acceptance. (orig.)

  17. Espectroscopia de fotoelétrons de limiares de átomos e moléculas Atomic and molecular threshold photoelectron spectroscopy

    Directory of Open Access Journals (Sweden)

    Maria Cristina Andreolli Lopes

    2006-02-01

    Full Text Available A threshold photoelectron spectrometer applied to the study of atomic and molecular threshold photoionization processes is described. The spectrometer has been used in conjunction with a toroidal grating monochromator at the National Synchrotron Radiation Laboratory (LNLS, Brazil). It can be tuned to accept threshold electrons (< 20 meV) and works with a resolving power of 716 (~18 meV at 12 eV) with a high signal-to-noise ratio. The performance of this apparatus and some characteristics of the TGM (Toroidal Grating Monochromator) beam line of the LNLS are described and discussed by means of argon, O2 and N2 threshold photoelectron spectra.

  18. Genetic variation in threshold reaction norms for alternative reproductive tactics in male Atlantic salmon, Salmo salar.

    Science.gov (United States)

    Piché, Jacinthe; Hutchings, Jeffrey A; Blanchard, Wade

    2008-07-07

    Alternative reproductive tactics may be a product of adaptive phenotypic plasticity, such that discontinuous variation in life history depends on both the genotype and the environment. Phenotypes that fall below a genetically determined threshold adopt one tactic, while those exceeding the threshold adopt the alternative tactic. We report evidence of genetic variability in maturation thresholds for male Atlantic salmon (Salmo salar) that mature either as large (more than 1 kg) anadromous males or as small (10-150 g) parr. Using a common-garden experimental protocol, we find that the growth rate at which the sneaker parr phenotype is expressed differs among pure- and mixed-population crosses. Maturation thresholds of hybrids were intermediate to those of pure crosses, consistent with the hypothesis that the life-history switch points are heritable. Our work provides evidence, for a vertebrate, that thresholds for alternative reproductive tactics differ genetically among populations and can be modelled as discontinuous reaction norms for age and size at maturity.

  19. Effects of ultrasound frequency and tissue stiffness on the histotripsy intrinsic threshold for cavitation.

    Science.gov (United States)

    Vlaisavljevich, Eli; Lin, Kuang-Wei; Maxwell, Adam; Warnez, Matthew T; Mancia, Lauren; Singh, Rahul; Putnam, Andrew J; Fowlkes, Brian; Johnsen, Eric; Cain, Charles; Xu, Zhen

    2015-06-01

    Histotripsy is an ultrasound ablation method that depends on the initiation of a cavitation bubble cloud to fractionate soft tissue. Previous work has indicated that a cavitation cloud can be formed by a single pulse with one high-amplitude negative cycle, when the negative pressure amplitude directly exceeds a pressure threshold intrinsic to the medium. We hypothesize that the intrinsic threshold in water-based tissues is determined by the properties of the water inside the tissue, and changes in tissue stiffness or ultrasound frequency will have a minimal impact on the histotripsy intrinsic threshold. To test this hypothesis, the histotripsy intrinsic threshold was investigated both experimentally and theoretically. The probability of cavitation was measured by subjecting tissue phantoms with adjustable mechanical properties and ex vivo tissues to a histotripsy pulse of 1-2 cycles produced by 345-kHz, 500-kHz, 1.5-MHz and 3-MHz histotripsy transducers. Cavitation was detected and characterized by passive cavitation detection and high-speed photography, from which the probability of cavitation was measured versus pressure amplitude. The results revealed that the intrinsic threshold (the negative pressure at which probability = 0.5) is independent of stiffness over the range of Young's moduli (E) tested and varies only minimally with ultrasound frequency in the hundreds of kilohertz to megahertz range. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
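The abstract's definition of the intrinsic threshold, the pressure at which a fitted sigmoid crosses probability 0.5, can be illustrated with a minimal sketch. The probability data below are hypothetical, and a coarse grid search stands in for the paper's actual fitting procedure:

```python
import math

def logistic(p_neg, p50, slope):
    """Sigmoid cavitation probability vs. peak negative pressure."""
    return 1.0 / (1.0 + math.exp(-(p_neg - p50) / slope))

# hypothetical cavitation probabilities vs. peak negative pressure (MPa)
pressures = [20, 22, 24, 26, 28, 30, 32]
probs     = [0.02, 0.05, 0.20, 0.50, 0.80, 0.95, 0.98]

# coarse grid search for (p50, slope) minimizing squared error
best = min(
    ((p50, s) for p50 in [x / 10 for x in range(200, 321)]
              for s in [x / 10 for x in range(5, 51)]),
    key=lambda ps: sum((logistic(x, *ps) - y) ** 2
                       for x, y in zip(pressures, probs)),
)
threshold = best[0]  # the pressure where fitted probability = 0.5
```

By construction the fitted curve evaluates to exactly 0.5 at `threshold`, matching the abstract's operational definition of the intrinsic threshold.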

  20. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    Directory of Open Access Journals (Sweden)

    Lori Townsend

    2016-06-01

    Full Text Available This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fifty potential threshold concepts, finally settling on six information literacy threshold concepts.

  1. Diesel Engine Actuator Fault Isolation using Multiple Models Hypothesis Tests

    DEFF Research Database (Denmark)

    Bøgh, S.A.

    1994-01-01

    Detection of current faults in a D.C. motor with unknown load torques is not feasible with linear methods and threshold logic.

  2. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Full Text Available Cardiovascular disease is the leading cause of death worldwide. Automatic electrocardiogram (ECG) analysis algorithms play an important role in quick and accurate diagnosis, and their first step is QRS detection. The threshold algorithm for QRS complex detection is known for its high-speed computation and minimal memory storage; in the mobile era, it can easily be ported to portable, wearable and wireless ECG systems. However, its detection rate still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. Its main steps are preprocessing, peak finding and adaptive-threshold QRS detection. The detection rate is 99.41%, the sensitivity (Se) is 99.72% and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison with two other algorithms demonstrates its superiority. At the end of the algorithm, the suspicious abnormal area is shown and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
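The adaptive-threshold idea can be sketched in a few lines. This is an illustrative toy, not the paper's algorithm: the signal, the decay constant, and the half-peak update rule are all assumptions, and the real preprocessing stage (filtering, differentiation) is omitted:

```python
def detect_qrs(signal, init_thresh=0.5, decay=0.125):
    """Find local maxima above a threshold that adapts toward a
    fraction of each detected peak's amplitude (Pan-Tompkins-style)."""
    peaks = []
    thresh = init_thresh
    for i in range(1, len(signal) - 1):
        s = signal[i]
        if s > thresh and s >= signal[i - 1] and s > signal[i + 1]:
            peaks.append(i)
            # adapt: move the threshold toward half the new peak height
            thresh = (1 - decay) * thresh + decay * (0.5 * s)
    return peaks

# toy "ECG": flat baseline with three QRS-like spikes of varying amplitude
sig = [0.0] * 50
for pos, amp in [(10, 1.0), (25, 0.9), (40, 1.1)]:
    sig[pos] = amp
beats = detect_qrs(sig)  # sample indices of detected beats
```

The point of the adaptation is that a fixed threshold tuned to 1.0-amplitude beats would start missing beats if the signal amplitude drifted downward, whereas the running update tracks recent peak heights.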

  3. Cost-effectiveness thresholds: pros and cons.

    Science.gov (United States)

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.
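The arithmetic behind the abstract's decision rules is simple enough to sketch. The figures below are hypothetical; the 1x and 3x per-capita-GDP cutoffs are the Commission on Macroeconomics in Health convention that the abstract criticizes when used in isolation:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of health gained (e.g. per DALY averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# hypothetical new intervention vs. current practice
ratio = icer(cost_new=120_000, effect_new=60, cost_old=80_000, effect_old=40)

# GDP-multiple rule of thumb: compare against 1x and 3x per-capita GDP
gdp_per_capita = 1_500  # hypothetical
very_cost_effective = ratio < gdp_per_capita
cost_effective = ratio < 3 * gdp_per_capita
```

Here the intervention clears the 3x-GDP threshold but not the 1x-GDP one, exactly the kind of borderline verdict the abstract argues should be weighed alongside budget impact and feasibility rather than decided by the threshold alone.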

  4. At-Risk-of-Poverty Threshold

    Directory of Open Access Journals (Sweden)

    Táňa Dvornáková

    2012-06-01

    Full Text Available European Statistics on Income and Living Conditions (EU-SILC) is a survey on households' living conditions. The main aim of the survey is to get long-term comparable data on the social and economic situation of households. Data collected in the survey are used mainly in connection with the evaluation of income poverty and the determination of the at-risk-of-poverty rate. This article deals with the calculation of the at-risk-of-poverty threshold based on data from EU-SILC 2009. The main task is to compare two approaches to the computation of the at-risk-of-poverty threshold. The first approach is based on calculating the threshold for each country separately, while the second is based on calculating the threshold for all states together. The introduction summarizes common attributes in the calculation of the at-risk-of-poverty threshold, such as disposable household income and equivalised household income. Further, the different approaches to both calculations are introduced and their advantages and disadvantages stated. Finally, the at-risk-of-poverty rate calculation is described and a comparison of the at-risk-of-poverty rates based on these two different approaches is made.
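The two approaches the article compares can be sketched side by side. The incomes are invented, and the 60%-of-median share is the usual Eurostat convention, assumed here rather than taken from the article:

```python
from statistics import median

def poverty_threshold(equivalised_incomes, share=0.6):
    """At-risk-of-poverty threshold as a share of the median
    equivalised disposable income (0.6 per the Eurostat convention)."""
    return share * median(equivalised_incomes)

def at_risk_rate(incomes, threshold):
    """Share of persons with equivalised income below the threshold."""
    return sum(1 for x in incomes if x < threshold) / len(incomes)

# hypothetical equivalised incomes (thousands) for two countries
country_a = [6, 10, 12, 14, 20]   # median 12 -> own threshold 7.2
country_b = [16, 20, 24, 28, 40]  # median 24 -> own threshold 14.4

# approach 1: country-specific threshold
rate_a = at_risk_rate(country_a, poverty_threshold(country_a))
# approach 2: one threshold from the pooled distribution
pooled = poverty_threshold(country_a + country_b)
rate_a_pooled = at_risk_rate(country_a, pooled)
```

The toy numbers show the article's point: the poorer country's at-risk rate doubles when the threshold is computed over all states together, because the pooled median sits well above its national median.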

  5. Threshold concepts in finance: student perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-10-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by finance academics. In addition, we investigate the potential of a framework of different types of knowledge to differentiate the delivery of the finance curriculum and the role of modelling in finance. Our purpose is to identify ways to improve curriculum design and delivery, leading to better student outcomes. Whilst we find that there is significant overlap between what students identify as important in finance and the threshold concepts identified by academics, much of this overlap is expressed by indirect reference to the concepts. Further, whilst different types of knowledge are apparent in the student data, there is evidence that students do not necessarily distinguish conceptual from other types of knowledge. As well as investigating the finance curriculum, the research demonstrates the use of threshold concepts to compare and contrast student and academic perceptions of a discipline and, as such, is of interest to researchers in education and other disciplines.

  6. Psychophysical thresholds of face visibility during infancy

    DEFF Research Database (Denmark)

    Gelskov, Sofie; Kouider, Sid

    2010-01-01

    The ability to detect and focus on faces is a fundamental prerequisite for developing social skills. But how well can infants detect faces? Here, we address this question by studying the minimum duration at which faces must appear to trigger a behavioral response in infants. We used a preferential...... looking method in conjunction with masking and brief presentations (300 ms and below) to establish the temporal thresholds of visibility at different stages of development. We found that 5 and 10 month-old infants have remarkably similar visibility thresholds about three times higher than those of adults....... By contrast, 15 month-olds not only revealed adult-like thresholds, but also improved their performance through memory-based strategies. Our results imply that the development of face visibility follows a non-linear course and is determined by a radical improvement occurring between 10 and 15 months....

  7. Stimulated Brillouin scattering threshold in fiber amplifiers

    International Nuclear Information System (INIS)

    Liang Liping; Chang Liping

    2011-01-01

    Based on the wave coupling theory and the evolution model of the critical pump power (or Brillouin threshold) for stimulated Brillouin scattering (SBS) in double-clad fiber amplifiers, the influence of signal bandwidth, fiber-core diameter and amplifier gain on SBS threshold is simulated theoretically. And experimental measurements of SBS are presented in ytterbium-doped double-clad fiber amplifiers with single-frequency hundred nanosecond pulse amplification. Under different input signal pulses, the forward amplified pulse distortion is observed when the pulse energy is up to 660 nJ and the peak power is up to 3.3 W in the pulse amplification with pulse duration of 200 ns and repetition rate of 1 Hz. And the backward SBS narrow pulse appears. The pulse peak power equals to SBS threshold. Good agreement is shown between the modeled and experimental data. (authors)
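The reported figures are mutually consistent: for an approximately rectangular pulse, energy is peak power times duration, and 3.3 W over 200 ns gives the quoted 660 nJ. A one-line check (the rectangular-pulse approximation is an assumption, since the abstract does not state the pulse shape):

```python
# energy of an (approximately) rectangular pulse: E = P_peak * tau
peak_power_w = 3.3     # reported SBS-threshold peak power
duration_s = 200e-9    # reported pulse duration
energy_nj = peak_power_w * duration_s * 1e9  # convert J -> nJ
```
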

  8. Threshold Theory Tested in an Organizational Setting

    DEFF Research Database (Denmark)

    Christensen, Bo T.; Hartmann, Peter V. W.; Hedegaard Rasmussen, Thomas

    2017-01-01

    A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no correlation. Support for the threshold theory of creativity was found, in that the correlation between IQ and innovativeness was positive and significant below a cutoff point of IQ 120. Above the cutoff, no significant relation was identified, and the two correlations differed significantly. The finding was stable across distinct parts of the sample, providing support for the theory, although the correlations in all subsamples were small. The findings lend support to the existence of threshold effects using perceptual measures of behavior in real...
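The study's analysis amounts to comparing correlations on either side of the IQ-120 cutoff. A minimal sketch of that comparison, on invented data shaped to mimic the reported pattern (rising below the cutoff, flat above):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical (IQ, innovativeness) pairs straddling the IQ-120 cutoff
data = [(100, 2.0), (105, 2.4), (110, 2.7), (115, 3.1),  # rises with IQ
        (125, 3.2), (130, 3.1), (135, 3.3), (140, 3.2)]  # flat above cutoff
below = [(iq, y) for iq, y in data if iq < 120]
above = [(iq, y) for iq, y in data if iq >= 120]
r_below = pearson(*zip(*below))  # strong positive correlation
r_above = pearson(*zip(*above))  # much weaker correlation
```

The actual study additionally tests whether the two correlations differ significantly, which this toy comparison omits.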

  9. Effects of pulse duration on magnetostimulation thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Saritas, Emine U., E-mail: saritas@ee.bilkent.edu.tr [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of Electrical and Electronics Engineering, Bilkent University, Bilkent, Ankara 06800 (Turkey); National Magnetic Resonance Research Center (UMRAM), Bilkent University, Bilkent, Ankara 06800 (Turkey); Goodwill, Patrick W. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Conolly, Steven M. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of EECS, University of California, Berkeley, California 94720-1762 (United States)

    2015-06-15

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations

  10. Thresholds of ion turbulence in tokamaks

    International Nuclear Information System (INIS)

    Garbet, X.; Laurent, L.; Mourgues, F.; Roubin, J.P.; Samain, A.; Zou, X.L.

    1991-01-01

    The linear thresholds of ionic turbulence are numerically calculated for the tokamaks JET and TORE SUPRA. It is proved that the stability domain at η_i > 0 is determined by trapped-ion modes and is characterized by η_i ≥ 1 and a threshold L_Ti/R of order (0.2-0.3)/(1 + T_i/T_e). The latter value is significantly smaller than what has been previously predicted. Experimental temperature profiles in heated discharges are usually marginal with respect to this criterion. It is also shown that the eigenmodes are low-frequency, low-wavenumber ballooned modes, which may produce a very large transport once the threshold ion temperature gradient is reached

  11. THRESHOLD PARAMETER OF THE EXPECTED LOSSES

    Directory of Open Access Journals (Sweden)

    Josip Arnerić

    2012-12-01

    Full Text Available The objective of extreme value analysis is to quantify the probabilistic behavior of unusually large losses using only the extreme values above some high threshold, rather than all of the data; this gives a better fit to the tail of the distribution than traditional methods that assume normality. In our case we estimate market risk using daily returns of the CROBEX index at the Zagreb Stock Exchange. It is therefore necessary to model the distribution of excesses above some threshold; the Generalized Pareto Distribution (GPD) is used as it is much more reliable than the normal distribution, since it puts the accent on the extreme values. The parameters of the GPD will be estimated using the maximum likelihood method (MLE). The contribution of this paper is to specify a threshold which is large enough that the GPD approximation is valid, but low enough that a sufficient number of observations is available for a precise fit.
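The peaks-over-threshold workflow the abstract describes can be sketched end to end. Two deliberate simplifications: the losses are simulated rather than CROBEX returns, and the GPD parameters are estimated by method of moments instead of the paper's maximum likelihood, to keep the sketch dependency-free:

```python
import random

def gpd_mom(exceedances):
    """Method-of-moments estimates (shape xi, scale sigma) for a
    Generalized Pareto Distribution fitted to threshold exceedances."""
    n = len(exceedances)
    m = sum(exceedances) / n
    s2 = sum((x - m) ** 2 for x in exceedances) / (n - 1)
    xi = 0.5 * (1 - m * m / s2)   # from mean/variance of the GPD
    sigma = m * (1 - xi)
    return xi, sigma

# simulate losses with a GPD tail (true xi=0.2, sigma=1) via inverse CDF
random.seed(42)
xi_true, sigma_true = 0.2, 1.0
losses = [sigma_true / xi_true * ((1 - random.random()) ** -xi_true - 1)
          for _ in range(5000)]

u = sorted(losses)[int(0.90 * len(losses))]   # 90th-percentile threshold
exceed = [x - u for x in losses if x > u]     # excesses above u
xi_hat, sigma_hat = gpd_mom(exceed)
```

Note the abstract's trade-off in the choice of `u`: a higher quantile makes the GPD approximation better but leaves fewer exceedances, so the parameter estimates get noisier.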

  12. Effects of pulse duration on magnetostimulation thresholds

    International Nuclear Information System (INIS)

    Saritas, Emine U.; Goodwill, Patrick W.; Conolly, Steven M.

    2015-01-01

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations

  13.   Information and acceptance of prenatal examinations - a qualitative study

    DEFF Research Database (Denmark)

    Fleron, Stina Lou; Dahl, Katja; Risør, Mette Bech

Background: In 2004 The Danish National Board of Health issued new guidelines on prenatal examinations. The importance of informed decision making is strongly emphasised and any acceptance of the screening tests offered should be based on thorough and adequate information. Objective and hypothesis: To explore the influence of information in the decision-making process on the prenatal screening tests offered, the relation between information, knowledge and up-take rates, and reasons for accepting or declining the screening tests offered. Methods: The study is based on a qualitative approach... ...by the health care system offering it. Through prenatal examinations, the pregnant women want to be given the choice of future management should there be something wrong with their child. Conclusions: Participation in prenatal examinations is not based on a thorough knowledge of the pros and cons of the screening tests...

  14. Food irradiation receives international acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Beddoes, J M [Atomic Energy of Canada Ltd., Ottawa, Ontario. Commercial Products

    1982-04-01

    Irradiation has advantages as a method of preserving food, especially in the Third World. The author tabulates some examples of actual use of food irradiation with dates and tonnages, and tells the story of the gradual acceptance of food irradiation by the World Health Organization, other international bodies, and the U.S. Food and Drug Administration (USFDA). At present, the joint IAEA/FAO/WHO standard permits an energy level of up to 5 MeV for gamma rays, well above the 1.3 MeV energy level of ⁶⁰Co. The USFDA permits irradiation of any food up to 10 krad, and minor constituents of a diet may be irradiated up to 5 Mrad. The final hurdle to be cleared, that of economic acceptance, depends on convincing the food processing industry that the process is technically and economically efficient.

  15. Consumer acceptance of irradiated food

    Energy Technology Data Exchange (ETDEWEB)

    Loaharanu, P [Head, Food Preservation Section, Joint FAO/ IAEA Division of Nuclear Techniques in Food and Agriculture, Wagramerstr. 5, A-1400, Vienna (Austria)

    1998-12-31

    There was a widely held opinion during the 1970s and 1980s that consumers would be reluctant to purchase irradiated food, as it was perceived that consumers would confuse irradiated food with food contaminated by radionuclides. Indeed, a number of consumer attitude surveys conducted in several western countries during these two decades demonstrated that consumers' concerns about irradiated food varied from very concerned to seriously concerned. This paper attempts to review the parameters considered in measuring consumer acceptance of irradiated food during the past three decades and to project the trends on this subject. It is believed that important lessons learned from past studies will guide further efforts to market irradiated food with wide consumer acceptance in the future. (Author)

  16. Consumer acceptance of irradiated food

    Energy Technology Data Exchange (ETDEWEB)

    Loaharanu, P. [Head, Food Preservation Section, Joint FAO/ IAEA Division of Nuclear Techniques in Food and Agriculture, Wagramerstr. 5, A-1400, Vienna (Austria)

    1997-12-31

There was a widely held opinion during the 1970s and 1980s that consumers would be reluctant to purchase irradiated food, as it was perceived that consumers would confuse irradiated food with food contaminated by radionuclides. Indeed, a number of consumer attitude surveys conducted in several western countries during these two decades demonstrated that consumer concern about irradiated food ranged from very concerned to seriously concerned. This paper attempts to review the parameters involved in measuring consumer acceptance of irradiated food during the past three decades and to project the trends on this subject. It is believed that important lessons learned from past studies will guide further efforts to market irradiated food with wide consumer acceptance in the future. (Author)

  17. Food irradiation receives international acceptance

    International Nuclear Information System (INIS)

    Beddoes, J.M.

    1982-01-01

Irradiation has advantages as a method of preserving food, especially in the Third World. The author tabulates some examples of actual use of food irradiation with dates and tonnages, and tells the story of the gradual acceptance of food irradiation by the World Health Organization, other international bodies, and the U.S. Food and Drug Administration (USFDA). At present, the joint IAEA/FAO/WHO standard permits an energy level of up to 5 MeV for gamma rays, well above the 1.3 MeV energy level of ⁶⁰Co. The USFDA permits irradiation of any food up to 10 krad, and minor constituents of a diet may be irradiated up to 5 Mrad. The final hurdle to be cleared, that of economic acceptance, depends on convincing the food processing industry that the process is technically and economically efficient.

  18. Consumer acceptance of irradiated food

    International Nuclear Information System (INIS)

    Loaharanu, P.

    1997-01-01

There was a widely held opinion during the 1970s and 1980s that consumers would be reluctant to purchase irradiated food, as it was perceived that consumers would confuse irradiated food with food contaminated by radionuclides. Indeed, a number of consumer attitude surveys conducted in several western countries during these two decades demonstrated that consumer concern about irradiated food ranged from very concerned to seriously concerned. This paper attempts to review the parameters involved in measuring consumer acceptance of irradiated food during the past three decades and to project the trends on this subject. It is believed that important lessons learned from past studies will guide further efforts to market irradiated food with wide consumer acceptance in the future. (Author)

  19. Shifts in the relationship between motor unit recruitment thresholds versus derecruitment thresholds during fatigue.

    Science.gov (United States)

    Stock, Matt S; Mota, Jacob A

    2017-12-01

Muscle fatigue is associated with diminished twitch force amplitude. We examined changes in the motor unit recruitment versus derecruitment threshold relationship during fatigue. Nine men (mean age = 26 years) performed repeated isometric contractions at 50% maximal voluntary contraction (MVC) knee extensor force until exhaustion. Surface electromyographic signals were detected from the vastus lateralis, and were decomposed into their constituent motor unit action potential trains. Motor unit recruitment and derecruitment thresholds and firing rates at recruitment and derecruitment were evaluated at the beginning, middle, and end of the protocol. On average, 15 motor units were studied per contraction. For the initial contraction, three subjects showed greater recruitment thresholds than derecruitment thresholds for all motor units. Five subjects showed greater recruitment thresholds than derecruitment thresholds for only low-threshold motor units at the beginning, with a mean cross-over of 31.6% MVC. As the muscle fatigued, many motor units were derecruited at progressively higher forces. In turn, decreased slopes and increased y-intercepts were observed. These shifts were complemented by increased firing rates at derecruitment relative to recruitment. As the vastus lateralis fatigued, the central nervous system's compensatory adjustments resulted in a shift of the regression line of the recruitment versus derecruitment threshold relationship.
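The regression analysis summarized above (slope, y-intercept, and cross-over point of the derecruitment versus recruitment threshold relationship) can be sketched as follows. This is an illustrative sketch, not the study's code, and the threshold values are hypothetical:

```python
# Illustrative sketch: fitting the derecruitment vs recruitment threshold
# relationship and locating its cross-over point, i.e. the force (%MVC)
# at which recruitment and derecruitment thresholds are equal.
# All threshold values below are hypothetical, not the study's data.
import numpy as np

# Hypothetical per-motor-unit thresholds, in %MVC.
recruitment   = np.array([10.0, 18.0, 25.0, 33.0, 41.0])
derecruitment = np.array([14.0, 20.0, 24.0, 29.0, 33.0])

# Least-squares line: derecruitment = slope * recruitment + intercept.
slope, intercept = np.polyfit(recruitment, derecruitment, 1)

# Cross-over: solve x = slope * x + intercept for x.
cross_over = intercept / (1.0 - slope)

print(f"slope={slope:.2f}, intercept={intercept:.2f}, cross-over={cross_over:.1f} %MVC")
```

A fatigue-related shift of the kind reported (decreased slope, increased intercept) moves this fitted line and its cross-over point, which is what the repeated fits at the beginning, middle, and end of the protocol quantify.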

  20. Risk acceptance by the population

    International Nuclear Information System (INIS)

    Diekershoff, K.

    1980-01-01

Information gained through systematic learning processes is a necessary prerequisite for an at least partly realistic evaluation of risks. If the objective of continuously reducing the acceptance of risks is to be achieved, it is absolutely necessary to include the persons concerned in the process of communication and education. In this field, social science could make a specific contribution through its action-research approach. (orig./RW) [de

  1. The threshold photoelectron spectrum of mercury

    International Nuclear Information System (INIS)

    Rojas, H; Dawber, G; Gulley, N; King, G C; Bowring, N; Ward, R

    2013-01-01

The threshold photoelectron spectrum of mercury has been recorded over the energy range (10–40 eV) which covers the region from the lowest state of the singly charged ion, 5d¹⁰6s(²S₁/₂), to the doubly charged ionic state, 5d⁹(²D₃/₂)6s(¹D₂). Synchrotron radiation has been used in conjunction with the penetrating-field threshold-electron technique to obtain the spectrum with high resolution. The spectrum shows many more features than observed in previous photoemission measurements, with many of these assigned to satellite states converging to the double ionization limit. (paper)

  2. Near threshold expansion of Feynman diagrams

    International Nuclear Information System (INIS)

    Mendels, E.

    2005-01-01

    The near threshold expansion of Feynman diagrams is derived from their configuration space representation, by performing all x integrations. The general scalar Feynman diagram is considered, with an arbitrary number of external momenta, an arbitrary number of internal lines and an arbitrary number of loops, in n dimensions and all masses may be different. The expansions are considered both below and above threshold. Rules, giving real and imaginary part, are derived. Unitarity of a sunset diagram with I internal lines is checked in a direct way by showing that its imaginary part is equal to the phase space integral of I particles

  3. Thresholds in Xeric Hydrology and Biogeochemistry

    Science.gov (United States)

    Meixner, T.; Brooks, P. D.; Simpson, S. C.; Soto, C. D.; Yuan, F.; Turner, D.; Richter, H.

    2011-12-01

Due to water limitation, thresholds in hydrologic and biogeochemical processes are common in arid and semi-arid systems. Some of these thresholds, such as those governing rainfall-runoff relationships, have been well studied. However, to gain a full picture of the role that thresholds play in driving the hydrology and biogeochemistry of xeric systems, the entire array of processes at work must be considered. Here a walk through the landscape of xeric systems is conducted, illustrating the powerful role of hydrologic thresholds in xeric system biogeochemistry. Understanding xeric hydro-biogeochemistry requires a focus on two key ideas. First, it is important to start from a framework of reaction and transport. Second, an understanding of the temporal and spatial components of the thresholds that most strongly affect hydrologic and biogeochemical fluxes is needed. In the uplands, episodic rewetting and drying of soils permits accelerated biogeochemical processing but also more gradual drainage of water through the subsurface than expected in simple conceptions of biogeochemical processes. Hydrologic thresholds (water content above hygroscopic) result in a stop-start nutrient spiral of material across the landscape, since runoff connecting uplands to xeric perennial riparian zones is episodic and often transports materials only a short distance (hundreds of meters). This episodic movement results in important and counter-intuitive nutrient inputs to riparian zones, but also significant processing and uptake of nutrients. The floods that transport these biogeochemicals also provide significant input to riparian groundwater and may be key to sustaining these critical ecosystems. Importantly, the flood-driven recharge process is itself a threshold process, dependent on flood characteristics (floods greater than 100 cubic meters per second) and antecedent conditions (losing to near-neutral gradients). Floods also appear to influence where arid and semi

  4. Double photoionization of helium near threshold

    International Nuclear Information System (INIS)

    Levin, J.C.; Armen, G.B.; Sellin, I.A.

    1996-01-01

There has been substantial recent experimental interest in the ratio of double-to-single photoionization of He near threshold, following several theoretical observations that earlier measurements appear to overestimate the ratio, perhaps by as much as 25%, in the first several hundred eV above threshold. The authors' recent measurements are 10%-15% below these earlier results, and more recent results of Doerner et al. and Samson et al. are yet another 10% lower. The authors will compare these measurements with new data, not yet analyzed, and available theory.

  5. Color image Segmentation using automatic thresholding techniques

    International Nuclear Information System (INIS)

    Harrabi, R.; Ben Braiek, E.

    2011-01-01

In this paper, entropy and between-class variance based thresholding methods for color image segmentation are studied. The maximization of the between-class variance (MVI) and of the entropy (ME) have been used as criterion functions to determine an optimal threshold for segmenting images into nearly homogeneous regions. Segmentation results from the two methods are validated, the segmentation sensitivity for the available test data is evaluated, and a comparative study between these methods in different color spaces is presented. The experimental results demonstrate the superiority of the MVI method for color image segmentation.
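For illustration, the between-class variance criterion underlying the MVI method can be sketched on a 1-D grey-level histogram (a minimal sketch; the helper name and toy histogram are assumptions, not the paper's code):

```python
# Sketch of threshold selection by maximizing between-class variance
# (Otsu's criterion): for each candidate threshold t, pixels are split
# into classes {<= t} and {> t}, and the t maximizing the weighted
# squared distance between class means is chosen.
import numpy as np

def otsu_threshold(hist):
    """Return the bin index maximizing between-class variance."""
    p = hist / hist.sum()                 # normalized histogram
    bins = np.arange(len(p))
    w0 = np.cumsum(p)                     # class-0 probability up to t
    w1 = 1.0 - w0                         # class-1 probability
    m = np.cumsum(p * bins)               # cumulative mean
    mT = m[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mT * w0 - m) ** 2 / (w0 * w1)
    sigma_b[~np.isfinite(sigma_b)] = 0.0  # empty classes score zero
    return int(np.argmax(sigma_b))

# Bimodal toy histogram: one mode near bin 2, another near bin 7.
hist = np.array([1, 8, 20, 8, 1, 1, 8, 20, 8, 1], dtype=float)
print(otsu_threshold(hist))               # → 4 (the valley between modes)
```

For color images, as in the paper, the same criterion is typically applied per channel or in a chosen color space, which is where the comparative study across color spaces comes in.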

  6. The planet beyond the plume hypothesis

    Science.gov (United States)

    Smith, Alan D.; Lewis, Charles

    1999-12-01

Acceptance of the theory of plate tectonics was accompanied by the rise of the mantle plume/hotspot concept which has come to dominate geodynamics from its use both as an explanation for the origin of intraplate volcanism and as a reference frame for plate motions. However, even with a large degree of flexibility permitted in plume composition, temperature, size, and depth of origin, adoption of any limited number of hotspots means the plume model cannot account for all occurrences of the type of volcanism it was devised to explain. While scientific protocol would normally demand that an alternative explanation be sought, there have been few challenges to "plume theory" on account of a series of intricate controls set up by the plume model which make plumes seem to be an essential feature of the Earth. The hotspot frame acts not only as a reference but also controls plate tectonics. Accommodating plumes relegates mantle convection to a weak, sluggish effect such that basal drag appears as a minor, resisting force, with plates having to move themselves by boundary forces and continents having to be rifted by plumes. Correspondingly, the geochemical evolution of the mantle is controlled by the requirement to isolate subducted crust into plume sources, which limits potential buffers on the composition of the MORB-source to plume- or lower-mantle material. Crustal growth and Precambrian tectonics are controlled by interpretations of greenstone belts as oceanic plateaus generated by plumes. Challenges to any aspect of the plume model are thus liable to be dismissed unless a counter-explanation is offered across the geodynamic spectrum influenced by "plume theory". Nonetheless, an alternative synthesis can be made based on longstanding petrological evidence for derivation of intraplate volcanism from volatile-bearing sources (wetspots) in conjunction with concepts dismissed for being incompatible or superfluous to "plume theory".
In the alternative Earth, the sources for

  7. Public acceptance and public relations

    International Nuclear Information System (INIS)

    Tanaka, Yasumasa

    1977-01-01

A set of problems is discussed which must be studied before public relations are addressed. Firstly, the trade-off between energy and health must be considered; there were several ages in the past in which considerations of health took preference over energy requirements. For example, the use of coal in London was prohibited by the King's proclamation in 1306. Secondly, the decision to accept atomic power development and utilization is psychologically subjective and cannot be reached by logical reasoning alone. Thirdly, a strict definition of ''national consensus'' is necessary: does it mean plebiscite or mere mood? Fourthly, whether atomic energy is biologically free from the danger of death or not. Fifthly, is there any method for socially discriminating the persons who accept atomic power from those who do not? Although the probability of death caused by atomic accidents is very small (one three-hundred-millionth a year), many people hate atomic power and oppose the construction of nuclear power plants. Four reasons for this are considered: (1) social diffusion of innovation, (2) nuclear allergy, (3) shortage of the conception of risk-benefit, and (4) heterogeneity of the public. According to an investigation of the relationship between electric power and livelihood, carried out by the policy and science research institute in Tokyo, the highly subjective decision for the acceptance of atomic power is independent of objective knowledge of atomic power. (Iwakiri, K.)

  8. Wind power: basic challenge concerning social acceptance

    NARCIS (Netherlands)

    Wolsink, M.; Meyers, R.A.

    2012-01-01

This reference article gives an overview of social acceptance (acceptance by all relevant actors in society) of all relevant aspects of the implementation and diffusion of wind power. Within social acceptance, three dimensions are distinguished: socio-political, community, and market

  9. Acceptance and suitability of novel trees for Orthotomicus erosus, an exotic bark beetle in North America

    Science.gov (United States)

    A.J. Walter; R.C. Venette; S.A. Kells

    2010-01-01

    To predict whether an herbivorous pest insect will establish in a new area, the potential host plants must be known. For invading bark beetles, adults must recognize and accept trees suitable for larval development. The preference-performance hypothesis predicts that adults will select host species that maximize the fitness of their offspring. We tested five species of...

  10. Peer Acceptance and Friendship in Early Childhood: The Conceptual Distinctions between Them

    Science.gov (United States)

    Beazidou, Eleftheria; Botsoglou, Kafenia

    2016-01-01

This paper reviews previous literature about peer acceptance and friendship, two of the most critical aspects of peer relations and the ones that have received the most research attention in past years. In this review, we will focus on the processes explaining the way children use the ability to socialise with peers, and explore the hypothesis that certain…

  11. Key Variables of Merging Behaviour : Empirical Comparison between Two Sites and Assessment of Gap Acceptance Theory

    NARCIS (Netherlands)

    Marczak, F.; Daamen, W.; Buisson, C.

    2013-01-01

    This paper presents two empirical trajectory data sets focusing on the merging behaviour on a motorway, both in the Netherlands and in France. A careful review of the literature shows that the main theories explaining this behaviour rely on the hypothesis of gap acceptance, i.e. the fact that each

  12. Pressure pain thresholds and musculoskeletal morbidity in automobile manufacturing workers.

    Science.gov (United States)

    Gold, Judith E; Punnett, Laura; Katz, Jeffrey N

    2006-02-01

Reduced pressure pain thresholds (PPTs) have been reported in occupational groups with symptoms of upper extremity musculoskeletal disorders (UEMSDs). The purpose of this study was to determine whether automobile manufacturing workers (n=460) with signs and symptoms of UEMSDs had reduced PPTs (greater sensitivity to pain through pressure applied to the skin) when compared with unaffected members of the cohort, which served as the reference group. The association of PPTs with symptom severity and localization of physical examination (PE) findings was investigated, as was the hypothesis that reduced thresholds would be found on the affected side in those with unilateral PE findings. PPTs were measured during the workday at 12 upper extremity sites. A PE for signs of UEMSDs and a symptom questionnaire were administered. After comparison of potential covariates using t tests, linear regression multivariable models were constructed with the average of the 12 sites (avgPPT) as the outcome. Subjects with PE findings and/or symptoms had a statistically significantly lower avgPPT than non-cases. AvgPPT was reduced in those with more widespread PE findings and in those with greater symptom severity (test for trend, P

  13. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Directory of Open Access Journals (Sweden)

    Prokop Pavol

    2016-06-01

Rape is a recurrent adaptive problem of female humans and females of a number of non-human animals. Rape has various physiological and reproductive costs to the victim. The costs of rape are furthermore exaggerated by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially in long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than men in a committed relationship did (Hypothesis 3), suggesting that mating opportunities mediate men's perception of victims of rape. Overall, our results suggest that the risk of cuckoldry, rather than the fear of disease transmission, underlies the negative perception of victims of rape by men.

  14. New Hypothesis for SOFC Ceramic Oxygen Electrode Mechanisms

    DEFF Research Database (Denmark)

    Mogensen, Mogens Bjerg; Chatzichristodoulou, Christodoulos; Graves, Christopher R.

    2016-01-01

    A new hypothesis for the electrochemical reaction mechanism in solid oxide cell ceramic oxygen electrodes is proposed based on literature including our own results. The hypothesis postulates that the observed thin layers of SrO-La2O3 on top of ceramic perovskite and other Ruddlesden-Popper...

  15. Assess the Critical Period Hypothesis in Second Language Acquisition

    Science.gov (United States)

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  16. Dynamical agents' strategies and the fractal market hypothesis

    Czech Academy of Sciences Publication Activity Database

    Vácha, Lukáš; Vošvrda, Miloslav

    2005-01-01

    Roč. 14, č. 2 (2005), s. 172-179 ISSN 1210-0455 Grant - others:GA UK(CZ) 454/2004/A EK/FSV Institutional research plan: CEZ:AV0Z10750506 Keywords : efficient market hypothesis * fractal market hypothesis * agent's investment horizons Subject RIV: AH - Economics

  17. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…
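The procedure such an exercise leads students to is easy to sketch as a simulation: rather than computing the probability that the hypothesis is true, compute how often data at least as extreme as the observed would arise if the null hypothesis were true. The coin-flip scenario and all numbers below are illustrative assumptions, not taken from the article:

```python
# A minimal sketch of the logic of hypothesis testing: we never compute
# P(H0 is true); we compute how surprising the observed data would be
# *if* H0 were true, via a simulated null sampling distribution.
import random

random.seed(1)

n, observed_heads = 100, 62     # observed data: 62 heads in 100 flips
trials = 10_000

# Simulate the null hypothesis: a fair coin (p = 0.5).
extreme = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(n))
    if heads >= observed_heads:
        extreme += 1

p_value = extreme / trials      # P(data at least this extreme | H0)
print(f"one-sided p-value ≈ {p_value:.3f}")
```

The small p-value (around 0.01 here) licenses rejecting the fair-coin hypothesis, while making explicit that it is a statement about the data under H0, not about the probability of H0 itself.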

  18. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  19. Adaptation hypothesis of biological efficiency of ionizing radiation

    International Nuclear Information System (INIS)

    Kudritskij, Yu.K.; Georgievskij, A.B.; Karpov, V.I.

    1992-01-01

The adaptation hypothesis of the biological efficiency of ionizing radiation is based on acknowledgement of the invariance of fundamental laws and principles of biology, related to the unity of biota and their environment and to evolution and adaptation, as applied to radiobiology. The basic arguments for the validity of the adaptation hypothesis, and its correspondence to the requirements imposed on scientific hypotheses, are presented

  20. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

This research paper discusses methods of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method; however, in the present paper a modified Wald test statistic due to Engle [6] is proposed for testing a nonlinear hypothesis using the iterative NLLS estimator. An alternative method based on nonlinear studentized residuals is also proposed, and an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses, and this paper uses the asymptotic properties of the nonlinear least squares estimator established by Jennrich [8]. The main purpose of this paper is to provide innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and studied the problem of heteroscedasticity in nonlinear regression models with suitable illustrations. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
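The general shape of a Wald test for a nonlinear restriction after an iterative NLLS fit can be sketched as follows. This is an illustrative construction, not the paper's derivation: the model y = a·exp(b·x), the restriction H0: a·b − 1 = 0, and all numbers are assumptions for the demo.

```python
# Sketch: Gauss-Newton NLLS fit of y = a*exp(b*x), followed by a Wald
# test of the nonlinear hypothesis H0: g(theta) = a*b - 1 = 0, using
# W = g' [G V G']^{-1} g ~ chi2(1) under H0.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 2.0, 50)
y = 2.0 * np.exp(0.5 * x) + rng.normal(0.0, 0.05, x.size)  # true a*b = 1

# Gauss-Newton iterations for the NLLS estimator of theta = (a, b).
theta = np.array([1.8, 0.6])                    # start near the truth
for _ in range(50):
    a, b = theta
    r = y - a * np.exp(b * x)                   # residuals
    J = np.column_stack([np.exp(b * x),         # d f / d a
                         a * x * np.exp(b * x)])  # d f / d b
    theta = theta + np.linalg.solve(J.T @ J, J.T @ r)

a, b = theta
J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
r = y - a * np.exp(b * x)
s2 = r @ r / (x.size - 2)                       # error variance estimate
V = s2 * np.linalg.inv(J.T @ J)                 # asymptotic covariance

g = np.array([a * b - 1.0])                     # restriction value g(theta)
G = np.array([[b, a]])                          # gradient of g wrt (a, b)
W = float(g @ np.linalg.solve(G @ V @ G.T, g))  # Wald statistic
print(f"a={a:.3f}, b={b:.3f}, W={W:.3f}")
```

Since the data were generated with a·b = 1, W should fall well below the chi-squared(1) critical value of 3.84 in most runs; the paper's modifications concern how the covariance and residuals entering this statistic are constructed.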

  1. The Younger Dryas impact hypothesis: A critical review

    NARCIS (Netherlands)

    van Hoesel, A.; Hoek, W.Z.; Pennock, G.M.; Drury, Martyn

    2014-01-01

    The Younger Dryas impact hypothesis suggests that multiple extraterrestrial airbursts or impacts resulted in the Younger Dryas cooling, extensive wildfires, megafaunal extinctions and changes in human population. After the hypothesis was first published in 2007, it gained much criticism, as the

  2. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of ..... pressure for longevity include low heritabilities, the increased generation interval necessary to obtain survival information, and automatic selection because long-lived cows contribute more offspring to subsequent ...

  3. Regression Discontinuity Designs Based on Population Thresholds

    DEFF Research Database (Denmark)

    Eggers, Andrew C.; Freier, Ronny; Grembi, Veronica

    In many countries, important features of municipal government (such as the electoral system, mayors' salaries, and the number of councillors) depend on whether the municipality is above or below arbitrary population thresholds. Several papers have used a regression discontinuity design (RDD...

  4. Thresholding methods for PET imaging: A review

    International Nuclear Information System (INIS)

    Dewalle-Vignion, A.S.; Betrouni, N.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; El Abiad, A.

    2010-01-01

This work deals with positron emission tomography segmentation methods for tumour volume determination. We present a state-of-the-art review of techniques based on fixed or adaptive thresholds. Methods found in the literature are analysed objectively with respect to their methodology, advantages and limitations. Finally, a comparative study is presented. (authors)
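As context for the review, the simplest fixed-threshold scheme can be sketched in a few lines. This is illustrative only: the toy array and the 42% figure are assumptions, though fixed fractions of maximum uptake in the 40-50% range are common in this literature.

```python
# Sketch of fixed-percentage thresholding for PET tumour delineation:
# the tumour mask is all voxels at or above a fixed fraction of the
# maximum uptake value. Adaptive methods instead tune this fraction
# per image (e.g. from source-to-background ratio).
import numpy as np

def fixed_threshold_mask(image, fraction=0.42):
    """Binary mask of voxels >= fraction * max uptake."""
    return image >= fraction * image.max()

# Toy 2-D "uptake" slice with a hot spot in the centre.
uptake = np.array([[0.1, 0.2, 0.1],
                   [0.2, 1.0, 0.6],
                   [0.1, 0.5, 0.2]])
print(fixed_threshold_mask(uptake).astype(int))
# → [[0 0 0]
#    [0 1 1]
#    [0 1 0]]
```

The known weakness of the fixed scheme, and a motivation for the adaptive methods the review covers, is that a single fraction cannot suit both high- and low-contrast lesions.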

  5. Identification of Threshold Concepts for Biochemistry

    Science.gov (United States)

    Loertscher, Jennifer; Green, David; Lewis, Jennifer E.; Lin, Sara; Minderhout, Vicky

    2014-01-01

    Threshold concepts (TCs) are concepts that, when mastered, represent a transformed understanding of a discipline without which the learner cannot progress. We have undertaken a process involving more than 75 faculty members and 50 undergraduate students to identify a working list of TCs for biochemistry. The process of identifying TCs for…

  6. The Resting Motor Threshold - Restless or Resting?

    DEFF Research Database (Denmark)

    Karabanov, Anke Ninija; Raffin, Estelle Emeline; Siebner, Hartwig Roman

    2015-01-01

, the RMT of the right first dorsal interosseus muscle was repeatedly determined using a threshold-hunting procedure while participants performed motor imagery and visual attention tasks with the right or left hand. Data were analyzed using repeated-measures ANOVA. Results: RMT differed depending on which...

  7. The gradual nature of threshold switching

    International Nuclear Information System (INIS)

    Wimmer, M; Salinga, M

    2014-01-01

The recent commercialization of electronic memories based on phase change materials proved the usability of this peculiar family of materials for application purposes. More advanced data storage and computing concepts, however, demand a deeper understanding especially of the electrical properties of the amorphous phase and the switching behaviour. In this work, we investigate the temporal evolution of the current through the amorphous state of the prototypical phase change material, Ge₂Sb₂Te₅, under constant voltage. A custom-made electrical tester allows the measurement of delay times over five orders of magnitude, as well as the transient states of electrical excitation prior to the actual threshold switching. We recognize a continuous current increase over time prior to the actual threshold-switching event to be a good measure for the electrical excitation. A clear correlation between a significant rise in pre-switching-current and the later occurrence of threshold switching can be observed. This way, we found experimental evidence for the existence of an absolute minimum for the threshold voltage (or electric field respectively) holding also for time scales far beyond the measurement range. (paper)

  8. Multiparty Computation from Threshold Homomorphic Encryption

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Nielsen, Jesper Buus

    2001-01-01

    We introduce a new approach to multiparty computation (MPC) basing it on homomorphic threshold crypto-systems. We show that given keys for any sufficiently efficient system of this type, general MPC protocols for n parties can be devised which are secure against an active adversary that corrupts...
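A full threshold homomorphic cryptosystem is beyond a short sketch, but the two ideas combined above, (t, n)-threshold sharing and homomorphic combination, can be illustrated with additively homomorphic Shamir secret sharing over a prime field. This is a toy construction under assumed parameters, not the paper's cryptosystem (which uses threshold *encryption*):

```python
# Toy (t, n)-threshold sharing over GF(P) with additive homomorphism:
# any t shares reconstruct the secret; party-wise sums of shares are
# valid shares of the sum. Parameters and values are illustrative only.
import random

P = 2_147_483_647        # a Mersenne prime, our toy field modulus
T, N = 3, 5              # threshold t and number of parties n

def share(secret, t=T, n=N):
    """Split secret into n Shamir shares; any t reconstruct."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at 0 over GF(P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

a_shares = share(20)
b_shares = share(22)
# Each party adds its two shares locally (the homomorphic step) ...
sum_shares = [(x, (ya + yb) % P) for (x, ya), (_, yb) in zip(a_shares, b_shares)]
# ... and any t = 3 of them reconstruct the sum without learning a or b.
print(reconstruct(sum_shares[:3]))   # → 42
```

The security intuition matches the abstract: fewer than t shares reveal nothing about a secret, so the protocol tolerates an adversary corrupting up to t − 1 parties.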

  9. Classification error of the thresholded independence rule

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Fenger-Grøn, Morten; Jensen, Jens Ledet

We consider classification in the situation of two groups with normally distributed data in the ‘large p small n’ framework. To counterbalance the high number of variables we consider the thresholded independence rule. An upper bound on the classification error is established which is tailored...
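The thresholded independence rule can be sketched as follows. This is an illustrative construction on simulated data; the variable names, the |t| > 3 screening threshold, and the data are assumptions, not taken from the paper:

```python
# Sketch of a thresholded independence rule for p >> n: a diagonal
# (independence) classifier that keeps only variables whose standardized
# group-mean difference exceeds a threshold, discarding noise variables.
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 10
# Two groups; only the first 5 variables actually differ in mean.
x0 = rng.normal(0.0, 1.0, (n, p))
x1 = rng.normal(0.0, 1.0, (n, p)); x1[:, :5] += 3.0

diff = x1.mean(0) - x0.mean(0)
se = np.sqrt(x0.var(0, ddof=1) / n + x1.var(0, ddof=1) / n)
keep = np.abs(diff / se) > 3.0          # threshold out noise variables

def classify(x):
    # Independence rule on the retained variables: assign to the nearer
    # group mean, coordinates weighted by the per-variable variance.
    d0 = ((x - x0.mean(0)) ** 2 / x0.var(0, ddof=1))[keep].sum()
    d1 = ((x - x1.mean(0)) ** 2 / x1.var(0, ddof=1))[keep].sum()
    return int(d1 < d0)                 # 1 means "group 1"

test_point = np.zeros(p); test_point[:5] = 3.0   # looks like group 1
print(keep.sum(), classify(test_point))
```

Without the thresholding step, the 195 pure-noise coordinates would swamp the 5 informative ones, which is exactly the degradation the paper's error bound quantifies.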

  10. Intraoperative transfusion threshold and tissue oxygenation

    DEFF Research Database (Denmark)

    Nielsen, K; Dahl, B; Johansson, P I

    2012-01-01

Transfusion with allogeneic red blood cells (RBCs) may be needed to maintain oxygen delivery during major surgery, but the appropriate haemoglobin (Hb) concentration threshold has not been well established. We hypothesised that a higher level of Hb would be associated with improved subcutaneous oxygen tension during major spinal surgery.

  11. Handwriting Automaticity: The Search for Performance Thresholds

    Science.gov (United States)

    Medwell, Jane; Wray, David

    2014-01-01

    Evidence is accumulating that handwriting has an important role in written composition. In particular, handwriting automaticity appears to relate to success in composition. This relationship has been little explored in British contexts and we currently have little idea of what threshold performance levels might be. In this paper, we report on two…

  12. Grid - a fast threshold tracking procedure

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Dau, Torsten; MacDonald, Ewen

    2016-01-01

A new procedure, called “grid”, is evaluated that allows rapid acquisition of threshold curves for psychophysics and, in particular, psychoacoustic experiments. In this method, the parameter-response space is sampled in two dimensions within a single run. This allows the procedure to focus more e...

  13. 49 CFR 80.13 - Threshold criteria.

    Science.gov (United States)

    2010-10-01

    ... exceed $30 million); (4) Project financing shall be repayable, in whole or in part, from tolls, user fees... Transportation Office of the Secretary of Transportation CREDIT ASSISTANCE FOR SURFACE TRANSPORTATION PROJECTS... project shall meet the following five threshold criteria: (1) The project shall be consistent with the...

  14. Low-threshold conical microcavity dye lasers

    DEFF Research Database (Denmark)

    Grossmann, Tobias; Schleede, Simone; Hauser, Mario

    2010-01-01

    element simulations confirm that lasing occurs in whispering gallery modes which corresponds well to the measured multimode laser-emission. The effect of dye concentration on lasing threshold and lasing wavelength is investigated and can be explained using a standard dye laser model....

  15. Microplastic effect thresholds for freshwater benthic macroinvertebrates

    NARCIS (Netherlands)

    Redondo Hasselerharm, P.E.; Dede Falahudin, Dede; Peeters, E.T.H.M.; Koelmans, A.A.

    2018-01-01

    Now that microplastics have been detected in lakes, rivers and estuaries all over the globe, evaluating their effects on biota has become an urgent research priority. This is the first study that aims at determining the effect thresholds for a battery of six freshwater benthic macroinvertebrates

  16. Threshold Concepts in Finance: Conceptualizing the Curriculum

    Science.gov (United States)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-01-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to…

  17. Distribution of sensory taste thresholds for phenylthiocarbamide ...

    African Journals Online (AJOL)

    The ability to taste Phenylthiocarbamide (PTC), a bitter organic compound has been described as a bimodal autosomal trait in both genetic and anthropological studies. This study is based on the ability of a person to taste PTC. The present study reports the threshold distribution of PTC taste sensitivity among some Muslim ...

  18. The acoustic reflex threshold in aging ears.

    Science.gov (United States)

    Silverman, C A; Silman, S; Miller, M H

    1983-01-01

    This study investigates the controversy regarding the influence of age on the acoustic reflex threshold for broadband noise, 500-, 1000-, 2000-, and 4000-Hz activators between Jerger et al. [Mono. Contemp. Audiol. 1 (1978)] and Jerger [J. Acoust. Soc. Am. 66 (1979)] on the one hand and Silman [J. Acoust. Soc. Am. 66 (1979)] and others on the other. The acoustic reflex thresholds for broadband noise, 500-, 1000-, 2000-, and 4000-Hz activators were evaluated under two measurement conditions. Seventy-two normal-hearing ears were drawn from 72 subjects ranging in age from 20-69 years. The results revealed that age was correlated with the acoustic reflex threshold for BBN activator but not for any of the tonal activators; the correlation was stronger under the 1-dB than under the 5-dB measurement condition. Also, the mean acoustic reflex thresholds for broadband noise activator were essentially similar to those reported by Jerger et al. (1978) but differed from those obtained in this study under the 1-dB measurement condition.

  19. Atherogenic Risk Factors and Hearing Thresholds

    DEFF Research Database (Denmark)

    Frederiksen, Thomas Winther; Ramlau-Hansen, Cecilia Høst; Stokholm, Zara Ann

    2014-01-01

    The objective of this study was to evaluate the influence of atherogenic risk factors on hearing thresholds. In a cross-sectional study we analyzed data from a Danish survey in 2009-2010 on physical and psychological working conditions. The study included 576 white- and blue-collar workers from c...

  20. Near threshold behavior of photoelectron satellite intensities

    International Nuclear Information System (INIS)

    Shirley, D.A.; Becker, U.; Heimann, P.A.; Langer, B.

    1987-09-01

    The historical background and understanding of photoelectron satellite peaks is reviewed, using He(n), Ne(1s), Ne(2p), Ar(1s), and Ar(3s) as case studies. Threshold studies are emphasized. The classification of electron correlation effects as either "intrinsic" or "dynamic" is recommended. 30 refs., 7 figs.

  1. Cost–effectiveness thresholds: pros and cons

    Science.gov (United States)

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  2. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    Full Text Available BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.
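The core of the method above is detecting a nonlinearity in an attribute-pressure relationship. A minimal, dependency-free sketch of that idea (not the authors' actual ecosystem-model analysis; the function names and the two-segment least-squares scan are illustrative assumptions) returns the pressure level where a piecewise-linear fit improves most:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def utility_threshold(pressure, attribute):
    """Scan candidate breakpoints; the pressure level whose two-segment
    least-squares fit minimises total error marks the strongest
    nonlinearity, i.e. a candidate utility threshold."""
    order = sorted(range(len(pressure)), key=lambda i: pressure[i])
    xs = [pressure[i] for i in order]
    ys = [attribute[i] for i in order]
    best = None
    for k in range(3, len(xs) - 2):  # require >= 3 points per segment
        sse = fit_line(xs[:k], ys[:k])[2] + fit_line(xs[k:], ys[k:])[2]
        if best is None or sse < best[0]:
            best = (sse, xs[k])
    return best[1]

# Synthetic example: an attribute that is flat up to pressure 5, then declines
pressure = [float(x) for x in range(11)]
attribute = [10.0] * 6 + [8.0, 6.0, 4.0, 2.0, 0.0]
threshold = utility_threshold(pressure, attribute)
```

A real application would add the uncertainty treatment the abstract mentions, e.g. bootstrapping the breakpoint estimate.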

  3. Do multiple body modifications alter pain threshold?

    Science.gov (United States)

    Yamamotová, A; Hrabák, P; Hříbek, P; Rokyta, R

    2017-12-30

    In recent years, epidemiological data have shown an increasing number of young people who deliberately self-injure. There have also been parallel increases in the number of people with tattoos and those who voluntarily undergo painful procedures associated with piercing, scarification, and tattooing. People with self-injury behaviors often say that they do not feel the pain. However, there is no information regarding pain perception in those that visit tattoo parlors and piercing studios compared to those who don't. The aim of this study was to compare nociceptive sensitivity in four groups of subjects (n=105, mean age 26 years, 48 women and 57 men) with different motivations to experience pain (i.e., with and without multiple body modifications) in two different situations: (1) in controlled, emotionally neutral conditions, and (2) at a "Hell Party" (HP), an event organized by a piercing and tattoo parlor, with a main event featuring a public demonstration of painful techniques (burn scars, hanging on hooks, etc.). Pain thresholds of the fingers of the hand were measured using a thermal stimulator and mechanical algometer. In HP participants, information about alcohol intake, self-harming behavior, and psychiatric history was used in the analysis as intervening variables. Individuals both with and without body modifications had higher thermal pain thresholds at the Hell Party, compared to thresholds measured at control neutral conditions. No such differences were found relative to mechanical pain thresholds. The increased pain threshold in all HP participants, irrespective of body modification, cannot be simply explained by a decrease in the sensory component of pain; instead, we found that the environment significantly influenced the cognitive and affective component of pain.

  4. Fast Acceptance by Common Experience

    Directory of Open Access Journals (Sweden)

    Nathan Berg

    2010-08-01

    Full Text Available Schelling (1969, 1971a,b, 1978) observed that macro-level patterns do not necessarily reflect micro-level intentions, desires or goals. In his classic model on neighborhood segregation which initiated a large and influential literature, individuals with no desire to be segregated from those who belong to other social groups nevertheless wind up clustering with their own type. Most extensions of Schelling's model have replicated this result. There is an important mismatch, however, between theory and observation, which has received relatively little attention. Whereas Schelling-inspired models typically predict large degrees of segregation starting from virtually any initial condition, the empirical literature documents considerable heterogeneity in measured levels of segregation. This paper introduces a mechanism that can produce significantly higher levels of integration and, therefore, brings predicted distributions of segregation more in line with real-world observation. As in the classic Schelling model, agents in a simulated world want to stay or move to a new location depending on the proportion of neighbors they judge to be acceptable. In contrast to the classic model, agents' classifications of their neighbors as acceptable or not depend lexicographically on recognition first and group type (e.g., ethnic stereotyping) second. The FACE-recognition model nests classic Schelling: When agents have no recognition memory, judgments about the acceptability of a prospective neighbor rely solely on his or her group type (as in the Schelling model). A very small amount of recognition memory, however, eventually leads to different classifications that, in turn, produce dramatic macro-level effects resulting in significantly higher levels of integration.
A novel implication of the FACE-recognition model concerns the large potential impact of policy interventions that generate modest numbers of face-to-face encounters with members of other social groups.

  5. How acceptable has become tolerable

    International Nuclear Information System (INIS)

    Clarke, R.H.

    1989-01-01

    A brief article discusses the differing conclusions drawn by the Royal Society Study Group and the Health and Safety Executive on the acceptability of the level of annual risk of death to the individual due to radiation. Regarding occupational exposure, both groups arrived at the same figure of 1 in 1000 per year but the former group considered this to be 'hardly totally unacceptable' while the latter group considered this to be 'borderline of intolerable'. Regarding exposure of the members of the public, the levels of risk concluded from both groups were even more divergent. (U.K.)

  6. Axelrod model: accepting or discussing

    Science.gov (United States)

    Dybiec, Bartlomiej; Mitarai, Namiko; Sneppen, Kim

    2012-10-01

    Agents building social systems are characterized by complex states, and interactions among individuals can align their opinions. The Axelrod model describes how local interactions can result in the emergence of cultural domains. We propose two variants of the Axelrod model in which local consensus is reached either by one agent listening to and accepting a neighbor's opinion, or by two agents discussing their opinions and reaching an agreement with mixed opinions. We show that the local agreement rule affects the character of the transition between the single-culture and multiculture regimes.

  7. The effect of voluntariness on the acceptance of e-learning by nursing students.

    Science.gov (United States)

    Žvanut, Boštjan; Pucer, Patrik; Ličen, Sabina; Trobec, Irena; Plazar, Nadja; Vavpotič, Damjan

    2011-05-01

    Although e-learning is an innovation that is worth making generally available, it is not always accepted by nursing students. Many researchers state that voluntariness is closely related to the individual level of adoption of innovations. Hence, we hypothesized that voluntariness moderates the effect of perceived attributes of innovations (e.g. relative advantage, compatibility, complexity, trialability, and observability), which determines the acceptance of e-learning. To test the hypothesis a survey involving two groups of nursing students was carried out. For the first group the usage of e-learning was mandatory, for the second group it was optional. The results confirm our hypothesis. Institutions, interested in e-learning initiatives, should consider the effect of voluntariness when implementing e-learning. This paper provides a useful reference that can help e-learning providers to develop guidelines that can improve the acceptance of e-learning. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Feasibility study using hypothesis testing to demonstrate containment of radionuclides within waste packages

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1986-04-01

    The purpose of this report is to apply methods of statistical hypothesis testing to demonstrate the performance of containers of radioactive waste. The approach involves modeling the failure times of waste containers using Weibull distributions, making strong assumptions about the parameters. A specific objective is to apply methods of statistical hypothesis testing to determine the number of container tests that must be performed in order to control the probability of arriving at the wrong conclusions. An algorithm to determine the required number of containers to be tested with the acceptable number of failures is derived as a function of the distribution parameters, stated probabilities, and the desired waste containment life. Using a set of reference values for the input parameters, sample sizes of containers to be tested are calculated for demonstration purposes. These sample sizes are found to be excessively large, indicating that this hypothesis-testing framework does not provide a feasible approach for demonstrating satisfactory performance of waste packages for exceptionally long time periods.
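A rough sketch of the kind of algorithm described (the Weibull life model and the simple binomial demonstration test are assumptions for illustration; the function names are not from the report) computes the smallest number of containers to test for a given allowed number of failures:

```python
import math

def weibull_failure_prob(t, shape, scale):
    """P(container fails before time t) under a Weibull(shape, scale) life model."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def required_sample_size(p_fail, c, alpha):
    """Smallest n such that seeing <= c failures among n tested containers
    rejects H0 (true failure probability >= p_fail) at level alpha,
    using the binomial tail P(X <= c | n, p_fail)."""
    n = c + 1
    while True:
        tail = sum(math.comb(n, k) * p_fail**k * (1.0 - p_fail)**(n - k)
                   for k in range(c + 1))
        if tail <= alpha:
            return n
        n += 1

# Illustrative numbers: Weibull shape 2, scale 3000 years;
# demonstrate 300-year containment at the 5% level with zero allowed failures.
p = weibull_failure_prob(300.0, 2.0, 3000.0)   # about 0.01
n = required_sample_size(p, 0, 0.05)
```

Even with these mild illustrative parameters the required sample size runs into the hundreds, consistent with the report's conclusion that the sizes are excessively large.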

  9. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

    Ebru Turanoğlu

    2012-02-01

    Full Text Available Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values. These parameters can be expressed by linguistic variables. The fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. Then the characteristic curves of acceptance sampling are examined under fuzziness. Illustrative examples are given.
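For the crisp case, the acceptance probability of a single-sampling plan with sample size n and acceptance number c is the binomial tail, which traces the operating characteristic (OC) curve. A minimal sketch follows; the "fuzzy band" here is a simplified stand-in for the paper's fuzzy probability functions, evaluating the plan at the endpoints of an interval-valued acceptance number (one α-cut), not the full fuzzy derivation:

```python
import math

def acceptance_probability(p, n, c):
    """OC curve value: P(accept lot) = P(X <= c) with X ~ Binomial(n, p),
    where p is the lot fraction defective."""
    return sum(math.comb(n, k) * p**k * (1.0 - p)**(n - k)
               for k in range(c + 1))

def fuzzy_oc_band(p, n, c_low, c_high):
    """With an interval-valued (fuzzy) acceptance number [c_low, c_high],
    the OC 'curve' widens into a band of acceptance probabilities."""
    return (acceptance_probability(p, n, c_low),
            acceptance_probability(p, n, c_high))

# Single-sampling plan n=50, c=2, evaluated at 2% defectives
pa = acceptance_probability(0.02, 50, 2)
lo, hi = fuzzy_oc_band(0.02, 50, 1, 3)
```

Plotting `acceptance_probability` over a range of p values yields the familiar OC curve; the band (lo, hi) shows how vagueness in c propagates into the curve.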

  10. Public acceptance: A Japanese view

    International Nuclear Information System (INIS)

    1972-01-01

    A number of factors enter into a consideration of the public acceptance of nuclear power: the public, nuclear power as an entity, and the interaction between the two. Interaction here implies the manner in which nuclear power is presented to the public: what is the public need for nuclear power, and what public risk is entailed in having it? The problem of public acceptance, in this sense, is time-dependent. For the public is changeable, just as nuclear power is subject to technical progress and 'social' improvement. Japan is geographically a very small country with a very high density of population. Any industrial activity and any large-scale employment of modern technology is apt to have a much greater impact on the physical, social and biological environment of individual Japanese people than similar activities would have on those of other countries. Industrial pollutants such as sulphur dioxide from power plants, oxides of nitrogen from automobile engine exhausts, organic mercury from chemical industries and so on affect society to a high degree, considered in terms of their concentration either per capita or per square kilometre. In the case of nuclear power, therefore, people are more concerned with radiological effects than with thermal pollution. No matter how one looks at it, the experience of Hiroshima and Nagasaki has made the average member of the Japanese public very sensitive to the problem of radiation safety. This is no longer a subject in which science or logic can persuade.

  12. Waste transmutation and public acceptance

    International Nuclear Information System (INIS)

    Pigford, T.H.

    1991-01-01

    The concept of transmuting radioactive wastes with reactors or accelerators is appealing. It has the potential of simplifying or eliminating problems of disposing of nuclear waste. The transmutation concept has been renewed vigorously at a time when national projects to dispose of high-level and transuranic waste are seriously delayed. In this period of tightening federal funds and program curtailments, skilled technical staffs are available at US Department of Energy (DOE) national laboratories and contractors to work on waste transmutation. If the claims of transmutation can be shown to be realistic, economically feasible, and capable of being implemented within the US institutional infrastructure, public acceptance of nuclear waste disposal may be enhanced. If the claims for transmutation are not substantiated, however, there will result a serious loss of credibility and an unjust exacerbation of public concerns about nuclear waste. The paper discusses the following topics: how public acceptance is achieved; the technical community and waste disposal; transmutation and technical communication; transmutation issues; technical fixes and public perception

  13. Wind energy and social acceptability

    International Nuclear Information System (INIS)

    Feurtey, E.

    2008-01-01

    This document was prepared as part of a decentralized collaboration between Quebec and France to share knowledge regarding strategies and best practices in wind power development. It reviewed the social acceptance of Quebec's wind power industry, particularly at the municipal level. The wind industry is growing rapidly in Quebec, and this growth has generated many reactions ranging from positive to negative. The purpose of this joint effort was to describe decision making steps to developing a wind turbine array. The history of wind development in Quebec was discussed along with the various hardware components required in a wind turbine and different types of installations. The key element in implementing wind turbine arrays is to establish public acceptance of the project, followed by a good regulatory framework to define the roles and responsibilities of participants. The production of electricity from wind turbines constitutes a clean and renewable source of energy. Although it is associated with a reduction in greenhouse gas emissions, this form of energy can also have negative environmental impacts, including noise. The revenues generated by wind parks are important factors in the decision making process. Two case studies in Quebec were presented. refs., tabs., figs.

  14. Policy formulation of public acceptance

    International Nuclear Information System (INIS)

    Kasai, Akihiro

    1978-01-01

    Since 1970, a new policy formulation for public acceptance, reflecting new considerations on the siting of electric power generation, has been set and applied. Two specific characteristics of this new policy are the planning and enforcement conducted by local public organizations for local economic build-up around plant sites, and the adjustment of requirements for fisheries. The background of this new public acceptance policy, its history, and the actual problems of compensation for the siting of power generation plants are reviewed. One new proposal, recommended by the Policy and Science Laboratory to MITI in 1977, is explained. It is based on promoting the siting of power generation plants through public participation, taking the redevelopment of regional societies as its basis. Under this proposal, the problems concerning industrial structures in farm villages, fishing villages and areas of commerce and industry should be systematized and explained from the viewpoints of outside impact, the characteristics of local areas and siting problems. Finally, the siting process and its effectiveness should be put in order. (Nakai, Y.)

  15. Environmental policy without costs? A review of the Porter hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Braennlund, Runar; Lundgren, Tommy. e-mail: runar.brannlund@econ.umu.se

    2009-03-15

    This paper reviews the theoretical and empirical literature connected to the so-called Porter Hypothesis, that is, the literature on the relation between environmental policy and competitiveness. According to conventional wisdom, environmental policy aiming to improve the environment through, for example, emission reductions does imply costs, since scarce resources must be diverted from somewhere else. However, this conventional wisdom has recently been challenged and questioned through what has been denoted the 'Porter hypothesis'. Those in the forefront of the Porter hypothesis challenge the conventional wisdom basically on the grounds that resources are used inefficiently in the absence of the right kind of environmental regulations, and that the conventional neo-classical view is too static to take inefficiencies into account. The conclusions that can be made from this review are (1) that the theoretical literature can identify the circumstances and mechanisms that must exist for a Porter effect to occur, (2) that these circumstances are rather non-general, hence rejecting the Porter hypothesis in general, and (3) that the empirical literature gives no general support for the Porter hypothesis. Furthermore, a closer look at the 'Swedish case' reveals no support for the Porter hypothesis, in spite of the fact that Swedish environmental policy over the last 15-20 years seems to be in line with the prerequisites concerning environmental policy stated by the Porter hypothesis.

  16. Model-dependence of the CO2 threshold for melting the hard Snowball Earth

    Directory of Open Access Journals (Sweden)

    W. R. Peltier

    2011-01-01

    Full Text Available One of the critical issues of the Snowball Earth hypothesis is the CO2 threshold for triggering the deglaciation. Using Community Atmospheric Model version 3.0 (CAM3), we study the problem for the CO2 threshold. Our simulations show large differences from previous results (e.g. Pierrehumbert, 2004, 2005; Le Hir et al., 2007). At 0.2 bars of CO2, the January maximum near-surface temperature is about 268 K, about 13 K higher than that in Pierrehumbert (2004, 2005), but lower than the value of 270 K for 0.1 bar of CO2 in Le Hir et al. (2007). It is found that the difference of simulation results is mainly due to model sensitivity of greenhouse effect and longwave cloud forcing to increasing CO2. At 0.2 bars of CO2, CAM3 yields 117 Wm−2 of clear-sky greenhouse effect and 32 Wm−2 of longwave cloud forcing, versus only about 77 Wm−2 and 10.5 Wm−2 in Pierrehumbert (2004, 2005), respectively. CAM3 has comparable clear-sky greenhouse effect to that in Le Hir et al. (2007), but lower longwave cloud forcing. CAM3 also produces much stronger Hadley cells than that in Pierrehumbert (2005). Effects of pressure broadening and collision-induced absorption are also studied using a radiative-convective model and CAM3. Both effects substantially increase surface temperature and thus lower the CO2 threshold. The radiative-convective model yields a CO2 threshold of about 0.21 bars with surface albedo of 0.663. Without considering the effects of pressure broadening and collision-induced absorption, CAM3 yields an approximate CO2 threshold of about 1.0 bar for surface albedo of about 0.6. However, the threshold is lowered to 0.38 bars as both effects are considered.

  17. Measuring Input Thresholds on an Existing Board

    Science.gov (United States)

    Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.

    2011-01-01

    A critical PECL (positive emitter-coupled logic) to Xilinx interface needed to be changed on an existing flight board. The new Xilinx input interface used a CMOS (complementary metal-oxide semiconductor) type of input, and the driver could meet its thresholds typically, but not in the worst case, according to the data sheet. The previous interface had been based on comparison with an external reference, but the CMOS input is based on comparison with an internal divider from the power supply. A way to measure the exact input threshold of this device for 64 inputs on a flight board was needed. The measurement technique allowed an accurate measurement of the voltage required to switch a Xilinx input from high to low for each of the 64 lines, while only probing two of them. Directly driving an external voltage was considered too risky, and tests done on any other unit could not be used to qualify the flight board. The two lines directly probed gave an absolute voltage threshold calibration, while data collected on the remaining 62 lines without probing gave relative measurements that could be used to identify any outliers. The PECL interface was forced to a long-period square wave by driving a saturated square wave into the ADC (analog to digital converter). The active pull-down circuit was turned off, causing each line to rise rapidly and fall slowly according to the input's weak pull-down circuitry. The fall time shows up as a change in the pulse width of the signal read by the Xilinx. This change in pulse width is a function of capacitance, pull-down current, and input threshold. Capacitance was known from the different trace lengths, plus a gate input capacitance, which is the same for all inputs. The pull-down current is the same for all inputs, including the two that are probed directly. The data was combined, and the Excel solver tool was used to find input thresholds for the 62 lines. This was repeated over different supply voltages and
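A simplified model of this measurement (assuming a constant-current discharge; the function names and component values are hypothetical, not from the flight board) inverts the fall-time relation t = C · (V_high − V_th) / I_pd, using one directly probed line to calibrate the shared pull-down current:

```python
def input_threshold(delta_t, capacitance, i_pulldown, v_high):
    """Invert the slow-fall model t = C * (V_high - V_th) / I_pd:
    the time a weak constant pull-down current takes to discharge the
    line capacitance from V_high down to the switching threshold V_th."""
    return v_high - i_pulldown * delta_t / capacitance

def calibrate_pulldown(delta_t, capacitance, v_high, v_th_probed):
    """Recover the shared pull-down current from one directly probed
    line whose absolute threshold is known."""
    return capacitance * (v_high - v_th_probed) / delta_t

# Hypothetical values: calibrate on a probed line (C = 10 pF, fall 50 ns,
# known threshold 2.0 V from 2.5 V), then convert the pulse-width change
# of an unprobed line (C = 12 pF, fall 60 ns) into its threshold.
i_pd = calibrate_pulldown(50e-9, 10e-12, 2.5, 2.0)
v_th = input_threshold(60e-9, 12e-12, i_pd, 2.5)
```

With 64 lines this becomes an over-determined system in the shared current and per-line thresholds, which is why a least-squares solver (Excel's, in the original work) was used instead of a direct inversion.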

  18. Speech-in-Noise Tests and Supra-threshold Auditory Evoked Potentials as Metrics for Noise Damage and Clinical Trial Outcome Measures.

    Science.gov (United States)

    Le Prell, Colleen G; Brungart, Douglas S

    2016-09-01

    In humans, the accepted clinical standards for detecting hearing loss are the behavioral audiogram, based on the absolute detection threshold of pure-tones, and the threshold auditory brainstem response (ABR). The audiogram and the threshold ABR are reliable and sensitive measures of hearing thresholds in human listeners. However, recent results from noise-exposed animals demonstrate that noise exposure can cause substantial neurodegeneration in the peripheral auditory system without degrading pure-tone audiometric thresholds. It has been suggested that clinical measures of auditory performance conducted with stimuli presented above the detection threshold may be more sensitive than the behavioral audiogram in detecting early-stage noise-induced hearing loss in listeners with audiometric thresholds within normal limits. Supra-threshold speech-in-noise testing and supra-threshold ABR responses are reviewed here, given that they may be useful supplements to the behavioral audiogram for assessment of possible neurodegeneration in noise-exposed listeners. Supra-threshold tests may be useful for assessing the effects of noise on the human inner ear, and the effectiveness of interventions designed to prevent noise trauma. The current state of the science does not necessarily allow us to define a single set of best practice protocols. Nonetheless, we encourage investigators to incorporate these metrics into test batteries when feasible, with an effort to standardize procedures to the greatest extent possible as new reports emerge.

  19. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two at a time). The same research question may be explored by more than one type of hypothesis test
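    The definition of the P value given in this abstract, the probability under the null hypothesis of a result at least as extreme as the one observed, can be made concrete with a small permutation test. The two groups of outcome values below are invented for illustration and come from no study.

```python
# Minimal permutation test: estimate the P value as the fraction of random
# relabelings of the pooled data whose mean difference is at least as extreme
# as the observed one (the null hypothesis of "no difference" made literal).
import random

random.seed(42)

group_a = [5.1, 5.5, 6.0, 5.8, 6.2, 5.9, 6.1, 5.7]   # e.g. treatment outcomes (made up)
group_b = [4.8, 5.0, 5.2, 4.9, 5.3, 5.1, 4.7, 5.0]   # e.g. control outcomes (made up)

observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))

pooled = group_a + group_b
n_a = len(group_a)
n_perm = 10000
extreme = 0
for _ in range(n_perm):
    random.shuffle(pooled)                      # relabel under the null hypothesis
    diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a))
    if diff >= observed:
        extreme += 1

p_value = extreme / n_perm
print(f"observed difference = {observed:.3f}, permutation P = {p_value:.4f}")
```

    With these clearly separated groups the permutation P falls well below 0.05, so the null hypothesis of no difference would be rejected; the same machinery, with the test statistic swapped, underlies many of the named tests the module goes on to classify.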

  20. Age-dependent variation in the terminal investment threshold in male crickets.

    Science.gov (United States)

    Duffield, Kristin R; Hampton, Kylie J; Houslay, Thomas M; Hunt, John; Rapkin, James; Sakaluk, Scott K; Sadd, Ben M

    2018-03-01

    The terminal investment hypothesis proposes that decreased expectation of future reproduction (e.g., arising from a threat to survival) should precipitate increased investment in current reproduction. The level at which a cue of decreased survival is sufficient to trigger terminal investment (i.e., the terminal investment threshold) may vary according to other factors that influence expectation for future reproduction. We test whether the terminal investment threshold varies with age in male crickets, using heat-killed bacteria to simulate an immune-inducing infection. We measured calling effort (a behavior essential for mating) and hemolymph antimicrobial activity in young and old males across a gradient of increasing infection cue intensity. There was a significant interaction between the infection cue and age in their effect on calling effort, confirming the existence of a dynamic terminal investment threshold: young males reduced effort at all infection levels, whereas old males increased effort at the highest levels relative to naïve individuals. A lack of a corresponding decrease in antibacterial activity suggests that altered reproductive effort is not traded against investment in this component of immunity. Collectively, these results support the existence of a dynamic terminal investment threshold, perhaps accounting for some of the conflicting evidence in support of terminal investment. © 2018 The Author(s). Evolution © 2018 The Society for the Study of Evolution.

  1. Threshold values of ankle dorsiflexion and gross motor function in 60 children with cerebral palsy

    DEFF Research Database (Denmark)

    Rasmussen, Helle M; Svensson, Joachim; Thorning, Maria

    2018-01-01

    Background and purpose - Threshold values defining 3 categories of passive range of motion are used in the Cerebral Palsy follow-Up Program to guide clinical decisions. The aim of this study was to investigate the threshold values by testing the hypothesis that passive range of motion in ankle...... dorsiflexion is associated with gross motor function and that function differs between the groups of participants in each category. Patients and methods - We analyzed data from 60 ambulatory children (aged 5-9 years) with spastic cerebral palsy. Outcomes were passive range of motion in ankle dorsiflexion...... with flexed and extended knee and gross motor function (Gait Deviation Index, Gait Variable Score of the ankle, peak dorsiflexion during gait, 1-minute walk, Gross Motor Function Measure, the Pediatric Quality of Life Inventory Cerebral Palsy Module, and Pediatric Outcomes Data Collection Instrument). Results...

  2. Phase-change memory: A continuous multilevel compact model of subthreshold conduction and threshold switching

    Science.gov (United States)

    Pigot, Corentin; Gilibert, Fabien; Reyboz, Marina; Bocquet, Marc; Zuliani, Paola; Portal, Jean-Michel

    2018-04-01

    A phase-change memory (PCM) compact model of threshold switching, based on thermal runaway in Poole–Frenkel conduction, is proposed. Although this approach is often used in physical models, this is the first time it is implemented in a compact model. The model accuracy is validated by a good correlation between simulations and experimental data collected on a PCM cell embedded in a 90 nm technology. A wide range of intermediate states is measured and accurately modeled with a single set of parameters, allowing multilevel programming. A good convergence is exhibited even in snapback simulation owing to this fully continuous approach. Moreover, threshold properties extraction indicates a thermally enhanced switching, which validates the basic hypothesis of the model. Finally, it is shown that this model is compliant with a new drift-resilient cell-state metric. Once enriched with a phase transition module, this compact model is ready to be implemented in circuit simulators.
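    The thermal-runaway mechanism the model is built on can be illustrated with a toy self-consistent calculation: Poole–Frenkel-type conduction heats the cell, heating increases the conduction, and above some bias the feedback loop has no stable solution, which is the threshold switch. All device parameters below are invented for the illustration and are not the paper's compact-model parameters.

```python
# Toy illustration (assumed numbers) of threshold switching as thermal runaway
# in Poole-Frenkel-type conduction: I grows exponentially as the barrier is
# lowered by the field and as the device self-heats.
import math

K_B = 8.617e-5   # Boltzmann constant, eV/K
T0 = 300.0       # ambient temperature, K
PHI = 0.30       # conduction barrier, eV (assumed)
BETA = 0.10      # field-induced barrier lowering, eV per sqrt(V) (assumed)
I0 = 1e-3        # conduction prefactor, A (assumed)
R_TH = 1e7       # thermal resistance, K/W (assumed)

def current(v, t):
    """Poole-Frenkel-type conduction with a field-lowered activation barrier."""
    return I0 * v * math.exp(-(PHI - BETA * math.sqrt(v)) / (K_B * t))

def switches(v, max_iter=200, t_runaway=500.0):
    """Self-consistent device temperature: T = T0 + R_TH * I(V, T) * V.
    Divergence of the fixed-point iteration marks threshold switching."""
    t = T0
    for _ in range(max_iter):
        t = T0 + R_TH * current(v, t) * v
        if t > t_runaway:
            return True          # no stable subthreshold operating point
    return False

for v in (0.5, 1.0, 1.25, 1.5, 2.0):
    state = "threshold switching" if switches(v) else "subthreshold (stable)"
    print(f"V = {v:>4} V -> {state}")
```

    Below the threshold bias the iteration settles to a slightly heated stable state (the subthreshold branch); above it the temperature diverges, which is the qualitative content of the "thermally enhanced switching" the paper validates.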

  3. Linear, no threshold response at low doses of ionizing radiation: ideology, prejudice and science

    International Nuclear Information System (INIS)

    Kesavan, P.C.

    2014-01-01

    The linear, no threshold (LNT) response model assumes that there is no threshold dose for the radiation-induced genetic effects (heritable mutations and cancer), and it forms the current basis for radiation protection standards for radiation workers and the general public. The LNT model is, however, based more on ideology than valid radiobiological data. Further, phenomena such as 'radiation hormesis', 'radioadaptive response', 'bystander effects' and 'genomic instability' are now demonstrated to be radioprotective and beneficial. More importantly, the 'differential gene expression' reveals that qualitatively different proteins are induced by low and high doses. This finding negates the LNT model which assumes that qualitatively similar proteins are formed at all doses. Thus, all available scientific data challenge the LNT hypothesis. (author)

  4. PAGs - Public perception and acceptance

    International Nuclear Information System (INIS)

    Quillin, Robert M.

    1989-01-01

    Full text: While Protective Action Guides or PAGs have been a part of the lexicon of the radiation protection field for several decades, the concept of accepting higher levels of risk under certain situations has not received adequate scrutiny by the general public, the media or elected officials. Consequently there is a question as to how implementation of PAGs would be perceived by the above groups in the event that such implementation became necessary. A personal case in point involves the response of an executive in the food industry. When the concept of selling a food product meeting the PAGs was explained, his response was, 'we won't sell a contaminated product, we would dump the unprocessed raw food. Our industry image is that of a natural unadulterated food'. While this may be an isolated view, there is a need to determine what the perception is and consequently what the response would be if PAGs were implemented today. If the response from any one of the three groups listed previously were negative, then there is an obvious need for a program to assure receptiveness by those concerned. However, this may face formidable obstacles. This is because the terms radiation and radioactive have gained generally negative word associations, e.g. 'deadly' radiation and radioactive 'desert'. The former term was recently heard in a taped presentation at a Museum of Natural History on a completely unrelated subject. The latter term was part of a recent article heading in the Wall Street Journal. Incidentally, the article was discussing television. Thus beyond the scientific issues of setting PAGs and the administrative and procedural issues of implementing PAGs there is the issue of society's understanding and acceptance of PAGs. Particularly, how can such understanding and acceptance be achieved in a situation which is associated with an actual or perceived radiation emergency? These are not questions that radiation or agricultural scientists can answer alone. These are

  5. Efficient Market Hypothesis in South Africa: Evidence from Linear and Nonlinear Unit Root Tests

    Directory of Open Access Journals (Sweden)

    Andrew Phiri

    2015-12-01

    This study investigates the weak form efficient market hypothesis (EMH) for five generalized stock indices in the Johannesburg Stock Exchange (JSE) using weekly data collected from 31st January 2000 to 16th December 2014. In particular, we test for weak form market efficiency using a battery of linear and nonlinear unit root testing procedures comprising the classical augmented Dickey-Fuller (ADF) tests, the two-regime threshold autoregressive (TAR) unit root tests described in Enders and Granger (1998) as well as the three-regime unit root tests described in Bec, Salem, and Carrasco (2004). Based on our empirical analysis, we are able to demonstrate that whilst the linear unit root tests advocate for unit roots within the time series, the nonlinear unit root tests suggest that most stock indices are threshold stationary processes. These results bridge two opposing contentions obtained from previous studies by concluding that under a linear framework the JSE stock indices offer support in favour of weak form market efficiency whereas when nonlinearity is accounted for, a majority of the indices violate the weak form EMH.
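    The unit-root idea behind these tests can be sketched with a bare-bones Dickey-Fuller-style regression. This is a simplified illustration on simulated data, not the Enders-Granger or Bec-Salem-Carrasco procedures: no lag augmentation, no intercept or trend, and no threshold regimes.

```python
# Minimal Dickey-Fuller-style check: regress dy_t on y_{t-1} and inspect the
# t statistic of the coefficient. A strongly negative t suggests stationarity
# (mean reversion); t near zero is consistent with a unit root (random walk).
import random, math

random.seed(7)

def df_tstat(y):
    """t statistic of rho in dy_t = rho * y_{t-1} + e_t (OLS, no intercept)."""
    x = y[:-1]
    dy = [b - a for a, b in zip(y[:-1], y[1:])]
    sxx = sum(v * v for v in x)
    rho = sum(a * b for a, b in zip(x, dy)) / sxx
    resid = [d - rho * v for d, v in zip(dy, x)]
    s2 = sum(r * r for r in resid) / (len(dy) - 1)
    return rho / math.sqrt(s2 / sxx)

n = 500
eps = [random.gauss(0, 1) for _ in range(n)]

# Random walk: y_t = y_{t-1} + e_t  (has a unit root).
walk = [0.0]
for e in eps:
    walk.append(walk[-1] + e)

# Stationary AR(1): y_t = 0.5 * y_{t-1} + e_t  (mean-reverting).
ar = [0.0]
for e in eps:
    ar.append(0.5 * ar[-1] + e)

print(f"t(random walk) = {df_tstat(walk):.2f}, t(stationary AR1) = {df_tstat(ar):.2f}")
```

    The production versions of these tests add lagged differences, deterministic terms, and (for TAR variants) regime-dependent adjustment, and compare the statistic against Dickey-Fuller rather than Student-t critical values, but the regression above is the common core.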

  6. The periodontal pain paradox: Difficulty in pain assessment in dental patients (the periodontal pain paradox hypothesis)

    Directory of Open Access Journals (Sweden)

    Haryono Utomo

    2006-12-01

    In daily dental practice, the majority of patients' main complaints are related to pain. Most patients assume that all pain inside the oral cavity originates from the teeth. One particular case is thermal sensitivity; sometimes patients are able to point to the site of pain even though there is neither visible caries nor secondary caries on the dental radiograph. In such cases, gingival recession and dentin hypersensitivity are treated first to eliminate the pain. If these treatments fail, the pain may be misdiagnosed as pulpal inflammation and lead to unnecessary root canal treatment. Studies of pain during periodontal instrumentation of plaque-related periodontitis reveal that the majority of patients feel pain and discomfort during probing and scaling. This seems obvious, because inflammation, whether acute or chronic, is associated with a lowered pain threshold. In contrast, in this case report, a patient suffering from chronic gingivitis and thermal sensitivity experienced a relatively pain-free sensation during probing and scaling. A lowered pain threshold accompanied by a blunted pain perception upon periodontal instrumentation is proposed to be termed the periodontal pain paradox. The objective of this study is to reveal the possibility that certain factors in periodontal inflammation may be involved in the periodontal pain paradox hypothesis. In a patient with thermal hypersensitivity who underwent probing and scaling, the thermal hypersensitivity rapidly disappeared after the relatively pain-free instrumentation. Based on the successful periodontal treatment, it is concluded that chronic gingivitis may modulate periodontal pain perception, a phenomenon termed the periodontal pain paradox

  7. Differential equation models for sharp threshold dynamics.

    Science.gov (United States)

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.
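    The kind of threshold event the abstract describes, where crossing a state boundary globally changes the governing equations, can be sketched with a toy malware model. The equations, rates, and the detection rule below are invented for illustration and are not the authors' model.

```python
# Toy SI-type malware spread with a detection threshold, integrated with Euler
# steps. Crossing the threshold is the "probabilistic threshold dynamics" event
# made deterministic: from that moment a patching term enters the equations.
beta = 0.5        # infection rate (assumed)
gamma = 0.8       # patch/removal rate, active only after detection (assumed)
threshold = 0.2   # infected fraction that triggers detection (assumed)
dt = 0.01         # Euler step

s, i = 0.99, 0.01        # susceptible and infected fractions
detected = False
t_detect = None
t = 0.0
while t < 40.0:
    new_infections = beta * s * i
    patched = gamma * i if detected else 0.0   # term that exists only post-event
    s -= new_infections * dt
    i += (new_infections - patched) * dt
    t += dt
    if not detected and i >= threshold:
        detected = True        # threshold event: the dynamics change from here on
        t_detect = t

print(f"detection at t = {t_detect:.2f}, final infected fraction = {i:.4f}")
```

    Before detection the infection grows logistically; once the threshold is crossed the patch term dominates (gamma > beta here) and the infection collapses, illustrating how a single threshold event can qualitatively change the system's long-run behavior.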

  8. Gamin portable radiation meter with alarm threshold

    International Nuclear Information System (INIS)

    Payat, Rene.

    1981-10-01

    The Gamin radiation meter is a direct-reading, portable, battery-powered gamma dose-rate meter featuring alarm thresholds. Dose rate is read on a micro-ammeter with a millirad-per-hour logarithmic scale, covering a range of 0.1 to 1000 millirads/hour. The instrument issues an audible warning signal when the dose-rate level exceeds a threshold value, which can be selected. The detector tube is of the energy-compensated Geiger-Muller counter type. Because of its low battery drain, the instrument can be operated continuously for 1000 hours. It is powered by four 1.5 volt alkaline batteries of the R6 type. The electronic circuitry is housed in a small lightweight case made of impact-resistant plastic. Applications of the Gamin portable radiation monitor are found in health physics, safety departments, medical facilities, teaching, civil defense [fr

  9. Rayleigh scattering from ions near threshold

    International Nuclear Information System (INIS)

    Roy, S.C.; Gupta, S.K.S.; Kissel, L.; Pratt, R.H.

    1988-01-01

    Theoretical studies of Rayleigh scattering of photons from neon atoms with different degrees of ionization, for energies both below and above the K-edges of the ions, are presented. Some unexpected structures in both Rayleigh scattering and photoionization from neutral and weakly ionized atoms, very close to threshold, have been reported. It has recently been realized that some of the predicted structures may have a nonphysical origin, due to the limitations of the independent-particle model and to the use of a Coulombic Latter tail. Use of a K-shell vacancy potential - in which an electron is assumed to be removed from the K-shell - in calculating K-shell Rayleigh scattering amplitudes removes some of the structure effects near threshold. We present in this work a discussion of scattering angular distributions and total cross sections obtained utilizing vacancy potentials, and compare these predictions with those previously obtained in other potential models. (author) [pt

  10. Edith Wharton's threshold phobia and two worlds.

    Science.gov (United States)

    Holtzman, Deanna; Kulish, Nancy

    2014-08-01

    The American novelist Edith Wharton suffered an unusual childhood neurotic symptom, a fear of crossing thresholds, a condition that might be called a "threshold phobia." This symptom is identified and examined in autobiographical material, letters, diaries, and selected literary fiction and nonfiction left by Wharton to arrive at a formulation not previously drawn together. A fascinating theme, living or being trapped between "two worlds," runs through much of the writer's life and work. The phobia is related to this theme, and both can be linked more broadly to certain sexual conflicts in women. This understanding of Wharton's phobia, it is argued, throws new light on the developmental issues and conflicts related to the female "oedipal" or triadic phase, characterized by the need to negotiate the two worlds of mother and of father. © 2014 by the American Psychoanalytic Association.

  11. Ultracompact low-threshold organic laser.

    Science.gov (United States)

    Deotare, Parag B; Mahony, Thomas S; Bulović, Vladimir

    2014-11-25

    We report an ultracompact low-threshold laser with an Alq3:DCM host:guest molecular organic thin film gain layer. The device uses a photonic crystal nanobeam cavity which provides a high quality factor to mode volume (Q/V) ratio and increased spontaneous emission factor along with a small footprint. Lasing is observed with a threshold of 4.2 μJ/cm(2) when pumped by femtosecond pulses of λ = 400 nm wavelength light. We also model the dynamics of the laser and show good agreement with the experimental data. The inherent waveguide geometry of the structure enables easy on-chip integration with potential applications in biochemical sensing, inertial sensors, and data communication.

  12. Predicting visual acuity from detection thresholds.

    Science.gov (United States)

    Newacheck, J S; Haegerstrom-Portnoy, G; Adams, A J

    1990-03-01

    Visual performance based exclusively on high luminance and high contrast letter acuity measures often fails to predict individual performance at low contrast and low luminance. Here we measured visual acuity over a wide range of contrasts and luminances (low mesopic to photopic) for 17 young normal observers. Acuity vs. contrast functions appear to fit a single template which can be displaced laterally along the log contrast axis. The magnitude of this lateral displacement for different luminances was well predicted by the contrast threshold difference for a 4 min arc spot. The acuity vs. contrast template, taken from the mean of all 17 subjects, was used in conjunction with individual spot contrast threshold measures to predict an individual's visual acuity over a wide range of luminance and contrast levels. The accuracy of the visual acuity predictions from this simple procedure closely approximates test-retest accuracy for both positive (projected Landolt rings) and negative contrast (Bailey-Lovie charts).
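    The template-shift prediction scheme described here can be sketched numerically: a fixed acuity-vs-log-contrast template slides laterally along the log-contrast axis, and the size of the slide for an individual is given by their spot contrast-threshold elevation relative to the group. The template function and all threshold values below are invented placeholders, not the study's measured template or data.

```python
# Sketch of predicting acuity from a detection threshold via a lateral
# template shift along the log-contrast axis (illustrative numbers only).
import math

def template_logmar(log_contrast):
    """Assumed group template: acuity (logMAR) worsens as contrast falls."""
    return 0.8 / (1.0 + math.exp(4.0 * (log_contrast + 1.0)))

mean_spot_threshold = 0.010    # group-mean spot contrast threshold (assumed)
indiv_spot_threshold = 0.020   # individual's threshold, 2x elevated (assumed)

# Lateral shift in log units equals the individual's log threshold elevation,
# mirroring the paper's use of the 4 min arc spot threshold difference.
shift = math.log10(indiv_spot_threshold / mean_spot_threshold)

for contrast in (1.0, 0.3, 0.1, 0.03):
    lc = math.log10(contrast)
    predicted = template_logmar(lc - shift)   # slide the template, then read off
    print(f"contrast {contrast:>5}: predicted acuity = {predicted:.2f} logMAR")
```

    A doubled spot threshold shifts the template by log10(2), about 0.3 log units, so the individual's acuity at any contrast is read from the template as if the stimulus contrast were 2x lower; this is the sense in which one threshold measurement predicts the whole acuity-vs-contrast curve.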

  13. Acceptance of Others, Feeling of Being Accepted and Striving for Being Accepted Among the Representatives of Different Kinds of Occupations

    Directory of Open Access Journals (Sweden)

    Gergana Stanoeva

    2012-05-01

    This paper deals with an important issue related to human attitudes and needs in interpersonal and professional contexts. The theoretical part covers several psychological components of self-esteem and esteem of others: acceptance of others, the feeling of being accepted, and the need for approval. Some gender differences in manifestations of acceptance and the feeling of being accepted at the workplace are discussed. This article presents empirical data on the degree of acceptance of others, the feeling of being accepted, and the striving for being accepted among representatives of helping, pedagogical, administrative, and economic occupations, as well as non-qualified workers. The goals of the study were to reveal the interdependency between these constructs and to find significant differences between the representatives of the four groups of occupations. The instruments of the first study were W. Fey's scales "Acceptance of Others" and "How Do I Feel Accepted by Others". The instrument of the second study was the Crowne and Marlowe Scale for Social Desirability. The results indicated some significant differences in acceptance of others and the feeling of being accepted between the non-qualified workers and the representatives of helping, administrative, and economic occupations. There was no significant difference in striving for being accepted between the four occupational groups.

  14. The monolithic double-threshold discriminator

    International Nuclear Information System (INIS)

    Baturitsky, M.A.; Dvornikov, O.V.

    1999-01-01

    A double-threshold discriminator capable of processing input signals of different duration is described. Simplicity of the discriminator circuitry makes it possible to embody the discriminator in multichannel ICs using microwave bipolar-JFET technology. Time walk is calculated to be less than 0.35 ns for the input ramp signals with rise times 25-100 ns and amplitudes 50 mV-1 V

  15. Factors affecting mechanical (nociceptive) thresholds in piglets.

    Science.gov (United States)

    Janczak, Andrew M; Ranheim, Birgit; Fosse, Torunn K; Hild, Sophie; Nordgreen, Janicke; Moe, Randi O; Zanella, Adroaldo J

    2012-11-01

    To evaluate the stability and repeatability of measures of mechanical (nociceptive) thresholds in piglets and to examine potentially confounding factors when using a hand-held algometer. Descriptive, prospective cohort. Forty-four piglets from four litters, weighing 4.6 ± 1.0 kg (mean ± SD) at 2 weeks of age. Mechanical thresholds were measured twice on each of 2 days during the first and second week of life. Data were analyzed using a repeated measures design to test the effects of behavior prior to testing, sex, week, day within week, and repetition within day. The effect of body weight and the interaction between piglet weight and behaviour were also tested. Piglet was entered into the model as a random effect as an additional test of repeatability. The effect of repeated testing was used to test the stability of measures. Pearson correlations between repeated measures were used to test the repeatability of measures. Variance component analysis was used to describe the variability in the data. Variance component analysis indicated that piglet explained only 17% of the variance in the data. All variables in the model (behaviour prior to testing, sex, week, day within week, repetition within day, body weight, the interaction between body weight and behaviour, piglet identity) except sex had a significant effect (p < 0.05). Measures changed with repeated testing and increased with increasing piglet weight, indicating that time (age) and animal body weight should be taken into account when measuring mechanical (nociceptive) thresholds in piglets. Mechanical (nociceptive) thresholds can be used both for testing the efficacy of anaesthetics and analgesics, and for assessing hyperalgesia in chronic pain states in research and clinical settings. © 2012 The Authors. Veterinary Anaesthesia and Analgesia. © 2012 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesiologists.

  16. Management applicability of the intermediate disturbance hypothesis across Mongolian rangeland ecosystems.

    Science.gov (United States)

    Sasaki, Takehiro; Okubo, Satoru; Okayasu, Tomoo; Jamsran, Undarmaa; Ohkuro, Toshiya; Takeuchi, Kazuhiko

    2009-03-01

    The current growing body of evidence for diversity-disturbance relationships suggests that the peaked pattern predicted by the intermediate disturbance hypothesis (IDH) may not be the rule. Even if ecologists could quantify the diversity-disturbance relationship consistent with the IDH, the applicability of the IDH to land management has rarely been addressed. We examined two hypotheses related to the generality and management applicability of the IDH to Mongolian rangeland ecosystems: that the diversity-disturbance relationship varies as a function of landscape condition and that some intermediate scales of grazing can play an important role in terms of sustainable rangeland management through a grazing gradient approach. We quantified the landscape condition of each ecological site using an ordination technique and determined two types of landscape conditions: relatively benign and harsh environmental conditions. At the ecological sites characterized by relatively benign environmental conditions, diversity-disturbance relationships were generally consistent with the IDH, and maximum diversity was observed at some intermediate distance from the source of the grazing gradient. In contrast, the IDH was not supported at most (but not all) sites characterized by relatively harsh environmental conditions. The intermediate levels of grazing were generally located below the ecological threshold representing the points or zones at which disturbance should be limited to prevent drastic changes in ecological conditions, suggesting that there is little "conundrum" with regard to intermediate disturbance in the studied systems in terms of land management. We suggest that the landscape condition is one of the primary factors that cause inconsistencies in diversity-disturbance relationships. The ecological threshold can extend its utility in rangeland management because it is also compatible with the maintenance of species diversity. This study thus suggests that some

  17. User acceptance of mobile notifications

    CERN Document Server

    Westermann, Tilo

    2017-01-01

    This book presents an alternative approach to studying smartphone-app user notifications. It starts with insights into user acceptance of mobile notifications in order to provide tools to support users in managing these. It extends previous research by investigating factors that influence users’ perception of notifications and proposes tools addressing the shortcomings of current systems. It presents a technical framework and testbed as an approach for evaluating the usage of mobile applications and notifications, and then discusses a series of studies based on this framework that investigate factors influencing users’ perceptions of mobile notifications. Lastly, a set of design guidelines for the usage of mobile notifications is derived that can be employed to support users in handling notifications on smartphones.

  18. Threshold Learning Dynamics in Social Networks

    Science.gov (United States)

    González-Avella, Juan Carlos; Eguíluz, Victor M.; Marsili, Matteo; Vega-Redondo, Fernando; San Miguel, Maxi

    2011-01-01

    Social learning is defined as the ability of a population to aggregate information, a process which must crucially depend on the mechanisms of social interaction. Consumers choosing which product to buy, or voters deciding which option to take with respect to an important issue, typically confront external signals to the information gathered from their contacts. Economic models typically predict that correct social learning occurs in large populations unless some individuals display unbounded influence. We challenge this conclusion by showing that an intuitive threshold process of individual adjustment does not always lead to such social learning. We find, specifically, that three generic regimes exist separated by sharp discontinuous transitions. And only in one of them, where the threshold is within a suitable intermediate range, the population learns the correct information. In the other two, where the threshold is either too high or too low, the system either freezes or enters into persistent flux, respectively. These regimes are generally observed in different social networks (whether complex or regular), but limited interaction is found to promote correct learning by enlarging the parameter region where it occurs. PMID:21637714
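    A threshold-adjustment process of the kind the abstract describes can be sketched with a toy agent model. The neighborhood rule, network (a ring), and parameters below are assumptions made for the sketch, not the authors' model; the sketch only illustrates the frozen-regime intuition, that a maximal threshold prevents any opinion change, against an intermediate threshold that permits adjustment.

```python
# Toy threshold dynamics on a ring network: each agent holds a binary opinion
# and switches to the alternative only when the fraction of neighbors holding
# it exceeds a threshold theta. theta = 1.0 freezes the system (no fraction
# can exceed 1), while intermediate theta allows local majorities to spread.
import random

def run(theta, n=200, steps=200, seed=1):
    rng = random.Random(seed)
    # True state is 1; 60% of agents receive the correct initial signal.
    s = [1 if rng.random() < 0.6 else 0 for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        nbrs = [s[(i - 2) % n], s[(i - 1) % n], s[(i + 1) % n], s[(i + 2) % n]]
        frac_one = sum(nbrs) / len(nbrs)
        if s[i] == 0 and frac_one > theta:
            s[i] = 1
        elif s[i] == 1 and (1 - frac_one) > theta:
            s[i] = 0
    return sum(s) / n     # final fraction holding the correct option

for theta in (1.0, 0.5):
    print(f"theta = {theta}: final fraction holding the correct option = {run(theta):.2f}")
```

    In the paper's terms, the interesting behavior (correct learning vs. persistent flux) lives between these extremes and depends on network structure and signal quality; the sketch above only makes the threshold mechanism itself concrete.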

  19. Explaining the length threshold of polyglutamine aggregation

    International Nuclear Information System (INIS)

    De Los Rios, Paolo; Hafner, Marc; Pastore, Annalisa

    2012-01-01

    The existence of a length threshold, of about 35 residues, above which polyglutamine repeats can give rise to aggregation and to pathologies, is one of the hallmarks of polyglutamine neurodegenerative diseases such as Huntington’s disease. The reason why such a minimal length exists at all has remained one of the main open issues in research on the molecular origins of such classes of diseases. Following the seminal proposals of Perutz, most research has focused on the hunt for a special structure, attainable only above the minimal length, able to trigger aggregation. Such a structure has remained elusive and there is growing evidence that it might not exist at all. Here we review some basic polymer and statistical physics facts and show that the existence of a threshold is compatible with the modulation that the repeat length imposes on the association and dissociation rates of polyglutamine polypeptides to and from oligomers. In particular, their dramatically different functional dependence on the length rationalizes the very presence of a threshold and hints at the cellular processes that might be at play, in vivo, to prevent aggregation and the consequent onset of the disease. (paper)

  20. Explaining the length threshold of polyglutamine aggregation

    Science.gov (United States)

    De Los Rios, Paolo; Hafner, Marc; Pastore, Annalisa

    2012-06-01

    The existence of a length threshold, of about 35 residues, above which polyglutamine repeats can give rise to aggregation and to pathologies, is one of the hallmarks of polyglutamine neurodegenerative diseases such as Huntington’s disease. The reason why such a minimal length exists at all has remained one of the main open issues in research on the molecular origins of such classes of diseases. Following the seminal proposals of Perutz, most research has focused on the hunt for a special structure, attainable only above the minimal length, able to trigger aggregation. Such a structure has remained elusive and there is growing evidence that it might not exist at all. Here we review some basic polymer and statistical physics facts and show that the existence of a threshold is compatible with the modulation that the repeat length imposes on the association and dissociation rates of polyglutamine polypeptides to and from oligomers. In particular, their dramatically different functional dependence on the length rationalizes the very presence of a threshold and hints at the cellular processes that might be at play, in vivo, to prevent aggregation and the consequent onset of the disease.

  1. Treatment of threshold retinopathy of prematurity

    Directory of Open Access Journals (Sweden)

    Deshpande Dhanashree

    1998-01-01

    This report deals with our experience in the management of threshold retinopathy of prematurity (ROP). A total of 45 eyes of 23 infants were subjected to treatment of threshold ROP; 26.1% of these infants had a birth weight of >1,500 g. The preferred modality of treatment was laser indirect photocoagulation, which was facilitated by scleral depression. Cryopexy was done in cases with nondilating pupils or medial haze and was always under general anaesthesia. Retreatment with either modality was needed in 42.2% of eyes; in these, the skip areas were covered. Total regression of disease was achieved in 91.1% of eyes with no sequelae. All 4 eyes that progressed to stage 5 despite treatment had zone 1 disease. Major treatment-induced complications did not occur in this series. This study underscores the importance of routine screening for ROP in infants up to 2,000 g birth weight and the excellent response that is achieved with laser photocoagulation in inducing regression of threshold ROP. Laser is the preferred method of treatment in view of the absence of treatment-related morbidity to the premature infants.

  2. Acoustic emission sensor radiation damage threshold experiment

    International Nuclear Information System (INIS)

    Beeson, K.M.; Pepper, C.E.

    1994-01-01

    Determination of the threshold for damage to acoustic emission sensors exposed to radiation is important in their application to leak detection in radioactive waste transport and storage. Proper response to system leaks is necessary to ensure the safe operation of these systems, because a radiation-impaired sensor could provide a "false negative or false positive" indication of acoustic signals from leaks within the system. Research was carried out in the Radiochemical Technology Division at Oak Ridge National Laboratory to determine the beta/gamma radiation damage threshold for acoustic emission sensor systems. Each system consisted of an acoustic sensor mounted with a two-part epoxy onto a stainless steel waveguide. The systems were placed in an irradiation fixture and exposed to a Cobalt-60 source. After each irradiation, the sensors were recalibrated by Physical Acoustics Corporation, and the results were compared to the initial calibrations performed prior to irradiation; a control group, not exposed to radiation, was used to validate the results. This experiment determines the radiation damage threshold of each acoustic sensor system and verifies its life expectancy, usefulness, and reliability for many applications in radioactive environments.

  3. Threshold photoelectron spectroscopy of acetaldehyde and acrolein

    International Nuclear Information System (INIS)

    Yencha, Andrew J.; Siggel-King, Michele R.F.; King, George C.; Malins, Andrew E.R.; Eypper, Marie

    2013-01-01

    Highlights: •High-resolution threshold photoelectron spectrum of acetaldehyde. •High-resolution threshold photoelectron spectrum of acrolein. •High-resolution total photoion yield spectrum of acetaldehyde. •High-resolution total photoion yield spectrum of acrolein. •Determination of vertical ionization potentials in acetaldehyde and acrolein. -- Abstract: High-resolution (6 meV and 12 meV) threshold photoelectron (TPE) spectra of acetaldehyde and acrolein (2-propenal) have been recorded over the valence binding energy region 10–20 eV, employing synchrotron radiation and a penetrating-field electron spectrometer. These TPE spectra are presented here for the first time. All of the band structures observed in the TPE spectra replicate those found in the conventional HeI photoelectron (PE) spectra. However, the relative band intensities are dramatically different in the two types of spectra, a difference attributed to the different dominant formation mechanisms operative in each. In addition, some band shapes and vertical ionization potentials differ between the two types of spectra, differences associated with the autoionization of Rydberg states in the two molecules.

  4. Sensitivity and Specificity of Swedish Interactive Threshold Algorithm and Standard Full Threshold Perimetry in Primary Open-angle Glaucoma.

    Science.gov (United States)

    Bamdad, Shahram; Beigi, Vahid; Sedaghat, Mohammad Reza

    2017-01-01

    Perimetry is one of the mainstays of glaucoma diagnosis and treatment, and different testing strategies offer different accuracies. Our aim was to determine and compare the diagnostic sensitivity and specificity of the Swedish Interactive Threshold Algorithm (SITA) Fast and Standard Full Threshold (SFT) strategies of the Humphrey Field Analyzer (HFA) in identifying visual field defects in glaucoma. This prospective observational case series was conducted in a university-based eye hospital. A total of 37 eyes of 20 patients with glaucoma were evaluated using the central 30-2 program with both the SITA Fast and SFT strategies; each strategy was performed in each session, four times over a 2-week period. Data were analyzed using Student's t-test, analysis of variance, and the chi-square test. The SITA Fast and SFT strategies had an identical sensitivity of 93.3%, while their specificities were 57.4% and 71.4%, respectively. The mean duration of SFT tests was 14.6 minutes, versus 5.45 minutes for SITA Fast (a statistically significant 62.5% reduction). In gray-scale plots, the visual field defect appeared less deep in SITA Fast than in SFT, although more points reached significance in the probability deviation plots; these differences were not clinically significant. In conclusion, the SITA Fast strategy showed sensitivity comparable to the SFT strategy, albeit with reduced specificity; however, its shorter test duration makes it a more acceptable choice in many clinical situations, especially for children, the elderly, and patients with musculoskeletal disease.
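
The sensitivity and specificity figures above are simple ratios of confusion-matrix counts. A minimal sketch (with hypothetical counts chosen only to roughly reproduce the reported SITA Fast percentages, not the study's actual data) shows the computation:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN): fraction of glaucomatous eyes flagged.
    Specificity = TN/(TN+FP): fraction of healthy eyes cleared."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts, chosen only to roughly reproduce the
# reported SITA Fast figures (93.3% sensitivity, 57.4% specificity).
sens, spec = sens_spec(tp=28, fn=2, tn=27, fp=20)
print(f"sensitivity={sens:.1%} specificity={spec:.1%}")
```

The trade-off reported in the abstract is visible in the formulas: a faster strategy that flags borderline points as defects gains sensitivity (fewer FN) at the cost of specificity (more FP).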

  5. DISPOSABLE CANISTER WASTE ACCEPTANCE CRITERIA

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2001-07-30

    The purpose of this calculation is to provide the bases for defining the preclosure limits on radioactive material releases from radioactive waste forms to be received in disposable canisters at the Monitored Geologic Repository (MGR) at Yucca Mountain. Specifically, this calculation will provide the basis for criteria to be included in a forthcoming revision of the Waste Acceptance System Requirements Document (WASRD) that limits releases in terms of non-isotope-specific canister release dose-equivalent source terms. These criteria will be developed for the Department of Energy spent nuclear fuel (DSNF) standard canister, the Multicanister Overpack (MCO), the naval spent fuel canister, the High-Level Waste (HLW) canister, the plutonium can-in-canister, and the large Multipurpose Canister (MPC). The shippers of such canisters will be required to demonstrate that they meet these criteria before the canisters are accepted at the MGR. The Quality Assurance program is applicable to this calculation. The work reported in this document is part of the analysis of DSNF and is performed using procedure AP-3.124, Calculations. The work done for this analysis was evaluated according to procedure QAP-2-0, Control of Activities, which has been superseded by AP-2.21Q, Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities. This evaluation determined that such activities are subject to the requirements of DOE/RW/0333P, Quality Assurance Requirements and Description (DOE 2000). This work is also prepared in accordance with the development plan titled Design Basis Event Analyses on DOE SNF and Plutonium Can-In-Canister Waste Forms (CRWMS M&O 1999a) and Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages (CRWMS M&O 2000d). This calculation contains no electronic data applicable to any electronic data management system.

  6. Perspective: Uses and misuses of thresholds in diagnostic decision making.

    Science.gov (United States)

    Warner, Jeremy L; Najarian, Robert M; Tierney, Lawrence M

    2010-03-01

    The concept of thresholds plays a vital role in decisions involving the initiation, continuation, and completion of diagnostic testing. Much research has focused on the development of explicit thresholds, in the form of practice guidelines and decision analyses. However, these tools are used infrequently; most medical decisions are made at the bedside, using implicit thresholds. Study of these thresholds can lead to a deeper understanding of clinical decision making. The authors examine some factors constituting individual clinicians' implicit thresholds. They propose a model for static thresholds using the concept of situational gravity to explain why some thresholds are high, and some low. Next, they consider the hypothetical effects of incorrect placement of thresholds (miscalibration) and changes to thresholds during diagnosis (manipulation). They demonstrate these concepts using common clinical scenarios. Through analysis of miscalibration of thresholds, the authors demonstrate some common maladaptive clinical behaviors, which are nevertheless internally consistent. They then explain how manipulation of thresholds gives rise to common cognitive heuristics including premature closure and anchoring. They also discuss the case where no threshold has been exceeded despite exhaustive collection of data, which commonly leads to application of the availability or representativeness heuristics. Awareness of implicit thresholds allows for a more effective understanding of the processes of medical decision making and, possibly, to the avoidance of detrimental heuristics and their associated medical errors. Research toward accurately defining these thresholds for individual physicians and toward determining their dynamic properties during the diagnostic process may yield valuable insights.
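
The implicit thresholds discussed above have a classical explicit counterpart: the expected-utility treatment threshold in the style of Pauker and Kassirer, under which one acts once the probability of disease exceeds H/(H+B). A minimal sketch (all utility values hypothetical) shows how "situational gravity" moves the threshold:

```python
def treatment_threshold(harm, benefit):
    """Expected-utility action threshold (Pauker-Kassirer style):
    act once P(disease) exceeds H / (H + B), where H is the net harm
    of treating a patient without the disease and B the net benefit
    of treating one who has it. Utilities are in arbitrary units."""
    return harm / (harm + benefit)

# High situational gravity: missing the disease is very costly
# (large B relative to H), so the threshold for acting is low.
print(treatment_threshold(harm=1, benefit=19))  # -> 0.05
# Treatment itself risky, disease benign: the threshold is high.
print(treatment_threshold(harm=3, benefit=1))   # -> 0.75
```

In this framing, "miscalibration" corresponds to a clinician's implicit H and B being out of line with the actual stakes, and "manipulation" to shifting them mid-workup.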

  7. Null but not void: considerations for hypothesis testing.

    Science.gov (United States)

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.
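
One concrete way the choice of null matters: the same data can reject a one-sided null yet fail to reject the two-sided one at the same level. A minimal one-sample z-test sketch (known sigma, entirely hypothetical numbers) illustrates this:

```python
import math

def z_test_pvalues(xbar, mu0, sigma, n):
    """One-sample z-test with known sigma. Returns the two-sided
    p-value for H0: mu = mu0, and the one-sided p-value for
    H0: mu <= mu0 versus H1: mu > mu0, on the same data."""
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    upper_tail = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
    two_sided = 2.0 * min(upper_tail, 1.0 - upper_tail)
    return two_sided, upper_tail

# Hypothetical sample: the one-sided null is rejected at 0.05,
# the two-sided null is not -- same data, different conclusion.
two_sided, one_sided = z_test_pvalues(xbar=10.4, mu0=10.0, sigma=1.2, n=30)
print(f"two-sided p={two_sided:.3f}, one-sided p={one_sided:.3f}")
```

Which null is "right" is exactly the scientific, not statistical, question the article raises: a one-sided null is defensible only when deviations in the other direction are genuinely unrealistic or uninteresting.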

  8. Alzheimer's disease: the amyloid hypothesis and the Inverse Warburg effect

    KAUST Repository

    Demetrius, Lloyd A.; Magistretti, Pierre J.; Pellerin, Luc

    2015-01-01

    Epidemiological and biochemical studies show that the sporadic forms of Alzheimer's disease (AD) are characterized by the following hallmarks: (a) An exponential increase with age; (b) Selective neuronal vulnerability; (c) Inverse cancer comorbidity. The present article appeals to these hallmarks to evaluate and contrast two competing models of AD: the amyloid hypothesis (a neuron-centric mechanism) and the Inverse Warburg hypothesis (a neuron-astrocytic mechanism). We show that these three hallmarks of AD conflict with the amyloid hypothesis, but are consistent with the Inverse Warburg hypothesis, a bioenergetic model which postulates that AD is the result of a cascade of three events—mitochondrial dysregulation, metabolic reprogramming (the Inverse Warburg effect), and natural selection. We also provide an explanation for the failures of the clinical trials based on amyloid immunization, and we propose a new class of therapeutic strategies consistent with the neuroenergetic selection model.

  9. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  10. Hypothesis Testing Using the Films of the Three Stooges

    Science.gov (United States)

    Gardner, Robert; Davidson, Robert

    2010-01-01

    The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.
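
An exercise in the spirit of the article might compare, say, film runtimes between two of the populations with a two-sample test. The sketch below uses Welch's t statistic with entirely hypothetical runtimes (not actual Stooges data):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical short-film runtimes in minutes -- not actual Stooges data.
curly_era = [16.5, 17.0, 18.0, 16.0, 17.5]
shemp_era = [15.5, 16.0, 15.0, 16.5, 15.0]
t_stat = welch_t(curly_era, shemp_era)
```

Students would then compare `t_stat` against a t critical value (with Welch-Satterthwaite degrees of freedom) to decide whether to reject the null of equal mean runtimes.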

  11. Incidence of allergy and atopic disorders and hygiene hypothesis.

    Czech Academy of Sciences Publication Activity Database

    Bencko, V.; Šíma, Petr

    2017-01-01

    Vol. 2, 6 March (2017), Article No. 1244. ISSN 2474-1663. Institutional support: RVO:61388971. Keywords: allergy disorders * atopic disorders * hygiene hypothesis. Subject RIV: EE - Microbiology, Virology. OECD field: Microbiology

  13. The Double-Deficit Hypothesis in Spanish Developmental Dyslexia

    Science.gov (United States)

    Jimenez, Juan E.; Hernandez-Valle, Isabel; Rodriguez, Cristina; Guzman, Remedios; Diaz, Alicia; Ortiz, Rosario

    2008-01-01

    The double-deficit hypothesis (DDH) of developmental dyslexia was investigated in seven- to twelve-year-old Spanish children. It was observed that the double-deficit (DD) group had the greatest difficulty with reading.

  14. Independent control of joint stiffness in the framework of the equilibrium-point hypothesis.

    Science.gov (United States)

    Latash, M L

    1992-01-01

    In the framework of the equilibrium-point hypothesis, virtual trajectories and joint stiffness patterns have been reconstructed during two motor tasks practiced against a constant bias torque. One task required a voluntary increase in joint stiffness while preserving the original joint position. The other task involved fast elbow flexions over 36 degrees. Joint stiffness gradually subsided after the termination of fast movements. In both tasks, the external torque could slowly and unexpectedly change. The subjects were required not to change their motor commands if the torque changed, i.e. "to do the same no matter what the motor did". In both tasks, changes in joint stiffness were accompanied by unchanged virtual trajectories that were also independent of the absolute value of the bias torque. By contrast, the intercept of the joint compliant characteristic with the angle axis, r(t)-function, has demonstrated a clear dependence upon both the level of coactivation and external load. We assume that a template virtual trajectory is generated at a certain level of the motor hierarchy and is later scaled taking into account some commonly changing dynamic factors of the movement execution, for example, external load. The scaling leads to the generation of commands to the segmental structures that can be expressed, according to the equilibrium-point hypothesis, as changes in the thresholds of the tonic stretch reflex for corresponding muscles.

  15. Conserved-vector-current hypothesis and the ν̄_e e⁻ → π⁻π⁰ process

    International Nuclear Information System (INIS)

    Dubnickova, A.Z.; Dubnicka, S.; Rekalo, M.P.

    1992-01-01

    Based on the conserved-vector-current (CVC) hypothesis and a four-ρ-resonance unitary and analytic vector-dominance model of the pion electromagnetic form factor, σ_tot(E_ν^lab) and dσ/dE_π^lab of the weak ν̄_e e⁻ → π⁻π⁰ process are predicted theoretically for the first time. Their experimental confirmation could verify the CVC hypothesis for all energies above the two-pion threshold. Since, unlike the electromagnetic e⁺e⁻ → π⁺π⁻ process, there is no isoscalar vector-meson contribution to the weak ν̄_e e⁻ → π⁻π⁰ reaction, accurate measurements of σ_tot(E_ν^lab), which moreover grows linearly with the energy E_ν^lab, could solve the problem of the mass specification of the first excited state of the ρ(770) meson. The equality σ_tot(ν̄_e e⁻ → π⁻π⁰) = σ_tot(e⁺e⁻ → π⁺π⁻) is predicted for √s ≅ 70 GeV. 4 refs.; 5 figs.

  16. Perceived social acceptance, theory of mind and social adjustment in children with intellectual disabilities.

    Science.gov (United States)

    Fiasse, Catherine; Nader-Grosbois, Nathalie

    2012-01-01

    Perceived social acceptance, theory of mind (ToM) and social adjustment were investigated in 45 children with intellectual disabilities (ID) compared with 45 typically developing (TD) preschoolers, matched for developmental age assessed by means of the Differential Scales of Intellectual Efficiency-Revised edition (EDEI-R, Perron-Borelli, 1996). Children's understanding of beliefs and emotions was assessed by means of ToM belief tasks (Nader-Grosbois & Thirion-Marissiaux, 2011) and ToM emotion tasks (Nader-Grosbois & Thirion-Marissiaux, 2011). Seven items from the Pictorial Scale of Perceived Competence and Social Acceptance for children (PSPCSA, Harter & Pike, 1980) assessed children's perceived social acceptance. Their teachers completed the Social Adjustment for Children Scale (EASE, Hughes, Soares-Boucaud, Hochmann, & Frith, 1997). For both groups together, the results showed that perceived social acceptance mediates the relation between ToM skills and social adjustment. The presence or absence of intellectual disabilities does not moderate the relations either between ToM skills and perceived social acceptance, or between perceived social acceptance and social adjustment. The study did not confirm the difference hypothesis of structural and relational patterns between these three processes in children with ID, but instead supported the hypothesis of a similar structure that develops in a delayed manner. Copyright © 2012 Elsevier Ltd. All rights reserved.
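
The mediation claim above is typically checked by testing the indirect effect a·b, where a is the path from predictor to mediator and b the path from mediator to outcome. A minimal Sobel-test sketch (all coefficients and standard errors hypothetical, not the study's estimates) illustrates the computation:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel z for the indirect effect a*b: a is the path from predictor
    (e.g. ToM skills) to mediator (perceived social acceptance), b the
    path from mediator to outcome (social adjustment) controlling for
    the predictor. All values below are hypothetical."""
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

z = sobel_z(a=0.40, se_a=0.12, b=0.35, se_b=0.10)
print(round(z, 2))  # |z| > 1.96 would suggest a reliable indirect effect
```

Bootstrap confidence intervals for a·b are nowadays usually preferred over the Sobel test, since the product of two normal coefficients is not itself normal in small samples.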

  17. The Random-Walk Hypothesis on the Indian Stock Market

    OpenAIRE

    Ankita Mishra; Vinod Mishra; Russell Smyth

    2014-01-01

    This study tests the random walk hypothesis for the Indian stock market. Using 19 years of monthly data on six indices from the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), this study applies three different unit root tests with two structural breaks to analyse the random walk hypothesis. We find that unit root tests that allow for two structural breaks alone are not able to reject the unit root null; however, a recently developed unit root test that simultaneously accou...
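
Under the random-walk null, the AR(1) coefficient of the level series is (close to) one, which is what unit root tests probe. A minimal sketch (simulated data standing in for a price index, not the NSE/BSE series used in the study) estimates that coefficient by no-intercept OLS:

```python
import random

def ar1_coefficient(series):
    """No-intercept OLS estimate of phi in x_t = phi * x_{t-1} + e_t.
    Under the random-walk (unit root) null, phi is close to 1."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

# Simulated random walk standing in for a price index (illustrative).
random.seed(42)
walk = [0.0]
for _ in range(5000):
    walk.append(walk[-1] + random.gauss(0.0, 1.0))
phi = ar1_coefficient(walk)
```

Formal tests such as the (augmented) Dickey-Fuller test assess whether the estimated phi is significantly below one; the structural-break variants cited in the abstract additionally allow the series' level or trend to shift at estimated break dates.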

  18. The Fractal Market Hypothesis: Applications to Financial Forecasting

    OpenAIRE

    Blackledge, Jonathan

    2010-01-01

    Most financial modelling systems rely on an underlying hypothesis known as the Efficient Market Hypothesis (EMH) including the famous Black-Scholes formula for placing an option. However, the EMH has a fundamental flaw: it is based on the assumption that economic processes are normally distributed and it has long been known that this is not the case. This fundamental assumption leads to a number of shortcomings associated with using the EMH to analyse financial data which includes failure to ...

  19. Dopamine and Reward: The Anhedonia Hypothesis 30 years on

    OpenAIRE

    Wise, Roy A.

    2008-01-01

    The anhedonia hypothesis – that brain dopamine plays a critical role in the subjective pleasure associated with positive rewards – was intended to draw the attention of psychiatrists to the growing evidence that dopamine plays a critical role in the objective reinforcement and incentive motivation associated with food and water, brain stimulation reward, and psychomotor stimulant and opiate reward. The hypothesis called to attention the apparent paradox that neuroleptics, drugs used to treat ...

  20. Personal values and pain tolerance: does a values intervention add to acceptance?

    Science.gov (United States)

    Branstetter-Rost, Ann; Cushing, Christopher; Douleh, Tanya

    2009-08-01

    Previous research suggests that acceptance is a promising alternative to distraction and control techniques in successfully coping with pain. Acceptance interventions based upon Acceptance and Commitment Therapy (ACT) have been shown to lead to greater tolerance of acute pain as well as increased adjustment and less disability among individuals with chronic pain. However, in these previous intervention studies, the ACT component of values has either not been included or not specifically evaluated. The current study compares the effects of an ACT-based acceptance intervention with and without the values component among individuals completing the cold-pressor task. Results indicate that inclusion of the values component (n = 34) of ACT leads to significantly greater pain tolerance than acceptance alone (n = 30). Consistent with previous research, both conditions were associated with greater pain tolerance than control (n = 35). Despite the difference in tolerance, pain threshold did not differ, and participants in the control condition provided lower ratings of pain severity. The findings from this study support the important role of values and values clarification in acceptance-based interventions such as ACT, and provide direction for clinicians working with individuals with chronic pain conditions. This article evaluates the additive effect of including a personalized-values exercise in an acceptance-based treatment for pain. Results indicate that values interventions make a significant contribution and improvement to acceptance interventions, which may be of interest to clinicians who provide psychological treatment to individuals with chronic pain.