WorldWideScience

Sample records for acceptance threshold hypothesis

  1. Energy Threshold Hypothesis for Household Consumption

    International Nuclear Information System (INIS)

    Ortiz, Samira; Castro-Sitiriche, Marcel; Amador, Isamar

    2017-01-01

A strong positive relationship between quality of life and electricity consumption in impoverished countries has been found in many studies. However, previous work has shown that this positive relationship does not hold beyond a certain electricity consumption threshold. Consequently, there is a need to explore the possibility for communities to live at a sustainable level of energy consumption without sacrificing their quality of life. The Gallup-Healthways Report measures the wellbeing of citizens globally. This paper provides a new outlook, using these elements to explore the relationship between the actual percentage of population thriving in most countries and their energy consumption. A measure of efficiency is computed to determine an adjusted relative social value of energy, considering the variability in happy life years as a function of electric power consumption. The adjustment is performed so that no single component dominates the measurement. It is notable that the countries with the highest relative social value of energy are in the top 10 countries of the Gallup report.

  2. On the origins of autism: The Quantitative Threshold Exposure hypothesis.

    Science.gov (United States)

    Crawford, S

    2015-12-01

    The Quantitative Threshold Exposure (QTE) hypothesis is a multifactorial threshold model that accounts for the cumulative effects of risk factor exposure in both the causation of autism spectrum disorder (ASD) and its dramatic increase over the past 30 years. The QTE hypothesis proposes that ASD is triggered by the cumulative effects of high-level exposure to endogenous and environmental factors that act as antigens to impair normal immune system (IS) and associated central nervous system (CNS) functions during critical developmental stages. The quantitative threshold parameters that comprise a cumulative risk for the development of ASD are identified by the assessment of documented epidemiological factors that, in sum, determine the likelihood that ASD will occur as a result of their effects on critically integrated IS and CNS pathways active during prenatal, neo-natal and early childhood brain maturation. The model proposes an explanation for the relationship between critical developmental stages of brain/immune system development in conjunction with the quantitative effects of genetic and environmental risk factors that may interface with these critical developmental windows. This model may be useful even when the individual contributions of specific risk factors cannot be quantified, as it proposes that the combined quantitative level of exposure to risk factors for ASD rather than exposure to any one risk factor per se defines threshold occurrence rates. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Perceptibility and acceptability thresholds for colour differences in dentistry

    NARCIS (Netherlands)

    Khashayar, G.; Bain, P.A.; Salari, S.; Dozic, A.; Kleverlaan, C.J.; Feilzer, A.J.

    2014-01-01

Introduction: Data on acceptability thresholds (AT) and perceptibility thresholds (PT) for colour differences vary in the dental literature. There is consensus that the determination of ΔE* is appropriate for defining AT and PT; however, there is no consensus regarding the values that should be used. The aim of this

  4. The Threshold Hypothesis Applied to Spatial Skill and Mathematics

    Science.gov (United States)

    Freer, Daniel

    2017-01-01

    This cross-sectional study assessed the relation between spatial skills and mathematics in 854 participants across kindergarten, third grade, and sixth grade. Specifically, the study probed for a threshold for spatial skills when performing mathematics, above which spatial scores and mathematics scores would be significantly less related. This…

  5. A Threshold Accepting Metaheuristic for the Vehicle Routing Problem with Time Windows

    NARCIS (Netherlands)

    Bräysy, Olli; Berger, Jean; Barkaoui, Mohamed; Dullaert, Wout

    2003-01-01

    Threshold Accepting, a variant of Simulated Annealing, is applied for the first time to a set of 356 benchmark instances for the Vehicle Routing with Time Windows. The Threshold Accepting metaheuristic is used to improve upon results obtained with a recent parallel genetic algorithm and a
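
The acceptance rule that distinguishes Threshold Accepting from Simulated Annealing is easy to sketch: a candidate move is accepted deterministically whenever its cost increase stays below the current threshold, and the thresholds shrink over time. The sketch below is a minimal illustration on a toy one-dimensional objective, not the authors' VRPTW implementation; the objective, step size and threshold schedule are all illustrative assumptions.

```python
import random

def threshold_accepting(f, x0, thresholds, steps_per_level=200, step_size=0.5, seed=0):
    """Threshold Accepting (Dueck & Scheuer): accept any candidate move whose
    cost increase is below the current threshold; thresholds shrink over time.
    Unlike Simulated Annealing, the acceptance rule is deterministic."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for T in thresholds:
        for _ in range(steps_per_level):
            cand = x + rng.uniform(-step_size, step_size)  # random neighbour
            fc = f(cand)
            if fc - fx < T:                # accept unless clearly worse
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = x, fx
    return best, fbest

# toy objective with minimum at x = 2 (illustrative, not a VRPTW cost)
f = lambda x: (x - 2.0) ** 2
x, fx = threshold_accepting(f, x0=10.0, thresholds=[2.0, 1.0, 0.5, 0.1, 0.0])
print(x, fx)
```

The final threshold of 0.0 turns the last phase into a greedy descent, which is why the schedule conventionally ends at zero.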

  7. Relationship between Divergent Thinking and Intelligence: An Empirical Study of the Threshold Hypothesis with Chinese Children

    Science.gov (United States)

    Shi, Baoguo; Wang, Lijing; Yang, Jiahui; Zhang, Mengpin; Xu, Li

    2017-01-01

The threshold hypothesis is a classical and notable explanation for the relationship between creativity and intelligence. However, few empirical examinations of this theory exist, and the results are inconsistent. To test this hypothesis, this study investigated the relationship between divergent thinking (DT) and intelligence in a sample of 568 Chinese children aged 11 to 13 years, using testing and questionnaire methods. The study focused on the breakpoint of intelligence and the moderating effect of openness on the relationship between intelligence and DT. The findings were as follows: (1) A breakpoint was detected at an intelligence quotient (IQ) of 109.20 for the relationship between either DT fluency or DT flexibility and intelligence. Another breakpoint was detected at an IQ of 116.80 for the correlation between originality and intelligence. The breakpoint for the relation between the composite score of creativity and intelligence occurred at an IQ of 110.10. (2) Openness to experience had a moderating effect on the correlation between the indicators of creativity and intelligence below the breakpoint. Above this point, however, the effect was not significant. The results indicate a relationship between DT and intelligence among Chinese children that conforms to the threshold hypothesis. It remains necessary to explore the personality factors accounting for individual differences in the relationship between DT and intelligence. PMID:28275361

  8. Beyond the fragmentation threshold hypothesis: regime shifts in biodiversity across fragmented landscapes.

    Directory of Open Access Journals (Sweden)

    Renata Pardini

Full Text Available Ecological systems are vulnerable to irreversible change when key system properties are pushed over thresholds, resulting in the loss of resilience and the precipitation of a regime shift. Perhaps the most important of such properties in human-modified landscapes is the total amount of remnant native vegetation. In a seminal study, Andrén proposed the existence of a fragmentation threshold in the total amount of remnant vegetation, below which landscape-scale connectivity is eroded and local species richness and abundance become dependent on patch size. Despite the fact that patch-area effects have been a mainstay of conservation science, there has yet to be a robust empirical evaluation of this hypothesis. Here we present and test a new conceptual model describing the mechanisms and consequences of biodiversity change in fragmented landscapes, identifying the fragmentation threshold as the first step in a positive feedback mechanism that has the capacity to impair ecological resilience and drive a regime shift in biodiversity. The model considers that local extinction risk is defined by patch size, and immigration rates by landscape vegetation cover, and that recovery from local species losses depends upon the landscape species pool. Using a unique dataset on the distribution of non-volant small mammals across replicate landscapes in the Atlantic forest of Brazil, we found strong evidence for our model predictions: patch-area effects are evident only at intermediate levels of total forest cover, where landscape diversity is still high and opportunities for enhancing biodiversity through local management are greatest. Furthermore, high levels of forest loss can push native biota through an extinction filter, resulting in the abrupt, landscape-wide loss of forest-specialist taxa, ecological resilience and management effectiveness. The proposed model links hitherto distinct theoretical approaches within a single framework.

  9. The threshold hypothesis: solving the equation of nurture vs nature in type 1 diabetes.

    Science.gov (United States)

    Wasserfall, C; Nead, K; Mathews, C; Atkinson, M A

    2011-09-01

    For more than 40 years, the contributions of nurture (i.e. the environment) and nature (i.e. genetics) have been touted for their aetiological importance in type 1 diabetes. Disappointingly, knowledge gains in these areas, while individually successful, have to a large extent occurred in isolation from each other. One reason underlying this divide is the lack of a testable model that simultaneously considers the contributions of genetic and environmental determinants in the formation of this and potentially other disorders that are subject to these variables. To address this void, we have designed a model based on the hypothesis that the aetiological influences of genetics and environment, when evaluated as intersecting and reciprocal trend lines based on odds ratios, result in a method of concurrently evaluating both facets and defining the attributable risk of clinical onset of type 1 diabetes. The model, which we have elected to term the 'threshold hypothesis', also provides a novel means of conceptualising the complex interactions of nurture with nature in type 1 diabetes across various geographical populations.

  10. Acceptance sampling for attributes via hypothesis testing and the hypergeometric distribution

    Science.gov (United States)

    Samohyl, Robert Wayne

    2017-10-01

This paper questions some aspects of attribute acceptance sampling in light of the original concepts of hypothesis testing from Neyman and Pearson (NP). Attribute acceptance sampling in industry, as developed by Dodge and Romig (DR), generally follows the international standard ISO 2859, and similarly the Brazilian standards NBR 5425 to NBR 5427 and the United States standard ANSI/ASQC Z1.4. The paper evaluates and extends the area of acceptance sampling in two directions. First, it suggests using the hypergeometric distribution to calculate the parameters of sampling plans, avoiding unnecessary approximations such as the binomial or Poisson distributions. We show that, under usual conditions, the discrepancies can be large. The conclusion is that the hypergeometric distribution, ubiquitously available in commonly used software, is more appropriate than other distributions for acceptance sampling. Second, and more importantly, it elaborates the theory of acceptance sampling in terms of hypothesis testing, rigorously following the original concepts of NP. By offering a common theoretical structure, hypothesis testing in the NP tradition can produce a better understanding of applications even beyond the usual areas of industry and commerce, such as public health and political polling. With the new procedures, both sample size and sample error can be reduced. What is unclear in traditional acceptance sampling is the necessity of linking the acceptable quality limit (AQL) exclusively to the producer and the lot tolerance percent defective (LTPD) exclusively to the consumer. In reality, the consumer should also be concerned with a value of AQL, as should the producer with LTPD. Furthermore, one can also question why type I error is always uniquely associated with the producer as producer risk, and likewise why consumer risk is necessarily associated with type II error. The resolution of these questions is new to the literature. The
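
The paper's first point, that the exact hypergeometric distribution should replace binomial or Poisson approximations when lots are finite, is easy to illustrate. The sketch below computes the probability of accepting a lot under a plan (n, c), exactly and under the binomial approximation; the lot size, defect count and plan parameters are made-up illustrative numbers, not ones from the paper.

```python
from math import comb

def accept_prob_hypergeom(N, D, n, c):
    """P(accept) = P(X <= c) when drawing n items without replacement
    from a lot of N items containing D defectives (hypergeometric)."""
    return sum(comb(D, k) * comb(N - D, n - k) for k in range(c + 1)) / comb(N, n)

def accept_prob_binomial(p, n, c):
    """Binomial approximation: as if sampling with replacement at defect rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# small lot where the approximation error is visible (illustrative numbers)
N, D, n, c = 100, 10, 20, 1
pa_h = accept_prob_hypergeom(N, D, n, c)
pa_b = accept_prob_binomial(D / N, n, c)
print(f"hypergeometric Pa = {pa_h:.4f}, binomial Pa = {pa_b:.4f}")
```

For this plan the binomial approximation overstates the acceptance probability, because sampling without replacement from a small lot has less variance than the binomial model assumes.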

  11. The relationship between intelligence and creativity: New support for the threshold hypothesis by means of empirical breakpoint detection

    Science.gov (United States)

    Jauk, Emanuel; Benedek, Mathias; Dunst, Beate; Neubauer, Aljoscha C.

    2013-01-01

    The relationship between intelligence and creativity has been subject to empirical research for decades. Nevertheless, there is yet no consensus on how these constructs are related. One of the most prominent notions concerning the interplay between intelligence and creativity is the threshold hypothesis, which assumes that above-average intelligence represents a necessary condition for high-level creativity. While earlier research mostly supported the threshold hypothesis, it has come under fire in recent investigations. The threshold hypothesis is commonly investigated by splitting a sample at a given threshold (e.g., at 120 IQ points) and estimating separate correlations for lower and upper IQ ranges. However, there is no compelling reason why the threshold should be fixed at an IQ of 120, and to date, no attempts have been made to detect the threshold empirically. Therefore, this study examined the relationship between intelligence and different indicators of creative potential and of creative achievement by means of segmented regression analysis in a sample of 297 participants. Segmented regression allows for the detection of a threshold in continuous data by means of iterative computational algorithms. We found thresholds only for measures of creative potential but not for creative achievement. For the former the thresholds varied as a function of criteria: When investigating a liberal criterion of ideational originality (i.e., two original ideas), a threshold was detected at around 100 IQ points. In contrast, a threshold of 120 IQ points emerged when the criterion was more demanding (i.e., many original ideas). Moreover, an IQ of around 85 IQ points was found to form the threshold for a purely quantitative measure of creative potential (i.e., ideational fluency). These results confirm the threshold hypothesis for qualitative indicators of creative potential and may explain some of the observed discrepancies in previous research. In addition, we obtained
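
The study's empirical breakpoint detection can be sketched as segmented regression fitted by grid search: for each candidate threshold, fit separate least-squares lines below and above it and keep the split with the lowest total squared error. Published implementations use iterative algorithms; this simplified grid-search version, run on synthetic data with a kink planted at an IQ-like value of 120, is only meant to show the idea.

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def find_breakpoint(xs, ys, candidates):
    """Segmented regression by grid search: choose the split point that
    minimizes the total SSE of independent fits below and above it."""
    best_t, best_sse = None, float("inf")
    for t in candidates:
        lo = [(x, y) for x, y in zip(xs, ys) if x <= t]
        hi = [(x, y) for x, y in zip(xs, ys) if x > t]
        if len(lo) < 3 or len(hi) < 3:
            continue
        sse = fit_line(*zip(*lo))[2] + fit_line(*zip(*hi))[2]
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t

# synthetic scores: rising relation below a planted threshold of 120, flat above
rng = random.Random(1)
xs = [rng.uniform(80, 150) for _ in range(300)]
ys = [(x - 60.0 if x < 120 else 60.0) + rng.gauss(0, 2.0) for x in xs]
bp = find_breakpoint(xs, ys, candidates=range(90, 141))
print(bp)
```

The recovered split lands near the planted kink; real segmented-regression packages additionally give confidence intervals for the breakpoint, which the study relies on.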

  12. Molecular biology, epidemiology, and the demise of the linear no-threshold (LNT) hypothesis.

    Science.gov (United States)

    Pollycove, M; Feinendegen, L E

    1999-01-01

The prime concern of radiation protection policy since 1959 has been protecting DNA from damage. The 1995 NCRP Report 121 on collective dose states that since no human data provide direct support for the linear no-threshold (LNT) hypothesis, and some studies provide quantitative data that, with statistical significance, contradict LNT, ultimately, confidence in LNT is based on the biophysical concept that the passage of a single charged particle could cause damage to DNA that would result in cancer. Current understanding of the basic molecular biologic mechanisms involved and recent data are examined before presenting several statistically significant epidemiologic studies that contradict the LNT hypothesis. Over eons of time a complex biosystem evolved to control the DNA alterations (oxidative adducts) produced by about 10^10 free radicals/cell/d derived from 2-3% of all metabolized oxygen. Antioxidant prevention, enzymatic repair of DNA damage, and removal of persistent DNA alterations by apoptosis, differentiation, necrosis, and the immune system sequentially reduce DNA damage from about 10^6 DNA alterations/cell/d to about 1 mutation/cell/d. These mutations accumulate in stem cells during a lifetime, with progressive DNA damage-control impairment associated with aging and malignant growth. A comparatively negligible number of mutations, an average of about 10^-7 mutations/cell/d, is produced by the low-LET radiation background of 0.1 cGy/y. The remarkable efficiency of this biosystem is increased by adaptive responses to low-dose ionizing radiation. Each of the sequential functions that prevent, repair, and remove DNA damage is adaptively stimulated by low-dose ionizing radiation, in contrast to their impairment by high-dose radiation. The biologic effect of radiation is not determined by the number of mutations it creates, but by its effect on the biosystem that controls the relentless enormous burden of oxidative DNA damage. At low doses, radiation

  13. Rethinking avian response to Tamarix on the lower Colorado River: A threshold hypothesis

    Science.gov (United States)

    van Riper, Charles; Paxton, K.L.; O'brien, C.; Shafroth, P.B.; McGrath, L.J.

    2008-01-01

Many of the world's large river systems have been greatly altered in the past century due to river regulation, agriculture, and invasion of introduced Tamarix spp. (saltcedar, tamarisk). These riverine ecosystems are known to provide important habitat for avian communities, but information on the responses of birds to differing levels of Tamarix is lacking. Past research on birds along the Colorado River has shown that avian abundance in general is greater in native than in non-native habitat. In this article, we address habitat restoration on the lower Colorado River by comparing abundance and diversity of avian communities across a matrix of different amounts of native and non-native habitats at National Wildlife Refuges in Arizona. Two major patterns emerged from this study: (1) Not all bird species responded to Tamarix in a similar fashion, and for many bird species, abundance was highest at intermediate Tamarix levels (40-60%), suggesting a response threshold. (2) In Tamarix-dominated habitats, the greatest increase in bird abundance occurred when small amounts of native vegetation were present as a component of that habitat. In fact, Tamarix was the best vegetation predictor of avian abundance when compared to vegetation density and canopy cover. Our results suggest that one cost-effective way to rehabilitate large monoculture Tamarix stands, benefiting avian abundance and diversity, would be to add relatively low levels of native vegetation (~20-40%) within homogeneous Tamarix habitat. This could be much more cost-effective and feasible than attempting to replace all Tamarix with native vegetation. © 2008 Society for Ecological Restoration International.

  14. Air Traffic Controller Acceptability of Unmanned Aircraft System Detect-and-Avoid Thresholds

    Science.gov (United States)

    Mueller, Eric R.; Isaacson, Douglas R.; Stevens, Derek

    2016-01-01

    A human-in-the-loop experiment was conducted with 15 retired air traffic controllers to investigate two research questions: (a) what procedures are appropriate for the use of unmanned aircraft system (UAS) detect-and-avoid systems, and (b) how long in advance of a predicted close encounter should pilots request or execute a separation maneuver. The controller participants managed a busy Oakland air route traffic control sector with mixed commercial/general aviation and manned/UAS traffic, providing separation services, miles-in-trail restrictions and issuing traffic advisories. Controllers filled out post-scenario and post-simulation questionnaires, and metrics were collected on the acceptability of procedural options and temporal thresholds. The states of aircraft were also recorded when controllers issued traffic advisories. Subjective feedback indicated a strong preference for pilots to request maneuvers to remain well clear from intruder aircraft rather than deviate from their IFR clearance. Controllers also reported that maneuvering at 120 seconds until closest point of approach (CPA) was too early; maneuvers executed with less than 90 seconds until CPA were more acceptable. The magnitudes of the requested maneuvers were frequently judged to be too large, indicating a possible discrepancy between the quantitative UAS well clear standard and the one employed subjectively by manned pilots. The ranges between pairs of aircraft and the times to CPA at which traffic advisories were issued were used to construct empirical probability distributions of those metrics. Given these distributions, we propose that UAS pilots wait until an intruder aircraft is approximately 80 seconds to CPA or 6 nmi away before requesting a maneuver, and maneuver immediately if the intruder is within 60 seconds and 4 nmi. These thresholds should make the use of UAS detect and avoid systems compatible with current airspace procedures and controller expectations.
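
The temporal thresholds in the study are expressed as time until closest point of approach (CPA). Under a constant-velocity assumption, time to CPA follows from elementary kinematics; the sketch below is that textbook calculation, not NASA's detect-and-avoid logic, and the encounter geometry is invented for illustration.

```python
def time_to_cpa(rel_pos, rel_vel):
    """Time (s) until closest point of approach for constant-velocity motion.
    rel_pos: intruder position minus ownship position (nmi);
    rel_vel: relative velocity (nmi/s). Returns 0 if already diverging."""
    vv = rel_vel[0] ** 2 + rel_vel[1] ** 2
    if vv == 0:
        return 0.0
    t = -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / vv
    return max(t, 0.0)

def range_at_cpa(rel_pos, rel_vel):
    """Separation (nmi) between the aircraft at the moment of CPA."""
    t = time_to_cpa(rel_pos, rel_vel)
    dx = rel_pos[0] + rel_vel[0] * t
    dy = rel_pos[1] + rel_vel[1] * t
    return (dx ** 2 + dy ** 2) ** 0.5

# illustrative near-head-on geometry: intruder 10 nmi ahead, slightly offset,
# closing at 480 kt (converted to nmi/s)
rel_pos, rel_vel = (10.0, 0.5), (-480 / 3600, 0.0)
t = time_to_cpa(rel_pos, rel_vel)
print(f"t_CPA = {t:.0f} s, range at CPA = {range_at_cpa(rel_pos, rel_vel):.2f} nmi")
```

At 75 s to CPA, this encounter would already fall inside the roughly 80 s request threshold the study proposes.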

  15. Panel discussion on health effects of low-dose ionizing radiation. Scientific findings and non-threshold hypothesis

    International Nuclear Information System (INIS)

    1995-06-01

This is a record of a panel discussion at the IAEA Interregional Training Course. In current radiation work, protection measures are taken on the assumption that any amount of radiation, however small, entails a risk of deleterious effects. This so-called non-threshold assumption of radiation effects creates public distrust of radiation use. Because the health effects of low-dose ionizing radiation are difficult to verify, however, views range widely, from the non-threshold hypothesis to one that regards small amounts of radiation as beneficial and even necessary. In this panel discussion, how the health effects of low-dose ionizing radiation should be considered from the standpoint of radiation protection was discussed. Panelists included such eminent scientists as Dr. Sugahara and Dr. Okada, who are deeply interested in this field and play leading parts in radiobiology research in Japan, and Dr. Stather, Deputy Director of NRPB, UK, who, in UNSCEAR and ICRP, is actively participating in the international review of radiation effects and the preparation of reports on radiation protection recommendations. They agreed that although it is reasonable, under current scientific understanding, to follow the recommendations of ICRP, research in this area should be strongly promoted so that radiation protection can rest on firm scientific grounds. Many participants actively asked about and discussed problems in their own fields. (author)

  16. Can you taste it? Taste detection and acceptability thresholds for chlorine residual in drinking water in Dhaka, Bangladesh.

    Science.gov (United States)

    Crider, Yoshika; Sultana, Sonia; Unicomb, Leanne; Davis, Jennifer; Luby, Stephen P; Pickering, Amy J

    2018-02-01

Chlorination is a low-cost, effective method for drinking water treatment, but aversion to the taste or smell of chlorinated water can limit use of chlorine treatment products. Forced-choice triangle tests were used to evaluate chlorine detection and acceptability thresholds for two common types of chlorine among adults in Dhaka, Bangladesh, where previous studies have found low sustained uptake of chlorine water treatment products. The median detection threshold was 0.70 mg/L (n=25, SD=0.57) for water dosed with liquid sodium hypochlorite (NaOCl) and 0.73 mg/L (n=25, SD=0.83) for water dosed with solid sodium dichloroisocyanurate (NaDCC). Median acceptability thresholds (based on user report) were 1.16 mg/L (SD=0.70) for NaOCl and 1.26 mg/L (SD=0.67) for NaDCC. There was no significant difference in detection or acceptability thresholds for dosing with NaOCl versus NaDCC. Although users are willing to accept treated water in which they can detect the taste of chlorine, their acceptability limit is well below the 2.0 mg/L that chlorine water treatment products are often designed to dose. For some settings, reducing the dose may increase adoption of chlorinated water while still providing effective disinfection. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
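
The idea of generating Pareto-optimal points by bounding the objectives resembles what is now called the ε-constraint method: hold one objective under an acceptability threshold and minimize the other, sweeping the threshold to trace the front. The sketch below applies that scheme to a toy two-objective problem over a coarse grid; the objectives and threshold levels are illustrative assumptions, not Giesy's formulation.

```python
def pareto_via_thresholds(candidates, f1, f2, eps_levels):
    """Threshold-of-acceptability (epsilon-constraint) sketch: for each
    acceptability threshold eps on f2, solve the single-objective problem
    min f1 subject to f2(x) <= eps. Each constrained optimum is (weakly)
    Pareto-optimal, so sweeping eps traces out the front."""
    front = []
    for eps in eps_levels:
        feasible = [x for x in candidates if f2(x) <= eps]
        if feasible:
            best = min(feasible, key=f1)
            pt = (f1(best), f2(best))
            if pt not in front:
                front.append(pt)
    return front

# toy bi-objective problem: f1 = x^2 and f2 = (x - 2)^2 conflict on [0, 3]
xs = [i / 10 for i in range(0, 31)]
f1 = lambda x: x * x
f2 = lambda x: (x - 2) ** 2
front = pareto_via_thresholds(xs, f1, f2, eps_levels=[4.0, 2.0, 1.0, 0.5, 0.0])
print(front)
```

Tightening the threshold on f2 forces progressively larger sacrifices in f1, which is exactly the trade-off curve a Pareto front expresses.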

  18. Threshold Assessment: Definition of Acceptable Sites as Part of Site Selection for the Japanese HLW Program

    International Nuclear Information System (INIS)

    McKenna, S.A.; Wakasugi, Keiichiro; Webb, E.K.; Makino, Hitoshi; Ishihara, Yoshinao; Ijiri, Yuji; Sawada, Atsushi; Baba, Tomoko; Ishiguro, Katsuhiko; Umeki, Hiroyuki

    2000-01-01

For the last ten years, the Japanese High-Level Nuclear Waste (HLW) repository program has focused on assessing the feasibility of a basic repository concept, which resulted in the recently published H12 Report. As Japan enters the implementation phase, a new organization must identify, screen and choose potential repository sites. Thus, a rapid mechanism for determining the likelihood of site suitability is critical. The threshold approach, described here, is a simple mechanism for defining the likelihood that a site is suitable given estimates of several critical parameters. We rely on the results of a companion paper, which described a probabilistic performance assessment simulation of the HLW reference case in the H12 Report. The two or three most critical input parameters are plotted against each other and treated as spatial variables. Geostatistics is used to interpret the spatial correlation, which in turn is used to simulate multiple realizations of the parameter value maps. By combining an array of realizations, we can estimate the probability that a given site, as represented by estimates of this combination of parameters, would be a good host for a repository.

  19. Social psychological approach to the problem of threshold

    International Nuclear Information System (INIS)

    Nakayachi, Kazuya

    1999-01-01

    This paper discusses the threshold of carcinogen risk from the viewpoint of social psychology. First, the results of a survey suggesting that renunciation of the Linear No-Threshold (LNT) hypothesis would have no influence on the public acceptance (PA) of nuclear power plants are reported. Second, the relationship between the adoption of the LNT hypothesis and the standardization of management for various risks are discussed. (author)

  20. The Role of High Frequency Dynamic Threshold (HiDT) Serum Carcinoembryonic Antigen (CEA) Measurements in Colorectal Cancer Surveillance : A (Revisited) Hypothesis Paper

    NARCIS (Netherlands)

    Grossmann, Irene; Verberne, Charlotte; De Bock, Geertruida; Havenga, Klaas; Kema, Ido; Klaase, Joost; Renehan, Andrew; Wiggers, Theo

    2011-01-01

    Following curative treatment for colorectal cancer (CRC), 30% to 50% of patients will develop recurrent disease. For CRC there are several lines of evidence supporting the hypothesis that early detection of metachronous disease offers a second opportunity for cure. This paper revisits the potential

  1. Honeybee (Apis cerana) foraging responses to the toxic honey of Tripterygium hypoglaucum (Celastraceae): changing threshold of nectar acceptability.

    Science.gov (United States)

    Tan, K; Guo, Y H; Nicolson, S W; Radloff, S E; Song, Q S; Hepburn, H R

    2007-12-01

To investigate honeybee foraging responses to toxic nectar, honey was collected from Apis cerana colonies in the Yaoan county of Yunnan Province, China, during June, when flowers of Tripterygium hypoglaucum were the main nectar source available. Pollen analysis confirmed the origin of the honey, and high-performance liquid chromatography showed the prominent component triptolide to be present at a concentration of 0.61 μg/g ± 0.11 (SD). In cage tests that used young adult worker bees, significantly more of those provided with a diet of T. hypoglaucum honey mixed with sugar powder (1:1) died within 6 d (68.3%) compared to control groups provided with normal honey mixed with sugar powder (15.8%). Honeybees were trained to visit feeders that contained honey of T. hypoglaucum (toxic honey) as the test group and honey of Vicia sativa or Elsholtzia ciliata as control groups (all honeys diluted 1:3 with water). Bees preferred the feeders with normal honey to those with toxic honey, as shown by significantly higher visiting frequencies and longer imbibition times. However, when the feeder of normal honey was removed, leaving only honey of T. hypoglaucum, the foraging bees returned to the toxic honey after a few seconds of hesitation, and both visiting frequency and imbibition time increased to values previously recorded for normal honey. Toxic honey thus became acceptable to the bees in the absence of other nectar sources.

  2. Improving the understanding of sleep apnea characterization using Recurrence Quantification Analysis by defining overall acceptable values for the dimensionality of the system, the delay, and the distance threshold.

    Science.gov (United States)

    Martín-González, Sofía; Navarro-Mesa, Juan L; Juliá-Serdá, Gabriel; Ramírez-Ávila, G Marcelo; Ravelo-García, Antonio G

    2018-01-01

    Our contribution focuses on the characterization of sleep apnea from a cardiac rate point of view, using Recurrence Quantification Analysis (RQA), based on a Heart Rate Variability (HRV) feature selection process. Three parameters are crucial in RQA: those related to the embedding process (dimension and delay) and the threshold distance. There are no overall accepted parameters for the study of HRV using RQA in sleep apnea. We focus on finding an overall acceptable combination, sweeping a range of values for each of them simultaneously. Together with the commonly used RQA measures, we include features related to recurrence times, and features originating in the complex network theory. To the best of our knowledge, no author has used them all for sleep apnea previously. The best performing feature subset is entered into a Linear Discriminant classifier. The best results in the "Apnea-ECG Physionet database" and the "HuGCDN2014 database" are, according to the area under the receiver operating characteristic curve, 0.93 (Accuracy: 86.33%) and 0.86 (Accuracy: 84.18%), respectively. Our system outperforms, using a relatively small set of features, previously existing studies in the context of sleep apnea. We conclude that working with dimensions around 7-8 and delays about 4-5, and using for the threshold distance the Fixed Amount of Nearest Neighbours (FAN) method with 5% of neighbours, yield the best results. Therefore, we would recommend these reference values for future work when applying RQA to the analysis of HRV in sleep apnea. We also conclude that, together with the commonly used vertical and diagonal RQA measures, there are newly used features that contribute valuable information for apnea minutes discrimination. Therefore, they are especially interesting for characterization purposes. Using two different databases supports that the conclusions reached are potentially generalizable, and are not limited by database variability.
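
The Fixed Amount of Nearest Neighbours (FAN) thresholding the authors recommend can be sketched directly: after time-delay embedding, each column of the recurrence matrix marks a fixed fraction of nearest neighbours, so the recurrence rate is pinned at that fraction regardless of signal amplitude. The sketch below uses a noisy sine wave as a stand-in series (not HRV data) with the suggested dimension 7, delay 4 and 5% neighbourhood; the RQA feature set and classifier are omitted.

```python
import math
import random

def embed(x, dim, delay):
    """Time-delay embedding: vectors [x[i], x[i+delay], ..., x[i+(dim-1)*delay]]."""
    n = len(x) - (dim - 1) * delay
    return [[x[i + j * delay] for j in range(dim)] for i in range(n)]

def recurrence_fan(vectors, fan=0.05):
    """Recurrence matrix under the FAN rule: column j marks the fan-fraction
    of points closest to vector j, so the recurrence rate is fixed by
    construction (the matrix need not be symmetric)."""
    n = len(vectors)
    k = max(1, int(fan * n))
    R = [[0] * n for _ in range(n)]
    for j in range(n):
        order = sorted(range(n), key=lambda i: math.dist(vectors[i], vectors[j]))
        for i in order[1:k + 1]:      # skip index 0: the point itself
            R[i][j] = 1
    return R

# noisy sine as a stand-in for an HRV series; dim=7, delay=4 per the paper
rng = random.Random(0)
x = [math.sin(0.3 * t) + 0.1 * rng.gauss(0, 1) for t in range(300)]
R = recurrence_fan(embed(x, dim=7, delay=4), fan=0.05)
rate = sum(map(sum, R)) / len(R) ** 2
print(f"recurrence rate = {rate:.3f}")
```

Because each column contains exactly k recurrent points, the measured rate sits at roughly the chosen 5% whatever the series amplitude, which is the property that makes FAN attractive for comparing recordings.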

  3. Reactions to Unfavorable Evaluations of the Self as a Function of Acceptance of Disability: A Test of Dembo, Leviton, And Wright's Misfortune Hypothesis

    Science.gov (United States)

    Grand, Sheldon A.

    1972-01-01

    Based on Dembo et al. (1956), it was predicted that in comparing high acceptors of their disability with low acceptors, the latter would (a) rate a non-disabled negative evaluator as more likeable, (b) perceive themselves as more similar to him, and (c) suffer more as a result of exposure to him. The hypothesis was confirmed. (Author/BY)

  4. Comparison of Threshold Saccadic Vector Optokinetic Perimetry (SVOP) and Standard Automated Perimetry (SAP) in Glaucoma. Part II: Patterns of Visual Field Loss and Acceptability.

    Science.gov (United States)

    McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A

    2017-09-01

    We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.

  5. Quantifying ecological thresholds from response surfaces

    Science.gov (United States)

    Heather E. Lintz; Bruce McCune; Andrew N. Gray; Katherine A. McCulloh

    2011-01-01

    Ecological thresholds are abrupt changes of ecological state. While an ecological threshold is a widely accepted concept, most empirical methods detect them in time or across geographic space. Although useful, these approaches do not quantify the direct drivers of threshold response. Causal understanding of thresholds detected empirically requires their investigation...

  6. Physiopathological Hypothesis of Cellulite

    OpenAIRE

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions are asked concerning this condition, including its name, the consensus about the histopathological findings, the physiological hypothesis, and treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite, and the technique employed are fundamental to success.

  7. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  8. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions are asked concerning this condition, including its name, the consensus about the histopathological findings, the physiological hypothesis, and treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite, and the technique employed are fundamental to success. PMID:19756187

  9. Life Origination Hydrate Hypothesis (LOH-Hypothesis).

    Science.gov (United States)

    Ostrovskii, Victor; Kadyshevich, Elena

    2012-01-04

    The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs, which are N-bases, riboses, nucleosides, nucleotides), DNA- and RNA-like molecules, amino-acids, and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  10. Life Origination Hydrate Hypothesis (LOH-Hypothesis

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

    Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs, which are N-bases, riboses, nucleosides, nucleotides), DNA- and RNA-like molecules, amino-acids, and proto-cells repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their “thermodynamic front” guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  11. How do Economic Growth Asymmetry and Inflation Expectations Affect Fisher Hypothesis and Fama’s Proxy Hypothesis?

    Directory of Open Access Journals (Sweden)

    Yuan-Ming Lee

    2017-12-01

    Full Text Available Based on the threshold panel data model, this study employs the quarterly panel data of 38 countries between 1981 and 2014 to test whether economic growth asymmetry, expected inflation, and unexpected inflation affect the Fisher hypothesis and Fama’s proxy hypothesis. The empirical results show the following: (1) When the real economic growth rate is greater than the threshold (-0.009), the Fisher hypothesis is supported. (2) When the real economic growth rate is less than the threshold (-0.009), two scenarios hold true: before real variables are included, the Fisher hypothesis is rejected; and when real variables are included, real economic growth is negative, inflation is expected, and thus, Fama’s hypothesis is supported.
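The regime split at the growth threshold can be illustrated with a toy regression: estimate the Fisher slope (nominal rate on expected inflation) separately above and below the threshold, a slope near 1 supporting the hypothesis in that regime. This is a simplified sketch of the idea, not the study's panel estimator; all names are hypothetical.

```python
import numpy as np

def fisher_slopes(growth, nominal_rate, expected_inflation, threshold=-0.009):
    """OLS slope of the nominal interest rate on expected inflation,
    estimated separately in the two growth regimes."""
    g = np.asarray(growth, dtype=float)
    y = np.asarray(nominal_rate, dtype=float)
    pi = np.asarray(expected_inflation, dtype=float)
    slopes = {}
    for regime, mask in (("above", g > threshold), ("below", g <= threshold)):
        X = np.column_stack([np.ones(mask.sum()), pi[mask]])  # intercept + inflation
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        slopes[regime] = float(beta[1])  # ~1 would support the Fisher hypothesis
    return slopes
```

On synthetic data built so that one-for-one pass-through holds only above the threshold, the estimator recovers a unit slope there and a zero slope below.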

  12. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  13. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  14. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted...... locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated......
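A minimal sketch of the approach, under an assumed Gaussian background model: every pixel is tested against the estimated background distribution, and the segment is the set of outliers, with the acceptance threshold chosen by Benjamini-Hochberg control of the false discovery rate (one standard large-scale-testing correction; the paper's exact selection rule may differ). All names here are illustrative.

```python
import math
import numpy as np

def normal_sf(z):
    """Upper-tail probability of the standard normal."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def segment_by_testing(img, bg_mean, bg_std, fdr=0.05):
    """Flag pixels significantly brighter than the background,
    controlling the false discovery rate with Benjamini-Hochberg."""
    z = (np.asarray(img, dtype=float).ravel() - bg_mean) / bg_std
    p = np.array([normal_sf(v) for v in z])        # one-sided p-values
    m = p.size
    order = np.argsort(p)
    crit = fdr * np.arange(1, m + 1) / m           # BH critical line
    below = p[order] <= crit
    k = int(np.max(np.nonzero(below)[0])) + 1 if below.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True                         # k most significant pixels
    return mask.reshape(np.asarray(img).shape)
```

The data-driven cut-off adapts to the image: a few strong outliers pass, while pure background yields an (almost) empty segment.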

  15. The Lehman Sisters Hypothesis

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2014-01-01

    __Abstract__ This article explores the Lehman Sisters Hypothesis. It reviews empirical literature about gender differences in behavioral, experimental, and neuro-economics as well as in other fields of behavioral research. It discusses gender differences along three dimensions of

  16. Revisiting the Dutch hypothesis

    NARCIS (Netherlands)

    Postma, Dirkje S.; Weiss, Scott T.; van den Berge, Maarten; Kerstjens, Huib A. M.; Koppelman, Gerard H.

    The Dutch hypothesis was first articulated in 1961, when many novel and advanced scientific techniques were not available, such as genomics techniques for pinpointing genes, gene expression, lipid and protein profiles, and the microbiome. In addition, computed tomographic scans and advanced analysis

  17. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding

  18. History of the emergence of the electron spin hypothesis

    International Nuclear Information System (INIS)

    Fu Haihui

    2002-01-01

    Electron spin is an important concept in atomic physics and quantum mechanics. The emergence and acceptance of the hypothesis of electron spin has a special place in history, and is reviewed here from three aspects

  19. The Drift Burst Hypothesis

    OpenAIRE

    Christensen, Kim; Oomen, Roel; Renò, Roberto

    2016-01-01

    The Drift Burst Hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and Treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms such as feedback trading. At a theoretical level, we show how to build drift bursts into the...

  20. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
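As a concrete instance of the kind of computation the slides describe, take the textbook Gaussian case: H0 fixes μ = 0, H1 draws μ from N(0, τ²), and the Bayes factor in favor of H0 is a ratio of marginal densities of the sample mean; sweeping τ² then locates the value least favorable to H0. This sketch is illustrative, not the slides' own code.

```python
import math

def bayes_factor_01(xbar, n, sigma2, tau2):
    """BF01 for H0: mu = 0 vs H1: mu ~ N(0, tau2), with data x_i ~ N(mu, sigma2).
    Under H0 the sample mean is N(0, sigma2/n); under H1 it is marginally
    N(0, tau2 + sigma2/n)."""
    def norm_pdf(x, var):
        return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)
    return norm_pdf(xbar, sigma2 / n) / norm_pdf(xbar, tau2 + sigma2 / n)
```

With x̄ = 0 the factor favors H0 (it equals √(1 + nτ²/σ²)), while a sample mean far from 0 drives it well below 1; minimizing over τ² for the observed data gives the least-favorable bound the slides mention.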

  1. The interactive brain hypothesis.

    Science.gov (United States)

    Di Paolo, Ezequiel; De Jaegher, Hanne

    2012-01-01

    Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis (IBH) in order to help map the spectrum of possible relations between social interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organization of interaction processes that characterize the dynamics of social engagement. The patterns and synergies of this self-organization help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea links interactive factors to more classical observational scenarios.

  2. The interactive brain hypothesis

    Directory of Open Access Journals (Sweden)

    Ezequiel Alejandro Di Paolo

    2012-06-01

    Full Text Available Enactive approaches foreground the role of interpersonal interaction in explanations of social understanding. This motivates, in combination with a recent interest in neuroscientific studies involving actual interactions, the question of how interactive processes relate to neural mechanisms involved in social understanding. We introduce the Interactive Brain Hypothesis in order to help map the possible relations between interaction and neural processes. The hypothesis states that interactive experience and skills play enabling roles in both the development and current function of social brain mechanisms, even in cases where social understanding happens in the absence of immediate interaction. We examine the plausibility of this hypothesis against developmental and neurobiological evidence and contrast it with the widespread assumption that mindreading is crucial to all social cognition. We describe the elements of social interaction that bear most directly on this hypothesis and discuss the empirical possibilities open to social neuroscience. We propose that the link between coordination dynamics and social understanding can be best grasped by studying transitions between states of coordination. These transitions form part of the self-organisation of interaction processes that characterise the dynamics of social engagement. The patterns and synergies of this self-organisation help explain how individuals understand each other. Various possibilities for role-taking emerge during interaction, determining a spectrum of participation. This view contrasts sharply with the observational stance that has guided research in social neuroscience until recently. We also introduce the concept of readiness to interact to describe the developed practices and dispositions that are summoned in situations of social significance (even if not interactive). This latter idea could link interactive factors to more classical observational scenarios.

  3. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  4. Hypothesis in research

    Directory of Open Access Journals (Sweden)

    Eudaldo Enrique Espinoza Freire

    2018-01-01

    Full Text Available This work aims to provide material covering the fundamental contents that enable the university professor to formulate a hypothesis for the development of an investigation, taking into account the problem to be solved. For its elaboration, a search of primary documents was carried out, such as degree theses and reports of research results, selected on the basis of their relevance to the analyzed subject, currency, and reliability; secondary documents, such as scientific articles published in journals of recognized prestige, were selected using the same criteria. It presents an updated conceptualization of the hypothesis, its characterization, and an analysis of the structure of the hypothesis, in which the determination of the variables is examined in depth. The involvement of the university professor in the teaching-research process currently faces some difficulties, which are manifested, among other aspects, in an unstable balance between teaching and research, which leads to a separation between them.

  5. Physiologic time: A hypothesis

    Science.gov (United States)

    West, Damien; West, Bruce J.

    2013-06-01

    The scaling of respiratory metabolism with body size in animals is considered by many to be a fundamental law of nature. One apparent consequence of this law is the scaling of physiologic time with body size, implying that physiologic time is separate and distinct from clock time. Physiologic time is manifest in allometry relations for lifespans, cardiac cycles, blood volume circulation, respiratory cycle, along with a number of other physiologic phenomena. Herein we present a theory of physiologic time that explains the allometry relation between time and total body mass averages as entailed by the hypothesis that the fluctuations in the total body mass are described by a scaling probability density.

  6. Ideal Standards, Acceptance, and Relationship Satisfaction: Latitudes of Differential Effects

    Directory of Open Access Journals (Sweden)

    Asuman Buyukcan-Tetik

    2017-09-01

    Full Text Available We examined whether the relations of consistency between ideal standards and perceptions of a current romantic partner with partner acceptance and relationship satisfaction level off, or decelerate, above a threshold. We tested our hypothesis using a 3-year longitudinal data set collected from heterosexual newlywed couples. We used two indicators of consistency: pattern correspondence (within-person correlation between ideal standards and perceived partner ratings) and mean-level match (difference between ideal standards score and perceived partner score). Our results revealed that pattern correspondence had no relation with partner acceptance, but a positive linear/exponential association with relationship satisfaction. Mean-level match had a significant positive association with actor’s acceptance and relationship satisfaction up to the point where perceived partner score equaled ideal standards score. Partner effects did not show a consistent pattern. The results suggest that the consistency between ideal standards and perceived partner attributes has a non-linear association with acceptance and relationship satisfaction, although the results were more conclusive for mean-level match.
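The two consistency indicators are straightforward to compute from attribute ratings; a sketch with illustrative function names, assuming ideal and perceived ratings on the same scale:

```python
import numpy as np

def pattern_correspondence(ideal, perceived):
    """Within-person correlation across attributes between ideal-standard
    ratings and perceived-partner ratings."""
    return float(np.corrcoef(ideal, perceived)[0, 1])

def mean_level_match(ideal, perceived):
    """Perceived-partner mean minus ideal-standards mean: 0 means the partner
    matches the standards on average; negative means the partner falls short."""
    return float(np.mean(perceived) - np.mean(ideal))
```

Note that the two can disagree: a partner rated uniformly one point below every standard has perfect pattern correspondence but a negative mean-level match.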

  7. The oxidative hypothesis of senescence

    Directory of Open Access Journals (Sweden)

    Gilca M

    2007-01-01

    Full Text Available The oxidative hypothesis of senescence, since its origin in 1956, has garnered significant evidence and growing support among scientists for the notion that free radicals play an important role in ageing, either as "damaging" molecules or as signaling molecules. Age-increasing oxidative injuries induced by free radicals, higher susceptibility to oxidative stress in short-lived organisms, genetic manipulations that alter both oxidative resistance and longevity, and the anti-ageing effect of caloric restriction and intermittent fasting are a few examples of accepted scientific facts that support the oxidative theory of senescence. Though not completely understood due to the complex "network" of redox regulatory systems, the implication of oxidative stress in the ageing process is now well documented. Moreover, it is compatible with other current ageing theories (e.g., those implicating the mitochondrial damage/mitochondrial-lysosomal axis, stress-induced premature senescence, biological "garbage" accumulation, etc.). This review is intended to summarize and critically discuss the redox mechanisms involved during the ageing process: sources of oxidant agents in ageing (mitochondrial: the electron transport chain and the nitric oxide synthase reaction; non-mitochondrial: the Fenton reaction, microsomal cytochrome P450 enzymes, peroxisomal β-oxidation, and the respiratory burst of phagocytic cells), antioxidant changes in ageing (enzymatic: superoxide dismutase, glutathione reductase, glutathione peroxidase, catalase; non-enzymatic: glutathione, ascorbate, urate, bilirubin, melatonin, tocopherols, carotenoids, ubiquinol), alteration of oxidative damage repair mechanisms, and the role of free radicals as signaling molecules in ageing.

  8. Resistive Threshold Logic

    OpenAIRE

    James, A. P.; Francis, L. R. V. J.; Kumar, D.

    2013-01-01

    We report a resistance-based threshold logic family useful for mimicking brain-like large variable logic functions in VLSI. A universal Boolean logic cell based on an analog resistive divider and a threshold logic circuit is presented. The resistive divider is implemented using memristors and provides an output voltage that is a summation of the weighted products of the input voltages. The output of the resistive divider is converted into a binary value by a threshold operation implemented by a CMOS inverter and/or O...
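The divider-plus-threshold cell described above can be captured behaviourally in a few lines (an idealized sketch ignoring device nonidealities; names are assumptions). With equal weights and 0/1 V logic levels, a two-input cell computes AND or OR depending only on where the threshold sits:

```python
def resistive_divider(voltages, conductances):
    """Output of a resistive (e.g. memristive) divider: the
    conductance-weighted average of the input voltages."""
    return sum(v * g for v, g in zip(voltages, conductances)) / sum(conductances)

def threshold_gate(voltages, conductances, v_threshold):
    """Inverter/comparator stage: output 1 when the divider output
    reaches the threshold voltage."""
    return int(resistive_divider(voltages, conductances) >= v_threshold)
```

Setting the threshold at 0.75 V makes the equal-weight two-input cell an AND; at 0.25 V it becomes an OR; unequal conductances give weighted, majority-like functions.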

  9. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. A systematization of threshold signature schemes was carried out, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves, and bilinear pairings were examined. Different methods of generation and verification of threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking, and e-currency was shown. Topics for further investigation were given, which could reduce the level of counterfeit electronic documents signed by a group of users.
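The Lagrange-interpolation construction the survey examines rests on Shamir's (k, n) threshold secret sharing, in which the group secret is the constant term of a random degree-(k-1) polynomial over a prime field. A compact sketch (illustrative only, not any of the surveyed signature schemes in full):

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is in GF(P)

def make_shares(secret, k, n, rng=random.Random(42)):
    """Split `secret` into n shares so that any k of them recover it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret
```

In a threshold signature, each party would instead apply its share inside the signing operation so the key itself is never reconstructed; the interpolation step is the common core.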

  10. Acceptable Channel Switching Delays for Mobile TV

    DEFF Research Database (Denmark)

    Fleury, Alexandre; Pedersen, Jakob Schou; Larsen, Lars Bo

    2011-01-01

    This paper presents a user study investigating the acceptability of channel switching delays on mobile television systems. The authors first review the previous work in the area, then propose a study design and present results from its implementation, focusing on the overall acceptability threshold...

  11. The linear hypothesis - an idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed.

  12. Einstein's Revolutionary Light-Quantum Hypothesis

    Science.gov (United States)

    Stuewer, Roger H.

    2005-05-01

    The paper in which Albert Einstein proposed his light-quantum hypothesis was the only one of his great papers of 1905 that he himself termed "revolutionary." Contrary to widespread belief, Einstein did not propose his light-quantum hypothesis "to explain the photoelectric effect." Instead, he based his argument for light quanta on the statistical interpretation of the second law of thermodynamics, with the photoelectric effect being only one of three phenomena that he offered as possible experimental support for it. I will discuss Einstein's light-quantum hypothesis of 1905 and his introduction of the wave-particle duality in 1909 and then turn to the reception of his work on light quanta by his contemporaries. We will examine the reasons that prominent physicists advanced to reject Einstein's light-quantum hypothesis in succeeding years. Those physicists included Robert A. Millikan, even though he provided convincing experimental proof of the validity of Einstein's equation of the photoelectric effect in 1915. The turning point came after Arthur Holly Compton discovered the Compton effect in late 1922, but even then Compton's discovery was contested both on experimental and on theoretical grounds. Niels Bohr, in particular, had never accepted the reality of light quanta and now, in 1924, proposed a theory, the Bohr-Kramers-Slater theory, which assumed that energy and momentum were conserved only statistically in microscopic interactions. Only after that theory was disproved experimentally in 1925 was Einstein's revolutionary light-quantum hypothesis generally accepted by physicists---a full two decades after Einstein had proposed it.

  13. Stroke rehabilitation reaches a threshold.

    Directory of Open Access Journals (Sweden)

    Cheol E Han

    2008-08-01

    Full Text Available Motor training with the upper limb affected by stroke partially reverses the loss of cortical representation after lesion and has been proposed to increase spontaneous arm use. Moreover, repeated attempts to use the affected hand in daily activities create a form of practice that can potentially lead to further improvement in motor performance. We thus hypothesized that if motor retraining after stroke increases spontaneous arm use sufficiently, then the patient will enter a virtuous circle in which spontaneous arm use and motor performance reinforce each other. In contrast, if the dose of therapy is not sufficient to bring spontaneous use above threshold, then performance will not increase and the patient will further develop compensatory strategies with the less affected hand. To refine this hypothesis, we developed a computational model of bilateral hand use in arm reaching to study the interactions between adaptive decision making and motor relearning after motor cortex lesion. The model contains a left and a right motor cortex, each controlling the opposite arm, and a single action choice module. The action choice module learns, via reinforcement learning, the value of using each arm for reaching in specific directions. Each motor cortex uses a neural population code to specify the initial direction along which the contralateral hand moves towards a target. The motor cortex learns to minimize directional errors and to maximize neuronal activity for each movement. The derived learning rule accounts for the reversal of the loss of cortical representation after rehabilitation and the increase of this loss after stroke with insufficient rehabilitation. Further, our model exhibits nonlinear and bistable behavior: if natural recovery, motor training, or both, brings performance above a certain threshold, then training can be stopped, as the repeated spontaneous arm use provides a form of motor learning that further bootstraps performance and
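The action-choice component of the model described above can be caricatured in a few lines of reinforcement learning: values for "use the affected arm" vs "use the unaffected arm" are learned from movement success, while each use of the affected arm slightly improves its success probability (practice). This is a toy sketch of the mechanism, not the authors' population-code model; every name and constant here is an assumption.

```python
import random

def simulate_arm_choice(n_trials=2000, lesion_penalty=0.4, alpha=0.1,
                        epsilon=0.1, recovery_per_use=0.001, seed=0):
    """Epsilon-greedy action-value learning over which arm to reach with.
    Reward is movement success; affected-arm success improves with use."""
    rng = random.Random(seed)
    q = {"affected": 0.0, "unaffected": 0.0}
    p_success = {"affected": 0.9 - lesion_penalty, "unaffected": 0.9}
    for _ in range(n_trials):
        if rng.random() < epsilon:                     # occasional exploration
            arm = rng.choice(["affected", "unaffected"])
        else:                                          # otherwise greedy choice
            arm = max(q, key=q.get)
        reward = 1.0 if rng.random() < p_success[arm] else 0.0
        q[arm] += alpha * (reward - q[arm])            # value update
        if arm == "affected":                          # use-dependent recovery
            p_success["affected"] = min(0.9, p_success["affected"] + recovery_per_use)
    return q, p_success["affected"]
```

With a small lesion_penalty the affected arm can climb back to baseline and keep being chosen (the virtuous circle); with a large one, choice collapses onto the unaffected arm before practice can pay off, mirroring the bistability the abstract describes.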

  14. Threshold concepts in prosthetics.

    Science.gov (United States)

    Hill, Sophie

    2017-12-01

    Curriculum documents identify key concepts within learning prosthetics. Threshold concepts provide an alternative way of viewing the curriculum, focussing on the ways of thinking and practicing within prosthetics. Threshold concepts can be described as an opening to a different way of viewing a concept. This article forms part of a larger study exploring what students and staff experience as difficult in learning about prosthetics. To explore possible threshold concepts within prosthetics. Qualitative, interpretative phenomenological analysis. Data from 18 students and 8 staff at two universities with undergraduate prosthetics and orthotics programmes were generated through interviews and questionnaires. The data were analysed using an interpretative phenomenological analysis approach. Three possible threshold concepts arose from the data: 'how we walk', 'learning to talk' and 'considering the person'. Three potential threshold concepts in prosthetics are suggested with possible implications for prosthetics education. These possible threshold concepts involve changes in both conceptual and ontological knowledge, integrating into the persona of the individual. This integration occurs through the development of memories associated with procedural concepts that combine with disciplinary concepts. Considering the prosthetics curriculum through the lens of threshold concepts enables a focus on how students learn to become prosthetists. Clinical relevance This study provides new insights into how prosthetists learn. This has implications for curriculum design in prosthetics education.

  15. Sleep memory processing: the sequential hypothesis.

    Science.gov (United States)

    Giuditta, Antonio

    2014-01-01

    According to the sequential hypothesis (SH) memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and rapid eye movement (REM) sleep. During SWS memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep which integrates them with preexisting memories. The hypothesis received support from a wealth of EEG, behavioral, and biochemical analyses of trained rats. Further evidence was provided by independent studies of human subjects. SH basic premises, data, and interpretations have been compared with corresponding viewpoints of the synaptic homeostatic hypothesis (SHY). Their similarities and differences are presented and discussed within the framework of sleep processing operations. SHY's emphasis on synaptic renormalization during SWS is acknowledged to underline a key sleep effect, but this cannot marginalize sleep's main role in selecting memories to be retained from downgrading traces, and in their integration with preexisting memories. In addition, SHY's synaptic renormalization raises an unsolved dilemma that clashes with the accepted memory storage mechanism exclusively based on modifications of synaptic strength. This difficulty may be bypassed by the assumption that SWS-processed memories are stored again by REM sleep in brain subnuclear quantum particles. Storing of memories in quantum particles may also occur in other vigilance states. Hints are provided on ways to subject the quantum hypothesis to experimental tests.

  16. Sleep memory processing: the sequential hypothesis

    Directory of Open Access Journals (Sweden)

    Antonio eGiuditta

    2014-12-01

Full Text Available According to the sequential hypothesis (SH), memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and REM sleep. During SWS, memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep, which integrates them with preexisting memories. The hypothesis received support from a wealth of EEG, behavioral, and biochemical analyses of trained rats. Further evidence was provided by independent studies of human subjects. SH basic premises, data, and interpretations have been compared with corresponding viewpoints of the synaptic homeostatic hypothesis (SHY). Their similarities and differences are presented and discussed within the framework of sleep processing operations. SHY’s emphasis on synaptic renormalization during SWS is acknowledged to underline a key sleep effect, but this cannot marginalize sleep’s main role in selecting memories to be retained from downgrading traces, and in their integration with preexisting memories. In addition, SHY’s synaptic renormalization raises an unsolved dilemma that clashes with the accepted memory storage mechanism exclusively based on modifications of synaptic strength. This difficulty may be bypassed by the assumption that SWS-processed memories are stored again by REM sleep in brain subnuclear quantum particles. Storing of memories in quantum particles may also occur in other vigilance states. Hints are provided on ways to subject the quantum hypothesis to experimental tests.

  17. On the two steps threshold selection for over-threshold modelling of extreme events

    Science.gov (United States)

    Bernardara, Pietro; Mazas, Franck; Weiss, Jerome; Andreewsky, Marc; Kergadallan, Xavier; Benoit, Michel; Hamm, Luc

    2013-04-01

The estimation of the probability of occurrence of extreme events is traditionally achieved by fitting a probability distribution to a sample of extreme observations. In particular, extreme value theory (EVT) states that values exceeding a given threshold converge to a Generalized Pareto Distribution (GPD) if the original sample is composed of independent and identically distributed values. However, time series of sea and ocean variables usually show strong temporal autocorrelation. Traditionally, in order to select independent events for the subsequent statistical analysis, the concept of a physical threshold is introduced: events that exceed that threshold are defined as "extreme events". This is the so-called "Peaks Over Threshold (POT)" sampling, widespread in the literature and currently used for engineering applications among many others. In the past, the threshold for the statistical sampling of extreme values asymptotically convergent toward the GPD and the threshold for the physical selection of independent extreme events were conflated, as the same threshold was used both to sample the data and to meet the hypothesis of extreme value convergence, leading to some incoherencies. In particular, if the two steps are performed simultaneously, the number of peaks over the threshold can increase but also decrease when the threshold decreases. This is logical from a physical point of view, since the definition of the sample of "extreme events" changes, but it is not coherent with the statistical theory. We introduce a two-step threshold selection for over-threshold modelling, aiming to discriminate (i) a physical threshold for the selection of extreme and independent events, and (ii) a statistical threshold for optimizing coherence with the hypotheses of the EVT. The former is a physical event-identification procedure (also called "declustering") aiming at selecting independent extreme events. 
The latter is a purely statistical optimization
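The two steps can be sketched as separate routines, with the physical declustering independent of the statistical fit. This is a minimal illustration, not the authors' procedure: the synthetic series, the method-of-moments GPD estimator, and all threshold values are assumptions.

```python
import random
import statistics

def decluster(series, physical_threshold, min_gap=3):
    """Step 1 (physical): keep one peak per cluster of exceedances,
    clusters being separated by at least min_gap non-exceeding steps."""
    peaks, cluster, gap = [], [], 0
    for x in series:
        if x > physical_threshold:
            cluster.append(x)
            gap = 0
        else:
            gap += 1
            if cluster and gap >= min_gap:
                peaks.append(max(cluster))
                cluster = []
    if cluster:
        peaks.append(max(cluster))
    return peaks

def gpd_fit_mom(excesses):
    """Step 2 (statistical): method-of-moments GPD fit to the excesses."""
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    shape = 0.5 * (1 - m * m / v)
    scale = 0.5 * m * (m * m / v + 1)
    return shape, scale

rng = random.Random(0)
series = [rng.expovariate(1.0) for _ in range(5000)]   # synthetic stand-in data
peaks = decluster(series, physical_threshold=1.5)
for u in (1.5, 2.0, 2.5):   # candidate statistical thresholds
    exc = [p - u for p in peaks if p > u]
    if len(exc) > 30:
        print(u, len(exc), gpd_fit_mom(exc))
```

In practice the statistical threshold would be chosen by checking the stability of the fitted shape parameter across candidates, which is where the decoupling of the two steps pays off.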

  18. Social contagion with degree-dependent thresholds

    Science.gov (United States)

    Lee, Eun; Holme, Petter

    2017-07-01

    We investigate opinion spreading by a threshold model in a situation in which the influence of people is heterogeneously distributed. We assume that there is a coupling between the influence of an individual (measured by the out-degree) and the threshold for accepting a new opinion or habit. We find that if the coupling is strongly positive, the final state of the system will be a mix of different opinions. Otherwise, it will converge to a consensus state. This phenomenon cannot simply be explained as a phase transition, but it is a combined effect of mechanisms and their relative dominance in different regions of parameter space.
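A toy version of such a degree-coupled threshold model can be simulated directly. The graph construction, the linear threshold-degree coupling, and all parameter values below are our assumptions for illustration, not the paper's.

```python
import random

def cascade(coupling, n=300, k=6, seed=0):
    rng = random.Random(seed)
    # random graph in which every node has at least k neighbours
    nbrs = [set() for _ in range(n)]
    for i in range(n):
        while len(nbrs[i]) < k:
            j = rng.randrange(n)
            if j != i:
                nbrs[i].add(j)
                nbrs[j].add(i)
    deg = [len(s) for s in nbrs]
    dmax = max(deg)
    # adoption threshold grows (coupling > 0) or shrinks (coupling < 0)
    # with the node's degree
    thr = [min(0.95, max(0.05, 0.3 + coupling * (deg[i] / dmax - 0.5)))
           for i in range(n)]
    adopted = set(rng.sample(range(n), n // 10))   # initial adopters
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in adopted:
                frac = sum(1 for j in nbrs[i] if j in adopted) / deg[i]
                if frac >= thr[i]:
                    adopted.add(i)
                    changed = True
    return len(adopted) / n   # final fraction holding the new opinion

print(cascade(-0.5), cascade(0.9))
```

Comparing the final adoption fractions for negative vs. strongly positive coupling mimics the contrast the abstract draws between consensus and a persistent mix of opinions.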

  19. Regional Seismic Threshold Monitoring

    National Research Council Canada - National Science Library

    Kvaerna, Tormod

    2006-01-01

    ... model to be used for predicting the travel times of regional phases. We have applied these attenuation relations to develop and assess a regional threshold monitoring scheme for selected subregions of the European Arctic...

  20. Color difference thresholds in dentistry.

    Science.gov (United States)

    Paravina, Rade D; Ghinea, Razvan; Herrera, Luis J; Bona, Alvaro D; Igiel, Christopher; Linninger, Mercedes; Sakai, Maiko; Takahashi, Hidekazu; Tashkandi, Esam; Perez, Maria del Mar

    2015-01-01

The aim of this prospective multicenter study was to determine the 50:50% perceptibility threshold (PT) and 50:50% acceptability threshold (AT) of dental ceramic under simulated clinical settings. The spectral radiance of 63 monochromatic ceramic specimens was determined using a non-contact spectroradiometer. A total of 60 specimen pairs, divided into 3 sets of 20 specimen pairs (medium to light shades, medium to dark shades, and dark shades), were selected for the psychophysical experiment. The coordinating center and seven research sites obtained Institutional Review Board (IRB) approvals prior to the beginning of the experiment. Each research site had 25 observers, divided into five groups of five observers: dentists-D, dental students-S, dental auxiliaries-A, dental technicians-T, and lay persons-L. There were 35 observers per group (five observers per group at each site × 7 sites), for a total of 175 observers. Visual color comparisons were performed using a viewing booth. Takagi-Sugeno-Kang (TSK) fuzzy approximation was used for fitting the data points. The 50:50% PT and 50:50% AT were determined in CIELAB and CIEDE2000. The t-test was used to evaluate the statistical significance of threshold differences. The CIELAB 50:50% PT was ΔEab = 1.2, whereas the 50:50% AT was ΔEab = 2.7. The corresponding CIEDE2000 (ΔE00) values were 0.8 and 1.8, respectively. The 50:50% PT by observer group revealed differences among groups D, A, T, and L as compared with the 50:50% PT for all observers. The 50:50% AT for all observers was statistically different from the 50:50% AT in groups T and L. The 50:50% perceptibility and acceptability thresholds were significantly different, as were the two color difference formulas, ΔE00/ΔEab. Observer groups and sites showed a high level of statistical difference in all thresholds. Visual color difference thresholds can serve as a quality control tool to guide the selection of esthetic dental materials, evaluate clinical performance, and
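The CIELAB metric behind the PT and AT values above is simply the Euclidean distance in L*a*b* space (CIEDE2000 is considerably more involved and omitted here). A small sketch classifying a specimen pair against the reported 50:50% thresholds; the example shade coordinates are invented:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIELAB color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def judge(de, pt=1.2, at=2.7):
    """Classify a difference against the study's 50:50% PT and AT."""
    if de <= pt:
        return "imperceptible to half of observers"
    if de <= at:
        return "perceptible but acceptable"
    return "unacceptable to half of observers"

# Invented L*, a*, b* coordinates for two ceramic shades:
shade_a, shade_b = (72.0, 1.5, 18.0), (70.8, 2.0, 19.5)
de = delta_e_ab(shade_a, shade_b)
print(round(de, 2), judge(de))
```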

  1. The conscious access hypothesis: Explaining the consciousness.

    Science.gov (United States)

    Prakash, Ravi

    2008-01-01

The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, exploration of consciousness from scientific perspectives is not very old. Among the myriad theories regarding the nature, functions and mechanism of consciousness, of late, cognitive theories have received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis", based on the "global workspace model of consciousness". It underscores an important property of consciousness, the global access of information in the cerebral cortex. The present article reviews the "conscious access hypothesis" in terms of its theoretical underpinnings as well as the experimental support it has received.

  2. Acceptability, acceptance and decision making

    International Nuclear Information System (INIS)

    Ackerschott, H.

    2002-01-01

There is a fundamental difference between the acceptability of a civilizatory or societal risk and the acceptability of the decision-making process that leads to a civilizatory or societal risk. The analysis of individual risk decisions - regarding who engages in which indisputably hazardous, unhealthy or dangerous behaviour, when, and under which circumstances - is not helpful in finding solutions for the political decisions at hand in Germany concerning nuclear energy in particular or energy in general. The debt for implementation of any technology, in the sense of making the technology a success in terms of broad acceptance and general utilisation, lies with the particular industry involved. Regardless of the technology, innovation research identifies the implementation phase as the most critical to the success of any innovation. In this sense, nuclear technology is at best still an innovation, because its implementation has not yet been completed. Fear of and opposition to innovation are ubiquitous. Even the economy - which is often described as 'rational' - is full of this resistance. Innovation acts on the pivot between stability, the presupposition for the successful execution of decisions already taken, and instability, which entails insecurity but is also necessary for the success of further development. By definition, innovations lie beyond our sphere of experience; the level of reliability and trust they require has yet to be established. Yet they are evaluated via the simplifying decision heuristics that have proven not only necessary and useful, but also accurate, in familiar settings. The 'settlement of the debt of implementation', the accompanying communication, the decision-making procedures concerning the regulation of adverse effects of the technology, but also the tailoring of the new technology or service itself, must be directed to appropriate target groups. But the group often aimed at in the nuclear debate, the group which largely determines political

  3. Influence of a threshold existence on the sanitary consequences of a nuclear accident

    International Nuclear Information System (INIS)

    Nifenecker, H.

    2001-11-01

The justification for applying the dose-response relationship without threshold to the calculation of the number of fatal cancers in the case of an accidental irradiation of a population is discussed. The hypothesis of a low threshold below which radiation is harmless is examined. The existence of even a low threshold significantly reduces the number of victims. A simulation case is studied. (N.C.)
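The effect the abstract describes can be illustrated with a back-of-the-envelope calculation. The dose distribution and the use of a nominal 5%/Sv fatal-cancer risk coefficient are our illustrative assumptions, not figures from the paper:

```python
RISK_PER_SV = 0.05  # assumed nominal fatal-cancer risk coefficient, per sievert

def fatal_cancers(doses_sv, threshold_sv=0.0):
    """Sum individual linear risks, ignoring doses below the threshold."""
    return sum(d * RISK_PER_SV for d in doses_sv if d > threshold_sv)

# A population where most people receive a small dose and a few a large one:
doses = [0.01] * 100_000 + [0.5] * 100
print(fatal_cancers(doses))        # linear no-threshold: every dose counts
print(fatal_cancers(doses, 0.1))   # threshold at 100 mSv: far fewer cases
```

Because accident scenarios typically expose very many people to very small doses, dropping the sub-threshold doses removes most of the computed victims, which is exactly the sensitivity the abstract points to.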

  4. Is the Aluminum Hypothesis Dead?

    Science.gov (United States)

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  5. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
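Schematic versions of the two alarm criteria can be written down side by side. The power-law coefficients, the σ multiplier, and the rainfall numbers below are invented; the real MaCumBA and SIGMA tools involve calibration steps not shown here.

```python
import statistics

def id_alarm(intensity_mm_h, duration_h, a=7.5, b=0.6):
    """Intensity-duration criterion: alarm if I >= a * D**-b (power law)."""
    return intensity_mm_h >= a * duration_h ** -b

def sigma_alarm(rainfall_series_mm, current_mm, n_sigma=2.0):
    """SIGMA-style criterion: alarm if the current value is anomalous,
    i.e. exceeds the record's mean by n_sigma standard deviations."""
    mu = statistics.mean(rainfall_series_mm)
    sd = statistics.stdev(rainfall_series_mm)
    return current_mm > mu + n_sigma * sd

record = [10, 12, 8, 15, 11, 9, 14, 10, 13, 12]   # past daily rainfall, mm
print(id_alarm(5.0, 2.0), sigma_alarm(record, 30.0))
```

The contrast in inputs is the point made in the abstract: the first criterion needs storm intensities and durations tied to dated landslides, while the second needs only the rainfall record itself.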

  6. The thresholds for statistical and clinical significance

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per

    2014-01-01

BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true, and does not reflect the probability of getting a result assuming that an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. METHODS: Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity...

  7. Hydrodynamics of sediment threshold

    Science.gov (United States)

    Ali, Sk Zeeshan; Dey, Subhasish

    2016-07-01

A novel hydrodynamic model for the threshold of cohesionless sediment particle motion under a steady unidirectional streamflow is presented. The hydrodynamic forces (drag and lift) acting on a solitary sediment particle resting over a closely packed bed formed by identical sediment particles are the primary motivating forces. The drag force comprises the form drag and the form-induced drag. The lift force includes the Saffman lift, Magnus lift, centrifugal lift, and turbulent lift. The points of action of the force system are appropriately obtained, for the first time, from the basics of micro-mechanics. The sediment threshold is envisioned as the rolling mode, which is the most plausible mode of initiating particle motion on the bed. The moment balance of the force system on the solitary particle about the pivoting point of rolling yields the governing equation. The conditions of sediment threshold under the hydraulically smooth, transitional, and rough flow regimes are examined. The effects of velocity fluctuations are addressed by applying the statistical theory of turbulence. This study shows that for a hindrance coefficient of 0.3, the threshold curve (threshold Shields parameter versus shear Reynolds number) is in excellent agreement with the experimental data for uniform sediments. However, most of the experimental data are bounded by the upper and lower limiting threshold curves, corresponding to hindrance coefficients of 0.2 and 0.4, respectively. The threshold curve of this study is compared with those of previous researchers. The present model also agrees satisfactorily with the experimental data for nonuniform sediments.
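For orientation, the two axes of the threshold curve mentioned above (threshold Shields parameter versus shear Reynolds number) have standard definitions that can be computed directly. The flow and sediment values below are assumed for illustration; the paper's moment-balance derivation of the threshold condition itself is not reproduced.

```python
RHO_W, RHO_S = 1000.0, 2650.0   # water / quartz-sand density, kg/m^3
G, NU = 9.81, 1.0e-6            # gravity (m/s^2), kinematic viscosity of water (m^2/s)

def shields_parameter(shear_velocity, d):
    """Shields parameter: tau / ((rho_s - rho_w) * g * d),
    with bed shear stress tau = rho_w * u_star**2."""
    tau = RHO_W * shear_velocity ** 2
    return tau / ((RHO_S - RHO_W) * G * d)

def shear_reynolds(shear_velocity, d):
    """Shear Reynolds number R* = u_star * d / nu (the flow-regime axis)."""
    return shear_velocity * d / NU

u_star, d = 0.015, 0.5e-3   # assumed: 1.5 cm/s shear velocity, 0.5 mm sand
print(shields_parameter(u_star, d), shear_reynolds(u_star, d))
```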

  8. Hypothesis Validity of Clinical Research.

    Science.gov (United States)

    Wampold, Bruce E.; And Others

    1990-01-01

    Describes hypothesis validity as extent to which research results reflect theoretically derived predictions about relations between or among constructs. Discusses role of hypotheses in theory testing. Presents four threats to hypothesis validity: (1) inconsequential research hypotheses; (2) ambiguous research hypotheses; (3) noncongruence of…

  9. The Variability Hypothesis: The History of a Biological Model of Sex Differences in Intelligence.

    Science.gov (United States)

    Shields, Stephanie A.

    1982-01-01

    Describes the origin and development of the variability hypothesis as applied to the study of social and psychological sex differences. Explores changes in the hypothesis over time, social and scientific factors that fostered its acceptance, and possible parallels between the variability hypothesis and contemporary theories of sex differences.…

  10. A threshold method for immunological correlates of protection.

    Science.gov (United States)

    Chen, Xuan; Bailleux, Fabrice; Desai, Kamal; Qin, Li; Dunning, Andrew J

    2013-03-01

    Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. 
Highly significant thresholds
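The a:b model's structure is simple enough to sketch: constant infection probabilities a below and b above a threshold, with the threshold profiled over a grid of candidates. This is our own minimal illustration on invented data (the modified likelihood ratio test and bootstrap confidence intervals are not shown), not the authors' implementation.

```python
import math

def fit_ab(titers, infected):
    """Profile the threshold over the observed titers; for each split,
    a and b are the MLE infection rates on either side."""
    def loglik(p, k, n):
        if n == 0:
            return 0.0
        p = min(max(p, 1e-9), 1 - 1e-9)   # clip to avoid log(0)
        return k * math.log(p) + (n - k) * math.log(1 - p)
    best = None
    for t in sorted(set(titers)):
        lo = [i for x, i in zip(titers, infected) if x < t]
        hi = [i for x, i in zip(titers, infected) if x >= t]
        a = sum(lo) / len(lo) if lo else 0.0
        b = sum(hi) / len(hi) if hi else 0.0
        ll = loglik(a, sum(lo), len(lo)) + loglik(b, sum(hi), len(hi))
        if best is None or ll > best[0]:
            best = (ll, t, a, b)
    return best[1:]   # (threshold, a, b)

# Invented toy data: infection mostly below a protective titer of 6.
titers   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
infected = [1, 1, 0, 1, 1, 0, 0, 0, 0, 0]
print(fit_ab(titers, infected))
```

The estimated a and b directly give the relative risk of infection for subjects below vs. at-or-above the threshold, which is the quantity the abstract highlights.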

  11. Multimodal distribution of human cold pain thresholds.

    Science.gov (United States)

    Lötsch, Jörn; Dimova, Violeta; Lieb, Isabel; Zimmermann, Michael; Oertel, Bruno G; Ultsch, Alfred

    2015-01-01

It is assumed that different pain phenotypes are based on varying molecular pathomechanisms. Distinct ion channels seem to be associated with the perception of cold pain; in particular, TRPM8 and TRPA1 have been highlighted previously. The present study analyzed the distribution of cold pain thresholds, focusing on describing its multimodality, based on the hypothesis that it reflects the contribution of distinct ion channels. Cold pain thresholds (CPT) were available from 329 healthy volunteers (aged 18-37 years; 159 men) enrolled in previous studies. The distribution of the pooled and log-transformed threshold data was described using a kernel density estimation (Pareto Density Estimation (PDE)), and subsequently the log data were modeled as a mixture of Gaussian distributions using the expectation maximization (EM) algorithm to optimize the fit. CPTs were clearly multimodally distributed. Fitting a Gaussian Mixture Model (GMM) to the log-transformed threshold data revealed that the best fit is obtained when applying a three-component distribution pattern. The modes of the identified three Gaussian distributions, retransformed from the log domain to the mean stimulation temperatures at which the subjects had indicated pain thresholds, were obtained at 23.7 °C, 13.2 °C and 1.5 °C for Gaussians #1, #2 and #3, respectively. The localization of the first and second Gaussians was interpreted as reflecting the contribution of two different cold sensors. From the calculated localization of the modes of the first two Gaussians, the hypothesis of an involvement of TRPM8, sensing temperatures from about 25 °C downward, and TRPA1, sensing cold from about 17 °C downward, can be derived. In that case, subjects belonging to either Gaussian would possess a dominance of one or the other receptor at the skin area where the cold stimuli had been applied. The findings therefore support the suitability of complex analytical approaches for detecting mechanistically determined patterns from pain phenotype data.
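The fitting step, a Gaussian mixture estimated by EM, can be sketched in plain Python. The synthetic clusters below only mimic the reported three-mode structure; the study's Pareto Density Estimation and model-selection steps are not reproduced.

```python
import math
import random

def norm_pdf(x, m, v):
    return math.exp(-((x - m) ** 2) / (2 * v)) / math.sqrt(2 * math.pi * v)

def em_gmm_1d(data, k=3, iters=300, seed=0):
    """Minimal 1-D Gaussian-mixture fit by expectation maximization."""
    rng = random.Random(seed)
    mu = rng.sample(data, k)                       # means start at random points
    mean = sum(data) / len(data)
    v0 = sum((x - mean) ** 2 for x in data) / len(data)
    var = [v0] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        resp = []
        for x in data:
            p = [w[j] * norm_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means and variances
        for j in range(k):
            nj = max(sum(r[j] for r in resp), 1e-12)
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(sum(r[j] * (x - mu[j]) ** 2
                             for r, x in zip(resp, data)) / nj, 1e-6)
    return sorted(mu)

# Three synthetic clusters loosely echoing the reported modes:
rng = random.Random(1)
data = ([rng.gauss(2.0, 0.3) for _ in range(100)] +
        [rng.gauss(13.0, 1.0) for _ in range(120)] +
        [rng.gauss(24.0, 1.5) for _ in range(110)])
print([round(m, 1) for m in em_gmm_1d(data)])
```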

  12. Hadron production near threshold

    Indian Academy of Sciences (India)

Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and between η and 3He are described rigorously. The Λ production is ...

  13. Elaborating on Threshold Concepts

    Science.gov (United States)

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-01-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account…

  14. Boundaries, Thresholds, and Consequences.

    Science.gov (United States)

    Smith, Carl R.

    1997-01-01

    Highlights issues in the debate concerning Individuals with Disabilities Education Act (IDEA) special education legislation as it relates to student discipline and incarcerated juveniles. Focuses on assessment issues and thresholds for diagnosable conditions. Looks at debates surrounding IDEA and some of the consequences of new legislation. (RJM)

  15. Testing One Hypothesis Multiple times

    OpenAIRE

    Algeri, Sara; van Dyk, David A.

    2017-01-01

    Hypothesis testing in presence of a nuisance parameter that is only identifiable under the alternative is challenging in part because standard asymptotic results (e.g., Wilks theorem for the generalized likelihood ratio test) do not apply. Several solutions have been proposed in the statistical literature and their practical implementation often reduces the problem into one of Testing One Hypothesis Multiple times (TOHM). Specifically, a fine discretization of the space of the non-identifiabl...

  16. Regional rainfall thresholds for landslide occurrence using a centenary database

    Science.gov (United States)

    Vaz, Teresa; Luís Zêzere, José; Pereira, Susana; Cruz Oliveira, Sérgio; Garcia, Ricardo A. C.; Quaresma, Ivânia

    2018-04-01

This work proposes a comprehensive method to assess rainfall thresholds for landslide initiation using a centenary landslide database associated with a single centenary daily rainfall data set. The method is applied to the Lisbon region and includes a rainfall return period analysis that was used to identify the critical rainfall combination (cumulated rainfall, duration) related to each landslide event. The spatial representativeness of the reference rain gauge is evaluated and the rainfall thresholds are assessed and calibrated using receiver operating characteristic (ROC) metrics. Results show that landslide events located up to 10 km from the rain gauge can be used to calculate the rainfall thresholds in the study area; however, these thresholds may be used with acceptable confidence up to 50 km from the rain gauge. The rainfall thresholds obtained using linear and power-law regression perform well under ROC metrics. However, the intermediate thresholds, based on the probability of landslide events established in the zone between the lower-limit threshold and the upper-limit threshold, are much more informative, as they indicate the probability of landslide event occurrence given rainfall exceeding the threshold. This information can be easily included in landslide early warning systems, especially when combined with the probability of rainfall above each threshold.
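The ROC calibration step can be sketched as follows: days above a candidate threshold count as alarms and are scored against days with and without recorded landslide events. The data and thresholds below are invented for illustration.

```python
def roc_point(rain_mm, landslide_day, threshold):
    """Hit rate (TPR) and false-alarm rate (FPR) of one candidate threshold."""
    tp = sum(1 for r, e in zip(rain_mm, landslide_day) if r >= threshold and e)
    fp = sum(1 for r, e in zip(rain_mm, landslide_day) if r >= threshold and not e)
    fn = sum(1 for r, e in zip(rain_mm, landslide_day) if r < threshold and e)
    tn = sum(1 for r, e in zip(rain_mm, landslide_day) if r < threshold and not e)
    tpr = tp / (tp + fn) if tp + fn else 0.0
    fpr = fp / (fp + tn) if fp + tn else 0.0
    return tpr, fpr

rain      = [5, 80, 12, 60, 3, 95, 40, 7, 55, 20]   # invented daily rainfall, mm
landslide = [0,  1,  0,  0, 0,  1,  1, 0,  1,  0]   # 1 = landslide event that day
for thr in (30, 50, 70):
    tpr, fpr = roc_point(rain, landslide, thr)
    print(thr, round(tpr, 2), round(fpr, 2))
```

Sweeping the threshold traces the ROC curve; the calibrated threshold is the candidate giving the preferred trade-off between hits and false alarms.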

  17. Application of threshold concepts to ecological management problems: occupancy of Golden Eagles in Denali National Park, Alaska: Chapter 5

    Science.gov (United States)

    Eaton, Mitchell J.; Martin, Julien; Nichols, James D.; McIntyre, Carol; McCluskie, Maggie C.; Schmutz, Joel A.; Lubow, Bruce L.; Runge, Michael C.; Edited by Guntenspergen, Glenn R.

    2014-01-01

    In this chapter, we demonstrate the application of the various classes of thresholds, detailed in earlier chapters and elsewhere, via an actual but simplified natural resource management case study. We intend our example to provide the reader with the ability to recognize and apply the theoretical concepts of utility, ecological and decision thresholds to management problems through a formalized decision-analytic process. Our case study concerns the management of human recreational activities in Alaska’s Denali National Park, USA, and the possible impacts of such activities on nesting Golden Eagles, Aquila chrysaetos. Managers desire to allow visitors the greatest amount of access to park lands, provided that eagle nesting-site occupancy is maintained at a level determined to be acceptable by the managers themselves. As these two management objectives are potentially at odds, we treat minimum desired occupancy level as a utility threshold which, then, serves to guide the selection of annual management alternatives in the decision process. As human disturbance is not the only factor influencing eagle occupancy, we model nesting-site dynamics as a function of both disturbance and prey availability. We incorporate uncertainty in these dynamics by considering several hypotheses, including a hypothesis that site occupancy is affected only at a threshold level of prey abundance (i.e., an ecological threshold effect). By considering competing management objectives and accounting for two forms of thresholds in the decision process, we are able to determine the optimal number of annual nesting-site restrictions that will produce the greatest long-term benefits for both eagles and humans. Setting a utility threshold of 75 occupied sites, out of a total of 90 potential nesting sites, the optimization specified a decision threshold at approximately 80 occupied sites. At the point that current occupancy falls below 80 sites, the recommended decision is to begin restricting

  18. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Science.gov (United States)

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well-documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write-up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.
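    As a concrete illustration of testing a null hypothesis, here is a minimal two-sample permutation test (the method choice is mine for illustration, not one prescribed by the article). The null hypothesis is that the two groups are exchangeable; the p-value is the fraction of random relabelings producing a difference in means at least as large as the observed one:

```python
import random

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sample permutation test of the null hypothesis that groups
    a and b are drawn from the same distribution. Returns the p-value
    for the observed absolute difference in means."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # random relabeling under H0
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        diff = abs(sum(perm_a) / len(perm_a) - sum(perm_b) / len(perm_b))
        if diff >= observed:
            count += 1
    return count / n_perm

# Clearly separated groups: the null hypothesis is rejected at alpha = 0.05.
p = permutation_test([5.1, 5.3, 5.2, 5.4], [6.8, 7.0, 6.9, 7.1])
print(p < 0.05)  # True
```

Note that the procedure never "proves" the research hypothesis; it only quantifies how surprising the data would be if the null hypothesis were true.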

  19. Hadron production near threshold

    Indian Academy of Sciences (India)

    Abstract. Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and η and 3He are described rigorously. The Λ production is ...

  20. 5-HTP hypothesis of schizophrenia.

    Science.gov (United States)

    Fukuda, K

    2014-01-01

    To pose a new hypothesis of schizophrenia that affirms and unifies conventional hypotheses. Outside the brain, there are 5-HTP-containing argyrophil cells that have tryptophan hydroxylase 1 without l-aromatic amino acid decarboxylase. Monoamine oxidase in the liver and lung metabolizes 5-HT, rather than 5-HTP, and 5-HTP freely crosses the blood-brain barrier, converting to 5-HT in the brain. Therefore, I postulate that hyperfunction of 5-HTP-containing argyrophil cells may be a cause of schizophrenia. I investigate the consistency of this hypothesis with other hypotheses using a deductive method. Overactive 5-HTP-containing argyrophil cells produce excess amounts of 5-HTP. Abundant 5-HTP increases 5-HT within the brain (linking to the 5-HT hypothesis), and leads to negative feedback of 5-HT synthesis at the rate-limiting step catalysed by tryptophan hydroxylase 2. Owing to this negative feedback, brain tryptophan is further metabolized via the kynurenine pathway. Increased kynurenic acid contributes to deficiencies of glutamate function and dopamine activity, known causes of schizophrenia. The 5-HTP hypothesis affirms conventional hypotheses, as the metabolic condition caused by acceleration of tryptophan hydroxylase 1 and suppression of tryptophan hydroxylase 2 activates both 5-HT and kynurenic acid. In order to empirically test the theory, it will be useful to monitor serum 5-HTP and match it to different phases of schizophrenia. This hypothesis may signal a new era with schizophrenia treated as a brain-gut interaction. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. The atomic hypothesis: physical consequences

    International Nuclear Information System (INIS)

    Rivas, Martin

    2008-01-01

    The hypothesis that matter is made of some ultimate and indivisible objects, together with the restricted relativity principle, establishes a constraint on the kind of variables we are allowed to use for the variational description of elementary particles. We consider that the atomic hypothesis not only states the indivisibility of elementary particles, but also that these ultimate objects, if not annihilated, cannot be modified by any interaction so that all allowed states of an elementary particle are only kinematical modifications of any one of them. Therefore, an elementary particle cannot have excited states. In this way, the kinematical group of spacetime symmetries not only defines the symmetries of the system, but also the variables in terms of which the mathematical description of the elementary particles can be expressed in either the classical or the quantum mechanical description. When considering the interaction of two Dirac particles, the atomic hypothesis restricts the interaction Lagrangian to a kind of minimal coupling interaction

  2. Extra dimensions hypothesis in high energy physics

    Directory of Open Access Journals (Sweden)

    Volobuev Igor

    2017-01-01

    We discuss the history of the extra dimensions hypothesis and the physics and phenomenology of models with large extra dimensions, with an emphasis on the Randall-Sundrum (RS) model with two branes. We argue that the Standard Model extension based on the RS model with two branes is phenomenologically acceptable only if the inter-brane distance is stabilized. Within such an extension of the Standard Model, we study the influence of the infinite Kaluza-Klein (KK) towers of the bulk fields on collider processes. In particular, we discuss the modification of the scalar sector of the theory, the Higgs-radion mixing due to the coupling of the Higgs boson to the radion and its KK tower, and the experimental restrictions on the mass of the radion-dominated states.

  3. Discussion of the Porter hypothesis

    International Nuclear Information System (INIS)

    1999-11-01

    In reaction to the long-range vision of RMNO, published in 1996, the Dutch government posed the question whether a far-reaching and progressive modernization policy would lead to competitive advantages of high-quality products on partly new markets. This question is connected to the so-called Porter hypothesis: 'By stimulating innovation, strict environmental regulations can actually enhance competitiveness', from which it can be concluded that environment and economy can work together quite well. A literature study was carried out to determine under which conditions that hypothesis is endorsed in the scientific literature and policy documents. Recommendations are given for further studies.

  4. The thrifty phenotype hypothesis revisited

    DEFF Research Database (Denmark)

    Vaag, A A; Grunnet, L G; Arora, G P

    2012-01-01

    Twenty years ago, Hales and Barker along with their co-workers published some of their pioneering papers proposing the 'thrifty phenotype hypothesis' in Diabetologia (4;35:595-601 and 3;36:62-67). Their postulate that fetal programming could represent an important player in the origin of type 2...... of the underlying molecular mechanisms. Type 2 diabetes is a multiple-organ disease, and developmental programming, with its idea of organ plasticity, is a plausible hypothesis for a common basis for the widespread organ dysfunctions in type 2 diabetes and the metabolic syndrome. Only two among the 45 known type 2...

  5. Oscillatory Threshold Logic

    Science.gov (United States)

    Borresen, Jon; Lynch, Stephen

    2012-01-01

    In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower power computers continues, transistors are themselves approaching their theoretical limit and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034
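    The binary logic that the authors implement with coupled oscillators can be illustrated, in a much-simplified static form, by classical threshold gates (McCulloch-Pitts style). This sketch ignores the oscillator dynamics entirely and only shows how thresholding a weighted sum of inputs yields Boolean logic:

```python
def threshold_gate(inputs, weights, threshold):
    """Fires (returns 1) when the weighted input sum meets the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Standard gates realized purely by choosing weights and a threshold:
def AND(a, b):
    return threshold_gate((a, b), (1, 1), 2)   # both inputs needed

def OR(a, b):
    return threshold_gate((a, b), (1, 1), 1)   # one input suffices

def NOT(a):
    return threshold_gate((a,), (-1,), 0)      # inhibitory weight

# Truth table for AND:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b))
```

Since NAND is also a threshold function, such units are functionally complete, which is the sense in which threshold devices can support general-purpose computation.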

  6. Antiaging therapy: a prospective hypothesis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: This hypothesis proposes a new prospective approach to slow the aging process in older humans. The hypothesis could lead to developing new treatments for age-related illnesses and help humans to live longer. This hypothesis has no previous documentation in scientific media and has no protocol. Scientists have presented evidence that systemic aging is influenced by peculiar molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered an elevated titer of aging-related molecules (ARMs) in blood, which trigger a cascade of aging processes in mice; they also indicated that the process can be reduced or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translate these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective “antiaging blood filtration column” (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on the contact surfaces of the reaction platforms inside the AABFC until near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  7. A Molecular–Structure Hypothesis

    Directory of Open Access Journals (Sweden)

    Jan C. A. Boeyens

    2010-11-01

    The self-similar symmetry that occurs between atomic nuclei, biological growth structures, the solar system, globular clusters and spiral galaxies suggests that a similar pattern should characterize atomic and molecular structures. This possibility is explored in terms of the current molecular-structure hypothesis and its extension into four-dimensional space-time. It is concluded that a quantum molecule only has structure in four dimensions and that classical (Newtonian) structure, which occurs in three dimensions, cannot be simulated by quantum-chemical computation.

  8. A Durkheimian hypothesis on stress.

    Science.gov (United States)

    Mestrovic, S; Glassner, B

    1983-01-01

    Commonalities among the events that appear on life events lists and among the types of social supports which have been found to reduce the likelihood of illness are reviewed in the life events literature in an attempt to find a context within sociological theory. Social integration seems to underlie the stress-illness process. In seeking a tradition from which to understand these facts, we selected Durkheim's works in the context of the homo duplex concept wherein social integration involves the interplay of individualism and social forces. After presenting a specific hypothesis for the stress literature, the paper concludes with implications and suggestions for empirical research.

  9. Iris pigmentation and AC thresholds.

    Science.gov (United States)

    Roche, A F; Mukherjee, D; Chumlea, W C; Siervogel, R M

    1983-03-01

    Data from 160 White children were used to analyze possible associations between iris pigmentation and AC pure-tone thresholds. Iris pigmentation was graded from iris color using glass models of eyes, and AC thresholds were obtained under carefully controlled conditions. Analyses of variance using two groupings of iris color grades showed no evidence of an association between iris color grade and AC thresholds. Furthermore, inspection of arrays of the actual glass eye models, in conjunction with the order of mean thresholds at each test frequency, did not indicate the presence of an association between iris color grades and thresholds. It was concluded that while iris pigmentation may be related to some aspects of hearing ability, it does not appear to be related to AC thresholds in children.

  10. Crossing the threshold

    Science.gov (United States)

    Bush, John; Tambasco, Lucas

    2017-11-01

    First, we summarize the circumstances in which chaotic pilot-wave dynamics gives rise to quantum-like statistical behavior. For ``closed'' systems, in which the droplet is confined to a finite domain either by boundaries or applied forces, quantum-like features arise when the persistence time of the waves exceeds the time required for the droplet to cross its domain. Second, motivated by the similarities between this hydrodynamic system and stochastic electrodynamics, we examine the behavior of a bouncing droplet above the Faraday threshold, where a stochastic element is introduced into the drop dynamics by virtue of its interaction with a background Faraday wave field. With a view to extending the dynamical range of pilot-wave systems to capture more quantum-like features, we consider a generalized theoretical framework for stochastic pilot-wave dynamics in which the relative magnitudes of the drop-generated pilot-wave field and a stochastic background field may be varied continuously. We gratefully acknowledge the financial support of the NSF through their CMMI and DMS divisions.

  11. Albania - Thresholds I and II

    Data.gov (United States)

    Millennium Challenge Corporation — From 2006 to 2011, the government of Albania (GOA) received two Millennium Challenge Corporation (MCC) Threshold Programs totaling $29.6 million. Albania received...

  12. Olfactory threshold in Parkinson's disease.

    Science.gov (United States)

    Quinn, N P; Rossor, M N; Marsden, C D

    1987-01-01

    Olfactory threshold to differing concentrations of amyl acetate was determined in 78 subjects with idiopathic Parkinson's disease and 40 age-matched controls. Impaired olfactory threshold (previously reported by others) was confirmed in Parkinsonian subjects compared with controls. There was no significant correlation between olfactory threshold and age, sex, duration of disease, or current therapy with levodopa or anticholinergic drugs. In a sub-group of 14 levodopa-treated patients with severe "on-off" fluctuations, no change in olfactory threshold between the two states was demonstrable. Olfactory impairment in Parkinson's disease may involve mechanisms that are not influenced by pharmacologic manipulation of dopaminergic or cholinergic status. PMID:3819760

  13. Learning foraging thresholds for lizards

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, L.A. [Univ. of Warwick, Coventry (United Kingdom). Dept. of Computer Science; Hart, W.E. [Sandia National Labs., Albuquerque, NM (United States); Wilson, D.B. [Massachusetts Inst. of Tech., Cambridge, MA (United States)

    1996-01-12

    This work gives a proof of convergence for a randomized learning algorithm that describes how anoles (lizards found in the Caribbean) learn a foraging threshold distance. This model assumes that an anole will pursue prey if and only if it is within this threshold of the anole's perch. This learning algorithm was proposed by the biologist Roughgarden and his colleagues. They experimentally confirmed that this algorithm quickly converges to the foraging threshold that is predicted by optimal foraging theory. Our analysis provides an analytic confirmation that the learning algorithm converges to this optimal foraging threshold with high probability.
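    Roughgarden's actual update rule is not reproduced in the abstract, so the following is a hypothetical stochastic-approximation sketch of the same idea: prey appear at random distances, and the threshold is nudged up after a profitable pursuit made just inside it and down after an unprofitable one. All payoff parameters are invented for illustration:

```python
import random

ENERGY_GAIN = 10.0   # energy from a captured prey (hypothetical units)
COST_PER_M = 2.0     # energy spent per metre of pursuit (hypothetical)

def payoff(distance):
    """Net energy from pursuing prey at the given distance."""
    return ENERGY_GAIN - COST_PER_M * distance

# Optimal foraging theory: pursue only while the payoff is positive,
# i.e. up to the cutoff distance ENERGY_GAIN / COST_PER_M = 5.0 m.
optimal = ENERGY_GAIN / COST_PER_M

def learn_threshold(trials=50000, step=0.01, window=0.5, seed=1):
    """Toy learner: adjust the threshold only on pursuits made just
    inside it, raising it after a profitable pursuit and lowering it
    after an unprofitable one. Converges near the optimal cutoff
    (biased upward by about window/2)."""
    rng = random.Random(seed)
    t = 1.0
    for _ in range(trials):
        d = rng.uniform(0.0, 10.0)          # prey distance this trial
        if t - window <= d <= t:            # pursuit just inside threshold
            t += step if payoff(d) > 0 else -step
    return t
```

Updating only near the threshold is what makes the rule converge: pursuits far inside the cutoff are always profitable and would otherwise push the threshold up without bound.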

  14. Is PMI the Hypothesis or the Null Hypothesis?

    Science.gov (United States)

    Tarone, Aaron M; Sanford, Michelle R

    2017-09-01

    Over the past several decades, there have been several strident exchanges regarding whether forensic entomologists estimate the postmortem interval (PMI), minimum PMI, or something else. During that time, there has been a proliferation of terminology reflecting this concern regarding "what we do." This has been a frustrating conversation for some in the community because much of this debate appears to be centered on what assumptions are acknowledged directly and which are embedded within a list of assumptions (or ignored altogether) in the literature and in case reports. An additional component of the conversation centers on a concern that moving away from the use of certain terminology like PMI acknowledges limitations and problems that would make the application of entomology appear less useful in court, a problem for lawyers, but one that should not be problematic for scientists in the forensic entomology community, as uncertainty is part of science that should and can be presented effectively in the courtroom (e.g., population genetic concepts in forensics). Unfortunately, a consequence of the way this conversation is conducted is that even as all involved in the debate acknowledge the concerns of their colleagues, parties continue to talk past one another advocating their preferred terminology. Progress will not be made until the community recognizes that all of the terms under consideration take the form of null hypothesis statements and that thinking about "what we do" as a null hypothesis has useful legal and scientific ramifications that transcend arguments over the usage of preferred terminology. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Hemispheric lateralization of motor thresholds in relation to stuttering.

    Directory of Open Access Journals (Sweden)

    Per A Alm

    Stuttering is a complex speech disorder. Previous studies indicate a tendency towards an elevated motor threshold for the left hemisphere, as measured using transcranial magnetic stimulation (TMS). This may reflect a monohemispheric motor system impairment. The purpose of the study was to investigate the relative side-to-side difference (asymmetry) and the absolute levels of motor threshold for the hand area, using TMS in adults who stutter (n = 15) and in controls (n = 15). In accordance with the hypothesis, the groups differed significantly regarding the relative side-to-side difference of finger motor threshold (p = 0.0026), with the stuttering group showing a higher motor threshold of the left hemisphere in relation to the right. Also the absolute level of the finger motor threshold for the left hemisphere differed between the groups (p = 0.049). The obtained results, together with previous investigations, provide support for the hypothesis that stuttering tends to be related to left hemisphere motor impairment, and possibly to a dysfunctional state of bilateral speech motor control.

  16. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  17. Second threshold in weak interactions

    NARCIS (Netherlands)

    Veltman, M.J.G.

    1977-01-01

    The point of view that weak interactions must have a second threshold below 300 – 600 GeV is developed. Above this threshold new physics must come in. This new physics may be the Higgs system, or some other nonperturbative system possibly having some similarities to the Higgs system. The limit of

  18. The Nature of Psychological Thresholds

    Science.gov (United States)

    Rouder, Jeffrey N.; Morey, Richard D.

    2009-01-01

    Following G. T. Fechner (1966), thresholds have been conceptualized as the amount of intensity needed to transition between mental states, such as between a states of unconsciousness and consciousness. With the advent of the theory of signal detection, however, discrete-state theory and the corresponding notion of threshold have been discounted.…

  19. The Stem Cell Hypothesis of Aging

    Directory of Open Access Journals (Sweden)

    Anna Meiliana

    2010-04-01

    BACKGROUND: There is probably no single way to age. Indeed, so far there is no single accepted explanation or mechanism of aging (although more than 300 theories have been proposed). There is an overall decline in tissue regenerative potential with age, and the question arises as to whether this is due to the intrinsic aging of stem cells or rather to the impairment of stem cell function in the aged tissue environment. CONTENT: Recent data suggest that we age, in part, because our self-renewing stem cells grow old as a result of heritable intrinsic events, such as DNA damage, as well as extrinsic forces, such as changes in their supporting niches. Mechanisms that suppress the development of cancer, such as senescence and apoptosis, which rely on telomere shortening and the activities of p53 and p16INK4a, may also induce an unwanted consequence: a decline in the replicative function of certain stem cell types with advancing age. This decreased regenerative capacity points to the stem cell hypothesis of aging. SUMMARY: Recent evidence suggests that we grow old partly because our stem cells grow old as a result of mechanisms that suppress the development of cancer over a lifetime. We believe that a further, more precise mechanistic understanding of this process will be required before this knowledge can be translated into human anti-aging therapies. KEYWORDS: stem cells, senescence, telomere, DNA damage, epigenetic, aging.

  20. Responsible technology acceptance

    DEFF Research Database (Denmark)

    Toft, Madeleine Broman; Schuitema, Geertje; Thøgersen, John

    2014-01-01

    As a response to climate change and the desire to gain independence from imported fossil fuels, there is pressure to increase the proportion of electricity from renewable sources, which is one of the reasons why electricity grids are currently being turned into Smart Grids. In this paper, we focus on private consumers’ acceptance of having Smart Grid technology installed in their home. We analyse acceptance in a combined framework of the Technology Acceptance Model and the Norm Activation Model. We propose that individuals are only likely to accept Smart Grid technology if they assess usefulness in terms of a positive impact for society and the environment. Therefore, we expect that Smart Grid technology acceptance can be better explained when the well-known technology acceptance parameters included in the Technology Acceptance Model are supplemented by moral norms as suggested by the Norm Activation Model.

  1. Developing thresholds of potential concern for invasive alien species: Hypotheses and concepts

    Directory of Open Access Journals (Sweden)

    Llewellyn C. Foxcroft

    2009-03-01

    Conservation implication: In accepting that species and systems are variable, and that flux is inevitable and desirable, these TPCs, developed specifically for invasive alien species, provide end points against which monitoring can be assessed. Once a threshold is reached, the cause of its being exceeded is examined and management interventions are recommended.

  2. Origin of the linearity no threshold (LNT) dose-response concept.

    Science.gov (United States)

    Calabrese, Edward J

    2013-09-01

    This paper identifies the origin of the linearity at low-dose concept [i.e., linear no threshold (LNT)] for ionizing radiation-induced mutation. After the discovery of X-ray-induced mutations, Olson and Lewis (Nature 121(3052):673-674, 1928) proposed that cosmic/terrestrial radiation-induced mutations provide the principal mechanism for the induction of heritable traits, providing the driving force for evolution. For this concept to be general, a LNT dose relationship was assumed, with genetic damage proportional to the energy absorbed. Subsequent studies suggested a linear dose response for ionizing radiation-induced mutations (Hanson and Heys in Am Nat 63(686):201-213, 1929; Oliver in Science 71:44-46, 1930), supporting the evolutionary hypothesis. Based on an evaluation of spontaneous and ionizing radiation-induced mutation with Drosophila, Muller argued that background radiation had a negligible impact on spontaneous mutation, discrediting the ionizing radiation-based evolutionary hypothesis. Nonetheless, an expanded set of mutation dose-response observations provided a basis for collaboration between theoretical physicists (Max Delbruck and Gunter Zimmer) and the radiation geneticist Nicolai Timoféeff-Ressovsky. They developed interrelated physical science-based genetics perspectives including a biophysical model of the gene, a radiation-induced gene mutation target theory and the single-hit hypothesis of radiation-induced mutation, which, when integrated, provided the theoretical mechanism and mathematical basis for the LNT model. The LNT concept became accepted by radiation geneticists and recommended by national/international advisory committees for risk assessment of ionizing radiation-induced mutational damage/cancer from the mid-1950s to the present. The LNT concept was later generalized to chemical carcinogen risk assessment and used by public health and regulatory agencies worldwide.
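    The contrast at the heart of the LNT debate can be written as two one-line dose-response functions: under LNT, excess risk is proportional to dose all the way down to zero, whereas a threshold model assigns no excess risk below some cutoff. The slope and threshold values below are arbitrary illustrations, not empirical estimates:

```python
def lnt_risk(dose, slope=0.05):
    """Linear no-threshold model: excess risk is proportional to dose,
    with no dose below which the excess risk is zero."""
    return slope * dose

def threshold_risk(dose, slope=0.05, threshold=10.0):
    """Threshold alternative: no excess risk until the threshold dose
    is exceeded; linear in the excess dose above it."""
    return slope * max(0.0, dose - threshold)

# The two models agree at dose 0 and diverge everywhere in between:
for d in (0, 5, 10, 20):
    print(d, lnt_risk(d), threshold_risk(d))
```

The policy significance is entirely in the low-dose region: LNT implies that any dose, however small, carries some proportional risk, which is why the choice of model dominates regulatory risk assessment.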

  3. Metabolic syndrome--neurotrophic hypothesis.

    Science.gov (United States)

    Hristova, M; Aloe, L

    2006-01-01

    An increasing number of researchers of the metabolic syndrome assume that many mechanisms are involved in its complex pathophysiology, such as increased sympathetic activity, disorders of the hypothalamo-pituitary-adrenal axis, the action of chronic subclinical infections, proinflammatory cytokines, and the effect of adipocytokines or psychoemotional stress. An increasing body of scientific research in this field confirms the role of the neurotrophins and mastocytes in the pathogenesis of inflammatory and immune diseases. Recently it has been proved that neurotrophins and mastocytes have metabotrophic effects and take part in carbohydrate and lipid metabolism. In the early stage of the metabolic syndrome we established a statistically significant increase in the plasma levels of nerve growth factor. In the generalized stage the plasma levels of the neurotrophins were significantly decreased in comparison to those in the healthy controls. We consider that the neurotrophin deficit is likely to play a significant pathogenic role in the development of the metabolic, anthropometric and vascular manifestations of the generalized stage of MetSyn. We suggest a hypothesis for the etiopathogenesis of the metabolic syndrome based on neuro-immuno-endocrine interactions. The specific pathogenic pathways of MetSyn development include: (1) increased tissue and plasma levels of the proinflammatory cytokines Interleukin-1 (IL-1), Interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-alpha) caused by inflammatory and/or emotional distress; (2) increased plasma levels of the neurotrophin nerve growth factor (NGF) caused by the high IL-1, IL-6 and TNF-alpha levels; (3) high plasma levels of NGF which enhance activation of: the autonomic nervous system (vegetodystonia, an imbalance of neurotransmitters); Neuropeptide Y (NPY), with enhanced feeding, obesity and increased leptin plasma levels; the hypothalamo-pituitary-adrenal axis, with increased corticotropin-releasing hormone (CRH) and

  4. Memory in astrocytes: a hypothesis

    Directory of Open Access Journals (Sweden)

    Caudle Robert M

    2006-01-01

    Background: Recent work has indicated an increasingly complex role for astrocytes in the central nervous system. Astrocytes are now known to exchange information with neurons at synaptic junctions and to alter the information processing capabilities of the neurons. As an extension of this trend a hypothesis was proposed that astrocytes function to store information. To explore this idea the ion channels in biological membranes were compared to models known as cellular automata. These comparisons were made to test the hypothesis that ion channels in the membranes of astrocytes form a dynamic information storage device. Results: Two-dimensional cellular automata were found to behave similarly to ion channels in a membrane when they function at the boundary between order and chaos. The length of time information is stored in this class of cellular automata is exponentially related to the number of units. Therefore the length of time biological ion channels store information was plotted versus the estimated number of ion channels in the tissue. This analysis indicates that there is an exponential relationship between memory and the number of ion channels. Extrapolation of this relationship to the estimated number of ion channels in the astrocytes of a human brain indicates that memory can be stored in this system for an entire life span. Interestingly, this information is not affixed to any physical structure, but is stored as an organization of the activity of the ion channels. Further analysis of two-dimensional cellular automata also demonstrates that these systems have both associative and temporal memory capabilities. Conclusion: It is concluded that astrocytes may serve as a dynamic information sink for neurons. The memory in the astrocytes is stored by organizing the activity of ion channels and is not associated with a physical location such as a synapse. In order for this form of memory to be of significant duration it is necessary
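    The kind of two-dimensional automaton the abstract invokes can be illustrated with Conway's Game of Life, a classic automaton at the boundary between order and chaos (chosen here for familiarity; it is not an ion-channel model). A "blinker" stores its state as ongoing activity rather than as a fixed structure, echoing the paper's point that this memory is an organization of activity, not a physical location:

```python
from collections import Counter

def life_step(cells):
    """One synchronous update of Conway's Game of Life, where the grid
    is represented sparsely as a set of live-cell coordinates."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next step if it has 3 neighbours, or 2 and is live now.
    return {c for c, n in neighbour_counts.items()
            if n == 3 or (n == 2 and c in cells)}

# A 'blinker' oscillates between vertical and horizontal with period 2,
# holding one bit of information purely in its pattern of activity:
blinker = {(0, -1), (0, 0), (0, 1)}
step1 = life_step(blinker)    # horizontal: {(-1, 0), (0, 0), (1, 0)}
step2 = life_step(step1)      # back to the original vertical pattern
print(step2 == blinker)       # True
```

The memory here is dynamic in exactly the abstract's sense: erase the activity and the stored information is gone, even though no physical component changed.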

  5. Robust and distributed hypothesis testing

    CERN Document Server

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation and the theory of robust distributed detection is extended to classes of distributions, which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...
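    One of the test families mentioned above, the sequential probability ratio test, can be sketched for two simple hypotheses about a Bernoulli parameter. This is the classical (non-robust) Wald form; the parameter values and error rates below are illustrative choices, not ones taken from the book:

```python
import math

def sprt(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for H0: p = p0 versus
    H1: p = p1 on a stream of Bernoulli observations. Stops as soon as
    the cumulative log-likelihood ratio crosses a decision boundary."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio contribution of one observation:
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(samples)

print(sprt([1, 1, 1, 1, 1]))  # → ('accept H1', 4)
```

The appeal of the sequential form is that it decides as early as the evidence allows; the robust variants treated in the book replace the fixed likelihood ratio with one that is least favorable over a neighborhood of distributions.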

  6. The venom optimization hypothesis revisited.

    Science.gov (United States)

    Morgenstern, David; King, Glenn F

    2013-03-01

    Animal venoms are complex chemical mixtures that typically contain hundreds of proteins and non-proteinaceous compounds, resulting in a potent weapon for prey immobilization and predator deterrence. However, because venoms are protein-rich, they come with a high metabolic price tag. The metabolic cost of venom is sufficiently high to result in secondary loss of venom whenever its use becomes non-essential to survival of the animal. The high metabolic cost of venom leads to the prediction that venomous animals may have evolved strategies for minimizing venom expenditure. Indeed, various behaviors have been identified that appear consistent with frugality of venom use. This has led to formulation of the "venom optimization hypothesis" (Wigger et al. (2002) Toxicon 40, 749-752), also known as "venom metering", which postulates that venom is metabolically expensive and therefore used frugally through behavioral control. Here, we review the available data concerning economy of venom use by animals with either ancient or more recently evolved venom systems. We conclude that the convergent nature of the evidence in multiple taxa strongly suggests the existence of evolutionary pressures favoring frugal use of venom. However, there remains an unresolved dichotomy between this economy of venom use and the lavish biochemical complexity of venom, which includes a high degree of functional redundancy. We discuss the evidence for biochemical optimization of venom as a means of resolving this conundrum. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Alien abduction: a medical hypothesis.

    Science.gov (United States)

    Forrest, David V

    2008-01-01

    In response to a new psychological study of persons who believe they have been abducted by space aliens, which found that sleep paralysis, a history of being hypnotized, and preoccupation with the paranormal and extraterrestrial were predisposing experiences, I noted that many of the frequently reported particulars of the abduction experience bear more than a passing resemblance to medical-surgical procedures, and I propose that experience with these may also be contributory. There is the altered state of consciousness, uniformly colored figures with prominent eyes, in a high-tech room under a round bright saucerlike object; there is nakedness, pain and a loss of control while the body's boundaries are being probed; and yet the figures are thought benevolent. No medical-surgical history was apparently taken in the above-mentioned study, but psychological laboratory work evaluated false memory formation. I discuss problems in assessing intraoperative awareness and ways in which the medical hypothesis could be elaborated and tested. If physicians are causing this syndrome in a percentage of patients, we should know about it; and persons who feel they have been abducted should be encouraged to inform their surgeons and anesthesiologists without challenging their beliefs.

  8. Photoproduction of Charm Near Threshold

    Energy Technology Data Exchange (ETDEWEB)

    Brodsky, Stanley J.

    2000-10-31

    Charm and bottom production near threshold is sensitive to the multi-quark, gluonic, and hidden-color correlations of hadronic and nuclear wavefunctions in QCD since all of the target's constituents must act coherently within the small interaction volume of the heavy quark production subprocess. Although such multi-parton subprocess cross sections are suppressed by powers of 1/m{sub Q}{sup 2}, they have less phase-space suppression and can dominate the contributions of the leading-twist single-gluon subprocesses in the threshold regime. The small rates for open and hidden charm photoproduction at threshold call for a dedicated facility.

  9. ‘Soglitude’- introducing a method of thinking thresholds

    Directory of Open Access Journals (Sweden)

    Tatjana Barazon

    2010-04-01

    philosophical, artistic or scientific, it tends to free itself from rigid or fixed models and accepts change and development as the fundamental nature of things. Thinking thresholds as a method of thought cannot be accomplished in a single step and therefore calls for participation in its proper nature. Soglitude springs from the acceptance of a multitude of points of view, as shown by the numerous contributions we present in this issue ‘Seuils, Thresholds, Soglitudes’ of Conserveries mémorielles.

  10. ARC Code TI: ACCEPT

    Data.gov (United States)

    National Aeronautics and Space Administration — ACCEPT consists of an overall software infrastructure framework and two main software components. The software infrastructure framework consists of code written to...

  11. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  12. Food allergy: Stakeholder perspectives on acceptable risk

    DEFF Research Database (Denmark)

    Madsen, Charlotte Bernhard; Crevel, René; Chan, Chun-Han

    2010-01-01

    We have reached a point where it is difficult to improve food allergy risk management without an agreement on levels of acceptable risk. This paper presents and discusses the perspectives of the different stakeholders (allergic consumers, health professionals, public authorities and the food industry) on acceptable risk in food allergy. Understanding where these perspectives diverge and even conflict may help develop an approach to define what is acceptable. Uncertainty about food allergy, its consequences and how to manage them is the common denominator of the stakeholders’ views. In patients...... to all patients despite the fact that the risk to each is not identical. Regulators and the food industry struggle with the fact that the lack of management thresholds forces them to make case-by-case decisions in an area of uncertainty with penalties for under- or over-prediction. As zero risk

  13. Cochlear neuropathy and the coding of supra-threshold sound.

    Science.gov (United States)

    Bharadwaj, Hari M; Verhulst, Sarah; Shaheen, Luke; Liberman, M Charles; Shinn-Cunningham, Barbara G

    2014-01-01

    Many listeners with hearing thresholds within the clinically normal range nonetheless complain of difficulty hearing in everyday settings and understanding speech in noise. Converging evidence from human and animal studies points to one potential source of such difficulties: differences in the fidelity with which supra-threshold sound is encoded in the early portions of the auditory pathway. Measures of auditory subcortical steady-state responses (SSSRs) in humans and animals support the idea that the temporal precision of the early auditory representation can be poor even when hearing thresholds are normal. In humans with normal hearing thresholds (NHTs), paradigms that require listeners to make use of the detailed spectro-temporal structure of supra-threshold sound, such as selective attention and discrimination of frequency modulation (FM), reveal individual differences that correlate with subcortical temporal coding precision. Animal studies show that noise exposure and aging can cause a loss of a large percentage of auditory nerve fibers (ANFs) without any significant change in measured audiograms. Here, we argue that cochlear neuropathy may reduce encoding precision of supra-threshold sound, and that this manifests both behaviorally and in SSSRs in humans. Furthermore, recent studies suggest that noise-induced neuropathy may be selective for higher-threshold, lower-spontaneous-rate nerve fibers. Based on our hypothesis, we suggest some approaches that may yield particularly sensitive, objective measures of supra-threshold coding deficits that arise due to neuropathy. Finally, we comment on the potential clinical significance of these ideas and identify areas for future investigation.

  14. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusion would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among the states remain real. Among the military or national security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives of all of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  15. Threshold models in radiation carcinogenesis

    International Nuclear Information System (INIS)

    Hoel, D.G.; Li, P.

    1998-01-01

    Cancer incidence and mortality data from the atomic bomb survivors cohort have been analyzed to allow for the possibility of a threshold dose-response. The same dose-response models as used in the original papers were fit to the data. The estimated cancer incidence from the fitted models over-predicted the observed cancer incidence in the lowest exposure group. This is consistent with a threshold or nonlinear dose-response at low doses. Thresholds were added to the dose-response models, and the range of possible thresholds is shown for both solid tumor cancers and the different leukemia types. This analysis suggests that the A-bomb cancer incidence data agree more with a threshold or nonlinear dose-response model than with a purely linear model, although the linear model is statistically equivalent. This observation is not found with the mortality data. For both the incidence data and the mortality data, the addition of a threshold term significantly improves the fit to the linear or linear-quadratic dose-response for both total leukemias and also for the leukemia subtypes of ALL, AML, and CML.
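
    The kind of threshold dose-response described above can be written, for illustration, as a linear-quadratic excess-risk model with a threshold dose τ below which no excess risk is assumed. This is a hedged sketch of the general form, not the authors' exact parameterization:

    ```latex
    % Illustrative linear-quadratic excess risk with threshold dose \tau
    % (\alpha, \beta are fitted coefficients; the authors' exact
    % parameterization may differ):
    \mathrm{ER}(d) =
    \begin{cases}
    0 & d \le \tau, \\
    \alpha\,(d - \tau) + \beta\,(d - \tau)^{2} & d > \tau.
    \end{cases}
    ```

    Setting τ = 0 recovers the purely linear-quadratic model, so the threshold fit nests the original one; this is what makes a direct comparison of the fits meaningful.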

  16. Tactile thresholds in healthy subjects

    Directory of Open Access Journals (Sweden)

    Metka Moharić

    2014-10-01

    Background: The assessment of sensory thresholds provides a method of examining the function of peripheral nerve fibers and their central connections. Quantitative sensory testing is a variant of conventional sensory testing wherein the goal is to quantify the level of stimulation needed to produce a particular sensation. While thermal and vibratory testing are established methods for assessing sensory thresholds, assessment of tactile thresholds with monofilaments is not used routinely. The purpose of this study was to assess tactile thresholds in a normal healthy population. Methods: In 39 healthy volunteers (19 men) aged 21 to 71 years, tactile thresholds were assessed with von Frey hairs at 7 sites of the body bilaterally. Results: We found touch sensitivity to be independent of age and gender. The right side was significantly more sensitive in the lateral part of the leg (p=0.011) and the left side in the medial part of the arm (p=0.022). There were also significant differences between sites (p<0.001), whereby distal parts of the body were more sensitive. Conclusions: Von Frey filaments allow the estimation of tactile thresholds without the need for complicated instrumentation.

  17. Determining lower threshold concentrations for synergistic effects.

    Science.gov (United States)

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas; Nørgaard, Katrine Banke; Mayer, Philipp; Cedergreen, Nina

    2017-01-01

    Though only occurring rarely, synergistic interactions between chemicals in mixtures have long been a point of focus. Most studies analyzing synergistic interactions used unrealistically high chemical concentrations. The aim of the present study is to determine the threshold concentration below which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test setups to evaluate which approach gives the most conservative estimate of the lower threshold for synergy for three known azole synergists. We focus on synergistic interactions between the pyrethroid insecticide alpha-cypermethrin and one of the three azole fungicides prochloraz, propiconazole or epoxiconazole, measured on Daphnia magna immobilization. Three different experimental setups were applied: a standard 48-h acute toxicity test, an adapted 48-h test using passive dosing for constant chemical exposure concentrations, and a 14-day test. Synergy was defined as occurring in mixtures where either EC50 values decreased more than two-fold below what was predicted by concentration addition (horizontal assessment) or the fraction of immobile organisms increased more than two-fold above what was predicted by independent action (vertical assessment). All three tests confirmed the hypothesis of the existence of a lower azole threshold concentration below which no synergistic interaction was observed. The lower threshold concentration, however, decreased with increasing test duration, from 0.026±0.013 μM (9.794±4.897 μg/L), 0.425±0.089 μM (145.435±30.46 μg/L) and 0.757±0.253 μM (249.659±83.44 μg/L) for prochloraz, propiconazole and epoxiconazole in standard 48-h toxicity tests to 0.015±0.004 μM (5.651±1.507 μg/L), 0.145±0.025 μM (49.619±8.555 μg/L) and 0.122±0.0417 μM (40.236±13.75 μg/L), respectively, in the 14-day tests. Testing synergy in relation to concentration addition provided
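
    The two reference models used to score synergy in the abstract above (concentration addition for the horizontal assessment, independent action for the vertical one) and the two-fold criterion can be sketched as follows. This is an illustrative implementation with assumed function names, not code from the study:

    ```python
    def ca_predicted_ec50_mix(fractions, ec50s):
        # Concentration addition (Loewe additivity) for a mixture given as
        # toxic-unit fractions p_i of components with single-substance EC50s:
        #   1 / EC50_mix = sum_i (p_i / EC50_i)
        return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

    def ia_predicted_effect(effects):
        # Independent action (Bliss independence): E_mix = 1 - prod(1 - E_i)
        prod = 1.0
        for e in effects:
            prod *= (1.0 - e)
        return 1.0 - prod

    def is_synergistic_horizontal(observed_ec50_mix, fractions, ec50s, factor=2.0):
        # Horizontal assessment: synergy if the observed mixture EC50 falls
        # more than `factor`-fold below the concentration-addition prediction.
        return observed_ec50_mix < ca_predicted_ec50_mix(fractions, ec50s) / factor

    def is_synergistic_vertical(observed_effect, effects, factor=2.0):
        # Vertical assessment: synergy if the observed effect fraction is more
        # than `factor`-fold above the independent-action prediction.
        return observed_effect > ia_predicted_effect(effects) * factor
    ```

    For an equitoxic binary mixture whose components each have an EC50 of 1.0, concentration addition predicts a mixture EC50 of 1.0, so an observed mixture EC50 below 0.5 would count as synergistic under the two-fold criterion.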

  19. The influence of olfactory concept on the probability of detecting sub- and peri-threshold odorants in a complex mixture

    NARCIS (Netherlands)

    Bult, J.H.F.; Schifferstein, H.N.J.; Roozen, J.P.; Voragen, A.G.J.; Kroeze, J.H.A.

    2001-01-01

    The headspace of apple juice was analysed to obtain an ecologically relevant stimulus model mixture of apple volatiles. Two sets of volatiles were made up: a set of eight supra-threshold volatiles (MIX) and a set of three sub-threshold volatiles. These sets were used to test the hypothesis that

  20. A practical threshold concept for simple and reasonable radiation protection

    International Nuclear Information System (INIS)

    Kaneko, Masahito

    2002-01-01

    A half century ago it was assumed, for the purpose of protection, that radiation risks are linearly proportional to dose at all levels. The Linear No-Threshold (LNT) hypothesis has greatly contributed to the minimization of doses received by workers and members of the public, while it has also brought about 'radiophobia' and unnecessary over-regulation. Now that bio-defensive mechanisms such as DNA repair, apoptosis and adaptive response are well recognized, the linearity assumption can be called 'unscientific'. Evidence increasingly implies that there are threshold effects in radiation risk. A concept of 'practical' thresholds is proposed, and the classification of radiation effects into 'stochastic' and 'deterministic' should be abandoned. 'Practical' thresholds are dose levels below which induction of detectable radiogenic cancers or hereditary effects is not expected. There seems to be no evidence of deleterious health effects from radiation exposures at the current dose limits (50 mSv/y for workers and 5 mSv/y for members of the public), which have been adopted worldwide in the latter half of the 20th century. Those limits are assumed to have been set below certain 'practical' thresholds. As workers and members of the public do not gain benefits from being exposed, excepting intentional irradiation for medical purposes, their radiation exposures should be kept below 'practical' thresholds. There is then no need for the 'justification' and 'optimization' (ALARA) principles, because there are no 'radiation detriments' as long as exposures are maintained below 'practical' thresholds. Accordingly, the ethical issue of 'justification', allowing benefit to society to offset radiation detriments to individuals, can be resolved; so can the ethical issue of 'optimization', exchanging health or safety for economic gain. The ALARA principle should instead be applied to the probability (risk) of exceeding relevant dose limits rather than to normal exposures.

  1. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

    Wickham, L.E.; Kluk, A.F.; Department of Energy, Washington, DC)

    1985-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling. It must also be low enough that waste at the threshold quantity poses a very small health risk and multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste stream characteristics, maximally exposed individual, population considerations, and site-specific parameters such as rainfall, etc.). A guidance dose of between 0.001 to 1.0 mSv/y (0.1 to 100 mrem/y) was recommended, with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid and arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized, and the relationship of this effort to related developments at other agencies is discussed.

  2. Acceptable noise level

    DEFF Research Database (Denmark)

    Olsen, Steen Østergaard; Nielsen, Lars Holme; Lantz, Johannes

    2012-01-01

    The acceptable noise level (ANL) is used to quantify the amount of background noise that subjects can accept while listening to speech, and is suggested for prediction of individual hearing-aid use. The aim of this study was to assess the repeatability of the ANL measured in normal-hearing subjects...... using running Danish and non-semantic speech materials as stimuli and modulated speech-spectrum and multi-talker babble noises as competing stimuli....

  3. Thresholds in chemical respiratory sensitisation.

    Science.gov (United States)

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition it provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease, chemical respiratory allergy develops in two phases. In the first (induction) phase, exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is subsequently exposed to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract, resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article attention has focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have a threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available or clinical/epidemiological data that permit the identification of such thresholds. This document reviews briefly relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  4. On computational Gestalt detection thresholds.

    Science.gov (United States)

    Grompone von Gioi, Rafael; Jakubowicz, Jérémie

    2009-01-01

    The aim of this paper is to show some recent developments of computational Gestalt theory, as pioneered by Desolneux, Moisan and Morel. The new results allow much more accurate prediction of the detection thresholds. This step is unavoidable if one wants to analyze visual detection thresholds in the light of computational Gestalt theory. The paper first recalls the main elements of computational Gestalt theory. It points out a precision issue in this theory, essentially due to the use of discrete probability distributions. It then proposes to overcome this issue by using continuous probability distributions, and illustrates the approach on the meaningful-alignment detector of Desolneux et al.
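
    In computational Gestalt theory, an event is declared meaningful when its number of false alarms (NFA), the expected count of equally extreme events under the background noise model, falls below a small ε. A minimal sketch of the discrete (binomial) form that the theory starts from, with illustrative function names (the paper's contribution is to replace such discrete distributions with continuous ones):

    ```python
    from math import comb

    def binomial_tail(n, k, p):
        # P(X >= k) for X ~ Binomial(n, p): probability that at least k of
        # n points agree with the candidate structure by chance.
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def nfa(n_tests, n, k, p):
        # Number of false alarms: expected count of events at least this
        # extreme among n_tests candidates under the a-contrario noise model.
        return n_tests * binomial_tail(n, k, p)

    def is_meaningful(n_tests, n, k, p, eps=1.0):
        # An event is eps-meaningful when its NFA does not exceed eps
        # (eps = 1 is the usual choice).
        return nfa(n_tests, n, k, p) <= eps
    ```

    The discreteness of the binomial tail is exactly what limits the precision of the predicted thresholds: small changes in k jump the NFA in coarse steps, which the continuous formulation smooths out.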

  5. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    One proves that a Nuclear Threshold Effect is dependent, via Neutron Strength Function, on Spectroscopy of Ancestral Neutron Threshold State. The magnitude of the Nuclear Threshold Effect is proportional to the Neutron Strength Function. Evidence for relation of Nuclear Threshold Effects to Neutron Strength Functions is obtained from Isotopic Threshold Effect and Deuteron Stripping Threshold Anomaly. The empirical and computational analysis of the Isotopic Threshold Effect and of the Deuteron Stripping Threshold Anomaly demonstrate their close relationship to Neutron Strength Functions. It was established that the Nuclear Threshold Effects depend, in addition to genuine Nuclear Reaction Mechanisms, on Spectroscopy of (Ancestral) Neutron Threshold State. The magnitude of the effect is proportional to the Neutron Strength Function, in their dependence on mass number. This result constitutes also a proof that the origins of these threshold effects are Neutron Single Particle States at zero energy. (author)

  6. Thresholds models of technological transitions

    NARCIS (Netherlands)

    Zeppini, P.; Frenken, K.; Kupers, R.

    2014-01-01

    We present a systematic review of seven threshold models of technological transitions from physics, biology, economics and sociology. The very same phenomenon of a technological transition can be explained by very different logics, ranging from economic explanations based on price, performance and

  7. Risk thresholds for alcohol consumption

    DEFF Research Database (Denmark)

    Wood, Angela M; Kaptoge, Stephen; Butterworth, Adam S

    2018-01-01

    BACKGROUND: Low-risk limits recommended for alcohol consumption vary substantially across different national guidelines. To define thresholds associated with lowest risk for all-cause mortality and cardiovascular disease, we studied individual-participant data from 599 912 current drinkers withou...

  8. Weights of Exact Threshold Functions

    DEFF Research Database (Denmark)

    Babai, László; Hansen, Kristoffer Arnsfelt; Podolskii, Vladimir V.

    2010-01-01

    We consider Boolean exact threshold functions defined by linear equations, and in general degree d polynomials. We give upper and lower bounds on the maximum magnitude (absolute value) of the coefficients required to represent such functions. These bounds are very close and in the linear case in ...... leave a substantial gap, a challenge for future work....
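
    The function class in question can be stated compactly (standard definition; the notation is assumed, not quoted from the paper):

    ```latex
    % A Boolean exact threshold function is given by integer weights
    % w_1,\dots,w_n and a threshold t:
    f(x_1,\dots,x_n) = 1 \iff \sum_{i=1}^{n} w_i\, x_i = t,
    \qquad x \in \{0,1\}^{n}.
    % Ordinary threshold functions use "\ge" in place of "="; the paper
    % bounds the maximum |w_i| needed to represent such functions.
    ```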

  9. Threshold quantities for helminth infections

    NARCIS (Netherlands)

    Heesterbeek, J.A.P.; Roberts, M.G.

    1995-01-01

    For parasites with a clearly defined life-cycle we give threshold quantities that determine the stability of the parasite-free steady state for autonomous and periodic deterministic systems formulated in terms of mean parasite burdens. We discuss the biological interpretations of the quantities, how

  10. Percolation Threshold Parameters of Fluids

    Czech Academy of Sciences Publication Activity Database

    Škvor, J.; Nezbeda, Ivo

    2009-01-01

    Roč. 79, č. 4 (2009), 041141-041147 ISSN 1539-3755 Institutional research plan: CEZ:AV0Z40720504 Keywords : percolation threshold * universality * infinite cluster Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.400, year: 2009

  11. Threshold enhancement of diphoton resonances

    Directory of Open Access Journals (Sweden)

    Aoife Bharucha

    2016-10-01

    We revisit a mechanism to enhance the decay width of (pseudo-)scalar resonances to photon pairs when the process is mediated by loops of charged fermions produced near threshold. Motivated by the recent LHC data, indicating the presence of an excess in the diphoton spectrum at approximately 750 GeV, we illustrate this threshold enhancement mechanism in the case of a 750 GeV pseudoscalar boson A with a two-photon decay mediated by a charged and uncolored fermion having a mass at the MA/2 threshold and a small decay width, <1 MeV. The implications of such a threshold enhancement are discussed in two explicit scenarios: (i) the Minimal Supersymmetric Standard Model, in which the A state is produced via the top-quark-mediated gluon fusion process and decays into photons predominantly through loops of charginos with masses close to MA/2, and (ii) a two-Higgs-doublet model in which A is again produced by gluon fusion but decays into photons through loops of vector-like charged heavy leptons. In both these scenarios, while the mass of the charged fermion has to be adjusted to be extremely close to half of the A resonance mass, the small total widths are naturally obtained if only suppressed three-body decay channels occur. Finally, the implications of some of these scenarios for dark matter are discussed.

  12. Crossing Thresholds in Academic Reading

    Science.gov (United States)

    Abbott, Rob

    2013-01-01

    This paper looks at the conceptual thresholds in relation to academic reading which might be crossed by undergraduate English Literature students. It is part of a wider study following 16 students through three years of undergraduate study. It uses theoretical ideas from Bakhtin and Foucault to analyse interviews with English lecturers. It…

  13. Teaching hypothesis testing: a necessary challenge

    NARCIS (Netherlands)

    Post, Wendy J.; van Duijn, Marijtje A.J.; Makar, Katie; de Sousa, Bruno; Gould, Robert

    Over the last decades a debate has been going on about the use of hypothesis testing. This has led some teachers to think that confidence intervals and effect sizes should be taught instead of formal hypothesis testing with p-values. Although we see shortcomings of the use of p-values in statistical

  14. Hypothesis elimination on a quantum computer

    OpenAIRE

    Soklakov, Andrei N.; Schack, Ruediger

    2004-01-01

    Hypothesis elimination is a special case of Bayesian updating, where each piece of new data rules out a set of prior hypotheses. We describe how to use Grover's algorithm to perform hypothesis elimination for a class of probability distributions encoded on a register of qubits, and establish a lower bound on the required computational resources.

  15. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  16. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  17. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J,

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  18. Predictions from high scale mixing unification hypothesis

    Indian Academy of Sciences (India)

    2016-01-09

    Jan 9, 2016 ... Starting with 'high scale mixing unification' hypothesis, we investigate the renormalization group evolution of mixing parameters and masses for both Dirac and Majorana-type neutrinos. Following this hypothesis, the PMNS mixing parameters are taken to be identical to the CKM ones at a unifying high ...

  19. Prospective detection of large prediction errors: a hypothesis testing approach

    International Nuclear Information System (INIS)

    Ruan, Dan

    2010-01-01

    Real-time motion management is important in radiotherapy. In addition to effective monitoring schemes, prediction is required to compensate for system latency, so that treatment can be synchronized with tumor motion. However, it is difficult to predict tumor motion at all times, and it is critical to determine when large prediction errors may occur. Such information can be used to pause the treatment beam or adjust monitoring/prediction schemes. In this study, we propose a hypothesis testing approach for detecting instants corresponding to potentially large prediction errors in real time. We treat the future tumor location as a random variable, and obtain its empirical probability distribution with the kernel density estimation-based method. Under the null hypothesis, the model probability is assumed to be a concentrated Gaussian centered at the prediction output. Under the alternative hypothesis, the model distribution is assumed to be non-informative uniform, which reflects the situation that the future position cannot be inferred reliably. We derive the likelihood ratio test (LRT) for this hypothesis testing problem and show that with the method of moments for estimating the null hypothesis Gaussian parameters, the LRT reduces to a simple test on the empirical variance of the predictive random variable. This conforms to the intuition to expect a (potentially) large prediction error when the estimate is associated with high uncertainty, and to expect an accurate prediction when the uncertainty level is low. We tested the proposed method on patient-derived respiratory traces. The 'ground-truth' prediction error was evaluated by comparing the prediction values with retrospective observations, and the large prediction regions were subsequently delineated by thresholding the prediction errors. The receiver operating characteristic curve was used to describe the performance of the proposed hypothesis testing method. Clinical implication was represented by miss
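
    The reduced test described at the end of the abstract above, flagging a potentially large prediction error whenever the empirical variance of the predictive random variable exceeds a threshold, can be sketched as follows. The names and the threshold wiring are illustrative; in the paper the predictive distribution comes from kernel density estimation and the threshold from the likelihood ratio test:

    ```python
    def predictive_variance(samples):
        # Empirical variance of the predictive random variable, e.g. samples
        # of the future tumor position drawn from a kernel density estimate.
        n = len(samples)
        mean = sum(samples) / n
        return sum((s - mean) ** 2 for s in samples) / n

    def flag_large_error(samples, var_threshold):
        # Reduced LRT: declare "potentially large prediction error" when the
        # predictive uncertainty exceeds the threshold, so the treatment beam
        # can be paused or the monitoring scheme adjusted.
        return predictive_variance(samples) > var_threshold
    ```

    This matches the stated intuition: a tight (low-variance) predictive distribution means the prediction can be trusted, while a spread-out one signals that the future position cannot be inferred reliably.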

  20. Hypothesis testing in hydrology: Theory and practice

    Science.gov (United States)

    Kirchner, James; Pfister, Laurent

    2017-04-01

    Well-posed hypothesis tests have spurred major advances in hydrological theory. However, a random sample of recent research papers suggests that in hydrology, as in other fields, hypothesis formulation and testing rarely correspond to the idealized model of the scientific method. Practices such as "p-hacking" or "HARKing" (Hypothesizing After the Results are Known) are major obstacles to more rigorous hypothesis testing in hydrology, along with the well-known problem of confirmation bias - the tendency to value and trust confirmations more than refutations - among both researchers and reviewers. Hypothesis testing is not the only recipe for scientific progress, however: exploratory research, driven by innovations in measurement and observation, has also underlain many key advances. Further improvements in observation and measurement will be vital to both exploratory research and hypothesis testing, and thus to advancing the science of hydrology.

  1. The heritability of acceptability in South African Merino sheep ...

    African Journals Online (AJOL)

    Selection for production and reproduction in South African Merino sheep is always combined with selection based on visual appraisal and will, in all probability, remain so for many years to come. Heritabilities for acceptability were estimated using a threshold model to analyse data from two parent Merino studs. Effects ...

  2. Knowledge dimensions in hypothesis test problems

    Science.gov (United States)

    Krishnan, Saras; Idris, Noraini

    2012-05-01

    The reformation in statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on the formulas and calculation procedures. Meanwhile, conceptual understanding emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework to describe learning objectives, comprising the six revised cognition levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from the more connected understanding. This study identifies the factual, procedural and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty in understanding the underlying concepts of hypothesis testing. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems in this study, suitable instructional and assessment strategies can be developed in the future to enhance students' learning of hypothesis testing as a valuable inferential tool.

  3. Fire drives functional thresholds on the savanna-forest transition.

    Science.gov (United States)

    Dantas, Vinícius de L; Batalha, Marco A; Pausas, Juli G

    2013-11-01

    In tropical landscapes, vegetation patches with contrasting tree densities are distributed as mosaics. However, the locations of patches and densities of trees within them cannot be predicted by climate models alone. It has been proposed that plant-fire feedbacks drive functional thresholds at a landscape scale, thereby maintaining open (savanna) and closed (forest) communities as two distinct stable states. However, there is little rigorous field evidence for this threshold model. Here we aim to provide support for such a model from a field perspective and to analyze the functional and phylogenetic consequences of fire in a Brazilian savanna landscape (Cerrado). We hypothesize that, in tropical landscapes, savanna and forest are two stable states maintained by plant-fire feedbacks. If so, their functional and diversity attributes should change abruptly along a community closure gradient. We set 98 plots along a gradient from open savanna to closed forest in the Brazilian Cerrado and tested for a threshold pattern in nine functional traits, five soil features, and seven diversity indicators. We then tested whether the threshold pattern was associated with different fire regimes. Most community attributes presented a threshold pattern on the savanna-forest transition with coinciding breakpoints. The thresholds separated two community states: (1) open environments with low-diversity communities growing in poor soils and dominated by plants that are highly resistant to high-intensity fires; and (2) closed environments with highly diverse plant communities growing in more fertile soils and dominated by shade-tolerant species that efficiently prevent light from reaching the understory. In addition, each state was associated with contrasting fire regimes. Our results are consistent with the hypothesis that forests and savannas are two coexisting stable states with contrasting patterns of function and diversity that are regulated by fire-plant feedbacks; our results also

  4. Elicitation threshold of cobalt chloride

    DEFF Research Database (Denmark)

    Fischer, Louise A; Johansen, Jeanne D; Voelund, Aage

    2016-01-01

    BACKGROUND: Cobalt is a strong skin sensitizer (grade 5 of 5 in the guinea-pig maximization test) that is used in various industrial and consumer applications. To prevent sensitization to cobalt and elicitation of allergic cobalt dermatitis, information about the elicitation threshold level...... of cobalt is important. OBJECTIVE: To identify the dermatitis elicitation threshold levels in cobalt-allergic individuals. MATERIALS AND METHODS: Published patch test dose-response studies were reviewed to determine the elicitation dose (ED) levels in dermatitis patients with a previous positive patch test...... reaction to cobalt. A logistic dose-response model was applied to data collected from the published literature to estimate ED values. The 95% confidence interval (CI) for the ratio of mean doses that can elicit a reaction in 10% (ED(10)) of a population was calculated with Fieller's method. RESULTS...

  5. Scaling behavior of threshold epidemics

    Science.gov (United States)

    Ben-Naim, E.; Krapivsky, P. L.

    2012-05-01

    We study the classic Susceptible-Infected-Recovered (SIR) model for the spread of an infectious disease. In this stochastic process, there are two competing mechanisms: infection and recovery. Susceptible individuals may contract the disease from infected individuals, while infected ones recover from the disease at a constant rate and are never infected again. Our focus is the behavior at the epidemic threshold where the rates of the infection and recovery processes balance. In the infinite population limit, we establish analytically scaling rules for the time-dependent distribution functions that characterize the sizes of the infected and the recovered sub-populations. Using heuristic arguments, we also obtain scaling laws for the size and duration of the epidemic outbreaks as a function of the total population. We perform numerical simulations to verify the scaling predictions and discuss the consequences of these scaling laws for near-threshold epidemic outbreaks.
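A minimal stochastic simulation of the SIR process described above can be run at the threshold where infection and recovery rates balance. This is a sketch, not the authors' code; the population size and number of runs are illustrative.

```python
import random

def sir_outbreak(n, beta, gamma, seed=0):
    """One stochastic SIR outbreak in a population of size n, starting
    from a single infected individual.  At each event, infection occurs
    with probability proportional to beta*S*I/n and recovery with
    probability proportional to gamma*I; the outbreak ends when no one
    is infected.  Returns the final outbreak size (total recovered)."""
    rng = random.Random(seed)
    s, i, r = n - 1, 1, 0
    while i > 0:
        rate_inf = beta * s * i / n
        rate_rec = gamma * i
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            s, i = s - 1, i + 1   # infection event
        else:
            i, r = i - 1, r + 1   # recovery event
    return r

# At the threshold (beta == gamma) outbreak sizes are broadly distributed;
# scaling arguments of the kind the abstract mentions predict a typical
# size growing sublinearly with the total population.
sizes = [sir_outbreak(1000, 1.0, 1.0, seed=k) for k in range(200)]
print(min(sizes), max(sizes))
```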

  6. UGV acceptance testing

    Science.gov (United States)

    Kramer, Jeffrey A.; Murphy, Robin R.

    2006-05-01

    With over 100 models of unmanned vehicles now available for military and civilian safety, security or rescue applications, it is important for agencies to establish acceptance testing. However, there appear to be no general guidelines for what constitutes a reasonable acceptance test. This paper describes i) a preliminary method for acceptance testing by a customer of the mechanical and electrical components of an unmanned ground vehicle system, ii) how it has been applied to a man-packable micro-robot, and iii) the value of testing both to ensure that the customer has a workable system and to improve design. The test method automated the operation of the robot to repeatedly exercise all aspects and combinations of components on the robot for 6 hours. The acceptance testing process uncovered many failures consistent with those shown to occur in the field, showing that testing by the user does predict failures. The process also demonstrated that testing by the manufacturer can provide important design data that can be used to identify, diagnose, and prevent long-term problems. Also, the structured testing environment showed that sensor systems can be used to predict errors and changes in performance, as well as to uncover unmodeled behavior in subsystems.

  7. Approaches to acceptable risk

    International Nuclear Information System (INIS)

    Whipple, C.

    1997-01-01

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  8. Roots at the Percolation Threshold

    Science.gov (United States)

    Kroener, E.; Ahmed, M. A.; Kaestner, A.; Vontobel, P.; Zarebanadkouki, M.; Carminati, A.

    2014-12-01

    Much of the carbon assimilated by plants during photosynthesis is lost to the soil via rhizodeposition. One component of rhizodeposition is mucilage, a hydrogel that dramatically alters the soil physical properties. Mucilage was assumed to explain unexpectedly low rhizosphere rewetting rates during irrigation (Carminati et al. 2010) and temporary water repellency in the rhizosphere after severe drying (Moradi et al. 2012). Here, we present an experimental and theoretical study of the rewetting behaviour of a soil mixed with mucilage, which was used as an analogue of the rhizosphere. Our samples were made of two layers of untreated soils separated by a thin layer (ca. 1 mm) of soil treated with mucilage. We prepared soil columns of varying particle size, mucilage concentration and height of the middle layer above the water table. The dry soil columns were re-wetted by capillary rise from the bottom. The rewetting of the middle layer showed a distinct dual behavior. For mucilage concentrations lower than a certain threshold, water could cross the thin layer almost immediately after rewetting of the bulk soil. At slightly higher mucilage concentrations, the thin layer was almost impermeable. The mucilage concentration at the threshold strongly depended on particle size: the smaller the particle size, the larger the soil specific surface and the more mucilage was needed to cover the entire particle surface and to induce water repellency. We applied a classic pore network model to simulate the experimental observations. In the model a certain fraction of nodes were randomly disconnected to reproduce the effect of mucilage in temporarily blocking the flow. The percolation model could qualitatively reproduce the threshold characteristics of the experiments. Our experiments, together with former observations of water dynamics in the rhizosphere, suggest that the rhizosphere is near the percolation threshold, where small variations in mucilage concentration sensitively
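The blocked-node pore-network idea can be illustrated with a simple site-percolation sketch. This is a hypothetical illustration, not the authors' model: a random fraction of grid sites is disconnected and connectivity across the layer is tested.

```python
import random
from collections import deque

def percolates(n, block_frac, seed=0):
    """Site-percolation sketch of the blocked-layer idea: disconnect a
    random fraction `block_frac` of sites in an n x n grid (pores blocked
    by mucilage) and test whether an open, 4-connected path runs from the
    bottom row to the top row (water crossing the treated layer)."""
    rng = random.Random(seed)
    open_site = [[rng.random() >= block_frac for _ in range(n)] for _ in range(n)]
    queue = deque((0, c) for c in range(n) if open_site[0][c])
    seen = set(queue)
    while queue:
        r, c = queue.popleft()
        if r == n - 1:
            return True          # reached the far side: layer is permeable
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and (nr, nc) not in seen and open_site[nr][nc]:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False                 # no crossing path: layer is impermeable

# Few blocked pores leave the layer permeable; many blocked pores seal it,
# with a sharp transition near the percolation threshold in between.
print(percolates(30, 0.1, seed=1), percolates(30, 0.9, seed=1))
```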

  9. Realistic Realizations Of Threshold Circuits

    Science.gov (United States)

    Razavi, Hassan M.

    1987-08-01

    Threshold logic, in which each input is weighted, has many theoretical advantages over the standard gate realization, such as reducing the number of gates, interconnections, and power dissipation. However, because of the difficult synthesis procedure and complicated circuit implementation, their use in the design of digital systems is almost nonexistent. In this study, three methods of NMOS realizations are discussed, and their advantages and shortcomings are explored. Also, the possibility of using the methods to realize multi-valued logic is examined.

  10. Root finding with threshold circuits

    Czech Academy of Sciences Publication Activity Database

    Jeřábek, Emil

    2012-01-01

    Roč. 462, Nov 30 (2012), s. 59-69 ISSN 0304-3975 R&D Projects: GA AV ČR IAA100190902; GA MŠk(CZ) 1M0545 Institutional support: RVO:67985840 Keywords : root finding * threshold circuit * power series Subject RIV: BA - General Mathematics Impact factor: 0.489, year: 2012 http://www.sciencedirect.com/science/article/pii/S0304397512008006#

  11. Rational expectations, psychology and inductive learning via moving thresholds

    Science.gov (United States)

    Lamba, H.; Seaman, T.

    2008-06-01

    This paper modifies a previously introduced class of heterogeneous agent models in a way that allows for the inclusion of different types of agent motivations and behaviours in a consistent manner. The agents operate within a highly simplified environment where they are only able to be long or short one unit of the asset. The price of the asset is influenced by both an external information stream and the demand of the agents. The current strategy of each agent is defined by a pair of moving thresholds straddling the current price. When the price crosses either of the thresholds for a particular agent, that agent switches position and a new pair of thresholds is generated. The threshold dynamics can mimic different sources of investor motivation, running the gamut from purely rational information-processing, through rational (but often undesirable) behaviour induced by perverse incentives and moral hazards, to purely psychological effects. The simplest model of this kind precisely conforms to the Efficient Market Hypothesis (EMH) and this allows causal relationships to be established between actions at the agent level and violations of EMH price statistics at the global level. In particular, the effects of herding behaviour and perverse incentives are examined.
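The moving-threshold mechanism can be sketched as follows; this is an illustrative reading of the abstract, and the class and parameter names are hypothetical.

```python
class ThresholdAgent:
    """Minimal sketch of one agent from the moving-threshold model:
    the agent holds a position of +1 (long) or -1 (short) one unit of
    the asset, and switches whenever the price crosses either of two
    thresholds straddling the price at which the thresholds were last
    set.  `width` is a hypothetical parameter for threshold spacing."""

    def __init__(self, price, width=1.0):
        self.position = 1
        self.width = width
        self._reset(price)

    def _reset(self, price):
        # Generate a new pair of thresholds straddling the current price.
        self.lower = price - self.width
        self.upper = price + self.width

    def update(self, price):
        """Return True if the agent switched position at this price."""
        if price <= self.lower or price >= self.upper:
            self.position = -self.position
            self._reset(price)
            return True
        return False

agent = ThresholdAgent(price=100.0, width=2.0)
print(agent.update(101.0))  # inside the band: no switch
print(agent.update(102.5))  # crossed the upper threshold: switch
print(agent.position)       # now short
```

Different rules for regenerating the thresholds would then mimic the different investor motivations (rational, incentive-driven, psychological) the abstract describes.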

  12. Social laughter is correlated with an elevated pain threshold.

    Science.gov (United States)

    Dunbar, R I M; Baron, Rebecca; Frangou, Anna; Pearce, Eiluned; van Leeuwen, Edwin J C; Stow, Julie; Partridge, Giselle; MacDonald, Ian; Barra, Vincent; van Vugt, Mark

    2012-03-22

    Although laughter forms an important part of human non-verbal communication, it has received rather less attention than it deserves in both the experimental and the observational literatures. Relaxed social (Duchenne) laughter is associated with feelings of wellbeing and heightened affect, a proximate explanation for which might be the release of endorphins. We tested this hypothesis in a series of six experimental studies in both the laboratory (watching videos) and naturalistic contexts (watching stage performances), using change in pain threshold as an assay for endorphin release. The results show that pain thresholds are significantly higher after laughter than in the control condition. This pain-tolerance effect is due to laughter itself and not simply due to a change in positive affect. We suggest that laughter, through an endorphin-mediated opiate effect, may play a crucial role in social bonding.

  13. Social laughter is correlated with an elevated pain threshold

    Science.gov (United States)

    Dunbar, R. I. M.; Baron, Rebecca; Frangou, Anna; Pearce, Eiluned; van Leeuwen, Edwin J. C.; Stow, Julie; Partridge, Giselle; MacDonald, Ian; Barra, Vincent; van Vugt, Mark

    2012-01-01

    Although laughter forms an important part of human non-verbal communication, it has received rather less attention than it deserves in both the experimental and the observational literatures. Relaxed social (Duchenne) laughter is associated with feelings of wellbeing and heightened affect, a proximate explanation for which might be the release of endorphins. We tested this hypothesis in a series of six experimental studies in both the laboratory (watching videos) and naturalistic contexts (watching stage performances), using change in pain threshold as an assay for endorphin release. The results show that pain thresholds are significantly higher after laughter than in the control condition. This pain-tolerance effect is due to laughter itself and not simply due to a change in positive affect. We suggest that laughter, through an endorphin-mediated opiate effect, may play a crucial role in social bonding. PMID:21920973

  14. Implications of the Bohm-Aharonov hypothesis

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Rimini, A.; Weber, T.

    1976-01-01

    It is proved that the Bohm-Aharonov hypothesis concerning largely separated subsystems of composite quantum systems implies that it is impossible to express the dynamical evolution in terms of the density operator

  15. Sleep memory processing: the sequential hypothesis

    OpenAIRE

    Giuditta, Antonio

    2014-01-01

    According to the sequential hypothesis (SH) memories acquired during wakefulness are processed during sleep in two serial steps respectively occurring during slow wave sleep (SWS) and rapid eye movement (REM) sleep. During SWS memories to be retained are distinguished from irrelevant or competing traces that undergo downgrading or elimination. Processed memories are stored again during REM sleep which integrates them with preexisting memories. The hypothesis received support from a wealth of ...

  16. Why Does REM Sleep Occur? A Wake-up Hypothesis

    Directory of Open Access Journals (Sweden)

    W. R. Klemm

    2011-09-01

    Brain activity differs in the various sleep stages and in conscious wakefulness. Awakening from sleep requires restoration of the complex nerve impulse patterns in neuronal network assemblies necessary to re-create and sustain conscious wakefulness. Herein I propose that the brain uses REM to help wake itself up after it has had a sufficient amount of sleep. Evidence suggesting this hypothesis includes the facts that (1) when first going to sleep, the brain plunges into Stage N3 (formerly called Stage IV), a deep abyss of sleep, and, as the night progresses, the sleep is punctuated by episodes of REM that become longer and more frequent toward morning, (2) conscious-like dreams are a reliable component of the REM state in which the dreamer is an active mental observer or agent in the dream, (3) the last awakening during a night's sleep usually occurs in a REM episode during or at the end of a dream, (4) both REM and awake consciousness seem to arise out of a similar brainstem ascending arousal system, (5) N3 is a functionally perturbed state that eventually must be corrected so that the embodied brain can direct adaptive behavior, and (6) corticofugal projections to brainstem arousal areas provide a way to trigger increased cortical activity in REM to progressively raise the sleeping brain to the threshold required for wakefulness. This paper shows how the hypothesis conforms to common experience and has substantial predictive and explanatory power regarding the phenomenology of sleep in terms of ontogeny, aging, phylogeny, abnormal/disease states, cognition, and behavioral physiology. That broad range of consistency is not matched by competing theories, which are summarized herein. Specific ways to test this wake-up hypothesis are suggested. Such research could lead to a better understanding of awake consciousness.

  17. Why does rem sleep occur? A wake-up hypothesis.

    Science.gov (United States)

    Klemm, W R

    2011-01-01

    Brain activity differs in the various sleep stages and in conscious wakefulness. Awakening from sleep requires restoration of the complex nerve impulse patterns in neuronal network assemblies necessary to re-create and sustain conscious wakefulness. Herein I propose that the brain uses rapid eye movement (REM) to help wake itself up after it has had a sufficient amount of sleep. Evidence suggesting this hypothesis includes the facts that, (1) when first going to sleep, the brain plunges into Stage N3 (formerly called Stage IV), a deep abyss of sleep, and, as the night progresses, the sleep is punctuated by episodes of REM that become longer and more frequent toward morning, (2) conscious-like dreams are a reliable component of the REM state in which the dreamer is an active mental observer or agent in the dream, (3) the last awakening during a night's sleep usually occurs in a REM episode during or at the end of a dream, (4) both REM and awake consciousness seem to arise out of a similar brainstem ascending arousal system (5) N3 is a functionally perturbed state that eventually must be corrected so that embodied brain can direct adaptive behavior, and (6) cortico-fugal projections to brainstem arousal areas provide a way to trigger increased cortical activity in REM to progressively raise the sleeping brain to the threshold required for wakefulness. This paper shows how the hypothesis conforms to common experience and has substantial predictive and explanatory power regarding the phenomenology of sleep in terms of ontogeny, aging, phylogeny, abnormal/disease states, cognition, and behavioral physiology. That broad range of consistency is not matched by competing theories, which are summarized herein. Specific ways to test this wake-up hypothesis are suggested. Such research could lead to a better understanding of awake consciousness.

  18. Voting on Thresholds for Public Goods

    DEFF Research Database (Denmark)

    Rauchdobler, Julian; Sausgruber, Rupert; Tyran, Jean-Robert

    Introducing a threshold in the sense of a minimal project size transforms a public goods game with an inefficient equilibrium into a coordination game with a set of Pareto-superior equilibria. Thresholds may therefore improve efficiency in the voluntary provision of public goods. In our one-shot experiment, we find that coordination often fails and exogenously imposed thresholds are ineffective at best and often counter-productive. This holds under a range of threshold levels and refund rates. We test if thresholds perform better if they are endogenously chosen, i.e. if a threshold is approved

  19. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  20. Adiabatic theory of Wannier threshold laws and ionization cross sections

    International Nuclear Information System (INIS)

    Macek, J.H.; Ovchinnikov, S.Y.

    1994-01-01

    Adiabatic energy eigenvalues of H2+ are computed for complex values of the internuclear distance R. The infinite number of bound-state eigenenergies are represented by a function ε(R) that is single valued on a multisheeted Riemann surface. A region is found where ε(R) and the corresponding eigenfunctions exhibit harmonic-oscillator structure characteristic of electron motion on a potential saddle. The Schroedinger equation is solved in the adiabatic approximation along a path in the complex R plane to compute ionization cross sections. The cross section thus obtained joins the Wannier threshold region with the keV energy region, but the exponent near the ionization threshold disagrees with well-accepted values. Accepted values are obtained when a lowest-order diabatic correction is employed, indicating that adiabatic approximations do not give the correct zero velocity limit for ionization cross sections. Semiclassical eigenvalues for general top-of-barrier motion are given and the theory is applied to the ionization of atomic hydrogen by electron impact. The theory with a first diabatic correction gives the Wannier threshold law even for this case

  1. Rejection thresholds in solid chocolate-flavored compound coating.

    Science.gov (United States)

    Harwood, Meriel L; Ziegler, Gregory R; Hayes, John E

    2012-10-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers compared to melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate, a bitter and generally recognized as safe additive. Paired preference tests (blank compared to spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between 2 self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (P= 0.01). Conversely, eating style did not affect group rejection thresholds (P= 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (P= 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. This work makes use of the rejection threshold method to study market segmentation, extending its use to solid foods. We believe this method has broad applicability to the sensory specialist and product developer by providing a
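A group rejection threshold of the kind described can be estimated from paired-preference data by interpolating where the proportion of panelists preferring the blank crosses a criterion. This is a sketch only: the 0.75 criterion and the dose-response values below are hypothetical, not from the study.

```python
def rejection_threshold(concs, prop_blank, criterion=0.75):
    """Estimate a group rejection threshold by linear interpolation:
    the spike concentration at which the proportion of panelists
    preferring the unspiked (blank) sample crosses `criterion`.
    The 0.75 criterion is a hypothetical choice for illustration."""
    for k in range(len(concs) - 1):
        c0, p0 = concs[k], prop_blank[k]
        c1, p1 = concs[k + 1], prop_blank[k + 1]
        if p0 < criterion <= p1:
            # linear interpolation between the bracketing concentrations
            return c0 + (criterion - p0) * (c1 - c0) / (p1 - p0)
    return None  # criterion never crossed: no rejection within the range

# Hypothetical dose-response: preference for the blank rises from chance
# (0.5) as the concentration of the bitter spike increases.
concs = [0.1, 0.3, 1.0, 3.0, 10.0]
prop_blank = [0.50, 0.55, 0.65, 0.80, 0.95]
print(rejection_threshold(concs, prop_blank))
```

Comparing thresholds estimated this way for two panels (e.g. milk versus dark chocolate preferrers) would mirror the market-segment comparison in the abstract.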

  2. Waste acceptance and logistics

    International Nuclear Information System (INIS)

    Carlson, James H.

    1992-01-01

    There are three major components which are normally highlighted when the Civilian Radioactive Waste Management Program is discussed - the repository, the monitored retrievable storage facility, and the transportation system. These are clearly the major physical system elements and they receive the greatest external attention. However, there will not be a successful, operative waste management system without fully operational waste acceptance plans and logistics arrangements. This paper will discuss the importance of developing, on a parallel basis to the normally considered waste management system elements, the waste acceptance and logistics arrangements to enable the timely transfer of spent nuclear fuel from more than one hundred and twenty waste generators to the Federal government. The paper will also describe the specific activities the Program has underway to make the necessary arrangements. (author)

  3. Environment and public acceptance

    International Nuclear Information System (INIS)

    Gauvenet; Bresson; Braillard; Ertaud; Ladonchamps, de; Toureau

    1976-01-01

    The problems involved in siting nuclear power stations at the local level are of a political, economic, social or ecological order. The acceptance of a nuclear station depends largely on its benefits to the local population. In order to avoid negative reactions, those responsible must make the harmonious integration of the station within the existing economic and social context their first priority [fr

  4. Marketing for Acceptance

    OpenAIRE

    Tina L. Johnston, Ph.D.

    2009-01-01

    Becoming a researcher comes with the credentializing pressure to publish articles in peer-reviewed journals (Glaser, 1992; Glaser, 2007; Glaser, 2008). The work-intensive process is exacerbated when the author’s research method is grounded theory. This study investigated the concerns of early and experienced grounded theorists to discover how they worked towards publishing research projects that applied grounded theory as a methodology. The result was a grounded theory of marketing for accept...

  5. Nuclear power and acceptation

    International Nuclear Information System (INIS)

    Speelman, J.E.

    1990-01-01

    In 1989 a workshop was held, organized by the IAEA and the Argonne National Laboratory. Its purpose was to investigate under which circumstances a large-scale expansion of nuclear power can be accepted. Besides the important technical information, care for the environment set the tone of the workshop. The dominant opinion was that nuclear power can contribute to tackling environmental problems, but that the social and political climate makes this almost impossible. (author). 7 refs.; 1 fig.; 1 tab

  6. An efficient coding hypothesis links sparsity and selectivity of neural responses.

    Directory of Open Access Journals (Sweden)

    Florian Blättler

    To what extent are sensory responses in the brain compatible with first-order principles? The efficient coding hypothesis posits that neurons use as few spikes as possible to faithfully represent natural stimuli. However, many sparsely firing neurons in higher brain areas seem to violate this hypothesis in that they respond more to familiar stimuli than to nonfamiliar stimuli. We reconcile this discrepancy by showing that efficient sensory responses give rise to stimulus selectivity that depends on the stimulus-independent firing threshold and the balance between excitatory and inhibitory inputs. We construct a cost function that enforces minimal firing rates in model neurons by linearly punishing suprathreshold synaptic currents. By contrast, subthreshold currents are punished quadratically, which allows us to optimally reconstruct sensory inputs from elicited responses. We train synaptic currents on many renditions of a particular bird's own song (BOS) and few renditions of conspecific birds' songs (CONs). During training, model neurons develop a response selectivity with complex dependence on the firing threshold. At low thresholds, they fire densely and prefer CON and the reverse BOS (REV) over BOS. However, at high thresholds or when hyperpolarized, they fire sparsely and prefer BOS over REV and over CON. Based on this selectivity reversal, our model suggests that preference for a highly familiar stimulus corresponds to a high-threshold or strong-inhibition regime of an efficient coding strategy. Our findings apply to songbird mirror neurons, and in general, they suggest that the brain may be endowed with simple mechanisms to rapidly change selectivity of neural responses to focus sensory processing on either familiar or nonfamiliar stimuli. In summary, we find support for the efficient coding hypothesis and provide new insights into the interplay between the sparsity and selectivity of neural responses.
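The penalty structure described, linear above the firing threshold and quadratic below, can be read as a piecewise cost on synaptic currents. The exact functional form below is a guess for illustration only; the cost function in the study is more elaborate.

```python
def coding_cost(current, threshold=1.0):
    """Piecewise penalty on a synaptic current, as one reading of the
    abstract: suprathreshold currents (which elicit spikes) are punished
    linearly to enforce sparse firing, while subthreshold currents are
    punished quadratically.  The exact form is an assumption."""
    if current > threshold:
        return current - threshold  # linear suprathreshold penalty
    return current ** 2             # quadratic subthreshold penalty

print(coding_cost(3.0))  # suprathreshold current, penalized linearly
print(coding_cost(0.5))  # subthreshold current, penalized quadratically
```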

  7. Order acceptance with reinforcement learning

    NARCIS (Netherlands)

    Mainegra Hing, M.; van Harten, Aart; Schuur, Peter

    2001-01-01

    Order Acceptance (OA) is one of the main functions in a business control framework. Basically, OA involves for each order a 0/1 (i.e., reject/accept) decision. Always accepting an order when capacity is available could leave the system unable to accept more convenient orders in the future. Another

  8. Age and Acceptance of Euthanasia.

    Science.gov (United States)

    Ward, Russell A.

    1980-01-01

    Study explores relationship between age (and sex and race) and acceptance of euthanasia. Women and non-Whites were less accepting because of religiosity. Among older people less acceptance was attributable to their lesser education and greater religiosity. Results suggest that quality of life in old age affects acceptability of euthanasia. (Author)

  9. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...
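
    A toy version of this trade-off can be written down directly with Gaussian sensor noise. The equal split of the false-positive budget across sensors is an assumed allocation for illustration, not the optimal one developed in the thesis; all names are hypothetical.

    ```python
    from statistics import NormalDist

    def set_thresholds(sigmas, mus, total_fp_budget):
        """Split a total false-positive budget equally across sensors (assumed
        allocation), set each threshold from the Gaussian noise quantile, and
        return the thresholds plus the probability that at least one sensor
        detects an event with per-sensor signal means `mus`."""
        n = NormalDist()
        fp_each = total_fp_budget / len(sigmas)
        thresholds = [s * n.inv_cdf(1.0 - fp_each) for s in sigmas]
        miss = 1.0
        for t, s, m in zip(thresholds, sigmas, mus):
            miss *= n.cdf((t - m) / s)  # probability this sensor misses
        return thresholds, 1.0 - miss
    ```

    Tightening the budget raises every threshold and lowers the system-level detection probability, which is the tension the optimization addresses.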

  10. Nuclear thermodynamics below particle threshold

    International Nuclear Information System (INIS)

    Schiller, A.; Agvaanluvsan, U.; Algin, E.; Bagheri, A.; Chankova, R.; Guttormsen, M.; Hjorth-Jensen, M.; Rekstad, J.; Siem, S.; Sunde, A. C.; Voinov, A.

    2005-01-01

    From a starting point of experimentally measured nuclear level densities, we discuss thermodynamical properties of nuclei below the particle emission threshold. Since nuclei are essentially mesoscopic systems, a straightforward generalization of macroscopic ensemble theory often yields unphysical results. A careful critique of traditional thermodynamical concepts reveals problems commonly encountered in mesoscopic systems. One is the fact that microcanonical and canonical ensemble theory yield different results; another concerns the introduction of temperature for small, closed systems. Finally, the concept of phase transitions is investigated for mesoscopic systems.

  11. [Acceptance and commitment therapy].

    Science.gov (United States)

    Ducasse, D; Fond, G

    2015-02-01

    Acceptance and commitment therapy (ACT) is a third-generation cognitive-behavioral therapy. The point is to help patients to improve their psychological flexibility in order to accept unavoidable private events. Thus, they have the opportunity to invest energy in committed actions rather than struggle against their psychological events. (i) To present the ACT basic concepts and (ii) to propose a systematic review of the literature about the effectiveness of this kind of psychotherapy. (i) The core concepts of ACT come from Monestès (2011), Schoendorff (2011), and Harris (2012); (ii) we conducted a systematic review of the literature using the PRISMA criteria. The research paradigm was "acceptance and commitment therapy AND randomized controlled trial". The MEDLINE, Cochrane and Web of Science databases were searched. Overall, 61 articles were found, of which, after reading the abstracts, 40 corresponded to the subject of our study. (i) Psychological flexibility is established through six core ACT processes (cognitive defusion, acceptance, being present, values, committed action, self as context), while the therapist emphasizes an experiential approach. (ii) Emerging research shows that ACT is efficacious in the psychological treatment of a wide range of psychiatric problems, including psychosis, depression, obsessive-compulsive disorder, trichotillomania, generalized anxiety disorder, post-traumatic stress disorder, borderline personality disorder, and eating disorders. ACT has also shown utility in other areas of medicine: the management of chronic pain, drug dependence, smoking cessation, epilepsy, diabetic self-management, work stress, tinnitus, and multiple sclerosis. Meta-analysis of controlled outcome studies reported an average effect size (Cohen's d) of 0.66 at post-treatment (n=704) and 0.65 (n=580) at follow-up (on average 19.2 weeks later). In studies involving

  12. Multiple hypothesis tracking for the cyber domain

    Science.gov (United States)

    Schwoegler, Stefan; Blackman, Sam; Holsopple, Jared; Hirsch, Michael J.

    2011-09-01

    This paper discusses how methods used for conventional multiple hypothesis tracking (MHT) can be extended to domain-agnostic tracking of entities from non-kinematic constraints such as those imposed by cyber attacks in a potentially dense false alarm background. MHT is widely recognized as the premier method to avoid corrupting tracks with spurious data in the kinematic domain, but it has not been extensively applied to other problem domains. The traditional approach is to tightly couple track maintenance (prediction, gating, filtering, probabilistic pruning, and target confirmation) with hypothesis management (clustering, incompatibility maintenance, hypothesis formation, and N-scan pruning). However, by separating the domain-specific track maintenance portion from the domain-agnostic hypothesis management piece, we can begin to apply the wealth of knowledge gained from ground and air tracking solutions to the cyber (and other) domains. These realizations led to the creation of Raytheon's Multiple Hypothesis Extensible Tracking Architecture (MHETA). In this paper, we showcase MHETA for the cyber domain, plugging in a well-established method, CUBRC's INFormation Engine for Real-time Decision making (INFERD), for the association portion of the MHT. The result is a CyberMHT. We demonstrate the power of MHETA-INFERD using simulated data. Using metrics from both the tracking and cyber domains, we show that while no tracker is perfect, by applying MHETA-INFERD, advanced non-kinematic tracks can be captured in an automated way, perform better than non-MHT approaches, and decrease analyst response time to cyber threats.
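
    As a flavor of the track-maintenance side that such architectures keep domain-specific, here is a minimal chi-square gating step; the 9.21 gate corresponds to roughly 99% containment for 2 degrees of freedom. All names are illustrative, not MHETA's actual API.

    ```python
    import numpy as np

    def gate(track_mean, track_cov, measurements, gate_threshold=9.21):
        """Keep only measurements whose squared Mahalanobis distance to the
        track's predicted state falls inside the gate; everything else is
        excluded from association hypotheses."""
        cov_inv = np.linalg.inv(track_cov)
        kept = []
        for z in measurements:
            d = z - track_mean
            if float(d @ cov_inv @ d) <= gate_threshold:
                kept.append(z)
        return kept
    ```

    Gating before hypothesis formation is what keeps the number of association hypotheses tractable in a dense false-alarm background.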

  13. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

    Full Text Available Abstract Background Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting the premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  14. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  15. Color-discrimination threshold determination using pseudoisochromatic test plates

    Directory of Open Access Journals (Sweden)

    Kaiva Jurasevska

    2014-11-01

    Full Text Available We produced a set of pseudoisochromatic plates for determining individual color-difference thresholds to assess test performance and test properties, and analyzed the results. We report a high test validity and classification ability for the deficiency type and severity level (comparable to that of the fourth edition of the Hardy–Rand–Rittler (HRR) test). We discuss changes of the acceptable chromatic shifts from the protan and deutan confusion lines along the CIE xy diagram, and the high correlation of individual color-difference thresholds and the red–green discrimination index. Color vision was tested using an Oculus HMC anomaloscope, a Farnsworth D15, and an HRR test on 273 schoolchildren, and 57 other subjects with previously diagnosed red–green color-vision deficiency.

  16. Color-discrimination threshold determination using pseudoisochromatic test plates

    Science.gov (United States)

    Jurasevska, Kaiva; Ozolinsh, Maris; Fomins, Sergejs; Gutmane, Ausma; Zutere, Brigita; Pausus, Anete; Karitans, Varis

    2014-01-01

    We produced a set of pseudoisochromatic plates for determining individual color-difference thresholds to assess test performance and test properties, and analyzed the results. We report a high test validity and classification ability for the deficiency type and severity level [comparable to that of the fourth edition of the Hardy–Rand–Rittler (HRR) test]. We discuss changes of the acceptable chromatic shifts from the protan and deutan confusion lines along the CIE xy diagram, and the high correlation of individual color-difference thresholds and the red–green discrimination index. Color vision was tested using an Oculus HMC anomaloscope, a Farnsworth D15, and an HRR test on 273 schoolchildren, and 57 other subjects with previously diagnosed red–green color-vision deficiency. PMID:25505891

  17. Photoproduction of the φ(1020) near threshold in CLAS

    International Nuclear Information System (INIS)

    Tedeschi, D.J.

    2002-01-01

    The differential cross section for the photoproduction of the φ(1020) near threshold (Eγ = 1.57 GeV) is predicted to be sensitive to production mechanisms other than diffraction. However, the existing low-energy data are of limited statistics and kinematical coverage. Complete measurements of φ meson production on the proton have been performed at the Thomas Jefferson National Accelerator Facility using a liquid hydrogen target and the CEBAF Large Acceptance Spectrometer (CLAS). The φ was identified by missing mass using a proton and a positive kaon detected by CLAS in coincidence with an electron in the photon tagger. The energy of the tagged bremsstrahlung photons ranged from φ threshold to 2.4 GeV. A description of the data set and the differential cross section at Eγ = 2.0 GeV will be presented and compared with present theoretical calculations. (author)

  18. Dantrolene Reduces the Threshold and Gain for Shivering

    Science.gov (United States)

    Lin, Chun-Ming; Neeru, Sharma; Doufas, Anthony G.; Liem, Edwin; Shah, Yunus Muneer; Wadhwa, Anupama; Lenhardt, Rainer; Bjorksten, Andrew; Kurz, Andrea

    2005-01-01

    Dantrolene is used for treatment of life-threatening hyperthermia, yet its thermoregulatory effects are unknown. We tested the hypothesis that dantrolene reduces the threshold (triggering core temperature) and gain (incremental increase) of shivering. With IRB approval and informed consent, healthy volunteers were evaluated on two random days: control and dantrolene (≈2.5 mg/kg plus a continuous infusion). In study 1, 9 men were warmed until sweating was provoked and then cooled until arterio-venous shunt constriction and shivering occurred. Sweating was quantified on the chest using a ventilated capsule. Absolute right middle fingertip blood flow was quantified using venous-occlusion volume plethysmography. A sustained increase in oxygen consumption identified the shivering threshold. In study 2, 9 men were given cold Ringer's solution IV to reduce core temperature ≈2°C/h. Cooling was stopped when shivering intensity no longer increased with further core cooling. The gain of shivering was the slope of oxygen consumption vs. core temperature regression. In Study 1, sweating and vasoconstriction thresholds were similar on both days. In contrast, shivering threshold decreased 0.3±0.3°C, P=0.004, on the dantrolene day. In Study 2, dantrolene decreased the shivering threshold from 36.7±0.2 to 36.3±0.3°C, P=0.01 and systemic gain from 353±144 to 211±93 ml·min⁻¹·°C⁻¹, P=0.02. Thus, dantrolene substantially decreased the gain of shivering, but produced little central thermoregulatory inhibition. PMID:15105208
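
    The gain definition used in Study 2 is just the slope of a least-squares regression of oxygen consumption on core temperature; a minimal sketch with hypothetical data, sign-flipped so that gain is positive when VO2 rises during cooling:

    ```python
    from statistics import linear_regression  # Python 3.10+

    def shivering_gain(core_temp_c, vo2_ml_min):
        """Slope of the oxygen-consumption vs. core-temperature regression,
        negated so a positive gain means more VO2 per degree of cooling."""
        slope, _intercept = linear_regression(core_temp_c, vo2_ml_min)
        return -slope
    ```

    With the hypothetical samples below, VO2 rises 200 ml·min⁻¹ per °C of cooling, so the gain is 200.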

  19. A PC microsimulation of a gap acceptance model for turning left at a T-junction

    NARCIS (Netherlands)

    Schaap, Nina; Dijck, T.; van Arem, Bart; Morsink, Peter L.J.

    2009-01-01

    In a microsimulation model, vehicles are controlled by sub-behavioral models; these include the gap acceptance model, where the decision about how to cross a junction is made. The critical gap in these models must serve as a threshold value to accept or reject the space between two successive vehicles
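
    The threshold rule described above can be sketched as follows; this is a deterministic simplification for illustration, whereas real gap-acceptance models often use stochastic critical gaps.

    ```python
    def first_accepted_gap(arrival_times_s, critical_gap_s):
        """Scan the gaps between successive major-road vehicles and return the
        index of the first gap that meets or exceeds the critical gap, or
        None if every gap is rejected."""
        for i in range(len(arrival_times_s) - 1):
            if arrival_times_s[i + 1] - arrival_times_s[i] >= critical_gap_s:
                return i
        return None
    ```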

  20. Threshold enhancement of diphoton resonances

    CERN Document Server

    Bharucha, Aoife; Goudelis, Andreas

    2016-10-10

    The data collected by the LHC collaborations at an energy of 13 TeV indicates the presence of an excess in the diphoton spectrum that would correspond to a resonance of a 750 GeV mass. The apparently large production cross section is nevertheless very difficult to explain in minimal models. We consider the possibility that the resonance is a pseudoscalar boson $A$ with a two--photon decay mediated by a charged and uncolored fermion having a mass at the $\\frac12 M_A$ threshold and a very small decay width, $\\ll 1$ MeV; one can then generate a large enhancement of the $A\\gamma\\gamma$ amplitude which explains the excess without invoking a large multiplicity of particles propagating in the loop, large electric charges and/or very strong Yukawa couplings. The implications of such a threshold enhancement are discussed in two explicit scenarios: i) the Minimal Supersymmetric Standard Model in which the $A$ state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through...

  1. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based on mass surveys in eighteen Latin American countries throughout 2004–2012. We find that citizens that report bribe attempts from bureaucrats are always more likely to report presidential disapproval than citizens that report no such attempts; that is, Latin American victims of corruption are not duped by good economic performance. However, we find some evidence for a weaker form of the trade-off hypothesis: presidential disapproval among corruption victims might be more pronounced in contexts of high inflation and high unemployment.

  2. Marketing for Acceptance

    Directory of Open Access Journals (Sweden)

    Tina L. Johnston, Ph.D.

    2009-11-01

    Full Text Available Becoming a researcher comes with the credentializing pressure to publish articles in peer-reviewed journals (Glaser, 1992; Glaser, 2007; Glaser, 2008). The work-intensive process is exacerbated when the author’s research method is grounded theory. This study investigated the concerns of early and experienced grounded theorists to discover how they worked towards publishing research projects that applied grounded theory as a methodology. The result was a grounded theory of marketing for acceptance that provides the reader with insight into ways that classic grounded theorists have published their works. This is followed by a discussion of ideas for normalizing classic grounded theory research methods in our substantive fields.

  3. Acceptance, Tolerance, Participation

    International Nuclear Information System (INIS)

    1993-01-01

    The problem of radioactive waste management from an ethical and societal viewpoint was treated in this seminar, which had participants from universities (social, theological, philosophical and science institutes), waste management industry, and regulatory and controlling authorities. After initial reviews on repository technology, policies and schedules, knowledge gaps, and ethical aspects on decision making under uncertainty, four subjects were treated in lectures and discussions: Democratic collective responsibility, Handling threats in democratic decision making, Waste management - a technological operation with a social dimension, Acceptance and legitimity. Lectures with comments and discussions are collected in this report

  4. Baby-Crying Acceptance

    Science.gov (United States)

    Martins, Tiago; de Magalhães, Sérgio Tenreiro

    A baby's crying is his most important means of communication. The cry monitoring performed by existing devices does not ensure the complete safety of the child. It is necessary to join to these technological resources a means of communicating the results to the person responsible, which would involve the digital processing of the information available from crying. The survey carried out made it possible to understand the level of adoption, in the continental territory of Portugal, of a technology able to perform such digital processing. The Technology Acceptance Model (TAM) was used as the theoretical framework. The statistical analysis showed that there is a good probability of acceptance of such a system.

  5. Multi-hypothesis distributed stereo video coding

    DEFF Research Database (Denmark)

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    for stereo sequences, exploiting an interpolated intra-view SI and two inter-view SIs. The quality of the SI has a major impact on the DVC Rate-Distortion (RD) performance. As the inter-view SIs individually present lower RD performance compared with the intra-view SI, we propose multi-hypothesis decoding for robust fusion and improved performance. Compared with a state-of-the-art single side information solution, the proposed DVC decoder improves the RD performance for all the chosen test sequences by up to 0.8 dB. The proposed multi-hypothesis decoder showed higher robustness compared with other fusion

  6. Ready for Retirement: The Gateway Drug Hypothesis.

    Science.gov (United States)

    Kleinig, John

    2015-01-01

    The psycho-social observation that the use of some psychoactive substances ("drugs") is often followed by the use of other and more problematic drugs has given rise to a cluster of so-called "gateway drug hypotheses," and such hypotheses have often played an important role in developing drug use policy. The current essay suggests that drug use policies that have drawn on versions of the hypothesis have involved an unjustified oversimplification of the dynamics of drug use, reflecting the interests of certain stakeholders rather than wise social policy. The hypothesis should be retired.

  7. [Experimental testing of Pflüger's reflex hypothesis of menstruation in late 19th century].

    Science.gov (United States)

    Simmer, H H

    1980-07-01

    Pflüger's hypothesis of a nerve reflex as the cause of menstruation, published in 1865 and accepted by many, nonetheless did not lead to experimental investigations for 25 years. According to this hypothesis, the nerve reflex starts in the ovary with an increase of the intraovarian pressure by the growing follicles. In 1884 Adolph Kehrer proposed a program to test the nerve reflex, but only in 1890 did Cohnstein artificially increase the intraovarian pressure in women by bimanual compression from the outside and the vagina. His results were not convincing. Six years later, Strassmann injected fluids into ovaries of animals and obtained changes in the uterus resembling those of oestrus. His results seemed to verify a prediction derived from Pflüger's hypothesis. Thus, after a long interval, that hypothesis had become a paradigm. Though reasons can be given for the delay, it is little understood why experimental testing started so late.

  8. On the Pathogenesis of Alzheimer's Disease: The MAM Hypothesis.

    Science.gov (United States)

    Area-Gomez, Estela; Schon, Eric A

    2017-03-01

    The pathogenesis of Alzheimer's disease (AD) is currently unclear and is the subject of much debate. The most widely accepted hypothesis designed to explain AD pathogenesis is the amyloid cascade, which invokes the accumulation of extracellular plaques and intracellular tangles as playing a fundamental role in the course and progression of the disease. However, besides plaques and tangles, other biochemical and morphological features are also present in AD, often manifesting early in the course of the disease before the accumulation of plaques and tangles. These include altered calcium, cholesterol, and phospholipid metabolism; altered mitochondrial dynamics; and reduced bioenergetic function. Notably, these other features of AD are associated with functions localized to a subdomain of the endoplasmic reticulum (ER), known as mitochondria-associated ER membranes (MAMs). The MAM region of the ER is a lipid raft-like domain closely apposed to mitochondria in such a way that the 2 organelles are able to communicate with each other, both physically and biochemically, thereby facilitating the functions of this region. We have found that MAM-localized functions are increased significantly in cellular and animal models of AD and in cells from patients with AD in a manner consistent with the biochemical findings noted above. Based on these and other observations, we propose that increased ER-mitochondrial apposition and perturbed MAM function lie at the heart of AD pathogenesis.-Area-Gomez, E., Schon, E. A. On the pathogenesis of Alzheimer's disease: the MAM hypothesis. © FASEB.

  9. The delphic oracle and the ethylene-intoxication hypothesis.

    Science.gov (United States)

    Foster, J; Lehoux, D

    2007-01-01

    An interdisciplinary team of scientists--including an archeologist, a geologist, a chemist, and a toxicologist--has argued that ethylene intoxication was the probable cause of the High Priestess of Delphi's divinatory (mantic) trances. The claim that the High Priestess of Delphi entered a mantic state because of ethylene intoxication enjoyed widespread reception in specialist academic journals, science magazines, and newspapers. This article uses a similar interdisciplinary approach to show that this hypothesis is implausible since it is based on problematic scientific and textual evidence, as well as a fallacious argument. The main issue raised by this counterargument is not that a particular scientific hypothesis or conjecture turned out to be false. (This is expected in scientific investigation.) Rather, the main issue is that it was a positivist disposition that originally led readers to associate the evidence presented in such a way that it seemed to point to the conclusion, even when the evidence did not support the conclusion. We conclude by observing that positivist dispositions can lead to the acceptance of claims because they have a scientific form, not because they are grounded in robust evidence and sound argument.

  10. Immigration, political trust, and Brexit - Testing an aversion amplification hypothesis.

    Science.gov (United States)

    Abrams, Dominic; Travaglino, Giovanni A

    2018-04-01

    A few weeks prior to the EU referendum (23rd June 2016) two broadly representative samples of the electorate were drawn in Kent (the south-east of England, N = 1,001) and Scotland (N = 1,088) for online surveys that measured their trust in politicians, concerns about acceptable levels of immigration, threat from immigration, European identification, and voting intention. We tested an aversion amplification hypothesis that the impact of immigration concerns on threat and identification would be amplified when political trust was low. We hypothesized that the effect of aversion amplification on voting intentions would be mediated first by perceived threat from immigration, and then by (dis)identification with Europe. Results in both samples were consistent with this hypothesis and suggest that voters were most likely to reject the political status quo (choose Brexit) when concerns that immigration levels were too high were combined with a low level of trust in politicians. © 2018 The Authors. British Journal of Social Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  11. An integrative perspective of the anaerobic threshold.

    Science.gov (United States)

    Sales, Marcelo Magalhães; Sousa, Caio Victor; da Silva Aguiar, Samuel; Knechtle, Beat; Nikolaidis, Pantelis Theodoros; Alves, Polissandro Mortoza; Simões, Herbert Gustavo

    2017-12-14

    The concept of the anaerobic threshold (AT) was introduced during the 1960s. Since then, several methods to identify the AT have been studied and suggested as novel 'thresholds' named after the variable used for its detection (i.e. lactate threshold, ventilatory threshold, glucose threshold). These different techniques have brought some confusion about how this parameter should be named: anaerobic threshold, or after the physiological measure used (i.e. lactate, ventilation). On the other hand, the modernization of scientific methods and apparatus to detect the AT, as well as the body of literature formed in the past decades, could provide a more cohesive understanding of the AT and the multiple physiological systems involved. Thus, the purpose of this review was to provide an integrative perspective of the methods to determine the AT. Copyright © 2017 Elsevier Inc. All rights reserved.
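
    As a concrete example of one family of detection methods the review covers, a simple fixed-rise lactate-threshold heuristic might look like this; the rule and the 1.0 mmol/L delta are assumptions for illustration, not a specific published protocol.

    ```python
    def lactate_threshold(intensities, lactate_mmol, delta=1.0):
        """Return the first workload at which blood lactate rises more than
        `delta` mmol/L above the initial baseline value, or None if the
        rise never occurs within the tested range."""
        baseline = lactate_mmol[0]
        for workload, lactate in zip(intensities, lactate_mmol):
            if lactate > baseline + delta:
                return workload
        return None
    ```

    Ventilatory and glucose thresholds are detected analogously from different measured variables, which is precisely the naming confusion the review discusses.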

  12. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion for decision-makers to accept or reject the computational model.
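
    The core comparison, a Bayes factor weighing the "model is valid" hypothesis against a diffuse alternative, can be sketched as below. The Gaussian likelihoods and the `prior_spread` parameter are simplifying assumptions for illustration, not the paper's full treatment of statistical uncertainty.

    ```python
    from statistics import NormalDist

    def bayes_factor(observed, predicted, sigma, prior_spread=10.0):
        """Likelihood of the observation if the model prediction is the true
        mean, divided by its marginal likelihood under a diffuse Gaussian
        prior on the true mean (conjugate marginal with inflated variance).
        Values above 1 favour accepting the model."""
        null_like = NormalDist(predicted, sigma).pdf(observed)
        alt_sd = (sigma ** 2 + prior_spread ** 2) ** 0.5
        alt_like = NormalDist(predicted, alt_sd).pdf(observed)
        return null_like / alt_like
    ```

    An observation close to the prediction yields a Bayes factor above 1 (accept), while a large discrepancy drives it toward 0 (reject).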

  13. Conditional acceptability of random variables

    Directory of Open Access Journals (Sweden)

    Tasos C Christofides

    2016-06-01

    Full Text Available Abstract Acceptable random variables introduced by Giuliano Antonini et al. (J. Math. Anal. Appl. 338:1188-1203, 2008) form a class of dependent random variables that contains negatively dependent random variables as a particular case. The concept of acceptability has been studied by authors under various versions of the definition, such as extended acceptability or wide acceptability. In this paper, we combine the concept of acceptability with the concept of conditioning, which has been the subject of current research activity. For conditionally acceptable random variables, we provide a number of probability inequalities that can be used to obtain asymptotic results.

  14. Epidemic thresholds for bipartite networks

    Science.gov (United States)

    Hernández, D. G.; Risau-Gusman, S.

    2013-11-01

    It is well known that sexually transmitted diseases (STD) spread across a network of human sexual contacts. This network is most often bipartite, as most STD are transmitted between men and women. Even though network models in epidemiology have quite a long history now, there are few general results about bipartite networks. One of them is the simple dependence, predicted using the mean field approximation, between the epidemic threshold and the average and variance of the degree distribution of the network. Here we show that going beyond this approximation can lead to qualitatively different results that are supported by numerical simulations. One of the new features, that can be relevant for applications, is the existence of a critical value for the infectivity of each population, below which no epidemics can arise, regardless of the value of the infectivity of the other population.
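
    The mean-field dependence the abstract refers to is, in its simplest unipartite form, the classic threshold λc = ⟨k⟩/⟨k²⟩, a function of only the mean and variance of the degree distribution; the bipartite corrections derived in the paper go beyond this. A minimal sketch:

    ```python
    def mean_field_threshold(degrees):
        """Heterogeneous mean-field epidemic threshold lambda_c = <k>/<k^2>,
        computed from a degree sequence. Since <k^2> = var(k) + <k>^2, this
        depends only on the mean and variance of the degree distribution."""
        n = len(degrees)
        k1 = sum(degrees) / n
        k2 = sum(d * d for d in degrees) / n
        return k1 / k2
    ```

    Broadening the degree distribution at fixed mean raises ⟨k²⟩ and lowers the threshold, which is why heterogeneous contact networks are so vulnerable to epidemics.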

  15. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s² for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.
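
    The full-cycle comparison of peak and trough firing rates amounts to an ideal-observer discriminability index; a minimal sketch with hypothetical spike-rate samples (the authors' actual criterion and windowing are more involved):

    ```python
    from statistics import mean, stdev

    def dprime(peak_rates, trough_rates):
        """Signal-detection discriminability d' between firing rates sampled
        at the response peak and trough, using a pooled standard deviation."""
        pooled_var = (stdev(peak_rates) ** 2 + stdev(trough_rates) ** 2) / 2
        return (mean(peak_rates) - mean(trough_rates)) / pooled_var ** 0.5
    ```

    The detection threshold is then the stimulus amplitude at which d' reaches a fixed criterion (commonly 1), so for a given rate variability the threshold scales inversely with neuronal gain, consistent with the abstract's finding.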

  16. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offe...

  17. The Hypothesis of Incommensurability and Multicultural Education

    Science.gov (United States)

    McDonough, Tim

    2009-01-01

    This article describes the logical and rhetorical grounds for a multicultural pedagogy that teaches students the knowledge and skills needed to interact creatively in the public realm betwixt and between cultures. I begin by discussing the notion of incommensurability. I contend that this hypothesis was intended to perform a particular rhetorical…

  18. [Resonance hypothesis of heart rate variability origin].

    Science.gov (United States)

    Sheĭkh-Zade, Iu R; Mukhambetaliev, G Kh; Cherednik, I L

    2009-09-01

    A hypothesis is advanced that heart rate variability arises from beat-to-beat regulation of cardiac cycle duration, which ensures resonant interaction between respiratory fluctuations and the intrinsic fluctuations of arterial system volume, minimizing the energy expenditure of the cardiorespiratory system. Myogenic, parasympathetic, and sympathetic mechanisms of heart rate variability are described.

  19. Hypothesis on the nature of atmospheric UFOs

    Science.gov (United States)

    Mukharev, L. A.

    1991-08-01

    A hypothesis is developed according to which the atmospheric UFO phenomenon has an electromagnetic nature. It is suggested that an atmospheric UFO is an agglomeration of charged atmospheric dust within which there exists a slowly damped electromagnetic field. This field is considered to be the source of the observed optical effects and the motive force of the UFO.

  20. Multiple hypothesis clustering in radar plot extraction

    NARCIS (Netherlands)

    Huizing, A.G.; Theil, A.; Dorp, Ph. van; Ligthart, L.P.

    1995-01-01

    False plots and plots with inaccurate range and Doppler estimates may severely degrade the performance of tracking algorithms in radar systems. This paper describes how a multiple hypothesis clustering technique can be applied to mitigate the problems involved in plot extraction. The measures of

  1. The (not so) immortal strand hypothesis

    Directory of Open Access Journals (Sweden)

    Cristian Tomasetti

    2015-03-01

    Significance: Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provide new insight into the mechanism of DNA replication in stem cells.

  2. Forty Years Later: Updating the Fossilization Hypothesis

    Science.gov (United States)

    Han, ZhaoHong

    2013-01-01

    A founding concept in second language acquisition (SLA) research, fossilization has been fundamental to understanding second language (L2) development. The Fossilization Hypothesis, introduced in Selinker's seminal text (1972), has thus been one of the most influential theories, guiding a significant bulk of SLA research for four decades; 2012…

  3. Remarks about the hypothesis of limiting fragmentation

    International Nuclear Information System (INIS)

    Chou, T.T.; Yang, C.N.

    1987-01-01

    Remarks are made about the hypothesis of limiting fragmentation. In particular, the concept of favored and disfavored fragment distribution is introduced. Also, a sum rule is proved leading to a useful quantity called energy-fragmentation fraction. (author). 11 refs, 1 fig., 2 tabs

  4. Television Exposure Measures and the Cultivation Hypothesis.

    Science.gov (United States)

    Potter, W. James; Chang, Ik Chin

    1990-01-01

    Describes study of students in grades 8 through 12 that was conducted to determine the degree to which television messages influence a person's construction of reality (the cultivation hypothesis). Research methodology that tests the effects of television exposure is examined with emphasis on the importance of demographic control variables. (38…

  5. Commentary: Human papillomavirus and tar hypothesis for ...

    Indian Academy of Sciences (India)

    2010-08-09

    Commentary: Human papillomavirus and tar hypothesis for squamous cell cervical cancer. Christina Bennett, Allen E. Kuhn, Harry W. Haverkos. Volume 35, Issue 3, September 2010, pp ... Keywords: cervical cancer; co-factors; human papillomavirus; tar-based vaginal douche; tobacco smoke; wood smoke.

  6. Morbidity and Infant Development: A Hypothesis.

    Science.gov (United States)

    Pollitt, Ernesto

    1983-01-01

    Results of a study conducted in 14 villages of Sui Lin Township, Taiwan, suggest the hypothesis that, under conditions of extreme economic impoverishment and among children within populations where energy protein malnutrition is endemic, there is an inverse relationship between incidence of morbidity in infancy and measures of motor and mental…

  7. From heresy to dogma in accounts of opposition to Howard Temin's DNA provirus hypothesis.

    Science.gov (United States)

    Marcum, James A

    2002-01-01

    In 1964 the Wisconsin virologist Howard Temin proposed the DNA provirus hypothesis to explain the mechanism by which a cancer-producing virus containing only RNA infects and transforms cells. His hypothesis reversed the flow of genetic information, as ordained by the central dogma of molecular biology. Although there was initial opposition to his hypothesis, it was widely accepted after the discovery of reverse transcriptase in 1970. Most accounts of Temin's hypothesis after the discovery portray the hypothesis as heretical, because it challenged the central dogma. Temin himself in his Nobel Prize speech of 1975 narrates a similar story about its reception. But are these accounts warranted? I argue that members of the virology community opposed Temin's provirus hypothesis not simply because it was a counterexample to the central dogma, but more importantly because his experimental evidence for supporting it was inconclusive. Furthermore, I propose that these accounts of opposition to the DNA provirus hypothesis as heretical, written by Temin and others after the discovery of reverse transcriptase, played a significant role in establishing retrovirology as a specialized field.

  8. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression,

  9. Cortical Neural Computation by Discrete Results Hypothesis.

    Science.gov (United States)

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of the cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called "Discrete Results" (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information processing is computed in the cortex. Furthermore, we propose that precise dynamic sequences of "Discrete Results" are the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel "Discrete Results" concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. We propose that fast-spiking (FS

  10. The Nebular Hypothesis - A False Paradigm Misleading Scientists

    Science.gov (United States)

    Myers, L. S.

    2005-05-01

    ignored in the belief this comparatively small volume is insignificant relative to Earth's total mass and gravity. This misconception led to outdated gravitational constants and trajectories for "slingshotted" space missions that approached Earth closer than anticipated because the daily increase in mass increases Earth's gravitational pull. Today's philosophy assumes comets, meteoroids, asteroids and planets are different types of objects because of their varied sizes and appearances, but when all solar bodies are arranged by size they form a continuum from irregular meteoroids (remnants of comets) to spherical asteroids and planets. When meteoroids reach diameters of 500-600 kilometers, they become spherical-the critical threshold at which gravity can focus total molecular weight of any body omnidirectionally onto its exact center to initiate compressive heating and melting of originally cold rock core, producing magma, H2O and other gases. The Accreation concept assumes all solar bodies are different-sized objects of the same species, each having reached its present size and chemical composition by amalgamation and accretion. Each is at a different stage of growth but destined to become larger until it reaches the size of another sun (star). This is universal planetary growth controlled by gravity, but initiated by the trajectory imparted at its supernova birth and chance capture by some larger body elsewhere in the Universe. Like the paradigm shift from geocentrism to heliocentrism sparked by Copernicus in 1543, the time has come for a new paradigm to put scientific research on a more productive course toward TRUTH. The new concept of Accreation (creation by accretion) is offered as a replacement for the now defunct nebular hypothesis.

  11. Sensory Acceptability of Squash (Cucurbita Maxima) in Making Ice Cream

    Directory of Open Access Journals (Sweden)

    Raymund B. Moreno

    2015-02-01

    Full Text Available This experimental research was conducted to determine the sensory acceptability of mashed squash (Cucurbita maxima) in different proportions in making ice cream, in terms of appearance, aroma, texture, taste, and general acceptability. Five treatments were formulated in the study: four utilized mashed squash at various proportions, and one treatment served as the control, containing no mashed squash at all. The respondents of the study were 20 Food Technology students and 10 faculty members of West Visayas State University Calinog Campus, selected through random sampling. The respondents evaluated the finished products using a modified sensory evaluation score sheet based on a six-point hedonic scale. The statistical tools used were means, standard deviations, and the Wilcoxon signed-rank test. The 0.01 alpha level was used as the criterion for acceptance or rejection of the null hypothesis. The results led to the conclusion that a significant difference existed in the level of acceptability of mashed squash in making ice cream in terms of appearance, aroma, and general acceptability; therefore, the null hypothesis was rejected. However, there was no significant difference in the level of acceptability in terms of taste and texture.

  12. No-threshold dose-response curves for nongenotoxic chemicals: Findings and applications for risk assessment

    International Nuclear Information System (INIS)

    Sheehan, Daniel M.

    2006-01-01

    We tested the hypothesis that no threshold exists when estradiol acts through the same mechanism as an active endogenous estrogen. A Michaelis-Menten (MM) equation accounting for response saturation, background effects, and endogenous estrogen level fit a turtle sex-reversal data set with no threshold and estimated the endogenous dose. Additionally, 31 diverse literature dose-response data sets were analyzed by adding a term for nonhormonal background; good fits were obtained, but endogenous dose estimations were not significant due to low resolving power. No thresholds were observed. Data sets were plotted using a normalized MM equation; all 178 data points were accommodated on a single graph. Response rates from ∼1% to >95% were well fit. The findings contradict the threshold assumption and low-dose safety. For nonthresholded curves, it is appropriate to calculate risk and to assume additivity of effects from multiple chemicals acting through the same mechanism, rather than to assume a safe dose.
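    The kind of fit described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the author's code: a saturating MM response with a nonhormonal background `r0` and an endogenous-dose offset `d_endo` added to the applied dose, so the fitted curve contains no threshold term. The data below are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mm_response(dose, r0, rmax, k, d_endo):
        """Saturating (Michaelis-Menten) response with nonhormonal background r0
        and an endogenous-dose offset d_endo; note there is no threshold term,
        so any added dose raises the response, however slightly."""
        total = dose + d_endo
        return r0 + rmax * total / (k + total)

    # Synthetic illustration data, not from the study.
    doses = np.array([0.0, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
    rng = np.random.default_rng(0)
    resp = mm_response(doses, 5.0, 90.0, 2.0, 0.2) + rng.normal(0.0, 1.0, doses.size)

    popt, _ = curve_fit(mm_response, doses, resp,
                        p0=[1.0, 80.0, 1.0, 0.1], bounds=(0.0, np.inf))
    r0_hat, rmax_hat, k_hat, d_endo_hat = popt
    # Because d_endo >= 0, the fitted response at dose 0 already sits above
    # the nonhormonal background alone: the endogenous contribution.
    ```

    The endogenous dose enters as just another additive term, which is why, as the abstract notes, its estimate can be statistically weak when the data have low resolving power near zero dose.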

  13. Public acceptance of nuclear power

    International Nuclear Information System (INIS)

    Wildgruber, O.H.

    1990-01-01

    The lecture addresses the question why we need public acceptance work and provides some clues to it. It explains various human behaviour patterns which determine the basics for public acceptance. To some extent, the opposition to nuclear energy and the role the media play are described. Public acceptance efforts of industry are critically reviewed. Some hints on difficulties with polling are provided. The lecture concludes with recommendations for further public acceptance work. (author)

  14. Accepting Lower Salaries for Meaningful Work

    Directory of Open Access Journals (Sweden)

    Jing Hu

    2017-09-01

    Full Text Available A growing literature indicates that people are increasingly motivated to experience a sense of meaning in their work lives. Little is known, however, about how perceptions of work meaningfulness influence job choice decisions. Although much of the research on job choice has focused on the importance of financial compensation, the subjective meanings attached to a job should also play a role. The current set of studies explored the hypothesis that people are willing to accept lower salaries for more meaningful work. In Study 1, participants reported lower minimum acceptable salaries when comparing jobs that they considered to be personally meaningful with those that they considered to be meaningless. In Study 2, an experimental enhancement of a job’s apparent meaningfulness lowered the minimum acceptable salary that participants required for the position. In two large-scale cross-national samples of full-time employees in 2005 and 2015, Study 3 found that participants who experienced more meaningful work lives were more likely to turn down higher-paying job offers elsewhere. The strength of this effect also increased significantly over this time period. Study 4 replicated these findings in an online sample, such that participants who reported having more meaningful work were less willing to leave their current jobs and organizations for higher paying opportunities. These patterns of results remained significant when controlling for demographic factors and differences in job characteristics.

  15. Accepting Lower Salaries for Meaningful Work.

    Science.gov (United States)

    Hu, Jing; Hirsh, Jacob B

    2017-01-01

    A growing literature indicates that people are increasingly motivated to experience a sense of meaning in their work lives. Little is known, however, about how perceptions of work meaningfulness influence job choice decisions. Although much of the research on job choice has focused on the importance of financial compensation, the subjective meanings attached to a job should also play a role. The current set of studies explored the hypothesis that people are willing to accept lower salaries for more meaningful work. In Study 1, participants reported lower minimum acceptable salaries when comparing jobs that they considered to be personally meaningful with those that they considered to be meaningless. In Study 2, an experimental enhancement of a job's apparent meaningfulness lowered the minimum acceptable salary that participants required for the position. In two large-scale cross-national samples of full-time employees in 2005 and 2015, Study 3 found that participants who experienced more meaningful work lives were more likely to turn down higher-paying job offers elsewhere. The strength of this effect also increased significantly over this time period. Study 4 replicated these findings in an online sample, such that participants who reported having more meaningful work were less willing to leave their current jobs and organizations for higher paying opportunities. These patterns of results remained significant when controlling for demographic factors and differences in job characteristics.

  16. The Method of Hypothesis in Plato's Philosophy

    Directory of Open Access Journals (Sweden)

    Malihe Aboie Mehrizi

    2016-09-01

    Full Text Available The article examines the method of hypothesis in Plato's philosophy. The method is examined, respectively, in three dialogues, Meno, Phaedo, and Republic, in which it is explicitly indicated. The article traces how Plato's attitude toward the position and use of the method of hypothesis changed within his philosophy. In Meno, drawing on geometry, Plato attempts to introduce a method that can be used in the realm of philosophy. Ultimately, in the Republic, Plato's special attention to the method and its importance in philosophical investigation leads him to revise it. There, Plato finally introduces the particular method of philosophy, i.e., the dialectic.

  17. Reverse hypothesis machine learning a practitioner's perspective

    CERN Document Server

    Kulkarni, Parag

    2017-01-01

    This book introduces the paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming; hence knowledge-innovation-based learning is needed. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  18. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical mechanisms postulated to drive glacial cycles. They show that the climate variables are driven partly by solar insolation, determining the timing and magnitude of glaciations and terminations, and partly by internal feedback dynamics, pushing the climate variables away from equilibrium. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship between ice volume and solar insolation ... We argue ...

  19. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
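    As a minimal single-agent baseline for the sequential decision processes described above, Wald's sequential probability ratio test (SPRT) can be sketched in a few lines. This is an illustrative example, not the paper's multi-agent framework; the Bernoulli rates and error levels are invented.

    ```python
    import math

    def sprt(samples, p0=0.4, p1=0.6, alpha=0.05, beta=0.05):
        """Wald's SPRT for Bernoulli data: accumulate the log-likelihood ratio
        and stop as soon as it crosses either decision boundary."""
        upper = math.log((1.0 - beta) / alpha)  # accept H1 at or above this
        lower = math.log(beta / (1.0 - alpha))  # accept H0 at or below this
        llr, n = 0.0, 0
        for x in samples:
            n += 1
            llr += math.log((p1 if x else 1.0 - p1) / (p0 if x else 1.0 - p0))
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "undecided", n

    # A run of successes drives the test to accept H1 after only 8 samples.
    decision, n_used = sprt([True] * 50)
    assert (decision, n_used) == ("H1", 8)
    ```

    The multi-agent framework in the paper layers costs of measurement, delay, and disagreement on top of this basic stopping rule; the sketch shows only the core trade-off between evidence accumulation and stopping time.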

  20. Exploring heterogeneous market hypothesis using realized volatility

    Science.gov (United States)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high-frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long-memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities in forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies, and risk management.
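    The heterogeneous autoregressive structure mentioned above can be sketched as follows. This is an illustrative sketch with synthetic data, not the authors' implementation; the 1/5/22-day horizons follow the common HAR-RV convention for daily, weekly, and monthly trading components.

    ```python
    import numpy as np

    def har_design(rv, weekly=5, monthly=22):
        """Build the HAR-RV regression: RV_{t+1} regressed on a constant, the
        daily RV, and its weekly and monthly moving averages, representing
        traders operating at different time horizons."""
        rv = np.asarray(rv, dtype=float)
        x_rows, y = [], []
        for t in range(monthly - 1, len(rv) - 1):
            x_rows.append([1.0,
                           rv[t],
                           rv[t - weekly + 1:t + 1].mean(),
                           rv[t - monthly + 1:t + 1].mean()])
            y.append(rv[t + 1])
        return np.array(x_rows), np.array(y)

    # Synthetic realized-volatility series for illustration only.
    rng = np.random.default_rng(1)
    rv = np.abs(rng.normal(1.0, 0.2, 300))
    X, y = har_design(rv)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS: [const, daily, weekly, monthly]
    ```

    The relative sizes of the daily, weekly, and monthly coefficients indicate which trading horizon dominates the volatility cascade in a given sample.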

  1. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  2. Eigenstate Thermalization Hypothesis and Quantum Thermodynamics

    Science.gov (United States)

    Olshanii, Maxim

    2009-03-01

    One of the open questions in quantum thermodynamics reads: how can linear quantum dynamics provide the chaos necessary for thermalization of an isolated quantum system? To this end, we perform an ab initio numerical analysis of a system of hard-core bosons on a lattice and show [Marcos Rigol, Vanja Dunjko & Maxim Olshanii, Nature 452, 854 (2008)] that the above controversy can be resolved via the Eigenstate Thermalization Hypothesis, suggested independently by Deutsch [J. M. Deutsch, Phys. Rev. A 43, 2046 (1991)] and Srednicki [M. Srednicki, Phys. Rev. E 50, 888 (1994)]. According to this hypothesis, in quantum systems thermalization happens in each individual eigenstate of the system separately, but it is hidden initially by coherences between them. In the course of the time evolution, the thermal properties become revealed through (linear) decoherence that need not be chaotic.

  3. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

    Ever since the classic work of Wannier in 1953, the treatment of two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power-law behavior, akin to Wannier's law. Some of the noteworthy aspects of this derivation are that it can be conveniently continued below threshold, giving rise to a "cusp" at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained.

  4. Sea otter health: Challenging a pet hypothesis

    OpenAIRE

    Lafferty, Kevin D.

    2015-01-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of...

  5. Sea otter health: Challenging a pet hypothesis

    Directory of Open Access Journals (Sweden)

    Kevin D. Lafferty

    2015-12-01

    Full Text Available A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  6. Sea otter health: Challenging a pet hypothesis.

    Science.gov (United States)

    Lafferty, Kevin D

    2015-12-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  7. Sea otter health: challenging a pet hypothesis

    Science.gov (United States)

    Lafferty, Kevin D.

    2015-01-01

    A recent series of studies on tagged sea otters (Enhydra lutris nereis) challenges the hypothesis that sea otters are sentinels of a dirty ocean, in particular, that pet cats are the main source of exposure to Toxoplasma gondii in central California. Counter to expectations, sea otters from unpopulated stretches of coastline are less healthy and more exposed to parasites than city-associated otters. Ironically, now it seems that spillover from wildlife, not pets, dominates spatial patterns of disease transmission.

  8. Application of Sivasubramanian Kalimuthu Hypothesis to Triangles

    OpenAIRE

    M. Sivasubramanian

    2009-01-01

    Problem statement: The interior angle sums of a number of Euclidean triangles were transformed into quadratic equations. The analysis of those quadratic equations yielded the following proposition: There exists a Euclidean triangle whose interior angle sum is a straight angle. Approach: In this study, the researchers introduced a new hypothesis for quadratic equations and derived an entirely new result. Results: The result of the study was controversial, but mathematically consistent. Conclusion...

  9. Kelvin on an old, celebrated hypothesis

    Science.gov (United States)

    Harrison, Edward

    1986-07-01

    Lord Kelvin in 1901 tested an ``old and celebrated hypothesis'' that if we could see far enough into space the whole sky would be occupied with stellar disks all of perhaps the same brightness as the Sun. Kelvin was the first to solve quantitatively and correctly the riddle of a dark night sky, a riddle that had been previously solved qualitatively by Edgar Allan Poe, and is now known as Olbers' paradox.

  10. Test of Taylor's Hypothesis with Distributed Temperature

    Science.gov (United States)

    Cheng, Y.; Gentine, P.; Sayde, C.; Tanner, E.; Ochsner, T. E.; Dong, J.

    2016-12-01

    Taylor's hypothesis [Taylor, 1938] assumes that the mean wind speed carries the spatial pattern of turbulent motion past a fixed point in a "frozen" way, and it has been widely used to relate the streamwise wavenumber k to the angular frequency ω via k = ω/U. Experiments [Fisher, 1964; Tong, 1996] have shown some deviation from Taylor's hypothesis in flows with high turbulence intensity and at high wavenumbers. However, velocity or scalar measurements have always been fixed at a few spatial points rather than distributed in space. This experiment was designed, for the first time, to directly compare the temporal and spatial spectra of temperature as a test of Taylor's hypothesis, measuring temperature with high resolution in both time and space by Distributed Temperature Sensing, which exploits the attenuation difference of Raman scattering in an optical fiber, at the MOISST site in Oklahoma. The transect is 233 meters long, aligned with the dominant wind direction. The temperature sampling distance is 0.127 m and the sampling frequency is 1 Hz. The heights of the four fiber cables parallel to the ground are 1 m, 1.254 m, 1.508 m, and 1.762 m, respectively. An eddy covariance instrument was also set up near the Distributed Temperature Sensing system as a comparison for the temperature data. The spatial spectrum of temperature can be obtained at one fixed time point, while the temporal spectrum can be obtained at one fixed spatial point in the middle of the transect. The preliminary results will be presented at the AGU fall meeting. References: Fisher, M. J., and Davies, P. O. A. L. (1964), Correlation measurements in a non-frozen pattern of turbulence, Journal of Fluid Mechanics, 18(1), 97-116. Taylor, G. I. (1938), The spectrum of turbulence, Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences, 164(919), 476-490. Tong, C. (1996), Taylor's hypothesis and two-point coherence measurements, Boundary-Layer Meteorology, 81(3), 399-410.

  11. Unaware Memory in Hypothesis Generation Tasks

    Science.gov (United States)

    1986-12-01

    distinguished two forms of memory: deliberate recollection of prior events versus the unaware influence of prior events on the performance of a later task...attempts to remember information. The findings reported here contribute in particular to our understanding of the memory processes involved in hypothesis... influenced by prior exposure to relevant events. Indeed, since the prior events themselves often cannot be consciously retrieved, this latter form of memory

  12. HPS simulation and acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Luiz Martins [UERJ, Rio de Janeiro, RJ (Brazil)]; Pol, Maria Elena [CBPF, Rio de Janeiro, RJ (Brazil)]

    2013-07-01

    Full text: The High Precision Spectrometer (HPS) is a proposed sub-detector to be installed in the region 200-240 m from each side of CMS along the LHC beam-line to measure scattered protons from exclusive centrally produced processes, pp → p + X + p. In order to study the protons that reach the detectors, the beam-line of the LHC accelerator has to be taken into account, as the particles are deflected by dipoles and suffer the influence of quadrupoles and other beam devices. The LHC team provides a detailed description of these elements, currents, energies, magnetic fields, and all the information needed to study the propagation of the protons. The program HECTOR, developed at the University of Louvain, uses the information from the LHC to calculate at any point along the beam-line the kinematic quantities that characterize the scattered protons. A simple program was initially developed for preliminary studies of acceptances, varying the position and size of the foreseen detectors. It also took into account vertex and position smearing, to simulate a realistic resolution of the tracking detectors. These studies were performed using a particle gun generator which shot protons from the IP within reasonable ranges of possible t and ξ (the square of the four-momentum transfer and the fractional energy loss of the outgoing proton in a diffractive collision), and propagated them to the position of the tracking detectors. These kinematic quantities were reconstructed back at the IP using the transport equations from HECTOR. This simplified simulation was afterwards interfaced with the full CMS software, CMSSW, in such a way that when a diffractive event was fully simulated and reconstructed in the central detector, the outgoing protons were treated by the HPS software and then the complete (CMS+HPS) event was output. The ExHuME generator was used to produce Monte Carlo simulations to study the mass acceptance of the HPS detector, and central and

  13. Magnesium Sulfate Only Slightly Reduces the Shivering Threshold in Humans

    Science.gov (United States)

    Wadhwa, Anupama; Sengupta, Papiya; Durrani, Jaleel; Akça, Ozan; Lenhardt, Rainer; Sessler, Daniel I.

    2005-01-01

    Background: Hypothermia may be an effective treatment for stroke or acute myocardial infarction; however, it provokes vigorous shivering, which causes potentially dangerous hemodynamic responses and prevents further hypothermia. Magnesium is an attractive antishivering agent because it is used for treatment of postoperative shivering and provides protection against ischemic injury in animal models. We tested the hypothesis that magnesium reduces the threshold (triggering core temperature) and gain of shivering without substantial sedation or muscle weakness. Methods: We studied nine healthy male volunteers (18-40 yr) on two randomly assigned treatment days: 1) Control and 2) Magnesium (80 mg·kg-1 followed by an infusion at 2 g·h-1). Lactated Ringer's solution (4°C) was infused via a central venous catheter over a period of approximately 2 hours to decrease tympanic membrane temperature ≈1.5°C·h-1. A significant and persistent increase in oxygen consumption identified the threshold. The gain of shivering was determined by the slope of the oxygen consumption vs. core temperature regression. Sedation was evaluated using a verbal rating score (VRS, 0-10) and the bispectral index of the EEG (BIS). Peripheral muscle strength was evaluated using dynamometry and spirometry. Data were analyzed using repeated-measures ANOVA; P<0.05 was considered statistically significant. Results: Magnesium slightly reduced the shivering threshold (36.3±0.4 [mean±SD] vs. 36.6±0.3°C, P=0.040). It did not affect the gain of shivering (Control: 437±289, Magnesium: 573±370 ml·min-1·°C-1, P=0.344). The magnesium bolus did not produce significant sedation or appreciably reduce muscle strength. Conclusions: Magnesium significantly reduced the shivering threshold; however, given the modest absolute reduction, this finding is considered to be clinically unimportant for induction of therapeutic hypothermia. PMID:15749735

  14. A two-step framework for over-threshold modelling of environmental extremes

    Science.gov (United States)

    Bernardara, P.; Mazas, F.; Kergadallan, X.; Hamm, L.

    2014-03-01

    The evaluation of the probability of occurrence of extreme natural events is important for the protection of urban areas, industrial facilities and others. Traditionally, extreme value theory (EVT) offers a valid theoretical framework on this topic. In an over-threshold modelling (OTM) approach, Pickands' theorem (Pickands, 1975) states that, for a sample composed of independent and identically distributed (i.i.d.) values, the distribution of the data exceeding a given threshold converges to a generalized Pareto distribution (GPD). Following this theoretical result, the analysis of realizations of environmental variables exceeding a threshold has spread widely in the literature. However, applying this theorem to an auto-correlated time series logically involves two successive and complementary steps: the first is required to build a sample of i.i.d. values from the available information, as required by the EVT; the second is to set the threshold for optimal convergence toward the GPD. In the past, the same threshold was often employed both for sampling observations and for meeting the hypothesis of extreme value convergence. This confusion can lead to an erroneous understanding of the methodologies and tools available in the literature. This paper aims at clarifying the conceptual framework involved in threshold selection, reviewing the available methods for the application of both steps and illustrating it with a double threshold approach.
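
    The double-threshold idea summarized above (a physical threshold for extracting roughly i.i.d. peaks, then a separate statistical threshold for GPD convergence) can be sketched in a few lines. Everything here is illustrative, not the authors' method: the synthetic series, the runs-declustering rule, and both threshold choices are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Stand-in environmental series (i.i.d. Gumbel draws here; a real series
# would be auto-correlated, which is what makes step 1 necessary).
series = rng.gumbel(loc=2.0, scale=0.5, size=20_000)

# Step 1: physical threshold + runs declustering -> approximately i.i.d. peaks.
u_phys = np.quantile(series, 0.95)
peaks, cluster = [], []
for v in series:
    if v > u_phys:
        cluster.append(v)
    elif cluster:
        peaks.append(max(cluster))   # keep one peak per exceedance cluster
        cluster = []
if cluster:
    peaks.append(max(cluster))
peaks = np.asarray(peaks)

# Step 2: statistical threshold, chosen on the peaks sample, for the GPD fit.
u_stat = np.quantile(peaks, 0.5)
excesses = peaks[peaks > u_stat] - u_stat
shape, loc, scale = genpareto.fit(excesses, floc=0.0)  # loc fixed at 0
```

    Using two distinct thresholds makes the sampling decision (step 1) independent of the convergence decision (step 2), which is exactly the separation the paper argues for.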

  15. Testing the egocentric mirror-rotation hypothesis.

    Science.gov (United States)

    Muelenz, Cornelius; Hecht, Heiko; Gamer, Matthias

    2010-01-01

    Although observers know about the law of reflection, their intuitive understanding of spatial locations in mirrors is often erroneous. Hecht et al. (2005) proposed a two-stage mirror-rotation hypothesis to explain these misconceptions. The hypothesis involves an egocentric bias to the effect that observers behave as if the mirror surface were rotated by about 2 degrees to be more orthogonal than is the case. We test four variants of the hypothesis, which differ depending on whether the virtual world, the mirror, or both are taken to be rotated. We devised an experimental setup that allowed us to distinguish between these variants. Our results confirm that the virtual world--and only the virtual world--is being rotated. Observers had to perform a localization task, using a mirror that was either fronto-parallel or rotated opposite the direction of the predicted effect. We were thus able to compensate for the effect. The positions of objects in mirrors were perceived in accordance with the erroneous conception that the virtual world behind the mirror is slightly rotated and that the reconstruction is based on the non-rotated fronto-parallel mirror. A covert rotation of the mirror by about 2 degrees against the predicted effect was able to compensate for the placement error.

  16. Consumer health information seeking as hypothesis testing.

    Science.gov (United States)

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  17. A test of the orthographic recoding hypothesis

    Science.gov (United States)

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  18. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine if the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have on U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  19. Roots at the percolation threshold.

    Science.gov (United States)

    Kroener, Eva; Ahmed, Mutez Ali; Carminati, Andrea

    2015-04-01

    The rhizosphere is the layer of soil around the roots where complex and dynamic interactions between plants and soil affect the capacity of plants to take up water. The physical properties of the rhizosphere are affected by mucilage, a gel exuded by roots. Mucilage can absorb large volumes of water, but it becomes hydrophobic after drying. We use a percolation model to describe the rewetting of the dry rhizosphere. We find that at a critical mucilage concentration the rhizosphere becomes impermeable. The critical mucilage concentration depends on the radius of the soil particles. Capillary rise experiments with neutron radiography show that for concentrations below the critical mucilage concentration water could easily cross the rhizosphere, while above the critical concentration water could no longer percolate through it. Our studies, together with former observations of water dynamics in the rhizosphere, suggest that the rhizosphere is near the percolation threshold, where small variations in mucilage concentration sensitively alter the soil hydraulic conductivity. Is mucilage exudation a plant mechanism to efficiently control the rhizosphere conductivity and the access to water?
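
    The sensitivity near a percolation threshold can be illustrated with a toy 2D site-percolation simulation (unrelated to the authors' rhizosphere model): a small change in occupation probability flips the lattice from spanning to non-spanning. Grid size, trial count, and 4-connectivity are arbitrary choices.

```python
import numpy as np
from scipy import ndimage

def spans(open_sites):
    """True if a cluster of open sites connects the top row to the bottom row."""
    labels, _ = ndimage.label(open_sites)   # 4-connected cluster labelling
    top = set(labels[0][labels[0] > 0])
    bottom = set(labels[-1][labels[-1] > 0])
    return bool(top & bottom)

def perc_prob(p, n=64, trials=50, seed=0):
    """Fraction of random n-by-n lattices (site occupancy p) that percolate."""
    rng = np.random.default_rng(seed)
    return np.mean([spans(rng.random((n, n)) < p) for _ in range(trials)])

# Square-lattice site percolation has its threshold near p_c ~ 0.593:
# well below it almost nothing spans, well above it almost everything does.
```

    The sharp rise of `perc_prob` around p_c mirrors the paper's point: near the threshold, small changes in (here) occupation probability, or (there) mucilage concentration, switch the medium between permeable and impermeable.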

  20. Percolation Threshold in Polycarbonate Nanocomposites

    Science.gov (United States)

    Ahuja, Suresh

    2014-03-01

    Nanocomposites have unique mechanical, electrical, magnetic, optical and thermal properties. Many methods can be applied to prepare polymer-inorganic nanocomposites, such as sol-gel processing, in-situ polymerization, in-situ particle formation, blending, and radiation synthesis. The analytical composite models that have been put forth include the Voigt and Reuss bounds. Polymer nanocomposites offer the possibility of substantial improvements in material properties such as shear and bulk modulus, yield strength, toughness, film scratch resistance, optical properties, electrical conductivity, and gas and solvent transport, with only very small amounts of nanoparticles. Experimental results are compared against the composite models of the Hashin-Shtrikman bounds, the Halpin-Tsai model, the Cox model, and various Mori-Tanaka models. Examples of numerical modeling are molecular dynamics modeling and finite element modeling of reduced modulus and hardness that take into account the modulus of the components and the effect of the interface between the hard filler and the relatively soft polymer, polycarbonate. Higher nanoparticle concentration results in poor dispersion and adhesion to the polymer matrix, which results in lower modulus and hardness and departure from the existing composite models. As the level of silica increases beyond a threshold level, aggregates form, which weakens the structure. The polymer-silica interface is found to be weak, as silica is non-interacting, promoting interfacial slip at silica-matrix junctions. Our experimental results compare favorably with those for nanocomposites of polyesters, where the effect of nanoclay on composite hardness and modulus depended on the dispersion of the nanoclay in the polyester.

  1. Dynamical thresholds for complete fusion

    International Nuclear Information System (INIS)

    Davies, K.T.R.; Sierk, A.J.; Nix, J.R.

    1983-01-01

    It is our purpose here to study the effect of nuclear dissipation and shape parametrization on dynamical thresholds for compound-nucleus formation in symmetric heavy-ion reactions. This is done by solving numerically classical equations of motion for head-on collisions to determine whether the dynamical trajectory in a multidimensional deformation space passes inside the fission saddle point and forms a compound nucleus, or whether it passes outside the fission saddle point and reseparates in a fast-fission or deep-inelastic reaction. Specifying the nuclear shape in terms of smoothly joined portions of three quadratic surfaces of revolution, we take into account three symmetric deformation coordinates. However, in some cases we reduce the number of coordinates to two by requiring the ends of the fusing system to be spherical in shape. The nuclear potential energy of deformation is determined in terms of a Coulomb energy and a double volume energy of a Yukawa-plus-exponential folding function. The collective kinetic energy is calculated for incompressible, nearly irrotational flow by means of the Werner-Wheeler approximation. Four possibilities are studied for the transfer of collective kinetic energy into internal single-particle excitation energy: zero dissipation, ordinary two body viscosity, one-body wall-formula dissipation, and one-body wall-and-window dissipation

  2. Efficient threshold for volumetric segmentation

    Science.gov (United States)

    Burdescu, Dumitru D.; Brezovan, Marius; Stanescu, Liana; Stoica Spahiu, Cosmin; Ebanca, Daniel

    2015-07-01

    Image segmentation plays a crucial role in the effective understanding of digital images. However, research on the existence of a general-purpose segmentation algorithm that suits a variety of applications is still very much active. Among the many approaches to image segmentation, the graph-based approach is gaining popularity primarily due to its ability to reflect global image properties. Volumetric image segmentation can simply result in an image partition composed of relevant regions, but the most fundamental challenge for a segmentation algorithm is to precisely define the volumetric extent of an object, which may be represented by the union of multiple regions. The aim of this paper is to present a new method, with an efficient threshold, to detect visual objects in color volumetric images. We present a unified framework for volumetric image segmentation and contour extraction that uses a virtual tree-hexagonal structure defined on the set of image voxels. The advantage of using a virtual tree-hexagonal network superposed over the initial image voxels is that it reduces the execution time and the memory space used, without losing the initial resolution of the image.

  3. Bridging the Gap between Social Acceptance and Ethical Acceptability.

    Science.gov (United States)

    Taebi, Behnam

    2017-10-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological risk, particularly when we evaluate technologies with transnational and intergenerational risks. I argue that good governance of risky technology requires analyzing both social acceptance and ethical acceptability. Conceptually, these two notions are mostly complementary. Social acceptance studies are not capable of sufficiently capturing all the morally relevant features of risky technologies; ethical analyses do not typically include stakeholders' opinions, and they therefore lack the relevant empirical input for a thorough ethical evaluation. Only when carried out in conjunction are these two types of analysis relevant to national and international governance of risky technology. I discuss the Rawlsian wide reflective equilibrium as a method for marrying social acceptance and ethical acceptability. Although the rationale of my argument is broadly applicable, I will examine the case of multinational nuclear waste repositories in particular. This example will show how ethical issues may be overlooked if we focus only on social acceptance, and will provide a test case for demonstrating how the wide reflective equilibrium can help to bridge the proverbial acceptance-acceptability gap. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  4. Error Thresholds on Dynamic Fitness-Landscapes

    OpenAIRE

    Nilsson, Martin; Snoad, Nigel

    1999-01-01

    In this paper we investigate error thresholds on dynamic fitness-landscapes. We show that there exist both a lower and an upper threshold, representing limits to the copying fidelity of simple replicators. The lower bound can be expressed as a correction term to the error threshold present on a static landscape. The upper error threshold is a new limit that only exists on dynamic fitness-landscapes. We also show that for long genomes on highly dynamic fitness-landscapes there exists a lower b...

  5. Statistical sampling and hypothesis testing in orthopaedic research.

    Science.gov (United States)

    Bernstein, Joseph; McGuire, Kevin; Freedman, Kevin B

    2003-08-01

    The purpose of the current article was to review the process of hypothesis testing and statistical sampling and empower readers to critically appraise the literature. When the p value of a study lies above the alpha threshold, the results are said to be not statistically significant. It is possible, however, that real differences do exist, but the study was insufficiently powerful to detect them. In that case, the conclusion that two groups are equivalent is wrong. The probability of this mistake, the Type II error, is given by the beta statistic. The complement of beta, or 1-beta, representing the chance of avoiding a Type II error, is termed the statistical power of the study. We previously examined the statistical power and sample size in all of the studies published in 1997 in the American and British volumes of the Journal of Bone and Joint Surgery, and in Clinical Orthopaedics and Related Research. In the journals examined, only 3% of studies had adequate statistical power to detect a small effect size in this sample. In addition, a study examining only randomized control trials in these journals showed that none of 25 randomized control trials had adequate statistical power to detect a small effect size. However, beta, or power, is less well understood. Because of this, researchers and readers should be aware of the need to address issues of statistical power before a study begins and be cautious of studies that conclude that no difference exists between groups.
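
    The power relationship the authors audit (power = 1 - beta, the chance of detecting a real effect) can be sketched numerically. This is a normal-approximation of two-sided, two-sample t-test power, not the authors' calculation; the effect sizes and sample sizes below are illustrative conventions (Cohen's d = 0.2 for a "small" effect, 0.80 as the customary power target).

```python
from scipy.stats import norm

def approx_power(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sided, two-sample t-test for
    standardized effect size d with n_per_group subjects per arm."""
    z_crit = norm.ppf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5   # noncentrality parameter
    # Power: probability the test statistic clears the critical value
    # in either tail when the true effect is d.
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)
```

    For a small effect (d = 0.2), roughly 394 subjects per group are needed to reach 80% power, while a 30-per-group study has well under 20% power, which is why so many of the surveyed orthopaedic studies could not rule out a Type II error.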

  6. The linear hypothesis: An idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    This paper attempts to present a clear idea of what the linear (no-threshold) hypothesis (LH) is, how it was corrupted and what happened to the nuclear industry as a result, and one possible solution to this major problem for the nuclear industry. The corruption lies in the change of the LH from ''a little radiation MAY produce harm'' to ''low doses of radiation WILL KILL you.'' The result has been the retardation of the nuclear industry in the United States, although the industry is one of the safest, if not the safest industry. It is suggested to replace the LH with two sets of standards, one having to do with human and environmental health and safety, and the other (more stringent) for protection of manufactured items and premises. The safety standard could be some dose such as 5 rem/year. This would do away with the ALARA concept below the annual limit and with the collective dose at low doses. Benefits of the two-tier radiation standards system would be the alleviation of the public fear of radiation and the health of the nuclear industry

  7. Public acceptance of biofuels

    International Nuclear Information System (INIS)

    Savvanidou, Electra; Zervas, Efthimios; Tsagarakis, Konstantinos P.

    2010-01-01

    The public acceptance of biofuels in Greece is examined in this work. The analysis of 571 face to face interviews shows that 90.7% of the respondents believe that climatic changes are related to fossil fuel consumption, while only 23.8% know the difference between biodiesel and bioethanol. 76.1% believe that energy saving should precede the use of an alternative source of energy. Only 27.3% believe that priority must be given to biofuels over other renewable energy sources. Only 49.9% think that the use of biofuels can be an effective solution against climatic changes and 53.9% believe that the use of biofuels can be an effective solution for the energy problem. Finally, 80.9% of the car owners are willing to use biofuels, 44.8% are willing to pay the supplementary amount of 0.06 EUR/L of the fuel market price, while the average amount reported as willing to pay was 0.079 EUR/L on top of the fuel market price. Furthermore, eight models correlating the eight main responses with several socioeconomic variables are developed and analyzed. These findings have important policy implications related to the use and promotion of biofuels. (author)

  8. Increased intensity discrimination thresholds in tinnitus subjects with a normal audiogram

    DEFF Research Database (Denmark)

    Epp, Bastian; Hots, J.; Verhey, J. L.

    2012-01-01

    Recent auditory brain stem response measurements in tinnitus subjects with normal audiograms indicate the presence of hidden hearing loss that manifests as reduced neural output from the cochlea at high sound intensities, and results from mice suggest a link to deafferentation of auditory nerve fibers. As deafferentation would lead to deficits in hearing performance, the present study investigates whether tinnitus patients with normal hearing thresholds show impairment in intensity discrimination compared to an audiometrically matched control group. Intensity discrimination thresholds were significantly increased in the tinnitus frequency range, consistent with the hypothesis that auditory nerve fiber deafferentation is associated with tinnitus.

  9. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G A

    2004-09-21

    The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector to apply Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or Probability of False Alarm P{sub FA}, of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P{sub D}. This is summarized by the Receiver Operating Characteristic Curve (ROC) [10, 11], which is actually a family of curves depicting P{sub D} vs. P{sub FA}, parameterized by varying levels of signal to noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P{sub FA} and develop a ROC curve (P{sub D} vs. decision threshold r{sub 0}) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB
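
    The core CFAR step described above, choosing the decision threshold r0 from the background score distribution so that a specified P{sub FA} is met, can be sketched with a toy empirical-quantile estimate. The Gaussian background, sample size, and P_FA value are assumptions for illustration, not the LASI data or algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical matched-filter scores for plume-free background pixels.
background = rng.normal(size=100_000)

p_fa = 1e-3                                 # desired false-alarm probability
# CFAR threshold: the (1 - p_fa) quantile of the background distribution;
# scores above r0 are declared detections.
r0 = np.quantile(background, 1 - p_fa)

fa_hat = np.mean(background > r0)           # empirical false-alarm rate
```

    Sweeping p_fa and recording the detection rate of a signal-plus-background sample at each resulting r0 traces out the ROC curve (P{sub D} vs. r{sub 0}) described in the abstract; density estimation enters when the empirical quantile must be extrapolated beyond the available background data.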

  10. Software thresholds alter the bias of actigraphy for monitoring sleep in team-sport athletes.

    Science.gov (United States)

    Fuller, Kate L; Juliff, Laura; Gore, Christopher J; Peiffer, Jeremiah J; Halson, Shona L

    2017-08-01

    Actical® actigraphy is commonly used to monitor athlete sleep. The proprietary software, called Actiware®, processes data with three different sleep-wake thresholds (Low, Medium or High), but there is no standardisation regarding their use. The purpose of this study was to examine the validity and bias of the sleep-wake thresholds for processing Actical® sleep data in team sport athletes. Validation study comparing the actigraph against the accepted gold standard, polysomnography (PSG). Sixty-seven nights of sleep were recorded simultaneously with polysomnography and Actical® devices. Individual night data were compared across five sleep measures for each sleep-wake threshold using Actiware® software. Accuracy of each sleep-wake threshold compared with PSG was evaluated from mean bias with 95% confidence limits, Pearson product-moment correlation and the associated standard error of estimate. The Medium threshold generated the smallest mean bias compared with polysomnography for total sleep time (8.5 min), sleep efficiency (1.8%) and wake after sleep onset (-4.1 min); whereas the Low threshold had the smallest bias (7.5 min) for wake bouts. Bias in sleep onset latency was the same across thresholds (-9.5 min). The standard error of the estimate was similar across all thresholds: total sleep time ∼25 min, sleep efficiency ∼4.5%, wake after sleep onset ∼21 min, and wake bouts ∼8 counts. Sleep parameters measured by the Actical® device are greatly influenced by the sleep-wake threshold applied. In the present study the Medium threshold produced the smallest bias for most parameters compared with PSG. Given the magnitude of measurement variability, confidence limits should be employed when interpreting changes in sleep parameters. Copyright © 2017 Sports Medicine Australia. All rights reserved.
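
    The "mean bias with 95% confidence limits" comparison used in validation studies like this one can be sketched as follows; the helper function is a generic Bland-Altman-style computation and the numbers in the usage example are invented minutes of total sleep time, not the study's data:

```python
import numpy as np

def mean_bias(device, reference):
    """Mean bias of a device against a gold standard, with 95% limits
    (bias +/- 1.96 * SD of the paired differences)."""
    diff = np.asarray(device, float) - np.asarray(reference, float)
    bias = diff.mean()
    half = 1.96 * diff.std(ddof=1)   # sample SD of the night-by-night differences
    return bias, (bias - half, bias + half)

# Illustrative total-sleep-time values (minutes), actigraph vs. PSG:
b, (low, high) = mean_bias([458, 472, 440, 465], [450, 460, 435, 455])
```

    Reporting the interval (low, high) alongside the bias is what the authors recommend: a bias of a few minutes is easy to over-interpret when the night-to-night limits span tens of minutes.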

  11. The conscious access hypothesis: Explaining the consciousness

    OpenAIRE

    Prakash, Ravi

    2008-01-01

    The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, the exploration of consciousness from scientific perspectives is not very old. Among the myriad theories regarding the nature, functions and mechanism of consciousness, of late, cognitive theories have received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis" based on the "global workspac...

  12. Hypothesis Testing as an Act of Rationality

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculitic logic is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  13. Time-efficient multidimensional threshold tracking method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Kowalewski, Borys; Dau, Torsten

    2015-01-01

    Traditionally, adaptive methods have been used to reduce the time it takes to estimate psychoacoustic thresholds. However, even with adaptive methods, there are many cases where the testing time is too long to be clinically feasible, particularly when estimating thresholds as a function of anothe...

  14. Applying Threshold Concepts to Finance Education

    Science.gov (United States)

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  15. Voting on Thresholds for Public Goods

    DEFF Research Database (Denmark)

    Rauchdobler, Julian; Sausgruber, Rupert; Tyran, Jean-Robert

    2010-01-01

    Introducing a threshold in the sense of a minimal project size transforms a public-good game with an inefficient equilibrium into a coordination game with a set of Pareto-superior equilibria. Thresholds may therefore improve efficiency in the voluntary provision of public goods. In our one-shot e...

  16. Intelligence and Creativity: Over the Threshold Together?

    Science.gov (United States)

    Welter, Marisete Maria; Jaarsveld, Saskia; van Leeuwen, Cees; Lachmann, Thomas

    2016-01-01

    Threshold theory predicts a positive correlation between IQ and creativity scores up to an IQ level of 120 and no correlation above this threshold. Primary school children were tested at the beginning (N = 98) and end (N = 70) of the school year. Participants performed the standard progressive matrices (SPM) and the Test of Creative…

  17. Threshold Concepts, Systems and Learning for Sustainability

    Science.gov (United States)

    Sandri, Orana Jade

    2013-01-01

    This paper presents a framework for understanding the role that systems theory might play in education for sustainability (EfS). It offers a sketch and critique of Land and Meyer's notion of a "threshold concept", to argue that seeing systems as a threshold concept for sustainability is useful for understanding the processes of…

  18. Evaluation of the Detection Threshold of Three ...

    African Journals Online (AJOL)

    A mean count of 39 pigments per microlitre was obtained for these five patients. Both HEXAGON MALARIA and SD-BIOLINE had a detection threshold of 4 pigments per microlitre, while ACCU-STAT MALARIA had 20 pigments per microlitre. This suggests that these three kits have good detection thresholds and could ...

  19. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.
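For readers skimming this record, the global log canonical threshold described above is commonly written as follows (standard notation, assumed rather than quoted from the paper):

```latex
\operatorname{lct}(X) \;=\; \inf\Bigl\{\, \operatorname{lct}(X, D) \;:\; D \ \text{an effective } \mathbb{Q}\text{-divisor with } D \equiv -K_X \,\Bigr\}
```

Here $\operatorname{lct}(X, D)$ is the log canonical threshold of the pair, and the infimum runs over effective $\mathbb{Q}$-divisors numerically equivalent to the anticanonical divisor $-K_X$.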

  20. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the
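As an illustration of the kind of procedure the book catalogs, here is a two-sample (Welch) t-test; the example uses Python's SciPy rather than SAS or R, and the data values are invented for demonstration:

```python
from scipy import stats

# Invented demonstration data for two independent groups
group_a = [5.1, 4.9, 6.0, 5.5, 5.8, 5.2]
group_b = [4.2, 4.8, 4.5, 4.0, 4.6, 4.3]

# Welch's two-sample t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A small p-value leads to rejecting the null hypothesis of equal group means.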

  1. Confluence Model or Resource Dilution Hypothesis?

    DEFF Research Database (Denmark)

    Jæger, Mads

    Studies on family background often explain the negative effect of sibship size on educational attainment by one of two theories: the Confluence Model (CM) or the Resource Dilution Hypothesis (RDH). However, as both theories – for substantively different reasons – predict that sibship size should...... to identify a unique RDH effect on educational attainment. Using sibling data from the Wisconsin Longitudinal Study (WLS) and a random effect Instrumental Variable model, I find that in addition to having a negative effect on cognitive ability, sibship size also has a strong negative effect on educational...

  2. Set theory and the continuum hypothesis

    CERN Document Server

    Cohen, Paul J

    2008-01-01

    This exploration of a notorious mathematical problem is the work of the man who discovered the solution. The independence of the continuum hypothesis is the focus of this study by Paul J. Cohen. It presents not only an accessible technical explanation of the author's landmark proof but also a fine introduction to mathematical logic. An emeritus professor of mathematics at Stanford University, Dr. Cohen won two of the most prestigious awards in mathematics: in 1964, he was awarded the American Mathematical Society's Bôcher Prize for analysis; and in 1966, he received the Fields Medal for Logic.

  3. Underestimation of pacing threshold as determined by an automatic ventricular threshold testing algorithm.

    Science.gov (United States)

    Sauer, William H; Cooper, Joshua M; Lai, Rebecca W; Verdino, Ralph J

    2006-09-01

    In this case report, we describe markedly different pacing thresholds determined by a manual threshold test and the automatic Ventricular Capture Management algorithm. The discrepancy in pacing threshold values reported was due to the difference in the AV intervals used with the different testing methods. We propose that the differences in right ventricular dimensions with altered diastolic filling periods affected the threshold in this patient with a new passive fixation lead in the right ventricular apex.

  4. The adaptive value of gluttony: predators mediate the life history trade-offs of satiation threshold.

    Science.gov (United States)

    Pruitt, J N; Krauel, J J

    2010-10-01

    Animals vary greatly in their tendency to consume large meals. Yet, whether or how meal size influences fitness in wild populations is infrequently considered. Using a predator exclusion, mark-recapture experiment, we estimated selection on the amount of food accepted during an ad libitum feeding bout (hereafter termed 'satiation threshold') in the wolf spider Schizocosa ocreata. Individually marked, size-matched females of known satiation threshold were assigned to predator exclusion and predator inclusion treatments and tracked for a 40-day period. We also estimated the narrow-sense heritability of satiation threshold using dam-on-female-offspring regression. In the absence of predation, high satiation threshold was positively associated with larger and faster egg case production. However, these selective advantages were lost when predators were present. We estimated the heritability of satiation threshold to be 0.56. Taken together, our results suggest that satiation threshold can respond to selection and begets a life history trade-off in this system: high satiation threshold individuals tend to produce larger egg cases but also suffer increased susceptibility to predation. © 2010 The Authors. Journal Compilation © 2010 European Society For Evolutionary Biology.

  5. American acceptance of nuclear power

    International Nuclear Information System (INIS)

    Barrett, W.

    1980-01-01

    The characteristic adventurous spirit that built American technology will eventually lead to American acceptance of nuclear power unless an overpowering loss of nerve causes us to reject both nuclear technology and world leadership. The acceptance of new technology by society has always been accompanied by activist opposition to industrialization. To resolve the debate between environmental and exploitive extremists, we must accept with humility the basic premise that human accomplishment is a finite part of nature

  6. Tacit acceptance of the succession

    Directory of Open Access Journals (Sweden)

    Ioana NICOLAE

    2012-01-01

    Full Text Available This paper examines some essential and contradictory aspects regarding the issue of tacit acceptance of succession in terms of distinction between documents valuing tacit acceptance of succession and other acts that would not justify such a solution. The documents expressly indicated by the legislator as having tacit acceptance value as well as those which do not have such value are presented and their most important legal effects are examined and discussed.

  7. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined threshold of motion when he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro to macro scale applications. For example, a geologic timescale application corresponds to a threshold when 100% of the bed is in motion whereas a sub-second application corresponds to a threshold when a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for reevaluating sand transport thresholds on Earth, Mars and Titan.

  8. Bridging the Gap between Social Acceptance and Ethical Acceptability

    NARCIS (Netherlands)

    Taebi, B.

    2016-01-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological

  9. Roots and Route of the Artification Hypothesis

    Directory of Open Access Journals (Sweden)

    Ellen Dissanayake

    2017-08-01

    Over four decades, my ideas about the arts in human evolution have themselves evolved, from an original notion of art as a human behaviour of “making special” to a full-fledged hypothesis of artification. A summary of the gradual developmental path (or route) of the hypothesis, based on ethological principles and concepts, is given, and an argument presented in which artification is described as an exaptation whose roots lie in adaptive features of ancestral mother–infant interaction that contributed to infant survival and maternal reproductive success. I show how the interaction displays features of a ritualised behaviour whose operations (formalization, repetition, exaggeration, and elaboration) can be regarded as characteristic elements of human ritual ceremonies as well as of art (including song, dance, performance, literary language, altered surroundings, and other examples of making ordinary sounds, movement, language, environments, objects, and bodies extraordinary). Participation in these behaviours in ritual practices served adaptive ends in early Homo by coordinating brain and body states, and thereby emotionally bonding members of a group in common cause as well as reducing existential anxiety in individuals. A final section situates artification within contemporary philosophical and popular ideas of art, claiming that artifying is not a synonym for or definition of art but foundational to any evolutionary discussion of artistic/aesthetic behaviour.

  10. The alliance hypothesis for human friendship.

    Science.gov (United States)

    DeScioli, Peter; Kurzban, Robert

    2009-06-03

    Exploration of the cognitive systems underlying human friendship will be advanced by identifying the evolved functions these systems perform. Here we propose that human friendship is caused, in part, by cognitive mechanisms designed to assemble support groups for potential conflicts. We use game theory to identify computations about friends that can increase performance in multi-agent conflicts. This analysis suggests that people would benefit from: 1) ranking friends, 2) hiding friend-ranking, and 3) ranking friends according to their own position in partners' rankings. These possible tactics motivate the hypotheses that people possess egocentric and allocentric representations of the social world, that people are motivated to conceal this information, and that egocentric friend-ranking is determined by allocentric representations of partners' friend-rankings (more than others' traits). We report results from three studies that confirm predictions derived from the alliance hypothesis. Our main empirical finding, replicated in three studies, was that people's rankings of their ten closest friends were predicted by their own perceived rank among their partners' other friends. This relationship remained strong after controlling for a variety of factors such as perceived similarity, familiarity, and benefits. Our results suggest that the alliance hypothesis merits further attention as a candidate explanation for human friendship.

  11. Revisiting the genomic hypomethylation hypothesis of aging.

    Science.gov (United States)

    Unnikrishnan, Archana; Hadad, Niran; Masser, Dustin R; Jackson, Jordan; Freeman, Willard M; Richardson, Arlan

    2018-01-24

    The genomic hypomethylation hypothesis of aging proposes that an overall decrease in global DNA methylation occurs with age, and it has been argued that the decrease in global DNA methylation could be an important factor in aging, resulting in the relaxation of gene expression regulation and abnormal gene expression. Since it was initially observed that DNA methylation decreased with age in 1974, 16 articles have been published describing the effect of age on global DNA methylation in various tissues from rodents and humans. We critically reviewed the publications on the effect of age on DNA methylation and the expression of the enzymes involved in DNA methylation to evaluate the validity of the hypomethylation hypothesis of aging. On the basis of the current scientific literature, we conclude that a decrease in the global methylation of the genome occurs in most if not all tissues/cells as an animal ages. However, age-related changes in DNA methylation in specific regions or at specific sites in the genome occur even though the global DNA methylation does not change. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  12. Hypothesis-driven physical examination curriculum.

    Science.gov (United States)

    Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James

    2017-12-01

    Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases. Each case included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology training performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  13. The Debt Overhang Hypothesis: Evidence from Pakistan

    Directory of Open Access Journals (Sweden)

    Shah Muhammad Imran

    2016-04-01

    This study investigates the debt overhang hypothesis for Pakistan in the period 1960-2007. The study examines empirically the dynamic behaviour of GDP, debt services, the employed labour force and investment using the time series concepts of unit roots, cointegration, error correction and causality. Our findings suggest that debt-servicing has a negative impact on the productivity of both labour and capital, and that in turn has adversely affected economic growth. By severely constraining the ability of the country to service debt, this lends support to the debt-overhang hypothesis in Pakistan. The long run relation between debt services and economic growth implies that future increases in output will drain away in the form of high debt service payments to the lender country, as external debt acts like a tax on output. More specifically, foreign creditors will benefit more from the rise in productivity than will domestic producers and labour. This suggests that domestic labour and capital are the ultimate losers from this heavy debt burden.
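The unit-root step mentioned in the abstract can be sketched with a simplified Dickey-Fuller regression; this is a minimal numpy illustration on simulated data (no lag augmentation, and no Dickey-Fuller critical values, which a real analysis such as the paper's would require):

```python
import numpy as np

def dickey_fuller_t(y):
    """t-statistic on rho in the regression dy_t = c + rho * y_{t-1} + e_t.
    Strongly negative values are evidence against a unit root. Simplified
    sketch: no lag augmentation, and real work compares the statistic
    against Dickey-Fuller (not normal) critical values."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
t_rw = dickey_fuller_t(np.cumsum(rng.normal(size=500)))  # random walk: unit root
t_st = dickey_fuller_t(rng.normal(size=500))             # white noise: stationary
print(f"random walk t = {t_rw:.2f}, stationary t = {t_st:.2f}")
```

The stationary series produces a strongly negative statistic, while the random walk does not.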

  14. Gaussian Hypothesis Testing and Quantum Illumination.

    Science.gov (United States)

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.
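For orientation, the classical counterpart of this result is Stein's lemma, where the optimal type-II error decay exponent equals the relative entropy between the two hypotheses; for univariate Gaussians this exponent has a closed form. The snippet below is a classical illustration only, not the paper's quantum Gaussian formula:

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    """Relative entropy D(N(mu1, s1^2) || N(mu2, s2^2)) in nats. By the
    classical Stein lemma this is the best achievable type-II error decay
    exponent for testing these two hypotheses at fixed type-I error."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

print(kl_gauss(0.0, 1.0, 0.0, 1.0))  # identical hypotheses: exponent 0.0
print(kl_gauss(1.0, 1.0, 0.0, 1.0))  # unit mean shift: exponent 0.5
```

As in the paper's Gaussian-state formula, the exponent is a direct function of the means and (co)variances of the two hypotheses.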

  15. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
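To contrast with the Bayesian test described, a minimal frequentist mediation check is the Sobel test on the indirect effect a·b; the sketch below (numpy, simulated data, hypothetical effect sizes) is not the paper's Jeffreys-Zellner-Siow Bayes factor method:

```python
import numpy as np

def sobel_test(x, m, y):
    """Sobel z-statistic for the indirect effect a*b in the chain x -> m -> y.
    a: slope of m on x; b: slope of y on m, controlling for x."""
    def ols(predictors, response):
        X = np.column_stack([np.ones(len(response))] + predictors)
        beta, *_ = np.linalg.lstsq(X, response, rcond=None)
        resid = response - X @ beta
        sigma2 = resid @ resid / (len(response) - X.shape[1])
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
        return beta, se

    beta_m, se_m = ols([x], m)
    a, sa = beta_m[1], se_m[1]
    beta_y, se_y = ols([x, m], y)
    b, sb = beta_y[2], se_y[2]
    return (a * b) / np.sqrt(b**2 * sa**2 + a**2 * sb**2)

rng = np.random.default_rng(1)
x = rng.normal(size=300)
m = 0.6 * x + rng.normal(size=300)   # mediator partly driven by x (assumed effect)
y = 0.5 * m + rng.normal(size=300)   # outcome driven by the mediator (assumed effect)
z = sobel_test(x, m, y)
print(f"Sobel z = {z:.2f}")          # large |z| -> evidence for mediation
```

A Bayesian alternative like the one in the abstract would instead report a Bayes factor for the indirect effect.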

  16. The Younger Dryas impact hypothesis: A requiem

    Science.gov (United States)

    Pinter, Nicholas; Scott, Andrew C.; Daulton, Tyrone L.; Podoll, Andrew; Koeberl, Christian; Anderson, R. Scott; Ishman, Scott E.

    2011-06-01

    The Younger Dryas (YD) impact hypothesis is a recent theory that suggests that a cometary or meteoritic body or bodies hit and/or exploded over North America 12,900 years ago, causing the YD climate episode, extinction of Pleistocene megafauna, demise of the Clovis archeological culture, and a range of other effects. Since gaining widespread attention in 2007, substantial research has focused on testing the 12 main signatures presented as evidence of a catastrophic extraterrestrial event 12,900 years ago. Here we present a review of the impact hypothesis, including its evolution and current variants, and of efforts to test and corroborate the hypothesis. The physical evidence interpreted as signatures of an impact event can be separated into two groups. The first group consists of evidence that has been largely rejected by the scientific community and is no longer in widespread discussion, including: particle tracks in archeological chert; magnetic nodules in Pleistocene bones; impact origin of the Carolina Bays; and elevated concentrations of radioactivity, iridium, and fullerenes enriched in 3He. The second group consists of evidence that has been active in recent research and discussions: carbon spheres and elongates, magnetic grains and magnetic spherules, byproducts of catastrophic wildfire, and nanodiamonds. Over time, however, these signatures have also seen contrary evidence rather than support. Recent studies have shown that carbon spheres and elongates do not represent extraterrestrial carbon nor impact-induced megafires, but are indistinguishable from fungal sclerotia and arthropod fecal material that are a small but common component of many terrestrial deposits. Magnetic grains and spherules are heterogeneously distributed in sediments, but reported measurements of unique peaks in concentrations at the YD onset have yet to be reproduced. The magnetic grains are certainly just iron-rich detrital grains, whereas reported YD magnetic spherules are

  17. Stylized facts from a threshold-based heterogeneous agent model

    Science.gov (United States)

    Cross, R.; Grinfeld, M.; Lamba, H.; Seaman, T.

    2007-05-01

    A class of heterogeneous agent models is investigated where investors switch trading position whenever their motivation to do so exceeds some critical threshold. These motivations can be psychological in nature or reflect behaviour suggested by the efficient market hypothesis (EMH). By introducing different propensities into a baseline model that displays EMH behaviour, one can attempt to isolate their effects upon the market dynamics. The simulation results indicate that the introduction of a herding propensity results in excess kurtosis and power-law decay consistent with those observed in actual return distributions, but not in significant long-term volatility correlations. Possible alternatives for introducing such long-term volatility correlations are then identified and discussed.
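A toy version of the switching mechanism described above can be simulated in a few lines; the parameters and update rule below are illustrative assumptions, not the calibrated model of the paper:

```python
import numpy as np

# Toy threshold-switching market (all parameters are illustrative assumptions)
rng = np.random.default_rng(42)
N, T = 200, 1000
state = rng.choice([-1, 1], size=N)   # each agent is long (+1) or short (-1)
pressure = np.zeros(N)                # accumulated motivation to switch
threshold = 1.0                       # critical switching threshold
kappa = 0.05                          # herding propensity
returns = np.empty(T)

for t in range(T):
    sentiment = state.mean()                          # market-wide imbalance
    returns[t] = sentiment + 0.1 * rng.normal()       # excess demand plus noise
    # Motivation grows with idiosyncratic shocks plus a herding term that
    # pushes agents who disagree with the majority toward switching.
    pressure += 0.1 * rng.normal(size=N) - kappa * sentiment * state
    flip = np.abs(pressure) > threshold
    state[flip] *= -1                                 # threshold crossed: switch
    pressure[flip] = 0.0

excess_kurtosis = ((returns - returns.mean())**4).mean() / returns.var()**2 - 3
print(f"excess kurtosis of returns: {excess_kurtosis:.2f}")
```

Varying the herding propensity kappa against a kappa = 0 baseline is the kind of comparison the abstract describes for isolating a propensity's effect on the return distribution.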

  18. Some problems in the acceptability of implementing radiation protection programs

    International Nuclear Information System (INIS)

    Neill, R.H.

    1997-01-01

    The three fundamentals that radiation protection programs are based upon are: 1) establishing a quantitative correlation between radiation exposure and biological effects in people; 2) determining a level of acceptable risk of exposure; and 3) establishing systems to measure the radiation dose to ensure compliance with the regulations or criteria. The paper discusses the interrelationship of these fundamentals, difficulties in obtaining a consensus on acceptable risk, and gives some examples of problems in identifying the most critical population-at-risk and in measuring dose. Despite such problems, it is recommended that we proceed with the existing conservative structure of radiation protection programs based upon a linear no-threshold model for low radiation doses to ensure public acceptability of various potential radiation risks. Voluntary compliance as well as regulatory requirements should continue to be pursued to maintain minimal exposure to ionizing radiation. (author)

  19. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  20. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

    Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2 °C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  1. The Criticality Hypothesis in Neural Systems

    Science.gov (United States)

    Karimipanah, Yahya

    There is mounting evidence that neural networks of the cerebral cortex exhibit scale invariant dynamics. At the larger scale, fMRI recordings have shown evidence for spatiotemporal long range correlations. On the other hand, at the smaller scales this scale invariance is marked by the power law distribution of the size and duration of spontaneous bursts of activity, which are referred to as neuronal avalanches. The existence of such avalanches has been confirmed by several studies in vitro and in vivo, among different species and across multiple scales, from the spatial scale of MEG and EEG down to single cell resolution. This prevalent scale free nature of cortical activity suggests the hypothesis that the cortex resides at a critical state between two phases of order (short-lasting activity) and disorder (long-lasting activity). In addition, it has been shown, both theoretically and experimentally, that being at criticality brings about certain functional advantages for information processing. However, despite the abundance of evidence for and plausibility of the neural criticality hypothesis, still very little is known about how the brain may leverage such criticality to facilitate neural coding. Moreover, the emergent functions that may arise from critical dynamics are poorly understood. In the first part of this thesis, we review several pieces of evidence for the neural criticality hypothesis at different scales, as well as some of the most popular theories of self-organized criticality (SOC). Thereafter, we will focus on the most prominent evidence from small scales, namely neuronal avalanches. We will explore the effect of adaptation and how it can maintain scale free dynamics even in the presence of external stimuli. Using calcium imaging we also experimentally demonstrate the existence of scale free activity at the cellular resolution in vivo.
Moreover, by exploring the subsampling issue in neural data, we will find some fundamental constraints of the conventional methods

  2. Acceptance conditions in automated negotiation

    NARCIS (Netherlands)

    Baarslag, T.; Hindriks, K.V.; Jonker, C.M.

    2011-01-01

    In every negotiation with a deadline, one of the negotiating parties has to accept an offer to avoid a break off. A break off is usually an undesirable outcome for both parties, therefore it is important that a negotiator employs a proficient mechanism to decide under which conditions to accept.

  3. Consumer Acceptance of Novel Foods

    NARCIS (Netherlands)

    Fischer, A.R.H.; Reinders, M.J.

    2016-01-01

    The success of novel foods depends to a considerable extent on whether consumers accept those innovations. This chapter provides an overview of current knowledge relevant to consumer acceptance of innovations in food. A broad range of theories and approaches to assess consumer response to

  4. Consumer acceptance of functional foods

    DEFF Research Database (Denmark)

    Frewer, Lynn J.; Scholderer, Joachim; Lambert, Nigel

    2003-01-01

    In the past, it has been assumed that consumers would accept novel foods if there is a concrete and tangible consumer benefit associated with them, which implies that those functional foods would quickly be accepted. However, there is evidence that individuals are likely to differ in the extent t...

  5. Worldwide nuclear revival and acceptance

    International Nuclear Information System (INIS)

    Geraets, Luc H.; Crommelynck, Yves A.

    2009-01-01

    The current status and trends of the nuclear revival in Europe and abroad are outlined. The development of public opinion over the last decade plays an important part: it has turned from clear rejection to cautious acceptance. Transparency and open communication will be important aspects in the further development of nuclear acceptance. (orig.)

  6. Problems with the Younger Dryas Boundary (YDB) Impact Hypothesis

    Science.gov (United States)

    Boslough, M.

    2009-12-01

    One breakthrough of 20th-century Earth science was the recognition of impacts as an important geologic process. The most obvious result is a crater. There are more than 170 confirmed terrestrial impact structures, with a non-uniform spatial distribution suggesting more remain to be found. Many have been erased by tectonics and erosion. Deep water impacts do not form craters, and craters in ice sheets disappear when the ice melts. There is growing speculation that such hidden impacts have caused frequent major environmental events of the Holocene, but this is inconsistent with the astronomically-constrained population of Earth-crossing asteroids. Impacts can have consequences much more significant than excavation of a crater. The K/T boundary mass extinction is attributed to the environmental effects of a major impact, and some researchers argue that other extinctions, abrupt climate changes, and even civilization collapses have resulted from impacts. Nuclear winter models suggest that 2-km diameter asteroids exceed a "global catastrophe threshold" by injecting sufficient dust into the stratosphere to cause short-term climate changes, but would not necessarily collapse most natural ecosystems or cause mass extinctions. Globally-catastrophic impacts recur on timescales of about one million years. The 1994 collision of Comet Shoemaker-Levy 9 with Jupiter led us to recognize the significance of terrestrial airbursts caused by objects exploding violently in Earth’s atmosphere. We have invoked airbursts to explain rare forms of non-volcanic glasses and melts by using high-resolution computational models to improve our understanding of atmospheric explosions, and have suggested that multiple airbursts from fragmented impactors could be responsible for regional effects. Our models have been cited in support of the widely-publicized YDB impact hypothesis. Proponents claim that a broken comet exploded over North America, with some fragments cratering the Laurentide Ice Sheet. They

  7. A novel hypothesis splitting method implementation for multi-hypothesis filters

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free1. The multi-hypothesis filters commonly approximate the distribution...... transformations better, if the covariances of the individual hypotheses are sufficiently small. We propose a look-up table based method to calculate a set of Gaussian hypotheses approximating a wider Gaussian in order to improve the filter approximation. Python bindings for the library are also provided for fast...
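The library's look-up-table splitting method is not reproduced here, but the underlying idea (replacing one Gaussian with a mixture of smaller-variance components while preserving its moments) can be sketched with a minimal moment-matched two-component split. The function names and the `ratio` parameter are illustrative assumptions, not the library's API.

```python
import math

def split_gaussian(mu, var, ratio=0.5):
    """Split N(mu, var) into two equally weighted components with smaller
    variance, preserving the mixture mean and variance exactly.
    'ratio' in (0, 1) sets how much variance moves into the mean spread."""
    delta = math.sqrt(ratio * var)   # offset of the two component means
    comp_var = var - delta ** 2      # reduced variance of each component
    return [(0.5, mu - delta, comp_var), (0.5, mu + delta, comp_var)]

def mixture_moments(components):
    """Mean and variance of a Gaussian mixture given (weight, mean, var) triples."""
    mean = sum(w * m for w, m, _ in components)
    var = sum(w * (v + (m - mean) ** 2) for w, m, v in components)
    return mean, var

comps = split_gaussian(mu=2.0, var=4.0)
mean, var = mixture_moments(comps)   # ≈ (2.0, 4.0): moments preserved
```

Each component has a smaller variance than the original Gaussian, which is what lets a filter track nonlinear transformations more faithfully.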

  8. Alternatives to the linear risk hypothesis

    International Nuclear Information System (INIS)

    Craig, A.G.

    1976-01-01

    A theoretical argument is presented which suggests that in using the linear hypothesis for all values of LET the low dose risk is overestimated for low LET but underestimated for very high LET. The argument is based upon the idea that cell lesions which do not lead to cell death may in fact lead to a malignant cell. Expressions for the Surviving Fraction and the Cancer Risk based on this argument are given. An advantage of this very general approach is that it expresses cell survival and cancer risk entirely in terms of the cell lesions and avoids the rather contentious argument as to how the average number of lesions should be related to the dose. (U.K.)

  9. Novae, supernovae, and the island universe hypothesis

    International Nuclear Information System (INIS)

    Van Den Bergh, S.

    1988-01-01

    Arguments in Curtis's (1917) paper related to the island universe hypothesis and the existence of novae in spiral nebulae are considered. It is noted that the maximum magnitude versus rate-of-decline relation for novae may be the best tool presently available for the calibration of the extragalactic distance scale. Light curve observations of six novae are used to determine a distance of 18.6 ± 3.5 Mpc to the Virgo cluster. Results suggest that Type Ia supernovae cannot easily be used as standard candles, and that Type II supernovae are unsuitable as distance indicators. Factors other than precursor mass are probably responsible for determining the ultimate fate of evolving stars. 83 references

  10. Multi-hypothesis modelling of snowmelt

    Science.gov (United States)

    Essery, R.

    2017-12-01

    Modules to predict the melt of snow on the ground are essential components of hydrological and climatological models. Energy to melt snow can come from shortwave or longwave radiation fluxes, turbulent heat fluxes from the atmosphere, conducted heat fluxes from the ground or advected heat in rain falling on snow. Multiple competing hypotheses (parametrizations) for these fluxes and how they are connected to model state variables are in current use. The multiple sources of energy and limited data to constrain them lead to a great deal of equifinality and difficulties in understanding model behaviour. This presentation will discuss how a multi-hypothesis snow model can be used to understand the complementary, competing and confounding influences of model structural choices, parameter uncertainty and input data errors on simulations of snowmelt.
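Two of the competing melt parametrizations mentioned in this record can be sketched in a few lines. The constants, function names, and the degree-day factor are illustrative placeholders, not values from any particular snow model.

```python
LATENT_HEAT_FUSION = 334_000.0  # J/kg, latent heat of fusion of ice

def melt_energy_balance(net_energy_wm2, dt_s):
    """Melt (kg/m^2) from the net surface energy flux over a time step:
    one hypothesis a multi-hypothesis model can select, summing radiative,
    turbulent, ground-conducted and rain-advected energy into one flux."""
    return max(0.0, net_energy_wm2) * dt_s / LATENT_HEAT_FUSION

def melt_degree_day(air_temp_c, factor_mm_per_degday=3.0):
    """Competing, simpler hypothesis: daily melt (mm/day) proportional
    to positive air temperature via an empirical degree-day factor."""
    return max(0.0, air_temp_c) * factor_mm_per_degday

m_eb = melt_energy_balance(167.0, 2000.0)  # 167 W/m^2 for 2000 s -> 1.0 kg/m^2
m_dd = melt_degree_day(4.0)                # 4 degC day -> 12 mm
```

A multi-hypothesis framework would run both (and other) parametrizations against the same forcing data to expose equifinality between structural choices.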

  11. The regulation of the air: a hypothesis

    Directory of Open Access Journals (Sweden)

    E. G. Nisbet

    2012-03-01

    We propose the hypothesis that natural selection, acting on the specificity or preference for CO2 over O2 of the enzyme rubisco (ribulose-1,5-bisphosphate carboxylase/oxygenase), has controlled the CO2:O2 ratio of the atmosphere since the evolution of photosynthesis and has also sustained the Earth's greenhouse-set surface temperature. Rubisco works in partnership with the nitrogen-fixing enzyme nitrogenase to control atmospheric pressure. Together, these two enzymes control global surface temperature and indirectly the pH and oxygenation of the ocean. Thus, the co-evolution of these two enzymes may have produced clement conditions on the Earth's surface, allowing life to be sustained.

  12. Large numbers hypothesis. II - Electromagnetic radiation

    Science.gov (United States)

    Adams, P. J.

    1983-01-01

    This paper develops the theory of electromagnetic radiation in the units covariant formalism incorporating Dirac's large numbers hypothesis (LNH). A direct field-to-particle technique is used to obtain the photon propagation equation which explicitly involves the photon replication rate. This replication rate is fixed uniquely by requiring that the form of a free-photon distribution function be preserved, as required by the 2.7 K cosmic radiation. One finds that with this particular photon replication rate the units covariant formalism developed in Paper I actually predicts that the ratio of photon number to proton number in the universe varies as t to the 1/4, precisely in accord with LNH. The cosmological red-shift law is also derived and it is shown to differ considerably from the standard form νR = const.

  13. Automatic histogram threshold using fuzzy measures.

    Science.gov (United States)

    Vieira Lopes, Nuno; Mogadouro do Couto, Pedro A; Bustince, Humberto; Melo-Pinto, Pedro

    2010-01-01

    In this paper, an automatic histogram threshold approach based on a fuzziness measure is presented. This work is an improvement of an existing method. Using fuzzy logic concepts, the problems involved in finding the minimum of a criterion function are avoided. Similarity between gray levels is the key to finding an optimal threshold. Two initial regions of gray levels, located at the boundaries of the histogram, are defined. Then, using an index of fuzziness, a similarity process is started to find the threshold point. A significant contrast between objects and background is assumed. Histogram equalization is applied beforehand to low-contrast images. No prior knowledge of the image is required.
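The paper's similarity-based procedure deliberately avoids minimizing a criterion function; for orientation, here is a sketch of the closely related classical approach it improves upon, in which a fuzziness index is minimized over candidate thresholds (Huang-and-Wang style). The toy histogram and the membership function are illustrative assumptions.

```python
def fuzzy_threshold(hist):
    """Fuzziness-minimizing threshold: for each candidate t, memberships
    are defined from the distance of each gray level to its region mean;
    the t with minimal total (linear) index of fuzziness is chosen."""
    levels = range(len(hist))
    total = sum(hist)
    C = max(levels) or 1  # normalization constant for |g - mean|
    best_t, best_f = None, float("inf")
    for t in levels[:-1]:
        n0 = sum(hist[: t + 1])
        n1 = total - n0
        if n0 == 0 or n1 == 0:
            continue
        m0 = sum(g * hist[g] for g in range(t + 1)) / n0
        m1 = sum(g * hist[g] for g in range(t + 1, len(hist))) / n1
        f = 0.0
        for g in levels:
            mean = m0 if g <= t else m1
            mu = 1.0 / (1.0 + abs(g - mean) / C)  # membership in its region
            f += hist[g] * min(mu, 1.0 - mu)      # linear index of fuzziness
        if f < best_f:
            best_t, best_f = t, f
    return best_t

# bimodal toy histogram: dark object around level 2, bright background around 12
hist = [0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 2, 10, 30, 10, 2, 0]
t = fuzzy_threshold(hist)
```

For a clearly bimodal histogram like this one, the selected threshold lands in the gap between the two modes.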

  14. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  15. Digital IP Protection Using Threshold Voltage Control

    OpenAIRE

    Davis, Joseph; Kulkarni, Niranjan; Yang, Jinghua; Dengi, Aykut; Vrudhula, Sarma

    2016-01-01

    This paper proposes a method to completely hide the functionality of a digital standard cell. This is accomplished by a differential threshold logic gate (TLG). A TLG with $n$ inputs implements a subset of Boolean functions of $n$ variables that are linear threshold functions. The output of such a gate is one if and only if an integer weighted linear arithmetic sum of the inputs equals or exceeds a given integer threshold. We present a novel architecture of a TLG that not only allows a single...
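The gate's defining rule (output 1 if and only if an integer-weighted sum of the inputs reaches the threshold) is easy to state in code. This sketch models only the Boolean behavior of a threshold logic gate, not the paper's differential circuit architecture or its threshold-voltage-based obfuscation.

```python
def tlg(weights, threshold):
    """Build a linear threshold gate: output 1 iff the integer-weighted
    sum of the binary inputs equals or exceeds the threshold."""
    def gate(*x):
        return int(sum(w * xi for w, xi in zip(weights, x)) >= threshold)
    return gate

# 2-of-3 majority is a linear threshold function: weights (1, 1, 1), threshold 2
maj = tlg((1, 1, 1), 2)

# AND and OR are also linear threshold functions; XOR famously is not
and2 = tlg((1, 1), 2)
or2 = tlg((1, 1), 1)
```

Because many distinct Boolean functions arise from the same gate topology with different weights and thresholds, fixing the function only at the threshold level is what lets such a cell hide its logic.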

  16. The Stress Acceleration Hypothesis of Nightmares

    Directory of Open Access Journals (Sweden)

    Tore Nielsen

    2017-06-01

    Adverse childhood experiences can deleteriously affect future physical and mental health, increasing risk for many illnesses, including psychiatric problems, sleep disorders, and, according to the present hypothesis, idiopathic nightmares. Much like post-traumatic nightmares, which are triggered by trauma and lead to recurrent emotional dreaming about the trauma, idiopathic nightmares are hypothesized to originate in early adverse experiences that lead in later life to the expression of early memories and emotions in dream content. Accordingly, the objectives of this paper are to (1) review existing literature on sleep, dreaming and nightmares in relation to early adverse experiences, drawing upon both empirical studies of dreaming and nightmares and books and chapters by recognized nightmare experts and (2) propose a new approach to explaining nightmares that is based upon the Stress Acceleration Hypothesis of mental illness. The latter stipulates that susceptibility to mental illness is increased by adversity occurring during a developmentally sensitive window for emotional maturation—the infantile amnesia period—that ends around age 3½. Early adversity accelerates the neural and behavioral maturation of emotional systems governing the expression, learning, and extinction of fear memories and may afford short-term adaptive value. But it also engenders long-term dysfunctional consequences including an increased risk for nightmares. Two mechanisms are proposed: (1) disruption of infantile amnesia allows normally forgotten early childhood memories to influence later emotions, cognitions and behavior, including the common expression of threats in nightmares; (2) alterations of normal emotion regulation processes of both waking and sleep lead to increased fear sensitivity and less effective fear extinction. These changes influence an affect network previously hypothesized to regulate fear extinction during REM sleep, disruption of which leads to

  17. Ecological Hypothesis of Dentin and Root Caries.

    Science.gov (United States)

    Takahashi, Nobuhiro; Nyvad, Bente

    2016-01-01

    Recent advances regarding the caries process indicate that ecological phenomena induced by bacterial acid production tilt the de- and remineralization balance of the dental hard tissues towards demineralization through bacterial acid-induced adaptation and selection within the microbiota - from the dynamic stability stage to the aciduric stage via the acidogenic stage [Takahashi and Nyvad, 2008]. Dentin and root caries can also be partly explained by this hypothesis; however, the fact that these tissues contain a considerable amount of organic material suggests that protein degradation is involved in caries formation. In this review, we compiled relevant histological, biochemical, and microbiological information about dentin/root caries and refined the hypothesis by adding degradation of the organic matrix (the proteolytic stage) to the abovementioned stages. Bacterial acidification not only induces demineralization and exposure of the organic matrix in dentin/root surfaces but also activation of dentin-embedded and salivary matrix metalloproteinases and cathepsins. These phenomena initiate degradation of the demineralized organic matrix in dentin/root surfaces. While a bacterial involvement has never been confirmed in the initial degradation of organic material, the detection of proteolytic/amino acid-degrading bacteria and bacterial metabolites in dentin and root caries suggests a bacterial digestion and metabolism of partly degraded matrix. Moreover, bacterial metabolites might induce pulpitis as an inflammatory/immunomodulatory factor. Root and dentin surfaces are always at risk of becoming demineralized in the oral cavity, and exposed organic materials can be degraded by host-derived proteases contained in saliva and dentin itself. New approaches to the prevention and treatment of root/dentin caries are required. © 2016 S. Karger AG, Basel.

  18. On the immunostimulatory hypothesis of cancer

    Directory of Open Access Journals (Sweden)

    Juan Bruzzo

    2011-12-01

    There is a rather generalized belief that the worst possible outcome for the application of immunological therapies against cancer is a null effect on tumor growth. However, a significant body of evidence summarized in the immunostimulatory hypothesis of cancer suggests that, under certain circumstances, the growth of incipient and established tumors can be accelerated rather than inhibited by the immune response supposedly mounted to limit tumor growth. In order to provide more compelling evidence of this proposition, we have explored the growth behavior characteristics of twelve murine tumors (most of them of spontaneous origin) arisen in the colony of our laboratory, in putatively immunized and control mice. Using classical immunization procedures, 8 out of 12 tumors were actually stimulated in "immunized" mice while the remaining 4 were neither inhibited nor stimulated. Further, even these apparently non-antigenic tumors could reveal some antigenicity if immunization procedures more stringent than the classical ones were used. This possibility was suggested by the results obtained with one of these four apparently non-antigenic tumors: the LB lymphoma. In effect, upon these stringent immunization pretreatments, LB was slightly inhibited or stimulated, depending on the titer of the immune reaction mounted against the tumor, with higher titers rendering inhibition and lower titers rendering tumor stimulation. All the above results are consistent with the immunostimulatory hypothesis, which entails the important therapeutic implications, contrary to the orthodoxy, that anti-tumor vaccines may run a real risk of doing harm if the vaccine-induced immunity is too weak to move the reaction into the inhibitory part of the immune response curve, and that a slight and prolonged immunodepression, rather than an immunostimulation, might interfere with the progression of some tumors and thus be an aid to cytotoxic therapies.

  19. The Stress Acceleration Hypothesis of Nightmares

    Science.gov (United States)

    Nielsen, Tore

    2017-01-01

    Adverse childhood experiences can deleteriously affect future physical and mental health, increasing risk for many illnesses, including psychiatric problems, sleep disorders, and, according to the present hypothesis, idiopathic nightmares. Much like post-traumatic nightmares, which are triggered by trauma and lead to recurrent emotional dreaming about the trauma, idiopathic nightmares are hypothesized to originate in early adverse experiences that lead in later life to the expression of early memories and emotions in dream content. Accordingly, the objectives of this paper are to (1) review existing literature on sleep, dreaming and nightmares in relation to early adverse experiences, drawing upon both empirical studies of dreaming and nightmares and books and chapters by recognized nightmare experts and (2) propose a new approach to explaining nightmares that is based upon the Stress Acceleration Hypothesis of mental illness. The latter stipulates that susceptibility to mental illness is increased by adversity occurring during a developmentally sensitive window for emotional maturation—the infantile amnesia period—that ends around age 3½. Early adversity accelerates the neural and behavioral maturation of emotional systems governing the expression, learning, and extinction of fear memories and may afford short-term adaptive value. But it also engenders long-term dysfunctional consequences including an increased risk for nightmares. Two mechanisms are proposed: (1) disruption of infantile amnesia allows normally forgotten early childhood memories to influence later emotions, cognitions and behavior, including the common expression of threats in nightmares; (2) alterations of normal emotion regulation processes of both waking and sleep lead to increased fear sensitivity and less effective fear extinction. These changes influence an affect network previously hypothesized to regulate fear extinction during REM sleep, disruption of which leads to nightmares. This

  20. Radar rainfall estimation for the identification of debris-flow precipitation thresholds

    Science.gov (United States)

    Marra, Francesco; Nikolopoulos, Efthymios I.; Creutin, Jean-Dominique; Borga, Marco

    2014-05-01

    variogram) of the triggering rainfall. These results show that weather radar has the potential to effectively increase the accuracy of rainfall thresholds for debris-flow occurrence. However, these benefits may only be achieved if the same monitoring instrumentation is used both to derive the rainfall thresholds and to apply them for real-time identification of debris-flow occurrence. References Nikolopoulos, E.I., Borga, M., Crema, S., Marchi, L., Marra, F. and Guzzetti, F., 2014. Impact of uncertainty in rainfall estimation on the identification of rainfall thresholds for debris-flow occurrence. Geomorphology (conditionally accepted). Peruccacci, S., Brunetti, M.T., Luciani, S., Vennari, C., and Guzzetti, F., 2012. Lithological and seasonal control of rainfall thresholds for the possible initiation of landslides in central Italy. Geomorphology, 139-140, 79-90.
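Rainfall thresholds of the kind discussed in this record are commonly expressed as a power-law intensity-duration (ID) curve, I = alpha * D^(-beta). The sketch below uses placeholder parameter values, not those derived in the cited studies.

```python
def rainfall_threshold(duration_h, alpha=7.7, beta=0.39):
    """ID threshold of power-law form I = alpha * D**(-beta), giving the
    minimum mean intensity (mm/h) over a duration D (h) associated with
    debris-flow triggering. Parameter values here are placeholders."""
    return alpha * duration_h ** (-beta)

def exceeds_threshold(intensity_mmh, duration_h):
    """True if an observed (intensity, duration) pair lies above the curve."""
    return intensity_mmh >= rainfall_threshold(duration_h)

i_1h = rainfall_threshold(1.0)    # 7.7 mm/h needed over 1 hour
i_24h = rainfall_threshold(24.0)  # much lower mean intensity over 24 hours
```

Uncertainty in the rainfall estimate (rain gauge vs. radar) shifts observed events relative to this curve, which is exactly why thresholds derived and applied with different instruments can misclassify events.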

  1. Dental pain induced by an ambient thermal differential: pathophysiological hypothesis

    Directory of Open Access Journals (Sweden)

    Le Fur-Bonnabesse A

    2017-12-01

    Anaïs Le Fur-Bonnabesse,1,2 Céline Bodéré,1–3 Cyrielle Hélou,2 Valérie Chevalier,2,4 Jean-Paul Goulet5 1Laboratory of Neurosciences of Brest (EA4685), University of Western Brittany, Brest, France; 2Dental School, University of Western Brittany, Brest, France; 3Assessment and Treatment Center of Pain, Regional and University Hospital Center, Brest, France; 4Laboratory IRDL, FRE CNRS 3744, University of Western Brittany, Brest, France; 5School of Dental Medicine, Universite Laval, Quebec, QC, Canada Abstract: Dental pain triggered by a temperature differential is a misrecognized condition and a form of dental allodynia. Dental allodynia is characterized by recurrent episodes of diffuse, dull and throbbing tooth pain that develops when returning to indoor room temperature after prolonged exposure to cold weather. The pain episode may last up to a few hours before subsiding. Effective treatment is to properly shield the pulpal tissue of the offending tooth by increasing the protective layer of the dentin/enamel complex. This review underscores the difference from dentin hypersensitivity and offers a mechanistic hypothesis based on the following processes. Repeated exposure to significant positive temperature gradients (from cold to warm) generates phenotypic changes of dental primary afferents on selected teeth, with subsequent development of a "low-grade" neurogenic inflammation. As a result, nociceptive C-fibers become sensitized and responsive to innocuous temperature gradients because the activation threshold of specific TRP ion channels is lowered and central sensitization takes place. Comprehensive overviews covering dental innervation and sensory modalities, thermodynamics of tooth structure, mechanisms of dental nociception and thermal pain are also provided. Keywords: pain, dental pain, thermal allodynia, atypical tooth sensitivity, dentin hypersensitivity

  2. A threshold concentration of anti-merozoite antibodies is required for protection from clinical episodes of malaria

    DEFF Research Database (Denmark)

    Murungi, Linda M; Kamuyu, Gathoni; Lowe, Brett

    2013-01-01

    Antibodies to selected Plasmodium falciparum merozoite antigens are often reported to be associated with protection from malaria in one epidemiological cohort, but not in another. Here, we sought to understand this paradox by exploring the hypothesis that a threshold concentration of antibodies i...

  3. Assessing the Computational Potential of the Eschaton - Testing the Selfish Biocosm Hypothesis

    Science.gov (United States)

    Gardner, J. N.

    The Selfish Biocosm (SB) hypothesis asserts that the anthropic qualities which our universe exhibits can be explained as incidental consequences of a cosmic replication cycle in which a cosmologically extended biosphere supplies two of the essential elements of self-replication identified by von Neumann. It was previously suggested that the hypothesis implies (1) that the emergence of life and intelligence are key epigenetic thresholds in the cosmic replication cycle, strongly favored by the physical laws and constants which prevail in our particular universe and (2) that a falsifiable implication of the hypothesis is that the emergence of increasingly intelligent life is a robust phenomenon, strongly favored by the natural processes of evolution which result from the interplay of those laws and constants. Here I propose a further falsifiable implication of the SB hypothesis: that there exists a plausible final state of the cosmos which exhibits maximal computational potential. This predicted final state, the Omega Point or eschaton, appears to be not inconsistent with Lloyd's description of the ultimate computational device: a computer as powerful as the laws of physics will allow.

  4. Secure information management using linguistic threshold approach

    CERN Document Server

    Ogiela, Marek R

    2013-01-01

    This book details linguistic threshold schemes for information sharing. It examines the opportunities of using these techniques to create new models of managing strategic information shared within a commercial organisation or a state institution.

  5. Melanin microcavitation threshold in the near infrared

    Science.gov (United States)

    Schmidt, Morgan S.; Kennedy, Paul K.; Vincelette, Rebecca L.; Schuster, Kurt J.; Noojin, Gary D.; Wharmby, Andrew W.; Thomas, Robert J.; Rockwell, Benjamin A.

    2014-02-01

    Thresholds for microcavitation of isolated bovine and porcine melanosomes were determined using single nanosecond (ns) laser pulses in the NIR (1000 - 1319 nm) wavelength regime. Average fluence thresholds for microcavitation increased non-linearly with increasing wavelength. Average fluence thresholds were also measured for 10-ns pulses at 532 nm, and found to be comparable to visible ns pulse values published in previous reports. Fluence thresholds were used to calculate melanosome absorption coefficients, which decreased with increasing wavelength. This trend was found to be comparable to the decrease in retinal pigmented epithelial (RPE) layer absorption coefficients reported over the same wavelength region. Estimated corneal total intraocular energy (TIE) values were determined and compared to the current and proposed maximum permissible exposure (MPE) safe exposure levels. Results from this study support the proposed changes to the MPE levels.

  6. A prototype threshold Cherenkov counter for DIRAC

    CERN Document Server

    Bragadireanu, M; Cima, E; Dulach, B; Gianotti, P; Guaraldo, C; Iliescu, M A; Lanaro, A; Levi-Sandri, P; Petrascu, C; Girolami, B; Groza, L; Kulikov, A; Kuptsov, A; Topilin, N; Trusov, S

    1999-01-01

    We have designed, built and tested a gas threshold Cherenkov counter as prototype for a larger counter foreseen for use in the DIRAC experiment, at CERN. We describe the performances of the counter on a test beam.
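The operating principle of a threshold Cherenkov counter follows from the radiation condition beta > 1/n, which translates into a minimum momentum p = m / sqrt(n^2 - 1) for each particle species. The refractive index below is a hypothetical example for illustration, not the gas used in the DIRAC prototype.

```python
import math

def cherenkov_threshold_momentum(mass_gev, n):
    """Momentum (GeV/c) above which a particle of the given mass radiates
    Cherenkov light in a medium of refractive index n:
    beta > 1/n  <=>  p > m / sqrt(n^2 - 1)."""
    return mass_gev / math.sqrt(n * n - 1.0)

# hypothetical gas with n = 1.0005: pions radiate well below the proton threshold
p_pi = cherenkov_threshold_momentum(0.1396, 1.0005)  # pi+-, approx. 4.4 GeV/c
p_p = cherenkov_threshold_momentum(0.9383, 1.0005)   # proton, approx. 29.7 GeV/c
```

The gap between the two thresholds is what makes such a counter useful for particle identification: in the intermediate momentum window, a signal tags the lighter species.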

  7. Recent progress in understanding climate thresholds

    NARCIS (Netherlands)

    Good, Peter; Bamber, Jonathan; Halladay, Kate; Harper, Anna B.; Jackson, Laura C.; Kay, Gillian; Kruijt, Bart; Lowe, Jason A.; Phillips, Oliver L.; Ridley, Jeff; Srokosz, Meric; Turley, Carol; Williamson, Phillip

    2018-01-01

    This article reviews recent scientific progress, relating to four major systems that could exhibit threshold behaviour: ice sheets, the Atlantic meridional overturning circulation (AMOC), tropical forests and ecosystem responses to ocean acidification. The focus is on advances since the

  8. Near death experiences: A multidisciplinary hypothesis

    Directory of Open Access Journals (Sweden)

    Istvan eBokkon

    2013-09-01

    Recently, we proposed a novel biophysical concept regarding the appearance of brilliant lights during near death experiences (NDEs) (Bókkon and Salari, 2012). Specifically, perceiving brilliant light in NDEs has been proposed to arise due to the reperfusion that produces unregulated overproduction of free radicals and energetically excited molecules that can generate a transient enhancement of bioluminescent biophotons in different areas of the brain, including retinotopic visual areas. If this excess of bioluminescent photon emission exceeds a threshold in retinotopic visual areas, it can appear as (phosphene) lights because the brain interprets these intrinsic retinotopic bioluminescent photons as if they originated from the external physical world. Here, we review relevant literature that reported experimental studies (Imaizumi et al., 1984; Suzuki et al., 1985) that essentially support our previously published conception, i.e. that seeing lights in NDEs may be due to the transient enhancement of bioluminescent biophotons. Next, we briefly describe our biophysical visual representation model that may explain brilliant lights experienced during NDEs (by phosphenes as biophotons) and REM-sleep-associated dream-like intrinsic visual imageries (through biophotons) in NDEs. Finally, we link our biophysical visual representation notion to self-consciousness that may involve extremely low-energy quantum entanglements. This article is intended to introduce novel concepts for discussion and does not pretend to give the ultimate explanation for the currently unanswerable questions about matter, life and soul; their creation and their interrelationship.

  9. Cost-effectiveness thresholds: pros and cons

    OpenAIRE

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based...
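The ratio and threshold comparison described in this record amount to a one-line calculation: the incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra health gained, judged against a chosen threshold. The costs, effects, and threshold below are made-up numbers for illustration only.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of health gained (e.g. per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def good_value(icer_value, threshold):
    """An intervention is conventionally deemed cost-effective when its
    ICER falls at or below the chosen threshold (thresholds have sometimes
    been tied to multiples of GDP per capita)."""
    return icer_value <= threshold

# hypothetical comparison: new option costs 8000 more and gains 2 more QALYs
ratio = icer(cost_new=12_000, effect_new=8.0, cost_old=4_000, effect_old=6.0)
# ratio = 8000 / 2 = 4000 monetary units per QALY gained
```

Whether 4000 per QALY is "good value" then depends entirely on the threshold adopted, which is precisely the policy question the article debates.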

  10. Measurements of NN → dπ very near threshold. Pt. 1: The np → dπ⁰ cross section

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheon, D.A.; Abegg, R.; Greeniaus, L.G.; Miller, C.A. (TRIUMF, Vancouver (Canada) Nuclear Research Centre, Univ. Alberta, Edmonton (Canada)); Korkmaz, E.; Moss, G.A.; Edwards, G.W.R.; Mack, D.; Olsen, W.C.; Yanlin, Ye. (Nuclear Research Centre, Univ. Alberta, Edmonton (Canada)); Davison, N.E. (Dept. of Physics, Univ. of Manitoba, Winnipeg (Canada)); Van Heerden, I.J. (Dept. of Physics, Univ. of Western Cape, Bellville (South Africa))

    1991-12-30

    We have measured cross sections for the reaction np → dπ⁰ at beam energies very near the pion-production threshold. The yield near threshold is 23% lower than the previously accepted value based on π⁺d → pp data. P-wave pion production was observed at energies as low as 1.5 MeV (c.m.) above threshold. There was no evidence of narrow πNN resonances in the energy range surveyed, 275 to 291 MeV, corresponding to pion c.m. momenta between 2 and 43 MeV/c. (orig.)

  11. Nefopam and alfentanil additively reduce the shivering threshold in humans whereas nefopam and clonidine do not.

    Science.gov (United States)

    Alfonsi, Pascal; Passard, Andrea; Gaude-Joindreau, Valérie; Guignard, Bruno; Sessler, Daniel I; Chauvin, Marcel

    2009-07-01

    Induction of therapeutic hypothermia is often complicated by shivering. Nefopam reduces the shivering threshold with minimal side effects. Consequently, nefopam is an attractive component for induction of therapeutic hypothermia. However, nefopam alone is insufficient; it will thus need to be combined with another drug. Clonidine and alfentanil each reduce the shivering threshold. This study, therefore, tested the hypothesis that nefopam, combined either with clonidine or alfentanil, synergistically reduces the shivering threshold. For each combination, ten volunteers were studied on 4 days. Combination 1: (1) control (no drug); (2) nefopam (100 ng/ml); (3) clonidine (2.5 microg/kg); and (4) nefopam plus clonidine (100 ng/ml and 2.5 microg/kg, respectively). Combination 2: (1) control (no drug); (2) nefopam (100 ng/ml); (3) alfentanil (150 ng/ml); and (4) nefopam plus alfentanil (100 ng/ml and 150 ng/ml, respectively). Lactated Ringer's solution (approximately 4 degrees C) was infused to decrease core temperature. Mean skin temperature was maintained at 31 degrees C. The core temperature that increased oxygen consumption to more than 25% of baseline identified the shivering threshold. With nefopam and clonidine, the shivering thresholds were significantly lower than on the control day. The shivering threshold decreased significantly less than would be expected on the basis of the individual effects of each drug (P = 0.034). In contrast, the interaction between nefopam and alfentanil on shivering was additive, meaning that the combination reduced the shivering threshold as much as would be expected by the individual effect of each drug. Nefopam and alfentanil additively reduce the shivering threshold, but nefopam and clonidine do not.
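The distinction this study draws between additive and synergistic (or infra-additive) drug interactions can be made concrete: under additivity, the combined threshold reduction equals the sum of the individual reductions. The classifier below is a minimal sketch; the tolerance band and the input numbers are illustrative, not the study's statistical test.

```python
def interaction_type(delta_a, delta_b, delta_combined, tol=0.1):
    """Classify a two-drug interaction on the shivering threshold from the
    reductions (in degrees C) each drug produces alone vs. combined:
    additive if combined matches the sum of individual reductions (within
    a tolerance band), synergistic if it exceeds it, infra-additive if not."""
    expected = delta_a + delta_b
    if delta_combined > expected * (1 + tol):
        return "synergistic"
    if delta_combined < expected * (1 - tol):
        return "infra-additive"
    return "additive"

# hypothetical reductions: drug A lowers the threshold 0.5 degC, drug B 0.7 degC
kind = interaction_type(0.5, 0.7, 1.2)  # combined equals the sum -> additive
```

In the study's terms, nefopam plus alfentanil behaved like the "additive" branch, while nefopam plus clonidine reduced the threshold less than the sum of their individual effects.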

  12. The carotid baroreflex modifies the pressor threshold of the muscle metaboreflex in humans.

    Science.gov (United States)

    Ichinose, Masashi; Ichinose-Kuwahara, Tomoko; Watanabe, Kazuhito; Kondo, Narihiko; Nishiyasu, Takeshi

    2017-09-01

    The purpose of the present study was to test our hypothesis that unloading the carotid baroreceptors alters the threshold and gain of the muscle metaboreflex in humans. Ten healthy subjects performed a static handgrip exercise at 50% of maximum voluntary contraction. Contraction was sustained for 15, 30, 45, and 60 s and was followed by 3 min of forearm circulatory arrest, during which forearm muscular pH is known to decrease linearly with increasing contraction time. The carotid baroreceptors were unloaded by applying 0.1-Hz sinusoidal neck pressure (oscillating from +15 to +50 mmHg) during ischemia. We estimated the threshold and gain of the muscle metaboreflex by analyzing the relationship between the cardiovascular responses during ischemia and the amount of work done during the exercise. With the carotid baroreceptors unloaded, the muscle metaboreflex thresholds for mean arterial blood pressure (MAP) and total vascular resistance (TVR) corresponded to significantly lower work levels than in the control condition (threshold for MAP: 795 ± 102 vs. 662 ± 208 mmHg, and threshold for TVR: 818 ± 213 vs. 572 ± 292 kg·s). The carotid baroreflex thus modifies the muscle metaboreflex threshold in humans. Our results suggest that the carotid baroreflex brakes the muscle metaboreflex, thereby inhibiting muscle metaboreflex-mediated pressor and vasoconstriction responses. NEW & NOTEWORTHY We found that unloading the carotid baroreceptors shifts the pressor threshold of the muscle metaboreflex toward lower metabolic stimulation levels in humans. This finding indicates that, in the normal loading state, the carotid baroreflex inhibits the muscle metaboreflex pressor response by shifting the reflex threshold to higher metabolic stimulation levels. Copyright © 2017 the American Physiological Society.

  13. Ritz, Einstein, and the Emission Hypothesis

    Science.gov (United States)

    Martínez, Alberto A.

    Just as Albert Einstein's special theory of relativity was gaining acceptance around 1908, the young Swiss physicist Walter Ritz advanced a competing though preliminary emission theory that sought to explain the phenomena of electrodynamics on the assumption that the speed of light depends on the motion of its source. I survey Ritz's unfinished work in this area and review the reasons why Einstein and other physicists rejected Ritz's and other emission theories. Since Ritz's emission theory attracted renewed attention in the 1960s, I discuss how the earlier observational evidence was misconstrued as telling against it more conclusively than actually was the case. Finally, I contrast the role played by evidence against Ritz's theory with other factors that led to the early rejection of his approach.

  14. The Matter-Gravity Entanglement Hypothesis

    Science.gov (United States)

    Kay, Bernard S.

    2018-03-01

    I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system's matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second-law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind-Horowitz-Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, concerning the likely state of a quantum system when it is weakly coupled to an energy bath and the total state is a random pure state with a given energy. This result generalizes Goldstein et al.'s `canonical typicality' result to systems which are not necessarily small.

  15. Learning and sleep: the sequential hypothesis.

    Science.gov (United States)

    Ambrosini, M V.; Giuditta, A

    2001-12-01

    During the last 30 years, paradoxical sleep (PS) has been generally considered the only type of sleep involved in memory processing, mainly because of the consistent increase in PS episodes in laboratory animals learning a relatively complex task, and because of the retention deficits induced by post-training PS deprivation. The vicissitudes of this idea, examined in detail by several laboratories, have been critically presented in a number of review articles. However, according to a more comprehensive unitary proposal (the sequential hypothesis), memory processing during sleep requires the initial participation of slow-wave sleep (SS) in addition to the subsequent involvement of PS. The evidence supporting this hypothesis, largely derived from experiments on rats trained for a two-way active avoidance task, is reviewed here in some detail. Recent studies of human sleep are in full agreement with this view. In the rat, the main effect of learning on post-training SS is a selective increase in the average duration of the SS episodes initiating different types of sleep sequences. Notably, following training for a two-way active avoidance task, the occurrence of this effect in sleep sequences including transition sleep (TS), such as SS-->TS-->W and SS-->TS-->PS, appears related to the processing of memories of the novel avoidance response. Conversely, the occurrence of the same effect in sleep sequences lacking TS may reflect the processing of memories of innate responses (escapes and freezings). Memories of innate and novel responses are assumed to engage in a dynamic competitive interaction to attain control of waking behaviour. Interestingly, in baseline sleep, variables of SS-->TS-->W and SS-->TS-->PS sequences, such as the average duration of SS, TS, and PS episodes, have proved to be good indices of the capacity to learn, as shown by their strong correlations with the number of avoidances scored by rats the following day. Comparable correlations have not been found.

  16. On the controlling parameters for fatigue-crack threshold at low homologous temperatures

    International Nuclear Information System (INIS)

    Yu, W.; Gerberich, W.W.

    1983-01-01

    Fatigue crack propagation phenomena near the threshold stress intensity level ΔK_TH have been a vigorously studied topic in recent years. Near threshold the crack propagates rather slowly, thus giving enough time for various physical and chemical reactions to take place. Room air, which is the most commonly encountered environment, can still supply various ingredients such as oxygen and water vapor (and thus hydrogen) to support these reactions. Much effort has been directed toward the environmental aspects of near-threshold fatigue crack growth. By conducting tests under vacuum, Suresh and coworkers found that the crack propagation rate in a 2-1/4 Cr-1Mo steel was higher in vacuum than in air. Oxide-induced closure, which serves to reduce the effective stress intensity at the crack tip, seems to furnish a good explanation. Neumann and coworkers proposed that during the fatigue process, extrusion-intrusion pairs can develop as a consequence of reversed slip around the crack tip when the crack propagates near the threshold stress intensity. Beevers demonstrated that fatigue fracture surfaces contact each other during unloading, even under tension-tension cycling. Kanninen and Atkinson also reached the conclusion that the compressive stress acting at the crack tip due to residual plasticity can induce closure. Microstructural effects have also been cited as important factors in near-threshold crack growth. It is generally accepted that coarser grains have a beneficial effect on the resistance to near-threshold crack propagation.
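
The closure mechanisms discussed above (oxide-induced, plasticity-induced, surface contact) act the same way arithmetically: they raise the effective minimum stress intensity seen by the crack tip, shrinking the effective range ΔK_eff. A minimal sketch of that simple accounting; the function and values are illustrative, not from the paper:

```python
def delta_k_effective(k_max, k_min, k_closure):
    """Effective stress-intensity range when crack closure is present.

    The crack is assumed closed (and the tip unloaded) whenever the applied
    stress intensity falls below the closure level k_closure, so the tip only
    experiences the range from max(k_min, k_closure) up to k_max.
    """
    return max(k_max - max(k_min, k_closure), 0.0)

# Nominal range: 10 - 1 = 9 MPa*sqrt(m). Oxide-induced closure at 3 raises
# the effective minimum, so the crack tip only sees 10 - 3 = 7.
print(delta_k_effective(10.0, 1.0, 3.0))  # 7.0
print(delta_k_effective(10.0, 1.0, 0.0))  # 9.0 (no closure: nominal range)
```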

  17. Evolutionary hypothesis for Chiari type I malformation.

    Science.gov (United States)

    Fernandes, Yvens Barbosa; Ramina, Ricardo; Campos-Herrera, Cynthia Resende; Borges, Guilherme

    2013-10-01

    Chiari I malformation (CM-I) is classically defined as a cerebellar tonsillar herniation (≥5 mm) through the foramen magnum. A decreased posterior fossa volume, mainly due to basioccipital hypoplasia and sometimes platybasia, leads to posterior fossa overcrowding and consequently cerebellar herniation. Regardless of radiological findings, embryological or genetic hypotheses, or any other postulations, the real cause of this malformation has not yet been well elucidated and remains largely unknown. The aim of this paper is to approach CM-I from a broader and new perspective, conjoining anthropology, genetics and neurosurgery, with special focus on the substantial changes that have occurred in the posterior cranial base through human evolution. Important evolutionary allometric changes occurred during brain expansion, and genetic studies of human evolution have demonstrated an unexpectedly high rate of gene flow interchange, and possibly interbreeding, during this process. Based upon this review we hypothesize that CM-I may be the result of an evolutionary anthropological imprint, caused by evolving species populations that eventually met each other and mingled in the last 1.7 million years. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Environmental Kuznets Curve Hypothesis. A Survey

    International Nuclear Information System (INIS)

    Dinda, Soumyananda

    2004-01-01

    The Environmental Kuznets Curve (EKC) hypothesis postulates an inverted-U-shaped relationship between different pollutants and per capita income, i.e., environmental pressure increases up to a certain level as income goes up; after that, it decreases. An EKC actually reveals how a technically specified measurement of environmental quality changes as the fortunes of a country change. A sizeable literature on the EKC has grown in recent years. The common point of all the studies is the assertion that environmental quality deteriorates in the early stages of economic development/growth and subsequently improves in the later stages. In other words, environmental pressure increases faster than income at early stages of development and slows down relative to GDP growth at higher income levels. This paper reviews theoretical developments and empirical studies dealing with the EKC phenomenon. Possible explanations for the EKC include (1) the progress of economic development, from a clean agrarian economy to a polluting industrial economy to a clean service economy; and (2) the tendency of people with higher incomes to have a higher preference for environmental quality. Evidence for the existence of the EKC has been questioned from several corners: only some air quality indicators, especially local pollutants, show evidence of an EKC. Moreover, even where an EKC is empirically observed, there is no agreement in the literature on the income level at which environmental degradation starts declining. This paper provides an overview of the EKC literature, its background history, conceptual insights, policy implications, and the conceptual and methodological critiques.

  19. Evidence for the spotting hypothesis in gymnasts.

    Science.gov (United States)

    Heinen, Thomas

    2011-04-01

    The goal of this study was to investigate the visual spotting hypothesis in 10 experts and 10 apprentices as they performed back aerial somersaults from a standing position, either with no preparatory jumps (short flight duration condition) or after some preparatory jumps with a flight time of 1 s (long flight duration condition). Differences in gaze behavior and kinematics were expected between experts and apprentices and between experimental conditions. Gaze behavior was measured using a portable, wireless eye-tracking system in combination with a movement-analysis system. Experts exhibited a smaller landing deviation from the middle of the trampoline bed than apprentices. Experts showed higher fixation ratios during the take-off and flight phases. Experts exhibited no blinks in any of the somersaults in either condition, whereas apprentices showed significant blink ratios in both experimental conditions. The findings suggest that gymnasts can use visual spotting during the back aerial somersault even when the time of flight is limited. We conclude that knowledge about gaze-movement relationships may help coaches develop specific training programs for learning the back aerial somersault.

  20. [Hypothesis of "sinew-meridian system"].

    Science.gov (United States)

    Liu, Nongyu

    2017-01-12

    The author proposes a hypothesis of the "sinew-meridian system" in terms of the physiology, pathology, diagnosis and treatment of meridians and sinew-meridians. Meridians are nourished with blood, and sinew-meridians are softened with yang qi. Meridians circulate in linear form, and sinew-meridians are distributed in a centripetal pattern. Meridians communicate externally and internally, and sinew-meridians connect with tendons and bones. Meridians pertain to the zangfu organs, and sinew-meridians stabilize the zangfu organs. Meridians nourish the five sensory organs, and sinew-meridians moisten the nine orifices. Meridians are characterized by nourishment, sinew-meridians by solidity. Meridians emphasize conditions of deficiency or excess, sinew-meridians conditions of cold or heat. Meridian disorders are located deep and are complex; sinew-meridian disorders are located superficially and are simple. Meridian disorders are difficult to treat, with poor therapeutic effect; sinew-meridian disorders are easy to treat, with rapid therapeutic effect. The "sinew-meridian system" is composed of the meridian-collateral system and the tendon-skin system. The meridian-collateral system includes the twelve meridians, the eight extra meridians and the fifteen collaterals; it is associated with nutrition and blood, and acts to transport qi, blood and information. The tendon-skin system includes the twelve sinew-meridians and the twelve cutaneous regions of the meridians; it is associated with defensive qi, and acts to govern motor function and protect the body.

  1. Iron hypothesis of cardiovascular disease: still controversial.

    Science.gov (United States)

    Aursulesei, Viviana; Cozma, A; Krasniqi, A

    2014-01-01

    The iron hypothesis has been a controversial subject for over 30 years: many studies support iron's role as a risk factor for cardiovascular disease, while others have found no evidence to support it. The conflicting results are accounted for by the non-homogeneity of trial designs in terms of population inclusion criteria and endpoints, non-uniform use of parameters for assessing iron's role, and incomplete understanding of the mechanisms of action. The nature of iron is dual: it is of crucial importance for the human body, but also toxic, as "free iron" induces oxidative stress. Under physiological conditions, there are efficient and complex mechanisms against iron-induced oxidative stress, which could be reproduced to create new, intelligent antioxidants. Iron depletion improves the cardiovascular prognosis only if the serum concentration is at the lowest limit of the normal range. However, low iron levels and the type of dietary iron intake correlate with atherosclerotic cardiovascular disease, influence ischemic endpoints in the elderly, and exert a negative impact on heart failure prognosis. So far, the causal relation and the mechanisms involved are not fully elucidated. Iron overload is a difficult and frequent condition involving the cardiovascular system through specific pathogenic pathways, thereby producing a particular form of restrictive cardiomyopathy and vaso-occlusive arterial damage.

  2. Spectral analysis and the Riemann hypothesis

    Science.gov (United States)

    Lachaud, Gilles

    2003-11-01

    The explicit formulas of Riemann and Guinand-Weil relate the set of prime numbers with the set of nontrivial zeros of the zeta function of Riemann. We recall Alain Connes' spectral interpretation of the critical zeros of the Riemann zeta function as eigenvalues of the absorption spectrum of an unbounded operator in a suitable Hilbert space. We then give a spectral interpretation of the zeros of the Dedekind zeta function of an algebraic number field K of degree n in an automorphic setting. If K is a complex quadratic field, the torical forms are the functions defined on the modular surface X, such that the sum of this function over the "Gauss set" of K is zero, and Eisenstein series provide such torical forms. In the case of a general number field, one can associate to K a maximal torus T of the general linear group G. The torical forms are the functions defined on the modular variety X associated to G, such that the integral over the subvariety induced by T is zero. Alternately, the torical forms are the functions which are orthogonal to orbital series on X. We show here that the Riemann hypothesis is equivalent to certain conditions bearing on spaces of torical forms, constructed from Eisenstein series, the torical wave packets. Furthermore, we define a Hilbert space and a self-adjoint operator on this space, whose spectrum equals the set of critical zeros of the Dedekind zeta function of K.

  3. Unit cell hypothesis for Streptococcus faecalis.

    Science.gov (United States)

    Edelstein, E M; Rosenzweig, M S; Daneo-Moore, L; Higgins, M L

    1980-07-01

    The mass doubling times of exponential-phase cultures of Streptococcus faecalis were varied from 30 to 110 min by omitting glutamine from a defined growth medium and providing different concentrations of glutamate (ranging from 300 to 14 μg/ml). After Formalin fixation, cells were dried by the critical point method, and carbon-platinum replicas were prepared. The surface area and volume of cell poles seen in these replicas were estimated by a computer-assisted, three-dimensional reconstruction technique. It was found that the amount of surface area and volume of poles seen in these replicas were independent of the growth rate of the culture from which the samples were taken. These observations were consistent with the unit cell model hypothesis of Donachie and Begg, in which a small number of surface sites would produce a constant amount of new cell surface regardless of the mass doubling time of the culture. However, measurements of the thickness of the cell wall taken from thin sections of the same cells showed that the cell wall increased in thickness as a function of the increase in cellular peptidoglycan content which occurs when the growth rate of this organism is slowed down by a decrease in glutamate concentration. Thus, it would seem that although the size of polar shells made by S. faecalis is invariant with growth rate, the amount of wall precursors used to construct these shells is not.

  4. The social brain hypothesis of schizophrenia

    Science.gov (United States)

    BURNS, JONATHAN

    2006-01-01

    The social brain hypothesis is a useful heuristic for understanding schizophrenia. It focuses attention on the core Bleulerian concept of autistic alienation and is consistent with well-replicated findings of social brain dysfunction in schizophrenia as well as contemporary theories of human cognitive and brain evolution. The contributions of Heidegger, Merleau-Ponty and Wittgenstein allow us to arrive at a new "philosophy of interpersonal relatedness", which better reflects the "embodied mind" and signifies the end of Cartesian dualistic thinking. In this paper I review the evolution, development and neurobiology of the social brain - the anatomical and functional substrate for adaptive social behaviour and cognition. Functional imaging identifies fronto-temporal and fronto-parietal cortical networks as comprising the social brain, while the discovery of "mirror neurons" provides an understanding of social cognition at a cellular level. Patients with schizophrenia display abnormalities in a wide range of social cognition tasks such as emotion recognition, theory of mind and affective responsiveness. Furthermore, recent research indicates that schizophrenia is a disorder of functional and structural connectivity of social brain networks. These findings lend support to the claim that schizophrenia represents a costly by-product of social brain evolution in Homo sapiens. Individuals with this disorder find themselves seriously disadvantaged in the social arena and vulnerable to the stresses of their complex social environments. This state of "disembodiment" and interpersonal alienation is the core phenomenon of schizophrenia and the root cause of intolerable suffering in the lives of those affected. PMID:16946939

  5. Impulse Control Disorders - The Continuum Hypothesis.

    Science.gov (United States)

    Stenberg, Georg

    2016-01-01

    The group Parkinson Inside Out is composed of health professionals and academic researchers who have been diagnosed with Parkinson's disease. In our discussions we try to make use of both our inside perspective as patients and our outside perspective as professionals. In this paper, we apply the two perspectives to the impulse control disorders. These impulsive behaviour patterns are thought to be relatively uncommon side effects of some of the medication used in dopamine replacement therapy. The phenomenon is usually described as relatively rare, but difficulty in controlling impulses is a very common experience for patients undergoing dopamine replacement therapy. These difficulties in decision making are engendered by variations in dopamine accessibility in the reward centre of the brain. Only in a minority do the consequences grow to the damaging proportions of a disorder, but most patients are probably affected to some degree. Seeing, and measuring, decision difficulties as a continuous dimension, rather than as a discrete category, brings increased possibilities for early detection and continuous monitoring. With reliable measures of the propensity for impulsive decision making, it may become possible both to reap the benefits and to avoid the dangers of the dopamine agonists. We point to ways of empirically testing our continuity hypothesis.

  6. Marginal contrasts and the Contrastivist Hypothesis

    Directory of Open Access Journals (Sweden)

    Daniel Currie Hall

    2016-12-01

    The Contrastivist Hypothesis (CH; Hall 2007; Dresher 2009) holds that the only features that can be phonologically active in any language are those that serve to distinguish phonemes, which presupposes that phonemic status is categorical. Many researchers, however, demonstrate the existence of gradient relations. For instance, Hall (2009) quantifies these using the information-theoretic measure of entropy (unpredictability of distribution) and shows that a pair of sounds may have an entropy between 0 (totally predictable) and 1 (totally unpredictable). We argue that the existence of such intermediate degrees of contrastiveness does not make the CH untenable, but rather offers insight into contrastive hierarchies. The existence of a continuum does not preclude categorical distinctions: a categorical line can be drawn between zero entropy (entirely predictable, and thus by the CH phonologically inactive) and non-zero entropy (at least partially contrastive, and thus potentially phonologically active). But this does not mean that intermediate degrees of surface contrastiveness are entirely irrelevant to the CH; rather, we argue, they can shed light on how deeply ingrained a phonemic distinction is in the phonological system. As an example, we provide a case study from Pulaar [ATR] harmony, which has previously been claimed to be problematic for the CH.
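
The entropy scale described in the abstract is the standard binary-entropy formula applied to the choice between two sounds. The sketch below illustrates the idea only; it is not the paper's exact procedure, and the probabilities are hypothetical:

```python
from math import log2

def pair_entropy(p):
    """Entropy (in bits) of the choice between two sounds that occur with
    probabilities p and 1 - p in some environment.

    0 means the choice is fully predictable (allophony); 1 means it is fully
    unpredictable (classic contrast); intermediate values mark marginal contrast.
    """
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(pair_entropy(0.5))            # 1.0 -> fully contrastive
print(pair_entropy(1.0))            # 0.0 -> fully predictable
print(round(pair_entropy(0.9), 3))  # 0.469 -> intermediate, marginal contrast
```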

  7. Transpiration: A Test of Optimality Hypothesis

    Science.gov (United States)

    Wang, J.; Bras, R. L.; Lerdau, M.; Salvucci, G. D.; Wofsy, S.

    2003-12-01

    The argument is that the fundamental mechanisms behind bare soil evaporation are also responsible for plant transpiration, except that stomata affect the exchange of water vapor between the evaporating surface and the atmosphere. It is hypothesized that the system of liquid water in leaf tissues and water vapor in the atmosphere tries to evolve towards a potential equilibrium as quickly as possible by maximizing transpiration. In the proposed theory, CO2 flux is used as a non-parametric equivalent of stomatal conductance, as CO2 and water vapor diffuse in and out of leaves through the same path. It is further assumed that stomatal aperture is directly controlled by guard cell turgor (or leaf water potential). Transpiration is formulated as a function of leaf temperature, leaf water potential/stomatal conductance (or CO2 flux as the surrogate), and sensible heat flux (characterizing the transport mechanism) at a given level of radiative energy input. Optimization of transpiration constrained by the energy balance equation leads to vanishing derivatives of transpiration with respect to leaf temperature and CO2 flux. The effect of vapor pressure deficit on transpiration is also investigated. Preliminary tests using field experimental measurements provide encouraging evidence in support of the hypothesis. It is found that transpiration is fairly insensitive to atmospheric humidity, as suggested by several earlier studies.
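
The optimality condition invoked here, vanishing derivatives of transpiration at the maximum, can be illustrated with a toy concave transpiration curve. The functional form and numbers below are hypothetical, not the authors' model:

```python
# Toy illustration: at an interior maximum of transpiration E with respect to
# leaf temperature T, the derivative dE/dT vanishes. This is the first-order
# optimality condition the hypothesis exploits; the curve itself is made up.

def transpiration(T, T_opt=25.0, E_max=8.0, width=10.0):
    """Hypothetical concave transpiration curve (mm/day) vs. leaf temperature (°C)."""
    return E_max - ((T - T_opt) / width) ** 2 * E_max

def argmax_on_grid(f, lo, hi, n=10001):
    """Locate the maximizer of f on a uniform grid over [lo, hi]."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    return max(xs, key=f)

T_star = argmax_on_grid(transpiration, 0.0, 50.0)
h = 1e-4
dE_dT = (transpiration(T_star + h) - transpiration(T_star - h)) / (2 * h)
print(round(T_star, 2))   # 25.0
print(abs(dE_dT) < 1e-3)  # True: the derivative vanishes at the optimum
```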

  8. Export-led Growth Hypothesis: Turkey Application

    Directory of Open Access Journals (Sweden)

    İsmail KÜÇÜKAKSOY

    2015-12-01

    This paper aims to investigate the validity of the "Export-led Growth Hypothesis" for Turkey using quarterly data from 2003:Q1 to 2015:Q1. The hypothesis argues that a causal relationship runs from real exports to real Gross Domestic Product (GDP). The Johansen cointegration test, the Gregory-Hansen cointegration test, the Toda-Yamamoto causality test, Fully Modified Ordinary Least Squares (FMOLS), Canonical Cointegrating Regression (CCR) and Dynamic Ordinary Least Squares (DOLS) methods were used in this study. The findings can be summarized as follows: (a) according to the Johansen cointegration test there is no long-run relationship among the variables, whereas the Gregory-Hansen cointegration test finds a long-run relationship; (b) according to the Toda-Yamamoto causality test there is bidirectional causality between real exports and real GDP, which supports the validity of the "Export-led Growth Hypothesis" for Turkey; (c) according to the FMOLS, CCR and DOLS methods, a 1% increase in real exports increases real GDP by 1.5195%, 1.5552% and 1.3171%, respectively, in the long run. These methods support the validity of the "Export-led Growth Hypothesis" for Turkey.
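
The elasticity estimates in (c) come from log-log regressions, where the slope coefficient is the elasticity: a slope of 1.5 means a 1% rise in real exports is associated with a 1.5% rise in real GDP. A minimal sketch with synthetic data constructed to have an elasticity of exactly 1.5; this is not the paper's series or its estimators (FMOLS/CCR/DOLS):

```python
from math import log

def ols_slope(x, y):
    """Ordinary-least-squares slope of y on x (closed form for one regressor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

exports = [100, 120, 150, 200, 260, 330]   # hypothetical real exports
gdp = [e ** 1.5 * 2.0 for e in exports]    # built with elasticity exactly 1.5

# Slope of ln(GDP) on ln(exports) recovers the elasticity.
b = ols_slope([log(e) for e in exports], [log(g) for g in gdp])
print(round(b, 3))  # 1.5
```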

  9. A matched filter hypothesis for cognitive control.

    Science.gov (United States)

    Chrysikou, Evangelia G; Weber, Matthew J; Thompson-Schill, Sharon L

    2014-09-01

    The prefrontal cortex exerts top-down influences on several aspects of higher-order cognition by functioning as a filtering mechanism that biases bottom-up sensory information toward a response that is optimal in context. However, research also indicates that not all aspects of complex cognition benefit from prefrontal regulation. Here we review and synthesize this research with an emphasis on the domains of learning and creative cognition, and outline how the appropriate level of cognitive control in a given situation can vary depending on the organism's goals and the characteristics of the given task. We offer a matched filter hypothesis for cognitive control, which proposes that the optimal level of cognitive control is task-dependent, with high levels of cognitive control best suited to tasks that are explicit, rule-based, verbal or abstract, and can be accomplished given the capacity limits of working memory and with low levels of cognitive control best suited to tasks that are implicit, reward-based, non-verbal or intuitive, and which can be accomplished irrespective of working memory limitations. Our approach promotes a view of cognitive control as a tool adapted to a subset of common challenges, rather than an all-purpose optimization system suited to every problem the organism might encounter. © 2013 Published by Elsevier Ltd.

  10. Standards regulations and public acceptance

    International Nuclear Information System (INIS)

    Fernandez, E.C.

    1977-01-01

    Spanish nuclear legislation and the associated procedure for the authorization of installations is summarized. Public acceptance is discussed in the context of the needs for and hazards of nuclear energy. (U.K.)

  11. Market Acceptance of Smart Growth

    Science.gov (United States)

    This report finds that smart growth developments enjoy market acceptance because of stability in prices over time. Housing resales in smart growth developments often have greater appreciation than their conventional suburban counterparts.

  12. Pain thresholds during standardized psychological stress in relation to perceived psychosocial work situation. Stockholm Music I Study Group.

    Science.gov (United States)

    Theorell, T; Nordemar, R; Michélsen, H

    1993-04-01

    The hypothesis that perceived psychosocial work situation is associated with pain threshold was tested on a sample of 103 men and women aged 19-65 yr in Stockholm. Half of the studied sample was a random sample of men (N = 26) and women (N = 31), while the remaining subjects were medical secretaries (women, N = 28) and furniture movers (N = 31). Pain thresholds were measured by means of an algometer before, during and after a standardized colour word test. The measurements were made on six different points in the neck and shoulder region. Before psychological stress in the laboratory, perceived psychological demands were significantly associated with pain threshold--the higher the demands, the higher the pain threshold. During stress, those who reported low decision latitude and a high degree of sleep disturbance were shown to have a low pain threshold. The findings are consistent with the hypothesis that subjects with high demand levels have an elevated pain threshold when they are not under excessive psychological stress. During psychological stress, on the other hand, those with low decision latitude are more pain sensitive than others, and this is aggravated in those who also report a high degree of sleep disturbance.

  13. Acceptance is in the eye of the beholder: self-esteem and motivated perceptions of acceptance from the opposite sex.

    Science.gov (United States)

    Cameron, Jessica J; Stinson, Danu Anthony; Gaetz, Roslyn; Balchen, Stacey

    2010-09-01

    Social risk elicits self-esteem differences in signature social motivations and behaviors during the relationship-initiation process. In particular, the present research tested the hypothesis that lower self-esteem individuals' (LSEs) motivation to avoid rejection leads them to self-protectively underestimate acceptance from potential romantic partners, whereas higher self-esteem individuals' (HSEs) motivation to promote new relationships leads them to overestimate acceptance. The results of 5 experiments supported these predictions. Social risk increased activation of avoidance goals for LSEs on a word-recall task but increased activation of approach goals for HSEs, as evidenced by their increased use of likeable behaviors. Consistent with these patterns of goal activation, even though actual acceptance cues were held constant across all participants, social risk decreased the amount of acceptance that LSEs perceived from their interaction partner but increased the amount of acceptance that HSEs perceived from their interaction partner. It is important to note that such self-esteem differences in avoidance goals, approach behaviors, and perceptions of acceptance were completely eliminated when social risk was removed. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  14. Public acceptance of small reactors

    International Nuclear Information System (INIS)

    McDougall, D.S.

    1997-01-01

The success of any nuclear program requires acceptance by the local public and all levels of government involved in the decision to initiate a reactor program. Public acceptance of a nuclear energy source is a major challenge in successfully initiating a small reactor program. In AECL's experience, public acceptance will not be obtained until the public is convinced that the specific nuclear program is needed, is safe and economic, and is of environmental benefit to the community. The term 'public acceptance' is misleading. The objective of the program is a fully informed public. The program proponent cannot force public acceptance, which is beyond his control. He can, however, ensure that the public is informed. Once information has begun to flow to the public by various means, as will be explained later, the proponent is responsible for ensuring that the information provided by him and by others is accurate. Most importantly, and perhaps most difficult to accomplish, the proponent must develop a consultative process that allows the proponent and the public to agree on actions that are acceptable to both the proponent and the community.

  15. Tracking of nociceptive thresholds using adaptive psychophysical methods

    NARCIS (Netherlands)

    Doll, Robert; Buitenweg, Jan R.; Meijer, Hil Gaétan Ellart; Veltink, Petrus H.

    Psychophysical thresholds reflect the state of the underlying nociceptive mechanisms. For example, noxious events can activate endogenous analgesic mechanisms that increase the nociceptive threshold. Therefore, tracking thresholds over time facilitates the investigation of the dynamics of these
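The record above is truncated, but the adaptive psychophysical methods it refers to are typically staircase-style procedures. As a hedged illustration (not necessarily the authors' exact algorithm), the classic 1-up/1-down staircase converges on the 50% detection threshold:

```python
def staircase(respond, start=1.0, step=0.1, n_reversals=8):
    """Simple 1-up/1-down adaptive staircase converging on the 50% threshold.
    `respond(intensity)` returns True if the stimulus was detected."""
    intensity, direction = start, -1
    reversals = []
    while len(reversals) < n_reversals:
        detected = respond(intensity)
        new_direction = -1 if detected else 1  # detected -> step down, missed -> step up
        if new_direction != direction:         # direction flip = a reversal point
            reversals.append(intensity)
            direction = new_direction
        intensity = max(step, intensity + new_direction * step)
    return sum(reversals) / len(reversals)     # threshold = mean of reversal points
```

Averaging the reversal points estimates the stimulus level at which detection flips, which is the quantity tracked over time in studies like the one above.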

  16. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    Directory of Open Access Journals (Sweden)

    Lori Townsend

    2016-06-01

    Full Text Available This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fifty potential threshold concepts, finally settling on six information literacy threshold concepts.

  17. Cost-effectiveness thresholds: pros and cons.

    Science.gov (United States)

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.
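The GDP-based decision rules discussed above reduce to comparing an incremental cost-effectiveness ratio (ICER) against threshold multiples of per-capita GDP. A minimal sketch of that comparison (the 1x/3x multiples are the commonly cited ones, which the abstract argues against using in isolation):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    health (e.g. per DALY averted) of the new intervention over the comparator."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def gdp_based_verdict(icer_value, gdp_per_capita):
    """The GDP-multiple rule: 'highly cost-effective' below 1x GDP per capita,
    'cost-effective' below 3x, otherwise 'not cost-effective'."""
    if icer_value < gdp_per_capita:
        return "highly cost-effective"
    if icer_value < 3 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"
```

For example, an intervention costing 20 000 more than its comparator while averting 2 extra DALYs has an ICER of 10 000 per DALY; the point of the paper is that this single number should not decide funding on its own.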

  18. Threshold concepts in finance: conceptualizing the curriculum

    Science.gov (United States)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-08-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to the mastery of finance and by exploring their potential for informing curriculum design and pedagogical practices to improve student outcomes. In this paper, we report the results of an online survey of finance academics at multiple institutions in Australia, Canada, New Zealand, South Africa and the United Kingdom. The outcomes of our research are recommendations for threshold concepts in finance endorsed by quantitative evidence, as well as a model of the finance curriculum incorporating finance, modelling and statistics threshold concepts. In addition, we draw conclusions about the application of threshold concept theory supported by both quantitative and qualitative evidence. Our methodology and findings have general relevance to the application of threshold concept theory as a means to investigate and inform curriculum design and delivery in higher education.

  19. Proposed diagnostic thresholds for gestational diabetes mellitus according to a 75-g oral glucose tolerance test

    DEFF Research Database (Denmark)

    Jensen, Dorte Møller; Damm, P; Sørensen, B

    2003-01-01

AIMS: To study if established diagnostic threshold values for gestational diabetes based on a 75-g, 2-h oral glucose tolerance test can be supported by maternal and perinatal outcomes. METHODS: Historical cohort study of 3260 pregnant women examined for gestational diabetes on the basis of risk ... /l than in women with 2-h glucose of 9.0-11.0 mmol/l. CONCLUSIONS: The risk for several maternal and perinatal complications increased with the diagnostic threshold for 2-h glucose. Large-scale blinded studies are needed to clarify the question of a clinically meaningful diagnosis of gestational diabetes mellitus. Until these results are available, a 2-h threshold level of 9.0 mmol/l after a 75-g oral glucose tolerance test seems acceptable.

  20. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation

    DEFF Research Database (Denmark)

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty

    2018-01-01

    OBJECTIVE: Motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes is recently adopted for animal studies. However, several technical limitations still remain. Test......, in common marmosets. Approach. ECS was applied using the implanted μECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through...... motor threshold with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Significance. Using implanted μECoG electrode arrays and a modified motor threshold-hunting algorithm, we...

  1. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    Science.gov (United States)

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
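The paper proposes its own definition of PCR-MPS background noise, which is not reproduced in this record. As an illustration only, a common generic pattern for noise-based analytical thresholds is the mean background read count plus k standard deviations:

```python
import statistics

def analytical_threshold(noise_counts, k=3.0):
    """Generic noise-based analytical threshold: mean background read count
    plus k standard deviations. (Illustrative pattern only; the paper above
    proposes its own definition of PCR-MPS background noise.)"""
    mu = statistics.mean(noise_counts)
    sd = statistics.stdev(noise_counts)
    return mu + k * sd

def call_alleles(read_counts, threshold):
    """Report only sequences whose read count exceeds the analytical threshold."""
    return {seq: n for seq, n in read_counts.items() if n > threshold}
```

With background counts around 3 reads, sequences seen only a handful of times fall below the threshold and are treated as noise rather than called alleles.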

  2. Idiopathic sudden sensorineural hearing loss: cardiovascular risk factors do not influence hearing threshold recovery.

    Science.gov (United States)

    Ciorba, A; Hatzopoulos, S; Bianchini, C; Iannini, V; Rosignoli, M; Skarzynski, H; Aimoni, C

    2015-04-01

Previous studies have suggested that risk factors for ischaemic vascular disease, such as cigarette smoking, hypertension and hyperlipidaemia, can also be considered risk factors for the development of idiopathic sudden sensorineural hearing loss (ISSNHL). In this study, we have evaluated the hypothesis that these factors can influence hearing threshold recovery in patients affected by ISSNHL. A total of 141 subjects who suffered an episode of ISSNHL were included. All subjects were assessed with tonal audiometry, auditory brainstem responses and MRI to exclude retrocochlear pathology. Hearing tests were conducted at ISSNHL onset (t = 0) and after 30 days. Patients were divided into three classes according to the presence/absence of one or more cardiovascular risk factors including: history of smoking, total serum cholesterol/triglycerides, history of hypertension and diabetes mellitus. Values of hearing threshold recovery were estimated and comparisons were conducted across the three risk factor classes. Of the patients affected by ISSNHL, 75% showed a threshold recovery. However, the threshold recovery was found to be class-independent (average recovery value of 18 dB HL per class) and also independent of age and gender. Even though cardiovascular risk factors have been found to be involved in the pathogenesis of ISSNHL, the present study suggests that these factors do not have any significant influence on threshold recovery in ISSNHL.

  3. The glial growth factors deficiency and synaptic destabilization hypothesis of schizophrenia

    Directory of Open Access Journals (Sweden)

    Zoega Tomas

    2002-07-01

Full Text Available Abstract Background A systems approach to understanding the etiology of schizophrenia requires a theory which is able to integrate genetic as well as neurodevelopmental factors. Presentation of the hypothesis Based on a co-localization of loci approach and a large amount of circumstantial evidence, we here propose that a functional deficiency of glial growth factors and of growth factors produced by glial cells are among the distal causes in the genotype-to-phenotype chain leading to the development of schizophrenia. These factors include neuregulin, insulin-like growth factor I, insulin, epidermal growth factor, neurotrophic growth factors, erbB receptors, phosphatidylinositol-3 kinase, growth arrest specific genes, neuritin, tumor necrosis factor alpha, glutamate, NMDA and cholinergic receptors. A genetically and epigenetically determined low baseline of glial growth factor signaling and synaptic strength is expected to increase the vulnerability for additional reductions (e.g., by viruses such as HHV-6 and JC virus infecting glial cells). This should lead to a weakening of the positive feedback loop between the presynaptic neuron and its targets, and below a certain threshold to synaptic destabilization and schizophrenia. Testing the hypothesis Supported by informed conjectures and empirical facts, the hypothesis makes an attractive case for a large number of further investigations. Implications of the hypothesis The hypothesis suggests glial cells as the locus of the genes-environment interactions in schizophrenia, with glial asthenia as an important factor for the genetic liability to the disorder, and an increase of prolactin and/or insulin as possible working mechanisms of traditional and atypical neuroleptic treatments.

  4. The Gumbel hypothesis test for left censored observations using regional earthquake records as an example

    Directory of Open Access Journals (Sweden)

    E. M. Thompson

    2011-01-01

Full Text Available Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte-Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
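The PPCC statistic and its Monte-Carlo critical values can be sketched for the uncensored case as follows (the paper's test additionally conditions the critical values on the censoring level, which this illustration omits):

```python
import numpy as np
from scipy import stats

def gumbel_ppcc(sample):
    """Probability plot correlation coefficient against the Gumbel distribution:
    correlation between ordered data and theoretical Gumbel quantiles."""
    x = np.sort(sample)
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1)   # Weibull plotting positions i/(n+1)
    q = stats.gumbel_r.ppf(p)           # theoretical Gumbel quantiles
    return np.corrcoef(x, q)[0, 1]

def ppcc_critical_value(n, alpha=0.05, n_sim=1000, rng=None):
    """Monte-Carlo critical value: reject the Gumbel hypothesis at level alpha
    if the observed PPCC falls below this quantile of simulated PPCC values."""
    rng = np.random.default_rng(rng)
    sims = [gumbel_ppcc(stats.gumbel_r.rvs(size=n, random_state=rng))
            for _ in range(n_sim)]
    return np.quantile(sims, alpha)
```

A sample actually drawn from a Gumbel distribution yields a PPCC close to 1, so values well below the simulated alpha-quantile signal a poor distributional fit.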

  5. Deciding with Thresholds: Importance Measures and Value of Information.

    Science.gov (United States)

    Borgonovo, Emanuele; Cillo, Alessandra

    2017-10-01

Risk-informed decision making is often accompanied by the specification of an acceptable level of risk. Such a target level is compared against the value of a risk metric, usually computed through a probabilistic safety assessment model, to decide about the acceptability of a given design, the launch of a space mission, etc. Importance measures complement the decision process with information about the risk/safety significance of events. However, importance measures do not tell us whether the occurrence of an event can change the overarching decision. By linking value of information and importance measures for probabilistic risk assessment models, this work obtains a value-of-information-based importance measure that brings together the risk metric, risk importance measures, and the risk threshold in one expression. The new importance measure does not impose additional computational burden because it can be calculated from our knowledge of the risk achievement and risk reduction worth, and complements the insights delivered by these importance measures. Several properties are discussed, including the joint decision worth of basic event groups. The application to the large loss of coolant accident sequence of the Advanced Test Reactor helps us in illustrating the risk analysis insights. © 2017 Society for Risk Analysis.
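The risk achievement worth and risk reduction worth mentioned above are conditional-risk ratios that can be sketched directly (the two-component system model in the test is an invented example, not the Advanced Test Reactor model):

```python
def risk_importance(risk_fn, p, i):
    """Risk Achievement Worth (RAW) and Risk Reduction Worth (RRW) of basic
    event i.  `risk_fn(p)` maps basic-event probabilities to the risk metric."""
    base = risk_fn(p)
    p_up = list(p); p_up[i] = 1.0   # event i guaranteed to occur
    p_dn = list(p); p_dn[i] = 0.0   # event i guaranteed not to occur
    raw = risk_fn(p_up) / base      # how much risk rises if the event occurs
    rrw = base / risk_fn(p_dn)      # how much risk falls if the event is removed
    return raw, rrw
```

These are exactly the two quantities the paper combines with the risk threshold to determine whether an event's occurrence could flip the acceptability decision.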

  6. Regression Discontinuity Designs Based on Population Thresholds

    DEFF Research Database (Denmark)

    Eggers, Andrew C.; Freier, Ronny; Grembi, Veronica

In many countries, important features of municipal government (such as the electoral system, mayors' salaries, and the number of councillors) depend on whether the municipality is above or below arbitrary population thresholds. Several papers have used a regression discontinuity design (RDD) to measure the effects of these threshold-based policies on political and economic outcomes. Using evidence from France, Germany, and Italy, we highlight two common pitfalls that arise in exploiting population-based policies (confounded treatment and sorting) and we provide guidance for detecting and addressing these pitfalls. Even when these problems are present, population-threshold RDD may be the best available research design for studying the effects of certain policies and political institutions.

  7. Effects of pulse duration on magnetostimulation thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Saritas, Emine U., E-mail: saritas@ee.bilkent.edu.tr [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of Electrical and Electronics Engineering, Bilkent University, Bilkent, Ankara 06800 (Turkey); National Magnetic Resonance Research Center (UMRAM), Bilkent University, Bilkent, Ankara 06800 (Turkey); Goodwill, Patrick W. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Conolly, Steven M. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of EECS, University of California, Berkeley, California 94720-1762 (United States)

    2015-06-15

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations
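The sigmoid-fitting step described above, estimating a threshold from binary stimulation reports, can be sketched with a logistic psychometric function (parameter names and fitting choices are illustrative, not the authors' exact procedure):

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, x0, k):
    """Logistic psychometric function: probability of reporting stimulation
    at amplitude x; x0 is the 50% point, k the slope."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

def estimate_threshold(amplitudes, responses):
    """Fit the sigmoid to stimulation reports (0/1 or response rates);
    the fitted x0 is taken as the magnetostimulation threshold."""
    p0 = [np.median(amplitudes), 1.0]  # crude initial guess
    (x0, k), _ = curve_fit(sigmoid, amplitudes, responses, p0=p0, maxfev=10000)
    return x0
```

Repeating this fit at each frequency and pulse duration yields the threshold curves the study reports.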

  8. THRESHOLD PARAMETER OF THE EXPECTED LOSSES

    Directory of Open Access Journals (Sweden)

    Josip Arnerić

    2012-12-01

Full Text Available The objective of extreme value analysis is to quantify the probabilistic behavior of unusually large losses using only extreme values above some high threshold rather than all of the data, which gives a better fit to the tail distribution than traditional methods that assume normality. In our case we estimate market risk using daily returns of the CROBEX index at the Zagreb Stock Exchange. It is therefore necessary to model the excess distribution above some threshold; the Generalized Pareto Distribution (GPD) is used as it is much more reliable than the normal distribution, since it places the emphasis on the extreme values. The parameters of the GPD will be estimated using the maximum likelihood method (MLE). The contribution of this paper is to specify a threshold which is high enough that the GPD approximation is valid but low enough that a sufficient number of observations is available for a precise fit.
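The peaks-over-threshold fit described above can be sketched with scipy's GPD maximum-likelihood estimator (the 95th-percentile threshold choice is illustrative; the paper's point is precisely that this choice requires care):

```python
import numpy as np
from scipy import stats

def fit_gpd_tail(returns, threshold_quantile=0.95):
    """Peaks-over-threshold: fit a Generalized Pareto Distribution to losses
    exceeding a high threshold, estimated by MLE."""
    losses = -np.asarray(returns)                  # work with losses (negated returns)
    u = np.quantile(losses, threshold_quantile)    # threshold, e.g. 95th percentile
    excesses = losses[losses > u] - u              # exceedances over the threshold
    # MLE fit of the GPD; location fixed at 0 because excesses start at the threshold
    shape, loc, scale = stats.genpareto.fit(excesses, floc=0)
    return u, shape, scale
```

A positive fitted shape parameter indicates a heavy tail, which is what makes the GPD preferable to a normal model for quantifying extreme market losses.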

  9. Threshold Theory Tested in an Organizational Setting

    DEFF Research Database (Denmark)

    Christensen, Bo T.; Hartmann, Peter V. W.; Hedegaard Rasmussen, Thomas

    2017-01-01

A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no correlation. Support for the threshold theory of creativity was found, in that the correlation between IQ and innovativeness was positive and significant below a cutoff point of IQ 120. Above the cutoff, no significant relation was identified, and the two correlations differed significantly. The finding was stable across distinct parts of the sample, providing support for the theory, although the correlations in all subsamples were small. The findings lend support to the existence of threshold effects using perceptual measures of behavior in real...
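The segmented-correlation analysis described above can be sketched by splitting the sample at the hypothesized cutoff and correlating within each segment (the simulated data in the test are illustrative, not the study's leader sample):

```python
import numpy as np

def threshold_correlations(iq, creativity, cutoff=120):
    """Correlate IQ with a creativity measure separately below and above a
    cutoff, as in threshold-theory tests."""
    below, above = iq < cutoff, iq >= cutoff
    r_below = np.corrcoef(iq[below], creativity[below])[0, 1]
    r_above = np.corrcoef(iq[above], creativity[above])[0, 1]
    return r_below, r_above
```

Under the threshold theory one expects a positive r_below and an r_above near zero; a formal test would also compare the two coefficients (e.g. via a Fisher z-test), which this sketch omits.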

  10. Limitations of acceptability curves for presenting uncertainty in cost-effectiveness analysis

    NARCIS (Netherlands)

    Groot Koerkamp, Bas; Hunink, M. G. Myriam; Stijnen, Theo; Hammitt, James K.; Kuntz, Karen M.; Weinstein, Milton C.

    2007-01-01

    Clinical journals increasingly illustrate uncertainty about the cost and effect of health care interventions using cost-effectiveness acceptability curves (CEACs). CEACs present the probability that each competing alternative is optimal for a range of values of the cost-effectiveness threshold. The
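A CEAC as described above is computed from simulated cost-effect pairs: for each value of the cost-effectiveness threshold, it records how often each alternative has the highest net monetary benefit. A minimal sketch:

```python
import numpy as np

def ceac(costs, effects, thresholds):
    """Cost-effectiveness acceptability curves.  costs/effects are
    (n_simulations, n_alternatives) arrays from a probabilistic analysis;
    returns, for each threshold, the probability each alternative is optimal."""
    curves = []
    for lam in thresholds:
        nmb = lam * effects - costs                      # net monetary benefit
        best = nmb.argmax(axis=1)                        # optimal alternative per simulation
        curves.append(np.bincount(best, minlength=costs.shape[1]) / len(best))
    return np.array(curves)                              # (n_thresholds, n_alternatives)
```

Plotting each column of the result against the threshold gives the familiar CEAC; the paper's argument concerns what such curves do and do not convey about decision uncertainty.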

  11. The Zinc Dyshomeostasis Hypothesis of Alzheimer's Disease

    Science.gov (United States)

    Craddock, Travis J. A.; Tuszynski, Jack A.; Chopra, Deepak; Casey, Noel; Goldstein, Lee E.; Hameroff, Stuart R.; Tanzi, Rudolph E.

    2012-01-01

Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation; however, cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels which are either too low or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of polymerized

  12. The zinc dyshomeostasis hypothesis of Alzheimer's disease.

    Directory of Open Access Journals (Sweden)

    Travis J A Craddock

Full Text Available Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation, however cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels, which are either too low, or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of

  13. Thresholds in Xeric Hydrology and Biogeochemistry

    Science.gov (United States)

    Meixner, T.; Brooks, P. D.; Simpson, S. C.; Soto, C. D.; Yuan, F.; Turner, D.; Richter, H.

    2011-12-01

    Due to water limitation, thresholds in hydrologic and biogeochemical processes are common in arid and semi-arid systems. Some of these thresholds such as those focused on rainfall runoff relationships have been well studied. However to gain a full picture of the role that thresholds play in driving the hydrology and biogeochemistry of xeric systems a full view of the entire array of processes at work is needed. Here a walk through the landscape of xeric systems will be conducted illustrating the powerful role of hydrologic thresholds on xeric system biogeochemistry. To understand xeric hydro-biogeochemistry two key ideas need to be focused on. First, it is important to start from a framework of reaction and transport. Second an understanding of the temporal and spatial components of thresholds that have a large impact on hydrologic and biogeochemical fluxes needs to be offered. In the uplands themselves episodic rewetting and drying of soils permits accelerated biogeochemical processing but also more gradual drainage of water through the subsurface than expected in simple conceptions of biogeochemical processes. Hydrologic thresholds (water content above hygroscopic) results in a stop start nutrient spiral of material across the landscape since runoff connecting uplands to xeric perennial riparian is episodic and often only transports materials a short distance (100's of m). This episodic movement results in important and counter-intuitive nutrient inputs to riparian zones but also significant processing and uptake of nutrients. The floods that transport these biogeochemicals also result in significant input to riparian groundwater and may be key to sustaining these critical ecosystems. Importantly the flood driven recharge process itself is a threshold process dependent on flood characteristics (floods greater than 100 cubic meters per second) and antecedent conditions (losing to near neutral gradients). Floods also appear to influence where arid and semi

  14. Color image Segmentation using automatic thresholding techniques

    International Nuclear Information System (INIS)

    Harrabi, R.; Ben Braiek, E.

    2011-01-01

In this paper, entropy and between-class variance based thresholding methods for color image segmentation are studied. The maximization of the between-class variance (MVI) and of the entropy (ME) have been used as criterion functions to determine an optimal threshold to segment images into nearly homogeneous regions. Segmentation results from the two methods are validated, the segmentation sensitivity for the available test data is evaluated, and a comparative study between these methods in different color spaces is presented. The experimental results demonstrate the superiority of the MVI method for color image segmentation.

  15. The comprehension of sentences with unaccusative verbs in aphasia: a test of the intervener hypothesis.

    Science.gov (United States)

    Sullivan, Natalie; Walenski, Matthew; Love, Tracy; Shapiro, Lewis P

    2017-01-01

    It is well accepted that individuals with agrammatic Broca's aphasia have difficulty comprehending some sentences with filler-gap dependencies. While investigations of these difficulties have been conducted with several different sentence types (e.g., object relatives, Wh -questions), we explore sentences containing unaccusative verbs, which arguably have a single noun phrase (NP) that is base-generated in object position but then is displaced to surface subject position. Unaccusative verbs provide an ideal test case for a particular hypothesis about the comprehension disorder-the Intervener Hypothesis-that posits that the difficulty individuals with agrammatic Broca's aphasia have comprehending sentences containing filler-gap dependencies results from similarity-based interference caused by the presence of an intervening NP between the two elements of a syntactic chain. To assess a particular account of the comprehension deficit in agrammatic Broca's aphasia-the Intervener Hypothesis. We used a sentence-picture matching task to determine if listeners with agrammatic Broca's aphasia (LWBA) and age-matched neurologically unimpaired controls (AMC) have difficulty comprehending unaccusative verbs when placed in subject relative and complement phrase (CP) constructions. We found above-chance comprehension of both sentence constructions with the AMC participants. In contrast, we found above-chance comprehension of CP sentences containing unaccusative verbs but poor comprehension of subject relative sentences containing unaccusative verbs for the LWBA. These results provide support for the Intervener Hypothesis, wherein the presence of an intervening NP between two elements of a filler-gap dependency adversely affects sentence comprehension.

  16. Role of stable isotopes in life--testing isotopic resonance hypothesis.

    Science.gov (United States)

    Zubarev, Roman A

    2011-04-01

    Stable isotopes of the most important biological elements, such as C, H, N and O, affect living organisms. In rapidly growing species, deuterium and, to a lesser extent, other heavy isotopes reduce the growth rate. At least for deuterium it is known that its depletion also negatively impacts the speed of biological processes. As a rule, living organisms "resist" changes in their isotopic environment, preferring natural isotopic abundances. This preference could be due to evolutionary optimization; an additional effect could be due to the presence of the "isotopic resonance". The isotopic resonance phenomenon has been linked to the choice of the earliest amino acids, and thus affected the evolution of the genetic code. To test the isotopic resonance hypothesis, literature data were analyzed against quantitative and qualitative predictions of the hypothesis. Four studies provided five independent datasets, each in very good quantitative agreement with the predictions. Thus, the isotopic resonance hypothesis is no longer simply plausible; it can now be deemed likely. Additional testing is needed, however, before full acceptance of this hypothesis. Copyright © 2011 Beijing Genomics Institute. Published by Elsevier Ltd. All rights reserved.

  17. A shift from significance test to hypothesis test through power analysis in medical research

    Directory of Open Access Journals (Sweden)

    Singh Girish

    2006-01-01

    Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of a type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same underlying theory, the two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
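The power calculations described above can be sketched in a few lines. A minimal illustration using a two-sided, two-sample normal approximation (the function name and the example numbers are illustrative, not from the article):

```python
import math
from statistics import NormalDist

def power_two_sample(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided two-sample test for effect size d.

    Power is the probability of rejecting H0 when the true standardized
    mean difference is d, so 1 - power is the type II error rate that the
    Neyman-Pearson framework asks us to control alongside alpha.
    """
    nd = NormalDist()
    z_crit = nd.inv_cdf(1.0 - alpha / 2.0)          # two-sided critical value
    noncentrality = d * math.sqrt(n_per_group / 2.0)
    return nd.cdf(noncentrality - z_crit)

# Classic benchmark: d = 0.5 with 64 subjects per group gives roughly 80% power.
print(round(power_two_sample(0.5, 64), 2))
```

Reporting this number next to the P value is exactly the shift the article describes: the significance test is interpreted in the light of the test's power.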

  18. Evaluation of a Teen Dating Violence Social Marketing Campaign: Lessons Learned when the Null Hypothesis Was Accepted

    Science.gov (United States)

    Rothman, Emily F.; Decker, Michele R.; Silverman, Jay G.

    2006-01-01

    This chapter discusses a three-month statewide mass media campaign to prevent teen dating violence, "See It and Stop It." The Massachusetts campaign reached out, using television, radio, and print advertising, and also encouraged anti-violence activism in select high schools. The objective was to drive thirteen- to seventeen-year-olds to…

  19. Toward an acceptable nuclear future

    International Nuclear Information System (INIS)

    Weinberg, A.M.

    1977-11-01

    The nuclear option is in danger of being foreclosed. The trend toward antinuclearism may be reversed if concerns about low-level radiation insult can be shown ultimately to be without foundation; evidence for this speculation is presented. Nevertheless it is suggested that the nuclear enterprise itself must propose new initiatives to increase the acceptability of nuclear energy. A key element of an acceptable nuclear future is cluster siting of reactors. This siting plan might be achieved by confining new reactors essentially to existing sites

  20. The nitric oxide hypothesis of aging.

    Science.gov (United States)

    McCann, S M; Licinio, J; Wong, M L; Yu, W H; Karanth, S; Rettorri, V

    1998-01-01

    Nitric oxide (NO), generated by endothelial (e) NO synthase (NOS) and neuronal (n) NOS, plays a ubiquitous role in the body in controlling the function of almost every, if not every, organ system. Bacterial and viral products, such as bacterial lipopolysaccharide (LPS), induce inducible (i) NOS synthesis that produces massive amounts of NO toxic to the invading viruses and bacteria, but also to host cells, by inactivating enzymes and leading to cell death. The actions of all forms of NOS are mediated not only by the free radical oxidant properties of this soluble gas, but also by its activation of guanylate cyclase (GC), leading to the production of cyclic guanosine monophosphate (cGMP), which mediates many of its physiological actions. In addition, NO activates cyclooxygenase and lipoxygenase, leading to the production of physiologically relevant quantities of prostaglandin E2 (PGE2) and leukotrienes. In the case of iNOS, the massive release of NO, PGE2, and leukotrienes produces toxic effects. Systemic injection of LPS causes induction of interleukin (IL)-1 beta mRNA followed by IL-1 beta synthesis that induces iNOS mRNA, with latencies of two and four hours, respectively, in the anterior pituitary and pineal glands, meninges, and choroid plexus, regions outside the blood-brain barrier, and shortly thereafter, in hypothalamic regions, such as the temperature-regulating centers, the paraventricular nucleus containing releasing- and inhibiting-hormone neurons, and the arcuate nucleus, a region containing these neurons and axons bound for the median eminence. We are currently determining if LPS similarly activates cytokine and iNOS production in the cardiovascular system and the gonads. Our hypothesis is that recurrent infections over the life span play a significant role in producing aging changes in all systems outside the blood-brain barrier via release of toxic quantities of NO. NO may be a major factor in the development of coronary heart disease (CHD). Considerable evidence

  1. Diesel Engine Actuator Fault Isolation using Multiple Models Hypothesis Tests

    DEFF Research Database (Denmark)

    Bøgh, S.A.

    1994-01-01

    Detection of current faults in a D.C. motor with unknown load torques is not feasible with linear methods and threshold logic.

  2. Pressure pain thresholds and musculoskeletal morbidity in automobile manufacturing workers.

    Science.gov (United States)

    Gold, Judith E; Punnett, Laura; Katz, Jeffrey N

    2006-02-01

    Reduced pressure pain thresholds (PPTs) have been reported in occupational groups with symptoms of upper extremity musculoskeletal disorders (UEMSDs). The purpose of this study was to determine whether automobile manufacturing workers (n=460) with signs and symptoms of UEMSDs had reduced PPTs (greater sensitivity to pain through pressure applied to the skin) when compared with unaffected members of the cohort, which served as the reference group. The association of PPTs with symptom severity and localization of physical examination (PE) findings was investigated, as was the hypothesis that reduced thresholds would be found on the affected side in those with unilateral PE findings. PPTs were measured during the workday at 12 upper extremity sites. A PE for signs of UEMSDs and a symptom questionnaire were administered. After comparison of potential covariates using t tests, linear regression multivariable models were constructed with the average of the 12 sites (avgPPT) as the outcome. Subjects with PE findings and/or symptoms had a statistically significantly lower avgPPT than non-cases. AvgPPT was reduced in those with more widespread PE findings and in those with greater symptom severity (test for trend, P

  3. Mesoscale spatial variability in seawater cavitation thresholds

    Science.gov (United States)

    Mel'nikov, N. P.; Elistratov, V. P.

    2017-03-01

    The paper presents the spatial variability of cavitation thresholds and some hydrological and hydrochemical parameters of seawater in the interfrontal zone of the Pacific Subarctic Front, in the Drake Passage, and in the equatorial part of the Pacific Ocean, measured in the near-surface layer to a depth of 70 m.

  4. Intraoperative transfusion threshold and tissue oxygenation

    DEFF Research Database (Denmark)

    Nielsen, K; Dahl, B; Johansson, P I

    2012-01-01

    Transfusion with allogeneic red blood cells (RBCs) may be needed to maintain oxygen delivery during major surgery, but the appropriate haemoglobin (Hb) concentration threshold has not been well established. We hypothesised that a higher level of Hb would be associated with improved subcutaneous...

  5. Threshold Concepts in Finance: Conceptualizing the Curriculum

    Science.gov (United States)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-01-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to…

  6. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. The GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  7. Design of Threshold Controller Based Chaotic Circuits

    DEFF Research Database (Denmark)

    Mohamed, I. Raja; Murali, K.; Sinha, Sudeshna

    2010-01-01

    We propose a very simple implementation of a second-order nonautonomous chaotic oscillator, using a threshold controller as the only source of nonlinearity. We demonstrate the efficacy and simplicity of our design through numerical and experimental results. Further, we show that this approach...

  8. Grid - a fast threshold tracking procedure

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Dau, Torsten; MacDonald, Ewen

    2016-01-01

    A new procedure, called "grid", is evaluated that allows rapid acquisition of threshold curves in psychophysical and, in particular, psychoacoustic experiments. In this method, the parameter-response space is sampled in two dimensions within a single run. This allows the procedure to focus more e...

  9. Atherogenic Risk Factors and Hearing Thresholds

    DEFF Research Database (Denmark)

    Frederiksen, Thomas Winther; Ramlau-Hansen, Cecilia Høst; Stokholm, Zara Ann

    2014-01-01

    The objective of this study was to evaluate the influence of atherogenic risk factors on hearing thresholds. In a cross-sectional study we analyzed data from a Danish survey in 2009-2010 on physical and psychological working conditions. The study included 576 white- and blue-collar workers from c...

  10. 40 CFR 68.115 - Threshold determination.

    Science.gov (United States)

    2010-07-01

    ... in accordance with the definition of flammability hazard rating 4 in the NFPA 704, Standard System... more than a threshold quantity is present at a stationary source. (iii) Naturally occurring hydrocarbon..., regulated substances in naturally occurring hydrocarbon mixtures need not be considered when determining...

  11. Identification of Threshold Concepts for Biochemistry

    Science.gov (United States)

    Loertscher, Jennifer; Green, David; Lewis, Jennifer E.; Lin, Sara; Minderhout, Vicky

    2014-01-01

    Threshold concepts (TCs) are concepts that, when mastered, represent a transformed understanding of a discipline without which the learner cannot progress. We have undertaken a process involving more than 75 faculty members and 50 undergraduate students to identify a working list of TCs for biochemistry. The process of identifying TCs for…

  12. Determining lower threshold concentrations for synergistic effects

    DEFF Research Database (Denmark)

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas

    2017-01-01

    which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test-setups to evaluate which approach gives the most conservative estimate for the lower threshold for synergy for three known azole synergists. We focus.......619±8.555μgL(-1)) and 0.122±0.0417μM (40.236±13.75μgL(-1)), respectively, in the 14-days tests. Testing synergy in relation to concentration addition provided the most conservative values. The threshold values for the vertical assessments in tests where the two could be compared were in general 1.2 to 4.......7 fold higher than the horizontal assessments. Using passive dosing rather than dilution series or spiking did not lower the threshold significantly. Below the threshold for synergy, slight antagony could often be observed. This is most likely due to induction of enzymes active in metabolization of alpha...

  13. Microplastic effect thresholds for freshwater benthic macroinvertebrates

    NARCIS (Netherlands)

    Redondo Hasselerharm, P.E.; Dede Falahudin, Dede; Peeters, E.T.H.M.; Koelmans, A.A.

    2018-01-01

    Now that microplastics have been detected in lakes, rivers and estuaries all over the globe, evaluating their effects on biota has become an urgent research priority. This is the first study that aims at determining the effect thresholds for a battery of six freshwater benthic macroinvertebrates

  14. Distribution of sensory taste thresholds for phenylthiocarbamide ...

    African Journals Online (AJOL)

    The ability to taste Phenylthiocarbamide (PTC), a bitter organic compound has been described as a bimodal autosomal trait in both genetic and anthropological studies. This study is based on the ability of a person to taste PTC. The present study reports the threshold distribution of PTC taste sensitivity among some Muslim ...

  15. A low-threshold erbium glass minilaser

    Science.gov (United States)

    Gapontsev, V. P.; Gromov, A. K.; Izyneev, A. A.; Sadovskii, P. I.; Stavrov, A. A.

    1989-04-01

    Minilasers emitting in the 1.54-micron region with an emission threshold less than 5 J and efficiency up to 1.7 percent have been constructed using a Cr-Y-Er laser glass, LGS-Kh. Under repetitively-pulsed operation, an average lasing power of 0.7 W and a pulse repetition rate of 7 Hz have been achieved.

  16. Thresholding methods for PET imaging: A review

    International Nuclear Information System (INIS)

    Dewalle-Vignion, A.S.; Betrouni, N.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; El Abiad, A.

    2010-01-01

    This work deals with positron emission tomography segmentation methods for tumor volume determination. We present a state of the art of techniques based on fixed or adaptive thresholds. The methods found in the literature are analysed objectively with respect to their methodology, advantages and limitations. Finally, a comparative study is presented. (authors)

  17. Low-threshold conical microcavity dye lasers

    DEFF Research Database (Denmark)

    Grossmann, Tobias; Schleede, Simone; Hauser, Mario

    2010-01-01

    Finite-element simulations confirm that lasing occurs in whispering gallery modes, which corresponds well with the measured multimode laser emission. The effect of dye concentration on lasing threshold and lasing wavelength is investigated and can be explained using a standard dye laser model....

  18. Classification error of the thresholded independence rule

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Fenger-Grøn, Morten; Jensen, Jens Ledet

    We consider classification in the situation of two groups with normally distributed data in the ‘large p small n’ framework. To counterbalance the high number of variables we consider the thresholded independence rule. An upper bound on the classification error is established which is tailored
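A minimal sketch of the thresholded independence rule itself may help: per-feature group means and pooled variances are estimated, features whose standardized mean difference falls below a threshold are discarded, and the surviving features vote through a diagonal (independence) discriminant. The data layout and threshold value below are illustrative assumptions, not the authors' exact estimator:

```python
import math

def fit_thresholded_independence(X1, X2, t=1.0):
    """Fit a thresholded independence rule on two groups of samples.

    X1, X2 -- lists of equal-length feature vectors for groups 1 and 2.
    t      -- threshold on the standardized mean difference; features below
              it are dropped, counterbalancing the 'large p, small n' setting.
    Returns group means, pooled per-feature variances, and kept indices.
    """
    p, n1, n2 = len(X1[0]), len(X1), len(X2)
    m1 = [sum(x[j] for x in X1) / n1 for j in range(p)]
    m2 = [sum(x[j] for x in X2) / n2 for j in range(p)]
    var = []
    for j in range(p):
        ss = (sum((x[j] - m1[j]) ** 2 for x in X1)
              + sum((x[j] - m2[j]) ** 2 for x in X2))
        var.append(ss / (n1 + n2 - 2))     # pooled variance; covariances ignored
    keep = [j for j in range(p) if abs(m1[j] - m2[j]) / math.sqrt(var[j]) > t]
    return m1, m2, var, keep

def classify(x, m1, m2, var, keep):
    """Assign x to group 1 or 2 using only the retained features."""
    score = sum((x[j] - (m1[j] + m2[j]) / 2.0) * (m1[j] - m2[j]) / var[j]
                for j in keep)
    return 1 if score > 0 else 2
```

With only informative features retained, the rule behaves like a diagonal-covariance discriminant, the setting in which upper bounds on the classification error of this kind are typically derived.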

  19. The acoustic reflex threshold in aging ears.

    Science.gov (United States)

    Silverman, C A; Silman, S; Miller, M H

    1983-01-01

    This study investigates the controversy regarding the influence of age on the acoustic reflex threshold for broadband noise (BBN), 500-, 1000-, 2000-, and 4000-Hz activators between Jerger et al. [Mono. Contemp. Audiol. 1 (1978)] and Jerger [J. Acoust. Soc. Am. 66 (1979)] on the one hand and Silman [J. Acoust. Soc. Am. 66 (1979)] and others on the other. The acoustic reflex thresholds for BBN, 500-, 1000-, 2000-, and 4000-Hz activators were evaluated under two measurement conditions. Seventy-two normal-hearing ears were drawn from 72 subjects ranging in age from 20-69 years. The results revealed that age was correlated with the acoustic reflex threshold for the BBN activator but not for any of the tonal activators; the correlation was stronger under the 1-dB than under the 5-dB measurement condition. Also, the mean acoustic reflex thresholds for the BBN activator were essentially similar to those reported by Jerger et al. (1978) but differed from those obtained in this study under the 1-dB measurement condition.

  20. Euthanasia Acceptance: An Attitudinal Inquiry.

    Science.gov (United States)

    Klopfer, Fredrick J.; Price, William F.

    The study presented was conducted to examine potential relationships between attitudes regarding the dying process, including acceptance of euthanasia, and other attitudinal or demographic attributes. The data of the survey was comprised of responses given by 331 respondents to a door-to-door interview. Results are discussed in terms of preferred…

  1. Nitrogen trailer acceptance test report

    International Nuclear Information System (INIS)

    Kostelnik, A.J.

    1996-01-01

    This Acceptance Test Report documents compliance with the requirements of specification WHC-S-0249. The equipment was tested according to WHC-SD-WM-ATP-108 Rev.0. The equipment being tested is a portable contained nitrogen supply. The test was conducted at Norco's facility

  2. AAL- technology acceptance through experience

    NARCIS (Netherlands)

    Huldtgren, A.; Ascencio San Pedro, G.; Pohlmeyer, A.E.; Romero Herrera, N.A.

    2014-01-01

    Despite substantial research and development of Ambient Assisted Living (AAL) technologies, their acceptance remains low. This is partially caused by a lack of attention to users' needs and values and to the social contexts in which these systems are to be embedded. Participatory design has some potential

  3. Safety culture and public acceptance

    International Nuclear Information System (INIS)

    Mikhalevich, Alexander A.

    2002-01-01

    After the Chernobyl NPP accident, public acceptance has become a key factor in nuclear power development all over the world. Therefore, nuclear safety culture should be based not only on technical principles, responsibilities, supervision, regulatory provisions and emergency preparedness, but also on public awareness of the minimal risk during the operation and decommissioning of NPPs, radioactive waste management, etc. (author)

  4. Worldwide nuclear revival and acceptance

    International Nuclear Information System (INIS)

    Geraets, Luc H.; Crommelynck, Yves A.

    2010-01-01

    The paper outlines the current status and trends of the nuclear revival in Europe and abroad, the evolution of public opinion in the last decade, and the interaction between the two. It emphasises the absolute priority of professional communication and exchange in gaining public acceptance. (orig.)

  5. Energy justice: Participation promotes acceptance

    Science.gov (United States)

    Baxter, Jamie

    2017-08-01

    Wind turbines have been a go-to technology for addressing climate change, but they are increasingly a source of frustration for all stakeholders. While community ownership is often lauded as a panacea for maximizing turbine acceptance, a new study suggests that decision-making involvement — procedural fairness — matters most.

  6. W-025, acceptance test report

    International Nuclear Information System (INIS)

    Roscha, V.

    1994-01-01

    This acceptance test report (ATR) has been prepared to establish the results of the field testing conducted on W-025 to demonstrate that the electrical/instrumentation systems functioned as intended by design. This is part of the RMW Land Disposal Facility

  7. Distance Discrimination Thresholds During Flight Simulation in a Maritime Environment

    Science.gov (United States)

    2011-11-01

    Jessica Parker, Air Operations... Executive Summary: The Aeronautical Design Standard... position to be perceived. This minimum distance was defined as the distance discrimination threshold. For both high and low sea states, the thresholds

  8. The Relationship between Rate of Algometer Application and Pain Pressure Threshold in the Assessment of Myofascial Trigger Point Sensitivity.

    Science.gov (United States)

    Linde, Lukas D; Kumbhare, Dinesh A; Joshi, Maneil; Srbely, John Z

    2018-02-01

    Pressure algometry is a commonly employed technique in the assessment of both regional and widespread musculoskeletal pain. Despite its acceptance amongst clinicians and scientists, the relationship between the rate of pressure application (RoA) and the pain pressure threshold (PPT) remains poorly understood. We set out to test the hypothesis that a strong, positive, linear relationship exists between the RoA and the PPT within the infraspinatus of young healthy subjects. Thirty-three participants were randomly recruited from the local university community. PPT measures were recorded from a clinically identified myofascial trigger point within the right infraspinatus muscle during pressure algometry. Two PPT measures were recorded using each of three different RoAs: low (15 N/s), medium (35 N/s), and high (55 N/s). Three baseline trials were also conducted at 30 N/s. Pearson's correlation coefficient between RoA and PPT was calculated for each subject and averaged across participants. The mean (SD) correlation between subjects was 0.77 (0.19), and the mean (SD) slope of the linear regression was 0.13 (0.09). Our results demonstrate that there is a strong, linear relationship between the RoA and the PPT when using the pressure algometry technique. The low slope between RoA and PPT suggests clinicians can rely on PPT assessments despite small RoA fluctuations. Future research should explore this relationship further in a clinical population and in other muscles affected by chronic myofascial pain. Advancing cost-effective, reliable, and clinically feasible tools such as algometry is important to enhancing the diagnosis and management of chronic myofascial pain. © 2017 World Institute of Pain.
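The two per-subject statistics reported (a Pearson correlation between RoA and PPT, and the slope of a linear regression) can be computed as below; the readings are invented for illustration and are not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def ols_slope(x, y):
    """Slope of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Hypothetical subject: repeated PPT readings (N) at each rate of application (N/s).
roa = [15, 15, 30, 30, 30, 35, 35, 55, 55]
ppt = [31.0, 32.5, 33.0, 34.1, 33.5, 34.0, 35.2, 37.0, 36.4]
print(round(pearson_r(roa, ppt), 2), round(ols_slope(roa, ppt), 2))
```

In the study these two quantities were computed per subject and then averaged; a high mean correlation with a shallow slope is what licenses the clinical conclusion about small RoA fluctuations.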

  9. Cost–effectiveness thresholds: pros and cons

    Science.gov (United States)

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285
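The GDP-multiple decision rule discussed above reduces to comparing an incremental cost-effectiveness ratio against 1x and 3x per-capita GDP. A sketch of that arithmetic (all figures hypothetical; real appraisal, as the authors argue, should weigh far more than this single comparison):

```python
def icer(extra_cost: float, extra_dalys_averted: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra DALY averted."""
    return extra_cost / extra_dalys_averted

def gdp_multiple_rule(icer_value: float, gdp_per_capita: float) -> str:
    """Label an intervention per the Commission on Macroeconomics in Health
    convention: below 1x GDP per capita per DALY averted is 'highly
    cost-effective', below 3x is 'cost-effective', otherwise neither."""
    if icer_value < gdp_per_capita:
        return "highly cost-effective"
    if icer_value < 3.0 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"

# A programme costing an extra $50,000 that averts 100 DALYs, appraised
# in a country with a per-capita GDP of $1,000:
print(gdp_multiple_rule(icer(50_000, 100), 1_000))
```

The mechanical simplicity of this rule is precisely the article's point: it carries no budget-impact, feasibility, or fairness information, so it should inform rather than replace a country-specific decision process.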

  10. Do multiple body modifications alter pain threshold?

    Science.gov (United States)

    Yamamotová, A; Hrabák, P; Hříbek, P; Rokyta, R

    2017-12-30

    In recent years, epidemiological data have shown an increasing number of young people who deliberately self-injure. There have also been parallel increases in the number of people with tattoos and those who voluntarily undergo painful procedures associated with piercing, scarification, and tattooing. People with self-injury behaviors often say that they do not feel the pain. However, there is no information regarding pain perception in those who visit tattoo parlors and piercing studios compared with those who do not. The aim of this study was to compare nociceptive sensitivity in four groups of subjects (n=105, mean age 26 years, 48 women and 57 men) with different motivations to experience pain (i.e., with and without multiple body modifications) in two different situations: (1) in controlled, emotionally neutral conditions, and (2) at a "Hell Party" (HP), an event organized by a piercing and tattoo parlor, whose main event featured a public demonstration of painful techniques (burn scars, hanging on hooks, etc.). Pain thresholds of the fingers of the hand were measured using a thermal stimulator and a mechanical algometer. For HP participants, information about alcohol intake, self-harming behavior, and psychiatric history was used in the analysis as intervening variables. Individuals with body modifications, as well as those without, had higher thermal pain thresholds at the Hell Party than under the control neutral conditions. No such differences were found for mechanical pain thresholds. The increased pain threshold in all HP participants, irrespective of body modification, cannot be explained simply by a decrease in the sensory component of pain; instead, we found that the environment significantly influenced the cognitive and affective components of pain.

  11. Visual threshold estimation and its relation to the question: Fechner-law or Stevens-power function.

    Science.gov (United States)

    Thoss, F

    1986-01-01

    This investigation was initiated by the apparent rivalry between the Fechner law and the Stevens power function. We used the pupil reaction as an objective measure of the excitation of the visual system within both the threshold and the suprathreshold region. The course of the threshold as a function of adaptation light shows important differences for different field areas. The Weber law is valid only in the case of large fields (30 degrees), such as had been used by Weber himself. At smaller fields the threshold may be described as a power function, with exponents of 0.5 at the smallest (0.5 degrees) and about 0.75 at medium-sized (5 degrees) fields. In all these experiments the Fechner hypothesis holds automatically because of the constant increment of pupil constriction at the threshold. The validity of the Fechner hypothesis, together with the approximation of the threshold course by power functions, leads to Stevens power functions for the suprathreshold region. The Fechner law, a logarithmic function, is valid for large fields only. This consideration has been confirmed by our measurements in the suprathreshold sphere.
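The argument in the last three sentences can be made explicit. If each just-noticeable difference adds a constant sensation increment (Fechner's postulate), the form of the suprathreshold law follows from the measured threshold course; this is a standard derivation sketched here, not a quotation from the paper:

```latex
% Fechner's postulate: each threshold step adds a constant sensation increment
dS = k \, \frac{dI}{\Delta I(I)}
% Large fields (Weber's law): \Delta I \propto I
\quad\Rightarrow\quad S = k \ln\!\left(\frac{I}{I_0}\right)
\qquad \text{(Fechner's logarithmic law)}
% Small/medium fields: \Delta I \propto I^{\,b}, \quad b \approx 0.5\text{--}0.75
\quad\Rightarrow\quad S \propto I^{\,1-b}
\qquad \text{(a Stevens power function)}
```

So the two "rival" laws are both consequences of the same postulate, selected by how the threshold grows with adaptation intensity, which is exactly what the field-size dependence above shows.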

  12. Probabilistic rainfall thresholds for triggering debris flows in a human-modified landscape

    Science.gov (United States)

    Giannecchini, Roberto; Galanti, Yuri; D'Amato Avanzi, Giacomo; Barsanti, Michele

    2016-03-01

    In the Carrara Marble Basin (CMB; Apuan Alps, Italy) quarrying has accumulated widespread and thick quarry waste, lying on steep slopes and invading valley bottoms. The Apuan Alps are one of the rainiest areas in Italy, and rainstorms often cause landslides and debris flows. The stability conditions of quarry waste are difficult to assess, owing to its textural, geotechnical and hydrogeological variability. Therefore, empirical rainfall thresholds may be effective in forecasting the possible occurrence of debris flows in the CMB. Three types of thresholds were defined for three rain gauges of the CMB and for the whole area: rainfall intensity-rainfall duration (ID), cumulated event rainfall-rainfall duration (ED), and cumulated event rainfall normalized by the mean annual precipitation-rainfall intensity (EMAPI). The rainfall events recorded from 1950 to 2005 were analyzed and compared with the occurrence of debris flows involving the quarry waste. They were classified into events that triggered one or more debris flows and events that did not trigger debris flows. This dataset was fitted using the logistic regression method, which allows a set of thresholds to be defined corresponding to different probabilities of failure (from 10% to 90%) and therefore to different warning levels. The performance of the logistic regression in defining probabilistic thresholds was evaluated by means of contingency tables, skill scores and receiver operating characteristic (ROC) analysis. These analyses indicate that the predictive capability of the three types of threshold is acceptable for each rain gauge and for the whole CMB. The best compromise between the number of correct debris flow predictions and the number of wrong predictions is obtained for the 40% probability thresholds. The results obtained can be tested in an experimental debris flow forecasting system based on rainfall thresholds, and could have implications for debris flow hazard and risk assessment in the CMB.
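A minimal sketch of the probabilistic-threshold idea: fit a logistic model to triggering and non-triggering events, then invert it at a chosen probability to obtain the corresponding warning level. The synthetic data (cumulated event rainfall in hundreds of mm) and the single-predictor form are illustrative assumptions; the study fits duration-intensity pairs per rain gauge:

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, y, lr=0.5, epochs=20000):
    """Fit p(debris flow) = sigmoid(b0 + b1*x) by gradient ascent
    on the Bernoulli log-likelihood (x should be roughly unit scale)."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(epochs):
        resid = [yi - sigmoid(b0 + b1 * xi) for xi, yi in zip(x, y)]
        b0 += lr * sum(resid) / n
        b1 += lr * sum(r * xi for r, xi in zip(resid, x)) / n
    return b0, b1

def threshold_at(b0: float, b1: float, p: float) -> float:
    """Invert the fitted curve: rainfall at which p(debris flow) = p."""
    return (math.log(p / (1.0 - p)) - b0) / b1

# Cumulated event rainfall (in 100 mm) and whether a debris flow occurred.
rain = [0.2, 0.4, 0.6, 0.8, 0.9, 1.0, 1.1, 1.2, 1.5, 2.0]
flow = [0,   0,   0,   0,   1,   0,   1,   1,   1,   1]
b0, b1 = fit_logistic(rain, flow)
```

Lower-probability thresholds (e.g. `threshold_at(b0, b1, 0.4)`) sit at less rainfall than higher-probability ones, giving the graded set of warning levels that the study's 40% threshold belongs to.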

  13. Accurate motor mapping in awake common marmosets using micro-electrocorticographical stimulation and stochastic threshold estimation.

    Science.gov (United States)

    Kosugi, Akito; Takemi, Mitsuaki; Tia, Banty; Castagnola, Elisa; Ansaldo, Alberto; Sato, Kenta; Awiszus, Friedemann; Seki, Kazuhiko; Ricci, Davide; Fadiga, Luciano; Iriki, Atsushi; Ushiba, Junichi

    2018-03-01

    The motor map has been widely used as an indicator of motor skills and learning, cortical injury, plasticity, and functional recovery. Cortical stimulation mapping using epidural electrodes has recently been adopted for animal studies. However, several technical limitations remain. Test-retest reliability of epidural cortical stimulation (ECS) mapping has not been examined in detail. Many previous studies defined evoked movements and motor thresholds by visual inspection, and thus lacked quantitative measurement. A reliable and quantitative motor map is important for elucidating the mechanisms of motor cortical reorganization. The objective of the current study was to perform reliable ECS mapping of motor representations based on motor thresholds, which were stochastically estimated from motor evoked potentials recorded via chronically implanted micro-electrocorticographical (μECoG) electrode arrays, in common marmosets. Approach. ECS was applied using the implanted μECoG electrode arrays in three adult common marmosets under awake conditions. Motor evoked potentials were recorded through electromyographical electrodes implanted in upper limb muscles. The motor threshold was calculated through a modified maximum likelihood threshold-hunting algorithm fitted with the recorded data from marmosets. Further, a computer simulation confirmed the reliability of the algorithm. Main results. The computer simulation suggested that the modified maximum likelihood threshold-hunting algorithm enabled the motor threshold to be estimated with acceptable precision. In vivo ECS mapping showed high test-retest reliability with respect to the excitability and location of the cortical forelimb motor representations. Significance. Using implanted μECoG electrode arrays and a modified motor threshold-hunting algorithm, we were able to achieve reliable motor mapping in common marmosets with the ECS system. © 2018 IOP Publishing Ltd.
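The likelihood machinery behind such threshold hunting can be sketched offline: assume the probability of evoking an MEP rises as a cumulative Gaussian of stimulus intensity, and pick the candidate threshold that maximizes the Bernoulli likelihood of the observed responses. This is a simplified batch version with a fixed, assumed spread, not the authors' adaptive hunting algorithm, which also chooses the next stimulus online:

```python
import math

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def estimate_threshold(intensities, responses, spread=5.0, grid=None):
    """Maximum-likelihood estimate of the stimulation threshold.

    Models p(MEP | intensity s) = Phi((s - theta) / spread) and scans a
    grid of candidate thresholds theta, keeping the one that maximizes the
    Bernoulli log-likelihood of the observed 0/1 responses.
    """
    if grid is None:
        lo, hi = min(intensities), max(intensities)
        grid = [lo + k * (hi - lo) / 200.0 for k in range(201)]
    best_theta, best_ll = grid[0], -math.inf
    for theta in grid:
        ll = 0.0
        for s, r in zip(intensities, responses):
            p = min(max(norm_cdf((s - theta) / spread), 1e-9), 1.0 - 1e-9)
            ll += math.log(p) if r else math.log(1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta
```

Because the estimate comes from a likelihood over all trials rather than visual inspection of single responses, it yields the kind of quantitative, repeatable threshold that test-retest reliability can be computed on.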

  14. Threshold effects of habitat fragmentation on fish diversity at landscape scales.

    Science.gov (United States)

    Yeager, Lauren A; Keller, Danielle A; Burns, Taylor R; Pool, Alexia S; Fodrie, F Joel

    2016-08-01

    Habitat fragmentation involves habitat loss concomitant with changes in spatial configuration, confounding the mechanistic drivers of biodiversity change associated with habitat disturbance. Studies attempting to isolate the effects of altered habitat configuration on associated communities have reported variable results. This variability may be explained in part by the fragmentation threshold hypothesis, which predicts that the effects of habitat configuration may only manifest at low levels of remnant habitat area. To separate the effects of habitat area and configuration on biodiversity, we surveyed fish communities in seagrass landscapes spanning a range of total seagrass area (2-74% cover within 16,000-m² landscapes) and spatial configurations (1-75 discrete patches). We also measured variation in fine-scale seagrass variables, which are known to affect faunal community composition and may covary with landscape-scale features. We found that species richness decreased and the community structure shifted with increasing patch number within the landscape, but only when seagrass area was low. These findings are consistent with the fragmentation threshold hypothesis, and we suggest that poor matrix quality and low dispersal ability of sensitive taxa in our system may explain why our results support the hypothesis while previous empirical work has largely failed to match predictions. © 2016 by the Ecological Society of America.

  15. Measuring Input Thresholds on an Existing Board

    Science.gov (United States)

    Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.

    2011-01-01

    A critical PECL (positive emitter-coupled logic) to Xilinx interface needed to be changed on an existing flight board. The new Xilinx input interface used a CMOS (complementary metal-oxide semiconductor) type of input, and the driver could meet its thresholds typically, but not in worst-case, according to the data sheet. The previous interface had been based on comparison with an external reference, but the CMOS input is based on comparison with an internal divider from the power supply. A way to measure the exact input threshold of this device for 64 inputs on a flight board was needed. The measurement technique allowed an accurate measurement of the voltage required to switch a Xilinx input from high to low for each of the 64 lines, while only probing two of them. Directly driving an external voltage was considered too risky, and tests done on any other unit could not be used to qualify the flight board. The two lines directly probed gave an absolute voltage threshold calibration, while data collected on the remaining 62 lines without probing gave relative measurements that could be used to identify any outliers. The PECL interface was forced to a long-period square wave by driving a saturated square wave into the ADC (analog to digital converter). The active pull-down circuit was turned off, causing each line to rise rapidly and fall slowly according to the input's weak pull-down circuitry. The fall time shows up as a change in the pulse width of the signal read by the Xilinx. This change in pulse width is a function of capacitance, pull-down current, and input threshold. Capacitance was known from the different trace lengths, plus a gate input capacitance, which is the same for all inputs. The pull-down current is the same for all inputs including the two that are probed directly. The data was combined, and the Excel solver tool was used to find input thresholds for the 62 lines. This was repeated over different supply voltages and
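    The solver step in this record can be sketched with a simple constant-current discharge model, t_fall = C·(V_start − V_th)/I: a directly probed line calibrates the shared pull-down current, after which each unprobed line's threshold follows from its measured fall time and known trace capacitance. All component values below are hypothetical:

```python
# Hedged sketch of threshold recovery from fall times; all values hypothetical.
VDD = 3.3
V_START = VDD  # line is pulled high before the weak pull-down discharges it

def solve_pulldown_current(c, t_fall, v_th):
    """Constant-current discharge model: t_fall = C * (V_START - Vth) / I."""
    return c * (V_START - v_th) / t_fall

def infer_threshold(c, t_fall, i_pd):
    """Invert the same model for a line whose threshold is unknown."""
    return V_START - t_fall * i_pd / c

# Calibration from one directly probed line (10 pF, 1 us fall time, 1.6 V threshold):
c_probe, t_probe, vth_probe = 10e-12, 1.0e-6, 1.6
i_pd = solve_pulldown_current(c_probe, t_probe, vth_probe)

# Unprobed line with a different trace capacitance and measured fall time:
c_line, t_line = 12e-12, 1.2e-6
vth_line = infer_threshold(c_line, t_line, i_pd)
```

    The actual board analysis fit all 62 lines simultaneously (via the Excel solver), which averages out measurement noise that this one-line inversion would pass straight through.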

  16.   Information and acceptance of prenatal examinations - a qualitative study

    DEFF Research Database (Denmark)

    Fleron, Stina Lou; Dahl, Katja; Risør, Mette Bech

    Background: In 2004 the Danish National Board of Health issued new guidelines on prenatal examinations. The importance of informed decision making is strongly emphasised, and any acceptance of the screening tests offered should be based on thorough and adequate information. Objective and hypothesis: To explore the influence of information on the decision-making process for the prenatal screening tests offered, the relation between information, knowledge and up-take rates, and reasons for accepting or declining the screening tests offered. Methods: The study is based on a qualitative approach... ...by the health care system offering it. Through prenatal examinations, pregnant women want to be given the choice of future management should there be something wrong with their child. Conclusions: Participation in prenatal examinations is not based on a thorough knowledge of the pros and cons of the screening tests...

  17. Reduced visual surround suppression in schizophrenia shown by measuring contrast detection thresholds

    Science.gov (United States)

    Serrano-Pedraza, Ignacio; Romero-Ferreiro, Verónica; Read, Jenny C. A.; Diéguez-Risco, Teresa; Bagney, Alexandra; Caballero-González, Montserrat; Rodríguez-Torresano, Javier; Rodriguez-Jimenez, Roberto

    2014-01-01

    Visual perception in schizophrenia is attracting a broad interest given the deep knowledge that we have about the visual system in healthy populations. One example is the class of effects known collectively as visual surround suppression. For example, the visibility of a grating located in the visual periphery is impaired by the presence of a surrounding grating of the same spatial frequency and orientation. Previous studies have suggested abnormal visual surround suppression in patients with schizophrenia. Given that schizophrenia patients have cortical alterations, including hypofunction of NMDA receptors and reduced concentration of the GABA neurotransmitter, which affect lateral inhibitory connections, they should be relatively better than controls at detecting visual stimuli that are usually suppressed. We tested this hypothesis by measuring contrast detection thresholds using a new stimulus configuration. We tested two groups: 21 schizophrenia patients and 24 healthy subjects. Thresholds were obtained using Bayesian staircases in a four-alternative forced-choice detection task where the target was a grating within a 3° Butterworth window that appeared in one of four possible positions at 5° eccentricity. We compared three conditions: (a) target with no surround, (b) target embedded within a surrounding grating of 20° diameter and 25% contrast with the same spatial frequency and orthogonal orientation, and (c) target embedded within a surrounding grating with parallel (same) orientation. Previous results with healthy populations have shown that contrast thresholds are lower for the orthogonal and no-surround (NS) conditions than for the parallel surround (PS) condition. The log-ratios between parallel and NS thresholds are used as an index quantifying visual surround suppression. Patients performed poorly compared to controls in the NS and orthogonal-surround conditions. However, they performed as well as controls when the surround was parallel, resulting in significantly
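    The suppression index described in this record is simply a log-ratio of contrast thresholds. A minimal sketch, with hypothetical threshold values rather than data from the study:

```python
import math

def suppression_index(parallel_threshold, no_surround_threshold):
    """Log-ratio of parallel-surround to no-surround contrast thresholds.
    Positive values indicate surround suppression (the surround raised the threshold)."""
    return math.log10(parallel_threshold / no_surround_threshold)

# Hypothetical contrast thresholds (proportion contrast):
control = suppression_index(parallel_threshold=0.08, no_surround_threshold=0.02)
patient = suppression_index(parallel_threshold=0.05, no_surround_threshold=0.04)
```

    On these made-up numbers the control index is larger, i.e. stronger suppression, which is the direction of the group difference the hypothesis predicts.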

  18. Speech-in-Noise Tests and Supra-threshold Auditory Evoked Potentials as Metrics for Noise Damage and Clinical Trial Outcome Measures.

    Science.gov (United States)

    Le Prell, Colleen G; Brungart, Douglas S

    2016-09-01

    In humans, the accepted clinical standards for detecting hearing loss are the behavioral audiogram, based on the absolute detection threshold of pure tones, and the threshold auditory brainstem response (ABR). The audiogram and the threshold ABR are reliable and sensitive measures of hearing thresholds in human listeners. However, recent results from noise-exposed animals demonstrate that noise exposure can cause substantial neurodegeneration in the peripheral auditory system without degrading pure-tone audiometric thresholds. It has been suggested that clinical measures of auditory performance conducted with stimuli presented above the detection threshold may be more sensitive than the behavioral audiogram in detecting early-stage noise-induced hearing loss in listeners with audiometric thresholds within normal limits. Supra-threshold speech-in-noise testing and supra-threshold ABR responses are reviewed here, given that they may be useful supplements to the behavioral audiogram for assessment of possible neurodegeneration in noise-exposed listeners. Supra-threshold tests may be useful for assessing the effects of noise on the human inner ear, and the effectiveness of interventions designed to prevent noise trauma. The current state of the science does not necessarily allow us to define a single set of best-practice protocols. Nonetheless, we encourage investigators to incorporate these metrics into test batteries when feasible, with an effort to standardize procedures to the greatest extent possible as new reports emerge.

  19. Threshold pion electroproduction at large momentum transfers; Threshold Pion-Elektroproduktion bei grossen Energieuebertraegen

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Andreas

    2008-02-15

    We consider pion electroproduction close to threshold for Q{sup 2} in the region 1-10 GeV{sup 2} on a nucleon target. The momentum transfer dependence of the S-wave multipoles at threshold, E{sub 0+} and L{sub 0+}, is calculated in the chiral limit using light-cone sum rules. Predictions for the cross sections in the threshold region are given taking into account P-wave contributions that, as we argue, are model independent to a large extent. The results are compared with the SLAC E136 data on the structure function F{sub 2}(W,Q{sup 2}) in the threshold region. (orig.)

  20. Model-dependence of the CO2 threshold for melting the hard Snowball Earth

    Directory of Open Access Journals (Sweden)

    W. R. Peltier

    2011-01-01

    One of the critical issues of the Snowball Earth hypothesis is the CO2 threshold for triggering the deglaciation. Using the Community Atmospheric Model version 3.0 (CAM3), we study the problem of the CO2 threshold. Our simulations show large differences from previous results (e.g. Pierrehumbert, 2004, 2005; Le Hir et al., 2007). At 0.2 bars of CO2, the January maximum near-surface temperature is about 268 K, about 13 K higher than that in Pierrehumbert (2004, 2005), but lower than the value of 270 K for 0.1 bar of CO2 in Le Hir et al. (2007). The difference in simulation results is mainly due to model sensitivity of the greenhouse effect and longwave cloud forcing to increasing CO2. At 0.2 bars of CO2, CAM3 yields 117 W m−2 of clear-sky greenhouse effect and 32 W m−2 of longwave cloud forcing, versus only about 77 W m−2 and 10.5 W m−2 in Pierrehumbert (2004, 2005), respectively. CAM3 has a clear-sky greenhouse effect comparable to that in Le Hir et al. (2007), but lower longwave cloud forcing. CAM3 also produces much stronger Hadley cells than those in Pierrehumbert (2005). The effects of pressure broadening and collision-induced absorption are also studied using a radiative-convective model and CAM3. Both effects substantially increase surface temperature and thus lower the CO2 threshold. The radiative-convective model yields a CO2 threshold of about 0.21 bars with a surface albedo of 0.663. Without considering the effects of pressure broadening and collision-induced absorption, CAM3 yields an approximate CO2 threshold of about 1.0 bar for a surface albedo of about 0.6. However, the threshold is lowered to 0.38 bars when both effects are considered.

  1. Consumer acceptance of irradiated food

    Energy Technology Data Exchange (ETDEWEB)

    Loaharanu, P. [Head, Food Preservation Section, Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture, Wagramerstr. 5, A-1400, Vienna (Austria)

    1997-12-31

    There was a widely held opinion during the 1970s and 1980s that consumers would be reluctant to purchase irradiated food, as it was perceived that consumers would confuse irradiated food with food contaminated by radionuclides. Indeed, a number of consumer attitude surveys conducted in several western countries during these two decades demonstrated that consumer concern about irradiated food varied from very concerned to seriously concerned. This paper attempts to review the parameters used in measuring consumer acceptance of irradiated food during the past three decades and to project the trends on this subject. It is believed that important lessons learned from past studies will guide further efforts to market irradiated food with wide consumer acceptance in the future. (Author)

  2. Food irradiation receives international acceptance

    International Nuclear Information System (INIS)

    Beddoes, J.M.

    1982-01-01

    Irradiation has advantages as a method of preserving food, especially in the Third World. The author tabulates some examples of actual use of food irradiation with dates and tonnages, and tells the story of the gradual acceptance of food irradiation by the World Health Organization, other international bodies, and the U.S. Food and Drug Administration (USFDA). At present, the joint IAEA/FAO/WHO standard permits an energy level of up to 5 MeV for gamma rays, well above the 1.3 MeV energy level of 60Co. The USFDA permits irradiation of any food up to 10 krad, and minor constituents of a diet may be irradiated up to 5 Mrad. The final hurdle to be cleared, that of economic acceptance, depends on convincing the food processing industry that the process is technically and economically efficient

  3. Consumer acceptance of irradiated food

    International Nuclear Information System (INIS)

    Loaharanu, P.

    1997-01-01

    There was a widely held opinion during the 1970s and 1980s that consumers would be reluctant to purchase irradiated food, as it was perceived that consumers would confuse irradiated food with food contaminated by radionuclides. Indeed, a number of consumer attitude surveys conducted in several western countries during these two decades demonstrated that consumer concern about irradiated food varied from very concerned to seriously concerned. This paper attempts to review the parameters used in measuring consumer acceptance of irradiated food during the past three decades and to project the trends on this subject. It is believed that important lessons learned from past studies will guide further efforts to market irradiated food with wide consumer acceptance in the future. (Author)

  4. Pengaruh Self-Acceptance Importance, Affiliation Importance, dan Community Feeling Importance terhadap Compulsive Buying

    Directory of Open Access Journals (Sweden)

    Euis Soliha

    2011-03-01

    This study focused on the phenomenon of compulsive buying, examining how Self-Acceptance Importance, Affiliation Importance, and Community Feeling Importance influence Compulsive Buying. The population for this research was students in Kota Semarang, of whom 104 students were sampled. To address the research question, an econometric logit model was applied. The results indicate that Self-Acceptance Importance, Affiliation Importance, and Community Feeling Importance each negatively influence Compulsive Buying. These results support all hypotheses and are consistent with theory. Keywords: compulsive buying, self-acceptance importance, affiliation importance, community feeling importance, logit model

  5. Olfactory detection thresholds and pleasantness of a food-related and a non-food odour in hunger and satiety.

    Science.gov (United States)

    Albrecht, J; Schreder, T; Kleemann, A M; Schöpf, V; Kopietz, R; Anzinger, A; Demmel, M; Linn, J; Kettenmann, B; Wiesmann, M

    2009-06-01

    The primary aim of this study was to investigate whether olfactory detection thresholds depend on the state of satiety. Using the threshold test of the Sniffin' Sticks test battery (single-staircase, three-alternative forced-choice procedure), sensitivity to a non-food odour (n-butanol) and a food-related odour (isoamyl acetate) was investigated. Twenty-four healthy female subjects (mean age 24.2 years, SD 2.7 years) with normal olfactory function performed the tests when hungry and when satiated. Additionally, they rated their emotional condition, arousal, and alertness, as well as the intensity and pleasantness of both odorants. No significant change in the detection threshold for the non-food odour n-butanol was found, but the detection threshold for the food-related odour isoamyl acetate changed significantly: it was significantly lower in the state of satiety than in the hungry condition. As expected, the perceived pleasantness of isoamyl acetate was significantly lower in satiety. In summary, the results indicate that in our experimental setting the actual state of satiety affects detection thresholds for a food-related odour, but not for a non-food odour. Interestingly, the higher sensitivity was found during the state of satiety, challenging the current hypothesis that control of food intake is supported by a decrease in sensitivity to food odours. Instead, our finding that satiety decreases the pleasantness of a food-related odour supports the hypothesis that both odour threshold and pleasantness play an important role in the control of food intake.
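    The single-staircase, three-alternative forced-choice procedure used here can be sketched as a staircase that steps toward weaker stimuli after correct identifications and toward stronger ones after misses, averaging the final reversal points. The rule below (two-correct/one-miss, mean of the last four reversals, integer levels) is a common staircase convention used for illustration; the actual Sniffin' Sticks protocol differs in detail:

```python
def staircase_threshold(respond, start_level, n_reversals=7, levels=range(1, 17)):
    """Single-staircase threshold sketch: step up (stronger) after a miss, step
    down (weaker) after two consecutive hits. `respond(level)` returns True if
    the odour pen at `level` is identified correctly among the three choices.
    Threshold = mean of the last four reversal levels."""
    lo, hi = min(levels), max(levels)
    level = start_level
    direction = None
    reversals = []
    hits = 0
    while len(reversals) < n_reversals:
        if respond(level):
            hits += 1
            if hits < 2:
                continue            # wait for the second correct at this level
            hits, new_dir = 0, -1   # two correct: decrease concentration
        else:
            hits, new_dir = 0, +1   # miss: increase concentration
        if direction is not None and new_dir != direction:
            reversals.append(level)  # direction flipped: record a reversal
        direction = new_dir
        level = min(hi, max(lo, level + new_dir))
    last = reversals[-4:]
    return sum(last) / len(last)

# Deterministic simulated observer: detects whenever the level is at least 8.
est = staircase_threshold(lambda lv: lv >= 8, start_level=16)
```

    With this deterministic observer the staircase oscillates between levels 7 and 8, so the estimate settles halfway between them.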

  6. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Directory of Open Access Journals (Sweden)

    Prokop Pavol

    2016-06-01

    Rape is a recurrent adaptive problem of female humans and females of a number of non-human animals. Rape has various physiological and reproductive costs to the victim. The costs of rape are further exaggerated by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially for long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than did men in a committed relationship (Hypothesis 3), suggesting that mating opportunities mediate men's perception of victims of rape. Overall, our results suggest that the negative perception of victims of rape by men is underlain by the risk of cuckoldry rather than the fear of disease transmission.

  7. Reactor tank UT acceptance criteria

    International Nuclear Information System (INIS)

    Daugherty, W.L.

    1990-01-01

    The SRS reactor tanks are constructed of type 304 stainless steel, with 0.5 inch thick walls. An ultrasonic (UT) in-service inspection program has been developed for examination of these tanks, in accordance with the ISI Plan for the Savannah River Production Reactors Process Water System (DPSTM-88-100-1). Prior to initiation of these inspections, criteria for the disposition of any indications that might be found are required. A working group has been formed to review available information on the SRS reactor tanks and develop acceptance criteria. This working group includes nationally recognized experts in the nuclear industry. The working group has met three times and produced three documents describing the proposed acceptance criteria, the technical basis for the criteria and a proposed initial sampling plan. This report transmits these three documents, which were prepared in accordance with the technical task plan and quality assurance plan for this task, task 88-001-A- 1. In addition, this report summarizes the acceptance criteria and proposed sampling plan, and provides further interpretation of the intent of these three documents where necessary

  8. Public acceptance and public relations

    International Nuclear Information System (INIS)

    Tanaka, Yasumasa

    1977-01-01

    A set of problems is discussed which must be studied before public relations are addressed. First, the trade-off between energy and health must be considered; there have been several ages in which considerations of health took precedence over energy requirements. For example, the use of coal in London was prohibited by the King's proclamation in 1306. Second, acceptance of atomic power development and utilization rests on subjective, psychological grounds and cannot be concluded by logical reasoning alone. Third, a strict definition of ''national consensus'' is necessary: does it mean a plebiscite or a mere mood? Fourth, whether atomic energy is biologically free from the danger of death or not. Fifth, is there any method for socially distinguishing persons who accept atomic power from persons who do not? Although the probability of death caused by atomic accidents is very small (one three-hundred-millionth a year), many people fear atomic power and oppose the construction of nuclear power plants. Four reasons for this are considered: (1) social diffusion of innovation, (2) nuclear allergy, (3) a weak conception of risk-benefit, and (4) heterogeneity of the public. According to an investigation of the relationship between electric power and livelihood, carried out by the policy and science research institute in Tokyo, the highly subjective decision to accept atomic power is independent of objective knowledge about atomic power. (Iwakiri, K.)

  9. Predicting visual acuity from detection thresholds.

    Science.gov (United States)

    Newacheck, J S; Haegerstrom-Portnoy, G; Adams, A J

    1990-03-01

    Visual performance based exclusively on high luminance and high contrast letter acuity measures often fails to predict individual performance at low contrast and low luminance. Here we measured visual acuity over a wide range of contrasts and luminances (low mesopic to photopic) for 17 young normal observers. Acuity vs. contrast functions appear to fit a single template which can be displaced laterally along the log contrast axis. The magnitude of this lateral displacement for different luminances was well predicted by the contrast threshold difference for a 4 min arc spot. The acuity vs. contrast template, taken from the mean of all 17 subjects, was used in conjunction with individual spot contrast threshold measures to predict an individual's visual acuity over a wide range of luminance and contrast levels. The accuracy of the visual acuity predictions from this simple procedure closely approximates test-retest accuracy for both positive (projected Landolt rings) and negative contrast (Bailey-Lovie charts).

  10. Edith Wharton's threshold phobia and two worlds.

    Science.gov (United States)

    Holtzman, Deanna; Kulish, Nancy

    2014-08-01

    The American novelist Edith Wharton suffered an unusual childhood neurotic symptom, a fear of crossing thresholds, a condition that might be called a "threshold phobia." This symptom is identified and examined in autobiographical material, letters, diaries, and selected literary fiction and nonfiction left by Wharton to arrive at a formulation not previously drawn together. A fascinating theme-living or being trapped between "two worlds"-runs through much of the writer's life and work. The phobia is related to this theme, and both can be linked more broadly to certain sexual conflicts in women. This understanding of Wharton's phobia, it is argued, throws new light on the developmental issues and conflicts related to the female "oedipal" or triadic phase, characterized by the need to negotiate the two worlds of mother and of father. © 2014 by the American Psychoanalytic Association.

  11. Multiparty Computation from Threshold Homomorphic Encryption

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Nielsen, Jesper Buus

    2001-01-01

    We introduce a new approach to multiparty computation (MPC) basing it on homomorphic threshold crypto-systems. We show that given keys for any sufficiently efficient system of this type, general MPC protocols for n parties can be devised which are secure against an active adversary that corrupts...... any minority of the parties. The total number of bits broadcast is O(nk|C|), where k is the security parameter and |C| is the size of a (Boolean) circuit computing the function to be securely evaluated. An earlier proposal by Franklin and Haber with the same complexity was only secure for passive...... adversaries, while all earlier protocols with active security had complexity at least quadratic in n. We give two examples of threshold cryptosystems that can support our construction and lead to the claimed complexities....
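    The paper builds MPC on threshold cryptosystems, in which any sufficiently large subset of parties can jointly decrypt. The threshold property itself (though not the homomorphic encryption the paper actually requires) is easy to illustrate with Shamir secret sharing, sketched here over a prime field:

```python
import random

PRIME = 2**61 - 1  # Mersenne prime used as the field modulus for share arithmetic

def share_secret(secret, n, t, rng=random.Random(42)):
    """Split `secret` into n shares so that any t of them reconstruct it
    (Shamir's scheme): evaluate a random degree-(t-1) polynomial with
    constant term `secret` at points x = 1..n."""
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(t - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i == j:
                continue
            num = (num * -xj) % PRIME
            den = (den * (xi - xj)) % PRIME
        # Fermat inverse: den^(PRIME-2) mod PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = share_secret(123456789, n=5, t=3)
recovered = reconstruct(shares[:3])  # any 3 of the 5 shares suffice
```

    In the threshold cryptosystems the paper relies on, the shared value is a decryption key and the parties never reconstruct it explicitly; they instead combine partial decryptions, which is what keeps the protocol secure against a corrupted minority.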

  12. The Resting Motor Threshold - Restless or Resting?

    DEFF Research Database (Denmark)

    Karabanov, Anke Ninija; Raffin, Estelle Emeline; Siebner, Hartwig Roman

    2015-01-01

    Background The resting motor threshold (RMT) is used to individually adjust the intensity of transcranial magnetic stimulation (TMS) intensity and is assumed to be stable. Here we challenge this notion by showing that RMT expresses acute context-dependent fluctuations. Method In twelve participants......, the RMT of the right first dorsal interosseus muscle was repeatedly determined using a threshold-hunting procedure while participants performed motor imagery and visual attention tasks with the right or left hand. Data were analyzed using repeated-measure ANOVA. Results RMT differed depending on which...... hand performed the task (P = 0.003). RMT of right FDI was lower during motor imagery than during visual attention of the right hand (P = 0.002), but did not differ between left-hand tasks (P = 0.988). Conclusions State-dependent changes of RMT occur in absence of overt motor activity and can...

  13. Gamin portable radiation meter with alarm threshold

    International Nuclear Information System (INIS)

    Payat, Rene.

    1981-10-01

    The Gamin radiation meter is a direct-reading, portable, battery-powered gamma dose-rate meter featuring alarm thresholds. Dose rate is read on a micro-ammeter with a millirad-per-hour logarithmic scale, covering a range of 0.1 to 1000 millirads/hour. The instrument issues an audible warning signal when the dose-rate level exceeds a threshold value, which can be selected. The detector tube is of the Geiger-Muller, energy-compensated type. Because of its low battery drain, the instrument can be operated continuously for 1000 hours. It is powered by four 1.5-volt alkaline batteries of the R6 type. The electronic circuitry is housed in a small lightweight case made of impact-resistant plastic. Applications of the Gamin portable radiation monitor are found in health physics, safety departments, medical facilities, teaching, and civil defense [fr]

  14. Rayleigh scattering from ions near threshold

    International Nuclear Information System (INIS)

    Roy, S.C.; Gupta, S.K.S.; Kissel, L.; Pratt, R.H.

    1988-01-01

    Theoretical studies of Rayleigh scattering of photons from neon atoms with different degrees of ionization, for energies both below and above the K-edges of the ions, are presented. Some unexpected structures both in Rayleigh scattering and in photoionization from neutral and weakly ionized atoms, very close to threshold, have been reported. It has recently been realized that some of the predicted structures may have a nonphysical origin, due to the limitations of the independent-particle model and to the use of a Coulombic Latter tail. Use of a K-shell vacancy potential - in which an electron is assumed to be removed from the K-shell - in calculating K-shell Rayleigh scattering amplitudes removes some of the structure effects near threshold. We present in this work a discussion of scattering angular distributions and total cross sections obtained utilizing vacancy potentials, and compare these predictions with those previously obtained in other potential models. (author) [pt]

  15. The monolithic double-threshold discriminator

    International Nuclear Information System (INIS)

    Baturitsky, M.A.; Dvornikov, O.V.

    1999-01-01

    A double-threshold discriminator capable of processing input signals of different duration is described. Simplicity of the discriminator circuitry makes it possible to embody the discriminator in multichannel ICs using microwave bipolar-JFET technology. Time walk is calculated to be less than 0.35 ns for the input ramp signals with rise times 25-100 ns and amplitudes 50 mV-1 V

  16. Bivariate hard thresholding in wavelet function estimation

    OpenAIRE

    Piotr Fryzlewicz

    2007-01-01

    We propose a generic bivariate hard thresholding estimator of the discrete wavelet coefficients of a function contaminated with i.i.d. Gaussian noise. We demonstrate its good risk properties in a motivating example, and derive upper bounds for its mean-square error. Motivated by the clustering of large wavelet coefficients in real-life signals, we propose two wavelet denoising algorithms, both of which use specific instances of our bivariate estimator. The BABTE algorithm uses basis averaging...
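    The bivariate idea, thresholding a coefficient jointly with its neighbour rather than in isolation, can be sketched with a simple rule: keep a coefficient when the Euclidean norm of the (coefficient, right-neighbour) pair exceeds the threshold. This is an illustrative simplification, not the estimator of the paper:

```python
import math

def bivariate_hard_threshold(coeffs, threshold):
    """Keep a wavelet coefficient when the l2 norm of the pair (itself, right
    neighbour) exceeds the threshold; zero it otherwise. A sketch of thresholding
    coefficients jointly with a neighbour rather than individually."""
    n = len(coeffs)
    out = []
    for i, d in enumerate(coeffs):
        neighbour = coeffs[i + 1] if i + 1 < n else 0.0  # no right neighbour at the end
        out.append(d if math.hypot(d, neighbour) > threshold else 0.0)
    return out

noisy = [0.1, 2.5, 2.0, -0.2, 0.05, -3.0, 0.1]
denoised = bivariate_hard_threshold(noisy, threshold=1.0)
```

    Note how the small first coefficient survives because its neighbour is large; that is exactly the clustering of large coefficients in real-life signals that motivates the bivariate approach.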

  17. Estimasi Regresi Wavelet Thresholding Dengan Metode Bootstrap

    OpenAIRE

    Suparti, Suparti; Mustofa, Achmad; Rusgiyono, Agus

    2007-01-01

    A wavelet is a function with characteristic properties: it oscillates about zero, is localized in the time and frequency domains, and constructs orthogonal bases in the L2(R) space. One application of wavelets is the estimation of nonparametric regression functions. There are two kinds of wavelet estimators, linear and nonlinear; the nonlinear wavelet estimator is called a thresholding wavelet estimator. The application of the bootstrap method...

  18. Factors affecting mechanical (nociceptive) thresholds in piglets.

    Science.gov (United States)

    Janczak, Andrew M; Ranheim, Birgit; Fosse, Torunn K; Hild, Sophie; Nordgreen, Janicke; Moe, Randi O; Zanella, Adroaldo J

    2012-11-01

    To evaluate the stability and repeatability of measures of mechanical (nociceptive) thresholds in piglets and to examine potentially confounding factors when using a hand-held algometer. Descriptive, prospective cohort. Forty-four piglets from four litters, weighing 4.6 ± 1.0 kg (mean ± SD) at 2 weeks of age. Mechanical thresholds were measured twice on each of 2 days during the first and second week of life. Data were analyzed using a repeated-measures design to test the effects of behavior prior to testing, sex, week, day within week, and repetition within day. The effect of body weight and the interaction between piglet weight and behaviour were also tested. Piglet was entered into the model as a random effect as an additional test of repeatability. The effect of repeated testing was used to test the stability of measures. Pearson correlations between repeated measures were used to test the repeatability of measures. Variance component analysis was used to describe the variability in the data. Variance component analysis indicated that piglet explained only 17% of the variance in the data. All variables in the model (behaviour prior to testing, sex, week, day within week, repetition within day, body weight, the interaction between body weight and behaviour, piglet identity) except sex had a significant effect. Measures changed with repeated testing and increased with increasing piglet weight, indicating that time (age) and body weight should be taken into account when measuring mechanical (nociceptive) thresholds in piglets. Mechanical (nociceptive) thresholds can be used both for testing the efficacy of anaesthetics and analgesics, and for assessing hyperalgesia in chronic pain states in research and clinical settings. © 2012 The Authors. Veterinary Anaesthesia and Analgesia. © 2012 Association of Veterinary Anaesthetists and the American College of Veterinary Anesthesiologists.

  19. Near threshold studies of photoelectron satellites

    International Nuclear Information System (INIS)

    Heimann, P.A.

    1986-11-01

    Photoelectron spectroscopy and synchrotron radiation have been used to study correlation effects in the rare gases: He, Ne, Ar, Kr, and Xe. Two kinds of time-of-flight electron analyzers were employed to examine photoionization very close to threshold and at higher kinetic energies. Partial cross sections and angular distributions have been measured for a number of photoelectron satellites. The shake-off probability has been determined at some inner-shell resonances. 121 refs., 28 figs., 13 tabs

  20. Design sensitivity and statistical power in acceptability judgment experiments

    Directory of Open Access Journals (Sweden)

    Jon Sprouse

    2017-02-01

    Full Text Available Previous investigations into the validity of acceptability judgment data have focused almost exclusively on 'type I errors' (or false positives) because of the consequences of such errors for syntactic theories (Sprouse & Almeida 2012; Sprouse et al. 2013). The current study complements these previous studies by systematically investigating the 'type II error rate' (false negatives), or equivalently, the 'statistical power', of a wide cross-section of possible acceptability judgment experiments. Though type II errors have historically been assumed to be less costly than type I errors, the dynamics of scientific publishing mean that high type II error rates (i.e., studies with low statistical power) can lead to increases in type I error rates in a given field of study. We present a set of experiments and resampling simulations to estimate statistical power for four tasks (forced-choice, Likert scale, magnitude estimation, and yes-no), 50 effect sizes instantiated by real phenomena, sample sizes from 5 to 100 participants, and two approaches to statistical analysis (null hypothesis and Bayesian). Our goals are twofold: (i) to provide a fuller picture of the status of acceptability judgment data in syntax, and (ii) to provide detailed information that syntacticians can use to design and evaluate the sensitivity of acceptability judgment experiments in their own research.
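The resampling logic behind such power estimates can be sketched in a few lines. This toy Monte Carlo (normally distributed judgment scores and a pooled two-sample t-test are illustrative assumptions, not the study's actual materials) estimates power as the rejection rate over simulated experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-sided 5% critical values of Student's t for the degrees of
# freedom (2n - 2) used in the demo below.
T_CRIT = {8: 2.306, 48: 2.011, 198: 1.972}

def simulated_power(effect_size, n, n_sims=2000):
    """Estimate power by resampling: draw two sets of n judgment
    scores separated by `effect_size` (in SD units), run a pooled
    two-sample t-test, and count the rejection rate at alpha = .05."""
    crit = T_CRIT[2 * n - 2]
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n)
        b = rng.normal(effect_size, 1.0, n)
        s2 = (a.var(ddof=1) + b.var(ddof=1)) / 2.0
        t = (b.mean() - a.mean()) / np.sqrt(s2 * 2.0 / n)
        if abs(t) > crit:
            hits += 1
    return hits / n_sims

# Power grows with sample size for a fixed medium effect (d = 0.5).
for n in (5, 25, 100):
    print(n, round(simulated_power(0.5, n), 2))
```

The same loop generalizes to any task or analysis: swap in the response model and test of interest and re-run the simulation.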

  1. Peer Acceptance and Friendship in Early Childhood: The Conceptual Distinctions between Them

    Science.gov (United States)

    Beazidou, Eleftheria; Botsoglou, Kafenia

    2016-01-01

    This paper reviews previous literature about peer acceptance and friendship, two of the most critical aspects of peer relations, which have received most of the research attention in recent years. In this review, we will focus on the processes explaining the way children use the ability to socialise with peers; explore the hypothesis that certain…

  2. Experiencing limits of acceptable change: some thoughts after a decade of implementation

    Science.gov (United States)

    Stephen F. McCool; David N. Cole

    1997-01-01

    Wilderness managers and researchers have experienced implementation of the Limits of Acceptable Change planning system for over a decade. In a sense, implementation of LAC has been a broad scale experiment in planning, with the hypothesis being that LAC processes are more effective approaches to deal with questions of recreation management in protected areas than the...

  3. Acceptance and suitability of novel trees for Orthotomicus erosus, an exotic bark beetle in North America

    Science.gov (United States)

    A.J. Walter; R.C. Venette; S.A. Kells

    2010-01-01

    To predict whether an herbivorous pest insect will establish in a new area, the potential host plants must be known. For invading bark beetles, adults must recognize and accept trees suitable for larval development. The preference-performance hypothesis predicts that adults will select host species that maximize the fitness of their offspring. We tested five species of...

  4. The Role of Hypothesis in Constructive Design Research

    DEFF Research Database (Denmark)

    Bang, Anne Louise; Krogh, Peter; Ludvigsen, Martin

    2012-01-01

    and solid perspective on how to keep constructive design research on track, this paper offers a model for understanding the role of hypothesis in constructive design research. The model allows for understanding the hypothesis’s relation to research motivation, questions, experiments, evaluation...... position of the hypothesis as a key-governing element even in artistic led research processes....

  5. Dynamical agents' strategies and the fractal market hypothesis

    Czech Academy of Sciences Publication Activity Database

    Vácha, Lukáš; Vošvrda, Miloslav

    2005-01-01

    Roč. 14, č. 2 (2005), s. 172-179 ISSN 1210-0455 Grant - others:GA UK(CZ) 454/2004/A EK/FSV Institutional research plan: CEZ:AV0Z10750506 Keywords : efficient market hypothesis * fractal market hypothesis * agent's investment horizons Subject RIV: AH - Economics

  6. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  7. A new 'hidden colour hypothesis' in hadron physics

    Indian Academy of Sciences (India)

    A new `hidden colour hypothesis' within the framework of QCD, as an extension of and in keeping with the spirit of the `colour singlet hypothesis' is hereby proposed. As such it should play a role in a consistent description of exotic hadrons, such as diquonia, pentaquarks, dibaryons etc. How these exotic hadrons are ...

  9. New Hypothesis for SOFC Ceramic Oxygen Electrode Mechanisms

    DEFF Research Database (Denmark)

    Mogensen, Mogens Bjerg; Chatzichristodoulou, Christodoulos; Graves, Christopher R.

    2016-01-01

    A new hypothesis for the electrochemical reaction mechanism in solid oxide cell ceramic oxygen electrodes is proposed based on literature including our own results. The hypothesis postulates that the observed thin layers of SrO-La2O3 on top of ceramic perovskite and other Ruddlesden-Popper...

  10. The Younger Dryas impact hypothesis: A critical review

    NARCIS (Netherlands)

    van Hoesel, A.; Hoek, W.Z.; Pennock, G.M.; Drury, Martyn

    2014-01-01

    The Younger Dryas impact hypothesis suggests that multiple extraterrestrial airbursts or impacts resulted in the Younger Dryas cooling, extensive wildfires, megafaunal extinctions and changes in human population. After the hypothesis was first published in 2007, it gained much criticism, as the

  11. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing a nonlinear hypothesis using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis using the iterative NLLS estimator based on nonlinear studentized residuals has also been proposed. In this research article an innovative method of testing a nonlinear hypothesis using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing a nonlinear hypothesis using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models with a suitable illustration. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
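As a hedged illustration of the Wald approach to a nonlinear hypothesis on an NLLS estimate (the exponential model, data, and restriction below are choices of mine, not the paper's worked example): fit by Gauss-Newton iteration, form the asymptotic covariance, and test h(b) = b1·b2 − 1 = 0.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x, b):
    b1, b2 = b
    return b1 * np.exp(b2 * x)

def jacobian(x, b):
    b1, b2 = b
    e = np.exp(b2 * x)
    return np.column_stack([e, b1 * x * e])   # d/db1, d/db2

# Synthetic data generated under the restriction b1 * b2 = 1.
x = np.linspace(0.0, 2.0, 200)
y = model(x, (2.0, 0.5)) + rng.normal(0.0, 0.05, x.size)

# Iterative NLLS via Gauss-Newton.
b = np.array([1.0, 1.0])
for _ in range(50):
    J = jacobian(x, b)
    b = b + np.linalg.lstsq(J, y - model(x, b), rcond=None)[0]

# Asymptotic covariance of the NLLS estimator: s^2 (J'J)^{-1}.
J = jacobian(x, b)
s2 = np.sum((y - model(x, b)) ** 2) / (x.size - b.size)
V = s2 * np.linalg.inv(J.T @ J)

# Wald statistic for the nonlinear hypothesis h(b) = b1*b2 - 1 = 0.
h = np.array([b[0] * b[1] - 1.0])
H = np.array([[b[1], b[0]]])              # Jacobian of h at the estimate
W = float(h @ np.linalg.inv(H @ V @ H.T) @ h)
print(b.round(3), round(W, 3))            # compare W to chi2(1) 5% value 3.84
```

Under the null, W is asymptotically chi-square with one degree of freedom, so values below 3.84 fail to reject at the 5% level.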

  12. Assess the Critical Period Hypothesis in Second Language Acquisition

    Science.gov (United States)

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  13. Threshold Learning Dynamics in Social Networks

    Science.gov (United States)

    González-Avella, Juan Carlos; Eguíluz, Victor M.; Marsili, Matteo; Vega-Redondo, Fernado; San Miguel, Maxi

    2011-01-01

    Social learning is defined as the ability of a population to aggregate information, a process which must crucially depend on the mechanisms of social interaction. Consumers choosing which product to buy, or voters deciding which option to take with respect to an important issue, typically confront external signals to the information gathered from their contacts. Economic models typically predict that correct social learning occurs in large populations unless some individuals display unbounded influence. We challenge this conclusion by showing that an intuitive threshold process of individual adjustment does not always lead to such social learning. We find, specifically, that three generic regimes exist, separated by sharp discontinuous transitions. Only in one of them, where the threshold is within a suitable intermediate range, does the population learn the correct information. In the other two, where the threshold is either too high or too low, the system either freezes or enters into persistent flux, respectively. These regimes are generally observed in different social networks (both complex and regular), but limited interaction is found to promote correct learning by enlarging the parameter region where it occurs. PMID:21637714
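A minimal sketch of a threshold adjustment process of this kind (the ring network, synchronous update rule, and parameters are illustrative assumptions, not the authors' exact model): each agent switches state when the fraction of disagreeing neighbours exceeds the threshold.

```python
import numpy as np

def run_threshold_dynamics(theta, n=200, k=4, steps=50, p_correct=0.6, seed=2):
    """Synchronous threshold adjustment on a ring where each agent
    watches its k nearest neighbours (k/2 on each side) and switches
    state when the fraction of disagreeing neighbours exceeds theta.
    Returns the fraction of agents ending in the correct state (+1)."""
    rng = np.random.default_rng(seed)
    state = np.where(rng.random(n) < p_correct, 1, -1)
    offsets = [o for o in range(-(k // 2), k // 2 + 1) if o != 0]
    for _ in range(steps):
        disagree = np.zeros(n)
        for o in offsets:
            disagree += np.roll(state, o) != state
        state = np.where(disagree / k > theta, -state, state)
    return float(np.mean(state == 1))

# Sweep the threshold: a very low theta keeps the system in flux, a
# very high theta freezes it, and an intermediate range is where
# aggregation of the initially-majority (correct) signal can occur.
for theta in (0.1, 0.5, 1.0):
    print(theta, run_threshold_dynamics(theta))
```

Note that at theta = 1.0 the switching condition can never be met (the disagreeing fraction never exceeds 1), so the configuration is frozen at its initial state, mirroring the "too high a threshold" regime.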

  14. Near threshold computing technology, methods and applications

    CERN Document Server

    Silvano, Cristina

    2016-01-01

    This book explores near-threshold computing (NTC), a design space using techniques to run digital chips (processors) near the lowest possible voltage. Readers will learn specific techniques to design chips that are extremely robust, tolerating variability and resilient against errors. Variability-aware voltage and frequency allocation schemes are presented that provide performance guarantees when moving toward near-threshold manycore chips.
    • Provides an introduction to near-threshold computing, giving the reader a variety of tools to face the challenges of the power/utilization wall;
    • Demonstrates how to design efficient voltage regulation, so that each region of the chip can operate at the most efficient voltage and frequency point;
    • Investigates how performance guarantees can be ensured when moving towards NTC manycores through variability-aware voltage and frequency allocation schemes.

  15. Threshold photoelectron spectroscopy of acetaldehyde and acrolein

    International Nuclear Information System (INIS)

    Yencha, Andrew J.; Siggel-King, Michele R.F.; King, George C.; Malins, Andrew E.R.; Eypper, Marie

    2013-01-01

    Highlights:
    • High-resolution threshold photoelectron spectrum of acetaldehyde.
    • High-resolution threshold photoelectron spectrum of acrolein.
    • High-resolution total photoion yield spectrum of acetaldehyde.
    • High-resolution total photoion yield spectrum of acrolein.
    • Determination of vertical ionization potentials in acetaldehyde and acrolein.
    Abstract: High-resolution (6 meV and 12 meV) threshold photoelectron (TPE) spectra of acetaldehyde and acrolein (2-propenal) have been recorded over the valence binding energy region 10–20 eV, employing synchrotron radiation and a penetrating-field electron spectrometer. These TPE spectra are presented here for the first time. All of the band structures observed in the TPE spectra replicate those found in their conventional HeI photoelectron (PE) spectra. However, the relative band intensities are found to be dramatically different in the two types of spectra, which is attributed to the different dominant formation mechanisms. In addition, some band shapes and their vertical ionization potentials differ in the two types of spectra, which is associated with the autoionization of Rydberg states in the two molecules.

  16. Treatment of threshold retinopathy of prematurity

    Directory of Open Access Journals (Sweden)

    Deshpande Dhanashree

    1998-01-01

    Full Text Available This report deals with our experience in the management of threshold retinopathy of prematurity (ROP). A total of 45 eyes of 23 infants were subjected to treatment of threshold ROP; 26.1% of these infants had a birth weight of >1,500 g. The preferred modality of treatment was laser indirect photocoagulation, which was facilitated by scleral depression. Cryopexy was done in cases with nondilating pupils or media haze and was always under general anaesthesia. Retreatment with either modality was needed in 42.2% of eyes; in this the skip areas were covered. Total regression of disease was achieved in 91.1% of eyes with no sequelae. All 4 eyes that progressed to stage 5 despite treatment had zone 1 disease. Major treatment-induced complications did not occur in this series. This study underscores the importance of routine screening of infants up to 2,000 g birth weight for ROP and the excellent response that is achieved with laser photocoagulation in inducing regression of threshold ROP. Laser is the preferred method of treatment in view of the absence of treatment-related morbidity to the premature infants.

  17. Auditory temporal resolution threshold in elderly individuals.

    Science.gov (United States)

    Queiroz, Daniela Soares de; Momensohn-Santos, Teresa Maria; Branco-Barreiro, Fátima Cristina Alves

    2010-01-01

    The Random Gap Detection Test (RGDT) evaluates temporal resolution threshold. There are doubts as to whether performance in this task remains unchanged with the aging process. At the same time, there is a concern about how much the difficulties of communication experienced by elderly individuals are related to the deterioration of temporal resolution. The aim was to determine the auditory temporal resolution threshold in elderly individuals with normal peripheral hearing or symmetric mild sensorineural hearing loss, and to correlate findings with gender, age, audiometric findings and scores obtained in the Self-Assessment of Communication (SAC) questionnaire. 63 elderly individuals, aged between 60 and 80 years (53 women and 10 men), were submitted to the RGDT and the SAC. Statistical analysis of the relationship between gender and the RGDT indicated that the performance of elderly females was statistically poorer than that of elderly males. Age and audiometric configuration did not correlate with performance in the RGDT and in the SAC. The results indicate that in the SAC both genders presented no significant complaints about communication difficulties regardless of the outcome obtained in the RGDT or audiometric configuration. The average temporal resolution threshold for women was 104.81 ms.

  18. Chemical sensing thresholds for mine detection dogs

    Science.gov (United States)

    Phelan, James M.; Barnett, James L.

    2002-08-01

    Mine detection dogs have been found to be an effective method to locate buried landmines. The capabilities of the canine olfaction method are from a complex combination of training and inherent capacity of the dog for odor detection. The purpose of this effort was to explore the detection thresholds of a limited group of dogs that were trained specifically for landmine detection. Soils were contaminated with TNT and 2,4-DNT to develop chemical vapor standards to present to the dogs. Soils contained ultra trace levels of TNT and DNT, which produce extremely low vapor levels. Three groups of dogs were presented the headspace vapors from the contaminated soils in work environments for each dog group. One positive sample was placed among several that contained clean soils and, the location and vapor source (strength, type) was frequently changed. The detection thresholds for the dogs were determined from measured and extrapolated dilution of soil chemical residues and, estimated soil vapor values using phase partitioning relationships. The results showed significant variances in dog sensing thresholds, where some dogs could sense the lowest levels and others had trouble with even the highest source. The remarkable ultra-trace levels detectable by the dogs are consistent with the ultra-trace chemical residues derived from buried landmines; however, poor performance may go unnoticed without periodic challenge tests at levels consistent with performance requirements.

  19. Treating acetaminophen overdose: thresholds, costs and uncertainties.

    Science.gov (United States)

    Gosselin, S; Hoffman, R S; Juurlink, D N; Whyte, I; Yarema, M; Caro, J

    2013-03-01

    The United Kingdom's Medicines and Healthcare Products Regulatory Agency (MHRA) modified the indications for N-acetylcysteine therapy of acetaminophen (paracetamol) overdose in September 2012. The new treatment threshold line was lowered to 100 mg/L (662 μmol/L) for a 4-hour acetaminophen concentration from the previous 200 mg/L (1325 μmol/L). This decision has the potential to substantially increase overall costs associated with acetaminophen overdose, with unclear benefits from a marginal increase in the number of patients protected from hepatotoxicity, fulminant hepatic failure, death, or transplant. Changing the treatment threshold for acetaminophen overdose also implies that ingestion amounts previously thought not to require acetaminophen concentration measurements would need to be revised. As a result, more individuals will be sent to hospitals so that everyone with a predicted 4-hour concentration above the 100 mg/L line will have concentrations measured and potentially be treated with N-acetylcysteine. Before others consider adopting this new treatment guideline, formal cost-effectiveness analyses need to be performed to define the appropriate thresholds for referral and treatment.
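For illustration only: if one assumes the nomogram treatment line is log-linear and declines from its 4-hour intercept with a 4-hour half-life (the Rumack-Matthew-style construction; the exact MHRA chart may differ), the effect of lowering the intercept from 200 to 100 mg/L can be computed at any sampling time:

```python
def treatment_line(t_hours, c4=100.0, half_life=4.0):
    """Concentration (mg/L) on an assumed log-linear treatment line
    passing through `c4` mg/L at 4 h post-ingestion and declining
    with the given half-life (hours). Defined only for samples drawn
    4-24 h post-ingestion, as on standard nomograms."""
    if not 4.0 <= t_hours <= 24.0:
        raise ValueError("line defined for 4-24 h post-ingestion only")
    return c4 * 2.0 ** (-(t_hours - 4.0) / half_life)

# Halving the 4 h intercept halves the threshold at every later time.
print(treatment_line(8.0))            # 50.0 on the new 100 mg/L line
print(treatment_line(8.0, c4=200.0))  # 100.0 on the previous 200 mg/L line
```

Under this assumed geometry, every patient whose concentration falls between the two parallel lines becomes newly eligible for N-acetylcysteine, which is the population driving the cost question raised above.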

  20. Cardio-vascular reserve index (CVRI) during exercise complies with the pattern assumed by the cardiovascular reserve hypothesis.

    Science.gov (United States)

    Segel, Michael J; Bobrovsky, Ben-Zion; Gabbay, Itay E; Ben-Dov, Issahar; Reuveny, Ronen; Gabbay, Uri

    2017-05-01

    The Cardio-vascular reserve index (CVRI) has been empirically validated in diverse morbidities as a quantitative estimate of the reserve assumed by the cardiovascular reserve hypothesis. This work evaluates whether CVRI during exercise complies with the cardiovascular reserve hypothesis. Retrospective study based on a database of patients who underwent cardio-pulmonary exercise testing (CPX) for diverse indications. Patients' physiological measurements were retrieved at four predefined CPX stages (rest, anaerobic threshold, peak exercise and after 2 min of recovery). CVRI was calculated retrospectively for each individual at each stage. Mean CVRI at rest was 0.81, significantly higher than at the other stages (p < 0.05). CVRI after 2 min of recovery rose considerably, most in the group with the best exercise capacity and least in those with the lowest exercise capacity. CVRI during exercise fits the pattern predicted by the cardiovascular reserve hypothesis. CVRI decreased with exercise, reaching a minimum at peak exercise, and rose with recovery. The CVRI nadir at peak exercise, similar across groups classified by exercise capacity, complies with the assumed exhaustion threshold. The clinical utility of CVRI should be further evaluated. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Phase-change memory: A continuous multilevel compact model of subthreshold conduction and threshold switching

    Science.gov (United States)

    Pigot, Corentin; Gilibert, Fabien; Reyboz, Marina; Bocquet, Marc; Zuliani, Paola; Portal, Jean-Michel

    2018-04-01

    Phase-change memory (PCM) compact modeling of the threshold switching based on a thermal runaway in Poole–Frenkel conduction is proposed. Although this approach is often used in physical models, this is the first time it is implemented in a compact model. The model accuracy is validated by a good correlation between simulations and experimental data collected on a PCM cell embedded in a 90 nm technology. A wide range of intermediate states is measured and accurately modeled with a single set of parameters, allowing multilevel programming. A good convergence is exhibited even in snapback simulation owing to this fully continuous approach. Moreover, threshold properties extraction indicates a thermally enhanced switching, which validates the basic hypothesis of the model. Finally, it is shown that this model is compliant with a new drift-resilient cell-state metric. Once enriched with a phase transition module, this compact model is ready to be implemented in circuit simulators.

  2. Linear, no threshold response at low doses of ionizing radiation: ideology, prejudice and science

    International Nuclear Information System (INIS)

    Kesavan, P.C.

    2014-01-01

    The linear, no threshold (LNT) response model assumes that there is no threshold dose for the radiation-induced genetic effects (heritable mutations and cancer), and it forms the current basis for radiation protection standards for radiation workers and the general public. The LNT model is, however, based more on ideology than valid radiobiological data. Further, phenomena such as 'radiation hormesis', 'radioadaptive response', 'bystander effects' and 'genomic instability' are now demonstrated to be radioprotective and beneficial. More importantly, the 'differential gene expression' reveals that qualitatively different proteins are induced by low and high doses. This finding negates the LNT model which assumes that qualitatively similar proteins are formed at all doses. Thus, all available scientific data challenge the LNT hypothesis. (author)

  3. Revisiting the hypothesis-driven interview in a contemporary context.

    Science.gov (United States)

    Holmes, Alex; Singh, Bruce; McColl, Geoff

    2011-12-01

    The "hypothesis-driven interview" was articulated by George Engel as a method of raising and testing hypotheses in the process of building a biopsychosocial formulation and determining the most likely diagnosis. This interview was a forerunner of the modern medical interview as well as the contemporary psychiatric assessment. The objective of this article is to describe the hypothesis-driven interview and to explore its relationship with the contemporary medical interview. The literature on the medical and hypothesis-driven interview was reviewed. Key features of each were identified. The hypothesis-driven interview shares much with the contemporary medical interview. In addition, it enhances the application of communication skills and allows the interviewer to develop a formulation during the course of the assessment. The hypothesis-driven interview is well suited to the aims of a contemporary psychiatric assessment.

  4. Wind energy and social acceptability; Energie eolienne et acceptability sociale

    Energy Technology Data Exchange (ETDEWEB)

    Feurtey, E. (ed.)

    2008-07-01

    This document was prepared as part of a decentralized collaboration between Quebec and France to share knowledge regarding strategies and best practices in wind power development. It reviewed the social acceptance of Quebec's wind power industry, particularly at the municipal level. The wind industry is growing rapidly in Quebec, and this growth has generated many reactions ranging from positive to negative. The purpose of this joint effort was to describe decision making steps to developing a wind turbine array. The history of wind development in Quebec was discussed along with the various hardware components required in a wind turbine and different types of installations. The key element in implementing wind turbine arrays is to establish public acceptance of the project, followed by a good regulatory framework to define the roles and responsibilities of participants. The production of electricity from wind turbines constitutes a clean and renewable source of energy. Although it is associated with a reduction in greenhouse gas emissions, this form of energy can also have negative environmental impacts, including noise. The revenues generated by wind parks are important factors in the decision making process. Two case studies in Quebec were presented. refs., tabs., figs.

  5. A Robust Threshold for Iterative Channel Estimation in OFDM Systems

    Directory of Open Access Journals (Sweden)

    A. Kalaycioglu

    2010-04-01

    Full Text Available A novel threshold computation method for pilot symbol assisted iterative channel estimation in OFDM systems is considered. As the bits are transmitted in packets, the proposed technique is based on calculating a particular threshold for each data packet in order to select the reliable decoder output symbols to improve the channel estimation performance. Iteratively, additional pilot symbols are established according to the threshold and the channel is re-estimated with the new pilots inserted into the known channel-estimation pilot set. The proposed threshold calculation method for selecting additional pilots performs better than non-iterative channel estimation, no-threshold and fixed-threshold techniques in poor HF channel simulations.
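A hedged sketch of the per-packet idea (the reliability metric and the mean-plus-k·std rule below are illustrative assumptions of mine; the paper's exact threshold formula is not reproduced here): compute a threshold from each packet's own decoder reliabilities and promote symbols above it to extra pilots.

```python
import numpy as np

rng = np.random.default_rng(4)

def select_extra_pilots(reliabilities, k=1.0):
    """Per-packet pilot selection: symbols whose decoder reliability
    (e.g. |LLR|) exceeds mean + k*std of this packet's reliabilities
    are treated as additional pilots for re-estimating the channel.
    Because the threshold is recomputed per packet, it adapts to the
    packet's channel conditions instead of being fixed in advance."""
    thr = reliabilities.mean() + k * reliabilities.std()
    return np.flatnonzero(reliabilities > thr), float(thr)

llrs = np.abs(rng.normal(0.0, 2.0, 64))  # stand-in decoder reliabilities
idx, thr = select_extra_pilots(llrs)
print(len(idx), round(thr, 2))
```

In an iterative receiver this selection would run after each decoding pass, with the channel re-estimated from the enlarged pilot set before the next pass.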

  6. Considering a Threshold Energy in Reactive Transport Modeling of Microbially Mediated Redox Reactions in an Arsenic-Affected Aquifer

    Directory of Open Access Journals (Sweden)

    Marco Rotiroti

    2018-01-01

    Full Text Available The reductive dissolution of Fe-oxide driven by organic matter oxidation is the primary mechanism accepted for As mobilization in several alluvial aquifers. These processes are often mediated by microorganisms that require a minimum Gibbs energy available to conduct the reaction in order to sustain their life functions. Implementing this threshold energy in reactive transport modeling is rarely done in the existing literature. This work presents a 1D reactive transport model of As mobilization by the reductive dissolution of Fe-oxide and subsequent immobilization by co-precipitation in iron sulfides, considering a threshold energy for the following terminal electron accepting processes: (a) Fe-oxide reduction, (b) sulfate reduction, and (c) methanogenesis. The model is then extended by implementing a threshold energy on both reaction directions for the redox reaction pairs Fe(III) reduction/Fe(II) oxidation and methanogenesis/methane oxidation. The optimal threshold energies were fitted as 4.50, 3.76, and 1.60 kJ/mol e− for sulfate reduction, Fe(III) reduction/Fe(II) oxidation, and methanogenesis/methane oxidation, respectively. The use of models implementing a bidirectional threshold energy is needed when a redox reaction pair can be transported between domains with different redox potentials. This may often occur in 2D or 3D simulations.
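One common way to embed such a threshold energy in a microbial rate law is a thermodynamic limiting factor of the Jin-and-Bethke style sketched below (an assumption for illustration; the paper's exact formulation may differ): the rate is multiplied by a factor that goes to zero once the energy yield drops below the threshold.

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol*K)

def thermodynamic_factor(dG, dG_threshold, T=298.15, chi=1.0):
    """Dimensionless limiter F_T = max(0, 1 - exp((dG + dG_th)/(chi*R*T)))
    applied to a microbial rate law: the reaction is switched off once
    the energy yield -dG (kJ/mol e-) falls below the threshold dG_th.
    chi is an average stoichiometric number (assumed 1 here)."""
    return max(0.0, 1.0 - math.exp((dG + dG_threshold) / (chi * R * T)))

# With the fitted sulfate-reduction threshold of 4.50 kJ/mol e-:
print(thermodynamic_factor(-10.0, 4.50))  # > 0: enough energy, reaction runs
print(thermodynamic_factor(-3.0, 4.50))   # 0.0: below threshold, switched off
```

Applying the same factor to the reverse reaction, with the sign of dG flipped, is one way to realize the bidirectional threshold the abstract describes.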

  7. Fast Acceptance by Common Experience

    Directory of Open Access Journals (Sweden)

    Nathan Berg

    2010-08-01

    Full Text Available Schelling (1969, 1971a,b, 1978) observed that macro-level patterns do not necessarily reflect micro-level intentions, desires or goals. In his classic model on neighborhood segregation, which initiated a large and influential literature, individuals with no desire to be segregated from those who belong to other social groups nevertheless wind up clustering with their own type. Most extensions of Schelling's model have replicated this result. There is an important mismatch, however, between theory and observation, which has received relatively little attention. Whereas Schelling-inspired models typically predict large degrees of segregation starting from virtually any initial condition, the empirical literature documents considerable heterogeneity in measured levels of segregation. This paper introduces a mechanism that can produce significantly higher levels of integration and, therefore, brings predicted distributions of segregation more in line with real-world observation. As in the classic Schelling model, agents in a simulated world want to stay or move to a new location depending on the proportion of neighbors they judge to be acceptable. In contrast to the classic model, agents' classifications of their neighbors as acceptable or not depend lexicographically on recognition first and group type (e.g., ethnic stereotyping) second. The FACE-recognition model nests classic Schelling: When agents have no recognition memory, judgments about the acceptability of a prospective neighbor rely solely on his or her group type (as in the Schelling model). A very small amount of recognition memory, however, eventually leads to different classifications that, in turn, produce dramatic macro-level effects resulting in significantly higher levels of integration.
A novel implication of the FACE-recognition model concerns the large potential impact of policy interventions that generate modest numbers of face-to-face encounters with members of other social groups.

  8. Decision modeling and acceptance criteria

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2003-01-01

    formulation of decision criteria and public acceptance criteria connected to risk analysis of technical operations that may endanger human life and property. Public restrictions on the decisions concerning the design, construction and managing of the technical operation have in the past been imposed......, the owner that tries to optimize the net gain of the operation, and the public that has somewhat different preferences than the owner, but also strong interests in the success of the owner. The principles of rational decision are needed for appreciation of the problem. Recognizing that there is an insurance...

  9. Hydrometeorological threshold conditions for debris flow initiation in Norway

    Directory of Open Access Journals (Sweden)

    N. K. Meyer

    2012-10-01

    Full Text Available Debris flows, triggered by extreme precipitation events and rapid snow melt, cause considerable damage to the Norwegian infrastructure every year. To define intensity-duration (ID) thresholds for debris flow initiation, critical water supply conditions arising from intensive rainfall or snow melt were assessed on the basis of daily hydro-meteorological information for 502 documented debris flow events. Two threshold types were computed: one based on absolute ID relationships and one using ID relationships normalized by the local precipitation day normal (PDN). For each threshold type, minimum, medium and maximum threshold values were defined by fitting power law curves along the 10th, 50th and 90th percentiles of the data population. Depending on the duration of the event, the absolute threshold intensities needed for debris flow initiation vary between 15 and 107 mm day−1. Since the PDN changes locally, the normalized thresholds show spatial variations. Depending on location, duration and threshold level, the normalized threshold intensities vary between 6 and 250 mm day−1. The thresholds obtained were used for a frequency analysis of over-threshold events giving an estimation of the exceedance probability and thus potential for debris flow events in different parts of Norway. The absolute thresholds are most often exceeded along the west coast, while the normalized thresholds are most frequently exceeded on the west-facing slopes of the Norwegian mountain ranges. The minimum thresholds derived in this study are in the range of other thresholds obtained for regions with a climate comparable to Norway. Statistics reveal that the normalized threshold is more reliable than the absolute threshold as the former shows no spatial clustering of debris flows related to water supply events captured by the threshold.
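Percentile ID threshold curves of the form I = a·D^b can be traced with a simple procedure (synthetic stand-in data below; the least-squares fit in log-log space with a quantile-shifted intercept is one of several ways to fit percentile lines, not necessarily the study's method):

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in intensity-duration pairs for documented events.
duration = rng.uniform(1.0, 10.0, 500)                               # days
intensity = 30.0 * duration ** -0.6 * rng.lognormal(0.0, 0.4, 500)   # mm/day

def percentile_threshold(D, I, q):
    """Fit I = a * D**b by least squares in log-log space, then shift
    the intercept so that a fraction q of the events falls below the
    curve -- a simple way to trace the 10th/50th/90th percentile
    threshold lines described above."""
    logD, logI = np.log(D), np.log(I)
    b, loga = np.polyfit(logD, logI, 1)
    loga_q = loga + np.quantile(logI - (loga + b * logD), q)
    return float(np.exp(loga_q)), float(b)

for q in (0.1, 0.5, 0.9):
    a, b = percentile_threshold(duration, intensity, q)
    print(f"q={q}: I = {a:.1f} * D^{b:.2f}")
```

All three lines share the fitted exponent b; only the prefactor a shifts, which matches the minimum/medium/maximum threshold construction.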

  10. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

Ebru Turanoğlu

    2012-02-01

    Full Text Available Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values. These parameters can be expressed by linguistic variables. The fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. Then the characteristic curves of acceptance sampling are examined under fuzziness. Illustrative examples are given.
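Under a binomial lot model, the acceptance probability for a crisp plan (n, c) is Pa(p) = Σ_{d=0}^{c} C(n, d) p^d (1−p)^{n−d}. One simple way to mimic fuzzy parameters is to let n and c range over intervals ("about 100", "about 2"), which turns Pa into a band rather than a crisp value; this is an illustrative simplification of the paper's fuzzy-set treatment, not its exact derivation.

```python
from math import comb

def accept_prob(n, c, p):
    """Probability of accepting a lot: P(defectives in sample <= c),
    with sample size n and fraction defective p (binomial model)."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

def fuzzy_accept_band(n_range, c_range, p):
    """With 'about n' and 'about c' expressed as intervals, the acceptance
    probability becomes a band [low, high] instead of a single number."""
    probs = [accept_prob(n, c, p)
             for n in range(n_range[0], n_range[1] + 1)
             for c in range(c_range[0], c_range[1] + 1)]
    return min(probs), max(probs)

# sample size "around 100", acceptance number "around 2", 1% defective
lo, hi = fuzzy_accept_band((90, 110), (1, 3), 0.01)
```

Sweeping p over a grid then traces out a band-valued operating characteristic curve instead of a single curve.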

  11. The periodontal pain paradox: Difficulty in pain assessment in dental patients (The periodontal pain paradox hypothesis)

    Directory of Open Access Journals (Sweden)

    Haryono Utomo

    2006-12-01

Full Text Available In daily dental practice, the majority of patients' main complaints are related to pain. Most patients assume that all pain inside the oral cavity originates from the teeth. One particular case is thermal sensitivity; sometimes patients are able to point to the site of pain although there is neither visible caries nor secondary caries on the dental radiograph. In such cases, gingival recession and dentin hypersensitivity are treated first to eliminate the pain. If these treatments fail, the pain may be misdiagnosed as pulpal inflammation and lead to unnecessary root canal treatment. Studies of pain during periodontal instrumentation of plaque-related periodontitis reveal that the majority of patients feel pain and discomfort during probing and scaling. This seems obvious, because an inflammation, whether acute or chronic, is associated with a lowered pain threshold. In contrast, the patient in this case report, who suffered from chronic gingivitis and thermal sensitivity, experienced relatively pain-free probing and scaling. A lowered pain threshold accompanied by a blunted pain perception upon periodontal instrumentation is proposed to be termed the periodontal pain paradox. The objective of this study is to reveal the possibility that certain factors in periodontal inflammation are involved in the periodontal pain paradox hypothesis. In a patient with thermal hypersensitivity who underwent probing and scaling, the thermal hypersensitivity rapidly disappeared after the relatively pain-free instrumentation. Based on the successful periodontal treatment, it is concluded that chronic gingivitis may modulate periodontal pain perception, which is termed the periodontal pain paradox.

  12. [Hepatitis C, interferon a and depression: main physiopathologic hypothesis].

    Science.gov (United States)

    Vignau, J; Karila, L; Costisella, O; Canva, V

    2005-01-01

Imputability of thymic (mood) disorders caused by IFNalpha during chronic hepatitis C treatment -- hepatitis C and depression -- Infection by the hepatitis C virus (HCV) is a major public health concern, since it affects 1.2% of the French population. Eighty percent of those contaminated by HCV carry the virus chronically, although they remain asymptomatic for many years. HCV infection is associated with psychiatric symptoms such as depression. Together with other factors (e.g. the severity of the hepatic condition), depression may induce significant impairment in quality of life. Conversely, some psychiatric conditions may increase the risk of HCV infection. In drug-addicted subjects using the intravenous route, the HCV contamination rate ranges from 74 to 100%. Compared with the general population, a higher HCV contamination rate has also been noticed in some other subgroups of subjects (patients with alcohol abuse or dependence, with alcohol-induced hepatic disease, and psychiatric inpatients). However, no valid explanation for this phenomenon has been established. Interferon alpha and depression -- Interferons are a variety of cytokines naturally produced by human tissues and have also been synthesized for therapeutic purposes (treatment of a variety of cancers and viral infections). Many psychobehavioural symptoms are observed under IFNalpha treatment. Among them, mood disorders are known to occur early after entry into treatment and to be within the reach of preventive measures. The reported frequency of depression during IFNalpha treatment ranges from 0 to 37%. This variation reflects either methodological biases (e.g. differences in psychiatric assessment) or the heterogeneity of the population of patients accepted into therapeutic protocols. Note that the adjunction of ribavirin to IFNalpha in therapeutic protocols has not brought any change in the frequency of depression. The causal relationship between IFNalpha administration and the occurrence of mood disorders has been

  13. Vehicle Detection Based on Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2016-04-01

Full Text Available In the past decade, vehicle detection has improved significantly. Using cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach using the Probability Hypothesis Density (PHD) filter is proposed. The proposed approach consists of two phases: a hypothesis generation phase to detect potential objects and a hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state of the art.

  14. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

Full Text Available Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote themselves to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two at a time). The same research question may be explored by more than one type of hypothesis test
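The definition of the P value quoted above (the probability, under the null hypothesis, of obtaining a result at least as extreme as the one observed) can be made concrete with a permutation test, which computes that probability directly by relabeling the data; the samples below are invented.

```python
import random
from statistics import mean

def perm_test_p(sample_a, sample_b, n_perm=10000, seed=1):
    """Two-sided permutation test: the P value is the fraction of random
    relabelings whose absolute mean difference is at least as extreme as
    the observed one."""
    rng = random.Random(seed)
    observed = abs(mean(sample_a) - mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:n_a]) - mean(pooled[n_a:]))
        if diff >= observed:
            hits += 1
    return hits / n_perm

control = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2]   # invented measurements
treated = [5.9, 6.1, 5.8, 6.0, 6.2, 5.7]
p = perm_test_p(control, treated)           # small p -> reject null at 0.05
```

Comparing a group with itself gives P = 1, since every relabeling is at least as "extreme" as the observed zero difference.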

  15. Public acceptance: A Japanese view

    International Nuclear Information System (INIS)

    1972-01-01

A number of factors enter into a consideration of the public acceptance of nuclear power: the public, nuclear power as an entity, and the interaction between the two. Interaction here implies the manner in which nuclear power is presented to the public: what is the public need for nuclear power, and what public risk is entailed in having it? The problem of public acceptance, in this sense, is time-dependent, for the public is changeable, just as nuclear power is subject to technical progress and 'social' improvement. Japan is geographically a very small country with a very high density of population. Any industrial activity and any large-scale employment of modern technology is apt to have a much greater impact on the physical, social and biological environment of individual Japanese people than similar activities would have in other countries. Industrial pollutants such as sulphur dioxide from power plants, oxides of nitrogen from automobile engine exhausts, organic mercury from chemical industries and so on affect society to a high degree, considered in terms of their concentration either per capita or per square kilometre. In the case of nuclear power, therefore, people are more concerned with radiological effects than with thermal pollution. No matter how one looks at it, the experience of Hiroshima and Nagasaki has made the average member of the Japanese public very sensitive to the problem of radiation safety. This is no longer a subject in which science or logic can persuade.

  16. Wind energy and social acceptability

    International Nuclear Information System (INIS)

    Feurtey, E.

    2008-01-01

    This document was prepared as part of a decentralized collaboration between Quebec and France to share knowledge regarding strategies and best practices in wind power development. It reviewed the social acceptance of Quebec's wind power industry, particularly at the municipal level. The wind industry is growing rapidly in Quebec, and this growth has generated many reactions ranging from positive to negative. The purpose of this joint effort was to describe decision making steps to developing a wind turbine array. The history of wind development in Quebec was discussed along with the various hardware components required in a wind turbine and different types of installations. The key element in implementing wind turbine arrays is to establish public acceptance of the project, followed by a good regulatory framework to define the roles and responsibilities of participants. The production of electricity from wind turbines constitutes a clean and renewable source of energy. Although it is associated with a reduction in greenhouse gas emissions, this form of energy can also have negative environmental impacts, including noise. The revenues generated by wind parks are important factors in the decision making process. Two case studies in Quebec were presented. refs., tabs., figs.

  17. Policy formulation of public acceptance

    International Nuclear Information System (INIS)

    Kasai, Akihiro

    1978-01-01

Since 1970, a new policy formulation for public acceptance has been set and applied to the siting of electric power generation. Its two main specific characteristics are planning and enforcement conducted by local public organizations for local economic build-up associated with plant siting, and the adjustment of the requirements for fishery. The background of this new public acceptance policy, its history and the actual problems concerning compensation for the siting of power generation plants are reviewed. One new proposal, recommended by the Policy and Science Laboratory to MITI in 1977, is explained. It is based on promoting the siting of power generation plants through public participation, placing the redevelopment of regional societies at its basis. Under this new proposal, the problems concerning the industrial structures in farm villages, fishing villages and areas of commerce and industry should be systematized and explained from the viewpoint of outside impact, the characteristics of local areas and the siting problems. Finally, the siting process and its effectiveness should be put in order. (Nakai, Y.)

  18. Heat exchanger staybolt acceptance criteria

    International Nuclear Information System (INIS)

    Lam, P.S.; Sindelar, R.L.; Barnes, D.M.

    1992-02-01

    The structural integrity demonstration of the primary coolant piping system includes evaluating the structural capacity of each component against a large break or equivalent Double-Ended Guillotine Break. A large break at the inlet or outlet heads of the heat exchangers would occur if the restraint members of the heads become inactive. The structural integrity of the heads is demonstrated by showing the redundant capacity of the staybolts to restrain the head at design conditions and under seismic loadings. The Savannah River Site heat exchanger head is attached to the tubesheet by 84 staybolts. Access to the staybolts is limited due to a welded seal cap over the staybolts. An ultrasonic testing (UT) inspection technique to provide an in-situ examination of the staybolts has recently been developed at SRS. Examination of the staybolts will be performed to ensure their service condition and configuration is within acceptance limits. An acceptance criteria methodology has been developed to disposition flaws reported in the staybolt inspections while ensuring adequate restraint capacity of the staybolts to maintain integrity of the heat exchanger heads against collapse. The methodology includes an approach for the baseline and periodic inspections of the staybolts. The heat exchanger head is analyzed with a three-dimensional finite element model. The restraint provided by the staybolts is evaluated for several postulated cases of inactive or missing staybolts. Evaluation of specific, inactive staybolt configurations based on the UT results can be performed with the finite element model and fracture methodology in this report

  19. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV over the full kinematical range. A considerable improvement of the drift chamber system was a precondition for attaining the necessary data rate in an acceptable time. The work focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections for Φ meson production close to threshold were evaluated. (orig.)

  20. Tissue damage thresholds during therapeutic electrical stimulation

    Science.gov (United States)

    Cogan, Stuart F.; Ludwig, Kip A.; Welle, Cristin G.; Takmakov, Pavel

    2016-04-01

    Objective. Recent initiatives in bioelectronic modulation of the nervous system by the NIH (SPARC), DARPA (ElectRx, SUBNETS) and the GlaxoSmithKline Bioelectronic Medicines effort are ushering in a new era of therapeutic electrical stimulation. These novel therapies are prompting a re-evaluation of established electrical thresholds for stimulation-induced tissue damage. Approach. In this review, we explore what is known and unknown in published literature regarding tissue damage from electrical stimulation. Main results. For macroelectrodes, the potential for tissue damage is often assessed by comparing the intensity of stimulation, characterized by the charge density and charge per phase of a stimulus pulse, with a damage threshold identified through histological evidence from in vivo experiments as described by the Shannon equation. While the Shannon equation has proved useful in assessing the likely occurrence of tissue damage, the analysis is limited by the experimental parameters of the original studies. Tissue damage is influenced by factors not explicitly incorporated into the Shannon equation, including pulse frequency, duty cycle, current density, and electrode size. Microelectrodes in particular do not follow the charge per phase and charge density co-dependence reflected in the Shannon equation. The relevance of these factors to tissue damage is framed in the context of available reports from modeling and in vivo studies. Significance. It is apparent that emerging applications, especially with microelectrodes, will require clinical charge densities that exceed traditional damage thresholds. Experimental data show that stimulation at higher charge densities can be achieved without causing tissue damage, suggesting that safety parameters for microelectrodes might be distinct from those defined for macroelectrodes. However, these increased charge densities may need to be justified by bench, non-clinical or clinical testing to provide evidence of device
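The Shannon equation referred to above relates charge per phase Q (µC) and charge density D (µC cm⁻² per phase) through log(D) = k − log(Q); stimulus parameters with k above roughly 1.85 have historically been associated with histologically observed damage for macroelectrodes. A minimal check of a pulse against that boundary (pulse values invented):

```python
from math import log10

def shannon_k(charge_per_phase_uC, charge_density_uC_cm2):
    """Shannon parameter k = log10(D) + log10(Q); pulses with k above
    ~1.85 have historically been associated with tissue damage (for
    macroelectrodes; the review notes microelectrodes deviate)."""
    return log10(charge_density_uC_cm2) + log10(charge_per_phase_uC)

# a hypothetical macroelectrode pulse: 0.5 uC/phase at 30 uC/cm2 per phase
k = shannon_k(0.5, 30.0)
safe = k < 1.85
```

As the abstract stresses, this boundary ignores pulse frequency, duty cycle, current density and electrode size, so it is a screening heuristic rather than a damage model.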

  1. Coevolutionary interactions between farmers and mafia induce host acceptance of avian brood parasites.

    Science.gov (United States)

    Abou Chakra, Maria; Hilbe, Christian; Traulsen, Arne

    2016-05-01

Brood parasites exploit their host in order to increase their own fitness. Typically, this results in an arms race between parasite trickery and host defence. Thus, it is puzzling to observe hosts that accept parasitism without any resistance. The 'mafia' hypothesis suggests that these hosts accept parasitism to avoid retaliation. Retaliation has been shown to evolve when the hosts condition their response to mafia parasites, who use depredation as a targeted response to rejection. However, it is unclear whether acceptance would also emerge when 'farming' parasites are present in the population. Farming parasites use depredation to synchronize timing with the host, destroying mature clutches to force the host to re-nest. Herein, we develop an evolutionary model to analyse the interaction between depredatory parasites and their hosts. We show that coevolutionary cycles between farmers and mafia can still induce host acceptance of brood parasites. However, this equilibrium is unstable, and in the long run the dynamics of this host-parasite interaction exhibit strong oscillations: when farmers are in the majority, accepters conditional on mafia (hosts that reject first and accept only after retaliation by the parasite) have a higher fitness than unconditional accepters (hosts that always accept parasitism). This leads to an increase in the mafia parasites' fitness, which in turn induces an optimal environment for accepter hosts.

  2. Threshold dose distributions for 5 major allergenic foods in children

    NARCIS (Netherlands)

    Blom, W.M.; Vlieg-Boerstra, B.J.; Kruizinga, A.G.; Heide, S. van der; Houben, G.F.; Dubois, A.E.J.

    2013-01-01

    Background: For most allergenic foods, insufficient threshold dose information within the population restricts the advice on levels of unintended allergenic foods which should trigger precautionary labeling on prepackaged foods. Objective: We wanted to derive threshold dose distributions for major

  4. Category 3 threshold quantities for hazard categorization of nonreactor facilities

    Energy Technology Data Exchange (ETDEWEB)

    Mandigo, R.L.

    1996-02-13

This document provides the information necessary to determine Hazard Category 3 threshold quantities for those isotopes of interest not listed in WHC-CM-4-46, Section 4, Table 1, 'Threshold Quantities.'

  5. Multiuser switched diversity scheduling systems with per-user threshold

    KAUST Repository

    Nam, Haewoon

    2010-05-01

    A multiuser switched diversity scheduling scheme with per-user feedback threshold is proposed and analyzed in this paper. The conventional multiuser switched diversity scheduling scheme uses a single feedback threshold for every user, where the threshold is a function of the average signal-to-noise ratios (SNRs) of the users as well as the number of users involved in the scheduling process. The proposed scheme, however, constructs a sequence of feedback thresholds instead of a single feedback threshold such that each user compares its channel quality with the corresponding feedback threshold in the sequence. Numerical and simulation results show that thanks to the flexibility of threshold selection, where a potentially different threshold can be used for each user, the proposed scheme provides a higher system capacity than that for the conventional scheme. © 2006 IEEE.
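The scan-and-compare scheduling described above is easy to simulate: probe users in sequence and transmit to the first one whose instantaneous SNR exceeds its own feedback threshold, falling back to the last probed user if none qualifies. The sketch below assumes Rayleigh fading (exponentially distributed SNR) and invented average SNRs and thresholds; it illustrates only the mechanism, not the paper's optimized threshold sequence.

```python
import random
from math import log2

def switched_capacity(avg_snrs, thresholds, n_trials=20000, seed=7):
    """Monte Carlo estimate of the average spectral efficiency of
    switched diversity scheduling with a per-user feedback threshold."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        chosen = None
        for g, th in zip(avg_snrs, thresholds):
            snr = rng.expovariate(1.0 / g)   # exponential SNR (Rayleigh fading)
            chosen = snr
            if snr >= th:                    # first user above its threshold wins
                break
        total += log2(1.0 + chosen)          # fallback: last probed user
    return total / n_trials

gains = [10.0, 4.0, 1.0]                              # heterogeneous average SNRs
common = switched_capacity(gains, [3.0] * 3)          # single threshold for all
per_user = switched_capacity(gains, [8.0, 3.0, 0.8])  # a per-user threshold sequence
```

With heterogeneous average SNRs, the per-user thresholds can be tuned individually, which is the flexibility the proposed scheme exploits.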

  6. Double threshold discriminator for timing measurements

    International Nuclear Information System (INIS)

    Frolov, A.R.; Oslopova, T.V.; Pestov, Yu.N.

    1995-01-01

The new type of discriminator is based on the idea of simultaneous time measurements at two different thresholds for each pulse. Instead of using two independent electronic TDC channels, this discriminator produces an output pulse whose timing takes into account the information from the two time measurements 'on-line'. The operation principle, analytical calculations and experimental results are presented. A time walk of the discriminator at the level of 10 ps has been obtained over the input pulse-height range of 0.2-1.5 V. (orig.)
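The two-threshold idea can be illustrated with a linear leading-edge model: from the crossing times t1, t2 at thresholds V1, V2, extrapolating back to V = 0 yields a start-time estimate that is independent of pulse amplitude, i.e. free of time walk. The pulse values below are hypothetical.

```python
def corrected_time(t1, t2, v1, v2):
    """Extrapolate the two threshold-crossing times on an (assumed linear)
    leading edge back to the zero-crossing, removing amplitude walk."""
    slope = (v2 - v1) / (t2 - t1)
    return t1 - v1 / slope

def crossing(A, t0, v_th):
    """Crossing time of threshold v_th for a linear edge v(t) = A*(t - t0)."""
    return t0 + v_th / A

t0 = 2.0  # ns, true pulse start time
# the same edge shape for a small (A=0.5) and a large (A=2.0) pulse,
# sampled at thresholds 0.05 V and 0.15 V:
small = corrected_time(crossing(0.5, t0, 0.05), crossing(0.5, t0, 0.15), 0.05, 0.15)
large = corrected_time(crossing(2.0, t0, 0.05), crossing(2.0, t0, 0.15), 0.05, 0.15)
```

A single-threshold measurement would report different times for the two pulses (2.1 ns vs 2.025 ns at the 0.05 V threshold), while the two-threshold extrapolation recovers t0 for both.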

  7. Concentrating lightguide for threshold Cherenkov counters

    International Nuclear Information System (INIS)

    Gavrishchuk, O.P.; Onuchin, V.A.; Semenov, V.K.; Suzdalev, V.I.

    1991-01-01

A method of manufacturing lightguides (Winston lenses) is proposed to increase the effective area of light collection on photodetectors (with the diameter of the detecting area from 45 to 120 mm) and to broaden the angular range of radiation detection in threshold Cherenkov counters. The concentrating lightguides, with height and diameter up to 300 mm, were pressure-formed from 3 to 5 mm thick plexiglass sheets. Dependences of the light reflection coefficient of the deposited lightguide on the wavelength (between 185 and 650 nm) are presented. 10 refs.; 4 figs

  8. Superstring threshold corrections to Yukawa couplings

    International Nuclear Information System (INIS)

    Antoniadis, I.; Taylor, T.R.

    1992-12-01

    A general method of computing string corrections to the Kaehler metric and Yukawa couplings is developed at the one-loop level for a general compactification of the heterotic superstring theory. It also provides a direct determination of the so-called Green-Schwarz term. The matter metric has an infrared divergent part which reproduces the field-theoretical anomalous dimensions, and a moduli-dependent part which gives rise to threshold corrections in the physical Yukawa couplings. Explicit expressions are derived for symmetric orbifold compactifications. (author). 20 refs

  9. Computational analysis of thresholds for magnetophosphenes

    International Nuclear Information System (INIS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-01-01

In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way of determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of the computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m−2 (−20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (−20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of

  10. Level reduction and the quantum threshold theorem

    Science.gov (United States)

    Aliferis, Panagiotis (Panos)

Computers have led society to the information age, revolutionizing central aspects of our lives from production and communication to education and entertainment. There exist, however, important problems which are intractable with the computers available today and, experience teaches us, will remain so even with the more advanced computers we can envision for tomorrow. Quantum computers promise speedups to some of these important but classically intractable problems. Simulating physical systems, a problem of interest in a diverse range of areas from testing physical theories to understanding chemical reactions, and solving number factoring, a problem at the basis of cryptographic protocols that are used widely today on the internet, are examples of applications for which quantum computers, when built, will offer a great advantage over what is possible with classical computer technology. The construction of a quantum computer of sufficient scale to solve interesting problems is, however, especially challenging. The reason for this is that, by its very nature, operating a quantum computer will require the coherent control of the quantum state of a very large number of particles. Fortunately, the theory of quantum error correction and fault-tolerant quantum computation gives us confidence that such quantum states can be created, can be stored in memory and can also be manipulated, provided the quantum computer can be isolated to a sufficient degree from sources of noise. One of the central results in the theory of fault-tolerant quantum computation, the quantum threshold theorem, shows that a noisy quantum computer can accurately and efficiently simulate any ideal quantum computation provided that noise is weakly correlated and its strength is below a critical value known as the quantum accuracy threshold. This thesis provides a simpler and more transparent non-inductive proof of this theorem based on the concept of level reduction.
This concept is also used in proving the

  11. 14-channel threshold gas Cerenkov counter

    International Nuclear Information System (INIS)

    Voichishin, M.N.; Devitsin, E.G.; Gus'kov, B.N.; Kapishin, M.N.; Zavertyaev, M.V.; Zinchenko, A.I.

    1985-01-01

A 14-channel threshold gas Cerenkov counter filled with Freon-12 at a pressure of 1 atm is described. The radiator length is 150 cm. The counter efficiency for protons with momenta of about 30 GeV/c or greater exceeds 98%. The counter is part of the system for identification of secondary charged particles of the BIS-2 spectrometer of the Institute of High Energy Physics. A diagram of the counter and its dimensions is shown. The counter consists of a light- and gas-proof housing, a set of focusing mirrors, and a photomultiplier system.

  12. Technology Thresholds for Microgravity: Status and Prospects

    Science.gov (United States)

    Noever, D. A.

    1996-01-01

The technological and economic thresholds for microgravity space research are estimated in materials science and biotechnology. In the 1990s, the improvement of materials processing has been identified as a national scientific priority, particularly for stimulating entrepreneurship. The substantial US investment at stake in these critical technologies spans six broad categories: aerospace, transportation, health care, information, energy, and the environment. Microgravity space research addresses key technologies in each area. The viability of selected space-related industries is critically evaluated, and a market-share philosophy is developed, namely that an incremental improvement in a large market's efficiency is a tangible reward from space-based research.

  13. Norm based Threshold Selection for Fault Detectors

    DEFF Research Database (Denmark)

    Rank, Mike Lind; Niemann, Henrik

    1998-01-01

The design of fault detectors for fault detection and isolation (FDI) in dynamic systems is considered from a norm-based point of view. An analysis of norm-based threshold selection is given based on different formulations of FDI problems. Both the nominal FDI problem and the uncertain FDI problem are considered. Based on this analysis, a performance index based on norms of the involved transfer functions is given. The performance index also allows us to optimize the structure of the fault detection filter directly.

  14. Statistical Algorithm for the Adaptation of Detection Thresholds

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

Many event detection mechanisms in spark ignition automotive engines are based on the comparison of the engine signals to detection threshold values. Different signal qualities for new and aged engines necessitate the development of an adaptation algorithm for the detection thresholds ... remains constant regardless of engine age and changing detection threshold values. This, in turn, guarantees the same event detection performance for new and aged engines/sensors. Adaptation of the engine knock detection threshold is given as an example. Publication date: 2008
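A common form of such threshold adaptation is to track the signal's mean and variance with exponential forgetting and place the detection threshold a fixed number of standard deviations above the mean, so the false-alarm rate stays roughly constant as the signal level drifts with engine age. This is a generic sketch of the idea with invented signal statistics, not the algorithm from the paper.

```python
import random

def adapt_threshold(samples, k=3.0, alpha=0.02):
    """Recursively track background mean and variance with exponential
    forgetting and return the running threshold mu + k*sigma."""
    mu, var = samples[0], 1.0
    thresholds = []
    for x in samples:
        mu = (1 - alpha) * mu + alpha * x
        var = (1 - alpha) * var + alpha * (x - mu) ** 2
        thresholds.append(mu + k * var ** 0.5)
    return thresholds

rng = random.Random(3)
new_engine = [rng.gauss(1.0, 0.10) for _ in range(2000)]   # invented baseline signal
aged_engine = [rng.gauss(1.6, 0.25) for _ in range(2000)]  # drifted level and noise
th_new = adapt_threshold(new_engine)[-1]
th_aged = adapt_threshold(aged_engine)[-1]
```

The adapted threshold rises with the aged engine's higher level and noise, keeping the exceedance probability of the background comparable for both.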

  15. Estimating the Threshold Level of Inflation for Thailand

    OpenAIRE

    JIRANYAKUL, Komain

    2017-01-01

Abstract. This paper analyzes the relationship between inflation and economic growth in Thailand using an annual dataset covering 1990 to 2015. The threshold model is estimated for different levels of the threshold inflation rate. The results suggest that the threshold level of inflation, above which inflation significantly slows growth, is estimated at 3 percent. The negative relationship between inflation and growth is apparent above this threshold level of inflation. In other words, the inflation rat...
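Threshold estimation of this kind is typically done by grid search: for each candidate threshold k, fit a kinked regression growth = b0 + b1·inflation + b2·max(inflation − k, 0) and keep the k with the smallest residual sum of squares. The data below are synthetic, with the kink planted at 3% purely for illustration; they are not the Thai series.

```python
import numpy as np

def estimate_threshold(inflation, growth, candidates):
    """Grid search over candidate kink points: fit a piecewise-linear
    regression at each k and keep the k minimizing the residual sum
    of squares."""
    best_k, best_rss = None, np.inf
    for k in candidates:
        X = np.column_stack([np.ones_like(inflation), inflation,
                             np.maximum(inflation - k, 0.0)])
        beta = np.linalg.lstsq(X, growth, rcond=None)[0]
        resid = growth - X @ beta
        rss = float(resid @ resid)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# synthetic data with a growth kink planted at 3% inflation
rng = np.random.default_rng(4)
infl = rng.uniform(0.0, 10.0, 300)
grw = 4.0 + 0.1 * infl - 0.8 * np.maximum(infl - 3.0, 0.0) + rng.normal(0.0, 0.3, 300)
k_hat = estimate_threshold(infl, grw, np.arange(1.0, 6.1, 0.5))
```

The slope on the kink term (b2) being significantly negative is what identifies inflation as growth-reducing above the threshold.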

  16. The phase shift hypothesis for the circadian component of winter depression.

    Science.gov (United States)

    Lewy, Alfred J; Rough, Jennifer N; Songer, Jeannine B; Mishra, Neelam; Yuhas, Krista; Emens, Jonathan S

    2007-01-01

    The finding that bright light can suppress melatonin production led to the study of two situations, indeed, models, of light deprivation: totally blind people and winter depressives. The leading hypothesis for winter depression (seasonal affective disorder, or SAD) is the phase shift hypothesis (PSH). The PSH was recently established in a study in which SAD patients were given low-dose melatonin in the afternoon/evening to cause phase advances, or in the morning to cause phase delays, or placebo. The prototypical phase-delayed patient, as well as the smaller subgroup of phase-advanced patients, optimally responded to melatonin given at the correct time. Symptom severity improved as circadian misalignment was corrected. Circadian misalignment is best measured as the time interval between the dim light melatonin onset (DLMO) and mid-sleep. Using the operational definition of the plasma DLMO as the interpolated time when melatonin levels continuously rise above the threshold of 10 pg/mL, the average interval between DLMO and mid-sleep in healthy controls is 6 hours, which is associated with optimal mood in SAD patients.
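
    The operational DLMO definition lends itself to simple linear interpolation between samples. A minimal sketch follows; the sampling times, melatonin values, and mid-sleep time below are purely illustrative:

```python
def dlmo(times_h, melatonin_pg_ml, threshold=10.0):
    """Interpolated time at which melatonin continuously rises above threshold."""
    for i in range(len(times_h) - 1):
        lo, hi = melatonin_pg_ml[i], melatonin_pg_ml[i + 1]
        # crossing segment: below threshold -> at/above it, all later samples staying above
        if lo < threshold <= hi and all(v >= threshold for v in melatonin_pg_ml[i + 1:]):
            frac = (threshold - lo) / (hi - lo)      # linear interpolation
            return times_h[i] + frac * (times_h[i + 1] - times_h[i])
    return None                                      # no sustained rise above threshold

# Illustrative evening samples (hours, pg/mL): DLMO falls between 21:00 and 22:00
onset = dlmo([19, 20, 21, 22, 23, 24], [2, 4, 8, 12, 25, 40])   # onset == 21.5
# circadian misalignment = deviation of the DLMO-to-mid-sleep interval from 6 h
misalignment = (3 + 24 - onset) - 6                  # mid-sleep assumed at 03:00
```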

  17. Personal values and pain tolerance: does a values intervention add to acceptance?

    Science.gov (United States)

    Branstetter-Rost, Ann; Cushing, Christopher; Douleh, Tanya

    2009-08-01

    Previous research suggests that acceptance is a promising alternative to distraction and control techniques in successfully coping with pain. Acceptance interventions based upon Acceptance and Commitment Therapy (ACT) have been shown to lead to greater tolerance of acute pain as well as increased adjustment and less disability among individuals with chronic pain. However, in these previous intervention studies, the ACT component of values has either not been included or not specifically evaluated. The current study compares the effects of an ACT-based acceptance intervention with and without the values component among individuals completing the cold-pressor task. Results indicate that inclusion of the values component (n = 34) of ACT leads to significantly greater pain tolerance than acceptance alone (n = 30). Consistent with previous research, both conditions were associated with greater pain tolerance than control (n = 35). Despite the difference in tolerance, pain threshold did not differ, and participants in the control condition provided lower ratings of pain severity. The findings from this study support the important role of values and values clarification in acceptance-based interventions such as ACT, and provide direction for clinicians working with individuals with chronic pain conditions. This article evaluates the additive effect of including a personalized-values exercise in an acceptance-based treatment for pain. Results indicate that values interventions make a significant contribution and improvement to acceptance interventions, which may be of interest to clinicians who provide psychological treatment to individuals with chronic pain.

  18. Using the Gaia Hypothesis to Synthesize an Introductory Biology Course.

    Science.gov (United States)

    Baker, Gail A.

    1993-01-01

    The Gaia Hypothesis emphasizes the interactions and feedback mechanisms between the living and nonliving processes that take place on Earth. Employing this concept in instruction can emphasize the interdisciplinary nature of science and give a planetary perspective of biology. (PR)

  19. Update on the "Dutch hypothesis" for chronic respiratory disease

    DEFF Research Database (Denmark)

    Vestbo, J; Prescott, E

    1998-01-01

    BACKGROUND: Many patients with chronic obstructive lung disease show increased airways responsiveness to histamine. We investigated the hypothesis that increased airways responsiveness predicts the development and remission of chronic respiratory symptoms. METHODS: We used data from 24-year follo...

  20. Incidence of allergy and atopic disorders and hygiene hypothesis.

    Czech Academy of Sciences Publication Activity Database

    Bencko, V.; Šíma, Petr

    2017-01-01

    Roč. 2, 6 March (2017), č. článku 1244. ISSN 2474-1663 Institutional support: RVO:61388971 Keywords : allergy disorders * atopic disorders * hygiene hypothesis Subject RIV: EE - Microbiology, Virology OBOR OECD: Microbiology

  1. Cross-system log file analysis for hypothesis testing

    NARCIS (Netherlands)

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  2. Crustal Plateaus as Ancient Large Impact Features: A Hypothesis

    Science.gov (United States)

    Hansen, V. L.

    2005-03-01

    An alternative hypothesis is presented in which crustal plateaus form through deformation and progressive crystallization of a huge lava pond resulting from massive melting of the mantle caused by bolide impact into ancient, thin Venusian lithosphere.

  3. The hypothesis of superluminal neutrinos: Comparing OPERA with other data

    DEFF Research Database (Denmark)

    Drago, A.; Masina, I.; Pagliara, G.

    2012-01-01

    The OPERA Collaboration reported evidence for muonic neutrinos traveling slightly faster than light in vacuum. While waiting further checks from the experimental community, here we aim at exploring some theoretical consequences of the hypothesis that muonic neutrinos are superluminal, considering...

  4. On a Misconception Involving Point Collocation and the Rayleigh Hypothesis

    DEFF Research Database (Denmark)

    Christiansen, Søren; Kleinman, Ralph E.

    1996-01-01

    It is shown that the Rayleigh hypothesis does not govern convergence of the simple point collocation approach to the numerical solution of scattering by a sinusoidal grating. A recently developed numerical technique, interval arithmetic, is employed to perform some decisive numerical experiments...

  5. Alzheimer's disease: the amyloid hypothesis and the Inverse Warburg effect

    KAUST Repository

    Demetrius, Lloyd A.

    2015-01-14

    Epidemiological and biochemical studies show that the sporadic forms of Alzheimer's disease (AD) are characterized by the following hallmarks: (a) An exponential increase with age; (b) Selective neuronal vulnerability; (c) Inverse cancer comorbidity. The present article appeals to these hallmarks to evaluate and contrast two competing models of AD: the amyloid hypothesis (a neuron-centric mechanism) and the Inverse Warburg hypothesis (a neuron-astrocytic mechanism). We show that these three hallmarks of AD conflict with the amyloid hypothesis, but are consistent with the Inverse Warburg hypothesis, a bioenergetic model which postulates that AD is the result of a cascade of three events: mitochondrial dysregulation, metabolic reprogramming (the Inverse Warburg effect), and natural selection. We also provide an explanation for the failures of the clinical trials based on amyloid immunization, and we propose a new class of therapeutic strategies consistent with the neuroenergetic selection model.

  6. Interpretative possibilities and limitations of Saxe/Goldstein hypothesis

    Directory of Open Access Journals (Sweden)

    André Strauss

    2012-08-01

    Full Text Available The Saxe/Goldstein Hypothesis was generated within the processual archaeology milieu and was therefore supposed to allow reconstructing the social dimensions of past populations by studying their mortuary practices. In its original form, it stated that the emergence of formal cemeteries would be the result of increased competition for vital resources. This would lead to the formation of corporate descent groups whose main objective was to monopolize access to vital resources. Later, a reformulated version of this hypothesis was developed, emphasizing the relationship between the presence of formal cemeteries and the mobility pattern of human groups. In this contribution we present a critical review of the formation of this hypothesis and discuss its limitations. Finally, two examples taken from the Brazilian archaeological record are used to show how the lack of a critical posture in relation to the Saxe/Goldstein Hypothesis may lead to fragile interpretations of the archaeological record.

  7. Acceptance of Others, Feeling of Being Accepted and Striving for Being Accepted Among the Representatives of Different Kinds of Occupations

    Directory of Open Access Journals (Sweden)

    Gergana Stanoeva

    2012-05-01

    Full Text Available This paper deals with an important issue related to human attitudes and needs in interpersonal and professional contexts. The theoretical part deals with several psychological components of self-esteem and esteem of others: acceptance of others, the feeling of being accepted, and the need for approval. Some gender differences in manifestations of acceptance and the feeling of being accepted at the workplace are discussed. This article presents empirical data on the degree of acceptance of others, the feeling of being accepted, and the striving for being accepted among representatives of helping, pedagogical, administrative and economic occupations, as well as non-qualified workers. The goals of the study were to reveal the interdependency between these constructs and to find significant differences between the representatives of the four groups of occupations. The methods of the first study were W. Fey's scales "Acceptance of others" and "How do I feel accepted by others". The method of the second study was the Crowne and Marlowe Social Desirability Scale. The results indicated some significant differences in acceptance of others and the feeling of being accepted between the non-qualified workers and the representatives of helping, administrative and economic occupations. There was no significant difference in striving for being accepted between the four occupational groups.

  8. Impacts of DEM resolution and area threshold value uncertainty on ...

    African Journals Online (AJOL)

    ... that DEM resolution influences the selected flow accumulation threshold value; the suitable flow accumulation threshold value increases as the DEM resolution increases, and shows greater variability for basins with lower drainage densities. The link between drainage area threshold value and stream network extraction ...

  9. Static and Transient Cavitation Threshold Measurements for Mercury

    Energy Technology Data Exchange (ETDEWEB)

    Moraga, F.; Taleyarkhan, R.P.

    1999-11-14

    Transient and static cavitation thresholds for mercury as a function of the cover gas (helium or air), and pressure are reported. Both static and transient cavitation onset pressure thresholds increase linearly with cover gas pressure. Additionally, the cavitation thresholds as a function of dissolved gases were also measured and are reported.

  10. Teratogenicity and the Threshold of Toxicological Concern concept

    NARCIS (Netherlands)

    Schothorst F van; Piersma AH; TOX

    2003-01-01

    The Threshold of Toxicological Concern (TTC) is a principle that refers to the possibility of establishing a human exposure threshold value for all chemicals below which there is no significant risk to human health. The threshold value is primarily based on carcinogenesis data. For

  11. Is action potential threshold lowest in the axon?

    NARCIS (Netherlands)

    Kole, Maarten H. P.; Stuart, Greg J.

    2008-01-01

    Action potential threshold is thought to be lowest in the axon, but when measured using conventional techniques, we found that action potential voltage threshold of rat cortical pyramidal neurons was higher in the axon than at other neuronal locations. In contrast, both current threshold and voltage

  12. Data compression by a decreasing slope-threshold test

    Science.gov (United States)

    Kleinrock, L.

    1973-01-01

    Resolution can be obtained at large compression ratios with a method for selecting data points for transmission by telemetry in a television compressed-data system. The slope of the raw data stream is tested and compared to a symmetric pair of decreasing thresholds. When either threshold is exceeded, the data are sampled and transmitted; the thresholds are reset, and the test begins again.
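
    The decreasing slope-threshold test described in this record can be sketched as follows; the starting threshold and decay factor are illustrative choices, not values from the paper:

```python
def slope_threshold_compress(samples, t0=1.0, decay=0.8):
    """Select sample indices for transmission: keep a point when the slope
    from the last transmitted sample exceeds a symmetric pair of thresholds
    (+/- threshold) that decrease by `decay` each step without a transmission."""
    kept = [0]                       # always transmit the first sample
    threshold = t0
    for i in range(1, len(samples)):
        slope = (samples[i] - samples[kept[-1]]) / (i - kept[-1])
        if slope > threshold or slope < -threshold:   # symmetric slope test
            kept.append(i)           # threshold exceeded: sample and transmit
            threshold = t0           # reset the thresholds
        else:
            threshold *= decay       # thresholds decrease while waiting
    return kept
```

    Because the thresholds shrink between transmissions, even slowly varying data are eventually resampled, which is how the method retains resolution at large compression ratios.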

  13. fm threshold and methods of limiting its effect on performance

    African Journals Online (AJOL)

    SAMSON BRIGHT ONYEDIKACHI

    Performance evaluation shows that the threshold is the existence of large noise in the output of the system, which makes signal detection ... carried out on how to lower it using special circuits. The threshold effect in an FM system affects the ..... amplifier; reduction of noise at this point will lower the threshold. It then implies that.

  14. PAGs - Public perception and acceptance

    International Nuclear Information System (INIS)

    Quillin, Robert M.

    1989-01-01

    Full text: While Protective Action Guides or PAGs have been a part of the lexicon of the radiation protection field for several decades, the concept of accepting higher levels of risk under certain situations has not received adequate scrutiny by the general public, the media or elected officials. Consequently, there is a question as to how implementation of PAGs would be perceived by the above groups in the event that such implementation became necessary. A personal case in point involves the response of an executive in the food industry. When the concept of selling a food product meeting the PAGs was explained, his response was, 'we won't sell a contaminated product, we would dump the unprocessed raw food. Our industry image is that of a natural unadulterated food'. While this may be an isolated view, there is a need to determine what the perception is and, consequently, what the response would be if PAGs were implemented today. If the response were negative from any one of the three groups listed previously, then there is an obvious need for a program to assure receptiveness by those concerned. However, this may face formidable obstacles. This is because the terms radiation and radioactive have gained generally negative word associations, e.g. 'deadly' radiation and radioactive 'desert'. The former term was recently heard in a taped presentation at a Museum of Natural History on a completely unrelated subject. The latter term was part of a recent article heading in the Wall Street Journal. Incidentally, the article was discussing television. Thus, beyond the scientific issues of setting PAGs and the administrative and procedural issues of implementing PAGs, there is the issue of society's understanding and acceptance of PAGs. Particularly, how can such understanding and acceptance be achieved in a situation which is associated with an actual or perceived radiation emergency? These are not questions that radiation or agricultural scientists can answer alone. These are

  15. Hypothesis Testing of Parameters for Ordinary Linear Circular Regression

    Directory of Open Access Journals (Sweden)

    Abdul Ghapor Hussin

    2006-07-01

    Full Text Available This paper presents hypothesis testing of the parameters of an ordinary linear circular regression model, assuming the circular random error follows a von Mises distribution. The main interest is in testing the intercept and slope parameters of the regression line. As an illustration, this hypothesis testing is used to analyze wind and wave direction data recorded by two different techniques: an HF radar system and an anchored wave buoy.

  16. Threshold photoelectron spectroscopy of the imidogen radical

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Gustavo A., E-mail: gustavo.garcia@synchrotron-soleil.fr [Synchrotron SOLEIL, L’Orme des Merisiers, St. Aubin, BP 48, 91192 Gif sur Yvette (France); Gans, Bérenger [Institut des Sciences Moléculaires d’Orsay, Univ Paris-Sud, CNRS, Bât 210, Univ Paris-Sud, 91405 Orsay Cedex (France); Tang, Xiaofeng [Synchrotron SOLEIL, L’Orme des Merisiers, St. Aubin, BP 48, 91192 Gif sur Yvette (France); Ward, Michael; Batut, Sébastien [PC2A, Université de Lille 1, UMR CNRS-USTL 8522, Cité Scientifique Bât. C11, F-59655 Villeneuve d’Ascq (France); Nahon, Laurent [Synchrotron SOLEIL, L’Orme des Merisiers, St. Aubin, BP 48, 91192 Gif sur Yvette (France); Fittschen, Christa [PC2A, Université de Lille 1, UMR CNRS-USTL 8522, Cité Scientifique Bât. C11, F-59655 Villeneuve d’Ascq (France); Loison, Jean-Christophe [ISM, Université de Bordeaux, CNRS, 351 cours de la Libération, 33405 Talence Cedex (France)

    2015-08-15

    We present the threshold photoelectron spectroscopy of the imidogen radical (NH) recorded in the photon energy region up to 1 eV above its first ionization threshold. The radical was produced by reaction of NH{sub 3} and F in a microwave discharge flow-tube and photoionized using vacuum ultraviolet (VUV) synchrotron radiation. A double imaging coincidence spectrometer was used to record mass-selected spectra and avoid contributions from the byproducts present in the reactor and background gas. The energy region includes the ground X{sup +2}Π and first electronically excited a{sup +4}Σ{sup −} states of NH{sup +}. Strong adiabatic transitions and weak vibrational progressions up to v{sup +} = 2 are observed for both electronic states. The rotational profile seen in the origin band has been modeled using existing neutral and cationic spectroscopic constants leading to a precise determination of the adiabatic ionization energy at 13.480 ± 0.002 eV.

  17. Granular motions near the threshold of entrainment

    Science.gov (United States)

    Valyrakis, Manousos; Alexakis, Athanasios-Theodosios

    2016-04-01

    Our society is continuously impacted by significant weather events, many of which result in catastrophes that interrupt our normal way of life. In the context of climate change and increasing urbanisation, these "extreme" hydrologic events are intensifying in both magnitude and frequency, inducing costs of the order of billions of pounds. The vast majority of such costs and impacts (even more so in developed societies) are due to water-related catastrophes such as the geomorphic action of flowing water (including scouring of critical infrastructure and bed and bank destabilisation) and flooding. New tools and radically novel concepts are needed to enable our society to become more resilient. This presentation emphasises the utility of inertial sensors in gaining new insights into the interaction of flow hydrodynamics with the granular surface at the particle scale and for near-threshold flow conditions. In particular, new designs of the "smart-sphere" device are discussed, with focus on purpose-specific sets of flume experiments designed to identify the exact response of a particle resting on the bed surface for various below-, near- and above-threshold flow conditions. New sets of measurements are presented for particle entrainment from a Lagrangian viewpoint. Beyond finding direct application in addressing real-world challenges in the water sector, it is shown that such novel sensor systems can also help the research community (both experimentalists and computational modellers) gain better insight into the underlying processes governing granular dynamics.

  18. Phonation threshold flow in elongated excised larynges.

    Science.gov (United States)

    Jiang, Jack J; Regner, Michael F; Tao, Chao; Pauls, Steven

    2008-07-01

    This study proposes the use of a new parameter of vocal aerodynamics, phonation threshold flow (PTF). The sensitivities of the PTF and the phonation threshold pressure (PTP) were quantitatively compared to the percent of vocal fold elongation from physiologic length. Ten excised canine larynges were mounted on a bench apparatus capable of controlling vocal fold elongation. Subglottal airflow was gradually increased until the onset of phonation. Elongation of the vocal folds was varied from +0% (physiologic length) to +15%, and the PTF and PTP were measured. The mean PTFs at physiologic vocal fold length ranged from 101 to 217 mL/s. No statistically significant relationship was found to exist between the size of the larynx and the measured PTF values (p = .404). The average percent change of PTF compared to the magnitude of elongation was found to be statistically significant (p < .001). The data indicated that the PTF was proportional to the percent of vocal fold elongation. The PTF was positively correlated with vocal fold elongation and the PTP for small magnitudes of elongation. The results suggest that the PTF may be indicative of the biomechanical properties of the vocal folds, thus providing a possibly valuable tool in the clinical evaluation of laryngeal function.

  19. Multiratio fusion change detection with adaptive thresholding

    Science.gov (United States)

    Hytla, Patrick C.; Balster, Eric J.; Vasquez, Juan R.; Neuroth, Robert M.

    2017-04-01

    A ratio-based change detection method known as multiratio fusion (MRF) is proposed and tested. The MRF framework builds on other change detection components proposed in this work: dual ratio (DR) and multiratio (MR). The DR method involves two ratios coupled with adaptive thresholds to maximize detected changes and minimize false alarms. The use of two ratios is shown to outperform the single ratio case when the means of the image pairs are not equal. MR change detection builds on the DR method by including negative imagery to produce four total ratios with adaptive thresholds. Inclusion of negative imagery is shown to improve detection sensitivity and to boost detection performance in certain target and background cases. MRF further expands this concept by fusing together the ratio outputs using a routine in which detections must be verified by two or more ratios to be classified as a true changed pixel. The proposed method is tested with synthetically generated test imagery and real datasets with results compared to other methods found in the literature. DR is shown to significantly outperform the standard single ratio method. MRF produces excellent change detection results that exhibit up to a 22% performance improvement over other methods from the literature at low false-alarm rates.
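
    As an illustration of the dual-ratio (DR) idea, here is a minimal sketch using a mean-plus-k-sigma adaptive threshold on each ratio; the paper's actual adaptation rule and MRF fusion logic are more elaborate, and this scheme is an assumption for illustration:

```python
import numpy as np

def dual_ratio_changes(img1, img2, k=3.0, eps=1e-6):
    """Dual-ratio (DR) change detection sketch: form both ratios so changes
    are detected whether a pixel brightens or darkens, and set each
    threshold adaptively from that ratio's own statistics (mean + k*std is
    an assumed adaptation rule; the paper's scheme may differ)."""
    a = img1.astype(float) + eps          # eps avoids division by zero
    b = img2.astype(float) + eps
    r1, r2 = a / b, b / a                 # the two complementary ratios
    t1 = r1.mean() + k * r1.std()         # adaptive threshold for ratio 1
    t2 = r2.mean() + k * r2.std()         # adaptive threshold for ratio 2
    return (r1 > t1) | (r2 > t2)          # changed where either ratio exceeds its threshold
```

    Using both a/b and b/a is what lets the method catch changes in either direction when the image-pair means are unequal, the case in which the abstract reports DR outperforming a single ratio.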

  20. Noisy-threshold control of cell death

    Directory of Open Access Journals (Sweden)

    Vilar Jose MG

    2010-11-01

    Full Text Available Abstract. Background: Cellular responses to death-promoting stimuli typically proceed through a differentiated multistage process, involving a lag phase, extensive death, and potential adaptation. Deregulation of this chain of events is at the root of many diseases. Improper adaptation is particularly important because it allows cell sub-populations to survive even in the continuous presence of death conditions, which results, among others, in the eventual failure of many targeted anticancer therapies. Results: Here, I show that these typical responses arise naturally from the interplay of intracellular variability with a threshold-based control mechanism that detects cellular changes in addition to just the cellular state itself. Implementation of this mechanism in a quantitative model for T-cell apoptosis, a prototypical example of programmed cell death, captures with exceptional accuracy experimental observations for different expression levels of the oncogene Bcl-xL and directly links adaptation with noise in an ATP threshold below which cells die. Conclusions: These results indicate that oncogenes like Bcl-xL, besides regulating absolute death values, can have a novel role as active controllers of cell-cell variability and the extent of adaptation.