WorldWideScience

Sample records for high detection probability

  1. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved with respect to depth resolution and the reliability of the measured spectra. Good statistics are obtained even at low ion fluences by utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and a 40 MeV ¹⁹⁷Au beam. In this way the elemental depth profiles are not substantially altered during analysis, even when distributions with area densities below 1×10¹⁴ atoms/cm² are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. This allows the measured spectra to be deconvoluted using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on ¹³C depth profiles measured in ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4×10¹⁵ atoms/cm².

  2. Detection probability of Campylobacter

    NARCIS (Netherlands)

    Evers, E.G.; Post, J.; Putirulan, F.F.; Wal, van der F.J.

    2010-01-01

    A rapid presence/absence test for Campylobacter in chicken faeces is being evaluated to support the scheduling of highly contaminated broiler flocks as a measure to reduce public health risks [Nauta, M. J., & Havelaar, A. H. (2008). Risk-based standards for Campylobacter in the broiler meat

  3. Probability of detection - Comparative study of computed and film radiography for high-energy applications

    International Nuclear Information System (INIS)

    Venkatachalam, R.; Venugopal, M.; Prasad, T.

    2007-01-01

    Full text of publication follows: The suitability of computed radiography (CR) with Ir-192, Co-60 and X-rays of up to 9 MeV for weld inspections is of importance to many heavy engineering and aerospace industries. CR is preferred because of shorter exposure and processing times compared with film-based radiography, and digital images offer further advantages such as image enhancement, quantitative measurements and easier archival. This paper describes systematic experimental approaches and image quality metrics for comparing the imaging performance of CR with film-based radiography. Experiments were designed using six-sigma methodology to validate the performance of CR for steel thicknesses up to 160 mm with Ir-192, Co-60 and X-ray energies varying from 100 kV up to 9 MeV. Weld specimens with defects such as lack of fusion, lack of penetration, cracks, concavity, and porosity were studied to evaluate the radiographic sensitivity and imaging performance of the system. Attempts were also made to quantify the probability of detection using specimens with artificial and natural defects under various experimental conditions, and the results were compared with film-based systems. (authors)

  4. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996) 50. Ledoux

  5. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software provide the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable probability of passing the demonstration (PPD) and an acceptable probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
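
    As an illustration of the binomial arithmetic behind such demonstrations, the sketch below computes the probability of passing a 29-flaw demonstration as a function of the true POD. The 29-of-29 pass criterion and the function names are our assumptions for illustration, not details taken from the record.

    ```python
    from scipy.stats import binom

    def prob_pass_demo(true_pod: float, n_flaws: int = 29, max_misses: int = 0) -> float:
        """Probability that a procedure passes a point-estimate demonstration,
        i.e. misses at most max_misses of n_flaws seeded flaws."""
        # The number of misses is binomial(n_flaws, 1 - true_pod).
        return binom.cdf(max_misses, n_flaws, 1.0 - true_pod)

    # A 29/29 demonstration: a procedure with a true POD of 0.90 at the
    # demonstrated flaw size passes with probability 0.90**29 ~= 0.047,
    # which is why 29/29 is read as demonstrating 0.90 POD at ~95% confidence.
    for pod in (0.90, 0.95, 0.99):
        print(f"true POD {pod:.2f}: P(pass 29/29) = {prob_pass_demo(pod):.3f}")
    ```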

  6. Computer simulation of probability of detection

    International Nuclear Information System (INIS)

    Fertig, K.W.; Richardson, J.M.

    1983-01-01

    This paper describes an integrated model for assessing the performance of a given ultrasonic inspection system for detecting internal flaws, where the performance of such a system is measured by the probability of detection. The effects of real part geometries on sound propagation are accounted for, and the noise spectra due to various noise mechanisms are measured. An ultrasonic inspection simulation computer code has been developed that can detect flaws with attributes ranging over an extensive class. The detection decision is treated as a binary decision based on one received waveform obtained in a pulse-echo or pitch-catch setup. This study focuses on the detectability of flaws using an amplitude-thresholding criterion. Some preliminary results on the detectability of radially oriented cracks in IN-100 for bore-like geometries are given
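
    A minimal sketch of an amplitude-thresholding detection decision of the kind described, assuming a linear flaw-size response in additive Gaussian noise; the signal model and all parameter values are illustrative, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulated_pod(flaw_size_mm, n_trials=10_000, threshold=1.0,
                      noise_sigma=0.3, gain=2.0):
        """Estimate POD by thresholding simulated peak amplitudes.

        Assumes the echo amplitude scales linearly with flaw size and is
        corrupted by additive Gaussian noise, a crude stand-in for the
        physics-based waveform and noise models used in the paper.
        """
        signal = gain * flaw_size_mm                 # idealized flaw response
        amplitude = signal + noise_sigma * rng.standard_normal(n_trials)
        return np.mean(amplitude > threshold)        # binary detection decision

    for a in (0.2, 0.4, 0.6, 0.8):
        print(f"flaw size {a:.1f} mm: POD ~ {simulated_pod(a):.3f}")
    ```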

  7. Diagnostic accuracy of the MMSE in detecting probable and possible Alzheimer's disease in ethnically diverse highly educated individuals: an analysis of the NACC database.

    Science.gov (United States)

    Spering, Cynthia C; Hobson, Valerie; Lucas, John A; Menon, Chloe V; Hall, James R; O'Bryant, Sid E

    2012-08-01

    To validate and extend the findings on a raised Mini-Mental State Examination cut score reported by O'Bryant and colleagues (O'Bryant SE, Humphreys JD, Smith GE, et al. Detecting dementia with the mini-mental state examination in highly educated individuals. Arch Neurol. 2008;65(7):963-967) for detecting cognitive dysfunction in a bilingual sample of highly educated, ethnically diverse individuals. Archival data were reviewed from participants enrolled in the National Alzheimer's Coordinating Center minimum data set. Data on 7,093 individuals with 16 or more years of education were analyzed, including 2,337 cases with probable and possible Alzheimer's disease, 1,418 mild cognitive impairment patients, and 3,088 nondemented controls. Ethnic composition was characterized as follows: 6,296 Caucasians, 581 African Americans, 4 American Indians or Alaska natives, 2 native Hawaiians or Pacific Islanders, 149 Asians, 43 "Other," and 18 of unknown origin. Diagnostic accuracy estimates (sensitivity, specificity, and likelihood ratio) of Mini-Mental State Examination cut scores in detecting probable and possible Alzheimer's disease were examined. A standard Mini-Mental State Examination cut score of 24 (≤23) yielded a sensitivity of 0.58 and a specificity of 0.98 in detecting probable and possible Alzheimer's disease across ethnicities. A cut score of 27 (≤26) resulted in an improved balance of sensitivity and specificity (0.79 and 0.90, respectively). In the cognitively impaired group (mild cognitive impairment and probable and possible Alzheimer's disease), the standard cut score yielded a sensitivity of 0.38 and a specificity of 1.00, while raising the cut score to 27 resulted in an improved balance of 0.59 and 0.96 of sensitivity and specificity, respectively. These findings cross-validate our previous work and extend it to an ethnically diverse cohort. A higher cut score is needed to maximize diagnostic accuracy of the Mini-Mental State Examination in individuals

  8. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate whether factors such as age, gender, years with epilepsy, etiology, seizure site of origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on several of these factors. The probability of detecting clinical seizures using heart rate (>0.8 in the majority of subjects) is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males, and is unrelated to the hemisphere of origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  9. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
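
    The scoring idea can be illustrated with uniform order statistics: under the true CDF F, the values u_(k) = F(x_(k)) behave like uniform order statistics with known means k/(n+1) and Beta variances, so large standardized deviations flag a poor trial CDF. The sketch below uses a max-|z| statistic as a simplified stand-in for the paper's scoring function.

    ```python
    import numpy as np
    from scipy.stats import norm, expon

    def order_statistic_score(sorted_u: np.ndarray) -> float:
        """Score a trial CDF by comparing u_(k) = F(x_(k)) against the
        expected uniform order statistics E[U_(k)] = k/(n+1)."""
        n = len(sorted_u)
        k = np.arange(1, n + 1)
        mean = k / (n + 1.0)
        # U_(k) ~ Beta(k, n+1-k), whose variance is given below.
        var = k * (n + 1 - k) / ((n + 1.0) ** 2 * (n + 2.0))
        z = (sorted_u - mean) / np.sqrt(var)
        return float(np.max(np.abs(z)))      # worst standardized residual

    # Score a deliberately wrong (exponential) CDF against normal data.
    x = np.sort(norm.rvs(size=500, random_state=1))
    print(order_statistic_score(expon.cdf(x)))   # large score: poor fit
    print(order_statistic_score(norm.cdf(x)))    # small score: good fit
    ```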

  10. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our

  11. Accounting Fraud: an estimation of detection probability

    Directory of Open Access Journals (Sweden)

    Artur Filipe Ewald Wuerges

    2014-12-01

    Full Text Available Financial statement fraud (FSF) is costly for investors and can damage the credibility of the audit profession. To prevent and detect fraud, it is helpful to know its causes. The binary choice models (e.g., logit and probit) commonly used in the extant literature, however, fail to account for undetected cases of fraud and thus yield unreliable hypothesis tests. Using a sample of 118 companies accused of fraud by the Securities and Exchange Commission (SEC), we estimated a logit model that corrects for the problems arising from undetected frauds in U.S. companies. To avoid multicollinearity problems, we extracted seven factors from 28 variables using the principal factors method. Our results indicate that only 1.43 percent of the instances of FSF were publicized by the SEC. Of the six significant variables included in the traditional, uncorrected logit model, three were found to be non-significant in the corrected model. The likelihood of FSF is 5.12 times higher when the firm's auditor issues an adverse or qualified report.
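
    A minimal sketch of the correction idea: assume fraud occurs with probability sigmoid(X @ beta) and, independently, an occurring fraud is detected (and hence observed) with a constant probability d, so an observed accusation has probability sigmoid(X @ beta) * d. The constant-d assumption and all names are ours; the paper's correction is richer, and identifying d from functional form alone is statistically weak.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit

    def neg_log_lik(params, X, y):
        """Logit corrected for undetected fraud: y=1 only if fraud occurs
        (prob. sigmoid(X @ beta)) AND is detected (prob. d)."""
        beta, d = params[:-1], expit(params[-1])    # keeps d in (0, 1)
        p_obs = expit(X @ beta) * d
        eps = 1e-12                                 # numerical guard
        return -np.sum(y * np.log(p_obs + eps)
                       + (1 - y) * np.log(1.0 - p_obs + eps))

    # Simulated data: a true detection probability of 0.3 hides most frauds.
    rng = np.random.default_rng(7)
    X = np.column_stack([np.ones(5000), rng.standard_normal(5000)])
    fraud = rng.random(5000) < expit(X @ np.array([-2.0, 1.0]))
    y = (fraud & (rng.random(5000) < 0.3)).astype(float)

    res = minimize(neg_log_lik, x0=np.zeros(3), args=(X, y), method="BFGS")
    print("beta:", res.x[:2], "estimated detection prob:", expit(res.x[2]))
    ```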

  12. Review of Literature for Model Assisted Probability of Detection

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Susan L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lareau, John P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Anderson, Michael T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-09-30

    This is a draft technical letter report for the NRC documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, aimed at improving field NDE performance estimates.

  13. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-01-01

    of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs

  14. An empirical probability model of detecting species at low densities.

    Science.gov (United States)

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
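
    A hedged sketch of the kind of detection-curve fit described, relating the probability of detecting at least one target to sampling effort and target density via logistic regression; the simulated data, covariates, and parameter values are ours, not the study's.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)

    # Simulate presence/absence surveys: detection of at least one target
    # becomes more likely as search effort and target density increase.
    n = 400
    effort = rng.choice([1, 2, 4, 8, 16], size=n).astype(float)   # quadrats searched
    density = rng.choice([0.25, 0.5, 1.0, 2.0], size=n)           # targets per m^2
    true_p = 1.0 - np.exp(-0.15 * effort * density)               # each quadrat can miss
    detected = (rng.random(n) < true_p).astype(float)

    # Fit a logistic detection curve in log-effort and log-density.
    X = sm.add_constant(np.column_stack([np.log(effort), np.log(density)]))
    fit = sm.GLM(detected, X, family=sm.families.Binomial()).fit()
    print(fit.summary().tables[1])
    ```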

  15. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: • Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. • The sensitivities are computed with negligible cost using Monte Carlo sampling. • The change in the POF due to a change in the POD curve parameters can be easily estimated.
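
    A minimal sketch of the sample-reuse idea, assuming a lognormal POD curve and a toy crack-growth and failure model (both assumptions are ours): the same Monte Carlo samples that estimate the POF also yield its derivative with respect to a POD parameter, with no additional simulation.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    # Lognormal POD curve POD(a) = Phi((ln a - mu) / sigma); parameters
    # are illustrative.
    mu, sigma = np.log(1.0), 0.4

    def pod(a):
        return norm.cdf((np.log(a) - mu) / sigma)

    # Toy fatigue model: initial crack sizes grow to a size at inspection
    # and a final size; a sample fails only if its final size is critical
    # AND the crack escaped detection at the inspection.
    N = 200_000
    a0 = rng.lognormal(mean=np.log(0.2), sigma=0.5, size=N)
    a_insp, a_final = 2.0 * a0, 6.0 * a0
    fails = a_final > 2.5
    escape = 1.0 - pod(a_insp)                 # per-sample non-detection prob.
    pof = np.mean(fails * escape)

    # Sensitivity w.r.t. mu, reusing the SAME samples:
    # d(1 - POD(a))/dmu = phi((ln a - mu)/sigma) / sigma
    dpof_dmu = np.mean(fails * norm.pdf((np.log(a_insp) - mu) / sigma) / sigma)
    print(f"POF = {pof:.4e}, dPOF/dmu = {dpof_dmu:.4e}")
    ```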

  16. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  17. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: • Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. • The sensitivities are computed with negligible cost using Monte Carlo sampling. • The change in the POF due to a change in the POD curve parameters can be easily estimated.

  18. Factors Affecting Detection Probability of Acoustic Tags in Coral Reefs

    KAUST Repository

    Bermudez, Edgar F.

    2012-05-01

    Acoustic telemetry is an important tool for studying the movement patterns, behaviour, and site fidelity of marine organisms; however, its application is challenged in coral reef environments, where complex topography and intense environmental noise interfere with acoustic signals and which have received less study. It is therefore particularly critical in coral reef telemetry studies to first conduct a long-term range test, a tool that provides information on the variability and periodicity of the transmitter detection range and the detection probability. A one-month range test of a coded telemetric system was conducted prior to a large-scale tagging project investigating the movement of approximately 400 fishes from 30 species on offshore coral reefs in the central Red Sea. During this range test we determined the effect of the following factors on transmitter detection efficiency: distance from receiver, time of day, depth, wind, current, moon-phase and temperature. The experiment showed that biological noise is likely to be responsible for a diel pattern of, on average, twice as many detections during the day as during the night. Biological noise appears to be the most important noise source in coral reefs, overwhelming the effect of wind-driven noise, which is important in other studies. Detection probability is also heavily influenced by the location of the acoustic sensor within the reef structure. Understanding the effect of environmental factors on transmitter detection probability allowed us to design a more effective receiver array for the large-scale tagging study.

  19. Structural health monitoring and probability of detection estimation

    Science.gov (United States)

    Forsyth, David S.

    2016-02-01

    Structural health monitoring (SHM) methods are often based on nondestructive testing (NDT) sensors and are often proposed as replacements for NDT to lower cost and/or improve reliability. In order to take advantage of SHM for life cycle management, it is necessary to determine the Probability of Detection (POD) of the SHM system just as for traditional NDT to ensure that the required level of safety is maintained. Many different possibilities exist for SHM systems, but one of the attractive features of SHM versus NDT is the ability to take measurements very simply after the SHM system is installed. Using a simple statistical model of POD, some authors have proposed that very high rates of SHM system data sampling can result in high effective POD even in situations where an individual test has low POD. In this paper, we discuss the theoretical basis for determining the effect of repeated inspections, and examine data from SHM experiments against this framework to show how the effective POD from multiple tests can be estimated.
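
    The theoretical point reduces to one line: n independent tests, each with single-test POD p, give an effective POD of 1 - (1 - p)^n, which is exactly the optimistic limit discussed above, and correlated misses break it. The linear mixture used below to interpolate between the independent and fully correlated extremes is our simplification, not a model from the paper.

    ```python
    def effective_pod(p_single: float, n_tests: int, rho: float = 0.0) -> float:
        """Effective POD after n repeated tests.

        rho = 0: fully independent misses, 1 - (1 - p)^n.
        rho = 1: fully correlated misses, repetition adds nothing.
        Intermediate rho: simple linear mixture of the two extremes
        (an illustrative simplification, not a fitted model).
        """
        independent = 1.0 - (1.0 - p_single) ** n_tests
        return (1.0 - rho) * independent + rho * p_single

    # A low-POD test repeated daily for a month looks excellent if, and
    # only if, the misses are independent:
    for rho in (0.0, 0.5, 1.0):
        print(f"rho = {rho}: effective POD = {effective_pod(0.10, 30, rho):.3f}")
    ```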

  20. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  1. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  2. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish Nuclear Power Inspectorate in February 1980. Equipment, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Recommendations for future testing procedures are made. (GBn)

  3. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice those of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By

  4. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which

  5. Quantifying Detection Probabilities for Proliferation Activities in Undeclared Facilities

    International Nuclear Information System (INIS)

    Listner, C.; Canty, M.; Niemeyer, I.; Rezniczek, A.; Stein, G.

    2015-01-01

    International Safeguards is currently in an evolutionary process to increase the effectiveness and efficiency of the verification system, an obvious consequence of the failure to detect Iraq's clandestine nuclear weapons programme in the early 1990s. Through the adoption of Programme 93+2, this has led to the development of Integrated Safeguards and the State-level concept. Moreover, the IAEA's focus was extended to proliferation activities outside a State's declared facilities. The effectiveness of safeguards activities within declared facilities can be, and has been, quantified with respect to costs and detection probabilities. In contrast, when verifying the absence of undeclared facilities, this quantification has been avoided in the past because it was considered impossible. However, when balancing the allocation of budget between the declared and the undeclared field, explicit reasoning is needed as to why safeguards effort is distributed in a given way. Such reasoning can be provided by a holistic, information- and risk-driven approach to Acquisition Path Analysis comprising declared and undeclared facilities. Regarding the input, this approach relies on the quantification of several factors, i.e., costs, attractiveness values for specific proliferation activities, potential safeguards measures, and detection probabilities for these measures, including in the undeclared field. To overcome the lack of quantification for detection probabilities in undeclared facilities, the authors of this paper propose a general verification error model. Based on this model, four different approaches are explained and assessed with respect to their advantages and disadvantages: the analogy approach, the Bayes approach, the frequentist approach and the process approach. The paper concludes with a summary and an outlook on potential future research activities. (author)

  6. Improving detection probabilities for pests in stored grain.

    Science.gov (United States)

    Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant

    2010-12-01

    The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models. Copyright © 2010 Society of Chemical Industry.
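
    A hedged sketch of why more, smaller subsamples help under clustering, assuming a simple two-level model of our own construction: a fraction theta of the grain is infested, and insects within that fraction are Poisson-distributed.

    ```python
    import numpy as np

    def p_detect(n_sub: int, total_mass_kg: float, theta: float, lam: float) -> float:
        """Probability of detecting insects in a bulk-grain consignment.

        theta: proportion of the grain that is infested (clustering);
        lam:   insect density (per kg) within the infested fraction.
        The total mass is split evenly across n_sub subsamples; each lands
        in infested grain with probability theta and then contains at least
        one insect with Poisson probability 1 - exp(-lam * mass).
        """
        m = total_mass_kg / n_sub
        p_miss_one = (1.0 - theta) + theta * np.exp(-lam * m)
        return 1.0 - p_miss_one ** n_sub

    # The same 3 kg of grain taken as 1, 3, 10 or 30 subsamples: detection
    # probability rises even though the total mass sampled is constant.
    for k in (1, 3, 10, 30):
        print(k, round(p_detect(k, total_mass_kg=3.0, theta=0.1, lam=10.0), 3))
    ```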

  7. Probability of detection as a function of multiple influencing parameters

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato

    2014-10-15

    Non-destructive testing is subject to measurement uncertainties. In safety-critical applications, the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimate of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data, called the volume POD, is developed. The fusion of POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.
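
    As a sketch of what a multi-parameter POD model can look like, assume a logistic POD in log flaw size with one additional covariate; the choice of depth as the second parameter and all coefficients are illustrative, not values from the dissertation.

    ```python
    import numpy as np
    from scipy.special import expit

    def pod(flaw_size_mm, depth_mm, beta=(-6.0, 4.0, -0.5)):
        """Two-parameter POD: logistic in log flaw size, shifted by a
        second influencing parameter (here, flaw depth)."""
        b0, b1, b2 = beta
        return expit(b0 + b1 * np.log(flaw_size_mm) + b2 * depth_mm)

    # The size detected with 90% probability now depends on depth:
    sizes = np.linspace(0.5, 30.0, 2000)
    for depth in (0.0, 5.0, 10.0):
        a90 = sizes[np.argmax(pod(sizes, depth) >= 0.90)]
        print(f"depth {depth:4.1f} mm: a90 ~ {a90:.1f} mm")
    ```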

  8. Probability of detection as a function of multiple influencing parameters

    International Nuclear Information System (INIS)

    Pavlovic, Mato

    2014-01-01

    Non-destructive testing is subject to measurement uncertainties. In safety-critical applications, the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimate of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data, called the volume POD, is developed. The fusion of POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.

  9. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

    Highlights: • We investigate the sample size required to develop probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacturing of test pieces with representative flaws, in numbers sufficient to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre in Petten, The Netherlands, supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification.

  10. Detection probability in aerial surveys of feral horses

    Science.gov (United States)

    Ransom, Jason I.

    2011-01-01

    Observation bias pervades data collected during aerial surveys of large animals, and although some sources can be mitigated with informed planning, others must be addressed using valid sampling techniques that carefully model detection probability. Nonetheless, aerial surveys are frequently employed to count large mammals without applying such methods to account for heterogeneity in visibility of animal groups on the landscape. This often leaves managers and interest groups at odds over decisions that are not adequately informed. I analyzed detection of feral horse (Equus caballus) groups by dual independent observers from 24 fixed-wing and 16 helicopter flights using mixed-effect logistic regression models to investigate potential sources of observation bias. I accounted for observer skill, population location, and aircraft type in the model structure and analyzed the effects of group size, sun effect (position related to observer), vegetation type, topography, cloud cover, percent snow cover, and observer fatigue on detection of horse groups. The most important model-averaged effects for both fixed-wing and helicopter surveys included group size (fixed-wing: odds ratio = 0.891, 95% CI = 0.850–0.935; helicopter: odds ratio = 0.640, 95% CI = 0.587–0.698) and sun effect (fixed-wing: odds ratio = 0.632, 95% CI = 0.350–1.141; helicopter: odds ratio = 0.194, 95% CI = 0.080–0.470). Observer fatigue was also an important effect in the best model for helicopter surveys, with detection probability declining after 3 hr of survey time (odds ratio = 0.278, 95% CI = 0.144–0.537). Biases arising from sun effect and observer fatigue can be mitigated by pre-flight survey design. Other sources of bias, such as those arising from group size, topography, and vegetation can only be addressed by employing valid sampling techniques such as double sampling, mark–resight (batch-marked animals), mark–recapture (uniquely marked and

  11. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    Science.gov (United States)

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., a detection probability of 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.

  12. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms.

    Science.gov (United States)

    de Souza, Lesley S; Godwin, James C; Renshaw, Mark A; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications like the monitoring of invasive or imperiled species, yet this emerging technique requires ongoing testing in order to determine the contexts over which it is effective. For example, little research to date has evaluated how seasonality of organism behavior or activity may influence detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and season of sampling on detection probability of eDNA. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs.

  13. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of the Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of a qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of the NDT reliability is necessary. A POD curve provides such a metric. In 2010 ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of the optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing test pieces with cracks that are representative of real defects found in nuclear power plants (NPPs) can be very expensive. Thus there is a tendency to reduce sample sizes and in turn reduce the conservatism associated with the POD curve derived. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)

  14. Probabilities of False Alarm for Vital Sign Detection on the Basis of a Doppler Radar System

    Directory of Open Access Journals (Sweden)

    Nguyen Thi Phuoc Van

    2018-02-01

    Full Text Available Vital-sign detection on the basis of Doppler radars has drawn a great deal of attention from researchers because of its high potential for applications in biomedicine, surveillance, and finding people alive under debris during natural hazards. In this research, the signal-to-noise ratio (SNR) of the remote vital-sign detection system is investigated. The SNR of the system is first examined on the basis of different types of noise, such as phase noise, Gaussian noise, leakage noise between the transmitting and receiving antennae, and so on. The research then focuses on the detection and false-alarm probabilities of the system when the transmission link between the human subject and the radar sensor system is modelled as a Nakagami-m channel. Analytical models for the false-alarm and detection probabilities of the system have been derived. The proposed theoretical models for the SNR and detection probability match the simulation and measurement results, and they have the potential to be used as good references for the hardware development of the vital-sign detection radar sensor system.
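
    A simplified stand-in for the analysis, replacing the Nakagami-m fading channel with a known signal in additive white Gaussian noise, shows how the false-alarm constraint fixes the threshold and hence the detection probability; the SNR definition (signal power over noise variance) is ours.

    ```python
    import numpy as np
    from scipy.stats import norm

    def detection_prob(snr_db: float, p_fa: float) -> float:
        """Pd at a given false-alarm probability for a known signal in
        Gaussian noise: the threshold tau = sigma * Qinv(Pfa) is set by
        the false-alarm constraint, giving Pd = Q(Qinv(Pfa) - sqrt(SNR))."""
        snr = 10.0 ** (snr_db / 10.0)
        return norm.sf(norm.isf(p_fa) - np.sqrt(snr))

    for snr_db in (0, 5, 10, 13):
        print(f"SNR {snr_db:2d} dB: Pd = {detection_prob(snr_db, p_fa=1e-3):.3f}")
    ```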

  15. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • The probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability of starting proliferation activities is very low. • This fact can decrease the detection probability of IS by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies, etc. - Abstract: A theoretical foundation is presented for implementing the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) more efficiently, on the basis of a fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that, by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of current IS is decreased by dozens of percentage points from its present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and to more efficient utilization of IAEA resources from the viewpoint of the whole IS framework.

  16. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Science.gov (United States)

    O'Connor, Kelly M; Nathan, Lucas R; Liberati, Marjorie R; Tingley, Morgan W; Vokoun, Jason C; Rittenhouse, Tracy A G

    2017-01-01

    Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species, often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori identify

  17. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset.

    Directory of Open Access Journals (Sweden)

    Kelly M O'Connor

    Full Text Available Camera trapping is a standard tool in ecological research and wildlife conservation. Study designs, particularly for small-bodied or cryptic wildlife species often attempt to boost low detection probabilities by using non-random camera placement or baited cameras, which may bias data, or incorrectly estimate detection and occupancy. We investigated the ability of non-baited, multi-camera arrays to increase detection probabilities of wildlife. Study design components were evaluated for their influence on wildlife detectability by iteratively parsing an empirical dataset (1) by different sizes of camera arrays deployed (1-10 cameras), and (2) by total season length (1-365 days). Four species from our dataset that represented a range of body sizes and differing degrees of presumed detectability based on life history traits were investigated: white-tailed deer (Odocoileus virginianus), bobcat (Lynx rufus), raccoon (Procyon lotor), and Virginia opossum (Didelphis virginiana). For all species, increasing from a single camera to a multi-camera array significantly improved detection probability across the range of season lengths and number of study sites evaluated. The use of a two camera array increased survey detection an average of 80% (range 40-128%) from the detection probability of a single camera across the four species. Species that were detected infrequently benefited most from a multiple-camera array, where the addition of up to eight cameras produced significant increases in detectability. However, for species detected at high frequencies, single cameras produced a season-long (i.e., the length of time over which cameras are deployed and actively monitored) detectability greater than 0.75. These results highlight the need for researchers to be critical about camera trap study designs based on their intended target species, as detectability for each focal species responded differently to array size and season length. We suggest that researchers a priori

  18. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  19. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes and other nodes are, respectively, treated as the absorbing nodes and transient nodes in the absorbing Markov chain. Then, the expected number of times from each transient node to all other transient nodes can be used to represent the saliency value of this node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, which is called the learnt transition probability matrix. Although performance is significantly improved, salient objects are still not uniformly highlighted. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of AMC and boundary maps, we rearrange the global orderings (saliency values) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.

  20. Study of detection probability from lesion by scintiscanning

    International Nuclear Information System (INIS)

    Silva, D.C. da; Dias-Neto, A.L.

    1992-01-01

    The importance of working with the information density parameter in scintiscanning is described, establishing minimum values of information density below which existing lesions go undetected, and also allowing reproducibility of the examination. (C.G.C.)

  1. Detection probability of least tern and piping plover chicks in a large river system

    Science.gov (United States)

    Roche, Erin A.; Shaffer, Terry L.; Anteau, Michael J.; Sherfy, Mark H.; Stucker, Jennifer H.; Wiltermuth, Mark T.; Dovichin, Colin M.

    2014-01-01

    Monitoring the abundance and stability of populations of conservation concern is often complicated by an inability to perfectly detect all members of the population. Mark-recapture offers a flexible framework in which one may identify factors contributing to imperfect detection, while at the same time estimating demographic parameters such as abundance or survival. We individually color-marked, recaptured, and re-sighted 1,635 federally listed interior least tern (Sternula antillarum; endangered) chicks and 1,318 piping plover (Charadrius melodus; threatened) chicks from 2006 to 2009 at 4 study areas along the Missouri River and investigated effects of observer-, subject-, and site-level covariates suspected of influencing detection. Increasing the time spent searching and crew size increased the probability of detecting both species regardless of study area, and detection methods were not associated with decreased survival. However, associations between detection probability and the investigated covariates were highly variable by study area and species combinations, indicating that a universal mark-recapture design may not be appropriate.

  2. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David, E-mail: david.portnoy@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Feuerbach, Robert; Heimberg, Jennifer [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of

  3. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    International Nuclear Information System (INIS)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-01-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of spectra

  4. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Science.gov (United States)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the "threat" set of spectra

  5. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  6. Probability Model for Data Redundancy Detection in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Suman Kumar

    2009-01-01

    Full Text Available Sensor networks are made of autonomous devices that are able to collect, store, process and share data with other devices. Large sensor networks are often redundant in the sense that the measurements of some nodes can be substituted by other nodes with a certain degree of confidence. This spatial correlation results in wastage of link bandwidth and energy. In this paper, a model for two associated Poisson processes, through which sensors are distributed in a plane, is derived. A probability condition is established for data redundancy among closely located sensor nodes. The model generates a spatial bivariate Poisson process whose parameters depend on the parameters of the two individual Poisson processes and on the distance between the associated points. The proposed model helps in building efficient algorithms for data dissemination in the sensor network. A numerical example is provided investigating the advantage of this model.

  7. Detecting and classifying low probability of intercept radar

    CERN Document Server

    Pace, Philip E

    2008-01-01

    This revised and expanded second edition brings you to the cutting edge with new chapters on LPI radar design, including over-the-horizon radar, random noise radar, and netted LPI radar. You also discover critical LPI detection techniques, parameter extraction signal processing techniques, and anti-radiation missile design strategies to counter LPI radar.

  8. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the number of triggered sampling gates, which makes it possible to obtain the photon counting probability and thus estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of triggered time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method can improve the estimation accuracy of the echo signal intensity due to the acquisition of more detection information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of echo signal intensity is discussed and a high-accuracy intensity image is acquired in low-light-level environments. (paper)
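
    As a concrete illustration of the conventional gated approach the abstract contrasts itself with: if photoelectrons per gate are Poisson distributed with mean intensity lam, a gate triggers with probability p = 1 - exp(-lam), so lam can be recovered by inverting the observed trigger fraction. A minimal sketch with invented counts:

    ```python
    import math

    # Sketch of conventional gated photon counting, assuming Poisson-distributed
    # photoelectrons per sampling gate. A gate triggers when at least one photon
    # arrives, so the trigger probability is p = 1 - exp(-lam). Counts are invented.
    gates, triggered = 1000, 632
    p_hat = triggered / gates
    lam_hat = -math.log(1.0 - p_hat)   # estimated mean photons per gate
    print(f"trigger probability {p_hat:.3f} -> estimated intensity {lam_hat:.3f} photons/gate")
    ```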

  9. Detecting and classifying low probability of intercept radar

    CERN Document Server

    Pace, Phillip E

    2003-01-01

    The drive is on to devise LPI radar systems that evade hostile detection as well as develop non-cooperative intercept devices that outsmart enemy LPI radar. Based on the author's own design experience, this comprehensive, hands-on book gives you the latest design and development techniques to innovate new LPI radar systems, discover new ways to intercept enemy LPI radar, and help you visually identify waveform parameters. Filled with more than 500 equations that provide rigorous mathematical detail, this book can be used by both entry-level and seasoned engineers. Besides thoroughly treatin

  10. Antitrust Enforcement Under Endogenous Fines and Price-Dependent Detection Probabilities

    NARCIS (Netherlands)

    Houba, H.E.D.; Motchenkova, E.; Wen, Q.

    2010-01-01

    We analyze the effectiveness of antitrust regulation in a repeated oligopoly model in which both fines and detection probabilities depend on the cartel price. Such fines are closer to actual guidelines than the commonly assumed fixed fines. Under a constant detection probability, we confirm the

  11. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  12. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
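
    The study's stochastic result has a simple closed-form analogue: if drainfields are scattered as a homogeneous spatial Poisson process, the probability that a well's source area overlaps at least one drainfield is 1 - exp(-rho * A). The sketch below uses invented densities and geometry purely to show the qualitative behaviour; the paper's analysis rests on detailed flow and transport modeling instead.

    ```python
    import math

    # Sketch: probability that a well's capture (source) area overlaps at least
    # one septic drainfield, assuming drainfields form a spatial Poisson process
    # with density rho (systems per km^2) and an effective source area A (km^2).
    # All numbers are invented for illustration.
    def overlap_probability(rho_per_km2: float, source_area_km2: float) -> float:
        return 1.0 - math.exp(-rho_per_km2 * source_area_km2)

    for rho in (5, 20, 80):  # increasing septic system density
        p = overlap_probability(rho, source_area_km2=0.05)
        print(f"rho = {rho:3d}/km^2 -> P(overlap) = {p:.2f}")
    ```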

  13. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG......) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade...... on CA. Of the patients who had also an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly...

  14. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    Science.gov (United States)

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
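
    The formula C = Np makes the pitfall easy to demonstrate: if detection probability p drifts over time while true abundance N stays constant, unadjusted counts fabricate a trend. A small simulation with invented numbers:

    ```python
    import random

    random.seed(1)

    # Sketch of the abstract's point using C = N * p: true abundance N is held
    # constant while detection probability p drifts downward, yet the raw counts
    # C alone would suggest a population decline. All values are invented.
    N = 100
    for year in range(10):
        p = 0.8 - 0.04 * year                              # declining detectability
        C = sum(random.random() < p for _ in range(N))     # one Binomial(N, p) draw
        print(f"year {year}: p = {p:.2f}, count C = {C} (true N = {N})")
    ```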

  15. Search times and probability of detection in time-limited search

    Science.gov (United States)

    Wilson, David; Devitt, Nicole; Maurer, Tana

    2005-05-01

    When modeling the search and target acquisition process, probability of detection as a function of time is important to war games and physical entity simulations. Recent US Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate modeling of search and detection has focused on time-limited search. Developing the relationship between detection probability and time of search as a differential equation is explored. One of the parameters in the current formula for probability of detection in time-limited search corresponds to the mean time to detect in time-unlimited search. However, the mean time to detect in time-limited search is shorter than the mean time to detect in time-unlimited search, and the two means are connected by a simple mathematical relationship, which is derived here.
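
    One standard exponential-search formulation is consistent with this description; the equations below are our reconstruction under that assumption, not necessarily the authors' exact model. With time-unlimited mean time to detect tau and asymptotic detection probability P_inf:

    ```latex
    % Exponential search model: tau is the mean time to detect in
    % time-unlimited search, P_\infty the asymptotic detection probability.
    \frac{dP(t)}{dt} = \frac{P_\infty - P(t)}{\tau},
    \qquad
    P(t) = P_\infty\bigl(1 - e^{-t/\tau}\bigr).

    % Mean time to detect, conditional on detection within the time limit T:
    \mathbb{E}\left[t \mid t \le T\right]
      = \frac{\int_0^T t\, e^{-t/\tau}\, dt}{\int_0^T e^{-t/\tau}\, dt}
      = \tau - \frac{T\, e^{-T/\tau}}{1 - e^{-T/\tau}}.
    ```

    The conditional mean is strictly less than tau for finite T and tends to tau as T grows, which is the qualitative relationship the abstract describes.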

  16. Estimation of the defect detection probability for ultrasonic tests on thick sections steel weldments. Technical report

    International Nuclear Information System (INIS)

    Johnson, D.P.; Toomay, T.L.; Davis, C.S.

    1979-02-01

    An inspection uncertainty analysis of published PVRC Specimen 201 data is reported to obtain an estimate of the probability of recording an indication as a function of imperfection height for ASME Section XI Code ultrasonic inspections of nuclear reactor vessel plate seams, and to demonstrate the advantages of inspection uncertainty analysis over conventional detection/nondetection counting analysis. This analysis found the probability of recording a significant defect with an ASME Section XI Code ultrasonic inspection to be very high, if such a defect should exist in the plate seams of a nuclear reactor vessel. For a one-inch high crack, for example, this analysis gives a best estimate recording probability of .985 and a 90% lower confidence bound recording probability of .937. It is also shown that inspection uncertainty analysis gives more accurate estimates, and gives estimates over a much greater flaw size range, than is possible with conventional analysis. There is reason to believe that the estimation procedure used is conservative: the estimation is based on data generated several years ago, on very small defects, in an environment that is different from the actual in-service inspection environment
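
    A lower confidence bound of the kind quoted here can be computed from hit/miss counts with the exact (Clopper-Pearson) binomial method. The sketch below uses invented counts, not the PVRC Specimen 201 data or the report's regression-based analysis:

    ```python
    from scipy.stats import beta

    # Sketch: one-sided lower confidence bound on a recording probability from
    # hit/miss data via the exact (Clopper-Pearson) binomial method. The counts
    # below are invented.
    hits, trials = 59, 60
    confidence = 0.90

    point_estimate = hits / trials
    # For x hits out of n, the one-sided lower bound is the (1 - confidence)
    # quantile of Beta(x, n - x + 1).
    lower_bound = beta.ppf(1 - confidence, hits, trials - hits + 1)
    print(f"POD estimate {point_estimate:.3f}, {confidence:.0%} lower bound {lower_bound:.3f}")
    ```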

  17. Minimizing Detection Probability Routing in Ad Hoc Networks Using Directional Antennas

    Directory of Open Access Journals (Sweden)

    Towsley Don

    2009-01-01

    Full Text Available In a hostile environment, it is important for a transmitter to make its wireless transmission invisible to adversaries because an adversary can detect the transmitter if the received power at its antennas is strong enough. This paper defines a detection probability model to compute the level of a transmitter being detected by a detection system at an arbitrary location around the transmitter. Our study proves that the probability of detecting a directional antenna is much lower than that of detecting an omnidirectional antenna if both the directional and omnidirectional antennas provide the same Effective Isotropic Radiated Power (EIRP) in the direction of the receiver. We propose a Minimizing Detection Probability (MinDP) routing algorithm to find a secure routing path in ad hoc networks where nodes employ directional antennas to transmit data to decrease the probability of being detected by adversaries. Our study shows that the MinDP routing algorithm can reduce the total detection probability of deliveries from the source to the destination by over 74%.
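
    The core claim (equal EIRP toward the receiver, lower detectability elsewhere) can be illustrated with a free-space link budget. The sketch below assumes a simple pattern in which the directional antenna is 20 dB down off-axis; all values are invented and do not reproduce the paper's model.

    ```python
    import math

    # Sketch: free-space link budget comparing an omnidirectional antenna with a
    # directional antenna that is 20 dB down off the main beam, assuming both
    # deliver the same EIRP toward the intended receiver. Values are invented.
    def received_dbm(eirp_dbm: float, off_axis_gain_db: float,
                     distance_m: float, freq_hz: float = 2.4e9) -> float:
        # Free-space path loss: 20*log10(d) + 20*log10(f) - 147.55 (d in m, f in Hz)
        fspl_db = 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55
        return eirp_dbm + off_axis_gain_db - fspl_db

    eirp_dbm, sensitivity_dbm = 20.0, -90.0
    for label, off_axis_db in (("omnidirectional", 0.0), ("directional", -20.0)):
        rx = received_dbm(eirp_dbm, off_axis_db, distance_m=1000.0)
        print(f"{label}: adversary at 1 km sees {rx:.1f} dBm, detected = {rx > sensitivity_dbm}")
    ```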

  18. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    Science.gov (United States)

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.

  19. Modeling co-occurrence of northern spotted and barred owls: accounting for detection probability differences

    Science.gov (United States)

    Bailey, Larissa L.; Reid, Janice A.; Forsman, Eric D.; Nichols, James D.

    2009-01-01

    Barred owls (Strix varia) have recently expanded their range and now encompass the entire range of the northern spotted owl (Strix occidentalis caurina). This expansion has led to two important issues of concern for management of northern spotted owls: (1) possible competitive interactions between the two species that could contribute to population declines of northern spotted owls, and (2) possible changes in vocalization behavior and detection probabilities of northern spotted owls induced by presence of barred owls. We used a two-species occupancy model to investigate whether there was evidence of competitive exclusion between the two species at study locations in Oregon, USA. We simultaneously estimated detection probabilities for both species and determined if the presence of one species influenced the detection of the other species. Model selection results and associated parameter estimates provided no evidence that barred owls excluded spotted owls from territories. We found strong evidence that detection probabilities differed for the two species, with higher probabilities for northern spotted owls that are the object of current surveys. Non-detection of barred owls is very common in surveys for northern spotted owls, and detection of both owl species was negatively influenced by the presence of the congeneric species. Our results suggest that analyses directed at hypotheses of barred owl effects on demographic or occupancy vital rates of northern spotted owls need to deal adequately with imperfect and variable detection probabilities for both species.

  20. Probability-of-Superiority SEM (PS-SEM)—Detecting Probability-Based Multivariate Relationships in Behavioral Research

    Directory of Open Access Journals (Sweden)

    Johnson Ching-Hong Li

    2018-06-01

    Full Text Available In behavioral research, exploring bivariate relationships between variables X and Y based on the concept of probability-of-superiority (PS) has received increasing attention. Unlike the conventional, linear-based bivariate relationship (e.g., Pearson's correlation), PS posits that X and Y can be related based on their likelihood—e.g., a student who is above mean in SAT has 63% likelihood of achieving an above-mean college GPA. Despite this increasing attention, the concept of PS has been restricted to a simple bivariate scenario (an X-Y pair), which hinders the development and application of PS in popular multivariate modeling such as structural equation modeling (SEM). Therefore, this study presents an empirically based simulation study that explores the potential of detecting PS-based relationships in SEM, called PS-SEM. The simulation results showed that the proposed PS-SEM method can detect and identify PS-based relationships when data follow such relationships, thereby providing a useful method for researchers to explore PS-based SEM in their studies. Conclusions, implications, and future directions based on the findings are also discussed.
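
    For a single X-Y pair, PS can be computed nonparametrically by comparing all pairs of observations (the Mann-Whitney U statistic divided by n_x * n_y); the 63% SAT/GPA figure in the abstract is a statement of this quantity. A sketch with made-up data:

    ```python
    import itertools

    # Sketch: probability-of-superiority for a single X-Y pair,
    # PS = P(X > Y) + 0.5 * P(X = Y), computed over all sample pairs
    # (the Mann-Whitney U statistic divided by n_x * n_y). Data are invented.
    x = [3.1, 2.8, 2.3, 1.9, 1.2]   # e.g., GPAs of above-mean-SAT students
    y = [2.6, 2.0, 1.5, 1.1, 1.0]   # e.g., GPAs of below-mean-SAT students

    wins = sum((xi > yi) + 0.5 * (xi == yi) for xi, yi in itertools.product(x, y))
    ps = wins / (len(x) * len(y))
    print(f"PS = {ps:.2f}")   # fraction of pairs in which X beats Y
    ```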

  1. Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences

    DEFF Research Database (Denmark)

    Hansen, Søren; Blanke, Mogens

    2013-01-01

    desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired...... false alarm probability. A data-based method is used to determine the validity of the proposed methods. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft....

  2. Monitoring Least Bitterns (Ixobrychus exilis) in Vermont: Detection probability and occupancy modeling

    Science.gov (United States)

    Cherukuri, Aswini; Strong, Allan; Donovan, Therese M.

    2018-01-01

    Ixobrychus exilis (Least Bittern) is listed as a species of high concern in the North American Waterbird Conservation Plan and is a US Fish and Wildlife Service migratory bird species of conservation concern in the Northeast. Little is known about the population of Least Bitterns in the Northeast because of their low population density, tendency to nest in dense wetland vegetation, and secretive behavior. Urban and agricultural development is expected to encroach on and degrade suitable wetland habitat; however, we cannot predict the effects on Least Bittern populations without more accurate information on their abundance and distribution. We conducted surveys of wetlands in Vermont to assess the efficacy of a monitoring protocol and to establish baseline Least Bittern abundance and distribution data at a sample of 29 wetland sites. Surveys yielded detections of 31 individuals at 15 of 29 sites across 3 biophysical regions and at 5 sites where occupancy had not been previously reported. Probability of occupancy was positively related to wetland size and number of patches, though the relationships were not strong enough to conclude whether these were true determinants of occupancy. Call-response broadcast surveys yielded 30 detections, while passive surveys yielded 13. Call-response broadcasts (P = 0.897) increased the rate of detection by 55% compared to passive surveys (P = 0.577). Our results suggest that call-response broadcast surveys are an effective means of assessing Least Bittern occupancy and may reduce bias in long-term monitoring programs.

  3. Improved detection probability of low level light and infrared image fusion system

    Science.gov (United States)

    Luo, Yuxiang; Fu, Rongguo; Zhang, Junju; Wang, Wencong; Chang, Benkang

    2018-02-01

    Low level light (LLL) images contain rich information on environmental details but are easily affected by the weather. In the case of smoke, rain, cloud, or fog, much target information is lost. Infrared imaging, which relies on the radiation produced by the object itself, can "actively" obtain target information in the scene. However, its contrast and resolution are poor, its ability to capture target details is very limited, and the imaging mode does not conform to human visual habits. The fusion of LLL and infrared images can compensate for the deficiencies of each sensor while exploiting the advantages of each. First, we present the hardware design of the fusion circuit. Then, through recognition probability calculations for a target (one person) and background imagery (trees), we find that the tree detection probability of the LLL image is higher than that of the infrared image, while the person detection probability of the infrared image is clearly higher than that of the LLL image. The detection probability of the fused image for both the person and the trees is higher than that of either single detector. Therefore, image fusion can significantly increase recognition probability and improve detection efficiency.

  4. Detection probability of gyrfalcons and other cliff-nesting raptors during aerial surveys in Alaska

    Science.gov (United States)

    Booms, Travis L.; Fuller, Mark R.; Schempf, Philip F.; McCaffery, Brian J.; Lindberg, Mark S.; Watson, Richard T.; Cade, Tom J.; Fuller, Mark; Hunt, Grainger; Potapov, Eugene

    2011-01-01

    Assessing the status of Gyrfalcons (Falco rusticolus) and other cliff-nesting raptors as the Arctic climate changes often requires aerial surveys of their breeding habitats. Because traditional, count-based surveys that do not adjust for differing detection probabilities can provide faulty inference about population status (Link and Sauer 1998, Thompson 2002), it will be important to incorporate measures of detection probability into survey methods whenever possible. To evaluate the feasibility of this, we conducted repeated aerial surveys for breeding cliff-nesting raptors on the Yukon Delta National Wildlife Refuge (YDNWR) in western Alaska to estimate detection probabilities of Gyrfalcons, Golden Eagles (Aquila chrysaetos), Rough-legged Hawks (Buteo lagopus), and also Common Ravens (Corvus corax). Using the program PRESENCE, we modeled detection histories of each species based on single-species occupancy modeling following MacKenzie et al. (2002, 2006). We used different observers during four helicopter replicate surveys in the Kilbuck Mountains and five fixed-wing replicate surveys in the Ingakslugwat Hills (hereafter called Volcanoes) near Bethel, Alaska. We used the following terms and definitions throughout: Survey site: site of a nest used previously by a raptor and marked with a GPS-obtained latitude and longitude accurate to within 20 m. All GPS locations were obtained in prior years from a helicopter hovering approximately 10-20 m from a nest. The site was considered occupied if a bird or an egg was detected within approximately 500 m of the nest, and this area served as our sampling unit. When multiple historical nests were located on a single cliff, we used only one GPS location to locate the survey site. Detection probability (p): the probability of a species being detected at a site given the site is occupied. Occupancy (ψ): the probability that the species of interest is present at a site during the survey period. A site was considered occupied if the

  5. Neutron emission probability at high excitation and isospin

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2005-01-01

    One-neutron and two-neutron emission probabilities at different excitations and varying isospin have been studied. Several degrees of freedom, such as deformation, rotation, temperature, isospin fluctuations, and shell structure, are incorporated via the statistical theory of hot rotating nuclei

  6. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Full Text Available Background: In the case of an autosomal locus, four transmission events from the parents to a progeny are possible, specified by the grandparental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential for QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  7. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    (Divoky et al., 2005). Nevertheless, such events do occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). The November 2009 fluvial-coastal flood of Cork City, which caused a €100m loss, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms, and thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values at which flooding is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flow. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al

  8. Summary of intrinsic and extrinsic factors affecting detection probability of marsh birds

    Science.gov (United States)

    Conway, C.J.; Gibbs, J.P.

    2011-01-01

    Many species of marsh birds (rails, bitterns, grebes, etc.) rely exclusively on emergent marsh vegetation for all phases of their life cycle, and many organizations have become concerned about the status and persistence of this group of birds. Yet, marsh birds are notoriously difficult to monitor due to their secretive habits. We synthesized the published and unpublished literature and summarized the factors that influence detection probability of secretive marsh birds in North America. Marsh birds are more likely to respond to conspecific than heterospecific calls, and seasonal peak in vocalization probability varies among co-existing species. The effectiveness of morning versus evening surveys varies among species and locations. Vocalization probability appears to be positively correlated with density in breeding Virginia Rails (Rallus limicola), Soras (Porzana carolina), and Clapper Rails (Rallus longirostris). Movement of birds toward the broadcast source creates biases when using count data from call-broadcast surveys to estimate population density. Ambient temperature, wind speed, cloud cover, and moon phase affected detection probability in some, but not all, studies. Better estimates of detection probability are needed. We provide recommendations that would help improve future marsh bird survey efforts and a list of 14 priority information and research needs that represent gaps in our current knowledge where future resources are best directed. © Society of Wetland Scientists 2011.

  9. Mining of high utility-probability sequential patterns from uncertain databases.

    Directory of Open Access Journals (Sweden)

    Binbin Zhang

    Full Text Available High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments both on real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.

  10. Effects of population variability on the accuracy of detection probability estimates

    DEFF Research Database (Denmark)

    Ordonez Gloria, Alejandro

    2011-01-01

    Observing a constant fraction of the population over time, locations, or species is virtually impossible. Hence, quantifying this proportion (i.e. detection probability) is an important task in quantitative population ecology. In this study we determined, via computer simulations, the effect of

  11. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal-amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.
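
    The 90/95 criterion has a well-known zero-failure benchmark: if every one of n independent trials at a given flaw size is a hit, then n must satisfy 0.90^n <= 0.05 to claim 90% POD with 95% confidence, which gives n = 29. A quick check:

    ```python
    # Sketch: smallest number of consecutive hits n (with zero misses) needed so
    # that observing them would be implausible (< 5%) if the true POD were below 0.90.
    target_pod, confidence = 0.90, 0.95
    n = 1
    while target_pod ** n > 1 - confidence:
        n += 1
    print(n)   # 29: the familiar zero-failure "29 of 29" demonstration
    ```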

  12. Probability of failure of the watershed algorithm for peak detection in comprehensive two-dimensional chromatography

    NARCIS (Netherlands)

    Vivó-Truyols, G.; Janssen, H.-G.

    2010-01-01

    The watershed algorithm is the most common method used for peak detection and integration in two-dimensional chromatography. However, the retention time variability in the second dimension may cause the algorithm to fail. A study calculating the probabilities of failure of the watershed algorithm was

  13. Probability of defect detection of Posiva's electron beam weld

    International Nuclear Information System (INIS)

    Kanzler, D.; Mueller, C.; Pitkaenen, J.

    2013-12-01

    The report 'Probability of Defect Detection of Posiva's electron beam weld' describes POD curves for four NDT methods: radiographic testing, ultrasonic testing, eddy current testing, and visual testing. The POD curves are based on artificial defects in reference blocks. The results are devoted to demonstrating the suitability of the methods for EB weld testing. The report describes the methodology and procedure applied by BAM. It creates a link from the assessment of reliability and inspection performance to the risk assessment process of the canister final disposal project, and it confirms the basic quality of the NDT methods and their capability to describe the quality of the EB weld. The probability of detection curves are determined based on the MIL-1823 standard and its reliability guidelines. The MIL-1823 standard was developed for the determination of the integrity of gas turbine engines for the US military. In the POD process, the key parameter for defect detectability is the a90/95 magnitude, i.e. the defect size a at which the lower 95% confidence band crosses the 90% POD level. In this way it can be confirmed that defects of size a90/95 will be detected with 90% probability; if the experiment were repeated, 5% of results might fall outside this confidence limit. (orig.)

  14. A prototype method for diagnosing high ice water content probability using satellite imager data

    Science.gov (United States)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m-3. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
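
    The fuzzy-logic combination described here can be sketched as membership functions over the three satellite parameters, combined into a single probability. The membership shapes, thresholds, and averaging rule below are our invention for illustration; the paper derives its membership functions from statistical fits to collocated TWC data.

    ```python
    # Sketch of a fuzzy-logic HIWC probability: each satellite parameter maps to a
    # membership value in [0, 1], and the memberships are averaged. All shapes and
    # numbers are hypothetical, not the paper's fitted membership functions.
    def ramp(x: float, lo: float, hi: float) -> float:
        """Linear membership rising from 0 at lo to 1 at hi, clipped to [0, 1]."""
        return max(0.0, min(1.0, (x - lo) / (hi - lo)))

    distance_km = 12.0      # distance to nearest overshooting top (hypothetical)
    btemp_rel_k = -8.0      # tropopause-relative brightness temperature (K)
    optical_depth = 80.0    # daytime cloud optical depth

    membership = [
        1.0 - ramp(distance_km, 0.0, 50.0),   # closer to updraft -> higher
        ramp(-btemp_rel_k, 0.0, 15.0),        # colder than tropopause -> higher
        ramp(optical_depth, 20.0, 100.0),     # optically thicker -> higher
    ]
    hiwc_probability = sum(membership) / len(membership)
    print(f"HIWC probability ~ {hiwc_probability:.2f}")
    ```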

  15. Evaluating species richness: biased ecological inference results from spatial heterogeneity in species detection probabilities

    Science.gov (United States)

    McNew, Lance B.; Handel, Colleen M.

    2015-01-01

    Accurate estimates of species richness are necessary to test predictions of ecological theory and evaluate biodiversity for conservation purposes. However, species richness is difficult to measure in the field because some species will almost always be overlooked due to their cryptic nature or the observer's failure to perceive their cues. Common measures of species richness that assume consistent observability across species are inviting because they may require only single counts of species at survey sites. Single-visit estimation methods ignore spatial and temporal variation in species detection probabilities related to survey or site conditions that may confound estimates of species richness. We used simulated and empirical data to evaluate the bias and precision of raw species counts, the limiting forms of jackknife and Chao estimators, and multi-species occupancy models when estimating species richness to evaluate whether the choice of estimator can affect inferences about the relationships between environmental conditions and community size under variable detection processes. Four simulated scenarios with realistic and variable detection processes were considered. Results of simulations indicated that (1) raw species counts were always biased low, (2) single-visit jackknife and Chao estimators were significantly biased regardless of detection process, (3) multispecies occupancy models were more precise and generally less biased than the jackknife and Chao estimators, and (4) spatial heterogeneity resulting from the effects of a site covariate on species detection probabilities had significant impacts on the inferred relationships between species richness and a spatially explicit environmental condition. For a real dataset of bird observations in northwestern Alaska, the four estimation methods produced different estimates of local species richness, which severely affected inferences about the effects of shrubs on local avian richness. Overall, our results
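
    For reference, the jackknife and Chao estimators mentioned here have simple closed forms over incidence data. A sketch with a made-up site-by-species detection matrix, showing the first-order jackknife and Chao2 alongside the raw observed count:

    ```python
    # Sketch: incidence-based richness estimators from a hypothetical
    # site-by-species detection matrix (rows = sites, columns = species;
    # 1 = species detected at that site).
    sites = [
        [1, 1, 0, 0, 1, 0],
        [1, 0, 1, 0, 0, 0],
        [1, 1, 0, 0, 0, 1],
        [1, 0, 0, 1, 0, 0],
    ]
    n_sites = len(sites)
    incidence = [sum(col) for col in zip(*sites)]      # sites per species
    observed = sum(1 for f in incidence if f > 0)
    q1 = sum(1 for f in incidence if f == 1)           # "uniques"
    q2 = sum(1 for f in incidence if f == 2)           # "duplicates"

    jackknife1 = observed + q1 * (n_sites - 1) / n_sites
    chao2 = observed + (q1 * (q1 - 1)) / (2 * (q2 + 1))
    print(f"observed={observed}, jackknife1={jackknife1:.1f}, chao2={chao2:.1f}")
    ```

    Both estimators inflate the observed count using the rarely detected species, which is exactly where single-visit data mislead; neither accounts for covariate-driven heterogeneity in detection, the failure mode the abstract highlights.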

  16. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    Science.gov (United States)

    James, P.

    2011-12-01

    With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface characterisation. Cavities are hence a limitation to certain redevelopment, and their detection is an ever more important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new robust investigation standard for the detection of cavities is sought, and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been utilised successfully in the detection of cavities in various geologies, but still have an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are inclined to rely on well-known methods rather than utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, with the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies, and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near-surface cavity shapes are modelled, including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters. Techniques can be compared and the limits of detection distance

  17. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

    Full Text Available Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable for performing anomaly detection on sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of prediction interval (ROC-PI), based on the definition of the ROC curve, which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, by the minimization of which the optimal CP is derived via the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.
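
    The underlying idea can be sketched with the classical Youden index J = sensitivity + specificity - 1, maximized over candidate cut-offs; the paper's modified index and its simulated-annealing search are not reproduced here, and the scores below are invented.

    ```python
    # Sketch: picking a cut-off by maximizing the classical Youden index
    # J = sensitivity + specificity - 1 over candidate thresholds. Invented data.
    anomaly_scores = [0.90, 0.80, 0.75, 0.60, 0.55]   # truly anomalous samples
    normal_scores = [0.70, 0.50, 0.40, 0.35, 0.20]    # truly normal samples

    def youden(threshold: float) -> float:
        sensitivity = sum(s >= threshold for s in anomaly_scores) / len(anomaly_scores)
        specificity = sum(s < threshold for s in normal_scores) / len(normal_scores)
        return sensitivity + specificity - 1

    thresholds = sorted(set(anomaly_scores + normal_scores))
    best = max(thresholds, key=youden)
    print(f"best threshold {best:.2f}, J = {youden(best):.2f}")
    ```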

  18. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolving test issues. DOEPOD relies on observed occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both Hit-Miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.

  19. Estimating species occurrence, abundance, and detection probability using zero-inflated distributions.

    Science.gov (United States)

    Wenger, Seth J; Freeman, Mary C

    2008-10-01

    Researchers have developed methods to account for imperfect detection of species with either occupancy (presence-absence) or count data using replicated sampling. We show how these approaches can be combined to simultaneously estimate occurrence, abundance, and detection probability by specifying a zero-inflated distribution for abundance. This approach may be particularly appropriate when patterns of occurrence and abundance arise from distinct processes operating at differing spatial or temporal scales. We apply the model to two data sets: (1) previously published data for a species of duck, Anas platyrhynchos, and (2) data for a stream fish species, Etheostoma scotti. We show that in these cases, an incomplete-detection zero-inflated modeling approach yields a superior fit to the data compared with other models. We propose that zero-inflated abundance models accounting for incomplete detection be considered when replicate count data are available.
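
    A sketch of the likelihood at a single site under such a model, assuming a zero-inflated Poisson for latent abundance and binomial detection for replicate counts; the marginalization over the unobserved true abundance is truncated at an assumed n_max.

```python
import numpy as np
from scipy.stats import poisson, binom

def zip_nmix_loglik(counts, psi, lam, p, n_max=100):
    """Log-likelihood of replicate counts at one site under a
    zero-inflated Poisson N-mixture model: with probability 1-psi the
    site is unoccupied (N=0); otherwise N ~ Poisson(lam) and each
    replicate count is Binomial(N, p)."""
    n = np.arange(n_max + 1)
    prior = psi * poisson.pmf(n, lam)              # occupied branch
    prior[0] += (1.0 - psi)                        # structural zeros
    lik_given_n = np.ones_like(n, dtype=float)
    for c in counts:                               # replicate surveys
        lik_given_n *= binom.pmf(c, n, p)          # zero whenever c > N
    return np.log(np.sum(prior * lik_given_n))

print(zip_nmix_loglik([3, 2, 4], psi=0.6, lam=5.0, p=0.5))
```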

  20. Modeling detection probability to improve marsh bird surveys in southern Canada and the Great Lakes states

    Directory of Open Access Journals (Sweden)

    Douglas C. Tozer

    2016-12-01

    Full Text Available Marsh birds are notoriously elusive, with variation in detection probability across species, regions, seasons, and different times of day and weather. Therefore, it is important to develop regional field survey protocols that maximize detections, but that also produce data for estimating and analytically adjusting for remaining differences in detections. We aimed to improve regional field survey protocols by estimating detection probability of eight elusive marsh bird species throughout two regions that have ongoing marsh bird monitoring programs: the southern Canadian Prairies (Prairie region) and the southern portion of the Great Lakes basin and parts of southern Québec (Great Lakes-St. Lawrence region). We accomplished our goal using generalized binomial N-mixture models and data from ~22,300 marsh bird surveys conducted between 2008 and 2014 by Bird Studies Canada's Prairie, Great Lakes, and Québec Marsh Monitoring Programs. Across all species, on average, detection probability was highest in the Great Lakes-St. Lawrence region from the beginning of May until mid-June, and then fell throughout the remainder of the season until the end of June; was lowest in the Prairie region in mid-May and then increased throughout the remainder of the season until the end of June; was highest during darkness compared with light; and did not vary significantly according to temperature (range: 0-30°C), cloud cover (0%-100%), or wind (0-20 kph), or during morning versus evening. We used our results to formulate improved marsh bird survey protocols for each region. Our analysis and recommendations are useful and contribute to conservation of wetland birds at various scales, from local single-species studies to the continental North American Marsh Bird Monitoring Program.

  1. Sampling little fish in big rivers: Larval fish detection probabilities in two Lake Erie tributaries and implications for sampling effort and abundance indices

    Science.gov (United States)

    Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Roseman, Edward F.; DeBruyne, Robin L.

    2014-01-01

    Larval fish are frequently sampled in coastal tributaries to determine factors affecting recruitment, evaluate spawning success, and estimate production from spawning habitats. Imperfect detection of larvae is common, because larval fish are small and unevenly distributed in space and time, and coastal tributaries are often large and heterogeneous. We estimated detection probabilities of larval fish from several taxa in the Maumee and Detroit rivers, the two largest tributaries of Lake Erie. We then demonstrated how accounting for imperfect detection influenced (1) the probability of observing taxa as present relative to sampling effort and (2) abundance indices for larval fish of two Detroit River species. We found that detection probabilities ranged from 0.09 to 0.91 but were always less than 1.0, indicating that imperfect detection is common among taxa and between systems. In general, taxa with high fecundities, small larval length at hatching, and no nesting behaviors had the highest detection probabilities. Also, detection probabilities were higher in the Maumee River than in the Detroit River. Accounting for imperfect detection produced up to fourfold increases in abundance indices for Lake Whitefish Coregonus clupeaformis and Gizzard Shad Dorosoma cepedianum. The effect of accounting for imperfect detection in abundance indices was greatest during periods of low abundance for both species. Detection information can be used to determine the appropriate level of sampling effort for larval fishes and may improve management and conservation decisions based on larval fish data.
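
    The basic correction the authors describe amounts to dividing a raw count index by the estimated detection probability. A toy illustration (numbers invented) showing how a detection probability of 0.25 produces the fourfold effect on an abundance index:

```python
def corrected_index(raw_count, detection_prob):
    """Detection-corrected abundance index: observed count divided by
    the estimated probability of detection (Horvitz-Thompson style)."""
    return raw_count / detection_prob

# Illustrative only: a low detection probability (e.g., 0.25) inflates
# the index fourfold relative to assuming perfect detection.
print(corrected_index(120, 0.25))   # 480.0
print(corrected_index(120, 1.00))   # 120.0
```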

  2. An Estimation of a Passive Infra-Red Sensor Probability of Detection

    International Nuclear Information System (INIS)

    Osman, E.A.; El-Gazar, M.I.; Shaat, M.K.; El-Kafas, A.A.; Zidan, W.I.; Wadoud, A.A.

    2009-01-01

    Passive Infra-Red (PIR) sensors are among the many detection sensors used to detect intrusion at nuclear sites. In this work, an estimate of a PIR sensor's probability of detection at a hypothetical facility is presented. Sensor performance testing was performed to determine whether a particular sensor would be acceptable in a proposed design. We had access to a sensor test field in which the sensor of interest was already properly installed and whose parameters had been set to optimal levels by preliminary testing. The PIR sensor construction, operation and design for the investigated nuclear site are explained. Walking and running intrusion tests were carried out inside the field areas of the PIR sensor to evaluate the sensor's performance during the intrusion process. Ten trials were performed experimentally, each attempting intrusion through the passive infra-red sensor network. The performance and intrusion responses of the PIR sensors inside the internal zones were recorded and evaluated.
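
    With only ten trials, interval estimates matter more than the point estimate. A sketch of one standard way to summarize such data, using a Jeffreys-prior Beta posterior; the 9-of-10 outcome below is invented for illustration.

```python
from scipy.stats import beta

def pod_posterior(hits, trials, lo=0.025, hi=0.975):
    """Bayesian estimate of sensor detection probability from a small
    number of intrusion trials, using a Jeffreys Beta(0.5, 0.5) prior."""
    a, b = 0.5 + hits, 0.5 + (trials - hits)
    post = beta(a, b)
    return post.mean(), post.ppf(lo), post.ppf(hi)

# e.g., 9 detections in 10 walk/run intrusion trials (illustrative numbers)
mean_p, lo95, hi95 = pod_posterior(9, 10)
print(f"POD ~ {mean_p:.2f}, 95% credible interval [{lo95:.2f}, {hi95:.2f}]")
```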

  3. Hotspots & other hidden targets: Probability of detection, number, frequency and area

    International Nuclear Information System (INIS)

    Vita, C.L.

    1994-01-01

    Concepts and equations are presented for making probability-based estimates of the detection probability, and the number, frequency, and area of hidden targets, including hotspots, at a given site. Targets include hotspots, which are areas of extreme or particular contamination, and any object or feature that is hidden from direct visual observation--including buried objects and geologic or hydrologic details or anomalies. Being Bayesian, results are fundamentally consistent with observational methods. Results are tools for planning or interpreting exploration programs used in site investigation or characterization, remedial design, construction, or compliance monitoring, including site closure. Used skillfully and creatively, these tools can help streamline and expedite environmental restoration, reducing time and cost, making site exploration cost-effective, and providing acceptable risk at minimum cost. 14 refs., 4 figs
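
    One classical result of this kind (a standard approximation, not necessarily the paper's exact equations): the probability that a square boring grid hits a circular hotspot whose centre falls uniformly at random relative to the grid, and the grid spacing required for a target detection probability.

```python
import math

def hotspot_detection_prob(radius, grid_spacing):
    """Approximate probability that a square boring grid with the given
    spacing hits a circular hotspot of the given radius, assuming the
    hotspot centre is uniformly located relative to the grid."""
    return min(1.0, math.pi * radius**2 / grid_spacing**2)

def grid_spacing_for_prob(radius, target_prob):
    """Grid spacing needed to achieve a target detection probability."""
    return radius * math.sqrt(math.pi / target_prob)

print(hotspot_detection_prob(radius=5.0, grid_spacing=20.0))  # ~0.196
print(grid_spacing_for_prob(radius=5.0, target_prob=0.9))     # ~9.3 m
```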

  4. Probability based high temperature engineering creep and structural fire resistance

    CERN Document Server

    Razdolsky, Leo

    2017-01-01

    This volume on structural fire resistance is for aerospace, structural, and fire prevention engineers, as well as architects and educators. It bridges the gap between prescriptive- and performance-based methods and simplifies very complex and comprehensive computer analyses to the point that the structural fire resistance and high-temperature creep deformations have a simple, approximate analytical expression that can be used in structural analysis and design. The book emphasizes methods of the theory of engineering creep (stress-strain diagrams) and mathematical operations quite distinct from those of solid mechanics without high-temperature creep deformations, in particular the classical theory of elasticity and structural engineering. Dr. Razdolsky’s previous books focused on methods of computing the ultimate structural design load for different fire scenarios. The current work is devoted to computing the estimated ultimate resistance of the structure taking into account the effect of high temperatur...

  5. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
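
    A sketch of the basic POD-versus-concentration computation such a model supports: the observed fraction of positive results at each spike level, with a Wilson confidence interval. The data rows are invented for illustration.

```python
import math

def pod_point(positives, replicates, z=1.96):
    """POD at one concentration: observed fraction of positive results
    with a Wilson 95% confidence interval."""
    p = positives / replicates
    denom = 1 + z**2 / replicates
    centre = (p + z**2 / (2 * replicates)) / denom
    half = z * math.sqrt(p * (1 - p) / replicates
                         + z**2 / (4 * replicates**2)) / denom
    return p, max(0.0, centre - half), min(1.0, centre + half)

# collaborative-study style data: (spike level CFU/g, positives, replicates)
levels = [(0.0, 1, 60), (0.7, 38, 60), (2.0, 57, 60), (5.0, 60, 60)]
for conc, pos, n in levels:
    p, lo, hi = pod_point(pos, n)
    print(f"{conc:4.1f}: POD = {p:.2f}  [{lo:.2f}, {hi:.2f}]")
```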

  6. Estimating factors influencing the detection probability of semiaquatic freshwater snails using quadrat survey methods

    Science.gov (United States)

    Roesler, Elizabeth L.; Grabowski, Timothy B.

    2018-01-01

    Developing effective monitoring methods for elusive, rare, or patchily distributed species requires extra considerations, such as imperfect detection. Although detection is frequently modeled, the opportunity to assess it empirically is rare, particularly for imperiled species. We used Pecos assiminea (Assiminea pecos), an endangered semiaquatic snail, as a case study to test detection and accuracy issues surrounding quadrat searches. Quadrats (9 × 20 cm; n = 12) were placed in suitable Pecos assiminea habitat and randomly assigned a treatment, defined as the number of empty snail shells (0, 3, 6, or 9). Ten observers rotated through each quadrat, conducting 5-min visual searches for shells. The probability of detecting a shell when present was 67.4 ± 3.0%, but it decreased with increasing litter depth and decreasing number of shells present. The mean (± SE) observer accuracy was 25.5 ± 4.3%. Accuracy was positively correlated with the number of shells in the quadrat and negatively correlated with the number of times a quadrat was searched. The results indicate quadrat surveys likely underrepresent true abundance but accurately determine presence or absence. Understanding detection and accuracy for elusive, rare, or imperiled species improves density estimates and aids monitoring and conservation efforts.

  7. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performance, with 90% sensitivity and 89% specificity. (author)
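
    The paper's uncrossing mapping and LPD enhancement are specific to its mammographic features; the following toy sketch reproduces only the core idea that a crossed (XOR-like) feature pair becomes linearly separable at a zero hyperplane after an appropriate mapping.

```python
import numpy as np
from sklearn.svm import SVC

# XOR-style "crossed" pair: neither feature alone separates the classes
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

def uncross(X):
    """Toy stand-in for an uncrossing mapping: the product feature
    separates an XOR-like pair at a zero hyperplane."""
    return np.column_stack([X, X[:, 0] * X[:, 1]])

raw = SVC(kernel="linear").fit(X[:300], y[:300])
mapped = SVC(kernel="linear").fit(uncross(X[:300]), y[:300])
print("raw features:   ", raw.score(X[300:], y[300:]))
print("mapped features:", mapped.score(uncross(X[300:]), y[300:]))
```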

  8. High Speed Edge Detection

    Science.gov (United States)

    Prokop, Norman F (Inventor)

    2016-01-01

    Analog circuits for detecting edges in pixel arrays are disclosed. A comparator may be configured to receive an all pass signal and a low pass signal for a pixel intensity in an array of pixels. A latch may be configured to receive a counter signal and a latching signal from the comparator. The comparator may be configured to send the latching signal to the latch when the all pass signal is below the low pass signal minus an offset. The latch may be configured to hold a last negative edge location when the latching signal is received from the comparator.
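
    A software mimic of the described comparator/latch logic (the analog circuit itself is the invention; the low-pass values and offset below are assumptions for illustration):

```python
def detect_edges(all_pass, low_pass, offset):
    """Software mimic of the comparator/latch scheme: latch the pixel
    index whenever the all-pass signal drops below the low-pass signal
    minus an offset (a falling-edge condition)."""
    latched = []
    for i, (ap, lp) in enumerate(zip(all_pass, low_pass)):
        if ap < lp - offset:
            latched.append(i)          # latch holds the last such index
    return latched[-1] if latched else None

row = [10, 10, 10, 2, 2, 10, 10]                 # pixel intensities
smooth = [10, 10, 8, 6, 5, 7, 9]                 # assumed low-pass output
print(detect_edges(row, smooth, offset=1))       # -> last negative edge (4)
```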

  9. High probability of comorbidities in bronchial asthma in Germany.

    Science.gov (United States)

    Heck, S; Al-Shobash, S; Rapp, D; Le, D D; Omlor, A; Bekhit, A; Flaig, M; Al-Kadah, B; Herian, W; Bals, R; Wagenpfeil, S; Dinh, Q T

    2017-04-21

    Clinical experience has shown that allergic and non-allergic respiratory, metabolic, mental, and cardiovascular disorders sometimes coexist with bronchial asthma. However, no study has been carried out that calculates the chance of manifestation of these disorders with bronchial asthma in Saarland and Rhineland-Palatinate, Germany. Using ICD10 diagnoses from health care institutions, the present study systematically analyzed the co-prevalence and odds ratios of comorbidities in the asthma population in Germany. The odds ratios were adjusted for age and sex for all comorbidities for patients with asthma vs. without asthma. Bronchial asthma was strongly associated with allergic and, to a lesser extent, with non-allergic comorbidities: OR 7.02 (95%CI:6.83-7.22) for allergic rhinitis; OR 4.98 (95%CI:4.67-5.32) allergic conjunctivitis; OR 2.41 (95%CI:2.33-2.52) atopic dermatitis; OR 2.47 (95%CI:2.16-2.82) food allergy, and OR 1.69 (95%CI:1.61-1.78) drug allergy. Interestingly, increased ORs were found for respiratory diseases: 2.06 (95%CI:1.64-2.58) vocal dysfunction; 1.83 (95%CI:1.74-1.92) pneumonia; 1.78 (95%CI:1.73-1.84) sinusitis; 1.71 (95%CI:1.65-1.78) rhinopharyngitis; 2.55 (95%CI:2.03-3.19) obstructive sleep apnea; 1.42 (95%CI:1.25-1.61) pulmonary embolism, and 3.75 (95%CI:1.64-8.53) bronchopulmonary aspergillosis. Asthmatics also suffer from psychiatric, metabolic, cardiac or other comorbidities. Myocardial infarction (OR 0.86, 95%CI:0.79-0.94) was negatively associated with asthma. Based on the calculated chances of manifestation, these comorbidities (especially allergic and respiratory, and to a lesser extent metabolic, cardiovascular, and mental disorders) should be taken into consideration in the diagnostic and treatment strategy of bronchial asthma. PREVALENCE OF CO-EXISTING DISEASES IN GERMANY: Patients in Germany with bronchial asthma are highly likely to suffer from co-existing diseases and their treatments should reflect this. Quoc Thai Dinh at Saarland

  10. Highly enhanced avalanche probability using sinusoidally-gated silicon avalanche photodiode

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Shingo; Namekata, Naoto, E-mail: nnao@phys.cst.nihon-u.ac.jp; Inoue, Shuichiro [Institute of Quantum Science, Nihon University, 1-8-14 Kanda-Surugadai, Chiyoda-ku, Tokyo 101-8308 (Japan); Tsujino, Kenji [Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666 (Japan)

    2014-01-27

    We report on visible light single photon detection using a sinusoidally-gated silicon avalanche photodiode. Detection efficiency of 70.6% was achieved at a wavelength of 520 nm when an electrically cooled silicon avalanche photodiode with a quantum efficiency of 72.4% was used, which implies that a photo-excited single charge carrier in a silicon avalanche photodiode can trigger a detectable avalanche (charge) signal with a probability of 97.6%.

  11. [Detecting high risk pregnancy].

    Science.gov (United States)

    Doret, Muriel; Gaucherand, Pascal

    2009-12-20

    Antenatal care aims to reduce maternal and foetal mortality and morbidity. Maternal and foetal mortality can be due to different causes. Knowledge of these causes allows the identification of pregnancies with factors associated with an increased risk of maternal and/or foetal mortality and serious morbidity (high-risk pregnancies). Identification of high-risk pregnancies and initiation of appropriate treatment and/or surveillance should improve maternal and/or foetal outcome. New risk factors are continuously being described thanks to improvements in antenatal care and developments in biology and cytopathology, increasing the complexity of identifying high-risk pregnancies. The level of risk can change over the course of the pregnancy. Ideally, it should be evaluated prior to the pregnancy and at each antenatal visit. Clinical examination can screen for intra-uterine growth restriction, pre-eclampsia and threatened preterm labour; ultrasound helps in the diagnosis of foetal morphological anomalies, foetal chromosomal anomalies, placenta praevia and abnormal foetal growth; biological exams are used to screen for pre-eclampsia, gestational diabetes, trisomy 21 (for which the screening method has recently changed), rhesus immunisation, seroconversion for toxoplasmosis or rubella, and undiagnosed infectious disease (syphilis, hepatitis B, HIV). During pregnancy, most preventive strategies have to be initiated during the first trimester or even before conception. Prevention of neural-tube defects, neonatal hypocalcemia and listeriosis should be offered to all women. In contrast, some measures concern only women with risk factors, such as prevention of toxoplasmosis, rhesus immunisation (for which recommendations recently changed), tobacco-related complications, pre-eclampsia and intra-uterine growth restriction.

  12. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    Science.gov (United States)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, while it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT testing, the empirical information, needed for POD methods, can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed for cost savings of MAPOD methods while simultaneously ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat bottom hole flaw in an aluminum block. The results show that the stochastic surrogates have at least two orders of magnitude faster convergence on the statistics than direct Monte Carlo sampling (MCS). Moreover, the evaluation of the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, which is the UTSim2 model.
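
    A minimal sketch of non-intrusive polynomial chaos by ordinary least squares for a single standard-normal input, using probabilists' Hermite polynomials; the exponential test function below stands in for the ultrasonic response model and is not UTSim2.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def nipc_ols_stats(model, degree=4, n_train=200, seed=0):
    """Non-intrusive polynomial chaos by ordinary least squares for a
    model with one standard-normal input: fit probabilists' Hermite
    coefficients, then read off mean and variance from orthogonality."""
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_train)
    Psi = hermevander(xi, degree)            # He_0..He_degree at samples
    coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)
    norms = np.array([math.factorial(k) for k in range(degree + 1)],
                     dtype=float)            # E[He_k^2] = k!
    mean = coef[0]
    var = np.sum(coef[1:] ** 2 * norms[1:])
    return mean, var

model = lambda x: np.exp(0.3 * x)            # stand-in for a UT response
print(nipc_ols_stats(model))                 # ~ (1.046, 0.103)
```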

  13. What is the probability that direct detection experiments have observed dark matter?

    International Nuclear Information System (INIS)

    Bozorgnia, Nassim; Schwetz, Thomas

    2014-01-01

    In Dark Matter direct detection we face a situation in which some experiments report positive signals that are in conflict with limits from other experiments. Such conclusions are subject to large uncertainties introduced by the poorly known local Dark Matter distribution. We present a method to calculate an upper bound on the joint probability of obtaining the outcome of two potentially conflicting experiments under the assumption that the Dark Matter hypothesis is correct, but completely independent of assumptions about the Dark Matter distribution. In this way we can quantify the compatibility of two experiments in an astrophysics-independent way. We illustrate our method by testing the compatibility of the hints reported by DAMA and CDMS-Si with the limits from the LUX and SuperCDMS experiments. The method does not require Monte Carlo simulations but is mostly based on Poisson statistics. In order to deal with signals of few events we introduce the so-called ''signal length'' to take into account energy information. The signal length method provides a simple way to calculate the probability of obtaining a given experimental outcome under a specified Dark Matter and background hypothesis

  14. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe walls using probability of detection (POD). Pipe-wall thicknesses are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by human factors and include some errors, because the inspectors have different levels of experience and frequencies of inspection. To ensure the reliability of inspection results, POD is first used to evaluate experimental results of pipe-wall thickness inspection. We verify that the results differ among inspectors, including qualified inspectors. Second, two human factors that affect POD are identified. Finally, it is confirmed that POD can identify these human factors and ensure the reliability of pipe-wall thickness inspections. (author)

  15. Detection and Classification of Low Probability of Intercept Radar Signals Using Parallel Filter Arrays and Higher Order Statistics

    National Research Council Canada - National Science Library

    Taboada, Fernando

    2002-01-01

    Low probability of intercept (LPI) is that property of an emitter that because of its low power, wide bandwidth, frequency variability, or other design attributes, makes it difficult to be detected or identified by means of passive...

  16. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    Science.gov (United States)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.

  17. Raman spectroscopy detection of platelet for Alzheimer’s disease with predictive probabilities

    International Nuclear Information System (INIS)

    Wang, L J; Du, X Q; Du, Z W; Yang, Y Y; Chen, P; Wang, X H; Cheng, Y; Peng, J; Shen, A G; Hu, J M; Tian, Q; Shang, X L; Liu, Z C; Yao, X Q; Wang, J Z

    2014-01-01

    Alzheimer’s disease (AD) is a common form of dementia. Early and differential diagnosis of AD has always been an arduous task for the medical expert, due to the unapparent early symptoms and the currently imperfect imaging examination methods. Therefore, obtaining reliable markers with clinical diagnostic value from easily assembled samples is worthy and significant. Our previous work with laser Raman spectroscopy (LRS), in which we examined platelet samples from AD transgenic mice of different ages and from non-transgenic controls, showed considerable promise for the diagnosis of AD. In that work, a multilayer perceptron (MLP) network classification method was adopted to discriminate the spectral data. However, there were disturbances in the data set, induced by noise from the instruments and other sources; thus the MLP method had to be trained with large-scale data. In this paper, we aim to re-establish the classification models of early and advanced AD and the control group with fewer features, and to apply some mechanism of noise reduction to improve the accuracy of the models. An adaptive classification method based on the Gaussian process (GP), featuring predictive probabilities, is proposed, which can indicate when a data set is related to some kind of disease. Compared with MLP on the same feature set, GP showed much better performance in the experimental results. Moreover, the GP approach has good expansibility and can be applied in the diagnosis of many other similar diseases, such as Parkinson’s disease (PD). Spectral data of 4-month and 12-month AD platelets, as well as control data, were collected. With predictive probabilities, the proposed GP classification method improved the diagnostic sensitivity to nearly 100%. Samples were also collected from PD platelets for classification and comparison with the 12-month AD samples. The presented approach and our experiments indicate that utilization of GP with predictive probabilities in
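
    A sketch of GP classification with predictive probabilities in the sense used here, run on synthetic stand-in features rather than real Raman spectra (scikit-learn's GaussianProcessClassifier is an assumption about tooling, not the paper's implementation):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Synthetic stand-in for spectral features of control vs. AD platelets
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (40, 5)),
               rng.normal(1.5, 1.0, (40, 5))])
y = np.array([0] * 40 + [1] * 40)

gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                               random_state=0).fit(X, y)
proba = gp.predict_proba(X[:3])   # predictive probabilities per class
print(np.round(proba, 3))
```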

  18. Probability of detection for bolt hole eddy current in extracted-from-service aircraft wing structures

    Science.gov (United States)

    Underhill, P. R.; Uemura, C.; Krause, T. W.

    2018-04-01

    Fatigue cracks are prone to develop around fasteners found in multi-layer aluminum structures on aging aircraft. Bolt hole eddy current (BHEC) is used for detection of cracks from within bolt holes after fastener removal. In support of qualification towards a target a90/95 (detection of 90% of cracks of depth a, 95% of the time) of 0.76 mm (0.030"), a preliminary probability of detection (POD) study was performed to identify those parameters whose variation may keep a bolt hole inspection from attaining its goal. Parameters that were examined included variability in lift-off due to probe type, out-of-round holes, holes with diameters too large to permit surface contact of the probe, and mechanical damage to the holes, including burrs. The study examined the POD for BHEC of corner cracks in unfinished fastener holes extracted from service material. 68 EDM notches were introduced into two specimens of a horizontal stabilizer from a CC-130 Hercules aircraft. The fastener holes were inspected in the unfinished state, simulating potential inspection conditions, by 7 certified inspectors using a manual BHEC setup with an impedance plane display, and with one additional inspection conducted utilizing an automated BHEC C-scan apparatus. While the standard detection limit of 1.27 mm (0.050") was achieved, the observed a90/95 was 0.97 mm (0.039"), so the target of 0.76 mm (0.030") was not achieved. The work highlighted a number of areas where there was insufficient information to complete the qualification. Consequently, a number of recommendations were made. These included: development of a specification for minimum probe requirements; criteria for the condition of the hole to be inspected, including out-of-roundness and presence of corrosion pits; a statement of the range of hole sizes; and inspection frequency and data display for analysis.

  19. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  20. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Science.gov (United States)

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  2. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    Science.gov (United States)

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically less than one, and accounting for the probability of detection is important for obtaining robust estimates of population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection is dependent on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6- to 7-fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater than in Torres Strait, and the effect of correcting for depth-specific detection probability was much smaller. The methodology has application to visual surveys of coastal megafauna including surveys using Unmanned

  3. Influence of porcine circovirus type 2 vaccination on the probability and severity of pneumonia detected postmortem.

    Science.gov (United States)

    Raith, J; Kuchling, S; Schleicher, C; Schobesberger, H; Köfer, J

    2015-01-31

    To evaluate the influence of porcine circovirus type 2 (PCV-2) vaccination on the probability and severity of pneumonia, postmortem findings of 247,505 pigs slaughtered between 2008 and 2011 were analysed by applying a cumulative link mixed model. Three major effects could be observed: (1) PCV-2 vaccination significantly (P<0.01) reduced the odds (coefficient: -0.05) of postmortem findings of mild, moderate and severe pneumonia for vaccinated pigs. (2) Pigs from fattening farms were less likely (coefficient: -0.44; P<0.05) to exhibit signs of pneumonia at slaughter than pigs from farrow-to-finish farms. (3) When vaccinated, the odds of detecting postmortem signs showed an even more pronounced reduction (coefficient: -0.19; P<0.001) for pigs from fattening farms. Combining PCV-2 vaccination, farm type and interaction effects between these two factors, a pig vaccinated against PCV-2 from a fattening farm had only half the chance (OR 0.51) of pneumonia being detected at postmortem compared with a non-vaccinated pig from a farrow-to-finish farm. The study demonstrates the benefit of a vaccination programme against PCV-2 as an important tool to reduce the risk of postmortem pneumonia findings and the severity of pneumonia in pigs at slaughter. British Veterinary Association.

  4. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties due to variances in material strengths and analytical uncertainties which were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles
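
    Given the reported median (132.4 psia) and 95th percentile (146.7 psia), the two Weibull parameters can be recovered in closed form. A sketch of that back-calculation; the implied 5th percentile comes out near the reported 112 psia, which serves only as a consistency check, not a reproduction of the study's fit.

```python
import math

def weibull_from_quantiles(x50, x95):
    """Recover Weibull shape k and scale lam from the median and the
    95th percentile, using F(x) = 1 - exp(-(x/lam)^k)."""
    k = math.log(math.log(20.0) / math.log(2.0)) / math.log(x95 / x50)
    lam = x50 / math.log(2.0) ** (1.0 / k)
    return k, lam

k, lam = weibull_from_quantiles(132.4, 146.7)
p5 = lam * (-math.log(0.95)) ** (1.0 / k)     # implied 5th percentile
print(f"shape k ~ {k:.1f}, scale ~ {lam:.1f} psia, 5th pct ~ {p5:.1f} psia")
```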

  5. Variable terrestrial GPS telemetry detection rates: Addressing the probability of successful acquisitions

    Science.gov (United States)

    Ironside, Kirsten E.; Mattson, David J.; Choate, David; Stoner, David; Arundel, Terry; Hansen, Jered R.; Theimer, Tad; Holton, Brandon; Jansen, Brian; Sexton, Joseph O.; Longshore, Kathleen M.; Edwards, Thomas C.; Peters, Michael

    2017-01-01

    Studies using global positioning system (GPS) telemetry rarely result in 100% fix success rates (FSR), which may bias datasets because data loss is systematic rather than a random process. Previous spatially explicit models developed to correct for sampling bias have been limited to small study areas or a small range of data loss, or were study-area specific. We modeled environmental effects on FSR from desert to alpine biomes, investigated the full range of potential data loss (0–100% FSR), and evaluated whether animal body position can contribute to lower FSR because of changes in antenna orientation, based on GPS detection rates for 4 focal species: cougars (Puma concolor), desert bighorn sheep (Ovis canadensis nelsoni), Rocky Mountain elk (Cervus elaphus nelsoni), and mule deer (Odocoileus hemionus). Terrain exposure and height of overstory vegetation were the most influential factors affecting FSR. Model evaluation showed a strong correlation (0.88) between observed and predicted FSR and no significant differences between predicted and observed FSRs using 2 independent validation datasets. We found that cougars and canyon-dwelling bighorn sheep may select for environmental features that influence their detectability by GPS technology, mule deer may select against these features, and elk appear to be nonselective. We observed temporal patterns in missed fixes only for cougars. We provide a model for cougars, predicting fix success by time of day, that is likely due to circadian changes in collar orientation and selection of daybed sites. We also provide a model predicting the probability of GPS fix acquisitions given environmental conditions, which had a strong relationship (r² = 0.82) with deployed collar FSRs across species.

  6. Operational NDT simulator, towards human factors integration in simulated probability of detection

    Science.gov (United States)

    Rodat, Damien; Guibert, Frank; Dominguez, Nicolas; Calmon, Pierre

    2017-02-01

    In the aeronautic industry, the performance demonstration of Non-Destructive Testing (NDT) procedures relies on Probability Of Detection (POD) analyses. This statistical approach measures the ability of the procedure to detect a flaw with regard to one of its characteristic dimensions. The inspection chain is evaluated as a whole, including equipment configuration, probe efficiency, but also operator manipulations. Traditionally, a POD study requires an expensive campaign during which several operators apply the procedure on a large set of representative samples. Recently, new perspectives for POD estimation have been introduced using NDT simulation to generate data. However, these approaches do not offer straightforward solutions to take the operator into account. The simulation of human factors, including cognitive aspects, often raises questions. To address these difficulties, we propose a concept of operational NDT simulator [1]. This work presents the first steps in the implementation of such a simulator for ultrasound phased array inspection of composite parts containing Flat Bottom Holes (FBHs). The final system will look like classical ultrasound testing equipment with a single exception: the displayed signals will be synthesized. Our hardware (ultrasound acquisition card, 3D position tracker) and software (position analysis, inspection scenario, synchronization, simulations) environments are developed as a bench to test the meta-modeling techniques able to provide fast-simulated realistic ultrasound signals. The results presented here are obtained by on-the-fly merging of real and simulated signals. They confirm the feasibility of our approach: the replacement of real signals by purely simulated ones has gone unnoticed by operators. We believe this simulator is a great prospect for POD evaluation including human factors, and may also find applications for training or procedure set-up.

  7. Comparing rapid methods for detecting Listeria in seafood and environmental samples using the most probable number (MPN) technique.

    Science.gov (United States)

    Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C

    2012-02-15

    The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10² CFU/10 cm²). Two kits (VIP™ and Petrifilm™) failed to detect 10⁴ CFU/10 cm². The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high temperature, high purity water. The following conclusions were reached: (1) The initiation process of intergranular stress corrosion cracking can be assumed to be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)

  9. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    Science.gov (United States)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using results of three NDI methods obtained by inspecting actual aircraft engine compressor disks which contained service-induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size, as well as the 90/95 percent crack length, vary depending on the statistical method used and the type of data. The distribution function as well as the parameter estimation procedure used for determining POD and the confidence bound must be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that is not from the inspection data. The maximum likelihood estimators (MLE) method does not require such information and the POD results are more reasonable. The log-logistic function appears to model POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented on a common spreadsheet program.
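
    A sketch of this style of analysis in miniature: a log-logistic POD curve (logit linear in ln a) fitted to hit/miss data by MLE, here via an effectively unpenalized logistic regression. The crack sizes and outcomes below are invented, and confidence bounds (needed for a true 90/95 length) would require additional likelihood-based machinery.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# hit/miss data: crack lengths (mm) and inspection outcomes (illustrative)
a = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0,
              0.6, 0.9, 1.1, 1.4, 1.6, 1.9, 2.2, 2.8, 3.5, 5.0])
hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1,
                0, 0, 1, 0, 1, 1, 1, 1, 1, 1])

# log-logistic POD: logit(POD) linear in ln(a), fitted by (near-)MLE
mdl = LogisticRegression(C=1e4, max_iter=1000).fit(np.log(a)[:, None], hit)
b0, b1 = mdl.intercept_[0], mdl.coef_[0, 0]
a90 = np.exp((np.log(9.0) - b0) / b1)   # POD = 0.90 -> logit = ln 9
print(f"a90 ~ {a90:.2f} mm")
```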

  10. A Comparison of Error Bounds for a Nonlinear Tracking System with Detection Probability Pd < 1

    Science.gov (United States)

    Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin

    2012-01-01

    Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filtering, when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, which are the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and enumeration method (ENUM) PCRLB are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, while it is tighter than the IRF PCRLB, when the target exists from the beginning to the end. Considering the disappearance of existing targets and the appearance of new targets, the RFS bound is tighter than both IRF PCRLB and ENUM PCRLB with time, by introducing the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds. PMID:23242274

  11. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    Science.gov (United States)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR) from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and ΔT. Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.

  12. Review of Literature on Probability of Detection for Magnetic Particle Nondestructive Testing

    Science.gov (United States)

    2013-01-01

    ...a precipitation-hardened martensitic stainless steel. The inspections were based on MIL-STD-1949A [51], now superseded but current at the time... inspector population involved in the tests, it is not possible to draw any further conclusions. MPT of flat 17-4PH stainless steel plates. A brief... inspection method used to detect surface-breaking cracks in high-strength steel components. A survey of the available literature on the reliability

  13. Probability of acoustic transmitter detections by receiver lines in Lake Huron: results of multi-year field tests and simulations

    Science.gov (United States)

    Hayden, Todd A.; Holbrook, Christopher M.; Binder, Thomas; Dettmers, John M.; Cooke, Steven J.; Vandergoot, Christopher S.; Krueger, Charles C.

    2016-01-01

    Background: Advances in acoustic telemetry technology have led to an improved understanding of the spatial ecology of many freshwater and marine fish species. Understanding the performance of acoustic receivers is necessary to distinguish tagged fish that may have been present but not detected from those fish that were absent from the area. In this study, two stationary acoustic transmitters were deployed 250 m apart within each of four acoustic receiver lines, each containing at least 10 receivers (i.e., eight acoustic transmitters in total), located in Saginaw Bay and central Lake Huron for nearly 2 years to determine whether the probability of detecting an acoustic transmission varied as a function of time (i.e., season), location, and distance between acoustic transmitter and receiver. Distances between acoustic transmitters and receivers ranged from 200 m to >10 km in each line. The daily observed probability of detecting an acoustic transmission was used in simulation models to estimate the probability of detecting a moving acoustic transmitter on a line of receivers. Results: The probability of detecting an acoustic transmitter on a receiver 1000 m away differed by month for different receiver lines in Lake Huron and Saginaw Bay but was similar for paired acoustic transmitters deployed 250 m apart within the same line. Mean probability of detecting an acoustic transmitter at 1000 m calculated over the study period varied among acoustic transmitters 250 m apart within a line and differed among receiver lines in Lake Huron and Saginaw Bay. The simulated probability of detecting a moving acoustic transmitter on a receiver line was characterized by short periods of time with decreased detection. Although increased receiver spacing and higher fish movement rates decreased simulated detection probability, the location of the simulated receiver line in Lake Huron had the strongest effect on simulated detection probability. Conclusions: Performance of receiver
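
    A sketch of the kind of line-crossing simulation described, under assumed parameters: per-transmission detection probability decaying logistically with distance, a tag crossing perpendicular to a line of receivers, and "detection" meaning at least one received transmission. All range-model parameters are illustrative.

```python
import numpy as np

def p_detect_crossing(speed_ms, delay_s, receiver_x, d0=1000.0, k=0.005,
                      half_width=3000.0):
    """Probability that at least one transmission from a tag moving
    perpendicularly across a receiver line is detected. Per-ping
    detection decays logistically with tag-receiver distance
    (midpoint d0 m, steepness k); receivers sit at positions receiver_x
    (m) along the line; the tag path runs from -half_width to
    +half_width (m) through the line's origin."""
    times = np.arange(0.0, 2 * half_width / speed_ms, delay_s)
    ys = -half_width + speed_ms * times          # tag position vs time
    p_miss_all = 1.0
    for y in ys:
        d = np.hypot(receiver_x, y)              # distance to each receiver
        p_ping = 1.0 / (1.0 + np.exp(k * (d - d0)))
        p_miss_all *= np.prod(1.0 - p_ping)      # miss on every receiver
    return 1.0 - p_miss_all

line = np.arange(-5000, 5001, 1000.0)            # 11 receivers, 1 km apart
print(p_detect_crossing(speed_ms=1.0, delay_s=120.0, receiver_x=line))
```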

  14. Probability of Detection Study to Assess the Performance of Nondestructive Inspection Methods for Wind Turbine Blades.

    Energy Technology Data Exchange (ETDEWEB)

    Roach, Dennis P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rice, Thomas M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Paquette, Joshua [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Wind turbine blades pose a unique set of inspection challenges that span from very thick and attenuative spar cap structures to porous bond lines, varying core material and a multitude of manufacturing defects of interest. The need for viable, accurate nondestructive inspection (NDI) technology becomes more important as the cost per blade, and lost revenue from downtime, grows. NDI methods must not only be able to contend with the challenges associated with inspecting extremely thick composite laminates and subsurface bond lines, but must also address new inspection requirements stemming from the growing understanding of blade structural aging phenomena. Under its Blade Reliability Collaborative program, Sandia Labs quantitatively assessed the performance of a wide range of NDI methods that are candidates for wind blade inspections. Custom wind turbine blade test specimens, containing engineered defects, were used to determine critical aspects of NDI performance including sensitivity, accuracy, repeatability, speed of inspection coverage, and ease of equipment deployment. The detection of fabrication defects helps enhance plant reliability and increase blade life while improved inspection of operating blades can result in efficient blade maintenance, facilitate repairs before critical damage levels are reached and minimize turbine downtime. The Sandia Wind Blade Flaw Detection Experiment was completed to evaluate different NDI methods that have demonstrated promise for interrogating wind blades for manufacturing flaws or in-service damage. These tests provided the Probability of Detection information needed to generate industry-wide performance curves that quantify: 1) how well current inspection techniques are able to reliably find flaws in wind turbine blades (industry baseline) and 2) the degree of improvements possible through integrating more advanced NDI techniques and procedures.

  15. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.

  16. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applied to the population demography of each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55%). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with a high probability of osteoporotic fracture comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  17. The Probability of Detection in the Telephone Line of Device of the Unauthorized Removal of Information

    Directory of Open Access Journals (Sweden)

    I. V. Svintsov

    2011-06-01

    Full Text Available The article develops a quantitative, probability-theoretic description of the possible presence, in a telephone line, of devices for the unauthorized removal of information.

  18. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When the clinical probability of pulmonary embolism (PE) is high, the sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: a ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely exclude PE in patients with a high clinical pretest probability.
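
    The reported interval can be reproduced from the counts alone. The sketch below recomputes the pooled VTE incidence and its 95% CI for 7 events among 134 patients using a standard Wald interval; the abstract does not state which interval construction was used, so the Wald choice is an assumption.

```python
import math

# Binomial proportion with a Wald 95% confidence interval.
events, n = 7, 134
p = events / n
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"incidence = {100 * p:.1f}%, 95% CI {100 * lo:.1f}-{100 * hi:.1f}%")
# Prints: incidence = 5.2%, 95% CI 1.5-9.0%, matching the reported values.
```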

  19. Detection probability of cliff-nesting raptors during helicopter and fixed-wing aircraft surveys in western Alaska

    Science.gov (United States)

    Booms, T.L.; Schempf, P.F.; McCaffery, B.J.; Lindberg, M.S.; Fuller, M.R.

    2010-01-01

    We conducted repeated aerial surveys for breeding cliff-nesting raptors on the Yukon Delta National Wildlife Refuge (YDNWR) in western Alaska to estimate detection probabilities of Gyrfalcons (Falco rusticolus), Golden Eagles (Aquila chrysaetos), Rough-legged Hawks (Buteo lagopus), and Common Ravens (Corvus corax). Using the program PRESENCE, we modeled detection histories of each species based on single-species occupancy modeling. We used different observers during four helicopter replicate surveys in the Kilbuck Mountains and five fixed-wing replicate surveys in the Ingakslugwat Hills near Bethel, AK. During helicopter surveys, Gyrfalcons had the highest detection probability estimate (p̂ = 0.79; SE 0.05), followed by Golden Eagles (p̂ = 0.68; SE 0.05), Common Ravens (p̂ = 0.45; SE 0.17), and Rough-legged Hawks (p̂ = 0.10; SE 0.11). Detection probabilities from fixed-wing aircraft in the Ingakslugwat Hills were similar to those from the helicopter in the Kilbuck Mountains for Gyrfalcons and Golden Eagles, but were higher for Common Ravens (p̂ = 0.85; SE 0.06) and Rough-legged Hawks (p̂ = 0.42; SE 0.07). Fixed-wing aircraft provided detection probability estimates and SEs in the Ingakslugwat Hills similar to or better than those from helicopter surveys in the Kilbucks and should be considered for future cliff-nesting raptor surveys where safe, low-altitude flight is possible. Overall, detection probability varied by observer experience and, in some cases, by study area/aircraft type.
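
    One practical use of such per-survey estimates is the cumulative probability of detecting an occupying species at least once in k replicate surveys, P = 1 − (1 − p)^k. The sketch below applies this to the helicopter point estimates above; it assumes independent replicates with constant detection probability.

```python
# Cumulative detection probability over k replicate surveys.
k = 4  # four helicopter replicates, as in the Kilbuck Mountains
for species, p in [("Gyrfalcon", 0.79), ("Golden Eagle", 0.68),
                   ("Common Raven", 0.45), ("Rough-legged Hawk", 0.10)]:
    print(f"{species}: P(detected at least once in {k} visits) = "
          f"{1 - (1 - p) ** k:.3f}")
```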

  20. Unconventional signal detection techniques with Gaussian probability mixtures adaptation in non-AWGN channels: full resolution receiver

    Science.gov (United States)

    Chabdarov, Shamil M.; Nadeev, Adel F.; Chickrin, Dmitry E.; Faizullin, Rashid R.

    2011-04-01

    In this paper we discuss an unconventional detection technique also known as the «full resolution receiver». This receiver uses Gaussian probability mixtures for interference structure adaptation. The full resolution receiver is an alternative to conventional matched-filter receivers in the case of non-Gaussian interference. For the DS-CDMA forward channel in the presence of complex interference, a significant performance increase was shown.

  1. BER Analysis Using Beat Probability Method of 3D Optical CDMA Networks with Double Balanced Detection

    Directory of Open Access Journals (Sweden)

    Chih-Ta Yen

    2015-01-01

    Full Text Available This study proposes novel three-dimensional (3D) matrices of wavelength/time/spatial code for optical code-division multiple-access (OCDMA) networks, with a double balanced detection mechanism. We construct 3D carrier-hopping prime/modified prime (CHP/MP) codes by extending a two-dimensional (2D) CHP code integrated with a one-dimensional (1D) MP code. The corresponding coder/decoder pairs were based on fiber Bragg gratings (FBGs) and tunable optical delay lines integrated with splitters/combiners. System performance was enhanced by the low cross-correlation properties of the 3D code, designed to avoid the beat noise phenomenon. The CHP/MP code cardinality increased significantly compared to the CHP code under the same bit error rate (BER). The results indicate that the 3D code method can enhance system performance because both the beating terms and multiple-access interference (MAI) were reduced by the double balanced detection mechanism. Additionally, the optical component requirements can also be relaxed for high-transmission scenarios.

  2. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

    Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C, most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood of microevolution playing a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting the probability of extreme population events.

  3. Device to detect the presence of a pure signal in a discrete noisy signal measured at an average rate of constant noise, with a probability of false detection lower than a predetermined value

    International Nuclear Information System (INIS)

    Poussier, E.; Rambaut, M.

    1986-01-01

    Detection consists of a measurement of a counting rate. A probability of false detection is associated with this counting rate and with an estimated average noise rate. Detection also consists in comparing the false-detection probability to a predetermined false-detection rate. The comparison can use tabulated values. Application is made to corpuscular radiation detection [fr
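
    A minimal sketch of the comparison described above, with hypothetical numbers: under a Poisson model for the noise counts, choose the smallest count threshold whose false-detection probability stays below the predetermined value.

```python
from scipy.stats import poisson

background_mean = 4.0   # estimated average noise counts per measurement interval
alpha = 1e-3            # predetermined false-detection probability

# Smallest threshold c with P(N >= c | noise only) <= alpha;
# note poisson.sf(c - 1, mu) = P(N >= c).
c = 0
while poisson.sf(c - 1, background_mean) > alpha:
    c += 1
print(f"declare a signal present when the measured count is >= {c}")
```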

  4. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of the vessel steel embrittlement as a result of neutron irradiation can be measured by its increase in ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated by using the method of fracture mechanics, incorporating the effect of the DBTT change. The failure probability of the HFIR vessel, and hence the life of the vessel, is limited by the reactor core-melt probability of 10⁻⁴. The operating safety of the reactor is ensured by periodic hydrostatic pressure tests (hydrotests). The hydrotest is performed in order to determine a safe vessel static pressure. The fracture probability as a result of the hydrostatic pressure test is calculated and is used to determine the life of the vessel. Failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE CODE and the FAVOR CODE developed in this Laboratory, are based on Monte Carlo simulation. Heavy computations are required. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers simple and expedient ways to obtain numerical results without losing any generality. In this paper, numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained.
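
    A minimal sketch of the direct-integration idea, with hypothetical distributions (lognormal flaw depth, normal fracture toughness) standing in for the HFIR inputs: the fracture probability is evaluated by quadrature over the flaw-size density instead of Monte Carlo sampling.

```python
import numpy as np
from scipy.stats import lognorm, norm

# P_f = integral over a of f(a) * P(K_Ic < K(a)) da, with K(a) = Y*sigma*sqrt(pi*a).
Y, stress = 1.12, 150.0                    # geometry factor; applied stress, MPa
flaw = lognorm(s=0.5, scale=2e-3)          # hypothetical flaw-depth distribution, m
K_mean, K_sd = 80.0, 10.0                  # hypothetical toughness, MPa*sqrt(m)

a = np.linspace(1e-5, 0.05, 20_000)
K_applied = Y * stress * np.sqrt(np.pi * a)
integrand = flaw.pdf(a) * norm.cdf(K_applied, loc=K_mean, scale=K_sd)
print(f"fracture probability = {np.trapz(integrand, a):.2e}")
```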

  5. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  6. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    Science.gov (United States)

    Taylor M. Wilcox; Kevin S. McKelvey; Michael K. Young; Adam J. Sepulveda; Bradley B. Shepard; Stephen F. Jane; Andrew R. Whiteley; Winsor H. Lowe; Michael K. Schwartz

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive...

  7. Real Time Robot Soccer Game Event Detection Using Finite State Machines with Multiple Fuzzy Logic Probability Evaluators

    Directory of Open Access Journals (Sweden)

    Elmer P. Dadios

    2009-01-01

    Full Text Available This paper presents a new algorithm for real time event detection using Finite State Machines with multiple Fuzzy Logic Probability Evaluators (FLPEs). A machine referee for a robot soccer game is developed and is used as the platform to test the proposed algorithm. A novel technique to detect collisions and other events in a microrobot soccer game under inaccurate and insufficient information is presented. The robots' collisions are used to determine goalkeeper charging and goal score events, which are crucial for the machine referee's decisions. The Main State Machine (MSM) handles the schedule of event activation. The FLPE calculates the probabilities of the true occurrence of the events. Final decisions about the occurrences of events are evaluated and compared through threshold crisp probability values. The outputs of FLPEs can be combined to calculate the probability of an event composed of subevents. Using multiple fuzzy logic systems, the FLPE utilizes a minimal number of rules and can be tuned individually. Experimental results show the accuracy and robustness of the proposed algorithm.
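
    A minimal sketch of one FLPE in the spirit described above, with invented membership functions and threshold: two noisy measurements are fuzzified, the rule strengths are combined into an event probability, and the result is compared against a crisp threshold.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def collision_probability(distance_cm, closing_speed_cms):
    near = tri(distance_cm, -1.0, 0.0, 8.0)          # "robots are near"
    fast = tri(closing_speed_cms, 5.0, 30.0, 60.0)   # "closing quickly"
    return near * fast                               # product inference for AND

THRESHOLD = 0.5  # crisp probability threshold for declaring the event
p = collision_probability(distance_cm=2.0, closing_speed_cms=35.0)
print(f"P(collision) = {p:.2f} -> event declared: {p >= THRESHOLD}")
```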

  8. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was first deployed on desktop computers and was then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  9. A study on the effect of flaw detection probability assumptions on risk reduction achieved by non-destructive inspection

    International Nuclear Information System (INIS)

    Cronvall, O.; Simola, K.; Männistö, I.; Gunnars, J.; Alverlind, L.; Dillström, P.; Gandossi, L.

    2012-01-01

    Leakages and ruptures of piping components lead to reduction or loss of the pressure retaining capability of the system, and thus contribute to the overall risk associated with nuclear power plants. In-service inspection (ISI) aims at verifying that defects are not present in components of the pressure boundary or, if defects are present, ensuring that these are detected before they affect the safe operation of the plant. Reliability estimates of piping are needed e.g., in probabilistic safety assessment (PSA) studies, risk-informed ISI (RI-ISI) applications, and other structural reliability assessments. Probabilistic fracture mechanics models can account for ISI reliability, but a quantitative estimate for the latter is needed. This is normally expressed in terms of probability of detection (POD) curves, which correlate the probability of detecting a flaw with flaw size. A detailed POD curve is often difficult (or practically impossible) to obtain. If sufficient risk reduction can be shown by using simplified (but reasonably conservative) POD estimates, more complex PODs are not needed. This paper summarises the results of a study on the effect of piping inspection reliability assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve. Further, the study compared various structural reliability calculation approaches for a set of analysis cases. The results indicate that the use of a simplified POD could be justifiable in RI-ISI applications.
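
    To make the contrast concrete, the sketch below compares a detailed log-logistic POD curve with a simplified, conservative step POD of the kind examined in the study; every parameter value is an illustrative placeholder, not study data.

```python
import numpy as np

def pod_loglogistic(a, mu=np.log(3.0), sigma=0.6):
    """Detailed POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma)), a in mm."""
    return 1.0 / (1.0 + np.exp(-(np.log(a) - mu) / sigma))

def pod_simplified(a, a_detect=6.0, p_max=0.8):
    """Conservative step POD: constant p_max above a threshold size, else zero."""
    return np.where(a >= a_detect, p_max, 0.0)

sizes = np.array([1.0, 3.0, 6.0, 12.0])
print("flaw size (mm): ", sizes)
print("detailed POD:   ", np.round(pod_loglogistic(sizes), 3))
print("simplified POD: ", pod_simplified(sizes))
```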

  10. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), while also changing the micro EDM process itself. This paper focuses on the influence of the skin effect under high frequency pulses on energy distribution and transmission in micro EDM; on this basis, the rules governing the discharge probability across the electrode end face are also analysed. Starting from the electrical discharge process under high frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model of the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeability conditions are studied in order to obtain the distribution of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is regarded as the parameter governing discharge probability on the electrode end face. Finally, MATLAB is used to fit the curves and obtain the distribution of discharge probability across the electrode end face.
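
    The skin effect invoked above is governed by the standard skin-depth formula δ = 1/√(π f μ σ). The sketch below evaluates it for a copper tool electrode over a range of pulse frequencies; the material constants are textbook values and the frequencies are illustrative.

```python
import math

mu0 = 4e-7 * math.pi   # vacuum permeability, H/m (relative permeability ~1 for copper)
sigma_cu = 5.8e7       # electrical conductivity of copper, S/m

for f in (1e5, 1e6, 1e7):  # pulse frequency, Hz
    delta = 1.0 / math.sqrt(math.pi * f * mu0 * sigma_cu)
    print(f"f = {f:.0e} Hz -> skin depth = {delta * 1e6:.1f} um")
```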

  11. Studies on the effect of flaw detection probability assumptions on risk reduction at inspection

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K.; Cronvall, O.; Maennistoe, I. (VTT Technical Research Centre of Finland (Finland)); Gunnars, J.; Alverlind, L.; Dillstroem, P. (Inspecta Technology, Stockholm (Sweden)); Gandossi, L. (European Commission Joint Research Centre, Brussels (Belgium))

    2009-12-15

    The aim of the project was to study the effect of POD assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve e.g. in risk-informed in-service inspection (RI-ISI) studies. The results of the study indicate that the use of a simplified POD curve could be justifiable in RI-ISI applications. Another aim was to compare various structural reliability calculation approaches for a set of cases. Through benchmarking one can identify differences and similarities between modelling approaches, and provide added confidence on models and identify development needs. Comparing the leakage probabilities calculated by different approaches at the end of plant lifetime (60 years) shows that the results are very similar when inspections are not accounted for. However, when inspections are taken into account the predicted order of magnitude differs. Further studies would be needed to investigate the reasons for the differences. Development needs and plans for the benchmarked structural reliability models are discussed. (author)

  12. Studies on the effect of flaw detection probability assumptions on risk reduction at inspection

    International Nuclear Information System (INIS)

    Simola, K.; Cronvall, O.; Maennistoe, I.; Gunnars, J.; Alverlind, L.; Dillstroem, P.; Gandossi, L.

    2009-12-01

    The aim of the project was to study the effect of POD assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve e.g. in risk-informed in-service inspection (RI-ISI) studies. The results of the study indicate that the use of a simplified POD curve could be justifiable in RI-ISI applications. Another aim was to compare various structural reliability calculation approaches for a set of cases. Through benchmarking one can identify differences and similarities between modelling approaches, and provide added confidence on models and identify development needs. Comparing the leakage probabilities calculated by different approaches at the end of plant lifetime (60 years) shows that the results are very similar when inspections are not accounted for. However, when inspections are taken into account the predicted order of magnitude differs. Further studies would be needed to investigate the reasons for the differences. Development needs and plans for the benchmarked structural reliability models are discussed. (author)

  13. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Such analyses have concentrated on basins in the humid and semi-humid south and east of China, while for the inland river basins, which occupy about 35% of the country's area, such studies are scarce, partly due to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions for POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of POT series, an optimum threshold of 340 m³/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
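
    A minimal sketch of the POT/GPD workflow with the study's threshold of 340 m³/s; the synthetic daily flow series below is a stand-in for the Yingluoxia record, and the return level uses the standard GPD expression.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
flows = rng.gamma(shape=2.0, scale=80.0, size=31 * 365)  # hypothetical daily flows, m^3/s

u = 340.0                                      # threshold from the study
excesses = flows[flows > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)  # shape xi and scale beta

# T-year return level: u + (beta/xi) * ((lam * 365 * T)**xi - 1),
# where lam is the per-observation exceedance probability.
lam = excesses.size / flows.size
for T in (10, 50, 100):
    level = u + (beta / xi) * ((lam * 365 * T) ** xi - 1.0)
    print(f"{T}-year return level ~ {level:.0f} m^3/s")
```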

  14. Serial follow up V/P scanning in assessment of treatment response in high probability scans for pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Moustafa, H; Elhaddad, SH; Wagih, SH; Ziada, G; Samy, A; Saber, R [Department of nuclear medicine and radiology, faculty of medicine, Cairo university, Cairo (Egypt)]

    1995-10-01

    138 patients were shown by V/P scanning to have different probabilities of a pulmonary embolic event. Serial follow-up scanning after 3 days, 2 weeks, 1 month and 3 months was performed, with anticoagulant therapy. Of the remaining 10 patients, 6 died with pulmonary embolism documented by post-mortem study, and loss to follow-up was recorded in 4 patients. Complete response, with disappearance of all perfusion defects after 2 weeks, was detected in 37 patients (49.3%); partial improvement of lesions after 3 months was elicited in 32%. The overall incidence of response was 81.3%; the response was complete in the low probability group (100%), 84.2% in the intermediate group and 79.3% in the high probability group, with partial response in 45.3%. New lesions were evident in 18.7% of this series. We conclude that serial follow-up V/P scanning is mandatory for the evaluation of response to anticoagulant therapy, especially in the first 3 months. 2 figs., 3 tabs.

  15. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    Science.gov (United States)

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
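
    The model form is ordinary logistic regression. The sketch below shows that structure with invented coefficients (the published coefficients are not reproduced here); a lower aridity index (AI) denotes a more arid climate, and an anthropogenic score (AS) of zero approximates natural conditions.

```python
import math

b0, b_ai, b_as = 1.0, -4.0, 0.8   # placeholder coefficients, not the published model

def p_detect(aridity_index, anthro_score):
    """Probability of detecting perchlorate above a threshold concentration."""
    z = b0 + b_ai * aridity_index + b_as * anthro_score
    return 1.0 / (1.0 + math.exp(-z))

for ai in (0.1, 0.4, 1.0):        # arid -> wet
    print(f"AI = {ai}: P(detection | natural conditions) = {p_detect(ai, 0.0):.2f}")
```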

  16. Physicochemical properties determining the detection probability of tryptic peptides in Fourier transform mass spectrometry. A correlation study

    DEFF Research Database (Denmark)

    Nielsen, Michael L; Savitski, Mikhail M; Kjeldsen, Frank

    2004-01-01

    Sequence verification and mapping of posttranslational modifications require nearly 100% sequence coverage in "bottom-up" protein analysis. Even in favorable cases, routine liquid chromatography-mass spectrometry detects peptides from protein digests covering 50-90% of the sequence. Here we investigated the reasons for limited peptide detection, considering various physicochemical aspects of peptide behavior in liquid chromatography-Fourier transform mass spectrometry (LC-FTMS). No overall correlation was found between the detection probability and peptide mass. In agreement with literature data ... between pI and signal response. An explanation of this paradoxical behavior was found through the observation that more acidic tryptic peptides tend to be longer. Longer peptides tend to acquire a higher average charge state in positive-mode electrospray ionization than more basic but shorter...

  17. Consistent Practices for the Probability of Detection (POD) of Fracture Critical Metallic Components Project

    Science.gov (United States)

    Hughitt, Brian; Generazio, Edward (Principal Investigator); Nichols, Charles; Myers, Mika (Principal Investigator); Spencer, Floyd (Principal Investigator); Waller, Jess (Principal Investigator); Wladyka, Jordan (Principal Investigator); Aldrin, John; Burke, Eric; Cerecerez, Laura; et al.

    2016-01-01

    NASA-STD-5009 requires that successful flaw detection by NDE methods be statistically qualified for use on fracture critical metallic components, but does not standardize practices. This task works towards standardizing calculations and record retention with a web-based tool, the NNWG POD Standards Library or NPSL. Test methods will also be standardized with an appropriately flexible appendix to -5009 identifying best practices. Additionally, this appendix will describe how specimens used to qualify NDE systems will be cataloged, stored and protected from corrosion, damage, or loss.

  18. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like traffic accidents.

  19. Using areas of known occupancy to identify sources of variation in detection probability of raptors: taking time lowers replication effort for surveys.

    Science.gov (United States)

    Murn, Campbell; Holloway, Graham J

    2016-10-01

    Species occurring at low density can be difficult to detect and, if not properly accounted for, imperfect detection will lead to inaccurate estimates of occupancy. Understanding sources of variation in detection probability and how they can be managed is a key part of monitoring. We used sightings data of a low-density and elusive raptor (white-headed vulture Trigonoceps occipitalis) in areas of known occupancy (breeding territories) in a likelihood-based modelling approach to calculate detection probability and the factors affecting it. Because occupancy was known a priori to be 100%, we fixed the model occupancy parameter to 1.0 and focused on identifying sources of variation in detection probability. Using detection histories from 359 territory visits, we assessed nine covariates in 29 candidate models. The model with the highest support indicated that observer speed during a survey, combined with temporal covariates such as time of year and length of time within a territory, had the highest influence on the detection probability. Averaged detection probability was 0.207 (s.e. 0.033) and based on this the mean number of visits required to determine within 95% confidence that white-headed vultures are absent from a breeding area is 13 (95% CI: 9-20). Topographical and habitat covariates contributed little to the best models and had little effect on detection probability. We highlight that low detection probabilities of some species means that emphasizing habitat covariates could lead to spurious results in occupancy models that do not also incorporate temporal components. While variation in detection probability is complex and influenced by effects at both temporal and spatial scales, temporal covariates can and should be controlled as part of robust survey methods. Our results emphasize the importance of accounting for detection probability in occupancy studies, particularly during presence/absence studies for species such as raptors that are widespread and
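
    The reported 13 visits follow directly from the averaged detection probability via (1 − p)^n ≤ 0.05, as the sketch below verifies.

```python
import math

p = 0.207   # averaged per-visit detection probability reported above
n = math.ceil(math.log(0.05) / math.log(1.0 - p))
print(f"visits required for 95% confidence of absence: {n}")   # prints 13
```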

  20. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer today from drastic reductions of maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life cycle, which do not permit the structural and economic potential of these materials to be fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonds). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate such flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability Of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for the evaluation of the detection threshold of a guided-wave-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as to characterizing the system detection capability using this approach. Numerically, it is not possible to substitute the part of the experimental tests aimed at POD where the noise in the system response is crucial. Results of the experiments are presented in the paper and analysed.

  1. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    Science.gov (United States)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclic loading. A multi-feature integration method is developed to quantify the crack size using the signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.

  2. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report

  3. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  4. The Lateral Trigger Probability function for UHE Cosmic Rays Showers detected by the Pierre Auger Observatory

    Czech Academy of Sciences Publication Activity Database

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Mandát, Dušan; Nečesal, Petr; Nožka, Libor; Nyklíček, Michal; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovancová, Jaroslava; Schovánek, Petr; Šmída, Radomír; Trávníček, Petr; Vícha, Jakub

    2011-01-01

    Roč. 35, č. 5 (2011), 266-276 ISSN 0927-6505 R&D Projects: GA MŠk LC527; GA MŠk(CZ) 1M06002; GA AV ČR KJB100100904; GA MŠk(CZ) LA08016 Institutional research plan: CEZ:AV0Z10100502; CEZ:AV0Z10100522 Keywords: trigger * cosmic ray showers Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 3.216, year: 2011 http://www.auger.org/technical_info/pdfs/PerroneLTP_Published.pdf

  5. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)]

    2014-02-18

    The subject of NDE Reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework by Berens and Hovey in 1981 for POD estimation, to MIL-STD HDBK 1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and applying Bayesian statistics. All and each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offers no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantifying the human factors. Furthermore, reliability and POD have been reported alike in meaning, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using

  6. Viscosity measurement - probably a means for detecting radiation treatment of spices?

    International Nuclear Information System (INIS)

    Heide, L.; Albrich, S.; Boegl, K.W.

    1987-12-01

    The viscosity of 13 different spices and dried vegetables in total was measured. Optimal conditions were first determined for each product, i.e. concentration, pH value, temperature, particle size and soaking time. For method evaluation, examinations were primarily performed to study the effect of storage, the reproducibility, and the influence of different varieties of the same spice. In addition, for pepper, the viscosity was measured as a function of radiation dose. In summary, significant changes in the gel-forming capability after irradiation could be observed in preliminary experiments on 8 dried spices (ginger, carrots, leek, cloves, pepper, celery, cinnamon and onions). For 3 spices (ginger, pepper and cinnamon), these results could be substantiated by examining all the different varieties of the same spice. An additional influence of storage time on viscosity could not be proved during the investigation period of 8 months. Generally, it is not possible to identify an irradiated spice on the basis of viscosity measurements alone, since the difference between varieties of one and the same spice is considerably great. However, radiation treatment can be reliably excluded for ginger, pepper and cinnamon if the viscosities are high (10-20 Pa·s). (orig./MG) [de

  7. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Full Text Available Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that considers the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and of the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and meets both the midcourse terminal angular constraint and the LOS angle rate requirement.

  8. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  9. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for the geographical detection of clusters of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the Hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's method, we adopt a Monte Carlo test for the test of significance. Both methods are applied for detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. Through simulations on independent benchmark data, it is indicated that the test statistic based on the Hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
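
    The Hypergeometric building block is simple to evaluate: with N units in total, K of them cases, and a candidate window containing n units of which x are cases, the null probability is P(X = x) = C(K, x)C(N−K, n−x)/C(N, n). The sketch below uses invented totals; the Monte Carlo significance test is omitted.

```python
from scipy.stats import hypergeom

N, K = 10_000, 120                 # hypothetical totals: units and cases in the region
windows = [(500, 25), (500, 8), (2_000, 60)]   # (window size n, cases in window x)

for n, x in windows:
    # hypergeom.pmf(x, M, n, N) takes M total units, n cases, N draws
    lik = hypergeom.pmf(x, N, K, n)
    print(f"window n={n}, cases x={x}: null likelihood = {lik:.3e}")
```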

  10. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)]

    2016-10-15

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  11. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo

    2016-01-01

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage

  12. Estimation of Probability Density Functions of Damage Parameter for Valve Leakage Detection in Reciprocating Pump Used in Nuclear Power Plants

    Directory of Open Access Journals (Sweden)

    Jong Kyeom Lee

    2016-10-01

    Full Text Available This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
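
    As a schematic of the diagnosis step, the sketch below estimates probability density functions from two synthetic 300-cycle samples of a damage parameter (stand-ins for the normal and leaky states) and compares their likelihoods at an observed value.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
normal_state = rng.normal(loc=0.00, scale=0.02, size=300)  # synthetic: no leakage
leaky_state = rng.normal(loc=0.15, scale=0.03, size=300)   # synthetic: valve leak

kde_normal = gaussian_kde(normal_state)
kde_leaky = gaussian_kde(leaky_state)

d = 0.10   # an observed damage-parameter value to classify
ratio = kde_leaky(d)[0] / kde_normal(d)[0]
print(f"likelihood ratio (leak vs. normal) at d = {d}: {ratio:.1f}")
```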

  13. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.

  14. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    NARCIS (Netherlands)

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the

  15. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    International Nuclear Information System (INIS)

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  16. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability $P^{(d,N)}_{m,n}$ that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, in which we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.

  17. Power allocation for target detection in radar networks based on low probability of intercept: A cooperative game theoretical strategy

    Science.gov (United States)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2017-08-01

    Distributed radar network systems have been shown to have many unique features. Due to their advantage of signal and spatial diversities, radar networks are attractive for target detection. In practice, the netted radars in radar networks are supposed to maximize their transmit power to achieve better detection performance, which may be in contradiction with low probability of intercept (LPI). Therefore, this paper investigates the problem of adaptive power allocation for radar networks in a cooperative game-theoretic framework such that the LPI performance can be improved. Taking into consideration both the transmit power constraints and the minimum signal to interference plus noise ratio (SINR) requirement of each radar, a cooperative Nash bargaining power allocation game based on LPI is formulated, whose objective is to minimize the total transmit power by optimizing the power allocation in radar networks. First, a novel SINR-based network utility function is defined and utilized as a metric to evaluate power allocation. Then, with the well-designed network utility function, the existence and uniqueness of the Nash bargaining solution are proved analytically. Finally, an iterative Nash bargaining algorithm is developed that converges quickly to a Pareto optimal equilibrium for the cooperative game. Numerical simulations and theoretic analysis are provided to evaluate the effectiveness of the proposed algorithm.
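    The paper's Nash bargaining solution is not reproduced here; as a simpler illustration of power allocation that drives each radar toward its SINR requirement with minimal power, the sketch below runs the classic Foschini-Miljanic iteration on a hypothetical three-radar network (gains, noise level and SINR targets are invented).

```python
# Sketch: distributed power control meeting per-radar SINR targets
# (Foschini-Miljanic iteration), a simpler stand-in for the paper's
# cooperative Nash bargaining algorithm. All numbers are hypothetical.
import numpy as np

G = np.array([[1.0, 0.1, 0.2],     # G[i, j]: channel gain from radar j into receiver i
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])
noise = 0.1
gamma = np.array([3.0, 3.0, 3.0])  # per-radar minimum SINR requirement
p = np.full(3, 0.1)                # initial transmit powers

for _ in range(100):
    interference = G @ p - np.diag(G) * p + noise  # others' power plus noise
    sinr = np.diag(G) * p / interference
    p = (gamma / sinr) * p                         # scale power toward the target

sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)
print("powers:", p.round(4), "SINR:", sinr.round(3))
```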

  18. Skin damage probabilities using fixation materials in high-energy photon beams

    International Nuclear Information System (INIS)

    Carl, J.; Vestergaard, A.

    2000-01-01

Patient fixation, such as thermoplastic masks, carbon-fibre support plates and polystyrene bead vacuum cradles, is used to reproduce patient positioning in radiotherapy. Consequently, low-density materials may be introduced in high-energy photon beams. The aim of this study was to measure the increase in skin dose when low-density materials are present and to calculate the radiobiological consequences in terms of probabilities of early and late skin damage. An experimental thin-windowed plane-parallel ion chamber was used. Skin doses were measured using various overlaying low-density fixation materials. A fixed geometry of a 10 x 10 cm field, an SSD = 100 cm and photon energies of 4, 6 and 10 MV on Varian Clinac 2100C accelerators were used for all measurements. Radiobiological consequences of introducing these materials into the high-energy photon beams were evaluated in terms of early and late damage of the skin, based on the measured surface doses and the LQ-model. The experimental ion chamber gave results consistent with other studies. A relationship between skin dose and material thickness in mg/cm² was established and used to calculate skin doses in scenarios assuming radiotherapy treatment with opposed fields. Conventional radiotherapy may apply mid-point doses up to 60-66 Gy in daily 2-Gy fractions with opposed fields. Thermoplastic fixation increases the skin dose considerably, even at photon energies as low as 4 MV. However, with thermoplastic materials thinner than 100 mg/cm², skin doses are comparable with those produced by variation in source-to-skin distance, field size or blocking trays within clinical treatment set-ups. The use of polystyrene cradles and carbon-fibre materials with thickness less than 100 mg/cm² should be avoided at 4 MV at doses above 54-60 Gy. (author)
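    The LQ-model comparison the abstract refers to can be made concrete with a short worked example; the fraction scheme, relative surface dose and alpha/beta ratios below are textbook-style assumptions, not the paper's measured values.

```python
# Sketch: biologically effective dose under the linear-quadratic model,
# BED = n * d * (1 + d / (alpha/beta)). All numbers are illustrative
# assumptions, not values from the paper.
n, d = 30, 2.0                 # 30 daily fractions of 2 Gy at mid-point
skin_dose_factor = 0.6         # assumed relative surface dose under a mask
ab_early, ab_late = 10.0, 3.0  # assumed alpha/beta for early and late skin damage, Gy

d_skin = d * skin_dose_factor  # dose per fraction reaching the skin
for label, ab in (("early", ab_early), ("late", ab_late)):
    bed = n * d_skin * (1 + d_skin / ab)
    print(f"{label} skin damage BED: {bed:.1f} Gy")
```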

  19. Jihadist Foreign Fighter Phenomenon in Western Europe: A Low-Probability, High-Impact Threat

    Directory of Open Access Journals (Sweden)

    Edwin Bakker

    2015-11-01

The phenomenon of foreign fighters in Syria and Iraq is making headlines. Their involvement in the atrocities committed by terrorist groups such as the so-called “Islamic State” and Jabhat al-Nusra have caused grave concern and public outcry in the foreign fighters’ European countries of origin. While much has been written about these foreign fighters and the possible threat they pose, the impact of this phenomenon on Western European societies has yet to be documented. This Research Paper explores four particular areas where this impact is most visible: (a) violent incidents associated with (returned) foreign fighters, (b) official and political responses linked to these incidents, (c) public opinion, and (d) anti-Islam reactions linked to these incidents. The authors conclude that the phenomenon of jihadist foreign fighters in European societies should be primarily regarded as a social and political threat, not a physical one. They consider the phenomenon of European jihadist foreign fighters a “low-probability, high-impact” threat.

  20. Probable high prevalence of limb-girdle muscular dystrophy type 2D in Taiwan.

    Science.gov (United States)

    Liang, Wen-Chen; Chou, Po-Ching; Hung, Chia-Cheng; Su, Yi-Ning; Kan, Tsu-Min; Chen, Wan-Zi; Hayashi, Yukiko K; Nishino, Ichizo; Jong, Yuh-Jyh

    2016-03-15

Limb-girdle muscular dystrophy type 2D (LGMD2D), an autosomal-recessive inherited LGMD, is caused by mutations in SGCA. SGCA encodes alpha-sarcoglycan (SG), which forms a heterotetramer with other SGs in the sarcolemma and comprises part of the dystrophin-glycoprotein complex. The frequency of LGMD2D varies among different ethnic backgrounds, and so far only a few patients have been reported in Asia. We identified five patients with a novel homozygous mutation of c.101G>T (p.Arg34Leu) in SGCA from a large aboriginal family ethnically consisting of two tribes in Taiwan. Patient 3 is the maternal uncle of patients 1 and 2. All their parents, heterozygous for c.101G>T, denied consanguineous marriages although they were from the same tribe. The heterozygous parents of patients 4 and 5 were from two different tribes, originally residing in different geographic regions of Taiwan. Haplotype analysis showed that all five patients shared the same mutation-associated haplotype, indicating a probable founder effect and consanguinity. The results suggest that the carrier rate of c.101G>T in SGCA may be high in Taiwan, especially in the aboriginal population regardless of tribe. It is important to investigate the prevalence of LGMD2D in Taiwan for early diagnosis and treatment. Copyright © 2016. Published by Elsevier B.V.

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    International Nuclear Information System (INIS)

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  3. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

Identifying students' difficulties in learning the concept of probability helps teachers prepare appropriate learning processes and overcome obstacles that may arise in later lessons. This study revealed the levels of students' understanding of the concept of probability and identified their difficulties as part of identifying the epistemological obstacles to the concept. The study employed a qualitative, descriptive approach involving 55 students of class XII. A diagnostic test of probability learning difficulties, observation, and interviews were used to collect the data, which were then used to determine the levels of understanding and the learning difficulties experienced by the students. The test results and learning observations showed that the mean cognitive level was at level 2, indicating that students had appropriate quantitative information about the probability concept but that it might be incomplete or incorrectly used. The difficulties identified concern constructing sample spaces, events, and mathematical models for probability problems; students also had difficulties understanding the principles of events and prerequisite concepts.

  4. Influence of call broadcast timing within point counts and survey duration on detection probability of marsh breeding birds

    Directory of Open Access Journals (Sweden)

    Douglas C. Tozer

    2017-12-01

The Standardized North American Marsh Bird Monitoring Protocol recommends point counts consisting of a 5-min passive observation period, meant to be free of broadcast bias, followed by call broadcasts to entice elusive species to reveal their presence. Prior to this protocol, some monitoring programs used point counts with broadcasts during the first 5 min of 10-min counts, and have since used 15-min counts with an initial 5-min passive period (P1) followed by 5 min of broadcasts (B) and a second 5-min passive period (P2) to ensure consistency across years and programs. The influence of the timing of broadcasts within point counts and of point count duration, however, has rarely been assessed. Using data from 23,973 broadcast-assisted 15-min point counts conducted throughout the Great Lakes-St. Lawrence region between 2008 and 2016 by Bird Studies Canada's Marsh Monitoring Program and Central Michigan University's Great Lakes Coastal Wetland Monitoring Program, we estimated detection probabilities of individuals for 14 marsh breeding bird species during P1B compared to BP2, P1 compared to P2, and P1B compared to P1BP2. For six broadcast species and American Bittern (Botaurus lentiginosus), we found no significant difference in detection during P1B compared to BP2, and no significant difference in four of the same seven species during P1 compared to P2. We observed small but significant differences in detection for 7 of 14 species during P1B compared to P1BP2. We conclude that differences in the timing of broadcasts cause no bias in counts from entire 10-min surveys, although P1B should be favored over BP2 because the same amount of effort in P1B avoids broadcast bias in all broadcast species, and 10-min surveys are superior to 15-min surveys because the modest gains in detection of some species do not warrant the additional effort. We recommend point counts consisting of 5 min of passive observation followed by broadcasts, consistent with the standardized protocol.

  5. Laser Raman detection for oral cancer based on an adaptive Gaussian process classification method with posterior probabilities

    International Nuclear Information System (INIS)

    Du, Zhanwei; Yang, Yongjian; Bai, Yuan; Wang, Lijun; Su, Le; Chen, Yong; Li, Xianchang; Zhou, Xiaodong; Shen, Aiguo; Hu, Jiming; Jia, Jun

    2013-01-01

The existing methods for early and differential diagnosis of oral cancer are limited by the unapparent early symptoms and by imperfect imaging examination methods. In this paper, classification models for oral adenocarcinoma, carcinoma tissues and a control group, using just four features, are established by utilizing the hybrid Gaussian process (HGP) classification algorithm, with the introduction of noise-reduction and posterior-probability mechanisms. HGP shows much better performance in the experimental results. During the experimental process, oral tissues were divided into three groups: adenocarcinoma (n = 87), carcinoma (n = 100) and the control group (n = 134). The spectral data for these groups were collected. The prospective application of the proposed HGP classification method improved the diagnostic sensitivity to 56.35% and the specificity to about 70.00%, and resulted in a Matthews correlation coefficient (MCC) of 0.36. These results indicate that utilizing HGP in laser Raman spectroscopy (LRS) detection analysis for the diagnosis of oral cancer gives accurate results, and the prospects for application are promising. (paper)

  6. Laser Raman detection for oral cancer based on an adaptive Gaussian process classification method with posterior probabilities

    Science.gov (United States)

    Du, Zhanwei; Yang, Yongjian; Bai, Yuan; Wang, Lijun; Su, Le; Chen, Yong; Li, Xianchang; Zhou, Xiaodong; Jia, Jun; Shen, Aiguo; Hu, Jiming

    2013-03-01

The existing methods for early and differential diagnosis of oral cancer are limited by the unapparent early symptoms and by imperfect imaging examination methods. In this paper, classification models for oral adenocarcinoma, carcinoma tissues and a control group, using just four features, are established by utilizing the hybrid Gaussian process (HGP) classification algorithm, with the introduction of noise-reduction and posterior-probability mechanisms. HGP shows much better performance in the experimental results. During the experimental process, oral tissues were divided into three groups: adenocarcinoma (n = 87), carcinoma (n = 100) and the control group (n = 134). The spectral data for these groups were collected. The prospective application of the proposed HGP classification method improved the diagnostic sensitivity to 56.35% and the specificity to about 70.00%, and resulted in a Matthews correlation coefficient (MCC) of 0.36. These results indicate that utilizing HGP in laser Raman spectroscopy (LRS) detection analysis for the diagnosis of oral cancer gives accurate results, and the prospects for application are promising.

  7. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    Science.gov (United States)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
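    As an illustration of one of the methodologies named here, the sketch below fits a logistic-regression POD curve to Hit-Miss data and extracts the a50 and a90 flaw sizes; the crack sizes and hit/miss outcomes are invented for the example.

```python
# Sketch: logistic-regression POD curve for Hit-Miss data, one of the
# methodologies whose interrelationships the paper examines.
# The crack sizes (mm) and outcomes below are made up for illustration.
import numpy as np
import statsmodels.api as sm

size = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])
hit  = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

# POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a))), the usual log-size link
X = sm.add_constant(np.log(size))
fit = sm.Logit(hit, X).fit(disp=0)
b0, b1 = fit.params

a50 = np.exp(-b0 / b1)               # size detected with 50% probability
a90 = np.exp((np.log(9) - b0) / b1)  # size detected with 90% probability
print(f"a50 = {a50:.2f} mm, a90 = {a90:.2f} mm")
```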

  8. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  9. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    Science.gov (United States)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and the calculation of inspection intervals for critical components. They require inspections on a large set of samples, which makes these statistical assessments time-consuming and costly. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications of using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.

  10. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.

    2017-09-07

We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low-rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of the off-diagonal matrix blocks and m is the size of the small diagonal blocks in the matrix that are not well approximated by low-rank factorizations and are treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
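    For context, a plain Monte Carlo estimator for such probabilities looks like the sketch below; each sample costs O(n²) through the dense Cholesky factor, which is precisely the per-sample cost the hierarchical decomposition reduces. The dimension, covariance and bounds are arbitrary toy choices.

```python
# Sketch: plain Monte Carlo for P(l <= X <= u) with X ~ N(0, cov).
# Each sample applies the dense Cholesky factor, an O(n^2) operation.
import numpy as np

rng = np.random.default_rng(0)
n = 10
cov = np.full((n, n), 0.5) + 0.5 * np.eye(n)   # equicorrelated, unit variance
L = np.linalg.cholesky(cov)
lower, upper = -np.ones(n), np.ones(n)

Z = rng.standard_normal((200_000, n))
X = Z @ L.T                                     # correlate the samples
inside = np.all((X >= lower) & (X <= upper), axis=1)
print("P(l <= X <= u) ≈", inside.mean())
```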

  11. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.; Keyes, David E.; Turkiyyah, George

    2017-01-01

We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low-rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of the off-diagonal matrix blocks and m is the size of the small diagonal blocks in the matrix that are not well approximated by low-rank factorizations and are treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.

  12. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

This study is developmental research on probabilistic thinking-oriented learning tools for probability material for ninth grade students, aimed at producing good probabilistic thinking-oriented learning tools. The subjects were the IX-A students of MTs Model Bangkalan. The development followed the 4-D model, modified into define, design and develop stages. The teaching and learning tools consist of a lesson plan, students' worksheets, teaching media and a students' achievement test. The research instruments were a learning tools validation sheet, a teachers' activities sheet, a students' activities sheet, a students' response questionnaire and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of valid probabilistic thinking-oriented learning tools for teaching probability to ninth grade students. After revision based on the validation and a classroom experiment, the teachers' classroom management proved effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and develop students' probabilistic thinking.

  13. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Science.gov (United States)

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of MRI T1-weighted images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. controls, ET with head tremor (ETH) vs. controls, and severe ET vs. controls. An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum and, in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    International Nuclear Information System (INIS)

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

In order to control a low probability-high consequence disaster, which causes huge social and economic damage, it is necessary to develop a simultaneous risk assessment simulation tool based on a disaster risk scheme that includes the diverse effects of the primary disaster and secondary damages. We propose the scheme of such a risk simulation tool. (author)

  15. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the financial burden of out-of-pocket spending, and the most recent upward trends in it, underscore the need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills.
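    A minimal sketch of the kind of logistic regression described here, assuming a hypothetical survey extract with made-up column names:

```python
# Sketch: logistic regression of the probability that out-of-pocket (OOP)
# spending exceeds 10% of income. The file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("household_survey.csv")        # hypothetical survey extract
df["oop_over_10pct"] = (df["oop_spending"] > 0.10 * df["income"]).astype(int)

model = smf.logit("oop_over_10pct ~ C(income_group) + C(health_status) + elderly",
                  data=df).fit()
print(model.summary())

# Predicted probability for a low-income, poor-health, nonelderly person
profile = pd.DataFrame({"income_group": ["low"], "health_status": ["poor"],
                        "elderly": [0]})
print(model.predict(profile))
```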

  16. High Detectivity Graphene-Silicon Heterojunction Photodetector.

    Science.gov (United States)

    Li, Xinming; Zhu, Miao; Du, Mingde; Lv, Zheng; Zhang, Li; Li, Yuanchang; Yang, Yao; Yang, Tingting; Li, Xiao; Wang, Kunlin; Zhu, Hongwei; Fang, Ying

    2016-02-03

A graphene/n-type silicon (n-Si) heterojunction has been demonstrated to exhibit strong rectifying behavior and high photoresponsivity, which can be utilized for the development of high-performance photodetectors. However, graphene/n-Si heterojunction photodetectors reported previously suffer from relatively low specific detectivity due to large dark current. Here, by introducing a thin interfacial oxide layer, the dark current of the graphene/n-Si heterojunction has been reduced by two orders of magnitude at zero bias. At room temperature, the graphene/n-Si photodetector with interfacial oxide exhibits a specific detectivity up to 5.77 × 10^13 cm Hz^1/2 W^-1 at the peak wavelength of 890 nm in vacuum, the highest detectivity reported at room temperature for planar graphene/Si heterojunction photodetectors. In addition, the improved graphene/n-Si heterojunction photodetectors possess a high responsivity of 0.73 A W^-1 and a high photo-to-dark current ratio of ≈10^7. The current noise spectral density of the graphene/n-Si photodetector has been characterized under ambient and vacuum conditions, which shows that the dark current can be further suppressed in vacuum. These results demonstrate that the graphene/Si heterojunction with interfacial oxide is promising for the development of high-detectivity photodetectors. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
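    The reported figures can be sanity-checked with the standard shot-noise-limited detectivity formula; the dark current density used below is an assumed value, not taken from the paper.

```python
# Sketch: specific detectivity when shot noise from dark current dominates,
# D* = R / sqrt(2 * q * J_dark). J_dark here is a hypothetical assumption.
import math

q = 1.602e-19          # elementary charge, C
R = 0.73               # responsivity, A/W (reported in the abstract)
J_dark = 1e-9          # assumed dark current density, A/cm^2 (hypothetical)

D_star = R / math.sqrt(2 * q * J_dark)   # units: cm Hz^1/2 W^-1 (Jones)
print(f"D* ≈ {D_star:.2e} Jones")        # same order as the reported 5.77e13
```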

  17. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.

  18. Probability of Detection (POD) Analysis for the Advanced Retirement for Cause (RFC)/Engine Structural Integrity Program (ENSIP) Nondestructive Evaluation (NDE) System-Volume 3: Material Correlation Study

    National Research Council Canada - National Science Library

    Berens, Alan

    2000-01-01

    .... Volume 1 presents a description of changes made to the probability of detection (POD) analysis program of Mil-STD-1823 and the statistical evaluation of modifications that were made to version 3 of the Eddy Current Inspection System (ECIS v3...

  19. Comparing a recursive digital filter with the moving-average and sequential probability-ratio detection methods for SNM portal monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1993-01-01

The author compared a recursive digital filter, proposed as a detection method for French special nuclear material monitors, with the author's detection methods, which employ a moving-average scaler or a sequential probability-ratio test. Nine test subjects each repeatedly carried a test source through a walk-through portal monitor that had the same nuisance-alarm rate with each method. He found that the average detection probability for the test source is also the same for each method. However, the recursive digital filter may have one drawback: its exponentially decreasing response to past radiation intensity prolongs the impact of any interference from radiation sources or radiation-producing machinery. He also examined the influence of each test subject on the monitor's operation by measuring individual attenuation factors for background and source radiation, then ranked the subjects' attenuation factors against their individual probabilities for detecting the test source. The one inconsistent ranking was probably caused by that subject's unusually long stride when passing through the portal.
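    Of the methods compared, the sequential probability-ratio test is easy to sketch for Poisson count data; the background and source count rates, error targets and simulated count stream below are invented for illustration.

```python
# Sketch: a sequential probability-ratio test (SPRT) on Poisson counts,
# one of the detection methods compared in the paper. All rates and
# error targets are hypothetical.
import numpy as np

lam0, lam1 = 100.0, 130.0          # counts per interval: background vs. source present
alpha, beta = 0.001, 0.05          # false-alarm and missed-detection targets
A = np.log((1 - beta) / alpha)     # upper (alarm) boundary
B = np.log(beta / (1 - alpha))     # lower (background) boundary

rng = np.random.default_rng(1)
llr = 0.0
for i, n in enumerate(rng.poisson(lam1, size=50), start=1):
    # Poisson log-likelihood ratio contribution of one interval's count n
    llr += n * np.log(lam1 / lam0) - (lam1 - lam0)
    if llr >= A:
        print(f"alarm after {i} intervals"); break
    if llr <= B:
        print(f"declare background after {i} intervals"); break
```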

  20. The probability of malignancy in small pulmonary nodules coexisting with potentially operable lung cancer detected by CT

    International Nuclear Information System (INIS)

    Yuan, Yue; Matsumoto, Tsuneo; Hiyama, Atsuto; Miura, Goji; Tanaka, Nobuyuki; Emoto, Takuya; Kawamura, Takeo; Matsunaga, Naofumi

    2003-01-01

    The aim of this study was to assess the probability of malignancy in one or two small nodules 1 cm or less coexisting with potentially operable lung cancer (coexisting small nodules). The preoperative helical CT scans of 223 patients with lung cancer were retrospectively reviewed. The probability of malignancy of coexisting small nodules was evaluated based on nodule size, location, and clinical stage of the primary lung cancers. Seventy-one coexisting small nodules were found on conventional CT in 58 (26%) of 223 patients, and 14 (6%) patients had malignant nodules. Eighteen (25%) of such nodules were malignant. The probability of malignancy was not significantly different between two groups of nodules larger and smaller than 0.5 cm (p=0.1). The probability of malignancy of such nodules within primary tumor lobe was significantly higher than that in the other lobes (p<0.01). Metastatic nodules were significantly fewer in clinical stage-IA patients than in the patients with the other stage (p<0.01); however, four (57%) of seven synchronous lung cancers were located in the non-primary tumor lobes in the clinical stage-I patients. Malignant coexisting small nodules are not infrequent, and such nodules in the non-primary tumor lobes should be carefully diagnosed. (orig.)

  1. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

Bryan Cort

    2013-10-01

We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  2. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota

    Directory of Open Access Journals (Sweden)

    Kaushi S. T. Kanankege

    2018-01-01

Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions, including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high-risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.

  3. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota.

    Science.gov (United States)

    Kanankege, Kaushi S T; Alkhamis, Moh A; Phelps, Nicholas B D; Perez, Andres M

    2017-01-01

Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions, including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high-risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.

  4. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  5. Detection and Classification of Low Probability of Intercept Radar Signals Using Parallel Filter Arrays and Higher Order Statistics

    National Research Council Canada - National Science Library

    Taboada, Fernando

    2002-01-01

... intercept devices such as radar warning, electronic support and electronic intelligence receivers. In order to detect LPI radar waveforms, new signal processing techniques are required. This thesis first...

  6. Poor concordance of spiral CT (SCT) and high probability ventilation-perfusion (V/Q) studies in the diagnosis of pulmonary embolism (PE)

    International Nuclear Information System (INIS)

    Roman, M.R.; Angelides, S.; Chen, N.

    2000-01-01

Despite its limitations, V/Q scintigraphy remains the favoured non-invasive technique for the diagnosis of pulmonary embolism (PE). PE is present in 85-90% and 30-40% of high and intermediate probability V/Q studies, respectively. The value of spiral CT (SCT), a newer imaging modality, has yet to be determined. The aims of this study were to determine the frequency of positive SCT for PE in high and intermediate probability V/Q studies, with the two examinations performed within 24 hr of each other. 15 patients (6M, 9F, mean age 70.2) with a high probability study were included. Six (40%) SCT studies were reported as positive (four with emboli present in the main pulmonary arteries), seven as negative, one as equivocal, and one was technically sub-optimal. Pulmonary angiography was not performed in any patient. In all seven negative studies, the SCT was performed before the V/Q study. Of these, two studies were revised to positive once the result of the V/Q study was known, while three others had resolving mismatched V/Q defects on follow-up studies (performed 5-14 days later); two of these three also had a positive duplex scan of the lower limbs. One other was most likely due to chronic thromboembolic disease. Only three patients had a V/Q scan prior to the SCT; all were positive for PE on both imaging modalities. Of 26 patients (11M, 15F, mean age 68.5) with an intermediate probability V/Q study, SCT was positive in only two (8%). Thus the low detection rate of PE by SCT in this, albeit small, series raises doubts as to its role in the diagnosis of PE. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  7. Investigation of photon detection probability dependence of SPADnet-I digital photon counter as a function of angle of incidence, wavelength and polarization

    Energy Technology Data Exchange (ETDEWEB)

    Játékos, Balázs, E-mail: jatekosb@eik.bme.hu; Ujhelyi, Ferenc; Lőrincz, Emőke; Erdei, Gábor

    2015-01-01

SPADnet-I is a prototype, fully digital, high spatial and temporal resolution silicon photon counter, based on standard CMOS imaging technology, developed by the SPADnet consortium. Being a novel device, the exact dependence of the photon detection probability (PDP) of SPADnet-I on the angle of incidence, wavelength and polarization of the incident light was not known. Our targeted application area for this sensor is next generation PET detector modules, where the sensors will be used along with LYSO:Ce scintillators. Hence, we performed an extended investigation of PDP over a wide range of angles of incidence (0° to 80°), concentrating on a 60 nm wide wavelength interval around the characteristic emission peak (λ = 420 nm) of the scintillator. In the case where the sensor was optically coupled to a scintillator, our experiments showed a notable dependence of PDP on angle, polarization and wavelength. The sensor has an average PDP of approximately 30% from 0° to 60° angle of incidence, beyond which it drops rapidly. The PDP turned out not to be polarization dependent below 30°. If the sensor is used without a scintillator (i.e. the light source is in air), the polarization dependence is much less pronounced, appearing only above 50°.

  8. Simple heuristic derivation of some charge-transfer probabilities at asymptotically high incident velocities

    International Nuclear Information System (INIS)

    Spruch, L.; Shakeshaft, R.

    1984-01-01

    For asymptotically high incident velocities we provide simple, heuristic, almost classical, derivations of the cross section for forward charge transfer, and of the ratio of the cross section for capture to the elastic-scattering cross section for the projectile scattered through an angle close to π/3

  9. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Science.gov (United States)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10 nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. Closer inspection of the tails of the distributions and estimation of defect probabilities indicates a common defect mechanism and defect threshold, despite the observed differences in specific pattern characteristics. We expect that this analysis methodology can be applied to defect probability modeling as well as general process qualification in the future.

  10. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
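    As a worked example of the book's central object, the sketch below estimates a finite-horizon ruin probability in the classical compound Poisson model by Monte Carlo and compares it with the closed-form infinite-horizon result for exponential claims; all parameter values are arbitrary.

```python
# Sketch: Monte Carlo ruin probability in the Cramér-Lundberg model,
# reserve R(t) = u + c*t - sum of claims. Parameters are arbitrary.
import numpy as np

u, c = 10.0, 1.2            # initial reserve and premium rate
lam, mu = 1.0, 1.0          # Poisson claim rate and mean (exponential) claim size
T, n_paths = 100.0, 20_000
rng = np.random.default_rng(0)

ruined = 0
for _ in range(n_paths):
    t, reserve = 0.0, u
    while True:
        w = rng.exponential(1.0 / lam)          # waiting time to the next claim
        t += w
        if t > T:
            break                               # survived the horizon
        reserve += c * w - rng.exponential(mu)  # premiums earned minus the claim
        if reserve < 0:
            ruined += 1
            break
print("P(ruin before T) ≈", ruined / n_paths)

# Closed form for exponential claims (infinite horizon), for comparison:
theta = c / (lam * mu) - 1.0                    # safety loading
print("psi(u) =", np.exp(-theta * u / ((1 + theta) * mu)) / (1 + theta))
```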

  11. Detecting ultra high energy neutrinos with LOFAR

    International Nuclear Information System (INIS)

    Mevius, M.; Buitink, S.; Falcke, H.; Hörandel, J.; James, C.W.; McFadden, R.; Scholten, O.; Singh, K.; Stappers, B.; Veen, S. ter

    2012-01-01

The NuMoon project aims to detect signals of Ultra High Energy (UHE) cosmic rays with radio telescopes on Earth using the lunar Cherenkov technique at low frequencies (∼150 MHz). The advantage of using low frequencies is the much larger effective detection volume, with the trade-off of a sensitivity cut-off at lower energies. A first upper limit on the UHE neutrino flux, from data of the Westerbork Radio Telescope (WSRT), has been published, while a second experiment, using the new LOFAR telescope, is in preparation. The advantages of LOFAR over the WSRT are the larger collecting area, the better pointing accuracy and the use of ring buffers, which allow the implementation of a sophisticated self-trigger algorithm. The expected sensitivity of LOFAR reaches flux limits within the range of some theoretical production models.

  12. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
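    For reference, a classical one-sample P-P plot (the construction being generalized here) can be drawn as follows; the data are simulated and the reference distribution is a standard normal, both chosen only for illustration.

```python
# Sketch: classical one-sample P-P plot, empirical vs. theoretical probability.
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.sort(np.random.default_rng(0).normal(size=200))  # simulated sample
theoretical = norm.cdf(x)                     # F0(x_(i)) under the null model
empirical = np.arange(1, len(x) + 1) / len(x) # empirical CDF at the order statistics

plt.plot(theoretical, empirical, ".", ms=3)
plt.plot([0, 1], [0, 1], "k--")               # line of perfect agreement
plt.xlabel("theoretical probability"); plt.ylabel("empirical probability")
plt.title("P-P plot: sample vs. fitted N(0, 1)")
plt.show()
```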

  13. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  14. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  15. Conclusion: probable and possible futures. MRI with ultra high magnetic field

    International Nuclear Information System (INIS)

    Le Bihan, D.

    2009-01-01

    MR neuroimaging does not interfere with brain function. Because it is safe, it can be used to study the brains of both patients and healthy volunteers. The tasks performed by neurons depend largely on their precise location, and high-field magnets have the potential to provide a 5- to 10-fold increase in spatio-temporal resolution. This should allow brain function to be studied on a scale of only a few thousand neurons, possibly at the intermediate scale of the 'neural code'. NeuroSpin, a new CEA research center, is dedicated to neuro-MRI at high magnetic field strengths. As a forum for dialogue between those developing and those using these instruments, it brings together researchers and engineers, technicians and medical doctors. NeuroSpin is one of the few institutions in Europe, if not the world, where these experts can come together in one place to design, construct and use machines equipped with ultra-strong magnets. The strongest 'routine' MR device currently operates at 3 Tesla (60 000 times the earth's magnetic field), whereas a first French system operating at 7 Tesla (140 000 times the earth's field) is now available for human studies, and another system operating at 11.7 Tesla (world record) should be delivered in 2011. Preclinical studies are also being conducted with magnets operating at 7 Tesla and, soon, 17.6 Tesla. (author)

  16. Probability of detecting marine predator-prey and species interactions using novel hybrid acoustic transmitter-receiver tags.

    Directory of Open Access Journals (Sweden)

    Laurie L Baker

Understanding the nature of inter-specific and conspecific interactions in the ocean is challenging because direct observation is usually impossible. The development of dual transmitter/receivers, Vemco Mobile Transceivers (VMTs), and satellite-linked (e.g., GPS) tags provides a unique opportunity to better understand between- and within-species interactions in space and time. Quantifying the uncertainty associated with detecting a tagged animal, particularly under varying field conditions, is vital for making accurate biological inferences when using VMTs. We evaluated the detection efficiency of VMTs deployed on grey seals, Halichoerus grypus, off Sable Island (NS, Canada) in relation to environmental characteristics and seal behaviour, using generalized linear models (GLMs) to explore both post-processed detection data and summarized raw VMT data. When considering only post-processed detection data, only about half of expected detections were recorded at best, even when two VMT-tagged seals were estimated to be within 50-200 m of one another. At a separation of 400 m, only about 15% of expected detections were recorded. In contrast, when incomplete transmissions from the summarized raw data were also considered, the ratio of complete transmissions to complete and incomplete transmissions was about 70% for distances ranging from 50-1000 m, with a minimum of around 40% at 600 m and a maximum of about 85% at 50 m. Distance between seals, wind stress, and depth were the most important predictors of detection efficiency. Access to the raw VMT data allowed us to focus on the physical and environmental factors that limit a transceiver's ability to resolve a transmitter's identity.

  17. Detecting and locating light atoms from high-resolution STEM images : The quest for a single optimal design

    NARCIS (Netherlands)

    Gonnissen, J; De Backer, A; den Dekker, A.J.; Sijbers, J.; Van Aert, S.

    2016-01-01

In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of light atoms.

  18. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real-world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems while keeping the potentially high computational costs feasible. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART depend strongly on the choice of the Importance Function (IF), which indicates how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
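    The splitting mechanism behind RESTART can be illustrated with a toy fixed-effort multilevel estimator; the random-walk system, the importance function (here simply the walk's state) and the thresholds below are invented for the example and are not the paper's case studies.

```python
# Sketch: fixed-effort multilevel splitting for a rare event, the mechanism
# behind RESTART. Toy system: a Gaussian walk with negative drift must climb
# past successive levels, each within a fresh step budget.
import numpy as np

rng = np.random.default_rng(0)
thresholds = [2.0, 4.0, 6.0, 8.0]   # levels of the importance function
n_per_stage = 2000
drift, n_steps = -0.1, 200

def run_to_threshold(x0, level):
    """Advance the walk from x0; return the crossing state, or None."""
    x = x0
    for _ in range(n_steps):
        x += drift + rng.standard_normal()
        if x >= level:
            return x
    return None

states, estimate = [0.0], 1.0
for level in thresholds:
    starts = rng.choice(states, size=n_per_stage)   # resample survivors (splitting)
    hits = [s for x0 in starts if (s := run_to_threshold(x0, level)) is not None]
    if not hits:
        estimate = 0.0
        break
    estimate *= len(hits) / n_per_stage             # conditional crossing probability
    states = hits
print("estimated rare-event probability:", estimate)
```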

  19. Balancing forest-regeneration probabilities and maintenance costs in dry grasslands of high conservation priority

    Science.gov (United States)

    Bolliger, Janine; Edwards, Thomas C.; Eggenberg, Stefan; Ismail, Sascha; Seidl, Irmi; Kienast, Felix

    2011-01-01

    Abandonment of agricultural land has resulted in forest regeneration in species-rich dry grasslands across European mountain regions and threatens conservation efforts in this vegetation type. To support national conservation strategies, we used a site-selection algorithm (MARXAN) to find optimum sets of floristic regions (reporting units) that contain grasslands of high conservation priority. We sought optimum sets that would accommodate 136 important dry-grassland species and that would minimize forest regeneration and costs of management needed to forestall predicted forest regeneration. We did not consider other conservation elements of dry grasslands, such as animal species richness, cultural heritage, and changes due to climate change. Optimal sets that included 95–100% of the dry-grassland species encompassed an average of 56–59 floristic regions (SD 5). This is about 15% of the approximately 400 floristic regions that contain dry-grassland sites and translates to 4800–5300 ha of dry grassland out of a total of approximately 23,000 ha for the entire study area. Projected costs to manage the grasslands in these optimum sets ranged from CHF (Swiss francs) 5.2 to 6.0 million/year. This is only 15–20% of the current total estimated cost of approximately CHF 30–45 million/year required if all dry grasslands were to be protected. The grasslands of the optimal sets may be viewed as core sites in a national conservation strategy.
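
    MARXAN itself searches for near-optimal sets with simulated annealing; a greedy complementarity heuristic is enough to illustrate the underlying cost-constrained set-covering problem. Everything below (species sets, costs) is synthetic:

        import random

        random.seed(0)
        N_SPECIES, N_REGIONS = 136, 400
        # Each floristic region covers some dry-grassland species at a
        # management cost (million CHF/year, hypothetical).
        regions = {
            r: {"species": set(random.sample(range(N_SPECIES), random.randint(3, 25))),
                "cost": random.uniform(0.05, 0.3)}
            for r in range(N_REGIONS)
        }

        uncovered = set(range(N_SPECIES))
        chosen, total_cost = [], 0.0
        while uncovered:
            # Pick the region with the best (new species covered) / cost ratio.
            r = max(regions,
                    key=lambda r: len(regions[r]["species"] & uncovered) / regions[r]["cost"])
            gain = regions[r]["species"] & uncovered
            if not gain:
                break  # remaining species cannot be covered by any region
            chosen.append(r)
            total_cost += regions[r]["cost"]
            uncovered -= gain

        print(len(chosen), "regions, cost ≈", round(total_cost, 2), "M CHF/yr")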

  20. Estimating site occupancy and detection probabilities for cooper's and sharp-shinned hawks in the Southern Sierra Nevada

    Science.gov (United States)

    Jennifer E. Carlson; Douglas D. Piirto; John J. Keane; Samantha J. Gill

    2015-01-01

    Long-term monitoring programs that can detect a population change over time can be useful for managers interested in assessing population trends in response to forest management activities for a particular species. Such long-term monitoring programs have been designed for the Northern Goshawk (Accipiter gentilis), but not for the more elusive Sharp...

  1. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structure is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-square method in terms of available data on seismic input and structural response. The notion of high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)
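
    For reference, the HCLPF value invoked in this abstract is conventionally read off a lognormal fragility model: with median ground-motion capacity A_m and logarithmic standard deviations beta_R (randomness) and beta_U (uncertainty), the usual seismic-PRA convention (not a formula stated in the abstract itself) is

        \mathrm{HCLPF} = A_m \, e^{-1.65\,(\beta_R + \beta_U)}

    i.e. roughly the capacity level at which there is 95% confidence of less than a 5% probability of failure.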

  2. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on the high-speed satellite collision probability, Pc, have been investigated. Previous methods assume an individual position error covariance matrix is available for each object; the two matrices are combined into a single, relative position error covariance matrix, and components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information is available for only one of the two objects, either some default shape has to be used or nothing can be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
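
    Behind such bounds, the short-encounter collision probability is a two-dimensional Gaussian integral over the combined hard-body circle in the encounter plane. The sketch below evaluates that integral numerically for a made-up encounter; the paper's actual contribution, maximizing over the unknown object's covariance, is not reproduced here.

        import numpy as np

        def collision_probability(miss, cov, hard_body_radius, n=400):
            """P(relative position falls inside the hard-body circle)."""
            r = hard_body_radius
            xs = np.linspace(-r, r, n)
            X, Y = np.meshgrid(xs, xs)
            inside = X**2 + Y**2 <= r**2
            d = np.stack([X - miss[0], Y - miss[1]], axis=-1)
            quad = np.einsum("...i,ij,...j->...", d, np.linalg.inv(cov), d)
            pdf = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(np.linalg.det(cov)))
            cell = (xs[1] - xs[0]) ** 2
            return float((pdf * inside).sum() * cell)

        # Hypothetical encounter: ~1 km miss distance, 10 m combined hard body.
        print(collision_probability(miss=np.array([1000.0, 200.0]),
                                    cov=np.array([[250_000.0, 50_000.0],
                                                  [50_000.0, 90_000.0]]),
                                    hard_body_radius=10.0))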

  3. Factors associated with high probability of target blood pressure non-achievement in hypertensive patients

    Directory of Open Access Journals (Sweden)

    S. P. Zhemanyuk

    2017-12-01

    Full Text Available One of the topical issues of modern cardiology is clarifying the factors behind non-achievement of target blood pressure levels, with the aim of better understanding how cardiovascular complications can be reduced. The aim of the study is to determine the factors of poor blood pressure control using ambulatory blood pressure monitoring parameters and adenosine 5'-diphosphate-induced platelet aggregation parameters in patients with arterial hypertension. Material and methods. The study involved 153 patients with essential hypertension (EH) stage II, II degree. Ambulatory blood pressure monitoring (ABPM) was performed in patients taking at least two first-line antihypertensive drugs in optimal daily doses, using a bifunctional ABPM device (Incart, S.-P., R.F.). Platelet aggregation was measured by light transmittance aggregometry on an optical analyzer (Solar, R.B.) with adenosine 5'-diphosphate (Sigma-Aldrich) at a final concentration of 10.0 × 10^-6 mol/L. The first group comprised inadequately controlled essential hypertensive individuals with high systolic or/and diastolic BP levels according to the ABPM results, and the second comprised patients with adequately controlled EH. Groups of patients were comparable in age (60.39 ± 10.74 years vs. 62.80 ± 9.63; p = 0.181, respectively). In the group of EH patients who reached the target level of blood pressure, women predominated (60% vs. 39.81%; p = 0.021, respectively). We used binary logistic regression analysis to determine the predictors of poor target blood pressure attainment using ABPM and platelet aggregation parameters. Results. According to the univariate logistic regression analysis, the factors influencing poor attainment of target blood pressure are the average diurnal diastolic blood pressure (DBP) (OR = 44.8); diurnal variability of systolic blood pressure (SBP) (OR = 4.4); square index of hypertension for diurnal periods of SBP (OR = 318.9); square index of hypertension for diurnal

  4. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  6. Trackline and Point Detection Probabilities for Acoustic Surveys of Cuvier’s and Blainville’s Beaked Whales

    Science.gov (United States)

    2013-09-01

    of sperm whales. Although the methods developed in those papers demonstrate feasibility, they are not applicable to ... location clicks (Marques et al., 2009) instead of detecting individual animals or groups of animals; these cue-counting methods will not be specifically

  7. A New Method Based on Two-Stage Detection Mechanism for Detecting Ships in High-Resolution SAR Images

    Directory of Open Access Journals (Sweden)

    Xu Yongli

    2017-01-01

    Full Text Available Ship detection in synthetic aperture radar (SAR) remote sensing images, being a fundamental but challenging problem in the field of satellite image analysis, plays an important role for a wide range of applications and has received significant attention in recent years. To meet the requirements of ship detection in high-resolution SAR images (accuracy, a higher level of automation, real-time operation, and processing efficiency), we analyzed the characteristics of the ocean background and of ship targets in high-resolution SAR images and put forward a ship detection algorithm for such images. The algorithm consists of two detection stages. The first stage designs a pre-training classifier based on an improved spectral residual visual model to quickly obtain the visually salient regions containing ship targets, achieving a coarse, probable detection of ships. In the second stage, based on the Bayesian theory of binary hypothesis detection, a local maximum a posteriori probability (MAP) classifier is designed for the classification of pixels. After parameter estimation and application of the judgment criterion, pixels in the salient regions are classified into the two classes. In the paper, several types of satellite image data, such as TerraSAR-X (TS-X) and Radarsat-2, are used to evaluate the performance of the detection methods. Compared with classical CFAR detection algorithms, experimental results show that the algorithm achieves better suppression of false alarms caused by speckle noise and the inhomogeneity of the ocean clutter background. At the same time, the detection speed is increased by 25% to 45%.
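
    The first stage's saliency model follows the spectral-residual idea: subtract a locally averaged log-amplitude spectrum from the original, then recombine with the unchanged phase. A minimal single-scale version (filter sizes and the threshold rule are assumptions, not the paper's tuned values):

        import numpy as np
        from scipy.ndimage import uniform_filter, gaussian_filter

        def spectral_residual_saliency(img):
            """Single-scale spectral residual saliency map."""
            f = np.fft.fft2(img.astype(float))
            log_amp = np.log(np.abs(f) + 1e-9)
            phase = np.angle(f)
            residual = log_amp - uniform_filter(log_amp, size=3)  # spectral residual
            sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
            return gaussian_filter(sal, sigma=2.5)                 # smooth the map

        # Candidate ship regions = pixels above an adaptive threshold, e.g.:
        # sal = spectral_residual_saliency(sar_chip); mask = sal > 3 * sal.mean()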

  8. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    Science.gov (United States)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques to detect signals of known structure with amplitude smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
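
    To make the PFA notion concrete: for a single matched-filter test at a known position in white Gaussian noise, the statistic is Gaussian under the null hypothesis, so the threshold for a desired PFA follows from the inverse survival function. A minimal sketch with an illustrative signal shape, noise level and PFA; the paper's point is precisely that this per-position calibration breaks down when the position is unknown:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        template = np.exp(-0.5 * ((np.arange(64) - 32) / 4.0) ** 2)  # known shape
        sigma = 1.0                                                   # noise std

        # MF statistic T = <x, s>; under H0, T ~ N(0, sigma^2 * ||s||^2).
        threshold = sigma * np.linalg.norm(template) * norm.isf(1e-3)  # PFA = 1e-3

        x = 0.5 * template + rng.normal(0, sigma, 64)   # data with a weak signal
        T = x @ template
        print("detect" if T > threshold else "no detection", T, threshold)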

  9. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  10. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  11. Correlator bank detection of gravitational wave chirps--False-alarm probability, template density, and thresholds: Behind and beyond the minimal-match issue

    International Nuclear Information System (INIS)

    Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.

    2004-01-01

    The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness
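
    A first-order version of the bank-wide calibration problem follows from the union (Bonferroni) bound: the whole-bank false-alarm probability is at most N times the single-correlator one, so the per-template threshold must rise with the bank size. A numerical sketch for Gaussian statistics (the paper's Bonferroni-type inequalities are sharper and account for template correlations):

        from scipy.stats import norm

        N = 10_000          # templates in the bank (illustrative)
        pfa_bank = 1e-2     # desired false-alarm probability for the whole bank

        # Union bound: set the single-correlator PFA to pfa_bank / N.
        threshold = norm.isf(pfa_bank / N)   # in units of the noise std
        print(f"per-template threshold ≈ {threshold:.2f} sigma")
        # The bound is tight for weakly correlated templates; correlated banks
        # need the sharper inequalities developed in the paper.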

  12. Genet-specific DNA methylation probabilities detected in a spatial epigenetic analysis of a clonal plant population.

    Directory of Open Access Journals (Sweden)

    Kiwako S Araki

    Full Text Available In sessile organisms such as plants, spatial genetic structures of populations show long-lasting patterns. These structures have been analyzed across diverse taxa to understand the processes that determine the genetic makeup of organismal populations. For many sessile organisms that mainly propagate via clonal spread, epigenetic status can vary between clonal individuals in the absence of genetic changes. However, fewer previous studies have explored the epigenetic properties in comparison to the genetic properties of natural plant populations. Here, we report the simultaneous evaluation of the spatial structure of genetic and epigenetic variation in a natural population of the clonal plant Cardamine leucantha. We applied a hierarchical Bayesian model to evaluate the effects of membership of a genet (a group of individuals clonally derived from a single seed) and vegetation cover on the epigenetic variation between ramets (clonal plants that are physiologically independent individuals). We sampled 332 ramets in a 20 m × 20 m study plot that contained 137 genets (identified using eight SSR markers). We detected epigenetic variation in DNA methylation at 24 methylation-sensitive amplified fragment length polymorphism (MS-AFLP) loci. There were significant genet effects at all 24 MS-AFLP loci in the distribution of subepiloci. Vegetation cover had no statistically significant effect on variation in the majority of MS-AFLP loci. The spatial aggregation of epigenetic variation is therefore largely explained by the aggregation of ramets that belong to the same genets. By applying hierarchical Bayesian analyses, we successfully identified a number of genet-specific changes in epigenetic status within a natural plant population in a complex context, where genotypes and environmental factors are unevenly distributed. This finding suggests that further studies are required on the spatial epigenetic structure of natural populations of diverse organisms

  13. Genet-specific DNA methylation probabilities detected in a spatial epigenetic analysis of a clonal plant population.

    Science.gov (United States)

    Araki, Kiwako S; Kubo, Takuya; Kudoh, Hiroshi

    2017-01-01

    In sessile organisms such as plants, spatial genetic structures of populations show long-lasting patterns. These structures have been analyzed across diverse taxa to understand the processes that determine the genetic makeup of organismal populations. For many sessile organisms that mainly propagate via clonal spread, epigenetic status can vary between clonal individuals in the absence of genetic changes. However, fewer previous studies have explored the epigenetic properties in comparison to the genetic properties of natural plant populations. Here, we report the simultaneous evaluation of the spatial structure of genetic and epigenetic variation in a natural population of the clonal plant Cardamine leucantha. We applied a hierarchical Bayesian model to evaluate the effects of membership of a genet (a group of individuals clonally derived from a single seed) and vegetation cover on the epigenetic variation between ramets (clonal plants that are physiologically independent individuals). We sampled 332 ramets in a 20 m × 20 m study plot that contained 137 genets (identified using eight SSR markers). We detected epigenetic variation in DNA methylation at 24 methylation-sensitive amplified fragment length polymorphism (MS-AFLP) loci. There were significant genet effects at all 24 MS-AFLP loci in the distribution of subepiloci. Vegetation cover had no statistically significant effect on variation in the majority of MS-AFLP loci. The spatial aggregation of epigenetic variation is therefore largely explained by the aggregation of ramets that belong to the same genets. By applying hierarchical Bayesian analyses, we successfully identified a number of genet-specific changes in epigenetic status within a natural plant population in a complex context, where genotypes and environmental factors are unevenly distributed. This finding suggests that further studies are required on the spatial epigenetic structure of natural populations of diverse organisms, particularly for

  14. High-altitude cosmic ray neutrons: probable source for the high-energy protons of the earth's radiation belts

    International Nuclear Information System (INIS)

    Hajnal, F.; Wilson, J.

    1992-01-01

    'Full Text:' Several high-altitude cosmic-ray neutron measurements were performed by the NASA Ames Laboratory in the mid- to late 1970s using airplanes flying at about 13 km altitude along constant geomagnetic latitudes of 20, 44 and 51 degrees north. Bonner spheres and manganese, gold and aluminium foils were used in the measurements. In addition, large moderated BF-3 counters served as normalizing instruments. Data analyses performed at that time did not provide complete and unambiguous spectral information and field intensities. Recently, using our new unfolding methods and codes, and Bonner-sphere response function extensions for higher energies, 'new' neutron spectral intensities were obtained, which show progressive hardening of the neutron spectra as a function of increasing geomagnetic latitude, with substantial increases in the energy region from 10 MeV to 10 GeV. For example, we found that the total neutron fluences at 20 and 51 degrees magnetic north are in the ratio of 1 to 5.2, and the 10 MeV to 10 GeV fluence ratio is 1 to 18. The magnitude of these ratios is quite remarkable. From the new results, the derived absolute neutron energy distribution is of the correct strength and shape for albedo neutrons to be the main source of the high-energy protons trapped in the Earth's inner radiation belt. In addition, the results, depending on the extrapolation scheme used, indicate that the neutron dose equivalent rate may be as high as 0.1 mSv/h near the geomagnetic north pole and thus a significant contributor to the radiation exposures of pilots, flight attendants and the general public. (author)

  15. fatalityCMR: capture-recapture software to correct raw counts of wildlife fatalities using trial experiments for carcass detection probability and persistence time

    Science.gov (United States)

    Peron, Guillaume; Hines, James E.

    2014-01-01

    Many industrial and agricultural activities involve wildlife fatalities by collision, poisoning or other involuntary harvest: wind turbines, highway networks, utility networks, tall structures, pesticides, etc. Impacted wildlife may benefit from official protection, including the requirement to monitor the impact. Carcass counts can often be conducted to quantify the number of fatalities, but they need to be corrected for carcass persistence time (removal by scavengers and decay) and detection probability (searcher efficiency). In this article we introduce a new piece of software that fits a superpopulation capture-recapture model to raw count data. It uses trial data to estimate detection and daily persistence probabilities. A recurrent issue is that fatalities of rare, protected species are infrequent, in which case the software offers the option to switch to an 'evidence of absence' mode, i.e., to estimate the number of carcasses that may have been missed by field crews. The software allows distinguishing between different turbine types (e.g., different vegetation cover under turbines, or different technical properties), as well as between two carcass age-classes or states, with transitions between those classes (e.g., fresh and dry). There is a data simulation capacity that may be used at the planning stage to optimize sampling design. Resulting mortality estimates can be used 1) to quantify the required amount of compensation, 2) to inform mortality projections for proposed development sites, and 3) to inform decisions about management of existing sites.
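
    The correction the software automates can be caricatured in one line: if the trials give searcher efficiency p and the probability r that a carcass persists until a search, a Horvitz-Thompson-style estimate of true fatalities from a raw count C is C/(p·r). The real model is a superpopulation capture-recapture fit, not this closed form; the function below is only a toy:

        def corrected_fatalities(raw_count, searcher_efficiency, persistence_prob):
            """Naive Horvitz-Thompson correction of a raw carcass count."""
            return raw_count / (searcher_efficiency * persistence_prob)

        # e.g. 12 carcasses found, 70% searcher efficiency, 60% persist to a search
        print(corrected_fatalities(12, 0.70, 0.60))   # ≈ 28.6 estimated fatalities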

  16. Highly sensitive detection of a current ripple

    International Nuclear Information System (INIS)

    Aoki, Takashi; Gushiken, Tutomu; Nishikigouri, Kazutaka; Kumada, Masayuki.

    1996-01-01

    In the HIMAC, there are six thyristor-controlled power sources for driving the two synchrotrons. These are three-output-terminal power sources, equipped with a positive output, a negative output and a neutral point as a common-mode countermeasure. As the electromagnet circuits are connected to the three-output-terminal power sources, they are of the three-line type. Inside the thyristor-controlled power source circuits there is an oscillation peculiar to the power sources, and the voltage variation induces current spikes. In order to assess the results of the common-mode countermeasures in the power source and electromagnet circuits, as one method of cross-checking, it was considered that, since the electromagnet current divides between the bridging resistance and the coil, the ripple components of the common mode and normal mode can be detected with high sensitivity if attention is paid to the current on the bridging-resistance side; this was verified. The present state of improving the performance of synchrotron power sources is explained. The cross-check of the method of assessing the performance of electromagnet power sources is reported. The method of measuring the ripple current and the results of the measurement are reported. (K.I.)

  17. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease with high transmissibility that spreads around the world, with considerable morbidity and mortality, and presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat-tailed" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study successfully offers a sound modeling strategy and a methodological avenue to implement forecasting of an epidemic in the midst of its course.
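
    The POT step can be reproduced with scipy: pick a threshold, fit a generalized Pareto distribution to the exceedances, and convert tail quantiles into return levels. The series and threshold below are synthetic placeholders, not the Zhejiang incidence data:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        incidence = rng.pareto(1.5, size=120) * 100   # synthetic monthly counts

        u = np.quantile(incidence, 0.90)              # POT threshold
        exceedances = incidence[incidence > u] - u
        shape, loc, scale = genpareto.fit(exceedances, floc=0)

        # m-month return level: value exceeded on average once every m months.
        rate = len(exceedances) / len(incidence)
        def return_level(m):
            return u + genpareto.ppf(1 - 1 / (m * rate), shape, loc=0, scale=scale)

        print(return_level(12), return_level(60))     # ~1-year and ~5-year events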

  18. Changes in the high-mountain vegetation of the Central Iberian Peninsula as a probable sign of global warming.

    Science.gov (United States)

    Sanz-Elorza, Mario; Dana, Elías D; González, Alberto; Sobrino, Eduardo

    2003-08-01

    Aerial images of the high summits of the Spanish Central Range reveal significant changes in vegetation over the period 1957 to 1991. These changes include the replacement of high-mountain grassland communities dominated by Festuca aragonensis, typical of the Cryoro-Mediterranean belt, by shrub patches of Juniperus communis ssp. alpina and Cytisus oromediterraneus from lower altitudes (Oro-Mediterranean belt). Climatic data indicate a shift towards warmer conditions in this mountainous region since the 1940s, with the shift being particularly marked from 1960. Changes include significantly higher minimum and maximum temperatures, fewer days with snow cover and a redistribution of monthly rainfall. Total yearly precipitation showed no significant variation. There were no marked changes in land use during the time frame considered, although there were minor changes in grazing species in the 19th century. It is hypothesized that the advance of woody species into higher altitudes is probably related to climate change, which could have acted in conjunction with discrete variations in landscape management. The pronounced changes observed in the plant communities of the area reflect the susceptibility of high-mountain Mediterranean species to environmental change.

  19. The prevalence of probable delayed-sleep-phase syndrome in students from junior high school to university in Tottori, Japan.

    Science.gov (United States)

    Hazama, Gen-i; Inoue, Yuichi; Kojima, Kazushige; Ueta, Toshiyuki; Nakagome, Kazuyuki

    2008-09-01

    Delayed sleep phase syndrome (DSPS) is a circadian rhythm sleep disorder with a typical onset in the second decade of life. DSPS is characterized by sleep-onset insomnia and difficulty in waking at the desired time in the morning. Although DSPS is associated with an inability to attend school, its prevalence has been controversial. To elucidate changes in the prevalence of DSPS among the young population, an epidemiological survey was conducted on Japanese students. A total of 4,971 students of junior high school, senior high school, and university were enrolled in this cross-sectional study in Tottori Prefecture. They answered an anonymous screening questionnaire regarding school schedule, sleep hygiene and symptomatic items of sleep disorders. The prevalence of probable DSPS was estimated at 0.48% among the total subject students, without gender difference. In university, the prevalence among last-year students showed the highest value (1.66%), while that among first-year students showed the lowest value (0.09%) of all school years from junior high school to university. The prevalence increased with advancing university school years. Thus, a considerable number of Japanese students are affected by DSPS. Senior university students are more vulnerable to the disorder than younger students. An appropriate school schedule may decrease the mismatch between the individual's sleep-wake cycle and the school schedule. Promotion of regular sleep habits is necessary to prevent DSPS among this population.

  20. Viscosity measurement - probably a means for detecting radiation treatment of spices. Viskositaetsmessung - ein Verfahren zur Identifizierung strahlenbehandelter Gewuerze

    Energy Technology Data Exchange (ETDEWEB)

    Heide, L; Albrich, S; Boegl, K W; Mohr, E; Wichmann, G

    1987-12-01

    The viscosity of a total of 13 different spices and dried vegetables was measured. Optimal conditions were first determined for each product, i.e. concentration, pH value, temperature, particle size and soaking time. For method evaluation, examinations were primarily performed to study the effect of storage, the reproducibility, and the influence of different varieties of the same spice. In addition, for pepper, the viscosity was measured as a function of radiation dose. In summary, after preliminary experiments, significant changes in the gel-forming capability after irradiation could be observed in 8 dried spices (ginger, carrots, leek, cloves, pepper, celery, cinnamon and onions). For 3 spices (ginger, pepper and cinnamon), the results could be substantiated by examining all the different varieties of the same spice. An additional influence of storage time on viscosity could not be demonstrated during the investigation period of 8 months. Generally, it is not possible to identify an irradiated spice on the basis of viscosity measurements alone, since the difference between varieties of one and the same spice is considerably large. However, radiation treatment can be reliably excluded for ginger, pepper and cinnamon if the viscosities are high (10-20 Pa·s).

  1. Impact of high-flux haemodialysis on the probability of target attainment for oral amoxicillin/clavulanic acid combination therapy.

    Science.gov (United States)

    Hui, Katrina; Patel, Kashyap; Kong, David C M; Kirkpatrick, Carl M J

    2017-07-01

    Clearance of small molecules such as amoxicillin and clavulanic acid is expected to increase during high-flux haemodialysis, which may result in lower concentrations and thus reduced efficacy. To date, clearance of amoxicillin/clavulanic acid (AMC) during high-flux haemodialysis remains largely unexplored. Using published pharmacokinetic parameters, a two-compartment model with first-order input was simulated to investigate the impact of high-flux haemodialysis on the probability of target attainment (PTA) of orally administered AMC combination therapy. The following pharmacokinetic/pharmacodynamic targets were used to calculate the PTA. For amoxicillin, the time that the free concentration remains above the minimum inhibitory concentration (MIC) for ≥50% of the dosing interval (≥50% fT>MIC) was used. For clavulanic acid, the time that the free concentration remains >0.1 mg/L for ≥45% of the dosing interval (≥45% fT>0.1 mg/L) was used. Dialysis clearance reported in low-flux haemodialysis for both compounds was doubled to represent the likely clearance during high-flux haemodialysis. Monte Carlo simulations were performed to produce concentration-time profiles over 10 days in 1000 virtual patients. Seven different regimens commonly seen in clinical practice were explored. When AMC was dosed twice daily, the PTA was mostly ≥90% for both compounds regardless of when haemodialysis commenced. When administered once daily, the PTA was 20-30% for clavulanic acid and ≥90% for amoxicillin. The simulations suggest that once-daily orally administered AMC in patients receiving high-flux haemodialysis may result in insufficient concentrations of clavulanic acid to effectively treat infections, especially on days when haemodialysis occurs. Copyright © 2017 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
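
    A PTA calculation of this kind is a Monte Carlo loop over virtual patients: draw pharmacokinetic parameters, compute concentration-time profiles, and evaluate the fT>MIC target. The sketch below uses a one-compartment bolus model with invented parameters purely to show the mechanics; the study used a published two-compartment model with first-order input and dialysis clearance:

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented parameters: dosing interval (h), dose (mg), MIC (mg/L).
        N, TAU, DOSE, MIC = 1000, 12.0, 500.0, 8.0
        CL = rng.lognormal(np.log(3.0), 0.3, N)   # clearance, L/h
        V = rng.lognormal(np.log(15.0), 0.2, N)   # volume of distribution, L
        ke, t = CL / V, np.linspace(0, TAU, 241)

        # Steady-state concentration-time profiles (bolus, superposition).
        conc = (DOSE / V[:, None]) * np.exp(-ke[:, None] * t) \
               / (1 - np.exp(-ke[:, None] * TAU))
        fT_above = (conc > MIC).mean(axis=1)      # fraction of interval above MIC

        pta = (fT_above >= 0.50).mean()           # target: >=50% fT>MIC
        print(f"PTA ≈ {pta:.0%}")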

  2. High-speed detection of DNA translocation in nanopipettes

    Science.gov (United States)

    Fraccari, Raquel L.; Ciccarella, Pietro; Bahrami, Azadeh; Carminati, Marco; Ferrari, Giorgio; Albrecht, Tim

    2016-03-01

    We present a high-speed electrical detection scheme based on a custom-designed CMOS amplifier which allows the analysis of DNA translocation in glass nanopipettes on a microsecond timescale. Translocation of different DNA lengths in KCl electrolyte provides a scaling factor of the DNA translocation time equal to p = 1.22, which is different from values observed previously with nanopipettes in LiCl electrolyte or with nanopores. Based on a theoretical model involving electrophoresis, hydrodynamics and surface friction, we show that the experimentally observed range of p-values may be the result of, or at least be affected by, DNA adsorption and friction between the DNA and the substrate surface.

  3. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    Science.gov (United States)

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

    The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
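
    An exceedance probability in this sense is the posterior-predictive tail mass at each observed count: values near 0 or 1 flag units the fitted model cannot explain. A generic sketch given MCMC output (array names and the flagging rule are assumptions):

        import numpy as np

        def exceedance_probability(y_obs, y_rep):
            """P(replicate > observed) + 0.5 * P(replicate == observed), per unit.

            y_obs : (n_units,) observed counts (e.g., FMD-infected communes)
            y_rep : (n_draws, n_units) posterior-predictive replicates
            """
            greater = (y_rep > y_obs).mean(axis=0)
            ties = (y_rep == y_obs).mean(axis=0)
            return greater + 0.5 * ties   # mid-p adjustment for discrete counts

        # Provinces with values outside, say, (0.025, 0.975) are anomalies:
        # far fewer or more outbreaks reported than the model expects.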

  4. From Remotely Sensed Vegetation Onset to Sowing Dates: Aggregating Pixel-Level Detections into Village-Level Sowing Probabilities

    Directory of Open Access Journals (Sweden)

    Eduardo Marinho

    2014-11-01

    Full Text Available Monitoring the start of the crop season in the Sahel provides decision makers with valuable information for an early assessment of potential production and food security threats. Presently, the most common method for the estimation of sowing dates in West African countries consists of applying given thresholds to rainfall estimates. However, the coarse spatial resolution and the possible inaccuracy of these estimates are limiting factors. In this context, the remote sensing approach, which consists of deriving green-up onset dates from satellite remote sensing data, appears to be an interesting alternative. It builds upon a novel statistical model that translates vegetation onset detections derived from MODIS time series into sowing probabilities at the village level. Results for Niger show that this approach outperforms the standard method adopted in the region based on rainfall thresholds.

  5. High sensitivity neutron bursts detecting system

    International Nuclear Information System (INIS)

    Shyam, A.; Kaushik, T.C.; Srinivasan, M.; Kulkarni, L.V.

    1993-01-01

    Technique and instrumentation to detect the multiplicity of fast neutrons emitted in sharp bursts have been developed. A bank of 16 BF3 detectors in an appropriate thermalising assembly, with efficiency ∼16%, is used to detect neutron bursts. The output from this setup, through appropriate electronics, is divided into two paths. The first is directly connected to a computer-controlled scaler. The second is connected to another similar scaler through a delay time unit (DTU). The DTU design is such that once it is triggered by a count pulse, it does not allow any counts to be recorded for a fixed dead time set at ∼100 μs. The difference between the counts recorded directly and through the DTU gives the total number of neutrons produced in bursts. This setup is being used to study lattice cracking, anomalous effects in solid deuterium systems and various reactor physics experiments. (author). 3 refs., 1 fig

  6. High on Crime Fiction and Detection

    DEFF Research Database (Denmark)

    Grodal, Torben Kragh

    2010-01-01

    how crime fiction activates strong salience (in some respects similar to the effect of dopamine drugs like cocaine, Ritalin, and amphetamine) and discusses the role of social intelligence in crime fiction. It further contrasts unempathic classical detective fiction with two subtypes of crime fiction that blend seeking with other emotions: hardboiled crime fiction, which blends detection with action and hot emotions like anger and bonding, and moral crime fiction, which strongly evokes moral disgust and contempt, often in conjunction with detectives that perform hard-to-fake signals

  7. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne

    2013-01-01

    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information on all of these areas. However, it has to be considered that the level of detail information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts e.g. the damage tolerance approach in aerospace already include reliability criteria of NDT methods applied in maintenance plans. NDT is also an essential part during construction and maintenance of nuclear power plants. In Germany, type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. The knowledge of the probability of detection (POD) curves of specific flaws in specific testing conditions is often not present. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. The continuative objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning is discussed, and results of the ultrasound inspections are presented. In
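
    Once hit/miss results are available, the POD curve itself is usually estimated with a logistic model in log flaw size, from which characteristic sizes such as a50 and a90 are read off. A minimal sketch with made-up inspection data (confidence bounds such as a90/95 would need extra work):

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical hit/miss results: crack depth (mm), detected (1) or missed (0)
        a   = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0])
        hit = np.array([0,   0,   1,   0,   1,   1,   1,   1,   1,   1  ])

        X = sm.add_constant(np.log(a))
        fit = sm.GLM(hit, X, family=sm.families.Binomial()).fit()

        b0, b1 = fit.params
        a50 = np.exp(-b0 / b1)                        # size with 50% POD
        a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)   # size with 90% POD
        print(f"a50 ≈ {a50:.2f} mm, a90 ≈ {a90:.2f} mm")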

  8. Detecting Faults In High-Voltage Transformers

    Science.gov (United States)

    Blow, Raymond K.

    1988-01-01

    Simple fixture quickly shows whether a high-voltage transformer has excessive voids in its dielectric materials and whether high-voltage lead wires are too close to the transformer case. The fixture is a "go/no-go" indicator; corona appears if the transformer contains such faults. The transformer nests in a wire mesh supported by a cap of clear epoxy. If the transformer has defects, the blue glow of corona appears in the mesh and is seen through the cap.

  9. A low false negative filter for detecting rare bird species from short video segments using a probable observation data set-based EKF method.

    Science.gov (United States)

    Song, Dezhen; Xu, Yiliang

    2010-09-01

    We report a new filter to assist the search for rare bird species. Since a rare bird only appears in front of a camera with very low occurrence (e.g., less than ten times per year) for very short duration (e.g., less than a fraction of a second), our algorithm must have a very low false negative rate. We verify the bird body axis information with the known bird flying dynamics from the short video segment. Since a regular extended Kalman filter (EKF) cannot converge due to high measurement error and limited data, we develop a novel probable observation data set (PODS)-based EKF method. The new PODS-EKF searches the measurement error range for all probable observation data that ensures the convergence of the corresponding EKF in short time frame. The algorithm has been extensively tested using both simulated inputs and real video data of four representative bird species. In the physical experiments, our algorithm has been tested on rock pigeons and red-tailed hawks with 119 motion sequences. The area under the ROC curve is 95.0%. During the one-year search of ivory-billed woodpeckers, the system reduces the raw video data of 29.41 TB to only 146.7 MB (reduction rate 99.9995%).
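
    The PODS variant searches the measurement-error range for observation data that keep the filter convergent; its building block is the standard EKF predict/update cycle. A sketch for a constant-velocity flight model observed in image coordinates (all matrices and noise levels are illustrative, and since this model is linear the EKF reduces to a Kalman filter here):

        import numpy as np

        dt = 1 / 30                                  # video frame interval, s
        F = np.array([[1, 0, dt, 0],                 # constant-velocity transition
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], float)
        H = np.array([[1, 0, 0, 0],                  # position-only measurements
                      [0, 1, 0, 0]], float)
        Q = 0.1 * np.eye(4)                          # process noise (illustrative)
        R = 25.0 * np.eye(2)                         # measurement noise, px^2

        def ekf_step(x, P, z):
            """One predict/update cycle for state x, covariance P, measurement z."""
            x, P = F @ x, F @ P @ F.T + Q            # predict
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
            x = x + K @ (z - H @ x)                  # update
            P = (np.eye(4) - K @ H) @ P
            return x, P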

  10. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  11. Space Shuttle Program (SSP) Orbiter Main Propulsion System (MPS) Gaseous Hydrogen (GH2) Flow Control Valve (FCV) Poppet Eddy Current (EC) Inspection Probability of Detection (POD) Study. Volume 1

    Science.gov (United States)

    Piascik, Robert S.; Prosser, William H.

    2011-01-01

    The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the findings and recommendations from the NESC assessment.

  12. Space Shuttle Program (SSP) Orbiter Main Propulsion System (MPS) Gaseous Hydrogen (GH2) Flow Control Valve (FCV) Poppet Eddy Current (EC) Inspection Probability of Detection (POD) Study. Volume 2; Appendices

    Science.gov (United States)

    Piascik, Robert S.; Prosser, William H.

    2011-01-01

    The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the Appendices to the main report.

  13. Detection and estimation research of high-speed railway catenary

    CERN Document Server

    Liu, Zhigang

    2017-01-01

    This book describes the wave characteristics of contact lines taking wind into consideration and discusses new methods for detecting catenary geometry, pantograph slide fault, and catenary support system faults. It also introduces wire-irregularity detection methods for catenary estimation, and discusses modern spectrum estimation tools for catenary. It is organized in three parts: the first discusses statistical characteristics of pantograph-catenary data, such as stationarity, periodicity, correlation, high-order statistical properties and wave characteristics of contact lines, which are the basis of pantograph-catenary relationship analysis. The second part includes geometry parameter detection and support-system fault detection in catenary, as well as slide-fault detection in pantographs, and presents some new detection algorithms and plans. The final part addresses catenary estimation, including detection of contact-line wire irregularities and estimation of catenary based on spectrum, and presents detec...

  14. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  15. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  16. Prioritizing forest fuels treatments based on the probability of high-severity fire restores adaptive capacity in Sierran forests.

    Science.gov (United States)

    Krofcheck, Daniel J; Hurteau, Matthew D; Scheller, Robert M; Loudermilk, E Louise

    2018-02-01

    In frequent fire forests of the western United States, a legacy of fire suppression coupled with increases in fire weather severity have altered fire regimes and vegetation dynamics. When coupled with projected climate change, these conditions have the potential to lead to vegetation type change and altered carbon (C) dynamics. In the Sierra Nevada, fuels reduction approaches that include mechanical thinning followed by regular prescribed fire are one approach to restore the ability of the ecosystem to tolerate episodic fire and still sequester C. Yet, the spatial extent of the area requiring treatment makes widespread treatment implementation unlikely. We sought to determine if a priori knowledge of where uncharacteristic wildfire is most probable could be used to optimize the placement of fuels treatments in a Sierra Nevada watershed. We developed two treatment placement strategies: the naive strategy, based on treating all operationally available area and the optimized strategy, which only treated areas where crown-killing fires were most probable. We ran forecast simulations using projected climate data through 2,100 to determine how the treatments differed in terms of C sequestration, fire severity, and C emissions relative to a no-management scenario. We found that in both the short (20 years) and long (100 years) term, both management scenarios increased C stability, reduced burn severity, and consequently emitted less C as a result of wildfires than no-management. Across all metrics, both scenarios performed the same, but the optimized treatment required significantly less C removal (naive=0.42 Tg C, optimized=0.25 Tg C) to achieve the same treatment efficacy. Given the extent of western forests in need of fire restoration, efficiently allocating treatments is a critical task if we are going to restore adaptive capacity in frequent-fire forests. © 2017 John Wiley & Sons Ltd.

  17. Fault Analysis and Detection in Microgrids with High PV Penetration

    Energy Technology Data Exchange (ETDEWEB)

    El Khatib, Mohamed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hernandez Alvidrez, Javier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    In this report we focus on analyzing the behaviour of current-controlled PV inverters under faults in order to develop fault detection schemes for microgrids with high PV penetration. An inverter model suitable for steady-state fault studies is presented, and the impact of PV inverters on two protection elements is analyzed. The studied protection elements are the superimposed-quantities-based directional element and the negative-sequence directional element. Additionally, several non-overcurrent fault detection schemes are discussed in this report for microgrids with high PV penetration. A detailed time-domain simulation study is presented to assess the performance of the presented fault detection schemes under different microgrid modes of operation.
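
    The negative-sequence quantity such elements operate on comes from the symmetrical-component transform of the phase-current phasors; a minimal sketch (phasor estimation is assumed done elsewhere):

        import numpy as np

        a = np.exp(2j * np.pi / 3)   # 120-degree rotation operator

        def negative_sequence(ia, ib, ic):
            """Negative-sequence current from phase-current phasors."""
            return (ia + a**2 * ib + a * ic) / 3

        # A balanced set gives I2 ~ 0; an unbalanced fault raises |I2| even when
        # inverter current limiting keeps fault currents close to load levels.
        ia, ib, ic = 1.0 + 0j, a**2 * 1.0, a * 1.0   # balanced 1 pu set
        print(abs(negative_sequence(ia, ib, ic)))     # ≈ 0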

  18. Highly accurate determination of relative gamma-ray detection efficiency for Ge detector and its application

    International Nuclear Information System (INIS)

    Miyahara, H.; Mori, C.; Fleming, R.F.; Dewaraja, Y.K.

    1997-01-01

    When quantitative measurements of γ-rays using High-Purity Ge (HPGe) detectors are made for a variety of applications, accurate knowledge of the γ-ray detection efficiency is required. The emission rates of γ-rays from sources can be determined quickly when the absolute peak efficiency is calibrated. On the other hand, relative peak efficiencies can be used for the determination of intensity ratios for plural samples and for comparison to a standard source. Thus, both absolute and relative detection efficiencies are important in the use of a γ-ray detector. The objective of this work is to determine the relative gamma-ray peak detection efficiency for an HPGe detector with an uncertainty approaching 0.1%. We used nuclides which emit at least two gamma-rays with energies from 700 to 2400 keV, for which the relative emission probabilities are known with uncertainties much smaller than 0.1%. The relative peak detection efficiencies were calculated from measurements of the nuclides 46Sc, 48Sc, 60Co and 94Nb, which emit two γ-rays with emission probabilities of almost unity. It is important that the corrections for the emission probabilities, the cascade summing effect, and the self-absorption are small. A third-order polynomial function in the logarithms of both energy and efficiency was fitted to the data, and the peak efficiency predicted at a given energy from the covariance matrix showed an uncertainty of less than 0.5% except near 700 keV. As an application, the emission probabilities of the 1037.5 and 1212.9 keV γ-rays of 48Sc were determined using this highly precise relative peak efficiency function. They were 0.9777 ± 0.00079 and 0.02345 ± 0.00017 for the 1037.5 and 1212.9 keV γ-rays, respectively. The sum of these probabilities is close to unity within the uncertainty, which indicates that the results are reliable and that the accuracy has been improved considerably
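
    The fitted function in question, a cubic in the logarithms of energy and efficiency, is easy to reproduce once relative efficiencies are tabulated. The calibration values below are placeholders, not the paper's data:

        import numpy as np

        # Hypothetical relative peak efficiencies at calibration energies (keV)
        E   = np.array([778, 889, 1037, 1121, 1173, 1213, 1332, 2376])
        eff = np.array([1.00, 0.91, 0.80, 0.76, 0.73, 0.69, 0.65, 0.42])

        coeffs = np.polyfit(np.log(E), np.log(eff), deg=3)  # cubic in log-log space

        def efficiency(energy_keV):
            return np.exp(np.polyval(coeffs, np.log(energy_keV)))

        # Relative emission probability from two peak areas A1, A2 of one source:
        # p1/p2 = (A1/A2) * efficiency(E2) / efficiency(E1)
        print(efficiency(1037.5) / efficiency(1212.9))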

  19. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming well established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for whom the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
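    The survival-probability step can be made concrete. The sketch below is an editorial illustration using the standard Breslow estimator (not the authors' code or data): given coefficients from an already-fitted penalized Cox model, it computes S(t | x).

    import numpy as np

    # Minimal sketch: S(t0 | x_new) via the Breslow baseline cumulative hazard,
    # assuming beta comes from a penalized (e.g., adaptive lasso) Cox fit.
    def survival_probability(times, events, X, beta, x_new, t0):
        risk = np.exp(X @ beta)                          # exp(x_j beta), all subjects
        event_times = np.sort(np.unique(times[events == 1]))
        H0 = 0.0
        for t in event_times[event_times <= t0]:
            d = np.sum((times == t) & (events == 1))     # events at time t
            at_risk = risk[times >= t].sum()             # risk-set denominator
            H0 += d / at_risk
        return np.exp(-H0 * np.exp(x_new @ beta))

    # Toy data: 5 subjects, 2 biomarkers (illustrative only).
    times  = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
    events = np.array([1, 0, 1, 1, 0])
    X      = np.random.default_rng(0).normal(size=(5, 2))
    beta   = np.array([0.5, -0.3])                       # pretend fitted coefficients
    print(survival_probability(times, events, X, beta, X[0], t0=4.0))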

  20. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from endogenous taxon reference genes, from GMO constructs, screening targets, construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  1. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  2. Detecting Android Malwares with High-Efficient Hybrid Analyzing Methods

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2018-01-01

    Full Text Available In order to tackle the security issues caused by malware on the Android OS, we proposed a highly efficient hybrid detection scheme for Android malware. Our scheme employed different analysis methods (static and dynamic) to construct a flexible detection pipeline. In this paper, we proposed detection techniques such as a Com+ feature, built on the traditional Permission and API-call features, to improve the performance of static detection. The collapsing issue of traditional function-call-graph-based malware detection was also avoided, as we adopted feature selection and a clustering method to unify function-call-graph features of various dimensions into the same dimension. In order to verify the performance of our scheme, we built an open-access malware dataset for our experiments. The experimental results showed that the suggested scheme achieved high malware-detection accuracy, and the scheme could be used to establish Android malware-detection cloud services, which can automatically adopt high-efficiency analysis methods according to the properties of the Android applications.
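    One common way to unify variable-dimension graph features into a fixed dimension is a bag-of-features codebook; the sketch below illustrates that general idea with scikit-learn k-means on toy data. It is an editorial illustration, not necessarily the paper's exact method.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)

    # Each app yields a different number of function-call-graph node features.
    graphs = [rng.normal(size=(n, 8)) for n in (30, 55, 12)]   # toy data

    # 1. Learn a codebook over all node features pooled together.
    codebook = KMeans(n_clusters=16, n_init=10, random_state=0)
    codebook.fit(np.vstack(graphs))

    # 2. Represent every graph as a normalized histogram over the 16 clusters.
    def embed(node_features):
        labels = codebook.predict(node_features)
        hist = np.bincount(labels, minlength=16).astype(float)
        return hist / hist.sum()                 # same dimension for all graphs

    X = np.array([embed(g) for g in graphs])     # ready for any classifier
    print(X.shape)                               # (3, 16)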

  3. Rapid Target Detection in High Resolution Remote Sensing Images Using Yolo Model

    Science.gov (United States)

    Wu, Z.; Chen, X.; Gao, Y.; Li, Y.

    2018-04-01

    Object detection in high-resolution remote sensing images is a fundamental and challenging problem in the field of remote sensing imagery analysis for civil and military applications, owing to complex neighboring environments that can cause recognition algorithms to mistake irrelevant ground objects for target objects. The Deep Convolutional Neural Network (DCNN) is the current approach of choice in object detection for its powerful feature extraction and has achieved state-of-the-art results in computer vision. The common DCNN-based object detection pipeline consists of region proposal, CNN feature extraction, region classification, and post-processing. The YOLO model instead frames object detection as a regression problem: a single CNN predicts bounding boxes and class probabilities in an end-to-end way, which makes prediction faster. In this paper, a YOLO-based model is used for object detection in high-resolution remote sensing images. Experiments on the NWPU VHR-10 dataset and on our airport/airplane dataset obtained from Google Earth show that, compared with the common pipeline, the proposed model speeds up the detection process while maintaining good accuracy.
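    As an editorial illustration of "detection as regression": the sketch below decodes a YOLO-style S × S × (B·5 + C) output tensor into scored boxes, following the original YOLO layout; the tensor here is random, and the code is not the authors'.

    import numpy as np

    S, B, C = 7, 2, 10                     # grid size, boxes per cell, classes
    pred = np.random.default_rng(1).uniform(size=(S, S, B * 5 + C))

    boxes = []
    for i in range(S):
        for j in range(S):
            cls_probs = pred[i, j, B * 5:]
            for b in range(B):
                x, y, w, h, conf = pred[i, j, b * 5: b * 5 + 5]
                # (x, y) are offsets within the cell; (w, h) relative to the image.
                cx, cy = (j + x) / S, (i + y) / S
                score = conf * cls_probs.max()    # class-specific confidence
                if score > 0.25:                  # detection threshold
                    boxes.append((cx, cy, w, h, score, cls_probs.argmax()))
    # 'boxes' would then go through non-maximum suppression.
    print(len(boxes))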

  4. Study of the photo-detection efficiency of FBK High-Density silicon photomultipliers

    International Nuclear Information System (INIS)

    Zappalà, G.; Regazzoni, V.; Acerbi, F.; Ferri, A.; Gola, A.; Paternoster, G.; Zorzi, N.; Piemonte, C.

    2016-01-01

    This work presents a study of the factors contributing to the Photo-Detection Efficiency (PDE) of Silicon Photomultipliers (SiPMs): quantum efficiency, triggering probability, and fill factor. Two different SiPM High-Density technologies are tested: NUV-HD, based on an n-on-p junction, and RGB-HD, based on a p-on-n junction, both developed at FBK, Trento. The quantum efficiency was measured on photodiodes produced along with the SiPMs. The triggering probability, as a function of wavelength and bias voltage, was measured on circular Single Photon Avalanche Diodes (SPADs) with 100% fill factor. Square SPADs, having the same layout as single SiPM cells, were studied to measure the effective fill factor and compare it to the nominal value. The comparison of the circular and square SPADs makes it possible to determine the size of the transition region between the effective active area of the cell and the one defined by the layout.
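    In the conventional notation (the factorization is standard for SiPMs; the symbols below are this editor's, not the record's), the three factors named above combine multiplicatively:

        \mathrm{PDE}(\lambda, V_{\mathrm{ov}}) = \mathrm{QE}(\lambda) \times P_{\mathrm{trig}}(\lambda, V_{\mathrm{ov}}) \times \mathrm{FF}

    where QE is the quantum efficiency, P_trig the avalanche triggering probability at overvoltage V_ov, and FF the geometric fill factor; this is why each factor is measured on a dedicated test structure.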

  5. Voltage spike detection in high field superconducting accelerator magnets

    Energy Technology Data Exchange (ETDEWEB)

    Orris, D.F.; Carcagno, R.; Feher, S.; Makulski, A.; Pischalnikov, Y.M.; /Fermilab

    2004-12-01

    A measurement system for the detection of small magnetic-flux changes in superconducting magnets, due either to mechanical motion of the conductor or to flux jumps, has been developed at Fermilab. These flux changes are detected as small-amplitude, short-duration voltage spikes, which are ∼15 mV in magnitude and last for ∼30 μs. The detection system combines an analog circuit for the signal conditioning of two coil segments with a fast data acquisition system for digitizing the results, performing threshold detection, and storing the resultant data. The design of the spike detection system, along with modeling results and a noise analysis, will be presented. Data from tests of high-field Nb3Sn magnets at currents up to ∼20 kA will also be shown.

  6. Voltage spike detection in high field superconducting accelerator magnets

    International Nuclear Information System (INIS)

    Orris, D.F.; Carcagno, R.; Feher, S.; Makulski, A.; Pischalnikov, Y.M.

    2004-01-01

    A measurement system for the detection of small magnetic-flux changes in superconducting magnets, due either to mechanical motion of the conductor or to flux jumps, has been developed at Fermilab. These flux changes are detected as small-amplitude, short-duration voltage spikes, which are ∼15 mV in magnitude and last for ∼30 μs. The detection system combines an analog circuit for the signal conditioning of two coil segments with a fast data acquisition system for digitizing the results, performing threshold detection, and storing the resultant data. The design of the spike detection system, along with modeling results and a noise analysis, will be presented. Data from tests of high-field Nb3Sn magnets at currents up to ∼20 kA will also be shown.

  7. Bayesian Peptide Peak Detection for High Resolution TOF Mass Spectrometry.

    Science.gov (United States)

    Zhang, Jianqiu; Zhou, Xiaobo; Wang, Honghui; Suffredini, Anthony; Zhang, Lin; Huang, Yufei; Wong, Stephen

    2010-11-01

    In this paper, we address the issue of peptide ion peak detection for high-resolution time-of-flight (TOF) mass spectrometry (MS) data. A novel Bayesian peptide ion peak detection method is proposed for TOF data with resolution of 10 000-15 000 full width at half-maximum (FWHM). MS spectra exhibit distinct characteristics at this resolution, which are captured in a novel parametric model. Based on the proposed parametric model, a Bayesian peak detection algorithm based on Markov chain Monte Carlo (MCMC) sampling is developed. The proposed algorithm is tested on both simulated and real datasets. The results show a significant improvement in detection performance over a commonly employed method. The results also agree with experts' visual inspection. Moreover, better detection consistency is achieved across MS datasets from patients with identical pathological conditions.

  8. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
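    The two-pass consistency argument can be made concrete (an editorial illustration, not the paper's code): a matching observer that responds "A" with probability p on identical trials repeats its response with probability p² + (1−p)², which is near 0.5 when p is near 0.5, whereas a deterministic observer always repeats.

    import numpy as np

    rng = np.random.default_rng(0)
    p = 0.6              # posterior probability of response "A" on a given trial
    n = 100_000

    pass1 = rng.random(n) < p        # matching observer, first presentation
    pass2 = rng.random(n) < p        # identical trials, second presentation
    print((pass1 == pass2).mean())   # empirical repeat rate
    print(p**2 + (1 - p)**2)         # analytic value, here 0.52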

  9. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
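    As an editorial illustration of the univariate starting point the paper extends (not the authors' code): fit a GPD to exceedances over a high threshold of "normal" training data and score test points by their tail probability.

    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(0)
    normal_data = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

    u = np.quantile(normal_data, 0.95)           # high threshold
    excesses = normal_data[normal_data > u] - u

    shape, loc, scale = genpareto.fit(excesses, floc=0)   # location fixed at 0

    def tail_prob(x):
        """P(X > x | training data), valid for x above the threshold u."""
        return 0.05 * genpareto.sf(x - u, shape, loc=0, scale=scale)

    # Flag a test point as "abnormal" if its tail probability is tiny.
    print(tail_prob(u * 2.0) < 1e-3)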

  10. Kinase detection with gallium nitride based high electron mobility transistors.

    Science.gov (United States)

    Makowski, Matthew S; Bryan, Isaac; Sitar, Zlatko; Arellano, Consuelo; Xie, Jinqiao; Collazo, Ramon; Ivanisevic, Albena

    2013-07-01

    A label-free kinase detection system was fabricated by the adsorption of gold nanoparticles functionalized with a kinase inhibitor onto AlGaN/GaN high electron mobility transistors (HEMTs). The HEMTs were operated near the threshold voltage, where sensitivity is greatest. The Au NP/HEMT biosensor system electrically detected 1 pM SRC kinase in ionic solutions. These results are pertinent to drug-development applications associated with kinase sensing.

  11. Proposal to detect an emission of unusual super-high energy electrons in electron storage rings

    Directory of Open Access Journals (Sweden)

    Da-peng Qian

    2014-01-01

    Full Text Available According to an extended Lorentz-Einstein mass formula that incorporates the uncertainty principle, it is predicted that electron beams passing through an accelerating electric field should, with small probability, generate anomalous super-high-energy electrons whose energies are much higher than the beam energy. The author's preliminary experimental results at an electron storage ring hint at such signs, so a stricter search for this unusual phenomenon is suggested, in order to test the extended mass formula as well as a more complete special relativity.

  12. Muon radiography technology for detecting high-Z materials

    International Nuclear Information System (INIS)

    Ma Lingling; Wang Wenxin; Zhou Jianrong; Sun Shaohua; Liu Zuoye; Li Lu; Du Hongchuan; Zhang Xiaodong; Hu Bitao

    2010-01-01

    This paper studies the possibility of using the scattering of cosmic muons to identify threatening high-Z materials. Various threat-material detection scenarios are simulated with the Geant4 toolkit. The PoCA (Point of Closest Approach) track-reconstruction algorithm gives 3D radiographic images of the target material. The Z-discrimination capability, the effects of the placement of high-Z materials and of shielding materials inside the cargo, and the spatial resolution of the position-sensitive detectors required for muon radiography are carefully studied. Our results show that a detector position resolution of 50 μm is good enough for detecting shielded materials. (authors)
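    As an editorial illustration of the PoCA step (not the authors' code): each muon's scattering is assigned to the point of closest approach between its incoming and outgoing straight-line tracks, a standard closest-point-between-two-lines computation.

    import numpy as np

    def poca(p1, d1, p2, d2):
        """Midpoint of closest approach between lines p1 + t*d1 and p2 + s*d2."""
        d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
        w0 = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b                    # ~0 for (near-)parallel tracks
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
        return ((p1 + t * d1) + (p2 + s * d2)) / 2   # PoCA vertex

    # Hypothetical entry/exit track fits (positions in cm, directions unitless).
    entry_point, entry_dir = np.array([0.0, 0.0, 0.0]), np.array([0.01, 0.0, 1.0])
    exit_point,  exit_dir  = np.array([0.5, 0.0, 50.0]), np.array([0.05, 0.0, 1.0])
    print(poca(entry_point, entry_dir, exit_point, exit_dir))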

  13. The research of high voltage switchgear detecting unit

    Science.gov (United States)

    Ji, Tong; Xie, Wei; Wang, Xiaoqing; Zhang, Jinbo

    2017-07-01

    In order to understand the status of a high-voltage switch over its whole life cycle, the mechanical and electrical parameters that affect device health must be monitored. This paper therefore presents a new high-voltage switchgear detection unit based on ARM technology. It measures the closing/opening mechanical waveform, the storage-motor current waveform, and the contact temperature to judge the device's health status. When something goes wrong, it can raise an alert and give advice. Practice has shown that it meets the requirements for online detection of circuit-breaker mechanical properties and temperature.

  14. Detection of artifacts from high energy bursts in neonatal EEG.

    Science.gov (United States)

    Bhattacharyya, Sourya; Biswas, Arunava; Mukherjee, Jayanta; Majumdar, Arun Kumar; Majumdar, Bandana; Mukherjee, Suchandra; Singh, Arun Kumar

    2013-11-01

    Detection of non-cerebral activities or artifacts, intermixed within the background EEG, is essential so that they can be discarded from subsequent pattern analysis. The problem is much harder in neonatal EEG, where the background EEG contains spikes, waves, and rapid fluctuations in amplitude and frequency. Existing artifact detection methods are mostly limited to detecting only a subset of artifacts such as ocular, muscle or power-line artifacts. Few methods integrate different modules, each for detection of one specific category of artifact. Furthermore, most of the reference approaches are implemented and tested on adult EEG recordings. Direct application of those methods to neonatal EEG causes performance deterioration, due to greater pattern variation and inherent complexity. A method for detection of a wide range of artifact categories in neonatal EEG is thus required. At the same time, the method should be specific enough to preserve the background EEG information. The current study describes a feature-based classification approach to detect both repetitive (generated from ECG, EMG, pulse, respiration, etc.) and transient (generated from eye blinking, eye movement, patient movement, etc.) artifacts. It focuses on artifact detection within high-energy burst patterns, instead of detecting artifacts within the complete background EEG with its wide pattern variation. The objective is to find true burst patterns, which can later be used to identify the burst-suppression (BS) pattern commonly observed during newborn seizure. Such selective artifact detection is shown to be more sensitive to artifacts and more specific to bursts than existing artifact detection approaches applied to the complete background EEG. Several time-domain, frequency-domain, statistical features, and features generated by wavelet decomposition are analyzed to model the proposed binary classification between burst and artifact segments. A feature selection method is also applied to select the

  15. Development of Probability Evaluation Methodology for High Pressure/Temperature Gas Induced RCS Boundary Failure and SG Creep Rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byung Chul; Hong, Soon Joon; Lee, Jin Yong; Lee, Kyung Jin; Lee, Kuh Hyung [FNC Tech. Co., Seoul (Korea, Republic of)

    2008-04-15

    The existing MELCOR 1.8.5 model was improved with respect to severe-accident natural circulation, a MELCOR 1.8.6 input model was developed, and calculation sheets for the detailed MELCOR 1.8.6 model were produced. The effects of natural-circulation modeling were assessed by simulating an SBO accident and comparing the existing model with the detailed model. The major phenomena and system operations that affect natural circulation of high-temperature, high-pressure gas were investigated, and representative accident sequences for creep-rupture modeling of the RCS pipeline and SG tubes were selected.

  16. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  17. Gender Expression, Violence, and Bullying Victimization: Findings from Probability Samples of High School Students in 4 US School Districts

    Science.gov (United States)

    Gordon, Allegra R.; Conron, Kerith J.; Calzo, Jerel P.; White, Matthew T.; Reisner, Sari L.; Austin, S. Bryn

    2018-01-01

    Background: Young people may experience school-based violence and bullying victimization related to their gender expression, independent of sexual orientation identity. However, the associations between gender expression and bullying and violence have not been examined in racially and ethnically diverse population-based samples of high school…

  18. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Science.gov (United States)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk-management discussion and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.

  19. Analysis of HIV-1 intersubtype recombination breakpoints suggests region with high pairing probability may be a more fundamental factor than sequence similarity affecting HIV-1 recombination.

    Science.gov (United States)

    Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun

    2016-09-21

    As data on HIV-1 accumulate, molecular models describing the mechanistic details of HIV-1 genetic recombination require updating. An incomplete structural understanding of the copy-choice mechanism, along with several other unresolved issues in the field, led us to analyze the correlation between breakpoint distributions and (1) the probability of base pairing and (2) intersubtype genetic similarity, to further explore structural mechanisms. Near-full-length sequences of URFs from Asia, Europe, and Africa (one sequence per patient), and representative sequences of worldwide CRFs, were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed in detail with jpHMM. The relationships between breakpoint distributions and (1) the probability of base pairing and (2) intersubtype genetic similarities were then investigated. A Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and unexplained limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which was further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors, in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate that regions with high pairing probability may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our

  20. Individual tree detection based on densities of high points of high resolution airborne lidar

    NARCIS (Netherlands)

    Abd Rahman, M.Z.; Gorte, B.G.H.

    2008-01-01

    The retrieval of individual tree location from Airborne LiDAR has focused largely on utilizing canopy height. However, high resolution Airborne LiDAR offers another source of information for tree detection. This paper presents a new method for tree detection based on high points’ densities from a

  1. Detectability of Gravitational Waves from High-Redshift Binaries.

    Science.gov (United States)

    Rosado, Pablo A; Lasky, Paul D; Thrane, Eric; Zhu, Xingjiang; Mandel, Ilya; Sesana, Alberto

    2016-03-11

    Recent nondetection of gravitational-wave backgrounds from pulsar timing arrays casts further uncertainty on the evolution of supermassive black hole binaries. We study the capabilities of current gravitational-wave observatories to detect individual binaries and demonstrate that, contrary to conventional wisdom, some are, in principle, detectable throughout the Universe. In particular, a binary with rest-frame mass ≳ 10^10 M⊙ can be detected by current timing arrays at arbitrarily high redshifts. The same claim will apply for less massive binaries with more sensitive future arrays. As a consequence, future searches for nanohertz gravitational waves could be expanded to target evolving high-redshift binaries. We calculate the maximum distance at which binaries can be observed with pulsar timing arrays and other detectors, properly accounting for redshift and using realistic binary waveforms.

  2. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  4. Detection of Patients at High Risk of Medication Errors

    DEFF Research Database (Denmark)

    Sædder, Eva Aggerholm; Lisby, Marianne; Nielsen, Lars Peter

    2016-01-01

    Medication errors (MEs) are preventable and can result in patient harm and increased expenses in the healthcare system in terms of hospitalization, prolonged hospitalizations and even death. We aimed to develop a screening tool to detect acutely admitted patients at low or high risk of MEs...

  5. Fiber based hydrophones for ultra-high energy neutrino detection

    NARCIS (Netherlands)

    Buis, E.J.; Doppenberg, E.J.J.; Eijk, D. van; Lahmann, R.; Nieuwland, R.A.; Toet, P.M.

    2014-01-01

    It is a well-studied process [1, 2] that energy deposition by cosmic-ray particles in water generates thermo-acoustic signals. Hydrophones of sufficient sensitivity could measure this signal and provide a means of detecting ultra-high-energy cosmic neutrinos. We investigate optical

  6. Dynamical explanation for the high water abundance detected in Orion

    International Nuclear Information System (INIS)

    Elitzur, M.

    1979-01-01

    Shock-wave chemistry is suggested as the likely explanation for the high water abundance recently detected in Orion by Phillips et al. The existence of such a shock and its inferred properties are in agreement with other observations of Orion, such as the broad velocity feature and the H2 vibrational emission. Shock waves are proposed as the likely explanation for the high water abundances observed in other sources, such as the strong H2O masers.

  7. Detection of actinides and rare earths in natural matrices with the AGLAE new, high sensitivity detection set-up

    Science.gov (United States)

    Zucchiatti, Alessandro; Alonso, Ursula; Lemasson, Quentin; Missana, Tiziana; Moignard, Brice; Pacheco, Claire; Pichon, Laurent; Camarena de la Mora, Sandra

    2014-08-01

    A series of granite samples (Grimsel and Äspö) enriched by sorption with natU (10⁻³ M, 10⁻⁴ M, 10⁻⁵ M in solution) and La (10⁻³ M, 10⁻⁴ M in solution) has been scanned by PIXE over a surface of 1920 × 1920 mm², together with non-enriched Grimsel and Äspö granites and a glass standard. An assessment of minimum detection limits (MDLs) for several elements has been performed with the use of standard materials. Due to mapping and the high sensitivity of the new AGLAE detection system, U levels around 30 ppm can be detected from the whole PIXE spectrum (one low-energy detector and four summed filtered detectors), while U-rich grains, inhomogeneously distributed over the surface, can be clearly identified in the multi-elemental maps and analyzed separately. Even the nominally enriched samples have La levels below the MDL, probably because precipitation of the element (rather than adsorption) mostly took place, and the precipitates were eliminated during the surface cleaning carried out before the PIXE analyses. The new set-up comprises a multi-detector system with a PIXE detection solid angle much wider than in any other similar set-up (a factor of 2-5); higher event selectivity, given by the possibility of filtering up to 4 PIXE detectors individually; a double RBS detector; new Ion Beam Induced Luminescence (IBIL) spectrometry; and gamma spectrometry, with full mapping capability in air assisted by powerful event-by-event reconstruction software. These features allow lower Minimum Detection Limits (MDLs), which are highly beneficial to the analysis of cultural heritage objects, generally meaning a reduction of irradiation time. Paintings can then be studied without any damage to pigments that have color-change tendencies, which was a major drawback of the previous system. Alternatively, the new set-up could allow an increase in the information collected in equal time, particularly considering the detectors' fast response and therefore the potential for high beam currents when sample damage can be

  8. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, G.G.; Man'ko, V.I.

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  9. Landslide detection using very high-resolution satellite imageries

    Science.gov (United States)

    Suga, Yuzo; Konishi, Tomohisa

    2012-10-01

    The heavy rain brought by Typhoon No. 12 caused a landslide disaster on the Kii Peninsula in the central part of Japan. We propose a quick-response method for landslide disaster mapping using very high resolution (VHR) satellite imagery. Synthetic Aperture Radar (SAR) is especially effective because it is capable of all-weather, day/night observation. In this study, multi-temporal COSMO-SkyMed images were used to detect the landslide areas. It was difficult to detect the landslide areas using only the backscatter-change pattern derived from pre- and post-disaster COSMO-SkyMed images, so the authors adopted a correlation analysis in which a moving window is used to compute the correlation coefficient. A low correlation coefficient reflects land-cover change between the pre- and post-disaster images; this analysis is effective for the detection of landslides using SAR data. The detected landslide areas were compared with the areas detected in an EROS-B high-resolution optical image. In addition, we have developed a 3D viewing system for geospatial visualization of the damaged area using these satellite data together with a digital elevation model. The 3D viewing system supports geographic measurement of elevation, area, and volume, cross-section drawing, landscape viewing, and image-layer construction on a mobile personal computer with interactive operation. It was thereby verified that a quick response to a landslide disaster at the initial stage can be performed effectively with very high resolution optical and SAR satellite data by means of the 3D viewing system.
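    As an editorial illustration of the moving-window correlation analysis (toy data, not the study's imagery): low local correlation between pre- and post-event SAR amplitudes flags candidate change pixels.

    import numpy as np

    def local_correlation(pre, post, w=7):
        """Correlation coefficient in a (w x w) window around every pixel."""
        h, width = pre.shape
        r = np.ones((h, width))
        k = w // 2
        for i in range(k, h - k):
            for j in range(k, width - k):
                a = pre[i - k:i + k + 1, j - k:j + k + 1].ravel()
                b = post[i - k:i + k + 1, j - k:j + k + 1].ravel()
                r[i, j] = np.corrcoef(a, b)[0, 1]
        return r

    rng = np.random.default_rng(0)
    pre = rng.gamma(4.0, size=(64, 64))                  # toy speckled amplitudes
    post = pre.copy()
    post[20:40, 20:40] = rng.gamma(4.0, size=(20, 20))   # simulated landslide scar
    change_mask = local_correlation(pre, post) < 0.3     # candidate change pixels
    print(change_mask.sum())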

  10. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  11. Study on high power ultraviolet laser oil detection system

    Science.gov (United States)

    Jin, Qi; Cui, Zihao; Bi, Zongjie; Zhang, Yanchao; Tian, Zhaoshuo; Fu, Shiyou

    2018-03-01

    Laser-Induced Fluorescence (LIF) is a widely used new telemetry technology. It obtains information about oil spills and oil-film thickness by analyzing the characteristics of the stimulated fluorescence, and it has important applications in the rapid analysis of water composition. An LIF detection system for marine oil pollution is designed in this paper, using a 355 nm high-energy pulsed laser as the excitation light source and a high-sensitivity image intensifier in the detector. The host computer sends a digital signal through a serial port to achieve nanosecond-scale range-gate width control of the image intensifier. The target fluorescence spectrum is displayed on the image intensifier by adjusting the delay time and the width of the pulse signal, and the spectral image is coupled to a CCD by lens imaging for spectral display and data analysis on a computer. The system was used to detect oil films floating on the water surface at a distance of 25 m, and the fluorescence spectra of different oil products were obtained; the differences between them are evident. The experimental results show that the system can realize high-precision long-range fluorescence detection and accurately reflect the fluorescence characteristics of the target, with broad application prospects in marine oil-pollution identification and oil-film thickness detection.

  12. High mobility ZnO nanowires for terahertz detection applications

    International Nuclear Information System (INIS)

    Liu, Huiqiang; Peng, Rufang; Chu, Shijin; Chu, Sheng

    2014-01-01

    An oxide nanowire material was utilized for terahertz detection purposes. High-quality ZnO nanowires were synthesized and field-effect transistors were fabricated. Electrical transport measurements showed that the nanowires have good transfer characteristics and fairly high electron mobility. It is shown that ZnO nanowires can be used as building blocks for terahertz detectors based on a one-dimensional plasmon detection configuration. Clear terahertz-wave (∼0.3 THz) induced photovoltages were obtained at room temperature with varying incidence intensities. Further analysis showed that the terahertz photoresponse is closely related to the high electron mobility of the ZnO nanowire sample, which suggests that oxide nanoelectronics may find useful terahertz applications.

  13. Position sensitive detection of neutrons in high radiation background field.

    Science.gov (United States)

    Vavrik, D; Jakubek, J; Pospisil, S; Vacik, J

    2014-01-01

    We present the development of a high-resolution position-sensitive device for the detection of slow neutrons in an environment of extremely high γ and e⁻ radiation background. We make use of a planar silicon pixelated (pixel size: 55 × 55 μm²) spectroscopic Timepix detector adapted for neutron detection by means of a very thin ¹⁰B converter placed on the detector surface. We demonstrate that the electromagnetic radiation background can be discriminated from the neutron signal by exploiting the fact that each particle type produces characteristic ionization tracks in the pixelated detector. Individual tracks can be distinguished by their 2D shape (in the detector plane) and spectroscopic response using single-event analysis. A Cd sheet served as a thermal-neutron stopper as well as an intensive source of gamma rays and energetic electrons. Highly efficient discrimination was successful even at a very low neutron-to-electromagnetic-background ratio of about 10⁻⁴.

  14. Early results utilizing high-energy fission product gamma rays to detect fissionable material in cargo

    International Nuclear Information System (INIS)

    Slaughter, D.R.; Accatino, M.R.; Alford, O.J.; Bernstein, A.; Descalle, M.; Gosnell, T.B.; Hall, J.M.; Loshak, A.; Manatt, D.R.; McDowell, M.R.; Moore, T.L.; Petersen, D.C.; Pohl, B.A.; Pruet, J.A.; Prussin, S.G.

    2004-01-01

    Full text: A concept for detecting the presence of special nuclear material (235U or 239Pu) concealed in intermodal cargo containers is described. It is based on interrogation with a pulsed beam of 6-8 MeV neutrons; fission events are identified between beam pulses by their β-delayed neutron emission or β-delayed high-energy γ-radiation. The high-energy γ-ray signature is being employed for the first time. Fission-product γ-rays above 3 MeV are distinct from natural radioactivity and from nearly all of the induced activity in a normal cargo. The high-energy γ-radiation is nearly 10× more abundant than the delayed neutrons and penetrates even thick cargoes readily. The concept employs two large (8 × 20 ft) arrays of liquid scintillation detectors that have high efficiency for the detection of both delayed neutrons and delayed γ-radiation. Detector backgrounds and potential interferences with the fission-signature radiation have been identified and quantified. This information, together with the predicted signature strength, has been applied to the estimation of the detection probability for the nuclear material and of false alarm rates. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  15. High resolution surface scanning of Thick-GEM for single photo-electron detection

    International Nuclear Information System (INIS)

    Hamar, G.; Varga, D.

    2012-01-01

    An optical system for high-resolution scanning of TGEM UV-photon detection systems is introduced. The structure exploits the combination of a single Au-coated TGEM under study and an asymmetric MWPC (Close Cathode Chamber) as a post-amplification stage. A pulsed UV LED source with emission down to 240 nm was focused to a spot of 0.07 mm on the TGEM surface, and single photo-electron charge spectra were recorded over selected two-dimensional regions. In this way the TGEM gain (of order 10-100) and the TGEM photo-electron detection efficiency are clearly separated, unlike in the case of continuous illumination. The surface structure connected with TGEM photon detection is well observable, including inefficiencies in the holes and at the symmetry points between holes. The detection efficiency as well as the gas gain fluctuate from hole to hole. The gain is constant within the hexagon around any hole, indicating that the gain depends on the hole geometry and less on the position where the electron enters. The detection-probability map changes strongly with the field strength above the TGEM surface, reflecting the change of the actual surface field configuration. The results can be confronted with position-dependent simulations of TGEM electron transfer and gas multiplication. -- Highlights: ► First demonstration of Thick GEM surface scanning with single photo-electrons. ► A resolution of 0.1 mm is sufficient to identify structures connected to the TGEM surface field structure. ► Gain and detection efficiency are separately measurable. ► Detection efficiency is high in a ring around the holes, and gain is constant in the hexagonal collection regions.

  16. High resolution surface scanning of Thick-GEM for single photo-electron detection

    Energy Technology Data Exchange (ETDEWEB)

    Hamar, G., E-mail: hamar.gergo@wigner.mta.hu [Wigner Research Centre for Physics, Budapest (Hungary); Varga, D., E-mail: vdezso@mail.cern.ch [Eoetvoes Lorand University, Budapest (Hungary)

    2012-12-01

    An optical system for high-resolution scanning of TGEM UV-photon detection systems is introduced. The structure exploits the combination of a single Au-coated TGEM under study and an asymmetric MWPC (Close Cathode Chamber) as a post-amplification stage. A pulsed UV LED source with emission down to 240 nm was focused to a spot of 0.07 mm on the TGEM surface, and single photo-electron charge spectra were recorded over selected two-dimensional regions. In this way the TGEM gain (of order 10-100) and the TGEM photo-electron detection efficiency are clearly separated, unlike in the case of continuous illumination. The surface structure connected with TGEM photon detection is well observable, including inefficiencies in the holes and at the symmetry points between holes. The detection efficiency as well as the gas gain fluctuate from hole to hole. The gain is constant within the hexagon around any hole, indicating that the gain depends on the hole geometry and less on the position where the electron enters. The detection-probability map changes strongly with the field strength above the TGEM surface, reflecting the change of the actual surface field configuration. The results can be confronted with position-dependent simulations of TGEM electron transfer and gas multiplication. -- Highlights: ► First demonstration of Thick GEM surface scanning with single photo-electrons. ► A resolution of 0.1 mm is sufficient to identify structures connected to the TGEM surface field structure. ► Gain and detection efficiency are separately measurable. ► Detection efficiency is high in a ring around the holes, and gain is constant in the hexagonal collection regions.

  17. High sensitive quench detection method using an integrated test wire

    International Nuclear Information System (INIS)

    Fevrier, A.; Tavergnier, J.P.; Nithart, H.; Kiblaire, M.; Duchateau, J.L.

    1981-01-01

    A highly sensitive quench detection method that works even in the presence of an external perturbing magnetic field is reported. The quench signal is obtained from the difference between the voltages at the superconducting winding terminals and at the terminals of a secondary winding strongly coupled to the primary. The secondary winding can consist of a "zero-current strand" of the superconducting cable, not connected to one of the winding terminals, or of an integrated normal test wire inside the superconducting cable. Experimental results on quench detection obtained by this method are described. It is shown that the integrated test wire method leads to efficient and sensitive quench detection, especially in the presence of an external perturbing magnetic field.

  18. Highly sensitive detection of urinary cadmium to assess personal exposure

    Energy Technology Data Exchange (ETDEWEB)

    Argun, Avni A.; Banks, Ashley M.; Merlen, Gwendolynne; Tempelman, Linda A. [Giner, Inc., 89 Rumford Ave., Newton, MA 02466 (United States); Becker, Michael F.; Schuelke, Thomas [Fraunhofer USA – CCL, 1449 Engineering Research Ct., East Lansing, MI 48824 (United States); Dweik, Badawi M., E-mail: bdweik@ginerinc.com [Giner, Inc., 89 Rumford Ave., Newton, MA 02466 (United States)

    2013-04-22

    Highlights: ► An electrochemical sensor capable of detecting cadmium at parts-per-billion levels in urine. ► A novel fabrication method for Boron-Doped Diamond (BDD) ultramicroelectrode (UME) arrays. ► A unique combination of BDD UME arrays and a differential pulse voltammetry algorithm. ► High sensitivity, high reproducibility, and very low noise levels. ► Opportunity for portable operation to assess on-site personal exposure. -- Abstract: A series of Boron-Doped Diamond (BDD) ultramicroelectrode arrays was fabricated and investigated for their performance as electrochemical sensors to detect trace-level metals such as cadmium. The steady-state diffusion behavior of these sensors was validated using cyclic voltammetry, followed by electrochemical detection of cadmium in water and in human urine, demonstrating high sensitivity (>200 μA ppb⁻¹ cm⁻²) and low background current (<4 nA). When an array of ultramicroelectrodes was positioned with optimal spacing, these BDD sensors showed sigmoidal diffusion behavior. They also demonstrated high accuracy, with linear dose dependence, for quantification of cadmium in a certified reference river-water sample from the U.S. National Institute of Standards and Technology (NIST) as well as in a human urine sample spiked with 0.25-1 ppb cadmium.

  19. Profile parameters of wheelset detection for high speed freight train

    Science.gov (United States)

    Yang, Kai; Ma, Li; Gao, Xiaorong; Wang, Li

    2012-04-01

    Because freight trains in China transport goods on freight lines throughout the country and do not depart from or return to an engine shed for long periods, the quality of their wheelsets cannot be monitored effectively. This paper presents a system that uses a laser and a high-speed camera and applies non-contact light-section technology to obtain precise wheelset profile parameters. A clamping-track method is employed to avoid complex modification of the railway ballast. An improved image-tracking algorithm for extracting the centerline from the profile curve is described in detail: to obtain a one-pixel-wide, continuous line, local grey-level maxima are used as direction-control points to steer the tracking. Results from practical experiments show that the system is suited to a high-speed, high-vibration detection environment and can effectively measure the wheelset geometric parameters with high accuracy. The system fills a gap in wheelset detection for freight trains on main lines and is instructive for monitoring wheelset quality.

  20. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  1. High-sensitivity high-selectivity detection of CWAs and TICs using tunable laser photoacoustic spectroscopy

    Science.gov (United States)

    Pushkarsky, Michael; Webber, Michael; Patel, C. Kumar N.

    2005-03-01

    We provide a general technique for evaluating the performance of an optical sensor for the detection of chemical warfare agents (CWAs) in realistic environments and present data from a simulation model based on a field-deployed, discretely tunable ¹³CO₂ laser photoacoustic spectrometer (L-PAS). The results of our calculations show the sensor performance in terms of usable sensitivity as a function of the probability of false positives (PFP). The false positives arise from the presence of many other gases in the ambient air that can act as interferents. Using the L-PAS as it exists today, we can achieve a detection threshold of about 4 ppb for the CWAs while maintaining a PFP of less than 1:10⁶. Our simulation permits us to vary a number of parameters in the model to provide guidance for performance improvement. We find that by using a larger density of laser lines (such as those obtained with tunable semiconductor lasers), improving the detector noise, and maintaining the accuracy of the laser-frequency determination, optical detection schemes can enable CWA sensors with sub-ppb detection capability, along with TIC detection.

  2. Auto Detection For High Level Water Content For Oil Well

    Science.gov (United States)

    Janier, Josefina Barnachea; Jumaludin, Zainul Arifin B.

    2010-06-01

    Auto detection of high-level water content for an oil well is a system that measures the percentage of water in crude oil. This paper discusses an auto-detection system for measuring the water content of crude oil that is applicable to offshore and onshore oil operations. With automation, the water content of a well can be determined immediately, so a well with a high water level can be identified at once and, if necessary, closed. The system measures the fraction of a two-fluid mixture in which the fluids, water and crude oil, have different electrical conductivities. It makes use of a grid sensor, a grid-like pattern of horizontal and vertical wires. When water occupies the space at the intersection of a vertical and a horizontal wire, an electrical signal is detected, showing that the water has completed the circuit path. The detected electrical signals are counted, and the percentage of water is determined as the ratio of detected signals to the total number of signal paths provided. Simulation of the system in MultiSIM showed that it provides the desired result.

  3. Detection of Waterborne Viruses Using High Affinity Molecularly Imprinted Polymers.

    Science.gov (United States)

    Altintas, Zeynep; Gittens, Micah; Guerreiro, Antonio; Thompson, Katy-Anne; Walker, Jimmy; Piletsky, Sergey; Tothill, Ibtisam E

    2015-07-07

    Molecularly imprinted polymers (MIPs) are artificial receptor ligands which can recognize and specifically bind to a target molecule. They are more resistant to chemical and biological damage and inactivation than antibodies. Target-specific MIP nanoparticles are therefore being developed and incorporated into biosensors for the detection of biological toxic agents, such as viruses, bacteria, and fungal toxins, that cause disease and death through environmental contamination. For the first time, a molecularly imprinted polymer (MIP) targeting the bacteriophage MS2 as the template was investigated using a novel solid-phase synthesis method to obtain an artificial affinity ligand for the detection and removal of waterborne viruses with optical sensors. A high affinity between the artificial ligand and the target was found, and a regenerative MIP-based virus detection assay was successfully developed using a new surface plasmon resonance (SPR) biosensor, providing an alternative technology for the specific detection and removal of the waterborne viruses that lead to high disease and death rates all over the world.

  4. High resolution PET breast imager with improved detection efficiency

    Science.gov (United States)

    Majewski, Stanislaw

    2010-06-08

    A highly efficient PET breast imager for detecting lesions in the entire breast, including those located close to the patient's chest wall, is described. The breast imager includes a ring of imaging modules surrounding the imaged breast. Each imaging module includes a slant imaging light guide inserted between a gamma radiation sensor and a photodetector. The slant light guide permits the gamma radiation sensors to be placed in close proximity to the skin of the chest wall, thereby extending the sensitive region of the imager to the base of the breast. Several types of photodetectors are proposed for use in the detector modules, with compact silicon photomultipliers as the preferred choice. The geometry of the detector heads and the arrangement of the detector ring significantly reduce dead regions, thereby improving detection efficiency for lesions located close to the chest wall.

  5. LAT Perspectives in Detection of High Energy Cosmic Ray Electrons

    International Nuclear Information System (INIS)

    Moiseev, Alexander; Ormes, J.F.; Funk, Stefan

    2007-01-01

    The LAT science objectives and capabilities in the detection of high-energy electrons in the energy range from 20 GeV to ∼1.5 TeV are presented. LAT simulations are used to establish the event selections. It is found that, while maintaining an electron detection efficiency of 30%, the residual hadron contamination does not exceed 2-3% of the electron flux. We expect to collect ∼10 million electrons with energies above 20 GeV in one year of observation. Precise spectrum reconstruction with the collected electron statistics opens a unique opportunity to investigate several important problems, such as models of IC radiation, revealing the signatures of nearby sources (e.g., a high-energy cutoff in the electron spectrum), testing the propagation model, and searching for KKDM particle decays through their contribution to the electron spectrum.

  6. Noise reduction in muon tomography for detecting high density objects

    International Nuclear Information System (INIS)

    Benettoni, M; Checchia, P; Cossutta, L; Furlan, M; Gonella, F; Pegoraro, M; Garola, A Rigoni; Ronchese, P; Vanini, S; Viesti, G; Bettella, G; Bonomi, G; Donzella, A; Subieta, M; Zenoni, A; Calvagno, G; Cortelazzo, G; Zanuttigh, P; Calvini, P; Squarcia, S

    2013-01-01

    The muon tomography technique, based on the multiple Coulomb scattering of cosmic-ray muons, has been proposed as a tool to detect the presence of high-density objects inside closed volumes. In this paper a new and innovative method is presented to handle the density fluctuations (noise) of reconstructed images, a well-known problem of this technique. The effectiveness of our method is evaluated using experimental data obtained with a muon tomography prototype located at the Legnaro National Laboratories (LNL) of the Istituto Nazionale di Fisica Nucleare (INFN). The results reported in this paper, obtained with real cosmic-ray data, show that with appropriate image filtering and muon momentum classification, the muon tomography technique can detect high-density materials, such as lead, even when surrounded by light- or medium-density material, in short times. A comparison with algorithms published in the literature is also presented.

  7. Neutron detection in a high gamma-ray background with EJ-301 and EJ-309 liquid scintillators

    International Nuclear Information System (INIS)

    Stevanato, L.; Cester, D.; Nebbia, G.; Viesti, G.

    2012-01-01

    Using a fast digitizer, the neutron–gamma discrimination capability of the new liquid scintillator EJ-309 is compared with that obtained using the standard EJ-301. Moreover, the capability of both scintillation detectors to identify a weak neutron source in a high gamma-ray background is demonstrated. The probability of neutron detection is PD = 95% at the 95% confidence level for a gamma-ray background corresponding to a dose rate of 100 μSv/h.
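
    The discrimination capability referenced here is typically implemented as a tail-to-total charge-integration ratio. A minimal sketch in Python of that standard technique (the gate lengths and cut threshold below are illustrative assumptions, not values from the paper):

      import numpy as np

      def psd_ratio(pulse, onset, short_gate=20, long_gate=120):
          """Tail/total charge ratio of a baseline-subtracted digitized pulse.

          pulse      -- 1-D array of samples from the fast digitizer
          onset      -- sample index where the pulse starts
          short_gate -- samples covering the fast scintillation component
          long_gate  -- samples covering the full pulse
          """
          total = pulse[onset:onset + long_gate].sum()
          tail = pulse[onset + short_gate:onset + long_gate].sum()
          return tail / total if total > 0 else 0.0

      # Neutron recoils excite more of the slow scintillation component, so
      # their tail fraction is larger; a cut on the ratio separates the bands.
      def is_neutron(pulse, onset, threshold=0.18):  # threshold is illustrative
          return psd_ratio(pulse, onset) > threshold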

  8. Detecting highly overlapping community structure by greedy clique expansion

    OpenAIRE

    Lee, Conrad; Reid, Fergal; McDaid, Aaron; Hurley, Neil

    2010-01-01

    In complex networks it is common for each node to belong to several communities, implying a highly overlapping community structure. Recent advances in benchmarking indicate that existing community assignment algorithms that are capable of detecting overlapping communities perform well only when the extent of community overlap is kept to modest levels. To overcome this limitation, we introduce a new community assignment algorithm called Greedy Clique Expansion (GCE). The algorithm identifies d...

  9. A hydrophone prototype for ultra high energy neutrino acoustic detection

    International Nuclear Information System (INIS)

    Cotrufo, A.; Plotnikov, A.; Yershova, O.; Anghinolfi, M.; Piombo, D.

    2009-01-01

    The design of an air-backed fiber-optic hydrophone is presented. Compared with previous models, this prototype is optimized to provide a bandwidth sufficiently large to detect the acoustic signals produced by high energy hadronic showers in water. In addition to the geometrical configuration and the choice of materials, preliminary results of the performance measured in air are presented.

  10. A hydrophone prototype for ultra high energy neutrino acoustic detection

    Energy Technology Data Exchange (ETDEWEB)

    Cotrufo, A. [University of Genoa, Department of Physics, Via Dodecaneso 33, I-16146 (Italy)], E-mail: cotrufo@ge.infn.it; Plotnikov, A.; Yershova, O. [GSI Helmholtz Centre for Heavy Ion Research, GmbH Planckstrasse1, 64291 Darmstadt (Germany); Anghinolfi, M.; Piombo, D. [INFN, University of Genoa, Department of Physics, Via Dodecaneso 33, I-16146 (Italy)

    2009-06-01

    The design of an air-backed fiber-optic hydrophone is presented. Compared with previous models, this prototype is optimized to provide a bandwidth sufficiently large to detect the acoustic signals produced by high energy hadronic showers in water. In addition to the geometrical configuration and the choice of materials, preliminary results of the performance measured in air are presented.

  11. Nonlinear detection for a high rate extended binary phase shift keying system.

    Science.gov (United States)

    Chen, Xian-Qing; Wu, Le-Nan

    2013-03-28

    This paper presents the algorithm and results of a nonlinear detector based on a machine learning technique, the support vector machine (SVM), applied to an efficient modulation system with high data rate and low energy consumption. Simulation results show that the performance achieved by the SVM detector is comparable to that of a conventional threshold decision (TD) detector. Both detectors process the received signals together with the special impacting filter (SIF), which improves energy utilization efficiency. Unlike the TD detector, however, the SVM detector concentrates not only on reducing the BER but also on providing accurate posterior probability estimates (PPEs), which can be used as soft inputs to the LDPC decoder. The complexity of the detector is limited by using four features and simplifying the decision function. In addition, bandwidth-efficient transmission is analyzed with both the SVM and TD detectors. The SVM detector is more robust to the sampling rate than the TD detector. We find that the SVM is suitable for extended binary phase shift keying (EBPSK) signal detection and can provide accurate posterior probabilities for LDPC decoding.
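
    As a rough illustration of the soft-output idea (not the authors' implementation), an SVM with Platt scaling can emit bit posteriors that are mapped to log-likelihood ratios for a soft-decision LDPC decoder; the 4-dimensional feature vectors below are hypothetical stand-ins for features extracted from the SIF output:

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(1000, 4))        # hypothetical SIF-derived features
      y_train = rng.integers(0, 2, 1000)          # transmitted bits

      detector = SVC(kernel="rbf", probability=True)   # Platt scaling -> PPEs
      detector.fit(X_train, y_train)

      X_rx = rng.normal(size=(10, 4))             # features of received symbols
      p1 = detector.predict_proba(X_rx)[:, 1]     # P(bit = 1 | features)
      eps = 1e-12
      llr = np.log((1.0 - p1 + eps) / (p1 + eps)) # soft inputs for the LDPC decoder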

  12. Nonlinear Detection for a High Rate Extended Binary Phase Shift Keying System

    Directory of Open Access Journals (Sweden)

    Le-Nan Wu

    2013-03-01

    This paper presents the algorithm and results of a nonlinear detector based on a machine learning technique, the support vector machine (SVM), applied to an efficient modulation system with high data rate and low energy consumption. Simulation results show that the performance achieved by the SVM detector is comparable to that of a conventional threshold decision (TD) detector. Both detectors process the received signals together with the special impacting filter (SIF), which improves energy utilization efficiency. Unlike the TD detector, however, the SVM detector concentrates not only on reducing the BER but also on providing accurate posterior probability estimates (PPEs), which can be used as soft inputs to the LDPC decoder. The complexity of the detector is limited by using four features and simplifying the decision function. In addition, bandwidth-efficient transmission is analyzed with both the SVM and TD detectors. The SVM detector is more robust to the sampling rate than the TD detector. We find that the SVM is suitable for extended binary phase shift keying (EBPSK) signal detection and can provide accurate posterior probabilities for LDPC decoding.

  13. Irish study of high-density Schizophrenia families: Field methods and power to detect linkage

    Energy Technology Data Exchange (ETDEWEB)

    Kendler, K.S.; Straub, R.E.; MacLean, C.J. [Virginia Commonwealth Univ., Richmond, VA (United States)] [and others]

    1996-04-09

    Large samples of multiplex pedigrees will probably be needed to detect susceptibility loci for schizophrenia by linkage analysis. Standardized ascertainment of such pedigrees from culturally and ethnically homogeneous populations may improve the probability of detecting and replicating linkage. The Irish Study of High-Density Schizophrenia Families (ISHDSF) was formed from standardized ascertainment of multiplex schizophrenia families in 39 psychiatric facilities covering over 90% of the population of Ireland and Northern Ireland. We here describe a phenotypic sample and a subset thereof, the linkage sample. Individuals were included in the phenotypic sample if adequate diagnostic information, based on personal interview and/or hospital records, was available. Only individuals with available DNA were included in the linkage sample. Inclusion of a pedigree in the phenotypic sample required at least two first-, second-, or third-degree relatives with non-affective psychosis (NAP), one of whom had schizophrenia (S) or poor-outcome schizoaffective disorder (PO-SAD). Entry into the linkage sample required DNA samples from at least two individuals with NAP, of whom at least one had S or PO-SAD. Affection was defined by narrow, intermediate, and broad criteria. 75 refs., 6 tabs.

  14. Detecting and locating light atoms from high-resolution STEM images: The quest for a single optimal design.

    Science.gov (United States)

    Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S

    2016-11-01

    In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of light atoms from HR STEM images. To determine the optimal experiment design for locating light atoms, use is made of the so-called Cramér-Rao Lower Bound (CRLB). It is investigated if a single optimal design can be found for both the detection and location problem of light atoms. Furthermore, the incoming electron dose is optimised for both research goals and it is shown that picometre range precision is feasible for the estimation of the atom positions when using an appropriate incoming electron dose under the optimal detector settings to detect light atoms. Copyright © 2016 Elsevier B.V. All rights reserved.
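
    For reference, the bound invoked here is the standard information inequality, stated below in generic notation (the symbols are not the paper's own): for any unbiased estimator of the atomic column positions, the covariance is bounded from below by the inverse Fisher information, whose diagonal gives the attainable variance limit for each position estimate.

      % theta: atomic column positions; w: observed image data with likelihood
      % p(w; theta); \hat{\theta}: any unbiased estimator of theta.
      \operatorname{cov}(\hat{\theta}) \succeq F^{-1}(\theta),
      \qquad
      F(\theta) = \mathbb{E}\!\left[
        \left(\frac{\partial \ln p(w;\theta)}{\partial \theta}\right)
        \left(\frac{\partial \ln p(w;\theta)}{\partial \theta}\right)^{\!\top}
      \right]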

  15. Detecting and locating light atoms from high-resolution STEM images: The quest for a single optimal design

    Energy Technology Data Exchange (ETDEWEB)

    Gonnissen, J.; De Backer, A. [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Dekker, A.J. den [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Delft Center for Systems and Control (DCSC), Delft University of Technology, Mekelweg 2, 2628 CD Delft (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Van Aert, S., E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2016-11-15

    In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of light atoms from HR STEM images. To determine the optimal experiment design for locating light atoms, use is made of the so-called Cramér–Rao Lower Bound (CRLB). It is investigated if a single optimal design can be found for both the detection and location problem of light atoms. Furthermore, the incoming electron dose is optimised for both research goals and it is shown that picometre range precision is feasible for the estimation of the atom positions when using an appropriate incoming electron dose under the optimal detector settings to detect light atoms. - Highlights: • The optimal detector design to detect and locate light atoms in HR STEM is derived. • The probability of error is quantified and used to detect light atoms. • The Cramér–Rao lower bound is calculated to determine the atomic column precision. • Both measures are evaluated and result in the single optimal LAADF detector regime. • The incoming electron dose is optimised for both research goals.

  16. Ultrananocrystalline Diamond Membranes for Detection of High-Mass Proteins

    Science.gov (United States)

    Kim, H.; Park, J.; Aksamija, Z.; Arbulu, M.; Blick, R. H.

    2016-12-01

    Mechanical resonators realized on the nanoscale by now offer applications in mass sensing of biomolecules with extraordinary sensitivity. The general idea is that perfect mechanical mass sensors should be of extremely small size to achieve zepto- or yoctogram sensitivity in weighing single molecules, similar to a classical scale. However, the small effective size and long response time for weighing biomolecules with a cantilever restrict their usefulness as a high-throughput method. Commercial mass spectrometry (MS), on the other hand, such as electrospray ionization and matrix-assisted laser desorption and ionization (MALDI) time-of-flight (TOF) systems with their charge-amplifying detectors, is the gold standard that nanomechanical resonators have to live up to. These methods rely on the ionization and acceleration of biomolecules and subsequent ion detection after a mass selection step, such as TOF. The principle we describe here for ion detection is based on the conversion of the kinetic energy of the biomolecules into thermal excitation of chemical vapor deposition diamond nanomembranes via phonons, followed by phonon-mediated detection via field emission of thermally emitted electrons. We fabricate ultrathin diamond membranes with large lateral dimensions for MALDI TOF MS of high-mass proteins. These diamond membranes are realized by straightforward etching methods based on semiconductor processing. With a minimal thickness of 100 nm and cross sections of up to 400 × 400 μm2, the membranes offer extreme aspect ratios. Ion detection is demonstrated in MALDI TOF analysis over a broad range from insulin to albumin. The resulting detection data show much enhanced resolution compared with existing detectors, which can offer better sensitivity and overall performance in resolving protein masses.

  17. DIRAC: A high resolution spectrometer for pionium detection

    Energy Technology Data Exchange (ETDEWEB)

    Adeva, B. E-mail: adevab@usc.es; Afanasyev, L.; Benayoun, M.; Benelli, A.; Berka, Z.; Brekhovskikh, V.; Caragheorgheopol, G.; Cechak, T.; Chiba, M.; Cima, E.; Constantinescu, S.; Detraz, C.; Dreossi, D.; Drijard, D.; Dudarev, A.; Evangelou, I.; Ferro-Luzzi, M.; Gallas, M.V.; Gerndt, J.; Giacomich, R.; Gianotti, P.; Giardoni, M.; Goldin, D.; Gomez, F.; Gorin, A.; Gortchakov, O.; Guaraldo, C.; Hansroul, M.; Iliescu, M.; Zhabitsky, M.; Karpukhin, V.; Kluson, J.; Kobayashi, M.; Kokkas, P.; Komarov, V.; Kruglov, V.; Kruglova, L.; Kulikov, A.; Kuptsov, A.; Kurochkin, V.; Kuroda, K.-I.; Lamberto, A.; Lanaro, A.; Lapshin, V.; Lednicky, R.; Leruste, P.; Levisandri, P.; Lopez Aguera, A.; Lucherini, V.; Maki, T.; Manthos, N.; Manuilov, I.; Montanet, L.; Narjoux, J.-L.; Nemenov, L.; Nikitin, M.; Nunez Pardo, T.; Okada, K.; Olchevskii, V.; Orecchini, D.; Pazos, A.; Pentia, M.; Penzo, A.; Perreau, J.-M.; Petrascu, C.; Plo, M.; Ponta, T.; Pop, D.; Rappazzo, G.F.; Riazantsev, A.; Rodriguez, J.M.; Rodriguez Fernandez, A.; Romero, A.; Rykalin, V.; Santamarina, C.; Saborido, J.; Schacher, J.; Schuetz, Ch.P.; Sidorov, A.; Smolik, J.; Steinacher, M.; Takeutchi, F.; Tarasov, A.; Tauscher, L.; Tobar, M.J.; Triantis, F.; Trusov, S.; Utkin, V.; Vazquez Doce, O.; Vazquez, P.; Vlachos, S.; Yazkov, V.; Yoshimura, Y.; Zrelov, P

    2003-12-11

    The DIRAC spectrometer has been commissioned at CERN with the aim of detecting π+π− atoms produced by a 24 GeV/c high intensity proton beam in thin foil targets. A challenging apparatus is required to cope with the high interaction rates involved, the triggering of pion pairs with very low relative momentum, and the measurement of the latter with resolution around 0.6 MeV/c. The general characteristics of the apparatus are explained and each part is described in some detail. The main features of the trigger system, data-acquisition, monitoring and set-up performances are also given.

  18. DIRAC: A high resolution spectrometer for pionium detection

    International Nuclear Information System (INIS)

    Adeva, B.; Afanasyev, L.; Benayoun, M.; Benelli, A.; Berka, Z.; Brekhovskikh, V.; Caragheorgheopol, G.; Cechak, T.; Chiba, M.; Cima, E.; Constantinescu, S.; Detraz, C.; Dreossi, D.; Drijard, D.; Dudarev, A.; Evangelou, I.; Ferro-Luzzi, M.; Gallas, M.V.; Gerndt, J.; Giacomich, R.; Gianotti, P.; Giardoni, M.; Goldin, D.; Gomez, F.; Gorin, A.; Gortchakov, O.; Guaraldo, C.; Hansroul, M.; Iliescu, M.; Zhabitsky, M.; Karpukhin, V.; Kluson, J.; Kobayashi, M.; Kokkas, P.; Komarov, V.; Kruglov, V.; Kruglova, L.; Kulikov, A.; Kuptsov, A.; Kurochkin, V.; Kuroda, K.-I.; Lamberto, A.; Lanaro, A.; Lapshin, V.; Lednicky, R.; Leruste, P.; Levisandri, P.; Lopez Aguera, A.; Lucherini, V.; Maki, T.; Manthos, N.; Manuilov, I.; Montanet, L.; Narjoux, J.-L.; Nemenov, L.; Nikitin, M.; Nunez Pardo, T.; Okada, K.; Olchevskii, V.; Orecchini, D.; Pazos, A.; Pentia, M.; Penzo, A.; Perreau, J.-M.; Petrascu, C.; Plo, M.; Ponta, T.; Pop, D.; Rappazzo, G.F.; Riazantsev, A.; Rodriguez, J.M.; Rodriguez Fernandez, A.; Romero, A.; Rykalin, V.; Santamarina, C.; Saborido, J.; Schacher, J.; Schuetz, Ch.P.; Sidorov, A.; Smolik, J.; Steinacher, M.; Takeutchi, F.; Tarasov, A.; Tauscher, L.; Tobar, M.J.; Triantis, F.; Trusov, S.; Utkin, V.; Vazquez Doce, O.; Vazquez, P.; Vlachos, S.; Yazkov, V.; Yoshimura, Y.; Zrelov, P.

    2003-01-01

    The DIRAC spectrometer has been commissioned at CERN with the aim of detecting π + π - atoms produced by a 24 GeV/c high intensity proton beam in thin foil targets. A challenging apparatus is required to cope with the high interaction rates involved, the triggering of pion pairs with very low relative momentum, and the measurement of the latter with resolution around 0.6 MeV/c. The general characteristics of the apparatus are explained and each part is described in some detail. The main features of the trigger system, data-acquisition, monitoring and set-up performances are also given

  19. DIRAC: A High Resolution Spectrometer for Pionium Detection

    CERN Document Server

    Afanasiev, L G; Benelli, A; Berka, Z; Brekhovskikh, V; Caragheorgheopol, G; Cechák, T; Chiba, M; Cima, E; Constantinescu, S; Détraz, C; Dreossi, D; Drijard, Daniel; Dudarev, A; Evangelou, I; Ferro-Luzzi, M; Gallas, M V; Gerndt, J; Giacomich, R; Gianotti, P; Giardoni, M; Goldin, D; Gómez, F; Gorin, A; Gortchakov, O E; Guaraldo, C; Hansroul, M; Iliescu, M A; Zhabitsky, V M; Karpukhin, V V; Kluson, J; Kobayashi, M; Kokkas, P; Komarov, V; Kruglov, V; Kruglova, L; Kulikov, A; Kuptsov, A; Kurochkin, V; Kuroda, K I; Lamberto, A; Lanaro, A; Lapshin, V G; Lednicky, R; Leruste, P; Levisandri, P; López-Aguera, A; Lucherini, V; Mäki, T; Manthos, N; Manuilov, I V; Montanet, Lucien; Narjoux, J L; Nemenov, Leonid L; Nikitin, M; Núñez-Pardo de Vera, M T; Okada, K; Olchevskii, V; Orecchini, D; Pazos, A; Pentia, M; Penzo, Aldo L; Perreau, J M; Petrascu, C; Pló, M; Ponta, T; Pop, D; Rappazzo, G F; Riazantsev, A; Rodríguez, J M; Rodríguez-Fernández, A M; Romero, A; Rykalin, V I; Santamarina-Rios, C; Saborido, J; Schacher, J; Schütz, C P; Sidorov, A; Smolik, J; Steinacher, M; Takeutchi, F; Tarasov, A; Tauscher, Ludwig; Tobar, M J; Triantis, F A; Trusov, S V; Utkin, V; Vázquez-Doce, O; Vázquez, P; Vlachos, S; Yazkov, V; Yoshimura, Y; Zrelov, V P

    2003-01-01

    The DIRAC spectrometer has been commissioned at CERN with the aim of detecting $\\pi^+ \\pi^-$ atoms produced by a 24 GeV/$c$ high intensity proton beam in thin foil targets. A challenging apparatus is required to cope with the high interaction rates involved, the triggering of pion pairs with very low relative momentum, and the measurement of the latter with resolution around 0.6 MeV/$c$. The general characteristics of the apparatus are explained and each part is described in some detail. The main features of the trigger system, data-acquisition, monitoring and setup performances are also given.

  20. Optical intensity modulation direct detection versus heterodyne detection: A high-SNR capacity comparison

    KAUST Repository

    Chaaban, Anas

    2016-09-15

    An optical wireless communications system which employs either intensity modulation and direct detection (IM-DD) or heterodyne detection (HD) is considered. IM-DD has lower complexity and cost than HD but also lower capacity, so it is interesting to investigate the capacity gap between the two systems. The main focus of this paper is this gap at high SNR. Bounds on the gap are established for two cases: between IM-DD and HD, and between IM-DD and HD-PAM, an HD system employing pulse-amplitude modulation (PAM). While the gap between IM-DD and HD increases as the signal-to-noise ratio (SNR) increases, the gap between IM-DD and HD-PAM is upper bounded by a constant at high SNR. © 2015 IEEE.

  1. Optical intensity modulation direct detection versus heterodyne detection: A high-SNR capacity comparison

    KAUST Repository

    Chaaban, Anas; Alouini, Mohamed-Slim

    2016-01-01

    An optical wireless communications system which employs either intensity modulation and direct detection (IM-DD) or heterodyne detection (HD) is considered. IM-DD has lower complexity and cost than HD but also lower capacity, so it is interesting to investigate the capacity gap between the two systems. The main focus of this paper is this gap at high SNR. Bounds on the gap are established for two cases: between IM-DD and HD, and between IM-DD and HD-PAM, an HD system employing pulse-amplitude modulation (PAM). While the gap between IM-DD and HD increases as the signal-to-noise ratio (SNR) increases, the gap between IM-DD and HD-PAM is upper bounded by a constant at high SNR. © 2015 IEEE.

  2. High Altitude Aerial Natural Gas Leak Detection System

    Energy Technology Data Exchange (ETDEWEB)

    Richard T. Wainner; Mickey B. Frish; B. David Green; Matthew C. Laderer; Mark G. Allen; Joseph R. Morency

    2006-12-31

    The objective of this program was to develop and demonstrate a cost-effective and power-efficient advanced standoff sensing technology able to detect and quantify, from a high-altitude (> 10,000 ft) aircraft, natural gas leaking from a high-pressure pipeline. The advanced technology is based on an enhanced version of the Remote Methane Leak Detector (RMLD) platform developed previously by Physical Sciences Inc. (PSI). The RMLD combines a telecommunications-style diode laser, fiber-optic components, and low-cost DSP electronics with the well-understood principles of Wavelength Modulation Spectroscopy (WMS), to indicate the presence of natural gas located between the operator and a topographic target. The transceiver transmits a laser beam onto a topographic target and receives some of the laser light reflected by the target. The controller processes the received light signal to deduce the amount of methane in the laser's path. For use in the airborne platform, we modified three aspects of the RMLD, by: (1) inserting an Erbium-doped optical fiber laser amplifier to increase the transmitted laser power from 10 mW to 5 W; (2) increasing the optical receiver diameter from 10 cm to 25 cm; and (3) altering the laser wavelength from 1653 nm to 1618 nm. The modified RMLD system provides a path-integrated methane concentration sensitivity of ∼5000 ppm-m, sufficient to detect the presence of a leak from a high capacity transmission line while discriminating against attenuation by ambient methane. In ground-based simulations of the aerial leak detection scenario, we demonstrated the ability to measure methane leaks within the laser beam path when it illuminates a topographic target 2000 m away. We also demonstrated simulated leak detection from ranges of 200 m using the 25 cm optical receiver without the fiber amplifier.
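
    As a back-of-the-envelope companion to these figures, a simple Beer-Lambert model shows how a fractional dip in received power maps to a path-integrated concentration in ppm-m; the effective cross-section below is an illustrative assumption, and the WMS demodulation of the actual instrument is omitted:

      import numpy as np

      N_AIR = 2.479e19  # molecules per cm^3 of air at 1 atm and 296 K

      def ppm_m(received, reference, sigma=1.5e-20):
          """Path-integrated methane concentration in ppm-m (Beer-Lambert).

          received  -- detected power with the laser tuned on the CH4 line
          reference -- detected power off-line (zero-methane baseline)
          sigma     -- effective absorption cross-section in cm^2 (assumed)
          """
          column = -np.log(received / reference) / sigma   # molecules / cm^2
          return column / N_AIR * 1e6 / 100.0              # ppm * m (cm -> m)

      print(ppm_m(0.83, 1.0))  # ~5000 ppm-m: a ~17% dip under these assumptions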

  3. Rapid detection of coliforms in drinking water of Arak city using a multiplex PCR method in comparison with the standard culture method (Most Probable Number)

    Directory of Open Access Journals (Sweden)

    Dehghan fatemeh

    2014-05-01

    Conclusions: The multiplex PCR method, with its shortened operation time, was used for the simultaneous detection of total coliforms and Escherichia coli in the distribution system of Arak city. It is recommended that it be used at least as an initial screening test, with positive samples then randomly confirmed by the MPN method.

  4. High Precision Edge Detection Algorithm for Mechanical Parts

    Directory of Open Access Journals (Sweden)

    Duan Zhenyun

    2018-04-01

    High precision and high efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on a Gaussian integral model is proposed. For this purpose, the Gaussian integral model of the step edge normal section line of the backlight image is constructed, combining the point spread function and a single step model. The gray values of discrete points on the normal section line of the pixel edge are then calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted in accordance with the Gaussian integral model. A precise subpixel edge location is thus determined by searching for the mean point. Finally, a gear tooth was measured with an M&M3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and that subpixel edge location accuracy and computation speed are improved. The maximum error of the gear tooth profile total deviation is 1.9 μm compared with the measurement result from the gear measurement center, indicating that the method is sufficiently reliable for high precision measurement.
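
    A toy version of the central fitting step, under the usual assumption that a step edge blurred by a Gaussian point spread function follows an erf profile; the data and parameter values below are synthetic, not the paper's:

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import erf

      def gaussian_integral_edge(x, x0, sigma, amplitude, offset):
          """Step edge convolved with a Gaussian PSF (an erf profile)."""
          return offset + 0.5 * amplitude * (1.0 + erf((x - x0) / (np.sqrt(2.0) * sigma)))

      # Gray values interpolated along the edge normal, plus measurement noise
      x = np.arange(-10.0, 11.0)
      gray = gaussian_integral_edge(x, 0.37, 1.4, 120.0, 30.0)
      gray += np.random.default_rng(1).normal(0.0, 1.0, x.size)

      p0 = (0.0, 1.0, np.ptp(gray), gray.min())           # rough initial guess
      popt, _ = curve_fit(gaussian_integral_edge, x, gray, p0=p0)
      print(f"subpixel edge location: {popt[0]:.3f} px")  # recovers ~0.37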

  5. High Precision Edge Detection Algorithm for Mechanical Parts

    Science.gov (United States)

    Duan, Zhenyun; Wang, Ning; Fu, Jingshun; Zhao, Wenhui; Duan, Boqiang; Zhao, Jungui

    2018-04-01

    High precision and high efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on a Gaussian integral model is proposed. For this purpose, the Gaussian integral model of the step edge normal section line of the backlight image is constructed, combining the point spread function and a single step model. The gray values of discrete points on the normal section line of the pixel edge are then calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted in accordance with the Gaussian integral model. A precise subpixel edge location is thus determined by searching for the mean point. Finally, a gear tooth was measured with an M&M3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and that subpixel edge location accuracy and computation speed are improved. The maximum error of the gear tooth profile total deviation is 1.9 μm compared with the measurement result from the gear measurement center, indicating that the method is sufficiently reliable for high precision measurement.

  6. Tsunami Arrival Detection with High Frequency (HF Radar

    Directory of Open Access Journals (Sweden)

    Donald Barrick

    2012-05-01

    Quantitative real-time observations of a tsunami have been limited to deep-water, pressure-sensor observations of changes in sea surface elevation and observations of sea level fluctuations at the coast, which are essentially point measurements. Constrained by these data, models have been used for prediction of and warning about the arrival of a tsunami, but to date no system exists for local detection of an actual incoming wave with a significant warning capability. Networks of coastal high frequency (HF) radars now routinely observe surface currents in many countries. We report here on an empirical method for detecting the initial arrival of a tsunami, and demonstrate its use with results from data measured by fourteen HF radar sites in Japan and the USA following the magnitude 9.0 earthquake off Sendai, Japan, on 11 March 2011. The distance offshore at which the tsunami can be detected, and hence the warning time provided, depends on the bathymetry: the wider the shallow continental shelf, the greater this time. We compare arrival times at the radars with those measured by neighboring tide gauges. Arrival times measured by the radars preceded those at neighboring tide gauges by an average of 19 min (Japan) and 15 min (USA). The initial water-height increase due to the tsunami as measured by the tide gauges was moderate, ranging from 0.3 to 2 m. Thus it appears possible to detect even moderate tsunamis using this method; larger tsunamis could obviously be detected further from the coast. We find that tsunami arrival within the radar coverage area can be announced 8 min (i.e., twice the radar spectral time resolution) after its first appearance. This can provide advance warning of the tsunami approach to coastline locations.

  7. Supervised detection of exoplanets in high-contrast imaging sequences

    Science.gov (United States)

    Gomez Gonzalez, C. A.; Absil, O.; Van Droogenbroeck, M.

    2018-06-01

    Context. Post-processing algorithms play a key role in pushing the detection limits of high-contrast imaging (HCI) instruments. State-of-the-art image processing approaches for HCI enable the production of science-ready images relying on unsupervised learning techniques, such as low-rank approximations, for generating a model point spread function (PSF) and subtracting the residual starlight and speckle noise. Aims: In order to maximize the detection rate of HCI instruments and survey campaigns, advanced algorithms with higher sensitivities to faint companions are needed, especially for the speckle-dominated innermost region of the images. Methods: We propose a reformulation of the exoplanet detection task (for ADI sequences) that builds on well-established machine learning techniques to take HCI post-processing from an unsupervised to a supervised learning context. In this new framework, we present algorithmic solutions using two different discriminative models: SODIRF (random forests) and SODINN (neural networks). We test these algorithms on real ADI datasets from VLT/NACO and VLT/SPHERE HCI instruments. We then assess their performances by injecting fake companions and using receiver operating characteristic analysis. This is done in comparison with state-of-the-art ADI algorithms, such as ADI principal component analysis (ADI-PCA). Results: This study shows the improved sensitivity versus specificity trade-off of the proposed supervised detection approach. At the diffraction limit, SODINN improves the true positive rate by a factor ranging from 2 to 10 (depending on the dataset and angular separation) with respect to ADI-PCA when working at the same false-positive level. Conclusions: The proposed supervised detection framework outperforms state-of-the-art techniques in the task of discriminating planet signal from speckles. In addition, it offers the possibility of re-processing existing HCI databases to maximize their scientific return and potentially improve
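
    A minimal sketch of the supervised reformulation in its random-forest flavour (in the spirit of SODIRF, not the authors' released code); patch extraction and fake-companion injection are replaced here by synthetic arrays:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(0)
      speckle = rng.normal(0.0, 1.0, (500, 7 * 7))      # label 0: residual speckle
      companion = rng.normal(0.6, 1.0, (500, 7 * 7))    # label 1: injected companion

      X = np.vstack([speckle, companion])
      y = np.r_[np.zeros(500), np.ones(500)]

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      scores = clf.predict_proba(X)[:, 1]               # detection score per patch
      fpr, tpr, _ = roc_curve(y, scores)                # ROC analysis, as in the paper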

  8. Hexagonal boron nitride neutron detectors with high detection efficiencies

    Science.gov (United States)

    Maity, A.; Grenadier, S. J.; Li, J.; Lin, J. Y.; Jiang, H. X.

    2018-01-01

    Neutron detectors fabricated from 10B-enriched hexagonal boron nitride (h-10BN or h-BN) epilayers have demonstrated the highest thermal neutron detection efficiency among solid-state neutron detectors to date, at about 53%. In this work, photoconductive-like vertical detectors with a detection area of 1 × 1 mm2 were fabricated from 50 μm thick free-standing h-BN epilayers using Ni/Au and Ti/Al bilayers as ohmic contacts. Leakage currents, mobility-lifetime (μτ) products under UV photoexcitation, and neutron detection efficiencies were measured for a total of 16 different device configurations. The results unambiguously identified that detectors incorporating the Ni/Au bilayer on both surfaces as ohmic contacts, with the negatively biased top surface used for neutron irradiation, are the most desirable device configuration. It was noted that the high growth temperatures of h-10BN epilayers on sapphire substrates tend to yield a higher concentration of oxygen impurities near the bottom surface, so that choosing the top surface for irradiation gives better device performance than choosing the bottom. Preferential scattering by oxygen donors tends to reduce the mobility of holes more than that of electrons, making a biasing scheme that rapidly extracts holes at the irradiated surface, while leaving the electrons to travel a large average distance inside the detector, the preferred choice. When measured against a calibrated 6LiF-filled micro-structured semiconductor neutron detector, the optimized configuration was shown to push the detection efficiency of h-BN neutron detectors to 58%. These detailed studies also provided a better understanding of growth-mediated impurities in h-BN epilayers and their effects on charge collection and neutron detection efficiencies.
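
    The mobility-lifetime products discussed here connect to charge collection through the single-carrier Hecht relation, a standard planar-detector model stated below in generic notation (not quoted from the paper):

      % Charge-collection efficiency eta for one carrier species in a planar
      % detector of thickness d under bias V, with mobility-lifetime product
      % mu*tau (drift length lambda = mu*tau*V/d):
      \eta \;=\; \frac{\mu\tau V}{d^{2}}
      \left[\, 1 - \exp\!\left(-\frac{d^{2}}{\mu\tau V}\right) \right]

    Because the collection efficiency falls with the distance a carrier must drift, biasing so that the slower holes are collected at the irradiated surface, as the abstract describes, maximizes the overall charge collection.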

  9. Radar detection of ultra high energy cosmic rays

    Science.gov (United States)

    Myers, Isaac J.

    TARA (Telescope Array Radar) is a cosmic ray radar detection experiment co-located with Telescope Array, the conventional surface scintillation detector (SD) and fluorescence telescope detector (FD) near Delta, UT. The TARA detector combines a 40 kW transmitter and high gain transmitting antenna, which broadcast the radar carrier over the SD array and into the FD field of view, with a 250 MS/s DAQ receiver. Data collection began in August 2013. TARA stands apart from other cosmic ray radar experiments in that radar data are directly compared with conventional cosmic ray detector events, and the transmitter is directly controlled by TARA researchers. Waveforms from the FD-triggered data stream are time-matched with TA events and searched for signal using a novel technique in which the expected (simulated) radar echo of a particular air shower is used as a matched-filter template and compared to the radio waveforms. This technique is used to calculate the radar cross-section (RCS) upper limit for all triggers that correspond to well-reconstructed TA FD monocular events. Our lowest cosmic ray RCS upper limit is 42 cm2 for an 11 EeV event. An introduction to cosmic rays is presented, along with the evolution of detection methods and the need for new techniques, of which radar detection is a candidate. The software simulation of radar scattering from cosmic rays follows. The TARA detector, including the transmitter and receiver systems, is discussed in detail. Our search algorithm and methodology for calculating the RCS are presented so that they can be reproduced. Search results are explained in the context of the usefulness and future of cosmic ray radar detection.

  10. High resolution capacitance detection circuit for rotor micro-gyroscope

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Ren

    2014-03-01

    Conventional methods for rotor position detection in micro-gyroscopes include common exciting electrodes (single frequency) and common sensing electrodes (frequency multiplex), but both have encountered problems. We therefore present a high resolution, low noise pick-off circuit for micro-gyroscopes that utilizes the time multiplex method. The detecting circuit adopts a continuous-time current sensing circuit for capacitance measurement, and a noise analysis of its charge amplifier is introduced. The equivalent output noise power spectral density of the phase-sensitive demodulation is 120 nV/√Hz. Tests revealed that the whole circuit has a relative capacitance resolution of 1 × 10⁻⁸.

  11. DETECTION OF BARCHAN DUNES IN HIGH RESOLUTION SATELLITE IMAGES

    Directory of Open Access Journals (Sweden)

    M. A. Azzaoui

    2016-06-01

    Barchan dunes are the fastest moving sand dunes in the desert. We developed a process to detect barchan dunes in high resolution satellite images. It consists of three steps. We first enhance the image using histogram equalization and noise reduction filters. The second step then eliminates the parts of the image having a texture different from that of barchan dunes: using supervised learning, we tested a coarse-to-fine textural analysis based on the Kolmogorov-Smirnov test and Youden's J-statistic applied to co-occurrence matrices. The output is a mask that is used in the next step to reduce the search area. In the third step we slide a window over the mask and classify SURF features with an SVM to obtain barchan dune candidates. Detected barchan dunes are taken as the fusion of overlapping candidates. The results of this approach were very satisfactory in terms of processing time and precision.
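
    An illustrative sketch of the texture-masking step using gray-level co-occurrence statistics; the features and thresholds below are placeholders rather than the study's tuned values:

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def tile_features(tile):
          """Contrast and homogeneity of one uint8 grayscale image tile."""
          glcm = graycomatrix(tile, distances=[1], angles=[0, np.pi / 2],
                              levels=256, symmetric=True, normed=True)
          return (graycoprops(glcm, "contrast").mean(),
                  graycoprops(glcm, "homogeneity").mean())

      def dune_texture_mask(image, tile=64, max_contrast=80.0, min_homog=0.3):
          """Flag tiles whose texture is compatible with barchan dunes."""
          rows, cols = image.shape[0] // tile, image.shape[1] // tile
          mask = np.zeros((rows, cols), dtype=bool)
          for i in range(rows):
              for j in range(cols):
                  c, h = tile_features(image[i*tile:(i+1)*tile, j*tile:(j+1)*tile])
                  mask[i, j] = (c < max_contrast) and (h > min_homog)
          return mask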

  12. High-Speed Video System for Micro-Expression Detection and Recognition

    Directory of Open Access Journals (Sweden)

    Diana Borza

    2017-12-01

    Micro-expressions play an essential part in understanding non-verbal communication and deceit detection. They are involuntary, brief facial movements that appear when a person is trying to conceal something. Automatic analysis of micro-expressions is challenging due to their low amplitude and short duration (they occur as fast as 1/15 to 1/25 of a second). We propose a complete micro-expression analysis system consisting of a high-speed image acquisition setup and a software framework that can detect the frames in which micro-expressions occur as well as determine the type of the emerged expression. The detection and classification methods use fast and simple motion descriptors based on absolute image differences, and the recognition module involves only the computation of several 2D Gaussian probabilities. The software framework was tested on two publicly available high speed micro-expression databases, and the whole system was used to acquire new data. The experiments we performed show that our solution outperforms state-of-the-art works that use more complex and computationally intensive descriptors.
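
    A toy version of an absolute-difference motion descriptor of the kind described above (face alignment and the Gaussian-probability recognition stage are omitted; the threshold is a placeholder):

      import numpy as np

      def motion_signal(frames):
          """Mean absolute difference of each frame against the first frame.

          frames -- (n, h, w) grayscale high-speed sequence of a face crop
          """
          base = frames[0].astype(float)
          return np.array([np.abs(f.astype(float) - base).mean() for f in frames])

      def candidate_frames(signal, k=3.0):
          """Indices where motion exceeds k deviations above the median level."""
          return np.flatnonzero(signal > np.median(signal) + k * signal.std())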

  13. Table of sample sizes needed to detect at least one defective with 100(1-α)% probability (α = 0.01, 0.05)

    International Nuclear Information System (INIS)

    Stewart, K.B.

    1972-01-01

    Tables are presented which give the random sample size needed in order to be 95% (99%) certain of detecting at least one defective item when there are r defective items in a population of n items. The application of the tables to certain safeguards problems is discussed. The range of the tables is as follows: r = 0(1)25, n = r(1)r + 999. (U.S.)
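
    The logic behind such a table is a hypergeometric "zero defectives" calculation; the short sketch below reproduces it (the printed values are computed by the sketch itself, not copied from the table):

      from math import comb

      def sample_size(n, r, alpha=0.05):
          """Smallest sample size m such that a random sample of m items from a
          population of n containing r defectives includes at least one
          defective with probability >= 1 - alpha."""
          for m in range(n + 1):
              # P(no defectives in the sample) = C(n-r, m) / C(n, m)
              if comb(n - r, m) / comb(n, m) <= alpha:
                  return m
          return n

      print(sample_size(100, 5))        # 45 -> 95% certain of seeing a defective
      print(sample_size(100, 5, 0.01))  # 59 -> 99% certain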

  14. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  15. High-energy Neutrino Emission from Short Gamma-Ray Bursts: Prospects for Coincident Detection with Gravitational Waves

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, Shigeo S.; Murase, Kohta; Mészáros, Peter [Department of Physics, Pennsylvania State University, University Park, PA 16802 (United States); Kiuchi, Kenta [Center for Gravitational Physics, Yukawa Institute for Theoretical Physics, Kyoto, Kyoto 606-8502 (Japan)

    2017-10-10

    We investigate current and future prospects for coincident detection of high-energy neutrinos and gravitational waves (GWs). Short gamma-ray bursts (SGRBs) are believed to originate from mergers of compact star binaries involving neutron stars. We estimate high-energy neutrino fluences from prompt emission, extended emission (EE), X-ray flares, and plateau emission, and we show that neutrino signals associated with the EE are the most promising. Assuming that the cosmic-ray loading factor is ∼10 and the Lorentz factor distribution is lognormal, we calculate the probability of neutrino detection from EE by current and future neutrino detectors, and we find that the quasi-simultaneous detection of high-energy neutrinos, gamma-rays, and GWs is possible with future instruments or even with current instruments for nearby SGRBs having EE. We also discuss stacking analyses that will also be useful with future experiments such as IceCube-Gen2.

  16. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  17. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  18. CEST ANALYSIS: AUTOMATED CHANGE DETECTION FROM VERY-HIGH-RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    M. Ehlers

    2012-08-01

    Fast detection, visualization and assessment of change in areas of crisis or catastrophe are important requirements for the coordination and planning of help. Through the availability of new satellite and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for better detection, delineation and visualization of change. For automated change detection, a large number of algorithms has been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential to be a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency-based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band pass filters can be applied to identify the frequency information relevant for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain, and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location.
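
    A minimal sketch of the frequency-domain branch described above, band-pass filtering both image dates with an FFT and thresholding their difference; the cut-off radii and threshold are illustrative placeholders:

      import numpy as np

      def bandpass(img, r_low=5.0, r_high=60.0):
          """Keep spatial frequencies whose radius (in FFT bins) lies in [r_low, r_high]."""
          F = np.fft.fftshift(np.fft.fft2(img))
          h, w = img.shape
          yy, xx = np.ogrid[:h, :w]
          radius = np.hypot(yy - h / 2.0, xx - w / 2.0)
          F[(radius < r_low) | (radius > r_high)] = 0.0
          return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

      def change_mask(date1, date2, k=3.0):
          """Binary change mask from the difference of band-pass filtered dates."""
          diff = bandpass(date1) - bandpass(date2)
          return np.abs(diff) > k * diff.std()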

  19. Comparison of Wells and Revised Geneva Rule to Assess Pretest Probability of Pulmonary Embolism in High-Risk Hospitalized Elderly Adults.

    Science.gov (United States)

    Di Marca, Salvatore; Cilia, Chiara; Campagna, Andrea; D'Arrigo, Graziella; Abd ElHafeez, Samar; Tripepi, Giovanni; Puccia, Giuseppe; Pisano, Marcella; Mastrosimone, Gianluca; Terranova, Valentina; Cardella, Antonella; Buonacera, Agata; Stancanelli, Benedetta; Zoccali, Carmine; Malatino, Lorenzo

    2015-06-01

    To assess and compare the diagnostic power for pulmonary embolism (PE) of Wells and revised Geneva scores in two independent cohorts (training and validation groups) of elderly adults hospitalized in a non-emergency department. Prospective clinical study, January 2011 to January 2013. Unit of Internal Medicine inpatients, University of Catania, Italy. Elderly adults (mean age 76 ± 12), presenting with dyspnea or chest pain and with high clinical probability of PE or D-dimer values greater than 500 ng/mL (N = 203), were enrolled and consecutively assigned to a training (n = 101) or a validation (n = 102) group. The clinical probability of PE was assessed using Wells and revised Geneva scores. Clinical examination, D-dimer test, and multidetector computed angiotomography were performed in all participants. The accuracy of the scores was assessed using receiver operating characteristic analyses. PE was confirmed in 46 participants (23%) (24 training group, 22 validation group). In the training group, the area under the receiver operating characteristic curve was 0.91 (95% confidence interval (CI) = 0.85-0.98) for the Wells score and 0.69 (95% CI = 0.56-0.82) for the revised Geneva score (P < .001). These results were confirmed in the validation group (P < .05). The positive (LR+) and negative likelihood ratios (LR-) (two indices combining sensitivity and specificity) of the Wells score were superior to those of the revised Geneva score in the training (LR+, 7.90 vs 1.34; LR-, 0.23 vs 0.66) and validation (LR+, 13.5 vs 1.46; LR-, 0.47 vs 0.54) groups. In high-risk elderly hospitalized adults, the Wells score is more accurate than the revised Geneva score for diagnosing PE. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.
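
    For orientation, the likelihood ratios reported above update a pretest probability through Bayes' rule in odds form; a worked example using the study's 23% PE prevalence and the training-group Wells likelihood ratios:

      def posttest_probability(pretest_p, lr):
          """Posttest probability from a pretest probability and a likelihood ratio."""
          odds = pretest_p / (1.0 - pretest_p)   # pretest odds
          posttest_odds = odds * lr              # Bayes' rule in odds form
          return posttest_odds / (1.0 + posttest_odds)

      print(posttest_probability(0.23, 7.90))  # ~0.70: a positive Wells score
      print(posttest_probability(0.23, 0.23))  # ~0.06: a negative Wells score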

  20. REDItools: high-throughput RNA editing detection made easy.

    Science.gov (United States)

    Picardi, Ernesto; Pesole, Graziano

    2013-07-15

    The reliable detection of RNA editing sites from massive sequencing data remains challenging and, although several methodologies have been proposed, no computational tools have been released to date. Here, we introduce REDItools, a suite of Python scripts to perform high-throughput investigation of RNA editing using next-generation sequencing data. REDItools is written in the Python programming language and is freely available at http://code.google.com/p/reditools/. ernesto.picardi@uniba.it or graziano.pesole@uniba.it Supplementary data are available at Bioinformatics online.

  1. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  2. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  3. AMS detection of actinides at high mass separation

    Energy Technology Data Exchange (ETDEWEB)

    Steier, Peter; Lachner, Johannes; Priller, Alfred; Winkler, Stephan; Golser, Robin [University of Vienna, Faculty of Physics, Vienna (Austria); Eigl, Rosmarie [Hiroshima University, Earth and Planetary Systems Science, Hiroshima (Japan); Quinto, Francesca [Institut fuer Nukleare Entsorgung, KIT, Eggenstein-Leopoldshafen (Germany); Sakaguchi, Aya [University of Tsukuba, Center for Research in Isotopes and Environmental Dynamics, Tsukuba (Japan)

    2015-07-01

    AMS is the mass spectrometric method with the highest abundance sensitivity, which is a prerequisite for measurement of the long-lived radioisotope ²³⁶U (t₁/₂ = 23.4 million years). The most successful application so far is oceanography, since anthropogenic ²³⁶U is present in the world oceans at ²³⁶U:²³⁸U ratios from 10⁻¹¹ to 10⁻⁸. We have explored methods to increase the sensitivity and thus to reduce the water volume required to 1 L or less, which significantly reduces the sampling effort. High sensitivity is also necessary to address the expected typical natural isotopic ratios, on the order of ²³⁶U:²³⁸U = 10⁻¹³, with potential applications in geology. With a second 90° analyzer magnet and a new Time-of-Flight beam line, VERA is robust against chemical impurities in the background, which e.g. allows measuring Pu isotopes directly in a uranium matrix. This simplifies chemical sample preparation for actinide detection, and may illustrate why AMS reaches lower detection limits than other mass spectrometric methods with nominally higher detection efficiency.

  4. Highly sensitive detection using microring resonator and nanopores

    Science.gov (United States)

    Bougot-Robin, K.; Hoste, J. W.; Le Thomas, N.; Bienstman, P.; Edel, J. B.

    2016-04-01

    One of the most significant challenges facing physical and biological scientists is the accurate detection and identification of single molecules in free-solution environments. The ability to perform such sensitive and selective measurements opens new avenues for a large number of applications in biological, medical and chemical analysis, where small sample volumes and low analyte concentrations are the norm. Access to information at the single- or few-molecule scale is made possible by a fine combination of recent advances in technology. We propose a novel detection method that combines highly sensitive label-free resonant sensing, obtained with high-Q microcavities, and position control in nanoscale pores (nanopores). In addition to being label-free and highly sensitive, our technique is immobilization-free and does not rely on surface biochemistry to bind probes to a chip. This is a significant advantage, both in terms of reduced biological uncertainty and fewer biological preparation steps. Through the combination of high-Q photonic structures with translocation through a nanopore, at the end of a pipette or through a solid-state membrane, we believe significant advances can be achieved in the field of biosensing. Silicon microrings are highly advantageous in terms of sensitivity, multiplexing and microfabrication, and are chosen for this study. As for the nanopores, we consider both a nanopore at the end of a nanopipette, approached from the pipette with nanometre-precision mechanical control, and, alternatively, solid-state nanopores fabricated through a membrane supporting the ring. Both configurations are discussed in this paper, in terms of implementation and sensitivity.

  5. KEPLER'S OPTICAL SECONDARY ECLIPSE OF HAT-P-7b AND PROBABLE DETECTION OF PLANET-INDUCED STELLAR GRAVITY DARKENING

    Energy Technology Data Exchange (ETDEWEB)

    Morris, Brett M.; Deming, Drake [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Mandell, Avi M. [Goddard Center for Astrobiology, NASA' s Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2013-02-20

    We present observations spanning 355 orbital phases of HAT-P-7 observed by Kepler from 2009 May to 2011 March (Q1-9). We find a shallower secondary eclipse depth than initially announced, consistent with a low optical albedo and the detection of nearly exclusively thermal emission, without a reflected light component. We find an approximately 10 ppm perturbation to the average transit light curve near phase -0.02 that we attribute to a temperature decrease on the surface of the star, phased to the orbit of the planet. This cooler spot is consistent with planet-induced gravity darkening, slightly lagging the sub-planet position due to the finite response time of the stellar atmosphere. The brightness temperature of HAT-P-7b in the Kepler bandpass is T_B = 2733 ± 21 K, and the amplitude of the deviation in stellar surface temperature due to gravity darkening is approximately -0.18 K. The detection of the spot is not statistically unequivocal due to its small amplitude, though additional Kepler observations should be able to verify the astrophysical nature of the anomaly.

  6. Automated Detection of Thermo-Erosion in High Latitude Ecosystems

    Science.gov (United States)

    Lara, M. J.; Chipman, M. L.; Hu, F.

    2017-12-01

    Detecting permafrost disturbance is of critical importance as the severity of climate change and the associated increase in wildfire frequency and magnitude impact regional to global carbon dynamics. However, it has not been possible to evaluate spatiotemporal patterns of permafrost degradation over large regions of the Arctic, due to the limited spatial and temporal coverage of high resolution optical, radar, lidar, or hyperspectral remote sensing products. Here we present the first automated multi-temporal analysis for detecting disturbance in response to permafrost thaw, using meso-scale high-frequency remote sensing products (i.e., the entire Landsat image archive). This approach was developed, tested, and applied in the Noatak National Preserve (26,500 km2) in northwestern Alaska. We identified thermo-erosion (TE) by capturing the indirect spectral signal associated with episodic sediment plumes in adjacent waterbodies following TE disturbance. We isolated this turbidity signal within lakes during summer (mid-summer and late-summer) and annual time-period image composites (1986-2016), using the cloud-based geospatial parallel processing platform Google Earth Engine™ API. We validated the TE detection algorithm using seven consecutive years of sub-meter high resolution imagery (2009-2015) covering 798 (∼33%) of the 2456 total lakes in the Noatak lowlands. Our approach had "good agreement" with sediment pulses and landscape deformation in response to permafrost thaw (overall accuracy and kappa coefficient of 85% and 0.61). We identify active TE as impacting 10.4% of all lakes, though this was interannually variable, with the highest and lowest TE years represented by 1986 (∼41.1%) and 2002 (∼0.7%), respectively. We estimate that thaw slumps, lake erosion, lake drainage, and gully formation account for 23.3%, 61.8%, 12.5%, and 1.3%, respectively, of all active TE across the Noatak National Preserve. Preliminary analysis suggests TE may be subject to a hysteresis effect following extreme climatic

  7. High detection rate of dog circovirus in diarrheal dogs.

    Science.gov (United States)

    Hsu, Han-Siang; Lin, Ting-Han; Wu, Hung-Yi; Lin, Lee-Shuan; Chung, Cheng-Shu; Chiou, Ming-Tang; Lin, Chao-Nan

    2016-06-17

    Diarrhea is one of the most common clinical signs reported in companion animal clinics. Dog circovirus (DogCV) is a new mammalian circovirus that is considered to be a cause of alimentary syndromes such as diarrhea, vomiting and hemorrhagic enteritis. DogCV has previously only been identified in the United States, Italy, Germany (GenBank accession number: KF887949) and China (GenBank accession number: KT946839). Therefore, the aims of this study were to determine the prevalence of DogCV in Taiwan and to explore the correlation between diarrhea and DogCV infection. Clinical specimens were collected between 2012 and 2014 from 207 dogs suffering from diarrhea and 160 healthy dogs. In this study, we developed a sensitive and specific SYBR Green-based real-time PCR assay to detect DogCV in naturally infected animals. Of the analyzed fecal samples from diarrheal and healthy dogs, 58 (28.0%) and 19 (11.9%), respectively, were DogCV positive. The difference in DogCV prevalence between diarrheal and healthy dogs was highly significant (P = 0.0002755). This is the first study to reveal that DogCV is currently circulating in domestic dogs in Taiwan and to demonstrate its high detection rate in dogs with diarrhea.
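
    The reported group difference can be checked with an exact test on the 2×2 table; the abstract does not name the test used, so the choice of Fisher's exact test below is an assumption:

      from scipy.stats import fisher_exact

      table = [[58, 207 - 58],   # diarrheal dogs: DogCV-positive, -negative
               [19, 160 - 19]]   # healthy dogs:   DogCV-positive, -negative
      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(odds_ratio, p_value)  # p on the order of 1e-4, consistent with P = 0.0002755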

  8. Real-time defect detection on highly reflective curved surfaces

    Science.gov (United States)

    Rosati, G.; Boschetti, G.; Biondi, A.; Rossi, A.

    2009-03-01

    This paper presents an automated defect detection system for coated plastic components for the automotive industry. This research activity arose as an evolution of a previous study that employed a non-flat mirror to illuminate and inspect highly reflective curved surfaces. According to this method, the rays emitted by a light source are conveyed onto the surface under investigation by means of a suitably curved mirror. After reflection from the surface, the light rays are collected by a CCD camera, in which coating defects appear as shadows of various shapes and dimensions. In this paper we present an evolution of the above-mentioned method, introducing a simplified mirror set-up in order to reduce the cost and complexity of the defect detection system: a set of plane mirrors is employed instead of the curved one. Moreover, the inspection of parts with multiple bend radii is investigated. A prototype of the machine vision system has been developed in order to test this simplified method. The device is made up of a light projector, a set of plane mirrors for reflecting the light rays, a conveyor belt for handling components, a CCD camera and a desktop PC which performs image acquisition and processing. As in the previous system, defects are identified as shadows within a high-brightness image. At the end of the paper, first experimental results are presented.

  9. DHPLC technology for high-throughput detection of mutations in a durum wheat TILLING population.

    Science.gov (United States)

    Colasuonno, Pasqualina; Incerti, Ornella; Lozito, Maria Luisa; Simeone, Rosanna; Gadaleta, Agata; Blanco, Antonio

    2016-02-17

    Durum wheat (Triticum turgidum L.) is a cereal crop widely grown in the Mediterranean regions; the amber grain is mainly used for the production of pasta, couscous and typical breads. Single nucleotide polymorphism (SNP) detection technologies and high-throughput mutation induction represent a new challenge in wheat breeding to identify allelic variation in large populations. The TILLING strategy makes use of traditional chemical mutagenesis followed by screening for single base mismatches to identify novel mutant loci. Although TILLING has been combined with several sensitive pre-screening methods for SNP analysis, most rely on expensive equipment. Recently, a new low-cost and time-saving DHPLC protocol has been used in human molecular diagnostics to detect unknown mutations. In this work, we developed a new durum wheat TILLING population (cv. Marco Aurelio) using 0.70-0.85% ethyl methane sulfonate (EMS). To investigate the efficiency of the mutagenic treatments, a pilot screening was carried out on 1,140 mutant lines focusing on two target genes (Lycopene epsilon-cyclase, ε-LCY, and Lycopene beta-cyclase, β-LCY) involved in carotenoid metabolism in wheat grains. We simplified heteroduplex detection with two low-cost methods: the enzymatic cleavage (CelI)/agarose gel technique and denaturing high-performance liquid chromatography (DHPLC). The CelI/agarose gel approach allowed us to identify 31 mutations, whereas the DHPLC procedure detected a total of 46 mutations for both genes. All detected mutations were confirmed by direct sequencing. The overall mutation frequency estimated from the pilot assay by the DHPLC methodology was 1/77 kb, representing a high probability of detecting interesting mutations in the target genes. We demonstrated the applicability and efficiency of a new strategy for the detection of induced variability. We produced and characterized a new durum wheat TILLING population useful for a better understanding of key gene functions
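
    The overall mutation frequency in TILLING screens is conventionally the total bases screened divided by the number of mutations found. A sketch of that arithmetic, with hypothetical amplicon lengths (the record does not give them) chosen only to show that plausible sizes reproduce the reported 1/77 kb:

    ```python
    # Mutation frequency = total bases screened / mutations found.
    # Amplicon lengths are assumed here; they are chosen only to show that
    # plausible sizes reproduce the reported ~1/77 kb figure.
    lines_screened = 1140
    amplicon_bp = {'eps-LCY': 1500, 'beta-LCY': 1600}   # hypothetical sizes (bp)
    mutations_found = 46                                # DHPLC total, both genes

    bases_screened = lines_screened * sum(amplicon_bp.values())
    kb_per_mutation = bases_screened / mutations_found / 1000
    print(f"about one mutation per {kb_per_mutation:.0f} kb")   # ~77 kb
    ```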

  10. Microcystin Detection Characteristics of Fluorescence Immunochromatography and High Performance Liquid Chromatography

    International Nuclear Information System (INIS)

    Pyo, Dong Jin; Park, Geun Young; Choi, Jong Chon; Oh, Chang Suk

    2005-01-01

    Different detection characteristics of the fluorescence immunochromatography method and the high performance liquid chromatography (HPLC) method for the analysis of cyanobacterial toxins were studied. In particular, low and high limits of detection, detection time, reproducibility, and detectable microcystin species were compared when the fluorescence immunochromatography method and the high performance liquid chromatography method were applied to the detection of microcystin (MC), a cyclic peptide toxin of the freshwater cyanobacterium Microcystis aeruginosa. The fluorescence immunochromatography assay system has the unique advantages of short detection time and low detection limit, while the high performance liquid chromatography detection method has the strong advantage of individual quantification of several species of microcystins.

  11. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  12. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  13. Low-Probability High-Consequence (LPHC) Failure Events in Geologic Carbon Sequestration Pipelines and Wells: Framework for LPHC Risk Assessment Incorporating Spatial Variability of Risk

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Budnitz, Robert J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-31

    If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of
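
    The spatially distributed character of pipeline risk noted here lends itself to a Poisson failure model along the pipe. A toy sketch, with all rates and lengths illustrative rather than taken from the report:

    ```python
    # Toy spatially distributed risk model: ruptures as a Poisson process along
    # the pipeline. All numbers are illustrative, not from the LBNL report.
    import math

    rupture_rate = 1e-5     # assumed ruptures per mile-year
    length_miles = 1000.0   # assumed dedicated CO2 pipeline length
    years = 30.0

    expected = rupture_rate * length_miles * years
    p_any = 1.0 - math.exp(-expected)
    print(f"expected ruptures: {expected:.2f}; P(>=1 in {years:.0f} y): {p_any:.1%}")
    ```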

  14. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  15. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
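
    For the canonical special case of i.i.d. Gumbel location shifters, the CPGF is the log-sum-exp function and its gradient is the familiar logit (softmax) choice probability. A quick numerical check of that gradient property under this assumption:

    ```python
    # CPGF illustration under i.i.d. Gumbel shifters (ARUM -> logit):
    # G(u) = log(sum_j exp(u_j)), and dG/du_i equals the choice probability P_i.
    import numpy as np

    rng = np.random.default_rng(0)
    u = np.array([1.0, 0.5, -0.3])             # systematic utilities

    softmax = np.exp(u) / np.exp(u).sum()      # gradient of log-sum-exp

    # Monte Carlo over the random utility field: how often each alternative
    # maximizes u_i + eps_i with eps_i ~ Gumbel(0, 1).
    eps = rng.gumbel(size=(1_000_000, u.size))
    mc = np.bincount((u + eps).argmax(axis=1), minlength=u.size) / 1e6

    print(np.round(softmax, 4), np.round(mc, 4))   # the two should nearly agree
    ```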

  16. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
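
    A common first-order treatment of this problem models crossing objects as a kinetic gas, so that collisions follow a Poisson law. The sketch below uses that textbook approximation with illustrative values; it is not the report's own method:

    ```python
    # Kinetic-gas sketch of orbital collision probability; values illustrative
    # only and not the 1972 report's parametric study.
    import math

    density = 1e-8      # assumed objects per km^3 near the station's altitude
    vel_rel = 10.0      # assumed mean relative velocity, km/s
    radius = 0.05       # assumed combined collision radius, km

    area = math.pi * radius ** 2             # collision cross-section, km^2
    flux = density * vel_rel * area          # expected encounters per second
    seconds = 10 * 365.25 * 86400            # ten-year mission
    print(f"P(collision) = {1 - math.exp(-flux * seconds):.3f} over 10 years")
    ```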

  17. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  18. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY. Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction

  19. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  20. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    International Nuclear Information System (INIS)

    Ekonomou, L; Karampelas, P; Vita, V; Chatzarakis, G E

    2011-01-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is employed for evaluating the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application in operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs and better continuity of service.
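
    The record does not spell out the Q-learning formulation, so as a reminder of the underlying update rule that such an ANN approximates, here is a generic tabular sketch; the states, actions, and reward are placeholders, not the arrester model:

    ```python
    # Generic tabular Q-learning update; the paper couples this idea with an ANN
    # and arrester/lightning data, neither of which is reproduced here.
    import numpy as np

    n_states, n_actions = 5, 2
    Q = np.zeros((n_states, n_actions))
    alpha, gamma = 0.1, 0.9                     # learning rate, discount factor

    def q_update(s, a, reward, s_next):
        """One Bellman-style step toward reward + gamma * max_a' Q(s', a')."""
        target = reward + gamma * Q[s_next].max()
        Q[s, a] += alpha * (target - Q[s, a])

    q_update(s=0, a=1, reward=-1.0, s_next=3)   # placeholder transition
    print(Q[0])
    ```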

  1. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    Science.gov (United States)

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is employed for evaluating the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application in operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs and better continuity of service.

  2. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    Science.gov (United States)

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  3. Polypyrrole–gold nanoparticle composites for highly sensitive DNA detection

    International Nuclear Information System (INIS)

    Spain, Elaine; Keyes, Tia E.; Forster, Robert J.

    2013-01-01

    DNA capture surfaces represent a powerful approach to developing highly sensitive sensors for identifying the cause of infection. Electrochemically deposited polypyrrole, PPy, films have been functionalized with electrodeposited gold nanoparticles to give a nanocomposite material, PPy–AuNP. Thiolated capture strand DNA, complementary to the sequence from the pathogen Staphylococcus aureus that causes mammary gland inflammation, was then immobilized onto the gold nanoparticles and any of the underlying gold electrode that is exposed. A probe strand, labelled with horseradish peroxidase, HRP, was then hybridized to the target. The concentration of the target was determined by measuring the current generated by reducing benzoquinone produced by the HRP label. Semi-log plots of the pathogen DNA concentration vs. faradaic current are linear from 150 pM to 1 μM, and pM concentrations can be detected without the need for molecular amplification (e.g., PCR or NASBA). The nanocomposite also exhibits excellent selectivity, and single-base mismatches in a 30-mer sequence can be detected.

  4. Detecting geomorphic processes and change with high resolution topographic data

    Science.gov (United States)

    Mudd, Simon; Hurst, Martin; Grieve, Stuart; Clubb, Fiona; Milodowski, David; Attal, Mikael

    2016-04-01

    The first global topographic dataset was released in 1996, with 1 km grid spacing. It is astonishing that in only 20 years we now have access to tens of thousands of square kilometres of LiDAR data at point densities greater than 5 points per square metre. These data represent a treasure trove of information that our geomorphic predecessors could only dream of. But what are we to do with them? Here we explore the potential of high resolution topographic data to dig deeper into geomorphic processes across a wider range of landscapes, and using much larger spatial coverage, than previously possible. We show how these data can be used to constrain sediment flux relationships using relief and hillslope length, and how they can be used to detect landscape transience. We show how the nonlinear sediment flux law, proposed for upland, soil-mantled landscapes by Roering et al. (1999), is consistent with a number of topographic tests. This flux law allows us to predict how landscapes will respond to tectonic forcing, and we show how these predictions can be used to detect erosion rate perturbations across a range of tectonic settings.
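
    The nonlinear flux law of Roering et al. (1999) cited here has the closed form q_s = K·S / (1 - (S/S_c)^2), with flux diverging as the slope S approaches the critical gradient S_c. A small sketch of that behaviour, with illustrative coefficients:

    ```python
    # Nonlinear hillslope sediment flux law of Roering et al. (1999):
    # q_s = K * S / (1 - (S / S_c)**2); flux diverges as S approaches S_c.
    import numpy as np

    K = 0.005      # illustrative transport coefficient, m^2/yr
    S_c = 1.25     # illustrative critical slope gradient (dimensionless)

    for S in np.linspace(0.1, 1.2, 6):
        q_s = K * S / (1.0 - (S / S_c) ** 2)
        print(f"S = {S:4.2f}  q_s = {q_s:8.5f} m^2/yr")
    ```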

  5. Possible standoff detection of ionizing radiation using high-power THz electromagnetic waves

    Science.gov (United States)

    Nusinovich, Gregory S.; Sprangle, Phillip; Romero-Talamas, Carlos A.; Rodgers, John; Pu, Ruifeng; Kashyn, Dmytro G.; Antonsen, Thomas M., Jr.; Granatstein, Victor L.

    2012-06-01

    Recently, a new method of remote detection of concealed radioactive materials was proposed. This method is based on focusing high-power short wavelength electromagnetic radiation in a small volume where the wave electric field exceeds the breakdown threshold. In the presence of free electrons caused by ionizing radiation, in this volume an avalanche discharge can then be initiated. When the wavelength is short enough, the probability of having even one free electron in this small volume in the absence of additional sources of ionization is low. Hence, a high breakdown rate will indicate that in the vicinity of this volume there are some materials causing ionization of air. To prove this concept a 0.67 THz gyrotron delivering 200-300 kW power in 10 microsecond pulses is under development. This method of standoff detection of concealed sources of ionizing radiation requires a wide range of studies, viz., evaluation of possible range, THz power and pulse duration, production of free electrons in air by gamma rays penetrating through container walls, statistical delay time in initiation of the breakdown in the case of low electron density, temporal evolution of plasma structure in the breakdown and scattering of THz radiation from small plasma objects. Most of these issues are discussed in the paper.
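
    The detection principle rests on Poisson occupancy: the chance of at least one seed electron in the focal volume is 1 - exp(-n·V). A back-of-envelope sketch with an assumed ambient electron density:

    ```python
    # Poisson occupancy: chance of at least one seed electron in the THz focal
    # volume. The ambient free-electron density is an assumed round number.
    import math

    n_e = 100.0          # assumed ambient free electrons per cm^3
    focal_volume = 1e-3  # assumed focal volume, cm^3 (roughly mm^3-scale)

    p_seed = 1.0 - math.exp(-n_e * focal_volume)
    print(f"P(>=1 electron in focus) = {p_seed:.3f}")
    # Nearby ionizing material raises n_e, so a high breakdown rate flags it.
    ```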

  6. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Micro-crack detection in high-performance cementitious materials

    DEFF Research Database (Denmark)

    Lura, Pietro; Guang, Ye; Tanaka, Kyoji

    2005-01-01

    Existing methods, such as X-ray tomography, do not allow sufficient resolution of microcracks. A new technique presented in this paper allows detection of microcracks in cement paste while avoiding artefacts induced by unwanted restraint, drying or temperature variations. The technique consists in casting small circular cylindrical samples of high-performance cement pastes in silicone moulds that exert minimal external restraint. Cast-in steel rods with varying diameter internally restrain the autogenous shrinkage and lead to crack formation. Dimensions of the steel rods are chosen so that the size of this restraining inclusion resembles the aggregate size. Gallium intrusion of the cracks and subsequent examination by electron probe micro analysis, EPMA, are used to identify the cracks. The gallium intrusion technique allows controllable impregnation of cracks in the cement paste. A distinct contrast between gallium and the surrounding material...

  8. Early detection of psychosis: finding those at clinical high risk.

    Science.gov (United States)

    Addington, Jean; Epstein, Irvin; Reynolds, Andrea; Furimsky, Ivana; Rudy, Laura; Mancini, Barbara; McMillan, Simone; Kirsopp, Diane; Zipursky, Robert B

    2008-08-01

    In early detection work, recruiting individuals who meet the prodromal criteria is difficult. The aim of this paper was to describe the development of a research clinic for individuals who appear to be at risk of developing a psychosis, and the process for educating the community and obtaining referrals. The outcome of all referrals to the clinic over a 4-year period was examined. Following an ongoing education campaign that was over-inclusive in order to aid recruitment, approximately 27% of all referrals met the criteria for being at clinical high risk of psychosis. We are seeing only a small proportion of those in the community who eventually go on to develop a psychotic illness. This raises two important issues: first, how to remedy the situation, and second, the impact of this on current research in terms of sampling bias and generalizability of research findings. © 2008 The Authors. Journal compilation © 2008 Blackwell Publishing Asia Pty Ltd.

  9. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  10. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  11. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  12. Highly sensitive and selective detection of dopamine using one-pot synthesized highly photoluminescent silicon nanoparticles.

    Science.gov (United States)

    Zhang, Xiaodong; Chen, Xiaokai; Kai, Siqi; Wang, Hong-Yin; Yang, Jingjing; Wu, Fu-Gen; Chen, Zhan

    2015-03-17

    A simple and highly efficient method for dopamine (DA) detection using water-soluble silicon nanoparticles (SiNPs) was reported. The SiNPs with a high quantum yield of 23.6% were synthesized by using a one-pot microwave-assisted method. The fluorescence quenching capability of a variety of molecules on the synthesized SiNPs has been tested; only DA molecules were found to be able to quench the fluorescence of these SiNPs effectively. Therefore, such a quenching effect can be used to selectively detect DA. All other molecules tested have little interference with the dopamine detection, including ascorbic acid, which commonly exists in cells and can possibly affect the dopamine detection. The ratio of the fluorescence intensity difference between the quenched and unquenched cases versus the fluorescence intensity without quenching (ΔI/I) was observed to be linearly proportional to the DA analyte concentration in the range from 0.005 to 10.0 μM, with a detection limit of 0.3 nM (S/N = 3). To the best of our knowledge, this is the lowest limit for DA detection reported so far. The mechanism of fluorescence quenching is attributed to the energy transfer from the SiNPs to the oxidized dopamine molecules through Förster resonance energy transfer. The reported method of SiNP synthesis is very simple and cheap, making the above sensitive and selective DA detection approach using SiNPs practical for many applications.
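
    The quoted 0.3 nM limit follows the usual S/N = 3 convention, LOD = 3·sigma_blank/slope, applied to the linear ΔI/I calibration. The sketch below shows only the arithmetic; the readings and blank noise are fabricated placeholders, not the paper's data:

    ```python
    # LOD = 3 * sigma_blank / slope applied to a linear DeltaI/I calibration.
    # Readings and blank noise are fabricated placeholders showing the
    # arithmetic only, not the paper's data.
    import numpy as np

    conc = np.array([0.005, 0.05, 0.5, 5.0, 10.0])     # uM, within linear range
    signal = np.array([0.004, 0.041, 0.39, 4.1, 8.2])  # placeholder DeltaI/I

    slope, _ = np.polyfit(conc, signal, 1)
    sigma_blank = 1e-4                                  # placeholder blank noise

    lod = 3 * sigma_blank / slope                       # in uM
    print(f"slope = {slope:.3f} per uM, LOD = {lod * 1e3:.2f} nM")
    ```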

  13. BL153 Partially Prevents High-Fat Diet Induced Liver Damage Probably via Inhibition of Lipid Accumulation, Inflammation, and Oxidative Stress

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2014-01-01

    The present study was designed to investigate whether a magnolia extract, named BL153, can prevent obesity-induced liver damage and to identify the possible protective mechanism. To this end, obese mice were induced by feeding with a high fat diet (HFD, 60% kcal as fat), and age-matched control mice were fed a control diet (10% kcal as fat) for 6 months. Simultaneously, these mice were treated with or without BL153 daily at 3 dose levels (2.5, 5, and 10 mg/kg) by gavage. HFD feeding significantly increased the body weight and the liver weight. Administration of BL153 significantly reduced the liver weight but had no effect on body weight. As a critical step in the development of NAFLD, hepatic fibrosis was induced in the mice fed with HFD, shown by upregulated expression of connective tissue growth factor and transforming growth factor beta 1, which was significantly attenuated by BL153 in a dose-dependent manner. Mechanistic studies revealed that BL153 significantly suppressed HFD-induced hepatic lipid accumulation and oxidative stress and slightly prevented liver inflammation. These results suggest that HFD-induced fibrosis in the liver can be partially prevented by BL153, probably through reduction of hepatic lipid accumulation, inflammation and oxidative stress.

  14. Long distance high power optical laser fiber break detection and continuity monitoring systems and methods

    Science.gov (United States)

    Rinzler, Charles C.; Gray, William C.; Faircloth, Brian O.; Zediker, Mark S.

    2016-02-23

    A monitoring and detection system for use on high power laser systems, long distance high power laser systems and tools for performing high power laser operations. In particular, the monitoring and detection systems provide break detection and continuity protection for performing high power laser operations on, and in, remote and difficult to access locations.

  15. High-temperature superconducting nanowires for photon detection

    Energy Technology Data Exchange (ETDEWEB)

    Arpaia, R. [Quantum Device Physics Laboratory, Department of Microtechnology and Nanoscience, Chalmers University of Technology, S-41296 Göteborg (Sweden); CNR SPIN Institute – Superconductors, Innovative Materials and Devices, UOS–Napoli, I-80100 Napoli (Italy); Dipartimento di Fisica, Università degli Studi di Napoli ‘Federico II’, I-80125 Napoli (Italy); Ejrnaes, M. [CNR SPIN Institute – Superconductors, Innovative Materials and Devices, UOS–Napoli, I-80100 Napoli (Italy); Parlato, L. [CNR SPIN Institute – Superconductors, Innovative Materials and Devices, UOS–Napoli, I-80100 Napoli (Italy); Dipartimento di Fisica, Università degli Studi di Napoli ‘Federico II’, I-80125 Napoli (Italy); Tafuri, F. [CNR SPIN Institute – Superconductors, Innovative Materials and Devices, UOS–Napoli, I-80100 Napoli (Italy); Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, I-81031 Aversa, CE (Italy); Cristiano, R. [CNR SPIN Institute – Superconductors, Innovative Materials and Devices, UOS–Napoli, I-80100 Napoli (Italy); Golubev, D. [Low Temperature Laboratory (OVLL), Aalto University School of Science, P.O. Box 13500, FI-00076 Aalto (Finland); Sobolewski, Roman, E-mail: roman.sobolewski@rochester.edu [Institute of Electron Technology, PL-02668 Warszawa (Poland); Department of Electrical and Computer Engineering and Laboratory for Laser Energetics, University of Rochester, NY 14627-0231 (United States); Bauch, T.; Lombardi, F. [Quantum Device Physics Laboratory, Department of Microtechnology and Nanoscience, Chalmers University of Technology, S-41296 Göteborg (Sweden); and others

    2015-02-15

    Highlights: • Homogeneous YBCO nanowires have been fabricated for photon detection applications. • Serial-parallel nanowire configuration leads to a large detector active area. • The YBCO nanowires exhibit critical current densities up to 10{sup 6} A/cm{sup 2}. • The devices have been excited using a 1550-nm wavelength, pulsed laser irradiation. • Photoresponse signals have been measured and analyzed from 4 K up to the device T{sub c}. - Abstract: The possible use of high-temperature superconductors (HTS) for realizing superconducting nanowire single-photon detectors is a challenging, but also promising, aim because of their ultrafast electron relaxation times and high operating temperatures. The state-of-the-art HTS nanowires with a 50-nm thickness and widths down to 130 nm have been fabricated and tested under a 1550-nm wavelength laser irradiation. Experimental results presenting both the amplitude and rise times of the photoresponse signals as a function of the normalized detector bias current, measured in a wide temperature range, are discussed. The presence of two distinct regimes in the photoresponse temperature dependence is clearly evidenced, indicating that there are two different response mechanisms responsible for the HTS photoresponse.

  16. DNA barcode detects high genetic structure within neotropical bird species.

    Directory of Open Access Journals (Sweden)

    Erika Sendra Tavares

    BACKGROUND: Towards lower latitudes the number of recognized species is not only higher, but phylogeographic subdivision within species is also more pronounced. Moreover, new genetically isolated populations are often described in recent phylogenies of Neotropical birds, suggesting that the number of species in the region is underestimated. Previous COI barcoding of Argentinean bird species showed more complex patterns of regional divergence in the Neotropical than in the North American avifauna. METHODS AND FINDINGS: Here we analyzed 1,431 samples from 561 different species to extend the Neotropical bird barcode survey to lower latitudes, and detected even higher geographic structure within species than reported previously. About 93% (520) of the species were identified correctly from their DNA barcodes. The remaining 41 species were not monophyletic in their COI sequences because they shared barcode sequences with closely related species (N = 21) or contained very divergent clusters suggestive of putative new species embedded within the gene tree (N = 20). Deep intraspecific divergences overlapping with among-species differences were detected in 48 species, often with samples from large geographic areas and several including multiple subspecies. This strong population genetic structure often coincided with breaks between different ecoregions or areas of endemism. CONCLUSIONS: The taxonomic uncertainty associated with the high incidence of non-monophyletic species and the discovery of putative species obscures studies of historical patterns of species diversification in the Neotropical region. We showed that COI barcodes are a valuable tool to indicate which taxa would benefit from more extensive taxonomic revisions with multilocus approaches. Moreover, our results support hypotheses that the megadiversity of birds in the region is associated with multiple geographic processes starting well before the Quaternary and extending to more recent

  17. High-throughput detection of prostate cancer in histological sections using probabilistic pairwise Markov models.

    Science.gov (United States)

    Monaco, James P; Tomaszewski, John E; Feldman, Michael D; Hagemann, Ian; Moradi, Mehdi; Mousavi, Parvin; Boag, Alexander; Davidson, Chris; Abolmaesumi, Purang; Madabhushi, Anant

    2010-08-01

    In this paper we present a high-throughput system for detecting regions of carcinoma of the prostate (CaP) in HSs from radical prostatectomies (RPs) using probabilistic pairwise Markov models (PPMMs), a novel type of Markov random field (MRF). At diagnostic resolution a digitized HS can contain 80K × 70K pixels - far too many for current automated Gleason grading algorithms to process. However, grading can be separated into two distinct steps: (1) detecting cancerous regions and (2) then grading these regions. The detection step does not require diagnostic resolution and can be performed much more quickly. Thus, we introduce a CaP detection system capable of analyzing an entire digitized whole-mount HS (2 × 1.75 cm²) in under three minutes (on a desktop computer) while achieving a CaP detection sensitivity and specificity of 0.87 and 0.90, respectively. We obtain this high throughput by tailoring the system to analyze the HSs at low resolution (8 μm per pixel). This motivates the following algorithm: (Step 1) glands are segmented, (Step 2) the segmented glands are classified as malignant or benign, and (Step 3) the malignant glands are consolidated into continuous regions. The classification of individual glands leverages two features: gland size and the tendency for proximate glands to share the same class. The latter feature describes a spatial dependency which we model using a Markov prior. Typically, Markov priors are expressed as the product of potential functions. Unfortunately, potential functions are mathematical abstractions, and constructing priors through their selection becomes an ad hoc procedure, resulting in simplistic models such as the Potts. Addressing this problem, we introduce PPMMs which formulate priors in terms of probability density functions, allowing the creation of more sophisticated models. To demonstrate the efficacy of our CaP detection system and assess the advantages of using a PPMM prior instead of the Potts, we alternately
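
    The contrast drawn here is between priors built from potential functions, such as the Potts model, and the PPMM's density-based formulation. A minimal sketch of the Potts side, the simplistic baseline the paper argues PPMMs improve on (graph and labels are toy placeholders):

    ```python
    # Minimal Potts-style prior over a gland adjacency graph: the kind of
    # potential-function prior the paper argues PPMMs generalize.
    import numpy as np

    edges = [(0, 1), (1, 2), (2, 3), (0, 2)]    # toy gland adjacency
    labels = np.array([1, 1, 0, 0])             # 1 = malignant, 0 = benign
    beta = 0.75                                 # illustrative coupling strength

    def potts_energy(labels, edges, beta):
        """E = -beta * (number of agreeing neighbours); lower = smoother."""
        return -beta * sum(int(labels[i] == labels[j]) for i, j in edges)

    print(potts_energy(labels, edges, beta))    # favours same-class neighbours
    ```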

  18. Nitrogen detected TROSY at high field yields high resolution and sensitivity for protein NMR

    International Nuclear Information System (INIS)

    Takeuchi, Koh; Arthanari, Haribabu; Shimada, Ichio; Wagner, Gerhard

    2015-01-01

    Detection of ¹⁵N in multidimensional NMR experiments of proteins has sparsely been utilized because of the low gyromagnetic ratio (γ) of nitrogen and the presumed low sensitivity of such experiments. Here we show that selecting the TROSY components of proton-attached ¹⁵N nuclei (TROSY ¹⁵Nₕ) yields high quality spectra in high field magnets (>600 MHz) by taking advantage of the slow ¹⁵N transverse relaxation and compensating for the inherently low ¹⁵N sensitivity. The ¹⁵N TROSY transverse relaxation rates increase modestly with molecular weight but the TROSY gain in peak heights depends strongly on the magnetic field strength. Theoretical simulations predict that the narrowest line width for the TROSY ¹⁵Nₕ component can be obtained at 900 MHz, but sensitivity reaches its maximum around 1.2 GHz. Based on these considerations, a ¹⁵N-detected 2D ¹H–¹⁵N TROSY-HSQC (¹⁵N-detected TROSY-HSQC) experiment was developed and high-quality 2D spectra were recorded at 800 MHz in 2 h for 1 mM maltose-binding protein at 278 K (τc ∼ 40 ns). Unlike for ¹H detected TROSY, deuteration is not mandatory to benefit ¹⁵N detected TROSY due to reduced dipolar broadening, which facilitates studies of proteins that cannot be deuterated, especially in cases where production requires eukaryotic expression systems. The option of recording ¹⁵N TROSY of proteins expressed in H₂O media also alleviates the problem of incomplete amide proton back exchange, which often hampers the detection of amide groups in the core of large molecular weight proteins that are expressed in D₂O culture media and cannot be refolded for amide back exchange. These results illustrate the potential of ¹⁵Nₕ-detected TROSY experiments as a means to exploit the high resolution offered by high field magnets near and above 1 GHz

  19. Nitrogen detected TROSY at high field yields high resolution and sensitivity for protein NMR

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Koh [National Institute for Advanced Industrial Science and Technology, Molecular Profiling Research Center for Drug Discovery (Japan); Arthanari, Haribabu [Harvard Medical School, Department of Biochemistry and Molecular Pharmacology (United States); Shimada, Ichio, E-mail: shimada@iw-nmr.f.u-tokyo.ac.jp [National Institute for Advanced Industrial Science and Technology, Molecular Profiling Research Center for Drug Discovery (Japan); Wagner, Gerhard, E-mail: gerhard-wagner@hms.harvard.edu [Harvard Medical School, Department of Biochemistry and Molecular Pharmacology (United States)

    2015-12-15

    Detection of {sup 15}N in multidimensional NMR experiments of proteins has sparsely been utilized because of the low gyromagnetic ratio (γ) of nitrogen and the presumed low sensitivity of such experiments. Here we show that selecting the TROSY components of proton-attached {sup 15}N nuclei (TROSY {sup 15}N{sub H}) yields high quality spectra in high field magnets (>600 MHz) by taking advantage of the slow {sup 15}N transverse relaxation and compensating for the inherently low {sup 15}N sensitivity. The {sup 15}N TROSY transverse relaxation rates increase modestly with molecular weight but the TROSY gain in peak heights depends strongly on the magnetic field strength. Theoretical simulations predict that the narrowest line width for the TROSY {sup 15}N{sub H} component can be obtained at 900 MHz, but sensitivity reaches its maximum around 1.2 GHz. Based on these considerations, a {sup 15}N-detected 2D {sup 1}H–{sup 15}N TROSY-HSQC ({sup 15}N-detected TROSY-HSQC) experiment was developed and high-quality 2D spectra were recorded at 800 MHz in 2 h for 1 mM maltose-binding protein at 278 K (τ{sub c} ∼ 40 ns). Unlike for {sup 1}H detected TROSY, deuteration is not mandatory to benefit {sup 15}N detected TROSY due to reduced dipolar broadening, which facilitates studies of proteins that cannot be deuterated, especially in cases where production requires eukaryotic expression systems. The option of recording {sup 15}N TROSY of proteins expressed in H{sub 2}O media also alleviates the problem of incomplete amide proton back exchange, which often hampers the detection of amide groups in the core of large molecular weight proteins that are expressed in D{sub 2}O culture media and cannot be refolded for amide back exchange. These results illustrate the potential of {sup 15}N{sub H}-detected TROSY experiments as a means to exploit the high resolution offered by high field magnets near and above 1 GHz.

  20. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  1. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  2. High prevalence of rheumatic heart disease detected by echocardiography in school children.

    Science.gov (United States)

    Bhaya, Maneesha; Panwar, Sadik; Beniwal, Rajesh; Panwar, Raja Babu

    2010-04-01

    It is fairly easy to detect advanced valve lesions of established rheumatic heart disease by echocardiography in clinically identified cases of rheumatic heart disease. However, no uniform set of echocardiographic criteria exists for diagnosing a subclinical case of rheumatic heart disease. Moderate thickening of valve leaflets is considered an indicator of established rheumatic heart disease. The World Health Organization criteria for diagnosing probable rheumatic heart disease are more sensitive and are based on the detection of significant regurgitation of the mitral and/or aortic valves by color Doppler. We attempted to diagnose RHD in school children in Bikaner city by cardiac ultrasound. The stratified cluster sampling technique was employed to identify 31 random clusters in the coeducational schools of Bikaner city. We selected 1,059 school children aged 6-15 years from these schools. An experienced operator performed careful cardiac auscultation and an echocardiographic study. A second expert confirmed the echocardiographic findings. The prevalence of lesions suggestive of rheumatic heart disease by echocardiography was 51 per 1,000 (denominator = 1,059; 95% CI: 38-64 per 1,000). We were able to clinically diagnose RHD in one child. None of these children or their parents having echocardiographic evidence of RHD could provide a positive history of acute rheumatic fever. By echocardiographic screening, we found a high prevalence of rheumatic heart disease in the surveyed population. Clinical auscultation had much lower diagnostic efficacy.
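
    The reported interval is reproducible with a normal-approximation confidence interval for a proportion; taking 54 positives out of 1,059, inferred from the rounded 51 per 1,000, makes this a consistency check rather than the authors' exact computation:

    ```python
    # Reproduce the reported prevalence interval (51 per 1,000; 95% CI 38-64).
    import math

    cases, n = 54, 1059          # 54/1,059 is inferred from the rounded rate
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)
    lo, hi = p - 1.96 * se, p + 1.96 * se
    print(f"{p*1000:.0f} per 1,000 (95% CI: {lo*1000:.0f}-{hi*1000:.0f} per 1,000)")
    ```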

  3. Acoustic detection of ultra-high energy cascades in ice

    Energy Technology Data Exchange (ETDEWEB)

    Boeser, S.

    2006-12-08

    Current underwater optical neutrino telescopes are designed to detect neutrinos from astrophysical sources with energies in the TeV range. Due to the low fluxes and small cross sections, no high energy neutrinos of extraterrestrial origin have been observed so far. Only the Cherenkov neutrino detectors on the km{sup 3} scale that are currently under construction will have the necessary volume to observe these rare interactions. For the guaranteed source of neutrinos from interactions of ultra-high energy cosmic rays at EeV energies with the ambient cosmic microwave background, event rates of only one per year are expected in these experiments. To measure the flux and verify the predicted cross sections of these cosmogenic neutrinos, an observed volume of the order of 100 km{sup 3} will be necessary, which will not be feasible with existing detection techniques. Alternative methods are required to build a detector on these scales. One promising idea is to record the acoustic waves generated in hadronic or electromagnetic cascades following the neutrino interaction. The higher amplitudes of the sonic signal and the large expected absorption length of sound favour South Polar ice instead of sea water as a medium. The prerequisites for an estimate of the potential of such a detector are suitable acoustic sensors, a verification of the model of thermo-acoustic sound generation and a determination of the acoustic properties of the ice. In a theoretical derivation the mechanism of thermo-elastic excitation of acoustic waves was shown to be equivalent for isotropic solids and liquids. Following a detailed analysis of the existing knowledge a simulation study of a hybrid optical-radio-acoustic detector has been performed. Ultrasonic sensors dedicated to in-ice application were developed and have been used to record acoustic signals from intense proton and laser beams in water and ice. With the obtained experience, the hitherto largest array of acoustic sensors and

  4. Acoustic detection of ultra-high energy cascades in ice

    International Nuclear Information System (INIS)

    Boeser, S.

    2006-01-01

    Current underwater optical neutrino telescopes are designed to detect neutrinos from astrophysical sources with energies in the TeV range. Due to the low fluxes and small cross sections, no high energy neutrinos of extraterrestrial origin have been observed so far. Only the Cherenkov neutrino detectors on the km³ scale that are currently under construction will have the necessary volume to observe these rare interactions. For the guaranteed source of neutrinos from interactions of ultra-high energy cosmic rays at EeV energies with the ambient cosmic microwave background, event rates of only one per year are expected in these experiments. To measure the flux and verify the predicted cross sections of these cosmogenic neutrinos, an observed volume of the order of 100 km³ will be necessary, which will not be feasible with existing detection techniques. Alternative methods are required to build a detector on these scales. One promising idea is to record the acoustic waves generated in hadronic or electromagnetic cascades following the neutrino interaction. The higher amplitudes of the sonic signal and the large expected absorption length of sound favour South Polar ice instead of sea water as a medium. The prerequisites for an estimate of the potential of such a detector are suitable acoustic sensors, a verification of the model of thermo-acoustic sound generation and a determination of the acoustic properties of the ice. In a theoretical derivation the mechanism of thermo-elastic excitation of acoustic waves was shown to be equivalent for isotropic solids and liquids. Following a detailed analysis of the existing knowledge a simulation study of a hybrid optical-radio-acoustic detector has been performed. Ultrasonic sensors dedicated to in-ice application were developed and have been used to record acoustic signals from intense proton and laser beams in water and ice. With the obtained experience, the hitherto largest array of acoustic sensors and transmitters was

  5. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  6. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  7. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  8. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  9. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  10. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  11. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  12. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  13. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  14. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  15. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    Science.gov (United States)

    Augustin, C. M.

    2015-12-01

    Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings - analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a
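
    The abstract does not spell out its simulation machinery, so the following is only a minimal sketch of one standard Bayesian treatment of a rare-event rate: a Beta prior on the annual leak probability, updated with a hypothetical record of site-years. All counts and the 30-year horizon are illustrative assumptions, not figures from the study.

    ```python
    import numpy as np

    # Minimal sketch (not the study's model): Beta-Binomial updating of a
    # rare-event rate, as one might do for leak events per site-year.
    # All counts below are illustrative assumptions, not data from the paper.
    rng = np.random.default_rng(0)

    alpha, beta = 0.5, 50.0        # prior favouring small annual leak probabilities
    leaks, site_years = 1, 400     # hypothetical observation record

    post_a, post_b = alpha + leaks, beta + site_years - leaks

    # Posterior draws of the annual leak probability, and the implied chance
    # of at least one leak over a 30-year operating life.
    p = rng.beta(post_a, post_b, size=100_000)
    p_any_leak_30y = 1.0 - (1.0 - p) ** 30
    print(f"posterior mean annual leak probability: {p.mean():.4f}")
    print(f"P(>= 1 leak in 30 y), posterior mean:   {p_any_leak_30y.mean():.3f}")
    ```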

  16. Communicating Low-Probability High-Consequence Risk, Uncertainty and Expert Confidence: Induced Seismicity of Deep Geothermal Energy and Shale Gas.

    Science.gov (United States)

    Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina

    2018-04-01

    Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.

  17. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are then derived, so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
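
    As a worked illustration of the exceedance calculation described above: for a zero-mean Gaussian relative displacement with RMS value sigma, the probability of exceeding a criterion vc in either direction is 2(1 - Phi(vc/sigma)). The sigma and criterion values below are placeholders, not parameters from the article.

    ```python
    from math import erf, sqrt

    def phi(x: float) -> float:
        """Standard normal CDF."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    # Assumed values (placeholders, not parameters from the article):
    sigma = 0.9e-6   # RMS relative displacement of the isolated mass, metres
    vc = 3.0e-6      # displacement equivalent of the vibration criterion

    p_exceed = 2.0 * (1.0 - phi(vc / sigma))   # two-sided exceedance probability
    print(f"P(exceeding the criterion) = {p_exceed:.5f}")   # target: < 0.04
    ```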

  18. Narrow CSF space at high convexity and high midline areas in idiopathic normal pressure hydrocephalus detected by axial and coronal MRI

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Makoto [Iwate Medical University, Department of Radiology, Morioka (Japan); Honda, Satoshi [St. Luke's International Hospital, Department of Radiology, Tokyo (Japan); Yuasa, Tatsuhiko; Iwamura, Akihide [Kohnodai Hospital, National Center of Neurology and Psychiatry, Department of Neurology, Ichikawa (Japan); Shibata, Eri [Iwate Medical University, Department of Neuropsychiatry, Morioka (Japan); Ohba, Hideki [Iwate Medical University, Department of Neurology, Morioka (Japan)

    2008-02-15

    The aim of this study was to determine the performance of axial and coronal magnetic resonance imaging (MRI) in detecting the narrowing of the cerebrospinal fluid (CSF) space at the high convexity and high midline areas, which is speculated to be one of the clinical characteristics of idiopathic normal pressure hydrocephalus (iNPH). We retrospectively examined axial and coronal T1-weighted images of 14 iNPH patients and 12 age-matched controls. The narrowness of the CSF space at the high convexity/midline was blindly evaluated by five raters using a continuous confidence rating scale for receiver operating characteristic (ROC) analysis. Axial and coronal imaging accurately determined the presence of the narrow cisterns/sulci at the high convexity/midline and was capable of predicting probable/definite iNPH with a high degree of accuracy. There were also no significant differences in the detection of this finding between the axial and coronal images. Both axial and coronal T1-weighted MRI can detect the narrow CSF space at the high convexity/midline accurately and may therefore facilitate clinicians in choosing a management strategy for iNPH patients. (orig.)

  19. Narrow CSF space at high convexity and high midline areas in idiopathic normal pressure hydrocephalus detected by axial and coronal MRI

    International Nuclear Information System (INIS)

    Sasaki, Makoto; Honda, Satoshi; Yuasa, Tatsuhiko; Iwamura, Akihide; Shibata, Eri; Ohba, Hideki

    2008-01-01

    The aim of this study was to determine the performance of axial and coronal magnetic resonance imaging (MRI) in detecting the narrowing of the cerebrospinal fluid (CSF) space at the high convexity and high midline areas, which is speculated to be one of the clinical characteristics of idiopathic normal pressure hydrocephalus (iNPH). We retrospectively examined axial and coronal T1-weighted images of 14 iNPH patients and 12 age-matched controls. The narrowness of the CSF space at the high convexity/midline was blindly evaluated by five raters using a continuous confidence rating scale for receiver operating characteristic (ROC) analysis. Axial and coronal imaging accurately determined the presence of the narrow cisterns/sulci at the high convexity/midline and was capable of predicting probable/definite iNPH with a high degree of accuracy. There were also no significant differences in the detection of this finding between the axial and coronal images. Both axial and coronal T1-weighted MRI can detect the narrow CSF space at the high convexity/midline accurately and may therefore facilitate clinicians in choosing a management strategy for iNPH patients. (orig.)

  20. High Magnetic Field in THz Plasma Wave Detection by High Electron Mobility Transistors

    Science.gov (United States)

    Sakowicz, M.; Łusakowski, J.; Karpierz, K.; Grynberg, M.; Valusis, G.

    The role of gated and ungated two-dimensional (2D) electron plasma in THz detection by high electron mobility transistors (HEMTs) was investigated. THz response of GaAs/AlGaAs and GaN/AlGaN HEMTs was measured at 4.4 K in quantizing magnetic fields with a simultaneous modulation of the gate voltage UGS. This allowed us to measure both the detection signal, S, and its derivative dS/dUGS. Shubnikov-de Haas oscillations (SdHO) of both S and dS/dUGS were observed. A comparison of SdHO observed in detection and magnetoresistance measurements allows us to associate unambiguously SdHO in S and dS/dUGS with the ungated and gated parts of the transistor channel, respectively. This allows us to conclude that the entire channel takes part in the detection process. Additionally, in the case of GaAlAs/GaAs HEMTs, a structure related to the cyclotron resonance transition was observed.

  1. High Hydrogen Content Graphene Hydride Compounds & High Cross-Section Cladding Coatings for Fast Neutron Detection

    International Nuclear Information System (INIS)

    Chandrashekhar, MVS

    2017-01-01

    The objective is to develop and implement a superior low-cost, large area (potentially >32 in), easily deployable, close proximity, harsh environment innovative neutron sensor needed for next generation fuel cycle monitoring. We will exploit recent breakthroughs at the PI's lab on the electrochemistry of epitaxial graphene (EG) formed on commercial SiC wafers - a transformative nanomaterial system with superior radiation detection and durability properties - to develop a new paradigm in detection for fast neutrons, a by-product of fission reactors. There are currently few effective detection/monitoring schemes, especially solid-state ones. This is essential for monitoring and control of future fuel cycles to make them more efficient and reliable. By exploiting these novel materials, as well as innovative hybrid SiC/EG/Cladding device architectures conceived by the team, we will develop low-cost, high performance solutions to fast-neutron detection. Finally, we will also explore 3-terminal device implementations for neutron detectors with built-in electronic gain to further shrink these devices and improve their sensitivity.

  2. Highly sensitive real-time PCR for specific detection and quantification of Coxiella burnetii

    Directory of Open Access Journals (Sweden)

    Linke Sonja

    2006-01-01

    Full Text Available Abstract Background Coxiella burnetii, the bacterium causing Q fever, is an obligate intracellular biosafety level 3 agent. Detection and quantification of these bacteria with conventional methods is time consuming and dangerous. During the last years, several PCR based diagnostic assays were developed to detect C. burnetii DNA in cell cultures and clinical samples. We developed and evaluated TaqMan-based real-time PCR assays that targeted the singular icd (isocitrate dehydrogenase) gene and the transposase of the IS1111a element present in multiple copies in the C. burnetii genome. Results To evaluate the precision of the icd and IS1111 real-time PCR assays, we performed different PCR runs with independent DNA dilutions of the C. burnetii Nine Mile RSA493 strain. The results showed very low variability, indicating efficient reproducibility of both assays. Using probit analysis, we determined that the minimal number of genome equivalents per reaction that could be detected with a 95% probability was 10 for the icd marker and 6.5 for the IS marker. Plasmid standards with cloned icd and IS1111 fragments were used to establish standard curves which were linear over a range from 10 to 10^7 starting plasmid copy numbers. We were able to quantify cell numbers of a diluted, heat-inactivated Coxiella isolate with a detection limit of 17 C. burnetii particles per reaction. Real-time PCR targeting both markers was performed with DNA of 75 different C. burnetii isolates originating from all over the world. Using this approach, the number of IS1111 elements in the genome of the Nine Mile strain was determined to be 23, close to 20, the number revealed by genome sequencing. In other isolates, the number of IS1111 elements varied widely (between seven and 110) and seemed to be very high in some isolates. Conclusion We validated TaqMan-based real-time PCR assays targeting the icd and IS1111 markers of C. burnetii. The assays were shown to be specific, highly
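
    The probit step described above can be sketched as follows: fit a cumulative-normal dose-response curve to detection rates from a dilution series, then invert it at 95%. The replicate data here are invented for illustration; the paper's own dilution series is not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # Invented dilution-series data (fraction of replicates detected); the
    # paper's own replicate counts are not reproduced here.
    log10_copies = np.array([0.0, 0.5, 1.0, 1.5, 2.0])       # 1 .. 100 copies
    hit_rate     = np.array([0.10, 0.45, 0.80, 0.95, 1.00])

    def probit(x, a, b):
        """Cumulative-normal dose-response curve."""
        return norm.cdf(a + b * x)

    (a, b), _ = curve_fit(probit, log10_copies, hit_rate, p0=(-1.0, 1.5))

    # Solve Phi(a + b*x) = 0.95 for x and convert back to copies per reaction.
    lod95 = 10 ** ((norm.ppf(0.95) - a) / b)
    print(f"95% detection limit: {lod95:.1f} genome equivalents per reaction")
    ```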

  3. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  4. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  5. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  6. Detection of argan oil adulteration with vegetable oils by high-performance liquid chromatography-evaporative light scattering detection.

    Science.gov (United States)

    Salghi, Rachid; Armbruster, Wolfgang; Schwack, Wolfgang

    2014-06-15

    Triacylglycerol profiles were selected as indicators of adulteration of argan oils to carry out a rapid screening of samples for the evaluation of authenticity. Triacylglycerols were separated by high-performance liquid chromatography-evaporative light scattering detection. Different peak area ratios were defined to sensitively detect adulteration of argan oil with vegetable oils such as sunflower, soy bean, and olive oil down to the level of 5%. Based on four reference argan oils, mean limits of detection and quantitation were calculated to be approximately 0.4% and 1.3%, respectively. Additionally, 19 more argan oil reference samples were analysed by high-performance liquid chromatography-refractive index detection, resulting in highly comparable results. The overall strategy demonstrated good applicability in practice, and hence a high potential to be transferred to routine laboratories. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Cueing spatial attention through timing and probability.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, the extent to which unattended information affects attentional control is not fully understood. Here we provide evidence of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven modality within a single task. Using a spatial cueing paradigm we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more frequent occurrence. Conversely, cuing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on the asynchronies' probability and not on their duration. Our findings clearly demonstrate that through fine-grained decision-making, performed trial by trial, the brain utilizes implicit information to decide whether or not to voluntarily shift spatial attention. As if according to a cost-planning strategy, the cognitive effort of shifting attention depending on the cue is made only when the expected advantages are high. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
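
    As a concrete instance of the gradient relation mentioned above, here is a sketch of the classical multivariate extreme value (MEV) special case, assuming a generator G that is homogeneous of degree one with systematic utilities V_j; this is a textbook special case, not the paper's general construction:

    ```latex
    % Choice probabilities from the generator G (homogeneous of degree one),
    % with V_j the systematic utility of alternative j and G_i = \partial G / \partial y_i:
    P_i \;=\; \frac{e^{V_i}\, G_i\!\left(e^{V_1},\dots,e^{V_J}\right)}
                   {G\!\left(e^{V_1},\dots,e^{V_J}\right)},
    \qquad i = 1,\dots,J.
    ```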

  9. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic NotionsSample Space and EventsProbabilitiesCounting TechniquesIndependence and Conditional ProbabilityIndependenceConditioningThe Borel-Cantelli TheoremDiscrete Random VariablesRandom Variables and VectorsExpected ValueVariance and Other Moments. Inequalities for DeviationsSome Basic DistributionsConvergence of Random Variables. The Law of Large NumbersConditional ExpectationGenerating Functions. Branching Processes. Random Walk RevisitedBranching Processes Generating Functions Branching Processes Revisited More on Random WalkMarkov ChainsDefinitions and Examples. Probability Distributions of Markov ChainsThe First Step Analysis. Passage TimesVariables Defined on a Markov ChainErgodicity and Stationary DistributionsA Classification of States and ErgodicityContinuous Random VariablesContinuous DistributionsSome Basic Distributions Continuous Multivariate Distributions Sums of Independent Random Variables Conditional Distributions and ExpectationsDistributions in the General Case. SimulationDistribution F...

  10. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving

  11. High probability of avian influenza virus (H7N7) transmission from poultry to humans active in disease control on infected farms

    NARCIS (Netherlands)

    M.E.H. Bos (Marian); D.E. te Beest (Dennis); M. van Boven (Michiel); M.R.D.R.B. van Holle; A. Meijer (Adam); A. Bosman (Arnold); Y.M. Mulder (Yonne); M.P.G. Koopmans D.V.M. (Marion); A. Stegeman (Arjan)

    2010-01-01

    textabstractAn epizootic of avian influenza (H7N7) caused a large number of human infections in The Netherlands in 2003. We used data from this epizootic to estimate infection probabilities for persons involved in disease control on infected farms. Analyses were based on databases containing

  12. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  13. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  14. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

    Full Text Available Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, the second cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher order statistics have been an extensive field of research as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third-moment skewness and fourth-moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to these emboli. Applied to real signals, the detection of microemboli through the skewness and kurtosis signals outperformed the detection through standard methods. The sensitivities and specificities reached 78% and 91% for the skewness detector and 80% and 90% for the kurtosis detector, respectively.
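
    A minimal sketch of the windowed higher-order-statistics detector described above: slide a fixed window along the Doppler energy signal, compute skewness and kurtosis in each window, and flag windows whose statistics exceed a threshold. Window length, thresholds and the synthetic burst are assumptions for illustration, not the paper's settings.

    ```python
    import numpy as np
    from scipy.stats import kurtosis, skew

    def hos_detect(energy: np.ndarray, win: int = 256,
                   skew_thr: float = 2.0, kurt_thr: float = 8.0) -> np.ndarray:
        """Flag windows whose skewness or (Pearson) kurtosis peaks above threshold."""
        flags = []
        for k in range(len(energy) // win):
            seg = energy[k * win:(k + 1) * win]
            if skew(seg) > skew_thr or kurtosis(seg, fisher=False) > kurt_thr:
                flags.append(k)
        return np.asarray(flags, dtype=int)

    # Synthetic demonstration: Gaussian background energy with one short burst
    # standing in for an embolic signature.
    rng = np.random.default_rng(1)
    energy = rng.normal(1.0, 0.1, 10_240)
    energy[5_000:5_010] += 3.0
    print(hos_detect(energy))   # -> [19], the window containing the burst
    ```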

  15. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  16. Rapid and high-throughput detection of highly pathogenic bacteria by Ibis PLEX-ID technology.

    Directory of Open Access Journals (Sweden)

    Daniela Jacob

    Full Text Available In this manuscript, we describe the identification of highly pathogenic bacteria using an assay coupling biothreat group-specific PCR with electrospray ionization mass spectrometry (PCR/ESI-MS), run on an Ibis PLEX-ID high-throughput platform. The biothreat cluster assay identifies most of the potential bioterrorism-relevant microorganisms including Bacillus anthracis, Francisella tularensis, Yersinia pestis, Burkholderia mallei and pseudomallei, Brucella species, and Coxiella burnetii. DNA from 45 different reference materials with different formulations and different concentrations were chosen and sent to a service screening laboratory that uses the PCR/ESI-MS platform to provide a microbial identification service. The standard reference materials were produced out of a repository built up in the framework of the EU-funded project "Establishment of Quality Assurances for Detection of Highly Pathogenic Bacteria of Potential Bioterrorism Risk" (EQADeBa). All samples were correctly identified at least to the genus level.

  17. Smart Sensor Based Obstacle Detection for High-Speed Unmanned Surface Vehicle

    DEFF Research Database (Denmark)

    Hermann, Dan; Galeazzi, Roberto; Andersen, Jens Christian

    2015-01-01

    This paper describes an obstacle detection system for a high-speed and agile unmanned surface vehicle (USV), running at speeds up to 30 m/s. The aim is a real-time and high performance obstacle detection system using both radar and vision technologies to detect obstacles within a range of 175 m. ...... performance using sensor fusion of radar and computer vision....

  18. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  19. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  20. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  1. High Density Nano-Electrode Array for Radiation Detection

    International Nuclear Information System (INIS)

    Misra, Mano

    2010-01-01

    Bulk single crystals of the Cd1-xZnxTe (x=0.04 to x=0.2) compound semiconductor are used for room temperature radiation detection. The production of large volumes of Cd1-xZnxTe with low defect density is expensive. As a result there is a growing research interest in the production of nanostructured compound semiconductors such as Cd1-xZnxTe by an electrochemical route. In this investigation, the Cd1-xZnxTe ternary compound semiconductor, referred to as CZT, was electrodeposited in the form of nanowires onto a TiO2 nanotubular template from propylene carbonate as the non-aqueous electrolyte, using a pulse-reverse electrodeposition process at 130 °C. The template acted as a support in growing ordered nanowires of CZT, which act as one-dimensional conductors. Cyclic voltammogram (CV) studies were conducted to determine the potentials for the growth of nanowires of uniform stoichiometry. The morphologies and composition of CZT were characterized by using SEM, TEM and XRD. The STEM mapping carried out on the nanowires showed the uniform distribution of Cd, Zn and Te elements. TEM images showed that the nanowires were polycrystalline in nature. The Mott-Schottky analysis carried out on the nanowires showed that the nanowires were a p-type semiconductor. The carrier density, band gap and resistivity of the Cd0.9Zn0.1Te nanowires were 4.29 x 10^13 cm^-3, 1.56 eV and 2.76 x 10^11 Ω·cm, respectively. The high resistivity was attributed to the presence of deep defect states such as cadmium vacancies or Te antisites, which were created by the anodic cycle of the pulse-reverse electrodeposition process. Stacks of series-connected CZT nanowire arrays were tested with different bias potentials. The background current was in the order of tens of picoamperes. When exposed to the radiation source Americium-241 (60 keV, 4 µCi), the stacked CZT nanowire arrays showed sensing behavior. The sensitivity of the nanowire arrays increased as the number of stacks increased. The

  2. High Density Nano-Electrode Array for Radiation Detection

    Energy Technology Data Exchange (ETDEWEB)

    Mano Misra

    2010-05-07

    Bulk single crystals of the Cd1-xZnxTe (x=0.04 to x=0.2) compound semiconductor are used for room temperature radiation detection. The production of large volumes of Cd1-xZnxTe with low defect density is expensive. As a result there is a growing research interest in the production of nanostructured compound semiconductors such as Cd1-xZnxTe by an electrochemical route. In this investigation, the Cd1-xZnxTe ternary compound semiconductor, referred to as CZT, was electrodeposited in the form of nanowires onto a TiO2 nanotubular template from propylene carbonate as the non-aqueous electrolyte, using a pulse-reverse electrodeposition process at 130 °C. The template acted as a support in growing ordered nanowires of CZT, which act as one-dimensional conductors. Cyclic voltammogram (CV) studies were conducted to determine the potentials for the growth of nanowires of uniform stoichiometry. The morphologies and composition of CZT were characterized by using SEM, TEM and XRD. The STEM mapping carried out on the nanowires showed the uniform distribution of Cd, Zn and Te elements. TEM images showed that the nanowires were polycrystalline in nature. The Mott-Schottky analysis carried out on the nanowires showed that the nanowires were a p-type semiconductor. The carrier density, band gap and resistivity of the Cd0.9Zn0.1Te nanowires were 4.29 x 10^13 cm^-3, 1.56 eV and 2.76 x 10^11 Ω·cm, respectively. The high resistivity was attributed to the presence of deep defect states such as cadmium vacancies or Te antisites, which were created by the anodic cycle of the pulse-reverse electrodeposition process. Stacks of series-connected CZT nanowire arrays were tested with different bias potentials. The background current was in the order of tens of picoamperes. When exposed to the radiation source Americium-241 (60 keV, 4 μCi), the stacked CZT nanowire arrays showed sensing behavior. The sensitivity of the nanowire arrays increased as the number of stacks increased. The preliminary results indicate that the

  3. Remotely detected high-field MRI of porous samples

    Science.gov (United States)

    Seeley, Juliette A.; Han, Song-I.; Pines, Alexander

    2004-04-01

    Remote detection of NMR is a novel technique in which an NMR-active sensor surveys an environment of interest and retains memory of that environment to be recovered at a later time in a different location. The NMR or MRI information about the sensor nucleus is encoded and stored as spin polarization at the first location and subsequently moved to a different physical location for optimized detection. A dedicated probe incorporating two separate radio frequency (RF) circuits was built for this purpose. The encoding solenoid coil was large enough to fit around the bulky sample matrix, while the smaller detection solenoid coil had not only a higher quality factor, but also an enhanced filling factor, since the coil volume comprised purely the sensor nuclei. We obtained two-dimensional (2D) void space images of two model porous samples with resolution less than 1.4 mm^2. The remotely reconstructed images demonstrate the ability to determine fine structure with image quality superior to their directly detected counterparts and show the great potential of NMR remote detection for imaging applications that suffer from low sensitivity due to low concentrations and filling factor.

  4. Highly sensitive BTX detection using surface functionalized QCM sensor

    Energy Technology Data Exchange (ETDEWEB)

    Bozkurt, Asuman Aşıkoğlu; Özdemir, Okan; Altındal, Ahmet, E-mail: altindal@yildiz.edu.tr [Department of Physics, Yildiz Technical University, Davutpasa, 34210 Istanbul (Turkey)

    2016-03-25

    A novel organic compound was designed and successfully synthesized for the fabrication of QCM based sensors to detect low concentrations of BTX gases in indoor air. The effect of long-range electron orbital delocalization on the BTX vapour sensing properties of azo-bridged Pcs-based chemiresistor-type sensors has also been investigated in this work. The sensing behaviour of the film for the online detection of volatile organic solvent vapors was investigated by utilizing an AT-cut quartz crystal resonator. It was observed that the adsorption of the target molecules on the coating surface causes a reversible negative frequency shift of the resonator. Thus, a variety of solvent vapors can be detected by using the phthalocyanine film as a sensitive coating, with sensitivity in the ppm range and response times in the order of several seconds, depending on the molecular structure of the organic solvent.

  5. Detecting high-frequency gravitational waves with optically levitated sensors.

    Science.gov (United States)

    Arvanitaki, Asimina; Geraci, Andrew A

    2013-02-15

    We propose a tunable resonant sensor to detect gravitational waves in the frequency range of 50-300 kHz using optically trapped and cooled dielectric microspheres or microdisks. The technique we describe can exceed the sensitivity of laser-based gravitational wave observatories in this frequency range, using an instrument of only a few percent of their size. Such a device extends the search volume for gravitational wave sources above 100 kHz by 1 to 3 orders of magnitude, and could detect monochromatic gravitational radiation from the annihilation of QCD axions in the cloud they form around stellar mass black holes within our galaxy due to the superradiance effect.

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. Rapid, highly sensitive and highly specific gene detection by combining enzymatic amplification and DNA chip detection simultaneously

    Directory of Open Access Journals (Sweden)

    Koji Hashimoto

    2016-05-01

    Full Text Available We have developed a novel gene detection method based on the loop-mediated isothermal amplification (LAMP) reaction and the DNA dissociation reaction on the same DNA chip surface to achieve a lower detection limit, broader dynamic range and faster detection time than are attainable with a conventional DNA chip. A FAM- and thiol-labeled DNA probe, hybridized to a complementary sequence carrying Dabcyl, was immobilized on the gold surface via an Au/thiol bond. The LAMP reaction was carried out on the gold surface bearing the fixed DNA probe. Initially, the Dabcyl molecules quenched the FAM fluorescence. As the LAMP reaction proceeded, the Dabcyl-carrying complementary sequence reacted competitively with the amplified target sequence. As a result, the FAM fluorescence increased owing to dissociation of the complementary sequence from the DNA probe. The simultaneous reaction of LAMP and DNA chip detection was achieved, and 10^3 copies of the targeted gene were detected within an hour by measuring the fluorescence intensity of the DNA probe. Keywords: Biosensor, DNA chip, Loop-mediated isothermal amplification (LAMP), Fluorescence detection, Gold substrate, Au/thiol bond

  8. Leak detection system for a high temperature fluid pipe

    International Nuclear Information System (INIS)

    Puyal, C.; Meuwisse, C.

    1989-01-01

    The leak detection system consists of a cable with at least two insulated electrical conductors placed close to the wall of the pipe. The material of the cable is chosen so that its electrical characteristics change if a leak causes heating of the cable. A detector at one end of the cable can measure the modifications of the electrical characteristics [fr

  9. Real time avalanche detection for high risk areas.

    Science.gov (United States)

    2014-12-01

    Avalanches routinely occur on State Highway 21 (SH21) between Lowman and Stanley, Idaho each winter. The avalanches pose a threat to the safety of maintenance workers and the traveling public. A real-time avalanche detection system will allow the ...

  10. A Comparative Study of Data Mining Algorithms for High Detection Rate in Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Nabeela Ashraf

    2018-01-01

    Full Text Available Due to the rapid growth of the internet over the last decades, network security problems are increasing vigorously. Humans cannot handle the speed of processes and the huge amount of data required to handle network anomalies, so substantial automation is needed in both speed and accuracy. An Intrusion Detection System (IDS) is one of the approaches to recognize illegal access and rare attacks on secure networks. In this paper, Naive Bayes, J48 and Random Forest classifiers are compared to compute the detection rate and accuracy of the IDS. For the experiments, the KDD_NSL dataset is used.
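
    A minimal sketch of such a comparison (not the paper's exact pipeline): scikit-learn's DecisionTreeClassifier stands in for Weka's J48 (C4.5), a synthetic binary dataset stands in for KDD_NSL, and detection rate is computed as recall on the attack class.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, recall_score
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic binary data stand in for KDD_NSL; label 1 plays the "attack" class.
    X, y = make_classification(n_samples=5_000, n_features=20,
                               weights=[0.6, 0.4], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "Naive Bayes":      GaussianNB(),
        "J48 (CART proxy)": DecisionTreeClassifier(random_state=0),
        "Random Forest":    RandomForestClassifier(n_estimators=100, random_state=0),
    }

    for name, model in models.items():
        y_hat = model.fit(X_tr, y_tr).predict(X_te)
        print(f"{name:18s} accuracy={accuracy_score(y_te, y_hat):.3f} "
              f"detection-rate={recall_score(y_te, y_hat):.3f}")
    ```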

  11. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  14. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
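
    A minimal sketch of the resulting frequency estimate (illustrative counts only, not figures from the calculation report): a point estimate of events per demand together with a Jeffreys 95% interval.

    ```python
    from scipy.stats import beta

    # Illustrative counts only (not figures from the calculation report):
    k, n = 3, 250_000          # misload events over fuel-assembly movements
    point = k / n
    lo, hi = beta.ppf([0.025, 0.975], k + 0.5, n - k + 0.5)   # Jeffreys interval
    print(f"P(misload per FA move) ~ {point:.2e} (95% CI {lo:.2e} to {hi:.2e})")
    ```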

  15. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  16. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  17. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  18. Detected troponin elevation is associated with high early mortality after lung resection for cancer

    Directory of Open Access Journals (Sweden)

    Van Tornout Fillip

    2006-10-01

    Full Text Available Abstract Background Myocardial infarction can be difficult to diagnose after lung surgery. As recent diagnostic criteria emphasize serum cardiac markers (in particular serum troponin), we set out to evaluate its clinical utility and to establish the long-term prognostic impact of detected abnormal postoperative troponin levels after lung resection. Methods We studied a historic cohort of patients with primary lung cancer who underwent intended surgical resection. Patients were grouped according to known postoperative troponin status, and survival was calculated by the Kaplan-Meier method and compared using the log-rank test. Parametric survival analysis was used to ascertain independent predictors of mortality. Results From 2001 to 2004, a total of 207 patients underwent lung resection for primary lung cancer, of which 14 (7%) were identified with elevated serum troponin levels within 30 days of surgery, with 9 (64%) having classical features of myocardial infarction. The median time to follow up (interquartile range) was 22 (1 to 52) months, and the one and five year survival probabilities (95% CI) for patients without and with postoperative troponin elevation were 92% (85 to 96) versus 60% (31 to 80) and 61% (51 to 71) versus 18% (3 to 43), respectively (p < …). T stage and postoperative troponin elevation remained independent predictors of mortality in the final multivariable model. The acceleration factor for death with elevated serum troponin, after adjusting for tumour stage, was 9.19 (95% CI 3.75 to 22.54). Conclusion Patients with detected serum troponin elevation are at high risk of early mortality with or without symptoms of myocardial infarction after lung resection.
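
    The survival comparison described above can be sketched with the lifelines package; the durations and event indicators below are synthetic placeholders (the abstract reports only summary statistics, not individual follow-up data), so only the group sizes match the study.

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Synthetic placeholder cohort: 193 troponin-normal and 14 troponin-elevated
    # patients, administratively censored at 52 months (shapes are assumptions).
    rng = np.random.default_rng(2)
    t_norm = rng.exponential(60.0, 193)
    t_elev = rng.exponential(15.0, 14)
    e_norm, e_elev = t_norm < 52, t_elev < 52      # death observed before censoring
    t_norm, t_elev = np.minimum(t_norm, 52), np.minimum(t_elev, 52)

    km = KaplanMeierFitter()
    km.fit(t_elev, e_elev, label="troponin elevated")
    print(km.survival_function_.tail(1))           # late survival probability

    res = logrank_test(t_norm, t_elev,
                       event_observed_A=e_norm, event_observed_B=e_elev)
    print(f"log-rank p = {res.p_value:.4f}")
    ```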

  19. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  20. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  1. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  2. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  3. High Resolution Viscosity Measurement by Thermal Noise Detection

    Directory of Open Access Journals (Sweden)

    Felipe Aguilar Sandoval

    2015-11-01

    Full Text Available An interferometric method is implemented in order to accurately assess the thermal fluctuations of a micro-cantilever sensor in liquid environments. The power spectral density (PSD) of the thermal fluctuations together with Sader's model of the cantilever allow for an indirect measurement of the liquid viscosity with good accuracy. The good quality of the deflection signal and the characteristically low noise of the instrument allow for the detection and correction of artifacts due to both cantilever shape irregularities and uncertainty in the position of the laser spot at the fluctuating end of the cantilever. Variations in viscosity below 0.03 mPa·s were detected, with the possibility of performing measurements in volumes as small as 50 µL.
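
    The indirect measurement described above rests on two steps that are easy to prototype: estimating the PSD of the deflection signal and fitting a thermally driven harmonic-oscillator spectrum to extract the resonance frequency and quality factor, which a Sader-type model then relates to fluid viscosity and density. The sketch below covers only the spectral step; the function names, sampling rate, and the simple-harmonic-oscillator PSD form are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import welch
from scipy.optimize import curve_fit

def sho_psd(f, a, f0, q, floor):
    # Thermal-noise PSD of a damped harmonic oscillator plus a white
    # detection-noise floor (a common model for cantilever fluctuations).
    return a / ((f0**2 - f**2)**2 + (f0 * f / q)**2) + floor

def fit_thermal_peak(deflection, fs):
    """Estimate the PSD of the deflection signal and fit the thermal peak.

    Returns the resonance frequency f0 and quality factor Q; Sader-type
    models then relate (f0, Q) and the cantilever geometry to the fluid's
    viscosity and density.
    """
    f, psd = welch(deflection, fs=fs, nperseg=1 << 14)
    f, psd = f[1:], psd[1:]                      # drop the DC bin
    fpk = f[np.argmax(psd)]
    q0 = 5.0                                     # rough starting guess for Q
    p0 = [psd.max() * fpk**4 / q0**2, fpk, q0, psd.min()]
    (a, f0, q, floor), _ = curve_fit(sho_psd, f, psd, p0=p0, maxfev=20000)
    return f0, q
```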

  4. Optical detection of random features for high security applications

    Science.gov (United States)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features in combination with digital signatures based on public key codes in order to recognize counterfeit objects will be discussed. Without applying expensive production techniques objects are protected against counterfeiting. Verification is done off-line by optical means without a central authority. The method is applied for protecting banknotes. Experimental results for this application are presented. The method is also applicable for identity verification of a credit- or chip-card holder.
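
    The scheme pairs a physically random feature with a public-key signature so that authenticity can be checked offline, without a central authority. A minimal sketch of that flow, assuming the widely used Python cryptography package; extract-and-hash details and the feature bytes are hypothetical stand-ins for the optical measurement, which in practice is noisy and needs error-tolerant encoding:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.exceptions import InvalidSignature

PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

def issue(feature_bytes: bytes, private_key) -> bytes:
    # Issuer measures the object's random feature and signs it; the
    # signature is then printed on the object (e.g., the banknote).
    return private_key.sign(feature_bytes, PSS, hashes.SHA256())

def verify(feature_bytes: bytes, signature: bytes, public_key) -> bool:
    # Offline verification: re-measure the feature and check the signature
    # against the issuer's public key; no central authority is contacted.
    try:
        public_key.verify(signature, feature_bytes, PSS, hashes.SHA256())
        return True
    except InvalidSignature:
        return False

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
feature = b"optically measured random feature"   # hypothetical measurement
sig = issue(feature, key)
assert verify(feature, sig, key.public_key())
```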

  5. Towards autonomous radio detection of ultra high energy cosmic rays

    International Nuclear Information System (INIS)

    Garcon, Th.

    2010-01-01

    The radio-detection of extensive air showers, investigated for the first time in the 1960s, obtained promising results but was plagued by technical limitations. At that time, H.R. Allan summed up the state of the art in an extensive review article whose conclusions and predictions are still used today. Set up in 2001 at the Nancay Observatory, the CODALEMA experiment was built first as a demonstrator and successfully showed the feasibility of the radio-detection of extensive air showers. Radically modified in 2005, it yielded a clear correlation with energy and provided evidence of an unambiguous signature of the geomagnetic origin of the electric-field emission process associated with air showers. Scaling up to large areas is the next step in the technique's development, which makes detector autonomy essential. Following test prototypes installed in 2006 at the Pierre Auger Observatory, a new generation of autonomous detectors was developed, and their first results are presented. This work also addresses issues related to the radio-detection technique: the antenna response, the sensitivity, surrounding effects, and the monitoring of a large array. The determination of shower characteristics independently of other detectors, such as the lateral distribution, the energy correlation, and the frequency spectrum of the radio transient, is discussed. (author)

  6. Field Test Data for Detecting Vibrations of a Building Using High-Speed Video Cameras

    Science.gov (United States)

    2017-10-01

    ARL-TR-8185, October 2017. US Army Research Laboratory technical report: Field Test Data for Detecting Vibrations of a Building Using High-Speed Video Cameras, by Caitlin P Conn and Geoffrey H Goldman, Sensors and Electron Devices Directorate. Reporting period: June 2016 – October 2017.

  7. High-resolution MRI in detecting subareolar breast abscess.

    Science.gov (United States)

    Fu, Peifen; Kurihara, Yasuyuki; Kanemaki, Yoshihide; Okamoto, Kyoko; Nakajima, Yasuo; Fukuda, Mamoru; Maeda, Ichiro

    2007-06-01

    Because subareolar breast abscess has a high recurrence rate, a more effective imaging technique is needed to comprehensively visualize the lesions and guide surgery. We performed a high-resolution MRI technique using a microscopy coil to reveal the characteristics and extent of subareolar breast abscess. High-resolution MRI has potential diagnostic value in subareolar breast abscess. This technique can be used to guide surgery with the aim of reducing the recurrence rate.

  8. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
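
    The post-processing step described above reduces to a simple computation once the simulations exist: for each remediation-unit cell, the exceedance probability is the fraction of equally likely simulated images whose value passes the threshold. A minimal sketch; the array shapes, synthetic data, and threshold are assumed for illustration only:

```python
import numpy as np

def exceedance_probability(simulations: np.ndarray, threshold: float) -> np.ndarray:
    """Map of the probability of exceeding a contamination threshold.

    simulations: array of shape (n_realizations, ny, nx), each an equally
    likely geostatistical image honoring the measured sample values.
    """
    return (simulations > threshold).mean(axis=0)

# Example with synthetic stand-in data: 500 realizations of a 40 x 60 grid.
rng = np.random.default_rng(0)
sims = rng.lognormal(mean=1.0, sigma=0.8, size=(500, 40, 60))
prob_map = exceedance_probability(sims, threshold=10.0)  # clean-up level
```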

  9. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  10. The Use of Highly Sensitive Detection Methods for Eradication of Plasmodium

    DEFF Research Database (Denmark)

    Hede, Marianne Smedegaard; Knudsen, Birgitta R.

    2017-01-01

    The key to a successful malaria eradication program is highly efficient detection of Plasmodium-infected people followed by appropriate treatment to avoid spreading of the parasite. We will discuss some of the demands that such a detection method needs to fulfill and review some of the advantages and disadvantages of currently available detection methods.

  11. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  12. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of non-symmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
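
    For orientation, the explicit one-parameter forms can be written down: integrating the non-symmetrical hyperbola $t^{\tilde{q}-1}$ generalizes the logarithm, and its inverse generalizes the exponential. The $\tilde{q}$ notation below is illustrative and may differ from the paper's own parametrization:

```latex
\ln_{\tilde{q}}(x) \;=\; \int_{1}^{x} t^{\,\tilde{q}-1}\, dt
               \;=\; \frac{x^{\tilde{q}} - 1}{\tilde{q}},
\qquad
e_{\tilde{q}}(x) \;=\; \bigl(1 + \tilde{q}\, x\bigr)^{1/\tilde{q}},
\qquad
\lim_{\tilde{q}\to 0} \ln_{\tilde{q}}(x) = \ln x,
\quad
\lim_{\tilde{q}\to 0} e_{\tilde{q}}(x) = e^{x}
```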

  13. High-speed asynchronous optical sampling for high-sensitivity detection of coherent phonons

    International Nuclear Information System (INIS)

    Dekorsy, T; Taubert, R; Hudert, F; Schrenk, G; Bartels, A; Cerna, R; Kotaidis, V; Plech, A; Koehler, K; Schmitz, J; Wagner, J

    2007-01-01

    A new optical pump-probe technique is implemented for the investigation of coherent acoustic phonon dynamics in the GHz to THz frequency range, based on two asynchronously linked femtosecond lasers. Asynchronous optical sampling (ASOPS) provides the performance of an all-optical oscilloscope and allows us to record optically induced lattice dynamics over nanosecond times with femtosecond resolution at scan rates of 10 kHz without any moving parts in the set-up. Within 1 minute of data acquisition time, signal-to-noise ratios better than 10⁷ are achieved. We present examples of the high-sensitivity detection of coherent phonons in superlattices and of the coherent acoustic vibration of metallic nanoparticles.
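
    The time-axis stretching that makes ASOPS work can be stated in one line. With pump and probe repetition rates $f_r$ and $f_r + \Delta f$, the pump-probe delay advances by a fixed increment with every pulse pair, so laboratory time maps linearly onto delay. This is the standard ASOPS bookkeeping, stated here for orientation rather than quoted from the paper:

```latex
\Delta\tau \;=\; \frac{1}{f_r} - \frac{1}{f_r + \Delta f}
           \;=\; \frac{\Delta f}{f_r\,(f_r + \Delta f)}
           \;\approx\; \frac{\Delta f}{f_r^{2}},
\qquad
\tau(t) \;\approx\; \frac{\Delta f}{f_r}\, t \pmod{1/f_r}
```

    A full delay window of $1/f_r$ is therefore scanned every $1/\Delta f$ seconds; the quoted 10 kHz scan rate corresponds to an offset $\Delta f = 10$ kHz.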

  14. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. The magnitude of the change is also expected to depend on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1 and when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and turbine-driven auxiliary feedwater pumps, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by one order of magnitude from the base value; with American failure probability data, by contrast, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
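
    The sensitivity study described above can be mimicked with a toy model: hold the initiating event frequency fixed, sweep one component's failure probability from 0 to 1, and recompute the core damage frequency. Everything below (event frequency, cutsets, numbers) is invented for illustration; it only shows the shape of the computation, not the NRC model.

```python
import numpy as np

# Hypothetical minimal cutsets for one initiating event: core damage occurs
# if (pump A AND pump B) fail, or (pump A AND valve) fail.
def core_damage_frequency(p_pump_a, p_pump_b=0.01, p_valve=0.005,
                          initiating_event_freq=1e-2):
    # Inclusion-exclusion over the two cutsets (independent failures).
    p_cd = (p_pump_a * p_pump_b) + (p_pump_a * p_valve) \
           - (p_pump_a * p_pump_b * p_valve)
    return initiating_event_freq * p_cd

# Sweep the pump failure probability between 0 and 1, as in the study.
for p in np.linspace(0.0, 1.0, 5):
    print(f"p(pump A) = {p:4.2f} -> CDF = {core_damage_frequency(p):.3e} /yr")
```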

  15. Comparing the mannitol-egg yolk-polymyxin agar plating method with the three-tube most-probable-number method for enumeration of Bacillus cereus spores in raw and high-temperature, short-time pasteurized milk.

    Science.gov (United States)

    Harper, Nigel M; Getty, Kelly J K; Schmidt, Karen A; Nutsch, Abbey L; Linton, Richard H

    2011-03-01

    The U.S. Food and Drug Administration's Bacteriological Analytical Manual recommends two enumeration methods for Bacillus cereus: (i) a standard plate count method with mannitol-egg yolk-polymyxin (MYP) agar and (ii) a most-probable-number (MPN) method with tryptic soy broth (TSB) supplemented with 0.1% polymyxin sulfate. This study compared the effectiveness of the MYP and MPN methods for detecting and enumerating B. cereus in raw and high-temperature, short-time pasteurized skim (0.5%), 2%, and whole (3.5%) bovine milk stored at 4°C for 96 h. Each milk sample was inoculated with B. cereus EZ-Spores and sampled at 0, 48, and 96 h after inoculation. There were no differences (P > 0.05) in B. cereus populations among sampling times for all milk types, so data were pooled to obtain overall mean values for each treatment. The overall B. cereus population mean of pooled sampling times for the MPN method (2.59 log CFU/ml) was greater (P < 0.05) than that for the MYP plate count method. B. cereus populations in milk samples ranged from 2.36 to 3.46 and 2.66 to 3.58 log CFU/ml for inoculated milk treatments for the MYP plate count and MPN methods, respectively, which is below the level necessary for toxin production. The MPN method recovered more B. cereus, which makes it useful for validation research. However, the MYP plate count method for enumeration of B. cereus also had advantages, including its ease of use and faster time to results (2 versus 5 days for the MPN method).
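
    The MPN figure itself comes from a maximum-likelihood calculation over the pattern of positive tubes: assuming Poisson-distributed cells, the probability that a tube inoculated with volume v turns positive is 1 − e^(−λv), so the likelihood over all dilutions is a product of binomials. A sketch of the estimate; the 10/1/0.1 mL volumes are the textbook three-tube example, not necessarily this study's exact design:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mpn_estimate(volumes_ml, n_tubes, n_positive):
    """Maximum-likelihood MPN (organisms per mL) for a dilution series."""
    v = np.asarray(volumes_ml, float)
    n = np.asarray(n_tubes, float)
    pos = np.asarray(n_positive, float)

    def neg_log_likelihood(log_lam):
        lam = np.exp(log_lam)
        p = 1.0 - np.exp(-lam * v)          # P(a tube turns positive)
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -np.sum(pos * np.log(p) + (n - pos) * np.log(1 - p))

    res = minimize_scalar(neg_log_likelihood, bounds=(-10, 10),
                          method="bounded")
    return np.exp(res.x)

# Three tubes each at 10, 1 and 0.1 mL; positive-tube pattern 3-1-0.
print(mpn_estimate([10, 1, 0.1], [3, 3, 3], [3, 1, 0]))  # ~0.43 per mL
```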

  16. Automated detection of cavities present in the high explosive filler of artillery shells

    International Nuclear Information System (INIS)

    Kruger, R.P.; Janney, D.H.; Breedlove, J.R. Jr.

    1976-01-01

    Initial research has been conducted into the use of digital image analysis techniques for automated detection and characterization of piping cavities present in the high explosive (HE) filler region of 105-mm artillery shells. Experimental work utilizing scene segmentation techniques followed by a sequential similarity detection algorithm for cavitation detection has yielded promising initial results. This work is described with examples of computer-detected defects
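
    The sequential similarity detection algorithm (SSDA) mentioned above is a classical template-matching scheme: it accumulates absolute differences in a shuffled pixel order and abandons a candidate position as soon as the running error exceeds a threshold, which is what makes it fast on large radiographs. A generic sketch under those assumptions; the threshold and template are placeholders, not the paper's parameters:

```python
import numpy as np

def ssda_match(image, template, threshold):
    """Return candidate (row, col) offsets where the template may match.

    Accumulates |image - template| pixel-by-pixel in a random scan order
    and abandons an offset early once the partial sum exceeds threshold.
    """
    th, tw = template.shape
    ih, iw = image.shape
    order = np.random.permutation(th * tw)        # random scan order
    rows, cols = np.unravel_index(order, template.shape)
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            err = 0.0
            for y, x in zip(rows, cols):
                err += abs(float(image[r + y, c + x]) - float(template[y, x]))
                if err > threshold:               # early abandonment
                    break
            else:                                 # survived all pixels
                hits.append((r, c))
    return hits
```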

  17. An angle-based subspace anomaly detection approach to high-dimensional data: With an application to industrial fault detection

    International Nuclear Information System (INIS)

    Zhang, Liangwei; Lin, Jing; Karim, Ramin

    2015-01-01

    The accuracy of traditional anomaly detection techniques implemented on full-dimensional spaces degrades significantly as dimensionality increases, thereby hampering many real-world applications. This work proposes an approach to selecting a meaningful feature subspace and conducting anomaly detection in the corresponding subspace projection. The aim is to maintain detection accuracy in high-dimensional circumstances. For a specific anomaly candidate, the suggested approach assesses the angle between two lines: the first line connects the relevant data point to the center of its adjacent points; the other is one of the axis-parallel lines. Those dimensions which have a relatively small angle with the first line are then chosen to constitute the axis-parallel subspace for the candidate. Next, a normalized Mahalanobis distance is introduced to measure the local outlier-ness of an object in the subspace projection. To comprehensively compare the proposed algorithm with several existing anomaly detection techniques, we constructed artificial datasets with various high-dimensional settings and found the algorithm displayed superior accuracy. A further experiment on an industrial dataset demonstrated the applicability of the proposed algorithm in fault detection tasks and highlighted another of its merits, namely, to provide preliminary interpretation of abnormality through feature ordering in relevant subspaces. - Highlights: • An anomaly detection approach for high-dimensional reliability data is proposed. • The approach selects relevant subspaces by assessing vectorial angles. • The novel ABSAD approach displays superior accuracy over other alternatives. • Numerical illustration approves its efficacy in fault detection applications
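
    Read as pseudocode, the procedure is: build a reference line from the candidate point to the centroid of its neighbors, keep the axes that make a small angle with that line, and score the point by a Mahalanobis distance restricted to those axes. A simplified sketch under those assumptions; the neighbor count, the fraction of dimensions kept, and the normalization are illustrative choices, not the published ABSAD settings:

```python
import numpy as np

def absad_score(X, i, k=20, keep_frac=0.5):
    """Angle-based subspace anomaly score for point i of data matrix X."""
    x = X[i]
    d2 = np.sum((X - x) ** 2, axis=1)
    nbrs = np.argsort(d2)[1:k + 1]            # k nearest neighbors of x
    center = X[nbrs].mean(axis=0)
    ref = x - center                          # reference line direction
    # cos(angle) between the reference line and axis j is |ref_j| / ||ref||;
    # a small angle means a large |cosine|, so keep the largest components.
    cosines = np.abs(ref) / (np.linalg.norm(ref) + 1e-12)
    dims = np.argsort(cosines)[-max(1, int(keep_frac * X.shape[1])):]
    # Mahalanobis distance in the selected axis-parallel subspace,
    # normalized by the subspace dimensionality.
    sub = X[nbrs][:, dims]
    cov = np.cov(sub, rowvar=False) + 1e-6 * np.eye(len(dims))
    diff = x[dims] - sub.mean(axis=0)
    return float(diff @ np.linalg.solve(cov, diff)) / len(dims)
```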

  18. Leak Detection of High Pressure Feedwater Heater Using Empirical Models

    International Nuclear Information System (INIS)

    Lee, Song Kyu; Kim, Eun Kee; Heo, Gyun Young; An, Sang Ha

    2009-01-01

    Even a small leak from the tube side or pass partition within a high pressure feedwater heater (HPFWH) causes a significant deficiency in its performance, and prolonged plant operation under such a leak condition results in increased costs. A tube-side leak within a HPFWH can produce a high-velocity water jet that can cause neighboring tube failures. Nevertheless, most plants are operated without any information on internal HPFWH leaks, even though these heaters are prone to damage under their high-temperature, high-pressure operating conditions. Leaks from tubes and/or pass partitions of HPFWHs have occurred in many nuclear power plants, for example Mihama PS-2, Takahama PS-2, and Point Beach Nuclear Plant Unit 1. If internal HPFWH leaks are monitored, costs can be reduced through inexpensive repairs relative to the loss in performance, and plant shutdowns as well as further tube damage can be prevented

  19. Detection of high mass cluster ions sputtered from Bi surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, A; Hewitt, R W; Slusser, G J; Baitinger, W E; Cooks, R G; Winograd, N [Purdue Univ., Lafayette, Ind. (USA). Dept. of Chemistry; Delgass, W N [Purdue Univ., Lafayette, Ind. (USA); Varon, A; Devant, G [Societe RIBER, 92 - Rueil-Malmaison (France)

    1976-12-01

    The technique of secondary ion mass spectrometry (SIMS) has been employed to detect Bi₃⁺ ions and the associated oxides Bi₃Oₓ⁺ (x = 1 to 4) from a Bi foil. Using a 3 keV Ar⁺ primary ion beam of 5×10⁻⁷ A/cm², mass resolution of nearly 700 with the requisite sensitivity has been achieved. The Bi surface was also monitored by X-ray photoelectron spectroscopy (XPS or ESCA). The presence of a weak O 1s peak at 532.7 eV and a strong SIMS Bi₃⁺ peak is interpreted to mean that the oxygen is weakly incorporated into the Bi lattice without disrupting metal-metal bonds.

  20. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    Science.gov (United States)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution may be required to identify areas with potential for fatigue damage initiation. Early detection of fatigue-critical areas can drive a simplification of the problem size, leading to appreciable improvements in solution time and model handling while allowing the critical areas to be processed in higher detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  1. The ion mobility spectrometer for high explosive vapor detection

    International Nuclear Information System (INIS)

    Cohen, M.J.; Stimac, R.M.; Wernlund, R.F.

    1984-01-01

    The Phemto-Chem® Model 100 Ion Mobility Spectrometer (IMS) operates in air and measures a number of explosive vapors at levels as low as parts-per-trillion in seconds. The theory and operation of this instrument are discussed. The IMS inhales the vapor sample in a current of air and generates characteristic ions which are separated by their drift times in the atmospheric-pressure gas. Quantitative results, using a dilution tunnel and standard signal generator with TNT, nitroglycerine, ethylene glycol dinitrate, cyclohexanone, methylamine, octafluoronaphthalene and hexafluorobenzene, are given. Rapid sample treatment with sample concentration, microprocessor signal readout, and chemical identification offer a realistic opportunity for rapid explosive vapor detection at levels down to 10⁻¹⁴ parts by volume in air

  2. The potential for very high-frequency gravitational wave detection

    International Nuclear Information System (INIS)

    Cruise, A M

    2012-01-01

    The science case for observing gravitational waves at frequencies in the millihertz-kilohertz range using LIGO, VIRGO, GEO600 or LISA is very strong, and the first results are expected at these frequencies. However, as gravitational wave astronomy progresses beyond the first detections, other frequency bands may be worth exploring. Early predictions of gravitational wave emission from discrete sources at very much higher frequencies (megahertz and above) have been published, and more recent studies point to cosmological signals from inflation, Kaluza-Klein modes from gravitational interactions in brane worlds, and plasma instabilities surrounding violent astrophysical events as possible sources. This communication examines current observational possibilities and the detector technology required to make meaningful observations at these frequencies. (paper)

  3. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  4. Highly-sensitive and large-dynamic diffuse optical tomography system for breast tumor detection

    Science.gov (United States)

    Du, Wenwen; Zhang, Limin; Yin, Guoyan; Zhang, Yanqi; Zhao, Huijuan; Gao, Feng

    2018-02-01

    Diffuse optical tomography (DOT), as a new functional imaging modality, has important clinical applications in many areas, such as the detection of benign and malignant breast tumors and tumor staging. For quantitative detection of breast tumors, a three-wavelength continuous-wave DOT prototype system combining the ultra-high sensitivity of photon-counting detection with the measurement parallelism of the lock-in technique was developed to provide high temporal resolution, high sensitivity, a large dynamic detection range, and a high signal-to-noise ratio. Additionally, a CT-analogous scanning mode was proposed to cost-effectively increase the detection data. To evaluate the feasibility of the system, a series of assessments was conducted. The results demonstrate that the system achieves high linearity, stability, and negligible inter-wavelength crosstalk. Preliminary phantom experiments show that the absorption coefficient can be successfully reconstructed, indicating that the system is an ideal platform for optical breast tumor detection.

  5. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  6. A high-throughput qPCR system for simultaneous quantitative detection of dairy Lactococcus lactis and Leuconostoc bacteriophages

    DEFF Research Database (Denmark)

    Muhammed, Musemma Kedir; Krych, Lukasz; Nielsen, Dennis Sandris

    2017-01-01

    Simultaneous quantitative detection of Lactococcus (Lc.) lactis and Leuconostoc species bacteriophages (phages) has not been reported in dairies using undefined mixed-strain DL-starters, probably due to the lack of applicable methods. We optimized a high-throughput qPCR system that allows simultaneous quantitative detection of the Lc. lactis 936 (now SK1virus), P335, c2 (now C2virus) and Leuconostoc phage groups. Component assays are designed to have high efficiencies and nearly the same dynamic detection ranges, i.e., from 1.1 × 10⁵ to 1.1 × 10¹ phage genomes per reaction, which corresponds to 9 × 10⁷ to 9 × 10³ phage particles mL⁻¹ without any additional up-concentrating steps. The amplification efficiencies of the corresponding assays were 100.1 ± 2.6, 98.7 ± 2.3, 101.0 ± 2.3 and 96.2 ± 6.2. The qPCR system was tested on samples obtained from a dairy plant that employed traditional mother...

  7. Detection systems for high energy particle producing gaseous ionization

    International Nuclear Information System (INIS)

    Martinez, L.; Duran, I.

    1985-01-01

    This report contains a review of the most widely used detectors based on the collection of the ionization produced by high energy particles: proportional counters, multiwire proportional chambers, Geiger-Muller counters and drift chambers. In six sections, the fundamental principles, field configurations, and useful gas mixtures are discussed, and the most relevant devices are reported across 90 pages with 98 references. (Author) 98 refs.

  8. Detection systems for high energy particle producing gaseous ionization

    International Nuclear Information System (INIS)

    Duran, I.; Martinez, L.

    1985-01-01

    This report contains a review of the most widely used detectors based on the collection of the ionization produced by high energy particles: proportional counters, multiwire proportional chambers, Geiger-Mueller counters and drift chambers. In six sections, the fundamental principles, field configurations, and useful gas mixtures are discussed, and the most relevant devices are reported. (author)

  9. Pattern of interstitial lung disease detected by high resolution ...

    African Journals Online (AJOL)

    Background: Diffuse lung diseases constitute a major cause of morbidity and mortality worldwide. High Resolution Computed Tomography (HRCT) is the recommended imaging technique in the diagnosis, assessment and followup of these diseases. Objectives: To describe the pattern of HRCT findings in patients with ...

  10. Robust real-time change detection in high jitter.

    Energy Technology Data Exchange (ETDEWEB)

    Simonson, Katherine Mary; Ma, Tian J.

    2009-08-01

    A new method is introduced for real-time detection of transient change in scenes observed by staring sensors that are subject to platform jitter, pixel defects, variable focus, and other real-world challenges. The approach uses flexible statistical models for the scene background and its variability, which are continually updated to track gradual drift in the sensor's performance and the scene under observation. Two separate models represent temporal and spatial variations in pixel intensity. For the temporal model, each new frame is projected into a low-dimensional subspace designed to capture the behavior of the frame data over a recent observation window. Per-pixel temporal standard deviation estimates are based on projection residuals. The second approach employs a simple representation of jitter to generate pixelwise moment estimates from a single frame. These estimates rely on spatial characteristics of the scene, and are used to gauge each pixel's susceptibility to jitter. The temporal model handles pixels that are naturally variable due to sensor noise or moving scene elements, along with jitter displacements comparable to those observed in the recent past. The spatial model captures jitter-induced changes that may not have been seen previously. Change is declared in pixels whose current values are inconsistent with both models.
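
    The temporal half of the model can be prototyped directly: project each incoming frame onto a low-dimensional subspace learned from the recent window, and flag pixels whose projection residual is large relative to a per-pixel standard deviation estimate. The sketch below uses a plain SVD basis and invented thresholds; the single-frame spatial jitter model from the abstract is omitted.

```python
import numpy as np

def temporal_change_mask(window, frame, rank=5, nsigma=4.0):
    """Flag changed pixels in `frame` given a recent stack of frames.

    window: array (n_frames, ny, nx) of recent frames (background model).
    frame:  array (ny, nx), the new frame to test.
    """
    n, ny, nx = window.shape
    W = window.reshape(n, -1)                     # frames as row vectors
    mean = W.mean(axis=0)
    # Low-dimensional subspace capturing recent background + jitter modes.
    _, _, vt = np.linalg.svd(W - mean, full_matrices=False)
    basis = vt[:rank]                             # (rank, ny*nx)
    # Per-pixel residual scale from the window's own projection residuals.
    resid_win = (W - mean) - (W - mean) @ basis.T @ basis
    sigma = resid_win.std(axis=0) + 1e-6
    # Residual of the new frame after projection onto the subspace.
    d = frame.reshape(-1) - mean
    resid = d - basis.T @ (basis @ d)
    return (np.abs(resid) > nsigma * sigma).reshape(ny, nx)
```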

  11. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information-theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  12. Rejecting probability summation for radial frequency patterns, not so Quick!

    Science.gov (United States)

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than what is possible if the parts were being independently detected and performance only then improved with an increasing number of cycles by probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.
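
    The distinction at stake has a compact textbook form. Under high threshold theory, $n$ independent detectors give $P_n = 1 - \prod_i (1 - p_i)$; combined with a Quick (Weibull) psychometric function this predicts thresholds falling as $n^{-1/\beta}$. Under SDT, probability summation is instead usually modeled as the maximum of $n$ noisy internal responses, which yields curved ROCs like those reported above. The standard HTT relations, stated for orientation rather than quoted from the paper:

```latex
P_n \;=\; 1 - \prod_{i=1}^{n} (1 - p_i),
\qquad
p(c) \;=\; 1 - 2^{-(c/\alpha)^{\beta}}
\;\;\Longrightarrow\;\;
P_n(c) \;=\; 1 - 2^{-n (c/\alpha)^{\beta}},
\quad
c_{\mathrm{thr}}(n) \;\propto\; n^{-1/\beta}
```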

  13. The ship edge feature detection based on high and low threshold for remote sensing image

    Science.gov (United States)

    Li, Xuan; Li, Shengyang

    2018-05-01

    In this paper, a method based on high and low thresholds is proposed to detect ship edge features, addressing the low accuracy caused by noise. We analyze the relationship between the human visual system and the target features, and determine the ship target by detecting its edge features. First, a second-order differential method is used to enhance image quality. Second, to improve the edge operator, we introduce high and low threshold contrast to enhance edge and non-edge points; treating the edge as the foreground image and non-edge points as the background, image segmentation is used to achieve edge detection and remove false edges. Finally, the edge features are described based on the result of edge feature detection, and the ship target is determined. The experimental results show that the proposed method can effectively reduce the number of false edges in edge detection and achieves high accuracy in remote sensing ship edge detection.
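
    The high/low-threshold step behaves like hysteresis thresholding: pixels above the high threshold are definite edges, and pixels above the low threshold survive only if they connect to a definite edge. A compact sketch of that rule; the thresholds are placeholders, and the paper's operator details are not reproduced:

```python
import numpy as np
from scipy import ndimage

def hysteresis_edges(gradient, low, high):
    """Keep weak edge pixels only when connected to a strong edge pixel."""
    strong = gradient > high
    weak = gradient > low                  # strong pixels are a subset
    # Label connected components of the weak map; a component is kept as
    # edge (foreground) if it contains at least one strong pixel.
    labels, n = ndimage.label(weak)
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labels[strong])] = True
    keep[0] = False                        # background label
    return keep[labels]
```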

  14. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  15. High-Tc superconducting Josephson mixers for terahertz heterodyne detection

    International Nuclear Information System (INIS)

    Malnou, M.; Feuillet-Palma, C.; Olanier, L.; Lesueur, J.; Bergeal, N.; Ulysse, C.; Faini, G.; Febvre, P.; Sirena, M.

    2014-01-01

    We report on an experimental and theoretical study of the high-frequency mixing properties of ion-irradiated YBa₂Cu₃O₇ Josephson junctions embedded in THz antennas. We investigated the influence of the local oscillator power and frequency on the device performances. The experimental data are compared with theoretical predictions of the general three-port model for mixers in which the junction is described by the resistively shunted junction model. A good agreement is obtained for the conversion efficiency in different frequency ranges, spanning above and below the characteristic frequencies f_c of the junctions

  16. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  17. Highly Sensitive Filter Paper Substrate for SERS Trace Explosives Detection

    Directory of Open Access Journals (Sweden)

    Pedro M. Fierro-Mercado

    2012-01-01

    Full Text Available We report on a novel and extremely low-cost surface-enhanced Raman spectroscopy (SERS substrate fabricated depositing gold nanoparticles on common lab filter paper using thermal inkjet technology. The paper-based substrate combines all advantages of other plasmonic structures fabricated by more elaborate techniques with the dynamic flexibility given by the inherent nature of the paper for an efficient sample collection, robustness, and stability. We describe the fabrication, characterization, and SERS activity of our substrate using 2,4,6-trinitrotoluene, 2,4-dinitrotoluene, and 1,3,5-trinitrobenzene as analytes. The paper-based SERS substrates presented a high sensitivity and excellent reproducibility for analytes employed, demonstrating a direct application in forensic science and homeland security.

  18. High Throughput Sequencing for Detection of Foodborne Pathogens

    Directory of Open Access Journals (Sweden)

    Camilla Sekse

    2017-10-01

    Full Text Available High-throughput sequencing (HTS is becoming the state-of-the-art technology for typing of microbial isolates, especially in clinical samples. Yet, its application is still in its infancy for monitoring and outbreak investigations of foods. Here we review the published literature, covering not only bacterial but also viral and Eukaryote food pathogens, to assess the status and potential of HTS implementation to inform stakeholders, improve food safety and reduce outbreak impacts. The developments in sequencing technology and bioinformatics have outpaced the capacity to analyze and interpret the sequence data. The influence of sample processing, nucleic acid extraction and purification, harmonized protocols for generation and interpretation of data, and properly annotated and curated reference databases including non-pathogenic “natural” strains are other major obstacles to the realization of the full potential of HTS in analytical food surveillance, epidemiological and outbreak investigations, and in complementing preventive approaches for the control and management of foodborne pathogens. Despite significant obstacles, the achieved progress in capacity and broadening of the application range over the last decade is impressive and unprecedented, as illustrated with the chosen examples from the literature. Large consortia, often with broad international participation, are making coordinated efforts to cope with many of the mentioned obstacles. Further rapid progress can therefore be prospected for the next decade.

  19. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
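
    The problem COVAL solves, computing the distribution of a function of random variables given the distributions of the variables themselves, is the canonical use case for Monte Carlo propagation when no analytic transformation is available. A generic sketch of the reliability-style calculation; the load and strength distributions and the limit-state function are invented for illustration, not taken from the code's documentation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Input variable distributions (illustrative): random load and strength.
load = rng.gumbel(loc=50.0, scale=5.0, size=n)              # applied load
strength = rng.lognormal(mean=np.log(80.0), sigma=0.1, size=n)

# Function of the variables: the safety margin of the structure.
margin = strength - load

# Empirical distribution of the output, and the failure probability.
print("P(failure) =", np.mean(margin < 0.0))
print("5th / 95th percentile margin:", np.percentile(margin, [5, 95]))
```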

  20. A highly sensitive method for detection of molybdenum-containing proteins

    International Nuclear Information System (INIS)

    Kalakutskii, K.L.; Shvetsov, A.A.; Bursakov, S.A.; Letarov, A.V.; Zabolotnyi, A.I.; L'vov, N.P.

    1992-01-01

    A highly sensitive method for the detection of molybdenum-containing proteins in gels after electrophoresis has been developed. The method involves in vitro labeling of the proteins with the radioactive isotope ¹⁸⁵W. The method was used to detect molybdenum-accumulating proteins in lupine seeds, xanthine dehydrogenase and another molybdenum-containing protein in wheat, barley, and pea seedlings, and nitrate reductase and xanthine dehydrogenase in bacteroids from lupine nodules. Nitrogenase could not be detected by the method. 16 refs., 5 figs

  1. A Miniaturized Colorimeter with a Novel Design and High Precision for Photometric Detection

    OpenAIRE

    Jun-Chao Yan; Yan Chen; Yu Pang; Jan Slavik; Yun-Fei Zhao; Xiao-Ming Wu; Yi Yang; Si-Fan Yang; Tian-Ling Ren

    2018-01-01

    Water quality detection plays an increasingly important role in environmental protection. In this work, a novel colorimeter based on the Beer-Lambert law was designed for chemical element detection in water with high precision and a miniaturized structure. As an example, phosphorus detection was carried out to evaluate the colorimeter's performance. Simultaneously, a modified algorithm was applied to extend the linear measurable range. The colorimeter encompassed a ne...
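
    The measurement principle reduces to the Beer-Lambert law, A = log10(I0/I) = εlc, so concentration follows from the ratio of transmitted intensities once the product εl has been calibrated. A minimal sketch; the calibration constant and detector readings are placeholders, and the article's linear-range extension algorithm is not reproduced:

```python
import math

def absorbance(i_blank: float, i_sample: float) -> float:
    """Beer-Lambert absorbance A = log10(I0 / I)."""
    return math.log10(i_blank / i_sample)

def concentration(a: float, epsilon_l: float) -> float:
    """Concentration c = A / (epsilon * path length), from calibration."""
    return a / epsilon_l

a = absorbance(i_blank=1000.0, i_sample=407.0)   # detector counts
print(concentration(a, epsilon_l=0.78))          # mg/L, calibrated slope
```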

  2. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: (1) the computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest; (2) computed probabilities now have associated uncertainties, whose computation is described in §4.1.3; (3) the scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  3. A novel frequent probability pattern mining algorithm based on a circuit simulation method in uncertain biological networks

    Science.gov (United States)

    2014-01-01

    Background: Motif mining has always been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented as a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. Methods: In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. It combines the analysis of circuit topology with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs, avoiding the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results: Experiments on data sets of protein-protein interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in the experiments published in other related papers. Conclusions: The algorithm of probability graph isomorphism

  4. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify predictors associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
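
    The score-based stratification maps directly onto a lookup: the cut-points and the observed PE prevalences quoted above give a pretest probability for each stratum. A sketch of that mapping; only the numbers stated in the abstract are used, and the score's individual items are not reproduced here:

```python
def pe_pretest_probability(score: int) -> tuple[str, float]:
    """Map a clinical score to (category, observed PE prevalence)."""
    if score <= 4:
        return "low", 0.10          # 10% of low-probability patients had PE
    if score <= 8:
        return "intermediate", 0.38
    return "high", 0.81

print(pe_pretest_probability(6))    # ('intermediate', 0.38)
```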

  5. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G [Nuclear Safety Inspectorate HSK, (Switzerland)

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).
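
    With a statistical distribution assumed for fracture toughness, the crack-initiation probability for a given geometry and stress is simply the probability that the applied stress-intensity factor exceeds the toughness. A sketch with a two-parameter Weibull toughness distribution; all parameter values are illustrative assumptions, not the paper's data:

```python
from scipy.stats import weibull_min

# Illustrative Weibull fracture-toughness distribution (MPa*sqrt(m)).
shape, scale = 4.0, 120.0
toughness = weibull_min(c=shape, scale=scale)

def crack_initiation_probability(k_applied: float) -> float:
    """P(initiation) = P(K_Ic < K_applied) for the assumed distribution."""
    return toughness.cdf(k_applied)

# K_applied comes from the geometry and applied-stress analysis.
print(crack_initiation_probability(k_applied=90.0))
```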

  6. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  7. A game with rules in the making - how the high probability of waiting games in nanomedicine is being mitigated through distributed regulation and responsible innovation

    NARCIS (Netherlands)

    D'Silva, J.J.F.; Robinson, D.K.R.; Shelley Egan, Clare

    2012-01-01

    The potential benefits of nanotechnologies in healthcare are widely expected to be enormous and a considerable amount of investment is already pouring into public research in this area. These high expectations of benefits are coupled with uncertainty surrounding the potential risks of the

  8. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Science.gov (United States)

    Yin, Bing; Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li; Wen, Zhen-Yi; Jiang, Zhen-Yi

    2014-03-01

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg₂(CN)₅]⁻ clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN⁻ were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  9. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    International Nuclear Information System (INIS)

    Yin, Bing; Wen, Zhen-Yi; Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li; Jiang, Zhen-Yi

    2014-01-01

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg₂(CN)₅]⁻ clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN⁻ were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties

  10. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Bing, E-mail: rayinyin@gmail.com; Wen, Zhen-Yi [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Institute of Modern Physics, Northwest University, Xi'an 710069 (China); Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Jiang, Zhen-Yi [Institute of Modern Physics, Northwest University, Xi'an 710069 (China)

    2014-03-07

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg₂(CN)₅]⁻ clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN⁻ were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  11. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING and EVALUATION METHODS and REQUIREMENTS

    International Nuclear Information System (INIS)

    SCHOFIELD JS

    2007-01-01

    This document has two purposes: • Describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated. • Provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  12. High speed low power optical detection of sub-wavelength scatterer

    NARCIS (Netherlands)

    Roy, S.; Bouwens, M.A.J.; Wei, L.; Pereira, S.F.; Urbach, H.P.; Walle, P. van der

    2015-01-01

    Optical detection of scatterers on a flat substrate, generally done using the dark-field microscopy technique, is challenging because it requires high-power illumination to obtain a signal-to-noise ratio (SNR) sufficient to detect sub-wavelength particles. We developed a bright field technique,

  13. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  14. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  15. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
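    The forward/inverse distinction described in these lectures can be made concrete in a few lines of code. The following minimal Python sketch (illustrative only, not from the notes) computes a forward probability for fair dice exactly, then frames the inverse problem as estimating face probabilities from observed rolls.

      import random
      from collections import Counter

      # Forward problem (probability): the die is known to be fair,
      # so P(sum of two dice == 7) can be computed exactly: 6/36.
      p_seven = sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == 7) / 36
      print(f"P(sum = 7) = {p_seven:.4f}")  # 0.1667

      # Inverse problem (statistics): given only observed rolls from an
      # unknown die, infer the face probabilities from relative frequencies.
      random.seed(0)
      rolls = [random.randint(1, 6) for _ in range(6000)]
      freqs = Counter(rolls)
      for face in range(1, 7):
          print(face, freqs[face] / len(rolls))  # each near 1/6 if the die is fair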

  16. Highly sensitive optical chemosensor for the detection of Cu using a ...

    Indian Academy of Sciences (India)

    Administrator

    Highly sensitive colorimetric chemosensor molecule RHN for selective detection of Cu2+ in ... colour development against the colourless blank during the sensing event, a feature that would facilitate ... ever reported, much attention has been.

  17. Assay for dihydroorotase using high-performance liquid chromatography with radioactivity detection

    International Nuclear Information System (INIS)

    Mehdi, S.; Wiseman, J.S.

    1989-01-01

    An assay for measuring dihydroorotase activity was devised. Radiolabeled substrate and product were separated by high-performance liquid chromatography using a reverse-phase column with ion-pairing, and the radioactivity was quantitated by flow detection

  18. Device for detecting imminent failure of high-dielectric stress capacitors. [Patent application

    Science.gov (United States)

    McDuff, G.G.

    1980-11-05

    A device is described for detecting imminent failure of a high-dielectric stress capacitor utilizing circuitry for detecting pulse width variations and pulse magnitude variations. Inexpensive microprocessor circuitry is utilized to make numerical calculations of digital data supplied by detection circuitry for comparison of pulse width data and magnitude data to determine if preselected ranges have been exceeded, thereby indicating imminent failure of a capacitor. Detection circuitry may be incorporated in transmission lines, pulse power circuitry, including laser pulse circuitry or any circuitry where capacitors or capacitor banks are utilized.
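    The range-check logic the abstract describes is straightforward to illustrate. The sketch below flags a capacitor when the digitized pulse width or magnitude drifts outside its preselected range; the nominal values and tolerances are hypothetical, since the record quotes none.

      # Hypothetical nominal pulse parameters and preselected ranges (assumed).
      NOMINAL_WIDTH_US = 12.0      # expected pulse width, microseconds
      NOMINAL_MAGNITUDE_KV = 35.0  # expected pulse magnitude, kilovolts
      WIDTH_TOL = 0.10             # +/-10% allowed width deviation
      MAG_TOL = 0.08               # +/-8% allowed magnitude deviation

      def imminent_failure(width_us: float, magnitude_kv: float) -> bool:
          """Flag imminent failure when either digitized pulse parameter
          exceeds its preselected range, as in the abstract."""
          width_dev = abs(width_us - NOMINAL_WIDTH_US) / NOMINAL_WIDTH_US
          mag_dev = abs(magnitude_kv - NOMINAL_MAGNITUDE_KV) / NOMINAL_MAGNITUDE_KV
          return width_dev > WIDTH_TOL or mag_dev > MAG_TOL

      print(imminent_failure(12.3, 34.5))  # False: both within range
      print(imminent_failure(14.1, 34.5))  # True: width deviation ~17% > 10%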

  19. Depleted Nanocrystal-Oxide Heterojunctions for High-Sensitivity Infrared Detection

    Science.gov (United States)

    2015-08-28

    Final progress report for the project "Depleted Nanocrystal-Oxide Heterojunctions for High-Sensitivity Infrared Detection" (4.3 Electronic Sensing). Approved for public release; distribution unlimited.

  20. Probability of brittle failure

    Science.gov (United States)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of the defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect; however, no defects are detectable at the deltoid apexes down to the 0.1 micron level.

  1. Dual-energy X-ray radiography for automatic high-Z material detection

    International Nuclear Information System (INIS)

    Chen Gongyin; Bennett, Gordon; Perticone, David

    2007-01-01

    There is an urgent need for high-Z material detection in cargo. Materials with Z > 74 can indicate the presence of fissile materials or radiation shielding. Dual (high) energy X-ray material discrimination is based on the fact that different materials show different energy dependences of their X-ray attenuation coefficients. This paper introduces the basic physics and analyzes the factors that affect dual-energy material discrimination performance. A detection algorithm is also discussed.
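    That energy dependence can be exploited with a simple ratio test. The Python sketch below outlines one common form of the idea under stated assumptions (Beer-Lambert attenuation through a single material, MeV-range beams where pair production makes the high-energy attenuation grow with Z); the calibration boundaries are hypothetical, not the paper's algorithm.

      import math

      # Hypothetical calibration boundaries for the ratio
      # r = ln(I0/I) at high energy / ln(I0/I) at low energy,
      # which increases with atomic number Z at MeV beam energies
      # (the pair-production cross-section grows roughly as Z^2).
      ORGANIC_MAX = 0.55  # assumed boundary: below this, low-Z / organic
      METAL_MAX = 0.80    # assumed boundary: below this, mid-Z metal

      def log_attenuation(i_transmitted: float, i_incident: float = 1.0) -> float:
          # Beer-Lambert: ln(I0/I) = mu(E) * thickness
          return math.log(i_incident / i_transmitted)

      def classify(i_low: float, i_high: float) -> str:
          """Thickness cancels in the ratio of log-attenuations,
          leaving a Z-dependent signature (idealized model)."""
          r = log_attenuation(i_high) / log_attenuation(i_low)
          if r < ORGANIC_MAX:
              return "low-Z / organic"
          if r < METAL_MAX:
              return "mid-Z metal"
          return "high-Z (possible shielding): alarm"

      print(classify(i_low=0.02, i_high=0.15))   # r ~ 0.49 -> low-Z / organic
      print(classify(i_low=0.02, i_high=0.021))  # r ~ 0.99 -> high-Z alarm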

  2. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  3. Inductively coupled plasma emission spectrometric detection of simulated high performance liquid chromatographic peaks

    International Nuclear Information System (INIS)

    Fraley, D.M.; Yates, D.; Manahan, S.E.

    1979-01-01

    Because of its multielement capability, element-specificity, and low detection limits, inductively coupled plasma optical emission spectrometry (ICP) is a very promising technique for the detection of specific elemental species separated by high performance liquid chromatography (HPLC). This paper evaluated ICP as a detector for HPLC peaks containing specific elements. Detection limits for a number of elements have been evaluated in terms of the minimum detectable concentration of the element at the chromatographic peak maximum. The elements studied were Al, As, B, Ba, Ca, Cd, Co, Cr, Cu, Fe, K, Li, Mg, Mn, Mo, Na, Ni, P, Pb, Sb, Se, Sr, Ti, V, and Zn. In addition, ICP was compared with atomic absorption spectrometry for the detection of HPLC peaks composed of EDTA and NTA chelates of copper. Furthermore, ICP was compared to uv solution absorption for the detection of copper chelates. 6 figures, 4 tables

  4. An intelligent detection method for high-field asymmetric waveform ion mobility spectrometry.

    Science.gov (United States)

    Li, Yue; Yu, Jianwen; Ruan, Zhiming; Chen, Chilai; Chen, Ran; Wang, Han; Liu, Youjiang; Wang, Xiaozhi; Li, Shan

    2018-04-01

    In conventional high-field asymmetric waveform ion mobility spectrometry signal acquisition, multi-cycle detection is time-consuming, which somewhat limits the technique's scope for rapid field detection. In this study, a novel intelligent detection approach has been developed in which a threshold is set on the relative error of the α parameters, eliminating unnecessary detection time. In this method, two full-spectrum scans are made in advance to obtain the estimated compensation voltage at different dispersion voltages, narrowing the whole scan area down to just the peak area(s) of interest. This intelligent detection method can reduce the detection time to 5-10% of that of the original full-spectrum scan in a single cycle.
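    A rough outline of that scan-narrowing idea is sketched below in Python. This is a sketch under stated assumptions: the published method works through the α parameters, whereas the linear compensation-voltage-versus-dispersion-voltage extrapolation, the window width, and the prescan callable here are simplifications introduced for illustration.

      def plan_narrowed_scan(prescan, cv_range, dv_list, window=2.0):
          """prescan(dv) -> peak compensation voltage (CV) found by a coarse
          full-spectrum scan at dispersion voltage dv (assumed callable).
          Two pre-scans fix a CV-vs-DV trend; later DVs are then scanned
          only inside a narrow CV window around the predicted peak."""
          cv1 = prescan(dv_list[0])
          cv2 = prescan(dv_list[1])
          slope = (cv2 - cv1) / (dv_list[1] - dv_list[0])  # linearized trend (assumption)
          plan = {}
          for dv in dv_list[2:]:
              cv_pred = cv1 + slope * (dv - dv_list[0])
              lo = max(cv_range[0], cv_pred - window)
              hi = min(cv_range[1], cv_pred + window)
              plan[dv] = (lo, hi)  # scan only this CV sub-range at this DV
          return plan

      # Toy usage with a fabricated instrument response:
      fake_prescan = lambda dv: -0.004 * dv  # hypothetical CV drift with DV
      print(plan_narrowed_scan(fake_prescan, (-40.0, 10.0), [500, 750, 1000, 1250]))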

  5. Development and data analysis of a radio-detection of ultra high energy cosmic rays experiment

    International Nuclear Information System (INIS)

    Belletoile, A.

    2007-10-01

    The radio-detection of cosmic rays was first attempted in the sixties. Unfortunately, at that time the results suffered from poor reproducibility and the technique was abandoned in favour of direct particle and fluorescence detection. Taking advantage of recent technological improvements, the radio-detection of ultra high energy cosmic rays is being reinvestigated. In this document we first recall the general problematics of cosmic rays. Then, the several mechanisms involved in the emission of an electric field associated with extensive air showers are discussed. The CODALEMA (cosmic detection array with logarithmic electromagnetic antenna) experiment, which aims to demonstrate the feasibility of cosmic ray radio-detection, is extensively described along with the first experimental results. A radio-detection test experiment installed at the giant Pierre Auger detector is presented. It should provide inputs to the design of a future detector using this technique at extreme energies. (author)

  6. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
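    The inverted-S weighting the authors recovered has a standard parametric form. As one illustrative example, the one-parameter Tversky-Kahneman weighting function is sketched below; the curvature value used is arbitrary, not the parameter fitted to the monkeys' choices in this study.

      def tk_weight(p: float, gamma: float = 0.6) -> float:
          """One-parameter Tversky-Kahneman probability weighting
          function; gamma < 1 yields the inverted-S shape. gamma = 0.6
          is illustrative, not the fitted value from the study."""
          num = p ** gamma
          return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

      for p in (0.05, 0.25, 0.50, 0.75, 0.95):
          print(f"p = {p:.2f}  w(p) = {tk_weight(p):.3f}")
      # Low probabilities come out overweighted (w(p) > p) and high
      # probabilities underweighted (w(p) < p), matching the abstract.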

  7. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.

  8. Comprehensive sample analysis using high performance liquid chromatography with multi-detection

    International Nuclear Information System (INIS)

    Pravadali, Sercan; Bassanese, Danielle N.; Conlan, Xavier A.; Francis, Paul S.; Smith, Zoe M.; Terry, Jessica M.; Shalliker, R. Andrew

    2013-01-01

    Highlights: •Detection selectivity was assessed with 6 detection modes. •Natural samples show great diversity in detection selectivity. •Complex samples require evaluation using a multifaceted approach to detection. •23/30 known compounds (detected by MS) detected by chemiluminescence, DPPH and UV. Abstract: Herein we assess the separation space offered by a liquid chromatography system with an optimised uni-dimensional separation for the determination of the key chemical entities in the highly complex matrix of a tobacco leaf extract. Multiple modes of detection, including UV–visible absorbance, chemiluminescence (acidic potassium permanganate, manganese(IV), and tris(2,2′-bipyridine)ruthenium(III)), mass spectrometry and DPPH radical scavenging were used in an attempt to systematically reduce the data complexity of the sample whilst obtaining a greater degree of molecule-specific information. A large amount of chemical data was obtained, but several limitations in the ability to assign detector responses to particular compounds, even with the aid of complementary detection systems, were observed. Thirty-three compounds were detected via MS on the tobacco extract and 12 out of 32 compounds gave a peak height ratio (PHR) greater than 0.33 on one or more detectors. This paper serves as a case study of these limitations, illustrating why multidimensional chromatography is an important consideration when developing a comprehensive chemical detection system.

  9. Comprehensive sample analysis using high performance liquid chromatography with multi-detection

    Energy Technology Data Exchange (ETDEWEB)

    Pravadali, Sercan [Australian Centre for Research on Separation Sciences (ACROSS), School of Science and Health, University of Western Sydney (Parramatta), NSW 1797 (Australia); Bassanese, Danielle N.; Conlan, Xavier A.; Francis, Paul S.; Smith, Zoe M.; Terry, Jessica M. [Centre for Chemistry and Biotechnology, School of Life and Environmental Sciences, Deakin University, Victoria 3216 (Australia); Shalliker, R. Andrew, E-mail: R.Shalliker@uws.edu.au [Australian Centre for Research on Separation Sciences (ACROSS), School of Science and Health, University of Western Sydney (Parramatta), NSW 1797 (Australia)

    2013-11-25

    Highlights: •Detection selectivity was assessed with 6 detection modes. •Natural samples show great diversity in detection selectivity. •Complex samples require evaluation using a multifaceted approach to detection. •23/30 known compounds (detected by MS) detected by chemiluminescence, DPPH and UV. Abstract: Herein we assess the separation space offered by a liquid chromatography system with an optimised uni-dimensional separation for the determination of the key chemical entities in the highly complex matrix of a tobacco leaf extract. Multiple modes of detection, including UV–visible absorbance, chemiluminescence (acidic potassium permanganate, manganese(IV), and tris(2,2′-bipyridine)ruthenium(III)), mass spectrometry and DPPH radical scavenging were used in an attempt to systematically reduce the data complexity of the sample whilst obtaining a greater degree of molecule-specific information. A large amount of chemical data was obtained, but several limitations in the ability to assign detector responses to particular compounds, even with the aid of complementary detection systems, were observed. Thirty-three compounds were detected via MS on the tobacco extract and 12 out of 32 compounds gave a peak height ratio (PHR) greater than 0.33 on one or more detectors. This paper serves as a case study of these limitations, illustrating why multidimensional chromatography is an important consideration when developing a comprehensive chemical detection system.

  10. High Sensitivity and High Detection Specificity of Gold-Nanoparticle-Grafted Nanostructured Silicon Mass Spectrometry for Glucose Analysis.

    Science.gov (United States)

    Tsao, Chia-Wen; Yang, Zhi-Jie

    2015-10-14

    Desorption/ionization on silicon (DIOS) is a high-performance matrix-free mass spectrometry (MS) analysis method that uses silicon nanostructures as a matrix for MS desorption/ionization. In this study, gold nanoparticles grafted onto a nanostructured silicon (AuNPs-nSi) surface were demonstrated as a DIOS-MS analysis approach with high sensitivity and high detection specificity for glucose. A glucose sample deposited on the AuNPs-nSi surface was directly converted by catalysis into negatively charged gluconic acid molecules on a single AuNPs-nSi chip for MS analysis. The AuNPs-nSi surface was fabricated using two electroless deposition steps and one electroless etching step. The effects of the electroless fabrication parameters on the glucose detection efficiency were evaluated. Practical application of AuNPs-nSi MS glucose analysis to urine samples was also demonstrated in this study.

  11. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, using a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  12. Multiplex enrichment quantitative PCR (ME-qPCR): a high-throughput, highly sensitive detection method for GMO identification.

    Science.gov (United States)

    Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang

    2017-04-01

    Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions, so the detection throughput cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. Therefore, in our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, showing that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could reach levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR is suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) have been added into the reaction volume. In this manner, four kinds of amplicons have been generated. All of these four amplicons could be regarded as the target of second-step PCR. For the second-step amplification, three parallels have been taken for

  13. Remote detection of radioactive material using high-power pulsed electromagnetic radiation.

    Science.gov (United States)

    Kim, Dongsung; Yu, Dongho; Sawant, Ashwini; Choe, Mun Seok; Lee, Ingeun; Kim, Sung Gug; Choi, EunMi

    2017-05-09

    Remote detection of radioactive materials is impossible when the measurement location is so far from the radioactive source that the leakage of high-energy photons or electrons from the source cannot be measured. Current technologies are limited in this respect because they only allow detection at distances within the reach of those high-energy photons or electrons. Here we demonstrate an experimental method for the remote detection of radioactive materials that induces plasma breakdown with high-power pulsed electromagnetic waves. Measurements of the plasma formation time and its dispersion lead to a detection sensitivity enhanced over the theoretical prediction based only on the plasma on/off behaviour. We show that, in the presence of radioactive material, a lower incident electromagnetic wave power is sufficient for plasma breakdown in atmospheric-pressure air and the statistical distribution of the formation time is eliminated.

  14. Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.

    Science.gov (United States)

    Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio

    2016-01-01

    The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single-photon detectors based on avalanche photodiodes (APDs), a photon counting system with an optical fiber has become a powerful tool for detecting bioluminescence at the optical fiber end, because it allows full use of the compactness, simple operation and high quantum efficiency of APD detectors. This optical fiber-based system also offers the possibility of improving the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.
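    As a back-of-the-envelope illustration of why photon counting suits such weak signals, the sketch below computes the shot-noise-limited SNR of a counting measurement; the count rates are hypothetical, with detector dark counts folded into the background term.

      import math

      def poisson_snr(signal_cps: float, background_cps: float, t_s: float) -> float:
          """Idealized shot-noise-limited SNR: signal and background
          counts are Poisson, so the noise on (S + B) is sqrt(S + B)."""
          S = signal_cps * t_s
          B = background_cps * t_s
          return S / math.sqrt(S + B)

      # Hypothetical weak bioluminescence: 50 counts/s over a
      # 200 counts/s background (dark counts included in the background).
      for t in (0.1, 1.0, 10.0):
          print(f"t = {t:5.1f} s  SNR = {poisson_snr(50, 200, t):.1f}")
      # SNR grows as sqrt(t): longer integration compensates for weak signals.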

  15. High-fidelity state detection and tomography of a single-ion Zeeman qubit

    International Nuclear Information System (INIS)

    Keselman, A; Glickman, Y; Akerman, N; Kotler, S; Ozeri, R

    2011-01-01

    We demonstrate high-fidelity Zeeman qubit state detection in a single trapped 88Sr+ ion. Qubit readout is performed by shelving one of the qubit states to a metastable level using a narrow-linewidth diode laser at 674 nm, followed by state-selective fluorescence detection. The average fidelity reached for the readout of the qubit state is 0.9989(1). We then measure the fidelity of state tomography, averaged over all possible single-qubit states, which is 0.9979(2). We also fully characterize the detection process using quantum process tomography. This readout fidelity is compatible with recent estimates of the detection error threshold required for fault-tolerant computation, whereas high-fidelity state tomography opens the way for high-precision quantum process tomography.

  16. Detection of Subtle Context-Dependent Model Inaccuracies in High-Dimensional Robot Domains.

    Science.gov (United States)

    Mendoza, Juan Pablo; Simmons, Reid; Veloso, Manuela

    2016-12-01

    Autonomous robots often rely on models of their sensing and actions for intelligent decision making. However, when operating in unconstrained environments, the complexity of the world makes it infeasible to create models that are accurate in every situation. This article addresses the problem of using potentially large and high-dimensional sets of robot execution data to detect situations in which a robot model is inaccurate, that is, detecting context-dependent model inaccuracies in a high-dimensional context space. To find inaccuracies tractably, the robot conducts an informed search through low-dimensional projections of execution data to find parametric Regions of Inaccurate Modeling (RIMs). Empirical evidence from two robot domains shows that this approach significantly enhances the detection power of existing RIM-detection algorithms in high-dimensional spaces.

  17. Highly sensitive chemiluminescent point mutation detection by circular strand-displacement amplification reaction.

    Science.gov (United States)

    Shi, Chao; Ge, Yujie; Gu, Hongxi; Ma, Cuiping

    2011-08-15

    Single nucleotide polymorphism (SNP) genotyping is attracting extensive attention owing to its direct connection with human diseases, including cancers. Here, we have developed a highly sensitive chemiluminescence biosensor for point mutation detection at room temperature, based on circular strand-displacement amplification together with magnetic-bead separation to reduce the background signal. This method takes advantage both of T4 DNA ligase, which recognizes a single-base mismatch with high selectivity, and of the strand-displacement reaction of the polymerase to perform signal amplification. The detection limit of this method was 1.3 × 10^(-16) M, a better sensitivity than that of most reported SNP detection methods. Additionally, the magnetic beads used as the immobilization carrier not only reduce the background signal but may also have potential application in high-throughput screening of SNPs in the human genome. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. High-fidelity state detection and tomography of a single-ion Zeeman qubit

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, A; Glickman, Y; Akerman, N; Kotler, S; Ozeri, R, E-mail: ozeri@weizmann.ac.il [Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel)

    2011-07-15

    We demonstrate high-fidelity Zeeman qubit state detection in a single trapped {sup 88}Sr{sup +} ion. Qubit readout is performed by shelving one of the qubit states to a metastable level using a narrow linewidth diode laser at 674 nm, followed by state-selective fluorescence detection. The average fidelity reached for the readout of the qubit state is 0.9989(1). We then measure the fidelity of state tomography, averaged over all possible single-qubit states, which is 0.9979(2). We also fully characterize the detection process using quantum process tomography. This readout fidelity is compatible with recent estimates of the detection error threshold required for fault-tolerant computation, whereas high-fidelity state tomography opens the way for high-precision quantum process tomography.

  19. Winter School on Operator Spaces, Noncommutative Probability and Quantum Groups

    CERN Document Server

    2017-01-01

    Providing an introduction to current research topics in functional analysis and its applications to quantum physics, this book presents three lectures surveying recent progress and open problems.  A special focus is given to the role of symmetry in non-commutative probability, in the theory of quantum groups, and in quantum physics. The first lecture presents the close connection between distributional symmetries and independence properties. The second introduces many structures (graphs, C*-algebras, discrete groups) whose quantum symmetries are much richer than their classical symmetry groups, and describes the associated quantum symmetry groups. The last lecture shows how functional analytic and geometric ideas can be used to detect and to quantify entanglement in high dimensions.  The book will allow graduate students and young researchers to gain a better understanding of free probability, the theory of compact quantum groups, and applications of the theory of Banach spaces to quantum information. The l...

  20. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false-alarm and missed-detection risks, and that are potentially less sensitive to modeling errors than a procedure relying solely on a probability-of-collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard-body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well in a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
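    The appeal of explicit false-alarm and missed-detection risks is easiest to see in Wald's classical form of the test. The sketch below is that generic form, not the constrained-Kalman-filter-bank formulation of the document, with illustrative Gaussian hypotheses standing in for the maneuver decision.

      import math
      import random

      def sprt(samples, llr, alpha=0.01, beta=0.05):
          """Wald sequential probability ratio test. alpha is the
          false-alarm risk and beta the missed-detection risk; both
          enter the stopping thresholds directly."""
          upper = math.log((1 - beta) / alpha)  # cross above: accept H1
          lower = math.log(beta / (1 - alpha))  # cross below: accept H0
          s = 0.0
          for n, x in enumerate(samples, start=1):
              s += llr(x)  # accumulate log p(x|H1) - log p(x|H0)
              if s >= upper:
                  return "H1", n
              if s <= lower:
                  return "H0", n
          return "undecided", len(samples)

      # Toy hypotheses: x ~ N(0,1) under H0 versus x ~ N(1,1) under H1;
      # the per-sample log-likelihood ratio reduces to x - 0.5.
      random.seed(1)
      data = [random.gauss(1.0, 1.0) for _ in range(100)]
      print(sprt(data, lambda x: x - 0.5))  # typically decides H1 within a few samples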