Xu, Yan; Lei, Huo; Dong, Hong; Zhang, Liping; Qin, Qionglian; Gao, Jianmei; Zou, Yunlian; Yan, Xinmin
2009-09-01
Previous studies found that mutations in the forkhead transcription factor 2 (FOXL2) gene are responsible for both types of blepharophimosis-ptosis-epicanthus inversus syndrome (BPES) but have not established any systematic statistical model for the complex and even contradictory results about genotype-phenotype correlations between them. This study aimed to identify possible FOXL2 gene mutations in a Chinese family with type II BPES by using DNA sequencing and to further clarify genotype-phenotype correlations between FOXL2 mutations and BPES by using a systematic statistical method, namely Multifactor Dimensionality Reduction (MDR). A novel mutation (g.933_965dup), which could result in an expansion of the polyalanine (polyAla) tract, was detected in all patients of this family. MDR analysis of the intragenic FOXL2 mutations reported in previous BPES studies indicated that mutations causing a stronger disturbance of the amino acid sequence were responsible for more type I BPES, while other kinds of mutation were responsible for more type II BPES. In conclusion, the present study found a novel FOXL2 gene mutation in a Chinese BPES family and a new general genotype-phenotype correlation tendency between intragenic FOXL2 mutations and BPES, both of which expand the knowledge about the FOXL2 gene and BPES.
Statistical theory of signal detection
Helstrom, Carl Wilhelm; Costrell, L; Kandiah, K
1968-01-01
Statistical Theory of Signal Detection, Second Edition provides an elementary introduction to the theory of statistical testing of hypotheses that is related to the detection of signals in radar and communications technology. This book presents a comprehensive survey of digital communication systems. Organized into 11 chapters, this edition begins with an overview of the theory of signal detection and the typical detection problem. This text then examines the goals of the detection system, which are defined through an analogy with the testing of statistical hypotheses. Other chapters consider
Detection statistics in the micromaser
Johnson, David B.
2001-01-01
We present a general method for the derivation of various statistical quantities describing the detection of a beam of atoms emerging from a micromaser. The use of non-normalized conditioned density operators and a linear master equation for the dynamics between detection events is discussed, as are the counting statistics, sequence statistics, and waiting time statistics. In particular, we derive expressions for the mean number of successive detections of atoms in one of any two orthogonal states of the two-level atom. We also derive expressions for the mean waiting times between detections. We show that the mean waiting times between detections of atoms in like states are equivalent to the mean waiting times calculated from the uncorrelated steady state detection rates, though like atoms are indeed correlated. The mean waiting times between detections of atoms in unlike states exhibit correlations. We evaluate the expressions for various detector efficiencies using numerical integration, reporting resul...
Dempster, Martin; McCorry, Noleen K.
2009-01-01
Previous research has demonstrated that students' cognitions about statistics are related to their performance in statistics assessments. The purpose of this research is to examine the nature of the relationships between undergraduate psychology students' previous experiences of maths, statistics and computing; their attitudes toward statistics;…
Attribute and topology based change detection in a constellation of previously detected objects
Paglieroni, David W.; Beer, Reginald N.
2016-01-19
A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
Point pattern match-based change detection in a constellation of previously detected objects
Paglieroni, David W.
2016-06-07
A method and system is provided that applies attribute- and topology-based change detection to objects that were detected on previous scans of a medium. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, detection strength, size, elongation, orientation, etc. The locations define a three-dimensional network topology forming a constellation of previously detected objects. The change detection system stores attributes of the previously detected objects in a constellation database. The change detection system detects changes by comparing the attributes and topological consistency of newly detected objects encountered during a new scan of the medium to previously detected objects in the constellation database. The change detection system may receive the attributes of the newly detected objects as the objects are detected by an object detection system in real time.
Wang, Tianlin; Saffran, Jenny R
2014-01-01
While research shows that adults attend to both segmental and suprasegmental regularities in speech, including syllabic transitional probabilities as well as stress and intonational patterns, little is known about how statistical learning operates given input from tonal languages. In the current study, we designed an artificial tone language to address several questions: can adults track regularities in a tonal language? Is learning enhanced by previous exposure to tone-marking languages? Does bilingualism affect learning in this task? To address these questions, we contrasted the performance of English monolingual adults (Experiment 1), Mandarin monolingual and Mandarin-English bilingual adults (Experiment 2), and non-tonal bilingual adults (Experiment 3) in a statistical learning task using an artificial tone language. The pattern of results suggests that while prior exposure to tonal languages did not lead to significant improvements in performance, bilingual experience did enhance learning outcomes. This study represents the first demonstration of statistical learning of an artificial tone language and suggests a complex interplay between prior language experience and subsequent language learning.
Statistical analysis of DNT detection using chemically functionalized microcantilever arrays
Bosco, Filippo; Bache, M.; Hwu, E.-T.
2012-01-01
The need for miniaturized and sensitive sensors for explosives detection is increasing in areas such as security and demining. Micrometer-sized cantilevers are often used for label-free detection, and have previously been reported to be able to detect explosives. However, only a few measurements... from 1 to 2 cantilevers have been reported, without any information on repeatability and reliability of the presented data. In explosive detection high reliability is needed and thus a statistical measurement approach needs to be developed and implemented. We have developed a DVD-based read-out system...
Multilayer Statistical Intrusion Detection in Wireless Networks
Noureddine Boudriga
2008-12-01
The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of the protection mechanisms, an important research focus is intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies enhances the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm makes it possible to control the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.
Statistical detection of systematic election irregularities
Klimek, Peter; Yegorov, Yuri; Hanel, Rudolf; Thurner, Stefan
2012-01-01
Democratic societies are built around the principle that elections are free and fair and that each citizen's vote counts equally. National elections can be regarded as large-scale social experiments, where people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data representation, we find that vote distributions of elections with alleged fraud show a kurtosis substantially exceeding the kurtosis of normal elections, depending on the level of data aggregation. As an example, we show that reported irregularities in recent Russian elections are, indeed, well explained by systematic ballot stuffing. We develop a parametric model quantifying the extent to which fraudulent mechanisms are present. We formulate a parametric test detecting these statistical properties in election results. Remarkably, this technique produces robust outcomes with respect to the resolution of the data and therefore allows for cross-country comparisons. PMID:23010929
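The kurtosis comparison at the heart of this test can be sketched in a few lines. The flagging threshold, district counts, and vote-share values below are illustrative assumptions, not the paper's calibrated quantities:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: E[(x-mu)^4]/sigma^4 - 3 (zero for a normal)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return float(np.mean((x - mu) ** 4) / sigma ** 4 - 3.0)

def flag_anomalous(vote_shares, threshold=1.0):
    """Flag an election whose district-level vote-share distribution is
    substantially more heavy-tailed than a normal one. The threshold is
    illustrative, not a calibrated value from the paper."""
    return excess_kurtosis(vote_shares) > threshold

rng = np.random.default_rng(0)
clean = rng.normal(0.45, 0.05, 5000)                   # normal-looking election
stuffed = np.concatenate([clean, np.full(500, 0.99)])  # ballot-stuffing spike
```

A simulated spike of near-unanimous districts inflates the kurtosis far beyond the clean baseline, which is the signature the authors exploit.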
Statistical fault detection in photovoltaic systems
Garoudja, Elyes
2017-05-08
Faults in photovoltaic (PV) systems, which can result in energy loss, system shutdown or even serious safety breaches, are often difficult to avoid. Fault detection in such systems is imperative to improve their reliability, productivity, safety and efficiency. Here, an innovative model-based fault-detection approach for early detection of shading of PV modules and faults on the direct current (DC) side of PV systems is proposed. This approach combines the flexibility and simplicity of a one-diode model with the extended capacity of an exponentially weighted moving average (EWMA) control chart to detect incipient changes in a PV system. The one-diode model, which is easily calibrated due to its limited calibration parameters, is used to predict the healthy PV array's maximum power coordinates of current, voltage and power using measured temperatures and irradiances. Residuals, which capture the difference between the measurements and the predictions of the one-diode model, are generated and used as fault indicators. Then, the EWMA monitoring chart is applied to the uncorrelated residuals obtained from the one-diode model to detect and identify the type of fault. Actual data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria, are used to assess the performance of the proposed approach. Results show that the proposed approach successfully monitors the DC side of PV systems and detects temporary shading.
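A minimal sketch of the EWMA monitoring step, assuming residuals have already been generated from the one-diode model; the smoothing constant, control-limit width, calibration window, and simulated fault are illustrative choices, not the paper's:

```python
import numpy as np

def ewma_chart(residuals, lam=0.2, L=3.0):
    """EWMA monitoring of model residuals (illustrative parameters).
    Returns the EWMA statistic z_t and a boolean alarm vector where
    z_t leaves the control limits mu0 +/- L * sigma_z(t)."""
    r = np.asarray(residuals, dtype=float)
    mu0, sigma = r[:50].mean(), r[:50].std()  # calibrate on fault-free data
    z = np.empty_like(r)
    alarms = np.zeros(len(r), dtype=bool)
    z_prev = mu0
    for t in range(len(r)):
        z_prev = lam * r[t] + (1 - lam) * z_prev
        z[t] = z_prev
        # time-varying control limit of the EWMA statistic
        sig_z = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
        alarms[t] = abs(z[t] - mu0) > L * sig_z
    return z, alarms

rng = np.random.default_rng(1)
res = rng.normal(0.0, 1.0, 200)
res[120:] += 2.5          # simulated incipient fault: mean shift in the residuals
z, alarms = ewma_chart(res)
```

Because the EWMA accumulates small shifts over time, the simulated 2.5-sigma residual shift triggers alarms within a few samples of its onset.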
Effects of previous surgery on the detection of sentinel nodes in women with vulvar cancer.
Ennik, T.A.; Allen, D.G; Bekkers, R.L.M.; Hyde, S.E.; Grant, P.T.
2011-01-01
BACKGROUND: There is a growing interest to apply the sentinel node (SN) procedure in the treatment of vulvar cancer. Previous vulvar surgery might disrupt lymphatic patterns and thereby decrease SN detection rates, lengthen scintigraphic appearance time (SAT), and increase SN false-negative rate. Th
Hansen, S.L.; Hetmanski, J
1983-01-01
Blood subcultures repeated 3 days after the cultures were first identified as positives increased our detection of polymicrobic bacteremia in 9.1 to 27% of clinically significant patient episodes. Reincubation and repeated subculture of previously positive blood cultures had a direct impact on the therapeutic management of patients with polymicrobic bacteremia.
The value of statistical tools to detect data fabrication
Hartgerink, Chris; Wicherts, Jelte; Van Assen, Marcel
2016-01-01
We aim to investigate how statistical tools can help detect potential data fabrication in the social and medical sciences. In this proposal we outline three projects to assess the value of such statistical tools for detecting potential data fabrication and to take the first steps toward applying them.
Statistical Damage Detection of Civil Engineering Structures using ARMAV Models
Andersen, P.; Kirkegaard, Poul Henning
In this paper a statistically based damage detection of a lattice steel mast is performed. By estimating the modal parameters and their uncertainties it is possible to detect whether some of the modal parameters have changed with statistical significance. The estimation of the uncertainties ...
Detecting and interpreting statistical lensing by absorbers
Ménard, B
2004-01-01
We propose a method for detecting gravitational magnification of distant sources, like quasars, due to absorber systems detected in their spectra. We first motivate the use of metal absorption lines rather than Lyman-alpha lines, then we show how to relate the observed moments of the source magnitude distribution to the mass distribution of absorbers. In order to illustrate the feasibility of the method, we use a simple model to estimate the amplitude of the effect expected for MgII absorption lines, and show that their lensing signal might already be detectable in large surveys like the SDSS. Our model suggests that quasars behind strong MgII absorbers are on average brightened by -0.05 to -0.2 magnitudes due to magnification. One must therefore revisit the claim that, in magnitude-limited surveys, quasars with strong absorbers tend to be missed due to extinction effects. In addition to constraining the mass of absorber systems, applying our method will allow for the quantification of this bias.
A decision surface-based taxonomy of detection statistics
Bouffard, François
2012-09-01
Current and past literature on the topic of detection statistics - in particular those used in hyperspectral target detection - can be intimidating for newcomers, especially given the huge number of detection tests described in the literature. Detection tests for hyperspectral measurements, such as those generated by dispersive or Fourier transform spectrometers used in remote sensing of atmospheric contaminants, are of paramount importance if any level of analysis automation is to be achieved. The detection statistics used in hyperspectral target detection are generally borrowed and adapted from other fields such as radar signal processing or acoustics. Consequently, although remarkable efforts have been made to clarify and categorize the vast number of available detection tests, understanding their differences, similarities, limits and other intricacies is still an exacting journey. Reasons for this state of affairs include heterogeneous nomenclature and mathematical notation, probably due to the multiple origins of hyperspectral target detection formalisms. Attempts at sorting out detection statistics using ambiguously defined properties may also cause more harm than good. Ultimately, a detection statistic is entirely characterized by its decision boundary. Thus, we propose to catalogue detection statistics according to the shape of their decision surfaces, which greatly simplifies this taxonomy exercise. We make a distinction between the topology resulting from the mathematical formulation of the statistic and mere parameters that adjust the boundary's precise shape, position and orientation. Using this simple approach, similarities between various common detection statistics are found, limit cases are reduced to simpler statistics, and a general understanding of the available detection tests and their properties becomes much easier to achieve.
Characterization of binary string statistics for syntactic landmine detection
Nasif, Ahmed O.; Mark, Brian L.; Hintz, Kenneth J.
2011-06-01
Syntactic landmine detection has been proposed to detect and classify non-metallic landmines using ground penetrating radar (GPR). In this approach, the GPR return is processed to extract characteristic binary strings for landmine and clutter discrimination. In our previous work, we discussed the preprocessing methodology by which the amplitude information of the GPR A-scan signal can be effectively converted into binary strings, which identify the impedance discontinuities in the signal. In this work, we study the statistical properties of the binary string space. In particular, we develop a Markov chain model to characterize the observed bit sequence of the binary strings. The state is defined as the number of consecutive zeros between two ones in the binarized A-scans. Since the strings are highly sparse (the number of zeros is much greater than the number of ones), defining the state this way leads to fewer states compared to the case where each bit is defined as a state. The number of total states is further reduced by quantizing the number of consecutive zeros. In order to identify the correct order of the Markov model, the mean square difference (MSD) between the transition matrices of mine strings and non-mine strings is calculated up to order four using training data. The results show that order one or two maximizes this MSD. The specification of the transition probabilities of the chain can be used to compute the likelihood of any given string. Such a model can be used to identify characteristic landmine strings during the training phase. These developments on modeling and characterizing the string statistics can potentially be part of a real-time landmine detection algorithm that identifies landmines and clutter in an adaptive fashion.
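The run-length Markov model described above can be sketched as follows. The quantization bin edges, state count, and toy "mine"/"clutter" strings are illustrative assumptions; only the structure (gap states, first-order transitions, string likelihood) follows the abstract:

```python
import numpy as np

def run_lengths(bits):
    """Gaps (numbers of consecutive zeros) between successive ones."""
    ones = [i for i, b in enumerate(bits) if b == 1]
    return [b - a - 1 for a, b in zip(ones, ones[1:])]

def quantize(gap, bins=(0, 2, 5)):
    """Map a gap to a coarse state index (bin edges are illustrative)."""
    return sum(gap > b for b in bins)

def transition_matrix(strings, n_states=4):
    """First-order Markov transition matrix over quantized gap states,
    with add-one smoothing so unseen transitions keep nonzero mass."""
    counts = np.ones((n_states, n_states))
    for bits in strings:
        states = [quantize(g) for g in run_lengths(bits)]
        for s, t in zip(states, states[1:]):
            counts[s, t] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(bits, P):
    """Score a string against a trained transition matrix."""
    states = [quantize(g) for g in run_lengths(bits)]
    return float(sum(np.log(P[s, t]) for s, t in zip(states, states[1:])))

mine = [[1, 0, 1, 0, 1, 0, 0, 1, 0, 1]] * 5                    # short gaps (toy class)
clutter = [[1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1]] * 5    # long gaps (toy class)
P_mine = transition_matrix(mine)
P_clut = transition_matrix(clutter)
probe = [1, 0, 1, 0, 0, 1, 0, 1]
```

Classifying a probe string then amounts to comparing its log-likelihood under the two trained transition matrices.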
Using Person Fit Statistics to Detect Outliers in Survey Research.
Felt, John M; Castaneda, Ruben; Tiemensma, Jitske; Depaoli, Sarah
2017-01-01
Context: When working with health-related questionnaires, outlier detection is important. However, traditional methods of outlier detection (e.g., boxplots) can miss participants with "atypical" responses to the questions that otherwise have similar total (subscale) scores. In addition to detecting outliers, it can be of clinical importance to determine the reason for the outlier status or "atypical" response. Objective: The aim of the current study was to illustrate how to derive person fit statistics for outlier detection through a statistical method examining person fit with a health-based questionnaire. Design and Participants: Patients treated for Cushing's syndrome (n = 394) were recruited from the Cushing's Support and Research Foundation's (CSRF) listserv and Facebook page. Main Outcome Measure: Patients were directed to an online survey containing the CushingQoL (English version). A two-dimensional graded response model was estimated, and person fit statistics were generated using the Zh statistic. Results: Conventional outlier detection methods revealed no outliers reflecting extreme scores on the subscales of the CushingQoL. However, person fit statistics identified 18 patients with "atypical" response patterns, which would have been otherwise missed (Zh > |±2.00|). Conclusion: While the conventional methods of outlier detection indicated no outliers, person fit statistics identified several patients with "atypical" response patterns who otherwise appeared average. Person fit statistics allow researchers to delve further into the underlying problems experienced by these "atypical" patients treated for Cushing's syndrome. Annotated code is provided to aid other researchers in using this method.
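The idea behind person-fit flagging can be illustrated with the standardized log-likelihood (lz) statistic for dichotomous items, a simplified analogue of the Zh statistic the study applies to a graded response model. The item probabilities and response patterns below are invented for illustration:

```python
import numpy as np

def lz_person_fit(responses, p):
    """Standardized log-likelihood (lz) person-fit statistic for a vector of
    dichotomous responses given model-implied success probabilities p.
    Large negative values flag 'atypical' response patterns. This is a
    simplified analogue of the Zh statistic used with graded-response models."""
    x = np.asarray(responses, dtype=float)
    p = np.asarray(p, dtype=float)
    l0 = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))   # observed log-likelihood
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))    # its expectation
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)     # its variance
    return float((l0 - e) / np.sqrt(v))

# Model-implied probabilities for a hypothetical respondent (illustrative values)
p = np.array([0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2])
typical = (p > 0.5).astype(int)   # answers follow the model's expectations
atypical = 1 - typical            # answers run against every expectation
```

A respondent who contradicts every model expectation gets a strongly negative lz even though both patterns have the same total score, which is exactly the case boxplots on subscale totals would miss.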
A flexibly shaped spatial scan statistic for detecting clusters
Takahashi Kunihiko
2005-05-01
Abstract Background The spatial scan statistic proposed by Kulldorff has been applied to a wide variety of epidemiological studies for cluster detection. This scan statistic, however, uses a circular window to define the potential cluster areas and thus has difficulty in correctly detecting actual noncircular clusters. A recent proposal by Duczmal and Assunção for detecting noncircular clusters is shown, in our experience, to detect a cluster of very irregular shape that is much larger than the true cluster. Methods We propose a flexibly shaped spatial scan statistic that can detect irregularly shaped clusters within relatively small neighborhoods of each region. The performance of the proposed spatial scan statistic is compared to that of Kulldorff's circular spatial scan statistic with Monte Carlo simulation by considering several circular and noncircular hot-spot cluster models. For comparison, we also propose a new bivariate power distribution classified by the number of regions detected as the most likely cluster and the number of hot-spot regions included in the most likely cluster. Results The circular spatial scan statistic shows a high level of accuracy in detecting circular clusters exactly. The proposed spatial scan statistic is shown to have good power plus the ability to detect the noncircular hot-spot clusters more accurately than the circular one. Conclusion The proposed spatial scan statistic is shown to work well for small to moderate cluster sizes, up to say 30. For larger cluster sizes, the method is not practically feasible and a more efficient algorithm is needed.
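The circular scan statistic that serves as the baseline here can be sketched as follows. The Poisson likelihood-ratio form is Kulldorff's; the toy grid, rates, and radii are illustrative, and a full implementation would add Monte Carlo significance testing (a flexible-shape scan would instead grow irregular windows from each region's neighbours):

```python
import numpy as np

def kulldorff_llr(c, n, C, N):
    """Poisson log-likelihood ratio for a window with c cases out of n
    at-risk, against C total cases out of N (Kulldorff's statistic)."""
    if c / n <= (C - c) / (N - n):
        return 0.0  # only elevated-rate windows count as clusters
    inside = c * np.log(c / n)
    outside = (C - c) * np.log((C - c) / (N - n))
    null = C * np.log(C / N)
    return float(inside + outside - null)

def circular_scan(xy, cases, pop, radii):
    """Scan circular windows centred on each region; return the best
    log-likelihood ratio and the member indices of the most likely cluster."""
    C, N = cases.sum(), pop.sum()
    best = (0.0, None)
    for centre in xy:
        d = np.linalg.norm(xy - centre, axis=1)
        for r in radii:
            idx = np.where(d <= r)[0]
            c, n = cases[idx].sum(), pop[idx].sum()
            if 0 < c < C:
                llr = kulldorff_llr(c, n, C, N)
                if llr > best[0]:
                    best = (llr, idx)
    return best

# Toy 5x5 grid of regions with a hot spot in one corner
xy = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
pop = np.full(25, 100)
cases = np.full(25, 5)
cases[[0, 1, 5, 6]] = 25                 # elevated-rate cluster
llr, cluster = circular_scan(xy, cases, pop, radii=[1.0, 1.5, 2.0])
```

On this toy grid the scan recovers exactly the four elevated-rate regions as the most likely cluster.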
Fundamental statistical limitations of future dark matter direct detection experiments
Strege, C.; Trotta, F.; Bertone, G.; Peter, A.H.G.; Scott, P.
2012-01-01
We discuss irreducible statistical limitations of future ton-scale dark matter direct detection experiments. We focus in particular on the coverage of confidence intervals, which quantifies the reliability of the statistical method used to reconstruct the dark matter parameters and the bias of the r
Hu, Xuan; Fan, Mingwen; Mulder, Jan; Frencken, Jo E
2016-01-01
To compare the level of agreement between carious lesion assessments according to the visual clinical examination and the colour photograph methods. Data on the presence of enamel/dentin carious lesions in previously sealed occlusal surfaces in first molars were obtained by two trained and calibrated examiners through visual clinical examination and from colour photographs 4 years after sealing. Kappa statistics were applied to calculate agreement between assessment methods. Data analysis was performed using sign, Bowker symmetry and McNemar's tests. The prevalence of dentin carious lesions was very low. The kappa coefficients for detecting enamel/dentin carious lesions using the two assessment methods were 0.65 (CI: 0.56-0.74) for examiner 1 and 0.70 (CI: 0.62-0.78) for examiner 2. Examiner 2 observed more enamel/dentin carious lesions on colour photographs than did examiner 1 (p = 0.008). Sensitivity analyses did not confirm this outcome. There was no difference in the detection of enamel/dentin carious lesions in previously sealed occlusal surfaces using colour photographs vs visual clinical examination. The colour photograph method is therefore equivalent to the visual clinical examination in detecting enamel/dentin carious lesions. More studies are required.
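The kappa agreement measure used above can be sketched directly; the category codes and scores below are toy values, not the study's data:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for agreement between two assessment methods on the
    same surfaces (labels are category codes, e.g. sound / enamel / dentin)."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                            # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)       # chance agreement
    return (po - pe) / (1 - pe)

clinical = [0, 0, 1, 1, 2, 0, 1, 0, 2, 1]   # toy visual-examination scores
photo    = [0, 0, 1, 2, 2, 0, 1, 0, 1, 1]   # toy colour-photograph scores
kappa = cohens_kappa(clinical, photo)
```

Kappa corrects the raw percentage agreement for the agreement expected by chance from the two raters' marginal distributions, which is why it is preferred over simple agreement in method-comparison studies like this one.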
Statistical detection of EEG synchrony using empirical bayesian inference.
Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven
2015-01-01
There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that the locFDR can effectively control false positives without compromising on the power of PLV synchrony inference. Our results from the application locFDR on experiment data detected more significant discoveries than our previously proposed methods whereas the standard FDR method failed to detect any significant discoveries.
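A minimal sketch of Efron's local FDR idea as applied here: compute locfdr(z) = pi0 * f0(z) / f(z), with f0 a theoretical N(0,1) null and the mixture density f estimated from all statistics. The kernel bandwidth, pi0 = 1, decision cutoff, and simulated z-scores are illustrative assumptions, not the authors' estimation procedure:

```python
import numpy as np

def local_fdr(z, pi0=1.0, bw=0.3):
    """Empirical Bayes local FDR (Efron, 2001), minimal sketch:
    locfdr(z) = pi0 * f0(z) / f(z), with f0 the theoretical N(0,1) null
    and f estimated from all z-scores by a Gaussian kernel density."""
    z = np.asarray(z, dtype=float)
    f0 = np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)
    # kernel density estimate of the mixture density f at each z
    diffs = (z[:, None] - z[None, :]) / bw
    f = np.exp(-diffs ** 2 / 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))
    return np.minimum(pi0 * f0 / f, 1.0)

rng = np.random.default_rng(2)
null_z = rng.normal(0.0, 1.0, 900)   # true-null synchrony statistics
alt_z = rng.normal(4.0, 0.5, 100)    # genuinely synchronous pairs
z = np.concatenate([null_z, alt_z])
fdr = local_fdr(z)
discoveries = fdr < 0.2              # posterior probability of being null < 0.2
```

Each statistic gets its own posterior null probability, so most of the genuinely synchronous scores are declared discoveries while nearly all null scores are retained.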
Detection of small target using recursive higher order statistics
Hou, Wang; Sun, Hongyuan; Lei, Zhihui
2014-02-01
In this paper, a recursive higher order statistics algorithm is proposed for small target detection in the temporal domain. Firstly, the background of the image sequence is normalized. Then, the higher order statistics are recursively computed over the image sequence to obtain the feature image. Finally, the feature image is segmented with a threshold to detect the small target. To validate the proposed algorithm, five simulated image sequences and one semi-simulated image sequence are created. ROC curves are employed for evaluation of the experimental results. Experimental results show that our method is very effective for small target detection.
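The recursive accumulation of per-pixel higher-order statistics can be sketched as follows. Raw moments are updated frame by frame and combined into a kurtosis feature image at the end; the toy sequence, target amplitude, and argmax segmentation stand in for the paper's threshold step:

```python
import numpy as np

def hos_feature_image(frames):
    """Recursively accumulate per-pixel raw moments over an image sequence,
    then form a kurtosis feature image from them. Pixels whose temporal
    samples are heavy-tailed (e.g. a transient point target over a
    Gaussian background) stand out with large values."""
    s1 = s2 = s3 = s4 = 0.0
    n = 0
    for f in frames:                       # one frame at a time (recursive update)
        f = f.astype(float)
        s1 = s1 + f
        s2 = s2 + f ** 2
        s3 = s3 + f ** 3
        s4 = s4 + f ** 4
        n += 1
    m1 = s1 / n
    var = s2 / n - m1 ** 2
    # central fourth moment reconstructed from the raw moments
    m4 = s4 / n - 4 * m1 * s3 / n + 6 * m1 ** 2 * s2 / n - 3 * m1 ** 4
    return m4 / (var ** 2 + 1e-12)         # kurtosis feature image

rng = np.random.default_rng(3)
frames = rng.normal(0.0, 1.0, size=(200, 32, 32))  # normalized background
frames[100:110, 16, 16] += 8.0                     # brief dim point target
feature = hos_feature_image(frames)
target = np.unravel_index(int(np.argmax(feature)), feature.shape)
```

A transient target makes one pixel's temporal distribution heavy-tailed; its kurtosis rises well above the Gaussian background value of about 3, so it dominates the feature image.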
GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis
V. Dehghanian
2012-01-01
A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique, based on signal power measurements, is developed in this paper; it can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
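The simplest member of the family of power-based tests described above is a one-sided threshold on received power against calibrated authentic-signal statistics. The calibration values, excess-power level, and threshold below are toy assumptions; the paper's multihypothesis formulation derives several such thresholds jointly:

```python
import numpy as np

def spoofing_alarm(power_dbm, mu0, sigma0, k=4.0):
    """One-sided power test (illustrative): a spoofer overlaying counterfeit
    signals raises total received power above the calibrated authentic level.
    The threshold k trades false alarms against missed detections."""
    return (np.asarray(power_dbm) - mu0) / sigma0 > k

mu0, sigma0 = -130.0, 0.5   # calibrated authentic-signal statistics (toy values)
authentic = mu0 + sigma0 * np.random.default_rng(4).normal(size=100)
spoofed = authentic + 4.0   # spoofer adds ~4 dB of excess received power
```

With a 4-sigma threshold, the authentic power trace stays below the limit while the spoofed trace, lifted by the counterfeit signals' extra power, is flagged almost everywhere.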
Statistical detection of the hidden distortions in diffusive spectra
Nigmatullin, R R; Smith, G; Butler, P
2003-01-01
The detection of an unknown substance in small concentration represents an important problem in spectroscopy. Usually this detection is based on the recognition of specific 'labels' i.e. the visual appearance of new resonance lines that appear in the spectrograms analysed. But if the concentration of the unknown substance is small and visual indications (e.g. resonance peaks in diffusive spectra) are absent then the detection of the unknown substance constitutes a problem. We suggest a new methodology for the statistical detection of an unknown substance, based on the transformation of fluctuations obtained from initial spectrograms into ordered quantized histograms (QHs). The QHs obtained help to detect, statistically, the presence of unknown substances using the characteristics of conventional quantum spectra adopted from quantum mechanics. The averaging of the QHs helps to calculate the ordered 'fluctuation fork' (FF), which provides a specific 'noise ruler' for the detection and quantification of the trac...
Roshni D. Tale
2014-04-01
Full Text Available Deception detection has important legal and medical applications, but the reliability of methods for differentiating between truthful and deceptive responses is still limited. Deception detection can be achieved more accurately by measuring the brain correlates of lying in an individual. To evaluate the method, several participants went through the designed concealed information test (CIT) paradigm and their respective brain signals were recorded. The electroencephalogram (EEG) signals were recorded and separated into many single trials. To enhance the signal-to-noise ratio (SNR) of P3 components, the independent component analysis (ICA) method was adopted to separate non-P3 components (i.e., artifacts) from P3 components in every single trial. The P3 waveforms with high SNR were then reconstructed, and groups of features based on time, frequency, and amplitude were extracted from the reconstructed P3 waveforms. Finally, two different classes of feature samples were used to train a support vector machine (SVM) classifier, chosen for its higher performance compared with several other classifiers. The method presented in this paper improves the efficiency of the CIT and deception detection in comparison with previously reported methods.
Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR
2014-07-12
Report cover metadata only; the abstract text is not present in the record. Title: Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR. Sponsoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Performing organization: 310 Jesse Hall, Columbia, MO 65211-1230. Keywords: landmine detection, signal processing. The views, opinions and/or findings contained in this report are those of the author(s) and should not...
Statistical language analysis for automatic exfiltration event detection.
Robinson, David Gerald
2010-04-01
This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
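As an illustration only (not the authors' implementation), the core of LDA can be sketched with a minimal collapsed Gibbs sampler over tokenized log sessions. The toy "sessions", topic count, and hyperparameters below are all assumptions made for the example:

```python
import random
from collections import defaultdict

def lda_gibbs(docs, K=2, iters=100, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA; returns per-document topic counts."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})                      # vocabulary size
    z = [[rng.randrange(K) for _ in d] for d in docs]          # topic assignments
    ndk = [[0] * K for _ in docs]                              # doc-topic counts
    nkw = [defaultdict(int) for _ in range(K)]                 # topic-word counts
    nk = [0] * K                                               # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            t = z[di][wi]
            ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    for _ in range(iters):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                t = z[di][wi]                                  # remove current assignment
                ndk[di][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                weights = [(ndk[di][k] + alpha) * (nkw[k][w] + beta) / (nk[k] + V * beta)
                           for k in range(K)]
                t = rng.choices(range(K), weights)[0]          # resample a topic
                z[di][wi] = t
                ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return ndk

# hypothetical "sessions": two ordinary login/read sessions, one bulk-copy session
sessions = [["login", "read", "read", "login"],
            ["login", "read", "login", "read"],
            ["scp", "tar", "scp", "tar"]]
topic_mix = lda_gibbs(sessions, K=2, iters=100)
```

A session whose inferred topic mixture is dominated by a topic rarely seen in baseline traffic would then be scored as a candidate exfiltration event.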
Statistical Methods for Quantitatively Detecting Fungal Disease from Fruits’ Images
Jagadeesh D. Pujari; Yakkundimath, Rajesh Siddaramayya; Byadgi, Abdulmunaf Syedhusain
2013-01-01
In this paper we propose statistical methods for detecting fungal disease and classifying it by disease severity level. Most fruit diseases are caused by bacteria, fungi, viruses, etc., of which fungi are responsible for a large number of diseases in fruits. In this study, images of fruits affected by different fungal symptoms are collected and categorized based on disease severity. Statistical features like block-wise, gray level co-occurrence matrix (GLCM), gray level runlength matr...
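A GLCM, one of the texture features named above, can be sketched in a few lines; the horizontal-neighbour offset and 8-level quantization below are assumptions, not the paper's settings:

```python
import numpy as np

def glcm(img, levels=8):
    """Normalized gray-level co-occurrence matrix for horizontal neighbours."""
    q = (img * levels / (img.max() + 1e-9)).astype(int).clip(0, levels - 1)
    m = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[a, b] += 1                       # count co-occurring gray-level pairs
    return m / m.sum()

def glcm_features(m):
    """Classic Haralick-style summaries of a normalized GLCM."""
    i, j = np.indices(m.shape)
    return {"contrast": float(((i - j) ** 2 * m).sum()),
            "energy": float((m ** 2).sum()),
            "homogeneity": float((m / (1.0 + np.abs(i - j))).sum())}
```

For a perfectly uniform patch the contrast is 0 and the energy is 1; textured, diseased regions move both features away from these extremes, which is what a severity classifier can exploit.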
Turbo Detection in Rayleigh flat fading channel with unknown statistics
Paul Fortier
2010-11-01
Full Text Available The turbo detection of turbo coded symbols over correlated Rayleigh flat fading channels generated according to Jakes' model is considered in this paper. We propose a method to estimate the channel signal-to-noise ratio (SNR) and the maximum Doppler frequency. These statistics are required by the linear minimum mean squared error (LMMSE) channel estimator. To improve the system convergence, we redefine the channel reliability factor by taking into account the channel estimation error statistics. Simulation results for a rate 1/3 turbo code and two different normalized fading rates show that the use of the new reliability factor greatly improves the performance. The improvement is more substantial when channel statistics are unknown.
Statistical Inference for Detecting Structures and Anomalies in Networks
2015-08-27
mean geodesic distance). Although useful for detecting some types of change points, we demonstrate that these methods fail to detect changes, up to... based on linearizing the BP equations along the lines of previous work under this grant. We verify our analytic and algorithmic results via numerical...
Statistical detection of structural damage based on model reduction
Tao YIN; Heung-fai LAM; Hong-ping ZHU
2009-01-01
This paper proposes a statistical method for damage detection based on the finite element (FE) model reduction technique that utilizes measured modal data with a limited number of sensors. A deterministic damage detection process is formulated based on the model reduction technique. The probabilistic process is integrated into the deterministic damage detection process using a perturbation technique, resulting in a statistical structural damage detection method. This is achieved by deriving the first- and second-order partial derivatives of uncertain parameters, such as the elasticity of the damaged member, with respect to the measurement noise, which allows the expectation and covariance matrix of the uncertain parameters to be calculated. Besides the theoretical development, this paper reports numerical verification of the proposed method using a portal frame example and Monte Carlo simulation.
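The first-order step of such a perturbation analysis is linear error propagation: if the sensitivity (Jacobian) of the uncertain parameters with respect to the measurement noise is known, the parameter covariance follows directly. A minimal sketch, with a hypothetical 2x2 sensitivity matrix standing in for the derived partial derivatives:

```python
import numpy as np

def parameter_covariance(J, noise_cov):
    """First-order propagation: Cov(theta) = J @ Cov(noise) @ J.T."""
    return J @ noise_cov @ J.T

# hypothetical sensitivity of two stiffness parameters to two noisy modal measurements
J = np.array([[0.5, 0.1],
              [0.2, 0.8]])
C = parameter_covariance(J, 0.01 * np.eye(2))   # i.i.d. measurement noise assumed
```

The resulting matrix is symmetric positive semi-definite by construction, which is what the Monte Carlo verification in the paper would be expected to confirm empirically.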
Detailed noise statistics for an optically preamplified direct detection receiver
Danielsen, Søren Lykke; Mikkelsen, Benny; Durhuus, Terji
1995-01-01
We describe the exact statistics of an optically preamplified direct detection receiver by means of the moment generating function. The theory allows an arbitrary shaped electrical filter in the receiver circuit. The moment generating function (MGF) allows for a precise calculation of the error...
Statistical Procedures for Estimating and Detecting Climate Changes
Anonymous
2006-01-01
This paper provides a concise description of the philosophy, mathematics, and algorithms for estimating, detecting, and attributing climate changes. The estimation follows the spectral method by using empirical orthogonal functions, also called the method of reduced space optimal averaging. The detection follows the linear regression method, which can be found in most textbooks about multivariate statistical techniques. The detection algorithms are described by using the space-time approach to avoid the non-stationarity problem. The paper includes (1) the optimal averaging method for minimizing the uncertainties of the global change estimate, (2) the weighted least square detection of both single and multiple signals, (3) numerical examples, and (4) the limitations of the linear optimal averaging and detection methods.
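The weighted least squares detection step in (2) reduces to a standard linear algebra computation. A minimal sketch, with toy response patterns and a noise-free observation vector that are assumptions for the example, not the paper's data:

```python
import numpy as np

def wls_detect(X, W, y):
    """Weighted least-squares signal amplitudes and their covariance."""
    A = X.T @ W @ X
    beta = np.linalg.solve(A, X.T @ W @ y)
    return beta, np.linalg.inv(A)   # covariance, if W is the inverse noise covariance

# hypothetical response patterns (columns): a constant offset and a linear trend
X = np.column_stack([np.ones(6), np.arange(6.0)])
beta, cov = wls_detect(X, np.eye(6), X @ np.array([1.0, 2.0]))
```

Detection then amounts to testing whether each estimated amplitude in `beta` is significantly different from zero given its variance from `cov`.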
A CUSUM framework for detection of space-time disease clusters using scan statistics.
Sonesson, Christian
2007-11-20
Several methods for timely detection of emerging clusters of diseases have recently been proposed. We focus our attention on one of the most popular types of method: the scan statistic. Different ways of constructing space-time scan statistics based on surveillance theory are presented. We bridge the ideas from space-time disease surveillance, public health surveillance and industrial quality control and show that previously suggested space-time scan statistic methods can be fitted into a general CUSUM framework. Crucial differences between the methods studied are due to different assumptions about the spatial process. One example is the specification of the spatial regions of interest for a possible cluster; another is the increased rate to be detected within a cluster. We evaluate the detection ability of the methods considering the possibility of a cluster emerging at any time during the surveillance period. The methods are applied to the detection of an increased incidence of Tularemia in Sweden.
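The CUSUM framework the abstract refers to can be illustrated with the classic one-sided recursion: accumulate the excess of each observation over a reference value k and raise an alarm when the sum passes a threshold h. The values of k and h below are arbitrary illustration choices:

```python
def cusum_alarm(xs, k=0.5, h=5.0):
    """One-sided CUSUM: return the index of the first alarm, or None."""
    s = 0.0
    for t, x in enumerate(xs):
        s = max(0.0, s + x - k)   # accumulate excess over the reference value k
        if s > h:
            return t              # alarm: sustained elevation detected
    return None
```

On a toy series of 20 in-control counts followed by elevated counts, the alarm fires a few steps after the shift, which is the timeliness/false-alarm trade-off governed by k and h.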
Statistical methods for detecting periodic fragments in DNA sequence data
Ying Hua
2011-04-01
Full Text Available Abstract Background Period 10 dinucleotides are structurally and functionally validated factors that influence the ability of DNA to form nucleosomes, histone core octamers. Robust identification of periodic signals in DNA sequences is therefore required to understand nucleosome organisation in genomes. While various techniques for identifying periodic components in genomic sequences have been proposed or adopted, the requirements for such techniques have not been considered in detail and confirmatory testing for a priori specified periods has not been developed. Results We compared the estimation accuracy and suitability for confirmatory testing of autocorrelation, discrete Fourier transform (DFT, integer period discrete Fourier transform (IPDFT and a previously proposed Hybrid measure. A number of different statistical significance procedures were evaluated but a blockwise bootstrap proved superior. When applied to synthetic data whose period-10 signal had been eroded, or for which the signal was approximately period-10, the Hybrid technique exhibited superior properties during exploratory period estimation. In contrast, confirmatory testing using the blockwise bootstrap procedure identified IPDFT as having the greatest statistical power. These properties were validated on yeast sequences defined from a ChIP-chip study where the Hybrid metric confirmed the expected dominance of period-10 in nucleosome associated DNA but IPDFT identified more significant occurrences of period-10. Application to the whole genomes of yeast and mouse identified ~ 21% and ~ 19% respectively of these genomes as spanned by period-10 nucleosome positioning sequences (NPS. Conclusions For estimating the dominant period, we find the Hybrid period estimation method empirically to be the most effective for both eroded and approximate periodicity. The blockwise bootstrap was found to be effective as a significance measure, performing particularly well in the problem of
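The integer-period DFT idea compared above can be sketched by evaluating the spectral power of a sequence at candidate integer periods and picking the dominant one; the synthetic eroded period-10 signal below is an assumption for the example, not the paper's data:

```python
import numpy as np

def period_power(x, p):
    """Spectral power of x at integer period p (one integer-period DFT bin)."""
    n = np.arange(len(x))
    return abs(np.dot(x, np.exp(-2j * np.pi * n / p))) ** 2 / len(x)

rng = np.random.default_rng(0)
n = np.arange(300)
x = np.cos(2 * np.pi * n / 10) + 0.3 * rng.standard_normal(300)  # noisy period-10 signal
best = max(range(2, 21), key=lambda p: period_power(x, p))        # exploratory estimate
```

Confirmatory testing for an a priori period, as the abstract describes, would then compare the observed power at p = 10 against a null distribution generated by a blockwise bootstrap of the sequence.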
Detection of Doppler Microembolic Signals Using High Order Statistics
Maroun Geryes
2016-01-01
Full Text Available Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, the second leading cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher order statistics have been an extensive field of research as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third-moment skewness and fourth-moment kurtosis of the energy signal. During embolus-free periods the distribution of the energy is not altered and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to these emboli. Applied to real signals, detection of microemboli through the skewness and kurtosis signals outperformed detection through standard methods. The sensitivity and specificity reached 78% and 91% for the skewness detector and 80% and 90% for the kurtosis detector, respectively.
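A windowed higher-order-moment detector of this kind can be sketched directly; the window length, spike amplitude, and synthetic Gaussian "energy" signal below are illustration assumptions:

```python
import numpy as np

def rolling_std_moment(x, w, order):
    """Standardized central moment (order 3 = skewness, 4 = kurtosis) per window."""
    out = []
    for i in range(len(x) - w + 1):
        seg = x[i:i + w]
        m, s = seg.mean(), seg.std()
        out.append(((seg - m) ** order).mean() / (s ** order + 1e-12))
    return np.array(out)

rng = np.random.default_rng(1)
energy = rng.standard_normal(500)
energy[300] += 15.0                       # simulated embolic spike in the energy signal
kurt = rolling_std_moment(energy, 64, 4)  # peaks on windows containing the spike
```

Windows containing the spike have kurtosis far above the Gaussian baseline of about 3, so a simple threshold on the kurtosis signal localizes the event.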
On detection and assessment of statistical significance of Genomic Islands
Chaudhuri Probal
2008-04-01
Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquisition. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well-known methods in terms of sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
Tassoni, Giovanna; Cippitelli, Marta; Ottaviani, Giovanni; Froldi, Rino; Cingolani, Mariano
2016-07-01
A forensic standard procedure is described that combines enzyme-linked immunoassay for screening and gas chromatography-mass spectrometry (GC-MS) for confirmation to detect drugs of abuse in a sample previously used to detect opioids and cocaine. We used two equal aliquots of the same previously selected cannabinoid-positive hair samples, one of which was subjected to acid hydrolysis. Afterward, both aliquots were subjected to basic extraction and then to immunoassay screening. After derivatization, the GC-MS parameters were the same for both aliquots for the determination of the cannabinoids (Δ(9)-tetrahydrocannabinol, cannabidiol and cannabinol). The results show that there were no statistical differences between the non-pretreated and the pretreated hair samples for the quantification of the three cannabis products by the immunochemical procedure. No differences between the two groups were shown for the GC-MS confirmation procedures. All substances showed good linearity between 0.05 and 2 ng/mg. The limit of detection ranged from 0.02 to 0.03 ng/mg, and the limit of quantification was 0.05 ng/mg for all substances. To our knowledge, this is the first time that screening and confirmation procedures have been applied to the same sample of hair to detect more than one drug of abuse.
Statistical Outlier Detection for Jury Based Grading Systems
Thompson, Mary Kathryn; Clemmensen, Line Katrine Harder; Rosas, Harvey
2013-01-01
This paper presents an algorithm that was developed to identify statistical outliers from the scores of grading jury members in a large project-based first year design course. The background and requirements for the outlier detection system are presented. The outlier detection algorithm and the follow-up procedures for score validation and appeals are described in detail. Finally, the impact of various elements of the outlier detection algorithm, their interactions, and the sensitivity of their numerical values are investigated. It is shown that the difference in the mean score produced by a grading jury before and after a suspected outlier is removed from the mean is the single most effective criterion for identifying potential outliers, but all of the criteria included in the algorithm have an effect on the outlier detection process.
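The most effective criterion named above, the shift in the jury mean caused by removing a single score, is easy to sketch; the threshold value below is an arbitrary illustration choice, not the paper's calibrated value:

```python
def jury_mean_shifts(scores):
    """Shift in the jury mean caused by removing each individual score."""
    n, total = len(scores), sum(scores)
    mean_all = total / n
    return [abs(mean_all - (total - s) / (n - 1)) for s in scores]

def flag_outliers(scores, threshold=1.0):
    """Indices of scores whose removal moves the mean by more than threshold."""
    return [i for i, d in enumerate(jury_mean_shifts(scores)) if d > threshold]
```

For a jury scoring [7, 8, 7, 8, 2], only the score of 2 moves the mean by more than one point when removed, so only it is flagged for validation and possible appeal.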
Research on Life Signals Detection Based on Higher Order Statistics
Jian-Jun Li
2012-10-01
Full Text Available Life signals are modeled as harmonics because of their low frequency, quasi-periodicity, low SNR, and tendency to be submerged in strong clutter noise. A method for detecting life signals based on adaptive filtering and higher order statistics is presented, in which neither a Gaussian assumption about the observed signal nor prior information about its waveform and arrival time is necessary. The principle of the method is to separate the spectrum of the input signal into many narrow frequency bands; each sub-band signal is then subjected to a short-time estimation of higher-order statistics so as to suppress Gaussian noise. Simulated results show that the method can effectively detect life signals from noise with good convergence speed and stability, and greatly improves signal quality with respect to the LMS method.
Detection of a diffusive cloak via second-order statistics
Koirala, Milan
2016-01-01
We propose a scheme to detect the diffusive cloak proposed by Schittny et al [Science 345, 427 (2014)]. We exploit the fact that diffusion of light is an approximation that disregards wave interference. The long-range contribution to the intensity correlation is sensitive to the locations of path crossings and to interference inside the medium, allowing one to detect the size and position, including the depth, of the diffusive cloak. Our results also suggest that it is possible to separately manipulate the first- and second-order statistics of wave propagation in turbid media.
Statistically normalized coherent change detection for synthetic aperture sonar imagery
G-Michael, Tesfaye; Tucker, J. D.; Roberts, Rodney G.
2016-05-01
Coherent Change Detection (CCD) is a process of highlighting an area of activity in scenes (seafloor) under survey, generated from pairs of synthetic aperture sonar (SAS) images of approximately the same location observed at two different time instances. The problem of CCD and subsequent anomaly feature extraction/detection is complicated by several factors, such as the presence of random speckle pattern in the images, changing environmental conditions, and platform instabilities. These complications make the detection of weak target activities even more difficult. Typically, the degree of similarity between the two images, measured at each pixel location, is the coherence between the complex pixel values in the two images. Higher coherence indicates little change in the scene represented by the pixel and lower coherence indicates change activity in the scene. Such a coherence estimation scheme based on pixel intensity correlation is an ad hoc procedure in which the effectiveness of the change detection is determined by the choice of threshold, which can lead to high false alarm rates. In this paper, we propose a novel approach for anomalous change pattern detection using statistically normalized coherence and multi-pass coherent processing. This method may be used to mitigate shadows by reducing the false alarms that arise in the coherence map due to speckle and shadows. Test results of the proposed methods on a data set of SAS images will be presented, illustrating the effectiveness of the normalized coherence in terms of statistics from a multi-pass survey of the same scene.
Navarro-Gonzalex, Rafael; Sutter, Brad; Archer, Doug; Ming, Doug; Eigenbrode, Jennifer; Franz, Heather; Glavin, Daniel; McAdam, Amy; Stern, Jennifer; McKay, Christopher; Coll, Patrice; Cabane, Michel; Mahaffy, Paul; Conrad, Pamela; Martin-Torres, Francisco; Zorzano-Mier, Maria; Grotzinger, John
2013-01-01
The first chemical analysis of soluble salts in the soil was carried out by the Phoenix Lander in the Martian Arctic [1]. Surprisingly, chlorine was present as magnesium or calcium perchlorate at 0.4 to 0.6 percent. Additional support for the identification of perchlorate came from the evolved gas analysis which detected the release of molecular oxygen at 350-550C [1]. When Mars-like soils from the Atacama Desert were spiked with magnesium perchlorate (1 percent) and heated using the Viking GC-MS protocol, nearly all the organics were combusted but a small amount was chlorinated, forming chloromethane and dichloromethane [2]. These chlorohydrocarbons were detected by the Viking GC-MS experiments when the Martian soil was analyzed but they were considered to be terrestrial contaminants [3]. Reinterpretation of the Viking results suggests trichloromethane, and chloromethylpropene) detected both by SAM QMS and GC-MS derived from known Earth organic contaminants in the instrument [6]. Calcium perchlorate appears to be the best candidate for evolved O2 in the Rocknest samples at this time but other Cl species (e.g., chlorates) are possible and must be evaluated. The potential detection of perchlorates in Rocknest material adds weight to the argument that both Viking Landers measured signatures of perchlorates. Even if the source of the organic carbon detected is still unknown, the chlorine source was likely Martian. Two mechanisms have been hypothesized for the formation of soil perchlorate: (1) Atmospheric oxidation of chlorine; and (2) UV photooxidation of chlorides catalyzed by mineral catalysts [7]. The presence of soil perchlorates in the Martian surface has important implications for the detection of organics [2], carbonates [8] and nitrates [9] by SAM.
Statistical methods for detecting differentially methylated loci and regions
Mark D Robinson
2014-09-01
Full Text Available DNA methylation, the reversible addition of methyl groups at CpG dinucleotides, represents an important regulatory layer associated with gene expression. Changed methylation status has been noted across diverse pathological states, including cancer. The rapid development and uptake of microarrays and large scale DNA sequencing has prompted an explosion of data analytic methods for processing and discovering changes in DNA methylation across varied data types. In this mini-review, we present a compact and accessible discussion of many of the salient challenges, such as experimental design, statistical methods for differential methylation detection, critical considerations such as cell type composition and the potential confounding that can arise from batch effects. From a statistical perspective, our main interests include the use of empirical Bayes or hierarchical models, which have proved immensely powerful in genomics, and the procedures by which false discovery control is achieved.
Robust Statistical Detection of Power-Law Cross-Correlation
Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert
2016-06-01
We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.
A new statistical approach to climate change detection and attribution
Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe
2017-01-01
We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90 % confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).
Statistical method for detecting structural change in the growth process.
Ninomiya, Yoshiyuki; Yoshimoto, Atsushi
2008-03-01
Due to competition among individual trees and other exogenous factors that change the growth environment, each tree grows following its own growth trend with some structural changes in growth over time. In the present article, a new method is proposed to detect a structural change in the growth process. We formulate the method as a simple statistical test for signal detection without constructing any specific model for the structural change. To evaluate the p-value of the test, the tube method is developed because the regular distribution theory is insufficient. Using two sets of tree diameter growth data sampled from planted forest stands of Cryptomeria japonica in Japan, we conduct an analysis of identifying the effect of thinning on the growth process as a structural change. Our results demonstrate that the proposed method is useful to identify the structural change caused by thinning. We also provide the properties of the method in terms of the size and power of the test.
Photon statistics measurement by use of single photon detection
XIAO Liantuan; JIANG Yuqiang; ZHAO Yanting; YIN Wangbao; ZHAO Jianming; JIA Suotang
2004-01-01
The direct measurement of the Mandel parameter of weak laser pulses, with 10 ns pulse duration and a mean number of photons per pulse of approximately 0.1, is investigated by recording every photocount event. With the Hanbury Brown and Twiss detection scheme, and not more than one photon per pulse being detected during the sample time by single-photon counters, we have found that a single mode diode laser with driving current lower than the threshold yields sub-Poissonian statistics. In addition, when the diode laser driving current is much higher than the threshold, it is validated that the Mandel parameter QC of the Poissonian coherent state is nearly zero. The experimental results are in good agreement with theoretical prediction considering the measurement error.
Detection of mumps virus genotype H in two previously vaccinated patients from Mexico City.
Del Valle, Alberto; García, Alí A; Barrón, Blanca L
2016-06-01
Infections caused by mumps virus (MuV) have been successfully prevented through vaccination; however, in recent years, an increasing number of mumps outbreaks have been reported within vaccinated populations. In this study, MuV was genotyped for the first time in Mexico. Saliva samples were obtained from two previously vaccinated patients in Mexico City who had developed parotitis. Viral isolation was carried out in Vero cells, and the SH and HN genes were amplified by RT-PCR. Amplicons were sequenced and compared to a set of reference sequences to identify the MuV genotype.
Detection of hydrogeochemical seismic precursors by a statistical learning model
L. Castellana
2008-11-01
Full Text Available The problem of detecting the occurrence of an earthquake precursor is faced in the general framework of statistical learning theory. The aim of this work is both to build models able to detect seismic precursors from time series of different geochemical signals and to provide an estimate of the number of false positives. The model we used is a k-Nearest-Neighbor classifier for discriminating "no-disturbed signal", "seismic precursor" and "co-post seismic precursor" in time series of thirteen different hydrogeochemical parameters collected in water samples from a natural spring on the Kamchatka peninsula (Russia). The measurements collected are ion content (Na, Cl, Ca, HCO_{3}, H_{3}BO_{3}), parameters (pH, Q, T) and gases (N_{2}, CO_{2}, CH_{4}, O_{2}, Ag). The classification error is measured by a Leave-K-Out Cross-Validation procedure. Our study shows that the most discriminative ions for detecting seismic precursors are Cl and Na, with an error rate of 15%. Moreover, the most discriminative parameter and gas are Q and CH_{4} respectively, with an error rate of 21%. The ions are the most informative hydrogeochemicals for detecting seismic precursors due to the peculiarities of the mechanisms involved in earthquake preparation. Finally we show that information collected some months before the event under analysis is necessary to improve the classification accuracy.
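The evaluation loop described above, a k-Nearest-Neighbor classifier scored by leave-out cross-validation, can be sketched as follows (leave-one-out shown for brevity; the toy two-cluster data is an assumption for the example):

```python
import numpy as np

def knn_loo_error(X, y, k=3):
    """Leave-one-out misclassification rate of a k-nearest-neighbour classifier."""
    X, y = np.asarray(X, float), np.asarray(y)
    errors = 0
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # leave sample i out of its own neighbours
        nearest = y[np.argsort(d)[:k]]
        labels, counts = np.unique(nearest, return_counts=True)
        errors += labels[np.argmax(counts)] != y[i]   # majority vote vs. true label
    return errors / len(X)
```

Running this per hydrogeochemical parameter and comparing error rates is how the most discriminative signals (Cl and Na in the study) would be ranked.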
RFI detection by automated feature extraction and statistical analysis
Winkel, B.; Kerp, J.; Stanko, S.
2007-01-01
In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4σ_rms level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the astronomical line emission of the Milky Way, (2) interferences are polarised, (3) electronic devices in the neighbourhood of the telescope contribute significantly to the RFI radiation. We also show that the radiometer equation is no longer fulfilled in presence of RFI signals.
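The baseline-fit-and-threshold idea can be sketched in one dimension (the paper fits in the full time-frequency plane); the polynomial degree, 4σ threshold, and synthetic spectrum with one injected spike are all assumptions for the example:

```python
import numpy as np

def flag_rfi(spectrum, deg=3, nsigma=4.0):
    """Fit a polynomial baseline and flag channels deviating by > nsigma * rms."""
    ch = np.arange(len(spectrum))
    baseline = np.polyval(np.polyfit(ch, spectrum, deg), ch)
    resid = spectrum - baseline
    return np.abs(resid) > nsigma * resid.std()

rng = np.random.default_rng(2)
spec = 0.001 * (np.arange(256) - 128.0) ** 2 + rng.standard_normal(256)  # smooth band + noise
spec[50] += 30.0                 # injected narrow-band interference
mask = flag_rfi(spec)
```

The narrow spike stands far above the fitted baseline and is flagged, while the smooth instrumental bandpass itself is absorbed into the fit and left untouched.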
RFI detection by automated feature extraction and statistical analysis
Winkel, B; Stanko, S; Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan
2006-01-01
In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4-sigma level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the a...
Metamorphic Virus Detection in Portable Executables Using Opcodes Statistical Feature
Rad, Babak Bashari
2011-01-01
Metamorphic viruses employ different mutation techniques to escape string-signature-based scanning. They try to change their code in new offspring so that the variants appear non-similar and have no common string sequences usable as a signature. However, all versions of a metamorphic virus perform a similar task. This obfuscation process helps keep them safe from string-based signature detection. In this study, we make use of instruction statistical features to compare the similarity of two hosted files probably occupied by two mutated forms of a specific metamorphic virus. The solution introduced in this paper relies on static analysis and employs the frequency histogram of machine opcodes in different instances of obfuscated viruses. We use Minkowski-form histogram distance measurements in order to check the likeness of portable executables (PE). The purpose of this research is to present an idea that for a number of special obfuscation approaches the presented solution can be used to i...
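The opcode-histogram comparison can be illustrated with a minimal sketch. The opcode vocabulary and the toy "variants" below are invented; the real method operates on opcodes disassembled from PE files.

```python
from collections import Counter

def opcode_histogram(opcodes, vocabulary):
    """Normalized frequency histogram of opcodes over a fixed vocabulary."""
    counts = Counter(opcodes)
    total = sum(counts.values()) or 1
    return [counts[op] / total for op in vocabulary]

def minkowski(h1, h2, p=1):
    """Minkowski-form distance between two histograms (p=1: Manhattan)."""
    return sum(abs(a - b) ** p for a, b in zip(h1, h2)) ** (1.0 / p)

VOCAB = ["mov", "push", "pop", "call", "jmp", "xor", "add"]
# Two hypothetical variants of one metamorphic virus: instructions are
# reordered and a junk "add" is inserted, but the opcode mix stays similar.
variant_a = ["mov", "push", "call", "mov", "pop", "jmp", "xor"]
variant_b = ["push", "mov", "mov", "call", "add", "pop", "jmp", "xor"]
benign    = ["call", "call", "call", "jmp", "jmp", "call", "call"]

d_same = minkowski(opcode_histogram(variant_a, VOCAB), opcode_histogram(variant_b, VOCAB))
d_diff = minkowski(opcode_histogram(variant_a, VOCAB), opcode_histogram(benign, VOCAB))
print(d_same < d_diff)  # → True
```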
Cosmology with phase statistics: parameter forecasts and detectability of BAO
Eggemeier, Alexander; Smith, Robert E.
2017-04-01
We consider an alternative to conventional three-point statistics such as the bispectrum, which is purely based on the Fourier phases of the density field: the line correlation function. This statistic directly probes the non-linear clustering regime and contains information highly complementary to that contained in the power spectrum. In this work, we determine, for the first time, its potential to constrain cosmological parameters and detect baryon acoustic oscillations (hereafter BAOs). We show how to compute the line correlation function for a discrete sampled set of tracers that follow a local Lagrangian biasing scheme and demonstrate how it breaks the degeneracy between the amplitude of density fluctuations and the bias parameters of the model. We then derive analytic expressions for its covariance and show that it can be written as a sum of a Gaussian piece plus non-Gaussian corrections. We compare our predictions with a large ensemble of N-body simulations and confirm that BAOs do indeed modulate the signal of the line correlation function for scales 50–100 h⁻¹ Mpc and that the characteristic S-shape feature would be detectable in upcoming Stage IV surveys at the level of ∼4σ. We then focus on the cosmological information content and compute Fisher forecasts for an idealized Stage III galaxy redshift survey of volume V ∼ 10 h⁻³ Gpc³ and out to z = 1. We show that combining the line correlation function with the galaxy power spectrum and a Planck-like microwave background survey yields improvements up to a factor of 2 for parameters such as σ8, b1 and b2, compared with using the two-point information alone.
Cosmology with Phase Statistics: Parameter Forecasts and Detectability of BAO
Eggemeier, Alexander
2016-01-01
We consider an alternative to conventional three-point statistics such as the bispectrum, which is purely based on the Fourier phases of the density field: the line correlation function. This statistic directly probes the non-linear clustering regime and contains information highly complementary to that contained in the power spectrum. In this work, we determine, for the first time, its potential to constrain cosmological parameters and detect baryon acoustic oscillations (hereafter BAOs). We show how to compute the line correlation function for a discrete sampled set of tracers that follow a local Lagrangian biasing scheme and demonstrate how it breaks the degeneracy between the amplitude of density fluctuations and the bias parameters of the model. We then derive analytic expressions for its covariance and show that it can be written as a sum of a Gaussian piece plus non-Gaussian corrections. We compare our predictions with a large ensemble of $N$-body simulations and confirm that BAOs do indeed modulate th...
Statistical methods for the detection and analysis of radioactive sources
Klumpp, John
We consider four topics in the statistical analysis of radioactive sources in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
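Sequential Bayesian updating of a count-rate distribution has a particularly compact conjugate form when the prior is a Gamma distribution, which may serve as a minimal illustration. The prior parameters and measurements below are invented; the study's actual system works with time-interval and energy data.

```python
# Gamma(alpha, beta) prior on the rate λ (counts/s); observing n counts in
# time T gives a Gamma(alpha + n, beta + T) posterior (conjugate update).
def update(alpha, beta, n, T):
    return alpha + n, beta + T

def posterior_mean(alpha, beta):
    return alpha / beta

alpha, beta = 1.0, 1.0          # weakly informative prior, mean 1 count/s
for n, T in [(12, 10.0), (9, 10.0), (14, 10.0)]:   # sequential measurements
    alpha, beta = update(alpha, beta, n, T)
print(round(posterior_mean(alpha, beta), 3))  # → 1.161
```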
Statistical Analysis of Data with Non-Detectable Values
Frome, E.L.
2004-08-26
Environmental exposure measurements are, in general, positive and may be subject to left censoring, i.e. the measured value is less than a "limit of detection". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. A basic problem of interest in environmental risk assessment is to determine if the mean concentration of an analyte is less than a prescribed action level. Parametric methods, used to determine acceptable levels of exposure, are often based on a two-parameter lognormal distribution. The mean exposure level and/or an upper percentile (e.g. the 95th percentile) are used to characterize exposure levels, and upper confidence limits are needed to describe the uncertainty in these estimates. In certain situations it is of interest to estimate the probability of observing a future (or "missed") value of a lognormal variable. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left censored lognormal data are described and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on the 95th percentile (i.e. the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known but computational complexity has limited their use in routine data analysis with left censored data. The recent development of the R environment for statistical
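The maximum likelihood approach for left-censored lognormal data can be sketched as follows: detected values contribute the log density of their log-value, while each non-detect contributes the log probability of falling below the detection limit. This is an illustrative sketch with a single detection limit and synthetic data, not the report's implementation.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def censored_lognormal_mle(values, detected):
    """MLE of (mu, sigma) of the log-values; censored entries contribute
    the log CDF at the detection limit instead of the log PDF."""
    x = np.log(values)
    det = np.asarray(detected, dtype=bool)

    def nll(theta):
        mu, log_sigma = theta
        s = np.exp(log_sigma)            # keep sigma positive
        ll = norm.logpdf(x[det], mu, s).sum() + norm.logcdf(x[~det], mu, s).sum()
        return -ll

    res = minimize(nll, x0=[x.mean(), 0.0], method="Nelder-Mead")
    return res.x[0], np.exp(res.x[1])

rng = np.random.default_rng(1)
true = rng.lognormal(mean=0.5, sigma=0.8, size=400)
lod = 1.0                                # detection limit (invented)
values = np.where(true < lod, lod, true) # non-detects reported as "<LOD"
detected = true >= lod
mu, s = censored_lognormal_mle(values, detected)
print(round(mu, 2), round(s, 2))         # should be near the true (0.5, 0.8)
```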
THE DETECTION AND STATISTICS OF GIANT ARCS BEHIND CLASH CLUSTERS
Xu, Bingxiao; Zheng, Wei [Department of Physics and Astronomy, The Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Postman, Marc; Bradley, Larry [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21208 (United States); Meneghetti, Massimo; Koekemoer, Anton [INAF, Osservatorio Astronomico di Bologna, and INFN, Sezione di Bologna, Via Ranzani 1, I-40127 Bologna (Italy); Seitz, Stella [Universitaets-Sternwarte, Fakultaet fuer Physik, Ludwig-Maximilians Universitaet Muenchen, Scheinerstr. 1, D-81679 Muenchen (Germany); Zitrin, Adi [California Institute of Technology, MC 249-17, Pasadena, CA 91125 (United States); Merten, Julian [University of Oxford, Department of Physics, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom); Maoz, Dani [School of Physics and Astronomy, Tel Aviv University, Tel-Aviv 69978 (Israel); Frye, Brenda [Steward Observatory/Department of Astronomy, University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States); Umetsu, Keiichi [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 10617, Taiwan (China); Vega, Jesus, E-mail: bxu6@jhu.edu [Universidad Autonoma de Madrid, Ciudad Universitaria de Cantoblanco, E-28049 Madrid (Spain)
2016-02-01
We developed an algorithm to find and characterize gravitationally lensed galaxies (arcs) to perform a comparison of the observed and simulated arc abundance. Observations are from the Cluster Lensing And Supernova survey with Hubble (CLASH). Simulated CLASH images are created using the MOKA package and also clusters selected from the high-resolution, hydrodynamical simulations, MUSIC, over the same mass and redshift range as the CLASH sample. The algorithm's arc elongation accuracy, completeness, and false positive rate are determined and used to compute an estimate of the true arc abundance. We derive a lensing efficiency of 4 ± 1 arcs (with length ≥6″ and length-to-width ratio ≥7) per cluster for the X-ray-selected CLASH sample, 4 ± 1 arcs per cluster for the MOKA-simulated sample, and 3 ± 1 arcs per cluster for the MUSIC-simulated sample. The observed and simulated arc statistics are in full agreement. We measure the photometric redshifts of all detected arcs and find a median redshift z{sub s} = 1.9 with 33% of the detected arcs having z{sub s} > 3. We find that the arc abundance does not depend strongly on the source redshift distribution but is sensitive to the mass distribution of the dark matter halos (e.g., the c–M relation). Our results show that consistency between the observed and simulated distributions of lensed arc sizes and axial ratios can be achieved by using cluster-lensing simulations that are carefully matched to the selection criteria used in the observations.
New Image Statistics for Detecting Disturbed Galaxy Morphologies at High Redshift
Freeman, P E; Lee, A B; Newman, J A; Conselice, C J; Koekemoer, A M; Lotz, J M; Mozena, M
2013-01-01
Testing theories of hierarchical structure formation requires estimating the distribution of galaxy morphologies and its change with redshift. One aspect of this investigation involves identifying galaxies with disturbed morphologies (e.g., merging galaxies). This is often done by summarizing galaxy images using, e.g., the CAS and Gini-M20 statistics of Conselice (2003) and Lotz et al. (2004), respectively, and associating particular statistic values with disturbance. We introduce three statistics that enhance detection of disturbed morphologies at high-redshift (z ~ 2): the multi-mode (M), intensity (I), and deviation (D) statistics. We show their effectiveness by training a machine-learning classifier, random forest, using 1,639 galaxies observed in the H band by the Hubble Space Telescope WFC3, galaxies that had been previously classified by eye by the CANDELS collaboration (Grogin et al. 2011, Koekemoer et al. 2011). We find that the MID statistics (and the A statistic of Conselice 2003) are the most usef...
The Detection and Statistics of Giant Arcs Behind CLASH Clusters
Xu, Bingxiao; Meneghetti, Massimo; Seitz, Stella; Zitrin, Adi; Merten, Julian; Maoz, Dani; Frye, Brenda; Umetsu, Keiichi; Zheng, Wei; Bradley, Larry; Vega, Jesus; Koekemoer, Anton
2015-01-01
We developed an algorithm to find and characterize gravitationally lensed galaxies (arcs) to perform a comparison of the observed and simulated arc abundance. Observations are from the Cluster Lensing And Supernova survey with Hubble (CLASH). Simulated CLASH images are created using the MOKA package and also clusters selected from the high resolution, hydrodynamical simulations, MUSIC, over the same mass and redshift range as the CLASH sample. The algorithm's arc elongation accuracy, completeness and false positive rate are determined and used to compute an estimate of the true arc abundance. We derive a lensing efficiency of 4 ± 1 arcs (with length ≥6″ and length-to-width ratio ≥7) per cluster for the X-ray selected CLASH sample, 4 ± 1 arcs per cluster for the MOKA simulated sample and 3 ± 1 arcs per cluster for the MUSIC simulated sample. The observed and simulated arc statistics are in full agreement. We measure the photometric redshifts of all detected arcs and find a median redshif...
Metamorphic Virus Detection in Portable Executables Using Opcodes Statistical Feature
Babak Bashari Rad
2011-01-01
Metamorphic viruses employ different mutation techniques to escape string-signature-based scanning. They try to change their code in new offspring so that the variants appear non-similar and have no common string sequences usable as a signature. However, all versions of a metamorphic virus perform a similar task. This obfuscation process helps keep them safe from string-based signature detection. In this study, we make use of instruction statistical features to compare the similarity of two hosted files probably occupied by two mutated forms of a specific metamorphic virus. The solution introduced in this paper relies on static analysis and employs the frequency histogram of machine opcodes in different instances of obfuscated viruses. We use Minkowski-form histogram distance measurements in order to check the likeness of portable executables (PE). The purpose of this research is to present an idea that for a number of special obfuscation approaches the presented solution can be used to identify morphed copies of a file. Thus, it can be applied by an antivirus scanner to recognize different versions of a metamorphic virus.
Statistical Mechanics of the Community Detection Problem: Theory and Application
Hu, Dandan
We study phase transitions in spin glass type systems and in related computational problems. In the current work, we focus on the "community detection" problem when cast in terms of a general Potts spin glass type problem. We report on phase transitions between solvable and unsolvable regimes. The solvable region may further split into easy and hard phases. Spin glass type phase transitions appear at both low and high temperatures. Low-temperature transitions correspond to an order-by-disorder type effect wherein fluctuations render the system ordered or solvable. Separate transitions appear at higher temperatures into a disordered (or unsolvable) phase. Different sorts of randomness lead to disparate behaviors. We illustrate the spin glass character of both transitions and report on memory effects. We further relate Potts type spin systems to mechanical analogs and suggest how chaotic-type behavior in general thermodynamic systems can indeed naturally arise in hard computational problems and spin glasses. In this work, we also examine large networks (with a power law distribution in cluster size) that have a large number of communities. We infer that large systems at a constant ratio of q to the number of nodes N asymptotically tend toward insolvability in the limit of large N for any positive temperature. We further employ multivariate Tutte polynomials to show that increasing q emulates increasing T for a general Potts model, leading to a similar stability region at low T. We further apply the replica-inference-based Potts model method to unsupervised image segmentation on multiple scales. This approach was inspired by the statistical mechanics problem of "community detection" and its phase diagram. The problem is cast as identifying tightly bound clusters against a background. Within our multiresolution approach, we compute information theory based correlations among multiple solutions of the same graph over a range of resolutions. Significant multiresolution
A statistical method for the detection of alternative splicing using RNA-seq.
Liguo Wang
Deep sequencing of the transcriptome (RNA-seq) provides an unprecedented opportunity to interrogate plausible mRNA splicing patterns by mapping RNA-seq reads to exon junctions (hereafter, junction reads). In most previous studies, exon junctions were detected by using the quantitative information of junction reads. The quantitative criterion (e.g. a minimum of two junction reads), although straightforward and widely used, usually results in high false positive and false negative rates, owing to the complexity of the transcriptome. Here, we introduce a new metric, namely Minimal Match on Either Side of exon junction (MMES), to measure the quality of each junction read, and subsequently implement an empirical statistical model to detect exon junctions. When applied to a large dataset (>200M reads) consisting of mouse brain, liver and muscle mRNA sequences, and using independent transcript databases as positive control, our method proved to be considerably more accurate than previous ones, especially for detecting junctions originating from low-abundance transcripts. Our results were also confirmed by real-time RT-PCR assay. The MMES metric can be used either in this empirical statistical model or in other more sophisticated classifiers, such as logistic regression.
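The MMES metric itself is simple to state: for a read spanning an exon junction, it is the length of the shorter of the two aligned segments. A minimal sketch follows; the coordinate convention is an assumption for illustration, not taken from the paper.

```python
def mmes(read_start, read_end, junction_pos):
    """Minimal Match on Either Side: the shorter of the two segments of a
    read that spans an exon junction. Larger values indicate a more
    reliable junction read. Coordinates are on the spliced transcript,
    half-open [read_start, read_end)."""
    left = junction_pos - read_start
    right = read_end - junction_pos
    if left <= 0 or right <= 0:
        return 0          # read does not actually span the junction
    return min(left, right)

# A 36-bp read with the junction 30 bp in leaves only 6 matched bases on
# the right side, so a spurious alignment is more likely than for a
# centered read with 18 bases on each side.
print(mmes(0, 36, 30), mmes(0, 36, 18))  # → 6 18
```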
Configurational Statistics of Magnetic Bead Detection with Magnetoresistive Sensors
Henriksen, Anders Dahl; Ley, Mikkel Wennemoes Hvitfeld; Flyvbjerg, Henrik
2015-01-01
…and precision with which the coverage can be determined from a single sensor measurement. We show that statistical fluctuations between samples may reduce the sensitivity and dynamic range of a sensor significantly when the functionalized area is larger than the sensor area. Hence, the statistics of sampling is essential to sensor design. For illustration, we analyze three important published cases for which statistical fluctuations are dominant, significant, and insignificant, respectively.
Saavedra, Miguel; Zulantay, Inés; Apt, Werner; Martínez, Gabriela; Rojas, Antonio; Rodríguez, Jorge
2013-01-01
We evaluate the elimination of the microscopic stage of conventional xenodiagnosis (XD) to optimize the parasitological diagnosis of Trypanosoma cruzi in chronic Chagas disease. For this purpose we applied, under informed consent, two XD cages to 150 Chilean chronic chagasic patients. The fecal samples (FS) of the triatomines at 30, 60 and 90 days post-feeding were divided into two parts: in one, a microscopic search for mobile trypomastigote and/or epimastigote forms was performed. In the other, DNA extraction-purification for PCR directed to the conserved region of kDNA minicircles of trypanosomes (PCR-XD) was done, without previous microscopic observation. An XD was considered positive when at least one mobile T. cruzi parasite was observed in any one of the three incubation periods, whereas PCR-XD was considered positive when the 330 bp band specific for T. cruzi was detected. 25 of 26 cases with positive conventional XD were PCR-XD positive (concordance 96.2%), whereas 85 of 124 cases with negative conventional XD were positive by PCR-XD (68.5%). Human chromosome 12, detected by real-time PCR and used as an exogenous internal control of the PCR-XD reaction, allowed discounting of PCR inhibition and false negatives in 40 cases with negative PCR-XD. PCR-XD performed without previous microscopic observation is a useful tool for the detection of viable parasites, with higher efficiency than conventional XD.
Lee, L.; Helsel, D.
2005-01-01
Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
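A stripped-down version of ROS for a single detection limit below all detected values might look like the following: regress the log of the detected values on normal quantiles at their plotting positions, then impute the censored observations from the fitted line. The full method in the cited R library handles multiple detection limits; the Blom plotting-position formula and the synthetic data here are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def simple_ros(detected_vals, n_censored):
    """Simplified robust ROS: single detection limit below all detects."""
    n = len(detected_vals) + n_censored
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)   # Blom plotting positions
    q = norm.ppf(pp)                                   # normal quantiles
    y = np.log(np.sort(detected_vals))
    # Censored values occupy the lowest n_censored plotting positions.
    slope, intercept = np.polyfit(q[n_censored:], y, 1)
    imputed = np.exp(intercept + slope * q[:n_censored])
    full = np.concatenate([imputed, np.sort(detected_vals)])
    return full.mean(), full.std(ddof=1)

rng = np.random.default_rng(2)
sample = rng.lognormal(0.0, 1.0, 100)          # true values (some unobservable)
lod = np.exp(-1.0)                             # detection limit (invented)
detected = sample[sample >= lod]
mean, sd = simple_ros(detected, n_censored=int((sample < lod).sum()))
print(round(mean, 2))                          # near the lognormal mean exp(0.5)
```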
Statistical Algorithm for the Adaptation of Detection Thresholds
Stotsky, Alexander A.
2008-01-01
Many event detection mechanisms in spark ignition automotive engines are based on the comparison of the engine signals to the detection threshold values. Different signal qualities for new and aged engines necessitate the development of an adaptation algorithm for the detection thresholds...
Quantile regression for the statistical analysis of immunological data with many non-detects
Eilers, P.H.C.; Roder, E.; Savelkoul, H.F.J.; Wijk, van R.G.
2012-01-01
Background: Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techni
Quantile regression for the statistical analysis of immunological data with many non-detects
Eilers Paul HC; Röder Esther; Savelkoul Huub FJ; van Wijk Roy
2012-01-01
Background: Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Methods and results: Quantile regression, a genera...
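Quantile regression itself can be sketched by direct minimization of the pinball (check) loss; the appeal for non-detects is that an upper quantile of a linear model can be recovered even when many low values are censored at the detection limit. The data below are synthetic and the optimizer choice is an illustrative assumption; the paper analyzes real immunological datasets.

```python
import numpy as np
from scipy.optimize import minimize

def quantile_regression(x, y, tau):
    """Fit y ≈ a + b*x at quantile tau by minimizing the pinball loss."""
    def pinball(theta):
        r = y - (theta[0] + theta[1] * x)
        return np.sum(np.maximum(tau * r, (tau - 1) * r))
    return minimize(pinball, x0=[np.median(y), 0.0], method="Nelder-Mead").x

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 300)
y = 2.0 + 0.5 * x + rng.standard_normal(300)   # true slope 0.5
lod = 2.0
y_obs = np.maximum(y, lod)     # non-detects reported at the detection limit
a, b = quantile_regression(x, y_obs, tau=0.75)
print(round(b, 2))             # slope recovered from the upper quantile
```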
Statistical Anomaly Detection for Monitoring of Human Dynamics
Kamiya, K.; Fuse, T.
2015-05-01
Understanding of human dynamics has drawn attention in various areas. Due to the widespread adoption of positioning technologies that use GPS or public Wi-Fi, location information can be obtained with high spatio-temporal resolution as well as at low cost. By collecting sets of individual location information in real time, monitoring of human dynamics is now considered possible and is expected to lead to dynamic traffic control in the future. Although this monitoring focuses on detecting anomalous states of human dynamics, anomaly detection methods have been developed ad hoc and are not fully systematized. This research aims to define an anomaly detection problem for human dynamics monitoring with gridded population data and to develop an anomaly detection method based on that definition. According to the results of a comprehensive review we conducted, we discuss the characteristics of anomaly detection for human dynamics monitoring and categorize our problem as a semi-supervised anomaly detection problem that detects contextual anomalies behind time-series data. We developed an anomaly detection method based on a sticky HDP-HMM, which is able to estimate the number of hidden states according to input data. Results of an experiment with synthetic data showed that the proposed method has good fundamental performance with respect to the detection rate. Through an experiment with real gridded population data, an anomaly was detected when and where an actual social event had occurred.
Active Fault Detection Based on a Statistical Test
Sekunda, André Krabdrup; Niemann, Hans Henrik; Poulsen, Niels Kjølstad
2016-01-01
In this paper active fault detection of closed-loop systems using dual Youla-Jabr-Bongiorno-Kucera (YJBK) parameters is presented. Until now, all detector designs for active fault detection using the dual YJBK parameters have been based on CUSUM detectors. Here a method for the design of a matched filter...
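Although the paper moves beyond CUSUM detectors, the baseline CUSUM test it refers to is easy to sketch. The toy Gaussian data with an injected mean shift, and the drift and threshold values, are all invented for illustration.

```python
import random

def cusum(samples, target_mean, drift, threshold):
    """One-sided CUSUM: accumulate excursions of the signal above
    target_mean + drift and alarm when the sum crosses threshold.
    Returns the index of the first alarm, or None."""
    g = 0.0
    for i, x in enumerate(samples):
        g = max(0.0, g + (x - target_mean - drift))
        if g > threshold:
            return i
    return None

# Fault injected at sample 50: the mean steps from 0 to 1.
random.seed(4)
data = [random.gauss(0, 0.3) for _ in range(50)] + \
       [random.gauss(1, 0.3) for _ in range(30)]
print(cusum(data, target_mean=0.0, drift=0.5, threshold=3.0))
```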
Surface Electromyographic Onset Detection Based On Statistics and Information Content
López, Natalia M.; Orosco, Eugenio; di Sciascio, Fernando
2011-12-01
The correct detection of the onset of muscular contraction is a diagnostic tool for neuromuscular diseases and an action trigger for controlling myoelectric devices. In this work, entropy and information content concepts were applied in algorithmic methods for automatic onset detection in surface electromyographic signals.
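A minimal statistical onset detector of the kind alluded to, thresholding a moving RMS against baseline statistics, can be sketched as follows. The entropy/information-content refinement of the paper is not reproduced here; the window lengths, the factor k, and the synthetic signal are invented.

```python
import math, random

def detect_onset(signal, baseline_len=100, win=25, k=3.0):
    """Flag onset when the moving RMS exceeds the baseline windows'
    mean RMS plus k baseline standard deviations."""
    rms = lambda seg: math.sqrt(sum(s * s for s in seg) / len(seg))
    base_rms = [rms(signal[i:i + win]) for i in range(0, baseline_len - win)]
    mu = sum(base_rms) / len(base_rms)
    sd = math.sqrt(sum((r - mu) ** 2 for r in base_rms) / len(base_rms))
    for i in range(baseline_len, len(signal) - win):
        if rms(signal[i:i + win]) > mu + k * sd:
            return i                     # first window exceeding threshold
    return None

random.seed(5)
rest = [random.gauss(0, 0.05) for _ in range(200)]
burst = [random.gauss(0, 0.5) for _ in range(100)]  # contraction: amplitude jump
print(detect_onset(rest + burst))
```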
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Statistical guidelines for detecting past population shifts using ancient DNA
Mourier, Tobias; Ho, Simon; Gilbert, M Thomas P;
2012-01-01
…quantitative and temporal sampling schemes, we test the power of ancient mitochondrial sequences and nuclear single-nucleotide polymorphisms (SNPs) to detect past population bottlenecks. Within our simulated framework, mitochondrial sequences have only limited power to detect subtle bottlenecks and/or fast post-bottleneck recoveries. In contrast, nuclear SNPs can detect bottlenecks followed by rapid recovery, although bottlenecks involving reduction of less than half the population are generally detected with low power unless extensive genetic information from ancient individuals is available. Our results provide useful guidelines for scaling sampling schemes and for optimizing our ability to infer past population dynamics. In addition, our results suggest that many ancient DNA studies may face power issues in detecting moderate demographic collapses and/or highly dynamic demographic shifts when…
Joint multipartite photon statistics by on/off detection.
Brida, G; Genovese, M; Piacentini, F; Paris, Matteo G A
2006-12-01
We demonstrate a method to reconstruct the joint photon statistics of two or more modes of radiation by using on/off photodetection performed at different quantum efficiencies. The two-mode case is discussed in detail, and experimental results are presented for the bipartite states obtained after a beam splitter fed by a single photon state or a thermal state.
Extrasolar planets detections and statistics through gravitational microlensing
Cassan, A.
2014-10-01
Gravitational microlensing was proposed thirty years ago as a promising method to probe the existence and properties of compact objects in the Galaxy and its surroundings. The particularity and strength of the technique is based on the fact that the detection does not rely on the detection of the photon emission of the object itself, but on the way its mass affects the path of light of a background, almost aligned source. Detections thus include not only bright, but also dark objects. Today, the many successes of gravitational microlensing have largely exceeded the original promises. Microlensing contributed important results and breakthroughs in several astrophysical fields as it was used as a powerful tool to probe the Galactic structure (proper motions, extinction maps), to search for dark and compact massive objects in the halo and disk of the Milky Way, to probe the atmospheres of bulge red giant stars, to search for low-mass stars and brown dwarfs and to hunt for extrasolar planets. As an extrasolar planet detection method, microlensing nowadays stands in the top five of the successful observational techniques. Compared to other (complementary) detection methods, microlensing provides unique information on the population of exoplanets, because it allows the detection of very low-mass planets (down to the mass of the Earth) at large orbital distances from their star (0.5 to 10 AU). It is also the only technique that allows the discovery of planets at distances from Earth greater than a few kiloparsecs, up to the bulge of the Galaxy. Microlensing discoveries include the first ever detection of a cool super-Earth around an M-dwarf star, the detection of several cool Neptunes, Jupiters and super-Jupiters, as well as multi-planetary systems and brown dwarfs. So far, the least massive planet detected by microlensing has only three times the mass of the Earth and orbits a very low mass star at the edge of the brown dwarf regime. Several free-floating planetary
Drillstring Washout Diagnosis Using Friction Estimation and Statistical Change Detection
Willersrud, Anders; Blanke, Mogens; Imsland, Lars
2015-01-01
In oil and gas drilling, corrosion or tensile stress can give small holes in the drillstring, which can cause leakage and prevent sufficient flow of drilling fluid. If such washout remains undetected and develops, the consequence can be a complete twist-off of the drillstring. Aiming at early washout diagnosis, this paper employs an adaptive observer to estimate friction parameters in the nonlinear process. Non-Gaussian noise is a nuisance in the parameter estimates, and dedicated generalized likelihood tests are developed to make efficient washout detection with the multivariate t-distribution encountered in data. Change detection methods are developed using logged sensor data from a horizontal 1400 m managed pressure drilling test rig. Detection scheme design is conducted using probabilities for false alarm and detection to determine thresholds in hypothesis tests. A multivariate…
Gatt, Philip; Johnson, Steven; Nichols, Terry
2009-06-10
The performance of single and multielement Geiger-mode avalanche photodiode (GM-APD) devices is investigated as a function of the detector's reset or dead time. The theoretical results, developed herein, capture the effects of both quantum fluctuations and speckle noise and are shown to agree with Monte Carlo simulation measurements. First, a theory for the mean response or count rate to an arbitrary input flux is developed. The probability that the GM-APD is armed is shown to be the ratio of this mean response to the input flux. This arm probability, P(A), is then utilized to derive the signal photon detection efficiency (SPDE), which is the fraction of signal photons that are detected. The SPDE is a function of the input flux, the arm probability, and the dead time. When the dead time is zero, GM-APDs behave linearly, P(A) is unity, and the SPDE theory is simplified to the detector's effective quantum efficiency. When the dead time is long compared to the acquisition gate time, the theory converges to previously published "infinite" dead-time theories. The SPDE theory is then applied to develop other key ladar performance metrics, e.g., signal-to-noise ratio and detection statistics. The GM-APD detection statistics are shown to converge to that of a linear photon counting device when the combined signal and noise flux is much less than the reset rate. For higher flux levels, the SPDE degrades, due to a decreased arm probability, and the detection probability degrades relative to that of a linear device.
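The mean-response relation for the simplest nonparalyzable dead-time model can be checked with a small Monte Carlo experiment. This is a simplification of the paper's theory, which also covers speckle noise and gate effects; the rate and dead-time values are arbitrary, and quantum efficiency is folded into the incident rate.

```python
import random

def simulate_gm_apd(rate, dead_time, duration, seed=6):
    """Count detections of a Poisson photon stream by a detector that is
    blind for dead_time after each count (nonparalyzable model)."""
    random.seed(seed)
    t, counts, rearm = 0.0, 0, 0.0
    while t < duration:
        t += random.expovariate(rate)        # next photon arrival
        if t >= rearm:                       # detector is armed
            counts += 1
            rearm = t + dead_time
    return counts / duration                 # measured count rate

rate, tau = 1e6, 1e-6                        # 1 Mcps flux, 1 µs reset time
measured = simulate_gm_apd(rate, tau, duration=1.0)
theory = rate / (1 + rate * tau)             # nonparalyzable dead-time formula
print(round(measured / rate, 3), round(theory / rate, 3))  # arm probability ≈ 0.5
```

With the flux equal to the inverse reset time, the detector spends roughly half its time disarmed, so the measured rate is about half the incident rate, matching the ratio interpretation of P(A) in the abstract.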
Kim, Seung Ja (Dept. of Radiology, Seoul Metropolitan Government - Seoul National Univ. Boramae Medical Center, Seoul (Korea, Republic of)); Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min (Dept. of Radiology, Seoul National Univ. Hospital, Seoul (Korea, Republic of)), email: moonwk@snu.ac.kr
2012-05-15
Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Diagnosis of UAV Pitot Tube Defects Using Statistical Change Detection
Hansen, Søren; Blanke, Mogens; Adrian, Jens
2010-01-01
Unmanned Aerial Vehicles need a large degree of tolerance to faults. One of the most important steps towards this is the ability to detect and isolate faults in sensors and actuators in real time and take remedial actions to prevent faults from developing into failures. This paper analyses...... the possibilities of detecting faults in the pitot tube of a small unmanned aerial vehicle, a fault that easily causes a crash if not diagnosed and handled in time. Using as redundant information the velocity measured by an onboard GPS receiver, the air-speed estimated from engine throttle and the pitot tube...
Statistical methods for damage detection applied to civil structures
Gres, Szymon; Ulriksen, Martin Dalgaard; Döhler, Michael
2017-01-01
...and compared to the well-known subspace-based damage detection algorithm in the context of two large case studies. Both methods are implemented in the modal analysis and structural health monitoring software ARTeMIS, in which the joint features of the methods are concluded in a control chart. The performance of the two damage detection methods is similar, hereby implying merit of the new Mahalanobis distance-based approach, as it is less computationally complex. The fusion of the damage indicators in the control chart provides the most accurate view on the progressively damaged systems.
A Blind Detection Algorithm Utilizing Statistical Covariance in Cognitive Radio
Yingxue Li
2012-11-01
Because the expressions for performance parameters in most blind covariance detection algorithms are obtained by asymptotic methods, this paper presents a new blind detection algorithm using Cholesky factorization. Utilizing random matrix theory, we derive the performance parameters by a non-asymptotic method. The proposed method overcomes the noise uncertainty problem and performs well without any information about the channel, primary user or noise. Numerical simulation results demonstrate that the expressions for the performance parameters are correct and that the new detector outperforms the other blind covariance detectors.
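Blind detectors of this family threshold a statistic built from the sample covariance matrix. The sketch below illustrates the generic idea with the covariance absolute value statistic; it is not the paper's Cholesky-based detector, and the data and dimensions are invented.

```python
import random

def cav_statistic(samples):
    """Covariance absolute value (CAV) test statistic: total absolute
    sample covariance divided by its diagonal part.  Values near 1
    suggest white noise only; values clearly above 1 suggest a
    correlated (primary-user) signal.  `samples` is a list of
    equal-length observation vectors (rows = time, columns = dims)."""
    n, m = len(samples), len(samples[0])
    means = [sum(row[j] for row in samples) / n for j in range(m)]
    cov = [[sum((row[i] - means[i]) * (row[j] - means[j])
                for row in samples) / n
            for j in range(m)] for i in range(m)]
    total = sum(abs(cov[i][j]) for i in range(m) for j in range(m))
    diag = sum(abs(cov[i][i]) for i in range(m))
    return total / diag

rng = random.Random(0)
# Pure white noise: columns are independent.
noise = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(5000)]
# A common component across columns induces correlation.
signal = [[x + 0.8 * rng.gauss(0, 1) for _ in range(4)]
          for x in (rng.gauss(0, 1) for _ in range(5000))]
print(cav_statistic(noise))    # close to 1
print(cav_statistic(signal))   # clearly above 1 -> declare occupied
```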
Detecting Hidden Encrypted Volume Files via Statistical Analysis
Mario Piccinelli
2015-05-01
Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
Statistics by Example, Detecting Patterns, Teachers' Commentary and Solutions Manual.
Zelinka, Martha; Weisberg, Sanford
Part I of the teachers' guide for "Detecting Patterns" gives a brief description of the mathematical skills needed for the unit, lists the substantive areas touched on by the problems in the pamphlet, suggests classroom uses for the booklet, and gives background information on the individual chapters of the unit. Part II provides complete…
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
Reliable detection of directional couplings using rank statistics.
Chicharro, Daniel; Andrzejak, Ralph G
2009-08-01
To detect directional couplings from time series various measures based on distances in reconstructed state spaces were introduced. These measures can, however, be biased by asymmetries in the dynamics' structure, noise color, or noise level, which are ubiquitous in experimental signals. Using theoretical reasoning and results from model systems we identify the various sources of bias and show that most of them can be eliminated by an appropriate normalization. We furthermore diminish the remaining biases by introducing a measure based on ranks of distances. This rank-based measure outperforms existing distance-based measures concerning both sensitivity and specificity for directional couplings. Therefore, our findings are relevant for a reliable detection of directional couplings from experimental signals.
Detecting errors in micro and trace analysis by using statistics
Heydorn, K.
1993-01-01
By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said...... to results for chlorine in freshwater from BCR certification analyses by highly competent analytical laboratories in the EC. Titration showed systematic errors of several percent, while radiochemical neutron activation analysis produced results without detectable bias....
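The prediction step described above amounts to ordinary propagation of independent errors: the variances of the individual steps add, so the combined standard deviation is a root sum of squares. A minimal sketch (the step SDs are illustrative, not values from the study):

```python
import math

def predicted_sd(step_sds):
    """Predicted standard deviation of an analytical result whose
    independent steps contribute the given standard deviations:
    variances add, so the combined SD is the root sum of squares."""
    return math.sqrt(sum(s * s for s in step_sds))

# e.g. sampling and measurement steps with SDs of 0.3 and 0.4 units:
print(predicted_sd([0.3, 0.4]))  # combined SD of about 0.5
```

If the observed scatter of replicates clearly exceeds this prediction, the excess points to an unrecognised error source in one of the steps.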
A flexibly shaped space-time scan statistic for disease outbreak detection and monitoring
Tango Toshiro
2008-04-01
Background: Early detection of disease outbreaks enables public health officials to implement disease control and prevention measures at the earliest possible time. A time-periodic geographical disease surveillance system based on a cylindrical space-time scan statistic has been used extensively for disease surveillance along with the SaTScan software. In the purely spatial setting, many different methods have been proposed to detect spatial disease clusters. In particular, some spatial scan statistics are aimed at detecting irregularly shaped clusters which may not be detected by the circular spatial scan statistic. Results: Based on the flexible purely spatial scan statistic, we propose a flexibly shaped space-time scan statistic for early detection of disease outbreaks. The performance of the proposed space-time scan statistic is compared with that of the cylindrical scan statistic using benchmark data. In order to compare their performances, we have developed a space-time power distribution by extending the purely spatial bivariate power distribution. Daily syndromic surveillance data in Massachusetts, USA, are used to illustrate the proposed test statistic. Conclusion: The flexible space-time scan statistic is well suited for detecting and monitoring disease outbreaks in irregularly shaped areas.
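The purely temporal core of a cylindrical scan statistic can be sketched in a few lines: slide a window over the case series and score it with a Poisson log-likelihood ratio. The counts below are invented, and the flexible spatial component and Monte Carlo significance testing used in the paper are omitted.

```python
import math

def scan_llr(cases, baseline, win):
    """1-D temporal Poisson scan: slide a window of length `win` over
    daily case counts and return the 0-based window start with the
    highest log-likelihood ratio of elevated risk inside versus
    outside the window."""
    C, E = sum(cases), sum(baseline)
    best, best_start = 0.0, 0
    for s in range(len(cases) - win + 1):
        c = sum(cases[s:s + win])
        e = sum(baseline[s:s + win])
        if c / e > (C - c) / (E - e):   # risk elevated inside window
            llr = (c * math.log(c / e)
                   + (C - c) * math.log((C - c) / (E - e)))
            if llr > best:
                best, best_start = llr, s
        # significance would be assessed by Monte Carlo in practice
    return best_start, best

daily_cases    = [2, 3, 2, 2, 9, 11, 10, 2, 3, 2]
daily_baseline = [2.5] * 10          # expected counts per day
print(scan_llr(daily_cases, daily_baseline, win=3))  # flags days 4-6
```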
Martinez, Rafael; Rodriguez, Francisco de Borja; Camacho, David
2007-01-01
The main contribution of this paper is to design an Information Retrieval (IR) technique based on Algorithmic Information Theory (using the Normalized Compression Distance, NCD), statistical techniques (outliers), and a novel organization of the database structure. The paper shows how these can be integrated to retrieve information from generic databases using long (text-based) queries. Two important problems are analyzed. On the one hand, how to detect "false positives", cases in which the distance among documents is very low yet there is no actual similarity. On the other hand, we propose a way to structure a document database in which similarity-distance estimation depends on the length of the selected text. Finally, the experimental evaluations carried out to study these problems are presented.
Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA
ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he
2007-01-01
A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) plus ICA. KPCA spheres the data and makes the data structure as linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA then seeks projection directions in the KPCA-whitened space, making the distribution of the projected data as non-Gaussian as possible. Application to the simulated fluid catalytic cracking unit (FCCU) process indicates that the proposed monitoring method based on kernel ICA can effectively capture nonlinear relationships among process variables. Its performance significantly outperforms monitoring methods based on ICA or KPCA alone.
Quantile regression for the statistical analysis of immunological data with many non-detects
P.H.C. Eilers (Paul); E. Röder (Esther); H.F.J. Savelkoul (Huub); R. Gerth van Wijk (Roy)
2012-01-01
Background: Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced stati
Løkke, Anders; Ulrik, Charlotte Suppli; Dahl, Ronald;
2012-01-01
Background and Aim: Under-diagnosis of COPD is a widespread problem. This study aimed to identify previously undiagnosed cases of COPD in a high-risk population identified through general practice. Methods: Participating GPs (n = 241) recruited subjects with no previous diagnosis of lung disease,...
Li, Jia; Tseng, George C
2011-01-01
Global expression analyses using microarray technologies are becoming more common in genomic research; therefore, new statistical challenges associated with combining information from multiple studies must be addressed. In this paper we describe our proposal for an adaptively weighted (AW) statistic to combine multiple genomic studies for detecting differentially expressed genes. We also present results from comparisons of our proposed AW statistic to Fisher...
Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud
2015-12-01
Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.
Statistical methods for the detection of answer copying on achievement tests
Sotaridona, Leonardo Sitchirita
2003-01-01
This thesis contains a collection of studies where statistical methods for the detection of answer copying on achievement tests in multiple-choice format are proposed and investigated. Although all methods are suited to detect answer copying, each method is designed to address specific characteristi
Early pack-off diagnosis in drilling using an adaptive observer and statistical change detection
Willersrud, Anders; Imsland, Lars; Blanke, Mogens
2015-01-01
in the well. A model-based adaptive observer is used to estimate these friction parameters as well as flow rates. Detecting changes to these estimates can then be used for pack-off diagnosis, which due to measurement noise is done using statistical change detection. Isolation of incident type and location...
Hu, X.; Fan, M.; Mulder, J.; Frencken, J.E.
2016-01-01
PURPOSE: To compare the level of agreement between carious lesion assessments according to the visual clinical examination and the colour photograph methods. MATERIALS AND METHODS: Data on the presence of enamel/dentin carious lesions in previously sealed occlusal surfaces in first molars were
A DoS/DDoS Attack Detection System Using Chi-Square Statistic Approach
Fang-Yie Leu
2010-04-01
Nowadays, users can easily access and download network attack tools, which often provide friendly interfaces and easily operated features, from the Internet. Therefore, even a naive hacker can launch a large-scale DoS or DDoS attack to prevent a system, i.e., the victim, from providing Internet services. In this paper, we propose an agent-based intrusion detection architecture, a distributed detection system, to detect DoS/DDoS attacks by invoking a statistical approach that compares a source IP address's normal and current packet statistics to discriminate whether there is a DoS/DDoS attack. The system first collects all source IPs' packet statistics so as to create their normal packet distributions. Once some IPs' current packet distribution suddenly changes, very often it is an attack. Experimental results show that this approach can effectively detect DoS/DDoS attacks.
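The comparison of normal and current packet statistics can be sketched with a Pearson chi-square statistic. The packet categories, counts, and the implied alert threshold below are invented for illustration; the paper's agent architecture is not reproduced.

```python
def chi_square(observed, expected):
    """Pearson chi-square statistic between an observed packet-count
    distribution and the stored 'normal' profile for a source IP."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected)
               if e > 0)

# Hypothetical per-type packet counts (e.g. SYN, ACK, UDP, ICMP):
normal_profile = [500, 400, 80, 20]
current_ok     = [510, 390, 85, 15]
current_attack = [4000, 100, 50, 10]   # sudden SYN surge

print(chi_square(current_ok, normal_profile))      # small: normal
print(chi_square(current_attack, normal_profile))  # large: raise alert
```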
Statistical study of undulator radiated power by a classical detection system in the mm-wave regime
A. Eliran
2009-05-01
The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time have been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating at mm wavelengths. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data for evaluating the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.
Abootorabi Zarchi, Hossein; Jónsson, Ragnar Ingi; Blanke, Mogens
2009-01-01
...detection and hypothesis testing applied on activity sensor data. This paper enhances an earlier method by employing a fuzzy logic technique to classify oestrus alerts from a model-based detection method utilising the cyclic nature of oestrus. Based on the distribution of the trait period since last detected...
Uesaka, Karin; Maezawa, Masaki; Inokuma, Hisashi
2016-03-01
A serological survey of Borrelia infection of dogs was performed in Sapporo, Japan, where Borrelia garinii infection in dogs was detected in 2011. A total of 314 serum samples were collected from dogs that visited three animal hospitals in Sapporo from 2012 to 2014. The two-step evaluation method, involving screening ELISA followed by Western blot analysis, was used to detect antibodies against Borrelia species. A total of 34 samples were positive by ELISA. Among those 34 samples, 32 were positive for Borrelia spp. by Western blot. These findings suggest that the 32 dogs (10.2%) generated antibodies against Borrelia burgdorferi sensu lato, such as B. garinii or B. afzelii. Antibody positivity was 7.6% and 13.3% for dogs living in urban and rural areas, respectively. Dogs with a history of tick infestation showed a positive rate of 16.7%, which was higher, although not significantly, than the 6.7% among dogs without a history.
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.; Chilton, Lawrence
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
Rodríguez, Rogelio; Borràs, Antoni; Leal, Luz; Cerdà, Víctor; Ferrer, Laura
2016-03-10
An automatic system based on multisyringe flow injection analysis (MSFIA) and lab-on-valve (LOV) flow techniques for separation and pre-concentration of (226)Ra from drinking and natural water samples has been developed. The analytical protocol combines two different procedures, Ra adsorption on MnO2 and BaSO4 co-precipitation, achieving greater selectivity, especially in water samples with low radium levels. Radium is adsorbed on MnO2 deposited on macroporous bead cellulose. It is then eluted with hydroxylamine to transform insoluble MnO2 into soluble Mn(II), thus freeing the Ra, which is then coprecipitated with BaSO4. The (226)Ra can be directly detected in off-line mode using a low-background proportional counter (LBPC) or through a liquid scintillation counter (LSC) after performing an on-line coprecipitate dissolution. Thus, the versatility of the proposed system allows the radiometric detection technique to be selected depending on detector availability or the required response efficiency (sample number vs. response time and limit of detection). The MSFIA-LOV system improves the precision (1.7% RSD) and the extraction frequency (up to 3 h(-1)). Moreover, it has been satisfactorily applied to different types of water matrices (tap, mineral, well and sea water). The (226)Ra minimum detectable activities (LSC: 0.004 Bq L(-1); LBPC: 0.02 Bq L(-1)) attained by this system make it possible to reach the guidance values proposed by the relevant international agencies, e.g. WHO, EPA and EC.
Kennedy, R R; Merry, A F
2011-09-01
Anaesthesia involves processing large amounts of information over time. One task of the anaesthetist is to detect substantive changes in physiological variables promptly and reliably. It has previously been demonstrated that a graphical trend display of historical data leads to more rapid detection of such changes. We examined the effect of a graphical indication of the magnitude of Trigg's Tracking Variable, a simple statistically based trend detection algorithm, on the accuracy and latency of the detection of changes in a micro-simulation. Ten anaesthetists each viewed 20 simulations with four variables displayed as the current value with a simple graphical trend display. Values for these variables were generated by a computer model and updated every second; after a period of stability a change occurred to a new random value at least 10 units from baseline. In 50% of the simulations an indication of the rate of change was given by a five-level graphical representation of the value of Trigg's Tracking Variable. Participants were asked to indicate when they thought a change was occurring. Changes were detected 10.9% faster with the trend indicator present (mean 13.1 [SD 3.1] cycles vs 14.6 [SD 3.4] cycles, 95% confidence interval 0.4 to 2.5 cycles, P = 0.013). There was no difference in accuracy of detection (median with trend detection 97% [interquartile range 95 to 100%], without trend detection 100% [98 to 100%], P = 0.8). We conclude that simple statistical trend detection may speed detection of changes during routine anaesthesia, even when a graphical trend display is present.
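Trigg's Tracking Variable itself is a short recursion: exponentially smooth the forecast error and its absolute value, and take their ratio. A minimal sketch follows; the smoothing constant and data are illustrative, not taken from the study.

```python
def triggs_tracking(values, alpha=0.2):
    """Trigg's Tracking Variable for a stream of measurements: the
    ratio of the exponentially smoothed forecast error to the smoothed
    mean absolute deviation.  |T| near 1 signals a sustained change;
    near 0, stability."""
    forecast = values[0]
    smoothed_err, mad = 0.0, 1e-9    # tiny mad avoids division by zero
    track = []
    for x in values:
        err = x - forecast
        smoothed_err = alpha * err + (1 - alpha) * smoothed_err
        mad = alpha * abs(err) + (1 - alpha) * mad
        track.append(smoothed_err / mad)
        forecast += alpha * err      # simple exponential forecast update
    return track

stable = [100.0] * 30
step = [100.0] * 15 + [115.0] * 15   # a change like those simulated
print(abs(triggs_tracking(stable)[-1]))  # near 0: no change
print(abs(triggs_tracking(step)[-1]))    # near 1 after the step
```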
Detecting Mass Substructure in Galaxy Clusters: An Aperture Mass Statistic for Gravitational Flexion
Leonard, Adrienne; Wilkins, Stephen M
2008-01-01
Gravitational flexion has recently been introduced as a technique by which one can map out and study substructure in clusters of galaxies. Previous analyses involving flexion have measured the individual galaxy-galaxy flexion signal, or used either parametric techniques or a KSB-type inversion to reconstruct the mass distribution in Abell 1689. In this paper, we present an aperture mass statistic for flexion, and apply it to the lensed images of background galaxies obtained by ray-tracing simulations through a simple analytic mass distribution and through a galaxy cluster from the Millennium simulation. We show that this method is effective at detecting and accurately tracing structure within clusters of galaxies on sub-arcminute scales with high signal-to-noise even using a moderate background source number density and image resolution. In addition, the method provides much more information about both the overall shape and the small-scale structure of a cluster of galaxies than can be achieved through a weak...
A Blind Blur Detection Scheme Using Statistical Features of Phase Congruency and Gradient Magnitude
Shamik Tiwari
2014-01-01
The growing use of camera-based barcode readers has recently gained a lot of attention. This has boosted interest in no-reference blur detection algorithms. Blur is an undesirable phenomenon which appears as one of the most frequent causes of image degradation. In this paper we present a new no-reference blur detection scheme that is based on the statistical features of phase congruency and gradient magnitude maps. Blur detection is achieved by approximating the functional relationship between these features using a feed-forward neural network. Simulation results show that the proposed scheme yields robust blur detection.
Irshad, Humayun
2013-01-01
According to the Nottingham grading system, mitosis count plays a critical role in cancer diagnosis and grading. Manual counting of mitoses is tedious and subject to considerable inter- and intra-reader variation. The aim is to improve the accuracy of mitosis detection by selecting the color channels that better capture the statistical and morphological features distinguishing mitoses from other objects. We propose a framework that includes a comprehensive analysis of statistical and morphological features in selected channels of various color spaces to assist pathologists in mitosis detection. In the candidate detection phase, we perform Laplacian of Gaussian, thresholding, morphology and active contour model operations on the blue-ratio image to detect and segment candidates. In the candidate classification phase, we extract a total of 143 features, including morphological, first-order and second-order (texture) statistical features, for each candidate in selected channels and finally classify using a decision tree classifier. The proposed method has been evaluated on the Mitosis Detection in Breast Cancer Histological Images (MITOS) dataset provided for an International Conference on Pattern Recognition 2012 contest and achieved 74% and 71% detection rate, 70% and 56% precision and 72% and 63% F-measure on Aperio and Hamamatsu images, respectively. The proposed multi-channel feature computation scheme uses a fixed image scale and extracts nuclei features in selected channels of various color spaces. This simple but robust model proved highly efficient in capturing multi-channel statistical features for mitosis detection during the MITOS international benchmark. Indeed, mitosis detection, of critical importance in cancer diagnosis, is a very challenging visual task. In future work, we plan to use color deconvolution as preprocessing and Hough transform or local extrema based candidate detection in order to reduce the number of candidates in the mitosis and non-mitosis classes.
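The blue-ratio transform used in the candidate detection phase has appeared in the mitosis-detection literature in the following per-pixel form; the constants here are one published variant and may differ from the authors' exact implementation.

```python
def blue_ratio(r, g, b):
    """Blue-ratio transform of an RGB pixel (0-255 channel values),
    a variant used to emphasise blue-stained nuclei before candidate
    detection.  Constants follow one published formulation and are
    an assumption here."""
    return (100.0 * b / (1.0 + r + g)) * (256.0 / (1.0 + r + g + b))

# A blue nucleus pixel scores far higher than pink background stroma:
print(blue_ratio(60, 50, 200))    # nucleus-like pixel
print(blue_ratio(220, 160, 180))  # background-like pixel
```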
Statistical detection and modeling of the over-dispersion of winter storm occurrence
Raschke, M.
2015-08-01
In this communication, I improve the detection and modeling of the over-dispersion of winter storm occurrence. For this purpose, the generalized Poisson distribution and the Bayesian information criterion are introduced; the latter is used for statistical model selection. Moreover, I replace the frequently used dispersion statistics by an over-dispersion parameter which does not depend on the considered return period of storm events. These models and methods are applied in order to properly detect the over-dispersion in winter storm data for Germany, carrying out a joint estimation of the distribution models for different samples.
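The over-dispersion question can be illustrated with the variance-to-mean dispersion index together with Consul's generalized Poisson pmf. The storm-count samples below are invented, and this sketch omits the Bayesian information criterion model-selection step used in the paper.

```python
import math

def dispersion_index(counts):
    """Index of dispersion (variance-to-mean ratio) for annual storm
    counts; values clearly above 1 indicate over-dispersion relative
    to a Poisson model."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def genpoisson_pmf(k, lam, theta):
    """Generalized Poisson pmf (Consul): theta = 0 recovers the
    ordinary Poisson; theta > 0 models over-dispersion."""
    mu = lam + k * theta
    return lam * mu ** (k - 1) * math.exp(-mu) / math.factorial(k)

poisson_like = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
clustered    = [0, 9, 1, 8, 0, 10, 1, 7, 0, 9]
print(dispersion_index(poisson_like))  # at or below 1: no over-dispersion
print(dispersion_index(clustered))     # well above 1: over-dispersed
# With theta = 0 the generalized Poisson reduces to the Poisson pmf:
print(genpoisson_pmf(2, 3.0, 0.0))
```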
Andrius Gudiškis
2015-07-01
This paper proposes an algorithm to reduce the influence of noise distortion on heartbeat annotation detection in electrocardiogram (ECG) signals. The boundary estimation module is based on an energy detector. Heartbeat detection is usually performed by QRS detectors, which find the QRS regions in an ECG signal that directly represent heartbeats. However, QRS detectors perform as intended only when ECG signals have a high signal-to-noise ratio; as signal distortion becomes more noticeable, detector accuracy decreases. The proposed algorithm uses additional data from an arterial blood pressure signal, recorded in parallel with the ECG, to support QRS detection in distorted signal areas. In cases where the signal-to-noise ratio is high, the proposed algorithm performs as well as classical QRS detectors when compared against heartbeat annotations provided by experts. In signals with a considerably lower signal-to-noise ratio, the proposed algorithm improved detection accuracy by up to 6%.
Han Zhang
2014-01-01
A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckle follows the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log-ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is introduced to the change detection context for the first time. A new clustering procedure with the region variance as the statistical inference variable is tailored to SAR image change detection, with only two classes in the final map: unchanged and changed. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency.
Sushko, Oleksandr; Dubrovka, Rostyslav; Donnan, Robert S., E-mail: r.donnan@qmul.ac.uk [School of Electronic Engineering and Computer Science, Queen Mary University of London, Mile End Road, London E1 4NS (United Kingdom)
2015-02-07
The initial purpose of the study is to systematically investigate the solvation properties of different proteins in water solution by terahertz (THz) radiation absorption. Transmission measurements of protein water solutions have been performed using a vector network analyser-driven quasi-optical bench covering the WR-3 waveguide band (0.220–0.325 THz). The following proteins, ranging from low to high molecular weight, were chosen for this study: lysozyme, myoglobin, and bovine serum albumin (BSA). Absorption properties of solutions were studied at different concentrations of proteins ranging from 2 to 100 mg/ml. The concentration-dependent absorption of protein molecules was determined by treating the solution as a two-component model first; then, based on protein absorptivity, the extent of the hydration shell is estimated. Protein molecules are shown to possess a concentration-dependent absorptivity in water solutions. Absorption curves of all three proteins sharply peak towards a dilution-limit that is attributed to the enhanced flexibility of protein and amino acid side chains. An alternative approach to the determination of hydration shell thickness is thereby suggested, based on protein absorptivity. The proposed approach is independent of the absorption of the hydration shell. The derived estimate of hydration shell thickness for each protein supports previous findings that protein-water interaction dynamics extends beyond 2-3 water solvation-layers as predicted by molecular dynamics simulations and other techniques such as NMR, X-ray scattering, and neutron scattering. According to our estimations, the radius of the dynamic hydration shell is 16, 19, and 25 Å, respectively, for lysozyme, myoglobin, and BSA proteins and correlates with the dipole moment of the protein. It is also seen that THz radiation can serve as an initial estimate of the protein hydrophobicity.
2014-01-01
One challenge to implementing spectral change detection algorithms using multitemporal Landsat data is that key dates and periods are often missing from the record due to weather disturbances and lapses in continuous coverage. This paper presents a method that utilizes residuals from harmonic regression over years of Landsat data, in conjunction with statistical quality control charts, to signal subtle disturbances in vegetative cover. These charts are able to detect changes from both defores...
Gajic, D.; Djurovic, Z.; Di Gennaro, S.; Gustafsson, Fredrik
2014-01-01
The electroencephalogram (EEG) signal is very important in the diagnosis of epilepsy. Long-term EEG recordings of an epileptic patient contain a huge amount of EEG data. The detection of epileptic activity is, therefore, a very demanding process that requires a detailed analysis of the entire length of the EEG data, usually performed by an expert. This paper describes an automated classification of EEG signals for the detection of epileptic seizures using wavelet transform and statistical pat...
Statistical control chart and neural network classification for improving human fall detection
Harrou, Fouzi
2017-01-05
This paper proposes a statistical approach to detect and classify human falls based on both visual data from a camera and accelerometric data captured by an accelerometer. Specifically, we first use a Shewhart control chart to detect potential falls from the accelerometric data. Unfortunately, this chart cannot distinguish real falls from fall-like actions, such as lying down. To bypass this difficulty, a neural network classifier is then applied, using visual data, only to the detected cases. To assess the performance of the proposed method, experiments are conducted on a publicly available fall detection database: the University of Rzeszow's fall detection (URFD) dataset. Results demonstrate that the detection phase plays a key role in reducing the number of sequences fed into the neural network classifier, significantly reducing the computational burden and achieving better accuracy.
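In its simplest univariate form, the Shewhart detection phase reduces to flagging accelerometer samples outside mean ± kσ control limits learned from fall-free data; the k = 3 limit below is the conventional choice, not necessarily the paper's:

```python
import statistics

def shewhart_alarms(signal, baseline, k=3.0):
    """Indices of samples falling outside the mean +/- k*sigma control
    limits estimated from a fall-free baseline recording."""
    mu = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    ucl, lcl = mu + k * sigma, mu - k * sigma
    return [i for i, x in enumerate(signal) if x > ucl or x < lcl]
```

Only the flagged indices (with their corresponding video frames) would then be passed to the neural network classifier.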
Fault detection of a spur gear using vibration signal with multivariable statistical parameters
Songpon Klinchaeam
2014-10-01
This paper presents a condition monitoring technique for spur gear fault detection using time-domain vibration signal analysis. Vibration signals were acquired from gearboxes and used to simulate various faults on spur gear teeth. In this study, vibration signals were used to monitor normal and various fault conditions of a spur gear: normal, scuffing defect, crack defect and broken tooth. Statistical parameters of the vibration signal were used to compare and evaluate the fault conditions. The technique can be applied to set an alarm limit on the signal condition based on statistical parameters such as variance, kurtosis, RMS and crest factor, which serve as a decision boundary on the signal condition. The results show that vibration signal analysis with a single statistical parameter cannot clearly predict spur gear faults, whereas using at least two statistical parameters clearly separates every fault case examined. The decision boundary, set with 99.7% certainty (±3σ) from a 300-sample reference dataset, detected the testing condition with 99.7% (±3σ) accuracy and an error of less than 0.3% on a 50-sample testing dataset.
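The four time-domain condition indicators named above are straightforward to compute. A minimal NumPy sketch of the feature computation only; the decision-boundary construction is the paper's own:

```python
import numpy as np

def vibration_features(x):
    """Time-domain condition indicators for gear fault detection:
    variance, RMS, kurtosis and crest factor."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    var = np.var(x)
    # non-excess kurtosis: equals 3.0 for a Gaussian signal
    kurt = np.mean((x - x.mean()) ** 4) / var ** 2
    crest = np.max(np.abs(x)) / rms
    return {"variance": var, "rms": rms, "kurtosis": kurt, "crest": crest}
```

For a pure sine wave the crest factor is √2 ≈ 1.41 and the kurtosis 1.5; impulsive faults such as a broken tooth drive both far higher, which is why combining at least two parameters separates the fault cases.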
An Algorithm to Improve Test Answer Copying Detection Using the Omega Statistic
Maeda, Hotaka; Zhang, Bo
2017-01-01
The omega (ω) statistic is reputed to be one of the best indices for detecting answer copying on multiple choice tests, but its performance relies on the accurate estimation of copier ability, which is challenging because responses from the copiers may have been contaminated. We propose an algorithm that aims to identify and delete the suspected…
Statistics of multi-tube detecting systems; Estadistica de sistemas de deteccion multitubo
Grau Carles, P.; Grau Malonda, A.
1994-07-01
In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae used to compute the counting efficiency when the detection system is formed by several photomultipliers associated in coincidence and sum. The theorems are applied to several photomultiplier arrangements in order to show their potential and how they are applied. (Author) 6 refs.
Haris, K
2016-01-01
A global network of advanced interferometric gravitational wave (GW) detectors is expected to be on-line soon. Coherent observation of a GW from a distant compact binary coalescence (CBC) with a network of interferometers located on different continents gives crucial information about the source, such as its location and polarization. In this paper we compare different multi-detector network detection statistics for CBC searches. In maximum likelihood ratio (MLR) based detection approaches, the likelihood ratio is optimized to obtain the best model parameters, and the best likelihood ratio value is used as the statistic to decide on the presence of a signal. However, an alternative Bayesian approach involves marginalization of the likelihood ratio over the parameters to obtain the average likelihood ratio. We obtain an analytical expression for the Bayesian statistic using the two effective synthetic data streams for targeted searches of non-spinning compact binary systems with an uninformative prior on...
Choquet, É; Soummer, R; Perrin, M D; Hagan, J B; Gofas-Salas, E; Rajan, A; Aguilar, J
2015-01-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.
Movement and respiration detection using statistical properties of the FMCW radar signal
Kiuru, Tero
2016-07-26
This paper presents a 24 GHz FMCW radar system for detection of movement and respiration using changes in the statistical properties of the received radar signal, both amplitude and phase. We present the hardware and software segments of the radar system as well as algorithms with measurement results for two distinct use-cases: 1. FMCW radar as a respiration monitor and 2. a dual-use of the same radar system for smart lighting and intrusion detection. By using changes in the statistical properties of the signal for detection, several system parameters can be relaxed, including, for example, pulse repetition rate, power consumption, computational load, processor speed, and memory space. We also demonstrate that the capability to switch between received signal strength and phase difference enables dual-use cases, with one requiring extreme sensitivity to movement and the other robustness against small sources of interference. © 2016 IEEE.
Probability of Detection (POD) as a statistical model for the validation of qualitative methods.
Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T
2011-01-01
A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
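A common concrete choice for the POD curve, not mandated by the abstract, is a logistic model in log concentration. A self-contained sketch fitted by plain batch gradient descent (a statistics package would normally be used):

```python
import math

def fit_pod(concentrations, detected, iters=5000, lr=0.5):
    """Fit POD(c) = 1/(1 + exp(-(a + b*ln c))) by logistic regression on
    per-test outcomes (detected = 1/0), using plain gradient descent so
    the sketch stays dependency-free."""
    xs = [math.log(c) for c in concentrations]
    a = b = 0.0
    n = len(xs)
    for _ in range(iters):
        ga = gb = 0.0
        for x, y in zip(xs, detected):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            ga += (p - y) / n          # gradient of mean log-loss w.r.t. a
            gb += (p - y) * x / n      # gradient w.r.t. b
        a -= lr * ga
        b -= lr * gb
    return lambda c: 1.0 / (1.0 + math.exp(-(a + b * math.log(c))))
```

The returned function gives the graphical POD-versus-concentration curve described in the abstract; candidate and reference methods could be compared by fitting one curve each.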
Photon Statistics of Single-Photon Quantum States in Real Single Photon Detection
李刚; 李园; 王军民; 彭堃墀; 张天才
2004-01-01
Single photon detection (SPD) with high quantum efficiency has been widely used for the measurement of different quantum states with different photon distributions. Based on direct single-SPD and double-SPD HBT configurations, we discuss the effect of a real SPD on photon statistics measurements and show that the measured photon distributions for different quantum states are corrected in different forms. The results are confirmed by experiments with strongly attenuated coherent light and thermal light. This system can be used to characterize the photon statistics of fluorescence light from a single atom or a single molecule.
Statistical methods on detecting differentially expressed genes for RNA-seq data
Chen Zhongxue
2011-12-01
Background: For RNA-seq data, the aggregated counts of the short reads from the same gene are used to approximate the gene expression level. The count data can be modelled as samples from Poisson distributions with possibly different parameters. To detect differentially expressed genes under two conditions, statistical methods for detecting the difference of two Poisson means are used. When the expression level of a gene is low, i.e., the count is small, it is usually more difficult to detect mean differences, so statistical methods that are more powerful at low expression levels are particularly desirable. In the statistical literature, several methods have been proposed to compare two Poisson means (rates). In this paper, we compare these methods using simulated and real RNA-seq data. Results: Through simulation study and real data analysis, we find that the Wald test with log-transformed data is more powerful than the other methods, including the likelihood ratio test, which has similar power to the variance stabilizing transformation test; both are more powerful than the conditional exact test and Fisher's exact test. Conclusions: When the count data in RNA-seq can be reasonably modelled as Poisson distributed, the Wald-Log test is more powerful and should be used to detect differentially expressed genes.
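The Wald test on log-transformed counts favored in the conclusions has a simple closed form for two observed counts x1, x2; equal sequencing depth in the two conditions is an assumption of this sketch:

```python
import math

def wald_log_test(x1, x2):
    """Two-sided Wald test on log-transformed Poisson counts x1, x2 > 0,
    assuming equal sequencing depth in the two conditions.  Returns the
    test statistic and its normal-approximation p-value."""
    w = (math.log(x1) - math.log(x2)) / math.sqrt(1.0 / x1 + 1.0 / x2)
    phi = 0.5 * (1.0 + math.erf(abs(w) / math.sqrt(2.0)))  # standard normal CDF
    return w, 2.0 * (1.0 - phi)
```

A count pair of 100 vs 50 is a clear rejection, while 52 vs 48 is not, matching the intuition that small differences at moderate counts are hard to detect.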
Liang, C L; De Pater, I; Alcock, C B; Axelrod, T; Wang, A; Liang, Chyng-Lan; Rice, John A.; Pater, Imke de; Alcock, Charles; Axelrod, Tim; Wang, Andrew
2002-01-01
The Taiwanese-American Occultation Survey (TAOS) will detect objects in the Kuiper Belt, by measuring the rate of occultations of stars by these objects, using an array of three to four 50cm wide-field robotic telescopes. Thousands of stars will be monitored, resulting in hundreds of millions of photometric measurements per night. To optimize the success of TAOS, we have investigated various methods of gathering and processing the data and developed statistical methods for detecting occultations. In this paper we discuss these methods. The resulting estimated detection efficiencies will be used to guide the choice of various operational parameters determining the mode of actual observation when the telescopes come on line and begin routine observations. In particular we show how real-time detection algorithms may be constructed, taking advantage of having multiple telescopes. We also discuss a retrospective method for estimating the rate at which occultations occur.
Recommended methods for statistical analysis of data containing less-than-detectable measurements
Atwood, C.L.; Blackwood, L.G.; Harris, G.A.; Loehr, C.A.
1990-09-01
This report is a manual for statistical workers dealing with environmental measurements, when some of the measurements are not given exactly but are only reported as less than detectable. For some statistical settings with such data, many methods have been proposed in the literature, while for others few or none have been proposed. This report gives a recommended method in each of the settings considered. The body of the report gives a brief description of each recommended method. Appendix A gives example programs using the statistical package SAS, for those methods that involve nonstandard methods. Appendix B presents the methods that were compared and the reasons for selecting each recommended method, and explains any fine points that might be of interest. This is an interim version. Future revisions will complete the recommendations. 34 refs., 2 figs., 11 tabs.
Recommended methods for statistical analysis of data containing less-than-detectable measurements
Atwood, C.L.; Blackwood, L.G.; Harris, G.A.; Loehr, C.A.
1991-09-01
This report is a manual for statistical workers dealing with environmental measurements, when some of the measurements are not given exactly but are only reported as less than detectable. For some statistical settings with such data, many methods have been proposed in the literature, while for others few or none have been proposed. This report gives a recommended method in each of the settings considered. The body of the report gives a brief description of each recommended method. Appendix A gives example programs using the statistical package SAS, for those methods that involve nonstandard methods. Appendix B presents the methods that were compared and the reasons for selecting each recommended method, and explains any fine points that might be of interest. 7 refs., 4 figs.
Statistical Analysis of Tissue Images for Detection and Classification of Cervical Cancer
Jagtap, Jaidip; Pandey, Kiran; Agarwa, Asha; Panigrahi, Prasanta K; Pradhan, Asima
2011-01-01
Cervical cancer is one of the major health threats to women worldwide. The current "gold standard" for detecting cancer of epithelial tissue is histopathology analysis of biopsy samples; however, it relies on the pathologist's judgment of the disease. We investigate the utility of statistical parameters as a potential tool for detecting and discriminating the stages of dysplasia. Digital images of the tissue slides are captured with a digital camera attached to a microscope. Statistical data analysis is performed with software to evaluate parameters such as mean, maxima, full width at half maximum, skewness and kurtosis for the images. We believe that these parameters can effectively help improve diagnosis and further classify normal and abnormal tissue sections. They can be used independently as well as in tandem with other parameters as features in classification algorithms that involve the use of neural networks or principal component analysis.
Hu, Juju; Hu, Haijiang; Ji, Yinghua
2010-03-15
Periodic nonlinearity, ranging from a few nanometers to tens of nanometers, limits the use of heterodyne interferometers in high-accuracy measurement. A novel method is studied to detect nonlinearity errors based on electrical subdivision and statistical signal analysis in a heterodyne Michelson interferometer. With the micropositioning platform moving at uniform velocity, the method detects nonlinearity errors using regression analysis and Jackknife estimation. Based on analysis of simulations, the method can estimate the influence of nonlinearity errors and other noise on dimensional measurement in a heterodyne Michelson interferometer.
Early pack-off diagnosis in drilling using an adaptive observer and statistical change detection
Willersrud, Anders; Imsland, Lars; Blanke, Mogens
2015-01-01
in the well. A model-based adaptive observer is used to estimate these friction parameters as well as flow rates. Detecting changes to these estimates can then be used for pack-off diagnosis, which due to measurement noise is done using statistical change detection. Isolation of incident type and location… is done using a multivariate generalized likelihood ratio test, determining the change direction of the estimated mean values. The method is tested on simulated data from the commercial high-fidelity multi-phase simulator OLGA, where three different pack-offs at different locations and with different…
Signal waveform detection with statistical automaton for internet and web service streaming.
Tseng, Kuo-Kun; Ji, Yuzhu; Liu, Yiming; Huang, Nai-Lun; Zeng, Fufu; Lin, Fang-Ying
2014-01-01
In recent years, many approaches have been suggested for Internet and web streaming detection. In this paper, we propose an approach to signal waveform detection for Internet and web streaming, with novel statistical automatons. The system records network connections over a period of time to form a signal waveform and computes suspicious characteristics of the waveform. Network streaming can then be classified according to these selected waveform features by our newly designed Aho-Corasick (AC) automatons. We developed two versions, basic AC and advanced AC-histogram waveform automata, and conducted comprehensive experimentation. The results confirm that our approach is feasible and suitable for deployment.
T. Venkat Narayana Rao
2011-11-01
Edge detection is one of the most important operations in image processing for object detection, so it is crucial to have a good understanding of edge detection algorithms/operators. Computer vision is a rapidly expanding field that depends on the capability to segment images quickly and thus to classify and infer from them. Segmentation is central to the successful extraction of image features and their ensuing classification. Powerful segmentation techniques are available; however, each technique is ad hoc. In this paper, we investigate the sub-regions of a composite image and survey the most commonly used and most important edge detection algorithms/operators, with a wide-ranging comparison based on a statistical approach. The paper implements popular algorithms such as Sobel, Roberts, Prewitt, Laplacian of Gaussian and Canny. A standard metric is used for evaluating the performance degradation of edge detection algorithms as a function of peak signal-to-noise ratio (PSNR), along with the elapsed time for generating the segmented output image. A statistical approach is also incorporated to evaluate the variance among the PSNR values and the elapsed times for the output images. This paper provides a basis for objectively comparing the performance of different techniques and quantifies their relative noise tolerance. The results allow selection of the most suitable method for a given application.
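As an illustration of the kind of comparison described, a minimal NumPy sketch of one of the implemented operators (Sobel) together with the PSNR metric used for evaluation:

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude of a 2-D grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for i in range(3):
        for j in range(3):
            block = pad[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * block
            gy += ky[i, j] * block
    return np.hypot(gx, gy)

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(test, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

Running each operator on progressively noisier versions of an image and recording PSNR and elapsed time gives exactly the comparison table the paper analyses statistically.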
Banks-Leite, Cristina; Pardini, Renata; Boscolo, Danilo; Cassano, Camila Righetto; Püttker, Thomas; Barros, Camila Santos; Barlow, Jos
2014-08-01
1. In recent years, there has been a fast development of models that adjust for imperfect detection. These models have revolutionized the analysis of field data, and their use has repeatedly demonstrated the importance of sampling design and data quality. There are, however, several practical limitations associated with the use of detectability models which restrict their relevance to tropical conservation science. 2. We outline the main advantages of detectability models, before examining their limitations associated with their applicability to the analysis of tropical communities, rare species and large-scale data sets. Finally, we discuss whether detection probability needs to be controlled before and/or after data collection. 3. Models that adjust for imperfect detection allow ecologists to assess data quality by estimating uncertainty and to obtain adjusted ecological estimates of populations and communities. Importantly, these models have allowed informed decisions to be made about the conservation and management of target species. 4. Data requirements for obtaining unadjusted estimates are substantially lower than for detectability-adjusted estimates, which require relatively high detection/recapture probabilities and a number of repeated surveys at each location. These requirements can be difficult to meet in large-scale environmental studies where high levels of spatial replication are needed, or in the tropics where communities are composed of many naturally rare species. However, while imperfect detection can only be adjusted statistically, covariates of detection probability can also be controlled through study design. Using three study cases where we controlled for covariates of detection probability through sampling design, we show that the variation in unadjusted ecological estimates from nearly 100 species was qualitatively the same as that obtained from adjusted estimates. Finally, we discuss that the decision as to whether one should control for
Application of Statistical Methods to Activation Analytical Results near the Limit of Detection
Heydorn, Kaj; Wanscher, B.
1978-01-01
Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
2012-01-01
Glyphosate quantification methods are complex and expensive, and its control in natural water bodies is becoming more important year after year. In order to find a new system that facilitates the detection of glyphosate, we present a comparison between two models to predict glyphosate concentration in aqueous solutions. One of them is an artificial neural network (ANN) embedded in a microcontroller and the other uses statistical methods (Partial Least Squares) in a computer...
Homogeneity and change-point detection tests for multivariate data using rank statistics
Lung-Yut-Fong, Alexandre; Cappé, Olivier
2011-01-01
Detecting and locating changes in highly multivariate data is a major concern in several current statistical applications. In this context, the first contribution of the paper is a novel non-parametric two-sample homogeneity test for multivariate data based on the well-known Wilcoxon rank statistic. The proposed two-sample homogeneity test statistic can be extended to deal with ordinal or censored data as well as to test for the homogeneity of more than two samples. The second contribution of the paper concerns the use of the proposed test statistic to perform retrospective change-point analysis. It is first shown that the approach is computationally feasible even when looking for a large number of change-points thanks to the use of dynamic programming. Computable asymptotic $p$-values for the test are then provided in the case where a single potential change-point is to be detected. Compared to available alternatives, the proposed approach appears to be very reliable and robust. This is particularly true in ...
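The building block of the proposed test, the Wilcoxon rank statistic with average ranks for ties, can be sketched as follows. The sketch standardizes the univariate rank sum under the homogeneity null; the paper's multivariate extension and change-point machinery are not reproduced:

```python
def rank_sum_statistic(sample1, sample2):
    """Wilcoxon rank-sum: the sum of sample1's ranks in the pooled data
    (average ranks for ties) and its standardized value under the null
    hypothesis that both samples come from the same distribution."""
    pooled = sorted((v, who) for who, s in ((0, sample1), (1, sample2))
                    for v in s)
    n = len(pooled)
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank for the tie group
        for t in range(i, j + 1):
            ranks[t] = avg
        i = j + 1
    w = sum(r for r, (_, who) in zip(ranks, pooled) if who == 0)
    n1, n2 = len(sample1), len(sample2)
    mean = n1 * (n + 1) / 2
    var = n1 * n2 * (n + 1) / 12
    return w, (w - mean) / var ** 0.5
```

For retrospective change-point analysis, this statistic is evaluated at every candidate split of the series and the most extreme standardized value is taken, which is where the dynamic programming mentioned in the abstract becomes necessary.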
Fernández-Llamazares, Alvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción
2014-04-01
Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices in the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in time series data of the selected airborne pollen types, and we have observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well modeled by a normal distribution, it is better to apply non-parametric statistical methods in aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities have been released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could have an impact on public health.
Application of hotspot detection using spatial scan statistic: Study of criminality in Indonesia
Runadi, Taruga; Widyaningsih, Yekti
2017-03-01
According to police-registered data, the number of criminal cases fluctuated from 2011 to 2013, meaning there was no significant reduction in the number of criminal acts during that period. Local governments need to know whether their area is at high risk for particular crimes. The objective of this study is to detect hotspot areas of certain criminal cases using the spatial scan statistic. This study analyzed data on 22 types of criminal cases by province in Indonesia during 2013. The data were obtained from Badan Pusat Statistik (BPS) and released in 2014. Hotspot detection was performed according to the likelihood ratio of the Poisson model using SaTScan™ software, and the results were mapped using R. The spatial scan statistic successfully detected provinces categorized as hotspots for the 22 crime types analyzed, with p-values less than 0.05. The local governments of provinces detected as hotspot areas for certain crime types should pay more attention to improving security.
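The core of the Poisson scan statistic is Kulldorff's log-likelihood ratio for a candidate cluster. The sketch below scores single regions only; SaTScan actually scans over many circular windows of neighbouring regions, so this is a deliberate simplification:

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff log-likelihood ratio for a candidate cluster with
    c observed and e expected cases out of C total (0 < c < C assumed).
    Only elevated-risk clusters (c > e) score above zero."""
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def most_likely_cluster(observed, expected):
    """Index and score of the region with the highest likelihood ratio."""
    C = sum(observed)
    scores = [poisson_llr(c, e, C) for c, e in zip(observed, expected)]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]
```

In the full method, the significance of the best score is assessed by Monte Carlo replication under the null, which yields the p-values reported in the study.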
Detection, Excision and Statistics of Interference at the Mauritius Radio Telescope
S. Sachdev; N. Udaya Shankar
2001-06-01
A technique to detect man-made interference in the visibility data of the Mauritius Radio Telescope (MRT) has been developed. This technique is based on the understanding that the interference is generally ‘spiky’ in nature and has Fourier components beyond the maximum frequency which can arise from the radio sky and can therefore be identified. We take the sum of magnitudes of visibilities on all the baselines measured at a given time to improve detectability. This is then high-pass filtered to get a time series from which the contribution of the sky is removed. Interference is detected in the high-pass data using an iterative scheme. In each iteration, interference with amplitudes beyond a certain threshold is detected. These points are then removed from the original time series and the resulting data are high-pass filtered and the process repeated. We have also studied the statistics of the strength, numbers, time of occurrence and duration of the interference at the MRT. The statistics indicate that most often the interference excision can be carried out while post-integrating the visibilities by giving a zero weight to the interference points.
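The iterative detection scheme described, thresholding, excising, and repeating on the cleaned data, is essentially iterative sigma clipping. A minimal sketch in which the k = 4 threshold is illustrative, not the MRT's operational value:

```python
import statistics

def excise_interference(series, k=4.0, max_iter=10):
    """Iteratively flag samples whose deviation from the mean of the
    remaining data exceeds k standard deviations; flagged samples would
    then receive zero weight during post-integration."""
    flagged = set()
    for _ in range(max_iter):
        clean = [x for i, x in enumerate(series) if i not in flagged]
        mu = statistics.fmean(clean)
        sigma = statistics.pstdev(clean)
        new = {i for i, x in enumerate(series)
               if i not in flagged and abs(x - mu) > k * sigma}
        if not new:
            break
        flagged |= new
    return sorted(flagged)
```

Re-estimating the mean and standard deviation after each excision pass matters: a single strong interference spike inflates the first-pass threshold, and weaker spikes only become detectable once the strong ones are removed.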
Kim, Hyeonsu; Seo, Jongpil; Ahn, Jongmin; Chung, Jaehak
2017-07-01
We propose a mitigation scheme for snapping shrimp noise that corrupts orthogonal frequency division multiplexing (OFDM) signals in underwater acoustic communication systems. The OFDM signal distorted by the snapping shrimp noise is filtered by a band-stop filter. The snapping shrimp noise events in the filtered signal are detected by a detector with a constant false alarm rate, whose threshold is derived theoretically from the statistics of the background noise. The detected signals are reconstructed by a simple reconstruction method. The proposed scheme achieves a higher detection capability and a lower mean square error of the channel estimation for simulated data, and a lower bit error rate for practical ocean OFDM data collected in the northern East China Sea, than the conventional noise-mitigating methods.
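A constant-false-alarm-rate threshold under an assumed zero-mean Gaussian background can be sketched as follows; the paper derives its threshold from the measured background-noise statistics, which need not be Gaussian, so this is only the shape of the idea:

```python
from statistics import NormalDist

def cfar_threshold(noise_std, pfa):
    """Threshold on |sample| giving false-alarm probability `pfa` per sample
    under a zero-mean Gaussian background-noise model (an assumption here)."""
    return noise_std * NormalDist().inv_cdf(1 - pfa / 2)

def detect_impulses(samples, noise_std, pfa=1e-3):
    """Indices whose amplitude exceeds the CFAR threshold."""
    thr = cfar_threshold(noise_std, pfa)
    return [i for i, x in enumerate(samples) if abs(x) > thr]

# Background noise of unit std; a snapping-shrimp-like impulse at index 7.
sig = [0.3, -0.5, 0.2, 0.1, -0.4, 0.0, 0.6, 12.0, -0.2, 0.3]
print(detect_impulses(sig, noise_std=1.0))  # -> [7]
```

The key property is that the threshold scales with the noise level, so the false-alarm rate stays fixed when the background changes.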
Simon, Martin
2015-01-01
This monograph is concerned with the analysis and numerical solution of a stochastic inverse anomaly detection problem in electrical impedance tomography (EIT). Martin Simon studies the problem of detecting a parameterized anomaly in an isotropic, stationary and ergodic conductivity random field whose realizations are rapidly oscillating. For this purpose, he derives Feynman-Kac formulae to rigorously justify stochastic homogenization in the case of the underlying stochastic boundary value problem. The author combines techniques from the theory of partial differential equations and functional analysis with probabilistic ideas, paving the way to new mathematical theorems which may be fruitfully used in the treatment of the problem at hand. Moreover, the author proposes an efficient numerical method in the framework of Bayesian inversion for the practical solution of the stochastic inverse anomaly detection problem. Contents: Feynman-Kac formulae; Stochastic homogenization; Statistical inverse problems; Targe...
Zeng, Bobo; Wang, Guijin; Ruan, Zhiwei; Lin, Xinggang; Meng, Long
2012-07-01
High-performance pedestrian detection with good accuracy and fast speed is an important yet challenging task in computer vision. We design a novel feature named pair normalized channel feature (PNCF), which simultaneously combines and normalizes two channel features in image channels, achieving a highly discriminative power and computational efficiency. PNCF applies to both gradient channels and color channels so that shape and appearance information are described and integrated in the same feature. To efficiently explore the formidably large PNCF feature space, we propose a statistics-based feature learning method to select a small number of potentially discriminative candidate features, which are fed into the boosting algorithm. In addition, channel compression and a hybrid pyramid are employed to speed up the multiscale detection. Experiments illustrate the effectiveness of PNCF and its learning method. Our proposed detector outperforms the state-of-the-art on several benchmark datasets in both detection accuracy and efficiency.
Enqvist, Andreas
2008-03-15
One particular purpose of nuclear safeguards, in addition to accounting for known materials, is the detection, identification, and quantification of unknown materials, to prevent accidental and clandestine transport and use of nuclear materials. This can be achieved in a non-destructive way through the various physical and statistical properties of particle emission and detection from such materials. This thesis addresses some fundamental aspects of nuclear materials and the way they can be detected and quantified by such methods. Factorial moments or multiplicities have long been used within the safeguards area. These are low-order moments of the underlying number distributions of emission and detection. One objective of the present work was to determine the full probability distribution and its dependence on the sample mass and the detection process. Derivation and analysis of the full probability distribution and its dependence on the above factors constitute the first part of the thesis. Another possibility of identifying unknown samples lies in the information in the 'fingerprints' (pulse shape distribution) left by a detected neutron or photon. A study of the statistical properties of the interaction of the incoming radiation (neutrons and photons) with the detectors constitutes the second part of the thesis. The interaction between fast neutrons and organic scintillation detectors is derived and compared to Monte Carlo simulations. An experimental approach is also addressed, in which cross-correlation measurements were made using liquid scintillation detectors. First, the dependence of the pulse height distribution on the energy and collision number of an incoming neutron was derived analytically and compared to numerical simulations. Then an algorithm was elaborated which can discriminate neutron pulses from photon pulses. The resulting cross-correlation graphs are analyzed and discussed as to whether they can be used in applications to distinguish possible
A Statistic for the Detection of Long Strings in Microwave Background Maps
Perivolaropoulos, L
1997-01-01
Using analytical methods and Monte Carlo simulations, we analyze a new statistic designed to detect isolated step-like discontinuities which are coherent over large areas of Cosmic Microwave Background (CMB) pixel maps. Such coherent temperature discontinuities are predicted by the Kaiser-Stebbins effect to form due to long cosmic strings within our present horizon. The background of the coherent step-like seed is assumed to be a scale-invariant Gaussian random field which could have been produced by a superposition of seeds on smaller scales and/or by inflationary quantum fluctuations. The effects of uncorrelated Gaussian random noise are also studied. The statistical variable considered is the Sample Mean Difference (SMD) between large neighbouring sectors of CMB maps, separated by a straight line in two-dimensional maps and a point in one-dimensional maps. We find that, including noise, the SMD statistics can detect at the $1\sigma$ to $2\sigma$ level the presence of a long string with $G\mu (v_s \gam...
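In one dimension the SMD statistic reduces to a difference of sector means on either side of a candidate point, which makes a sketch very short. The toy map below is a noiseless step, not a CMB simulation:

```python
from statistics import fmean

def smd(pixels, split):
    """Sample Mean Difference between the two sectors of a 1-D pixel map
    separated at index `split` (the 1-D version of the SMD statistic)."""
    return fmean(pixels[:split]) - fmean(pixels[split:])

def max_smd(pixels):
    """Scan all split points; return (|SMD|, split) for the best candidate
    location of a step-like (Kaiser-Stebbins-type) discontinuity."""
    return max((abs(smd(pixels, s)), s) for s in range(1, len(pixels)))

# A noiseless temperature step of amplitude 1 at pixel 10 is recovered exactly.
step_map = [0.0] * 10 + [1.0] * 10
print(max_smd(step_map))  # -> (1.0, 10)
```

With Gaussian noise or a Gaussian random-field background added, the question becomes at how many sigma the step's SMD stands out, which is what the paper quantifies.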
Chen, Shuo; Kang, Jian; Xing, Yishi; Wang, Guoqing
2015-12-01
Group-level functional connectivity analyses often aim to detect the altered connectivity patterns between subgroups with different clinical or psychological experimental conditions, for example, comparing cases and healthy controls. We present a new statistical method to detect differentially expressed connectivity networks with significantly improved power and lower false-positive rates. The goal of our method was to capture most differentially expressed connections within networks of constrained numbers of brain regions (by the rule of parsimony). By virtue of parsimony, the false-positive individual connectivity edges within a network are effectively reduced, whereas the informative (differentially expressed) edges are allowed to borrow strength from each other to increase the overall power of the network. We develop a test statistic for each network in light of combinatorics graph theory, and provide p-values for the networks (in the weak sense) by using permutation test with multiple-testing adjustment. We validate and compare this new approach with existing methods, including false discovery rate and network-based statistic, via simulation studies and a resting-state functional magnetic resonance imaging case-control study. The results indicate that our method can identify differentially expressed connectivity networks, whereas existing methods are limited.
A statistical study of decaying kink oscillations detected using SDO/AIA
Goddard, C R; Nakariakov, V M; Zimovets, I V
2016-01-01
Despite intensive studies of kink oscillations of coronal loops in the last decade, a large scale statistically significant investigation of the oscillation parameters has not been made using data from the Solar Dynamics Observatory (SDO). We carry out a statistical study of kink oscillations using Extreme Ultra-Violet (EUV) imaging data from a previously compiled catalogue. We analysed 58 kink oscillation events observed by the Atmospheric Imaging Assembly (AIA) onboard SDO during its first four years of operation (2010-2014). Parameters of the oscillations, including the initial apparent amplitude, period, length of the oscillating loop, and damping are studied for 120 individual loop oscillations. Analysis of the initial loop displacement and oscillation amplitude leads to the conclusion that the initial loop displacement prescribes the initial amplitude of oscillation in general. The period is found to scale with the loop length, and a linear fit of the data cloud gives a kink speed of Ck =(1330+/-50) km ...
Ortega-Martinez, Antonio; Padilla-Martinez, Juan Pablo; Franco, Walfre
2016-04-01
The skin contains several fluorescent molecules or fluorophores that serve as markers of structure, function and composition. UV fluorescence excitation photography is a simple and effective way to image specific intrinsic fluorophores, such as the one ascribed to tryptophan, which emits at a wavelength of 345 nm upon excitation at 295 nm and is a marker of cellular proliferation. Earlier, we built a clinical UV photography system to image cellular proliferation. In some samples, the naturally low intensity of the fluorescence can make it difficult to separate the fluorescence of cells in higher proliferation states from background fluorescence and other imaging artifacts, such as electronic noise. In this work, we describe a statistical image segmentation method to separate the fluorescence of interest. Statistical image segmentation is based on image averaging, background subtraction and pixel statistics. This method allows better quantification of the intensity and surface distributions of fluorescence, which in turn simplifies the detection of borders. Using this method we delineated the borders of highly proliferative skin conditions and diseases, in particular allergic contact dermatitis, psoriatic lesions and basal cell carcinoma. Segmented images clearly define lesion borders. UV fluorescence excitation photography along with statistical image segmentation may serve as a quick and simple diagnostic tool for clinicians.
Oberer, R.B.
2002-11-12
The current practice of nondestructive assay (NDA) of fissile materials using neutrons is dominated by the ³He detector. This has been the case since the mid 1980s, when Fission Multiplicity Detection (FMD) was replaced with thermal well counters and neutron multiplicity counting (NMC). The thermal well counters detect neutrons by neutron capture in the ³He detector subsequent to moderation. The process of detection requires from 30 to 60 µs. As will be explained in Section 3.3, the rate of detecting correlated neutrons (signal) from the same fission is independent of this time, but the rate of accidental correlations (noise) is proportional to it. The well counters are therefore at a distinct disadvantage when a large source of uncorrelated neutrons is present, from (α, n) reactions for example. Plastic scintillation detectors, as were used in FMD, require only about 20 ns to detect neutrons from fission. One thousandth as many accidental coincidences are therefore accumulated. The major problem with the use of fast-plastic scintillation detectors, however, is that both neutrons and gamma rays are detected, and the pulses from the two are indistinguishable in these detectors. For this thesis, a new technique was developed to use higher-order time correlation statistics to distinguish combinations of neutron and gamma-ray detections in fast-plastic scintillation detectors. A system of analysis to describe these correlations was developed based on simple physical principles. Other sources of correlations from non-fission events are identified and integrated into the analysis developed for fission events. A number of ratios and metrics are identified to determine physical properties of the source from the correlations. It is possible to determine both the quantity being measured and the detection efficiency from these ratios from a single measurement, without a separate calibration. To account for detector dead-time, an alternative analytical technique
Irshad, Humayun; Roux, Ludovic; Racoceanu, Daniel
2013-01-01
Accurate counting of mitoses in breast cancer histopathology plays a critical role in the grading process. Manual counting of mitoses is tedious and subject to considerable inter- and intra-reader variations. This work aims at improving the accuracy of mitosis detection by selecting the color channels that better capture the statistical and morphological features that discriminate mitoses from other objects. The proposed framework includes a comprehensive analysis of first- and second-order statistical features together with morphological features in selected color channels, and a study on balancing the skewed dataset using the SMOTE method to increase the predictive accuracy of mitosis classification. The proposed framework was evaluated on the MITOS data set during the ICPR 2012 contest and ranked second among 17 finalists, achieving a 74% detection rate, 70% precision and 72% F-measure. In future work, we plan to apply our mitosis detection tool to images produced by different types of slide scanners, including multi-spectral and multi-focal microscopy.
Performance analysis of Wald-statistic based network detection methods for radiation sources
Sen, Satyabrata [ORNL]; Rao, Nageswara S [ORNL]; Wu, Qishi [University of Memphis]; Barry, M. L. [New Jersey Institute of Technology]; Grieme, M. [New Jersey Institute of Technology]; Brooks, Richard R [ORNL]; Cordone, G. [Clemson University]
2016-01-01
There have been increasingly large deployments of radiation detection networks that require computationally fast algorithms to produce prompt results over ad-hoc sub-networks of mobile devices, such as smartphones. These algorithms are in sharp contrast to complex network algorithms that necessitate all measurements to be sent to powerful central servers. In this work, at individual sensors, we employ Wald-statistic based detection algorithms, which are computationally very fast and are implemented as one of three Z-tests and four chi-square tests. At the fusion center, we apply K-out-of-N fusion to combine the sensors' hard decisions. We characterize the performance of the detection methods by deriving analytical expressions for the distributions of the underlying test statistics, and by analyzing the fusion performance in terms of K, N, and the false-alarm rates of individual detectors. We experimentally validate our methods using measurements from indoor and outdoor characterization tests of the Intelligence Radiation Sensors Systems (IRSS) program. In particular, utilizing the outdoor measurements, we construct two important real-life scenarios, boundary surveillance and portal monitoring, and present the results of our algorithms.
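The K-out-of-N fusion rule has a closed binomial form under independent sensors, which makes its effect easy to sketch. The per-sensor detection and false-alarm probabilities below are assumed values for illustration, not IRSS measurements:

```python
from math import comb

def k_out_of_n(prob, k, n):
    """Probability that at least k of n independent sensors fire, each with
    per-sensor firing probability `prob` (detection or false alarm)."""
    return sum(comb(n, m) * prob**m * (1 - prob)**(n - m) for m in range(k, n + 1))

# Fusing 5 sensors with assumed per-sensor Pd = 0.8 and Pfa = 0.05 via a
# 3-out-of-5 rule boosts detection while suppressing false alarms.
pd_fused = k_out_of_n(0.8, 3, 5)
pfa_fused = k_out_of_n(0.05, 3, 5)
print(round(pd_fused, 5), round(pfa_fused, 6))  # -> 0.94208 0.001158
```

Choosing K trades the two error rates against each other: K = 1 (OR rule) maximizes detection but also false alarms, K = N (AND rule) the reverse.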
A space-time permutation scan statistic for disease outbreak detection.
Martin Kulldorff
2005-03-01
BACKGROUND: The ability to detect disease outbreaks early is important in order to minimize morbidity and mortality through timely implementation of disease prevention and control measures. Many national, state, and local health departments are launching disease surveillance systems with daily analyses of hospital emergency department visits, ambulance dispatch calls, or pharmacy sales for which population-at-risk information is unavailable or irrelevant. METHODS AND FINDINGS: We propose a prospective space-time permutation scan statistic for the early detection of disease outbreaks that uses only case numbers, with no need for population-at-risk data. It makes minimal assumptions about the time, geographical location, or size of the outbreak, and it adjusts for natural purely spatial and purely temporal variation. The new method was evaluated using daily analyses of hospital emergency department visits in New York City. Four of the five strongest signals were likely local precursors to citywide outbreaks due to rotavirus, norovirus, and influenza. The number of false signals was at most modest. CONCLUSION: If such results hold up over longer study times and in other locations, the space-time permutation scan statistic will be an important tool for local and national health departments that are setting up early disease detection surveillance systems.
Durmuş, Tahir; Reichelt, Uta; Huppertz, Alexander; Hamm, Bernd; Beyersdorff, Dirk; Franiel, Tobias
2013-01-01
We aimed to investigate the prostate cancer detection rate of magnetic resonance imaging (MRI)-guided biopsy and to elucidate possible relations to the number of prior negative transrectal ultrasonography (TRUS)-guided biopsies. Eighty-seven consecutive patients (mean age, 65.0 years; mean prostate-specific antigen, 13.3 ng/mL) with at least one prior negative TRUS-guided biopsy and persistent suspicion of prostate cancer were included in this study. All patients underwent MRI-guided biopsy after a diagnostic multiparametric MRI examination at 1.5 Tesla. Specimens were immediately fixated and subsequently evaluated by an experienced uropathologist. Prostate cancer detection rates were calculated. Prostate cancer-positive and -negative cores were compared. The correlation between the number of prior biopsies and the presence of prostate cancer was evaluated. Cancer detection rates for patients with one (n=24), two (n=25), three (n=18), and four or more (n=20) negative TRUS-guided biopsies were 29.2%, 40.0%, 66.7%, and 35.0%, respectively (P = 0.087). The median number of removed cores per patient was 3 (range, 1-8), without a significant difference between patients with and without cancer (P = 0.48). Thirty of 36 cancer patients were at intermediate or high risk according to the D'Amico clinical risk score. Eleven of 15 high-risk cancers were localized in the transition zone (P = 0.002). This study demonstrates high cancer detection rates of MRI-guided biopsy, independent of the number of previous TRUS-guided biopsies and the number of prostate cores taken. MRI-guided biopsy therefore represents a less invasive and effective diagnostic tool for patients with suspicion of prostate cancer and previous negative TRUS-guided biopsies.
Krzyżak, A. T.; Jasiński, A.; Adamek, D.
2006-07-01
Qualification of the most statistically "sensitive" diffusion parameters using Magnetic Resonance (MR) Diffusion Tensor Imaging (DTI) of the control and injured spinal cord of a rat, in vivo and in vitro after the trauma, is reported. Injury was induced at the TH12/TH13 level by a controlled "weight drop". In vitro experiments were performed in a home-built MR microscope with a 6.4 T magnet; in vivo samples were measured in a 9.4 T/21 horizontal magnet. The aim of this work was to find the most effective diffusion parameters for statistically significant detection of spinal cord tissue damage. Apparent diffusion tensor (ADT) weighted data measured in vivo and in vitro on control and injured rat spinal cord (RSC) in the transverse planes, together with an analysis of the diffusion anisotropy as a function of many parameters, which allows statistically significant exposure of the existence of the damage, are reported.
Klimek, Peter; Hanel, Rudolf; Thurner, Stefan
2012-01-01
Democratic societies are built around the principle of free and fair elections, where each citizen's vote should count equally. National elections can be regarded as large-scale social experiments, in which people are grouped into usually large numbers of electoral districts and vote according to their preferences. The large number of samples implies certain statistical consequences for the polling results, which can be used to identify election irregularities. Using a suitable data collapse, we find that vote distributions of elections with alleged fraud show a kurtosis a hundred times larger than that of normal elections. As an example, we show that reported irregularities in the 2011 Duma election are indeed well explained by systematic ballot stuffing, and we develop a parametric model quantifying to what extent fraudulent mechanisms are present. We show that if specific statistical properties are present in an election, the results do not represent the will of the people. We formulate a parametric test detecting these stati...
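The kurtosis fingerprint itself is straightforward to compute. The two synthetic "district" distributions below merely illustrate how a small heavy-tailed contamination inflates kurtosis; they are not the election data:

```python
import random

def kurtosis(xs):
    """Pearson kurtosis E[(x - mu)^4] / sigma^4: a Gaussian gives ~3,
    heavy-tailed contamination inflates it by orders of magnitude."""
    n = len(xs)
    mu = sum(xs) / n
    m2 = sum((x - mu) ** 2 for x in xs) / n
    m4 = sum((x - mu) ** 4 for x in xs) / n
    return m4 / m2 ** 2

# Toy "rescaled district turnout" samples: a clean Gaussian set, and one
# where 1% of districts come from a wildly dispersed component, mimicking
# the heavy tails attributed to systematic ballot stuffing.
random.seed(1)
clean = [random.gauss(0, 1) for _ in range(10000)]
stuffed = clean[:9900] + [random.gauss(0, 10) for _ in range(100)]
print(kurtosis(clean) < kurtosis(stuffed))  # -> True
```

Even 1% contamination by a component with ten times the spread pushes the kurtosis far above the Gaussian value of 3, which is why the statistic is so sensitive to stuffing-like tails.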
Mukherjee, Kushal; Gupta, Shalabh; Ray, Asok; Wettergren, Thomas A
2011-06-01
This paper presents a statistical-mechanics-inspired procedure for optimization of the sensor field configuration to detect mobile targets. The key idea is to capture the low-dimensional behavior of the sensor field configurations across the Pareto front in a multiobjective scenario for optimal sensor deployment, where the nondominated points are concentrated within a small region of the large-dimensional decision space. The sensor distribution is constructed using location-dependent energy-like functions and intensive temperature-like parameters in the sense of statistical mechanics. This low-dimensional representation is shown to permit rapid optimization of the sensor field distribution on a high-fidelity simulation test bed of distributed sensor networks.
Madsen, Tobias
2017-01-01
...are used to scale the aforementioned driver detection methods to a dataset consisting of more than 2,000 cancer genomes. The sizes and dimensionalities of genomic data sets, be it a large number of genes or multiple heterogeneous data sources, pose both great statistical opportunities and challenges. ... This distribution can be learned across the entire set of genes and then be used to improve inference at the level of the individual gene. A practical way to implement this insight is using empirical Bayes. This idea is one of the main statistical underpinnings of the present work. The thesis consists of three main manuscripts as well as two supplementary manuscripts. In the first manuscript we explore efficient significance evaluation for models defined with factor graphs. Factor graphs are a class of graphical models encompassing both Bayesian networks and Markov models. We specifically develop a saddle...
Statistical Properties of Solar Active Regions Based on Objective Detection and Characterization
Zhang, Jie
2010-05-01
We present a study of the statistical properties of solar magnetic regions based on objective detection and characterization. The uniformity and consistency of the magnetogram images provided by SOHO/MDI make it an ideal database for automated detection of solar magnetic features. The results of detection are mainly controlled by the following four parameters or thresholds: (1) the magnetic intensity threshold of kernel pixels (to find strong field regions), (2) the erosion size threshold for the morphological opening operation (to remove small patches), (3) the magnetic intensity threshold of AR pixels (to recover the whole size of an AR), and (4) the dilation size threshold for the morphological closing operation (to merge neighboring patches into a whole AR). We find that the best combination of the above four parameters is (1) 250 Gauss, (2) 10 Mm, (3) 50 Gauss, and (4) 10 Mm, which yields a detection of 1772 ARs that is most similar to the NOAA catalog produced by human operators; as a comparison, NOAA/SWPC reports 2281 ARs during the same period. By varying the values of the control parameters, the number of ARs detected can range from as small as 1000 to as large as 10000. With these data, we are now able to make a detailed statistical study of solar active regions, addressing questions such as: (1) how do AR number and emerged magnetic flux vary with the solar cycle? (2) how do AR number and emerged magnetic flux vary with latitude during different phases of the solar cycle? (3) what is the distribution of AR number with respect to size, and is it power-law, Gaussian or log-normal, with what implications for the mechanisms generating ARs? Is there a north-south asymmetry of ARs? How are the strong magnetic patches distributed within an AR? This study provides new insights into the properties and generation of solar active regions.
Harrou, Fouzi
2017-09-18
This study reports the development of an innovative fault detection and diagnosis scheme to monitor the direct current (DC) side of photovoltaic (PV) systems. Towards this end, we propose a statistical approach that exploits the advantages of the one-diode model and those of the univariate and multivariate exponentially weighted moving average (EWMA) charts to better detect faults. Specifically, we generate the array's residuals of current, voltage and power using measured temperature and irradiance. These residuals capture the difference between the measurements and the maximum power point (MPP) predictions of current, voltage and power from the one-diode model, and we use them as fault indicators. Then, we apply the multivariate EWMA (MEWMA) monitoring chart to the residuals to detect faults. However, a MEWMA scheme cannot identify the type of fault. Once a fault is detected by the MEWMA chart, the univariate EWMA chart based on the current and voltage indicators is used to identify the type of fault (e.g., short-circuit, open-circuit and shading faults). We applied this strategy to real data from the grid-connected PV system installed at the Renewable Energy Development Center, Algeria. Results show the capacity of the proposed strategy to monitor the DC side of PV systems and detect partial shading.
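A minimal univariate EWMA chart of the kind applied to the current and voltage residuals can be sketched as follows. The residual series, the fault-free sigma, and the shift size are all idealised stand-ins, not the PV data:

```python
def ewma_chart(residuals, sigma0, lam=0.2, L=3.0):
    """Univariate EWMA chart on model residuals: z_t = lam*r_t + (1-lam)*z_{t-1},
    alarmed when |z_t| exceeds the asymptotic +/- L*sigma_z control limit.
    sigma0 is the residual standard deviation under fault-free operation."""
    limit = L * sigma0 * (lam / (2 - lam)) ** 0.5
    z, alarms = 0.0, []
    for t, r in enumerate(residuals):
        z = lam * r + (1 - lam) * z
        if abs(z) > limit:
            alarms.append(t)
    return alarms

# Idealised residuals: exactly zero while healthy, then a sustained
# +1.5*sigma shift (e.g. a shading fault) from sample 50 onwards.
res = [0.0] * 50 + [1.5] * 50
print(ewma_chart(res, sigma0=1.0)[0])  # -> 54
```

The exponential smoothing is what makes the chart sensitive to small sustained shifts: a 1.5-sigma step that a 3-sigma Shewhart rule would rarely flag is caught within a few samples.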
Detection of microcalcifications in mammograms using error of prediction and statistical measures
Acha, Begoña; Serrano, Carmen; Rangayyan, Rangaraj M.; Leo Desautels, J. E.
2009-01-01
A two-stage method for detecting microcalcifications in mammograms is presented. In the first stage, the candidates for microcalcifications are determined. For this purpose, a 2-D linear prediction error filter is applied, and for those pixels where the prediction error is larger than a threshold, a statistical measure is calculated to determine whether they are candidates for microcalcifications or not. In the second stage, a feature vector is derived for each candidate, and after a classification step using a support vector machine, the final detection is performed. The algorithm is tested with 40 mammographic images, from Screen Test: The Alberta Program for the Early Detection of Breast Cancer, with 50-μm resolution, and the results are evaluated using a free-response receiver operating characteristic curve. Two different analyses are performed: an individual microcalcification detection analysis and a cluster analysis. In the analysis of individual microcalcifications, detection sensitivity values of 0.75 and 0.81 are obtained at 2.6 and 6.2 false positives per image, on average, respectively. The best performance is characterized by a sensitivity of 0.89, a specificity of 0.99, and a positive predictive value of 0.79. In cluster analysis, a sensitivity value of 0.97 is obtained at 1.77 false positives per image, and a value of 0.90 is achieved at 0.94 false positives per image.
Statistical Analysis of Probability of Detection Hit/Miss Data for Small Data Sets
Harding, C. A.; Hugo, G. R.
2003-03-01
This paper examines the validity of statistical methods for determining nondestructive inspection probability of detection (POD) curves from relatively small hit/miss POD data sets. One method published in the literature is shown to be invalid for analysis of POD hit/miss data. Another standard method is shown to be valid only for data sets containing more than 200 observations. An improved method is proposed which allows robust lower 95% confidence limit POD curves to be determined from data sets containing as few as 50 hit/miss observations.
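For context, the standard log-odds POD model that such hit/miss analyses build on can be fit by plain maximum likelihood. The sketch below uses simple gradient ascent on synthetic data; it illustrates the model only, not the paper's improved confidence-limit method:

```python
import math

def fit_pod(sizes, hits, iters=200, lr=0.5):
    """Fit POD(a) = 1 / (1 + exp(-(b0 + b1*ln a))) to hit/miss data by
    gradient ascent on the Bernoulli log-likelihood."""
    xs = [math.log(a) for a in sizes]
    n = len(xs)
    b0, b1 = 0.0, 1.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, hits):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def pod(a, b0, b1):
    """Probability of detection for flaw size a under the fitted model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

# 60 synthetic hit/miss observations: flaws of size 0.5 are mostly missed,
# size 2.0 mostly found, size 1.0 split evenly.
sizes = [0.5] * 20 + [1.0] * 20 + [2.0] * 20
hits = [0] * 18 + [1] * 2 + [0] * 10 + [1] * 10 + [0] * 2 + [1] * 18
b0, b1 = fit_pod(sizes, hits)
print(pod(0.5, b0, b1) < 0.5 < pod(2.0, b0, b1))  # -> True
```

The paper's point is about the lower confidence limit on this curve, which is where small sample sizes bite: the point estimate above can look reasonable long before the 95% limit is trustworthy.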
Bruhvi Poptani
2013-01-01
Background: Microorganisms are the primary causative agents of endodontic infections. Phenotype-based procedures for bacterial identification have certain drawbacks, especially when investigating the microbiota of root-filled teeth. Thus, more sensitive methods like polymerase chain reaction (PCR) can provide results that are more accurate and reliable for the microbial prevalence in root-filled teeth. Aim: In this study, we investigated twenty symptomatic root-filled teeth with chronic apical periodontitis, with or without periradicular lesions, for the prevalence of Enterococcus faecalis and Candida albicans. Materials and Methods: Microbiological samples were taken from the canals immediately after removal of the previous gutta-percha cones using aseptic techniques. After removal of the root canal filling, samples were obtained with paper points placed in the canal. Paper points were transferred to a cryotube containing Tris-EDTA buffer and immediately frozen at −20°C. Results: By PCR amplification of the samples using taxon-specific primers, E. faecalis was found to be the prevalent species, detected in 65% of the cases, and C. albicans was detected in 35% of cases. Conclusion: The results of the study show that geographical influence and dietary factors might play some role in the prevalence of species like C. albicans, and the presence of E. faecalis confirms the assertions of previous culture-dependent and culture-independent approaches to the microbiological survey of root-filled teeth.
Shao, Quanxi; Wang, You-Gan
2009-09-01
Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing the mean values may become statistically inappropriate and even invalid when substantial proportions of the response values are below the detection limits or censored because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need of imputing the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that by the traditional t-test, illustrating the merit of our method.
Geeta Hanji
2016-11-01
Noise reduction is an important area of research in image processing applications. The performance of a digital image noise filtering method primarily depends upon the accuracy of its noise detection scheme. This paper presents an effective detector-based, adaptive-mask median filtration of heavily noised digital images affected by fixed-value (salt and pepper) impulse noise. The proposed filter presents a novel approach: an ameliorated Rank-Ordered Absolute Deviation (ROAD) statistic to judge whether the input pixel is noised or noise-free. If a pixel is detected as corrupted, it is subjected to adaptive-mask median filtration; otherwise, it is kept unchanged. Extensive experimental results and comparative performance evaluations demonstrate that the proposed filter outperforms existing decision-type, median-based filters with powerful noise detectors in terms of objective performance measures and visual restoration accuracy.
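The plain ROAD statistic underlying such detectors (the paper uses an ameliorated variant, not reproduced here) can be sketched directly: a pixel's score is the sum of its m smallest absolute differences to its 3x3 neighbours, with the usual choice m = 4, on a toy image:

```python
def road(img, i, j, m=4):
    """Rank-Ordered Absolute Difference statistic of interior pixel (i, j):
    the sum of the m smallest absolute differences to its 8 neighbours.
    Impulse (salt-and-pepper) pixels disagree with most neighbours and score
    high; pixels on a genuine edge still agree with several and score low."""
    diffs = sorted(
        abs(img[i][j] - img[i + di][j + dj])
        for di in (-1, 0, 1) for dj in (-1, 0, 1)
        if (di, dj) != (0, 0)
    )
    return sum(diffs[:m])

# A flat patch with one "pepper" pixel (0) next to a genuine intensity edge.
img = [
    [100, 100, 100, 200],
    [100,   0, 100, 200],
    [100, 100, 100, 200],
]
print(road(img, 1, 1))  # -> 400  (impulse: all 8 diffs are 100)
print(road(img, 1, 2))  # -> 0    (edge pixel still matches 4 neighbours)
```

Thresholding the ROAD score is what lets the filter leave edges untouched while sending only impulse pixels to the median stage.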
Ziegler, Paul D; Glotzer, Taya V; Daoud, Emile G; Singer, Daniel E; Ezekowitz, Michael D; Hoyt, Robert H; Koehler, Jodi L; Coles, James; Wyse, D George
2012-11-01
The detection of undiagnosed atrial tachycardia/atrial fibrillation (AT/AF) among patients with stroke risk factors could be useful for primary stroke prevention. We analyzed newly detected AT/AF (NDAF) using continuous monitoring in patients with stroke risk factors but without previous stroke or evidence of AT/AF. NDAF (AT/AF >5 minutes on any day) was determined in patients with implantable cardiac rhythm devices and ≥1 stroke risk factors (congestive heart failure, hypertension, age ≥75 years, or diabetes). All devices were capable of continuously monitoring the daily cumulative time in AT/AF. Of 1,368 eligible patients, NDAF was identified in 416 (30%) during a follow-up of 1.1 ± 0.7 years and was unrelated to the CHADS2 score (congestive heart failure, hypertension [blood pressure consistently >140/90 mm Hg or hypertension treated with medication], age ≥75 years, diabetes mellitus, previous stroke or transient ischemic attack). The presence of AT/AF >6 hours on ≥1 day increased significantly with increased CHADS2 scores and was present in 158 (54%) of 294 patients with NDAF and a CHADS2 score of ≥2. NDAF was sporadic, and 78% of patients with a CHADS2 score of ≥2 with NDAF experienced AT/AF on risk patients was 72 days (interquartile range 13 to 177). In conclusion, continuous monitoring identified NDAF in 30% of patients with stroke risk factors. In patients with NDAF, AT/AF occurred sporadically, highlighting the difficulty in detecting paroxysmal AT/AF using traditional monitoring methods. However, AT/AF also persisted for >6 hours on ≥1 days in most patients with NDAF and multiple stroke risk factors. Whether patients with CHADS2 risk factors but without a history of AF might benefit from implantable monitors for the selection and administration of anticoagulation for primary stroke prevention merits additional investigation.
Wang, Ping; Dai, Xin-Gang
2016-09-01
The term "APEC Blue" was coined to describe the clear-sky days during the Asia-Pacific Economic Cooperation (APEC) summit held in Beijing during November 5-11, 2014. The duration of the APEC Blue is detected as November 1 to November 14 (hereafter the Blue Window) by a moving t test. Observations show that APEC Blue corresponds to low air pollution with respect to PM2.5, PM10, SO2, and NO2 under the strict emission-control measures (ECMs) implemented in Beijing and surrounding areas. Quantitative assessment shows that the ECMs were more effective at reducing aerosols than the chemical constituents. Statistical investigation reveals that the window also resulted from intensified wind variability, as well as weakened static stability of the atmosphere (SSA). Wind and the ECMs played the key roles in reducing air pollution during November 1-7 and 11-13, while strict ECMs and weak SSA became dominant during November 7-10 under a weak wind environment. Moving-window correlation shows that emission reduction for aerosols can amplify the apparent wind cleanup effect, leading to significant negative correlations between the two, and that the period-wise changes in emission rate can be identified by multi-scale correlations based on wavelet decomposition. In short, this case study demonstrates statistically how human intervention modified air quality in the mega-city through control of local and surrounding emissions in association with meteorological conditions.
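A moving t test of the kind used to delimit the Blue Window compares the means of the sub-series before and after each candidate change point; a minimal sketch (window length and synthetic data are illustrative, not the paper's configuration):

```python
import numpy as np
from scipy import stats

def moving_t_test(x, half_window):
    """Two-sample t statistic comparing the half_window points before and
    after each candidate change point in the series x."""
    t_vals = np.full(len(x), np.nan)
    for i in range(half_window, len(x) - half_window):
        before = x[i - half_window:i]
        after = x[i:i + half_window]
        t_vals[i], _ = stats.ttest_ind(before, after, equal_var=False)
    return t_vals

# synthetic daily-pollution-like series with a mean shift at index 50
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(10, 1, 50), rng.normal(5, 1, 50)])
t = moving_t_test(x, half_window=10)
cp = int(np.nanargmax(np.abs(t)))   # strongest shift, near the true change point
```

Significant |t| values then mark the boundaries of an abrupt regime change such as the onset of the Blue Window.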
Hoell, Simon; Omenzetter, Piotr
2017-04-01
The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples that represent the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and for principal component analysis scores from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments on a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring of WTBs.
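Partial autocorrelation coefficients of the kind used above as damage-sensitive features can be estimated from sample autocorrelations via the Levinson-Durbin recursion; a minimal NumPy sketch (an illustration of the statistic, not the authors' implementation):

```python
import numpy as np

def pacf_levinson(x, nlags):
    """Partial autocorrelation coefficients via the Levinson-Durbin
    recursion on biased sample autocorrelations."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(nlags + 1)])
    r /= r[0]                              # normalized autocorrelations
    pacc = np.zeros(nlags)
    phi = np.zeros((nlags + 1, nlags + 1))
    phi[1, 1] = pacc[0] = r[1]
    for k in range(2, nlags + 1):
        num = r[k] - np.dot(phi[k - 1, 1:k], r[1:k][::-1])
        den = 1.0 - np.dot(phi[k - 1, 1:k], r[1:k])
        phi[k, k] = num / den              # k-th partial autocorrelation
        for j in range(1, k):
            phi[k, j] = phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
        pacc[k - 1] = phi[k, k]
    return pacc

# sanity check on an AR(1) process: only the first PACC should be large
rng = np.random.default_rng(1)
e = rng.normal(size=5000)
y = np.zeros(5000)
for t in range(1, 5000):
    y[t] = 0.7 * y[t - 1] + e[t]
p = pacf_levinson(y, nlags=5)
```

Changes in such coefficients between a reference state and the current state are what the classifier above exploits.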
Using statistical distances to detect changes in the normal behavior of ECG-Holter signals
Bastos de Figueiredo, Julio C.; Furuie, Sergio S.
2001-05-01
One of the main problems in the study of complex systems is to define a good metric that can distinguish between different dynamical behaviors of a nonlinear system. In this work we describe a method to detect different types of behavior in a long-term ECG-Holter recording using short portions of the Holter signal. The method is based on the calculation of the statistical distance between two distributions in the phase space of a dynamical system. A short portion of an ECG-Holter signal with normal behavior is used to reconstruct the trajectory of an attractor in a low-dimensional phase space. The points of this trajectory are interpreted as a statistical distribution in the phase space and assumed to represent the normal dynamical behavior of the ECG recording in this space. A fast algorithm is then used to compute the statistical distance between this attractor and all other attractors built using a sliding temporal window over the signal. For normal cases the distance stayed almost constant and below a threshold. For cases with abnormal transients, the distance over the abnormal portion of the ECG increased consistently with the morphological changes.
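A toy sketch of the idea: reconstruct the trajectory by time-delay embedding, model the point cloud as a coarse histogram distribution, and compare segments with a statistical distance. The Bhattacharyya distance, embedding parameters, and synthetic "ECG" below are illustrative stand-ins, not the paper's choices:

```python
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Reconstruct a phase-space trajectory by time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def bhattacharyya(p, q):
    """Bhattacharyya distance between two discrete distributions."""
    return -np.log(np.sum(np.sqrt(p * q)) + 1e-12)

def phase_space_distance(ref, seg, bins=8):
    """Distance between the phase-space point distributions of a reference
    segment and a test segment, using a shared coarse histogram grid."""
    lo = min(ref.min(), seg.min())
    hi = max(ref.max(), seg.max())
    edges = np.linspace(lo, hi, bins + 1)
    h_ref, _ = np.histogramdd(delay_embed(ref), bins=[edges] * 3)
    h_seg, _ = np.histogramdd(delay_embed(seg), bins=[edges] * 3)
    return bhattacharyya(h_ref.ravel() / h_ref.sum(),
                         h_seg.ravel() / h_seg.sum())

t = np.linspace(0, 40 * np.pi, 4000)
normal = np.sin(t[:2000])                  # "normal" rhythm
abnormal = np.sin(3 * t[2000:]) + 0.3      # changed morphology
d_same = phase_space_distance(normal[:1000], normal[1000:])
d_diff = phase_space_distance(normal[:1000], abnormal[:1000])
```

Sliding the test segment over the recording and thresholding the distance reproduces the detection scheme described above.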
Ugo Bastolla
2014-03-01
The properties of biomolecules depend both on physics and on the evolutionary process that formed them. These two points of view produce a powerful synergism. Physics sets the stage and the constraints that molecular evolution has to obey, and evolutionary theory helps in rationalizing the physical properties of biomolecules, including protein folding thermodynamics. To complete the parallelism, protein thermodynamics is founded on statistical mechanics in the space of protein structures, and molecular evolution can be viewed as statistical mechanics in the space of protein sequences. In this review, we integrate both points of view, applying them to detecting selection on the stability of the folded state of proteins. We start by discussing positive design, which strengthens the stability of the folded against the unfolded state of proteins. Positive design justifies why statistical potentials for protein folding can be obtained from the frequencies of structural motifs. Stability against unfolding is easier to achieve for longer proteins. On the contrary, negative design, which consists in destabilizing frequently formed misfolded conformations, is more difficult to achieve for longer proteins. The folding rate can be enhanced by strengthening short-range native interactions, but this requirement contrasts with negative design, and evolution has to trade off between them. Finally, selection can accelerate functional movements by favoring low-frequency normal modes of the dynamics of the native state that strongly correlate with the functional conformation change.
Perles, Stephanie J.; Wagner, Tyler; Irwin, Brian J.; Manning, Douglas R.; Callahan, Kristina K.; Marshall, Matthew R.
2014-01-01
Forests are socioeconomically and ecologically important ecosystems that are exposed to a variety of natural and anthropogenic stressors. As such, monitoring forest condition and detecting temporal changes therein remain critical to sound public and private forestland management. The National Park Service's Vital Signs monitoring program collects information on many forest health indicators, including species richness, cover by exotics, browse pressure, and forest regeneration. We applied a mixed-model approach to partition variability in data for 30 forest health indicators collected from several national parks in the eastern United States. We then used the estimated variance components in a simulation model to evaluate trend detection capabilities for each indicator. We investigated the extent to which the following factors affected ability to detect trends: (a) sample design: using simple panel versus connected panel design, (b) effect size: increasing trend magnitude, (c) sample size: varying the number of plots sampled each year, and (d) stratified sampling: post-stratifying plots into vegetation domains. Statistical power varied among indicators; however, indicators that measured the proportion of a total yielded higher power when compared to indicators that measured absolute or average values. In addition, the total variability for an indicator appeared to influence power to detect temporal trends more than how total variance was partitioned among spatial and temporal sources. Based on these analyses and the monitoring objectives of the Vital Signs program, the current sampling design is likely overly intensive for detecting a 5 % trend·year−1 for all indicators and is appropriate for detecting a 1 % trend·year−1 in most indicators.
Gofford, Jason; Reeves, James N.; Tombesi, Francesco; Braito, Valentina; Turner, T. Jane; Miller, Lance; Cappi, Massimo
2013-01-01
We present the results of a new spectroscopic study of Fe K-band absorption in active galactic nuclei (AGN). Using data obtained from the Suzaku public archive we have performed a statistically driven blind search for Fe XXV Heα and/or Fe XXVI Lyα absorption lines in a large sample of 51 Type 1.0-1.9 AGN. Through extensive Monte Carlo simulations we find that statistically significant absorption is detected at E ≳ 6.7 keV in 20/51 sources at the P_MC ≥ 95 per cent level, which corresponds to approximately 40 per cent of the total sample. In all cases, individual absorption lines are detected independently and simultaneously amongst the two (or three) available X-ray imaging spectrometer detectors, which confirms the robustness of the line detections. The most frequently observed outflow phenomenology consists of two discrete absorption troughs corresponding to Fe XXV Heα and Fe XXVI Lyα at a common velocity shift. From xstar fitting, the mean column density and ionization parameter for the Fe K absorption components are log(N_H/cm^-2) ≈ 23 and log(ξ/erg cm s^-1) ≈ 4.5, respectively. Measured outflow velocities span a continuous range from <1,500 km s^-1 up to ~100,000 km s^-1, with mean and median values of ~0.1c and ~0.056c, respectively. The results of this work are consistent with those recently obtained using XMM-Newton and independently provide strong evidence for the existence of very highly ionized circumnuclear material in a significant fraction of both radio-quiet and radio-loud AGN in the local universe.
Greenberg, Ariela Caren
Differential item functioning (DIF) and differential distractor functioning (DDF) are methods used to screen for item bias (Camilli & Shepard, 1994; Penfield, 2008). Using an applied empirical example, this mixed-methods study examined the congruency and relationship of DIF and DDF methods in screening multiple-choice items. Data for Study I were drawn from item responses of 271 female and 236 male low-income children on a preschool science assessment. Item analyses employed a common statistical approach, the Mantel-Haenszel log-odds ratio (MH-LOR), to detect DIF in dichotomously scored items (Holland & Thayer, 1988), and extended the approach to identify DDF (Penfield, 2008). Findings demonstrated that using MH-LOR to detect DIF and DDF supported the theoretical relationship that the magnitude and form of DIF are dependent on the DDF effects, and demonstrated the advantages of studying DIF and DDF together in multiple-choice items. A total of 4 items with DIF and DDF and 5 items with only DDF were detected. Study II incorporated an item content review, an important but often overlooked and under-published step of DIF and DDF studies (Camilli & Shepard, 1994). Interviews with 25 female and 22 male low-income preschool children and an expert review helped to interpret the DIF and DDF results and their comparison, and determined that a content review process of studied items can reveal reasons for potential item bias that are often congruent with the statistical results. Patterns emerged and are discussed in detail. The quantitative and qualitative analyses were conducted in an applied framework of examining the validity of the preschool science assessment scores for evaluating science programs serving low-income children; however, the techniques can be generalized for use with measures across various disciplines of research.
Conradsen, Knut; Nielsen, Allan Aasbjerg; Schou, Jesper;
2003-01-01
[…] Based on this distribution, a test statistic for equality of two such matrices and an associated asymptotic probability for obtaining a smaller value of the test statistic are derived and applied successfully to change detection in polarimetric SAR data. In a case study, EMISAR L-band data from April 17 […] When restricted to HH, VV, or HV data alone, the derived test statistic reduces to the well-known gamma likelihood-ratio test statistic. The derived test statistic and the associated significance value can also be applied as a line or edge detector in fully polarimetric SAR data.
Fujimoto, K.; Yanagisawa, T.; Uetsuhara, M.
Automated detection and tracking of faint objects in optical, or bearing-only, sensor imagery is a topic of immense interest in space surveillance. Robust methods in this realm will lead to better space situational awareness (SSA) while reducing the cost of sensors and optics. They are especially relevant in the search for high area-to-mass ratio (HAMR) objects, as their apparent brightness can change significantly over time. A track-before-detect (TBD) approach has been shown to be suitable for faint, low signal-to-noise ratio (SNR) images of resident space objects (RSOs). TBD does not rely upon the extraction of feature points within the image based on some thresholding criteria, but rather directly takes as input the intensity information from the image file. Not only is all of the available information from the image used, TBD avoids the computational intractability of the conventional feature-based line detection (i.e., "string of pearls") approach to track detection for low SNR data. Implementation of TBD rooted in finite set statistics (FISST) theory has been proposed recently by Vo et al. Compared to other TBD methods applied so far to SSA, such as the stacking method or multi-pass multi-period denoising, the FISST approach is statistically rigorous and has been shown to be more computationally efficient, thus paving the path toward on-line processing. In this paper, we intend to apply a multi-Bernoulli filter to actual CCD imagery of RSOs. The multi-Bernoulli filter can explicitly account for the birth and death of multiple targets in a measurement arc. TBD is achieved via a sequential Monte Carlo implementation. Preliminary results with simulated single-target data indicate that a Bernoulli filter can successfully track and detect objects with measurement SNR as low as 2.4. Although the advent of fast-cadence scientific CMOS sensors has made the automation of faint object detection a realistic goal, it is nonetheless a difficult goal, as measurements […]
Towards spatial localisation of harmful algal blooms; statistics-based spatial anomaly detection
Shutler, J. D.; Grant, M. G.; Miller, P. I.
2005-10-01
Harmful algal blooms are believed to be increasing in occurrence, and their toxins can be concentrated by filter-feeding shellfish and cause amnesia or paralysis when ingested. As a result, fisheries and beaches in the vicinity of blooms may need to be closed and the local population informed. For such avoidance planning, timely information on the existence of a bloom and its species, together with an accurate map of its extent, is needed. Current research to detect these blooms from space has mainly concentrated on spectral approaches to determining species. We present a novel statistics-based background-subtraction technique that produces improved descriptions of an anomaly's extent from remotely sensed ocean colour data. This is achieved by extracting bulk information from a background model; this is complemented by a computer-vision ramp-filtering technique to specifically detect the perimeter of the anomaly. The complete extraction technique uses temporal-variance estimates which control the subtraction of the scene of interest from the time-weighted background estimate, producing confidence maps of anomaly extent. Through the variance estimates the method learns the noise present in the data sequence, providing robustness and allowing generic application. Further, the use of the median for the background model reduces the effects of anomalies that appear within the time sequence used to generate it, allowing seasonal variations in the background levels to be closely followed. To illustrate its application, the detection algorithm has been applied to two spectrally different oceanic regions.
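A minimal sketch of the statistics-based background-subtraction idea (a median background model with a variance-scaled difference producing a confidence map); the ramp filter and time weighting are omitted, and all names and thresholds are illustrative:

```python
import numpy as np

def anomaly_confidence(scene, history):
    """Confidence map of anomaly extent: subtract a median background
    model and scale by the learned per-pixel temporal variability."""
    background = np.median(history, axis=0)   # median is robust to past anomalies
    sigma = np.std(history, axis=0) + 1e-6    # per-pixel noise estimate
    return (scene - background) / sigma       # in units of local std devs

rng = np.random.default_rng(2)
history = rng.normal(1.0, 0.1, size=(12, 64, 64))  # 12 past ocean-colour images
scene = rng.normal(1.0, 0.1, size=(64, 64))
scene[20:30, 20:30] += 1.0                         # synthetic "bloom" patch
conf = anomaly_confidence(scene, history)
mask = conf > 4.0                                  # thresholded anomaly extent
```

Because the per-pixel variance is estimated from the image sequence itself, the same code can in principle be applied to any region without retuning.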
Wagner, Tyler; Irwin, Brian J.; Bence, James R.; Hayes, Daniel B.
2016-01-01
Monitoring to detect temporal trends in biological and habitat indices is a critical component of fisheries management. Thus, it is important that management objectives are linked to monitoring objectives. This linkage requires a definition of what constitutes a management-relevant “temporal trend.” It is also important to develop expectations for the amount of time required to detect a trend (i.e., statistical power) and for choosing an appropriate statistical model for analysis. We provide an overview of temporal trends commonly encountered in fisheries management, review published studies that evaluated statistical power of long-term trend detection, and illustrate dynamic linear models in a Bayesian context, as an additional analytical approach focused on shorter term change. We show that monitoring programs generally have low statistical power for detecting linear temporal trends and argue that often management should be focused on different definitions of trends, some of which can be better addressed by alternative analytical approaches.
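Statistical power for trend detection is commonly estimated by simulation; the sketch below uses a simple slope t-test on a percent-per-year trend with added noise (an illustration of the concept, not the mixed-model or Bayesian machinery referenced above):

```python
import numpy as np
from scipy import stats

def trend_power(n_years, trend_pct, sd, n_sim=2000, alpha=0.05, seed=3):
    """Monte Carlo power: fraction of simulated series in which a linear
    regression detects a significant slope."""
    rng = np.random.default_rng(seed)
    years = np.arange(n_years)
    detected = 0
    for _ in range(n_sim):
        y = 100 * (1 + trend_pct / 100) ** years + rng.normal(0, sd, n_years)
        if stats.linregress(years, y).pvalue < alpha:
            detected += 1
    return detected / n_sim

p_small = trend_power(10, trend_pct=1, sd=10)  # 1 %/yr trend, noisy index
p_large = trend_power(10, trend_pct=5, sd=10)  # 5 %/yr trend
```

The contrast between the two power estimates mirrors the review's point that monitoring programs often have low power for small linear trends.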
Picot, Antoine; Obeid, Ziad; Régnier, Jérémi; Poignant, Sylvain; Darnis, Olivier; Maussion, Pascal
2014-01-01
In this paper, an original method for bearing fault detection in high-speed synchronous machines is presented. This method is based on the statistical processing of Welch's periodogram of the stator currents in order to obtain stable and normalized fault indicators. The principle of the method is to statistically compare the current spectrum to a healthy reference so as to quantify the changes over time. A statistic-based indicator is then constructed by monitoring sp...
Statistical analysis of monochromatic whistler waves near the Moon detected by Kaguya
Y. Katoh
2011-05-01
Observations are presented of monochromatic whistler waves near the Moon detected by the Lunar Magnetometer (LMAG) on board Kaguya. The waves were observed as narrowband magnetic fluctuations with frequencies close to 1 Hz, and were mostly left-hand polarized in the spacecraft frame. We performed a statistical analysis of the waves to identify the distributions of their intensity and occurrence. The results indicate that the waves were generated by the solar wind interaction with lunar crustal magnetic anomalies. The conditions for observation of the waves strongly depend on the solar zenith angle (SZA), and a high occurrence rate is found in the region of SZA between 40° and 90°, with remarkable north-south and dawn-dusk asymmetries. We suggest that ion beams reflected by the lunar magnetic anomalies are a possible source of the waves.
Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy
2016-02-01
Because the number and diversity of genetically modified (GM) crops has significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops.
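At the core of any such framework is a coverage calculation. Assuming reads are drawn independently, the probability of seeing a target sequence at least once in N reads is 1 - (1 - f)^N for a target making up fraction f of the sequenced material; inverting this gives the read count needed for a chosen confidence (a simplified illustration, not the paper's full model):

```python
import math

def reads_needed(target_fraction, confidence=0.999):
    """Smallest N with 1 - (1 - f)^N >= confidence, i.e. the number of
    reads needed to observe the target at least once."""
    return math.ceil(math.log(1 - confidence)
                     / math.log(1 - target_fraction))

# e.g. a transgene region contributing 0.01 % of the reads in a mixture
n = reads_needed(1e-4)
```

The real framework additionally has to account for genome size, read length, and the requirement to span the transgene-host junction, all of which push the required depth higher.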
Schmidt, Tobias M.; Worseck, Gabor; Hennawi, Joseph F.; Prochaska, J. Xavier; Crighton, Neil H. M.
2017-09-01
The He ii transverse proximity effect—enhanced He ii {Ly}α transmission in a background sightline caused by the ionizing radiation of a foreground quasar—offers a unique opportunity to probe the morphology of quasar-driven He ii reionization. We conduct a comprehensive spectroscopic survey to find z∼ 3 quasars in the foreground of 22 background quasar sightlines with Hubble Space Telescope/COS He ii {Ly}α transmission spectra. With our two-tiered survey strategy, consisting of a deep pencil-beam survey and a shallow wide-field survey, we discover 131 new quasars, which we complement with known SDSS/BOSS quasars in our fields. Using a restricted sample of 66 foreground quasars with inferred He ii photoionization rates greater than the expected UV background at these redshifts ({{{Γ }}}{QSO}{He {{II}}}> 5× {10}-16 {{{s}}}-1) we perform the first statistical analysis of the He ii transverse proximity effect. Our results show qualitative evidence for a large object-to-object variance: among the four foreground quasars with the highest {{{Γ }}}{QSO}{He {{II}}} only one (previously known) quasar is associated with a significant He ii transmission spike. We perform a stacking analysis to average down these fluctuations, and detect an excess in the average He ii transmission near the foreground quasars at 3σ significance. This statistical evidence for the transverse proximity effect is corroborated by a clear dependence of the signal strength on {{{Γ }}}{QSO}{He {{II}}}. Our detection places a purely geometrical lower limit on the quasar lifetime of {t}{{Q}}> 25 {Myr}. Improved modeling would additionally constrain quasar obscuration and the mean free path of He ii-ionizing photons.
Frome, EL
2005-09-20
Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open-source) software that serves as the basis for implementing the methods in this paper.
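The maximum likelihood idea for left-censored lognormal data is compact: detected values contribute the normal density of their logs, and non-detects contribute the probability of falling below the detection limit. A SciPy-based sketch (illustrative; the report itself uses R):

```python
import numpy as np
from scipy import stats, optimize

def censored_lognormal_mle(values, detected):
    """MLE of (mu, sigma) on the log scale for left-censored lognormal data.
    `values` holds measurements or, for non-detects, the detection limit;
    `detected` flags actual measurements."""
    logs = np.log(values)

    def neg_loglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)          # keeps sigma positive
        ll = stats.norm.logpdf(logs[detected], mu, sigma).sum()
        ll += stats.norm.logcdf(logs[~detected], mu, sigma).sum()  # P(X < DL)
        return -ll

    res = optimize.minimize(neg_loglik, x0=[logs.mean(), 0.0])
    return res.x[0], np.exp(res.x[1])

rng = np.random.default_rng(4)
true = rng.lognormal(mean=1.0, sigma=0.5, size=500)
dl = 2.0                                   # detection limit (~27% censored here)
detected = true >= dl
values = np.where(detected, true, dl)      # non-detects recorded at the DL
mu_hat, sigma_hat = censored_lognormal_mle(values, detected)
```

From (mu, sigma), the mean exposure, exceedance fraction, and upper percentiles follow by the usual lognormal formulas.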
Hébert-Dufresne, Laurent; Grochow, Joshua A.; Allard, Antoine
2016-08-01
We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
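The onion decomposition itself is a small modification of the standard k-core peeling algorithm: vertices are removed in batches of current minimum degree, and each batch forms one layer. An illustrative pure-Python sketch (the graph and variable names are made up for the example):

```python
def onion_decomposition(adj):
    """Onion layers refining the k-core decomposition: at each stage, remove
    every vertex whose current degree is at most the current core value;
    each removal batch is one layer. `adj` maps vertices to neighbour sets."""
    adj = {v: set(nb) for v, nb in adj.items()}   # work on a copy
    layers, cores = {}, {}
    layer, core = 0, 0
    while adj:
        core = max(core, min(len(nb) for nb in adj.values()))
        layer += 1
        batch = {v for v, nb in adj.items() if len(nb) <= core}
        for v in batch:
            layers[v], cores[v] = layer, core
        for v in batch:                            # delete batch and its edges
            for u in adj[v]:
                if u not in batch:
                    adj[u].discard(v)
            del adj[v]
    return layers, cores

# small test graph: a 4-clique (vertices 0-3) with a pendant chain 3-4-5
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4), (4, 5)]
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)
layers, cores = onion_decomposition(adj)
```

The chain vertices peel off in two successive layers of the 1-core, while the clique survives as a deeper layer, which is exactly the extra within-core resolution the onion spectrum provides.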
Zechlin, Hannes-S; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea
2015-01-01
The source-count distribution of gamma-ray sources as a function of their flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi-LAT photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b|>30 deg) between 1 GeV and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into: (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6-year Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so-far-undetected point sources can be consistently described with a power law of index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2×10^-11 cm^-2 s^-1, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law […]
Statistical detection of fraud in the reporting of Croatian public companies
Siniša Slijepčević
2014-03-01
Statistical methods based on Benford's distribution and Z- and χ²-statistics have been successfully applied to detect likely accounting and reporting fraud, for example in the daily work of the Internal Revenue Service in the USA and in historical analysis of Greek macroeconomic reporting. We adapt and apply the methodology to the analysis of the reporting of some leading Croatian public companies. We find indications of reporting fraud in several of the companies analyzed. In particular, we find a correlation between the likelihood of reporting fraud, measured as a deviation from Benford's law, and reported net income losses, for companies that are large enough (with revenue of at least 1 billion kuna). Finally, we suggest application of the methodology to improve the internal processes, efficiency and effectiveness of the State Auditing Office. Data availability: The data used in the study are corporate data in the public domain. For legal reasons, however, the identities of the companies are disguised. Contact the first author for the sanitized data sets that can be used to verify and replicate the analysis.
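The underlying test is easy to illustrate: compare observed first-significant-digit frequencies with Benford's logarithmic distribution using a χ² statistic. The string-based digit extraction below is a simplification, and the synthetic data only illustrate the contrast between conforming and non-conforming figures:

```python
import math
from collections import Counter
import numpy as np

def benford_chi2(amounts):
    """Chi-square statistic of first significant digits against Benford's
    law, P(d) = log10(1 + 1/d); large values flag deviating figures."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    n = len(digits)
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)
        chi2 += (counts.get(d, 0) - expected) ** 2 / expected
    return chi2

rng = np.random.default_rng(5)
benford_like = 10 ** rng.uniform(0, 3, size=2000)   # log-uniform -> Benford digits
uniform_first = rng.integers(1000, 10000, size=2000)  # roughly uniform digits
c_ok = benford_chi2(benford_like)
c_bad = benford_chi2(uniform_first)
```

With 8 degrees of freedom, a χ² value far above the critical threshold (about 15.5 at the 5% level) is the kind of deviation the paper treats as an indication of possible reporting fraud.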
Detection of seizure and epilepsy using higher order statistics in the EMD domain.
Alam, S M Shafiul; Bhuiyan, M I H
2013-03-01
In this paper, a method using higher order statistical moments of EEG signals calculated in the empirical mode decomposition (EMD) domain is proposed for detecting seizure and epilepsy. The appropriateness of these moments in distinguishing the EEG signals is investigated through an extensive analysis in the EMD domain. An artificial neural network is employed as the classifier of the EEG signals wherein these moments are used as features. The performance of the proposed method is studied using a publicly available benchmark database for various classification cases that include healthy, interictal (seizure-free interval) and ictal (seizure), healthy and seizure, nonseizure and seizure, and interictal and ictal, and compared with that of several recent methods based on time-frequency analysis and statistical moments. It is shown that the proposed method can provide, in almost all the cases, 100% accuracy, sensitivity, and specificity, especially in the case of discriminating seizure activities from the nonseizure ones for patients with epilepsy while being much faster as compared to the time-frequency analysis-based techniques.
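The discriminative power of higher-order moments is easy to illustrate. The sketch below computes variance, skewness and kurtosis on a raw synthetic signal; in the method above these would be computed per intrinsic mode function after EMD, which is omitted here:

```python
import numpy as np
from scipy import stats

def moment_features(segment):
    """Feature vector of higher-order statistical moments of a signal
    segment: variance, skewness, excess kurtosis."""
    return np.array([np.var(segment),
                     stats.skew(segment),
                     stats.kurtosis(segment)])

rng = np.random.default_rng(6)
background = rng.normal(0, 1, 4096)                # EEG-like baseline
# sparse large transients, a crude stand-in for spiky ictal activity
spiky = background + (rng.random(4096) < 0.01) * rng.normal(8, 1, 4096)
f_bg = moment_features(background)
f_sp = moment_features(spiky)
```

Spiky activity inflates the skewness and kurtosis relative to the baseline, which is why such moments work well as inputs to the neural-network classifier described above.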
A study on quality control of the pre-analytical process of apheresis platelet testing
杨图深; 朱业华; 陈亦明
2014-01-01
Objective: To study the quality control of the pre-analytical process of apheresis platelet testing. Methods: 100 samples each of single-dose and double-dose apheresis platelets were collected and diluted 1:1, 1:3 and 1:7, and platelet counts were measured with an automatic blood cell counter. Another 100 apheresis platelet samples were collected and left standing at room temperature for 0, 30, 60, 90 and 120 min; after 1:3 dilution, their platelet counts were measured. Results: Differences in platelet counts between the 1:1 and 1:3 dilutions and between the 1:3 and 1:7 dilutions of single-dose or double-dose apheresis platelets were statistically significant (P<0.05). Differences between counts measured after standing for 0 or 30 min and after standing for 60, 90 or 120 min were also statistically significant (P<0.05). Conclusion: Quality control of the pre-analytical process of apheresis platelet testing is very important for the platelet count after collection.
Statistical modeling, detection, and segmentation of stains in digitized fabric images
Gururajan, Arunkumar; Sari-Sarraf, Hamed; Hequet, Eric F.
2007-02-01
This paper describes a novel and automated system, based on a computer vision approach, for objective evaluation of stain release on cotton fabrics. Digitized color images of the stained fabrics are obtained, and the pixel values in the color and intensity planes of these images are probabilistically modeled as a Gaussian Mixture Model (GMM). Stain detection is posed as a decision-theoretic problem, where the null hypothesis corresponds to the absence of a stain. The null hypothesis and the alternate hypothesis translate mathematically into a first-order GMM and a second-order GMM, respectively. The parameters of the GMM are estimated using a modified Expectation-Maximization (EM) algorithm. Minimum Description Length (MDL) is then used as the test statistic to decide the validity of the null hypothesis. The stain is then segmented by a decision rule based on the probability map generated by the EM algorithm. The proposed approach was tested on a dataset of 48 fabric images soiled with stains of ketchup, corn oil, mustard, Ragu sauce, Revlon makeup and grape juice. The decision-theoretic part of the algorithm produced a correct detection rate (true positive) of 93% and a false alarm rate of 5% on this set of images.
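The MDL-based hypothesis test above can be sketched in a few lines. Assuming the one- and two-component GMM log-likelihoods have already been obtained from the EM fits, MDL penalizes each model by half its parameter count times log n, and a stain is declared when the two-component model yields the shorter description length. Function names and this simple penalty form are illustrative assumptions, not the paper's exact criterion.

```python
import math

def mdl(log_likelihood, n_params, n_samples):
    # Minimum Description Length: negative log-likelihood plus complexity penalty
    return -log_likelihood + 0.5 * n_params * math.log(n_samples)

def stain_detected(ll_1comp, k1, ll_2comp, k2, n):
    # Reject the null (no stain) when the 2-component GMM describes the data more compactly
    return mdl(ll_2comp, k2, n) < mdl(ll_1comp, k1, n)
```

A large likelihood gain must outweigh the extra parameters of the second component before a stain is reported.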
Zakaria, Chahnez; Curé, Olivier; Salzano, Gabriella; Smaïli, Kamel
In Computer Supported Cooperative Work (CSCW), it is crucial for project leaders to detect conflicting situations as early as possible. Generally, this task is performed manually by studying a set of documents exchanged between team members. In this paper, we propose a full-fledged automatic solution that identifies documents, subjects and actors involved in relational conflicts. Our approach detects conflicts in emails, probably the most popular type of documents in CSCW, but the methods used can handle other text-based documents. These methods rely on the combination of statistical and ontological operations. The proposed solution is decomposed into several steps: (i) we enrich a simple negative-emotion ontology with terms occurring in the corpus of emails, (ii) we categorize each conflicting email according to the concepts of this ontology and (iii) we identify emails, subjects and team members involved in conflicting emails using possibilistic description logic and a set of proposed measures. Each of these steps is evaluated and validated on concrete examples. Moreover, this approach's framework is generic and can be easily adapted to domains other than conflicts, e.g. security issues, and extended with operations making use of our proposed set of measures.
Bauermann, F V; Falkenberg, S M; Ridpath, J F
2016-09-11
The ability of ruminant pestiviruses, including bovine viral diarrhoea virus (BVDV) and the related emerging pestivirus, HoBi-like virus, to establish persistent infection (PI) following foetal infection is central to keeping these viruses in circulation. Non-PI dams carrying BVDV PI calves develop high levels of immunity due to constant viral exposure. A study was performed to determine whether the immunity developed following the generation of a BVDV PI is sufficient to prevent HoBi-like virus infection of a subsequent foetus. This study consisted of nine pregnant cows: four had birthed BVDV-1 PI calves in a previous pregnancy, three had birthed BVDV-2 PIs and two had birthed pestivirus-negative calves. From this, six pregnant cows were challenged with HoBi-like virus at about day 85 of gestation (four BVDV-1 and two BVDV-2 cows), and three cows were left non-challenged (negative controls). On the day of challenge, the serum neutralizing titres against the homologous BVDV strains of the first inoculation ranged from 1148 to 5793. At day 6 post-challenge, HoBi-like RNA was detected in the serum of all four BVDV-1 cows but not in the two BVDV-2 cows. The foetuses harvested from five of the exposed dams (three BVDV-1 and two BVDV-2 cows) at day 30 post-challenge were positive for HoBi-like virus RNA. The sixth cow, BVDV-1 cow #541, while pregnant at the time of exposure, had no foetus 30 days after exposure. Foetuses from HoBi-like virus exposed dams were significantly smaller and lighter than control foetuses. HoBi-like RNA was detected in samples of all challenged foetuses. The identification of viral RNA in the serum of 4 cows at day 6 post-challenge, as well as viral RNA detection in all foetuses 30 days post-inoculation, indicates that the foetuses of dams with high antibody titres against BVDV-1 or BVDV-2 would not be protected from challenge with a HoBi-like virus.
Ceballos, Melisa Rodas; García-Tenorio, Rafael; Estela, José Manuel; Cerdà, Víctor; Ferrer, Laura
2017-12-01
Leached fractions of U and Th from different environmental solid matrices were evaluated by an automatic system enabling the on-line lixiviation and extraction/pre-concentration of these two elements prior to ICP-MS detection. UTEVA resin was used as the selective extraction material. Ten leached fractions, using artificial rainwater (pH 5.4) as the leaching agent, and a residual fraction were analyzed for each sample, allowing study of the behavior of U and Th under dynamic lixiviation conditions. Multivariate techniques were employed for the efficient optimization of the independent variables that affect the lixiviation process. The system reached LODs of 0.1 and 0.7 ng kg(-1) for U and Th, respectively. The method was satisfactorily validated for three solid matrices by the analysis of a soil reference material (IAEA-375), a certified sediment reference material (BCR-320R) and a phosphogypsum reference material (MatControl CSN-CIEMAT 2008). Besides, environmental samples were analyzed, showing a similar behavior, i.e. the content of radionuclides decreases with successive extractions. In all cases, the cumulative leached fractions of U and Th for the different solid matrices studied (soil, sediment and phosphogypsum) were extremely low, up to 0.05% and 0.005% for U and Th, respectively. However, great variability was observed in terms of mass concentration released, e.g. between 44 and 13,967 ng U kg(-1).
Statistical modelling and power analysis for detecting trends in total suspended sediment loads
Wang, You-Gan; Wang, Shen S. J.; Dunlop, Jason
2015-01-01
The export of sediments from coastal catchments can have detrimental impacts on estuaries and near shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising the other time specific covariates such as flow. This study investigates whether load estimates and resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) by identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised, sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
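A minimal Monte Carlo power calculation in the spirit of the trend-detection analysis above might look like the sketch below. It assumes a linear trend with AR(1) noise and an ordinary least-squares t-test on the slope; the model structure, parameter values and the 2.0 critical value are simplifying assumptions, not the compounding-errors model actually proposed in the study.

```python
import numpy as np

def trend_power(slope, n_years, sigma, rho, n_sim=1000, seed=0):
    """Monte Carlo power of an OLS t-test for a linear trend with AR(1) noise (a sketch)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years, dtype=float)
    X = np.column_stack([np.ones(n_years), t])
    XtX_inv = np.linalg.inv(X.T @ X)
    hits = 0
    for _ in range(n_sim):
        # AR(1) noise with stationary standard deviation sigma
        e = np.empty(n_years)
        e[0] = rng.normal(0.0, sigma)
        for i in range(1, n_years):
            e[i] = rho * e[i - 1] + rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2))
        y = slope * t + e
        beta = XtX_inv @ (X.T @ y)
        resid = y - X @ beta
        s2 = resid @ resid / (n_years - 2)
        t_stat = beta[1] / np.sqrt(s2 * XtX_inv[1, 1])
        hits += abs(t_stat) > 2.0   # approximate two-sided 5% critical value
    return hits / n_sim
```

Note that ignoring the autocorrelation in the test statistic, as this toy version does, inflates the false-positive rate; accounting for the error structure is precisely the point of the paper's model.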
A Statistical Framework for Automatic Leakage Detection in Smart Water and Gas Grids
Marco Fagiani
2016-08-01
In the last few years, due to the technological improvement of advanced metering infrastructures, water and natural gas grids can be regarded as smart grids, similarly to power ones. However, considering the number of studies related to the application of computational intelligence to distribution grids, the gap between power grids and water/gas grids is notably wide. For this purpose, in this paper, a framework for leakage identification is presented. The framework is composed of three sections aimed at the extraction and the selection of features and at the detection of leakages. A variation of the Sequential Feature Selection (SFS) algorithm is used to select the best performing features within a set, including, also, innovative temporal ones. The leakage identification is based on novelty detection and exploits the characterization of a normality model. Three statistical approaches, the Gaussian Mixture Model (GMM), Hidden Markov Model (HMM) and One-Class Support Vector Machine (OC-SVM), are adopted, under a comparative perspective. Both residential and office building environments are investigated by means of two datasets. One is the Almanac of Minutely Power dataset (AMPds), which provides water and gas consumption data at 1, 10 and 30 min time resolution; the other is the Department for International Development (DFID) dataset, which provides water and gas consumption data at 30 min time resolution. The achieved performance, computed by means of the Area Under the Curve (AUC), reaches 90% in the office building case study, thus confirming the suitability of the proposed approach for applications in smart water and gas grids.
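At its simplest, the normality-model idea behind this framework can be sketched as a Gaussian novelty detector: fit per-feature mean and standard deviation on leak-free consumption data, then flag any observation deviating by more than k standard deviations. This is far cruder than the GMM, HMM and OC-SVM models actually compared in the paper and is only meant to illustrate the novelty-detection setup; names and the k = 3 threshold are assumptions.

```python
import numpy as np

def fit_normality_model(X):
    """Per-feature mean and standard deviation of leak-free consumption data (rows = observations)."""
    return X.mean(axis=0), X.std(axis=0, ddof=1)

def is_leak(x, model, k=3.0):
    """Flag an observation as a leak if any feature deviates more than k standard deviations."""
    mu, sd = model
    return bool(np.any(np.abs(x - mu) / sd > k))
```

A real deployment would replace this with one of the three statistical models and tune the threshold on the AUC.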
KLIP-ing for Analogs - Detection Statistics for HR8799-like systems
Hanson, Jake R.; Apai, Daniel
2015-01-01
In late 2008, the discovery of the directly imaged quadruple planetary system HR8799 was announced. This system is unique not only for the number of planets it contains but also because it poses a serious challenge to our current understanding of planetary core accretion. Namely, the observed radial separations between the planets and their A/F-type host star are not consistent with the amount of gas we would expect the planets to have accreted, and the system as a whole contains more than 70 times the mass of our own solar system. In order to examine whether or not planetary systems similar to HR8799 are anomalous, this project has conducted the largest survey to date of directly imaged A/F-type stars. Using the NACO-VLT imaging system, we implement a modern image reduction algorithm known as KLIP on over 60 targets to detect analogs. KLIP is a PCA-based algorithm and operates by creating a library of PSF eigenimages for a given set of input images. This library contains all of the time-independent PSF sources that rotate with the field of view of the input images. Once the PSF library is created, KLIP recreates any target from the input images as a superposition of known PSF eigenimages from the library and subtracts this from the original, leaving behind possible planetary candidates. The results of this project provide a quantitative comparison of KLIP and other image reduction algorithms for this data set. We will also use a Monte Carlo based simulation to determine the frequency of HR8799 analogs around A/F-type stars based on our detection statistics.
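The core of a KLIP-style reduction (Karhunen-Loève image projection) can be sketched with plain SVD: build principal components from a reference PSF library, project the target frame onto the top-k components, and subtract the projection. Frame alignment, masking and the refinements of real pipelines are omitted, and variable names are illustrative; rows here are flattened image frames.

```python
import numpy as np

def klip_subtract(target, references, k):
    """Subtract the projection of a target frame onto the top-k PSF eigenimages (KLIP-style sketch)."""
    R = references - references.mean(axis=1, keepdims=True)  # mean-subtract each reference frame
    tgt = target - target.mean()
    _, _, Vt = np.linalg.svd(R, full_matrices=False)         # rows of Vt are PSF eigenimages
    Z = Vt[:k]                                               # keep the top-k components
    model = Z.T @ (Z @ tgt)                                  # PSF model: projection onto the subspace
    return tgt - model                                       # residual: candidate planets survive here
```

Choosing k trades PSF suppression against self-subtraction of any real companion, which is why real surveys report results over a range of k.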
A Statistical Method to Constrain Faint Radio Source Counts Below the Detection Threshold
Mitchell-Wynne, Ketron; Afonso, Jose; Jarvis, Matt J
2013-01-01
We present a statistical method based on a maximum likelihood approach to constrain the number counts of extragalactic sources below the nominal flux-density limit of continuum imaging surveys. We extract flux densities from a radio map using positional information from an auxiliary catalogue and show that we can model the number counts of this undetected population down to flux density levels well below the detection threshold of the radio survey. We demonstrate the capabilities that our method will have with future generation wide-area radio surveys by performing simulations over various sky areas with a power-law dN/dS model. We generate a simulated power-law distribution with flux densities ranging from 0.1σ to 2σ, convolve this distribution with a Gaussian noise distribution with an rms of 10 micro-Jy/beam, and are able to recover the counts from the noisy distribution. We then demonstrate the application of our method using data from the Faint Images of the Radio Sky at Twenty-Centimeters survey (FI...
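For the noise-free case, the maximum-likelihood estimate of a power-law index in dN/dS ∝ S^(-alpha) above a flux limit has a closed form, shown in the toy sketch below. The paper's actual contribution, fitting counts below the detection threshold by convolving the model with the noise distribution, requires a numerical likelihood and is not reproduced here.

```python
import numpy as np

def powerlaw_index_mle(S, S_min):
    """Closed-form MLE for alpha in dN/dS ~ S^-alpha, for fluxes S >= S_min (noise-free toy case)."""
    S = np.asarray(S, dtype=float)
    S = S[S >= S_min]
    return 1.0 + S.size / np.log(S / S_min).sum()
```

Samples for checking the estimator can be drawn by inverse-CDF sampling, S = S_min (1 - u)^(-1/(alpha-1)) with u uniform on [0, 1).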
Improving statistical keyword detection in short texts: Entropic and clustering approaches
Carretero-Campos, C.; Bernaola-Galván, P.; Coronado, A. V.; Carpena, P.
2013-03-01
In the last years, two successful approaches have been introduced to tackle the problem of statistical keyword detection in a text without the use of external information: (i) The entropic approach, where Shannon’s entropy of information is used to quantify the information content of the sequence of occurrences of each word in the text; and (ii) The clustering approach, which links the heterogeneity of the spatial distribution of a word in the text (clustering) with its relevance. In this paper, first we present some modifications to both techniques which improve their results. Then, we propose new metrics to evaluate the performance of keyword detectors based specifically on the needs of a typical user, and we employ them to find out which approach performs better. Although both approaches work well in long texts, we obtain in general that measures based on word-clustering perform at least as well as the entropic measure, which needs a convenient partition of the text to be applied, such as chapters of a book. In the latter approach we also show that the partition of the text chosen affects strongly its results. Finally, we focus on short texts, a case of high practical importance, such as short reports, web pages, scientific articles, etc. We show that the performance of word-clustering measures is also good in generic short texts since these measures are able to discriminate better the degree of relevance of low frequency words than the entropic approach.
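The entropic approach can be illustrated with a normalized Shannon entropy over a fixed partition of the text: a word concentrated in few parts has low entropy (a likely keyword), while a word spread uniformly has entropy near 1. This is a bare-bones sketch; the measures in the paper refine both this quantity and the clustering statistics.

```python
import math

def word_entropy(parts, word):
    """Normalized Shannon entropy of a word's counts across text partitions (0 = concentrated, 1 = uniform)."""
    counts = [part.split().count(word) for part in parts]
    total = sum(counts)
    if total == 0 or len(parts) < 2:
        return 1.0  # absent or trivial partition: treat as uninformative
    h = -sum((c / total) * math.log(c / total) for c in counts if c > 0)
    return h / math.log(len(parts))
```

As the abstract notes, the result depends strongly on how the text is partitioned (e.g. into chapters), which is one motivation for the clustering alternative.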
Malm, Christer B; Khoo, Nelson S; Granlund, Irene; Lindstedt, Emilia; Hult, Andreas
2016-01-01
The discovery of erythropoietin (EPO) simplified blood doping in sports, but improved detection methods for EPO have forced cheating athletes to return to blood transfusion. Autologous blood transfusion with cryopreserved red blood cells (RBCs) is the method of choice, because no valid method exists to accurately detect such an event. In endurance sports, it can be estimated that elite athletes improve performance by up to 3% with blood doping, regardless of method. Valid detection methods for autologous blood doping are important to maintain the credibility of athletic performances. Recreational male (N = 27) and female (N = 11) athletes served as Transfusion (N = 28) and Control (N = 10) subjects in two different transfusion settings. Hematological variables and physical performance were measured before donation of 450 or 900 mL whole blood, and until four weeks after re-infusion of the cryopreserved RBC fraction. Blood was analyzed for transferrin, iron, Hb, EVF, MCV, MCHC, reticulocytes, leucocytes and EPO. Repeated measures multivariate analysis of variance (MANOVA) and pattern recognition using Principal Component Analysis (PCA) and Orthogonal Projections of Latent Structures (OPLS) discriminant analysis (DA) investigated differences between Control and Transfusion groups over time. A significant increase in performance (15 ± 8%) and VO2max (17 ± 10%) (mean ± SD) could be measured 48 h after RBC re-infusion, and remained increased for up to four weeks in some subjects. In total, 533 blood samples were included in the study (Clean = 220, Transfused = 313). In response to blood transfusion, the largest change in hematological variables occurred 48 h after blood donation, when Control and Transfused groups could be separated with OPLS-DA (R2 = 0.76/Q2 = 0.59). RBC re-infusion resulted in the best model (R2 = 0.40/Q2 = 0.10) at the first sampling point (48 h), predicting one false positive and one false negative. Overall, a 25% and 86% false positives ratio was
Bose, S
2002-01-01
The robust statistic proposed by Creighton (Creighton J D E 1999 Phys. Rev. D 60 021101) and Allen et al (Allen et al 2001 Preprint gr-qc/010500) for the detection of stationary non-Gaussian noise is briefly reviewed. We compute the robust statistic for generic weak gravitational-wave signals in the mixture-Gaussian noise model to an accuracy higher than in those analyses, and reinterpret its role. Specifically, we obtain the coherent statistic for detecting gravitational-wave signals from inspiralling compact binaries with an arbitrary network of earth-based interferometers. Finally, we show that the excess computational costs incurred owing to non-Gaussianity are negligible compared to the cost of detection in Gaussian noise.
Graph-based and statistical approaches for detecting spectrally variable target materials
Ziemann, Amanda K.; Theiler, James
2016-05-01
In discriminating target materials from background clutter in hyperspectral imagery, one must contend with variability in both. Most algorithms focus on the clutter variability, but for some materials there is considerable variability in the spectral signatures of the target. This is especially the case for solid target materials, whose signatures depend on morphological properties (particle size, packing density, etc.) that are rarely known a priori. In this paper, we investigate detection algorithms that explicitly take into account the diversity of signatures for a given target. In particular, we investigate variable target detectors when applied to new representations of the hyperspectral data: a manifold learning based approach, and a residual based approach. The graph theory and manifold learning based approach incorporates multiple spectral signatures of the target material of interest; this is built upon previous work that used a single target spectrum. In this approach, we first build an adaptive nearest neighbors (ANN) graph on the data and target spectra, and use a biased locally linear embedding (LLE) transformation to perform nonlinear dimensionality reduction. This biased transformation results in a lower-dimensional representation of the data that better separates the targets from the background. The residual approach uses an annulus based computation to represent each pixel after an estimate of the local background is removed, which suppresses local backgrounds and emphasizes the target-containing pixels. We will show detection results in the original spectral space, the dimensionality-reduced space, and the residual space, all using subspace detectors: ranked spectral angle mapper (rSAM), subspace adaptive matched filter (ssAMF), and subspace adaptive cosine/coherence estimator (ssACE). Results of this exploratory study will be shown on a ground-truthed hyperspectral image with variable target spectra and both full and mixed pixel targets.
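The subspace detectors mentioned above build on the spectral angle mapper (SAM), which scores a pixel by the angle between its spectrum and a reference target spectrum; a small angle means a close spectral match regardless of overall brightness. A minimal version (the ranked and subspace variants in the paper extend this to multiple target signatures):

```python
import numpy as np

def spectral_angle(pixel, target):
    """Spectral angle mapper (SAM): angle in radians between pixel and target spectra."""
    cos_ang = pixel @ target / (np.linalg.norm(pixel) * np.linalg.norm(target))
    return float(np.arccos(np.clip(cos_ang, -1.0, 1.0)))  # clip guards against rounding
```

Because the angle ignores vector length, SAM is insensitive to illumination scaling, which is one reason it is a common baseline for variable-signature targets.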
Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Vittino, Andrea
2016-08-01
The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. We employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| ≥ 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6 yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2 × 10^-11 cm^-2 s^-1, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at 2.1 (+1.0/-1.3) × 10^-8 cm^-2 s^-1. The power-law index n1 = 3.1 (+0.7/-0.5) for bright sources above the break hardens to n2 = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4 × 10^-11 cm^-2 s^-1 at 95% confidence level. The high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.
Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning
2016-01-01
Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution, with an associated p-value and a factorization of this test statistic, change analysis in a short sequence of multilook, polarimetric SAR data in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change(s) occur. The technique is demonstrated on airborne EMISAR L-band data but may be applied to Sentinel-1, Cosmo-SkyMed, TerraSAR-X, ALOS and RadarSat-2 or other dual- and quad-polarization data.
Imanishi, M.; Newton, A. E.; Vieira, A. R.; Gonzalez-Aviles, G.; Kendall Scott, M. E.; Manikonda, K.; Maxwell, T. N.; Halpin, J. L.; Freeman, M. M.; Medalla, F.; Ayers, T. L.; Derado, G.; Mahon, B. E.; Mintz, E. D.
2016-01-01
Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space-time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space-time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space-time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.
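A purely temporal toy version of the scan statistic used above can convey the idea: slide a window over a time series of case counts and score each window by a Poisson likelihood ratio of the inside versus outside rates, keeping the maximum. Kulldorff's actual space-time statistic also scans over spatial zones and window lengths and assesses significance by Monte Carlo replication; both are omitted here, and names are illustrative.

```python
import math

def poisson_scan_llr(cases, window):
    """Max Poisson log-likelihood ratio over sliding time windows (toy temporal scan)."""
    N = sum(cases)
    T = len(cases)
    best, best_start = 0.0, None
    for i in range(T - window + 1):
        c = sum(cases[i:i + window])
        e = N * window / T                       # expected count under a uniform rate
        if c > e:                                # only excesses count as candidate clusters
            llr = c * math.log(c / e)
            if N > c:
                llr += (N - c) * math.log((N - c) / (N - e))
            if llr > best:
                best, best_start = llr, i
    return best, best_start
```

In the real method, the maximum LLR is compared against the distribution of maxima from simulated datasets to get a p-value.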
Spechler, R.M.
1996-01-01
Potentiometric surface maps of the Upper Floridan aquifer show two depressions around the St. Johns River from the city of Jacksonville south toward Green Cove Springs. These depressions, depending on their locations, are the result of withdrawals from agricultural, industrial, domestic and public-supply wells, diffuse upward leakage, and discharge from springs. Submerged springs that discharge into the St. Johns River between Jacksonville and Green Cove Springs have been thought to exist, but locating and evaluating these springs had not been attempted before this investigation. Thermal infrared imagery, seismic reflection, and numerous interviews with local residents were used to locate springs. An airborne thermal infrared survey was conducted along a section of the St. Johns River in northeastern Florida during February 1992 to detect possible sources of ground-water discharge to the river. An infrared image displayed one thermal anomaly in the St. Johns River which is associated with a previously unknown spring discharging from the Floridan aquifer system. Thermal anomalies also were observed at six locations where municipal facilities discharge treated wastewater to the river. Results of seismic reflection surveys indicate the presence of collapse and other karst features underlying the St. Johns River. These features indicate that the surficial deposits and the Hawthorn Formation that underlie the river probably do not consist of continuous beds. The collapse or deformation of the Hawthorn Formation or the presence of permeable sediment of localized extent could create zones of relatively high vertical leakance. This could provide a more direct hydraulic connection between the Upper Floridan aquifer and the river. Water samples collected from the only submerged spring in the St. Johns River within the Jacksonville-Green Cove Springs reach indicate that the source of the water is the Floridan aquifer system. Chloride and sulfate concentrations were 12 and 340
On the statistics of proto-cluster candidates detected in the Planck all-sky survey
Negrello, M.; Gonzalez-Nuevo, J.; De Zotti, G.; Bonato, M.; Cai, Z.-Y.; Clements, D.; Danese, L.; Dole, H.; Greenslade, J.; Lapi, A.; Montier, L.
2017-09-01
Observational investigations of the abundance of massive precursors of local galaxy clusters ('proto-clusters') allow us to test the growth of density perturbations, to constrain the cosmological parameters that control it, and to test the theory of non-linear collapse and how galaxy formation takes place in dense environments. The Planck collaboration has recently published a catalogue of ≳2000 cold extragalactic sub-millimeter sources, i.e. with colours indicative of z ≳ 2, almost all of which appear to be overdensities of star-forming galaxies. They are thus considered proto-cluster candidates. Their number densities (or their flux densities) are far in excess of expectations from the standard scenario for the evolution of large-scale structure. Simulations based on a physically motivated galaxy evolution model show that essentially all cold peaks brighter than S545GHz = 500 mJy found in Planck maps, after the Galactic dust emission has been removed, can be interpreted as positive Poisson fluctuations of the number of high-z dusty proto-clusters within the same Planck beam, rather than as individual clumps of physically bound galaxies. This conclusion does not change if an empirical fit to the luminosity function of dusty galaxies is used instead of the physical model. The simulations accurately reproduce the statistics of the Planck detections and yield distributions of sizes and ellipticities in qualitative agreement with observations. The redshift distribution of the brightest proto-clusters contributing to the cold peaks has a broad maximum at 1.5 ≤ z ≤ 3. Therefore, follow-up of Planck proto-cluster candidates will provide key information on the high-z evolution of large-scale structure.
Joanna F Dipnall
Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009-2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin and the Mexican American/Hispanic group (p = 0.016), and current smokers (p < 0.001). The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling
Willemsen, Ina; van Esser, Joost; Kluytmans-van den Bergh, Marjolein; Zhou, Kai; Rossen, John W.; Verhulst, Carlo; Verduin, Kees; Kluytmans, Jan
2016-01-01
The laboratory detection of OXA-48-carbapenemase-producing Enterobacteriaceae is difficult, as minimum inhibition concentrations for carbapenems are often below the clinical breakpoint. In 2011, the Dutch national guideline for the detection of highly resistant micro-organisms was issued, which incl
2007-06-01
Our research involved a comparative analysis of two multivariate statistical methods for detecting outbreaks in the observed system, the multivariate CUSUM (MCUSUM) and the multivariate EWMA (MEWMA). We found that, similar to results for the univariate CUSUM and EWMA, the directionally sensitive MCUSUM and MEWMA perform very similarly. Subject terms: biosurveillance, multivariate CUSUM, multivariate EWMA, statistical process control, syndromic surveillance.
Detection of Invalid Test Scores on Admission Tests: A Simulation Study Using Person-Fit Statistics
Tendeiro, Jorge N.; Meijer, Rob R.; Albers, Casper J.
While an admission test may strongly predict success in university or law school programs for most test takers, there may be some test takers who are mismeasured. To address this issue, a class of statistics called person-fit statistics is used to check the validity of individual test scores.
Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning
2016-01-01
in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change(s) occur. The technique is demonstrated on airborne EMISAR L-band data but may be applied to Sentinel-1, Cosmo-SkyMed, TerraSAR-X, ALOS and RadarSat-2 or other dual- and quad...
Simonson, K.M.
1998-08-01
The rate at which a mine detection system falsely identifies man-made or natural clutter objects as mines is referred to as the system's false alarm rate (FAR). Generally expressed as a rate per unit area or time, the FAR is one of the primary metrics used to gauge system performance. In this report, an overview is given of statistical methods appropriate for the analysis of data relating to FAR. Techniques are presented for determining a suitable size for the clutter collection area, for summarizing the performance of a single sensor, and for comparing different sensors. For readers requiring more thorough coverage of the topics discussed, references to the statistical literature are provided. A companion report addresses statistical issues related to the estimation of mine detection probabilities.
Comparative Study of Statistical Skin Detection Algorithms for Sub-Continental Human Images
Tabassum, Mirza Rehenuma; Kamal, Md Mostafa; Muctadir, Hossain Muhammad; Ibrahim, Muhammad; Shakir, Asif Khan; Imran, Asif; Islamm, Saiful; Rabbani, Md Golam; Khaled, Shah Mostafa; Islam, Md Saiful; Begum, Zerina; 10.3923/itj.2010.811.817
2010-01-01
Object detection has been a focus of research in human-computer interaction. Skin area detection has been a key to different recognition tasks such as face recognition, human motion detection, and pornographic and nude image prediction. Most of the research in the field of skin detection has been trained and tested on human images of African, Mongolian and Anglo-Saxon ethnic origins. Although there are several intensity-invariant approaches to skin detection, the skin color of Indian sub-continentals has not been studied separately. The approach of this research is to make a comparative study of three image segmentation approaches using Indian sub-continental human images, to optimize the detection criteria, and to find efficient parameters to detect the skin area in these images. The experiments showed that the HSV color model based approach to Indian sub-continental skin detection is the most suitable, with a considerable success rate of 91.1% true positives and 88.1% true negatives.
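The HSV-threshold style of skin segmentation favored in this abstract can be sketched as a per-pixel rule; the hue, saturation, and value bounds below are illustrative assumptions, not the tuned parameters of the cited study.

```python
# Minimal sketch of HSV-threshold skin segmentation.
# Threshold values are illustrative assumptions only.
import colorsys

def is_skin(r, g, b):
    """Classify an RGB pixel (0-255 channels) as skin via simple HSV bounds."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Skin tones cluster at low hue, moderate saturation, high value.
    return (h <= 50 / 360.0) and (0.1 <= s <= 0.7) and (v >= 0.35)

def skin_ratio(pixels):
    """Fraction of (r, g, b) pixels classified as skin."""
    return sum(is_skin(*p) for p in pixels) / len(pixels)
```

A true-positive/true-negative rate such as the reported 91.1%/88.1% would be obtained by running such a classifier over labeled skin and non-skin pixel sets.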
Myriam Oliveira-Rodríguez
2016-08-01
Exosomes are cell-secreted nanovesicles (40–200 nm) that represent a rich source of novel biomarkers in the diagnosis and prognosis of certain diseases. Despite the increasingly recognized relevance of these vesicles as biomarkers, their detection has been limited due in part to current technical challenges in the rapid isolation and analysis of exosomes. The complexity of the development of analytical platforms relies on the heterogeneous composition of the exosome membrane. One of the most attractive tests is the immunochromatographic strip, which allows rapid detection by unskilled operators. We have successfully developed a novel lateral flow immunoassay (LFIA) for the detection of exosomes based on the use of tetraspanins as targets. We have applied this platform for the detection of exosomes purified from different sources: cell culture supernatants, human plasma and urine. As proof of concept, we explored the analytical potential of this LFIA platform to accurately quantify exosomes purified from a human metastatic melanoma cell line. The one-step assay can be completed in 15 min, with a limit of detection of 8.54×10^5 exosomes/µL when a blend of anti-CD9 and anti-CD81 was selected as capture antibodies and anti-CD63 labelled with gold nanoparticles as detection antibody. Based on our results, this platform could be well suited for use as a rapid exosome quantification tool, with promising diagnostic applications, bearing in mind that the detection of exosomes from different sources may require adaptation of the analytical settings to their specific composition.
CHEN, Z.
2014-11-01
Impulse noise in the power line communication (PLC) channel seriously degrades the performance of Multiple-Input Multiple-Output (MIMO) systems. To remedy this problem, a MIMO detection method based on fractional lower order statistics (FLOS) for the PLC channel with impulse noise is proposed in this paper. The alpha-stable distribution is used to model impulse noise, and FLOS is applied to construct the criteria of MIMO detection. The optimal detection solution is then obtained by a recursive least squares algorithm. Finally, the transmitted signals in the PLC MIMO system are restored with the obtained detection matrix. The proposed method does not require channel estimation and has low computational complexity. The simulation results show that the proposed method has better PLC MIMO detection performance than existing ones in an impulsive noise environment.
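The building block of FLOS-type criteria is the fractional lower-order moment, which replaces the second moment (the variance does not exist for alpha-stable noise). As a minimal sketch, with the order p an illustrative choice (p must be smaller than the characteristic exponent alpha):

```python
# Sketch of a fractional lower-order moment (FLOM) estimate.
# The order p = 0.5 is an illustrative assumption, not from the paper.
def flom(samples, p=0.5):
    """Estimate the fractional lower-order moment E|x|^p from samples."""
    return sum(abs(x) ** p for x in samples) / len(samples)
```

A FLOS-based detector would build its cost function from such moments of the residual rather than from squared errors, which is what makes it robust to impulsive outliers.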
Lindquist, S.G.; Braedgaard, H.; Svenstrup, K.
2008-01-01
entities of FTD have involved identification of several new causative genes. METHODS AND RESULTS: We report the finding of a truncating mutation in the CHMP2B gene (c.532-1G>C) in a patient with early onset dementia. The patient was previously not known to be related to the single Danish pedigree known...
Natarajan Sripriya
2004-02-01
Background: Gene microarray technology provides the ability to study the regulation of thousands of genes simultaneously, but its potential is limited without an estimate of the statistical significance of the observed changes in gene expression. Due to the large number of genes being tested and the comparatively small number of array replicates (e.g., N = 3), standard statistical methods such as Student's t-test fail to produce reliable results. Two other statistical approaches commonly used to improve significance estimates are a penalized t-test and a Z-test using intensity-dependent variance estimates. Results: The performance of these approaches is compared using a dataset of 23 replicates, and a new implementation of the Z-test is introduced that pools together variance estimates of genes with similar minimum intensity. Significance estimates based on 3 replicate arrays are calculated using each statistical technique, and their accuracy is evaluated by comparing them to a reliable estimate based on the remaining 20 replicates. The reproducibility of each test statistic is evaluated by applying it to multiple, independent sets of 3 replicate arrays. Two implementations of a Z-test using intensity-dependent variance produce more reproducible results than two implementations of a penalized t-test. Furthermore, the minimum intensity-based Z-statistic demonstrates higher accuracy and higher or equal precision than all other statistical techniques tested. Conclusion: An intensity-based variance estimation technique provides one simple, effective approach that can improve p-value estimates for differentially regulated genes derived from replicated microarray datasets. Implementations of the Z-test algorithms are available at http://vessels.bwh.harvard.edu/software/papers/bmcg2004.
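The pooling idea above, scoring each gene against a variance estimated from genes of similar minimum intensity, can be sketched as follows; the equal-size binning scheme and bin count are illustrative assumptions, not the paper's implementation.

```python
# Sketch of an intensity-pooled Z-statistic for microarray log-ratios.
# Binning scheme and n_bins are illustrative assumptions.
import statistics

def pooled_z_scores(mean_ratios, min_intensities, n_bins=4):
    """Z-score each gene's mean log-ratio against the standard deviation
    pooled over genes whose minimum intensity falls in the same bin."""
    order = sorted(range(len(mean_ratios)), key=lambda i: min_intensities[i])
    bin_size = max(1, len(order) // n_bins)
    z = [0.0] * len(mean_ratios)
    for b in range(0, len(order), bin_size):
        idx = order[b:b + bin_size]
        # Pool the variance estimate across the whole intensity bin.
        sd = statistics.pstdev(mean_ratios[i] for i in idx) or 1e-12
        for i in idx:
            z[i] = mean_ratios[i] / sd
    return z
```

The pooled standard deviation is far more stable than a per-gene estimate from N = 3 replicates, which is what drives the reproducibility gain reported in the abstract.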
Diagnosis of CO Pollution in HTPEM Fuel Cell using Statistical Change Detection
Jeppesen, Christian; Blanke, Mogens; Zhou, Fan
2015-01-01
Fuel cell technologies are advancing and maturing for commercial markets. However, proper diagnostic tools need to be developed in order to ensure the reliability and durability of fuel cell systems. This paper presents the design of a data-driven method to detect CO content in the anode gas of a high temperature fuel cell. In this work the fuel cell characterization is based on an experimental equivalent electrical circuit, where model parameters are mapped as a function of the load current. The designed generalized likelihood ratio test detection scheme detects whether an equivalent electrical circuit parameter differs from non-faulty operation. It is proven that the generalized likelihood ratio test detection scheme, with a very low probability of false alarm, can detect CO content in the anode gas of the fuel cell.
J. Q. Zhao
2016-06-01
Accurate and timely change detection of Earth's surface features is extremely important for understanding the relationships and interactions between people and natural phenomena. Many traditional methods of change detection use only part of the polarization information and rely on supervised threshold selection; they are insufficient and time-consuming. In this paper, we present a novel unsupervised change-detection method based on quad-polarimetric SAR data and automatic threshold selection. First, speckle noise is removed from the two registered SAR images. Second, the similarity measure is calculated by the test statistic, and automatic Kittler-Illingworth (KI) threshold selection is introduced to obtain the change map. The efficiency of the proposed method is demonstrated on quad-pol SAR images acquired by Radarsat-2 over Wuhan, China.
Liu, Wei; Feng, Huanqing; Li, Chuanfu; Huang, Yufeng; Wu, Dehuang; Tong, Tong
2009-01-01
In this paper, we present a method that detects intracranial space-occupying lesions in two-dimensional (2D) brain high-resolution CT (HRCT) images. A statistical texture atlas technique localizes anatomical variation in the gray-level distribution of brain images and, in turn, identifies the regions with lesions. The statistical texture atlas involves 147 HRCT slices of normal individuals, and its construction is extremely time-consuming. To improve the performance of atlas construction, we have implemented the pixel-wise texture extraction procedure on an Nvidia 8800GTX GPU with the Compute Unified Device Architecture (CUDA) platform. Experimental results indicate that the extracted texture feature is distinctive and robust enough, and is suitable for detecting uniform and mixed-density space-occupying lesions. In addition, a significant speedup against a straightforward CPU version was achieved with CUDA.
A novel statistical approach for detection of suspicious regions in digital mammogram
Z.A. Abo-Eleneen
2013-07-01
In this paper, we propose a novel algorithm, based on the Fisher information measure, to detect suspicious regions in digital mammograms. The proposed algorithm is tested on different types and categories of mammograms (fatty, fatty-glandular and dense-glandular) within the mini-MIAS database (Mammographic Image Analysis Society database, UK). The proposed method is compared with different segmentation-based information-theoretic methods to demonstrate its effectiveness. The experimental results on mammography images showed its effectiveness in the detection of suspicious regions. This study can be part of developing a computer-aided decision (CAD) system for early detection of breast cancer.
Space Object Detection and Tracking Within a Finite Set Statistics Framework
2017-04-13
Grant No. FA9550-15-1-0069, devoted to the investigation and improvement of the detection and tracking methods of inactive Resident Space Objects (RSOs).
M. Amate
2007-01-01
An original algorithm for the detection of small objects in a noisy background is proposed, and its application to underwater object detection by sonar imaging is addressed. This new method is based on higher-order statistics (HOS) that are locally estimated on the images. The proposed algorithm is divided into two steps. In the first step, HOS (skewness and kurtosis) are estimated locally using a square sliding computation window. Small deterministic objects have different statistical properties from the background; they are thus highlighted. The influence of the signal-to-noise ratio (SNR) on the results is studied in the case of Gaussian noise. Mathematical expressions of the estimators and of the expected performances are derived and are experimentally confirmed. In the second step, the results are focused by a matched filter using a theoretical model. This enables the precise localization of the regions of interest. The proposed method generalizes to other statistical distributions, and we derive the theoretical expressions of the HOS estimators in the case of a Weibull distribution (both when only noise is present and when a small deterministic object is present within the filtering window). This enables the application of the proposed technique to the processing of synthetic aperture sonar data containing underwater mines whose echoes have to be detected and located. Results on real data sets are presented and quantitatively evaluated using receiver operating characteristic (ROC) curves.
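The first step, local estimation of skewness and kurtosis in a sliding window, can be sketched in one dimension as follows; the window length is an illustrative choice, and the 2D case simply slides a square window instead.

```python
# Sketch of local higher-order statistics (HOS) over a sliding window.
# Window size is an illustrative assumption.
import statistics

def local_hos(values, window=5):
    """Return (skewness, excess kurtosis) for each full window position."""
    out = []
    for start in range(len(values) - window + 1):
        w = values[start:start + window]
        m = statistics.fmean(w)
        s = statistics.pstdev(w) or 1e-12   # guard against constant windows
        skew = sum(((x - m) / s) ** 3 for x in w) / window
        kurt = sum(((x - m) / s) ** 4 for x in w) / window - 3.0
        out.append((skew, kurt))
    return out
```

Windows containing a small deterministic object depart from the Gaussian values (skewness 0, excess kurtosis 0), which is why thresholding these local maps highlights candidate objects.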
Bamidis, P D; Lithari, C; Konstantinidis, S T
2010-01-01
With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information and Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages. PMID:21487489
Shashank Vyas
2016-01-01
Integration of solar photovoltaic (PV) generation with power distribution networks leads to many operational challenges and complexities. Unintentional islanding is one of them and is of rising concern given the steady increase in grid-connected PV power. This paper builds on an exploratory study of unintentional islanding on a modeled radial feeder having large PV penetration. Dynamic simulations, also run in real time, resulted in the exploration of unique potential causes of the creation of accidental islands. The resulting voltage and current data underwent dimensionality reduction using principal component analysis (PCA), which formed the basis for the application of Q statistic control charts for detecting the anomalous currents that could island the system. To reduce the false alarm rate of anomaly detection, Kullback-Leibler (K-L) divergence was applied to the principal component projections, which led to the conclusion that the Q statistic based approach alone is not reliable for detection of the symptoms liable to cause unintentional islanding. The obtained data were labeled and a K-nearest neighbor (K-NN) binomial classifier was then trained for identification and classification of potential islanding precursors among other power system transients. The three-phase short-circuit fault case was successfully identified as statistically different from islanding symptoms.
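The PCA-based Q statistic used above (also called the squared prediction error) measures how far a new sample falls off the principal-component subspace fitted to normal operation. A minimal sketch, with toy data and the retained component count as illustrative assumptions:

```python
# Sketch of the Q (squared prediction error) statistic from a PCA model.
# Data and n_components are illustrative assumptions.
import numpy as np

def q_statistic(X_train, x_new, n_components=1):
    """Q statistic of x_new w.r.t. a PCA model fitted on the rows of X_train."""
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    # Principal directions from the SVD of the centered training data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                       # loading matrix
    r = (x_new - mu) - P @ (P.T @ (x_new - mu))   # residual off the PC subspace
    return float(r @ r)
```

Samples consistent with the training correlation structure give near-zero Q; a control chart flags samples whose Q exceeds a limit derived from the training distribution.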
An Algorithm for Detection of DVB-T Signals Based on Their Second-Order Statistics
Jallon Pierre
2008-01-01
We propose in this paper a detection algorithm based on a cost function that jointly tests the correlation induced by the cyclic prefix and the fact that this correlation is time-periodic. In the first part of the paper, the cost function is introduced and some analytical results are given. In particular, the noise and multipath channel impacts on its values are theoretically analysed. In the second part of the paper, some asymptotic results are derived. A first exploitation of these results is used to build a detection test based on the false alarm probability. These results are also used to evaluate the impact of the number of cycle frequencies taken into account in the cost function on the detection performance. Thanks to numerical estimations, we have been able to estimate that the proposed algorithm detects DVB-T signals with an SNR of −12 dB. As a comparison, and in the same context, the detection algorithm proposed by the 802.22 WG in 2006 is able to detect these signals with an SNR of −8 dB.
An Algorithm for Detection of DVB-T Signals Based on Their Second-Order Statistics
Pierre Jallon
2008-03-01
We propose in this paper a detection algorithm based on a cost function that jointly tests the correlation induced by the cyclic prefix and the fact that this correlation is time-periodic. In the first part of the paper, the cost function is introduced and some analytical results are given. In particular, the noise and multipath channel impacts on its values are theoretically analysed. In a second part of the paper, some asymptotic results are derived. A first exploitation of these results is used to build a detection test based on the false alarm probability. These results are also used to evaluate the impact of the number of cycle frequencies taken into account in the cost function on the detection performances. Thanks to numerical estimations, we have been able to estimate that the proposed algorithm detects DVB-T signals with an SNR of −12 dB. As a comparison, and in the same context, the detection algorithm proposed by the 802.22 WG in 2006 is able to detect these signals with an SNR of −8 dB.
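The cyclic-prefix correlation these two records exploit can be sketched with a real-valued toy signal: because an OFDM cyclic prefix repeats the tail of each symbol body, samples spaced by the useful symbol length correlate, whereas pure noise does not. The symbol and prefix lengths below are illustrative, not DVB-T parameters.

```python
# Sketch of cyclic-prefix-induced correlation as a detection statistic.
# Lengths are illustrative assumptions; real DVB-T uses complex baseband.
import random

def cp_correlation(samples, n_useful):
    """Average lag-n_useful correlation; large for OFDM-like signals."""
    pairs = [(samples[i], samples[i + n_useful])
             for i in range(len(samples) - n_useful)]
    return sum(a * b for a, b in pairs) / len(pairs)

random.seed(0)
n_useful, n_cp, n_sym = 32, 8, 200
signal = []
for _ in range(n_sym):
    body = [random.gauss(0, 1) for _ in range(n_useful)]
    signal.extend(body[-n_cp:] + body)   # prepend the cyclic prefix
noise = [random.gauss(0, 1) for _ in range(len(signal))]
```

The statistic is clearly nonzero on `signal` and near zero on `noise`; the papers' cost function additionally exploits the time-periodicity of this correlation across symbols.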
Yamashita Rios de Sousa, Arthur Matsuo; Takayasu, Hideki; Takayasu, Misako
2017-01-01
We extend the concept of statistical symmetry as the invariance of a probability distribution under transformation to analyze binary sign time series data of price differences from the foreign exchange market. We model segments of the sign time series as Markov sequences and apply a local hypothesis test to evaluate the symmetries of independence and time reversion in different periods of the market. For the test, we derive the probability of a binary Markov process generating a given set of numbers of symbol pairs. Using such analysis, we could not only segment the time series according to the different behaviors but also characterize the segments in terms of statistical symmetries. As a particular result, we find that the foreign exchange market is essentially time reversible but that this symmetry is broken when there is a strong external influence.
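The symbol-pair counts that feed the hypothesis test above can be sketched directly: under time reversal, "+-" and "-+" transitions should occur equally often, so their imbalance is a simple asymmetry measure (this normalized ratio is an illustrative summary, not the paper's test statistic).

```python
# Sketch of pair counting and a time-reversal asymmetry measure for a
# binary sign sequence. The normalized ratio is an illustrative summary.
from collections import Counter

def pair_counts(signs):
    """Count adjacent symbol pairs in a binary sign sequence."""
    return Counter(zip(signs, signs[1:]))

def reversal_asymmetry(signs):
    """Normalized imbalance between '+-' and '-+' transitions."""
    c = pair_counts(signs)
    up_down, down_up = c[("+", "-")], c[("-", "+")]
    total = up_down + down_up
    return 0.0 if total == 0 else (up_down - down_up) / total
```

The paper instead computes the exact probability of observing the pair counts under a binary Markov model and tests it locally per segment.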
Valdivia-Silva, Julio E.; Lavan, David; Diego Orihuela-Tacuri, M.; Sanabria, Gabriela
2016-07-01
Currently, studies in Drosophila melanogaster have shown emerging evidence that microgravity stimuli can be detected at the genetic level. Analysis of the transcriptome in the pupal stage of fruit flies under microgravity conditions versus ground controls has suggested a few candidate genes as "gravity sensors", which have been experimentally validated. Additionally, several studies have shown that microgravity causes inhibitory effects in different types of cancer cells, although the genes involved in and responsible for these effects are still unknown. Here, we demonstrate that the genes suggested as gravity sensors in Drosophila melanogaster and their human counterparts (orthologous genes) are highly involved in the carcinogenesis, proliferation, anti-apoptotic signaling, invasiveness, and metastatic potential of breast cancer tumors. The transcriptome analyses suggested that the observed inhibitory effect in cancer cells could be due to changes in the expression of these candidate genes. These results encourage the possibility of new therapeutic targets managed together and not in isolation.
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
Hoell, Simon; Omenzetter, Piotr
2017-07-01
Considering jointly damage sensitive features (DSFs) of signals recorded by multiple sensors, applying advanced transformations to these DSFs and assessing systematically their contribution to damage detectability and localisation can significantly enhance the performance of structural health monitoring systems. This philosophy is explored here for partial autocorrelation coefficients (PACCs) of acceleration responses. They are interrogated with the help of the linear discriminant analysis based on the Fukunaga-Koontz transformation using datasets of the healthy and selected reference damage states. Then, a simple but efficient fast forward selection procedure is applied to rank the DSF components with respect to statistical distance measures specialised for either damage detection or localisation. For the damage detection task, the optimal feature subsets are identified based on the statistical hypothesis testing. For damage localisation, a hierarchical neuro-fuzzy tool is developed that uses the DSF ranking to establish its own optimal architecture. The proposed approaches are evaluated experimentally on data from non-destructively simulated damage in a laboratory scale wind turbine blade. The results support our claim of being able to enhance damage detectability and localisation performance by transforming and optimally selecting DSFs. It is demonstrated that the optimally selected PACCs from multiple sensors or their Fukunaga-Koontz transformed versions can not only improve the detectability of damage via statistical hypothesis testing but also increase the accuracy of damage localisation when used as inputs into a hierarchical neuro-fuzzy network. Furthermore, the computational effort of employing these advanced soft computing models for damage localisation can be significantly reduced by using transformed DSFs.
Zeng, W; Liu, B
1999-01-01
Digital watermarking has been proposed as a means for copyright protection of multimedia data. Many existing watermarking schemes have focused on robust means to mark an image invisibly without really addressing the ends of these schemes. This paper first discusses some scenarios in which many current watermarking schemes fail to resolve the rightful ownership of an image. The key problems are then identified, and some crucial requirements for valid invisible watermark detection are discussed. In particular, we show that, for the particular application of resolving rightful ownership using invisible watermarks, it might be crucial to require that the original image not be directly involved in the watermark detection process. A general framework for validly detecting invisible watermarks is then proposed. Some requirements on the claimed signature/watermarks to be used for detection are discussed to prevent the existence of any counterfeit scheme. The optimal detection strategy within the framework is derived. We show the effectiveness of this technique based on some visual-model-based watermark encoding schemes.
Nielsen, Allan A.; Conradsen, Knut; Skriver, Henning
2016-10-01
Test statistics for comparison of real (as opposed to complex) variance-covariance matrices exist in the statistics literature [1]. In earlier publications we have described a test statistic for the equality of two variance-covariance matrices following the complex Wishart distribution with an associated p-value [2]. We showed their application to bitemporal change detection and to edge detection [3] in multilook, polarimetric synthetic aperture radar (SAR) data in the covariance matrix representation [4]. The test statistic and the associated p-value is described in [5] also. In [6] we focussed on the block-diagonal case, we elaborated on some computer implementation issues, and we gave examples on the application to change detection in both full and dual polarization bitemporal, bifrequency, multilook SAR data. In [7] we described an omnibus test statistic Q for the equality of k variance-covariance matrices following the complex Wishart distribution. We also described a factorization of Q = R2 R3 … Rk where Q and Rj determine if and when a difference occurs. Additionally, we gave p-values for Q and Rj. Finally, we demonstrated the use of Q and Rj and the p-values to change detection in truly multitemporal, full polarization SAR data. Here we illustrate the methods by means of airborne L-band SAR data (EMISAR) [8,9]. The methods may be applied to other polarimetric SAR data also such as data from Sentinel-1, COSMO-SkyMed, TerraSAR-X, ALOS, and RadarSat-2 and also to single-pol data. The account given here closely follows that given in our recent IEEE TGRS paper [7]. Selected References [1] Anderson, T. W., An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third ed. (2003). [2] Conradsen, K., Nielsen, A. A., Schou, J., and Skriver, H., "A test statistic in the complex Wishart distribution and its application to change detection in polarimetric SAR data," IEEE Transactions on Geoscience and Remote Sensing 41(1): 4-19, 2003. [3] Schou, J
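The omnibus statistic Q for the equality of k covariance matrices has a determinant-ratio form; a real-valued analogue can be sketched as below. This is an illustrative sketch under the real Wishart assumption with equal look numbers n, not the complex-Wishart statistic of the cited paper, and the constant structure follows the likelihood-ratio form ln Q = n(pk ln k + Σ ln|X_i| − k ln|Σ X_i|).

```python
# Sketch of an omnibus ln Q statistic for equality of k covariance
# matrices (real-valued analogue; the paper treats the complex Wishart case).
import numpy as np

def omnibus_lnq(cov_list, n):
    """ln Q for k p-by-p covariance matrix estimates, each from n looks."""
    k = len(cov_list)
    p = cov_list[0].shape[0]
    sum_logdet = sum(np.linalg.slogdet(C)[1] for C in cov_list)
    logdet_sum = np.linalg.slogdet(sum(cov_list))[1]
    return float(n * (p * k * np.log(k) + sum_logdet - k * logdet_sum))
```

Identical matrices give ln Q = 0 (no change), and ln Q is negative otherwise; −2 ln Q is compared against a chi-squared-type distribution to obtain the p-values described in the abstract.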
Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A
2017-01-01
The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity.
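A drastically simplified relative of the changepoint idea evaluated above can be sketched as a least-squares single-changepoint search: pick the split that minimizes the within-segment sum of squared deviations. This is an illustrative simplification, not the Bayesian changepoint algorithm the study found best.

```python
# Sketch of single-changepoint onset detection by least squares
# (an illustrative simplification of Bayesian changepoint analysis).
import statistics

def onset_index(x):
    """Index splitting x into two segments with minimal within-segment
    sum of squared deviations (a candidate EMG onset)."""
    best_k, best_cost = 1, float("inf")
    for k in range(1, len(x)):
        left, right = x[:k], x[k:]
        cost = sum((v - statistics.fmean(left)) ** 2 for v in left) + \
               sum((v - statistics.fmean(right)) ** 2 for v in right)
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k
```

The Bayesian version additionally yields a posterior probability over candidate onset indices, which is what the study thresholds at 60-90%.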
Shojiguchi, A.; Tanaka, T.; Okada, M.
Recently a modified algorithm of code-division multiple-access (CDMA) parallel interference canceler (PIC) has been proposed by Tanaka based on statistical neurodynamics. In this paper we apply the modified algorithm to the linear PIC (LPIC) and investigate its stability. We show that the stable (unstable) fixed points of the modified algorithm correspond to the stable (unstable) replica symmetry solutions with the Gaussian prior. We also show the modified algorithm is a special case of Kabashima's belief-propagation algorithm with Gaussian prior.
Liu, Ming-Tsung; Yu, Pao-Ta
2011-01-01
A personalized e-learning service provides learning content to fit learners' individual differences. Learning achievements are influenced by cognitive as well as non-cognitive factors such as mood, motivation, interest, and personal styles. This paper proposes the Learning Caution Indexes (LCI) to detect aberrant learning patterns. The philosophy…
Diagnosis of CO Pollution in HTPEM Fuel Cell using Statistical Change Detection
Jeppesen, Christian; Blanke, Mogens; Zhou, Fan
2015-01-01
The fuel cell technologies are advancing and maturing for commercial markets. However, proper diagnostic tools need to be developed in order to ensure the reliability and durability of fuel cell systems. This paper presents the design of a data-driven method to detect CO content in the anode gas...
Using Cumulative Sum Statistics to Detect Inconsistencies in Unproctored Internet Testing
Tendeiro, Jorge N.; Meijer, Rob R.; Schakel, Lolle; Maij-de Meij, Annette M.
2013-01-01
Unproctored Internet Testing (UIT) is becoming more popular in personnel recruitment and selection. A drawback of UIT is that cheating is easy and, therefore, a proctored test is often administered after a UIT procedure. To detect inconsistent test scores from UIT, a cumulative sum procedure (CUSUM
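As context for the truncated record above, a one-sided CUSUM for an upward shift can be sketched as follows. This is a generic textbook CUSUM applied to simulated score residuals, not the authors' person-fit statistic; the reference value `k`, decision limit `h`, and the simulated data are illustrative assumptions.

```python
import numpy as np

def cusum_upper(x, mu0, k, h):
    """One-sided upper CUSUM: return the first index at which the
    cumulative deviation above the reference value mu0 + k exceeds
    the decision limit h, or None if no alarm is raised."""
    s = 0.0
    for t, xt in enumerate(x):
        s = max(0.0, s + xt - mu0 - k)        # accumulate, reset at zero
        if s > h:
            return t                           # alarm: scores look inconsistent
    return None

rng = np.random.default_rng(1)
consistent = rng.normal(0.0, 1.0, 50)          # residuals consistent with the UIT score
shifted = rng.normal(1.5, 1.0, 30)             # sustained upward shift (inconsistency)
alarm = cusum_upper(np.concatenate([consistent, shifted]),
                    mu0=0.0, k=0.5, h=5.0)
```

The alarm fires a few observations after the shift begins; `k` trades off sensitivity against false alarms, and `h` sets the in-control run length.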
A generic Framework for Landmine detection using statistical classifier based on IR images
Padmavathi, G.
2011-01-01
Landmine detection with passive infrared images can depend quite heavily on environmental conditions, and there are crossover periods when the thermal contrast is negligible and the mines may be undetectable. Conventional antipersonnel mine detection has not yet produced a fully satisfactory methodology. Here a generic framework is proposed that combines the most adaptable methods and techniques. The captured IR image is used for the detection and identification of buried targets by adopting efficient and reliable image processing techniques. Results on diverse landmine data collected using IR sensors show that the adopted method can identify meaningful and coherent clusters and that different expert algorithms can be identified for different contexts. Initial experiments also indicate that preprocessing the images substantially increases individual classifier performance. In the future, the sensing technology can be combined with processing power and wireless communication to address various types of security threats.
Salih, A. L.; Mühlbauer, M.; Grumpe, A.; Pasckert, J. H.; Wöhler, C.; Hiesinger, H.
2016-06-01
The analysis of the impact crater size-frequency distribution (CSFD) is a well-established approach to the determination of the age of planetary surfaces. Classically, estimation of the CSFD is achieved by manual crater counting and size determination in spacecraft images, which, however, becomes very time-consuming for large surface areas and/or high image resolution. With increasing availability of high-resolution (nearly) global image mosaics of planetary surfaces, a variety of automated methods for the detection of craters based on image data and/or topographic data have been developed. In this contribution a template-based crater detection algorithm is used which analyses image data acquired under known illumination conditions. Its results are used to establish the CSFD for the examined area, which is then used to estimate the absolute model age of the surface. The detection threshold of the automatic crater detection algorithm is calibrated based on a region with available manually determined CSFD such that the age inferred from the manual crater counts corresponds to the age inferred from the automatic crater detection results. With this detection threshold, the automatic crater detection algorithm can be applied to a much larger surface region around the calibration area. The proposed age estimation method is demonstrated for a Kaguya Terrain Camera image mosaic of 7.4 m per pixel resolution of the floor region of the lunar crater Tsiolkovsky, which consists of dark and flat mare basalt and has an area of nearly 10,000 km2. The region used for calibration, for which manual crater counts are available, has an area of 100 km2. In order to obtain a spatially resolved age map, CSFDs and surface ages are computed for overlapping quadratic regions of about 4.4 x 4.4 km2 size offset by a step width of 74 m. Our constructed surface age map of the floor of Tsiolkovsky shows age values of typically 3.2-3.3 Ga, while for small regions lower (down to 2.9 Ga) and higher
Hotspot detection using space-time scan statistics on children under five years of age in Depok
Verdiana, Miranti; Widyaningsih, Yekti
2017-03-01
Among the problems affecting health levels in Depok are persistently high malnutrition rates and the increasing spread of infectious and non-communicable diseases in some areas. Children under five years old are a part of the population vulnerable to malnutrition and disease. For this reason, it is important to observe where and when malnutrition in Depok has occurred with high intensity. To obtain the locations and times of hotspots of malnutrition and diseases that attack children under five years old, the space-time scan statistics method can be used. The space-time scan statistic is a hotspot detection method in which location and time information are taken into account simultaneously when detecting hotspots. The method detects a hotspot with a cylindrical scanning window: the base of the cylinder describes the area, and its height describes the time. Each cylinder formed is a candidate hotspot, which requires hypothesis testing to decide whether it can be declared a hotspot. Hotspot detection in this study was carried out by forming combinations of several variables. Some combinations of variables gave hotspot detection results that tend to be the same, thereby forming groups (clusters). For the level of infant health in Depok city, the Beji health care center region in 2011-2012 is a hotspot. Across the combinations of variables used in hotspot detection, Beji health care center was the most frequently identified hotspot. It is hoped that the local government can adopt the right policies to improve the health of children under five in the city of Depok.
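The cylinder scan described above amounts to maximizing a Poisson log likelihood ratio over candidate (area, time-window) cylinders. A toy sketch follows; the regions, weekly counts, and the uniform expected-count baseline are invented for illustration, and significance would still require Monte Carlo hypothesis testing as the record notes.

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff-style log likelihood ratio for a scanning cylinder with
    c observed and e expected cases, out of C total cases overall."""
    if c <= e:
        return 0.0                     # only excess-risk cylinders score
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

# toy data: weekly case counts per region; expectations assumed uniform
cases = {"A": [2, 3, 9, 11], "B": [1, 2, 2, 1], "C": [2, 1, 2, 2]}
weeks = 4
C = sum(sum(v) for v in cases.values())

# every (region, start week, end week) cylinder is a hotspot candidate
candidates = [(r, t0, t1) for r in cases
              for t0 in range(weeks) for t1 in range(t0 + 1, weeks + 1)]
best = max(candidates,
           key=lambda rt: poisson_llr(sum(cases[rt[0]][rt[1]:rt[2]]),
                                      C * (rt[2] - rt[1]) / (weeks * len(cases)),
                                      C))
# best == ("A", 2, 4): region A over the last two weeks is the top cylinder
```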
Empirical Bayes scan statistics for detecting clusters of disease risk variants in genetic studies.
McCallum, Kenneth J; Ionita-Laza, Iuliana
2015-12-01
Recent developments of high-throughput genomic technologies offer an unprecedented detailed view of the genetic variation in various human populations, and promise to lead to significant progress in understanding the genetic basis of complex diseases. Despite this tremendous advance in data generation, it remains very challenging to analyze and interpret these data due to their sparse and high-dimensional nature. Here, we propose novel applications and new developments of empirical Bayes scan statistics to identify genomic regions significantly enriched with disease risk variants. We show that the proposed empirical Bayes methodology can be substantially more powerful than existing scan statistics methods especially so in the presence of many non-disease risk variants, and in situations when there is a mixture of risk and protective variants. Furthermore, the empirical Bayes approach has greater flexibility to accommodate covariates such as functional prediction scores and additional biomarkers. As proof-of-concept we apply the proposed methods to a whole-exome sequencing study for autism spectrum disorders and identify several promising candidate genes.
Detecting hippocampal shape changes in Alzheimer's disease using statistical shape models
Shen, Kaikai; Bourgeat, Pierrick; Fripp, Jurgen; Meriaudeau, Fabrice; Salvado, Olivier
2011-03-01
The hippocampus is affected at an early stage in the development of Alzheimer's disease (AD). Using brain Magnetic Resonance (MR) images, we can investigate the effect of AD on the morphology of the hippocampus. Statistical shape models (SSM) are usually used to describe and model the hippocampal shape variations among the population. We use the shape variations from the SSM as features to classify AD from normal control cases (NC). Conventional SSM uses principal component analysis (PCA) to compute the modes of variation among the population. Although these modes are representative of variations within the training data, they are not necessarily discriminant on labelled data. In this study, a Hotelling's T² test is used to qualify the landmarks which can be used for PCA. The resulting variation modes are used as predictors of AD from NC. The discrimination ability of these predictors is evaluated in terms of their classification performance using support vector machines (SVM). Using only the landmarks that are statistically discriminant between AD and NC in the SSM showed a better separation between AD and NC. These predictors also showed better correlation to cognitive scores such as the mini-mental state examination (MMSE) and the Alzheimer's disease assessment scale (ADAS).
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
EEG-based Drowsiness Detection for Safe Driving Using Chaotic Features and Statistical Tests
Mardi, Zahra; Ashtiani, Seyedeh Naghmeh Miri; Mikaili, Mohammad
2011-01-01
Electroencephalography (EEG) is one of the most reliable sources for detecting sleep onset while driving. In this study, we have tried to demonstrate that sleepiness and alertness signals are separable with an appropriate margin by extracting suitable features. First of all, we recorded EEG signals from 10 volunteers. They were obliged to avoid sleeping for about 20 hours before the test. We recorded the signals while the subjects played a virtual driving game. They tried to pass some barriers...
Statistical Methods for Detecting and Modeling General Patterns and Relationships in Lifetime Data
Kvaloey, Jan Terje
1999-04-01
In this thesis, the author tries to develop methods for detecting and modeling general patterns and relationships in lifetime data. Tests with power against nonmonotonic trends and nonmonotonic covariate effects are considered, and nonparametric regression methods which allow estimation of fairly general nonlinear relationships are studied. Practical uses of some of the methods are illustrated, although in a medical rather than engineering or technological context.
Ming Ren
2016-03-01
Partial discharge (PD) detection is essential to the operation of high-voltage systems. In this context, we investigate the basic characteristics of light emission during PDs in SF6 gas from the perspective of insulation diagnosis. A synchronous system is constructed using three optical photoelectric instruments with separate wavelength responses in the ultraviolet (UV, 189–352 nm), visible (VIS, 381–675 nm), and near-infrared (NIR, 737–920 nm) spectral ranges and a wide-band PD current pulse detector with a response of 1 pC. The results indicate that light emission depends upon the type of insulation defect and the discharge energy. An increase in PD charge gives rise to more components in the spectral range from UV to VIS, and the presence of an insulator surface in discharges yields a more complex VIS-to-NIR spectrum. The phase-resolved partial discharge pattern (PRPD) of UV light pulses can reasonably reflect the electroluminescence process in the presence of the insulator surface and weak corona at negative voltage points. The PRPD of VIS light describes the features of the actual PD pattern in most cases. In comparison with the other two spectral ranges, light intensity in the VIS range is more sensitive to changes in gas-pressure-normalized voltage (Vrms/p). The linear fitting analysis of the relationships between the light intensity and PD charge shows that UV light detection has a greater sensitivity to the PD charge and exhibits a greater degree of linearity. NIR detection is applicable only to severe PDs. We believe that our findings can significantly aid in the application of optical PD diagnosis in SF6 gas insulated systems.
Iravanian, Shahriar; Kanu, Uche B; Christini, David J
2012-07-01
Cardiac repolarization alternans is an electrophysiologic condition identified by a beat-to-beat fluctuation in action potential waveform. It has been mechanistically linked to instances of T-wave alternans, a clinically defined ECG alternation in T-wave morphology, and associated with the onset of cardiac reentry and sudden cardiac death. Many alternans detection algorithms have been proposed in the past, but the majority have been designed specifically for use with T-wave alternans. Action potential duration (APD) signals obtained from experiments (especially those derived from optical mapping) possess unique characteristics, which requires the development and use of a more appropriate alternans detection method. In this paper, we present a new class of algorithms, based on the Monte Carlo method, for the detection and quantitative measurement of alternans. Specifically, we derive a set of algorithms (one an analytical and more efficient version of the other) and compare its performance with the standard spectral method and the generalized likelihood ratio test algorithm using synthetic APD sequences and optical mapping data obtained from an alternans control experiment. We demonstrate the benefits of the new algorithm in the presence of Gaussian and Laplacian noise and frame-shift errors. The proposed algorithms are well suited for experimental applications, and furthermore, have low complexity and are implementable using fixed-point arithmetic, enabling potential use with implantable cardiac devices.
Robust Statistical Approaches for RSS-Based Floor Detection in Indoor Localization.
Razavi, Alireza; Valkama, Mikko; Lohan, Elena Simona
2016-05-31
Floor detection for indoor 3D localization of mobile devices is currently an important challenge in the wireless world. Many approaches currently exist, but usually the robustness of such approaches is not addressed or investigated. The goal of this paper is to show how to robustify the floor estimation when probabilistic approaches with a low number of parameters are employed. Indeed, such an approach would allow a building-independent estimation and a lower computing power at the mobile side. Four robustified algorithms are presented: a robust weighted centroid localization method, a robust linear trilateration method, a robust nonlinear trilateration method, and a robust deconvolution method. The proposed approaches use the received signal strengths (RSS) measured by the Mobile Station (MS) from various heard WiFi access points (APs) and provide an estimate of the vertical position of the MS, which can be used for floor detection. We will show that robustification can indeed increase the performance of the RSS-based floor detection algorithms.
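One of the four approaches, the robust weighted centroid, can be sketched roughly as follows. This is an illustrative variant only: the trimming fraction, the dBm-to-linear weighting, and the toy access-point data are assumptions, not the paper's exact formulation.

```python
import numpy as np

def robust_weighted_centroid_floor(ap_floors, rss_dbm, trim=0.25):
    """Floor estimate as an RSS-weighted centroid of the heard APs'
    floor numbers; the weakest fraction of APs is trimmed away for
    robustness against outliers (trim fraction assumed here)."""
    order = np.argsort(rss_dbm)[::-1]                   # strongest APs first
    keep = order[: max(1, int(len(order) * (1 - trim)))]
    w = 10.0 ** (np.asarray(rss_dbm, float)[keep] / 20.0)  # assumed linear weights
    floors = np.asarray(ap_floors, float)[keep]
    return int(round(np.average(floors, weights=w)))

aps = [3, 3, 2, 3, 7]                    # floor of each heard AP; 7 is an outlier
rss = [-48.0, -55.0, -60.0, -52.0, -90.0]
floor = robust_weighted_centroid_floor(aps, rss)        # → 3
```

Trimming discards the weak outlier AP on floor 7, so the centroid settles on the true floor; an untrimmed centroid is more easily pulled off by such measurements.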
Le Goualher, G; Argenti, A.M.; Duyme, M
2000-01-01
Principal Component Analysis allows a quantitative description of shape variability with a restricted number of parameters (or modes) which can be used to quantify the difference between two shapes through the computation of a modal distance. A statistical test can then be applied to this set… encoding. When applied to real data, this study highlighted genetic constraints on the shape of the central sulcus. We found from 10 pairs of monozygotic twins that the intrapair modal distance of the central sulcus was significantly smaller than the interpair modal distance, for both the left central… sulcus (Z = -2.66; P definition of the central sulcus shape were confirmed by applying the same experiment to 10 pairs of normal young individuals (Z = -1.39; Z = -0.63, i.e., values not significant at the P
Zhang, Pan
2014-01-01
Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions with almost the same modularity that are poorly correlated to each other; it can also overfit, producing illusory "communities" in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian, and computing the marginals of the resulting Gibbs distribution. If we assign each node to its most-likely community under these marginals, we claim that, unlike the ground state, the resulting partition is a good measure of statistically-significant community structure. We propose an efficient Belief Propagation (BP) algorithm to compute these marginals. In random networks with no true communities, the system has two phases as we vary the temperature: a paramagnetic phase where all marginals are equal, and a spin glass phase where BP fails to converge. In networks with real community structure, there is an additional retrieval phase where BP converges, and ...
Just post it: the lesson from two cases of fabricated data detected by statistics alone.
Simonsohn, Uri
2013-10-01
I argue that requiring authors to post the raw data supporting their published results has the benefit, among many others, of making fraud much less likely to go undetected. I illustrate this point by describing two cases of suspected fraud I identified exclusively through statistical analysis of reported means and standard deviations. Analyses of the raw data behind these published results provided invaluable confirmation of the initial suspicions, ruling out benign explanations (e.g., reporting errors, unusual distributions), identifying additional signs of fabrication, and also ruling out one of the suspected fraud's explanations for his anomalous results. If journals, granting agencies, universities, or other entities overseeing research promoted or required data posting, it seems inevitable that fraud would be reduced.
The Good, the Bad and the Ugly: Statistical quality assessment of SZ detections
Aghanim, N; Diego, J -M; Douspis, M; Macias-Perez, J; Pointecouteau, E; Comis, B; Arnaud, M; Montier, L
2014-01-01
We examine three approaches to the problem of source classification in catalogues. Our goal is to determine the confidence with which the elements in these catalogues can be separated into populations on the basis of their spectral energy distribution (SED). Our analysis is based on the projection of the measurements onto a comprehensive SED model of the main signals in the considered range of frequencies. We first consider likelihood analysis, which is halfway between supervised and unsupervised methods. Next, we investigate an unsupervised clustering technique. Finally, we consider a supervised classifier based on Artificial Neural Networks. We illustrate the approach and results using catalogues from various surveys, i.e., X-ray (MCXC), optical (SDSS) and millimetric (Planck Sunyaev-Zeldovich (SZ)). We show that the results from the statistical classifications of the three methods are in very good agreement with each other, although the supervised neural network-based classification shows better pe...
Smith, D.L.; Sagalovsky, L.; Micklich, B.J.; Harper, M.K.; Novick, A.H.
1994-06-01
A least-squares algorithm developed for analysis of fast-neutron transmission data resulting from non-destructive interrogation of sealed luggage and containers is subjected to a probabilistic interpretation. The approach is to convert knowledge of uncertainties in the derived areal elemental densities, as provided by this algorithm, into probability information that can be used to judge whether an interrogated object is either benign or potentially contains an illicit substance that should be investigated further. Two approaches are considered in this paper. One involves integration of a normalized probability density function associated with the least-squares solution. The other tests this solution against a hypothesis that the interrogated object indeed contains illicit material. This is accomplished by an application of the F-distribution from statistics. These two methods of data interpretation are applied to specific sets of neutron transmission results produced by Monte Carlo simulation.
Gofford, Jason; Tombesi, Francesco; Braito, Valentina; Turner, T Jane; Miller, Lance; Cappi, Massimo
2012-01-01
We present the results of a new spectroscopic study of Fe K-band absorption in Active Galactic Nuclei (AGN). Using data obtained from the Suzaku public archive we have performed a statistically driven blind search for Fe XXV Heα and/or Fe XXVI Lyβ absorption lines in a large sample of 51 type 1.0-1.9 AGN. Through extensive Monte Carlo simulations we find statistically significant absorption is detected at E>6.7 keV in 20/51 sources at the P(MC)>95% level, which corresponds to ~40% of the total sample. In all cases, individual absorption lines are detected independently and simultaneously amongst the two (or three) available XIS detectors which confirms the robustness of the line detections. The most frequently observed outflow phenomenology consists of two discrete absorption troughs corresponding to Fe XXV Heα and Fe XXVI Lyβ at a common velocity shift. From xstar fitting the mean column density and ionisation parameter for the Fe K absorption components are log(NH/cm^{-2})~23 and log(xi/erg cm s^{-1})~4.5, ...
Cui, B.; Zhang, Y.; Yan, L.; Cai, X.
2017-09-01
Detecting land cover changes is an important application of multi-temporal synthetic aperture radar (SAR) images. This study puts forward a novel SAR change detection method which has two steps: change detector construction and change threshold selection. For change detector construction, considering that SAR intensity images follow the gamma distribution, the conditional probabilities of the binary hypothesis test are derived; the log likelihood ratio (LLR) is then combined with the log ratio (LR) to construct a detector which enhances the degree of change, making it convenient to calculate the dissimilarity between the two images. For change threshold selection, owing to the shape of the curve of ratios between adjacent grey-level (GL) values in the normalized difference map, the map can be segmented into three parts by two selected thresholds, which correspond to the unchanged, backscatter-enhanced and backscatter-weakened regions respectively. In this way, the changed areas can also be determined simultaneously. The experimental results on different areas and sensors indicate that the proposed algorithm is effective and feasible.
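Under a gamma intensity model like the one mentioned above, a per-pixel generalized likelihood ratio detector can be sketched as follows. This is a simplified stand-in, not the paper's detector: a single global threshold replaces the two-threshold grey-level selection, and the simulated scene (4-look imagery, a single brightened patch) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 4                                    # number of looks (gamma shape parameter)

def glr_change(x, y, looks=L):
    """Generalized likelihood ratio statistic for a pair of gamma-distributed
    SAR intensities: zero when the pixel pair shares one mean (arithmetic
    mean = geometric mean), growing with dissimilarity between acquisitions."""
    return looks * (2.0 * np.log((x + y) / 2.0) - np.log(x) - np.log(y))

# simulated intensity pair: mean backscatter 10 everywhere, except a
# 16x16 patch whose mean rises to 40 in the second acquisition
mean1 = np.full((64, 64), 10.0)
mean2 = mean1.copy()
mean2[16:32, 16:32] = 40.0
img1 = rng.gamma(L, mean1 / L)
img2 = rng.gamma(L, mean2 / L)

d = glr_change(img1, img2)
change_map = d > 1.5                     # illustrative single global threshold
```

The statistic is nonnegative and monotone in the classic log-ratio detector; the changed patch stands out clearly against the speckle-only background.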
PERFORMANCE EVALUATION OF VARIOUS STATISTICAL CLASSIFIERS IN DETECTING THE DISEASED CITRUS LEAVES
Bandi, Sudheer Reddy
2013-02-01
Citrus fruits are in high demand because humans consume them daily. This research aims to improve citrus production, which suffers from low yields and laborious manual assessment. Citrus plants currently face several diseases, and insect damage is among the most serious. Insecticides are not always effective and may be toxic to some kinds of birds. Farmers have great difficulty detecting the diseases with the naked eye, and doing so is quite expensive. Machine vision and image processing techniques help in detecting disease marks on citrus leaves. In this research, citrus leaves of four classes (Normal, Greasy spot, Melanose and Scab) are collected and investigated using texture analysis based on the Color Co-occurrence Method (CCM) to extract Hue, Saturation and Intensity (HSI) features. In the classification stage, the features are categorized for all leaf conditions using the k-Nearest Neighbor (kNN), Naive Bayes (NBC), Linear Discriminant Analysis (LDA) and Random Forest Tree (RFT) classifiers. The experimental results show that the proposed approach achieves 98.75% accuracy in the automated detection of normal and affected leaves using the texture-analysis-based CCM method with the LDA algorithm. Finally, all the classifiers are compared using Receiver Operating Characteristic curves and their performance is analyzed.
Tai, Caroline G; Graff, Rebecca E; Liu, Jinghua; Passarelli, Michael N; Mefford, Joel A; Shaw, Gary M; Hoffmann, Thomas J; Witte, John S
2015-08-01
The National Birth Defects Prevention Study (NBDPS) contains a wealth of information on affected and unaffected family triads, and thus provides numerous opportunities to study gene-environment interactions (G×E) in the etiology of birth defect outcomes. Depending on the research objective, several analytic options exist to estimate G×E effects that use varying combinations of individuals drawn from available triads. In this study, we discuss important considerations in the collection of genetic data and environmental exposures. We will also present several population- and family-based approaches that can be applied to data from the NBDPS including case-control, case-only, family-based trio, and maternal versus fetal effects. For each, we describe the data requirements, applicable statistical methods, advantages, and disadvantages. A range of approaches can be used to evaluate potentially important G×E effects in the NBDPS. Investigators should be aware of the limitations inherent to each approach when choosing a study design and interpreting results. © 2015 Wiley Periodicals, Inc.
Statistics and implications of substructure detected in a representative sample of X-ray clusters
Chon, Gayoung; Smith, Graham
2012-01-01
We present a morphological study of 35 X-ray luminous galaxy clusters at 0.15
Statistical Methods for Evaluating DNA Methylation as a Marker for Early Detection or Prognosis
Todd A. Alonzo
2007-01-01
We summarize standard and novel statistical methods for evaluating the classification accuracy of DNA methylation markers. The choice of method will depend on the type of marker studied (qualitative/quantitative), the number of markers, and the type of outcome (time-invariant/time-varying). A minimum of two error rates are needed for assessing marker accuracy: the true-positive fraction and the false-positive fraction. Measures of association that are computed from the combination of these error rates, such as the odds ratio or relative risk, are not informative about classification accuracy. We provide an example of a DNA methylation marker that is strongly associated with time to death (logrank p = 0.0003) yet is not a good classifier as evaluated by the true-positive and false-positive fractions. Finally, we would like to emphasize the importance of study design. Markers can behave differently in different groups of individuals. It is important to know what factors may affect the accuracy of a marker and in which subpopulations the marker may be more accurate. Such an understanding is extremely important when comparing marker accuracy in two groups of subjects.
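The two error rates named above are straightforward to compute for a cutoff rule on a quantitative marker. A minimal sketch with invented marker values and labels:

```python
def tpf_fpf(scores, labels, cutoff):
    """True- and false-positive fractions of the rule 'score >= cutoff'
    (labels: 1 = diseased, 0 = healthy)."""
    tp = sum(s >= cutoff for s, y in zip(scores, labels) if y == 1)
    fp = sum(s >= cutoff for s, y in zip(scores, labels) if y == 0)
    pos = sum(labels)
    neg = len(labels) - pos
    return tp / pos, fp / neg

# hypothetical methylation marker values for 4 diseased and 4 healthy subjects
scores = [0.9, 0.8, 0.75, 0.4, 0.7, 0.3, 0.2, 0.35]
labels = [1,   1,   1,    1,   0,   0,   0,   0]
tpf, fpf = tpf_fpf(scores, labels, cutoff=0.5)   # → (0.75, 0.25)
```

The record's warning about association measures shows up here: a marker can have a large odds ratio while its (TPF, FPF) pair is still unacceptable for screening, so both fractions should be reported directly.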
Heggeseth, Brianna; Harley, Kim; Warner, Marcella; Jewell, Nicholas; Eskenazi, Brenda
2015-01-01
It has been hypothesized that environmental exposures at key development periods such as in utero play a role in childhood growth and obesity. To investigate whether in utero exposure to endocrine-disrupting chemicals, dichlorodiphenyltrichloroethane (DDT) and its metabolite, dichlorodiphenyldichloroethane (DDE), is associated with childhood physical growth, we took a novel statistical approach to analyze data from the CHAMACOS cohort study. To model heterogeneity in the growth patterns, we used a finite mixture model in combination with a data transformation to characterize body mass index (BMI) with four groups and estimated the association between exposure and group membership. In boys, higher maternal concentrations of DDT and DDE during pregnancy are associated with a BMI growth pattern that is stable until about age five followed by increased growth through age nine. In contrast, higher maternal DDT exposure during pregnancy is associated with a flat, relatively stable growth pattern in girls. This study suggests that in utero exposure to DDT and DDE may be associated with childhood BMI growth patterns, not just BMI level, and both the magnitude of exposure and sex may impact the relationship.
Farrington, C. Paddy; Noufaily, Angela; Andrews, Nick J.; Charlett, Andre
2016-01-01
A large-scale multiple surveillance system for infectious disease outbreaks has been in operation in England and Wales since the early 1990s. Changes to the statistical algorithm at the heart of the system were proposed and the purpose of this paper is to compare two new algorithms with the original algorithm. Test data to evaluate performance are created from weekly counts of the number of cases of each of more than 2000 diseases over a twenty-year period. The time series of each disease is separated into one series giving the baseline (background) disease incidence and a second series giving disease outbreaks. One series is shifted forward by twelve months and the two are then recombined, giving a realistic series in which it is known where outbreaks have been added. The metrics used to evaluate performance include a scoring rule that appropriately balances sensitivity against specificity and is sensitive to variation in probabilities near 1. In the context of disease surveillance, a scoring rule can be adapted to reflect the size of outbreaks and this was done. Results indicate that the two new algorithms are comparable to each other and better than the algorithm they were designed to replace. PMID:27513749
Khuu, Sieu K.; Cham, Joey; Hayes, Anthony
2017-01-01
In the present study, we investigated the detection of contours defined by constant curvature and the statistics of curved contours in natural scenes. In Experiment 1, we examined the degree to which human sensitivity to contours is affected by changing the curvature angle and by disrupting contour curvature continuity by varying the orientation of end elements. We find that (1) changing the angle of contour curvature decreased detection performance, while (2) end elements oriented in the direction (i.e., clockwise) of curvature facilitated contour detection regardless of the curvature angle of the contour. In Experiment 2 we further established that the relative effect of end-element orientation on contour detection depended not only on their orientation (collinear or cocircular), but also on their spatial separation from the contour, and on whether the contour shape was curved or not (i.e., C-shaped or S-shaped). Increasing the spatial separation of end elements reduced contour detection performance regardless of their orientation or the contour shape. However, at small separations, cocircular end elements facilitated the detection of C-shaped contours, but not S-shaped contours. The opposite result was observed for collinear end elements, which improved the detection of S-shaped, but not C-shaped, contours. These dissociative results confirmed that the visual system specifically codes contour curvature, but that the association of contour elements occurs locally. Finally, we undertook an analysis of natural images that mapped contours with a constant angular change and determined the frequency of occurrence of end elements with different orientations. Analogous to our behavioral data, this image analysis revealed that the mapped end elements of constantly curved contours are likely to be oriented clockwise to the angle of curvature. Our findings indicate that the visual system is selectively sensitive to contours defined by constant curvature and that this might
Suzuha Hatakeyama
2016-04-01
We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. We therefore hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We find that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.
Target detection for low angle radar based on multi-frequency order-statistics
Yunhe Cao; Shenghua Wang; Yu Wang; Shenghua Zhou
2015-01-01
For radar targets flying at low altitude, multipath propagation produces fading or enhancement relative to the level that would be expected in a free-space environment. In this paper, a new detection method based on a wide-ranging multi-frequency radar for low angle targets is proposed. Sequentially transmitted pulses with different frequencies are first applied to decorrelate the coherence of the direct and reflected echoes. After all echoes are received, the multi-frequency samples in the same range cell are sorted in descending order of amplitude, and a number of the high-amplitude echoes are accumulated to improve the signal-to-noise ratio; the optimal number of accumulated echoes is analyzed and given by experiments. Finally, simulation results are presented to verify the effectiveness of the method.
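A toy sketch of the order-statistics step described above: sort the multi-frequency echo amplitudes in one range cell in descending order and accumulate the k strongest before thresholding. The number of accumulated echoes k and the threshold are assumed tuning parameters, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def order_statistic_detector(samples, k, threshold):
    """Sort echo amplitudes in descending order, accumulate the k largest
    (the echoes least affected by multipath fading), and threshold."""
    ranked = np.sort(np.abs(samples))[::-1]   # descending amplitudes
    statistic = ranked[:k].sum()              # accumulate strongest echoes
    return statistic > threshold, statistic

# Toy example: 16 frequencies; multipath fades some echoes and enhances
# others. A few enhanced returns stand in for the unfaded frequencies.
echoes = rng.rayleigh(1.0, 16)
echoes[:4] += 3.0
detected, stat = order_statistic_detector(echoes, k=4, threshold=10.0)
print(detected, round(stat, 2))
```

Sorting before accumulation is what makes the statistic robust to the deeply faded frequencies: they simply never enter the sum.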
Lapierre FabianD
2010-01-01
For location purposes, maritime vessels longer than 45 meters are required to operate an Automatic Identification System (AIS) used by vessel traffic services. However, when a vessel shuts down its AIS, there is no means to detect it in the open sea. In this paper, we use Electro-Optical (EO) imagers for noncooperative vessel detection when the AIS is not operational. Compared with radar sensors, EO sensors have lower cost, lower payload, and lower computational processing load. The EO sensors are mounted on LEO microsatellites. We propose a real-time statistical methodology to estimate sensor Receiver Operating Characteristic (ROC) curves. It does not require the computation of the entire image received at the sensor. We then illustrate the use of this methodology to design a simple simulator that can help sensor manufacturers optimize the design of EO sensors for maritime applications.
Evaluation of the Wishart test statistics for polarimetric SAR data
Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut
2003-01-01
A test statistic for equality of two covariance matrices following the complex Wishart distribution has previously been used in new algorithms for change detection, edge detection and segmentation in polarimetric SAR images. Previously, the results for change detection and edge detection have been...
von Larcher, Thomas; Harlander, Uwe; Alexandrov, Kiril; Wang, Yongtai
2010-05-01
Experiments on baroclinic wave instabilities in a rotating cylindrical gap have long been performed, e.g., to uncover regular waves of different zonal wave number, to better understand the transition to the quasi-chaotic regime, and to reveal the underlying dynamical processes of complex wave flows. We present the application of appropriate multivariate data analysis methods to time series data sets acquired by the use of non-intrusive measurement techniques of quite different natures. While highly accurate Laser-Doppler-Velocimetry (LDV) is used for measurements of the radial velocity component at equidistant azimuthal positions, a highly sensitive thermographic camera measures the surface temperature field. The measurements are performed at particular parameter points where our former studies showed that complex wave patterns occur [1, 2]. Obviously, the temperature data set has much more information content than the velocity data set due to the particular measurement techniques. Both sets of time series data are analyzed using multivariate statistical techniques. While the LDV data sets are studied by applying Multi-Channel Singular Spectrum Analysis (M-SSA), the temperature data sets are analyzed by applying Empirical Orthogonal Functions (EOF). Our goals are (a) to verify the results yielded by the analysis of the velocity data and (b) to compare the data analysis methods. Therefore, the temperature data are processed so as to become comparable to the LDV data, i.e., the size of the data set is reduced in such a manner that the temperature measurements would notionally be performed at equidistant azimuthal positions only. This approach initially results in a great loss of information, but applying M-SSA to the reduced temperature data sets enables us to compare the methods. [1] Th. von Larcher and C. Egbers, Experiments on transitions of baroclinic waves in a differentially heated rotating annulus, Nonlinear Processes in Geophysics
EEG-based Drowsiness Detection for Safe Driving Using Chaotic Features and Statistical Tests.
Mardi, Zahra; Ashtiani, Seyedeh Naghmeh Miri; Mikaili, Mohammad
2011-05-01
Electroencephalography (EEG) is one of the most reliable sources for detecting sleep onset while driving. In this study, we have tried to demonstrate that sleepiness and alertness signals are separable with an appropriate margin by extracting suitable features. First, we recorded EEG signals from 10 volunteers, who were obliged to avoid sleeping for about 20 hours before the test. We recorded the signals while the subjects played a virtual driving game in which they tried to pass barriers shown on the monitor. The recording process ended after 45 minutes. After preprocessing the recorded signals, we labeled them as drowsy or alert using the times at which subjects passed or crashed into the barriers. We then extracted some chaotic features (including Higuchi's fractal dimension and Petrosian's fractal dimension) and the logarithm of the signal energy. By applying the two-tailed t-test, we have shown that these features can create a 95% significance level of difference between drowsiness and alertness in each EEG channel. The ability of each feature was evaluated with an artificial neural network, and the classification accuracy with all features was about 83.3%; this accuracy was obtained without performing any optimization process on the classifier.
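Two of the features named above have simple closed forms. A minimal sketch on synthetic signals, using the standard formula for Petrosian's fractal dimension (the EEG recordings and preprocessing are not reproduced, and the toy signals are made up):

```python
import numpy as np

def petrosian_fd(x):
    """Petrosian's fractal dimension: based on the number of sign changes
    in the first derivative of the signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diff = np.diff(x)
    n_delta = np.sum(diff[1:] * diff[:-1] < 0)   # derivative sign changes
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))

def log_energy(x):
    """Logarithm of the signal energy, the third feature mentioned."""
    x = np.asarray(x, dtype=float)
    return np.log(np.sum(x ** 2))

# Toy signals: a smooth slow wave standing in for "drowsy" EEG vs. a
# faster, noisier one standing in for "alert" EEG.
t = np.linspace(0, 1, 500)
drowsy = np.sin(2 * np.pi * 4 * t)
alert = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.default_rng(1).normal(size=500)
print(petrosian_fd(drowsy), petrosian_fd(alert))
```

The noisier signal has many more derivative sign changes and hence a higher fractal dimension, which is the separability the study exploits.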
Lin, Jen-Jen; Cheng, Jung-Yu; Huang, Li-Fei; Lin, Ying-Hsiu; Wan, Yung-Liang; Tsui, Po-Hsiang
2017-02-09
The Nakagami distribution is a useful approximation to the statistics of ultrasound backscattered signals for tissue characterization. The choice of estimator may affect the Nakagami parameter's ability to detect changes in backscattered statistics. In particular, the moment-based estimator (MBE) and the maximum likelihood estimator (MLE) are the two primary methods used to estimate the Nakagami parameters of ultrasound signals. This study explored the effects of the MBE and different MLE approximations on Nakagami parameter estimation. Ultrasound backscattered signals of different scatterer number densities were generated using a simulation model, and phantom experiments and measurements of human liver tissues were also conducted to acquire real backscattered echoes. Envelope signals were employed to estimate the Nakagami parameters by using the MBE, the first- and second-order approximations of the MLE (MLE1 and MLE2, respectively), and the Greenwood approximation (MLEgw) for comparison. The simulation results demonstrated that, compared with the MBE and MLE1, the MLE2 and MLEgw enabled more stable parameter estimation with small sample sizes. Notably, the required data length of the envelope signal was 3.6 times the pulse length. The phantom and tissue measurement results also showed that the Nakagami parameters estimated using the MLE2 and MLEgw could simultaneously differentiate various scatterer concentrations with lower standard deviations and reliably reflect physical meanings associated with the backscattered statistics. Therefore, the MLE2 and MLEgw are suggested as estimators for the development of Nakagami-based methodologies for ultrasound tissue characterization.
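The moment-based estimator mentioned above has a closed form. A minimal sketch using the standard MBE formulas, Omega = E[R^2] and m = (E[R^2])^2 / Var(R^2); the MLE approximations compared in the study are not reproduced here:

```python
import numpy as np

def nakagami_mbe(envelope):
    """Moment-based estimator (MBE) of the Nakagami shape parameter m
    and scale Omega from an envelope signal R."""
    r2 = np.asarray(envelope, dtype=float) ** 2
    omega = r2.mean()                 # Omega = E[R^2]
    m = omega ** 2 / r2.var()         # m = (E[R^2])^2 / Var(R^2)
    return m, omega

# Sanity check: for a Nakagami envelope, R^2 ~ Gamma(m, Omega/m).
# With m = 1, Omega = 2 this is a Rayleigh envelope, so the MBE
# should recover m close to 1 and Omega close to 2.
rng = np.random.default_rng(0)
samples = np.sqrt(rng.gamma(shape=1.0, scale=2.0, size=200_000))
m_hat, omega_hat = nakagami_mbe(samples)
print(round(m_hat, 3), round(omega_hat, 3))
```

The MBE's instability at small sample sizes, which motivates the MLE2/MLEgw recommendation, comes from the variance term in the denominator.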
Chakraborty, Jayasree; Rangayyan, Rangaraj M.; Banik, Shantanu; Mukhopadhyay, Sudipta; Leo Desautels, J. E.
2012-07-01
Architectural distortion is an important sign of early breast cancer. Due to its subtlety, it is often missed during screening. We propose a method to detect architectural distortion in prior mammograms of interval-cancer cases based on statistical measures of oriented patterns. Oriented patterns were analyzed in the present work because regions with architectural distortion contain a large number of tissue structures spread over a wide angular range. Two new types of cooccurrence matrices were derived to estimate the joint occurrence of the angles of oriented structures. Statistical features were computed from each of the angle cooccurrence matrices to discriminate sites of architectural distortion from falsely detected regions in normal parts of mammograms. A total of 4,224 regions of interest (ROIs) were automatically obtained from 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases with the application of Gabor filters and phase portrait analysis. For each ROI, Haralick's 14 features were computed using the angle cooccurrence matrices. The best result obtained in terms of the area under the receiver operating characteristic (ROC) curve with the leave-one-patient-out method was 0.76; the free-response ROC curve indicated a sensitivity of 80% at 4.2 false positives per patient.
Falagas, Matthew E; Kouranos, Vasilios D; Michalopoulos, Argyris; Rodopoulou, Sophia P; Athanasoulia, Anastasia P; Karageorgopoulos, Drosos E
2010-02-15
Comparative cohort studies are often conducted to identify novel therapeutic strategies or prognostic factors for ventilator-associated pneumonia (VAP). We aimed to evaluate the power of such studies to provide clinically and statistically significant conclusions with regard to mortality differences. We searched in PubMed and Scopus for comparative cohort studies that evaluated mortality in patients with VAP. We calculated the central estimates and corresponding 95% confidence intervals (CIs) for mortality differences between compared patient groups. We also calculated the statistical power of the included studies to detect a difference in mortality that corresponds to a risk ratio of 0.80. We identified 39 (20 prospective) comparative cohort studies on VAP as eligible for inclusion in this analysis. The median absolute risk difference in mortality between compared groups was 10% (interquartile range [IQR], 5%-18%), and the median width of the 95% CI of the absolute risk difference in mortality was 34% (IQR, 28%-42.5%). The median power of the included studies to detect a risk ratio for mortality of 0.80 was 14.7% (IQR, 10.6%-21.8%). There is considerable uncertainty around the central estimate of comparative cohort studies on VAP with regard to mortality differences. For a wiser use of resources allocated to research, we emphasize the need to conduct cohort studies with larger sample size so that potential differences between the compared groups are more likely to be shown.
Yang, Y.; Liu, W.
2017-09-01
Existing change-detection methods for fully polarimetric SAR do not take full advantage of the polarimetric information and suffer from high false alarm rates. To address these problems, this paper proposes a method based on a test statistic and a Gaussian mixture model. For the case of the 2016 flood disaster in Wuhan city, a difference image is obtained from a likelihood-ratio parameter built from the covariance matrix C3 or coherency matrix T3 of fully polarimetric SAR data using the test statistic, and the change information is extracted automatically by fitting a Gaussian mixture model (GMM) to the difference image with the expectation-maximization (EM) iterative algorithm. The experimental results show that, compared with the traditional constant false alarm rate (CFAR) method, the proposed method improves the overall accuracy of the change detection results and reduces the false alarm rate, demonstrating its validity and feasibility.
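A minimal sketch of the GMM/EM thresholding step on a synthetic difference image, using scikit-learn's GaussianMixture as a stand-in for the paper's implementation; the likelihood-ratio computation from the polarimetric matrices is not reproduced, and the distributions below are invented:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)

# Synthetic "difference image": most pixels unchanged (low likelihood-ratio
# values), a minority changed (high values).
unchanged = rng.normal(0.2, 0.05, 9000)
changed = rng.normal(0.8, 0.10, 1000)
diff_img = np.concatenate([unchanged, changed]).reshape(-1, 1)

# Two-component GMM fitted by EM; the component with the larger mean is
# taken as the "changed" class, yielding an automatic (non-CFAR) threshold.
gmm = GaussianMixture(n_components=2, random_state=0).fit(diff_img)
changed_comp = int(np.argmax(gmm.means_.ravel()))
labels = gmm.predict(diff_img)
change_mask = labels == changed_comp
print(f"changed pixels: {change_mask.sum()} / {diff_img.size}")
```

Because EM estimates both class distributions from the data itself, no false-alarm rate has to be fixed in advance, which is the contrast with CFAR drawn in the abstract.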
Degui Zhi
Recently, whole-genome sequencing, especially exome sequencing, has successfully led to the identification of causal mutations for rare monogenic Mendelian diseases. However, it is unclear whether this approach can be generalized and effectively applied to other Mendelian diseases with high locus heterogeneity. Moreover, the current exome sequencing approach has limitations such as false positive and false negative rates of mutation detection due to sequencing errors and other artifacts, but the impact of these limitations on experimental design has not been systematically analyzed. To address these questions, we present a statistical modeling framework to calculate the power, the probability of identifying truly disease-causing genes, under various inheritance models and experimental conditions, providing guidance for both proper experimental design and data analysis. Based on our model, we found that the exome sequencing approach is well-powered for mutation detection in recessive, but not dominant, Mendelian diseases with high locus heterogeneity. A disease gene responsible for as low as 5% of the disease population can be readily identified by sequencing just 200 unrelated patients. Based on these results, for identifying rare Mendelian disease genes, we propose that a viable approach is to combine, sequence, and analyze patients with the same disease together, leveraging the statistical framework presented in this work.
Peter M Visscher
2014-04-01
We have recently developed analysis methods (GREML) to estimate the genetic variance of a complex trait/disease and the genetic correlation between two complex traits/diseases using genome-wide single nucleotide polymorphism (SNP) data in unrelated individuals. Here we use analytical derivations and simulations to quantify the sampling variance of the estimate of the proportion of phenotypic variance captured by all SNPs for quantitative traits and case-control studies. We also derive the approximate sampling variance of the estimate of a genetic correlation in a bivariate analysis, when two complex traits are either measured on the same or different individuals. We show that the sampling variance is inversely proportional to the number of pairwise contrasts in the analysis and to the variance in SNP-derived genetic relationships. For bivariate analysis, the sampling variance of the genetic correlation additionally depends on the harmonic mean of the proportion of variance explained by the SNPs for the two traits and the genetic correlation between the traits, and depends on the phenotypic correlation when the traits are measured on the same individuals. We provide an online tool for calculating the power of detecting genetic (co)variation using genome-wide SNP data. The new theory and online tool will be helpful for planning experimental designs to estimate the missing heritability that has not yet been fully revealed through genome-wide association studies, and to estimate the genetic overlap between complex traits (diseases), in particular when the traits (diseases) are not measured on the same samples.
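The inverse-proportionality result quoted above implies a simple back-of-the-envelope calculator for the standard error of a SNP-heritability estimate. A sketch assuming Var(h2_hat) ~ 2 / (n^2 * var_rel) and the commonly cited value of about 2e-5 for the variance of off-diagonal SNP-derived relationships among unrelated humans; the paper's online tool implements the exact formulas:

```python
import math

def greml_h2_se(n, var_rel=2e-5):
    """Approximate standard error of a GREML SNP-heritability estimate
    for a quantitative trait measured on n unrelated individuals.
    var_rel is the variance of the off-diagonal SNP-derived genetic
    relationships (assumed value, ~2e-5 for unrelated humans)."""
    return math.sqrt(2.0 / (n ** 2 * var_rel))

for n in (2000, 5000, 10000):
    print(n, round(greml_h2_se(n), 3))
```

Note the 1/n scaling of the standard error (rather than the usual 1/sqrt(n)): doubling the sample halves the SE, because the number of pairwise contrasts grows quadratically with n.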
V C Vani; S Chatterjee
2008-05-01
Detection of periodic structures hidden in random surfaces has been addressed by us for some time, and the `extended matched filter' method developed by us has been shown to be effective in detecting the hidden periodic part from light scattering data in circumstances where conventional data analysis methods cannot reveal the successive peaks due to scattering by the periodic part of the surface. It has been shown that if r0 is the coherence length of light on scattering from the rough part and Λ is the wavelength of the periodic part of the surface, the extended matched filter method can detect hidden periodic structures for (r0/Λ) ≥ 0.11, while conventional methods are limited to much higher values ((r0/Λ) ≥ 0.33). In the method developed till now, the detection of periodic structures involves the detection of the central peak, first peak and second peak in the scattered intensity of light, located at scattering wave vectors q = 0, Q, 2Q, respectively, where Q = 2π/Λ, their distinct identities being obfuscated by the fact that the peaks have width Δq = 2π/r0 ≫ Q. The relative magnitudes of these peaks and the consequent problems associated with identifying them are discussed. The Kolmogorov-Smirnov statistical goodness-of-fit test is used to justify the identification of the peaks. This test is used to `reject' or `not reject' the null hypothesis, which states that the successive peaks do exist. The test is repeated for various values of r0/Λ, leading to the conclusion that there is really a periodic structure hidden behind the random surface.
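A minimal illustration of the reject / not-reject use of the Kolmogorov-Smirnov test described above, on synthetic data (the scattering model itself is not reproduced; the distributions below are invented stand-ins):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# "Not reject": data that really are standard-normal noise, standing in
# for intensity residuals with no peak present.
noise = rng.normal(0.0, 1.0, 500)
stat_noise, p_noise = stats.kstest(noise, "norm")

# "Reject": the same test applied to data whose distribution has shifted,
# standing in for residuals that do contain an unmodelled peak.
shifted = rng.normal(0.5, 1.0, 500)
stat_shifted, p_shifted = stats.kstest(shifted, "norm")

print(f"noise:   D={stat_noise:.3f}, p={p_noise:.3g}")
print(f"shifted: D={stat_shifted:.3f}, p={p_shifted:.3g}")
```

A small p-value rejects the null that the sample follows the reference distribution; repeating the test across values of r0/Λ, as the paper does, traces out where peak identification stops being statistically defensible.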
Liu, Guosheng; Seo, Eun-Kyoung
2013-02-01
It has long been believed that the dominant microwave signature of snowfall over land is the brightness temperature decrease caused by ice scattering. However, our analysis of multiyear satellite data revealed that on most occasions, brightness temperatures are actually higher under snowfall than non-snowfall conditions, likely due to emission by cloud liquid water. This brightness temperature increase masks the scattering signature and complicates the snowfall detection problem. In this study, we propose a statistical method for snowfall detection, developed by using CloudSat radar to train high-frequency passive microwave observations. To capture the major variations of the brightness temperatures and reduce the dimensionality of the independent variables, the detection algorithm is designed to use the information contained in the first three principal components resulting from an Empirical Orthogonal Function (EOF) analysis, which capture ~99% of the total variance of the brightness temperatures. Given a multichannel microwave observation, the algorithm first transforms the brightness temperature vector into EOF space and then retrieves a probability of snowfall by using the CloudSat radar-trained look-up table. Validation has been carried out through case studies and averaged horizontal snowfall fraction maps. The results indicate that the algorithm has clear skill in identifying snowfall areas, even over mountainous regions.
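The EOF step described above is equivalent to a principal component analysis of the brightness-temperature covariance. A minimal sketch on synthetic multichannel data; the channel count and variances are made up, so the captured variance here is far below the ~99% reported for real observations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multichannel brightness temperatures: 9 channels,
# with a shared across-channel component plus independent channel noise.
n_obs, n_chan = 5000, 9
tb = rng.normal(250.0, 15.0, (n_obs, n_chan)) + rng.normal(0.0, 5.0, (n_obs, 1))

anom = tb - tb.mean(axis=0)                  # channel-wise anomalies
cov = np.cov(anom, rowvar=False)
evals, evecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
pcs = anom @ evecs[:, order[:3]]             # first three EOF coefficients

explained = evals[order[:3]].sum() / evals.sum()
print(f"variance captured by 3 EOFs: {explained:.1%}")
```

In the algorithm, a new observation's `pcs` coordinates would index the CloudSat-trained look-up table to retrieve a snowfall probability.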
Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe
2017-07-01
This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Shen, Wen; Li, Suiqiong; Horikawa, Shin; Petrenko, Valery A.; Barbaree, James; Chin, Bryan A.
2011-06-01
This work demonstrated a direct detection of Salmonella on fresh food produce using groups of magnetoelastic biosensors. The magnetoelastic biosensors were coated with E2 phage, which specifically binds with S. typhimurium. The resonance frequency of the biosensor is measured using a pulse excitation system, which allows simultaneous detection of multiple sensors. Multiple measurement and control biosensors were placed on fresh food surfaces that had been spiked with a known amount of Salmonella. Binding with bacteria was allowed to occur for 30 minutes in a humid air environment. The resonance frequencies of the groups of biosensors were then measured to determine the amount of bound bacteria. By using a statistical experimental design and by taking the average of repeated measurements, possible detection errors are decreased. By using multiple sensors at each site of interest, a higher portion of the contaminated surface has contact with biosensors, allowing for more complete information on the food produce surface. Results from SEM pictures of the sensor surface agree with the sensor frequency response results.
Cohn, T.A.; England, J.F.; Berenbrock, C.E.; Mason, R.R.; Stedinger, J.R.; Lamontagne, J.R.
2013-01-01
The Grubbs-Beck test is recommended by the federal guidelines for detection of low outliers in flood flow frequency computation in the United States. This paper presents a generalization of the Grubbs-Beck test for normal data (similar to the Rosner (1983) test; see also Spencer and McCuen (1996)) that can provide a consistent standard for identifying multiple potentially influential low flows. In cases where low outliers have been identified, they can be represented as “less-than” values, and a frequency distribution can be developed using censored-data statistical techniques, such as the Expected Moments Algorithm. This approach can improve the fit of the right-hand tail of a frequency distribution and provide protection from lack-of-fit due to unimportant but potentially influential low flows (PILFs) in a flood series, thus making the flood frequency analysis procedure more robust.
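A sketch of a Grubbs-Beck-style low-outlier screen in the spirit of the abstract. The K_N approximation used below is the Bulletin 17B 10%-significance formula (an assumption; the paper's multiple-Grubbs-Beck generalization differs in detail), and the flow series is invented:

```python
import numpy as np

def grubbs_beck_low_outliers(flows):
    """Iteratively flag low outliers in annual peak flows: a flow is an
    outlier if its log10 value falls below mean - K_N * sd of the
    remaining log flows. Flagged flows would then be treated as
    censored ("less-than") values in the frequency analysis."""
    x = np.log10(np.asarray(flows, dtype=float))
    outliers = np.zeros(len(x), dtype=bool)
    while True:
        kept = x[~outliers]
        n = len(kept)
        # Bulletin 17B one-sided 10%-significance approximation for K_N.
        k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
        threshold = kept.mean() - k_n * kept.std(ddof=1)
        new = (~outliers) & (x < threshold)
        if not new.any():
            return outliers
        outliers |= new

# Toy series: typical floods near 1000 units with one very small value.
flows = [950, 1200, 1100, 870, 1300, 990, 1050, 5, 1150, 1020]
print(grubbs_beck_low_outliers(flows))
```

Censoring the flagged value rather than deleting it is what preserves the fit of the right-hand (flood) tail, as the abstract notes.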
Mark Frogley
2013-01-01
To reduce maintenance costs, avoid catastrophic failures, and improve wind turbine transmission system reliability, an online condition monitoring system is critically important. In real applications, many rotating mechanical faults, such as bearing surface defects, gear tooth cracks, and chipped gear teeth, generate impulsive signals. When these types of faults develop inside rotating machinery, each time the rotating components pass over the damage point an impact force can be generated. The impact force causes a ringing of the support structure at its structural natural frequency. By effectively detecting those periodic impulse signals, a group of rotating machine faults can be detected and diagnosed. However, in real wind turbine operations, impulsive fault signals are usually weak relative to the background noise and the vibration signals generated by other healthy components, such as shafts, blades, and gears. Moreover, wind turbine transmission systems work under dynamic operating conditions, which further increases the difficulty of fault detection and diagnosis. Therefore, advanced signal processing methods that enhance the impulsive signals are in great need. In this paper, an adaptive filtering technique is applied to enhance the signal-to-noise ratio of fault impulses in wind turbine gear transmission systems. Multiple statistical features designed to quantify the impulsive content of the processed signal are extracted for bearing fault detection. The multidimensional features are then transformed into a one-dimensional feature, and a minimum-error-rate classifier is designed based on the compressed feature to identify gear transmission systems with defects. Real wind turbine vibration signals are used to demonstrate the effectiveness of the presented methodology.
Poghosyan, G. V.
2013-12-01
A statistical analysis of time intervals between the dates of birth of genetic relatives has been carried out on the basis of 33 family trees. Using the Monte Carlo method, a significant departure of the distribution of birthdays from random results is detected relative to two long-period solar harmonics known from the theory of the Earth tides, i.e., a solar elliptical wave (Sa) with a period of an anomalistic year (365.259640 days) and a solar declinational wave (Ssa) with a period of half of the tropical year (182.621095 days). Further research requires larger statistical samples and involves clarifying the effect of long-period lunar harmonics, i.e., a lunar elliptical wave (Mm) with a period of an anomalistic month (27.554551 days) and a lunar declinational wave (Mf) with a period of half of a tropical month (13.660791 days), as well as the impact of important lunar and solar tides of time intervals with periods of half (14.765294 days, the interval between syzygial tides at new and full moon) and a whole (29.530588 days) synodic month. It is known that the periodic compression and stretching of the Earth's crust at the time of the tides, by means of the piezoelectric effect, lead to the generation of long-period electric oscillations with periods corresponding to the harmonics of the theory of the Earth tides. The detection of these harmonics in connection with biological processes will make it possible to determine the impact of regular cosmogeophysical fluctuations (tidal waves) on the processes in the biosphere.
Rezvan Abbasi
2017-08-01
Electroencephalogram (EEG) signals have long been used in medical diagnosis, and evaluation of the statistical characteristics of EEG signals is the foundation of all brain signal processing methods. Since correct prediction of disease status is of utmost importance, the goal is to use models that have minimum error and maximum reliability. In an automatic epileptic seizure detection system, we should be able to distinguish between EEG signals before, during and after a seizure. Extracting useful characteristics from EEG data can greatly increase the classification accuracy. In this new approach, we first decompose the EEG signals into sub-bands with the help of the discrete wavelet transform (DWT) and then derive statistical characteristics such as maximum, minimum, average and standard deviation for each sub-band. A multilayer perceptron (MLP) neural network was used to assess the different scenarios of healthy and seizure signals among the collected signal sets. To assess the success and effectiveness of the proposed method, the confusion matrix was used, and an accuracy of 98.33 percent was achieved. Given the limitations and obstacles in analyzing EEG signals, the proposed method can greatly help professionals, experimentally and visually, in the classification and diagnosis of epileptic seizures.
Chen, Chien-Chou; Teng, Yung-Chu; Lin, Bo-Cheng; Fan, I-Chun; Chan, Ta-Chien
2016-11-25
Cases of dengue fever have increased in areas of Southeast Asia in recent years. Taiwan hit a record-high 42,856 cases in 2015, with the majority in southern Tainan and Kaohsiung Cities. Leveraging spatial statistics and geo-visualization techniques, we aim to design an online analytical tool for local public health workers to prospectively identify ongoing hot spots of dengue fever weekly at the village level. A total of 57,516 confirmed cases of dengue fever in 2014 and 2015 were obtained from the Taiwan Centers for Disease Control (TCDC). Incorporating demographic information as covariates with cumulative cases (365 days) in a discrete Poisson model, we iteratively applied space-time scan statistics with SaTScan software to detect the currently active cluster of dengue fever (reported as relative risk) in each village of Tainan and Kaohsiung every week. A village with a relative risk >1 and a statistically significant p value was flagged as an ongoing hot spot, allowing the tool to track dengue fever transmission on a weekly basis at the village level.
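The quantity that SaTScan maximizes over candidate space-time cylinders under the discrete Poisson model can be sketched as below. This is an illustrative likelihood-ratio computation for a single candidate cluster, not the SaTScan implementation; the function and variable names are hypothetical.

```python
import math

def poisson_llr(c, e, C, E):
    """Log likelihood ratio for one candidate space-time cylinder under
    Kulldorff's discrete Poisson model.
    c: observed cases inside the cylinder, e: expected cases inside,
    C: total observed cases, E: total expected cases (C == E after
    covariate adjustment)."""
    if c <= e:            # only elevated-risk clusters are of interest
        return 0.0
    llr = c * math.log(c / e)
    if C - c > 0:
        llr += (C - c) * math.log((C - c) / (E - e))
    return llr

def relative_risk(c, e, C):
    """Relative risk inside vs. outside the cylinder, as reported
    per village in the text above."""
    return (c / e) / ((C - c) / (C - e))
```

The cylinder with the maximum LLR is the most likely cluster; its p value is then obtained by Monte Carlo replication under the null model.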
Wang, X; Heimann, T; Lo, P; Sumkauskaite, M; Puderbach, M; de Bruijne, M; Meinzer, H P; Wegner, I
2012-08-21
The segmentation of tree-like tubular structures such as coronary arteries and airways is an essential step for many 3D medical imaging applications. Statistical tracking techniques for the extraction of elongated structures have received considerable attention in recent years due to their robustness against image noise and pathological changes. However, most tracking methods are limited to a specific application and do not support branching structures efficiently. In this work, we present a novel statistical tracking approach for the extraction of different types of tubular structures with ringlike cross-sections. Domain-specific knowledge is learned from training data sets and integrated into the tracking process by simple adaption of parameters. In addition, an efficient branching detection algorithm is presented. This approach was evaluated by extracting coronary arteries from 32 CTA data sets and distal airways from 20 CT scans. These data sets were provided by the organizers of the workshop '3D Segmentation in the Clinic: A Grand Challenge II-Coronary Artery Tracking (CAT08)' and 'Extraction of Airways from CT 2009 (EXACT'09)'. On average, 81.5% overlap and 0.51 mm accuracy for the tracking of coronary arteries were achieved. For the extraction of airway trees, 51.3% of the total tree length, 53.6% of the total number of branches and a 4.98% false positive rate were attained. In both experiments, our approach is comparable to state-of-the-art methods.
Bowles, Roland L.; Buck, Bill K.
2009-01-01
The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew; they were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
Hyde, J M; Cerezo, A; Williams, T J
2009-04-01
Statistical analysis of atom probe data has improved dramatically in the last decade and it is now possible to determine the size, the number density and the composition of individual clusters or precipitates such as those formed in reactor pressure vessel (RPV) steels during irradiation. However, the characterisation of the onset of clustering or co-segregation is more difficult and has traditionally focused on the use of composition frequency distributions (for detecting clustering) and contingency tables (for detecting co-segregation). In this work, the authors investigate the possibility of directly examining the neighbourhood of each individual solute atom as a means of identifying the onset of solute clustering and/or co-segregation. The methodology involves comparing the mean observed composition around a particular type of solute with that expected from the overall composition of the material. The methodology has been applied to atom probe data obtained from several irradiated RPV steels. The results show that the new approach is more sensitive to fine scale clustering and co-segregation than that achievable using composition frequency distribution and contingency table analyses.
Barnett, C.S.
1978-10-12
Some of the statistical questions associated with problems of detecting random-point-process signals embedded in random-point-process noise are examined. An example of such a problem is that of searching for a lost radioactive source with a moving detection system. The emphasis is on theoretical questions, but some experimental and Monte Carlo results are used to test the theoretical results. Several idealized binary decision problems are treated by starting with simple, specific situations and progressing toward more general problems. This sequence of decision problems culminates in the minimum-cost-expectation rule for deciding between two Poisson processes with arbitrary intensity functions. As an example, this rule is then specialized to the detector-passing-a-point-source decision problem. Finally, Monte Carlo techniques are used to develop and test one estimation procedure: the maximum-likelihood estimation of a parameter in the intensity function of a Poisson process. For the Monte Carlo test this estimation procedure is specialized to the detector-passing-a-point-source case. Introductory material from probability theory is included so as to make the report accessible to those not especially conversant with probabilistic concepts and methods. 16 figures.
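For constant intensities, the decision rule between the two Poisson processes described above reduces to a count-based likelihood-ratio threshold test. A minimal sketch, assuming homogeneous rates lam0 (noise only) and lam1 (signal plus noise) on an observation interval [0, T]; the report treats the more general inhomogeneous-intensity case:

```python
import math

def log_likelihood_ratio(event_times, lam0, lam1, T):
    """Log likelihood ratio for deciding between two homogeneous Poisson
    processes observed on [0, T].  With constant intensities only the
    event count matters, not the individual arrival times."""
    n = len(event_times)
    return n * math.log(lam1 / lam0) - (lam1 - lam0) * T

def decide(event_times, lam0, lam1, T, threshold=0.0):
    """Minimum-cost-expectation rule: compare the log likelihood ratio
    with a threshold determined by the prior odds and decision costs
    (zero here corresponds to equal priors and symmetric costs)."""
    return log_likelihood_ratio(event_times, lam0, lam1, T) > threshold
```

In the detector-passing-a-point-source problem the constant lam1 is replaced by a time-varying intensity peaking at closest approach, and the count term becomes a sum of log intensity values at the event times.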
Zhan, Yimin; Mechefske, Chris K.
2007-07-01
Optimal maintenance decision analysis is heavily dependent on the accuracy of condition indicators. A condition indicator that is subject to such varying operating conditions as load is unable to provide precise condition information of the monitored object for making optimal operational maintenance decisions even if the maintenance program is established within a rigorous theoretical framework. For this reason, the performance of condition monitoring techniques applied to rotating machinery under varying load conditions has been a long-term concern and has attracted intensive research interest. Part I of this study proposed a novel technique based on adaptive autoregressive modeling and hypothesis tests. The method is able to automatically search for the optimal time-series model order and establish a compromised autoregressive model fitting based on the healthy gear motion residual signals under varying load conditions. The condition of the monitored gearbox is numerically represented by a modified Kolmogorov-Smirnov test statistic. Part II of this study is devoted to applications of the proposed technique to entire lifetime condition detection of three gearboxes with distinct physical specifications, distinct load conditions, and distinct failure modes. A comprehensive and thorough comparative study is conducted between the proposed technique and several counterparts. The detection technique is further enhanced by a proposed method to automatically identify and generate fault alerts with the aid of the Wilcoxon rank-sum test and thus requires no supervision from maintenance personnel. Experimental analysis demonstrated that the proposed technique applied to automatic identification and generation of fault alerts also features two highly desirable properties, i.e. few false alerts and early alert for incipient faults. Furthermore, it is found that the proposed technique is able to identify two types of abnormalities, i.e. strong ghost components abruptly
Larsen, L.; Watts, D.; Khurana, A.; Anderson, J. L.; Xu, C.; Merritts, D. J.
2015-12-01
The classic signal of self-organization in nature is pattern formation. However, the interactions and feedbacks that organize depositional landscapes do not always result in regular or fractal patterns. How might we detect their existence and effects in these "irregular" landscapes? Emergent landscapes such as newly forming deltaic marshes or some restoration sites provide opportunities to study the autogenic processes that organize landscapes and their physical signatures. Here we describe a quest to understand autogenic vs. allogenic controls on landscape evolution in Big Spring Run, PA, a landscape undergoing restoration from bare-soil conditions to a target wet meadow landscape. The contemporary motivation for asking questions about autogenic vs. allogenic controls is to evaluate how important initial conditions or environmental controls may be for the attainment of management objectives. However, these questions can also inform interpretation of the sedimentary record by enabling researchers to separate signals that may have arisen through self-organization processes from those resulting from environmental perturbations. Over three years at Big Spring Run, we mapped the dynamic evolution of floodplain vegetation communities and distributions of abiotic variables and topography. We used principal component analysis and transition probability analysis to detect associative interactions between vegetation and geomorphic variables and convergent cross-mapping on lidar data to detect causal interactions between biomass and topography. Exploratory statistics revealed that plant communities with distinct morphologies exerted control on landscape evolution through stress divergence (i.e., channel initiation) and promoting the accumulation of fine sediment in channels. Together, these communities participated in a negative feedback that maintains low energy and multiple channels. Because of the spatially explicit nature of this feedback, causal interactions could not
Meng, X.; Daniels, C.; Smith, E.; Peng, Z.; Chen, X.; Wagner, L. S.; Fischer, K. M.; Hawman, R. B.
2015-12-01
Since 2001, the number of M>3 earthquakes has increased significantly in the Central and Eastern United States (CEUS), likely due to waste-water injection; these are known as "induced earthquakes" [Ellsworth, 2013]. Because induced earthquakes are driven by short-term external forcing, they may behave like earthquake swarms, which are not well characterized by branching point-process models such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. In this study we focus on the 02/15/2014 M4.1 South Carolina and the 06/16/2014 M4.3 Oklahoma earthquakes, which likely represent intraplate tectonic and induced events, respectively. For the South Carolina event, only one M3.0 aftershock is identified in the ANSS catalog, which may be caused by a lack of low-magnitude events in this catalog. We apply a recently developed matched filter technique to detect earthquakes from 02/08/2014 to 02/22/2014 around the epicentral region. 15 seismic stations (from both permanent and temporary USArray networks) within 100 km of the mainshock are used for detection. The mainshock and aftershock are used as templates for the initial detection. Newly detected events are employed as new templates, and the same detection procedure repeats until no new event can be added. Overall we have identified more than 10 events, including one foreshock that occurred ~11 min before the M4.1 mainshock. However, the number of aftershocks is still much smaller than predicted by the modified Bath's law. For the Oklahoma event, we use 1270 events from the ANSS catalog and 182 events from a relocated catalog as templates to scan through continuous recordings from 3 days before to 7 days after the mainshock. 12 seismic stations within the vicinity of the mainshock are included in the study. After obtaining more complete catalogs for both sequences, we plan to compare the statistical parameters (e.g., b, a, K, and p values) between the two sequences, as well as their spatial-temporal migration patterns, which may
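A matched filter detector of the kind used here slides a waveform template along continuous data and flags windows whose normalized cross-correlation exceeds a threshold. A minimal single-channel sketch; real implementations stack correlations over many stations and components and set the threshold from the daily noise level (e.g., a multiple of the median absolute deviation), whereas a fixed threshold is used here for simplicity:

```python
import math

def matched_filter(data, template, threshold=0.8):
    """Return start indices where the normalized cross-correlation
    between `template` and a sliding window of `data` exceeds
    `threshold`.  Flat (zero-variance) windows are skipped."""
    m = len(template)
    t_mean = sum(template) / m
    t_dev = [t - t_mean for t in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    detections = []
    for i in range(len(data) - m + 1):
        window = data[i:i + m]
        w_mean = sum(window) / m
        w_dev = [w - w_mean for w in window]
        w_norm = math.sqrt(sum(d * d for d in w_dev))
        if w_norm == 0:
            continue
        cc = sum(a * b for a, b in zip(t_dev, w_dev)) / (t_norm * w_norm)
        if cc >= threshold:
            detections.append(i)
    return detections
```

Each detection can then be promoted to a new template and the scan repeated, as described in the iterative procedure above.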
Williams, Jack D; Yazarians, Jessica A; Almeyda, Chelcie C; Anderson, Kristin A; Boyce, Gregory R
2016-06-01
The discovery of the (+)-α-thujone and (-)-β-thujone stereoisomers in the essential oil of sage (Salvia officinalis L.) and dietary supplements is documented for the first time. The detection was accomplished using a chiral resolution protocol of racemic α-/β-thujone on headspace solid-phase microextraction-gas chromatography-mass spectrometry. Because the previously unreported stereoisomers, (+)-α-thujone and (-)-β-thujone, are not commercially available, a three-step synthesis of racemic thujone from commercially available starting materials was developed. Thermolysis studies demonstrated that no racemization at the cyclopropane stereocenters occurs, corroborating that the detection is not an artifact from the hydrodistillation process. The developed chiral resolution of thujone was also used to provide evidence for the absence of the (+)-α-thujone and (-)-β-thujone enantiomers in other common thujone-containing essential oils.
Michel Parrot
2012-04-01
Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of the night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted from 15 days before the earthquake, when the track of the satellite orbit was at less than 1,500 km from the earthquake epicenter. The result of this first step provided the variations relative to the background in the vicinity of the epicenter for each of the 15 days before each earthquake. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on calculation of the median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases, where on the one hand, the locations of the epicenters were randomly modified, and on the other hand, the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to the earthquakes than prior to random events, and that the perturbations increase with the earthquake magnitudes.
Simon eTousseyn
2014-07-01
There is currently a lack of knowledge about EEG-fMRI specificity. Our aim was to define the sensitivity and specificity of Blood Oxygen Level Dependent (BOLD) responses to interictal epileptic spikes during EEG-fMRI for detecting the ictal onset zone (IOZ). We studied 21 refractory focal epilepsy patients who had a well-defined IOZ after a full presurgical evaluation and interictal spikes during EEG-fMRI. Areas of spike-related BOLD changes overlapping the IOZ in patients were considered as true positives; if no overlap was found, they were treated as false negatives. Matched healthy case-controls underwent similar EEG-fMRI in order to determine the true negative and false positive fractions. The spike-related regressor of the patient was used in the design matrix of the healthy case-control. Suprathreshold BOLD changes in the brain of controls were considered as false positives, and the absence of these changes as true negatives. Sensitivity and specificity were calculated for different statistical thresholds at the voxel level combined with different cluster size thresholds and represented in receiver operating characteristic (ROC) curves. Additionally, we calculated the ROC curves based upon the cluster containing the maximal significant activation. We achieved a combination of 100% specificity and 62% sensitivity, using a Z-threshold in the interval 3.4-3.5 and a cluster size threshold of 350 voxels. We could obtain higher sensitivity at the expense of specificity. Similar performance was found when using the cluster containing the maximal significant activation. Our data provide a guideline for different EEG-fMRI settings with their respective sensitivity and specificity for detecting the ictal onset zone. The unique cluster containing the maximal significant BOLD activation was a sensitive and specific marker of the ictal onset zone.
Neubert, A.; Fripp, J.; Engstrom, C.; Schwarz, R.; Lauer, L.; Salvado, O.; Crozier, S.
2012-12-01
Recent advances in high resolution magnetic resonance (MR) imaging of the spine provide a basis for the automated assessment of intervertebral disc (IVD) and vertebral body (VB) anatomy. High resolution three-dimensional (3D) morphological information contained in these images may be useful for early detection and monitoring of common spine disorders, such as disc degeneration. This work proposes an automated approach to extract the 3D segmentations of lumbar and thoracic IVDs and VBs from MR images using statistical shape analysis and registration of grey level intensity profiles. The algorithm was validated on a dataset of volumetric scans of the thoracolumbar spine of asymptomatic volunteers obtained on a 3T scanner using the relatively new 3D T2-weighted SPACE pulse sequence. Manual segmentations and expert radiological findings of early signs of disc degeneration were used in the validation. There was good agreement between manual and automated segmentation of the IVD and VB volumes with the mean Dice scores of 0.89 ± 0.04 and 0.91 ± 0.02 and mean absolute surface distances of 0.55 ± 0.18 mm and 0.67 ± 0.17 mm respectively. The method compares favourably to existing 3D MR segmentation techniques for VBs. This is the first time IVDs have been automatically segmented from 3D volumetric scans and shape parameters obtained were used in preliminary analyses to accurately classify (100% sensitivity, 98.3% specificity) disc abnormalities associated with early degenerative changes.
Malm, Christer B.; Khoo, Nelson S.; Granlund, Irene; Lindstedt, Emilia; Hult, Andreas
2016-01-01
The discovery of erythropoietin (EPO) simplified blood doping in sports, but improved detection methods for EPO have forced cheating athletes to return to blood transfusion. Autologous blood transfusion with cryopreserved red blood cells (RBCs) is the method of choice, because no valid method exists to accurately detect such an event. In endurance sports, it can be estimated that elite athletes improve performance by up to 3% with blood doping, regardless of method. Valid detection methods for autologous blood doping are important to maintain the credibility of athletic performances. Recreational male (N = 27) and female (N = 11) athletes served as Transfusion (N = 28) and Control (N = 10) subjects in two different transfusion settings. Hematological variables and physical performance were measured before donation of 450 or 900 mL whole blood, and until four weeks after re-infusion of the cryopreserved RBC fraction. Blood was analyzed for transferrin, iron, Hb, EVF, MCV, MCHC, reticulocytes, leucocytes and EPO. Repeated measures multivariate analysis of variance (MANOVA) and pattern recognition using Principal Component Analysis (PCA) and Orthogonal Projections of Latent Structures (OPLS) discriminant analysis (DA) investigated differences between the Control and Transfusion groups over time. A significant increase in performance (15 ± 8%) and VO2max (17 ± 10%) (mean ± SD) could be measured 48 h after RBC re-infusion, and remained increased for up to four weeks in some subjects. In total, 533 blood samples were included in the study (Clean = 220, Transfused = 313). In response to blood transfusion, the largest change in hematological variables occurred 48 h after blood donation, when the Control and Transfused groups could be separated with OPLS-DA (R2 = 0.76/Q2 = 0.59). RBC re-infusion resulted in the best model (R2 = 0.40/Q2 = 0.10) at the first sampling point (48 h), predicting one false positive and one false negative. Overall, a 25% and 86% false positive ratio was
Riehn, Katharina; Hasenclever, Dirk; Petroff, David; Nöckler, Karsten; Mayer-Scholl, Anne; Makrutzki, Gregor; Lücker, Ernst
2013-05-20
Proficiency testing (PT) is the use of inter-laboratory comparisons to determine the performance of individual laboratories for specific tests or measurements, and to monitor a laboratory's performance. Participation in proficiency testing provides laboratories with an objective means of assessing and demonstrating the reliability of the data they are producing. To ensure the reliability of Trichinella detection and meat hygiene within the European Union and afford optimal protection to the consumer, PT is conducted under the direction of the European National Reference Laboratories for Trichinella. Evaluation of data from the national PT showed that lab-internal shortcomings are frequent. These shortcomings are specifically related to: (1) improper sample collection and preparation; (2) incorrect transposition and application of the protocol as laid down in Annex I, Chapter I, Nr. 3 (a-g) of the Commission Regulation (EC) No. 2075/2005; (3) insufficient sedimentation times; and (4) improper equipment (e.g. Prost and Nowakowski, 1990; Forbes and Gajadhar, 1999; Rossi and Pozio, 2008). To test the hypothesis that both method-based errors as well as internal lab errors can influence the accuracy and precision of the magnetic stirrer method for pooled sample digestion (MSM), we initiated a study to evaluate the analytical uncertainty of the MSM. Results presented here are based on: (i) data from PT in Germany (2008, 2009, and 2010); (ii) within-lab performance conducting high volumes of MSM; (iii) larval recovery experiments; and (iv) statistical evaluation of data resulting from these procedures. Quantitative data from the PT show that on average only 60% of Trichinella larvae were detected. Even laboratories that showed relatively good performance (>80% larva recovery, no false negative or false positive results) frequently reported samples with an unexpectedly low larval count (loss of >2 larvae). In our own laboratory, high numbers of
Wahyu Widji Pamungkas
2017-07-01
The achievement of the national palm oil industry as a producer and exporter of crude palm oil (CPO) has now given rise to concerns about its sustainability. The growth of the upstream and downstream segments of the national palm oil industry has been unbalanced, which drives industry players toward exporting raw CPO and forgoing the added value that could be captured domestically. Although this export orientation brings in foreign exchange for the country, commodity exports are prone to barriers in the international market. It is therefore important to provide a means of monitoring, prediction and assessment to support the formulation of policies for the marketing of the national CPO industry. This research proposes a model framework with an adaptive threshold, called statistical control detection adaptive (SCDA), as a means of monitoring, predicting, and assessing the movement of national CPO production volume. The idea of SCDA is to determine a dynamic threshold by mapping the pattern of historical data and predictions in terms of frequency and trend. The SCDA model adapts techniques from statistical process control (SPC), while the predicted values are generated from a prediction model developed with a back-propagation artificial neural network (ANN-BP) trained on historical national CPO production volumes. The data used were the average annual national CPO production volumes for the period 1967 to 2015. The simulation results predicted national CPO production volumes for 2016 to 2018 of 31.025, 32.214, and 34.504 million tons, respectively, while the maximum and minimum thresholds formed in the SCDA model predictions for 2016-2018 were 33,322,065 and 29,246,547 tons, respectively. As far as the literature search shows, SCDA modeling has never
Stroebel, Armin M
2010-11-08
Background: Animals, including humans, exhibit a variety of biological rhythms. This article describes a method for the detection and simultaneous comparison of multiple nycthemeral rhythms. Methods: A statistical method for detecting periodic patterns in time-related data via harmonic regression is described. The method is particularly capable of detecting nycthemeral rhythms in medical data. Additionally, a method for simultaneously comparing two or more periodic patterns is described, which derives from the analysis of variance (ANOVA). This method statistically confirms or rejects the equality of periodic patterns. Mathematical descriptions of the detecting method and the comparing method are given. Results: Nycthemeral rhythms of incidents of bodily harm in Middle Franconia are analyzed in order to demonstrate both methods. Every day of the week showed a significant nycthemeral rhythm of bodily harm. These seven patterns of the week were compared to each other, revealing only two different nycthemeral rhythms, one for Friday and Saturday and one for the other weekdays.
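Harmonic regression of the kind described fits a mean level plus cosine and sine terms at a known period (24 h for a nycthemeral rhythm) by least squares. A minimal sketch assuming a single harmonic and a well-conditioned design; the significance testing and ANOVA-based pattern comparison described in the article are omitted:

```python
import math

def fit_harmonic(times, values, period=24.0):
    """Least-squares fit of y ~ m + a*cos(w*t) + b*sin(w*t), the basic
    harmonic-regression (cosinor) model.  Returns (mesor, amplitude,
    acrophase), where y ~ m + R*cos(w*t - phase)."""
    w = 2 * math.pi / period
    n = len(times)
    c = [math.cos(w * t) for t in times]
    s = [math.sin(w * t) for t in times]
    # 3x3 normal equations, solved by unpivoted Gaussian elimination
    # (adequate for a well-conditioned design as assumed here)
    A = [[float(n), sum(c), sum(s)],
         [sum(c), sum(ci * ci for ci in c), sum(ci * si for ci, si in zip(c, s))],
         [sum(s), sum(ci * si for ci, si in zip(c, s)), sum(si * si for si in s)]]
    y = [sum(values),
         sum(v * ci for v, ci in zip(values, c)),
         sum(v * si for v, si in zip(values, s))]
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            y[j] -= f * y[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (y[i] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]
    m, a, b = x
    return m, math.hypot(a, b), math.atan2(b, a)
```

A rhythm is then declared present when the fitted amplitude is significantly nonzero, e.g. via an F-test against the mean-only model.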
Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning
2016-01-01
Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution with an associated p-value and a factorization of this test statistic, change analysis in a short sequence of multilook, polarimetric SAR data...
Scholl, Joep H G; van Puijenbroek, Eugène P
2016-01-01
PURPOSE: In pharmacovigilance, the commonly used disproportionality analysis (DPA) in statistical signal detection is known to have its limitations. The aim of this study was to investigate the value of the time to onset (TTO) of ADRs in addition to DPA. METHODS: We performed a pilot study using
Muino, J.M.; Kaufmann, K.; Ham, van R.C.H.J.; Angenent, G.C.; Krajewski, P.
2011-01-01
Background In vivo detection of protein-bound genomic regions can be achieved by combining chromatin-immunoprecipitation with next-generation sequencing technology (ChIP-seq). The large amount of sequence data produced by this method needs to be analyzed in a statistically proper and computationally
Pedestrian flow statistical system based on infrared detection method
吴小林
2013-01-01
To count pedestrian flow in real time in open teaching venues such as classrooms, laboratories and libraries, and so automate teaching management, a system is designed that counts people entering and leaving by detecting interruptions of infrared beams. When someone passes and blocks an infrared beam, the output signal is fed to a programmable logic device; by judging the order of the two infrared transceiver input signals, the main program increments or decrements a counter for the two channels. Connecting the decoding, display and alarm modules realizes the reversible (bidirectional) flow-counting function. Experiments show that the system has high measurement precision, strong anti-interference ability and low power consumption. The design's innovation is the combination of infrared detection with programmable logic devices, which reduces cost and improves real-time reliability.
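The direction-discrimination logic described above (judging the order of the two infrared transceiver signals) can be sketched in software, although the actual system realizes it in a programmable logic device. The beam labels 'A' (outer) and 'B' (inner) are hypothetical:

```python
def count_flow(events):
    """Compute net pedestrian flow from two infrared beams.
    `events` is the sequence of beam-break events ('A' or 'B').
    A person entering breaks beam A first, then B; a person leaving
    breaks B first, then A.  Unpaired breaks restart the matching."""
    count = 0
    pending = None
    for beam in events:
        if pending is None:
            pending = beam
        elif pending == 'A' and beam == 'B':
            count += 1          # entry: A -> B
            pending = None
        elif pending == 'B' and beam == 'A':
            count -= 1          # exit: B -> A
            pending = None
        else:
            pending = beam      # repeated beam: discard and restart
    return count
```

The running `count` is what the decoding and display modules would present, with the alarm module triggered on a configured occupancy limit.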
Tombesi, F.; Cappi, M.; Reeves, J. N.; Palumbo, G. G. C.; Yaqoob, T.; Braito, V.; Dadina, M.
2010-10-01
Context. Blue-shifted Fe K absorption lines have been detected in recent years between 7 and 10 keV in the X-ray spectra of several radio-quiet AGNs. The derived blue-shifted velocities of the lines can often reach mildly relativistic values, up to 0.2-0.4c. These findings are important because they suggest the presence of a previously unknown massive and highly ionized absorbing material outflowing from their nuclei, possibly connected with accretion disk winds/outflows. Aims: The scope of the present work is to statistically quantify the parameters and incidence of the blue-shifted Fe K absorption lines through a uniform analysis of a large sample of radio-quiet AGNs. This allows us to assess their global detection significance and to overcome any possible publication bias. Methods: We performed a blind search for narrow absorption features at energies greater than 6.4 keV in a sample of 42 radio-quiet AGNs observed with XMM-Newton. A simple uniform model composed of an absorbed power-law plus Gaussian emission and absorption lines provided a good fit for all the data sets. We derived the absorption line parameters and calculated their detailed detection significance making use of the classical F-test and extensive Monte Carlo simulations. Results: We detect 36 narrow absorption lines in a total of 101 XMM-Newton EPIC pn observations. The number of absorption lines at rest-frame energies higher than 7 keV is 22. Their global probability of being generated by random fluctuations is very low, less than 3 × 10^-8, and their detection has been independently confirmed by a spectral analysis of the MOS data. If we define as ultra-fast outflows (UFOs) those highly ionized absorbers with outflow velocities higher than 10^4 km s^-1, then the majority of the lines are consistent with being associated with UFOs, and the fraction of objects with detected UFOs in the whole sample is at least ~35%. This fraction is similar for type 1 and type 2 sources. The global covering fraction of
Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...
Yoon, Hyun Jung; Chung, Myung Jin; Hwang, Hye Sun; Moon, Jung Won; Lee, Kyung Soo
2015-01-01
To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, …). Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and has better performance than FBP-driven ULDCT.
Kittiwisit, Piyanat; Jacobs, Daniel C; Thyagarajan, Nithyanandan; Beardsley, Adam P
2016-01-01
We study the impact of instrumental systematics on the variance, skewness, and kurtosis of redshifted 21 cm intensity fluctuation observations from the Epoch of Reionization. We simulate realistic 21 cm observations based on the Murchison Widefield Array (MWA) Phase I reionization experiment, using the array's point spread function (PSF) and antenna beam patterns, full-sky 21 cm models, and the FHD imaging pipeline. We measure the observed redshift evolution of pixel probability density functions (PDF) and one-point statistics from the simulated maps, comparing them to the measurements derived from simpler simulations that represent the instrument PSFs with Gaussian kernels. We find that both methods yield statistics with similar trends with greater than 80% correlation. We perform additional simulations based on the Hydrogen Epoch of Reionization Array (HERA), using Gaussian kernels as the instrument PSFs, and study the effect of frequency binning on the statistics. We find that PSF smoothing and sampling va...
Raj, Sunny [Univ. of Central Florida, Orlando, FL (United States); Jha, Sumit Kumar [Univ. of Central Florida, Orlando, FL (United States); Pullum, Laura L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ramanathan, Arvind [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2017-05-01
Validating the correctness of human detection vision systems is crucial for safety applications such as pedestrian collision avoidance in autonomous vehicles. The enormous space of possible inputs to such an intelligent system makes it difficult to design test cases for such systems. In this report, we present our tool MAYA that uses an error model derived from a convolutional neural network (CNN) to explore the space of images similar to a given input image, and then tests the correctness of a given human or object detection system on such perturbed images. We demonstrate the capability of our tool on the pre-trained Histogram-of-Oriented-Gradients (HOG) human detection algorithm implemented in the popular OpenCV toolset and the Caffe object detection system pre-trained on the ImageNet benchmark. Our tool may serve as a testing resource for the designers of intelligent human and object detection systems.
Jia, Zhenyu; Wang, Yipeng; Hu, Yuanjie; McLaren, Christine; Yu, Yingyan; Ye, Kai; Xia, Xiao-Qin; Koziol, James A.; Lernhardt, Waldemar; McClelland, Michael; Mercola, Dan
2013-01-01
In case-control profiling studies, increasing the sample size does not always improve statistical power because the variance may also be increased if samples are highly heterogeneous. For instance, tumor samples used for gene expression assay are often heterogeneous in terms of tissue composition or …
Michalczyk, Leszek
2013-01-01
This article is one in a series of two publications concerning companies' detection of accounting engineering operations in use. Its conclusions and methods may be applied to external auditing procedures. The aim of the present duo-article is to define a method of statistical analysis that could identify procedures falling within the scope of a framework herein defined as accounting engineering. This model for analysis is meant to be employed in these aspects of initial financial and accounti...
Scholl, Joep H G; van Puijenbroek, Eugène P
2016-12-01
In pharmacovigilance, the commonly used disproportionality analysis (DPA) in statistical signal detection is known to have its limitations. The aim of this study was to investigate the value of the time to onset (TTO) of adverse drug reactions (ADRs) in addition to DPA. We performed a pilot study using individual case safety reports (ICSRs) for three drugs (Cervarix®, nitrofurantoin and simvastatin) from the Lareb spontaneous reporting database. TTO distributions for drug-ADR associations were compared to other ADRs for the same drug and to other drugs for the same ADR using two-sample Anderson-Darling testing. Statistically significant associations were considered true positive (TP) signals if the association was present in the official product information of the drug. Sensitivity and specificity for the TTO method were compared with the DPA method. As a measure of disproportionality, the reporting odds ratio (ROR) was used. In general, sensitivity was lower and specificity was higher for the TTO method compared to DPA. The TTO method showed similar sensitivity for all three drugs, whereas specificity was lower for Cervarix®. Eight additional TP signals were found using the TTO method compared to DPA. Our study shows that statistical signal detection based on the TTO alone resulted in a limited number of additional signals compared to DPA. We therefore conclude that the TTO method is of limited value for full database statistical screening in our setting. Copyright © 2016 John Wiley & Sons, Ltd.
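The two-sample Anderson-Darling comparison of time-to-onset distributions used in this study can be sketched as follows; the function name and data layout are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy import stats

def compare_tto(tto_association, tto_reference):
    """Compare the time-to-onset distribution of one drug-ADR association
    against a reference set (other ADRs of the same drug, or other drugs
    with the same ADR) using the two-sample Anderson-Darling test."""
    res = stats.anderson_ksamp([np.asarray(tto_association),
                                np.asarray(tto_reference)])
    # scipy clips significance_level to the range [0.001, 0.25]
    return res.statistic, res.significance_level
```

A large statistic (small significance level) flags an association whose onset times cluster differently from the reference, the behaviour the TTO method exploits.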
Gladysz, Szymon; Yaitskova, Natalia; Christou, Julian C
2010-11-01
This paper is an introduction to the problem of modeling the probability density function of adaptive-optics speckle. We show that with the modified Rician distribution one cannot describe the statistics of light on axis. A dual solution is proposed: the modified Rician distribution for off-axis speckle and gamma-based distribution for the core of the point spread function. From these two distributions we derive optimal statistical discriminators between real sources and quasi-static speckles. In the second part of the paper the morphological difference between the two probability density functions is used to constrain a one-dimensional, "blind," iterative deconvolution at the position of an exoplanet. Separation of the probability density functions of signal and speckle yields accurate differential photometry in our simulations of the SPHERE planet finder instrument.
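The modified Rician density for off-axis speckle intensity has a closed form; a numerically stable sketch follows (the parameter names are ours, and the gamma-based core model would be handled analogously).

```python
import numpy as np
from scipy.special import i0e
from scipy.integrate import trapezoid

def modified_rician_pdf(i, i_c, i_s):
    """Modified Rician PDF of speckle intensity: i_c is the mean speckle
    (halo) intensity, i_s the deterministic (coherent PSF) intensity.
    Uses i0e(z) = exp(-z) * I0(z) so the Bessel factor cannot overflow."""
    i = np.asarray(i, dtype=float)
    z = 2.0 * np.sqrt(i * i_s) / i_c
    return (1.0 / i_c) * np.exp(-(i + i_s) / i_c + z) * i0e(z)
```

Its mean is i_s + i_c. A statistical discriminator of the kind the paper derives can then be built as a likelihood ratio between this density (off-axis speckle) and a gamma-based density fitted to the PSF core.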
Wang, X.; Heimann, T.; Lo, P.
2012-01-01
The segmentation of tree-like tubular structures such as coronary arteries and airways is an essential step for many 3D medical imaging applications. Statistical tracking techniques for the extraction of elongated structures have received considerable attention in recent years due to their robustness against image noise and pathological changes. However, most tracking methods are limited to a specific application and do not support branching structures efficiently. In this work, we present a novel statistical tracking approach for the extraction of different types of tubular structures … sets and distal airways from 20 CT scans. These data sets were provided by the organizers of the workshop '3D Segmentation in the Clinic: A Grand Challenge II-Coronary Artery Tracking (CAT08)' and 'Extraction of Airways from CT 2009 (EXACT'09)'. On average, 81.5% overlap and 0.51 mm accuracy …
Dolan, T E; Lynch, P D; Karazsia, J L; Serafy, J E
2016-03-01
An expansion is underway of a nuclear power plant on the shoreline of Biscayne Bay, Florida, USA. While the precise effects of its construction and operation are unknown, impacts on surrounding marine habitats and biota are considered by experts to be likely. The objective of the present study was to determine the adequacy of an ongoing monitoring survey of fish communities associated with mangrove habitats directly adjacent to the power plant to detect fish community changes, should they occur, at three spatial scales. Using seasonally resolved data recorded during 532 fish surveys over an 8-year period, power analyses were performed for four mangrove fish metrics (fish diversity, fish density, and the occurrence of two ecologically important fish species: gray snapper (Lutjanus griseus) and goldspotted killifish (Floridichthys carpio)). Results indicated that the monitoring program at current sampling intensity allows for detection of … monitoring programs for improved, focused change detection deserves consideration from both ecological and cost-benefit perspectives.
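A minimal Monte Carlo version of the power analyses mentioned above can be sketched as follows; a two-sample t-test stands in for the study's seasonally structured tests, and all names and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def detection_power(baseline_mean, decline_frac, sd, n_per_group,
                    alpha=0.05, n_sims=2000, seed=0):
    """Monte Carlo power: probability that a two-sample t-test detects a
    given fractional decline in a fish-community metric at a fixed
    sampling intensity (n_per_group surveys before and after)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        before = rng.normal(baseline_mean, sd, n_per_group)
        after = rng.normal(baseline_mean * (1 - decline_frac), sd,
                           n_per_group)
        if stats.ttest_ind(before, after).pvalue < alpha:
            hits += 1
    return hits / n_sims
```

Running this over a grid of decline fractions and sample sizes reproduces the kind of trade-off curve used to judge whether a monitoring design can detect changes of a given magnitude.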
Malicious Code Detection Model Based on Traffic Statistical Fingerprinting
苗甫; 王振兴; 张连成
2011-01-01
In order to detect malicious codes which utilize encryption technology and tunnel encapsulation, a new malicious code detection model based on traffic statistical fingerprinting is presented. Packet-level features and flow-level features are extracted from each flow in a training set. The flow-level features are reduced by principal component analysis. The detection model is constructed after the malicious code's traffic statistical fingerprint is obtained from these features' probability density functions, and the fingerprint is then used to detect malicious-code communication traffic in the network. Experimental results indicate that this model can effectively detect encrypted or tunneled malicious codes.
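The fingerprinting idea, PCA reduction of flow-level features followed by a probability-density fingerprint, can be sketched as follows. This is a simplified stand-in: a single multivariate KDE replaces the paper's per-feature densities, and all names are ours.

```python
import numpy as np
from scipy import stats

def pca_reduce(features, k):
    """Project flow feature vectors onto their top-k principal
    components (dimensionality reduction of the flow-level features)."""
    centered = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T

def fit_fingerprint(reduced_features):
    """Traffic 'fingerprint': a kernel density estimate over the reduced
    feature space of known malicious flows. New flows scoring high under
    this density resemble the training traffic."""
    return stats.gaussian_kde(reduced_features.T)
```

Scoring unknown flows against the fingerprint then separates traffic that statistically resembles the malicious training flows from benign traffic.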
Jing, Yu; Wang, Yaxuan; Liu, Jianxin; Liu, Zhaoxia
2015-08-01
Edge detection is a crucial method for the location and quantity estimation of oil slicks when oil spills on the sea. In this paper, we present a robust active contour edge detection algorithm for oil spill remote sensing images. In the proposed algorithm, we define a local Gaussian data fitting energy term with spatially varying means and variances, and this data fitting energy term is introduced into a global minimization active contour (GMAC) framework. The energy function minimization is achieved quickly by a dual formulation of the weighted total variation norm. The proposed algorithm avoids the existence of local minima, does not require the definition of an initial contour, and is robust to weak boundaries, high noise, and severe intensity inhomogeneity existing in oil slick remote sensing images. Furthermore, the edge detection of the oil slick and the correction of intensity inhomogeneity are achieved simultaneously via the proposed algorithm. The experimental results show a superior performance of the proposed algorithm over state-of-the-art edge detection algorithms. In addition, the proposed algorithm can also deal with special images in which the object and background have the same intensity means but different variances.
Fidalgo, Angel M.; Bartram, Dave
2010-01-01
The main objective of this study was to establish the relative efficacy of the generalized Mantel-Haenszel test (GMH) and the Mantel test for detecting large numbers of differential item functioning (DIF) patterns. To this end this study considered a topic not dealt with in the literature to date: the possible differential effect of type of scores…
Luyan Zhang; Huihui Li; Jiankang Wang
2012-01-01
Epistasis is a commonly observed genetic phenomenon and an important source of variation of complex traits, which could maintain additive variance and therefore assure the long-term genetic gain in breeding. Inclusive composite interval mapping (ICIM) is able to identify epistatic quantitative trait loci (QTLs) no matter whether the two interacting QTLs have any additive effects. In this article, we conducted a simulation study to evaluate the detection power and false discovery rate (FDR) of ICIM epistatic mapping, by considering F2 and doubled haploid (DH) populations, different F2 segregation ratios and population sizes. Results indicated that estimations of QTL locations and effects were unbiased, and that the detection power of epistatic mapping was largely affected by population size, heritability of epistasis, and the amount and distribution of genetic effects. When the same logarithm of odds (LOD) threshold was used, detection power of QTL was higher in the F2 population than in the DH population; meanwhile, FDR in F2 was also higher than that in DH. The increase of marker density from 10 cM to 5 cM led to similar detection power but higher FDR. In simulated populations, ICIM achieved better mapping results than multiple interval mapping (MIM) in estimation of QTL positions and effects. At the end, we give epistatic mapping results of ICIM in one actual population in rice (Oryza sativa L.).
Parmar, Rudrangi; Ghanta, Ajay; Haware, Rahul V; Johnson, Paul R; Stagner, William C
2016-12-01
A sucrose octaacetate (SOA) gradient HPLC evaporative light scattering detection (ELSD) and low-wavelength UV-diode array detection (UV-DAD)-specific stability-indicating method development and validation comparison is reported. A central composite response surface design and multicriteria optimization was used to maximize molten SOA area-under-the-curve response and signal-to-noise ratio. The ELSD data were also analyzed using multivariate principal component analysis, analysis of variance, and standard least squares effects modeling. The method suitability and validation parameters of both methods were compared. To the authors' knowledge, this is the first report that validates an ELSD method using a molten analyte. SOA exhibited a low molar absorptivity of 439 absorption units/cm/M in water at 210 nm requiring low-wavelength UV-DAD detection. The low-wavelength UV-DAD method provided substantially better intraday and interday precision, intraday and interday goodness-of-fit, detection limit, and quantitation limit than ELSD. ELSD exhibited a 60-fold greater area-under-the-curve response, better resolution, and 58% more theoretical plates. On balance, the UV-DAD method was chosen for SOA chemical kinetic studies. This study illustrates that ELSD may not always be the best alternative to gradient HPLC low-wavelength UV detection. Copyright © 2016. Published by Elsevier Inc.
Le Bot, O., E-mail: lebotol@gmail.com [Univ. Grenoble Alpes, GIPSA-Lab, F-38000 Grenoble (France); CNRS, GIPSA-Lab, F-38000 Grenoble (France); Mars, J.I. [Univ. Grenoble Alpes, GIPSA-Lab, F-38000 Grenoble (France); CNRS, GIPSA-Lab, F-38000 Grenoble (France); Gervaise, C. [Univ. Grenoble Alpes, GIPSA-Lab, F-38000 Grenoble (France); CNRS, GIPSA-Lab, F-38000 Grenoble (France); Chaire CHORUS, Foundation of Grenoble Institute of Technology, 46 Avenue Félix Viallet, 38031 Grenoble Cedex 1 (France)
2015-10-23
This Letter proposes an algorithm to detect an unknown deterministic signal hidden in additive white Gaussian noise. The detector is based on recurrence analysis. It compares the distribution of the similarity matrix coefficients of the measured signal with an analytic expression of the distribution expected in the noise-only case. This comparison is achieved using divergence measures. Performance analysis based on the receiver operating characteristics shows that the proposed detector outperforms the energy detector, giving a probability of detection 10% to 50% higher, and has a similar performance to that of a sub-optimal filter detector. - Highlights: • We model the distribution of the similarity matrix coefficients of a Gaussian noise. • We use divergence measures for goodness-of-fit test between a model and measured data. • We distinguish deterministic signal and Gaussian noise with similarity matrix analysis. • Similarity matrix analysis outperforms energy detector.
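The noise-only reference distribution used by such a detector is analytic: for white Gaussian noise the pairwise differences x_i - x_j follow N(0, 2σ²). A simplified stand-in for the Letter's detector follows, with a Kolmogorov-Smirnov distance replacing its divergence measures; the function name and this substitution are our assumptions.

```python
import numpy as np
from scipy import stats

def pairwise_difference_stat(x, sigma):
    """Recurrence-style detection statistic: distance between the
    empirical distribution of all pairwise differences x_i - x_j and the
    analytic noise-only law N(0, 2*sigma^2). A large distance flags a
    hidden deterministic signal."""
    d = (x[:, None] - x[None, :])[np.triu_indices(len(x), k=1)]
    return stats.kstest(d, 'norm', args=(0.0, np.sqrt(2.0) * sigma)).statistic
```

Thresholding this statistic yields a detector in the same spirit as the Letter's goodness-of-fit test between the measured similarity-matrix coefficients and the noise-only model.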
Nicolas Hengartner
2006-12-01
The performance of weak gaseous plume-detection methods in hyperspectral long-wave infrared imagery depends on scene-specific conditions such as the ability to properly estimate atmospheric transmission, the accuracy of estimated chemical signatures, and background clutter. This paper reviews commonly applied physical models in the context of weak plume identification and quantification, identifies inherent error sources as well as those introduced by making simplifying assumptions, and indicates research areas.
Adams, Michael C; Barbano, David M
2015-06-01
Our objective was to develop a statistical approach that could be used to determine whether a handler's fat, protein, or other solids mid-infrared (MIR) spectrophotometer test values were different, on average, from a milk regulatory laboratory's MIR test values when split-sampling test values are not available. To accomplish this objective, the Proc GLM procedure of SAS (SAS Institute Inc., Cary, NC) was used to develop a multiple linear regression model to evaluate 4 mo of MIR producer payment testing data (112 to 167 producers per month) from 2 different MIR instruments. For each of the 4 mo and each of the 2 components (fat or protein), the GLM model was Response=Instrument+Producer+Date+2-Way Interactions+3-Way Interaction. Instrument was significant in determining fat and protein tests for 3 of the 4 mo, and Producer was significant in determining fat and protein tests for all 4 mo. This model was also used to establish fat and protein least significant differences (LSD) between instruments. Fat LSD between instruments ranged from 0.0108 to 0.0144% (α=0.05) for the 4 mo studied, whereas protein LSD between instruments ranged from 0.0046 to 0.0085% (α=0.05). In addition, regression analysis was used to determine the effects of component concentration and date of sampling on fat and protein differences between 2 MIR instruments. This statistical approach could be performed monthly to document a regulatory laboratory's verification that a given handler's instrument has obtained a different test result, on average, from that of the regulatory laboratory's and that an adjustment to producer payment may be required.
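The least significant difference between two instrument means follows directly from the error mean square of such a GLM; a sketch, with names and example values of our choosing:

```python
import numpy as np
from scipy import stats

def least_significant_difference(mse, n_per_group, df_error, alpha=0.05):
    """Smallest average difference between two instruments declared
    significant at level alpha, given the GLM error mean square (mse),
    the number of producer samples per instrument, and the error
    degrees of freedom."""
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df_error)
    return t_crit * np.sqrt(2.0 * mse / n_per_group)
```

An observed fat or protein difference between a handler's instrument and the regulatory laboratory's instrument exceeding this LSD would indicate, on average, a real instrument difference.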
Philippe Andrey
In eukaryotes, the interphase nucleus is organized in morphologically and/or functionally distinct nuclear "compartments". Numerous studies highlight functional relationships between the spatial organization of the nucleus and gene regulation. This raises the question of whether nuclear organization principles exist and, if so, whether they are identical in the animal and plant kingdoms. We addressed this issue through the investigation of the three-dimensional distribution of centromeres and chromocenters. We investigated five very diverse populations of interphase nuclei at different differentiation stages in their physiological environment, belonging to rabbit embryos at the 8-cell and blastocyst stages, differentiated rabbit mammary epithelial cells during lactation, and differentiated cells of Arabidopsis thaliana plantlets. We developed new tools based on the processing of confocal images and a new statistical approach based on the G- and F-distance functions used in spatial statistics. Our original computational scheme takes into account both size and shape variability by comparing, for each nucleus, the observed distribution against a reference distribution estimated by Monte Carlo sampling over the same nucleus. This implicit normalization allowed similar data processing and extraction of rules in the five differentiated nuclei populations of the three studied biological systems, despite differences in chromosome number, genome organization and heterochromatin content. We showed that centromeres/chromocenters form significantly more regularly spaced patterns than expected under a completely random situation, suggesting that repulsive constraints or spatial inhomogeneities underlie the spatial organization of heterochromatic compartments. The proposed technique should be useful for identifying further spatial features in a wide range of cell types.
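The G-function comparison against a completely random reference can be sketched as follows. Note the simplification: the paper samples the reference distribution inside each observed nucleus, whereas this sketch uses a unit square, and all names are ours.

```python
import numpy as np

def g_function(points, radii):
    """Empirical G-function: cumulative distribution of
    nearest-neighbour distances among the observed points."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)
    return np.array([(nn <= r).mean() for r in radii])

def completely_random_envelope(n, radii, n_sims=99, seed=0):
    """Monte Carlo envelope of G under complete spatial randomness in
    the unit square: min/max over simulated point patterns of size n."""
    rng = np.random.default_rng(seed)
    sims = np.array([g_function(rng.random((n, 2)), radii)
                     for _ in range(n_sims)])
    return sims.min(axis=0), sims.max(axis=0)
```

An observed G curve falling below the lower envelope at small radii, as for a regular grid, indicates a pattern more regularly spaced than random, which is the signature reported for centromeres/chromocenters.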
Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong
2013-01-01
For genome-wide association data analysis, two genes in a pathway, or two SNPs in two linked gene regions or in two linked exons within one gene, are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to effects arising not only from the traditional interaction under nearly independent conditions but also from the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than the single SNP-based logistic model, the PCA-based logistic model, and other gene-based methods.
On Improving the Level of Social Statistical Detection Work
刘夕民
2011-01-01
With the development of the economy, statistical work is becoming increasingly important in enterprises, serving as a key tool for scientific management and for supervising and planning the various business activities of an enterprise. This paper briefly analyzes methods of improving the level of social statistical detection.
Adam James Carroll
2015-07-01
This article describes PhenoMeter, a new type of metabolomics database search that accepts metabolite response patterns as queries and searches the MetaPhen database of reference patterns for responses that are statistically significantly similar or inverse, for the purposes of detecting functional links. To identify a similarity measure that would detect functional links as reliably as possible, we compared the performance of four statistics in correctly top-matching metabolic phenotypes of Arabidopsis thaliana metabolism mutants affected in different steps of the photorespiration metabolic pathway to reference phenotypes of mutants affected in the same enzymes by independent mutations. The best-performing statistic, the PhenoMeter Score (PM Score), was a function of both Pearson correlation and Fisher's Exact Test of directional overlap. This statistic outperformed Pearson correlation, biweight midcorrelation and Fisher's Exact Test used alone. To demonstrate general applicability, we show that PhenoMeter reliably retrieved the most closely functionally linked response in the database when queried with responses to a wide variety of environmental and genetic perturbations. Attempts to match metabolic phenotypes between independent studies were met with varying success, and possible reasons for this are discussed. Overall, our results suggest that integration of pattern-based search tools into metabolomics databases will aid functional annotation of newly recorded metabolic phenotypes analogously to the way sequence similarity search algorithms have aided the functional annotation of genes and proteins. PhenoMeter is freely available at MetabolomeExpress (https://www.metabolome-express.org/phenometer.php).
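The abstract states that the PM Score is a function of Pearson correlation and Fisher's Exact Test of directional overlap but does not give the exact formula; the following is one plausible combination, and both the functional form and the names are our assumptions.

```python
import numpy as np
from scipy import stats

def pm_score_sketch(query, ref):
    """Hypothetical PM-Score-style statistic: signed Pearson correlation
    weighted by the Fisher-exact evidence against chance agreement of
    response directions (2x2 table of up/down calls)."""
    query, ref = np.asarray(query), np.asarray(ref)
    r, _ = stats.pearsonr(query, ref)
    up_q, up_r = query > 0, ref > 0
    table = [[np.sum(up_q & up_r), np.sum(up_q & ~up_r)],
             [np.sum(~up_q & up_r), np.sum(~up_q & ~up_r)]]
    _, p = stats.fisher_exact(table)
    return r * -np.log10(max(p, 1e-300))
```

Because the correlation carries the sign, this combination scores strongly similar responses highly positive and strongly inverse responses highly negative, matching the "similar or inverse" matching behaviour described above.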
Gómez González, A.; Fassois, S. D.
2016-03-01
The problem of vibration-based damage detection under varying environmental conditions and uncertainty is considered, and a novel, supervised, PCA-type statistical methodology is postulated. The methodology employs vibration data records from the healthy and damaged states of a structure under various environmental conditions. Unlike standard PCA-type methods in which a feature vector corresponding to the least important eigenvalues is formed in a single step, the postulated methodology uses supervised learning in which damaged-state data records are employed to sequentially form a feature vector by appending a transformed scalar element at a time under the condition that it optimally, among all remaining elements, improves damage detectability. This leads to the formulation of feature vectors with optimized sensitivity to damage, and thus high damage detectability. Within this methodology three particular methods, two non-parametric and one parametric, are formulated. These are validated and comparatively assessed via a laboratory case study focusing on damage detection on a scale wind turbine blade under varying temperature and the potential presence of sprayed water. Damage detection performance is shown to be excellent based on a single vibration response sensor and a limited frequency bandwidth.
Lotterhos, Katie E; Whitlock, Michael C
2015-03-01
Although genome scans have become a popular approach towards understanding the genetic basis of local adaptation, the field still does not have a firm grasp on how sampling design and demographic history affect the performance of genome scans on complex landscapes. To explore these issues, we compared 20 different sampling designs in equilibrium (i.e. island model and isolation by distance) and nonequilibrium (i.e. range expansion from one or two refugia) demographic histories in spatially heterogeneous environments. We simulated spatially complex landscapes, which allowed us to exploit local maxima and minima in the environment in 'pair' and 'transect' sampling strategies. We compared F(ST) outlier and genetic-environment association (GEA) methods for each of two approaches that control for population structure: with a covariance matrix or with latent factors. We show that while the relative power of two methods in the same category (F(ST) or GEA) depended largely on the number of individuals sampled, overall GEA tests had higher power in the island model and F(ST) had higher power under isolation by distance. In the refugia models, however, these methods varied in their power to detect local adaptation at weakly selected loci. At weakly selected loci, paired sampling designs had equal or higher power than transect or random designs to detect local adaptation. Our results can inform sampling designs for studies of local adaptation and have important implications for the interpretation of genome scans based on landscape data. © 2015 John Wiley & Sons Ltd.
Meldrum Cliff J
2003-12-01
Denaturing high performance liquid chromatography (DHPLC) is a relatively new method by which heteroduplex structures formed during the PCR amplification of heterozygote samples can be rapidly identified. The use of this technology for mutation detection in hereditary non-polyposis colorectal cancer (HNPCC) has the potential to appreciably shorten the time it takes to analyze genes associated with this disorder. Prior to acceptance of this method for screening genes associated with HNPCC, assessment of the reliability of this method should be performed. In this report we have compared mutation and polymorphism detection by denaturing gradient gel electrophoresis (DGGE) with DHPLC in a set of 130 families. All mutations/polymorphisms representing base substitutions, deletions, insertions and a 23 base pair inversion were detected by DHPLC, whereas DGGE failed to identify four single base substitutions and a single base pair deletion. In addition, we show that DHPLC has been used for the identification of 5 different mutations in exon 7 of hMSH2 that could not be detected by DGGE. From this study we conclude that DHPLC is a more effective and rapid alternative for the detection of mutations in hMSH2 and hMLH1, with the same or better accuracy than DGGE. Furthermore, this technique offers opportunities for automation which have not been realised for the majority of other methods of gene analysis.
No Previous Public Services Required
Taylor, Kelley R.
2009-01-01
In 2007, the Supreme Court heard a case that involved the question of whether a school district could be required to reimburse parents who unilaterally placed their child in private school when the child had not previously received special education and related services in a public institution ("Board of Education v. Tom F."). The…
Ang Kean Hua
2017-01-01
Malacca River water quality is affected by rapid urbanization development. The present study applied LULC changes to water quality detection in the Malacca River. The method uses LULC, PCA, CCA, HCA, NHCA, and ANOVA. PCA confirmed DS, EC, salinity, turbidity, TSS, DO, BOD, COD, As, Hg, Zn, Fe, E. coli, and total coliform. CCA confirmed 14 variables in two variates; the first variate involves residential and industrial activities, and the second variate involves agriculture, sewage treatment plants, and animal husbandry. HCA and NHCA emphasize that cluster 1 occurs in the urban area with Hg, Fe, total coliform, and DO pollution; cluster 3 occurs in the suburban area with salinity, EC, and DS; and cluster 2 occurs in the rural area with salinity and EC. ANOVA between LULC and water quality data indicates that the built-up area significantly polluted the water quality through E. coli, total coliform, EC, BOD, COD, TSS, Hg, Zn, and Fe, while agricultural activities cause EC, TSS, salinity, E. coli, total coliform, arsenic, and iron pollution; and open space causes contamination of turbidity, salinity, EC, and TSS. The research findings provide useful information for identifying pollution sources and understanding LULC in relation to river water quality, as a reference to policy makers for proper management of land use.
Chiara eMastropasqua
2014-08-01
Full Text Available We combined continuous theta burst stimulation (cTBS and resting state (RS -fMRI approaches to investigate changes in functional connectivity (FC induced by right dorso-lateral prefrontal cortex (DLPFC cTBS at rest in a group of healthy subjects. Seed based fMRI analysis revealed a specific pattern of correlation between the right prefrontal cortex and several brain regions: based on these results, we defined a 29-node network to assess changes in each network connection before and after, respectively, DLPFC-cTBS and sham sessions. A decrease of correlation between the right prefrontal cortex and right parietal cortex (Brodmann areas 46 and 40 respectively was detected after cTBS, while no significant result was found when analyzing sham-session data. To our knowledge, this is the first study that demonstrates within-subject changes in FC induced by cTBS applied on prefrontal area. The possibility to induce selective changes in a specific region without interfering with functionally correlated area could have several implications for the study of functional properties of the brain, and for the emerging therapeutic strategies based on transcranial stimulation.
Arismendi, I.; Johnson, S. L.; Dunham, J. B.
2015-03-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
Morteza Behnam
2015-08-01
Full Text Available Seizure detection using brain signal (EEG analysis is the important clinical methods in drug therapy and the decisions before brain surgery. In this paper, after signal conditioning using suitable filtering, the Gamma frequency band has been extracted and the other brain rhythms, ambient noises and the other bio-signal are canceled. Then, the wavelet transform of brain signal and the map of wavelet transform in multi levels are computed. By dividing the color map to different epochs, the histogram of each sub-image is obtained and the statistics of it based on statistical momentums and Negentropy values are calculated. Statistical feature vector using Principle Component Analysis (PCA is reduced to one dimension. By EMD algorithm and sifting procedure for analyzing the data by Intrinsic Mode Function (IMF and computing the residues of brain signal using spectrum of Hilbert transform and Hilbert – Huang spectrum forming, one spatial feature based on the Euclidian distance for signal classification is obtained. By K-Nearest Neighbor (KNN classifier and by considering the optimal neighbor parameter, EEG signals are classified in two classes, seizure and non-seizure signal, with the rate of accuracy 76.54% and with variance of error 0.3685 in the different tests.
邱志宏
2013-01-01
In order to improve the detection performance of steganography detection algorithm,in this paper we put forward a steganography detection algorithm which is based on histogram statistical classification.Through extracting the histogram characteristic parameters of the image information and using the classification mode which is constructed based on artificial neural networks,this algorithm achieves accurate judgement on images embedded with steganography information.We analyse in detail the design,principle and process of this steganography algorithm,and at last construct the experimental test environment.Test results indicate that the detection success rate and the false alarm rate of the steganography detection algorithm designed in the paper are better than those of Ezstego detection tool.%为了提高隐写检测算法的检测性能,提出基于直方图统计分类的隐写检测算法.通过对图片信息的直方图特征参数的提取,使用构建基于人工神经网络的分类模式,实现对嵌入隐写信息的图片准确判定.详细分析基于直方图统计分类隐写算法的设计原理和过程,最后构建实验测试环境.测试结果表明,该隐写检测算法的检测成功率和误检率均优于Ezstego检测工具.
I. Arismendi
2014-05-01
Full Text Available Central tendency statistics may not capture relevant or desired characteristics about the variability of continuous phenomena and thus, they may not completely track temporal patterns of change. Here, we present two methodological approaches to identify long-term changes in environmental regimes. First, we use higher statistical moments (skewness and kurtosis to examine potential changes of empirical distributions at decadal scale. Second, we adapt an outlier detection procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability, patterns in variability through time, and spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal a differentiated vulnerability to both the climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
van Ham Roeland CHJ
2011-05-01
Full Text Available Abstract Background In vivo detection of protein-bound genomic regions can be achieved by combining chromatin-immunoprecipitation with next-generation sequencing technology (ChIP-seq. The large amount of sequence data produced by this method needs to be analyzed in a statistically proper and computationally efficient manner. The generation of high copy numbers of DNA fragments as an artifact of the PCR step in ChIP-seq is an important source of bias of this methodology. Results We present here an R package for the statistical analysis of ChIP-seq experiments. Taking the average size of DNA fragments subjected to sequencing into account, the software calculates single-nucleotide read-enrichment values. After normalization, sample and control are compared using a test based on the ratio test or the Poisson distribution. Test statistic thresholds to control the false discovery rate are obtained through random permutations. Computational efficiency is achieved by implementing the most time-consuming functions in C++ and integrating these in the R package. An analysis of simulated and experimental ChIP-seq data is presented to demonstrate the robustness of our method against PCR-artefacts and its adequate control of the error rate. Conclusions The software ChIP-seq Analysis in R (CSAR enables fast and accurate detection of protein-bound genomic regions through the analysis of ChIP-seq experiments. Compared to existing methods, we found that our package shows greater robustness against PCR-artefacts and better control of the error rate.
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
Detection of Step-Structure Edge Based on Order Statistic Filter%基于次序统计滤波器的阶跃边缘检测
马洪; 余勇; 马黎; 梅田三千雄
2001-01-01
在二值图像的边缘检测理论中，经典的方法是用固定的卷积核构成边缘检测算子，如Sobel算子，Prewitt算子，Kirsch算子和Roberts算子等。它们对数字图像边缘的复杂几何结构缺乏算法自适应性。现作者将随机滤波的思想引入边缘检测，用次序统计滤波器构造随机卷积核，从而引入一种新的随机边缘检测算子，即OSF边缘算子，并分别对雷达图像和文本图像实施边缘检测。实验结果证明了OSF边缘算子的有效性。%In the theory of edge detection of binary image, the classical method is to use regular convolution kernel to construct edge detect operator, such as Sobel operator, Prewitt operator, Kirsch operator and Roberts operator etc. However, these operators for the complex geometric structure of digital image edge lack algorithm adaptability. The authors propose stochastic filtering to edge detection. Using order statistic filter to construct stochastic convolution kernel, the authors yield a new kind of stochastic edge detect operator-OSF edge operator. The authors carry out the edge detection for radar images and document images, and the experimental results show the efficency of the OSF edge operator.
Cho, Hyun-Deok; Kim, Unyong; Suh, Joon Hyuk; Eom, Han Young; Kim, Junghyun; Lee, Seul Gi; Choi, Yong Seok; Han, Sang Beom
2016-04-01
Analytical methods using high-performance liquid chromatography with diode array and tandem mass spectrometry detection were developed for the discrimination of the rhizomes of four Atractylodes medicinal plants: A. japonica, A. macrocephala, A. chinensis, and A. lancea. A quantitative study was performed, selecting five bioactive components, including atractylenolide I, II, III, eudesma-4(14),7(11)-dien-8-one and atractylodin, on twenty-six Atractylodes samples of various origins. Sample extraction was optimized to sonication with 80% methanol for 40 min at room temperature. High-performance liquid chromatography with diode array detection was established using a C18 column with a water/acetonitrile gradient system at a flow rate of 1.0 mL/min, and the detection wavelength was set at 236 nm. Liquid chromatography with tandem mass spectrometry was applied to certify the reliability of the quantitative results. The developed methods were validated by ensuring specificity, linearity, limit of quantification, accuracy, precision, recovery, robustness, and stability. Results showed that cangzhu contained higher amounts of atractylenolide I and atractylodin than baizhu, and especially atractylodin contents showed the greatest variation between baizhu and cangzhu. Multivariate statistical analysis, such as principal component analysis and hierarchical cluster analysis, were also employed for further classification of the Atractylodes plants. The established method was suitable for quality control of the Atractylodes plants.
Connolly, Siobhan; Heron, Elizabeth A
2015-05-01
The detection of parent-of-origin effects aims to identify whether the functionality of alleles, and in turn associated phenotypic traits, depends on the parental origin of the alleles. Different parent-of-origin effects have been identified through a variety of mechanisms and a number of statistical methodologies for their detection have been proposed, in particular for genome-wide association studies (GWAS). GWAS have had limited success in explaining the heritability of many complex disorders and traits, but successful identification of parent-of-origin effects using trio (mother, father and offspring) GWAS may help shed light on this missing heritability. However, it is important to choose the most appropriate parent-of-origin test or methodology, given knowledge of the phenotype, amount of available data and the type of parent-of-origin effect(s) being considered. This review brings together the parent-of-origin detection methodologies available, comparing them in terms of power and type I error for a number of different simulated data scenarios, and finally offering guidance as to the most appropriate choice for the different scenarios.
Norén, Patrik
2013-01-01
Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges and therefore new algebraic results need often be developed. This way of interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...
Leszek Michalczyk
2013-05-01
Full Text Available This article is one in a series of two publications concerning companies’ detection of accounting engineering operations in use. Its conclusions and methods may be applied to external auditing procedures. The aim of the present duo-article is to define a method of statistical analysis that could identify procedures falling within the scope of a framework herein defined as accounting engineering. This model for analysis is meant to be employed in these aspects of initial financial and accounting audit in a business enterprise that have to do with isolating the influence of variant accounting solutions, which are a consequence of the settlement method chosen by the enterprise. Materials for statistical analysis were divided into groups according to the field in which a given company operated. In this article, we accept and elaborate on the premise that significant differences in financial results may be solely a result of either expansive policy on new markets or the acquisition of cheaper sources for operating activities. In the remaining cases, the choice of valuation and settlement methods becomes crucial; the greater the deviations, the more essential this choice becomes. Even though the research materials we analyze are regionally-conditioned, the model may find its application in other accounting systems in the country, provided that it has been appropriately implemented. Furthermore, the article defines an innovative concept of variant accounting.
Leszek Michalczyk
2013-10-01
Full Text Available This article is one in a series of two publications concerning detection of accounting engineering operations in use. Its conclusions and methods may be applied to external auditing procedures. The aim of the present duo-article is to define a method of statistical analysis that could identify procedures falling within the scope of a framework herein defined as accounting engineering. This model for analysis is meant to be employed in these aspects of initial financial and accounting audit in a business enterprise that have to do with isolating the influence of variant accounting solutions, which are a consequence of the settlement method chosen by the enterprise. Materials for statistical analysis were divided into groups according to the field in which a given company operated. In this article, we accept and elaborate on the premise that significant differences in financial results may be solely a result of either expansive policy on new markets or the acquisition of cheaper sources for operating activities. In the remaining cases, the choice of valuation and settlement methods becomes crucial; the greater the deviations, the more essential this choice becomes. Even though the research materials we analyze are regionally-conditioned, the model may find its application in other accounting systems, provided that it has been appropriately implemented. Furthermore, the article defines an innovative concept of variant accounting.
Blondeau-Patissier, David; Gower, James F. R.; Dekker, Arnold G.; Phinn, Stuart R.; Brando, Vittorio E.
2014-04-01
The need for more effective environmental monitoring of the open and coastal ocean has recently led to notable advances in satellite ocean color technology and algorithm research. Satellite ocean color sensors' data are widely used for the detection, mapping and monitoring of phytoplankton blooms because earth observation provides a synoptic view of the ocean, both spatially and temporally. Algal blooms are indicators of marine ecosystem health; thus, their monitoring is a key component of effective management of coastal and oceanic resources. Since the late 1970s, a wide variety of operational ocean color satellite sensors and algorithms have been developed. The comprehensive review presented in this article captures the details of the progress and discusses the advantages and limitations of the algorithms used with the multi-spectral ocean color sensors CZCS, SeaWiFS, MODIS and MERIS. Present challenges include overcoming the severe limitation of these algorithms in coastal waters and refining detection limits in various oceanic and coastal environments. To understand the spatio-temporal patterns of algal blooms and their triggering factors, it is essential to consider the possible effects of environmental parameters, such as water temperature, turbidity, solar radiation and bathymetry. Hence, this review will also discuss the use of statistical techniques and additional datasets derived from ecosystem models or other satellite sensors to characterize further the factors triggering or limiting the development of algal blooms in coastal and open ocean waters.
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
Roth, Amanda L; Hanson, Nancy D
2013-01-01
In the United States, the production of the Klebsiella pneumoniae carbapenemase (KPC) is an important mechanism of carbapenem resistance in Gram-negative pathogens. Infections with KPC-producing organisms are associated with increased morbidity and mortality; therefore, the rapid detection of KPC-producing pathogens is critical in patient care and infection control. We developed a real-time PCR assay complemented with traditional high-resolution melting (HRM) analysis, as well as statistically based genotyping, using the Rotor-Gene ScreenClust HRM software to both detect the presence of bla(KPC) and differentiate between KPC-2-like and KPC-3-like alleles. A total of 166 clinical isolates of Enterobacteriaceae, Pseudomonas aeruginosa, and Acinetobacter baumannii with various β-lactamase susceptibility patterns were tested in the validation of this assay; 66 of these organisms were known to produce the KPC β-lactamase. The real-time PCR assay was able to detect the presence of bla(KPC) in all 66 of these clinical isolates (100% sensitivity and specificity). HRM analysis demonstrated that 26 had KPC-2-like melting peak temperatures, while 40 had KPC-3-like melting peak temperatures. Sequencing of 21 amplified products confirmed the melting peak results, with 9 isolates carrying bla(KPC-2) and 12 isolates carrying bla(KPC-3). This PCR/HRM assay can identify KPC-producing Gram-negative pathogens in as little as 3 h after isolation of pure colonies and does not require post-PCR sample manipulation for HRM analysis, and ScreenClust analysis easily distinguishes bla(KPC-2-like) and bla(KPC-3-like) alleles. Therefore, this assay is a rapid method to identify the presence of bla(KPC) enzymes in Gram-negative pathogens that can be easily integrated into busy clinical microbiology laboratories.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
Experimental Mathematics and Computational Statistics
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These includes both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Sebacinales everywhere: previously overlooked ubiquitous fungal endophytes.
Weiss, Michael; Sýkorová, Zuzana; Garnica, Sigisfredo; Riess, Kai; Martos, Florent; Krause, Cornelia; Oberwinkler, Franz; Bauer, Robert; Redecker, Dirk
2011-02-15
Inconspicuous basidiomycetes from the order Sebacinales are known to be involved in a puzzling variety of mutualistic plant-fungal symbioses (mycorrhizae), which presumably involve transport of mineral nutrients. Recently a few members of this fungal order not fitting this definition and commonly referred to as 'endophytes' have raised considerable interest by their ability to enhance plant growth and to increase resistance of their host plants against abiotic stress factors and fungal pathogens. Using DNA-based detection and electron microscopy, we show that Sebacinales are not only extremely versatile in their mycorrhizal associations, but are also almost universally present as symptomless endophytes. They occurred in field specimens of bryophytes, pteridophytes and all families of herbaceous angiosperms we investigated, including liverworts, wheat, maize, and the non-mycorrhizal model plant Arabidopsis thaliana. They were present in all habitats we studied on four continents. We even detected these fungi in herbarium specimens originating from pioneering field trips to North Africa in the 1830s/40s. No geographical or host patterns were detected. Our data suggest that the multitude of mycorrhizal interactions in Sebacinales may have arisen from an ancestral endophytic habit by specialization. Considering their proven beneficial influence on plant growth and their ubiquity, endophytic Sebacinales may be a previously unrecognized universal hidden force in plant ecosystems.
Sebacinales everywhere: previously overlooked ubiquitous fungal endophytes.
Michael Weiss
Full Text Available Inconspicuous basidiomycetes from the order Sebacinales are known to be involved in a puzzling variety of mutualistic plant-fungal symbioses (mycorrhizae, which presumably involve transport of mineral nutrients. Recently a few members of this fungal order not fitting this definition and commonly referred to as 'endophytes' have raised considerable interest by their ability to enhance plant growth and to increase resistance of their host plants against abiotic stress factors and fungal pathogens. Using DNA-based detection and electron microscopy, we show that Sebacinales are not only extremely versatile in their mycorrhizal associations, but are also almost universally present as symptomless endophytes. They occurred in field specimens of bryophytes, pteridophytes and all families of herbaceous angiosperms we investigated, including liverworts, wheat, maize, and the non-mycorrhizal model plant Arabidopsis thaliana. They were present in all habitats we studied on four continents. We even detected these fungi in herbarium specimens originating from pioneering field trips to North Africa in the 1830s/40s. No geographical or host patterns were detected. Our data suggest that the multitude of mycorrhizal interactions in Sebacinales may have arisen from an ancestral endophytic habit by specialization. Considering their proven beneficial influence on plant growth and their ubiquity, endophytic Sebacinales may be a previously unrecognized universal hidden force in plant ecosystems.
Skatter, Sondre; Fritsch, Sebastian; Schlomka, Jens-Peter
2016-05-01
The performance limits were explored for an X-ray Diffraction based explosives detection system for baggage scanning. This XDi system offers 4D imaging that comprises three spatial dimensions with voxel sizes in the order of ~(0.5cm)3, and one spectral dimension for material discrimination. Because only a very small number of photons are observed for an individual voxel, material discrimination cannot work reliably at the voxel level. Therefore, an initial 3D reconstruction is performed, which allows the identification of objects of interest. Combining all the measured photons that scattered within an object, more reliable spectra are determined on the object-level. As a case study we looked at two liquid materials, one threat and one innocuous, with very similar spectral characteristics, but with 15% difference in electron density. Simulations showed that Poisson statistics alone reduce the material discrimination performance to undesirable levels when the photon counts drop to 250. When additional, uncontrolled variation sources are considered, the photon count plays a less dominant role in detection performance, but limits the performance also for photon counts of 500 and higher. Experimental data confirmed the presence of such non-Poisson variation sources also in the XDi prototype system, which suggests that the present system can still be improved without necessarily increasing the photon flux, but by better controlling and accounting for these variation sources. When the classification algorithm was allowed to use spectral differences in the experimental data, the discrimination between the two materials improved significantly, proving the potential of X-ray diffraction also for liquid materials.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Deborah D.M.T. Carneiro
2007-11-01
Full Text Available Mirroring the global increase of registered cases of American visceral leishmaniasis (AVL, this infection has become a growing public health problem in Brazil during the last several years. As the traditional approach to control employed by the governmental health agencies has failed to reduce the incidence and epidemic outbreaks of this illness, we propose a re-evaluation of the national strategy of intervention and monitoring. Our thinking is based on a series of spatio-temporal scan statistics of the west-central region of the state of Bahia covering the 11-year period from 1994 to 2004. By analyzing the situation, spatially and temporally, we show that the disease is a not only a growing focal threat but that it is also appearing in the form of endemic clusters in the cities. The areas where the disease has been found have been classified according to the degree of risk of infection for humans and canines. The overall objective of this study was to identify areas of increased risk of AVL, including its seasonality, and to suggest ways and means to improve the detection of the disease. The findings presented here should not only be of interest for the efforts to control AVL in the study area but also be useful for developing control strategies in other endemic regions of Brazil.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
... Foodborne, Waterborne, and Environmental Diseases Mycotic Diseases Branch Histoplasmosis Statistics Recommend on Facebook Tweet Share Compartir How common is histoplasmosis? In the United States, an estimated 60% to ...
Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.
2010-01-01
A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
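As a rough illustration of the harmonic Poisson process described here: a Poisson process on [a, b] with intensity c/x has expected count c·ln(b/a), and conditional on the count the points are log-uniform on [a, b]. The sketch below rests on those two standard facts; the function names are invented:

```python
import math
import random

def poisson_sample(mean, rng):
    """Knuth's product method for a Poisson draw (fine for small means)."""
    limit = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def harmonic_poisson_points(c, a, b, rng):
    """Sample a Poisson process on [a, b] with harmonic intensity c/x.

    The expected count is the integral of c/x over [a, b], i.e. c*ln(b/a);
    conditional on the count, the points are i.i.d. with density
    proportional to 1/x, which is log-uniform on [a, b].
    """
    n = poisson_sample(c * math.log(b / a), rng)
    # Inverse CDF: F(x) = ln(x/a)/ln(b/a)  =>  x = a * (b/a)**U
    return [a * (b / a) ** rng.random() for _ in range(n)]
```

Scale invariance shows up directly: the expected count in [s, r·s] is c·ln(r), independent of s.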
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
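The Fisher/Neyman-Pearson machinery mentioned here reduces, in its simplest case, to computing a test statistic and a p-value. A minimal one-sample z-test sketch (known variance assumed; names illustrative):

```python
import math

def one_sample_z_test(sample_mean, mu0, sigma, n):
    """Two-sided z-test of H0: population mean == mu0, with known sigma.

    Returns the z statistic and its two-sided p-value, computed from the
    standard normal CDF via math.erf.
    """
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    phi = 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0)))  # P(Z <= |z|)
    return z, 2.0 * (1.0 - phi)

# Example: 64 measurements averaging 10.5 against a hypothesized mean of 10
z, p = one_sample_z_test(10.5, mu0=10.0, sigma=2.0, n=64)  # z = 2.0, p ~ 0.0455
```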
Lyons, L
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Induced vaginal birth after previous caesarean section
Akylbek Tussupkaliyev
2016-11-01
Introduction The rate of operative birth by Caesarean section is constantly rising. In Kazakhstan, it reaches 27 per cent. Research data confirm that the percentage of successful vaginal births after previous Caesarean section is 50–70 per cent. How safe the induction of vaginal birth after Caesarean (VBAC) is remains unclear. Methodology The studied techniques of labour induction were amniotomy of the foetal bladder with the vulsellum ramus, intravaginal administration of E1 prostaglandin (Misoprostol), and intravenous infusion of Oxytocin-Richter. The readiness of the parturient canals was assessed by Bishop’s score; the labour course was assessed by a partogram. The effectiveness of the labour induction techniques was assessed by the number of administered doses, the time of onset of regular labour, the course of labour and the postpartum period, the presence of complications, and the course of the early neonatal period, which implied the assessment of the child’s condition as described in the newborn development record. The foetus was assessed by medical ultrasound and by antenatal and intranatal cardiotocography (CTG). The obtained results were analysed with SAS statistical processing software. Results The overall percentage of successful births with intravaginal administration of Misoprostol was 93 per cent (83 of cases). This percentage was higher than in the amniotomy group (relative risk (RR) 11.7) and was similar to the oxytocin group (RR 0.83). Amniotomy was effective in 54 per cent (39 of cases), in which it induced regular labour. Intravenous oxytocin infusion was effective in 94 per cent (89 of cases). This percentage was higher than that with amniotomy (RR 12.5). Conclusions The success of vaginal delivery after previous Caesarean section can be achieved in almost 70 per cent of cases. At the same time, labour induction does not decrease this indicator, which remains within population boundaries.
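The relative risks quoted in the Results compare outcome proportions between two groups. A minimal sketch of the computation; the raw counts below are hypothetical placeholders, since the abstract reports only percentages:

```python
def relative_risk(events_a, total_a, events_b, total_b):
    """Relative risk: probability of the outcome in group A divided by
    the probability of the outcome in group B."""
    return (events_a / total_a) / (events_b / total_b)

# Hypothetical counts (not the study's raw data, which is not given):
# say 52/56 successful inductions with misoprostol vs 39/72 with amniotomy
rr = relative_risk(52, 56, 39, 72)  # about 1.71
```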
Ross, Sheldon M
2005-01-01
In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin
Ross, Sheldon M
2010-01-01
In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory H
2010-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Hao, Q; Cao, W; Chen, P F
2015-01-01
We improve the automated filament detection method proposed in our previous works. It is then applied to process the full disk H$\alpha$ data mainly obtained by Big Bear Solar Observatory (BBSO) from 1988 to 2013, spanning nearly 3 solar cycles. The butterfly diagrams of the filaments, showing the information of the filament area, spine length, tilt angle, and the barb number, are obtained. The variations of these features with the calendar year and the latitude band are analyzed. The drift velocities of the filaments in different latitude bands are calculated and studied. We also investigate the north-south (N-S) asymmetries of the filament numbers in total and in each subclass classified according to the filament area, spine length, and tilt angle. The latitudinal distribution of the filament number is found to be bimodal. About 80% of all the filaments have tilt angles within [0{\deg}, 60{\deg}]. For the filaments within latitudes lower (higher) than 50{\deg} the northeast (northwest) direction i...
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
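One simple per-image statistic of the kind proposed above is the entropy of the grayscale histogram; the binning scheme here is an assumption for illustration, not the authors' method:

```python
import math

def image_entropy(pixels, bins=256):
    """Shannon entropy (bits) of a grayscale histogram -- one simple
    candidate statistic for ranking images by information content."""
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / bins or 1.0   # constant image: everything in one bin
    counts = [0] * bins
    for p in pixels:
        counts[min(int((p - lo) / width), bins - 1)] += 1
    n = len(pixels)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# A flat image carries no information; a two-level image carries one bit
flat = image_entropy([128.0] * 1000)                     # 0.0
two_level = image_entropy([0.0] * 500 + [255.0] * 500)   # 1.0
```

Images could then be ranked by this score and the highest-entropy frames flagged for inspection.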
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of an infinite sample often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Statistics of football dynamics
Mendes, R S; Anteneodo, C
2007-01-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches presents power-law tails and can be described by $q$-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.
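The q-gamma description of ball-touch statistics rests on the Tsallis q-exponential, which interpolates between exponential and power-law tails. An unnormalized density sketch (parameter names illustrative, not from the paper):

```python
import math

def q_exponential(u, q):
    """Tsallis q-exponential; reduces to exp(u) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(u)
    base = 1.0 + (1.0 - q) * u
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_gamma_pdf_unnormalized(x, alpha, theta, q):
    """Unnormalized q-gamma density: x**(alpha-1) * e_q(-x/theta).

    For q > 1 the tail decays like x**(alpha - 1 - 1/(q-1)),
    i.e. a power law rather than an exponential."""
    return x ** (alpha - 1.0) * q_exponential(-x / theta, q)
```

For q > 1 this produces exactly the power-law tails reported for the touch statistics; as q approaches 1 the ordinary gamma distribution is recovered.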
Magnander, Tobias [Department of Radiation Physics, Institute of Clinical Sciences at Sahlgrenska Academy, University of Gothenburg, Gothenburg (Sweden); Department of Medical Physics and Biomedical Engineering, Sahlgrenska University Hospital, Gothenburg (Sweden); Wikberg, E. [Department of Medical Physics and Biomedical Engineering, Sahlgrenska University Hospital, Gothenburg (Sweden); Svensson, J. [Department of Oncology, The Sahlgrenska Academy, University of Gothenburg, Gothenburg (Sweden); Gjertsson, P. [Department of Clinical Physiology, The Sahlgrenska Academy, University of Gothenburg, Gothenburg (Sweden); Wängberg, B. [Department of Surgery, The Sahlgrenska Academy, University of Gothenburg, Gothenburg (Sweden); Båth, M.; Bernhardt, Peter [Department of Radiation Physics, Institute of Clinical Sciences at Sahlgrenska Academy, University of Gothenburg, Gothenburg (Sweden); Department of Medical Physics and Biomedical Engineering, Sahlgrenska University Hospital, Gothenburg (Sweden)
2016-01-19
Low uptake ratios, high noise, poor resolution, and low contrast all combine to make the detection of neuroendocrine liver tumours by {sup 111}In-octreotide single photon emission tomography (SPECT) imaging a challenge. The aim of this study was to develop a segmentation analysis method that could improve the accuracy of hepatic neuroendocrine tumour detection. Our novel segmentation was benchmarked by a retrospective analysis of patients categorized as either {sup 111}In-octreotide positive ({sup 111}In-octreotide(+)) or {sup 111}In-octreotide negative ({sup 111}In-octreotide(−)) for liver tumours. Following a 3-year follow-up period, involving multiple imaging modalities, we further segregated {sup 111}In-octreotide-negative patients into two groups: one with no confirmed liver tumours ({sup 111}In-octreotide(−)/radtech(−)) and the other, now diagnosed with liver tumours ({sup 111}In-octreotide(−)/radtech(+)). We retrospectively applied our segmentation analysis to see if it could have detected these previously missed tumours using {sup 111}In-octreotide. Our methodology subdivided the liver and determined normalized numbers of uptake foci (nNUF), at various threshold values, using a connected-component labelling algorithm. Plots of nNUF against the threshold index (ThI) were generated. ThI was defined as follows: ThI = (c{sub max} − c{sub thr})/c{sub max}, where c{sub max} is the maximal threshold value for obtaining at least one, two voxel sized, uptake focus; c{sub thr} is the voxel threshold value. The maximal divergence between the nNUF values for {sup 111}In-octreotide(−)/radtech(−), and {sup 111}In-octreotide(+) livers, was used as the optimal nNUF value for tumour detection. We also corrected for any influence of the mean activity concentration on ThI. The nNUF versus ThI method (nNUFTI) was then used to reanalyze the {sup 111}In-octreotide(−)/radtech(−) and {sup 111}In-octreotide(−)/radtech(+) groups. Of a total of 53 {sup 111}In
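The counting step behind the nNUF statistic can be sketched with a toy 2-D connected-component labeller; the real analysis works on 3-D SPECT volumes, so this is only an illustration of the idea, with the ThI formula taken directly from the abstract:

```python
def count_uptake_foci(img, c_thr):
    """Count 4-connected components of pixels with value >= c_thr.

    A toy 2-D stand-in for the 3-D connected-component labelling used
    to count uptake foci at a given voxel threshold."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    n = 0
    for i in range(rows):
        for j in range(cols):
            if img[i][j] >= c_thr and not seen[i][j]:
                n += 1                      # new focus found
                stack = [(i, j)]
                seen[i][j] = True
                while stack:                # flood fill its extent
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] >= c_thr and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return n

def threshold_index(c_max, c_thr):
    """ThI = (c_max - c_thr) / c_max, as defined in the abstract."""
    return (c_max - c_thr) / c_max
```

Sweeping c_thr and plotting the (normalized) focus count against ThI yields the nNUF-versus-ThI curves the method compares between patient groups.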
Jana, Madhusudan
2015-01-01
Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Freund, Rudolf J; Wilson, William J
2010-01-01
Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, and numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2} emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, which is issued annually and also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2} emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products
Gallavotti, Giovanni
2011-01-01
C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.
Povoski, Stephen P; Chapman, Gregg J; Murrey, Douglas A; Lee, Robert; Martin, Edward W; Hall, Nathan C
2013-03-04
Intraoperative detection of (18)F-FDG-avid tissue sites during (18)F-FDG-directed surgery can be very challenging when utilizing gamma detection probes that rely on a fixed target-to-background (T/B) ratio (ratiometric threshold) for determination of probe positivity. The purpose of our study was to evaluate the counting efficiency and the success rate of in situ intraoperative detection of (18)F-FDG-avid tissue sites (using the three-sigma statistical threshold criteria method and the ratiometric threshold criteria method) for three different gamma detection probe systems. Of 58 patients undergoing (18)F-FDG-directed surgery for known or suspected malignancy using gamma detection probes, we identified nine (18)F-FDG-avid tissue sites (from amongst seven patients) that were seen on same-day preoperative diagnostic PET/CT imaging, and for which each (18)F-FDG-avid tissue site underwent attempted in situ intraoperative detection concurrently using three gamma detection probe systems (K-alpha probe, and two commercially-available PET-probe systems), and then were subsequently surgical excised. The mean relative probe counting efficiency ratio was 6.9 (± 4.4, range 2.2-15.4) for the K-alpha probe, as compared to 1.5 (± 0.3, range 1.0-2.1) and 1.0 (± 0, range 1.0-1.0), respectively, for two commercially-available PET-probe systems (P < 0.001). Successful in situ intraoperative detection of (18)F-FDG-avid tissue sites was more frequently accomplished with each of the three gamma detection probes tested by using the three-sigma statistical threshold criteria method than by using the ratiometric threshold criteria method, specifically with the three-sigma statistical threshold criteria method being significantly better than the ratiometric threshold criteria method for determining probe positivity for the K-alpha probe (P = 0.05). Our results suggest that the improved probe counting efficiency of the K-alpha probe design used in conjunction with the three-sigma statistical
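The two probe-positivity criteria compared in this study can be sketched as follows. The abstract does not give the authors' exact three-sigma formulation, so this sketch assumes plain Poisson counting statistics (variance equal to the mean for background counts), and the default T/B ratio of 1.5 is illustrative:

```python
import math

def probe_positive_three_sigma(target_counts, total_bg_counts, n_bg_readings):
    """Three-sigma criterion: the target reading exceeds the mean
    background by 3 standard deviations, assuming Poisson counting
    statistics for the background."""
    bg_mean = total_bg_counts / n_bg_readings
    return target_counts > bg_mean + 3.0 * math.sqrt(bg_mean)

def probe_positive_ratiometric(target_counts, bg_mean, ratio=1.5):
    """Ratiometric criterion: a fixed target-to-background (T/B) ratio."""
    return target_counts / bg_mean >= ratio
```

The statistical criterion adapts to the noise level of the background, which is why it can flag low-contrast sites that a fixed ratio misses.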
77 FR 70176 - Previous Participation Certification
2012-11-23
... URBAN DEVELOPMENT Previous Participation Certification AGENCY: Office of the Chief Information Officer... digital submission of all data and certifications is available via HUD's secure Internet systems. However...: Previous Participation Certification. OMB Approval Number: 2502-0118. Form Numbers: HUD-2530 ....
Natrella, Mary Gibbons
2005-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Meneghetti, M; Dahle, H; Limousin, M
2013-01-01
The existence of an arc statistics problem was at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and the frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool to trace structure formation. However, given the limited size of observational and theoretical datasets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...
2012-01-01
In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Heyen, H. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Gewaesserphysik
1998-12-31
A multivariate statistical approach is presented that allows a systematic search for relationships between the interannual variability in climate records and ecological time series. Statistical models are built between climatological predictor fields and the variables of interest. Relationships are sought on different temporal scales and for different seasons and time lags. The possibilities and limitations of this approach are discussed in four case studies dealing with salinity in the German Bight, abundance of zooplankton at Helgoland Roads, macrofauna communities off Norderney and the arrival of migratory birds on Helgoland. (orig.) [German] A statistical multivariate model is presented that allows a systematic search for potential relationships between variability in climate and ecological time series. Four application examples examine the influence of climate on salinity in the German Bight, zooplankton off Helgoland, macrofauna off Norderney, and the arrival of migratory birds on Helgoland. (orig.)
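The search for relationships at different time lags described above can be sketched, in its simplest univariate form, as a lagged Pearson correlation; the full approach uses multivariate predictor fields, so this shows only the core idea, with illustrative function names:

```python
import math

def pearson(a, b):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

def lagged_correlation(x, y, lag):
    """Correlate the predictor x with the response y at a given lag;
    a positive lag means x leads y by `lag` steps."""
    if lag > 0:
        return pearson(x[:-lag], y[lag:])
    if lag < 0:
        return pearson(x[-lag:], y[:lag])
    return pearson(x, y)
```

Scanning `lag` over a plausible range and over seasons is then a systematic (if multiple-testing-prone) way to flag candidate climate-ecology links for closer inspection.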
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
Joint target localization estimation and detection for statistical MIMO radar
马鹏; 郑志东; 张剑云; 李小波
2013-01-01
We consider multiple-input multiple-output (MIMO) radar systems with widely spaced antennas, which facilitate capturing the inherent diversity gain for joint detection and estimation. Unlike conventional MIMO radars that break the space into small cells and aim at detecting the presence of a target in a specified cell, a new MIMO radar framework for detecting a target that lies in an unknown location is put forward in this paper. We treat this problem by offering a novel composite hypothesis testing framework for target detection. The offered test optimizes a metric that accounts for both detection and estimation accuracies. Theoretical analysis shows that the proposed joint localization and detection algorithm outperforms the classical range-gate detection method, with the missed-detection probability inversely related to the SNR. Simulation experiments also verify the effectiveness of the algorithm.
Reading Statistics and Research
Reviewed by Yavuz Akbulut
2008-10-01
The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops improving his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).
Roth, Amanda L.; Hanson, Nancy D.
2013-01-01
In the United States, the production of the Klebsiella pneumoniae carbapenemase (KPC) is an important mechanism of carbapenem resistance in Gram-negative pathogens. Infections with KPC-producing organisms are associated with increased morbidity and mortality; therefore, the rapid detection of KPC-producing pathogens is critical in patient care and infection control. We developed a real-time PCR assay complemented with traditional high-resolution melting (HRM) analysis, as well as statisticall...
Huleihel, Mahmoud; Shufan, Elad; Zeiri, Leila; Salman, Ahmad
2016-01-01
Of the eight members of the herpes family of viruses, HSV1, HSV2, and varicella zoster are the most common and are mainly involved in cutaneous disorders. These viruses usually are not life-threatening, but in some cases they might cause serious infections to the eyes and the brain that can lead to blindness and possibly death. An effective drug (acyclovir and its derivatives) is available against these viruses. Therefore, early detection and identification of these viral infections is highly important for an effective treatment. Raman spectroscopy, which has been widely used in the past years in medicine and biology, was used as a powerful spectroscopic tool for the detection and identification of these viral infections in cell culture, due to its sensitivity, rapidity and reliability. Our results showed that it was possible to differentiate, with a 97% identification success rate, the uninfected Vero cells that served as a control, from the Vero cells that were infected with HSV-1, HSV-2, and VZV. For that, linear discriminant analysis (LDA) was performed on the Raman spectra after principal component analysis (PCA) with a leave-one-out (LOO) approach. Raman spectroscopy in tandem with PCA and LDA enables differentiation among the different herpes viral infections of Vero cells in a time span of a few minutes with a high accuracy rate. Understanding cell molecular changes due to herpes viral infections using Raman spectroscopy may help in early detection and effective treatment. PMID:27078266
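The PCA-then-LDA pipeline with a leave-one-out approach described above can be sketched as follows. The "spectra" here are synthetic (a shared baseline plus one class-specific band per class), and the class count, sizes, noise level and 10-component cutoff are illustrative assumptions, not the study's settings:

```python
import numpy as np

def pca_lda_loo(X, y, n_components=10):
    """Leave-one-out evaluation of a PCA -> LDA classifier (numpy only).
    After projecting onto the leading principal components, each left-out
    spectrum is assigned to the class whose mean is nearest in the
    pooled-within-class-covariance (Mahalanobis) metric, i.e. classic LDA
    with equal priors."""
    n, correct = len(y), 0
    for i in range(n):
        tr = np.arange(n) != i
        Xtr, ytr = X[tr], y[tr]
        mu = Xtr.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xtr - mu, full_matrices=False)
        P = Vt[:n_components].T                    # PCA loadings
        Ztr, z = (Xtr - mu) @ P, (X[i] - mu) @ P
        classes = np.unique(ytr)
        means = {c: Ztr[ytr == c].mean(axis=0) for c in classes}
        Sw = sum((Ztr[ytr == c] - means[c]).T @ (Ztr[ytr == c] - means[c])
                 for c in classes) / (len(ytr) - len(classes))
        inv = np.linalg.inv(Sw)
        dist = {c: (z - means[c]) @ inv @ (z - means[c]) for c in classes}
        correct += int(min(dist, key=dist.get) == y[i])
    return correct / n

# Synthetic "spectra": shared baseline plus a class-specific Raman-like band.
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 3 * np.pi, 100))
X, y = [], []
for label, band in enumerate([20, 50, 80]):
    for _ in range(20):
        s = base.copy()
        s[band - 5:band + 5] += 1.0
        X.append(s + 0.3 * rng.standard_normal(100))
        y.append(label)
X, y = np.array(X), np.array(y)
rate = pca_lda_loo(X, y)
```

Refitting PCA and LDA inside each leave-one-out fold, as done here, avoids the optimistic bias of projecting the held-out spectrum with components learned from the full data set.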
Tombesi, F; Reeves, J N; Palumbo, G G C; Yaqoob, T; Braito, V; Dadina, M
2010-01-01
We performed a blind search for narrow absorption features at energies greater than 6.4 keV in a sample of 42 radio-quiet AGNs observed with XMM-Newton. We detect 36 narrow absorption lines on a total of 101 XMM-Newton EPIC pn observations. The number of absorption lines at rest-frame energies E>7 keV is 22. Their global probability of being generated by random fluctuations is very low, less than 3x10^-8, and their detection has been independently confirmed by a spectral analysis of the MOS data, with associated random probability <10^-7. We identify the lines as Fe XXV and Fe XXVI K-shell resonant absorption. They are systematically blue-shifted, with a velocity distribution ranging from zero up to 0.3c, with a peak and mean value at 0.1c. We detect variability of the lines in both EWs and blue-shifted velocities among different observations, even on time-scales as short as a few days, possibly suggesting somewhat compact absorbers. Moreover, we find no significant correlation between the cosmological red-sh...
Salman, A; Shufan, E; Zeiri, L; Huleihel, M
2013-03-01
Cancer is one of the leading worldwide causes of death. It may be induced by a variety of factors, including carcinogens, radiation, genetic factors, or DNA and RNA viruses. The early detection of cancer is critical for its successful therapy, which can result in complete recovery from some types of cancer. Raman spectroscopy has been widely used in medicine and biology. It is a noninvasive, nondestructive, and water-insensitive technique that can detect changes in cells and tissues that are caused by different disorders, such as cancer. In this study, Raman spectroscopy was used for the identification and characterization of murine fibroblast cell lines (NIH/3T3) and malignant fibroblast cells transformed by murine sarcoma virus (NIH-MuSV) cells. Using principal component analysis and LDA it was possible to differentiate between the NIH/3T3 and NIH-MuSV cells with an 80-85% success rate based on their Raman shift spectra. The best results for differentiation were achieved from spectra that were obtained from the rich membrane sites. Because of its homogeneity and complete control of most factors affecting its growth, cell culture is a preferred model for the detection and identification of specific biomarkers related to cancer transformation or other cellular modifications.
李鹏; 张永良; 李焱淼; 李骏康
2011-01-01
To detect and remove the noise in slap fingerprint images, an efficient algorithm is presented based on frequency-domain statistics. First, the slap image is normalized; then, three frequency-domain statistics are used for noise detection and removal; finally, post-processing is applied to optimize the result. Based on the different frequency-domain characteristics of fingerprint and noise, corresponding criteria are proposed to detect and remove the noise. The experimental results show that the presented algorithm is an effective method for detecting and removing noise in slap fingerprint images.
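One frequency-domain statistic of the general kind described above can be sketched as follows. The ring radii and the 0.5 threshold are illustrative assumptions, not the paper's three statistics; the idea is only that quasi-periodic ridges concentrate spectral power where noise does not:

```python
import numpy as np

def ring_energy_ratio(block, r_lo=4, r_hi=16):
    """Fraction of spectral energy in a mid-frequency ring, where the
    quasi-periodic fingerprint ridges concentrate their power."""
    f = np.fft.fftshift(np.fft.fft2(block - block.mean()))
    power = np.abs(f) ** 2
    h, w = block.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h // 2, xx - w // 2)
    ring = (r >= r_lo) & (r < r_hi)
    total = power.sum()
    return float(power[ring].sum() / total) if total > 0 else 0.0

def looks_like_fingerprint(block, threshold=0.5):
    # Assumed decision rule: blocks below threshold are treated as noise.
    return ring_energy_ratio(block) > threshold

# A synthetic period-8 ridge pattern scores high; white noise scores low,
# since its energy is spread uniformly over the whole spectrum.
ridges = np.sin(2 * np.pi * np.mgrid[0:64, 0:64][0] / 8.0)
noise = np.random.default_rng(1).standard_normal((64, 64))
```

A real detector would combine several such statistics blockwise, as the abstract indicates, rather than rely on a single global ratio.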
Ballance, Simon; Holtan, Synnøve; Aarstad, Olav Andreas; Sikorski, Pawel; Skjåk-Braek, Gudmund; Christensen, Bjørn E
2005-11-04
Alginates comprised of essentially alternating units of mannuronic acid (M) and guluronic acid (G) (MG-alginate), and G-blocks isolated from a seaweed, were subjected to partial acid hydrolysis at pH 3.5. The chain-length distribution of oligosaccharides in the hydrolysate was investigated by statistical analysis after their separation with high-performance anion-exchange chromatography and pulsed amperometric detection (HPAEC-PAD). Simulated depolymerisation of the MG-alginate provided an estimate of the ratio between two acid hydrolysis rate constants (p=8.3+/-1) and the average distribution of the MM linkages in the original sample of polysaccharide chains. In conclusion, we found HPAEC-PAD together with statistical analysis to be a useful method for investigating the fine structure and some properties of binary polysaccharides.
All of statistics a concise course in statistical inference
Wasserman, Larry
2004-01-01
This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...
Subsequent pregnancy outcome after previous foetal death
Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.
2013-01-01
Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to
Yin Y. Shugart; Bing-Jian Feng; Andrew Collins
2002-11-01
We have evaluated the power for detecting a common trait determined by two loci, using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports, which involve evaluations of the power of allele-sharing statistics for a single disease locus, we have used a simulated data set of general pedigrees in which a two-locus disease is segregating and evaluated several non-parametric linkage statistics implemented in the two programs. We found that the power for detecting linkage using the $S_{\text{all}}$ statistic in GENEHUNTER (GH, version 2.1), implemented as statistic in SimWalk2 (version 2.82), differs between the two programs. The values associated with statistic output by SimWalk2 are consistently more conservative than those from GENEHUNTER, except when the underlying model includes heterogeneity at a level of 50%, where the values output are very comparable. On the other hand, when the thresholds are determined empirically under the null hypothesis, $S_{\text{all}}$ in GENEHUNTER and statistic have similar power.
Ott, Julien G.; Ba, Alexandre; Racine, Damien; Viry, Anais; Bochud, Francois O.; Verdun, Francis R. [Univ. Hospital Lausanne (Switzerland). Inst. of Radiation Physics]
2017-08-01
This study aims to assess CT image quality in a way that would meet specific requirements of clinical practice. Physics metrics, like Fourier-transform-derived metrics, were traditionally employed for this. However, assessment methods based on a detection task have also developed quite extensively lately, and we chose to rely on this modality for image quality assessment. Our goal was to develop a tool adapted for fast and reliable CT image quality assessment in order to pave the way for new CT benchmarking techniques in a clinical context. Additionally, we also used this method to estimate the benefits brought by some IR algorithms. A modified QRM chest phantom containing spheres of 5 and 8 mm at contrast levels of 10 and 20 HU at 120 kVp was used. Images of the phantom were acquired at CTDI_vol values of 0.8, 3.6, 8.2 and 14.5 mGy, before being reconstructed using FBP, ASIR 40 and MBIR on a GE HD 750 CT scanner. They were then assessed by eight human observers undergoing a 4-AFC test. After that, these data were compared with the results obtained from two different model observers (NPWE and CHO with DDoG channels). The study investigated the effects of the acquisition conditions as well as reconstruction methods. The NPWE and CHO models both gave coherent results and approximated human observer results well. Moreover, the reconstruction technique used to retrieve the images had a clear impact on the PC values. Both models suggest that switching from FBP to ASIR 40 and particularly to MBIR produces an increase in low-contrast detection, provided a minimum level of exposure is reached. Our work shows that the CHO with DDoG channels and the NPWE model both approximate the trend of humans performing a detection task. Both models also suggest that the use of MBIR goes along with an increase in the PCs, indicating that further dose reduction is still possible when using these techniques. Eventually, the CHO model associated with the protocol we described in this study
Induced vaginal birth after previous caesarean section
Akylbek Tussupkaliyev; Andrey Gayday; Bibigul Karimsakova; Saule Bermagambetova; Lunara Uteniyazova; Guldana Iztleuova; Gulkhanym Kusherbayeva; Meruyert Konakbayeva; Assylzada Merekeyeva; Zamira Imangaliyeva
2016-01-01
Introduction: The rate of operative birth by Caesarean section is constantly rising. In Kazakhstan, it reaches 27 per cent. Research data confirm that the percentage of successful vaginal births after a previous Caesarean section is 50-70 per cent. How safe induction of vaginal birth after Caesarean (VBAC) is remains unclear. Methodology: The studied techniques of labour induction were amniotomy of the foetal bladder with the vulsellum ramus, intravaginal administra...
Patent Statistics and IPR Laws Update Online
Anonymous
2006-01-01
Patent Statistics: Beginning with No. 2, 2005 of China Patents & Trademarks, the Statistics on Patent Applications & Grants in China, previously published under the Statistics column, will be updated online, including the monthly Statistics on Patent Applications by Patent Category, Patent Grants by Patent Category, Domestic Patent Applications by Province, and Overseas Patent Applications by Country, and their yearly statistics, at www.cpt.cn or www.cpahkltd.com/cn/ Publications/staten.htm...
Statistics for characterizing data on the periphery
Theiler, James P [Los Alamos National Laboratory]; Hush, Donald R [Los Alamos National Laboratory]
2010-01-01
We introduce a class of statistics for characterizing the periphery of a distribution, and show that these statistics are particularly valuable for problems in target detection. Because so many detection algorithms are rooted in Gaussian statistics, we concentrate on ellipsoidal models of high-dimensional data distributions (that is to say: covariance matrices), but we recommend several alternatives to the sample covariance matrix that more efficiently model the periphery of a distribution, and can more effectively detect anomalous data samples.
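An ellipsoidal (covariance-based) detector of the kind discussed above can be sketched with the plain sample covariance; the abstract's point is that alternatives to this baseline model the periphery more efficiently, which this minimal sketch does not attempt. The data, dimensions and the 99th-percentile threshold are illustrative assumptions:

```python
import numpy as np

def mahalanobis_sq(X, mu, cov):
    """Squared Mahalanobis distance of each row of X from an ellipsoidal
    model (mean mu, covariance cov)."""
    inv = np.linalg.inv(cov)
    d = np.atleast_2d(X) - mu
    return np.einsum('ij,jk,ik->i', d, inv, d)

# Fit the ellipsoidal background model from data.
rng = np.random.default_rng(0)
background = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 2.0]], size=500)
mu, cov = background.mean(axis=0), np.cov(background, rowvar=False)

# Flag samples whose distance exceeds a threshold taken from the background
# distribution itself (here the empirical 99th percentile).
scores = mahalanobis_sq(background, mu, cov)
threshold = np.quantile(scores, 0.99)
is_anomaly = mahalanobis_sq([[6.0, -6.0]], mu, cov)[0] > threshold
```

Swapping `np.cov` for a peripherally weighted covariance estimate, as the abstract recommends, leaves the rest of this detection pipeline unchanged.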
Previous studies underestimate BMAA concentrations in cycad flour.
Cheng, Ran; Banack, Sandra Anne
2009-01-01
The traditional diet of the Chamorro people of Guam has high concentrations of the neurotoxin BMAA, beta-methyl-amino-L-alanine, in cycad tortillas and from animals that feed on cycad seeds. We measured BMAA concentration in washed cycad flour and compared different extraction methods used by previous researchers in order to determine how much BMAA may have been unaccounted for in prior research. Samples were analyzed with AQC precolumn derivatization using HPLC-FD detection and verified with UPLC-UV, UPLC-MS, and triple quadrupole LC/MS/MS. Although previous workers had studied only the free amino acid component of BMAA in washed cycad flour, we detected significant levels of protein-associated BMAA in washed cycad flour. These data support a link between ALS/PDC and exposure to BMAA.
OUTCOME OF PREGNANCY IN WOMEN WITH PREVIOUS CAESAREAN SECTION
Bellad Girija
2016-06-01
BACKGROUND: Carefully selected cases of vaginal birth after Caesarean section (VBAC) are safe and successful. Even though the options of elective Caesarean section or a trial of labour are given to women with a prior Caesarean section, the risk is always present. In successful VBACs, morbidity is lower compared to repeat Caesarean section. That is why this study was conducted to determine the outcome of pregnancy in women with a previous CS. OBJECTIVES: 1. To evaluate the clinical course of labour in cases with previous Caesarean section. 2. To study the perinatal outcome in cases with previous Caesarean section, whether by vaginal delivery or repeat Caesarean section. 3. To study maternal morbidity in these cases. METHOD: A retrospective analysis of the medical records of 250 women with a previous Caesarean section, who delivered in BIMS Hospital between May 2015 and July 2015, was carried out. Women with recurrent indications for Caesarean section, those having non-recurrent indications with any complicating factors in the present pregnancy, and women with two previous Caesarean sections were not given a trial of vaginal delivery. Women with a previous section for non-recurrent indications were given a trial of vaginal delivery. STATISTICAL ANALYSIS: Done by Chi-square test. RESULT: Of the 250 cases, 132 were given a trial of vaginal delivery. Of these, vaginal delivery occurred in 61.3% and repeat section in 38%. There is an association between maternal morbidity and type of delivery. Birth weight was associated with the type of delivery. There is no association between neonatal outcome and type of delivery. CONCLUSION: With carefully selected patients, appropriate timing and close supervision, a trial of vaginal delivery after one previous Caesarean section is safe and successful. An individual approach seems to be the best.
Duarte, Janaína; Pacheco, Marcos T. T.; Villaverde, Antonio Balbin; Machado, Rosangela Z.; Zângaro, Renato A.; Silveira, Landulfo
2010-07-01
Toxoplasmosis is an important zoonosis in public health because domestic cats are the main agents responsible for the transmission of this disease in Brazil. We investigate a method for diagnosing toxoplasmosis based on Raman spectroscopy. Dispersive near-infrared Raman spectra are used to quantify anti-Toxoplasma gondii (IgG) antibodies in blood sera from domestic cats. An 830-nm laser is used for sample excitation, and a dispersive spectrometer is used to detect the Raman scattering. A serological test is performed on all serum samples by the enzyme-linked immunosorbent assay (ELISA) for validation. Raman spectra are taken from 59 blood serum samples and a quantification model is implemented based on partial least squares (PLS) to quantify the sample's serology by Raman spectra compared to the results provided by the ELISA test. Based on the serological values provided by the Raman/PLS model, diagnostic parameters such as sensitivity, specificity, accuracy, positive prediction values, and negative prediction values are calculated to discriminate negative from positive samples, obtaining 100, 80, 90, 83.3, and 100%, respectively. Raman spectroscopy, combined with PLS, is promising as a serological assay for toxoplasmosis, enabling fast and sensitive diagnosis.
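The diagnostic parameters reported above follow directly from a 2x2 confusion table. The sketch below reproduces the reported 100/80/90/83.3/100% figures using illustrative counts (10 positives and 10 negatives), which are NOT the study's actual 59-sample split:

```python
import numpy as np

def diagnostic_parameters(y_true, y_pred):
    """Sensitivity, specificity, accuracy, PPV and NPV from binary labels
    (1 = positive serology, 0 = negative)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))   # true positives
    tn = np.sum((y_true == 0) & (y_pred == 0))   # true negatives
    fp = np.sum((y_true == 0) & (y_pred == 1))   # false positives
    fn = np.sum((y_true == 1) & (y_pred == 0))   # false negatives
    return {
        'sensitivity': tp / (tp + fn),
        'specificity': tn / (tn + fp),
        'accuracy': (tp + tn) / len(y_true),
        'ppv': tp / (tp + fp),
        'npv': tn / (tn + fn),
    }

# Hypothetical counts: all 10 positives caught, 2 of 10 negatives misread.
elisa = [1] * 10 + [0] * 10                  # ELISA reference labels
raman = [1] * 10 + [1] * 2 + [0] * 8         # hypothetical Raman/PLS calls
params = diagnostic_parameters(elisa, raman)
```

Note that a sensitivity and NPV of 100% require zero false negatives, which is why the table above has none.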
Cutaneous responses to vaccinia in individuals with previous smallpox vaccination.
Simpson, Eric L; Hercher, Michelle; Hammarlund, Erika K; Lewis, Matthew W; Slifka, Mark K; Hanifin, Jon M
2007-09-01
The durability of immune responses to smallpox vaccine is a subject of considerable debate. We compared cutaneous vaccinia responses in patients vaccinated in the distant past with vaccine-naïve individuals using serial close-up photographs. The previously vaccinated group had a significantly reduced time course and milder cutaneous reactions. Vaccinated individuals appear to maintain clinically detectable immunity against vaccinia for at least 20 years after smallpox vaccination.
Cataract surgery in previously vitrectomized eyes.
Akinci, A; Batman, C; Zilelioglu, O
2008-05-01
To evaluate the results of extracapsular cataract extraction (ECCE) and phacoemulsification (PHACO) performed in previously vitrectomized eyes. In this retrospective study, 56 vitrectomized eyes that had ECCE and 60 vitrectomized eyes that had PHACO were included in the study group while 65 eyes that had PHACO in the control group. The evaluated parameters were the incidence of intra-operative and postoperative complications (IPC) and visual outcomes. Chi-squared, independent samples and paired samples tests were used for comparing the results. Deep anterior chamber (AC) was significantly more common in the PHACO group of vitrectomized eyes (PGVE) and observed in eyes that had undergone extensive vitreous removal (p ECCE group and the PGVE (p > 0.05). Some of the intra-operative conditions such as posterior synechiae, primary posterior capsular opacification (PCO) and postoperative complications such as retinal detachment (RD), PCO were significantly more common in vitrectomized eyes than the controls (p ECCE group and the PGVE (p > 0.05). Deep AC is more common in eyes with extensive vitreous removal during PHACO than ECCE. Decreasing the bottle height is advised in this case. Except for this, the results of ECCE and PHACO are similar in previously vitrectomized eyes. Posterior synechiaes, primary and postoperative PCO and RD are more common in vitrectomized eyes than the controls.
吕喜在; 苏绍璟; 黄芝平
2011-01-01
In order to get the frame synchronization information on unknown lines in digital communication, a method of blindly detecting the sub-synchronous sequence is presented. By modeling frame synchronization, the relationship between the probability of detecting the sub-synchronous sequence and the length of data is obtained from word frequency statistics with different word widths. According to this relationship, blind detection of the sub-synchronous sequence of unknown lines with any bit-start under error conditions is realized by conducting the word frequency statistics and adjusting the word width and data length, which makes it possible to get the whole frame synchronization information on unknown lines.
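The word-frequency idea described above can be sketched as follows. The sync word, frame length and frame count are invented for illustration; real use would also sweep the word width and data length, and tolerate bit errors, as the abstract describes:

```python
import random
from collections import Counter

def find_sync_word(bits, width):
    """Count every width-bit word at every bit offset; a frame-sync word
    that recurs once per frame stands out far above the background word
    frequencies of the random payload."""
    counts = Counter(bits[i:i + width] for i in range(len(bits) - width + 1))
    return counts.most_common(1)[0]          # (word, occurrence count)

# Illustrative stream: a hypothetical 8-bit sync word followed by 56 random
# payload bits per frame, 200 frames, with no alignment information given.
random.seed(0)
sync = '11101101'
stream = ''.join(sync + ''.join(random.choice('01') for _ in range(56))
                 for _ in range(200))
word, hits = find_sync_word(stream, 8)
```

Here a random 8-bit word is expected roughly 12800/256 = 50 times, while the sync word appears at least once per frame (200+ times), so the frequency gap makes the blind detection possible.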
Obinutuzumab for previously untreated chronic lymphocytic leukemia.
Abraham, Jame; Stegner, Mark
2014-04-01
Obinutuzumab was approved by the Food and Drug Administration in late 2013 for use in combination with chlorambucil for the treatment of patients with previously untreated chronic lymphocytic leukemia (CLL). The approval was based on results of an open-label phase 3 trial that showed improved progression-free survival (PFS) with the combination of obinutuzumab plus chlorambucil compared with chlorambucil alone. Obinutuzumab is a monoclonal antibody that targets CD20 antigen expressed on the surface of pre B- and mature B-lymphocytes. After binding to CD20, obinutuzumab mediates B-cell lysis by engaging immune effector cells, directly activating intracellular death signaling pathways, and activating the complement cascade. Immune effector cell activities include antibody-dependent cellular cytotoxicity and antibody-dependent cellular phagocytosis.
Can previous learning alter future plasticity mechanisms?
Crestani, Ana Paula; Quillfeldt, Jorge Alberto
2016-02-01
The dynamic processes related to mnemonic plasticity have been extensively researched in the last decades. More recently, studies have attracted attention because they show an unusual plasticity mechanism that is independent of the receptor most usually related to first-time learning (that is, memory acquisition): the NMDA receptor. An interesting feature of this type of learning is that a previous experience may cause modifications in the plasticity mechanism of a subsequent learning, suggesting that prior experience in a very similar task triggers a memory acquisition process that does not depend on NMDARs. The intracellular molecular cascades necessary to assist the learning process seem to depend on the activation of hippocampal CP-AMPARs. Moreover, most of these studies were performed on hippocampus-dependent tasks, even though other brain areas, such as the basolateral amygdala, also display NMDAR-independent learning.
Books average previous decade of economic misery.
Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios
2014-01-01
For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
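The moving-average fit described above can be sketched as follows. The series here are synthetic, constructed so that an 11-year trailing average is the true relationship, mirroring the reported peak at 11 years:

```python
import numpy as np

def best_window(literary, economic, max_window=15):
    """Correlate a 'literary misery' series with trailing moving averages
    of the economic misery index over the previous w years; return the
    window with the best (highest-correlation) fit."""
    best_w, best_r = None, -np.inf
    for w in range(1, max_window + 1):
        # trailing moving average over the previous w years
        ma = np.convolve(economic, np.ones(w) / w, mode='valid')
        r = np.corrcoef(literary[w - 1:], ma)[0, 1]
        if r > best_r:
            best_w, best_r = w, r
    return best_w, best_r

# Synthetic check: build a series that is, by construction, the 11-year
# trailing average of a random "economic" series.
rng = np.random.default_rng(0)
econ = rng.standard_normal(110)
lit = econ.copy()
for t in range(10, len(econ)):
    lit[t] = econ[t - 10:t + 1].mean()
w, r = best_window(lit, econ)
```

Scanning goodness of fit across window sizes, as `best_window` does, is how a peak like the paper's 11-year lag can be located.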
Finkelstein, Michael O
2015-01-01
This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
Previous gastric bypass surgery complicating total thyroidectomy.
Alfonso, Bianca; Jacobson, Adam S; Alon, Eran E; Via, Michael A
2015-03-01
Hypocalcemia is a well-known complication of total thyroidectomy. Patients who have previously undergone gastric bypass surgery may be at increased risk of hypocalcemia due to gastrointestinal malabsorption, secondary hyperparathyroidism, and an underlying vitamin D deficiency. We present the case of a 58-year-old woman who underwent a total thyroidectomy for the follicular variant of papillary thyroid carcinoma. Her history included Roux-en-Y gastric bypass surgery. Following the thyroid surgery, she developed postoperative hypocalcemia that required large doses of oral calcium carbonate (7.5 g/day), oral calcitriol (up to 4 μg/day), intravenous calcium gluconate (2.0 g/day), calcium citrate (2.0 g/day), and ergocalciferol (50,000 IU/day). Her serum calcium levels remained normal on this regimen after hospital discharge despite persistent hypoparathyroidism. Bariatric surgery patients who undergo thyroid surgery require aggressive supplementation to maintain normal serum calcium levels. Preoperative supplementation with calcium and vitamin D is strongly recommended.
Surgery of intracranial aneurysms previously treated endovascularly.
Tirakotai, Wuttipong; Sure, Ulrich; Yin, Yuhua; Benes, Ludwig; Schulte, Dirk Michael; Bien, Siegfried; Bertalanffy, Helmut
2007-11-01
To perform a retrospective study on patients who underwent aneurysmal surgery following endovascular treatment. We performed a retrospective study on eight patients who underwent aneurysmal surgery following endovascular treatment (or attempted treatment) with Guglielmi detachable coils (GDCs). The indications for surgery, surgical techniques and clinical outcomes were analyzed. The indications for surgical treatment after GDC coiling of an aneurysm fell into three groups. First group: surgery for incompletely coiled aneurysms (n=4). Second group: surgery for mass effect on the neural structures due to coil compaction or rebleeding (n=2). Third group: surgery for vascular complications after the endovascular procedure due to parent artery occlusion or thrombus propagation from the aneurysm (n=2). Aneurysm obliteration was achieved in all cases, as confirmed by postoperative angiography. Six patients had an excellent outcome and returned to their professions. One patient's visual acuity improved. One individual experienced right hemiparesis (grade IV/V) and hemihypesthesia. Microsurgical clipping is rarely necessary for previously coiled aneurysms. Surgical treatment is uncommonly required when an acute complication arises during endovascular treatment, or when there is a dynamic change in a residual aneurysm's configuration over time that is considered insecure.
[Electronic cigarettes - effects on health. Previous reports].
Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa
2014-01-01
Electronic cigarettes (e-cigarettes) have recently become very popular on the tobacco-product market. These products are considered potentially less harmful than traditional tobacco products. However, current reports indicate that producers' statements regarding the composition of e-liquids are not always sufficient, and consumers often lack reliable information on the quality of the product they use. This paper reviews previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects concerned symptoms of the respiratory tract, mouth and throat, neurological complications, and the sensory organs. Particularly hazardous effects of e-cigarettes included pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree burns of the face, blindness, chest pain and rapid heartbeat. The literature contains no information on passive exposure to the aerosols released during e-cigarette smoking. Furthermore, information regarding long-term use of these products is also unavailable.
A previously undescribed pathway for pyrimidine catabolism.
Loh, Kevin D; Gyaneshwar, Prasad; Markenscoff Papadimitriou, Eirene; Fong, Rebecca; Kim, Kwang-Seo; Parales, Rebecca; Zhou, Zhongrui; Inwood, William; Kustu, Sydney
2006-03-28
The b1012 operon of Escherichia coli K-12, which is composed of seven unidentified ORFs, is one of the most highly expressed operons under control of nitrogen regulatory protein C. Examination of strains with lesions in this operon on Biolog Phenotype MicroArray (PM3) plates and subsequent growth tests indicated that they failed to use uridine or uracil as the sole nitrogen source and that the parental strain could use them at room temperature but not at 37 degrees C. A strain carrying an ntrB(Con) mutation, which elevates transcription of genes under nitrogen regulatory protein C control, could also grow on thymidine as the sole nitrogen source, whereas strains with lesions in the b1012 operon could not. Growth-yield experiments indicated that both nitrogens of uridine and thymidine were available. Studies with [(14)C]uridine indicated that a three-carbon waste product from the pyrimidine ring was excreted. After trimethylsilylation and gas chromatography, the waste product was identified by mass spectrometry as 3-hydroxypropionic acid. In agreement with this finding, 2-methyl-3-hydroxypropionic acid was released from thymidine. Both the number of available nitrogens and the waste products distinguished the pathway encoded by the b1012 operon from pyrimidine catabolic pathways described previously. We propose that the genes of this operon be named rutA-G for pyrimidine utilization. The product of the divergently transcribed gene, b1013, is a tetracycline repressor family regulator that controls transcription of the b1012 operon negatively.
CURRENT STATUS OF NONPARAMETRIC STATISTICS
Orlov A. I.
2015-02-01
Nonparametric statistics is one of the five growth points of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to delineate its subareas, based on the existing practice of scientific activity, and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows statistical inference, in particular estimation of the characteristics of a distribution and testing of statistical hypotheses, without the usually weakly justified assumption that the distribution functions of the samples belong to a particular parametric family. For example, it is widely believed that statistical data often follow the normal distribution. Meanwhile, analysis of observational results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from the normal one. Uncritical use of the normality hypothesis often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore it is advisable to use nonparametric methods, which impose only weak requirements on the distribution functions of the observations; usually only their continuity is assumed. On the basis of a generalization of numerous studies it can be stated that, to date, nonparametric methods can solve almost the same range of tasks for which parametric methods were previously used. Certain statements in the literature, that nonparametric methods have less power or require larger sample sizes than parametric methods, are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, a number of problems remain unresolved.
Detecting and Avoiding Wormhole Attacks by Statistical Analysis Based on RTT
杨姣; 王东
2011-01-01
A mobile ad hoc network (MANET) is a new type of wireless mobile network. Its lack of central infrastructure, self-organization, dynamically changing topology and open wireless communication make it vulnerable to attack. The wormhole attack targets ad hoc routing protocols and poses one of the most severe threats to ad hoc networks: two colluding malicious nodes create a higher-level virtual tunnel between themselves. This paper proposes a statistical analysis mechanism based on RTT (round-trip time) to detect and avoid wormhole attacks. During route discovery, before returning the route reply (RREP), the destination node statistically analyzes the collected route information to obtain the frequency with which each link appears across all routes, and combines this with the RTT between each pair of successive nodes to select a route. Because the wormhole is detected during route setup, it can be recognized before it harms the network. Each node computes its own RTTs, so the computation is modest and the overhead remains small. Simulation results show that the method can effectively detect wormhole attacks and improve the wormhole detection rate.
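The idea of judging candidate routes by per-hop timing can be illustrated with a simplified sketch. The paper's mechanism also analyzes link frequencies at the destination; the rule below (flagging a hop whose RTT far exceeds the median per-hop RTT, since a wormhole tunnel is advertised as one hop but physically spans a long out-of-band link) is an illustrative stand-in, with invented function names and an invented threshold.

```python
from statistics import median

def suspect_links(hop_rtts, factor=3.0):
    """Flag hops whose RTT exceeds `factor` times the median per-hop RTT.

    A wormhole tunnel looks like a single hop with an abnormally long
    round-trip time. `factor` is an illustrative threshold, not a value
    from the paper.
    """
    m = median(hop_rtts)
    return [i for i, rtt in enumerate(hop_rtts) if rtt > factor * m]

def choose_route(routes):
    """Among candidate routes (lists of per-hop RTTs), prefer routes with
    no suspect links; break ties by total RTT."""
    clean = [r for r in routes if not suspect_links(r)]
    pool = clean if clean else routes
    return min(pool, key=sum)
```

In use, the destination would run this selection before sending the RREP, so a tunneled route is rejected during route setup.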
The Statistical Drake Equation
Maccone, Claudio
2010-12-01
function, apparently previously unknown and dubbed "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes known to scientists. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle," and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value and has a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by virtue of a MathCad file that the author has written. This shows that the mean value of the lognormal random variable N is actually of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
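The lognormal behaviour of N follows from the CLT applied to log N, which is a sum of the logs of the independent factors. A minimal simulation with invented factor ranges (not the paper's values) shows the effect:

```python
import math
import random
from statistics import mean, stdev

random.seed(42)

def sample_n():
    """One draw of the Drake product; each factor is uniform over an
    illustrative range (these ranges are invented for this sketch)."""
    factors = [
        random.uniform(1e10, 1e11),  # stars in the Galaxy
        random.uniform(0.2, 0.6),    # fraction with planetary systems
        random.uniform(0.5, 2.0),    # habitable planets per system
        random.uniform(0.1, 0.5),    # fraction where life arises
        random.uniform(0.05, 0.3),   # fraction developing intelligence
        random.uniform(0.05, 0.3),   # fraction that communicate
        random.uniform(1e-7, 1e-5),  # longevity fraction
    ]
    return math.prod(factors)

# By the CLT, log N (a sum of seven independent logs) is roughly normal,
# so N itself is roughly lognormal.
logs = [math.log(sample_n()) for _ in range(20000)]
mu, sigma = mean(logs), stdev(logs)
# A normal variable puts about 68% of its mass within one sigma of the mean.
inside = sum(1 for x in logs if abs(x - mu) <= sigma) / len(logs)
```

Adding an eighth or ninth factor to the list changes nothing in the argument, which is the Data Enrichment Principle in miniature.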
Predict! Teaching Statistics Using Informational Statistical Inference
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
QRS DETECTION OF ECG - A STATISTICAL ANALYSIS
I.S. Siva Rao
2015-03-01
An electrocardiogram (ECG) is a graphical representation of the electrical activity generated by the heart muscle. The ECG plays an important role in diagnosing and monitoring the heart's condition. A real-time analyzer based on filtering, beat recognition, clustering and classification of the signal, with a delay of at most a few seconds, can recognize life-threatening arrhythmias. The ECG signal permits examination of anatomic and physiologic facets of the entire cardiac muscle. The first task for reliable analysis is the removal of noise, which is achieved using wavelet transform analysis. Wavelets yield temporal and spectral information concurrently and offer flexibility through a choice of wavelet functions with different properties. This paper is concerned with the extraction of QRS complexes from ECG signals using Discrete Wavelet Transform based algorithms implemented in MATLAB. Denoising of the ECG signal is performed by removing inconsistent wavelet transform coefficients. The QRS complexes are then identified, and each peak can be used to locate the peaks of the other waves, such as P and T, from their derivatives. We put forward a new combined algorithm built on Pan-Tompkins' method and the multi-wavelet transform.
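The peak-picking stage that follows denoising can be sketched without the wavelet machinery. Below is a much-simplified Pan-Tompkins-style detector run on a synthetic spike train; the window length, threshold fraction and toy signal are all invented for illustration and this is not the paper's algorithm.

```python
def detect_qrs(signal, fs, win=0.15, thresh_frac=0.5):
    """Simplified Pan-Tompkins-style peak picking (no wavelet stage):
    derivative -> squaring -> moving-window integration -> fixed-fraction
    threshold. `win` and `thresh_frac` are illustrative choices."""
    # The derivative emphasizes the steep QRS slopes.
    diff = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    sq = [d * d for d in diff]
    # Moving-window integration smooths the squared slopes.
    w = max(1, int(win * fs))
    integ = [sum(sq[max(0, i - w + 1):i + 1]) / w for i in range(len(sq))]
    thresh = thresh_frac * max(integ)
    # A beat is a contiguous region above threshold; keep one index each.
    beats, above = [], False
    for i, v in enumerate(integ):
        if v >= thresh and not above:
            beats.append(i)
            above = True
        elif v < thresh:
            above = False
    return beats

def synthetic_ecg(fs=250, seconds=4, bpm=60):
    """Flat baseline with a narrow spike per beat (crude ECG stand-in)."""
    n = fs * seconds
    sig = [0.0] * n
    step = int(fs * 60 / bpm)
    for c in range(step // 2, n - 2, step):
        for k in range(-2, 3):
            sig[c + k] = 1.0 - 0.3 * abs(k)
    return sig
```

At 60 bpm and 250 Hz the detector should report four evenly spaced beats over four seconds.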
Estimation of descriptive statistics for multiply censored water quality data
Helsel, D.R.; Cohn, T.A.
1988-01-01
This paper extends the work of Gilliom and Helsel on procedures for estimating descriptive statistics of water quality data that contain "less than" observations. Previously, procedures were evaluated when only one detection limit was present. Here the performance of estimators for data that have multiple detection limits is investigated. Probability plotting and maximum likelihood methods perform substantially better than the simple substitution procedures now commonly in use. Therefore simple substitution procedures (e.g., substitution of the detection limit) should be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution, and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, "less than" values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them. -Authors
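Why the authors advise against simple substitution can be seen in a small simulation: replacing every censored value with a single constant systematically biases the sample mean. The lognormal toy data and detection limit below are invented for the sketch; the recommended probability-plotting and maximum likelihood estimators are not implemented here.

```python
import random
from statistics import mean

random.seed(7)

# Synthetic "water quality" concentrations (lognormal, as is common for
# trace constituents); values below the detection limit are reported
# only as "<DL".
true_vals = [random.lognormvariate(0.0, 1.0) for _ in range(5000)]
DL = 1.0
observed = [(v if v >= DL else None) for v in true_vals]  # None = "<DL"

def substitute(obs, fill):
    """The simple-substitution estimator the paper warns against:
    replace every censored value with one constant."""
    return mean(fill if v is None else v for v in obs)

true_mean = mean(true_vals)
m_zero = substitute(observed, 0.0)     # substitute 0
m_half = substitute(observed, DL / 2)  # substitute DL/2
m_dl = substitute(observed, DL)        # substitute DL
```

Substituting 0 understates the mean and substituting the detection limit overstates it; the bias depends entirely on the arbitrary fill value, which is the paper's point.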
In vitro culture of previously uncultured oral bacterial phylotypes.
Thompson, Hayley; Rybalka, Alexandra; Moazzez, Rebecca; Dewhirst, Floyd E; Wade, William G
2015-12-01
Around a third of oral bacteria cannot be grown using conventional bacteriological culture media. Community profiling targeting 16S rRNA and shotgun metagenomics methods have proved valuable in revealing the complexity of the oral bacterial community. Studies investigating the role of oral bacteria in health and disease require phenotypic characterizations that are possible only with live cultures. The aim of this study was to develop novel culture media and use an in vitro biofilm model to culture previously uncultured oral bacteria. Subgingival plaque samples collected from subjects with periodontitis were cultured on complex mucin-containing agar plates supplemented with proteose peptone (PPA), beef extract (BEA), or Gelysate (GA) as well as on fastidious anaerobe agar plus 5% horse blood (FAA). In vitro biofilms inoculated with the subgingival plaque samples and proteose peptone broth (PPB) as the growth medium were established using the Calgary biofilm device. Specific PCR primers were designed and validated for the previously uncultivated oral taxa Bacteroidetes bacteria HOT 365 and HOT 281, Lachnospiraceae bacteria HOT 100 and HOT 500, and Clostridiales bacterium HOT 093. All agar media were able to support the growth of 10 reference strains of oral bacteria. One previously uncultivated phylotype, Actinomyces sp. HOT 525, was cultivated on FAA. Of 93 previously uncultivated phylotypes found in the inocula, 26 were detected in in vitro-cultivated biofilms. Lachnospiraceae bacterium HOT 500 was successfully cultured from biofilm material harvested from PPA plates in coculture with Parvimonas micra or Veillonella dispar/parvula after colony hybridization-directed enrichment. The establishment of in vitro biofilms from oral inocula enables the cultivation of previously uncultured oral bacteria and provides source material for isolation in coculture.
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Adrenal Gland Tumors: Statistics
Approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon, and exact statistics are not available for this type of tumor.
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Kaufman Jay S
2008-07-01
In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data. In response to these limitations, we gathered more data to improve the statistical precision, and
Rates of induced abortion in Denmark according to age, previous births and previous abortions
Marie-Louise H. Hansen
2009-11-01
Background: Whereas the effects of various socio-demographic determinants on a woman's risk of having an abortion are relatively well-documented, less attention has been given to the effect of previous abortions and births. Objective: To study the effect of previous abortions and births on Danish women's risk of an abortion, in addition to a number of demographic and personal characteristics. Data and methods: From the Fertility of Women and Couples Dataset we obtained data on the number of live births and induced abortions by year (1981-2001), age (16-39), county of residence and marital status. Logistic regression analysis was used to estimate the influence of the explanatory variables on the probability of having an abortion in a relevant year. Main findings and conclusion: A woman's risk of having an abortion increases with the number of previous births and previous abortions. Some interactions were found in the way a woman's risk of abortion varies with calendar year, age and parity. The risk of an abortion for women with no children decreases while the risk of an abortion for women with children increases over time. Furthermore, the risk of an abortion decreases with age, but relatively more so for women with children compared to childless women. Trends for teenagers are discussed in a separate section.
Estimation of global network statistics from incomplete data.
Catherine A Bliss
Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week.
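A first-order version of the scaling idea (correcting an observed statistic for the sampling probability) can be sketched for the mean degree under missing-node sampling. This toy example on an Erdos-Renyi graph illustrates the general approach, not the authors' actual estimators.

```python
import random
from statistics import mean

random.seed(1)

def random_graph(n, q):
    """Erdos-Renyi G(n, q) as an adjacency-set dict."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < q:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def induced_subgraph(adj, p):
    """Observe each node independently with probability p; keep only
    edges between observed nodes (missing-node sampling)."""
    kept = {v for v in adj if random.random() < p}
    return {v: adj[v] & kept for v in kept}

n, q, p = 400, 0.05, 0.5
g = random_graph(n, q)
sub = induced_subgraph(g, p)

true_mean_deg = mean(len(nb) for nb in g.values())
obs_mean_deg = mean(len(nb) for nb in sub.values())
# Each neighbour of an observed node survives with probability p,
# so a first-order corrected estimate of the true mean degree is obs / p.
est = obs_mean_deg / p
```

The naive observed mean degree is biased low by roughly a factor of p; dividing it back out recovers an estimate close to the full-network value.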
Keywords: statistical analysis; reports; probability; information theory; differential equations; statistical processes; stochastic processes; multivariate analysis; distribution theory; decision theory; measure theory; optimization
Statistical models for trisomic phenotypes
Lamb, N.E.; Sherman, S.L.; Feingold, E. [Emory Univ., Atlanta, GA (United States)
1996-01-01
Certain genetic disorders are rare in the general population but more common in individuals with specific trisomies, which suggests that the genes involved in the etiology of these disorders may be located on the trisomic chromosome. As with all aneuploid syndromes, however, a considerable degree of variation exists within each phenotype so that any given trait is present only among a subset of the trisomic population. We have previously presented a simple gene-dosage model to explain this phenotypic variation and developed a strategy to map genes for such traits. The mapping strategy does not depend on the simple model but works in theory under any model that predicts that affected individuals have an increased likelihood of disomic homozygosity at the trait locus. This paper explores the robustness of our mapping method by investigating what kinds of models give an expected increase in disomic homozygosity. We describe a number of basic statistical models for trisomic phenotypes. Some of these are logical extensions of standard models for disomic phenotypes, and some are more specific to trisomy. Where possible, we discuss genetic mechanisms applicable to each model. We investigate which models and which parameter values give an expected increase in disomic homozygosity in individuals with the trait. Finally, we determine the sample sizes required to identify the increased disomic homozygosity under each model. Most of the models we explore yield detectable increases in disomic homozygosity for some reasonable range of parameter values, usually corresponding to smaller trait frequencies. It therefore appears that our mapping method should be effective for a wide variety of moderately infrequent traits, even though the exact mode of inheritance is unlikely to be known. 21 refs., 8 figs., 1 tab.
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
于欣; 侯晓娇; 卢焕达; 余心杰; 范良忠; 刘鹰
2014-01-01
Anomaly detection of fish school behavior can provide important methods and tools for fish health monitoring and early warning, and is of great significance for studying the mechanisms of fish behavior and raising the level of informatization in aquaculture. Using computer vision and image processing techniques, this paper studies the detection of abnormal fish school behavior based on statistics of the school's motion features, with zebrafish as the study object. First, a threshold-based foreground detection method is applied to remove the background from the original video images and reduce the influence of noise. The Lucas-Kanade optical flow method, a local-difference approach with good performance, is then used to obtain the motion vectors of the target fish school, and the behavioral features of the target's motion are accumulated statistically to obtain the joint histogram and joint probability distribution of two behavioral features, speed and turning angle. Finally, on the basis of the joint probability distribution, abnormal behavior is detected with two methods: normalized mutual information (NMI) and the local distance-based outlier factor (LDOF). Experimental results show that both anomaly detection methods achieve an accuracy above 99.5%.
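The joint speed/turning-angle distribution at the heart of the method can be sketched directly. The bin counts, thresholds and synthetic "normal" motion below are invented for illustration; the paper's NMI and LDOF detectors operate on such a joint distribution but are not reproduced here.

```python
import math
import random

random.seed(3)

def _bin(x, lo, hi, nbins):
    """Clamped bin index for x in [lo, hi)."""
    i = int((x - lo) / (hi - lo) * nbins)
    return max(0, min(i, nbins - 1))

def joint_hist(speeds, angles, sbins=8, abins=8, smax=10.0):
    """Joint probability distribution over (speed, turning-angle) bins."""
    counts = [[0] * abins for _ in range(sbins)]
    for s, a in zip(speeds, angles):
        counts[_bin(s, 0.0, smax, sbins)][_bin(a, -math.pi, math.pi, abins)] += 1
    n = len(speeds)
    return [[c / n for c in row] for row in counts]

def is_anomalous(p, speed, angle, smax=10.0, eps=0.005):
    """Flag an observation whose (speed, angle) bin has probability < eps.
    `eps` is an illustrative threshold, not a value from the paper."""
    si = _bin(speed, 0.0, smax, len(p))
    ai = _bin(angle, -math.pi, math.pi, len(p[0]))
    return p[si][ai] < eps

# Synthetic "normal schooling" motion: moderate speed, small turning angles.
speeds = [random.gauss(3.0, 0.5) for _ in range(5000)]
angles = [random.gauss(0.0, 0.3) for _ in range(5000)]
p = joint_hist(speeds, angles)
```

A burst of high speed combined with a sharp turn falls in a near-empty bin of the learned distribution and is flagged.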
Purohit, Sudha G; Deshmukh, Shailaja R
2015-01-01
STATISTICS USING R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide in a variety of statistical methods for development of a suite of statistical procedures.
A new statistic for picking out Non-Gaussianity in the CMB
Lewin, A; Magueijo, J; Lewin, Alex; Albrecht, Andreas; Magueijo, Joao
1999-01-01
In this paper we propose a new statistic capable of detecting non-Gaussianity in the CMB. The statistic is defined in Fourier space, and therefore naturally separates angular scales. It consists of taking another Fourier transform, in angle, over the Fourier modes within a given ring of scales. Like other Fourier space statistics, our statistic outdoes more conventional methods when faced with combinations of Gaussian processes (be they noise or signal) and a non-Gaussian signal which dominates only on some scales. However, unlike previous efforts along these lines, our statistic is successful in recognizing multiple non-Gaussian patterns in a single field. We discuss various applications, in which the Gaussian component may be noise or primordial signal, and the non-Gaussian component may be a cosmic string map, or some geometrical construction mimicking, say, small scale dust maps.
A phased approach to network intrusion detection
Jackson, K.A.; DuBois, D.H.; Stallings, C.A.
1991-01-01
This paper describes the design and development of a prototype intrusion detection system for the Los Alamos National Laboratory's Integrated Computing Network (ICN). The development of this system is based on three basic assumptions: (1) that statistical analysis of computer system and user activities may be used to characterize normal system and user behavior, and that given the resulting statistical profiles, behavior which deviates beyond certain bounds can be detected; (2) that expert system techniques can be applied to security auditing and intrusion detection; and (3) that successful intrusion detection may take place while monitoring a limited set of network activities. The Network Anomaly Detection and Intrusion Reporter (NADIR) design intent was to duplicate and improve the audit record review activities which had previously been undertaken by security personnel, and to replace the manual review of audit logs with a near-real-time expert system.
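The first assumption above (statistical profiles of normal behavior, with deviations beyond certain bounds flagged) can be sketched as a small per-user profile. The metric names and the 3-sigma bound are illustrative, not taken from NADIR.

```python
from statistics import mean, stdev

class Profile:
    """Per-user statistical profile over activity metrics, in the spirit
    of detecting 'behavior which deviates beyond certain bounds'."""

    def __init__(self, history):
        # history: list of dicts of activity metrics per audit period,
        # e.g. {"logins": 3, "mb_moved": 12.0} (names invented here).
        keys = history[0].keys()
        self.stats = {k: (mean(h[k] for h in history),
                          stdev(h[k] for h in history)) for k in keys}

    def flag(self, session, k=3.0):
        """Return the metrics deviating more than k standard deviations
        from this user's historical mean."""
        out = []
        for name, (m, s) in self.stats.items():
            if s > 0 and abs(session[name] - m) > k * s:
                out.append(name)
        return out

# Illustrative history for one user.
hist = [{"logins": n, "mb_moved": f} for n, f in
        [(3, 10.0), (4, 12.0), (2, 9.0), (3, 11.0), (4, 10.0), (3, 12.0)]]
prof = Profile(hist)
```

A session resembling the history raises no flags, while an extreme metric value is reported by name for the expert-system stage to examine.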
[Pro Familia statistics for 1974].
1975-09-01
Statistics for 1974 for the West German family planning organization Pro Familia are reported. 56 offices are now operating, and 23,726 clients were seen. Men were seen more frequently than previously. 10,000 telephone calls were also handled. 16-25 year olds were increasingly represented in the clientele, as were unmarried persons of all ages. 1,242 patients were referred to physicians or clinics for clinical diagnosis.
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
NONE
2004-07-01
Reports to the ISAG (Information System for Waste and Recycling) for 2001 cover 402 Danish waste treatment plants owned by 295 enterprises. The total waste generation in 2001 amounted to 12,768,000 tonnes, which is 2% less than in 2000. Reductions are primarily due to the fact that sludge for mineralization is included with a dry matter content of 20% compared to 1.5% in previous statistics. This means that sludge amounts have been reduced by 808,886 tonnes. The overall rate of recycling amounted to 63%, which is 1% less than the overall recycling target of 64% for 2004. Since sludge has a high recycling rate, the reduction in sludge amounts of 808,886 tonnes has also caused the total recycling rate to fall. Waste amounts incinerated accounted for 25%, which is 1% more than the overall target of 24% for incineration in 2004. Waste going to landfill amounted to 10%, which is better than the overall landfill target for 2004 of a maximum of 12% for landfilling. Targets for treatment of waste from the different sectors, however, are still not complied with, since too little waste from households and the service sector is recycled, and too much waste from industry is led to landfill. (BA)
Quality assurance and statistical control
Heydorn, K.
1991-01-01
In scientific research laboratories it is rarely possible to use quality assurance schemes developed for large-scale analysis. Instead, methods have been developed to control the quality of modest numbers of analytical results by relying on statistical control. Analysis of precision serves to detect analytical bias by comparing results obtained by two different analytical methods, each relying on a different detection principle and therefore exhibiting different influence from matrix elements; only 5-10 sets of results are required to establish whether a regression line passes...
Premorbid adjustment and previous personality in schizophrenic patients
José Juan Rodríguez Solano
2005-12-01
Psychosocial adjustment and premorbid personality are two factors that are frequently studied in order to elucidate the etiopathogenesis of schizophrenia. Premorbid adjustment alterations and personality disorders (principally those of the schizophrenia spectrum) have been considered vulnerability elements or have been linked with the early manifestations of a disease that is still developing (hypothesis of neurodevelopment). In this paper we review the literature. We also studied the relationship between premorbid adjustment (PAS scale) and previous personality disorders (SCID-II) in a sample of 40 patients with schizophrenia (DSM-III-R, DSM-IV, CIE-10), and statistically correlated them. The results show that premorbid adjustment correlates with avoidant, schizotypal and schizoid personality disorders: the more personality pathology found, the poorer the premorbid psychosocial adjustment. Premorbid adjustment positively correlates with histrionic personality traits. The pathological traits of schizotypal and schizoid personalities account for up to 77% of the variance of the total premorbid adjustment in schizophrenic patients. Conclusion: the degrees of premorbid adjustment in schizophrenia are related to the different premorbid personality disorders of schizophrenic patients, which are mainly those most genetically related to schizophrenia, that is, the schizophrenia spectrum.
Zhu Qihui
2006-10-01
Background: The identification of chromosomal homology will shed light on such mysteries of genome evolution as DNA duplication, rearrangement and loss. Several approaches have been developed to detect chromosomal homology based on gene synteny or colinearity. However, previously reported implementations lack the statistical inferences that are essential to reveal actual homologies. Results: In this study, we present a statistical approach to detect homologous chromosomal segments based on gene colinearity. We implement this approach in a software package, ColinearScan, which detects putative colinear regions using a dynamic programming algorithm. Statistical models are proposed to estimate proper parameter values and to evaluate the significance of putative homologous regions. Statistical inference, high computational efficiency and flexibility of input data type are three key features of our approach. Conclusion: We apply ColinearScan to the Arabidopsis and rice genomes to detect duplicated regions within each species and homologous fragments between the two species. We find many more homologous chromosomal segments in the rice genome than previously reported, as well as many small colinear segments between the rice and Arabidopsis genomes.
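The colinearity idea can be illustrated with a much simpler stand-in for ColinearScan's dynamic programming (this is not the published algorithm, and the gene names are invented): among genes shared by two chromosomes, the longest subset appearing in the same relative order on both is a candidate colinear segment, computable as a longest increasing subsequence.

```python
# Illustrative stand-in for colinearity detection: find the largest set of
# shared genes appearing in the same relative order on two chromosomes,
# via a longest increasing subsequence over the second chromosome's positions.
import bisect

def longest_colinear_run(chrom_a, chrom_b):
    pos_b = {gene: i for i, gene in enumerate(chrom_b)}
    # Positions on chromosome B of the shared genes, taken in A's order.
    seq = [pos_b[g] for g in chrom_a if g in pos_b]
    tails = []  # tails[k] = smallest tail of an increasing subsequence of length k+1
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)
        else:
            tails[i] = x
    return len(tails)

a = ["g1", "g2", "g3", "g4", "g5", "g6"]
b = ["g2", "g3", "g9", "g5", "g6", "g1"]
print(longest_colinear_run(a, b))  # g2, g3, g5, g6 stay in order -> 4
```

ColinearScan additionally penalizes gaps and attaches statistical significance to each candidate segment, which this toy version omits.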
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Mathematical and statistical analysis
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Neuroendocrine Tumor: Statistics
Experiment in Elementary Statistics
Fernando, P. C. B.
1976-01-01
Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
Overweight and Obesity Statistics
School Violence: Data & Statistics
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Software for Spatial Statistics
Edzer Pebesma; Roger Bivand; Paulo Justiniano Ribeiro
2015-02-01
We give an overview of the papers published in this special issue on spatial statistics of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.
Statistical Mechanics of Zooplankton.
Hinow, Peter; Nihongi, Ai; Strickler, J Rudi
2015-01-01
Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.
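The paper's "ecological temperature" is defined as the average squared velocity of the tracked animals. A minimal sketch of that computation follows; the trajectories, units, and function name are invented for illustration.

```python
# Minimal sketch of an "ecological temperature": the mean squared speed
# over all displacement steps of all tracked individuals.

def ecological_temperature(tracks, dt):
    """Mean squared speed across all steps of all trajectories.

    tracks: list of trajectories, each a list of (x, y) positions
    dt: time interval between successive positions
    """
    sq_speeds = []
    for track in tracks:
        for (x0, y0), (x1, y1) in zip(track, track[1:]):
            v2 = ((x1 - x0) ** 2 + (y1 - y0) ** 2) / dt ** 2
            sq_speeds.append(v2)
    return sum(sq_speeds) / len(sq_speeds)

calm = [[(0, 0), (1, 0), (2, 0)]]        # slow, steady swimmer
agitated = [[(0, 0), (3, 4), (6, 0)]]    # fast darting motion
print(ecological_temperature(calm, dt=1.0) < ecological_temperature(agitated, dt=1.0))  # True
```

Comparing this quantity across conditions (light vs. dark, infested vs. healthy) is what lets it act as a discriminator of swimming behavior.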
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
A Parametric Cumulative Sum Statistic for Person Fit
Armstrong, Ronald D.; Shi, Min
2009-01-01
This article develops a new cumulative sum (CUSUM) statistic to detect aberrant item response behavior. Shifts in behavior are modeled with quadratic functions and a series of likelihood ratio tests are used to detect aberrancy. The new CUSUM statistic is compared against another CUSUM approach as well as traditional person-fit statistics. A…
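As background, a standard one-sided CUSUM for detecting an upward mean shift can be sketched as follows. This is a generic illustration of the CUSUM idea, not the article's person-fit statistic; the data and parameter values are made up.

```python
# Generic upper CUSUM: accumulate evidence of an upward mean shift,
# resetting at zero, and signal when the cumulative sum crosses a threshold.

def cusum_upper(xs, target, slack, threshold):
    """Return the index at which the upper CUSUM first exceeds threshold, else None."""
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target - slack))
        if s > threshold:
            return i
    return None

data = [0.1, -0.2, 0.0, 0.3, 2.1, 2.4, 1.9, 2.2]  # mean shifts upward at index 4
print(cusum_upper(data, target=0.0, slack=0.5, threshold=3.0))  # 5
```

The slack parameter makes the statistic insensitive to small fluctuations around the target; the article replaces this constant-shift model with quadratic functions and likelihood ratio tests.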
Selling statistics [Statistics in scientific progress]
Bridle, S. [Astrophysics Group, University College London (United Kingdom)]. E-mail: sarah@star.ucl.ac.uk
2006-09-15
From Cosmos to Chaos, Peter Coles, 2006, Oxford University Press, 224pp. To confirm or refute a scientific theory you have to make a measurement. Unfortunately, however, measurements are never perfect: the rest is statistics. Indeed, statistics is at the very heart of scientific progress, but it is often poorly taught and badly received; for many, the very word conjures up half-remembered nightmares of 'null hypotheses' and 'Student's t-tests'. From Cosmos to Chaos by Peter Coles, a cosmologist at Nottingham University, is an approachable antidote that places statistics in a range of catchy contexts. Using this book you will be able to calculate the probabilities in a game of bridge or in a legal trial based on DNA fingerprinting, impress friends by talking confidently about entropy, and stretch your mind thinking about quantum mechanics. (U.K.)